Multi-Server Array Solves Video Storage Problems
Multiple server attachment and high-speed I/O resulted in a twofold productivity improvement in GPS video applications.
By Heidi Biggar
If a picture is worth a thousand words, what are a million pictures worth? If you're a power utility, government transportation department, or county assessing agency, pictures of the geography you are responsible for can be priceless.
That's the philosophy behind Geospan Corp., a Maple Grove, MN-based business with a patented process for creating spatially indexed video databases. Spatially indexed video images make it possible for government offices, utilities, and other organizations to know exactly where their assets are and what they look like, whether those assets are power lines, street signs, parking meters, fire hydrants, or tax parcels.
Geospan uses customized vans and helicopters, each equipped with eight broadcast-quality cameras and interfaced with Global Positioning System (GPS) satellites, to collect video data of the urban streetscape and infrastructure. The resulting image databases, up to 2TB in size, are made available on CD-ROM, LAN-based hard drives, or via the Internet. Custom software makes it possible to view, inspect, survey, and assess massive amounts of geography at the desktop, without restrictions from weather, time of day, road safety, or other hazards.
For Geospan, the need for sophisticated storage management was a strategic one--without it the company would struggle to achieve its business objectives. After reviewing a number of alternatives, Geospan selected XIOtech Corp.'s Magnitude disk array due to its functionality and affordability. In addition, the array allowed Geospan to streamline operations and improve data availability by attaching multiple servers simultaneously to stored video data.
"Some of the high-end arrays we looked at were too expensive and the low-end arrays had good capabilities for smaller amounts of data--say, 100GB--but beyond that it was stretching their capabilities," says Ted Lachinski, Geospan's president. In addition, the ability to partition the RAID array was of significant benefit in Geospan's applications.
Geospan's vehicles record video data at 30 frames per second, capturing more than 100,000 pictures per hour. Each frame is a different picture from a different camera angle. Integrated with every image is information from a GPS satellite to indicate the precise location of each of the images.
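The capture rate quoted above can be sanity-checked with a little arithmetic; the sketch below assumes the 30 frames-per-second figure applies to one recorded stream:

```python
# Sanity check of the capture figures quoted above.
FRAMES_PER_SECOND = 30
SECONDS_PER_HOUR = 3600

frames_per_hour = FRAMES_PER_SECOND * SECONDS_PER_HOUR
print(frames_per_hour)  # 108000 -- consistent with "more than 100,000 pictures per hour"
```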
Processing the recorded data begins with a Windows NT server capturing each frame and converting it to encoded digital information. A second NT application server then indexes the data, associates it with the correct xyz coordinates, and chops the data into separate movie clips. The final step in the process is to copy the clips to CD-ROM, edit them, and burn a final CD-ROM product.
This procedure was previously performed with portable hard disk drives that were physically moved from server to server at each step in the process. Though time-consuming and inefficient, it was the only practical way to share the massive amount of data among the servers.
Tracking the numerous drives was difficult, and constant handling was causing a high percentage of failed drives. Doug Norman, Geospan's data operations manager, explains: "When we lost a drive, we lost a lot of expensive time, sometimes more than a day. We'd have to figure out what we lost, replace the data, and then go through the whole process again."
Company officials knew there must be a better way. The mechanics of processing their video required a storage system that could cluster servers and share the data without manual manipulation. "Some of the systems we looked at were in the million-dollar range, and still required a work-around," says Lachinski. "Our business model just wouldn't fly with those kinds of costs. The Magnitude is the only system we could find that is both functionally flexible and affordable for ourselves and our customers."
The installation of the XIOtech array on Geospan's network brought about significant benefits. The array is configured with sixteen 18GB disk drives, for a total capacity of more than 280GB, with a mix of RAID-5 and RAID-10. Four host adapter boards allow Geospan to connect up to four different servers to shared storage. The company attached two NT servers, one for the video digitizing application and another for indexing video clips. Another server is dedicated to backup and restore operations. A Web server will be connected in the near future.
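The capacity figures work out as follows. The article does not specify how the drives are split between RAID-5 and RAID-10, so the usable-capacity lines below are generic RAID arithmetic for illustration only:

```python
# Raw and illustrative usable-capacity figures for the configuration above.
# The actual RAID-5 / RAID-10 split is not given in the article.
DRIVES = 16
DRIVE_GB = 18

raw_gb = DRIVES * DRIVE_GB                # 288GB raw ("more than 280GB")
raid5_usable = (DRIVES - 1) * DRIVE_GB    # one drive's worth of parity, if all 16 formed one RAID-5 set
raid10_usable = raw_gb / 2                # mirroring halves capacity, if all 16 were RAID-10

print(raw_gb, raid5_usable, raid10_usable)
```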
"We immediately boosted our productivity by a factor of two," says Lachinski. Instead of pulling out and plugging in drives to move data between servers, the company is now able to logically direct data volumes to the server that is processing the data at any given time. Editing of data is now done online, eliminating the need to burn CD-ROMs twice.
Geospan has also found the solution to be extremely reliable. Within weeks of installation, Minnesota experienced severe storms, causing widespread power outages. The disk array remained operational throughout the storms, using its internal uninterruptible power supply. When one of the array's drives succumbed to infant mortality, a hot spare took over for the failed drive without any downtime.
Because of the Magnitude's high data I/O bandwidth, Geospan is currently planning new network hardware to further streamline its image-processing procedures. Doug Norman reports that although the array can move data at around 80MBps, the current network configuration tops out at only 6MBps. Geospan intends to exploit that extra bandwidth by expanding its network; eventually, the company will require 60MBps to 70MBps, according to Lachinski.
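The gap between the array's bandwidth and the network's is easy to put in perspective. The 100GB data-set size below is an assumption chosen for illustration; the two bandwidth figures are the ones reported above:

```python
# Illustrative transfer times: a hypothetical 100GB data set moved at the
# network's 6MBps versus the array's 80MBps internal bandwidth.
DATASET_MB = 100 * 1024           # 100GB expressed in MB (assumed size)

network_mbps = 6                  # current network throughput
array_mbps = 80                   # array's internal bandwidth

hours_at_network = DATASET_MB / network_mbps / 3600
hours_at_array = DATASET_MB / array_mbps / 3600

print(f"{hours_at_network:.1f} h at 6MBps vs {hours_at_array:.1f} h at 80MBps")
```

At 6MBps the move takes nearly five hours; at the array's full rate it would take well under half an hour, which is why the network is the bottleneck Geospan plans to remove.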
"Magnitude will allow us to go to the next level in our business--from video streams to three-dimensional computer models," says Lachinski. "Because of the extreme storage and performance requirements of this new technology, we couldn't take our old methods with us. Magnitude is an enabling technology not just for Geospan, but for our customers as well."
[Figure: a) Switching data between servers was inefficient. b) The RAID array's ability to share data through server clustering provided a better solution.]