Internaut | Managing a flood of new data

Government information technology managers are constantly inundated with new types of data arriving from an ever-increasing number of sources. It's their job to figure out what's worth keeping from each data stream, how to store it, how to access it and how to make the data available to a wide variety of applications.

One type of information already having an impact is data generated by various civil engineering projects. That includes information from road and bridge sensors, water level sensors, smart lighting controls for buildings or public spaces, and even citywide networks of traffic controls, highway signs and monitors along fences and borders.

To understand that growing data flow and the associated issues, let’s start with the sensors themselves. They are often transducers. A transducer typically measures energy produced by pressure, movement or heat, then converts that energy into something else, such as an electrical impulse that can be recorded as data. New types of sensors implanted in bridges can measure the movement of girders or plates, metal corrosion, and other types of wear. A local system usually collects sensor impulses and converts that information into a specific type of data that can then be sent to a computer.
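The local conversion step can be sketched in a few lines. This is illustrative only: the sensor, the 12-bit reading and the linear calibration values are hypothetical, not drawn from any particular bridge-monitoring product.

```python
def counts_to_microstrain(raw_counts, offset=2048, scale=0.25):
    """Convert a raw 12-bit ADC reading from a (hypothetical)
    girder-mounted strain gauge into microstrain, using a simple
    linear calibration: engineering units = (counts - offset) * scale."""
    return (raw_counts - offset) * scale

# Raw electrical impulses arrive as ADC counts; the local system
# converts them to engineering units before sending them onward.
readings = [2048, 2100, 1990]
converted = [counts_to_microstrain(r) for r in readings]
print(converted)  # [0.0, 13.0, -14.5]
```

Real systems layer error checking, timestamping and buffering on top of this, but the core job is the same: turn raw transducer output into data a computer can use.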

Those systems produce a variety of data types, many of which are proprietary. But a standard called Transducer Markup Language is becoming increasingly common. It can be used to create a type of XML document that describes data produced by a transducer in a standardized way and includes metadata that describes the system producing the data, the date and time it was collected, and how multiple devices relate to one another within a network or via Global Positioning System coordinates.
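To make that concrete, here is a short sketch of reading that kind of metadata from an XML document with Python's standard library. The element and attribute names below are invented to illustrate the general idea (producing system, collection time, GPS position); they are not the actual Transducer Markup Language schema.

```python
import xml.etree.ElementTree as ET

# A made-up XML snippet in the spirit of TML-style sensor metadata.
doc = """
<transducerData>
  <system id="bridge-42-girder-7"/>
  <collected>2009-06-15T14:30:00Z</collected>
  <position lat="44.9778" lon="-93.2650"/>
  <value units="microstrain">13.0</value>
</transducerData>
"""

root = ET.fromstring(doc)
print(root.find("system").get("id"))    # which device produced the data
print(root.find("collected").text)      # when it was collected
print(root.find("position").get("lat"),
      root.find("position").get("lon")) # where the device sits
print(root.find("value").text,
      root.find("value").get("units"))  # the measurement itself
```

The point of a standard like TML is that any consuming application can pull out this metadata without knowing the vendor's proprietary wire format.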

Marport Canada Inc.’s SmartBridge is one of the leading systems for that type of data collection.

Of the nation's nearly 500,000 bridges, the Federal Highway Administration cataloged 25.8 percent as structurally deficient or functionally obsolete as of 2006. That doesn't mean they are heading for collapse, but it does mean they need monitoring. Traditionally, that has meant periodic visual inspections. But as the 2007 collapse of the I-35W Mississippi River bridge in Minneapolis showed, visual inspections might not be enough.

The replacement bridge built in Minneapolis contained hundreds of special sensors, many cast right into the concrete. The University of Minnesota and the Minnesota Department of Transportation monitor the data those sensors collect.

Realizing that there will be a growing demand for such systems, researchers at Clarkson University have developed a prototype bridge sensor that doesn't need a battery. It powers itself via the vibrations of a typical bridge, similar to those flashlights that you charge by cranking or shaking.

Right now, transportation-related sensors are leading the way. But other agencies will soon notice their own flood of sensor data. The Agriculture Department will see more data from crop and livestock sensors. The Energy Department will see more information on energy consumption and how weather and cost affect it. Meanwhile, the Homeland Security Department is already dealing with data from border sensors and video surveillance systems.

Government data-center and network managers can prepare for the flood of sensor data by asking the following questions when systems are installed:

Who is ultimately responsible for the sensor network? Who will maintain it, and who is responsible for troubleshooting it if the flow of data is interrupted?

How will the collected data be moved from the remote site to a receiving facility? Via a government network? University network? Leased lines? If so, leased by whom?

Will the sensor data simply pass through a network on its way to a specific end user (researcher, highway manager, etc.)? Will that end user be solely responsible for collecting and storing the data, or will government IT managers also be responsible for data collection, database development, backups and more?

Is there a service-level agreement associated with the data collection, perhaps covering how often data will be updated, how long it will be stored and how accurately the information will be represented?

Is any sort of data conversion necessary for multiple applications to access and integrate the data?
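That conversion question often comes down to normalizing vendor-specific records into one common schema. The sketch below assumes two entirely hypothetical vendor formats; the field names and units are invented for illustration.

```python
def from_vendor_a(rec):
    """Normalize a (hypothetical) Vendor A record, which already
    reports strain in microstrain under short key names."""
    return {"timestamp": rec["ts"], "microstrain": rec["ustrain"]}

def from_vendor_b(rec):
    """Normalize a (hypothetical) Vendor B record, which reports
    strain as a dimensionless ratio; convert it to microstrain."""
    return {"timestamp": rec["time"], "microstrain": rec["strain"] * 1e6}

# Once every feed is mapped into the common schema, downstream
# applications can integrate records without vendor-specific logic.
records = [
    from_vendor_a({"ts": "2009-06-15T14:30:00Z", "ustrain": 13.0}),
    from_vendor_b({"time": "2009-06-15T14:30:05Z", "strain": 1.3e-5}),
]
print(records)
```

Deciding who writes and maintains these adapters is exactly the kind of responsibility the questions above are meant to pin down before installation.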

Each of those issues could result in more work for an IT department. Thus, staffing and system considerations should be part of any integration plan.

Finally, civil engineering projects are producing more than just sensor data these days. As design technologies and site mapping improve, they create new data files for the government to manage, some of which are quite large.

In addition to computer-aided design files, site plans, elevation information and environmental impact reports, a solution called building information modeling creates 3-D datasets that allow users to navigate through all the data associated with a large engineering project, including the management of a facility’s entire life cycle.

Clearly, civil engineering projects will continue to spawn a new flood of data types and system requirements. Being prepared and asking the right questions are essential for anyone who is expected to handle that flood.

About the Author

Shawn McCarthy, a former writer for GCN, is senior analyst and program manager for government IT opportunities at IDC.