The sensor pack prototype, which is strapped to the user’s chest, is a sheet of plastic roughly the size of a tablet computer. It carries gyroscopes, accelerometers, a Microsoft Kinect camera, a light detection and ranging (LIDAR) rangefinder, and a push button for marking points of interest.

At its core, the system uses laser scans to measure the distances to walls and obstacles in a given area.
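The article does not detail MIT's actual processing pipeline, but the basic idea of a laser scan is simple: each beam returns a range at a known angle, and converting those polar readings to Cartesian points yields the outline of the surrounding space. The sketch below is purely illustrative; the function name and the assumption of evenly spaced beams are the author's, not MIT's.

```python
import math

def scan_to_points(ranges, angle_min, angle_step, max_range):
    """Convert one laser scan (one range reading per beam) into 2D
    points in the sensor's own frame, dropping invalid returns.

    ranges     -- list of measured distances, one per beam
    angle_min  -- angle of the first beam, in radians
    angle_step -- angular spacing between consecutive beams
    max_range  -- readings at or beyond this distance are discarded
    """
    points = []
    for i, r in enumerate(ranges):
        if 0.0 < r < max_range:  # zero or out-of-range means no return
            a = angle_min + i * angle_step
            points.append((r * math.cos(a), r * math.sin(a)))
    return points
```

A scan of the hallway ahead would then appear as a cloud of points tracing the walls, which later stages can stitch into a floor plan.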

According to the MIT team, the sensor pack was tested in closed hallways without any Global Positioning System (GPS) signal, and it produced an accurate, precise map of the environment in real time as the user moved through it.
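Building a map in real time without GPS implies tracking the wearer's pose (position and heading, presumably estimated from the gyroscopes and accelerometers) and transforming each new scan into a common world frame. As a minimal sketch of that accumulation step, assuming a 2D pose of `(x, y, heading)` and the hypothetical `Mapper` helper below:

```python
import math

def transform_scan(points, pose):
    """Rotate and translate sensor-frame points into the world frame,
    given the wearer's pose (x, y, heading in radians)."""
    x, y, th = pose
    c, s = math.cos(th), math.sin(th)
    return [(x + c * px - s * py, y + s * px + c * py)
            for px, py in points]

class Mapper:
    """Accumulate successive scans into one global point map
    as the user walks through the environment."""

    def __init__(self):
        self.map_points = []

    def add_scan(self, points, pose):
        self.map_points.extend(transform_scan(points, pose))
```

Each call to `add_scan` extends the map with the latest scan placed at the wearer's current position, which is the essence of mapping on the move.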

MIT lead researcher Maurice Fallon stated, "The operational scenario that was envisioned for this was a hazmat [hazardous materials] situation where people are suited up with the full suit, and they go in and explore an environment."

He added, "The current approach would be to textually summarize what they had seen afterward - 'I went into this room on the left, I saw this, I went into the next room,' and so on. We want to try to automate that."

For some time, robots have been used as first responders in search and rescue operations, but there is still a need for human responders. To that end, MIT, in collaboration with the U.S. Air Force and the U.S. Navy, initiated the sensor pack project to give first responders another tool in disaster situations.

The United Kingdom’s BAE Systems has created a similar prototype called Navsop, which performs the same functions as the sensor pack but relies on a triangulation method rather than lasers and navigation satellites.