0:19 JAMES MILES: Well, we've done several different types of laser scanning. The one here is called phase scanning, but we've also used triangulation, time-of-flight, and LiDAR. Phase scanning is used here simply because it allows for a quick recording method, from which we can then produce quite accurate models very quickly.

0:52 JAMES MILES: It records everything it can see. So, basically, this goes round 360 degrees, and has a field of view from minus 60 to 90 degrees. It goes around in a circle, and it records everything it can see within it. So, we have to take a series of different scans along different parts to make sure we capture everything.

1:12 JAMES MILES: There are a few downsides. Depending on the scanner that you're using, it takes a very long time to capture the data, and it also takes a very long time to process the data, as you'll see in a bit.

1:22 GREGORY DUNN: Interesting. And I notice that there's a little orb back there. What does that do in relation to the laser scanner?

1:28 JAMES MILES: Each scan is unique, and they need a combination of different targets. So these spheres are targets, and then from this we can then stitch the multiple scanned data together.

1:42 JAMES MILES: Well, because it gives us a very accurate representation, we can do very precise measurements. We can create animations of the data coming through, we can manipulate the viewpoint from different angles. And then we can use the data we have here for the beginning of the virtual reconstruction process for the building.

1:58 GREGORY DUNN: Now I'm back inside with James, and we're going to look at the data that he has downloaded. So, now that you've downloaded it, what do you do with the data?

2:06 JAMES MILES: Well, we've downloaded the data, and what we're doing now is processing it here. There are several stages to processing the scan data. First, you have to pre-process every single scan. This finds the targets within the scan data, and it also finds lots of corners and straight edges. These features overlap between scans, and the software uses them during processing. After pre-processing is done, we then register all the scan data together, and then we're left with an overview of the scans.
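The registration stage described above, matching targets between scans and solving for the rigid transform that aligns them, can be sketched with the standard Kabsch (Procrustes) method. This is a textbook approach, not necessarily the algorithm the scanner software actually uses; the function name is illustrative:

```python
# Hedged sketch: align one scan onto another from matched target centres,
# using the Kabsch / Procrustes method.
import numpy as np

def register(src, dst):
    """Find rotation R and translation t so that R @ p + t maps src onto dst.
    src, dst: (N, 3) arrays of matched target positions in each scan."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)        # cross-covariance of centred points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t
```

With three or more non-collinear matched targets this gives a unique rigid transform, which is why each pair of overlapping scans needs several spheres in view.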

2:40 GREGORY DUNN: So you can stitch different scans together. When you do a laser scan of one area and then you go down and do a laser scan of another, you can stitch them together in here, so they're seamless.

2:50 JAMES MILES: Yeah. So these targets that we used here, these little green blobs, are the white targets that we used earlier on. Now, what the software does, as you can see from here, is it has initial scan data from one side. Because it knows the diameter of the sphere, it can then reconstruct the full sphere for us. And through a series of these different targets going along, we can put multiple scans together.
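Recovering a sphere target's centre from the partial patch of points visible from one side can be sketched as an algebraic least-squares fit. This is a generic method, not necessarily the one the scanning software uses:

```python
# Hedged sketch: fit a sphere to surface points by linearising the sphere
# equation |p|^2 = 2 p.c - k, where k = |c|^2 - r^2.
import numpy as np

def fit_sphere(points):
    """points: (N, 3) array of points on the sphere surface.
    Returns (centre, radius) from a linear least-squares solve."""
    A = np.hstack([2 * points, -np.ones((len(points), 1))])
    b = (points ** 2).sum(axis=1)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    centre, k = x[:3], x[3]
    radius = np.sqrt(centre @ centre - k)
    return centre, radius
```

Because the fit also recovers the radius, the known diameter of the physical sphere serves as a check that the right points were selected.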

Laser scanning buildings

In the previous steps you saw how much information is held in the fabric and form of our Roman buildings. Laser scanning allows us to record a dense cloud of three-dimensional points and colour values to capture some of this information, for buildings, excavated features, and objects.

Laser scanning at Portus has developed over the years from small object scanning to large architectural scans. The scanners employed at Portus have been a Vivid 910 triangulation scanner (small object and exterior tests), and a Leica Scanstation 2 and a Leica Scanstation C10, both of which are time-of-flight scanners.

We have even tried very experimental techniques such as using a Microsoft Kinect (normally used for gaming) to scan areas inaccessible to the team. In one experiment we mounted the Kinect on the end of a five-metre pole that Peter used to scan a room deep underground in the Imperial Palace.

Each type of scanner records data at a different resolution. In 2013 we began to use a Faro Focus 3D, which is a phase scanner. In 2014 we are using an updated version of this system, providing longer-range scanning, better colour calibration and integrated global positioning.

The technology works by sending a laser beam out towards an object. A time-of-flight scanner measures the time taken for the beam to rebound back to the scanner and converts this to distance using the speed of light. A phase scanner instead compares the phase of the outgoing and returning signals, which provides better accuracy.
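The two range principles can be written out as simple formulas. The modulation frequency used below is illustrative, not a specification of any scanner mentioned here:

```python
# Sketch of the two range-measurement principles with illustrative values.
import math

C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_time_s):
    """Time-of-flight: the beam travels out and back, so halve the path."""
    return C * round_trip_time_s / 2.0

def phase_distance(phase_shift_rad, modulation_freq_hz):
    """Phase scanner: the beam is amplitude-modulated, and the phase shift of
    the return signal gives the distance as a fraction of the modulation
    wavelength (again halved for the round trip)."""
    wavelength = C / modulation_freq_hz
    return (phase_shift_rad / (2.0 * math.pi)) * wavelength / 2.0

# A target roughly 100 m away returns the beam in about 667 nanoseconds:
print(tof_distance(667e-9))
```

The phase measurement resolves distance much more finely than timing a single pulse, which is the source of the accuracy advantage, although the result repeats every half modulation wavelength and so must be disambiguated.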

Using a laser scanner allows for an accurate three-dimensional representation of the object scanned. This can then be used as a tool for investigation: line data can be extracted, volumes calculated, animations (such as those at the bottom of this page) created, and changes over time observed. Laser scanning, like LiDAR, relies on line of sight, but by setting up the scanner in multiple positions we are able to create a complete model. We can then place ourselves virtually within this in any position and look in any direction, including through surfaces.

The Faro Focus 3D that was used on site in 2014 has an accuracy of ±2mm, a range of 0.6m to 120m and a measurement rate of 976,000 points per second. It has an integrated camera to capture RGB values per point, which allows colour to be added to the model, and it can be used in both light and dark conditions. The laser scan models already completed have over a billion points and are linked together with the Total Station data and the photogrammetry data to create a detailed model of the buildings and excavated areas.
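A rough back-of-the-envelope check on these figures, assuming a simple uncompressed layout of three double-precision coordinates plus three colour bytes per point (an assumption for illustration, not the scanner's actual file format):

```python
# Rough check on the figures above; the storage layout is assumed.
points = 1_000_000_000                 # "over a billion points"
bytes_per_point = 3 * 8 + 3            # x, y, z as doubles + R, G, B bytes
size_gb = points * bytes_per_point / 1e9
print(f"{size_gb:.0f} GB uncompressed")

scan_minutes = points / 976_000 / 60   # at 976,000 points per second
print(f"about {scan_minutes:.0f} minutes of pure measurement time")
```

This is why processing the scans takes so long: a billion points is tens of gigabytes of raw data, even though the measurement itself accumulates in well under an hour of scanner time.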

The laser scan I produced of the basement level allows us to record details when we need them, from the comfort of the lab. At 45 seconds in you can see a photo of the same view of the basement in the sixth image of this slideshow from the BBC News website (2009).