We have sent people into space, and they have expanded our awareness of the place we live.

So thanks to the many explorers of the past and present, and the tools we use, we can now see our world in a different way.

And what do we know... We know Ottawa is big. Really big.

Please allow me to take you on a journey of success at the City of Ottawa, and show you what we do and how we do it.

What do we do at the City of Ottawa? We provide management and services in many areas.

In providing all these services, we have a public responsibility.

We have a public responsibility to deliver and manage those services in a way that capitalizes on our investments, reduces costs, and provides access to information in a secure manner.

How do we do it? How do we provide all those services in a cost-effective and secure way? Well, part of the way we do it, the part you may be interested in, is to make use of an Enterprise GIS called ‘MAP’. MAP is a graphical interface that allows us to associate business processes with municipal addresses.

‘MAP’ is now at end of life. Replacing the ‘MAP’ technology requires developing multiple data transformations to maintain the existing legacy technology while moving forward with newer, more advanced data display and analysis technologies.

Wow!

MAP allows us to associate multiple data sources and application software, such as CAD data

Digital Imagery

Tabular data from the 311 Service Desk

Handheld and remote sensors, to name just a few.

All of this information is brought together in our legacy software, MAP, and is presented securely to users.

Legacy display software, eMAP.

We are also sharing data with external users.

and supporting the Local Government, or Community Maps, initiative with FME and ESRI on the desktop.

If you are not sure what the Community Maps program is, please have a look at the information available on the Safe Software web site.

The Service Ottawa program is our attempt to make it easy to engage and interact with citizens, businesses, and clients, and it depends on current, vendor-supported Enterprise GIS technology.

Our Enterprise GIS as it is today, with a new component in development: geoOttawa.

Amazing!

Building on our past success with MAP, we now have a new deployment technology in development: geoOttawa. As with any new developing technology, geoOttawa is available only to City staff. Hopefully, it will be available soon, in some similar format, for public consumption.

geoOttawa combines local government data sets in a familiar web interface.

geoOttawa will provide easy to use information linked to municipal addresses.

What we are seeing is that FME provides a Safe, simple, and efficient solution to share and transform data across the enterprise.

Let’s take a look at some recent “wins” with FME.

The Salt Truck project was a significant WIN with FME, as it required some really tricky spatial analysis to give us the results we needed. The goal was to monitor salt distribution across the City, lowering salt usage and increasing road safety.

What I did was create variable-length buffers around the existing road segments; the buffers did not terminate at the ends of the line segments. Each buffer feature contained all the attributes of its road segment.
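
To illustrate the buffering step outside of FME, here is a minimal Python sketch using Shapely. The field names (SEG_ID, ROAD_CLASS) and the buffer widths are assumptions for illustration, not the values from the actual workbench:

```python
from shapely.geometry import LineString

# Buffer width per road class, in metres (illustrative values).
WIDTH_BY_CLASS = {"arterial": 15.0, "collector": 12.0, "local": 9.0}

def buffer_segment(geom, attrs):
    """Buffer one road segment, carrying all of its attributes along."""
    width = WIDTH_BY_CLASS.get(attrs.get("ROAD_CLASS"), 10.0)
    # cap_style=3 (square) extends the buffer past the segment ends,
    # so the buffer does not terminate exactly at the line endpoints.
    return {"geometry": geom.buffer(width, cap_style=3), **attrs}

seg = LineString([(0, 0), (100, 0)])
buf = buffer_segment(seg, {"SEG_ID": 42, "ROAD_CLASS": "arterial"})
print(buf["geometry"].bounds)  # extends past x=0 and x=100
```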

Then I overlaid the GPS data points, which contained data on salting, direction of travel, and speed of the truck. As you can see, the GPS data was very ‘raw’, and was often about half the length of a football field away from the actual road centre lines. To compensate, I joined the GPS points into a “raw” GPS route, and copied each point’s attribute data to the line segment in front of it. Then I recalculated the direction of travel. I had to recalculate the direction because, when a truck was turning a corner or stopped at a traffic signal, the recorded GPS direction could be off by as much as 90 degrees from the real direction of travel.
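
A rough sketch of that point-to-route step in Python, assuming projected (planar) coordinates and an assumed salter_on attribute; the real workbench did this with FME transformers:

```python
import math

def points_to_route(points):
    """points: ordered GPS fixes as dicts with x, y, and salting attributes.
    Returns one line segment per pair of fixes; each segment carries the
    attributes of the point behind it."""
    segments = []
    for a, b in zip(points, points[1:]):
        # Recompute the bearing from consecutive positions instead of
        # trusting the recorded GPS heading, which can be off by up to
        # 90 degrees at corners and traffic stops.
        bearing = math.degrees(math.atan2(b["x"] - a["x"], b["y"] - a["y"])) % 360
        segments.append({
            "start": (a["x"], a["y"]),
            "end": (b["x"], b["y"]),
            "salter_on": a["salter_on"],  # attribute copied forward
            "bearing": bearing,
        })
    return segments
```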

Here you can see the red ‘raw’ GPS routes created when the GPS points are joined into line segments. Next, I used the buffers as a cookie cutter and clipped all the ‘raw’ GPS route segments.
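
The cookie-cutter clip looks something like this in Shapely terms; in the workbench it was a clipping transformer, and the record layout here is an assumption:

```python
def clip_routes(buffers, routes):
    """Clip every raw GPS route against every road-segment buffer.
    buffers and routes are dicts holding Shapely geometries plus attributes."""
    clipped = []
    for buf in buffers:
        for route in routes:
            piece = route["geometry"].intersection(buf["geometry"])
            if not piece.is_empty:
                clipped.append({
                    "geometry": piece,
                    "SEG_ID": buf["SEG_ID"],          # the road segment hit
                    "salter_on": route["salter_on"],
                    "bearing": route["bearing"],
                })
    return clipped
```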

Then, because I already knew which road segments were involved, based on the GPS points and the buffered road segments, I could extract the full road segments, displayed here in green as the generalized route involved in the salting process.

So how do we determine the salt coverage? To do that, I simply used the buffers to cookie-cut the road segments and compared those segments to the cookie-cut ‘raw’ GPS segments that fell within the same buffer. I counted the number of passes on a road segment, estimated the salt coverage by knowing whether the salter was on or off, and determined which side of the road was salted by the direction of travel.
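
A simplified sketch of the pass counting, working from the clipped pieces produced above. The side-of-road test is deliberately naive here (it only buckets passes by direction of travel); the real comparison was against each road segment's own orientation:

```python
from collections import defaultdict

def summarize_passes(clipped):
    """Count passes and salted passes per road segment."""
    summary = defaultdict(lambda: {"passes": 0, "salted_passes": 0, "sides": set()})
    for piece in clipped:
        rec = summary[piece["SEG_ID"]]
        rec["passes"] += 1
        if piece["salter_on"]:
            rec["salted_passes"] += 1
            # Opposite bearings imply opposite sides of the road.
            rec["sides"].add("A" if piece["bearing"] < 180 else "B")
    return dict(summary)
```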

Better than that! I had a SOAP service created that called fme.exe to run the workbench, with parameters provided by the user, and served it all up on a web page. All that without FME Server.
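
The SOAP wrapper itself is not shown, but the core of running a workbench without FME Server is just invoking fme.exe with published parameters as --NAME value pairs. The install path, workspace name, and parameter names below are assumptions:

```python
import subprocess

FME_EXE = r"C:\Program Files\FME\fme.exe"  # assumed install path

def run_workbench(workspace, params):
    """Run an FME Desktop workspace from a service process, no FME Server."""
    cmd = [FME_EXE, workspace]
    for name, value in params.items():
        cmd += [f"--{name}", str(value)]  # published parameters
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode != 0:
        raise RuntimeError(f"FME run failed: {result.stderr}")
    return result.stdout

# e.g. run_workbench("salt_coverage.fmw", {"RUN_DATE": "2011-01-15"})
```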

I love cookies!

Requirement: create a GIS data source of contiguous pipe with breaks at each valve in the data set. This creates groups of web-like features, made up of polylines, that can be used as unique pressure zones in water flow models.
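
One way to sketch the valve-splitting idea is as a graph problem: duplicate each valve node per incident pipe so nothing connects through it, then take connected components as candidate pressure zones. This Python/networkx version illustrates the concept, not the FME workbench itself:

```python
import networkx as nx
from itertools import count

def pressure_zones(pipes, valve_nodes):
    """pipes: (node_a, node_b, pipe_id) tuples; valve_nodes: node ids where
    the network must break. Returns one set of pipe ids per zone."""
    g = nx.Graph()
    uid = count()
    valve_nodes = set(valve_nodes)
    for a, b, pid in pipes:
        # Give each pipe its own private copy of any valve endpoint,
        # so pipes are kept but no longer join at the valve.
        a2 = (a, next(uid)) if a in valve_nodes else a
        b2 = (b, next(uid)) if b in valve_nodes else b
        g.add_edge(a2, b2, pipe_id=pid)
    return [
        {g.edges[e]["pipe_id"] for e in g.subgraph(comp).edges}
        for comp in nx.connected_components(g)
    ]

# e.g. pressure_zones([(1, 2, "P1"), (2, 3, "P2")], valve_nodes={2})
# -> [{"P1"}, {"P2"}]  (the valve at node 2 separates the two pipes)
```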

The Enterprise Asset Management system will use Maximo and FME to synchronize data for approximately 30 ArcGIS feature classes of City assets between the Edit and Publish databases and the Maximo Asset Management system. This includes tabular data, as well as point, line, and polygon data. Any change or update in one data set can be transferred to the other data sources in seconds.

This was a most difficult project, with moving targets on both the data source and the data output. Never a place you want to be with FME. Initially, the process required building one workbench for each feature class, flowing from the Edit to Publish or Publish to Edit databases. An additional database was added for some feature classes to store Obsolete features: those features that would no longer function, or had been removed from active use. After developing and re-developing workbenches many times, because the moving targets forced a rebuild with each change, I extracted the common workflow into 4 distinct workbenches and placed the variable, moving-target information (mostly data field names, or column names) in a single Excel worksheet. Using a WorkspaceRunner, the worksheet acted as a controller for running the workbenches, which allowed me to hand off the project. Staff were able to make changes to the dataset without changing the basic workflow. Once this system was in place, changes to the source and output data set structures began to stop. Basically, staff realized the effect of having a moving target for the source and output data. No one wanted to edit the Excel worksheet.
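
In outline, the controller pattern looks like this. The real implementation used FME's WorkspaceRunner from within a workbench; this command-line sketch approximates the same idea, and the worksheet columns, workspace names, and parameter names are all assumptions:

```python
import subprocess
import openpyxl

FME_EXE = r"C:\Program Files\FME\fme.exe"               # assumed path
WORKBENCH = {"edit_to_publish": "edit_to_publish.fmw",
             "publish_to_edit": "publish_to_edit.fmw"}  # two of the four

def run_from_controller(xlsx_path):
    """Drive the generic workbenches from the controller worksheet:
    one row per feature class, giving direction and field-name mapping."""
    sheet = openpyxl.load_workbook(xlsx_path).active
    for feature_class, direction, field_map in sheet.iter_rows(
            min_row=2, values_only=True):               # skip the header row
        subprocess.run([
            FME_EXE, WORKBENCH[direction],
            "--FEATURE_CLASS", str(feature_class),
            "--FIELD_MAP", str(field_map),
        ], check=True)
```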

The basic data flow in the FME workbench was to read both the Edit and Publish databases and detect any changes in content. If a change was detected, we would read the rules for processing that data from the Maximo data tables; if the data was obsolete, we would write it to the Obsolete feature class stored in the Obsolete database and tag it as obsolete in the Publish database. If it was not obsolete, we would simply carry out a regular update to the Publish database, if required.
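
The routing decision for a single feature can be condensed to a few lines. This is a paraphrase of the flow just described, with is_obsolete standing in for the processing rules read from the Maximo data tables; all names are illustrative:

```python
def route_feature(edit_rec, publish_rec, is_obsolete):
    """Compare the Edit and Publish copies of one feature and route it."""
    if edit_rec == publish_rec:
        return ("unchanged", None)                 # no action needed
    if is_obsolete(edit_rec):
        # Destined for the Obsolete database; tag the Publish copy too.
        return ("obsolete", {**publish_rec, "STATUS": "OBSOLETE"})
    return ("update_publish", edit_rec)            # regular update

# e.g. route_feature({"id": 7, "dia": 200}, {"id": 7, "dia": 150},
#                    lambda rec: rec.get("STATUS") == "RETIRED")
```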