USGIF GotGeoint Blog

USGIF promotes geospatial intelligence tradecraft and a stronger community of interest among government, industry, academia, professional organizations, and individuals focused on the development and application of geospatial intelligence to address national security objectives.

October 11, 2017

One of the challenges to full collaboration on construction projects is sharing information, including geometry, properties, scans in the form of reality meshes, descriptions and status of equipment, and other information. In the dark ages of design, information was exchanged on paper; at one time $2 billion of FedEx's business was shipping paper drawings between project teams. The next step was converting paper drawings to PDF files, which enabled drawings and other information to be sent electronically rather than by courier. But the data in a PDF remains opaque to applications such as AutoCAD, MicroStation, Revit, and Microsoft Excel, which only know how to read application-specific DWG, DGN, RVT, or XLS files. iModels are Bentley's way of packaging project information, in the form of application-readable file formats, in a container that can be shared with other project teams. There are iModel plugins for other vendors' products, such as Revit, which enable rich data to be exchanged between Revit and Bentley BIM.

iModel version 1.0, which is widely used, as evidenced by the number of submissions to the Be Inspired awards that use it, comprises a collection of files. Each file is accessible to the application that created it, for example MicroStation or Revit, but not to others.

Yesterday in his keynote at the Year in Infrastructure 2017 (YII2017) conference, Greg Bentley, CEO of Bentley, announced iModel 2.0, and this morning Keith Bentley, a founder of Bentley, described what iModel 2.0 is technically. It is no longer a collection of files but a relational database. This enables all the information to be queried by all applications, not just the one that created the data.
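The practical consequence of "it's a relational database" can be sketched with plain SQL. The schema and data below are purely illustrative (not Bentley's actual iModel schema): the point is that once project information lives in tables, any application, not just the authoring one, can query it.

```python
import sqlite3

# Illustrative sketch only: table and column names are hypothetical,
# not Bentley's real iModel schema. The idea is that a relational
# container makes all project data queryable by any application.
db = sqlite3.connect(":memory:")
db.execute("""CREATE TABLE elements (
    id INTEGER PRIMARY KEY,
    source_app TEXT,   -- application that authored the element
    category TEXT,     -- e.g. 'Beam', 'Pump'
    volume_m3 REAL)""")
db.executemany(
    "INSERT INTO elements (source_app, category, volume_m3) VALUES (?, ?, ?)",
    [("MicroStation", "Beam", 0.42),
     ("Revit", "Pump", 0.10),
     ("Revit", "Beam", 0.38)])

# A downstream tool that authored none of this data can still query all of it.
rows = db.execute(
    "SELECT category, COUNT(*) FROM elements GROUP BY category ORDER BY category"
).fetchall()
print(rows)  # [('Beam', 2), ('Pump', 1)]
```

Contrast this with version 1.0's collection of files, where the same question would require opening each file in the application that created it.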

Furthermore, Keith described iModelHub, which adds data versioning, making iModel 2.0 a versioned database that records changes to the data. iModelHub allows you to compare different versions to see what has changed. This is a powerful capability for construction projects, which are constantly changing but need to enable managers (and lawyers) to know what changed and when.
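The essence of that version comparison can be sketched in a few lines. This is not iModelHub's actual changeset format, just the general idea: given two snapshots of element properties, derive what was added, deleted, or modified between them.

```python
# Hypothetical sketch of version comparison in the spirit of iModelHub:
# two snapshots map element ids to property dicts, and the diff reports
# additions, deletions, and modifications. Names and data are invented.
def diff_versions(old, new):
    added = {k: new[k] for k in new.keys() - old.keys()}
    deleted = {k: old[k] for k in old.keys() - new.keys()}
    modified = {k: (old[k], new[k])
                for k in old.keys() & new.keys() if old[k] != new[k]}
    return added, deleted, modified

v1 = {"beam-1": {"length_m": 6.0}, "pump-7": {"status": "planned"}}
v2 = {"beam-1": {"length_m": 6.5}, "valve-2": {"status": "installed"}}

added, deleted, modified = diff_versions(v1, v2)
print(sorted(added))     # ['valve-2']
print(sorted(deleted))   # ['pump-7']
print(sorted(modified))  # ['beam-1']
```

Recording such changesets over time is what lets managers (and lawyers) answer "what changed, and when?"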

This is a very important advance that will allow much richer information to be shared by many applications. Additionally, iModelHub enables iModels to be shared via the cloud, which dramatically expands access to project information, especially when used in conjunction with a collaboration enabler such as Bentley's ProjectWise.

April 19, 2017

At the SPAR3D 2017 conference in Houston, one of the interesting questions posed to a panel composed of Greg Bentley (CEO of Bentley), Burkhard Boeckem (CTO of Hexagon Geosystems), and Shabtay Negry (Senior VP at Mantis Vision) was whether point clouds, whether from digital photography or LiDAR, will effectively disappear, replaced by meshes for most reality modeling applications. Meshes are mathematical constructs, typically 3D triangular networks, that are smaller and much easier to manipulate than point clouds, which are typically huge and a challenge to edit.

Another fascinating example is the USS Arizona digital project, which combined underwater LiDAR, sonar, aerial imaging, and existing photography and surveys to produce a survey-grade 3D model of the USS Arizona in the form of a mesh, using Autodesk ReMake. Surveys were conducted in 2014 and 2016. Just recently the two digital mesh models of the ship were compared to create a difference mesh showing areas of change over the two years since 2014.

At SPAR3D there was a consensus among the panelists that meshes are replacing point clouds for most applications. Point clouds are just too big and have too little intelligence, though they will remain as an intermediate data type because most scanners generate point clouds. As they replace point clouds in many applications, meshes will grow in importance. But realizing their full potential will require interactive, editable meshes that allow small pieces to be changed or replaced without regenerating the entire mesh (which is often the case with triangulated surfaces), as well as easy-to-use analytic tools like those used to create the difference map of the Arizona. The panelists thought that with the appropriate editing and analytic tools, people will be able to manipulate surfaces easily and may not even be aware of the form of the underlying data structures. Smaller files will also make cloud processing easier, because huge point cloud files won't have to be uploaded to the cloud.
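Why a mesh is so much more compact than the equivalent raw data is easy to see from how an indexed triangle mesh is stored: vertices are kept once and triangles refer to them by index. The sketch below (a deliberately simple flat grid, not a real reality-modeling pipeline) shows the structure.

```python
# Minimal sketch of an indexed triangle mesh: vertices stored once,
# triangles referencing them by index. A raw point cloud would instead
# store every sampled point independently, with no connectivity at all.
def grid_mesh(n):
    """Triangulate an n x n grid of points into an indexed mesh."""
    vertices = [(x, y, 0.0) for y in range(n) for x in range(n)]
    faces = []
    for y in range(n - 1):
        for x in range(n - 1):
            i = y * n + x
            faces.append((i, i + 1, i + n))          # lower triangle of the cell
            faces.append((i + 1, i + n + 1, i + n))  # upper triangle of the cell
    return vertices, faces

verts, faces = grid_mesh(100)
print(len(verts))  # 10000 shared vertices
print(len(faces))  # 19602 triangles, each just three indices
```

The connectivity is also what makes local editing conceivable: changing one vertex moves every triangle that references it, whereas a point cloud has no notion of "the surface around this point."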

February 20, 2017

Utilities have in general not been IT innovators, but the accelerating adoption of smart-grid technologies is driving the need for greater IT awareness and skills in the electric power industry. Large utilities have the requisite IT skills in house; small utilities typically don't, and outsource their IT development or adopt SaaS solutions. Medium-sized utilities fall in between. Now regulators appear ready to provide financial encouragement to push utilities to improve their IT capacity. Regulators perceive cloud-based computing as a way to future-proof IT investment in the rapidly changing smart grid world.

At DistribuTECH 2017 Oracle released the results of a survey of utility industry regulators. Oracle Utilities and Zpryme surveyed 76 utility regulatory staff and commissioners in the U.S. They found that more than three-quarters of the respondents believe regulators should play some role in determining whether utilities use the cloud. 69% of U.S. regulators support capitalizing cloud-based software, a way to use accounting rules to encourage utilities to invest in it. A third of the regulators have a specific and comprehensive strategy for utility cloud investment. The most significant benefit of cloud computing the respondents identified was improved flexibility.

May 02, 2016

In the future, the deliverables at the end of a construction project will include not only the completed building itself but also a digital "reality model" of the structure. The digital model will include digital as-builts consisting of georeferenced point clouds captured by laser scanning or meshes generated from images captured with a digital camera on a smartphone. That was the prediction of Bhupinder Singh of Bentley at the GeoBuiz 2016 Summit in Bethesda. This is a vision shared by other professionals in the construction industry, including Ron Singh, Chief Surveyor at the Oregon Department of Transportation, who sees it as dramatically changing the construction process from surveying through design and construction to operations and maintenance.

Consumer devices and professionals

Consumer mobile devices will increasingly be the platform for many professional activities that currently require specialized devices and a PC computing platform. Bryn Fosburgh of Trimble pointed out that while the location accuracy of GPS-equipped smartphones is currently 10 meters, centimeter precision is surely coming because all that is required is a relatively simple antenna redesign. Bryn sees a continuum in these devices between consumer and professional applications. But while consumers tend to blindly accept whatever is available on their smartphones, professionals using these devices will have to be aware of the limits of the hardware: what it can and cannot be used for. For example, while smartphones in the future will be capable of centimeter accuracy in ideal locations, the accuracy will be lower where there is a lot of scatter in the RF signals from the GPS satellites. To ensure reliable results, a professional surveyor relies on multiple sources, including GPS linked to an earth station and traditional total station measurements. Increasingly we will require other professionals to be involved: Bryn reported a rising demand for photogrammetrists as a result of the growing volume and availability of high-resolution imagery, which needs professionals to help interpret it.

Platform as a service

For virtually all of the world's big IT companies, including IBM, Oracle, SAP, and Microsoft, and the world's big content companies such as DigitalGlobe, the cloud is central to their vision of the future. Together with others in the IT industry, Xavier Lopez of Oracle sees the future in platform as a service (PaaS), but there are challenges. A major challenge is integrating data in different databases, including traditional SQL databases from Oracle and Microsoft and NoSQL stores such as Hadoop. Another challenge is semantics: frequently different disciplines use different terms for the same thing. I have blogged about the CB-NL project in the Netherlands, aimed at solving the semantics problem in the construction industry. Another big issue that Xavier identified is data fusion without co-location: how do you enable efficient processing of a logically federated database when the data resides in different locations?

Monetizing the cloud

But perhaps the biggest challenge is finding an appropriate business model. It is not yet clear how the cloud will be successfully monetized, though the trend is toward some form of subscription or pay-per-use. An interesting model of the successful application of the cloud in the construction industry is Textura, just acquired by Oracle. Textura's cloud serves as a place where all of the participants in a project, from funding through design to construction, can come together and access the shared assets necessary to complete the project. Textura's network includes about 85,000 general contractors and subcontractors who use Textura to collaborate on construction projects, and the company handles on average about 6,000 projects each month. Its shared database, which hosts project budgets, contracts, invoices, payments, change orders, and compliance documents, resides in the cloud and is accessible to contract administrators, contractors, developers, subcontractors, consultants, title companies, and banks.

Big data and analytics

One of the big drivers for the cloud is enabling the extraction of information from the huge volumes of data (most of which include location) now available. I have blogged about the nearly 100 petabytes of earth observation imagery available from DigitalGlobe. According to Dr. Richard W. Spinrad, Chief Scientist at the National Oceanic and Atmospheric Administration (NOAA), NOAA expects to have 160 petabytes of archived geospatial data by 2020.

As another example, Jeff Jonas of IBM described a publicly available database of some 600 billion transactions per day collected by telephone companies. This data, which includes location, can be mined (after being anonymized) for very interesting and commercially valuable information. Jeff described some examples: analyzing where people spend most of their time (their "pattern of life"), how many people go into a store, and how long they remain there.

Dipanshu Sharma, founder and CEO of xAd, illustrated live some of the things his company, which is in the location-based marketing business, does with this type of data. xAd can customize ads to target people who are inside a store ("There's a great deal on aisle 5"), just outside one ("Come on in, we have great deals on umbrellas"), or close to a competitor's store ("Check out XXXX, our prices are lower than YYYY") - the words are mine.
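At the heart of this kind of targeting is a geofence test: is a reported device position within some radius of a point of interest? A minimal sketch, with made-up coordinates and radius (this is the textbook great-circle calculation, not xAd's actual system):

```python
import math

# Hedged sketch of a geofence check behind location-targeted messaging.
# Coordinates, radius, and the scenario are invented for illustration.
def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def in_geofence(device, store, radius_m):
    return haversine_m(*device, *store) <= radius_m

store = (40.7128, -74.0060)  # hypothetical store location
print(in_geofence((40.7130, -74.0058), store, 100))  # True: a few tens of metres away
print(in_geofence((40.7300, -74.0060), store, 100))  # False: roughly 1.9 km away
```

Production systems layer dwell time, visit frequency, and anonymized identity on top, but the inside/outside/nearby distinction above is the geospatial primitive.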

In all of these examples, geospatial data and technology are key to the applications. Jeff Jonas foresees that the apps of the future on your smartphone will require geolocation and will be so compelling that just about everyone will enable GPS location on their smartphones.

April 14, 2016

I have just experienced two remarkable days at the Spar 3D Expo and Conference in The Woodlands, Texas, in itself a remarkable location near Houston. I will blog in more detail in the near future about the things and people I had the opportunity to experience over the past two days.

First of all, virtual reality. Everyone is aware of the Oculus Rift (see the new review at CNET if you aren't), and other technologies have come or are coming from Samsung, Sony, and Google. But at SPAR 3D the most remarkable VR technology I personally experienced was what David Smith, CTO of Wearality, talked about and allowed me and others to try: a set of incredibly lightweight glasses with acrylic lenses, designed to work with a smartphone, that give you a virtual experience comparable to IMAX without the IMAX theatre. I tried it and found it an absolutely amazing surround experience. And if you have wondered where Michael Jones, of Keyhole and Google Maps fame, has landed, he is now CEO of Wearality.

Secondly, my personal passion is the geolocation of underground utilities. At SPAR 3D Mark Klusza, founder and CTO of Real-time Metrology, Inc., described a major new advance in ground-penetrating radar (GPR) technology he has developed that can detect 3/4-inch rebar (among other things) four feet below the surface in clay.

For existing buildings, creating a BIM model remains an art rather than a science. But at SPAR 3D a number of new technologies were presented that can help automate the process, improve its quality, and speed it up. Foremost among them, in my mind, was the one presented by Nicolas Arnold, VP of Product Development at SKUR. With SKUR's technology, which runs in the cloud, you can compare the 3D model you have developed with the point cloud you derived it from and identify all the points of significant variance. The same technology can also be used during a construction project to compare what has been built with what was designed, even when the building is only partially completed.
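The core idea of that variance check can be sketched simply: for each captured point, measure its distance to the nearest designed geometry and flag anything beyond tolerance. The sketch below compares points to points for brevity; a real system like SKUR's compares scans against model surfaces, and this is only the concept, not their implementation.

```python
# Conceptual sketch of design-vs-as-built variance flagging. All data
# is invented; real systems measure point-to-surface distance, use
# spatial indexes for speed, and work on millions of points.
def flag_variances(design_pts, scan_pts, tol):
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    flagged = []
    for p in scan_pts:
        nearest = min(dist(p, d) for d in design_pts)
        if nearest > tol:  # beyond tolerance: a point of significant variance
            flagged.append((p, round(nearest, 3)))
    return flagged

design = [(0, 0, 0), (1, 0, 0), (0, 1, 0)]
scan = [(0.01, 0, 0), (1.0, 0.02, 0), (0.5, 0.5, 0.4)]  # last point is off-design
out = flag_variances(design, scan, tol=0.05)
print(out)  # only the off-design point is reported
```

Run during construction, the same comparison works against a partial scan: points with no nearby designed counterpart are exactly the deviations worth inspecting.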

Ron Singh, Engineering Automation Manager/Chief of Surveys at the Oregon Department of Transportation (DOT), described progress on his remarkable vision of how automation is upending how we design, build, monitor, operate, and maintain U.S. highway systems. Richard Arrowsmith, Asset Information Group Team Leader at Highways England, described the digital model of the English highway system. Stan Burns, President of Integrated Inventory, who has just retired from the Utah DOT, described Utah DOT's comprehensive database of every piece of highway furniture and signage in the Utah highway system. Together these represent the future of the digitalization of national highway systems.

A new company, Indoor Reality, is the third startup of Dr. Avideh Zakhor, who is on leave from the University of California, Berkeley. Her new device is a backpack loaded with sensors, including lasers and infrared, that enables you to map the interior of a building by simply walking through it, including up and down the stairs of a multi-floor building. The resulting data can be used to automatically generate floor plans and 3D models, and captures everything you need for an EnergyPlus analysis.

Mark Shell and Darrell Gadberry, both of the City of Fort Worth Water Department, described a very versatile device, with laser, sonar, and camera instruments on board, that can be used for inspections of small- and large-diameter sewer pipes. Fundamentally this device enables condition-based maintenance of municipal sewer systems; Mark and Darrell reported that it has saved the Fort Worth Water Department $42 million.

I was able to interview Dr. Avideh Zakhor; Nicolas Arnold; Ron Singh; Larry Kleinkemper of Lanmar Services, which specializes in creating BIM models of existing buildings; and Greg Bentley of Bentley, who has a broad, forward-looking perspective on the "digitalization" of infrastructure.

January 26, 2015

In the last few years I have come across a number of UAVs at the various conferences I have attended. They range from hexacopters to fixed-wing aircraft but are typically limited to flight times of 40-50 minutes because of battery capacity. I am also aware of a UAV with a daytime endurance estimated at 5 to 12 hours, depending on wing configuration, weather, and flight profile. All of these have a serious constraint in common: flights are severely limited by weather conditions.

Wave Gliders

In 2009 a pair of unmanned Wave Gliders travelled from Hawaii to San Diego encountering a wide range of weather conditions on the way. The 2,500 mile trip took 82 days. As of 2013 the total Wave Glider fleet had exceeded 400,000 nautical miles at sea.

The Wave Glider is an unmanned vessel for collecting data with a variety of sensors that uses both wave power and solar energy to navigate in any kind of ocean conditions, from doldrums to hurricanes and cyclones. It converts wave motion to propulsion by exploiting the difference between wave motion at the surface and below the surface, where it is much less; its two-part architecture is what allows this difference to be converted to propulsion. It is also able to generate its own power, with a maximum payload power capacity of 260 W and a battery capacity of up to 6.86 kWh. The power system supports various types of energy sources and multiple powered sensors, as well as an auxiliary electric thruster and navigation, GPS, computing, and communications equipment.
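A quick back-of-the-envelope calculation on the figures above shows why harvesting wave and solar energy matters: on battery alone, the full payload budget would last only about a day.

```python
# Rough endurance check using the figures quoted above: a 6.86 kWh
# battery supplying the maximum 260 W payload draw, with no solar or
# wave-harvested input. This ignores thruster and avionics loads.
battery_wh = 6.86 * 1000  # 6.86 kWh expressed in watt-hours
payload_w = 260           # maximum payload power capacity
hours = battery_wh / payload_w
print(round(hours, 1))  # ~26.4 hours on battery alone
```

Against missions measured in months (the Hawaii-to-San-Diego trip took 82 days), that ~26-hour figure makes the self-powering design a necessity rather than a convenience.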

Everything is controlled by software. Since August 2011 James Gosling, well known as the "father of Java", has been the chief software architect at Liquid Robotics. The software environment has several components and can be used to control either single vessels or fleets of Wave Gliders. A web application for remote mission management includes piloting and data management tools, and provides browser and programmatic interfaces (APIs) that allow users and applications to send commands to, and receive data from, the Wave Glider and its sensors. The Wave Glider has its own operating system for the onboard command-and-control computer, responsible for communications, navigation and collision avoidance, power, and security. Sensor management software runs on the onboard sensor management computer, managing the functions of, and data collected from, all the sensors on the Wave Glider. The data services environment provides cloud-based infrastructure that manages the flow of data from the Wave Glider to remote pilots, mission specialists, and other users and software applications.

January 19, 2015

For the first time in a hundred years, the electric power utility industry is undergoing a momentous change. Distributed renewable power generation, especially solar photovoltaics (PV), is introducing competition into an industry that has been managed as regulated monopolies. Consumers with solar PV panels on their roofs are fundamentally changing the traditional utility business model. A recent report from the Edison Electric Institute (EEI) refers to disruptive challenges that threaten to force electric power utilities to change or adapt the business model that has been in place since the first half of the 20th century.

Most utilities are in the midst of deploying smart grids, which basically amounts to applying the internet to the electric power grid, linking intelligent electronic devices, sensors, and grid control applications to enable data-driven decision making. One of the most important changes driven by the implementation of the smart grid is the much greater importance of location. Geospatial technology (location, geospatial data management, and spatial analytics) is seen as foundational technology for the smart grid.

The other major global change in energy is the shift in energy demand from the world's advanced economies to emerging economies. Energy demand from OECD countries has hit a plateau, and currently China is driving world energy demand. The International Energy Agency's (IEA) World Energy Outlook 2014 projects that as demand from China slows, world energy demand will be driven by India, the Middle East, Africa, and Latin America.

Recently IDC Energy Insights released a report, IDC FutureScape: Worldwide Utilities 2015 Predictions, with predictions for the future of the utility business. Some of these are startling, suggesting that the utility industry is going to experience fundamental changes in how it does business over the next few years.

New business models

IDC predicts that utilities will be looking less to generation as a source of revenue: by 2018, 45% of new data traffic in utilities' control systems will originate from distributed energy resources that are not owned by the utility.

To make up for this loss of generation revenue, IDC predicts that utilities will look for new business opportunities such as services; specifically, it predicts that utilities will derive at least 40% of their earnings from new business models by 2017.

Technology

Cloud - By 2018 cloud services will make up half of the IT portfolio for over 60% of utilities.

Integration - In 2015 utilities will invest over a quarter of their IT budgets on integrating new technologies with legacy enterprise systems.

Analytics - By 2017 45% of utilities' new investment in analytics will be used in operations and maintenance of plant and network infrastructure.

Mobility - 60% of utilities will focus on transitioning enterprise mobility to capitalize on the consumer mobility wave.

Some of the important drivers of these trends include the global redistribution of energy demand from the world's advanced to emerging economies; the rapid emergence of cloud-based provisioning and services; increasing regulatory pressure, responding to customer demand, to improve energy market transparency and competitiveness; cross-industry competition for technical, especially IT, skills; smart analytics; and virtual and augmented reality beginning to be applied in business.

Top 10 technology trends

In March 2014 Gartner, Inc. identified the top ten technology trends that it saw impacting the global energy and utility markets. There is considerable overlap between IDC's business predictions and the technology trends identified by Gartner.

Social Media

Social media are beginning to be used as a customer acquisition and retention medium, as a consumer engagement channel to drive customer participation in energy efficiency programs, as a way of communicating with customers about outages, and in the emerging area of crowd-sourced coordination of distributed energy resources.

Big Data

Smart grid will increase the quantity of data that utilities have to manage by a factor of about 10,000, according to a recent estimate. This trend is driven by intelligent devices, sensors, social networks, and new IT and OT applications such as advanced metering infrastructure (AMI), synchrophasors, smart appliances, microgrids, advanced distribution management, remote asset monitoring, and self-healing networks. The type of data utilities need to manage will also change: for example, real-time data from sensors and intelligent devices, including smartphones, and unstructured data from social networks will play a much greater role for utilities in the future.
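An order-of-magnitude check makes the "factor of about 10,000" plausible. Moving a meter from one manual read per month to typical 15-minute AMI interval data multiplies readings by nearly 3,000 on its own; sensor, synchrophasor, and event streams account for the rest. The interval lengths below are typical AMI values chosen for illustration, not figures from the estimate itself.

```python
# Illustrative arithmetic only: one manual read per month versus
# 15-minute interval readings over a 30-day month.
reads_per_month_manual = 1
reads_per_month_ami = 30 * 24 * 4  # four 15-minute readings per hour
factor = reads_per_month_ami // reads_per_month_manual
print(factor)  # 2880x from metering alone
```

Synchrophasors push this much further still: a PMU reporting 30 times per second produces tens of millions of measurements per device per month.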

Mobile and Location-Aware Technology

Mobile and location-aware technology, which includes hardware (laptops and smartphones), communication products (GPS-based navigation, routing, and tracking technologies), social networks (Twitter, Facebook, and others), and services (WiFi, satellites, and packet-switched networks), is transforming all industries. Utilities have for the most part been slow to adopt consumer mobile technology, but this is changing.

Cloud Computing

According to Gartner, areas such as smart metering, big data analytics, demand response coordination, and GIS are driving utilities to adopt cloud-based solutions. Early adopters of cloud technologies include small utilities with limited in-house IT skills and budgets; organizations that provide application and data services to multiple utilities, such as cooperative associations and transmission system operators; and investor-owned utilities (IOUs) conducting short-term smart grid pilots.

Sensor Technology

Sensors, which are being applied extensively throughout the supply, transmission, and distribution domains of utilities, provide a stream of real-time information from which the real-time state of the grid can be derived.

IT and OT Convergence

Virtually all new technology projects in utilities, such as AMI or advanced distribution management systems (ADMSs), will require a combination of IT and OT investment and planning. This will be a challenge for many utilities, especially smaller ones that don't have in-house IT skills.

Internet of Things

Sensors and actuators embedded in physical objects are linked through wired and wireless networks, using the same Internet Protocol (IP) that connects the Internet. When intelligent objects can both sense the environment and communicate, they become tools for understanding utility grids and responding to changes in near real time. Following McKinsey, there are two key benefits arising from the Internet of Things for utilities.

Asset Performance Management

Traditional asset management approaches are too limiting for today's performance-based, data-driven utility environment. Asset performance management solutions need to deliver real-time equipment performance, reliability, maintenance, and decision support for effective resource management, so that operations and maintenance teams get the right information at the right time and in the right context. The result is improved operational performance and better asset availability and utilization.

Business Intelligence and Advanced Analytics

Analytics will become essential as the volume of data generated by intelligent devices and sensors, mobile devices (the Internet of Things), and social media increases, and huge pools of structured and unstructured data need to be analyzed to extract actionable information. Analytics will become embedded everywhere, often invisibly.

December 23, 2014

Urbanization is a worldwide phenomenon, and smart city technology is becoming an essential element in the development of the world's megacities. For example, the new Indian government's just-released budget includes an allocation for initiating the development of 100 smart cities. Songdo IBD in Korea and Fujisawa in Japan are two smart cities already under development; China has 36 smart cities in development and a low-carbon model city in Tianjin; Singapore plans to become a smart nation by 2015; Iskandar is Malaysia's first smart city; and the Delhi-Mumbai Industrial Corridor (DMIC) incorporates smart city concepts.

According to the report "Smart Cities Market - Worldwide Market Forecasts and Analysis (2014 - 2019)", published by MarketsandMarkets, the global smart cities market is forecast to grow from $410 billion in 2014 to $1.1 trillion by 2019, a compound annual growth rate (CAGR) of 22.5%. This includes smart homes, intelligent building automation, energy management, smart health, smart education, smart water, smart transportation, smart security, and related services. Most of this activity is expected to occur in Asia and the Middle East.
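The quoted figures are internally consistent, as a quick compounding check shows: growing $410 billion at 22.5% per year for the five years from 2014 to 2019 lands close to the forecast $1.1 trillion.

```python
# Sanity check of the MarketsandMarkets forecast quoted above.
start_b = 410.0  # 2014 market size in $ billions
cagr = 0.225     # 22.5% compound annual growth rate
years = 5        # 2014 through 2019
end_b = start_b * (1 + cagr) ** years
print(round(end_b))  # ~1131, i.e. roughly $1.1 trillion
```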

Navigant Research forecasts that cumulative global investment in smart city technologies, including smart grid, advanced water monitoring systems, transportation management systems, and energy efficient buildings, could total $174.4 billion from 2014 to 2023, growing from $8.8 billion annually in 2014 to $27.5 billion in 2023. Navigant forecasts that annual smart city technology investment in Asia Pacific will increase from $3.0 billion in 2014 to reach $11.3 billion in 2023.

What are cities already doing?

In a recent study by Arup and University College London, Delivering the Smart City, the researchers analysed the spending patterns of eight U.K. cities: Leicester City Council, Manchester City Council, Leeds City Council, Portsmouth City Council, Sheffield City Council, Liverpool City Council, Bristol City Council, and Coventry City Council. These are medium-sized cities with populations between 200,000 and 760,000.

This type of analysis has become much easier because of the open data movement in the U.K., which has generated significant amounts of accessible data about government spending and procurement. The analysis showed that the eight cities were spending on average 6% of their expenditure, an average of £23 million a year, on information technology (IT). Four of the eight spent between 8% and 10% of their budgets on IT. To put this in context, the proportion these city governments are spending on IT is more than in many industries and is comparable to the banking and financial services industry, which spends on average 8% of operating expenditure on IT.

This research shows that cities are already investing a significant amount on something which is a foundation for smart city technology. IT already underpins many services offered by city governments today including public security and health, transportation, public works, natural resource management, and permitting and licensing.

Other research also shows that city governments globally are spending a significant proportion of their budgets on IT. Gartner analysed the IT spending patterns of 99 local governments across 80 countries and found that IT accounted for 3.8% of their total operating expenses. U.S. local governments, including 3,200 counties and 19,000 cities, spent approximately $34 billion on traditional IT goods and services in 2013.

According to Gartner, national and regional governments in the Middle East and North Africa will spend US $12.2 billion on IT products and services in 2014, up 1.3% from 2013. This includes internal services, software, IT services, data centers, devices, and telecom services. Saudi Arabia is investing in various digital government initiatives, including King Abdullah Economic City (KAEC), and the United Arab Emirates (UAE) is planning multiple smart cities.

City governments are not the only organizations providing IT services to support city operations. For example, the installation of 5,000 smart meters in homes and businesses across London involved investment from private companies including EDF Energy, Siemens, and Logica, as well as the electricity transmission and network operators National Grid and U.K. Power Networks, the transport operator Transport for London, and the city government, the Greater London Authority.

The U.S. spending analysis also showed that local governments, in addition to buying more traditional IT products and services, are procuring cloud-based services to modernize citizen services and reduce operational costs; an example is online portals for tax collection and business licensing. These IT projects are usually part of wider modernization efforts within departments rather than standalone technology initiatives at the city level.

UK city governments buy IT products and services from large vendors

City governments in the U.K. tend to purchase their IT products and services from large vendors. According to the U.K. analysis of eight cities, large and middle-sized vendors accounted for the overwhelming majority of IT spending (98%), with small businesses accounting for only 2%.

October 15, 2014

At Oracle OpenWorld 2014 in San Francisco, September 28 - October 2, the Oracle Spatial and Graph SIG held a Meetup. By way of background, Oracle Spatial and Graph supports geospatial data and analytics for land management and GIS, mobile location services, sales territory management, transportation, LiDAR analysis, and location-enabled business intelligence. The graph features include RDF graphs for applications ranging from semantic data integration to social network analysis to linked open data, as well as network graphs used in transportation, utilities, energy, and telcos, and drive-time analysis for sales and marketing applications.

There were several discussions at the Meetup that revealed interesting aspects of Oracle Spatial itself and its interoperability with other products such as ESRI ArcGIS, Autodesk Map 3D, and Bentley. These are taken from a summary recorded by Nick Salem, moderator of the Meetup.

The Meetup was attended by about 20 users from industries such as rail, local city government, national land management agencies, and defense. The SIG board and members of the Oracle Spatial product development team joined them in the Oracle Technology Network Lounge.

Some of the most interesting discussion topics included new features in Oracle Spatial 12c and real-world experience interoperating Oracle Spatial with CAD and GIS products from other vendors.

The new SDO_POINTINPOLYGON function in Oracle Spatial 12c can be used to filter large point datasets (such as point clouds) against a polygon without the overhead of creating a spatial index.
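To see why this matters, it helps to recall the underlying geometric predicate: deciding, for each point, whether it falls inside the polygon. A common way to do that is the even-odd (ray-casting) rule. The sketch below is purely illustrative Python, not Oracle code; the function name and the sample polygon are made up for the example.

```python
def point_in_polygon(x, y, polygon):
    """Return True if point (x, y) falls inside the polygon.

    polygon is a list of (x, y) vertices. Uses the even-odd
    (ray-casting) rule: cast a horizontal ray from the point to
    +infinity and count how many edges it crosses; an odd count
    means the point is inside.
    """
    inside = False
    n = len(polygon)
    j = n - 1  # previous vertex index; wraps around to close the ring
    for i in range(n):
        xi, yi = polygon[i]
        xj, yj = polygon[j]
        # Does edge (j -> i) straddle the horizontal line through y?
        if (yi > y) != (yj > y):
            # x-coordinate where the edge crosses that horizontal line
            x_cross = xi + (y - yi) * (xj - xi) / (yj - yi)
            if x_cross > x:  # crossing lies to the right of the point
                inside = not inside
        j = i
    return inside

# A 2x2 square centered at the origin.
square = [(-1, -1), (1, -1), (1, 1), (-1, 1)]
print(point_in_polygon(0.0, 0.0, square))  # True: inside
print(point_in_polygon(2.0, 0.0, square))  # False: outside
```

Running this test once per point, with no index, is exactly the kind of bulk, one-pass filtering that makes the function attractive for point clouds, where building a spatial index on billions of points would dominate the load time.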

Bechtel has developed a way to use ESRI ArcGIS versioning with Oracle Workspace Manager (OWM). They have built in-house custom tools that use feature-level metadata to retrieve spatial content from their project and public spatial databases, as well as to compile the revision history of layers in the database so they can identify maps affected by changes to individual features.

Dan Geringer, Oracle Spatial Solutions Architect, advised that other large enterprises have also successfully implemented OWM with ESRI ArcGIS.

Bulk processing and interoperability are two of the important advantages of Oracle Spatial for enterprises. Bechtel has large project-based deployments of Autodesk and Bentley CAD systems and has worked with Autodesk in recent years to successfully write spatial data to its enterprise GIS Oracle Spatial database from Autodesk Map 3D. Bechtel has also developed a GIS desktop procedure that details the Autodesk client software configuration required for Oracle Spatial and ESRI ArcSDE database connections, and will be updating it for use with Autodesk 2014 and 2015 clients.

The latest version of MapViewer (11.1.1.7) supports HTML5 and JSON and can work in disconnected mode. Standalone, web-based vector editing is also available. Wish Li and Jayant Sharma demonstrated MapViewer on mobile devices, an Android phone and an iPad.

August 27, 2014

Globally, smart grid technology has emerged to help utilities deal with challenges such as increasing reliability requirements, the need to reduce non-technical losses, distributed renewable generation, and electric vehicles. But for small to medium utilities, limited access to IT resources constrains their ability to implement smart grid solutions. Back in 2010 McKinsey was already seeing AMI vendors starting to look at options for providing AMI services using a "software as a service" model. Now power industry IT vendors and service providers are increasingly offering managed services solutions, referred to as smart grid as a service (SGaaS).

I blogged previously about Burlington Hydro, a small utility in an affluent part of southern Ontario that is integrating into an intelligent network many aspects of what is typically included in smart grid: intelligent network devices, self-healing networks, smart meters, distributed generation, electric vehicles (EV), factory ride-through systems (which enable factories to continue functioning through outages), battery-based electric storage, a bidirectional communications network linking the intelligent devices to the control center, and dramatically increased volumes of real-time data. Burlington Hydro has been working with a local IT consulting company, AGSI, to develop systems to manage its smart grid deployment in a real-time, big data IT environment. But what about small utilities that don't have the in-house skills or the revenue stream to support bringing in an outside IT consulting company?

What struck me as unique about what Terraspatial offers, and so valuable to small utilities, is that it is a hosted solution. All the utility needs to install at its site is a browser; everything else runs in the cloud. The most important benefit of a hosted solution like this is that it can provide a high level of IT security without requiring the utility to expand the IT capacity it maintains in house.

Terraspatial's hosted solution for electric power utilities is called PlantWorx. The design goals of the solution that Terraspatial developed are very relevant to small utilities:

Hosted, which means that the utility does not need to own or manage servers or software.

Secure, because it relies on a major cloud hosting provider such as Rackspace or Amazon, offering protection from internal tampering, role-based user access, protection from external threats, up-to-date encryption, redundancy and backups, ISO-certified data centers, and mirrored servers for persistent backup; in other words, a much higher level of security than the average utility network can achieve on its own.

Accessible from the office and the field

Integrated solution that supports staking through to accounting and reporting, with interfaces to CAD, GIS, customer information systems, accounting and billing systems, materials management, and other systems.

Now, according to Navigant Research, the growth in cloud-based services has increased awareness of SGaaS. Offerings are available for a host of smart grid applications in several categories, including home energy management (HEM), advanced metering infrastructure (AMI), distribution and substation automation (DA and SA) communications, asset management and condition monitoring (AMCM), demand response (DR), and software solutions and analytics.

The complexity of smart grid deployments, systems integration, spatial analytics, real-time big data, and cyber security, together with limited internal IT capacity, are some of the drivers behind a growing market for SGaaS. Navigant Research forecasts that the global SGaaS market will grow from just under $1.7 billion in 2014 to more than $11.1 billion in 2023.
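For a sense of scale, those two figures imply a compound annual growth rate of roughly 23% per year over the nine years of the forecast. A quick back-of-the-envelope check, using only the dollar figures quoted above:

```python
# Implied compound annual growth rate (CAGR) of the Navigant forecast:
# just under $1.7B in 2014 growing to more than $11.1B in 2023.
start, end = 1.7, 11.1   # market size in $ billions
years = 2023 - 2014      # 9 years of growth

cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 23% per year
```

The exact rate depends on how much "just under" and "more than" round the endpoints, so treat 23% as an approximation rather than a published figure.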