USGIF GotGeoint Blog

USGIF promotes geospatial intelligence tradecraft and a stronger community of interest between government, industry, academia, professional organizations and individuals focused on the development and application of geospatial intelligence to address national security objectives.

November 14, 2017

At the Fall BIMForum/BuildCon conference in Dallas, two presentations about actual prefabrication experience - for data centres and for large automotive plants - both mentioned a shortage of skilled labour as an important motivation for turning to prefabrication for construction projects. In addition to reducing onsite labour requirements, improved quality, greater safety, faster project delivery, and in some cases reduced costs were mentioned as the key benefits of prefab.

A compelling presentation on the market drivers for and benefits of prefabrication for data centre infrastructure was given by Matthew Englert, Vice President at Rosendin Electric and President, Modular Power Solutions and Chris Crosby, CEO of Compass Datacenters. Data centers, which used to be defined in terms of servers and racks of servers, are now defined by containers of servers. Timing is of the essence for building data centres. Compass advertises "We’ll deliver your dedicated data center in six months from breaking ground on a pad ready site or we’ll pay you $100,000."

The electric power requirements of data centres are reckoned in the hundreds of kilowatts and even megawatts. A significant portion of the cost of a data centre is the electric power rooms that convert and manage power for the servers and other equipment.

Both Chris and Matthew said that an important motivation for turning to prefabrication for the power rooms is the decline in the number and quality of the skilled tradespeople required to build them. Older and more highly skilled tradespeople are retiring and are not being replaced by younger people. This is partly a North American problem because there are limited apprenticeship programs, perhaps because in the past skilled tradespeople were simply imported from Europe.

There are other advantages to prefabrication. It is easier to attract skilled workers to a plant under a roof with the amenities of a factory environment, as opposed to working outside, often in inclement weather. Weather is always a risk on a construction site. Another advantage of a factory environment is safety: statistics show that injuries are fewer in a prefab environment than on a construction site. Quality is another factor that is easier to control in a prefab plant.

For data centre power rooms, the prefab "skids" built by Modular Power Solutions are huge, weighing 60,000 to 80,000 pounds and requiring large flatbed trucks to deliver them from the prefab plant to the construction site. You can watch the construction of a couple of data centres at the Compass site: the power rooms are prefabbed offsite, delivered on flatbed trucks, and offloaded onto concrete pads that have been prepared in advance. The process is very efficient and reduces the congestion of tradespeople at the construction site.

Chris was blunt in his assessment of the appropriateness of existing BIM software for designing data centres. His perspective is that BIM software is no better than traditional CAD. The problem is that the level of detail (LOD) currently supported by BIM software is simply not detailed enough. He argues that if you are going to use 3D design software then mechanical design tools such as Solidworks are better than BIM tools. He believes that BIM software is ripe for disruption. Designing buildings and other infrastructure in the future is going to require new design tools and that is going to require significant investment in new technology. This is a very similar sentiment to Greg Luth's perspective on the advantages of high definition building information modeling (HD BIM) for structural engineering of large manufacturing plants such as the Tesla Gigafactory in Nevada.

Based on his experience, Chris believes that prefab always costs more the first time. Repetition is required to bring costs down to where prefab is cheaper than stick-built. Therefore Chris and Matthew are convinced that the next step is to move beyond prefab to manufacturing.

General Motors came to similar conclusions based on their experience in using Epsilon Industries prefabbed utility system upgrades for four GM automotive plants. As presented by Jeffrey Johnson, Manager, Process Energy Initiatives, Bob Stevenson, Senior Vice President, Ghafari Associates, Chris Wiederick, President, Epsilon Industries, and Christopher Hofe, Project Director, Barton Malow Company, the problems that encouraged the GM team to prefab most of the utility components were a shortage of skilled trades and a desire to avoid congestion resulting from many trades tripping over each other at construction sites. They estimated that prefabbing reduced their costs by about 20% and enabled them to deliver the completed utility plants for the automotive factories in eight months instead of eighteen. For example, a complete chilled water system, including a safety system and weighing 85,000 lbs, was manufactured in Kingston, Ontario, trucked down, and was up and running in 2.5 weeks. Also, since quality assurance for the chiller was completed in the factory in Kingston, the handover was virtually instantaneous compared to the prolonged period required for a stick-built chiller.

For GM a shortage of labour is a major problem and was a major motivator of the decision to use prefab components from Epsilon Industries. The GM team repeatedly emphasized that they didn't know how they could have staffed stick-built projects at these four sites. Onsite congestion at a construction site is also a major problem that is relieved by prefabbing components. Work conditions on a construction site are often poor because of bad weather, and weather is always a risk that can hold up a stick-built project. They also reported that supervising in a factory environment is much easier than on a construction site. GM found it easier to recruit labour for a factory environment and was able to retain better quality labour - people who wanted to settle down - as opposed to the high proportion of transients typical of construction projects.

October 27, 2015

At GeoAlberta 2015 Susan Ancel of EPCOR, a diversified water and electric power utility headquartered in Edmonton, Alberta described a unique solution to the problem of knowledge transfer from older, experienced to new, younger employees. EPCOR builds, owns and operates electrical transmission and distribution networks, water and wastewater treatment facilities and infrastructure in Canada and the United States.

I have blogged extensively about some of the workforce challenges that utilities in the developed economies are facing. These are often related to the aging workforce and in some countries to a shrinking workforce.

I have blogged about the shrinking workforce in countries around the world like Japan and Germany, where there have been declining populations. The shrinking workforce is acting as a brake on the German economy. Recently Germany liberalized its immigration laws and for the first time in a decade the population of Germany is actually increasing.

In the U.S., because of its policy of encouraging immigration, the shrinking workforce is not the result of population decline, but the result of a decreasing participation rate. One explanation for the decline is the aging workforce. Boomers are retiring at a rapid rate. This has created a serious problem for many utilities, where it has been estimated that over half the workforce will be eligible to retire in the next few years. But there is another trend that tends to mitigate this: the percentage of older workers who remain on the job post-retirement is trending upwards. For the age group 65 to 74, the proportion still working is nearly twice what it was in the 1990s. In the long term the problem remains: finding young workers to replace the older workers when they retire.

A recent report from PricewaterhouseCoopers' (PwC) Human Resource Services practice, based on data from 29 utilities representing nearly a quarter of a million employees, identifies some very interesting workforce trends, some new, in the power utility industry.

PwC research indicates that utilities are losing workers at an accelerating rate. The voluntary turnover rate increased by a full percentage point between 2010 and 2012, and for high performers and early tenured employees the rate of separation was especially high.

Critical knowledge loss

Older utilities workers tend to communicate information orally rather than documenting it in a database. When these workers leave, critical knowledge leaves with them. This impacts productivity and creates risk for the utility, especially when they leave before replacements can be effectively on-boarded.

Younger workers tend to be much more tech savvy. This is critical for the electric power industry as it moves toward the smart grid. But younger workers lack the knowledge of electric power systems that only experience can provide. They desperately need access to the information that is stored in the brains of older workers.

Knowledge transfer

PwC recommends that utility organizations adapt rapidly to these technology, resource and demographic trends. In particular, they need to evolve an efficient and effective approach to succession planning and knowledge transfer. PwC points out that an aging workforce, deeply knowledgeable but prone to resist change, needs to be a fully integrated participant in the change effort.

At EPCOR succession planning resulted in Project Knowhow, which put handheld devices into the hands of older workers with a cloud-based application that enabled them to rapidly take field notes. For example, they could take a picture and record the location of a transformer that was moved from its design location to avoid soggy, frequently inundated land. They could add notes like "I had to replace this transformer twice because of water damage, so we decided to move it 100 meters to the northwest." Susan credits the power utility in Sendai, Japan with the idea behind Project Knowhow. In a similar project in Japan, handhelds had been put in the hands of experienced workers several years before the Fukushima disaster. The information gathered from older workers was instrumental in quickly dealing with the after-effects of the tsunami.
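A field note of the kind Project Knowhow captures boils down to a photo, a geolocation, and the worker's free-text knowledge. The sketch below is an illustrative data model only - the field names and values are my assumptions, not EPCOR's actual schema:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class FieldNote:
    """One observation captured on a handheld device (illustrative schema)."""
    asset_id: str            # e.g. a transformer identifier (hypothetical)
    lat: float               # geolocation recorded in the field
    lon: float
    note: str                # the worker's free-text institutional knowledge
    photo_uri: str = ""      # link to the photo stored in the cloud
    recorded_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

# A worker records why a transformer was moved from its design location:
note = FieldNote(
    asset_id="TX-4711",
    lat=53.5461, lon=-113.4938,
    note="Replaced this transformer twice because of water damage; "
         "moved it 100 m northwest to higher ground.",
)
print(note.asset_id, note.lat, note.lon)
```

The point of the structure is that the tacit knowledge (the `note` text) is anchored to a specific asset and location, so a future crew standing at the site can pull it up.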

Operational effectiveness is critical for utilities, but with a static or even decreasing workforce, productivity has to improve and continue to improve. Driven by goals of improving productivity and improving the quality of their decision making through data-driven analytics, utilities are recognizing the importance of understanding their core business processes and implementing standardized, streamlined IT systems for these processes. PwC recommends that as a first step these processes need to be documented (before the knowledge about them walks out the door) by projects such as EPCOR's Project Knowhow to safeguard core institutional knowledge as part of change management practices.

April 16, 2014

I've blogged many times about the aging workforce challenge facing many industries. At the SPAR International conference, Wayne Rodieck of Anadarko Petroleum outlined what this means for the oil and gas industry and how he is using IT to bridge the generation gap.

The International Energy Agency (IEA) projects that by 2020 the U.S. will surpass Saudi Arabia as the world's largest oil producer. U.S. oil and gas production, driven by technologies that are unlocking light tight oil and shale gas resources, is rising dramatically. Since 2008 the U.S. oil and gas industry has increased production by 25%. This expansion has created 1.7 million new jobs.

Most of these jobs have been filled by young workers with little or no experience in oil and gas, but better acquainted with IT than the experienced workers who are on the verge of retirement. As in other industries, oil and gas is faced with the challenge of knowledge transfer to enable younger workers to be as productive as possible and to avoid a massive decline in productivity as the older generation retires.

Dr Apostol Panayotov of UC Denver described an online, spatially-enabled asset information system at Anadarko that is designed to be accessible to both generations. It provides a web-based user interface designed to serve up information about equipment on oil and gas facility sites, including photographs, facility attributes and accurate geolocation. It is based around point clouds captured by scanning valves, pumping stations, and other oil and gas pipeline infrastructure, but it hides the point cloud behind an intuitive user interface that relies on smart digital photographs of facilities. Clicking on a particular piece of equipment such as a valve, tank, or catwalk in a digital photograph links the user to information about that piece of equipment, including geolocation and dimensions derived from the point cloud.
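The photo-based navigation described above can be thought of as a hit test: clickable regions of a photograph map to asset records whose geometry was derived from the point cloud. The sketch below is a minimal illustration under that assumption - the asset IDs, coordinates, and hotspot layout are made up, not Anadarko's actual data or API:

```python
# Hypothetical asset records; in the real system the geolocation and
# dimensions would be derived from the scanned point cloud.
ASSETS = {
    "valve-17": {"lat": 31.94, "lon": -102.08, "height_m": 1.2},
    "tank-03":  {"lat": 31.95, "lon": -102.09, "height_m": 6.5},
}

# Each "smart photo" carries hotspots: pixel bounding boxes linked to assets.
HOTSPOTS = [  # (x0, y0, x1, y1, asset_id) in photo pixel coordinates
    (100, 200, 180, 320, "valve-17"),
    (400, 50, 700, 500, "tank-03"),
]

def asset_at(x, y):
    """Return (asset_id, record) for the equipment under a click, or None."""
    for x0, y0, x1, y1, asset_id in HOTSPOTS:
        if x0 <= x <= x1 and y0 <= y <= y1:
            return asset_id, ASSETS[asset_id]
    return None

print(asset_at(150, 250))  # a click landing on the valve's bounding box
```

The design choice worth noting is that the worker never interacts with the point cloud directly; the photograph is the interface, and the point-cloud-derived attributes surface only on demand.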

Wayne described a simple use case that explains what he sees as the critical advantage of this approach. There is an emergency, and at 2:30 am he has to send a young, inexperienced worker out to a site where there are 120 valves to turn off one of them. He feels confident that by providing the young worker access to this online system, which includes accurate geolocation, the worker will go to the right site and turn off the right valve. Equally important, the intuitive photograph-based user interface will not put off the experienced workers who are less comfortable with IT systems.

January 15, 2014

I have blogged extensively over the past few years about some of the workforce challenges that utilities especially in the developed economies are facing. These are often related to the aging workforce and in some countries to a shrinking workforce.

I have blogged about the shrinking workforce in countries around the world like Japan and Germany, where there have been declining populations. The shrinking workforce is acting as a brake on the German economy. Recently Germany liberalized its immigration laws and for the first time in a decade the population of Germany is actually increasing.

In the U.S., because of its policy of encouraging immigration, the shrinking workforce is not the result of population decline, but the result of a decreasing participation rate. One explanation for the decline is the aging workforce. Boomers are retiring at an increasing rate. This has created a serious problem for many utilities, where it has been estimated that over half the workforce will be eligible to retire in the next few years. But there is another trend that tends to mitigate this: the percentage of older workers who remain on the job post-retirement is trending upwards. For the age group 65 to 74, the proportion still working is nearly twice what it was in the 1990s. In the long term the problem remains: finding young workers to replace the older workers when they retire.

A recent report from PricewaterhouseCoopers' (PwC) Human Resource Services practice, based on data from 29 utilities representing nearly a quarter of a million employees, identifies some very interesting workforce trends, some new, in the power utility industry.

Increasing turnover

First of all the PwC research indicates that utilities are losing workers at an accelerating rate. The voluntary turnover rate increased by a full percentage point between 2010 and 2012, and for high performers and early tenured employees the rate of separation was especially high.

The research also shows that the turnover of utility employees in their first year was significantly higher in 2012 than in 2011, pointing to a retention problem. As the economy continues to strengthen, PwC anticipates that first-year turnover will continue to increase. This is a new experience for many utilities because in the past working for a utility was seen as a job for life. With the new skill sets for smart grid and related technologies requiring increasing IT skills, utilities are having to compete with other sectors for IT-savvy workers. PwC reports that other industries have been experiencing intense competition for workers for some time and are better equipped to attract and retain new talent.

Accelerating executive retirement

The PwC research also found that while the number of employees currently eligible or due to become eligible over the next 3-5 years for retirement appears to have stabilized, retirement eligibility rates for executives have continued to increase. From 2011 to 2012 there was a 50% jump in the number of executives currently eligible for retirement.

Perpetuating a culture that discourages innovation

PwC found that by forcing older employees either to delay retirement or to remain in place as contractors, the recession has helped perpetuate a conservative culture that has inhibited innovation in the industry. Older, experienced workers are unlikely to push for changes when they have only a few more years until retirement, and in a heavily regulated industry there is little incentive for these workers to innovate because of the potential for creating risk for themselves and their employer.

Critical knowledge loss

There is also a tendency for older utilities workers to communicate information orally rather than documenting it in a database. When these workers leave, critical knowledge leaves with them. This impacts productivity and creates risk for the utility, especially when they leave before replacements can be effectively on-boarded typically through a mentoring process.

Retention

Younger workers tend to be much more tech savvy. This is critical for the electric power industry as it moves toward the smart grid. As I have blogged about frequently, younger workers' expectations with respect to technology are also higher. Sitting a younger worker who has been brought up on 3D technology down with a CAD application to create and print a 2D paper construction drawing risks losing him or her to more technically progressive utilities or to other industries.

Challenges

PwC suggests that there are three key areas with potential for improvement:

Knowledge Retention & Succession Planning

Operations

Technology & Processes

Knowledge transfer

PwC recommends that utility organizations adapt rapidly to these technology, resource and demographic trends. In particular, they need to evolve an efficient and effective approach to succession planning and knowledge transfer. PwC forecasts that the rate of process and technology change is likely to accelerate as smart grid adoption picks up. It also projects that a culture of continuous improvement will become more typical in the utility sector than it has been in the past.

Operations

Operational effectiveness is critical for utilities, but with a static or even decreasing workforce, productivity has to improve and continue to improve. This means adopting new technology. PwC points out that an aging workforce, deeply knowledgeable but prone to resist change, can significantly inhibit operational improvements unless these workers are fully integrated participants in the change effort. In my experience older workers can be just as enthusiastic about change, or even more so, if they can see that it simplifies their work, eliminates tedious, redundant, manual paper-based processes, and improves the quality of the final deliverable.

Technology & Processes

Driven by goals of improving productivity and improving the quality of their decision making through data-driven analytics, utilities are recognizing the importance of understanding their core business processes and implementing standardized, streamlined IT systems for these processes. PwC recommends that as a first step these processes need to be documented (before the knowledge about them walks out the door) to safeguard core institutional knowledge as part of change management practices.

December 04, 2013

At the Second Annual Summit on Data Analytics for Utilities in Toronto, Lincoln Frost-Hunt, Director of Enterprise IT at Hydro One, gave a riveting presentation on how one utility has evolved from a company that in the past tended to rely on "gut feel" decision-making to one that increasingly relies on data-driven decision-making.

Hydro One manages both transmission, almost 30 000 km or 96% of Ontario's transmission network, and distribution, covering about 75% of Ontario, mostly rural but some urban as well. The total value of their assets is C$17 billion and they manage a C$3 billion investment in those assets every year.

Data governance

Lincoln emphasized that a key prerequisite for a successful data-driven analytical approach is quality data. Bad data means bad decisions. When Hydro One started to make investment decisions based on analysis of its operational data, the deficiencies in its data sets immediately became apparent. For example, they found that only 76% of their connectivity data was reliable; the remaining quarter was unusable. To ensure the necessary level of data quality and reliability, data governance (processes that ensure that important data assets are formally managed) became a key underpinning of Hydro One's analytics strategy. They implemented major projects to clean up their connectivity and other data. Realizing that data quality is perishable, they also implemented a data governance policy and focused on improving the business processes that produced bad data, such as the processes for managing as-builts, which I have blogged about on multiple occasions, most recently here. In addition, to ensure that data quality does not degrade over time and is maintained at the required level of reliability, they have implemented a system of quarterly audits.

Data analytics

Lincoln described several areas where Hydro One's data-driven approach has yielded value.

Customer billing

Since this is Ontario, all of Hydro One's customers have smart meters. Hydro One processes 65 000 bills per day. This is an essential process because it is a major source of revenue. A certain proportion of these bills get kicked out because of problems. From the time smart meters were first installed seven years ago, tracking down the problems that caused these bills to be rejected was a slow, manual process that used a lot of resources. In the last two years Hydro One has implemented an analytical tool set that allows them to resolve these problems and get the bills out to customers much more rapidly. Since bills generate revenue, the value derived from the analytical approach was immediately apparent to management and helped gain support for the data-driven approach to decision-making.
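At its core, this kind of billing analytics is exception triage: surface the rejected bills and group them by failure cause so they can be resolved in bulk rather than chased down one at a time. A minimal sketch, with made-up rejection codes and sample data:

```python
from collections import Counter

def triage(bills):
    """Group rejected bills by rejection reason, most common first."""
    reasons = Counter(b["reject_code"] for b in bills if b["status"] == "rejected")
    return reasons.most_common()

# Hypothetical daily batch: most bills go out, a few are kicked out.
bills = [
    {"id": 1, "status": "sent"},
    {"id": 2, "status": "rejected", "reject_code": "missing_meter_read"},
    {"id": 3, "status": "rejected", "reject_code": "missing_meter_read"},
    {"id": 4, "status": "rejected", "reject_code": "rate_mismatch"},
]
print(triage(bills))  # [('missing_meter_read', 2), ('rate_mismatch', 1)]
```

Ranking rejection causes by frequency is what converts a slow record-by-record hunt into a prioritized fix list: resolve the top cause and the largest block of stuck bills goes out at once.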

Asset management

Improving utilization of assets is a key objective of the regulator, the Ontario Energy Board (OEB). Hydro One took an analytical approach based on modeling equipment failure so that they could estimate the probability of failure from conditions such as age and average load. This allowed them to generate risk curves, which enabled them to demonstrate to the OEB that their investment strategy had found the sweet spot between the cost of maintenance and the risk of major failure. Geospatial technology was a key part of the asset analytics toolset that helped them optimize the lifecycles of their assets.
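Finding that sweet spot can be sketched as minimizing expected annual cost: maintenance spend pushes the probability of failure down, and the optimum is where total expected cost bottoms out. The failure model below (exponential decay in maintenance spend, with arbitrary coefficients) is a deliberately simplified assumption, not Hydro One's actual risk model:

```python
import math

def failure_probability(spend, base_p=0.30, k=0.004):
    """Toy model: probability of major failure falls off with maintenance spend."""
    return base_p * math.exp(-k * spend)

def expected_cost(spend, failure_cost=1_000_000):
    """Total expected cost = maintenance spend + risk-weighted failure cost."""
    return spend + failure_probability(spend) * failure_cost

# Sweep maintenance budgets to find the sweet spot on the risk curve.
budgets = range(0, 2001, 50)
best = min(budgets, key=expected_cost)
print(best, round(expected_cost(best)))
```

Plotting `expected_cost` over the sweep gives exactly the kind of risk curve a utility can show a regulator: spend too little and risk dominates, spend too much and maintenance dominates, and the minimum is the defensible investment level.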

Workflow optimization

Like most utilities, Hydro One is facing the challenge of an aging workforce, so improving productivity is a key corporate objective. This is a particular challenge for Hydro One because of its largely rural service area. Sending a crew to resolve a problem or respond to a service request could require hours of driving to the site and back. Putting together "work bundles" comprising all the service requests and trouble calls in a geographic area reduces the time the crew spends on the road and maximizes their time actually resolving issues and servicing requests. To do this they installed GPS, communications and ruggedized tablets on every truck so they knew where every crew was and the crews had access to the information they needed. They were also able to show all trouble calls and service requests on a map for a three-month period so they could easily assemble work bundles by geographic area. Another major benefit they found was that when they received an emergency trouble call, they could determine immediately which crew was nearest, thus minimizing travel time. They also invested in route optimization so that the crew knew the most efficient route to follow. The bottom line is that crews spent significantly less time driving and more time resolving trouble calls and responding to service requests.
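The two dispatch ideas above - bundling nearby requests, and sending the nearest crew to an emergency - can be sketched as binning requests into geographic cells and picking the closest crew by great-circle distance. The cell size and the haversine rule are illustrative choices, not Hydro One's actual algorithm:

```python
import math
from collections import defaultdict

def bundle(requests, cell_deg=0.5):
    """Group service requests into lat/lon grid cells ('work bundles')."""
    bundles = defaultdict(list)
    for req in requests:
        cell = (int(req["lat"] // cell_deg), int(req["lon"] // cell_deg))
        bundles[cell].append(req["id"])
    return dict(bundles)

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = p2 - p1
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 6371 * 2 * math.asin(math.sqrt(a))

def nearest_crew(crews, lat, lon):
    """For an emergency call, dispatch the crew closest to the trouble spot."""
    return min(crews, key=lambda c: haversine_km(c["lat"], c["lon"], lat, lon))

crews = [{"id": "A", "lat": 44.4, "lon": -79.7}, {"id": "B", "lat": 45.4, "lon": -75.7}]
print(nearest_crew(crews, 44.5, -79.9)["id"])  # crew A is closest
```

A production dispatcher would cluster on road-network travel time rather than straight-line distance, but even this crude version captures why GPS on every truck pays off: the nearest-crew decision becomes instantaneous.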

Enterprise architecture

Lincoln described Hydro One's current enterprise architecture as comprised of three major enterprise systems with many heavily customized point solutions. I suspect that many utilities would find themselves with a similar architecture. Each point solution is a silo and the enterprise systems are big silos. Whatever interoperability there is has either been customized or acquired from a third party by Hydro One. Every time a vendor upgrades a point solution, Hydro One has to retest the links to other systems.

They evaluated three alternative architectures ranging from a best of breed approach with many point solutions focused on automating key business processes to a single enterprise system with bolt-ons or point solutions designed to run with the single enterprise system. They decided on a hybrid approach, a single enterprise system with minimal point solutions when justified by incremental value.

Basically Hydro One relies on SAP for key business functions like ERP, customer information management, and asset management. Their analytical tool set is Business Objects, another SAP module, integrated with a GIS designed to integrate with SAP.

As an example of a bolt-on, they use an advanced visualization tool (Space-Time Insight, which is an SAP partner) as part of their analytics tool set. Lincoln said that visualization had really changed people's view of analytics because it enabled them to visualize operational data in space and time in ways they had never been able to do before. Visualizing based on geolocation, time and network topology has helped them gain insights about their operations that they would never have seen by looking at Excel spreadsheets. It also created a bit of a challenge for Hydro One's IT folks, because it created an IT traffic jam - everyone wants to use the visualization tool and no one wants to look at Excel spreadsheets.

Smart zone

Hydro One has implemented what they call a "smart zone" around Barrie, north of Toronto, where all equipment consists of intelligent electronic devices (IEDs), including automated substations - basically a distribution smart grid. They intend to roll this out to the rest of their service territory.

Real-time big data

Hydro One realizes that the volume of data they need to manage is increasing astronomically, and with IEDs they need to be able to analyze and make decisions in real-time or at least near real-time. Lincoln said that their next major IT development focus is to be better able to manage "big data" in real-time. I have blogged on a couple of occasions (real-time monitoring smart devices for decision making and using social media streams to improve customer experience) about an example of technology that is designed to help do this.

November 11, 2013

At the Space-Time Insight annual SI World conference this year, Dave Haak of Accenture gave a riveting overview of the research that Accenture has conducted on smart grid deployment including a survey of utility executives in Europe and North America.
Some fundamental questions that Dave was able to shed some light on are:

Is smart grid here to stay or is it a passing fad?

Is smart grid expected to reduce the cost of maintaining the grid in 2030?

How do you manage the complexity of the modern distribution grid?

Is the electric power utility industry entering a disruptive phase?

Managing the smart grid

Dave discussed the complexity of the modern grid, which has to support distributed generation including intermittent renewable energy sources like solar PV and wind, energy storage, PEVs, and potentially microgrids. Monitoring and managing the modern complex grid is a challenge that the traditional tools used for the current grid, such as traditional GIS, simply cannot handle. A new "converged" approach is required that tightly integrates GIS, intelligent grid analytical tools using grid topology (what's connected to what) and visualization, all in real-time. It needs to integrate traditional IT systems (GIS, CIS, work management, financials, and meter data management) and traditional OT systems (outage management, distribution management, SCADA, and energy management). Accenture points out that other industries such as telecommunications, retail and banking have gone through a similar disruptive phase and have applied IT, including embedded analytics, to transform their business processes and improve productivity.

Is smart grid here to stay?

98% of respondents in Accenture's executive survey said that smart grid is a natural extension of the ongoing upgrading of the grid. In other words this is not a flash-in-the-pan fad that will go away. The evolution of grid operations is going to involve a broad change to how utilities are organized and how they operate. Accenture predicts that it will require a major overhaul of utility organizations and will affect operational groups, as well as information technology and security.

Staffing the next generation utility

It also requires development of new skills, either through hiring new staff or retraining existing staff. This is a key part of the challenge. I have blogged on multiple occasions about the challenge facing the electric power industry as the older generation retires and a younger, but inexperienced workforce takes their place. Accenture reports that utilities are finding it difficult to staff their workforces with the necessary combination of power engineering and IT skills.

Grid reliability and resilience

As the world becomes more dependent on IT solutions in our daily lives, from point of sale to hospital admissions, reliability of the electric power network has become increasingly important. A few years ago 40% of U.S. power went to chip technologies. By 2015 this is expected to reach 60%. Reliability of electric power supply varies significantly among countries. For example, the duration of outages (SAIDI) in the U.S. is greater than in the UK, France, or Italy, and much greater than in Germany. I was not happy to see that Canada's average outage duration is over twice that of the U.S. and 20 times that of Germany.
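SAIDI, the metric behind those country comparisons, is defined (per IEEE 1366) as total customer-minutes of interruption divided by total customers served. A quick worked example with made-up outage data for a hypothetical small utility:

```python
def saidi(outages, customers_served):
    """System Average Interruption Duration Index, in minutes per customer.

    Each outage is a pair (customers_interrupted, duration_minutes).
    """
    customer_minutes = sum(n * minutes for n, minutes in outages)
    return customer_minutes / customers_served

# Hypothetical year for a utility serving 100,000 customers:
outages = [
    (5_000, 120),   # storm: 5,000 customers out for 2 hours
    (20_000, 45),   # substation trip: 20,000 customers out for 45 minutes
    (1_000, 600),   # rural feeder: 1,000 customers out for 10 hours
]
print(saidi(outages, 100_000))  # 21.0 minutes per customer on average
```

Because the numerator weights each outage by both breadth and duration, a long outage on a sparse rural feeder can contribute as much to SAIDI as a short, wide urban one - which is part of why Canada's largely rural networks fare poorly in these comparisons.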

A major challenge facing utilities as they move to the smart grid is how to achieve better reliability (and reduced emissions) at less cost. Accenture asked executives whether smart grid would reduce the cost of maintaining the grid in 2030. 72% of respondents said that they expected the costs of upgrading and maintaining the grid to be less in 2030 as a result of the adoption of smart grid technologies.

Smart grid business drivers in different regions

There are significant differences between different regions of the world in the priority business drivers for smart grid deployment. For example, in Brazil and India the primary driver is non-technical losses. According to the Accenture survey in North America the primary drivers for smart grid are improving grid reliability and outage response. In Europe where climate change is taken more seriously, distributed generation and support for renewables are priorities.

Increasing competition from microgeneration

Accenture's view is that the industry is experiencing a disruptive phase. Technologies such as distributed generation, storage, EVs and microgrids have the potential to impact competition and regulatory models in a fundamental way over the longer term.

In every part of the world, decreasing solar PV prices combined with cheaper storage are beginning to displace centralized generation. This is changing the economics of the traditional electric power business model, which is based on the existing, centralized power networks. Accenture argues that if consumers are given a cost-effective option to move off the grid (for example, grid parity of coupled solar PV and storage), utility revenues will be at risk.

Utilities under threat from PV and other microgeneration technologies will need to look at novel strategies to develop new sources of revenue and integrate new distributed generation capacity while maintaining quality and security of supply.

23% of respondents said they expected their utility to have a significantly different business model by 2030 — for example, incentivizing consumers with microgeneration capacity to stay on the grid, or developing and maintaining microgrid solutions on behalf of consumer groups.

Other areas of competition

The Accenture 2013 executive survey indicates that a substantial degree of competition from new entrants is expected even within the next 5 years, with respondents seeing increased competition in several areas.

While increased competition in areas such as distributed generation and demand aggregation is widely expected, respondents expect competition to increase even in the core distribution business including embedded energy storage, power electronics hardware and services and last-mile network deployment.

The changes that are required are fundamental and a majority of the respondents said that legislative and regulatory change will be required.

Data management

The telecommunications, insurance, and other industries view IT as a strategic asset, whereas utilities still treat it as a cost centre. Utilities are underinvested in data management technology, which does not position them well for the rapidly increasing volume of data and the competitive environment they are beginning to experience. 96% of the utility executives Accenture surveyed said that data management will be critical or important for managing the complexity of the network. Some industries are taking data management very seriously: a survey of 600 companies revealed that 66% of them have appointed a Chief Data Officer.

Convergence and Analytics

The area where respondents saw the most value from analytics was grid operations (96%). Being able to use visualization and analytical tools to monitor and manage complex modern networks — which include distributed generation, storage, PEVs, and potentially microgrids — is the first priority in both Europe and North America. This is where the next generation of "converged" analytical tools is required: tools that integrate GIS, artificial intelligence, grid analytical models that account for network topology, and visualization, and that can operate in a real-time data streaming environment.

Outage management (93%) and asset management (92%) were the next priorities in North America. In Europe the next priorities were asset management (92%) and volt/var analytics (83%).

In 2009, 12% of the companies surveyed were using predictive analytics. This had risen to 33% in 2012. But in the same year, 60% of internal customers surveyed said that they wanted predictive analytics.

In the specific area of AMI, Accenture's analysis suggests that the value of using smart grid analytics to transform operating results could conservatively approach $40-$70 per meter per year. Financial benefits of this order provide a strong motivation for companies to use modern grid analytics to drive change in business processes.
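To put that per-meter figure in fleet terms, here is a back-of-the-envelope sketch (the fleet size is a hypothetical example, not from the Accenture report):

```python
# Accenture's estimate: smart grid analytics could be worth roughly
# $40-$70 per meter per year. The meter count below is hypothetical.
meters = 1_000_000
low, high = 40, 70  # dollars per meter per year

annual_low = meters * low
annual_high = meters * high

print(f"Annual value for {meters:,} meters: "
      f"${annual_low / 1e6:.0f}M to ${annual_high / 1e6:.0f}M")
```

For a utility of that size, the analytics opportunity is on the order of tens of millions of dollars per year — which explains the strong motivation to change business processes.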

September 10, 2013

I have blogged about the shrinking workforce in countries with declining populations, such as Japan and Germany. The shrinking workforce is acting as a brake on the German economy. The German Chamber of Industry and Commerce (DIHK) estimates that German economic growth has been reduced by one percent by the labour shortage, and that the problem is getting worse. Recently Germany liberalized its immigration laws, and for the first time in a decade the population of Germany is actually increasing.

In the U.S., because of its policy of encouraging immigration, the shrinking workforce is not the result of population decline but of a declining participation rate. The labour participation rate is the percentage of the working-age population that is employed or actively looking for work. In the U.S., as a result of increasing numbers of women entering the workforce and increasing longevity, the participation rate rose from the 1960s until about 2000, when it reached about 67%.
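The participation rate itself is a simple ratio. A quick sketch with illustrative numbers (not actual BLS figures) shows how it is computed:

```python
# Labour force participation rate =
#   (employed + actively looking for work) / working-age population.
# All numbers below are illustrative, not official statistics.
employed = 150_000_000
unemployed_looking = 10_000_000
working_age_population = 240_000_000

labour_force = employed + unemployed_looking
participation_rate = labour_force / working_age_population

print(f"Participation rate: {participation_rate:.1%}")
```

Note that discouraged workers who stop looking for work drop out of the numerator entirely, which is why the rate can fall even when the unemployment rate improves.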

Since 2000, however, the participation rate has declined. One explanation is the aging workforce: boomers are retiring at an increasing rate. This has created a serious problem for many utilities, where over half the workforce will be eligible to retire in the next few years.
There is another trend that tends to mitigate this: the percentage of older workers who remain on the job past retirement age is trending upwards. For the 65-to-74 age group, the proportion still working is nearly twice what it was in the 1990s. In the long term, though, the problem remains: finding young workers to replace the older workers when they retire.

March 12, 2013

I have blogged frequently on the workforce challenges facing the utility industry, resulting from accelerating retirement among engineers and skilled workers and from the technology transformation associated with the smart grid. Community colleges, in partnership with utilities, have been the quickest to respond to this challenge. But universities, encouraged by programs like the IEEE PES Scholarship Plus program, are responding as well, though they too face an aging workforce problem. Now something has appeared on the horizon in education that may provide a way of ramping up more rapidly to the challenge of training the next generation of engineers and skilled workers.

From October 10th to December 18th, 2011, Stanford offered a free online course, "Introduction to Artificial Intelligence", open to anyone, that attracted 160,000 students. The course was taught by Sebastian Thrun and Peter Norvig, based on Stanford's introductory artificial intelligence course. Thrun led the development of the Google self-driving car. Norvig worked at NASA and is co-author of a widely used college textbook on artificial intelligence. The term applied to this type of online education is massive open online courses (MOOCs).

There were a number of precursors. The one explicitly referenced by the Stanford instructors is the Khan Academy, which offers videos to instruct anyone in a wide range of topics including arithmetic, algebra, geometry, trigonometry, calculus, probability and statistics, differential equations, cosmology and astronomy, organic chemistry, finance and capital markets, microeconomics, macroeconomics, computer science, healthcare and medicine, drawing, programming basics, animation, history, American civics, and art history.

The open online course that I have watched over and over again is Open Yale's Roman Architecture given by Professor Diana E.E. Kleiner of Yale University and available for free on iTunes.

In August 2012, the online education company Coursera began offering free college courses. According to the New York Times, by January 2013, it had attracted a million users, growing at a rate outpacing even Facebook and Twitter. There are at least three other startups (Udemy, Udacity, edX) attempting to do the same thing.

The MOOC approach is gaining some acceptance among traditional universities and colleges. The American Council on Education — which represents the presidents of 1,800 U.S. accredited, degree-granting institutions, including two- and four-year colleges, private and public universities, and nonprofit and for-profit entities — has recommended that colleges could grant credit for some of the free classes from Coursera.

January 24, 2013

I have blogged on many occasions about the pathological business processes for designing, building, operating, and maintaining infrastructure that plague many utilities. Some of these processes don't seem to have changed since Edison and Tesla's time, and they result in redundant data, wasted effort, backlogs, and poor data quality.

At the EDIST 2013 Conference in Toronto, Hydro Ottawa described how it used a Lean process to address this problem, specifically in the capital execution (design and build) process.

Hydro Ottawa spends $90 million every year on capital projects, and this is forecast to increase because of aging infrastructure and the transformation to the smart grid. But in this time of aging and shrinking workforces at utilities, it has found that while capital expenditures are going up, there is less staff to manage the process. The result has been frustrated staff, delays in construction because designs are not completed in time, increased spending on overtime, a significant carryover of capital, and, worst of all, an unhappy regulator (the Ontario Energy Board).

The Lean process involved convening a team of 10 grassroots people from all divisions of the company, who spent 5 days identifying the current state and the issues associated with it; 66 issues were identified. They took a break for 2 weeks and then got together for 5 days of brainstorming, resulting in a set of recommendations to upper management. They prioritized the recommendations based on projected impact and size, moving first on simple processes involving one department with immediate payback. After successfully completing a few of these, they moved on to more complex processes involving multiple departments.

The results speak for themselves. Even though the capital budget has increased they are getting the work done and have reduced overtime by 35%.

January 17, 2013

The Utilities Standards Forum (USF) is an organization representing about 50 Ontario electricity distribution companies. Its goal is to collaboratively define standards for its members and reduce duplication of effort.

The USF has developed design standards for electricity distribution that meet the requirements of Ontario Regulation 22/04 under the Electricity Act of 1998. Currently these are in the form of paper or PDF documents containing 2D isometric views of the components used in electricity distribution such as insulators, clamps, surge arrestors, fuse-cutouts, terminations, cross arms, cable guards, and other types of equipment.

There is pressure to move to 3D design standards, for several reasons. At a time when the utility industry faces an aging workforce, increasing the efficiency of engineers and designers is an important motivator. Another, equally important one is the increasing number of younger designers and engineers who have grown up in a 3D digital world and want to work in a modern 3D environment.

At the EDIST 2013 Conference in Toronto, an overview of the new 3D design standards was presented by Ron Lapier and Lori Gallauger of the USF. The new 3D standard is a major step forward in two ways. First, the components required for electricity distribution design are now modelled in 3D, so that electricity distribution design can be performed in a 3D environment. Equally important, USF intends to make them available digitally as a library of DWG files containing 3D objects that can be brought into a 3D CAD environment and combined with other components to create a complete 3D representation of an electricity distribution design. The design can be viewed and manipulated in 3D, and 2D construction drawings can be generated from the 3D model. The components have real-world coordinates, so clearances and dimensions can be measured directly from the 3D model.