Tag Archives: predictive analytics

Last year, Scottish Local Government Chief Digital Officer Martyn Wallace spoke to the CIO UK podcast and highlighted that in 2019 local government must take advantage of artificial intelligence (AI) to deliver better outcomes for citizens. He explained:

“I think in the public sector we have to see AI as a way to deliver better outcomes and what I mean by that is giving the bots the grunt work – as one coworker called it, ‘shuffling spreadsheets’ – and then we can release staff to do the more complex, human-touch things.”

To date, very few councils have felt brave enough to invest in AI. However, the mood is slowly starting to change and there are several examples in the UK and abroad that show artificial intelligence is not just a buzzword, but a genuine enabler of change.

In December, Local Government Minister Rishi Sunak announced the first round of winners from a £7.5 million digital innovation fund. The 16 winning projects, from 57 councils working in collaborative teams, were awarded grants of up to £100,000 to explore the use of a variety of digital technologies, from Amazon Alexa-style virtual assistants to support people living in care, to the use of data analytics to improve education plans for children with special needs.

These projects are still in their infancy, but some councils are further along with artificial intelligence, and have already learned lessons and had measurable successes. For instance, Milton Keynes Council has developed a virtual assistant (or chatbot) to help respond to planning-related queries. Although still at the ‘beta’ stage, trials have shown that the virtual assistant is better at validating major applications, which are often based on industry standards, than household applications, which tend to be more wide-ranging.

Chief planner Brett Leahy suggests that introducing AI will help planners focus more on substantive planning issues, such as community engagement, and let AI “take care of the constant flow of queries and questions”.

In Hackney, the local council has been using AI to identify families that might benefit from additional support. The ‘Early Help Predictive System’ analyses data related to (among others) debt, domestic violence, anti-social behaviour, and school attendance, to build a profile of need for families. By taking this approach, the council believes they can intervene early and prevent the need for high cost support services. Steve Liddicott, head of service for children and young people at Hackney council, reports that the new system is identifying 10 or 20 families a month that might be of future concern. As a result, early intervention measures have already been introduced.
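The kind of profiling described above can be illustrated with a minimal risk-scoring sketch. Everything in this snippet is invented for illustration: the indicator names, weights, and threshold bear no relation to Hackney's actual Early Help Predictive System, which is far more sophisticated.

```python
# Hypothetical illustration of an 'early help' style risk score.
# Indicator names, weights and threshold are invented for this sketch;
# they do not reflect Hackney's actual model.

WEIGHTS = {
    "debt": 2.0,
    "domestic_violence": 3.0,
    "antisocial_behaviour": 1.5,
    "low_school_attendance": 2.5,
}
THRESHOLD = 4.0  # families scoring above this are flagged for review

def risk_score(indicators):
    """Sum the weights of the indicators present for a family."""
    return sum(WEIGHTS[i] for i in indicators if i in WEIGHTS)

def flag_families(families):
    """Return the IDs of families whose score exceeds the threshold."""
    return [fid for fid, inds in families.items()
            if risk_score(inds) > THRESHOLD]

families = {
    "family_a": ["debt", "low_school_attendance"],   # score 4.5
    "family_b": ["antisocial_behaviour"],            # score 1.5
    "family_c": ["domestic_violence", "debt"],       # score 5.0
}
print(flag_families(families))  # ['family_a', 'family_c']
```

Real systems of this kind typically use statistical models trained on historical outcomes rather than hand-set weights, but the principle is the same: combine many weak signals into a single prioritisation score.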

In the US, the University of Chicago’s initiative ‘Data Science for Social Good’ has been using machine learning (a form of AI) to help a variety of social-purpose organisations. This has included helping the City of Rotterdam to understand their rooftop usage – a key step in their goal to address challenges with water storage, green spaces and energy generation. In addition, they’ve also helped the City of Memphis to map properties in need of repair, enabling the city to create more effective economic development initiatives.

Yet, like most new technologies, there has been some resistance to AI. In December 2017, plans by Ofsted to use machine learning tools to identify poorly performing schools were heavily criticised by the National Association of Head Teachers. The association argued that Ofsted should move away from a data-led approach to inspection, and that it was important that the “whole process is transparent and that schools can understand and learn from any assessment.”

Further, hyperbole-filled media reports have led to a general unease that introducing AI could lead to a reduction in the workforce. For example, PwC’s 2018 ‘UK Economic Outlook’ suggests that 18% of public administration jobs could be lost over the next two decades. Although it’s likely many jobs will be automated, no one really knows how the job market will respond to greater use of AI, and whether the creation of new jobs will outnumber those lost.

Should local government invest in AI?

In the next few years, it’s important that local government not only considers the clear benefits of AI, but also addresses the public concerns. Many citizens will be in favour of seeing their taxes go further and improvements in local services – but not if this infringes on their privacy or reduces transparency. Pilot projects, therefore, which provide the opportunity to test the latest technologies, work through common concerns, and raise awareness among the public, are the best starting point for local councils looking to move forward with this potentially transformative technology.

Follow us on Twitter to discover which topics are interesting our research team.

Home to former President Barack Obama, sporting giants the Chicago Bulls, and the culinary delicacy deep dish pizza, Chicago is one of the most famous cities in the world. Less well known is Chicago’s ambition to become the most data-driven city in the world.

A late convert to the smart city agenda, Chicago was lagging behind local rivals New York and Boston, and international leaders Barcelona, Amsterdam, and Singapore.

But in 2011, Chicago’s new Mayor Rahm Emanuel outlined the important role technology needed to play, if the city was to address its main challenges.

Laying the groundwork – open data and tech plan

In 2012, Mayor Rahm Emanuel issued an executive order establishing the city’s open data policy. The order was designed to increase transparency and accountability in the city, and to empower citizens to participate in government, solve social problems, and promote economic growth. It required every city agency to contribute data to the city’s open data portal, and established reporting requirements to ensure agencies were held accountable.

Chicago’s open data portal has nearly 600 datasets, which is more than double the number in 2011. The city works closely with civic hacker group Open Chicago, an organisation which runs hackathons (collaborations between developers and businesses using open data to find solutions to city problems).

In 2013, the City of Chicago Technology Plan was released. This brought together 28 of the city’s technology initiatives into one policy roadmap, setting them out within five broad strategic areas:

Establishing next-generation infrastructure

Creating smart communities

Ensuring efficient, effective, and open government

Working with innovators to develop solutions to city challenges

Encouraging Chicago’s technology sector

Array of Things

The Array of Things is an ambitious programme to install 500 sensors throughout the city of Chicago. Described by the project team as a ‘fitness tracker for the city’, the sensors will collect real-time data on air quality, noise levels, temperature, light, pedestrian and vehicle traffic, and water levels in streets and gutters. The data gathered will be made publicly available via the city’s website, and will provide a vital resource for the researchers, developers, policymakers, and citizens trying to address city challenges.
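To give a flavour of how such a public sensor feed might be consumed, here is a small sketch that averages readings per sensor node. The record format is invented for illustration; the real Array of Things feeds publish their own schemas via the city's open data channels.

```python
# Toy aggregation over Array-of-Things-style sensor readings.
# The record format below is invented for illustration only.
from collections import defaultdict

readings = [
    {"node": "n01", "metric": "noise_db", "value": 62.0},
    {"node": "n01", "metric": "noise_db", "value": 66.0},
    {"node": "n02", "metric": "noise_db", "value": 55.0},
    {"node": "n01", "metric": "temp_c",   "value": 21.5},
]

def average_by_node(records, metric):
    """Mean value of one metric, grouped by sensor node."""
    totals = defaultdict(lambda: [0.0, 0])  # node -> [sum, count]
    for r in records:
        if r["metric"] == metric:
            totals[r["node"]][0] += r["value"]
            totals[r["node"]][1] += 1
    return {node: s / n for node, (s, n) in totals.items()}

print(average_by_node(readings, "noise_db"))  # {'n01': 64.0, 'n02': 55.0}
```

Aggregations like this, run over live city-wide feeds, are the raw material for the applications mentioned below, such as mapping noise or flood-prone streets.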

This new initiative is a major project for the city, but as Brenna Berman, Chicago’s chief information officer, explains:

“If we’re successful, this data and the applications and tools that will grow out of it will be embedded in the lives of residents, and the way the city builds new services and policies”

Potential applications for the city’s data could include providing citizens with information on the healthiest and unhealthiest walking times and routes through the city, as well as the areas likely to be impacted by urban flooding.

A series of community meetings was held to introduce the Array of Things concept to the community and to consult on the city’s governance and privacy policy. This engagement ranged from holding public meetings in community libraries to providing online forms, where citizens could provide feedback anonymously.

In addition, the Urban Center for Computation and Data and the School of the Art Institute of Chicago ran a workshop entitled the “Lane of Things”, which introduced high school students to sensor technology. The workshop is part of the Array of Things education programme, which aims to use sensor technology to teach students about subjects such as programming and data science. For eight weeks, the students were given the opportunity to design and build their own sensing devices and implement them in the school environment, collecting information such as dust levels from nearby construction and the dynamics of hallway traffic.

The Array of Things project is funded by a $3.1 million National Science Foundation grant and is expected to be complete by 2018.

Mapping Subterranean Chicago

The City of Chicago is working with local technology firm, City Digital, to produce a 3D map of the underground infrastructure, such as water pipes, fibre optic lines, and gas pipes. The project will involve engineering and utility workers taking digital pictures as they open up the streets and sidewalks of Chicago. These images will then be scanned into City Digital’s underground infrastructure mapping (UIM) platform, and key data points will be extracted from the image, such as width and height of pipes, with the data being layered on a digital map of Chicago.

“By improving the accuracy of underground infrastructure information, the platform will prevent inefficient and delayed construction projects, accidents, and interruptions of services to citizens.”

Although still at the pilot stage, the technology has been used on one construction site and an updated version is expected to be used on a larger site in Chicago’s River North neighbourhood. Once proven, the city plans to charge local construction and utility firms to access the data, generating income whilst reducing the costs of construction and improving worker safety.

ShotSpotter

In January, Mayor Rahm Emanuel and Chicago Police Department commanders announced the expansion of ShotSpotter – a system which uses sensors to capture audio of gunfire and alert police officers to its exact location. The expansion will take place in the Englewood and Harrison neighbourhoods, two of the city’s highest crime areas, and should allow police officers to respond to incidents more rapidly.

Chicago Police Superintendent Eddie Johnson highlights that although crime and violence presents a complex problem for the city, the technology has resulted in Englewood going “eight straight days without a shooting incident”, the longest period in three years.

ShotSpotter will also be integrated into the city’s predictive analytics tools, which are used to assess how likely individuals are to become victims of gun crime, based on factors such as the number of times they have been arrested with individuals who have become gun crime victims.

Final thoughts

Since 2011, Chicago has been attempting to transform itself into a leading smart city. Although it’s difficult to compare Chicago with early adopters such as Barcelona, the city has clearly introduced a number of innovative projects and is making progress on its smart city journey.

In particular, the ambitious Array of Things project will have many cities watching to see if understanding the dynamics of city life can help to solve urban challenges.

Follow us on Twitter to see what developments in public and social policy are interesting our research team.


Daniel MacIntyre, from Glasgow City Marketing Bureau (the city’s official marketing organisation), opened the event by highlighting Glasgow’s ambitious target of increasing visitor numbers from two million to three million by 2023.

To achieve this goal, Mr MacIntyre explained that the city would be looking to develop a city data plan, which would outline how the city should use data to solve its challenges and to provide a better experience for tourists.

In many ways, Glasgow’s tourism goal set the context for the presentations that followed, providing the attendees – who included professionals from the technology and tourism sectors, as well as academia and local government – with an understanding of the city’s data needs and how it could be used.

Identifying the problem

From very early on, there was a consensus in the room that tourism bodies have to identify their problems before seeking out data.

A key challenge for Glasgow, Mr MacIntyre explained, was a lack of real-time data. Much of the data available to the city’s marketing bureau was historic (sometimes three years old), and gathered through passenger or visitor experience surveys. It was clear that Mr MacIntyre felt this approach was rather limiting in the 21st century, highlighting that businesses, including restaurants, attractions, and transport providers, were all collecting data, and that if marketing authorities could collaborate and share this data, it could bring a number of benefits.

In essence, Mr MacIntyre saw Glasgow using data in two ways. Firstly, to provide a range of insights, which could support decision making in destination monitoring, development, and marketing. For instance, having data on refuse collection could help ensure timely collections and cleaner streets. A greater understanding of restaurant, bar, and event attendances could help develop Glasgow’s £5.4 million a year night time economy by producing more informed licensing policies. And the effectiveness of the city’s marketing could be improved by capturing insights from social media data, creating more targeted campaigns.

Secondly, data could be used to monitor or evaluate events. For example, the impact of sporting events such as Champions League matches – which increase visitor numbers to Glasgow and provide an economic boost to the city – could be far better understood.

Urban Big Data Centre (UBDC)

One potential solution to Glasgow City Marketing Bureau’s need for data may be organisations such as the Urban Big Data Centre.

The UBDC is also involved in a number of projects, including the integrated Multimedia City Data (iMCD) project. One interesting aspect of this work involved the extraction of Glasgow-related data streams from multiple online sources, particularly Twitter. The data covers a one-year period (1 Dec 2014 – 30 Nov 2015) and could provide insights into the behaviour of citizens or their reaction to particular events; all of which could be potentially useful for tourism bodies.

Predictive analytics

Predictive analytics, i.e. the combination of data and statistical techniques to make predictions about future events, was a major theme of the day.

Faical Allou, Business Development Manager at Skyscanner, and Dr John Wilson, Senior Lecturer at the University of Strathclyde, presented their Predictive Analytics for Tourism project, which attempted to predict future hotel occupancy rates for Glasgow using travel data from Glasgow and Edinburgh airports.

Glasgow City Marketing Bureau also collaborated on the project – which is not too surprising, as there are a number of useful applications for travel data, including helping businesses respond better to changing events, understanding the travel patterns of visitors to Glasgow, and recommending personalised products and services that enhance the traveller’s experience (increasing visitor spending in the city).

However, Dr Wilson advised caution, explaining that although patterns could be identified from the data (including spikes in occupancy rates), there were limitations due to the low number of datasets available. In addition, one delegate highlighted a ‘data gap’, suggesting that the data didn’t cover travellers who flew into Glasgow or Edinburgh but then made onward journeys to other cities.
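The core idea behind a project like this can be sketched very simply: fit a least-squares line relating airport arrivals to hotel occupancy, then use it to predict occupancy for a new arrivals figure. The numbers below are made up for illustration; the actual project worked with real travel data and more sophisticated methods.

```python
# Minimal sketch of predicting hotel occupancy from airport arrivals.
# The figures are invented; this is not the Skyscanner/Strathclyde model.

arrivals  = [20, 25, 30, 35, 40]   # thousands of airport arrivals
occupancy = [60, 65, 70, 75, 80]   # hotel occupancy (%)

def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return a, b

a, b = fit_line(arrivals, occupancy)
print(round(a + b * 45))  # predicted occupancy (%) for 45k arrivals -> 85
```

Dr Wilson's caution applies directly here: with only a handful of data points, a fitted line can look convincing while generalising poorly, which is why the low number of available datasets was flagged as a limitation.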

Uber

Technology-enabled transport company Uber has been very successful at using data to provide a more customer-oriented service. Although much of Uber’s growth has come from its core app – which allows users to hire a taxi service – they are also introducing innovative new services and integrating their app into platforms such as Google Maps, making it easier for customers to request taxi services.

And in some locations, whilst Uber users are travelling, they will receive local maps, as well as information on nearby eateries through their UberEATS app.

Uber Movement, an initiative which provides access to the anonymised data of over two billion urban trips, has the potential to improve urban planning in cities. It includes data which helps tourism officials, city planners, policymakers and citizens understand the impact of rush hours, events, and road closures in their city.

Chris Yiu, General Manager at Uber, highlighted that people lose weeks of their lives waiting in traffic jams. He suggested that the future of urban travel will involve a combination of good public transport services and car sharing services, such as uberPOOL (a service which matches passengers travelling in the same direction), providing the first and last mile of journeys.

Final thoughts

The event was a great opportunity to find out about the data challenges for tourism bodies, as well as initiatives that could potentially provide solutions.

Although a number of interesting issues were raised throughout the day, two key points kept coming to the forefront. These were:

The need to clarify problems and outcomes – Many felt it was important that cities identified the challenges they were looking to address. This could be looked at in many ways, from addressing the need for more real-time data, to a more outcome-based approach, such as the need to achieve a 20% reduction in traffic congestion.

Industry collaboration – Much of a city’s valuable data is held by private sector organisations. It’s therefore important that cities (and their tourism bodies) encourage collaboration for the mutual benefit of all partners involved. Achieving a proposition that provides value to industry will be key to achieving smarter tourism for cities.

Follow us on Twitter to see what developments in public and social policy are interesting our research team.