Google has developed an AI tool called Perspective that can analyze online content and score how toxic it is, flagging material such as hate speech from trolls. Perspective is free to use, and website owners can use it to monitor comment boards and automatically flag high-toxicity content. To train Perspective to predict toxic content, Google aggregated millions of comments from popular websites, including Wikipedia and the New York Times, and had humans annotate whether or not each comment could be considered toxic, so Perspective could learn to differentiate between benign and inflammatory comments.
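The annotate-then-train approach can be illustrated with a toy bag-of-words classifier that learns from human-labeled comments. This is only a sketch of the general idea — Perspective itself is a deep learning model, and the labels, scoring, and example comments here are all invented for illustration:

```python
from collections import Counter
import math

def train(examples):
    """examples: list of (comment, label) pairs, label in {"toxic", "benign"},
    standing in for human-annotated comments."""
    counts = {"toxic": Counter(), "benign": Counter()}
    totals = Counter()
    for text, label in examples:
        for word in text.lower().split():
            counts[label][word] += 1
        totals[label] += 1
    return counts, totals

def toxicity_score(model, comment):
    """Naive Bayes log-odds squashed to a 0-1 toxicity score."""
    counts, totals = model
    vocab = set(counts["toxic"]) | set(counts["benign"])
    log_odds = math.log(totals["toxic"] / totals["benign"])
    for word in comment.lower().split():
        # Laplace smoothing so unseen words do not zero out the estimate
        p_toxic = (counts["toxic"][word] + 1) / (sum(counts["toxic"].values()) + len(vocab))
        p_benign = (counts["benign"][word] + 1) / (sum(counts["benign"].values()) + len(vocab))
        log_odds += math.log(p_toxic / p_benign)
    return 1 / (1 + math.exp(-log_odds))
```

With enough annotated examples, comments that share vocabulary with labeled toxic comments score closer to 1, and benign ones closer to 0 — the same differentiation Perspective learns at far larger scale.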

Philadelphia startup Exyn Technologies has developed AI software that allows drones to autonomously navigate their environments even when they are out of range of GPS. Exyn’s software enables drones to combine and analyze data from a variety of on-board sensors to map routes to their destinations, meaning that a drone would not need to have a GPS connection to understand where it is supposed to go. This approach could allow drones to operate autonomously in challenging or isolated environments without GPS signal, such as mineshafts or warehouses.
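The core idea — estimating position from on-board sensor readings rather than GPS — can be illustrated with simple dead reckoning, which integrates relative motion measurements into a position estimate. This is a sketch of the general principle only, not Exyn's software, and the odometry readings are hypothetical:

```python
import math

def dead_reckon(start, steps):
    """Integrate a sequence of (distance_traveled, heading_change) readings,
    as might be derived from on-board sensors, into a 2D position estimate.
    No external positioning signal is required."""
    x, y = start
    heading = 0.0  # radians, 0 = facing along +x
    for dist, dtheta in steps:
        heading += dtheta
        x += dist * math.cos(heading)
        y += dist * math.sin(heading)
    return x, y, heading
```

Real systems fuse many such sensors (cameras, lidar, inertial units) and correct for accumulated drift, but the principle is the same: the vehicle knows where it is relative to where it started.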

The city of San Diego has announced plans to install a network of 3,200 connected sensors on city street lights to measure air quality, traffic, and pedestrian safety. The sensors will be capable of real-time reporting, and the city plans to publish the sensor data as open data, as well as use the information to help improve traffic flows and develop apps for city services.

The European Space Agency (ESA) has adopted an open access policy for its imagery, video, and other data and will publish this data under a Creative Commons license, which allows anyone to freely use and adapt this data for any purpose. ESA has made data freely available in the past, but the new policy provides specific guidelines for the agency to make the data as accessible and usable as possible. ESA will initially apply the open license to data it owns entirely and will work on increasing the accessibility of data it collected with the help of third parties.

Researchers at the Swiss Federal Institute of Technology in Zurich have developed a system of neural networks capable of improving the sharpness of telescope imagery, which could allow space telescopes such as the Hubble to produce more useful images. The researchers trained the networks by having them analyze clear and slightly out-of-focus versions of the same images, teaching them to recover greater detail from lower-quality images. The system can reproduce very faint structures, such as the shape of a galaxy’s spiral arms, that traditional methods for improving image clarity cannot.
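The training setup — pairing degraded images with their sharp originals so a network can learn the mapping between them — can be sketched with a toy one-dimensional "image" and a box blur standing in for optical degradation. The real system trains neural networks on actual telescope images; everything below is illustrative:

```python
def box_blur(signal, radius=1):
    """Crude stand-in for optical blur: average each point with its neighbors."""
    out = []
    for i in range(len(signal)):
        window = signal[max(0, i - radius): i + radius + 1]
        out.append(sum(window) / len(window))
    return out

def make_training_pairs(images, radius=1):
    """Each pair is (degraded input, sharp target) — the network learns to
    map the first back to the second."""
    return [(box_blur(img, radius), img) for img in images]
```

Given enough such pairs, a network learns what sharp structure typically underlies a given blur pattern, which is how it can plausibly restore faint features that classical deconvolution smears away.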

Researchers at the University of Cambridge and Microsoft have created a machine learning system called DeepCoder that creates its own programs to solve basic challenges using a technique called program synthesis, which entails appropriating lines of code from existing pieces of software. DeepCoder can quickly analyze large databases of code to identify relevant pieces and combine them much faster than humans or existing systems can, and as it assembles lines of code, it learns which arrangements do and do not work, allowing it to continuously improve.
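A stripped-down version of program synthesis from examples can be sketched as enumerative search over a small library of reusable primitives: try compositions of known pieces until one reproduces the desired input-output behavior. DeepCoder additionally uses a learned model to predict which primitives are likely relevant and to prioritize the search; this sketch omits that, and the primitive library is invented:

```python
from itertools import product

# Toy library of code "pieces" the synthesizer may combine (hypothetical)
PRIMITIVES = {
    "double": lambda xs: [2 * x for x in xs],
    "reverse": lambda xs: list(reversed(xs)),
    "drop_neg": lambda xs: [x for x in xs if x >= 0],
}

def synthesize(examples, max_depth=3):
    """Enumerate pipelines of primitives, shortest first, until one
    satisfies every (input, output) example. Returns the pipeline's
    primitive names, or None if nothing fits within max_depth."""
    for depth in range(1, max_depth + 1):
        for names in product(PRIMITIVES, repeat=depth):
            def run(xs, names=names):
                for name in names:
                    xs = PRIMITIVES[name](xs)
                return xs
            if all(run(inp) == out for inp, out in examples):
                return names
    return None
```

Plain enumeration like this explodes combinatorially as the library grows, which is exactly the problem DeepCoder's learned guidance addresses.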

The U.S. Department of Defense (DoD) has launched an initiative called Code.mil to help the agency take advantage of open-source code as well as make DoD code more accessible to the public. Code developed by government employees typically does not have copyright protections, which can make it difficult for agencies to apply open licenses to their code, so DoD developed a special licensing agreement that allows it to make its code publicly available.

Robotics firm HiBot USA has developed a system of AI-powered pipe inspection robots that helps municipalities make more informed decisions about upgrading their water infrastructure, potentially saving money and reducing the risk of pipe failure. Municipalities typically upgrade pipes based on their age, which does not necessarily mean they are replacing the pipes that need it most. HiBot USA’s system first predicts the areas where pipes are most at risk and then deploys a pipe-crawling robot to gather data about pipe integrity. The system then uses AI to analyze this data, comparing it with data about soil characteristics and other factors to predict the likelihood that a pipe needs to be replaced.
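The final step — combining inspection measurements with soil characteristics and other factors into a failure likelihood — can be sketched as a simple logistic risk score. The features and weights below are hypothetical placeholders chosen for illustration; HiBot USA's actual model and inputs are not described at this level of detail:

```python
import math

def failure_risk(wall_loss_pct, soil_corrosivity, age_years):
    """Toy logistic model mapping pipe condition to a 0-1 failure likelihood.
    wall_loss_pct: % wall thickness lost (from robot inspection, hypothetical)
    soil_corrosivity: 0-1 index of how corrosive the surrounding soil is
    age_years: pipe age
    All weights are invented; a real model would be fit to failure records."""
    z = 0.08 * wall_loss_pct + 1.2 * soil_corrosivity + 0.03 * age_years - 6.0
    return 1 / (1 + math.exp(-z))
```

Ranking pipes by a score like this, rather than by age alone, is what lets a utility replace the pipes that actually need it most.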

Two groups of researchers, from the University of Maryland and IBM, have run a series of experiments pitting two different quantum computing systems against each other. The systems relied on different technologies to process qubits, and the tests measured the performance of each system running the same algorithms. The experiments showed that IBM’s system, in which qubits share information through a central hub, was faster but more prone to destroying fragile quantum states, while the University of Maryland’s system, which relied on directly interconnected qubits, was more reliable.

Researchers from the University of Birmingham in the UK have developed a smartphone app that uses a phone’s accelerometer, gyroscope, and GPS sensors to monitor the smoothness of train rides. The app runs while users go about their commutes, and an artificial neural network aggregates the data to create a map of the health of railway infrastructure. With these maps, transit authorities could easily prioritize maintenance for the areas that cause the greatest disruption to train service.
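Aggregating many riders' sensor samples into a per-segment roughness map can be sketched with a root-mean-square (RMS) vibration metric, a common stand-in for ride quality. The app's actual neural-network aggregation is more sophisticated, and the segment IDs and readings below are invented:

```python
import math
from collections import defaultdict

def segment_roughness(readings):
    """readings: (segment_id, vertical_accel) samples pooled from many
    commuters' phones, with segments located via GPS. Returns RMS
    acceleration per track segment — higher means a rougher ride."""
    buckets = defaultdict(list)
    for seg, accel in readings:
        buckets[seg].append(accel)
    return {seg: math.sqrt(sum(a * a for a in vals) / len(vals))
            for seg, vals in buckets.items()}

def maintenance_priority(roughness):
    """Rank segments roughest-first for maintenance planning."""
    return sorted(roughness, key=roughness.get, reverse=True)
```

Pooling samples across many commutes averages out phone-specific noise, so persistent spikes point at the track rather than at any one rider's pocket.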

Joshua New is a senior policy analyst at the Center for Data Innovation. He has a background in government affairs, policy, and communication. Prior to joining the Center for Data Innovation, Joshua graduated from American University with degrees in C.L.E.G. (Communication, Legal Institutions, Economics, and Government) and Public Communication. His research focuses on methods of promoting innovative and emerging technologies as a means of improving the economy and quality of life. Follow Joshua on Twitter @Josh_A_New.