Chinese researchers have successfully transmitted entangled photons from an orbiting satellite to Earth in what experts say is a major advance for quantum communications. The beamed particles set up an instantaneous link between a pair of ground stations separated by more than 744 miles. The use of a satellite enabled the scientists to avoid ground-level interference, which had significantly restricted the transmission range of the entangled particles. "We have achieved a two-photon entanglement distribution efficiency a trillion times higher than using the best telecommunication fibers," says Lu Chaoyang at the University of Science and Technology of China. "We have done something that was absolutely impossible using conventional approaches." ID Quantique co-founder Gregoire Ribordy says although the experimental process is too slow and complicated to be practical for quantum communications, a second satellite could expand the transmission range to enable "large-scale, global-scale quantum communications" within five years.

The U.S. Department of Energy (DoE) on Thursday announced $258 million in funding awards to six technology companies to advance supercomputing speed under DoE's Exascale Computing Project (ECP). "Continued U.S. leadership in high-performance computing is essential to our security, prosperity, and economic competitiveness as a nation," says DoE Secretary Rick Perry. "These awards will enable leading U.S. technology firms to marshal their formidable skills, expertise, and resources in the global race for the next stage in supercomputing." The companies will share the three-year grant to support research and development (R&D) in hardware technology, software technology, and application development to deliver the first U.S. exascale systems by 2021. ECP director Paul Messina stresses the importance of joint public-private R&D for addressing development of memory architectures, higher speed interconnects, better reliability, and strategies for boosting computing power without unaffordable increases in energy demand.

Researchers at Maluuba, a Montreal-based startup recently acquired by Microsoft, last month used artificially intelligent software agents to master Ms. Pac-Man--a significant accomplishment as the video game employs unpredictable movements of opponents and goals to make mastery by reinforcement learning difficult. Maluuba researchers say they cracked the game by generating more than 150 reinforcement-learning agents, each of which focuses on how one game element impacts the score. Individual agents deliver their strategy recommendations to a central decider, which pools their suggestions to determine what moves to make next. Duke University's Silvia Ferrari says Maluuba's approach may be difficult to apply to real-world problems, given that humans would have to intervene in the partitioning of specific tasks the multiple agents would be assigned to solve. However, Maluuba's Harm van Seijen says a system composed of smaller components that can be checked individually offers greater transparency, and "can give you more insight and control into how the decision is made."
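
The decider described above can be sketched in miniature: each specialist agent scores every candidate move against its own objective, and a central aggregator combines the scores. This is only an illustrative sketch of the general idea (the agent names, score values, and summation rule here are assumptions, not Maluuba's actual architecture):

```python
def decide(agent_q_values):
    """Pick the move with the highest combined score across all agents.

    agent_q_values: a list of dicts, one per agent, each mapping an
    action name to that agent's value estimate for taking it now.
    """
    actions = agent_q_values[0].keys()
    totals = {a: sum(q[a] for q in agent_q_values) for a in actions}
    return max(totals, key=totals.get)

# Hypothetical example: a pellet-seeking agent prefers "up", while a
# ghost-avoiding agent strongly vetoes it; the decider weighs both.
suggestions = [
    {"up": 1.0, "down": 0.0},   # pellet agent
    {"up": -2.0, "down": 0.5},  # ghost agent
]
move = decide(suggestions)  # → "down"
```

Summation is just one possible aggregation rule; the appeal of the design is that each agent's contribution to the final choice can be inspected separately.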

Researchers at Peter the Great St. Petersburg Polytechnic University (SPbPU) in Russia have proposed a new channel coding method for the fifth generation of wireless systems (5G). SPbPU professor Peter Trifonov says they designed codes that surpass state-of-the-art competitors in performance and decoding simplicity. The researchers generalized the construction of polar codes originally proposed by Turkish scientist Erdal Arikan, and obtained polar subcodes. The team then excluded some codewords from Arikan's polar codes, which could be easily confused by the receiver, and introduced additional restrictions on the symbols of their codewords in order to simplify the error correction task of the decoder. In addition, the SPbPU researchers say they proposed a computationally simple decoding algorithm for polar codes and subcodes. The team says the improved code performance enables communications systems to operate in more challenging environments, resulting in improved coverage and enabling the system to support a larger number of users.
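
For context, Arikan's underlying polar transform is the n-fold Kronecker power of the 2x2 kernel [[1,0],[1,1]] over GF(2): unreliable input positions are "frozen" to zero and data is carried on the reliable ones. The SPbPU subcode construction is not detailed here, so the following is only a minimal sketch of plain polar encoding, with the frozen positions chosen arbitrarily for illustration:

```python
def polar_transform(u):
    """Multiply bit vector u by the Kronecker power of [[1,0],[1,1]]
    over GF(2), using the recursive structure of the transform:
    output = (top XOR bottom, bottom), each half transformed recursively.
    """
    if len(u) == 1:
        return u[:]
    half = len(u) // 2
    top = polar_transform(u[:half])
    bottom = polar_transform(u[half:])
    return [a ^ b for a, b in zip(top, bottom)] + bottom

def polar_encode(data_bits, frozen, n):
    """Freeze the positions in `frozen` to 0, place data bits in the
    remaining positions, then apply the polar transform. The frozen
    set here is illustrative, not an optimized reliability ordering.
    """
    it = iter(data_bits)
    u = [0 if i in frozen else next(it) for i in range(n)]
    return polar_transform(u)

# Length-4 toy code carrying 2 data bits, positions 0 and 1 frozen.
codeword = polar_encode([1, 0], frozen={0, 1}, n=4)  # → [1, 0, 1, 0]
```

A convenient property visible even in this sketch is that the transform is its own inverse over GF(2), so applying it twice recovers the input.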

Researchers at the Korea Advanced Institute of Science and Technology (KAIST) have developed a semiconductor chip called a CNN Processor (CNNP), which runs artificial intelligence algorithms with ultra-low power, as well as K-Eye, a face-recognition system using CNNP. The K-Eye system features a wearable device that can be used with a smartphone via Bluetooth, and can operate for more than 24 hours with its internal battery. K-Eye users can check information about people by using their smartphone or smartwatch, and the device also enables users to access a database via their smart devices. Meanwhile, K-Eye's included dongle can recognize and share information about users at any time. The K-Eye system also includes an "always-on" image sensor, which determines whether a face is in its camera range, capturing frames and activating the rest of the device only when a face is present.
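
The always-on gating pattern described above, where a cheap detector decides when to run the expensive recognition path, can be sketched generically. The function names and the detector/recognizer callables here are hypothetical stand-ins, not K-Eye's actual interfaces:

```python
def gated_frames(frames, face_detector, recognizer):
    """Run the expensive recognizer only on frames where the cheap,
    always-on face detector fires; all other frames are skipped,
    which is what saves power in this style of design."""
    for frame in frames:
        if face_detector(frame):   # low-power, always-on check
            yield recognizer(frame)  # high-power path, gated

# Toy usage with placeholder callables standing in for real models:
results = list(gated_frames(
    [1, 2, 3, 4],
    face_detector=lambda f: f % 2 == 1,  # pretend odd frames show a face
    recognizer=lambda f: f * 10,
))  # → [10, 30]
```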

Application code must keep up with processor advancements as parallel computing gains traction, which Intel's Joe Curley says his company is doing by parallelizing public codes for the newest x86 central-processing unit generations. In an interview, Curley says the Intel Parallel Computing Centers produce output that software developers and academics can apply to teach people cutting-edge code modernization. "We focus efforts on open source communities and open source codes...to improve the understanding of how to program in parallel and how to solve problems," Curley says. He notes the inclusion of artificial intelligence and machine-learning methods is a recent code modernization development, which when properly implemented can dramatically boost performance. Curley also says most projects in this space concentrate on challenging algorithms via parallelization to "achieve massive increases in performance." Among the efforts encompassed by Intel's initiative are applications used by manufacturers in product design, and advanced imaging diagnostics employed by the medical industry.

Phone Metadata Reveals Where City Migrants Go and Who They Call | New Scientist | Chris Baraniuk | June 14, 2017

Researchers at the University of Washington and Zhejiang University in China analyzed a month's worth of telecommunications metadata--more than 698 million call logs--from Shanghai in an attempt to help authorities manage mass migration more effectively in the future. Although the data did not contain names and addresses, the researchers say it did suggest whether mobile users were locals or migrants to the city. They expected the data to reveal migrants gradually behaving more in the manner of locals as they spent time in Shanghai. However, the data did not bear that hypothesis out, as the researchers found the migrants stayed in contact with multiple people who shared their place of birth, and tended to spend more time in the center of the city. The researchers note the data did show migrants became more similar to locals over time in certain ways, such as average call duration and the distance they traveled.
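
The behavioral measures mentioned above, such as the share of a user's contacts who share their birthplace and their average call duration, reduce to simple per-user aggregates. The following is a minimal sketch under assumed inputs (the data layout and function names are hypothetical, not the study's actual pipeline):

```python
def hometown_tie_share(contact_birthplaces, user_birthplace):
    """Fraction of a user's contacts who share the user's place of
    birth; a high value suggests the migrant retains hometown ties."""
    same = sum(1 for b in contact_birthplaces if b == user_birthplace)
    return same / len(contact_birthplaces)

def avg_call_duration(durations_seconds):
    """Mean call length, one of the measures on which migrants were
    found to converge toward locals over time."""
    return sum(durations_seconds) / len(durations_seconds)

# Toy example: 3 of 4 contacts share the user's (assumed) birthplace.
share = hometown_tie_share(["Anhui", "Shanghai", "Anhui", "Anhui"], "Anhui")
# → 0.75
```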

Open source development's popularity is tempered by poor documentation and negative behavior, chiefly rudeness, among developers, according to a new GitHub survey. The poll pointed to the pervasiveness of incomplete or outdated documentation in the open source community, as well as the overriding importance of an open source license in many developers' decisions to use or contribute to projects. In addition, GitHub says the support of inclusivity is a key benefit of documentation, while about 25 percent of the community lacks high proficiency in reading and writing English. GitHub also estimated that 18 percent of respondents personally experienced a negative interaction with another user, and 50 percent witnessed one between other people. GitHub found such interactions can tangibly impact open source projects by discouraging continued developer contribution. The survey also found that although open source software is used worldwide, its contributors don't yet reflect its broad audience.

U.S. government agencies require better tools and datasets to assess how artificial intelligence, autonomous vehicles, and other emerging technologies are impacting the private-sector workforce, according to a report from the National Academy of Sciences' Committee on Information Technology, Automation, and the U.S. Workforce. The two-year study warns policymakers are "flying blind" without information to inform their responses to disruptive workforce trends caused by advanced technology. Study author and Carnegie Mellon University professor Tom Mitchell cites a lack of knowledge for policymakers to answer even basic queries, and he says part of the problem is rooted in the government's historical reliance on in-house data. Mitchell suggests policymakers mix in datasets currently being produced, stored, and used by non-government organizations, a task at which the private sector excels. Mitchell also notes universities are a rich data resource, as are job archives such as LinkedIn, Burning Glass, and Monster.com.

In an interview, University of Delaware (UD) professor Lori Pollock discusses the school's Partner 4 Computer Science (CS) program, designed to equip K-12 educators with the skills for teaching students computer science. "Teachers learn practical classroom activities to teach computational thinking in algorithms, data, abstraction, programming, and Internet, as well as participate in training sessions for specific technologies...with models of how to teach them to different ages," Pollock says. She notes the five-year-old program was initially funded by the U.S. National Science Foundation, and since its inception it has evolved into a multi-component initiative. Its elements include an annual paid Professional Development Summer Workshop, an annual Summit for CS Education in Delaware, and material support and circulation for a college field experiences course in which UD undergraduates assist in teaching CS at local schools and libraries. Pollock says greater CS integration into classrooms and courses increases access and helps broaden participation in computing.

Researchers at the University of Texas at Arlington (UTA) say they are developing computer tools to detect social bots within the World Wide Web that create and spread fake news. The team, led by UTA professor Chengkai Li, will use highly sophisticated algorithms to combat the bots and the spread of fake news. Li says the project will focus on Twitter bots, Twitter accounts run by computer programs that automatically publish and forward content, follow other accounts, leave comments, and conduct seemingly "real" activity. In addition, Li says the project will concentrate on national security threats instead of domestic politics. "These bots often are sponsored by nation-states that are hostile to U.S. interests," Li notes. "This project needs to have a worldwide reach." Li says the UTA researchers will leverage expertise in computational fact-checking, static and dynamic code analysis, data mining, and security.
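
One classic data-mining signal for the kind of automated accounts described above is the regularity of posting times: human posting intervals vary widely, while simple bots often post on a near-fixed schedule. This sketch is a generic illustration of that heuristic, not UTA's actual method; the threshold and function name are assumptions:

```python
import statistics

def looks_automated(post_times, max_stdev=2.0):
    """Flag an account whose inter-post intervals are suspiciously
    uniform. post_times: posting timestamps in seconds, ascending.
    max_stdev: assumed tolerance, in seconds, for 'clockwork' posting.
    """
    gaps = [b - a for a, b in zip(post_times, post_times[1:])]
    if len(gaps) < 2:
        return False  # too little history to judge
    return statistics.pstdev(gaps) <= max_stdev

# A bot posting every 60 seconds vs. a bursty human account:
looks_automated([0, 60, 120, 180])   # → True
looks_automated([0, 5, 300, 310])    # → False
```

Real detectors combine many such features (content similarity, follower graphs, account age) rather than relying on any single signal.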

Apple's open source WebKit browser engine is now equipped with full support for WebAssembly, described by Apple's Saam Barati and colleagues as a low-level binary format designed as an appropriate compilation target for C++ and similar languages. WebAssembly is designed to increase Web application speeds and to let non-JavaScript languages run in browsers; Google, Microsoft, and Mozilla already support the technology. WebAssembly cannot do everything JavaScript can, so it is designed to be used alongside JavaScript rather than replace it. WebAssembly's WebKit deployment can reuse some machinery already in WebKit, such as an ECMAScript module for WebAssembly implementation. To permit sharing of modules between Web workers and enable features such as threads, Apple's internal representation of WebAssembly code has been rendered thread-safe. Meanwhile, Google has decided to support WebAssembly for running native code in the browser, and has dropped its Portable Native Client technology.

In an interview, Stanford University professor Andrew Ng and Tencent Holdings Artificial Intelligence (AI) Lab executive director Tong Zhang discuss AI's promise to transform all industries and to have far more beneficial uses than harmful ones. Zhang doubts a "singularity" event in which AIs dominate mankind is likely, given that the technology solves many specialized problems but cannot offer a single, universal solution for simultaneously addressing all challenges. Ng says AI can gain a foothold in an industry once it has digitized, followed by the creation of data that AI can be fed for automated decision-making and other useful functions. "The broader pattern is that in any task in which a lot of people are doing relatively routine, repetitive work, that creates a very strong incentive for AI teams to come and automate that task," Ng says. He notes tasks ripe for AI automation typically require less than a second of thought for a human.