April, 2013

The Microsoft Research Connections blog shares stories of collaborations with computer scientists at academic and scientific institutions to advance technical innovations in computing, as well as related events, scholarships, and fellowships.

In the late 1800s, the guanaco, a close relative of the llama, was hunted to near extinction. As we mark this year’s Earth Day (April 22), I want to share my excitement about a new tool that looks to make the future a little brighter for the guanaco and other threatened species in Latin America. That new tool is LiveANDES (Advanced Network for the Distribution of Endangered Species).

Developed by a partnership among researchers at the Pontifical Catholic University of Chile, the LACCIR (Latin American and Caribbean Collaborative ICT Research) Virtual Institute, and Microsoft Research, LiveANDES is designed to collect, house, and analyze data about Latin America’s wildlife—data that could prove vital to the preservation of the region’s rich but increasingly threatened biodiversity, which has suffered grievously from loss of habitat and climate change.

Mariano de la Maza, a wildlife officer in Chile’s Parks and Protected Areas Service, sees this decline on a daily basis. “The main problems of the Chilean forest are habitat loss and the fragmentation and degradation of native forests,” he says.

LiveANDES begins with field observations, made not just by wildlife biologists and park rangers but by “citizen scientists,” including hikers, eco-tourists, and other nature enthusiasts. As Cristian Bonacic, director of the wildlife laboratory at Pontifical Catholic University, notes, “When people go to the wild, they can encounter an endangered animal by chance.” These chance encounters can provide extremely valuable information about the location and status of threatened and endangered wildlife.

All that’s needed is a smartphone equipped with the LiveANDES app. Imagine you’re hiking in the Chilean countryside and you think you’ve spotted a rare species. You simply photograph it with your smartphone and upload the picture and any sighting comments to LiveANDES. Your photo and annotations, along with the phone’s GPS coordinates and a time stamp, are then logged in the LiveANDES database, ready for parsing by the university team.
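To make the workflow concrete, here is a minimal sketch of how such a sighting report might be structured before upload. The actual LiveANDES schema is not public, so every field and function name here is hypothetical and purely illustrative.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

# Hypothetical sketch of the fields a sighting report might carry;
# the real app's data model is not documented publicly.
@dataclass
class Sighting:
    species_guess: str   # the observer's best identification
    photo_file: str      # the uploaded photograph
    comments: str        # free-text notes from the observer
    latitude: float      # from the phone's GPS fix
    longitude: float
    timestamp: str       # ISO-8601, stamped by the phone

def make_sighting(species_guess, photo_file, comments, latitude, longitude):
    """Bundle an observation with location and time, ready to upload."""
    ts = datetime.now(timezone.utc).isoformat()
    return Sighting(species_guess, photo_file, comments, latitude, longitude, ts)

report = make_sighting("guanaco", "IMG_0042.jpg",
                       "Small herd grazing near the trail",
                       -50.94, -72.99)
payload = asdict(report)  # a dict, ready to serialize (e.g., as JSON) for upload
```

The key point the example illustrates is that the observer supplies only the photo and a comment; the location and time stamp come from the phone automatically.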

Once processed, the data becomes available to scientists locally and around the world, as well as to the public, in both Spanish and English. Bonacic praises LiveANDES for the way it helps researchers “share that information with the scientific community, park rangers, and people at large.”

Knowing where and under what circumstances a threatened species is living can help biologists devise strategies to stabilize and, one hopes, restore these vulnerable populations. Moreover, the information gathered in LiveANDES also will help keep the International Union for Conservation of Nature (IUCN) red list of endangered and threatened species accurate, complete, and up-to-date.

The LiveANDES platform was built using Microsoft technologies, including Windows Phone; Microsoft SQL Server for data management; Bing Maps for locating and visualizing the animals; and the Microsoft .NET Framework for programming. It not only houses data about Latin America’s wildlife, including photographs, audio and video recordings, and location and sighting data, but also makes parsing huge volumes of data manageable for researchers.

According to Ignacio Casas, the executive director of LACCIR, LiveANDES embodies the fourth paradigm, a foundational concept of eScience in which data-intensive computing facilitates scientific discovery.
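The kind of data-intensive analysis described above can be illustrated with a toy aggregation over sighting records. This is not the LiveANDES codebase; the record fields and function are invented for illustration.

```python
from collections import Counter

# Illustrative only: a toy version of the kind of aggregation a research
# team might run over a sightings database (field names are hypothetical).
sightings = [
    {"species": "guanaco", "region": "Magallanes"},
    {"species": "huemul",  "region": "Aysén"},
    {"species": "guanaco", "region": "Magallanes"},
]

def counts_by_species(records):
    """Count sightings per species, a first step toward mapping a range."""
    return Counter(r["species"] for r in records)

print(counts_by_species(sightings))  # Counter({'guanaco': 2, 'huemul': 1})
```

At scale, the same grouping would run as a database query rather than in memory, which is where a platform like SQL Server earns its keep.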

Bonacic and his colleagues look forward to receiving a barrage of wildlife data from rangers, biologists, and, of course, citizen scientists. Thanks to LiveANDES, this data deluge will be manageable and actionable.

I am inspired by this project, as it tackles an extremely challenging environmental problem, which is the rapid decline of important elements of our natural heritage. Each animal species is an important piece of a puzzle, and each citizen scientist and researcher can play a crucial role in the preservation of endangered species for the next generation. I’m hopeful that LiveANDES will help the guanaco and other vulnerable species survive to see Earth Day 2113!

This April, Paris will be even more exciting than usual, as the Microsoft Research Machine Learning Summit takes place on the company’s “Le Campus.” This year, we will be streaming the keynotes and interviews live from the summit on April 23, from 13:30 to 17:00 GMT (9:30 A.M. to 1:00 P.M. Eastern Time and 6:30 A.M. to 10:00 A.M. Pacific Time).

This free online event will kick off at 13:30 GMT with the opening keynote (recorded earlier in the day) from Andrew Blake, director of Microsoft Research Cambridge. Professor Blake will describe advances in computer vision, with machines that learn to see. Then at 15:00 GMT, you can watch the live stream of Judea Pearl, director of the Cognitive Systems Laboratory at the University of California, Los Angeles. Professor Pearl will speak about the development and application of mathematical tools to study cause-and-effect relationships. What’s more, following their keynotes, both speakers will take part in an online Q&A, giving you the opportunity to engage directly with these eminent researchers.

In addition, there will be “Research in Focus” interview segments that describe cutting-edge work in machine learning. Fei-Fei Li of the Stanford Vision Lab and Sebastian Nowozin of Microsoft Research will discuss developments in teaching machines to see, and Zoubin Ghahramani of the University of Cambridge will describe his work on building an “automated statistician.”

Each year, the Software Engineering Innovation Foundation (SEIF) awards US$25,000 grants to support academic research in software engineering technologies, tools, practices, and teaching methods. SEIF is supported by Microsoft Research Connections Computer Science in conjunction with the Research in Software Engineering Group (RiSE). This year, we were joined by the Microsoft Technology Policy Group.

SEIF supports fundamental and applied research. As Tom Ball, research manager in the RiSE Group at Microsoft Research Redmond, says: “SEIF is based on the premise that solid software engineering foundations are fundamental to every kind of system Microsoft builds, so software engineering makes a good base from which to attract a wide variety of research in hot topic areas and to partner with academics and groups inside Microsoft Research.” Accordingly, the SEIF 2013 Request for Proposals added device and cloud computing and natural user interface (NUI) applications to ensure a more comprehensive representation of digital technologies.

CheckCell: Data Debugging for Spreadsheets (Emery Berger, University of Massachusetts Amherst, United States)

Understanding Parallelism and Automating Refactoring for Readability and Performance (Danny Dig, University of Illinois at Urbana-Champaign, United States)

Mobile/Social Debugging Games for Computing Education (Andrew J. Ko, University of Washington, United States)

Engineering Integrity and Confidentiality for the STAR-Vote Electronic Voting System (Dan S. Wallach, Rice University, United States)

NUI applications facilitate human-computer interaction (HCI) by providing more natural forms of input, such as gesture, voice, context, anticipatory processing based on a user’s past actions, and environmental awareness. The goal of NUI applications is to provide intuitive, sophisticated forms of input that adapt to the user and require minimal training, promoting digital inclusion where other interfaces, such as keyboard and mouse, are impractical—in particular, for the aging population, people with disabilities, socially or geographically isolated individuals, and underserved populations.

Additionally, with the advent of new tablet devices and ever more powerful phones, applications that use software services and cloud computing become both challenging and rewarding areas for researchers to explore.

Four of this year’s SEIF awards support this area of scientific exploration:

Nilanjan Banerjee at University of Maryland, Baltimore County, is constructing a wearable assistive device that recognizes gestures for paralysis patients.

Eelke Folmer at University of Nevada, Reno, is creating a spatial navigator for people who are blind.

Gillian R. Hayes at University of California, Irvine, is building interactive surfaces with body-based interactions to provide guidance to children with autism spectrum disorder (ASD).

Shaun K. Kane and Amy Hurst at University of Maryland, Baltimore County, are developing Wheeltop Interaction, full-body gesture control for power wheelchair users.

These are just some highlights from the 16 innovative software engineering projects recognized by this year’s awards, now in their fourth year. You can read more about these and the rest of the SEIF winners on the SEIF website.

Congratulations to the winners of the 2013 SEIF awards!