News & Events

Headlines

The latest advance in a new type of optics aimed at improving microscopy started with a game of tennis three years ago between Mooseok Jang, a graduate of Professor Changhuei Yang's lab, and Yu Horie, working with Professor Andrei Faraon. "The hope is that our work will prompt further interest in this area of optics and make this type of microscopy and its advantages feasible for practical, everyday use—not just as a proof of concept," says Josh Brake, a graduate student in Yang's lab who continues to work on the project with Faraon and Yang. [Caltech story]

Caltech and Disney Research have entered into a joint research agreement to pioneer robotic control systems and further explore artificial intelligence technologies. Pietro Perona will work with Disney roboticist Martin Buehler to create navigation and perception software that could allow robotic characters to safely move through dense crowds and interact with people. Aaron Ames will work with Disney Research's Lanny Smoot to further explore robot autonomy and machine learning by creating objects that can self-navigate and perform stunts. Yisong Yue has been working with engineers from Disney Research on the use of machine learning to analyze the behavior of soccer players and to measure audience engagement. [Caltech story]

Professor Venkat Chandrasekaran and graduate student Armeen Taeb have developed an empirical statewide model of the California reservoir network. This work offers reservoir managers insight into how to plan for and respond to drought conditions. "The bread and butter of hydrology is using physical laws to describe water phenomena. But the behavior of these reservoirs is not solely determined by physical laws of the water cycle, but also by demands and what these reservoirs are being used for," Taeb explains. [Caltech story]

Lihong Wang, Bren Professor of Medical Engineering and Electrical Engineering, and colleagues have improved a technique for taking three-dimensional (3-D) microscopic images of tissue, allowing them to see inside living creatures with greater precision than before. "This gives us the ability to look through opaque materials and see what's inside," Professor Wang says. "It's like an extension of the human eye, like Superman's X-ray vision." [Caltech story]

Azita Emami, Andrew and Peggy Cherng Professor of Electrical Engineering and Medical Engineering; Investigator, Heritage Medical Research Institute; and EAS Division Deputy Chair, along with her colleagues including Professor Mikhail Shapiro have developed microscale devices that relay their location in the body. "We wanted to make this chip very small with low power consumption, and that comes with a lot of engineering challenges," says Professor Emami. "We had to carefully balance the size of the device with how much power it consumes and how well its location can be pinpointed." [Caltech story]

Professor John Doyle and colleagues are among only nineteen groups in the United States to receive National Science Foundation (NSF) funding to conduct innovative research focused on neural and cognitive systems. Their aim is to integrate the capabilities of deep learning networks into a biologically inspired architecture for sensorimotor control that can be used to design more robust platforms for complex engineered systems. [NSF release]

Caltech and Cornell teamed up to create the iNaturalist Challenge, a competition to create the best machine-learning algorithm for identifying the world's plant and animal species. The contest was an outgrowth of the institutions' previous work together on Visipedia, a visual encyclopedia created by a network of people and machine-learning computers that harvest image information off the internet. The technology was developed for the encyclopedia by Pietro Perona's Vision Group at Caltech and Serge Belongie's Computer Vision Group at Cornell Tech. [Caltech story]

Professor Ali Hajimiri and colleagues have developed a new camera design that replaces the lenses with an ultra-thin optical phased array (OPA). The OPA does computationally what lenses do using large pieces of glass: it manipulates incoming light to capture an image. "Here, like most other things in life, timing is everything. With our new system, you can selectively look in a desired direction and at a very small part of the picture in front of you at any given time, by controlling the timing with femto-second—quadrillionth of a second—precision," says Professor Hajimiri. [Caltech story] [ENGenious silicon photonics feature]
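The steering principle behind a phased array can be sketched with the standard uniform-linear-array relation: delaying each emitter by a fixed phase increment tilts the combined wavefront toward a chosen angle. The snippet below is an illustrative sketch of that textbook calculation, not the actual OPA design from Professor Hajimiri's group; the element count, pitch, and wavelength are assumed values chosen for the example.

```python
import math

def steering_phases(n_elements, pitch_um, wavelength_um, angle_deg):
    """Per-element phase shifts (radians, wrapped to [0, 2*pi)) so a
    uniform linear array steers its beam to the given off-axis angle."""
    k = 2 * math.pi / wavelength_um                      # free-space wavenumber
    delta = k * pitch_um * math.sin(math.radians(angle_deg))  # increment per element
    return [(i * delta) % (2 * math.pi) for i in range(n_elements)]

# Example: an 8-element array with 2 um pitch at 1.55 um light,
# steered 10 degrees off axis (all values hypothetical).
phases = steering_phases(8, 2.0, 1.55, 10.0)
```

In an OPA camera the same idea runs in reverse: shifting the phase (equivalently, the femtosecond-scale timing) applied to each receiving element selects which incoming direction adds up constructively, which is how the array "looks" in a desired direction without a lens.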

Engineers at the Optical Imaging Laboratory led by Professor Lihong Wang have developed an imaging technology that could help surgeons removing breast cancer lumps confirm that they have cut out the entire tumor—reducing the need for additional surgeries. "What if we could get rid of the waiting? With 3D photoacoustic microscopy, we could analyze the tumor right in the operating room, and know immediately whether more tissue needs to be removed," Professor Wang explains. [Caltech story]

Lihong Wang, Bren Professor of Medical Engineering and Electrical Engineering, and colleagues are now able to take a live look at the inner workings of a small animal with enough resolution to see active organs, flowing blood, circulating melanoma cells, and firing neural networks. "Photoacoustic tomography combines light and sound synergistically for high-resolution imaging of molecular contrast," says Professor Wang. [Caltech story] [Read the paper]