Projects

As part of the Traffic21 initiative at CMU, we are investigating the design and application of adaptive traffic signal control strategies for urban road networks. Our research has three broad themes: (1) development of signalization strategies that allow real-time response to shifts in traffic conditions

As robotics technology evolves to a stage where co-robots, or robots that can work with humans, become a reality, we need to ensure that these co-robots are equally capable of interacting with humans with disabilities. This project addresses this challenge by exploring meaningful human-robot interaction (HRI) in the context of assistive robots for blind travelers.

As DoD autonomous vehicles begin to take on more complex and longer-duration missions, they will need to incorporate knowledge about the current state of their sensing, actuation, and computing capabilities into their mission and task planning.

The research project aims to design and demonstrate new sensor technologies for autonomously gathering crop and canopy size estimates from a vineyard -- expediently, precisely, accurately, and at high resolution -- with the goal of improving vineyard efficiency by enabling producers to measure and manage the principal components of grapevine production on an individual-vine basis.

We are developing implantable, biodegradable electronic devices that offer the potential to provide therapeutic functions for limited periods of time (weeks to months), degrading in step with the anticipated needs of the application and thus not requiring surgical removal. One application is a biodegradable radio frequency (RF) power generator connected to electrical stimulating electrodes to enhance bone regeneration.

We are developing implantable, wireless MEMS-based sensors for various applications, such as monitoring bone regeneration and left ventricular pressure, to provide timely feedback that helps clinicians make better decisions on the timing of therapeutic interventions.

We have designed and built inkjet-based bioprinters to controllably deposit spatial patterns of various growth factors and other signaling molecules on and in biodegradable scaffold materials to guide tissue regeneration.

We have developed a manufacturing process to convert donated blood plasma and platelets into inexpensive, off-the-shelf bioactive plastics to enhance and accelerate tissue healing. These materials contain nature’s own mix of growth factors in highly concentrated solid to semi-solid forms that controllably elute these factors as the bioplastics degrade. This technology is currently in human clinical trials.

In this project, we are investigating ways to leverage spatial context for the recognition of core building components, such as walls, floors, ceilings, doors, and doorways, for the purpose of modeling interiors using 3D sensor data.

"Extrinsic Dexterity" is a way to get dexterous manipulation with a very simple hand, by coordinating finger motion with arm motion. The more common approach is to depend entirely on the fingers of the hand, which requires at least three fingers and at least nine motors. We have demonstrated Extrinsic Dexterity using the single motor of the MLab Hand, coordinated with the motions of the arm.

Cyber-Physical Systems (CPS) encompass a large variety of systems, including future energy systems (e.g., the smart grid), homeland security and emergency response, smart medical technologies, smart cars, and air transportation. The goal of this project is to develop cognitively based analytic models of human operators that can be integrated with models of the physical/robotic system, so that the whole mixed human-CPS system can be formally verified.

In this project, we are developing mapping and localization methods that combine aerial imagery from satellite and aerial platforms with maps and perception from ground-based robots to produce integrated maps even when GPS is unavailable.

We have developed a novel and relatively simple method for magnifying forces perceived by an operator using a tool. A sensor measures the force between the tip of a tool and its handle held by the operator’s fingers.

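As a minimal sketch of the idea (not the actual implementation), suppose the operator should perceive the tip force scaled by a gain; the handle actuator then only needs to add the portion not already transmitted through the tool shaft. The linear model and the gain value below are assumptions for illustration:

```python
def added_handle_force(f_tip: float, gain: float) -> float:
    """Force (N) an actuator at the handle would add so the operator
    perceives the measured tip force scaled by `gain`.

    Illustrative sketch only; a real system must handle dynamics,
    saturation, and stability of the force-feedback loop.
    """
    # The operator already feels f_tip through the tool shaft, so the
    # actuator supplies only the remaining (gain - 1) share.
    return (gain - 1.0) * f_tip
```

For example, a 0.2 N tip contact magnified tenfold calls for roughly 1.8 N of added handle force, and a gain of 1 adds nothing.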
In this project we develop the trajectory planning system for an autonomous helicopter used for cargo delivery. To read more about the trajectory planning system, see the following publications.

We have developed a new image-based guidance system for microsurgery using optical coherence tomography (OCT), which presents a continuously updated virtual image in its correct location inside the scanned tissue. OCT provides real-time, 6-micron resolution images at video rates within a 2-6 mm axial range in soft or transparent tissue, and is therefore suitable for guidance to various targets in the eye.

Our goal is to allow people and intelligent and dexterous machines to work together safely as partners in assembly operations performed within industrial workcells. To ensure the safety of people working amidst active robotic devices, we use vision and 3D sensing technologies, such as stereo cameras and flash LIDAR, to detect and track people and other moving objects within the workcell.

We leverage perception technology originally developed for ground-based robot vehicles during 20 years of research at the Field Robotics Center. We combine this proven perception and control technology with aircraft-centric engineering and optimization.

The goal of this project is to develop the next level of capability for a low-flying, map-building MAV scout. The research will demonstrate rapid scouting in cluttered environments and acquire relevant semantically annotated maps.

Articulated locomoting robots are the focus of an ongoing research project in the Robotics Institute's Biorobotics Lab. We call our robots Modsnakes because they have a modular structure, i.e., a repeating chain of identical elements. Each module has a single rotational degree of freedom; chained together, the modules form the snake robot.

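A common way to command such a chain of single-degree-of-freedom modules is a serpenoid (traveling sine wave) gait. The sketch below is a generic illustration with made-up parameter values, not the lab's actual controller:

```python
import math

def serpenoid_angles(n_modules: int, t: float,
                     amplitude: float = 0.5,
                     spatial_freq: float = math.pi / 4,
                     temporal_freq: float = 1.0) -> list[float]:
    """Joint angle (rad) for each module at time t: a sine wave that
    travels down the chain, producing a snake-like undulation."""
    return [amplitude * math.sin(temporal_freq * t + spatial_freq * i)
            for i in range(n_modules)]
```

Sweeping t and sending these angles to the modules makes the wave, and hence the robot, move; amplitude and the two frequencies trade off speed, curvature, and joint range.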
Safe and independent navigation of urban environments is a key feature of accessible cities. People who have physical challenges need practical, customizable, low-cost and easily-deployable mobility aids to help them safely navigate urban environments. Technology tools provide opportunities to empower people with disabilities to overcome some day-to-day challenges.

We are using video cameras to give vision to the ultrasound transducer. This could eventually lead to automated analysis of the ultrasound data within its anatomical context, as derived from an ultrasound probe with its own visual input about the patient’s exterior. We are exploring both probe-mounted cameras and optically tracked stand-alone cameras, which could view a larger portion of the patient's exterior.

The goal of this project is to increase the effectiveness of paratransit service providers in managing daily operations through the development and deployment of dynamic, real-time scheduling technology.

We are researching and developing methods to empower consumers and service providers in the design and evaluation of accessible transportation equipment, information services, and physical environments.

This project is developing technology to map riverine environments from a low-flying rotorcraft. Challenges include dealing with varying appearance of the river and surrounding canopy, intermittent GPS and a highly constrained payload. We are developing self-supervised algorithms that can segment images from onboard cameras to determine the course of the river ahead, and we are developing devices and methods capable of mapping the shoreline.

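One hedged sketch of what "self-supervised" can mean here: pixels the system can label on its own (e.g., water directly ahead of the vehicle) train a simple color model, which then classifies the rest of the frame. The color model and threshold below are illustrative assumptions, not the project's actual algorithm:

```python
import numpy as np

def segment_river(image: np.ndarray, seed_mask: np.ndarray,
                  threshold: float = 30.0) -> np.ndarray:
    """Classify pixels as river by color distance to the mean of
    self-labeled seed pixels.

    image: HxWx3 float array; seed_mask: HxW bool array of pixels
    the system labeled as water automatically.
    """
    mean_color = image[seed_mask].mean(axis=0)         # learned river color
    dist = np.linalg.norm(image - mean_color, axis=2)  # per-pixel distance
    return dist < threshold                            # boolean river mask
```

A real system would update the model over time and add texture or geometric cues, but the structure (automatic labels feeding a classifier) is the same.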
We present a fleet of autonomous Robot Sensor Boats (RSBs) developed for lake and river freshwater quality assessment and controlled by our Multilevel Autonomy Robot Telesupervision Architecture (MARTA).

The Integrated Automation for Sustainable Specialty Crops Farming project teams the National Robotics Engineering Center (NREC), the University of Florida, Cornell University and John Deere to bring precision agriculture and autonomous equipment to citrus growers.

We are developing a robotic system for phenotyping crops to support rapid breeding decisions. The system positions sensors within the canopy for measurements not observable from above or below. Machine learning and computer vision algorithms are then used to generate phenotyping data from the raw sensor data.

The purpose of this project is to develop methods for place matching that are invariant to short- and long-term environmental variations in support of autonomous vehicle localization in GPS-denied situations.

This research project aims to develop methods to automatically collect visual image data to infer, estimate, and forecast crop yields -- producing accurate, high-resolution yield maps across large scales. To achieve efficiency and accuracy, statistical sampling strategies are designed for human-robot teams that are optimal in the number, location, and cost of samples and the accuracy of the resulting crop estimates.

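The optimization itself is project-specific, but the stratified estimator it builds on can be sketched as follows. The stratum structure and the 10% sampling fraction are assumptions for illustration:

```python
import random

def stratified_yield_estimate(strata: list[list[float]],
                              frac: float = 0.1, seed: int = 0) -> float:
    """Estimate total yield by sampling a fraction of plants in each
    stratum (e.g., a block of similar vines) and scaling the sample
    mean by the stratum size."""
    rng = random.Random(seed)
    total = 0.0
    for plants in strata:
        k = max(1, int(frac * len(plants)))      # samples in this stratum
        sample = rng.sample(plants, k)           # sample without replacement
        total += len(plants) * (sum(sample) / k) # scaled sample mean
    return total
```

Choosing how many samples to place in each stratum, and where, is exactly the optimization the project formulates for human-robot teams.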