The Age of Autonomous Robots Is Upon Us

Visit a small business in 2016 and you might notice a fresh face in the room: a robot.

Hulking robotic devices have long been a fixture on factory floors and in other heavy industrial environments. And the dirt-sucking Roomba and its peers, rudimentary versions of the humanoid butler Rosie from The Jetsons, are certainly popular in homes across the country.

But a new class of robots has arrived, promising to be simple enough for a small team to build and cheap enough to justify the effort. There are robots that monitor and stock shelves in grocery stores. There are robots that will mow the lawn for you. There are cars that will drive themselves, office assistants that require regular recharging, and food delivery guys that won’t ask for a tip.

It’s the kind of thing you’ve seen for years in science-fiction movies, but the reality has been far more elusive. A decade ago, a robot that could do these things would have pushed a corporate lab to the limits of what was possible, and it certainly wouldn’t have been cost-effective. That’s no longer the case. Businesses no longer need NASA or Toyota to realize a more automated future—just a small team of passionate builders at a technology startup.

Why now? The reason is simple: timing. Advancements in the robotics industry that have been building for decades are converging. And trends in the broader technology industry—such as the growing library of open source software, advances in gaming systems, increasingly powerful and energy-efficient processors, and cheap sensors—have helped to accelerate the trajectory of “smart” autonomous machines.

“It’s related to the hardware revolution and development kits that are sold so cheaply and accessibly,” says Alex Libman, chief executive of NUA Robotics, an Israeli startup that makes a suitcase that follows its owner like a pet dog. “There’s so much data out there that allows people to build. People can build a small lab in their home for nothing.”

Sensors: Achievement Unlocked

A key ingredient driving the robotics boom may already be in your living room: a video game console.

Intelligent robots need to interact with their surroundings, which they see and process using sensors. Until recently, sensors cost thousands of dollars, making it impossible to build a robot on the cheap.

That changed in 2010 with the arrival of Kinect, a motion-sensing system built for Microsoft’s Xbox gaming console. Seemingly overnight, a capable vision system was available at the nearest electronics store for mere hundreds of dollars. Kinect quickly became the eyes for a generation of robots.

Today, Intel hopes to supplant Microsoft among do-it-yourself types with its RealSense camera. The sensor’s thickness is measured in millimeters, not centimeters. It’s cheaper and more energy efficient. And it’s the latest example of an affordable product (a developer kit costs $99) that could help unlock the potential of robotics.

“We didn’t want to develop RealSense because we want to ship it in a few million PCs,” says Achin Bhowmik, general manager of Intel’s perceptual computing group. “We want to ship it into hundreds of millions of intelligent systems.”

Even cheaper types of sensing are on the way. Dispatch, a San Francisco robotics startup developing a mobile box that can autonomously travel through cities to make deliveries, had to find an alternative to Kinect and RealSense that could cope with bright sunlight. The startup hasn’t yet decided which technology it will use, but it has considered both a camera-based system and Lidar, the laser-based technology that Google’s self-driving cars use to sense their surroundings.

One problem: It cost Google upward of $70,000 to outfit its cars with Lidar. But relief is coming. Dispatch co-founder Sonia Jin is hopeful that there will be sub-$1,000 Lidar systems available within the next year or two. Units that cost just a few hundred dollars appear to be on the way, she says.


Real-time Data Processing: An Inauspicious Start

In the late 1990s, Steve Cousins witnessed something surprising: a computer vision system that could track a person’s location in a video, even when they walked behind an obstacle. It was a big step up from the tasks already being performed by robots, which tended to be simple and pre-programmed; no intensive computing necessary.

In 2007, Cousins became CEO of Willow Garage, a famed (and now defunct) research center that built autonomous robots. He hunted down the Xerox researcher who presented the video a decade before and asked if he could use the tracking technology.

The researcher’s answer was no. It took days for Xerox’s computer vision system to process the data necessary to accurately track the person’s movement during that one-minute video. In 2007, the same feat could be accomplished in about 30 minutes—impressive, but still far too slow to be useful in a Willow Garage robot. Autonomous robots need to be able to take in their surroundings and react. They can’t spend minutes, let alone seconds, analyzing what they see.

“You have to be able to process the data in real time,” Cousins says. “A lot can happen in a second as someone is coming toward you. If you don’t realize it as it’s happening and slow down or take evasive action, you can have a collision, and collisions are something we try very hard in robotics to avoid.”

Real-time data processing became a reality by the late 2000s, as Moore’s Law pushed processing power to the point where split-second decisions were feasible. Today, robots can almost instantly analyze a massive amount of three-dimensional data about their surroundings, computing distances, identities, and other essential information.

Cousins is now the CEO of Savioke, which makes a robot that can autonomously ferry small items like toothbrushes to hotel guests. After someone requests an item, the robot leaves its dock near the front desk, makes its way down a hallway, up the elevator, and to the guest’s room, all without assistance. Because it processes sensor data in real time, it can recognize people and other objects as obstacles and stop or steer around them.
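The kind of decision such a robot makes constantly can be sketched in a few lines. This is a toy illustration, not Savioke’s actual software: it treats a depth-camera frame as a grid of distance readings and stops if anything is closer than a hypothetical safety threshold.

```python
# Toy sketch (not Savioke's actual code): decide whether to stop,
# given a depth-camera frame where each value is a distance in meters.

STOP_DISTANCE_M = 0.5  # hypothetical safety threshold


def nearest_obstacle(depth_image):
    """Return the smallest distance reading in the frame."""
    return min(min(row) for row in depth_image)


def should_stop(depth_image, threshold=STOP_DISTANCE_M):
    """True if anything in view is closer than the threshold."""
    return nearest_obstacle(depth_image) < threshold


# A tiny 3x4 "frame": a person has stepped into view, 0.4 m away.
frame = [
    [2.1, 2.0, 1.9, 2.2],
    [1.8, 0.4, 0.5, 2.0],
    [1.7, 0.6, 0.7, 1.9],
]
print(should_stop(frame))  # prints True: 0.4 m is inside the 0.5 m threshold
```

The real work, of course, is producing that depth frame and running checks like this dozens of times per second, which is exactly what late-2000s processors made practical.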

Open Source Software: No Starting From Scratch

Today, most corporations do not differentiate themselves by their back-office functions, preferring to outsource them to one of a handful of enterprise software providers.

For a long time there wasn’t an equivalent in the robotics industry—until open source software came along, that is. Free for anyone to use, open source software gave the growing robotics industry a shared knowledge base for basic functionality, sparing companies years of foundational programming and freeing them to work on what makes their ‘bot better.

The Robot Operating System, or ROS, is one such option, built by Willow Garage. ROS is now used in everything from the humanoid search-and-rescue robots that competed in the DARPA Robotics Challenge to Tally, an autonomous robot that tracks inventory on store shelves. Mirza Shah, the CTO of Simbe, the company that makes Tally, estimated that without ROS it would have taken the startup 22 years, rather than 18 months, to build the robot.
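Part of what ROS shares across all these robots is a common plumbing pattern: nodes that publish messages on named topics and other nodes that subscribe to them. The sketch below is a toy in-process imitation of that pattern, not the real ROS API (which, via libraries like rospy, connects nodes running as separate processes).

```python
# Toy in-process imitation of ROS-style publish/subscribe plumbing.
# The real ROS API (e.g. rospy) does this across separate processes.

from collections import defaultdict


class MessageBus:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        """Register a callback to run for every message on a topic."""
        self._subscribers[topic].append(callback)

    def publish(self, topic, message):
        """Deliver a message to every subscriber of the topic."""
        for callback in self._subscribers[topic]:
            callback(message)


bus = MessageBus()
log = []

# A navigation node listens for obstacle reports from a sensor node;
# neither needs to know the other exists, only the topic name.
bus.subscribe("/obstacles", lambda msg: log.append("nav saw: " + msg))
bus.publish("/obstacles", "person at 0.4 m")
print(log)  # prints ['nav saw: person at 0.4 m']
```

Decoupling nodes this way is what lets a startup swap in community-built drivers, mappers, and planners instead of writing them from scratch.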

The “consumerization” trend for robots continues. Roboticists are starting to think about what an app store for robots might look like. It’s only a matter of time, some say, before an explosively popular autonomous robot shoves the category into the mainstream.

“We are barely scratching the surface of this category of intelligent, autonomous machines,” Bhowmik says. “I believe we are on the cusp of a revolution.”

Talent: The Roboticists Are Coming

The convergence of these technologies has ignited interest among students who would otherwise work in other engineering jobs. Universities have decades of institutional knowledge in robotics research; until recently, students who studied the discipline were forced to find jobs only tangentially related to their studies.

Now they’re seeing high demand for their skills, according to Maja Matarić, who directs the Autonomous Systems Center at the University of Southern California School of Engineering.

“They’ve always been ready. There just haven’t been jobs,” she says. “Now everybody is hiring. And that’s happened within two or three years.”

Budding roboticists who end up in a mature ecosystem like Silicon Valley will find even more institutional knowledge to tap in the form of talent, engineering, and investor support, says Brady Forrest, vice president of the Highway1 startup accelerator. It’s not just about proximity to capital—it’s also about finding a like-minded community.

Not that the money isn’t helping. Venture funding for robotics has grown to $922.7 million in 2015, up from $341.3 million in 2014, according to estimates by Hizook, a news site for academics and professionals in the robotics industry. A recent International Data Corporation report projects that the world will spend $135.4 billion on robotics and related services in 2019, up from $71 billion in 2015.

That $71 billion was dominated by the manufacturing industry, which accounted for 63.4% of robotics spending last year, according to the report. Expensive industrial machines will continue to drive worldwide spending. But a growing share is coming from the burgeoning consumer and industrial applications outside manufacturing, which in turn are attracting robotics venture funding as well as attention from Silicon Valley accelerators such as Highway1, Lemnos Labs, and HAX.

The legacy of Willow Garage still looms over the industry—a PayPal Mafia, of sorts, for robotics. In the years leading up to the company’s shuttering in 2014, Willow Garage employees gradually left to found their own companies. The knowledge they gained building some of the finest autonomous robots of the last decade has now dispersed across many teams, each working on different ideas. Former robot development manager Melonee Wise has gone on to become CEO of Fetch Robotics, which is working on a pair of robots that can pick items off warehouse shelves and ferry them away. Founder Scott Hassan’s new company, Suitable Technologies, makes a telepresence robot. The list goes on.

“There are a lot of shoulders you can stand on as you do a startup,” says Cousins, the former Willow Garage CEO. “We went from starting Savioke to putting a robot in the field in 10 months. That’s almost unheard of. At two years old, this startup is raising a lot of money and building a lot of robots. We’re part of a great ecosystem that’s tuned to let these things happen.”
