Arguably the most prominent headline-grabbing project of the tech giants in recent years is the effort to develop autonomous vehicles, including the Waymo self-driving car project at Alphabet (GOOGL).
However, such conspicuous projects, which risk real-world fatalities, are obscuring more realistic attempts by Big Tech to put robots into service.
Numerous efforts are underway at Google, Amazon (AMZN), Microsoft (MSFT), and Facebook (FB) to accelerate the development of robotic systems, building upon years of training robots on basic tasks. The coming decade is more likely to see practical progress in robotics than in self-driving vehicles. Although the technology is still years from maturity, here's how the field could move past the Roomba, the floor-cleaning robot from iRobot (IRBT) that is to date the only robotic system successfully mass-manufactured for the public.
Robotics today is just emerging from a fairly primitive state. Products such as the Roomba, and industrial robots such as the Sawyer machine (originally built by Rethink Robotics, now owned by Germany's Hahn Group), have been programmed for a limited set of tasks using extensive hand-coded knowledge of physics. But the artificial intelligence researchers inside Facebook and the other giants are trying to build a class of robots with much broader capabilities that don't have to be explicitly programmed for every new task: robots that learn, in a sense.
Facebook's A.I. research group recently offered up a novel programming tool, called PyRobot, which abstracts away many of the details of controlling basic hardware functions in the robotics systems already on the market. The tool bundles together many pieces of code that a researcher would ordinarily have to write from scratch. The idea is to build upon software that is becoming a de facto standard, including the Robot Operating System, or ROS, which grew out of robotics research at Stanford University a decade ago and was further developed at the Willow Garage lab. ROS has become a platform supported by robot hardware makers, so that programmers don't need to know the details of each robot in order to program any of them.
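To illustrate the idea, here is a toy sketch of that kind of hardware-abstraction layer. The class and method names below are invented for illustration and are not PyRobot's or ROS's actual API: user code talks to one uniform Robot interface, while a vendor-specific driver is swapped in underneath.

```python
# Toy sketch of a hardware-abstraction layer in the spirit of PyRobot/ROS.
# All names here are invented for illustration, not a real library's API.

class VendorArmDriver:
    """Stand-in for a vendor-specific, low-level motor driver."""
    def send_joint_command(self, angles):
        # A real driver would talk to motor controllers over a serial bus.
        return f"joints -> {angles}"

class Robot:
    """Uniform interface: user code never touches the vendor driver directly."""
    def __init__(self, driver):
        self._driver = driver

    def set_joint_positions(self, angles):
        # Shared safety checks live here once, instead of in every user script.
        if len(angles) != 5:
            raise ValueError("expected 5 joint angles")
        return self._driver.send_joint_command(angles)

# Swapping robots means swapping the driver, not rewriting the user code.
robot = Robot(VendorArmDriver())
print(robot.set_joint_positions([0.0, 0.5, -0.5, 0.0, 0.0]))
```

PyRobot's real interface is richer, with separate modules for a robot's arm, base, and camera, but the design intent is the same: one API across many robot bodies.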
Facebook is one of the big tech firms keenly interested in robots. During a conference in New York last month, Facebook's head of A.I. research, Yann LeCun, declared that one of the most interesting problems in artificial intelligence is how robotics systems develop a sense of the built environment, rather than the kinds of A.I. that are confined to a server or a smartphone.
For LeCun and others, robots are not only potentially practical devices; they represent the future of artificial intelligence and machine learning research.
To advance these efforts, Facebook, Amazon, Google, and Samsung Electronics have all contributed funding to a new "open research commons" on the University of California, Berkeley campus, under the auspices of Berkeley's A.I. lab.
That Berkeley lab has done groundbreaking work in recent years enabling Sawyer and other robot systems to discover how to move about, grasp, and place objects in their environment, all without being explicitly programmed the way robots traditionally were. Under the direction of professor Sergey Levine, the Berkeley group is carrying out what Levine likens to a robot childhood, taking machines through the basic stages of acquiring knowledge about the world. Facebook's LeCun, at that conference in New York in June, specifically cited Berkeley's work and said similar research is underway inside the social media giant.
Helping efforts such as PyRobot and the Berkeley work are new, cheaper robot systems. They include compact machines for picking up objects, such as the LoCoBot from Trossen Robotics of Downers Grove, Illinois, which retails for $2,099. Most important, such low-cost machines will now be informed by the vast machine learning efforts of the tech giants, gaining capabilities that no one company could anticipate or design.
Of course, Alphabet famously went all-in on robots some years ago: it bought an MIT spin-off, Boston Dynamics, maker of those creepy four-legged, animal-like robots, in December 2013, along with Schaft, a company that came out of research at the University of Tokyo. But Google subsequently sold Boston Dynamics to Japanese conglomerate SoftBank (SFTBY), and reportedly has shut down the Schaft program.
Those developments are not really indicative of much: companies such as Alphabet spend freely to experiment and learn about a field. The near future of robots is likely to look a lot more like simple systems such as the LoCoBot, which can be cheaply obtained and easily programmed, than like the giant, monstrous creations of Boston Dynamics.
What will these new robots do? Google, Facebook, and Amazon will probably start small, selling consumer devices that can carry out household tasks, much like iRobot's products. The key differentiator for these systems will be that they're tied to the vast cloud computing resources and machine learning efforts of the tech giants. With luck, these robots will prove both a less ambitious gamble than self-driving cars and less awkward than Google Glass.