What This Ping-Pong Robot Tells Us About the Next Phase of Human-Robot Interaction
There were dozens of robots at CES, and many of them were the kinds with human features -- faces, arms, legs and even personalities. One of them, a giant machine that can play table tennis against a person, didn’t feel so “artificially” intelligent, given how good it was at what its creators built it to do.
This giant robot, Forpheus, created by Japanese technology company Omron, is not available for consumers to buy. It’s a table tennis tutor that plays against humans, though it doesn’t try to crush its competition. It senses movements and facial expressions, gauging skill level and emotions and adapting its game to the player at hand.
“We try to use technology to improve people’s lives and make people the best they can be,” says Keith Kersten, a marketing manager with Omron. “Forpheus incorporates real technologies that we use, just with different applications.”
For instance, Omron’s facial recognition technology is already embedded in millions of mobile devices today. Down the road, the company foresees potential for use in semi-autonomous vehicles to detect whether a driver is getting tired or not paying attention and take over when necessary, or with medical patients to sense illness or health episodes and call upon assistance.
These are some of the subtle types of interactions we’ll have with robots as technology develops further. They go deeper than the idea of humans and robots becoming companions, though CES included a range of social bots specializing in everything from retail to eldercare. Rather, the ability to process emotions (often powered by cloud servers, but no less real for it) shows promise to build empathy between people and machines. Whether it takes the form of a “face-to-face” interaction or an automatic task, when we see robots begin to sense our human needs, it will broaden the scope of what robots can add to human lives.
“All it takes is for a robot to do something really well, and it will flip the switch in our heads from novelty to practical,” says Alex Gaughan, who works in SaaS marketing at Montreal-based C2RO Cloud Robotics. “We’re seeing some momentum for actually figuring out the social benefit and how they add value to business.”
Gaughan points out that people in the Western hemisphere tend to intellectualize robots, while people in Asia tend to use more natural human vocabulary to describe their interactions with them. People in the Americas think along the lines of, Oh, this is a robot. What are the implications of me interacting with this robot?
“We don’t even realize that we’re already using the AI that’s in these things on our phones every day,” Gaughan says.
Despite the traction of voice assistants such as Amazon Alexa and Google Home, we’re in the “Stone Age” of using voice to communicate with machines, explains Vijay Umapathy, senior product manager at Jibo. “Someday people will look back and say, ‘OK, the voice thing was there, but I had to talk to it?’” Umapathy says. “Technology has been and is getting more and more capable of having its own agency.”
The team at Jibo is working on upgrades to make its eponymous social robot initiate more interactions with the humans whose homes it (or “he”) occupies. Leading up to Jibo’s launch in October 2017, its developers worked to make him seem more aware of his surroundings, turning his “head” to look at people as they move about a room so it’s clear he perceives they’re there before he says anything.
Right now, the biggest voice assistants on the market mainly field simple questions such as “What’s the weather today?” The Echo Show adds a screen -- a second interface -- to Alexa/Echo. Similarly, Jibo is not limited by one type of interface. One of Umapathy’s colleagues has a sign in his office that reads, “The Relationship Is the UI,” or user interface. This represents the belief that it takes a more ambient, all-encompassing presence to reveal what interactions with a piece of technology can accomplish.
“It's kind of the difference between the reality that we have today and honestly, Star Wars,” Umapathy explains, noting that the film franchise inspired Jibo founder Cynthia Breazeal’s interest in robotics. “If you look at a lot of droids in Star Wars movies, the scene starts with them actually walking in and saying, 'Hey, something's wrong,’ by beeping in whatever language or speaking, in the case of C-3PO. 'Master Luke, Master Luke!' We believe that we should live in a world where interactions start from the technology and don't just wait on the user to actually do something.”
Again, this doesn’t have to be interactive, such as a personified robot rolling up to you and predicting what you want next or a pingpong-playing robot helping you improve your volley. It’s about the technology inside those easy-to-conceptualize bots that can sense who you are and what you want. It could take the form of an integrated smart-home system that senses you’ve arrived home from work and automatically adjusts the lights.
“That would be pretty magical,” Umapathy says, “and it would also probably result in a very different set of usage of a lot of these different services.”