If you, like Jeopardy contestant Ken Jennings, are ready to welcome “our new computer overlords,” chances are you won’t have to wait long. The field of artificial intelligence has advanced by leaps and bounds over the past few years, churning out incredible machines like IBM’s Watson, which soundly defeated Jennings and fellow player Brad Rutter at America’s toughest trivia game. Now, scientists at Georgia Tech’s Center for Robotics and Intelligent Machines (RIM) say they are within a decade of creating personal robots capable of cleaning our homes, taking us on guided tours and caring for our grandparents in nursing facilities. However, computer scientists simply won’t be able to program every robot to do all the things we will want them to do. This means we’ll have to tell robots what to do and how to do it. How will we do that?
Tip 1: Use English
Scientists want our interaction with robots to be as intuitive as possible, so that means designing them to process our natural language. Usually, computers are programmed using math-based languages, but most people don’t want to earn a degree in computer science just to tell a machine to vacuum the floor. With this in mind, scientists are developing programming languages based on English syntax rather than mathematical symbols, which is no small feat. Math-based languages allow for only one means of expression, while natural languages like English can phrase a single thought in half a dozen ways. When programming in a natural language, scientists must factor in all, or most, possible phrasings of input commands—a tedious task, but one that won’t leave future citizens guessing for the exact phrasing that will get robots to take out the trash.
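To see why many phrasings of one request is a real engineering problem, here is a minimal sketch (my own toy illustration, not any actual robot programming language or Georgia Tech system) that maps several English phrasings to a single canonical command. The command names and patterns are invented for the example.

```python
import re

# Toy illustration: several English phrasings, one canonical command.
# A real natural-language system would need far more than pattern
# matching, but the bookkeeping burden is already visible here.
COMMAND_PATTERNS = {
    "take_out_trash": [
        r"take (out )?the trash( out)?",
        r"empty the (trash|garbage)( can)?",
        r"throw (out|away) the garbage",
    ],
    "vacuum_floor": [
        r"vacuum( the floor| the house)?",
        r"clean the floor",
    ],
}

def parse_command(utterance):
    """Return the canonical command for an utterance, or None."""
    text = utterance.lower().strip().rstrip(".!?")
    for command, patterns in COMMAND_PATTERNS.items():
        for pattern in patterns:
            if re.fullmatch(pattern, text):
                return command
    return None
```

With this sketch, “Take out the trash!” and “empty the garbage can” both resolve to `take_out_trash`, while an unrecognized phrasing returns `None` — which is exactly the guessing game researchers want to spare future users.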
Tip 2: Throw Away that Keyboard
Though almost anyone can type a command into a computer, that kind of input method will prove impractical when granny needs her robot to help her out of the bathtub. Scientists know that personal robots will be expected to operate on voice commands for ease and efficiency of use. Some of this technology is already available through tools like the iPhone’s Siri software, which allows users to make phone calls, send texts and search the Internet by voice.
However, tomorrow’s robots will need to do more than just process simple voice commands; they will also need to learn the tasks their owners want them to do. This will require them to have electronic brains capable of being programmed with both visual and auditory information in the human-like process of “active learning.” RIM’s Maya Cakmak, Ph.D., is bringing this advanced technology to life by programming a robot named Simon to learn new tasks by asking questions. Her study on the subject, entitled “Designing Robot Learners that Ask Good Questions,” was recently presented at the 7th ACM/IEEE Conference on Human-Robot Interaction (HRI). Cakmak’s work will someday allow ordinary people to program robots without ever touching a keypad or phrasing commands in seemingly bizarre ways. However, you may still have to demonstrate for your robot exactly how to line up your collectible action figures.
Tip 3: Teach It to Ask the Right Questions
Robots can’t ask just any kind of question if they are to learn and communicate effectively with their human masters. People don’t want to spend all day teaching their robots how to hang up a jacket, for instance. So, what kinds of questions should a robot ask to facilitate a smooth robot-human interaction? Surprisingly, humans have provided the answer.
In an experiment, Cakmak asked a group of people to pretend to be robots bent on learning a new task. The questions participants asked in the course of their learning were sorted into three categories: label query, demonstration query and feature query. Cakmak found that 82 percent of the questions fell into the feature query category. When Cakmak asked the group to then rate which questions were “smartest,” 72 percent chose feature queries. Since humans seem to overwhelmingly prefer feature queries, this is the type of question learning robots will ask in the future.
A feature query seeks to define the features of a particular task. The example given in Cakmak’s study was, “Can I pour salt from any height?” Technically, anyone can pour salt from almost any height, but it may not be appropriate or desirable to do so, especially when the flavor of your mashed potatoes is at stake. This differs widely from a label query (“Can I pour salt like this?”), which simply yields a ‘yes’ or ‘no’ response.
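A toy sketch can show why one feature query beats a pile of label queries. The code below is my own illustration under invented assumptions (it is not Cakmak’s implementation): suppose the teacher secretly knows salt may be poured from any height up to 15 cm, and the robot can only ask yes/no label queries (“Can I pour salt from this height?”). Binary search over label queries homes in on the threshold, but it costs a whole series of questions.

```python
MAX_OK_HEIGHT = 15.0  # hidden "teacher knowledge" for this toy example

def answer_label_query(height):
    """Teacher's yes/no answer to 'Can I pour salt from this height?'"""
    return height <= MAX_OK_HEIGHT

def learn_threshold_with_label_queries(lo=0.0, hi=100.0, tol=1.0):
    """Narrow the acceptable pouring height by repeated label queries."""
    queries = 0
    while hi - lo > tol:
        mid = (lo + hi) / 2
        queries += 1
        if answer_label_query(mid):
            lo = mid  # pouring from mid was fine; threshold is higher
        else:
            hi = mid  # too high; threshold is lower
    return lo, hi, queries
```

Running this, the robot needs seven yes/no exchanges to pin the threshold down to within a centimeter. A single feature query (“Can I pour salt from any height?”) answered with “no, keep it below 15 cm” conveys the same information in one exchange — one reason participants rated feature queries as the smartest questions.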
Tip 4: Observe the Subtle Cues
There’s more to communication than just verbalizing. Eye contact, hand gestures, tone of voice and body language are all part of the equation, and things that robots will have to master to truly integrate into our world.
“Other human beings understand turn-taking,” says Aaron Bobick, chair of Georgia Tech’s School of Interactive Computing. “They understand that if I make some indication, they’ll turn and face someone when they want to engage with them, and they won’t when they don’t want to engage with them. In order for these robots to work with us effectively, they have to obey these same kinds of social conventions.”
In the future, robots will be able to wave, beckon and communicate in other non-verbal ways. They’ll also be able to analyze their masters’ social and physical communication cues and respond to them appropriately. Researchers at Georgia Tech found that when they equipped their robot Simon with cameras, it could predict with 80 percent accuracy whether or not it had attracted a person’s attention with a simple mechanical gesture. Talking to a socially conscious robot means you won’t feel the urge to scream to make your wishes understood.
Tip 5: Remember, It is Still a Robot
Scientists are working hard to give robots more human-like qualities, such as smooth movements and somewhat random behavior. The purpose is to make human-robot communication as natural and productive as possible. Someday, people may even be able to learn tasks by observing how robots perform them. No matter how human they seem, though, they’re still machines incapable of acting beyond their programming. This is especially important to remember if you ever find yourself in a hospital about to get a sponge bath from Georgia Tech’s “Cody.” No, the robot isn’t touching your arm to comfort you. It doesn’t have that capacity. You can climb down from the ceiling, now.