
Former Google VP: Machines emotionally intelligent in 2016

Andrew Moore, the Dean of the Carnegie Mellon School of Computer Science and a former Vice President at Google, just told me something exciting. Moore predicts that 2016 will see a rapid proliferation of research on emotional understanding in machines. Robots, smartphones, and computers will very quickly start to understand how we’re feeling and will be able to respond accordingly.

“When it comes to things like advertising, this may feel uncomfortable. I have a background in internet advertising and if I wasn’t busy being a dean I would be launching a startup to enable your tablet to watch you while you read web pages in order to see if you’re reacting positively or negatively.”

It’s not difficult to see how valuable that information would be to advertisers, nor is it likely that the technology will advance without heavy scrutiny and criticism.

Because humans give off a number of discernible cues about how they’re feeling, both consciously and unconsciously, research in this area has taken several parallel paths. Voice patterns can reveal stress and excitement, and the movement of facial muscles provides a revealing map of a person’s inner state. One of the biggest breakthroughs for emotional sensing by machines is actually rather mundane.

“Cameras are now higher resolution. High res cameras can track slight movements on the face and even individual hairs.”

Roboticists and computer scientists have applied advances in machine vision to an existing body of research from the field of psychology on emotional cues. At present, most emotionally intelligent machines are picking up on the same types of emotional cues that humans pick up on, such as body language and facial movements. But there is the tantalizing possibility that machine learning could be employed to enable machines to figure out even better strategies for interpreting emotion by measuring cues that we can’t pick up on, or at least aren’t aware of. Eventually this could result in devices that are more emotionally perceptive than humans.
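Moore doesn’t describe the specific algorithms involved, but the basic idea is easy to sketch: given numeric features extracted from a face (say, landmark distances that track eyebrow or lip movement), a standard supervised classifier can learn which combinations of cues predict each emotion, including combinations a human observer might never consciously register. The snippet below is a minimal, hypothetical illustration in Python using scikit-learn; the feature set, labels, and synthetic data are stand-in assumptions, not anything from Moore or CMU.

```python
# Minimal sketch (hypothetical, not from the article) of learning to map
# facial-landmark features to emotion labels with a supervised classifier.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Stand-in data: each row is a vector of facial-landmark measurements
# (e.g. eyebrow raise, lip-corner pull) extracted from one face image.
rng = np.random.default_rng(0)
X = rng.random((500, 20))          # 500 faces, 20 landmark features
y = rng.integers(0, 4, size=500)   # 0=neutral, 1=happy, 2=sad, 3=angry

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# The model learns which feature combinations predict each label,
# potentially weighting subtle cues humans don't consciously notice.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print(f"held-out accuracy: {clf.score(X_test, y_test):.2f}")
```

With random stand-in data the accuracy hovers at chance, of course; the point is only the pipeline shape: measurable cues in, learned emotional state out.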

We’ve already seen some impressive examples of emotional intelligence in machines. Aldebaran’s Pepper robot, which debuted as an in-store clerk at SoftBank stores in Japan before going on sale to consumers, is sophisticated enough to joke with people and gauge their reactions. Based on those reactions, Pepper uses machine learning algorithms to refine its social behavior.