Science & Technology

Pepper, The Emotion-Reading Robot

Note: This article is hosted here for archival purposes only. It does not necessarily represent the values of the Iron Warrior or Waterloo Engineering Society in the present day.

It’s a pretty sad day for me to admit that there is now a Japanese robot that can boast better emotion-reading skills than I can. Pepper, a household companion robot, is designed to recognize and respond to human emotions.

The emotional analysis process reportedly relies primarily on the user’s facial expression and tone of voice to register what emotion is being expressed, using two cameras, four microphones, and over a dozen touch- and laser-based sensors to collect its data.
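SoftBank has not published the details of this pipeline, but the general idea of fusing separate confidence scores from the camera and microphone channels into a single emotion estimate can be sketched roughly in Python. Note that the emotion labels, weights, and fusion rule below are all illustrative assumptions, not Pepper’s actual method:

```python
from dataclasses import dataclass

EMOTIONS = ["happy", "sad", "angry", "neutral"]

@dataclass
class SensorReadings:
    face_scores: dict   # per-emotion confidence from the cameras
    voice_scores: dict  # per-emotion confidence from the microphones

def estimate_emotion(readings: SensorReadings, face_weight: float = 0.6) -> str:
    """Weighted fusion of the two channels (the weights are made up here)."""
    fused = {
        e: face_weight * readings.face_scores.get(e, 0.0)
           + (1 - face_weight) * readings.voice_scores.get(e, 0.0)
        for e in EMOTIONS
    }
    return max(fused, key=fused.get)

# A smiling face outweighs a flat tone under this (invented) weighting.
readings = SensorReadings(
    face_scores={"happy": 0.8, "neutral": 0.2},
    voice_scores={"neutral": 0.7, "happy": 0.3},
)
print(estimate_emotion(readings))  # -> happy
```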

After using presumably complex algorithms to settle on a specific emotion, Pepper learns to respond appropriately by monitoring the human reaction to its chosen response. It then compiles the results and compares them with the data stores of other Peppers online. If these cute little robots are distributed as widely as SoftBank (the telecommunications company responsible for Pepper) hopes, the robots will have more than enough data to react properly to a given scenario in no time at all.
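The actual learning and data-sharing mechanisms are likewise undocumented, but the trial-and-error loop described above resembles a simple bandit-style scheme: try a response, observe the reaction, and fold the outcome into a shared record of what has worked. A minimal hypothetical sketch, where the response options, reward signal, and “cloud” dictionary are all assumptions:

```python
import random

RESPONSES = ["tell_joke", "play_music", "offer_sympathy"]

# Stand-in for the pooled data of many Peppers online:
# (emotion, response) -> (times_tried, times_it_helped)
cloud_stats: dict = {}

def choose_response(emotion: str, explore: float = 0.1) -> str:
    """Usually pick the response with the best observed success rate."""
    if random.random() < explore:
        return random.choice(RESPONSES)  # occasionally try something new

    def success_rate(response: str) -> float:
        tried, helped = cloud_stats.get((emotion, response), (0, 0))
        return helped / tried if tried else 0.5  # optimistic default

    return max(RESPONSES, key=success_rate)

def record_outcome(emotion: str, response: str, reacted_well: bool) -> None:
    """Fold the observed human reaction back into the shared store."""
    tried, helped = cloud_stats.get((emotion, response), (0, 0))
    cloud_stats[(emotion, response)] = (tried + 1, helped + int(reacted_well))

# One turn of the loop: respond to a sad user, then log how it went.
chosen = choose_response("sad")
record_outcome("sad", chosen, reacted_well=True)
```

The occasional random choice keeps such a system from locking onto the first response that ever worked, the same exploration-versus-exploitation trade-off any shared learning scheme of this kind would face.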

Of course, the obvious flaw that could break the idea is the assumption that the technology in Pepper that allows for emotional analysis is sufficient to make judgement calls. Human emotions are extremely complex, and are sometimes not expressed in the most easily monitored places, such as facial expression or tone of voice. It may take a close familiarity with an individual to understand that they are feeling particularly happy or sad, noticing a small change in habits or mannerisms. Some people are also intrinsically more or less expressive than others, and yet there is currently no indication that Pepper can detect happiness and sadness differently in different individuals. What if someone’s tone of voice sounds sullen even in normal speech? And what if Person A enjoys jokes when angry, while Person B only becomes angrier? It would be more logical for each individual Pepper device to have two separate sources of data: the massive compilation of resources collected from Peppers around the world, and a local cache of data for its specific user(s).
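To make that two-tier suggestion concrete, here is a hedged sketch of how a local per-user baseline could correct for someone whose normal speech simply sounds sullen. Everything here (the tone scores, the global average, the calibration rule) is invented for illustration:

```python
GLOBAL_NEUTRAL_TONE = 0.5  # invented population-average tone score for neutral speech

class UserProfile:
    """Local cache of one user's habits, kept alongside the global data."""
    def __init__(self):
        self.neutral_samples = []

    def observe_neutral_speech(self, tone_score: float) -> None:
        self.neutral_samples.append(tone_score)

    def baseline(self) -> float:
        # Fall back to the global figure until this person is known.
        if not self.neutral_samples:
            return GLOBAL_NEUTRAL_TONE
        return sum(self.neutral_samples) / len(self.neutral_samples)

def calibrated_tone(raw_tone: float, profile: UserProfile) -> float:
    """Shift the reading so a naturally sullen speaker isn't read as sad."""
    return raw_tone - (profile.baseline() - GLOBAL_NEUTRAL_TONE)

alice = UserProfile()
for score in (0.3, 0.25, 0.35):     # Alice always sounds a bit flat
    alice.observe_neutral_speech(score)
print(calibrated_tone(0.3, alice))  # ~0.5: flat-for-Alice reads as neutral
```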

Regional or cultural differences in displays and exchanges of emotion would also be a huge factor to consider: different countries and different languages have their own nuances in the expression of emotion. Some entail more hand gestures for emphasis, while others place greater importance on word choice. I personally shudder to think about what technology Pepper would need to be capable of understanding sarcasm; there are individuals I interact with daily and I STILL can’t tell when they are being sarcastic.

Complexities and complications aside, Pepper marks a clear step in the right direction for the artificial intelligence and robotics field. If robots are ever to become fully integrated into our society as workers, analyzing situations involving human interactions and emotions will be absolutely imperative.

Pepper is scheduled to be made available to the public early next year, priced just above $2,000 USD.
