Emotional AI coming soon?

October 06, 2016

Machines capable of interpreting human emotions may soon be able to provide users with a closely tailored service. The potential applications look especially promising for the car of the future.

Nowadays Artificial Intelligence (AI) seems to be on everyone’s lips. This was emphasised once again at the most recent TechCrunch Disrupt event, held in San Francisco in mid-September, where the topic was raised in almost every discussion. Machine learning is a particularly fertile area of Artificial Intelligence. Danny Lange, Head of Machine Learning at Uber, believes that the best way of describing the concept – which everyone is talking about without actually knowing exactly what it involves – is to regard it as a paradigm shift. “We’re moving from a Newtonian, deterministic way of writing software, where the all-knowing programmer writes a complete model of your world, and we’re seeing this major shift to more of a Heisenberg world, where it’s about uncertainty and probabilities. Basically we’re now using experience, using data, to have learning algorithms build and use these models and get results that are really predictions with probabilities, rather than having finite deterministic programmes. And as the world changes the data changes and we rebuild the models. This allows us to continuously have a software system that is more in line with the real world,” he explained to the TechCrunch Disrupt audience.
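To make the contrast concrete, here is a minimal sketch in Python – an illustration of Lange’s general point, not Uber’s actual code – contrasting a hand-written rule, where the programmer encodes the whole model once and for all, with a learned model that is rebuilt from data and answers with probabilities:

```python
# Illustrative only: a toy "is my ride late?" predictor, not Uber's code.
import numpy as np
from sklearn.linear_model import LogisticRegression

# "Newtonian" style: the all-knowing programmer writes the complete model.
def rule_based_is_late(traffic_level: float) -> bool:
    return traffic_level > 0.7  # a fixed, hand-picked threshold

# "Heisenberg" style: the model is built from experience (data) instead.
rng = np.random.default_rng(0)
traffic = rng.uniform(0, 1, size=(500, 1))              # observed conditions
late = traffic[:, 0] + rng.normal(0, 0.2, 500) > 0.7    # noisy real outcomes

model = LogisticRegression().fit(traffic, late)

# The answer is no longer a hard yes/no but a probability:
print(model.predict_proba([[0.65]])[0, 1])              # e.g. ~0.4 chance of "late"

# As the world changes, the data changes and we rebuild the model:
traffic2 = rng.uniform(0, 1, size=(500, 1))
late2 = traffic2[:, 0] + rng.normal(0, 0.2, 500) > 0.5  # the world has shifted
model = LogisticRegression().fit(traffic2, late2)       # periodic retraining
```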

German physicist Werner Heisenberg, discoverer of the Uncertainty Principle

Given that Artificial Intelligence is about processes rather than fixed states, we can reasonably ask what the next step is going to be. Alongside AI, another fundamental trend in new technologies is the diversification of interfaces and an increasingly natural way of interacting with machines. Whereas in the past we were quite happy to type on a keyboard, we can now interact via touch and voice. Our interaction with machines is increasingly coming to resemble the interchanges we have with our fellow human beings. So will these two trends combine and result in a new era of man-machine communication? This is the view of Egyptian-born entrepreneur Rana el Kaliouby, co-founder and CEO of MIT Media Lab spin-off Affectiva, which is working in the field of artificial emotional intelligence. The startup is building a platform based on computer vision and deep learning, backed by what it describes as the world’s largest emotion data repository, so as to enable machines to perceive users’ emotions by recognising their facial expressions.
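In outline, such a platform chains computer vision with a deep-learning classifier: find the face, then score it against a set of emotion labels. The Python sketch below shows that general shape only; the face detector is a stock OpenCV one, while classify_emotion is a hypothetical stand-in for a trained network – Affectiva’s actual models, labels and data are proprietary.

```python
# A rough sketch of a face-to-emotion pipeline; not Affectiva's implementation.
import cv2
import numpy as np

EMOTIONS = ["joy", "sadness", "anger", "surprise", "fear", "disgust", "neutral"]

# Stock OpenCV face detector, standing in for production-grade computer vision.
face_detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
)

def classify_emotion(face_pixels: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for a trained deep network: returns one
    probability per emotion label (here just a random softmax)."""
    logits = np.random.default_rng(0).normal(size=len(EMOTIONS))
    return np.exp(logits) / np.exp(logits).sum()

def read_emotions(frame_bgr: np.ndarray) -> list:
    """Detect each face in a video frame and score its emotional state."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    results = []
    for (x, y, w, h) in face_detector.detectMultiScale(gray, 1.3, 5):
        face = cv2.resize(gray[y:y + h, x:x + w], (48, 48))  # model input size
        results.append(dict(zip(EMOTIONS, classify_emotion(face).round(2))))
    return results
```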

Danny Lange and Rana el Kaliouby on stage at TechCrunch Disrupt SF

Differentiating between a Japanese smile and a Brazilian smile

“A lot of AI is very cognitive, it’s very transactional – it’s about doing one thing. Our vision is to bring artificial emotional intelligence to our devices and digital experiences,” explained Rana el Kaliouby, pointing out: “Emotions weave into every aspect of our lives. Our emotional state drives our well-being, our health, how we connect with each other, and how we make decisions (…). People who have higher emotional intelligence are more likeable, more persuasive; they have a better trust relationship with other people. And that translates to technology too. So as we morph into a world where we’re a lot more digital and surrounded by a lot more devices – whether it’s an Uber app, a chatbot, an AI-based scheduling system or your social robot – they need to build a rapport with the consumer (…). I envision a world where all our devices have a little emotion chip and it reads your emotions in real time,” she told the TechCrunch Disrupt audience.

Affectiva has analysed over five million videos in order to build up a solid database of dynamic facial expressions, voice and gestures. This in turn feeds a machine-learning algorithm capable of deducing an emotional state from these physical cues. And since not everyone expresses emotions in exactly the same way, the software has been designed to adapt to the specific characteristics of each individual. “It has enabled us to understand the difference between a Japanese smile and a Brazilian smile, or how women express emotion versus men. We’ve selected the data to encompass all these cases,” explained Rana el Kaliouby.
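One simple way to picture that adaptation is to normalise a raw expression score against a baseline observed for each group – or, ultimately, each individual. The same raw smile then reads differently in different contexts. The sketch below is purely illustrative: the cohorts and baseline values are invented and do not reflect Affectiva’s method or data.

```python
# Toy calibration: invented baselines, purely to illustrate the idea.
SMILE_BASELINE = {           # mean smile intensity observed per cohort (made up)
    ("JP", "female"): 0.35,
    ("JP", "male"):   0.30,
    ("BR", "female"): 0.55,
    ("BR", "male"):   0.50,
}

def calibrated_smile(raw_score: float, country: str, gender: str) -> float:
    """Read a raw 0-1 smile score relative to the cohort's usual expressiveness."""
    baseline = SMILE_BASELINE.get((country, gender), 0.45)
    return max(0.0, min(1.0, 0.5 + (raw_score - baseline)))

# The same raw score of 0.45 reads as an above-average smile in Japan...
print(calibrated_smile(0.45, "JP", "female"))  # ~0.60
# ...but as slightly below the norm in Brazil.
print(calibrated_smile(0.45, "BR", "female"))  # ~0.40
```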

What’s on the menu, then?

So what, in practical terms, could AI contribute to the user experience, for example when using Uber? At the moment, the ride-hailing platform provides its customers with on-demand services: depending on the request, it can take customers to their destination or deliver a meal. The addition of a layer of emotional intelligence could make for a more interactive relationship. It would, for example, be possible to spot that a given customer is stressed and simply wants to be driven to his/her destination as fast as possible, while another might just want to relax and enjoy a calm journey, and a third might be a slightly confused tourist looking for information on his/her current whereabouts. Using emotional AI, the UberEats service would potentially be able to analyse the customer’s emotional state and suggest a selection of dishes to suit his/her mood, from a quick sandwich for someone in a hurry to a convivial dinner for a person who fancies celebrating.
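Reduced to code, that scenario might look like the sketch below: take the most probable detected mood and map it to a style of suggestion. All the mood labels, probabilities and menu entries here are hypothetical, purely to make the idea concrete.

```python
# Hypothetical mood-to-menu mapping; not an actual UberEats feature.
MENU_BY_MOOD = {
    "stressed":    ["quick sandwich", "grab-and-go salad"],
    "relaxed":     ["slow-cooked stew", "three-course set menu"],
    "celebratory": ["convivial sharing platter", "dessert for two"],
}

def suggest_dishes(emotion_probs: dict) -> list:
    """Pick the menu section matching the most probable detected mood."""
    mood = max(emotion_probs, key=emotion_probs.get)
    return MENU_BY_MOOD.get(mood, ["chef's suggestion"])

# A customer read as mostly in a hurry gets the quick options:
print(suggest_dishes({"stressed": 0.7, "relaxed": 0.2, "celebratory": 0.1}))
# -> ['quick sandwich', 'grab-and-go salad']
```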

Close encounters with your car

Emotional AI technology also has enormous potential for self-driving cars. We are just entering the first phase of their deployment. Vehicles capable of driving by themselves are being rolled out step by step on our roads, although it is still mandatory to have a human driver behind the wheel to take over the controls in the event that something untoward happens. However, it is very hard to keep concentrating for the entire journey when you are not actually driving the car yourself.
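At bottom, this is a monitoring-loop problem: estimate the driver’s attention frame by frame and intervene only when it stays low for a while, as the sketch below illustrates. Nothing here reflects Uber’s actual system – which the next paragraph returns to – and the threshold, window size and message are all invented.

```python
# Illustrative attention-reminder loop; thresholds and message are invented.
from collections import deque

ATTENTION_THRESHOLD = 0.4   # below this, the driver counts as distracted
PATIENCE_FRAMES = 30        # about one second of video at 30 fps

recent_scores = deque(maxlen=PATIENCE_FRAMES)

def on_new_frame(attention_score: float):
    """Call once per camera frame with an attention estimate in [0, 1];
    returns a gentle reminder only after sustained inattention."""
    recent_scores.append(attention_score)
    sustained_lapse = (
        len(recent_scores) == PATIENCE_FRAMES
        and all(score < ATTENTION_THRESHOLD for score in recent_scores)
    )
    if sustained_lapse:
        recent_scores.clear()        # don't repeat the nudge every frame
        return "Please keep your eyes on the road."
    return None
```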

So we might envisage a system set up to identify the emotional state of the driver and send him/her a gentle reminder to refocus on the road if his/her attention is seen to be wandering. A system already installed in the first Uber self-driving taxis, on the roads of Pittsburgh since September, helps to spot when a passenger is ill at ease and find a way to help him/her relax, whether by playing calming music or by explaining how the system works. There are in fact many ways of enabling seamless communication between people and machines so as to help create a first-rate experience. Nevertheless, it is still highly unlikely that we will ever see vehicles falling in love, like Herbie the Love Bug.
