Affectiva

Humanizing technology with emotion AI

Rana el Kaliouby envisions a future where all our digital devices have a chip that senses and reacts to our every emotion in real time. As CEO and cofounder of Affectiva, one of the leading developers of emotion AI technologies, she is well positioned to help make this a reality.

By: Daniel de Wolff

El Kaliouby has been recognized by Entrepreneur as one of the "7 Most Powerful Women to Watch in 2014," inducted into the Women in Engineering Hall of Fame, and received Technology Review's "Top 35 Innovators Under 35" award. The seed for the venture was planted while she was pursuing her PhD in computer science at Cambridge University. "I realized I was spending more time with my laptop than with other human beings," she says. Yet despite the intimacy she shared with this machine, it had no idea how she was feeling. She began to wonder, "What if computers could understand our emotions?"

Before long, el Kaliouby was doing postdoctoral work at the MIT Media Lab alongside Rosalind Picard, founder and director of the Affective Computing Group and eventual Affectiva cofounder. Picard's book Affective Computing, which gave the new field of research its name, proposed that computers of the future would need to understand human emotion. "If you look at human intelligence," says el Kaliouby, "people who have higher emotional intelligence tend to be more likeable, they're more persuasive and more effective in their lives. We at Affectiva think this is true of artificial intelligence as well." She continues, "As more and more of our interactions with technology become conversational, perceptual, relational, the social and emotional awareness of these interfaces will become critical."

As more and more of our interactions with technology become conversational, perceptual, relational, the social and emotional awareness of these interfaces will become critical.


Today Affectiva is backed by leading investors including Kleiner Perkins Caufield & Byers, Horizon Ventures, Fenox Venture Capital, and WPP. The MIT spinout, whose mission is to humanize technology, counts one third of the Fortune Global 100 and more than 1,400 brands as users of their technology. For three years at the MIT Media Lab, el Kaliouby and Picard worked to develop what she calls an "emotional hearing aid" for people with autism spectrum disorder. Called MindReader, it paired camera-equipped glasses with a device that analyzed facial expressions and gave the wearer real-time feedback. A pilot program at a Rhode Island school for children with autism was extremely successful: el Kaliouby recalls seeing the children react to the feedback, make eye contact, engage in meaningful human interactions, and grow more curious about the expression of emotion.

When el Kaliouby and Picard exhibited their work to Media Lab member companies, corporations like Procter & Gamble, Toyota, and Samsung recognized the genius of the technology but wondered whether it might be applied to use cases outside the realm of autism and mental health. The initial thought, according to el Kaliouby, was to hire more researchers. But Frank Moss, the Media Lab's director at the time, suggested this was no longer a research problem but rather a commercial opportunity. "I was intrigued by this idea of taking emotion recognition technology in new directions, applying it to different industries, and ultimately fulfilling my vision of an emotional digital world," says el Kaliouby.

In the realm of deep learning, effective algorithms are only part of the puzzle. The data powering these networks is essential. To date, Affectiva has collected 5.5 million face videos from 75 different countries, which amounts to approximately 2.5 billion facial frames. These frames are used to train Affectiva’s machine learning and deep learning algorithms to understand human emotions, and the sheer volume of data is part of what separates Affectiva from their competitors. Thus far, their emotion recognition technology has garnered significant attention in the media and advertising industries. Their product, Affdex for Market Research, is a cloud-based solution that allows advertisers to measure unfiltered and unbiased consumer emotional responses to digital content from anywhere in the world.
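To make that training step concrete, here is a minimal sketch of the kind of supervised pipeline the article describes: labeled face frames in, emotion predictions out. It is illustrative only; the tiny architecture, the emotion taxonomy, and the PyTorch framing are all assumptions, not details disclosed by Affectiva.

```python
# Minimal sketch (assumed, not Affectiva's pipeline): train a small CNN to
# map cropped face frames to discrete emotion labels.
import torch
import torch.nn as nn

EMOTIONS = ["joy", "anger", "surprise", "sadness", "fear", "disgust", "contempt"]

class EmotionCNN(nn.Module):
    def __init__(self, n_classes: int = len(EMOTIONS)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # 64x64 input is pooled twice -> 16x16 feature maps with 32 channels.
        self.classifier = nn.Linear(32 * 16 * 16, n_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

model = EmotionCNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Stand-in batch: 64x64 grayscale face crops with one emotion label per frame.
frames = torch.randn(8, 1, 64, 64)
labels = torch.randint(len(EMOTIONS), (8,))

optimizer.zero_grad()
loss = loss_fn(model(frames), labels)
loss.backward()
optimizer.step()
```

At Affectiva's scale, the same loop would run over billions of real, labeled frames rather than a random stand-in batch, which is exactly why the data volume matters as much as the algorithm.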

Thanks to Affectiva, traditional focus groups are quickly becoming a thing of the past. "Affdex captures the emotional journeys of thousands of viewers as they unfold," says el Kaliouby. The data is then aggregated, compiled, and presented to clients in a dashboard. Currently, fourteen market research partners, including leading firms like Millward Brown and Nielsen, use the technology to measure consumers' emotional responses to digital content. A powerful outcome of these partnerships is that the data collected allows Affectiva to fundamentally improve the technology and advance the state of the art with their proprietary machine learning algorithms.
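The aggregation behind such a dashboard can be pictured with a small sketch: many viewers' moment-by-moment scores collapse into a single mean trace over the ad's timeline. Everything here is hypothetical, including the per-second "joy" scores and viewer IDs; the article does not describe Affdex's actual data schema.

```python
# Hedged sketch of the dashboard aggregation idea: per-viewer, per-second
# emotion scores are averaged into one "emotional journey" for the ad.
from statistics import mean

# viewer_traces[viewer_id] = joy scores (0-100), one per second of the ad
viewer_traces = {
    "viewer_1": [5, 12, 40, 75, 60],
    "viewer_2": [0, 8, 35, 80, 70],
    "viewer_3": [2, 15, 50, 65, 55],
}

n_seconds = len(next(iter(viewer_traces.values())))
journey = [mean(trace[t] for trace in viewer_traces.values())
           for t in range(n_seconds)]

for second, score in enumerate(journey):
    print(f"t={second}s  mean joy: {score:.1f}")
```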

Affectiva’s core emotion engine analyzes any video stream and maps it to an emotional state. And for the benefit of application developers, they’ve packaged it as software development kits (SDKs) and cloud-based APIs. “Our on-device SDKs run in real time and don’t send any videos to the cloud, which is important for privacy reasons,” explains el Kaliouby. “It allows any developer to very quickly emotion-enable their very own digital experience.” With the idea of ubiquitous emotion technology in mind, they’ve shrunk the machine learning models to enable them to run on any device, including iOS, Linux, macOS, Windows, Unity, and even Raspberry Pi. A large part of why el Kaliouby and her team built the SDKs was to allow them to diversify and explore new verticals.
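The on-device loop el Kaliouby describes might look like the following sketch. This is not the Affectiva SDK's API: the predict_emotion stub is hypothetical, and only the OpenCV capture and face-detection calls are real. The point is the privacy property, namely that frames are analyzed and discarded locally, never uploaded.

```python
# Illustrative on-device loop (hypothetical; not the Affectiva SDK).
# Frames are captured, analyzed, and dropped locally -- nothing leaves
# the device, which is the privacy property described above.
import cv2

def predict_emotion(face_pixels) -> str:
    """Stand-in for a locally bundled emotion model."""
    return "neutral"  # placeholder inference

face_finder = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
camera = cv2.VideoCapture(0)

try:
    for _ in range(100):  # bounded demo loop
        ok, frame = camera.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in face_finder.detectMultiScale(gray, 1.3, 5):
            emotion = predict_emotion(gray[y:y + h, x:x + w])
            print(f"face at ({x},{y}): {emotion}")
        # frame goes out of scope here; nothing is sent off-device
finally:
    camera.release()
```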

As we transition into semi-autonomous and fully-autonomous vehicles, it is going to be imperative that cars understand the mental state of their drivers.


The automotive industry is a perfect example. “As we transition into semi-autonomous and fully-autonomous vehicles, it is going to be imperative that cars understand the mental state of their drivers,” explains el Kaliouby. “As cars redefine themselves as conversational, infotainment interfaces that want to understand the emotional engagement of the user to personalize the experience—the lighting in the car, the music—this has the potential to be a big market for Affectiva.” They have just finished a proof of concept with a large Japanese car manufacturer, which involved installing cameras and Affectiva’s Emotion AI in cars in Tokyo and Boston, and collecting driver data. El Kaliouby also mentions that Affectiva’s tech is used in a number of social robots.
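As a toy illustration of that personalization idea, consider a rule that adapts cabin settings, and gates handoff of driving control, based on a detected driver state. The states and presets below are invented for illustration; they are not drawn from the proof of concept.

```python
# Purely hypothetical mapping from a detected driver state to the cabin
# personalization the article mentions (lighting, music), plus a simple
# safety gate for semi-autonomous handoff.
def handoff_allowed(driver_state: str) -> bool:
    """Only return control to a driver whose state suggests readiness."""
    return driver_state not in {"drowsy", "distracted"}

def cabin_adjustment(driver_state: str) -> dict:
    presets = {
        "drowsy":   {"lighting": "bright", "music": "upbeat"},
        "stressed": {"lighting": "dim warm", "music": "calm"},
    }
    return presets.get(driver_state, {"lighting": "unchanged", "music": "unchanged"})

for state in ["alert", "drowsy", "stressed"]:
    print(state, "handoff ok:", handoff_allowed(state), cabin_adjustment(state))
```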

Throughout this diversification process, MIT ILP has played a substantial role in connecting Affectiva to new industry partners. El Kaliouby says, “One of the reasons that we are so excited to join the STEX25 program is that we are constantly looking to diversify into new markets. And this is where we can tap into the MIT ILP network.” She is also organizing the first-ever Emotion AI Summit, to be held at the MIT Media Lab on September 13, 2017. “Emotion AI is a core capability that is growing into a multibillion-dollar industry, and it is transformative to many different verticals. We at Affectiva are excited to bring together business and thought leaders who are interested in exploring artificial emotional intelligence for their own data platforms, devices, and technologies. And we’re very much looking forward to the opportunity to expose ILP members to this type of technology.” Consider it another step towards Rana el Kaliouby’s vision of ubiquitous emotion AI.

Rana el Kaliouby, CEO & Cofounder, Affectiva