
Meet Hume AI, The First AI That Understands Your Emotions

Hume AI is on a mission to create emotionally attuned AI that enhances human interactions, and with their latest $50M Series B funding, they're ready to bring that vision to life.

In 2021, Dr. Alan Cowen, a former Google researcher and a leader in the field of affective science, embarked on a bold mission: to humanize artificial intelligence by integrating emotional understanding into its very fabric. His vision for Hume AI grew out of his research in psychology and emotion science, which underscored the critical role emotions play in human decision-making and well-being. The company takes its name from the Scottish philosopher David Hume, reflecting the philosopher’s view that emotions drive human choice and happiness. Hume AI’s goal? To ensure that AI not only serves human goals but also caters to our emotional well-being.

The Journey Begins: A Mission Rooted in Emotion

Dr. Cowen’s work on semantic space theory, a computational approach to mapping human emotion, became the foundation of Hume AI. His research revealed nuances in voice, facial expression, and gesture that are key to how humans communicate across cultures. It also exposed a simple but consequential gap: emotions are essential for understanding human preferences, yet existing AI systems could neither fully comprehend nor respond to them. Closing that gap set the stage for Hume AI’s flagship product, the Empathic Voice Interface (EVI).

The Birth of EVI: An AI That Listens Beyond Words

At the heart of Hume AI’s technology is the Empathic Voice Interface (EVI), an emotionally intelligent conversational AI trained on data from millions of human interactions. Unlike traditional AI systems, which follow instructions based solely on keywords or voice commands, EVI can interpret emotional cues in a user’s voice, understand the meaning behind words, and respond empathetically. The goal? To create a more human-like, natural conversation between users and AI.

EVI’s ability to detect and understand human emotions comes from Hume’s proprietary empathic large language model (eLLM), which can generate vocal responses optimized for user satisfaction. This model doesn’t just listen; it learns. Over time, EVI improves its emotional understanding, fine-tuning its responses to meet users’ emotional needs. With features such as end-of-turn detection, tone and emphasis recognition, and even voice modulation, EVI is built to make AI interactions feel human.
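To make this concrete, here is a minimal, purely illustrative sketch of how measured vocal cues might condition a reply. Hume has not published the eLLM’s internals, so the names below (ProsodyScores, pick_style) and the score ranges are assumptions for illustration, not the company’s actual design:

    from dataclasses import dataclass

    @dataclass
    class ProsodyScores:
        """Hypothetical per-utterance scores inferred from the user's voice."""
        frustration: float  # assumed range: 0.0 to 1.0
        excitement: float
        calmness: float

    def pick_style(scores: ProsodyScores) -> str:
        """Map dominant vocal cues to a style tag for the speech generator."""
        if scores.frustration > 0.6:
            return "apologetic-and-measured"
        if scores.excitement > 0.6:
            return "upbeat"
        return "neutral"

    # A text LLM would supply the words; the style tag would steer the tone,
    # pacing, and emphasis of the synthesized audio.
    print(pick_style(ProsodyScores(frustration=0.7, excitement=0.1, calmness=0.2)))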

“Speech is four times faster than typing; it frees up the eyes and hands and carries more information in its tune, rhythm, and timbre,” Dr. Cowen explains. “That’s why we built the first AI with emotional intelligence to understand the voice beyond words. Based on your voice, it can better predict when to speak, what to say, and how to say it.”

Key Features and Applications of EVI

EVI introduces several revolutionary features that set it apart in the field of AI:

  • Emotional understanding: It deciphers not just the words spoken but the feelings and intentions behind them, enhancing its responses.
  • End-of-turn detection: EVI accurately determines when a user has finished speaking, preventing awkward conversational gaps.
  • Interruptibility: Like a human partner, it stops speaking when interrupted, ensuring a natural dialogue flow (see the turn-taking sketch after this list).
  • Expressive text-to-speech (TTS): EVI’s responses sound natural, mimicking human tone and rhythm.
  • Adaptive learning: EVI learns from each interaction, improving user satisfaction over time.
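As referenced in the list above, the following is a small runnable sketch of the turn-taking logic behind end-of-turn detection and interruptibility. The event names and the queue-based loop are hypothetical, assumed only for illustration and not drawn from Hume’s API:

    import queue

    events: "queue.Queue[str]" = queue.Queue()

    def handle_events() -> None:
        speaking = False
        while True:
            event = events.get()
            if event == "user_speech_started" and speaking:
                speaking = False  # interruptibility: stop talking mid-reply
                print("assistant: (stops and listens)")
            elif event == "user_turn_ended":  # end of turn detected from prosody
                speaking = True
                print("assistant: (begins replying)")
            elif event == "shutdown":
                return

    # Simulated conversation: the user finishes a turn, then interrupts the reply.
    for e in ["user_turn_ended", "user_speech_started", "shutdown"]:
        events.put(e)
    handle_events()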

This empathic AI system has the potential to revolutionize various industries, including:

  • Customer Support: Creating more empathetic and efficient service experiences.
  • AI Assistants: Developing AI capable of human-like interactions, especially in personal assistant technologies.
  • Mental Health: Assisting therapists and caregivers by providing AI companions attuned to users’ emotional needs.

EVI’s potential extends far beyond conversational AI. The system’s multimodal capabilities allow it to analyze facial expressions and body language, adding further depth to its emotional intelligence. This positions EVI to impact sectors such as healthcare, where emotional understanding can aid in diagnosing mental health issues; education, where adaptive learning systems can respond to student needs; and automotive technology, where in-car assistants can detect driver stress and enhance safety.

The Science Behind the Innovation

Dr. Cowen’s extensive research on mapping human emotional experiences has guided Hume AI’s technological developments. His work involved studies with over 1.5 million participants across the globe, creating a comprehensive atlas of emotional expression. The findings highlighted that while emotional expressions vary across cultures, common themes exist, providing a universal framework for understanding emotions.

This knowledge enabled Hume AI to design emotionally intelligent systems that cater to diverse users, ensuring cultural sensitivity while maintaining accuracy in emotional recognition. EVI’s ability to modulate voices across various tones and expressions—such as adjusting pitch, nasality, or femininity—further personalizes the experience.
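As a rough illustration of what such modulation controls could look like to a developer, the sketch below treats each vocal quality as a continuous scale. The parameter names and the [-1, 1] range are assumptions, not Hume’s documented settings:

    # Hypothetical voice-modulation settings; names and ranges are assumed.
    voice_settings = {
        "base_voice": "default",  # hypothetical base-voice identifier
        "pitch": 0.3,             # assumed scale: -1.0 (lower) to 1.0 (higher)
        "nasality": -0.2,
        "femininity": 0.5,
    }

    def clamp_settings(settings: dict) -> dict:
        """Keep every numeric scale inside the assumed [-1.0, 1.0] range."""
        return {
            key: max(-1.0, min(1.0, value)) if isinstance(value, float) else value
            for key, value in settings.items()
        }

    print(clamp_settings(voice_settings))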

A Commitment to Ethics and Empathy

Hume AI’s technological innovations are guided by an ethical framework that is rare in the fast-evolving AI industry. The company established The Hume Initiative, a non-profit that has released the first set of concrete ethical guidelines for the development of empathic AI. This commitment to ethical AI sets Hume apart, as the company works to ensure that its technology prioritizes human goals and well-being without crossing moral boundaries.

“We’re trying to create AI that’s more sensitive to human needs and goals, but we need to be really careful about how we do that,” Dr. Cowen emphasized. The company is aware of potential risks, including privacy concerns, manipulation of emotions, and the accuracy of emotional detection technologies. By working closely with ethics experts and maintaining transparency, Hume AI is taking the necessary steps to address these challenges.

Funding and Future Developments

Hume AI’s mission has garnered strong backing from investors. In March 2024, the company secured $50 million in a Series B funding round led by EQT Ventures, with participation from Union Square Ventures, Comcast Ventures, and LG Technology Ventures, among others. This funding has fueled Hume’s research partnerships with prestigious institutions like Mt. Sinai, Harvard Medical School, and Boston University Medical Center, where they are exploring healthcare applications.

Hume AI’s voice modulation technology, introduced with EVI 2, now allows users to adjust its base voices across various scales, adding flexibility to how users interact with AI. It boasts emotional intelligence, recognizing shifts in users’ tone and mood to provide empathetic responses. The interface communicates with human-like qualities, adapting to the tone of a user’s voice, while offering versatile expression by generating a range of emotions, accents, and speaking styles to suit individual personalities.

EVI 2 also features rapid response times, engaging users in fluid conversations with sub-second latency. With personalization at its core, it tailors its interactions to align with each user’s preferences. The system’s multilingual capabilities allow it to switch seamlessly between languages, making it accessible globally. Additionally, its multimodal functionality integrates voice recognition across various applications, from customer service to gaming. Currently available in beta through the Hume AI app, the EVI 2 API is also accessible to developers for integration into their applications.
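For developers curious what integration might involve, below is a minimal sketch of a streaming client built on the standard websockets library. The endpoint URL, header name, and message shape are all assumptions; the real contract should be taken from Hume’s developer documentation:

    import asyncio
    import json
    import os

    import websockets  # pip install websockets

    EVI_URL = "wss://api.hume.ai/evi/chat"  # hypothetical endpoint, not confirmed

    async def chat() -> None:
        headers = {"X-Hume-Api-Key": os.environ["HUME_API_KEY"]}  # assumed header
        # `extra_headers` is the keyword in older websockets releases;
        # releases 14+ rename it to `additional_headers`.
        async with websockets.connect(EVI_URL, extra_headers=headers) as ws:
            # Assumed message shape: send text, receive the reply as JSON.
            await ws.send(json.dumps({"type": "user_input", "text": "Hello!"}))
            reply = json.loads(await ws.recv())
            print(reply)

    if __name__ == "__main__":
        asyncio.run(chat())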

Hume AI’s flagship product, the Empathic Voice Interface (EVI), is what the company describes as the first AI with emotional intelligence, designed to process the nuances of human speech: its tone, rhythm, and emotion. From personal AI assistants to immersive gaming, EVI opens up new possibilities for voice-based interactions that feel more natural, supportive, and responsive.

The team at Hume AI is committed to optimizing AI for human well-being, and EVI marks a significant step toward AI that understands and responds to emotional cues. Whether by being superhumanly helpful or simply funny, EVI is set to transform the way we communicate with machines.

As Hume AI continues to push boundaries, they’re looking for talented engineers and AI researchers to join them in shaping the future of empathetic AI. If you’re ready to be part of the next big leap in emotionally intelligent AI, explore their open positions here: https://lnkd.in/dbEed6ph.

Anshika Mathews
Anshika is an Associate Research Analyst working for the AIM Leaders Council. She holds a keen interest in technology and related policy-making and its impact on society. She can be reached at anshika.mathews@aimresearch.co