AI Is Given A Heart: The Rise Of Emotional AI
Welcome to this week’s Deep-fried Dive with Fry Guy! In these long-form articles, Fry Guy conducts in-depth analyses of cutting-edge artificial intelligence (AI) developments and developers. Today, Fry Guy dives into the latest developments in emotional AI. We hope you enjoy!
*Notice: We do not receive any monetary compensation from the people and projects we feature in the Sunday Deep-fried Dives with Fry Guy. We explore these projects and developers solely for the purpose of revealing to you interesting and cutting-edge AI projects, developers, and uses.*
“Beep. Boop. Bop. I am a robot.”
When we think about interactions with AI, we often picture the robotic voice and stale text of a model that fails to understand our human experience. But the days of beeping and booping are coming to an end. Like the Tin Man from The Wizard of Oz, AI is getting a heart.
In this article, we are going to explore recent monumental updates in emotional AI. We will discuss how these updates might shape the future of our society and inevitably change the way we interact with and view AI. In the end, we will look at why these breakthroughs in emotional AI aren’t being talked about as much as they should be. Spoiler alert: it’s mostly because of fear!
AI GETS A HEART
Emotional AI has come a long way in the past year, and lately it has been coupled with generative AI to form one of the most dynamic duos of all time.
Earlier this year, Hume AI released its Empathic Voice Interface (EVI). This voice-to-voice model detects and interprets the emotions behind users’ words based on vocal tone and inflection, allowing for dynamic and realistic conversations. Hume bills it as the first API to measure nuanced vocal modulations and use them to guide language and speech generation.
Hume AI’s EVI model has been trained on millions of human interactions and detects emotions based on different frequencies in a user’s voice. From this vocal data, the model scores the probability that the voice is expressing certain emotions and can predict with confidence whether the user is expressing doubt, excitement, sadness, confusion, and more. Impressively, the model changes its response accordingly. For instance, when the model asks how I’m doing today, if I say, “I’m fine,” in a cheerful tone, it might infer that I am doing well and respond by saying how happy it is that I am having a good day. However, if I say, “I’m fine,” in a flat or sad tone, it might ask how it could help make my day great instead of just fine. The EVI model also improves over time as it reflects on users’ reactions; in this way, it actually learns how each user expresses their emotions. Users can access the API and try the demo right now to test this out.
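To make this concrete, here is a minimal sketch of how an application might consume per-emotion probability scores from a voice-analysis API like EVI and pick a response tone, as in the “I’m fine” example above. The endpoint URL, request payload, and response shape are illustrative assumptions, not Hume’s actual API; consult Hume’s documentation for the real SDK.

```python
import requests  # third-party HTTP library (pip install requests)

# Hypothetical endpoint and credentials -- illustrative only,
# NOT Hume AI's actual EVI API.
API_URL = "https://api.example.com/v1/voice/emotions"
API_KEY = "YOUR_API_KEY"

def score_emotions(audio_path: str) -> dict:
    """Upload an audio clip; return an assumed {emotion: probability} map."""
    with open(audio_path, "rb") as f:
        response = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            files={"audio": f},
            timeout=30,
        )
    response.raise_for_status()
    # Assumed response shape:
    # {"emotions": {"excitement": 0.71, "doubt": 0.12, ...}}
    return response.json()["emotions"]

def pick_tone(scores: dict) -> str:
    """Choose a reply style from the highest-probability emotion."""
    top_emotion = max(scores, key=scores.get)
    if top_emotion in ("sadness", "doubt", "confusion"):
        return "supportive"  # e.g., ask how to make the day great
    return "upbeat"          # e.g., celebrate the good day

scores = score_emotions("im_fine.wav")
print(pick_tone(scores))  # a flat "I'm fine" should score high on sadness
```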
Following Hume AI’s EVI, OpenAI recently released GPT-4o, which exhibits a form of emotional AI via computer vision, first shown in OpenAI’s launch demo for the model. The user can take a picture of themselves or open a live video stream, and GPT-4o can detect what emotions the user is portraying. For instance, if the user is smiling, the model is able to recognize that the user is happy. If the user is crying, the model can deduce that the user is upset or frightened and respond accordingly. Additionally, GPT-4o’s voice system now allows ChatGPT to adopt more expressive tones itself, portraying a mood that matches the user’s, whether professional, sarcastic, enthusiastic, or dramatic.
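For a rough illustration of the vision side, the sketch below sends a local photo to GPT-4o through OpenAI’s chat completions API and asks what emotion the person appears to be expressing. The file name and prompt are our own; the call pattern follows OpenAI’s documented image-input format, though exact details may change between SDK versions.

```python
import base64
from openai import OpenAI  # official SDK (pip install openai)

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# Encode a local photo as a base64 data URL (the file name is illustrative).
with open("selfie.jpg", "rb") as f:
    image_b64 = base64.b64encode(f.read()).decode("utf-8")

# Ask GPT-4o to read the emotion in the image.
response = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {
            "role": "user",
            "content": [
                {"type": "text",
                 "text": "What emotion does the person in this photo "
                         "appear to be expressing? Answer in one word."},
                {"type": "image_url",
                 "image_url": {"url": f"data:image/jpeg;base64,{image_b64}"}},
            ],
        }
    ],
)
print(response.choices[0].message.content)  # e.g., "Happy"
```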
Emerging emotional AI features like Hume AI’s EVI and GPT-4o’s emotional visual recognition represent the future of AI. No longer will AI be the boring, “robotic” and emotionless entity it once was. With the rise of emotional AI, humans and AI will be able to partake in much more dynamic communication.
As AI is placed into more and more roles in society and asked to perform various business functions, the emotional dimension will become increasingly valuable. Imagine an HR agent who is able to sympathize with human emotions; this alone could transform call centers, allowing for more human-like exchanges. Furthermore, imagine an AI therapist who not only understands your problems but can also detect and respond to the underlying emotions you feel. Emotional awareness will allow models to engage effectively in much more sensitive settings, personalizing business and changing the way we think about and interact with AI models.
WHY DO SOME HAVE A BAD FEELING ABOUT THIS?
If emotional AI is so amazing, why aren’t more people talking about it? Why is it not being incorporated into every new tool and feature? It might be because it is a little scary. Despite the promising upside of AI’s emerging emotional intelligence, many remain concerned that this will have unforeseen negative effects.
When ChatGPT was released in November of 2022, it quickly grew in popularity and now has over 180 million users. Since its inception, ChatGPT and similar models have been used primarily as text chatbots. Though these have scared some, many people feel that their data is secure when they talk to these bots. They have the freedom to converse about whatever they like, they can remain anonymous, and nothing is too personal. In this way, users can enjoy these chatbots from a distance, bouncing in and out of conversations without much worry about privacy.
Now, emotional AI and computer vision are emerging, and this is changing everything. AI is no longer just reading text; it is analyzing user emotions and trends. This gives the AI insights it never had before and makes many feel uneasy, as if they are being analyzed on a personal level. Furthermore, as computer vision and vocal recognition advance (both of which emotional AI requires), human-AI interaction is becoming more and more visual and vocal rather than text-based. AI is not just reading anonymous text anymore. Now, it is analyzing a user’s voice, face, and surroundings, storing that data and learning about the user in a much more personalized way. Because this technology is so new, the transformation makes many uneasy: they are not sure when they are being monitored, what is being monitored, and for what purposes the data is being used.
In addition to privacy concerns, some experts worry that emotional AI is not possible through visual and vocal cues alone. People express their emotions differently, so a smile and a high-pitched voice might mean one person is happy while another is suppressing anger. Without deep relational knowledge of a person, making assumptions based on these surface-level perceptual clues could be dangerous. Because of this, we might see AI make some awful decisions in sensitive situations, some of which might even lead to lawsuits over insensitivity. It seems entirely plausible that these sorts of things will happen as emotional AI is used to make decisions and guide human-AI interactions.
One last concern with the deployment of emotional AI is how it will impact human-AI relationships. As AI starts to understand emotions and portray emotions of its own, will some humans start to feel differently about AI? Until now, it has been relatively easy to separate human value from AI value, but as AI begins to understand us emotionally and engage in more human-like interactions, will this soften people to the point that they view AI-powered agents as on par with actual humans? This might ring especially true for those who engage in relationships with an AI. With the rise of emotional AI, these relationships (which already seem so real to many) will be amplified in their emotional intensity. As the digital world booms, it will be interesting to see whether some humans begin to ascribe more and more value to these AI entities, even valuing them over other humans.
Whether concerns like over-monitoring, privacy invasion, and value confusion deter people from engaging with AI or lead them to protest its development remains to be seen. However, the push will continue one way or another, and if you don’t get on board, you might fall behind.
AN EMOTIONAL FUTURE
There are few things as cutting edge as emotional AI. For many, that is super exciting. For others, however, it is a warning sign that maybe this AI thing is moving a little too fast. Emotional AI might be the breakthrough that begins to drive a wedge between those who jump on the AI train and those who think it is time to slow down.
Regardless of what side you are on with this emerging technology, it seems certain that the way we interact with AI is moving past mere chatbots and text prompts as we enter into emotional, human-like visual and voice conversations. In fact, it’s safe to say boring chatbots are a thing of the past … but try not to get too emotional about it.