
The Love Language Of AI


As algorithms grow more advanced, AI will become ever more lifelike. But the biggest gap between real and virtual is a hurdle almost impossible to imagine a machine clearing: can AI account for emotion and feelings? To date, human emotion has been one of the hardest things for AI to emulate. The robots shown at the Consumer Electronics Show still appear “off” compared to humans. We already build synthetic models that look convincingly human, respond intelligently, and draw on knowledge far exceeding any one person’s capacity, yet they still ring false.

For an AI to fully resemble a human, it must be able to express emotion convincingly. But human emotions are so nuanced that even we don’t always understand them! Can we program a machine to convey emotions adequately? And can a program take the next step and actually possess feelings?

Machines And Emotion

We all know that when an AI speaks, it is drawing on a computer program. The machine pulls from large data sets and follows code that tells it how to assemble that data into coherent sentences. It is, in essence, just following rules. More advanced programs that include anticipation, prediction, and decision-making are also just following rules, albeit a far more complicated set embedded in an algorithm.
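To make “just following the rules” concrete, here is a minimal, hypothetical sketch: a toy program that produces a coherent sentence purely by applying fixed rules to data. Every name and threshold below is invented for illustration; real systems are vastly more complex, but the principle is the same.

```python
# A toy "rule follower": pick a canned sentence template based on
# simple conditions over the input data, then fill it in.

def reply(weather_data: dict) -> str:
    """Apply fixed rules to data and return a coherent sentence."""
    temp = weather_data["temp_c"]
    if temp > 25:
        template = "It's a hot {temp} degrees out - stay hydrated!"
    elif temp < 5:
        template = "Bundle up, it's only {temp} degrees."
    else:
        template = "A mild {temp} degrees today."
    return template.format(temp=temp)

print(reply({"temp_c": 30}))  # → It's a hot 30 degrees out - stay hydrated!
```

Nothing here “understands” weather; the program only routes data through rules. Scaling that routing up with prediction and decision-making is still rule-following, just with a much larger rulebook.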

The groundwork exists for machines to replicate emotion, even though the ability to feel does not. With developments such as Neural Text-to-Speech (NTTS), AI could conceivably follow the rules well enough to simulate emotion. AI that uses NTTS can, at long last, navigate the nuances of human emotion and parse out the tonal differences between an enthusiastic and an excited-sounding voice. Suddenly, the wooden responses of a digital voice are gone, and the machine sounds much more natural.

And natural is the key. To bridge the gap that would create a realistic humanoid, there must be a way to make it seem less “off,” less digital. With NTTS, a machine could sift through its data, predict a response, and deliver a reply with the appropriate inflection, the emotional cues of language.

“I Love You, Too!!”

While the notion of an AI capable of emotion is far-fetched and unlikely to appear on the horizon any time soon, AI that responds to human emotions may be closer to fruition. Facial recognition and audio software already respond to human faces and voices. It’s not a great stretch of the imagination to consider a machine that can recognize feelings and deliver an appropriate, though programmed, response; it’s a simulation, not an emulation.

For example, today, if you tell the virtual assistant device Alexa, “I love you,” the device will reply with one of a variety of pleasantries, delivered in nothing other than its usual affable tone. Tomorrow, you might get a response with such affection that you suddenly find yourself feeling affectionate, too.
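That simulation-not-emulation distinction can be sketched in a few lines: recognize a feeling in the input with crude keyword rules, then hand back a pre-programmed reply. The keywords and replies below are hypothetical and not drawn from any real assistant.

```python
# Hypothetical feeling -> canned reply table.
RESPONSES = {
    "love": "I love you, too!",
    "sad": "I'm sorry to hear that. Want to talk about it?",
    "angry": "That sounds frustrating. How can I help?",
}

def respond(utterance: str) -> str:
    """Match a feeling keyword in the utterance; return a programmed reply."""
    lowered = utterance.lower()
    for keyword, reply in RESPONSES.items():
        if keyword in lowered:
            return reply
    return "I'm here if you need me."

print(respond("Alexa, I love you"))  # → I love you, too!
```

The machine feels nothing; it merely detected a cue and looked up a response. Yet if that response is delivered with warm inflection, the human on the other end may well feel something real.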

It all gets very interesting very quickly, then, because human nature is wired to respond to emotion. What will the next evolution of society look like once a machine can evoke emotions in humans?

Deborah Huyett

Deborah Huyett is a professional freelance writer with experience working for a variety of industries. She enjoys and works with all types of writing, and she has been published or ghostwritten for blogs, newsletters, web pages, and books. A former English teacher, Deborah’s passion for writing has always been grounded in the mechanics while appreciating the art of writing. She approaches projects as creative challenges, matching voice and tone for any audience.
