Are algorithms already reading our emotions?
Remember Spike Jonze’s movie “Her”, where the main character develops an intense relationship with a computer? It was all the work of an algorithm able to read human emotions and adapt to the user’s personality. Of course, the story is set somewhere in the future, but what are the odds of something similar happening soon?

Nowadays, many of the sites and apps we use daily, like Google, Facebook, and Instagram, rely on advanced algorithms that display information tailored to each individual. True, we’re still far from breaking the barrier in human-machine communication, but we’re getting closer every day.

Let’s take Google, for example. It’s no secret that it runs one of the most complex algorithms ever created, and each user sees different results for the same search, based on their history. Facebook, too, has started ordering your news feed around the friends and pages you interact with the most, and Instagram did something similar this week. But how close are these giants to actually reading emotions?
A team of scientists led by David Bamman and Noah A. Smith set out to answer a question about the relation between words and the emotions they evoke. They argue that it’s not only the statement itself that should be examined, but also the context in which it’s said.
The scientists started on Twitter, examining tweets containing the hashtags “#sarcasm” and “#sarcastic”, and it turned out that the authors’ profiles and target audiences were the most important factors shaping the meaning.
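The hashtag step described above can be sketched in a few lines. This is a minimal illustration of weak labelling, not the study’s actual pipeline: tweets tagged “#sarcasm” or “#sarcastic” become positive examples, and the label hashtags are stripped so a classifier can’t simply memorize them. The sample tweets are invented placeholders.

```python
def label_by_hashtag(tweets):
    """Return (cleaned_text, is_sarcastic) pairs, using the hashtags as weak labels."""
    label_tags = ("#sarcasm", "#sarcastic")
    labelled = []
    for text in tweets:
        lowered = text.lower()
        is_sarcastic = any(tag in lowered for tag in label_tags)
        # Strip the label hashtags so they can't leak into the features.
        cleaned = " ".join(
            word for word in text.split()
            if word.lower() not in label_tags
        )
        labelled.append((cleaned, is_sarcastic))
    return labelled


sample = [
    "Great, another Monday #sarcasm",
    "Loved the concert last night!",
]
print(label_by_hashtag(sample))
# → [('Great, another Monday', True), ('Loved the concert last night!', False)]
```

The interesting part of the research, of course, is what comes after this step: modelling the author and audience context rather than the words alone.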
Currently, the results of the research aren’t used at a large scale, but the Department of Defense is using them to help spot potentially dangerous references in offensive jokes.
Another tool that can be considered proof of the progress made in reading human emotions is Affectiva. It can track emotions in real time, using the cameras built into devices like computers, smartphones, or wearables.
The idea behind it is simple: it reads facial expressions and translates them into emotions, identifying what engages users the most. Of course, there are many areas where this can be applied. For example, it could help a website adjust its content in real time based on users’ reactions. Right now, movie studios are using Affectiva to pick the best moments to include in trailers.
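The trailer use case boils down to a simple ranking problem. Here’s a hypothetical sketch, assuming per-second engagement scores have already come out of a face-tracking tool (the numbers and function names are invented for illustration, not Affectiva’s actual API):

```python
def top_moments(engagement_by_second, n=2):
    """Return the n timestamps (in seconds) with the highest audience engagement."""
    ranked = sorted(engagement_by_second.items(), key=lambda kv: kv[1], reverse=True)
    return [second for second, _ in ranked[:n]]


# Invented example scores: second of footage -> average engagement (0 to 1).
scores = {10: 0.2, 42: 0.9, 75: 0.7, 90: 0.4}
print(top_moments(scores))  # → [42, 75]
```

The hard part, naturally, is producing those scores from raw video; the selection step itself is trivial once the emotion signal exists.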
There are many other companies out there that have developed ways to identify emotions by interpreting huge amounts of data through machine learning systems. Still, a lot of people remain skeptical about the idea, fearing that algorithms able to read emotions won’t lead to any good.
Despite this, let’s not forget that the main purpose behind the idea is to make our lives easier. After all, social networks had plenty of critics too, claiming that people were deliberately giving away personal information, and look where we are today, feeling the need to share even our current location…