Communication is an essential part of our daily lives, whether for personal or professional purposes. It has never been easier to get a message across to someone in mere seconds, especially since the advent of smartphones in the early years of the new millennium.
While texting or chatting is an extremely quick method of communication, it doesn’t quite replicate a face-to-face conversation. Elements like facial expressions, hand gestures and changes in tone of voice are completely lost in text messages. These elements, called non-verbal cues, are estimated to account for up to 93% of face-to-face communication.
How are Emojis used to replace non-verbal cues?
When email was introduced in the 1970s, its benefits were quickly realised and it became a widely used service within a matter of years. With non-verbal cues being such an important element of normal conversation, a substitute was needed once email became second nature. Emoticons like “:-)” and “:-(” started appearing in friendlier email conversations, intended to mimic facial expressions.
The picture characters known as emoji were created for a Japanese pager service in the 1990s, and spread virally through Western cultures once a hidden emoji keyboard was discovered on Apple’s iPhone. Because an emoji is essentially a small picture, it can depict almost anything, and with new emoji consistently added to the Unicode Standard since their inception, there are now more than 2,700 unique emoji, and counting.
The Problem with Emoji
By using emojis at the right time and in the right context, an understanding is quickly reached – the equivalent of two people joking and laughing together, both sharing the same sentiment. However, while having such a large number of emojis to choose from provides flexibility and variety, it also raises the issue of interpretation – an often overlooked problem with the potential to cause misunderstandings.
For example: imagine meeting your closest friend Greg. You tell him how you bought a really nice slice of pizza you’d been craving for an entire week, and how, just as you were about to take your first bite, you realised the bottom of the pizza was completely burnt.
You’re devastated. You’re staring at the ground holding back tears. Greg tells you “That sucks, I’m really sorry to hear that” with genuine concern in his voice.
Feeling slightly better, you look up at Greg to say thanks. But Greg has the biggest grin you’ve ever seen, and he looks like he’s about to burst out laughing. Greg is a jerk.
Now imagine that, but over a chat. This would be the equivalent of:
You: Hey Greg, the WORST thing has just happened to me! I’ve been craving pizza all week. I finally got round to buying a slice, and as soon as I go to take a bite, I realise that it’s BURNT! I’m devastated! 😢😢😢
Greg: I’m really sorry to hear that, that sucks! 😂😂😂
You: Why is that funny? I said I’m genuinely devastated! 😠
Greg: Yeah I got that! This means “crying” 😂 … right?
You: Greg… no. 🤦
There are a number of emojis that are often misinterpreted. Here are some examples (note that “Intended Use” refers to the actual name of the emoji, and what the graphical representation is intended to represent):
| Intended Use |
| --- |
| Weary (tired) Cat |
| Sick or Unwell |
| Anger or Frustration |
Many of the available emoji still carry Japanese influences. The Sleepy Face emoji is a good example: the bubble coming from the face’s nose is a Japanese manga convention for sleep. In Japanese culture this emoji is immediately tied to drowsiness, but the association may not hold for Western audiences.
With emoji being so nuanced, such different interpretations are expected. So what can be done to avoid this?
How Machine Learning comes into play
Research has found that, given a certain set of user traits, you can reliably predict which emoji a person will use in a given conversational context. Such traits include age, gender, ethnicity, and internet fluency. For instance, a 55-year-old man is typically less likely to use emoji in chats than a teenager.
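As a rough illustration of the idea, here is a minimal sketch of a trait-based score for how likely a user is to use emoji at all. The traits mirror those mentioned above, but the function, its weights, and its thresholds are illustrative assumptions, not values from the cited research.

```python
# Hypothetical sketch: estimate how likely a user is to include emoji,
# based on two of the traits mentioned in the text. The weights below
# are made-up illustrative values, not results from any study.

def emoji_propensity(age: int, internet_fluency: float) -> float:
    """Return a 0..1 score for how likely a user is to include emoji.

    internet_fluency is assumed to be normalised to the 0..1 range.
    """
    # Assumption: younger, more internet-fluent users use emoji more often.
    age_factor = max(0.0, 1.0 - (age - 13) / 60)  # ~1.0 for teens, ~0 past 70
    score = 0.6 * age_factor + 0.4 * internet_fluency
    return max(0.0, min(1.0, score))

# A teenager scores higher than a 55-year-old, matching the example above.
print(emoji_propensity(16, 0.9) > emoji_propensity(55, 0.4))  # → True
```

A real system would learn such weights from data rather than hard-coding them; the point here is only that a handful of user traits can feed a single propensity score.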
Research also shows that emojis are used less in professional environments and more in friendlier contexts. When used in the right context and at the right time, they greatly improve the participants’ understanding of what is being said, while also positively impacting their relationship with one another. If emojis are used incorrectly, or if the chat participants have different interpretations of the same emoji, misunderstanding and confusion can arise, negatively impacting both the conversation and the relationships of those involved.
In a closed test environment with around 300 users of widely varying characteristics, machine learning models were built such that, given different conversational contexts, a chatbot would be able to use an appropriate emoji for any given user 77% of the time. In the remaining 23% of cases, the chatbot would use no emoji at all, to prevent any misunderstandings.
A prevalent issue comes to mind when speaking about chatbots – you can usually tell that you are speaking to one, because the conversation feels rigid and impersonal. By applying the concepts above, it is entirely possible to have a chatbot react and respond appropriately to each user’s characteristics. It could even change its behaviour depending on what the user is requesting:
- If it is a simple inquiry into how a product works, the chatbot could be friendlier and more open to conversation, attempting to build a relationship with the customer.
- If the customer is encountering a problem with a product, the chatbot could take a more serious stance, giving the impression that it is truly concerned about the customer’s issue while suggesting a list of articles that may help.
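The two behaviours above amount to a simple dispatch on the user’s intent. The sketch below shows the idea; the intent labels and the reply text are hypothetical, made up purely for illustration.

```python
# Hypothetical sketch of the intent-based behaviour switch described in the
# two bullet points: friendly and open for product enquiries, serious and
# concerned for problem reports. Labels and wording are illustrative.

def respond(intent: str, product: str) -> str:
    if intent == "enquiry":
        # Friendly, relationship-building tone; emoji are appropriate here.
        return f"Happy to help! 😊 Here's how {product} works..."
    if intent == "problem":
        # Serious, concerned tone; no emoji, plus pointers to help articles.
        return (f"I'm sorry you're having trouble with {product}. "
                "Here are some articles that may help...")
    # Fall back to a neutral clarifying question for unrecognised intents.
    return "Could you tell me a bit more about what you need?"

print(respond("enquiry", "Widget"))
print(respond("problem", "Widget"))
```

In practice the intent would come from a classifier rather than a string argument, but the tone-switching logic stays this simple: classify first, then choose register and emoji policy accordingly.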
If you would like to learn more about how Artificial Intelligence, Machine Learning and Chatbots can benefit your business, feel free to contact us.