“How are you?” is the most asked but possibly also the most avoided question in the world. Whatever we really feel, our answer is almost always “Fine, and you?”, which implies that we expect an equally empty answer from the other person. We find it difficult to describe our feelings and to communicate about them; nothing new there, but thanks to a new trend in the digital world this might just change fundamentally.
Today’s world is flooded with digital technologies: computers, smartphones, but also our cooker, washing machine, car, you name it. In 1984 the first mainstream personal computer was launched onto the market: the Macintosh 128K. Today, a mere three decades later, life without such computers is unimaginable. We even carry one in our pocket at all times: the smartphone. In next to no time, digital technology has managed to transform our way of life and turn it upside down. More than ever, we depend on technology.
This makes our relationship with digital technology increasingly complex. These days most people won’t leave the house without their smartphone, tablet or laptop. This type of connectivity results in a unique device-and-user relationship: technology is no longer a machine that takes over one single task; it has become an extension of our personality that enables us to respond to the demands of today’s society. In a very sly way it has become so intertwined with our daily life and our physical selves that it is very difficult to distinguish between the two. You may have noticed that I (unconsciously) presented technology in that sentence as if it has its own willpower, its own soul. This is not the case, of course, but the idea of Artificial Intelligence, as it is called, does contain a certain level of truth.
Hot tweetaway: How #EmotionalTechnology intertwines with and improves the quality of our daily life insit.es/2dWRa58 via @CoolBrands #tech
We talk about Artificial Intelligence when devices have become so intelligent that they can make their own decisions about their actions. So does a self-activating lawnmower, one that automatically cuts the grass when it is dry enough, fit this definition? No, because a second, very important element is missing: emotions. Until today, devices have been emotionless objects, yet we keep getting closer. Will there ever be true Artificial Intelligence? No idea, but I do know that these technological developments are unstoppable and that we had better give them a place in our lives. I came across this quote in the literature, and I agree with it:
“The way I see it, our digital future can unfold itself in two ways: it can either take over or intertwine even more with our daily life and improve our quality of life in doing so. I personally think the second option is the one we’re heading towards right now.” – Erick, Trendwatcher
Emotional technology (or E.T.) is the next stage in our eternal search for artificial intelligence. E.T. stands for measuring biometric data in order to derive emotions and then using those data as input for various digital applications. You may notice that this bears a strong resemblance to the Internet of Things. This abstract definition becomes clearer with an example:
Sensors in certain wearables can detect a car driver’s level of excitement. When a driver becomes too agitated in traffic, there is a risk of road rage, speeding or even accidents. When the sensor we carry detects a rise in excitement, the car’s playlist adapts so that a quieter song helps the driver relax, without making this obvious.
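The scenario above can be sketched in a few lines of Python. The sensor score, the threshold and the playlist names are all invented for illustration; a real system would read the arousal value from the wearable’s own interface.

```python
# Hypothetical arousal score on a 0..1 scale; none of this is a real
# product API, just an illustration of the mechanism described above.
CALM_TRACKS = ["slow_jazz.mp3", "ambient_rain.mp3"]
UPBEAT_TRACKS = ["rock_anthem.mp3", "dance_hit.mp3"]
AROUSAL_THRESHOLD = 0.7  # assumed cut-off for "too excited"

def next_track(arousal: float) -> str:
    """Pick the next song based on the driver's measured arousal."""
    if arousal > AROUSAL_THRESHOLD:
        # Driver is getting agitated: quietly steer toward calmer music.
        return CALM_TRACKS[0]
    return UPBEAT_TRACKS[0]

print(next_track(0.9))  # an agitated driver gets a calming track
```

The point of the sketch is the conversion step: a raw biometric number silently becomes a concrete, observable action.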
In fact, digital technology will learn a certain level of empathy, so that digital applications can understand us better and act on it. As the definition and the example above make clear, E.T. rests on two major pillars: detection and conversion, both of which have developed to such an extent in the past few years that it has now become possible to combine them in Emotional Technology:
- Detection: extracting biometric data has become a real hype these past few years. With the help of a variety of handy wearables, we consumers can now monitor our own body, and in this case our emotions, more than ever before. This makes a gigantic amount of data available for new applications.
- Conversion: the arrival of the Internet of Things has made it easy to connect products that until recently were asocial. Think of the radio in the example: thanks to the hyper-connectivity of our digital world, the music is no longer influenced only by the device’s buttons; the input can also originate from various other sources (e.g. an excitement sensor). That is how raw data can be translated into concrete, observable actions.
Both pillars have matured to such an extent today that they can now be combined into intelligent Emotional Technology. In the following paragraphs I will explain more about each of them.
Scientists have been able to measure emotion-related bio-signals for quite a while now. Speech, perspiration, muscle tension, body temperature and heartbeat have proven to be very strong indicators of our state of mind. Until the digital revolution, measuring them was only possible in specially equipped laboratories. Today things have changed: we constantly carry highly advanced motion sensors in our pockets, always have a camera at our disposal and can operate our devices with sophisticated speech technology. UT researcher Egon L. van den Broek of the Centre for Telematics and Information Technology (CTIT) extensively researches affective computing:
“Twenty years ago measuring it was a problem, today we carry an advanced laboratory in our pockets, every moment of the day, so to speak. The image of a person covered in wires and sensors is long gone. Measuring all these emotions no longer is such a difficult task.”
Hot tweetaway: We carry a laboratory of sensors in our pockets, making it easier to track #emotions insit.es/2dWRa58 via @CoolBrands #emotionaltechnology
These days, there are many commercial applications for the main bio-signals; here are some examples:
Heartbeat: as our intuition suggests, our heartbeat is a strong indicator of how excited we are, i.e. of our degree of arousal.
- Most smartwatches and fitness trackers launched on the market today have an optical heart-rate monitor, e.g. the Apple Watch, Moto 360, Samsung smartwatch, Garmin Forerunner, Pebble… By combining a basic optical sensor with a few LEDs, the heartbeat can be registered rather accurately.
- Even a basic webcam image can now register your heartbeat and breathing, through minor color variations in the skin. Philips translated this technology into an iPad app: Vital Signs.
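The signal-processing idea behind such camera-based measurements can be sketched as follows: the average skin color in the video fluctuates slightly with each pulse, so the dominant frequency of that color signal within the plausible heart-rate band is the pulse rate. The color signal below is simulated; a real app would extract it from the camera frames.

```python
import numpy as np

fps = 30.0                        # assumed camera frame rate
t = np.arange(0, 10, 1 / fps)     # ten seconds of frames
pulse_hz = 1.2                    # simulated pulse: 72 beats per minute
rng = np.random.default_rng(0)
# Simulated average skin-color signal: a tiny pulse wave plus camera noise.
color = 0.01 * np.sin(2 * np.pi * pulse_hz * t) + rng.normal(0, 0.002, t.size)

# Find the strongest frequency in the plausible heart-rate band
# (0.7-4 Hz, i.e. 42-240 beats per minute).
freqs = np.fft.rfftfreq(t.size, 1 / fps)
power = np.abs(np.fft.rfft(color - color.mean()))
band = (freqs > 0.7) & (freqs < 4.0)
bpm = 60 * freqs[band][np.argmax(power[band])]
print(round(bpm))  # recovers roughly 72 beats per minute
```

Restricting the search to a physiologically plausible band is what keeps camera noise and lighting flicker from being mistaken for a pulse.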
Perspiration: the extent to which our skin conducts electricity depends on our perspiration. Sweat production is very sensitive to excitement and fluctuates throughout the day with our emotional state (without us being aware of it). By attaching two electrodes to the skin and measuring the conductivity between them, we can also determine arousal.
- Together with Affectiva, MIT developed the Q Sensor bracelet, which communicates your state of mind in real time through an RGB LED. XOX by Saatchi & Saatchi is based on the same concept.
- The Moodmetric ring was developed to reduce stress and is also based on the same principle.
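The two-electrode principle behind these devices can be sketched as follows: pass a tiny known current between the electrodes, measure the voltage, and derive the skin conductance from Ohm’s law; more perspiration means higher conductance. The current value and the arousal cut-offs are illustrative assumptions, not calibrated figures.

```python
def skin_conductance_us(current_a: float, voltage_v: float) -> float:
    """Conductance G = I / V, expressed in microsiemens (a common EDA unit)."""
    return (current_a / voltage_v) * 1e6

def arousal_label(conductance_us: float) -> str:
    # Cut-offs invented for illustration; real devices calibrate per user.
    if conductance_us > 10.0:
        return "high"
    if conductance_us > 2.0:
        return "moderate"
    return "low"

g = skin_conductance_us(current_a=5e-6, voltage_v=1.0)  # 5 microamps, 1 volt
print(g, arousal_label(g))  # about 5 microsiemens: "moderate" arousal
```

A real wearable adds filtering and per-user baselines, but the core measurement really is this simple, which is why it fits in a ring or bracelet.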
Speech: the intonation we use in daily life is how we link emotions to our verbal messages. Thanks to the continued development of speech technology, this intonation can now also be deciphered digitally.
- Moodies Emotions Analytics is a smartphone app developed by the specialized company Beyond Verbal that can detect corresponding emotions based on the intonation of your sentences (the app is very popular among people with autism spectrum disorders).
Body language: our facial expressions reveal a lot about how we feel, but so does our entire body language. Today several tools are already capable of translating these postures and expressions into categorized emotions:
- SimSensei is a digital tool that is currently still in its infancy but has the ambition of performing a basic psychological examination through an app and a virtual psychologist. Besides the verbal information, and thanks to Microsoft Kinect, it can also analyze body language to identify possible psychological problems.
- Affectiva (the company that also marketed the Q Sensor) has developed an online tool that analyzes people’s facial expressions while they watch videos. This way we can already measure the emotional impact of, for example, advertising spots.
Undoubtedly there are many more sensor applications already available as consumer products today, and the list will keep growing in the coming years. This overview represents what the landscape looks like today and what the options are.
However, merely detecting biometric data, and thus the user’s emotion, is not enough; the important part starts when these data are used as input for other (digital) applications. Apply this to the earlier example: knowing when a person gets agitated while driving will not in itself improve their driving behavior. Only when we use the measured data as input for the car’s playlist do we really create added value for the consumer. At that moment the technology is showing a certain level of (acquired) empathy, which again brings us one step closer to artificial intelligence. A good implementation of emotional technology translates into a digital world that adapts to its user, not the other way around! Such a symbiosis ensures that technology no longer estranges us from each other; on the contrary, it stimulates us to engage in emotional connections. (source: The Necessity of Emotional Technology)
Hot tweetaway: Blurred lines in #technology: human vs non-human is becoming even more vague with #emotionaltechnology insit.es/2dWRa58 via @CoolBrands
Various options illustrate the potential. What if…
- the bedroom lights automatically adapt to both partners’ state of mind, so that you always know whether your significant other is in the mood for romance or would rather talk about the kids? No more embarrassing conflicts.
- an application linked all emotional data in a city to the geographical data and then renders a heat map of places with a positive ambiance and places where a feeling of insecurity has the upper hand?
- my smartphone measured the emotional connection when dating and compared it with the data of others, so as to present the perfect emotional match for me?
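The heat-map idea in particular is easy to sketch: bucket geotagged emotion readings into a coarse grid and average them per cell. The coordinates, valence scores and cell size below are invented for illustration.

```python
from collections import defaultdict

def heat_map(readings, cell_deg=0.01):
    """readings: (lat, lon, valence) tuples; returns mean valence per grid cell."""
    cells = defaultdict(list)
    for lat, lon, valence in readings:
        # Snap each reading to a grid cell of roughly one square kilometer.
        key = (round(lat / cell_deg), round(lon / cell_deg))
        cells[key].append(valence)
    return {key: sum(v) / len(v) for key, v in cells.items()}

sample = [
    (51.050, 3.720, 0.8),   # positive ambiance in one neighborhood
    (51.051, 3.721, 0.6),
    (51.020, 3.700, -0.5),  # a feeling of insecurity elsewhere
]
print(heat_map(sample))  # two cells: one positive, one negative
```

The aggregation matters for privacy as well: a city-wide map only needs cell averages, never an individual’s raw emotional data.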
Impact on Marketing
The short brainstorm above shows that the combinations and possibilities of emotional technology are endless, but what will their impact be on (online) marketing? Evidently this brings many opportunities, so it is worth briefly listing some of them:
- Through facial recognition one can track the emotions of an ad’s viewer. For a marketer this generates a huge amount of information on the impact of a given TV spot, of printed media… This technique is already applied commercially by Affectiva.
- When a product is used, the emotions triggered by each action can be analyzed. This yields rational data indicating which actions frustrate the user and which make them very enthusiastic.
- The impact of ads also depends strongly on our mood: a wellness resort message will have more impact when one really feels stressed out than when one is already relaxing at home. Linking ads to these states of mind will result in much more relevant content.
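The mood-matching idea in the last bullet can be sketched as a simple nearest-match lookup. The ad inventory and the 0-to-1 stress scale are invented for illustration; a real system would feed in the state of mind measured by a wearable.

```python
# Each ad targets a stress level on an assumed 0 (relaxed) .. 1 (stressed) scale.
AD_INVENTORY = {
    "wellness_resort": 0.9,   # most relevant to a stressed-out viewer
    "energy_drink": 0.5,
    "board_game_night": 0.1,  # best for someone already relaxing at home
}

def pick_ad(viewer_stress: float) -> str:
    """Return the ad whose target stress level is closest to the viewer's."""
    return min(AD_INVENTORY, key=lambda ad: abs(AD_INVENTORY[ad] - viewer_stress))

print(pick_ad(0.85))  # a stressed viewer is shown the wellness resort
```

The same nearest-match logic extends to any emotional dimension the detection pillar can supply, not just stress.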
Emotional technology is the next stage in the continuous humanizing of technology. The way we interact with our devices will blend further with our actions, making the boundary between human and non-human even more vague. The data made available through all kinds of wearables and health trackers will be valued and used as input for many intelligent applications. Furthermore, the marketer gains a very powerful tool that can make content measurably more relevant. Tomorrow we will probably not be talking about true artificial intelligence just yet, but we do find ourselves at the start of a promising trend with unlimited options. So who knows, maybe we will soon get a relevant answer (through whichever channel) to our question “How are you?”.