“Searching for emotion. Please wait.”

July 8, 2015 David Suydam

Imagine a world where your home knows you had a bad day. Maybe you’re under stress at work, you’ve broken up with your significant other, or you’ve had to spend the morning reapplying for your driver’s licence.

What if there were a technology that could automatically detect how you feel the moment you entered your house? Immediately, your smart-home would know what to do: the lights would soften, your favourite music would float through your sound system, and the kettle would already be boiling water for your favourite cup of tea. A message on your home dashboard would indicate that your sister wants to talk to you. Your smart-home has let her know you’re not feeling well, because no one knows how to administer a little TLC better than she does.

You’d settle on your couch with a cup of hot tea and your sister’s familiar, comforting face would beam on your screen. Later that night, Netflix would suggest a good comedy tailored to your preferences. A bath would run automatically at the perfect temperature right before bedtime. In the absence of control over our circumstances, at least we could consistently rely on the comfort of our own home to help turn a bad day into a better one.

The technology to transform your smart-home into a sanctuary already exists. What’s missing is the ability for your smart-home to read your internal emotional state and intuitively respond. These are some of the fascinating, next-level projects we’re working towards at Architech.

Currently, we’re experimenting with technologies that can pick up micro-expressions, assess elevated heart rates, measure pupil dilation, and isolate states of arousal. Calibrated together, these separate physical signals could be combined by our systems into an accurate emotional reading.
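As a rough illustration of how that kind of fusion might work, here is a minimal Python sketch that combines a handful of hypothetical, pre-normalized readings into one composite score. The signal names, weights, and scales are assumptions made purely for illustration, not a description of our actual pipeline.

```python
# Hypothetical sketch: fuse several normalized physiological signals
# (each on a 0.0-1.0 scale) into a single composite intensity score.
# Signal names and weights are illustrative assumptions.

def fuse_signals(readings: dict, weights: dict) -> float:
    """Weighted average of normalized signal readings."""
    total_weight = sum(weights[name] for name in readings)
    return sum(readings[name] * weights[name] for name in readings) / total_weight

# Invented readings, each already normalized to a 0-1 range.
readings = {
    "heart_rate_elevation": 0.7,   # relative rise above resting heart rate
    "pupil_dilation": 0.6,         # relative change in pupil diameter
    "micro_expression": 0.4,       # intensity of a detected facial cue
}
weights = {"heart_rate_elevation": 0.4, "pupil_dilation": 0.3, "micro_expression": 0.3}

print(f"composite intensity: {fuse_signals(readings, weights):.2f}")
```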

It’s the “how” that has so far eluded the world’s technological capabilities, and for good reason. In order to detect human emotion, our emotions first need to be modeled in a way that a computing system can understand. The challenge, naturally, is that emotions are extremely complex phenomena. Think about it this way: you may be surprised and happy at the same time, or you may simultaneously feel relaxed and sad. Emotions are not concrete, isolated events. Positive and negative feelings can co-exist, and the line between pleasant and unpleasant is typically blurry.

Psychologists have standardized models of affect. A good jumping-off point is to describe an emotion in terms of “arousal” and “valence”. Arousal describes the intensity of an emotion and ranges from high to low. Valence, on the other hand, describes how positive or negative the emotion is.

Consider fear and disgust, for example. Both are negative-valence emotions, but fear sits at a higher arousal level than disgust, meaning the average person will react to fear more intensely. Their heart will beat faster; perhaps their body will naturally recoil. A computer would be able to pick up these signals in the subtlest detail.
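To make the model concrete, here is a toy sketch that places fear and disgust as points in a two-dimensional arousal-valence space. The numeric coordinates are invented for illustration; a real system would derive them from calibrated sensor data.

```python
# Toy arousal-valence model: each emotion is a point in a 2-D affect space.
# Coordinates below are illustrative assumptions, not measured values.
from dataclasses import dataclass
import math

@dataclass
class Emotion:
    name: str
    valence: float   # -1.0 (very negative) to +1.0 (very positive)
    arousal: float   #  0.0 (calm) to 1.0 (highly activated)

fear    = Emotion("fear",    valence=-0.6, arousal=0.9)   # negative, high arousal
disgust = Emotion("disgust", valence=-0.6, arousal=0.4)   # negative, lower arousal

def distance(a: Emotion, b: Emotion) -> float:
    """Euclidean distance between two emotions in arousal-valence space."""
    return math.hypot(a.valence - b.valence, a.arousal - b.arousal)

# Same valence, but fear sits much higher on the arousal axis.
print(f"arousal gap between fear and disgust: {fear.arousal - disgust.arousal:.1f}")
print(f"distance in affect space: {distance(fear, disgust):.2f}")
```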

At the same time, it’s important to note that emotions are ultra-personalized states. I guarantee that the way I experience fatigue is completely different from what fatigue feels like to other people. Therefore, computers have to be calibrated to recognize each individual. A system needs to figure out my own personal baseline before it can map happiness and sadness for me.
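Here is a hedged sketch of what that calibration step might look like: the system records readings from a calm period to establish a personal baseline, then scores new readings relative to it. The simple z-score approach and the numbers themselves are illustrative assumptions.

```python
# Hypothetical per-person calibration: learn an individual's resting
# baseline, then express new readings as deviations from it (a z-score).
import statistics

def calibrate(baseline_samples):
    """Return a scoring function anchored to a personal baseline."""
    mean = statistics.mean(baseline_samples)
    stdev = statistics.stdev(baseline_samples)
    def score(reading: float) -> float:
        return (reading - mean) / stdev
    return score

# Invented resting heart rates collected during a calm period.
my_resting_heart_rate = [62, 70, 58, 66, 74, 60]
score_heart_rate = calibrate(my_resting_heart_rate)

# The same 85 bpm reading means different things for different baselines.
print(f"deviation at 85 bpm: {score_heart_rate(85):.1f} standard deviations")
```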

Think about what this kind of research means. Not only are we building technologies that can disrupt our future ways of living but, in the process, we’re also discovering aspects of our bodies that we had no idea previously existed. Imagine what it means to experience that kind of human-centric innovation. Stay tuned as we explore these ideas together.

About the Author

David Suydam

With an exemplary reputation in his field, David has more than 15 years of experience as a technology consultant and solution architect in some of the most complex business environments in Canada and around the world. His focus on creative solutions is based on a strong belief that traditional software development practices are flawed, and his team routinely demonstrates a better approach with open source, agile methods and cloud computing.
