Making computers feel

March 12, 2015 David Suydam

About 10 years ago, a colleague of ours set out to prove that the human electrocardiogram is unique for every person. It was an ambitious project that later made its way into the development of the Nymi wristband. To get started, she needed to run a series of experiments showing that the human heartbeat has a unique and stable waveform for each person. Even before running these experiments, she could anticipate a few factors to worry about - like what happens when a person exercises or develops a cardiac condition. What she didn't foresee was that when her team finally solved these issues, they would run into a new and interesting challenge: human emotion!

For one of the experiments (the first outside the lab), the research team asked volunteers to wear wireless sensors under their clothes and recorded their ECG signals as they went about their daily activities. When the signals came back, the team was amazed to notice slight variations in the waveforms every few minutes. The team knew the volunteers hadn't exercised; in fact, the volunteers swore they had just had a normal work day - answering emails, having meetings, and going to classes. The research team was puzzled: the technology worked so well in the lab, so why wasn't it performing the same way in the real world? And then it hit them! In the real world, the research team couldn't control how people feel.

It turns out that the autonomic nervous system has endings in the cardiac muscle, which means the shape of a heartbeat can be affected by emotions. Fascinating! For a moment, our colleague considered abandoning her doctoral thesis to focus on building an emotion detection system - a sentiment that didn't quite resonate with her academic advisor. She subsequently spent a year trying to erase emotion from the human heartbeat so that biometric authentication could work.
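One common way to suppress this kind of transient, emotion-driven variation in ECG biometrics (we're not claiming it was her exact method) is to normalize each segmented heartbeat and average many beats into an enrollment template, then match fresh beats against that template by correlation. Here's a minimal Python sketch of that idea, assuming beats have already been detected and segmented; the function names are illustrative, not part of any real pipeline:

```python
import numpy as np

def normalize_beat(beat):
    """Z-score one segmented heartbeat so amplitude drift
    (including emotion-driven changes) matters less.
    The epsilon guards against a flat segment."""
    return (beat - beat.mean()) / (beat.std() + 1e-12)

def build_template(beats):
    """Average many normalized beats into one enrollment template,
    smoothing out beat-to-beat variation."""
    return np.mean([normalize_beat(b) for b in beats], axis=0)

def match_score(template, beats):
    """Mean Pearson correlation between the enrolled template
    and a template built from freshly captured beats."""
    probe = build_template(beats)
    return float(np.corrcoef(template, probe)[0, 1])
```

Authentication then reduces to thresholding the correlation score; real systems use far more robust features and classifiers, but the normalize-average-match loop captures the basic intuition.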

Affective computing is the engineering field that studies systems that recognize, interpret, process, and simulate human emotion. Our colleague is convinced that emotional intelligence is the last barrier to meaningful human-computer interaction. Machines know how much we walk, how much we eat, and how we sleep - but they still don't know how we feel. At Architech, we're attacking this problem from both a behavioural and a physiological point of view. We're not only interested in facial expressions or body postures (which can lie); we're also looking at ways to assess the most intimate experiences of human emotion as they are reflected in our biosignals.
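To make the physiological angle concrete: one widely used biosignal feature is heart-rate variability. Lower short-term variability - often summarized as RMSSD, the root mean square of successive differences between RR intervals - tends to accompany stress or arousal. The toy sketch below computes RMSSD from a list of RR intervals; it's an illustration of the general technique, not our actual approach, and the sample values are made up:

```python
import numpy as np

def rmssd(rr_intervals_ms):
    """RMSSD: root mean square of successive differences between
    RR intervals (ms) - a standard short-term HRV measure."""
    diffs = np.diff(np.asarray(rr_intervals_ms, dtype=float))
    return float(np.sqrt(np.mean(diffs ** 2)))

# Hypothetical minutes of data: a calm one vs. a stressed one.
calm = [820, 845, 810, 860, 835, 850]      # more beat-to-beat variation
stressed = [700, 703, 698, 702, 699, 701]  # rigid, low-variability rhythm
print(rmssd(calm), rmssd(stressed))        # higher RMSSD suggests lower arousal
```

A single number like this is a crude proxy, which is exactly why combining many behavioural and physiological signals matters.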

We're building technologies that can motivate you, make you laugh, calm you down before a job interview, or tell that economics teacher that he or she is boring. Seriously. This is what we do!

About the Author

David Suydam

David has built an exemplary reputation over more than 15 years as a technology consultant and solution architect in some of the most complex business environments in Canada and around the world. His focus on creative solutions is rooted in a strong belief that traditional software development practices are flawed, and his team routinely demonstrates a better approach with open source, agile methods, and cloud computing.
