What might your devices say about your emotional state?

Post by Niamh Walsh, Learnovate

A surprising consequence of the tsunami of disruptive advances in technology we’re experiencing today is renewed attention to distinctly human skills – those that applications and devices cannot easily replicate. As advances in machine learning, artificial intelligence, virtual reality, robotics and the Internet of Things proceed apace, employers increasingly recognise that they need people working in teams with strong interpersonal and intrapersonal skills. Teamwork, communication, perseverance and self-motivation are critical human skills in complex environments with geographically dispersed teams and fast-changing technology ecosystems. Key to these skills is awareness of our own emotions and a sensitive reading of the emotions of colleagues.

We read other people. We gauge their mood, infer their feelings and try to predict responses to what we say and do. This helps humans succeed in complex societies where working together is crucial to thriving. Without conscious awareness, we continually observe each other – noticing minute changes in a person’s voice, posture, movement, facial expressions and eye contact. While we are not always accurate in understanding each other, recognising another person’s emotional state remains a critical ability in our hypersocial world.

With a new emphasis on self-awareness in the workplace, companies are applying established technologies in new ways to detect changes in emotional state, and developing new technologies that enable applications and devices to recognise human emotion.

Some methods of emotion detection are already well known: sentiment analysis of articles, emails, chat and social posts is commonplace and increasingly accurate. Face tracking is another established method, in which applications compare video data against a large database of facial expressions. As these datasets grow, the algorithms processing the text or video data become more powerful.
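
To make the idea concrete, here is a minimal sentiment-analysis sketch in Python using NLTK’s off-the-shelf VADER lexicon – one of many possible approaches, and far simpler than what production systems use. The example messages are invented for illustration.

```python
# Minimal sentiment-analysis sketch using NLTK's VADER lexicon.
import nltk
from nltk.sentiment.vader import SentimentIntensityAnalyzer

nltk.download("vader_lexicon", quiet=True)  # fetch the lexicon on first run

analyzer = SentimentIntensityAnalyzer()

for message in [
    "Great work on the release, team!",
    "This meeting could have been an email.",
]:
    scores = analyzer.polarity_scores(message)
    # 'compound' runs from -1.0 (very negative) to +1.0 (very positive).
    print(f"{scores['compound']:+.2f}  {message}")
```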

The combination of speech technology and AI techniques allows applications to identify patterns in voice and sound data from microphones or recorded calls. Applications ‘listen’ for heightened or lowered arousal in a voice. This is particularly pertinent in high-risk environments such as transport, where train drivers, for example, may be monitored to ensure they are alert and focused on the job.
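
As a rough sketch of how such an application might proxy ‘arousal’ from audio, the Python snippet below extracts two common cues – short-term energy and pitch – with the librosa library. The filename and thresholds here are illustrative assumptions, not a production model.

```python
# Illustrative voice-arousal sketch: louder speech and raised pitch
# are common (if crude) cues of heightened arousal.
import numpy as np
import librosa

y, sr = librosa.load("call_recording.wav", sr=16000)  # hypothetical file

rms = librosa.feature.rms(y=y)[0]  # short-term energy per frame

# Fundamental frequency (pitch); NaN where no voicing is detected.
f0, voiced_flag, voiced_probs = librosa.pyin(
    y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C6"), sr=sr
)
mean_pitch = np.nanmean(f0)

print(f"mean energy: {rms.mean():.4f}, mean pitch: {mean_pitch:.1f} Hz")
if rms.mean() > 0.1 and mean_pitch > 200:  # illustrative thresholds only
    print("possible heightened arousal")
```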

Some applications monitor the way we interact with our devices over time – for example, analysing our typing patterns to infer changes in emotional state. Devices that are worn or carried can differentiate between relaxed and agitated movements and gestures. Breathing patterns can be detected with camera data or with fabric sensors that register chest movement. Devices can monitor your heart, blood pressure, skin conductivity, muscular contractions, neurotransmitters and hormones. All of this data provides information about changes in your emotional state, and combining several methods of tracking increases accuracy.
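
One simple way to picture that combination is ‘late fusion’: each sensor produces its own estimate, and the estimates are merged with weights reflecting how much each sensor is trusted at that moment. The sketch below is illustrative – the modalities, scores and confidences are invented for demonstration.

```python
# Illustrative late-fusion sketch: merging per-sensor estimates of
# "agitation" into a single confidence-weighted score.
from dataclasses import dataclass

@dataclass
class ModalityReading:
    name: str
    agitation: float   # 0.0 (calm) to 1.0 (agitated)
    confidence: float  # how much we trust this sensor right now

readings = [
    ModalityReading("typing_rhythm", agitation=0.8, confidence=0.6),
    ModalityReading("heart_rate", agitation=0.7, confidence=0.9),
    ModalityReading("skin_conductivity", agitation=0.4, confidence=0.5),
]

# Confidence-weighted average: trusted sensors pull the estimate harder.
total = sum(r.confidence for r in readings)
fused = sum(r.agitation * r.confidence for r in readings) / total

print(f"fused agitation estimate: {fused:.2f}")
```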

There are huge implications for privacy and security in gathering and processing such sensitive data about individuals, and in building vast collective datasets for machine learning algorithms to munch through. Awareness of our own emotional state and of emotional changes in others can impact communication, teamwork, productivity, focus, learning, creativity and innovation. Alongside the potential risks, the detection of human emotion by devices promises powerful innovations that can be put to work for the human good.

Next time you’re angry-typing, rolling your eyes, and muttering under your breath, you may want to consider what your laptop, smartphone, eyewear, smartwatch or headset would say about your emotional state.

Talk to us

If you want to find out how Learnovate can help you solve challenges in learning and technology, talk to Tom at tom.pollock@learnovatecentre.org or Linda at linda.waters@learnovatecentre.org, or call +353 1 896 4910.

Wearables & Emotion Recognition

Learnovate has published a report on the development of emotion recognition in wearables. In this report, we explore the technology and assess the market for wearable devices that detect our emotional responses. Find out whether the convergence of emotion recognition technology and wearable devices will transform work relationships, reduce risks in safety-critical industries, and enhance productivity in the workplace.

This research report is a Learnovate core research project funded by Enterprise Ireland and IDA Ireland.

Report Author

Clifton Evans

Request Access

Funded by Enterprise Ireland and IDA Ireland, Learnovate’s industry-led collaborative research is undertaken for the collective benefit of the Irish learning technology industry. Complete the form below to request access.