Immersive Emotions

Tags
Data Visualization & Installation Art
Role
Researcher & Designer
Year
2022

Project Overview

Immersive Emotions is an interactive public installation that helps users understand their emotions and how their body language might reveal how they are feeling. Biosensors embedded in a glove capture physiological data, which is mapped to a visual display.

This project was accepted into the ACM Mobile Human-Computer Interaction Conference 2022.

Key Features

  • Wearable Device
  • Bio Sensors
  • Camera

Research

There are approximately twenty-seven human emotions that are all interconnected; however, there are six that are most easily identifiable. These six basic emotions are happiness, sadness, fear, anger, disgust, and surprise. Using these six emotions, we investigated how they related to neuroaesthetics and body language. Additionally, we researched how the human body responds physically to these emotions through biometric data.

Blue often corresponds to calm feelings, orange to happiness or excitement, red to anger, and purple to shyness or timidity. We recognize that color associations can be cultural and subjective; not everyone will relate these emotions to the same colors.

Still body movements can correspond to calmness; big, open, circular movements to happiness; angular, abrupt movements to anger; and slow, downward movements to shyness.

A normal body temperature, little movement, and a steady heartbeat can correspond to calmness. Increased temperature, sweat, heart rate, and movement can correspond to either happiness or anger. A normal or decreased temperature, little movement, increased sweat, and an increased heart rate can correspond to shyness or timidity.
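The mappings above can be sketched as a simple heuristic. This is an illustrative sketch only: the thresholds, field names, and labels are assumptions, not values from the installation.

```python
def classify_emotion(temp_c: float, movement: float,
                     heart_rate: float, sweat: float) -> str:
    """Map raw biometric values to one of the coarse emotion groups.

    temp_c:     skin temperature in degrees Celsius
    movement:   normalized motion intensity, 0 (still) to 1 (vigorous)
    heart_rate: beats per minute
    sweat:      normalized galvanic skin response, 0 to 1
    """
    elevated_hr = heart_rate > 90   # hypothetical threshold
    warm = temp_c > 34.0            # hypothetical threshold

    if not elevated_hr and movement < 0.2:
        return "calm"                # steady heartbeat, little movement
    if elevated_hr and warm and movement > 0.5:
        return "happy-or-angry"      # arousal alone cannot split the two
    if elevated_hr and sweat > 0.5 and movement < 0.3:
        return "shy-or-timid"        # aroused but physically withdrawn
    return "neutral"

print(classify_emotion(33.5, 0.1, 70, 0.1))  # calm
```

Note that happiness and anger share the same physiological profile here, which is why the installation also draws on body language to disambiguate them.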

Data Capture

To capture biometric data, we used a heart rate sensor, a galvanic skin response (GSR) sensor, an accelerometer, a gyroscope, and a temperature sensor. These sensors were all embedded into the wearable device worn on the hand. Additionally, we used a camera to capture body positioning.
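One way to bundle a single reading from the sensors listed above into a record for downstream mapping is sketched below; the field names and units are hypothetical, not the project's actual data format.

```python
from dataclasses import dataclass

@dataclass
class BiometricSample:
    heart_rate_bpm: float               # heart rate sensor
    gsr: float                          # galvanic skin response, normalized 0-1
    accel: tuple[float, float, float]   # accelerometer (x, y, z), m/s^2
    gyro: tuple[float, float, float]    # gyroscope angular rate, deg/s
    temp_c: float                       # skin temperature, degrees Celsius

    def motion_magnitude(self) -> float:
        """Rough overall motion intensity: magnitude of the acceleration vector."""
        return sum(a * a for a in self.accel) ** 0.5

sample = BiometricSample(72.0, 0.3, (0.1, 0.2, 9.8), (0.0, 1.5, 0.0), 33.4)
```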

Wearable Device

The wearable glove was made of neoprene to hold the sensors in place, while ensuring that sensors relying on skin conductivity remained secured against the skin of the fingertips.

Data Visualization

The visual displayed in response to the user's movement is abstract, meant to provoke reflection and curiosity. Particle size, path, color, opacity, and the overall area of the visualization are indirect reflections of the biometric values, gestures, and posture.
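An indirect mapping like this could be sketched as follows. The input ranges, formulas, and parameter names are assumptions for illustration, not the installation's actual code.

```python
def lerp(a: float, b: float, t: float) -> float:
    """Linear interpolation from a to b, with t clamped to [0, 1]."""
    t = max(0.0, min(1.0, t))
    return a + (b - a) * t

def particle_params(heart_rate: float, motion: float, gsr: float) -> dict:
    """Derive particle size, opacity, and spread from biometric inputs.

    motion and gsr are assumed normalized to 0-1; heart_rate is in bpm.
    """
    arousal = (heart_rate - 60) / 60          # map 60-120 bpm onto 0-1
    return {
        "size": lerp(2.0, 12.0, arousal),     # bigger particles when aroused
        "opacity": lerp(0.3, 1.0, gsr),       # higher skin response, more visible
        "spread": lerp(0.2, 1.0, motion),     # big movements widen the area
    }

print(particle_params(90, 0.5, 0.5))
```

Clamping each interpolation keeps noisy sensor spikes from pushing the visuals outside their intended ranges.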

Paper

You can read the paper here.