
Ai Nakada

About
Emotions are crucial in communication: the ability to understand and translate emotions enables better communication experiences. The aim of my work is to provide opportunities for people to recognise their own true feelings using biofeedback. Biofeedback describes the process of tracking physiological data and presenting it back to the user, providing information from psychophysiological recordings about the levels at which physiological systems are functioning.
Ai Nakada's work at the RCA comprises a study of biodata and biofeedback through experience design methods. You can listen to the emotion soundtracks used for testing below (categorised as 'Anger', 'Surprise', 'Tender', 'Happy', 'Fear', and 'Sad'). These were used to elicit and track changes in unconscious physiological signals for each of the moving images. The physiological indicators, pupil size and heart rate, are correlated and then compared to generate a narrative of their relationship. This is then used to generate insights into the ways in which biofeedback can inform emotional communication processes.
Statement

Do you think you really understand your feelings?
Is it possible to perceptually monitor and understand one's own emotions?
Even as we become increasingly able to monitor cognition through existing and emerging technology, we still don't understand precisely how emotion, and the perception of emotion, operate in the human brain. For example, no concrete boundaries have been established between sensation and perception, or between perception and cognition. I would argue that sensation is knowing one's own state, perception is knowing one's surroundings, and cognition is the mechanism by which we do both. But much in this area remains beyond the reach of scientific understanding.
Ai uses cognitive psychology, especially concerning perception, as a key factor in user research, combining these approaches with methods from experience design to better understand how people experience and perceive emotion.
Education
2010 - 2014 BA, Psychology, Japan
2016 - 2018 Master of Engineering, Tokyo Institute of Technology
2020 - 2022 Master of Arts, Royal College of Art
Work
User Interface Developer
User Interface Research
The Eyes Speak More than the Mouth
Seeing is an activity. We do not passively let visual information fall onto our retina, but actively seek out objects of interest by moving our body, head, and eyes. The saccadic and smooth-pursuit eye movements that control gaze direction have been extensively studied (e.g. Kowler, 2011).
But eye movements do far more than direct gaze. Once gaze has been directed at an object of interest, our eyes continue to move to provide our brain with the best possible image: the curvature of the lens changes (accommodates) to control focus; and our pupils enlarge (dilate) or shrink (constrict) to control how much of the lens’s surface is exposed, and consequently how much light enters the eye. In this review, I will focus on this last type of eye movement: pupil responses.
The pupil changes its size in response to three distinct kinds of stimuli: it constricts in response to brightness and to near fixation, and it dilates in response to increased cognitive activity, such as increased levels of arousal or mental effort.
Medium: Biodata capture from pupils
Size: 50 cm × 50 cm
In Collaboration with:
Heartbeat Sensor
Heartbeat Sensor is an electronic device used to measure the heart rate, i.e. the speed of the heartbeat.
Youmile ECG Module AD8232 ECG Measurement Pulse Heart Rate Sensor Module Kit ECG Monitoring Sensor for Arduino with Dupont Cable
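The AD8232 module outputs an analogue ECG-like waveform, and heart rate is derived from it by detecting the beat peaks and measuring the intervals between them. The sketch below illustrates that idea in Python with synthetic samples standing in for sensor readings; the sampling rate, threshold, and function names are illustrative assumptions, not the actual code used in the work.

```python
# Hypothetical sketch: estimate heart rate (BPM) from an ECG-like signal
# using simple threshold-based peak detection. Synthetic samples stand in
# for readings from the AD8232's analogue output.

SAMPLE_RATE_HZ = 250  # assumed sampling rate

def detect_peaks(signal, threshold):
    """Return indices where the signal rises above the threshold."""
    peaks = []
    above = False
    for i, value in enumerate(signal):
        if value > threshold and not above:
            peaks.append(i)   # rising edge: count one beat
            above = True
        elif value <= threshold:
            above = False
    return peaks

def bpm_from_peaks(peaks, sample_rate):
    """Mean beats-per-minute from the intervals between successive peaks."""
    if len(peaks) < 2:
        return 0.0
    intervals = [(b - a) / sample_rate for a, b in zip(peaks, peaks[1:])]
    return 60.0 / (sum(intervals) / len(intervals))

# Synthetic "ECG": one spike every 250 samples (1 s apart), i.e. 60 BPM.
signal = [1.0 if i % 250 == 0 else 0.1 for i in range(1000)]
print(round(bpm_from_peaks(detect_peaks(signal, 0.5), SAMPLE_RATE_HZ)))  # prints 60
```

In practice the raw AD8232 signal is noisier than this, so a real pipeline would filter the waveform before peak detection.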
Biodata Relationship: Pupil and Heart Rate
This image reflects, in RGB, the correlation between pupil trends and heart rate, measured with Arduino sensors and processed in MATLAB.
Pupils and heart rate change with changes in emotion.
This sample uses both of these values to change colour and movement.
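One way to realise the mapping described above is to correlate the two traces and translate the correlation strength into a colour. The sketch below is a minimal Python illustration of that idea; the colour mapping and all names are assumptions for illustration, not the MATLAB code actually used in the piece.

```python
# Hypothetical sketch: correlate a pupil-diameter trace with a heart-rate
# trace and map the result to an RGB colour. The mapping (positive -> red,
# negative -> blue, weak -> green) is an illustrative assumption.

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def correlation_to_rgb(r):
    """Map a correlation in [-1, 1] to an RGB triple."""
    red = int(max(r, 0.0) * 255)       # strong positive correlation
    blue = int(max(-r, 0.0) * 255)     # strong negative correlation
    green = int((1.0 - abs(r)) * 255)  # weak or no correlation
    return (red, green, blue)

# Arbitrary illustrative traces: pupil diameter (mm) and heart rate (BPM).
pupil_mm = [3.1, 3.4, 3.8, 4.0, 4.3]
heart_bpm = [62, 66, 71, 74, 79]

print(correlation_to_rgb(pearson(pupil_mm, heart_bpm)))
```

Because both traces rise together here, the correlation is close to +1 and the resulting colour is close to pure red; the same value could equally drive movement parameters in the moving image.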
Perception research
Process
In Collaboration with: