Spotlight on Dr Soumya C. Barathi


Dr Soumya C. Barathi is a Marie Curie FIRE (Fellow with Industrial Research Enhancement) and a CAMERA researcher at the University of Bath. Her research is multidisciplinary, spanning health, engineering, human-computer interaction (HCI), and psychology. During her doctoral studies at the Centre for Digital Entertainment (CDE), her research focused on improving exercise performance, maintaining motivation, and tracking user experience in virtual reality exercise games (VR exergames).

Her research with Dr Michael Proulx, Prof Eamonn O'Neill, and Dr Christof Lutteroth explores a novel multi-sensory tracking system that monitors user experience, as reflected by the player's emotion (affective state), at run-time in high-intensity VR exergaming. They have identified predictors of affect based on psychophysiological measurements. This opens the door to affectively adaptive high-intensity VR exergames, which can dynamically adjust their intensity according to the player's affective state at run-time, leading to an optimal player experience and facilitating exercise adherence.
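To give a flavour of the idea, the control loop of an affectively adaptive exergame might look like the following sketch. This is a hypothetical illustration only; the function name, the affect scale, and the step and clamp values are invented for the example and are not taken from the research.

```python
# Hypothetical sketch of an affectively adaptive exergame loop:
# nudge the exercise intensity toward whatever keeps the player in
# an "optimal" affective state. The -1..+1 affect scale (underwhelmed
# to overwhelmed) and all constants are illustrative assumptions.
def adapt_intensity(current_intensity, affect, step=0.1, lo=0.2, hi=1.0):
    """Return the next intensity setting given the estimated affect.

    affect: -1.0 (underwhelmed) .. 0.0 (optimal) .. +1.0 (overwhelmed)
    """
    if affect > 0.3:          # player overwhelmed: ease off
        current_intensity -= step
    elif affect < -0.3:       # player underwhelmed: push harder
        current_intensity += step
    # keep intensity within the hardware's (assumed) safe range
    return max(lo, min(hi, current_intensity))
```

In a real system this function would run every few seconds, fed by the run-time affect estimate from the tracking system.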

Her previous research with Prof O’Neill and Dr Lutteroth, in collaboration with researchers from the Health Department and with HCI and psychology researchers from the University of Auckland, presented a psychophysical training technique called interactive feedforward. Players trained with and competed against an enhanced self-model in a high-intensity VR exergame with appealing aesthetics and engaging gameplay, which improved exercise performance while maintaining intrinsic motivation.

Her research with her collaborators has led to multiple publications at the prestigious ACM CHI conference, a top-tier international venue with an acceptance rate below 25%:

 

Affect Recognition using Psychophysiological Correlates in High Intensity VR Exergaming

Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems

Recognising the affective state of VR exergame players could enable us to personalise and optimise their experience. However, affect recognition based on psychophysiological measurements for high intensity VR exergames presents challenges as the effects of exercise and VR headsets interfere with typical measurements. This paper presents novel predictors of affect based on gaze fixations, eye blinks, pupil diameter, and skin conductivity for affect recognition in high intensity VR exergaming.

Paper: https://dl.acm.org/doi/abs/10.1145/3313831.3376596
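As a rough illustration of feature-based affect recognition, one could score the psychophysiological signals the paper names (gaze fixations, eye blinks, pupil diameter, skin conductivity) with a simple linear model. The weights, thresholds, and class labels below are invented for the example and are not the paper's model.

```python
# Illustrative sketch only: a linear score over standardized
# psychophysiological features, mapped to three affective states.
# The weights and thresholds are made up for this example; the paper
# identifies the predictors but this is not its actual classifier.
def classify_affect(features, weights, bias=0.0):
    """Classify affect from z-scored features via a weighted sum."""
    score = bias + sum(weights[name] * value
                       for name, value in features.items())
    if score > 0.5:
        return "overwhelmed"
    if score < -0.5:
        return "underwhelmed"
    return "optimal"

# Example: features z-scored per player (a common practice, assumed
# here) with illustrative weights.
features = {"fixation_rate": -1.2, "blink_rate": 0.4,
            "pupil_diameter": 1.5, "skin_conductance": 1.1}
weights = {"fixation_rate": -0.3, "blink_rate": -0.2,
           "pupil_diameter": 0.5, "skin_conductance": 0.4}
```

With these example values the score is 1.47, so the sketch would label the player "overwhelmed"; a trained model would learn the weights from labelled data instead.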

Guidelines for Affect Elicitation and Tracking in High Intensity VR Exergaming

Momentary Emotion Elicitation and Capture Workshop at CHI 2020 Conference

This position paper on VR exergaming provides an overview of advances made in affect elicitation and tracking. It outlines guidelines for evoking underwhelming, overwhelming, and optimal affective states and tracking the affective state using psychophysiological measurements in high intensity VR exergaming. It discusses the research challenges that need to be addressed to implement affectively adaptive high intensity VR exergaming.

Paper: https://meec-ws.com/papers/MEEC_2020_paper_10.pdf

 

Interactive Feedforward for Improving Performance and Maintaining Intrinsic Motivation in VR Exergaming

Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems

This paper presents a novel method called interactive feedforward, an interactive adaptation of the psychophysical feedforward training method, in which rapid performance improvements are achieved by creating self-models that show previously unachieved performance levels. Interactive feedforward was evaluated in a cycling-based VR exergame where players interacted and competed with their self-model in real time. It led to improved exercise performance while maintaining intrinsic motivation.

Paper: http://dx.doi.org/10.1145/3173574.3173982
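The core feedforward idea can be sketched in a few lines: replay the player's own recorded performance, scaled slightly beyond what they previously achieved, so they race an "enhanced self". The 15% boost below is an invented example value, not a parameter from the study.

```python
# Hypothetical sketch of building an enhanced self-model for
# interactive feedforward: take a recorded speed trace from the
# player's own past ride and scale it up. The boost factor is an
# illustrative assumption, not the study's setting.
def enhanced_self_model(recorded_speeds, boost=1.15):
    """Return the speed trace the self-model avatar will follow."""
    return [speed * boost for speed in recorded_speeds]

def avatar_positions(speed_trace, dt=1.0):
    """Integrate the speed trace into positions for rendering the
    self-model avatar ahead of (or behind) the player."""
    positions, pos = [], 0.0
    for speed in speed_trace:
        pos += speed * dt
        positions.append(pos)
    return positions
```

At run-time the avatar would be driven along these positions while the player tries to keep up with, and eventually beat, their enhanced self.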

 

Research portfolio: https://www.soumyacbarathi.com/
