Discerning Affect from Touch and Gaze During Interaction with a Robot Pet

Xi Laura Cang, Paul Bucci, Jussi Rantala, Karon MacLean

Research output: Contribution to journal › Article › Scientific › peer-review

Abstract

Practical affect recognition needs to be efficient and unobtrusive in interactive contexts. One approach to a robust real-time system is to sense and automatically integrate multiple nonverbal sources. We investigated how users' touch, and secondarily gaze, perform as affect-encoding modalities during physical interaction with a robot pet, in comparison to more-studied biometric channels. To elicit authentically experienced emotions, participants recounted two intense memories of opposing polarity in Stressed-Relaxed or Depressed-Excited conditions. We collected data (N=30) from a touch sensor embedded under the robot's fur (force magnitude and location), a robot-adjacent gaze tracker (location), and biometric sensors (skin conductance, blood volume pulse, respiration rate). Cross-validation of Random Forest classifiers achieved best-case accuracy for combined touch-with-gaze approaching that of biometric results: when training and test sets included adjacent temporal windows, subject-dependent prediction was 94% accurate. In contrast, subject-independent Leave-One-Participant-Out prediction achieved 30% accuracy (chance 25%). Performance was best where participant information was available in both training and test sets. To address computational robustness for dynamic, adaptive real-time interactions, we analyzed subsets of our multimodal feature set, varying sample rates and window sizes. We summarize design directions based on these parameters for this touch-based, affective, hard real-time robot interaction application.
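The evaluation contrast at the center of the abstract (subject-dependent cross-validation, where adjacent temporal windows from the same participant can appear in both training and test sets, versus subject-independent Leave-One-Participant-Out evaluation) can be illustrated with a short sketch. Everything below is a hypothetical stand-in: the data are synthetic and the feature counts, window counts, and hyperparameters are invented placeholders, not the authors' pipeline.

    # Hypothetical sketch: subject-dependent vs. Leave-One-Participant-Out (LOPO)
    # evaluation of a Random Forest affect classifier on windowed multimodal
    # features. All sizes and data below are illustrative, not from the study.
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import (LeaveOneGroupOut, StratifiedKFold,
                                         cross_val_score)

    rng = np.random.default_rng(0)

    N_PARTICIPANTS = 30           # matches the study's N=30
    WINDOWS_PER_PARTICIPANT = 40  # e.g., sliding windows over each session
    N_FEATURES = 12               # e.g., touch force/location, gaze, biometrics
    N_CLASSES = 4                 # Stressed / Relaxed / Depressed / Excited

    # Synthetic stand-in for per-window feature vectors and emotion labels.
    X = rng.normal(size=(N_PARTICIPANTS * WINDOWS_PER_PARTICIPANT, N_FEATURES))
    y = rng.integers(0, N_CLASSES, size=len(X))
    groups = np.repeat(np.arange(N_PARTICIPANTS), WINDOWS_PER_PARTICIPANT)

    clf = RandomForestClassifier(n_estimators=100, random_state=0)

    # Subject-dependent evaluation: shuffled folds let windows from the same
    # participant (including temporally adjacent ones) land in both train and
    # test sets -- the condition under which accuracy is highest.
    sd_cv = StratifiedKFold(n_splits=5, shuffle=True, random_state=0)
    sd_scores = cross_val_score(clf, X, y, cv=sd_cv)
    print(f"Subject-dependent accuracy: {sd_scores.mean():.2f}")

    # Subject-independent evaluation: each fold holds out one participant
    # entirely, so no individual contributes to both training and testing.
    lopo_scores = cross_val_score(clf, X, y, groups=groups,
                                  cv=LeaveOneGroupOut())
    print(f"LOPO accuracy: {lopo_scores.mean():.2f} "
          f"(chance = {1 / N_CLASSES:.2f})")

On synthetic data both estimates hover near chance; the point of the sketch is the split design, which explains why the reported subject-dependent figure (94%) far exceeds the LOPO figure (30%, chance 25%).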

Original language: English
Journal: IEEE Transactions on Affective Computing
DOIs
Publication status: E-pub ahead of print - 7 Jul 2021
Publication type: A1 Journal article-refereed

Publication forum classification

  • Publication forum level 3

ASJC Scopus subject areas

  • Software
  • Human-Computer Interaction
