The K-EmoPhone mobile and wearable sensor dataset contributes to advancements in affective computing, emotional intelligence technologies, and attention management.
Sensors embedded in our smartphones, watches, and even vehicles and homes can now give researchers unprecedented insights into human behavior and preferences. The devices we use to call a friend or post on social media can become windows into our psychological state and behavioral patterns — these data can be used to track signs of stress and emotions.
This new paradigm for using abundantly available data feeds into an emerging field known as affective computing, which aims to develop systems that can recognize and interpret human emotions. Affective computing research is often conducted using data collected in controlled laboratory environments, where participants either portray specific emotions or are exposed to stimuli that trigger particular emotional responses. Their physiological signals, facial expressions, and speech patterns are then captured and recorded. While this approach has its merits, viewing an emotional video clip in a lab doesn’t quite evoke the full range of human emotion as experienced in the real world.
To address this, a team of researchers including Khalifa University’s Prof. Ahsan Habib Khandoker and Prof. Leontios Hadjileontiadis, Chair of the Department of Biomedical Engineering, has developed a dataset that incorporates real-world emotion, stress, and attention labels gathered from university students. Working with researchers from the Korea Advanced Institute of Science and Technology, Profs. Hadjileontiadis and Khandoker collected a variety of sensor data from students’ Android smartphones and Microsoft Band 2 smartwatches. Additionally, participants were asked to report their emotional state — happiness, stress, attention levels, task disturbance, and emotional change — up to 16 times a day. All of the data collection was undertaken according to a plan approved by the Khalifa University research ethics committee.
Named K-EmoPhone, the dataset offers an in-depth look at human emotions through behavioral, contextual, and physiological data. The data were collected from participants as they navigated their daily lives, with the technology and wearables prompting responses throughout the day. The dataset was published in Nature Scientific Data.
“Despite the remarkable strides in building affective computing datasets, there is still a clear need for more comprehensive, real-world, multimodal datasets that include a broad range of in-situ emotional labels,” Prof. Hadjileontiadis said. “The K-EmoPhone dataset promises to illuminate the nuances of emotional states over time and has potential applications across a wide range of domains, from affective computing to attention management. We believe that such a comprehensive dataset will greatly benefit future research in data-driven understanding of human behavior and emotion.”
The research team used PACO, an open-source smartphone app that enables researchers to design and conduct experience sampling method (ESM) studies. This approach aims to collect in-the-moment emotions, stress, attention levels, and other aspects of cognitive state, casting light on the human condition as it unfolds in everyday life.
The participants received push notifications as prompts to respond to a questionnaire, randomly appearing up to 16 times per day over a week. Each prompt would disappear after 10 minutes to reduce recall bias, ensuring immediate responses for the most accurate emotional state representation. Data were also gathered from the participants’ smartphones and wearables. Special data collection software was designed to unobtrusively capture data reflecting mobility, network traffic, social communication, application usage, and device status around the clock. The smartwatches provided additional sensor readings related to physiological responses, environmental contexts, and mobility.
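The sampling scheme described above — up to 16 randomly timed prompts per day, each expiring after 10 minutes — can be illustrated with a short sketch. This is purely illustrative of the ESM scheduling idea; the function and field names are assumptions, not PACO's actual implementation.

```python
import random
from datetime import datetime, timedelta

def schedule_prompts(day_start, day_end, max_prompts=16, expiry_minutes=10):
    """Randomly place ESM prompts within a participant's waking-hours window.

    Each prompt carries an expiry time: a response arriving after that
    point would be discarded, keeping reports close to the moment they
    describe and reducing recall bias. (Illustrative sketch only.)
    """
    window_seconds = int((day_end - day_start).total_seconds())
    offsets = sorted(random.sample(range(window_seconds), max_prompts))
    return [
        {
            "fires_at": day_start + timedelta(seconds=s),
            "expires_at": day_start + timedelta(seconds=s, minutes=expiry_minutes),
        }
        for s in offsets
    ]

# Example: one day of prompts between 9:00 and 23:00
day_start = datetime(2023, 7, 6, 9, 0)
day_end = datetime(2023, 7, 6, 23, 0)
prompts = schedule_prompts(day_start, day_end)
```

Sampling offsets without replacement guarantees no two prompts coincide, while sorting yields a chronological schedule.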
“The K-EmoPhone dataset has been curated to help researchers study affective and cognitive states using multimodal data, encompassing physiological signals, personal contexts and interactions captured by smartphones, personal attributes, and mental health,” Prof. Khandoker said. “It is unique in its focus on timely responses to affective and cognitive states in real-world data collection settings.”
Potential applications include building machine learning models for predicting mental well-being and productivity, recognizing emotions, and detecting stress. The dataset could also support attention management studies and shed light on how emotional states are affected by tasks that require timely responses. All data are openly available to any researcher, and the KU team is currently analyzing the data for emotion recognition and behavior-change modeling.
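A common first step in the kind of modeling described above is aligning continuous sensor streams with the self-reported labels: for each ESM response, features are extracted from the sensor readings in the window just before the prompt. The sketch below assumes simple (timestamp, value) sensor samples and (timestamp, label) responses; the field names and window length are illustrative, not the actual K-EmoPhone schema.

```python
from datetime import datetime, timedelta
from statistics import mean

def window_features(sensor, responses, window=timedelta(minutes=5)):
    """Pair each self-reported label with the mean sensor value in the
    preceding time window, producing (feature, label) examples suitable
    for training a classifier. (Hypothetical schema, for illustration.)
    """
    pairs = []
    for t_resp, label in responses:
        vals = [v for t, v in sensor if t_resp - window <= t <= t_resp]
        if vals:  # skip prompts with no sensor coverage
            pairs.append((mean(vals), label))
    return pairs

# Synthetic example: a heart-rate-like stream sampled every 30 seconds,
# and two ESM responses (1 = stressed, 0 = not stressed).
base = datetime(2023, 7, 6, 10, 0)
sensor = [(base + timedelta(seconds=30 * i), 70 + i % 5) for i in range(40)]
responses = [(base + timedelta(minutes=10), 1), (base + timedelta(minutes=18), 0)]
pairs = window_features(sensor, responses)
```

The resulting (feature, label) pairs are what a stress-detection model would be trained on; real pipelines would extract many features per window across all modalities.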
06 July 2023