MSA 2024: The first International Workshop on Multimodal Social Agents & Robots for Mental Health and Wellbeing – A new world with foundation models at ACII 2024
Every day, hundreds of millions of people go to bed anxious, depressed, or unhappy, with no peace of mind and no idea what to do tomorrow to make their lives better. Human-like conversational AI agents with affect-sensing capabilities could revolutionise digital mental health and wellbeing solutions. Multimodal agents that interact through face, voice, and natural language are essential for mimicking human-human interaction. The advent of vision-language foundation models such as GPT and DALL·E has dramatically transformed human-AI interaction. Coupled with large language foundation models, social agents and robots equipped with multimodal affective interaction have the potential to enable novel approaches to assessing mental health conditions such as clinical depression and anxiety.
The MSA-2024 workshop, jointly organised with BLUESKEYE AI at the ACII 2024 conference in Glasgow, aims to explore applications of cutting-edge multimodal social agents and robots in digital mental health and wellbeing solutions. It provides a platform for researchers working on mental health technologies to network, share ideas and best practices, and explore new opportunities and potential solutions to current challenges.
Organisers
Joy Egede, Assistant Professor at University of Nottingham
Mani Tellamekala, Head of R&D at BLUESKEYE AI
Michel Valstar, Chief Evangelist and Scientific Officer at BLUESKEYE AI
Nick Cummins, Lecturer at King’s College London
Prof Pierre Philip, Université de Bordeaux
Sharon Mozgai, University of Southern California
Keynote Title: Autonomous Virtual Agents and Sleep Health Management: A Clinical and Populational approach
Keynote Speaker
Prof Pierre Philip
Prof. Philip is a psychiatrist by training and a member of the CNRS Service and Research Unit "Sleep, Attention and NeuroPSYchiatrie" (USR6033 – SANPSY). He has extensive experience in clinical research, is Director of the Neuro-Psychopharmacological Research Platform, and coordinates the PEPR research programme AUTONOM-HEALTH, dedicated to the development of embodied conversational agents for sleep and mental health. He is currently coordinator of the Sleep Research Group (GDR CNRS 3737) and scientific advisor to the CNRS (INSB). Since July 2020 he has been head of the university department of sleep medicine at the University Hospital of Bordeaux.
Keynote Abstract
Owing to the huge number of sleep complaints among the general French population since the Covid-19 outbreak, medical resources have been overloaded with patient management, and sleep and mental health problems have had to be treated through a large-scale autonomous model. We developed a research programme on virtual agents to target and treat sleep complaints and social stress. To date, 60,000 users have downloaded a free app (KANOPEE) from the Google Play and Apple App Stores to benefit from sleep recommendations. We showed that these autonomous behavioural interventions can significantly improve sleep complaints and sleep schedules, as well as mood disorders. We also studied the major determinants of the acceptability of autonomous virtual agents among the general population across various age groups. We will present our major findings and highlight future directions for research on autonomous virtual agents in the field of mental health.
Call for Papers
We invite academic paper submissions that explore applications of face, voice, and language foundation models in building mental health applications using virtual agent or robotics technologies.
Topics of interest include:
Recognition of mental health conditions from face and voice
Improving delivery of mental health treatment using empathetic social agents and robots
Providing mental health advice using social virtual agents and robots
Reports of real-world deployment of virtual agents and social robots
Ethical issues of virtual agents and social robots for mental health management
Paper submission guidelines
Papers should be formatted using the ACII 2024 templates (Word, LaTeX, or Overleaf) and guidelines, available at https://acii-conf.net/2024/authors/submission-guidelines/.
Papers will be reviewed in a double-blind manner; each paper will be assessed by two or three reviewers.
Accepted papers will be published and indexed in the IEEE Xplore Digital Library under the ACII 2024 workshop proceedings.
Please submit papers via the MSA page on EasyChair.
Important dates
Call for papers published: 19 February 2024
Submission deadline: 5 July 2024
Notification to authors: 14 July 2024
Camera ready deadline: 31 July 2024
Workshop: Sunday 15 September 2024
Workshop Program
09.00 - 09.10am:
Welcome and opening remarks
09.10 - 10.00am:
Keynote talk (40 min talk, 10 min Q&A)
Title: Autonomous Virtual Agents and Sleep Health Management: A Clinical and Populational Approach
Prof Pierre Philip, Université de Bordeaux
10.00 - 10.30am:
Oral Session 1: Social Agents & Robots for Mental Health and Wellbeing (12 min presentation + 3 min Q&A per paper)
Paper 1: Enhancing Patient Intake Process in Mental Health Consultations Using RAG-Driven Chatbot
Paper 2: Learning Graph Representation for Predicting Student Mental Wellbeing in Robot Assisted Journal Writing Context
10.30 - 11.00am:
Tea break
11.00 - 11.45am:
Oral Session 2: Inclusive Affective Technologies for Mental Health and Wellbeing (12 min presentation + 3 min Q&A per paper)
Paper 3: Multimodal Gender Fairness in Depression Prediction: Insights on Data from USA & China
Paper 4: Driver Monitoring Systems in Automated Vehicles for the Older Population
Paper 5: Acoustic Characterization of Huntington’s Disease Emotional Expression: An Explainable AI Approach
11.45 - 12.30:
Prize presentation and closing
12.30 - 13.30:
Lunch
Best paper prize award
BLUESKEYE AI will award a prize of £500 to the best workshop paper.