Organization
Proposers' information:
Dr. Chuang Yu: Postdoctoral Researcher at Cognitive Robotics Lab of University of Manchester, UK
Dr. Siyang Song: Postdoctoral Researcher at University of Cambridge, UK
Dr. Leimin Tian: Research Fellow at Human-Robot Interaction group of Monash University, Australia
Dr. Zhao Han: Post-Doctoral Fellow of Computer Science at the interactive robotics MIRRORLab, Colorado School of Mines, USA
Prof. Xiaofeng Liu: Professor at Hohai University, China
Prof. Aiguo Song: Professor/Dean at College of Instrument Science and Engineering, Southeast University, China
Prof. Adriana Tapus: Full Professor and Director at Doctoral School of Institut Polytechnique of Paris, Autonomous Systems and Robotics lab, IP Paris, France
Important dates:
Submission deadline: 24 June 2022
Notification of acceptance: 22 July 2022
Camera-ready: 15 Aug 2022
Workshop date: 21 Oct 2022 (TBC)
In recent years, robotic applications have entered many aspects of our lives, especially healthcare services. In these applications, a user commonly interacts directly with a robot. In Human-Robot Interaction (HRI), trust and mutual adaptation are established and maintained through a positive social relationship between the robot and the human interactor, and rely on the robot's perceived competence along the social-emotional dimension. How a user perceives a robot's social intelligence and their social relationship with the robot can directly influence the outcomes of an HRI system, for example, whether the user decides to accept a recommendation from the robot. Moreover, in many HRI applications, social-emotional interaction with the intended users is the main goal of the system or a core strategy for achieving the desired outcomes. For example, HRI techniques can be applied to understanding human external behaviours and internal states in various applications, such as gesture and facial expression recognition, emotion/dimensional affect analysis, as well as mental health (e.g., depression, anxiety, bipolar disorder) and personality recognition. Such affective HRI applications therefore require emotion awareness and social-emotional competence in the robot's functions to deliver acceptable services. In addition, for robots deployed in shared spaces with humans, even when direct HRI is not expected to occur, the social intelligence of robots, such as the ability to follow certain social norms or to predict human intentions, is key to safe and effective deployment.
This workshop provides a communication and collaboration platform for researchers from the human-robot interaction (HRI), emotion recognition, affective computing, deep learning, and healthcare communities. This workshop will focus on discussing the following research questions:
How to perceive unimodal or multimodal affective human behaviour adaptively/accurately in HRI?
How to efficiently generate natural and affective robot behaviour in HRI?
How to facilitate human users’ mental and physical well-being with affective HRI applications?
Proposers' information:
Chuang Yu: Ph.D. at Institut Polytechnique de Paris (France) and Postdoctoral Researcher at University of Manchester, UK
Adriana Tapus: Full Professor and Director at Doctoral School of Institut Polytechnique of Paris, Autonomous Systems and Robotics lab, IP Paris, France
Xiaofeng Liu: Professor at Hohai University, China
Wenxuan Mou: Postdoctoral Researcher at University of Manchester, UK (wenxuan.mou@manchester.ac.uk)
Chuande Liu: Ph.D. candidate at Southeast University, China
Siyang Song: Postdoctoral Researcher at University of Cambridge, UK
Yiyue Luo: Ph.D. candidate at Massachusetts Institute of Technology, USA
Robotics and AI have advanced dramatically in the past few years. As a result, humans come into contact with more and more autonomous, AI-enabled robot systems in daily life. Human-centered intelligent robotics (HCIR) is therefore an emerging trend and challenge. Humans should always be in the loop, and robots should be aware of humans. A human-centered robot should respond with expressive verbal or non-verbal behaviors for social interaction. It should also assist humans with adaptive physical actions for effective and safe interaction. Furthermore, HCIR is also related to the interpretability of intelligent robots: an explainable or comprehensible system, instead of a black-box one, enables more natural and trustworthy human-robot interaction with the human in the loop. Therefore, human-centered intelligent robotics is a multifaceted concept that faces multiple challenges:
(1) How to perceive human behaviors well with learning methods. Multimodal behaviors, including facial actions, body actions, tactile information, EEG, and speech, can be used for human behavior understanding [1][2][3]. Furthermore, lifelong learning with the human in the loop should also be considered for natural long-term interaction [4].
(2) How to adaptively respond and conduct expressive behaviors for social interaction or assistive behaviors for physical interaction. The human-centered robot should respond to human interaction with multimodal social cues, including verbal and non-verbal behaviors, for natural human-robot interaction [6]. A human-centered robot should also explore physical behaviors, for example, robots that open doors with visual servoing [7].
(3) How to build a trustworthy robot system with explainable or comprehensible AI. The human-centered intelligent robot should take system interpretability into consideration, which leads to reliable human-robot interaction [8][9].
The session topics include but are not limited to:
- Human-robot interaction with AI
- Human behavior perception with AI
- Robot verbal or non-verbal behavior generation
- Social human-robot interaction with AI
- Physical human-robot interaction with AI
- Assistive service robots with AI for vulnerable populations such as children or the elderly
- Robot learning with humans in the loop