Embodied Communication of Goals and Intentions

Workshop at ICSR 2013

27 October 2013

This workshop aims to bring together researchers from different fields working on robots that communicate with humans. The focus is on human-robot interaction and on embodied communication of goals and intentions. It is assumed that action performance, gazing behaviour, spatial arrangement and the spatial flow of action strongly influence how goals and intentions are inferred from humans. Learning and understanding in a social context should not be considered a one-sided process; it is therefore interesting to study situations from both the learner's and the teacher's perspective. In this workshop, the intention is to investigate the challenges posed by such complex interaction systems from different research perspectives. To this end, we will host talks given by researchers with different backgrounds. The aim is to report on the state of the art and to promote the exchange of ideas on how to enable a robot to interact with a human in a more natural way, so that it can learn directly from human instruction.

Organisers

  • Katrin Solveig Lohan, iCub Facility, Istituto Italiano di Tecnologia
  • Konstantinos Theofilis, Adaptive Systems Research Group, University of Hertfordshire

Invited Speakers

Planning for Human-Robot Interaction: Representing Time and Intention

Frank Broz, University of Plymouth
In many social tasks, it is important to reason about the intentions of others in order to coordinate behaviour when goals are shared, or to resolve conflicts when differing goals could lead to them. In this talk, I will describe a modelling approach that represents human-robot interactions as partially observable Markov decision processes (POMDPs), where the intention of the human is represented as an unobservable part of the state space and the robot's own intentions are expressed through the rewards. The state space and transition structure for these models are designed to represent time-dependence in action outcomes and changes in the environment, which is necessary to successfully coordinate behaviour in many domains. I will present results comparing the performance of policies from these models to the performance of humans interacting with other humans on an interaction task in a simulated environment. I will also discuss potential applications of this approach to other domains in human-robot interaction, including face-to-face interaction.
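
The modelling idea sketched in this abstract can be illustrated with a toy example. The Python sketch below is not the model from the talk: the intentions, observations, probabilities and rewards are invented for illustration, it uses a greedy one-step policy instead of a full POMDP solver, and it omits the time-dependent transition structure. It only shows how the human's intention can be kept as an unobservable state component that the robot tracks with a belief, while the robot's own intention enters purely through the rewards.

    # Toy sketch (illustrative assumptions throughout): hidden human intention,
    # robot goals encoded in the rewards, Bayesian belief tracking over intention.

    INTENTIONS = ["wants_object", "wants_to_pass"]          # hidden human state
    OBSERVATIONS = ["gaze_at_object", "gaze_at_exit"]        # observable cues
    ACTIONS = ["hand_over_object", "step_aside", "wait"]     # robot actions

    # P(observation | intention): how likely each cue is under each intention.
    OBS_MODEL = {
        "wants_object":  {"gaze_at_object": 0.8, "gaze_at_exit": 0.2},
        "wants_to_pass": {"gaze_at_object": 0.3, "gaze_at_exit": 0.7},
    }

    # Reward(intention, action): the robot's own goals appear only here.
    REWARD = {
        ("wants_object", "hand_over_object"): 10.0,
        ("wants_object", "step_aside"): -2.0,
        ("wants_object", "wait"): -1.0,
        ("wants_to_pass", "hand_over_object"): -5.0,
        ("wants_to_pass", "step_aside"): 10.0,
        ("wants_to_pass", "wait"): -1.0,
    }

    def update_belief(belief, observation):
        """Bayesian update of the belief over the hidden human intention."""
        posterior = {i: belief[i] * OBS_MODEL[i][observation] for i in INTENTIONS}
        total = sum(posterior.values())
        return {i: p / total for i, p in posterior.items()}

    def best_action(belief):
        """Greedy one-step policy: maximise expected reward under the belief."""
        expected = {a: sum(belief[i] * REWARD[(i, a)] for i in INTENTIONS)
                    for a in ACTIONS}
        return max(expected, key=expected.get)

    if __name__ == "__main__":
        belief = {i: 1.0 / len(INTENTIONS) for i in INTENTIONS}  # uniform prior
        for obs in ["gaze_at_object", "gaze_at_object"]:         # observed cues
            belief = update_belief(belief, obs)
        print(belief, "->", best_action(belief))

After two observations of the human gazing at the object, the belief concentrates on "wants_object" and the greedy policy selects the hand-over action; a full POMDP policy would additionally plan over future observations and time-dependent transitions.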

Interaction with socially interactive robot companions

Kheng Lee Koay, University of Hertfordshire
The talk will discuss the role of embodied communication and interaction in human-robot interaction scenarios in an assistive context. Examples of research on robot companions, i.e. home companion robots designed to assist people in their own homes, will be presented. The emphasis of the talk will be on modes and modalities of interaction for creating engaging scenarios.

Communicating by Moving - Anecdotes about and Insights into Human-Robot Spatial Interaction

Marc Hanheide, University of Lincoln
Enabling a robot to move among humans is not only a question of safety, but also of non-verbal communication of intentions and goals. The spatially interacting partners (humans and robots) continuously monitor and signal these by means of motion trajectories, body orientation, facial expressions, and gaze. In my talk, I will present our research in this area covering the understanding of communicative signals, qualitative reasoning about trajectories and our ideas on long-term adaptation of navigation patterns in human-robot spatial interaction.

List of Topics

  • Goal extraction
  • Embodied communication
  • Mutual gaze
  • Face-to-face communication
  • Turn-taking in HRI
  • Human-aware navigation
  • Spatial prompting
  • Human-robot spatial interaction

Important Dates

  • 31 August 2013 | Full/short paper submission
  • 20 September 2013 | Notification of acceptance
  • 27 October 2013 | Workshop at ICSR 2013
