Human-Robot Interaction Research in the Adaptive Systems Research Group, University of Hertfordshire, School of Computer Science (http://adapsys.feis.herts.ac.uk/)

@ University of Hertfordshire

 

European Integrated Project Cogniron: Cognitive Robot Companion, 2004-2007:

 

Summary of Research Objectives:

The overall objectives of this project are to study the perceptual, representational, reasoning and learning capabilities of embodied robots in human-centred environments. At the focus of this research endeavour is the development of a robot whose ultimate task is to serve humans as a companion in their daily life. The robot is considered not as a ready-made device but as an artificial creature that improves its capabilities in a continuous process of acquiring new knowledge and skills. Besides the necessary functions for sensing, moving and acting, such a robot will exhibit the cognitive capacities that enable it to focus its attention, to understand the spatial and dynamic structure of its environment, to interact with it, to exhibit social behaviour, and to communicate with other agents and with humans at the level of abstraction appropriate to the context. The project will develop methods and technologies for the construction of such cognitive robots, able to evolve and grow their capacities in close, open-ended interaction with humans. Expected results are basic methods, algorithms and architectures, their integration, and their long-term experimentation and scientific evaluation on embodied robotic systems in different settings and situations. The project involves 10 European partners and runs over four years, having started in January 2004.

The research team at University of Hertfordshire includes: Prof. Kerstin Dautenhahn, Prof. Chrystopher L. Nehaniv, Dr. René te Boekhorst, Dr. David Lee, Dr. Khenglee Koay, Dr. Aris Alissandrakis, Mick Walters, Sarah Woods, Dr. Nuno Otero, Joe Saunders, Dr. Iain Werry. The Herts Cogniron team is involved in research on imitation learning (how a robot can learn new skills from a human demonstrator), psychological studies of human perceptions of robots (e.g. how a robot should look or behave in order to be believable and acceptable to users), and social behaviour and embodied interaction (how to design and implement socially acceptable behaviour for a robot that interacts with people and performs certain tasks in a human-inhabited environment, such as a living room). In order to provide a more ‘naturalistic’ and ecologically valid environment, our recent HRI studies take place in the Robot House, located in a residential area.
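As a rough illustration of the imitation-learning theme mentioned above, the Python sketch below records state-action pairs while a skill is being demonstrated and later reproduces the skill by nearest-neighbour lookup over the recorded states. The state and action encodings, class name and usage values are assumptions made for this example; it is a minimal sketch of learning from demonstration in general, not the imitation-learning methods developed within Cogniron.

# Illustrative learning-from-demonstration sketch (NOT the Cogniron system):
# the robot records (state, action) pairs while a human demonstrates a skill,
# then reproduces the skill by replaying the action whose recorded state is
# closest to the current state.
import math
from typing import List, Tuple

State = Tuple[float, ...]   # e.g. normalised sensor readings or joint angles (assumed)
Action = Tuple[float, ...]  # e.g. target joint velocities (assumed)

class DemonstrationLearner:
    def __init__(self) -> None:
        self.memory: List[Tuple[State, Action]] = []

    def observe(self, state: State, action: Action) -> None:
        # Store one state/action pair observed during the human demonstration.
        self.memory.append((state, action))

    def reproduce(self, state: State) -> Action:
        # Return the action recorded for the most similar demonstrated state.
        if not self.memory:
            raise ValueError("no demonstrations recorded yet")
        _, best_action = min(self.memory, key=lambda pair: math.dist(pair[0], state))
        return best_action

# Hypothetical usage: two demonstrated pairs, then a query near the first one.
learner = DemonstrationLearner()
learner.observe((0.0, 0.0), (0.1, 0.0))
learner.observe((1.0, 1.0), (0.0, 0.1))
print(learner.reproduce((0.1, -0.05)))  # -> (0.1, 0.0)

A real system would, of course, also have to cope with noisy perception and with differences in embodiment between the human demonstrator and the robot, which this sketch deliberately ignores.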

 

European Integrated Project Robot-Cub: ROBotic Open-architecture Technology for Cognition, Understanding and Behaviour, 2004-2009:

 

Summary of Research Objectives:

The construction of cognitive systems cannot progress without the development of a sound formal understanding of cognition. One recurrent theme in all approaches to cognitive systems is the need for embodiment: the instantiation of a physically active agent that can explore and manipulate its environment in order to develop the ability to effect useful interaction and communication with others. Our goal in Robot-Cub is two-fold: (1) to create an open physical platform for embodied research that can be taken up and used by the research community at large to further their particular approach to the development of cognitive systems, and (2) to advance our understanding of several key issues in cognition by exploiting this platform in the investigation of several cognitive capabilities. This will be achieved through a well-specified programme of experimental research, drawing on our broad multidisciplinary backgrounds in human developmental psychology, physiology, cognitive robotics, mechatronics, and perceptual science. Significantly, the design of the Robot-Cub will be informed by this research programme and, in turn, the Robot-Cub will facilitate the modelling and development of cognitive capabilities. To achieve this goal we intend to construct an embodied system able to learn: i) how to interact with the environment, (mainly) by complex manipulation and through gesture production/interpretation; and ii) how to develop its perceptual, motor and communication capabilities for the purpose of performing goal-directed manipulation tasks. The embodied cognitive system (Robot-Cub) will be shaped, physically and mentally, like a human child and will be designed as an “open system” to be shared by scientists as a common tool for addressing cognition and the human-machine interface. This objective will therefore be achieved by jointly designing the mindware and the hardware (the body of Robot-Cub), with the goal of providing the scientific community with a set of physically instantiated tools indispensable for the study of cognition. The project involves 10 European partners and runs over five years, having started in September 2004.

The Herts team is mainly involved in studying the social interactions of a robot with its environment. Specifically, we design algorithms that allow a robot, as it “grows up” and interacts with its environment, to build up interaction histories that enable it, for example, to recognize different interaction partners, including other robots or humans. Such a robot would be able to behave differently towards different interaction partners, based not only on the current context and the immediate past but also on interaction histories that allow it to generate predictions, which can influence, for example, whether and how the robot will interact socially. This work is inspired by dynamical and developmental models of human memory and interaction. One of the robots we are using to study interaction kinesics and synchronization in human-robot interaction (which play an important part, e.g., in child-caretaker interactions and in children’s development of social cognition) is KASPAR, developed by our team. The Herts research team includes: Prof. K. Dautenhahn, Prof. C. L. Nehaniv, Dr. R. te Boekhorst, Assif Mirza, Michael Blow, Lars Olsson and other researchers.
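To make the idea of an interaction history a little more concrete, the sketch below stores labelled windows of past sensorimotor experience and matches the robot's current experience window against them to guess which partner it is interacting with. The sensor encoding, window length, distance measure and labels are assumptions made for this illustration; they are not the architecture or metrics developed in the project.

# Illustrative sketch of an interaction history (NOT the project's architecture):
# the robot stores short windows of past sensorimotor experience, each tagged
# with the interaction partner, and compares its current experience window
# against this history to recognize who it is interacting with.
from collections import deque
from typing import Deque, List, Tuple

Experience = Tuple[float, ...]  # one time step of (assumed) sensor readings

def window_distance(a: List[Experience], b: List[Experience]) -> float:
    # Sum of per-time-step Euclidean distances between two equal-length windows.
    return sum(
        sum((x - y) ** 2 for x, y in zip(step_a, step_b)) ** 0.5
        for step_a, step_b in zip(a, b)
    )

class InteractionHistory:
    def __init__(self, window: int = 3) -> None:
        self.current: Deque[Experience] = deque(maxlen=window)
        self.episodes: List[Tuple[List[Experience], str]] = []  # (experience, partner)

    def sense(self, reading: Experience) -> None:
        # Append the latest sensor reading to the current experience window.
        self.current.append(reading)

    def remember(self, partner: str) -> None:
        # Commit the current window to the history, tagged with the partner.
        self.episodes.append((list(self.current), partner))

    def recall_partner(self) -> str:
        # Return the partner label of the most similar remembered episode.
        if not self.episodes:
            return "unknown"
        current = list(self.current)
        _, label = min(
            ((window_distance(episode, current), partner) for episode, partner in self.episodes),
            key=lambda pair: pair[0],
        )
        return label

# Hypothetical usage with two-dimensional "sensor" readings.
history = InteractionHistory(window=3)
for reading in [(0.9, 0.1), (0.8, 0.2), (0.9, 0.1)]:
    history.sense(reading)
history.remember("partner A")
for reading in [(0.1, 0.9), (0.2, 0.8), (0.1, 0.9)]:
    history.sense(reading)
history.remember("partner B")
for reading in [(0.85, 0.15), (0.9, 0.1), (0.8, 0.2)]:
    history.sense(reading)
print(history.recall_partner())  # -> "partner A"

The point of the sketch is only to show how remembered experience, rather than the current stimulus alone, can shape how the robot responds to an interaction partner.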

Aurora project (Autonomous Robotic platform as a Remedial tool for children with Autism, since 1998):

 

The project systematically investigates the possible use of robots in autism therapy. In particular, we study whether and how interaction skills can be taught to autistic children. So far we have been using two types of robots: mobile robots and a small humanoid robotic doll. The different robots facilitate different types of interactions. The general setup of the trials is very playful: the children are not required to solve any tasks other than playing, and the only purpose of the robot is to engage children with autism in therapeutically or educationally relevant behaviours such as turn-taking and imitation. A key issue is that the children proactively initiate interactions rather than merely respond to particular stimuli. The chosen setup is social, i.e. it involves not only the robot and the autistic child but can also include other children, the teacher or other adults. This social scenario is used by some children in a very constructive manner, demonstrating their communicative competence. In our work we analyse the different types of interactions that we observed in a series of trials in which children with autism played with the robots, including trials with pairs of children. In a comparative study we examined how autistic children interact with a mobile robot as opposed to a non-robotic toy; our findings include evidence that the children paid more attention to, and directed more eye gaze towards, the robot. We have shown in a longitudinal study, in which children were exposed many times to a small humanoid robot, that the robot is able to encourage basic interaction skills, such as imitation of body movements. We have also shown that the robot can facilitate joint attention in a setup involving a robot, a child with autism, and the experimenter. In a recent study in which we investigated the children’s responses to strangers, we showed that their initial responses were significantly more social when interacting with a robot dressed in plain clothing, or a human dressed as a robot, than with a robot with many human-like features or a human stranger in normal clothing. Other work with typically developing children investigates how a robot can recognize different styles in how children interact with robots. Related work studies how a computer-based system can assess and assist children with autism in their development of narrative skills, which are important, e.g., in social communication. Our current research investigates, among other things, whether these encouraging results can have a therapeutic or educational impact on children with autism, i.e. whether a robot is really able to help children with autism improve their social skills (something we have not yet shown). The research team includes the following Adaptive Systems group members at University of Hertfordshire: Prof. Kerstin Dautenhahn, Ben Robins, Dorothée François, and Megan Davis, as well as other members at University of Hertfordshire and other institutions; cf. www.aurora-project.com for a description of the research team.
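As a very rough illustration of the style-recognition work mentioned above (recognizing different styles in how children interact with a robot), the sketch below maps simple summary statistics of a robot's touch sensors onto coarse style labels. The features, thresholds and labels are invented for this example and are not those used in the Aurora studies.

# Invented illustration of interaction-style recognition (NOT Aurora's method):
# summarise a stream of touch-sensor activations and map the summary onto a
# coarse style label that the robot could use to adapt its behaviour.
from statistics import mean
from typing import Sequence

def classify_interaction_style(touch_strengths: Sequence[float],
                               touches_per_minute: float) -> str:
    # touch_strengths: normalised activation values in [0, 1] (assumed encoding).
    if not touch_strengths or touches_per_minute == 0:
        return "passive"    # the child barely interacts with the robot
    if touches_per_minute > 20 or mean(touch_strengths) > 0.7:
        return "forceful"   # frequent or strong contact
    if touches_per_minute > 5:
        return "playful"    # regular but moderate contact
    return "gentle"         # occasional, light contact

# Hypothetical readings from one play session.
print(classify_interaction_style([0.2, 0.3, 0.25], touches_per_minute=8.0))  # -> "playful"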