RC1: Child-robot interaction strategies. DREAM will advance the state of the art in child-robot interaction by conducting studies that use a rigorous methodology, with adequate sample sizes, stronger designs, and clearly defined inclusion criteria for participants, verified with reliable tests. ASD diagnoses will be confirmed with ADOS, and the robot will be integrated into an empirically supported programme, Applied Behavioural Analysis (ABA). The interaction studies will be driven by three distinct interventions: one focusing on joint attention, one on imitation, and one on turn-taking. These three scenarios cover principal components of therapeutic interventions, with each scenario further broken down into elements of increasing interaction complexity.
RC2: Multi-sensory data fusion and interpretation for diagnostic support. In DREAM our approach is to favour off-body sensors, preferably those embedded in the robot, since children with ASD do not like having to wear sensors, especially cumbersome ones that may agitate their sensitive skin. To ease the complexity of the sensing and data-fusion task, we transform the therapy room into a smart space environment with additional sensors that provide information which would otherwise require very advanced and possibly infeasible signal analysis. We will only consider small, inconspicuous wearable sensors if all else fails. By addressing this challenge of data analysis, modelling, and interpretation, DREAM will make several useful contributions: satisfying the need for better diagnostic tools and facilitating early, consistent therapy.
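One way the smart-space fusion step could work is to time-align the latest reading from each off-body sensor stream into a single observation per time step. The sketch below is illustrative only; the sensor names (`camera_gaze`, `kinect_pose`) and the alignment window are placeholder assumptions, not the project's actual sensor suite or fusion algorithm.

```python
from bisect import bisect_right

def fuse_streams(streams, times, window=0.1):
    """Time-align sensor streams: for each query time, take the most
    recent reading from each stream that is at most `window` seconds
    old, producing one fused observation dict per query time.

    streams: {sensor_name: sorted list of (timestamp, value) pairs}
    times:   query timestamps at which to emit fused observations
    """
    fused = []
    for t in times:
        obs = {}
        for name, readings in streams.items():
            timestamps = [ts for ts, _ in readings]
            i = bisect_right(timestamps, t)  # readings[i-1] has ts <= t
            if i and t - readings[i - 1][0] <= window:
                obs[name] = readings[i - 1][1]
        fused.append((t, obs))
    return fused
```

A stale stream (e.g. a tracker that lost the child) simply drops out of the fused observation for that time step, so downstream interpretation can see which modalities are currently available.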
RC3: Child-specific behaviour assessment. Effective child-robot social interactions in supervised autonomy RET require the robot to be able to infer the psychological disposition of the child. Such dispositions can be inferred from gaze, body, and speech behaviours. To this we can add social contingency: both the contingency of verbal interactions, in terms of intonation, prosody, or speech duration, and the contingency between the child's behaviour and the behaviours of their partners (robots) while they are engaged in a play session. Another important aspect to be considered is so-called "testing behaviour", a systematic variation of the child's activity while closely watching the interactive robot. This is related to perceiving the intentions of others and to the dynamics of imitation: role-reversal behaviours, turn taking, initiation of new behaviours, etc. DREAM will develop computational models that can assess the behaviour of a child and infer her or his psychological disposition. Endowing robots with a capability for behaviour assessment and inference of psychological disposition will allow them to act based on the child's immediate goals and intentions rather than only on explicit movements. Furthermore, such a capability will allow the robot to better predict the outcome of its own actions, to react more appropriately to the emotional, attentional, and cognitive states of the child, to learn to anticipate the child's reactions, and to modify its own behaviour accordingly.
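To make the idea of inferring disposition from behavioural cues concrete, one could combine gaze, response-latency, and turn-taking features into a single engagement estimate. This is a minimal sketch under assumed cues and illustrative weights; the actual DREAM models would be learned from annotated therapy data, not hand-tuned like this.

```python
def engagement_score(gaze_on_task, response_latency, turn_taking_rate):
    """Combine three behavioural cues into a rough engagement
    estimate in [0, 1]. Weights and scales are placeholders.

    gaze_on_task:     fraction of time gaze is on robot/shared object (0..1)
    response_latency: mean seconds before the child responds to the robot
    turn_taking_rate: child-initiated turns per minute
    """
    latency_cue = max(0.0, 1.0 - response_latency / 5.0)  # faster response -> higher cue
    turn_cue = min(1.0, turn_taking_rate / 4.0)           # saturate at 4 turns/min
    return 0.5 * gaze_on_task + 0.3 * latency_cue + 0.2 * turn_cue

def disposition(score):
    """Map the continuous score to a coarse disposition label."""
    if score >= 0.65:
        return "engaged"
    if score >= 0.35:
        return "neutral"
    return "disengaged"
```

The coarse label, rather than the raw cues, is what the robot's deliberative layer would act on, which keeps the behaviour-selection policy simple and inspectable by the therapist.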
RC4: Cognitive social behaviour for supervised autonomy. DREAM plans to remove the current exclusive reliance on Wizard of Oz (WoZ) control of robots in therapeutic settings. While full autonomy is currently unrealistic and to an extent undesirable, we will implement supervised autonomy, in which the robot user (the therapist, nurse, psychologist, or teacher) gives the robot particular goals and the robot autonomously works towards achieving these goals during the therapy session with the ASD child. This does not exclude WoZ: the robot can still rely on the professional user for interpretation and annotation of behaviours that cannot be accomplished autonomously. However, as the robot will operate autonomously for periods of up to 15-30 minutes depending on the target group, this will free the professionals' time and contribute to more consistent, and thus more reliable, diagnosis. The cognitive control architecture itself will follow current best practices in cognitive architectures, providing for a fast reactive sub-system; an embodied perceptual attention sub-system that both adapts to sensory input and actively expresses the robot's attention to the child; a deliberative sub-system that provides the mapping between the child's behaviour and the appropriate robot behaviour; and an action sub-system to produce the required orofacial actions (gaze, facial expressions), body gestures and actions, and speech. It will also incorporate a self-monitoring meta-cognition sub-system to provide real-time checks of the robot's behaviour and trigger embedded actions to seek the intervention of the therapist when the need arises.
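The interplay between the deliberative sub-system and the meta-cognition check can be sketched as a single control-loop step: the deliberative layer maps an assessed child state to a robot behaviour, while meta-cognition defers to the therapist whenever the interpretation is not trusted. The state labels, action names, and threshold below are hypothetical placeholders, not DREAM's actual architecture.

```python
def run_session_step(percept, goal, confidence_threshold=0.6):
    """One tick of a supervised-autonomy loop.

    percept: (state_label, confidence) from the behaviour-assessment layer
    goal:    the therapist-set session goal, e.g. "turn_taking"
    Returns (robot_action, state_label).
    """
    label, confidence = percept
    # Meta-cognition: if the robot is unsure how to interpret the
    # child's behaviour, escalate to the supervising therapist (WoZ).
    if confidence < confidence_threshold:
        return ("ask_therapist", label)
    # Deliberative layer: illustrative mapping from assessed child
    # state to robot behaviour, parameterised by the session goal.
    policy = {
        "engaged": "continue_" + goal,
        "disengaged": "re_engage_prompt",
        "distressed": "pause_and_soothe",
    }
    return (policy.get(label, "wait"), label)
```

Because the therapist sets the goal and receives the escalations, the professional remains in the loop exactly as the supervised-autonomy model describes, while routine behaviour selection runs without intervention.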
RC5: Ethics of robotics and ethics of human-robot interaction. The use of robots in therapeutic contexts raises several ethical issues. Some considerations are obvious, although not necessarily easy to address: the robot should be human-friendly, it should not harm humans, and it should be able to achieve the therapeutic aim for which it is designed and used (the diagnosis and treatment of autism); it should be functional and effective in this sense. Other ethical questions are particularly important in the context of the sensitive data on individual medical case histories. The degree of autonomy to be afforded to the robot is another key issue. The more autonomous the robot, the less control the therapist, parents, and robot operator/designer have over the robot-child interaction, which raises the issue of who is responsible for the robot's actions and behaviours. Autonomy also raises the problem of trust: are parents happy to leave their child 'in the hands of the robot'? Will the child trust the robot? These questions will be addressed during the development of the DREAM project.