Systems for human-machine interaction

Augmented interaction between a user and a machine is desirable in many applications. Sometimes the machine even replaces capabilities the user has lost, as in the case of a prosthesis that helps an amputee perform a movement. Moreover, in an advanced phase of Amyotrophic Lateral Sclerosis, patients may enter a Complete Locked-In State (CLIS), losing even the residual eye movements that would permit assisted communication through eye tracking. In this case, human-machine interaction systems may help the patient communicate. We are investigating two interfaces that could decode user intention, to be used either to control an assistive system or to support at least limited communication in the case of CLIS.

  1. Brain-Computer Interface (BCI) based on EEG, processed to decode some user intentions. This requires preliminary work on the automatic removal of artefacts, followed by classification of the cleaned EEG to identify the user's intention.
  2. The pupil accommodative response (PAR), voluntarily driven by shifting the gaze in depth, is associated with an autonomic change of pupil size. We have shown that the PAR is little affected by environmental and experimental variables and could therefore be useful for communicating with locked-in patients whose visual and autonomic functions are preserved.
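
The EEG pipeline in point 1 (artefact removal, then classification of the cleaned signal) can be sketched as follows. This is a minimal illustration on synthetic data, not the group's actual method: it assumes ICA-based removal of a blink-like artefact and an LDA classifier on log-variance features, both common choices in BCI work; all signal parameters, channel counts, and the simulated "intention" labels are invented for the example.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
fs, n_ch = 200, 8                       # sampling rate (Hz), channels (illustrative)
t = np.arange(fs * 10) / fs

# --- 1) Artefact removal on a synthetic continuous "EEG" --------------
alpha = np.sin(2 * np.pi * 10 * t)      # a 10 Hz neural rhythm
blink = np.convolve((rng.random(t.size) > 0.99).astype(float),
                    np.hanning(40), mode="same") * 5   # eye-blink-like artefact
mixing = rng.normal(size=(2, n_ch))
eeg = np.c_[alpha, blink] @ mixing + 0.05 * rng.normal(size=(t.size, n_ch))

ica = FastICA(n_components=2, random_state=0)
src = ica.fit_transform(eeg)
# Zero the independent component most correlated with the blink template,
# then project back to channel space.
corr = [abs(np.corrcoef(src[:, i], blink)[0, 1]) for i in range(2)]
src[:, int(np.argmax(corr))] = 0.0
cleaned = ica.inverse_transform(src)

# --- 2) Intention decoding on labelled epochs -------------------------
# Simulated trials: class 1 carries a stronger 10 Hz rhythm than class 0.
def make_trial(label):
    amp = 2.0 if label else 1.0
    sig = amp * np.sin(2 * np.pi * 10 * np.arange(fs) / fs)
    return np.outer(sig, mixing[0]) + 0.5 * rng.normal(size=(fs, n_ch))

labels = rng.integers(0, 2, 80)
feats = np.array([np.log(make_trial(y).var(axis=0)) for y in labels])
acc = cross_val_score(LinearDiscriminantAnalysis(), feats, labels, cv=5).mean()
print(f"cross-validated decoding accuracy: {acc:.2f}")
```

In a real system the artefactual component would be identified from an EOG channel or by visual inspection rather than from a known template, and the features would come from task-relevant frequency bands of the cleaned recording.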

ERC Sector:

  • PE7_9 Man-machine-interfaces


  • Brain-computer interfaces
  • Human machine interfaces

Research groups