Disabled Patients Control Remote Telepresence Robots With Their Thoughts
Researchers at the École Polytechnique Fédérale de Lausanne (EPFL) have developed a revolutionary brain-machine interface to restore a sense of independence to the disabled. The research, which shows that it’s now possible to remotely control a robot from home with one’s thoughts, produced excellent results in both human and technical terms.
Nine disabled people, all of them quadriplegics, and ten healthy people in Italy, Germany, and Switzerland took part in the task of piloting a robot located at EPFL with their thoughts.
Increasing the Independence of Disabled Patients With Brain-Computer Interfaces
Brain–computer interfaces (BCIs) permit direct communication between the brain and an external computer system. BCIs are often aimed at assisting, augmenting, or repairing human cognitive or sensory-motor functions.
Participants trained with wearable non-invasive sensors capable of analyzing their brain signals. They were then able to control the wheeled robot and move it around, transmitting their thought commands in real time over the Internet from their home country. The robot shown in the picture, located in an EPFL laboratory, moved around according to the intention encoded in the remote pilot’s brain signals while displaying the face of the pilot via Skype video, and sent its video feed back to the pilot. The person at the controls was able to move the robot around and interact with people at EPFL.
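At the heart of such a system, a classifier turns brain signals into a small set of discrete commands, and the robot only acts when the classifier is confident enough; otherwise it issues no command at all, so the pilot can simply rest. The sketch below illustrates that idea under stated assumptions: the command labels, the `decode_command` function, and the 0.7 confidence threshold are all hypothetical, not taken from the EPFL system.

```python
def decode_command(probs, threshold=0.7):
    """Map classifier output to a robot command, or None if confidence is low.

    probs: dict mapping a command label (e.g. "left", "right", "forward")
           to the classifier's probability for that mental command.
    Returns the most probable label if it clears the confidence threshold,
    otherwise None (intentional non-control: the pilot is resting).
    """
    label = max(probs, key=probs.get)
    if probs[label] >= threshold:
        return label
    return None


# Example: a confident "left" is forwarded; an ambiguous reading is dropped.
print(decode_command({"left": 0.82, "right": 0.10, "forward": 0.08}))  # left
print(decode_command({"left": 0.55, "right": 0.45}))                   # None
```

In a real telepresence setup the returned label would then be serialized and sent over the Internet to the robot, which translates it into motion.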
Professor José del R. Millán, the head of a team of researchers at the Defitech Foundation Chair in Brain-Machine Interface (CNBI) at EPFL, said:
Each of the 9 subjects with disabilities managed to remotely control the robot with ease after less than 10 days of training.
Millán added that thought-operated telepresence robots could soon become part of the daily routine of people with disabilities, but only if insurance companies help finance the development of these technologies.
This video shows how telepresence robots can give people with disabilities the feeling of being home. The conclusions of the study are discussed in the June special edition of Proceedings of the IEEE, dedicated to brain-machine interfaces.
The EPFL paper, titled “Towards Independence: A BCI Telepresence Robot for People With Severe Motor Disabilities,” marks a step toward increasing the independence of people with severe motor disabilities by using brain-computer interfaces to harness the power of the Internet of Things. It analyzes the stability of brain signals as end-users with motor disabilities progress from simple standard on-screen training tasks to interacting with real devices in the real world, and presents the results of nine end-users with motor disabilities who successfully operated a telepresence robot in a remote environment. The robot can also avoid obstacles on its own, even without being explicitly instructed to, so remote pilots can take a break when they need to rest.
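This kind of shared control blends the pilot's intent with the robot's own sensing: the user supplies the high-level command, and the robot scales or vetoes forward motion when an obstacle gets too close. The function below is a minimal sketch of that idea; the distance thresholds and the `(linear, angular)` velocity convention are illustrative assumptions, not the controller described in the paper.

```python
def shared_control(user_cmd, front_distance_m, stop_dist=0.3, slow_dist=1.0):
    """Blend a pilot's velocity command with obstacle-based safety limits.

    user_cmd:        (linear m/s, angular rad/s) requested by the pilot.
    front_distance_m: distance to the nearest obstacle ahead, from sensors.
    The robot stops forward motion inside stop_dist, slows proportionally
    inside slow_dist, and otherwise passes the command through unchanged.
    Turning in place is always allowed.
    """
    lin, ang = user_cmd
    if lin > 0 and front_distance_m < stop_dist:
        return (0.0, ang)  # obstacle too close: veto forward motion
    if lin > 0 and front_distance_m < slow_dist:
        scale = (front_distance_m - stop_dist) / (slow_dist - stop_dist)
        return (lin * scale, ang)  # slow down as the obstacle approaches
    return (lin, ang)


print(shared_control((0.5, 0.0), front_distance_m=2.0))   # (0.5, 0.0)
print(shared_control((0.5, 0.0), front_distance_m=0.2))   # (0.0, 0.0)
```

Because the safety layer runs on the robot, the pilot's sparse, low-rate thought commands stay safe even over a laggy Internet link.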
The EPFL research was funded by the Tools for Brain-Computer Interaction (TOBI) project of the European Commission (FP7 Program). TOBI is an integrated project to develop practical non-invasive BCI technology that will improve the quality of life of disabled people. Non-invasive BCIs are based on electroencephalogram (EEG) signals, recorded through electrodes placed on the user’s head. The recording is non-invasive: it captures the electrical activity of the brain without interfering with it.
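EEG-based BCIs typically work by measuring how much power the scalp signal carries in particular frequency bands (for instance, the motor-related mu rhythm around 8–12 Hz changes when a user imagines moving a limb). A standard way to estimate power at a single frequency is the Goertzel algorithm, sketched below in pure Python; the sampling rate, frequencies, and function name are illustrative assumptions, not details of the EPFL system.

```python
import math


def goertzel_power(samples, sample_rate, freq):
    """Estimate signal power at one frequency using the Goertzel algorithm.

    samples:     a sequence of EEG amplitude values.
    sample_rate: samples per second (Hz).
    freq:        the frequency of interest (Hz).
    Returns the squared DFT magnitude at the bin closest to `freq`.
    """
    n = len(samples)
    k = round(n * freq / sample_rate)   # nearest DFT bin
    w = 2.0 * math.pi * k / n
    coeff = 2.0 * math.cos(w)
    s_prev, s_prev2 = 0.0, 0.0
    for x in samples:                   # second-order recursive filter
        s = x + coeff * s_prev - s_prev2
        s_prev2, s_prev = s_prev, s
    return s_prev2 ** 2 + s_prev ** 2 - coeff * s_prev * s_prev2


# A synthetic 10 Hz "rhythm" sampled at 512 Hz carries far more power
# at 10 Hz than at 20 Hz.
fs = 512
signal = [math.sin(2 * math.pi * 10 * t / fs) for t in range(fs)]
print(goertzel_power(signal, fs, 10.0) > goertzel_power(signal, fs, 20.0))
```

A real BCI would compute such band powers over short sliding windows on several electrodes and feed them to a classifier trained on each user's imagined-movement patterns.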
It’s important to note that this frontier medical application is only an early step toward high-performance, high-speed BCI technologies that will move from research labs to consumer gadgets in the next decade.
Images from EPFL and Shutterstock.