The Project


A variety of neurological conditions can lead to severe paralysis, including classic locked-in syndrome (LIS), in which patients retain their mental abilities and consciousness but suffer from complete paralysis (quadriplegia and anarthria), with the exception of vertical eye movement control. Classic LIS results from an insult to the ventral pons, most commonly an infarct, haemorrhage, or trauma. Severe paralysis can also follow stroke, encephalitis, Guillain-Barré syndrome, and neurodegenerative diseases of the motor neurons, such as Amyotrophic Lateral Sclerosis (ALS), in which the patient gradually loses control of his/her muscles and, consequently, the ability to communicate.

Although the exact prevalence of locked-in syndrome has not been officially documented, LIS is known to be a rare condition. Nevertheless, no proven effective treatment exists, while ten-year survival rates as high as 85% have been reported.

As these patients retain their mental functions, their motor impairment and social exclusion often lead to depression and resignation. Consequently, providing even minimal means of communication and control can substantially improve the quality of life of both the patients and their families.



Our goal is to design and implement an integrated brain-computer interface for the navigation of a robot car and a wheelchair, based solely on the user’s brain signals. We envision a reliable, affordable, and easy-to-use system that can be used efficiently by patients with severe paralysis or locked-in syndrome and operated easily by a healthcare provider, without the need for an IT specialist.


Innovative application of non-invasive brain-computer interfaces

Over the last decades, researchers have focused their efforts on studying and developing Brain-Computer Interfaces (BCIs). BCIs use brain activity signals as control signals, in contrast to traditional human-computer interfaces, which usually rely on the control of peripheral nerves or muscles. These interfaces were initially developed to help patients whose neuromuscular activity is impaired by severe paralysis or locked-in syndrome to control external devices, such as computers, speech synthesis devices, or robotic arms.

Electroencephalographic (EEG) recordings are used as control signals in the BCI we are developing for the i-AMA project. The excellent time resolution of EEG, which allows for real-time applications, along with its non-invasiveness, affordability, portability, and ease of use, makes it the most popular signal acquisition choice for such interfaces.

The EEG-based BCI we are developing uses Steady-State Visually Evoked Potentials (SSVEPs) and the processing chain consists of the following steps:

  • The user focuses his/her gaze on specific visual stimuli, each flickering at a distinct frequency, which evoke correspondingly distinct brain patterns.
  • An EEG recorder continuously captures the user’s brain activity, and the recorded signals are analyzed in real time with sophisticated signal analysis algorithms in order to detect those distinct brain patterns.
  • The detected patterns are translated into device control signals (e.g. motion commands for a wheelchair), which are wirelessly transmitted to the devices of interest.
  • The device’s response provides feedback to the user, who can then adjust his/her intentions and, consequently, the control commands.
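The detection step above can be sketched in a few lines: an SSVEP manifests as increased spectral power at the frequency of the attended flickering stimulus, so comparing the power at each candidate frequency identifies the user's choice. The sketch below is illustrative only — the sampling rate, stimulus frequencies, and synthetic signal are assumptions, not the project's actual parameters.

```python
import numpy as np

FS = 256                                   # sampling rate in Hz (assumed)
STIMULUS_FREQS = [8.0, 10.0, 12.0, 15.0]   # one flicker frequency per command (assumed)

def detect_ssvep(eeg, fs=FS, freqs=STIMULUS_FREQS):
    """Return the stimulus frequency whose spectral power is highest."""
    spectrum = np.abs(np.fft.rfft(eeg)) ** 2           # power spectrum of the epoch
    fft_freqs = np.fft.rfftfreq(len(eeg), d=1.0 / fs)  # frequency of each FFT bin
    powers = [spectrum[np.argmin(np.abs(fft_freqs - f))]  # power at nearest bin
              for f in freqs]
    return freqs[int(np.argmax(powers))]

# Synthetic 2-second epoch: a 12 Hz oscillation buried in noise,
# mimicking the response to a 12 Hz flickering stimulus.
t = np.arange(0, 2.0, 1.0 / FS)
rng = np.random.default_rng(0)
eeg = np.sin(2 * np.pi * 12.0 * t) + 0.5 * rng.standard_normal(t.size)

print(detect_ssvep(eeg))   # → 12.0
```

In a real system this decision would run continuously on short sliding windows of the EEG stream, and the winning frequency would be mapped to the corresponding motion command.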

The system will implement sophisticated feature extraction algorithms in the frequency domain, as well as state-of-the-art machine learning and artificial intelligence techniques, thus allowing for the development of personalized interfaces tailored to each patient’s distinct brain patterns.
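One minimal way to personalize such an interface is to fit a classifier on a short calibration session from each user: frequency-domain features (power at each stimulus frequency and its second harmonic) feed a nearest-centroid classifier. This is a hedged sketch under assumed parameters, not the project's actual algorithm; all names, frequencies, and the synthetic data are illustrative.

```python
import numpy as np

FS = 256                        # sampling rate in Hz (assumed)
FREQS = [8.0, 10.0, 12.0]       # assumed stimulus frequencies

def features(trial, fs=FS, freqs=FREQS):
    """Power at each stimulus frequency and its 2nd harmonic."""
    spec = np.abs(np.fft.rfft(trial)) ** 2
    bins = np.fft.rfftfreq(len(trial), d=1.0 / fs)
    feat = np.array([spec[np.argmin(np.abs(bins - h))]
                     for f in freqs for h in (f, 2 * f)])
    return feat / feat.sum()    # normalize away overall amplitude differences

def fit_centroids(trials, labels):
    """One mean feature vector per class (i.e. per stimulus frequency)."""
    X = np.array([features(tr) for tr in trials])
    y = np.array(labels)
    return {lab: X[y == lab].mean(axis=0) for lab in set(labels)}

def classify(trial, centroids):
    """Assign the trial to the class with the nearest centroid."""
    f = features(trial)
    return min(centroids, key=lambda lab: np.linalg.norm(f - centroids[lab]))

# Synthetic calibration data: 5 noisy trials per stimulus frequency,
# each with a weaker second harmonic, standing in for real recordings.
rng = np.random.default_rng(1)
t = np.arange(0, 2.0, 1.0 / FS)
def synth(f):
    return (np.sin(2 * np.pi * f * t) + 0.3 * np.sin(4 * np.pi * f * t)
            + 0.5 * rng.standard_normal(t.size))

trials, labels = [], []
for f in FREQS:
    for _ in range(5):
        trials.append(synth(f))
        labels.append(f)

centroids = fit_centroids(trials, labels)   # per-user calibration
print(classify(synth(10.0), centroids))     # expected: 10.0
```

Because the centroids are fitted from each patient's own calibration trials, the decision boundaries adapt to that patient's distinct brain patterns, which is the essence of the personalization described above.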