Scientists at Washington University in St. Louis have published a new study showing that humans can control a cursor on a computer screen using thoughts of specific vowel sounds. The technique has the potential to help millions of people who have trouble speaking, whether from a speech-related disability brought on by stroke, stuttering, or epilepsy. The research, published in the Journal of Neural Engineering, suggests the technique will lead to better “brain-computer interfaces” for those with disabilities.
The study showed that by connecting a patient’s brain directly to a computer, the computer could be controlled with up to 90% accuracy – even with no previous training. The technique, called electrocorticography (ECoG), involves placing electrodes directly onto a patient’s brain to record electrical activity. Unlike EEG, which uses a cap worn on the head to pick up signals outside the skull, electrocorticography requires a surgical procedure to place a plastic pad containing several electrodes under the skull, tapping directly into the brain’s cortex (the outermost layer of the brain).
The study involved four patients who had previously undergone electrocorticographic implantation to locate the source of intractable epileptic seizures. They were asked to think of four different phonemes – “oo”, “ah”, “ee” and “eh” – while their brain signals were recorded. The results showed that the higher-frequency signals could reliably move a cursor on a computer screen.
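To make the idea concrete, here is a minimal sketch of how a few discrete, reliable brain signals could drive a cursor. This is not the study’s actual pipeline – the feature values, phoneme-to-direction mapping, and function names below are all invented for illustration – but it captures the principle of picking the imagined vowel with the strongest high-frequency response and mapping it to a 2-D move.

```python
def decode_command(band_power):
    """Pick the imagined phoneme whose high-frequency band power is strongest.

    band_power: dict mapping phoneme -> high-gamma power (arbitrary units).
    """
    return max(band_power, key=band_power.get)

# Hypothetical mapping of imagined vowel sounds to cursor directions.
ACTIONS = {
    "oo": (0, +1),   # move up
    "ah": (0, -1),   # move down
    "ee": (-1, 0),   # move left
    "eh": (+1, 0),   # move right
}

def step_cursor(pos, band_power):
    """Advance the cursor one step in the direction of the decoded phoneme."""
    phoneme = decode_command(band_power)
    dx, dy = ACTIONS[phoneme]
    return (pos[0] + dx, pos[1] + dy)

# Example: "ee" dominates the (made-up) readings, so the cursor moves left.
pos = step_cursor((5, 5), {"oo": 0.2, "ah": 0.1, "ee": 0.9, "eh": 0.3})
print(pos)  # → (4, 5)
```

A real system would extract these band-power features from the ECoG recording itself (and add a “click” signal), but the control loop – decode a discrete intention, apply it to the cursor – has the same shape.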
Dr. Eric C. Leuthardt, the lead author of the paper, from the Washington University School of Medicine, told the BBC, “For a brain-computer interface, especially for someone who is severely impaired, they need something that is absolutely, completely reliable. If you think of EEG (systems), they move, they’re susceptible to noise, and the likelihood for reliability is much lower. Just a few discrete but reliable signals – tantamount to being able to move a cursor in two dimensions and effect a ‘click’ – could lead to a vast number of applications.”
Dr. Leuthardt called the study one of the earliest examples of “reading minds.” Because the study established what microscale ECoG recordings mean and where on the cortex the relevant signals occur, future operations using this technology would need only a very small, minimally invasive implant. The surgery might require an opening of less than a centimeter to place the implant over the speech-intention area of the brain.
Dr. Leuthardt also said that they “want to see if we can not just detect when you’re saying dog, tree, tool or some other word, but also learn what the pure idea of that looks like in your mind. It’s exciting and a little scary to think of reading minds, but it has incredible potential for people who can’t communicate or are suffering from other disabilities.”
This is exciting news: even EEG “skull cap” systems, combined with the power of thought, have already been used to guide electric wheelchairs, steer cars, and play video games. This could take an “I think I can” positive attitude to amazing new heights!
Image Credit: BBC