We don’t always say everything we are thinking, and most of the time we probably shouldn’t. But what if you couldn’t say anything you were thinking at all? What if your words were locked inside your brain, your mouth unable to release them to the world?

A team of neuroscientists has developed two computer programs that can reconstruct the words a patient hears by analyzing their brain activity. The team published its findings in the current issue of PLoS Biology. I wanted to read the study for several reasons, but I admit the phrase “human superior temporal gyrus” in the intro completely captured my attention.

When we hear words, the auditory system breaks the complex sounds down into phonemes (the smallest units of sound), which are then transmitted to the brain. One way to see this auditory breakdown is to follow its path to the brain and record the neural responses to the words, since each word makes its own unique pattern in the brain. In this study, researchers asked whether the spoken words could be reconstructed from those neural responses. Fifteen patients undergoing neurosurgical procedures for epilepsy or brain tumors had subdural electrode arrays (grids of electrodes that measure brain activity) implanted on the lateral temporal cortex (a section of the brain involved in auditory perception), and electrocorticographic recordings (electrical activity recorded directly from the surface of the cerebral cortex) were taken.
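The core idea, reconstructing a heard sound from electrode recordings, can be illustrated with a toy sketch. This is not the study’s actual model; it just simulates “electrodes” as noisy linear mixtures of a sound’s spectrogram features, then learns a linear map from the neural signals back to the stimulus, the same family of approach as the study’s linear reconstruction model. All names and numbers here are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions: time bins, spectrogram frequency bands, electrodes.
n_time, n_freq, n_elec = 500, 8, 32
stimulus = rng.random((n_time, n_freq))   # stand-in spectrogram of heard speech

# Pretend each electrode responds as a weighted sum of frequency bands, plus noise.
mixing = rng.normal(size=(n_freq, n_elec))
neural = stimulus @ mixing + 0.1 * rng.normal(size=(n_time, n_elec))

# Fit a linear "decoder" on the first half of the data,
# then reconstruct the stimulus from neural activity alone on the second half.
train, test = slice(0, 250), slice(250, 500)
decoder, *_ = np.linalg.lstsq(neural[train], stimulus[train], rcond=None)
reconstruction = neural[test] @ decoder

# How well does the reconstruction match the true spectrogram?
r = np.corrcoef(stimulus[test].ravel(), reconstruction.ravel())[0, 1]
print(f"reconstruction correlation: {r:.2f}")
```

In the toy setup the correlation comes out high because the simulated noise is small; with real cortical recordings the mapping is far messier, which is why the study’s 80–90% figure is so striking.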

What does the brain sound like when it talks? Weird and spectral to me. Listening to the following audio file from the study reminded me of recordings made during an episode of Ghost Hunters! Here is an audio file of the computer’s reconstruction of the spoken word:

Researchers were able to accurately guess the word 80% to 90% of the time. The study is a major stride toward systems that can generate speech for people unable to speak due to disease or injury.

BrainGate is one company that has been working on creating technology for severely disabled individuals to communicate through thought. My late husband, Stephen Heywood, was the first ALS patient to volunteer for a clinical trial with BrainGate. He had a sensor inserted into his cortex in February of 2006. Here is a picture of his little top hat, as we called it.

The sensor records the signals related to imagined limb movement. A computer system hooked up to the sensor decodes those signals into commands, such as moving a cursor. When hooked up to the system, Stephen could control the cursor on the monitor simply by imagining a computer mouse in his hand and the movements his right arm would make to control it.
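The cursor-control idea can also be sketched in a few lines. This is emphatically not BrainGate’s real decoder; it just simulates motor-cortex neurons tuned to movement direction, calibrates a linear filter on known movements, and then integrates the decoded velocities into a cursor path. Every number and variable here is a made-up illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy dimensions: time steps and recorded neurons.
n_steps, n_units = 400, 96
intended = rng.normal(size=(n_steps, 2))   # imagined (vx, vy) cursor velocity

# Pretend each neuron's firing rate is tuned to a preferred direction, plus noise.
tuning = rng.normal(size=(2, n_units))
rates = intended @ tuning + 0.2 * rng.normal(size=(n_steps, n_units))

# Calibrate the decoder on a block of known movements ...
train, test = slice(0, 200), slice(200, 400)
decoder, *_ = np.linalg.lstsq(rates[train], intended[train], rcond=None)

# ... then drive the cursor from neural activity alone.
velocity = rates[test] @ decoder            # decoded cursor velocity
cursor_path = np.cumsum(velocity, axis=0)   # integrate velocity into a trajectory

err = np.abs(velocity - intended[test]).mean()
print(f"mean velocity error: {err:.3f}")
```

The calibration step mirrors what happens in the clinic: the user imagines prescribed movements for a while so the system can learn the mapping, and only then does imagined movement become cursor control.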

Studies and companies like these will one day allow people unable to walk or communicate to do so again.
