The long search for a computer that can speak your mind


Picture the research setting: a woman speaks Dutch into a microphone while 11 fine needles made of platinum and iridium record her brain waves.

The 20-year-old volunteer has epilepsy. Her doctors inserted those 2 mm-long metal probes, each studded with as many as 18 electrodes, into the front and left side of her brain, hoping to pinpoint the origin of her seizures. For one research team, though, that bit of neural needlework was also a lucky break, because the electrodes sit in contact with the parts of her brain responsible for producing and articulating spoken language.

Here is the cool part. After the woman spoke aloud (this is called "overt speech"), and after the computer algorithmically matched the sounds to the activity in her brain, the researchers asked her to do it again. This time she barely whispered, miming the words with her mouth, tongue, and jaw. That was "intended speech." Then she did it once more, but without moving at all. The researchers asked her simply to imagine saying the words.

That is a version of what happens when people speak, but in reverse. In real life, one part of our brain formulates silent thoughts, another turns them into words, and still other parts control the movements of the mouth, tongue, lips, and larynx that produce audible sounds at the right frequencies. Here, the computers let the woman's mind jump the queue. When she merely thought of speaking (the technical term is "imagined speech"), the system could play back, in real time, audible signals interpolated from her brain activity. The sounds were not intelligible as words, and the work, released at the end of September, is still preliminary. But the simple fact that it happened at the millisecond timescales of thought and action shows striking progress toward an emerging use of brain-computer interfaces: letting people who cannot speak, speak.

That inability, whether from neurological disease or brain injury, is called anarthria. It is debilitating and frightening, but people do have some ways to deal with it. Instead of speaking directly, people with anarthria can use devices that translate the movements of other body parts into letters or words; even blinking can work. Recently, a brain-computer interface implanted into the cortex of a person with locked-in syndrome allowed them to translate imagined handwriting into an output of 90 characters per minute. Good, but not great; a typical spoken conversation in English runs about 150 words per minute.

The trouble is that, like moving an arm (or a cursor), the formulation and production of speech are really complicated. They depend on feedback, a 50-millisecond loop between when we say something and when we hear ourselves saying it. That loop is what lets people do real-time quality control on their own speech. For that matter, it is what lets humans learn to speak in the first place: hearing language, producing sounds, hearing ourselves produce those sounds (via the ears and the auditory cortex, whole other parts of the brain), and comparing what we are doing with what we are trying to do.


