- By Sebastian Anthony on December 21, 2011 at 3:44 pm
Take prosthetic arms, for example: We don’t have a clue about the calculations that occur in the brain to trigger arm muscle motor neurons, but that doesn’t stop us from slapping some electrodes onto a subject’s biceps and measuring the electric pulses that occur when you tell them to “think about moving your arm.” By the same logic, a brain-computer interface can measure what our general cranial activity looks like when we’re thinking something and react accordingly, but it can only do this through training; it can’t actually understand our thoughts. Taking this one step further, Sheila Nirenberg of Cornell University has been trying to work out how the retina in your eye communicates with your brain — and judging by a recent talk at TEDMED (embedded below), it seems like she’s actually cracked it.
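To make that "training, not understanding" point concrete, here is a minimal sketch of the idea in Python. It is entirely illustrative: synthetic data stands in for recorded muscle or brain activity, and a stock scikit-learn classifier stands in for whatever a real interface would use. The point is simply that you collect labelled recordings, fit a model, and from then on the system pattern-matches new activity to the most likely intention without ever "understanding" it.

```python
# Minimal sketch of the "training" approach: no decoding of meaning,
# just pattern-matching recorded activity to labelled intentions.
# All data here is synthetic and purely illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Pretend each row is a feature vector extracted from one window of
# muscle/brain activity, labelled with the action the subject intended.
n_windows, n_features = 200, 16
X = rng.normal(size=(n_windows, n_features))
y = rng.integers(0, 2, size=n_windows)       # 0 = "rest", 1 = "move arm"
X[y == 1] += 0.8                             # inject a learnable difference

clf = LogisticRegression(max_iter=1000).fit(X, y)

# At run time, a new window of activity is mapped to the most likely action;
# the classifier has no idea what "moving an arm" actually means.
new_window = rng.normal(size=(1, n_features)) + 0.8
print("predicted action:", "move arm" if clf.predict(new_window)[0] == 1 else "rest")
```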
Now, reading the brain’s output (as in a prosthetic arm) is one thing, but feeding data into the brain is something else entirely — and understanding the signals that travel from the retina, through the optic nerve, to the brain is about as bleeding-edge as it gets. Nirenberg still used a brute-force technique, though: by taking a complete animal eye and attaching electrodes to the optic nerve, she measured the electric pulses — the coded signal — that a viewed image produces. You might not know what the code means, but if a retina always generates the same electric code when looking at a lion, and a different code when looking at a bookcase, you can work backwards to derive the retina’s actual encoding scheme.
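The talk doesn't spell out the math, but the "work backwards" step can be sketched in code. The toy below assumes a deliberately simplified model that is not necessarily Nirenberg's actual equations: one linear filter followed by a fixed nonlinearity per cell, a common textbook simplification for retinal ganglion cells, driven by synthetic data. Show the eye lots of images, record the spike counts, then regress the responses against the images to recover the filter.

```python
# Toy sketch of deriving a retinal "code" from (image, response) pairs.
# The linear-filter-plus-nonlinearity model and all data are assumptions
# for illustration, not taken from the article.
import numpy as np

rng = np.random.default_rng(1)
n_pixels, n_trials = 64, 5000

# Hidden "true" filter we pretend belongs to one retinal ganglion cell.
true_filter = rng.normal(size=n_pixels)

# Show many random images and record the cell's (noisy) spike counts.
images = rng.normal(size=(n_trials, n_pixels))
rates = np.exp(0.1 * images @ true_filter)   # linear stage -> exponential nonlinearity
spikes = rng.poisson(rates).astype(float)

# "Work backwards": regress the recorded responses against the images to
# recover the filter (least squares, essentially a spike-triggered average).
recovered, *_ = np.linalg.lstsq(images, spikes, rcond=None)

print(f"correlation between true and recovered filter: "
      f"{np.corrcoef(true_filter, recovered)[0, 1]:.2f}")
```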
Nirenberg did this until she produced mathematical equations that, with startling accuracy, encode images into neuron pulses that can be understood by an animal brain. In the image below, the far-left picture represents the pre-Nirenberg state of the art in prosthetic eyes, and the middle two images show what her prosthetic is capable of. Not quite as good as the real thing, but when you consider that this is a silicon chip implanted into the eye of a blind animal and then wired into the optic nerve, you really ought to be awestruck. In case you’re wondering, the “transducer” that the image references is a piece of hardware that converts the output from the silicon chip into signals ready to travel along the optic nerve to the brain.
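Putting the two pieces together, the pipeline the article describes (camera image in, encoder equations, transducer, pulses out to the optic nerve) might look something like the sketch below. Every function name, the rate scaling, and the pulse format are illustrative assumptions; only the overall image-to-encoder-to-transducer flow comes from the article.

```python
# Illustrative sketch of the prosthetic-retina pipeline: image -> encoder
# (stand-in for Nirenberg's equations) -> transducer -> spike trains.
import numpy as np

def encode(image, filters):
    """Stand-in for the encoder chip's equations: map pixel intensities
    to a firing rate (spikes/second) for each simulated ganglion cell."""
    return 20.0 * np.exp(0.1 * filters @ image)

def transduce(rates, duration_s=0.1, seed=4):
    """Stand-in for the transducer: turn firing rates into spike times,
    i.e. the physical pulses handed to the optic nerve."""
    rng = np.random.default_rng(seed)
    return [np.sort(rng.uniform(0.0, duration_s, size=rng.poisson(r * duration_s)))
            for r in rates]

n_cells, n_pixels = 8, 64
filters = np.random.default_rng(2).normal(size=(n_cells, n_pixels))
image = np.random.default_rng(3).normal(size=n_pixels)   # stand-in camera frame

spike_trains = transduce(encode(image, filters))
print("spikes per simulated cell:", [len(t) for t in spike_trains])
```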
You’ll note that we’ve used the word “animal” throughout, and not “human.” So far, Nirenberg seems to have carried out most of her experiments on mice — but as far as we know, the eye, optic nerve, and visual cortex in mice and humans are fairly similar. The next step must surely be working out the mathematical equations that simulate the human retina, and then full-blown human trials. Personally, as someone who is short-sighted but not blind, I would rather go down the wireless contact lens display route — but imagine, just for a second, a prosthetic retina one day being made with a higher resolution than its flesh-and-blood counterpart. Imagine if you could hit a button to digitally zoom in with your eyes — or, more likely, just think about zooming in.
Perhaps even cooler, though, Nirenberg insists that this same technique — wiring up electrodes to our sense organs and brute-forcing the encoding scheme — could also be used to produce prosthetic ears, noses, or limbs that can actually feel. Presumably, at some point, with enough data points under our belts, we might begin to unravel the human brain’s overarching communication codecs, too. The age of bionics is almost here!