Computers Can Read Letters Directly From Your Brain. Game Over, Man.
This is either terrifying game-over material, or the lube you’ve been looking for to stroke your cyberpunk bits with. A group of scientists daring to play god have concocted a way to read letters. Directly from your fucking brain.
By using brain scan data and a set of computer algorithms, scientists from the Netherlands were able to determine which letters a person was looking at. The breakthrough suggests it’ll soon be possible to reconstruct human thoughts at an unprecedented level of detail, including what we see, remember — and even dream.
To make it happen, researchers from Radboud University Nijmegen built a system consisting of a functional MRI scanner, shape-recognition software, and a training algorithm.
During the brain scans, the researchers pulled data from the occipital lobe, an area at the back of the brain that reacts to visual stimuli. Test subjects were told to look at a series of letters flashing on a screen, including handwritten versions of the letters B, R, A, I, N, and S. This activity would excite specific areas of the occipital lobe, whose responses the scientists extracted as 2 x 2 x 2 millimetre snippets of data called voxels.
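If you're curious what that data pull looks like in practice, here's a minimal, hypothetical sketch using the open-source nilearn library. The file names and the occipital mask below are placeholders for illustration, not anything from the actual study.

```python
# Hypothetical sketch: extract voxel responses from an occipital-lobe region
# of interest using nilearn. File names are placeholders, not the study's data.
from nilearn.maskers import NiftiMasker

masker = NiftiMasker(
    mask_img="occipital_mask.nii.gz",  # assumed binary ROI mask on a 2x2x2 mm grid
    standardize=True,                  # z-score each voxel's time series
)

# fMRI run recorded while the subject watches the flashing letters.
voxel_responses = masker.fit_transform("letters_run.nii.gz")
# -> array of shape (n_timepoints, n_voxels): one voxel pattern per scan.
print(voxel_responses.shape)
```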
They then trained an algorithm to correlate these voxels with image pixels, namely the configurations of specific letters. This allowed the system to reconstruct the image being viewed by the participant, albeit a blurry, indistinct one. The process was further refined by teaching the system in advance what letters look like, essentially giving it prior knowledge.
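For a rough feel of how that correlation step could work, here's a toy sketch in Python: a plain ridge regression that maps simulated voxel responses back to pixel intensities. Every number and array here is made up, and it leaves out the prior-knowledge refinement the researchers describe; it's just the bare "linear map from brain activity to image" idea, not their actual model.

```python
# Toy sketch of the general idea: learn a linear map from voxel responses to
# pixel intensities on training images, then reconstruct a held-out image.
# NOT the authors' model; all shapes and data below are simulated.
import numpy as np

rng = np.random.default_rng(0)
n_train, n_voxels, n_pixels = 200, 1200, 56 * 56     # assumed sizes

images = rng.random((n_train, n_pixels))             # stand-in for letter images
true_map = rng.normal(size=(n_pixels, n_voxels))     # unknown "brain encoding"
voxels = images @ true_map + 0.1 * rng.normal(size=(n_train, n_voxels))

# Ridge regression: predict each pixel from the full voxel pattern.
lam = 10.0
G = voxels.T @ voxels + lam * np.eye(n_voxels)
W = np.linalg.solve(G, voxels.T @ images).T          # shape (n_pixels, n_voxels)

# "Reconstruct" a new image from its (simulated) voxel responses.
test_image = rng.random(n_pixels)
test_voxels = test_image @ true_map + 0.1 * rng.normal(size=n_voxels)
reconstruction = W @ test_voxels
print(np.corrcoef(test_image, reconstruction)[0, 1])  # crude similarity check
```

The real study gets sharper results than a sketch like this would, largely because of that prior over what handwritten letters look like.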
“Our approach is similar to how we believe the brain itself combines prior knowledge with sensory information,” noted the researchers in a press release. “For example, you can recognise the lines and curves in this article as letters only after you have learned to read. And this is exactly what we are looking for: models that show what is happening in the brain in a realistic fashion.”
Eventually, the researchers hope to apply scaled-up versions of their model to working memory and subjective experiences such as dreams or visualizations.
Next, the researchers plan to use more powerful fMRI scanners to improve resolution, which will allow them to link images of faces to 15,000 voxels in the brain.
Read the entire study at NeuroImage: “Linear reconstruction of perceived images from human brain activity.”
[io9]