Human Camera: Scientists Reconstruct Pictures from Brain Activity

We’re now one step closer to being able to take photographs with our minds. Scientists at UC Berkeley have come up with a way to reconstruct what the human brain sees:

[Subjects] watched two separate sets of Hollywood movie trailers

[…] brain activity recorded while subjects viewed the first set of clips was fed into a computer program that learned, second by second, to associate visual patterns in the movie with the corresponding brain activity.

Brain activity evoked by the second set of clips was used to test the movie reconstruction algorithm. This was done by feeding 18 million seconds of random YouTube videos into the computer program so that it could predict the brain activity that each film clip would most likely evoke in each subject.

Finally, the 100 clips that the computer program decided were most similar to the clip that the subject had probably seen were merged to produce a blurry yet continuous reconstruction of the original movie. [#]
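If you're curious how that last matching-and-averaging step might work, here's a rough Python sketch of the idea as the press release describes it. This is just our illustration, not the researchers' actual code: it assumes you already have the brain activity recorded for one second of a test clip, the activity the trained model predicts for each candidate YouTube second, and the matching video frames.

```python
import numpy as np

def reconstruct_second(observed, library, frames, top_k=100):
    """Blend the frames of the top_k candidate clips whose predicted
    brain activity best matches the observed activity (by correlation)."""
    # Normalize, then correlate the observed response with every
    # candidate's predicted response
    obs = (observed - observed.mean()) / observed.std()
    lib = (library - library.mean(axis=1, keepdims=True)) / library.std(axis=1, keepdims=True)
    similarity = lib @ obs / len(obs)  # Pearson correlation per candidate

    # Keep the best-matching candidates and average their frames, which
    # yields the blurry but continuous reconstruction the quote describes
    best = np.argsort(similarity)[::-1][:top_k]
    return frames[best].mean(axis=0)

# Tiny demo with made-up data, just to show the shapes involved
rng = np.random.default_rng(0)
observed = rng.standard_normal(500)              # one second of voxel responses
library = rng.standard_normal((2000, 500))       # predicted responses for each candidate second
frames = rng.random((2000, 32, 32, 3))           # the candidate video frames
reconstruction = reconstruct_second(observed, library, frames)
```

Merging the 100 best matches rather than picking the single closest clip is what gives the reconstructions their characteristic blur.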

Unlike the cat brain research video we shared a while back, the resulting imagery in this project isn’t generated directly from brain signals; instead, it’s reconstructed by blending YouTube clips that resemble what the person is seeing. The researchers are still calling it a “major leap toward reconstructing internal imagery,” though. In the future, this technology might be used to record not just our visual memories, but even our dreams!
