Decoding Mouse Brain Signals to See and Hear Like Them

TL;DR Summary
Researchers from the École Polytechnique Fédérale de Lausanne (EPFL) have developed a machine-learning algorithm that can decode a mouse's brain signals and reproduce images of what it is seeing. The algorithm, called CEBRA, was trained on a black-and-white movie clip from the 1960s showing a man running to a car and opening its trunk. As the mouse watched, CEBRA identified the specific frames it was viewing and generated near-perfect reconstructions of them. The research could improve how we study both human and animal brains and deepen our understanding of how the brain responds to visual and other stimuli.
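The frame-identification step described above can be pictured as a nearest-neighbour lookup in a learned latent space: neural activity is first mapped into a low-dimensional embedding, and each new embedding point is matched to the movie frame whose training embeddings it lands closest to. The sketch below is a minimal illustration of that idea using synthetic data and scikit-learn; the array names, shapes, and the kNN decoder are assumptions for illustration, not the published CEBRA pipeline.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

# Illustrative shapes (assumptions, not the study's actual data):
# one embedding vector per video frame of the clip.
n_frames, latent_dim = 900, 8

rng = np.random.default_rng(0)
# Stand-in for a learned low-dimensional embedding of neural activity
# recorded while the mouse watched the clip (e.g., from a CEBRA-style model).
train_embedding = rng.standard_normal((n_frames, latent_dim))
# Held-out repeat viewing of the same movie, with a bit of noise added.
test_embedding = train_embedding + 0.1 * rng.standard_normal((n_frames, latent_dim))

# Each time bin is labelled with the frame that was on screen.
frame_ids = np.arange(n_frames)

# k-nearest-neighbour decoder: predict which frame was shown
# from where the neural embedding lands in latent space.
decoder = KNeighborsClassifier(n_neighbors=5, metric="cosine")
decoder.fit(train_embedding, frame_ids)
predicted_frames = decoder.predict(test_embedding)

accuracy = np.mean(predicted_frames == frame_ids)
print(f"Frame identification accuracy: {accuracy:.2%}")
```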
Related Coverage
- Researchers See Through a Mouse's Eyes by Decoding Brain Signals (Gizmodo)
- Scientists can now use AI to convert brain scans into words (WION)
- Movie clip reconstructed by an AI reading mice's brains as they watch (New Scientist)
- Seeing through the eyes of a mouse by decoding its brain signals (Medical Xpress)
- Cracking the Code of Sound Recognition: Machine Learning Model Reveals How Our Brains Understand Communication Sounds (Neuroscience News)
Want the full story? Read the original article on Gizmodo.