Predict what a mouse sees by decoding brain signals
- Published: 2 May 2023
- A research team from EPFL has developed a novel machine-learning algorithm that can reveal the hidden structure in data recorded from the brain, predicting complex information such as what mice see.
DOI: doi.org/10.1038/s41586-023-06...
Altmetric: nature.altmetric.com/details/...
Full article: actu.epfl.ch/news/see-through...
Article en français: actu.epfl.ch/news/predire-la-...
Mackenzie Mathis Lab : www.mackenziemathislab.org/
EPFL: www.epfl.ch/
Additional footage: figshare.com/articles/media/P...
#brainmachineinterface #cebra #artificialintelligence #neuralnetwork #brain
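For readers wondering what "decoding" means mechanically here: per the paper, CEBRA learns a low-dimensional latent embedding of neural activity via contrastive learning, and frame identification can then be done by a simple nearest-neighbour lookup in that latent space. Below is a minimal, hypothetical sketch of just the lookup step in plain NumPy; the learned embedding (the hard part) is stubbed out with synthetic vectors, and every name is illustrative rather than the authors' code.

```python
import numpy as np

rng = np.random.default_rng(0)

n_frames = 900   # frames in the training movie
latent_dim = 8   # dimensionality of the (stubbed) embedding

# Stand-in for CEBRA's learned latents: one embedding vector per
# movie frame, computed from neural activity recorded while the
# mouse watched that frame (here: random vectors).
train_latents = rng.normal(size=(n_frames, latent_dim))
train_frame_ids = np.arange(n_frames)

# Held-out trial: latents from a second viewing of the same movie,
# modelled as the training latents plus a little noise.
test_latents = train_latents + 0.1 * rng.normal(size=train_latents.shape)

def nn_decode(query, reference, labels):
    """Predict a frame id for each query latent: pick the label of
    its nearest reference latent (Euclidean distance)."""
    preds = []
    for q in query:
        dists = np.linalg.norm(reference - q, axis=1)
        preds.append(labels[np.argmin(dists)])
    return np.array(preds)

pred = nn_decode(test_latents, train_latents, train_frame_ids)
accuracy = float(np.mean(pred == train_frame_ids))
```

With low noise this recovers most frame ids exactly. Note that the decoder can only ever output frames it was trained on, which is precisely the caveat many commenters below raise.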
Misleading to the point of intellectual dishonesty. They're not reconstructing the video, they're just trying to reconstruct the order of the frames.
Was going to comment almost exactly this. That's modern Science™ for you, it's all about the soundbite and the headline and no actual results.
I mean it seems like that’s exactly what they say in the interview… “ we are able to predict which frame the mouse is currently watching”
Please stop making reasonable comments while we're having fun s******* on these people
@@MichaelEvanspishposh Yes, I now see the phrase you're referring to at around 1:45. To me, putting that critical caveat in the latter half of one sentence 2/3 of the way through the video does not constitute an honest attempt to communicate accurately. I don't think even that caveat makes clear whether they're predicting the frame from scratch at a pixel level, from among a million test images, or from among the 900 frames in this short film. They've made it so easy to draw the wrong conclusion and so hard to draw the right one.
If you look at the associated press release ("Full article" link in the description), they play much the same game. The title is, "Predict what a mouse sees by decoding brain signals". Then they make statements like "predicting complex information such as what mice see", "they can decode from the model what a mouse sees while it watches a movie", "CEBRA can predict unseen movie frames directly from brain signals alone".
Only later do they slip in the faintest qualification, "CEBRA learns to map the brain activity to specific frames".
The incentive of a university PR department is to make research produced at that institution sound amazing, to attract glory, applicants, employees, and funding. But, when taken too far, this becomes a public misinformation campaign. Researchers should push back, not collaborate.
"High-resolution image reconstruction with latent diffusion models from human brain activity" is a paper that does the trick (for images) in humans using 7t fmri and latent diffusion
Don't be fooled by what you see in the video; listen to what she said.
What's decoded is not the entire movie but only which moment of the movie the mouse is looking at.
Just the frame number, practically. An excellent result anyway.
was the AI also fed the movie clip, and is it just predicting which pattern is being processed by the brain? Because that's a lot less impressive than what's being shown here. If the AI could analyse the brain activity and create the possible image being watched, then it would be truly newsworthy.
They said the AI is predicting which frame of the movie is being watched.
@NotTheRealDeadpool what a waste of effort
I was thinking the same thing because the car was a remarkable carbon copy. I didn't hear the frame-matching bit, which makes more sense, because the running man isn't in every frame the AI reproduced from the brain.
@@ZZZ-Hip-Hop with that data we can now more easily correlate which brain activity corresponds to which frame, and train an even bigger network that can reconstruct from the brain. I think this is just step 1.
Yeah, but I still find it amusing. Like, with this you could use a rat as a detector of some sort: train the model on the brain signals for a certain face (an FBI most-wanted, or something), then release it in an area; if it detects that face, it raises an alarm with the location. Right? Kinda cool and terrifying if it can do something like that.
amazing how the mouse never gets distracted and always keeps the video frame-perfect
you didn't watch the video, did you? It just predicts which frame of the movie the mouse is currently seeing, and also where the mouse thinks it is in the room, because the algorithm learnt what happens in the brain when the mouse is in that position of the room, from training data of filming and reading the brain simultaneously.
@@JanSeewald it's still bullshit, nothing else. A mouse brain, like every other brain on this planet, doesn't see a picture so much as feel it: the mouse can detect known and unknown objects and try to interpret their meanings, but it never sees a complete picture.
This means that when someone shows a movie to the mouse, it's not a big challenge to guess, or estimate the probability of, which frame the mouse has already seen. It's nothing new; similar studies have been going on for many years, and scientists can precisely estimate that a brain "sees" particular objects.
AI can speed up this process, but we will never be able to "see" what any brain sees, because every brain works differently.
How do they actually know that the mouse is really "watching" that movie?
Ah sweet, man made horrors beyond my comprehension
Sad that the mouse isn't even looking at the actual movie, but its environment, making it nearly impossible to discern some of the composed, non-blurred variables related to inner/subconscious thought
i do not believe the portrayed decoded video is actually real.
first, how did they get the mice to pay perfect attention to the movie?
second, doesn't the retina have lower "resolution" in its periphery?
the decoded image is both too steady and too homogeneously high-res.
It’s not real. It’s like a time stamp. The algorithm is figuring out “mouse is watching video at 1:32” and the computer is merely playing the video at that timestamp.
All the algorithm does is determine which frame the mouse is most likely watching... Don't comment if you didn't even understand the research subject.
@@enterb7643, the news video was produced in a deliberately misleading manner. Not everyone has genius-level tech experience like you and I have, so the article seems to have misled many, many folks on YouTube and Reddit. Very disingenuous of the producers.
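To make the "timestamp" point above concrete: if the decoder only outputs a frame index, the displayed "decoded video" is assembled by indexing back into the original movie, which is why it looks like a crisp copy rather than a blurry reconstruction. This is a hypothetical sketch with synthetic data, not the authors' pipeline:

```python
import numpy as np

rng = np.random.default_rng(1)

# A stand-in "original movie": 900 frames of 32x32 grayscale pixels.
movie = rng.integers(0, 256, size=(900, 32, 32), dtype=np.uint8)

# Hypothetical decoder output: one predicted frame index per time
# step, mostly correct but occasionally off (a wrong *whole frame*).
true_idx = np.arange(900)
pred_idx = true_idx.copy()
errors = rng.choice(900, size=45, replace=False)      # ~5% wrong
pred_idx[errors] = rng.integers(0, 900, size=45)

# The displayed "decoded video" is just the original frames,
# reordered by the predictions -- never a pixel-level estimate.
decoded_video = movie[pred_idx]

# Consequence: every decoded frame is pixel-identical to *some*
# original frame; mistakes appear as jumps, not as blur.
frame_exact = np.all(decoded_video == movie, axis=(1, 2))
```

Roughly 95% of frames match exactly here, and the mispredicted ones are still sharp frames from elsewhere in the movie, which matches the observation that the output is "too steady and too high-res" to be a genuine pixel reconstruction.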
great stuff, I'm just sitting here in my dark room celebrating making my first C program; all it actually does is copy another program
C or C# or C++??? If it's C, maybe Python would be a far better place to be... just saying...
@@kevinrtres I can't leave C right now, it's like starting something and not finishing it.
@@justcurious1940 I understand that feeling - however, I'd recommend taking a leap forward - Python has the basic C language structure so you'll just skip out on the difficult parts of the C language. Trust me you don't want to have to deal with the long searches for your bugs.
But, follow your heart if that's what satisfies.
@@kevinrtres My heart is telling me to keep learning C, but my brain is telling me to go and learn JavaScript or Python; I don't know which to follow. Tell me about your experience: do you use Python in your job?
@@justcurious1940 It can be hard to change course when one's mind is attached to certain aspects of one's efforts. "C" forms the backbone of both Javascript and Python so if you have already put lots of effort into C it's not a waste - it makes it easy to transition to either of those you mention.
Personally I use JavaScript a lot because I work on websites. I also use C# a lot because most of the apps I work on are written in it. I haven't used Python, but it's dead simple to learn if you have a C background.
So it depends on what direction you wish to follow. The language is merely the tool to help you get where you want to go, it is not the be all and end all. Javascript is mostly prevalent on the front end and Python can run a lot of things in the background - whilst it can also be used to supply front end interfaces if you really push it.
Just do some homework - check the web to see where each language is used the most and pick whatever environment you think you like better. Your C is not wasted so don't be too concerned about leaving it.
One step closer to the plot of Strange Days
Great movie!
@@matthewcape3183 Worth watching?
They are just associating some brain signals to a specific movie frame.
Weird that every part of the screen is in the same resolution. The mouse has very good perception 🤔🤔
because the video title is misleading. They are only predicting which parts of the video the mouse is most likely to be watching.
@@Targeting-Must-End okay, thanks
Next up: pulling vision data, like body-cam footage.
You haven't watched the TV series Black Mirror, have you?
@@CliftonPhotographer I have... not sure why you're trolling???
@@TravisKelleher they're basically saying its unethical
This system allows them to find out what frame of the movie they are watching.
It does not allow them to reconstruct the unfamiliar scene they are watching.
So no, body cam reconstruction is not possible with this.
@@M0d4l3 hopefully not yet; I hope this is step one toward making it easier to train further neural networks that can reconstruct.
OK, now I want the CEBRA plugin as an option on K-Lite Mega Codec Pack.
I believe that this intelligence used to capture the pixels could be used to retrieve images behind translucent glass.
Dream storage and retrieval system, for humans, please and thank you.
that's not what they did; they just mapped signals to a frame of the video to predict them later, but no reconstruction happened.
@@JanSeewald yeah, acknowledged. still impressive, but not mind-boggling as would be if they could reconstruct. it would be one thing to "jack" the mouse's raw optic nerve data, and another to capture the scene after some neural processing, to show what the mouse is perceiving instead of merely sensing. I suspect you'd need to be able to do the latter before you could hope to capture dreams, and that's sci-fi for now.
I'm studying this... transferring a human being's neural systems, as an algorithm, into the metaverse... but apparently I've already been surpassed
I want to record and watch my dreams.
that's not what they did; they just mapped signals to a frame of the video to predict them later, but no reconstruction happened.
@@JanSeewald But isn't this the same? They are reconstructing the video that the mouse watched. When you are dreaming, you have brain activity, so you could record those signals and recreate it.
@@carlosleoncode but if you watch this video, they clearly didn't do that, because they said so
Now do it without a reference image lol. Show it something it hasn't been trained on and let's see the prediction.
It's all an algorithm, unique to each brain, for how our visual cortex records the information. Reference frames are important to know you're on the right track.
Measurement tools that give the machine enough data to learn from are the only roadblock.
Once that data is fully available, the actual interpreting part (thanks to machine learning) would be easy.
But right now we can only get enough data from any brain to know which frame of the reference footage is going through the cortex.
Basically just enough for timestamps.
Better tools are the ONLY hiccup concerning the advancement of this type of technology.
If they put a camera on any animal, the camera gets all the images.
Cyberdyne Systems Model 101 - 800 Series Terminator...
Has begun.
A rat 1.5 m tall using a gimbal and a 4K camera.
Flying mouse aerial footage
Wow! Super. Very interesting, but what research comes next?
Shut up
But how can the computer read the mouse's brain? What is the output? I can't see any wire, electrode, nothing.
Wi-fi.
People are mad in the comments because they didn't understand it. They have some reason to be, because this video presents the information quite misleadingly. If you don't pay attention to the researchers' explanations (which requires a bit of knowledge), you'll really believe they're decoding any brain signals into real-time images, when in reality they're just finding patterns and applying what each pattern might represent.
that is scary...very scary...
Now they can learn a murderer's face or recreate crime scenes.
How do you genetically engineer mouse neurons to glow green?
CRISPR
super flying mouse
Very soon no one will have any privacy whatsoever, ever again. Completely different domains ;)
I think you misunderstand what's shown here: the AI isn't creating the video from the brainwaves, it's just able to predict which frame of the video the mouse is looking at by analysing the brainwaves and putting them in the correct order.
finally, in a near future I will be able to watch my own dreams xD
Black Mirror is real
Well, at least this technology would be very useful for forensic experts. If it were developed further, you could implant sensors in a victim's brain, apply some voltage, and see who committed the murder and how.
Whoa, holy crap :G
@@gregandark8571 there are hardly any other applications to be found for this technology. Although if dreams could be digitized, that would be cool.
They're not reconstructing the video, they're only recovering the sequence of frames that the mouse saw.
That is, not the video, just the order of the frames. Lying without lying.
Jesus said: "Having eyes, don't you see? Having ears, don't you hear? Don't you remember?" the mouse is watching but does not see. This project is worth shit
3D graph reminds me of ECU tuning in a high performance car. Makes me wonder how long until we can install biological hardware and tune it to our natural and aftermarket sensors.
Creepy
Ok... this is ✅ for artificial eyes. How about accurately reconstructing imagination into an image/slideshow/animation/movie?
They can predict the frame it is watching. Not decode the video itself.
Every time I check r3ddit there's a leap in technology. Wtf is going on
Lol you haven't noticed you've been using essentially 100+ year old tech your whole life? It's all old ideas with a little digital thrown on top to make it seem new. This is just another piece of the consciousness-assisted technology they've had locked up in the projects for 70+ years. The gov literally admitted there are UFOs, which is one step from "we've got tons of downed UFO tech", and they've stolen more than 30 trillion so far; where do you think that cash goes? Donald Rumsfeld stood up the day before 9/11 and announced 13 trillion had gone missing, and everyone forgot about it after the twin towers got hit. They'll keep trickling bits of this tech out because it doesn't benefit them to keep it locked up anymore.
I am calling some BS on this. If they were truly reading the mouse's thoughts, you would see a POV of the mouse watching the video on a screen, not an exact duplicate of the video, only blurrier. This only proves that the mouse's brain was stimulated by watching the screen. 🤔
LITTLE BY LITTLE, WHAT WE ALREADY KNEW IS HAPPENING: THEY WILL ENTER OUR MINDS WITH FAR MORE EASE AND REALISM.....
Can I get Mackenzie's number?
Very interesting. Would love to see a discussion of ethics on this.
Can you give some examples of the types of questions you would want addressed in regards to ethics and this?
@@saenzperspectivesShould employers be allowed to monitor employee’s brain feed?
What ethics? Whoever deploys it first wins the dictator-of-the-millennium prize.
@@kevinhill1575 What do you mean by "what ethics"? Ethics is one of the most important parts of scientific discussions. Prizes don't mean much. You die one day and it doesn't matter how famous you are. Only the impact remains.
@@saenzperspectives Sure! I have only recently come to learn about this vision so I would first like to admit that I don't have great questions to ask yet. I need to learn more about the project itself and see knowledgeable people discuss ethics of this project so I can come up with questions of my own. For now, I just want to ask the basics: Is it causing any prolonged suffering to the animals involved? If yes, how is it justified? How restricted will the technology be and why? Was the data sourced ethically?
These are just the first questions that come to my mind.
Now of course, there are bigger debates and questions about things like genetic engineering and using animals in such projects but for this particular research, I am not going into those. I don't believe in an absolute right or wrong and I am open to understanding deeper nuances. :)
Thank you for asking!
Whoever believed that man went to the 🌙 is now thrilled with this information 🤣🤣🤣
In other words, whoever is right
Woah. This is incredible!
The rat even flies in the scene.
Hahaha, Brazilians are everywhere 😎
This is awesome. I hope these constructs can aid in developing environmental tools for mice bred to mimic human cognitive problems.
Could we be facing a new revolution in visual communication: real-time live transmission from anywhere on the planet, without the need for a smartphone and camera?
Has science gone too far? Yes.
The best time for humanity was from the mid-1990s to 2010: we had the internet without mass social-media algorithm platforms, and technology was in perfect balance with human life.
Damn, hell yes
How do you people even fall for this? It's funny.) Such loud claims and headlines: they decoded frames from the brain! No, no, and no again. Impossible, since there's no understanding of the process. All they managed to do was breed a (genetically modified) mouse for their narrow experiments and match data from the film against the mouse's activity while it watched; then they played the film for the mouse again and could more or less determine which moment it was at. This is complete nonsense done for grants. Don't fall for it ;)
I believe there is a lot of gaslighting in these comments.
Let's settle this: turn an advanced AI loose on the problem and watch it solve, in the fastest way possible, all your naysaying...
There is an AI for everything; it's already here, and the public is just now catching on, amid propaganda either from people scared of being discovered by this technology (memory replay scares a lot of people retroactively) or from anyone else with a reason not to let the public have access to it.
I guess she doesn't know a mouse from a rat, 'cause they keep showing a rat instead of an actual mouse. 🤦🏾♀ Further along in the video, a man commenting on their data said rat instead of mouse... finally. 🧐🙄
Smells like stupidity. But that's not certain.
It's time
Rip to anyone who believes this baloney
Interesting that I get the impression the "prof" is an AI-generated presenter!!!
So if this video content is true - and it seems to be - , then how much easier is it for the God who made us to know exactly what we're thinking - right down to our motives. That should be the wake-up call that unless we have the righteousness of Jesus imputed to us, the eternal flames will be our end-destination. Make sure where the journey ends - today! because one cannot be sure to still be around tomorrow!!!
you know its bullshit when the professor is that attractive
It's not bullshit, just a bit misleading. They are not reconstructing visual data just from brainwaves; the AI is simply able to determine which frame the mouse is looking at by analysing the brainwaves. No new visual data is created here.
Fake