Be My Eyes Accessibility with GPT-4o
- Published: 12 May 2024
- Say hello to GPT-4o, our new flagship model which can reason across audio, vision, and text in real time. Featuring Andy from @bemyeyes
Learn more here: www.openai.com/index/hello-gp... - Science
GPT 4o: "You’ve reached the current usage cap for GPT-4o, please try again at 10:00 pm".
Blind guy: "fuck"
He he !
idk, they say this is for free users, so it might have a cap, but it's a long one. Very good point though
Imagine Neuralink brings back vision, only for it to be locked behind an hourly subscription
ahahahahahahahaha I'm pretty sure I will hit that cap
Hahaha, they did indicate that it's going to be free, but a vision/audio input/output stream is a huge jump in data, so I am very interested to see how they believe they can support this. Super exciting that they clearly believe it is ready to roll out to people even on the free tier over the next few months!
Helping people must be a core principle of technological progress.
as long as it also makes rich people more rich lol, do you think they're investing billions of their own money purely for the benefits of you and me?
@@chrisstucker1813 two sides of the coin
Listen, I'll tell u how this order works:
1. Money
2. Goofing off random stupid technologies that our primate brains find funny
3. Helping people
@@chrisstucker1813 but you still end up benefiting from it, and for free at that. Why complain?
@@chrisstucker1813 yeah no shit, it's a business, you think you figured out some great detail everyone else missed? how are you this naive 😂😂😂
real life pokedex, rip rabbit
Rabbit will update to the model also
Rabbit left the hole!
Cancelled mine 2 weeks ago
GPT does this nearly instantly, 100x better
@@J3R3MI6 What, add an external battery and camera?
I hope rabbit enjoyed their 2 seconds of spotlight because this will send it home.
Not just rabbit but all of the other AI assistant start-ups as well
According to the rabbit team, they are in the process of integrating GPT-4o.
@@nelsond6 Cool but why would I use it instead of a phone? Why would I need another gadget in my pocket that has the exact same purpose as an app in my phone?
I agree the rabbit one is completely dead it is 100% useless.
if a smartphone runs faster and better and you have a functioning touch screen, there is zero reason for the other.
It's like trying to reinvent the wheel, making just one spoke, and calling it the next best invention in wheels, even though it doesn't work.
@@Nigochen I agree. I will be using my phone for my AI needs. I think the rabbit is for those not wanting to use their phone because the phone is too distracting to them. I think that's why.
I’m blind and I cried when I watched this demo. This is absolutely life changing
How did you watch!! was it a mockery? :(
I am visually impaired as well. We use screen readers. We have gear and we have family.
Right? Reminds me of how I felt the first time Apple demonstrated VoiceOver. That's why I went out and got an iPhone in 2010. As somebody who relied on being able to feel the physical buttons of a normal cell phone, I thought touchscreen phones were going to completely lock us out of the technological future, but it ended up doing the opposite thanks to ingenuity. I've already been using ChatGPT for all kinds of things. The image recognition is insane
@@merajhossainpromit6152 blind people still use words casually like “watch” and “see,” for example “did you see that movie?” It doesn’t necessarily literally mean that we saw with our eyes, though most blind people do have some degree of vision, even if it’s just light perception. I mean, saying something like “I listened to a great movie over the weekend” just sounds awkward lol
Wut?
GPT-4o when a dog comes into frame: "Can I pet dat dawg??"
I get that reference! 😂
"What da dawg doin?"
Literally GPT-4o: "That's a good dog right there!"
CAN I PET DAT DAAAAAAAWG???
That's 2 videos now where GPT-4o really wants to pet that dog.
integrating this into smart glasses or something like that would be life changing
This is how AR goes mainstream. AI assistant that can see everything you see basically gives you a realtime second brain
Tony Stark Edith irl
Why though? Literally everyone that exists at this point has a smart phone
@@fahadtahir9942 If you're blind you don't want to hold up a phone with your arm continuously... it's very tiresome, plus you most likely have a cane or dog leash in one hand as it is, so it's nice to have one hand free
@@fahadtahir9942 because it's hands free. Especially for tasks that require both your hands that you want the AI to guide you through, you'll want AR glasses
What a use case
Tip of the iceberg, so exciting!
joi
1% helping people
99% helping people kill each other
@@break2262 Based on what?
@@break2262 That is every technology. Cars replaced horses, but also gave us tanks, computers helped us land on the moon, but also gave our nuclear weapons guidance systems. Satellites were originally designed purely for military use but also gave us GPS. Etc, etc, nothing new.
OpenAI gives me the same excitement I was getting from Apple back in the day. FINALLY, a company is innovating again.
Nice example
For real. True innovation this is the next leap forward. Wish I was born today so I could see where it goes
If it is even real. Remains to be seen.
Don't forget Meta. I have the Rayban Meta glasses and it can do this in a product you can already buy today.
0:42 Oh my god. The real time taxi thing is very impressive and will actually be useful to many. Amazing job. 🎉
Taxi driver is not blind, he would still stop.
@@AbdulrahmanB-ue2gn Not usually. You have to hail a taxi to get it to stop, people stand on the side of the road for all kinds of reasons. Just looks like he's taking a photo of buildings or something
The taxi part of the video is staged. You can clearly see the taxi already putting its LEFT indicators ON to make a stop. Usually taxis would zip past if you do not hail or show your hand from a few hundred meters away.
Regardless, good tech though.
I honestly find it really hard to believe it can process video data that fast. It has to be at least somewhat staged. Would be glad to be proven wrong tho
I wonder how it knows the context of the exact appearance of taxis in that region. I've never seen a taxi like that in my life. So either it has been trained on all taxis in all areas and will assume anything looking like a taxi in general is one (like an all yellow car), or it was only trained on taxis in that area.
I have no jokes for this. This is amazing!
Her Part 2 looks great!
I'd like to see her part 3 if you catch my drift eehehheee🤤
ok
Jajsjs
@@Dannnneh wtf dude
brought tears to my eyes.
From which emotion?
Me too. Today was a paradigm shift. In a year we’ll never know how we lived without this.
"I can see you have tears in your eyes. You must be feeling some strong emotions" - Some AI
Me too, because this will help the blind so much
@@TheFoxholeRUclips Some people are already wondering now how they lived without ChatGPT lol
OpenAI shocked the world today - great product release!
Man this is so impressive, glad to see a positive change in technology for blind people!
Yeah. I couldn't believe it at first. 😊😊
Holy smokes… this one made my eyes water.
You not alone🥹
the added comments like "good job getting that taxi" or "how exciting to be here" would drive me nuts
As a blind user of Be My Eyes, I can't wait until this is generally available. I am currently using GPT-4 Turbo, a text-based model, and it isn't practical for some tasks.
As a volunteer, I’ll miss the calls, but I can see this is better than me
I hope they don’t get rid of the volunteers though, there are some things that we will still need volunteers for :-)
It's scripted. Are you guys really this stupid?
I wonder when it will be available.
It's an incredible update. I talked with one of my coworkers and have been trying the new version over the last few days. I'm truly amazed by how clever it has become and how much better it has gotten in such a short period. Where will we be in just 2, 5, 10 or 20 years from now at this development speed? Wow!
20 years is a lifetime in this world
This got me emotional. This is beautiful!
Absolutely incredible. I'd been watching all the other demos without even considering how it might be used to enable the blind to better perceive the world around them. Huge smile on my face. Bravo.
I'm so impressed by this honestly. I just saw the other video of the 2 people playing rock paper scissors, and that was also very impressive.
this reminds me of the movie HER, also the voice sounds so similar to Scarlett's. Won't be surprised if it was modelled after her voice. She has such a soothing voice.
What a game changer for the visually impaired
This is what we've all wanted - AI that makes life easy, especially for those that face physical challenges.
This can be a total game changer and a useful tool for visually impaired folks, nice work openai 👍🏻
She might just be the most humble and heartwarming person there is ❤️❤️❤️😮
That's the real use of technology and technology should make our life easier that should be the end goal 😊
You guys are the best. Good job, really good job. What a cool use case
I think the only thing missing is that GPT-4o seems to only speak after the user does; for example, it does not react to things that suddenly happen on the video feed, and if you said "hold on a sec" it wouldn't ask after some time "are you still there?". But other than that, VERY IMPRESSIVE
yeah, but there is always a lot going on at once, so would it be useful to react to everything? The best case is for the AI to be intelligent enough to recognise when an important event needs a reaction, but I don't know if GPT-4 is capable of that...
Right, I was thinking about that too. It would be cool if they added an optional 'chat' mode for it, in which it also engages the conversation itself and asks questions or starts topics itself, also based on the camera input.
Imagine this was integrated into glasses which has a camera.
The idea that blind people have a better 'view' of the world makes me happy
This goes even beyond what we thought was pretty incredible watching Star Trek.
"Where are the apples in this store?"
"Right in front of you Dave"
"Ey what the f man leave my damn basket alone"
Someones gonna remake the movie HER irl aren't they😂
You better believe it. It's only a matter of time.
Just hopefully they don't leave us 😅😅
this is sad that it only has 451,000 views. this is amazing.
You know what this mean? People who don't see won't struggle as much. Good job open ai!
That seems like an exaggeration, but it definitely helps a lot.
Maybe in the future the AI could artificially recreate the environment and create vision for blind people? Maybe that would depend on what is causing the blindness, but if the problem isn't with the vision centers in the brain, then why not?
kudos to openai for curing blindness
won't struggle as much*
@@MorbiusBlueBalls Thank you for correcting me.
Meta glasses and GPT-4o, awesome combination for the visually impaired. ❤
They can already pretty much do this (Hey Meta what do you see), but 4-o has a nice speed boost. Competition is good!
A monumental game-changer for the visually-impaired!
This is definitely one of the most useful applications of the vision option of GTP-4o. For vision impaired people this is not just a little fun feature to play with, but can really improve their mobility and awareness of their surrounding. Very nice!
Yes! I'm blind but I have cognitive issues where I can't remember spatial information unless directions are written out in list form in a word doc with extraneous details, which makes it pretty much impossible for me to travel outside alone, so this might finally make it possible
@@RogueError617 Yeah I can see this technology having a lot of potential for helping people who are blind or otherwise vision impaired. But AI is also a double edged sword, being helpful on one side but also potentially taking peoples livelihoods away from them by taking over their jobs.
Maybe it could be integrated into smart glasses along with an earbud or something
I already get a lot more than you'd expect from Be My AI's current access, but it will definitely be amazing when this update goes live. I can't wait to try it!
Relying on others is very frustrating these days. I remember I was very excited when I got the description of my teacher's lecture with Be My AI. I really enjoyed how she presented by drawing a diagram and the way Be My AI explained that diagram. It was awesome! I can't wait. Imagine my teacher giving a lecture while I just point GPT-4o at the whiteboard, and ChatGPT explains her movements to me in real time, like sign language. That will be amazing.
When I got that description of the diagrams, I actually sent it to the Be My AI team, and they were very happy that I'm using the product like this. I love to do experiments. Let's see what I will do when it comes into my hands. Haha
What a wonderful use case! love it!
WHAT!?! Amazing 🤩
This actually made me tear up, we are there.
Are you guys actually cooperating with Be My Eyes? Great idea. Think more this way OpenAI. Being great is not only about impressing general public and investors! Way to go!
Insane leap for humankind 🔥
I got really emotional watching these clips, even as a person with good eyesight. The other demos are all nice dog and pony shows, but this shows how amazing OpenAI already is.
i remember watching the movie "Her" when i was a teen... this is insane it's happening so fast
The possibilities are endless. 🔥
That's what i love so much about technology, can you imagine our world in 2030
Incredible, absolutely incredible.
Honestly one of the most important and helpful usages of the tech I've seen so far. Most everything else has been more about disrupting and hurting people in its implementation.
I’m glad technologies like this are being developed
Holy tomoly absolutely crazy. I love how it can help people
Wow! Kudos to all the scientists and engineers and everyone.
What a time to be alive!
this is something u guys are doing good! 👍
Integrating this in an eye wear, will make it a real life Jarvis
This will really help me on my vacation this fall. My husband usually tries to explain to me what's going on around me, but he's not very good at it. So hopefully this new AI technology will explain my surroundings to me, as I'm going on a cruise.
I think it would also help me watching TV.
I'm amazed and shocked at the same time
RIP dogs with jobs! Even man's best friend isn't safe!🐕🦺
ahahh best comment
ChatGPT still can't warn a blind person about every uneven step, overhangs, and things like that, but the dog can't answer questions about what things look like, either, so should make an amazing combination!
they took arr jobs
@@SarcasticTruth77 the birth of a new cool partnership!
lol
Imagine if this becomes wearable, its amazing.
What a power to humanity.
I'll believe this works this good when I see and hear it with my own eyes.
This is incredible.
Now this is an amazing use case!
GPT 4o can tell all of this from looking at the sky. Amazing.
As a blind person, I'm thankful to be living in 2024, we're experiencing something that wasn't possible for centuries, feels like I'm part of a revolutionary movement
This will be the best thing that's ever happened in my life
I hope they'll roll this out very soon. I am super excited
Man, I can't even imagine what 2030 will look like 😮😍 proud to be born in the 21st century.
As a blind guy who has been using the Be My Eyes app for several years with live assistance, this is a cool development for sure. In certain safety-critical situations, or where accuracy is absolutely mandated, I think I will stick with the real human helper… But this could be super useful for less consequential things like taking a virtual look around, as they showed in the video. I don’t know if I would jump into a car because AI told me to. Knowing my luck, that would be the one time she decided to hallucinate lol
This is fantastic!
This makes me so happy
AI helps people who have learning disabilities and learning problems. Thanks OpenAI, and god, you guys are doing great things
This is what AI should be all about! Kudos OpenAI
This technology would be so cool with smart glasses
I'm telling you now, I didn't know he was blind until I saw the dog. That is NUTS.
Wow!!! Love it!!!
My brother is visually impaired and am so happy about this
Some people who are not actually blind also need this; they are so careless on the road.
openai is rlly opening the ai now
hehe
Genuinely impressive
Was that a Richard Ayoade vocal cameo? I love this so much. Even before the video got to the end and I saw it was a Be My Eyes collab, I was thinking to myself this will potentially straight up replace Be My Eyes with the instant live feedback.
When will this become available to the public?
this is insane
As a completely blind person, this gives me so much hope! Can you please provide any details on how soon this will actually be available in Be My AI or for general use? Or if it is available already, what app do we download? I am so anxious to get this application
They should build a pair of glasses with a camera and speaker like the Facebook glasses. A third party should build a small app with an API so there are no limits. You could charge a monthly fee, and I don't think this will be too costly. This will change the lives of blind people for sure. I had the idea right away when I saw the video of Greg Brockman.
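To make the idea above concrete, here is a rough sketch of the request such a third-party glasses app might send for each camera frame, assuming the common vision chat-completion message format where the image is inlined as a base64 data URL. The function name, model name, default prompt, and placeholder JPEG bytes are purely illustrative.

```python
import base64

def build_vision_request(jpeg_bytes: bytes,
                         prompt: str = "Describe my surroundings briefly.") -> dict:
    """Pair one camera frame with a spoken prompt as a chat-completion payload.

    The frame is embedded as a base64 data URL so no file hosting is needed.
    """
    data_url = "data:image/jpeg;base64," + base64.b64encode(jpeg_bytes).decode("ascii")
    return {
        "model": "gpt-4o",  # assumption: whichever vision-capable model the app targets
        "messages": [{
            "role": "user",
            "content": [
                {"type": "text", "text": prompt},
                {"type": "image_url", "image_url": {"url": data_url}},
            ],
        }],
    }

# A glasses app would grab a frame from the camera, POST this payload to the
# API, and read the model's reply aloud through the speaker.
payload = build_vision_request(b"\xff\xd8\xff")  # placeholder JPEG bytes
print(payload["messages"][0]["content"][0]["text"])
```

The per-frame cost of requests like this is what would drive the monthly fee the comment mentions.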
Wow, fantastic!!
It's gonna be a huge help for the blind people
Incredible!
This is so much more impressive than the singing feature!
What a time to be alive
this is really cool today i cant sleep
Blind people watching this video must be so hyped!
this is monumental
This company will be remembered throughout history, just like Apple
Simply amazing.
integrating this into a blind person's glasses, so they don't have to hold the phone all the time, would be a life changer. I hope they'll train (or have trained) the AI to the same level of fluency in other languages (more languages than the handful of major ones Apple typically supports)
This is beautiful
So good!