Poor industry, since the launch of GPT 3.5 it has been stunned and shocked almost daily
😂😂😂
I am the industry... and I have no more nerve endings
I am the industry and I want MOAR
Thankfully a shockingly large amount of compute is being used to train AI therapists to deal with this stunning issue. It will be a game changer!
@@gareth4045 Indeed, and that's the only way after all the shocking and stunning that the industry won't go insane.
Imagine when AI doesn't just solve problems but actually discovers problems we didn't know we needed to solve. Now that's discovery on another level.
is there anything in the last 2 years that has not "just stunned the entire industry"?
Every Intel press-release
Yeah sure. A lot of things only SHOCKED the entire industry.
All that turning and the industry should be just going in a circle
I wonder if he wears that jacket to bed
Marvel movies
Plot twist: The jacket was being rendered in real-time by a Blackwell GPU in his back pocket.
No, it is made out of them
Matthew you are doing awesome with your videos keep it up brother!!
I believe that one day I shall convince AI to help me build a time machine, so I can go back to the 90s and buy a ton of NVDA stock. "Help me help you!"
with my luck i would cause a butterfly effect ripple that would make that one ingenious nvidia developer get hired by AMD instead.
don't you think it was already done? LOL
If that happened in future we would be seeing the people from future doing this in present
@@jagatsimulation not if it would open a new timeline/worldline (aka thread in the simulation) each time you maneuver within time, to avoid paradoxes.
@@kliersheed T F are you saying
We will soon be able to tell a game agent to create a game world with a story-line that we describe to it. We may give it historical or well known characters to influence its avatar creation process. In minutes we will be able to enter the game world that we only imagined a few minutes prior.
The real world. We're going to create multiple competing new real worlds. Life is now a video game.
or even better, let it read a book and create a movie
Just tell your life story to the AI and then live it again 🤓
Only if you using the correct pronouns or be punished by your leftist masters..
Nah. We will become too lazy. We will have an agent to tell another agent what to do
Jensen is like a rock star among us nerds.
He is the opposite of Musk. A doer and not a bullshit talker
Yeah, if only Musk had been able to build a new car company from scratch and blow away every US automaker. Or revolutionize space travel and take over 80% of the global launch business.
Like you said, all BS, no action 😂
This guy gets it
@metafa84 opinion with nothing to back it up. Rude as well.
How do you think the digit will perform up against a mac m4 with 128 gb RAM?
🤯LOVE, LOVE, LOVE the idea of a "mini home AI supercomputer". Wow!! Would love to get my hands on one of those. 😍
"Do you like my jacket?"
"uhhhh....noooo !"
😅
It's a cybertruck equivalent of all jackets.
Like why ask that question?
You have changed, Jensen, you have changed. Fancy jackets now? The old jacket was a classic.
Morpheus jacket, The Matrix.
The jacket thing is kind of stupid. He looks like Fonzie from Happy Days. Why do you want to look like a gay biker from the '50s so badly?
Great video! How to pre-order a mini super computer?
In terms of data creation, stop thinking traditionally. We just spent the last 10 years deploying 5G throughout the world. That means all the physical spaces have yet to be really contextualized as data points. We have a job to do, and it's going to get down to the quantum level. We haven't even started.
quantum these nuts
@@AddyEspresso We'd have to see them
@@AddyEspresso At least you advertised their size appropriately
so basically we have an infinite amount of data to feed models if we can just get it piped to them all the way down. makes sense.
"Bodies and minds will be the two big products in this next wave of industrial revolution" - Yuval Harari at 24:33 in the video 'Ewe Schal Rise'
It generates 9 out of 10 pixels, not 9 out of 10 frames. It only generates 3 out of 4 frames, but the 1 rendered frame is also upscaled, meaning only about half or less of its pixels are rendered, and that's how you arrive at only 1 of 10 pixels overall being rendered and the other 9 generated. At least that's what I understood from the presentation.
It also makes me wonder about their claim that it's two times more powerful than the previous generation, by which they probably mean 2 times the framerate. But if the previous generation turned 1 rendered frame into 2 total and the new generation turns 1 rendered frame into 4 total, and the total number of frames per second doubled, doesn't that mean the number of honestly rendered frames per second didn't actually change?
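The pixel arithmetic above can be sketched in a few lines (the exact native-pixel fraction of the base frame is an assumption here, not a figure from the keynote):

```python
# Back-of-envelope check of the "1 in 10 pixels rendered" claim:
# 1 of every 4 output frames is traditionally rendered, and that frame
# is itself upscaled, so only a fraction of its pixels are native.

def rendered_pixel_fraction(rendered_frames: int, total_frames: int,
                            native_pixel_fraction: float) -> float:
    """Fraction of all displayed pixels that were traditionally rendered."""
    return (rendered_frames * native_pixel_fraction) / total_frames

# Assuming ~40% of the base frame's pixels are natively rendered:
frac = rendered_pixel_fraction(rendered_frames=1, total_frames=4,
                               native_pixel_fraction=0.4)
print(frac)  # 0.1 -> 1 of 10 pixels rendered, the other 9 generated
```

With a 40% native-pixel assumption the numbers line up with the "1 in 10" figure; a different upscaling ratio would shift it slightly.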
The claim could encompass other improvements:
The new AI models in DLSS 4 operate 40% faster and use 30% less VRAM than their predecessors.
DLSS 4 introduces transformer-based AI models, which may provide better image quality and stability.
The RTX 50 series GPUs have hardware improvements that contribute to overall performance beyond just frame generation.
Therefore, while the number of traditionally rendered frames might not have changed, the overall system improvements and efficiency gains could justify the "two times more powerful" claim.
It might be true, but why would they care about the honestly rendered frame? They see these AI generated frames as the future and want to push this aspect as far as they can. It doesn't matter to them what gamers want. The generated pixels are mostly ok so average gamers wouldn't mind, and it is only going to get better in each version. I wouldn't be surprised if the DLSS 6 would be even better than the honest render.
@@13thxenos
>why would they care
Well, I do. Because as an average gamer, my monitor only shows 75 frames per second, so I don't care if they're gonna turn 60 frames into 120 or 240. And I don't feel like rendering 20 to turn them into 60 is gonna work out well.
@@HanakoSeishin I know you do, I and a lot of others do to. I'm just saying that nvidia clearly don't and won't in the future. This is the path they chose to go.
@13thxenos yeah totally, the "pure gamers" will notice because they are such "pros" 😄😯🫨😵💫
Are you still testing out new llms? Llama 3.3? DeepSeek-V3?
Yes!
@@matthew_berman Are you really really sure?
So, is that good? 😮@@matthew_berman
Deepseek is pretty decent he managed to beat out Claude on a programming challenge of mine
Jensen is the Architect from the Matrix 😎
I have often worried that humanity relies too heavily on society's charlatans, but now I feel optimistic that scientists are firmly guiding our future.
Is that sarcasm or optimism? I think both are true, to the extent that even the charlatans can't tell whether they are scientists or not.
if that is what you come away from this video with you might have a mental deficiency
By charlatans you mean Elon Musk and Vivek Ramaslimey? 😅😮
@@pdcdesign9632 Fauci, Gates and yes Elon and Vivek!
Give me an AI tech that can change the inside of your house to look like a tropical beach, but your outside still looks like a house.
A holodeck?
@@rocketPower047 I would love a holodeck but this is more of a tech that changes your entire area into a scene. Tropical beach? a snowstorm in the woods? a spaceship? a solarpunk city scene (like tomorrowland). Just sit back on your couch and relax to whatever scene you picked.
Sounds like 'Ready Player One' VR tech (or yes, a holodeck). I think I'd prefer an actual tropical beach at this point. I'm getting pretty tired of screens...
I know a Polish builder who can do that for you. Very good rates!
The shift to AI-driven computing is fascinating, but what about job displacement in tech? How do we prepare for that?
*AI power problem: Solved*
Project Digit is crazy, I figure 25-30x more powerful than Apple’s best M4 Max chip.
He mentioned agentic workflows, a lot of companies will buy these to run their “digital employees” locally. $3k for 24/7 production inference on advanced models is *dirt* cheap.
Price comparison: A month’s worth of 24/7 fp4 inference on an H100 costs about ~$1,800. So one of these little boxes would pay for itself in 6-8 weeks.
A much bigger point: AI datacenter deployment is already power-constrained. OTOH, thousands of companies could run a stack of these in a spare room and hardly notice the blip in their electric bills.
*This is an end run around the AI power problem.*
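The payback arithmetic in the comment above checks out, sketched here (both the $3,000 box price and the ~$1,800/month H100 figure are the commenter's estimates, not official pricing):

```python
# Rough payback period for a local inference box vs. renting an H100.
box_price = 3_000     # USD, claimed Project DIGITS price
h100_monthly = 1_800  # USD/month, commenter's estimate for 24/7 fp4 inference

payback_months = box_price / h100_monthly
payback_weeks = payback_months * (52 / 12)  # convert months to weeks
print(round(payback_weeks, 1))  # ~7.2 weeks, consistent with "6-8 weeks"
```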
I don't know Nvidia to be a hype company (i truly don't as I don't follow much of the tech space), so this all seems sincere to me. In that case, that little mini computer is a MUCH bigger deal than he's making it out to be. Like, it was treated like a footnote, but that's a powerful f****** machine if it can do what he's claiming. Scary powerful.
I could put my entire data on one of those supercomputers and run my business on my local language model without having to worry about my data getting lost or gobbled up by AI? If so, sounds good. Many businessmen are worried about losing proprietary data to AI.
Those DGX "supercomputers" are small enough to fit in the body of a robot... How much power do they use for inference and training in the worst-case scenario? And how much do they weigh?
Jensen Huang is amazing! :) He inspired me to continue learning about generative AI two years ago, even when my boss told me to stop "playing" with AI tools. :)
Good for you. Your current job probably won't exist in 5-10 years, and your boss won't give a crap when he has to can you. So get ready for the next thing.
[+ @OhHenGee1796 ] ...and with that kind of attitude, your boss will be getting his own pink slip in about 18 months, so you might as well begin considering new decorations for when you take over his office. Or better yet, go find a better company that is happy to have someone who's been "playing" with AI for 2 years!
Great overview! Please do a review of your Project Digits AI when bought and installed.
Very cool stuff, can’t wait to see wonders in 2025. Very good video.
I guess all the demo videos of robots have the audio dubbed in. I realise it was about a dozen of them but I had no idea they made so much noise when they moved.
It’s almost like the real rendered frames become a controlnet of sorts.
The jacket....no.
He could have used new AI materials?
...spare the croc......
Design with AI ?
this video is a perfect candidate for chapter markers
Great summary, thank you
If they have Jetson why not name the personal AI pc Rosie
Because Rosie has 5 stepsisters ready and willing to milk you for all that you are worth. Sometimes you have to have post clarity on the cost of these systems.
Optimus WASN'T actually one of the robots on stage which is kind of weird.
Tesla has been sandbagging on Optimus, don’t worry, it’s further along than most people think.
@JoePiotti Ya Optimus 3 should be out by end of month and pretty sure they are already working some smaller tasks fully autonomous at the Tesla factory
Exciting times!
Shocking leather, stunning jacket
Great content!
How many DIGITS do I need to run 4o?
23:42 no Optimus in the lineup
Could be because Tesla designs their own chips; they've got a supercomputer called Dojo.
I know this is recent, but it looks so much like last year's. Especially the robot lineup, where last year out came the Disney droids.
This video is so interesting; every second is worth looking forward to!
I started investing in cloud GPUs one year ago, and thank god I did.
NVIDIA's 50 series GPUs look impressive! 2025 could be transformative with these AI advancements.
I'm working on building an entire coding team of Agents Architect/PM/QA/Frontend Dev/ Backend Dev
8:12 I'm curious about the claim that there's no more data to train models. Wouldn't new data sources like ocean sensors and the James Webb telescope provide endless opportunities for model improvement?
Is there any response to "digits" from the rest of the hardware world? aka intel, amd, etc.
I'm sure we'll see stuff as they catch up. Maybe they're working on their own... or firing staff once they realized nvidia smoked them.
Either way, competition is good.
Looks like Palantir is getting some competition coming with Omniverse and Cosmos.
Wow! He's certainly created some work for you. I can think of about 7 or 8 vids for you to cover. Will you try out the Jetson Nano and give a review? May will be interesting with the Project Digits release. Will you be looking at the models NVIDIA has released?
Where can you buy that jacket, tho?
Companies like Samsara or PeopleNet have millions of hours of footage watching commercial truck drivers, local and long haul. I remember in 2005 PeopleNet watched a group of us truck drivers for about a year. They said it was for future automated drivers, or information they could sell for driverless vehicles.
Are TOPS now measured in FP4? Is 8x FP4 the same as one FP32 in complexity?
Once you connect a camera to the model, it can get original data. Imagine what other sensors could gather real-world data.
Interesting bit about the "AI rendering games in realtime" .. would this not mean that every first person game that a player plays is a unique environment? .. ie. if you play the same game 100 times then the world around your avatar will never be exactly repeatable!
They can have memory to keep the scenes the same, and anyways its only following how the game tells it to, so the world is already built but running on ai
@JayJay @shirowolff9147 Could either of you two gamers help me with a genuine query please..... with these digi-boxes, if I developed a game with AI, a universe, could I drop current news and events into that game, for example drop in today's newspaper for players/actors to read? Sorry to sound dense but I'm a learner.
@@shirowolff9147 it's possible there will be minor differences, but not material differences. Like who cares that the rock on the ground was shifted 2" to the right?
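The "memory keeps the scenes the same" point in this thread boils down to determinism: if the generator is seeded from saved game state, the scene is reproducible across play sessions. A toy illustration (hypothetical sketch, with ordinary pseudo-randomness standing in for a neural renderer):

```python
import random

def generate_rock_positions(world_seed: int, n: int = 3):
    """Place n rocks deterministically from a seed saved with the game world."""
    rng = random.Random(world_seed)  # seeded from persisted game state
    return [(rng.randint(0, 99), rng.randint(0, 99)) for _ in range(n)]

# Two play sessions loading the same saved seed see the same scene:
print(generate_rock_positions(42) == generate_rock_positions(42))  # True
```

A real AI renderer conditions on much richer game state than a single integer seed, but the determinism argument is the same: same inputs, same world.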
The price is $3,000 for Digits. Also there was no Optimus robot on stage.
This is evolution theory played out digitally. Survival of the most useful.
As your channel continues to grow, you will need to drop those investments to be an impartial source.
What kind of timeline until we’re seeing AAA games being rendered? Any idea?
How do I invest in CREW?
Thank you.
To be honest, I was expecting that by now we would at least have DevOps assistant AIs. It's different from programming, where you sometimes have hundreds of files of code; infrastructure setups, monitors, etc. don't have that. I mean an AI that is an expert in Kubernetes, Docker, hypervisors, automation, monitoring and log tools, etc. Some tools like OpenShift or Datadog have AI integrations, but they are rather a joke. I assume it is because of the lack of computational power, and because models are usually trained as general chatbots. What do you think?
Matthew, I would argue your agent thesis will come into play, but not soon; right now it's too expensive to run and to develop compared with traditional software.
How would this mini super computer be the same price as a 5090 GPU, but have more vram?
So is the AI computing done on the GPU itself? I thought AI took huge amounts of processing and power so how is this achieved on such a small scale?
Because AI servers can do much more and are online for billions of people, while a graphics card serves only one person, so it's easier.
@@shirowolff9147 Got it, thanks.
Fully agree with Matthew. I still can't understand why so many "AI vloggers" keep pushing the idea of AI Agents when there's only data and "a layer of interface/AI" on that.
Why splitting it into gazillions of pieces (agents)?!?
Because that's what it's called? Just like there are different cars but you call them cars; you don't call them by every part they're made of. It's really simple, bro.
jensen is the goat. 🐐💚🖤
he is designing the future for all of humanity. can you imagine being him? started as a dennys bus boy? i recommend _the nvidia way_ by tae kim. its new. it’s excellently done and jensen’s real life story is absolutely amazing. 🙏🤖📈🇺🇸
Ai soon rules in the galaxy!
I don't see how the Digits device will run a 200B parameter model. The VRAM is 128GB. It cannot hold the 200B model in memory. Inference will be slow.
Quantization
70B models run competently on 24GB GPUs, do the calculation.
Yes only Quantized models will run at the 200B size
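The memory math behind this thread is simple, assuming the usual bytes-per-parameter figures (fp16 = 2 bytes, fp4 = 0.5 bytes) and ignoring activation/KV-cache overhead:

```python
# Model weight footprint: billions of parameters times bytes per parameter.
# 1 billion params * 1 byte = ~1 GB.

def model_size_gb(params_billions: float, bytes_per_param: float) -> float:
    return params_billions * bytes_per_param

print(model_size_gb(200, 2.0))  # 400.0 GB at fp16 -- does not fit in 128 GB
print(model_size_gb(200, 0.5))  # 100.0 GB at fp4  -- fits in 128 GB
```

So a 200B model fits only at 4-bit quantization, which matches the replies above: quantized models run, full-precision ones don't.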
11:38
i think this slip up is because behind the scenes these people are interacting with agentic AI more than humans these days...
he's so used to talking to Cosmos or whatever lol
When it comes to the training data, won't the quality be a race to the bottom using synthetic data? It will be similar to compressing the same file multiple times without reducing the file size on disk! Data quality will need to be checked and validated. Who will own the data once this has been done? Going from public to private data sources will eventually make it more of a pay-to-play model, sooner than we think!
Why's jensen gotta be the focus of every slide?
It's an automatic buy for me. This is the only product that interests me so far at CES.
Amazing. I'm also at CES. Any chance to say hello?
can ai help me test my web3 app to check for bugs and UI/UX quality ?
Lol. What's the point of web3 now?
Not much difference for the eyes; you won't be able to see the difference in consumer products, but it may be beneficial for biomedical research, healthcare, and the safety industry. However, Nvidia is famous for pricing its incremental product reveals. Will the cost of healthcare continue to increase, as healthcare professionals advise patients with evidence-based healthcare decisions?
Mat - FE or third party 5090?
Why haven't they showcased more 8-bit AI video games?
Now we have to do this locally with open models, and then we can crush big tech
You wish 😊
Under that curve of the AI dynamics, I am missing another one illustrating the need for human presence in the AI world. We may just experience the most powerful virus created which can come up with its own directives and execute them. The last step is to establish a presence in robots, replacing the need for maintenance workers and human management.
Didn't know that Crocodile Dundee is trending again
Where's the patient care?
Re: World Model -- I think that Meta (verse) was trying to do that at the social level
I just wish i had dropped more than 500 into nvidia last November. If only i had dropped everything i had...
27:33 technicians explaining the device before the debut. ruclips.net/user/shorts8PwYQRaLTUI?si=S5IVKhBSJaxMWjuR
Any hope will see O4 or Orion by the end of the year?
I think not. They keep the models closed for at least a year (both GPT-4 and o1) before announcing them.
GPT-4 was demoed to Microsoft under NDA before ChatGPT was released, and the Microsoft research was published a few months after GPT-4 was announced.
Recently I watched an interview with a guy from OpenAI, and it slipped out that o1 was ready a few months before they fired Sam Altman.
Now they are talking about o3, but what is in training and in preview for select customers?
What we see from all these companies are the stripped down models that can serve hundreds of millions of subscribers.
AGI is locked in a basement somewhere.
Theoretically, it's plausible that AI can generate higher-quality data than humans in many ways. A coach can teach a player to get better even though the player can easily outdo the coach... and so on.
How could you not get excited for cheap compute!!! I've already been using the Nano to create compute nodes on the network, mostly for processing, or rather transcoding, video. So between Mac Minis being cheap and things like this, cheap super-compute is within reach.
Im blown away
By comparison, Hewlett Packard Enterprise "El Capitan" achieves 1.742 exaflops. But it costs $600 million
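A crude dollars-per-petaflop comparison, with a big caveat: El Capitan's 1.742 exaflops are FP64 HPC compute, while the roughly 1 petaflop figure cited for Project DIGITS is low-precision FP4 AI compute, so the numbers are not apples-to-apples:

```python
# Price per petaflop, mixing precisions (FP64 vs FP4) -- illustrative only.
el_capitan = {"pflops": 1742, "price_usd": 600_000_000}  # FP64, HPE/LLNL
digits     = {"pflops": 1,    "price_usd": 3_000}        # FP4, claimed

for name, system in [("El Capitan", el_capitan), ("DIGITS", digits)]:
    print(name, round(system["price_usd"] / system["pflops"]), "USD/petaflop")
```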
Good job, sir, thanks for the report
Esports titles like shooters will suffer with this generative future, but for something like The Witcher, this is awesome.
Nice video 👍👍👍
I think if we are living in a simulation, then there must be a way to go back in time?