Don't sleep on Nvidia.
- Published: 28 Jun 2024
- While everyone's focused on @OpenAI's #chatgpt, I'm looking at a bigger AI breakthrough happening right now. A couple weeks ago, @NVIDIA held Spring GTC 2023, where they talked about their involvement in the #ai industry, powering all of your favorite AI tools from ChatGPT to Adobe Firefly and just about everything else in between.
------------------------
🐱🚀 GitHub: github.com/forrestknight
🐦 Twitter: / forrestpknight
💼 LinkedIn: / forrestpknight
📸 Instagram: / forrestpknight
📓 Learning Resources:
My Favorite Machine Learning Course: imp.i384100.net/YgYEBJ
Open Source Computer Science Degree: bit.ly/open-source-forrest
Python Open Source Computer Science Degree: bit.ly/python-open-source
Udacity to Learn Any Coding Skill: bit.ly/udacity-forrest
👨💻 My Coding Gear:
My NAS Server: amzn.to/3brqO7b
My Hard Drives: amzn.to/3aKetMi
My Main Monitor: amzn.to/3siQfPa
My Second Monitor: amzn.to/3keHT84
My Standing Desk: amzn.to/3boAcbC
My PC Build: bit.ly/my-coding-gear
My AI GPU: amzn.to/3uvmUmz
When there is a gold rush, be the one selling the shovels.
Beautiful analogy.
Calling OpenAI "ClosedAI" is based. They shouldn't be allowed to call themselves OpenAI anymore.
They're open: you can build exactly what they built. They even told you how to do it.
@@Stopinvadingmyhardware if the source code and training data aren't open source, then the program isn't open source. The weights are essential to the functionality of the program. GPT is proprietary; even Wikipedia agrees. You can't install ChatGPT in your own data center even if you have the money to run one.
"OpenAI" in the eyes of the law is a legal marketing term. It's just a name. They could call it "Your Mom's House" if they wanted to, just as how marketing words such as "natural" have no meaning.
One of my favorites was the orange flavored soda drink called "C-Plus", which actually had 0 vitamin C.
@@reck0n3r when they were founded they did make open source ai so the open is obviously referring to open source software which they no longer make. That's false advertising.
@@jacobschweiger5897 I know what you're saying. What I'm saying, and what people have difficulty understanding, is that all corporate names and product names are legal terms. The company name "OpenAI" is just that, a company name. It could even be "OpenDildo" if they wanted; it makes no difference. They have the right to change their business practices.
It's up to the customer to be aware and on the ball with how big corporations operate - lying is the rule, not the exception.
People are WAY too easily misled and distracted by words and names.
Now, if they explicitly state in their business practices that they ARE, indeed, creators of an open source product, but now it's become closed source, that's a different story.
Don't get caught up with names and marketing terminology. The very aim of marketing, if you understand its evolution over the past 100+ years, is not to tell the truth, but to deceive unwitting customers into buying products and services that they otherwise wouldn't.
Imagine how many people bought "C-Plus" thinking they were getting an orange soda that contained vitamin C. It had none. 😂 It even said in fine print on the can under nutritional info "not a source of vitamin C".
"C-Plus" was the legal product name. That's it. It's not a promise of anything. It's shady advertising, but that's what advertising and marketing is, or at least what it's mostly become over the centuries.
When a video calls "OpenAI" "ClosedAI", I know it's a community-minded person making the video.
Not that wanting money is a sin they committed. That's not the problem. Lying about it now, after all the promises that allowed it to prosper, is the problem.
But then the same person is praising Nvidia? Double standards.
@@user-cg1qy3gc5j not praising, keeping on radar. ClosedAI is no longer really community oriented but claims to be fully so, so we dunk on that. NVIDIA was always greedy, so we always dunk on them. But both have great tech, so we keep them on our radar. That's what it's about.
Nvidia took a calculated approach to the AI space and basically became THE supplier of GPUs. No one else is even close to the lead Nvidia has.
Nvidia: When there is a gold rush, sell shovels.
@@katech6020 mattocks.
Love or hate Jensen Huang, you can't argue that their hardware and software are the cream of the crop. They're killing it, and the competition, the little there is, cannot keep up.
The problem is the cost of entry into the space. Even for established companies it's a huge risk to take at this point. You have to be compatible with what is essentially a proprietary platform. Unless there is regulatory intervention, I don't think it's feasible for any company to even try.
I'd really love to see a full CUDA translation layer for AMD/Intel GPUs someday, so that a lot of AI tools aren't accelerated only on NVIDIA GPUs.
It's already happening; the top frameworks are moving away from CUDA. There's Triton, HIP, and the Intel one (I don't remember the name).
@@LtdJorge But do they require modifications that have to be made by the user? Or is that not necessary? Because I like playing with AI, but I don't wanna dive into complex coding and perhaps have to re-compile things, etc. just for compatibility.
Also, PyTorch's compatibility with AMD's ROCm seems to be pretty great. Sadly, ROCm's compatibility with AMD GPUs is lacking 😂
@@LtdJorge oneAPI is not just AI tho
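For what it's worth, the PyTorch/ROCm point above is why most scripts already "just work" on AMD: ROCm builds of PyTorch reuse the same "cuda" device string as NVIDIA builds. Here's a minimal sketch of device-agnostic code; the `pick_device` helper is hypothetical, not part of any library:

```python
# Hypothetical sketch of device-agnostic PyTorch code. ROCm builds of
# PyTorch expose AMD GPUs through the same "cuda" device string, so the
# same code runs unchanged on NVIDIA (CUDA) and supported AMD (ROCm) cards.
import importlib.util


def pick_device() -> str:
    """Return "cuda" when a GPU-enabled torch build is usable, else "cpu"."""
    if importlib.util.find_spec("torch") is None:
        return "cpu"  # torch not installed at all; fall back to CPU
    import torch
    return "cuda" if torch.cuda.is_available() else "cpu"


if __name__ == "__main__":
    device = pick_device()
    print(device)  # "cuda" on a working CUDA or ROCm install, else "cpu"
```

So in practice you mostly don't need per-vendor code paths; the bigger catch, as noted above, is which AMD cards ROCm itself supports.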
Anyone following the development of CUDA saw this coming way back in 2007. When projects like Folding@home added CUDA support, it was clear CUDA could be used to speed up training models. Once cuDNN came out, they took the lead and didn't look back.
Nvidia is the only company that takes AI and GPU-accelerated computing seriously. I too complain about their recent pricing strategies, but as a deep learning engineer, a video editor, and a gamer, I happily give them my money for how good their support for their GPUs is. You buy a Radeon 7900 XTX at $1,000 and you can't even run simple machine learning models on it because it is not supported by ROCm. Meanwhile, every Nvidia card in existence is supported by CUDA. I started deep learning with a 970 and have progressed through the years to now a 4090. Nvidia is the only company that provides such a mature package, making their GPUs extremely versatile. Yes, the prices are a bit too much, but if you want to use your GPU for more than gaming, they are well worth it. People also forget that Nvidia is the only company that constantly continues to innovate: ray tracing, DLSS, the scheme of separating AI cores from compute cores and RT cores, etc. Everyone else is just trying to catch up with Nvidia.
Closed AI is right! lol
Just noting, Nvidia isn't the only heavy hitter in AI. It depends on the market segment, but particularly in open-source development, where AI is run locally, a lot of people have gotten frustrated with Nvidia's market segmentation and artificial VRAM limitations.
Particularly for LLMs, it's not uncommon to see people shooting for support under ROCm, and many popular AI applications like Stable Diffusion or Textgen WebUI have decent support for the right AMD GPUs. Beyond that, there are probably techniques that could be used with AMD GPUs (notably their Instinct offerings, which are very reasonable used... just look at the MI25) given that they have a "supercomputer heritage", particularly in CDNA, and have absolutely insane FP64 performance, which probably could be used in combination with sparsity to produce some impressive and unique results.
I think it's also worth noting that in the absolute upper end of things, many of the largest supercomputers are AMD.
That's not the only issue, though. If I wanted to do AI cheaply, and I mean really cheaply, I could go out and buy an ARM SBC with an RK3588S on it and run AI slowly, in software. Once a person got used to it, there *are* tools for converting models from popular formats into ones that can be accelerated by the onboard NPU, and Qualcomm also has interesting offerings in the area as well.
In addition, it's not like Intel's sleeping, either. Intel isn't necessarily able to offer the same core counts as AMD, due to their difference in strategy, but at the same time, they are able to offer an impressive AI software stack and dedicated accelerators onboard their server processors. While those aren't necessarily as fast as GPUs, they can absolutely do the trick for a lot of applications, and I can definitely tell you that the memory pool and the types of problems a Xeon server can handle are in a league of their own, compared to the relative limitations of VRAM on a GPU.
So it's worth noting that Nvidia is certainly at least a generation ahead of their competitors... But it doesn't take too many bad launches to show that if you want to eat an elephant, you do it one bite at a time. Just look at Intel's server market share compared to AMD's over the past eight years.
I really enjoyed the content, Nvidia is a silent yet powerful giant in the tech industry
Have you looked into AWS CodeWhisperer? It's like Copilot but with some additional tools.
Haven't looked into it yet. Do you think it's better than Copilot? Overall, or maybe in just some areas?
@@fknight It's on par, if not better, to be honest, but the best part is that it's free.
Cyberdyne systems seems to be doing really well.
Every NVIDIA event, I'm gonna leave a "Forrest sent me here" in their event live chat.
Considering the complexity of their products, I find it hard to believe there is anything “low key” about the fact that geniuses work at Nvidia and other top tech companies.
On this date, NVDA was $273.61 - Great foresight!
Question… I am majoring in CS but I don’t like programming - not anymore. I am more fond of the math and the theory aspect of CS and this is my last semester, so it’s too late to change my major to mathematics. What can a career path (where I don’t have to be a ‘programmer’) look like in this industry?
Become a data scientist, or even a researcher. You could also get into the graphics or video game side of CS. Look up the mathematics behind Quake; you'll get an idea.
This video format is sooooo much better!
This is old news. You guys should have been on NVDA since the dip last year. Not just GPUs; they also provide many FSD boards for autonomous vehicles. But there's still some pocket change left for the latecomers.
how about AMD?
That comment sounds so cringe. The latest tech and innovation is nothing if the rest of the society is in disrepair.
Nvidia should thank the scientist who used his son's GPU as a joke, which changed AI research forever.
You mean Bryan Catanzaro?
I love the company. That being said, buying anywhere near the current valuation is simply foolish. The current valuation has 5 years of forward-looking AI hardware dominance ALREADY baked in... VERY similar to TSLA when it was over $400. Amazing company, but it was a far better short at that valuation. Even if you believe all the hype regarding their hardware potential (which I 100% do), it's tough to say the stock should be selling anywhere over $100. I'll buy higher (planning to buy around $150) because I think they will dominate and eventually we will start seeing them growing quickly again.
There will always be hype bubbles that burst and boom. I agree
Keep the great work up, dude!
Actually, building their own hardware and AI infrastructure is exactly what Microsoft is doing, and has been doing since 2019, on TSMC's 5nm process. I would guess they identified their dependence on Nvidia as a major issue for the future of providing cloud AI compute.
Linux Master Race.
In case you think you are early... you are not, and it's all priced in.
You're the best! Appreciate your work. Thanks, keep going! Hi from Ukraine!
🚀
high quality video, thank you!
Given that no other GPU can reliably do a fraction of what an Nvidia GPU can do for AI productivity, I hate Nvidia's price tag and small VRAM, but I still bought a 4070 Ti for that reason alone.
@3:14
A What?? DUDE!!!
It's great that there is another monopoly
I never knew they donated that DGX to OpenAI.
During a gold rush, sell shovels....
great video
But where is AMD in all of this? And what about Intel? Though I guess the expertise companies have in GPUs has carried over very well to ML/AI research.
Did they simply make a wrong bet, or were they also trying and not succeeding as much?
I was wondering the same
The software is the real deal here. And while AMD does technically have a CUDA competitor in ROCm, it is not as widely supported as CUDA. For example, just last week they announced that they will start supporting ROCm on selected consumer cards for devs using Windows (it only worked on Linux before). So much in software is powered and inspired by hobbyist side projects that a company simply can't afford not to support consumer cards if they want a lead in this space.
Intel and AMD seem to be more focused on their CPUs than their GPUs which is fine; but I wish someone would compete with Nvidia more competitively to keep prices down.
@@turun_ambartanen There are Vulkan and MLIR available, although not widely used yet. Vulkan has a lot of hardware support compared to ROCm and is more powerful. I wish MLIR was readily available, but sadly it's just not there yet (not packaged officially on Windows, Linux distributions, or macOS).
Why is it always asshole companies like Apple, Qualcomm, or Nvidia that have unchallenged technologies... I really hope some other company, AMD, Intel, or whoever, creates GPUs that are good for AI processing.
I hope that AMD will further prove that nvidia's pricing is absurd.
They won't... because even AMD's current pricing is absurd. The 7900 XTX isn't worth $1000 lol
I miss the Nvidia when the 10 series was released. Still the best GPU generation ever released. Now I have an AMD GPU because of Nvidia's awful pricing.
Nvidia has always been scummy; you just never saw their real face until you took a big L on your wallet. They only care about enterprise clients now, not consumers.
Its rough out here for sure
interesting
I still think the Radeon 9700 Pro is better.
Cool ad
I hope AMD's ROCm keeps rolling. I've got it running on a few of their GPUs, including my 6900 XT.
W amd user (I use amd because of better linux support)
Yeah, as a fellow RDNA 2 user, I say RDNA 2 is a piece of crap for AI. I will move to Arc (can't move to Nvidia because of awful Wayland support).
@@typingcat No one said RDNA 2 is supposed to be the best at AI, or that it was designed for AI in general. You're just grasping at straws.
RDNA is designed for gaming performance, not AI. CDNA is a new branch they made for compute performance. Anything related to AI would be strictly its own thing, as AMD doesn't like to mix everything and put it into one basket like Nvidia does.
Nvidia is good at marketing speech and PR, good at writing their software to do what they want (usually at a cost), but not as great at engineering even with their huge budget compared to AMD. Vice versa, AMD is good at engineering, but their budget constraints and less money in general mean they work with less on the drivers and software side of things. This is why AMD went with OpenGPU and other projects, knowing it spreads the word on widespread use of technology and builds on software in general, but aids everyone who needs the resources (including AMD).
Nvidia would rather lock everything behind proprietary doors and exclusivity based black boxes if you let them.
Pay attention to how Nvidia speaks to the public and how much money they put into their presentations; if you know a thing about marketing or PR, you'd know where Nvidia stands on controlling markets and swaying mind (shares). They are no different from Apple in that they have a 'cult' type of leader who drives a following.
Common Nvidia Stockholder W
Nvidia is SKYNET!
im ai with the braids
How about Dojo? Would you do some content on Musk's AI supercomputer Dojo?
Nvidia is high key greedy. Too greedy.
I'm already snoring.
So Nvidia jumped on the AI train early, meanwhile Intel is just a lazy king on the throne of all computing, and AMD is just trying not to go bankrupt.
Sure, but when will the AI sexbots come out?
Open Tinder, they have been here for a decade now.
Hope AMD can catch up
Just call it OpenAI dude
WTF does “low key” even mean?
It makes you sound like a kid about to steal my crypto.
Its time to buy that NVIDIA stock lol
Did you do it?
Huh? Who was sleeping on Nvidia? Everyone knew AI was coming along, and everyone knew Nvidia was trying to dominate it. This framing addresses an audience that doesn't exist. Hopefully the video is still meaningful.
I'ma be honest, I thought Nvidia was insane for putting that much money into AI.
But honestly, I guess I'm a clown, 'cause Jesus, it really paid off.
Literally no one is sleeping on Nvidia 😂
Invest
The average gamer is butthurt that Nvidia divorced them and is an elite company now for Fortune 100 companies and Fortune 100 gamers.
Joey Fortnite is stuck with lower-tier graphics cards from AMD and, god forbid... Intel now.
Nvidia stock has skyrocketed in a bear market since the release of the $1600 Nvidia GeForce RTX 4090 graphics card.
Jensen Huang has earned his trademark black leather jacket BOSS look!
Was waiting the whole video for you to point out the 20-series Super card in the background. Very disappointed. 0/10.
Deadass nvidia the goat. no cap
:)
There was a recent drought of GPUs that gamers blamed on Bitcoin mining. I had to remind them that more than half of all Nvidia chips were in server farms.
Who’s sleeping on Nvidia? Tf?
Cool💾
Still, I hope to see whatever Mozilla is doing for AI
& I hope to see AMD try to play the AI game & give us an alternative
Alternatives are good, remember
“Don’t sleep on nvidia”… you mean one of the most profitable companies in the world right now? I don’t think anyone is…sleeping on a product people are lining up to buy.
Who is sleeping on nvidia? Lol
Hate on NVIDIA all you want, but you can't deny that they have much better software and technology advancement than other GPU makers. I'm betting hard on ray tracing on their side.
novideo
Nice video bro, didn't know that you were an Nvidia shill.
speaking facts about the industry makes me a shill? Fine. I'll just hate on everything from now on so you think I'm cool.
How is he a shill?? Nvidia dominates the back end of the AI market and is now investing in the front end as well, with AI model services announced at the recent GTC. Jensen is a visionary who made long-term bets in the past and continues to do so now. Maybe you are pissed off because of their high prices, but when you are the best in the business and there is no competition, you automatically get pricing power. In the end, if Nvidia charges high prices, it encourages competition, but if other companies are failing to compete, what can anybody do?
Nice comment bro, didn't know that you were an AMD shill.
Well, there is a reason Microsoft is developing its own AI processors; nobody wants to pay the Nvidia price for something that can be done 5x more efficiently with specialized compute units... once again Nvidia is pricing itself out of a market (like they did with smartphone SoCs, car SoCs, and lately with GPUs).
Yeah, I think Nvidia just cannot focus on multiple market segments. They could've made AI computing separate from their gaming segment. Instead they are killing off gaming to focus on AI.
I guess now NVIDIA can just stop caring about gaming GPUs lol.
RTX 5090 for 5k when
TSMC is the hidden dragon noob.
ASML has entered chat
Nvidia shills coping so hard xD