AMD'S HUGE AI Chip Announcements to Take Down Nvidia (Supercut)
- Published: 6 Jun 2024
- Highlights from #amd Advancing AI keynote presentation, featuring competing technologies to #nvidia GPUs and many reveals across AMD's entire suite of data center solutions. Highlights include the launch of the AMD MI300X GPU, AMD MI300A APU, hyper-scale networking solutions & more!
💰 Want my AI research and stock picks? Let me know: tickersymbolyou.com/ai/
Simply Wall Street's Advanced Micro Devices (AMD Stock) Valuation: simplywall.st/stocks/us/semic...
Nvidia ( #nvda Stock ) Valuation: simplywall.st/stocks/us/semic...
💬 Join The Conversation & Stay Up to Date 💬
Follow me on Twitter: / tickersymbolyou
Join me on Discord: / discord
Support on Patreon: / tickersymbolyou
Timestamps for this AMD Advancing AI Event Supercut:
00:00 Investing in Artificial Intelligence
01:49 Launch - AMD MI300X GPU
04:35 AMD MI300X vs Nvidia H100
08:54 AMD 8 GPU Platform vs Nvidia DGX H100
11:26 AMD ROCm vs Nvidia CUDA
14:36 AMD Networking vs Nvidia NVLink & InfiniBand
17:49 Launch - AMD MI300A APU
20:04 AMD MI300A vs Nvidia H100
22:35 Launch - AMD Ryzen 8040 AI PC CPUs
24:37 AMD Ryzen 8945 vs Intel Core i9 13900H
📝 Resources & References 📝
@AMD Presents: Advancing AI: • AMD Presents: Advancin...
See additional disclosures at tickersymbolyou.com/disclosures/
🙏 Thanks for watching! - Science
Do you think AMD's new chips will disrupt Nvidia's insane GPU market share?
I am no expert, but I know Nvidia has big leverage. This lady was very focused and confident in the presentation, so AMD has what it takes. I guess it will come down to the manufacturing bottleneck, aka TSMC, because the demand is clearly there!
One fear is customers tend to stick with what they know. If they've already invested in Nvidia, then they may be hesitant to switch.
@@audas Photonic chip? 3 nm? If using pure light, a 3 nm wavelength would be well past ultraviolet and nearing the X-ray spectrum.
Of course! There will be top competition ahead!
@@audas I know.
This feels like that time that AMD released Ryzen to compete with Intel. Now it's competing with Nvidia datacenter. Exciting times.
competition is good for everyone
@@TickerSymbolYOU Will this hurt AMD's CPU and GPU divisions?
Yeah, I've been on Ryzen for a while. They still have their stability issues.
Except that NVIDIA is not Intel which is incompetent!!
Cloud providers cannot allow themselves to rely on a single vendor to provide GPUs. Adding one or two more vendors will increase competition and reduce costs. Nvidia will still be dominant, but there is more than enough room for AMD to capture good share from this huge AI TAM.
With that in mind.. I wonder if AMD will skyrocket in value like nvidia did.. and intel will be left in the dust.
@@b3owu1f Doubtful.
@@b3owu1f Nvidia made $18B in one quarter of their last fiscal year. That is almost as much as all of AMD combined for their fiscal year, which is projected to reach $22B. AMD has to sell into the leftover demand that Nvidia can't fulfill due to possible supply constraints. Nvidia is also getting into the server CPU market with their Grace CPU. It's risky for AMD since server sales have high margins.
Yeah, I think it kinda has to after watching this. AMD should rise quite a decent bit in the months to come @@b3owu1f
Nvidia has already launched H200. So that comparison with H100 is not reflective of the best out there.
No, they haven't. H200 is not to be released until 2024.
It feels like they really do not mess around and are very, very aware of who the real boss is here.
This constant calling out of "our competition," as if it were more than one company, is kind of annoying, but also very interesting.
Worth remembering and mentioning: Nvidia's chip was released over half a year ago, and they will surely deliver another generation soon.
I worked for AMD from 1982 to 1994 and left semiconductors for solar and nanotech in 2008, and I tell ya - this is totally insane - wow.
What do you do in nanotech? I'm also working on nanotech magnetocalorics.
Competition is good, hope more software vendors will support ROCm, since Cuda is so far ahead at the moment
Yeah, ROCm vs CUDA could be where the real battle is
Why would anyone else support it? ROCm drivers are AMD's job, and until now they have done a shoddy job. Even on the gaming front, those drivers and GPUs aren't the best. What do we do?
@@TickerSymbolYOU Long term, OpenSource usually catches up on features and quality of code.
@@TickerSymbolYOU It's not a battle and won't be for many years to come. We have seen how many AMD open-source initiatives fail. Maybe if they take a page out of Intel's book and actually subsidize developers to use their hardware, we might see some swing... I don't think they have the pockets to do that, though.
I just found out that a beta version of PyTorch with ROCm support exists. That makes me believe it's going to be adopted.
Thanks for sharing this. What a day $AMD had today, love the competition with NVDA, we the investors and traders will benefit from it.
I do not know much detail about the AI-based hardware products. But I admire that the two biggest companies in accelerated-graphics GPUs are in the USA; you should appreciate that. It's not about winning or losing, it's about the infinite opportunities lying in front of the world, and only two companies, AMD and Nvidia, are leading the way. Man, grow up. We are still relying on textile and agriculture products in Pakistan, and you are light years ahead of us. Appreciate what you have.
Hey thanks for this super cut very informative but I can’t wait to hear your breakdown
Coming soon!
Using a Llama 2 inference benchmark based on the H100: AMD claims a 20% improvement with the MI300X, while Nvidia claims an almost 100% improvement with their H200, which was announced a month before this AMD announcement. Nvidia also claims almost double the energy efficiency with the H200, while the AMD MI300X is less efficient than the H100. Finally, AMD is focusing on inference instead of training, just like Intel... I think this new chip is DOA.
Inference is more important for the end user.
@@victorcoda all the money is in datacenter sales. End users aren’t the target client for these chips.
Yes, they are :) @@nick_g
It actually depends on the price. What matters is performance per dollar, not just the performance itself
@@TickerSymbolYOU Fair point. Large companies with cloud operations offering services shouldn't care either way, as long as they're able to offer compute services with a good profit margin. However, AMD with an 1,100 P/E ratio is not at all warranted vs Nvidia with a 62 P/E ratio. You're right that performance per dollar is what matters. Nvidia's H200 is supposed to deliver about 100% more performance at about the same energy draw as the H100's 700 watts, while the MI300X is 20% more performance for 750W. Those AMD margins will be heavily squeezed. AMD being so vague in their presentations is also not a good sign. A lot of their advantages are tied to slapping faster memory on top; the H200 is already getting a memory upgrade, so that advantage is nullified.
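To put the numbers in this thread side by side: here is a rough perf-per-watt and perf-per-dollar comparison. The relative-performance and wattage figures are the claims quoted in the comments above (H100 as a 1.0x baseline); the prices are pure placeholders I made up for illustration, not real list prices.

```python
# Rough perf-per-watt / perf-per-dollar comparison.
# "perf" is relative to H100 = 1.0, per the claims quoted above.
# "price" values are hypothetical placeholders, NOT real list prices.
chips = {
    "H100":   {"perf": 1.0, "watts": 700, "price": 30_000},
    "H200":   {"perf": 2.0, "watts": 700, "price": 40_000},  # ~100% claim
    "MI300X": {"perf": 1.2, "watts": 750, "price": 20_000},  # ~20% claim
}

for name, c in chips.items():
    perf_per_kw = c["perf"] / (c["watts"] / 1000)       # perf per kilowatt
    perf_per_10k = c["perf"] / (c["price"] / 10_000)    # perf per $10k spent
    print(f"{name}: {perf_per_kw:.2f} perf/kW, {perf_per_10k:.2f} perf/$10k")
```

With these placeholder prices, the MI300X comes out behind on perf per watt but ahead on perf per dollar, which is exactly why the pricing AMD actually charges will decide the argument.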
I’m guessing the presentation deck behind the presenters wasn’t made in PowerPoint….
Will you make a video with your thoughts?
absolutely
Usually it is very complicated for a rank 3 to become rank 1.
How long before Jensen says ‘hold my beer’
Lisa Su is an excellent speaker!
I would like to say "Second!"
...but I am more inclined to ask: if everybody and their 40+ bln market cap mother is making an AI chip, who's gonna fab all that?
AFAIK TSMC 3nm and 5nm backlog is packed up for at least 3 years forward.
Someone's gotta build them... Real men have fabs!
Doesn't give a warm and fuzzy feeling when the opening speech is about industry profits.
Part of a very important coin been talked about in the BCL
Can it run t-800?
All Jensen needs to do is buy up all his buddy's TSMC capacity for 3 years.
AMD already has tons of capacity booked at TSMC, so no worries there. The GPU chiplet design and interconnect are industry leading, making it cheaper than monolithic designs like the H100 and H200. No doubt Nvidia has a 12-month lead, but the 100%+ profit margins of the H100 days are short lived. AMD will surely be in second place for a while due to software; however, it offers extremely cost-effective solutions that are within striking distance of the H200 on TCO and better than the H100. So it's definitely competitive, but there is a market leader, and it costs a ton of cash to switch platforms mid-program. Time will tell. For new instances, AMD is the clear winner; for expansion or further development, Nvidia has a slight edge depending on size.
Agreed. Also, it's not really one or the other -- datacenters are portfolios of chips and solutions that grow over time. It's not like they can't have both products in different clusters.
@@TickerSymbolYOU Oh please don't talk such nonsense. We live in fan boy polarized world, you must choose your religion and stick to it. In all seriousness, another good summary from a tech event. Long all the cloud/AI platforms and the companies that supply the hardware (utility companies of this decade and maybe century)
@@TickerSymbolYOU Data centers tend to craft their software around a single ecosystem because developers never want to change. If they've invested in Nvidia, they will stay with Nvidia. AMD is still in a great place to pick up new customers, and having Microsoft in on the AMD action is a huge step forward.
Yeah, honestly I feel like if Nvidia is the Apple of chips, then AMD is the Samsung of chips. They will get there, I believe in AMD. Been using their CPUs and GPUs for years.
@@SomeUserNameBlahBlah Developers never want to change, but competitors make their products compatible with the competition. One example: when Microsoft Excel first came out, you could use all of the commands of Lotus 1-2-3, which is how people used it initially, and as those shortcut commands were executed, the software would show the user how to do the same thing in Excel. When the Commodore 64 first came out, there was a cartridge you could purchase to run CP/M programs. Unidata, Universe, D3, jBASE, and Sequoia were all implementations of the Pick operating system developed by McDonnell Douglas and could compile the code without rewrites. This made it far easier for developers to change platforms.
Can you imagine Lisa Su and Jensen Huang in the same room! The combination of computing power!
!!!
Holding AMD since 2018, crazy fun ride so far
🙌
what a turn around 2018 was $10-12. Only wish I had more to buy them.
Tech can be a hard domain for investors; an established player can be disrupted overnight. The providers of hard disks were disrupted by new entrants into the sector, who created physically smaller disks of inferior capacity compared to what the existing manufacturers provided to their established customer base. The incumbents wrongly concluded that there was no demand for these new products. The new entrants found new markets for their products, for example portable medical devices, tablets, phones, and glucose meters, so that by the time the smaller drives could compete on capacity, the old established players were too far behind. Established players will sometimes pull resources from the new tech to work on problems for current customers using the old tech, so it can be important to have a separation, such as a spin-off, to avoid such problems delaying the new products. Suggest: "The Rigid Disk Drive Industry: A History of Commercial and Technological Turbulence" - Clayton M. Christensen.
@@mon7eban Yes, I remember I bought it because of the Ryzen chips, and I really like DIY PCs. A YouTube channel with a lot of technical content was leaking next-gen CPUs, and it looked like Intel was in trouble, so I bought my first 100 shares of AMD.
Holding since 2014
Did NVDA build their first AI chip back in 2012 for OpenAI?
Nope. However, they were at the forefront of AI hardware development at that time, and their GPUs were instrumental in the early development of AI.
She did not talk about efficiency, though.
She does talk about power efficiency throughout the presentation. I thought I kept a good bit of it.
I literally just sold half my shares yesterday to add on NVDA :')
🥲
that is smart
@@MARKXHWANG maybe in the long run but I still could’ve sold them today instead
I think it totally depends on the current and future trends as far as adoption rate is concern it depends on the execution.
Nvidia will be the leading AI driver , sticking with Nvidia , going long
How delusional can you really be?
@@Zombiesmoker
When I cash in, you will still be trying to make yours!
@@maximumoverload5134 i do not work for money. So nah false af 😂
Infinity cache, that is a lot of cache.
"To take on Nvidia" - no, actually, the CEOs and founders of AMD and Nvidia are very close cousins. They probably cooked this up at a dinner table lol. Both will survive and keep each other relevant. If you don't get it, sorry.
Everywhere??? Unless they come up with countermeasures for AI, we're already in the Terminator chapter.
The fact that AMD doesn't support CUDA or cuDNN will severely limit adoption in the short/medium term. They are absolutely imperative for running most AI workloads. So don't expect the same kind of stock gains that have been seen with Nvidia.
I do think not supporting CUDA is a huge pitfall
I think that is true now but won't be in the future. Especially with this AI wave, there is definitely enough interest in an open source alternative. The SW moat is not always a good one when it is all closed source.
The only thing they need to support is PyTorch, that's it.
@@TickerSymbolYOU At the moment, unfortunately, it is; but hopefully libraries start supporting other acceleration platforms sooner rather than later.
AMD has ROCm, which has been developing at breakneck speed this year. ROCm with its HIP layer is basically CUDA; HIP is a CUDA replacement. cuDNN functionality also exists, though AMD unfortunately calls it MIOpen. Framework developers are moving away from CUDA anyway; they are switching to things like Triton. Triton works with AMD now too, via AMD's own fork, and Triton 3.0 will support AMD natively.
Will AMD encounter the very same Advanced Packaging problem that's the cause of Nvidia's current massive backlog?
AMD could potentially encounter similar Advanced Packaging (AP) challenges that Nvidia is currently facing. AP involves complex processes like chip bonding and interconnects, which are crucial for integrating multiple dies into a single package. These processes are often time-consuming and resource-intensive, leading to potential bottlenecks and backlogs.
Thanks
AMD on AI for 15 years? If that's the case, then you should have had a product early this year, or even last year, instead of Q4 2023. If the total market is really as huge as she said, then Nvidia will benefit from it most.
Thanks, as always, for providing this time saving service for your audience. It is greatly appreciated.
Happy to help!
Little known fact:
- Lisa Su of AMD and Jensen Huang of Nvidia are relatives.
- Lisa Su brought AMD back from the ashes.
Good information to know
Wow. The info provided by her and all of you posting is fantastic. I have nothing to say but listen. Fascinating! And ooh myyyy....
Glad you found some value in the content!
what a monster of Computing
Almighty..... there is no way I can fathom all of what is being said here. No doubt I have to watch it again...great stuff, we are in for a real treat. Love it all. Thanks You. Bought AMD rinse and repeat ,..time to get back in again.
Glad you found some value in the content!
She is comparing their chips to H100s which are already out while Nvidia already has H200s in the market. Come on…
Completely agree.
I own both Nvidia and AMD stocks for a long time. Competition is good.
good for customers bad for you
There's no competition here. Jensen and Lisa are cousins
Agreed. And it’s awesome to own this market in just 2 stocks
@@geezer2365 how is that bad for him, both stocks have been doing spectacular over the past half a decade. I own both as well, amazing gains on both.
@@jeffreytrinh9907 *There is competition and Jensen and Lisa are cousins. Fixed that for you.
There is no breakthrough in this new GPU. 20% is not much. I would expect a multi-fold improvement.
The presence of Microsoft everywhere related to AI is kinda scary
This chip will change the world
Bittensor tao is perfect
Not a chance!
Great content.
Nvidia is ahead and I'm sure Nvidia will use ai to make their GPU even better.
AMD seems to be ignoring gamers for this cash cow; they might lose both.
AMD ignoring gamers? AMD is huge in gaming, powering the most popular game consoles. Sure, they are not as big in discrete GPUs, but they are in the game for sure.
It's not a matter of who is ahead; it's a matter of there being room for AMD to grow. There is plenty of space for multiple competitors here. You don't have to be the largest in this market to make billions of dollars every year.
Nvidia was the one ignoring/shafting gamers with their GPUs
Then why are fewer Steam users on Nvidia compared to the last year or two, while AMD has grown by about 6% of total users during the 6xxx series? While Nvidia has that big 75%, you will see that most of them are using 1660s and 3060s. About 0.9% are using a 4090; the 4080 is even lower. Not the best look. If you look into the stats, you will see that the shift is slow, but it does exist.
@@MissMan666 90% of console users don't know what chips they are using. And if you want to talk numbers, the Switch sold the most units, and it uses Nvidia chips.
You guys forgot! Nvidia has the Grace Hopper GH200! It's an ARM chip, like Apple's!
It’s the best CPU!
Please check website before commenting here!
In other words, the investment needed to make AI somewhat useful is astronomical and totally prohibitive for most use cases. Exactly like it has been since the 1970’s.
anyone think AMD will reach 200 in 2024?
Hope so, would make it my first 100 bagger ;)
@@normansteinmetz643 i am going to buy more of their shares.
Maybe in 2025, and 300 in 2027
“Real men have fabs”.
Jerry Sanders, the founder of AMD.
Nvidia has the AI market in its pocket
And it probably will for a long time
@@TickerSymbolYOU Unfortunately, yes!
It all depends on whether AMD's hardware delivers performance for the same money as Nvidia's. Along with ROCm advancements, things might change.
@@adamstewarton Indeed, ROCm needs to improve and gain market share. CUDA is the only thing blocking AMD and Intel from the AI market (in my opinion).
Nvidia's lead in AI integration for its GPUs cannot be easily overcome by AMD; we don't know what AMD has done with its GPUs. Acquiring Xilinx was a smart move; hope to see some breakthroughs there. So many acronyms: ZenDNN, ROCm, Vitis AI... a growing user base will be key to its success.
Behind in distribution
It's worse than the Crysis/PhysX era; back then they were still competitive in DX10/11 games, now they're just behind in gaming.
It's pretty clear that they aren't trying to kill Nvidia, they are instead focusing on a part of the market they think will be bigger and compete there. They are targeting the inference segment and not the training segment.
All the money that's come in so far has been for development and training, not AI applications. AI inference, so far, isn't useful or widespread. This could all end up being the next Web 2.0 or Web 3.0 or Big Data, waves that came and went.
Good stuff!!! 😊
Glad you enjoyed!
AMD has done an amazing job! But I think $400B in 2027 for AI hardware is an understatement.
AMD will definitely take some percentage of the market share from Nvidia. But it saddens me, that even today it seems Lisa Su doesn't fully understand the potential of AI in the coming years. She didn't understand it in 2016 (Google AI-first strategy), she didn't understand it in 2020 (GPT-3) and today is the same. I really hope I'm very wrong on this!
I just like seeing AMD do everything in the computer world
Gaudi 3 for the win.
The thing holding AMD back the most was software. If ROCm 6 is really that good, and open sourced on top, I can see them taking market share from Nvidia. But the A100 is also quite old already, so I can see Nvidia having something to compete with as well.
Nvidia says AMD showed the numbers without using Nvidia's optimization software, which would have pushed Nvidia's numbers beyond AMD's.
Following the leader is going to mean you miss out on a lot of gains. The biggest stocks have less downside but far less upside. You should have stuck to your guns, or you'll continue the bad timing of trying to play catch-up.
I lost my job, but it sucked anyway, and I made a lot of money on Nvidia! The more people out of jobs, the more Nvidia goes up. People are almost obsolete. It's amazing how far humanity has come.
5 yr PE ratio avg: 59.8
Current PE ratio: 952.2
You're looking at GAAP numbers which have Xilinx acquisition Good Will. Look at the PE with non-GAAP numbers.
And yet a PS ratio of 8.
It's expensive! Sometimes you can't even find the P/E ratio, but WOW!
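The GAAP vs non-GAAP point in this thread can be sketched with a quick calculation. All numbers here are hypothetical, chosen only to land near the ratios quoted above; the mechanism is that acquisition-related amortization (e.g. from the Xilinx deal) depresses GAAP earnings, which inflates the GAAP P/E even though the share price is unchanged.

```python
# How acquisition-related amortization can inflate a GAAP P/E.
# All figures are hypothetical, for illustration only.
price_per_share = 120.0
gaap_eps = 0.13      # earnings after heavy acquisition-related amortization
non_gaap_eps = 2.00  # earnings with that amortization added back

gaap_pe = price_per_share / gaap_eps
non_gaap_pe = price_per_share / non_gaap_eps
print(f"GAAP P/E: {gaap_pe:.0f}, non-GAAP P/E: {non_gaap_pe:.0f}")
```

Same company, same price; the only difference is a non-cash accounting charge in the denominator, which is why the two P/E figures in the comments above differ by more than an order of magnitude.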
I have an idea: install many NPU cards in many PCIe slots, turning a brainless computer into an AI computer.
Good news for amd and the users. Competition is always healthy for new technology.
Cathie Wood's ARKK just passed Fidelity and is the number 1 ETF this year. Are you still a Cathie Wood hater?
I think at least they try. But Nvidia is leagues ahead. If nvidia was actually being pushed, we would have way better gpus from them.
What do the three biggest semiconductor companies, Nvidia, AMD, and Broadcom, have in common? They were all led and built into the successful leaders they are today by Chinese CEOs and technocrats.
Taiwanese, not Chinese.
It’s Taiwan!
Not China!
Does America admit Taiwan is China?
@@natcc6474 Chinese lack skills to be world leaders.
I don't understand why we haven't moved to analog processors for AI brains...?!
Diamond hands on AMD $500 by 2027
ROCm is so bad that the hardware just doesn't matter. No threat to Nvidia until that is fixed, and the fix won't even start until AMD takes its software side seriously.
I've got to see it to believe it.
So who are the customers for this? Big companies who'll use it as their company "brain" in such areas as R&D, logistics, design etc?
Anyone with a big enough data center. Cloud service providers, ISPs, etc.
they already have Microsoft and meta
I'm with IBM something secure with greater limitations
She is being very careful not to say A.G.I. That would be the time to think Skynet.
The good thing about A.I. is that it can be used by the IRS to analyze each individual taxpayer.
A.I. is also good to analyze each person to develop a Social Credit index (SCI) retained by the government and not publicized like in China. By keeping the SCI secret, it can avoid privacy laws and allow both national police and state/local police to determine who to watch more closely or have phone conversations and the microphones in cars listened to by the A.I. analyzers for legal violations.
The anthology series Black Mirror had an episode "Nosedive" which showed the value of A.I. in usefully monitoring the population and determining who has greater access to societal infrastructure. Also, things that are secretly determined, such as who gets scholarships, admissions to higher education, higher insurance rates, IRS audits, or who is allowed to win lotteries or random-entry contests, can also be managed by A.I. for the good of mankind. A.I. is also useful in determining whose votes should be lost in the system until after the election is over. The vote can then be counted, but will be evaluated as invalid due to being late. This therefore satisfies the law that everyone's vote is counted and that those who legally voted did have their vote accepted and counted. Let's salute a Brave New World where everyone obeys the law and hate speech is monitored by A.I. not just on the internet and in private emails, but also in cars (through their microphones and cameras) as well as at home through PC microphones, smart TV microphones, and smartphone monitoring. Black Mirror - Nosedive. Eye opening.
WOWOWOWOWOWOWOWOWOWOWOWOWOWOWOWOWOWOWOWOWOWOWOWOWOWO
OOOOOOOOOOOOOOOOOOOOooooooooooooooooooooooooooooooooo
If AMD gets just 20% of the AI chip TAM in the next 5 years, it's already a big win given where the stock price is now.
And you think Nvidia will sleep for the next 5 years, or what?
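The back-of-envelope math behind that 20% comment, written out. The $400B 2027 TAM figure is the one AMD cited in the keynote; the 20% share is purely the commenter's assumption, not a forecast.

```python
# What a 20% share of the projected AI-accelerator TAM would mean.
# TAM figure is from the keynote; the share is a hypothetical assumption.
tam_2027_billion = 400   # AMD's projected 2027 AI chip TAM, in $B
assumed_share = 0.20     # the commenter's hypothetical AMD share

implied_revenue = tam_2027_billion * assumed_share
print(f"Implied AI revenue at 20% share: ${implied_revenue:.0f}B")
# → Implied AI revenue at 20% share: $80B
```

For scale, $80B would be several times AMD's entire current annual revenue (~$22B per the comments above), which is why even a minority share of this market is treated as a big win.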
Please, can this MI300X be placed in a laptop?
Lol😂
Just why? Don't use laptops for serious workloads.
AMD, the best company, is coming for NVDA.
So much of this stuff doesn't matter; there probably won't be any support in the ROCm drivers or software.
From 2020 to today, the winners here are AMD and Nvidia.
So I stay with Nvidia. When I see that the numbers are not much better, and AMD still has a lot to work on for compatibility.
How the hell is this lady so good at speaking?
This presentation shows too many numbers, and I think that happens when a presenter is not sure what to say.
Nvidia has orders six months in advance. But the AI market is showing explosive growth, so AMD can sell as many chips as it can produce. Nvidia and AMD compete for chip assembly capacity in production, not with each other. AMD is several years late and will not be able to prevent Nvidia from maintaining its leadership in the AI market in any way.
Agreed; the AI market is big enough for the both of them
AMD is in the game with Nvidia, now. Intel has cracked and won't be climbing the mountain.
It's true that AMD has made significant strides in the GPU market with their latest releases, particularly the MI300 series, which offer competitive performance and efficiency compared to Nvidia's offerings. However, stating that AMD is definitively "in the game" with Nvidia and that Intel has "cracked and won't be climbing the mountain" might be a bit of an oversimplification.
Both Asian cousins are working hard.
Making us China folk proud
AMD: We have AI at home
😂
Nvidia willn't die a hat trick any minute now
AMD is much faster in every aspect, but the stock is not. How come?
There is 'something' missing from AMD: the presenters just don't have the depth, conviction, and knowledge of Nvidia's. It feels like a bit of a plastic cut-and-paste job to catch up in the space.
it definitely feels a bit cookie-cutter compared to Jensen going on excited rants about the future, doesn't it?
I have faith in AMD
If we want better AI stuff, we need to support AMD's alternatives.
Do you know Lisa Su is Jensen Huang's cousin?
I've heard that! So crazy
@@TickerSymbolYOU Imagine the discussion at the family get together. I've got more AI than you cuz.
Great presentation, but the problem AMD faces is essentially that you cannot get over the barrier of waiting for production. We have only a few global providers of the foundry services that will produce your super-advanced, mega-crazy chip. If you cannot get that chip to your customers when they need it, they're going to go elsewhere, and Nvidia already has that lined up; they've got the whole production supply roped up because they're doing multi-billion-dollar deals with TSMC, Samsung, and GlobalFoundries, the people producing their chips. So while AMD may have the greater design, they can't get that design produced in the numbers necessary for their customers. And on price: if you are at the beginning of the ramp for your new chip, you usually have to price it higher at the start to make up your R&D and production costs. So it's just a really, really difficult mountain for these guys to climb, even if AMD's design is conceivably slightly more advanced than the best chips shipping right now.
This lady turned AMD around.
She absolutely did. Awesome CEO.
Will never happen. Nvidia will always hold the crown 👑