A question comes to mind: can we train JUST ONCE ON EVERYTHING and let that information propagate into applications? In fact it doesn't even need to be propagated to the end user (hey Bob, do you have 3 petabytes of storage at home?); the end result of the inference can be computed on standard CPUs/GPUs, provided you have a nice pipe to the Internet.
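For what it's worth, the "train once, infer anywhere" idea in the comment above can be sketched in a few lines; the toy model and data here are hypothetical, not anything from the video. The expensive step runs once and yields a small set of weights; inference afterwards is cheap arithmetic any CPU can do:

```python
# Sketch: heavy one-time training vs. cheap everywhere-inference.
# The model (1-D least squares) and the data are illustrative toys.

def train(xs, ys):
    """One-time 'training': least-squares fit of y = w*x + b."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    w = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    b = mean_y - w * mean_x
    return w, b          # the only artifact the end user ever needs

def infer(weights, x):
    """Cheap inference: one multiply and one add, fine on any CPU."""
    w, b = weights
    return w * x + b

weights = train([0, 1, 2, 3], [1, 3, 5, 7])   # expensive step, done once
print(infer(weights, 10))                      # light step, done anywhere -> 21.0
```

Real models have billions of weights rather than two, but the division of labor is the same: training hardware is specialized; serving can be far more ordinary.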
They do not have to be built on the smallest lithography node currently trending. AI chips can (even though slightly larger) easily be produced specifically for AI, by the Chinese now even... Peace
I really wonder what Nvidia's datacenter chips will primarily be used for, besides classic HPC and non-AI datacenter computing, once the big AI market players AWS, Google and M$ say "thank you Nvidia, no more orders, we've got our own chips." Nvidia's AI market share will most likely decrease significantly unless they come up with something very special or never seen before. Their R&D must definitely be "cooking" something special😁
The question is, how much of the capacity at the "TSMC"s is bought out by Nvidia? Is Amazon going to have enough capacity to build its AI chips in the first place? Not a big chance, I suppose.
@@MiscCrap I just saw news reporting that the Chinese company "Moore Threads" launched its MTT S4000 48 GB GPU for AI training. Maybe the "Amazons" aren't that reliant on TSMC when it comes to AI chips after all.
Well I never... That said, 2024 is for Touching Pathways (*robot's outstretched hand, its finger reaching for its symbolic human counterpart). SCARY :) @@shengenma3116
🎉🎉🎉🎉 Shalom King. Did you breach? I only wanted the CD chip with memory for storing recordings, such as movies, videos and song recordings. I have a small 2018 Hummingbird camper. The chip, disc, or whatever y'all have already. Did you add more storage space, for instance for use with a 19" flat screen in the camper, or a tablet? I don't have my camper and I need to record my song rehearsal without using this cell phone or pocket recorder.
And the craziest part is: all this magic is done by your brain at an insanely higher level, and your brain only consumes the energy of a 100-watt light bulb, while these AI chips need megawatts to come close to your brain's capacity. ❤🙏😔 God is the greatest engineer of all.
Hello Wall Street Journal, an aspiring comp-science student here. If you have a spare laptop that you aren't using, like it's really just sitting there in your basement, I'd be glad if you could give it to me so I can start learning how to code. It would really have a big impact on my life, thank you.
As someone who's dabbled on the consumer side of generative AI, I can say: at first you're excited, then you find out it hardly ever generates results that are usable for anything at all, then you realize prompting is complicated, with a steep learning curve, trying to talk into a black-box model that most of the time ignores your requests. It ain't all it's cracked up to be. It's like asking a Mexican to draw you a picture of a cat. He'll do it for free or cheap, but what you get may not be usable for anything at all. So you ask for another. And another. And another. Nope, that's not what I want.
There is no such thing as an "AI chip", or "AI"; these are what's inside GPUs: arithmetic units that can do floating-point or integer math, which is really what graphics cards do, since they have to spit out a specific color for each pixel at a specific depth on the screen. But it's the fad word of the 2020s to call everything AI.
I think this technology is aiming at this RTU VEF to be able to see it as some banking process and help to understand what that is and where it came from. Let them know - time to buy RTX, but we have few series 1000 spares and we haven't told you how much it already costs. So that is only thing you want at home 0:48. It's some thing in your basic VSCode tools and they still want for them to be the journalists to process that two hours Amirdrassil video and to let them know that that is - two UML objects, but in the end of masquerade - it's nobody job and I still want to reject your CV application 0:45 so I could make car sensors someday, because every 2000 year car has 6 CD changers with letters disc. Maybe text should be formatted as four crimson raiders campaign choices for the ones who just met Nazmir Blood Trools and Ghuun 5:08.
0:14 "no bigger than the size of your palm"... I'm out from this moment, peeps. Mis-information... NPUs, think of what Elon was touting for placement within someone's brain years ago... CORRECTION: some no bigger than a 5-cent piece (much more plausible, unless the masses must still be kept at bay)... Peace
Exciting times for AI semiconductor stocks TSMC, AMD and NVDA, which are all experiencing a surge in value. It's interesting to watch the competition develop, given these stocks are major contributors to AI chip growth. As my personal holdings have increased, I've seen an amazing impact on my shares.
Intel and AMD will definitely have their share of the market. TSMC is at max capacity, and investing in other semiconductor companies will be an absolute power move. Different chips are good at different things, and Nvidia has been very specialised, which leaves other aspects of AI open.
This is the type of in-depth detail on the semiconductor market that investors need; it's also the right moment to focus on the rewarding AI manifesto.
I'm unconvinced about Intel's future. Do you really think Nvidia can keep up this surge rate and stream for the better part of the decade?
Certainly Kayla. I bought NVDA shares at $300 and $475, cheap before the 10-for-1 split, and with huge interest I keep adding. I'm currently doing the same for PLTR, POET and AMD, constructively. The best possible way to get ahead is participating behind top experienced performers.
I own three businesses. Right now I'm compiling and picking stocks that I'd love to hold on to for a few years before retirement. Do you think these stocks would do better over the years? My goal is to have at least $2 million saved for retirement.
This is the kind of journalism that makes even tech understandable for the lay person.
This is great content WSJ
Because not much was explained about the technology, only what it can do and where it would be useful.
The competitive actions and innovations from all of these companies, including Amazon, are pretty exciting. It's exciting because not only will companies try to outdo each other's products, but it will create more thinkers to push limits innovatively and continue to expand capabilities, especially with AI.
Nope, nothing exciting about Lex Luthor and his dystopian conglomerates!
I think another way to think about these AI chips is to realize this is how they bypassed the bus-size limitation. They have loaded the calcs into the chips so the data doesn't clog up the bus.
It's not a thinking machine, it's just a compartmentalized calculator. Ingenious, but not special other than that they solved a big problem to increase speed and reduce heat.
That’s a good way to see it! It’s etching the machine level math into silicone so you don’t clog the software with the “instructions”. Have you seen the new start up etched?
@@6lack5ushi nah. I just knew they couldn't advance the speeds further until they conquered the hardware limits and the heat. They've scared the public with marketing, but it's just a hardware shift and an indexed big-data grab. It will steal jobs from business administration roles, though, due to no longer needing people to do spreadsheets, eventually. Not yet, but in a hundred years.
@@6lack5ushi Can't be silicone.
@@mike74h what do you mean?
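The "clog up the bus" idea at the top of this thread can be made concrete with a back-of-the-envelope roofline-style check: compare time spent moving bytes against time spent computing. All the hardware numbers below are invented for illustration, not specs of any real chip:

```python
# Roofline-style sketch: is a workload limited by the memory bus or the
# compute units?  Hardware figures are illustrative placeholders.

def bottleneck(flops, bytes_moved, peak_flops, bus_bandwidth):
    """Return which resource limits the computation."""
    compute_time = flops / peak_flops            # seconds spent computing
    transfer_time = bytes_moved / bus_bandwidth  # seconds spent on the bus
    return "bus-bound" if transfer_time > compute_time else "compute-bound"

# Elementwise add: ~1 FLOP per 12 bytes moved (two 4-byte reads, one write).
print(bottleneck(1e9, 12e9, peak_flops=100e12, bus_bandwidth=1e12))   # bus-bound

# Large matrix multiply: thousands of FLOPs reuse each byte fetched.
print(bottleneck(2e15, 12e9, peak_flops=100e12, bus_bandwidth=1e12))  # compute-bound
```

Keeping the weights and intermediate math on-chip is exactly a way of shrinking `bytes_moved`, which is why the commenters frame AI accelerators as a fix for the bus rather than for the arithmetic.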
Explains the $3.00 “ad-free” experience now for Amazon.
*Summary*
- 0:00: Introduction to the topic of Generative AI and its growing popularity.
- 0:09: AI chips, some as small as a palm, driving the AI boom, with skyrocketing demand.
- 0:19: Estimated market for AI accelerators in data centers growing from $150 billion to over $400 billion.
- 0:29: Tech giants racing to design faster, better AI chips.
- 0:47: Amazon's chip lab in Austin, Texas, designing AI chips for AWS servers.
- 1:00: Ron Diamant, chief architect of Amazon's custom AI chips Inferentia and Trainium.
- 1:11: Description of chip components, including tens of billions of transistors per chip.
- 1:30: Difference between AI chips and CPUs in packaging and function.
- 2:18: The necessity of additional components for AI chips to operate.
- 2:31: Amazon's two AI chips for training and inference, with training being more resource-intensive.
- 3:06: Energy demands and heat generation of AI chips, with cooling methods.
- 3:32: Integration of Amazon's chips into AWS servers.
- 4:18: Competition in AI chip market, with Nvidia dominating and major cloud providers creating custom chips.
- 5:01: Current uses of generative AI in consumer products; potential long-term payoff despite current hype.
- 5:24: Rapid advancement of AI technology requiring chip and software development to keep pace.
- 5:50: Amazon's recent release of a new version of Trainium, indicating a long-term commitment to AI chip development.
touch some grass
Haha, I think it's fine you do this, but people should probably be able to sit through a 6 minute video.
@@ChiekoGamers probably was summarized by some AI extension (chatgpt summarize)
@@GreatTaiwanNow RUclips has that functionality. Below a video there is an 'Ask' option. You can ask to summarize the video with time stamps and RUclips does that for you.
@@PD-cx4ep I don't have it... never heard about it. Are you sure it's not a plugin you downloaded?
For completeness, CPU cores also run in parallel
Want to be even more blown away? The actual design and manufacturing of these chips is one of the greatest technological achievements of all time, with many geniuses involved.
It’s a great video to introduce the AI chip industry.
It's all US companies that innovate and lead in this kind of advanced chip.
Isn't ARM also doing pretty good work?
They have the money and can get engineers from all over the world.
Yes, because in Europe we are right now enforcing strong laws against the damaging effects of AI, as we've already seen presented by Google Gemini with a black president George Washington, Indian people dominating France, and white men being erased from history.
Furthermore, Europe is, unlike the US, trying to protect our populations against all the upcoming scams, dating crimes and big lies promoted by Big Tech, media and governments around the world, which will be utilized by criminals, gangs, hackers etc. as soon as AI takes off.
And THAT is the reason WHY European countries are per se MUCH richer and HAPPIER than the US, despite the fact that most new technology is developed in the US :-))
....and NO, the chips are NOT produced in the US, but in TAIWAN, and the founder is NOT AMERICAN, but ASIAN, LOL!
yea, try to manufacture in US
Good. More demand for my job in the future.
Silicon chip industry is going to be the future.
We will see more things getting miniaturised on chips. Like Medical diagnostics, Particle Accelerator etc etc
lol no, it's getting harder and harder to push moore's law for silicon chips, future will be photonics
@@haha-hk9tx I'm pivoting to Quantum Photonics or superconducting Photonics soon. Planning on masters
Nonetheless even in traditional realm there's a lot of room for innovation. Like the way PD is done or even using Analog System for heavy NN load
I always envy your skills and awareness. Thanks for the training!
@1:08 Wafer - it is a dummy wafer. A good wafer has to be kept in a clean room, and people have to wear clean suits to prevent particle contamination on the wafer.
That's kinda silly.😂😂 He showed the wafer to demonstrate what it looks like.🙄🙄
It's still a wafer either way. No one said this specific wafer he holds up is really going to be used in an AWS server, nor is that even remotely relevant. But you aren't running into an electronics store telling them the nonworking phones on display to show the dimensions aren't really phones, are you?
Very well explained, especially for us beginners. Keep it up!!! Love watching your videos... ️ I'm your subscriber now, every video gets a thumbs up.
Respect!One of the most efficient strategies I have ever met!🔝
This is just the beginning
The AI chips are the modern-day equivalent of the dedicated chip hardware from the 80s for 8-bit computers. Same cooling solution too. And they slap a sexy name on it so it will sell.
well they were sexy back then
@@veryCreativeName0001-zv1ir I'm a big C-64 fan. I'm surprised they still sell hardware for it. We need hard-core fans like that for AI.
Bringing back sanity when it is most needed 👉The Connections (2021) [short documentary] 👍
@@VeganSemihCyprus33 I graduated as a computer engineer three decades ago. Part of that formal education was AI and chaordic systems. That video is a philosophical take on what I believe and apply.
I like how at first we needed all those dedicated chips to run specific tasks, then we didn't like that and tried to make one chip that can run every task. And now we're making chips for specific tasks again, because the all-around chip can't cut it.
WSJ, show how Amazon will make Amazon consumer electronics products like an Amazon phone and computers.
Your video on how to properly use indicators in trading was very helpful. I learned how to put them into practice and make more profit.
This was very good.
I don't know how any of this works lol, I didn't realize chips for AI needed to be different from normal ones lol
Think of it as a highway.
Cpu - 8 fast moving lanes (octa core)
Gpu/ai chip - 1000s of very slow moving lanes
Each has its own use.
It's the patterns of computation that are very different between AI and general computing. General: very quickly, a sequence of very different things needs to happen, e.g. while surfing: verify the security token of a website, load part of the webpage, show a notification from another program. In AI: push a lot of data with known locations through a known series of operations (the AI model). The latter is much better served by slower but highly parallel operations; the former is better with blazingly fast, random (unknown in advance) operations.
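A toy sketch of the contrast described in the comment above, with made-up stand-in functions: general-purpose work is branchy and each step depends on data seen only at runtime, while AI-style work pushes a uniform batch through one fixed pipeline:

```python
# General-purpose pattern: different steps, data-dependent branches,
# latency-sensitive (the "fast, random operations" case).
def general_purpose(request):
    if not request.get("token"):        # verify security token
        return "denied"
    page = request["path"].upper()      # load part of the page
    if request.get("notify"):           # maybe show a notification
        page += " +notification"
    return page

# AI-style pattern: one fixed sequence of operations applied to a big
# batch of uniform data, ideal for many slow-but-parallel lanes.
def model(x):
    x = x * 0.5          # "layer" 1: scale
    x = max(x, 0.0)      # "layer" 2: ReLU-like clamp
    return x + 1.0       # "layer" 3: shift

batch = [-2.0, 0.0, 4.0]
print([model(x) for x in batch])   # same known pipeline for every element
print(general_purpose({"token": "abc", "path": "/home", "notify": True}))
```

Because every element of the batch takes the identical path through `model`, thousands of slow lanes can work side by side; `general_purpose` instead rewards a few very fast lanes that handle unpredictable branching.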
It's just HBM memory with a lot of SIMD processors. "AI chip" is just a marketing name😂
@@homo-sapiens-dubium I had to reread this a few times, but this was the best explanation I've gotten to date. That makes sense. Thank you :)) haha computers are so confusing to me. I'm 30, not old enough that I should lack innate tech abilities, but I just never adopted them at all. Might be because my parents were much older when they adopted me and weren't good with it themselves, besides my dad, who was awesome with it but wasn't patient enough to teach my mom and me. Haha
That said, thank you, I think I kind of get it a little more now haha. I'm an esthetician, so my expertise is in skin health and cosmetic chemical formulation, so I never had to learn how all these chips work, but it's super impressive and way beyond me how they figured all this out in the first place haha. Like, didn't he say there were hundreds of millions of the processing things in each chip? It was probably way more than that, but the scale is the thing that's hard for me to understand. It's so small haha
@@willcookmakeup hey there random human! I'm glad if I made your day any bit more interesting and fun, thank you! You're awesome for even looking into things you have no professional exposure to, congrats for being curious! The world is full of things we have no idea about; there are rabbit holes full of learning and fun everywhere, if only there were more time and energy to go after them..!?
Computers are stupid. They are electrical machines, nothing more. Every complex, gigantic problem is solved by breaking it into smaller pieces, and repeating the breakdown process until it is manageable.
Processors work with a set of instructions, primary-school operations: - + / * are among them, and there are a few more, but they are just as simple. These operations are applied to "chunks" of memory, which are basically 0/1 numbers. A program is simply a sequence of these basic operations on chunks of memory. Some chunks are from the program itself, some from e.g. user input or another memory location on the PC. This is called assembly programming. More complex programs are written in "higher-level languages" that can express more abstract things, like variables! These are then "translated" by other programs (called compilers) to that same assembly that can be executed by the CPU. That's it!
Specialized chips have a completely different architecture. Think of a farmer's tractor and a car. Different use, completely different design, but same principle: wheels and an engine. So it is with chips!
Keep learning & being awesome!
I should learn about skincare too - I have 0 knowledge of that!
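The breakdown in the comment above (complex programs lowered to flat sequences of primary-school operations on chunks of memory) can be sketched as a toy interpreter. The instruction format and slot names here are invented for illustration, a stand-in for real assembly:

```python
# Toy "compiler output" for the high-level expression  result = (a + b) * c.
# Each instruction is one primitive op on named memory slots, mirroring how
# a compiler lowers code to simple CPU instructions.

def run(program, memory):
    """Execute a flat list of (op, dst, src1, src2) instructions."""
    for op, dst, src1, src2 in program:
        if op == "add":
            memory[dst] = memory[src1] + memory[src2]
        elif op == "mul":
            memory[dst] = memory[src1] * memory[src2]
    return memory

memory = {"a": 2, "b": 3, "c": 4, "t0": 0, "result": 0}
program = [
    ("add", "t0", "a", "b"),        # t0 = a + b
    ("mul", "result", "t0", "c"),   # result = t0 * c
]
print(run(program, memory)["result"])   # -> 20
```

The "higher-level language" is the single line `(a + b) * c`; the `program` list is what the compiler turns it into, and the CPU (here the `run` loop) only ever sees the simple steps.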
However you feel about Amazon, their AWS tech is amazing. They have optimized virtual machines that run at essentially the same performance as "bare metal", as if you were running on a real machine, which is just crazy hard to do. They're number one in cloud for a reason.
really? a lot of servers are running with Xen.
Non-virtualized servers are _rare_.
Also, virtualization is basically a CPU feature for a loooong time.
Xen is an awesome open-source project that AWS contributes to, but they have a dedicated proprietary Nitro hypervisor and custom hardware solutions that outperform what other Xen-based products offer. @@user0K
And communism and islam are religions of peace.
Cool guerrilla marketing, bro!
👌🏻
5:53 , 2:19 , 1:34
Privacy policies
Space x
AI chips are shaping the future of technology! Eager to see their impact across industries
amazing video
In 2123: Take a look inside Amazon's robotics lab, where they build thought-police T-1000 robots for Big Tech platforms.
You can't stop progress
maybe if i ask rly nicely like pretty please
scared for when generative ai becomes destructive ai
Increasing the complexity of AI chips more and more can help obtain information that leads to more and more advanced inventions.
They could even put humanity into an AI robot.
NICE!
Amazing! Can you please tell me how you perfectly guess whether the stochastic goes up or down?
None of these chips are going to matter when quantum computing comes online. I think it'll be here by 2030, and it will radically change everything around us.
Some chips no bigger than my palm? That is a very massive chip. 0:15
No more ideas needed. You just have to unlock free energy and you're done. Mass-produce everything.
There's no such thing as free energy.
the real ocean boilers
#AI grab your 🥤🍿
Everyone wants to cut out NVidia as soon as possible. NVidia should flip the tables and become its own AI company or buy/partner with one. They would absolutely dominate.
Is that last guy an AI himself?
Amazing ❤
The planet needs humanity to reduce GHG emissions, and what do we do? We keep inventing more and more energy-guzzling tech. There is a direct correlation between total (human) energy consumption and the deterioration of the environment (incl. the climate crisis). We need to reduce energy consumption, not increase it. The AI revolution is a luxury we simply cannot afford.
3:17 Anti-static wristband? 🤣🤣
Did I understand it right that an AI chip is basically just a bunch of CPUs condensed into one?
Instead of a dual-core or quad-core, it's more like an octa-core?
I'm just holding out for AI-powered Google Home devices
AI? The bias is in the training data. The bias will be geared to Google, Amazon, Micr$oft sales system, and a few others that are under the radar. If you want the best gizmo - it will have paid one of the above to be included in the data set, or you will have to make it for yourself. You think you can avoid them? There is a router that some fibre providers use exclusively that is made by an "Amazon" company (if you dig for the info). So are they scraping data? You decide. I have.
Cleaning 'em all must be a nightmare!
Pls, make a vid 'bout that!
No thanks I’ll stick to Nvidia.
Why are the wafers circular? What happens to the halves etc that are not complete around the edges?
discarded
Wafers are cut from cylindrical ingots of silicon and so they come out round. It is far cheaper to just discard the edge pieces of an incomplete CPU than it would be to add extra steps to cut the wafers into a square shape
@@Axelarateofrayni thank you very much, I was always wondering
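Side note on the wafer-edge question above: you can estimate how many whole dies survive the round wafer with a quick back-of-the-envelope script. A sketch (the 300 mm wafer and 20 mm die are just example numbers, not any real product's dimensions):

```python
import math

def dies_per_wafer(wafer_diameter_mm, die_side_mm):
    """Count square dies that fit entirely on a round wafer.

    A die is kept only if all four corners lie inside the circle;
    the partial dies at the rim are the ones that get discarded.
    """
    r = wafer_diameter_mm / 2
    n = int(wafer_diameter_mm // die_side_mm) + 2
    count = 0
    # Walk a grid of candidate die positions centred on the wafer
    for i in range(-n, n):
        for j in range(-n, n):
            xs = (i * die_side_mm, (i + 1) * die_side_mm)
            ys = (j * die_side_mm, (j + 1) * die_side_mm)
            if all(math.hypot(x, y) <= r for x in xs for y in ys):
                count += 1
    return count

print(dies_per_wafer(300, 20))  # 148 whole dies; the rim pieces are scrap
```

With these numbers the whole dies cover about 59,200 mm² of a roughly 70,700 mm² wafer, which is why tossing the edge pieces is an acceptable cost.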
2:14 we call that gpu
Dexter from Dexter’s lab
It’s just a graphics card, don’t be fooled by fancy marketing.
The chip is composed of various components, usually in quadrants for the pulses. But if the pulses ran along a single row on a wire (a wire chip, an electrical cable carrying pulses), it should give greater space gains: a hollow chip
I love Amazon!
I think that google has the edge because they have the user generated content needed to train AI. Without content to train AI these chips are useless.
❤
No one will be able to make more advanced chips than TSMC
What makes NVIDIA such a dominant player in the GPU market? Do their chips simply outperform Google or Amazon chips, or do they have a pioneer's advantage?
💚🖤📈🇺🇸
nVidia the 🐐🤖
A question comes to mind: can we train JUST ONCE ON EVERYTHING and let that information propagate into applications? In fact, it doesn't even need to be propagated to the end user
(hey Bob, do you have 3 petabytes of storage at home?); the end result of the inference can be computed on standard CPUs/GPUs, provided you have a nice pipe to the Internet.
I don't think you understand how LLMs and models work lol
@@Eleganttf2 how would you explain it, though? I have trouble articulating this
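For what it's worth, the train-once idea above is half right: training needs the big hardware, but the learned weights are just numbers that can be shipped over the network and run on an ordinary CPU. A toy sketch (the one-neuron "model" is obviously hypothetical, real ones have billions of weights):

```python
import json

# "Training" output: weights for a tiny one-neuron model.
# A real model has billions of these, but they are still just numbers.
weights = {"w": [0.5, -1.0, 2.0], "b": 0.1}
blob = json.dumps(weights)  # this is what you ship, not the training hardware

# "Inference" on any plain CPU: load the weights, multiply-accumulate
loaded = json.loads(blob)
x = [1.0, 2.0, 0.5]
y = loaded["b"] + sum(w * v for w, v in zip(loaded["w"], x))
print(y)  # ≈ -0.4
```

The catch the replies are pointing at is scale: for large models even the inference-time multiply-accumulates number in the billions per token, which is why dedicated chips still matter there.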
They do not have to be built on the smallest lithography node that is currently trending. AI chips can (even though slightly larger) easily be produced specifically for AI, even by the Chinese now...Peace
no
How can you prove the Chinese can build advanced AI chips?
❤❤❤❤❤❤
IC chip from Malaysia 🇲🇾 too ??
5:27 Meanwhile they're showing random big PCBs and components that are neither chips nor AI.
I really wonder what Nvidia datacenter chips will be primarily used for besides classic HPCs and non AI datacenter computing once big market AI players AWS, Google, M$ say thank u Nvidia - no more orders we got our own chips. Nvidia AI market share will most likely significantly decrease unless they come up with something very special or never seen. Their R&D must be definitely "cooking" something special😁
tsmc inside
Question is, how much capacity at the "TSMC"s is bought out by Nvidia? Is Amazon going to have enough capacity support to build its AI chips in the first place? Not a big chance, I suppose.
They do not have to be built on the smallest lithography node that is currently trending. AI chips can (even though slightly larger) easily be produced specifically for AI, even by the Chinese now...Peace
@@MiscCrap I just saw news reporting that the Chinese company "Moore Threads" launched its MTT S4000 48 GB GPU for AI training. Maybe the "Amazons" aren't that reliant on TSMC when it comes to AI chips after all.
Well I never... that said, 2024 is for Touching Pathways (*robot's outstretched hand, finger reaching for its symbolic human counterpart) SCARY :) @@shengenma3116
But are they using AI to help develop AI?
I can see the bubble
There is no beating Nvidia
AI will destroy this world
🎉🎉🎉🎉 Shalom King. Did you breach? I only wanted the CD chip with memory for storing recordings, such as movies, videos and song recordings. I have a small 2018 Hummingbird camper. The chip, disc or whatever y'all have already. Did you add more storage space, for instance for use with the 19" flat screen in the camper, or a tablet? I don't have my camper and I need to record my song rehearsal without using this cell phone or pocket recorder.
I don't believe in the future of silicon for AI. I rather think it will be DNA, or stem cells, or stuff such as CRISPR
And the craziest part is: all this magic is done by your brain at an insanely higher level, and your brain consumes only about as much energy as a 100-watt light bulb, while these AI chips need megawatts to come close to your brain's capacity. ❤🙏😔 God is the greatest engineer of all.
Instinct isn't found in AI.
English Canada
WSJ is catching up, but it's not as good as CNBC
The robots will be building themselves soon so this technology will look laughable.
☺️☺️☺️☺️☺️☺️☺️☺️
What's the bet that when AI becomes self-aware, it thinks its own image is a cat? True AI won't be restricted to a chip..
Yummy.
Amazon doesn't make AI chips. 2:35
Hello Wall Street Journal, an aspiring computer science student here. Please, if you have a spare laptop that you aren't using, just sitting there in your basement, I'd be glad if you could give it to me so I can start learning how to code. It would really have a big impact on my life. Thank you
As someone who's dabbled on the consumer side of generative AI, I can say: at first you're excited, then you find out it hardly ever generates results that are usable for anything at all, then you realize prompting is complicated, with a steep learning curve, trying to talk into a black-box model that ignores your requests most of the time. It ain't all it's cracked up to be. It's like asking a cheap street artist to draw you a picture of a cat. He'll do it for free or cheap, but what you get may not be usable for anything at all. So you ask for another. And another. And another. Nope, that's not what I want.
I don't trust Amazon, Google and Meta for any tech
Am sen5ntiward ahit
Its all just spinoffs of my work.
There is no such thing as an "AI chip", or "AI"; these are what's inside GPUs: arithmetic operation units. They can do floating-point or integer math, which is really what graphics cards do, since they have to spit out a specific color for each pixel at a specific depth on the screen. But it's the fad word of the 2020s to call everything AI.
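The comment above is right that the core operation is plain arithmetic: a pixel calculation and a "neuron" both boil down to the same multiply-accumulate. A toy illustration (all numbers made up):

```python
# One multiply-accumulate routine serves both the graphics and the "AI" use
def mac(weights, inputs, bias=0.0):
    total = bias
    for w, x in zip(weights, inputs):
        total += w * x
    return total

# Graphics-style: blend two pixel intensities 70/30
pixel = mac([0.7, 0.3], [200.0, 100.0])
print(pixel)  # 170.0

# "AI"-style: one neuron with three weights and a bias
neuron = mac([0.5, -1.0, 2.0], [1.0, 2.0, 0.5], bias=0.1)
print(neuron)  # ≈ -0.4
```

The difference between a graphics card and a marketed "AI chip" is mostly how many of these units run in parallel and at what numeric precision, not what kind of math they do.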
I think this technology is aiming at this RTU VEF to be able to see it as some banking process and help to understand what that is and where it came from. Let them know - time to buy RTX, but we have few series 1000 spares and we haven't told you how much it already costs. So that is only thing you want at home 0:48. It's some thing in your basic VSCode tools and they still want for them to be the journalists to process that two hours Amirdrassil video and to let them know that that is - two UML objects, but in the end of masquerade - it's nobody job and I still want to reject your CV application 0:45 so I could make car sensors someday, because every 2000 year car has 6 CD changers with letters disc. Maybe text should be formatted as four crimson raiders campaign choices for the ones who just met Nazmir Blood Trools and Ghuun 5:08.
Thought it was Zelensky in the thumbnail ☠️☠️
0:14 "no bigger than the size of your palm"... I'm out from this moment, peeps. Misinformation... NPUs; think of what Elon was touting for placement within someone's brain years ago... CORRECTION: some no bigger than a 5-cent piece (much more plausible, unless the masses must be kept at bay still)...Peace
WSJ only promoting Amazon chips because Bezos owns both companies?
This video says NOTHING about how the chips work.
the guy talking is not an engineer, "imagine a millionth of a millimeter" yea how you gonna do that
Looks like Zelensky in the thumbnail
People with that dialect make me feel uncomfortable...
Stupid woke-sounding voiceover ruined the video for me.
Awful web server :)
AI for NS
I hope he does get rich off of it, good guy