Sorry I mispronounced tenstorrent, my bad
Lol! I was wondering what was going on. It's okay man, I made a mistake once.
I was just kidding with my comment. Great content. Some of your predictions for a few years ago are close to fruition I suspect.
Is there going to be a part 2 video because you only touched on Tenstorrent. Will the consumer eventually benefit from RISC-V or what??
At least we now know Tenstorrent processors are chips on Tren
It’s not a mispronunciation, you said a completely different word. Which means you haven’t read the company name in articles/docs more than a handful of times. Which means you’re talking with a hell of a lot of self-assumed authority for someone who basically knows nothing
I don't care about AI stuff but hearing about a company creating an open alternative using RISC-V to overthrow Nvidia's monopoly is great news. I hope the same happens in the DIY market because it has become incredibly stale and boring. Everyone using RISC-V CPU's sounds like a pipe dream but I think it will happen one day.
It'll be a pipe dream so long as the OS and software don't support it. Apple has transitioned to custom ARM chips because they've put in the work to support it in software, but we don't quite see that on the PC side.
@@Slavolko As far as I understand, as long as all vendors implement the same standardized instruction set this should work fine. Like with graphics APIs, where each vendor implements DirectX or Vulkan for its hardware so software developers can use the same code for all the supported GPUs.
@@Einygmar I know, but my point is that you'll need to convince OS and software developers to support your new hardware standard. That's all.
@@Einygmar The issue is most software isn't written for RISC-V, so a lot of work is needed to make it run on RISC-V (like having a translation layer, as Apple did from x86 to ARM, to ease the transition before software is properly ported).
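To make the translation-layer idea concrete, here is a deliberately tiny, hypothetical sketch in Python: the opcodes and handlers are made up, and real translators such as Rosetta 2 or box64 work on actual machine code and cache the translated blocks, but the basic shape (map each guest instruction to host behaviour, ideally once, then reuse it) is the same.

```python
# Hypothetical toy example of the translation-layer idea: map "guest" instructions
# (standing in for x86) onto host-side handlers (standing in for ARM/RISC-V code).
# Real binary translators decode machine code and emit/cache native code blocks;
# this only illustrates the dispatch shape.

guest_program = [          # made-up (opcode, register, operand) tuples
    ("mov", "r1", 5),
    ("add", "r1", 7),
    ("mul", "r1", 3),
]

def translate_and_run(program):
    regs = {"r1": 0}
    handlers = {             # "translations" of each guest opcode
        "mov": lambda reg, val: regs.__setitem__(reg, val),
        "add": lambda reg, val: regs.__setitem__(reg, regs[reg] + val),
        "mul": lambda reg, val: regs.__setitem__(reg, regs[reg] * val),
    }
    for opcode, reg, val in program:
        handlers[opcode](reg, val)   # a real JIT would emit host code once and reuse it
    return regs

print(translate_and_run(guest_program))  # {'r1': 36}
```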
The thing is, their approach also requires that the customers are capable of building their own custom models that are better than the box solution Nvidia ships - and this is the major caveat here. The highlighted companies like Google, Amazon, Tesla, etc - they are all big tech companies with massive resource pools and existing highly skilled software teams that can take on such jobs. But that isn't the case for the majority of the industry's needs as a whole. Most companies that actually benefit from these products don't have fully stacked teams of Google/Microsoft-level technicians and developers at their disposal, and they don't have the time nor the expertise to scout the market for people to perform those tasks through 3rd-party services.
With Nvidia you get something that works out of the gate, has reliable support, and a major community of other players using the exact same systems - troubleshooting the exact same problems, and reaching solutions and knowledge gains that are very hard to replicate with bespoke in-house software.
It's also one of the reasons Apple's OS and Windows are so popular - only here at a far lower economic scale for the end user, but the concept is the same. Nvidia isn't looking just at the big established corpos - they're looking at all of the up-and-coming enterprises that are looking to build their systems off of modern AI solutions and computing systems. These companies will gobble up Nvidia's product stack like hot cakes.
Said another way, it's far easier to build a product stack that allows for full end user customization than it is to create a full system lineup of hardware and software that works out of the box and has worldwide use and experience behind it. Nvidia can always make the switch when they need to - but RISC-V based system manufacturers can't do the opposite. They're already locked in - similar to how AMD will never reach the software suite integration and support that Nvidia has in enterprise. It would require decades of effort just to get on the same playing field, and even then you still have to convince customers to stop using what they've been using for years. I'd much rather be in Nvidia's shoes here, they have more options - which is a major benefit in a rapidly shifting industry.
Ian Cutress (TechTechPotato) has some great interviews with Jim Keller regarding Tenstorrent
Indeed, he does.
Jim Keller is a living legend
Who has had his company renamed hahaha I kid, but I'm sure the second letter in Keller's company isn't an 'R'.
I used RISC-V microcontrollers a lot and it'll be good to see better FP performance trickle down hopefully. Drawing simple graphics with a dual core ESP32 IC to a tiny screen is painful if you want to make menus. It's why I use a Teensy 4.1 for my silly builds.
I've been excited for Tenstorrent for a while.
And Raja is legendarily fucking useless.
Without keller AMD would likely go bankrupt now.
Makes no sense. Keller stopped working at AMD years ago... he does short contract jobs. He comes in, designs/organizes a masterpiece, gets bored and leaves onto the next company/project. That's what he does...
@@ctsd623 he does it out of passion and works quick, but results are not seen for at least 3 to 4 years when it comes to silicon development. Plus he gets shares and I bet a small percentage at the other end like some actors do. At this point though I suspect he'd do it for free if he believes in it. In fact I bet at TT he hasn't made any ROI yet. Yet haha He deserves it though. He looks at chips like I do at guitar, but a better analogy would be a player like Paul Masvidal or Roger Patterson haha
This is so amazing, years ago open hardware sounded like a joke but now it's a threat
.
Good.
Open source AI models are also a threat. A leaked Google document says Google and OpenAI have no moat, and open source might crush them both.
The question Jensen would ask when talking about this topic with Keller is "but does it run Crysis?"
Nvidia can't run Crysis either 😅😅😅😅😅
The more you buy, the more you save! Jensen's AI stranglehold may well fade, Raja-Keller Tenstorrent chips maybe gonna be killer sellers, so sayeth Coreteks: the futurist tech soothsayer!
What a time to be alive, open source hardware might become a reality 🔥🚀
I still remain skeptical though, especially since Raja is involved, but I truly want him to do well in his collaboration with Jim...
I can hear EVGA yelling fuck yea from here.
Can we get a yay for competition and open source
Sadly Nvidia didn't make my GPU's driver open source for some reason (GTX 770) 😭
@@offline__The open source Nvidia Linux driver has come a long way in a short time. It's still rough, but it'll get there. I'm with ya though, I think it was a dick move to open source ONLY the newer architectures. The 700 series seems like the perfect target for open source optimizations, because regardless of how old they are, a lot of people still use 'em, along with the 900 & DEFINITELY the 1000 series.
Correct me if I'm wrong, but I'm pretty sure that Pascal is still the most popular Nvidia architecture at the moment, when it comes to the number of active users (1060 & 1070 are the most popular I think)
YAY!
@@JacksonPhixesPhones couldn't have said it better
I really hope Jim Keller can put Nvidia in its place and offer a competing product with the same performance but at a third of the price. AI is still new enough that Nvidia's dominance in it can still be challenged.
I said this years ago, when everybody thought Intel had completely crushed AMD, and I first heard that Jim Keller had come back to AMD to help them design their Ryzen CPU. He's the Luke Skywalker of chip design. He can do it. Darth Huang and the nGreedia empire can and will be defeated by Luke Keller. :)
Nvidia’s CUDA is why it’s succeeding now & why it will lose in the future. CUDA is GPU specific. It doesn’t work with a hybrid APU architecture.
@@Moshe_Dayan44 and there are probably tens if not hundreds of people like Jim Keller inside Nvidia.
@@tringuyen7519 the issue is that making a hybrid APU arch framework is really really hard. OpenCL was meant to be an example, SYCL a more recent thing (and that also has limitations). We need a bigger idea, I think.
@@arenzricodexd4409 This is exactly what people said about Intel in 2016: "There are literally hundreds of engineers like Jim Keller at Intel! How can one guy beat hundreds of people! It's impossible!" Yet critical aspects of the Ryzen core, as well as the chiplet concept and the interconnect design, were all Keller's. One man DID undo Intel's iron monopoly grip on x86 big iron CPUs. If you think Keller can't do it again, you're betting against a man who has never designed a dud chip. He was also the main architect of the DEC 21164 Alpha CPU, AMD's Athlon, and Apple's A4 and A5 chips, and the author of AMD's x86-64 instruction set. Do you really want to bet against him?
Well put facts, but you are forgetting the big players learned from the mistakes of the 80s-90s. For example, Facebook or Google should have been superseded already, but instead they just buy their competitors. I think the same will happen with new competitors, Nvidia or AMD will just buy them...
It is very exciting to see the progress of RISC-V. People think the biggest advantage of RISC-V is that it is open source but I think being open source only facilitates the biggest advantage of RISC-V, it is modular. You can put in what you need and leave out what you don't in the design and have an extremely custom chip. A chip doesn't need to be good at everything (scalar, vector, matrix and spatial instructions) if there is no need.
The biggest obstacle for RISC-V remains manufacturing. Access to the most advanced process nodes from TSMC, Intel and Samsung can only be afforded by a few. Advanced process nodes offer the performance and efficiency we need to make RISC-V something everyone wants. This is why the main target of high performance RISC-V is the server market. Perhaps a company like Samsung can come to the rescue of RISC-V and develop Android smartphones using RISC-V. They can switch from using ARM for their Exynos mobile processors.
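To illustrate the modularity point above: a RISC-V core is the small base integer ISA plus whichever standard extensions the designer opts into (M, A, F, D, C and V are real RISC-V extensions). The helper below is just an illustrative way of composing the -march string you would hand a RISC-V toolchain; the function itself is made up for this sketch.

```python
# Sketch of RISC-V's "take only what you need" modularity. The extension letters
# are real standard extensions; compose_march() is just an illustrative helper
# for building the -march string a RISC-V GCC/Clang toolchain expects.

EXTENSIONS = {
    "M": "integer multiply/divide",
    "A": "atomic operations",
    "F": "single-precision floating point",
    "D": "double-precision floating point",
    "C": "compressed 16-bit instructions",
    "V": "vector operations",
}

def compose_march(xlen, picked):
    # The base integer ISA ("I") is always there; extensions are appended in canonical order.
    order = "MAFDCV"
    suffix = "".join(ext for ext in order if ext in picked)
    return f"rv{xlen}i{suffix}".lower()

# A tiny embedded controller can skip floating point and vectors entirely...
print(compose_march(32, {"M", "C"}))                          # rv32imc
# ...while an AI-oriented application core might want all of it, vectors included.
print(compose_march(64, {"M", "A", "F", "D", "C", "V"}))      # rv64imafdcv
```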
It's interesting seeing how RISC-V is evolving. Something I think we will see become widespread in the near future is companies creating custom, hybrid solutions that combine RISC-V platforms as a base, but with highly customized, or even unique, proprietary IP blocks embedded alongside or within them. Every company will be able to have their own custom hardware designs, done either in-house or through specialist design companies, that they have IP rights over, working side by side with the open standard instructions, and potentially other IP licensed from third parties. It will be the kind of heterogeneous computing AMD have been pushing, but far more customized, and instead of customers being limited to AMD IP (or whoever) they will design their own, or take different ones from different designers and combine them with the open standard. Any company that can facilitate this process, from design co-ordination and IP integration, all the way to arranging fabrication and packaging, in a way that is more rapid and nimble than the big incumbents, will take off and become huge.
Until we get standards bodies and standards around the system as a whole, rather than just the ISA of the CPU, RISC-V will continue to be bespoke in use. Look at the issues Rene Rebe ran into with the early standard version of SIMD on the RISC-V SBC he has.
One of your best videos, well portrayed. I am designing a RISC-V accelerator, and I can say first-hand that the pace of software development and maturity is impressively fast. An area where RISC-V lagged was verification tooling, and this has been maturing well and is becoming more robust. Also, almost all the cool innovation in hardware is happening in RISC-V. The biggest advantage is this velocity and freedom. We're already reaching the limits of silicon, which means that specialization is now the way to go to extract the next magnitudes of compute performance.
It's great to see RISC-V becoming a major factor.
What do you think of Dojo? Tesla claims they will use it to go from something like 5-6 exaflops to 100 in the next 14 months. Who knows when they might make any of that available to the outside world but I imagine they will at some point and it seems like they could cut into the market pretty good when they do.
100 exaflops? In 14 months?? Ain't no way.
@@ireallyreallyreallylikethisimg Agreed. I don't believe it. Just another marketing promise by Elon. "Self driving"? A trademark, not a feature.
@@YolandaPlayne Self driving is a thing no one has accomplished before and is inherently unpredictable.
If they are planning to build this in the next fourteen months, it means they've already lined up the production with TSMC. It's not the kind of thing you can place a last minute order for. Since there's already Tesla-designed silicon in the cars, I think it's safe to assume they know how the process works.
@@YolandaPlayne seems believable. All they have to do is order fab capacity from TSMC. It's also 7nm which should have plenty of affordable capacity. Their solution seems incredibly scalable too, with 25 chips in one tile.
@@daniel_960_ capacity is tight, this is known.
Before Nvidia... the king 3dfx ruled gaming.
Before Apple... Nokia ruled the phone market...
Philips... was once dominant in television..
Kodak... ruled the photo industry.
Blockbuster was the market maker and king.
All these companies (except Philips) are almost extinct..
Jim Keller has been with Tenstorrent a few years now, Raja joining might ruin his project, Raja's track record is not very good.
TENS TOR ENT - THERE IS NO R
Not much to say about it, agree, ( EX Dec Alpha employee here :P )
I'm not sure where you got the Tenstorrent LG information, but I've been reading up on it for some time.
Tenstorrent seems to only license their RISC-V CPU cores, not including any AI hardware.
And I remember from Ian Cutress's interview with Jim Keller:
Jim Keller said something like "surprisingly, people want to license just the CPU cores, so we are probably gonna do that."
So I think LG is just looking to move away from ARM for their Smart-TV embedded chip and use Tenstorrent CPU IP to produce its own Smart TV chip, which would run a Linux based operating system. That part likely has nothing to do with AI, or at least there's no hint of it having anything to do with it.
WD and Seagate have both made the successful transition from hard disks to SSDs; I am sure other companies have made the change too.
I will never again trust in a project that Raja Koduri is a part of
Keller kinda offsets Koduri imo
@@Unicorn-Coin Pat Gelsinger does. Look at the 180 with Arc since Raja was removed from overseeing it.
So when is nVidia going to buy this company?
Why do you think AMD's goal is to disrupt Nvidia? Most likely it's what you wanted to happen and then you say RX 7900 is the 'worst' because it wasn't cheap enough. I guess you never factored in that prices would fall. The RX 7900XT is cheaper than a RTX 4070Ti, consumers will still buy the 4070Ti because RT and DLSS3. The market is a multi-variable equation. How cheap would the RX 7900 series need to be to make it the 'best'? AMD's goal is to make a profit with good margins because they need to justify their actions to the board and stakeholders.
Yep ...intel is a prime example of how you can suddenly fall down when you're at the top. Ignoring innovation, blatantly milking consumers and prioritising shareholder demands and margins above all else. Intel is lucky they may have caught the fall in time to climb back up ...we'll see.
"Tenstorrent" not "Trenstorrent" Celso
Until nvidia (and amd) decide that the crypto mining boom is over and stop fleecing their normal (average) gpu customers for everything they can squeeze out of them, I will continue to hope that something happens that gives them some humility. Will that ever happen? Probably not. But I can hope.
I remember the Sun Microsystems computer in the engineering lab at NJIT in Newark, NJ had 1 GB of ram. That was 1996. It was unheard of. : )
Going smaller doesn't always mean smaller nodes, but rather making things much simpler in order to scale in parallelization. How else would those compute units become chiplets?
Why Intel even allowed these guys to leave is beyond me. They're the greatest minds for GPU's and CPU's.
Thanks for another quality video Coreteks. Any chance of investigating A.I hardware and the benchmarking of it? How can we measure the performance of A.I.? Is anything readily available to measure the use of AI on GPU vs GPU, CPU vs CPU, GPU vs CPU?
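One rough way to get at that question, assuming a PyTorch install (the snippet below is a sketch of the mechanics, not a proper benchmark; suites like MLPerf are the more rigorous answer): time an identical workload on each device, remembering that GPU work is asynchronous, so you have to synchronize before reading the clock.

```python
# Minimal sketch: time the same matrix multiply on CPU and (if present) GPU.
# Real AI benchmarking measures end-to-end training/inference throughput on
# real models, but the mechanics are the same: warm up, synchronize, time fixed work.
import time
import torch

def time_matmul(device, n=4096, iters=10):
    a = torch.randn(n, n, device=device)
    b = torch.randn(n, n, device=device)
    torch.matmul(a, b)                   # warm-up (allocations, kernel selection)
    if device == "cuda":
        torch.cuda.synchronize()         # GPU kernels run async; wait before timing
    start = time.perf_counter()
    for _ in range(iters):
        torch.matmul(a, b)
    if device == "cuda":
        torch.cuda.synchronize()
    elapsed = time.perf_counter() - start
    flops = 2 * n**3 * iters             # ~2*n^3 floating point ops per matmul
    print(f"{device}: {elapsed:.2f}s, ~{flops / elapsed / 1e12:.2f} TFLOP/s")

time_matmul("cpu")
if torch.cuda.is_available():
    time_matmul("cuda")
```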
I'm curious how RISC-V performance stacks up against the other architectures. That will no doubt affect its adoption.
AMD had the opportunity of the decade because Nvidia had so much Ampere stock that they raised prices like crazy and left the doors wide open.
AMD went for chiplets to cut costs, and what have we got?
7900xtx as fast as the 4080 for $1000 just because Nvidia raised the price of the 4080 to $1200.
7900xt as fast as the 4070ti for $800 just because Nvidia raised the price of the 4070ti to $800.
AMD had a chance and they blew it big time because RDNA 3 didn't deliver and we got nothing from Chiplets savings. I'm afraid that with the AI market, it will be a similar story.
AMD was able to fight with Intel because this was a big, lazy, self-confident corporation used to the monopoly they had. Nvidia is a different kind of player just being ruled by greed.
As for RISC-V - sooner or later Nvidia (or Intel, or AMD) will buy it and that will be it.
It looks like AMD can't see past the next quarter. They want to imitate Nvidia in any way possible.
Chiplets saved AMD money, but then GPUs are not selling and AMD is losing money after all, lol
AMD eventually creates superior hardware only to lose again with inferior software. No one is replacing CUDA at this stage.
Both AMD & NVIDIA had record breaking unsold last gen backstock to sell thru, that & all the unfounded fears of recession is why prices were so high. It worked, both companies have vastly cleared the backlog, & both are bringing down prices. I suspect they won't bring prices too much lower though, because the silicon is more profitable as AI chips.
First gen chiplets
Raja is a doom upon all he touches. Putting him onboard your ship is like ordering the crew to punch a 10 foot hole in the side of your boat.
Thanks, I couldn't have said it better.
Thinking more on this, you can see why Xilinx was such a key pickup for AMD. Being able to sell a chip with its own FPGA you can customize for a specific AI workload is a great idea. Then selling a GPU or dedicated chip that can perform that task is a killer sell. If ROCm was as easy to use as CUDA they would be able to have the whole stack under control.
Jim's interview with tech tech potato was the 1st time i figured out what jim was building. I think he has a winning model.
Tenstorrent, not T"R"enstorrent I believe.
@sersou my bad
Unbelievable that we hear a voice and assume they’re an expert and meanwhile they don’t even know the company name. It’s so shameful. How could I ever think “oh hey that guy over there said so and so, I’ll take that opinion on board” after this. Hillarious shit man
No it’s Transtorrent, the opposite of Cistorrent
LG aren't just licensing generic RISC-V chip designs from Tenstorrent though, are they? I suspect they're licensing their AI add-on modules as well, otherwise they'd just use an open-source RISC-V core and save themselves the licence fee.
If that's the case then the story is just "LG switches chip vendors", as they'll be locked in to Tenstorrent's RISC-V extensions just as if they'd gone with ARM or Nvidia.
Let me know when it arrives. I'll be interested in how it handles graphics. But I am also interested in if/how Unreal 5 is going to change things on the graphics front. Will we be able to run high quality graphics without a high quality card?
Hope a company uses this open hardware to make an affordable 4K 240Hz nano-LED gaming monitor.
This video highlights the issues with capitalism in general. Infinite growth in a finite world means death.
Make GPUs upgradable like motherboards.
Make GPU chips swappable like you do with your CPU when it's time to change it.
@@gregandark8571 Would certainly improve the longevity of owning the card.
you could pitch-in the idea to investors, go with it
Yay another episode of hopium for the tech industry with the most soothing voice ever
NVidia is* not "are" NVidia is not multiple entities.
how do we invest in tenstorrent?
Pff most of the AI stuff isn’t really great for the end consumer. Most use cases can’t even generate money in any reasonable way.
Gene editing, Healthcare prediction, individualized drug creation (protein folding), robotics symbiosis, robotics in general.. there is no market that will not get disrupted by AI
AMD and Intel should really watch this! Jim Keller as an industry veteran completely understands the "hole in the market", not just from the technological but the business model perspective as well! Tenstorrent's concept and model is excellent and their approach is future proof. I just hope it's not way too advanced and forward looking, as the industry might not be ready for something like this and would rather let Nvidia milk it even more... Nevertheless something new and innovative, including open source, is much needed as Nvidia becomes something like a monopoly on the AI market, getting more powerful and wealthier by the hour... just imagine what would happen if they were to merge with Arm...
Which company did Intel acquire and use the IP from for the thread director?
Discrete graphics ain't going anywhere for a while, and Nvidia can transition to a market share approach where they'll make some long-term commitments at a discount with TSMC and/or Samsung and/or whichever chip fab. The thing is, with Nvidia, they have top-notch driver maturity, so they will play well with games, editing, simulation, etc. In other words, they're reliable for hardware compatibility. So there is some incentive for them to shrink some of their last-gen chips down and go the budget route.
Brute-force the $150-300 market
This is indeed a crazy time to watch the industry
Good report Celso. I copied your report link into my SA distribution, which you can find at my Seeking Alpha comment line, and as always I have added my own observations. mb
A very astute video on the nature of Corporate Structures as much as on technology.
Ironic, as everything is being forced into the Corporate Structure of operation nowadays, including Public Services and Government. Talking to a DB developer at a startup whose company was going for a Public Listing - I mentioned to him it would be a great time to get out and look at something new. As Companies - Structures become too large, they fall in on their own footprint.
As one advocating for Open AI systems and keeping Software and Hardware Open and accessible - this is good news.
Cheeses... What is with that echo, it's way worse than normal?
Put a duvet behind, and a couple of pillows in front of you when recording, and get a carpet for your floor.
Thanks CT.. Always Stellar delivery!!
Where can I find information on the images at 8:10 of the video?
Unrelated question: When you buy Office key, where the F do you get download link?
Maybe in a few years Raja will leave so Tenstorrent can make some good products.
Yeah, I swear his track record seems to be constantly ignored.
Holy crap! Your commentary in this video is on another level. Good stuff.
Hey you should cover the feud between George Hotz and Jensen, he talked about it in a recent podcast with Lex Fridman
I can't wait for Tenstorrent to take off
Alternatively, I can't wait for AMD or someone else to make a machine learning ASIC that is an FPGA
Imagine killing off ML/AI usage of desktop GPUs like ASICs did for schitt-coin, but instead of only being viable for a few months, even days, before the algorithm changes and a new ASIC needs to be designed and manufactured, the ASIC could be re-designed on the fly for new algorithms, offering ASIC performance with CPU levels of relevance/life span
It’s TRANCETORRENT not tensOrent didn’t you listen?
@@HikarusVibrator That's my problem, I didn't see it spelled
@@denvera1g1 i was joking with you because the guy who made the video got it so horribly wrong
@@HikarusVibrator Hey man, I got it wrong too
tenstorrent
and I'm not going to remember that after hearing tensorrent so many times
(Edit, come to think of it, I don't think I can recall hearing it ever pronounced correctly except for maybe a Dr Ian Cutress video)
@@denvera1g1 this guy is talking about the intricacies of GPUs and giving very weighty opinions of companies/products in an industry full of very very smart people. He acts as an expert and convinces the fools who watch his channel that TRANStorrent are going to beat up Nvidia. Come oooon dude this is like the video game “journalist” that couldn’t get past the Cuphead demo. People who know tech never do this
Are there any examples of "open source" anything dethroning a major closed source company (other than Linux for servers)? Even Vulkan is hardly getting used anymore over DX12 for some reason.
Valve uses it pretty heavily for the Steamdeck.
Blender in 3D spaces
Star citizen is moving over to vulkan as well. When that gets going it will pull the rest of the games industry with it when they outsource their code to build huge interconnected worlds
Not all of us were sceptics :P I was telling people that ARM can grow on the back of big tech and dethrone some at the top
I think Tesla setting an example with Dojo will have a huge impact on the industry. Scaling to 100 exaflops in 1.5 years.
Greed, for lack of a better word, is good. Greed is right. Greed works . Jensen Huang
Over the next 10 years the market will require more capacity? Yes, and the capacity will be provided by Nvidia as always. Smaller players will have to buy the GPUs like everyone else; if they can't afford it they will have to buy cheaper models. Taking this argument and coming to the conclusion that new chip manufacturers will enter the market is nonsense. The trend is clear: firms that produce CPUs, GPUs and storage devices will become fewer over time, not more.
If I had drunk a shot whenever you said "disrupt" I would be dead
I like AMD going into chiplets. And I will not judge chiplets on video cards, until the technology is more mature.
Wow, way too much tech jargon
It would be great if the industry could create a modular compute architecture at the hardware level. FPGAs are the expensive approach and ASICs are the cheap approach. CPUs and GPUs are the "everything is a nail so let's use a hammer" approach.
Closed system where you can do nothing or open source where you have to do many things yourself? I think the thing that's the easiest to use will win. And I'm pretty sure that would be some user friendly but effective AI.
Mr.Raja is the Nvidia Killer ..King Shark
These guys couldn't get it done with AMD, they couldn't do it with Intel's money. They aren't going to threaten Nvidia with a broke ass startup running the discount architecture. RISC-V is IoT junk for OEMs not willing to pay the ARM licensing fee.
this would be interesting 4 open source! good luck 2 all...some super talents coming together,
2 achieve a historic purpose?😎👍 this will probably be a DPU!! or CPU+DPU+GPU!! wow! I get excited!!
Great video, I was going through a huge pessimistic wave lately but this video gave me a bit of hope for the future.
Guy who doesn’t know company’s real name impresses you eh. Nice.
@@HikarusVibrator yes
I basically watch your videos because I love the way you speak, and your knowledge is amazing. Nvidia reminds me of the Intel of old under Andy Grove, who wrote the headline making book ONLY THE PARANOID SURVIVE. When he was the head of Intel that company could not be overtaken by any other company because he was always scared, and he made his company work so hard to make sure no other company takes any lead. The current CEO of Nvidia is the same: he is scared, always scared. He will never give an inch. He will push boundary after boundary to be the leader of the industry.
I am now 61 years old and started using computers when they were called word processors way back in 1980. I always keep the latest machine with the greatest hardware. My GPU has always been Nvidia. I tried ATI two times and both cards failed. No Nvidia card has ever failed for me. I am now on an AMD 7950X. I wanted to upgrade to the 7950X3D but when I saw virtually no performance increase, I decided to wait. In the meantime, I bought a gaming laptop with the 13980HX, and I am amazed by the speed. For my desktop, when I had no choice but to upgrade to a new DDR5 motherboard, I had the choice of going for Raptor Lake, but I said to myself: AMD is giving an upgrade route while Intel is end of life for this socket, and the very high power draw scared me. So, I thought I was safe in AMD land. Surprise. Surprise. Intel is now releasing updated Raptor Lake which should beat any AMD processor, even their 3D ones. So, I think even when Intel is so down with its antiquated chip making facilities it is not letting AMD really take over in any way. Once it catches up it will again push AMD towards the grave.
Nvidia is the same. I am an Indian and Raja Koduri seems like a nice guy, but he has constantly failed as he does not have the craziness of the current Nvidia CEO. That guy can do anything. Make any gamble, but his gambles are very calculated and mostly pay off. I am now just a little scared of the new upcoming 5 series products as I think the GPU will be the complete computer where you install a CPU and SSD inside! I just hope I am not tempted to upgrade as my current machine is already using a 1200W power supply.
Nvidia has long sold GPUs, and now also CPUs and more integrated systems, with their stated goal being to create accelerators of different kinds, starting with GPUs. The thing is that Nvidia has not created a chip purely for machine learning; rather, it has molded its GPUs to fit.
Damn!!!! This dude's voice is mad deep.
AMD isn't necessarily "following Intel in CPU land" - from what I hear, AMD is actually ahead of Intel in CPUs.
You're wrong! Wrong, I say! There's no stopping Big Green; they provide stability to the market just like International Business Machines did 😮😮😮.....
Brilliant analysis coreteks.
Paraphrasing Coreteks: "7900 series, worst cards made by AMD"
GPU sales: 7900 series at current prices are selling like hotcakes.
Hard disks are not going away just because SSDs exist. SSDs cannot be used for long-term data storage, so hard disks will always be necessary, and there is really no reason for hard disk manufacturers to pay attention to SSDs.
Thank you coreteks team...
PLEASE, Coreteks! You were always such a groovy, relaxed-tempo orator. NOW you either use that "natural pause removal algorithm" or you've been trained to speak like that. Anyway, after a sentence, at the full stop, breathe for one second. Read some poetry, for goodness' sake, so you don't forget the rhythm of syllables. OrMaybeWeRemoveSpacesAndDecideToSeparateWordsLikeThis?
I don't know if Raja will dethrone Nvidia. Didn't Intel give him the boot recently?
Not according to Intel. I am unaware of any internal dialog at Intel indicating displeasure with actually getting products to market. He did leave.
Solid and interesting analysis, as always.
It's TENStorrent, not Trenstorrent
@e2rqey my bad
So far every single one of Raja's projects has been a failure: AMD Vega and Intel's GPUs.
Polaris was a solid midrange GPU range at low-end pricing.
My RX 580 8GB is still good for 1080p modern gaming, and it's good at compute.
Got it for like $170 brand new a few years ago, and that was a steal considering the midrange to high-end GPU market now.
Let's ship Raja to Apple for a little while to allow MS to catch up.
@@scroopynooperz9051 your RX 580 is good for your requirements, but it's an outdated GPU; even the 1080 Ti is outdated.
AMD Radeon's poor efficiency, lack of features, their lying and terrible pricing had me purchasing a 4090... I still bought into their new AM5 platform though.
Speaking of new technologies that companies ignore.
Optane.
Intel's strategy for Optane is utter garbage.
Unless that Optane really does cost $200 per 120GB to manufacture, Intel was asking datacenter prices from its consumer market.
Do you know what is going to need Optane in the next 4 years?
QLC NVMe
Just like it's impossible to buy SLC NVMe drives and very hard to find MLC ones, TLC will go the same way in about 4 years.
What happens when you write more than 10GB at once to a high-end 4TB Gen 4 QLC NVMe that is half full?
It slows to about 40MB/s, only slightly faster than USB 2.0.
Do you know what was faster than that $379 NVMe at transferring files from my camera?
A $299 16TB HDD. Instead of starting out at ~4600MB/s and dropping down to 40, the SATA HDD started out at 300MB/s and sometimes dropped to 260MB/s; transferring that ~60GB of photos and videos (200Mbps 4K60) was about 50% faster with the HDD than with the QLC NVMe.
If I had a 120GB Optane drive as a cache, that file transfer would have run at 1500MB/s no matter which drive I was using, as long as it didn't go too much over 120GB (a rough sketch of this arithmetic follows below).
This is why my new file server is HDD + Optane (4x 960GB 905P) instead of my old all-flash file server, which is now my on-site backup plus a portable sync server for my off-site location that doesn't have wired internet.
The upsides of my old file server are that 1) pure flash saves a lot on energy, and 2) rebuilds are far less stressful, because reading from an SSD is very unlikely to kill it, unlike an HDD, which is equally likely to die on reads and writes.
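A rough way to see the mechanism described in the comment above: once a QLC drive's pseudo-SLC cache is exhausted, the remaining gigabytes at ~40MB/s dominate the total time, while a sustained-speed HDD or an Optane write cache never hits that cliff. The sketch below is a minimal back-of-the-envelope model using the speeds quoted in the comment; the size of the drive's fast region (assumed 10GB here) and the ~280MB/s HDD average are assumptions, so the exact ratio will differ from the real-world "about 50% faster" result.

```python
# Back-of-the-envelope sketch of the transfer times described above.
# Speeds come from the comment; the size of the QLC drive's fast
# (pseudo-SLC) region is an assumption, since the comment only says the
# slowdown appears after writing "more than 10GB at once".

TRANSFER_GB = 60  # ~60GB of photos and 200Mbps 4K60 video

def qlc_nvme_seconds(total_gb, fast_region_gb=10, fast_mbps=4600, slow_mbps=40):
    """QLC NVMe: full speed until the pseudo-SLC cache fills, then it crawls."""
    fast_gb = min(total_gb, fast_region_gb)
    slow_gb = total_gb - fast_gb
    return fast_gb * 1000 / fast_mbps + slow_gb * 1000 / slow_mbps

def hdd_seconds(total_gb, mbps=280):
    """SATA HDD: roughly steady 260-300MB/s, so use ~280MB/s as an average."""
    return total_gb * 1000 / mbps

def optane_cached_seconds(total_gb, cache_gb=120, cache_mbps=1500):
    """Optane write cache: absorbs the whole burst as long as it fits."""
    assert total_gb <= cache_gb, "burst larger than the cache; backing drive takes over"
    return total_gb * 1000 / cache_mbps

for name, secs in [("QLC NVMe (SLC cache exhausted)", qlc_nvme_seconds(TRANSFER_GB)),
                   ("16TB SATA HDD",                  hdd_seconds(TRANSFER_GB)),
                   ("120GB Optane cache",             optane_cached_seconds(TRANSFER_GB))]:
    print(f"{name:32s} ~{secs / 60:5.1f} minutes")
```

With these assumed numbers the QLC drive needs roughly 20 minutes, the HDD under 4, and the Optane-cached path under 1; the gap versus the comment's real-world "50% faster" comes down to how large the drive's fast region actually was during that transfer.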
Intel's strategy for Optane WAS utter garbage
@@Unicorn-Coin To penetrate a market, it helps to have a household name. They needed something like the 960GB 905P at $250, not advertised as a selling point for your other products because it 'requires' specific newer processors,
instead of waiting several years to drop it to $399 while constantly hiding the fact that it actually works great on older Intel parts and newer AMD parts.
By letting the perception take hold that Optane is 10-20x more expensive than NAND and requires you to buy overpriced parts to even use it, Intel poisoned public opinion.
Yes, the datacenter is where you make the money, but if Intel were smart like Nvidia and AMD used to be before 2020, they would have affordable consumer parts.
What makes businesses rush to buy good products is their employees having good experiences with the consumer versions of those products; 10k sales pitches aren't as good as hearing your own employees sing a product's praises.
@@Unicorn-Coin To clarify, I don't think it was the right time to market it for gaming, but in 4 years, when we have PLC NAND Gen 6 NVMe drives that slow down to 20MB/s while installing game updates, that is when Optane will be for gaming.
My other comment about consumer products is directed at homelab people: people who spent maybe $1500 building their own 20+TB file server, people who have a lot of games but can't afford an 8TB NVMe, so they get a 16TB HDD and a 1TB Optane for 1/4 the price.
I think Intel's strategy to target gaming was on the right track, but their marketing was all wrong.
To make it successful, they needed a different software suite that allowed older and non-Intel products to work (without buying 3rd-party tools like PrimoCache). Sure, those lack hardware acceleration, but the walled-garden approach of Apple will scare off ANY personal computer enthusiast. People stay away from Apple because they like owning their own hardware, and just a hint of 'this is not your computer, you are not allowed to use it that way' makes most enthusiasts turn away.
Sure, Intel fanboys would buy it, but even I, an agnostic who only had Intel until I built a 3950X file server, did not buy Optane because of the perception of 'having my rights taken away'.
Tenstorrent just needs to play the long game....
They won't be a huge player until at least 2025...
Truths in this presentation. Thank you for being open and honest with your findings; you are of a dying breed.
x86 licence fee? I've never encountered it? ? ? ? ?
Here! The more you buy, the more you save! 😂
You're showing videos from the 70s and 80s about hard drives being inferior to SSDs and claiming they were wrong. Sure, they were wrong 30 years later. They didn't make a mistake at all going for hard disks until SSDs became viable decades later.
I gave up on investing and have AI do it for me, but it keeps buying more Nvidia 😭
Open source will save capitalism
Perfect, thanks for sharing.
Midjourney is closed though