NVidia are SCARED!

  • Published: 16 Nov 2024

Comments • 260

  • @Coreteks
    @Coreteks  Год назад +66

    Sorry I mispronounced tenstorrent, my bad

    • @pf100andahalf
      @pf100andahalf Год назад +4

      Lol! I was wondering what was going on. It's okay man, I made a mistake once.

    • @jstro-hobbytech
      @jstro-hobbytech Год назад

      I was just kidding with my comment. Great content. Some of your predictions from a few years ago are close to fruition, I suspect.

    • @olternaut
      @olternaut Год назад +1

      Is there going to be a part 2 video? You only touched on Tenstorrent. Will the consumer eventually benefit from RISC-V or what??

    • @dawidvanstraaten
      @dawidvanstraaten Год назад

      At least we now know Tenstorrent processors are chips on Tren

    • @HikarusVibrator
      @HikarusVibrator Год назад

      It’s not a mispronunciation, you said a completely different word. Which means you haven’t read the company name in articles/docs more than a handful of times. Which means you’re talking with a hell of a lot of self-assumed authority for someone who basically knows nothing

  • @genblob
    @genblob Год назад +94

    I don't care about AI stuff but hearing about a company creating an open alternative using RISC-V to overthrow Nvidia's monopoly is great news. I hope the same happens in the DIY market because it has become incredibly stale and boring. Everyone using RISC-V CPUs sounds like a pipe dream, but I think it will happen one day.

    • @Slavolko
      @Slavolko Год назад +5

      It'll be a pipe dream so long as the OS and software don't support it. Apple has transitioned to custom ARM chips because they've put in the work to support it in software, but we don't quite see that on the PC side.

    • @Einygmar
      @Einygmar Год назад +4

      @@Slavolko As far as I understand, as long as all vendors implement the same standardized instruction set this should work fine. Like with graphics APIs, where each vendor implements DirectX or Vulkan support for its hardware so that software developers can use the same code for all the supported GPUs.
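
      A rough analogy of that "one standard, many implementations" idea, sketched in Python (toy code with made-up vendor names, not a real graphics or compute API): application code is written once against the shared interface, and each vendor supplies its own implementation underneath.

      ```python
      from abc import ABC, abstractmethod

      class ComputeDevice(ABC):
          """The 'standard': every vendor must expose the same operations."""
          @abstractmethod
          def matmul(self, a, b): ...

      class VendorA(ComputeDevice):
          def matmul(self, a, b):
              # vendor-specific detail hidden behind the common interface
              return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)] for row in a]

      class VendorB(ComputeDevice):
          def matmul(self, a, b):
              return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)] for row in a]

      def run_app(device: ComputeDevice):
          # Application code only ever talks to the standard interface.
          return device.matmul([[1, 2], [3, 4]], [[5, 6], [7, 8]])

      print(run_app(VendorA()))  # [[19, 22], [43, 50]]
      print(run_app(VendorB()))  # same result on different "hardware"
      ```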

    • @Slavolko
      @Slavolko Год назад +2

      @@Einygmar I know, but my point is that you'll need to convince OS and software developers to support your new hardware standard. That's all.

    • @AndersHass
      @AndersHass Год назад +1

      @@Einygmar The issue is most software isn't written for RISC-V, so a lot of work is needed to make it run on RISC-V (like having a translation layer, as Apple has done from x86 to ARM, to ease usage before software is properly ported).
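
      A minimal sketch of what such a translation layer conceptually does (toy code with invented mini-ISAs; real binary translators like Rosetta 2 also handle registers, flags, memory models and JIT caching):

      ```python
      # Map each legacy instruction to its equivalent on the new ISA.
      X86_TO_ARM = {
          "mov":  "mov",   # many instructions translate 1:1
          "add":  "add",
          "imul": "mul",   # others need renaming or multi-instruction expansions
      }

      def translate(block):
          """Translate a block of 'x86-like' instructions into 'ARM-like' ones."""
          out = []
          for instr in block:
              op, *args = instr.split()
              out.append(" ".join([X86_TO_ARM[op]] + args))
          return out

      legacy = ["mov r0 5", "imul r0 r1", "add r0 r2"]
      print(translate(legacy))  # ['mov r0 5', 'mul r0 r1', 'add r0 r2']
      ```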

    • @Real_MisterSir
      @Real_MisterSir Год назад +6

      The thing is, their approach also requires that the customers are capable of building their own custom models that are better than the boxed solution Nvidia ships - and this is the major caveat here. The highlighted companies like Google, Amazon, Tesla, etc. - they are all big tech companies with massive resource pools and existing highly skilled software teams that can take on such jobs. But that isn't the case for the majority of the industry's needs as a whole. Most companies that actually benefit from these products don't have fully stacked teams of Google/Microsoft-level technicians and developers at their disposal, and they don't have the time or expertise to scout the market for people to perform those tasks through 3rd-party services.
      With Nvidia you get something that works out of the gate, has reliable support, and a major community of other players using the exact same systems - troubleshooting the exact same problems, and reaching solutions and knowledge gains that are very hard to replicate with bespoke in-house software.
      It's also one of the reasons Apple's OS and Windows are so popular -only here at a far lower economic scale for the end user, but the concept is the same. Nvidia isn't looking just at the big established corpos - they're looking at all of the up-and-coming enterprises that are looking to build their systems off of modern AI solutions and computing systems. These companies will gobble up Nvidia's product stack like hot bread.
      Said in another way, it's far easier to build a product stack that allows for full end user customization, than it is to create a full system lineup of hardware and software that works out of the box and has worldwide use and experience behind it. Nvidia can always make the switch when they need to - but RISCV based system manufacturers can't do the opposite. They're already locked - similar to how AMD will never reach the software suite integration and support that Nvidia has in enterprise. It would require decades of effort to just get on the same playing field, and even then you still have to convince customers to stop using what they've been using for years. I'd much rather be in Nvidia's shoes here, they have more options - which is a major benefit in a rapidly shifting industry.

  • @esra_erimez
    @esra_erimez Год назад +28

    Ian Cutress (TechTechPotato) has some great interviews with Jim Keller regarding Tenstorrent

  • @esra_erimez
    @esra_erimez Год назад +140

    Jim Keller is a living legend

    • @jstro-hobbytech
      @jstro-hobbytech Год назад +4

      Who has had his company renamed hahaha. I kid, but I'm sure the second letter in Keller's company isn't an 'R'.
      I use RISC-V microcontrollers a lot and it'll be good to see better FP performance trickle down, hopefully. Drawing simple graphics to a tiny screen with a dual-core ESP32 IC is painful if you want to make menus. It's why I use a Teensy 4.1 for my silly builds.
      I've been excited about Tenstorrent for a while.

    • @iwishilivedinafreecountry5749
      @iwishilivedinafreecountry5749 Год назад

      And Raja is legendarily fucking useless.

    • @MrYevelnad
      @MrYevelnad Год назад +9

      Without Keller, AMD would likely have gone bankrupt by now.

    • @ctsd623
      @ctsd623 Год назад +5

      Makes no sense. Keller stopped working at AMD years ago... he does short contract jobs. He comes in, designs/organizes a masterpiece, gets bored, and moves on to the next company/project. That's what he does...

    • @jstro-hobbytech
      @jstro-hobbytech Год назад

      @@ctsd623 He does it out of passion and works quickly, but results are not seen for at least 3 to 4 years when it comes to silicon development. Plus he gets shares, and I bet a small percentage at the other end like some actors do. At this point though, I suspect he'd do it for free if he believes in it. In fact I bet at TT he hasn't made any ROI yet. Yet, haha. He deserves it though. He looks at chips like I do at guitar, though a better analogue would be a player like Paul Masvidal or Roger Patterson haha

  • @lucasLSD
    @lucasLSD Год назад +36

    This is so amazing. Years ago open hardware sounded like a joke, but now it's a threat.

    • @ireallyreallyreallylikethisimg
      @ireallyreallyreallylikethisimg Год назад +2

      Good.

    • @estring123
      @estring123 Год назад +5

      Open source AI models are also a threat. A leaked Google document says Google and OpenAI have no moat, and open source might crush them both.

  • @Merrinen
    @Merrinen Год назад +17

    The question Jensen would ask when talking about this topic with Keller is "but does it run Crysis?"

    • @miyagiryota9238
      @miyagiryota9238 Год назад

      Nvidia can't run Crysis either 😅😅😅😅😅

  • @XMomaR
    @XMomaR Год назад +40

    The more you buy, the more you save! Jensen's AI stranglehold may well fade, Raja-Keller Tenstorrent chips maybe gonna be killer sellers, so sayeth Coreteks: the futurist tech soothsayer!

  • @blackmennewstyle
    @blackmennewstyle Год назад +15

    What a time to be alive, open source hardware might become a reality 🔥🚀
    I still remain skeptical though, especially since Raja is involved, but I truly want him to do well in his collaboration with Jim...

  • @pfcokelly
    @pfcokelly Год назад +9

    I can hear EVGA yelling fuck yea from here.

  • @TechDunk
    @TechDunk Год назад +21

    Can we get a yay for competition and open source

    • @offline__
      @offline__ Год назад +2

      Sadly Nvidia didn't make my GPU's driver open source for some reason (GTX 770) 😭

    • @JacksonPhixesPhones
      @JacksonPhixesPhones Год назад +2

      ​@@offline__The open source Nvidia Linux driver has come a long way in a short time. It's still rough, but it'll get there. I'm with ya though, I think it was a dick move to open source ONLY the newer architectures. The 700 series seems like the perfect target for open source optimizations, because regardless of how old they are, a lot of people still use 'em, along with the 900 & DEFINITELY the 1000 series.

    • @JacksonPhixesPhones
      @JacksonPhixesPhones Год назад +1

      Correct me if I'm wrong, but I'm pretty sure that Pascal is still the most popular Nvidia architecture at the moment, when it comes to the number of active users (1060 & 1070 are the most popular I think)

    • @JacksonPhixesPhones
      @JacksonPhixesPhones Год назад +1

      YAY!

    • @offline__
      @offline__ Год назад +1

      @@JacksonPhixesPhones couldn't have said it better

  • @fredsorre6605
    @fredsorre6605 Год назад +28

    I really hope Jim Keller can put Nvidia in its place and offer a competing product with the same performance but at a third of the price. AI is still new enough that Nvidia's dominance in it can still be challenged.

    • @Moshe_Dayan44
      @Moshe_Dayan44 Год назад +4

      I said this years ago, when everybody thought Intel had completely crushed AMD, and I first heard that Jim Keller had come back to AMD to help them design their Ryzen CPU. He's the Luke Skywalker of chip design. He can do it. Darth Huang and the nGreedia empire can and will be defeated by Luke Keller. :)

    • @tringuyen7519
      @tringuyen7519 Год назад +3

      Nvidia’s CUDA is why it’s succeeding now & why it will lose in the future. CUDA is GPU specific. It doesn’t work with a hybrid APU architecture.

    • @arenzricodexd4409
      @arenzricodexd4409 Год назад

      @@Moshe_Dayan44 And there are probably tens if not hundreds of people like Jim Keller inside Nvidia.

    • @socratic-programmer
      @socratic-programmer Год назад

      @@tringuyen7519 the issue is that making a hybrid APU arch framework is really really hard. OpenCL was meant to be an example, SYCL a more recent thing (and that also has limitations). We need a bigger idea, I think.

    • @Moshe_Dayan44
      @Moshe_Dayan44 Год назад +2

      @@arenzricodexd4409 This is exactly what people said about Intel in 2016: "There are literally hundreds of engineers like Jim Keller at Intel! How can one guy beat hundreds of people! It's impossible!" Yet critical aspects of the Ryzen core, as well as the chiplet concept and the interconnect design, were all Keller's. One man DID undo Intel's iron monopoly grip on x86 big-iron CPUs. If you think Keller can't do it again, you're betting against a man who has never designed a dud chip. He was also the main architect of the DEC Alpha 21164 CPU, AMD's Athlon, and Apple's A4 and A5 chips, and the author of AMD's x86-64 64-bit instructions. Do you really want to bet against him?

  • @prolamer7
    @prolamer7 Год назад +6

    Well-put facts, but you are forgetting that the big players learned from the mistakes of the '80s-'90s. For example, Facebook or Google should have been superseded already, but instead they just buy their competitors. I think the same will happen with new competitors: Nvidia or AMD will just buy them...

  • @EnochGitongaKimathi
    @EnochGitongaKimathi Год назад +5

    It is very exciting to see the progress of RISC-V. People think the biggest advantage of RISC-V is that it is open source, but I think being open source only facilitates the biggest advantage of RISC-V: it is modular. You can put in what you need and leave out what you don't in the design and have an extremely custom chip. A chip doesn't need to be good at everything (scalar, vector, matrix and spatial instructions) if there is no need.
    The biggest obstacle for RISC-V remains manufacturing. Access to the most advanced process nodes from TSMC, Intel and Samsung can only be afforded by a few. Advanced process nodes offer the performance efficiency we need to make RISC-V something everyone wants. This is why the main target of high-performance RISC-V is the server market. Perhaps a company like Samsung can come to the rescue of RISC-V and develop Android smartphones using RISC-V. They could switch from using ARM for their Exynos mobile processors.
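
    A small illustration of that modularity in Python (the single-letter extensions are from the ratified RISC-V spec; the parsing itself is a simplified sketch, not a real tool): a core is described by an ISA string that names exactly the extensions it implements, so a vendor can ship rv64imac for a tiny controller and rv64gcv for an application core.

    ```python
    BASE_EXTS = {
        "i": "base integer", "m": "multiply/divide", "a": "atomics",
        "f": "single-precision float", "d": "double-precision float",
        "c": "compressed", "v": "vector",
    }

    def describe(isa: str):
        """Expand a RISC-V ISA string into the extensions it includes."""
        isa = isa.lower().removeprefix("rv32").removeprefix("rv64")
        isa = isa.replace("g", "imafd")  # 'g' is shorthand for the general-purpose bundle
        return [BASE_EXTS[ch] for ch in isa if ch in BASE_EXTS]

    print(describe("rv64imac"))  # embedded-style core: no FPU, no vector unit
    print(describe("rv64gcv"))   # application core with vector support
    ```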

  • @Pushing_Pixels
    @Pushing_Pixels Год назад +1

    It's interesting seeing how RISC-V is evolving. Something I think we will see become widespread in the near future is companies creating custom, hybrid solutions that combine RISC-V platforms as a base, but with highly customized, or even unique, proprietary IP blocks embedded alongside or within them. Every company will be able to have their own custom hardware designs, done either in-house or through specialist design companies, that they have IP rights over, working side by side with the open standard instructions, and potentially other IP licenced from third-parties. It will be the kind of heterogeneous computing AMD have been pushing, but far more customized and instead of customers being limited to AMD IP (or whoever) they will design their own, or take different ones from different designers and combine them with the open standard. Any company that can facilitate this process, from design co-ordination and IP integration, all the way to arranging fabrication and packaging, in a way that is more rapid and nimble than the big incumbents, will take off and become huge.

  • @trapexit
    @trapexit Год назад +6

    Until we get standards bodies and standards around the system as a whole, rather than just the ISA of the CPU, RISC-V will continue to be bespoke in use. Look at the issues Rene Rebe ran into with the early, pre-standard version of SIMD on the RISC-V SBC he has.

  • @juniorjunior8494
    @juniorjunior8494 Год назад +3

    One of your best videos, well portrayed. I am designing a RISC-V accelerator, and I can say firsthand that the pace of software development and maturity is impressively fast. An area where RISC-V lagged was verification tooling, and this has been maturing well and is becoming more robust. Also, almost all the cool innovation in hardware is happening in RISC-V. The biggest advantage is this velocity and freedom. We're already reaching the limits of silicon, which means that now specialization is the way to go to extract the next orders of magnitude of compute performance.

  • @ageofdoge
    @ageofdoge Год назад +13

    It's great to see RISC-V becoming a major factor.
    What do you think of Dojo? Tesla claims they will use it to go from something like 5-6 exaflops to 100 in the next 14 months. Who knows when they might make any of that available to the outside world but I imagine they will at some point and it seems like they could cut into the market pretty good when they do.

    • @ireallyreallyreallylikethisimg
      @ireallyreallyreallylikethisimg Год назад

      100 exaflops? In 14 months?? Ain't no way.

    • @YolandaPlayne
      @YolandaPlayne Год назад +1

      @@ireallyreallyreallylikethisimg Agreed. I don't believe it. Just another marketing promise by Elon. "Self driving"? A trademark, not a feature.

    • @ageofdoge
      @ageofdoge Год назад +3

      @@YolandaPlayne Self driving is a thing no one has accomplished before and is inherently unpredictable.
      If they are planning to build this in the next fourteen months, it means they've already lined up the production with TSMC. It's not the kind of thing you can place a last-minute order for. Since there's already Tesla-designed silicon in the cars, I think it's safe to assume they know how the process works.

    • @daniel_960_
      @daniel_960_ Год назад

      @@YolandaPlayne Seems believable. All they have to do is order fab capacity from TSMC. It's also 7nm, which should have plenty of affordable capacity.
      Their solution seems incredibly scalable too, with 25 chips in one tile.

    • @YolandaPlayne
      @YolandaPlayne Год назад

      @@daniel_960_ capacity is tight, this is known.

  • @christiansrensen3810
    @christiansrensen3810 Год назад +2

    Before Nvidia... the king, 3dfx, ruled gaming.
    Before Apple... Nokia ruled the phone market.
    Philips... was once dominant in television.
    Kodak... ruled the photo industry.
    Blockbuster... was the market leader and king.
    All these companies (except Philips) are almost extinct.

  • @skywalker1991
    @skywalker1991 Год назад +7

    Jim Keller has been with Tenstorrent for a few years now. Raja joining might ruin his project; Raja's track record is not very good.

  • @WildEngineering
    @WildEngineering Год назад +3

    TENS TOR ENT - THERE IS NO R

  • @ronnyspanneveld8110
    @ronnyspanneveld8110 Год назад +2

    Not much to say about it, agree. (Ex-DEC Alpha employee here :P)

  • @chillidog8726
    @chillidog8726 Год назад +1

    I'm not sure where you got the Tenstorrent-LG information, but I've been reading up on it for some time.
    Tenstorrent seems to only license their RISC-V CPU cores, not including any AI hardware.
    And remembering Ian Cutress's interview with Jim Keller:
    Jim Keller said something like "surprisingly, people want to license just the CPU cores, so we are probably gonna do that."
    So I think LG is just looking to move away from Arm for their smart-TV embedded chip and use Tenstorrent CPU IP to produce its own smart TV chip, which would run a Linux-based operating system. Likely nothing to do with AI, or at least no hint of it having anything to do with it.

  • @Michael_Brock
    @Michael_Brock Год назад +2

    Both WD and Seagate have made the successful transition from hard disks to SSDs; I am sure other companies have made the change too.

  • @YourSkyliner
    @YourSkyliner Год назад +10

    I will never again trust in a project that Raja Koduri is a part of

    • @Th3_Gael
      @Th3_Gael Год назад

      Keller kinda offsets Koduri imo

    • @donnydarko7624
      @donnydarko7624 Год назад

      ​@@Unicorn-CoinPat Gelsinger does. Look at the 180 with arc since Raja was removed from overseeing it.

  • @adi6293
    @adi6293 Год назад +8

    So when is nVidia going to buy this company?

  • @maxwellsmart3156
    @maxwellsmart3156 Год назад +2

    Why do you think AMD's goal is to disrupt Nvidia? Most likely it's what you wanted to happen and then you say RX 7900 is the 'worst' because it wasn't cheap enough. I guess you never factored in that prices would fall. The RX 7900XT is cheaper than a RTX 4070Ti, consumers will still buy the 4070Ti because RT and DLSS3. The market is a multi-variable equation. How cheap would the RX 7900 series need to be to make it the 'best'? AMD's goal is to make a profit with good margins because they need to justify their actions to the board and stakeholders.

  • @domm6812
    @domm6812 Год назад +1

    Yep ...intel is a prime example of how you can suddenly fall down when you're at the top. Ignoring innovation, blatantly milking consumers and prioritising shareholder demands and margins above all else. Intel is lucky they may have caught the fall in time to climb back up ...we'll see.

  • @LaMouche99
    @LaMouche99 Год назад +4

    "Tenstorrent" not "Trenstorrent" Celso

  • @pf100andahalf
    @pf100andahalf Год назад +3

    Until nvidia (and amd) decide that the crypto mining boom is over and stop fleecing their normal (average) gpu customers for everything they can squeeze out of them, I will continue to hope that something happens that gives them some humility. Will that ever happen? Probably not. But I can hope.

  • @AnthonyRBlacker
    @AnthonyRBlacker Год назад

    I remember the Sun Microsystems computer in the engineering lab at NJIT in Newark, NJ had 1 GB of ram. That was 1996. It was unheard of. : )

  • @AlexSeesing
    @AlexSeesing Год назад

    Going smaller doesn't always mean smaller nodes; it can just mean making things way more simple in order to scale through parallelization. How else would those compute units become chiplets?

  • @gecko2000405
    @gecko2000405 Год назад +1

    Why Intel even allowed these guys to leave is beyond me. They're the greatest minds for GPUs and CPUs.

  • @bobbavet
    @bobbavet Год назад +1

    Thanks for another quality video, Coreteks. Any chance of investigating AI hardware and the benchmarking of it? How can we measure the performance of AI? Is anything readily available to measure the use of AI, GPU vs GPU, CPU vs CPU, GPU vs CPU?

  • @Crihnoss
    @Crihnoss Год назад +1

    I'm curious how RISC-V performance stacks up against the other architectures. That will no doubt affect its adoption.

  • @tomtomkowski7653
    @tomtomkowski7653 Год назад +15

    AMD had the opportunity of the decade: Nvidia had so much Ampere stock that they raised prices like crazy and left the door wide open.
    AMD went for chiplets to cut costs, and what have we got?
    The 7900 XTX, as fast as the 4080, for $1000 just because Nvidia raised the price of the 4080 to $1200.
    The 7900 XT, as fast as the 4070 Ti, for $800 just because Nvidia raised the price of the 4070 Ti to $800.
    AMD had a chance and they blew it big time, because RDNA 3 didn't deliver and we got nothing from the chiplet savings. I'm afraid that with the AI market it will be a similar story.
    AMD was able to fight Intel because Intel was a big, lazy, self-confident corporation used to the monopoly it had. Nvidia is a different kind of player, ruled purely by greed.
    As for RISC-V - sooner or later Nvidia (or Intel, or AMD) will buy it, and that will be it.

    • @puriko89
      @puriko89 Год назад +7

      It looks like AMD can't see past the next quarter. They want to imitate Nvidia in any way possible.

    • @skywalker1991
      @skywalker1991 Год назад +3

      Chiplets saved AMD money, but then the GPUs are not selling and AMD is losing money after all, lol

    • @tdreamgmail
      @tdreamgmail Год назад +1

      AMD eventually creates superior hardware only to lose again with inferior software. No one is replacing CUDA at this stage.

    • @macronomicus
      @macronomicus Год назад

      Both AMD & NVIDIA had record-breaking unsold last-gen backstock to sell through; that & all the unfounded fears of recession are why prices were so high. It worked: both companies have largely cleared the backlog, & both are bringing down prices. I suspect they won't bring prices too much lower though, because the silicon is more profitable as AI chips.

    • @heyno3306
      @heyno3306 Год назад

      First gen chiplets

  • @Ojref1
    @Ojref1 Год назад +7

    Raja is a doom upon all he touches. Putting him onboard your ship is like ordering the crew to punch a 10 foot hole in the side of your boat.

    • @pilsen8920
      @pilsen8920 Год назад

      Thanks, I couldn't have said it better.

  • @Kaptime
    @Kaptime Год назад

    Thinking more on this, you can see why Xilinx was such a key pickup for AMD. Being able to sell a chip with its own FPGA you can customize for a specific AI workload is a great idea. Then selling a GPU or dedicated chip that can perform that task is a killer sell. If ROCm were as easy to use as CUDA, they would be able to have the whole stack under control.

  • @pilsen8920
    @pilsen8920 Год назад

    Jim's interview with TechTechPotato was the 1st time I figured out what Jim was building. I think he has a winning model.

  • @sersou
    @sersou Год назад +5

    Tenstorrent, not T"R"enstorrent I believe.

    • @Coreteks
      @Coreteks  Год назад +2

      @sersou my bad

    • @HikarusVibrator
      @HikarusVibrator Год назад

      Unbelievable that we hear a voice and assume they’re an expert and meanwhile they don’t even know the company name. It’s so shameful. How could I ever think “oh hey that guy over there said so and so, I’ll take that opinion on board” after this. Hillarious shit man

    • @HikarusVibrator
      @HikarusVibrator Год назад

      No it’s Transtorrent, the opposite of Cistorrent

  • @anndroid2161
    @anndroid2161 Год назад +5

    LG aren't just licensing generic RISC-V chip designs from Tenstorrent though, are they? I suspect they're licensing their AI add-on modules as well, otherwise they'd just use an open-source RISC-V core and save themselves the licence fee.
    If that's the case then the story is just "LG switches chip vendors", as they'll be locked in to Tenstorrent's RISC-V extensions just as if they'd gone with ARM or Nvidia.

  • @bobmnz6914
    @bobmnz6914 Год назад

    Let me know when it arrives. I'll be interested in how it handles graphics. But I am also interested in if/how Unreal Engine 5 is going to change things on the graphics front. Will we be able to run high-quality graphics without a high-quality card?

  • @ChittyBang66
    @ChittyBang66 Год назад

    Hope a company uses this open hardware to make an affordable 4K 240Hz nano-LED gaming monitor.

  • @DanT10
    @DanT10 Год назад +1

    This video highlights the issues with capitalism in general. Infinite growth in a finite world means death.

  • @Dmwntkp99
    @Dmwntkp99 Год назад +7

    Make GPUs upgradable like motherboards.

    • @gregandark8571
      @gregandark8571 Год назад +2

      Make GPU chips swappable like you do with your CPU when it's time to change it.

    • @Dmwntkp99
      @Dmwntkp99 Год назад +3

      @@gregandark8571 Would certainly improve longevity of ownership of the card.

    • @n0madfernan257
      @n0madfernan257 Год назад +1

      you could pitch-in the idea to investors, go with it

  • @Voidkitty_
    @Voidkitty_ Год назад +1

    Yay another episode of hopium for the tech industry with the most soothing voice ever

  • @HTV-2_Hypersonic_Glide_Vehicle
    @HTV-2_Hypersonic_Glide_Vehicle Год назад +1

    "NVidia is*", not "are". NVidia is not multiple entities.

  • @aaronmcquaid
    @aaronmcquaid Год назад +2

    How do we invest in Tenstorrent?

  • @platin2148
    @platin2148 Год назад +4

    Pff most of the AI stuff isn’t really great for the end consumer. Most use cases can’t even generate money in any reasonable way.

    • @questmarq7901
      @questmarq7901 Год назад +1

      Gene editing, Healthcare prediction, individualized drug creation (protein folding), robotics symbiosis, robotics in general.. there is no market that will not get disrupted by AI

  • @El.Duder-ino
    @El.Duder-ino Год назад

    AMD and Intel should really watch this! Jim Keller, as an industry veteran, completely understands the "hole in the market", not just from the technological perspective but from the business model perspective as well! Tenstorrent's concept and model are excellent and their approach is future-proof. I just hope it's not too advanced and forward-looking, as the industry might not be ready for something like this and would rather let Nvidia milk it even more... Nevertheless, something new and innovative, including open source, is much needed as Nvidia becomes something like a monopoly on the AI market, getting more powerful and wealthier by the hour... just imagine what would happen if they merged with Arm...

  • @El.Duder-ino
    @El.Duder-ino Год назад

    Which company did Intel acquire whose IP they used for Thread Director?

  • @SweatLaserXP
    @SweatLaserXP Год назад

    Discrete graphics ain't going anywhere for a while, and nVidia can transition to a market share approach where they'll make some long-term commitments at a discount with TSMC and/or Samsung and/or whichever chip fab. The thing is, with nVidia, they have top-notch driver maturity, so they will play well with games, editing, simulation, etc. In other words, they're reliable for hardware compatibility. So there is some incentive for them to shrink some of their last-gen chips down and go the budget route.

  • @aalhard
    @aalhard Год назад

    This is indeed a crazy time to watch the industry

  • @mikebruzzone9570
    @mikebruzzone9570 Год назад

    Good report, Celso. I copied your report link into my SA distribution, which you can find at my Seeking Alpha comment line, and as always I have added my own observations. mb

  • @pierrebroccoli.9396
    @pierrebroccoli.9396 Год назад

    A very astute video, as much about the nature of corporate structures as it is about technology.
    The irony is that everything is being forced into the corporate structure of operation nowadays, including public services and government. Talking to a DB developer at a startup whose company was going for a public listing - I mentioned to him it would be a great time to get out and look at something new. As companies and structures become too large, they fall in on their own footprint.
    As one advocating for open AI systems and keeping software and hardware open and accessible - this is good news.

  • @GegoXaren
    @GegoXaren Год назад

    Cheeses... What is with that echo, it's way worse than normal?
    Put a duvet behind, and a couple of pillows in front of you when recording, and get a carpet for your floor.

  • @1Aquadon
    @1Aquadon Год назад

    Thanks CT.. Always Stellar delivery!!

  • @SARS1PP1US
    @SARS1PP1US Год назад

    Where can I find information on the images at 8:10 of the video?

  • @alpha007org
    @alpha007org Год назад

    Unrelated question: when you buy an Office key, where the F do you get the download link?

  • @rightwingsafetysquad9872
    @rightwingsafetysquad9872 Год назад +2

    Maybe in a few years Raja will leave so Tenstorrent can make some good products.

    • @donnydarko7624
      @donnydarko7624 Год назад +1

      Yeah, I swear his track record seems to be constantly ignored.

  • @kyounokuma
    @kyounokuma Год назад

    Holy crap! Your commentary in this video is on another level. Good stuff.

  • @antemasq6520
    @antemasq6520 Год назад

    Hey, you should cover the feud between George Hotz and Jensen; he talked about it in a recent podcast with Lex Fridman.

  • @denvera1g1
    @denvera1g1 Год назад +2

    I can't wait for Tenstorrent to take off.
    Alternatively, I can't wait for AMD or someone else to make a machine learning ASIC that is an FPGA.
    Imagine killing off ML/AI usage of desktop GPUs like ASICs did for schitt-coin, but instead of only being viable for a few months, even days, before the algorithm changes and a new ASIC needs to be designed and manufactured, the ASIC could be re-designed on the fly for new algorithms, offering ASIC performance with CPU levels of relevance/lifespan.

    • @HikarusVibrator
      @HikarusVibrator Год назад

      It’s TRANCETORRENT not tensOrent didn’t you listen?

    • @denvera1g1
      @denvera1g1 Год назад

      @@HikarusVibrator Thats my problem, i didnt see it spelled

    • @HikarusVibrator
      @HikarusVibrator Год назад +1

      @@denvera1g1 i was joking with you because the guy who made the video got it so horribly wrong

    • @denvera1g1
      @denvera1g1 Год назад

      @@HikarusVibrator Hey man, i got it wrong too
      tenstorrent
      and i'm not going to remember that after hearing tensorrent so many times
      (Edit, come to think of it, i dont think i can recall hearing it ever pronounced correctly except for maybe a Dr Ian Cutress video)

    • @HikarusVibrator
      @HikarusVibrator Год назад

      @@denvera1g1 this guy is talking about the intricacies of GPUs and giving very weighty opinions of companies/products in an industry full of very very smart people. He acts as an expert and convinces the fools who watch his channel that TRANStorrent are going to beat up Nvidia. Come oooon dude this is like the video game “journalist” that couldn’t get past the Cuphead demo. People who know tech never do this

  • @sinephase
    @sinephase Год назад +6

    Are there any examples of "open source" anything dethroning a major closed source company (other than Linux for servers)? Even Vulkan is hardly getting used anymore over DX12 for some reason.

    • @phoenixrising4995
      @phoenixrising4995 Год назад +4

      Valve uses it pretty heavily for the Steamdeck.

    • @shinkiro69420
      @shinkiro69420 Год назад

      Blender in 3D spaces

    • @scislife2398
      @scislife2398 Год назад +1

      Star Citizen is moving over to Vulkan as well. When that gets going it will pull the rest of the games industry with it when they outsource their code to build huge interconnected worlds.

  • @MrValgard
    @MrValgard Год назад

    Not all of us were sceptics :P I was telling people that ARM can grow on the back of big tech and dethrone some at the top.

  • @daniel_960_
    @daniel_960_ Год назад

    I think Tesla setting an example with Dojo will have a huge impact on the industry. Scaling to 100 exaflops in 1.5 years.

  • @johnnyutah9939
    @johnnyutah9939 Год назад

    Greed, for lack of a better word, is good. Greed is right. Greed works. - Jensen Huang

  • @antimsm6705
    @antimsm6705 Год назад +1

    Over the next 10 years the market will require more capacity? Yes, and the capacity will be provided by Nvidia, as always. Smaller players will have to buy the GPUs like everyone else; if they can't afford it they will have to buy cheaper models. Taking this argument and coming to the conclusion that new chip manufacturers will enter the market is nonsense. The trend is clear: firms that produce CPUs and GPUs and storage devices will become fewer over time, not more.

  • @WTFBI5
    @WTFBI5 Год назад

    If I had drunk a shot whenever you said "disrupt" I would be dead.

  • @MozartificeR
    @MozartificeR Год назад

    I like AMD going into chiplets. And I will not judge chiplets on video cards, until the technology is more mature.

  • @jeffmofo5013
    @jeffmofo5013 Год назад

    Wow, way too much tech jargon
    It would be great if the industry could create a modular compute architecture at the hardware level. FPGAs are the expensive approach and ASICs are the cheap approach. CPUs and GPUs are the "everything is a nail, so let's use a hammer" approach.

  • @nowonmetube
    @nowonmetube Год назад

    Closed system where you can do nothing or open source where you have to do many things yourself? I think the thing that's the easiest to use will win. And I'm pretty sure that would be some user friendly but effective AI.

  • @M3ganwillslay
    @M3ganwillslay Год назад

    Mr.Raja is the Nvidia Killer ..King Shark

  • @CoreyKearney
    @CoreyKearney Год назад +1

    These guys couldn't get it done with AMD, and they couldn't do it with Intel's money. They aren't going to threaten Nvidia with a broke-ass startup running the discount architecture. RISC-V is IoT junk for OEMs not willing to pay the ARM licensing fee.

  • @__--JY-Moe--__
    @__--JY-Moe--__ Год назад

    this would be interesting 4 open source! good luck 2 all...some super talents coming together,
    2 achieve a historic purpose?😎👍 this will probably be a DPU!! or CPU+DPU+GPU!! wow! I get excited!!

  • @orlof507
    @orlof507 Год назад

    Great video, I was going through a huge pessimistic wave lately but this video gave me a bit of hope for the future.

    • @HikarusVibrator
      @HikarusVibrator Год назад

      Guy who doesn’t know company’s real name impresses you eh. Nice.

    • @orlof507
      @orlof507 Год назад

      @@HikarusVibrator yes

  • @ajaypuri130
    @ajaypuri130 Год назад

    I basically watch your videos because I love the way you speak, and your knowledge is amazing. Nvidia reminds me of the Intel of old under Andy Grove, who wrote the headline-making book ONLY THE PARANOID SURVIVE. When he was the head of Intel, that company could not be overtaken by any other company because he was always scared, and he made his company work so hard to make sure no other company took any lead. The current CEO of NVidia is the same: he is scared, always scared. He will never give an inch. He will push boundary after boundary to be the leader of the industry.
    I am now 61 years old and started using computers when they were called word processors, way back in 1980. I always keep the latest machine with the greatest hardware. My GPU has always been Nvidia. I tried ATI two times and both cards failed. No Nvidia card has ever failed for me. I am right now on an AMD 7950X. I wanted to upgrade to the 7950X3D, but when I saw virtually no performance increase, I decided to wait. In the meantime, I bought a gaming laptop with the 13980HX, and I am amazed by the speed.
    For my desktop, when I had no choice but to upgrade to a new DDR5 motherboard, I had the choice of going for Raptor Lake, but I said to myself: AMD is giving an upgrade route while Intel is end-of-life for this socket, and the very high power draw scared me. So, I thought I was safe in AMD land. Surprise, surprise. Intel is now releasing an updated Raptor Lake which should beat any AMD processor, even their 3D ones. So, I think even when Intel is so down, with its antiquated chip-making facilities, it is not letting AMD really take over in any way. Once it catches up it will again push AMD towards the grave. Nvidia is the same.
    I am an Indian and Raja Koduri seems like a nice guy, but he has constantly failed, as he does not have the craziness of the current NVidia CEO. The guy can do anything. He'll make any gamble, but his gambles are very calculated and mostly pay off. I am now just a little scared of the upcoming 5-series products, as I think the GPU will be the complete computer where you install a CPU and SSD inside! I just hope I am not tempted to upgrade, as my current machine is already using a 1200W power supply.

  • @gabfid3
    @gabfid3 Год назад

    Nvidia has long sold GPUs, and now also CPUs and more integrated systems, with their stated goal being to create accelerators of different kinds, starting with GPUs. The thing is that Nvidia has not created a chip purely for machine learning; rather, it has molded its GPUs to fit.

  • @niamat5129
    @niamat5129 Год назад

    Damn!!!! This dude's voice is mad deep.

  • @Humanaut.
    @Humanaut. Год назад

    AMD isn't necessarily "following Intel in CPU land" - from what I hear AMD is actually ahead of Intel CPUs.

  • @zenairzulu1378
    @zenairzulu1378 Год назад

    You're wrong! Wrong, I say! There's no stopping Big Green; they provide stability to the market just like International Business Machines did 😮😮😮.....

  • @user-wm1xm5gm2k
    @user-wm1xm5gm2k Год назад

    Brilliant analysis, Coreteks.

  • @sacamentobob
    @sacamentobob Год назад

    Paraphrasing Coreteks: "7900 series, worst cards made by AMD."
    GPU sales: the 7900 series at current prices is selling like hotcakes.

  • @antimsm6705
    @antimsm6705 Год назад

    Hard disks are not going away just because SSDs exist. SSDs cannot be used for long-term data storage; hard disks will always be necessary, so there is really no point in hard disk manufacturers paying attention to SSDs.

  • @MozartificeR
    @MozartificeR Год назад

    Thank you, Coreteks team...

  • @phillgizmo8934
    @phillgizmo8934 Год назад +2

    PLEASE, Coreteks! You were always such a groovy, relaxed-tempo orator. NOW you use that "natural pause removal algorithm", or you've been trained to speak like that. Anyway, after a sentence, at the full stop, breathe for one second. Read some poetry, for goodness' sake, so as not to forget the rhythm of syllables. OrMaybeWeRemoveSpacesAndDecideToSeparateWordsLikeThis?

  • @phoenixrising4995
    @phoenixrising4995 Год назад

    I don’t know if raja will dethrone Nvidia. Didn’t Intel give him the boot recently.

    • @nimrodery
      @nimrodery Год назад

      Not according to Intel. I am unaware of any internal dialog at Intel indicating displeasure with actually getting products to market. He did leave.

  • @CupOfAwesome
    @CupOfAwesome Год назад

    Solid and interesting analysis, as always.

  • @e2rqey
    @e2rqey Год назад +1

    It's TENStorrent not Trenstorrent

  • @gabor5079
    @gabor5079 Год назад +7

    So far every single project of Raja's has been a failure: AMD Vega and Intel GPUs.

    • @scroopynooperz9051
      @scroopynooperz9051 Год назад +3

      Polaris was a solid midrange GPU range at low-end pricing.
      My RX 580 8GB is still good at 1080p modern gaming and it's good at compute.
      Got it for like $170 brand new a few years ago, and that was a steal considering the midrange to high-end GPU market now.

    • @phoenixrising4995
      @phoenixrising4995 Год назад

      Let’s ship raja to Apple for a little while to allow MS to catch up.

    • @Decki777
      @Decki777 Год назад

      @@scroopynooperz9051 Your RX 580 is good for your requirements, but it's an outdated GPU; even the 1080 Ti is outdated.

  • @jemborg
    @jemborg Год назад

    AMD Radeon's poor efficiency, lack of features, their lying and terrible pricing had me purchasing a 4090... I still bought into their new AM5 platform though.

  • @denvera1g1
    @denvera1g1 Год назад

    Speaking of new technologies that companies ignore.
    Optane.
    Intel's strategy for Optane is utter garbage.
    Unless that optane really does cost $200 for 120GB to manufacture, Intel was asking datacenter prices, from their consumer market.
    Do you know what is going to need Optane in the next 4 years?
    QLC NVMe
    Just like how its impossible to buy SLC NVMes and very hard to find MLC NVMes, so too will be TLC in about 4 years.
    What happens when you write more than 10GB at once to a high end 4TB gen 4 QLC NVMe that is 1/2 full?
    It slows to about 40MB/s or only slightly faster than USB 2.0
    Do you know what was faster at transferring files from my camera than that $379 NVMe?
    A $299 16TB HDD. Instead of starting out at ~4600MB/s and dropping down to 40, the SATA HDD started out at 300MB/s and sometimes dropped to 260MB/s; the total time it took to transfer that ~60GB of photos and videos (200Mbps 4K60) was about 50% faster with the HDD than with the QLC NVMe (quick math below).
    If I had a 120GB Optane drive for cache, that file transfer would have run at 1500MB/s no matter which drive I was using, as long as it didn't go too much over 120GB.
    This is why my new file server is HDD+Optane(4x960GB 905P) instead of my old all flash file server, which is now my on-site backup + portable sync server for my off-site that does not have wired internet)
    The upside of my old file server is that 1: Pure flash saves alot on energy, 2: re-builds are far less stressful because reading from an SSD is very unlikely to kill it, unlike an HDD which is equally likely to die on both reads and writes
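
    Quick sanity check of those transfer times (a rough sketch using the approximate speeds quoted above, not benchmark data):

    ```python
    def minutes(size_gb, mb_per_s):
        """Time to move size_gb gigabytes at a sustained mb_per_s megabytes/second."""
        return size_gb * 1000 / mb_per_s / 60

    size_gb = 60  # the ~60GB of photos and 4K60 video mentioned above

    print(f"QLC NVMe after its cache fills (~40 MB/s): {minutes(size_gb, 40):.0f} min")   # ~25 min
    print(f"SATA HDD sustained (~260-300 MB/s):        {minutes(size_gb, 280):.1f} min")  # ~3.6 min
    print(f"Optane-class cache (~1500 MB/s):           {minutes(size_gb, 1500):.1f} min") # ~0.7 min
    ```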

    • @denvera1g1
      @denvera1g1 Год назад

      intel's strategy for optane WAS utter garbage

    • @denvera1g1
      @denvera1g1 Год назад

      @@Unicorn-Coin To penetrate a market, it helps to have a household name. They needed something like the 960GB 905P at $250 that was not advertised as being a selling point for your other products because it 'requires' specific newer processors,
      instead of waiting several years to drop it to $399, and constantly hiding the fact that it actually works great on older intel parts and newer AMD parts
      With the public perception that Optane is 10-20x more expensive than NAND, and requires you to buy overpriced parts to even use it, Intel poisoned public perception
      Yes, Datacenter is where you make the money, but if Intel were smart like Nvidia and AMD used to be before 2020, they would have affordable consumer parts
      What makes businesses rush to buy good products, are their employees having good experiences with consumer versions of those products, 10k sales pitches arent as good as hearing your own employees sing the praises of a product.

    • @denvera1g1
      @denvera1g1 Год назад

      @@Unicorn-Coin To clarify, i dont think it was the right time to market it for gaming, but in 4 years, when we have PLC NAND gen 6 NVMe drives that slow down to 20MB/s when installing game updates, that is when Optane is for gaming.
      My other comment about consumer products is directed at homelab people, people who spent maybe $1500 building their own 20+TB file server, people who have alot of games, but cant afford an 8TB NVMe, so get a 16TB HDD and a 1TB Optane for 1/4 the price.
      I think Intel's strategy to target gaming was on the right track, but their marketing was all wrong.
      To make it successful, they needed a different software suite that allowed older, and non-intel products to work(without buying 3rd party tools like Primo Cache), sure they lack hardware acceleration, but the walled garden approach of Apple, will scare off ANY personal computer enthusiast, they stay away from Apple because they like owning their own hardware, and just a hint of 'this is not your computer, you are not allowed to use it that way' makes most enthusiasts turn away
      Sure Intel fanboys would buy it, but even myself, an agnostic who only had Intel up until i built a 3950x file server, did not buy optane because of the perception of 'having my rights taken away'

  • @billykotsos4642
    @billykotsos4642 Год назад

    Tenstorrent just needs to play the long game....
    They won't be a huge player until at least 2025...

  • @cerviche101
    @cerviche101 Год назад

    Truths in this presentation. Thank you for being open and honest with your findings; you are of a dying breed.

  • @phoenixsub7072
    @phoenixsub7072 Год назад

    x86 licence fee? I've never encountered it?????

  • @BurningDrake39
    @BurningDrake39 Год назад

    Here! The more you buy, the more you save! 😂

  • @eugkra33
    @eugkra33 Год назад

    You're showing videos from the '70s and '80s about hard drives being inferior to SSDs and claiming they were wrong. Sure, they were wrong 30 years later. They didn't make a mistake at all going with hard disks until SSDs became viable decades later.

  • @dimagass7801
    @dimagass7801 Год назад

    I gave up on investing and have AI do it for me, but it keeps buying more Nvidia 😭

  • @markemad1986
    @markemad1986 Год назад +1

    Open source will save capitalism

  • @saemranian
    @saemranian Год назад

    Perfect,
    Thanks for sharing .

  • @NeoShameMan
    @NeoShameMan Год назад

    Midjourney is closed though