HW News - NVIDIA's Reality Distortion Field, RIP E3, Intel 144-Core CPU

  • Published: 15 Nov 2024

Comments • 1.1K

  • @GamersNexus · 1 year ago · +203

    In case you missed it, check out our unhinged rant about a small motherboard feature: ruclips.net/video/bEjH775UeNg/видео.html

    • @dakoderii4221 · 1 year ago · +6

      Nvidia saying crypto is useless is like Homer Simpson saying donuts and beer are useless.

    • @Youtubecensorsmesince2015 · 1 year ago

      I paid the crypto tax at the end of 2016 for a 1060 6GB; never again. Sadly, AMD is playing along and EU pricing is fudged.
      At least they have enough VRAM; I would have pulled the trigger on a 6800/XT in the US.

    • @robertlawrence9000 · 1 year ago · +1

      I would really like to see some in-depth reviews of the Framework laptops: pros/cons and performance compared to other laptops on the market.
      It's sad to know that Steam is no longer going to support Windows 7 and 8. It's a slap in the face to people who have older systems they use for older games. This sets a troubling precedent if it means being forced to upgrade in order to use games we've purchased that work just fine on older systems. Steam should have a remedy for that!

    • @ayoooooooooooooooooooooooooooo · 1 year ago

      I like cheese.

    • @robertpearson8546 · 1 year ago · +1

      Switching from CMOS to resonant tunnel diode threshold logic would give 2.2 THz at 5 W. And it would be cheaper!

  • @BOZ_11 · 1 year ago · +1346

    Nvidia stopped caring about crypto when Ethereum moved away completely from PoW. It's akin to "YOU CAN'T FIRE ME! I QUIT!"

    • @lesslighter · 1 year ago · +11

      Can always probably go for Monero :X

    • @fern3436 · 1 year ago · +61

      @@lesslighter It's probably best that AlphaBay stays underground and CPUs stay affordable. I'm happy with Monero staying out of the news indefinitely.

    • @johnnypopstar · 1 year ago · +41

      @@lesslighter Don't give them ideas. That dumpster fire "industry" is on the way out, and that trend needs to continue.

    • @SupraSav · 1 year ago

      Thank you, captain obvious

    • @BigTylt · 1 year ago · +39

      Cryptocurrency milking is dead, long live Generative AI milking

  • @rekire___ · 1 year ago · +978

    _Friendship ended with cryptobros, now AI is my new best friend_

    • @cronyan · 1 year ago

      Nvidia sure loves their AI p0rns

    • @H0lyMoley · 1 year ago

      Hey! There's no AI here, puny human *bzzrp* puny human fleshbag!

    • @117johnpar · 1 year ago · +17

      He was working up to the joke. I just think he forgot exactly how it was formatted. As did I.

    • @nisx2012 · 1 year ago · +20

      Until next crypto craze...

    • @ViciousTuna2012 · 1 year ago · +27

      @@nisx2012 Wait till AI and Crypto merge

  • @dano1307 · 1 year ago · +935

    "People programmed it to mine crypto"
    If I recall correctly, Nvidia made crypto-mining-specific cards

    • @dickdickling9389 · 1 year ago · +2

      I mean, technically they didn't make any crypto-specific cards... because they were made by partners; they just rebranded some existing cards and locked down their drivers through BIOS shenanigans.
      But yeah, you recall correctly.

    • @Wobbothe3rd · 1 year ago · +4

      You don't remember correctly. Nvidia let AIBs produce ASICs after they tried to STOP mining on regular GPUs. It was third parties like EVGA who were the bad guys, not Nvidia.

    • @TheGulfryd · 1 year ago

      @@Wobbothe3rd No, they made mining cards before; they were just very secretive about it and only sold them to crypto miners. Good thing they're saying bye-bye to crypto. No soup.

    • @dylanherron3963 · 1 year ago · +1

      @@Wobbothe3rd You shut your mouth right now and leave EVGA out of this.
      (it's a joke, yes, they released a mining-centric card in 2018)

    • @gabortoth3644 · 1 year ago · +140

      @@Wobbothe3rd You don't say... so the NVIDIA CMP 170HX Pro Mining was a room decoration then?

  • @darsh8964 · 1 year ago · +356

    It was clearly a logistics error when pallets of Nvidia cards went directly to cryptominers instead of anyone else, right?

    • @johnnypopstar · 1 year ago · +12

      I don't believe that was ever confirmed to have actually happened, was it? Someone just analysed overall sales figures and guessed.

    • @WayStedYou · 1 year ago · +37

      And when they made dedicated mining cards.

    • @ModernOddity728 · 1 year ago · +6

      @@johnnypopstar Why else manufacture CMP cards? 🤨

    • @padnomnidprenon9672 · 1 year ago · +13

      @@johnnypopstar Big miners called the corporate office and said, "I need two thousand GPUs," and Nvidia happily sold to them. Companies often favor B2B customers.

    • @Sabrinahuskydog · 1 year ago · +8

      @@johnnypopstar It totally did. There are some actual large businesses that focus entirely on crypto mining (there are a few here in Texas, for example), and several of them posted photos online of multiple pallets of new Nvidia 30-series cards being unloaded at their loading docks.

  • @MrFredscrap · 1 year ago · +665

    Crypto is no longer NVIDIA's friend....
    ...
    ...
    "For now"

    • @lesslighter · 1 year ago · +9

      Maybe we should do that "X guy is no longer my friend" meme

    • @Wobbothe3rd · 1 year ago · +3

      Lol yeah right like crypto is coming back...

    • @dylanherron3963 · 1 year ago · +3

      @@Wobbothe3rd HEY BUD

    • @m3l3e · 1 year ago · +7

      "till the next pump at least" 😂

    • @sammiller6631 · 1 year ago · +1

      @@lesslighter Maybe we should think in actual thoughts instead of repeating memes?

  • @jamesoloughlin · 1 year ago · +121

    More computer history videos (maybe), Steve? 🤔 Glad you covered Moore's contribution and history.

  • @kukuc96 · 1 year ago · +192

    A Framework 16 review would be cool. If you did a review on the 13, that could be cool as well, even though it's not a new product.

  • @Johnny2Starz · 1 year ago · +53

    RIP to the legend Gordon Moore. You will be missed, but your contributions will never be forgotten.

  • @stephanteconchuk · 1 year ago · +67

    Sales down? USELESS

    • @llamaestingpie · 1 year ago · +4

      MUDA MUDA MUDA

    • @volvo09 · 1 year ago · +6

      "we actually have to work to sell products now, rubbish!"

  • @HaefentheZebra · 1 year ago · +71

    Please do buy one of those Framework laptops when the 16" comes out. I'm very interested in them, and while I trust LTT not to be biased when reviewing them, it'd be nice to see a source like you giving them a real rundown!

    • @joesterling4299 · 1 year ago · +17

      Linus would absolutely disqualify himself from reviewing Framework products. But he will present them in a good light in announcement videos like the recent one. (They're not substitutes for objective reviews, obviously.) I too am very curious about the 16 with its discrete GPU module. I won't preorder or buy blindly, but I hope for good reviews when the product launches.

    • @snickerdoooodle · 1 year ago

      ​@@joesterling4299 iirc, Linus won't even review laptops himself anymore because of the Framework investment

  • @drizztcat1 · 1 year ago · +175

    NVIDIA's been in a room sniffing its own farts for far too long.

    • @shekhinah5985 · 1 year ago · +8

      It's a company trying to maximize profits for investors. What did you expect?

    • @frizzlefry1921 · 1 year ago

      You see that South Park ep? All the yupsters wafting it right into their own faces. Hilarious.

  • @jacobplyler3470 · 1 year ago · +111

    I'm glad to hear that you are using a Framework laptop and covering the 16 inch model. I love my laptop and really hope the company keeps going. I'm going to get the 16 inch as soon as I can afford a new laptop. Will get an additional desktop at the same time with the Cooler Master case.

    • @akshatsingh9830 · 1 year ago · +5

      And I'm waiting for Framework to start shipping to other regions so that I can get their mobo kit and VESA-mount it to a monitor.

    • @imjust_a · 1 year ago · +1

      Framework's products have kind of interested me, but I do wish they offered lower-end options. I'm not in the market for something super powerful, just a general work/office laptop, but I do like the repairability that Framework is selling.

    • @Deliveredmean42 · 1 year ago · +1

      @@imjust_a True. If this goes well, it might unleash an interesting generation of laptops that work like a PC: you start with low-end parts and slowly replace them until they become midrange or high-end. The best part is that since you already have the chassis, screen, keyboard, and ports, it becomes relatively cheaper that way.

  • @cemsengul16 · 1 year ago · +31

    Man I really dig the Framework 16. I want it to blow up in sales and force other manufacturers to build modular laptops.

  • @jeremymtc · 1 year ago · +8

    Re: Your story on the Framework laptop, I have massive respect for your statement regarding investment and conflict of interest. That's been a real problem in tech journalism for decades, and it is encouraging to hear that "homey don't play that game".

  • @19alive · 1 year ago · +552

    Nvidia: Friendship ended with crypto, now gamers are my best friend.
    Gamers: No ty

    • @_qwe_fk_1700 · 1 year ago · +25

      Gamers: yes ty
      People are still buying a lot of nvidia cards

    • @Matkinson123 · 1 year ago · +76

      @@_qwe_fk_1700 They're selling at record low numbers, so not really.

    • @aerosw1ft · 1 year ago · +23

      @@Matkinson123 Record low numbers it might be, but they still have more than 80% market share (IIRC). So yeah, gamers still buy their products no matter how piss-poor value they are, sadly.

    • @ZinoAmare · 1 year ago · +1

      Nvidia: **Shocked pikachu face**

    • @_qwe_fk_1700 · 1 year ago · +6

      @@Matkinson123 "Record low numbers" depends on the timeframe. But this is just something the haters tell themselves to feel better.

  • @CrypticHashing · 1 year ago · +45

    NVIDIA made $550 million on the CMP lineup alone in fiscal year 2022. You can look it up in the 8-K form they published this February.

  • @NinetyTres · 1 year ago · +15

    I attended the first few E3s in the '90s.
    If I recall, Friday was media day and Sat-Sun was open to the public.
    Pretty cool, back when 3D hardware was in its infancy.

  • @be1100 · 1 year ago · +123

    Intel executive: "We need more cores! I don't care how it's done, just do it!"
    Intel engineer: "Guess we just make it bigger."

    • @1pcfred · 1 year ago · +16

      Subcontractor: How many pins would you like?
      Intel: All of them!

    • @GeminionRay · 1 year ago · +12

      To be fair, that's what AMD has been doing with Threadripper too; there's a slight die-size increase from the 3995WX to the 5995WX. That's basically the easiest thing they can do right now, instead of trying to cram more cores into the same space.

    • @jaredweiman2987 · 1 year ago · +8

      Fractal Design: "We will make a log-cabin-themed case. The size of an actual log cabin, to accommodate new GPUs and CPUs."

    • @1pcfred · 1 year ago · +3

      @@GeminionRay they can't increase density anymore like they could in the past. Those days are over now. Although you'd never know it by listening to what the industry says. Even Moore knew his observation wouldn't hold true forever.

    • @userblame632 · 1 year ago

      @@1pcfred I think it's always a bad idea to state that something (particularly in computing) is over now. Maybe NOW, sure, but it's hard to make confident predictions about the future. There will be some crazy development and density will increase greatly (maybe it won't even be physical density).

  • @Slymind · 1 year ago · +34

    After the honeymoon is over Nvidia goes like:
    "I never loved you in the first place!"

  • @pRopaaNS · 1 year ago · +75

    Nvidia just throws whatever under the bus, whenever. We remember.

    • @CaptainKenway · 1 year ago · +1

      It won't stop you from buying another Nvidia card next time you upgrade, so why should they care?

    • @pRopaaNS · 1 year ago · +1

      @@CaptainKenway It did stop me from buying many GPUs over the past several years, though. I built my own PC without a proper new video card, and then I built and gifted two more PCs to my brother and sister. I ended up not buying a single new GPU; I bought a used RTX 3070 Ti from a local seller for cheap, which means no money to Nvidia. They just seem to not want to sell their products.

  • @jamescannon5105 · 1 year ago · +18

    I would definitely watch a Framework review or even just a talking head piece to hear your detailed thoughts!

  • @Heinz76Harald · 1 year ago · +34

    After creating mining-specific cards and selling GPUs directly to mining companies, the double standard hits hard.

  • @Lurker-dk8jk · 1 year ago · +37

    Thank you for mentioning the 2017 crypto boom (1:59). Had to drive 40 miles in the metro-Detroit area just to find a "local" supplier for ANY nVidia 10 series card. And it was a 1050 Ti. And, yes, I overpaid for it. PC died and I couldn't wait around for a card to be shipped to me.

    • @arz1898 · 1 year ago · +7

      Lol yeah. That's why I always keep 1 or 2 spare cards lying around; you never know when the next boom is...

    • @bitelaserkhalif · 1 year ago

      Ah, I remember that time; it lasted until 2019. I got a GT 1030 instead.

    • @klobiforpresident2254 · 1 year ago · +3

      I remember ordering a 580 and the retailer not getting their expected shipment for a month. I got a 570 4GB and the difference back. Glad I took the trade because they didn't get another one for a while.

  • @acgamevids7855 · 1 year ago · +29

    Thank you for getting clarification on that AMD quote. I was a little astonished by how many people were using it (as they'll probably use the new quote) as a reference to tell me that AMD clearly could have beaten Nvidia at the high end but didn't (which certainly sounds like the most convincing way of saying you're better than a competitor but just decided to throw it a bone, yes).

    • @013nil · 1 year ago

      I never understand why people care about what "they" said and whatnot. We should only look at the products.

  • @jimtekkit · 1 year ago · +9

    RIP Gordon Moore, a pioneer in one of the most interesting times of development of silicon technology.

  • @volvo09 · 1 year ago · +25

    That Framework laptop is awesome! I would totally buy one if they release it.

  • @jablue4329 · 1 year ago · +7

    RIP Mr. Moore. Accomplished quite a lot, and his contributions to humanity will be remembered.

    • @ycl260779 · 1 year ago · +1

      Yeah, everyone's talking about the Nvidia bit, while this part was more significant to me.

  • @johnknouse6984 · 1 year ago · +12

    Man, I always love the HW News. The Framework laptop is a design type I have been looking to see made for years. And, I'm like Steve. I like to use 17.3" laptops and don't have an issue carrying an extra couple of pounds for the convenience it gives me to do what I need to do in the time I have. And, the Steam info just makes me think one thing: time to go to Linux. And RIP Mr. Moore...and thank you for all you gave mankind.

  • @twocows360 · 1 year ago · +5

    Up until recently, the Steam client had an option to use the -no-browser switch to actually disable CEF. They disabled this recently and put up a feedback thread that I'm sure they are actually reading and not just putting up so people (like me) have a place to vent.
    The decision to force CEF means that if they want to keep this now-hard dependency up to date (because it is very much not right now), they *have* to drop Win7/8 support, because newer versions of CEF don't support them. But my *personal* concern isn't with the OS limitations (all my systems are on 10 or 11); it's that CEF is a memory-leaking piece of... "refuse" that the Steam client could (and did, for the past 15-some-odd years) serve its core functionality without.
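
    As a sketch only: this is roughly how a launch switch like the -no-browser flag described above was passed to an older client. The install path is an assumed Windows default, and per the comment the switch has been removed, so current clients ignore it.

```python
import subprocess

# Hypothetical launcher for an OLDER Steam client: passes the -no-browser
# switch described in the comment above. The path is an assumed Windows
# default; per the comment, current clients no longer honor the switch.
STEAM_EXE = r"C:\Program Files (x86)\Steam\steam.exe"

subprocess.run([STEAM_EXE, "-no-browser"])
```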

  • @popcorny007 · 1 year ago · +3

    Minor correction, Sapphire Rapids has up to 56 cores, not 48.
    It's made of 4x 15C tiles, with one core disabled per tile to increase yield.
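
    A quick back-of-envelope check of that core-count math, taking the comment's figures (4 compute tiles, 15 cores per tile, 1 disabled per tile) as given:

```python
# Core-count arithmetic from the comment above, figures taken as given:
# 4 compute tiles x 15 physical cores, with 1 core fused off per tile.
tiles = 4
cores_per_tile = 15
disabled_per_tile = 1

physical_cores = tiles * cores_per_tile                       # 60 on silicon
enabled_cores = tiles * (cores_per_tile - disabled_per_tile)  # 56 usable
print(physical_cores, enabled_cores)  # -> 60 56
```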

  • @ArmadaOne · 1 year ago · +7

    "NVidia's Reality Distortion Field"
    Very accurate, well played.
    I remember them selling cards by the thousands per order to crypto-mining companies while gamers waited two years to get a GTX 1660 at 500 dollars, because not a single 3000-series card was in stock anywhere.

  • @christopherjames9843 · 1 year ago · +36

    At some point the chickens will come home to roost for Nvidia and AMD, as far as basically being in cahoots to artificially inflate GPU prices.

  • @SirRoundPotato · 1 year ago · +8

    Steam breaking Win7 support just means faster Linux adoption. :)

    • @haukikannel · 1 year ago · +1

      No, it does not mean faster Linux adoption... it's a pity, but true... People will just say that Win 10, 11, or 12 is bad and still use those.

  • @clay4816 · 1 year ago · +115

    Nvidia strikes me as a very abusive relationship partner. I feel like Nvidia and crypto will be walking hand-in-hand again someday

    • @tylermc11795 · 1 year ago · +20

      It strikes me as the most gaslighting business relationship

    • @Wobbothe3rd · 1 year ago · +4

      Crypto is dead. Crypto enthusiasts tend to be sociopath scammers. Nvidia is a real respected company.

    • @3of12 · 1 year ago · +4

      Ask EVGA, they stopped working with Nvidia over it.

    • @3of12 · 1 year ago

      @@Wobbothe3rd That's because Bitcoin is a legitimate project to build a peer-to-peer currency, and all the other coins are literal scams taking advantage of buzzwords. Bitcoin is now free of being treated as a stock and can return to being a currency.

    • @jmiller6066 · 1 year ago · +11

      @@Wobbothe3rd Yeah, Nvidia might be shitty about some things, but at least they make actual products that do actual useful things.

  • @HAFBeast91 · 1 year ago · +21

    Nvidia also made a 20GB 3080 and sold it directly to miners without offering it to regular consumers.

    • @antonisautos8704 · 1 year ago · +2

      Games these days could use that kind of GPU memory, but they didn't put it on consumer cards. Tragic.

    • @ThunderingRoar · 1 year ago · +2

      What's the source on this? AFAIK Ethereum needed ~5GB of VRAM and anything above that was useless. The main things that mattered were memory speed and bus width, and that 20GB 3080 would have the same 320-bit bus as the regular 10GB 3080.
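
      For context on that ~5GB figure, a rough sketch of Ethash DAG growth using the constants from the public Ethash spec; it skips the spec's prime-number adjustment, so it's an approximation rather than the exact size:

```python
# Approximate Ethash DAG size by block number, from the public spec's
# constants; the prime-number adjustment is skipped, so this slightly
# overestimates the exact size.
DATASET_BYTES_INIT = 2**30    # 1 GiB at epoch 0
DATASET_BYTES_GROWTH = 2**23  # +8 MiB per epoch
EPOCH_LENGTH = 30_000         # blocks per epoch

def approx_dag_gib(block_number: int) -> float:
    epoch = block_number // EPOCH_LENGTH
    return (DATASET_BYTES_INIT + DATASET_BYTES_GROWTH * epoch) / 2**30

# Around block 13,000,000 (the late-2021 mining peak) the DAG was ~4.4 GiB,
# which is why ~5GB of VRAM sufficed, as the comment says.
print(round(approx_dag_gib(13_000_000), 2))  # -> 4.38
```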

  • @eldibs · 1 year ago · +1

    Considering your job is journalism, your level of tact is perfect. These big companies and groups need journalists snarking at them over the garbage they try to pull.

  • @qlum · 1 year ago · +7

    About laptops: finally, someone who shares the need for a navigation cluster.
    They could fit one into smaller laptops but generally don't; my old Clevo is 14-inch and does include the nav keys.

    • @boam2943 · 1 year ago

      My very old 12-inch Toshiba M400 has one too. Although it is crammed against the rest of the keys, I can still use it with no problems.

    • @qlum · 1 year ago

      @@boam2943 Same goes for my 2012 Lenovo ThinkPad x121 (11.6-inch).
      There they crammed Page Up/Page Down next to the up-arrow key and Home/End next to Delete.
      It was a great little laptop, apart from the horrible CPU (AMD E-350).

    • @boam2943 · 1 year ago

      @@qlum I don't like that layout, even if the keys are there. There is no spacing around the cursor keys and I tend to press a key that I do not want, unless I look at the keyboard first. But, if I had to choose between that and a keyboard without dedicated keys, my RSI says "more keys, please" :)

  • @m4dm3th0d · 1 year ago · +2

    True greatness lives forever. RIP Gordon Moore, your contributions will last as long as computing does

  • @DEMENTO01 · 1 year ago · +3

    What Framework is doing seems very interesting; I've been waiting for modular GPUs on laptops to make a comeback. It's also very interesting to see how they managed to overcome the cooling issues: the module includes its own heatsink and everything, and the laptop wouldn't even need to be opened up? Wow. The option to use it on other machines is super cool too, plus open source? Cool af. I really hope there will be a way to use it over a PCIe port though, because that would give it no expiry date at all. Thunderbolt and USB4 are cool and all, but every PC from the past 20 years has PCIe, and I've never even seen USB4 or Thunderbolt IRL, so having no way to get it working directly on PCIe would feel like a bit of a waste. But they don't need to have it all figured out before the product is even out, and it being open source means that with enough interest the community will make a way. So yeah, massive W.

  • @OfficialJamesNewberry · 1 year ago · +4

    GN is the GOD Tier for PC Hardware/Software info. E3 fumbled the bag in 2014.

  • @sergentboucherie · 1 year ago · +8

    I’d like to see a framework laptop review

  • @Zeuskabob1 · 1 year ago

    Man, I wish you'd come to the CHM when I was there! I interned at the museum in college, it was an incredible place. When I was there, only a couple docents were still capable of operating the PDP-11, and when they retired there was no plan to continue operation of the machine. So solemn to experience what are likely the twilight years of an incredible piece of technology.

  • @chuheihkg · 1 year ago · +5

    That's unfortunate for these trade shows. Then again, there might be more important things to do first; the trade shows can wait.

  • @oossoonngg · 1 year ago · +1

    1:18 that whole passage, almost perfect Rick Sanchez moment right there

  • @Thor847200 · 1 year ago · +6

    You do not lack tact @Gamers Nexus. You are just honest. And being honest is all of the tact you need.

  • @WarblyWark · 1 year ago · +1

    That Framework GPU setup, if it's actually interesting enough, could mean high-efficiency production stations, multi-GPU mining, or even better, multi-GPU rendering.
    Imagine being able to render games or digital sculptures (mostly CAD) on a timeline similar to a workstation or HEDT.

  • @astrea555 · 1 year ago · +28

    >4 years later
    Oh we never liked AI either!

    • @haukikannel · 1 year ago

      They should say that they actually love money. That would be honest, at least...

  • @tyswid · 1 year ago · +2

    I hope Framework doesn't suffer from the Osborne effect. I want to get one with a GPU.

  • @Snafu2346 · 1 year ago · +26

    Honestly, I felt that E3 died after 2006; it was never as much fun again. It had its moments, but it went from an event, THEE event that everyone talked about. And it was, it was and still is the most fun I ever had, because in the early 2000s it was an orgy of videogames, prizes, swag, swag, swag, and booth babes dying to get your attention by throwing swag at you.
    Then in the late 2000s it turned into some vendor- or press-only event, held at some airport hangar. Nobody cared, NOBODY, except self-righteous "journalists" from places like Electronic Gaming Monthly claiming that E3 was going back to its roots, like E3 was "FOR US, BY US". Except again, NOBODY CARED. Journalists don't buy the games, and nobody could give two ships about some fanboy's opinions. Have some journalist-only, invite-only event with fewer than 300 nobodies; there was so little fanfare and traffic on E3's airport-hangar-exclusive website that they cancelled it, realizing nobody cared about that event and vendors wouldn't either. No exposure, what's the point?
    Game shows should have fanfare and spectacle. They're SHOWS. Yeah, I know crowds suck, but it doesn't stop people from going.
    On top of that came the HORRIBLE management of the ESA, and rising costs as they gouged devs just to have a display. I can't blame Sony, Microsoft, and Nintendo for not showing.
    Of course, a few years after the complete DEBACLE of the journalist-only airport-hangar "expo", they relaunched E3, trying to bring back the glitz and the spectacle, what it used to be. They tried, but making it vendor-unfriendly eventually hurt them, and the last show I attended was 2015.
    I had fun, but it... wasn't the same. It wasn't even close: a mild show, some fun, some prizes, little to no swag compared to what it was. (One year I left with 20 free games that I caught by leaping into the air and nabbing them over everyone's heads, my vertical being pretty high, plus 22 t-shirts, mice, mousepads, game controllers, calendars, posters, and game guides, back when those were a thing: one Tomb Raider guide signed by Toby Gard and Karima Adebibe, a Soul Calibur calendar signed by Hiroaki Yotoriyama, an Oblivion guide signed by Todd Howard, etc., etc. I still have my Lineage Coca-Cola can. I had so many WoW binders I sent them to friends around the country who were huge WoW fans.)
    Fast forward: there were still lights, but there were no fireworks. Literally, there were flames at the NCsoft booth in the early 2000s. Huge third parties weren't there; entire sections were now empty. Though you could still feel the passion, it seemed like there was a big brother saying you can't have too much fun.
    One of the booth girls said there was certain "conduct" they had to keep up or get fined. But hey, I remember line dancing with the Nokia N-Gage girls along with dozens of others, while being given free drinks, surrounded by Nokia models in white miniskirts dressed like go-go girls. We were all just having clean fun. And the ESA killed it.

    • @pepperidgefarmremembers1911 · 1 year ago · +4

      Pepperidge Farm remembers those Nokia N-Gage girls.

    • @sammygal6730 · 1 year ago · +2

      I remember when they changed E3 to that invite only event at some airport. It was dumb, and I didn't even pay any attention to it.

    • @nanocat4141 · 1 year ago · +11

      I was at E3 in the early 2000s; I remember being at, I think, the Namco booth. The girls were tossing shirts, calendars, and games into the crowd. I saw a game flying over my head, and some dude standing a few feet behind me leaped into the air and yoinked it. This guy was an entire torso above my highest reach. I'm like, wtf. Then a t-shirt came in my direction, and again, the same guy, whose KNEE was above my head, snatched it.
      I was getting angry. Then I turned around and saw him give the shirt and the game to a kid in a wheelchair who would otherwise never have been able to get the stuff. The kid's dad was with him; he thanked the guy and shook his hand. I was okay with it after that.
      Then the booth girl tossed a calendar, and the guy went all Dennis Rodman and snatched that too......

    • @nier2b440 · 1 year ago · +3

      @@nanocat4141 lol...

    • @Snafu2346 · 1 year ago · +4

      @@amberturd3504 I gave away most of my swag. Since I wasn't a WoW fan, I sent the t-shirts and WoW stuff to all my guildmates from Final Fantasy online; shout out to Tshot/Spikeflail from the server Ragnarok. I kept the games though..... My guildmates really wanted the OG Ninja Gaiden shirts I had, so I mailed them out to all my friends who couldn't go to E3.

  • @furrys.1304 · 1 year ago · +2

    Steve really seething about the E3 snub.

  • @castform57 · 1 year ago · +4

    The Framework 16 would be almost the perfect laptop for me, especially since it now has a numpad. The only problems are that they're not sold in the EU, and there hasn't been any mention of a physical Ethernet port.

    • @flow221 · 1 year ago

      Go read through Framework's product information. Ports are provided through plug-in modules the user can mix and match as desired. Ethernet is one of those.

    • @WhenDoesTheVideoActuallyStart · 1 year ago · +1

      The ports are customisable, I believe they have an ethernet solution.

    • @owensparks5013 · 1 year ago

      Where are you in the EU? Framework seems to have a bunch of EU countries on their ordering website.

  • @philaskytech · 1 year ago

    It's refreshing to see Stephen's passion for the roots of computer technology. Keep up the great work young man.

  • @magnaxilorius · 1 year ago · +3

    I'd love to see y'all start covering video games again. I know it's just not the direction your channel has taken.

  • @ruikazane5123 · 1 year ago · +1

    What the... we lost another legend.
    Fairchild Semiconductor was the bomb back in the day. They made mass production of silicon-based integrated circuits possible (and guess _who_), which put an end to the germanium ICs that other companies had been developing at the time. Whatever remained of the company is now part of ON Semiconductor (now onsemi), itself a spinoff from Motorola, another historic name in the semiconductor business of those old days.
    *For the curious:
    International Rectifier (e.g. HEXFET) was taken into Infineon (e.g. PowIRstage, CoolMOS)
    Philips Semiconductors (which acquired Signetics, originator of the 555 timer IC) spun off as NXP (whose standard-products division is now Nexperia)
    National Semiconductor (e.g. LM317, LM2596) was acquired by Texas Instruments (the calculator company)
    Linear Technology (e.g. LT1070, LT1581) and Maxim Integrated (e.g. MAX66xx temperature sensors) are now part of Analog Devices

  • @ojassarup258 · 1 year ago · +2

    Looking forward to seeing how viable Framework 16 makes gaming laptops. That said, my biggest complaint with Framework laptops is the tiny number of ports you get in exchange for customisation.

    • @padnomnidprenon9672 · 1 year ago

      I would like to buy one. I stopped buying laptops from the likes of Dell or HP, because they're basically e-waste that won't even work well in 3 years.

  • @mrle0719 · 1 year ago · +1

    I hope Framework can stay financially stable; it's a project the world needs.

  • @neonmidnight6264 · 1 year ago · +13

    For all their woes, NVIDIA has been working on AI workloads for years, on libraries, hardware support, and general "it just works" polish, the kind of effort I would prefer to see from AMD too. Their AI claims are not unsubstantiated, which is kind of sad, since we need more competition in consumer and professional hardware to drive prices down; the RTX 4090 and A100/H100 are unmatched right now.

    • @Finsternis.. · 1 year ago · +1

      AMD has no need to do useful stuff. They get by with barely competitive hardware that gets bought by people that still think that being edgy and rejecting Nvidia and Intel is the cool thing to do.

    • @NightKev · 1 year ago · +5

      @@Finsternis.. You seem stuck in 2012.

    • @brahtrumpwonbigly7309 · 1 year ago · +2

      @@Finsternis.. lots of emotions there, not many facts. Something tells me you couldn't back that opinion up without just getting angry and indignant and demanding you're right.

    • @ledoynier3694 · 1 year ago · +3

      @@brahtrumpwonbigly7309 Sadly, he's kind of right lol.
      But it's not just AMD fanboys; it's the whole "fanboyism" attitude that has prevailed since the advent of social media.
      Before that, we just stuck with the brands that didn't crash constantly, or whose drivers worked with the games we played.
      Now, to sell well in the DIY market, you just need good marketing and a brewed cult following (team red, team green, team blue, etc...)

    • @WayStedYou · 1 year ago · +1

      @Die Finsternis Or massively profitable server CPUs that have buried Intel for the past 3 or 4 years?
      They aren't competing with Nvidia because they'd make almost no profit versus using those wafers for CPU chiplets.

  • @VerdonTrigance · 1 year ago · +2

    Sad to hear that Steam relies on an internal browser to operate normally. I paid a lot of money for that software (and the games behind it that I've bought) and I want to use it as I wish, no matter which OS I'm currently on (and I'm using Windows 8.1). I hope the older version will remain functional for at least 5 years after.

  • @volvo09 · 1 year ago · +6

    NVIDIA is just sour that they can't automatically sell EVERY SINGLE GPU they produce, and now they have piles of unwanted GPUs sitting on shelves.

  • @FerralVideo · 1 year ago · +2

    Whenever it becomes time for me to need a new laptop, Framework 16 is definitely on my radar.
    I'm in the process of upgrading my main PC though, and that's making my wallet weep, so another time.

  • @antonisautos8704 · 1 year ago · +13

    Man, E3 used to be the place where you'd see all the upcoming stuff and could get excited about things like games or future PC hardware. But nope, it's canceled now. Oh well, I guess we'll just find out about stuff when it's announced.

    • @flow221 · 1 year ago · +3

      There are plenty of other options for game publishers to showcase their upcoming titles these days. Which is why E3 is going away.

    • @imjust_a · 1 year ago · +1

      @@flow221 Problem is that they're not really all consolidated in one place anymore. They're spread out throughout the year in little bite-sizes instead of all the info coming out at once. It makes sense from the publishers' perspectives: why compete with everyone else for the same timeslots? But for the consumer, it can require more effort to keep up-to-date on everything. Especially if you don't follow anyone who reports on upcoming titles.

  • @Aerobrake · 1 year ago · +2

    Rest in peace Gordon Moore, you will be missed.

  • @eddiehimself · 1 year ago · +7

    The interesting thing about Babbage's machines is that although they couldn't be built at the time, he'd already asked a woman by the name of Ada Lovelace to start writing programs for the proposed Analytical Engine, making her in effect the first ever computer software engineer. She is of course the namesake of Nvidia's "Lovelace" architecture.

    • @t1e6x12 · 1 year ago

      I've heard it's kind of disputed whether she just transcribed the programs or actually wrote them.

    • @1pcfred · 1 year ago

      That's not what really happened.

  • @HolmesHobbies · 1 year ago · +1

    They lost a huge market when ETH went to PoS. Now suddenly there isn't a shortage of graphics cards!

  • @Reverend_Rob · 1 year ago · +4

    My 4090 doesn't even pull 400W most of the time; it really depends on what you play and at what settings. These 40-series cards seem pretty damn efficient. I use my GPU with a 7700X in eco mode, and while it still uses a lot of power relatively speaking, the total system draw isn't nearly as high as you'd expect for such a powerful gaming setup.

  • @deskosido4397 · 1 year ago · +2

    I've always seen E3 as a red carpet for the big videogame industry, much more so than the video game awards...
    I'm sad to see this event go away; it was a very fun week following new announcements, mishaps, and reactions from our favourite YouTubers/streamers...
    RIP E3

  • @prla5400 · 1 year ago · +3

    RIP E3.
    I remember the Ubisoft E3 cinematic trailers. Great memories.

    • @TheDiner50 · 1 year ago · +2

      You mean the cinematic trailers for games that were broken, soulless money grabs? Making up pre-orders and fake crap ruined the whole industry. Not even talking about monetization for a moment: the games of the last 5 or 10 years are all just garbage, unfinished messes. Anthem's made-up trailer let even the devs see, for the first time, what the actual game was supposed to look like! That is how broken it got, and still is! The pre-order was so much more valuable that the trailer for a soon-to-be-released game was worked on before the devs even knew anything about the game. That means no one on the dev team had made ANYTHING of the stuff shown; they had just begun the groundwork, and the trailer was all made-up fantasy, a goal for what the game might look like!
      That is *. Making a gameplay trailer that was 100% made-up cinematic, with the people who were to make the game seeing what they were going to build at the same time as the public. That is E3 at its best.
      I'm almost calling on some kind of god to thank for getting rid of E3 and the like!

  • @572089 · 1 year ago · +1

    If Framework actually follows through with removable GPUs, I will ABSOLUTELY buy one; preferably a Ryzen model, if the 16" comes in Ryzen.
    Personally, I like desktop replacements, those 21" chonkers, though I doubt they're moving that way in the future.

  • @Krytern · 1 year ago · +17

    But Nvidia sure loved crypto when it made them money.

  • @L330ne · 1 year ago · +2

    I remember in the late '90s/2000 it was my dream to go to E3 =x

  • @christopherjames9843 · 1 year ago · +3

    Good morning Steve!

  • @tappy8741 · 1 year ago · +1

    I'm interested in the Framework 16 so I can cram multiple M.2 SSDs into it. It would be so nice to have 24TB of storage, but 8TB M.2 modules will cost a bomb; 3x 4TB M.2 for 12TB will probably be the sweet spot.

  • @nightbirdds · 1 year ago · +5

    I'm going to say it, E3 going away is not a good thing. We'll agree to disagree here, but for me, the way things are right now makes trying to find games you want to play a full-time job, because you literally have to wade through hundreds of websites, watch hundreds of streams, and try to find stuff that interests you. I could do that, or I could actually be playing games. E3 being a big show with structured events is a big advantage. Geoff Keighley understands this, that's why his show is still around.

    • @imjust_a · 1 year ago · +1

      I agree. I don't really frequent any gaming sites these days (since gaming journalism is a joke); E3 was the best place to go to see a variety of stuff in one spot -- from AAA "blockbusters" to the occasional indie spotlight. While I'm not sure if these games would've been shown off at E3, I'm quickly discovering (by chance) a lot of AA-tier games made within the past 2~3 years that I didn't even know existed.

    • @georgwarhead2801 · 1 year ago · +1

      Not really; most game presentations were just scripted engine footage, ending with the quote "more at Gamescom". So in the end, Gamescom was always the place for me to get information about new games.

  • @-Datura- · 1 year ago

    Saw your repair video over at LR channel and subbed before watching a vid. This is my virginity being lost! Love your style bro. I am going to enjoy my geek-binge!

  • @JosepsGSX · 1 year ago · +4

    I'm watching this on the almost 10-year-old Win7 computer I have at my second place. This is bad news regarding Steam, because I love this thing. It boots lightning fast (way faster than the reasonably specced 7900X I assembled a month ago), it is rock-solid reliable, and it gives me an awesome time playing PUBG and a few other old games on weekends.
    I hope PUBG on Epic will last a bit longer.

    • @Notacka · 1 year ago

      Agreed. I love my WinXP/Win7 PC, and losing Steam is pretty big. Maybe we can get someone to bypass the requirement.

  • @wile-e-coyote7257 · 1 year ago

    Thanks for all this, Steve - and crew!

  • @alcozar5905 · 1 year ago · +3

    Many tears shed for NVIDIA and crypto... NOT. I hope NVIDIA can find better friends. Oh wait, they BURNED ALL THOSE RELATIONSHIPS.

  • @tommyservo9660 · 1 year ago

    1:19 - hits hard. Very true for almost every company.

  • @FARKENGFH · 1 year ago · +3

    Nvidia used ChatGPT to write the press statement... lol, and it got confused.

  • @apollion888 · 1 year ago

    Thank you as always for these, I just realized I'm taking this for granted

  • @nagranoth_ · 1 year ago · +3

    2:50 Well no, that just means they care about income, not that they care about crypto. And why would they? They sold those cards to make real money, not because they cared about cryptocurrencies.
    12:50 The bad translation didn't read like they said Nvidia had a 600W card either; it read like AMD said they could compete, but they'd have to make a 600W card to do so.

  • @Xbox360mIRC · 1 year ago · +1

    Oh, the first crypto boom was in 2017. No wonder the 1070 was $379 and the 2070 was $599, I guess. I had wondered why prices skyrocketed before COVID, until I remembered the 2nd crypto boom; I wasn't gaming much or paying attention to any of this at the time. COVID and the 2nd boom just let them keep prices high forever, far past inflation, which is only 19% cumulative USD since 2018, even though prices for each tier seem to have basically doubled since then.
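
    A sketch of that arithmetic, taking the comment's own figures ($379 1070, $599 2070, 19% cumulative inflation) as given rather than independently verified:

```python
# The comment's own figures, not independently verified.
msrp_1070 = 379              # quoted 1070 launch price
msrp_2070 = 599              # quoted 2070 launch price
cumulative_inflation = 0.19  # quoted cumulative USD inflation since 2018

inflation_only = msrp_1070 * (1 + cumulative_inflation)
print(f"1070 price if it had only tracked inflation: ${inflation_only:.0f}")   # ~$451
print(f"actual 2070 launch price: ${msrp_2070} ({msrp_2070 / msrp_1070:.0%})") # 158%
```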

  • @SpuriousECG · 1 year ago · +14

    When crypto booms again, no doubt Nvidia would re-organize to meet the demand in order to make record profits again :D

    • @fern3436 · 1 year ago · +3

      I doubt PoW on consumer hardware ever comes back in the same way. Ethereum was a black swan for GPUs and I don't see how anything could repeat its history. PoW is justifiably under heavy scrutiny by investors and the public alike.

    • @drakomus7409 · 1 year ago

      @@fern3436 Silly fern, the Bitcoin halvening happens next year. Nvidia profits are about to go back to record highs.

    • @mbonkeden · 1 year ago · +1

      @@drakomus7409 but but with their venture into AI, if crypto were to resurface... How could Nvidia the ONLY GPU maker in the world possibly handle all of the profit!?!!

    • @t1e6x12 · 1 year ago

      It's not like Nvidia makes ASICs, though.

    • @drakomus7409 · 1 year ago

      @@t1e6x12 Did they make ASICs in 2016 or 2020? The last 2 Bitcoin halvenings gave Nvidia record profits, just like this one will.

  • @ok8745 · 1 year ago

    Ngl, I feel like the recent vids have been higher energy and there seem to be more uploads. All for it tho. Great stuff!

  • @alistairblaire6001 · 1 year ago · +5

    It would be interesting if Nvidia made some AI-specific cards for home tinkerers. That crowd cares mostly about VRAM, not necessarily the CUDA cores or clocks. The 3060 is a popular card because it has 12GB of VRAM, and some people have 4090s not necessarily because of the GPU silicon, but because it has the highest amount of VRAM in a consumer card. Nvidia could make a 4070 with 32GB or more of VRAM and AI hobbyists would buy it. For the right price, of course.

    • @dead-claudia · 1 year ago

      BTW, inference will still use the CUDA cores; it just won't be as demanding on features (it's mostly various floating-point math functions that map one-to-one onto matrix multiplication inputs).
      High CUDA core counts are also very useful for model training.
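
      As a rough sketch of why VRAM capacity dominates for hobbyist inference: a common rule of thumb is parameter count times bytes per parameter, plus some overhead for activations and KV cache (the 20% overhead below is an assumption; real usage varies by runtime and context length):

```python
# Rule-of-thumb VRAM estimate: params x bytes/param, plus an assumed ~20%
# overhead for activations/KV cache. Real usage varies by runtime.
def est_vram_gb(params_billions: float, bytes_per_param: float,
                overhead: float = 0.2) -> float:
    return params_billions * bytes_per_param * (1 + overhead)

# A 13B-parameter model: ~31 GB in fp16 (2 bytes/param), which rules out a
# 12GB 3060, vs. ~8 GB with 4-bit quantization (0.5 bytes/param).
print(round(est_vram_gb(13, 2.0), 1), round(est_vram_gb(13, 0.5), 1))  # 31.2 7.8
```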

  • @zkatt3238 · 1 year ago · +2

    Framework is so cool. I hope it isn't just a fad that dies out in a few years like those phones with add-on modules.

  • @M1hay_L · 1 year ago · +4

    It not only stopped Nvidia from making money but also jeopardized 4000-series and future GPU sales, because miners have to sell their used GPUs 😂

  • @tylergibbs3869 · 1 year ago

    (time index 19:41) Good point. Understood.

  • @Scooppi · 1 year ago · +3

    Looking at the efficiency of the RX 7000 GPUs, AMD would have to make a 600W GPU to compete with the 4090.

  • @syedrehanfida · 1 year ago · +1

    Thank you for the news, GN!

  • @hydroponicgard · 1 year ago · +9

    Nvidia is the "chase the trendy thing" corporation nowadays.

    • @makuIa · 1 year ago · +1

      And unfortunately it works in the world of YouTube, but I like how it keeps AMD card prices low for those of us who don't care about Nvidia lol

  • @jierenzheng7670 · 1 year ago · +2

    There is an AMD model with a better iGPU; I'm hoping DaVinci Resolve will recognise it, or else hopefully the Intel 14th-gen iGPU, which is based on Arc, will. I have the 13" model too, but despite Iris Xe being capable, I can't edit on it.

  • @iancalandro8180 · 1 year ago · +5

    I think AMD is playing the long game here. Obviously we can't know for sure, but I suspect Nvidia might have been spooked by the RX 6000 series, where AMD got VERY close to Nvidia (the cards are within single digit percentage points of each other if you consult Passmark scores, which I know aren't the best metric, but not the worst either). Seeing this, Nvidia probably wanted to cement its position as the "quality card" and went out to make the most powerful thing possible (the RTX 4090). We're reaching the limit of how small we can make transistors, and while Nvidia opted for TSMC's 4 nanometer process, AMD opted for 5-6nm (this is from Techpowerup).
    So I suspect that in 2024-2025, when Nvidia releases the RTX 5000 series of cards, we won't see much of a performance boost over the last generation. Meanwhile, AMD might be able to boast a much larger improvement, making their newer cards more appealing, and *potentially* swaying people towards team red.
    I'm not an expert in this field and this is not an analysis without flaws, but take that as you will.

    • @makuIa · 1 year ago · +1

      I hope the younger and more "woke" generation sticks to Nvidia so AMD can keep their prices low. Exactly the same performance for a sometimes 40% lower price is too good for these YouTubers to ruin for us.

    • @mr.rainc0at614 · 1 year ago

      Nah, I honestly think they are just not willing to go punch for punch in the same market as Nvidia, because they would lose.

    • @ARedditor397 · 1 year ago

      You are wrong; in fact the node is "4Nvidia", not 4nm. It is a customized 5-nanometer process.

    • @iancalandro8180 · 1 year ago

      @@ARedditor397 The "4Nvidia" description for N4 is incorrect. TSMC has different naming designations for each of their process node sizes (N22, N16, N7, N5, etc). If "N4" stood for "4Nvidia", then the N7 node that AMD used for the 6000 series wouldn't be called "N7". While Nvidia uses the N4 process, that's not the reason it's called that.
      As for it being 4 nanometers, I have looked into this and... I don't know. Techpowerup, Tom's Hardware, a good amount of mainstream reporting, and even TSMC's investor reports label the N4 process as "4nm". But I have also heard reports on the contrary that N4 is a "refinement" of TSMC's N5 process. That could be interpreted a number of different ways. "Refinement" could mean a node shrink as well as it could not mean one, so I'm leaving that one open.

  • @josipzuljevic3682 · 1 year ago

    Great content as always, just wondering if you'll cover the Phanteks NV7 case @Gamers Nexus

  • @bingbashbosh1 · 1 year ago · +3

    I'll believe it when a GPU no longer costs a kidney

  • @vailpcs4040 · 1 year ago · +2

    My Framework laptop is the best I've ever owned. Love that they are making it work.

  • @joedaniels4329 · 1 year ago · +3

    Well, I play in crypto, and it sounds to me like the CTO of Nvidia got wrecked and feels the need to spread FUD. Most legitimate cryptos are moving to "proof of stake" rather than "proof of work"; Nvidia does not have a future in crypto anyway. Could his opinion about crypto be slightly biased?

    • @LucasFerreira-gx9yh · 1 year ago · +1

      Proof of stake goes against the point of cryptocurrency anyway.

    • @joedaniels4329 · 1 year ago · +3

      @@LucasFerreira-gx9yh Actually, it seems you are the one missing the point. Crypto has realized and unrealized use cases. The intent of crypto was not an unregulated safe haven for most of us; it is about finding a more agile and equitable means of moving value from one entity to another without the need for so many middlemen. Censorship resistance is still better than fiat, even with proof of stake. So for most of us, a properly set up proof of stake would still be a massive improvement over traditional systems.

  • @davidweeks1997 · 1 year ago

    Kudos on the Gordon Moore notice.

  • @TheSickness · 1 year ago · +5

    Gas + Light = Nvidia

    • @drakomus7409 · 1 year ago

      banks collapsing... nvidia: being your own bank is useless to society right now
      🤡🌎