I doubt PoW on consumer hardware ever comes back in the same way. Ethereum was a black swan for GPUs and I don't see how anything could repeat its history. PoW is justifiably under heavy scrutiny by investors and the public alike.
@@drakomus7409 but but with their venture into AI, if crypto were to resurface... How could Nvidia the ONLY GPU maker in the world possibly handle all of the profit!?!!
It would be interesting if Nvidia made some AI-specific cards for home tinkerers. That crowd cares mostly about VRAM, not necessarily the CUDA cores or clocks. The 3060 is a popular card because it has 12 GB of VRAM, and some people have 4090s not necessarily because of the GPU silicon, but because it has the highest amount of VRAM of any consumer card. Nvidia could make a 4070 with 32 GB or more of VRAM and AI hobbyists would buy it. For the right price, of course.
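To put rough numbers on the VRAM point, here's a back-of-envelope sketch (my own illustrative arithmetic, not Nvidia's sizing guidance): a model's weights alone need roughly parameter count × bytes per parameter, before activations and overhead.

```python
def weights_vram_gb(params_billions: float, bytes_per_param: float) -> float:
    """GB needed just to hold a model's weights (illustrative, ignores overhead)."""
    return params_billions * 1e9 * bytes_per_param / 1024**3

# fp16 = 2 bytes/param, int4 = 0.5 bytes/param
for params, dtype, nbytes in [(7, "fp16", 2), (13, "fp16", 2), (13, "int4", 0.5)]:
    print(f"{params}B @ {dtype}: ~{weights_vram_gb(params, nbytes):.1f} GB")
# 7B @ fp16:  ~13.0 GB -> tight on a 12 GB 3060, comfortable on 16 GB+
# 13B @ fp16: ~24.2 GB -> this is why the 24 GB 4090 gets bought
# 13B @ int4: ~6.1 GB  -> quantization is how smaller cards cope
```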
btw, inference will still use the CUDA cores; it just won't be as demanding on features (it's mostly various floating-point math operations that correspond one-to-one with matrix multiplication inputs). High CUDA core counts are also very useful for model training.
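For anyone wondering what "corresponds to matrix multiplication" looks like in practice, here's a minimal NumPy sketch of one fully connected layer's forward pass; the shapes and names are illustrative, not any particular framework's API.

```python
import numpy as np

# Inference through one dense layer is essentially y = x @ W + b: a big
# matrix multiply that the GPU spreads across its parallel FP units.
batch, d_in, d_out = 32, 4096, 4096
x = np.random.randn(batch, d_in).astype(np.float32)  # input activations
W = np.random.randn(d_in, d_out).astype(np.float32)  # learned weights
b = np.zeros(d_out, dtype=np.float32)                # learned bias

y = np.maximum(x @ W + b, 0)  # matmul + ReLU; the matmul dominates the cost
print(y.shape)                # (32, 4096)
```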
There is an AMD model with a better iGPU; I am hoping DaVinci Resolve would recognise it, or else hopefully the Intel 14th Gen iGPU, which is based on Arc, would do so. I have the 13" model too, but despite Iris Xe being capable, I can't edit on it.
I think AMD is playing the long game here. Obviously we can't know for sure, but I suspect Nvidia might have been spooked by the RX 6000 series, where AMD got VERY close to Nvidia (the cards are within single digit percentage points of each other if you consult Passmark scores, which I know aren't the best metric, but not the worst either). Seeing this, Nvidia probably wanted to cement its position as the "quality card" and went out to make the most powerful thing possible (the RTX 4090). We're reaching the limit of how small we can make transistors, and while Nvidia opted for TSMC's 4 nanometer process, AMD opted for 5-6nm (this is from Techpowerup). So I suspect that in 2024-2025, when Nvidia releases the RTX 5000 series of cards, we won't see much of a performance boost over the last generation. Meanwhile, AMD might be able to boast a much larger improvement, making their newer cards more appealing, and *potentially* swaying people towards team red. I'm not an expert in this field and this is not an analysis without flaws, but take that as you will.
I hope the younger & more “woke” generation stick to Nvidia so AMD can keep their prices low. exact same performances for sometimes 40% lower price is too good for these youtubers to ruin for us
@@ARedditor397 The "4Nvidia" description for N4 is incorrect. TSMC has different naming designations for each of their process node sizes (N22, N16, N7, N5, etc). If "N4" stood for "4Nvidia", then the N7 node that AMD used for the 6000 series wouldn't be called "N7". While Nvidia uses the N4 process, that's not the reason it's called that. As for it being 4 nanometers, I have looked into this and... I don't know. Techpowerup, Tom's Hardware, a good amount of mainstream reporting, and even TSMC's investor reports label the N4 process as "4nm". But I have also heard reports on the contrary that N4 is a "refinement" of TSMC's N5 process. That could be interpreted a number of different ways. "Refinement" could mean a node shrink as well as it could not mean one, so I'm leaving that one open.
Well, I play in crypto, and it sounds to me like the CTO of Nvidia got wrecked and feels the need to spread FUD. Most legitimate cryptos are moving to "Proof of Stake" rather than "Proof of Work". Nvidia does not have a future in crypto anyway. Could his opinion about crypto be slightly biased?
@@LucasFerreira-gx9yh Actually, it seems you are the one missing the point. Crypto has realized and unrealized use cases. The intent of crypto was not an unregulated safe haven for most of us. It is about finding a more agile and equitable means to move value from one entity to another without the need for so many middlemen. Censorship resistance is still better than fiat, even with proof of stake. So for most of us, a properly set up proof of stake would still be a massive improvement over traditional systems.
In case you missed it, check out our unhinged rant about a small motherboard feature: ruclips.net/video/bEjH775UeNg/видео.html
Nvidia saying crypto is useless is like Homer Simpson saying donuts and beer are useless.
paid the cryptotax end of 2016 for a 1060 6gb never again, sadly AMD is playing along EU pricing is fudged
at least they have enough vram would have pulled the triger on a 6800/xt in the US
I would really like to see some in depth reviews of the Framework laptops. Pros/cons and performance compared to different other laptops on the market.
It's sad to know that Steam is no longer going to support Windows 7 and 8. It's a slap in the face to people who have older systems they use for older games. This is a troubling precedent if it means having to be forced to upgrade in order to use games we've purchased that work just fine on older systems. Steam should have a remedy for that!
I like cheese.
Switching from CMOS to resonant tunnel diode threshold logic would give 2.2 THz at 5 W. And it would be cheaper!
Nvidia stopped caring about crypto when Ethereum moved away completely from PoW. It's akin to "YOU CAN'T FIRE ME! I QUIT!"
Can always probably go for Monero :X
@@lesslighter It's probably best that alphabay stays underground and CPUs stay affordable. I'm happy with monero staying out of the news indefinitely.
@@lesslighter Don't give them ideas. That dumpster fire "industry" is on the way out, and that trend needs to continue.
Thank you, captain obvious
Cryptocurrency milking is dead, long live Generative AI milking
_Friendship ended with cryptobros, now AI is my new best friend_
Nvidia sure loves their AI p0rns
Hey! There's no AI here, puny human *bzzrp* puny human fleshbag!
He was working up to the joke. I just think he forgot exactly how it was formatted. As did I.
Until next crypto craze...
@@nisx2012 Wait till AI and Crypto merge
"People programmed it to mine crypto"
If I recall correctly, nvidia made crypto mining specific cards
I mean technically they didn't make any crypto-specific cards... cause they were made by partners; they just rebranded some existing cards and locked down their drivers through BIOS shenanigans.
But yeah, you recall correctly.
You don't remember correctly. Nvidia let AIBs produce ASICs after they tried to STOP mining on regular GPUs. It was third parties like EVGA who were the bad guys, not Nvidia.
@@Wobbothe3rd No, they made mining cards before; they've just been very secretive about it and only sold them to crypto miners. Good thing they're saying bye bye crypto, no soup.
@@Wobbothe3rd You shut your mouth right now and leave EVGA out of this.
(it's a joke, yes, they released a mining-centric card in 2018)
@@Wobbothe3rd You do not say....NVIDIA CMP 170HX Pro Mining was a room decoration then?
It was clearly a logistics error when pallets of Nvidia cards went directly to cryptominers instead of anyone else, right?
I don't believe that was ever confirmed to have actually happened, was it? Someone just analysed overall sales figures and guessed.
And when they made dedicated mining cards.
@@johnnypopstar Why else manufacture CMP cards? 🤨
@@johnnypopstar Big miners called the corporate office and told them "we need two thousand GPUs", and Nvidia happily sold to them. Companies often favor B2B customers.
@@johnnypopstar It totally did. There are some actual large businesses that focus entirely on crypto mining (there are a few here in Texas, for example), and several of them posted photos online of multiple pallets of new Nvidia 30 series cards being unloaded at their loading docks.
Crypto is no longer NVIDIA's friend....
...
...
"For now"
Maybe we should do that X guy is no longer my friend meme
Lol yeah right like crypto is coming back...
@@Wobbothe3rd HEY BUD
"till the next pump at least" 😂
@@lesslighter Maybe we should think in actual thoughts instead of repeating memes?
More computer history videos (maybe) Steve? 🤔 glad you covered Moore’s contribution and history.
A Framework 16 review would be cool. If you did a review on the 13, that could be cool as well, even though it's not a new product.
I absolutely second this!
Rip to legend Gordon Moore you will be missed but your contributions will never be forgotten.
Sales down? USELESS
MUDA MUDA MUDA
"we actually have to work to sell products now, rubbish!"
Please do buy one of those Framework laptops when the 16" comes out. I'm very interested in them, and while I trust LTT to not be biased when reviewing them, it'd be nice seeing a source like you giving them a real rundown!
Linus would absolutely disqualify himself from reviewing Framework products. But he will present them in a good light in announcement videos like the recent one. (They're not substitutes for objective reviews, obviously.) I too am very curious about the 16 with its discrete GPU module. I won't preorder or buy blindly, but I hope for good reviews when the product launches.
@@joesterling4299 iirc, Linus won't even review laptops himself anymore because of the Framework investment
NVIDIA's been in a room sniffing its own farts for far too long.
It's a company trying to maximize profits for investors. What did you expect.
You see that South Park ep? All the yupsters wafting it right into their own faces, hilarious.
I'm glad to hear that you are using a Framework laptop and covering the 16 inch model. I love my laptop and really hope the company keeps going. I'm going to get the 16 inch as soon as I can afford a new laptop. Will get an additional desktop at the same time with the Cooler Master case.
and i'm waiting for Framework to start shipping to other regions so that i can get their mobo kit and VESA mount it to a monitor
Framework's products have kind of interested me, but I do wish they offered lower-end options. I'm not in the market for something super powerful -- a general work/office laptop -- but I do like the repairability that Framework is selling.
@@imjust_a true. If this goes well, this might unleash an interesting generation of laptops that work like a PC: you start with low-end parts and slowly replace them until they become midrange or high end. The best part is that since you already have the chassis, screen, keyboard, and the ports, it becomes relatively cheaper that way.
Man I really dig the Framework 16. I want it to blow up in sales and force other manufacturers to build modular laptops.
Re: Your story on the Framework laptop, I have massive respect for your statement regarding investment and conflict of interest. That's been a real problem in tech journalism for decades, and it is encouraging to hear that "homey don't play that game".
Nvidia: Friendship ended with crypto, now gamers are my best friend.
Gamers: No ty
Gamers: yes ty
People are still buying a lot of nvidia cards
@@_qwe_fk_1700 They're selling at record low numbers, so not really.
@@Matkinson123 Record low numbers they might be, but they still have more than 80% market share (IIRC). So yeah, gamers still buy their products no matter how piss-poor the value is, sadly.
Nvidia: **Shocked pikachu face**
@@Matkinson123 „record low numbers“ depends on what timeframe. But this is just something the haters tell themselves to make themselves feel better
NVIDIA made $550 million on the CMP lineup alone in fiscal year 2022. You can look it up in the 8-K form they published this February.
Attended the first few E3's in the 90's
If I recall, Friday was media day, Sat-Sun was open to public
Pretty cool back when 3D hardware was in its infancy
Intel executive: "We need more cores! I don't care how it's done, just do it!"
Intel engineer: "Guess we just make it bigger."
Subcontractor: How many pins would you like?
Intel: All of them!
To be fair, that's what AMD has been doing with Threadripper too. There's a slight die size increase from the 3995WX to the 5995WX. That's basically the easiest thing they can do right now, instead of trying to cram more cores into the same space.
Fractal Design: "We will make a log cabin themed case. The size of an actual log cabin, to accommodate new GPUs and CPUs."
@@GeminionRay they can't increase density anymore like they could in the past. Those days are over now. Although you'd never know it by listening to what the industry says. Even Moore knew his observation wouldn't hold true forever.
@@1pcfred I think it's always a bad idea to state that something (particularly in computing) is over now. Maybe NOW, sure, but it's hard to make confident predictions about the future. There will be some crazy development and density will increase greatly (maybe it's not even physical density).
After the honeymoon is over Nvidia goes like:
"I never loved you in the first place!"
Nvidia just throwing whatever under a bus whenever. We remember.
It won't stop you from buying another Nvidia card next time you upgrade, so why should they care?
@@CaptainKenway It did stop me from buying GPUs over the past several years, though. I built my own PC without a proper new video card, and then I built and gifted 2 more PCs to my brother and sister. I ended up not buying a single new GPU. I bought a used RTX 3070 Ti from a local seller for cheap, which means no money to Nvidia. They just seem to not want to sell their products.
I would definitely watch a Framework review or even just a talking head piece to hear your detailed thoughts!
after creating mining-specific cards and selling GPUs directly to mining companies, the double standard hits hard
Thank you for mentioning the 2017 crypto boom (1:59). Had to drive 40 miles in the metro-Detroit area just to find a "local" supplier for ANY nVidia 10 series card. And it was a 1050 Ti. And, yes, I overpaid for it. PC died and I couldn't wait around for a card to be shipped to me.
Lol yeah. That's why I always keep 1 or 2 spare cards laying around; you never know when the next boom is..
Ah, I remember that time; it went on until 2019. I got a GT 1030 instead.
I remember ordering a 580 and the retailer not getting their expected shipment for a month. I got a 570 4GB and the difference back. Glad I took the trade because they didn't get another one for a while.
Thank you for getting clarification on that AMD quote. I was a little astonished how many people were using it (as they'll probably use the new quote) as a reference to tell me that AMD clearly could have beaten Nvidia at the high end but didn't (which certainly seems like the most convincing way of saying you're better than a competitor but just decided to throw them a bone, yes).
i never understand why people care about what "they" said and whatnot. We should only look at the products.
RIP Gordon Moore, a pioneer in one of the most interesting times of development of silicon technology.
That Framework laptop is awesome! I would totally buy one if they release it.
RIP Mr. Moore. Accomplished quite a lot, and his contributions to humanity will be remembered.
Yea everyone's talking about the nvidia bit while this part was kind of more significant to me.
Man, I always love the HW News. The Framework laptop is a design type I have been looking to see made for years. And, I'm like Steve. I like to use 17.3" laptops and don't have an issue carrying an extra couple of pounds for the convenience it gives me to do what I need to do in the time I have. And, the Steam info just makes me think one thing: time to go to Linux. And RIP Mr. Moore...and thank you for all you gave mankind.
Linux for gaming? LOL
up until recently, the steam client had an option to use the -no-browser switch to actually disable CEF. they disabled this recently and put up a feedback thread that i'm sure they are actually reading and not just putting up so people (like me) have a place to vent.
the decision to force CEF means that if they want to keep this now hard dependency up to date (because it is very much not right now), they *have* to drop win7/8 support because newer versions of CEF don't support it. but my *personal* concern isn't with the OS limitations (all my systems are on 10 or 11), but with the fact that CEF is a memory-leaking piece of... "refuse" that the steam client could (and has, for the past 15 some odd years) serve its core functionality without.
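For reference, this is roughly how people used the switch before it was disabled; the install path is the Windows default and the small-mode URL is from community documentation, so treat it as an illustration rather than a supported Valve interface.

```python
import subprocess

# Launch the Steam client without its embedded Chromium (CEF) processes and
# open the old compact library view. This no longer works on current client
# builds, which is exactly the complaint above.
STEAM = r"C:\Program Files (x86)\Steam\steam.exe"  # default install path
subprocess.Popen([STEAM, "-no-browser", "+open", "steam://open/minigameslist"])
```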
Minor correction, Sapphire Rapids has up to 56 cores, not 48.
It's made of 4x 15-core tiles, with one core disabled per tile to increase yield (4 × 14 = 56).
"NVidia's Reality Distortion Field"
Very accurate, well played.
I remember them selling cards in the thousands per order to crypto miner companies while gamers were waiting for 2 years to get a GTX 1660 at 500 dollars because not a single 3000 series card was in stock anywhere.
At some point the chickens will come home to roost for Nvidia and AMD, as far as them basically being in cahoots artificially inflating GPU prices.
Steam breaking Win7 support just means faster Linux adoption. :)
No, it does not mean faster Linux adoption… it is a pity, but true… People will just say that Win 10, 11, or 12 is bad and still use them.
Nvidia strikes me as a very abusive relationship partner. I feel like Nvidia and crypto will be walking hand-in-hand again someday
It strikes me as the most gaslighting business relationship
Crypto is dead. Crypto enthusiasts tend to be sociopath scammers. Nvidia is a real respected company.
Ask EVGA, they stopped working with Nvidia over it.
@@Wobbothe3rd That's because Bitcoin is a legitimate project to build a peer-to-peer currency, and all the other coins are literal scams taking advantage of buzzwords. Bitcoin is now free of being treated as a stock and can return to being a currency.
@@Wobbothe3rdYeah, Nvidia might be shitty about some things but they at least make actual products that do actual useful things
Nvidia also made a 20 GB 3080 and sold it directly to miners, and didn't sell it to regular consumers.
Games these days could use that kind of GPU memory. But they didn't put it on consumer cards. Tragic
What's the source on this? AFAIK Ethereum needed ~5 GB of VRAM and anything above that was useless. The main things that mattered were memory speed and bus width, and that 20 GB 3080 would have the same 320-bit bus as the regular 10 GB 3080.
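That matches the usual back-of-envelope for Ethash: each hash did roughly 64 random DAG reads of 128 bytes, so hashrate was bounded by memory bandwidth, not VRAM capacity. A quick sanity check (bandwidth figures are approximate, from public spec sheets):

```python
# Ethash was bandwidth-bound: ~64 DAG accesses x 128 B = 8192 B read per hash.
BYTES_PER_HASH = 64 * 128

def est_hashrate_mhs(mem_bandwidth_gb_s: float) -> float:
    """Estimated Ethash hashrate in MH/s from memory bandwidth in GB/s."""
    return mem_bandwidth_gb_s * 1e9 / BYTES_PER_HASH / 1e6

print(est_hashrate_mhs(760))  # 3080 10GB, 320-bit: ~93 MH/s (real-world ~95-100)
print(est_hashrate_mhs(936))  # 3090, 384-bit:     ~114 MH/s (real-world ~110-120)
```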
Considering your job is journalism, your level of tact is perfect. These big companies and groups need journalists snarking at them over the garbage they try to pull.
About laptops: finally, someone who shares the need for a navigation cluster.
They could fit one into smaller laptops but generally don't; my old Clevo is 14-inch and does include the nav keys.
My very old 12-inch Toshiba M400 has one too. Although it is crammed against the rest of the keys, I can still use it with no problems.
@@boam2943 Same goes for my 2012 Lenovo ThinkPad x121 (11.6 inch).
There they crammed page up / page down next to the up key and home / end next to delete.
It was a great little laptop apart from the horrible CPU (AMD E-350).
@@qlum I don't like that layout, even if the keys are there. There is no spacing around the cursor keys and I tend to press a key that I do not want, unless I look at the keyboard first. But, if I had to choose between that and a keyboard without dedicated keys, my RSI says "more keys, please" :)
True greatness lives forever. RIP Gordon Moore, your contributions will last as long as computing does
What Framework is doing seems very interesting. I've been waiting for modular GPUs on laptops to make a comeback. It's also very interesting to see how they managed to overcome the cooling issues: the module includes its own heatsink and all, and the laptop wouldn't even need to be opened up? Wow. Also, the option to use it on other machines is super cool, plus it's open source? Cool af. I really hope there will be a way to use it on a PCIe port though, because that'd make it have no expiry date at all. Thunderbolt and USB4 are cool and all, but every PC for the past 20 years has PCIe, and I've never even seen USB4 or Thunderbolt IRL yet, so having no way to get it working directly on PCIe would be a bit of a waste. But they don't need to have it all figured out yet when the product isn't even out, and it being open source means that with enough interest there'll be a way made by the community. So yeah, massive W.
GN is the GOD Tier for PC Hardware/Software info. E3 fumbled the bag in 2014.
I’d like to see a framework laptop review
Man, I wish you'd come to the CHM when I was there! I interned at the museum in college, it was an incredible place. When I was there, only a couple docents were still capable of operating the PDP-11, and when they retired there was no plan to continue operation of the machine. So solemn to experience what are likely the twilight years of an incredible piece of technology.
That's unfortunate for these trade shows. Then again, there might be something more important to do first; the trade shows can wait.
1:18 that whole passage, almost perfect Rick Sanchez moment right there
You do not lack tact @Gamers Nexus. You are just honest. And being honest is all of the tact you need.
That Framework GPU setup, if it is actually interesting enough, could mean high-efficiency production stations, multi-GPU mining, or even better, multi-GPU rendering.
Imagine being able to render games or digital sculptures (mostly CAD) on a timeline similar to a workstation or HEDT.
>4 years later
Oh we never liked AI either!
They should say that they actually love money. That would be honest at least…
I hope Framework doesn't suffer from the Osborne effect. Want to get one with a GPU.
Honestly, I felt that E3 died after 2006. It was never as much fun again. It had its moments, but it went from being an event, THE event that everyone talked about. And it was, and still is, the most fun I ever had, because in the early 2000s it was an orgy of videogames, prizes, swag, swag, swag, and booth babes dying to get your attention by throwing swag at you.
Then in the late 2000s it turned into some vendor- or press-only event, held at some airport hangar. Nobody cared, NOBODY, except self-righteous "journalists" from places like Electronic Gaming Monthly claiming that E3 was going back to its roots, like E3 was "FOR US, BY US". Except again, NOBODY CARED. Journalists don't buy the games, and nobody gives two ships about some fanboi's opinions. Hold some journalist-only, invite-only event with fewer than 300 nobodies, and there's so little fanfare that even the traffic to E3's airport-hangar-exclusive website tanked. They cancelled it once they realized nobody cared about that event, and vendors wouldn't either. No exposure, what's the point?
Game shows should have fanfare and spectacle. They're SHOWS. Yeah, I know crowds suck, but it doesn't stop people from going.
On top of that came the HORRIBLE management of the ESA, and rising costs as they themselves were gouging devs just to have a display. I can't blame Sony, Microsoft, and Nintendo for not showing.
Of course, a few years after the complete DEBACLE of the journalist-only airport hangar "expo", they would relaunch E3 to its FORMER glory, trying to bring back the glitz and the spectacle, what it used to be. They tried, but making it not vendor-friendly eventually hurt them, and the last show I attended was 2015.
I had fun, but it ........ wasn't the same. It wasn't even close. It was a mild show, some fun, some prizes, little to no swag compared to what it was. (One year I left with 20 free games that I caught by leaping into the air and nabbing them over everyone's heads, as my vertical was pretty high, plus 22 t-shirts, mice, mousepads, game controllers, calendars, posters, and game guides, back when those were a thing: one Tomb Raider guide signed by Toby Gard and Karima Adebibe, a Soul Calibur calendar signed by Hiroaki Yotoriyama, an Oblivion guide signed by Todd Howard, etc, etc, etc. I still have my Lineage Coca-Cola can. I had so many WoW binders I sent them to friends around the country who were huge WoW fans.)
Fast forward: there were still lights, but there were no fireworks. Literally, there were flames at the NCsoft booth in the early 2000s. Huge 3rd parties weren't there. There were entire sections that were now empty. Though you could still feel the passion, it seemed like there was a big brother saying you can't have too much fun.
One of the booth girls said there was certain "conduct" they had to keep up or get fined. But hey, I remember line dancing with the Nokia N-Gage girls alongside dozens of others, being given free drinks, surrounded by Nokia models dressed in white miniskirts like go-go girls. We were all just having clean fun. And the ESA killed it.
Pepperidge Farm remembers those Nokia N-Gage girls.
I remember when they changed E3 to that invite only event at some airport. It was dumb, and I didn't even pay any attention to it.
I was at E3 in the early 2000's. I remember being at, I think, the Namco booth. The girls were tossing shirts, calendars, and games into the crowd. I saw a game flying over my head, and some dude standing a few feet behind me leaped into the air and yoinked it. This guy was an entire torso above my highest reach. I'm like, wtf. Then a t-shirt came in my direction, and again, the same guy, whose KNEE was above my head, snatched it.
I was getting angry. Then I turned around and saw him give the shirt and the game to a kid in a wheelchair who would otherwise never have been able to get the stuff. The kid's dad, who was with him, thanked him and shook his hand. I was okay with it after that.
Then the booth girl tossed a calendar and the guy went all Dennis Rodman and snatched it too......
@@nanocat4141 lol...
@@amberturd3504 I gave away most of my swag. I sent t-shirts and WoW stuff (because I wasn't a WoW fan) to all my guildmates from Final Fantasy online. Shout out to Tshot/Spikeflail from the Ragnarok server. I kept the games though..... My guildmates really wanted the OG Ninja Gaiden shirts I had, so I mailed them out to all my friends who couldn't go to E3.
Steve really seething about the E3 snub.
The Framework 16 would be almost the perfect laptop for me, especially since it now has a numpad. The only problems are that they're not sold in the EU, and there hasn't been any mention of a physical ethernet port.
Go read through Framework's product information. Ports are provided through plug-in modules the user can mix and match as desired. Ethernet is one of those.
The ports are customisable, I believe they have an ethernet solution.
Where are you in the EU? Framework seems to have a bunch of EU countries on their ordering website.
It's refreshing to see Stephen's passion for the roots of computer technology. Keep up the great work young man.
I'd love to see y'all start covering video games again. I know it's just not the direction your channel has taken.
What the...we lost another legend.
Fairchild Semiconductor was the bomb back in the day. They made mass production of silicon-based integrated circuits possible (and guess _who_ did), which put an end to the germanium ICs that other companies had been developing at the time. Whatever remained of the company is now part of ON Semiconductor (now onsemi), which is itself a spinoff from Motorola - another historic name in the semiconductor business of those old days.
*For the curious:
International Rectifier (e.g. HEXFET) was taken into Infineon (e.g. PowIRstage, CoolMOS)
Philips Semiconductors (which acquired Signetics, originator of the 555 timer IC) spun off as NXP (whose standard products division later became Nexperia)
National Semiconductor (e.g. LM317, LM2596) was acquired by Texas Instruments (the calculator company)
Linear Technology (e.g. LT1070, LT1581) and Maxim Integrated (e.g. MAX66xx temperature sensors) are now part of Analog Devices
Looking forward to seeing how viable Framework 16 makes gaming laptops. That said, my biggest complaint with Framework laptops is the tiny number of ports you get in exchange for customisation.
I would like to buy one. I stopped buying laptops like Dell or HP, because they're basically e-waste that won't even work well in 3 years.
I hope Framework can stay financially stable; it is a project the world needs.
For all their woes, NVIDIA has been putting years of work into AI workloads: libraries, hardware support, and general "it just works" polish, the kind of effort I would prefer to see from AMD too. Their claims about AI are not unsubstantiated, which is kinda sad, since we need more competition in consumer and professional hardware to drive prices down; the RTX 4090 and A100/H100 are unmatched right now.
AMD has no need to do useful stuff. They get by with barely competitive hardware that gets bought by people that still think that being edgy and rejecting Nvidia and Intel is the cool thing to do.
@@Finsternis.. You seem stuck in 2012.
@@Finsternis.. lots of emotions there, not many facts. Something tells me you couldn't back that opinion up without just getting angry and indignant and demanding you're right.
@@brahtrumpwonbigly7309 sadly he's kind of right lol
But it's not just AMD fanboys, that's the whole "fanboyism" attitude that prevails a lot of times since the advent of social media.
Before that, we just used to stick with the brands that don't crash constantly, or that have drivers that work with the games we used to play.
Now to sell well in the DIY market, you just need to have good marketing and brew a cult following (team red, team green, team blue etc...)
@Die Finsternis Or massively profitable server CPUs that have buried Intel for the past 3 or 4 years?
They aren't competing with Nvidia because they'd make almost no profit versus using those wafers for CPU chiplets.
Sad to hear that Steam relies on an internal browser to operate normally. I actually paid a lot of money for that software (and the games behind it that I've bought) and I want to use it as I wish, no matter which OS I'm currently using (and I'm using Windows 8.1). I hope the older version will remain functional for at least 5 years after.
NVIDIA is just sour that they can't automatically sell EVERY SINGLE GPU they produce, and now have piles of unwanted GPUs sitting on shelves.
Whenever it becomes time for me to need a new laptop, Framework 16 is definitely on my radar.
I'm in the process of upgrading my main PC though, and that's making my wallet weep, so another time.
Man, E3 used to be the place where you'd see all the upcoming stuff and could get excited about things like games or future PC stuff. But nope, it's canceled now. Oh well, I guess we'll just find out about stuff when it's announced.
There are plenty of other options for game publishers to showcase their upcoming titles these days. Which is why E3 is going away.
@@flow221 Problem is that they're not really all consolidated in one place anymore. They're spread out throughout the year in little bite-sizes instead of all the info coming out at once. It makes sense from the publishers' perspectives: why compete with everyone else for the same timeslots? But for the consumer, it can require more effort to keep up-to-date on everything. Especially if you don't follow anyone who reports on upcoming titles.
Rest in peace Gordon Moore, you will be missed.
The interesting thing about Babbage's Analytical Engine (the successor to his difference engine) is that although it was never built in his time, he'd already had a woman by the name of Ada Lovelace start writing programs for it, making her in effect the first ever computer software engineer. She is of course the namesake of Nvidia's "Lovelace" architecture.
I've heard it's kinda disputed whether she just transcribed the programs or actually wrote them.
That's not what really happened.
They lost a huge market when ETH went to PoS. Now suddenly there isn't a shortage of graphics cards!
My 4090 doesn't even pull 400w most of the time. It really depends on what you play and what settings. These 40 series cards seem pretty damn efficient. I use my gpu with a 7700x at eco mode and while it's still using a lot of power relatively speaking, the total system draw isn't nearly as high as you'd expect for such a powerful gaming setup.
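If anyone wants to sanity-check power numbers like these on their own system, one simple way (assuming an Nvidia card with the nvidia-smi utility on PATH) is to poll the driver's reported board power:

```python
import subprocess
import time

# Print the board power of the first Nvidia GPU once per second (Ctrl+C to stop).
CMD = ["nvidia-smi", "--query-gpu=power.draw", "--format=csv,noheader,nounits"]

while True:
    out = subprocess.check_output(CMD).decode()
    watts = float(out.strip().splitlines()[0])  # first GPU if there are several
    print(f"GPU board power: {watts:.0f} W")
    time.sleep(1)
```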
I've always seen E3 as a red carpet for big-industry videogames, much more than the Video Game Awards...
I'm sad to see this event go away; it was a very fun week following new announcements, mishaps, and reactions from our favourite youtubers/streamers...
RIP E3
RIP E3.
I remember the Ubisoft E3 cinematic trailers. Great memories.
You mean the cinematic trailers for games that where broken soulless money grabs? Making pre orders and fake crap up ruin the hole industry? Not even talking about monetization for a moment the games for the last 5 or 10 years are all just garbage unfinished messes. Anthem showing made up trailers that even made devs for the first time see what the actual game was going to look like! That is how broken it got and still is! The pre order was far more valuable to the point of a trailer for a soon to be released game was worked on before ALL the devs even knew anything about the game! That means no one in the dev team had made ANYTHING of the stuff shown. They had just began the groundwork and the trailer was all made up fantasy and the goal of what the game might look like!
That is *. Making a gameplay trailer that was 100% made up cinematic. And the people to make the game saw what they where going to make together with the public. That is E3 at it's best.
I'm almost calling on some kind of god to thank for getting rid of E3 and the like!
If Framework actually follows through with removable GPUs I will ABSOLUTELY buy one; preferably a Ryzen model, if the 16" comes in Ryzen.
Personally, I like desktop replacements, those 21" chonkers, though I doubt they're moving that way in the future.
But Nvidia sure loved crypto when it made them money.
I remember in the late '90s/2000 it was my dream to go to E3 =x
Good morning Steve!
I'm interested in the Framework 16 so I can cram multiple M.2 SSDs into it. It would be so nice to have 24TB of storage, but 8TB M.2 modules cost a bomb; 3x 4TB M.2 for 12TB will probably be the sweet spot.
I'm going to say it, E3 going away is not a good thing. We'll agree to disagree here, but for me, the way things are right now makes trying to find games you want to play a full-time job, because you literally have to wade through hundreds of websites, watch hundreds of streams, and try to find stuff that interests you. I could do that, or I could actually be playing games. E3 being a big show with structured events is a big advantage. Geoff Keighley understands this, that's why his show is still around.
I agree. I don't really frequent any gaming sites these days (since gaming journalism is a joke); E3 was the best place to go to see a variety of stuff in one spot -- from AAA "blockbusters" to the occasional indie spotlight. While I'm not sure if these games would've been shown off at E3, I'm quickly discovering (by chance) a lot of AA-tier games made within the past 2~3 years that I didn't even know existed.
Not really. Most game presentations were just scripted engine footage, with the quote "more at Gamescom" at the end. So in the end, Gamescom was always the place for me to get information about new games.
Saw your repair video over at LR channel and subbed before watching a vid. This is my virginity being lost! Love your style bro. I am going to enjoy my geek-binge!
I'm watching this on the almost-10-year-old Win7 computer I have at my second place. This is bad news regarding Steam, because I love this thing. It boots lightning fast (way faster than the reasonably specced 7900X machine I assembled a month ago), it is rock-solid reliable, and it gives me a great time playing PUBG and a few other old games on weekends.
I hope PUBG on Epic will last a bit longer.
Agreed. I love my WinXP/Win7 PC, and losing Steam is pretty big. Maybe we can get someone to bypass the requirement.
Thanks for all this, Steve - and crew!
Many tears shed for NVIDIA and crypto... NOT. I hope NVIDIA can find better friends. Oh wait, they BURNED ALL THOSE RELATIONSHIPS.
1:19 - hits hard. Very true for almost every company.
Nvidia used ChatGPT to write the press statement... lol, and it got confused.
Thank you as always for these. I just realized I've been taking them for granted.
2:50 well no, that just means they care about income. Not that they care about crypto. And why would they? They sold those cards to make real money, not because they cared about cryptocurrencies.
12:50 the bad translation didn't read like they said Nvidia had a 600W card either. It read like AMD said they could compete but they'd have to make a 600W card to do so.
Oh, the first crypto boom was in 2017. No wonder the 1070 was $379 and the 2070 was $599, I guess. I had wondered why prices skyrocketed before COVID; I barely remember the 2nd crypto boom because I wasn't gaming much or paying attention to any of this at the time. COVID and the 2nd boom just let them keep prices high forever, far past inflation, which is only about 19% cumulative in USD since 2018, even though prices for each tier seem to have basically doubled since then.
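For what it's worth, here's a quick back-of-the-envelope check of that claim (a sketch only; the MSRP and the 19% cumulative inflation figure are taken from the comment above, not independently verified):

```python
# Rough check of the "prices basically doubled but cumulative USD inflation
# since 2018 is only ~19%" claim (figures from the comment above, unverified).
price_2018 = 599                     # e.g. RTX 2070 launch MSRP in USD
inflation = 0.19                     # claimed cumulative inflation since 2018

justified_by_inflation = price_2018 * (1 + inflation)
doubled = price_2018 * 2
print(f"Inflation alone would justify ~${justified_by_inflation:.0f}")  # ~$713
print(f"A doubled price would be ~${doubled:.0f}")                      # $1198
```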
When crypto booms again, no doubt Nvidia would re-organize to meet the demand in order to make record profits again :D
I doubt PoW on consumer hardware ever comes back in the same way. Ethereum was a black swan for GPUs and I don't see how anything could repeat its history. PoW is justifiably under heavy scrutiny by investors and the public alike.
@@fern3436 Silly fern, the Bitcoin halvening happens next year. Nvidia profits are about to go back to record highs.
@@drakomus7409 but but with their venture into AI, if crypto were to resurface... how could Nvidia, the ONLY GPU maker in the world, possibly handle all of the profit!?!!
@@drakomus7409 Its not like Nvidia makes ASICS though
@@t1e6x12 did they make ASICs in 2016 or 2020? The last 2 Bitcoin halvenings gave Nvidia record profits, just like this one will
ngl, feel like the recent vids have been more high energy and there seem to be more uploads. all for it tho. great stuff!
It would be interesting if Nvidia made some AI-specific cards for home tinkerers. That crowd cares mostly about VRAM, not necessarily CUDA cores or clocks. The 3060 is a popular card because it has 12GB of VRAM, and some people buy 4090s not necessarily for the GPU silicon, but because it has the most VRAM of any consumer card. Nvidia could make a 4070 with 32GB or more of VRAM and AI hobbyists would buy it. For the right price, of course.
btw inference will still use the CUDA cores, it just won't be as demanding on features (it's mostly various floating-point math operations that map one-to-one onto matrix multiplications)
high CUDA core counts are also very useful for model training
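To make that concrete, here's a minimal sketch (assuming PyTorch, a CUDA-capable GPU, and a hypothetical 13B-parameter model; the sizes are illustrative, not from the video) of why VRAM is the gating factor and why inference boils down to matmuls:

```python
import torch

# Why VRAM gates hobbyist AI: the weights alone dominate memory at inference.
params = 13e9                  # hypothetical 13B-parameter model
bytes_per_param = 2            # fp16/bf16 weights
print(f"~{params * bytes_per_param / 1024**3:.0f} GB just for weights")  # ~24 GB

# And the core inference workload really is matrix multiplication,
# which runs on the CUDA cores (or tensor cores when available):
x = torch.randn(1, 4096, device="cuda", dtype=torch.float16)     # activations
w = torch.randn(4096, 4096, device="cuda", dtype=torch.float16)  # one weight matrix
y = x @ w  # a transformer layer is mostly stacks of these matmuls
```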
Framework is so cool. I hope it isn't just a fad that dies out in a few years like those phones with add-on modules.
It not only stopped Nvidia from making money but also jeopardized 4000-series and future GPU sales, because miners have to sell their used GPUs 😂
19:41 - Good point. Understood.
Looking at the efficiency of the RX 7000 GPUs, AMD would have to make a 600W GPU to compete with the 4090.
Thank you for the news, GN!
Nvidia is "Chase the trendy thing" the corporation nowadays.
& unfortunately it works in the world of youtube but I like how it keeps AMD cards low for us who don’t care about Nvidia lol
There is an AMD model with a better iGPU; I am hoping DaVinci Resolve will recognise it, though. Otherwise, hopefully the Intel 14th Gen iGPU, which is based on Arc, will. I have the 13" model too, but despite Iris Xe being capable, I can't edit on it.
I think AMD is playing the long game here. Obviously we can't know for sure, but I suspect Nvidia might have been spooked by the RX 6000 series, where AMD got VERY close to Nvidia (the cards are within single digit percentage points of each other if you consult Passmark scores, which I know aren't the best metric, but not the worst either). Seeing this, Nvidia probably wanted to cement its position as the "quality card" and went out to make the most powerful thing possible (the RTX 4090). We're reaching the limit of how small we can make transistors, and while Nvidia opted for TSMC's 4 nanometer process, AMD opted for 5-6nm (this is from Techpowerup).
So I suspect that in 2024-2025, when Nvidia releases the RTX 5000 series of cards, we won't see much of a performance boost over the last generation. Meanwhile, AMD might be able to boast a much larger improvement, making their newer cards more appealing, and *potentially* swaying people towards team red.
I'm not an expert in this field and this is not an analysis without flaws, but take that as you will.
I hope the younger & more “woke” generation sticks to Nvidia so AMD can keep their prices low. The exact same performance for a sometimes 40% lower price is too good for these YouTubers to ruin for us.
Nah, I honestly think they are just not willing to go punch for punch in the same market as Nvidia, because they will lose.
You are wrong. In fact, the node is "4Nvidia", not 4nm; it is a customized 5-nanometer process.
@@ARedditor397 The "4Nvidia" description for N4 is incorrect. TSMC has different naming designations for each of their process node sizes (N22, N16, N7, N5, etc). If "N4" stood for "4Nvidia", then the N7 node that AMD used for the 6000 series wouldn't be called "N7". While Nvidia uses the N4 process, that's not the reason it's called that.
As for it being 4 nanometers, I have looked into this and... I don't know. Techpowerup, Tom's Hardware, a good amount of mainstream reporting, and even TSMC's investor reports label the N4 process as "4nm". But I have also heard reports on the contrary that N4 is a "refinement" of TSMC's N5 process. That could be interpreted a number of different ways. "Refinement" could mean a node shrink as well as it could not mean one, so I'm leaving that one open.
Great content as always. Just wondering if you'll cover the Phanteks NV7 case, @Gamers Nexus
I'll believe it when a GPU no longer costs a kidney
My Framework laptop is the best I've ever owned. Love that they are making it work.
Well, I play in crypto, and it sounds to me like the CTO of Nvidia got wrecked and feels the need to spread FUD. Most legitimate cryptos are moving to "Proof of Stake" rather than "Proof of Work". Nvidia does not have a future in crypto anyway. Could his opinion about crypto be slightly biased?
Proof of stake goes against the point of cryptocurrency anyway.
@@LucasFerreira-gx9yh Actually, it seems you are the one who misses the point. Crypto has realized and unrealized use cases. The intent of crypto was not an unregulated safe haven for most of us; it is about finding a more agile and equitable means of moving value from one entity to another without the need for so many middlemen. Censorship resistance is still better than fiat even with proof of stake. So for most of us, a properly set up proof-of-stake system would still be a massive improvement over traditional systems.
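For readers unfamiliar with the distinction, here's a toy proof-of-work sketch (illustrative only, not any real coin's algorithm): PoW makes miners brute-force hashes, which is exactly the workload GPUs accelerated, while PoS replaces that grind with staked collateral, leaving nothing for a GPU to do.

```python
import hashlib

# Toy proof-of-work: grind nonces until the hash meets a difficulty target.
# Real coins differ in detail, but this brute-force loop is why PoW ate GPU
# cycles; proof of stake has no equivalent grind for a GPU to accelerate.
def mine(block_data: str, difficulty: int = 5) -> int:
    target = "0" * difficulty
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce
        nonce += 1

print(mine("example block"))  # takes ~16**5 ≈ 1M hashes on average
```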
Kudos on the Gordon Moore notice.
Gas + Light = Nvidia
banks collapsing... nvidia: being your own bank is useless to society right now
🤡🌎