1:57:20 - I actually just sold a GeForce 256 on eBay a few weeks ago, an ASUS V6800 DDR/32M. I was wondering why it went for so much. Even the Matrox m3D I put up has a lot of interest, and that's not a great 3D accelerator.
This is a great cast of people for today. I'm watching that
It is interesting that during the podcast Adam spent more time speaking about Nvidia when talking about the 7600XT.
56:18 🎵 boss battle music begins 🎵
@1:02:30 Adam, I'm pretty sure you can run the benchmark from within the game - unless they've changed it in a patch, you could at launch!
I still run a 15-year-old plasma. Still runs great. New TVs haven't stunned me enough over the plasma to justify changing.
Motion blur usually amplifies my simulation sickness
AV1, not AVC!
AVC is the same old H.264. Not Full Nerd confirmed.
Always nice to hang with you, this is what keeps me going for another day 🥰
What's the dual encode thing. is it just for AV1? The 20 series cards could do multiple encodes just fine. I used this for dual streaming and recording at the same time on a 2080.
As I always say - thank you for the podcast version!
WOW what a cast! 🎉🎉🎉
To the one looking for a GeForce 256... I have one! It will need a cooler, though. Can't remember if it's DDR or not.
Motion blur is garbage. Just tell Linus he can save C$0.02 every time he turns it off.
I've done quite a lot of analysis on the gpu market and i found that, historically, the best time to upgrade was every third generation.
However, (and i haven't fully updated my analysis for the current generation yet) with the slowing of performance and feature increases at the lower and mid-range segments, and with the associated price increases in those same segments, you are actually better buying something like the top-end GPU of the stack and only upgrading every four plus generations.
You'll get better value that way, unfortunately....
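That cadence argument can be sanity-checked with a toy model. This sketch is purely illustrative: the prices, the relative-performance figures, and the ~1.3x per-generation gain are my hypothetical assumptions, not numbers from the comment. The model divides a card's price by its summed relative performance over the generations you keep it, discounting each generation as newer cards leave it behind.

```python
# Toy model of GPU upgrade cadence (all numbers hypothetical).
# Strategy A: mid-range card kept for 3 generations.
# Strategy B: top-end card kept for 5 generations.

GEN_GAIN = 1.3  # assumed per-generation performance multiplier for new cards

def cost_per_perf(price, perf_multiplier, generations_kept):
    """Average cost per unit of relative performance over the card's lifetime.

    Dividing by GEN_GAIN**g models the card falling behind newer releases
    a little more with each generation it is kept.
    """
    total_perf = sum(perf_multiplier / GEN_GAIN**g for g in range(generations_kept))
    return price / total_perf

mid_range = cost_per_perf(price=600, perf_multiplier=1.0, generations_kept=3)
top_end   = cost_per_perf(price=1600, perf_multiplier=1.8, generations_kept=5)

print(f"mid-range every 3 gens: {mid_range:.0f} $/perf")
print(f"top-end every 5 gens:   {top_end:.0f} $/perf")
```

Which strategy comes out ahead depends entirely on the assumed prices, the top card's performance premium, and how long you actually hold it, which is why updating the analysis each generation matters.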
The thing about PC games now that I can't stand is all the annoying intro videos before you get to your damn settings. You have to delete them from a folder inside the game's install directory. One press of the ESC key should be all it takes to get to the settings in every game. I play Guild Wars 2; that game boots up in seconds and you're already playing it, no intros.
Yeah, Witcher 3 and Cyberpunk start up really fast if you disable all the intro videos. That's usually one of the first things that I mod about any given game.
"Would you say it's a great 1080p/1440p card?" It's $800.
Thought Alex's head was gonna pop.
Alex, after watching your motion blur video back when it came out, I got converted from a motion blur hater
Motion blur is useful for making god awful console frame rates look smooth, but I would avoid those frame rates at any cost.
Motion blur isn't even good for that. Making it look blurry isn't the same thing as making it look smooth. At console framerates you have sluggish but visible or sluggish and blurry. There is no smooth. People need to stop with this lie in defense of motion blur - it does not make things look or feel smoother. It only does what it says in the name - makes things blurry.
@mjc0961 Looks smooth to me, so no.
I'm having some ping problems here, but never mind.
Never skipped GPU generations till after the 1070. The 970 G1 Gaming was £300 and I bought it new at a discount for £260. The cheapest 1070 was the AMP Extreme at £430, and I gritted my teeth at that cost, as the rest were all £500 or more. Then after that, the 2070 series was too much. Today I cannot justify the £800 price range for what I see as a mid-range card. For me, this new 4070 Ti Super, with finally a 256-bit bus, is what the plain 4070 should have been. If it had been, I'd have considered it at £600, even though that is a stupidly high price too.
Yep, a 4070 Super around $499 is where things start to become at all interesting.
Where is Gordon? Why is no one talking about his absence?
they did?
ruclips.net/video/lIoyetk2vfE/видео.html
@1:21:00 This is why I've 'delved deep' into the effect of RAM speed and optimisations on gaming - to show that there's really not a big difference to be had past a certain point (especially in the mid-range).
I haven't tested productivity programmes as that's not really my area of expertise.
If Will has a particular "benchmark" I could run to add that data, I'm open to it! I'm currently testing DDR5 (DDR4 was done last year).
I think it really depends on the game. AMD fanboys tend to downplay RAM speed because their CPUs tend to max out around 6000 at best. In some games it matters.
The RX 6800 destroys the 7600 XT by 50% at $399, which is only 20% more money.
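A quick sanity check of that value claim. The prices ($399 for the RX 6800, $329 for the 7600 XT) and the 50% performance gap are the commenter's figures taken at face value, not measured numbers:

```python
# Price/performance check for RX 6800 vs RX 7600 XT (commenter's figures assumed).

price_6800, price_7600xt = 399, 329   # assumed prices in USD
perf_6800, perf_7600xt = 1.5, 1.0     # RX 6800 ~50% faster per the comment

extra_cost = price_6800 / price_7600xt - 1   # fraction more money for the 6800
value_6800 = perf_6800 / price_6800          # relative performance per dollar
value_7600xt = perf_7600xt / price_7600xt

print(f"extra cost: {extra_cost:.0%}")                                   # ~21%
print(f"perf-per-dollar ratio (6800 / 7600 XT): {value_6800 / value_7600xt:.2f}")
```

Under those assumptions the 6800 costs about 21% more while delivering roughly 1.24x the performance per dollar, which is what makes the comparison lopsided.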
👍 Spaaaaace Weeeeeeed in da' houuuuse !!!!!!
Guys, do you know of any list of features the Blackwell architecture introduces in its 50-series lineup? Like the 40 series has DLSS 3 and AV1.
I'm currently using a 3070 FE 8GB, and have been for 3 years. Should I buy the 4070 Ti Super or wait for the 50-series 5070?
Wait, or don’t buy Nvidia unless you can find a 4080 Super at MSRP…
@or1on89 MSRP being $800, I hope.
@or1on89 I got the Eagle 4070 Ti Super yesterday. 😊
The FEs are such amazing cards from an engineering perspective, putting the AIB designs to shame. I don't know why people would buy AIBs, aside from Nvidia constraining supply of the FEs, I guess.
Because FEs are louder, harder to clean (because everything is glued together), and the fans tend to have an ugly noise characteristic.
Ask me how I know...
I will never buy an FE again.
Yeah, I don't know, every FE I've seen has been quiet, especially that 4090.
But Elon isn't a closet monster like Ezra Miller, which you seem more like. Anyway, that comment left a sour taste in my mouth. This podcast was a cake and you're the off-putting thing left on top. Bring back Gordon.
woop
@NVIDIA
"GeForce Now" = 🧀🧀🧀... (lots of money in the future)
Cheese
Cheese. The cheese
Why do people always go out of their way to apologize for Microsoft's crappy tech? DX12 has been meh, DirectStorage is meh, VRS is a bust.
What's wrong with dx12? It's visually a lot better than dx11 and it seems to scale pretty well. Is vulkan better?
Windows is inexcusably buggy.
The problem is Windows and shitty devs who don't know how to optimize their engines. The problem is Nvidia/AMD selling you shit cards for $600.
There's not really a lot you can do when people want to skip the basics and the science behind their job.
More joke GPUs not worth buying.
1:35:39 Is that actually a thing? I think most people agree the best versions of Windows were XP (which was basically Windows 5), 7, and eventually 10 after many changes and updates, because early W10 was actually terrible if you upgraded from 7 Ultimate and expected at least the same basic functionality.
The way I see it, Windows versions don't alternate between good and bad but rather move down a slope: every new version after 7 got plenty of cool new stuff but at the same time lost so many features, functions, and so much basic user-friendliness that instead of getting hyped over new features W12 might have, I wonder what functions or features are next in line to get ruined or axed.
Nobody upgrades every generation; those questions about sidegrades are stupid. Even if you ARE upgrading directly from one generation to the next, why would you go DOWN a class? Stupid. That's why this entire discussion around the Super refreshes is blatantly biased against Nvidia: no one is buying a 4070 Ti Super after buying a 4070 Ti! The 4070 Ti was the BEST SELLER of the RTX 40 series; all this whining about price-to-performance doesn't reflect what consumers buy in real life.
In fact, people in real life have been buying mainly AMD since the 7800 XT release... because Nvidia's proposition this gen is pure crap.
@@or1on89 citation needed
Sorry, but your knowledge of what you are talking about is so limited it's actually hard to listen to you...