Easy. The game is built on a mountain of legacy spaghetti code. It must have taken a year just to get the thing to somewhat work. Mountains of code plus very high turnover for employees means basically nobody really knows how that thing runs anymore. In a few decades we'll have tech priests like the ones in Foundation.
@@WandererOfWorlds0 You can't just reuse the excuse RDR1 had with no source for these claims. If you have one, I'd love to see it, but otherwise this just sounds like a mountain of bull. GTA V had none of these issues at launch, and it too uses RAGE; this is almost certainly just an updated version of that. RDR1 got its spaghetti-code label because it was using an in-between version of RAGE with on-the-fly edits and changes that went largely undocumented, mostly due to badly managed development. The likelihood they used or updated that version of RAGE's codebase is laughably low. This is likely just a perfect storm: pushing settings too high for most users to understand or their PCs to handle, launching a new launcher without enough testing, and using new APIs without testing them as thoroughly on a wide range of new and old hardware configs as they should have. For many, including myself, this game runs fantastic and is everything I could have wanted and more. But there's still no denying that for some it's quite the opposite.
Motherboard compatibility has nothing to do with it. I have tested RDR2 with six different brands of motherboard (over 40 motherboards) and it runs perfectly fine with every single one. I can honestly say RDR2 runs perfectly fine if you don't feed it a potato for a PC.
Boy, 2019 isn't disappointing; damn near every game released is broken, shitty, or unplayable. It's amazing. Except for The Outer Worlds. At least that game is fun, works, and isn't microtransaction-infested.
@Him If only we lived in a perfect world. But the sad thing is, that's their bread and butter. Most console gamers, if not all, don't care much about ultra/high graphics and higher frame rates; they just want to play the game. Not to mention the refund policy is locked tight. Most console gamers prefer physical copies, and with physical copies, once you open it you own it forever unless you sell it to a third party.
@Him There's a reason Rockstar prioritizes consoles over PC: the Steam version of GTA V sold 11 million while the consoles combined sold nearly 90 million. Not to mention it's less complicated to code for consoles, and the game sells at a higher price there.
r/patientgamers. Always playing older games is awesome; it means you get everything cheaper and mostly more functional and complete. So yeah, I'll be playing this in like two years.
For the first time ever, playing this game has made me realize how consistent frame times, even at a locked 30fps, are better than spikes and valleys, like 80fps down to 30 and back up to 80, all over the place. At 30fps, I'm genuinely enjoying the game with near-4K visuals through resolution scaling on my 1080p 24" monitor.
The recommended spec is almost always for console quality, so around 30FPS at 1080p high settings; not really acceptable for PC. You can get games looking pretty damn good on weak hardware if you're willing to run at 30FPS like a console.
There's a Frondtech article that says an i3 with a GTX 1050 Ti can run this game just fine at 50-60fps, as long as the graphics settings are correct. Other people also noticed that certain settings drain FPS by a ridiculous amount, and putting them on Medium doesn't impact visuals too much but does make the problem go away. Specifically, those settings are:
Water Quality: Medium
Near Volumetric Resolution: Medium
Far Volumetric Resolution: Medium
Volumetric Lighting Quality: Medium
For the slowest systems you'll obviously want to adjust other settings as well, but for mid-tier systems that can't seem to run the game well, start with the above and see if that's enough. There are a few articles detailing good graphics settings profiles for the game.
@@Leo-oc8up Eh, I wouldn't necessarily classify RDR2 as a "current-gen game". Sure, the graphics are EXTREMELY impressive, what with all the super-detailed visuals, but it's also a game that released a year earlier on consoles and was in development for even longer. So honestly this performance is still not very justifiable, especially when factoring in the technical issues beyond just low framerate.
Sponsorship/ad that actually makes sense in the context of the video and channel. It's so rare these days to find people who simply want to provide high quality content. Thank you!
FOR THE INTEL BUG: It's a problem with their engine related to high fps. You have to limit your fps with something like RTSS, and once you get the limit right, you will eliminate the horrible 1% lows. There was also an issue with the pause menu that is fixed the same way.
Yeah dude. Watch the video. You're literally just saying what we already said. We're the ones who DISCOVERED that issue with GTA V. ruclips.net/video/fSlQL_iqGFg/видео.html
2019: the year the TURBO button was re-introduced due to developers being shitty at developing. For those who don't know, the TURBO button did the opposite of what you'd expect: it slowed down CPUs to play older games that based timing off CPU frequency.
The Turbo button worked exactly how it was expected to work. When the Turbo light is on, the mainboard is operating at nominal speed; when it's off, the board is operating at reduced clock speed. The button was usually not very descriptive about its state; you had to go by the indicator. Some PC cases instead had a segment-type display rather than a single LED, which manufacturers could order with a speed matching their board, or simply showing Hi/Lo, so there was no confusion as to what speed your computer was running at.

I experienced the tail end of the Turbo era myself, when the indicator was usually 33/66 or Hi/Lo and the Turbo button had lost its original point: since it was no longer possible to run 4MHz games anyway, and all newer games were flex-speed, it would have been better to hardwire Turbo=on with a jumper and leave it at that. But the indicator and the switch lingered.

Then we wished the Turbo button back when it was gone, because games and software built with Borland's Turbo Pascal (and maybe Turbo C) development kit would crash at start when the CPU was over 200 MHz. Eventually patchers to fix the software became available. It wasn't too common though, because for games Watcom C was king with DOS/4GW, a DOS extender: a minimal single-process 32-bit operating system running atop DOS that made 386 features available to games.
I hope you held onto rdr2. I'm rocking an i5 4690k with a 1660 ti and it runs pretty much stutter free now. It's still heavily bottlenecked in populated areas. But they've made some major improvements to rdr2 since this video.
@Dex4Sure Not sure about that, but I think I read somewhere that it will still go through the Rockstar launcher. Finally got mine working today; I just needed to update Windows and the graphics drivers. I've a Ryzen 2600 & RTX 2060. The benchmark gives me an average of 64 FPS at 1440p with everything set to high.
It is nice to see my 2600K back in the chart, and still surprising how well it is holding up, especially in the comparison with the 7600K on the frametime chart. I was expecting my chip to be all over the place; I had to double-check I was seeing it right, lol. It looks super smooth and consistent for such an old chip, especially in an apparently highly demanding modern game.
Sadly... yeah. Still can't play it nearly two days since it became available, and it burns even more considering I was foolish enough to pre-order so I could play it as soon as it was available. To say I'm pissed is an understatement.
@@earthvix I blame the launcher, not the game. And in essence I blame Steam for being money-grubbing assholes and forcing all these companies to make their own launchers.
This game runs decently on my FX-8350. I tried the BF5 trial earlier this week, and it stuttered like hell with a GTX 1050 Ti even on low. Such a dogshit game in general.
Got it to launch after 8 hours. Love the game. Beautiful. Crashes every time I go to valentine. It’s the city of death. 3700x. Asus x470. 2080ti. Crash boom bang! You are red dead playing.
TAA is fake antialiasing; it's really undersampling, rendering at a lower resolution, and it is always blurry, in some places more, in others less. The only true AA is MSAA. Use MSAAx2 or whatever you can, and turn off the shitty TAA and FXAA. The only bigger scam than TAA is the new DLSS.
"For those getting the launcher error: rename RDR2.exe to GTAV.exe and it supposedly bypasses the launcher." Worked for one of my friends. Edit: useless.
The thumbnail also has AMD holding the dual-barrel shotgun, showing that AMD has powerful multi-core performance and Intel has a revolver showing that Intel has powerful single-core performance.
It's the first time in 20 years of owning a PC that I've had to update my BIOS before a game would work. I'm on an X570 MEG Ace, and after trying all the fixes going around the internet without luck, updating to version 1.0.0.4 allowed me to load the game.
@@raresmacovei8382 For real, did they not see the i7-9700K being ass? The issue seems to be the lack of hyperthreading/SMT. The 4c/8t 2700 and 7700 show quad cores are still good.
The level of competence in these gaming companies has plummeted in recent years. I wonder why? How hard is it to port a game to PCs from consoles that are literally PCs with low specs?
@@RealHero101111 More like 2, because hell, it can't even handle the menu, haha. Let alone games with ultra settings; clearly RDR2 on console uses custom settings from low to medium, but textures are ultra.
@@rat341 At least it's quite stable around 30fps, while PCs with 4-core CPUs struggle with freezes. The game is simply made for more cores. Porting it isn't as easy as you think.
I found that limiting my FPS to a consistent level fixed all the stuttering and low-FPS issues. Personally I wanted 60fps at 1440p on Ultra, so I set a 60fps limit. Setting that limit helped avoid the really horrible fps dips and stutters! The game really needs a built-in FPS cap option!
Something I noticed, when you were referring to Intel processors and the extreme 0.1% lows, it wasn't the core count or speed that was the issue (even the 7600k had an average of 69 FPS and never would have hit the in game cap). The issue seemed to be related to lack of hyper threading. Every CPU on your 1080p list that was a One-Thread-Per-Core CPU had sub-20 FPS 0.1% lows, which is an interesting find. I would try a 9900k with hyper threading disabled to see if it mimics the 9700k. The game engine seems to be extremely limited in its ability to disperse workloads across cores, something that the hyper-threading technology seems to fix.
@@WhiteoutTech You are correct, but I'd still like to see the max-performance result of the 3900X, to see how close it comes to the 9900K also at its max (OC), because I suspect that a lot of the people who own or are considering such high-performance CPU+GPU combos and watch this kind of tech channel will settle for stock.
Glad I went AMD. My 2700X plays this so smoothly at ultra without a problem. Beautiful. No frame drops, no hiccups, no lag spikes or anything; it just works. That fine wine. My 5700XT shines so hard in this game. My 2700X at stock clocks with 3660 RAM CL15 and tightened timings gets 126 on medium. But above all it just plays so smooooooth.
Then you should overclock the 9900K and the other CPUs as well. A lot of work. P.S.: "Would" and "of" together is never grammatically correct. You wanted to say something like "would've liked to see".
Limited threads really become a huge problem in this game. I mean, just look at the 2600K absolutely destroying the 7600K; it doesn't even look like a fair comparison, like putting a Celeron up against an i7, except that the 7600K is a much newer CPU with a much higher clock speed and higher IPC as well. It's just that not having hyperthreading makes the difference between smooth 60+ fps gameplay and unplayable garbage.

Ryzen also did pretty well in this game, winning the 1% lows and 0.1% lows across the board. Overall, I would consider this an all-around AMD win, because they had a considerable advantage in frametimes against all Intel CPUs, ranging from a bit better to overwhelmingly faster, to the point where Ryzen performs greatly while its Intel counterpart can't even deliver a playable experience.

I was also impressed by the 9600K and 9700K doing so poorly; the 1600 and 1700 completely wiped the floor with them despite being much cheaper first-gen parts. Even more impressive, the 2600K also did much better than both of those CPUs, which doesn't make any sense considering the 9700K has 8 threads just like the 2600K, while the 2600K has only half the physical cores. It's as if SMT were an absolute must for this game for some weird reason. I mean, a 1700 beating a 9700K may make sense in Blender or Cinebench, but it's a small win even in those ideal conditions; here it was a night-and-day difference, and a 2600K destroying a 9700K is plain nonsense.
I think they will, because of the online mode. It's meant to give them a consistent money source just like GTA Online, but it hasn't gotten there yet. So they will probably fix it so players get invested in the online mode.
Yea... no. They wouldn't have spent all this time making a game with so many awesome graphics settings that runs both the DX12 and Vulkan APIs. They very clearly tried to give us a great PC version, much like GTA V, and on my end at least, they succeeded. It runs great with minimal issues, and I've been having a lot of fun playing around with the settings. That said, it's definitely got its issues for some, and they definitely need to be worked out, but this level of drama about it is more than a bit excessive imo.
No way I'm the first to say it, but I LOVE how you guys do the countdown graphics on your videos. The vertical bars in the charts. Of course, also a fan of this during your news segments. I appreciate the extra work you guys put in and I also love my shirt from your store. Keep up the good work!
I'm confused how this game prefers hyperthreading over real cores; a modern i7 with 8 real cores performs worse than an old i7 with 4 cores/8 threads. It must be the only game in history that does that.
No, GTA 5 did it too :) It's a glitch in their engine. My friend had to limit the fps to 100 through RivaTuner so it wouldn't stutter on a 6600K and 1080 Ti at 1080p.
Nice follow-up!!! It's really awesome how the Ryzen 5 3600 performs on the first chart (1080p medium): over 120 avg FPS, and higher 1% and 0.1% lows than an i9-9900K, even an overclocked one.
Back when I bought my i7-2600K, I was called a fool because an i5-2500K could do the same job; I was somehow just throwing my money away. Guess who's the fool now. At 4.7GHz, my GTX 1080 is the bottleneck at 1080p, let alone 2K and 4K. I need to upgrade my GPU yet again, it seems, lol.
Now we look smart for a hundred bucks more of an investment and some future-proofing. Mine is still running excellently and stable nine years later. Legendary Sandy Bridge 2600K!
@@SilkMilkJilk Lol, I know you're stupid, but don't make it public. Having an i7-2600K instead of a 2500K saved me a lot of money through the years; I basically only replaced my GPU every now and then. Now I'm on a 2080 Ti, and guess what, my CPU is still the same old 2600K from nearly 10 years ago. At 4K gaming my 2080 Ti is the bottleneck; want proof? See my uploaded content. If I'd bought anything else back then, I would have needed to upgrade it. So tell me again, who is the fool, my peanut-brain friend? :)
@@SilkMilkJilk If he hadn't got the 2600K, he would have had to upgrade his CPU years ago, as 4 cores/threads isn't enough. Instead, when he does finally upgrade, he can get a better CPU than the one he would have upgraded to years ago.
8:30 I find it interesting that the only outliers are the chips that lack hyperthreading, and EVERY one without it (in this list) is affected. If I had to put out an uneducated guess, it looks like the game is betting on having two threads per core and relying on both of them, and when it realizes that isn't the case, it needs to recalculate the frame. Again, totally baseless, but I found this interesting.
I got RDR2 to finally work on my R7 3700X by updating the BIOS on my Gigabyte X570 Aorus Master; the F10a update was available on their site but not via their software servers. I also updated Windows and the Nvidia drivers. Managed to average 61 FPS at 3440x1440 with my 3700X, 16GB 3600MHz CL14 RAM, and Aorus 1080 Ti at a 1987MHz core clock in Afterburner, in a custom loop. I only played for an hour, but had no issues with the benchmark or the first two story missions.
Try installing Process Lasso and using the CPU limit setting, with this rule: when CPU use hits 98% for a period of 1 second, reduce by 1 core for a period of 1 second. Source: tgrokz on Reddit.
You can fix the terrible frame stutter issue by running the game, alt-tabbing out, and opening Task Manager. Find RDR2.exe in the process list, right-click it, and set the CPU affinity so it only runs on two of your four cores. Once I did that, I haven't had a frame stutter since. I also maxed out the graphics, and it's wonderful at 1440p.
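Editor's note: the Task Manager steps above can also be scripted. Windows represents CPU affinity as a bitmask (bit N set = logical core N allowed), which is the same value `start /affinity <hexmask>` takes on the command line. A minimal sketch of building that mask (the helper name is mine, not from any comment here):

```python
def affinity_mask(cores):
    """Build a CPU affinity bitmask: bit N set means logical core N allowed.
    This is the mask Task Manager edits, and the hex value that
    `start "" /affinity <mask> RDR2.exe` expects on Windows."""
    mask = 0
    for core in cores:
        mask |= 1 << core
    return mask

# Restricting a process to cores 0 and 1 (as described above):
print(hex(affinity_mask([0, 1])))  # 0x3
```

So the two-core workaround described above corresponds to `start "" /affinity 3 RDR2.exe`.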
Works beautifully. B450 Aorus, NVIDIA GeForce RTX 2070 Super, Ryzen 5 3600X at 3.79GHz, 16 GB RAM. I had a hard time deciding because I still have my PS4 version. OMG, night-and-day visuals. I have just about everything on ultra except for things I didn't think would matter. My FPS ranges from 45 to 60 on a 1080p 50" TV via high-end HDMI. The detail, atmospheric lighting, and visual details are insane.
It might be useful to throw up a demonstration of frametime - have three smoothly spinning circles with a pattern on them - one at 16.67ms frame time, one at mostly 16.67ms frame time with some spikes to bring the average down, and one at 33.33ms frame time. It would very viscerally demonstrate why smooth frame times are more important than higher average framerate.
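Editor's note: the point above can also be made numerically. A rough sketch of how average FPS and the 1% / 0.1% "lows" used in the video can be computed from per-frame render times (one common method is averaging the worst N% of frametimes; GN's exact methodology may differ, and the function names are mine):

```python
def fps_stats(frametimes_ms):
    """Average FPS plus 1% / 0.1% 'lows' from per-frame render times (ms).
    Lows here: average the worst N% of frametimes, then invert to FPS."""
    worst_first = sorted(frametimes_ms, reverse=True)  # longest frames first
    avg_fps = 1000.0 * len(worst_first) / sum(worst_first)

    def low(pct):
        n = max(1, int(len(worst_first) * pct / 100.0))
        return 1000.0 * n / sum(worst_first[:n])

    return avg_fps, low(1.0), low(0.1)

# 99 smooth frames at 16.67 ms plus one 100 ms hitch: the average FPS
# barely moves, but the 1% low collapses -- exactly the 'spikes' problem.
avg, low1, low01 = fps_stats([16.67] * 99 + [100.0])
```

In that example the average stays near 57 FPS while the 1% low drops to 10 FPS, which is why a single hitch feels far worse than the average suggests.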
@@arvidbahrke9408 Same CPU; I found a workaround by messing with RDR2.exe process affinity. Setting it to 2-3 cores helps, but framerate suffers. No freezes tho.
Yep, my 4770K at 4.6GHz at 1440p gets a consistent 100fps on high settings. Ha, I love my Haswell, man. My buddy runs an 8700K and I get almost the same fps, maybe a 5fps or so difference.
This explains so much... All my friends with i5s and older CPUs have extremely choppy gameplay, while for us i7 players it's only the graphics cards that are sweating like crazy trying to run this, but it's working well (enough).
My i5-6600K is giving me freezing issues. I've at least managed to make the game more playable by using Process Lasso and adding a rule for RDR2.exe that limits CPU usage for 1 second when a core reaches 98%.
@@tomassimkus1905 I didn't even bother with Lasso. I just straight up used Task Manager to limit RDR2.exe with process affinity to only 2 of my 4 cores. It runs like butter on max settings now, no more stutter or freeze. Using an i5 7600u and a GTX 1080 Ti.
Other than trying to launch the game on day 1, I have had zero issues with RDR2.
Asus Strix Z-370-G
i7-9700K @ Turbo
MSI RTX 2080 Trio
HyperX Fury 2133@3000MHz
Thermaltake 750W Toughpower
The only issue I had day one was the game not launching due to HDR. I turned on HDR in Windows 10, boom, I got to the game menu, turned off HDR in RDR2's settings, exited, then turned off HDR in W10. Good to go. Zero issues with nearly 100 hours of game time. It was likely a driver issue, and not mobo-specific. Good day.
@Iama Dinosaur Yeah, well, 4-core i5s ran GTA 5 really well. Idk about RDR2; in theory it should do fine, but it's not looking good for us gamers with older systems.
Great comparison! Thanks! Do the same with Detroit: Become Human, please. That game also turns four-threaded and six-threaded processors into garbage; even eight-thread processors sometimes have lag and a low frame rate. You need to test in the third location of the game, which begins in the park and goes into the city. To unlock fps, you need to edit the GraphicOptions.JSON file, changing the value to "FRAME_RATE_LIMIT": 4
I've always avoided CPUs w/o hyperthreading/simultaneous multi-threading due to this very issue addressed in this video. You budget CPU owners can argue w/ me all you want, but the proof is in the pudding.
Good LORD, I'm glad I'm moving from my i5 6600 to an R7 3700X. I've had atrocious frametimes in almost every game overall, but RDR2 sounds straight-up unplayable.
You mean optimization. Rockstar has and always will be miles ahead of Bethesda in terms of programming. Bethesda will always be limited by that broken and buggy creation engine that they can't seem to let go of.
You said you'd talk about the DX12 vs Vulkan thing in this video, but you didn't. Based on early reports for the game, that was the issue with the freezing on the Intel chips. That's a large oversight in a video that goes so hard on the frametimes of those Intel chips.
To the whole GN crew: how do you guys wrap your heads around all these bizarre results? Currently my system is a Ryzen 5 2600 on an Asus ROG Strix B350-F with an EVGA RTX 2080 XC Ultra (thiccboi). I set my settings to your high recommendation and found them perfectly satisfactory. According to the built-in benchmark, my results were as follows: Max 102, Min 38, Avg 72. Very respectable considering how demanding (or unoptimized) RDR2 is. Thank you for what you do!
Watch our RDR2 GPU benchmark here! ruclips.net/video/yD3xd1qGcfo/видео.html - we have a graphics explanation guide coming up for the game next!
To further elaborate on the 9700K: It's not a "framerate cap" here, but an engine FPS threshold where performance tanks if insufficient threads are present, but performance is still high. Both a thread deficit and high framerate are needed to encounter this, hence the uniqueness of the bug. Because high perf & medium thread count is an uncommon combination (the 9700K is among the only ones), the bug is more interesting than most. There's a limitation of threads that affects it. That's why the 8T older CPUs are fine, but the 9700K isn't -- those aren't fast enough to hit the engine threshold. This is a two-part issue. GTA V had that issue w/ 6T CPUs, but not 12T CPUs (8700K).
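Editor's note: the two-part trigger GN describes above can be captured in a couple of lines. The thresholds below are illustrative placeholders of my own, not measured values from the video:

```python
def triggers_frametime_bug(logical_threads, avg_fps,
                           thread_floor=12, fps_ceiling=100.0):
    """Sketch of the two-part condition described above: the stutter needs
    BOTH a thread deficit AND a framerate high enough to cross the engine's
    threshold. thread_floor and fps_ceiling are illustrative guesses."""
    return logical_threads < thread_floor and avg_fps > fps_ceiling

# 9700K-like case (8 threads, fast): both conditions met -> bug appears.
# Older 8-thread CPUs: too slow to reach the threshold -> no bug.
# 8700K-like case (12 threads): no thread deficit -> no bug.
```

This is why only the "fast but thread-limited" combination is affected, matching the GTA V behavior on 6T vs 12T CPUs.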
Support our all-nighter efforts like this one via the GN store, where we're planting 10 trees per item ordered through Nov: store.gamersnexus.net/ or Patreon: www.patreon.com/gamersnexus
Holy crap, can't believe the 3900X did so badly compared to the 9900K! Man, I was all for AMD, but after seeing this, Intel is wayyyyyy better.
I'm here soooo early :D
@@christiancepeda5457 Are you looking at the same charts we are? They're not that far apart.
@@GamersNexus Mind throwing in a bench with OC'd / tuned RAM and Infinity Fabric?
@Gamers Nexus Just throwing an idea out there: What power supply are you using and did you use the same power supply with both boards? If the irregular performance is seemingly sporadic across manufacturers, model numbers, etc, but seems to be somewhat correlation with bios and CPU, could it be that one board uses less power or BIOS implementation is a little more efficient in power usage during moments of power draw from the CPU, or probably more so if you guys are using a PSU with less headroom between PSU cap and perceived system draw, in supplying adequate power to CPU demand and within a certain game play sessions or scenes during game play that the inefficiency in voltage management from BIOS firmware applied to CPU, or inability to provide adequate power, etc, could be causing issues? Like, do you think it's like tolerance stacking? Probably sounds stupid...is that applicable to PCs in this situation (or any, even)? Just throwing it out there since this was a cause of problems with a engineering design I was working on a moment and 2 cups of coffee ago.
CPU Benchmark: Thread Dead Redemption 2
Best comment.
Upgrade BIOS edition
Red Dead...more like Green Dead, because Intel performance is..dead.
@@CaveyMoth Terrible...... just terrible
@@DestroyerofBubbles I know, those Intel CPUs are just terrible right now.
Translation of game milestones
"Official release": alpha
30% off for the first time: beta
50% off and all DLCs finally released & playable: official release
Especially true for Rockstar!
amorettique nova It’s sad how true you are
yeah very true xD which means like 2 years after console launch xD
But the preorder is already at a discount. When you preorder on the Rockstar Launcher, the lowest edition you can buy is the Special Edition at the price of the Standard Edition. The same goes for the Deluxe Edition, which is geared towards online play; it comes at the price of the Special Edition.
*this translation does not apply to fallout 76 or Bethesda games in general.
When the console port is so bad, you need to swap motherboards...
It's not bad for everyone. I don't have any stuttering, freezing, or other performance issues with an i7-7700K and RX 5700XT. I just run the game at ultra/high settings with a constant 60fps lock.
@@leandrotijink 1080p or 1440p?
All these people talking about their GPUs and CPUs... under a post about the fact the game refuses to launch on some motherboards at all.
Very on topic: "haha, works for me so it's good".
@@leandrotijink You are a special kind of idiot!!! It's BAD even if 10% of the hardware can't run it; not because the hardware is poor or old, but because the port is ass.
It's not a port...
I wish to officially apologize to all those who had or have issues starting this game. I'm sure it's my fault. I almost never buy single player games, I rarely spend $60 on any game, and I had never purchased a Rockstar game, so I'm sure my desire to try this game jinxed everyone.
The steps I've tried to get this to run include updating GPU drivers, updating Windows, adding exclusions to my anti-virus, disabling anti-virus, disabling Windows Defender, disabling the firewall, disabling full-screen optimizations, running as administrator, rebooting Windows, deleting the Rockstar local profile, verifying file integrity, uninstalling the game, re-downloading the game, installing the new Rockstar 2GB patch, and waving chicken bones in a figure-eight pattern over my computer.
I feel like this is gonna be one of those 1k likes comments hahah
Try deleting your red dead 2 folder in Documents > Rockstar and then don't change any settings when you're in game. That worked for me-ish(still crashes sometimes)
I tried many things, but what did the trick was updating the motherboard BIOS... bingo!
I never thought I'd live to see the day where "But can it run Crysis?" becomes "But can it launch RDR2?".
Crysis was unoptimized shit too.
ravenwda007 no, it had 2012 graphics in 2007
Nurudin Imširović it’s graphics were way ahead of its time, that doesn’t mean its unoptimized
@Nurudin Imširović Then explain why it runs buttery smooth now.
@@unnamed715 Because it's not optimized for today's hardware. Back in 2007 the general idea was that CPUs would reach much higher clocks within a few years, going well above 5 GHz, with some predicting 10 GHz in less than a decade. Well, they were wrong: CPUs got more cores instead, while frequency topped out around 5 GHz. What you people fail to understand is how FAR ahead Crysis was compared to any other game. It literally took 5 years for any game to come even close to Crysis's graphics. Here is another counterexample to the "unoptimized" claim: WoW is a game that was always well optimized, because it has to run on pretty much any system; it's an MMORPG. How come WotLK, which is a decade old, runs worse and looks significantly worse than BFA today on new hardware? Do you want to say that WotLK was badly optimized? Really? No, it was optimized well; it was just optimized for the hardware of its time. That's why all older games benefit more from high-clocked CPUs than from CPUs with more cores. It was just a different time and a different idea of how technology would progress. Crysis was simply a LOT ahead of anything else. Red Dead 2 looks great, but it's not a graphical achievement on that level; in fact, some games look better even today. For Crysis, it took half a decade for games to even reach the same rank.
I just don't understand how a massive company like Rockstar has a full YEAR to optimize the game for PC, and yet they release it in such a broken state. I just don't understand how the game passed testing.
Probably just sat on their desk for a year, so they could double dip.
Easy. The game is built on a mountain of legacy spaghetti code. It must have taken a year just to get the thing to somewhat work. Mountains of code plus very high turnover for employees, and basically nobody really knows how that thing runs anymore. In a few decades we'll have tech priests like the ones in Foundation.
If you've ever modded GTA v you will realize how janky it is. It uses vehicle handling tuning almost like the GTA 3 days...
@@WandererOfWorlds0 You can't just use the same excuse that RDR 1 had with no source for these claims. If you have one, I'd love to see it, but this just sounds like a mountain of bull otherwise. GTA V had none of these issues at launch, and it too uses RAGE; this is almost certainly merely an updated version of that. RDR 1 got its spaghetti-code label because it was using an in-between version of RAGE with on-the-fly edits and changes that went largely undocumented, mostly due to badly managed development. The likelihood they used/updated that version of RAGE's codebase is laughably low.
This is likely just a perfect storm of pushing settings too high for most users to understand or their PC's to handle, launching a new launcher without enough testing, and using new API's without testing them as thoroughly on a wide range of new and old hardware configs as they should have.
For many, including myself, this game runs fantastic and is everything I could have wanted and more. But there is still no denying that for some it's quite the opposite.
because it runs on emulator 😂
This is why I never pre order, or go for a day 1 release. I refuse to pay almost $100 AUD to be a beta tester.
This and also that the Devs lie to sell their game (No man's sky, watch dogs etc)
Then pay $68 USD instead, that way, you don't have to pay GST
@@TheHalfGlassFullGuy Too bad $68USD IS $100AUD without the GST. 😂🤣 I'd rather fund our government, no matter how crappy it currently is. 🤷
Smart move. I have not been able to get it to even launch let alone get it to play smoothly. I won't be making that mistake again.
@@MafiaboysWorld That's the joke though, you're giving more money to the company, "But at least you aren't paying GST"
2019/2020 important things when you buy a new gaming PC: check if the motherboard is compatible with the game you're playing
Emre Ercan :))) like rdr2 epic rockstar
What ? Pls explain. I didn't watch whole video.
LOL!!!
motherboard compatibility has nothing to do with it. I have tested RDR2 with 6 different brands of motherboard (over 40 motherboards) and it runs perfectly fine with every single one. I can honestly say that RDR2 runs perfectly fine if you do not feed it a potato for a PC.
@@iikraze9864 are you stupid or something??? We know that.... Im just joking
And did you even watch JayzTwoCents's video????
Boy, 2019 isn't disappointing; damn near every game released is broken, shitty, or unplayable. It's amazing. Except for The Outer Worlds; at least that game is fun, works, and isn't microtransaction-infested.
Except Outer Worlds on PC can't be played at over 60fps without hitching.
@@Owlero I didn't have any hitching at 1440 75fps so I guess it can 🤷♂️
what are you talking about. lol.
outer worlds sucks. it was way overhyped. Red dead is much better
T Hunt Red dead redemption is pretty good but it is over 10 years old
When a sandy bridge i7 provides a better experience than a modern i7
Many Sandy Bridge i7s can still offer respectable performance when you are not pushing top framerates.
On paper this is crazy, but it's the engine's weird handling of very high framerates that is to blame here.
I'm so tired i read that as 'Candy Bridge' and got really excited for a second.
maybe R* dev still using sandy bridge i7 so that was their optimization goal
Sandy bridge kicks azz, man!
Why does the thermal grizzly packaging look like a workout supplement
looks like a candy wrapper to me.
It's a supplement for your pc
DO NOT EAT LIQUID METAL.
@@GamersNexus How about drinking?
@@GamersNexus too late
For a game that had a whole year to be ported to PC, well... tf.
That is where their problems started.
@Him If only we lived in a perfect world. But the sad thing is, that's their bread and butter. Most console gamers; if not all, don't care much about ultra/high graphics and higher frame rates. They just want to play the game. Not to mention the refund policy is locked tight. Most console gamers prefer physical copies, and with physical copies, once you open it you own it forever unless you sell it to third party.
@Him Consoles are more popular and accessible.
@Gaius Julius Caesar It will definitely sell WAY more on console. PC is better, but people think that because PC is better, it is more popular.
@Him there is a reason why rockstar games prioritize consoles over PC
The steam version of gta V sold 11mill while the consoles combined sold nearly 90 MIL
Not to mention it's less complicated to code on consoles
And it is sold at a higher price
Getting this one a year later, like all major AAA releases = peace of mind.
So you will play it 2 years after the initial release.
r/patientgamers
Always playing older games is awesome; it means you get everything cheaper and mostly more functional and complete. So yeah, I'll be playing this in like two years.
@@0FFICERPROBLEM
+ mods, overhauls and total conversions^^
@@BeatmasterAC Absolutely!
0FFICERPROBLEM r/bigchungus
For the first time ever, playing this game has made me realize how consistent frames, even at a constant 30 fps, are better than spikes and valleys of FPS, such as 80 fps to 30 fps and back again to 80 and just all over the place. At 30 fps, I'm genuinely enjoying the game with near-4K visuals through resolution scaling on my 1080p 24" monitor.
Me: About to sleep
GN: Releases this
Me: I don’t need any sleep
its 6am still no sleep just posted this !
@@minimaluser2132 But not necessarily achievable
Right?! #TeamNoSleep
You: "All right, about time I head to bed."
Also you: "Is that the fucking sun?"
'The game is stuttering and freezing because it is running at too high a frame rate'. 1st world PC problems.
Steve, you should have added Rockstar's recommended PC spec, i.e. a 1500X and 1060, just for the lulz xd
The recommended is almost always for console quality, so like 30FPS 1080p high settings. not really acceptable for PC. You can get games looking pretty damn good on weak hardware if you're willing to run at 30FPS like a console.
@@deathrager2404 I'm getting well over 70 fps on high settings 1440p, with a 3600 and a 2080. really not that bad for a current gen game at launch
@@BrianCroweAcolyte not?
There's a Frondtech article that says an i3 with a GTX 1050 Ti can run this game just fine in 50-60fps, as long as the graphics settings are correct. And other people also noticed that it's certain settings that drain FPS at a ridiculous amount, and putting them on medium doesn't impact visuals too much, but does make the problem go away. Specifically those settings are
Water Quality : Medium
Near Volumetric Resolution : Medium
Far Volumetric Resolution : Medium
Volumetric Lighting Quality : Medium
For the slowest system's you'll obviously want to adjust other settings as well. But for "mid-tier" systems that can't seem to run the game well, start with the above and see if that's enough. There's a few articles detailing good gfx settings profiles for the game.
@@Leo-oc8up Eh, I wouldn't necessarily classify RDR2 as a "current gen game", sure, the graphics are EXTREMELY impressive, what with all the super detailed visuals, but it's also a game that was released a year earlier and in development for even longer, so honestly this performance is still not very justifiable, especially when factoring in the technical issues outside of just low framerate.
As always, stupidly in-depth and so much information to help everyone check their specs specifically for this game. Thumbs up.
Sponsorship/ad that actually makes sense in the context of the video and channel.
It's so rare these days to find people who simply want to provide high quality content. Thank you!
FOR THE INTEL BUG:
It is a problem with the engine related to high FPS. You actually have to limit your FPS with something like RTSS, and once you get the limit right you will eliminate the horrible 1% lows.
There was also an issue with the pause menu that is fixed in the same way.
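The external cap these comments describe can be illustrated with a bare-bones frame-pacing loop: measure how long each frame took, then sleep out the rest of the budget. This is only a sketch of the idea, not how RTSS actually hooks a game, and the 60 fps target is an assumed value:

```python
import time

TARGET_FPS = 60                      # assumed cap, like the RTSS limit mentioned above
FRAME_BUDGET = 1.0 / TARGET_FPS      # ~16.67 ms per frame

def limited_frames(n_frames, render=lambda: None):
    """Render n_frames, sleeping out the remainder of each frame's budget."""
    start = time.perf_counter()
    for _ in range(n_frames):
        frame_start = time.perf_counter()
        render()                                 # stand-in for the game's frame work
        elapsed = time.perf_counter() - frame_start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)   # pace the frame to the cap
    return time.perf_counter() - start

# 30 capped frames should take at least 30 * 16.67 ms, i.e. roughly half a second
total = limited_frames(30)
```

The point of the cap in RDR2's case is the inverse of the usual one: it keeps the engine below the framerate threshold where its scheduling bug kicks in, rather than saving GPU power.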
It's a problem with their engine.
Yeah dude. Watch the video. You're literally just saying what we already said. We're the ones who DISCOVERED that issue with GTA V. ruclips.net/video/fSlQL_iqGFg/видео.html
I mean this helped me with finding the fix I hadn’t watched the other video
Ah, Rockstar...you never fail to disappoint.
never fail to surprise us gamers
They never disappoint at fail XD
this reminds me that it wasn't a mistake to stop buying their games, after GTA San Andreas.
INB4 DRM is causing massive performance hits.
DRM is smashing performance
Honestly, i genuinely think thats the culprit. Denuvo is known to be a fuckin CPU hog.
2019: the year the TURBO button was re-introduced due to developers being shitty at developing
For those who don't know, the TURBO button did the opposite of what was expected and slowed down CPU's to play older games that based timing off CPU frequency
The Turbo button worked exactly how it was expected to work. When the Turbo light is on, the mainboard is operating at nominal speed; when it's off, the board is operating at a reduced clock speed. The button was usually not very descriptive in its state; you had to go by the indicator. Some PC cases instead had a 7-segment-type display rather than a single LED, which manufacturers could order with a speed printed on it matching the board, or simply saying Hi/Lo, so there was no confusion as to what speed your computer was running at.
I experienced the tail end of the Turbo era myself where the indicator was usually 33/66 or Hi/Lo and the Turbo button lost all original point - since it was no longer possible to run 4MHz games anyway, and all newer games were flex-speed, it would have just been better to hardwire Turbo=on with a jumper and leave it at that. But the indicator and the switch lingered.
But then we wished the Turbo button back when it was gone, because games and software made in Borland's Turbo Pascal (and maybe Turbo C) software development kit would crash at start when the CPU was over 200 MHz. But eventually patchers to fix the software became available. It wasn't too common though, because for games Watcom C was king with its DOS/4GW which was a DOS extender, a minimal single-process 32-bit operating system that ran atop DOS, that made 386 features available to games.
@@SianaGearz 90% of people reading this will be clueless... those were the days: DOS boot diskettes and himem.sys
@@broken1965 edit autoexec.bat was my best friend,lol
@@broken1965
the days when Sound Blasters were always configured at DMA Channel 3 and Interrupt 5 XD
This explains why I get stuttering on my i5 7600k. I appreciate your work on this. Think I’ll be asking for a refund.
raise graphics settings it seems...not lower them as would be the intuitive thing to do.
I hope you held onto rdr2. I'm rocking an i5 4690k with a 1660 ti and it runs pretty much stutter free now. It's still heavily bottlenecked in populated areas. But they've made some major improvements to rdr2 since this video.
@Curtis Riceman Maybe he meant refund for the game:)
I swear I heard you say "Thread Dead 2" at around 0:25
I like this.
Should be a new series for this channel.
You hear right 😁
Ouch, that 9700K vs 8700K 0.1% low number is just painful... Happy I'm upgrading my 6600K to a 3700X this week ;p
7:07 Moment of silence for the people who bought 7600k in 2017 instead of ryzen 1600.
Right? And I thought I had it bad when I bit on a 7700K. I sold that PC though and went to a Ryzen 7 3700X.
*throws a fit in 7600K*
Seems to me that CPUs without hyperthreading are performing badly: 7600K, 9600K, 9700K. I think it's a bug. A 2600K can't outperform a 7600K.
T_T
@@ramikol38 It certainly is an issue on Rockstar's end.
You guys have hands down the best testing methodology out there.
Imagine they didn't make a launcher and just used Steam...
Hmpppf, most likely a total catastrophe...
Hmpppf Shame it’s not on GOG.
And not make as much money? Why would they do that?
Will be available on Steam in December..
@Dex4Sure Not sure about that.. But I think i read somewhere that it will still go through the Rockstar launcher though..
Finally got mine working today.. Just needed to update Windows & graphic drivers..
I've a Ryzen 2600 & RTX 2060. The benchmark gives me an average of 64 FPS at 1440p with everything set to high.
It has to be an R* glitch that shows me being the first view.
I'm surprised how far the engine has come, while dragging its feet.
2600k: Ha, gotcha i5s
2nd gen i7 still kicking asses
5ghz 2600k reporting in.
Older high-end CPU's are really pulling their weight in this game, I'm running a 4930k and it's doing great.
*2500K emerges from the shadows*
i had a 2600 until a month ago, they're still holding up fairly well
I guess I'll be waiting for this one to get discounted. And fixed.
Buzz K prepare to wait forever then. They won’t address this issue even if it was in there face.
They're never gonna fix it, so don't wait for it.....
@@shadow59555 their*
Iama Dinosaur steam won’t make a difference. You’re just adding an extra step to start the game. It’ll still go through their launcher.
You'll be waiting for a discount forever then.
Thumbs up for including the 2600k :)
It is nice to see my 2600K back in the chart, and still surprising how well it is holding up, especially in the comparison to the 7600K on the frametime chart. I was expecting my chip to be all over the place; I had to double check I was seeing it right lol. Looks super smooth and consistent for such an old chip, especially in an apparently highly demanding modern game.
With this game launch, R* is becoming more like Bethesda
Sadly... yeah. Still can't play it nearly two days after it became available, and it burns even more considering I was foolish enough to pre-order so I could play it as soon as it was available.
To say I'm pissed is an understatement
Yeah, but all game launches suck in today's gaming society; plus at least this game doesn't look like a potato like every Bethesda game lol
@@trevdev7130 probably. I guess they didn't test it enough
@@earthvix I blame the launcher not the game which in essence I blame steam for being money grubbing assholes forcing all these companies to make their own launchers
Rockstar launches have never been smooth on PC (at least not since GTA: San Andreas).
Let's get serious, where my Fx 8350 at?
It's great as an oven benchmark.
This game runs decently on my FX-8350. I tried the BF5 trial earlier this week, and it stuttered like hell with a GTX 1050 Ti even on low. Such a dogshit game in general.
considering the FX-6300 is the min spec for this title *apparently*
serious. like with a game thats released 2 years earlier.
. . . or 4 years earlier
or 6 years earlier . . . or famous 8 years earlier
@Dex4Sure I only played it for The Last Tiger, deleted it afterwards.
19:07 "older cpu" i5 7600k *Launch Date Q1'17*
time flies... i use g4560 and i just want it to give me 30 fps in upcoming cyberpunk 2077
Feels a lot older! Good point.
petar sebic cyberpunk wont work with that bro
@@GhettoPCbuilds we will find out next year
petar sebic Cyberpunk is 8 core minimum. It’s a end-gen title for PS4/X1. Also part of new gen.
These frametimes will haunt me in my nightmares
Got it to launch after 8 hours. Love the game. Beautiful. Crashes every time I go to valentine. It’s the city of death. 3700x. Asus x470. 2080ti. Crash boom bang! You are red dead playing.
My 6700k seems to be doing ok.
TAA is awful, it's like motion blur that isn't tied to motion, it's just there all the time.
my i5 7600k has the spikes from the video. I could prevent them (while tanking the FPS) by disabling affinity for 2 cores in Task Manager
there is a setting for taa in advanced that may help
Try playing no antialiasing - actually looks pretty good in this game. I am at 1440p however might not be as good for 1080p
Taa is fake antialiasing, actually downsampling, rendering on a lower resolution, and it is always blurry, somewhere more, somewhere less.
The only true AA is the MSAA. Use MSAAx2 or whatever you can, turn off the shitty TAA and FXAA.
The only bigger scam than TAA is the new DLSS.
"For those who are getting the launcher error, rename RDR2.exe to GTAV.exe and it supposedly bypasses the launcher" worked for one of my friends
edit: useless
that's the sketchiest fix I've heard in my life. Sounds like rockstar
Does this really work? That would be really funny
Yeah saw this on reddit lol
But gets stuck on infinite loading on launching story mode
@@popifrex1993 lol
I love how the thumbnail has both companies pointing guns at Steve
The thumbnail also has AMD holding the dual-barrel shotgun, showing that AMD has powerful multi-core performance and Intel has a revolver showing that Intel has powerful single-core performance.
@@MyUnknown54321 That's deep
It's the first time in 20 years of owning a PC that I have had to update my BIOS before a game will work. I'm on an X570 MEG ACE, and after trying all the fixes going around on the internet without luck, updating to version 1.0.0.4 allowed me to load the game.
WoW Steve (GN) you have 686K subscribers now when I first subscribed you only had 60k. Great channel and really good content.
Wow. Just, wow. I never thought I'd see the day. Quad Core is OUT!
Octocore Gaming is here.
Bud The Cyborg It’s not. RDR2 just wants quad cores with HT
@@raresmacovei8382 for real, did they not see to the i7-9700k being ass?
The issue seems to be the lack of hyperthreading/SMT. The 4c/8t 2700 and 7700 show quad cores are still good
quad core isn't dead because of one game. I'm pretty sure quad core cpus are still most popular right now.
@@KPalmTheWise The i7 9700K was not being ass, if you've watched the video. It's too powerful, and that results in an annoying bug in the RAGE engine.
The level of competence in these gaming companies has plummeted a lot in recent years. I wonder why?
How hard is it to port a game to PCs from consoles that are literally PCs with low specs?
consoles have cpus with 8 cores.
@@RealHero101111 more like 2 because hell, it cant even handle the menu hahah letalone games with ultra settings because clearly rdr2 on console is custom settings from low to medium but textures are ultra.
@@rat341 at least it's quite stable around 30fps, while PCs with 4-core CPUs struggle with freezes. The game is simply made for more cores. Porting it isn't as easy as you think.
I'm getting gta 4 console port flashbacks
I found limiting my FPS to a consistent level fixed all the stuttering and low FPS issues. Personally I wanted 60fps at 1440p on Ultra so set a 60fps limit. Setting that limit helped avoid the really horrible fps dips and stutters! The game really needs a FPS cap option built in!
Something I noticed, when you were referring to Intel processors and the extreme 0.1% lows, it wasn't the core count or speed that was the issue (even the 7600k had an average of 69 FPS and never would have hit the in game cap). The issue seemed to be related to lack of hyper threading. Every CPU on your 1080p list that was a One-Thread-Per-Core CPU had sub-20 FPS 0.1% lows, which is an interesting find. I would try a 9900k with hyper threading disabled to see if it mimics the 9700k. The game engine seems to be extremely limited in its ability to disperse workloads across cores, something that the hyper-threading technology seems to fix.
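The pattern this comment describes (every one-thread-per-core chip showing sub-20 FPS 0.1% lows) reduces to a simple check: SMT/Hyper-Threading means the OS sees more logical CPUs than physical cores. A minimal sketch using the published core/thread counts of the chips discussed in these comments; `has_smt` is a hypothetical helper for illustration, not anything from the game or the video:

```python
def has_smt(physical_cores: int, logical_cpus: int) -> bool:
    """True when the OS sees more hardware threads than physical cores."""
    return logical_cpus > physical_cores

# (physical cores, logical threads) per published spec for chips in the charts
cpus = {
    "i7-2600K": (4, 8),    # HT on: smooth frametimes despite its age
    "i5-7600K": (4, 4),    # no HT: severe 0.1% lows
    "i7-9700K": (8, 8),    # no HT: stutters despite 8 fast cores
    "i9-9900K": (8, 16),   # HT on: smooth
}
smt = {name: has_smt(cores, threads) for name, (cores, threads) in cpus.items()}
```

Every chip in the charts where `smt` comes out False is one of the outliers, which is the commenter's point: the split tracks SMT presence, not core count or clock speed.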
Would have like to of seen a 3900x overclock result similar to other games you have tested, maybe even SMT on vs off. Great effort nonetheless.
well, the engine has problems at high framerates if you have HT/SMT disabled, no matter how many CPU cores you have.
@@WhiteoutTech You are correct but I'd still like to see the max performance result of the 3900x to see how close it comes to the 9900k also at its max (oc) because I suspect that alot of people that own or are considering such high performance cpu+gpu combos that watch this kind of tech channel will settle for stock.
Glad I went AMD. My 2700X plays this so smoothly at ultra without a problem. Beautiful. No frame drops, no hiccups, no lag spikes or anything.
It just works. That fine wine. My 5700xt man, it shines so hard in this game. That fine wine man.
My 2700x stock clocks with 3660 ram cl15 with tightened timings gets 126 on medium.
But above all it just plays so smooooooth.
Then you should overclock the 9900K as well and the other cpus. A lot of work. P.S.: "Would" and "of" connected in a sentence is never grammatically correct. You wanted to say something like: "Would've liked to see".
Limited threads really become a huge problem in this game. I mean, just look at the 2600k absolutely destroying the 7600k; it doesn't even look like a fair comparison, like putting a Celeron up against an i7, except that the 7600k is a much newer CPU with a much higher clock speed and higher IPC as well. It's just that not having hyperthreading makes the difference between smooth 60+ fps gameplay and unplayable garbage.
Ryzen also did pretty well in this game, winning at the 1% lows and 0.1% lows across the board. Overall, I would consider this an all around AMD win because they had a considerable advantage at the frametimes against all Intel CPUs, which ranged from being a bit better to being overwhelmingly faster to the point Ryzen performs greatly while their Intel counterpart can't even deliver a playable experience.
I was also impressed by the 9600k and 9700k doing so poorly; the 1600 and 1700 completely wiped the floor with them despite being much cheaper first-gen parts. Even more impressive, the 2600k also did much better than both of those CPUs, which doesn't make any sense considering the 9700k has 8 threads just like the 2600k, even though the 2600k has only half the physical cores. It's as if SMT were an absolute must for this game for some weird reason. I mean, a 1700 winning against a 9700k may make sense in Blender or Cinebench, but it's a small win even in those ideal conditions; here it was a day-and-night difference, and a 2600k destroying a 9700k is plain nonsense.
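The 1% and 0.1% low figures these comparisons lean on can be reproduced from raw frame times: sort the frames slowest-first, average the worst 1% (or 0.1%), and convert back to FPS. A sketch with made-up frame-time data; this follows the common benchmarking convention, though GN's exact pipeline may differ:

```python
def percentile_low_fps(frametimes_ms, fraction):
    """Average FPS over the slowest `fraction` of frames (0.01 -> 1% lows)."""
    worst = sorted(frametimes_ms, reverse=True)   # slowest frames first
    n = max(1, int(len(worst) * fraction))        # at least one frame
    avg_ms = sum(worst[:n]) / n
    return 1000.0 / avg_ms

# Synthetic run: mostly 10 ms frames (100 FPS) with ten 100 ms hitches mixed in
frames = [10.0] * 990 + [100.0] * 10
avg_fps = 1000.0 / (sum(frames) / len(frames))    # still ~92: looks fine on paper
low_1pct = percentile_low_fps(frames, 0.01)       # 10 FPS: the stutter shows up here
low_01pct = percentile_low_fps(frames, 0.001)
```

This is exactly why the 9700K looks fine by average FPS yet terrible in the charts: a handful of huge frame-time spikes barely move the average but dominate the 0.1% lows.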
The sad fact is that I don't expect R* to fix the game, let alone improve the optimization.
They don't give a fck about how the game performs
Honestly, if they had done a good job I would have bought the game for sure. But if they do not fix this I definitely won't.
@@deathrager2404 but rockstar games are the only games I look forward to in a PC. Nothing compares to GTA Series
I think they will, because of the online mode. It's meant to give them a consistent money source just like GTA Online, but it hasn't gotten there yet. So they will probably fix it so players get invested in the online mode.
Yea...no. They wouldn't have spent all this time making a game with so many awesome graphics settings that runs both DX12 and Vulkan API's. They very clearly tried to give us a great PC version, much like GTA V, and on my end at least, they succeeded. It runs great with minimal issues and I've been having a lot of fun playing around with the settings.
That said, it's definitely got its issues for some, and they definitely need to be worked out, but this level of drama about it is more than a bit excessive imo.
@@aneeshprasobhan You must not play that many good games on PC if that's what you look forward to playing on PC. Indy games are king.
Glad I bought the i9 9900KS. Cant go wrong with that here. Also have an MSI MEG Godlike z390 motherboard.
No way I'm the first to say it, but I LOVE how you guys do the countdown graphics on your videos. The vertical bars in the charts. Of course, also a fan of this during your news segments. I appreciate the extra work you guys put in and I also love my shirt from your store. Keep up the good work!
R* and PC have always been an issue.
Recommended Specs: Processor: Intel® Core™ i7-4770K / AMD Ryzen 5 1500X
Both have hyperthreading, so no freezing.
I'm confused how this game prefers hyperthreading over real cores; a modern i7 with 8 real cores performs worse than an old i7 with 4 cores / 8 threads. It must be the only game in history that does that
No, GTA 5 did it too :) It's a glitch in their engine.
My friend had to limit the FPS to 100 through Riva Tuner so it wouldn't stutter on a 6600K and 1080 Ti at 1080p.
2600k beating 7600k at stock? :D
Rockstar going full Bethesda.
Pienimusta Problem with non-HT CPUs, all over the place
designamk1 Then you must've been a very lucky one
@@designamk1160 Or might I say Steve is the less lucky guy? What do you want?
@@designamk1160 They used a 2080 Ti, that's why; you don't understand what's going on.
Nice follow-up!!! It's really awesome how the Ryzen 5 3600 performs on the first chart (1080p - medium): over 120 avg FPS and higher 1% and 0.1% lows than an i9 9900K, even an OC'd one.
Thanks for the inclusion of the 2600K
Back when I bought my i7 2600K, I was called a fool because an i5 2500K could do the same job; I was somehow just throwing my money away. Guess who's the fool now.
At 4.7ghz my gtx 1080 is the bottleneck on 1080p let alone 2k and 4k. I need to upgrade my gpu yet again it seems lol
Now we look smart for a hundred bucks more of an investment and some future-proofing. Mine is still running excellent and stable 9 years later. Legendary Sandy Bridge 2600K!
@@SilkMilkJilk lol I know you're stupid, but don't make it public. Having an i7 2600k instead of a 2500k saved me a lot of money through the years; I only basically replaced my GPU every now and then. Now I'm on a 2080ti and guess what, my CPU is still the same old 2600k from nearly 10 years ago. At 4K gaming my 2080ti is the bottleneck; want proof? See my uploaded content. If I had bought anything else back then I would have needed to upgrade it, so tell me again, who is the fool? My peanut-brain friend :)
@@SilkMilkJilk you're an idiot and a waste, and now you don't even understand what you wrote yourself, braindead much..
@@SilkMilkJilk if he hadn't got the 2600k he would have had to upgrade his cpu years ago as 4 core/threads isn't enough... instead when he does finally upgrade he can get a better cpu than one he would have upgraded to years ago.
@@SilkMilkJilk So you think the guy just left his PC stored in his attic? Wow
The best reason ever to pass on a game: my brand new beast of a mobo won't run it LOL
New meme: "But can your motherboard launch Rockstar games?"
So why did you buy a crappy mobo? Didn’t you do any research first?
@@Traitorman..Proverbs26.11 I would not call the Aorus X570 Master a crappy motherboard. But hey, that's me.
@@Traitorman..Proverbs26.11 like your motherboard is any better ya fuckin muppet
David Shaun
It can run RDR2 and still give an average fps of 68.
So....
8:30 I find it interesting that the only outliers are the chips that lack hyperthreading, and EVERY one without it (in this list) is affected. If I had to put out an uneducated guess, it looks like the game is betting on having two threads per core and relying on both of them. When it realizes that isn't the case, it needs to recalculate the frame. Again, totally baseless, but I found this interesting.
I got RDR2 to finally work on my R7 3700X by updating my motherboard BIOS for the Gigabyte X570 Aorus Master; the F10a update was available on their site but not via their software servers. I also updated Windows and the Nvidia drivers.
Managed to average 61 FPS at 3440x1440 with my 3700X, 16GB 3600MHz CL14 RAM and an Aorus 1080 Ti at a 1987MHz core clock in Afterburner, in a custom loop. I only played for an hour but had no issues with the benchmark or the first 2 story missions.
Really appreciate that you included 2600k in the lineup. Its a great baseline as it was so popular.
7600K owner here, 5!!! second freezes each 30 seconds on average
Try installing Process Lasso and using the CPU limit setting. Follow this setting: when CPU use hits 98% for a period of 1 sec, reduce by 1 core for a period of 1 sec.
Source: tgrokz @ Reddit.
Why did you get a 7600K over a R6 1600?
@@CharcharoExplorer Mistakes were made.
2600k @ 4,6 ghz giving you a wink.
You can fix the terrible frame stutter issue by running the game, then alt-tabbing out and opening Task Manager. Find RDR2.exe in the process list, right-click it, and assign the CPU affinity so that it only runs on two of your 4 cores. Once I did that I haven't had a frame stutter since. I also maxed out graphics and it's wonderful at 1440p.
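The Task Manager affinity trick above has a scriptable equivalent. This sketch uses the Linux-only stdlib call `os.sched_setaffinity` (on Windows you would use Task Manager as described, or a tool like Process Lasso); `pin_to_cores` is a hypothetical helper written for illustration:

```python
import os

def pin_to_cores(cores):
    """Restrict the current process (pid 0 = self) to the given CPU indices."""
    allowed = os.sched_getaffinity(0)    # cores we may currently run on
    wanted = set(cores) & allowed        # never request a core we don't have
    if wanted:
        os.sched_setaffinity(0, wanted)
    return os.sched_getaffinity(0)

# Same idea as limiting RDR2.exe's affinity in Task Manager: pin to one core
pinned = pin_to_cores({0})
```

Shrinking the affinity mask caps how many threads can run simultaneously, which is presumably why it sidesteps the engine's thread-scheduling bug at the cost of raw framerate.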
Looks like I can wait for an Xmas sale for the PC version. Hopefully it will be fixed by then.
I hope you got the 25% off sale in the EPIC store in December bro.
Good lord, the errors and bugs in this game... This is a dumpster fire of a launch.
worse than GTA4?
I like how at 0:22 you can't help but make it sound like Thread Dead 2
Thank you Gamer Nexus for providing quality benchmarks and tech reviews.
Always learning some stuff thanks you.
After seeing this, I'm really glad I bought the Ryzen 5 3600! I mean, I get 8 frames lower than the high-end CPU at less than half the price. LOL.
Yeah, the 1% and 0.1% lows are really looking good compared to much higher priced CPUs. :)
It was mainly a XML glitch. Go back to bed and cry...
Stop right there... guys, do you remember the last time you had to update the BIOS to play a f~king game? Well done Rockstar
I feel so lucky that RDR 2 has worked flawlessly on my setup. And runs decently with my 1070 and 7700k at high settings
I love how in-depth and informative your videos are - keep up the good work!
Works beautifully. B450 AORUS, NVIDIA GeForce RTX 2070 SUPER, Ryzen 5 3600X @ 3.79GHz, 16 GB RAM... had a hard time deciding because I still have my PS4 version. OMG, night and day visuals. I have just about everything on ultra except for things I didn't think would matter. My FPS ranges from 45 to 60 on a 50-inch 1080p TV via high-end HDMI. The detail, atmospheric lighting and visual details are insane.
One thing I would really like to see is a benchmark where ALL CPUs have SMT/HT disabled, to see how much it matters.
Just look at the i9-9900K vs. i7-9700K results, or the i5-7600K vs. i7-7700K; those pairs are basically HT enabled vs. disabled.
megapro125 exactly
Makes you appreciate how well optimized the PS4 version must be.
Yeah I played the PS4 version so I was like wow this is gonna run so well on my PC, but you know...
Yeah, if you call 25fps with tons of motion blur "well-optimized"
nothing like dips-into-15 fps gameplay!
@@unnamed715 Triggered?
@@Zoolookuk No comeback?
2019: publishers wearing bad press resistance +10000 armor
In the end the same people who are shareholders in those companies are also shareholders of the press.
Thanks for keeping the old school CPUs around - I still use the 2600K as my daily driver.
It might be useful to throw up a demonstration of frametime: show three spinning circles with a pattern on them, one at a steady 16.67 ms frametime, one at mostly 16.67 ms with some spikes dragging the average down, and one at a steady 33.33 ms frametime. It would very viscerally demonstrate why smooth frametimes matter more than a higher average framerate.
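To put rough numbers on that point (illustrative frametimes, not measured data): a spiky run can beat a smooth one on average fps while being far worse in the lows, which is exactly what the spinning-circle demo would make visible.

```python
# Why averages hide hitches: compare a steady run against a spiky one.
def avg_fps(frametimes_ms):
    """Average framerate implied by a list of per-frame times in ms."""
    return 1000.0 * len(frametimes_ms) / sum(frametimes_ms)

def one_percent_low_fps(frametimes_ms):
    """fps implied by the slowest 1% of frames (here: the single worst frame)."""
    worst = sorted(frametimes_ms, reverse=True)
    n = max(1, len(worst) // 100)
    return 1000.0 / (sum(worst[:n]) / n)

smooth = [16.67] * 100          # steady ~60 fps
spiky = [12.0] * 99 + [150.0]   # mostly ~83 fps, plus one big hitch

print(avg_fps(smooth), one_percent_low_fps(smooth))
print(avg_fps(spiky), one_percent_low_fps(spiky))
```

The spiky run wins on average fps yet its 1% low collapses to single digits, which is the stutter you actually feel.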
The whole point of Vulkan and DX12 is that devs can optimise their games. Rockstar be like: let's not do any optimisation 🤦♂️
Did anyone notice at 8:00, that every CPU without Hyperthreading has issues in 1% and 0.1% lows?
Iama Dinosaur I get 10 second freezes every 5 seconds with my i3 8100 (4 core 4 thread)
@@arvidbahrke9408 same cpu, found workaround by messing with rdr2.exe process affinity - setting it to 2-3 cores helps, but framerate suffers. No freezes tho.
Yep, with my 4770K at 4.6 GHz at 1440p I get a consistent 100 fps on high settings. Ha, I love my Haswell, man. My buddy runs an 8700K and I get almost the same fps, maybe a 5 fps difference.
Seems like the systems without SMT/HT are the ones hitting those super-high frametimes, even compared to CPUs with the same thread count but fewer cores plus SMT.
Yeah I noticed that too. Good eye!
All Ryzen CPUs have SMT, and I don't see the problem there...
@@marsovac yeah I meant to say without
This is why I would like to see an AMD CPU tested with SMT disabled.
This explains so much...
All my friends with i5s and older CPUs have extremely choppy gameplay, while we i7 players just have graphics cards sweating like crazy trying to run this, but it's working well (enough).
My i5-6600K is giving me freezing issues. I've at least managed to make the game more playable by using Process Lasso and adding a rule for RDR2.exe to limit CPU usage for 1 second when a core reaches 98%.
@@tomassimkus1905 I didn't even bother with Lasso. I just straight up used Task Manager to limit RDR2.exe with process affinity to use only 2 of my 4 cores. It runs like butter on max settings now, no more stutter or freezing. Using an i5 7600u and a GTX 1080 Ti.
Other than day 1 trying to launch the game I have had zero issues with RDR2 .
Asus Strix Z-370-G
I7 9700k @Turbo
Msi RTX 2080 Trio
HyperX Fury 2133@3000mhz
Thermaltake 750w toughpower
The only issue I had day one was the game not launching due to HDR. I turned on HDR in Windows 10, boom, I got to the game menu, turned off HDR in the RDR2 settings, exited, then turned off HDR in W10. Good to go. Zero issues with nearly 100 hrs of game time. Was likely a driver issue, not mobo-specific. Good day.
2600k killing it. My baby still going strong with a Radeon VII, can't wait to upgrade to a new cpu.
No FX CPUs? :( I would really like to see how the FX-8350 performs, since Rockstar put the FX-6350 as a minimum requirement.
same
expect decent max fps but really crappy 1% and 0.1% lows.
@Iama Dinosaur Yeah, well, 4-core i5s ran GTA V really well. Idk about RDR2; in theory it should do fine, but it's not looking good for us gamers with older systems.
Steve really dropping content at 2 am 🤔
it was eight am here
Great comparison! Thanks!
Do the same with Detroit Become Human, please.
This game also turns four-threaded and six-threaded processors into garbage.
Even eight-thread processors sometimes lag and drop to low frame rates.
You need to test in the third location of the game, which begins in the park and goes into the city.
To unlock the fps, you need to edit the file GraphicOptions.JSON, changing the value to "FRAME_RATE_LIMIT": 4
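For anyone trying that edit, here is a small Python sketch. The file name and key are taken from the comment as written (the path varies by install, and the function name is just for this example); back the file up first.

```python
import json
from pathlib import Path

def set_frame_rate_limit(path, value=4):
    """Rewrite FRAME_RATE_LIMIT in the game's graphics settings JSON file."""
    settings = Path(path)
    opts = json.loads(settings.read_text())
    opts["FRAME_RATE_LIMIT"] = value          # key as quoted in the comment
    settings.write_text(json.dumps(opts, indent=2))
    return opts["FRAME_RATE_LIMIT"]

# e.g. set_frame_rate_limit("GraphicOptions.JSON")  # adjust path to your install
```

Same result as hand-editing the file; the script just keeps the rest of the JSON intact.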
I5 7600K 16GB RAM GTX 1070: Same problems you found. CPU usage goes to 100% GPU usage goes to 0%.
Thanks for the video.
I've always avoided CPUs w/o hyperthreading/simultaneous multi-threading due to this very issue addressed in this video. You budget CPU owners can argue w/ me all you want, but the proof is in the pudding.
"B-but i5 is enough f-for gaming"
@@BeowulfCav Until this shit happens, lol
Good LORD, I'm glad I'm moving from my i5 6600 to an R7 3700X. I've had atrocious frametimes in almost every game overall, but RDR2 sounds straight-up unplayable.
Just look at a 2nd-gen i7 beating the hell out of a 7th-gen i5, lol.
@@AZI3623 And a $500 3900X... They should have put the 3950X in this as well...
Sitting here, watching this with a 7600k in my pc........
RIP
F
@Iama Dinosaur lol
Same
its coming to steam in December
I swear your teeth are getting better in every video.
good job
Thank you for testing 1700 with OC
Rockstar competing with Bethesda for laziest programming award.
You mean optimization. Rockstar has always been and always will be miles ahead of Bethesda in terms of programming. Bethesda will always be limited by that broken, buggy Creation Engine that they can't seem to let go of.
I'd love to see chart ordered to reflect 1% mins rather than max fps.
But that's how it is organized?
You said you'd talk about the DX12 vs. Vulkan thing in this video, but you didn't. Based on early reports for the game, that was the issue behind the freezing on the Intel chips. That's a large oversight in a video that goes so hard on the frametimes of those Intel chips.
To the whole GN crew,
How do you guys wrap your heads around all these bizarre results?
Currently my system is a Ryzen 5 2600 on an Asus ROG Strix B350-F with an EVGA RTX 2080 XC Ultra (thiccboi). I set my settings to your high recommendation and found them to be perfectly satisfactory. According to the built in benchmark my results were as follows: Max - 102, Min - 38, Avg - 72
Very respectable considering how demanding (or un-optimized) RDR2 is.
Thank you for what you do!
Maybe RDR2 started a new era for gaming, where high-thread-count CPUs rule.
Big slap at intel