i don't understand why i began watching this guy, i don't even own any of these gpus, i just suddenly catch myself watching his videos while i eat, i think i am in love
Same
Same
Same
Same
Same
*Intel "Battlemage" is giving off anime protagonist vibes*
And it's starting to lose some of its charm too. The 50 series is gonna come out, and the 4090 will probably be equal to the 5070 and 5070 Ti. Intel would go back to the mid range, but hey, they would still be keeping up. That's something, even if it's the last-gen flagship.
@@thatguy3000 Ngreedia's RTX 50 series will have skyrocketed prices, only for rich people; for everyone else, the RTX 50 series will remain just a dream.
@@aligatortatotatotaovic1890 Exactly! Who has that kind of money? I have an RX 6800 you can buy for $379.99, and I can play this game at 1440p ultra getting 60fps at native res.
@@thatguy3000 with the lead that they have and chasing the AI bubble, the 5070 is never going to give us 4090 performance. At best it's going to give us 4080 performance.
and if the benchmarks are to be trusted, games will look like anime on those GPUs.
Thank you so much for accepting my request and testing arc a770 ❤
Low frames or no, it's smooth as butter...they're getting closer. Thanks for this, I've been keeping tabs on Intel's cards.
they just keep getting better and better, and for the price the value is there: still better than a 4060 most of the time, but at lower prices
Hope Intel succeeds and gives more competition in the GPU market
Been in love with your videos for a long time. You're a great and funny af dude.
Glad you like them!
Thanks 😃
How many GPUs have you got? You definitely do more than anyone. I'm glad that you're not testing 4090s all the time; most people can't afford them, so it's kind of pointless.
on his "about channel" page it says he has over 90 gpus @@mmremugamesmm
Can't believe that XeSS surpassed FSR despite releasing later. Great video 👍
proud of my 1080 TI overclocked to 2126mhz beating this to a pulp
1080 ti is goated
With double the wattage
What settings? 1080p? Because even OC'd, your card should run out of VRAM, so I expect the frametime isn't smooth. 1080p max settings uses 12GB of VRAM, as you can see in the video.
@@patriciomateluna2223 It can allocate 12GB, but it utilizes under 11GB, and that should be fine.
How the hell am I not subscribed to you? I've seen at least 30 videos in the past few months, from handhelds to 1060s, and from GPUs I didn't know existed to modern GPUs. I appreciate the content
Haha thanks for watching mate! And subscribing 😃
XeSS should look better and run faster on Arc GPUs because XeSS on Arc uses XMX cores, like DLSS uses Tensor cores.
Great vid as always!! Only suggestion: you should have also tested with DRS + XeSS Dynamic, so that the DRS image quality is MUCH better.
You always give good vibes with your videos. You are the best
Good video and I'm very glad to see more A770 testing from you, but I hoped you would have also tried 4k with XeSS instead of DRS.
Been loving your content, I've got a suggestion of my own: the Radeon RX 7900 GRE on Helldivers 2. If you're not in possession of this card, the 7800 XT is my 2nd option.
If you've got neither, that's okay, I'll continue to enjoy your upbeat reviews of graphics cards~
Thanks for benchmarking the Intel ARC A770 16GB!
For me it's always a very interesting card because it's more unusual than the AMD & Nvidia cards. So I would be up for more benchmarks of it.
From what I've seen so far, the performance is mostly between the 3060 and the 3060 Ti, and usually the 3060 Ti is stronger, like in this game.
Nevertheless, the Intel A770 has something special, XeSS looks much better than FSR in that game, AND of course it has a decent 16GB of VRAM.
So maybe in some years the 3060 Ti will only see the taillights of the Arc A770. We'll see.
it would be nice to see an updated benchmark of this gpu on multiple different games as the drivers will have improved it significantly since the first one
I think it is a driver issue... but these problems are affecting the acceptance of Intel GPUs... awesome vid as always
Great job mentioning the ReBar requirement, Kryzzp. I've seen a lot of high-profile "reviewers" making revisits/follow-ups on the Arc GPUs, and they don't say a word about ReBar anymore!!
My brother still has a Ryzen 2700X system ( *just like many other people who still have up to a Ryzen 2000-series system* ), and if he was buying an Arc GPU based on those high-profile "reviewers'" recommendations, *he would basically be* … *screwed* !!
( *Arc is relatively cheap, so its target audience is mostly people with lower budgets and older systems* )
Intel's Tom Petersen (formerly nVIDIA) has said in the past that whoever doesn't have ReBar enabled should go buy an RTX 3060 instead!! So it's good to see that Kryzzp doesn't mislead consumers with low-quality reviews like many 🤡"high-profilers" are doing…
I'm very impressed by your professionalism, Kryzzp. Good work🙂
Mark my words, the Arc A770 is gonna age like the 1080 Ti after they fix all the issues. It's all just driver issues; the card itself has immense potential
It's already better than the 4060, but the 3060 will age better because it has 12GB of VRAM with Nvidia driver support (it just works). The 6700 XT is good as well, but it's a 160-bit bus, I think; that's why it won't age that well
Was going to get a 7900 GRE but... I've committed: getting my A770 in on Monday for my new build as a placeholder for Battlemage. It'll be paired with a 7600X/32GB DDR5, and I used the extra $250 for my monitor. Wish me luck! 😅
Just for reference, I'm coming from an SFF i7 6700/1650 LP PC, so it's still a huge upgrade for me lol
Nice, I really wanted an A770 but don't see a reason to get rid of a perfectly good 3060. I'm eagerly waiting for Battlemage too. I'm probably gonna use the same specs as you and put it all together in an Iqunix ZX-1
@PhoenixKeebs yea the 3060 is still a really great card, I wouldn't bother either.. I'm just pumped there's a 3rd contender in the market; more competition = better deals for us
Would've liked to have seen 1440p high with very high textures and XeSS on quality at least. Most of your experience with XeSS is the DP4a version on other GPUs; when it's available in a game, I think it's worth looking at a bit more when using Arc.
Thanks for the video!
It's still not even close to the console experience, actually. The PS5 (both modes) runs this game nearly identical to the high preset, but the PS5 actually has parallax occlusion mapping turned on and uses very high textures. The clouds setting might also be set to very high. The only downgrade on PS5 is that it uses variable AF texture filtering (ranging from 2x-8x). The 60fps mode runs at a reconstructed (checkerboard) 1800p with dynamic resolution scaling. The 30fps mode runs at a native 4K (possibly with DRS as well). So you would probably need a 3060 Ti 12GB to match the PS5... The fps on PS5 are very stable even at 60fps, and it's v-synced
Also, the A770 is identical to the PS5 in power (it's the same power as the 2070 Super, the card that's identical to PS5 performance). The PS5 is basically entry level now, since the 3070 is mid-range and beats it very hard.
Damn Man I love your videos, keep it up man, btw can you also make a video about Intel Arc A580 & A750 ?
hopefully the next round of drivers fixes the frame rates, because it should be running a lot better than this.
i didn't expect much from the arc gpu and still i got disappointed
This game has big issues with driver overhead, especially with a lower-end CPU. On my 5600 I get 40fps more on a Titan Xp when playing at very low settings. My A750 is getting around 80-90% usage and my Titan Xp 99% on the same CPU.
At the lowest settings I still drop into the 40-50s even with XeSS Ultra Performance because it's CPU bound.
I noticed a little bit of GPU usage fluctuation at 1440p, but it was mostly at 97%+ the entire time. Probably something else going on with the drivers
@@zWORMzGaming that CPU is helping a lot. I'm dropping into the 40-50s even at the lowest settings because it's CPU bound; meanwhile my Titan Xp, which is similar in performance otherwise, gets 100fps. Made a quick video at release comparing side by side. Really weird behavior. Death Stranding and Horizon Zero Dawn have the exact same issue. Might be something in the engine that Arc doesn't like.
Best card from Intel's first generation of official cards hehe
this is that ish bro good job here
Btw using DRS+Upscaling is a great combo, should be available in every game
right, the PS5 uses DRS and checkerboarding in all its modes; the game was built with it in mind
The A770 16GB is one of the best budget VRAM:bus ratio GPUs ever made. Even I would love to buy one for my i5 13500 DDR5 setup, if the drivers were mature for gaming and productivity... Looks like I have to grab a 4070 or 3060 12GB for now
Well… not really.
Arc is too weak to compete (at the performance level) with the nVIDIA RTX 3000/RTX 4000 lineup (*besides very few low-end exceptions), and this means that those who have money would rather spend it on an nVIDIA (or even AMD) GPU.
This creates a serious issue, since Arc is most appealing for its lower price, which means that people with a tight budget will be the most interested… BUT… those same people with a tight budget are mostly the same people who still have older systems, and since those *older systems* ( *Ryzen 2000-series and older* ) *don't support ReBar, most of those customers will be screwed if they pick an Arc due to its appealing price*
As I said in my own message here, it's shameful that most of the so-called "high-profile" reviewers are making revisits/follow-ups for Arc but don't say a 😡single word (unlike 🙂Kryzzp here) about this potentially serious issue for lower-budget customers…
As I said in my own message (copy/paste):
""Intel's Tom Petersen (formerly nVIDIA) has said in the past that whoever doesn't have ReBar enabled should go buy an RTX 3060 instead!!""
I wonder if driver updates will make a large percentage performance difference. I can imagine this will get much better frame rates in the future
Intel has a long way to go sadly.
With every try they will get better; just give it more time. We need more competition in this market.
I think it's only a matter of time; wait for it. Intel is giving the other GPU makers a head start.
This game and Ratchet and Clank have the same ISSUE. BUS load is very high: 70-80% (according to GPU-Z sensors). This is the reason the POWER draw is so low...
The GPU is busy fetching data over the PCI-E bus instead of actually computing stuff... This seems to be an issue with the last 2 ports of Sony games to PC done by Nixxes...
You can see this on the RTX 2000 and 3000 series and even on the 4000 series (but there you can brute-force it).
Try deleting DirectStorage files and see if it's any better.
@@Funnky I've read about this, but I haven't tried it. I will try it out (and if I find the video again, I will update ^_^)
Ok... So I did test this, doing a lot of fast travel. It makes things better, by hiding the issue ;) The issue is still there the moment you hit a cutscene.
Before: overall perf would drop.
After (deleting DStorage): overall perf is around the same, but in cutscenes it still drops down to 20 FPS. Same savegame, doing a game reboot and reload, the cutscene plays at 60 FPS. (I am talking about the Embassy cutscene.)
However, this points again to the game starting to use the PCI-E bus extensively ;) I did not try it in Ratchet and Clank, but now I am really curious :D
So thank you for this bit of information!
😎
could you test the 6650 XT next? loving the GPU benchmarks
Try it with the fsr3 mod !! It would make a great series of benchmarking games. Love ya kryzzp ❤❤
What’s good KRYZZP🎉🎉
It's performing like an RTX 2060... Don't you think?
This feels like the drivers need some optimization; if so, the game could run better at Very High, but it will probably run better at High.
Long live kryzzp and digimoon
Dude, can you please also do the RX 6700 XT with Minecraft? I want to buy it, but there aren't many quality benchmarks/tests for this GPU.
I miss the Forza Horizon tunnel, man, bring it back :
Highly likely a driver update will fix this. Also RX6600 demo when?
So an RTX 3060 Ti (maybe) and RX 6700 XT do better than this card at 1440p....
Yes, a lot better
Intel Arc cards are so underrated 😭
Eh, from what people tell me they're overrated 😅
@@zWORMzGaming Bro that has to be super cap
@@Plastino2 fr
Not really, this performance is ass
I hate the a770, these drivers are fucking unstable all the time
I wanted to buy the Arc A580; now, with this performance, I am skeptical about it!
I have an A580. It was cheaper than the RX 6600 where I live, its performance is similar to an RTX 3060, and XeSS looks really good.
@@Kules3 Thanks for the info.
I've been experiencing a problem with my 3060 Mobile where the utilisation is stuck at 100% despite low energy usage (usually 60-80W instead of the expected 110-130W for 100%), taking the FPS from a somewhat stable 60-65 down to between 20 and 40. This is usually due to not having enough VRAM, of course, but the issue remains even with Very Low textures (although that does make the glitch take way longer to appear). After some googling it seems I'm also not the only one to have this issue, with it being most frequent in cutscenes for most people. Considering the low wattage despite the high utilisation, this just might be the same issue.
In the end, I end up having to restart the game every 20 minutes to restore normal utilisation.
Same
Utilization ≠ power
16:22 - Just stick to No AA/SMAA honestly, no TEMPORAL bs, just a sharp image.
May I ask what's the difference between VRAM A and VRAM U?
It's VRAM Allocation
And actual VRAM Usage
@@zWORMzGaming Have you tried the RTX 2060?
It performs like my RTX 3060 12GB. Was expecting more, but maybe the Mage will do some battle with Nvidia and AMD 😉
could the lower power draw and low performance be because of the locked clock speed? unless it's not reading properly, it just seems weird it was locked at 2400MHz constantly
🔥🔥
when extra VRAM doesn't help.........
you should have tried 1440p with XeSS Balanced.
10:30 I think textures usually don't drop the fps by that much in most of the games.
Kryzzp, did you notice the GPU usage dropped a lot of times!?
Didn't this game have Fullscreen and Exclusive Fullscreen? I noticed it's on Fullscreen
Because of the capture card. However, it won't really make a big difference in terms of performance.
Hello, sorry to bother you, but I was looking for an Arc A770 Rust benchmark. I saw a couple of reviews saying it crashes after a few minutes and stutters a lot.
Looking forward to buying this GPU, mostly for Rust.
could you please benchmark the RX 7700 XT? I'm thinking of buying it
yay intel has a full graphics card yay
what causes jittery performance and micro stutters in the game while panning the camera around? Has any fix for it been discovered yet?
What GPU, and do you have ReBar turned on?
yikes, long way to go for Intel GPUs. Not even a contest.
Can you please test the A750 as well?
Hey kryzzp!!
=)
Hi! :)
my buddy bought one and he claims to like it, but I think he's nuts and won't admit it was a bad purchase. He always falls back on saying it's good at video editing and streaming, neither of which he does LOL
When using DRS you should also enable Upscaling.
it looks better without it.
@@zWORMzGaming Hm, did you test this with an NVIDIA GPU? I use it on my 4090 to get a constant 120 FPS in 4K, and I really tried hard to make out a difference. I took 2 hours of my time sitting in front of my 65" OLED, but I just could not make out a difference oO
still waiting for you to test the RTX 4060 Ti on Horizon Forbidden West
Even Intel's first-party benchmarks compare this card to the 3060
I am currently using an Intel Arc A770 16GB. Most of the time it consumes 190W to 200W, but when using ray tracing it consumes more
Could you play Ark Survival Ascended on this card?
if u use SAM or Rebar u will get more avg and 1%lows ❤
It is enabled 👍
@@zWORMzGaming is it overclocked? If you OC and enable ReBAR together you will get double the fps at all resolutions ❤️❤️ I love your videos 🥰 plz test the RX 5700 XT or 7600 🥰🥰🥰🥰
Also make a video on the Quadro P620 2GB GDDR5 GPU, please
Heyyyyy
Hello 👋
Is v sync on?
hi , please test it on RTX 3050
my used mining 3070 Ti is giving me over 40 with DLSS at ultra settings, 4K resolution
Hi! Is there any chance you could get your hands on an RTX 2080 Ti and test some games (like this one) with that card? :)
can u test rx5700 non xt on fortnite
Can you test the AMD HD 5770 in Fortnite pls mr.kryzzp❤
11:40 why is the CPU utilization so high with this GPU?
Driver overhead
Battlemage was a mistake for intel.
Oh no this will be a bad one. The drivers of Intel will make the game run like a potato.
The 770 really said: you may run out of fps but not VRAM
Probably driver issue
do rx 6600xt 🙏
1080p high with medium textures** is fine; you will get mostly 60fps or a bit more, with drops to 50
** If you don't want some shitty drops after like 30 mins, because this game has VRAM leaks. Here I didn't care, and I restart the game if that happens, because high textures look much better
19:00 the screen tearing is horrible 🤣
Bro sunday video on rx 580
Day 5 of asking for BO3 benchmark
Hii Kryzzp
Hi :)
@@zWORMzGaming Will you benchmark the upcoming Battlemage GPU lineup from Intel? BTW, love from India ❤️
Hi
Hi!
Hi kryzzp
Hi!
DRS makes it look like it's running on a PS4...
pls test rtx 4050 laptop
Bro can you please please do forza horizon 4 on intel i5 please
gtx 1660 super pls
First❤
Rtx 2070 please
Intel really has a way to go before overtaking AMD and Nvidia in the graphics card space
3070 pls :)