It does suck that DDR4 alone skyrockets the price of a budget-oriented component like this: you can build the entire system for ~$250-300 US, but add the memory and you can forget about it. Although if I were building one for myself, I'd sacrifice one of the 8GB sticks from my own system to bypass the tremendous cost.
Strangely enough, going from my old system to my new one, I've found that Overwatch flattens out around 60fps and doesn't like climbing much higher, bearing in mind that I went from an A10-8600K, 8GB RAM and a 760 to an i5-6500, 16GB of RAM and a 1060 6GB. Not sure what the reason for this could be (a 60fps frame cap or V-sync being left on would be my first guess, but I'm not certain).
Did you pair the 560 Ti with the 2200G? Ryzen APUs have 8 PCIe lanes for the Vega graphics and 8 for a discrete GPU, so if you pair a Ryzen APU with a dedicated GPU, the GPU might lose a little performance.
I'm curious: how do the graphics of the 2200G compare to something like an older budget HD 7750? My mother wants to get my stepfather a new PC for their 10th anniversary coming up, and I told her I'd build him one instead. So I was pondering either getting the R3 1200 and using his current HD 7750, or getting the 2200G for a few bucks less and possibly better graphics. Also, the R3 1200 doesn't have the compatibility issues the 2200G currently does from being so new. The most graphically demanding game he plays is World of Warships, so I highly doubt I'd ever see him playing something like Battlefield 1 or anything with similar graphics.
What I don't get is why we still don't have the consoles' GPU performance in a desktop APU. Does it matter that it might create some more PC vs console competition? Both platforms have their people, so why not just release an updated version of those chips? I mean, the Xbox One ran them with DDR3 and did mostly fine.
What? You lost me halfway through the comment. The reason we don't see the same performance level is the games themselves and the system: on PC, hardware is "made" for the games; on console, games are made for the hardware. It's easy to optimize a game when every machine has exactly the same hardware and the devs can use the full performance of that hardware. Look at PUBG, for example, a game that doesn't utilize more than 3 cores, and compare that to a game like Horizon Zero Dawn, which is built to take advantage of the hardware's full performance. (Also, for games on both console and PC, the console version usually runs at a mixture of low, medium and high settings, so it's not like you go in and max everything out.)
I'm talking about the GPU in the consoles. If you look at their theoretical performance, they're still ahead of the current Ryzen APUs, and in practice they're still a bit better (the CPU part of them is really weak, though). Why not just give us that performance? I'm fully aware of the optimization you can do on fixed hardware; I just wonder why we can't have the raw performance in an APU that the consoles have had for years. I understood it back on FM2, since the chips were physically too big to fit in the socket, but with Ryzen I don't see why not. Both Ryzen and Vega can deliver more performance in a smaller surface area thanks to the shrink in the manufacturing process.
So what do you think I should pair with my 2400G? I have like all the popular cards. I'm thinking about the 1030. Idk why, but I think it has just enough kick in the pants.
Hello there! I'm about to build a PC on the AM4 platform. I have an HD 7770 1GB and I was wondering if an R3 2200G would outperform an R3 1200 + the graphics card mentioned before. Both systems would have one 1x8GB stick of DDR4 (planning to add one more stick in the future). Thanks for the help.
DDR5 will make APUs mainstream. It's amazing, really, the performance you can get with the current generation. But compare the price to an RX 580 8GB at $100 and a used i7 or i5 desktop: you can literally buy almost any prebuilt Intel-based desktop from the last decade, install an RX 580 or even a 570, and get double the performance for the same cost as building a new APU setup. Probably a little cheaper, actually.
The Ryzen 2200G is OK for peace of mind, as it's brand new, but any used quad-core Core i-series with a used GeForce 660 / Radeon 7850 etc. will cost less and perform way better in gaming every time. And for the memory price difference (DDR4-3200 vs DDR3-1600), I would instead buy a discrete GPU. By the way, even though you say otherwise, in some of the games you tested the CPU is the limiting factor, aka the bottleneck, hence the parity in results!
Why do you use MS Paint and not Excel? You can easily make graphs in it. Makes no sense (plus it can do the 1% and 0.1% low calculations automatically if you set the right formulae).
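For anyone wondering what those 1% and 0.1% low figures actually are: one common definition (used by several benchmark tools, though definitions vary) is the average FPS over the slowest 1% or 0.1% of recorded frames. A minimal sketch of that calculation, assuming you have per-frame times in milliseconds:

```python
def percent_low(frametimes_ms, percent):
    """Average FPS over the slowest `percent` of frames (one common definition)."""
    worst = sorted(frametimes_ms, reverse=True)       # slowest frames first
    n = max(1, int(len(worst) * percent / 100.0))     # how many frames make up that slice
    avg_ms = sum(worst[:n]) / n                       # mean frametime of the worst slice
    return 1000.0 / avg_ms                            # convert ms/frame to FPS
```

For example, 99 frames at 10 ms plus one 50 ms hitch gives an average near 96 FPS but a 1% low of 20 FPS, which is exactly why the lows expose stutter that the average hides.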
It's because of the difference in RAM. Dedicated GDDR5 vs shared DDR4 isn't a fair comparison; one of those cheap low-end GPUs you always see online that use system memory would be more comparable. Also, Nvidia optimizes game settings for specific GPUs. Even when the in-game settings are the same, the drivers are tampering with image quality. Both AMD and Nvidia are always tampering with games to fix visual bugs, and Nvidia has the software support to do more of that. With a lot of these games being older titles, it makes sense that the GTX 560 Ti, one of the most successful Nvidia GPUs of that generation, would get support like this for longer. It also makes sense that when it gets no support at all, it falls flat on its face.
There must be something wrong with your 720p low results in AC. The 0.1% lows should be way higher than at 1080p, and the OC should perform better than stock. Maybe not enough voltage for the SoC, or cooling (not great with the stock heatsink).
Definitely seeing CPU bottlenecking, especially at 720p: the CPU is seeing very high usage and the GPUs were dropping from 100%. Also, the 2200G and its iGPU are suffering in some games, most likely from TDP limitations, which shows in the 720p results.
As the Ryzen 2200G is an APU, does it support CrossFire with AMD graphics cards? If so, that would also be a way to boost performance by buying a cheap AMD GPU!
Idk, I'm still baffled by the fact that a 65-watt Zen+Vega APU essentially performs in Sandy Bridge i5 + GTX 750 territory, with the latter pulling up north of 150 watts easy. If only the DDR4 pricing were easier to swallow...
The Crysis results make it seem as though the Vega 8 is bandwidth-starved. It really is hard to get good RAM at a reasonable price these days, though, so that's more than fair.
Considering you can get a GTX 550 Ti as part of a scam and then refund it, it may work out as even better value. Ended up costing me £10. Really just shows the performance of the Vega APUs; they truly do perform brilliantly. Just a massive shame about DDR4 pricing.
True, if DDR4 prices weren't as high as they are right now, I could build a Ryzen 5 2400G system at a very good price.
You get a 550 Ti as a scam.
NOT A 560 Ti!
GV: I mean VEGA56 is virtually a GTX1080 with an undervolt... So not exactly shit, just a shame about the availability.
I bought a scam one. Overwatch has so many artifacts lol. They offered me a 4 dollar "compensation". Hopefully I get a full refund
Vega ain't shit. These APU's beat anything from Intel.
These APUs would be perfect if not for DDR4 pricing
GV except you are a blind fanboy
GV saying "hot garbage" doesn't seem like an opinion
GV Just because you dislike a product, it doesn't mean it's bad
SaviTheGamer could you take this argument somewhere else
thx
Brandon don't worry, he's probably a troll.
"16gb 3200mhz DDR4"
I thought this was a budget build channel
I'm from the future, 16GB is only $65 for 3000Mhz at newegg lol.
It's now $55 LMAOOOOOOOOOOO. 2017 pricing - $160 at the least. Fuck price fixing.
November of 2019. 16gb 3000mhz is 50$ with free shipping
fuck you all im in 6600 and 16 gb of ram even doesnt exist
@@Limaokinzoks what?
To give a brief explanation of the dropping performance in older titles with the Vega 8: it has to do with the inefficient rendering pipeline in DX9-DX11 games (as well as OpenGL). Most engines from that era, CryEngine in particular, thrive on beefy per-core CPU performance and memory bandwidth, because all graphics instructions are constantly funneled through the CPU main thread. Since the Vega 8 shares memory bandwidth with the CPU, both become completely hamstrung in these titles. In modern, optimized DX12/Vulkan titles, however, graphics work can be kept much more isolated on the GPU, resulting in far less draw-call overhead on the CPU and RAM, less memory bandwidth usage, and more efficient rendering.
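A rough back-of-the-envelope sketch of the point above: frame rate is capped by whichever side finishes its frame last, and in older APIs the CPU-side cost per draw call is much higher. The per-call costs below are illustrative guesses, not measured values:

```python
# Toy model: a frame can't finish faster than the slower of CPU submission
# and GPU rendering. Per-draw-call CPU costs here are made-up illustrative numbers.
def max_fps(draw_calls, cpu_us_per_call, gpu_frame_ms):
    cpu_frame_ms = draw_calls * cpu_us_per_call / 1000.0  # CPU time to submit the frame
    return 1000.0 / max(cpu_frame_ms, gpu_frame_ms)

# 5000 draw calls at a hypothetical 10 us each (DX11-style overhead):
# CPU needs 50 ms/frame -> capped at 20 fps even if the GPU only needs 8 ms.
# The same 5000 calls at a hypothetical 2 us each (DX12/Vulkan-style):
# CPU needs 10 ms/frame -> 100 fps.
```

On an APU it's worse still, because that CPU submission work and the GPU's rendering both eat the same shared DDR4 bandwidth.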
With this being said: even with dual-channel 3200MHz, you will not come close to the memory bandwidth of exclusive access to GDDR5 and GDDR5X memory... Let us hope that once AMD opts for 7nm APUs, they'll add some sort of dedicated fast onboard memory. That would definitely be a game changer. :) As a side note: if you turn off the Vega 8 graphics and use a dedicated graphics card instead, the CPU/memory bandwidth problem is lifted and you can reach 70+ minimum fps in The Witcher 3, something that is impossible when the Vega 8 is active (even at 720p). That's a full-blown showcase of the memory bandwidth problem of this APU. It's a shame, because neither the CPU nor the GPU can shine when they work together.
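The arithmetic behind that bandwidth gap is straightforward: theoretical peak bandwidth is transfers per second times bus width in bytes. A quick sketch (the GDDR5 card below is just a generic 256-bit example, not any specific model):

```python
def peak_bandwidth_gbs(mt_per_s, bus_bits):
    """Theoretical peak bandwidth in GB/s: transfers/s * bus width in bytes."""
    return mt_per_s * 1e6 * (bus_bits / 8) / 1e9

dual_ddr4 = peak_bandwidth_gbs(3200, 128)   # dual-channel DDR4-3200 (2 x 64-bit): 51.2 GB/s
gddr5_256 = peak_bandwidth_gbs(8000, 256)   # e.g. a 256-bit GDDR5 card at 8 Gbps: 256 GB/s
```

So even a best-case dual-channel DDR4-3200 setup has roughly a fifth of the bandwidth of a mid-range GDDR5 card, and the CPU and iGPU have to split it.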
Very interesting technical insight! What documents would you suggest reading to learn more about that? And what if AMD introduced quad-channel memory, as they already did for the Threadripper platform; would that help mitigate the bandwidth issue? Yep, I hope AMD's next-gen APU will host a 2GB frame buffer made of high-speed memory.
Daniel Düsentrieb: Unfortunately I can't refer you to one place that summarizes that kind of information. :) The reason I know this stuff is that I'm a programmer/game developer. I'm currently developing a Scandinavian-fable-themed 2D adventure with HD graphics (think Rayman Legends-level graphics) for PC and consoles. I learn a lot by reading tech blogs and posts by NVIDIA, AMD, programmers and game engine developers, as well as API documentation, while testing things like this in practice with performance profiling. Hahaha, it's a full-time job on its own. :) So I wouldn't recommend it, since it isn't a light read. But I'm happy to share what I know.
Yeah, I know. ;-) I looked you up after I read your initial post. The game you're working on looks really nice and I hope it's going to be a success. Since the 80s I've been fascinated by computer graphics, gaming-related graphics in particular, but I never really got into it, even though I spent many years in the software industry (ERP and business applications).
The only fix I found recently was AMD Software's Radeon Chill. It prevents the GPU from rendering frames faster than the CPU can process them. I was getting massive throttling on my Vega 8 with a Ryzen 5 3500U. After some tweaking I found that doing the following gave me stable frames, lower power draw and less throttling, for a stable experience:
1. Windows Balanced power option, performance setting in the battery tray while plugged in, PCIe Link State Power Management disabled.
2. Radeon Chill with min fps set to 60 and max set to 60 as well.
3. In the Windows power plan, minimum CPU power state set to 5%, maximum CPU power state set to 65%.
Depending on the game, you might need to raise that max power state setting if the game needs more CPU to hit 60; otherwise, lower it so the Vega 8 doesn't throttle as early. Experiment with Radeon Chill to see if you need to adjust at all. My HP Pavilion laptop has a shitty plastic chassis and barely any cooling, so my Vega typically clocks around 600-800 MHz. Lol
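For what it's worth, steps 1 and 3 can also be set from an elevated command prompt with Windows' built-in `powercfg` tool, which may be quicker than clicking through the power plan UI. A sketch using the documented setting aliases (the exact values, 5/65, are just the ones suggested above; tune them per game):

```shell
:: Min/max processor state for the active plan while on AC power
powercfg /setacvalueindex scheme_current sub_processor PROCTHROTTLEMIN 5
powercfg /setacvalueindex scheme_current sub_processor PROCTHROTTLEMAX 65
:: PCIe Link State Power Management: 0 = Off
powercfg /setacvalueindex scheme_current sub_pciexpress ASPM 0
:: Re-apply the plan so the changes take effect
powercfg /setactive scheme_current
```

The Radeon Chill cap (step 2) still has to be set in AMD's own software; there's no powercfg equivalent for that.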
No seagull - dislike, unsub, report.
Reported your comment for being unfunny and unoriginal
Gordon Freeman reported your name for addiction of the game you are in
Ugandan boi shut heck dead meme
IcyyRed salty kiddo go find a life don't be like Intel fanboyz
Call for PETA
I would get a 2200g if ram wasn't so expensive
It's Ryzen. Like it or not, you are going to buy high-frequency RAM in the future. Consider it a good investment, since it will work with or without the onboard GPU.
Ryzen really doesn't need high-frequency RAM; you can easily beat those results with tuned RAM. I run mine slower than advertised to get super low latencies, and Cinebench went up a huge amount after tuning. Since I'm using my 1700 for CAD and rendering, that's what I optimize for. It still does pretty well in games; if you're running 1080p with my Titan Xp, yeah, it holds it back a little, but it's just "holding it back" at 200+ fps... so not really a problem, especially since I don't see many people pairing a Titan with 1080p. I run things at 4K if the Titan can handle it (you have to tweak the settings for 60 fps, but it does OK for the most part).
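The "slower but tighter timings" idea above has a simple rule of thumb behind it: first-word CAS latency in nanoseconds is the CAS latency in cycles divided by the memory clock, i.e. CL x 2000 / (MT/s), since the clock runs at half the transfer rate. A quick sketch with example kits:

```python
def cas_latency_ns(cl, mt_per_s):
    """First-word latency in ns: CL cycles at the memory clock (MT/s / 2)."""
    return cl * 2000.0 / mt_per_s

# DDR4-3200 CL16 -> 10.0 ns
# DDR4-2666 CL12 -> ~9.0 ns: lower transfer rate, yet lower actual latency
```

So a downclocked kit with aggressive timings really can answer requests sooner than a faster-rated one with loose timings, which is what latency-sensitive workloads notice.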
Ram is expensive no matter what build you want to get.
Aleksandar Ristic shouldn't it be, that you would upgrade at all, but ram is expensive?
Yup, my Corsair ValueSelect 1x16GB 2133MHz cost me 135 EUR in late 2017, and a month later it was 200 EUR.
If only DDR4 prices weren't that high the Ryzen APU builds could have been even better in terms of price to performance
you are going to use DDR4 even with intel build.
Any build would have better price to performance
Yedige Moldagaliyev, I don't think build-to-price matters that much here, as any new build will suffer from the same problem. The best you could do is build a DDR3 system, since DDR4 is common to all new builds.
Think ram prices are expensive? Try buying a gpu...
RandomBeardinHD
I feel wewcolmed everytime I click on RandomGaminginHD's videos
The Saucy Goblin Shark wewcolmed
Same UwU
although he is always trying to scare away people with the seagull xD
Thank you for this video! This is the CPU I'm planning on using, and will be using the Radeon graphics for a while until I can get a 1050Ti.
It's a seriously impressive APU. Would I go for it? No, but then I have a lot of money invested in my current Intel/GTX rig. Starting from scratch, this really is a no-brainer for productivity users and occasional gamers.
I’m very glad you overclocked the Vega GPU this time and used proper memory! Thanks for the video!
Cheers for that. Finally swapping my Phenom 9850 for Ryzen. :)
Man, I can't wait for this APU to get driver updates for optimization; with AMD's track record of those doing miracles for their CPUs and GPUs, the 2200G/2400G might become the best-value CPU/GPU combination in its bracket.
This is just a testament to the progression of GPU technology, and to how efficiently, powerfully and cool the Vega 8 runs while delivering such good performance. This is what I love about the new Ryzen APUs: they're a milestone for graphics technology, competing fairly closely with the 560 Ti, a dual-slot card with a huge cooler, while being an integrated solution.
YaBoi ToastedVivid if this was compared with any other company the results would be better lmao
If it were being compared to a card at its own price point, it would be less impressive. Think GT 730, or sub-550 non-Ti. It's basically an R7 240; it's not in competition with any of the GTX line.
Your videos help me sleep, they’re just so calming
A comparison of the power consumption would've been nice to really see how far we've come!
Ive been addictied to your vids lately nice job👍
I am impressed at how well this old GTX 560 Ti still holds up today with really demanding games.
I have a GTX 650 1GB, and the card is in theory powerful enough, but with so little VRAM you can't play anything at all.
200k subs! congrats!
This is quite an important APU, if only in part because of how the market is right now.
At the very least, it's pretty darn decent at gaming, definitely for its price.
Your ms paint graphs are legit bro.
Nice video dude. keep it up man.
I'm gonna comment before I watch it. It's a very good call asking what it can do against a 560 Ti. I had the 1GB version, and the only real difference between the 1GB and 2GB 560 Tis is the textures.
Would a g4560 bottleneck a gtx 970?
Should be OK in most titles, but in Battlefield, PUBG and other CPU-intensive titles, it will bottleneck.
Most titles should be alright; generally, unless you're playing games like PUBG and Battlefield, as Sart said, you'll see good performance. However, do keep in mind that both PUBG and Battlefield would still be playable on this pairing, just not as optimally as with alternative CPUs.
Ok thanks
Yes
Johns Memes Get r3 1200
I hadn't gotten a recommended video from you in so long that I thought you'd stopped making videos, until I actually searched for you and found you've been doing your usual thing. YouTube is so frustrating sometimes. Anyway, keep up the awesome videos! :)
Congrats on 200k subs
I always wait for your videos... so informative... keep it up, bro. I wish I could get a chance to meet you... love from India.
Tip: Showing off hardware at the same distance as your face makes keeping uniform frame focus easier.
Good review, although I would like a darker background for the graphs; it hurts the eyes at higher brightness.
It's honestly sad these days that, because of the crap Digital Foundry pulled for so very long, criticizing any frame rate below 59.999 FPS as UNPLAYABLE, other reviewers commonly have to use similar terminology. I appreciate that you don't allow a 0.000157-second stutter every 17 minutes of gameplay to be considered UNPLAYABLE, as so many others do, and that you consider anything from 30-60+ FPS at various quality settings perfectly fine... as they are. I grew up in the real days of PC gaming: Voodoo 1 and 2, STB Riva 128, etc. The days when games naturally ran at 10-15 fps unless you were a freaking GOD among PC enthusiasts. Yet today, if you aren't getting 144 FPS on a $5000 HDR 4K monitor without ONE SINGLE FRAME DROP, it's UNPLAYABLE! And amongst that madness, the voice of reason who goes by "RandomGamingHD" says NO... NO MORE OF THIS HORSESHIT! Games can be plenty fun even if the eyeball only sees 47 images every second! Games can be fun even if, in the fifth hour of playing a title, you get a momentary stutter... Games can still be playable and fun. What a reasonable mentality, and I love it!
At last I'm off the FX-8320 and FX-8350 CPUs and got the 2200G; the fps boost (combined with an RX 480) over the previous FX was really worth it.
The DDR4 market is still expensive tho....
Great video!
The benchmarks I wanted to see.
Hey sir!
So I think the comparison of the 2400G vs 560 ti will definitely be a good comparison.
Especially if someone is going for as new as possible of a build and goes for a R3 1200 + Used 560 ti.
Too bad WeBuy closed down here!
Great video as usual
I have the 2400G, great CPU especially for anyone that might do content creation occasionally and is a casual gamer too with modest expectations.
The GTX 560 Ti is still a good card today for the price; they go for between $30 and $45 in the U.S. right now and perform at right around the level of a GTX 750 Ti, which is still one of the top ten used cards today. This card can still play all of the top games today if you don't mind playing with settings and resolutions. The problem you ran into with Assassin's Creed may be fixable with settings and/or RivaTuner. I had a similar problem playing Mafia III on this card with a less powerful CPU than what you're running. I tried RivaTuner frame-capping the game to 30 fps and using the in-game v-sync, but still got bad stutter and frame drops. But when I just played with lower settings and v-sync with OpenGL, the game was playable with better frame timing; it would still drop a frame or two, but not the bad stutter as before.

The VRAM doesn't really matter either. The 1GB and 2GB versions of the GTX 560 Ti perform about the same in most games today. Once the VRAM limit is hit, it just spills over to system RAM, and the 2200G is only using system RAM to begin with. The only problem with VRAM is when games have presets that either won't run if the VRAM requirement isn't met, or have VRAM-intensive settings even at the game's lowest preset; other than that, VRAM on these old cards is not as big an issue as people make it out to be. The 2200G proves VRAM doesn't matter, as it is only using system RAM. The real problem is how VRAM is allocated in some of today's games. The Assassin's Creed series has never been the best optimized for PC as a whole; Assassin's Creed 3 started giving me problems last summer after some Windows 10 updates, and that was on a PC with far better specs than needed to play it.
In terms of stream processors/compute units, that Vega 8 is more like a 6670, which wasn't even the direct competition for a GTX 550; that would have been the 6750 or so. That Vega is punching about four classes above its weight here. I'd say the fair competition would be Vega 8 vs a GT 730, as that's the level of card a Vega 8 would be at if it were a retail product; it's essentially an R7 240.
I think the most appropriate rival for an AMD APU build is a Pentium + old GPU. Or even a completely second hand parts build, to get 4 cores.
Tests I'd love to see for a complete comparison should be:
1) Same price, to see if DDR4 and new components sacrifice performance
2) Same performance, to see how much you're paying for new (reliable and with warranty) components and insane upgradability.
But I guess I'm asking a lot of work... nice job anyway.
Wow, surprisingly good performance from AMD's APUs; they've definitely upped their game. Nice test.
1:13 Please explain to me how the hell it did not bottleneck with the 2200G?? Wtf? Wait, whuut?? Really? And then you said it was stuttering later on, wtf? Need an answer my friend!
I'm a simple man. If i don't see a seagull, i hire a hitman.
Nice video! Good idea ;)
Having just watched it: I'm more surprised that the integrated GPU is as good as the 560ti. The 560ti is a piece of legendary kit to be held up with the 2500k and the 8800/9800 GTX. That APU has kind of shot down a lot of the budget builds now, seeing as you can buy it for the price of, say, a 560 plus a 2nd/3rd gen Intel chip that would run almost identically.
What did you set the frame buffer size to on the APU? Hardware Unboxed found that keeping it at the lowest setting gives better results, due to the fact that the system memory is already shared with the graphics.
What software did you use for the OSD by the way?
Blows my mind....
Nice One!
More 2nd Gen Ryzen Videos!
Your graphs are so cuteeee lol
❤️
are you planning to look at the r5 2400G? Either way, do you feel it's worth the extra price?
nvm you mention in the end of the video :P
It does suck that DDR4 alone skyrockets the price of a budget-oriented component like this; you can build the entire system for ~$250-300 US, but add the memory in there and you can forget about it. Although if I was building one for myself, I would sacrifice one of the 8GB sticks from my own system to bypass the tremendous cost.
How would it compare to a GTX 650? Roughly the same kind of results? Debating sticking the GPU into the rig.
Strangely enough, from my old system to my new one, I've found that Overwatch flattens out around 60fps and doesn't like climbing much higher, bearing in mind that I went from an A10-8600K, 8GB RAM and a 760 to an i5-6500, 16GB of RAM and a 1060 6GB. Not sure what the reason for this could be.
Which software did you use to check FPS and memory consumption?
Awesome beard👌
What if you put 2400g instead?
Did you pair the 560ti with the 2200G? Ryzen APUs have 8 PCIe lanes for Vega and 8 for a discrete GPU, so if you pair a Ryzen APU with a dedicated GPU, the GPU might lose a little performance.
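On the lane question, the raw numbers are easy to sketch (assuming a PCIe 3.0 link; the lane split itself is as the comment above describes):

```python
def pcie3_bandwidth_gbs(lanes):
    # PCIe 3.0 runs 8 GT/s per lane with 128b/130b encoding,
    # so each lane moves 8 * (128/130) Gbit/s ~= 0.985 GB/s per direction
    return lanes * 8 * (128 / 130) / 8

x8 = pcie3_bandwidth_gbs(8)    # ~7.9 GB/s: what an x8 link offers a dGPU
x16 = pcie3_bandwidth_gbs(16)  # ~15.8 GB/s: a full x16 slot
```

In practice a Gen3 x8 link costs most mid-range cards only a few percent, which matches the "might lose a little performance" worry rather than anything dramatic.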
I'm curious. How do the graphics of the 2200G compare to something like an older budget HD 7750? My mother wants to get my stepfather a new PC for their 10th anniversary coming up, and I told her I'd build him one instead. So I was pondering either getting the R3 1200 and using his current HD 7750, or getting the 2200G for a few bucks less with graphics that might be better. Also, the R3 1200 doesn't have the compatibility issues the 2200G currently does from being new and all. The most graphically demanding game he plays is World of Warships, so I highly doubt I'd ever see him playing something like Battlefield 1 or anything with similar graphics.
What settings do you use in GTA 5? Insane frames for a 560ti; I've got a 750ti and I want that shit haha
All on normal (low) and advanced settings off :)
RandomGaminginHD thanks mate 👌
What I don't get is why we still don't have the consoles' GPU performance in a desktop APU. I mean, does it matter that it might create some more PC vs console competition?
Both platforms have their people, so why not just release an updated version of those? I mean, the Xbox One ran them with DDR3 and did mostly fine.
what? you lost me half of the comment
Why we don't see the same performance level is because of the games themselves and the system...
On PC, hardware is "made" for the games.
On console, games are made for the hardware...
It's easy to optimize a game when every machine has exactly the same hardware and the devs use the entire performance of that hardware. Look at PUBG, for example, a game that doesn't utilize more than 3 cores, and compare that to a game like Horizon Zero Dawn, which is built so it can take advantage of the full performance of the hardware.
(Also, games that are on both console and PC usually run on console at a mixture of low, medium and high settings, so it's not like you go in and max everything out.)
I'm talking about the GPU in the consoles. If you look at their theoretical performance, they are still ahead of the current Ryzen APUs, and in practice (though the CPU part of them is really weak) they are still a bit better. It's just, why not give us that performance?
I am fully aware of the optimization one can do on fixed hardware; it's just, why can't we have the raw performance in an APU that the consoles have had for years? I understood it when we were still on FM2, since the chips were physically too big to fit in the socket, but with Ryzen I don't see why not give it to us. Both Ryzen and Vega can deliver more performance in a smaller surface area thanks to the shrink in the manufacturing process.
So what do you think I should pair with my 2400G? I have like all the popular stuff - cards. I'm thinking about the 1030. Idk why, but I think it's just enough kick in the pants.
Hello there! I'm about to build a PC based on the AM4 platform. I have an HD 7770 1GB and I was wondering if the R3 2200G would outperform the R3 1200 + the graphics card mentioned before. Both systems would have one stick of 8GB DDR4 (planning to add one more stick in the future). Thanks for the help.
Are you using dual channel ? APUs can be more powerful with dual channel.
I think the APU was getting bottlenecked somewhere as the overclock of around 20% gave only a tiny speed advantage.
Interesting how it's obvious ACO has code that only newer hardware is able to handle with efficiency. Good benches
What cooler are you using with the 2200g?
These new APUs are really impressive.
i have to admit Vega 8 has really surprised me!
DDR5 will make APUs mainstream. It's amazing, really, the performance you can get with the current generation, but compare that to the price of an RX 580 8GB ($100) and a used i7 or i5 desktop: you can literally buy any prebuilt Intel-based desktop from the last decade, install an RX 580 or even a 570, and get double the performance for the same cost as building a new APU setup, probably a little cheaper actually.
The Ryzen 2200G is OK for peace of mind as it is brand new, but any used quad-core Core i-series with a used GeForce 660 / Radeon 7850 etc. will cost less and perform way better in gaming, all the time. And for the memory price difference (DDR4-3200 vs DDR3-1600), I would instead buy a discrete GPU.
Btw, even though you say otherwise, in some of the games you tested the CPU is the limiting factor, aka the bottleneck, hence the parity in the results!
(Drunk me) That text on the gpu reminds me of something, (Literally looks in trash can) oh i have a sticker like that.
Hmm, surprised that overclock didn't yield a more significant difference. I mean, it's +500MHz on the core clock.
You should ask his ram speed first. Vega APU is greatly affected by ram speed and cooling.
will you be showing the 2400g?
Any chance of seeing the 2200G and/or the 2400G paired with a GTX 970? Cool channel btw :-)
I got a pretty decent system myself, but somehow still feel the urge to get the new AMD APUs xD Well done AMD, well done...!
So how do the "G series" APUs compare to a Ryzen 3 or 5 CPU in the same price range, in terms of CPU performance?
Why do you use MS Paint and not Excel? You can easily make graphs in it. Makes no sense (plus it can do the 1% and 0.1% calculations automatically if you set the right formulae).
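For reference, the 1% and 0.1% low figures mentioned can be computed from a frametime log in a few lines. A sketch of one common method (averaging the slowest 1% / 0.1% of frames; different tools use slightly different definitions):

```python
def low_fps(frametimes_ms, pct):
    """Average FPS over the slowest pct% of frames (one common 'x% low' method)."""
    n = max(1, int(len(frametimes_ms) * pct / 100))
    slowest = sorted(frametimes_ms, reverse=True)[:n]  # worst frames first
    return 1000.0 / (sum(slowest) / n)

# 1000 frames at ~16.7 ms (60 fps) with a handful of hitches mixed in
frames = [16.7] * 990 + [40.0] * 5 + [50.0] * 5
one_pct = low_fps(frames, 1)      # averages the 10 slowest frames
point_one = low_fps(frames, 0.1)  # just the single slowest frame here
```

This is why 1% and 0.1% lows expose stutter that an average-FPS number completely hides: ten bad frames out of a thousand barely move the average but dominate the lows.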
The iGPU has a lot of potential; it's just getting bottlenecked by the slow DDR4 RAM compared to GDDR5/HBM.
I wonder if this *MIGHT* be the cause of Nvidia APIs giving the 560 the upper hand,
but I'm not quite sure; would love some correction.
It's because of the difference in RAM. Dedicated GDDR5 vs shared DDR4 isn't a fair comparison; one of those cheap low-end GPUs you always see online using system memory would be more comparable. Also, Nvidia optimizes game settings for specific GPUs. Even when the in-game settings are the same, the drivers are tampering with image quality. Both AMD and Nvidia are always tampering with games to fix visual bugs, and Nvidia has the software support to do more of that. With a lot of these games being older titles, it makes sense that the GTX 560 Ti, one of the most successful Nvidia GPUs of that generation, would get support like this for longer.
It would also make sense that when it gets zero support at all, it falls flat on its face.
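The shared-vs-dedicated gap is easy to put numbers on (figures assumed from public spec sheets: DDR4-3200 on two 64-bit channels vs the 560 Ti's ~4008 MT/s effective GDDR5 on a 256-bit bus):

```python
def mem_bandwidth_gbs(mt_per_s, bus_width_bits):
    # peak bandwidth = transfer rate * bus width in bytes
    return mt_per_s * (bus_width_bits / 8) / 1000  # MT/s * bytes -> GB/s

ddr4 = mem_bandwidth_gbs(3200, 128)   # dual-channel DDR4-3200, shared CPU + iGPU
gddr5 = mem_bandwidth_gbs(4008, 256)  # GTX 560 Ti's dedicated GDDR5
```

Roughly 51 GB/s split between the CPU and Vega 8, against roughly 128 GB/s all to the GPU; that is the whole bandwidth story in a nutshell.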
thanks consoles for testing the jaguar prototype APU
Do a video on the first 1GB graphics cards
There must be something wrong with your 720p low results in AC. The 0.1% should be way higher than at 1080p, and OC should perform better than stock.
Maybe not enough voltage for the SoC, or cooling (not great with the stock heatsink).
Can you do the 2200G or 2400G vs a GTX 750 (non-Ti) 2GB? And good video :D
Genius. CS:GO mandatory next time and every time! ;-)
Also, do you think AMD cards will pair up better with the 2400G?
Weird results on the Vega 8 overclock; other sites are reporting a significant boost in performance. Yours barely differs between stock and OC'd.
Bisayan_Tek Right? I wonder if it's the RAM bandwidth
TopDatTopHat he mentioned he's using 3200mhz kit already? If I heard it right
Bisayan_Tek hmm
So he did. Something's up though
Definitely seeing CPU bottlenecking, especially at 720p; the CPU is seeing very high usage and the GPUs were dropping from 100%. Also, the 2200G and iGPU are suffering in some games, most likely from TDP limitations, which shows in the 720p results.
Can the Vega 8 run in CrossFire with a low-end AMD add-in card? I remember AMD used to do that with their old APUs.
How about comparing the 2400G to the top A series APU for the AM4 socket.
I'd be interested to see these pitted against the U series (laptop only) ryzen chips
Whaaaat?? O.o GTA V on an APU? Awesome
m8 the beard is... *interesting*
Wish I had this CPU combo when I sold my 1070. The iGPU on the 4570 was barely good enough for YouTube.
What frame rate and system monitor is this?
As Ryzen is an APU, does it support CrossFire with AMD graphics cards? If so, that would also be an advantage: boost performance by buying a cheap AMD GPU!
idk
I'm still baffled by the fact that a 65-watt Zen+Vega APU essentially performs in Sandy Bridge i5 + GTX 750 territory. The latter pulls at least north of 150 watts, easy.
If only the DDR4 pricing were easier to swallow...
Crysis results make it seem as though the Vega 8 is bandwidth starved. It really is hard to get good RAM these days at a reasonable price, however, so that's more than fair.
do you have the 16gb in the ryzen rig in dual channel (8gbx2)?
The GTX 660 would still destroy both the gt1030 and this APU.
And a Radeon HD 7850 would destroy both. The 7850 performed worse than a 660 at release, but now it performs better in most modern titles.
Kepler is one of the worst aging architectures ever made.