Because I don't want to reply to the same comment multiple times over, I'm going to address some of the common questions and comments in a pinned one.

"A lot of games on the 360 run at sub-720p resolutions" - True, but that only applies to 3 out of 12 games on the list (Oblivion @ 1024x600, Crysis 2 @ 1152x720, Far Cry 2 @ 1280x696). I tried running Oblivion at 1024x600 but the game would crash at launch, and I didn't bother with Crysis 2 and Far Cry 2 since they're already very close to 720p.

"But the Radeon X1900 is the equivalent card" - Xenos does share some similarities with R580 (X1900), but with its fixed function shader pipes, R580 doesn't have nearly as much flexibility in pixel shading or geometry processing power. If I had limited myself to DX9 I wouldn't have been able to test nearly as many games either. And lastly, you might be shocked to learn that this card already performs similarly to a Radeon X1900 in DX9 games.

"Why didn't you use a period correct CPU (Pentium 4/D, Athlon 64/X2, Core 2)?" - I wanted to see the full extent of what this card was capable of, and there isn't much point in trying to get close there, since nothing made for consumer PCs comes remotely close to the 360 CPU anyway.

"Crysis 2 doesn't run well on 360 either" - True! After refreshing my memory with a Digital Foundry FPS test, it turns out I was wrong about Crysis 2 being much closer to a locked 30 fps. I haven't touched that game on 360 in years and I guess it didn't perform as well as I remember. I'll call that matchup a tie.
Just to provide extra info on the "x is the actual equivalent card" bit: the Xenos GPU shares its design with both R520 and R600, as it was designed while R600 was still being worked on. It's a very interesting look into GPU design methodology: it's effectively the new R600 IP where that was ready, mixed with older R520 IP for the other areas of the GPU. That's why it's the only DX9-class GPU with unified shaders (that I know of, at least).
For sure, it really proved its worth in the later games especially. There are some games on 360 that always make me think "this shouldn't be possible" (like GTA V and Titanfall).
The OG Xbox didn't have just a regular Pentium 3, it was a cut down one, something between a Celeron and a P3. That being said, I miss this generation; the 360 and PS3 both had hardware that was close to a high end PC. I actually worked at Xbox support during this time period, it was a good time for Xbox.
That's why I said almost off the shelf, but it does get extremely close. You just have to choose whether you want the lower 128K cache (Celeron) or 8-way associativity (P3), can't have both like the Xbox CPU though (at least in a normal desktop/mobile Coppermine). And that's really cool, do you have any notable experiences from your time working at Xbox support?
@@SPNG It was a fun place to work until we got bought out. We were contracted by Microsoft, and for my first couple years there I got to be on their Tier 2 team and on their pilot program for networking. That was a lot of fun and I got to learn a lot, often calling up an ISP with a customer and telling them to do their job, lol. Once we got bought out, though, they slowly took away our ability to properly troubleshoot issues and wanted us to push the customer off the phone as quickly as possible to save money. Eventually they just killed our contract and sent the job overseas; at least I got a full ride to college and an associate's degree because they did this. I did get to meet some of the developers of Windows a few times (because after 5 years of Xbox support I moved over to Surface support), and while I didn't get to talk to him, I was in the same room with Phil Spencer once.
@@robertt9342 That's a misconception caused by the marketing. In reality the Xbox One and PS4 were more powerful for their time, because the native res target was 1080p and IIRC most games ran at resolutions between 720p and 1080p. Meanwhile non-cross-gen games like Alan Wake 2, SW Outlaws, Avatar FoP, BM Wukong, Lords of the Fallen, Immortals of Aveum, Forspoken, Robocop, and Warhammer 40K Space Marine 2 all run at sub-1080p native resolutions like 720p or 800p, and some games like Wukong drop native res even lower than 720p (like 640p) on both PS5/Series X. I know today we have upscaling, but these consoles are supposedly "4K" machines.
@@lopwidth7343 Although by the end the patches weren't too intrusive. If you didn't want to update, it just meant you couldn't play online, and you could just buy games to play on your 360 without ever having to go online. Games were still mostly physical, and digital was pushed as just an alternative, whereas now it's getting harder to buy physical games with all these consoles removing the drives. The 7th generation was truly the last one where we had complete ownership over everything we played.
True, for this suite all the games I'm running, with the exception of Oblivion (1024x600 on 360; I tried using this lower res but the game crashes at launch) and Crysis 2 (1152x720), run at 720p on 360.
Later games are also usually more equivalent to a mix of low and medium settings on PC. Sometimes settings were set lower than the lowest possible on PC. AF was usually set to x2 and sometimes completely disabled and instead some games used trilinear texture filtering to save on memory bandwidth. For games released closer to the end of the generation there were compromises to visuals which didn't even appear on the lowest settings of the PC versions, such as draw distance, LOD pop-in and shadow draw distance. These were crucial optimizations for later games as more (up close) fidelity had to be squeezed out of the aging hardware.
Seeing old ATI Radeon cards is so strange for me because I'm so used to it saying AMD, even though I used plenty of ATI cards back in the day. Also this card looks pretty advanced for a card from 2007. It's fun to look at the way cards have evolved over time. The use of an 8 pin connector is intriguing. Edit: It would be cool if you showed the other hardware you were using on the test bench.
The 2900 GT was pretty much just a neutered version of the 2900 XT at the time, so it does have some of the same higher end aspects albeit toned down a lot. The 2900 series was amongst the first to use an 8-pin PEG connector interestingly enough! I've featured this test bed a fair bit in my other videos, but there is a slide showing all of the specs if you want to know some of the specifics.
TBH, ATI in its last days made masterpieces of hardware, starting with the 3000 series and ending with the 5000 series. Sadly I stopped gaming while having a mid-tier GPU, the HD 4830.
Dude, this video is awesome, I can't believe you only have 4k subs. What a technical treat to showcase a card that's as close as commercially possible to the 360! I guess at the end of the day, being able to optimize for pre-made hardware like the 360 really gave it a lasting edge, as people know how to build for that specific hardware by the end of its life cycle.
Much appreciated man, I'm glad you enjoyed 👍 And yeah when devs only have one hardware config to worry about they can get every last drop of performance out of it. Even then some games on 360 just strike me as impossible, like GTA V or Titanfall. Some incredibly impressive stuff at work there.
Lately I have been starting to re-appreciate how good the late 2000s actually were. A lot of people look at 1998 or the year of the big 3 (2004) as some of the best years, but the entirety of the late 2000s was freaking awesome. Just to name some highlights:

F.E.A.R. (2005), Quake 4 (2005), Half-Life 2: Episode One (2006), Prey (2006), Call of Juarez (2006), BioShock (2007), S.T.A.L.K.E.R.: Shadow of Chernobyl (2007), Half-Life 2: Episode Two (2007), Portal (2007), Team Fortress 2 (2007), Crysis (2007), Crysis Warhead (2008), Fallout 3 (2008), World of Warcraft: Wrath of the Lich King (2008) - arguably the best WoW expansion, Left 4 Dead (2008), Mirror's Edge (2008), Far Cry 2 (2008), Left 4 Dead 2 (2009), Wolfenstein (2009), The Sims 3 (2009), Borderlands (2009)

A couple of the titles above were not very popular upon release, but have become more popular in retrospect, especially if you compare them to the crap released today. For example, Wolfenstein (2009) and Mirror's Edge are true classics I have played through multiple times, and Quake 4 was criminally underrated in its era. I truly feel Crysis, Wrath of the Lich King, and The Orange Box can be seen as the late 2000s personified. The end of the decade also brought The Sims 3, which was arguably the best Sims game.

Of course this is not even mentioning the elephant in the room that is the Xbox 360 and PS3. I feel it was the last console generation that truly had some hype to it. Flash forward to 2013/2014 and games started the transition to the garbage we are fed today.
@@zephaniahharrison5171 tl;dr driver support was dropped in October 2021; I used it as a native adapter for my CRTs. I still have to decide where to finally place it, because honestly I mainly use and work on my more powerful and modern PC, but I got an all-in-one which may be its perfect final home once I get around to restoring it. The GT 710 is powerful enough to run Halo 3 at 1080p, 60 FPS on High with the original graphics.
Ahhh, the memories, when you could put a DVD in the drive and just play a game! We've not progressed at all since the 360, gaming has most certainly gone backwards.
Eh, I had 6 360s break on me. And they intentionally made a special version without HDMI even years later, just to save 40 cents on each return unit for the people who bought one before HDMI was added.
@@matt5721 I had two 360s drop out on me, both were originals (neither died of RROD, surprisingly, haha), but I've had my Slim since 2021 and it just keeps on keeping on.
It's amazing how, in the span of a few years, we've returned to games needing to buffer at the start screen, poor optimization, and bad load times on PCs that aren't better than the recommended specs.
I have been a PC gamer for most of my life. I had some NES Chinese ripoffs as a kid, but ever since I got my first PC in 2000 I played games almost exclusively on PC (not that I was very good at them, but that's a different story...) and had very little interest in buying a contemporary console ever since. However, a few months ago curiosity got the best of me and I decided to buy a hacked, refurbished Xbox 360 S with tons of games on it to see what playing on one of these things is like. I was pleasantly surprised how well almost 20 year old hardware still holds up in terms of visuals, even on modern big-ass TVs; the game designers were truly magicians back then. It looks like this generation of consoles is still legendary to this day for a reason...
PC gaming was in kind of a rut in 2007 iirc. Vista and DX10 made everything... jank. I remember nvidia shipping their XP drivers as 'Vista-compatible' and them just not working lol. At this point I was still on an aging Pentium 4 system with an AGP slot and ATI was one of the few companies holding out for us non-PCIE peasants.
It's honestly sad what happened back then. It's a shame hardware devs didn't pay attention to how different Vista actually was to XP and were caught with their pants around their ankles. Vista and XP have so little in common, it's insane.
I've always maintained that the 360 was the last console to compete with high end PCs at launch. I don't think a single other console has come close to that since. The PS3 would have been a contender too if it hadn't been delayed.
At least on the graphics side, considering some of the fastest cards you could buy at the end of 2005 were the 7800 GTX and X1800 XT, the 360 was several cuts above!
Regarding the AA resolve: I don't think it was a bug, since it's retained for all R600 series chips. Even the HD 3870, its successor/shrink, does it like that. My guess is that they tried to save some die space and validation time in the ROPs, and they figured they had so much compute they might as well do it on the shaders. Also, I might be able to shed some light on the performance profile vs the Xbox 360. Especially early games were often CPU limited, as per core the Xenon is pretty weak, about on par with a 2.8GHz Pentium 4 (Prescott). That's why you had much higher framerates. In later games, once they got to grips with the hardware, they actually started offloading GPU tasks to the big Xenon vector units (VMX128) in order to save precious GPU time. That's why later games run worse on the 2900 GT (and because of the TeraScale drivers).
I'm just surprised they waited until the HD 4000 series to fix the issue, given all of the negative reception it got. RV670 would've been a perfect opportunity with the die shrink, that chip is tiny compared to R600.
@@SPNG Silicon development takes time, and there's actually not much time between them, a little over a year. These days we barely get a new generation every 2.5 years.
Games also started to take advantage of parallelization/multithreading, which was where the 360's CPU shined. Since Xenon was originally based off CELL, I think offloading the GPU and decent multicore performance was always the intent with that CPU architecture, which possibly explains why the cores were so lacking in other areas (out-of-order execution, cache, IPC, etc.). They were likely banking on multicore taking over, which I can't say they were wrong about tbh.
@@SPNG I remember my ATI HD 4850 512MB could run CoD MW2 at 1080p with 4x MSAA. That card was legendary for its price to performance, although the 4870 definitely aged better due to the extra memory bandwidth.
It's pretty crazy how long the Xbox 360 and PlayStation 3 lasted. Hard to believe it's still a real trial to emulate these systems on all but the most overkill PCs money can buy in the present ecosystem, too.
These systems got a crazy amount of longevity, and I think a lot of their exclusives still hold up pretty well. I remember trying to emulate Gran Turismo 6 on PC but it was pretty rough haha, the emulators for both are always improving though.
Emulating a different processor architecture and its attached hardware is always a relatively time consuming task. One clock cycle of the emulated CPU usually takes several cycles on the host CPU. Even with dynamic recompilation, i.e. 'JIT', there is always a big performance hit. Apple spent a lot of time and money optimising their emulation when they went from 68K to PPC, then from PPC to Intel x86, and from x86 to ARM, but each time they couldn't achieve anything like native performance on the fastest new Macs. Eventually new hardware becomes fast enough for the emulation to outpace the old computers, but it usually takes a few years.
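As a toy illustration of where that overhead comes from (this is not how a real 360/PS3 emulator works, just a sketch with a made-up accumulator machine): even the simplest interpreter spends a fetch, a decode, and a dispatch branch on the host for every single guest instruction, before doing any actual work.

```python
# Toy interpreter for a made-up 3-instruction accumulator machine,
# to show why pure interpretation costs many host operations per
# guest instruction. Opcodes are invented for this sketch.
LOAD, ADD, HALT = 0, 1, 2

def run(program):
    acc = 0
    pc = 0
    while True:
        op, arg = program[pc]   # fetch the guest instruction
        pc += 1
        if op == LOAD:          # decode + dispatch
            acc = arg           # finally, the actual work
        elif op == ADD:
            acc += arg
        elif op == HALT:
            return acc
        # Every guest op cost a tuple unpack, an increment, and a
        # branch chain on the host. Dynamic recompilation (JIT)
        # exists to amortize exactly this per-instruction overhead.

print(run([(LOAD, 2), (ADD, 3), (ADD, 5), (HALT, 0)]))  # 10
```

A JIT instead translates a whole block of guest instructions into host machine code once and reuses it, which is why it's faster, but as the comment above says, there's still a big hit versus native.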
A Far Cry 3 test would be interesting. I know it uses a weird resolution on the 360 (704p if I recall correctly), but the highlight is that it runs extremely badly, being a 15-25fps game most of the time.
Far Cry 3 on the lowest settings on PC still has better graphics (mainly textures) than the console versions. And from what I know it was easy to run on lowest, so it could maybe work!
The jump in security Microsoft made between the OG Xbox and the Xbox 360 was staggering. They went from a complete mess to a system that still hasn't had a software exploit almost 20 years later. Jumping from a Pentium 3 to a PPC chip with a hardware-backed hypervisor and signing design in 2005 was stupid impressive for the time. And you could try saying the KK exploit was software, but it required (and still would require) hardware modification in the form of a flashed DVD drive.
I agree, it is really impressive how they improved the security, but the lack of it was one of the biggest reasons why I love the OG Xbox modding scene. It's so easy to get into, and the fact that you can do stuff like swapping the CPU for an entirely different desktop chip is amazing to me
@@SPNG interestingly enough a few years ago some unused Xbox 360 CPUs appeared on eBay, having zero efuses blown. Some people said they were going to CPU swap it and then create their own keys and blow the correct fuses. I don't know if anyone did, let alone post about it. Would be really interesting to see though.
Wow I had this card. Never seen this card mentioned ANYWHERE. It actually sucked lol. All the heat of an HD2900XT or PRO and none of the performance. Drivers for this card were a nightmare when it launched. ATI didn't seem to want to acknowledge its existence for a few weeks when it first came out. Then it stopped even being sold after like 6 months.
You're the first person I've heard of having one back in the day 🤣 What made it worse is it came out like days before the HD 3850, which would have been faster and even cheaper I think? I can't even find a review from when the card came out (or a review at all to be honest). How much did you pay for yours back in the day?
I scrolled some and didn't see it mentioned, but many games do not render internally at a full 720p like the PC versions would. As well, some titles were limited by the CPU in the Xbox and scaled MUCH better on your test bench (though you already know that). It is for this reason that we run into additional issues with making the best possible comparison. (I'm curious if something like the old Durante DSfix could be used to inject an internal render resolution matching what the console was actually rendering pixel-wise.) I did enjoy the video, and the effort to go over the technical aspects is always appreciated by me! Keep up the great work!
3 out of 12 games on the list use a sub-720p resolution on 360 (Oblivion, Far Cry 2, Crysis 2). I tried running Oblivion at its 360 resolution of 1024x600 but it would crash on launch, Far Cry 2 might as well be the same since it's 1280x696, and Crysis 2 only drops it to 1152x720. Also, I wanted to make sure the CPU wasn't a limiting factor here, and I think in a sense that can account for any CPU-related console optimization. If I had used a more period correct chip I could've experienced some issues with the later games in particular. Much appreciated by the way and will do, there's more to come 👍
@@another3997 Pretty unique to the 360 era, because the PC got straight up old gen ports. There were problems in the PS2 era as well with missing graphical effects, but there the games were at least ports of the current gen consoles.
Makes me wonder if the Xbox was held back by its 128-bit bus and the CPU. Seems like the GPU in the 360 could have performed better but was gimped by everything around it.
I'd wager much of the differences come from the memory fine-tuning that consoles enjoy. A lot of your tests seemed to run into what looks like VRAM limitations on the 256MB card, but on 360 it was fairly routine from what I understand for games to go above 256MB memory use for graphics.
That sounds about right, I was definitely pushing it with some of these settings trying to match the image quality. Time to bust out the BGA rework station and get 512MB on the GT 🤣
It's more like an X1800/X1900 class GPU but built on the TeraScale architecture: DX9 only, with very limited features. The HD 2900 cards sucked, but they were neat. They were all known for running hot, and new paste and pads are essential to run them nowadays due to aging.
It's true that Xenos has a lot of aspects of R580 (X1900), but with fixed function shader pipes it doesn't have nearly as much flexibility in pixel shading and geometry processing power. Also, if I had limited myself to DX9 here I wouldn't have been able to test nearly as many games. I didn't mention this in the video but this card performs extremely similarly to the X1950 XTX in DX9 games.
In my experience, the game that pushed my old Xbox 360 Arcade the most was Crysis 2. You could feel the poor thing coughing and screaming when nearing the end of the game, with too much happening at once... I miss it so bad :(
When I was comparing performance I was kind of going off old memories of how it performed. I really thought it stuck to 30fps for the most part, but after refreshing my memory with some videos that doesn't seem to be the case; it kinda chugs, to be honest.
Surprisingly, Supreme Commander absolutely thrashes the Xbox 360. Considering it does the same to hardware with twice the performance, I guess the 360 got the last laugh.
Yeah, the later games ran better on 360 just because of how devs could optimize the code for the hardware; who knows what nuanced changes they could make for those versions to get every last frame they could. That's to say nothing about the structure of code execution on a more streamlined piece of hardware, either. Great video
There's really no replacement for those optimizations on PC, but I was still impressed with how this card managed. Even some of the losses were closer than I was expecting. Much appreciated, and thanks for watching 🙏
If I'm remembering correctly, the Xbox 360 GPU was based on the ATI X1000 series, codename R500, no? R500 and R600 are completely different architectures. R600 was natively compatible with the DX10 API; R500 only supported DX9.0c.
It's true that Xenos has a lot of aspects of R580 (X1900), but with fixed function shader pipes it doesn't have nearly as much flexibility (as R600 or Xenos) in pixel shading and geometry processing power. Also, if I had limited myself to DX9 here I wouldn't have been able to test nearly as many games. I didn't mention this in the video but this card performs extremely similarly to the X1950 XTX in DX9 games. - taken from one of my earlier replies lol
@@SPNG And I had one and stupidly sold it. Having a Power Mac G5 with Microsoft Game Studios stickers was so cool. It would have needed hundreds of dollars of parts to boot, and I thought I was being wise by getting rid of a computer I knew would never run again.
I would say the major flaw in your testing is the fact that a lot of games on the 360 run at a lower resolution than 720p and are upscaled internally which gives better performance.
Only 3 out of the 12 games tested are like this (Oblivion, Far Cry 2, Crysis 2). I tried to run Oblivion at its 360 resolution of 1024x600 but it would crash at launch, Far Cry 2 runs at 1280x696 which is so close the results might as well be interchangeable, and Crysis 2 only drops it to 1152x720.
Glad that you like it! I think having both tells more about how a game actually felt, and what kinds of stutter were experienced. Much appreciated and welcome to the channel 👍
With an updated GPU at least. I don't think they'd be capable of running the dev software due to a lack of support though, I think those systems originally came with an ATI X800-derivative FireGL card
This is how these consoles went down: the PS3 had a significantly faster CPU that could do some graphics calculations, while the Xbox 360 had a much more powerful GPU that was mostly unified, minus the AA daughter chip. The PS3's GPU was just a 7800 GTX.
Well yeah, a CPU with 1 PowerPC core + 8 128-bit SIMD vector units could easily beat one with 3 cores / 6 threads. As for the GPU, Xenos has more texture units but a lower clock; I think it makes around 240 GFLOPS while the RSX makes 230. But the eDRAM makes it much more superior.
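For what it's worth, the commonly quoted ~240 GFLOPS for Xenos falls out of simple arithmetic. The inputs below are the usual spec-sheet assumptions (48 unified ALUs, each doing a vec4 MADD plus a scalar MADD per clock, i.e. 5 MADDs = 10 flops, at 500 MHz), not measured numbers:

```python
# Back-of-the-envelope Xenos shader throughput; all inputs are the
# commonly cited spec figures, not measurements.
alus = 48             # unified shader ALUs on Xenos
flops_per_clock = 10  # vec4 MADD (8 flops) + scalar MADD (2 flops)
clock_hz = 500e6      # 500 MHz core clock

gflops = alus * flops_per_clock * clock_hz / 1e9
print(gflops)  # 240.0
```

RSX figures vary more depending on what you count (vertex vs pixel pipes, fixed-function work), which is why quoted numbers for it are all over the place.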
@@amdintelxsniperx The Xbox CPU is an IBM PowerPC with 3 cores at 3.2GHz. What are you talking about? Comparing it to a 2.8GHz Prescott isn't funny, man. Try running MGS V, Tomb Raider 2, or GTA V on a 2.8GHz Prescott. You'll see nothing.
Wait, which version of GTA 4 did you play? Because the first version had a few graphical features that were cut later. Also, I dunno about GTA 4, but GTA 5 uses the pagefile, and your game may be unstable if your page file is in auto mode. Plus, is your GTA 4 with Games for Windows Live (even if offline), or is it something like xliveless?
I'm not sure which precise version I was using, but I think it was on the newer side? Either way, the card would have lost, that game is just optimized so poorly on PC. I was amazed to see it struggle on some more powerful cards in my own testing
Man, this was a really well made video. I stumbled upon it because I was actually planning to build an Xbox 360 PC myself, and this was a blessing. Nice, lots of information, and overall performance was excellent.
Much appreciated, I'm glad you enjoyed! Finding an equivalent CPU is definitely a challenge as there's pretty much nothing like Xenon made for regular systems, but you could just go off instructions per second and something with the same amount of cache.
I bought an Xbox 360 in 2012, and I can't believe that hardware which had already been out for 7 years at that point could run games like Halo 4 and GTA V. I hope my Xbox Series X can be played for ten years too.
The OG Xbox and 360 GPUs were so interesting, feeling like functional mass-production prototype GPUs that paved the way for the next generation of desktop graphics logic, and it's a miracle how they milked the 360 dry of its performance in later games. It was a technical marvel, but boy did they waste power to get there. The Xbone was so disappointing that after playing only 1 game, I handed the keys to my brother as an upgrade from a PS2 and moved back to PC gaming. The Sempron and FX 5200 that made up my first gaming PC (then an Athlon 880K + 1060) couldn't handle the more modern games, nor did it have the internet stability, which I think goes under the radar for a lot of later 360 players who weren't used to how spotty the OG Xbox could be.

To this day I am still playing titles on the 360 that you just couldn't get on PC. No exclusivity deals like the following generations; if you wanted a game, you bought the platform. THAT'S what consoles were strong for. Now? They're nothing more than a snazzed up GUI to launch games on a gamepad with. It makes me think the Wii U had the right idea: being able to take the power with you yet dock it when at home... sounds like a Steam Deck... and I really think that's the future of console gaming. With so little between what could be an MS/Sony handheld and an Ally etc., console/PC gaming optimisations could meet as one with potential native Linux support (though I find that hard for MS to swallow), and the big war of PC vs console may be settled with portable powerhouses that we carry everywhere. Maybe the Nokia N-Gage wasn't so wrong either... a large phone screen that can dock into a handheld chassis with a dGPU, batteries etc.? Sounds like it'd be an interesting shift to an all-in-one gaming phone/console/PC handheld :P
I noticed your test bench uses Windows 7. How did you play Dark Souls 2 on Windows 7 without Steam support? If Steam was already installed, can you still play games you've already downloaded offline, or did you have another workaround?
Got my 360 jailbroken with RGH and a 1TB HDD, and it basically became my main gaming machine, and I have a pretty beefed up PC. You can argue all day, but the plug-and-play-ness of consoles is a godsend. Can't beat a gaming session on a comfy couch with a huge TV after a hard day of work.
I'm not an expert on how the shared memory of the Xbox 360 works (I know how it's done on the old Amiga in detail), but as the game creator could divide the RAM (512MB) into CPU/GPU slices as desired, there has to be a downside in access times. Even when you split the memory exactly in half, there has to be an element that controls access to the RAM, which always introduces wait states. Dedicated memory for the CPU/GPU would be faster, but of course 256/256MB or 512/512MB have their own downsides (not the best solution for all games, or too expensive). So even when you compare identical GPUs, it will be hard to get a perfect match. It's always amazing how a system, even with a "bad framework" like DirectX, could be optimized just by getting rid of the factors that different hardware creates. I played GoW 2 two weeks ago on my 78" OLED, and besides being blurry, the look and feel is still great. Thanks for testing and creating the video 🙂
I mean, I guess we have to take what we can get though haha. In that case maybe 512MB of VRAM would be more suitable (probably better to have too much instead of not enough) but this is the only R600 card with this GPU configuration period, they didn't make any 2900 GTs with 512MB 😔 Thanks for watching man 👍
An enjoyable, informative video, nicely done 👍 Feels crazy now, because I try to match all my game settings to 360 level on my Steam Deck, and it just amazes me how far we've come.
@@SirQuacksalotthe1st Much appreciated 🙏 And very true, with a lot of today's handheld gaming PCs you're getting well over that level of performance with a fraction of the power consumption!
@@SPNG I know, it's crazy. Having every single one of my childhood games playable in the palm of my hand... sometimes when I think about it I can't stop myself from giggling like a damn child haha, because part of my brain registers it as so absurd, but it's true.
Seeing this made me appreciate the 10 series cards even more. GPUs like this one, at around 11 years old, were barely managing 30fps at 720p, while 10 series cards, roughly 8 years from when they came out, are still very capable in today's titles.
I think the 256MB of VRAM is too low for the PC versions of the games. A lot of AAA games on PS3/360 run at a slightly lower resolution like 6xxp, and not always in true 16:9, like Crysis with 1152x720. This saves a bit of power and VRAM. I like the Gran Turismo soundtrack.
True, I did try to run Oblivion at the 1024x600 resolution it uses on 360, but it would crash on launch. For Crysis 2 I thought the difference was too small to be worth mentioning. Those are the only games in this suite that really use a sub-720p resolution (technically Far Cry 2 does as well, but it's 1280x696, so pretty much the same lol). And thanks! I love the lounge tracks in those games 😀
It wasn't so much the GPU that made the 360 struggle; both the PS3 and 360 were pretty much behind once quad core CPUs came to the market (I guess even the dual cores were better at some point early on). You did get better looking games throughout the generation, but the performance was always tricky to achieve.
"Capable of doing 4x MSAA with a negligible performance hit". Not true. A single 4xAA 720p framebuffer would exceed the 10MB of the EDRAM. That's why 2xAA was commonly used instead.
This is kind of splitting hairs a bit. There is indeed a performance cost with 4xAA, since it's doing enough on the eDRAM to require tiling, but it's very small. "ATI have been quoted as suggesting that 720p resolutions with 4x FSAA, which would require three tiles, has about 95% of the performance of 2x FSAA." - ATI Xenos: Xbox 360 Graphics Demystified (Beyond3D, Jun 13 2005)
You may be better off getting a Slim or something. I got my system several years ago and it has been going strong, some of the original models can be reliable too but you have to know what revision board you have
My first GPU was an old X1950 Pro a friend found for free and gave to me around 2013, I believe. Not really the same as this, but the looks reminded me of it. It somehow made DayZ playable on the 2-core, 2GB RAM prebuilt I had at the time.
There wouldn't have been enough compute power to run FSR 2 properly imo. I know the Switch does it well with No Man's Sky, but I genuinely can't imagine the 360 handling that much compute, and even then it wouldn't have the capacity to do the TAA that FSR 2.0 needs.
This video isn't really focused on trying to recreate the entire system, as replicating the CPU is pretty much impossible with regular PC parts (it is as far from off the shelf as you can get), and I wanted to see the full performance of this card.
As far as I know the PS3 was a similar case: its GPU, the RSX, is an Nvidia G70 (GeForce 7800 GTX) but slightly modified in ways similar to what's explained in the video. The RAM also matches: 256MB of GDDR3 running at 650MHz.
Hey bro, fantastic vid. If you ever get serious about making videos like this, I'd suggest getting a fluid head tripod, especially for pans. Static heads are just fine for stationary shots and repositioning, but when you try to pan it gets jittery like that. Just a heads up, and keep up the good work.
Much appreciated, and thanks for the tip 🙏 I didn't even know that was a thing lol, I'll have to pick one up at some point. The rough pans always bothered me a bit, I can't get them to be perfectly smooth no matter what
Id have been thrilled to have a 2900 when oblivion launched. I was still running an x800, then a 7300 GT, and then an 8800 Ultra the following year. Obviously then I had no more troubles.
@@Loundsify not too impressive these days; I had a 4-way HD 5870 setup at one point. I've had a ton of multi-GPU setups. A weird one was 2x HD 4850 and an HD 4830 in 3-way. 2 GTX 580 3 GB, 2 Kepler Titans, 2 GTX 980 Classified, 2 GTX 1080 Sea Hawk X EK, 2 GTX 1080 Ti Hydro Coppers, and lastly 2 RTX 2080 Ti FEs. Sadly multi-GPU is pretty much dead now. So a single 3090, and a single 4090 more recently.
@@chincemagnet it's mainly because multi-GPU didn't scale linearly; quite often you would only get a 50-60% boost in FPS, and sometimes it would be worse than one card. I had an HD 7870 XT 2GB in CrossFire. That card was a cut-down HD 7950, so it was much faster than the standard HD 7870 and really should have been called the HD 7930.
@@SPNG mine supports 2133 RAM speeds, I just can't find the frequency setting. My Z97-K board has an XMP setting which allows you to choose between two XMP profiles.
Glad u made another vid like this. Hope you make a Nintendo Switch PC equivalent video so I can see if I am being beaten by a 2017 handheld on my laptop 😭😭
Recreating it in a whole PC would be a challenge due to the ARM CPU, but just the GPU comparison would actually be plausible thanks to a certain similar NVIDIA card, and I already have it 👀
My PC with a Celeron N3050 is worse than the Switch in portable mode. The only good thing is I know which games would definitely run fine if they were on the Switch, like the Stalker game coming out next month, and Just Cause 2 or Sleeping Dogs.
The issue with any comparison to the Xbox 360's GPU is that it was very different from all the conventional single-die GPUs of the time, and indeed today. The point of the 360's GPU and 10 MB eDRAM setup was to create a high-bandwidth framebuffer, which no PC GPU had by comparison. It had one die with the main GPU logic, and a daughter die which could handle a bunch of Z-buffering and blending at low performance cost. This worked very well except for one major problem: 10 MB of eDRAM was not enough to store a 1280x720 frame with typical color and depth values. It was only enough to store a 1024x600 frame. So you either had to render at 1024x600 or tile the output, which impacted performance unless you used certain mitigations. Many early 360 titles thus rendered at 1024x600 and got very low-cost anti-aliasing. Later games tiled. On PC you usually targeted higher resolutions anyway; most cards by 2007 had more VRAM available and more bandwidth across it. So several of the games you test at full 720p were not 1280x720 on Xbox 360. Examples: Elder Scrolls 4, Crysis 2, Far Cry 2, etc.
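The framebuffer and tile counts discussed in this thread can be sanity-checked with some quick arithmetic. This is a rough sketch assuming the commonly cited format of 32-bit color plus 32-bit depth/stencil per sample (8 bytes); actual games varied their formats:

```python
EDRAM_BYTES = 10 * 1024 * 1024   # Xenos daughter-die eDRAM capacity
BYTES_PER_SAMPLE = 8             # assumed: 4 B color + 4 B depth/stencil

def framebuffer_bytes(width, height, msaa=1):
    """Raw framebuffer footprint for a given resolution and MSAA level."""
    return width * height * msaa * BYTES_PER_SAMPLE

def tiles_needed(width, height, msaa=1):
    """How many eDRAM-sized tiles the frame must be split into."""
    return -(-framebuffer_bytes(width, height, msaa) // EDRAM_BYTES)  # ceiling

print(framebuffer_bytes(1280, 720) / 2**20)  # ~7.0 MB: 720p with no AA fits
print(tiles_needed(1280, 720, msaa=2))       # 2 tiles at 720p 2xAA
print(tiles_needed(1280, 720, msaa=4))       # 3 tiles at 720p 4xAA
print(tiles_needed(1024, 600, msaa=2))       # 1 tile: 1024x600 gets 2xAA without tiling
```

Under these assumptions the numbers line up with the Beyond3D quote cited earlier in the thread (three tiles for 4xAA at 720p) and with 1024x600 getting low-cost AA in a single tile.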
When you say that, I gotta emphasize that I'm suggesting it's an "equivalent". Out of all PC graphics cards, I think this one is the closest parallel to Xenos due to its shading throughput, the features borrowed from Xenos, and everything else mentioned in this video. I'm not suggesting they're exact replicas of each other, of course, just that the 2900 GT is the closest equivalent you can get. Also, out of the twelve-game suite, those three games are the only ones that run at a resolution other than 720p on 360 (really two, since Far Cry 2 runs at 1280x696).
Crysis 2 did not run smoothly on the X360 at all. During intense moments the frame rate dipped into the mid-to-high teens; it rarely even hit a stable 30 fps. Check Digital Foundry; they've covered Crysis on consoles lots of times.
@@SPNG yeah, it was bad. But I finished Crysis 2 on my X360 and loved it a lot 🤣 Back in the early 2010s gamers were a lot more forgiving of poor performance. Today it seems unacceptable, but then... well, it was what it was. We played it, we loved it, we had fun at 25 fps 😅
The Xbox 360 was nuts. My ex and I were always big into gaming and had a pair of pretty powerful gaming PCs at the time. I was a diehard Nintendo fan until Wii, and then went Xbox. The 360 blew me away and I almost exclusively used my Xbox over my PC except for WoW. Peak gaming era.
The 2900 XT and the 2900 Pro were practically the same card, with the latter being 100 dollars cheaper at the time. I bought one and flashed the XT BIOS to it. I was able to get a more aggressive fan curve and higher clocks. It made a difference in GTA IV.
Something to note about GTA IV, though I don't know if it can be applied in these tests: on modern systems you can actually use DXVK by just dragging its DLLs into GTA IV's folder, making the game render through Vulkan. An issue GTA IV suffers from, along with a number of other DX9 games in some shape or form, is that since Windows Vista, Microsoft has used a specialized version of DX9, and this DX9 implementation does GTA IV no good. I can't remember exactly why using DXVK on GTA IV improves performance, but I believe it offloads some CPU overhead or something like that.
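The drop-in install described above boils down to copying DXVK's 32-bit d3d9.dll next to the game executable (GTA IV is a 32-bit game, so it loads the x32 build). Here is a minimal sketch of that copy step; the example paths in the comment are hypothetical and depend on where you extracted the DXVK release:

```python
import shutil
from pathlib import Path

def install_dxvk_d3d9(dxvk_x32_dir: str, game_dir: str) -> Path:
    """Copy DXVK's 32-bit d3d9.dll into the game folder so the game's
    Direct3D 9 calls get translated to Vulkan on the next launch."""
    src = Path(dxvk_x32_dir) / "d3d9.dll"
    dst = Path(game_dir) / "d3d9.dll"
    shutil.copy2(src, dst)  # overwrites any d3d9.dll already sitting there
    return dst

# Hypothetical paths -- adjust to your own extraction/install locations:
# install_dxvk_d3d9(r"C:\tools\dxvk-2.4\x32",
#                   r"C:\Program Files (x86)\Rockstar Games\Grand Theft Auto IV")
```

Removing the copied d3d9.dll from the game folder reverts the game to the system's Direct3D 9.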
A lot of GPUs are this way. The connectors are usually just big 12v supply rails, so the missing two pins just means that you risk the wire getting hotter as the current draw goes up.
@ZeroX252 The 2 extra pins don't carry 12V. One of them is ground and the other is a sense pin, and in my experience most cards won't work if the sense pin isn't connected.
The games in this suite that are sub 720p on 360 are TES Oblivion, Far Cry 2, and Crysis 2 to my knowledge. I tried to run Oblivion at the 1024x600 resolution it uses on 360 but the game would crash on launch. The 360 runs Far Cry 2 at 1280x696 (but the difference is so small it might as well be interchangeable with 720p). Crysis 2 is 1152x720 on 360 but it wasn't in the default display modes and even if I edited the config file, running it at that resolution would've hardly made a difference in performance. There's a multitude of other games that are sub 720p like you mentioned, but that should cover the ones in this suite at least.
Cool, cool, cool!!! Great video!!! I'd love to see more videos like this. Maybe an Xbox One GPU video next? I think the GPU performance was surprisingly good, and I agree with your findings!!!
Sucks that you didn't test GTAV, it would really show how far the optimization stretched the 360's hardware. I'm guessing you had issues getting it to work.
The game worked, but I felt it would be redundant with GTA IV already on here. Not to mention GTA V really struggles on Terascale cards in particular, while GTA IV performs like crap on all hardware
GTA IV is absolutely fine; it just doesn't like FPS monitoring tools like RivaTuner, which cause the stuttering. Also, many later games ran at weirdly low internal resolutions below 720p, so the card seems really comparable overall. Shame there weren't DX10 drivers to test out with GTA V. I'm sure with the right equivalent settings it would handle that too, since getting it crammed onto the Xbox 360 seemed so surreal at the time, although it ran at that pseudo-fake 30 fps that felt more like 10-15 input-wise. Double or triple buffered, like Far Cry 3.
I was using Fraps, not Rivatuner. Stutters and performance were the same regardless of whether it was running or not. I did try GTA V, but I felt it would be redundant to include with GTA IV already on the list and due to the drivers it tends to perform badly on Terascale cards in particular, while GTA IV evenly distributes its poor performance lol
@@SPNG Yeah, you need it completely uninstalled, not even on your system. It's such weird software like that. I mean RivaTuner. Okay, I didn't know GTA V even ran with those old ATI drivers at all.
Yeah, I remember installing GTA IV, and it's so bad on PC. Even with patches and stuff, it just isn't smooth. GTA V, on the other hand, is greatly optimized. The 6th and 7th console generations received many releases that were better than their PC counterparts; that changed from 2010 onwards.
I remember my dad getting me and my older brother one back in the early 2010s. Considering I was grown up on an original Xbox I was amazed by how much more capable the 360 was
I expect that running some of these games on Linux through Wine may actually perform better... Mesa has had more significant updates than Catalyst, so the drivers may just work better. DX9 only, I think, though. No DXVK, since the card is waaay too old for Vulkan.
Just as an example of how neat modern Mesa drivers are... there is experimental support for Vulkan on some TeraScale cards like the 6970 and 6870. Absolutely crazy.
@@SPNG Oh snap, I didn't realise the card had no VK support, that's a shame. I also just checked on the Guru3D forums where the modder who creates the NimeZ drivers posts his stuff, and he only has TeraScale 2 & 3 drivers, nothing yet for TeraScale 1 :(
It’s more complicated. But I’ve heard that using DXVK in GTA:IV fixes many of the issues. Course, it isn’t relevant with this card, but for Vulkan-capable cards, it really helps frame rate and frame times.
I know there wasn't a direct Halo 3 port to pc but could you quickly compare the performance of this system on MCC next to Halo 3 on the 360? could make for a good "short"
If you're ever interested in benchmarking GTA IV again but having it be actually playable, use DXVK, FusionFix, and various other fixes. I know the point of the vid is supposed to be 1:1 with the 360 versions, but I feel like GTA IV is an exception here.
Nice video. Can you make a Supermicro workstation build? A standard ATX or E-ATX PC case and standard PSU with an old Supermicro motherboard, LGA 1366 for example.
Thanks! Maybe at some point, I've never used a 1366 system and a workstation could be fun. Just a matter of finding one for a decent price, I'm a bit of a cheapskate haha
Oblivion would be more of a CPU test. Like Morrowind before it and Fallout 3/New Vegas later, it was ridiculously CPU bound; the bottleneck with those games was memory, or the lack thereof.
That would be a fun build, only thing is this card is REALLY hard to find! It took me so long to find one. It's the ultimate pointless card from its time but good luck finding it
@@andrzej9880 Not trying to recreate the entire system here. That is pretty much impossible due to the CPU being so different from anything made for regular PCs
Because I don't want to reply to the same comment multiple times over, I'm going to address some of the common questions and comments in a pinned one.
"A lot of games on the 360 run at sub-720p resolutions" - True, but that only applies to 3 out of 12 games on the list (Oblivion @ 1024x600, Crysis 2 @ 1152x720, Far Cry 2 @ 1280x696). I tried running Oblivion at 1024x600 but the game would crash at launch, and I didn't bother with Crysis 2 and Far Cry 2 since they're already very close to 720p.
"But the Radeon X1900 is the equivalent card" - Xenos indeed shares some similarities with R580 (X1900), but with fixed function shader pipes it doesn't have nearly as much flexibility in pixel shading and geometry processing power. If I had limited myself to DX9 I wouldn't have been able to test nearly as many games as well. And lastly, you might be shocked to learn that this card already performs similarly to a Radeon X1900 in DX9 games.
"Why didn't you use a period correct CPU (Pentium 4/D, Athlon 64/X2, Core 2)?" - I wanted to see the full extent of what this card was capable of, and I wouldn't say there's much point in trying to get as close as I can there since nothing made for consumer PCs is going to get remotely close to the 360 CPU anyway.
"Crysis 2 doesn't run well on 360 either" - True! After refreshing my memory with a Digital Foundry FPS test, it turns out I was wrong about Crysis 2 being much closer to a locked 30 fps. I haven't touched that game on 360 in years and I guess it didn't perform as well as I remember. I'll call that matchup a tie.
Just to provide extra info on the "x is the actual equivalent card" bit: The Xenos GPU shares its design with both R520 and R600 as it was designed while R600 was still being worked on and is a very interesting look into GPU design methodology because it's effectively the new R600 IP where that was ready mixed with older R520 IP for the other areas of the GPU, which is why it's the only DX9-class GPU with unified shaders. (That I know of, at least)
It's just amazing how powerful a GPU the Xbox 360 had, it really was high end in 2005
For sure, it really proved its worth in the later games especially. There are some games on 360 that always make me think, this shouldn't be possible (Like GTA V and Titanfall)
this card is from 2007.
@@cooleyYT what's fanboy about it? It's facts.
@@Ronaldinio69 it's a weak console and got beaten multiple times in games. PC always wins, buddy.
@@cooleyYT man, the 360 is probably older than you.
The era of transparent GPUs and cool flames!
Some may call it cheesy, but I prefer it over the samey looking cards we get today. I don't know, they all look so gamery
I always wanted to put a light on mine. First gaming PC I had with a plexi side panel.
The OG Xbox didn't have just a regular Pentium 3; it was a cut-down one, something between a Celeron and a P3. That being said, I miss this generation. The 360 and PS3 both had hardware that was close to a high-end PC. I actually worked at Xbox support during this time period; it was a good time for Xbox.
That's why I said almost off the shelf, but it does get extremely close. You just have to choose whether you want the lower 128K cache (Celeron) or 8-way associativity (P3), can't have both like the Xbox CPU though (at least in a normal desktop/mobile Coppermine). And that's really cool, do you have any notable experiences from your time working at Xbox support?
Yeah but for the purposes of rasterization what it cut out was less of an impact. It performed like a P3 Coppermine in terms of gaming.
@@SPNG It was a fun place to work until we got bought out. We were contracted by Microsoft, and for my first couple of years there I got to be on their Tier 2 team and on their pilot program for networking. That was a lot of fun and I got to learn a lot, often calling up an ISP with a customer and telling them to do their job, lol. Once we got bought out, though, they slowly took away our ability to properly troubleshoot issues and wanted us to push the customer off the phone as quickly as possible to save money. Eventually they just killed our contract and sent the job overseas; at least I got a full ride to college and an associate's degree because they did this. I did get to meet some of the developers of Windows a few times (because after 5 years of Xbox support I moved over to Surface support), and while I didn't get to talk to him, I was in the same room with Phil Spencer once.
Series X was also pretty high end comparatively.
@@robertt9342 that's a misconception caused by the marketing. In reality the Xbox One and PS4 were more powerful for their time, because the native res target was 1080p, and IIRC most games ran at resolutions between 720p and 1080p.
Not in games that aren't cross-gen, like Alan Wake 2, SW Outlaws, Avatar FoP, BM Wukong, Lords of the Fallen, Immortals of Aveum, Forspoken, RoboCop, Warhammer 40K Space Marine 2: all of those run at sub-1080p native resolutions like 720p or 800p, and some, like Wukong, drop the native res even lower than 720p (like 640p) on both PS5/Series X.
I know today we have upscaling, but these consoles are supposedly "4K" machines
Xbox 360 was a wonderful gaming Era for me :)
The last good one, although it got a bit too digital towards the end, with huge patches you had to download.
@@lopwidth7343 Although by the end the patches weren't too intrusive. If you didn't want to update, it just meant you couldn't play online, and you could just buy games to play on your 360 without ever having to go online. Games were still mostly physical, and digital was pushed as just an alternative, whereas now it's getting harder to buy physical games with all these consoles removing the drives. The 7th generation was truly the last one where we had complete ownership over everything we played.
I grew up with both Xbox 360 and PS3 I like both almost equally, but I still prefer PS3. Both amazing consoles
One thing about the 360 was that some games running at "720p" actually had lower internal resolutions that were closer to 480p.
Yeah but all the games he mentioned do run at 720p internally
True, for this suite all the games I'm running, with the exception of Oblivion (1024x600; I tried using this lower res but the game crashes at launch) and Crysis 2 (1152x720), run at 720p on 360.
fair enough, lads
Later games are also usually more equivalent to a mix of low and medium settings on PC. Sometimes settings were set lower than the lowest possible on PC. AF was usually set to x2 and sometimes completely disabled and instead some games used trilinear texture filtering to save on memory bandwidth. For games released closer to the end of the generation there were compromises to visuals which didn't even appear on the lowest settings of the PC versions, such as draw distance, LOD pop-in and shadow draw distance. These were crucial optimizations for later games as more (up close) fidelity had to be squeezed out of the aging hardware.
Seeing old ATI Radeon cards is so strange for me because I'm so used to it saying AMD, even though I used plenty of ATI cards back in the day. Also this card looks pretty advanced for a card from 2007. It's fun to look at the way cards have evolved over time. The use of an 8 pin connector is intriguing.
Edit: It would be cool if you showed the other hardware you were using on the test bench.
The 2900 GT was pretty much just a neutered version of the 2900 XT at the time, so it does have some of the same higher end aspects albeit toned down a lot. The 2900 series was amongst the first to use an 8-pin PEG connector interestingly enough! I've featured this test bed a fair bit in my other videos, but there is a slide showing all of the specs if you want to know some of the specifics.
@@SPNG No, it's not. Xenos is a hybrid X1900 XT. All the newer cards changed the architecture to a worse design, which hurt the company until GCN.
That's what I was wondering; I didn't know that it was different.
TBH, ATI in its last days made masterpieces of hardware, starting with the 3000 series and ending with the 5000 series. Sadly I stopped gaming while having a mid-tier GPU, the HD 4830.
Dude, this video is awesome, I can't believe you only have 4k subs. What a technical treat to showcase a card that's as close as commercially possible to the 360! I guess at the end of the day, being able to optimize for pre-made hardware like the 360 really gave it a lasting edge, as people know how to build for that specific hardware by the end of its life cycle.
Much appreciated man, I'm glad you enjoyed 👍 And yeah when devs only have one hardware config to worry about they can get every last drop of performance out of it. Even then some games on 360 just strike me as impossible, like GTA V or Titanfall. Some incredibly impressive stuff at work there.
It's right on Wikipedia that xenos is using r500 and NOT TERASCALE.
*its
@@GrainGrown fixed. Thanks!!!
Hell yeah! I remember when green ham gaming did this idea but with the gpu from the Xbox one! I'm excited to watch this video.
I hope you enjoy it 🙏 Thanks for watching
@@SPNG I did enjoy it. I'm looking forward to your future work.
I think he did one for the Xbox 360's CPU too
Dang, I forgot about that guy. Does he still make his consoles smoke in his videos?
@@awesomeferret you bet
Lately I have been starting to re-appreciate how good the late 2000s actually were.
A lot of people look at 1998 or the year of the big 3 (2004) as some of the best years, but the entirety of the late 2000s was freaking awesome.
Just to name some highlights!
F.E.A.R. (2005)
Quake 4 (2005)
Half-Life Episode 1 (2006)
Prey (2006)
Call of Juarez (2006)
Bioshock (2007)
S.T.A.L.K.E.R. Shadow of Chernobyl (2007)
Half-Life Episode 2 (2007)
Portal (2007)
Team Fortress 2 (2007)
Crysis (2007)
Crysis Warhead (2008)
Fallout 3 (2008)
World of Warcraft Wrath of the Lich King (2008) - Arguably the best wow expansion
Left 4 Dead (2008)
Mirror's Edge (2008)
Far Cry 2 (2008)
Left 4 Dead 2 (2009)
Wolfenstein (2009)
The Sims 3 (2009)
Borderlands 1 (2009)
A couple of the titles above were not very popular upon release but have recently become more popular in retrospect, especially if you compare them to the crap released today. For example, Wolfenstein 2009 and Mirror's Edge are true classics I have played through multiple times. Quake 4 was also criminally underrated in its era.
I truly feel Crysis, WoW Wrath of the Lich King, and The Orange Box can be seen as the late 2000s personified.
The end of the decade closed out with The Sims 3, which was arguably the best Sims game.
Of course, this is not even mentioning the elephant in the room that is the Xbox 360 and PS3. I feel it was the last console generation that truly had some hype to it. Flash forward to 2013/2014, and games started the transition to the garbage we are fed today.
Far Cry 3 and Sleeping Dogs in 2012/2013 were decent games, tbf.
The golden age, which occurred just before that (mid 90s through early 00s), was better.
That truly was a great time for games.
I use a GT 710 for "old games stuff + retro" because it's basically stronger than the 360's GPU and still compatible with a lot of OSs.
do you swap it in and out of your pc or do you have a build dedicated to it?
@@zephaniahharrison5171 tl;dr driver support was dropped in October 2021; I used it as a native adapter for my CRTs.
I still have to decide where to finally place it, because honestly I mainly use and work on my more powerful, modern PC, but I got an all-in-one which may be its perfect final home once I get around to restoring it.
The GT 710 is powerful enough to run Halo 3 at 1080p, 60 FPS on High with the original graphics.
@@zephaniahharrison5171you can have more than one GPU in a system
I have a 780 Ti I just bought for this! I am working on passing it through to a WinXP VM.
@@Legend1148 Virtual Machine?
Ok, you do you.
Anyways, if you ask me, Linux has better compatibility and security with new software on old hardware.
Ahhh, the memories, when you could put a DVD in the drive and just play a game! We've not progressed at all since the 360, gaming has most certainly gone backwards.
I'm getting worried about the growing trend of always online DRM
It was still better to just install the disc to the 360 HDD.
Eh, I had 6 360s break on me. They intentionally made a special version without HDMI even years later, just to save 40 cents on each return unit for the people who bought before they added it in
@@matt5721 I had two dropout 360s, both were originals (neither died of RROD, surprisingly haha), but I've had my Slim since 2021 and it just keeps on keeping on.
It's amazing how, in the span of a few years, we went back to games needing to buffer at the start screen, poor optimization, and bad load times on PCs that aren't better than the recommended specs.
I have been a PC gamer for most of my life. I had some NES Chinese ripoffs as a kid, but ever since I got my first PC in 2000 I played games almost exclusively on PC (not that I was very good at it, but that's a different story...) and have had very little interest in buying a contemporary console ever since. However, a few months ago curiosity got the best of me and I decided to buy a hacked, refurbished X360 S with tons of games on it to see what playing on one of these things is like. I was pleasantly surprised by how well almost 20-year-old hardware still holds up in terms of visuals, even on modern big-ass TVs. The game designers were truly magicians back then.
It looks like this generation of consoles is still legendary till this day for a reason...
PC gaming was in kind of a rut in 2007 iirc. Vista and DX10 made everything... jank. I remember nvidia shipping their XP drivers as 'Vista-compatible' and them just not working lol. At this point I was still on an aging Pentium 4 system with an AGP slot and ATI was one of the few companies holding out for us non-PCIE peasants.
The words "Vista","nVidia FX" and "Pentium 4" are enough to give PTSD to every PC gaming enthusiast...
It's honestly sad what happened back then. It's a shame hardware devs didn't pay attention to how different Vista actually was to XP and were caught with their pants around their ankles. Vista and XP have so little in common, it's insane.
This would explain why my PC with Vista would crash constantly with my 8600GT.
I've always maintained that the 360 was the last console to compete with high-end PCs at launch. I don't think a single other console has come close to that since then.
PS3 would have been a contender too if it hadn't been delayed.
At least on the graphics side, considering some of the fastest cards you could buy at the end of 2005 were the 7800 GTX and X1800 XT, the 360 was several cuts above!
Regarding the AA resolve. I don’t think it was a bug since it’s retained for all R600 series chips. Even the HD 3870 its successor/shrink does it like that. My guess is that they tried to save some die space and validation time in the ROPs and they thought they had so much compute they might as well do it on the shaders.
Also, I might be able to shed some light on the performance profile vs the Xbox 360. Especially early games were often CPU limited, as per core the Xenon is pretty weak, about on par with a Pentium 4 2.8 GHz (Prescott). That's why you had much higher framerates. In later games, when they got to grips with the hardware, they actually started offloading GPU tasks to the big Xenon vector units (VMX128) in order to save precious GPU time. That's why later games run worse on the 2900 GT (and because of the TeraScale drivers).
I'm just surprised they waited until the HD 4000 series to fix the issue, given all of the negative reception it got. RV670 would've been a perfect opportunity with the die shrink, that chip is tiny compared to R600.
Vector units saved that whole generation as the PPE architecture is STUPID beyond comprehension.
@@SPNG Silicon development takes time. And there is actually not much time between them, a little over a year.
These days we get a new generation barely after 2.5 years
Games also started to take advantage of parallelization/multithreading, which was where the 360's CPU shined.
Since Xenon was originally based off CELL, I think offloading the GPU and decent multicore perf was always the intent with that CPU architecture. Which possibly explains why they were so lacking in other areas (IoE, cache, IPC etc).
They were likely banking on multicore taking over, which I can't say they were wrong about tbh
@@SPNG I remember my ATI HD 4850 512MB could run CoD MW2 at 1080p with 4x MSAA. That card was legendary for its price-to-performance. Although the 4870 definitely aged better due to the extra memory bandwidth.
It's pretty crazy how long the Xbox 360 and PlayStation 3 lasted. Hard to believe it's still a real trial to emulate these systems on all but the most overkill PCs money can buy in the present ecosystem, too.
These systems got a crazy amount of longevity, and I think a lot of their exclusives still hold up pretty well. I remember trying to emulate Gran Turismo 6 on PC but it was pretty rough haha, the emulators for both are always improving though.
Well, they're very hard to emulate because of all the translation of CPU architecture and GPU API required, especially for PS3 emulation.
@@gnrtx-36969 RPCS3 devs are making breakthroughs as of late, thankfully, mostly by using newer CPU instructions such as AVX2.
Emulation of a different processor architecture and the attached hardware is always a relatively time-consuming task. One clock cycle of the emulated CPU usually takes several cycles on the host CPU. Even with dynamic recompilation, i.e. 'JIT', there is always a big performance hit. Apple spent a lot of time and money to optimise their emulation when they went from 68K to PPC, then from PPC to Intel x86, and x86 to ARM. But each time they couldn't achieve anything like native performance, even on the fastest new Macs. Eventually new hardware becomes fast enough for the emulation to outpace the old computers, but it usually takes a few years.
Meh, my ROG Ally is fine for PS3 emulation, and Red Dead Redemption on the Xbox 360 (the only one I've tried so far) too. And that's a handheld...
A Far Cry 3 test would be interesting. I know it uses a weird resolution on the 360 (704p if I recall correctly), but the highlight is that it runs extremely badly, being a 15-25 fps game most of the time.
Far Cry 3 on the lowest settings on PC still has better graphics (mainly textures) than the console versions, and from what I know it was easy to run on lowest, so it could maybe work!
subbed. need to support smaller youtubers esp when the content is this good.
Thanks man, I'm really glad you enjoyed. Welcome to the channel 😁
The jump in security Microsoft made between the Xbox OG and the Xbox 360 was staggering. They went from a complete mess to a system that still hasn't had a software exploit almost 20 years later. Jumping from a Pentium 3 to a PPC chip with a hardware backed hypervisor and signing design in 2005 was stupid impressive for the time.
And you could try saying the KK exploit was software, but it required (and still would require) a flashed DVD drive, which is a hardware modification.
I agree, it is really impressive how they improved the security, but the lack of it was one of the biggest reasons why I love the OG Xbox modding scene. It's so easy to get into, and the fact that you can do stuff like swapping the CPU for an entirely different desktop chip is amazing to me
@@SPNG interestingly enough a few years ago some unused Xbox 360 CPUs appeared on eBay, having zero efuses blown. Some people said they were going to CPU swap it and then create their own keys and blow the correct fuses. I don't know if anyone did, let alone post about it. Would be really interesting to see though.
Wow I had this card. Never seen this card mentioned ANYWHERE. It actually sucked lol. All the heat of an HD2900XT or PRO and none of the performance. Drivers for this card were a nightmare when it launched. ATI didn't seem to want to acknowledge its existence for a few weeks when it first came out. Then it stopped even being sold after like 6 months.
You're the first person I've heard of having one back in the day 🤣 What made it worse is it came out like days before the HD 3850, which would have been faster and even cheaper I think? I can't even find a review from when the card came out (or a review at all to be honest). How much did you pay for yours back in the day?
I play Crysis with this GPU paired with Core 2 duo
Very interesting
I scrolled some and didn't see it mentioned, but many games do not render internally at a full 720p like the PC versions would. As well, some titles were limited by the CPU in the Xbox and scaled MUCH better on your test bench (though you already know that). It is for this reason that we encounter additional issues with making the best possible comparison. (I'm curious if something like the old Durante DSfix could be used to inject an internal render resolution matching what the console was actually rendering pixel-wise.) I did enjoy the video, and the effort to go over the technical aspects is always appreciated! Keep up the great work!
3 out of 12 games on the list use a sub-720p resolution on 360 (Oblivion, Far Cry 2, Crysis 2). I tried running Oblivion at its 360 resolution of 1024x600 but it would crash on launch, Far Cry 2 might as well be the same since it's 1280x696, and Crysis 2 only drops it to 1152x720. Also, I wanted to make sure the CPU wasn't a limiting factor here, and I think in a sense that can account for any CPU-related console optimization. If I had used a more period-correct chip I could've experienced some issues with the later games in particular. Much appreciated by the way, and will do, there's more to come 👍
man i miss the xbox 360, back when it had games, and games that were sometimes better than the PC ports
The Project Gotham Racing series hits home for me. I grew up playing PGR2 on the Original Xbox, although I still need to try PGR3 and PGR4 on the 360!
Cursed era for pc. So many shitty pc ports or straight up ps2 ports.
@@RiasatSalminSami It's not a problem unique to the Xbox 360 era, it goes back much further than that.
@@another3997 Pretty unique to the 360 era, because the PC got straight-up old-gen ports.
There were problems in the PS2 era as well, with missing graphical effects, but the games were at least ports of the current-gen consoles.
@@RiasatSalminSami NFSMW comes to mind. The PC version really doesn't feel the same as the 360
makes me wonder if the xbox was held back by its 128 bit bus and the cpu. seems like the GPU in the 360 could have performed better but was gimped with everything around it.
I'd wager much of the differences come from the memory fine-tuning that consoles enjoy. A lot of your tests seemed to run into what looks like VRAM limitations on the 256MB card, but on 360 it was fairly routine from what I understand for games to go above 256MB memory use for graphics.
That sounds about right, I was definitely pushing it with some of these settings trying to match the image quality. Time to bust out the BGA rework station and get 512MB on the GT 🤣
It's more like an X1800/X1900 class GPU but built on TeraScale architecture. Dx9 only with very limited features.
The HD2900 cards sucked but they were neat. They were all known for running hot and new paste and pads are essential to run them nowadays due to aging.
It's true that Xenos has a lot of aspects of R580 (X1900), but with fixed function shader pipes it doesn't have nearly as much flexibility in pixel shading and geometry processing power. Also, if I had limited myself to DX9 here I wouldn't have been able to test nearly as many games. I didn't mention this in the video but this card performs extremely similarly to the X1950 XTX in DX9 games.
Fantastic channel and the video, instant sub!
Welcome to the channel! 👋I'm really glad you enjoyed
In my experience, the game that pushed my old Xbox 360 Arcade the most was Crysis 2. You could feel the poor thing coughing and screaming when nearing the end of the game, with too much happening at once... I miss it so bad :(
When I was comparing performance I was kind of going off of old memories of how it performed. I really thought it stuck to 30fps for the most part, but after refreshing my memory with some videos, that doesn't seem to be the case. It kinda chugs, to be honest.
Surprisingly, Supreme Commander absolutely thrashes the xbox 360. Considering it does the same to hardware twice the performance I guess the xb360 got the last laugh.
3 minutes in and very happy with the level of detail... definitely subscribing
Welcome to the channel, I'm glad you're liking it 👍
yeah, the later aged games ran better on 360 just because of how devs could optimize the code for the hardware, who knows what nuanced changes they could make for those version to get every last frame they could. That's to say nothing about the structure of code execution on a more streamlined piece of hardware, either. Great video
There's really no replacement for those optimizations on PC, but I was still impressed with how this card managed. Even some of the losses were closer than I was expecting. Much appreciated, and thanks for watching 🙏
If I'm remembering it correctly, the Xbox 360 GPU was based on the ATI X1000 series, codename R500, no? R500 and R600 are completely different architectures. The R600 was natively compatible with the DX10 API; the R500 only supported DX9.0c.
It's true that Xenos has a lot of aspects of R580 (X1900), but with fixed function shader pipes it doesn't have nearly as much flexibility (as R600 or Xenos) in pixel shading and geometry processing power. Also, if I had limited myself to DX9 here I wouldn't have been able to test nearly as many games. I didn't mention this in the video but this card performs extremely similarly to the X1950 XTX in DX9 games. - taken from one of my earlier replies lol
Put it in a powerpc mac and we have the xbox 360 at home
Funny enough they used PowerMac G5s as early dev kits for the 360
@@SPNG and I had one and stupidly sold it. Having a Powermac G5 with Microsoft Game Studios stickers was so cool. It would have needed hundreds of dollars of parts to boot and I thought I was being wise by getting rid of a computer I knew would never run again.
I would say the major flaw in your testing is the fact that a lot of games on the 360 run at a lower resolution than 720p and are upscaled internally which gives better performance.
Only 3 out of the 12 games tested are like this (Oblivion, Far Cry 2, Crysis 2). I tried to run Oblivion at its 360 resolution of 1024x600 but it would crash at launch, Far Cry 2 runs at 1280x696 which is so close the results might as well be interchangeable, and Crysis 2 only drops it to 1152x720.
Thanks for the video, I appreciate the inclusion of frametimes AND the graph. Looking forward to your future uploads.
Glad that you like it! I think having both tells more about how a game actually felt, and what kinds of stutter were experienced. Much appreciated and welcome to the channel 👍
Put one of these in a PowerMac G5 and you have what's pretty much an early Xbox 360 devkit
With an updated GPU at least. I don't think they'd be capable of running the dev software due to a lack of support though, I think those systems originally came with an ATI X800-derivative FireGL card
This is how these consoles went down: the PS3 had a significantly faster CPU that could do some graphics calculations, while the Xbox 360 had a much more powerful GPU that was mostly unified, minus the AA chip. The PS3's GPU was just a 7800 GTX.
The CPU on the Xbox was utter poo; as someone else said, it performed like a Prescott 2.8GHz CPU without HTT.
Well yeah, a CPU which has 1 PowerPC core + 8 128-bit SIMD vector units could easily beat one with 3 cores / 6 threads.
About the GPUs: Xenos has more texture units but a lower clock. I think it makes around 240 GFLOPS while the RSX makes 230.
But the eDRAM makes it far superior.
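For anyone curious where that ~240 GFLOPS number comes from, it can be reproduced from the commonly cited spec-sheet figures for Xenos. The pipe count, issue width, and clock below are those public figures, not confirmed internals, so treat this as a back-of-the-envelope sketch:

```python
# Rough peak-FLOPS estimate for Xenos from commonly cited spec figures
# (assumptions, not confirmed internals): 48 unified shader pipes,
# each a vec4 + scalar ALU (5 lanes), one multiply-add (2 FLOPs) per
# lane per cycle, at a 500 MHz core clock.
pipes = 48
lanes_per_pipe = 5      # vec4 + scalar
flops_per_lane = 2      # multiply-add
clock_hz = 500e6

peak_flops = pipes * lanes_per_pipe * flops_per_lane * clock_hz
print(f"{peak_flops / 1e9:.0f} GFLOPS")  # prints: 240 GFLOPS
```

The same style of calculation gives RSX's similar headline figure despite its very different (non-unified) pipeline layout, which is why raw GFLOPS alone doesn't settle the Xenos vs RSX comparison.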
PS3 is a mess but the first party Devs made some magic on that box.
@@amdintelxsniperx The Xbox CPU is an IBM PowerPC with 3 cores at 3.2GHz. What are you talking about? Prescott 2.8GHz is not so funny, man. Try to run MGS V, Tomb Raider 2 and GTA V on a Prescott 2.8GHz. You will see nothing.
wait, which version of GTA 4 did you play? Because the first version had a few graphical features that were cut later. Also, dunno about GTA 4, but GTA 5 uses the pagefile, and your game may be unstable if your pagefile is in auto mode. Plus, is GTA 4 with Games for Windows Live (even if offline) or is it something like xliveless?
I'm not sure which precise version I was using, but I think it was on the newer side? Either way, the card would have lost, that game is just optimized so poorly on PC. I was amazed to see it struggle on some more powerful cards in my own testing
Man this was a really well made video, I stumbled with it because I was actually planing to do an xbox 360 PC myself and this was a blessing. Nice lots of information and overall performance was excellent
Much appreciated, I'm glad you enjoyed! Finding an equivalent CPU is definitely a challenge, as there's pretty much nothing like Xenon made for regular systems, but you could just go off of instructions per second and something with the same amount of cache.
I bought an Xbox 360 in 2012 and I couldn't believe that hardware which had already been out for 7 years at that point could run games like Halo 4 and GTA V. I hope my Xbox Series X can be played for ten years too.
The OG Xbox and 360 GPUs were so interesting in feeling like functional mass-production prototype GPUs that paved the way for the next generation of desktop graphics logic, and it's a miracle how they milked the 360 dry of its performance on later games. It was a technical marvel, but boy did they waste power to get there. The Xbone was so disappointing that after playing only 1 game, I handed the keys to my brother as an upgrade from a PS2 and moved back to PC gaming. The Sempron and FX 5200 that made up my first gaming PC (then an Athlon 880K + 1060) couldn't handle the more modern games, nor had the internet stability, which I think goes under the radar for a lot of later 360 players who weren't used to how spotty the OG Xbox could be.

To this day I am still playing titles on the 360 that you just couldn't get on PC. There were no exclusivity deals like in the following generations: if you wanted a game, you bought the platform. THAT'S what consoles were strong for. Now? They're nothing more than a snazzed-up GUI to launch games on a gamepad with. It makes me think the Wii U had the right idea, being able to take the power with you yet dock it when at home... sounds like a Steam Deck... and I really think that's the future of console gaming. With so little between what could be an MS/Sony handheld and an Ally etc., console/PC optimizations could meet as one with potential native Linux support (though I find that hard for MS to swallow), and the big war of PC vs console may be settled with portable powerhouses that we carry everywhere. Maybe the Nokia N-Gage wasn't so wrong either... a large phone screen that docks into a handheld chassis with a dGPU, batteries etc.? Sounds like it'd be an interesting shift to an all-in-one gaming phone/console/PC handheld :P
Love the use of Gran Turismo 5 OST songs
I noticed your test bench uses Windows 7. How did you play Dark Souls 2 on Windows 7 without Steam support? If Steam was already installed can you still play offline on games you've already downloaded, or did you have another workaround?
I just used a repack I found somewhere online lol
@@SPNG beautiful reply.
@@SPNG XD
*found it in the seas, Brother
I love the GT5 music in the background 😩
Gran Turismo music is something else man, its like comfort food
What is the music in the conclusion?
Gran Turismo 6 After Race Music 1
@@SPNG thank you so much! 👍
Got my 360 jailbroken with RGH and a 1TB HDD, and it basically became my main gaming machine, and I have a pretty beefed-up PC. You can argue all day, but the plug-and-play-ness of consoles is a godsend. Can't beat a gaming session on a comfy couch with a huge TV after a hard day of work.
I'm not an expert on how the shared memory of the Xbox 360 works (I know how it's done on the old Amiga in detail), but since the game creator could divide the RAM (512MB) into CPU/GPU slices as desired, there has to be a downside in the access times. Even when you split the memory exactly in half, there has to be an element that controls access to the RAM, which always introduces wait states. Dedicated memory for the CPU/GPU would be faster, but of course 256/256MB or 512/512MB have their own downsides (not the best solution for all games, or too expensive). So even when you compare identical GPUs, it will be hard to get a perfect match.
It's always amazing how a system, even with a "bad framework" like DirectX, could be optimized just by getting rid of the factors that different hardware creates. I played GoW 2 two weeks ago on my 78" OLED, and besides being blurry, the look and feel is still great.
Thanks for testing and creating the video 🙂
I mean, I guess we have to take what we can get though haha. In that case maybe 512MB of VRAM would be more suitable (probably better to have too much instead of not enough) but this is the only R600 card with this GPU configuration period, they didn't make any 2900 GTs with 512MB 😔 Thanks for watching man 👍
An enjoyable, informative video, nicely done 👍
Feels crazy now because i try and match all my game settings to 360 level on my steam deck and it just amazes me how far we've come
@@SirQuacksalotthe1st Much appreciated 🙏 And very true, with a lot of today's handheld gaming PCs you're getting well over that level of performance with a fraction of the power consumption!
@@SPNG I know it's crazy
To have every single one of my childhood games playable
In the palm of my hand
Sometimes when I think about it I can't stop myself from giggling like a damn child haha, because part of my brain registers that it's so absurd to me, but it's true
Seeing this made me appreciate the 10 series cards even more. A GPU like this was barely managing 30fps at 720p when it was around 11 years old, while 10 series cards, roughly 8 years on from when they came out, are still performing well and are very capable in today's titles.
The HD 7870 from 12 years ago can even run Cyberpunk with playable framerates and even FSR!
Would like to see a video of PS4/Xbox One equal gpu, something like HD 7850 2 gig for ps4 and hd 7770 2 gig for Xbox one
Good idea. Although I guess the PS4 used closer to 4.5GB of VRAM in later years.
I think the 256MB of VRAM is too low for the PC versions of the games. A lot of AAA games on PS3/360 run at a bit lower resolution like 6xxp, and not always in real 16:9, like Crysis with 1152x720. This saves a bit of power and VRAM.
I like the Gran Turismo soundtrack.
True, I did try to run Oblivion at the 1024x600 resolution it uses on 360, but it would crash on launch. For Crysis 2 I thought the difference was too small to be mentioned. Those are the only games that really use a sub-720p resolution in this suite (technically Far Cry 2 does as well, but it's 1280x696 so pretty much the same lol). And thanks! I love the lounge tracks in those games 😀
You also have to remember most people still used CRT monitors or even CRT TV's when these cards came out.
It wasn't so much the GPU that made the 360 struggle; both the PS3 and 360 were pretty much behind once quad-core CPUs came to market (I guess even the dual cores were better at some point early on). You did get better-looking games throughout the generation, but the performance was always tricky to achieve.
Good video! Hope this continues to hit the algorithm.
Subbed!
Much appreciated, welcome to the channel 👋
"Capable of doing 4x MSAA with a negligible performance hit". Not true. A single 4xAA 720p framebuffer would exceed the 10MB of the EDRAM. That's why 2xAA was commonly used instead.
I think 2xAA was a requirement pushed by MS for the first 2 years.
This is kind of splitting hairs a bit. There is indeed a performance cost with 4xAA since it's doing enough on the eDRAM to require tiling, but it's very small.
"ATI have been quoted as suggesting that 720p resolutions with 4x FSAA, which would require three tiles, has about 95% of the performance of 2x FSAA." - ATI Xenos: Xbox 360 Graphics Demystified (Beyond3D, Jun 13 2005)
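That quote is easy to sanity-check with back-of-the-envelope math. Assuming 4 bytes of color plus 4 bytes of depth/stencil per sample (a typical framebuffer layout; actual formats vary per game, so these numbers are illustrative only), the eDRAM footprint and required tile count work out as follows:

```python
# Back-of-the-envelope eDRAM footprint on Xbox 360 (10 MB daughter die).
# Assumes 32-bit color + 32-bit depth/stencil per sample; real games
# could use other formats, so treat the results as illustrative.
import math

EDRAM_BYTES = 10 * 1024 * 1024

def framebuffer_bytes(width, height, samples):
    bytes_per_sample = 4 + 4  # color + depth/stencil
    return width * height * samples * bytes_per_sample

def tiles_needed(width, height, samples):
    return math.ceil(framebuffer_bytes(width, height, samples) / EDRAM_BYTES)

for w, h, aa in [(1280, 720, 1), (1280, 720, 2), (1280, 720, 4), (1024, 600, 2)]:
    mb = framebuffer_bytes(w, h, aa) / (1024 * 1024)
    print(f"{w}x{h} {aa}xAA: {mb:5.2f} MB -> {tiles_needed(w, h, aa)} tile(s)")
```

Under these assumptions, 720p with 4xAA needs three tiles (matching the Beyond3D figure), 2xAA at 720p already spills into two, and 1024x600 with 2xAA fits in a single tile, which is consistent with why some early titles rendered at that resolution.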
I wanna get my 360 back up and running. It's currently RROD'd and I don't know for sure why, but I have the feeling it needs to be recapped.
You may be better off getting a Slim or something. I got my system several years ago and it has been going strong, some of the original models can be reliable too but you have to know what revision board you have
My first GPU was an old x1950 pro a friend found for free and gave to me in 2013 ish I believe, not really the same as this but the looks reminded me of it. It somehow made dayz playable with my 2 core 2 Gb ram prebuilt I had at the time.
Great video my dude!
Thanks, I'm glad you enjoyed 🙏
This is awesome stuff dude! Just slapped that sub button; looking forward to more!
Much appreciated man 🙏 Welcome to the channel!
The pc version of GTA 4 was an inexcusable pos
💯 Its crappiness strikes again
could you imagine what the 360 would be capable of if FSR was implemented onto it.
Even without any upscaling its hardware is surprisingly capable.
There wouldn't have been enough compute power to run FSR 2 properly imo. I know the Switch does it well with No Man's Sky, but I genuinely can't imagine the 360 being able to run such compute, and even then it wouldn't have the capacity to do TAA, which is needed for FSR 2.0.
@@Loundsify Crysis games on 360/ps3 use an early form of TAA - just saying.
Great video. very detailed and informative. benchmark charts AND gameplay footage so we can see how it looks? ez like.
I really appreciate it, thanks for watching 😀
The high framerate is due to the cpu used. A 2005/2006 cpu would struggle.
Exactly. That cpu is so overpowered!
@@okene yeah like the major bottleneck on consoles is more the cpu than the gpu.
This video isn't really focused on trying to recreate the entire system, as replicating the CPU is pretty much impossible with regular PC parts (it is about as far from off-the-shelf as you can get), and I wanted to see the full performance of this card.
As far as I know the PS3 was a similar case, where its GPU, the RSX, is an Nvidia G70 (GeForce 7800 GTX) but slightly modified in ways similar to what's explained in the video. The RAM also matches: 256MB of GDDR3 on a 256-bit bus running at 650MHz.
Hey bro, fantastic vid. If you ever got serious about making videos like this I'd suggest getting a fluid-head tripod, especially for pans. Static heads are just fine for stationary shots and moving around, but when you try to pan it gets jittery like that. Just a heads up, and keep up the good work
Much appreciated, and thanks for the tip 🙏 I didn't even know that was a thing lol, I'll have to pick one up at some point. The rough pans always bothered me a bit, I can't get them to be perfectly smooth no matter what
I'd have been thrilled to have a 2900 when Oblivion launched. I was still running an X800, then a 7300 GT, and then an 8800 Ultra the following year. Obviously then I had no more troubles.
8800 Ultra would have been a massive jump up 😯 How long did you have that card?
@@SPNG how long before the GTX 280? 2 years maybe, 18 months. Then I had 3 GTX 280’s 😆 Crysis was the obsession by that point 🫤
@@chincemagnet dude we're not worthy; 3-way SLI GTX 280s. 😮😮😮
@@Loundsify not too impressive these days, I had a 4 way HD 5870 setup at one point. I’ve had a ton of multi GPU setups. A weird one was 2 x HD 4850’s and an HD 4830 in 3 way. 2 GTX 580 3 GB, 2 Kepler Titans, 2 GTX 980 Classified, 2 GTX 1080 sea Hawk X EK, 2 GTX 1080 Ti Hydrocoppers, and lastly 2 RTX 2080 Ti FE’s. Sadly multi GPU is pretty much dead now. So a single 3090 and a single 4090 more recently.
@@chincemagnet it's mainly because multi-GPU scaling didn't scale linearly; quite often you would only get a 50-60% boost in FPS, and sometimes it would be worse than one card. I had an HD 7870 XT 2GB in CrossFire; that card was a cut-down HD 7950, so it was much faster than the standard HD 7870 and really should have been called the HD 7930.
How did you get the ram to run at 2400? I have a z68 mobo from gigabyte and the memory frequency setting is nowhere to be found.
Every board is gonna be a little different. Just poke around the settings a bit and looking for OC guides for my specific board helped a lot.
@@SPNG mine supports 2133 ram speeds just can't find the frequency setting. My z97-k board has an xmp setting which allows you to choose between two xmp profiles.
@@SPNG btw how many systems do you own?
@@thesmokecriminal5395 A fair few. This test bench ends up housing multiple different boards depending on what I feel like testing
@@SPNG but how many systems do you own that are for personal use? I would imagine that you use your older hardware as well
Glad u made another vid like this. Hope you make a Nintendo Switch PC equivalent video so I can see if I am being beaten by a 2017 handheld on my laptop 😭😭
Recreating it in a whole PC would be a challenge due to the ARM CPU, but just the GPU comparison would actually be plausible thanks to a certain similar NVIDIA card, and I already have it 👀
@@SPNG 👀
@@SPNG then make a video with the GPU and call it "The Nintendo Switch GPU"
My PC with a Celeron N3050 is worse than the Switch in portable mode. The only good thing is I know which games would definitely run fine if they were on the Switch, like the Stalker game coming out next month, and Just Cause 2 or Sleeping Dogs
@@SPNG Phenom X3 and downclock to 1GHz 😂
The issue with any comparisons to Xbox 360's GPU is that it was very different to all the conventional single die GPUs of the time and indeed today. The point of 360's GPU and 10mb eDRAM setup was to create a high bandwidth framebuffer which no PC GPU had by comparison. It had one die with main GPU logic, and a daughter die which could handle a bunch of z buffering and blending at low performance cost. This worked very well except for one major problem: 10mb of eDRAM was not enough to store a 1280 x 720 frame with typical color and depth values. It was only enough to store a 1024 x 600 frame. So you either had to render at 1024 x 600 or tile the output, which impacted performance unless you used certain mitigations. Many early 360 titles thus rendered at 1024 x 600 and got very low cost anti aliasing. Later games tiled. On PC you usually targeted higher resolutions anyway, most cards by 2007 had more VRAM available and more bandwidth across it. So several of the games you test at full 720p were not 1280 x 720 on Xbox 360. Examples: Elder Scrolls 4, Crysis 2, Far Cry 2 etc
When you say that, I gotta emphasize that I'm suggesting it's an "equivalent". Out of all PC graphics cards, I think this one is the closest parallel to Xenos due to its shading throughput, the features borrowed from Xenos, and everything else mentioned in this video. I'm not suggesting they are exact replicas of each other, of course, just that the 2900 GT is the closest match you can get. Also, out of the twelve-game suite, those three games are the only ones that run at a resolution other than 720p on 360 (really two, since Far Cry 2 runs at 1280x696).
Crysis 2 did not run smoothly on X360 at all. During intense moments the frame rate dipped to high-to-mid teens, it rarely even hit stable 30 fps. Check the digital foundry, they spoke of Crysis on consoles lots of times.
ruclips.net/video/xq4s06EVgRY/видео.html
@@przemyslaw_polak_93 Wow that's bad. I didn't know it was that slow, thanks for showing that
@@SPNG yeah, it was bad. But I finished Crysis 2 on my X360 and loved it a lot 🤣 back in early 2010s gamers were a lot more forgiving in terms of poor performance. Today it seems not acceptable but then... Well, it was what it was, we played it, we loved it, we had fun in 25 fps 😅
The Xbox 360 was nuts. My ex and I were always big into gaming and had a pair of pretty powerful gaming PCs at the time. I was a diehard Nintendo fan until Wii, and then went Xbox. The 360 blew me away and I almost exclusively used my Xbox over my PC except for WoW. Peak gaming era.
The 2900 XT and the 2900 Pro were practically the same card, with the latter being 100 dollars cheaper at the time. I bought one and flashed the XT BIOS to it. I was able to get a more aggressive fan curve and higher clocks. It made a difference in GTA IV.
Something to note about GTA IV, though I don't know if this can be applied in these tests: on modern systems you can actually use DXVK by just dropping its DLLs into GTA IV's folder, making the game render through Vulkan.
An issue GTA IV suffers from, as do a number of other DX9 games in some shape or form, is that Microsoft has shipped a specialized version of DX9 since Windows Vista, and this DX9 implementation does GTA 4 no good.
I can't remember exactly why using DXVK on GTA IV improves performance, but I believe it's something to do with offloading CPU overhead.
Just out of curiosity, is the 2900 GT like the 2900 XT in that you can plug a 6-pin connector into the 8-pin socket and it'll still work?
Seems like that's the case, I tried just the 6-pin on a different system and got a POST so it should work fine (Edit: 3D stuff works as well)
A lot of GPUs are this way. The connectors are usually just big 12v supply rails, so the missing two pins just means that you risk the wire getting hotter as the current draw goes up.
@ZeroX252 The 2 extra pins don't carry 12V. One of them is ground and the other is a sense pin, and in my experience most cards won't work if the sense pin isn't connected.
I remember from Digital foundry reviews, lots of x360 games did run sub 720p
The games in this suite that are sub 720p on 360 are TES Oblivion, Far Cry 2, and Crysis 2 to my knowledge. I tried to run Oblivion at the 1024x600 resolution it uses on 360 but the game would crash on launch. The 360 runs Far Cry 2 at 1280x696 (but the difference is so small it might as well be interchangeable with 720p). Crysis 2 is 1152x720 on 360 but it wasn't in the default display modes and even if I edited the config file, running it at that resolution would've hardly made a difference in performance. There's a multitude of other games that are sub 720p like you mentioned, but that should cover the ones in this suite at least.
@@SPNG Call of Duty games ran at 880x720.
Cool, cool, cool!!! Great video!!! I'd love to see more videos like this, maybe an Xbox One GPU video? I think the GPU performance was surprisingly good, and I agree with your findings!!!
I might have a Nintendo Switch GPU equivalent video coming somewhere down the line, we'll see 👀 Thanks for watching man 👍
Great video ! keep up the good work !
Much appreciated man. There's more to come
Incredibly interesting subject.
I had to sub.
Welcome to the channel. I'm glad you're interested in seeing this kind of stuff 👍
Sadly there was an error in the chipset that made it fail quicker, causing the Red Ring of Death
Sucks that you didn't test GTAV, it would really show how far the optimization stretched the 360's hardware. I'm guessing you had issues getting it to work.
The game worked, but I felt it would be redundant with GTA IV already on here. Not to mention GTA V really struggles on Terascale cards in particular, while GTA IV performs like crap on all hardware
GTA IV is absolutely fine, it just doesn't like FPS monitoring tools like RivaTuner, which cause the stuttering. Also, many later games ran at weird internal resolutions lower than 720p, so the card seems really comparable overall. Shame there weren't DX10 drivers to test out with GTA V. I'm sure with the right equivalent settings it would run that too, since getting it crammed onto the Xbox 360 seemed so surreal at the time, although it ran at those pseudo fake 30fps that felt more like 10-15 input-wise. Double or triple buffered, like Far Cry 3.
I was using Fraps, not Rivatuner. Stutters and performance were the same regardless of whether it was running or not. I did try GTA V, but I felt it would be redundant to include with GTA IV already on the list and due to the drivers it tends to perform badly on Terascale cards in particular, while GTA IV evenly distributes its poor performance lol
@@SPNG Yeah, you need it not even on your system, like completely uninstalled. It's such a weird software like that. I mean rivatuner. Okay, didn't know GTA V even ran with those old ATI drivers at all
i think the X1950 XTX is the Xbox 360 GPU equivalent. It fits the time period better, because that GPU existed in 2005 while the HD 2900 came out in late 2007.
Yeah, I remember installing GTAIV and it's so bad on PC, even with patches and stuff, it just isn't smooth. GTA V on the other hand, is greatly optimized. The 6th and 7th gen of consoles received many better releases when compared to their PC counterparts, it changed from 2010 onwards.
The 360 will always be legendary. I remember my dad surprising me and my older brothers with it when it first launched.
I remember my dad getting me and my older brother one back in the early 2010s. Considering I was grown up on an original Xbox I was amazed by how much more capable the 360 was
@@SPNG for the time the 360 was quite powerful; even in 2010 it was still solid and the most affordable option for smooth gaming
I expect that running some of these games on Linux through Wine might actually perform better... Mesa has had significantly more updates than Catalyst, so the drivers may just work better.
Dx9 only i think though. No DXVK, since the card is waaay too old for vulkan.
Just as an example of how neat modern MESA drivers are... There is experimental support for Vulkan on some terascale cards like the 6970 and 6870. Absolutely crazy.
Incredible breakdown. You should have 100k+ subs.
Much appreciated man, I do enjoy trying to go more in-depth for these reviews!
would be interesting to see how this card performs with modded drivers from Nimez and also using DXVK in some of the games.
With no Vulkan support I wouldn't be able to do that unfortunately. I've heard GTA IV can massively improve with that in particular.
@@SPNG Oh snap I didn't realise the card had no VK support, thats a shame. I also just checked on the guru3d forums where the modder who creates the nimez drivers posts his stuff and he only has terascale 2 & 3 drivers, nothing yet for terascale 1 :(
Crazy that hardware from 2005 could run games like dragon age inquisition and gta 5.
And Titanfall! The console optimization from that later era has some really impressive stuff at work.
It’s more complicated. But I’ve heard that using DXVK in GTA:IV fixes many of the issues.
Course, it isn’t relevant with this card, but for Vulkan-capable cards, it really helps frame rate and frame times.
what is the banger song that plays last in this video? it gives me severe JRPG vibes.
Japanese, but as far from RPG as you can get haha. It's Gran Turismo 6's After Race Music 1.
@@SPNG cool, thanks!
I know there wasn't a direct Halo 3 port to pc but could you quickly compare the performance of this system on MCC next to Halo 3 on the 360? could make for a good "short"
Unfortunately MCC is DirectX 11 😔
@@SPNG Isn't there a directx 11 compatible gpu with similar specs to this one in your video? If not I'm sure there's a way to work around it
Any chance you could shove this into a PowerMac G5? That system has the most in common with the Xenon CPU
Probably could, but drivers would be an issue and there would be very little cross platform software to test with on PPC
the Xbox 360 used state-of-the-art technology, and some games really showcased what the hardware was actually capable of.
If you were maybe interested in benchmarking GTA 4 again but having it be actually playable, use DXVK, FusionFix and the various other fixes
i know the point of the vid is supposed to be 1:1 with the 360 versions, but i feel like gta 4 is an exception here
I wish, but no Vulkan support on the card makes using DXVK impossible.
Nice video! Can you make a Supermicro workstation build? Some standard ATX or E-ATX PC case, a standard PSU, and an old Supermicro motherboard, for example LGA 1366.
Thanks! Maybe at some point, I've never used a 1366 system and a workstation could be fun. Just a matter of finding one for a decent price, I'm a bit of a cheapskate haha
Oblivion would be more of a CPU test; like Morrowind before it and Fallout 3/New Vegas later, it was ridiculously CPU bound. The bottleneck with those games was the memory, or lack thereof.
I had an X1950 XT Pro. I was under the impression that was the same GPU.
Holy heck!!
Getting a card that old in working condition is as rare as ice on a beach in summer 🏖️
Currently have the idea to build the ultimate budget 2009-2011 AMD K10 shitbox. This card MIGHT work, or a later one.
That would be a fun build, only thing is this card is REALLY hard to find! It took me so long to find one. It's the ultimate pointless card from its time but good luck finding it
@@SPNG Looks sick. I'm trying to find a terascale-based card that isn't being sold for $100,000,000 outside of the half-height trash specced ones.
A 9800 GT will probably be a better match for some cheap Phenom X2/X3 or Athlon X3/X4
@@codegenprime3362 So, nVidia it is then?
underappreciated channel
Test system from 5:54 isn't similar to the Xbox 360.
@@andrzej9880 Not trying to recreate the entire system here. That is pretty much impossible due to the CPU being so different from anything made for regular PCs