The cost of producing different architectures for enterprise computing and for general usage is clearly not viable, otherwise all manufacturers would be doing it. The costs involved in developing and manufacturing modern GPUs and CPUs are mind-blowing. There are always compromises, whatever the technology or product... it's inevitable. Businesses have to balance any number of factors to remain solvent and competitive. It's what gives us the luxury of having a choice of what to buy.
Like these videos as always man. I do want to point out though, 8:15 - weirdly enough, I have always had this issue with Black Mesa across both Nvidia and AMD GPUs and different CPUs across multiple Windows and Linux OSs. Black Mesa always crashes in the beginning stages. Then suddenly it starts working just fine for hours, then crashes all over again. Hardware and software being: Xeons, i5-2400, i5-3570k, i5-8400, i7-8700, Ryzen 7 1800x, Ryzen 5 3600, GTX 1050, RX 560, RX 470, GTX 1070, RX 5500 XT, 8gb Ram, 12gb Ram, 16gb Ram, 32gb Ram, 1TB SSD for Ext4, 4TB NTFS HDD, 6TB BTRFS Enterprise HDD, NVMes, all across Windows 10, Windows 11, Linux Mint, Fedora, Debian, and Bazzite. It doesn't matter. It just crashes. I finally did manage to see a dialogue box pop up on Bazzite with the message "Insufficient RAM", which is weird because I have 8gb of VRAM and 32gb of DDR4 at 2933mhz. I looked up a solution and tried it out. As usual, it did not work. Now I am halfway through with zero crashes all over again. Now, I love Black Mesa. I love the Half Life series. But Black Mesa as a game absolutely has issues. It's such an amazing-looking remake with all its extra details, even in the characters, from animation to sound. Gameplay is as awesome as any Half Life game. And the saves in the game help you pick back up whenever it crashes. I'm just wanting to point out that the issues with Black Mesa, from my experience, go beyond just hardware and software. Saving is absolutely essential in playing the game. I know I'm not the only one with these issues. But still, I don't think your issue in this specific title at least stems from the ATI card. It's just something weird with the game is all. Great game though.
I had their last true flagship card, the Radeon X1950 XTX with 768MB VRAM. It was a night and day difference from any other GPU I had at the time, including a GeForce FX 5950 Ultra. I was playing Oblivion and I had an FX 5700. Too slow. Went to a 5950, getting better but still not there. Went to an X1300. The 5950 was faster. Finally dumped them all and got the fastest and most expensive card in CompUSA that day, the X1950 XTX. An OC version with a beefy cooling system. First triple slot card I had ever laid eyes on, let alone owned. I loved it. I could not believe how beautiful Oblivion looked maxed out at 1280 x 1024. The walls inside the spires in Oblivion itself breathed. Something I never noticed before. They actually were animated to look like the spire was a living thing. I loved it. That card made that possible.
It's been noted elsewhere online that AMD had issues with integrating ATI into the larger corporation. That happens a lot in the tech world, usually to the detriment of one or more of the parent corporation's product lines. I can still remember the absolute nightmare of working with HP as a US-based laptop & desktop warranty repair services provider after HP purchased Compaq. The whole Compaq & DEC merger hadn't happened too long before that, which only made everything worse. In my personal experience the driver situation did get better with the Radeon 4000 series of cards. I had a 4850 back then which was pretty solid in Vista and later in Win 8 (had 2 OEM machines that were purchased at the worst times for Windows!) AMD's drivers have gotten a lot better over the years, but they're competing directly against Nvidia, whose drivers have been outstanding since the mid 2000s. These days I think AMD's biggest issues with GPU sales are the lack of a true high-end GPU to compete with Nvidia (obviously) as well as the lack of a truly spectacular mid-range card. In the 7000 series there's nothing priced less than $250 new, and the 6000 series cards below the 6600 are all gimped in some major way. Nvidia can afford negative press for gimping their lower-end cards; AMD can't IMO. I personally own an RX 6600XT which I was able to purchase in the summer of 2022 for almost $40 under MSRP, at a time when pretty much all Nvidia cards were running 25-50% above their MSRPs. Driver-wise that card is quite good, but only about 95% as good as the 1060 3GB and 750ti I had before it, as the drivers for both of those cards were usually outstanding!
You say lack of true high end, but the 6900 XT was on par with the 3090, and the R9 290X was a great card. IMHO the times AMD/ATI messed up were not releasing TeraScale a year sooner, and not releasing RDNA1 two years sooner. The result was GCN sticking around for too long; 2011-2019 was all GCN, and in that time Nvidia went from Fermi (fail) > Kepler (arguably worse than GCN) > Maxwell (better than GCN) > Turing. Pascal was Maxwell but on 14nm steroids, and I'm pretty sure we could have had a Samsung 14nm 1080 Ti 12GB in 2015/2016 for $1499 if AMD had been competitive enough. AMD should have released Bulldozer 2 years sooner also; FX-9980 vs i7-980X would have been 14900K vs 5700X but in reverse. I think the big problem was GloFo 28nm being delayed, then GloFo 14nm again. I wonder, if AMD had gone fabless for 65/45nm, kept the WSA for 28nm (i.e. GCN cards + FX CPUs), then jumped to Samsung 16/14/12nm for Polaris/Vega, would they have released on time in 2015/2016?
You sound like me, although I was a bit behind the curve (I dropped my 750ti for the 1060 in 2020) and got a 6750 XT last year. I've had good experiences for the most part.
@@user-lp5wb2rb3v That's only in rasterization if I recall correctly. In raytracing, AMD still has no answer to Nvidia despite being 3 generations into RDNA... And before you say "but raytracing is pointless for gamers, it doesn't matter!" Clearly it does, considering AMD's pitiful market share these days.
It's definitely a space heater with that TDP. You can get cards with much lower power draw that produce better results. Good video mate, I won't be getting one of those cards anytime soon lol.
I love channels like this. I used a 512MB Sapphire 3850 AGP card for years because, well, I was stuck on a single core Socket 462 Athlon XP 1900+ M (1.6GHz) overclocked to around 1.8-1.9GHz (it cooked itself after about 2-3 years of usage) with 3GB of DDR 400. It got me around 70-80 fps (with dips down to 40ish) in shooters like America's Army, WarRock, Combat Arms and CS. Later it was relocated into a 3.0GHz Socket 478 P4 with 4GB (3.2GB usable) of DDR400, which lasted me for years till I bought a 2.8GHz tri-core Phenom II 720 BE, which is when I finally got an 8800 GT (also 512MB) and the system was gifted to my father... it still runs btw.
@@GrumpyWolfTech No. ATI was still around and had been developing R600. Think about it: it took less than a year from the acquisition of ATI until the release of R600, but developing a new GPU architecture, tape-out, production etc. is a process that takes many years.
I know these cards are kind of terrible but I personally love them. Funnily enough, I just bought my third one yesterday. Now one may ask, why three? I got the first one because of your original video on it (I just fell in love with the red transparent shroud with the flame prints). After that I decided, since I had no use for it, to build the "ultimate" 2007 Windows XP machine around it. About 150 bucks of retro parts acquired over the internet later, I realised the only thing missing was Crossfire, so after the second GPU turned out to be fried the obvious move was to buy a third one 😂
I owned this card. I remember my 7800 GTX bricked and I used this one until my 9800 GTX. I don't really remember suffering from bad performance at the time. It made me realize that good gaming experiences could happen outside Nvidia. This card was a refresh for ATI, along with the other cards of that generation. The first true in-house AMD card was the RX 480, which has aged very well. Its refresh, the RX 580 8GB, still plays all modern games today.
The Nintendo GC was powered by ATI as well, and that machine had some beautiful games. Star Wars Rogue Squadron and WWE Day of Reckoning 2 are two amazing examples of how good the GC could look. All thanks to ATI. As a young kid I also had a nice 128MB Radeon 9000 series card in my PC that I loved.
GTA Online has a simplified physics system vs single player mode, and there are some graphical reductions too: LOD, render distance, NPC movement etc. It is lighter to run until you hit a high player count, so for older cards it typically runs better. It's honestly amazing how badly GTAO on the 360 ran over time, but considering the hardware, it's a damn miracle it even kept up. I think the heat this card has sat at for a long time has hurt the core and memory, since they bleed heat throughout the PCB horrendously badly. Once overclocked, these were the Fermi/Pentium 4 of their time. This is also a key showing of how poor the early PWM fan curves were; Nvidia cards would suddenly scream at 86C, whereas these cards would just cook and cook. They require a very thick paste to tame them. For example, I have one with a passive cooler that at full bore ran almost 110C (curious me wanted to see where it tapped out), but in games it sat at the high 80s you saw on this air cooled card, which ran in the mid 60s with a tickled fan curve. It didn't take a lot, an earlier ramp made a world of difference. The pump-out on cheap paste is insane.
I miss the days when amd/ati's worst mistakes were just trying to compete against a strong nvidia generation. Vs today where they're almost constantly shooting themselves in the foot.
RX 6000 looked like a ray of hope since each tier (as long as you ignore everything under the RX 6600, as someone mentioned) performed very well compared to their competition, but RDNA 3 largely missed the mark for some reason.
There is a reason why the GeForce 8800 series cards are so rare and highly priced: the 90/80/65nm cards from Nvidia have the "soldergate" problem, where the solder joints start breaking. If you want an 8800, don't do it; go for a 9800 GTX+ 1GB instead, which is the G92b chip on 55nm with the solder problem solved. They go for around 40€ here. As for the problems on the HD 2000 series, they already started with the X1xx0 series. Many may deny it, but it's the reason I sold my old ATI X1650 Pro 512MB, which was pretty fast, as fast as possible, as I had plenty of problems with it. The driver problems got worse for the HD 2000 series and the HD 3000 series. Then AMD fully took over the driver team, and with the HD 4000 series the drivers got better for the 3000/4000 series. But the damage was done and the HD 2000 never got its drivers fixed. I have several 4000 cards and nearly the complete range of the 5000 series, and they work fine, and with the Adrenalin drivers the cancer that was Catalyst is finally over. But everybody remembers those drivers.
I still run my old C2QE/R9 380 PC even though I recently got an R5 5600/RX 5700 XT for daily use, because it really does 99% of the stuff I use a PC for. Honestly it's only for recent games and music production that I boot up the Ryzen.
We're still seeing this tbh. People are biased towards Nvidia despite AMD offering a fantastic experience with their current RDNA3 cards and software. In my opinion it's just yet another case of Nvidia pioneering new technology (Raytracing, upscaling/framegen) and using that as their selling point until AMD catches up to it. Once that happens, I'm sure it'll all mysteriously disappear like all their other gimmicks did.
The 2900 XT was actually a pretty good card, a very standard generational leap; the 8800 GTX was just exceptional. AMD hasn't really had many bad releases, it's only whenever Nvidia releases an exceptional generation that it's viewed as "bad". The 2900 XT was a massive increase in complexity and they had a lot of issues sorting drivers out at launch. This was made a lot worse by it being dropped for the highly simplified version of it, the HD 3870, which was released very shortly after, so many outstanding bugs never got fixed since getting a competitive version out the door was all hands on deck. The 2900 XT with modern drivers in Linux is actually really good to this day.
No, it was terrible. AMD decreased the number of TMUs to 16 while making them FP16 capable, but most games did not use that. They then removed the hardware AA resolve mechanism, making MSAA ridiculously slow, so slow that when you used it the card was noticeably slower than its own predecessor, the Radeon X1950 XTX. It was terrible. The lack of usable MSAA also made the whole 512-bit bus useless. The 3000 series was the same thing, just a die-shrink so they could be made cheaper. The 4000 series increased the number of TMUs to 40 and added the AA resolve back, and those were decent cards; they started the last great price war, and GPUs became really cheap back then, though the software side of things with AMD was still murky at best. The later TeraScale cards, though praised for price and efficiency, had a lot of issues. Terrible tessellation and DX11 performance on the 5000 and 6000 cards, software issues with AF and Vsync state enforcement not working, and also cheats when it came to texture filtering (visible bilinear-dithering lines on some metallic/shiny surfaces that looked OK on Fermi) made them kind of a compromise solution.
@@Lady_Zenith Hardware AA resolve was actually in the card, but disabled at launch due to issues. In theory, had the 8800 GTX not been as far ahead as it was, they could have brought this feature back in. Tessellation on the HD 5000 was fine; the only issue was that it was like RT today: Nvidia released the 400 series nearly a year later with stronger tessellation, so they pushed affiliated studios to crank tessellation way past what was necessary. DX11 performance was fine on the HD 5000; it held pretty close to the 480 despite coming out nearly a year earlier. The 6000 series was not the advancement people wanted though, it was more of a refresh than a new gen.
The 2900XT was expensive to make (big chip, plus the 512-bit memory ring bus which made the PCB pricey as well), it ran hot and loud, and it only performed decently until you turned on anti-aliasing - some genius had the bright idea to remove the specialized AA hardware, thinking that the GPU had plenty of compute power and that developers would make custom AA modes to best suit their games instead of relying on a single fixed function (or it was bugged, as some claim). The 3870 was a die shrink (price, power, heat and noise go down), it used a 256-bit memory bus (price goes further down, performance goes down slightly) with tweaks that included hardware AA (performance goes way up when AA is enabled, eliminating the 2900XT's Achilles' heel), DX10.1 support (a first!) and a hardware decoder for (at the time) modern video formats. It performed and sold really well, forcing Nvidia to release the 8800GT, killing the GTS and making the GTX kind of pointless. EDIT: Then the next generation showed up and, in the form of the 4670, the same (well, very similar) level of performance was made available at a sub-$100 price point. It was the first graphics card that I purchased out of my own pocket (I was using the old Radeon 9800 Pro until that point) and for the price it was brilliant, playing all of the console ports (i.e. all of the games) of the era at 1080p, which was as high a resolution as you would use back in that day (1920x1200 notwithstanding).
The Linux AMD/ATI drivers are open source and are directly integrated into the Linux kernel. HL2 has a Linux-native version. You might want to test that software combination, it very likely works better than the Windows drivers and might be a glimpse of what the card could have done.
From 2004 to 2016 I only used ATI GPUs. Had a 9600, HD 3650, HD 5670, HD 7750. At the end of 2016 I bought a GTX 1050 2GB. Then I got a laptop with the GTX 1050 3GB version in 2020. And now I've got a laptop with an RTX 4060. So since I changed to the 1050, I've gone with Nvidia. I'm very happy with my purchase. At 1080p it runs everything super well.
Thanks for the insight into the Radeon 2900XT! I'd say it's a decent card to put into an older system if you want to play retro games... it's just a shame that the last AMD drivers for this were a flop. This probably also explains why the HD3000 series and HD4000 series could also never play GTA very well without modded drivers, due to the awful final release drivers!
I got a 2900 Pro for a good price (under $300 at the time); it happily overclocked to ~840MHz and had the same 512-bit memory bus. Essentially an XT for much less money.
I remember the R350 range of cards getting the community (Omega) drivers, which, amongst other things, let you softmod the 9800 SE to a 9800 Pro. Might be worth looking into... also, wasn't the 2900XT supported on macOS as well? Or was that the 2700XT???
I honestly can't see the reason for your rant at the end, considering almost all of the games you mentioned were released or changed way after the 13.4 driver. The only game that hasn't changed since that driver release was Crysis and it pretty much stayed the same, meaning the card was pretty much well optimized for it. All the other games were either released after the 13.4 driver or had updates on them. I remember back in 2013 that _all_ Source games had an update that made them run with half the performance (compared to before the update) on my 9800 GT.
Ahh, AMD/ATi's Pentium 4 Netburst moment for Radeon. My friends' 8600GT and 8800GTXs were making me envious versus my HD2600Pro. They eventually did succeed with the HD4000 series and HD5000 series. The HD4830 and HD5770s were bang for the buck cards. Driver-wise, you can still make these run in modern systems using modded drivers from Leshcat or Amernime. HD7000 with the GCN architecture was the start of "modern" Radeon cards though. Still working today with Vulkan and DX12 in my fiance's system as a HD7770.
I wonder if there are any custom drivers made by the community that could better utilize this graphics card. Either way this video was pretty interesting, well done!
Companies should open source the drivers for cards like this so a community can maintain them if they so desire. Give old cards like this some of the updates that would clear up stuff like the GTA bug for even cards this old.
From what I recall (and I was an ATI fanboi for a long time), even before the AMD acquisition ATI was a floundering organizational disaster. As much as I didn't want to acknowledge it back then, AMD was the best thing that ever happened to ATI. Things just took some time to "fix". And yes, I even "upgraded" from a Rage 128 Pro to a Rage Fury MAXX XD.
3:12 you could re-paste two cards with this amount of thermal paste. Also, you should pre-spread it. Anyone doing/saying/thinking otherwise is just wasting thermal paste (which isn't a problem today with non-conductive thermal paste, but back then most high-performing thermal pastes available were highly conductive).
Not sure if I'm misunderstanding, but is the overlay at 6:06 supposed to say "power consumption: 37 Watts"? Or is it just cut off, where it's like 37W ambient and 170W under load? It just seemed weird, so I want to make sure.
Always funny looking back at tech we thought was great at the time and seeing how bad it looks now. In 2009 I bought a PC with an AMD HD 4350 graphics card (which is so much worse than the 2900 XT). Back then I only used to play Counter-Strike: Source, and I thought it was fine... I used the PC and GPU until 2014 when I upgraded and went with an R9 290X; that was a massive jump in performance.
I remember buying 3 old Dell workstations and being surprised to find an ATI FireGL V8650 inside, which is a HD 2900 XT with 2GB of VRAM. I took it out, tested and cleaned it, and verified it's still working well; it's still on my GPU collection shelf.
What frustrates me most is that despite the advances in tech... we still haven't gotten a half-height, single-slot performance equivalent of the 8800 GTX or HD 4870... But I do remember the 2900XT, and how many of the 2000 series I had to RMA.
The 2900XT was designed by ATI. AMD's first high-end graphics card was the 4870, following up on the actually-good 3870 that was still an ATI design. The 4870 was just twice as fast, and the 5870 that followed was twice as fast again, bringing them to about 50% market share against nVidia.
This card was AMD's modus operandi for many years: a smaller, cheaper die with a wider memory bus than the die size would suggest. I was not surprised when the ~250mm², 256-bit 5700 XT was launched; it was on 7nm after all. What did surprise me was how good it was. When I bought one for ~$360 USD, it was somehow beating the RTX 2080 in several games. Sure, the 2080 still won in most games, and soon Nvidia would release drivers that made the 2070 faster than the 2080 had been a month earlier, but still, for a ~250mm² card to compete with a 500mm²+ card, from a company that never seemed to focus on GPUs, was a wild surprise to me.
HAHA, I knew this would be the card. Yes, this was a big Fermi-like moment for AMD. However, the era allowed them to quickly come out with the 3800 series, which was cheap and efficient, followed up by the 4800 series, which hit the sweet spot. In those days it wasn't about the crown, it was about the $250-350 market, and then, if you could, putting a halo on top. Nvidia had its issues too, with things like the FX 5000 line and later Fermi; otherwise the 6000 - 8000 series were very good imho. The release cadence and respins were much faster back in those wild west days; one could expect a new card and then its refresh within 10 - 18 months... crazy now that you think about it, compared to the waits between generations today. We all knew it would happen eventually. Great vid
I will definitely do a revisit on the HD3000/4000 series as I do remember them being better. Seemed only right to cover this after seeing how good the ATI 9000 series was last week.
I had one playing Star Wars Jedi Outcast. I used to have quite a few old cards, like the MSI MX420. Now I have a legacy rocker with Titan Xs in SLI, and I use that with an R7 3700X to play all my games, RDR2 and earlier. I got my Titans to not stutter and it is beautiful. Had to roll back drivers for them to run right.
Hey, Budgets, do you test the card with multiple HDD/SSD Windows/driver installations? Swapping Windows installations on boot would probably make it faster to test different drivers, and I bet having games on a separate partition is what you're doing already. Like, if you reinstall drivers you need to restart anyway. Idk if there's a better way.
I guess I bought one of the last ATI cards before the buyout. It was my first foray into PC modification and I was too young to really know much about tech (it had a cool box), but it was a great card that I used for years. I haven't bought or used an AMD card since. Even now, as I've been looking to finally upgrade from my old 1080 Ti warhorse (my pick for the best card ever made), I still can't justify going back to AMD; the price to raw power value is great, but the feature set is still lagging far behind and the drivers still have some instability issues. I'd rather pay extra to have the best of the best features, especially considering I plan on keeping the same card for at least 4-5 years. I'm not just anti-AMD, I do think the 7800X3D is the king of the CPU market right now. It's just sad that they still aren't on NVIDIA's level with graphics cards, because true competition is what's going to get us the best prices, performance, and features. Maybe making both high-end GPUs and CPUs is too much for one company to handle.
I'd check to see if the core voltages set themselves properly. I used to run a Radeon HD 6850 where one driver revision started setting the 2D and 3D core voltages the same (to the 2D voltage), so the card would freak out in games. I always had to manually edit the config and then set it to read-only, because the driver kept changing it back.
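If anyone wants to script that "edit the config, then lock it" workaround, here's a minimal sketch of the idea in Python. The profile path is purely a placeholder (the actual file an old driver rewrites will vary by driver version and system), so treat this as an illustration of the read-only trick rather than a recipe:

```python
import os
import stat

# Placeholder path -- the real profile/config file the driver rewrites
# depends on the driver version and OS, so adjust before using.
PROFILE_PATH = r"C:\Users\me\AppData\Local\ATI\ACE\Profiles.xml"

def lock_read_only(path: str) -> None:
    """Clear all write bits so the driver can't silently overwrite the file."""
    mode = os.stat(path).st_mode
    os.chmod(path, mode & ~(stat.S_IWUSR | stat.S_IWGRP | stat.S_IWOTH))

def unlock_for_editing(path: str) -> None:
    """Restore the owner's write bit before making further manual edits."""
    os.chmod(path, os.stat(path).st_mode | stat.S_IWUSR)

if __name__ == "__main__":
    lock_read_only(PROFILE_PATH)
    print(f"{PROFILE_PATH} is now read-only")
```

Doing it in a script just saves re-ticking the read-only box every time a driver update resets the file.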
Terascale was notoriously difficult to work with and optimize games for. Add to that the crappy AMD drivers and you get magic. Still love the cards and I wish I still had my HD2900PRO.
When I first started PC gaming I was all AMD. Then I went to an AMD CPU and Nvidia GPU, then to an Intel CPU and Nvidia GPU. Then about five years ago I switched back to all AMD again. I have no regrets going back to AMD.
According to my understanding, the HD 2000 and HD 3000 series had issues with ROP hardware resolve for anti-aliasing, so they had to use shaders for it, which killed the performance.
They were still impressive - both camps back in that time. I remember working in an online shop and actually holding a fresh one in my hands. That feeling, smell and sensation, never to be experienced again with such leet high-end hardware.
Just wondering why you didn't install the modded driver you used in the last video featuring the HD 4890, which should be compatible given the same GPU architecture.
Another big thank you to our channel artist with some more Retro Radeon Renders: ruclips.net/user/shortsH3Lo-yvjD4A?si=Py9WdrNBnVkSl6Yb
Xbox 360 HD graphics? PCs had way better graphics the entire time computers have existed
You've tried the alternative drivers from Amernime?
One other tip: try Linux for these old cards, the open-source drivers totally destroy AMD's own on these cards... I recommend testing Solus Linux for example. I just link it to my games library from Windows in Steam and such, and away we go.
Solus has Steam, Proton and the rest in the Software Center ;)
Can you do a video on the HD3850? And then the HD4830/50/70? I owned all of them. I'd be really interested to see/hear you test them. Cheers.
@@JohnChrysostom101 Yes, we know. He is talking about being impressed with the GFX the 360 was pumping out at the time. I remember being awestruck seeing Gears Of War for the first time on an HDTV. Ever played Forza Horizon 1? It looks fantastic even to this day. The eDRAM was what enabled them to include 4xMSAA, and it looks great.
Hearing "way back" and "2018" in the same sentence hurt way more than it should have...
I could hear 'way back', '1983' and 'Apple ][e' and be fine with that. They said future generations would be natural PC experts. What a joke. The proliferation of computer tech made more recent generations dumber, not smarter. But the funniest thing is that remembering 2018 makes you feel old when you're one of those PC illiterate kids I'm talking about.
@@Lurch-Bot
"I am 12 years old, you are 80.
Get a life."
@@Lurch-Bot shut up old boomer
@@Lurch-Bot don't think 2018 can be classified as "way back" yet; maybe 2008 can
@@Lurch-Bot Old man yells at cloud.
2006 seemed like such an optimistic time for tech. Now, 18 years later, things are kind of bleak
That can be said for the WHOLE world for EVERYTHING.
pre-2008 everything was looking great. XD
When the world reaches the peak of everything there is no more up, and then the only way is down
😂
I don't know what you mean. With artificial intelligence taking over everything within the next few years, the outlook for tech has never been better.
everything had its "blooming" age, but in the end only the successful ones make it
Considering this was around the same time NVIDIA introduced CUDA, yeah I'd say this is about when it all went wrong for Radeon as a GPU brand. They never recovered from the 8000 series as far as the market share graph is concerned.
I remember the best decision I ever made was getting an 8800 GT 512MB card when they came out; such a solid performer back then.
@@UKVampy Got myself an 8800 GTS (640MB) when it launched. Served me for years.
Not true. With the HD 4000, HD 5000 and HD 6000 series AMD had 48-54% GPU market share. Things started to change with the GTX 900 series and went bad during the GTX 1000 series vs RX 400/500 series, because AMD was stuck on the vastly inferior 14nm from GlobalFoundries, while Nvidia was using probably the most dominant process node of all time, TSMC's 16nm.
@@rattlehead999 this
My 4830 was faster than a 9600 GT for less money. The 2xxx series was shit, sure. HD 3xxx wasn't that bad, but the 4xxx series just dumpstered Nvidia. 5xxx, again. Actually, what really went wrong is that people thought AMD was shit just because of what they'd heard. Fanboys
Should also mention, AMD/ATI abandoned the X800 and X1800/X1900 series cards and everything associated with them fairly fast also. It was kind of a trend for them back then to do this.
At least the final drivers for them work, in all honesty. I used an X800 for years.
Yeah, I had a laptop once (to sell on) that had a Core 2 Duo and X1000 series GPU, and there were literally no working graphics drivers for Windows 10. But on Windows 7, where drivers existed... perfect! AMD and ATI did a horrific job with drivers with their older GPUs.
Kind of sad because DX9 was still king at the time. I imagine a revamped x1900 series would have been adored by all CoD players because ATI Tray tools was becoming a big deal.
If I remember correctly, the cards sucked at that time because of patent disputes. It tickles my memory that a key engineer left for Nvidia and he held key patents, or a patent holder didn't like the sale and withdrew the use of the patents. Either way Nvidia ended up with the patents and the rest is history
Once unified shader GPUs appeared, it was a wrap for everything before when it came to gaming.
A year and a half after the 2900 XT hit the market, AMD released the HD 4830 for $130, and it was over 50% faster with less than half the power consumption. Many of us bought them with rebates that dropped the price to $99 at launch.
Can’t forget the Legendary HD4890
@@BudgetBuildsOfficial I had a HD 4850, it was great
Don't forget about the HD 3870 which was basically a HD 2900 with adequate power consumption. I personally wasn't a fan of the HD 4000 series. Except for the 4770 which was - if I remember correctly - a test run of a new TSMC node (40nm?).
I really liked the old ATI cards, but also kinda forgot I used Omega drivers (third party).
Jesus christ, I've never actually thought about it. Now you've just reminded me of Omega drivers, and it had never occurred to me that those were 3rd party drivers, since I was a wee boy playing on my PC. Still on AMD though :D
3:17 turn the screw counter-clockwise first, until you feel the thread "click" a little. Then start screwing in the opposite direction. That way, you'll always find the start of the thread easily.
Good point, I always thought people did this automatically but I guess not.
Regardless of what ATI/AMD would have put out in 2007, the actual main takeaway is that for a very brief moment, console gamers had a superior experience over a high end PC, until Nvidia came along and shut that shit down with the Geforce 8 series. :P As it was though, back then I was using a Radeon X850 XT AGP card during my first jump into AM2 with the Athlon 64 X2. I went from that card to the X1950 XT AGP. But after the first Phenoms were released, I ended up doing a Phenom X3 Crossfire setup with two HD 3850 cards. I looked at buying two used HD 2900 cards, but it just would not have made sense due to the used market prices on them and the heat they generated. I did end up setting up an Athlon 64 FX-62 SLI system as well for PhysX stuff, and my first GPU pair in it was two 8800 GTS, before swapping them out for two GTX 280s.
HEY! I LOVE my 7900 GTO 😛
Except that 7th gen games were all locked at a sub-par 30 fps; even back then, PC gaming standards were never that low
@@chillhour6155 That's not exactly true. Many games were, some were not (Forza, Ninja Gaiden 2, Bayonetta and a chunk of others were not, if I recall right). The same applies to the PS3 also. And honestly, I couldn't care less about the frame cap if the fps was mostly consistent back then. I was using a pipeline/shader unlocked Geforce 6800 AGP during the Xbox 360 launch, so I was used to running stuff at around 30FPS, like NFS Carbon and Company of Heroes.
What mattered a lot, for me, was hardware like the triple-core Xenon with six threads and the modern feature set the 360 brought to the table, along with exclusive titles like COD 3, and titles that looked better on Xbox 360 like Ghost Recon Advanced Warfighter (I had an Ageia PhysX card and still felt the game looked and ran better on the 360).
For a brief moment the 360 was the only hardware in town featuring a unified shader architecture and a 3-core/6-thread processor. That was not present on the consumer PC side until the Geforce 8000 series and the later Radeon featured in this video, along with the C2Q and Phenoms.
BUT, and this is a BIG BUT, I did not care for how the 360 cooked itself to death. That was literally why I never got one for myself. We only got my son a slim model in late 2010 after we were sure the red ring issues were over. But for myself, I stuck to pc gaming. I have not had a modern console during its retail life since the Dreamcast.
*NVIDIA, all caps.
I expected Vega looking at the title.
That thing paid for itself 20 times over when mining showed up ^^
Vega VII comes to mind. It's unfortunate they didn't have an answer for the RTX 20xx series
Vegas were actually good.
I was thinking fury lol
Huh? Vega 56/64 crushed Nvidia. One of the best 1440p cards for the money. Now a 5700 XT for 150 bucks is the better choice, but both are crazy good. Sometimes you can get a Vega 64 for 60 bucks though, and then it's a no-brainer
The BIG problem with TeraScale wasn't that it wasn't powerful enough to keep up; it's that AMD's drivers struggled with the fact that most PC games were not well optimized enough to put its power to any real use. Basically, games had to be capable of constantly giving the GPU multiple things to do, and the hardware would shuffle between whatever was most optimal based on what resources were available. If a game wasn't giving it multiple batches of draw calls at a time, the card failed to live up to expectations; but when a game was doing that, it was a powerhouse. The rub of all this is that Microsoft was meant to fix it with DirectX 10, and did fix it with the Xbox 360's version of DirectX.
TeraScale 3 mostly fixed this, but it still depended on game developers implementing efficient draw call batching and multi-threading the GPU workload, and on AMD updating drivers with game-specific fixes. That's where the hardware scheduler of the GCN cards came from, and where the AMD Radeon "fine wine" reputation came from, what Mantle showed was possible, and why DirectX 12 and Vulkan were so badly needed.
Nvidia had exactly the same goals with Tesla-based GPUs, but they relied on you having 4 or more CPU cores so the driver software could do the draw call batching on the fly. It was part of the reason AMD dominated in a lot of DirectX 12 and Vulkan games while Nvidia lagged behind for a while, yet still won if the game was DirectX 11. Funnily enough, a GPU-bound DX11 game on something as modern as a 1060 with a dual-core processor will run slower than on a quad core, whereas the GCN equivalent would run about the same on dual or quad core.
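To make the batching point concrete, here's a tiny, purely illustrative sketch (not how any real driver or engine is written) of the difference between feeding a GPU one draw call at a time and grouping calls that share state into a few big batches, which is roughly what TeraScale needed from games:

```python
from collections import defaultdict

# Toy model: a "draw call" is just a dict, and a "submission" is a list of
# calls handed over together. Real engines group by pipeline state and build
# command buffers, but the shape of the problem is the same.
def submit_one_by_one(draw_calls):
    """Worst case for a wide GPU: one tiny submission per object."""
    return [[call] for call in draw_calls]

def submit_batched(draw_calls):
    """Group calls sharing a material/state so the scheduler always has work."""
    batches = defaultdict(list)
    for call in draw_calls:
        batches[call["material"]].append(call)
    return list(batches.values())

if __name__ == "__main__":
    scene = [{"mesh": i, "material": i % 3} for i in range(12)]
    naive = submit_one_by_one(scene)
    batched = submit_batched(scene)
    print(f"naive: {len(naive)} submissions, batched: {len(batched)} submissions")
```

The hardware scheduler described above only has something to juggle in the second case; in the first, it spends most of its time idle between tiny submissions.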
A few historical fixes: the tessellator in the 2900XT was not D3D compliant. Microsoft wouldn't finalize the spec until the Radeon HD 5000 series came along. ATi/AMD had their own developer tool to implement their tessellator, but due to the poor performance of the 2900XT, no developer that I know of implemented it. The Radeon HD 5000 series was the first GPU capable of D3D-compliant tessellation. nVIDIA did not have a GPU capable of this at the time. nVIDIA's answer would be an even more powerful geometry processor capable of extreme levels of tessellation performance (beyond what was even visible to the human eye in terms of image quality difference). nVIDIA would go on to attach obscene levels of tessellation to their "The way it's meant to be played" and other such titles in order to win benchmarks against AMD. AMD would answer with a tessellation slider in their driver to fix that in those titles. Eventually AMD would go on to implement a very high performance geometry processor of their own, negating nVIDIA's obscene levels of tessellation in their sponsored titles. Once that performance edge went away, those titles disappeared. In some of them, even the area below ground, which a player could not see, was being heavily tessellated.
We're seeing the same with RT today. We also saw the same with PhysX. All of those technologies have died or will largely die once nVIDIA's edge is lost. CPUs have replaced GPUs for physics. Tessellation is now down to reasonable levels. Adaptive Sync beat G-Sync. Etc.
Yeah, unfortunately Nvidia has a history of pioneering new technology but only ever using it to make their cards look better than the competition and justify their pricing. Once their gimmicks get beat, they always mysteriously vanish.
It's written NVIDIA, not "nVIDIA".
@@notsogrand2837 *NVIDIA
@@GrainGrown that's because you're young. Google "1999 nVIDIA logo". Even today, the N is not capitalized in their logo. It's an "n". Have a look yourself.
@thepcenthusiastchannel2300 It's literally the second sentence in NVIDIA's guidelines, "NVIDIA is written in upper case."
You're talking about a logo, it's just graphics, and it has its own set of guidelines.
You're welcome, kiddo.
I've never used TeraScale GPUs under Windows, but I did under GNU/Linux for many years, and the open source drivers (radeon ddx + r600g) were great for their time. Easily the best end user experience with open source drivers on the platform until GCN support finally started coming together, years after GCN 1.0 launched.
So it's interesting to hear that they sucked under Windows when they were my preference under GNU/Linux for ages.
I'm very curious about modern MESA performance with this card. I'd expect it to be fairly decent compared to the official drivers. And the work on running MESA drivers on Windows could mean that testing them might not require Linux (although I've seen some evidence that the process is still a bit janky).
@@benjaminmiddaugh2729 Sadly my HD 6870 kicked the bucket back during summer 2018. I think the only terascale card I have that still works is a passively cooled DDR3 HD 6450... so not all that exciting.
One aspect I think would be better with r600g now as opposed to when I actively used it is playing dx9 games. Galliumnine has become much more mature and accessible since then.
"Only fixed with driver updates that these cards never got" - I wonder if it would perform significantly better under Linux with Mesa, since there've likely been significant updates. May be limited to dx9 and opengl3.3 stuff though, since it won't be able to do dxvk.
I would love to see the performance difference under Linux. My experience with my RX560 under Windows vs Linux is that games run better in Linux.
Someone in the community a while back added Vulkan support to the R600 series, I bet the card would be in a far better state on Linux.
@@amirpourghoureiyan1637 Yeah, I've been following the Terakan development, since I have a few 6970s and 6950s. Still decent cards for 720p gaming, and Vulkan really opens that up for Proton and more modern casual titles.
I had a 2900 Pro card back in the day, and that could be unlocked to an XT. The card would run Bioshock just fine and Crysis with a mix of Medium/High settings. Not bad really, and it was OK as a cheaper option.
AMD's issue with GPUs has never been the hardware, it's always been the drivers. It's nice to see how well even the RX 470 still holds up today, but when it launched the drivers were bad.
There aren't more issues with AMD's drivers than with Nvidia's. Did you try any Intel GPU?
@@yx8074 there are a lot of problems with AMD drivers. Even for gamers only: I tried to run older games on AMD, and it was dogshit. A lot of missing files, missing DLLs. I did a fresh Windows install with the C++ installation and was never able to play The Settlers 2.
I also did a fresh Windows install with Nvidia; the C++ versions were the same files. The game just opened and ran smoothly.
@@yx8074 it is. Even for a gaming-only PC they don't stand up to Nvidia's drivers. I had an RX 580 and I never managed to play old games like The Settlers 2. I even tried doing a fresh Windows install, all drivers, C++ versions, missing DLLs.
I bought a 4060 recently and just did a fresh Windows install with the same C++ files I used before, and the game just opens and works.
literal AMD finewine
Tbf Terascale 2 was left in a much better state than this though. Terascale 1 never competed with Fermi, as the HD 5870 had already been out for quite a few months when the GTX 480 appeared. AMD still dropped Terascale 2 two years before Fermi got dropped, and without the rudimentary Dx12 support Fermi had, but then Dx12 wouldn't have worked well on Terascale 2 either way, because it doesn't lend itself to Dx12 (or Vulkan for that matter) architecturally. Terascale 3 could've been supported though.
I remember I had a 2600XT and it was my first gaming graphics card. I asked my friends what card to get and somehow ended up with it, even though they all deny that they ever recommended it.
It was a terrible card, but was absolutely amazing when I got it. Compared to the crappy entry-level or onboard graphics I was used to, it was like a whole new world. I upgraded to a 4850 a couple years later, which restarted the amazing feeling.
I have a BFG 9800 GX2 in my collection, I would love to lend this card to you for you to explore it on the channel.
Neat card! One of the few truly "dual graphics card" cards.
@00zero557A it is! It's essentially a sandwich of 2 9800 boards squashed together... it does run HOT, and I mean HOT... upwards of 91°C under load on both chips.
I actually have one somewhere. No clue if it works though.
@@OneLife69- Don't worry, the GTX 480 ran hotter out of the box, and the whole chip series had that feature. I think it could hit 105°C under high load and that was supposedly "fine".
Pretty sure that nvidia 4xx series were the hottest cards ever made so far. I'd like to see how they are right now.
edit:
Was the GTX480 that Bad?
ruclips.net/video/PbLuRPgzlLY/видео.html
@@BudgetBuildsOfficial If modern drivers make it viable, then a video could make sense. BUT if you basically rehash your video without showing any new titles, then the video might not have an audience.
I mean, if it can run some less RAM-heavy popular modern game, it might be impressive to see that.
VRAM is a huge issue with modern games, and an ini-config tutorial fits better with that Spanish dude who had tutorials for games that could run on ultra-low-end systems at 24+ fps. Too bad the dude sold his soul to Brilliant; haven't seen any new English videos. He was a great "competitor" to you.
I had one of these back in the day. It was absolutely brilliant in the winter.
It also seems this card is haunting your video, as you've got some odd audio stutters in a couple of locations. Probably nothing I'd reupload for, but worth mentioning.
Doom might have been released in 1993 but Budget-Builds Official is Eternal!
According to TechPowerUp, this card is about as good as the HD 4670, the 59-watt, $67 budget card that launched in 2008. AMD really messed up with this card and then did really well with the 4000 series, to have a budget card perform that well so soon after.
TeraScale's magic is on Windows XP only. I love the HD 2000-6000 cards on XP; I build ultra-versatile XP system setups with them. In these use cases they run 3DMark 2000 on XP flawlessly, and with the right drivers they play a ton of Windows 98 games on Windows XP, so these cards are very valuable to me.
Shame, as they're DX10 cards, but they do seem to manage very well in DX9.
According to most people, TeraScale 2 (HD 5000/HD 6000) was trash, all of the 2010 and 2011 iMacs and MacBooks with these GPUs died. As a 2010 iMac owner I can confirm that this is true. TeraScale/TeraScale 1 was supposedly good though
@@charliesretrocomputing TeraScale 2 was great on the PC side of things though.
@@TheDemocrab good point, I don’t know much about the PCs of the 2000s, just the Macs. 😂
@@charliesretrocomputing Apple just put them in chassis that couldn't cool them right and genuinely got a few bad batches of silicon
I think the problem in Black Skylands was that FSR was active, if the game has it. On unsupported GPUs, like Nvidia's 600 series, it will just show a black screen.
All the DX10 cards were an awful mistake, a path chosen for profit that still affects us today. That includes Nvidia's. The reason they existed was to bridge the gap and make a useful jack-of-all-trades card for both graphics and enterprise, which led to what we have today. These first-gen cards could often be beaten by DX9.0c cards in DX9 games, and so few DX10 games were ever made that it was pointless to grab one until the DX11 era if you already had a 7xxx or X19xx.
I'm so glad someone else knows this!
The cost of producing different architectures for enterprise computing and for general use is clearly not viable, otherwise all manufacturers would be doing it. The costs involved in developing and manufacturing modern GPUs and CPUs are mind-blowing. There are always compromises, whatever the technology or product... it's inevitable. Businesses have to balance any number of factors to remain solvent and competitive. It's what gives us the luxury of having a choice of what to buy.
Sitting in middle England in my tent, on my cycle trip, and it's pissing it down. But a budget build video always makes me happy :)
Truly a graphics card of all time.
I also built a brand-new $150 system.
Thanks for another video, love your content :)
Like these videos as always man.
I do want to point out though.
8:15 Weirdly enough, I have always had this issue with Black Mesa, across both Nvidia and AMD GPUs and different CPUs, across multiple Windows and Linux OSes. Black Mesa always crashes in the beginning stages. Then suddenly it starts working just fine for hours, then crashes all over again.
Hardware and Software being:
Xeons, i5-2400, i5-3570k, i5-8400, i7-8700, Ryzen 7 1800x, Ryzen 5 3600, GTX 1050, RX 560, RX 470, GTX 1070, RX 5500 XT, 8gb Ram, 12gb Ram, 16gb Ram, 32gb Ram, 1TB SSD for Ext4, 4TB NTFS HDD, 6TB BTRFS Enterprise HDD, NVMes, all across Windows 10, Windows 11, Linux Mint, Fedora, Debian, and Bazzite.
It doesn't matter. It just crashes.
I finally did manage to see a dialogue box pop up on Bazzite with the message "Insufficient RAM", which is weird because I have 8GB of VRAM and 32GB of DDR4 at 2933MHz. I looked up a solution and tried it out. As usual, it did not work. Now I'm halfway through with zero crashes all over again.
Now, I love Black Mesa. I love the Half-Life series. But Black Mesa as a game absolutely has issues. It's such an amazing-looking remake, with all its extra detail even in the characters, from animation to sound. The gameplay is as awesome as any Half-Life game. And the saves in the game help you pick back up whenever it crashes.
I just want to point out that, in my experience, the issues with Black Mesa go beyond just hardware and software. Saving is absolutely essential when playing the game. I know I'm not the only one with these issues. But still, I don't think your issue in this specific title stems from the ATI card. It's just something weird with the game is all. Great game though.
0:15 right! right! Nice to have you uploading again regularly
I had their last true flagship card, the Radeon X1950 XTX with 768MB of VRAM. It was a night-and-day difference from any other GPU I had at the time, including a GeForce FX 5950 Ultra. I was playing Oblivion and I had an FX 5700. Too slow. Went to a 5950, getting better but still not there. Went to an X1300. The 5950 was faster. Finally dumped them all and got the fastest and most expensive card in CompUSA that day, the X1950 XTX. An OC version with a beefy cooling system. First triple-slot card I had ever laid eyes on, let alone owned. I loved it. I could not believe how beautiful Oblivion looked maxed out at 1280 x 1024. The walls inside the spires in Oblivion itself breathed, something I had never noticed before. They actually were animated to look like the spire was a living thing. I loved it. That card made that possible.
It's been noted elsewhere online that AMD had issues with integrating ATI into the larger corporation. That happens a lot in the tech world, usually to the detriment of the one or more of the parent corporation's product lines. I can still remember the absolute nightmare of working with HP as a US based laptop & desktop warranty repair services provider after HP purchased Compaq. The whole Compaq & DEC merger hadn't happened too long before that which only made everything worse.
In my personal experience the driver situation did get better with the Radeon 4000 series of cards. I had a 4850 back then which was pretty solid in Vista and later in Win 8 (had 2 OEM machines that were purchased at the worst times for Windows!). AMD's drivers have gotten a lot better over the years, but they're competing directly against Nvidia, whose drivers have been outstanding since the mid 2000s.
These days I think AMD's biggest issues with GPU sales are a lack of a true high end GPU to compete with Nvidia with (obviously) as well as a lack of a truly spectacular mid-range card. In the 7000 series there's nothing priced less than $250 new and the 6000 series cards below the 6600 are all gimped in some major way. Nvidia can afford negative press on gimping their lower end cards, AMD can't IMO.
I personally own a RX 6600XT which I was able to purchase in the summer of 2022 for almost $40 under MSRP at a time when pretty much all Nvidia cards were running 25-50% above their MSRPs. Driver wise that card is quite good, but only about 95% as good as the 1060 3GB and 750ti I had before it as the drivers for both of those cards were usually outstanding!
You say lack of true high end, but the 6900xt was on par with the 3090, the R9 290x was a great card
IMHO, the times AMD/ATI messed up were not releasing TeraScale a year sooner, and not releasing RDNA 1 two years sooner. The result was GCN sticking around for too long; 2011-2019 was all GCN, and in that time Nvidia went from Fermi (fail) > Kepler (arguably worse than GCN) > Maxwell (better than GCN) > Turing.
Pascal was Maxwell on 14nm steroids; pretty sure we could have had a Samsung 14nm 1080 Ti 12GB in 2015/2016 for $1499 if AMD was competitive enough.
AMD should have released Bulldozer 2 years sooner as well; FX9980 vs i7-980X would have been like 14900K vs 5700X, but in reverse.
I think the big problem was GloFo 28nm being delayed, then again GloFo 14nm.
I wonder: if AMD had gone fabless for 65/45nm, with the WSA covering 28nm (i.e. GCN cards + FX CPUs), and then jumped to Samsung 16/14/12nm for Polaris/Vega, would they have released on time in 2015/2016?
You sound like me, although I was a bit behind the curve (I dropped my 750 Ti for the 1060 in 2020) and got a 6750 XT last year. I've had good experiences for the most part.
@@user-lp5wb2rb3v That's only in rasterization if I recall correctly. In raytracing, AMD still has no answer to Nvidia despite being 3 generations into RDNA... And before you say "but raytracing is pointless for gamers, it doesn't matter!" Clearly it does, considering AMD's pitiful market share these days.
It's definitely a space heater with that TDP. You can get cards with much lower power draw that produce better results. Good video mate, I won't be getting one of those cards anytime soon lol.
I love channels like this. I used a 512MB Sapphire 3850 AGP card for years because, well, I was stuck on a single-core Socket 462 Athlon XP-M 1900+ (1.6GHz) overclocked to 1.8-1.9ish GHz (it cooked itself after about 2-3 years of use), with 3GB of DDR-400. It got me around 70-80 fps (with dips down to 40ish) in shooters like America's Army, WarRock, Combat Arms and CS.
Later it was relocated into a 3.0GHz Socket 478 P4 with 4GB (3.2GB usable) of DDR-400, which lasted me for years till I bought a 2.8GHz tri-core Phenom II 720 BE, which is when I finally got an 8800 GTX (also 512MB), and the old system was gifted to my father... it still runs btw.
I love that you're not just commentating recorded footage, you're essentially doing a let's play and go in-depth on the performance
Glad you changed this from the community post from ATI to AMD.
This is an ATI card. They only adopted the AMD branding with the HD 6000 series in late 2010
@@Txm_Dxr_Bxss You're missing the point, ATI was gone at this point, it was simply AMD using the patents and naming.
@@GrumpyWolfTech No. ATI was still around and had been developing R600. Think about it: it took less than a year from the acquisition of ATI until the release of R600. But developing a new GPU architecture, tape-out, production etc. is a process that takes many years.
Heh, I was stuck with an FX 5200 on a desktop and an 8400M GS in a laptop at the time :D Wouldn't have minded one of these bad boys, to be honest with you.
I feel you! I built my first pc with an 8500gs, lol
Seabreeze playing in the background made my day.
I know these cards are kind of terrible, but I personally love them. Just bought my third one yesterday, funnily enough. Now one may ask, why three? I got the first one because of your original video on it (just fell in love with the red transparent shroud with the flame prints). After that, since I had no use for it, I decided to build the "ultimate" 2007 Windows XP machine around it. About 150 bucks of retro parts acquired over the internet later, I realised the only thing missing was CrossFire, so after the second GPU turned out to be fried, the obvious move was to buy a third one 😂
I owned this card. I remember my 7800 GTX bricked and I used this until my 9800 GTX. I don't really remember suffering from bad performance at the time. It made me realize that good gaming experiences could happen outside Nvidia. This card was a reset for ATI, along with the next cards of that generation. The first true in-house AMD card was the 480, which has aged very well. Its refresh, the 580 8GB, still plays all modern games today.
How does it compare to the gallium r600 driver (with both nine and wined3d)?
The Nintendo GC was powered by ATI as well, and that machine had some beautiful games. Star Wars Rogue Squadron and WWE Day of Reckoning 2 are two amazing examples of how good the GC could look. All thanks to ATI. As a young kid I also had a nice 128MB Radeon 9000 series card in my PC that I loved.
185 watts!!! That was nuts back then, and for me it still is today, but nowadays that's called a low power budget.
GTA Online has a simplified physics system compared to single player, and there are some graphical reductions too (LOD, render distance, NPC movement etc.), so it's lighter to run until you hit a high player count; on older cards it typically runs better. It's honestly amazing how badly GTAO on the 360 ran over time, but considering the hardware, it's a damn miracle it even kept up. I think the heat this card has sat at for a long time has hurt the core and memory, since they bleed heat throughout the PCB horrendously badly. Once overclocked, these were the Fermi/Pentium 4 of their time. This is also a key example of how poor the early PWM fan curves were: Nvidia cards would suddenly scream at 86°C, while these cards would just cook and cook. They require a very thick paste to tame them. For example, I have one with a passive cooler that ran at almost 110°C full bore (curious me wanted to see where it topped out), but in games it sat in the high 80s you saw on this air-cooled card, which ran in the mid 60s with a tweaked fan curve; it didn't take a lot, an earlier ramp made a world of difference. The pump-out on cheap paste is insane.
I love how RUclips subtitles always writes "Terror Scales" instead of terascale 😂
I built a 8800GTS SLI rig around the time this card came out. What a time!
Early gang rise up, at first i thought i was watching another video of yours from years ago 😂
Power consumption at 6:05 ... 🤨Also yay, more Budget-Builds!
Should be 185 watts, good spot that.
I miss the days when amd/ati's worst mistakes were just trying to compete against a strong nvidia generation. Vs today where they're almost constantly shooting themselves in the foot.
Very true
Radeon never misses an opportunity to miss an opportunity.
RX 6000 looked like a ray of hope since each tier (as long as you ignore everything under the RX 6600, as someone mentioned) performed very well compared to their competition, but RDNA 3 largely missed the mark for some reason.
Its competitor is destroying itself over AI
@@yasu_red RDNA 3 behaved as expected. (They don't want to compete with NVIDIA)
Man, were all these uploads in the works during your hiatus? Loving the consistent uploads!
There is a reason why the 8800 series cards are so rare and highly priced: the 90/80/65nm cards from Nvidia have the solder problem, where the solder joints start breaking. If you want an 8800, don't do it; go for a 9800 GTX+ 1GB instead, which is the G92b chip on 55nm with the solder problem fixed. They run for around 40€ here. As for the problems on the HD 2000 series, they already started with the X1000 series. Many may deny it, but it's the reason I sold my old ATI X1650 Pro 512MB, which was pretty fast, as quickly as possible; I had plenty of problems with it. The problems got worse for the HD 2000 series, and for the HD 3000 series with the drivers. Then AMD fully took over the driver team, and with the HD 4000 series the drivers got better for the 3000/4000 series. But the damage was done, and the HD 2000 never got its drivers fixed. I have several 4000s and nearly the complete range of 5000s which work fine, and with the Adrenalin drivers the cancer called Catalyst is finally over. But everybody remembers those drivers.
As always, a damn enjoyable video 😃
*DON'T SELL IT*
? I have 3 I literally can't sell.
People are just biased to NVIDIA, GCN 1.0 and 2.0 was mutch better then Kepler, people still got NVIDIA.
I still run my old C2QE/R9 380 PC even though I recently got an R5 5600/RX 5700 XT for my daily, because it really does 99% of the stuff I use a PC for. Honestly, it's only for recent games, and music production, that I boot up the Ryzen.
We're still seeing this tbh. People are biased towards Nvidia despite AMD offering a fantastic experience with their current RDNA3 cards and software. In my opinion it's just yet another case of Nvidia pioneering new technology (Raytracing, upscaling/framegen) and using that as their selling point until AMD catches up to it. Once that happens, I'm sure it'll all mysteriously disappear like all their other gimmicks did.
*much
*than...
@@notsogrand2837 It's written NVIDIA.
@@GrainGrown Huh?
Would you say Terrascale drivers are Terrable?
The 2900 XT was actually a pretty good card, a very standard generational leap; the 8800 GTX was just exceptional. AMD hasn't really had many bad releases; it's only whenever Nvidia releases an exceptional generation that AMD's is viewed as "bad".
The 2900 XT was a massive increase in complexity, and they had a lot of issues sorting the drivers out at launch. This was made a lot worse by it being dropped for the highly simplified version of it, the HD 3870, released very shortly after, so many outstanding bugs never got fixed, since getting a competitive version out the door was all hands on deck.
The 2900xt with modern drivers in linux is actually really good to this day.
No, it was terrible. AMD DECREASED the number of TMUs to 16 while making them FP16 capable, but most games did not use that. They then removed the HW AA resolve mechanism, making MSAA ridiculously slow - so slow that when you used it the card was noticeably slower than its own predecessor, the Radeon X1950 XTX. It was terrible. The lack of usable MSAA also made the whole 512-bit bus useless.
The 3000 series was the same thing, just a die shrink so they could be made cheaper. The 4000 series increased the number of TMUs to 40 and added the AA resolve back, and those were decent cards. They started the last great price war and GPUs became really cheap back then, though the software side of things with AMD was still murky at best.
The later TeraScale cards, though praised for price and efficiency, had a lot of issues. Terrible tessellation and DX11 performance on the 5000 and 6000 cards, software issues with AF and Vsync state enforcement not working, and cheats when it came to texture filtering (visible bilinear-dithering lines on some metallic/shiny surfaces that looked OK on Fermi) made them kind of a compromise solution.
@@Lady_Zenith Hardware AA resolve was actually in the card, but disabled on launch due to issues. In theory, had the 8800gtx not been as far ahead as it was, they could have brought this feature back in.
Tessellation on the HD 5000 was fine; the only issue was that it was like RT today: Nvidia released the 400 series nearly a year later with stronger tessellation, so they pushed affiliated studios to crank tessellation way past what was necessary. DX11 performance was fine on the HD 5000 too; it held pretty close to the 480 despite coming out nearly a year earlier.
The 6000 series was not the advancement people wanted, though; it was more of a refresh than a new gen.
The 2900XT was expensive to make (big chip, plus the 512 memory ring bus which made the PCB pricey as well), it ran hot and loud and it only performed decently until you turned on anti aliasing - some genius had the bright idea to remove the specialized AA hardware thinking that the GPU had plenty of compute power and that developers would make custom AA modes to best suit their games instead on relying on a single fixed function (or it was bugged as some claim).
The 3870 was a die shrink (price, power, heat and noise go down), it used a 256-bit memory bus (price goes further down, performance goes down slightly) with tweaks that included hardware AA (performance goes way up when AA is enabled, eliminating the 2900XT's Achilles' heel), DX10.1 support (a first!) and a hardware decoder for (at the time) modern video formats. It performed and sold really well, forcing Nvidia to release the 8800GT, killing the GTS and making the GTX kind of pointless.
EDIT: Then the next generation showed up and, in the form of the 4670, the same (well, very similar) level of performance was made available at a sub $100 price point. It was the first graphics card that I purchased out of my own pocket (I was using the old Radeon 9800 Pro until that point) and for the price it was brilliant, playing all of the console ports (i.e. all of the games) of the era at 1080p, which was as high a resolution as you would use back in that day (1920x1200 notwithstanding).
Why is AMD the only word you wrote in caps?
The Linux AMD/ATI drivers are open source and are directly integrated into the Linux kernel. HL2 has a Linux-native version. You might want to test that software combination, it very likely works better than the Windows drivers and might be a glimpse of what the card could have done.
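For anyone who wants to verify that combination before testing, here's a small sketch (Python, assuming a typical Linux sysfs layout) that lists which in-kernel driver is bound to each GPU; a TeraScale card using the open-source stack should show up with the "radeon" kernel module.

import glob

# List each GPU known to the DRM subsystem and the kernel driver bound to it.
# On a TeraScale card with the in-kernel open-source stack this prints
# something like: /sys/class/drm/card0/device/uevent -> radeon
for uevent in sorted(glob.glob("/sys/class/drm/card?/device/uevent")):
    with open(uevent) as f:
        fields = dict(line.strip().split("=", 1) for line in f if "=" in line)
    print(uevent, "->", fields.get("DRIVER", "unknown"))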
Is the background music from SimCity 2013? It sounds eerily similar.
From 2004 to 2016 I only used ATI GPUs. Had a 9600, HD 3650, HD 5670 and HD 7750. At the end of 2016 I bought a GTX 1050 2GB. Then I got a laptop with a GTX 1050 3GB version in 2020, and now I've got a laptop with an RTX 4060. So since I changed to the 1050 I've gone with Nvidia. I'm very happy with my purchase; at 1080p it runs everything super well.
Have you ever used and tested an Nvidia Quadro card before? I have a K2200 4GB one.
Thanks for the insight into the Radeon 2900XT! I'd say it's a decent card to put into an older system if you want to play retro games... it's just a shame that the last AMD drivers for this were a flop.
This probably also explains why the HD3000 series and HD4000 series could also never play GTA very well without modded drivers, due to the awful final release drivers!
I got a 2900 Pro for a good price (under $300 at the time), it happily overclocked to ~840MHz and had the same 512 bit memory bus. Essentially an XT at much less money.
GTA IV is still borked to this day on pc. I can get around 80fps with a RTX 3070.
Yeah, in DX9, but with DXVK it works much better.
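For reference, the usual way that's set up (DXVK is mainly built for Wine/Proton, but people do drop it into the Windows version of GTA IV too) is just placing DXVK's 32-bit d3d9.dll next to the game's executable. A rough sketch in Python; the paths and DXVK version below are made up for illustration, so adjust them to your own install.

import shutil
from pathlib import Path

# Hypothetical locations -- point these at your extracted DXVK release and your GTA IV folder.
dxvk_x32 = Path(r"C:\tools\dxvk-2.4\x32")        # GTA IV is a 32-bit game, so use the x32 build
gta_dir = Path(r"C:\Games\Grand Theft Auto IV")  # folder that contains GTAIV.exe

# GTA IV only talks to Direct3D 9, so d3d9.dll is the only DXVK DLL it needs.
shutil.copy2(dxvk_x32 / "d3d9.dll", gta_dir / "d3d9.dll")
print("Done. Remove d3d9.dll from the game folder to go back to the stock D3D9 runtime.")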
I remember the R350 range of cards getting the community (Omega) drivers, which, amongst other things, let you softmod the 9800 SE to a 9800 Pro. Might be worth looking into... also, wasn't the 2900XT supported on macOS as well? Or was that the 2700XT???
I honestly can't see the reason for your rant at the end, considering almost all of the games you mentioned were released or changed way after the 13.4 driver. The only game that hasn't changed since that driver release was Crysis and it pretty much stayed the same, meaning the card was pretty much well optimized for it.
All the other games were either released after the 13.4 driver or had updates on them. I remember back in 2013 that _all_ Source games had an update that made them run with half the performance (compared to before the update) on my 9800 GT.
Am I the only one who noticed a few audio stutters in specific parts of the video? Though the audio bugs are perfect for this bad AMD card video.
Ahh, AMD/ATi's Pentium 4 Netburst moment for Radeon. My friends' 8600GT and 8800GTXs were making me envious versus my HD2600Pro. They eventually did succeed with the HD4000 series and HD5000 series. The HD4830 and HD5770s were bang for the buck cards. Driver-wise, you can still make these run in modern systems using modded drivers from Leshcat or Amernime.
HD7000 with the GCN architecture was the start of "modern" Radeon cards though. Still working today with Vulkan and DX12 in my fiance's system as a HD7770.
I wonder if there are any custom drivers made by the community that could better utilize this graphics card. Either way this video was pretty interesting, well done!
Companies should open source the drivers for cards like this so a community can maintain them if they so desire. Give old cards like this some of the updates that would clear up stuff like the GTA bug for even cards this old.
From what I recall (and I was an ATI fanboi for a long time), even before the AMD acquisition ATI was a floundering organizational disaster. As much as I didn't want to acknowledge it back then, AMD was the best thing that ever happened to ATI. Things just took some time to "fix". And yes, I even "upgraded" from a Rage 128 Pro to a Rage Fury MAXX XD.
3:12 You could re-paste two cards with this amount of thermal paste. Also, you should pre-spread it. Anyone doing/saying/thinking otherwise is just wasting thermal paste (which isn't a problem today with non-conductive thermal paste, but back then most high-performing thermal pastes available were highly conductive).
Too much thermal paste is not a thing. It's been proven. Sit down and shut up.
Not sure if I'm misunderstanding, but is the overlay at 6:06 supposed to say "power consumption: 37Watt"? Or is it just cut off, where it's like 37W ambient and 170W under load? It just seemed weird, so I want to make sure.
Always funny looking back at tech we thought was great at the time and seeing how bad it looks now. In 2009 I bought a PC with an AMD HD 4350 graphics card (which is so much worse than the 2900 XT); back then I only used to play Counter-Strike: Source, and I thought it was fine... I used the PC and GPU until 2014, when I upgraded and went with an R9 290X. That was a massive jump in performance.
How would Linux be? The same or better drivers?
Hey, just a quick question but..... how do you feel about terrascale drivers?
GTA IV plays best on Linux in my experience. Capped at 60fps (or 30, depending on your hardware) to stop the physics wackiness, it's smooth as butter.
I remember buying 3 old Dell workstations and being surprised to find an ATI FireGL V8650 inside, which is an HD 2900 XT with 2GB of VRAM. I took it out, tested and cleaned it, and verified it still works; it's still on my GPU collection shelf.
What frustrates me most is that, despite the advances in tech... we still haven't gotten a half-height, single-slot performance equivalent of the 8800GTX or HD4870...
But I do remember the 2900XT, and how many of the 2000 series I had to RMA.
@23:35 Yeah, I don't suppose Leshcat would fix that...
I remember using an HD 2600xt on agp with a P4 @3ghz with HT to play Medal of Honor: Airborne. Did pretty decent too!
The 2900XT was designed by ATI. AMD's first high-end graphics card was the 4870, following up on the actually-good 3870 that was still an ATI design. The 4870 was just twice as fast, and the 5870 that followed was twice as fast again, bringing them to about 50% market share against nVidia.
*NVIDIA, not "nVidia" or anything else.
This card was AMD's modus operandi for many years, a smaller cheaper die, with a wider memory bus than the die size would suggest.
I was not surprised when the ~250mm², 256-bit 5700 XT was launched; it was on 7nm after all. What did surprise me was how good it was. When I bought one for ~$360 USD it was somehow beating the RTX 2080 in several games. Sure, the 2080 still won in most games, and soon Nvidia would release drivers that made the 2070 faster than the 2080 had been a month earlier, but still: for a 250mm² card to compete with a 500mm²+ card, from a company that never seemed to focus on GPUs, was a wild surprise to me.
Half the power of my 8800gt came from the name. XFX 8800gt XXX Alpha Dog was good for at least 64bits extra depth on the memory bus.
I had this card back in 2008 and I loved it. It was my first high end PC and I held it for a long time.
HAHA, I knew this would be the card. Yes, this was a big Fermi-like moment for AMD. However, the timing allowed them to quickly come out with the 3800 series, which was cheap and efficient, followed up by the 4800 series, which hit the sweet spot. In those days it wasn't about the crown, it was about the $250-350 market, and then, if you could, putting a halo on top. Nvidia had its issues too, with things like the FX 5000 line and later Fermi; otherwise the 6000-8000 series were very good imho.
The release cadence and respins were much faster back in those wild-west days; one could expect a new card and then its refresh within 10-18 months... crazy when you think about it, compared to the waits between generations now. We all knew it would happen eventually. Great vid.
I will definitely do a revisit on the HD3000/4000 series as I do remember them being better.
Seemed only right to cover this after seeing how good the ATI 9000 series was last week.
lmfao, 11 seconds ago. Never seen that before.
That's what she said!
Man, who else remembers watching the original video and can't believe it's been 6 years 😢
I had one for playing Star Wars Jedi Outcast. I used to have quite a few old cards, like the MSI MX420. Now I have a legacy rocker with Titan X in SLI, and I use that with an R7 3700X to play all my games, RDR2 and earlier. I got my Titans to not stutter and it is beautiful. Had to roll back drivers for them to run right.
Hey Budget, do you test the card with multiple HDD/SSD Windows/driver installations? Swapping Windows installations at boot would probably make it faster to test different drivers, and having the games on a separate partition is what you're doing already, I bet.
Like, if you reinstall drivers you need to restart anyway, idk if there's a better way.
I remember the 7800 GTX smacking the X1950XT like a red-headed step child. It was the last Radeon GPU I used until I started using the RX 480.
My last ATi card was a Rage 128 Pro, or something like this. Damn, I'm getting old...
I guess I bought one of the last ATI cards before the buyout, it was my first foray into PC modification and I was too young to really know much about tech (it had a cool box), but it was a great card that I used for years.
I haven't bought or used an AMD card since, and even now, as I've been looking to finally upgrade from my old 1080 Ti warhorse (my pick for the best card ever made), I still can't justify going back to AMD. The price-to-raw-power value is great, but the feature set is still lagging far behind and the drivers still have some instability issues. I'd rather pay extra to have the best of the best features, especially considering I plan on keeping the same card for at least 4-5 years.
I'm not just anti-AMD; I do think the 7800X3D is the king of the CPU market right now. It's just sad that they still aren't on NVIDIA's level with graphics cards, because true competition is what's going to get us the best prices, performance, and features. Maybe making both high-end GPUs and CPUs is too much for one company to handle.
I'd check to see if the core voltages set themselves properly. I used to run a Radeon HD 6850 where one driver revision started setting the 2D and 3D core voltages the same (to the 2D voltage), so the card would freak out in games. I always had to manually edit the config and then set it to read-only, because the driver kept changing it back.
Terascale was notoriously difficult to work with and optimize games for. Add to that the crappy AMD drivers and you get magic. Still love the cards and I wish I still had my HD2900PRO.
When the bracket was bent without hesitation to install, I died ☠️☠️☠️
I would still be looking for the little tool to remove the bracket LOL
When I first started PC gaming I was all AMD. Then I went to an AMD CPU and Nvidia GPU, then to an Intel CPU and Nvidia GPU. Then about five years ago I switched back to all AMD again. I have no regrets going back to AMD.
Still haven't had a 2900 XT (or Pro) to this day. I had a 2900 GT as a temporary card though and at least it ran Dirt3 surprisingly well. :D
According to my understanding, the HD 2000 and HD 3000 series had issues with ROP hardware resolve for anti-aliasing, so they had to use shaders, which killed the performance.
Interesting to see this, because this GPU is as old as me. Do you plan on testing the XTX version of this card?
I have some more Terascale coming to the channel soon.
24:55 I feel like they did the same thing with the HD 3000 and HD 4000 cards; the 4000 series was better on drivers, but it still feels like they just dumped them.
They were still impressive, both camps, back in that time. I remember working in an online shop and actually holding a fresh one in my hands. That feeling, smell and sensation, never to be experienced again with such leet high-end hardware.
23:02 I did not expect that 😄
Just wondering why you didn't install the modded driver you installed in the last video featuring the HD 4890, which should be compatible given the same GPU architecture.