How Intel & AMD Made 3D Faster - 3DNow! vs. SSE

  • Published: 19 Oct 2017
  • These days AMD is synonymous with 3D; after all, they make graphics cards & the chipsets which power them. But back in the 90s their main focus was CPUs, and their main rival was Intel. Not very much has changed in that respect, but the late 90s was a different world where 3D games were still in their infancy. This meant that new technologies were flying about to try and squeeze the most from systems at the time. From AMD, one of those was the 3DNow! instruction set incorporated into the AMD K6-2 CPU. Intel's version was SSE, which was really an extension to their MMX instruction set. So in this video we'll take a quick look at both: what they do, and who ultimately won out in the early days of 3D gaming.
    ** For exclusive videos, mystery boxes and other rewards, please consider supporting me at; www.patreon.com/nostalgianerd... **
    ☟Sharing☟
    If you wish to share this video in forums, social media, on your website, *please do so*! It helps tremendously with the channel!
    ☟Subscribe☟
    ruclips.net/user/nostalgi...
    ✊Support Me! ✊
    *Please consider supporting the channel on Patreon*: www.patreon.com/nostalgianerd...
    Visit my eBay Shop: ebay.to/1QQpYyy
    Buy From Amazon (Amazon give a small commission to my affiliate account): AMAZON UK - amzn.to/2sTOsRW
    ★Join me on Social Media★
    Twitter: / nostalnerd
    Face: / nostalnerd
    Instagram: / nostalgianerd
    Web: www.nostalgianerd.com
    ★Equipment★
    Lumix G6 with Vario 14-42mm Lens
    Nikon D3200 with 40mm Macro
    Corel Video Studio Ultimate X9
    Corel Paint Shop Pro X6
    Blue Snowball Microphone
    ♜Resources♜
    Music (YouTube audio library):
    Dark Step
    Parasail
    Tuesday Dub
    If you believe I have forgotten to attribute anything in this video, please let me know, so I can add the source in. It takes time to make these videos and therefore it can be easy to forget things or make a mistake.

Comments • 169

  • @WDeranged
    @WDeranged 6 years ago +88

    3DNow! netted me an extra 10fps in Quake 2 back in the day. I remember being rather pleased about it.

    • @m9078jk3
      @m9078jk3 5 years ago +11

      That had Intel scared, because in that highly popular FPS the K6-2 performed similarly to the expensive Pentium II.
      Intel had the Celeron to compete with the K6-2, and in gaming it and the Pentium II at the same clock speed usually outperformed the K6-2, with the exception of Quake II.
      Otherwise the K6-2 matched the Pentium II in integer performance, and it stomped on the Celeron in integer application software.

  • @SpeedySPCFan
    @SpeedySPCFan 6 years ago +57

    I remember seeing "SSE/SSE2 required" on buttloads of PC game boxes and always wondered what it meant. Very interesting video!

    • @thecaptain2281
      @thecaptain2281 6 years ago +11

      And the funny thing was, all of them would still run on non-SSE CPUs, because really, who's going to alienate the rest of a HUGE market? The "requirement" was a flat-out lie, and they eventually changed the wording to "SSE Enhanced" or something similar. What wasn't a lie was the improved performance SSE/SSE2/SSE3 provided, as it was very measurable. Of course, AMD's 3DNow! was also very good for games that had the code to take advantage of it. Both brought improvements to the table for software that was optimized.

    • @mstcrow5429
      @mstcrow5429 6 years ago +1

      Now things like Windows and Office require SSE2. Dunno why.

    • @thecaptain2281
      @thecaptain2281 6 years ago +9

      +mstcrow5429
      No they don't. Windows 7, 8/8.1 and 10 all run fine, but slowly, on CPUs without SSE/SSE2/SSE3. I have personally tested this.

    • @mstcrow5429
      @mstcrow5429 6 years ago

      What exactly is it doing with SSE when running Windows Update?

    • @thecaptain2281
      @thecaptain2281 6 years ago

      +Hubert Hans
      I don't use Windows Update (and yes, my systems are just fine). Never have, never will. I don't trust Microsoft and never will, until they stop using the general public as a second-tier beta testing platform, among other distasteful things they do. And operations like compression/decompression use floating-point math; SSE can help but is not required. Your understanding of what SSE does seems to need improvement. I recommend doing some research on the subject.

  • @CaveyMoth
    @CaveyMoth 5 years ago +5

    My Phenom II X6 1100T supports 3DNow! and I am darn proud of it.

  • @jerrywh3
    @jerrywh3 6 years ago +23

    My first computer I built in ‘99 had an AMD K6-2 500 in it. Paired with my Viewsonic 21 inch monitor it made for some great gaming experiences.

    • @alexc3504
      @alexc3504 5 years ago +2

      @@tripplefives1402 He must've saved a lot of money buying a K6-2 rather than a Pentium, leaving cash for that expensive monitor, at nearly equal performance for less money. Go AMD!

  • @RobShootPhotos
    @RobShootPhotos 4 years ago +3

    All I know is that with my old Socket 7 K6-2 500MHz and aging dual stacked Monster II 12MB 3dfx cards, I upset a couple of my friends when they bought their new Intel PIII computers and I kept up with their frame rates at a LAN party. I had very much fine-tuned my computer to take full advantage of the 3DNow! tech, and I was nice enough to help their computers run better too.

  • @tiikoni8742
    @tiikoni8742 6 years ago +43

    I must say I was a bit distracted by those crazy ads in the background :-)

    • @GreenAppelPie
      @GreenAppelPie 6 years ago +4

      Tiikoni I kinda miss those outlandish commercials.

    • @kbhasi
      @kbhasi 6 years ago +3

      Me too. I was also distracted and kept having to rewind.

    • @V1VISECT6
      @V1VISECT6 4 years ago +4

      The 90s was a wonderful time.

  • @chrismcfee6785
    @chrismcfee6785 6 years ago +7

    I had a K6-2 350MHz. I loved it. I remember Quake had a 3DNow! patch. It was awesome!

  • @johncate9541
    @johncate9541 4 years ago +2

    If you had a 3dfx card back in the day, 3DNow! was a miracle-worker, allowing AMD's much weaker FPU to compete with Intel CPUs on the same games. I remember a point in 2000 when I was able to get NOS V2 and V3 cards for cheap, along with K6-III 333 CPUs and VA-503 motherboards, and I would run those chips at 392/112 with the Voodoo cards, and the setup was faster than it had any right to be. They didn't last long as serious gaming rigs at that point, but the systems I built were running for years.

  • @magnum333
    @magnum333 6 years ago +4

    I like how thorough your investigative videos are. I'd have liked to see more on this!

  • @methanbreather
    @methanbreather 6 years ago +19

    you failed to mention INTEL doing their usual shady/criminal stuff... which was also a factor and resulted in lots of fun for the lawyers.

  • @STOG01
    @STOG01 6 years ago +52

    Once again - competition made our experience so much better.

  • @SUCRA
    @SUCRA 3 years ago

    Very instructive, lovely video, thanks for another one.

  • @yangashi
    @yangashi 6 years ago +5

    oh man, I remember when those processors hit the market. Exciting times.

  • @DrewberTravels
    @DrewberTravels 6 years ago +6

    I completely bought into the 3d now advertising. I was always yelling at my friends that my computer was better because it had 3dnow

  • @zye8355
    @zye8355 6 years ago

    U are a gem of youtube!
    Keep up the good work

  • @charlesdorval394
    @charlesdorval394 6 years ago +13

    Descent: FreeSpace is my best game ever, thanks for showing it off!
    It deserves to be better known, especially with the HD stuff released afterwards.
    *goes to find a joystick* hehehe

    • @Rick_Ibbott
      @Rick_Ibbott 6 years ago +1

      Still one of the greatest game intros of all time in my opinion. The voice actor for the pilot really did a good job of sounding stricken with fear.

    • @Bitterman5868
      @Bitterman5868 5 years ago +1

      There's the remake with the new updated MediaVPs for FSO. Just install FS2 and then the Knossos launcher.

    • @EasyCheesyChodes
      @EasyCheesyChodes 5 years ago

      @@Rick_Ibbott Completely agree. It's my favorite intro to any game, ever.

    • @V1VISECT6
      @V1VISECT6 4 years ago +1

      Couldn't get into FreeSpace. Don't understand why, as Descent was the first video game I ever recall playing and I still love that series to this day.

  • @TefenCa
    @TefenCa 6 years ago +5

    Another great video!

  • @emtee40
    @emtee40 2 years ago +1

    I still got a K6 floating around somewhere. My dad built a computer in 2000/2001 and I got it gifted to me in 2005.

  • @BornaPrpic
    @BornaPrpic 6 years ago

    Another great video. Keep up the awesome work. From a guy born "beyond the iron curtain"-ish in 1987. Played on an Amiga 500 from age 3, and then got my first PC (a 286), which I bumped up later (much later, after my clone Nintendo Famicom, my cousin's Sega Mega Drive and my clone Atari 2600) to an AMD 5x86 133 MHz :) I think a long story about AMD should happen. It's an interesting one, especially since Ryzen came out this year. AMD is a great budget option and almost always has been (with dips here and there, but hey, it's not a behemoth like Intel hehehe).

  • @Xilefian
    @Xilefian 6 years ago +8

    MMX is an interesting one as that was pushed specifically as a way to have high-colour (24-bit) graphics at the same performance cost of 8-bit palette graphics, which is certainly the case with MMX (writing 3 bytes in 1 instruction for RGB, versus 1 byte in 1 instruction for palette index) however if you had a processor capable of MMX, chances are you had a GPU accelerator too, so despite gamers and the gaming press gobbling up Intel's message of MMX being a graphics revolution, games didn't use it and went straight to offering hardware accelerated graphics (with 8-bit palette software mode available for compatibility).
    Intel were in a strange war against hardware acceleration right up until 2010; Michael Abrash (famous low-level x86 PC programmer, helped work on Quake, was at Valve for VR, now Oculus) even worked on Pixomatic (software DirectX-7!!!) which was acquired by Intel, and then turned into project Larrabee (Intel's failed GPGPU project, was supposed to be an Intel GPU-CPU hybrid that competed against NVIDIA's CUDA). They figured out it makes more sense to focus on their integrated graphics (which are getting better and better and will play a big role when multi-GPU capable graphics+compute APIs like Vulkan get fully explored, AMD's APUs would be absolutely wonderful in this field).

    • @drewsebastino2889
      @drewsebastino2889 5 years ago +1

      Hah, I remember you from the "How powerful is the Wii U" video, not that anyone cares.
      MMX seems like another vestigial x86 compatibility layer, along with real mode and a lot that it entails. Especially now that x86-64 adds 8 more general purpose registers while maintaining all the original x86 registers and the SSE ones.

  • @RamonChiNangWong078
    @RamonChiNangWong078 4 years ago +3

    Those Intel Pentium II adverts.
    I imagine this is how Intel worked for the last 5 years on their 14nm+++++ technology.

  • @ElimAgate
    @ElimAgate 6 years ago

    Love the Freespace clips :)

  • @confusedkemono
    @confusedkemono 6 years ago

    Nothing better than starting off a video with a cracked voice!

  • @anteldrobat3880
    @anteldrobat3880 6 years ago

    One of the best channels on YouTube

  • @chrisbinghamton
    @chrisbinghamton 6 years ago +1

    Very informative! In my CS classes, though, I've always heard SIMD pronounced as "sim-dee".

  • @mackal
    @mackal 1 year ago +1

    3DNow! was the cause of the "west bug" in EverQuest (actually I think it was a DirectX bug in their usage and assuming something worked the same in SSE and 3DNow!)

  • @Xilefian
    @Xilefian 6 years ago +1

    Compilers these days can sometimes take advantage of these instruction set extensions without you needing to put much effort in, so even a basic program that isn't even game-related (with no 3D graphics) could get compiled to use 3DNow/SSE instructions for performance boosts. It doesn't happen as often as some people like to believe, but compilers will do their best to use extensions in cases where they are available.
    Generally, you will get better results and performance if you specifically use SSE intrinsic code and manually program the optimisation than hoping the compiler will take care of it, but the compiler certainly can (and will) take care of it when possible.

  • @thatGuyRULES123
    @thatGuyRULES123 6 years ago +6

    Still running one of the 3D-NOW! chips in my daily driver, the Phenom II X6 1100t Black Edition, bumped up to 3.8Ghz base 4.2Ghz boost, due for an upgrade at some point

    • @dutchdykefinger
      @dutchdykefinger 6 years ago +1

      i wasn't even aware that chip existed,
      do have an old phenom II x6 1055t that ran 2.80 -> 3.85 on bus clocking though :D

  • @wildbilltexas
    @wildbilltexas 6 years ago +3

    The K6-2 had a weak floating point for 3D gaming, so I saw 3D NOW! as just a quick fix or advertising hype. I remember several games including Quake that had 3D NOW! patches on their websites so they'd play faster on the K6-2.

    • @bbkcs
      @bbkcs 6 years ago

      They still sucked-ass compared to playing on a Pentium 166 MMX.

    • @njspencer79
      @njspencer79 6 years ago +3

      Um no. I went from a P166MMX to a K6-2/300 and it was a good boost in performance. The key to the K6-2 was having the latest drivers and 3DNow! patches on games like Quake II. For the dollar it had a lot of bang vs the PII.

  • @JulianUccetta
    @JulianUccetta 6 years ago +1

    :D I have an old Compaq sitting next to me with an AMD K6-2

  • @tanguyfautre8446
    @tanguyfautre8446 6 years ago

    Hi. Where did you manage to find such high-quality video of the 3dfx adverts?
    IIRC there were three ads: food, medicine and environment. Would love to download a high-quality version of each. The YouTube versions are highly compressed (i.e. poor quality).
    Never used 3DNow!, but did a lot of programming with SSE/SSE2. Nowadays CUDA and GPUs really take the whole SIMD concept to a new level; the recently added AVX-512 feels a bit underwhelming in that regard. We're now programming new apps using CUDA, never AVX-512. Intel may have won the old 90s SIMD battle against AMD, but NVIDIA is certainly ahead of Intel when it comes to massively parallel architectures and instructions.

  • @hubzcaps
    @hubzcaps 6 years ago

    love me some k62. fav CPU from back then

  • @SpAMCAN
    @SpAMCAN 6 years ago +2

    SIMD wouldn't have been able to modify multiple pixels at once, but it could modify every value of a colour at once, since SIMD traditionally only uses 4 data points at once, and a colour is made up of red, green, blue and alpha values. Otherwise, a brightness operation as you suggested would require modifying every single value of a pixel separately. In maths like this, SIMD grants a 3-4x speed improvement.

    • @jhgvvetyjj6589
      @jhgvvetyjj6589 8 months ago +1

      MMX already had instructions for bulk operations on eight 8-bit integers, and SSE2 made it possible to apply bulk integer operations to 128-bit registers, allowing operations on sixteen 8-bit integers at once.

  • @st.john_one
    @st.john_one 10 months ago

    I've had a K6-2 300 on an Abit mobo, paired with a Tseng ET6000, a Creative Voodoo2 and an SB32... AOC 19" CRT :D

  • @wilsonj4705
    @wilsonj4705 5 years ago +2

    I almost forgot how cheesy many of Intel's ads were back then.

  • @aceofhearts573
    @aceofhearts573 4 years ago

    What about Cyrix? Did they have something equivalent to SSE and 3DNow!?

  • @darrencolegold3642
    @darrencolegold3642 5 years ago

    Reminds me of my first 'gaming' PC: a Slot 1 P3 with a 100MHz front-side bus and SIMM-style RAM that had to be installed in pairs (I doubled mine to something still pathetically small). I upgraded the HDD, clean-installed Windows 2000, and fitted a GeForce4 4200 to its AGP port, playing the shit out of CS 1.5 and Quake rocket arena, getting together with high-school friends to LAN on weekends.

  • @0371998
    @0371998 3 years ago

    For browsers that require SSE2, could they use 3DNow! instead? Why are all the browsers only tuned for the instructions burnt into Intel chips?

  • @helio1791
    @helio1791 6 years ago +14

    Did Intel really still have the superior floating-point performance by the 00s? In general the Pentium 4 didn't stand a chance against the K7 on a clock-for-clock basis. I suspect the standardization around Intel's instruction set had a lot more to do with 'clout' than anything else.

    • @vanheflin3015
      @vanheflin3015 6 years ago +9

      Maybe. Most companies back then used a compiler that Intel provided. One of the compiler's core features was to make competitors' CPUs run slower. When people found out that spoofing your CPUID to an Intel CPU got you a performance boost in many apps, it was shown that in many cases the Intel CPU was slower than its competitors, even in apps it had been ahead in before you changed the CPUID. This was one of the big issues (there are many, many more) that got Intel sued around the world.

    • @michalzustak8846
      @michalzustak8846 6 years ago +1

      +randomguy8196 Except the AMD Duron was about equal per clock to the Pentium 3. And per clock, the P4 was WORSE than the P3, and the Duron was AMD's Celeron equivalent. So even a Duron beats a P4 per clock, let alone an Athlon. It was not faster due to its clock speed either, as the first-gen P4 did not clock above 2 GHz anyway. The first P4s were complete garbage. The Athlon was a superior CPU all around.

    • @Magnus_Loov
      @Magnus_Loov 5 years ago

      @@michalzustak8846 Then again, when applications were optimized for SSE or even SSE2, the P4 and even the top-clocked P3s had an advantage over the Athlons.
      For DAWs (music creation) Intel-based PCs were superior in performance (for real-time virtual instruments and effects when recording). They were extremely dependent on floating-point performance.
      Even when the Athlon XPs added support for SSE, their implementation was much slower than the "real deal" in the newer Northwood P4s.
      It was not until the AMD X2s that Intel could be matched (or even surpassed).
      But that just lasted a year or so before the Core 2 was released, and after that AMD was never able to catch up in that department (DAW music making), not even with Ryzen, where this area is about the only "productive app" category that was a bit disappointing.

    • @adg1355
      @adg1355 2 years ago

      The P4 really sucked at x87 FP, given that the Athlon had a superscalar FPU.

  • @honkhonkler7732
    @honkhonkler7732 5 years ago +1

    I'm on Team 3DNow! I had a K6-2 soooooo yeah.

  • @mactipiak
    @mactipiak 6 years ago +1

    My first proc in my own rig :) blowing up my savings on a K6-2 333MHz... In a case made by: Amstrad!

  • @vinyljunkie07
    @vinyljunkie07 2 years ago

    Uff, I wanted an AMD K6 so badly for gaming when I was a kid, and had to make do with a horribly slow and unstable Cyrix CPU or an old and even slower P2. So when I had enough clams later on I got an Athlon XP 2000... Wrong move! Almost all the software I was using then (music, not gaming) was optimised for Intel! I later upgraded to a 3000+ or 3200+ (I can't remember), and whilst this was better I still had stability issues from a crappy VIA motherboard chipset. That pretty much sold me on Intel for a long time, until recently when I switched to a Ryzen CPU, and I'm quite pleased with it.

  • @Leeki85
    @Leeki85 6 years ago +4

    I had an Athlon 1.3 GHz, and the lack of SSE instructions made that CPU obsolete way too early. It was just unfair to get a game that required a Pentium III 800 MHz, like Tomb Raider: Legend, and get a big "This game requires a CPU with the SSE instruction set" notification.
    The same went for almost everything released after 2004. I still have that computer, and for fun I installed Windows XP on it a few months ago. Almost no modern software works. You can't run Steam, which requires SSE2. You can't even run modern browsers. Opera 12 is probably the newest one that doesn't need SSE.
    It's probably Intel's influence that shut down 3DNow! When you write software in C++ or other popular languages, you don't use any instruction sets explicitly. Compilers do the hard work, and optimizing for SSE is just one option. There was really no reason not to support 3DNow! ten years ago. It would just have made the binary bigger by 1-5%. Even assuming it worked slower than SSE, an Athlon 1.3 GHz would still be much faster than a P3 800 MHz.
    There's really a bad trend of abandoning older hardware, even if it's still capable. Resident Evil 7 was a game that didn't even run on the i5 2500 and other CPUs from that period, because the binary was compiled to require a newer instruction set.
    Apple is just famous for their planned obsolescence. Each year they make some of their devices useless. You can't make 32-bit apps for iOS anymore, and Macs are being stupidly discarded. Safari just won't process HTML5 on CPUs older than Sandy Bridge, while Chrome, Opera and Firefox have no issue on the same computer.
    I'm the type of consumer that, if forced to upgrade, will switch to the competition's products. This is why I'm using Intel CPUs, Android phones and Nvidia GPUs. Although I might get a Ryzen in my next upgrade if AMD manages to fulfill their promise of extended support for the AM4 socket.

    • @az09letters92
      @az09letters92 5 years ago +1

      @@tripplefives1402 Not true. You can compile all options into the same binary and use function pointers to dispatch to the correct version at runtime. So one object file would contain do_something_scalar(), another do_something_3dnow() and a third do_something_sse(). At startup, you'd assign the correct version to the do_something() function pointer. Some compilers even do this automatically.

  • @AshenTechDotCom
    @AshenTechDotCom 5 years ago

    Would have been nice to have had some side-by-sides of comparable setups, for example a Celeron vs a K6-2 (or the rare K6-III model!), where 3DNow! made the K6-2 more capable than even an overclocked Celeron (I owned both, and anything that supported 3DNow! ran so much better with it... it was crazy...).
    There was even a 3DNow!-PowerSGL binary for a few games that those of us who had PowerVR cards made great use of. Not sure if it's even around anymore... it was hard to find even then, and was never official, but my god, 3DNow! stacked with 2x Matrox m3D cards was actually pretty strong... 1024 maxed out was no problem at all; games just maxed out... it was great... It made me sad they deprecated it; I mean, I get why, but I'd kinda like to see how it would affect the performance of older games on, say, an FX-line CPU, since I have not heard of anybody compiling Unreal (classic) or Quake with modern optimisations... though the idea isn't bad; it could make running them at high settings on very crappy hardware very doable... Pandaboard at full 1080p on classic Windows games?
    Anyway, I knew all of this, but for those who didn't, good info. Some mention of how much it boosted the K6-2 vs what it was competing against price-wise would have been nice. It wasn't an across-the-board win, but any game that supported 3DNow! really showed a huge gain. My friends who had overclocked Celeron 333s were all jealous of my K6-2 that I clocked to 566 or something close to that (the board would do a 133 bus). Without 3DNow! they won most times; with it... even games that chugged for them ran playably smooth for me. Another advantage was that the platform cost A LOT less than the Celeron rigs of the day.
    IF you do set up a K6 rig, you really need to delid the chip carefully and replace the thermal compound; even back in the day it was pretty crap, and by now it's dry AF. I helped a buddy build a K6-III system a while back when he got ahold of 2 working chips. With the right board we got around the same clocks; once I took the lid off one and replaced the compound, the temps dropped 15-20C (depending on which thermal sensor you believe... we also had 2 external sensors in that little lip where the IHS is glued down). Put some ASC in there, used a couple of dabs of superglue on 2 corners, and put it all back together, and we got a stable 666/667 MHz out of it... no joke, it's still running a couple of years later and gets used for classic games a lot; his son's friends will get on his classic systems and play old games like Unreal/Quake/etc for hours on end. The K6-III was a very nice chip... sad it's so rare... I didn't even know about it back in the day... then the Athlon, then the Duron came along for those of us on a budget... :)

  • @musclesmouse
    @musclesmouse 4 years ago +1

    I have a K6 III+ sitting around somewhere.

  • @DeetexSeraphine
    @DeetexSeraphine 4 years ago +3

    Funnily enough, modern diagnostic tools found in Dell and Lenovo systems (to name a few) still test for 3DNow! functionality... though marking it as non-applicable pretty much all of the time.
    This is interesting because said diagnostic tools could generously be considered completely beyond the capabilities of said processors.

  • @Smartzenegger
    @Smartzenegger 6 years ago +1

    So now it's kinda like "3DThen!"

  • @umageddon
    @umageddon 6 years ago

    Maybe you could talk about the Duron and Celeron 300B and overclocking in that era

  • @AgnostosGnostos
    @AgnostosGnostos 6 years ago +1

    For real 3D performance, only a graphics card with a powerful GPU was adequate. Of course, for laptops or very cheap motherboards with integrated graphics this technology was good.

  • @georgehilty3561
    @georgehilty3561 6 years ago

    When I was younger, I had a Pentium 3 running at 500 MHz with SSE, and he had a Pentium 2 running at MHz, and omg what a difference there was between the two! There wasn't much difference in clock speed, and everything else was comparable, but my Pentium 3 would run circles around his in gaming back then!

  • @jari2018
    @jari2018 5 years ago

    So what is the fastest (latest) APU/CPU with 3DNow!? (AMD 3870??)

  • @adg1355
    @adg1355 5 years ago

    3DNow! was more advanced: it allowed horizontal ops from its inception, something that SSE could only do from PNI (SSE3) onwards

  • @thefurrygamer1489
    @thefurrygamer1489 6 years ago +1

    Speaking of the Pentium 3, I know that processor kept on being used into the early 2000s

  • @PhilipStorry
    @PhilipStorry 6 years ago +21

    I felt sorry for AMD when it came to 3DNow!/SSE.
    If they don't do something, then they're clearly behind Intel. So they must do something.
    Whatever they do isn't likely to get as much marketshare, so it gets less use, and therefore there's not a huge return on that development effort.
    And yet, as usual, AMD reminded Intel that they were doing a crap job. (Intel don't always do a crap job, but when they do AMD are very good at reminding them!)
    The lack of marketshare is a pity, as some of their designs have been bloody wonderful. I remember my Athlon fondly. That was a great processor.
    There have been times when AMD's engineers clearly did a better job, and I think 3DNow! just crosses the line on that one.

    • @GraveUypo
      @GraveUypo 4 years ago +1

      The Athlon was great. My Athlon 700 was noticeably FASTER than my friend's Pentium 800, at least in Unreal Tournament.

    • @Dong_Harvey
      @Dong_Harvey 4 years ago +1

      AMD is even doing a better job at hiding their NSA backdoors than Intel.. pretty soon they won't even market them as 'coprocessors'

  • @mstcrow5429
    @mstcrow5429 6 years ago

    SSE originally shared registers with the x87 FPU (due to transistor budget?), just like MMX. I think SSE2 fixed this.

  • @RetroAmateur1989
    @RetroAmateur1989 6 years ago

    MMX my love

  • @dallesamllhals9161
    @dallesamllhals9161 2 years ago

    And here I am. With the last 3DNow! supporting CPU in 2021.
    Long live my Phenom II x6 1100t ♥ ..Bloody INTEL and their SSE4.1-4.2 and SSSE3! Oh wait!? I haven't even got the time to play all the games - That I CAN run...(sigh)

  • @zxkim8136
    @zxkim8136 5 years ago

    Excellence as standard Peter😁😁😁Kim😁😁😁

  • @niks660097
    @niks660097 8 months ago

    Nowadays, without SSE or AVX most games won't even work, because everyone is using SIMD collision models: the amount of physicalized geometry has increased a million-fold, much more than the general-purpose performance of modern CPUs. That is, there has not been a million-fold increase in general computing power (CPUs) since the late 90s.

  • @marvnuts
    @marvnuts 5 years ago

    Wow a Descent Freespace sighting.

  • @retrogamer33
    @retrogamer33 5 years ago

    Still got my K6-2 500MHz

  • @timevans9189
    @timevans9189 5 years ago

    What killed it was the advent of graphics cards with 3D functions; they ran as dedicated processing units on the card.

  • @0371998
    @0371998 3 years ago

    Back in the good old days, companies were willing to help old computers with deficient MMX/video capability: they sold specific PCI cards made to decode video and play DVD apps. So why has this good initiative stopped? It would be really kind if Intel or AMD introduced a PCI or PCI-X card with all the SSE and other instructions made since 2003. Maybe the data transfer would be too slow, I don't know, but it would be so useful for computers made before 2004. I suppose Intel and AMD own the rights to those instructions; that's why no independent company has commercialised these instructions on a PCI card.

  • @djosearth3618
    @djosearth3618 3 years ago +1

    Remember around this time Intel put out a bunch of defective processors resulting in the same incorrect result on specific instances of a certain calculation. They had a free tool to check if your CPU would give the bad result or not therefore proving your CPU was of the affected ones. My (pedo) high school puter teacher lured I mean showed me that his CPU was giving the wrong result, seems that it was not normally an issue.
    ps he never tried nothing on me but tried to zip our two sleeping bags together but by now I was on to him and I separated them he later boldly asked me why. I skipped gr. 8 so I was the youngest kid in the school and I believed this was why he paid me special attention taking me camping and canoeing, good times taught me lots about hacking. Yes he was charged a decade later for events that took place 2 decades prior at his old school in Thunder Bay. Kinky ol' downlow Mr Dukelow... ;]

  • @FingerinUrDaughter
    @FingerinUrDaughter 5 years ago

    Ahh, the days of the Thunderbird: the only time AMD had anything worth buying. I remember going directly from a 950MHz Athlon to a 2.4GHz P4 and thinking "meh, Windows loads 5 seconds quicker, and the onboard video has more memory", but was otherwise unimpressed with the supposed large difference between the two. Those were also the days when Gigabyte made products worth buying that didn't shit out in 6 months.

  • @thetrivium2261
    @thetrivium2261 6 years ago

    Powered many an hour of Wargasm!

  • @rbmk__1000
    @rbmk__1000 6 years ago +2

    Great vid, love the 90s stuff. However, I'm quite sure the Athlon FPU kicked the P4's ass, since the P4s were all-around slugs and could only compete in later, higher-clocked versions. The overly complicated NetBurst pipeline is one of Intel's biggest boondoggles and was eventually abandoned. Worthy of a video?

    • @starfrost6816
      @starfrost6816 5 years ago +1

      everything kicked the p4's ass. in the beginning, even higher-clocked p3s kicked the p4s ass.

  • @jeremyjohnson8844
    @jeremyjohnson8844 6 years ago

    Ah, the K6's FPU. A huge thorn in my side until I got a P3 833 in 2000. 3DNow! did advance FP SIMD, but god, was it a dog outside of SIMD. The Celeron Mendocino wiped the floor with it.
    By the way, Intel could not compete with the K7's floating-point performance. Certainly not.

  • @JoeStuffz
    @JoeStuffz 5 years ago

    Microsoft took advantage of every 64-bit x86 CPU having SSE2, making it a requirement for 64-bit Windows. 64-bit Windows will accelerate certain floating-point operations by running them on the SSE2 unit instead of the x87 FPU.

  • @dmbarlow2008
    @dmbarlow2008 6 years ago

    AMD's retirement of 3DNow! had some interesting consequences, particularly with sloppy, rushed, or otherwise sub-par ports to PC from console. 2007's Mass Effect is a very good example of this: on 2 (or 3; it's difficult to ascertain exactly how many of the graphical glitches are related unless you're some sort of forensic programmer, and I am not) of the game's 4 major plot planets, there are areas which refuse to render properly if you have a non-3DNow! AMD CPU. There are mods to mitigate this issue, but unless a player is aware of the Nexus (in a larger sense than "oh, that's where I get fixes for Fallout/TES"), or the PCGamingWiki, or some other source, they're simply going to hit unplayable required segments.

  • @jaredgarbo3679
    @jaredgarbo3679 6 years ago

    1:28 90's as fuck.

  • @peytonlutz1
    @peytonlutz1 6 years ago

    Before 1000 views?

  • @0371998
    @0371998 3 years ago +1

    It's sad that support for Socket 7 was shut down so soon after the Athlon appeared. Was it useless to offer better motherboards to the millions of people who had a K6? It's a question of consumers preserving their investment. The K6-III should have gotten a taste of DDR and AGP 4x! Maybe the Japanese boards had it. By comparison, the Pentium 4 enjoyed support as long-lived as the life of Noah.

    • @dallesamllhals9161
      @dallesamllhals9161 2 years ago +1

      The K6-III with 256KB cache was pretty great, EVEN at 400MHz* on Socket 7.

  • @Objectorbit
    @Objectorbit 6 years ago +1

    I've never heard the angle that AMD didn't want to pay the cost for SSE. Do you have a source for that?

  • @merlingt1
    @merlingt1 6 years ago

    I had a K6-2 266MHz, if I remember right, and it was a pile of crap. I spent a lot of my money when I was young on this upgrade, and I was thoroughly disappointed.

  • @kennethober6626
    @kennethober6626 1 year ago

    The K6-2 would've been better if paired with a 100MHz FSB. The 66MHz FSB and slower RAM killed its potential.

  • @taketimeout2share
    @taketimeout2share 6 years ago

    You know what rings my bell: graphics cards, the AMD K6 and Pentium 3, and everything else since. (Lost interest when AMD brought out Bulldozer.) Exciting times.
    Cyrix? Never heard of them. (sarcasm)

    • @mstcrow5429
      @mstcrow5429 6 years ago

      Never heard of Cyrix? Let the noob bashing begin! (probably not)

    • @taketimeout2share
      @taketimeout2share 6 years ago

      Bash away. When it comes to Cyrix I wish I was a noob.(more sarcasm)

    • @mstcrow5429
      @mstcrow5429 6 years ago

      You're such a noob that you tried to buy a Pentium Pro and bought a 6x86 instead!
      And then thought you had a 486!

    • @taketimeout2share
      @taketimeout2share 6 years ago

      Is that a type of locomotive? Don't know much about those either. (ignorance)

    • @taketimeout2share
      @taketimeout2share 6 years ago

      I used them. (regret).

  • @undeadelite
    @undeadelite 6 years ago +1

    3D now? More like 3D soon...

  • @Grgazola
    @Grgazola 6 years ago

    Yay for data prefetching, on a side note.

  • @shadowkat1010
    @shadowkat1010 6 years ago

    Had a terrible VIA C3 chip with 3DNow! specifically.
    It still sucked.

    • @Cooe.
      @Cooe. 6 years ago

      No instruction set is gonna save a piece of Via trash. Doesn't say anything bad about 3DNow! though, just that Via chips are the absolute freaking worst.

  • @arnaudmeert1527
    @arnaudmeert1527 6 years ago +1

    Damn, Intel adverts were pretty awesome back then.

  • @SavageScientist
    @SavageScientist 6 years ago

    PowerPC was killing Intel in the mid 90s

  • @SteelRhinoXpress
    @SteelRhinoXpress 6 years ago +2

    CPUs are too cluttered with instruction sets now. Take SSE for example: SSE 4.2 can do everything SSE 1 through 4.1 can do, so there isn't a need for SSE 1, 2, 3, 4, or 4.1 to be on CPUs anymore. By simplifying the instruction sets you'd make the CPU more efficient and more streamlined.

    • @mstcrow5429
      @mstcrow5429 6 years ago

      I think new SSE instruction sets, at least usually, add additional, non-redundant instructions. Each iteration is just a superset of the previous set.

    • @rickyyoung
      @rickyyoung 6 years ago +2

      Plus you also need patches for legacy software when you remove an instruction set that software uses

    • @SteelRhinoXpress
      @SteelRhinoXpress 6 years ago

      All SSE versions support the legacy instruction sets. SSE 4.x, for example, will run SSE 3.x, 2.x, and 1.x software.

    • @rickyyoung
      @rickyyoung 6 years ago +1

      Right, like DirectX is "supposed" to. We all know it doesn't; there's always something you have to go and dig out of the web to get older titles to work on modern DX versions. And seeing as you can't "download" an older instruction set for your CPU, it would be up to the software developers to create a patch. Look at Resi 7 on Phenom IIs. OK, I know that's flipped, with the older CPUs not supporting newer SSE extensions, but you can bet your bottom dollar there will be something, just a tiny little niggle, that'll cause the whole thing to fall down around your ears.

    • @ZipplyZane
      @ZipplyZane 6 years ago

      There is just little advantage to doing this. It's not as if the old instruction sets slow anything down. They just essentially redirect to the new instruction sets, or possibly to multiple instructions. They were designed for slower processors, so they can be slower, while having the new stuff be fast. I don't even think there is a power advantage, as long as the old parts aren't used.
      Then again, with CPU drivers, I'm not sure how much of this is handled on chip vs. software. For all I know, the older instruction sets have been removed and get converted on the fly to newer ones.

  • @92trdman
    @92trdman 6 years ago

    Intel, Nvidia, and Creative stole from competitors (basically everything advanced at that time)

  • @ichabaudcraine2923
    @ichabaudcraine2923 6 years ago

    weeeeeeeeeeeeeeeeeeeeeeev

  • @Jikyuu
    @Jikyuu 6 years ago +37

    3D Now! ... that ! indicates a 'not' ... so ... Not 3D Now ... 3D in a bit?

  • @macdaniel6029
    @macdaniel6029 6 years ago

    3DNow! was nothing but a marketing gag. Every AMD CPU was slow as hell. I had some K6-2 processors back in the day because they were cheap. One of my friends had a Pentium 2 350 system that outperformed my K6-2 500 by far.
    But today I like the K6-2 processors :)

  • @EarthboundX
    @EarthboundX 6 years ago

    Haha, another video I didn't understand half of.

  • @waldnew
    @waldnew 6 years ago

    I've got to be honest, I didn't understand a word of it.

  • @tsartomato
    @tsartomato 6 years ago

    1 you screwed up aspect ratio
    2 SUB CULTURE SUB CULTURE SUB CULTURE SUB CULTURE SUB CULTURE SUB CULTURE SUB CULTURE SUB CULTURE SUB CULTURE

  • @pelgervampireduck
    @pelgervampireduck 6 years ago +8

    The K6-2 500 was the worst processor I had; everything ran slower than on a Pentium 2 300!!! Almost nothing that I had or liked used 3DNow! :( When I was lucky and found something with 3DNow! support, it ran fine, but I remember almost no games used it. (Maybe just not the ones I played? Maybe all the sports or racing games used it, so it was good for "general audiences"?)

    • @adamclark9552
      @adamclark9552 6 years ago

      Believe me, the K6-2 450 was even worse. It ran so hot I had to put an exhaust fan in the back.

    • @pelgervampireduck
      @pelgervampireduck 6 years ago +2

      I had a 3dfx 4MB card; the stuff that already ran great was playable on a Pentium 1, like Quake 2, but it struggled with newer stuff like Unreal or Deus Ex. It improved a lot when I got a Voodoo 3 16MB; that card was so good it made games playable on the crappy K6-2. Deus Ex worked fine with the Voodoo 3, and the first Max Payne was playable, but I think Return to Castle Wolfenstein was too slow even with the Voodoo 3.

    • @si4632
      @si4632 6 years ago

      Yeah, had the 550, and the Pentium 2 300 was definitely smoother, but it did only cost £50 in 1999.

    • @si4632
      @si4632 6 years ago

      Yeah, the Voodoo3 3000 was the limit really; I put a £200 GeForce DDR in it and it bottlenecked badly in a lot of games.

    • @Promilus1984
      @Promilus1984 6 years ago +3

      Guys, you are missing important things here. 1. The Pentium II used Slot 1 motherboards, a new platform, while the K6-2 used Socket 7 boards, so it was a natural upgrade for Pentium MMX systems. The Pentium II was a "top" processor while the K6-2 was a budget processor (with no on-die L2 cache). The later 180nm process for the K6-2+ (used mostly in laptops) allowed integration of a small L2 running at full speed, just like the newer Celerons. And most notably, the 500MHz K6-2 wasn't slower in EVERYTHING. The K6's integer performance was solid and didn't fall that far behind the Pentium II per clock, so a 500MHz K6 was better in an office suite than a 300MHz Pentium II. What the K6 lacked was floating-point performance. The K6-III (with on-die L2) was struggling against Celerons. That's why it was important to use 3DNow!: the 3dfx driver had optimizations for it, and the nv driver had some too. If a game used 3DNow!, it could be on par with, or even faster than, an Intel system of the same price. But that's a big if.

  • @SnorFlax
    @SnorFlax 6 years ago

    A bit too nerdy?

  • @chatboxguy3363
    @chatboxguy3363 6 years ago +6

    The Pentium 4 was crap. The Athlon XP ruled! You leave out facts like this: by 2004, Intel processors performed poorly with the hardware of the time, making systems choppy, which is why a lot of techs were using the AMD Athlon XP. However, Intel had managed to win contracts with companies such as Dell to build computers with their choppy but cheap processors, which the average American would buy without paying attention. You will notice the blue men drumming: they were there to entice typical teens and young adults of the time to BUY the cool toy. AMD chased contracts too, but they prized their use of higher-end hardware, as GOOD techs preferred them, so their relationship with the quality motherboard maker Gigabyte was preferred.
    They managed to hang on until Microsoft was bought out by another group looking to finish Vista after its failure at launch. As a reward for a contract with Intel, among other things, they changed the mathematics of the balancing to favor Intel's finicky processors. AMD's processors were a bit slower AFTER that, but since then they have become equal, although it is CLEAR in recent times that BOTH are becoming slower, as the chips on the motherboards get choppy in speed as time travels on. Chipset processing is becoming sluggish, which is why the BEST computers for either AMD or Intel processors are now quad-core: by performing four actions at once, they can split up and speed up accesses across four channels, and when one is finished, replace it with another set. This allows more activity to happen.
    This is WHY President Trump, who actually cares about the economy, has in the last few months initiated research into higher and faster processing chips, and WHY Microsoft believes PC tech has outlived itself and is researching virtual SSI software, all previous attempts at which have been flops. I believe Trump knows virtual SSI software CAN be a nice toy for gaming and home play, but office computing and programming are traditionally done on a Mac or a PC. I haven't used a Mac; who knows, they might be faster because they hold back from higher graphics for fear of processing overload. I don't use Macs because they're too darn expensive. Anyway, from 2014 through 2016 technology advanced in certain areas, but not PCs: gaming on the Xbox versus PS4, tablet computers, iPhones, Microsoft jumping ship for other research ideas. This actually gives AMD an advantage, because they have continued to advance performance for the only other operating system in existence. If Microsoft kills itself off as a business, or doesn't return to make another operating system, the REMAINING company, although its traditional practice hasn't been to focus on gaming, will finally have to concede to supporting it, and can get together with companies such as AMD and GeForce to make a real operating system like no other. You seem to miss that as of 2017, computer technology, which still rules our lives very much, just as the tablet does, is at an impasse, and without the hardware problem being corrected, which Obama ignored and Trump is addressing. But the point is that Intel hasn't been better BECAUSE of better research; it's because of CONTRACTS. LOL, Intel was good in the early 1990s, but they edged off in performance quality.

    • @eecajledo8430
      @eecajledo8430 6 years ago +1

      Wall of text crits for 500. DAZED!! Unable to take any action this turn. HP reduced by 40% for the next 5 turns. AP reduced by 60% for the next 5 turns. Unrecoverable!

    • @rbmk__1000
      @rbmk__1000 6 years ago +2

      wonderfully incoherent, bravo

    • @chatboxguy3363
      @chatboxguy3363 6 years ago

      Just because it gave them the lead doesn't mean it was the best. 1996 to 2001 marked a change in technology this GUY seems not to recognize. More end users were buying PCs, so they took up the bulk of the market; home computers took off. Before that, computers were recognized as a nerd's, a smart person's, item. The internet made more people interested, but MORE people were buying computers without rating what was on the inside. This is what Pentium was taking advantage of. As of 1997 and after, the BEST tech isn't always the highest-selling tech.
      I could walk up and down the street saying "hey, did you know Gigabyte made the best motherboards in quality?" and nobody would know what a Gigabyte motherboard is. Typically they sold theirs online through secondary sellers such as TigerDirect (which I do not recommend using today, as they had some payment issues a while back) and Newegg. You don't get a whole computer when you purchase one; you get a bundle. Some of the earliest included the case and power supply, but later it was just a processor (AMD, of course, being their biggest seller), the motherboard, and a starter memory kit. People building their computers would often want a different and better case as time went on, as well as a better power supply; when vendors did include one, it was often a good brand, just at lower wattage. We would then select the rest of the items, often paying much less if the user did it right, or just a little more than store cost, to get better computer parts, because the big companies often cut corners on quality. Today this still holds true for computer assembly, as there are more and better options for parts that are NOT in stores. Good parts are often no longer about revenue these days; the highest-end computer models for mass sale are often THROWAWAY stock.
      In fact, a number of cost cuts raised sales. If you look at the actual performance of the Athlon XP processors on a typical good setup for the era, Windows XP, versus one using a Pentium 4 of equal or greater speed, you will notice that the Athlon XP has fewer hang-ups. The information printed is QUITE accurate. This man found his height in computer knowledge in 1990, maybe a bit before; he got hung up on the Pentium and Pentium II's glory days. But in his saga, like you, he has missed a few high points and misunderstood the change in effects. MAJOR computer sales had MOVED to the END user. Fewer COMPANIES were supporting TECHS, okay? We cover 10% of the market, and we're much more demanding, hard to please. We ENFORCE quality, whereas they will take anything that looks neat or entertaining. The men in the commercial EXEMPLIFY this change: the Pentium 4 commercial wasn't designed to explain anything to knowledgeable techs, it was designed to attract morons who WANTED things to look neat, so Intel could constantly sell computers and cut corners. That's why ANYONE who wants a serious computer that isn't a laptop will be found on Newegg, if you conduct a survey, whereas the home computer user will buy one at Best Buy or off the internet. However, I believe we are in yet another shift back, which started around 2014 and is quite unsettled due to tablet devices, which can cover plain and simple web browsing for most homes in place of computers or laptops. When the smoke clears, factories like HP/HP Compaq and Dell may eventually no longer exist; I believe we are in their declining years. The feast is over and they're in famine. Techs are supporting parent companies by buying PC parts and building systems themselves. But the Pentium 4 debuted in November of 2000, which was toward the end of the end-user takeover of home computer tech.
      Lots of late end users who wanted to use the internet, the latest trend, were making purchases at this point. That is WHY Pentium made that commercial. It was a sales GIMMICK. Anyway, I think I've drilled a well here, and that is enough. I could talk about the pros and cons of why the Pentium 4 did better, but it wasn't quality. It was END USERS.

    • @rbmk__1000
      @rbmk__1000 6 years ago +3

      ChatBoxGuy, I award you no points, everyone is now dumber for having heard it, and may God have mercy on your soul.

    • @chatboxguy3363
      @chatboxguy3363 6 years ago

      Yeah, and I award you the stupidity medal for two reasons. 1) The first is less important: you can't invent your own jokes, so you had to take one from an Adam Sandler movie rather than coming up with your own. 2) For not understanding my points, which merit points. The goal of the market had SWITCHED from the best technology, promoted by the best minds winning over customers, to a market of end users. Before that it was more limited: fewer end users were buying computers before the rise of the internet and end-user PC gaming. Any poll before 1997 of people who OWNED a PC outside the technical department of an office workspace would have found people technical in mind, and the offices maintained tech workers who SELECTED the best technology. With the rise of the internet came end users who wanted to enjoy this new toy to its potential, but these people HAD NO CLUE about the inner workings of a computer. A lot of the Pentium 4 sales, you will notice, were to offices trying to cheapen computers through contracts as the market began a slight decline; after 2001 the market really began to crash, so businesses were MORE open to cheaper PC models. My point is that since then we have reached a rift where this is starting to CATCH up with them, depending on how things play out. This is the second REASON I award you no points: you don't have the mind to understand that computers, like any industry, are subject to business factors.

  • @KabelkowyJoe
    @KabelkowyJoe 2 years ago

    IMO 3DNow! was a DUMB idea, just as MMX was, and x87 was dumb as well. The entire x87 thing should never have been part of the x86 platform; up until the SSE era, Intel could not make a good FPU from a programming standpoint. It might have been OK if AMD had called it MMX+ and Intel had added it to their processors as well, because 3DNow! was nothing more than MMX with instructions supporting float data types, just as SSE2 was nothing more than SSE with integer data types added. AMD should never have wasted time on this 3DNow! "feature," and most importantly should never have marketed it under that name as anything unique. IMO x87 was designed to fit the 16-bit platform (64-bit + 16-bit = 80-bit); it was better than nothing, but it wasn't designed for a 64-bit data bus, it wasn't designed for games or multimedia, it was designed to be used in Excel and Excel only. An Excel accelerator. MMX was an attempt to make use of the FPU registers in the era of multimedia, to help read and decode audio tracks more than 2D photos; a lot of its instructions are clearly designed for decoding WAV files. What SSE added back to Intel CPUs was simultaneous use of floating point while the CPU handled audio via MMX instructions, so the CPU would not need to switch between MMX and x87 state all over again. Finally, 128-bit SSE was much better suited to operating on a 4x32-bit float vector, which fits fully into its registers and is calculated at once. SSE2 added integers to it, the opposite of what AMD did with 3DNow!, which added floating point to the integer-based MMX. To me, x86 is the integer side, x87 an Excel accelerator, and MMX and 3DNow! attempts at niche uses; 128-bit SSE finally brought floating point to the PC for everyone. Neither MMX nor 3DNow! should ever have been born, and 3DNow! certainly should not have been given such a weird name, but patents and trademarks probably forced that.

  • @meowpolitely981
    @meowpolitely981 6 years ago

    I'm second

  • @aurelia8028
    @aurelia8028 1 year ago

    as always Intel reigns supreme!

  • @chrismcfee6785
    @chrismcfee6785 6 years ago +6

    I had a K6-2 350MHz. I loved it. I remember Quake had a 3DNow! patch. It was awesome!

    • @warrax111
      @warrax111 4 years ago

      I had a K6-2 500MHz. Loved it. Wanted a K6-III 450MHz, but they were too rare.