Why Microsoft switched from Intel to Power PC for the Xbox 360 | MVG

  • Published: 23 Nov 2024

Comments • 1.8K

  • @LambdaCalculus379 4 years ago +3761

    The *real* winner of the seventh generation of consoles?
    IBM.

    • @johanlundberg8449 4 years ago +352

      And the next was AMD. You know what was the same? Lisa Su. Coincidence? Maybe, I don't know. Just a fun fact.

    • @Nossieuk 4 years ago +39

      @@johanlundberg8449 And is it coincidence that Nvidia has a raytracing card out and AMD does not? I would think not, in the same way IBM could not or would not provide Apple with better CPUs until after Apple left. (What I'm saying here is that both these shifts came at a cost in the long term... IBM lost a huge long-term business partner, and by the looks of it AMD's next-gen GPU is now delayed until next year, when the Nvidia 3k series is about to be released. Maybe they will both hold back now.)

    • @Nossieuk 4 years ago +12

      I fully appreciate that working with Microsoft and Sony will work out long term in desktop cards, but that won't be for a few years yet.

    • @joeyvdm1 4 years ago +66

      @@Nossieuk I am not so sure about RDNA2 being delayed, AMD and Lisa Su herself have stated multiple times and right up till recently that both Zen3 and RDNA2 are still 100% confirmed for this year.
      There have been many rumors going around about delays, but AMD and Lisa Su have shot them down repeatedly. AMD do keep stating that RDNA2 is releasing before the next gen consoles and so is Zen3. So I do think they will release this year still, unless AMD and Lisa Su make an official announcement saying otherwise, I wouldn't believe any other information.
      There have just been too many rumors about delays that keep getting debunked by AMD and Lisa Su, so I am just not believing anymore new rumors about delays. I will believe official word on the matter from AMD though.

    • @fredsas12 4 years ago +35

      Yeah, but Apple... Apple Computer used IBM PowerPC RISC CPUs in their Macs, then they woke up to reality and switched to Intel x86 CISC for ultimate capabilities... But behold, they've recently become stupid again and now they have dumped CISC to go back to RISC CPUs using custom ARMs... The good thing though is that Apple users really can't complain now when PC users refer to them as consolers... Hah

  • @KarlBaron 4 years ago +1215

    At the time it was really amusing how Microsoft bought PowerMac G5s in bulk to send out as the original Xbox 360 pre-release devkit

    • @andreiarg 4 years ago +147

      It kinda made sense, so at the very least developers would get familiar with the architecture

    • @alvallac2171 4 years ago +9

      @@andreiarg *made

    • @whoeusbsknsi 4 years ago +18

      alvallac21 💀💀💀

    • @campkira 4 years ago +14

      well... it just hackingotich cheapst production....

    • @DJ_POOP_IT_OUT_FEAT_LIL_WiiWii 3 years ago +59

      That has always been Microsoft's strategy. They are one of the top vendors of Mac software (Office).

  • @joemann7971 4 years ago +806

    So, Microsoft took a RISC with PowerPC....

  • @czimmerman86 4 years ago +472

    RE: The question at the end on how did I feel about the move to PPC? I remember Apple abandoning PPC because of heat, and then when the whole "red ring" thing started happening on the 360, I was like...yep. There's the magic.

    • @Yeen125 4 years ago +62

      Dener Silva And to add a bit of irony (or going full circle), Apple's RISC-based ARM processors have more in common with PowerPC than with the CISC-based x86-64 processors from Intel and AMD.

    • @Kain652 4 years ago +6

      Great info, thanks man. I feel like such posts are underappreciated.

    • @yoloerboyron9364 4 years ago +11

      The Wii U had a PPC chip that wasn't susceptible to overheating; it was more powerful than 7th-gen chips as well, despite having a low clock speed.

    • @FeeLtheHertZ 4 years ago +38

      Yoloer Boy No, believe it or not the Wii U paled in comparison to the much faster Xbox 360 chip and of course the CELL. Look it up; even the Metro 2033 devs said the Wii U sucked balls due to its weak CPU, while the 360 and whatnot were plenty even still.
      It only had more RAM and a somewhat beefier GPU, but better results still came out of the previous-gen machines due to a better CPU, not a worse one.

    • @jonyw8851 4 years ago

      Dener Silva No, it's a GPU issue

  • @RandomlyDrumming 4 years ago +632

    10:19 - you probably meant to say "AMD", not "ATI" :)
    Also, according to an interview with Nicholas Baker, the lead architect behind the Xbox 360, one of the biggest reasons Microsoft chose PowerPC over x86 from Intel or AMD was that his team figured out that CPU clock speed wasn't going to get much higher (especially when you take into account that power consumption and heat dissipation play a very large role in the design of a game console) and that the solution for increased performance in the long run is parallel execution. In other words - a multicore CPU. The problem was that neither Intel nor AMD had anything like that on their roadmaps (at least at the time) while IBM did, so Microsoft chose to partner up with IBM.
    EDIT:
    For those interested in more details, here's the full interview with Nick Baker:
    ruclips.net/video/JP9TDLxq_1U/видео.html
    Oh, and thanks for all the thumbs up :)
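    The parallel-execution argument above can be sketched in a few lines. This is a minimal illustration, not from the interview: the prime-counting workload, chunk sizes, and function names are my own choices. The point is that the same total work finishes faster by fanning out across cores, with no clock-speed increase at all.

```python
from multiprocessing import Pool

def count_primes(bounds):
    """Naive prime count over [lo, hi) -- deliberately CPU-bound."""
    lo, hi = bounds
    total = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            total += 1
    return total

if __name__ == "__main__":
    # Same total work either way; with 4 workers the wall-clock time
    # drops roughly with the core count, not with the clock speed.
    chunks = [(i, i + 25_000) for i in range(0, 100_000, 25_000)]
    with Pool(processes=4) as pool:
        print(sum(pool.map(count_primes, chunks)))  # prime count below 100,000
```

    With `processes=1` the identical code runs serially, which makes the scaling easy to measure for yourself.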

    • @amirpourghoureiyan1637 4 years ago +63

      Amazingly accurate considering that we've barely increased clock speed since the mid 2000s. Multicore's become the focus of new chips, even though Intel got to 5GHz on their processors recently, with the 9900K coming out over a year ago.
      AMD's managed to take IBM's place in consoles, being involved in every system for most of the 2010s until the Switch came out.

    • @TheMarc1k1 4 years ago +26

      Thanks for the info! Very interesting. 5GHz is pretty much a hard wall for PC users to push past in most cases, or at the very least requires a lot of money spent on either the CPU or cooling it... hell, both really. So expecting consoles to ever reach that level comfortably is still unrealistic, let alone back in the mid 2000s - it's actually pretty cool that they had the foresight to recognise this. I believe there's some 'law' quoted about this very subject though, isn't there?

    • @bitelaserkhalif 4 years ago +32

      @@TheMarc1k1 Moore's law.
      AMD did hit 5GHz with the FX-9590, but it ran really hot (220W TDP!). Plus there was the failure of Bulldozer (basically Pentium 4 repeating itself).

    • @bitelaserkhalif 4 years ago +15

      @@YoureUsingWordsIncorrectly that's why some FX-9590s came bundled with water cooling. The TDP was so close to the GTX 480's.

    • @Kalvinjj 4 years ago +3

      @@bitelaserkhalif When I saw that processor, I felt like "...they... just literally grabbed one of their processors and sold it pre-overclocked right?"

  • @NindieNation 4 years ago +124

    Man. The only E3 I ever went to was 2005, one month after spending my part-time-job-in-college savings on a Dual 2.0GHz PowerMac G5 (it's still running in my house as a media server and RAID... right now). The only next-gen system playable at that E3 was the 360, yet to my surprise, it was nothing more than a bunch of my exact towers. It just felt so cool to be like "I bought this new workstation for music production, but hey, I also apparently bought an Xbox 360 Dev Kit!"

    • @lucasrem 2 years ago

      Microsoft will always be corrupt!
      Evil company, big and evil Epstein Bill Gates people!

    • @brkbtjunkie 9 months ago +6

      Hell yeah I used a dual cpu g5 for a long time. I still have it and play Mac OS 9 games on it from time to time.

  • @mmmlinux 4 years ago +1301

    nintendo: we want a cheap, cool running, fast chip.
    chip manufacturers: OH is that what people want. we thought they wanted expensive, slow space heaters.

  • @evancrazyerror 4 years ago +1680

    Apple in 2005: “We’re moving from PowerPC to Intel”
    Microsoft in 2005: “We’re moving from Intel to PowerPC”

    • @Volodimar 4 years ago +288

      Apple in 2020: "We're moving from x86 to ARM"

    • @spidey9504 4 years ago +32

      @@Volodimar what's microsoft gonna say

    • @matthewrease2376 4 years ago +72

      Microsoft staying as far away from Apple as possible.

    • @bitset3741 4 years ago +121

      Apple in 1994 "We're moving from Motorola 68k to PowerPC"

    • @Cobalt985 4 years ago +51

      @@matthewrease2376 "Now announcing Windows RT 2"

  • @IrishCarney 4 years ago +253

    I remember when I was an Apple true believer arguing against Intel fans, advocating for the superior performance of PowerPC RISC chips even though they had lower clock speeds than Intel Pentiums. "Wintel" (Windows and Intel combined) were the enemy. Then not only did Apple leave PowerPC for archenemy Intel, Microsoft left Intel for PowerPC. Boy was I thrown for a loop.

    • @joemann7971 4 years ago +23

      Well, Microsoft only left Intel on the console side. They obviously still work with Intel. lol
      But yeah, I also had a friend who said the same thing about Apple. At the time, the statement was also true about AMD processors being faster than Intel's despite lower clock speeds.
      Intel was heading downstream fast, but it was their mobile division that developed the Core Duo that saved them. I think that was also around the time that Apple went with Intel. The Core Duo is what put them back on the map.

    • @SylveonMujigaeOfficial 2 years ago +4

      Apple for the longest time used RISC in their computers. From 2006 to 2020, they used x86 Intel chips before going back to RISC with the Apple M chips.

    • @brkbtjunkie 2 years ago +2

      The dual-CPU G5 was quite amazing for its day. I have one in non-working condition, a single-chip G5 I used to dual boot Tiger and Mac OS 9, and a 1st-gen Mac Pro, both of which still work and get used from time to time.

    • @phillgizmo8934 2 years ago +2

      Started with an AthlonXP 1800 on an nForce mobo with a GeForce2; never went 'Wintel', because it was the best bang for the buck, officially on the early Tom's Hardware.

    • @serene-illusion 2 years ago +25

      Lesson learned, simping for companies is a fruitless endeavor

  • @xdevs23 4 years ago +819

    It's funny that they basically ended up selling the same heat monster to all three competitors.

    • @snetmotnosrorb3946 4 years ago +137

      Nintendo's CPUs were based on the very efficient PPC 750: en.wikipedia.org/wiki/PowerPC_7xx

    • @wishusknight3009 4 years ago +84

      The GCN pulls only about 40 watts of power from the wall, and that is everything combined: graphics, CPU, RAM, drive, controllers, blinkenlights. All combined it drew half the power of Xenon by itself... And that was even more so with the Wii.

    • @Locutus 4 years ago +19

      IBM did a Microsoft against Sony, by licensing the chip to Microsoft.

    • @nattila7713 4 years ago +17

      @Officer94 a 10W chip can get very hot without sufficient cooling and in a small box ;) Fortunately my Wiis still run hot but fine...

    • @squirlmy 4 years ago +15

      @@Locutus people have to know the history of DOS to get that, I think. IBM rushed to get PCs out, initially as a "loss-leader" to get businesses into "real" hardware. They never imagined Microsoft would license to companies like Compaq. I guess IBM was looking at DEC's terminals for a business model. Bad move. Apple deserves credit for popularizing the idea that individuals should own/be in charge of their own computers. Commodore64s were too underpowered, and Amiga too late. Being older, I know history of "home" computers (now nearly all called PCs). Funny how many parallels there were in 21st century consoles. I imagine the same is true for smartphones.

  • @quacc4748 4 years ago +382

    Thought the first song was familiar. Turns out it's from Touhou 4! Good choice

  • @sundhaug92 4 years ago +102

    One oddity of the Xbox 360 going PowerPC - when Microsoft made the original SDK for the 360, they went to the only major provider of 970-based systems ... Apple, namely the PowerMac G5 (G5 being the Apple name for the 970, and the PowerMac being the forerunner to the Mac Pro)

    • @WalnutSpice 4 years ago +7

      @AnEn Apple's Quad 2.7GHz G5 was MUCH quicker than the Xbox 360; it could compete with a 2009 Core 2 Quad. You can't compare their single 1.6 model to an Xbox. Even the dual 2.5GHz had pretty even performance.

    • @blankname8553 4 years ago +11

      @@WalnutSpice I think you mean quad 2.5 and dual 2.7. I have a quad 2.5 in my closet :P

    • @snetmotnosrorb3946 4 years ago +20

      I remember the early development kits were literally a PowerMac G5 with a Radeon X850 XT (PE?).

    • @Damien_N 4 years ago +4

      AnEn one thing also well worth taking into account is that AMD seems to be very willing to work with their customers to design custom products, Xbox One, PS4, and even current Mac computers all have custom non standard AMD silicon in them (most recently the MacBook Pro 16” has a very custom GPU option, a 5600 equipped with HBM2)

  • @reallyboringindividual 4 years ago +85

    As an ex-IBM employee, this puts a smile on my face.

  • @punchrockgroin5597 4 years ago +126

    Seems like a good choice when they were planning the consoles in the early-mid 2000's.

    • @campkira 4 years ago +1

      Chips back then were very underpowered... if you wanted power like the PS3 and Xbox 360, you had to deal with chips that didn't have much GPU and heated like hell... it took them 10 years to come up with a new gen, while the PS4 and Xbox One took only 3 years to get a hardware refresh.

  • @FloppyDiskMaster 4 years ago +190

    If you played a 360 as it started to red ring, it had extreme graphical freak outs. To this day, I still have PTSD and assume my console is dying whenever I see a graphical bug in a game, especially on Switch with how hot that thing gets.
    Edit: To those asking, yes my Switch is an original 2017 version.

    • @TheCoolDave 4 years ago +15

      Yea, my white 20GB Xbox 360 went RROD 5 times; I was able to fix it 4 times and just gave up on the 5th... picked up a brand new 250GB Slim, and to this day it boots up fine. I still play games that are not on backward compatibility, and there is still a ton of them...

    • @Not-Great-at-Gaming 4 years ago +7

      I remember there were a few games that would actually mimic a RROD and some that just had weird bugs or quirks that looked like it. I actually had one of the first RRODs and had no idea what was going on. It was only PGR3, and I figured it was just some weird game crashes, but then I started to see stuff on the internet about RROD. So, I sent my console back for the first of 3 times. Eventually I just bought a Slim.

    • @awilliams1701 4 years ago +8

      Must be a Switch 1.0 issue. I have the Switch 1.1 (same specs, but gets like 2 hours extra battery life from having a better-manufactured APU), and it doesn't get hot at all or glitch out.

    • @buccob 4 years ago

      @@awilliams1701 my Switch 1.1 has glitched out on me while playing SSBU a couple of times; I decided to move all the save data from that game to the console and so far it has been behaving properly... Of course there is no way of truly knowing if it was a MicroSD card fault, or an update, but I've been happy with it ever since (also I managed to fix my joycon drift with electrical spray cleaner)

    • @samuelthecamel 4 years ago +6

      My one friend has an Xbox 360 that has been having constant graphical glitches for a long time, but his console hasn't red-ringed yet.

  • @adamcartermi5 4 years ago +42

    You never mentioned that, before a working devkit was ready, developers were sent a special PowerPC-based Mac to build their games on.

    • @remakeyourself 4 years ago +7

      It actually wasn't anything "special", it was just a Dual 2.0GHz PowerMac G5 since that was the chip they used stock at the time. I left a comment somewhere on this thread, but I had purchased that exact computer before going to E3 the year the 360 was playable (and the PS3/"Revolution" Unveiled). Every single demo station had an empty shell of the launch 360 right next to the exact computer I bought. I had to ask a few of the demo drivers before someone with the right knowledge was able to answer (I think it was someone from Lionhead who was showing off a Fable tech demo). They told me they had a few of these at the studio and other than specific specs on each unit (which were the stock "build to order" G5s from Apple's site), it was indeed just a mostly off-the-shelf PowerMac running the SDK. Pretty cool stuff. It was so fun getting back home, looking at my tower and thinking "huh...I bought this thing for music/video production, but actually have an Xbox 360 DevKit, too" :)

  • @Waldoe16 2 years ago +200

    One of the devs at Microsoft confessed that for the OG Xbox, they turned down AMD at the last minute for their CPU and went Intel. I wanted to also add that the GameCube, Xbox 360, XSX and PS5 are well built and balanced in general. The PS4 and Xbone were unbalanced, with the Jaguar arch limiting most games. The PS3 was hard to program for but very powerful. The N64 was an engineering disaster that worked out.

    • @lucasrem1870 2 years ago

      now they all need the TSMC chips, then intel only was able to give them Ghz clock speeds
      AMD is gone, was never a big party

    • @alexandernorman5337 2 years ago +5

      The N64 was a tradeoff among the best solutions available at the time.

    • @alexandernorman5337 2 years ago +10

      @A World Under A Spell We'll have to agree to disagree on everything you just said. I'm not even sure how you are coming up with your games list.
      But I did some PC gaming as well as console gaming. None of the consoles could match running a PC with Voodoo 2 in Glide. But the N64 came somewhat close. The textures were laid out right, the animations were good for its time, the lighting was appropriate. The edge antialiasing it used wasn't as good as the FSAA that came later (post 2000), but I doubt the difference between the two would even have been noticeable with the N64 because of its intrinsic blurriness. It was like a low textured version of a PC.
      The PS looked absolutely horrendous in comparison. The textures strangely contorted because it couldn't correct for perspective. And pixels shimmered, and polygons popped in and out because it lacked a Z buffer. It rendered artifacts galore. And its lighting ability was lackluster. Even at the time I knew it couldn't really do 3D.
      I'll agree with you that the audio quality on the N64 wasn't good. But the N64 sounded better than the PS looked. The PS had the better controller, I'll give it that.
      All the consoles of that generation were a tradeoff to meet the transistor budget. The N64 had a small texture cache size because that's all they could fit after implementing the rest of the hardware. The PS lacked a Z-buffer and a proper texture wrapper because of the same reasons. And the Jaguar didn't even have a texture wrapper.
      But they were all tradeoffs of the best engineering solutions available at the time.

    • @infernaldaedra 2 years ago +7

      Apparently AMD wasn't even aware that the Xbox wasn't running their CPUs. Must have felt like a serious betrayal at the time.

    • @WamblyHades 2 years ago +20

      @@lucasrem1870 AMD is gone? What? PlayStation and Xbox have been using AMD hardware exclusively for their CPUs/GPUs since 2013. TSMC is just a foundry.

  • @tomstorm255 4 years ago +30

    2005 was such a weird time in the tech world. Microsoft went from Intel to Power PC right when Apple was switching from Power PC to Intel.

  • @Minto107 4 years ago +302

    I love how everyone is abandoning Intel now. These prices are ridiculous

    • @aaron1182 4 years ago +8

      That's only if you want the latest and greatest. Modern processors last years for gaming these days (most likely consoles hold gaming's processor needs steady until a new gen).

    • @EphyMusicOfficial 4 years ago +86

      @@aaron1182 "Latest and greatest." Intel still can't downsize their process; still stuck at 10nm. Sure, their CPUs are still beating out AMD in terms of single-core performance, but almost ALL modern applications support some form of multi-threading. AMD's SMT beats Intel's HT every time, while having slightly lower TDPs and lower manufacturing costs, and by extension can be marketed cheaper. Also, I seem to recall some malicious code being inserted into Intel firmware and software that crippled some AMD products intentionally. Something AMD has never resorted to.

    • @EphyMusicOfficial 4 years ago +26

      Let's not forget that by putting the pins on the CPU, you also lower manufacturing costs for the motherboard. There's also no such thing as "mounting pressure" in that case, as pins don't need to be pushed against something vertically. Instead, they're clamped and contact is made laterally. A cheaper solution.

    • @waltercool 4 years ago +29

      @@aaron1182 Intel's prices are way too high, and AMD hasn't performed badly. Price/performance for Intel has been awful for ages, and since last year AMD has been very competitive with Intel at as little as half the price.
      Intel's performance results are mostly advertising nowadays; with their awful Retpoline/Spectre issue they lost ~40% performance with the first mitigation, and nowadays it's ~20% with improvements. Intel CPU security issues have been increasing every year, like CVE-2020-0543.

    • @circuit10 4 years ago +5

      waltercool I think 40% is a bit over the top (I heard it was only a few percent)... If that’s true I’m turning them off though

  • @kicapanmanis1060 2 years ago +28

    22 years later and Cell still haunts Playstation in terms of PS3 backcompat

    • @stevenswall 6 months ago +1

      That's unfortunate; wasn't it the most powerful console at the time?
      Seems like they should have developed that architecture so we wouldn't have all the low-end x86 trash we have today.

    • @soundspark 5 months ago +2

      In that case wasn't it IBM that pushed the Cell architecture to Sony in order to push a "home supercomputer" platform?

    • @PMARC14 5 months ago +3

      @@stevenswall x86 is way more powerful than PowerPC ever would have been. Why would you put a supercomputer architecture in a console? They're the last thing I'd think would be running one math equation a billion times.

    • @knucklestheechidna5718 5 months ago

      I'm so confused, ps3 dropped way after the year 2000

    • @argvminusone 5 months ago +1

      ​@@PMARC14 That's pretty much what GPUs do. Modern integrated GPUs are basically the same idea as Cell.

  • @megan_alnico 4 years ago +110

    I'm pretty sure the CPU in the classic Xbox was a Pentium III "Tualatin" and not a Celeron. It gets murky because at that time the traditional difference between the Pentium and Celeron was cache size, with the Celeron being made from Pentium III chips that had quality issues. This version had the same cache size.
    Also, Intel was being embarrassed big time because that generation of overclockers discovered that Celerons could outperform PIIIs. Basically, cache speed was more important than cache size. PIIIs had twice the cache at half the speed of the Celeron but performed worse. I believe this is why the Tualatin versions of these chips were so similar.
    Anyway, like many generations of Intel chips at the time, the LAST generation of chips outperformed the first generation of the NEW chips. I believe this had something to do with the quality of chip yields always starting off poorly. That Tualatin, and even the mobile version, was able to reach P4 performance levels with the right overclock. Those were some dark times for Intel. Unfortunately, Microsoft ran its CPU at 733MHz and not at the chip's 1.4GHz max speed. My guess is, once again, yields. Many more of the chips would run stable and cool at 733 than at 1.4, and it was cheaper.
    All this to say that by the time they were developing the 360, all Intel had was the P4. Everything else was basically the same PIII chip Microsoft had already used.
    The P4 was such an engineering failure that the 'Core' lines were based on a modified PIII core and the P4 'Netburst' architecture was brushed under the rug.
    This is mostly from memory so I may have missed a few things. I had an AMD Athlon XP at the time and really enjoyed my gaming experience.
    Edit: Thanks to everyone in the comments for their feedback. It turns out the Tualatin CPU was the fastest you could swap into the original Xbox, not the stock CPU. It's still a PIII and not a Celeron. It's been almost 20 years now and I was bound to be a little wrong. You all are great.
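
    The cache-speed-over-size point above is really about locality: touching the same bytes in an order the cache likes is much faster than touching them scattered. A rough, machine-dependent toy benchmark (my own sketch, nothing period-accurate; absolute times will vary, only the gap matters):

```python
import random
import time

def checksum(buf, order):
    """Touch every byte of `buf` exactly once, in the given order."""
    total = 0
    for i in order:
        total += buf[i]
    return total

if __name__ == "__main__":
    n = 1 << 22                          # 4 MiB: larger than a PIII-era L2 cache
    buf = bytes(range(256)) * (n // 256)
    shuffled = list(range(n))
    random.shuffle(shuffled)             # same touches, no spatial locality

    for name, order in (("sequential", range(n)), ("shuffled", shuffled)):
        t0 = time.perf_counter()
        s = checksum(buf, order)
        print(f"{name}: sum={s}, {time.perf_counter() - t0:.2f}s")
```

    Both passes compute the identical sum over identical bytes; the shuffled pass is slower only because nearly every access misses the cache.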

    • @FeeLtheHertZ 4 years ago +8

      Wicked post and well written! In closing, Intel sucked hard and PPC was the better choice. Didn't know about the Celeron performance delta though, interesting stuff.

    • @omegarugal9283 4 years ago +6

      It was a Frankenstein CPU, a Tualatin core with half the cache running at a 200MHz bus...

    • @omegarugal9283 4 years ago +3

      @@FeeLtheHertZ I had a Celeron A that mopped the floor with a friend's Pentium 2

    • @megan_alnico 4 years ago +8

      @@omegarugal9283 Yes, Celeron A! That's the name of the first Celeron chips with cache memory. I remember people in school having dual Celeron As at 1.1GHz, while I had an AMD K6-350.

    • @6581punk 4 years ago +6

      "Microsoft's Xbox game console uses a variant of the Pentium III/Mobile Celeron family in a Micro-PGA2 form factor. The sSpec designator of the chips is SL5Sx, which makes it more similar to the Mobile Celeron Coppermine-128 processor"
      It's a mixture of all sorts.

  • @DehnusNorder 4 years ago +82

    One small mistake: The CPU wasn't a Celeron when it released, but a slightly stripped P3. It wasn't until Coppermine Celerons that they had the same cache design as the Xbox chip.

    • @raven4k998 2 years ago

      Not just that: they didn't leave IBM for ATI, they left for AMD, which had bought ATI before then; by that time ATI was no longer ATI. There's more than one mistake in there.

    • @DehnusNorder 2 years ago

      @@raven4k998 Indeed, IBM was at the top of their game CPU-wise then. It wasn't until Apple went Intel that that started to wane. A PowerPC-based CPU was really a great processor to have in a console. In fact, the Xbox One CPU is not that much faster, but it has branch prediction and out-of-order execution, which makes it more powerful. But theoretically the Xbox 360's PPC should be able to keep up. (Graphics is a whole different matter of course :P )

  • @lwvmobile 4 years ago +8

    Those IBM PowerPC processors plus ATI graphics were a match made in heaven for Nintendo with the GameCube, Wii, and Wii U. They weren't the most powerful chips, but they ran stable, cool, and fast, with almost unheard-of hardware failure rates, certainly much lower than the others at the time.

  • @andrewclegg9501 4 years ago +86

    Apple had to water cool the fastest G5 Powermac, and never managed to release a G5 laptop. PPC970 cores did run very hot. It's why they jumped ship to Intel, and now the same thing is happening again, hence the jump to ARM. MS and Sony knew this, and took the risk of heat problems.

    • @comicsans1689 4 years ago +27

      I think the ARM jump has a lot to do with the iPhone and iPad running on ARM, so they can unify platform development.

    • @ElNeroDiablo 4 years ago +12

      @@comicsans1689 Fairly much so, I'd say. Even though Apple announced here in 2020 that by 2022 everything will be ARM, I'm fairly sure that choice was made back around 2016-2017. Intel was the main name in x86-64 (aka x64) CPUs at the time, and AMD had yet to start laying down the smackdown with the Zen architecture (they were just coming off the relative flop that was AM3/+, with its high heat output, poor IPC and pseudo-multicore design (sharing key FPUs between what were otherwise pairs of true cores), which powers the XB1/PS4). Apple saw the problems Intel was having and was going to keep having at the time, so they gave themselves about 5 years of head start to move MacOS from x86 to ARM for their 2022 launch.
      If Apple had waited until, say, 2018/2019 to move away from Intel for MacOS, they might have chosen AMD Zen-family CPUs to power future Macs, or they might still have made the choice to go with ARM and move from x86 anyway. We don't really know.

    • @romannasuti25 4 years ago +3

      Yeah, and funnily enough, Power arch hasn’t been good for general compute efficiency since G4. They’re still useful, but mainly for enterprise server workloads as I/O, reliability/live failover, and security are greater concerns there (and Intel has been dropping the ball on those, leaving an enterprise server niche). There’s reason to believe IBM’s s390x and Power arch’s have been drifting into each other with s390x maxing out reliability and Power being more balanced. Right now, the best are:
      Dedicated HPC: Fujitsu weirdly enough, using their newfangled ARM A64FX chips which are absolute vector compute demons, basically the modern-day Cell except not a nightmare to program
      General-purpose/commodity: AMD because Zen2 arch is hard to go wrong with
      Enterprise workloads: IBM, Power for better cost/performance or s390x for absolute maximum reliability

    • @s1mph0ny 4 years ago +3

      That's a true statement, but compared to the rest of that era it doesn't mean much. AMD couldn't push 3GHz clocks on multi-core processors without even more heat, and Intel wouldn't have anything heat/power competitive until 2006. If things had slipped just a year, though, it's easy to see how different this generation might have been. Apple likely had advance knowledge of how much better Core 2 was than the outgoing P4s, and delayed their move from G5 to Intel to line up with that vast improvement. OTOH, Sony pushed back the PS3 launch due to Blu-ray AACS fuckery; had they known the PS3 would be so delayed, they could have used that time to gain a huge compatibility and performance advantage by building their platform around a Core 2 based processor.
      Given how far below competitive clocks the Wii, Wii U, and Switch have run, I doubt very much that Nintendo's choice of platform matters for thermal performance. They just need to meet the bare minimum cooling requirements spec'd for the chip and hope their supplier isn't lying about thermal/power properties.

    • @tHeWasTeDYouTh
      @tHeWasTeDYouTh 4 года назад +9

      But the RROD issue was the GPU overheating. It was never the CPU... how are people getting this wrong?

  • @SatoshiMatrix1
    @SatoshiMatrix1 4 года назад +19

    I went through three Xbox 360s due to the red ring.
    One of the things I like about retro gaming: I don't ever have to worry my NES or GameBoy CPU is gonna get so hot it causes a hardware failure just by playing them.

    • @OliverNorth9729
      @OliverNorth9729 2 года назад +3

      I went through like 4 ps2s. Not fun.

    • @19CD91
      @19CD91 7 месяцев назад

      @@OliverNorth9729 That's why it was my last console generation

  • @InsaneWayne355
    @InsaneWayne355 4 года назад +59

    While the 360 RROD was very common, it was almost always related to the GPU. The CPU rarely had anything to do with those failures.

    • @hatesac1
      @hatesac1 4 года назад +8

      Yep, the X1900 overheated and broke the solder.

    •  4 года назад +19

      @John Hooper You should say typical nVidia too. They had serious soldering issues as well; they lost the Apple partnership because of it, and laptops with nV chips were dying by the thousands. So this isn't typical ATI; it was a common problem for all players when they switched to more environmentally friendly solders.

    • @godsinbox
      @godsinbox 4 года назад

      urrrrgh, now you tell me

    • @alexsinclair2012
      @alexsinclair2012 4 года назад +11

      @@hatesac1 Nope. The temperatures never get hot enough to melt the solder. The GPU die would detach from the chip substrate due to poor design. Louis Rossmann has a video all about this

    • @XantheFIN
      @XantheFIN 3 года назад +1

      @@hatesac1 Not the solder. The fault was more likely inside the GPU. MacBook Pros had a similar failure: the GPU die itself dies, and not just from heat. It was a manufacturing problem with the GPU itself.

  • @friedgpu
    @friedgpu 4 года назад +22

    Apple suffered with PPC 970 heat too; the quad Power Mac G5 used liquid cooling and is famous for leaking, and the machine itself was unreliable.

  • @mad1316
    @mad1316 4 года назад +27

    You seem to have some mixups. The Pentium 4 that used RDRAM was the Willamette core, socket 423 (and later 478), which maxed out at 2.0GHz without HT. It wasn't until the Northwood core in late 2002 that HT was introduced on consumer Intel CPUs with the 3.06GHz Pentium 4 w/HT. Later a Pentium 4 3.0GHz HT was released on the higher 800MHz FSB (whereas the 3.06GHz was based on a 533MHz FSB).

    • @campkira
      @campkira 4 года назад

      They hit the limits of a single core, and the end of easy Moore's Law scaling. After that it was endless extra cores and ever harder-pushed chips; in the end it all comes down to finding ways to use less power to reduce heat.

    • @jamezxh
      @jamezxh 2 года назад

      RDRAM was very short-lived

  • @JustcallmeGnarly22
    @JustcallmeGnarly22 4 года назад +65

    Microsoft: we lost money on each original Xbox. We need to cut costs a bit on the next console to make some money back.
    Red Ring of Death.

    • @waltercomunello121
      @waltercomunello121 4 года назад +5

      According to SSFF, they also cut costs on test devices and test procedures, hence the high failure rate and shipping of faulty units. One guy was sent 3 faulty 360s in a row. The PPC problem just adds a piece to the jigsaw puzzle.

    • @GiuseppeGaetanoSabatelli
      @GiuseppeGaetanoSabatelli 4 года назад +3

      Microsoft gladly burned billions on the Xbox division in order to maintain mindshare and consumer loyalty.

    • @squirlmy
      @squirlmy 4 года назад +3

      @@GiuseppeGaetanoSabatelli particularly to children, who aren't discerning. The way Joe Camel sold cigarettes. That's why I find it hilarious to get Linux running on these things, even though it's usually highly impractical.

    • @campkira
      @campkira 4 года назад

      And you wonder why Bill Gates hated the idea... he only allowed it so the Japanese companies wouldn't corner them on gaming, since if people only buy consoles they don't buy PCs, and the PC becomes nothing more than an office workstation rather than a real consumer household machine...

    • @JAGtheTrekkieGEMINI1701
      @JAGtheTrekkieGEMINI1701 4 года назад +2

      @@waltercomunello121 the PS3 had extreme heat problems too, dude

  • @powerzx
    @powerzx 4 года назад +79

    The PowerPC CPU in the X360 was great. The "red ring of death" on the X360 and "yellow light of death" on the PS3 came down to lead-free solder, poor heatsinks, bad case design, and bad cooling.

    • @robertvuitton
      @robertvuitton 4 года назад +44

      That problem on the PS3 was the NEC/Tokin capacitors, and it has been proven. Two PS3s I had from 2006 with YLOD ended up working AGAIN after changing the capacitors in both. How did I find this out? Whenever I blew hot air around the CPU and GPU, where the NEC/Tokin capacitors sit, the console turned on for a few seconds or minutes; once the console temperatures dropped (while playing or just idle), it turned off. Later I found a Brazilian forum discussing the capacitors and replacing them to fix the issue. I tried it and never had a YLOD issue again. I went out of my way to do some testing with YLOD motherboards bought online and now own seven of them, all working lol.
      UPDATE:
      Edited to add this link www.psx-place.com/threads/tutorial-nec-tokin-capacitors-replacement-ylod-fix.25260/

    • @sadp1535
      @sadp1535 4 года назад +15

      @@robertvuitton Good comment. This is the real reason for YLOD. For years people blamed the RSX and the Cell (PowerPC) for bad design and overheating, but the main problem was the NEC caps. The terrible thermal paste under the IHSes made everything worse: the PPC was already hot, and that paste helped kill both the chips and the caps. The PS3 ran very hot, but once you delid the 90nm Cell and replace those caps, you will never get YLOD again if you take care of the console. The Slims were redesigned without the NEC caps and the process shrank to 45nm, which is why Slims are so reliable and the fat models are not; at least now we know the fix for those "seemingly dead" CECHA-CECHE models. On the 3xxx-4xxx units they stopped using that paste and soldered the chips to the IHS, and those systems never fail. The real chain of failure: the bad paste made the chips run hot, the heat cooked the caps, and the caps eventually died.

    • @chillhour6155
      @chillhour6155 4 года назад

      Yep, went through 3 PS3s myself because of YLOD

    • @neakmenter
      @neakmenter 4 года назад +1

      R. Atuey - yup - I believe it may be the same kind of tantalum capacitor (possibly even the same rating?) that causes the majority of MacBook Pro 15” 2010 and 2011 “gpu” failures...?

    • @Ehal256
      @Ehal256 2 года назад

      No idea about the quality of manufacturing of other components, but everything I've heard from game developers is that the PPC chips in the PS3 and 360 were pretty bad to program for, with all kinds of performance pitfalls.

  • @MrSkyl1ne
    @MrSkyl1ne 4 года назад +112

    The part about the Pentium 4 is wrong. The Pentium 4 only supported RDRAM at launch in 2001 with the Willamette architecture. The Northwood architecture introduced in 2002 already switched to DDR SDRAM (not what you incorrectly label as regular DRAM; there are multiple variants). The P4 HT processors were introduced in 2002 with the Northwood architecture (3.06GHz in 2002 and 3.0GHz in 2003 with a faster FSB) and did not support RDRAM.

    • @KonjonoAwesome
      @KonjonoAwesome 4 года назад +18

      Not only that, but Northwood was pretty tame temperature-wise through the 2.8 GHz models. Pushing past that frequency and switching to 90 nm with Prescott is where the heat issues really began. Netburst wasn't an ideal architecture for IPC but the Northwood chips were competitive with AMD's offerings at the time. Both consumed about the same power. I'm guessing power consumption and price per unit were a larger concern than thermals or performance in Microsoft's decision to move on from Intel. Microsoft must have sunk outrageous amounts of money into redeveloping their SDK for PPC. You would think they would have stuck with x86 if the price had been right, based on what x86 hardware was available at the time.

    • @yukinagato1573
      @yukinagato1573 4 года назад +4

      Pentium 4 Willamette chips were first released supporting RDRAM only (socket 423). Later on, before the Northwood release, Intel launched Willamette cores (Pentium 4 and the first NetBurst Celerons) supporting regular SDRAM (socket 478). With the Northwood core, DDR became the standard.

    • @yukinagato1573
      @yukinagato1573 4 года назад +8

      @@KonjonoAwesome Yeah, they really made significant changes with the Prescott release. The major problem was (once again) the pipeline size increase, from 20 to 31 stages. That's literally more than 3 times the Pentium III stage count! At least in the 478 platform, upgrading from Northwood to Prescott didn't offer much advantages...

    • @snetmotnosrorb3946
      @snetmotnosrorb3946 4 года назад +6

      @@yukinagato1573 I bought one of those. Huge mistake. I even knew AMD was better at the time, but 3 years earlier I had gotten a pretty good prebuilt PC with a Pentium 4 in it for a low price, and all things considered I couldn't complain. I wanted to upgrade it, but didn't understand how meaningless it is to upgrade within the same generation, so I went from a 2.4 GHz Northwood to a 3 GHz Prescott. It had HT, but that didn't help much in a home desktop environment.

    • @jrus690
      @jrus690 4 года назад +3

      No, it wasn't the Pentium 4 itself that supported or didn't support DDR SDRAM or SDRAM; the northbridge was a separate entity from the CPU until the Core i7 9xx series chips. Intel bet the farm on RDRAM and decided not to make a chipset for DDR SDRAM until 2002. They even designed a Pentium Pro (P6) chipset for RDRAM because they thought it was the future, but those chips never had a DDR FSB so they couldn't use it, except in dual-processor servers. In 2002 Intel gave up on RDRAM and dropped all the support and chipsets for it.

  • @kristianutomotobing9719
    @kristianutomotobing9719 4 года назад +94

    AMD is really knocking out the competition by supplying the CPU and GPU for consoles this generation and the next.
    AMD's technology enables backwards compatibility between current gen and next gen.
    Really, really looking forward to the video about AMD supplying this console generation.

    • @robertvuitton
      @robertvuitton 4 года назад +1

      They'll be using both the CPU and GPU from AMD? That sounds damn great.

    • @SuperAmazinglover
      @SuperAmazinglover 4 года назад +13

      AMD is not the main reason for backwards compatibility.

    • @Fadexpl
      @Fadexpl 4 года назад +27

      @@SuperAmazinglover you're right, the common x64 architecture is. But since consoles could never go Intel due to poor price/performance right now, the only other option would be ARM, effectively killing backwards compatibility once again. So... thanks, AMD.

    • @SuperAmazinglover
      @SuperAmazinglover 4 года назад +1

      Fadex Intel's price-to-performance is that way specifically because they choose it. They have no reason to lower their prices, as they still hold a fair share of the consumer market and most laptops and prebuilts are Intel as well.
      They also wouldn't go ARM because it isn't powerful enough for next gen. We're getting backwards compat because of x64 and because Xbox has worked really hard on its emulation. If it were all thanks to AMD, Sony would have a lot more games ready at launch.
      We really should give MS's emulation team a lot more credit.

    • @MarcoZ1ITA1
      @MarcoZ1ITA1 4 года назад +18

      Not like they have an alternative. Nvidia has had petty bitch fights with everyone but Nintendo (and I'll bet they will at some point too), and Intel can't provide an all-in-one custom solution with proven tech, good yields, and good thermal and power efficiency like AMD can, especially on the GPU side.

  • @carolinehusky
    @carolinehusky 4 года назад +47

    One could say that Nintendo was actually ahead of its time, with both the PowerPC based GameCube (used in the Wii and Wii U, as well as the 360 and PS3), and the ARM based Gameboy Advance (used in all their handhelds since, including the Switch).

    • @tHeWasTeDYouTh
      @tHeWasTeDYouTh 4 года назад +11

      Since the N64, Nintendo had been using RISC CPUs. Sony also always used RISC (MIPS chips). Microsoft's was the first console in a while to use CISC.

    • @mirac_
      @mirac_ 2 года назад +1

      I wouldn't say Nintendo was ahead of their time with the Game Boy Advance being ARM-based; it was probably more a necessity, since an x86 or PowerPC handheld wouldn't have made much sense (power efficiency and thermals).
      They probably are ahead of their time with the Nintendo Switch being ARM-based, though.

    • @Luke357
      @Luke357 2 года назад +6

      @@mirac_ The Switch being on ARM isn't ahead of its time; it's with the times, as mobile devices already tended to use ARM before the Switch came out.

    • @mirac_
      @mirac_ 2 года назад +1

      @@Luke357 I was talking about home consoles specifically, my bad.
      I wouldn't be surprised if the PS6/7 or future Xboxes were ARM-based.

  • @theshadowman1398
    @theshadowman1398 4 года назад +56

    As a huge Mac fan, I love PPC. I still have a last-gen (late 2005) G5 tower with dual 2.3 GHz CPUs and 10GB of RAM; that thing still flies and is actually used as a daily driver.

    • @amirpourghoureiyan1637
      @amirpourghoureiyan1637 4 года назад +11

      Power9 is still very relevant in the workstation space; staying in the PPC ecosystem just means moving over to Linux.

    • @kjjustinXD
      @kjjustinXD 4 года назад +14

      16gb G5 Quad here 🙂

    • @theshadowman1398
      @theshadowman1398 4 года назад +5

      kjjustinXD
      I am staying away from that one because I am a bit scared of its liquid cooling. I do have a Quadro FX4500 in my G5.

    • @kjjustinXD
      @kjjustinXD 4 года назад +9

      @@theshadowman1398 I got lucky: got it locally for only 50€, no leaks, but I replaced everything that might fail and refilled it with non-corrosive liquid that won't rip and tear everything apart when it fails.

    • @amirpourghoureiyan1637
      @amirpourghoureiyan1637 4 года назад +4

      @@kjjustinXD I'd rather jerry-rig a Noctua cooler than chance an aging AiO, especially on a maxed-out G5

  • @atm94404
    @atm94404 4 года назад +3

    There's a key part of this you missed: 3DO. The 3DO M2 was PowerPC-based. Although it never shipped as a console (there's a huge story about what happened between Panasonic, Sega, and 3DO), the system was finished and was used in arcade machines and kiosks. 3DO showed M2 to Nintendo, and Nintendo was impressed, but not enough to buy the MX successor. For GameCube they ditched the MIPS processor (used in the N64 and the PS1) and went with the impressively scalable PowerPC. Apparently one thing that impressed Nintendo was how easily M2 could have two CPUs work together with almost no overhead and no tricky programming, compared to the Sega Saturn. 3DO people ended up working both at WebTV and on the original Xbox, so there were people familiar with the PPC and with dealing with IBM, since they had already worked with IBM to create a low-cost PPC tuned to be especially good for the needs of 3D games.

    • @tHeWasTeDYouTh
      @tHeWasTeDYouTh 4 года назад

      The 3DO M2 used two PowerPC 602 CPUs, but the console was inferior to the Dreamcast in every way. It would have been dead in the water if it had been released.

    • @deansmith6924
      @deansmith6924 2 месяца назад +1

      The lead designer of the Xbox 360 worked on the 3DO, so there is a link.

  • @nrg753
    @nrg753 4 года назад +17

    First thing MS and IBM did together since the massive drama that was OS/2!

    • @squirlmy
      @squirlmy 4 года назад +2

      @referral madness They also tried a different hardware standard with the Personal System/2 (PS/2, but that's confusing in this context with PlayStation!) to fight ISA (and later EISA), a standard agreed on by the "clone" makers. That failed spectacularly, worse than OS/2. The PC clones won out over IBM. There was a lot of drama at the time. IBM didn't even want "home computers" to exist; they wanted to use the PC to tempt growing businesses into buying minicomputers or Big Iron. It was a bad miscalculation.

  • @faustianblur1798
    @faustianblur1798 4 года назад +27

    As Sony and Microsoft were transitioning their consoles to PowerPC, Apple were moving to x86. Now as Sony and Microsoft are adopting a decent x86/x64 architecture, Apple are migrating their Mac products to custom ARM CPUs. It'll be interesting to see if the following generation of consoles (PS6 + Xbox ?) follow suit again.

    • @88oscuro
      @88oscuro 4 года назад +7

      Microsoft already started migrating Windows to ARM last year. It's not the best solution at the moment, but personal computers will most likely move to ARM over time. If there is another console generation after the PS5 and Xbox Series X (which I hope), I wouldn't be surprised by another switch in architecture.

    • @amirpourghoureiyan1637
      @amirpourghoureiyan1637 4 года назад +7

      AMD's got an ARM license, I bet Sony and Microsoft are already hashing out the details with AMD. They've both been restricted with design in order to maintain good heat dissipation, moving back to RISC would allow them to not compromise on aesthetics in order to get great performance.

    • @88oscuro
      @88oscuro 4 года назад +2

      @@amirpourghoureiyan1637 Yeah, didn't Jim Keller also design an ARM architecture while at AMD? K12 or something, but it was put on hold because of financial constraints and making sure Zen got released.

    • @ScottTancock
      @ScottTancock 4 года назад +3

      Once Sony and Microsoft transition to ARM (it'll probably happen, once single-core performance gets there), hipster Apple will probably move again. Currently the best bet would be RISC-V (no licensing cost to pay, meaning Apple keeps more of your money), but RISC-V has issues at the moment. Maybe RISC-V will sort out its issues, maybe something else will become available, maybe Apple will transition to OpenPOWER (IBM's latest PowerPC ISA iteration) instead. Who knows?
      @88oscuro K12 got announced and then forgotten about. Maybe it will come when AMD has the cash flow, maybe it'll stay dead, maybe AMD found some reason why ARM can't outperform x86 in the desktop market.

    • @DripDripDrip69
      @DripDripDrip69 4 года назад +1

      @@88oscuro The K12 design is complete. AMD's original plan was to use Zen for high performance (Bulldozer replacement) and K12 for low power (Bobcat/Jaguar replacement), but since Zen turned out to scale so well, they shelved K12.

  • @ElectronikHeart
    @ElectronikHeart 4 года назад +36

    The Red Ring of Death and the YLOD are not CPU-related.
    It's the GPU, from two different manufacturers, causing the problem.
    It's mainly down to the RoHS solder used in that period being over-stressed by the heat coming off these chips.
    The CPUs, on the other hand, do run hot, that's for sure! But they are designed to throttle correctly when overheating.
    Also, the Xbox 360's GPU is suffocating under the DVD drive.

    • @ricky302v8
      @ricky302v8 4 года назад +2

      Lead free solder has a higher melting point than leaded solder, so how can it be 'over-stressed' by the heat from the GPUs?

    • @ancapsu8728
      @ancapsu8728 4 года назад +1

      RoHS is the cause of the problem.

    • @ElectronikHeart
      @ElectronikHeart 4 года назад +4

      @@ricky302v8 It has a higher melting point, but it's not as malleable as leaded solder.
      Another problem is that these GPUs couldn't handle the extra heat needed to reflow RoHS solder properly.
      So they came out of the factory with solder balls making physical contact with the motherboard, but not properly soldered.
      As time goes on and oxidation occurs, they lose electrical conductivity.

    • @Nathan123Bhi8
      @Nathan123Bhi8 4 года назад

      @@ElectronikHeart Nice to see you here! :)

    • @supermasterfighter
      @supermasterfighter 4 года назад

      It was actually a power supply issue: the power supply for the original Xbox ran too hot and would damage the GPU. It was related to the solder, but the original GPU just wasn't up to par thermally to begin with.

  • @tyraelhermosa
    @tyraelhermosa 4 года назад +5

    Love this video. A minor correction: those 3GHz Pentium 4 chips with Hyper-Threading used DDR RAM. Only the original P4s, the "Willamette" chips, used the fast but expensive and inconvenient (it had to be installed in matched pairs) RDRAM. About 8 months after the initial Pentium 4 launch, new budget motherboards with a new chipset arrived that used the much cheaper (but slower) PC133 SDRAM, followed about 6 months later by another chipset and set of motherboards that used DDR RAM, which Intel stuck with going forward, joining the rest of the industry.

  • @MrNside
    @MrNside 2 года назад +7

    According to MS, the overwhelming cause of 360 failures was separation in the solder between the silicon and the substrate on the CPU. It wasn't so much that it ran too hot; it was the repeated heating and cooling cycles. Their (albeit rushed) stress testing didn't account for consoles being turned on and off far more often than something like a PC.

    • @garricksl
      @garricksl 2 года назад

      @@deansmith6924 Blame the PPC 970, not ATI. Every old Mac user hates the unreliability of G5 computers.

    • @garricksl
      @garricksl 2 года назад

      @@deansmith6924 I'm surprised that ATI was making substandard GPUs. Anyway, I use an AMD CPU on the Wintel platform, and Nvidia GPUs are a de facto monopoly.
      I am a Mac person; the 970 is what caused Apple to switch to Intel in the first place.

  • @kmemz
    @kmemz 4 года назад +45

    Heat wasn't the only issue with IBM+Toshiba's Cell architecture.
    The largest problem for these consoles was that they released very early in the move to lead-free solder. In those early days, the metal mix in the solder wasn't fully figured out yet, and the techniques and temperatures required for lead-free solder were just different enough that things were bound to go wrong under the right conditions.
    And in these consoles, the right conditions they were. Early failures were mostly cracking of brittle joints that couldn't cope with flexing and thermal differences between the base PCB and the CPU substrate PCB. Later on, consoles that had come out of that era without issue had problems with the tin slowly fringing outwards, until little hair-like whiskers bridged other joints, shorting connections and interrupting data flow or voltages depending on what got shorted. Had these consoles shipped in the era of leaded solder, whose techniques and manufacturing processes had been in use with few failures for decades, this would likely never have been an issue: no stringing solder, no brittle joints, and the heat load would have been much less of a problem.
    Even now, leaded solder, while generally seen as less environmentally friendly for some reason (likely more related to the eWaste industry than to having it in a computer), still proves easier to handle and generally better for getting a strong, reliable joint, although manufacturing and design have long since adapted to the potential issues tin-based solder brings to the table.

    • @snetmotnosrorb3946
      @snetmotnosrorb3946 4 года назад +2

      PCB fabs had years to prepare for RoHS, but they shoved their heads in the sand and got caught with their pants down when it took effect.

    • @bizzzzzzle
      @bizzzzzzle 4 года назад

      KillerMemz PPC is also IBM...

    • @kmemz
      @kmemz 4 года назад +5

      @@bizzzzzzle The Cell architecture was co-developed by IBM, Toshiba, and Sony. In the full version of the Cell architecture, the one the PlayStation 3 had/has, there was one PowerPC core with a few instructions tacked on to handle Cell-specific workloads, serving as the master core, and several slave cores running an offshoot instruction set that wasn't PowerPC-compliant. The master core handled all the IO, the base processing for the game, and the scheduling of the slave cores; one slave core was used for the system UI, and the rest were reserved purely for games.
      The Xbox 360 did get the Cell architecture, but a modified version of it. Instead of one master and several slave cores, the Xbox 360 received a version that repurposed the core-communication instructions of the main PowerPC core to talk to other full PowerPC cores, three of them, in what is now a much more traditional multi-core configuration, which likely would never have happened without the core-communication instructions co-developed by IBM, Sony, and Toshiba.
      Imagine a single-core Xbox 360. What a different world *that* would be.

    • @robertvuitton
      @robertvuitton 4 года назад

      @@kmemz What if Sony's current gen had used the better and smaller Cell (used in the Super Slim model) architecture, but with current-gen memory sizes? I would have loved to see how powerful that could've been.

    • @kmemz
      @kmemz 4 года назад +5

      @@robertvuitton While a new CELL-style master/slave processor on POWER10 and a newer manufacturing node would be great, I don't think it would have as much "unlockable potential" as the original CELL did. The difference between then, the mid-PS4 cycle, and now:
      Back then, single-socket multi-core processing was still a very, very new thing, and not many people knew how to code for it properly.
      In the middle of the PS4 cycle, most developers still didn't properly utilize the eight Jaguar cores of that generation's consoles, because of AMD's design of sharing important sections between cores, leaving four floating-point units and eight integer units, which in practice worked out as a strange hardware hyper-threading; and on the Intel side, we were still in the middle of the quad-core monopoly era.
      Now, the quad-core monopoly era is completely and entirely over, and most decent developers understand how to work with six to eight cores in practice. As long as they understand both the master and slave architectures, have a relative understanding of how the master core schedules the slaves, and actually code their program to take advantage of the hardware that's there, I don't imagine it posing the same intimidating threat it did back then.
      On top of that, cross-communicating master cores have come a long, long way since then. Intel's ring bus has proven particularly impressive given its theoretical limitations, and AMD's shared-cache mesh bus, the "Infinity Fabric", has proven to be an absolutely incredible piece of work, although its tie to system RAM clocks unfortunately holds it back a bit. While cross-communicating master cores still pose some theoretical limits, they generally offer more flexibility in how well they scale depending on implementation, and can be implemented in a larger variety of ways.
      In short, I think we've finally reached the era where what made the CELL hard to code for, yet such a wonder of unlockable performance, is long over, and a new version with updated manufacturing and architecture wouldn't look nearly as impressive as it did back then.

  • @onlysublime
    @onlysublime 2 года назад +2

    There were a number of incorrect statements in this video, but I'll only mention one: the original Xbox did not have a Celeron processor. It actually had a modified Pentium III Coppermine (cut down to reduce cost, but preserving Coppermine's key aspects).

  • @sdmods619
    @sdmods619 4 года назад +5

    Sometimes I wonder what the Xbox-scene would have looked like if they were able to stay with Intel/nVidia. It could have been another explosion in homebrew, emulators, and backup capability at a much faster pace and maybe saved us from the RROD.

  • @D4Disdain
    @D4Disdain 4 года назад +6

    Props to MVG for using Unkai from the Axelay OST, one of the best level 1 tracks of all time!

    • @Vanessaira-Retro
      @Vanessaira-Retro 4 года назад

      Yes, I was like: wait, that's Axelay!!! Also, props for the Salamander avatar there, D4Disdain. Though I think we both know which shooter has the best stage 1 music. :D

  • @anderson9244MLG
    @anderson9244MLG 4 года назад +40

    Holy shit! Touhou music

    • @C00L9UY
      @C00L9UY 4 года назад +1

      I love Patchouli!!

    • @poetycko_o_ksiezycu
      @poetycko_o_ksiezycu 4 года назад

      @@C00L9UY me2

    • @system64_MC
      @system64_MC 4 года назад

      ​@@C00L9UY Flandre Scarlet is my fav (Yeah I know she is overrated)

  • @sundhaug92
    @sundhaug92 4 года назад +20

    The original Xbox originally had an AMD CPU, which is why the CPU erroneously rolls over to address 0 at one point (AMD CPUs throw an exception there); they switched to Intel relatively late in development.

    • @DripDripDrip69
      @DripDripDrip69 4 года назад +1

      The day before announcement to be exact

    • @snetmotnosrorb3946
      @snetmotnosrorb3946 4 года назад +3

      @@DripDripDrip69 Really? I've never heard that before. What AMD CPU was considered?

  • @TaimatCR
    @TaimatCR 4 года назад +8

    Man, I sure love the way you present this information about the consoles, their architecture, and their exploits. Thanks for doing this; it's both entertaining and informative.

  • @JacobJacob2
    @JacobJacob2 4 года назад +33

    I don't know why, but every time I watch one of your videos, I just imagine the executives of whatever console company you're talking about banging their heads against their desks in rage when they hear people are loading homebrew onto their consoles lmao.

    • @DxBlack
      @DxBlack 4 года назад +1

      ..."console company" -> Why would they care that the old systems can run games no longer for sale without purchase, when the game publishers/developers don't care anymore either? (other than Nintendo, since they can't make a dime without nostalgia these days)

    • @JacobJacob2
      @JacobJacob2 4 года назад +1

      @@DxBlack I imagine it's during the time when the console is fresh and new.

    • @Kalvinjj
      @Kalvinjj 4 года назад +1

      @@DxBlack you kinda lost me at "[...] they can't make a dime without nostalgia these days"... I'm pretty sure the Switch is selling pretty well, and not just on retro stuff...
      Now, if you're counting any old franchise or character making money, that's a bit too broad; we could say the same about Disney or Marvel.

  • @tchitchouan
    @tchitchouan 4 года назад +28

    HOLY SHIT PC98 TOUHOU MUSIC THAT WAS ABSOLUTELY UNEXPECTED

  • @TheLonerD
    @TheLonerD 4 года назад +7

    Huge thank you for Touhou PC-98 music.

  • @AcornElectron
    @AcornElectron 3 года назад +3

Engineer: It gets hot
    SMN: Ship it!

  • @johola
    @johola 4 года назад +7

    10:20 Come on, ATI was integrated into AMD 7 years before that...

  • @konstantinkh
    @konstantinkh 4 года назад +9

    I've just missed the Xenon era, getting into console game development shortly after XB1 release. But I've seen enough #ifdef XENON over the years to have a pretty good idea of just how many workarounds had to be put into place to get the games to actually perform at an acceptable level on the 360. Sure, you could just take your Windows DX8 game and compile it for 360 with minimal changes, but you weren't shipping a competitive AAA title this way. It's good to know there were good, legitimate reasons to go PowerPC in that generation, and it wasn't just a case of mass insanity in the game development world.
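The guard the comment mentions can be sketched in C. This is an illustrative example, not code from any shipped engine: `XENON` was the usual define for the 360 target, and `__dcbt` is the PowerPC data-cache-block-touch intrinsic from that era's toolchains, but `sum_array`, `PREFETCH`, and the prefetch distance are all hypothetical.

```c
#include <stddef.h>

/* Sketch of the "#ifdef XENON" pattern: one algorithm, with
 * per-platform tuning hidden behind a guard. The prefetch
 * distance (32 ints ahead) is an arbitrary illustration,
 * not a tuned value. */
#ifdef XENON
  #define PREFETCH(p) __dcbt(0, (p))  /* hint: pull the cache line early */
#else
  #define PREFETCH(p) ((void)0)       /* no-op on other platforms */
#endif

/* Sum an array, hinting upcoming cache lines on the Xenon path. */
long sum_array(const int *a, size_t n) {
    long total = 0;
    for (size_t i = 0; i < n; i++) {
        if (i + 32 < n)
            PREFETCH(&a[i + 32]);
        total += a[i];
    }
    return total;
}
```

On non-Xenon builds the macro compiles away entirely, which is why this style let studios keep one code base while still shipping a competitive 360 build.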

  • @JoeStuffz
    @JoeStuffz 3 года назад +1

The Race for a New Game Machine book was impressive. It said that the 360's PPE was originally designed for the PS3. IBM showed Microsoft the PPE, and Microsoft was impressed. Microsoft did add their own special sauce to that chip and extended the vector register file. It might have had 2 vector units per core (6 threads, 6 vector units), which knocked down the advantage of the Cell CPU by a little. Around that time, even PCs were going multi-core, so the risk for these multi-core CPUs was relatively low.
    IBM also was a good choice because IBM developer documentation was actually pretty good at that time (it's been a while since I have seen IBM developer docs). Microsoft is a compiler maker, and the in-order nature of the CPUs might be able to be helped with compiler optimizations. That probably sounded really good to Microsoft. The other advantage was that to learn how to develop an Xbox 360 game, you just needed a multi-core PowerPC system with an ATI graphics card, and they had Apple Macs at the time.
    I also feel that the likes of Unreal Engine 3 taking off helped since Unreal Engine is still being used, and is often praised at the time as "that system will run UE3 games very well!"
Considering their options, outside the RRoD, the CPU itself wasn't a bad choice. They probably should have tested the CPU more before release (they shipped an early revision instead).

  • @ultimatemarisaac5303
    @ultimatemarisaac5303 4 года назад +4

    Ay my man MVG with the Touhou 4 LLS Main Menu Theme Song at the beginning of the video, nice. I've never expected to hear Touhou music in one of your videos, what a pleasant surprise~

  • @hugo-garcia
    @hugo-garcia 4 года назад +3

Mistakes were made: at 2:11, the Xbox had a Pentium III CPU, not a Celeron

  • @RichHomieBodhi
    @RichHomieBodhi 4 года назад +6

The chips ran hot, and that was thought to be one of the causes of the PS3 YLOD. It's crazy that the real cause of the PS3 YLOD was the NEC capacitors failing. Replacing them should fix the system for good.

    • @mcrecordings
      @mcrecordings 4 года назад

Did Sony ever own up to this? I remember when my phat PS3 YLOD'd they didn't want to know, even though it was a common problem. MS eventually took responsibility for the RROD, although that was after my launch 360 had developed the problem...

    • @mcrecordings
      @mcrecordings 4 года назад

@jvalex18 The 'phat' models are notorious for it. Not as bad as RROD on the 360, but still a well-known issue.

  • @jevansturner
    @jevansturner 4 года назад +4

    8:00 I don't know why this is framed as "unfortunate for Sony" that they didn't have exclusive access to PowerPC architecture and Microsoft was also able to use it. Nintendo had been using it since the previous generation and went on to use PowerPC for 3 generations in a row (GameCube, Wii, Wii U). Also Apple had been using it in their computers at that time. It was never a Sony thing. Sony only had the arrangement of special-use cores as the "Cell" processor. Microsoft and Nintendo didn't really try to do that. I don't think it was any more "unfortunate" for Sony than it was for Nintendo or Apple. They probably didn't care one bit.

  • @1363Max
    @1363Max 4 года назад +3

These technical history videos are so cool, and watching them is fun!
Nice job.

  • @aviumcaravan
    @aviumcaravan 2 года назад +1

the thermal issues of the PowerPC 970 (G5) became apparent from the fact that Apple never released a G5-based laptop and that the dual-CPU G5 Mac used water cooling with copper blocks.

  • @m3chan1zr
    @m3chan1zr 4 года назад +3

    Always a fan of PPC architecture. I wonder how the current gen would be if they stuck with PPC with modern architectures.

  • @italodirenzo5876
    @italodirenzo5876 4 года назад +7

I find it interesting to see the shift that hardware chips have taken over the years from excessive diversity to a rigid monoculture. Back in the day it seemed like every console and arcade machine had its own customized hardware designed specifically for it. But as games became more technically advanced and the costs of making them increased, it just doesn't seem financially viable anymore for console manufacturers to spend exorbitant amounts of money on custom designs; instead they reach for tweaked off-the-shelf solutions. Which is why the PS4 and Xbox One run x86-64 (like standard PCs) and the Switch runs ARM (mobile phones/tablets). I feel like for the 360, the PPC deal with IBM was simply the most financially viable deal for them at the time. Look how much money Sony lost on engineering the Cell. Enough to offset pretty much all profit that would have been made from the PS3 in its lifetime. Despite how interesting of a design the Cell was, I don't think we'll ever see such a radical and risky design like that ever again, because it just isn't financially viable anymore.

    • @mytech6779
      @mytech6779 2 года назад +5

      You have it a bit backwards, they made custom hardware because it was required for the games, commodity hardware was not capable. They never made custom hardware just for the fun of spending money.

  • @8ightbitshaun558
    @8ightbitshaun558 4 года назад +3

i hope you read this... i absolutely love your content. You should make a DVD/Blu-ray set with these documentaries, as I would buy them in a heartbeat!! Very interesting, and I would rewatch them. Thank you for your hard work in making this type of video.

  • @fbussier80
    @fbussier80 4 года назад +2

    Was working at IBM Bromont in Quebec when they were making those chips. Fun times.

  • @nitramnitramis2339
    @nitramnitramis2339 4 года назад +3

    Again an interesting video - Thank you
    At 10:30 i see a modified Xbox360 (with external power cable for DVD drive ?). Can you give more information on this ? Where can i buy this long cable ?

  • @JustWingingIt-s5f
    @JustWingingIt-s5f 2 года назад +1

What’s really bad is that all the case designers for the 360 and the PS3 had to do was improve airflow and use more active cooling with better thermal paste. I’ve modded both console cases by simply adding cutouts to improve air intake, and on the PS3 it dropped CPU temps by 8 degrees C. The 360 relied on basically trying to passively draw air across the heat sinks from 2 incredibly small bands of holes. Stand up the console? Block the bottom intake. Add on a HDD? Block the top intake. The 3rd and 4th bands of holes? Blocked by the presence of the motherboard itself. All I did was cut a huge hole in the case and stand off a sheet of plexi to protect the components, and temps dropped dramatically for that system. Add in the 2 small fans, one on the old-style CPU cooler to push air through it and one on the extended arm of the GPU cooler to push air down over it, and it runs without a single stutter in fps

    • @BruceStephan
      @BruceStephan 9 месяцев назад

      I get tired of morons dragging PS3s into the problem Microsoft did with their POS 360s . They never bothered fixing the problem in the first place by using better thermal paste and make some minor adjustments with their fans . SONY never really had a problem until after a lot of heavy use over a long time with the PS3 . I never had any problems with my PS3 and I used it just as much as my 360 and still have no problems . The 360 issue with me is very different . I didn't buy one til the big lie was going around that Microsoft finally fixed it with the HALO3 Edition 360 . Mine lasted 3 weeks . Microsoft has always been last because they still sell 💩 to their customers .

  • @justforfunvideohobby
    @justforfunvideohobby 4 года назад +5

    My dude coming through consistent with the good content

  • @Syx7h
    @Syx7h 4 года назад +1

I don't know much about hardware and coding but i love your videos so much. I watch them all the way through and even come back to watch them over again. You can describe these things and articulate so well.

  • @Josh.Davidson
    @Josh.Davidson 4 года назад +3

    Quick correction here, the red ring of death was not at all related to heat. It was a defective graphics chip from ATi. Replacing this chip with a newer fixed chip repaired the console.
    Heat caused louder fans and such, but didn't cause RROD.
    Also, by the time the Pentium 4 HT was released, Intel had long abandoned RDRAM in favor of traditional DDR.

  • @jsheradin
    @jsheradin 4 месяца назад +1

    Red ring isn't caused by high temperatures and, in almost all cases, isn't related to the IBM CPU at all. The underfill used by AMD/NEC for the GPU was a poor match in terms of thermal expansion coefficient and glass transition temperature. After many thermal cycles it causes fatigue in the solder bumps attaching the silicon dies to the interposer. Doesn't matter how hot they run, it'll fail no matter what. It's also why reball/reflow is only a temporary fix. They're just going to crack again.
    This was resolved at some point during the Falcon revision of the 360 with a move to a different underfill material. Take a look at xenonlibrary and ripfelix for some good deep-dives.

  • @maratkopytjuk3490
    @maratkopytjuk3490 4 года назад +3

    Can you make a video how a development of a simple hello world on a Xbox/PS3 looks like? Tooling/SDK/docs would be interesting how it differs from a "normal" development with non-proprietary libs/frameworks

  • @chubbysumo2230
    @chubbysumo2230 4 года назад +2

Their attempt to control the hardware and control the cost ended up with the red ring of death, which would cost them even more in the end. They really cut down on quality control, which saved them costs up front. Those cost savings bit them in the ass when millions of units failed.

    • @amirpourghoureiyan1637
      @amirpourghoureiyan1637 4 года назад +1

They make these sacrifices in order to steal consumer mind share from each other, and that has been the biggest factor in making systems successful. The PS2 was outclassed in hardware by its competitors, but limitless exclusivity deals and year-after-year price cuts led it to become the most successful gaming console to date.

  • @mitchel71
    @mitchel71 4 года назад +7

    I still remember all the red rings my friends got. I had a ps3, but I remember a couple of my friends had Frankenstein machines, holes cut in the side, fans hot glued in to suck the heat out, one guy had it straight mounted to piece of wood instead of in a case at all
    Strange times to be sure.

    • @mitchel71
      @mitchel71 4 года назад +1

      I played the crap out of my ps3, never had any yellow lights. I know they were out there, I'm not naive, but I think the failure rates were 10% of the 360s.

    • @damonsalvatore3222
      @damonsalvatore3222 4 года назад

      I kept the shell off of mine and would just take the top of the disc drive off, remove the magnet then insert the disc & put the drive lid back on . Basically hotswapping which would also allow for ISO mods on unmodded 360s :)

  • @barrynevio4440
    @barrynevio4440 4 года назад +1

Although they shared the same architecture, harnessing the power of those processors was difficult and differed between all 3 consoles. Making a cross-platform game at that time was a nightmare.
My 360 was cursed. Out of all the consoles I've ever owned, the 360 was the one I took apart the most, always for repair. The 360 was such a furnace that the cooler mounts would deform under the heat. When I finally perfected my cooler mods and had it running seamlessly, I accidentally flashed the wrong firmware onto the DVD drive and bricked it. When I found a cheap compatible replacement, I spilled an entire diet coke into the console as soon as I took the cover off. At that point I took it as a sign and shelved it.

  • @MarcoGPUtuber
    @MarcoGPUtuber 4 года назад +6

    Ooooh Axelay theme! Haven't played that game in years!

  • @youdontneedtoknowthishandle
    @youdontneedtoknowthishandle 4 года назад +2

I think the tragedy of PowerPC is the sheer number of terrible PC ports we got for many games, and the lack of ports for many excellent games that are now stuck on 13+ year old consoles.

  • @gmugrumbach
    @gmugrumbach 4 года назад +4

I was too young to appreciate the reasoning for all of these designs at the time. But looking back, these decisions were the right moves. Console gaming has always had to find a way to be more competitive and cheaper than PCs.

    • @maggiejetson7904
      @maggiejetson7904 4 года назад

Consoles were like $300 back then, when a PC CPU alone could easily be $100-300.

    • @Dash120z
      @Dash120z 4 года назад

      @@maggiejetson7904 yeah but nothing beats the comfort of plopping a disc into your console and pressing A to start your game

  • @retropuffer2986
    @retropuffer2986 4 года назад +2

Apple allowed legally licensed Mac clones during their PowerPC era. It led to some interesting designs, and the clone companies were making performance monsters until Apple cancelled the program. Makes you wonder how much further PowerPC could have been taken.

  • @surrodox
    @surrodox 4 года назад +12

    I saw a MVG video about old consoles, I like.

    • @marsma2229
      @marsma2229 4 года назад +2

      'old consoles' RIP me

  • @maynardburger
    @maynardburger 4 года назад +1

    CPU's used to take up a much larger chunk of the costs back then. So it really made sense that they had to go with the most cost-conscious options in order to keep selling their systems at a reasonable cost, which is obviously a mandatory aspect of the console strategy.

  • @techtomek5062
    @techtomek5062 4 года назад +4

The funny thing is that while the consoles were moving to PowerPC, Apple turned its back on IBM PowerPC and went x86.
Now we have similar times again. Apple goes to ARM, Nintendo uses it too. PS and MS use AMD APUs.
The question is not which CPU architecture is used but which graphics acceleration.
AMD made a really great purchase with ATi. The current success is built on this purchase.
Intel can't get GPUs right, as we already saw with the i740, and we're still waiting for Larrabee 😄
We now need to see how ARM will evolve and whether Apple really gets performance out of ARM. But more important is GPU performance and whether there will be more competition, maybe from NEC.

    • @net_news
      @net_news 4 года назад +3

Well, Apple moving to Intel was a huge loss for PowerPC. I'm sure Microsoft and Sony got a great deal on their PPC chips... because you need customers to keep your fabs working, you know.

    • @net_news
      @net_news 4 года назад +1

@referral madness The NEC PC98 series were x86 IBM PC clones with a different BIOS... not a big deal. The PC88 was really interesting though.

    • @techtomek5062
      @techtomek5062 4 года назад

      ​@referral madness NEC or PowerVR is still active in the mobile sector. We have the successors of the Kyro graphics cards, for example the PowerVR GE8320 . In the Playstation Vita was also PowerVR built in

    • @tHeWasTeDYouTh
      @tHeWasTeDYouTh 4 года назад

NEC has nothing to do with PowerVR; they left them in 1999. Imagination partnered with STMicro, only for STMicro to bail on them and have the Kyro 3 cancelled in 2002. It hurts even today when I think of it. I know Apple's custom GPU uses Imagination patents, but I don't know much more than that.

  • @newfiepacer
    @newfiepacer 4 года назад +1

    I don't understand any of this technical stuff, but for some reason, I really enjoy listening to it. Maybe it'll all come together someday in the future!

  • @mattiaswennerstrom6271
    @mattiaswennerstrom6271 4 года назад +7

    Only the VERY early Pentium 4 used RDRAM and that one never reached above 2.0ghz and never had HT.

    • @GraveUypo
      @GraveUypo 4 года назад +1

pretty sure there were Pentiums up to 3GHz that used RDRAM, but the OPTION to use DDR RAM came after a year or so. it made the CPU much slower though. still worth it over the overpriced piece of shit that was RDRAM
found an article:
www.tomshardware.com/reviews/intel-ddr,403-4.html

    • @DripDripDrip69
      @DripDripDrip69 4 года назад

Since the memory controller was in the northbridge, not in the CPU like nowadays, it was really a motherboard requirement.

    • @DripDripDrip69
      @DripDripDrip69 4 года назад +1

@referral madness RAM tech developed by Rambus that uses serial rather than parallel communication. It was faster than SDRAM at the time, but because Rambus is such a patent troll, nobody uses their stuff anymore.

    • @KenSharp
      @KenSharp 4 года назад

      @referral madness It probably means you can't be bothered to use Google.

    • @Pasi123
      @Pasi123 4 года назад +1

      Socket 478 motherboards with RDRAM are really rare. Almost all of them use DDR memory

  • @jcnbw01
    @jcnbw01 3 месяца назад +1

I was working tech support for the original Xbox back in the day. In the months leading up to the 360's release, the hype and hoopla around it was astounding. When it came out, the 3RL issue gave me enough overtime to buy my first car.

  • @karlstenator
    @karlstenator 4 года назад +13

Argh! Pentium 4 brings pain!

    • @campkira
      @campkira 4 года назад

good thing i only used mid-level chips back then.... the Core 2 Duo also started heating up after a few years of use..

  • @SylveonMujigaeOfficial
    @SylveonMujigaeOfficial 2 года назад +1

    One thing that Nintendo never did was move away from RISC.
    Sony switched to the x86 architecture with the PlayStation 4.

  • @tHeWasTeDYouTh
    @tHeWasTeDYouTh 4 года назад +5

    2:47 the original plan for the Xbox was for the console to have 128MB of ram but Microsoft cheaped out last minute. It makes me mad even today. I always wondered why the original Xbox never got a CPU/GPU die shrink(180nm cpu and 150nm gpu) but I have never found any article from MS, developers or game media stating why this didn't happen. MVG do you know? Intel did shrink the Pentium 3 to 130nm and Nvidia had their GPUs made by TSMC so I am sure they could have moved the Xbox GPU to 130nm.

    • @campkira
      @campkira 4 года назад

they realized it would be too expensive and left no room for a hardware refresh

    • @jpesicka492
      @jpesicka492 4 года назад +2

Yes, the Sega Chihiro is essentially an Xbox with 128MB. You can upgrade an Xbox to 128MB, or have someone do it for you for about $100. It basically allows you to play Sega Chihiro arcade games; no other real benefit, considering Xbox games were designed and optimized to use only 64MB of memory. It does help with other emulation as well, but by now we have modded 360s/PS3s or a PC that can emulate the higher-end stuff.

  • @johnsimon8457
    @johnsimon8457 4 года назад +1

10:00 oh wow, a Burger Becky article about perf and Load-Hit-Store and coaxing the C++ compiler into not generating garbage. There’s quite a bit of difference between early and late X360 and PS3 games; counting cycles and pipeline stalls made the difference.
IIRC late-generation games like Gears of War 3 were known to cause overheating issues, because efficient use of the hardware means everything is being used at the same time; those 40 cycles of pipeline stall in that article meant a core wasn't being used, and thus less heat was emitted.
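For readers curious what a load-hit-store actually looks like, here is a minimal C sketch (an illustration of the general pattern, not code from the article). On an in-order core like Xenon, storing a value and loading it right back forces the load to wait for the store queue to drain; keeping the accumulator in a local variable avoids that round trip through memory.

```c
#include <stddef.h>

/* Stall-prone on in-order cores: the accumulator lives behind a
 * pointer, so each iteration stores to memory and then immediately
 * loads the same address back (a load-hit-store). */
void sum_lhs(const int *a, size_t n, int *out) {
    *out = 0;
    for (size_t i = 0; i < n; i++)
        *out += a[i];
}

/* Friendlier: accumulate in a local (which the compiler can keep
 * in a register) and write the result to memory exactly once. */
void sum_reg(const int *a, size_t n, int *out) {
    int total = 0;
    for (size_t i = 0; i < n; i++)
        total += a[i];
    *out = total;
}
```

Both functions compute the same sum; the difference is only where the running total lives, which is exactly the kind of cycle-counting rewrite the comment describes.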

    • @johnsimon8457
      @johnsimon8457 4 года назад

      ​@referral madness one of the co-founders of Interplay, industry vet going back to the 2600 and Apple IIGS
      Probably want to read "Masters of Doom" for the burger story. It's a name that keeps popping up around PC and 16 bit era games.

  • @purplefuku
    @purplefuku 4 года назад +11

    And now Apple’s jumping ship from Intel! (For good reasons, too...) It’s funny: AMD’s chips are soaring, Apple’s SoCs are going to be super exciting... and Intel is having die shrink trouble & similar thermal issues like the PowerPC line had... What a wild moment in time!

    • @AbdulBido
      @AbdulBido 4 года назад +5

Intel sat at the top of the hill and abused the shit out of us consumers, and even more so enterprises and industry. They milked us for almost a decade, ever since they won against the Athlons and Phenoms. Happiest person to see them go..

  • @Trollkonto
    @Trollkonto 4 года назад +1

    @MVG ATi didn't exist in 2013, they were by that time fully incorporated into AMD who bought them in 2006. ATi as a brand was used on some products until late 2010. Regarding the CPU used in the Xbox, it was a Pentium III with half the cache, not a Celeron.

  • @sadp1535
    @sadp1535 4 года назад +3

PowerPC just had a great price/performance ratio at the time; x86 could never match it. Remember, the Xbox 360 had to hit a $299 price tag and perform well in gaming, and for that price you could not get anything from Intel or AMD that would perform like this. IBM was the best partner to do this: reliable, strong, with decades of experience in custom designs. They were doing it very well with the GameCube, and they did it again with the Xbox 360, PS3 and Wii. There is a reason why manufacturers chose IBM.
I do not think that IBM is to blame for the overheating; the designers at Sony and Microsoft are. The GameCube/Wii/Wii U did just fine.
Yes, that PPC 970 chip was hot as hell, but most fat PS3s failed because of the NEC/TOKINs and not because of a failing Cell. The real problem was the shitty thermal paste under the IHS, which dried up after long use. A lot of people think the RSX is to blame for YLOD, but it is actually the Cell overheating because of that shitty thermal paste. Sony designed the PS3 poorly: they jammed that hot 380W PSU into a small case, right next to the chips, which added even more heat to those heat monsters. And even though the fat PS3 design is shit, if you took care of your system, delidded it, and replaced the failing NEC/TOKINs, even CECHA models work to this day. Sony only solved their problems with the PS3 Slim; of course the 45nm redesign lowered the heat drastically, but that was the price. Remember, guys, they designed that chip in 2002-2005, and Intel had a heat monster of their own creation as well, called the Pentium 4 and later the Pentium D.
Same with the Xbox 360: they jammed all of those components into an even smaller case than Sony did, and they expected it not to overheat? MS is to blame, not IBM.
Both Sony and MS sold their consoles below manufacturing cost. At the time, PC prices were just too high, and even though PPC was cheaper, it still cost a lot.
The Wii U was a failure, to be honest; that chip was just too old at that point. IBM did not have anything acceptable in 2010-2011, which is kind of strange considering PowerPC chips were in 200-300 million gaming devices. Why the hell did IBM not invest some money and effort in R&D to improve their PPC gaming architecture? They had the market share and the money; they got paid for every one of the 200-300 million units those companies sold, and they just sat and did nothing for 12 years with their architecture. That's why x86 won: IBM fell asleep. They had Apple and the consoles in their hands, and they dropped the ball. Apple predicted this well and foresaw what would happen; the Core 2 architecture was such a huge leap that even today those chips perform decently, which is kind of mind-blowing. For example, my MacBook Pro and iMac, both with C2D chips, run so well with a modern SSD. A 14-15 year old chip working well even to this day... is amazing..
To be honest, the Cell was a technical marvel; it could not be matched for some time. It was created in 2005 and was so powerful it's amazing. And it is a creation of Sony, IBM and Toshiba. The bottleneck of the PS3 was the nVidia GPU; with a better GPU I believe it could even play modern games. Look, they went the safe route with the PS4, but the CPU power of the PS3 is stronger than that low-end x86 CPU in the PS4, which just shows that PowerPC was amazing and really could have been an option, if not for the lack of interest from IBM. Both Sony and Microsoft decided to offload most of the processing to the GPU side and chose a weak x86 CPU for the calculations that could not be done on the GPU. I do not buy the claim that developers were more comfortable with x86 than with PPC. Look, 300 million consoles were sold across various manufacturers, plus Apple users on that platform, so PPC certainly had developer interest. They chose x86 because it was easier than PPC, but most importantly because it had better performance per price; IBM just dropped the ball. Manufacturers chose the code-optimization route, which Nintendo did for years starting with the GameCube: they did not give a shit that the console was weaker, but they made some of the best games. Look at late-cycle PS3 games; they were amazing. Even on the Wii U, which was a failure, they managed to make some amazing games. It's all about code optimization; hardware is nice, but you really care about price/performance plus code optimization, and that's why we got x86 APUs in the 8th gen. That's why we got x86 APUs in the 9th gen too, because for their price those APUs cannot be matched, just like x86 could not match PPC in the 6th gen.
To conclude, it is kind of amazing that IBM had such an impact on gaming and content-creation history. They had it all and they dropped it all, which is kind of sad, because we need competition to x86; competition is good for us consumers and it makes technology move forward. Now we see the same thing happening with x86: Apple has already announced they are dropping x86 for ARM, which I think is a good call. x86 was a PPC killer and now ARM is an x86 killer; same old problems, too much heat and incremental improvements. We really have not gotten anything major since Sandy Bridge was introduced in 2011; we only got more cores, more clock speed, faster RAM, faster storage on these new platforms. At least AMD kicked back and provided chips like the R5 1600 AF / R5 3600 and these amazing new R5 3100s; they are amazing for their price, and Intel is down on its knees now. I would not be surprised if at some point even consoles drop x86 and move on, but for now, and probably for the PS6/XBOX1_V3, it is going to be x86, because the price/performance is unmatched at the high end. At the lower end, lower-wattage ARM is already better than anything x86 can offer; look at the A13 chips from Apple, they are cheap yet have so much power that they can match a 15W Intel CPU at 7.5W. It's just a matter of time until they learn how to scale those ARM chips and outperform x86 even in the higher market.

  • @jgsh8062
    @jgsh8062 4 года назад +2

    I love PPC and ARM and MIPS. Similar instruction sets and way more pleasurable to program in than x86

    • @jajkomaster
      @jajkomaster 2 года назад

      Exactly, PPC gets hugely misunderstood nowadays. It didn't disappear from the consumer market because it was bad, it was 90% due to mishandling its potential and 10% due to Intel getting their sh*t together with Core cpu lineup 🤷🏼‍♂️

  • @brontoenator
    @brontoenator 7 месяцев назад +5

I really miss when each console manufacturer tried a completely different design. Today everything is the same. SO BORING.

    • @charlesg5085
      @charlesg5085 6 месяцев назад +1

I don't understand that. When you play a game, you can't even tell what CPU architecture it is running on. I am more interested in the ability to load Linux or custom-upgrade the hardware.

  • @Lethaltail
    @Lethaltail 4 года назад +2

    Well if former WebTV staff were directly involved, that would explain that one picture with the guy in the WebTV jacket standing next to all the 360s connected to Lamprey boards.

    • @Lethaltail
      @Lethaltail 4 года назад

      You'll likely find that image if you search for "Xbox 360 Lamprey"

  • @andresbravo2003
    @andresbravo2003 4 года назад +5

It makes sense that the Xbox 360, PS3, GameCube and Wii all had PowerPC processors, similar to the IBM-powered Macs. How great was it? Well, the difference here is history...

  • @MrNagant007
    @MrNagant007 5 месяцев назад

I used to surplus computer hardware at an old job I had. I had to open up printers to verify there were no hard drives that needed to be destroyed. I can't remember the model anymore, but there was one printer series we had that used a PowerPC 750. Always got a kick out of that.
In one machine it's playing state-of-the-art games; in another it's just running a webpage and listening on the network for print jobs.

  • @uSaschka
    @uSaschka 4 года назад +13

    There are two types of comments here:
    - Making jokes of XBox or PowerPC related stuff
    - OMG TOUHOU MUSIC. THIS IS HEAVEN!!!

  • @mrpyrostorm3165
    @mrpyrostorm3165 4 года назад +1

At the start of a new gen, videos like this are a good reminder that sometimes it's worth a bit of a wait to ensure the heat issues are taken care of before we buy.

  • @RGD-Audio-Repairs
    @RGD-Audio-Repairs 4 года назад +9

also, personally i don't think that Sony's PS3 ran that hot tbh... A lot of the YLOD issues weren't to do with heat and solder balls, like the Xbox 360 had..
I bought a job lot of YLOD PS3s the other week on eBay. None were solder ball issues... they were all repaired by me, simply by replacing the NEC/TOKIN capacitors with tantalum capacitors...

    • @FeeLtheHertZ
      @FeeLtheHertZ 4 года назад +1

They ran extremely hot, dude, what the hell are you talking about!? They were toaster ovens, with heat that would make your hands sweat if you held one by the vents. Maybe if you only worked on non-launch models, then yeah, your statement's accurate.
No, those things could produce heat signatures seen from space.

    • @ayuchanayuko
      @ayuchanayuko 4 года назад

Could you help me out? Tantalum caps aren't available where I am, and the available ones are both expensive and low-capacity.
Would a regular electrolytic capacitor work? Tantalums basically should operate like electrolytics. If you could try it out so I can fix my PS3, that would be great :D

    • @solidsnakeandgrayfox
      @solidsnakeandgrayfox 4 года назад

      Man ran ridiculously warm.

    • @RGD-Audio-Repairs
      @RGD-Audio-Repairs 4 года назад

@@ayuchanayuko they technically would, yep. The problem is, standard capacitors are too big and bulky; they need to be surface-mounted to the motherboard.

  • @Tomiply
    @Tomiply 4 года назад +1

    Small correction: ATI wasn't known as ATI when the PS4 and Xbox One came out. They were acquired by AMD in 2006, so they were AMD from that point forward.