Gabe Newell does not enjoy developing for the Cell Processor, especially with its SPEs. These days the Source engine is multi-threaded, so it's not as bad.
@@BlindBison The PS3 (and PS2) were simply extremely developer unfriendly for the first few years of their lives. The debugging tools weren't just unfriendly; for the PS2 vector units and PS3 SPUs there weren't any. You were just supposed to write bug free code. It was ridiculous and nothing Gabe said was wrong. Microsoft provided great tools from day 1.
@@jc_dogen What doesn't make sense is that the specs for the initial QS20 Cell prototype had been available since at least 2004/2005, when the errata for the architecture had been finalized and the design approved for shipment. Why couldn't Gabe or other devs, knowing the future was on the horizon, make even a slight attempt to perform some in-house coding sessions to simulate how well they could write vectorized code to run on multiple SPEs in tandem, in addition to low-level C/ASM code for the GeForce 7800 GTX (which the RSX was based upon)? That way, they could have at least had a head start before the system was released even without Sony-provided SDKs.
Even so, only certain developers were offered assistance in development. Who was helped the most? Activision, Ubisoft, EA. Only the AAA devs and in-house studios. F all the little people.
The PS3 was a nightmare to program for because multithreaded CPUs were still a brand new technology that devs weren’t used to. But there needed to be a trooper at some point to move the industry forward technologically speaking. And the devs’ hard work finally paid off in the second half of the PS3’s life. We started to see so many PS3 games that looked incredible and just as good as some PS4 games.
Oh, really? Looks like the Xbox 360 didn't have those problems. Or maybe, just maybe, what Gaben is saying is correct and Sony overreached by letting Ken Kutaragi go wild on this.
It's the memory model used by some of these massively parallel systems that causes the problem. Had they simply gone AMP with 2-3 OOO PowerPCs, each with two sets of AltiVec pipelines, they would've performed much better. That's what Xbox did, and they got more games, more exclusives, and better use of the actual hardware. They could even have added another 2-3 G4s to each processor to get much better and more usable performance. And that's similar to what Intel and AMD are now doing with their x86-64.
@@garykildall4111 Power consumption was 18 Watts in 2005, and you usually run one core at a time. Under a heavy 3D graphics load, you're still not at 55W, and those don't need a separate fan for cooling--just a heat sink and thermal paste. 100W power supplies were still being used in laptops, and didn't cost that much.
CELL would have been fine if people weren't expecting to port traditional SMP software to it. CELL was more forward looking, more scalable: if you built one on a modern process it would have 64 cores. It would be superior to a CPU or GPU for the AI & vision workloads that are set to change the world. These kinds of chips will re-appear as AI accelerators outside of games, but if people hadn't bitched so much about the PS3 we'd be ahead with that. The software tools are better now (clang/llvm).
+walter0bz I see Cell as a step between a CPU, having very complex instructions, branch prediction, all that good stuff, with just a few cores, and a GPU, just having as many threads as possible with relatively simple design. You have a fairly standard PPC core on die with more specialized floating point processors. Looking forward, GPUs have advanced greatly, and computer vision, neural net workloads are accelerated quite well on them already. Nvidia's ARM SoCs that have integrated GPUs offer incredible performance for not a lot of power consumption, much more so than Cell. I'd be interested in seeing what specifically in clang/LLVM has improved writing code for the Cell BE, as I can't think of anything specific.
Re: clang/LLVM, I'm referring to the availability of a C++ AST, and IR, for writing custom tools for parallelism (I'd have killed for that 10 years ago lol); also today C++ has lambdas. Between these it would be easier to write dataflow abstractions (not just parallel, but DMA). Yes, GPUs can do AI/vision (better than CPUs), but further improvement is possible with a dataflow oriented chip (i.e. direct communication between multiple on-chip memories instead of caches). This is being addressed now with new chips: Movidius, Eyeriss, Kalray (see the 'rise of the VPU' articles); IBM's TrueNorth takes this idea to the extreme with dataflow and a neural processor, but I'd prefer von Neumann machines with packed SIMD etc. as a middle ground between specificity and versatility. I agree that CELL is 'intermediate between CPUs and GPUs' along one axis, but there's another axis: feed-forward/dataflow vs random access. Dataflow can handle vertex processing and image processing (e.g. final combines of deferred rendering) just fine. On this axis CPUs and GPUs are similar in that they're designed for completely random access, through caches. (GPUs hide latency with massive threading, CPUs with OOOE, and need hardware/memory to buffer states for that.) On a dataflow oriented processor you can reason about the locality (e.g. see the debugging tools IBM is making for showing the spatial allocation of a computation across their on-chip network, in the TrueNorth video series). Imagine something with that control but von Neumann nodes (e.g. Adapteva Epiphany; I'm looking forward to their next RISC-V based chip). Note that modern SoCs always have some other DSP units to complement their CPU & GPU. This is where CELL belonged, but if wired up right it would have handled vertex & image post-processing just fine.
walter0bz I'm pleasantly surprised that we have gotten to have a conversation about this topic. The AST and IR provided by Clang/LLVM definitely are a big help. Sony / Toshiba / IBM could've put more work into the tooling with Cell to help developers take advantage of the SPEs I think, but having a well defined IR they could've based their optimizer around would've made their jobs easier. Processors like those you mentioned certainly perform great in their domains. These certainly have incredible advantages, but have a higher developer knowledge cost of entry, as well as actually getting your hands on these chips to mess around with them is more difficult. Eventually we will be seeing these in dev boards, and I think then they'll catch on even more. I believe that when someone works on something in their free time, they tend to be more revolutionary, because they're not bound by rules. Cell, as shown by things like the Gravity Grid, certainly has incredible power that can be unlocked with just more work. I think that the real problem about the PS3 at the time was that developers didn't want to make themselves too bogged down with Cell's slightly more complex design, as they wanted to be performant on other machines. If Cell had gotten more action as purely an accelerator to work in high performance computing applications, it certainly could've taken off, as this was a time that GPGPU was still fairly new. Thank you for this conversation, it showed me new information about upcoming processors.
To revive this discussion, and maybe get a few pointers: it seems like what Gabe is complaining about could have been solved with proper OoOE and a proper cache, but the SPEs never supported either. I am sure there is a TECHNICAL reason for it, but I never understood why. If anyone knows or can provide references, that would be great.
'Proper cache': the problem is that caches limit scalability (cache coherence). Their design was forward looking and would scale to thousands of cores. See the Adapteva Epiphany V, a just-developed 1024-core chip similar to CELL. Their chip is for AI, etc.
The issue is that the Cell is an impressive piece of hardware. But as a software programmer myself, I understand how hard it is to move to new platforms where there isn't really a great set of instructions for how you manage errors. The Cell is a really strange piece of hardware. Most developers were used to standard PowerPC elements as multiple cores, with an easy compiler that pushed tasks between the cores. The SPE approach is kind of brilliant if you understand it, but it's a huge hassle to deal with. Sony's studios used it very well, but most third parties were not willing to massively relearn their engines around the SPEs. You literally go from a compiler that can push your tasks across three cores, as on the 360, to a set of novel procedures needed to take advantage of the SPEs. Sony thought they could make all developers develop this way because they thought they would be market leader. But the Xbox was far more similar to actual game development. The moment the Xbox was more successful, the Cell kind of died.
Programming for the new PPE/SPE system was very challenging. Not that they couldn't; it just wasn't like a normal multithreaded environment on modern architectures. You don't just split everything among all the cores / processing units: the SPEs were only good at certain things, and programming for them was complex. You could go from 10-20 lines of code running everything on the main PPU to almost 100-300 lines just to offload things like AI or other elements to the SPEs and make it work. There is some great info on this from various PS3 developers and emulation developers who go over the complexities of the PS3. I'm no expert, but it was eye opening to see how different the Xbox 360 was vs the PS3.
bruh, the PS3 is a legitimate parallel-processing-capable device, unlike the x86-based AMD and Intel chips, which are just multithreaded and still sequential execution. It's a step backward
Gabe: " My most hated thing about the system is the number in it, I've never seen it in my life, i have know idea what it means and it stresses me out everyday when anyone mentions it which is literally every 40-60 mins"
He loved PlayStation so much that the PlayStation 3 (after the incredible PS2) gave him PTSD! 😆 Even as a Nintendo fan myself that still played with friends (for real, and thankfully, we all had different things at different times) on Arcade, 486 PC, Mega Drive, Sega Saturn, PlayStation, Dreamcast and PS2... PS3 PTSD is something I can relate to, because we all already had +/- nice PCs at the time and the PS3's proposal was nothing more than: - Hey! Do you wanna own a locked, average PC that is inflated in price and offers much less than just a regular self-built PC? - Well... thanks, but fuck no!🤣
@ The games themselves would have been possible with some graphical cuts here and there on the 360. It's not like the PS3 was able to do stuff like AI or whatever that was completely impossible on the 360.
@@BlindBison Just because the 360 technically didn't have the same theoretical peak doesn't mean it couldn't have had versions of TLOU/Killzone 3 that were very close to, if not indistinguishable from, the PS3's. The 360 was quite capable and less of a headache. Also, no developer would call a machine where it takes 400 lines of code just to print "Hello World" elegant. The PS3 was needlessly complex and it nearly sunk Sony. If Sony had made the PS3 with a simpler architecture and included backwards compat and Blu-ray, they wouldn't have nearly flopped.
The PS3 was so much ahead of its time! And definitely much better than the Xbox. It just needed time for developers to learn modern coding. Yeah, it is easier to write for something with three identical cores, but in the end the PS3 had more potential, which Naughty Dog and others tapped, releasing games that had no competition.
OH NO, WE HAD TO LEARN SOMETHING NEW. What to do? Throw it away! Why? Was it bad? No, it was actually an exciting direction to take. The reason is: see the first line.
Gabe actually talked about this: that learning the PS3's architecture was needless because you'd never use it for anything else. It was this extremely convoluted and specific system while everything else was much easier. Had Sony made a more traditional system, they would have killed the 7th gen
@@FraserSouris This is most likely why modern gaming systems are based on similar architectures, divided into two categories: home systems using x64 (PS4, Xbox One and PC) and portable handhelds using ARM (Switch, Android and iOS).
@@UltimateAlgorithm Yeah. The idea is to make it easy to design and port games on different platforms. Having systems with completely different architectures is not conducive to that as you're spending more time getting the damn thing to run in the first place rather than designing the game and optimizing stuff
@@decriper1097 Do you even know what RAM is lmao. The PS3 has 256MB RAM plus 256MB VRAM; the Xbox only has 512MB of unified RAM. Both would perform almost the same. The only thing that makes the PS3 more powerful is its Cell CPU.
Simply not true. The RSX in the PS3 is garbage. The CPU was harder to develop for, but in theory more capable. The memory configuration was inferior to the 360's. In no reality was the PS3 overall more powerful than the 360.
After learning more about Cell, I can say that developers back then didn't really know, or rather didn't want to tackle, what could have been an ongoing feat for the Cell CPU. It's such a shame it's gone. Now this guy's just raking in game distribution sales and not giving two shifts about future development of games and technology. Their hardware ducked as well. A prime example of a lost potential architectural achievement STI developed. Sigh.
Imagine The Last of Us 2 running on a Cell but with a few dozen cores. Naughty Dog started to master that chip, but then the PS4 came out and killed off that possible future
because in 2006 and 2007 it was a poorly documented platform; even Sony realized and admitted they didn't give much attention to how to use their mixed platform
@@yol_n The Portal 2 port was made in house, so they had to give it their best shot. With a custom script you can actually get the PS3 version (and I guess the 360 version as well) to reach 60fps or even more, though typical Valve double buffering makes the experience not so pleasant.
Sony reverted back to a CPU + GPU architecture in the PS4 instead of the PS3's CPU + coprocessor + GPU. That alone tells us the Cell processor is less than optimal for game development. Besides, the things you do on Cell you can do on a modern GPU anyway, since modern GPUs support GPGPU operations, which are SIMD much like Cell. This is why the Xbox One and PS4 can get away with very weak CPUs: some of the computational hard work is offloaded to the GPU.
I'd love to hear how he feels now considering how big ARM has gotten, which is a reduced instruction set like Cell. AKA Gaben always had a hate boner for Sony
@@KingMacintosh2 "console games nowadays are DEVELOPED ON PC" Doesn't that make developing console games easier than before? The reason I said PC games are harder to develop is because you need to write codes to make the games work on many hardware (can you list down the Nvidia cards from the Fermi-era to now?) Consoles on the other hand have the exact same hardware configuration on every unit, which make them easier to develop games for
@@Reed_Peer You understand that hardware compatibility is handled by APIs and drivers, right? Game developers only need to worry about optimizing performance for an array of hardware specs; that's the only unique difficulty that comes with developing for PC.
This man literally hates the number 3.
LOL
"3 a bitchnugget" - 2gang
Half Life 3, Portal 3, CS 3, TF 3...
He doesn't hate the Xbox 360
@@Dncmaster Does he dislike it now? We haven't received any more updates for their console ports in 11 years. I thought the Xbox 360 was actually easy to develop games for and all that.
0:00 Wait... what number?!?!
lol
@@wiiagentlmao even
Daniel T. Gaming
I think he must have got used to it, by the time Portal 2 came out it seemed to be their preferred platform, by porting Steam over to PS3.
Yes, Gaben had made his life easier by that point, making the Source engine multithreaded. Additionally, the fact that Sony allowed Steam integration made Gaben accept the horribly complex environment.
There's a clip of him at E3 2010 announcing it for PS3; he does briefly mention how critical he used to be of the PS3 there.
Another simpler multicore processor may have been easier on developers, but one thing that did come out of this was that it dragged developers kicking and screaming into the multithreaded era, which was a net positive.
It might have been easier if Sony had made good documentation and libraries to work with their hardware.
That's what the Xbox 360 was: 3 PowerPC cores.
I didn't think of it like that. Still, you could have a multithreaded processor without the complex architecture of the ps3 processor. The PS4 is a testament to that.
He wasn't arguing against multithreading as much as that in standard multithreaded programming a junior dev isn't going to outright destroy performance. With the SPE cores, it elevated the minimum skill for all devs on the project. As mentioned, the Xbox 360 was also multithreaded, but fared far better. I wonder if a better compiler could have helped. Staying within register and other architecture limitations shouldn't be the task of the programmer on any modern software platform. Such knowledge and skill can be a bonus for optimization, but modern software is too complex to work at scale while having to micromanage those details.
@@JimBob1937 The Cell processor was so interesting that 18 years after its PS3 debut and 8 years after my comment people are still talking about it 😅
It's almost funny looking back now how PS3 was in this position where it had amazing looking first party titles, but 3rd party games pretty much exclusively ran better on Xbox 360. It was so capable but, understandably, underutilised.
Because which third party dev is going to put in the time to create a game build particularly optimised for a single platform when they're trying to make 2 others and launch everything in a year? If Sony wanted better performance they should have built libraries and an SDK that allowed a greater amount of abstraction, and less SPE micromanagement.
@@xmlthegreat real men know that x86 is more complicated than PPC, but Intel actually provided sufficient documentation, as you mentioned. Sony did too little too late, and by then they decided to move onto the Jaguar cores, which held back so many games from 60fps just because of the CPU bottleneck
They had amazing looking first party titles - several years into the system's life. If we look at the last years of that generation and (for example) compare The Last of Us on PS3 with Halo 4 or Gears of War: Judgment on Xbox 360, they're honestly pretty comparable. It's not like the PS3 blows the 360 out of the water. Saying the system is underutilized is kind of a funny statement; what those Sony studios pulled off in no way gives credit to the PS3 hardware. Literally everyone hated that system. There is a reason no one designs systems like that anymore, and no one likely will
The reason why we never got half life 3
yeah the "3" was wasted here when he used it after the "playstation" LOL
they teased half life 3 in Half Life alyx's ending
You dumb?
@@bvchecha262 You can't take a joke?
@@thawkade It didn't seem ironic to me
And now he is probably thankful that he had to learn multi thread code.
Nobody is happy knowing how to do that. No exceptions.
PS3 was still trash
The issue wasn't multi-threading or multiple cores. The issue was specifically the Cell processor. At the time I was in college and we were preparing a course on this subject. Luckily I didn't do any coding on it, but my friends who did never had anything good to say about it. It was problematic, to say the least. You had the PPE (Power Processor Element), which was used for job scheduling and management. The SPEs, or Synergistic Processor Elements as they called them, were actually in charge of these jobs. An SPE executes relatively small and simple programs in single or double floating point precision.
So to make your program work on Cell you had to split it into really small independent chunks, where each chunk does one thing for a short time with a limited amount of memory, executed in parallel with the others. These SPE cores had their own local memory of ~256kB, and results later had to be merged back out through the L1 and L2 caches and main memory. The PPE supposedly manages all this and merges the results, while at the same time running the main application and managing memory.
They called it flexible; we called it a nightmare. You had so many race conditions and issues which were almost impossible to solve, not to mention memory problems and synchronization. To make matters worse, it wasn't much faster than the general purpose multicore CPUs on the market, and with the advent of GPU general purpose programming platforms it became obsolete before it had the chance to prove itself.
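The chunking pattern described above can be sketched in plain C (all names here are hypothetical; on real hardware each chunk would be DMA'd into a separate SPE's local store via IBM's SDK and the jobs would run concurrently, rather than looping on one core):

```c
#include <stddef.h>

/* Each SPE had roughly 256 KB of local store, shared by code and data,
   so a job's working set had to fit in a fraction of that. */
enum { CHUNK_FLOATS = 4096 };

/* A small, self-contained "SPE job": one simple operation on one chunk. */
static void spe_job_scale(float *chunk, size_t n, float k)
{
    for (size_t i = 0; i < n; i++)
        chunk[i] *= k;
}

/* The "PPE side": split the work into local-store-sized chunks and hand
   them out. On real hardware the PPE would kick the jobs off in parallel
   and merge the results when each SPE reported completion. */
static void scale_all(float *data, size_t n, float k)
{
    for (size_t off = 0; off < n; off += CHUNK_FLOATS) {
        size_t len = (n - off < CHUNK_FLOATS) ? (n - off) : CHUNK_FLOATS;
        spe_job_scale(data + off, len, k);
    }
}
```

The hard part the comment describes is everything this sketch leaves out: the chunks running truly in parallel, the explicit DMA in and out of local store, and the synchronization when results are merged.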
It was ahead of its time @@pierreo33
@@pierreo33still had better games than shitbox 69420
Cell was a supercomputing chip. They basically took AltiVec, which was the SIMD component of a CPU, and built cores based on that. They are 100% SIMD processors. They had their own compiler and toolchain. To run code on them you had to load the program into SPE local storage and run it. You had to talk to the program using a messaging system. Access to main memory had to be orchestrated using DMA commands.
It was very cumbersome. It was almost like trying to write game code using GPU shaders.
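That messaging system was built on hardware mailboxes: short FIFO registers through which the PPE and each SPE exchanged 32-bit words. A toy single-slot model in plain C (names hypothetical; the real interface was libspe's mailbox API plus MFC DMA commands, and reads/writes could block in hardware):

```c
#include <stdint.h>

/* Toy single-slot mailbox: one side writes a 32-bit word, the other reads it.
   Real Cell mailboxes were small hardware FIFOs between the PPE and an SPE. */
struct mailbox { uint32_t word; int full; };

static int mbox_write(struct mailbox *m, uint32_t w)
{
    if (m->full) return 0;       /* real code would stall or poll here */
    m->word = w;
    m->full = 1;
    return 1;
}

static int mbox_read(struct mailbox *m, uint32_t *out)
{
    if (!m->full) return 0;      /* nothing to read yet */
    *out = m->word;
    m->full = 0;
    return 1;
}

/* Typical protocol: the PPE mails the SPE an address or command word, the SPE
   DMAs the data into local store, works on it, then mails back a status word. */
```

Passing addresses and status codes one word at a time, with all bulk data moved by explicit DMA, is what made the model feel so far from ordinary shared-memory threading.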
@@Giulio-Romeo That was my point. It's not a good chip for gaming.
@@ben_spiller It absolutely was a good chip for gaming, just not for junior (or mediocre) developers.
SIMD is awesome, most compiler-optimized code relies on it. Sure, going 100% with SIMD and branchless code is harder, but it's far from impossible. We do it all the time for collision and other such problems. The more developers learn these kinds of operations, the better.
Fewer games would be CPU capped if more developers knew what they were doing (especially with cache, SoAs and all that). Frankly, it's embarrassing. And at the rate GPUs are outpacing CPUs, pushing load onto shaders is only going to grow in relevance. Parallel processing is the future, and it was already the future decades ago.
If gamedevs weren't having their time wasted in university learning OOP, "clean code" or some other dogmatic nonsense, we wouldn't even be discussing this right now. This is already known by serious engine teams, but it should absolutely go wider than that.
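To make the "SoA and all that" point concrete, here is a minimal plain-C sketch (field names made up for illustration): structure-of-arrays keeps each field contiguous so a SIMD load can grab four values at once, and a branchless select avoids data-dependent jumps.

```c
/* AoS: fields interleaved. Loading four consecutive y's requires strided
   access, which defeats straightforward SIMD vectorization. */
struct ball_aos { float x, y, radius, pad; };

/* SoA: one contiguous array per field; four y's sit next to each other,
   ready for a single packed SIMD load. */
struct balls_soa { float *x; float *y; float *radius; };

/* Branchless floor collision: no if statement, just a select the compiler
   can turn into a max instruction and vectorize across the whole array. */
static void collide_floor(struct balls_soa *b, int n, float floor_y)
{
    for (int i = 0; i < n; i++) {
        float lowest = floor_y + b->radius[i];
        float y = b->y[i];
        b->y[i] = (y < lowest) ? lowest : y;
    }
}
```

The same loop over `struct ball_aos` would touch four times the memory per useful float and leave the vector units mostly idle, which is exactly the kind of gap the comment is complaining about.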
@@pierrelaroche1441 bro necro'ed a month old comment just to rant about how devs should know how to work with a cumbersome system, as if this video, which was released 11 years ago, will catch the eyes of anyone who can do anything with this knowledge. Now, I know nothing about coding or the software/hardware side of game design, but I'm pretty sure if everyone is universally having a problem with something, maybe they aren't the problem. Again, I know nothing, but I'd bet that the knowledge you have now, in 2024, maybe wasn't as available to game devs who had to work with this system 17 years ago.
Yeah, that's the same BlackJack Dealer memory model that killed the Saturn and delayed PS2 games for a year or more.
@@MrTophatcat Shut the shite up mate, your postulate that "if everyone is universally having a problem with something, maybe they are the problem" is flawed. Sometimes, and most often, incompetence occupies the whole landscape, so everyone who has the problem actually is the problem.
"You have no idea why that is happening it is just magically running really really slow"
So that's what happened to Skyrim.
I never even played that shit and I remember they had to cancel DLC for that entirely on the PS3 because the developers said, "You know what? No. Just NO. N. FREAKING O." and that's what happened ladies and gentlemen. I do not lie.
TL;DR: Something about ram.
In Skyrim on PS3 just not enough RAM.
to every multiplatform on PS3 basically
to be honest I bought skyrim on launch day november 11, 2011 for ps3 and played it every day for a month and never encountered an issue. I think the only bug I found was walking up a mountain, I clipped through it, but I was fine and the game didn't break
I know a decade has passed since those games were released, but I can't scroll past this comment section without saying that all Bethesda games were a disaster on PS3 (I have the platinum trophy in Fallout 3 and New Vegas), so I can say how they hold up in the long run (like shit). The major problems were the extremely low RAM (256MB) and the critical save file size (more than 8MB); those games were close to unplayable once the save passed 9MB, and even on a normal save they ran at 25fps. I played Skyrim on PS4 and that was a good experience.
Nowadays multithreading is pretty common, but back then it was different.
The code/architecture of the console is so complex that developers struggled to make their games run at even 30 fps,
so complex that even Sony can't make their own PS3 emulator run on the PS4/PS5.
@@alejoyugar the complexity is simply a challenge, not a roadblock. Sony could make it work if they wanted, but they don't want to. More lucrative to remaster and remake everything.
Multi-threading is different from optimizing code for SPEs. SPEs are very specifically vector processors, and you need a problem that works well with vectorization. GCC didn't automagically generate SPE code from existing data like it does with SSE, or now with AVX. That's why many developers just ignored the SPEs and did the vectorization code on the Nvidia GPU (using shaders) rather than figure out how to make it run on the SPEs.
I agree that this was a huge problem that was the killer of the PS3. Developers did not want to try to make games because of the SPEs.
@@BlindBison The PS3 (and PS2) were simply extremely developer unfriendly for the first few years of their lives. The debugging tools weren't even just unfriendly but for the PS2 vector units and PS3 SPUs there weren't any. You were just supposed to write bug free code. It was ridiculous and nothing Gabe said was wrong. Microsoft provided great tools from day 1.
@@jc_dogen What doesn't make sense is that the specs for the initial QS20 Cell prototype had been available since at least 2004/2005, when the errata for the architecture had been finalized and the design approved for shipment. Why couldn't Gabe or other devs, knowing the future was on the horizon, make even a slight attempt to perform some in-house coding sessions to simulate how well they could write vectorized code to run on multiple SPEs in tandem, in addition to low-level C/ASM code for the GeForce 7800 GTX (which the RSX was based upon)? That way, they could have at least had a head start before the system was released even without Sony-provided SDKs.
Even some Japanese devs preferred to release their games on the 360.
Even so, only certain developers were offered assistance in development.
Who was helped the most?
Activision, Ubisoft, EA.
All the AAA devs and in-house studios only.
F all the little people.
The PS3 was a nightmare to program for because multithreaded CPUs were still a brand new technology that devs weren’t used to.
But there needed to be a trooper at some point to move the industry forward technologically speaking. And the devs’ hard work finally paid off in the second half of the PS3’s life. We started to see so many PS3 games that looked incredible and just as good as some PS4 games.
Oh, really?
Looks like the Xbox 360 didn't have those problems.
Or maybe, just maybe, what Gaben is saying is correct and Sony overdid themselves letting Ken Kutaragi go wild on this.
Very true Gabe they fucked it up but they made up for it with the 4th system ...
Now developers hated Microsoft for xbox series s
*a few certain developers, not the whole damn industry
It's the memory model used by some of these massively parallel systems that causes the problem. Had they simply gone AMP with 2-3 OOO PowerPCs, each with two sets of AltiVec pipelines, they would've performed much better. That's what Xbox did, and they got more games, more exclusives, and better use of the actual hardware. They could even have added another 2-3 G4s to each processor to get much better and more usable performance. And that's similar to what Intel and AMD are now doing with their x86-64.
3 G4s per processor! Might have been fast but might’ve needed a 1200W power supply 😅
@@garykildall4111 Power consumption was 18 Watts in 2005, and you usually run one core at a time. Under a heavy 3D graphics load, you're still not at 55W, and those don't need a separate fan for cooling--just a heat sink and thermal paste. 100W power supplies were still being used in laptops, and didn't cost that much.
Xenon was not out of order, if I'm not mistaken.
@@Aggrofool Xenon was a line of server versions.
I love how Gabe is complaining about a product that has the number 3 in it
Sony should update the whole CELL project for the PS4, fixing all these kinds of problems.
What do you mean? PS4 doesn't use the CELL processor
@@MichaelM28learn to read.
Are you stupid?@@MichaelM28
Now imagine how it is to reverse-engineer it and make an emulator
RPCS3 did it!
Xbox 360: easy
Ps3: hardcore
CELL would have been fine if people hadn't expected to port traditional SMP software to it. CELL was more forward-looking, more scalable: if you built one on a modern process it would have 64 cores. It would be superior to a CPU or GPU for the AI & vision workloads that are set to change the world. These kinds of chips will re-appear as AI accelerators outside of games, but if people hadn't bitched so much about the PS3 we'd be ahead with that. The software tools are better now (clang/llvm).
+walter0bz I see Cell as a step between a CPU, having very complex instructions, branch prediction, all that good stuff, with just a few cores, and a GPU, just having as many threads as possible with relatively simple design. You have a fairly standard PPC core on die with more specialized floating point processors. Looking forward, GPUs have advanced greatly, and computer vision, neural net workloads are accelerated quite well on them already. Nvidia's ARM SoCs that have integrated GPUs offer incredible performance for not a lot of power consumption, much more so than Cell. I'd be interested in seeing what specifically in clang/LLVM has improved writing code for the Cell BE, as I can't think of anything specific.
Re: clang/LLVM, I'm referring to the availability of a C++ AST and IR for writing custom tools for parallelism (I'd have killed for that 10 years ago lol); also, today C++ has lambdas. Between these, it would be easier to write dataflow abstractions (not just parallel, but DMA).
Yes GPUs can do AI/vision (better than CPU): but further improvement is possible with a dataflow oriented chip (i.e. direct communication between multiple on chip memories instead of caches).
This is being addressed now with new chips- Movidius , Eyeriss, Kalray (see articles 'rise of the VPU'); IBM's true north takes this idea to the extreme with dataflow and a neural processor, but I'd prefer von-neuman machines with packed SIMD etc as a middle ground between specificity and versatility.
I agree that CELL is 'intermediate between CPUs and GPUs' along one axis, but there's another axis: i.e. feed-forward/dataflow vs random access. Dataflow can handle vertex processing and image processing (e.g. final combines of deferred rendering) just fine. On this axis CPUs and GPUs are similar in that they're designed for completely random access, through caches. (GPUs hide latency with massive threading, CPUs with OOOE, and need hardware/memory to buffer states for that).
On a dataflow oriented processor you can reason about the locality (e.g. see the debugging tools IBM are making for showing the spatial allocation of a computation across their on chip network, in the true north video series). Imagine something with that control but von-neuman nodes. (e.g. adapteva epiphany, I'm looking forward to their next RISC-V based chip..)
Note that modern SoCs always have some other DSP units to complement their CPU & GPU.
This is where CELL belonged, but if wired up right it would have handled vertex & image post processing just fine..
walter0bz I'm pleasantly surprised that we have gotten to have a conversation about this topic. The AST and IR provided by Clang/LLVM definitely are a big help. Sony / Toshiba / IBM could've put more work into the tooling with Cell to help developers take advantage of the SPEs I think, but having a well defined IR they could've based their optimizer around would've made their jobs easier.
Processors like those you mentioned certainly perform great in their domains. These certainly have incredible advantages, but have a higher developer knowledge cost of entry, as well as actually getting your hands on these chips to mess around with them is more difficult.
Eventually we will be seeing these in dev boards, and I think then they'll catch on even more. I believe that when someone works on something in their free time, they tend to be more revolutionary, because they're not bound by rules.
Cell, as shown by things like the Gravity Grid, certainly has incredible power that can be unlocked with just more work. I think the real problem for the PS3 at the time was that developers didn't want to get too bogged down with Cell's more complex design, as they wanted to be performant on other machines. If Cell had gotten more use purely as an accelerator for high-performance computing applications, it could certainly have taken off, as this was a time when GPGPU was still fairly new.
Thank you for this conversation, it showed me new information about upcoming processors.
To revive this discussion, and maybe get a few pointers:
it seems like what Gabe is complaining about could have been solved with proper OoOE,
but the SPE never supported it, nor did it have a proper cache. I'm sure there is a TECHNICAL reason
for it, but I never understood why. If anyone knows or can provide references, that would be great.
'Proper cache': the problem is that caches limit scalability (cache coherence). Their design was forward-looking and would scale to thousands of cores. See the 'Adapteva Epiphany 5', a recently developed 1024-core chip similar to CELL. Their chip is for AI, etc.
The issue is that the Cell is an impressive piece of hardware. But as a software programmer myself, I understand how hard it is to move to a new platform where there isn't really a great set of instructions for how you manage errors. The Cell is a really strange piece of hardware. Most developers were used to standard PowerPC processor elements as multi-core, with an easy compiler that pushed tasks between the cores. The SPE approach is kind of brilliant if you understand it, but it's a huge hassle to deal with. Sony's studios used it very well, but most third parties were not willing to massively relearn their engines for the SPEs. You literally go from a compiler that can push your tasks across three cores, as on the 360, to a set of novel procedures needed to take advantage of the SPEs. Sony thought they could make all developers develop this way because they expected to be the market leader. But the Xbox was far more similar to actual game development practice. The moment the Xbox was more successful, the Cell kind of died.
Programming the new SPE system was very challenging. Not that developers couldn't; it just wasn't a normal multi-threaded environment like modern architectures. You don't just split everything among all the cores / processing units. The SPEs were only good at certain things, and programming for them was complex. You could go from just 10-20 lines of code running everything on the main PPU to offloading things like AI or other elements to the SPEs, and suddenly you'd be looking at almost 100-300 lines just to make it work.
There is some great info on this from various PS3 developers and emulation developers that goes over the complexities of the PS3. I'm no expert, but it was eye-opening to see how different the Xbox 360 was vs the PS3.
Ahh so that's why ps3 never had a left 4 dead port
slayyyy gabe neverrrr wrote a line of asymmetrical multi-threaded code everrrrrr
Bruh, the PS3 is a legitimately parallel-processing-capable device, unlike the x86-based AMD and Intel chips, which are just multi-threaded and still sequential execution. It's a step backward.
Modern PC processors do, in fact, have multiple cores
modern x86 microprocessors are crazy complex now
@@龗 and here we are x86 architecture market share is getting smaller. Even in my line of work Enterprise is Moving to ARM and NVIDIA based Data Center
0:11 is that why to this day their latest game (CS2) is CPU limited on my machine because it's maxing out ONE of my 6 cores?
Clearly it worked differently than multithreading now. And there was multithreading, or at least dual threading on the PC then...
GTA IV and GTA V were somehow programmed to run on this system
Gabe: "My most hated thing about the system is the number in it. I've never seen it in my life, I have no idea what it means, and it stresses me out every day when anyone mentions it, which is literally every 40-60 mins."
And that's why we have never gotten a Valve game on PS3-PS5 (Or any other console) after CSGO.
That's why the PS3 doesn't have Left 4 Dead 1 and 2 LOL
He loved PlayStation so much that the PlayStation 3 (after the incredible PS2) gave him PTSD! 😆 Even as a Nintendo fan myself who still played with friends (for real, and thankfully, we all had different things at different times) on arcade, 486 PC, Mega Drive, Sega Saturn, PlayStation, Dreamcast and PS2... PS3 PTSD is something I can relate to, because we all already had more or less nice PCs at the time, and the PS3's proposal was nothing more than:
- Hey! Do you want to own a locked, average PC that is inflated in price and offers much less than a regular self-built PC?
- Well... thanks, but fuck no! 🤣
This isn't a rant, these were legitimate problems which arguably made the PS3 shit.
@ The exclusives themselves are irrelevant. You could have gotten those games faster and easier had the PS3 been easier to develop for
@@FraserSouris no...the whole point is that those games weren't possible, at that level of quality, on other systems of the time.
@ The games themselves would have been possible with some graphical cuts here and there on the 360. It's not like the PS3 was able to do stuff like AI or whatever that was completely impossible on the 360.
@@BlindBison Just because the 360 technically didn't have the same theoretical peak doesn't mean it couldn't have had versions of TLOU/Killzone 3 that were very close to, if not indistinguishable from, the PS3's. The 360 was quite capable and less of a headache.
Also, no developer would call a machine where it takes 400 lines of code just to print "Hello World" elegant. The PS3 was needlessly complex, and it nearly sunk Sony. If Sony had made the PS3 with a simpler architecture and included backwards compatibility and Blu-ray, they wouldn't have nearly flopped.
The PS3 was so much ahead of its time! And definitely much better than the Xbox. It just needed time for developers to learn modern coding. Yeah, it is easier to write for something with three identical cores, but in the end, the PS3 had more potential, which Naughty Dog and others tapped, releasing games that had no competition.
OH NO, WE HAD TO LEARN SOMETHING NEW.
what to do? throw it away! why ? was it bad?
no it was actually an exciting direction to take, The reason is : See the first line.
That's some good recursion right there.
Gabe actually talked about this: learning the PS3's architecture was needless because you'd never use it for anything else. It was this extremely convoluted and specific system while everything else was much easier.
Had Sony made a more traditional system, they would have killed the 7th gen
@@FraserSouris This is most likely why modern gaming systems are based on similar architectures, divided into two categories: home systems using x64 (PS4, Xbox One and PC) and portable handhelds using ARM (Switch, Android and iOS).
@@UltimateAlgorithm Yeah. The idea is to make it easy to design and port games on different platforms. Having systems with completely different architectures is not conducive to that as you're spending more time getting the damn thing to run in the first place rather than designing the game and optimizing stuff
@@Maximus20778 Bad compilers for CELL (they generated crap code), among other reasons like this.
Left 4 dead music really??? SWEET
Xbox 360 was easier to develop for but was overall weaker
The PS3 was harder to develop for but was overall more powerful than the 360
256 Mb ram 🤣🤣
@@decriper1097it's 512 lmao what you on kido
@@gnrtx-36969 on the xbox 360 yes
@@decriper1097 do you even know what RAM is lmao, the PS3 has 256MB RAM with 256MB VRAM, the Xbox only has 512MB of unified memory
Both would perform almost the same.
The only thing that makes the PS3 more powerful is its Cell CPU.
Simply not true. The RSX in the PS3 is garbage. The CPU was harder to develop for, but in theory more capable. The memory configuration was inferior to the 360's. In no reality was the PS3 overall more powerful than the 360.
After learning more about Cell, I can say that developers back then didn't really know how, or rather didn't want, to tackle what could have been an ongoing feat for the Cell CPU. It's such a shame it's gone.
Now this guy's just raking in game distribution sales and not giving two shifts about the future development of games and technology. Their hardware ducked as well.
A prime example of the lost potential of an architectural achievement that STI developed. Sigh.
Imagine The Last of Us 2 running on Cell but with a few dozen cores. Naughty Dog had started to master that chip, but then the PS4 came out and killed off that possible future.
Because in 2006 and 2007 it was a poorly documented platform; even Sony realized and admitted they didn't give much attention to how to use their mixed platform.
Cell is needlessly convoluted for games. We're better off without it
@@insanezain2 That's a good thing. It meant PS4 games were optimized from day 1
If it’s hell to work with you aren’t going to get the most out of it when there’s similar hardware that’s easier to learn and provides the same specs
“I hate the PlayStation 2+1! I would rather develop on the Xbox 4-1 60!!!”
LAZY MILLENIAL DEVS DESTROYED GAMING
True
bullshit, half life 2 ran like shit even on the 360
valve never made a decent console port simply because they didn't want to
To be fair EA made this port.
@@pocketstationman6364 EA did the PS3 port, the 360 was made by Valve
@@nunosa528 yes
The PS3 Portal 2 port is pretty good.
@@yol_n The Portal 2 port was made in-house, so they had to give it their best shot. With a custom script you can actually get the PS3 version (I guess the 360 version as well) to reach 60fps or even more, though typical Valve double buffering makes the experience not so pleasant.
My opinion on how the economy is in Argentina be like:
Gabe is an amateur programmer; all developers who hate the Cell processor are.
gamingconsumed6000 Lol, they did not listen to Sony Interactive Entertainment's instructions.
Because, you know, all developers know how to optimize system code in assembly language on a poorly documented platform.
@@RobertK1993 The PS3's architecture wasn't that well documented though. It was legitimately difficult to do even basic stuff.
No. Gabe is right. Unless you're working on Supercomputers, you don't need the Cell
Sony reverted back to a CPU + GPU architecture in the PS4, instead of the CPU + coprocessor + GPU of the PS3. That alone tells us the Cell processor is less than optimal for game development. Besides, the things you do on Cell you can do on a modern GPU anyway, since modern GPUs support GPGPU operations, which are SIMD much like Cell. This is why the Xbox One and PS4 can get away with very weak CPUs: some of the computational hard work is offloaded to the GPU.
I'd love to hear how he feels now, considering how big ARM has gotten, which is a reduced instruction set like Cell. AKA Gaben always had a hate boner for Sony.
🤔very interesting
*CONSOLES ARE A WASTE OF TIME!*
He's just a baby
The Cell processor could be made easier to develop for with AI programming.
EVURRRRRR 👄
gabens not a brony
PS3 sucked, Should have just kept making PS2 games lol.
I'm pleasantly surprised there aren't any PC elitists in here.
Developing PC games is much more difficult than developing console games
@@Reed_Peer Not like this is the 90s... console games nowadays are DEVELOPED ON PC, so LOL WTF are you talking about???
@@KingMacintosh2 "console games nowadays are DEVELOPED ON PC"
Doesn't that make developing console games easier than before?
The reason I said PC games are harder to develop is that you need to write code to make the games work on many hardware configurations (can you list all the Nvidia cards from the Fermi era to now?)
Consoles, on the other hand, have the exact same hardware configuration on every unit, which makes them easier to develop games for
@@Reed_Peer You understand that hardware compatibility is handled by APIs and drivers, right? Game developers only need to worry about optimizing performance for an array of hardware specs; that's the only unique difficulty that comes with developing for PC.
His comments were such a setback. Just pure lack of critical thinking skills.
Unless you learn more about the nightmare of the CELL processor.
Gay Newell
Gay ben
our lord and savior gaben
pc crowd rise up
He is a brony, or at least he was. Who cares anyway?
skill issue lmao
L Gaben