Felix buddy, love your work, learnt a lot from your YLOD video, but when you tested one game and said "dropping the resolution to 480p makes no difference", I was disappointed and said "he should know better". It's game dependent. Digital Foundry, whom you cited in the video, did an entire PS3 1080p miniseries courtesy of John Linneman where he shows you a few games that do run differently based on output. One off the top of my head is _Gran Turismo 6_, which can be run in 1080i instead of the default 720p if you deselect 1080p as an output resolution (1080i has to be the maximum allowed at system setting level). The tradeoff for the higher output resolution is a more variable framerate and screen tearing, slight but noticeable in busier environments like city circuits.
Yeah, I suspect it varies from game to game. I only tested it in Crysis 3, since that's the cross-platform title I have for both consoles and know is late enough in the life cycle to make full use of the consoles (theoretically). So it's the one I wanted to use for comparisons. But the claim that it always helps FPS is untrue. That's the point.
Gran Turismo 6 defaults to 1440x1080 even if you have all resolutions selected. You need to disable 1080p (and 1080i?) to make it run in 720p, which does indeed improve performance. I remember doing that when the game was new, because framerates were quite poor even for the time. It runs a lot better with a slight oc though.
@@SterkeYerke5555 Good point, it's doing that PS3 thing of scaling on the horizontal only due to the architecture, right? Point is it still pushes a different number of pixels than 720p; it's not a fixed internal render res as some games on the platform are.
Your video is novel as usual. With proper citations, benchmarks, and jokes, it is the most enjoyable PS3 educational material I can find out there! Please accept my tuition fee!
High framerate (no motion blur, actual higher framerates with adapting frequency) isn't just for competitive play. For many people including myself, the more frames there are, the less my eyes and brain need to interpolate between those frames, not counting the huge input lag improvements. I think most people are just not educated enough about the subject.

Also, let me guess, you made your friend test uncapped on an LCD display, right? Try again on a CRT if you can. 30 FPS stable looks great on an LCD because it is slow enough to allow for pixels to switch states; however, when you increase the framerate it creates this blurry uncanny effect that people find jarring, which is simply over-persistence from sample-and-hold blur. It doesn't happen at all on CRT, just a little bit on plasma, and to an acceptable extent I'd say on OLED.

There is a fascinating case that would be truly worth a test regarding this, which is the original BioShock, since that game has a 30 FPS cap and an uncapped mode that you can select from the options menu (as well as 2 FOV settings, believe it or not). I remember preferring 30 FPS back in the day because it was more stable and didn't have horrendous tearing with flashing lights; however, the input lag was WAY worse.
It was an LCD. I've tested input latency on this TV at about 1 frame. All tests were done with game mode on for the least amount of latency possible on what I'd consider a fairly average cheap TV you'd find most casual gamers using. I could have had him try it on my LG OLED or a Sony Trinitron CRT, but I decided it was better, for the demographic I wanted to get a genuine experience from, to use a more generic TV.
@@ripfelix3020 Thanks for answering. As sad as it sounds, so far 30 fps with proper frame pacing is going to be what most people prefer, since most people have 60 Hz LCD TVs, assuming a proper locked 60 fps isn't possible of course. Also, since you overclock the PS3... maybe MGS4 can avoid the 20 fps drops more that way, right? Make your friend try that!
@@deus_nsf Mitsu did this and MGS4 ran at 30FPS locked for the most part, with just a few dips below, vs stock which was stuck at 20FPS. He used the same OC I have on my 2501A, 700/850, which it handles without breaking a sweat.
@@deus_nsf It's the reverse for me. I will happily play 30fps on CRT/Plasma/OLED/IPS. If I play 30fps on a VA panel, my eyes bleed. At the very least I'd need a high end VA TV, as those have response times matching or exceeding IPS panels.
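To put rough numbers on the sample-and-hold point a few comments up: on a full-persistence display the eye keeps tracking while each frame sits static, so the perceived smear is roughly tracking speed times hold time. A minimal sketch with made-up example values (not measurements from the video):

```python
def perceived_smear_px(speed_px_per_s, frame_rate_hz, persistence_fraction=1.0):
    """Approximate eye-tracking smear on a sample-and-hold display.

    persistence_fraction: how long each frame stays lit relative to the frame period,
    roughly 1.0 for a typical LCD/OLED and far lower for a CRT's brief phosphor flash.
    """
    hold_time_s = persistence_fraction / frame_rate_hz
    return speed_px_per_s * hold_time_s

# An object panning across the screen at 960 px/s:
print(perceived_smear_px(960, 30))         # 32.0 px of smear at 30 fps, full persistence
print(perceived_smear_px(960, 60))         # 16.0 px at 60 fps
print(perceived_smear_px(960, 60, 0.05))   # ~0.8 px with a CRT-like short flash
```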
Just another thing to note, right at the end of your video re capacitors and console usage. Using a device less frequently can paradoxically be worse for it. Frequent use helps to keep the dielectric in good shape in electrolytic caps. Although PS3s don't have a ton of them, they do have a few. So the longer it sits on a shelf, the more likely the dielectric breaks down in the electrolytics. Yeah, the filter caps for the RSX/CELL aren't electrolytics, but still, others on the board are. So yep, better to have fun with the console whilst you can. Nothing lasts forever!
@@iwanttocomplain Yeah of course. You can say that about most anything, to be fair. I was more agreeing with Felix that it's better to just play the PS3 rather than leave it. No matter what you do, devices age and break down with time, so just have fun with them. If the electrolytics go bad, it's easy enough to replace them.
@@NewRetroRepair Yeah, no point in being precious. These are mass produced consumer products, made out of plastic. Not hand crafted mahogany antiques from the 16th century. Although I would love a Sega Mark III to play SG-1000 games on; such a cool case design.
Great video! It should be put forward on the thread, so that it can serve as a reference. You condensed everything perfectly. Risks, benefits, diminishing returns, how to proceed... In short, it will probably be very useful to many people. Congratulations on the work done!
So, in summary, changing voltages is not a worthwhile endeavour? My PS3 is a slim 2501A, and your fat frankenstein seems to obliterate mine in overclocking capabilities. I suppose the silicon lottery wasn't on my side. Temps were alright though (changed the thermal paste).

Some months ago, when researching how to overclock mine, I saw your comments in the forums (I have to admit I was very happy to see you actively researching in real time), and I was expectant and hopeful that this video would be revelatory, as I knew you were working on it. I suppose I expected a software voltage tool of some sort to magically change the voltage, not a realistic, pragmatic new chapter of PS3 overclocking lore. I have to admit I am disappointed in the reality of it, although quite impressed by your thoroughness. Still hopeful for a future 'tool' or collection of 'methods' to reach stable and interesting performance on the PS3. And thanks to your "style", I am now very interested in the general limits of possible PS3 non-stock mods.

Loved the video. Love your content. Keep it up.
It is weird for someone to say the higher the framerate, the worse it is for them. Higher framerate is supposed to help motion sickness. He probably got used to the game's fps, and FOV sliders weren't a thing on console, so that makes sense. But higher framerates and an FOV slider set to something higher is probably the best for him. A low FOV gives motion sickness. Great video though!
I've never ever ever ever heard someone dislike games over 45 fps. That's absolutely absurd, literally ridiculous. 60/120 fps gaming is the absolute best thing about the current gen. The decrease in input lag is amazing, and needing significantly less motion blur to make up for a low frame rate is amazing. There's objectively no benefit to low fps over high fps. Literally none.
Except for the fact my friend prefers it. What's the saying? "There's no accounting for taste." Bro had an opinion, absurd as we both agree it is. 7th generation consoles couldn't achieve high framerates with many of the techniques used to make modern games. Photo-realistic expansive games were just within reach, but you had to accept the cost. And we did, because it was still awesome. But I agree that the remasters of those games on PS4/5 are a quality of life improvement on games that really needed it. Not just a way for publishers to double dip, endlessly re-releasing their back catalog, but actually a way to achieve the vision they intended but couldn't realize on 7th gen hardware.
Work is the bane of creativity. No time for anything but recovering the courage to go back to work the next day. Making videos (the way I do it) is time consuming.
I watched the tutorial first because I thought it was the only video uploaded right now, and I wondered how long I'd have to wait until this one would come out lmao
It seems that overclocking the PS3 makes some difference, but the BIOS and the motherboard are probably soldered on, and it requires advanced skills to upgrade them as well. Some games are capped to 30fps while some games aren't, and/or it could be a GPU bottleneck, and potentially in the future there would be a way to overclock the GPU as well, but it's kinda unlikely with the skills we have today. Sick Video!
30 FPS is fine if you haven't played 60 FPS in a while. What's worse is coming down to 60 FPS from 120 FPS with mouse and keyboard; it feels like there's a noticeable amount of input lag.
@@Lxca-fi4np I can say that personally anything below 60 isn't particularly nice to me, regardless of how long it's been since i played a 60fps game. Will say FPS above 60 is very very nice
Sonic Unleashed is capped at 30 on Xbox but 60 on PS3- but they both average 20 so the PS3 just looks more unstable. If the game is actually hitting 60 there's absolutely no reason to prefer 30 tho lol
Firstly, love the work here, I love seeing the path not taken.

Secondly, from experience, yeah, pushing the VRAM bandwidth higher will for the most part get you more performance than a faster core. RSX was incredibly bandwidth starved, especially when you used multiple buffers to combine results. Considering the PC GPUs it was based on typically had about double the bandwidth, it was obvious from day 1 that it was going to be a major bottleneck. You could do a LOT of data shuffling of multiple buffers just to end up with the final 27MB buffer displayed on screen (if you are 720p native). If you could eliminate a buffer or reduce the bit depth of one, that would yield decent results every time. Simply put, rendering pixels to RAM was slow.

This is the exact same issue with the Switch currently; this is why BotW/TotK have major performance issues. Decent GPU core, but you try to keep it away from writing to VRAM for as long as possible. It meant that there would be a lot of idle time if you were bandwidth bound. Xbox 360 didn't have this issue thanks to the EDRAM - this is a big simplification, because you have other issues such as a shared memory bus with the PPC and having to tile your inputs/outputs to fit said EDRAM - but when blasting pixel data to it, it could do so much more freely.

That is not to say that memory bandwidth was the only performance bottleneck, but it was a considerable one. It is impossible to balance a GPU (or any system) perfectly when the kind of workloads it will get will be all over the place. You can go from buffer heavy stuff like Crysis 3 to the elegant, artistically driven beauty of Burnout Paradise. There is no winning move, just selective compromise.

As a bonus, to balance the argument, if you had to do render-to-texture elements, RSX was actually kind of decent for this compared with the 360 - not because it was great at it, but because the 360 GPU SUCKED at this stuff, as it basically forced it to render directly to RAM. It would be like switching the cache off on your CPU. Once you stepped out of the EDRAM space, you were in a performance glut like nothing else.
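As a back-of-the-envelope illustration of that buffer-traffic point (the buffer count and layout here are invented for the example, and it deliberately ignores overdraw, blending, texture fetches and repeated passes, so it is a gross lower bound):

```python
def target_mib(width, height, bytes_per_pixel=4):
    """Size of one render target in MiB."""
    return width * height * bytes_per_pixel / (1024 ** 2)

def traffic_gb_s(target_sizes_mib, fps, touches_per_frame=2):
    """Rough GB/s if each target is only written once and read once per frame."""
    return sum(target_sizes_mib) * touches_per_frame * fps / 1024

# e.g. colour + depth + two extra buffers used to combine results, all 1280x720 at 32 bpp
targets = [target_mib(1280, 720) for _ in range(4)]
print(f"{sum(targets):.1f} MiB of targets, ~{traffic_gb_s(targets, 30):.2f} GB/s at 30 fps")
```

That already sounds harmless next to RSX's roughly 20 GB/s of GDDR3, but real frames multiply it many times over with overdraw, alpha blending, post-process passes and texture reads, which is exactly the pressure being described.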
I'm not sure if you mention it later in the video, but one of the most important things that determines how playable a game is, is whether the framerate and the gameplay logic/inputs are decoupled. When this is done, the game just slows down instead of skipping frames and missing inputs. That's how MGS games on the PS2 and PS3 did it (and countless others), and it's like an integrated slow-mo during heavy scenes, which I actually kind of appreciated.

That's completely different from the other way: if you have bad frame-time, and errors occur in the rendering-animation pipeline, then a "high" framerate that is unstable and misses button presses can be infinitely worse, because it's invisible when it decides to just lose your controller/keyboard/mouse input in between frames. I'm not sure why the way games are made changed, but that's the main reason a low framerate was tolerable back then, and why getting proper control input and animations at or well above 60fps is so problematic now.
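A minimal sketch of that decoupling, with made-up function names rather than anything from a real engine: logic and input sampling run on a fixed tick, rendering runs whenever it can, and clamping the catch-up budget gives the MGS-style slowdown instead of eaten inputs.

```python
import time

TICK_RATE = 60            # gameplay logic and input sampling run at a fixed 60 Hz
TICK_DT = 1.0 / TICK_RATE
MAX_CATCH_UP = 3          # clamp catch-up so heavy load turns into slow motion, not lost ticks

def run(update, render, poll_input):
    """Fixed-timestep loop: the render rate can drop without skipping a logic/input tick."""
    previous = time.perf_counter()
    accumulator = 0.0
    while True:
        now = time.perf_counter()
        accumulator = min(accumulator + (now - previous), MAX_CATCH_UP * TICK_DT)
        previous = now

        # Consume elapsed time in fixed slices so every tick sees fresh input.
        while accumulator >= TICK_DT:
            update(poll_input(), TICK_DT)
            accumulator -= TICK_DT

        render()          # a slow frame just means several ticks ran before this draw
```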
the xbox 360 abstracted the hardware layer of memory management with an Intel ivybridge or some kind of bridge chipset to make handling memory inaccessible to 99% of programmers, who wouldn't know how to get into the chipset kernel.
CPUs are not useful any more. One single-core 1 GHz CPU is enough. All it does is run bus operations and remember the state of the program. Make it a RISC CPU and you need half the gigs. The GPU is there to get the high resolutions and particle effects, and a 512-bit bus width bandwidth pipeline for 20 GB per second of texture data etc. I'm trying to say I hate modern graphics and wasteful 3D pipelines. Because Nvidia still isn't rich enough?
@@iwanttocomplain Wtf are you talking about. Tons of games are CPU-bound and not GPU-bound because their world-simulation is more demanding than the graphics.
@@iwanttocomplain Practically all strategy and simulation games are primarily CPU-bound, either because of AI players (Civilization, Total War, etc.) or because of world-simulation (Cities: Skylines, for example). Dwarf Fortress is the best example in my opinion, since it doesn't even need anything other than text graphics but will make your CPU choke if you have enough dwarves running around doing constant personality and thought calculations and stuff, all while the entire world is simulating the history of important figures, other civilizations, megabeasts, and other monsters and creatures on the surface or in caverns, or trees growing and changing over seasons, or water physics, etc.

Minecraft is probably the most well-known example of a CPU-bound game, though, due to the world being divided into so many blocks that have to be loaded and kept track of, and it can especially make the internal server struggle if lots of redstone machinery is activated. Lots of open-world sandbox games can be CPU-intensive for similar reasons to Minecraft. No Man's Sky might be a good example, since it simulates solar systems and their planets in real time.

Any game that has realistic physics and AI is going to be CPU-bound. Half-Life: Alyx and other similar "realistic" VR titles are like this. But games don't have to be primarily built on CPU-intense features to be CPU-bound. A lot of modern games can be heavy on the CPU in general, and you can probably fit any modern AAA game into this category.

I can't list every game in the world that's CPU-bound, but the general rule of thumb is that if it has complex AI, pathfinding, world-simulation, and physics, it's probably going to require more CPU. In so, so, so many cases multithreading and multicore CPUs are vital to make these work, due to being able to run calculations in parallel. High clock speeds help it all run faster too. Single-core is really only useful in games that fundamentally require a causal line that couldn't easily be kept track of across multiple cores (but Dwarf Fortress is probably the only example of this in practice).
Yep, we can do that too. I didn't include it in this video, for time. There's a bunch of stuff I would like to have included but it was already dragging along.
About the resolution thing: PS3 games can be programmed to change the render resolution based on the output resolution selected, so while it doesn't apply to all games, it most certainly isn't misguided advice that applies to 0 games. This is something Digital Foundry themselves have found during their 1080p PS3 games videos.
The PS3 was simply too expensive for me in my teens. I wanted one so bad but got an Xbox 360 Arcade, which was like half the price of a PS3. Then I saw how much better most games ran. Everyone with rich parents at my high school had a PS3; that's why I started to hate the other side. They mocked me every day for having an Xbox and not a PS3, so I started to mock them back... that's how 90% of the war stories began.
People who use the term "soap opera effect" in relation to games and say "the framerate is too high" or " it's way too smooth" shouldn't be asked any performance research related questions period. I applaud your overall ability to try and stay objective when discussing these topics, but the Uncharted test here is hard to take seriously, it should be done with someone with an actual understanding and a better eye; I'm not saying it wasn't interesting anyway, it's just that I'd rather listen to you trying to point out differences. Other than that, cool video as usual, though none are better than the YLOD one, because it's mostly free of your biases and has exclusively honest mistakes which don't take away from it one bit.
If I did that it wouldn't be representative of the demographic I was targeting. The segment of the population you are talking about is a small minority of the larger group of gamers who know nothing about how the game works on a hardware level and instead just experience it. This is a truer representation of the opinions and preferences of the market average. What a company would need to know to target the widest possible demographic. You and I are on the fringes when it comes to performance. Most people don't know what to look for or care about the same things. Although they are highly influenced by people telling them what matters. And that's why I was so careful not to hint at what I did or answer any questions.

I do understand the desire to believe your opinion is one shared by the majority, but that's often not the case. Nor does it matter. You can select the games and console or PC specs as you prefer, just as they can do the same for their preferences. I was surprised by his comments, and my goal wasn't to correct him or disagree. It was simply to get his honest opinion, without interjecting my own bias into it. Then let you guys hear it unabridged and decide what makes sense or not, from your perspective. That it was surprising reveals a dissonance between our preconceived notions and the reality of how others perceive the same issue. It can be very subjective, and in this case, it was.
Dude, fps directly corresponds to latency. In games where you need to react as fast as possible, like oh idk, first person shooters, it makes a huge difference. They are just plain easier to play in high fps because your effective reaction time becomes faster. That's why every competitive fps player on the planet lowers settings to the minimum, and plays on a high refresh rate display.
Yes, but many people subjectively prefer 'cinematic' framerates, don't ask me why, I prefer high framerates where possible. One other thing to consider is frametimes, a game running at 30FPS consistently will look smoother than one jumping wildly between 30 - 45 fps on a non-VRR display (and in the case of the PS3, on any display since it doesn't support VRR at all anyway).
That's why Microsoft chose to disable vsync in Skyrim on 360. It causes screen tearing, but maintains a locked 30FPS so your controller inputs don't get eaten by lag. PS3 has it enabled and doesn't suffer from screen tearing. But FPS drops can cause input latency that's arguably more distracting.
The majority of casual players would prefer a stable frame time matching divisions of refresh rate. 30fps solid or 60fps solid. Any other variance - you need VRR to avoid judder.
@@razmann4k You can subjectively prefer cinematic framerates all you want. You can even subjectively prefer not hitting your targets in FPS games. But the plain objective fact is that all else being equal the higher your framerate the faster you react.
@@Wooksley So what you're saying is, the higher the resolution, the higher the processing requirement? Thank you captain obvious. Can I try it? You can subjectively prefer using a CRT TV, and bust out your composite with 480p. You can even subjectively prefer 1080i mode in PS2 games. However, objective fact is, the PS3 is an HD console. Meaning 720p and 1080p. No hard feelings. I'm just kinda tired of seeing latency in fps conversations.
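On the latency point a few comments up, the floor is simple arithmetic: an input has to wait for the next simulation/render frame, so the frame time bounds responsiveness (display and pipeline delays come on top and are ignored in this illustration):

```python
def frame_time_ms(fps):
    """Time between frames, i.e. the minimum extra wait before an input can reach the screen."""
    return 1000.0 / fps

for fps in (20, 30, 60, 120):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.1f} ms per frame")
# 20 fps -> 50.0 ms, 30 fps -> 33.3 ms, 60 fps -> 16.7 ms, 120 fps -> 8.3 ms
```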
IDK man, I owned both day one and even on a CRT the 360 fps advantage was pretty noticeable. Plus most people I know were off CRTs years before the end of the generation making the resolution difference obvious. But I was one of the weird ones flipping back and forth between inputs. Only having a PS3 you wouldn't care but that's more along the lines of "ignorance is bliss" than "an unbiased opinion".
"CRTs had better motion handling, contrast and blacks than most TVs today" True! Only QD-OLEDs are now getting in the same range of image fidelity as a CRT.
I have my LG OLED hooked up to the same RGB-modded N64 outputting to my Sony Trinitron via component: UltraHDMI to the OLED, RGB/component to the CRT. The contrast is incredible on the CRT. The colors pop and glow. Looks like candy! I never wanted to lick my TV more. By comparison the OLED is dull. Sharp, beautiful, both have deep blacks. Lag has been mitigated with a RetroTINK 5X, but brightness is where the CRT shines. As for motion handling, way better. And the phosphors/bloom around them hide a lot of aliasing, blending the textures to look like they should. It's an all-around obvious thing when you see it, that this was the way the game was intended to be played.

A while back I hooked my Atari 2600 up and found myself getting into the game cuz it looked so good on the CRT. On a modern display it's such a hassle for a blurry box and a simple game. It was revelatory. I don't play Atari much anymore, but when I'm feeling nostalgic, the CRT is the only way to scratch that itch!
The 360 architecture is a little weird too, being a PowerPC-based CPU (as opposed to x86) paired with an ATI GPU, but def not as weird as Sony's more custom PowerPC and RSX GPU setup. Microsoft usually seems to play these things on the safe side, probably because their whole company was built on x86 and similar architectures. Seems it was the right choice as well.
I agree with that from a developer-centric perspective, but as a gamer I don't. Many of the most interesting and unique gaming experiences came from the limitations imposed on developers by the hardware and architecture. Their clever workarounds often contributed to the unique look and feel of a console. Like the PS1, N64, Saturn... etc. They each have a unique feel and aesthetic that emerged from what most developers didn't like about developing games for them.

I would argue that easy development makes them less creative. Yes, it allows them to achieve their original vision for the game, but it doesn't force them to think outside the norm to accomplish things others didn't think of in order to pull off an effect, or level design... etc. Those breakthroughs made some games stand out. We hardly get that sense from gaming anymore. It just feels like another iteration of something we've already seen, because there's no barrier to overcome. No limitation to showcase a developer's talent.

Case in point: look at The Last of Us compared to Uncharted 1. There is a serious step up in polygon count and visual fidelity as Naughty Dog learned to optimize the Cell architecture. They stood out as a shining example of what the PS3 could do if developers took the time to learn how. And TLOU is usually the #1 pick for best game of the entire generation, achieving something the 360 couldn't have matched even if the game had released cross-platform.
The 360's CPU is actually the same core as the PS3's PPU - IBM secretly made the deal as Cell was in development. PowerPC was attractive at the time because it provided the necessary performance per core without needing as many transistors as x86 = smaller dies = cheaper. They could have had a simpler Core 2 Allendale-based quad core with the same transistor count, but Intel would have charged way too much money.
The GameCube and Wii were also PowerPC based. They both used graphics made by ATI too, so in a way the 360 wasn't too dissimilar from what Nintendo offered. That could have made development easier for many, as they had experience with those consoles.
Wouldn't be possible to get at HD resolutions and with a generational graphical improvement. If you look at 7th gen 60fps games they're either compromised (not HD, severely limited physics/shaders) or are essentially enhanced PS2 games, like remasters and such.
Not all games will be 60fps, but I understand. I need games to be 60fps, or at least like Kirby and the Forgotten Land, which is 30fps but the input lag feels similar to 60fps. With that said, like John from Digital Foundry says: "it's a developer choice". Where do I stand? If a developer chooses to make a 30 or 60fps game, the game needs to be close to that target 98% of the time!!
You never got a generation where all games ran at 60fps and expecting that would be unreasonable. Not even the SNES, NES, or Genesis could maintain a perfect 60fps or have all games target 60fps
I found that a very good way to overclock safely is to warm up the PS3 by playing some games, immediately updating to raise the frequency by 50MHz, then jumping back in game. At some point, you'll end up with artifacts and/or freezing, at that point you just use the fan blowout function a couple times to cool the heatsink, or let the system sit for a while to cool, then revert back to the last safe clock; rinse and repeat! If you start with a high clock at low temperature and it locks up, you're forced to grab a flasher, but this way I always ended up having the console crash or artifact before it became impossible to update again! (Tested on a 2504B slim and delidded CECHG fat, slim can get to 750/900 and the fat to 650/800 before artifacting). Delidding and more tantalum capacitors seemed to give 50~100MHz extra on the fat, haven't messed with the slim, since it's always worked well and I'm in no rush to ruin it LOL.
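The same step-up procedure as rough pseudocode; every function name here is a hypothetical stand-in, since the real steps are reflashing the OC firmware with new clocks and playing to watch for artifacts, not anything scriptable:

```python
def step_up_clock(start_mhz, warm_up, flash_clock, is_stable, step_mhz=50):
    """Warm the console, raise the clock in small steps, back off one step at the first failure."""
    warm_up()                      # play something first so the silicon is at realistic temps
    clock = start_mhz
    while True:
        candidate = clock + step_mhz
        flash_clock(candidate)     # update while still warm, then jump straight back in game
        if not is_stable():        # artifacts or a freeze mean this step went too far
            flash_clock(clock)     # cool down first, then revert to the last known-good clock
            return clock
        clock = candidate
```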
Feel free to add a list of games that do render at lower resolutions based on system settings. I'm interested whether this is a trade-off people are truly willing to accept.
@@ripfelix3020 Some off the top of my head:
Bayonetta
MGS4
GTA4
DMC4
GT5/6
Ninja Gaiden Sigma 2 (possibly 1 too)
Sonic Unleashed (and 06 if you dare)
Rainbow Six Vegas
Maybe all BioShock games, those allow for an unlocked framerate too
@@ripfelix3020 the PS3 had a notoriously shit HW scaler, so it was up to the developers to use it or not, hence many games switch output to 720P while the console is set to 1080P, while the 360 is constant.
@@ripfelix3020 Looking back, I personally should probably have gotten a 480p CRT and a good RGB cable, instead of the expensive LCD TV I got at the time.
I had a 360 at launch but bought a used PS3 many years later to play the PS exclusives. They are both great and I would have been happy with either. That PS3 is still getting used today because emulators aren't quite there yet and I can't stop playing Gran Turismo 6. The 360 got replaced with an Xbox One and the backwards emulation is so good I don't miss it.
Comparing PC and PS3 back then, for me, was like comparing your regular Honda Civic with GT track racing cars... sure, they happen to have things in common like an engine and wheels, but it's a whole different league that not many people can afford, so there's no point comparing them.
The 7th generation was supposed to be a push for real-time lighting and other things. It was a generation where everyone wanted to make worlds more dynamic, not just from a lighting standpoint but from an interaction standpoint too. Yes, performance was terrible, but it was necessary, as the 7th generation laid down many techniques that would later be refined in the 8th generation. The 5th generation was the exact same way, with the consoles of that era pushing new techniques, namely 3D capabilities that could be done on the system itself and not just with extra hardware inside a cartridge, or with other compromises like untextured polygons or faking 3D with sprite scaling/super scaler games.

I don't really hate on the PS3 and 360 for that reason; they were important, as many technologies that are still used to this day made their appearance on these consoles, or at least were popularized there. Shadow maps, screen space reflections, ambient occlusion, and many more techniques. Plus, post processing became very popular around this time too. The 8th gen consoles took all these technologies and refined them to a point where many of their flaws were fixed. Most 60fps games on the 7th gen consoles tended to look closer in technology to 6th gen games, with many refinements to make them look visually superior. They often didn't push for real-time lighting or super dynamic worlds.
Umm, funny fact: depending on how you softmod and hardware mod the Xbox 360, you can overclock it, but I would take the motherboard out of the 360 and put it in a PC case with good airflow. If you don't do that, then there is a risk of fire. 360s did run hot. I watercooled my 360 and I used to overclock it.
Some games do render at whatever your PS3 resolution was set to. Skyrim ran much better at 720p rather than 1080i for example. It all depends on how the game was made. Tho ever since PS4, all games render at whatever the devs decide regardless of output resolution.
games aren't about escape. i always hear that but i never agreed with it. games are about fun and enjoyment. you can have that playing very realistic simulators. now, some devs like to say their games are not meant to be fun or enjoyable (coughtlous2cough) and i think that's them failing at making games, flat out.
I recently had a major carrer setback that taxed me emotionally to the limit. Gaming is absolutely an escape. Anything that can take your mind off that flood of negative emotions is desperately needed relief. When everything is fine, sure, it's all entertainment and games. But when it comes crashing down, escape is priceless... when you can achieve it!
@@ripfelix3020 Disgusting. Journey is not a game and neither is it entertaining. People must be confusing pretensions with actual foundational and tangible design elements that are not theoretical.
@@ripfelix3020 Did you mean Carer or Career? If it's your career, you should not be emotional about that because it is just money. If your carrier had a setback, have you checked the resistances?
I just wish Sony gave us ANYTHING that couldn't have run on the Xbox 360 with optimisation. Even the best PS3 games like The Last of Us could've been optimised to run on 360, just as the best 360 games could've been for PS3. And when the offerings are the same, price is king. To me, that made the 360 the overall winner.
I see this point and actually agree with the value-based conclusion. Right now 360 games are dirt cheap and generally run better. So the 360 makes more sense from a gaming standpoint, if you have to choose. But we don't anymore. A slim PS3 can be had for about $100 and the most popular PS3 exclusives are inexpensive too. If it's cross-platform, get the 360 copy. Then enjoy both consoles' exclusives. Both consoles have a proud place on my entertainment cabinet to be enjoyed together. It's a great time to be a 7th gen gamer.
@@ripfelix3020 Absolutely, I collect for both of them right now. I got some great Xbox games on deal, but also really enjoy the Ratchet and Clank Future games on PS3. For first party the PS3 was amazing; I collect it almost exclusively for first party games, with a handful of exceptions of course. Also for streaming to my PS Vita. For its own exclusives and third party games (98% of the time anyway) I also collect for the 360 since my 6th gen collection is basically finished.
32Bit console era started with the Fifth Gen consoles on the Sega Saturn/PS1 and ended with Xbox360/PS3. We just barely left the 32Bit era and jumped to TRUE 64bit consoles with the Xbox One and PS4, and even to this day with the Series and PS5. The N64 was mostly 32Bit with some 64bit executions. The jump from 8, 16 and 32Bit was quite fast, but we will be stuck to 64Bit consoles for a good while.
@@gnrtx-36969 hmm i think It was mostly just people assuming they were. But if i recall the NGC was a 32Bit PowerPC CPU at over 400 mhz, while the Xbox was a x86 32Bit CPU at 733 mhz, i don't remember exactly but PS2 was a MIPS III CPU mostly also 32Bit with some 64Bit executions running at around 300 mhz. The PS2 pulled a similar trick to the N64 to give out an illusion to be more bits than what they were. But all consoles after the fifth gen were all 32Bit. And they all died at the end of the 7th gen consoles. Even the insanely fast Zen x86 CPU's inside the Series and PS5 are still 64Bit. That's going to be a thing for quite a good while, as 64bit memory address will take a while to get obsolete.
You're full of copium. During that time, if a game was released on consoles and PC, the PC was the better version in 99.9% of cases, full stop, it's a known fact; the PS3 and Xbox weren't powerful enough. In 2006 you could get an 8800 GTX for the price of the PS3, and in 2008 the MSRP of the 9800 GTX was about $250. Also CRT monitors and HD CRT TVs were a thing. The previous consoles targeted mostly 60fps due to CRTs.
@@crocodilegamer93 Yeah the initial cost is higher but you could offset it by pirating so. During that period you would just buy a core 2 and slap it on basically whatever motherboard you want, overclock it by 40/50% and you're done.
The PS3 and 360 had effects that couldn't be done as deferred rendering wasn't supported on most GPUs. The first GPUs to support it were the Nvidia 8000 series and previous gen GPUs couldn't support it. The PS3 and 360 had some advantages over the PCs of the time even if they were quickly surpassed in a few years. There were also games at the time that had better visuals on console vs the PC
I can't do WASD for character movement (walking). It's a terrible substitute for an analog control stick. The mouse is natural for aiming tho. Give me a Wii-style nunchuck in the left hand and a mouse in the right and we'll call it even. That's possible now, but was a bigger hurdle in 2007. Many games on PC forced keyboard/mouse and I couldn't make the transition.
An internal 1280x720 will always look decent since it can be perfectly integer scaled to 4K (3x on both axes). Unfortunately all games under 720p will look like crap on modern TVs, especially GTA 4 and RDR 1. The situation is flipped for Battlefield 3.
They can* *but they won't. Most people don't have a Framemeister 4K or similar upscaling device that can handle 4K output and HDMI. And most TVs are going to apply a bilinear filter and make the screen look like you covered it in Vaseline.
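The arithmetic behind the 720p-to-4K point above, as a quick illustrative check:

```python
def integer_scale(src_w, src_h, dst_w, dst_h):
    """Largest whole-number scale factor that fits the source into the destination."""
    return min(dst_w // src_w, dst_h // src_h)

print(integer_scale(1280, 720, 3840, 2160))   # 3 -> 720p maps cleanly onto 4K, 3x on both axes
print(integer_scale(1280, 720, 1920, 1080))   # 1 -> 1080p would need a non-integer 1.5x, hence filtering
```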
Cool video, good comparisons, love stuff like this. I'm one of those PC guys and love my games as "fast" as they can be, it just feels better to me. It's hard for me to play anything at 30fps anymore; it feels sluggish and I can notice the frames.
Wow, "unbiased opinion" but you're literally butthurt about the PS3 being slower almost 20 years after their launches 😅 I think you meant to say biased. Now for the wall of text barely anyone will read: I disagree with your comment on frame rate. I agree that "unplayable" is a term tossed around too loosely nowadays (like Hardware Unboxed saying anything under 60fps is unplayable, or DF saying that anything with any stutter, like shader cache stutter, is unplayable... I roll my eyes every time). The threshold for unplayability is actually 18fps in my extensive testing. If you're really desperate to play a game, it's still somewhat fun at 18fps. Anything below that becomes a source of migraine if the game is 3D, even 17fps. It's like a hard cut. That comparison you showed - "playable", "meh", "enjoyable" - was on point. Actually pretty accurate.

Anyway, my point is, bad fps is bad no matter the era. PS2 games generally ran much better than PS3 games and aged better for it. But let's be fair here - low FPS only really started in the 3D era. Games before that used to all be at either 30 or 60fps on the 16-bit and below consoles, with very rare exceptions. Arcades also ran stuff blazing smooth. And these terrible FPS were, in my opinion, poor dev practices, because these consoles could very well put out some great looking games that run at 60fps, like Call of Duty 4, Gran Turismo, Forza, among many others. As a result, those games aged WAY better than anything else - the graphics all look dated anyway, but the 60fps with all of its smoothness and latency benefits is still fresh for reaping - the games still play great. 30fps games are good too as long as they don't drop too often.

I come from a PC + console background. I bought my first PC in 1994, and I had my first high-fps experience in late 1997 - with my Pentium MMX 233MHz being able to run Quake 2 at 85fps in 640x480 with my Voodoo2 - the smoothness was something unforgettable. But I was never spoiled by it to the point of saying anything less is unplayable - heck, two years earlier I was content playing Quake 1 in a small window at 20fps. Obviously I wasn't happy about it, but I was content. It was playable, and that's what mattered most.

Somewhere in the middle of the 2010s I traveled for the holidays and took my shitty laptop with me. It rained every day, and I wanted to play Forza Horizon 3 (I think it was 3...). It ran like crap on my poor laptop. I wanted to lower the resolution down to 640x360, but the minimum resolution the game allowed was 720p (why? - it'd run awesome if I could lower it further!), so it was unplayable, around 12 fps. I tweaked a bunch of settings, changed my camera to bumper, overclocked the hell out of my laptop and cooled it externally - I got it to play at 18~30fps. There it was, completely playable, and I was able to have fun with it.

Anyway, nowadays you can emulate all these games and run them at better performance and resolutions, and that's enough to bring some of them almost to par with early PS4/Xbox One games.
Absolutely agree from a spoiled modern perspective. But I remember the N64 as a vivid memory burned into that special time of a kid's life - the time where every game, movie and piece of music you saw in that short 4 or 5 year period is remembered for the rest of your life. I didn't care about the FPS drops. My friends didn't care. We were too busy destroying controllers spinning the joystick to win at tug-o-war in Mario Party, or getting the golden gun to own in GoldenEye. Low frame rates and drops from unoptimized code were just... incidental to what was an otherwise awesome and memorable gaming experience I cherish. So when a young whipper-snapper craps on my childhood saying the game is unplayable and trash because of the frame drops, it hits me in that special place.

But I don't disagree that it has aged, and going back to play them now is a bit jarring. It's just that when I play them now I overlook these minor issues and accept them as part of the charm, just as I did as a kid, instead of looking to be an elitist and crap on the classics.

I agree 18FPS is right at about that limit for me as well, where I start to notice the FPS drop and it intrudes on the immersion. But I do notice the input latency before that. I'm sensitive to input lag, and unless vsync is off, allowing screen tearing instead of slowdown, I notice that more than tearing. Skyrim on 360 is better for that reason IMO.
I can agree with DF on the stutter. Constant stutter is unbearable and can really take you out. Shader stutter is annoying to deal with. I do disagree with Hardware Unboxed saying anything under 60fps is unplayable and I hate it every time they say it. I get it there are some types of games that are best above 60fps but that's not all games. People have really only recently started to care about fps.
And what's crazy is we probably haven't unlocked the PS3's TRUE POWAH with the potential of a Cell overclock.
True power is PC only. I love consoles but PC is always better. The PS3 doesn't have hidden power. That's a marketing trick.
@@Key_Pi o boi
No way a PC in 2024 is stronger than a PS3🙀😫@Key_Pi
The 2008 PC is more powerful. I bought it for $500.
@@Key_Pi u don't understand the tru powah of da cell!!
This might sound weird, but I can't get tired of watching your videos, and now after a rewatch of the PS3 documentary and the Frankenstein PS3s you upload again? Sweet!
That OC saga was glorious, that poor Frankenstein of a PS3 sure took a beating. Even if PS3 voltmods remain an obscure curiosity with little practical effect, that was still entertaining as hell. Fantastic vid.
Another RIP Felix Video! Your YLOD documentary stopped me from doing the token mod and made me check the SYSCON logs. Thank you for all the work you have done!
This was a great video. Took us for a wild ride. So many "Oh no. I know where he's going with this" moments. The overclocked firmware is a huge game changer that doesn't get as much attention as it deserves. So cool to see you push it this far.
This video really has me wondering how far the PS3 can be pushed, and how far people will go to find out, akin to PC overclockers. Jerry-rigged air cooling towers? Retrofitting water cooling loops? Stacking caps of different values/types to reduce ripple further? Custom voltage regulator modules for optimal power delivery? How hard can we redline these systems, even if they are obviously doomed to burn out in the process? Seeing the principles of extreme overclocking applied to the PS3, (and the effects on performance) would be not just entertaining, but utterly fascinating.
I think uncapped Uncharted felt weird because a lot of elements like animation or physics might be locked to 30fps or lower, creating discrepancy between them and the actual framerate. Also 40-50 fps will look worse than a capped 30 due to rampant screen-tearing on non-VRR displays/hardware.
I still appreciate the extra input responsiveness though. Great video!
IIRC, there was quite a bit of work done on the PS4 remasters for the Uncharted games and The Last of Us, and those include a 30FPS lock option, even if those remasters were optimized to run at 60FPS on the hardware. Also, in regards to the soap opera effect, it generally also has the problem of the shutter speed being set incorrectly, resulting in less motion blur (a long exposure is what 24FPS films use for natural-looking motion blur). Some games have an issue of not adjusting the shutter speed with higher framerates, resulting in the effect looking diminished. Games nowadays have been doing a better job at including a motion blur intensity option.
Unsure if the RPCS3 patches for the game do anything beyond just changing the VSync interval.
Yeah it must be because it's not locked at 60. Locked 60 beats 30 every time. The soap opera effect is a movie issue. Games are always better at higher frame rates. Personally I get motion sickness from 30fps these days. 60 fps is much better, and on PC I target 120.
It really depends on the person TBH; those with vertigo and/or motion sickness will see things differently than those who don't have any of the above. @@wertywerrtyson5529
Depends if it's V-synced or not. If it is, you won't get tearing, but judder instead, which frankly can be more disorienting.
@@wertywerrtyson5529 60 FPS beats 24 in movies IMO.
I recently modded one of my PS3s to run homebrew and decided to play through all of the Uncharted games as they were originally. Initially I was playing them on a modern 27 inch 1440p monitor and I thought they looked horrible. I then played them on an old 40 inch 720p plasma and my god they looked so much better. That TV was the target resolution and it was crazy how all the on-screen effects came together on a display that the game was designed for. Also, on top of this, flat screen/LCD/plasma displays tend to look awful when you're running stuff below the native resolution.
@@2Step2Hell I think if you play it on a decent 4K TV instead, it would look better as they are made to upscale 720p broadcasts. But nothing will beat integer scaling or native.
A digital IC designer here.
I'll be addressing the Voltage mod & how that worked in this scenario.
Basically, as we design a chip, there will always be a working frequency & core voltage specification chosen as a baseline.
But the operation of the chip will be tested based on that value with additional margins (plus/minus some percentage of core voltage & operating frequency, nominally 10%).
So by increasing the input core voltage (as long as it remains within the tested value), you can also operate the design at a slightly higher frequency.
So, logically & physically, it's possible to do this. I would just not recommend operating the device like this at all times though.
Yeah, it's technomancy. A fun thing to try, but probably not a good idea if you want the thing to last.
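A throwaway sketch of the margin check described two comments up; the numbers below are invented for illustration, not actual Cell/RSX rail specs:

```python
def within_tested_margin(nominal, proposed, margin=0.10):
    """True if `proposed` stays inside the +/- band the part was characterised at."""
    return nominal * (1 - margin) <= proposed <= nominal * (1 + margin)

NOMINAL_CORE_V = 1.10                               # made-up baseline voltage
print(within_tested_margin(NOMINAL_CORE_V, 1.15))   # True:  about +4.5%, inside a 10% band
print(within_tested_margin(NOMINAL_CORE_V, 1.25))   # False: about +13.6%, outside the band
```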
Oma god did ma favourite youtuber just upload?
Yes he did!
As someone who's very into overclocking all kinds of PC hardware, incl. notebooks and some record hunting, this video really had me hooked; these 44 mins felt like 10 mins ngl! Absolutely astounding!
RIP Felix making all of us 7th gen soldiers relive our trauma of the warfare that has long since died. Lol
look at what they need to mimic a fraction of our power.
you mean the xboxers puking on everything and then dying miserably?
@@Jackson-bh1jwSure kiddo
While I wish they did better, Xbox is dying rn, and they aren't going to last very long if they don't put out a game-changing game that gives people a reason to own an Xbox. And if they don't, then it's probably going to end up like what happened with Sega (which is ironic).
I could barely get through the intro. Fucking having some serious Nam flashbacks.
This is the kind of content that you can barely find anymore. This channel must be protected at all costs. God save RIP Felix.
Keep in mind people were more used to 60fps games in 6th gen because of the massive jump.
What you said about being used to low framerates only really applies to 5th gen and PS2 games
Also, Sony was very misleading in the announcement period of the PS3. The graphics and effects they showed in some of the trailers would be impossible even for a PS4 to pull off (Tekken, Motorstorm etc.)
The initial showcasings of the 360 were way worse in comparison with games running with severe slowdown and graphics that didn't look like a huge jump from 6th gen to 7th gen, but they ended up outdoing themselves when the final product released.
@spiral7041 Actually this current generation would surpass it for 3D games. Pretty much most 9th gen games run at 60fps and maintain that performance well
@@crestofhonor2349 Which is good.
That's not how I remember it. A lot of high profile PS3/360 games ran at pretty crusty framerates to get as much eye candy on screen as possible. That was THE way to make games look "next gen". Framerate snobbery was more of a PC gamer thing. I only recall a few examples where console players complained about low framerates around that time. For example, people complained that the launch version of Dark Souls 1 got overly choppy in upper Blighttown on the Xbox 360. It was only when the game got ported to PC that people really started complaining that it was locked at 30FPS. "Literally unplayable!".
Console framerates came more into focus towards the end of 6th gen, when many games pushed the aging hardware a little too hard in the continuing race for eye popping fidelity for very marginal gains. Like how The Last of Us didn't run very well compared to previous Naughty Dog games, and how the main upgrade of the PS4 version was that it ran smooth. But even then the focus was more about consistency than a high target framerate.
I was still very accustomed to sub 30 FPS gaming until I put together my first proper gaming computer in 2011. I played through Oblivion at
@@crestofhonor2349 that is an absolute LIE
You know Felix is the only person in the world doing in-depth testing on the ps3 today. Respect. 😁
It is fun to debate which console is better sometimes, but not when it comes to tearing each other down. The PS3 is stronger, but it's so complicated that devs just couldn't fully take advantage of it. Overclocking shouldn't work either, because the device already has heating problems. So did the 360, so I can't say much there. Both systems were great
That gen, with its sub-720p resolutions and unstable framerates, is the one that would really benefit from enhanced consoles. A PS3 Pro with a G94-based GPU could push everything at at least 720p with a stable framerate and, if asked nicely, 1080p in some games.
And replace that stupid Cell!
"and replace that stupid cell" you do realize that the cell be was more powerful than the xbox 360's cpu, right? the problem was that the programming concepts needed to use it well were novel at the time, but are all commonplace nowadays with multithreading and compute shaders
the problem was the ps3 threw these things as a requirement at developers before dual-core x86 cpus were even a commonplace thing
@@TorutheRedFox the ps3 designer _apparently_ said he wanted devs to be tied up learning his stupid system, so they would therefore make fewer games for other consoles. Could this be true? Because if it is, the cell architecture is a wrench consciously thrown into the games industry as a whole.
I have never heard of him saying this
and I don't think that IBM or Toshiba would invest so much money in a partnership with Sony to make such a CPU if they wouldn't have seen potential in it, as only Sony would've had such a silly reason as that
@@TorutheRedFox Toshiba and IBM were taking orders and that's all.
I distinctly remember footage of Gabe Newell getting annoyed about the cell architecture trying to port Half Life 2.
One mistake and the whole house of cards collapses.
Whether the Cell architecture damaged the industry or not comes down to whether you think the steep learning curve of its systems could be migrated to the Intel Core architecture that is now standard.
Most people hate sliding back down the slope of enlightenment.
Felix throws his arms up and goes WEEEEEEEEEEE!, gets to the end and goes for another round!
Kaizo PS3
What are you talking about
the Dunning-Kruger effect of console repair. it's a running joke on my channel.
Funny how Felix quietly changed the labels from the fictitious supposed "Dunning Kruger graph" to the Gartner hype cycle graph lol
Might want to read doi:10.1037/0022-3514.77.6.1121 and then doi:10.5038/1936-4660.9.1.4 to see why it's horseshit anyway
Outstanding video! I never had a ps3, but I feel like getting one just for the sake of overclocking now 😂
I couldn't get a better notification than a new video of yours. Thank you in advance for your presence in the scene
I remember back in the N64/PS1 era, when a game struggled with framerate, me and my friends used to say "oooh wow, the game is so good and beautiful that it makes the console struggle" 😂 It's crazy how back then fps drops were seen in a more positive light, as proof of how good the game was, and how quickly that shifted into fps drops being the worst thing ever.
Felix buddy, love your work, learnt a lot from your YLOD video but when you tested one game and said "dropping the resolution to 480p makes no difference", I was disappointed and said "he should know better".
It's game dependent.
Digital Foundry, whom you cited in the video, did an entire PS3 1080p miniseries courtesy of John Linnemann where he shows you a few games that do run differently based on output. One off of the top of my head is _Gran Turismo 6_ which can be run in 1080i instead of the default 720p if you deselect 1080p as an output resolution (1080i has to be the maximum allowed at system setting level). The tradeoff for higher temporal resolution is more variable framerate and screen tearing, slight but noticeable on busier environments like city circuits.
Yeah, I suspect it varies from game to game. I only tested it in Crysis3, since that's the cross platform title I have for both consoles and know is late enough in the life-cycle to make full use of the consoles (theoretically). So it's the one I wanted to use for comparisons.
But the claim that it always helps FPS is untrue. That's the point.
Gran Turismo 6 defaults to 1440x1080 even if you have all resolutions selected. You need to disable 1080p (and 1080i?) to make it run in 720p, which does indeed improve performance. I remember doing that when the game was new, because framerates were quite poor even for the time. It runs a lot better with a slight oc though.
@@SterkeYerke5555 Good point, it's doing that PS3 thing of scaling on the horizontal only due to architecture, right? Point is it still pushes a different amount of pixels from 720p, it's not a fixed internal render res as some games are on the platform.
Your video is novel as usual. With proper citations, benchmarks, and jokes, it is the most enjoyable PS3 educational material I can find out there!
Please accept my tuition fee!
Thanks man. I like your videos too! That last one about the caps from Ali, Mouser, Digikey and LCSC was great.
High framerate (no motion blur, actual higher framerates with an adapting frequency) isn't just for competition; for many people, including myself, the more frames there are, the less my eyes and brain need to interpolate between those frames, not counting the huge input lag improvements.
I think most people are just not educated enough about the subject.
Also, let me guess, you made your friend test uncapped on an LCD display, right? Try again on a CRT if you can. 30 FPS stable looks great on an LCD because it is slow enough to allow pixels to switch states, however when you increase the framerate it creates this blurry uncanny effect that people find jarring, which is simply persistence blur from sample-and-hold. It doesn't happen at all on CRT, just a little bit on plasma, and to an acceptable extent I'd say on OLED.
There is a fascinating case that would be truly worth a test regarding this, which is the original BioShock, since that game has a 30 FPS cap and an uncapped mode that you can select from the options menu! (as well as 2 FOV settings, believe it or not). I remember preferring 30 FPS back in the day because it was more stable and didn't have horrendous tearing with flashing lights, however the input lag was WAY worse.
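For a rough feel of that sample-and-hold blur, here's a simplified back-of-the-envelope (it assumes your eye tracks the object perfectly and ignores pixel response time entirely, so take the numbers as ballpark only):

#include <stdio.h>

int main(void) {
    double speed_px_s = 960.0;                 /* hypothetical panning speed in pixels/second */
    double rates[]    = { 30.0, 60.0, 120.0 }; /* frame rates to compare */

    /* On a sample-and-hold display each frame is held for 1/fps seconds, so a
       tracked object smears across roughly speed * hold_time pixels. A CRT
       flashes each frame briefly, so the smear is close to zero. */
    for (int i = 0; i < 3; i++) {
        double hold_s = 1.0 / rates[i];
        printf("%5.0f fps: ~%.0f px of perceived smear\n", rates[i], speed_px_s * hold_s);
    }
    return 0;
}

So higher framerates shrink the smear on an LCD/OLED, but they never get it down to CRT levels, which fits the "uncanny" in-between feeling described above.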
It was an LCD. I've tested input latency on this TV at about 1 frame. All tests were done with game mode on for the least amount of latency possible, on what I'd consider a fairly average cheap TV that most casual gamers would use. I could have had him try it on my LG OLED or a Sony Trinitron CRT, but I decided it was better, for the demographic I wanted to get a genuine experience from, to use a more generic TV.
@@ripfelix3020 thanks for answering. As sad as it sounds, so far 30 fps with proper frame pacing is going to be what most people prefer, since most people have 60 Hz LCD TVs.
assuming a proper locked 60 fps isn't possible of course.
Also since you overclock the ps3... maybe MGS4 can avoid the 20 fps drops more that way right? make your friend try that!
@@deus_nsf Mitsu did this and MGS4 ran at 30FPS locked for the most part with just a few dips below vs stock that was stuck at 20FPS. He used the same OC I have on my 2501A 700/850 which it does without breaking a sweat.
@@deus_nsf It's the reverse for me. I will happily play 30fps on CRT/Plasma/OLED/IPS. If I play 30fps on a VA panel, my eyes bleed. At the very least I'd need a high end VA TV, as those have response times matching or exceeding IPS panels.
Never seen such a great technical breakdown of overclocking any console. The only thing that comes close is the mod scene for the og xbox. Great work.
goat can't stop making bangers
God I love 7th gen. The last generation when consoles were worth buying and actually had games
Just another thing to note, right at the end of your video, re capacitors and console usage. Using a device less frequently can paradoxically be worse for it. Frequent use helps keep the dielectric in good shape in electrolytic caps. Although PS3s don't have a ton of them, they do have a few. So the longer it sits on a shelf, the more likely the dielectric breaks down in the electrolytics. Yeah, the filter caps for the RSX/CELL aren't electrolytics, but still, others on the board are. So yep, better to have fun with the console whilst you can. Nothing lasts forever!
can't you replace the caps on ps3's?
@@iwanttocomplain yeah of course. You can for most anything to be fair. I was more agreeing with Felix that it's better to just play the PS3 rather than leave it. No matter what you do, devices age and break down with time so just have fun with them. If the electrolytics go bad, it's easy enough to replace them.
@@NewRetroRepair yeah no point in being precious. These are mass produced consumer products, made out of plastic. Not hand crafted mahogany antiques from the 16th century.
Although I would love a Sega Mark III to play SG-1000 games, and such a cool case design.
This video was insanely good and I wish I understood more of the specifics. I spent SO MUCH time modding PS3 when I was younger and I miss it!!
Great video!
It should be put forward on the thread, so that it can serve as a reference.
You condensed everything perfectly. Risks, benefits, diminishing returns, how to proceed...
In short, it will probably be very useful to many people.
Congratulations on the work done!
Thanks. And thanks to you for all you've done on the thread to help out as well.
I'm so proud to be part of psx-place. Thanks Rip-Felix
Naxil (user who must necessarily use every mod)
So, in summary, changing voltages is not a worthwhile endeavour? My PS3 is a slim 2501A, and your fat Frankenstein seems to obliterate mine in overclocking capabilities. I suppose the silicon lottery wasn't on my side. Temps were alright though (changed the thermal paste). Some months ago, when researching how to overclock mine, I saw your comments on the forums (I have to admit I was very happy to see you actively researching in real time), and I was expectant and hopeful that this video would be revelatory, as I knew you were working on it. I suppose I expected a software voltage tool of some sort to magically change the voltage, and not a realistic, pragmatic addition to PS3 overclocking lore. I have to admit I am disappointed in the reality of it, although quite impressed by your thoroughness. Still hopeful for a future 'tool' or collection of 'methods' to reach a stable and interesting performance on the PS3. And thanks to your "style", I am now very interested in the general limits of possible non-stock PS3 mods.
Loved the video. Love your content. Keep it up.
Modified FW is ballsy on a NAND console. SUBSCRIBED!
Insane video, better than most A-titles (Docs) on Discovery Channel 🙏
Felix keeps coming up with the best documentaries and makes things better by explaining things carefully, watch this people.
peak modding right there, love it.
Maybe the best video I've seen in months
For sure the best PS3 video ever made!
It is weird for someone to say the higher the framerate, the worse it is for them. A higher framerate is supposed to help with motion sickness. He probably got used to the game's fps, and FOV sliders weren't a thing on console, so that makes sense. But higher framerates and an FOV slider set higher would probably be best for him. A low FOV gives motion sickness. Great video though!
I've never ever ever ever heard someone dislike games over 45 fps. That's absolutely absurd, literally ridiculous. 60/120 fps gaming is the absolute best thing about the current gen. The decrease in input lag is amazing, and needing significantly less motion blur to cover up a low frame rate is amazing.
There's objectively no benefit to low fps over high fps. Literally none.
Except for the fact my friend prefers it. What's the saying? "There's no accounting for taste." Bro had an opinion, absurd as we both agree it is.
7th generation consoles couldn't achieve high framerates with many of the techniques used to make modern games. Photo-realistic, expansive games were just within reach, but you had to accept the cost. And we did, because it was still awesome.
But I agree that the remasters of those games on PS4/5 are a quality of life improvement on games that really needed it. Not just a way for publishers to double dip, endlessly re-releasing their back catalog, but actually to achieve the vision they intended but couldn't realize on 7th gen hardware.
glad you're alive
Work is the bane of creativity. No time for anything but recovering the courage to go back to work the next day. Making videos (the way I do it) is time consuming.
I watched the tutorial first because I thought it was the only video uploaded right now, and I was wondering how long I'd have to wait until this one came out lmao
SURPRISE! Instant gratification granted.
Only on this channel!
@@ripfelix3020 One question, can the Xbox 360 be overclocked or not?
It seems that overclocking the PS3 makes some difference, but the BIOS and the motherboard are soldered on, and it requires advanced skills to upgrade them. Some games are capped to 30fps while some aren't, and/or it could be a GPU bottleneck. Potentially in the future there could be a way to overclock the GPU even further, but it's kinda unlikely with the skills we have today. Sick video!
I gotta say, for me personally, the FPS really matters, it's very interesting to hear that people actually prefer 30fps!
30 FPS is fine if you haven’t played 60 FPS in a while. What’s worse is coming down to 60 FPS from 120 FPS with mouse and keyboard, feels like there’s a noticeable amount of input lag
@@Lxca-fi4np I can say that personally anything below 60 isn't particularly nice to me, regardless of how long it's been since i played a 60fps game. Will say FPS above 60 is very very nice
Sonic Unleashed is capped at 30 on Xbox but 60 on PS3- but they both average 20 so the PS3 just looks more unstable. If the game is actually hitting 60 there's absolutely no reason to prefer 30 tho lol
Firstly, love the work here, I love seeing the path not taken.
Secondly, from experience, yeah, pushing the VRAM bandwidth higher will for the most part get you more extra performance than a faster core. RSX was incredibly bandwidth starved, especially when you used multiple buffers to combine results. Considering the PC GPUs it was based on typically had about double the bandwidth, it was obvious from day 1 that it was going to be a major bottleneck. You could do a LOT of data shuffling of multiple buffers just to end up with the final 27MB buffer displayed on screen (if you are 720p native). If you could eliminate a buffer or reduce the bit depth of one, that would yield decent results every time. Simply put, rendering pixels to RAM was slow. This is the exact same issue with the Switch currently; this is why Botw/TotK have major performance issues. Decent GPU core, but you tried to keep it away from writing to VRAM for as long as possible. It meant that there would be a lot of idle time if you were bandwidth bound. The Xbox 360 didn't have this issue thanks to the EDRAM - this is a big simplification, because you have other issues such as a shared memory bus with the PPC and having to tile your inputs/outputs to fit said EDRAM - but when blasting pixel data to it, the GPU could do so much more freely.
That is not to say that memory bandwidth was the only performance bottleneck, but it was a considerable one. It is impossible to balance a GPU (or any system) perfectly when the kind of work loads it will get will be all over the place. You can go from buffer heavy stuff like Crysis 3, to the elegant artistic driven beauty of Burnout Paradise. There is no winning move just selective compromise.
As a bonus, to balance the argument, if you had to do render-to-texture elements, RSX was actually kind of decent for this compared with the 360, not because it was great at it but because the 360 GPU SUCKED at this stuff, as it basically forced it to render directly to RAM. It would be like switching the cache off on your CPU. Once you stepped out of the EDRAM space, you were in a performance hole like nothing else.
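To put a rough number on that buffer shuffling (illustrative figures only; the MRT count below is an assumption, not how any specific PS3 game laid out its buffers):

#include <stdio.h>

int main(void) {
    double px = 1280.0 * 720.0;        /* 720p native render target */
    double mb = 1024.0 * 1024.0;

    double color_mb = px * 4.0 / mb;   /* one RGBA8 colour target */
    double depth_mb = px * 4.0 / mb;   /* one D24S8 depth/stencil target */
    int    mrts     = 4;               /* hypothetical deferred-style G-buffer */

    double gbuffer_mb = mrts * color_mb + depth_mb;

    printf("One 720p RGBA8 target: ~%.1f MB\n", color_mb);
    printf("Hypothetical G-buffer (%d MRTs + depth): ~%.1f MB written per frame,\n", mrts, gbuffer_mb);
    printf("then read back again for lighting and post passes, all over the same\n");
    printf("bus that also has to feed textures and vertex data.\n");
    return 0;
}

Even before textures, z traffic, overdraw and any MSAA resolve, every extra full-screen buffer you touch is a few more megabytes written and re-read per frame, which is exactly why trimming a buffer or its bit depth paid off.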
Another banger🔥
Thanks for a new video. As always, incredibly interesting stuff to watch.
YESSSSS, THIS IS WHAT I SUBSCRIBED FOR!!!!
felix...Pros here glad to see you upload again
Thanks for the lesson at the end of the video. I will be making the most of my CabbageStation 3 while it lasts!
What an awesome video. Very interesting to hear about your friend's perspective (soap opera effect)
Incredible video, congratulations!!! 😁
enjoyed the full video. You've really put good effort into making the video.
Ain't no way I'm watching a video about overclocking a PS3 in this scorching heat.
Hmm... you just gave me a video idea. THX!
@@ripfelix3020 Oh. Well, I'm glad I helped.
Amazing video! I liked basically everything in it
I'm not sure if you mention it later in the video, but one of the most important things that determines how playable a game is, is whether the gameplay logic/inputs run in lockstep with the rendered frames. When they do, the game just slows down instead of skipping frames and missing inputs. That's how MGS games on the PS2 and PS3 did it (and countless others), and that's basically an integrated slow-mo during heavy scenes, which I actually kind of appreciated.
That's completely different from the other way: if you have bad frame-times and errors occur in the rendering/animation pipeline, then a "high" framerate that is unstable and misses button presses can be infinitely worse, because it's invisible when the game decides to just lose your controller/keyboard/mouse input in between frames.
I'm not sure why things changed in the way games are made, but that's the main reason a low framerate was tolerable back then, and why getting proper control input and animations at or well above 60fps is so problematic now.
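For anyone curious what that difference looks like in practice, here's a minimal sketch of the two loop styles; every function below is a hypothetical stand-in for real engine work, not any actual console SDK call:

#include <stdio.h>
#include <time.h>

/* Hypothetical placeholders for real engine work. */
static void   poll_input(void)      { /* read pads here */ }
static void   update_sim(double dt) { (void)dt; /* advance physics, AI, animation */ }
static void   render(void)          { /* draw a frame here */ }
static double now_seconds(void)     { return (double)clock() / CLOCKS_PER_SEC; }

/* Style A: one logic tick per rendered frame with a fixed assumed dt.
   If rendering takes too long, fewer ticks happen per real second, so the
   whole game drops into slow motion (the PS2/PS3-era MGS feel). */
static void loop_lockstep(int frames) {
    const double dt = 1.0 / 30.0;
    for (int i = 0; i < frames; i++) {
        poll_input();
        update_sim(dt);
        render();
    }
}

/* Style B: logic decoupled from rendering via a fixed-timestep accumulator.
   Simulation time tracks real time; under load the game keeps correct speed
   but simply shows fewer rendered frames, and inputs can get lost between
   them if they aren't sampled carefully. */
static void loop_fixed_timestep(int frames) {
    const double dt = 1.0 / 60.0;
    double prev = now_seconds(), acc = 0.0;
    for (int i = 0; i < frames; i++) {
        double t = now_seconds();
        acc += t - prev;
        prev = t;
        while (acc >= dt) {   /* run however many ticks real time demands */
            poll_input();
            update_sim(dt);
            acc -= dt;
        }
        render();
    }
}

int main(void) {
    loop_lockstep(3);
    loop_fixed_timestep(3);
    puts("two loop styles sketched; the comments are the interesting part");
    return 0;
}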
the xbox 360 abstracted the hardware layer of memory management with an Intel ivybridge or some kind of bridge chipset to make handling memory inaccessible to 99% of programmers, who wouldn't know how to get into the chipset kernel.
cpu's are not useful any more. 1, single core, 1Ghz cpu is enough. All it does is run bus operations and remember the state of the program. Make it a risc cpu and you need half the gigs. The gpu is there to get the high res's and particle effects and a 512bit bus width bandwidth pipeline for 20GB/Per second of texture data etc. I'm trying to say I hate modern graphics and wasteful 3D pipelines. Because nvidia still isn't rich enough?
@@iwanttocomplain Wtf are you talking about. Tons of games are CPU-bound and not GPU-bound because their world-simulation is more demanding than the graphics.
@@Vellerize which?
@@iwanttocomplain Practically all strategy and simulation games are primarily CPU-bound, either because of AI players (Civilization, Total War, etc.) or because of world-simulation (Cities: Skylines, for example). Dwarf Fortress is the best example in my opinion, since it doesn't even need anything other than text graphics but will make your CPU choke if you have enough dwarves running around doing constant personality and thought calculations and stuff, all while the entire world is simulating the history of important figures, other civilizations, megabeasts, and other monsters and creatures on the surface or in caverns, or trees growing and changing over seasons, or water physics, etc.
Minecraft is probably the most well-known example of a CPU-bound game, though, due to the world being divided into so many blocks that have to be loaded and kept track of, and it can especially make the internal server struggle if lots of redstone machinery is activated. Lots of open-world sandbox games can be CPU-intensive for similar reasons to minecraft. No Man's Sky might be a good example since it simulates solar systems and their planets in real-time.
Any game that has realistic physics and AI is going to be CPU-bound. Half Life Alyx and other similar "realistic" VR-titles are like this.
But games don't have to be primarily built on CPU-intense features to be CPU-bound. A lot of modern games can be heavy on the CPU in general, and you can probably fit any modern AAA game into this category.
I can't list every game in the world that's CPU-bound, but the general rule of thumb is that if it has complex AI, pathfinding, world-simulation, and physics, it's probably going to require more CPU. In so, so, so many cases multithreading and multicore CPUs are vital to make these work, since they can run calculations in parallel. High clock speeds help it all run faster too. Single-core is really only useful in games that fundamentally require a causal line that couldn't easily be kept track of across multiple cores (but Dwarf Fortress is probably the only example of this in practice).
Digital foundry should've really collabed with you. Awesome work all around dude.
Lowering the voltage would be a better idea than underclocking
Yep, we can do that too. I didn't include it in this video, for time. There's a bunch of stuff I would like to have included but it was already dragging along.
@@ripfelix3020 2nd part?
@@ripfelix3020 Hi, I only use my 60GB PS3 for PS2 games. Should I underclock?
@@kiritowallace2plays If you love your console yes
Awesome video. Subbed!
Amazing video Felix
about the resolution thing, PS3 games can be programmed to change the render resolution based on the output resolution selected, so while it doesn't apply to all games, it most certainly isn't misguided advice that applies to 0 games. this is something Digital Foundry themselves have found during their 1080p PS3 games videos.
The PS3 was simply too expensive for me in my teens. I wanted one so bad, but got an Xbox 360 Arcade, which was like half the price of a PS3. Then I saw how much better most games ran. Everyone with rich parents at my high school had a PS3; that's why I started to hate the other side. They mocked me every day for having an Xbox and not a PS3, so I started to mock them back… that's how 90% of the war stories began.
this is an insane video! keep the good work up!
I hope super slim is gonna get an overclock one day
me too, i hope its sometime soon
People who use the term "soap opera effect" in relation to games and say "the framerate is too high" or "it's way too smooth" shouldn't be asked any performance-research-related questions, period. I applaud your overall ability to try and stay objective when discussing these topics, but the Uncharted test here is hard to take seriously; it should be done with someone with an actual understanding and a better eye. I'm not saying it wasn't interesting anyway, it's just that I'd rather listen to you trying to point out differences. Other than that, cool video as usual, though none are better than the YLOD one, because it's mostly free of your biases and has exclusively honest mistakes which don't take away from it one bit.
If I did that, it wouldn't be representative of the demographic I was targeting. The segment of the population you are talking about is a small minority of the larger group of gamers who know nothing about how the game works on a hardware level and instead just experience it. This is a truer representation of the opinions and preferences of the market average. What a company would need to know to target the widest possible demographic.
You and I are on the fringes when it comes to performance. Most people don't know what to look for or care about the same things. Although they are highly influenced by people telling them what matters. And that's why I was so careful not to hint at what I did or answer any questions.
I do understand the desire to believe your opinion is one shared by the majority, but that's often not the case. Nor does it matter. You can select the games and console or PC specs as you prefer, just as they can do the same for their preferences.
I was surprised by his comments, and my goal wasn't to correct him or disagree. It was simply to get his honest opinion, without interjecting my own bias into it. Then let you guys hear it unabridged and decide what makes sense or not, from your perspective.
That it was surprising reveals a dissonance between our preconceived notions and the reality of how others perceive the same issue. It can be very subjective, and in this case, it was.
Dude, fps directly corresponds to latency. In games where you need to react as fast as possible, like oh idk, first person shooters, it makes a huge difference. They are just plain easier to play in high fps because your effective reaction time becomes faster. That's why every competitive fps player on the planet lowers settings to the minimum, and plays on a high refresh rate display.
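To put quick numbers on that (frame time alone, ignoring display, controller and game-side delays, which all add on top):

#include <stdio.h>

int main(void) {
    /* The time between frames is a hard floor on how out-of-date the image
       you're reacting to can be. */
    double fps[] = { 30.0, 60.0, 120.0 };
    for (int i = 0; i < 3; i++)
        printf("%5.0f fps -> %.1f ms per frame\n", fps[i], 1000.0 / fps[i]);
    return 0;
}

33.3 ms vs 16.7 ms vs 8.3 ms per frame; everything else in the input chain stacks on top of that.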
Yes, but many people subjectively prefer 'cinematic' framerates, don't ask me why, I prefer high framerates where possible.
One other thing to consider is frametimes, a game running at 30FPS consistently will look smoother than one jumping wildly between 30 - 45 fps on a non-VRR display (and in the case of the PS3, on any display since it doesn't support VRR at all anyway).
That's why Microsoft chose to disable vsync in Skyrim on 360. It causes screen tearing, but maintains a locked 30FPS so your controller inputs don't get eaten by lag.
PS3 has it enabled and doesn't suffer from screen tearing, but the FPS drops can cause input latency that's arguably more distracting.
The majority of casual players would prefer a stable frame time matching divisions of refresh rate. 30fps solid or 60fps solid. Any other variance - you need VRR to avoid judder.
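A toy model of why the in-between framerates judder on a fixed 60 Hz screen (made-up frame times, and it assumes triple buffering so rendering isn't throttled by vsync):

#include <stdio.h>
#include <math.h>

int main(void) {
    /* Frames that take a little more or less than one 60 Hz refresh (16.7 ms)
       to render end up being shown for either one or two refreshes on a
       vsynced, non-VRR display, so motion judders even though the average
       fps looks fine. */
    const double refresh     = 1000.0 / 60.0;               /* ms per refresh */
    const double render_ms[] = { 14, 26, 15, 25, 14, 27 };  /* ~50 fps average, hypothetical */
    const int    n           = sizeof(render_ms) / sizeof(render_ms[0]);

    double done = 0.0, prev_vblank = 0.0;
    for (int i = 0; i < n; i++) {
        done += render_ms[i];                               /* frame finishes rendering */
        double vblank = ceil(done / refresh) * refresh;     /* becomes visible at next vblank */
        if (i > 0)
            printf("frame %d stayed on screen for %.1f ms\n", i, vblank - prev_vblank);
        prev_vblank = vblank;
    }
    return 0;
}

The output is a mix of ~16.7 ms and ~33.3 ms frame-on-screen times, which is the stutter a locked 30 (every frame held for exactly two refreshes) avoids.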
@@razmann4k You can subjectively prefer cinematic framerates all you want. You can even subjectively prefer not hitting your targets in FPS games. But the plain objective fact is that all else being equal the higher your framerate the faster you react.
@@Wooksley So what you're saying is, the higher the resolution, the higher the processing requirement? Thank you, captain obvious. Can I try it? You can subjectively prefer using a CRT TV and bust out your composite at 480p. You can even subjectively prefer 1080i mode in PS2 games. However, the objective fact is that the PS3 is an HD console, meaning 720p and 1080p. No hard feelings. I'm just kinda tired of seeing latency in fps conversations.
ahhhh its good to see felix upload a video :)
Bro your an absolute Legend as always... Kudos
*you're...
Thank you for all of this top quality content.
IDK man, I owned both day one and even on a CRT the 360 fps advantage was pretty noticeable. Plus most people I know were off CRTs years before the end of the generation making the resolution difference obvious. But I was one of the weird ones flipping back and forth between inputs.
Only having a PS3 you wouldn't care but that's more along the lines of "ignorance is bliss" than "an unbiased opinion".
"CRTs had better motion handling, contrast and blacks than most TVs today"
True! Only QD-OLEDs are now getting in the same range of image fidelity as a CRT.
I have my LG OLED hooked up to the same RGB-modded N64 that outputs to my Sony Trinitron via component: UltraHDMI to the OLED, RGB component to the CRT. The contrast is incredible on the CRT. The colors pop and glow. Looks like candy! I never wanted to lick my TV more.
By comparison the OLED is dull. Sharp, beautiful, and both have deep blacks. Lag has been mitigated with a RetroTINK 5X, but brightness is where the CRT shines. As for motion handling, way better. And the phosphors/bloom around them hide a lot of aliasing, blending the textures to look like they should. It's an all-around obvious thing when you see it, that this was the way the game was intended to be played.
A while back I hooked my Atari 2600 up and found myself getting into the game because it looked so good on the CRT. On a modern display it's such a hassle for a blurry box and a simple game. It was revelatory. I don't play Atari much anymore, but when I'm feeling nostalgic, the CRT is the only way to scratch that itch!
The 360 architecture is a little weird too, being a PowerPC-based CPU (as opposed to x86) paired with an ATI GPU, but definitely not as weird as Sony's more custom PowerPC and RSX GPU setup. Microsoft usually seems to play these things on the safe side, probably because their whole company was built on x86 and similar architectures. Seems it was the right choice as well.
I agree with that from a developer-centric perspective, but as a gamer I don't. Many of the most interesting and unique gaming experiences came from the limitations imposed on developers by the hardware and architecture. Their clever workarounds often contributed to the unique look and feel of a console. Like PS1, N64, Saturn... etc. They each have a unique feel and aesthetic that emerged from what most developers didn't like about developing games for them.
I would argue that easy development makes them less creative. Yes, it allows them to achieve their original vision for the game, but it doesn't force them to think outside the norm to accomplish things others didn't think of, in order to achieve an effect, or level design... etc. Those breakthroughs made some games stand out. We hardly get that sense from gaming anymore. It just feels like another iteration of something we've already seen, because there's no barrier to overcome. No limitation to showcase a developer's talent.
Case in point: look at The Last of Us compared to Uncharted 1. There is a serious step up in polygon count and visual fidelity as Naughty Dog learned to optimize the Cell architecture. They stood out as a shining example of what the PS3 could do if developers took the time to learn how. And TLOU is usually the #1 pick for best game of the entire generation, achieving something the 360 couldn't have matched even if the game had released cross platform.
The 360's CPU is actually the same core as the PS3's PPU - IBM secretly made the deal while Cell was still in development. PowerPC was attractive at the time because it provided the necessary performance per core without needing as many transistors as x86 = smaller dies = cheaper. They could have had a simpler Core 2 Allendale-based quad core with the same transistor count, but Intel would charge way too much money.
X86 was the pentium 4 when the powerpc reigned in 2006... yeah.
The GameCube and Wii were PowerPC-based too, and they also both used graphics made by ATI, so in a way the 360 wasn't too dissimilar from what Nintendo offered. That could have made development easier for many, as they had experience with those consoles
I never thought people would unlock these consoles this far, very nice.
Locked 60 FPS was supposed to be an industry standard ever since the 6th generation of consoles...
Computer tech couldn't deliver 60FPS on HDTVs in 2005/6. Not at a pricepoint a console needed to achieve.
That's simply impossible for all games in any generation.
Wouldn't be possible to get at HD resolutions and with a generational graphical improvement. If you look at 7th gen 60fps games they're either compromised (not HD, severely limited physics/shaders) or are essentially enhanced PS2 games, like remasters and such.
Not all games will be 60fps, but I understand; I just need games to be 60fps, or at least like Kirby and the Forgotten Land, which is 30fps but with input lag similar to 60fps. With that said, like John from Digital Foundry says: "it's a developer choice"
Where do I stand? If a developer chooses to make a 30 or 60fps game, the game needs to be close to that target 98% of the time!!
You never got a generation where all games ran at 60fps and expecting that would be unreasonable. Not even the SNES, NES, or Genesis could maintain a perfect 60fps or have all games target 60fps
I like the idea of your friend staying over and you roping him into this lol
I hope one day we can see a Cell processor OC to help out a bit on those CPU-bound games.
I found that a very good way to overclock safely is to warm up the PS3 by playing some games, immediately updating to raise the frequency by 50MHz, then jumping back in game. At some point, you'll end up with artifacts and/or freezing, at that point you just use the fan blowout function a couple times to cool the heatsink, or let the system sit for a while to cool, then revert back to the last safe clock; rinse and repeat! If you start with a high clock at low temperature and it locks up, you're forced to grab a flasher, but this way I always ended up having the console crash or artifact before it became impossible to update again! (Tested on a 2504B slim and delidded CECHG fat, slim can get to 750/900 and the fat to 650/800 before artifacting). Delidding and more tantalum capacitors seemed to give 50~100MHz extra on the fat, haven't messed with the slim, since it's always worked well and I'm in no rush to ruin it LOL.
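Purely as a sketch of that decision loop (the functions below are placeholders for the manual flash / play-test / cool-down steps, not a real tool or API, and the starting clocks are only meant to be in the stock ballpark):

#include <stdbool.h>
#include <stdio.h>

/* Placeholders for manual steps; there is no actual API like this. */
static bool flash_clock_mhz(int core, int mem) { printf("flash %d/%d\n", core, mem); return true; }
static bool runs_without_artifacts(void)       { return true; /* warm console, play-test here */ }
static void cool_down(void)                    { /* fan blowout or just let it sit */ }

int main(void) {
    int core = 500, mem = 650;            /* roughly stock RSX core/memory clocks */
    int good_core = core, good_mem = mem; /* last known safe clock */

    for (int step = 0; step < 8; step++) {
        flash_clock_mhz(core + 50, mem + 50);      /* raise by one 50 MHz step */
        core += 50; mem += 50;
        if (runs_without_artifacts()) {
            good_core = core; good_mem = mem;      /* remember it and keep climbing */
        } else {
            cool_down();                           /* recover enough to reflash */
            flash_clock_mhz(good_core, good_mem);  /* revert to last safe clock */
            break;
        }
    }
    printf("settled at %d/%d\n", good_core, good_mem);
    return 0;
}

The key idea is the same as the comment above: keep the console warm while testing so a clock that only fails when hot gets caught before you commit to it.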
Some games on the PS3 actually render at a lower resolution if you change the system resolution
Feel free to add a list of games that do render at lower resolutions based on system settings. I'm interested in whether this is a trade-off people are truly willing to accept.
@@ripfelix3020 some off the top of my head:
Bayonetta
MGS4
GTA4
DMC4
GT5/6
Ninja Gaiden sigma 2 (possibly 1 too)
Sonic unleashed (and 06 if you dare)
Rainbow six vegas
Maybe all BioShock games, those allow for unlocked framerate too
@@ripfelix3020 the PS3 had a notoriously shit HW scaler, so it was up to the developers to use it or not, hence many games switch output to 720P while the console is set to 1080P, while the 360 is constant.
@@ripfelix3020 looking back, I personally should probably have gotten a 480p CRT and a good RGB cable, instead of the expensive LCD TV I got at the time.
@@ThePsychoticWombat An HD CRT would have been even better as many of those could do 480p, 720p, and 1080i well
Damn, you are surely a mad man, AND I LOVE IT🎉
I had a 360 at launch but bought a used PS3 many years later to play the PS exclusives. They are both great and I would have been happy with either.
That PS3 is still getting used today because emulators aren't quite there yet and I can't stop playing Gran Turismo 6. The 360 got replaced with an Xbox One and the backwards emulation is so good I don't miss it.
Can the Xbox one emulate every 360 game?
@@Doobydoe No both for legal reasons and limitations in emulation
@@crestofhonor2349 damn that sucks
@@Doobydoe No, less than half the titles, but all the good ones are there.
Comparing PC and PS3 back then, for me, was like comparing your regular Honda Civic with GT track racing cars... sure, they happen to have things in common like an engine and wheels, but it's a whole different league that not many people can afford, so there's no comparing them
Watercooling next? Come on, 1GHZ ps3 HAS to happen.. X)
The 7th generation was supposed to be a push for real-time lighting and other things. It was a generation where everyone wanted to make worlds more dynamic, not just from a lighting standpoint but from an interaction standpoint too. Yes, performance was terrible, but it was necessary, as the 7th generation laid down many techniques that would later be refined in the 8th generation. The 5th generation was the exact same way, with the consoles there pushing new techniques, namely 3D capabilities that could be done on the system itself and not just with extra hardware inside a cartridge, or with other various compromises like untextured polygons or faking 3D with sprite scaling/super scaler games.
I don't really hate on the PS3 and 360 for that reason but they were important as many technologies that are still used to this day made their appearance on these consoles or at least were popularized. Shadow maps, screen space reflections, ambient occlusion, and many more techniques. Plus, post processing became very popular around this time too. The 8th gen consoles took all these technologies and refined them to a point where many of their flaws were being fixed.
Most 60fps games on the 7th gen consoles tended to look closer in technology to 6th gen games with many refinements to make them look visually superior. They often didn't push for real time lighting or super dynamic worlds
Umm, fun fact: depending on how you softmod and hardware mod the Xbox 360, you can overclock it, but I would take the motherboard out of the 360 and put it in a PC case with good airflow.
If you don't do the following steps, then there is a risk of a fire. 360s did run hot. I watercooled the 360 and I used to overclock it.
Overclocking consoles does shorten their lifespan, so just be aware.
I had game cube ps3 and 360 and modded all of them.
Oh really! I would like to check this out. Can you point me to a source of more info on this elusive 360 overclock?
@@ripfelix3020 I will show you soon. I will update you soon. Give me a day or two.
Update: found some of my old notes, still looking for my other notes, but you need a 360 Falcon or Jasper version.
Some games do render at whatever your PS3 resolution was set to.
Skyrim ran much better at 720p rather than 1080i for example.
It all depends on how the game was made.
Tho ever since PS4, all games render at whatever the devs decide regardless of output resolution.
games aren't about escape. i always hear that but i never agreed with it. games are about fun and enjoyment. you can have that playing very realistic simulators. now, some devs like to say their games are not meant to be fun or enjoyable (cough TLOU2 cough) and i think that's them failing at making games, flat out.
I recently had a major carrer setback that taxed me emotionally to the limit. Gaming is absolutely an escape. Anything that can take your mind off that flood of negative emotions is desperately needed relief.
When everything is fine, sure. It's all entertainment and games. But when it comes crashing down, escape is priceless...when you can achieve it!
@@ripfelix3020 "that flood of negative emotions" Established fact nominal human mind condition?!
@@ripfelix3020 you talk like a florid ad for some pointless cream or something.
@@ripfelix3020 Disgusting. Journey is not a game and neither is it entertaining. People must be confusing pretentions with actual foundational and tangible design elements that are not theoretical.
@@ripfelix3020 Did you mean Carer or Career? If it's your career, you should not be emotional about that because it is just money. If your carrier had a setback, have you checked the resistances?
oh thank you ive been very curious about this subject and not enough has been done about it
I just wish Sony gave us ANYTHING that couldn't have run on the Xbox 360 with optimisation.
Even the best PS3 games like The Last of Us could've been optimised to run on the 360, just as the best 360 games could've been for the PS3.
And when the offerings are the same, price is king. To me, that made the 360 the overall winner.
I see this point and actually agree with the value-based conclusion. RN 360 games are dirt cheap and generally run better, so from a gaming standpoint the 360 makes more sense, if you have to choose. But we don't anymore. A slim PS3 can be had for about $100 and the most popular PS3 exclusives are inexpensive too.
If it's cross platform, get the 360 copy. Then enjoy both consoles' exclusives. Both consoles have a proud place on my entertainment cabinet to be enjoyed together.
It's a great time to be a 7th gen gamer.
@@ripfelix3020 Absolutely, I collect for both of them right now.
I got some great Xbox games on deal, but also really enjoy the Ratchet and Clank Future games on PS3.
For first party the PS3 was amazing; I collect it almost exclusively for first party games, with a handful of exceptions of course. Also for streaming to my PS Vita.
For its own exclusives and third party games (98% of the time anyway) I also collect for the 360 since my 6th gen collection is basically finished.
32Bit console era started with the Fifth Gen consoles on the Sega Saturn/PS1 and ended with Xbox360/PS3. We just barely left the 32Bit era and jumped to TRUE 64bit consoles with the Xbox One and PS4, and even to this day with the Series and PS5. The N64 was mostly 32Bit with some 64bit executions. The jump from 8, 16 and 32Bit was quite fast, but we will be stuck to 64Bit consoles for a good while.
And there comes the 6th gen era which claimed it was 128bit even though it is 64bit cpu+ 128bit vector processor
@@gnrtx-36969 hmm I think it was mostly just people assuming they were. But if I recall, the NGC was a 32-bit PowerPC CPU at over 400 MHz, while the Xbox was an x86 32-bit CPU at 733 MHz. I don't remember exactly, but the PS2 was a MIPS III CPU, mostly also 32-bit with some 64-bit executions, running at around 300 MHz. The PS2 pulled a similar trick to the N64 to give the illusion of being more bits than it was. But all consoles after the fifth gen were 32-bit, and they all died at the end of the 7th gen. Even the insanely fast Zen x86 CPUs inside the Series and PS5 are still 64-bit. That's going to be a thing for quite a good while, as 64-bit memory addressing will take a while to become obsolete.
@@EtaYorius hmm also weird thing is PS3 has 64bit Ppu with 8 128bit vector processor
cell CPU was more like a vector processor than a CPU right?
@@EtaYorius wait didnt emotion engine have 2 vpus though?
@@gnrtx-36969 It did, both with 64 Bit executions, hence why people thought it was a 128Bit CPU, but was mostly still 32Bit.
Jeez the new reddit website is so space inefficient lmao
Felix you are the best!
You're full of copium. During that time, if a game was released on consoles and PC, the PC was the better version in 99.9% of cases, full stop, it's a known fact; the PS3 and Xbox weren't powerful enough. In 2006 you could get an 8800 GTX for the price of the PS3, and in 2008 the MSRP of the 9800 GTX was about $250. Also, CRT monitors and HD CRT TVs were a thing.
The previous consoles targeted mostly 60fps due to CRTs.
nope. pc is more expensive. you needed more than a 8800gtx to game lmao
@@crocodilegamer93 Yeah the initial cost is higher but you could offset it by pirating so. During that period you would just buy a core 2 and slap it on basically whatever motherboard you want, overclock it by 40/50% and you're done.
The PS3 and 360 had effects that couldn't be done as deferred rendering wasn't supported on most GPUs. The first GPUs to support it were the Nvidia 8000 series and previous gen GPUs couldn't support it. The PS3 and 360 had some advantages over the PCs of the time even if they were quickly surpassed in a few years. There were also games at the time that had better visuals on console vs the PC
@@SviatoslavDamaschin Most people didn't know how to pirate then and still don't know now
Can GTX 9800 play last of us lmao?
videos like this make me soo happy ive been a pc gamer exclusively.
I can't do WASD for character movement (walking). It's a terrible substitute for an analog control stick. The mouse is natural for aiming though. Give me a Wii-style nunchuck in the left hand and a mouse in the right and we'll call it even.
That's possible now, but was a bigger hurdle in 2007. Many games on PC forced keyboard/mouse and I couldn't make the transition.
Console wars are about having fun trolling fanboys of the rival console. Its like sports team fans taunting each other. What's so wrong with that?
Nothing really, as long as it's kept in good fun. Some people take it too seriously.
The grand wizard of PS Thrizle is back.
An internal 1280x720 will always look decent since it can be perfectly integer scaled to 4K (x3 in both axes). Unfortunately all games under 720p will look like crap on modern TVs, especially GTA 4 and RDR 1. The situation is flipped for Battlefield 3
They can*
*but they won't. Most people don't have a Framemeister or a similar upscaling device that can handle 4K output and HDMI, and most TVs are going to apply a bilinear filter and make the screen look like you covered it in Vaseline
@@lula4260 Which is a real shame, I can only imagine how much better it would be if TVs included more scaling options (Nearest Neighbour and Lanczos)
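For reference, integer scaling itself is nothing exotic; a minimal nearest-neighbour sketch of the idea (not what any TV actually runs, just the concept) looks like this:

#include <stdio.h>
#include <stdlib.h>
#include <stdint.h>

/* Nearest-neighbour integer upscale: every source pixel becomes an exact k*k
   block, so 1280x720 maps cleanly onto 3840x2160 with k = 3 and no blending.
   Bilinear filtering, which most TVs apply instead, averages neighbours and
   is what produces the Vaseline look on low-res sources. */
static void integer_scale(const uint32_t *src, int sw, int sh, uint32_t *dst, int k) {
    int dw = sw * k;
    for (int y = 0; y < sh * k; y++)
        for (int x = 0; x < dw; x++)
            dst[y * dw + x] = src[(y / k) * sw + (x / k)];
}

int main(void) {
    const int sw = 1280, sh = 720, k = 3;  /* 720p -> 2160p */
    uint32_t *src = calloc((size_t)sw * sh, sizeof *src);
    uint32_t *dst = malloc((size_t)sw * k * sh * k * sizeof *dst);
    if (!src || !dst) return 1;

    integer_scale(src, sw, sh, dst, k);
    printf("%dx%d -> %dx%d (each pixel repeated %dx%d)\n", sw, sh, sw * k, sh * k, k, k);
    free(src); free(dst);
    return 0;
}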
Cool video, good comparisons, love stuff like this. I'm one of those PC guys and love my games as "fast" as they can be; it just feels better to me. It's hard for me to play anything at 30fps anymore, it feels sluggish and I can notice the frames.
Can you OC a CPU? ;-)
Not yet unfortunately.
Bae wake up Felix just uploaded
wow, "unbiased opinion" but you're literally butthurt about the ps3 being slower almost 20 years after their launches 😅i think you meant to say biased.
now for the wall of text barely anyone will read:
i disagree with your comment on frame rate. i agree that "unplayable" is a term tossed around too loosely nowadays (like hardware unboxed saying anything under 60fps is unplayable, or DF saying that anything with any stutter, like shader cache stutter, is unplayable... i roll my eyes every time). The threshold for unplayability is actually 18fps in my extensive testing. if you're really desperate to play a game, it's still somewhat fun at 18fps. anything below that becomes a source of migraines if the game is 3D, even 17fps. it's like a hard cut. that comparison you showed, "playable", "meh", "enjoyable", was on point. actually pretty accurate. Anyway, my point is, bad fps is bad no matter the era. ps2 games generally ran much better than ps3 games and aged better for it.
But let's be fair here - low FPS only really started in the 3D era. games before that used to be all at either 30 or 60fps, on the 16bit and below consoles, with very rare exceptions. arcades also ran stuff blazing smooth. And these terrible FPS were, in my opinion, poor dev practices, because these consoles could very well put out some great looking games that run at 60fps, like call of duty 4, gran turismo, forza, among many others. as a result, those games aged WAY better than anything else - the graphics all look dated anyway, but the 60fps with all of its smoothness and latency benefits are still fresh for reaping - the games still play great. 30fps games are good too as long as they don't drop too often.
I come from a pc + console background, i bought my first pc in 1994, and i had my first high-fps experience in late 1997 - with my pentium mmx 233mhz being able to run quake 2 at 85fps in 640x480 with my voodoo2 - the smoothness was something unforgettable. but i was never spoiled by it to the point of saying anything less is unplayable - heck, two years earlier i was content playing quake 1 in a small window at 20fps. obviously i wasn't happy about it, but i was content. it was playable, and that's what mattered most.
Somewhere in the middle of the 2010s i traveled for the holidays and took my shitty laptop with me. it rained every day, and i wanted to play forza horizon 3 (i think it was 3...). it ran like crap on my poor laptop. i wanted to lower the resolution down to 640x360, but the minimum resolution the game allowed was 720p (why? - it'd run awesome if i could lower it further!). it was unplayable, around 12 fps. i tweaked a bunch of settings, changed my camera to bumper, overclocked the hell out of my laptop and cooled it externally - i got it to play at 18~30fps. there it was, completely playable, and i was able to have fun with it.
anyway, nowadays you can emulate all these games and run them at better performance and resolutions, and that's enough to bring some of them almost to par with early ps4/xbone games.
Absolutely agree, from a spoiled modern perspective. But I remember the N64 as a vivid memory burned into that special time of a kid's life. The time where every game, movie, and song you saw in that short 4 or 5 year period is remembered for the rest of your life.
I didn't care about the FPS drops. My friends didn't care. We were too busy destroying controllers spinning the joystick to win at tug-o-war in Mario Party, or getting the golden gun to own in GoldenEye. Low frame rates and drops from unoptimized code were just... incidental to what was an otherwise awesome and memorable gaming experience I cherish.
So when a young whipper-snapper craps on my childhood, saying the game is unplayable and trash because of the frame drops, it hits me in that special place.
But I don't disagree that it has aged, and going back to play them now is a bit jarring. It's just that when I play them now I overlook these minor issues and accept them as part of the charm. Just as I did as a kid. Instead of looking to be an elitist and crapping on the classics.
I agree 18FPS is right at about that limit for me as well, where I start to notice the FPS drop and it intrudes on the immersion. But I do notice the input latency before that. I'm sensitive to input lag, and unless vsync is off, allowing screen tearing instead of slowdown, I notice the lag more than the tearing. Skyrim on 360 is better for that reason IMO.
I can agree with DF on the stutter. Constant stutter is unbearable and can really take you out. Shader stutter is annoying to deal with. I do disagree with Hardware Unboxed saying anything under 60fps is unplayable, and I hate it every time they say it. I get that there are some types of games that are best above 60fps, but that's not all games. People have really only recently started to care about fps.
Such a good video!