Try this - go to Settings -> Cores. Turn off 'Allow Cores To Switch The Video Driver'. Exit Retroarch for the changes to take effect, then try the video again. If you don't see that setting, go to Settings -> User Interface and make sure 'Advanced Settings' is turned on. Here's what is happening when you leave that option on: it switches from Vulkan/D3D12/D3D11 to OpenGL when it tries to play a video with the ffmpeg core. Why does it do that? Prob because the core has some OpenGL-exclusive feature (some frame interpolation setting) so it makes the 'assumption' that you want to use the OpenGL driver instead. For the ffmpeg core if you don't mind the frame interpolation option I think it's best to just run it in the native driver (Vulkan/D3D12/D3D11/whatever) instead. I'll try to see if we can get rid of that video driver switching for ffmpeg. Anyway the reason brightness would be completely off vs. D3D12/D3D11/Vulkan is prob because it doesn't support HDR at all.
Is someone working on making it for it to look as good but on lcd and 60hz 4k? Not all of us have oled screens and CRTs were certainly not oleds and most of them were 60hz.
OLED picture quality, CRT organic feeling and minimal eye strain. THAT'S the future we were promised and denied during the CRT -> LED transition and robbed when Plasma was abandoned. We're back on track, finally!!
How much input lag is there with this shader? I'm used to low input lag from 144Hz 144FPS gameplay, but do I get that low input lag with the CRT beam simulation shader enabled?
The mid 2000's was such an awful Time for HD TV's. I can't believe people actually bought those first wave junk LCD TV's, just because they were light, slim, widescreen and offered 720p HD. Yet the motion smearing was god awful, black levels were milky grey, viewing angles were terrible, colors were artificial and the input lag for video games was noticeably higher than the 10ms we're getting now. I knew better. I bought myself a brand new 32" 2006(Or it might of been 2005) FullScreen Sony WEGA Trinitron SDTV CRT at BestBuy back in 2006, and paired that sucker with component cables with my DVD player & Nintendo Wii. Best overall TV i've ever owned period, edging out my 1999 32" JVC D-Series. With the WEGA, there was no input lag, perfect blur-free motion clarity, No black crush, amazing RGB colours and it was the brightest CRT i had ever laid eyes on.
@@bigjoegamer if you're already getting 144fps on 144hz this won't help you. This would help anything below half that framerate, so 70fps or lower would be helped with this.
I like how much you brought up this being a passion project for the creators. It’s the definition of love. To do something for the benefit of others without expecting anything in return. I love this hobby. Such a cool space. Hope you had a happy holidays man.
Tried that shader and it wasn't as dim using this as it was with Lottes. I can't remove that scrolling shaded bar though. Turning the LCD setting off stops it moving but it remains. Anyway to remove the fuzzy effect when moving with this shader too?
Comments dont understand that this isnt BFI. BFI is one method to achieve a similar effect. This is much more advanced and you dont lose as much brightness
i'm guessing BFI just replaces every other frame with black frames, effectively halving the framerate of whatever is on-screen. This seems to take a 60fps video or game, cuts each frame in half vertically, and displays the top half first and bottom half 2nd at 120hz, or 120fps. So, it's similar to interpolation but it's not showing every other "line" it's showing the top half of the screen then the bottom half of the screen. So, i don't really see how this would help a source that is already 120fps, so if you're already getting 120fps, reducing that back down to 60fps just in order to enable this would not seem to help motion clarity or anything really, in fact, it would be objectively worse, and while anyone can have an opinion, you're still losing frames. It could help a 60fps source but then you must have a 120hz or better display for it to work properly.
@@tirkentubeYou don't see it because you don't get it. It is not supposed to be using on a source you can fill your screen hz with. Simple as that. It is for lower hz sources than you screen have available. Like 30hz, 60hz games or even lower like 24hz movie and show content. If your screen can at least double that. It is always better to display 120hz as 120hz or what ever.
@@cajampa you re-stated precisely what i said, answered my rhetorical question (like an idiot would), and then acted rude about being dumb enough not to realize that i already knew the answer to my own question, you know, cuz i ALREADY SAID THAT.
Ill back that up. Also there is something i noticed with the LG panels that have 21:9 aspect ratio options under game optimizer. If you have any retro console hook it up to an HDMI converter and see the difference. it has a scanline like filter over it that also makes the input signal clean.
That's actually the solution they should implement in any modern game also. An option on games to enable a per-line/per-segment (adjustable) refresh, so to have good to perfect motion resolution on any S&H display. Applying that to the game's native rendering, you'd also have an immense performance gain (like interlaced rendering).
Not likely to happen as it means you need a high refresh display. For console it would require using a 120hz display so the system would need to turn it off when used on 60hz displays and disable when turning on high performance modes. That's just not going to happen. It's a cool trick for PC where tinkering is the name of the game, but console it's very unlikely to ever show up. Honestly I see the biggest future for it is to be added to Nvidia's freestyle filters or reshade. I don't even see it as very likely to come to Nvidia or AMD drivers as it's a bit complicated and needs calibration per display and per frame rate.
Yep it's working! You need to play a real game. I tried with Vulkan on the Flycast core and a dreamcast game on a regular VA monitor. You need to go into shader parameters and max the brightness vs clarity option to 1, and reduce the gamma to the minimum, to get normal image colors. Otherwise yes the brightness and colors looks cooked, if the brightness is too bloomy try reducing the clarity options, but it feels like Gamma at 1 works best. Works in SDR and HDR, retroarch has an HDR option that need to be set first. Need 120hz set on monitor as well, I tried at 144hz and it just doesn't work for 60FPS content. So it's pretty good but I need to see if the image quality issues this causes get resolved at 180hz or 240hz, if it does look better the higher the refresh rate is then this shader is the real deal for sure, if it doesn't then I'll still ask for hardware rolling scan BFI on my monitors instead Also, this shader doesn't like framegeneration at all, so I will still prefer to use my C1 to get 120fps with BFI for 60fps games instead of using this shader. But it's very promising, someone with a 240hz OLED monitor need to try and tell us how it looks ASAP!
Thanks for sharing. I'll try that but there's something wrong we have to figure out. Others are telling me that games look perfect, I have to download a game
I just cannot get it to work properly on my 360hz oled. Turning off HDR and VRR does not do the trick. I set subframes to 6 and FPS divisor to 6, the resulting is a pulsating, strobing screen with flashing colors.
@@Bugatti12563 I made it work on my 240Hz Oled, but I find this extremely finicki. You can't have HDR, Gsync active, then essentially close every or most software, the slight framerate variation generates flickering and other issues. Having more than one monitor also introduces flicker, I love the idea but I feel I need to sacrifice too much every time to be able to play.
@@Hoytehablode What core did you try this on? For nes on mesen I get flickering because it is more resource intensive and causes slight frame rate variations but fceum works fine
CES there is 300hz 1440p 24.5 inch IPS LCD, 600hz TN 1080p msi, 750HZ TN 1080p koouri, 1440p 360hz gsync pulsar monitors, 500hz 1440p OLEDS, 4k 27 inch 240hz oleds. we eating good this year 😀😀😀
Just don't buy any of the LCD trash especially a TN no matter how high they advertise that refresh rate. I keep my old TN as a secondary to remind me to never buy another one lol
Finally, i told everyone 10 years back to do this. Instead of half a screen I suggested to show the pictures in possibly alternating pixel line for line. If that did not work Id suggested alternating thicker lines which might be more precessing friendly. Then my third option would be 1th the resolution in a checkerboard alternating each frame.
@ in the time I suggested this we had 100hz screens, since I am from a pall region, id always told we could use those extra 50 frames for this, since I was unsure what would work best, and never actually tried it, I am insure about all other tech that could help with the motion clarity. But its close to the concept I have in mind for years now! The same counted for crt filters to atleast give a high pixel count lcd a phosphorous look giving 240p/480p back its charmes.
@plasmatvforgaming9648 seems like best case scenario would be all new & recent games. Borderlands 3 is my newest game I play. I play even older ones more.
Thanks for highlighting this awesome feature. I think you're covering an area of image quality that is vastly underexplored for gaming. You should definitely do a follow up video testing some retro games in different genres via emulation. For example, how does a platformer like Sonic 2 look vs a racing game like Gran Turismo 4 vs a fighting game like Tekken 5? If you need help, Reddit can guide you to find everything you need for emulation.
@@plasmatvforgaming9648 Retroarch is just a frontend for emulation. It uses various "cores" for each console. So all you'd have to do is find some roms online - plus maybe some BIOS files - and then boot up your game. Reddit and RUclips will point you in the right direction.
I like the enthusiasm in this video. Feels like I'm revisiting my old friend where he was always upto something new he's very fond of and finally managed to achieve it
I've been messing around with this for the past 2 days on a non-OLED 240hz monitor, and the motion looks very choppy and stuttery compared to the smoothness that I get with BFI. Maybe i'm missing something.
Your content is great man but I wish you made some videos that are super short and just to the point, settings and effects and that's it. Those lengthy videos make it impossible to find specific information
@@plasmatvforgaming9648 I did try it without turning adaptive sync off (in the monitor) and it works, but unfortunately I have to force 120hz in retroarch which gives a slight flicker. Using the shader sub frames at 180hz looks good but causes the audio to go out of sync. Wish I could use my native 165hz
Can't TV manufactures add CRT or scanline simulation to get better motion clarity? I just want something that can match the fluidity of my old plasma TV. Especially for 60fps content.
LCD TVs with back light manipulation can be a lot more clear than Plasmas. I tested a Sony kdl-47w802a and it was 4x more clear than Plasma. It was as clear as a CRT at just 60fps
I'm wondering if the retroarch people could use their rolling scan method with the color control app or something like it to fix lg's high bfi flicker when playing 50/60fps using motion interpolation to 100/120fps, it makes no sense that they didn't just use the same process as 100/120hz without interpolation
This tech can be implemented by displays and the chief says he could do it better having more hardware control vs just the software. But for this to work BFI has to be off and this can't change the rolling scan behavior of the TVs. It just creates different versions of each frame to simulate the CRT rolling scan
This just made me think that software bfi like this could be used with hardware bfi so you could get 120hz bfi to work at 60fps without double imaging.
Can you use this for movies too or is it just for games? 24 fps movies and tv shows on sample and hold displays is rough, man, older lcds were actually better because pixel response was so slow that motion and camera panning was blurry, which feels more natural. Now with fast lcds and oleds we get motion stutter. Man, if the camera is panning and there is an object with vertical lines or edges, especially against a bright background, the image becomes a mess...
You can already emulate a double or triple shutter old style projector like in RetroTink4K via 48hz/96hz or 72hz/144hz You can also potentially emulate a 48hz or 72hz Plasma TV with their phosphor decay LCD blur can be simulated too and in fact some OLED TVs out there have an option for something like this
From a Chief reply: "It’s an algorithm that needs to be implemented by software developers into apps. Hopefully some media player author implements this for great playback of 60fps content, maybe ask the VLC or MediaPlayerClassic or other people, about implementing this? Possibly could be done as a DirectShow filter, as long as you remember to make it permanently run the shader every Hz (CRT emulation must successfully modify every refresh cycle), indepedently of the video frame rate. A software developer can change an internal native:emulated Hz ratio dynamically to match content framerate, e.g. for 240Hz displays automatically use a ratio of 5 for 48fps material and a ratio of 4 for 60fps material. The native would stay unchanged (unless you switch your real display Hz), but you can dynamically change the emulated Hz to simulate a 72Hz CRT for 24p material! So I welcome VLC or MediaPlayerClassic to implement subframe-capable plugins to allow me to do 35mm BFI (ON-OFF-REPEAT-OFF) for double strobing, or for CRT simulation, or other multiple-passes-per-movie-frame feats. RetroArch just implemented it already into a developer release, it’s likely the next public release of RetroArch will have it."
How does this affect the lifespan of the TV? I'm not sure if that makes sense, but wouldn't the turn off and turn on of the pixels reduce their lifespan?
Not gonna lie, this is cool, as a retro gamer and lover of CRTs and OLEDs and motion clarity this is huge but the whole video I was wondering with that accent where you're from since you sound just like me, I'm Cuban btw lol.
I have 2 questions for this : 1- Are oled tvs of the future coming with this feature will have different picture type or it just improve motion ? 2 - what the benefit of having 480hz panel if you only can play to certain framerates and even gonna have big drops ?
60fps 480Hz CRT simulation =480 like motion for eye tracking (2ms of persistence equal to 2 pixels of motion blur when moving at 1000 pixels per second)
It is a gameplay video, not the game running. The fix to the Gamma is to turn off let the application change the API (I showed that in the video after this)
Are you using HDR? This doesn't work well with HDR for the moment. We experimented with including it in Lossless Scaling a few days ago and it worked lol, but any tiny variation in frame time results in heavy flickering due to desynchronization. It would need advanced algorithms to adapt to frame time changes, but it might be very interesting for the future!
LG tv's actually use this now. My LG 120hz model does this exact effect when some motion clarity setting us turned on. It helps but isn't perfect. I verified this by recording my tv in slow motion mode on my phone.
@plasmatvforgaming9648 Then there must be some illusion with how my phone records in slow motion because the video has big rolling black bars. I'd be happy to be wrong on this because it's not nearly as effective as perfect BFI.
@@MrPragnienie1993 yeah id rather not have it as a shader because the games I'd want it on would also include pvp competitive shooters. Which doesn't always work with reshade
@@JasonJtran I'm not sure how it being an app instead of a shader would be benefical to you in this case, it would most likely be flagged by an anti-cheat regarless of what form it has.
I wonder if it could work on Steam Deck/Linux. Gamescope does support reshade. Maybe due to open nature more system-wide implementation could be possible
Since you don't have Twitter or any other platform I could tag you on, I have to write a comment under your video :) New infos are out on the new LG OLED lineup 🔥 Looks like the G5 is going to be insane.
@@plasmatvforgaming9648 My guess is that it's more of a niche feature, maybe LG thinks it's not worth the development time, I'm not sure? CRT emulation sounds interesting though.
Doesn't it trade of brightness for less motion blur? If only half the screen is illuminated at any given time, it means you get only half the brightness, right? The blurbuster article you showed is also stating, that this is ideally used in SDR mode.
Interesting times, though it does mention in the github that an integer division of frames works best even if it's not mandatory, which makes sense. I assume the gamma settings on your TV and in the software match?
I like this video but could you please explain what this shader is even actually doing? How is simulating a CRT going to actually make motion any smoother, that doesn’t make any sense.
It reduces the frame visibility time instead or showing the frame for the whole duration of the frame time (it turns of the pixels for a moment). Motion clarity is limited by persistence based on the Blur Busters Law.
Motion clarity is limited by persistence. What that means is that the motion clarity of any display, regardless of technology, cannot be better than its persistence. Persistence is the frame visibility time. For sample and hold displays, the frame visibility time is as big as the frame time. For example 120Hz means 120 frames refreshed each second, and sample and hold displays show each frame until the new one comes so 1/120×1000=8.3ms of persistence. 1ms of persistence is equal to 1 pixel of motion blur when moving at 1000 pixels per second (Blur Busters Law) 8ms of persistence is equal to 8 pixels of motion blur when moving at 1000 pixels per second. Finally, just understand speed in pixels per second to understand the statement (although speed can be measured in different units such as screen width (time it takes an object to cover screen width)). The speed 1000 pixels per second would make an object moving left to right to take 4 seconds to cover screen width on a 4k screen that has 3840 pixels horizontally. So at 1000 pixels per second in a 4k display it takes approximately 4 seconds for an object to go from left to right.
I'm very excited for this CRT simulation to become available on displays, apps, etc, but I wonder if there will be enough brightness on current OLED displays to use the higher refresh rates (360Hz, 480Hz, 500Hz, etc) and get low persistence AND still have a good looking image. Motion quality is very important to me, but image quality is equally important. That's what makes CRTs so magical. They have incredible motion quality and image quality at the same time. My LG CX, on the other hand, has pretty decent motion quality with BFI turned on to the highest setting, but I rarely use it because it causes the image to become dim and dull. WOLED already have inferior color brightness and with BFI colors just look absolutely lifeless. If I can get a QD-OLED with glossy screen, 500Hz refresh rate, this new CRT simulation capability and still get around 100 nits of brightness with rich colors I'll be in heaven.
When Samsung finnaly make their RGB-OLED displays available, which are triple emiiter OLED (red/green/blue OLED emitters) instead of just single emitter WOLED & QD-OLED, they will be plenty bright enough, RGB-OLED is over two thirds brighter and higher colour luminance, not to mention has a much higher subpixel resolution (the defacto resolution metric), in fact, MVA RGB-OLED should see 4K nits of luminance, and that's pure colour luminance, no filtering or conversion and luminance lost, which should make for incredible HDR performance, and a really good full-screen uniform brightness.
This is just coincidence to me. I read in tech news that the Windows 11 24H2 update makes games crash and colors in games are off. But that would be a total other issue right?
I've tried this in Retroarch, and for me it's a no go. There is very obvious image retention, like the screen of a old arcade that has been running for years, no matter if the option for screen retention is on or off. For example, in Teenage Mutant Ninja Turtles 4 player arcade, just the little demo play through playing once, that last for less then 30 seconds, is enough to have the ghost of the 1-2-3-4 player outline boxes at the top of the screen from then on. It's like its simulating years of CRT burn-in, in just seconds.
@@RandomUser-tj3mg I'm not a programmer, but shouldn't this be something done in the script/program itself? Simulate the good things about CRTs, not the bad.
Ok, i need to rewatch this video again in case i missed anything. But again, amazing intro! LOL glad to see you're having fun and not trying to be overly serious like most of these other tech-related RUclipsrs. Ok, so correct me if I'm wrong. * 60fps games with CRT Simulation turned on will remove 58% of OLED motion blur with 144hz QD-OLED? or is it just 50%? * How much lag exactly is piled on top of the 10ms present from OLED/QD-OLED Game mode for 60fps games? * How much brightness in SDR would be left by using CRT simulation, on something like an S90D(With GRS & TCL turned off in the SM)and can't you use the same HDR to SDR brightness boost technique like you can with RetroTINk4K to compensate for the big dip in brightness loss? * Does it support up to 4K, or is it also limited to a maximum of 1080p like the TINK4K? * Will this ever be compatible with Consoles like Switch & PS5/PS5 Pro?
@@plasmatvforgaming9648 Great to hear, and thanks again man! Seems like it's shaping up to surpass Tink4K's 60hz BFI. As it's 4K compatible, reduces 58% motion blur instead of 50% for even 144hz OLED TV's for 60fps titles, and according to you has less visable BFI flicker than internal OLED BFI, and assumingly TINK4K. But does this CRT Simulation cause any shadow detail crushing by any chance? And we still need to get input lag numbers(Wouldn't be able to tolerate anything beyond 16ms total, even that's slightly pushing it), and if it will even be compatible with Video game consoles like Switch & PS5/PS5 Pro at some point. But ya, 150 nits SDR sounds good, but i hope they update it with the HDR to SDR brightness boost setting/trick for even more SDR brightness, but without any weird compromises. Oh, and i have a weird feeling that CRT Simulator will probably add nearly 7ms of lag to a 144hz QD-OLED if it's going by the 144hz/6.94ms persistence. :P
Maybe some LCD/LED monitors will be able to provide a similar effect to phosphor decay and make flicker and jutter less pronounced this requiring a lower FPS rolling scan?
Yes, this could be implemented natively on displays to achieve even better results. Check out the video I did about the Sony kdl-47w802a which has CRT motion at 60fps
I liked the video. Hoping for actually working demonstration. Really need this in reshade :D. I would be okay using G5 and base 82fps -> 165 crt simulation. 60base fps is really on the edge regarding input lag. I don't like that in souls like games. But its okay for emulators i guess.
I actually played deus ex:MD + RTGI shader from mr Pascal doing 38 -> 115 fps + gsync, using lossless scaling framegen + specialk. Since i'ts stealth based game, I could handle the 38fps input lag. So there ARE usecases for high end pcs as of today for lower base framerates.
The way this is such a pain to set up is why i got an arcade CRT monitor a couple of years ago. I am so done messing with all those RetroArch settings and cores.
@@plasmatvforgaming9648 Will 2025 QD-OLED's TV's support 165hz by any chance? If 144hz reduces motion blur by 58%, how much would 165hz reduce it when playing a game at 60fps with CRT Simulation?
Nice Idea but I can't get this to work without losing way too much brightness in comparison to with HDR with CRT shader and g-sync. Anyone want to make a suggestion on crt shader settings and using this on 120hz LG C3.
@@plasmatvforgaming9648 Thanks i tried it with another shader and it wasn't as dim. I had this shaded bar which slowly goes up the screen though. Turning the LCD setting off stops it moving but it still remains. Deal breaker in itself for me. Hopefully early days and they can fix these things.
I’m new tot his stuff and a little confused. Can this CRT simulator be used for any game you want? Like can I just add it to any game or does the developer actually have to implement it themselves for it to work?
This could be implemented on: Reshade, SpecialK, LosslessScaling, ingame setting put by the developer. Also it could be added to TVs and Monitors as a Firmware update or Natively (hardware controlled to make it even better)
Yo! I was doing some testing and reverted back to stock settings at the moment on the s90d 682. Did you find yours was clipping at 800 nits when stock, is the eotf raised in game mode? I turned contrast down to 47-48, not sure if this is correct. Can remember is this was how it was before the mod.
It's possible retroarch just doesnt do some shaders properly with the FFMPEG core. Could be as simple as that. FFMpeg isnt really being rendered through vulkan and it's doubtful its even using dxgi which would be a direct x standard. Which would explain the shader being broken. You can change the color space in core options which might help. I would just download the snes9x core (current) and find some snes home brew game or ya know sail the seven seas 😂. Only real way to be sure its not other settings. I really need to give this a go though. This was made for my 360hz oled 😂. I just dont really do much emulating on my PC since I've got handhelds for that.
Okay this is interesting. Do you think you could do input latency testing to see if it works like the Steam Deck when refreshing the frame at a high multiplier? I.e. rendering game at 30fps but refreshing at something much higher. That’s supposed to get you faster input response, curious if this would do the same thing, worsen, or have no effect on it.
@@plasmatvforgaming9648Thanks for letting FOMO know on last nights stream! Also, check out Blur Busters - new post about Open Source Display Initiative. Includes Plasma simulator planned for later this year and possible SteamOS support!!
How do you launch regular games from RetroArch? I didn't know this could be done. Edit. Just realized that it was a video and not the game in real time :)
@@plasmatvforgaming9648normal bfi is unusable due to flicker. On my phone I can still see some flicker with 120hz bfi. With this shader I don't see flicker anymore with my phone and it looks amazing.
@@RandomUser-tj3mg You may be able to get 120Hz out of you Laptop with a lower interger resolution, aka overclocking the display, try a custom res app, or try within the GPU driver, set it to 50% the native resolution and 120Hz and test it to see if it works, you may even get 144Hz to work, you can then use Nvidia interger-scaling to map the resolution to your native res, or just use internal scaling in the game options or RetroArch et cetera.
@@plasmatvforgaming9648 so 58% motion blur reduction with CRT simulation on a 144hz QD-OLED? but what about reinforcing HDR brightness into SDR to compensate for BFI or CRT simulation brightness loss? I need this app for Switch & PS5. Don't have a beefy PC gaming rig! :P
@@plasmatvforgaming9648 Didn't you mention in the past that the C1's MotionPro High seting, when gaming at 60fps, reduces motion blur by about 63% or something? Input lag unfortunately is some where in between 21-26ms...And if your using it with consoles you have to use HDR Module ON in the SM to compensate for the brightness loss, which risks burn-in. 60hz internal OLED BFI Flicker is too bothersome for me whenever whites hit the screen anyways, and even just 20ms is too much latency in most cases for me. :P Sigh. I wish CRT Simulation could some how be a downloadable app, or part of a console update that downloads into the system, on Switch & PS5/PS5 Pro already. I mean, with CRT Simulations less visible flicker, higher brightness than internal OLED BFI, it's 58% motion blur reduction for those with 144hz OLED TV's for 60fps games, and with no negative impact on gamma make it sound super promising. We still need input lag numbers and it if causes any weird shadow detail crushing. Aside from console support, an update for an HDR to 'SDR' brightness boosting trick would be the icing on the cake.
Do you get the input lag of 60FPS gameplay with this CRT beam simulation shader? I want use the shader and still get the input lag of 240FPS (and higher FPS) at the same time, which is less than the input lag of 60FPS gameplay.
I am excited to test all of these but the functionality is limited with newer steam games especially ones that require a "launcher" from the games developer. I'll be watching eagerly for updates on how to run this.
most gamers are just not aware what is motion clarity , bc never seen CRTs - to understand what it is you must see it live, it cant be showed on youtube thats the problem.
The newer generation don't know any better. By the mid to late 2000's, they were gaming on lousy motion smearing LCD & LED TV's with their XBOX360, PS3 & Wii's. I've been gaming since the late 80's and have been using multiple CRT's most of my life. OLED motion STILL looks alien to me. I hate it. lol with a 50% motion blur reduction with current BFI it's....good-ish, depending on the genre. Reminiscent of plasma.
No, the display is refreshing at its maximum refresh rate and this tech creates multiple versions of the same frame that get refreshed to simulate the rolling scan
17 дней назад+1
Using a 240 Hz OLED (LG 39GS95QE), and this is not amazing. Motion clarity doesn’t look better to me with crt beam simulation enabled. And yes, I *do* get flickering. V-sync enabled in retroarch, v-sync set to application mode in nvidia control panel. I had high hopes but I honestly don’t see any amazing difference in motion clarity with this.
Looks very cool - realistically for now, is there much point trying to use this if my monitor "only" goes up to 165Hz, or is it really a case of waiting a few years for >300Hz displays?
@@plasmatvforgaming9648 Do you know the exact number? We're already getting 10ms of lag at 60fps with OLED Game mode, which is already enough as it is.
The S90D is better, just make sure to get a membership or something that allows you to exchange it just in case you get one with the signature Samsung quality control
I tried on my 34inch Samsung 165Hz LCD(using 120Hz mode) but I was unimpressed, didn't notice any difference....probably because of the backlight and LCDs are not well suited for this kind of tech.......man I wish I had OLED. For such a long time I've hated being relegated to substandard LCD tech since the time of CRTs. P.S. Backlight strobing looks way better on my LCD, I'm sure it would be different on OLED though.
Nice! I had no idea, both Samsung & LG are releasing 165hz QD-OLED & OLED TV's this year, most likely in April or so I'm guessing. I've got my eye on the 65" Samsung S95F(Hopefully it doesn't have an Anti-Glare filter...) or the LG G5, both of which support 165hz, and more nits under the SDR hood for even brighter CRT simulation. If 120hz reduces motion blur by 50%, and 144hz does so by 58%, where does that leave 165hz, in tandem with CRT Simulation when gaming at 60fps? My guess is just under 65%
@@plasmatvforgaming9648 Somebody told me 165hz might be a 60-62% motion blur reduction. 60 seems too low, 62-63% seems more likely but i haven't done the math.
@@plasmatvforgaming9648 lol. I do not want to go back to WOLED colours with a G5, but the downsides to a anti-glare filter may even be worse. I guess it's a matter of picking your poison unless you settle for a 144hz S90F.
I was actually looking which video players support BFI and didn’t find anything. If you can play videos in Retroarch with this shader, I can try it out myself! I just hope Retroarch has color management 😅
@ I am one of the seemingly few who like Motion Interpolation, I have have Smooth Video Project. But Samsungs Built in feature is better. I wonder if actually with better motion clarity with this CRT feature I don‘t want the additional frames anymore. I can’t recall seeing stuttery images in cinema.
A reply from the Chief: "It’s an algorithm that needs to be implemented by software developers into apps. Hopefully some media player author implements this for great playback of 60fps content, maybe ask the VLC or MediaPlayerClassic or other people, about implementing this? Possibly could be done as a DirectShow filter, as long as you remember to make it permanently run the shader every Hz (CRT emulation must successfully modify every refresh cycle), indepedently of the video frame rate. A software developer can change an internal native:emulated Hz ratio dynamically to match content framerate, e.g. for 240Hz displays automatically use a ratio of 5 for 48fps material and a ratio of 4 for 60fps material. The native would stay unchanged (unless you switch your real display Hz), but you can dynamically change the emulated Hz to simulate a 72Hz CRT for 24p material! So I welcome VLC or MediaPlayerClassic to implement subframe-capable plugins to allow me to do 35mm BFI (ON-OFF-REPEAT-OFF) for double strobing, or for CRT simulation, or other multiple-passes-per-movie-frame feats. RetroArch just implemented it already into a developer release, it’s likely the next public release of RetroArch will have it."
You know what, someone can port this shader to ReShade and then it can be used on ANY PC game that Reshade works with ( which is most of them ). And it could be done today!
I thought this works for Windows Games also, like Modern titles. But it doesn't. :( That's sad, it only works for iso based titles like ones from Sega and so on.
Bro, I get this high contrast/ gamma issues in some games. Remove Windows HDR only for that game as it's conflicting with RTX HDR. It fixed it for me, I hope this is useful
I cant wait until there is a simple FPGA box with DP 2.1 & HDMI 2.1 passthrough that converts any sample & hold display to rolling-scan, it would also be amazing to have it at the GPU driver level in Nvidia & AMD drivers, and perhaps Nvidia OLED Pulsar FPGA modules could have it too, imagine hardware level Gsync & rolling-scan with zero flicker 0.1ms MPRT on a 30 inch 16:10 aspect clear black glass RGB-OLED monitor.
I think your brightness problem is introduced by incompatibility with ffmpeg video playback in RetroArch. You should be trying this with an actual game ROM that shows off 60hz side-scrolling.
From the Chief: "It’s an algorithm that needs to be implemented by software developers into apps. Hopefully some media player author implements this for great playback of 60fps content, maybe ask the VLC or MediaPlayerClassic or other people, about implementing this? Possibly could be done as a DirectShow filter, as long as you remember to make it permanently run the shader every Hz (CRT emulation must successfully modify every refresh cycle), indepedently of the video frame rate. A software developer can change an internal native:emulated Hz ratio dynamically to match content framerate, e.g. for 240Hz displays automatically use a ratio of 5 for 48fps material and a ratio of 4 for 60fps material. The native would stay unchanged (unless you switch your real display Hz), but you can dynamically change the emulated Hz to simulate a 72Hz CRT for 24p material! So I welcome VLC or MediaPlayerClassic to implement subframe-capable plugins to allow me to do 35mm BFI (ON-OFF-REPEAT-OFF) for double strobing, or for CRT simulation, or other multiple-passes-per-movie-frame feats. RetroArch just implemented it already into a developer release, it’s likely the next public release of RetroArch will have it."
@@plasmatvforgaming9648 That's great! I know he's also working in a plasma simulator, it's been a while since i last saw a plasma but i'm really interested in it's benefits.
@@josedeleon1923 I'm really excited about the idea of a 2500Hz Neo Focus-Field-Drive simulator, or even a 600Hz Sub-Field-Drive Kuro simulator, would be incredible for watching media content and films, especially animated content, which looks so good on my Pioneer Kuro Plasma, and films are silky smooth in 24FPS pulldown mode, Plasma is to film and video, what CRTs are to vid games.
@@plasmatvforgaming9648 oh wow, so 24-30fps suttering(like the choppyness of certain panning shots) will be reduced without any form of SOE being implemented? If we can get less stuttering or just as much as plasma and CRT for movies & TV than that's a huge win. Because as is, OLED seems like it has triple the amount of film judder(or Stutter is it?) compared to my S60 Plasma. :P TINK4K can reduces film judder more than plasma & CRT for 24-30fps content with trible strobe BFI at least, so that's still an option that's currently available, with the same 58% motion blur reduction. But somebody on RUclips mentioned that it creates occasional image artifacts....
Tried this, it worked. But I get weird artifacts if I add scanlines to games, and still get some image retention with my Asus monitor. I think I will stick to just using Losless Scaling which is a superb program and gives me 120 hz when I have 60 hz content.
I have a 15 inch CRT monitor from 1995 and the video/picture quality is way better/more realistic than my 1ms 240hz monitor and it has MPRT as well the only thing that looks good is the sharpness since its like 2k resolution, looking for a bigger CRT monitor but its getting harder to find and more expensive (my 1ms monitor feels cartoonish compared to the crt lol idk why I cant explain it)
Bro just downloada a Sonic 1 or Super Mario World rom They have attract screens which are basically videos so you don't even have to play them HDR is not supported yet As for consoles implementing this it would actually be a bad idea right now Mark and Timothy are still working on it and improving it, in many displays it won't work that well, HDR not supported yet etc. It might hurt the reputation of this simulator which would be counterproductive
Try this -
go to Settings -> Cores. Turn off 'Allow Cores To Switch The Video Driver'. Exit Retroarch for the changes to take effect, then try the video again.
If you don't see that setting, go to Settings -> User Interface and make sure 'Advanced Settings' is turned on.
Here's what is happening when you leave that option on: it switches from Vulkan/D3D12/D3D11 to OpenGL when it tries to play a video with the ffmpeg core. Why does it do that? Prob because the core has some OpenGL-exclusive feature (some frame interpolation setting) so it makes the 'assumption' that you want to use the OpenGL driver instead. For the ffmpeg core if you don't mind the frame interpolation option I think it's best to just run it in the native driver (Vulkan/D3D12/D3D11/whatever) instead.
I'll try to see if we can get rid of that video driver switching for ffmpeg.
Anyway the reason brightness would be completely off vs. D3D12/D3D11/Vulkan is prob because it doesn't support HDR at all.
Thank you! I'll try that. The video is SDR and HDR is off
Is someone working on making it for it to look as good but on lcd and 60hz 4k? Not all of us have oled screens and CRTs were certainly not oleds and most of them were 60hz.
@ cant as you're limited by the technology of the day, need a higher refresh rate to simulate it
OLED picture quality, CRT organic feeling and minimal eye strain. THAT'S the future we were promised and denied during the CRT -> LED transition and robbed when Plasma was abandoned. We're back on track, finally!!
This will look unbelievable on a future 1000Hz display
How much input lag is there with this shader? I'm used to low input lag from 144Hz 144FPS gameplay, but do I get that low input lag with the CRT beam simulation shader enabled?
The mid 2000's was such an awful Time for HD TV's. I can't believe people actually bought those first wave junk LCD TV's, just because they were light, slim, widescreen and offered 720p HD. Yet the motion smearing was god awful, black levels were milky grey, viewing angles were terrible, colors were artificial and the input lag for video games was noticeably higher than the 10ms we're getting now.
I knew better. I bought myself a brand new 32" 2006(Or it might of been 2005) FullScreen Sony WEGA Trinitron SDTV CRT at BestBuy back in 2006, and paired that sucker with component cables with my DVD player & Nintendo Wii. Best overall TV i've ever owned period, edging out my 1999 32" JVC D-Series. With the WEGA, there was no input lag, perfect blur-free motion clarity, No black crush, amazing RGB colours and it was the brightest CRT i had ever laid eyes on.
@@bigjoegamer if you're already getting 144fps on 144hz this won't help you. This would help anything below half that framerate, so 70fps or lower would be helped with this.
Yup
I like how much you brought up this being a passion project for the creators. It’s the definition of love. To do something for the benefit of others without expecting anything in return. I love this hobby. Such a cool space. Hope you had a happy holidays man.
Absolutely! Thanks and same to you!
Combining this with the CRT_MaximusRoyale shader overlay is just amazing!
Tried that shader and it wasn't as dim using this as it was with Lottes. I can't remove that scrolling shaded bar though. Turning the LCD setting off stops it moving but it remains. Anyway to remove the fuzzy effect when moving with this shader too?
Comments dont understand that this isnt BFI. BFI is one method to achieve a similar effect.
This is much more advanced and you dont lose as much brightness
biggest BFI issue is flickering, brightness also but flicker is deal breaker for me
i'm guessing BFI just replaces every other frame with black frames, effectively halving the framerate of whatever is on-screen. This seems to take a 60fps video or game, cuts each frame in half vertically, and displays the top half first and bottom half 2nd at 120hz, or 120fps. So, it's similar to interpolation but it's not showing every other "line" it's showing the top half of the screen then the bottom half of the screen.
So, i don't really see how this would help a source that is already 120fps, so if you're already getting 120fps, reducing that back down to 60fps just in order to enable this would not seem to help motion clarity or anything really, in fact, it would be objectively worse, and while anyone can have an opinion, you're still losing frames. It could help a 60fps source but then you must have a 120hz or better display for it to work properly.
@@tirkentubeYou don't see it because you don't get it. It is not supposed to be using on a source you can fill your screen hz with.
Simple as that. It is for lower hz sources than you screen have available.
Like 30hz, 60hz games or even lower like 24hz movie and show content. If your screen can at least double that.
It is always better to display 120hz as 120hz or what ever.
@@tirkentube BFI doesn't cut the framerate, it cuts frame persistence.
@@cajampa you re-stated precisely what i said, answered my rhetorical question (like an idiot would), and then acted rude about being dumb enough not to realize that i already knew the answer to my own question, you know, cuz i ALREADY SAID THAT.
TV manufacturers should take notice, implement this on new TVs.
ABSOLUTELY!!
Ill back that up. Also there is something i noticed with the LG panels that have 21:9 aspect ratio options under game optimizer. If you have any retro console hook it up to an HDMI converter and see the difference. it has a scanline like filter over it that also makes the input signal clean.
Let’s make CRT simulation the new Franks Red Hot Sauce…Put that sh*t on everything!!
🤣
Yaaaaaas INNNNNNDEEEEEEEEEEED!!!!
Always curious, always exploring, and always discovering these gems! thank you.
The beginning with the old TV is so fkn great bro I love you so much for your humor😂🎉
I questioned the Windows fonts color choice this man makes everyday of my life.
It is all about not having the color blue. Perfect blacks means no eye strain at all.
lol dat high contrast theme, just how it goes on oleds
BTW Timothy Lottes is an ex-Nvidia and an ex-AMD employee
So he definitely had Jensen # and didn't make that call
He got autobalanced
i love your enthusiasm dude. we need more of this
That's actually the solution they should implement in any modern game also.
An option on games to enable a per-line/per-segment (adjustable) refresh, so to have good to perfect motion resolution on any S&H display.
Applying that to the game's native rendering, you'd also have an immense performance gain (like interlaced rendering).
Absolutely! However, this will only get as clear as the maximum refresh rate of the sample and hold display
@plasmatvforgaming9648 Indeed. This is a smart use of high refresh rate S&H panels.
Not likely to happen as it means you need a high refresh display. For console it would require using a 120hz display so the system would need to turn it off when used on 60hz displays and disable when turning on high performance modes. That's just not going to happen. It's a cool trick for PC where tinkering is the name of the game, but console it's very unlikely to ever show up.
Honestly I see the biggest future for it is to be added to Nvidia's freestyle filters or reshade. I don't even see it as very likely to come to Nvidia or AMD drivers as it's a bit complicated and needs calibration per display and per frame rate.
I seriously hope someone working for LG or Samsung R&D watches this
This could easily be a firmware update
Yes, it needs to be implemented at the display level so that it can work on everything.
Yep it's working! You need to play a real game. I tried with Vulkan on the Flycast core and a dreamcast game on a regular VA monitor. You need to go into shader parameters and max the brightness vs clarity option to 1, and reduce the gamma to the minimum, to get normal image colors. Otherwise yes the brightness and colors looks cooked, if the brightness is too bloomy try reducing the clarity options, but it feels like Gamma at 1 works best. Works in SDR and HDR, retroarch has an HDR option that need to be set first. Need 120hz set on monitor as well, I tried at 144hz and it just doesn't work for 60FPS content.
So it's pretty good but I need to see if the image quality issues this causes get resolved at 180hz or 240hz, if it does look better the higher the refresh rate is then this shader is the real deal for sure, if it doesn't then I'll still ask for hardware rolling scan BFI on my monitors instead Also, this shader doesn't like framegeneration at all, so I will still prefer to use my C1 to get 120fps with BFI for 60fps games instead of using this shader. But it's very promising, someone with a 240hz OLED monitor need to try and tell us how it looks ASAP!
Thanks for sharing. I'll try that but there's something wrong we have to figure out. Others are telling me that games look perfect, I have to download a game
I just cannot get it to work properly on my 360hz oled. Turning off HDR and VRR does not do the trick. I set subframes to 6 and FPS divisor to 6, the resulting is a pulsating, strobing screen with flashing colors.
@@Bugatti12563 I made it work on my 240Hz Oled, but I find this extremely finicki. You can't have HDR, Gsync active, then essentially close every or most software, the slight framerate variation generates flickering and other issues. Having more than one monitor also introduces flicker, I love the idea but I feel I need to sacrifice too much every time to be able to play.
@@Hoytehablode What core did you try this on? For nes on mesen I get flickering because it is more resource intensive and causes slight frame rate variations but fceum works fine
@@dm.3145 It doesn't matter the core, I tested many and was teh same thing unless I close almost everything and that is having a 4090 and a 7950X3D
CES there is 300hz 1440p 24.5 inch IPS LCD, 600hz TN 1080p msi, 750HZ TN 1080p koouri, 1440p 360hz gsync pulsar monitors, 500hz 1440p OLEDS, 4k 27 inch 240hz oleds. we eating good this year 😀😀😀
Awesome! Thanks for sharing 👍
Just don't buy any of the LCD trash especially a TN no matter how high they advertise that refresh rate. I keep my old TN as a secondary to remind me to never buy another one lol
That's sick. I'll be making some good money on my backup crts.
We are having this Odin 2 Portal 120hz OLED Android handheld incoming. This feature will be GOLD on it.
Finally, i told everyone 10 years back to do this. Instead of half a screen I suggested to show the pictures in possibly alternating pixel line for line. If that did not work Id suggested alternating thicker lines which might be more precessing friendly. Then my third option would be 1th the resolution in a checkerboard alternating each frame.
This one seems to generate multiple versions of the same frame (depending on max refresh rate) to simulate the CRT rolling scan with phosphorus decay
@ in the time I suggested this we had 100hz screens, since I am from a pall region, id always told we could use those extra 50 frames for this, since I was unsure what would work best, and never actually tried it, I am insure about all other tech that could help with the motion clarity. But its close to the concept I have in mind for years now! The same counted for crt filters to atleast give a high pixel count lcd a phosphorous look giving 240p/480p back its charmes.
If a gpu company integrated this or BFI into it at driver level for any input, I would buy it in a heartbeat!
This will be available somehow for all games very soon
@plasmatvforgaming9648 seems like best case scenario would be all new & recent games. Borderlands 3 is my newest game I play. I play even older ones more.
Thanks for highlighting this awesome feature. I think you're covering an area of image quality that is vastly underexplored for gaming. You should definitely do a follow up video testing some retro games in different genres via emulation. For example, how does a platformer like Sonic 2 look vs a racing game like Gran Turismo 4 vs a fighting game like Tekken 5?
If you need help, Reddit can guide you to find everything you need for emulation.
Thanks! I just don't have any games for RetroArch.
@@plasmatvforgaming9648 Retroarch is just a frontend for emulation. It uses various "cores" for each console. So all you'd have to do is find some roms online - plus maybe some BIOS files - and then boot up your game. Reddit and RUclips will point you in the right direction.
Thanks for explaining how bfi on a LCD can cause image retention.
I like the enthusiasm in this video. Feels like I'm revisiting my old friend where he was always upto something new he's very fond of and finally managed to achieve it
I've been messing around with this for the past 2 days on a non-OLED 240hz monitor, and the motion looks very choppy and stuttery compared to the smoothness that I get with BFI. Maybe i'm missing something.
Set it to 240Hz and lock the FPS to 60. Turn off VRR
My uncle would LOVE this
You never fail to impress me man. Always pioneering the display enthusiast youtuber game. 👏👏🏆🎊
The news is more than a week old at this point...
Your content is great man but I wish you made some videos that are super short and just to the point, settings and effects and that's it. Those lengthy videos make it impossible to find specific information
So do I have to disable adaptive sync in the monitor everytime I want to use this shader? That's a deal breaker
They're working on CRT simulation with VRR, it will happen
@@plasmatvforgaming9648 I did try it without turning adaptive sync off (in the monitor) and it works, but unfortunately I have to force 120hz in retroarch which gives a slight flicker. Using the shader sub frames at 180hz looks good but causes the audio to go out of sync. Wish I could use my native 165hz
Can't TV manufactures add CRT or scanline simulation to get better motion clarity? I just want something that can match the fluidity of my old plasma TV.
Especially for 60fps content.
Ofc they can. It's trivial for them. Will they? No. They can't be bothered.
TVs have been shipping with built-in BFI/blur reduction for almost a decade.
LCD TVs with back light manipulation can be a lot more clear than Plasmas. I tested a Sony kdl-47w802a and it was 4x more clear than Plasma. It was as clear as a CRT at just 60fps
I'm wondering if the retroarch people could use their rolling scan method with the color control app or something like it to fix lg's high bfi flicker when playing 50/60fps using motion interpolation to 100/120fps, it makes no sense that they didn't just use the same process as 100/120hz without interpolation
This tech can be implemented by displays and the chief says he could do it better having more hardware control vs just the software. But for this to work BFI has to be off and this can't change the rolling scan behavior of the TVs. It just creates different versions of each frame to simulate the CRT rolling scan
This just made me think that software bfi like this could be used with hardware bfi so you could get 120hz bfi to work at 60fps without double imaging.
Can you use this for movies too or is it just for games? 24 fps movies and tv shows on sample and hold displays is rough, man, older lcds were actually better because pixel response was so slow that motion and camera panning was blurry, which feels more natural. Now with fast lcds and oleds we get motion stutter. Man, if the camera is panning and there is an object with vertical lines or edges, especially against a bright background, the image becomes a mess...
You can already emulate a double or triple shutter old style projector like in RetroTink4K via 48hz/96hz or 72hz/144hz
You can also potentially emulate a 48hz or 72hz Plasma TV with their phosphor decay
LCD blur can be simulated too and in fact some OLED TVs out there have an option for something like this
From a Chief reply:
"It’s an algorithm that needs to be implemented by software developers into apps. Hopefully some media player author implements this for great playback of 60fps content, maybe ask the VLC or MediaPlayerClassic or other people, about implementing this? Possibly could be done as a DirectShow filter, as long as you remember to make it permanently run the shader every Hz (CRT emulation must successfully modify every refresh cycle), indepedently of the video frame rate.
A software developer can change an internal native:emulated Hz ratio dynamically to match content framerate, e.g. for 240Hz displays automatically use a ratio of 5 for 48fps material and a ratio of 4 for 60fps material. The native would stay unchanged (unless you switch your real display Hz), but you can dynamically change the emulated Hz to simulate a 72Hz CRT for 24p material!
So I welcome VLC or MediaPlayerClassic to implement subframe-capable plugins to allow me to do 35mm BFI (ON-OFF-REPEAT-OFF) for double strobing, or for CRT simulation, or other multiple-passes-per-movie-frame feats.
RetroArch just implemented it already into a developer release, it’s likely the next public release of RetroArch will have it."
How does this affect the lifespan of the TV? I'm not sure if that makes sense, but wouldn't the turn off and turn on of the pixels reduce their lifespan?
All the opposite, it would actually decrease the chance of image retention
Wonder how does this work on SONY tvs that already flicker at 720hz like almost every X900+ /X930 series
It doesn't work at all with any display that already have back light manipulation tech unless you turnmthat off. This only works with sample and hold
my monitor (lg lg 24gn600) has retention regardless of what settings i use on the shader or monitor :/
Interesting
CRT comeback
This is the closest we will get
Would this also mean we can get light gun games to work?
This works regardless of the content, but if those required something special, I don't know
no
someone mind explaining me how to download and setup all the shebang?
I shared that here
ruclips.net/video/PmXmr4Yiz_0/видео.html
Not gonna lie, this is cool, as a retro gamer and lover of CRTs and OLEDs and motion clarity this is huge but the whole video I was wondering with that accent where you're from since you sound just like me, I'm Cuban btw lol.
Que bola acere
I have 2 questions for this :
1- Are oled tvs of the future coming with this feature will have different picture type or it just improve motion ?
2 - what the benefit of having 480hz panel if you only can play to certain framerates and even gonna have big drops ?
With this, you can get as good as the maximum refresh rate motion at just 60fps or even custom FPS. There's an evolution that will work with VRR
60fps 480Hz CRT simulation =480 like motion for eye tracking (2ms of persistence equal to 2 pixels of motion blur when moving at 1000 pixels per second)
Thanks for the respond Ariel it seems to only improve motion which I don't have big problem with already 🙂
how do you get high on life to start with retroarch???
It is a gameplay video, not the game running. The fix to the Gamma is to turn off let the application change the API (I showed that in the video after this)
@plasmatvforgaming9648 😊👍🏻
Are you using HDR? This doesn't work well with HDR for the moment. We experimented with including it in Lossless Scaling a few days ago and it worked lol, but any tiny variation in frame time results in heavy flickering due to desynchronization. It would need advanced algorithms to adapt to frame time changes, but it might be very interesting for the future!
Interesting. On the code I remember reading that this doesn't care what the frame rate is and it is fixed doing the same
LG tv's actually use this now. My LG 120hz model does this exact effect when some motion clarity setting us turned on. It helps but isn't perfect.
I verified this by recording my tv in slow motion mode on my phone.
The last LG TVs with rolling scan was the C1/G1
@plasmatvforgaming9648 Then there must be some illusion with how my phone records in slow motion because the video has big rolling black bars. I'd be happy to be wrong on this because it's not nearly as effective as perfect BFI.
And the rolling black bars are only on the TV screen, not on the whole recording.
Bro i hope they can make this somehow run on my desktop instead of reshade. Because i want it compatible for all games.
It's not meant to be a replacement for reshade, If anything it's likely to be added to it as one of the shaders.
@@MrPragnienie1993 yeah id rather not have it as a shader because the games I'd want it on would also include pvp competitive shooters. Which doesn't always work with reshade
@@JasonJtran I'm not sure how it being an app instead of a shader would be benefical to you in this case, it would most likely be flagged by an anti-cheat regarless of what form it has.
@@MrPragnienie1993 if it doesn't hook onto the game then it would be flagged. Like using lossless scaling
Use Shaderglass
I wonder if it could work on Steam Deck/Linux. Gamescope does support reshade. Maybe due to open nature more system-wide implementation could be possible
This even works for mobile. I believe it will be everywhere in no time
EHH... Technically, but you'd need the OLED version and even then 90hz is slow for this sort of thing.
Since you don't have Twitter or any other platform I could tag you on, I have to write a comment under your video :) New infos are out on the new LG OLED lineup 🔥 Looks like the G5 is going to be insane.
I'll cover the CES news. There's a lot of exciting stuff
@plasmatvforgaming9648 Awesome :) I wonder how BFI works on the new lineup.
Same crap. Hopefully the Chief can collaborate with these companies to add the CRT emulation to their TVs instead of the lazy BFI
@@plasmatvforgaming9648 My guess is that it's more of a niche feature, maybe LG thinks it's not worth the development time, I'm not sure? CRT emulation sounds interesting though.
Can you show us a demonstration of a retro game? Like a game console that outputs in 240p-320p?
I'll follow up. I have a game already to try
what does this mean for steam deck retroarch?
The latest article from Blur Busters mentions that SteamOS will add CRT simulation
Doesn't it trade of brightness for less motion blur? If only half the screen is illuminated at any given time, it means you get only half the brightness, right? The blurbuster article you showed is also stating, that this is ideally used in SDR mode.
Not exactly that but that's the idea. Half of the screen is not perfectly black. If you look at the 120Hz example you can see it
these intros are always so incredible
Interesting times, though it does mention in the github that an integer division of frames works best even if it's not mandatory, which makes sense. I assume the gamma settings on your TV and in the software match?
Yes, it is an issue with the settings. Maybe I have to turn off RTX HDR even if I'm on SDR because it might be conflicting
I just want an SGI Triniton and Diamondtrons one day for my gaming Pc
I like this video but could you please explain what this shader is even actually doing? How is simulating a CRT going to actually make motion any smoother, that doesn’t make any sense.
It reduces the frame visibility time instead or showing the frame for the whole duration of the frame time (it turns of the pixels for a moment). Motion clarity is limited by persistence based on the Blur Busters Law.
Motion clarity is limited by persistence. What that means is that the motion clarity of any display, regardless of technology, cannot be better than its persistence. Persistence is the frame visibility time. For sample and hold displays, the frame visibility time is as big as the frame time. For example 120Hz means 120 frames refreshed each second, and sample and hold displays show each frame until the new one comes so 1/120×1000=8.3ms of persistence.
1ms of persistence is equal to 1 pixel of motion blur when moving at 1000 pixels per second (Blur Busters Law)
8ms of persistence is equal to 8 pixels of motion blur when moving at 1000 pixels per second.
Finally, just understand speed in pixels per second to understand the statement (although speed can be measured in different units such as screen width (time it takes an object to cover screen width)).
The speed 1000 pixels per second would make an object moving left to right to take 4 seconds to cover screen width on a 4k screen that has 3840 pixels horizontally.
So at 1000 pixels per second in a 4k display it takes approximately 4 seconds for an object to go from left to right.
It's pretty impressive, I tried it in shadertoy at 120hz, but I still prefer to use the 60hz BFI mode on my G4, i believe that uses rolling scan too
The G4 just uses BFI, not rolling scan like CX C1 GX G1
I'm very excited for this CRT simulation to become available on displays, apps, etc, but I wonder if there will be enough brightness on current OLED displays to use the higher refresh rates (360Hz, 480Hz, 500Hz, etc) and get low persistence AND still have a good looking image. Motion quality is very important to me, but image quality is equally important. That's what makes CRTs so magical. They have incredible motion quality and image quality at the same time.
My LG CX, on the other hand, has pretty decent motion quality with BFI turned on to the highest setting, but I rarely use it because it causes the image to become dim and dull. WOLED already have inferior color brightness and with BFI colors just look absolutely lifeless.
If I can get a QD-OLED with glossy screen, 500Hz refresh rate, this new CRT simulation capability and still get around 100 nits of brightness with rich colors I'll be in heaven.
When Samsung finnaly make their RGB-OLED displays available, which are triple emiiter OLED (red/green/blue OLED emitters) instead of just single emitter WOLED & QD-OLED, they will be plenty bright enough, RGB-OLED is over two thirds brighter and higher colour luminance, not to mention has a much higher subpixel resolution (the defacto resolution metric), in fact, MVA RGB-OLED should see 4K nits of luminance, and that's pure colour luminance, no filtering or conversion and luminance lost, which should make for incredible HDR performance, and a really good full-screen uniform brightness.
Your CX will be more clear at 60 with OLED Motion Pro high (312 like) than this CRT simulation because CX is 120Hz
Is cx bfi better than your new qd oled?
This is just coincidence to me. I read in tech news that the Windows 11 24H2 update makes games crash and colors in games are off.
But that would be a total other issue right?
Everything else looks fine
I've tried this in Retroarch, and for me it's a no go. There is very obvious image retention, like the screen of a old arcade that has been running for years, no matter if the option for screen retention is on or off.
For example, in Teenage Mutant Ninja Turtles 4 player arcade, just the little demo play through playing once, that last for less then 30 seconds, is enough to have the ghost of the 1-2-3-4 player outline boxes at the top of the screen from then on. It's like its simulating years of CRT burn-in, in just seconds.
Are you using an OLED or LCD?
@@plasmatvforgaming9648 120Hz LCD
@@plasmatvforgaming9648 Wierd. I put in a reply but its gone. Its a 120Hz LCD monitor.
@@DownandOutNYCimage retention is a known issue on 120hz lcds. You need a higher refresh rate or oled
@@RandomUser-tj3mg I'm not a programmer, but shouldn't this be something done in the script/program itself? Simulate the good things about CRTs, not the bad.
Ok, i need to rewatch this video again in case i missed anything. But again, amazing intro! LOL glad to see you're having fun and not trying to be overly serious like most of these other tech-related RUclipsrs.
Ok, so correct me if I'm wrong.
* 60fps games with CRT Simulation turned on will remove 58% of OLED motion blur with 144hz QD-OLED? or is it just 50%?
* How much lag exactly is piled on top of the 10ms present from OLED/QD-OLED Game mode for 60fps games?
* How much brightness in SDR would be left by using CRT simulation, on something like an S90D(With GRS & TCL turned off in the SM)and can't you use the same HDR to SDR brightness boost technique like you can with RetroTINk4K to compensate for the big dip in brightness loss?
* Does it support up to 4K, or is it also limited to a maximum of 1080p like the TINK4K?
* Will this ever be compatible with Consoles like Switch & PS5/PS5 Pro?
Yes, we can get as good motion as the maximum refresh rate, and it will work for 144 and 165Hz!
All resolutions supported, HDR isn't supported for now, but I'm sure it will at some point
The Gamma should look right because it is set up as good as possible. Nothing like the lazy BFI implementation we have on the TV
My prediction is 150 nits full screen on the S90D with this on
@@plasmatvforgaming9648
Great to hear, and thanks again man! Seems like it's shaping up to surpass Tink4K's 60hz BFI. As it's 4K compatible, reduces 58% motion blur instead of 50% for even 144hz OLED TV's for 60fps titles, and according to you has less visable BFI flicker than internal OLED BFI, and assumingly TINK4K.
But does this CRT Simulation cause any shadow detail crushing by any chance? And we still need to get input lag numbers(Wouldn't be able to tolerate anything beyond 16ms total, even that's slightly pushing it), and if it will even be compatible with Video game consoles like Switch & PS5/PS5 Pro at some point.
But ya, 150 nits SDR sounds good, but i hope they update it with the HDR to SDR brightness boost setting/trick for even more SDR brightness, but without any weird compromises. Oh, and i have a weird feeling that CRT Simulator will probably add nearly 7ms of lag to a 144hz QD-OLED if it's going by the 144hz/6.94ms persistence. :P
Maybe some LCD/LED monitors will be able to provide a similar effect to phosphor decay and make flicker and jutter less pronounced this requiring a lower FPS rolling scan?
Yes, this could be implemented natively on displays to achieve even better results. Check out the video I did about the Sony kdl-47w802a which has CRT motion at 60fps
I liked the video. Hoping for actually working demonstration. Really need this in reshade :D. I would be okay using G5 and base 82fps -> 165 crt simulation. 60base fps is really on the edge regarding input lag. I don't like that in souls like games. But its okay for emulators i guess.
I actually played deus ex:MD + RTGI shader from mr Pascal doing 38 -> 115 fps + gsync, using lossless scaling framegen + specialk. Since i'ts stealth based game, I could handle the 38fps input lag. So there ARE usecases for high end pcs as of today for lower base framerates.
This could definitely work at custom fps , it is just a matter of time
The way this is such a pain to set up is why i got an arcade CRT monitor a couple of years ago. I am so done messing with all those RetroArch settings and cores.
It has been a pain for me, too. I'll make a video follow-up as soon as I figure it out
what do i do if i have a 165hz display?
It will work for 144, and 165Hz and will give you the persistence of that maximum refresh rate at just 60fps (at best)
@@plasmatvforgaming9648
Will 2025 QD-OLED's TV's support 165hz by any chance? If 144hz reduces motion blur by 58%, how much would 165hz reduce it when playing a game at 60fps with CRT Simulation?
Nice Idea but I can't get this to work without losing way too much brightness in comparison to with HDR with CRT shader and g-sync. Anyone want to make a suggestion on crt shader settings and using this on 120hz LG C3.
I'll share more when I test a real game
@@plasmatvforgaming9648 Thanks i tried it with another shader and it wasn't as dim. I had this shaded bar which slowly goes up the screen though. Turning the LCD setting off stops it moving but it still remains. Deal breaker in itself for me.
Hopefully early days and they can fix these things.
Ariel did you watch the Nvidia Reflex 2 video? That looks like asynchronous reprojection! The dream is real bro!
It is taking the latency advantage of Asynchronous reprojection but without reprojecting frames
I’m new tot his stuff and a little confused. Can this CRT simulator be used for any game you want? Like can I just add it to any game or does the developer actually have to implement it themselves for it to work?
This could be implemented on: Reshade, SpecialK, LosslessScaling, ingame setting put by the developer. Also it could be added to TVs and Monitors as a Firmware update or Natively (hardware controlled to make it even better)
Will you go to CES this year?
No, but planning for 2026
@plasmatvforgaming9648 that would be some peak content for your channel. Imagine standing there sucking out the whole oxygen out of CES 2026. 😂
Yo! I was doing some testing and reverted back to stock settings at the moment on the s90d 682. Did you find yours was clipping at 800 nits when stock, is the eotf raised in game mode? I turned contrast down to 47-48, not sure if this is correct. Can remember is this was how it was before the mod.
Stock is clipping at 900 but I didn't test stock too much
I keep the contrast at 100. You can just change EOTF tracking on the service menu
It's possible retroarch just doesnt do some shaders properly with the FFMPEG core. Could be as simple as that.
FFMpeg isnt really being rendered through vulkan and it's doubtful its even using dxgi which would be a direct x standard. Which would explain the shader being broken.
You can change the color space in core options which might help.
I would just download the snes9x core (current) and find some snes home brew game or ya know sail the seven seas 😂. Only real way to be sure its not other settings.
I really need to give this a go though. This was made for my 360hz oled 😂. I just dont really do much emulating on my PC since I've got handhelds for that.
I tried changing the color space and it didn't work. I'll have to go back to sail the forbidden waters again too😂
Okay this is interesting. Do you think you could do input latency testing to see if it works like the Steam Deck when refreshing the frame at a high multiplier? I.e. rendering game at 30fps but refreshing at something much higher.
That’s supposed to get you faster input response, curious if this would do the same thing, worsen, or have no effect on it.
I'll follow up with a game testing
Plasma, can you talk to FOMO about CRT simulation before he goes to CES so he can talk it up to TV vendors at the show?
I'll try to reach out to everyone I can but I don't have FOMO contact
@@plasmatvforgaming9648Thanks for letting FOMO know on last nights stream! Also, check out Blur Busters - new post about Open Source Display Initiative. Includes Plasma simulator planned for later this year and possible SteamOS support!!
How do you launch regular games from RetroArch? I didn't know this could be done.
Edit. Just realized that it was a video and not the game in real time :)
I'll follow up this with a game testing and share how to
Can i use this with 90hz? My laptop only has a 90hz oled.
60 seems to be the minimum but I bet in the future custom targets will be possible. However, I'd be shocked if 45 fps is usable due to flickering
@@plasmatvforgaming9648normal bfi is unusable due to flicker. On my phone I can still see some flicker with 120hz bfi. With this shader I don't see flicker anymore with my phone and it looks amazing.
@@RandomUser-tj3mg You may be able to get 120Hz out of you Laptop with a lower interger resolution, aka overclocking the display, try a custom res app, or try within the GPU driver, set it to 50% the native resolution and 120Hz and test it to see if it works, you may even get 144Hz to work, you can then use Nvidia interger-scaling to map the resolution to your native res, or just use internal scaling in the game options or RetroArch et cetera.
Best shader for crt simulation of retro consoles on c1?
This is not better than the C1 using OLED Motion pro high which improves motion 2.6x. It will flicker less at 60 and that's the only advantage
@@plasmatvforgaming9648 so 58% motion blur reduction with CRT simulation on a 144hz QD-OLED? but what about reinforcing HDR brightness into SDR to compensate for BFI or CRT simulation brightness loss?
I need this app for Switch & PS5. Don't have a beefy PC gaming rig! :P
@@plasmatvforgaming9648
Didn't you mention in the past that the C1's MotionPro High seting, when gaming at 60fps, reduces motion blur by about 63% or something? Input lag unfortunately is some where in between 21-26ms...And if your using it with consoles you have to use HDR Module ON in the SM to compensate for the brightness loss, which risks burn-in. 60hz internal OLED BFI Flicker is too bothersome for me whenever whites hit the screen anyways, and even just 20ms is too much latency in most cases for me. :P
Sigh. I wish CRT Simulation could some how be a downloadable app, or part of a console update that downloads into the system, on Switch & PS5/PS5 Pro already. I mean, with CRT Simulations less visible flicker, higher brightness than internal OLED BFI, it's 58% motion blur reduction for those with 144hz OLED TV's for 60fps games, and with no negative impact on gamma make it sound super promising.
We still need input lag numbers and it if causes any weird shadow detail crushing. Aside from console support, an update for an HDR to 'SDR' brightness boosting trick would be the icing on the cake.
Do you get the input lag of 60FPS gameplay with this CRT beam simulation shader? I want use the shader and still get the input lag of 240FPS (and higher FPS) at the same time, which is less than the input lag of 60FPS gameplay.
That's a good question. I don't know yet. I think most likely worse than 60 but the display being at the full refresh rate could actually help.
Can't you just decrease the gamma in the settings?
Yes, but it doesn't fix the issue
I am excited to test all of these but the functionality is limited with newer steam games especially ones that require a "launcher" from the games developer. I'll be watching eagerly for updates on how to run this.
The future is here. We've come full circle.
most gamers are just not aware what is motion clarity , bc never seen CRTs - to understand what it is you must see it live, it cant be showed on youtube thats the problem.
The newer generation don't know any better. By the mid to late 2000's, they were gaming on lousy motion smearing LCD & LED TV's with their XBOX360, PS3 & Wii's. I've been gaming since the late 80's and have been using multiple CRT's most of my life. OLED motion STILL looks alien to me. I hate it. lol with a 50% motion blur reduction with current BFI it's....good-ish, depending on the genre. Reminiscent of plasma.
So instead of 1080 i or 1080p . Its 1080 50/50 alternativelly ?
No, the display is refreshing at its maximum refresh rate and this tech creates multiple versions of the same frame that get refreshed to simulate the rolling scan
Using a 240 Hz OLED (LG 39GS95QE), and this is not amazing. Motion clarity doesn’t look better to me with crt beam simulation enabled. And yes, I *do* get flickering. V-sync enabled in retroarch, v-sync set to application mode in nvidia control panel. I had high hopes but I honestly don’t see any amazing difference in motion clarity with this.
V-Sync enabled
If motion clarity doesn't look like 240 check the settings, it is probably not working right.
@@Wobble2007 I'm not sure what you mean... V-sync needs to be on in the Retroarch settings for any of this to work.
Looks very cool - realistically for now, is there much point trying to use this if my monitor "only" goes up to 165Hz, or is it really a case of waiting a few years for >300Hz displays?
Yes, it will work at 165Hz and give you the clarity of 165 at just 60fps
input lag the same or worse?
Worse, but they'll improve it soon to reduce it in one frame.
@@plasmatvforgaming9648 Do you know the exact number? We're already getting 10ms of lag at 60fps with OLED Game mode, which is already enough as it is.
Hey man, I have lg c3 do you recommend an s90d over it or just keep the c3 cause I'm a bit worried about the issues I've been hearing about the s90d
The S90D is better, just make sure to get a membership or something that allows you to exchange it just in case you get one with the signature Samsung quality control
Make sure you're on a completely dark room, otherwise I wouldn't recommend QDOLED
@plasmatvforgaming9648 but which tv quality is better the lg oleds or the Samsung oleds ?
I tried on my 34inch Samsung 165Hz LCD(using 120Hz mode) but I was unimpressed, didn't notice any difference....probably because of the backlight and LCDs are not well suited for this kind of tech.......man I wish I had OLED. For such a long time I've hated being relegated to substandard LCD tech since the time of CRTs.
P.S. Backlight strobing looks way better on my LCD, I'm sure it would be different on OLED though.
LCD strobing can be much better that this software solution when it comes to motion clarity, depending on the monitor
Nvidia Pulsar is the best example having strobing and VRR at the same time and being way more clear than CRTs
Nice! I had no idea, both Samsung & LG are releasing 165hz QD-OLED & OLED TV's this year, most likely in April or so I'm guessing. I've got my eye on the 65" Samsung S95F(Hopefully it doesn't have an Anti-Glare filter...) or the LG G5, both of which support 165hz, and more nits under the SDR hood for even brighter CRT simulation.
If 120hz reduces motion blur by 50%, and 144hz does so by 58%, where does that leave 165hz, in tandem with CRT Simulation when gaming at 60fps? My guess is just under 65%
Unfortunately, the entire Samsung lineup now has Antiglare. They messed it up!
With this tech, you'll get the motion clarity as good as the refresh rate, and 60fps will look as good as 165 for eye tracking
@@plasmatvforgaming9648
Somebody told me 165hz might be a 60-62% motion blur reduction. 60 seems too low, 62-63% seems more likely but i haven't done the math.
@@plasmatvforgaming9648
lol. I do not want to go back to WOLED colours with a G5, but the downsides to a anti-glare filter may even be worse. I guess it's a matter of picking your poison unless you settle for a 144hz S90F.
The math is easy 60/165×100-100x(-1)=63.64%😁
I was actually looking which video players support BFI and didn’t find anything. If you can play videos in Retroarch with this shader, I can try it out myself! I just hope Retroarch has color management 😅
Yes!
@ I am one of the seemingly few who like Motion Interpolation, I have have Smooth Video Project. But Samsungs Built in feature is better.
I wonder if actually with better motion clarity with this CRT feature I don‘t want the additional frames anymore. I can’t recall seeing stuttery images in cinema.
A reply from the Chief:
"It’s an algorithm that needs to be implemented by software developers into apps. Hopefully some media player author implements this for great playback of 60fps content, maybe ask the VLC or MediaPlayerClassic or other people, about implementing this? Possibly could be done as a DirectShow filter, as long as you remember to make it permanently run the shader every Hz (CRT emulation must successfully modify every refresh cycle), indepedently of the video frame rate.
A software developer can change an internal native:emulated Hz ratio dynamically to match content framerate, e.g. for 240Hz displays automatically use a ratio of 5 for 48fps material and a ratio of 4 for 60fps material. The native would stay unchanged (unless you switch your real display Hz), but you can dynamically change the emulated Hz to simulate a 72Hz CRT for 24p material!
So I welcome VLC or MediaPlayerClassic to implement subframe-capable plugins to allow me to do 35mm BFI (ON-OFF-REPEAT-OFF) for double strobing, or for CRT simulation, or other multiple-passes-per-movie-frame feats.
RetroArch just implemented it already into a developer release, it’s likely the next public release of RetroArch will have it."
Is this like software based BFI?
Yeah. It's very similar in how it works.
It's way better than bfi because it flickers less and can be brighter
@ will try it out, thank you
You know what, someone can port this shader to ReShade and then it can be used on ANY PC game that Reshade works with ( which is most of them ). And it could be done today!
Yes!
I thought this works for Windows Games also, like Modern titles. But it doesn't. :( That's sad, it only works for iso based titles like ones from Sega and so on.
We will have this working for all games soon using SpecialK, LosslessScaling, Reshade, etc
@plasmatvforgaming9648 that's actually amazing to hear. Thanks for the info!
Bro, I get this high contrast/ gamma issues in some games. Remove Windows HDR only for that game as it's conflicting with RTX HDR. It fixed it for me, I hope this is useful
That might be the problem! I'll try it and pin this comment if it works. Thanks!
I cant wait until there is a simple FPGA box with DP 2.1 & HDMI 2.1 passthrough that converts any sample & hold display to rolling-scan, it would also be amazing to have it at the GPU driver level in Nvidia & AMD drivers, and perhaps Nvidia OLED Pulsar FPGA modules could have it too, imagine hardware level Gsync & rolling-scan with zero flicker 0.1ms MPRT on a 30 inch 16:10 aspect clear black glass RGB-OLED monitor.
VRR with this is already in the works
MAKE GAMING/eSPORTS HEALTHY AGAIN
I think your brightness problem is introduced by incompatibility with ffmpeg video playback in RetroArch. You should be trying this with an actual game ROM that shows off 60hz side-scrolling.
Is that something RetroArch could fix? I'll have to get some retrogames to try
“ RRRRIGHTTTTTT NOWWWWWW “ 😂😂😂
I'm sad that I can't use this on my 60hz display. Time to start saving up.
Go for it!
What about 24 fps video?
From the Chief:
"It’s an algorithm that needs to be implemented by software developers into apps. Hopefully some media player author implements this for great playback of 60fps content, maybe ask the VLC or MediaPlayerClassic or other people, about implementing this? Possibly could be done as a DirectShow filter, as long as you remember to make it permanently run the shader every Hz (CRT emulation must successfully modify every refresh cycle), indepedently of the video frame rate.
A software developer can change an internal native:emulated Hz ratio dynamically to match content framerate, e.g. for 240Hz displays automatically use a ratio of 5 for 48fps material and a ratio of 4 for 60fps material. The native would stay unchanged (unless you switch your real display Hz), but you can dynamically change the emulated Hz to simulate a 72Hz CRT for 24p material!
So I welcome VLC or MediaPlayerClassic to implement subframe-capable plugins to allow me to do 35mm BFI (ON-OFF-REPEAT-OFF) for double strobing, or for CRT simulation, or other multiple-passes-per-movie-frame feats.
RetroArch just implemented it already into a developer release, it’s likely the next public release of RetroArch will have it."
He also has an LCD GtG simulation in the works for OLEDs to reduce 24fps stuttering
@@plasmatvforgaming9648 That's great! I know he's also working in a plasma simulator, it's been a while since i last saw a plasma but i'm really interested in it's benefits.
@@josedeleon1923 I'm really excited about the idea of a 2500Hz Neo Focus-Field-Drive simulator, or even a 600Hz Sub-Field-Drive Kuro simulator, would be incredible for watching media content and films, especially animated content, which looks so good on my Pioneer Kuro Plasma, and films are silky smooth in 24FPS pulldown mode, Plasma is to film and video, what CRTs are to vid games.
@@plasmatvforgaming9648 oh wow, so 24-30fps suttering(like the choppyness of certain panning shots) will be reduced without any form of SOE being implemented? If we can get less stuttering or just as much as plasma and CRT for movies & TV than that's a huge win. Because as is, OLED seems like it has triple the amount of film judder(or Stutter is it?) compared to my S60 Plasma. :P
TINK4K can reduces film judder more than plasma & CRT for 24-30fps content with trible strobe BFI at least, so that's still an option that's currently available, with the same 58% motion blur reduction. But somebody on RUclips mentioned that it creates occasional image artifacts....
S90C has BFI, but i don't like that it's only 60fps BFI. If it can do 72Hz, then i would get rid of my Samsung 955DF
It can do 60fps and give you 144 like motion
@@plasmatvforgaming9648I know, but it's not as smooth as 75Hz which is my sweetspot (that's the refresh rate i always use on my 955DF).
Tried this, it worked. But I get weird artifacts if I add scanlines to games, and still get some image retention with my Asus monitor. I think I will stick to just using Losless Scaling which is a superb program and gives me 120 hz when I have 60 hz content.
I'll share when I get it working right
Try turning on integer scalling
And theres a resolution independent scanline shader in shader_slang/scanlines/res ind scanlines
God Retroarch is so archaic, like trying to learn how to use Linux or something. But it does make sense, it's just overly thorough heh
I figured out how to.fix the.messed up Gamma in the next video.
I have a 15 inch CRT monitor from 1995 and the video/picture quality is way better/more realistic than my 1ms 240hz monitor and it has MPRT as well the only thing that looks good is the sharpness since its like 2k resolution, looking for a bigger CRT monitor but its getting harder to find and more expensive (my 1ms monitor feels cartoonish compared to the crt lol idk why I cant explain it)
And CRTs look sharp at any resolution
Is plasma motion any close to CRT?
Bro just downloada a Sonic 1 or Super Mario World rom
They have attract screens which are basically videos so you don't even have to play them
HDR is not supported yet
As for consoles implementing this it would actually be a bad idea right now
Mark and Timothy are still working on it and improving it, in many displays it won't work that well, HDR not supported yet etc.
It might hurt the reputation of this simulator which would be counterproductive
The biggest problem is that consoles don't output more than 120Hz and I see a lot of Console gamers with low end LCDs not liking it.