Hey, just a quick question. What is the difference between this and the 60fps "fluid motion" modes that we get on TVs?
Both are generating frames, but a TV doesn't have the processing power or the complex software required for good-looking frame gen. Use this and avoid the TV one. This one has lower latency too.
@@DragonOfTheMortalKombat Interesting, but one thing to note is that many TVs like mine also have different chipsets. For example, my TV comes with the X1 Ultimate chip (it's a Sony Bravia TV). Wouldn't that also make a big difference in how the TV upscales and frame-gens?
@@JohnDoe-qj3iv I agree with Dragon. If your use case is gaming, you're definitely better off using Lossless Scaling as it just gives you more control in general: you can lower the latency by allowing screen tearing, generate more intermediate frames (2X, 3X and 4X) according to your screen's refresh rate or base FPS, etc. Plus it just gives you more scaling options in general, which TVs don't.
My older Samsung TV, for example, doesn't even let me have Game Mode (which lowers the input latency) and the fluid motion modes enabled at the same time, so it's definitely not ideal for gaming; it's better for media consumption.
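On the latency point: interpolation-based frame gen can't show an in-between frame until the *next* real frame exists, so it has to buffer at least one real frame. Here's a rough back-of-the-envelope sketch (my own illustration, not from Lossless Scaling itself) showing that the multiplier raises output FPS but the buffering cost stays tied to your base frame rate:

```python
def framegen_latency_ms(base_fps: float, multiplier: int) -> dict:
    """Rough latency math for interpolation-based frame generation.

    An interpolated frame can only be computed once the *next* real
    frame has arrived, so the pipeline must hold back at least one
    real frame regardless of the multiplier.
    """
    real_frame_time = 1000.0 / base_fps  # ms between real frames
    return {
        "output_fps": base_fps * multiplier,
        "added_latency_ms": real_frame_time,  # >= one real frame buffered
    }

for mult in (2, 3, 4):
    info = framegen_latency_ms(base_fps=60, multiplier=mult)
    print(f"{mult}X: {info['output_fps']:.0f} fps out, "
          f"~{info['added_latency_ms']:.1f} ms extra latency")
```

This is why a higher base FPS feels better with frame gen enabled: at 60 fps base the buffering cost is roughly one 16.7 ms frame, while at 30 fps base it's roughly 33 ms, whatever multiplier you pick.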
If I set 1360x768 I get some black lines, why don't you?
@@dizzy7376 not sure but if your monitor's native resolution is 1366x768, just leave it at that instead
For me it's not working, it just gives me 10 fps instead of increasing it.
@@dizzy7376 You have the same specs?
Any delay?
Some, at 2X it's not really noticeable imo
Bro, when I play RDR 2 it gives me 90°C on the GPU, why don't you get 90°?
that's not a good thing??
@@Pamela-oj2ly then what temp is acceptable? GPU - RTX 3050
@@MultiEditz13 pc or laptop?
@@sanfrrraa PC
@@MultiEditz13 maybe an airflow issue, do you have exhaust and intake fans on your PC?
man ur supposed to have 45-55 degrees on this gpu. hitting close to 80 means something is wrong with ur gpu or cooling system.
It's probably a laptop, that's normal
It's not a laptop, but it's all good now I figured out how to reduce my temps by a lot
@@Skndeone undervolting goes brr
@@varios4883 Undervolting + a custom fan speed curve and my GPU now stays below 67 or 68 degrees easily even at 99-100% usage
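For anyone wondering why undervolting cuts temps so much: dynamic power draw scales roughly with voltage squared (P ~ C·V²·f), so even a small voltage drop at the same clocks reduces heat noticeably. A quick estimate (the wattage and voltage numbers here are illustrative, not measured from any specific card):

```python
def undervolt_power_estimate(p_watts: float, v_old: float, v_new: float) -> float:
    """Estimate power draw after an undervolt at unchanged clocks.

    Dynamic power scales roughly with V^2, so holding frequency
    constant the ratio (v_new / v_old)^2 applies to the old draw.
    """
    return p_watts * (v_new / v_old) ** 2

# Example: a card drawing 130 W at 1.05 V, undervolted to 0.95 V
print(round(undervolt_power_estimate(130, 1.05, 0.95), 1))  # → 106.4
```

So a ~0.1 V undervolt can shave around 18% off the heat the cooler has to dump, which is why pairing it with a more aggressive fan curve brings peak temps down so effectively.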
@@Skndeone Cool that you managed to cool it down, but it's still too hot for a 1060. Keep in mind that this GPU should sit around 50 degrees without any undervolting or custom fan curves. Check other benchmarks on YouTube and you'll find that people have way lower temps. I also had a 1060 in the past and mine was never hotter than 60 degrees. Maybe the thermal pads in your GPU are the cause of the high temps, who knows