Thanks for the great video! Main point for using 800 dpi as main is the fact that it's the highest DPI that can be played in many different games for a very high cm/360 sensitivity. A lot of the new games don't have sensitivity sliders / numbers that can go low enough when you're using 1600, unfortunately. Adjusting windows sens is a no go - it always created issues in the past.
@@moretoastedthanatoasterstr9773 i play on 12000 DPI and its just amazing for me with high fps to match. To each his own i suggest everyone try high dpi low sens to at least see how they like it.
@@UltraMegaFail ya no I I complete agree was playing on 400 dpi like 2 months after I got a computer and iv had it since 2014 and fuck me wish I bump the dpi a long time ago
wrong but semi right. as an FPS player who is high rank in all FPS games (radiant valorant, global csgo, FPL). we use 400 DPI and 800 DPI because it's more consistent when it comes to pixel tracking. if there is less variable for mistake on lower DPI but if you go too low it will become skippy and unplayable so people assumed 800 DPI was the best but 400DPI has been proven to be just as good.
@@Soleft i can only speak on behalf of everyone i know and everyone in the pro scene of FPS games. it's not like there is a data sheet, we just can feel the difference
@@BattleNonSense I can imagine, thanks for the hard work. Ever since you announced it I was very curious about the results. Maybe you'd be able to make the movement slower by using gears? Or will that be too complicated or maybe cause too much inaccuracy?
@@BattleNonSense Thanks for the results. I would be very very very interested in the same test just with different brands. Take 2 common logitech mice, a razer, a Corsair and compare them please. And if possible with an even lower initial speed, because when tracking a target while full auto I am sure I dont move my mouse as fast as seen in your first test... Thanks! :)
@@kmndra5831 even worse is some games in full screen borderless modes, high DPI can make the invisible cursor go outside the border and then when you click you click out of the game
@@wile123456 He's probably talking about the program "raw accel" iirc there used to be a raw accel dll injection so it's compatible with virtually everything.
Polling rate = temporal resolution. DPI = spatial resolution. The higher latency from slower movement is because time = distance/velocity. The "distance between dots per 1 inch" is greater at lower DPI before the sensor registers movement, and hence faster mouse movement nullifies this.
really wish you would test other sensors besides this one! would be interesting to see if this curve drawn is universal or some affect of the sensor in the death adder.
its the death adder. Theres other testing by other people with many more test subjects that show this. The G Pro X superlight was tested alongside the 10khz models that just came out, there was no latency difference between them. The death adder is unique in this oddity.
If anyone is having a hard time with windows sensitivity you can go into the registry to edit in decimals. I am running 3200DPI most of the so I run it at 2.5 though it doesn't display in either of the mouse menus. Remember to backup your registry before you make any changes. The reg Key is "Computer\HKEY_CURRENT_USER\Control Panel\Mouse" > "MouseSensitivity"
I want to use 6400dpi, but the problem I encounter in game is that the mouse will be too fast, even in lowest game sensivity. My question is, will a low windows mouse sensitivity solve the problem with high DPI I described above?
the best way to look at DPI vs Sensitivity is to think of analog signals. The high DPI is a hardware version of a high signal, while the high sensitivity is a software amplification. With a high dpi (i use 900) and a lower sensitivity, you can get the fast movements you want without sacrificing the super fine details when you are sniping long distances. You cannot just say they have the same equivalent DPI because the sensitivity x DPI is the same number. How the system amplifies each signal will cause issues.
with 900 dpi you basically make your mouse in need to add an interpolation process while registering your mouse mouvement wich can create jitter and inaccuracy. Most mouse has a native dpi of 800 so you need to make a dpi number that is a multiple of 800 since 900 its not a multiple of 800 I suggest you first to check the native dpi of your mouse and use a multiple of it for native 800 dpi use 800 - 1600 - 2400 - 3200 - 4000 etc (tbh more than 2400 is useless in my opinion) Also you need to take in count your monitor this is due to the fact that DPI does not adjust with resolution, meaning lower DPI settings on higher resolution monitors can cause slight stuttering and imprecise aiming in games. Just realise its a 2 years old comment lol. You probably doesnt care anymore at all aha
@@w0uffv379 i have steel series prime and it says 18000 cpi so should i use multiplies of 18000 also should i change desktop resolution when i get into game?
Are the results adjusted for sensitivity? Cuz it would be interesting to see if there is a difference between (let's say) "100 DPI with sensitivity 10" in game compared "1000 DPI with sensitivity 1" to see if there is a delay because the sensor isn't registering the movement or simply because the game doesn't update the camera position because the input value is too small!
I think he's not using a game with a concept of 'sensitivity' or 'eDPI', it's using software which flashes the screen black -> white on an input (i.e. one mouse packet with count values)
BE CAREFUL NOT TO INCREASE DPI TOO HIGH ON YOUR MOUSE, some mice will start smoothing at certain DPI points and will have higher latency. All of Razer's new wireless mice have very good sensors with little to no smoothing (hard to tell) up to 20,000 DPI.
@@rubenbaczo8497 will be less raw, will have like a "filter" between your hardware(mouse) and the cursor in the monitor, will not be 1:1 (mouse movement and cursor in monitor), and this usually add latency (less real time between the response with the mouse/system)
Keep in mind, USB devices don't send updates on their own, they are "polled" at a rate by the computer. The higher DPI's diminishing results may be due to that polling rate. Some mice let you increase that rate.
@@CanwegetSubscriberswithn-cu2it All his graphs are showing is a delta normalized DPI, DPI / (hand distance / cursor distance moved) * delta. This is simply how mouse sensors work, the lower your DPI the longer between your mouse jumping pixels, this should be obvious and common sense considering you know that DPI means "Dots Per Inch." Waffle never claimed it's determined by the PC so we already know you're intellectually disingenuous lol
@@SoftBreadSoft I have actually read tge USB protocol specifications and have programmed USB devises. The 1kHz polling rate was the max available in hatdware for USB 1.1 and 2.0. Only USB 3.0 added the ability to clock the interrupt transfers ss multiples of 125us instead of 1ms.
The polling is done by the USB controller, not the "PC". While there are devices which alow changing the polling rate, these are pre-defined by the device. The PC may choose which one, but they are pre-determined by the device, not the PC. I'm familiar with the actual USB specification, I've done ISB development. I know how this shit operates.
Are you sure this is not simply "the lower the dpi the more movement needs to be done to start mouse movement, hence high delay (of the first action, not continuous actions)"?
I would wager that this is almost entirely a sensitivity effect, not a latency effect. Controlling for the difference in sensitivity by testing e.g. 1600dpi 4sens against 800dpi 8sens should nullify the overwhelming majority of the difference. Chris is using the data measured with different sensitivites to say that a higher DPI which is scaled down to have equal sensitivity would have an improved latency. The data does not support this at all - it's a mistake in interpretation.
That's the thing. Lower DPI requires more movement to be registered, and is thus slower. The higher DPI for a given movement, the more often updates get registered, hence lower input lag. It's similar to how higher FPS provides lower input lag due to more updates, except that in the case of mouse, the frequency scales with movement speed, hence lower delay at higher speeds. At very high speeds, the input lag due to DPI updates approaches zero, leaving the rest of the system lag that in this case seems to be 20ms.
Amazing channel!!! Subscribed! Loved the testing methodology and it helped me understanding why I do prefer higher dpi (1600) over the "standard" 400...
You didnt mention the reason behind the increased input lag. It seems logical to me, that lower DPI "react" slower because the steps between each count are larger, so you have to cover a larger area until the sensor starts detecting movement. Amirite?
This also means that discarding 50% of the mouse counts (or making them move a camera 50% as much) via using a lower sensitivity multiplier would nullify any latency improvement from doubling DPI.
@@AerynGaming Exactly. As soon as every DPI adjustment meets the distance necessary to report 1 count, ALL of then would have the SAME exact input lag at that moment. This video has the most stupid conclusion ever about how mice resolution works. It's like saying that 4k gaming has 50% less input lag at horizontal and vertical movements because you're moving "double the pixels". You can arguee that higher DPI resolution at the same speed will trigger higher polling rate frequency report, but AGAIN, this would be nulified as soon as the lower DPI settings moves the same distance needed for a single count. The only effect that would had at going at higher DPI values is a smoother "granularity effect", that people misinterpret as "pixel skipping". Assuming that the eDPI would be the same.
I am always set to 800DPI and it seems that it's perfectly fine for pretty much all situations, I rarely have a game that it too high on minimum value, I tried 1600DPI to get the tiny reduction in latency but some games couldn't go low enough and I can't be bothered constantly switching, but the difference in latency between 800 and 1600 is so minor anyway
Should have used eDPI to maintain the same overall "sensitivity" and relationship between DPI and in game sensitivity values. If he doubled the DPI, halving the in game value would maintain the same relationship. Otherwise all we're seeing is the time relation between sensor signals as the sensor tracks it's position
Are you sure you aren't measuring a different effect? Like how perhaps a slower DPI means the mouse will have to move further to register sufficient change? In both refresh rates at 100 DPI, you measure about +20ms response time compared to the fastest DPI setting, which suggests to me you are measuring the hysteresis of your setup. Solid methodology though!
he always measure the same distance and the same speed, its like setting low dpi and high in game sens vs high dpi low and low in game sens, it looks correct to me
@@Anderson_Roger I feel like static in game sens is a right choice here, so only the dpi of a mouse can have an impact on the result, he couldnt done both, to keep eDPI the same and compare, but I feel that in game sens being just a multiplier of the mouse dpi, wouldnt change anything
I had a similar thought. For these tests, if you scaled in-game sensitivity with DPI, then I think the results would be more comparable. Meaning, 100 DPI with 5 sens would become 200 DPI with 2.5 sens, and so on. Therefore, the amount of 'turning' that occurs is the same since the ratio of DPI to sensitivity is the same.
Also big note: in games with high zoom levels(i.e. battlefield 4's 40x scope) low dpi will fuck up your aim as it'll become jittery, at that point if you're gonna snipe from long range you might as well use a controller instead with how jittery the aiming becomes at sub800 dpi
Question: When at a lower DPI (like 100) it takes a longer distance of travel for the mouse before it sends another signal. A longer travel does not mean more input lag, it means more movement of the mouse before it sends more signals to the pc. Was this considered when making the test? I dont know as much about latency as you do, so correct me if im wrong.
At the slow speed (10mm, 100ms) it should only take 2.5ms to move between DPI points at 100dpi. Maybe the acceleration delay, from stationary to full speed, is affecting it. It could be mouse smoothing built into the firmware.
@@speedweasel I think that depends on many other things. I have been sitting in front of a PC for 21 years (both for work and gaming, since I was 6) and I have no issues.
@@tredbobek Same for me. A4tech x7 with 800 DPI over 15 years of gaming in all genres (mostly action), the only thing that I have improved over the years is the area for a large mouse pad.
Did you change the Sensitivity ingame to travel the same distance with different DPI? I think this way you tested, in the first case, 100 dpi will travel a little distance, and the 1600+ will travel so much faster, thats why the 1600+ had less input lag. So changing the DPI will make no difference, just change the velocity will. I think the correct way to test will be travel the same distance (like 30cm) with different DPI/Sensitivity values, to see if the DPI alone will change the Input lag.
Yea, I was wondering the same. Possibly the LDAT software is looking at some minimum delta of movement per frame, and at higher dpi you are more likely to trigger that
I did the same test and compensated the sensitivity to match my old one went from 800 to 1600 but no way to actually measure it feels better tho I don't know maybe just placebo effect.
Good point. Most gamers probably have a pretty tight range of speed at which they're comfortable, so the whole "does more dpi reduce latency" thing only becomes interesting when you want to keep your comfortable, trained speed, but wonder if more or less dpi is better. If you do your 30cm/360°, you want to keep it.
To avoid jitter, there is nil response when going from "distance moved 0 dots -> 1 dot", the larger in size the dot, the larger the delay to go from value 0 to value 1.
I always understood that setting the Windows setting to anything other than the default 6/11 would mean you lose a 1:1 relationship between physical mouse movement and how the input is understood by the computer, e.g. skipped pixels for setting higher values. Is that still the case?
4/11 translates to 1/2nd and 3/11 translates to 1/4th. You do keep your 1:1 relationship. But like if you are used to 800 dpi and you go to 1600 dpi for the lesser input lagg but want to remain your 800 dpi mouse feel on desktop. Then you change dpi to 1600 and your windows cursor speed to 4/11
And might even have more digits to work with. Like when the game only offers two in the UI, but the config file supports 8, so you can have your perfect 1.23624 sens.
Can you give a reason as to why this is? Taking the dpi value literally as "dots per inch", do you have to displace the first row of dots for the mouse and system to discover the movement? This would explain why faster moving speeds show less dependence on the dpi value, because the faster you get the minor is the different between how long it takes you to cover the first inches of dots NOMATTER how far they are apart? For slower moving speeds, it takes you longer to cover the first row of dots, and for different dpi values, they are spread apart further. Hence the higher influence of the dpi value on the system latency for slow speeds? It would be great to have an explanation next to your testing :) Anyway, as always, great video!
Calling it input lag is a rather poor choice of words, it's not like a signal takes more time to reach its destination but rather there are fewer updates in a given time. He should have made it more clear that it's behaving closer to how polling rate affects input lag which is in the end the amount of information in a given time period.
In Counterstrike we compare sensivity in eDPI which is calculated by multiplying DPI with your sensivity. So If you have a DPI about 800 and a ingame sensivity about 1 but want a smoother Mouse with the same sensivity you need to Set your DPI to 1600 and your ingame sensivity to 0.5. Or 400 DPI and Ingamesense 2 for a more steady but Not so smooth sensivity. Very simple
In my setup, frame rate would still matter far more than mouse dpi (mine is 400). My 6700k is falling behind, it seemingly can't keep up with newer games like BFV and is probably bottlenecking the system.
Been thinking about this, When I increase my DPI to 1000 from 800 (adjusting in game sense to account for eDPI) I feel like it is a bit to responsive with over shooting etc, so maybe I should be looking at a slight reduction in game sens rather than the standard eDPI calculation. :thinking:
I think it's more to do with you have to move your mouse further to register a change on lower dpi. This being said 1600 still feels more responsive than 400 and because it's smoother, it's easier to track targets even during a flick while at 400 I can't track the image at all.
i just came see this and try it out right away. switching from 800 to 1600 dpi for some reason feels insanely good. tracking in apex feels 100% so much better!
I was thinking the same thing. I’ve always been at 800 and never changed it before because I change my in game sens accordingly. I see this comment is a month old, are you still on 1600 and do you enjoy it?
@@Sandzzy i'm still at 1600 dpi rn and i do really enjoy it! i change my windows mouse cursos speed setting to 5 instead of 6 because its too fast if i just browsing in general.
I think this makes logical sense due to a lower DPI value only detecting movement after a larger physical distance on your mousepad. This will mean the first input is naturally going to be delayed, however this is the very reason many pro gamers will choose to use 400 DPI, because it detects only significant inputs and will not move your crosshair held on a pixel with a shaky hand for instance. As seen by Battle(non)sense's test with flick shots however, it does not have an effect on the latency of the mouse.
This is exactly why I use 450DPI (used 400 @ 1080p, went to 450 when I upgraded to 1440p). I tried 800 and it's not steady enough, picks up too many micro movements.
@@chy.0190 No, csgo works really well on 400 and pros will usually play 400 or 800 because it is ideal for stability and precision. Only reason I ever left 400 is orher games not doing well on it, but csgo worked just fine.
I feel we are missing an insight here? If we imagine that DPI is the amount of squares on a chessboard, and we use that chessboard as a mouse mat. Each square is a signal of movement from the mouse. So the higher DPI, the more squares on the chessboard. If we now drag our mouse across the chessboard - the smaller the squares, the faster the initial response of movement would be. But would higher DPI really reach its target faster?
I've always been using 400 DPI, not only because it feels good imo, but also because a lot of games have horrible ingame sensitivity settings, sometimes not even supporting decimal spaces. That means I basically have to use a low DPI to get anywhere near my usual eDPI. But I'm sure I'm not the only one with this problem :/
Yup that was the main reason I stuck to 400 dpi for so long, but recently after battlenonsense made this discovery I switched to 800 dpi anything higher is too fast for me, and so far I havent ran into too many problems with sensitivity scaling in modern games, but a few years ago it was really common.
@@TheKillerZmile raw input doesnt invalidate the use of a higher DPI. Raw input means mouse input goes directly to game, meaning the input doesnt get rendered by your windows sensitivity settings or mouse acceleration by windows, anything like that. It's useful for sure, but the whole DPI thing still applies. 400 DPI is pretty suboptimal but unless you NEED every competitive edge you can get, you should probably go up to 800 or 1600. Just because my mouse can I use 12000 DPI with sensitivity on everything including windows scaled all the way down. Unless a game doesn't support extremely low sensitivities I have the DPI scaled up in Logitech G HUB. If your mouse can increase DPI super high I recommend you try it out, it's super cool to see how responsive your camera is to tiny mouse adjustments. And when you take a second to look close you can see how unresponsive your mouse was compared to with a higher DPI. This is unnoticeable with normal use unless you're making microadjustments in an FPS game
Basically DPI can be viewed as a measurement update frequency per inch. The higher the DPI the sooner the first measuremnt gets sent when the mouse moves. Which results in lowering the input lag in effect. But we can also increase the physical movement speed of the mouse movement to get the measurement sent earlier when we have low DPI which also lowers the input lag. At certain movement speeds we reach a cap on input lag irrelevant of the DPI.
I just read your farewell announcement Chris. I'm sad that you're no longer able to make new content due to circumstances out of your control but I'm also glad to have be here with you since the beginning of your RUclips journey since the Battlefield 4 days. Your videos have always been top notch quality and I appreciated every second of it. You are very talented and it's very rare to find someone such as yourself who consistently creates top quality RUclips content. I hope everything works out for you and your family and I wish you all the best. Good luck to you and your family and take care of yourself Chris. ♥
The reason I'm not moving up to higher dpi values from my 400 is because my edpi feels the same as the standard windows cursor speed (which is 6). Therefore, if I increase dpi, my cursor will fly very quickly, and slowing it down through the mouse parameters in windows will add mouse acceleration (you can check it by yourself by using Mouse Movement Recorder app), so I do not recommend sacrificing the lack of acceleration for the sake of a couple of milliseconds of Input-lag. Display scaling in windows settings also adds mouse acceleration, so keep it 100%.
For anyone curious, i suggest you to calculate your EDPI first. It's basically your current DPI X In game sensitivity. For example, 400dpi x 1.3 sentivity= 520 edpi. If you want to switch to higher dpi, simple divide your EDPI to the new DPI option. For example, 520 edpi÷800dpi= 0.65 is the new in game sensitivity with 800 dpi.
1.3 what sensitivity? in pubg that i care about there is vertical sensitivity and general sensitivity but the general is 0-100 and i have it 42 with 1100 dpi,that means my edpi is 46200,so if i want to go 400 dpi that makes 46200/400dpi=115,5 so i cant do that,,,i guess in pubg works till around 500dpi or something...on higher dpi works ok i guess couse you can go to the 0 if you need too...
I would prefer it if, instead of a slider, I could actually type out a specific sensitivity to 3 decimal places of precision. The relationship between DPI and sens is pretty exponential and shooting for high DPI really requires more precision on the sensitivity scale.
i assume the "input lag" is the time between the start of the movement and the first registered "pixel" or "dot" of the sensor. thats probably why the faster solenoid setting reduced the latency in lower dpi setting, because the first pixel would be registered by the sensor earlier
Very interesting. I have for a long time used high dpi, with in game devider (0.01 - 0.99) as I belived I felt more connected to the game (1800). Then for windows desktop stuff I just used a differnt profile on the mouse.
@Alex.UA6 Yea, not necessarily more delayed, but just generally bad. nobody would ever deliberately decide to use 500HZ over 1000HZ. I can see people deciding to use 1000HZ over 2K of 4K because they got used to 1K, but generally most inputs from a 500HZ mouse will be more delayed than a 1000HZ one
it depend on the game tbh for example r6 is coded for 128hz mouse polling rate so if you use 1000hz there you nerf yourself. apex legends for example support max 500hz (mby that changed i talk about like a year ago i didnt play r6 and apex since) so better use 500 on apex and most game tbh counter strike does support very well high mouse polling rate so its fine to use 1000hz + there same for valorant and im not sure but i think call of duty also support it very well.
Great video. I’d like to see a video about changing the in-OS Windows sensitivity and how it affects games and input lag. Historically, adjusting the in-OS Windows mouse sensitivity from its default position produces major mouse inaccuracies, which can lead to your games feeling all wrong. It is reminiscent to when you have the “Enhance pointer precision” checkbox set on ON. (Which you should also never do.) There is a lot of research out there about it that you can Google for anyone curious.
Most games nowadays use raw input from the mouse though, so that _shouldn't_ be a factor. Unless you're playing an older game of course. Completely agree that it could be a nice video to see though, just get a definitive answer once and for all.
@jparkerwillis I’d also like to know if the settings are different if you’re using the new Windows 10 sensitivity slider or the classic method inside the Control Panel.
@@Plazmunky It's the new one that matters in Windows 10, not sure why everyone always talks about it like there's just 11 steps. The "classic" steps just correlate to _1, 2, 4, 6, 8, 10, 12, 14, 16, 18, 20_ on the new slider - yet all of those on the new one do affect cursor movement. If you set it to 19 for example, instead of 20 there is a difference 5% in speed.
@@BattleNonSense hi there, I'm curious about using macro software like X-Mouse button control will add some input lag, can you test it? for some reason many gamers use that software to change or create macro on their mouse like burst click or like auto reload when clicking button
I was also hoping to see testing for DPI changes vs in-game sensitivity as well. I think I set and forgot my mouse at around 1200 dpi and adjust sensitive per game to what feels right. I have a small space for the mouse, and use wrist movents usually, with a little bit of arm room.
The same thing applies. All ingame sensitivity setting affects is how much your camera rotates per pixel traveled. Meaning it depends on your DPI. If your DPI is 1000, your camera will make 1000 movements per inch the mouse travels. If your DPI is 10000, your camera will make 10000 movements per inch the mouse travels. Lower the sensitivity accordingly to make up for this
@@bigbob5103 Pretty old comment but the only FPS games I remember playing the most was the Halo series and that was on console. I dont have much interest in FPS most games in general, not the multiplayer ones anyway.
Long-story short, use higher DPI and lower the in-game sensitivity to make up for it, then after that get a smooth mousepad, trust me it makes all the difference, maybe the mousepad even more.
Maybe mice with lower max DPI on their sensors generally have less input lag than using the same DPI on a mouse with a higher max DPI. So a Viper Mini with 8k max DPI using 1000 DPI has less latency than a Deathadder v2 with 20k max DPI using 1000 DPI because of how high/low the DPI is currently set relative to the maximum DPI possible.
So you're measuring initial mouse movement, correct? Let's say I have a length of, I don't know 10mm, along this length I have two setups: Length 1: 100 dots Length 2: 3200 dots If I drag something along this length, I'm far more likely to hit a dot earlier on length 2 than 1. What you're measuring is "time to hit first dot" and not the response time.
Imagine Usain Bolt was sprinting against a 5 year old and they both set off at exactly the same time. If we apply the same testing philosophy that Battlenonsense did in his video, we would only measure the time until the first foot landed on the ground, and because of their tiny little legs and narrow stride pattern, we would therefore conclude the 5 year old was the faster runner.
I would like to note that you can't have windows cursor speed presets like you can with DPI presets. So some games that use the cursor in game will have an uncomfortable speed if you use 2 different DPI for desktop and gaming. Just get a bigger mousepad ;)
It's because high dpi settings are more sensitive than low dpi. The sensor of mouse catches smaller movement of mouse at higher dpi. So as you can see in the video, mouse travel speed affected on initial delay difference between DPIs, But your total travel time to destination(flick time) will be same anyway. Well i don't think this initial low delay provides any benefits. Assuming your dpi is 1600 and in-game sense 0.25 in CSGO. Csgo's degree per pixel is 0.022 so your initial travel degree will be 0.022x0.25 = 0.0055. Compared with 400dpi@1.0(same eDPI) your benefit is only under 0.011°. From over 0.011° to 0.022° scenario, 400dpi will be faster than 1600dpi because 400dpi will jump to 0.022° immediately while 1600dpi still traveling per 0.0055°. Flicking in 0.011° faster but slower in 0.022° isnt meaningful. Use whatever you want. 400dpi is also good option
You’re forgetting that with the same eDPI and high DPI equivalent you will also get significantly smoother motion while tracking because information is sent more often from the mouse as opposed to with low DPI where the game has a stronger multiplier for less data to achieve the same distance traveled. Alongside the input lag benefit, higher DPI is much more precise whether your flicking or tracking.
@@xzraiderzx308 Flicking is more consistent in lower DPI. Lower DPI eliminates shakiness which is the biggest factor that messes up flicks. Tracking is more consistent in higher DPI.
so basically what i took away from this was; low sens, high dpi high sens, low dpi but depending on what speed youre moving your mouse at neither matters that much. there’s also ig taking into account what kind aimer you are as well as depending on game(s) you play. in a precision FPS shooter like CS or Val for example, if you’re a smooth aimer, it’d be better to play low sens high dpi since most of the time you’ll be trying to achieve the most consistency in accuracy, and obviously from a logical standpoint the vice versa would apply for aimers that like to be more flicky. if you want to achieve the best of both worlds per say then i guess it would be best to go with 800 dpi and whatever you would consider a medium sens. but really at the end of the day its just personal preference. if you read this, ty and have a good one :)
I like to crank the DPI and lower the sensitivity in game to what feels good. Seems to get a much smoother feel. Basically same idea as a higher resolution making a smoother image all other things being equal.
Can you double check this with adjusting the sensitivity to scale the dpi, to make sure you are actually measuring the input lag due to DPI and not just a sensitivity effect?
There is no "sensitivity effect". When the mouse reports movement the angle you look changes in game no matter what sensitivity is set. LDAT reports the first instance of movement, not how much it moves.
That feels wired maybe i got somethink wrong but: - If i move my mouse at 1600 DPI ~ 20mm from left to right to turn around 180 degrees for example, shouldnt be the calculations distance / time ? less distance in less time = less input lag also ? - if i move my mouse at 400 DPI ~ 80mm from left to right to turn around 180 degrees shouldnt it be slower and have more input lag special cause ME move the mouse causing the input lag to increase on move distance ?? ~ 1600 dpi ~ 20mm " flickshot " vs 400 DPI ~ 80mm " flickshot " causes the 1600 DPI " less movement ~ distance / time "= less input lag at all ?
While testing the latency seems useful on the face of it, there's a lot of information missing that would make this test actually useful in real life. First, mouse movement to cursor fidelity at each DPI setting in addition to testing for input lag. Second: testing across multiple mice used in the esports space, from multiple additional brands including (but not limited to) Logitech, HyperX, and Zowie. Third: test how much the proprietary software adds to possible latency (and movement fidelity) by turning it off (which in many cases forces the mouse to a default dpi setting) and adjusting the operating system mouse sensitivity instead. Such a video would be AMAZING.
true, but this dude already puts in so much effort into his videos, it would be insanity to go so much further and actually make these experiments be fully applicable in real-life applications. i'd love to see it, you would love to see it, everyone would love to see it, but i think he'd need a friend or someone to help him with the boring and time-consuming parts. it's sad, but at least we have SOME data which we can apply, and hope it has a positive impact on our latency. i love videos like these because i'm really concerned with the details of my system, even though i'm not yet able to play games professionally. so at least we can find some minimal use from these experiments of his, not all of the effort is going to waste. the main issue is as you said, a lot of missing info. but that does not necessarily mean we can't use these graphs to adjust our system. for instance, me using the razer viper 8k, a DPI of 1600 is going to be objectively the best (check out his other video if you want) no matter if other factors also increase the latency, because afterall it's just going to be simple addition/subtraction, i.e. if an application increases latency, that latency is very unlikely to directly impact how much latency a low DPI induces for instance, instead, their 'scope' is more likely to be separately/individually influencing latency. albeit not impossible, the results should be predictable is what i'm saying, however we have no way to confirm that or know exactly how other factors influence things. and just doing the minimal amount of work to prove this would still be a lot of work.
POLLING RATE (mouse refresh rate in HZ) is what changes your input lag // DPI is just the measurement of the mouse movement over a certain distance; you can lower dpi but increase sensitivy, or vice versa, and you will accomplish the same thing.
I think the 4k mouse trend is mostly a joke in 99% of scenarios. A lot of great pros still use Zowie EC2. You can say they're stuck in their old ways if you want but it is certainly a comfortable mouse. Watch the video on the gamer nexus video about latency which includes a very well educated Nvidia spokesperson. Mice are one of the lesser important things in overall latency but still important if you're competitive with a good to great rig. He said most gaming mice are in the 1ms response time and even the most popular mouse in shooters (g pro wireless) have a 0.8ms response time in his case. There can be slight deviations I'm sure. In the end, I think what is most important is at least a 1 ms response time to when you press the left and right click buttons and comfort. Mousepads can also be thrown in the comfort level. Some pros who playing games that require heavy tracking like lyr1c use a glass pad. A glass pad does have a weakness though and that's stopping power for pixel perfect flicks
What you should also keep in mind is your mouse sensor's limits. For example my Razer Deathadder Elite has a maximum DPI value of 16000 but after 1900 DPI it starts to use smoothing which increases input lag. At higher steps is smoothes even more increasing the overall latency so I have to use my mouse at 1800 DPI to avoid this and have the lowest possible input lag.
FINALLY. And with this people won't take me for a crazy person when i say "high dpi low sensitivity is better for sniping": low dpi mean more distance necessary for the mouse to register movement and also for this movement to translate in-game. Try playing with a 40X scope in bf4 with 100dpi vs 2000, it'll be like the difference between a slideshow an real life.
Agreed. in some games though you can do it even if the ingame settings won’t allow it. You just have to tinker with game files. In r6s, you have to open some ini file. It’s not as intuitive but that’s just how it is. It differs for every game.
I'd love to see if the DPI affects tracking accuracy. If you could somehow do the same movement over and over, but not a simple linear one, I'd love to see if the lower latency comes at the cost of lower accuracy in travelled distance (I'm assuming that the higher sensibility should have higher errors and while also extrapolating less data more often).
It doesn't in any good sensor implementantion nowadays. Which is truth for probably any new mouse and sensor. Still, you don't have any difference in "latency" going at higher or low DPI values. Don't worry about it. Use a value that you're confortable with.
@@daniloberserk the video and other independent test literally show a difference in latency lmao what? Also why are you putting the word latency in quotes as if it's something that doesn't exist? It seems more like since you don't understand it, you rather play make believe that it isn't real
@@RequiemOfSolo I'm honestly tired to discuss this here. This discussion isn't even new, I remember when cpate talked about that on OCN. I literally work with perypherals and I'm digging every community about this stuff since 2004. I don't care if dumb kids nowadays don't have enough critical thinking to filter stupid information. Battle non sense spreaded misinformation from amd chill and the person who actually developed that tool had to correct him about his misinformation on the topic. And battlenonsense never replied the post or updated the video. So this should tell enough about the "source" that he is. - Does an atomic clock measure a second faster then an quartz clock? Is 4k screen 4 times faster then 1080p because they show 4 times the amount of pixels? Unless some sensor has serious design flaws, every count will be reported at the same rate. The handicap here is an bigger threshold of movement for less DPI. It's that simple. Since people is trying to reduce input lag while using the same eDPI, and a flick from point A to point B will still take the same amount of distance when moving the mouse. It doesn't matter if "first on screen reaction" happens faster with more DPI, you're just moving less degrees of motion. No one flicks an subpixel motion, your input chain will not work any faster. More "reactive" motion doesn't mean you have less latency. This is why DPI /CPI is mouse RESOLUTION and not mouse "speed". And the rule with resolution is, enough is enough. Unless you're playing with an ludicrous high eDPI value, It's pointless to change DPI for any "theoretical" advantage, even for the geekest of the geeks. Like seriously. It's not that hard to understand. You can have the same effect just by moving your arm faster. As soon as the speed of motion matches a single count for any DPI you're using, both will be reported with the same latency. Of course. Since we can't have infinite speed and acceleration. Higher DPI will always report a single count faster. But unless you're raising your eDPI, you can't take advantage from that for any real application , since any usefull motion will take hundreds or even thousands of counts sometimes. He should've measure the amount of time for a motion from point A to B on the screen. Not "first on screen reaction". As I said a lot of times here already, bad methodology leading to stupid conclusions. - Believe in whatever you want. Battle non sense is just and average guy with an enormous ego, he's not willing to take a second thought about his conclusions. But I do, as I used to believe that high DPI was objectively better for the same reason. Good luck with those "faster" 0.00001 degrees of motion. It'll certainly unlock your divine powers in gaming. God bless those sacred RUclipsrs dumbing down complex discussions so the average kids can "know stuff" without wasting their brain power. Maybe one day when I have enough patience I'll make an more "visual" explanation for the dumb ones. If you really need less input lag on a mouse. Which is already the smallest source of lag in your input chain. Just buy any mice that has a polling rate of 8khz. I arguee that the best thing about 8khz is better flick shot precision and not input lag. But only for games who actually can support sub frame input. As overwatch does for example.
@@daniloberserk can you point me to where I claimed the input lag difference was large enough to significantly improve my in game skill? I don't seem to ever recall that being the point I made. Weird how that happens. Please refrain from turning this into your personal blog where you dump all of your schizophrenic ramblings and pent up anger from previous interactions. My point was that the reduction simply does exist. Also you're wrong about the mouse being the smallest source of lag in the entire chain. It is one of the smallest yes, but game simulation and driver latency are both smaller on average for majority of systems/games/drivers. Display scanout can be even smaller too with 240-390hz displays.
@@RequiemOfSolo it doesn't exist any difference if you hit the threshold for a single count on any DPI setting. It's really not that hard to understand that. Just because you need less movement for a single count with high DPI doesn't mean the mouse works "faster". And as I said. First on screen reaction for this measurement is useless. Unless for some reason an bad sensor may work different in some DPI setting, which used to happen when sensors had native resolutions. Not the case anymore. Even an schizophrenic may understand this discussion better then you tbh (or most fanboys from this video).
Won't some of the diminishing returns over 800DPI be due to the polling rate being capped at 1000hz also? If DPI is 1600 but polling is only 1000hz, it's not a true 1:1 reporting rate to the PC?
The reason people use lower dpi is for accuracy and not latency. Latency seems barely different on 800 vs 1600 so perhaps 800 is the best of both worlds if latency really is something you thinks limits you.
I hope you disabled enhanced pointer precision in the windows settings. And It would be cool to see if high mouse dpi but low ingame sensitivity settings effect the results? (basicallly 400dpi 4 ingame sens. vs 1600dpi 1 ingame sens comparison)
Nah, this guy is a an absolute noob. He won't have even used the windows mouse accel patch nevermind unchecked the pointer precision box. Battle(non)sense? More like bottle nonce sens(itivity)! LOLOL
The reason low dpi causes higher delay is simple: 100 dpi is doing 100 samples/inch, 1600dpi is doing 1600 samples/inch. if you move with 1inch/second speed for 1 inch distance: 100dpi = 100 sample per second = 10ms delay between 2 sample 1600dpi = 0.625ms/sample = 0.625ms delay between samples after 1600 the diminishing return kicks in hard and the difference is barely noticeable even on slowly moving mouse. if you move the mouse fast, lets say 10inch/second speed for 1 inch distance: 100dpi = 1000sample/second = 1ms delay 1600dpi = 16000sample/second = 0.0625ms delay That is why there is barely any difference for fast movements. Also there can be differences above these for each mouse because they might process the DPI data differently before sending it.
I've been using 800 toggle for minor sniping corrections and 6400 with windows pointer speed at 3 for about 10 years now. In-game where possible i select untainted system speed or 1.00 which usually represent system speed. So I basically figured out with my senses since 2001 what's the best way to setup a mouse hardware wise, interesting
Great video, my main gripe is you recommend adjusting the Windows sensitivity at the end. Everything I have read is to NEVER ever ever adjust that to anything other than 6/11 as it introduces imperfect mouse tracking.
The 6/11 info is possibly outdated by many many years. I believe issues are supposed to arise going over 6, not under. Plus, most games use Raw Input, ignoring the Windows slider anyway.
@@ChrisGarcia683 I edited my post a little. It's 4am...i don't have any info off the to top of my head, but will reply when I do and remember . I've personally never needed to adjust it, buts there's MarkC reg tweaks for that. Or Raw Accel for the sens option.
If you disable EPP and use the reg fix, the values below 6/11 are pretty much a linear scale. Going down to 5/11 gives you 75%, 4/11 gives 50% and below that it halves the speed with each step.
Damn Yesterday I was thinking does dpi makes a difference in latency and I saw you're Razer Viper 8K Video and what I Saw is the higher DPI you go up until 3200 the lower the delay gets, but I watched this video and now I play at 1600 dpi instead of 400 dpi cause all the pro's made it 400 I copied but thank you Battle(non)sense for making me change my mind, EDIT: btw I use very low sens at 2000 edpi and moving my mouse like a turtle that's why I changed it
He didn't really come to the conclusion that higher DPI = better. He specifically states so in the video. Only a single mouse was tested and other potential variables need to be considered. I've watched Logitech engineers explain how their sensor technology works in their top end mice and DPI should not impact input latency, only how fast the mouse moves.
Well He tested the razer viper 8k and that's my mouse, and tbh I really don't care about how minor is the difference in latency if I get it lower than its good for me and I'm playing with a comfortable 1600 DPI and he stated to use what ever dpi you use and I use very low sens and move my mouse very slow so that's why I made the Dpi Higher and it feels for some reason smooth and no it wasn't pixel skipping .
wouldn't you have to move the mouse 4x faster at 100DPI to compare against 400DPI and then compare the latencys? because you would do that to make the same flickshot.... it seems to me that the "latency" - idk if you could even call that like this - is just because it for sure takes longer to reach the first dot at 1/100 inch than at 1/400 inch when you move at the same speed. that doesn't seem like latency though to me because the sensor is just not sending data until you move it one dot. if it was for real latency difference one had to measure the time from sensor registers movement to display. If you adjust the speed i think it could/should be the same results too.
Your test probably suffers from the same problem as other video I've seen: You're not measuring latency, you're measuring accuracy+latency. Your results show time it takes for your machine to move the mouse by initial 1 unit of distance from stand-still, plus latency. 1 unit of distance is bigger for lower DPI, thus the time is bigger as well.
I'm glad to see someone actually understands how to properly research something. This entire comment section is filled with pseudo intellectuals who huff their own farts.
Thats what i did from 400 dpi to 1200 dpi and the delay and tracking was better. I up my mouse dpi and lower my windows and ingame sens and its like a free upgrade from 125hz to 1k polling rate. It really detects micro movement which is higher polling rate mouse offers
@@SimoneBellomonte 1600 is the highest you'll get any difference on with a 1000hz mouse, highest DPI would be terrible for normal PC usage and for setting sensitivities in game
Thanks for the great video! Main point for using 800 dpi as main is the fact that it's the highest DPI that can be played in many different games for a very high cm/360 sensitivity. A lot of the new games don't have sensitivity sliders / numbers that can go low enough when you're using 1600, unfortunately. Adjusting windows sens is a no go - it always created issues in the past.
Yo exactly I was ganna go to 1600 but most games ingame sensitivity doesn't go fucking down enough doing fucking 3 360s in one mouse swipe
@@moretoastedthanatoasterstr9773 i play on 12000 DPI and its just amazing for me with high fps to match. To each his own i suggest everyone try high dpi low sens to at least see how they like it.
@@UltraMegaFail ya no I I complete agree was playing on 400 dpi like 2 months after I got a computer and iv had it since 2014 and fuck me wish I bump the dpi a long time ago
wrong but semi right. as an FPS player who is high rank in all FPS games (radiant valorant, global csgo, FPL). we use 400 DPI and 800 DPI because it's more consistent when it comes to pixel tracking. if there is less variable for mistake on lower DPI but if you go too low it will become skippy and unplayable so people assumed 800 DPI was the best but 400DPI has been proven to be just as good.
@@Soleft i can only speak on behalf of everyone i know and everyone in the pro scene of FPS games. it's not like there is a data sheet, we just can feel the difference
I've been waiting for this
Yeah this one took a while. :)
Me too, amazing content.
@@BattleNonSense I can imagine, thanks for the hard work. Ever since you announced it I was very curious about the results.
Maybe you'd be able to make the movement slower by using gears? Or will that be too complicated or maybe cause too much inaccuracy?
@@BattleNonSense Thanks for the results. I would be very very very interested in the same test just with different brands. Take 2 common logitech mice, a razer, a Corsair and compare them please. And if possible with an even lower initial speed, because when tracking a target while full auto I am sure I dont move my mouse as fast as seen in your first test... Thanks! :)
@@BattleNonSense quake champions at next patch get nvidia reflex ,plz test input lag 🙂
I love high DPI but I hate how most games the i game sensitivity settings aren't low enough
@@kmndra5831 even worse is some games in full screen borderless modes, high DPI can make the invisible cursor go outside the border and then when you click you click out of the game
I just experience this in New World
With raw accel you can use high dpi with a lower sens multiplier
@@a1e738 I know, that's what I do, but not every game allows you to go low enough in sensitivity if you use 12k DPI and above
@@wile123456 He's probably talking about the program "raw accel" iirc there used to be a raw accel dll injection so it's compatible with virtually everything.
Polling rate = temporal resolution. DPI = spatial resolution. The higher latency from slower movement is because time = distance/velocity. The "distance between dots per 1 inch" is greater at lower DPI before the sensor registers movement, and hence faster mouse movement nullifies this.
This
so, keep using 400 dpi ?
@@kontrasergeant yes only noobs watch these videos and think they are real
Explain better to a stupid gal like me?
really wish you would test other sensors besides this one! would be interesting to see if this curve drawn is universal or some affect of the sensor in the death adder.
It's kind of sad that he is just testing one mouse. Would be great to see input lag results for the XM1 too.
are all sensors the same?
@@TheKillerZmile no
its the death adder. Theres other testing by other people with many more test subjects that show this. The G Pro X superlight was tested alongside the 10khz models that just came out, there was no latency difference between them. The death adder is unique in this oddity.
@@TheKillerZmile no
If anyone is having a hard time with windows sensitivity you can go into the registry to edit in decimals. I am running 3200DPI most of the so I run it at 2.5 though it doesn't display in either of the mouse menus. Remember to backup your registry before you make any changes. The reg Key is
"Computer\HKEY_CURRENT_USER\Control Panel\Mouse" > "MouseSensitivity"
I want to use 6400dpi, but the problem I encounter in game is that the mouse will be too fast, even in lowest game sensivity.
My question is, will a low windows mouse sensitivity solve the problem with high DPI I described above?
the best way to look at DPI vs Sensitivity is to think of analog signals. The high DPI is a hardware version of a high signal, while the high sensitivity is a software amplification. With a high dpi (i use 900) and a lower sensitivity, you can get the fast movements you want without sacrificing the super fine details when you are sniping long distances. You cannot just say they have the same equivalent DPI because the sensitivity x DPI is the same number. How the system amplifies each signal will cause issues.
Yeah so basically just go for highest DPI and lower in-game sens accordingly, and you get both more accuracy and less lag.
with 900 dpi you basically make your mouse in need to add an interpolation process while registering your mouse mouvement wich can create jitter and inaccuracy. Most mouse has a native dpi of 800 so you need to make a dpi number that is a multiple of 800 since 900 its not a multiple of 800 I suggest you first to check the native dpi of your mouse and use a multiple of it for native 800 dpi use 800 - 1600 - 2400 - 3200 - 4000 etc (tbh more than 2400 is useless in my opinion) Also you need to take in count your monitor this is due to the fact that DPI does not adjust with resolution, meaning lower DPI settings on higher resolution monitors can cause slight stuttering and imprecise aiming in games. Just realise its a 2 years old comment lol. You probably doesnt care anymore at all aha
@@w0uffv379 i have steel series prime and it says 18000 cpi so should i use multiplies of 18000 also should i change desktop resolution when i get into game?
Are the results adjusted for sensitivity?
Cuz it would be interesting to see if there is a difference between (let's say) "100 DPI with sensitivity 10" in game compared "1000 DPI with sensitivity 1" to see if there is a delay because the sensor isn't registering the movement or simply because the game doesn't update the camera position because the input value is too small!
= same eDPI
Yeah the test methology is inaccurate and inconclusive without accounting for eDPI. He should retest with the proper ingame sensitivity adjusted.
@@TheVoitokas this
I think he's not using a game with a concept of 'sensitivity' or 'eDPI', it's using software which flashes the screen black -> white on an input (i.e. one mouse packet with count values)
@@tomhepz we still need ingame tests with same eDPI lol. Nobody plays testing software😂
what we all have been waiting for, amazing job!
BE CAREFUL NOT TO INCREASE DPI TOO HIGH ON YOUR MOUSE, some mice will start smoothing at certain DPI points and will have higher latency. All of Razer's new wireless mice have very good sensors with little to no smoothing (hard to tell) up to 20,000 DPI.
I know this is older but I would like to see where you got this info.
@@DoubsGaming I'd never be able to track it down, I actually looked for it a couple days ago. Their is/was a reputable source for that though.
@@DoubsGamingIt's been a month, bit might still be useful. Techpowerup reviews include in-depth testing of sensors, with one part being smoothing.
What does smoothing means and why is it bad?
@@rubenbaczo8497 will be less raw, will have like a "filter" between your hardware(mouse) and the cursor in the monitor, will not be 1:1 (mouse movement and cursor in monitor), and this usually add latency (less real time between the response with the mouse/system)
Keep in mind, USB devices don't send updates on their own, they are "polled" at a rate by the computer. The higher DPI's diminishing results may be due to that polling rate. Some mice let you increase that rate.
USB 1.1 and 2.0 are 1kHz max, 1ms repear rate. And it's the device that decides this, not the PC.
@@CanwegetSubscriberswithn-cu2it All his graphs are showing is a delta normalized DPI, DPI / (hand distance / cursor distance moved) * delta. This is simply how mouse sensors work, the lower your DPI the longer between your mouse jumping pixels, this should be obvious and common sense considering you know that DPI means "Dots Per Inch." Waffle never claimed it's determined by the PC so we already know you're intellectually disingenuous lol
@@SoftBreadSoft I have actually read tge USB protocol specifications and have programmed USB devises. The 1kHz polling rate was the max available in hatdware for USB 1.1 and 2.0.
Only USB 3.0 added the ability to clock the interrupt transfers ss multiples of 125us instead of 1ms.
The polling is done by the USB controller, not the "PC". While there are devices which alow changing the polling rate, these are pre-defined by the device. The PC may choose which one, but they are pre-determined by the device, not the PC. I'm familiar with the actual USB specification, I've done ISB development. I know how this shit operates.
@@CanwegetSubscriberswithn-cu2it Was he using USB 1 or 2? That would be another flaw in the video.
1 thing, speed of tracking and hz ramp are connected. If you move your moouse real slow the sensor won't track at 1000hz.
Why not
@@firellio070 download MarkC windows10 fixer and use the movement recorder. See for yourself. I'm no claiming to know why, but it is like that.
Are you sure this is not simply "the lower the dpi the more movement needs to be done to start mouse movement, hence high delay (of the first action, not continuous actions)"?
This is a very insightful comment.
Maybe using the data for when the mouse stops (instead of the start) can minimize this effect...
Does that matter? are you going to game the system by introducing a quick flick before every tracking attempt?
I would wager that this is almost entirely a sensitivity effect, not a latency effect. Controlling for the difference in sensitivity by testing e.g. 1600dpi 4sens against 800dpi 8sens should nullify the overwhelming majority of the difference.
Chris is using the data measured with different sensitivites to say that a higher DPI which is scaled down to have equal sensitivity would have an improved latency. The data does not support this at all - it's a mistake in interpretation.
That's the thing. Lower DPI requires more movement to be registered, and is thus slower. The higher DPI for a given movement, the more often updates get registered, hence lower input lag. It's similar to how higher FPS provides lower input lag due to more updates, except that in the case of mouse, the frequency scales with movement speed, hence lower delay at higher speeds. At very high speeds, the input lag due to DPI updates approaches zero, leaving the rest of the system lag that in this case seems to be 20ms.
@@AnimeReference yes, it does matter for specific games, bunnyhop to be specific
I always use 1200 dpi. It just feels right for me. Glad to know that is in a good range concerning input lag.
same. but i do use 800 for games
Amazing channel!!!
Subscribed! Loved the testing methodology and it helped me understanding why I do prefer higher dpi (1600) over the "standard" 400...
i use 9000
@@hapticwarframe5730 i use 100000
You didnt mention the reason behind the increased input lag. It seems logical to me, that lower DPI "react" slower because the steps between each count are larger, so you have to cover a larger area until the sensor starts detecting movement. Amirite?
Yes
Obviously it reacts slower. The video is about finding out just how much more input lag it adds, not if.
@@rdmz135 Huh?!! Do you even know how the LDAT works?!
This also means that discarding 50% of the mouse counts (or making them move a camera 50% as much) via using a lower sensitivity multiplier would nullify any latency improvement from doubling DPI.
@@AerynGaming Exactly. As soon as every DPI adjustment meets the distance necessary to report 1 count, ALL of then would have the SAME exact input lag at that moment. This video has the most stupid conclusion ever about how mice resolution works. It's like saying that 4k gaming has 50% less input lag at horizontal and vertical movements because you're moving "double the pixels".
You can arguee that higher DPI resolution at the same speed will trigger higher polling rate frequency report, but AGAIN, this would be nulified as soon as the lower DPI settings moves the same distance needed for a single count.
The only effect that would had at going at higher DPI values is a smoother "granularity effect", that people misinterpret as "pixel skipping". Assuming that the eDPI would be the same.
I am always set to 800DPI and it seems that it's perfectly fine for pretty much all situations, I rarely have a game that it too high on minimum value, I tried 1600DPI to get the tiny reduction in latency but some games couldn't go low enough and I can't be bothered constantly switching, but the difference in latency between 800 and 1600 is so minor anyway
I set mine to 20k DPI and turned the windows mouse settings cursor speed to 1. Feels the same as 1600 at default windows setting.
@@CABALlc1 you don’t want to fuck with windows settings. Leave it at 6
I believe you can use RawAccel to solve that. Sens multiplier. You also don't have to use the accel feature.
@@CABALlc1 windows mouse settings dont affect any game still played today
on 1080p 800 dpi should be subpixel precise to about 9.4''/360° and slower. on 1440p to about 12.8''/360° and slower, bot assuming 90° fov
As other people stated you should compensate for the sensitivity adjustment by changing ingame sensitivity to match the old one.
Should have used eDPI to maintain the same overall "sensitivity" and relationship between DPI and in game sensitivity values.
If he doubled the DPI, halving the in game value would maintain the same relationship.
Otherwise all we're seeing is the time relation between sensor signals as the sensor tracks it's position
@@geologik7500 exactly
I said the same in my comment, he replied as well, I don't think he agrees.
@@Anderson_Roger I think you didn't formulate the question very well, the simplest way to put is as Dane did
@@geologik7500 time relation between sensor signals is delay though. It's not like he measures smoothness, he measures first detectable change.
Are you sure you aren't measuring a different effect? Like how perhaps a slower DPI means the mouse will have to move further to register sufficient change? In both refresh rates at 100 DPI, you measure about +20ms response time compared to the fastest DPI setting, which suggests to me you are measuring the hysteresis of your setup. Solid methodology though!
he always measure the same distance and the same speed, its like setting low dpi and high in game sens vs high dpi low and low in game sens, it looks correct to me
@@dubby_ow He doesn't do that in this video. See my comment, he even replied confirming that. (he should have done that)
@@Anderson_Roger I feel like static in game sens is a right choice here, so only the dpi of a mouse can have an impact on the result, he couldnt done both, to keep eDPI the same and compare, but I feel that in game sens being just a multiplier of the mouse dpi, wouldnt change anything
I had a similar thought. For these tests, if you scaled in-game sensitivity with DPI, then I think the results would be more comparable. Meaning, 100 DPI with 5 sens would become 200 DPI with 2.5 sens, and so on. Therefore, the amount of 'turning' that occurs is the same since the ratio of DPI to sensitivity is the same.
Also big note: in games with high zoom levels(i.e. battlefield 4's 40x scope) low dpi will fuck up your aim as it'll become jittery, at that point if you're gonna snipe from long range you might as well use a controller instead with how jittery the aiming becomes at sub800 dpi
nice videos mate keep up love ur work
Wow, great information, man! Thanks for your time! Great job :D
Question:
When at a lower DPI (like 100) it takes a longer distance of travel for the mouse before it sends another signal. A longer travel does not mean more input lag, it means more movement of the mouse before it sends more signals to the pc.
Was this considered when making the test?
I dont know as much about latency as you do, so correct me if im wrong.
At the slow speed (10mm, 100ms) it should only take 2.5ms to move between DPI points at 100dpi. Maybe the acceleration delay, from stationary to full speed, is affecting it. It could be mouse smoothing built into the firmware.
I put in the same question (he replied to my comment). It wasn't apparently.
6:00 sensitivity bars so long yet I always use something around 1-5 even with 800 DPI
Who the hell uses the other end of the game sensitivity bar?
@@speedweasel I think that depends on many other things. I have been sitting in front of a PC for 21 years (both for work and gaming, since I was 6) and I have no issues.
@@tredbobek Same for me. A4tech x7 with 800 DPI over 15 years of gaming in all genres (mostly action), the only thing that I have improved over the years is the area for a large mouse pad.
Curious how this would work with something like RawAccel.
yes
Did you change the Sensitivity ingame to travel the same distance with different DPI? I think this way you tested, in the first case, 100 dpi will travel a little distance, and the 1600+ will travel so much faster, thats why the 1600+ had less input lag. So changing the DPI will make no difference, just change the velocity will. I think the correct way to test will be travel the same distance (like 30cm) with different DPI/Sensitivity values, to see if the DPI alone will change the Input lag.
Yea, I was wondering the same. Possibly the LDAT software is looking at some minimum delta of movement per frame, and at higher dpi you are more likely to trigger that
@@StardidiMarcelis yea, that's what I thought
I did the same test and compensated the sensitivity to match my old one went from 800 to 1600 but no way to actually measure it feels better tho I don't know maybe just placebo effect.
Good point. Most gamers probably have a pretty tight range of speed at which they're comfortable, so the whole "does more dpi reduce latency" thing only becomes interesting when you want to keep your comfortable, trained speed, but wonder if more or less dpi is better.
If you do your 30cm/360°, you want to keep it.
He tested for intial movement, you're goofy
To avoid jitter, there is nil response when going from "distance moved 0 dots -> 1 dot", the larger in size the dot, the larger the delay to go from value 0 to value 1.
you have to spam 1 3 1 3 1 3 to get out the lan jitters in freeze time
truly appreciate your hardwork, i really learned something new and really valuable today
800 DPI master race
600 DPI Immortal Race
400 DPI god race
200 dpi devil race
1100 dpi gang.
69 dpi freaky race
thank you for all the work you do, mate
I always understood that setting the Windows setting to anything other than the default 6/11 would mean you lose a 1:1 relationship between physical mouse movement and how the input is understood by the computer, e.g. skipped pixels for setting higher values. Is that still the case?
Yes, because it works more like the sensitivity slider in games, as in its a modifier AFTER the mouses DPI.
4/11 translates to 1/2nd and 3/11 translates to 1/4th. You do keep your 1:1 relationship. But like if you are used to 800 dpi and you go to 1600 dpi for the lesser input lagg but want to remain your 800 dpi mouse feel on desktop.
Then you change dpi to 1600 and your windows cursor speed to 4/11
Games ignore the windows setting
@ Not all games do.
@@nikidino8 pretty much all games that matter have raw input
6:15 or sometimes u can edit the configuration file for the game and set it to something lower than whats possible ingame
And might even have more digits to work with. Like when the game only offers two in the UI, but the config file supports 8, so you can have your perfect 1.23624 sens.
Can you give a reason as to why this is?
Taking the dpi value literally as "dots per inch", do you have to displace the first row of dots for the mouse and system to discover the movement?
This would explain why faster moving speeds show less dependence on the dpi value, because the faster you get the minor is the different between how long it takes you to cover the first inches of dots NOMATTER how far they are apart?
For slower moving speeds, it takes you longer to cover the first row of dots, and for different dpi values, they are spread apart further. Hence the higher influence of the dpi value on the system latency for slow speeds?
It would be great to have an explanation next to your testing :)
Anyway, as always, great video!
Calling it input lag is a rather poor choice of words, it's not like a signal takes more time to reach its destination but rather there are fewer updates in a given time. He should have made it more clear that it's behaving closer to how polling rate affects input lag which is in the end the amount of information in a given time period.
In Counterstrike we compare sensivity in eDPI which is calculated by multiplying DPI with your sensivity. So If you have a DPI about 800 and a ingame sensivity about 1 but want a smoother Mouse with the same sensivity you need to Set your DPI to 1600 and your ingame sensivity to 0.5. Or 400 DPI and Ingamesense 2 for a more steady but Not so smooth sensivity. Very simple
In my setup, frame rate would still matter far more than mouse dpi (mine is 400). My 6700k is falling behind, it seemingly can't keep up with newer games like BFV and is probably bottlenecking the system.
WOW!! Thank you, didnt expect that!
Been thinking about this, When I increase my DPI to 1000 from 800 (adjusting in game sense to account for eDPI)
I feel like it is a bit to responsive with over shooting etc, so maybe I should be looking at a slight reduction in game sens rather than the standard eDPI calculation. :thinking:
btw...
if you get a razer 8k viper...
it still sends the signal at 8k frequency no matter which actual polling rate you choose for operations.
I think it's more to do with you have to move your mouse further to register a change on lower dpi. This being said 1600 still feels more responsive than 400 and because it's smoother, it's easier to track targets even during a flick while at 400 I can't track the image at all.
What about using crazily high 25600 DPI and resampling it to actual speed you want that equals to "1200" or "1600"?
The fact that you don't have a million subscribers is truly insane.
i just came see this and try it out right away. switching from 800 to 1600 dpi for some reason feels insanely good. tracking in apex feels 100% so much better!
I was thinking the same thing. I’ve always been at 800 and never changed it before because I change my in game sens accordingly. I see this comment is a month old, are you still on 1600 and do you enjoy it?
@@Sandzzy i'm still at 1600 dpi rn and i do really enjoy it! i change my windows mouse cursos speed setting to 5 instead of 6 because its too fast if i just browsing in general.
Im using 450 dpi gonna try 1600 dpi tomorrow
I think this makes logical sense due to a lower DPI value only detecting movement after a larger physical distance on your mousepad. This will mean the first input is naturally going to be delayed, however this is the very reason many pro gamers will choose to use 400 DPI, because it detects only significant inputs and will not move your crosshair held on a pixel with a shaky hand for instance. As seen by Battle(non)sense's test with flick shots however, it does not have an effect on the latency of the mouse.
This is exactly why I use 450DPI (used 400 @ 1080p, went to 450 when I upgraded to 1440p). I tried 800 and it's not steady enough, picks up too many micro movements.
Which is why most CSGO players do 400dpi. U need pixel accuracy and ur hand will always have microshakes. We are not robots
@@chy.0190 If you use a low sens you don't get pixel skipping. It is a preference thing, not a legacy thing.
@@chy.0190 No, csgo works really well on 400 and pros will usually play 400 or 800 because it is ideal for stability and precision.
Only reason I ever left 400 is orher games not doing well on it, but csgo worked just fine.
@@badnewsbruner too true. Anything above 800 is for CQB
I feel we are missing an insight here?
If we imagine that DPI is the amount of squares on a chessboard, and we use that chessboard as a mouse mat. Each square is a signal of movement from the mouse.
So the higher DPI, the more squares on the chessboard.
If we now drag our mouse across the chessboard - the smaller the squares, the faster the initial response of movement would be.
But would higher DPI really reach its target faster?
I've always been using 400 DPI, not only because it feels good imo, but also because a lot of games have horrible ingame sensitivity settings, sometimes not even supporting decimal spaces. That means I basically have to use a low DPI to get anywhere near my usual eDPI.
But I'm sure I'm not the only one with this problem :/
Yup that was the main reason I stuck to 400 dpi for so long, but recently after battlenonsense made this discovery I switched to 800 dpi anything higher is too fast for me, and so far I havent ran into too many problems with sensitivity scaling in modern games, but a few years ago it was really common.
esports title have raw input
@@TheKillerZmile raw input doesnt invalidate the use of a higher DPI. Raw input means mouse input goes directly to game, meaning the input doesnt get rendered by your windows sensitivity settings or mouse acceleration by windows, anything like that. It's useful for sure, but the whole DPI thing still applies. 400 DPI is pretty suboptimal but unless you NEED every competitive edge you can get, you should probably go up to 800 or 1600. Just because my mouse can I use 12000 DPI with sensitivity on everything including windows scaled all the way down. Unless a game doesn't support extremely low sensitivities I have the DPI scaled up in Logitech G HUB. If your mouse can increase DPI super high I recommend you try it out, it's super cool to see how responsive your camera is to tiny mouse adjustments. And when you take a second to look close you can see how unresponsive your mouse was compared to with a higher DPI. This is unnoticeable with normal use unless you're making microadjustments in an FPS game
rawaccel homie!
Basically DPI can be viewed as a measurement update frequency per inch. The higher the DPI the sooner the first measuremnt gets sent when the mouse moves. Which results in lowering the input lag in effect. But we can also increase the physical movement speed of the mouse movement to get the measurement sent earlier when we have low DPI which also lowers the input lag. At certain movement speeds we reach a cap on input lag irrelevant of the DPI.
I just read your farewell announcement Chris. I'm sad that you're no longer able to make new content due to circumstances out of your control but I'm also glad to have be here with you since the beginning of your RUclips journey since the Battlefield 4 days. Your videos have always been top notch quality and I appreciated every second of it. You are very talented and it's very rare to find someone such as yourself who consistently creates top quality RUclips content. I hope everything works out for you and your family and I wish you all the best. Good luck to you and your family and take care of yourself Chris. ♥
omg where did you read that?
Has he quit?
what happened?
@@riba2233 yeah he had to leave RUclips for family reasons. He made a community post on this channel explaining this.
@@Superdazzu2 he had to leave RUclips for family reasons
The reason I'm not moving up to higher dpi values from my 400 is because my edpi feels the same as the standard windows cursor speed (which is 6). Therefore, if I increase dpi, my cursor will fly very quickly, and slowing it down through the mouse parameters in windows will add mouse acceleration (you can check it by yourself by using Mouse Movement Recorder app), so I do not recommend sacrificing the lack of acceleration for the sake of a couple of milliseconds of Input-lag. Display scaling in windows settings also adds mouse acceleration, so keep it 100%.
I always do the same. 1000dpi, 1000hz, Win slider on half, the rest is game by game.
I know is not much, but I always like and watch your videos to the end. Your analysis and test methodology needs more recognition
For anyone curious, i suggest you to calculate your EDPI first. It's basically your current DPI X In game sensitivity. For example, 400dpi x 1.3 sentivity= 520 edpi. If you want to switch to higher dpi, simple divide your EDPI to the new DPI option. For example, 520 edpi÷800dpi= 0.65 is the new in game sensitivity with 800 dpi.
1.3 what sensitivity? in pubg that i care about there is vertical sensitivity and general sensitivity but the general is 0-100 and i have it 42 with 1100 dpi,that means my edpi is 46200,so if i want to go 400 dpi that makes 46200/400dpi=115,5 so i cant do that,,,i guess in pubg works till around 500dpi or something...on higher dpi works ok i guess couse you can go to the 0 if you need too...
Now if only more games would properly support very low sensitivity settings, so I can rock high dpi everywhere :D
Yeah I hate it when games dont do that. WHY DO DEVS STILL MAKE SLIDERS WITHOUT NUMBERS.
Try RawAccel
I would prefer it if, instead of a slider, I could actually type out a specific sensitivity to 3 decimal places of precision. The relationship between DPI and sens is pretty exponential and shooting for high DPI really requires more precision on the sensitivity scale.
@@Zoddom to trigger autistic guys like me
@@Goodmanperson55 that entirely depends on the engine and the scale of the sensitivity modifier.
i assume the "input lag" is the time between the start of the movement and the first registered "pixel" or "dot" of the sensor. thats probably why the faster solenoid setting reduced the latency in lower dpi setting, because the first pixel would be registered by the sensor earlier
Very interesting. I have for a long time used high dpi, with in game devider (0.01 - 0.99) as I belived I felt more connected to the game (1800). Then for windows desktop stuff I just used a differnt profile on the mouse.
Thank you. This was very helpful and informative
Hey Chris great video I would’ve love to see the test at 500 Hz considering that polling rate is much more consistent among majority of 1000 hz mice
500hz will be much more delayed than 1000
@@ULouOW that is actually not true
@Alex.UA6 Yea, not necessarily more delayed, but just generally bad. nobody would ever deliberately decide to use 500HZ over 1000HZ. I can see people deciding to use 1000HZ over 2K of 4K because they got used to 1K, but generally most inputs from a 500HZ mouse will be more delayed than a 1000HZ one
it depend on the game tbh for example r6 is coded for 128hz mouse polling rate so if you use 1000hz there you nerf yourself. apex legends for example support max 500hz (mby that changed i talk about like a year ago i didnt play r6 and apex since) so better use 500 on apex and most game tbh counter strike does support very well high mouse polling rate so its fine to use 1000hz + there same for valorant and im not sure but i think call of duty also support it very well.
Excellent video, as always, it's mindblowing to see people actually downvoting this.... I don't see any reason why, your tests are SOLID!
Great video. I’d like to see a video about changing the in-OS Windows sensitivity and how it affects games and input lag.
Historically, adjusting the in-OS Windows mouse sensitivity from its default position produces major mouse inaccuracies, which can lead to your games feeling all wrong. It is reminiscent to when you have the “Enhance pointer precision” checkbox set on ON. (Which you should also never do.) There is a lot of research out there about it that you can Google for anyone curious.
Most games nowadays use raw input from the mouse though, so that _shouldn't_ be a factor. Unless you're playing an older game of course. Completely agree that it could be a nice video to see though, just get a definitive answer once and for all.
@jparkerwillis I’d also like to know if the settings are different if you’re using the new Windows 10 sensitivity slider or the classic method inside the Control Panel.
Generally speaking the windows mouse settings do not affect games as these use raw input.
@@Plazmunky It's the new one that matters in Windows 10, not sure why everyone always talks about it like there's just 11 steps. The "classic" steps just correlate to _1, 2, 4, 6, 8, 10, 12, 14, 16, 18, 20_ on the new slider - yet all of those on the new one do affect cursor movement. If you set it to 19 for example, instead of 20 there is a difference 5% in speed.
@@BattleNonSense hi there, I'm curious about using macro software like X-Mouse button control will add some input lag, can you test it?
for some reason many gamers use that software to change or create macro on their mouse like burst click or like auto reload when clicking button
please make a video about the accuracy. it's still a hot topic what's best, like 400dpi @ 2 sensitivity vs. 800dpi @ 1 sensitivity.
I was also hoping to see testing for DPI changes vs in-game sensitivity as well. I think I set and forgot my mouse at around 1200 dpi and adjust sensitive per game to what feels right. I have a small space for the mouse, and use wrist movents usually, with a little bit of arm room.
The same thing applies. All ingame sensitivity setting affects is how much your camera rotates per pixel traveled. Meaning it depends on your DPI. If your DPI is 1000, your camera will make 1000 movements per inch the mouse travels. If your DPI is 10000, your camera will make 10000 movements per inch the mouse travels. Lower the sensitivity accordingly to make up for this
You must be not that good at fps games then💀
@@bigbob5103 Pretty old comment but the only FPS games I remember playing the most was the Halo series and that was on console. I dont have much interest in FPS most games in general, not the multiplayer ones anyway.
@@Vaxtin oh okay that makes sense then😭😭😭 I would get so triggered if there was a slight difference between my sens in each competitive shooter I play
Long-story short, use higher DPI and lower the in-game sensitivity to make up for it, then after that get a smooth mousepad, trust me it makes all the difference, maybe the mousepad even more.
The amount of effort put into this video is insane to me
Maybe mice with lower max DPI on their sensors generally have less input lag than using the same DPI on a mouse with a higher max DPI. So a Viper Mini with 8k max DPI using 1000 DPI has less latency than a Deathadder v2 with 20k max DPI using 1000 DPI because of how high/low the DPI is currently set relative to the maximum DPI possible.
🙌🙌🙌
yeah but in the end its still 1000 dpi so its gonna be the same thing
Nice! Amazing work!
So you're measuring initial mouse movement, correct?
Let's say I have a length of, I don't know 10mm, along this length I have two setups:
Length 1: 100 dots
Length 2: 3200 dots
If I drag something along this length, I'm far more likely to hit a dot earlier on length 2 than 1.
What you're measuring is "time to hit first dot" and not the response time.
Imagine Usain Bolt was sprinting against a 5 year old and they both set off at exactly the same time. If we apply the same testing philosophy that Battlenonsense did in his video, we would only measure the time until the first foot landed on the ground, and because of their tiny little legs and narrow stride pattern, we would therefore conclude the 5 year old was the faster runner.
This should be updated since higher polling rates are now more widely used.
I would like to note that you can't have windows cursor speed presets like you can with DPI presets. So some games that use the cursor in game will have an uncomfortable speed if you use 2 different DPI for desktop and gaming.
Just get a bigger mousepad ;)
Does the software stop counting after the first sensor count received of after a certain number of counts received?
You actually did it lol. :D
It's because high dpi settings are more sensitive than low dpi. The sensor of mouse catches smaller movement of mouse at higher dpi. So as you can see in the video, mouse travel speed affected on initial delay difference between DPIs, But your total travel time to destination(flick time) will be same anyway.
Well i don't think this initial low delay provides any benefits.
Assuming your dpi is 1600 and in-game sense 0.25 in CSGO. Csgo's degree per pixel is 0.022 so your initial travel degree will be 0.022x0.25 = 0.0055.
Compared with 400dpi@1.0(same eDPI) your benefit is only under 0.011°. From over 0.011° to 0.022° scenario, 400dpi will be faster than 1600dpi because 400dpi will jump to 0.022° immediately while 1600dpi still traveling per 0.0055°.
Flicking in 0.011° faster but slower in 0.022° isnt meaningful.
Use whatever you want. 400dpi is also good option
You’re forgetting that with the same eDPI and high DPI equivalent you will also get significantly smoother motion while tracking because information is sent more often from the mouse as opposed to with low DPI where the game has a stronger multiplier for less data to achieve the same distance traveled. Alongside the input lag benefit, higher DPI is much more precise whether your flicking or tracking.
@@xzraiderzx308 Flicking is more consistent in lower DPI.
Lower DPI eliminates shakiness which is the biggest factor that messes up flicks.
Tracking is more consistent in higher DPI.
so basically what i took away from this was; low sens, high dpi
high sens, low dpi
but depending on what speed youre moving your mouse at neither matters that much. there’s also ig taking into account what kind aimer you are as well as depending on game(s) you play. in a precision FPS shooter like CS or Val for example, if you’re a smooth aimer, it’d be better to play low sens high dpi since most of the time you’ll be trying to achieve the most consistency in accuracy, and obviously from a logical standpoint the vice versa would apply for aimers that like to be more flicky. if you want to achieve the best of both worlds per say then i guess it would be best to go with 800 dpi and whatever you would consider a medium sens. but really at the end of the day its just personal preference. if you read this, ty and have a good one :)
I like to crank the DPI and lower the sensitivity in game to what feels good. Seems to get a much smoother feel. Basically same idea as a higher resolution making a smoother image all other things being equal.
consistently dropping groundbreaking content
The methodology here is really great
OMEGALUL
Good job man. Full watch and thumbs up my brother.
Can you double check this with adjusting the sensitivity to scale the dpi, to make sure you are actually measuring the input lag due to DPI and not just a sensitivity effect?
There is no "sensitivity effect". When the mouse reports movement the angle you look changes in game no matter what sensitivity is set. LDAT reports the first instance of movement, not how much it moves.
@@imadecoy. saying this is pointless if it hasnt been tested.
@@TorMatthews Testing it is pointless if you understand how a mouse works.
@@imadecoy. i hope you aren't planning to go into the sciences as a profession
@@TorMatthews Too late, I'm an electrical engineer. Dipshit
That feels wired maybe i got somethink wrong but:
- If i move my mouse at 1600 DPI ~ 20mm from left to right to turn around 180 degrees for example, shouldnt be the calculations
distance / time ? less distance in less time = less input lag also ?
- if i move my mouse at 400 DPI ~ 80mm from left to right to turn around 180 degrees shouldnt it be slower and have more input lag special cause ME move the mouse causing the input lag to increase on move distance ??
~ 1600 dpi ~ 20mm " flickshot " vs 400 DPI ~ 80mm " flickshot " causes the 1600 DPI " less movement ~ distance / time "= less input lag at all ?
While testing the latency seems useful on the face of it, there's a lot of information missing that would make this test actually useful in real life. First, mouse movement to cursor fidelity at each DPI setting in addition to testing for input lag. Second: testing across multiple mice used in the esports space, from multiple additional brands including (but not limited to) Logitech, HyperX, and Zowie. Third: test how much the proprietary software adds to possible latency (and movement fidelity) by turning it off (which in many cases forces the mouse to a default dpi setting) and adjusting the operating system mouse sensitivity instead. Such a video would be AMAZING.
true, but this dude already puts in so much effort into his videos, it would be insanity to go so much further and actually make these experiments be fully applicable in real-life applications. i'd love to see it, you would love to see it, everyone would love to see it, but i think he'd need a friend or someone to help him with the boring and time-consuming parts.
it's sad, but at least we have SOME data which we can apply, and hope it has a positive impact on our latency. i love videos like these because i'm really concerned with the details of my system, even though i'm not yet able to play games professionally. so at least we can find some minimal use from these experiments of his, not all of the effort is going to waste.
the main issue is as you said, a lot of missing info. but that does not necessarily mean we can't use these graphs to adjust our system. for instance, me using the razer viper 8k, a DPI of 1600 is going to be objectively the best (check out his other video if you want) no matter if other factors also increase the latency, because afterall it's just going to be simple addition/subtraction, i.e. if an application increases latency, that latency is very unlikely to directly impact how much latency a low DPI induces for instance, instead, their 'scope' is more likely to be separately/individually influencing latency.
albeit not impossible, the results should be predictable is what i'm saying, however we have no way to confirm that or know exactly how other factors influence things. and just doing the minimal amount of work to prove this would still be a lot of work.
POLLING RATE (mouse refresh rate in HZ) is what changes your input lag // DPI is just the measurement of the mouse movement over a certain distance; you can lower dpi but increase sensitivy, or vice versa, and you will accomplish the same thing.
I think the 4k mouse trend is mostly a joke in 99% of scenarios. A lot of great pros still use Zowie EC2. You can say they're stuck in their old ways if you want but it is certainly a comfortable mouse. Watch the video on the gamer nexus video about latency which includes a very well educated Nvidia spokesperson. Mice are one of the lesser important things in overall latency but still important if you're competitive with a good to great rig. He said most gaming mice are in the 1ms response time and even the most popular mouse in shooters (g pro wireless) have a 0.8ms response time in his case. There can be slight deviations I'm sure. In the end, I think what is most important is at least a 1 ms response time to when you press the left and right click buttons and comfort. Mousepads can also be thrown in the comfort level. Some pros who playing games that require heavy tracking like lyr1c use a glass pad. A glass pad does have a weakness though and that's stopping power for pixel perfect flicks
What you should also keep in mind is your mouse sensor's limits. For example my Razer Deathadder Elite has a maximum DPI value of 16000 but after 1900 DPI it starts to use smoothing which increases input lag. At higher steps is smoothes even more increasing the overall latency so I have to use my mouse at 1800 DPI to avoid this and have the lowest possible input lag.
Fun fact - 1800 DPI was the original Deathadders DPI.
I loved it.
No DA V2 Pro Wireless with 800/1800/2300 (Warzone/raraly/near always)
Is this the same for razer death adder v2 wired ? Should I use 1600?
@@haven252 it uses a different sensor, you need to check reviews and tests, but around 1600 should be good sensitivity/input lag combination.
FINALLY.
And with this people won't take me for a crazy person when i say "high dpi low sensitivity is better for sniping": low dpi mean more distance necessary for the mouse to register movement and also for this movement to translate in-game.
Try playing with a 40X scope in bf4 with 100dpi vs 2000, it'll be like the difference between a slideshow an real life.
Great video as always. I wish all games handled mouse sensitivity like Overwatch does where you can manually put it in down to the decimals.
Agreed.
in some games though you can do it even if the ingame settings won’t allow it. You just have to tinker with game files. In r6s, you have to open some ini file. It’s not as intuitive but that’s just how it is. It differs for every game.
Just run something that feels comfortable, you don't need every game to be exactly the same
Been using 1600 this entire time, good to know I haven't been handicapping myself by using my preferred sensitivity
I'd love to see if the DPI affects tracking accuracy. If you could somehow do the same movement over and over, but not a simple linear one, I'd love to see if the lower latency comes at the cost of lower accuracy in travelled distance (I'm assuming that the higher sensibility should have higher errors and while also extrapolating less data more often).
It doesn't in any good sensor implementantion nowadays. Which is truth for probably any new mouse and sensor.
Still, you don't have any difference in "latency" going at higher or low DPI values. Don't worry about it. Use a value that you're confortable with.
@@daniloberserk the video and other independent test literally show a difference in latency lmao what? Also why are you putting the word latency in quotes as if it's something that doesn't exist? It seems more like since you don't understand it, you rather play make believe that it isn't real
@@RequiemOfSolo I'm honestly tired to discuss this here. This discussion isn't even new, I remember when cpate talked about that on OCN. I literally work with perypherals and I'm digging every community about this stuff since 2004. I don't care if dumb kids nowadays don't have enough critical thinking to filter stupid information. Battle non sense spreaded misinformation from amd chill and the person who actually developed that tool had to correct him about his misinformation on the topic. And battlenonsense never replied the post or updated the video. So this should tell enough about the "source" that he is.
-
Does an atomic clock measure a second faster then an quartz clock?
Is 4k screen 4 times faster then 1080p because they show 4 times the amount of pixels?
Unless some sensor has serious design flaws, every count will be reported at the same rate. The handicap here is an bigger threshold of movement for less DPI. It's that simple.
Since people is trying to reduce input lag while using the same eDPI, and a flick from point A to point B will still take the same amount of distance when moving the mouse. It doesn't matter if "first on screen reaction" happens faster with more DPI, you're just moving less degrees of motion.
No one flicks an subpixel motion, your input chain will not work any faster. More "reactive" motion doesn't mean you have less latency. This is why DPI /CPI is mouse RESOLUTION and not mouse "speed". And the rule with resolution is, enough is enough. Unless you're playing with an ludicrous high eDPI value, It's pointless to change DPI for any "theoretical" advantage, even for the geekest of the geeks.
Like seriously. It's not that hard to understand. You can have the same effect just by moving your arm faster. As soon as the speed of motion matches a single count for any DPI you're using, both will be reported with the same latency.
Of course. Since we can't have infinite speed and acceleration. Higher DPI will always report a single count faster. But unless you're raising your eDPI, you can't take advantage from that for any real application , since any usefull motion will take hundreds or even thousands of counts sometimes.
He should've measure the amount of time for a motion from point A to B on the screen. Not "first on screen reaction". As I said a lot of times here already, bad methodology leading to stupid conclusions.
-
Believe in whatever you want. Battle non sense is just and average guy with an enormous ego, he's not willing to take a second thought about his conclusions. But I do, as I used to believe that high DPI was objectively better for the same reason.
Good luck with those "faster" 0.00001 degrees of motion. It'll certainly unlock your divine powers in gaming.
God bless those sacred RUclipsrs dumbing down complex discussions so the average kids can "know stuff" without wasting their brain power.
Maybe one day when I have enough patience I'll make an more "visual" explanation for the dumb ones.
If you really need less input lag on a mouse. Which is already the smallest source of lag in your input chain. Just buy any mice that has a polling rate of 8khz. I arguee that the best thing about 8khz is better flick shot precision and not input lag. But only for games who actually can support sub frame input. As overwatch does for example.
@@daniloberserk can you point me to where I claimed the input lag difference was large enough to significantly improve my in game skill? I don't seem to ever recall that being the point I made. Weird how that happens. Please refrain from turning this into your personal blog where you dump all of your schizophrenic ramblings and pent up anger from previous interactions. My point was that the reduction simply does exist.
Also you're wrong about the mouse being the smallest source of lag in the entire chain. It is one of the smallest yes, but game simulation and driver latency are both smaller on average for majority of systems/games/drivers. Display scanout can be even smaller too with 240-390hz displays.
@@RequiemOfSolo it doesn't exist any difference if you hit the threshold for a single count on any DPI setting. It's really not that hard to understand that. Just because you need less movement for a single count with high DPI doesn't mean the mouse works "faster". And as I said. First on screen reaction for this measurement is useless.
Unless for some reason an bad sensor may work different in some DPI setting, which used to happen when sensors had native resolutions. Not the case anymore.
Even an schizophrenic may understand this discussion better then you tbh (or most fanboys from this video).
Average person uses 1000 dpi, so is it better to increase dpi to 3000 and change sensitivity in game to 0,05? How does it work?
I wonder what's the response in different games 🤔
It really shouldn't matter, especially if the games are using raw input. I'd steer clear of changing the pointer speed (leave it at 6/11) though.
@@chy.0190 it does if you don't have raw inputs within the game you're playing.
Won't some of the diminishing returns over 800DPI be due to the polling rate being capped at 1000hz also?
If DPI is 1600 but polling is only 1000hz, it's not a true 1:1 reporting rate to the PC?
That would assume your moving the mouse at the same speed consistently all the time
The reason people use lower dpi is for accuracy and not latency. Latency seems barely different on 800 vs 1600 so perhaps 800 is the best of both worlds if latency really is something you thinks limits you.
what accuracy, it is greater with higher dpi
@@riba2233 you introduce more "noise" into your movements with high dpi, which can negatively affect your accuracy, especially when tracking
@@Zalazaar yes and no but I see your point
I hope you disabled enhanced pointer precision in the windows settings. And It would be cool to see if high mouse dpi but low ingame sensitivity settings effect the results? (basicallly 400dpi 4 ingame sens. vs 1600dpi 1 ingame sens comparison)
Nah, this guy is a an absolute noob. He won't have even used the windows mouse accel patch nevermind unchecked the pointer precision box.
Battle(non)sense? More like bottle nonce sens(itivity)! LOLOL
@@rdmz135 what do you mean by raw input? This Nvidia latency system (monitor connection etc?). Windows mouse setting affects game setting.
@Him nah, this guy doesn't know what he's doing. OP is right to doubt him
@Him *shrugs*
The reason low dpi causes higher delay is simple: 100 dpi is doing 100 samples/inch, 1600dpi is doing 1600 samples/inch.
if you move with 1inch/second speed for 1 inch distance:
100dpi = 100 sample per second = 10ms delay between 2 sample
1600dpi = 0.625ms/sample = 0.625ms delay between samples
after 1600 the diminishing return kicks in hard and the difference is barely noticeable even on slowly moving mouse.
if you move the mouse fast, lets say 10inch/second speed for 1 inch distance:
100dpi = 1000sample/second = 1ms delay
1600dpi = 16000sample/second = 0.0625ms delay
That is why there is barely any difference for fast movements.
Also there can be differences above these for each mouse because they might process the DPI data differently before sending it.
The results may be different between different mouses.
Good thing he tested exactly the mouse I use, great video
I've been using 800 toggle for minor sniping corrections and 6400 with windows pointer speed at 3 for about 10 years now. In-game where possible i select untainted system speed or 1.00 which usually represent system speed. So I basically figured out with my senses since 2001 what's the best way to setup a mouse hardware wise, interesting
Great video, my main gripe is you recommend adjusting the Windows sensitivity at the end. Everything I have read is to NEVER ever ever adjust that to anything other than 6/11 as it introduces imperfect mouse tracking.
The 6/11 info is possibly outdated by many many years. I believe issues are supposed to arise going over 6, not under. Plus, most games use Raw Input, ignoring the Windows slider anyway.
@@paft the raw input I agree with, do you have any further information on the 6/11 sensitivity not introducing pixel skipping anymore?
@@ChrisGarcia683 I edited my post a little. It's 4am...i don't have any info off the to top of my head, but will reply when I do and remember . I've personally never needed to adjust it, buts there's MarkC reg tweaks for that. Or Raw Accel for the sens option.
If you disable EPP and use the reg fix, the values below 6/11 are pretty much a linear scale. Going down to 5/11 gives you 75%, 4/11 gives 50% and below that it halves the speed with each step.
Damn Yesterday I was thinking does dpi makes a difference in latency and I saw you're Razer Viper 8K Video and what I Saw is the higher DPI you go up until 3200 the lower the delay gets, but I watched this video and now I play at 1600 dpi instead of 400 dpi cause all the pro's made it 400 I copied but thank you Battle(non)sense for making me change my mind, EDIT: btw I use very low sens at 2000 edpi and moving my mouse like a turtle that's why I changed it
He didn't really come to the conclusion that higher DPI = better. He specifically states so in the video.
Only a single mouse was tested and other potential variables need to be considered. I've watched Logitech engineers explain how their sensor technology works in their top end mice and DPI should not impact input latency, only how fast the mouse moves.
Why not just use 800? 2 miliseconds difference is hardly noteworthy.
Well He tested the razer viper 8k and that's my mouse, and tbh I really don't care about how minor is the difference in latency if I get it lower than its good for me and I'm playing with a comfortable 1600 DPI and he stated to use what ever dpi you use and I use very low sens and move my mouse very slow so that's why I made the Dpi Higher and it feels for some reason smooth and no it wasn't pixel skipping .
Only reason im not on a higher dpi and a lower in game sens is because of looting, when im looting on like 1600 dpi my cursor is just going crazy
How many mice did you test this with, hope your sample size was more then one, as this might be just a property of a single mouse maker
wouldn't you have to move the mouse 4x faster at 100DPI to compare against 400DPI and then compare the latencys? because you would do that to make the same flickshot.... it seems to me that the "latency" - idk if you could even call that like this - is just because it for sure takes longer to reach the first dot at 1/100 inch than at 1/400 inch when you move at the same speed. that doesn't seem like latency though to me because the sensor is just not sending data until you move it one dot. if it was for real latency difference one had to measure the time from sensor registers movement to display. If you adjust the speed i think it could/should be the same results too.
Your test probably suffers from the same problem as other video I've seen: You're not measuring latency, you're measuring accuracy+latency. Your results show time it takes for your machine to move the mouse by initial 1 unit of distance from stand-still, plus latency. 1 unit of distance is bigger for lower DPI, thus the time is bigger as well.
I'm glad to see someone actually understands how to properly research something. This entire comment section is filled with pseudo intellectuals who huff their own farts.
Good job buddy 👍
As a 400 dpi player i am now thinking going to 800 and reducing in game sense
try it.
It's easy. I went 400 to 1600 and changed in game sens as you said.
@@gunning4uuuu for example
16 sens and 400 dpi equals to
4 sens and 1600 dpi
Thats what i did from 400 dpi to 1200 dpi and the delay and tracking was better. I up my mouse dpi and lower my windows and ingame sens and its like a free upgrade from 125hz to 1k polling rate. It really detects micro movement which is higher polling rate mouse offers
Currently im running 400 dpi, 13 ingame.
So if I change to 800 dpi, the ingame sense is equal to 6.5?
Just to be sure :)
Man thats an Insane amount of hard work
1600 seems to be the optimal choice IMO. Love your videos, keep up the great testing!
Highest DPI is the optimal choice.
@@SimoneBellomonte 1600 is the highest you'll get any difference on with a 1000hz mouse, highest DPI would be terrible for normal PC usage and for setting sensitivities in game