DPI vs Sensitivity in Gaming: What's the Difference?
DPI and in-game sensitivity are two separate controls that both affect crosshair speed, yet they operate at completely different levels. DPI (dots per inch) is a hardware setting: your mouse reports a fixed number of movement counts (cursor pixels) per physical inch of travel. In-game sensitivity is a software multiplier your game engine applies to that raw input. The combined result is captured by a single number: eDPI (effective DPI) = DPI × in-game sensitivity. According to ProSettings.net tracking of over 500 professional FPS players, the average pro CS2 eDPI is approximately 800–1000, achieved through many combinations of DPI and sensitivity that all multiply to the same result. Understanding this relationship lets you match your settings to any pro, convert across games, and stop second-guessing your gear. Pair these settings with a verified 1000Hz polling rate for the most consistent raw input.
What Is DPI in Gaming?
DPI stands for dots per inch: a measure of how many pixels your cursor moves for every inch your mouse physically moves across a surface. At 400 DPI, moving your mouse one inch moves the cursor 400 pixels. At 1600 DPI, that same one-inch movement moves the cursor 1,600 pixels, four times faster.
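To make that arithmetic concrete, here is a minimal Python sketch (an illustrative helper of our own, not any driver's API) of how cursor travel scales with DPI:

```python
def cursor_travel_px(inches_moved: float, dpi: int) -> float:
    """Pixels of cursor travel for a given physical mouse movement."""
    return inches_moved * dpi

print(cursor_travel_px(1.0, 400))   # 400.0 pixels for one inch at 400 DPI
print(cursor_travel_px(1.0, 1600))  # 1600.0 pixels: 4x faster for the same inch
```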
DPI is set in your mouse's companion software (Razer Synapse, Logitech G HUB, SteelSeries GG) or, on some mice, via a physical button that cycles through preset values. The setting is stored on the mouse's onboard memory and affects all cursor movement at the operating system level, not just in games.
Modern gaming mouse sensors (PixArt 3370, 3395, Hero 25K) are accurate across a very wide DPI range. The "optimal DPI range" for sensor accuracy is typically 400–3200 DPI, within which pixel tracking error is negligible (under 2–3 pixels per foot of movement). Source: RTings.com sensor accuracy measurement methodology.
What Is In-Game Sensitivity?
In-game sensitivity is a multiplier applied by the game engine to the raw mouse input it receives from Windows. When your OS sends a mouse movement of X pixels (determined by your DPI), the game scales that by its sensitivity value before moving your crosshair.
Critically, each game uses its own sensitivity scale. A sensitivity of "1.0" in CS2 produces a very different crosshair speed than "1.0" in Valorant, because the two games apply different internal scaling factors. This is why you cannot transfer sensitivity values directly between games; you must convert using the game-specific scaling factor (or use an eDPI calculator).
In-game sensitivity can also be affected by field of view (FOV) settings in some games: a wider FOV fits more of the scene on screen, so the same angular turn appears slower and your effective sensitivity feels lower. In scope-based shooters, ADS (aim down sights) sensitivity multipliers further divide the sensitivity landscape. Always configure your base sensitivity first, then set scoped sensitivity separately.
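To illustrate why scoped sensitivity deserves its own pass, here is a sketch of focal-length (0% monitor-distance) scaling, one common way aim-trainer communities derive an ADS multiplier from the hip-fire and scoped FOVs. The function and the choice of scaling method are illustrative assumptions, not any particular game's implementation:

```python
import math

def ads_multiplier(hip_fov_deg: float, ads_fov_deg: float) -> float:
    """ADS multiplier via focal-length (0% monitor-distance) scaling.

    Keeps on-screen target speed at the crosshair consistent between
    hip-fire and scoped views; other scaling conventions exist.
    """
    return math.tan(math.radians(ads_fov_deg) / 2) / math.tan(math.radians(hip_fov_deg) / 2)

# Example: 103-degree hip-fire FOV, 40-degree scope -> multiplier of ~0.29
print(round(ads_multiplier(103.0, 40.0), 3))  # 0.29
```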
What Is eDPI? The Number That Actually Matters
eDPI (effective DPI) normalizes your DPI and in-game sensitivity into a single comparable number: eDPI = DPI × in-game sensitivity.
A player running 800 DPI at 1.0 sensitivity has 800 eDPI. A player running 400 DPI at 2.0 sensitivity also has 800 eDPI. Their crosshairs move at exactly the same speed. eDPI lets you compare your settings directly to pros, regardless of what hardware or in-game value each uses.
The practical use of eDPI: if a pro player whose aim you respect uses 400 DPI at 2.0 sensitivity (800 eDPI), and you currently use 1600 DPI at 3.0 sensitivity (4800 eDPI), you can see immediately that you are playing at 6× their aim speed. Matching their eDPI is the starting point for adopting a more controlled, precise aim style. See our detailed eDPI guide for a full breakdown with conversion tables.
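The math is trivial to script. A minimal Python sketch (the names are ours) of the comparison above:

```python
def edpi(dpi: int, sensitivity: float) -> float:
    """Effective DPI: DPI multiplied by in-game sensitivity."""
    return dpi * sensitivity

# Two different hardware/software splits, identical crosshair speed:
print(edpi(800, 1.0))  # 800.0
print(edpi(400, 2.0))  # 800.0

# The example above: 4800 eDPI is 6x the pro's 800 eDPI
print(edpi(1600, 3.0) / edpi(400, 2.0))  # 6.0
```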
What Is a Good eDPI for FPS Gaming?
The table below shows eDPI ranges and their typical player profiles. These are based on analysis of professional player settings across CS2, Valorant, and Apex Legends (Source: ProSettings.net 2024 database).
| eDPI Range | Label | Player Profile |
|---|---|---|
| Under 200 | Ultra Low | Some RTS / strategy games; extremely rare in FPS |
| 200–400 | Very Low | Top Valorant pros; large pad, ultra-precise tracking |
| 400–800 | Low | CS2 / Valorant pros; precision FPS preferred |
| 800–1600 | Medium | Most competitive FPS players; balanced tracking |
| 1600–3200 | High | Battle royale, casual FPS; faster target acquisition |
| 3200+ | Very High | Casual play; difficult to control in aim duels |
For most players new to optimizing their settings, starting at 800 eDPI and adjusting from there is a reasonable anchor point. If you find yourself unable to track moving targets smoothly, go higher. If your crosshair overshoots frequently on flick shots, go lower. Make changes in 10–15% increments, and allow two weeks of practice before judging.
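If you script your tweaks, a 10–15% step works out like this (an illustrative Python helper, assuming you adjust sensitivity while holding DPI fixed):

```python
def step_sensitivity(dpi: int, sensitivity: float, pct_change: float) -> float:
    """In-game sensitivity after changing eDPI by pct_change percent at fixed DPI."""
    new_edpi = dpi * sensitivity * (1 + pct_change / 100)
    return new_edpi / dpi

# 800 DPI at 1.0 sensitivity (800 eDPI), stepping down 10% -> 720 eDPI
print(step_sensitivity(800, 1.0, -10))  # 0.9
```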
High DPI Low Sensitivity vs Low DPI High Sensitivity: Which Is Better?
For the same eDPI, both approaches produce the same crosshair speed. The technical debate centers on sensor behavior at different DPI settings:
Argument for lower DPI (400–800): Most gaming sensors operate with maximum accuracy in the 400–1600 DPI range. Some sensors introduce interpolation artifacts (subtle micro-smoothing of cursor paths) at high DPI values (3200+). These artifacts are imperceptible to most players but theoretically reduce tracking precision on small micro-adjustments. Lower DPI avoids this entirely.
Argument for higher DPI (1600–3200): Higher DPI gives the sensor more position samples per inch, which can improve cursor smoothness at very slow movement speeds. At low DPI, moving the mouse extremely slowly can reveal pixel "snapping": the cursor jumps between discrete positions rather than gliding smoothly. Higher DPI reduces this effect.
The practical conclusion: 800–1600 DPI is the sweet spot for virtually all gaming mice and players. It avoids both interpolation artifacts at the top end and pixel snapping at the bottom, while keeping in-game sensitivity values in a comfortable numeric range (0.3–2.0 for most games). Pro player data confirms this: the most common DPI values among tracked professionals are 400, 800, and 1600. Source: ProSettings.net distribution analysis across CS2, Valorant, and Apex pros.
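A quick sketch of that "comfortable numeric range" logic, enumerating which common DPI steps keep the required in-game value between 0.3 and 2.0 for a chosen target eDPI (our own helper, for illustration only):

```python
def comfortable_dpi_options(target_edpi: float, lo: float = 0.3, hi: float = 2.0):
    """Yield (DPI, sensitivity) pairs whose in-game value stays in [lo, hi]."""
    for dpi in (400, 800, 1600, 3200):
        sens = target_edpi / dpi
        if lo <= sens <= hi:
            yield dpi, round(sens, 3)

print(list(comfortable_dpi_options(800)))   # [(400, 2.0), (800, 1.0), (1600, 0.5)]
print(list(comfortable_dpi_options(1200)))  # [(800, 1.5), (1600, 0.75), (3200, 0.375)]
```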
How to Find Your Ideal DPI and Sensitivity Settings
Follow these steps to dial in your settings systematically:
1. Set your DPI to 800. Open your mouse software and set DPI to 800. This is the most common pro setting and sits squarely in the optimal sensor accuracy range for virtually every modern gaming mouse.
2. Calculate your target eDPI. Start at 800 eDPI if you have no existing preference. If you already play at a known sensitivity, calculate eDPI = current DPI × current sensitivity and use that as your baseline.
3. Set in-game sensitivity = target eDPI ÷ DPI. With 800 DPI and a target of 800 eDPI, set in-game sensitivity to 1.0. For a 1200 eDPI target: sensitivity = 1200 ÷ 800 = 1.5. Simple division (see the sketch after this list).
4. Disable mouse acceleration everywhere. In Windows: Settings → Bluetooth & devices → Mouse → Additional mouse settings → Pointer Options → uncheck "Enhance pointer precision". In your game, disable any acceleration or smoothing options; raw input should be enabled.
5. Practice for 2–3 weeks before changing anything. Muscle memory takes 10–20 hours of deliberate practice to form. If your aim feels off after one session, that is normal. Only adjust after a genuine multi-week trial where improvement has plateaued.
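As referenced in step 3, the whole calculation fits in two lines of Python (a sketch with our own naming, assuming acceleration is disabled so the math holds):

```python
def in_game_sensitivity(target_edpi: float, dpi: int) -> float:
    """In-game sensitivity needed to hit a target eDPI at a given DPI."""
    return target_edpi / dpi

print(in_game_sensitivity(800, 800))   # 1.0
print(in_game_sensitivity(1200, 800))  # 1.5
```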
Converting DPI and Sensitivity Across Games
Because each game uses a different internal sensitivity scale, you cannot copy sensitivity values directly. What you can do is use eDPI as your anchor, then calculate the correct in-game sensitivity for each game:
In-game sensitivity = target eDPI ÷ DPI ÷ (game-specific scale factor)
Some common scale relationships:
- CS2 sensitivity 1.0 ≈ Valorant sensitivity 0.3148 (same 360° turn distance at the same DPI)
- Apex Legends uses a different field-of-view compensation; its sensitivity values are not directly comparable to CS2's
- Overwatch 2's 'Relative Aim Sensitivity While Zoomed' setting adds FOV compensation to scoped aim; configure it separately rather than assuming 1:1 scoped tracking
- Fortnite's separate 'X-Axis' and 'Y-Axis' sensitivity values require per-axis calibration
The safest approach is to use a dedicated sensitivity converter tool or to look up the target game's scale factor in a community database (Mouse Sensitivity Calculator, KovaaK's sensitivity calculator). Once converted, verify your new setting with a 360° rotation test: mark your starting position and measure how far you must move the mouse to complete a full 360° turn in-game. This physical distance should match across all your games once the conversions are correct.
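For a concrete feel of both the conversion and the rotation test, here is an illustrative sketch. The yaw constants (degrees of camera rotation per mouse count at sensitivity 1.0) are commonly cited community figures rather than values from this article, so verify them for your game version; they are where the ≈0.31 ratio above comes from:

```python
# Degrees of camera rotation per mouse count at sensitivity 1.0.
# Commonly cited community values (assumptions) -- verify per game version.
YAW_PER_COUNT = {"cs2": 0.022, "valorant": 0.07}

def convert_sens(sens: float, src: str, dst: str) -> float:
    """Convert in-game sensitivity between games, same DPI on both."""
    return sens * YAW_PER_COUNT[src] / YAW_PER_COUNT[dst]

def cm_per_360(dpi: int, sens: float, game: str) -> float:
    """Physical mouse travel (cm) needed for a full 360-degree turn."""
    inches = 360 / (YAW_PER_COUNT[game] * sens * dpi)
    return inches * 2.54

print(round(convert_sens(1.0, "cs2", "valorant"), 4))  # 0.3143
print(round(cm_per_360(800, 1.0, "cs2"), 1))           # 52.0 cm
print(round(cm_per_360(800, 0.3143, "valorant"), 1))   # 52.0 cm -- same turn distance
```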
Why DPI and Sensitivity Only Matter With Acceleration Disabled
Mouse acceleration modifies cursor speed based on how fast you physically move the mouse: fast movements get an extra speed boost; slow movements stay slower. With acceleration enabled, the same physical distance produces different cursor travel depending on your movement speed. This makes muscle memory impossible: your aim becomes inconsistent because the same flick shot lands in different places depending on how quickly you executed it.
Windows' "Enhance Pointer Precision" is a form of mouse acceleration. It must be disabled for all gaming contexts. In-game, look for options labeled "mouse acceleration", "pointer acceleration", or "raw input"; raw input bypasses Windows processing and ensures DPI × sensitivity is the only factor determining cursor speed.
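If you prefer to script the Windows step, the sketch below uses the Win32 SystemParametersInfo call with SPI_SETMOUSE, which controls the same three values as the "Enhance pointer precision" checkbox. Treat it as an assumption-laden convenience (Windows only, run as the logged-in user) and confirm the checkbox state afterwards:

```python
import ctypes

# SPI_SETMOUSE takes [threshold1, threshold2, acceleration]; all zeros
# disables "Enhance pointer precision" (Windows pointer acceleration).
SPI_SETMOUSE = 0x0004
SPIF_UPDATEINIFILE = 0x0001  # persist the change to the user profile
SPIF_SENDCHANGE = 0x0002     # notify running applications

mouse_params = (ctypes.c_int * 3)(0, 0, 0)
ok = ctypes.windll.user32.SystemParametersInfoW(
    SPI_SETMOUSE, 0, mouse_params, SPIF_UPDATEINIFILE | SPIF_SENDCHANGE
)
print("Pointer acceleration disabled" if ok else "SystemParametersInfo call failed")
```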
With acceleration fully disabled, your DPI and sensitivity settings become reliable, repeatable, and trainable. This is the prerequisite for any serious aim improvement. Pair it with a verified polling rate: use our Mouse Polling Rate Test to confirm your mouse is running at 1000Hz, not silently throttled to 125Hz by a USB hub.
Frequently Asked Questions
What is the difference between DPI and sensitivity?
DPI (dots per inch) is a hardware setting that determines how many pixels your cursor moves per inch of physical mouse movement; it is set in your mouse's driver software. In-game sensitivity is a software multiplier applied on top of DPI within a specific game. Both affect how fast your crosshair moves, but they act at different levels: DPI affects the raw input signal at the OS level, while in-game sensitivity scales that signal within the game engine. The combined effect is captured by eDPI (effective DPI) = DPI × in-game sensitivity.
Is it better to have high DPI low sensitivity or low DPI high sensitivity?
For the same eDPI value, high DPI / low sensitivity and low DPI / high sensitivity produce nearly identical aim speed. The technical argument for high DPI is sensor precision: modern gaming sensors at 800–1600 DPI operate well within their accuracy range. The argument for lower DPI is avoiding the sensor interpolation artifacts that some mice introduce above 3200 DPI. The practical recommendation from professional player analysis (Source: ProSettings.net) is 400–1600 DPI with in-game sensitivity adjusted to reach the desired eDPI, not high DPI used as a substitute for in-game sensitivity.
What is eDPI and how do I calculate it?
eDPI (effective DPI) is your mouse DPI multiplied by your in-game sensitivity. It gives you a single number that represents your true aim speed, comparable across different setups (and, with the right scale factor, across games). eDPI = DPI × in-game sensitivity. Example: 800 DPI × 0.5 sensitivity = 400 eDPI. 400 DPI × 1.0 sensitivity = 400 eDPI. Both produce identical crosshair movement speed in-game. eDPI lets you compare your settings to pro players regardless of what DPI or sensitivity value each chooses.
What eDPI do pro FPS gamers use?
Professional CS2 players average approximately 800–1000 eDPI (Source: ProSettings.net CS2 pro settings database). Professional Valorant players average a numerically lower 260–320 eDPI, because Valorant's sensitivity scale works differently; the physical turn speed is comparable. In Apex Legends, the average pro eDPI is roughly 1200–1600. These are relatively low sensitivities: pros prioritize precision over speed, using large mousepads (400mm+) to accommodate wide sweeping movements at low sensitivity.
Does DPI affect aim accuracy?
Within the sensor's optimal range (typically 400–3200 DPI for most gaming mice), DPI does not affect aim accuracy. The sensor delivers the same precision regardless of which DPI value is selected. Outside that range, the extremes do matter: very low DPI (below 200) causes visible cursor snapping due to limited position samples, and very high DPI (above 6400) can introduce interpolation artifacts on some sensors. These effects are real, but they concern fewer than 1% of players; anyone using settings within normal competitive ranges is unaffected.
Why do pro gamers use low DPI?
Professional gamers use low DPI because their target eDPI is low: they favor precision over raw aim speed, and a low DPI paired with a moderate in-game sensitivity is simply one way to reach that target. Many pros use 400–800 DPI with in-game sensitivity values of 1.0–2.5, producing eDPI values of 400–2000. Low DPI is not inherently better; it is just the natural result of choosing a low target eDPI.
Should I change my DPI or in-game sensitivity to aim better?
For aim consistency, the most important factor is not changing either one frequently. Pick a target eDPI (around 800–1200 is a good start for FPS games), then adjust your DPI and in-game sensitivity to reach it (e.g., 800 DPI at 1.0 sensitivity = 800 eDPI). Play at that setting for at least 2–3 weeks before evaluating. Frequent sensitivity changes reset your muscle memory, which is the primary determinant of aim consistency. Once you find a comfortable eDPI, convert it across games using the eDPI formula rather than guessing.