Nvidia Profile Inspector (NPI) is an open-source third-party tool for viewing and editing application profiles within the Nvidia display drivers. It works much like the Manage 3D Settings page in the Nvidia Control Panel but goes more in-depth, exposing settings and functionality not available through the native control panel.
The tool was spun out of the original Nvidia Inspector, which featured an overclocking utility and required the user to create a shortcut to launch the profile editor separately.
Nvidia Profile Inspector typically sees maintenance updates every couple of months, keeping it aligned with changes introduced in newer driver versions, although the project was seemingly on hiatus between January 19, 2021 and November 13, 2022. An unofficial fork released in April 2022 added improved support for newer drivers, but it is no longer available following an official update from the original author.
Main window of the global/base profile.
Sync and Refresh settings
The built-in G-Sync status indicator overlay
- Allow enables the use of G-Sync and synchronizes the monitor's refresh rate to the GPU's render rate.
- Force off and Disallow disable the use of G-Sync.
- Fixed Refresh Rate is the traditional fixed refresh rate monitor technology.
- Ultra Low Motion Blur (ULMB) uses backlight pulsing at a fixed refresh rate to minimize motion blur.
It is highly recommended to change this via Nvidia Control Panel > Manage 3D Settings > Monitor Technology instead, which properly configures this and related parameters.
- Default is Use the 3D application setting, and it is recommended not to go above 3. Values of 1 and 2 will help reduce input latency in exchange for greater CPU usage.[8]
- When Vsync is set to 1/2 Refresh Rate, a value of 1 is essentially required due to the introduced input latency.[citation needed]
- Highest available - Overrides the refresh rate of the exclusive fullscreen game to whatever is the highest available on the monitor. This setting is automatically used when G-Sync is enabled. Note that this override might not work for all games, in which case an alternative such as Special K might be needed.
- Use the 3D application setting / Application-controlled - Uses the refresh rate requested by the application. If using G-Sync, frame rates above the requested refresh rate will result in screen tearing as G-Sync goes inactive, or, if V-Sync is enabled, in the frame rate being synchronized to the refresh rate.
- This setting is only exposed in Nvidia Control Panel for monitors supporting refresh rates of at least 100 Hz.
- This setting is only effective in DX9/10/11 applications. On Windows 10 and above, Fullscreen optimizations must also be disabled.
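The interplay between G-Sync, V-Sync, and the application frame rate described above can be sketched as a small decision function. This is illustrative only; the function name and return labels are made up for this example and are not part of any Nvidia API:

```python
def present_behavior(fps, refresh_hz, gsync=True, vsync=False):
    """Illustrative sketch of the sync behavior described above."""
    if gsync and fps <= refresh_hz:
        # Within the G-Sync range the monitor follows the render rate.
        return "gsync"
    if vsync:
        # Above the range, V-Sync caps the frame rate to the refresh rate.
        return "vsync-capped"
    # G-Sync goes inactive and frames are presented mid-scan.
    return "tearing"
```

For example, 90 FPS on a 144 Hz G-Sync monitor stays within the variable refresh range, while 200 FPS falls back to either a V-Sync cap or tearing.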
Official description:[9]
Smooth Vsync is a new technology that can reduce stutter when Vsync is enabled and SLI is active.
When SLI is active and natural frame rates of games are below the refresh rate of your monitor, traditional Vsync forces frame rates to quickly oscillate between the refresh rate and half the refresh rate (for example, between 60Hz and 30Hz). This variation is often perceived as stutter. Smooth Vsync improves this by locking into the sustainable frame rate of your game and only increasing the frame rate if the game performance moves sustainably above the refresh rate of your monitor. This does lower the average framerate of your game, but the experience in many cases is far better.
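The oscillation described above follows from vsync quantization: a frame that misses a vblank must wait for the next one, so the effective rate is the refresh rate divided by a whole number of intervals. A minimal sketch (the helper name is ours, not Nvidia's):

```python
import math

def vsync_fps(frame_time_ms, refresh_hz=60):
    # Under traditional vsync, presentation waits for the next vblank,
    # so each frame occupies a whole number of refresh intervals.
    interval_ms = 1000.0 / refresh_hz
    intervals = math.ceil(frame_time_ms / interval_ms)
    return refresh_hz / intervals

# A frame rendered in 15 ms makes every 60 Hz vblank (60 FPS), but one
# rendered in 20 ms misses every other vblank and drops to 30 FPS.
```

A game hovering between those two frame times therefore flips between 60 and 30 FPS, which is the stutter Smooth Vsync tries to avoid by locking to the sustainable lower rate.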
Official description:[10]
Nothing is more distracting than frame rate stuttering and screen tearing. The first tends to occur when frame rates are low, the second when frame rates are high. Adaptive VSync is a smarter way to render frames using Nvidia Control Panel software. At high framerates, VSync is enabled to eliminate tearing. At low frame rates, it's disabled to minimize stuttering.
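In code terms, the policy above reduces to a threshold check; this is a sketch of the described behavior, not Nvidia's implementation:

```python
def adaptive_vsync_fps(natural_fps, refresh_hz=60):
    if natural_fps >= refresh_hz:
        # High frame rate: VSync is enabled, capping output to the
        # refresh rate and eliminating tearing.
        return float(refresh_hz)
    # Low frame rate: VSync is disabled, so the game runs at its
    # natural rate (with possible tearing) instead of being
    # quantized downward as under traditional vsync.
    return float(natural_fps)
```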
- Make sure to clear any flags in this field in a game profile when forcing anti-aliasing, as leftover flags can interfere and prevent it from working.
- Introduced with Half-Life 2 in 2004.
- Defaults to ON starting with Fermi GPUs.[11]
- Should not be set to OFF on modern hardware.[citation needed]
- Application Controlled - The application itself controls anti-aliasing settings and techniques. The display driver does not override or enhance the anti-aliasing setting the application configures.
- Override Application Setting - The display driver overrides the anti-aliasing setting of the application. This allows one to force anti-aliasing from the display driver. A general rule of thumb is to disable any in-game anti-aliasing/MSAA when using this to avoid conflicts. There are exceptions, generally noted in the Anti-Aliasing Compatibility Flags document.
- Enhance Application Setting - Enhances the anti-aliasing of a game (e.g. enhance in-game MSAA with TrSSAA), which can provide higher quality and greater reliability for applications with built-in support for anti-aliasing. You must set any anti-aliasing level within the application for this mode to work with the Antialiasing - Setting override.
- Use Override Application Setting if the application does not have built-in anti-aliasing settings or does not support anti-aliasing when HDR rendering is enabled. For modern games, you will most likely want this option rather than either of the other two.
- Enhance Application Setting is entirely dependent on the implementation of MSAA in the application itself. This can be hit or miss in modern DirectX 10+ games; more often than not it either does not work at all, breaks something, or looks very bad. See Enhance application setting for more information.
- DirectX 10+ games ignore this flag and always treat it as Enhance Application Setting.[13]
- Explanation of MSAA methods
This is where the specific method of forced/enhanced MSAA or OGSSAA is set.
- Technical explanation
- Interactive example
- This enables and disables the use of transparency multisampling.
- Further Reference
- This sets whether Supersampling and Sparse Grid Supersampling are used.
This enables Nvidia's Multi-Frame Anti-Aliasing (MFAA) mode. It only works in DXGI (DX10+) applications and requires MSAA to be either enabled in-game or forced. MFAA changes the sub-sample grid pattern every frame, and the result is reconstructed in motion with what Nvidia calls a "Temporal Synthesis Filter".
There are some caveats to using this though.
- It is not compatible with SGSSAA as far as shown in limited testing.[citation needed]
- Depending on the game, MFAA causes visible flickering on geometric edges and other temporal artifacts. Part of this is nullified with downsampling.[citation needed]
- It has a minimum framerate requirement of about 40 FPS; below that, MFAA will degrade the image.
- Causes visible sawtooth patterns in screenshots and videos captured locally.[citation needed]
- Incompatible with SLI.[citation needed]
- Turn FXAA On to improve image quality with a lesser performance impact than other anti-aliasing settings.
- Turn FXAA Off if you notice artifacts or dithering around the edges of objects, particularly around text.
- Enabling this setting globally may affect all programs rendered on the GPU, including video players and the Windows desktop.
Texture filtering settings
- 16x results in the best quality.
- It is possible to add a global override that forces anisotropic filtering on all games. This can solve texture filtering issues that would otherwise exist in games that have mediocre texture filtering. See Why force Anisotropic Filtering (AF)? for details.
- User-defined / Off.
- Modern GPUs can perform 4x anisotropic filtering at no performance penalty.[14]
- When set to On, it will ignore user-defined driver overrides. Some games default to this, such as Quake 4, RAGE, Doom (2016), Far Cry 3, and Far Cry 3: Blood Dragon.
- Select On for higher performance with a minimal loss in image quality.
- Select Off if you see shimmering on objects.
- Select On for higher performance with a minimal loss in image quality.
- Select Off if you see shimmering on objects.
- Manual LOD bias will be ignored when this is enabled.
The level of detail bias setting for textures in DirectX applications. This normally only works under two circumstances, both of which require Driver Controlled LoD Bias to be set to Off.
- When Antialiasing - Mode is set to Override Application Setting or Enhance Application Setting, and an appropriate Antialiasing - Setting is selected.
- If you leave Antialiasing - Mode set to Application Controlled or Enhance Application Setting but set the anti-aliasing and transparency settings to SGSSAA (e.g. 4xMSAA and 4xSGSSAA; TrSSAA in OpenGL), you can freely set the LOD bias and the changes will work without forcing anti-aliasing. This has the side effect that in some games with built-in MSAA, it will act as if you were "enhancing" the game setting even when using Application Controlled.
Notes:
- If you wish to use a negative LOD bias when forcing SGSSAA, these are the recommended amounts:
- 2xSGSSAA (2 samples): -0.5
- 4xSGSSAA (4 samples): -1.0
- 8xSGSSAA (8 samples): -1.5
- Do not use a negative LOD bias when using OGSSAA and HSAA as they have their own LOD bias.[15]
- Has no effect on Kepler GPUs and beyond unless using Nvidia SSAA (SG, OG or Hybrid); can be worked around by setting Antialiasing - Transparency Supersampling to AA_MODE_REPLAY_MODE_ALL.[16][17]
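The recommended biases above follow a simple pattern: half a mip level sharper per doubling of the sample count, i.e. -0.5 × log2(samples). A tiny helper (ours, for illustration; not part of NPI) reproduces the table:

```python
import math

def sgssaa_lod_bias(samples):
    # Each doubling of the SGSSAA sample count supports sampling one
    # half mip level sharper, hence -0.5 * log2(samples).
    return -0.5 * math.log2(samples)

# 2 samples -> -0.5, 4 samples -> -1.0, 8 samples -> -1.5
```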
- Should be set to or left at High Quality if not using a GeForce 8 series GPU or earlier.[citation needed]
- Disabled if Texture Filtering - Quality is set to High Quality.
- Performance
- Quality
- High Quality
- Quality and High Quality are nearly identical, while Performance noticeably lowers the resolution and precision of the effect in many games, with less-accurate and stronger shading.[citation needed]
- Certain games need Performance to support forced ambient occlusion, such as Oddworld: New 'n' Tasty![citation needed]
- Enabled when using forced ambient occlusion.
- If you are using an older OpenGL application, turning this option on may prevent crashing.
- If you are using a newer OpenGL application, you should turn this option off.
- Adaptive
- Automatically determines whether to use a lower clock speed for games and apps.
- Works well when not playing games.
- Older Nvidia GPU Boost GPUs tend to incorrectly use a lower clock speed.[citation needed]
- Setting the power management mode from Adaptive to Maximum Performance can improve performance in certain applications when the GPU is throttling the clock speeds incorrectly.[20]
- Optimal power
- Same as Adaptive, but will not re-render frames when the GPU workload is idle; instead it will just display the previous frame.
- The default power management mode on modern graphics cards.
- Prefer maximum performance
- Prevents the GPU from switching to a lower clock speed for games and apps.
- Works well when playing games.
- Wasteful when not using GPU-intensive applications or games.
- Added in driver 337.88
- Most newer applications should benefit from the Auto or On options.
- This setting should be turned off for most older applications.
- Auto-select - When possible, the driver converts fullscreen-sized windows into exclusive fullscreen.
- Use block transfer - Disables any driver conversions, all apps will work in legacy windowed mode.
- Has no effect when the Vulkan/OpenGL present method is set to Prefer layered on DXGI Swapchain.
- As Needed (default) - Resources for workstation features are allocated as needed, resulting in the minimum amount of resource consumption. Feature activation or deactivation often causes mode-sets.
- Moderate pre-allocation - Resources for the first workstation feature activated are statically allocated at system boot and persist thereafter. This will use more GPU and system memory, but will prevent mode-sets when activating or deactivating a single feature. Invocation of additional workstation features will still cause mode-sets.
- Aggressive pre-allocation - Resources for all workstation features are statically allocated at system boot and persist thereafter. This will use the most GPU and system memory, but will prevent mode-sets when activating or deactivating all workstation features.
- 0x00000000 OGL_DEFAULT_SWAP_INTERVAL_APP_CONTROLLED - Let the application decide.
- 0x00000001 OGL_DEFAULT_SWAP_INTERVAL_VSYNC - Follow the Vertical Sync setting used under the "Sync and Refresh" section
- 0x10000000 OGL_DEFAULT_SWAP_INTERVAL_FORCE_ON - Force Vertical Sync on
- 0xF0000000 OGL_DEFAULT_SWAP_INTERVAL_FORCE_OFF - Force Vertical Sync off
- 0xFFFFFFFF OGL_DEFAULT_SWAP_INTERVAL_DISABLE - Same as FORCE_OFF
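When reading these raw values out of an exported profile, a simple lookup table suffices. This is illustrative Python; the dictionary and helper are ours, and only the hex values and constant names come from the list above:

```python
OGL_DEFAULT_SWAP_INTERVAL = {
    0x00000000: "APP_CONTROLLED",  # let the application decide
    0x00000001: "VSYNC",           # follow the Vertical Sync setting
    0x10000000: "FORCE_ON",        # force Vertical Sync on
    0xF0000000: "FORCE_OFF",       # force Vertical Sync off
    0xFFFFFFFF: "DISABLE",         # same behavior as FORCE_OFF
}

def swap_interval_name(raw):
    """Map a raw 32-bit profile value to its symbolic name."""
    return OGL_DEFAULT_SWAP_INTERVAL.get(raw & 0xFFFFFFFF, "UNKNOWN")
```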
- 0x00000000 SHIM_MCCOMPAT_INTEGRATED Use Integrated Graphics
- 0x00000001 SHIM_MCCOMPAT_ENABLE Use DGPU [Nvidia Card]
- 0x00000002 SHIM_MCCOMPAT_USER_EDITABLE User chooses GPU [via menu/Control Panel/right-click menu]
- 0x00000008 SHIM_MCCOMPAT_VARYING_BIT If you know what this setting does, please add a description here.
- 0x00000010 SHIM_MCCOMPAT_AUTO_SELECT Use auto default option [Nvidia Control Panel decides]
- 0x80000000 SHIM_MCCOMPAT_OVERRIDE_BIT If you know what this setting does, please add a description here.
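Since OVERRIDE_BIT and VARYING_BIT are individual bits rather than exclusive values, a raw setting can combine them with one of the base modes. A small decoder (a sketch; the helper is not part of NPI, only the bit values come from the list above) shows how the bits compose:

```python
SHIM_MCCOMPAT_BITS = {
    0x00000001: "ENABLE",
    0x00000002: "USER_EDITABLE",
    0x00000008: "VARYING_BIT",
    0x00000010: "AUTO_SELECT",
    0x80000000: "OVERRIDE_BIT",
}

def shim_flags(raw):
    # 0x00000000 (INTEGRATED) is the absence of all bits.
    if raw == 0:
        return ["INTEGRATED"]
    return [name for bit, name in SHIM_MCCOMPAT_BITS.items() if raw & bit]
```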
- Off – Silk is disabled.
- Low – Moderate smoothing is enabled and most microstutter is eliminated.
- Medium – Many stutters and hitches are removed in typical games.
- High – More smoothing is applied and may result in observable input lag.
- Ultra – Maximum smoothing is applied and most stutters and hitches in games are eliminated. Lag may be unacceptable in some games.
- Selecting High or Ultra settings for Silk can increase noticeable input lag when playing and may not be appropriate for first-person shooters or competitive gaming.