Antialiasing is a technique used to smooth the edges of objects in a scene to reduce the jagged "stairstep" effect that sometimes appears. Full-scene antialiasing is supported on GeForce or newer hardware. By setting the appropriate environment variable, you can enable full-scene antialiasing in any OpenGL application on these GPUs.
Several antialiasing methods are available, and you can select among them by setting the __GL_FSAA_MODE environment variable appropriately. Note that increasing the number of samples taken during FSAA rendering may decrease performance.
The following tables describe the possible values for __GL_FSAA_MODE and the effects that they have on various NVIDIA GPUs.
__GL_FSAA_MODE | GeForce, GeForce2, Quadro, and Quadro2 Pro |
---|---|
0 | FSAA disabled |
1 | FSAA disabled |
2 | FSAA disabled |
3 | 1.5 x 1.5 Supersampling |
4 | 2 x 2 Supersampling |
5 | FSAA disabled |
6 | FSAA disabled |
7 | FSAA disabled |
__GL_FSAA_MODE | GeForce4 MX, GeForce4 4xx Go, Quadro4 380,550,580 XGL, and Quadro4 NVS |
---|---|
0 | FSAA disabled |
1 | 2x Bilinear Multisampling |
2 | 2x Quincunx Multisampling |
3 | FSAA disabled |
4 | 2 x 2 Supersampling |
5 | FSAA disabled |
6 | FSAA disabled |
7 | FSAA disabled |
__GL_FSAA_MODE | GeForce3, Quadro DCC, GeForce4 Ti, GeForce4 4200 Go, and Quadro4 700,750,780,900,980 XGL |
---|---|
0 | FSAA disabled |
1 | 2x Bilinear Multisampling |
2 | 2x Quincunx Multisampling |
3 | FSAA disabled |
4 | 4x Bilinear Multisampling |
5 | 4x Gaussian Multisampling |
6 | 2x Bilinear Multisampling by 4x Supersampling |
7 | FSAA disabled |
__GL_FSAA_MODE | GeForce FX, GeForce 6xxx, GeForce 7xxx, Quadro FX |
---|---|
0 | FSAA disabled |
1 | 2x Bilinear Multisampling |
2 | 2x Quincunx Multisampling |
3 | FSAA disabled |
4 | 4x Bilinear Multisampling |
5 | 4x Gaussian Multisampling |
6 | 2x Bilinear Multisampling by 4x Supersampling |
7 | 4x Bilinear Multisampling by 4x Supersampling |
8 | 4x Bilinear Multisampling by 2x Supersampling (available on GeForce FX and later GPUs; not available on Quadro GPUs) |
__GL_FSAA_MODE | GeForce 8xxx, G8xGL |
---|---|
0 | FSAA disabled |
1 | 2x Bilinear Multisampling |
2 | FSAA disabled |
3 | FSAA disabled |
4 | 4x Bilinear Multisampling |
5 | FSAA disabled |
6 | FSAA disabled |
7 | 4x Bilinear Multisampling by 4x Supersampling |
8 | FSAA disabled |
9 | 8x Bilinear Multisampling |
10 | 8x |
11 | 16x |
12 | 16xQ |
13 | 8x Bilinear Multisampling by 4x Supersampling |
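As a sketch of how the variable is used in practice, you can set __GL_FSAA_MODE for a single invocation or export it for the whole shell session. The application name `glxgears` below is only a stand-in for any OpenGL program:

```shell
# Run one application with mode 4 (4x bilinear multisampling on most
# of the GPUs in the tables above; 2 x 2 supersampling on the oldest).
__GL_FSAA_MODE=4 glxgears

# Or export the variable so it applies to every OpenGL application
# started from this shell:
export __GL_FSAA_MODE=4
```

Per-command assignment is usually preferable, since higher sample counts can noticeably reduce performance and you may not want FSAA forced on every application.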
Automatic anisotropic texture filtering can be enabled by setting the environment variable __GL_LOG_MAX_ANISO. As the name suggests, its value is the base-2 logarithm of the maximum anisotropy. The possible values are:
__GL_LOG_MAX_ANISO | Filtering Type |
---|---|
0 | No anisotropic filtering |
1 | 2x anisotropic filtering |
2 | 4x anisotropic filtering |
3 | 8x anisotropic filtering |
4 | 16x anisotropic filtering |
4x and greater are only available on GeForce3 or newer GPUs; 16x is only available on GeForce 6800 or newer GPUs.
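For example, requesting 8x anisotropic filtering means setting the variable to log2(8) = 3. Again, `glxgears` stands in for any OpenGL application:

```shell
# Enable 8x anisotropic filtering (value 3 from the table above)
# for a single application run:
__GL_LOG_MAX_ANISO=3 glxgears
```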
Setting the environment variable __GL_SYNC_TO_VBLANK to a non-zero value will force glXSwapBuffers to sync to your monitor's vertical refresh (perform a swap only during the vertical blanking period).
When using __GL_SYNC_TO_VBLANK with TwinView, OpenGL can only sync to one of the display devices; this may cause tearing corruption on the display device to which OpenGL is not syncing. You can use the environment variable __GL_SYNC_DISPLAY_DEVICE to specify the display device to which OpenGL should sync. Set this environment variable to the name of a display device, for example "CRT-1". Look for the line "Connected display device(s):" in your X log file for a list of the display devices present and their names. You may also find it useful to review Chapter 13, Configuring TwinView, and the section on Ensuring Identical Mode Timings in Chapter 19, Programming Modes.
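A typical combination of the two variables looks like the following; "CRT-1" is only an example name, so substitute a device name from the "Connected display device(s):" line in your own X log:

```shell
# Make glXSwapBuffers wait for the vertical blanking period:
export __GL_SYNC_TO_VBLANK=1

# With TwinView, choose which display device OpenGL syncs to
# (the other display may still show tearing):
export __GL_SYNC_DISPLAY_DEVICE=CRT-1
```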
Setting the environment variable __GL_FORCE_GENERIC_CPU to a non-zero value will inhibit the use of CPU-specific features such as MMX, SSE, or 3DNow!. Use of this option may result in performance loss.
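Because of the performance cost, this option is best applied per invocation, for example when checking whether a problem is specific to an optimized CPU code path. `glxgears` again stands in for any OpenGL program:

```shell
# Force the generic (non-MMX/SSE/3DNow!) code paths for one run:
__GL_FORCE_GENERIC_CPU=1 glxgears
```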
The NVIDIA GLX implementation sorts FBConfigs returned by glXChooseFBConfig() as described in the GLX specification. To disable this behavior, set __GL_SORT_FBCONFIGS to 0 (zero); FBConfigs will then be returned in the order they were received from the X server. To examine the order in which FBConfigs are returned by the X server, run:
nvidia-settings --glxinfo
This option may be useful to work around problems in which applications pick an unexpected FBConfig.
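Putting the two pieces together, a minimal workflow for diagnosing an FBConfig-selection problem might look like this:

```shell
# Disable the GLX-specification sort so FBConfigs come back in the
# X server's own order:
export __GL_SORT_FBCONFIGS=0

# Inspect the FBConfigs the X server reports, to see which one the
# application would now pick first:
nvidia-settings --glxinfo
```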