RivaTuner is a complete, powerful tweaking environment providing everything you may need to tune NVIDIA GPU based display adapters. The widest range of driver-level Direct3D / OpenGL and system tuning options, a flexible profiling system allowing custom settings on a per-application basis, both driver-level and low-level hardware access modes, unique diagnostic and real-time hardware monitoring features, and exclusive power-user oriented tools such as a built-in registry editor and a patch script engine make RivaTuner's feature set unmatched.
RivaTuner supports all NVIDIA display adapters, from the Riva TNT family up to the latest GeForce 6800 series, and the widest range of NVIDIA drivers, from the oldest Detonator 2.08 up to the newest ForceWare family.
In addition to complete NVIDIA hardware support, RivaTuner also provides limited support for display adapters based upon the ATI RADEON 8500 and newer ATI graphics processors. All of RivaTuner's features besides the driver-level tuning options are also available on supported ATI hardware.

License agreement:

Your use of RivaTuner is governed by the conditions below. Please read this information carefully before using RivaTuner. By using it, you agree to these conditions:

1. RivaTuner is freeware and may be used freely for any noncommercial purpose, subject to the following restrictions.
2. RivaTuner may only be distributed electronically through the official hosting partners www.guru3d.com and www.nvworld.ru. Distribution through other websites, or in any other form, without the author's permission is prohibited.
3. RivaTuner is supplied "as-is". The author assumes no liability for damages, direct or consequential, which may result from the use of RivaTuner.
4. RivaTuner's SoftQuadro4, SoftR9x00 and SoftFireGL patch scripts are intended for testing purposes only. These scripts were developed exclusively to prove the test results stated in the author's articles. The use of these scripts for any commercial purpose is strictly prohibited. The author expressly disclaims any liability for improper distribution and improper use of these scripts.
5. No derivative works or idea stealing. You may not clone any of RivaTuner's original modules and present them to the tweaking scene as your own ideas without any reference to the source.
6. RivaTuner is copyrighted material of Alexey Nicolaychuk aka Unwinder. You may not decompile, disassemble or otherwise reverse engineer this product. You may not include parts of RivaTuner (databases, patch scripts and bundled drivers) in your software without the author's permission. You may not alter or modify RivaTuner in any way or create a new installer for it.

Copyright (C) 1998-2005, Alexey Nicolaychuk aka Unwinder

System requirements
Revision history
Version 2.0 Release Candidate 15.6
Version 2.0 Release Candidate 15.5
Version 2.0 Release Candidate 15.4
Version 2.0 Release Candidate 15.3 New Year edition
Version 2.0 Release Candidate 15.2
Version 2.0 Release Candidate 15.1
Version 2.0 Release Candidate 15 hotfix
Version 2.0 Release Candidate 15
Version 2.0 Release Candidate 14.3 New Year edition
Version 2.0 Release Candidate 14.2
Version 2.0 Release Candidate 14.1
Version 2.0 Release Candidate 14
Version 2.0 Release Candidate 12.4
Version 2.0 Release Candidate 12.3
Version 2.0 Release Candidate 12.2
Version 2.0 Release Candidate 12.1
Version 2.0 Release Candidate 12
Version 2.0 Release Candidate 11.1
Version 2.0 Release Candidate 11
Version 2.0 Release Candidate 10.2
Version 2.0 Release Candidate 10.1
Version 2.0 Release Candidate 10
Version 2.0 Release Candidate 9
Version 2.0 Release Candidate 8.2
Version 2.0 Release Candidate 8.1
Version 2.0 Release Candidate 8
Version 2.0 Release Candidate 7.1
Version 2.0 Release Candidate 7
Version 2.0 Release Candidate 6
Version 2.0 Release Candidate 5.1
Version 2.0 Release Candidate 5
Version 2.0 Release Candidate 4.1
Version 2.0 Release Candidate 4
Version 2.0 Release Candidate 3.1
Version 2.0 Release Candidate 3
Version 2.0 Release Candidate 2
Version 2.0 Release Candidate
Version 2.0 beta 2
Version 2.0 beta
Frequently asked questions (FAQ)

System requirements:


Known issues:

Revision history:

Version 2.0 Release Candidate 15.6:

Minor bugfixes:

What's new:

Version 2.0 Release Candidate 15.5 (published on 07.05.2005):

Minor bugfixes:

What's new:

Version 2.0 Release Candidate 15.4 (published on 04.03.2005):

Minor bugfixes:

What's new:

Version 2.0 Release Candidate 15.3 New Year edition (published on 28.12.2004):

Minor bugfixes:

What's new:

Version 2.0 Release Candidate 15.2 (published on 04.10.2004):

Minor bugfixes:

What's new:

Version 2.0 Release Candidate 15.1 (published on 05.09.2004):

Minor bugfixes:

What's new:

Version 2.0 Release Candidate 15 hotfix (published on 02.07.2004):

Minor bugfixes:

Version 2.0 Release Candidate 15 (published on 28.06.2004):

Minor bugfixes:

What's new:

Version 2.0 Release Candidate 14.3 New Year edition (published on 28.12.2003):

Minor bugfixes:

What's new:

Warning! For an unknown reason, ASUS 9800XT boards tend to misread data from their hardware sensors when the VPU is highly loaded (mostly when running 3D applications). The problem is not RivaTuner specific and is also echoed in ASUS's own SmartDoctor hardware monitoring suite. The problem is being investigated; presumably it is not related to the ASUS 9800XT-specific sensor implementation, and the misreading is caused by problems in the ATI driver's I2C interface. As a temporary workaround, the current release of RivaTuner provides a heuristic algorithm allowing the monitoring module to detect and filter incorrectly read data on ASUS 9800XT boards. However, it is not a 100% workaround and may not work on some ASUS 9800XT boards.

Version 2.0 Release Candidate 14.2 (published on 11.12.2003):

Minor bugfixes:

What's new:

Version 2.0 Release Candidate 14.1 (published on 02.11.2003):

Minor bugfixes:

What's new:

Version 2.0 Release Candidate 14 (published on 30.09.2003):

Minor bugfixes:

What's new:

Note: DetonatorFXDecoder and DetonatorFXD3DAntiprotection were developed exclusively due to numerous requests from NVIDIA users who wanted to install the LODBiasFix patch on Detonator 45.xx and higher. Support for DetonatorFXDecoder and DetonatorFXD3DAntiprotection will be discontinued once the negative LOD bias adjustment bug in the NVIDIA Direct3D driver is fixed.

Version 2.0 Release Candidate 12.4 (published on 28.04.2003):

Minor bugfixes:

What's new:

Version 2.0 Release Candidate 12.3 (published on 17.03.2003):

Minor bugfixes:

What's new:

Version 2.0 Release Candidate 12.2 (published on 30.01.2003):

Minor bugfixes:

What's new:

Note: SoftQuadro4 support is currently suspended and these scripts are not 42.70+ compatible.

Version 2.0 Release Candidate 12.1 (published on 13.01.2003):

Minor bugfixes:

What's new:

Version 2.0 Release Candidate 12 (published on 28.09.2002):

Minor bugfixes:

What's new:

Version 2.0 Release Candidate 11.1 (published on 23.07.2002):

Minor bugfixes:

What's new:

Version 2.0 Release Candidate 11 (published on 26.06.2002):

Minor bugfixes:

What's new:

Version 2.0 Release Candidate 10.2 (published on 08.04.2002):

Minor bugfixes:

What's new:

Version 2.0 Release Candidate 10.1 (published on 01.04.2002):

Minor bugfixes:

What's new:

Version 2.0 Release Candidate 10 (published on 06.02.2002):

Minor bugfixes:

What's new:

Version 2.0 Release Candidate 9 (published on 29.12.2001):

Minor bugfixes:

What's new:

Version 2.0 Release Candidate 8.2 (published on 10.12.2001):

Minor bugfixes:

What's new:

Version 2.0 Release Candidate 8.1 (published on 6.12.2001):

Minor bugfixes:

What's new:

Version 2.0 Release Candidate 8 (published on 14.11.2001):

Minor bugfixes:

What's new:

Version 2.0 Release Candidate 7.1 (published on 30.09.2001):

Minor bugfixes:

What's new:

Version 2.0 Release Candidate 7 (published on 20.09.2001):

Minor bugfixes:

What's new:

Version 2.0 Release Candidate 6 (published on 2.08.2001):

Minor bugfixes:

What's new:

Version 2.0 Release Candidate 5.1 (published on 8.07.2001):

Minor bugfixes:

What's new:

Version 2.0 Release Candidate 5 (published on 25.06.2001):

Minor bugfixes:

What's new:

Version 2.0 Release Candidate 4.1 (published on 23.04.2001):

Minor bugfixes:

What's new:

Version 2.0 Release Candidate 4 (published on 05.04.2001):

Minor bugfixes:

What's new:

Version 2.0 Release Candidate 3.1 (published on 11.03.2001):

Minor bugfixes:

What's new:

Version 2.0 Release Candidate 3 (published on 05.03.2001):

Minor bugfixes:

What's new:

Version 2.0 Release Candidate 2 (published on 11.02.2001):

Minor bugfixes:

What's new:

Version 2.0 Release Candidate (published on 29.12.2000):

Minor bugfixes:

What's new:

Version 2.0 beta 2 (published on 26.11.2000):

Minor bugfixes:

What's new:

Version 2.0 beta (published on 17.07.2000):

The first public release


Q: How does RivaTuner work? Is it just an ordinary registry tweaker?
A: Yes, it was a registry tweaking utility in the beginning. Now RivaTuner can work in two modes and tweak your graphics subsystem either at the display driver level or at low level.
At the display driver level, RivaTuner changes settings via the registry and directly calls the driver's functions to perform some operations (e.g. querying and setting the clock frequencies, updating an overlay context, changing a color scheme and so on). In this mode RivaTuner can also read some information (e.g. AGP settings) directly from your graphics hardware, but it doesn't use low-level hardware access to change anything.
In low-level mode, RivaTuner works directly with your graphics hardware. When you make changes in this mode, RivaTuner mostly bypasses both the Windows API and the display drivers and programs your graphics hardware directly.

Q: Can I close RivaTuner after changing the driver's settings? Will the tweaks still have an effect?
A: Yes, of course. All the driver's settings are stored in the registry, so you can safely close RivaTuner or even remove it from your hard drive and the tweaks will still work. It is not necessary to load RivaTuner at Windows startup to apply the changes you have made. However, some options do require RivaTuner to reside in memory, for example the 'Restore clock frequencies after suspended mode' option and the low-level refresh overrider module. Power users may also wish to keep RivaTuner resident in order to use its hardware monitoring module and the tracking features of the built-in registry editor and the low-level diagnostic module.

Q: And what about the low-level overclocking? Will it work if RivaTuner is not loaded at Windows startup?
A: Yes, it will. The only condition is that you must not remove RivaTuner's folder from your hard drive. RivaTuner adds itself, with the /S command line switch, to the autorun registry key when you enable any low-level setting (e.g. AGP settings, overclocking or color correction) at Windows startup. When this command line switch is specified, RivaTuner loads itself at Windows startup, executes its startup daemon (the procedure that configures startup settings depending on the options you've chosen) and immediately unloads itself from memory.
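The autorun mechanism described above amounts to a value in the standard Windows Run key. The exact value name, key hive and installation path RivaTuner uses are not documented here, so the snippet below is a hypothetical illustration only:

```reg
Windows Registry Editor Version 5.00

; Hypothetical illustration of a startup-daemon entry. The actual value
; name, hive and path depend on your RivaTuner installation.
[HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Run]
"RivaTuner"="\"C:\\Program Files\\RivaTuner\\RivaTuner.exe\" /S"
```

With the /S switch present, the program runs its startup daemon and exits instead of staying resident.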

Q: How can I completely remove RivaTuner from my system?
A: Just run the uninstaller from the Start menu or from the 'Add or remove programs' control panel applet. It will remove RivaTuner from your hard drive and clean all RivaTuner specific settings from your registry.

Q: What about the driver-specific tweaks I've made with RivaTuner? Will they stay after uninstalling RivaTuner?
A: Yes, they will. RivaTuner doesn't make any changes to the driver's settings during uninstallation. If you want to reset the driver's settings to their defaults, use RivaTuner's "Reset all the driver's settings to their default values" function before uninstalling it. Alternatively, you can save the pre-RivaTuner state of the driver's settings to a preset, then simply restore the settings with this preset before uninstalling RivaTuner.

To create a snapshot containing the pre-RivaTuner driver settings, do the following:

1. Open the "Launcher" tab and click the "Add new menu item" button, then select "Regular menu item type" and press OK. The menu item editor dialog will appear.
2. Type a name for the new menu item, for example "Pre-RivaTuner restore point".
3. Check the "Associated preset" option, then click the "Create" button to create a preset containing your current driver settings.

A new menu item will appear. Now you can roll back to this restore point by double-clicking the menu item. When RivaTuner is minimized to the tray, you can also roll back to this restore point from the context menu (right-click RivaTuner's tray icon to open it).

Q: I've noticed that the RivaTuner service keeps running even after unloading RivaTuner. Is it a bug?
A: No, it is the by-design driver usage strategy. It was introduced in version RC12 to make RivaTuner compatible with the Windows XP fast user switching feature. Since that version RivaTuner uses an install-start-load-unload driver usage pattern instead of install-start-load-unload-stop-uninstall. However, the previous mode (install-start-load-unload-stop-uninstall) can still be activated with the IODriverUninstallBehaviour registry entry.

Q: Where do you get the info about all these tweaks? Do you work at NVIDIA?
A: No, I don't work at NVIDIA. I'm just a discontented owner of an NVIDIA display adapter and I want to get everything I can from my hardware and its drivers. I'm a professional programmer and reverse engineering is my hobby, so I simply rip the info from the drivers using SoftICE, IDA and some other tools. I have used NVIDIA display adapters since Riva128 times, but I've never been satisfied with the NVIDIA driver's control panel interface. It's a bit strange to me that some really useful settings are hidden from end users. That's why I started to code my own tweaking utility, one that satisfies all my needs.

Q: What is an *.RTD database and how can I use it?
A: An *.RTD database is simply a list of registry entries which can be viewed and edited via the built-in registry editor. RivaTuner uses *.RTD files to store information about the registry entries used by a driver. Different driver versions use different sets of registry entries, stored in different registry keys, so the information for each driver version is stored in a separate database. The database for each driver version contains the list of registry entries used by that driver, along with default values and descriptions for each entry. You don't make any changes to your registry when you open a new database in RivaTuner; you just load the list of registry entries which can be modified via the built-in registry editor.

Q: Do I need to load a new *.RTD database after installing new drivers?
A: Don't worry about databases unless you are going to edit the registry directly via the PowerUser tab; if you are not an experienced user and don't use this tab, you can forget about them. The rest of RivaTuner's features don't depend on the currently loaded database at all: this tab is just an advanced tool for experienced users and doesn't directly affect any other RivaTuner module. So you may safely work with RivaTuner even if a completely outdated database is loaded in the PowerUser tab. RivaTuner will correctly detect any supported driver and allow you to change all the supported options via the Direct3D / OpenGL / System tweaking dialogs regardless of the database loaded in the PowerUser tab.

Q: What is an *.RTP preset and how can I use it?
A: An *.RTP preset is a script which can add or remove entries from your registry. It is similar to a *.REG file, but it has some advantages:

1. Both Windows 9x and Windows 2000/XP store settings in system-dependent registry keys, which means your *.REG files may work incorrectly on another PC, or even on your own PC after reinstalling the display driver. RivaTuner uses macro names to export and import presets, so they will always work correctly.
2. *.REG files cannot remove entries from the registry; presets can.
3. Preset files can contain driver version dependent and hardware dependent registry entries.
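To see why macro names (advantage 1) matter, note that the concrete registry key holding the driver's settings includes a system-dependent device index, so a literal key recorded in a *.REG file can go stale. The sketch below illustrates the idea only; the macro syntax, key layout and function names are made up for illustration and are not RivaTuner's actual .RTP format:

```python
# Illustrative sketch of macro-based key resolution; not the real .RTP format.
def resolve_key(template, device_index):
    """Expand a hypothetical %DISPLAY_KEY% macro into a concrete,
    system-dependent key for the current display device index."""
    concrete = (r"HKLM\SYSTEM\CurrentControlSet\Control\Video"
                + "\\{...}\\"              # class GUID placeholder, elided
                + "%04d" % device_index)   # device index varies per system
    return template.replace("%DISPLAY_KEY%", concrete)

# A literal .REG file would hardwire index 0000 and break if the driver is
# reinstalled as device 0001; the macro form is resolved at import time.
print(resolve_key(r"%DISPLAY_KEY%\PreRenderLimit", 1))
```

The point is that the preset stores the macro, and the concrete key is computed on the machine where the preset is imported.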

You can use the pre-created presets (located in the "Presets" folder) or create your own via the built-in registry editor. Presets are integrated into the Explorer shell, so you can just click any *.RTP file in Explorer to import the preset data. You can also run a preset via RivaTuner's launcher by associating the preset with a launcher item. Follow these steps to associate a preset with a launcher item:

1. Open the "Launcher" tab and click the "Add new menu item" button, then select "Regular menu item type" and press OK. The menu item editor dialog will appear.
2. Type a name for the new menu item, for example "Apply quality optimized settings".
3. Check the "Associated preset" option, then select your preset and close the menu item editor.

A new menu item will appear. Now you can launch the preset associated with this menu item by double-clicking it. When RivaTuner is minimized to the tray, you can also launch it from the context menu (right-click RivaTuner's tray icon to open it).

Q: I need a preset for Quake III. Where can I download it?
A: You don't need presets for every game you own. I've added presets only for those games which cannot run properly without specific driver settings (e.g. the NFS series requires table fog emulation and a nonstandard texel alignment scheme; games based upon the Unreal engine look much better with a negative LOD bias and a floating-point W-buffer format in 16-bit modes, and so on). For the remaining games you can use the "High quality" and "High performance" presets.

Q: Some settings (e.g. automatic mipmapping) are grayed out in RivaTuner. How can I activate them?
A: RivaTuner grays out a setting in one of three cases:

1. Settings are grayed out if your hardware does not support them. For example, TNT/TNT2 chipsets do not support anisotropic texture filtering, and GeForce chips do not support automatic mipmapping, the fast trilinear filtering mode, etc.
2. Settings are grayed out if your drivers do not support them. For example, the "Enable S3TC quality trick" option is supported by Detonator 6.47 and higher only.
3. Settings are grayed out if your operating system doesn't support them. For example, the "Use busmastering mode for video" option is supported by Windows 9x only.

The best way to determine why a setting is grayed out is to click the help button on RivaTuner's window caption and then click the option you want to know about. RivaTuner will display detailed help explaining everything you need to know about that option.

Q: It looks like RivaTuner caused problems with 3DMark2001 on my system. It crashes with the following error message: "P_D3D::DRV_allocateMap - device does not support bump normal maps". Is it a bug in RivaTuner? Can you fix it?
A: No, it's not a bug. This problem is caused by an incompatibility between the texture format setting in RivaTuner 2.0 RC9 and the Detonator 27.xx. NVIDIA simply added a new capability bit to the (D3D_)D3D_52971801 (encrypted SurfaceFormatsDX7) and (D3D_)D3D_52971801 (encrypted SurfaceFormatsDX8) registry entries in the 27.xx drivers. This bit allows the Direct3D driver to export the D3DFMT_Q8W8V8U8 pixel format; it must not be set on pre-27.xx drivers but must be set on the new ones. Unfortunately there is no way to make the pixel format setting forward compatible, because the driver doesn't allow unused bits to be set and resets these registry entries to defaults when at least one extra bit is set. So the only way to work around this problem is to wait for an updated RivaTuner. RivaTuner 2.0 RC10 is fully 27.xx-compatible, but this problem may appear again in the future if NVIDIA adds new texture format capability bits.

Q: The AGP settings in RivaTuner don't work properly on my system. What's the problem?
A: Unfortunately the Detonator's AGP settings don't work on certain platforms (especially non-Intel based ones). On most Intel based platforms these settings work fine, but on other chipsets they can be ignored or even cause your operating system to crash (e.g. on the VIA Apollo Pro 133A the AGP transfer rate settings are ignored and the AGP FastWrites settings may cause the OS to hang). To change AGP settings on such platforms you can use the BIOS / AGP GART driver settings, RivaTuner's low-level AGP configuration module, or another utility which programs AGP settings directly via the PCI configuration registers (PowerStrip, WPCREDIT, ZTAGPTool).

Q: I'm sure that my display adapter supports AGP FastWrites / sideband addressing, but these AGP settings are grayed out in RivaTuner. How can I enable them?
A: First, you must have at least a GeForce256 and Detonator 5.32 or higher to change the FastWrites settings, and at least Detonator 6.34 to change the sideband addressing settings. Second, read the previous question: FastWrites and sideband addressing settings can cause your system to hang, which is why I disabled them by default. If you know what Safe mode is and you aren't afraid of a BSOD, you can enable these settings in RivaTuner via the registry:

"LockDangerSettings"=dword:00000000

Q: RivaTuner shows that AGP SBA is enabled on my system, but another diagnostic tool tells me that SBA is disabled. What is the problem, and which tool is wrong?
A: It looks like you are using an AGP 3.0 graphics adapter and motherboard but diagnosing them with a tool that doesn't conform to the AGP 3.0 specification. According to the publicly available AGP 3.0 specification, SBA is mandatory for an AGP 3.0 capable display adapter, so AGP SBA is always enabled in AGP 3.0 mode and the SBA_ENABLE bits controlling the SBA protocol state are meaningless unless your system operates in AGP 2.0 mode. Some diagnostic tools simply ignore this and apply the old pre-AGP 3.0 SBA detection behavior to AGP 3.0 mode as well, relying on the obsolete and meaningless SBA_ENABLE bits. RivaTuner follows the AGP 3.0 specification: as soon as it detects that your graphics subsystem operates in AGP 3.0 mode, it always shows SBA as enabled, ignoring the actual state of the SBA_ENABLE bits. However, if you understand this specification detail and still want to see the actual state of the display adapter's / northbridge's SBA_ENABLE bits, you may do so with the ForceAGP30SBADetection registry entry.

Q: I used another tweaking utility before RivaTuner and I could change many more Direct3D and OpenGL options there. Why can't I change these options in RivaTuner?
A: Unfortunately some coders try to boost download rates and the popularity of their utilities by adding invalid, obsolete and even fake tweaking options. It is a pity, but it is true. Such utilities are mostly aimed at rookies in computer graphics. I'm not going to add fakes or untested options and deceive inexperienced users. To change untested options you may use RivaTuner's built-in registry editor, which allows you to change absolutely any registry entry.

Q: Sometimes I get the message 'Invalid registry entries have been detected. RivaTuner will use default or truncated values for these entries'. What does it mean?
A: It means that RivaTuner has detected registry entries containing invalid values. Such entries will be ignored or corrected by the driver; RivaTuner uses the same validation routines as the driver does. Usually this message is caused by other tweaking utilities setting incorrect registry entries, or even by the Detonator control panel interface. For example, you can set the PreRenderLimit entry to 0 via the Detonator control panel, but the Direct3D driver checks PreRenderLimit and sets it to 1 if it is less than 1. If RivaTuner detects such an entry, it will give you this warning message and correct the entry in the same way the driver would.
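The validation described above amounts to simple clamping. The driver's actual routine isn't public; this is a minimal sketch of the PreRenderLimit behavior described in the answer:

```python
def validate_prerender_limit(value):
    """Clamp PreRenderLimit as described above: the Direct3D driver
    resets any value below 1 back to 1 and leaves valid values alone."""
    return value if value >= 1 else 1

# The control panel allowed storing 0; the driver (and RivaTuner's
# warning) corrects it to 1 before use.
print(validate_prerender_limit(0))  # → 1
print(validate_prerender_limit(3))  # → 3
```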

Q: I cannot use RivaTuner. Each time I try to run it I get the message 'The ordinal 6880 could not be located in the dynamic link library MFC42.DLL'. Any clues?
A: RivaTuner requires the Microsoft Foundation Classes (MFC) libraries in order to run properly. These libraries are included in Windows 98 SE / Windows ME / Windows 2000 / Windows XP. However, some software can replace them with older versions, in which case RivaTuner will not run properly. If you get this error message, just restore MFC42.dll from your Windows CD or download the latest version of the MFC libraries from Microsoft.

Q: I tried to use RivaTuner on Windows 95 OSR 2 but it cannot start. The program always displays the error message: 'The RIVATUNER.EXE file is linked to missing export SHELL32.dll:SHGetSpecialFolderPathA'. Any clues?
A: RivaTuner requires SHELL32.DLL v4.71 or higher for full functionality. You must install the Internet Explorer 4.0 Desktop Update or higher in order to use RivaTuner on Windows 95. Please read Internet Explorer's readme.txt for more info about installing the Desktop Update. I changed the code in RivaTuner v2.0 Release Candidate 4.1 to improve compatibility with Windows 95; since that version RivaTuner can start on this operating system with some limitations even if the Internet Explorer 4.0 Desktop Update is not installed.

Note: Windows 95 is no longer supported. RivaTuner v2.0 Release Candidate 6 is the last version which can work with this OS.

Q: I cannot overclock my GeForce2 PRO / GeForce2 Ultra / GeForce3 with either the Detonator control panel or RivaTuner. The clocks just revert to the defaults after a reboot. How can I fix it?
A: It's a known bug of the Detonator drivers for Windows 2000, fixed in Detonator 12.90. Due to incorrect memory clock frequency validation, the NvXTInit function always reverted the memory clock to the default value when it was above 400MHz. The problem was caused by a bug in video memory type detection. The Windows 9x drivers used the following validation intervals for the memory clock frequency:

80-400 MHz for boards equipped with SDR memory
80-800 MHz for boards equipped with DDR memory

The Windows 2000 driver couldn't correctly detect the memory type because it checked for the GeForce256 DDR PCI DeviceID only, so it detected all other boards as SDR and used the wrong validation interval. If you don't want to use the 12.90 or higher drivers, you may use RivaTuner's NvXTInitFix patch script. It makes a small correction in the driver, forcing it to use the 80-800MHz overclocking range regardless of video memory type.
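The validation bug described above can be sketched as follows. The DeviceID constants and the function's exact shape are illustrative, not the driver's actual code:

```python
# Hypothetical stand-in for the single DeviceID the buggy driver checked.
GEFORCE256_DDR_DEVICE_ID = 0x0101

def validate_memory_clock(device_id, requested_mhz, default_mhz):
    """Sketch of the buggy Windows 2000 NvXTInit validation described
    above: any board whose DeviceID isn't the GeForce256 DDR is treated
    as SDR, so DDR boards asking for >400 MHz revert to the default."""
    is_ddr = device_id == GEFORCE256_DDR_DEVICE_ID  # the bug: one ID checked
    upper = 800 if is_ddr else 400                  # SDR interval misapplied
    if not (80 <= requested_mhz <= upper):
        return default_mhz  # outside the (wrong) interval: revert
    return requested_mhz

# A DDR board with a different (illustrative) DeviceID asking for 500 MHz
# is validated against the 80-400 SDR interval and reverted to its default.
print(validate_memory_clock(0x0152, 500, 460))
```

The NvXTInitFix patch script effectively widens the interval to 80-800 MHz for every board, sidestepping the misdetection.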

Q: RivaTuner reports wrong clock frequencies on my MX400. It shows 200MHz/334MHz instead of 200MHz/166MHz. The Detonator control panel gives me the same result. How can I fix it?
A: This problem is caused by a hardware feature of the clock frequency generator, which can be hardwired to halve the PLL clock frequency. Usually this feature is used on boards equipped with DDR memory, but some hardware vendors also use it on GeForce2 MX200/MX400 boards with 4Mx16 SDR memory modules. In this case you'll still be able to overclock your display adapter safely; just remember that double the real memory clock frequency is displayed.

Q: It looks like I've overclocked my display adapter too far and now Windows freezes after logging in. Can you help me?
A: First, you should peek into RivaTuner's context help before using any option (especially system-critical options like overclocking, AGP, NVStrap settings etc.); you can always find detailed instructions there. And yes, you can reset the startup overclocking settings by pressing and holding the <Ctrl> key immediately after logging in to Windows. This applies to both driver-level and low-level overclocking as well as the rest of the system-critical startup settings (AGP etc.).

Q: I know that RivaTuner provides two ways of overclocking, but I don't understand the difference between these modes. I was told in the forums that low-level overclocking sets the clocks directly via the VGA BIOS whilst driver-level overclocking does it via the Detonator driver. Is this true? What's the difference, and which way should I use to overclock my GeForce4?
A: It is almost true. The only correction is that RivaTuner doesn't use the VGA BIOS: in low-level mode it accesses the PLLs (Phase Locked Loops, i.e. clock frequency generators) directly to set the clocks. In driver-level overclocking mode RivaTuner doesn't access the PLLs directly; it 'asks' the NVIDIA driver to report the current clock or set a new one. So the basic difference between these modes is the level of communication between RivaTuner and the display adapter's clock frequency generator.
However, the NVIDIA driver has some interesting behaviors that cause further differences between driver-level and low-level overclocking.
First, the driver tends to adjust memory timings to improve stability when changing clock frequencies on pre-GeForce FX boards. By contrast, the low-level overclocking module changes nothing but the clock frequencies, so when increasing the memory clock you may see a higher overclocking limit in driver-level mode, but performance at the same clocks can be higher with low-level overclocking. Second, the driver's internal overclocking implementation can change clock frequencies on pre-GeForce FX boards only during a display mode change, so the driver must reinitialize the display mode to apply new clocks; low-level overclocking applies the clocks without this annoying display flashing. Third, the low-level overclocking module allows power users to customize the PLL divider calculation algorithm, so you may fine-tune the clocks with higher accuracy than the driver can.
I won't recommend one overclocking mode over the other - just analyze the differences and choose the mode corresponding to your needs.
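The divider arithmetic behind "customizing the PLL divider calculation algorithm" can be sketched like this. The formula below is the typical programmable-PLL arrangement (reference clock, feedback divider N, input divider M, post divider P); it is a generic sketch, not the exact formula or register layout of any particular NVIDIA PLL:

```python
def pll_frequency_mhz(ref_mhz, n, m, p):
    """Typical programmable PLL output: f = ref * N / (M * 2^P).
    Generic sketch only; real hardware constrains N, M and P ranges."""
    return ref_mhz * n / (m * (1 << p))

# With a 13.5 MHz reference crystal, N=200, M=9, P=0 gives 300 MHz.
# Nudging N changes the output in ref/M = 1.5 MHz steps, which is why
# custom divider selection can hit targets a fixed algorithm would miss.
print(pll_frequency_mhz(13.5, 200, 9, 0))  # → 300.0
```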

Q: I've heard that RivaTuner is the only tool providing low-level overclocking for ATI boards. Is it true?
A: No, it is not true. Most ATI tweaking / overclocking tools control the clocks via direct access to the PLLs (i.e. at low level).

Q: I cannot set clocks in RivaTuner on my RADEON 9800PRO after adjusting clocks in ATITool/ATITrayTools. Can you fix it?
A: No, I will not fix anything, because it is not RivaTuner's problem. To maximize clock generation stability, RivaTuner uses a low clock frequency generation accuracy mode and locks the PLL reference divider, adjusting clock frequencies via the feedback dividers only. The tools you've mentioned don't use this safety measure (although ATITool does have an option for locking the reference divider) and may distort the reference divider, leaving it at a non-default value even when restoring the default clocks. This can seriously limit the range of available clocks when they are programmed via feedback divider adjustment only. So, to adjust clock frequencies with RivaTuner after using these tools, you should either allow RivaTuner to alter the reference divider by selecting high clock frequency generation accuracy in the advanced overclocking options, or simply reboot the system to let the VGA BIOS restore the default reference divider state.

Q: I cannot activate low-level overclocking on my NV3x+ display adapter. I've heard that RivaTuner doesn't provide low-level clock frequency control for these boards because the GeForce FX GPUs themselves don't support low-level overclocking. Is it true? If not, will you implement low-level overclocking for the GeForce FX GPU family in the future?
A: It makes no sense to talk about a lack of low-level overclocking support in the GPU, because the term 'low-level overclocking' assumes nothing but direct access to a clock frequency generator, so it can be implemented for any display adapter. This mode is not available in RivaTuner for an entirely different and pretty simple reason: low-level overclocking is useless for these boards because of the dynamic 2D/3D clock frequency adjustment specific to the GeForce FX GPU family. It is pointless to program the clock frequency generator directly, because the driver will override its state as soon as the system changes its 2D / 3D state. So the only correct way to overclock a GeForce FX based board is to 'tell' the driver which clocks should be set for 2D / 3D via RivaTuner's driver-level overclocking module or any other driver-level overclocking tool.

Q: However, I've heard that some tools provide low-level overclocking for the NV3x+ series. Can you comment on that?
A: Plain wrong. All currently available GeForce FX overclocking tools providing separate 2D/3D clock frequency adjustment control the clocks via the NVIDIA driver, due to the reasons explained in the previous question. For the same reasons, all driver-level GeForce FX overclocking tools reach the same maximum stable clocks, because none of them controls clock frequencies directly.
You can easily determine which overclocking approach a tool uses with the following algorithm: just change the clock in the third party tool you are examining, then peek into the NVIDIA control panel or RivaTuner's driver-level overclocking module. In case of low-level overclocking you won't see any changes there (because the NVIDIA driver is not notified about the clock frequency change and still 'sees' the old clocks), but you will see the new clock frequencies if they were passed to the NVIDIA driver.

Q: Can you explain the specifics of dynamic separate 2D/3D clock frequency adjustment on NV3x+ GPU families? Are the clocks dynamically adjusted by the GPU or by the VGA BIOS?
A: There are a few important things you should know about dynamic separate 2D/3D clock frequency adjustment on GeForce FX boards. First, the clocks are controlled neither by the GPU itself nor by the VGA BIOS. Dynamic clock frequency adjustment is performed entirely at display driver level, and the VGA BIOS is used just as information storage for the driver. Second, GeForce FX boards can have up to four fixed clocking modes, also called performance levels. Besides different clock frequencies, each performance level may have its own core voltage and its own fan speed. During system POST the VGA BIOS programs the initial, so called safe, performance level. The core voltage and clock frequencies corresponding to this performance level are rather low, usually 1.1V / 250MHz / 500MHz in most NV30/35/38 VGA BIOSes. This ensures that the system will be able to boot and you'll be able to start Windows even without an auxiliary power cable connected to your graphics card. Besides the safe performance level programmed by the VGA BIOS, there are up to three more performance levels. The VGA BIOS contains a so called performance table, used by the display driver and defining clock frequencies, core voltages and fan speeds (where possible) for each performance level. The performance table can contain up to three different performance levels, called performance level 0 (or standard 2D mode), performance level 1 (or low power 3D mode) and performance level 2 (or performance 3D mode). Higher performance level indices mean higher clocks, higher core voltage and higher performance, but also higher power consumption. As soon as Windows is loaded and the driver detects that the auxiliary power cable is connected to your graphics card and the VGA BIOS contains a non-empty performance table, the driver switches the graphics card from the safe performance level to performance level 0, or standard 2D mode.
During OS runtime, the driver tracks GPU state and switches the display adapter to the maximum performance level when GPU power is needed (i.e. when a 3D application is running), and conversely, iteratively lowers the performance level when GPU power is no longer needed and power can be saved by lowering core voltage and clock frequencies.
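The level transitions described above can be sketched as a small state machine. All names, constants and thresholds here are illustrative stand-ins for the driver's internal logic, not NVIDIA's actual implementation:

```python
# Hedged sketch of the performance-level logic described above; all names
# and values are hypothetical, chosen only to mirror the FAQ's description.
SAFE = -1        # programmed by the VGA BIOS during POST
STANDARD_2D = 0
LOW_POWER_3D = 1
PERFORMANCE_3D = 2

def initial_level(aux_power_connected, performance_table):
    """Level chosen once the driver loads, per the description above:
    standard 2D only if aux power is present and the table is non-empty."""
    if aux_power_connected and performance_table:
        return STANDARD_2D
    return SAFE  # otherwise stay at the safe level set by the VGA BIOS

def next_level(current, gpu_busy_3d):
    """Jump to the maximum level when 3D load appears; otherwise step
    down one level at a time to save power, never below standard 2D."""
    if gpu_busy_3d:
        return PERFORMANCE_3D
    return max(STANDARD_2D, current - 1)

# e.g. a 3D application starting and then exiting:
level = initial_level(True, [{"core_mhz": 300}])   # standard 2D
level = next_level(level, gpu_busy_3d=True)        # performance 3D
level = next_level(level, gpu_busy_3d=False)       # steps down to low power 3D
```

The iterative step-down (rather than an immediate drop to 2D) matches the FAQ's wording that the driver "iteratively lowers" the performance level.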

Q: So if I understand you correctly, fan speeds are specified directly in the performance table in the VGA BIOS. Can I edit it, for example to set a constant fan speed for all performance levels?
A: Yes, you can do it easily, but only on boards with the reference cooling system. Simply put, the fan must be physically connected to the PCB's reference fan voltage control circuit in order to use performance-level specific fan speed adjustment.

Q: I can hear that my GeForce 6800's fan changes speed when I start 3D applications, however I don't see any changes on the "Fan voltage" graph. Can you comment on that?
A: There are a few things important for understanding fan control. First, reference design display adapters based on NV3x and newer graphics processors have GPU controllable fan connectors, i.e. the graphics processor itself can scale fan voltage from 12V down to zero. Fan speed is controlled this way on reference design 6800 series boards and on some non-reference display adapters manufactured by vendors like Gainward and Leadtek. RivaTuner's "Fan voltage" graph reflects the actual fan voltage scaling level only on such boards. If the fan is not connected to this reference circuit and is always fed with 12V, this graph is simply not trustworthy. Second, vendors can install I2C ICs containing a combined thermal sensor and PWM (Pulse Width Modulation) controller. Such sensor chips can control fan speed using pulse width modulation instead of linear voltage adjustment. Fan speed on such display adapters is set either by vendor specific software (e.g. ASUS SmartDoctor), or automatically by the sensor depending on the current temperature; ASUS, for example, uses this approach. Third, some vendors use entirely self-contained fan speed solutions, where the cooling system itself allows controlling fan speed either automatically or manually (e.g. the AGP MSI NX6800 series).

Q: I'm having problems with applying overclocking at Windows startup on my GeForce FX with both Coolbits and RivaTuner. The core clock stays overclocked, but the memory clock resets to defaults as soon as I reboot the system. Can you help me?
A: Yes, it is a known issue and it is caused by incorrect startup overclocking settings you've made. Actually, only NV30 based boards provide separate core and memory clocks for different performance levels. The rest of the NV3x boards provide separate core clock frequency control, but unified memory clock frequency control. It means that the driver doesn't control memory clock frequency independently for each performance level, so when you change it for a certain performance level, it also affects the rest of the performance levels. However, the driver's control panel itself and all driver-level overclocking tools allow you to specify independent core and memory clocks for different performance levels even on boards with a unified memory clock (e.g. 300MHz/900MHz for standard 2D mode and 300MHz/1000MHz for performance 3D mode). Such settings will always cause interference, and the actual memory clock will be set to the value corresponding to the memory clock of the last programmed performance level. The control panel's startup daemon routine programs performance 3D clocks first, then standard 2D clocks, so either the startup memory clock for standard 2D mode or the default memory clock for this mode will actually be set. To avoid the problem, simply ensure that you've specified the same startup memory clocks for all performance levels. Starting from version RC15, RivaTuner includes a special memory clock frequency reset protection wizard, allowing it to detect and fix such incorrect startup settings automatically.
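The "last write wins" interference described above can be modeled in a few lines. This is an illustrative model only; the class and method names are made up for demonstration:

```python
# Sketch of the unified-memory-clock interference described above: on
# boards without separate per-level memory clocks, every per-level write
# overwrites the single shared memory clock (illustrative model).
class UnifiedMemoryClockBoard:
    def __init__(self):
        self.core = {}          # per-level core clocks (independent)
        self.memory = None      # one shared memory clock (unified)

    def program_level(self, level, core_mhz, mem_mhz):
        self.core[level] = core_mhz
        self.memory = mem_mhz   # affects every performance level

board = UnifiedMemoryClockBoard()
# startup daemon order: performance 3D clocks first, then standard 2D
board.program_level("3D", 475, 1000)
board.program_level("2D", 300, 900)
# the shared memory clock ends up at the last programmed value (900),
# so the intended 1000MHz 3D memory clock is silently lost
```

This is exactly why specifying identical startup memory clocks for all performance levels avoids the reset-to-defaults symptom.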

Q: I've enabled overclocking in the ForceWare control panel and selected manual overclocking mode, but RivaTuner's "Enable driver-level hardware overclocking" option is still disabled. What's the problem?
A: It is normal behavior for the 56.xx+ driver series. When you perform a clean driver install, enable manual overclocking mode for the first time and don't apply custom clocks at Windows startup, overclocking is actually enabled in two passes and the final initialization is performed after an OS reboot. So after the reboot and final initialization this option will be enabled in RivaTuner too. Alternately, you may simply re-enable overclocking in RivaTuner to force it to perform the same initialization without an OS reboot.

Q: I've unchecked the "Allow separate 2D/3D clock frequency adjustment" option while enabling driver-level overclocking in RivaTuner on my GeForce 5800, and my system booted at unexpectedly low clocks (250MHz/500MHz instead of 300MHz/600MHz). Also, I cannot even reach the default 3D clocks. What's wrong?
A: There is nothing wrong; please read the question above explaining the specifics of GeForce FX separate 2D/3D clock frequency control. By disabling separate 2D/3D clock frequency control you simply force your display adapter to work at the so called safe performance level, which is programmed by the VGA BIOS and has a lower core voltage compared even to the standard 2D performance level. So the range of stable clocks is much lower too, due to core undervoltage.

Q: I've unchecked the "Allow separate 2D/3D clock frequency adjustment" option while enabling driver-level overclocking in RivaTuner, but cannot find an option for re-enabling it. Can you help me?
A: You are prompted to change this option only while enabling driver-level overclocking. To switch between separate / unified clock frequency adjustment modes, simply disable driver-level overclocking in RivaTuner, click Apply, then re-enable it again.

Q: Starting from version RC15 I see no progress bar when testing clock frequencies on my GeForceFX. Is it normal?
A: Yes, it is normal. This version introduced an instantaneous driver-level stress test feature. There is no need to display a progress bar because the test is performed in a split second.

Q: RivaTuner doesn't allow me to set the core clock higher than 420MHz on my GeForce 6800 and displays the "The driver failed to pass internal test with the clock frequencies you are about to set. Please decrease clock frequencies and try again." error message. Can you fix this bug?
A: There is nothing to fix. Similar to the driver's control panel overclocking interface, RivaTuner performs clock stability testing and invokes the driver's internal stress test to verify whether it is safe to set the desired clock frequencies. If the test does not pass with the new clocks, you are not allowed to apply the changes. However, if you are a power user, you may either disable internal clock stress testing with the "DisableInternalClockTest" registry entry or disable clock testing entirely with the "DisableClockTest" registry entry. I strongly recommend avoiding these options if you are a beginner and terms like throttling and RobustChannels tell you nothing.

Q: I've disabled stress testing in RivaTuner and tried to overclock my GeForce 6800 Ultra to 490MHz, however graphics performance went down drastically. Can you fix this bug?
A: There is nothing to fix: you've purposely disabled clock frequency stability testing and went beyond the stable clocks range, so you should understand that your actions can cause unpredictable system behavior. A stress test failure means that there were internal GPU functionality failures when the chip was running at those clocks. You should understand that as soon as you forcibly set these clocks, similar GPU functionality failures can and surely will take place when running some heavy 3D applications. When such failures are detected by the driver, it may throttle the clocks to a lower performance level (low power 3D or even 2D). Alternately, the chip may simply hang (happily, RobustChannels technology allows the system to recover without a reboot). Both cases will cause a performance drop.

Q: I've successfully enabled the LowPower 3D clock controls with the "EnableLowPower3DControl" registry entry, but I cannot increase the clocks even by 1MHz due to internal stress test failure on my GeForce FX 5900. What's the problem?
A: It is by design. Unfortunately, the driver has internal stress test implementations for 2D and 3D modes only, so there were only three choices for clock testing implementation in RivaTuner: to test LowPower 3D clocks in 2D mode, to test LowPower 3D clocks in 3D mode, or to avoid testing at all. Considering that core voltage for LowPower 3D mode is usually greater than the 2D mode voltage but less than the 3D mode voltage, stress testing LowPower 3D clocks in 2D mode can give too pessimistic a result due to undervoltage, and stress testing in 3D mode can give too optimistic a result due to overvoltage. Following the beginner protection approach, I've chosen the safest way, so RivaTuner always tests LowPower 3D clocks in 2D mode. If you are a power user and understand these specifics, you may disable stress testing (please refer to the previous question) and set higher clocks for LowPower 3D mode.

Q: I'm having serious problems with overclocking my GeForce FX 5600 under the ForceWare 6x.xx series. As soon as I change the memory clock via RivaTuner's driver-level overclocking module, the driver's overclocking panel or even the ASUS SmartDoctor software supplied on CD with my display adapter, performance goes down drastically. Can you help me?
A: It is a known issue of the ForceWare 6x.xx drivers, which affects only GeForce FX display adapters with a non-reference VGA BIOS containing no (or a zero sized) performance table. Until the problem is fixed in the drivers, the only workarounds are either to adjust clocks at low level with tools like PowerStrip, or to flash a reference BIOS allowing the driver to use correct clock adjustment routines. Alternately, you may alter the clocks directly in the VGA BIOS with some specialized tools.

Q: There are a lot of rumors about GeForce FX software voltmods on the net. I've seen some online BIOS voltmodding tutorials and even voltmodded BIOSes available for download. Can you comment on that?
A: Yes, GPU core voltage is really software controllable on GeForce FX graphics processors. Unfortunately, all the online BIOS voltmodding guides I've seen seem to be written using blind comparison of different BIOS binaries, without actual understanding of software voltage control internals, so they contain some logical errors. The same applies to some voltmodded BIOSes available for download on some websites.
To understand the internals of software voltage control, let's start from the very beginning. NVIDIA boards have some GPU controllable GPIO (General Purpose Input Output) pins, which are used for different purposes. Up to three of these pins can be used to control core voltage on GeForce FX based boards. The states of these pins form a binary word (up to three bits wide), which uniquely identifies the target core voltage. This word is called the VID, or voltage identifier. So to program the desired core voltage, the driver simply sets each pin to the corresponding state via the corresponding GPIO register. But VID interpretation entirely depends on the PCB's core voltage generation logic. For example, most NV35/38 boards control core voltage via an ISL6569 IC, whose VID0 and VID1 input pins are hardwired to 0 / 1 and whose VID2 - VID4 pins are programmable by the GPU. So core voltage on these boards can be adjusted in the 0.8 - 1.5V range with 0.1V granularity, and all three GPIO pins are used. Other boards may have (and do have) simpler voltage control logic (e.g. a simplest 1-bit VID selecting one of 2 predefined voltages). As I've said before, VID interpretation may differ depending on the PCB design, and the driver knows nothing about it. To allow hardware vendors to alter voltage control logic safely, NVIDIA introduced so called voltage tables in BIOSes with BMP structure version 5.25 and newer. For older BIOSes the driver uses its own GPU-specific internal voltage table. The voltage table begins with a header, containing the total amount of voltage entries, the size of each entry and a valid VID bitmask. The last field is the most important, because it 'tells' the driver which pins actually control the voltage. For example, nobody prevents a hardware vendor from using a 2-bit VID defined by pin 0 and pin 2; in this case the VID bitmask will contain 101b. Take note that the driver will never program masked pins. An array of voltage table entries follows the header.
Each voltage table entry contains a target voltage identifier (target voltage (in Volts) * 100) and the VID defining this voltage. The first element of each entry (i.e. the target voltage identifier) is used just to allow the driver to pick the corresponding VID from the table (because the driver knows nothing about VIDs, it knows just the target voltage picked from the corresponding performance level entry in the performance table). So when programming the voltage, the driver simply scans all voltage table entries, compares the target voltage identifier with the voltage identifier of each entry and selects the closest entry. When the entry is selected, the driver splits the VID into separate bits and programs each non-masked bit via the corresponding GPIO register.
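The lookup-then-program flow just described can be sketched as follows. The table layout and the `gpio_write` callback are illustrative stand-ins (the real driver writes hardware registers); only the closest-entry selection and the bitmask rule come from the description above:

```python
# Sketch of the voltage-table lookup described above; data layout and
# names are hypothetical, the real driver programs GPIO registers.
def select_vid(voltage_table, target_voltage):
    """Pick the entry whose voltage identifier (volts * 100) is closest
    to the target voltage taken from the performance table."""
    target_id = round(target_voltage * 100)
    entry = min(voltage_table, key=lambda e: abs(e["voltage_id"] - target_id))
    return entry["vid"]

def program_vid(vid, vid_bitmask, gpio_write):
    """Split the VID into separate bits and program only non-masked pins."""
    for pin in range(3):                 # up to three VID pins
        bit = 1 << pin
        if vid_bitmask & bit:            # masked pins are never touched
            gpio_write(pin, 1 if vid & bit else 0)

table = [{"voltage_id": 110, "vid": 0b000},   # 1.1V
         {"voltage_id": 120, "vid": 0b001},   # 1.2V
         {"voltage_id": 130, "vid": 0b010}]   # 1.3V
written = {}
program_vid(select_vid(table, 1.2), 0b011,
            lambda pin, state: written.update({pin: state}))
# pins 0 and 1 are programmed (mask 011b); pin 2 is left untouched
```

Note how the driver never indexes the table by performance level; it always matches by target voltage, which is the key point the throttling-mod discussion below relies on.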
If you've read all this info carefully, you may already see the logical errors and potential problems in currently circulating voltmodded BIOSes yourself:
First, it's plain wrong to voltmod a BIOS by copying the 1.5V VID from NV38's voltage table to all other BIOSes without seeing the PCB and its voltage control logic, as advised in BIOS voltmod tutorials. VIDs do not have to be the same on all boards.
Second, it's wrong to ignore the VID bitmask and to edit the voltage table entry's VID only. As an example, let's take a board with the following 2-bit VID:
00 -> 1.1V, 01 -> 1.2V, 02 -> 1.3V and 03 -> 1.4V. An attempt to boost voltage by increasing the VID to 4 will actually lower the voltage and result in setting 1.1V (4 & 3 = 0). An attempt to boost voltage by copying NV38's 1.5V VID (7) will simply do nothing (7 & 3 = 3). The same attempt on a board with a different 2-bit VID interpretation (e.g. 01 -> 1.4V, 02 -> 1.3V, 03 -> 1.2V) will also lower the voltage and set it to 1.2V. So if you can actually see the PCB and are sure that there are more than 2 bits in the VID, you have to change the VID mask too. Otherwise, you simply shouldn't touch it.
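The masking arithmetic in the example above can be checked directly. The voltage mapping is the one from the first example board; the function name is just for illustration:

```python
# Masking arithmetic from the example above: bits beyond the VID bitmask
# are never programmed, so an edited VID is effectively ANDed with the mask.
def effective_vid(vid, mask):
    return vid & mask

MASK_2BIT = 0b11
# example board: 00 -> 1.1V, 01 -> 1.2V, 02 -> 1.3V, 03 -> 1.4V
voltages = {0b00: 1.1, 0b01: 1.2, 0b10: 1.3, 0b11: 1.4}

# "boosting" the VID to 4 actually lands on 1.1V (4 & 3 = 0)
low = voltages[effective_vid(4, MASK_2BIT)]
# copying NV38's 1.5V VID (7) does nothing beyond the board's 1.4V maximum
capped = voltages[effective_vid(7, MASK_2BIT)]
```

This is why editing a VID without also fixing the bitmask either lowers the voltage or changes nothing at all.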
To help you see if your voltmodded BIOS really affects VIDs, RivaTuner gives you the ability to monitor the state of voltage related GPIO pins in realtime, so you may see which VID is currently programmed by the driver. Using RivaTuner's VID interpretation feature you may also see both the raw VID data and the target voltage corresponding to this VID (to select a VID interpretation mode, right-click the VID graph in the hardware monitoring window, select Setup from the context menu and press the More button). Furthermore, RivaTuner's diagnostic report module allows you to see the internals of the voltage table stored in the VGA BIOS and warns you if there are entries with invalid VIDs which don't conform to the VID bitmask.

Q: If it is possible to program the VID pins, will RivaTuner provide us the ability to adjust GPU core voltage on-the-fly for GeForce FX display adapters?
A: No, sorry. I'll never add software voltage adjustment to RivaTuner, nor will I ever provide info about the GPU registers controlling the GPIO VID pins to third party tool creators. I don't want to be related to the development of a tool responsible for burning someone's system, and direct voltage control via a Windows utility is one of the things that can help beginners fry their GPUs.

Q: I've read a lot about so called throttling on GeForce FX display adapters. Can you explain what throttling is?
A: Throttling is the driver's protective technology, limiting the maximum available performance level for your graphics hardware while 3D applications are running. Throttling is aimed at preventing damage to your graphics hardware in case of GPU overheating, internal stress test failure, GPU undervoltage or internal GPU errors caused by too aggressive overclocking. Depending on the reason causing your system to throttle, the driver may either return your system to the normal state when the throttling condition is reset, or leave your system in the throttled state until the next reboot.

Q: It looks like I'm having problems with throttling on my GeForce FX. When I go beyond a certain clock frequency limit, my system completely freezes for a few seconds, then it throttles to a lower performance level (sometimes to low power 3D, sometimes to standard 2D). I've heard that this freezing is a result of hardware overheating protection, which is automatically activated by the GPU when some of its local parts are overheated, so this technique cannot be disabled as it is implemented in the chip itself. Is it true?
A: There are a lot of rumors about this infamous freezing and subsequent throttling, and you've just read the most fabulous of them. No, it is not true. You may also find some variations of this hypothesis, assuming that freezing is initiated by the driver's throttling algorithm in case of GPU overheating, so the driver simply cools the GPU this way when it is too hot. This hypothesis is absolutely incorrect too. Freezing is not purposely initiated either by the driver or by the GPU itself. Quite the opposite: freezing is not the result but the reason causing your system to throttle. Probably most users familiar with ATI display adapters know about the well advertised VPU recover technology, introduced in Catalyst 3.8. This technique allows the Catalyst to reset the graphics processor when it no longer responds to the display driver. This helps to avoid system hangs in case of aggressive overclocking, graphics processor overheating etc. However, just a few users know that NVIDIA has had a similar technique, called Robust Channels, since the Detonator 40.xx drivers family. Robust Channels also allows the system to recover when the graphics processor hangs for some reason. So the freezing you see is nothing but a natural GPU hang caused by too aggressive overclocking. Robust Channels technology simply detects that the GPU hangs and no longer responds to the driver, resets it and throttles down to improve stability. You may indeed disable Robust Channels technology by setting the RmRobustChannels registry entry to 0, but it will not help you to avoid image freezing. Quite the opposite: the system will simply hang completely.

Q: If I understand you correctly, there are a few reasons which can cause throttling. Is it possible to know what exactly causes my system to throttle?
A: Happily, the NVIDIA miniport driver has a nice event logging system which can be activated by setting the RmLogonRC registry entry to 1. In this case, after a reboot the driver will add all throttling related events to the system log, so you'll be able to use the event viewer and see what exactly causes your system to throttle: a hardware error recovered by the Robust Channels technique, thermal conditions, stress test failure or lack of auxiliary power.

Q: I've heard about so called throttling modded BIOSes and seen online BIOS modding tutorials. Can they really disable throttling? Can you comment on that?
A: No, they cannot disable throttling completely, but they can help reduce the effect of throttling in some cases. Unfortunately, all online throttling BIOS modding tutorials also contain some logical and technical errors, which can potentially cause problems.
First, the creators of these mods seem to misunderstand the throttling term, wrongly assuming that low power 3D mode (i.e. performance level 1) and throttling mode are the same thing and talking about 'throttling' any time the system runs at performance level 1. This is plain wrong, because a temporary switch to low power 3D mode is in some cases absolutely normal for GeForce FX boards, and low power 3D mode is not the only performance level the driver can switch to in case of real throttling algorithm activation. So these modded BIOSes just virtually 'eliminate' low power 3D mode by making it equal to performance 3D mode, but don't prevent the driver from activating throttling algorithms and throttling down to the lower performance level 0.
Second, there are serious problems in their performance table editing logic, causing side effects on some systems. All currently available throttling BIOS modding guides wrongly advise you to edit the second entry of the voltage table to change the voltage for low power 3D mode, assuming that voltage table entry indices directly correspond to performance level indices. This is absolutely incorrect; there is no direct correspondence between these things. There are BIOSes with low power 3D mode voltages specified by the first voltage table entry, and there are also BIOSes with both 2D and low power 3D voltages controlled by the same voltage table entry, so an attempt to edit the voltage table using such tutorials may cause unpredictable results. If you've read the voltage control related discussion carefully, you should remember that the driver uses target voltage based access to the voltage table instead of index based access. Actually, the target voltage for each performance level is specified right in the performance table, i.e. the driver scans all voltage table entries and tries to select the matching voltage from it instead of accessing a fixed entry by index. So when you copy the third voltage table entry to the second one as advised, you simply remove one of the available voltages from the table, causing the driver to fail and select a wrong voltage for some performance levels. To modify the low power 3D voltage correctly, you should leave the voltage table unchanged and simply edit the target voltage for this mode in the performance table.
To help you detect such incorrectly modded BIOSes, RivaTuner's diagnostic report displays both performance table and voltage table internals, so you can see if the voltages specified for each performance level really exist in the voltage table.

Q: So is it impossible to disable throttling by editing the VGA BIOS?
A: It is possible, but with a completely different approach than the one advised in throttling BIOS modding tutorials. The only way to disable throttling at VGA BIOS level is to disable separate 2D / 3D clock frequency adjustment. It can easily be done by hiding the performance table from the driver (either by zapping the pointer to the performance table in the BMP structure or by setting the performance table size to zero, like it is done in the ASUS BIOS). This forces the driver to disable the dynamic clock frequency adjustment algorithm and to run at the safe performance level set by the VGA BIOS. Note that you'll also need to correct the BIOS initialization script to set the safe performance level parameters to the desired ones.

Q: Why does RivaTuner monitor an abnormally low 1.0V core voltage on my GeForce 6800? Can you fix this bug?
A: You should look at the "Core VID" graph's axis dimension carefully. I bet that if you do, you will not find "Volts" there. Quite the opposite: you'll see that the value displayed on this graph is non-dimensional. The graph you are looking at shows raw, untransformed VID data. Refer to the questions discussed above to understand what a VID is and to find instructions on enabling RivaTuner's voltage interpretation function, allowing you to see target voltages on the graph instead of raw VID data.

Q: I've selected the "1.1V + 0.1V / 0.3V loop" VID interpretator on my ASUS V9999GT with RivaTuner's automatic VID interpretation selection function, however there is still a noticeable difference between the voltage monitored by RivaTuner (1.2V) and the voltage monitored by the ASUS SmartDoctor software (1.35V). What's the problem?
A: Address your question to the ASUS engineers, who altered the PCB's voltage control circuit but forgot to alter the VGA BIOS, storing a 6800 non-ultra specific voltage table in it instead of an altered one. For that reason, it is impossible to select the correct VID interpretation automatically. You should select the "1.1V + 0.2V / 0.3V loop" VID interpretator manually if you have such a board.

Q: I've tried to flash the ASUS V9999GT BIOS on my ASUS V9999LE, and I do see a difference in core voltage monitored by ASUS SmartDoctor (1.25V vs 1.35V with the new BIOS). However, I see absolutely no changes on RivaTuner's core VID graph. Can you fix it?
A: There is nothing to fix, and I strongly recommend ASUS V9999 owners not to flash a different model's BIOS, as it completely invalidates SmartDoctor's voltage readings. The ASUS V9999 series uses a Fintek F75373S sensor chip with 4 analog voltage inputs and 8-bit ADCs, allowing it to measure voltages up to a 2V maximum. However, some voltages (e.g. framebuffer Vddq) can be higher than 2V, which is why the sensor's input voltages are scaled down with fixed model-specific ratios. When you change the VGA BIOS, the SmartDoctor software 'sees' a different model and uses different voltage reading ratios, resulting in wrong voltage readings.
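The scaling mismatch can be illustrated numerically. The full-scale voltage and ADC width come from the description above; the raw code and the ratios themselves are made up for demonstration and are not the real board-specific F75373S values:

```python
# Illustration of the model-specific scaling described above. The ratios
# and the raw reading are invented for demonstration; actual F75373S
# divider ratios are board specific.
ADC_FULL_SCALE_V = 2.0   # 8-bit ADC, 2V maximum input
ADC_STEPS = 256

def reported_voltage(adc_code, scale_ratio):
    """Convert a raw ADC code back to the rail voltage using the fixed
    divider ratio the monitoring software assumes for this model."""
    measured = adc_code * ADC_FULL_SCALE_V / ADC_STEPS
    return measured * scale_ratio

# The same raw code interpreted with two different models' ratios yields
# two different "voltages" - which is why flashing another model's BIOS
# invalidates SmartDoctor's readings while the raw VID stays unchanged:
code = 160
v_model_a = reported_voltage(code, scale_ratio=1.0)    # ~1.25V reported
v_model_b = reported_voltage(code, scale_ratio=1.08)   # ~1.35V reported
```

Nothing about the actual rail voltage changes; only the interpretation of the raw reading does, which matches the unchanged core VID graph.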

Q: I've heard that RivaTuner can be set up to display core temperature in the system tray, however I cannot understand how to enable this function. Is it true? Can you help me?
A: Yes, you really can display the current value of any available hardware monitoring data source in the system tray icon. To do it, right-click the desired graph (e.g. core temperature), select "Setup" from the popup menu, tick "Enable ... in tray icon" and click Apply. Note that this function works only when RivaTuner is minimized to the tray (i.e. when "Send to tray on close" is enabled in the Settings tab) and when background hardware monitoring mode is enabled.

Q: RivaTuner fails to adjust gamma with Catalyst 4.2. The same applies to Adobe Gamma and other gamma correction tools I have. What's the problem, can you fix it?
A: The ATI driver team tried to make their control panel the only available gamma adjustment tool and intentionally corrupted the GDI Get/SetDeviceGammaRamp API to block all third party gamma correction tools. As a counter-measure, since version RC15 RivaTuner provides a new gamma correction mode via direct access to the ATI RAMDAC palette, which doesn't depend on ATI programmers' tricks and works with any Catalyst driver, even gamma correction locked ones. So simply select this mode to make gamma correction settings work. Now you can select the preferred tool yourself instead of using the tool ATI forces you to use.

Q: Why do I see a noticeable difference between the core temperatures monitored by RivaTuner and by the ATI Overdrive tab on my RADEON 9800XT? Does RivaTuner incorrectly read info from the sensor? Will you fix it?
A: I will not fix anything; quite the opposite, I'd suggest ATI fix their own control panel. RivaTuner displays the only core temperature that can be retrieved from the RADEON 9800XT hardware sensor, with the maximum accuracy. ATI already admitted that RADEON 9800XT boards don't have an on-die thermal diode, and the temperature is monitored by a thermistor located near the graphics processor. So the temperature displayed by the control panel's Overdrive tab is just an attempt to approximate the real on-die temperature by adding a constant 20°C offset to the real sensor's temperature. If you believe that such correction can approximate the real temperature with +/-2°C inaccuracy like ATI claims, you are free to specify a 20°C offset for temperatures monitored by RivaTuner. To do it, right-click the core temperature graph, select Setup from the popup menu, then enter a 20°C temperature offset and click OK. If you prefer to see the real sensor's temperatures instead of trying to guess the on-die temperature, just use the default settings.

Q: I know for sure that my RADEON 9800PRO has an R360 VPU, but RivaTuner detects it as R350. Can you fix it?
A: No, there is nothing to fix; it is a normal situation for ATI boards. ATI PCI DeviceIDs can be partially strapped by the VGA BIOS, so any R350 based board can be virtually turned into an R360 and vice versa by editing the strapping byte in the BIOS. There is no hardwired DeviceID available in the graphics processor's register aperture space, so you cannot read anything but the strapped DeviceID via software. Thank ATI for that.

Q: Why does RivaTuner show non-integer temperatures, as opposed to the integer values displayed in the ATI Overdrive tab?
A: Ask ATI, not me. The sensor chip they use (LM63) can monitor temperatures with 0.125°C resolution.
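The 0.125°C steps come from the way LM63-class remote sensors pack an 11-bit reading into two registers: whole degrees in the high byte, eighths of a degree in the top bits of the low byte. This is a sketch based on that common register layout; consult the LM63 datasheet for the authoritative format:

```python
# How an LM63-class sensor yields 0.125 C steps (sketch of the common
# register layout: high byte in whole degrees, top three bits of the
# low byte in eighths of a degree).
def lm63_external_temp(msb, lsb):
    """Combine the two remote-temperature register bytes into degrees C."""
    return msb + (lsb >> 5) * 0.125

# e.g. msb = 0x29 (41 C), lsb = 0x60 (011b in the top bits -> 0.375 C)
temp = lm63_external_temp(0x29, 0x60)  # 41.375
```

Any tool reading the full two-byte value, as RivaTuner does, will therefore show fractional temperatures, while a tool reading only the high byte shows integers.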

Q: As far as I know, the LM63 sensor chip used to monitor temperatures on the RADEON 9600XT/9800XT/X800 series can also monitor fan speed. Is it true? Will you add fan speed monitoring for these boards?
A: Yes, it is true. The LM63 sensor chip is really fan speed monitoring capable and contains an input pin for the fan's tachometer output. However, this pin is multifunctional and can optionally be used as an output pin for alerting when the monitored temperature exceeds programmed limits. ATI hardware seems to use only the second mode for this pin, so unfortunately it is impossible to use the LM63 to monitor fan speed on RADEON 9600XT/9800XT/X800 series boards.

Q: What is the ambient temperature displayed by RivaTuner on my RADEON 9X00XT/X800? Where do you read this value from?
A: Actually, the LM63 sensor can monitor two temperatures. The first one is the external temperature, i.e. the temperature retrieved by the IC from the ASIC's on-die thermal diode or on-package thermistor. The second one is the LM63's own internal temperature; RivaTuner calls this the ambient temperature. So to find the point on the PCB where this temperature is monitored, simply look at the board and find the LM63 IC (e.g. on RADEON 9600XT boards you may see the LM63 right near the fan's power connector). The ambient temperature is monitored exactly at that point.

Q: I've heard that RivaTuner uses driver-level access to thermal sensors on NVIDIA boards. Is it true? If yes, will you provide low-level access to the sensors in the future?
A: Yes, it is true. By default RivaTuner uses a driver-level wrapper to monitor temperatures on NVIDIA display adapters equipped with thermal sensors supported by the NVIDIA display drivers. Unlike the ATI drivers, which support just the LM63 and its clones, the NVIDIA driver supports a wide range of thermal sensor ICs and provides access to both core and ambient temperatures. It is also much safer to access the sensor via the NVIDIA driver than via low-level access to the I2C bus, because it greatly reduces the risk of I2C collisions. However, RivaTuner includes open source monitoring plugins providing low-level access to the most popular sensors used on NVIDIA display adapters (MAX6648 and LM89/LM99). If your display adapter is equipped with such a sensor, you can optionally choose the temperature provider via the data source's properties dialog.

Q: Why do I see a constant 10°C difference between temperatures monitored by RivaTuner and SpeedFan on my MSI NX6800?
A: Please read the previous question. By default RivaTuner uses safer, collision-free driver-level temperature readings instead of direct access to the sensor. The NVIDIA display driver compensates for sensor non-ideality by adding fixed, sensor-specific offsets to the temperatures retrieved from the sensor. The offset for the MAX6648, which is installed on your NX6800, is equal to 10°C. If you wish to see raw temperature readings, you may configure the temperature data source to read the temperatures from the sensor directly.

Q: I've heard that I can use RivaTuner to shut my system down when core temperature reaches the critical limit. Is it true?
A: Yes, it is true. You can use RivaTuner's hardware monitoring thresholds to perform a system shutdown in case of GPU overheating. To do it, perform the following sequence of actions:

1. Press the "Enable background monitoring" button in the monitoring window's toolbar to ensure that RivaTuner will monitor the system even when the monitoring window is closed.
2. Right click "Core temperature" graph and select "Setup" from the context menu.
3. Click "Add new threshold" button.
4. Type a name for the threshold, e.g. "Shutdown temperature".
5. Specify threshold value, e.g. 90°C.
6. Configure the path and command line for the application to launch on an upward threshold crossing. There are various ways to shut the system down via the command line; for example, Windows XP owners may use the shutdown.exe utility integrated into the OS:
6.1. Click "Browse" and specify the shutdown.exe location (e.g. C:\Windows\System32\Shutdown.exe)
6.2. Type "-s -t 0 -f" in the "Command line" edit box. These parameters specify the shutdown action (-s), zero delay before shutdown (-t 0) and force the utility to close all applications (-f). You may refer to shutdown.exe's built-in help for more info on its command line parameters.
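The behaviour configured by the steps above can be sketched as follows. This is a hypothetical model, not RivaTuner's actual code; the shutdown.exe path and the 90°C limit are just the examples used in the steps:

```python
# Hypothetical model of a hardware-monitoring threshold: launch the configured
# command only on an UPWARD crossing of the limit (not while merely staying
# above it). Path and limit mirror the example values from the steps above.
import subprocess

SHUTDOWN_CMD = [r"C:\Windows\System32\Shutdown.exe", "-s", "-t", "0", "-f"]
LIMIT = 90.0  # example threshold from step 5

def on_sample(prev_temp: float, temp: float, dry_run: bool = True) -> bool:
    """Return True (and, if not a dry run, launch shutdown) on an upward crossing."""
    crossed = prev_temp < LIMIT <= temp
    if crossed and not dry_run:
        subprocess.run(SHUTDOWN_CMD)  # -s: shut down, -t 0: no delay, -f: force
    return crossed

print(on_sample(88.0, 91.0))  # crossed the limit upward -> True
print(on_sample(92.0, 93.0))  # already above the limit, no new crossing -> False
```

The upward-crossing check is what keeps the action from firing repeatedly while the temperature stays above the limit.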

Q: I've heard about RivaTuner's NVStrap driver but I have no idea how to use it. Can you help me?
A: RivaTuner contains an NVStrap driver configuration panel for Windows 2000/XP/9x/ME, which allows you to install / uninstall the driver as well as configure the driver's parameters. Just click the device customization button on the main tab, then select the 'Customize low-level system' button in order to activate the 'Low-level system tweaks' dialog and get access to the 'NVStrap driver' tab.

Q: I cannot find the 'NVStrap driver' tab in the 'Low-level system tweaks' dialog. Any clues?
A: RivaTuner hides this tab if the NVStrap.sys driver cannot be used on your system. The presence of the 'NVStrap driver' tab depends on the following conditions:

1. NVStrap.sys file must exist in RivaTuner's Tools\NVStrap folder.
2. If you have a multimonitor system with two or more display adapters, then the primary NVIDIA GeForce256 or newer display adapter must be selected as the tweak target on RivaTuner's main tab. The NVStrap driver configures your VGA adapter before the OS kernel loads, and at that time only the primary VGA adapter is programmable. You will not be able to use the NVStrap driver if your system boots on a non-NVIDIA VGA adapter.
If you have a multimonitor system based on a dual-head display adapter, the primary display device must be selected as the tweak target on the main tab in order to install and configure the driver.

Q: Can you explain NV40 softmodding internals to me?
A: The NV40 graphics processor's internal configuration is rather flexible: the number of active pixel and vertex units is controlled by a special GPU units configuration register. The state of each pixel and vertex unit is mapped to a corresponding bit in this register; a logical 1 in the bit enables the mapped unit, while a logical 0 disables it and prevents the graphics processor from using it. This technique allows creating 16x1,6vp / 12x1,5vp / 8x1,4vp boards based on the NV40 core, so any NV40 graphics processor can easily be reprogrammed to use 1-4 pixel units (i.e. 4-16 pixel pipelines) and 1-6 vertex units by masking the corresponding bits in the configuration register. However, the GPU manufacturing process cannot give 100% yields, and chips often have some faulty units due to process non-ideality. To accommodate this fact, the graphics processor's units undergo quality testing after manufacturing, and faulty / unstable units are marked with a so-called hardware units mask, defined by strapping resistors located on the GPU package. The hardware units mask effectively locks such pixel and vertex units at the hardware level: as soon as the corresponding bit in the mask is set, the matching bit of the configuration register is hardwired to 0. The mask forces the chip to ignore any attempt to write a 1 to a masked bit of the configuration register, making unit activation physically impossible for the BIOS and for third-party software tools trying to program the configuration register. Happily, the hardwired mask itself is stored in a register which doesn't affect GPU functionality; it only allows software to see which units are bad, but doesn't actually strap anything. The real mask affecting the configuration register bits' programmability is stored in a different register, which is simply initialized by the BIOS with the value stored in the hardwired register.
So it is possible to reprogram it, allowing the graphics processor to activate all units, even if they are hardware masked.
During POST, the VGA BIOS protects masked units from being enabled using the hardwired value, then enables some or all of the non-hardware-masked units. Some vendors also tend to lock units with a so-called software units mask, i.e. the VGA BIOS simply doesn't enable a unit even if it is not hardware masked. This mainly applies to GeForce 6800 boards having only one hardware-masked pixel / vertex unit: to fit the specs, one pixel / vertex unit on boards with such chips is soft-masked in the VGA BIOS.
RivaTuner provides two modes of unlocking pixel / vertex units on NV40 GPU based boards. The first one is the safest: you may enable only software-masked units, disabled by the display adapter's vendor in the VGA BIOS for some reason (mostly to fit the graphics processor into its specs). The percentage of boards locked with software units masks is rather low, approximately 5% according to our statistics. However, the modding success rate for such boards is almost 100%, because the locked units have successfully passed hardware quality testing and are locked for marketing reasons only.
The second mode is more risky, because it allows you to force the GPU to ignore the hardware mask and activate a unit even if it is marked as bad. However, statistics show that GPU manufacturers also use hardware masking to fit some processors into specs, so you may try your luck and attempt to activate hardware-masked units. Just remember that you do it at your own risk: enabling hardware-masked units, which have not passed hardware quality testing, may cause unpredictable results including permanent hardware damage.
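The masking scheme described above boils down to a couple of bit operations. The following is a simplified model, not the real register layout; the 4 pixel-unit / 6 vertex-unit bit widths follow the text, and the bypass flag plays the role of the 'Allow enabling masked units' option:

```python
# Simplified model of the NV40 units configuration register: a set bit in the
# hardware mask hardwires the corresponding config bit to 0, unless the mask
# check is bypassed (as with RivaTuner's 'Allow enabling masked units').

def effective_units(requested: int, hw_mask: int, ignore_hw_mask: bool = False) -> int:
    """Return the unit-enable bits the chip would actually honor."""
    if ignore_hw_mask:
        return requested
    return requested & ~hw_mask

# 4 pixel units requested (0b1111 = 16 pipelines), pixel unit 0 hw-masked:
print(bin(effective_units(0b1111, 0b0001)))        # -> 0b1110 (12 pipelines stay)
print(bin(effective_units(0b1111, 0b0001, True)))  # -> 0b1111 (mask ignored)
```

The same AND-with-inverted-mask logic applies to the 6 vertex-unit bits.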

Q: How can I see whether my GeForce 6800 is softmoddable or not? Does moddability depend on the display adapter's manufacturer?
A: No, it doesn't depend on the brand. The only factor defining softmodding success is the quality of your GPU's pixel / vertex units. The chances also greatly depend on the software units mask, which may exist in your VGA BIOS. You can easily see whether your display adapter has a software units mask in its VGA BIOS by using RivaTuner's graphics subsystem diagnostic report module and looking at the "SW units mask" line in the "NVIDIA VGA BIOS information" report category. If you find something different from "none" there (e.g. pixel 0001b, vertex 000000b), it means that the graphics adapter's manufacturer tried to lock some of the graphics processor's pixel / vertex units in software. In this case you have very high chances for a successful softmod.
Otherwise, the VGA BIOS doesn't disable any units at the software level, your graphics processor's configuration is defined by the number of hardware-masked units, and everything depends on their quality. If the units are locked due to physical damage, you won't be able to enable them without seeing side effects like rendering artifacts, system instability, etc.
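Reading such a report line is straightforward: each 1 bit in the mask marks a unit the BIOS disables in software. A hypothetical helper (the "0001b" format is taken from the example above; the function name is mine):

```python
# Hypothetical parser for a "SW units mask" field such as "0001b" or
# "000000b": count the units the VGA BIOS soft-masks. The trailing 'b'
# binary notation follows the report example in the FAQ.

def count_soft_masked(mask_field: str) -> int:
    """Count 1 bits in a binary mask string like '0001b'."""
    return mask_field.rstrip("b").count("1")

print(count_soft_masked("0001b"))    # one pixel unit soft-masked -> 1
print(count_soft_masked("000000b"))  # no vertex units soft-masked -> 0
```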

Q: What will happen if I try to unlock bad pixel / vertex units? Will I see rendering artifacts?
A: It depends on the state of the 'Allow enabling masked units' option. If it is disabled, the graphics processor's logic simply will not allow you to enable damaged units. Otherwise you may indeed see rendering artifacts or experience system instability.

Q: I'm a beginner and have never used RivaTuner before, but I'd like to try the NV40 softmodding feature. Can you provide a small "how to" guide?
A: Yes. But I still strongly recommend you read the context help for all the RivaTuner options you'll be using instead of blindly following this guide. To mod an NV40, perform the following steps:

1. Run RivaTuner and look at the main tab. You'll see your current graphics processor configuration and the number of active pixel / vertex units in the device status string, for example: NV40 (A1,12x1,5vp).
2. Click the "Customize" button located to the right of the device status string to activate the device customization toolbar.
3. Click "Graphics subsystem diagnostic report" button in device customization toolbar to activate RivaTuner's diagnostic module.
4. Scroll down "Report categories" list and tick "NVIDIA VGA BIOS information" report category.
5. Click "Capture report" button in the "Report preview" window to refresh report.
6. Scroll down "Report preview" window and find "NVIDIA VGA BIOS information" manually or simply double click "NVIDIA VGA BIOS information" category name in the "Report categories" list to automatically navigate to "NVIDIA VGA BIOS information" in the "Report preview".
7. Look at the line displaying "SW units mask". If you see "none" there, it means that the VGA BIOS allows activating all non-hardware-masked GPU pixel / vertex units, so your configuration is determined by hardware. In this case, I strongly recommend you forget about softmodding if you are a beginner and don't understand exactly what you are doing; power users may proceed and try to activate hardware-masked units. If you see something different there (for example, pixel 0001b, vertex 000000b), you have high chances to unlock the software-locked units.
8. Click "Low-level system settings" button in device customization toolbar to activate "Low-level system tweaks" panel.
9. Select "NVStrap driver" tab.
10. Press the "Install" button if you have never installed the driver before. If you already have it installed from a previous version of RivaTuner, ensure that you followed RivaTuner's warning and updated the driver when RivaTuner offered to do it. If you mistakenly ignored the warning, simply press the "Reinstall" button to update the driver manually.
11. If you are a power user, have read the previous questions carefully, understand what the "Allow enabling masked units" option does and, most importantly, what side effects it can cause, enable this option. Otherwise proceed with enabling software-masked units only.
12. Select "Custom" in the "Active pixel/vertex units configuration" list to activate the "Customize" button, then click it to activate "Custom graphics processor configuration" dialog.
13. Tick all disabled pixel and vertex units and click "OK".
14. Reboot system when prompted.
15. Start RivaTuner and look again at your current graphics processor configuration and the number of active pixel / vertex units in the device status string. If you see no changes there, the units you tried to unlock are hardware masked and you have not enabled the "Allow enabling hardware masked units" option. Otherwise, if you see the desired configuration, proceed with system stability testing to ensure that the unlocked units are really fully functional.

Important note for users / display adapter sellers / reviewers testing many NV4x based boards in succession: if you're experimenting with NV40 softmodding, please ensure that you've uninstalled NVStrap or set "Active pixel/vertex pipelines" to "determined by VGA BIOS" before replacing an adapter with a new one. Otherwise you'll see the pixel / vertex units configuration set by NVStrap instead of the hardware default configuration. If you forgot to do this before installing a new adapter, simply uninstall NVStrap or click the "Reset to default" button in the "Custom graphics processor configuration" dialog, then reboot the PC to see the hardware default configuration of the new display adapter.

Q: I have a 6800 NU, and right after installation RivaTuner displayed a 12x1,5vp configuration on the main tab. On the NVStrap tab I saw that pixel unit 1 and vertex unit 3 are disabled, so I ticked them and restarted the PC. After reboot I still see 12x1,5vp on the main tab, but the NVStrap tab shows a 16x1,6vp configuration. What's going on?
A: NVStrap's status bar (displayed in the upper part of the Custom graphics processor configuration panel) shows you the default, current and target configurations. The 16x1,6vp configuration you see is the target configuration, i.e. the configuration you've asked NVStrap to set after the next reboot. The value you should look at is the current configuration, showing which pixel and vertex units are actually enabled right now. Both the context help and this FAQ clearly state that units can be hardware masked, so the current and target configurations can differ due to your attempts to activate hardware-masked units. So if you see a difference between the target and current configurations after reboot and a "disabled" state near the unit you tried to enable, it is protected by the hardware mask. In this case you may either try to enable it using "Allow enabling hardware masked units", or forget about softmodding if you don't want to risk enabling a unit which may be damaged.

Q: RivaTuner does show changes in the graphics processor's configuration on the main tab after using NVStrap, but I'd like to verify it with some other tool too. What software would you recommend?
A: First, I strongly recommend not using any other currently available diagnostic tool to detect the number of active pixel / vertex units on the NV40 family, because all of them use hardwired info indexed by chip model only and don't reflect the actual graphics processor's configuration. Info about the correct way of detecting pixel / vertex units will be passed to the Everest creators ASAP, so in the future you'll be able to use that tool to verify changes; for now I'd recommend using fillrate- and vertex-processing-limited benchmarks. 3DMark2003's Multitexturing fillrate test changes drastically as soon as you alter the number of active pixel units, and 3DMark2003's Vertex shader test is sensitive to vertex processing speed and reflects changes as soon as you alter the number of active vertex units.

Q: I know about the "SW units mask" info allowing us to tell quickly whether a board has chances for softmodding, but I simply cannot force RivaTuner to display VGA BIOS information. For some reason it displays "Cannot dump NVIDIA VGA BIOS information" in the diagnostic report. What's wrong?
A: It seems like you tried to play with NVStrap's or RivaTuner's internal PCI DeviceID override function, or flashed a VGA BIOS whose PCIR DeviceID doesn't match your display adapter's. When dumping the VGA BIOS, RivaTuner uses some emergency measures and doesn't display the VGA BIOS dump if the board's current PCI DeviceID doesn't match the PCI DeviceID stored in the PCIR header in the BIOS. To get it working, either reset all the PCI DeviceID related changes you've made in RivaTuner, or load the RivaTuner.rtd database and set DisablePCIRCheck to 1 to disable the PCIR checking emergency measure.
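For reference, the PCIR header mentioned above is the standard PCI data structure inside an expansion ROM image: the ROM starts with a 0x55AA signature, a 16-bit pointer at offset 0x18 locates the "PCIR" block, and the DeviceID sits at offset 6 within it. A sketch of such a consistency check under those standard-layout assumptions (RivaTuner's actual check may differ):

```python
# Sketch: pull the DeviceID out of a VGA BIOS image's PCIR header using the
# standard PCI expansion-ROM layout (0x55AA signature, PCIR pointer at 0x18,
# DeviceID at PCIR+6). Offsets follow the PCI spec; simplified for illustration.
import struct

def bios_pcir_device_id(rom: bytes) -> int:
    """Extract the PCI DeviceID stored in the BIOS image's PCIR structure."""
    assert rom[0:2] == b"\x55\xAA", "not an expansion ROM image"
    pcir_off = struct.unpack_from("<H", rom, 0x18)[0]
    assert rom[pcir_off:pcir_off + 4] == b"PCIR", "PCIR signature missing"
    return struct.unpack_from("<H", rom, pcir_off + 6)[0]

# Build a toy ROM image with the PCIR block at offset 0x40 and DeviceID 0x0040:
rom = bytearray(0x60)
rom[0:2] = b"\x55\xAA"
struct.pack_into("<H", rom, 0x18, 0x40)
rom[0x40:0x44] = b"PCIR"
struct.pack_into("<H", rom, 0x40 + 6, 0x0040)
print(hex(bios_pcir_device_id(bytes(rom))))  # -> 0x40
```

A mismatch between this stored DeviceID and the one the board currently reports is exactly what trips RivaTuner's emergency check.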

Q: I see "pixel 0001b, vertex 0000b" in "SW units mask", so my display adapter surely has a software units mask; however, NVStrap doesn't have any effect and I still cannot unlock additional units. What's wrong?
A: The "SW units mask" info allows you to see if your display adapter is 100% not moddable without enabling hardware-masked units, but it doesn't allow you to see if it is 100% moddable without using this option. This magic line doesn't give you any guarantees; it just tells you that the display adapter's manufacturer tried to lock some units in software, so they can potentially be unlockable. However, these units can be hardware masked too.

Q: Why do I see "N/A" for all pixel/vertex units in "HW masked" column of the NVStrap's pipeline configuration window on my GeForce 6200?
A: Currently RivaTuner provides hardware units mask decoding for NV40 and NV48 GPUs only.

Q: I've successfully unlocked 16 pixel pipelines on my GeForce 6800 and I don't see rendering artifacts in any games; however, after unlocking them I'm experiencing problems with overclocking. Both RivaTuner and the driver's control panel refuse to increase the core clock even by 1MHz. Can you fix this bug?
A: There is definitely nothing to fix, and I have to disappoint you: your mod is not successful. When you change the clocks, internal clock frequency stress testing is performed by the driver, and if you cannot increase the clocks even by 1MHz, it can only mean that the pixel / vertex units you have unlocked are not able to work properly even at the default clocks. Some online tutorials advise disabling stress testing with the "DisableClockTest" registry entry, but I strongly recommend against doing so. Remember that even if you don't see visual problems in your games after softmodding, the driver detected problems in the units' functionality during the stress test. So I don't recommend fooling yourself and hiding the problem behind disabled stress testing. You can try to boost the core voltage to make the problem disappear, but remember that this operation is extremely risky and can permanently damage your hardware.

Q: I've unlocked 16 pixel pipelines on my GeForce 6800, but I see rendering artifacts. How can I get rid of them? Can a voltage mod help me?
A: Unfortunately, the chances of getting rid of them are almost equal to zero. Rendering artifacts are an indicator of a physically damaged unit.

Q: I've successfully unlocked hardware-masked pixel / vertex pipelines and I don't experience any problems with system stability or rendering artifacts. I'm using Linux, so I'd like to finalize NVStrap's pipeline configuration via a VGA BIOS mod. Is it possible?
A: Yes, RivaTuner's distribution includes the NV40BIOSHwUnitsMaskEliminator patch script, allowing you to unlock the desired hardware-masked units at the VGA BIOS level. If you have the NVStrap driver installed, the script will ask you which units you are about to enable, so you'll be able to change any VGA BIOS image to activate either all pixel / vertex units or only the currently enabled ones. Otherwise you'll only be able to unlock all the masked units simultaneously. Please use this script with extreme caution and install it only after testing the enabled units and ensuring that all of them are 100% working.

Q: I've tried to mod my GeForce 6200 to a 6600 with NVStrap, but the driver doesn't seem to work. There are absolutely no changes in the pipeline configuration, even after ticking the "Allow enabling hardware masked units" option. Can you help me?
A: First, you should look at the graphics processor's revision. If it is NV43 revision A4 or newer, then I have to disappoint you: NVIDIA seems to have progressed with their hardware protections, so currently these chips are completely unmoddable. You won't be able to unlock hardware-masked pipelines, nor will you be able to unlock professional capabilities on them. The same applies to all revisions of NV41 graphics processors.

Q: I've installed the NVStrap driver, selected "Quadro" graphics adapter identification mode and rebooted my system. Windows successfully detected a Quadro board, but the Detonator drivers fail to install due to the 'Data invalid' error. Can you fix this bug?
A: The 'Data invalid' error is a well-known Windows 2000/XP problem, which appears on some systems during new hardware installation, and it is not related to the NVStrap driver. You can find information and a workaround for this operating system problem in The Inquirer's news archive.

Q: I've read the previous question but I still cannot fix 'Data Invalid' error on my Windows 2000 based system. Regedit.exe doesn't allow me to change permissions for registry keys. Can you help me?
A: You can change permissions for registry keys with the regedt32.exe utility, included in the Windows 2000 distribution. Just type regedt32 at the command line to start it.

Q: Windows fails to find supported driver for my display adapter after installing the NVStrap driver and selecting "Quadro" graphics adapter identification mode. Can you help me?
A: Avoid selecting the automatic "Quadro" graphics adapter identification mode for GeForce models which don't have corresponding Quadro models. The automatic "Quadro" graphics adapter identification mode assumes that your gaming display adapter has a corresponding professional model and uses the following GeForce to Quadro PCI DeviceID mapping strategy:

PCI DeviceID = PCI DeviceID | 3 for GeForce3 and older graphics processors
PCI DeviceID = PCI DeviceID | 8 for GeForce4 and newer graphics processors

If your graphics adapter doesn't fit this scheme and doesn't have a Quadro clone with a PCI DeviceID generated by the above formula, Windows will not be able to install drivers for it. In this case you must use the custom graphics adapter identification mode and select the desired Quadro PCI DeviceID manually.
However, to simplify this approach, the driver contains a special internal GeForce to Quadro PCI DeviceID mapping table, allowing you to use the automatic "Quadro" display adapter identification mode for some of these non-mapped display adapters. In this case the driver maps your GeForce to the closest Quadro model. The list of special mappings covers most of the currently available AGP GeForce display adapters and includes the 171, 172, 173, 181, 182, 301, 331, 332, 333, 334, 282, 302, 320, 321, 322, 323, 326, 327, 341, 342, 343, 344, 040, 041, 042, 045, 140, 141, and 14F PCI DeviceIDs.
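The two mapping formulas above can be written out directly. A sketch (the example DeviceIDs are well-known public ones to the best of my knowledge: 0x0201 GeForce3 Ti 200 maps to 0x0203 Quadro DCC, and 0x0250 GeForce4 Ti 4600 maps to 0x0258 Quadro4 900XGL):

```python
# The GeForce-to-Quadro mapping strategy from the answer above, expressed as
# code: OR the DeviceID with 3 for GeForce3-and-older GPUs, with 8 for
# GeForce4-and-newer ones. The family flag is a stand-in for real detection.

def quadro_device_id(device_id: int, geforce4_or_newer: bool) -> int:
    """Map a GeForce PCI DeviceID to its assumed Quadro clone."""
    return device_id | (8 if geforce4_or_newer else 3)

print(hex(quadro_device_id(0x0201, False)))  # GeForce3 Ti 200 -> 0x203
print(hex(quadro_device_id(0x0250, True)))   # GeForce4 Ti 4600 -> 0x258
```

When the result doesn't correspond to a real Quadro model, the driver's internal mapping table (or the custom identification mode) takes over, as described above.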

Q: NVStrap's graphics adapter identification option doesn't seem to be able to change the PCI DeviceID on my GeForce 6600GT AGP. Can you fix it?
A: This option affects the PCI DeviceID on native AGP/PCI or PCIE display adapters only. The PCI DeviceID of cards with an HSI AGP-to-PCIE or PCIE-to-AGP bridge is hardwired in the bridge and cannot be changed via software. However, the NVIDIA driver itself doesn't use the bridge's PCI DeviceID; it uses the GPU's model ID (which is always affected by NVStrap). So you cannot force the OS to "see" a different display adapter, but you can force the NVIDIA driver and control panel to do so.

Q: Do the previous question and answer mean that I will not be able to mod a GeForce 6x00 with an HSI AGP-to-PCIE or PCIE-to-AGP bridge to a Quadro?
A: No, in general they don't. Happily, the NVIDIA driver unlocks professional OpenGL capabilities by checking the software-overridable chip model ID instead of the hardwired display adapter's (or HSI bridge's) PCI DeviceID. So you'll get all professional caps unlocked and the driver will detect a Quadro, but the OS will still see a GeForce. However, you may indeed experience difficulties with some CAD/DCC applications, which perform direct PCI DeviceID checking and refuse to run if no Quadro PCI DeviceID is found.

Q: Can I use the NVStrap driver under Windows NT4?
A: RivaTuner doesn't support Windows NT4, but you may try to install the NVStrap driver manually. To do it, just copy the NVStrap.sys file to %WinDir%\System32\Drivers and run the NVStrap.reg file, then reboot your system. However, I've not tested the driver under this OS, so I cannot give you any guarantees.

Q: I've installed the NVStrap driver and Windows detected my GeForce as Quadro. However, the additional OpenGL settings are still not available in display properties and GLInfo still displays the OpenGL renderer name as GeForce. Any clues?
A: NVIDIA protected Detonator 30.82 and higher against the NVStrap driver. You must use RivaTuner's NVStrapAntiprotection patch script in order to use the NVStrap driver with the latest Detonators.

Q: When I install the NVStrap driver and select "Quadro" graphics adapter identification mode, my system starts responding slowly after a few minutes of work, then it completely hangs. Any clues?
A: The symptoms you are describing are the result of NVIDIA's protection against NVStrap's PCI DeviceID override, which was introduced in Detonator 30.82. When the driver detects that the PCI DeviceID was changed via NVStrap, it iteratively increases an internal delay counter and purposely spends time in internal wait loops, emulating a progressive system slowdown and finally a system hang. You must use RivaTuner's NVStrapAntiprotection patch script in order to use the NVStrap driver with the latest drivers. GeForce FX and newer display adapter owners may also use the "Use ROM straps for PCI DeviceID programming" option, which allows working around this protection without patching the Detonator / ForceWare driver.

Q: My PC doesn't resume from S3/Suspend to RAM (S4/hibernate) mode when I use NVStrap. Can you fix this bug?
A: The NVStrap driver is not compatible with S3/Suspend to RAM and S4/hibernate modes. It's not a bug; it is specific to NVStrap's stealth load-configure-terminate implementation and will never be addressed.

Q: When I used RivaTuner with my old RADEON 9800PRO, it correctly detected that it has DDR2 memory type. However, it detects DDR memory on my new GeForce 6800 Ultra, but I know for sure that it is DDR3. Is it a bug?
A: DDR / DDR2 / DDR3 memory type detection is available on ATI GPU based display adapters only. This information is simply not available in the NVIDIA GPU's register space. The SDR / DDR distinction is the only thing that can be detected, so both DDR2 and DDR3 are also treated as DDR.

Q: I cannot set some display modes after using RivaTuner's 60Hz refresh rate fix wizard for Windows 2000/XP. Any clues?
A: Don't try to use this fix if you have not installed native drivers for your monitor and Windows detects it as the 'Default monitor' or 'Plug and Play monitor'. In this case Windows will enumerate unsupported refresh rates and the fix will not function properly.

Q: I cannot use TV-out after fixing 60Hz bug with driver-level 60Hz fix wizard. What's wrong?
A: TV-out requires a 60Hz refresh rate in order to function properly. Once you've removed 60Hz from the list of supported refresh rates for some display mode, you can no longer use this mode for TV-out. If you are using the driver-level 60Hz fix wizard, simply avoid fixing 60Hz for the display modes you use for TV-out. If you need to fix 60Hz for a display mode set on your TV-out, use a different tool (e.g. RivaTuner's low-level refresh overrider or RivaTuner's low-level monitor driver wizard).

Q: I used RivaTuner's 60Hz fix wizard for fixing the 60Hz bug under Windows XP. However, I can no longer use it after installing the ForceWare driver, because the 60Hz fix wizard's button is simply missing. What's wrong?
A: The driver-level 60Hz fix wizard is purposely blocked for NVIDIA ForceWare drivers in version RC14.1 and newer. This module is not required for the new generation of drivers, because NVIDIA finally provided a unified Direct3D / OpenGL refresh override tool in the ForceWare control panel.

Q: I've read the previous question and I know about the generic ForceWare refresh override tool, but I don't like ForceWare's refresh rate override implementation. My monitor clicks during refresh rate override, and I'd like to exclude 60Hz from the list of supported refresh rates altogether instead of overriding the refresh rate. Can you please unlock the 60Hz fix wizard for ForceWare drivers too?
A: The driver-level 60Hz fix wizard is not the only module in RivaTuner allowing you to do this. RivaTuner provides three different tools which can be used for fixing the 60Hz bug using different approaches. The second tool is the low-level refresh overrider, which provides exactly the same way of fixing the 60Hz bug as the ForceWare refresh override tool. The third tool is the low-level monitor driver wizard, which allows you to remove 60Hz from the list of supported refresh rates like the driver-level 60Hz fix wizard does. The only difference is that the driver-level 60Hz fix wizard limits refresh rates from the display adapter's side, whilst this tool limits refresh rates from the monitor driver's side. To fix 60Hz, just activate the low-level monitor driver wizard, create a driver with custom minimum refresh rates for the display modes you want to fix, then simply install the driver you've created.

Q: Can I fix 60Hz on Matrox / SiS / 3dfx board with RivaTuner?
A: Yes. Two of the three 60Hz-fixing tools available in RivaTuner are vendor independent and can be used for fixing 60Hz on any display adapter. You can use either the low-level refresh overrider or the low-level monitor driver wizard module for fixing the 60Hz bug on these graphics cards.

Q: I used RivaTuner with NVIDIA graphics cards; however, after moving to an ATI graphics card I can no longer use RivaTuner to tweak Direct3D/OpenGL settings. RivaTuner fails to detect the Catalyst x.xx driver on my system and always displays 'No supported drivers detected for this display adapter'. What's wrong?
A: Nothing is wrong; it must be so. RivaTuner has never provided driver-level settings for ATI display adapters and probably never will. Driver-level tweaks are exclusively available on NVIDIA boards. However, ATI users may use RivaTuner's patch engine to install patch scripts on Catalyst / FireGL drivers. RivaTuner also provides a full set of low-level, driver-independent functions for both ATI and NVIDIA boards, so you may use the following features:

1. Overclocking module, with an overclocking profile manager allowing you to create different overclocking profiles for different applications.
2. AGP configuration module.
3. Refresh overrider module (can be used for fixing 60Hz under Windows 2000/XP via resident refresh override utility).
4. Monitor driver wizard (can be used for fixing 60Hz under Windows 2000/XP at the monitor driver level).
5. Color correction module, with a color scheme manager allowing you to create different color schemes for different applications.
6. Graphics subsystem diagnostic module.
7. Hardware monitoring module.
8. Statistics server application.

Q: How do I use RivaTuner's patch scripts? What do I do with these *.RTS files?
A: *.RTS files are intended for processing with RivaTuner's built-in patch script interpreter. RivaTuner automatically registers itself as the *.RTS file handler, so you can just run RivaTuner at least once to register the *.RTS file extension, then simply open the script you need via Windows Explorer. All the scripts are located in the PatchScripts subfolder in RivaTuner's folder. Note that a link to this folder is also automatically added to the Start menu during RivaTuner's installation.

Q: I've modded my GeForce4 Ti4600 to a Quadro4 900XGL with SoftQuadro4, but my PC hangs as soon as I start any OpenGL application. What's the problem?
A: Please read the patch script description and documentation carefully. The latest supported driver version is Detonator 42.51; newer versions are protected and not supported by the script. The symptoms you describe are caused by your attempt to patch and use a protected driver.

Q: I'd like to use ForceWare with SoftQuadro4. Do you plan to support protected drivers and update SoftQuadro4?
A: No, I don't. SoftQuadro4 development is discontinued, and the chances of seeing an updated script some day are pretty low.

Q: Is it possible to mod GeForceFX to Quadro with RivaTuner?
A: Officially it is impossible, and RivaTuner doesn't provide a documented tool for modding a GeForceFX to a Quadro. I haven't investigated NV3x to NV3xGL GPU moddability because, from my point of view, the NV3x GPU family is too weak to invest time in such research. However, some users have informed me about successful GeForceFX->Quadro modding with RivaTuner and the Detonator FX driver family. After investigating this question, I can confirm that some NV3x (presumably only NV30 / NV35) boards can indeed be modded with RivaTuner's NVStrap / NVStrapAntiprotection and SoftQuadro4 scripts under the Detonator FX (version 44.xx - 45.xx) drivers. Modding seems to be possible due to an NV3x-specific 'hole' in Detonator FX's anti-SoftQuadro protection system. The protection seems to be fixed in the ForceWare driver family, so modding any NV3x with these drivers will result in a PC lockup after starting any OpenGL application.
So you may try to use SoftQuadro4 / NVStrap with the Detonator FX drivers, but just remember that this mod is not officially provided by RivaTuner, so I cannot give you any warranties.

Q: I'd like to buy a gaming oriented display adapter, then mod it to a professional Quadro / FireGL with RivaTuner. What would you recommend buying?
A: I strongly recommend against buying a display adapter specifically for softmodding to a professional model. My SoftQuadro4 / SoftFireGL research is aimed at proving or disproving the lack of differences between the graphics processors used on gaming and professional boards. The scripts just allow you to verify my test results; they are not intended for permanent usage. As soon as I provide research results to the community, hardware vendors react immediately and protect their drivers against the scripts, because they don't want to lose money on professional board sales.
However, I physically cannot and will not update these scripts and constantly investigate / bypass new protections added by hardware vendors to their drivers. As soon as the first generation of scripts is released and GPU moddability is proven, script support can be discontinued at any time, because the protection / antiprotection chase is endless and, for me, it is just a waste of time.
So if you buy a display adapter for softmodding, be prepared for the fact that development and support of the script you are going to use may be discontinued one day.

Q: How do I install the SoftR9x00 patch script?
A: Just perform the following sequence of actions:

1. Extract the ATI driver into a temporary folder (e.g. C:\SoftR9x00\) with WinZip, or just run the driver setup to extract the driver to its default location (usually C:\ATI\Support\).
2. Run the SoftR9x00 patch script for your operating system (read the previous questions for instructions on installing *.RTS scripts).
3. The patch script window will appear. Press the <Continue> button after reading and accepting the SoftR9x00 license agreement. Depending on the OS you are using, the patch script interpreter will offer you to browse for the ati2mtag.sys or ati2xvag.vxd file.
4. If you are using a generic driver distributive with packed installation files (*.DL_, *.SY_ etc.), then select ati2mtag.sy_ or ati2xvag.vx_ in the 'Files of type' dropdown list. If you use unpacked modified ATI drivers (e.g. ripped driver packs from www.radeon2.ru), then simply leave the 'Files of type' dropdown list unchanged.
5. Select the target file in the folder where you've extracted the ATI driver. RivaTuner will patch it and display a log window. Ensure that it doesn't contain error messages.
6. Install the driver you have patched. To do so, use Device Manager to manually update the driver and specify the *.INF file for the patched driver. If you are installing the driver via its executable installer, ensure that you've removed the .cat file from the driver's subfolder, otherwise the automatic installer will use the original digitally signed packed files instead of the unpacked ones.
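The .cat removal mentioned in step 6 can be scripted. The following Python sketch is purely illustrative and is not part of RivaTuner; the extraction folder path is a hypothetical example matching step 1:

```python
import glob
import os

def remove_catalog_files(driver_dir):
    """Delete *.cat security catalog files from the extracted driver
    folder, so an executable installer cannot fall back to the
    original digitally signed packed files."""
    removed = []
    for cat_path in glob.glob(os.path.join(driver_dir, "*.cat")):
        os.remove(cat_path)
        removed.append(os.path.basename(cat_path))
    return removed

# Hypothetical usage with the extraction folder from step 1:
# remove_catalog_files(r"C:\SoftR9x00")
```

After running it, only the catalog files are gone; the patched driver files themselves stay in place for the installer to pick up.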

Q: How do I verify that SoftR9x00 is installed properly?
A: First, RivaTuner must display that 8 pixel pipelines have been successfully activated (the number of currently active pixel pipelines is displayed on the main tab, e.g. "256-bit R300 (8x1) with 128MB DDR memory"). Second, Windows must display 'RADEON 9700' in Display properties -> Adapter -> Adapter information -> Chip type. Third, the OpenGL renderer name must report RADEON 9700 too (you may verify the OpenGL renderer name with RivaTuner's diagnostic report module, with other specialized software like GLInfo or SiSoft Sandra, or simply with some OpenGL games, e.g. Quake III -> System -> Driver info). Finally, you can compare the file modification dates for %windir%\system32\drivers\ati2mtag.sys and ati2mtag.sy_ in the driver's distributive. They must be different.
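The final date comparison can be automated with a few lines of Python. This is an illustrative sketch, not a RivaTuner feature, and the paths in the usage comment are examples to be adjusted for your system:

```python
import os

def patched_driver_installed(installed_sys, packed_sy_):
    """Compare file modification times: if the patched (unpacked) file
    was really installed, its date should differ from the original
    packed file in the driver's distributive."""
    return os.path.getmtime(installed_sys) != os.path.getmtime(packed_sy_)

# Example paths (adjust %windir% and the extraction folder to your system):
# patched_driver_installed(r"C:\Windows\system32\drivers\ati2mtag.sys",
#                          r"C:\SoftR9x00\ati2mtag.sy_")
```

A True result matches the "they must be different" check above; False suggests the original packed file was installed instead of the patched one.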

Q: I did it step by step but it didn't work. Is it a bug in the script, or is my R9500 just unmoddable?
A: The only thing that can cause this is improper use of the script (i.e. you have installed the original driver). Please read the previous question carefully and verify your actions step by step.

Q: I've downloaded patched ati2mtag.sys and it worked perfectly, but I cannot get RivaTuner's script to work. What's wrong with RivaTuner?
A: It can only mean that you failed to patch and properly install the driver. Nothing more, nothing less. Please read the previous question carefully and verify your actions step by step.

Q: Is there any difference in performance between a distributed pre-patched driver and RivaTuner's patch script?
A: No. Both of them give absolutely equal performance.

Q: I've tried to apply SoftR9x00 directly to ati2mtag.sys in my windows\System32\drivers folder, but I don't see any difference after reboot. What's wrong?
A: I don't recommend patching drivers on-the-fly unless you know what Windows File Protection is and how to bypass it. When a digitally signed driver is installed, the operating system tracks changes in its files and automatically replaces them when you make any changes. Use on-the-fly patching only if you perfectly understand, and can perform, all the additional steps necessary to bypass Windows File Protection (e.g. emptying the dllcache).
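For illustration only, the "emptying the dllcache" step could look like the Python sketch below. The dllcache location is the usual Windows 2000/XP default, the filename argument is an example, and deleting Windows File Protection backups is entirely at your own risk:

```python
import os

def remove_dllcache_copy(filename, windir=r"C:\Windows"):
    """Remove the Windows File Protection backup copy of a file from
    the dllcache folder, so WFP has nothing to silently restore the
    original driver from after an on-the-fly patch.

    Illustrative sketch: windir is the usual Windows 2000/XP default
    and may differ on your system; use at your own risk."""
    cached = os.path.join(windir, "system32", "dllcache", filename)
    if os.path.isfile(cached):
        os.remove(cached)
        return True
    return False

# Hypothetical usage before patching ati2mtag.sys in place:
# remove_dllcache_copy("ati2mtag.sys")
```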

Q: I've performed a hardware mod of my R9500 to R9700 and got checkerboard artifacts. Can I fix it with your SoftR9700 script? Will it help me?
A: No, you cannot, and it will not help you. SoftR9x00 is a full software analogue of the hardware mod; it performs the same thing (i.e. forcing the PCI DeviceID at the driver level).

Q: I've modded my RADEON 9500 to 9700 with SoftR9x00 and got checkerboard artifacts. Can this issue be software related? Should I try an alternate software mod; will it help me?
A: No, it cannot be software related. Checkerboard artifacts can appear after the hardware mod and after both software mods. You may try to download a modified driver, but it will not help you. Both software mods use the driver-level PCI DeviceID forcing approach. RivaTuner's script modifies the PCI DeviceID request (ANDs and ORs some bits of the PCI DeviceID), so the driver just thinks that a regular R9700 is installed in the system. A similar technique (a lot of replaced DeviceIDs in the driver's device table) is used in w1zzard's patched drivers circulating on the net.
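To make the AND/OR technique concrete, here is an illustrative Python sketch. The mask and bit values below are made up for demonstration and are not the values RivaTuner's script actually uses:

```python
def force_device_id(reported_id, and_mask, or_bits):
    """Driver-level DeviceID forcing in spirit: clear some bits of the
    PCI DeviceID read from the hardware (AND), then set others (OR),
    so the driver believes a different board is installed."""
    return (reported_id & and_mask) | or_bits

# Made-up demonstration values, not RivaTuner's real masks: turn a
# reported 0x4144 into 0x4E44 by clearing the second nibble (AND with
# 0xF0FF) and setting the 0x0E00 bits (OR).
assert force_device_id(0x4144, 0xF0FF, 0x0E00) == 0x4E44
```

Because the transformation is a fixed bit operation applied to the ID request, every mod built on it (script or pre-patched driver) produces the same result, which is why switching between them cannot change the outcome.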

Q: However, I've heard that some people have different results using different mods. Can you explain it?
A: Some people are just trying to talk about things that are beyond their knowledge. It is absolutely impossible.

Q: I cannot install RivaTuner's SoftR9x00 script on the Omega drivers. What's the problem?
A: Both the NVIDIA and ATI Omega drivers are already patched with RivaTuner's scripts. There is no need to install the scripts on them.

Q: Can you please create your own pre-patched drivers based upon the original Detonator / ForceWare / Catalyst / FireGL drivers? I'm not an experienced user, and pre-patched driver installation is much simpler for me than manual driver patching.
A: No. Sorry, but I will never provide such drivers to the community. First, there are already some RivaTuner script based driver packs circulating on the net (e.g. Omega drivers, Forsage etc.). Second, I totally disfavor the idea of distributing patched drivers. It's just a question of principle and programmer's ethics. Being a professional programmer myself, I can say that I definitely wouldn't like to see somebody modify and then distribute my code. And I don't want to spring the same mine on other developers.
The only legal and moral way I see is an external patch utility, allowing you to modify the code yourself. During script installation you have to read and accept a license, which warns you that the code is modified by a third party, so it may not work as intended by the original code's creators.

Q: Where can I post RivaTuner related questions?
A: First, you can post your questions in the thematic discussion forum on the official RivaTuner support website www.guru3d.com. I browse this forum almost every day, so it is most likely that I'll address your questions there. Second, you can post your questions via email. Either way, please ensure that you have completely read this file and RivaTuner's context help, and have searched the forum, before you post your question. Questions that have already been discussed there will be ignored.
Also, please note that I've left the NVIDIA World team and no longer respond to any questions in the www.nvworld.ru forum. You can still get rather qualified help there, but don't expect that all RivaTuner related replies will be 100% technically correct.

Q: Can you teach me how to tweak my display adapter?
A: Please don't bother me with such questions. Any non-RivaTuner-related system tuning questions and help requests will be ignored.

Q: Why do you answer 'RTFM' so often? Is it so difficult to give a detailed answer instead of such a rough reply?
A: Yes, it is very difficult for me. I have neither the wish nor the time to reply to the same questions daily. RivaTuner's technical support via email/forums eats a lot of my time, and I don't want to waste it. So don't feel aggrieved if you've got such a reply. Just read the documentation and you'll surely find the answer to your question there.

Q: Where can I download the latest versions of RivaTuner?
A: You can download it from the official RivaTuner's hosting partners: www.nvworld.ru and www.guru3d.com.

Q: Where can I download a localized version of RivaTuner? I don't understand English and I'd like to see a Russian interface.
A: I'm not going to implement multilanguage support in the near future. Please don't spam me with localization related questions (this mostly refers to Russian teenagers); I will not reply.


RivaTuner is a single man freeware project, developed and existing only due to enthusiasm. However, it wouldn't be possible to support such a wide range of display adapters without actual hardware samples for testing. Currently RivaTuner's development lab is equipped with:

AGP NVIDIA GeForce4 Ti4600
AGP NVIDIA GeForceFX 5950 Ultra
AGP NVIDIA GeForce 6800

Thanks to Andrew Worobiew @ iXBT / Digit-Life, Patrick Kan, Guennadi Riguer @ ATI and Peter Yeung @ HIS for providing these samples for testing. If you want to extend the range of supported hardware, wish to add features specific to different display adapters, and can donate any display adapters not listed above, please contact me via email.


Alexey Nicolaychuk aka Unwinder, RivaTuner programming, design, NVIDIA databases, NVIDIA / ATI patch scripts