[GUIDE] How to undervolt your GPU - WINDOWS ONLY

I am not responsible for any damage to your equipment; you alone are at fault if something goes wrong! UNDERVOLTING, UNLIKE OVERVOLTING, IS UNLIKELY TO CAUSE ANY DAMAGE. HOWEVER, PROCEED WITH UTMOST CAUTION, SINCE THIS IS NOT SOMETHING ANY MANUFACTURER IS LIKELY TO SUPPORT YOU WITH IF ANYTHING FAILS! INCREASING VOLTAGE ABOVE THE DEFAULT VALUE CAN PHYSICALLY DESTROY YOUR GPU AND THEREBY POTENTIALLY SHORT-CIRCUIT YOUR ENTIRE SETUP!

Information: I will be using a desktop PC in the screenshots in this tutorial. The process remains the same for eGPU setups.

Introduction

Spoiler

The theory behind undervolting

Some of you may have already heard the term "undervolting" in the past few years. The idea behind this fairly widespread concept is to reduce the power consumption, and thereby the heat output, of computer components like CPUs and GPUs. The theory behind reducing power consumption by reducing voltage lies in what electric power is, and can be explained by how it is calculated:

P = I*U

P is the electric power, I the current and U the voltage.
To keep it simple, we will assume the current to be constant. If we want to reduce the power, we then only have one option: reduce the voltage. The logical idea now would be to go as low as possible, approaching 0 V as closely as we can, right? In principle yes, but we are limited by how semiconductors work. I won't go into detail on this topic here, since that's not what this guide is about. The short version is that the semiconductors in our PCs need to be supplied with a certain minimum voltage to function properly (or at all, if you go low enough). This limitation is also the reason why manufacturers set the voltage higher than necessary: every piece of electronics varies in its properties, and so does its minimum voltage requirement. The easiest way for manufacturers to work around this variance is to choose a voltage for mass-produced components that is high enough to be safe for all of them. This is what we can use to our advantage: we can find a custom, lower voltage that works for our specific piece of hardware.
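To make the arithmetic concrete, here is a minimal Python sketch of the simplified P = I*U model above. The current and voltage figures are made-up illustration values, not measurements from any real card:

```python
# Simplified model from the theory above: P = I * U, with current held constant.
def power_watts(current_a: float, voltage_v: float) -> float:
    return current_a * voltage_v

CURRENT_A = 140.0                      # hypothetical, constant core current
stock = power_watts(CURRENT_A, 1.050)  # hypothetical stock voltage: 1.050 V
tuned = power_watts(CURRENT_A, 0.950)  # hypothetical undervolt:     0.950 V

saving = 1 - tuned / stock
print(f"{stock:.0f} W -> {tuned:.0f} W ({saving:.1%} saved)")
```

In practice the saving tends to be even larger than this linear model suggests, since a chip's dynamic power scales roughly with the square of the voltage.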

Why do you want to do this at all?
That's a good question with an easy answer: eGPU enclosures often place our graphics cards snugly next to a beefy power supply, or simply have no case fans or bad airflow in general. By undervolting, we reduce the amount of electrical energy that is converted into heat inside our enclosures. This means potentially lower temperatures inside our little power boxes, which in turn leads to potentially reduced component wear and, because of a feedback loop (higher temperatures mean higher electrical resistance, which means higher power consumption, which means more heat), further reduces overall power consumption in the long run. We also reduce the power we draw from the power supply, thereby also reducing the heat the power supply itself dissipates (as lost electrical energy) inside our enclosure.

Here is a little peek into how much less my Gigabyte GTX 1070 Mini OC uses before and after undervolting:

HWiNFO readings before undervolting

HWiNFO readings after undervolting

Let's get this started.

Preparations

To undervolt your GPU (yes, this is an eGPU forum, so this guide is about GPU undervolting) you need to meet some minor requirements. For Nvidia GPUs I don't know exactly which chips will work. I have only heard of the 10 series being supported, but nothing below it; it's very likely that anything after the 10 series will also work, but I can't test this because I don't own any newer graphics card. For AMD owners, undervolting is a lot easier than it is for Nvidia users. While I did some undervolting myself on an RX 570, I don't know which GPUs support it at all, and I also can't supply any pictures because I no longer own the card. However, I will try to describe the process as well as I can recall it from memory. Feel free to add pictures and correct my steps in this thread.

Spoiler

NVIDIA

After making sure our GPU can be undervolted (check a search engine of your choice) we need to get you some software to get the process going. In this tutorial I will be using MSI Afterburner which can be downloaded here (direct link from MSI).
Once the download has finished, unzip the archive and run the contained .exe to install the application. Launch the freshly installed application and you will be greeted with something similar to this (the theme can be changed in the settings):

Once Afterburner is running (and the currently active application), press Ctrl+F to open the "Voltage/Frequency curve editor":

When I first saw this curve, it looked terribly complicated to use, and I actually had to look for a video on YouTube on how to use it (kudos to the creator; sadly I can't remember which video it was). Once you get the hang of it, it's much easier to use, and everything it tells you makes absolute sense. Unlike AMD's approach, this graph shows the voltage on the x-axis and the frequency on the y-axis. Voltage is our GPU core voltage, the one we want to change, and frequency is our GPU clock. Before you start fiddling with the curve, we need to find out which frequency your GPU is most likely to run at most of the time.
To do this, we need to put some load on it. I will be using Tom Clancy's The Division 2 in this tutorial (you can choose any game/benchmark that will not result in a CPU bottleneck; we need to push the graphics card to its limits).
With the game running and pushing the GPU into definitely not limited territory (preferably in windowed mode so you can read the information from Afterburner), we can now check which frequency the GPU is running at in MSI Afterburner:

In my case, the frequency is 1847 MHz. Now find the frequency we just determined on the y-axis (left side) of your voltage/frequency curve editor. Once you have it, follow the frequency to the right until it meets the curve; the point of intersection should be aligned with the voltage that the main window of MSI Afterburner shows on the right-hand side while you check the frequency. Once you have found that point, select every point after it by holding Shift while clicking and dragging, starting just left of the first point after our target frequency.

With all irrelevant points selected, click one of them and drag them below the target. (Don't worry, dragging multiple points at once can be buggy, but that doesn't matter.)

To lock in the changes, click the apply button (the tick icon) in Afterburner's main window. The points you just changed in the curve editor should now form a straight line close to our desired frequency.
Next we need to find the lowest voltage our target frequency will work at. For this, we will make an assumption and pick a starting voltage of 950 mV. Find 950 mV on the x-axis and drag the dot aligned with it up to the desired frequency (in my case, the area of 1847 MHz) - you don't have to be spot on; just get it in the approximate area and you should be fine. Now, like before, select all points after the 950 mV one and drag them below 1847 MHz. Click apply in Afterburner's main window.

Now go ahead and see if the newly set voltage results in a stable clock by running your game. If it does, you can continue moving to the left, going as low as possible. Instability can take many forms; the most common one for me is usually a simple game crash followed by a message that the graphics device was lost, or similar. If that happens, the driver should usually recover on its own; if it doesn't, just do a quick reboot and pick a higher voltage for the target frequency. For this, just go the other way around and drag a point to the left of the desired voltage below the target frequency.
Once you have found the lowest possible voltage, it's advisable to go one step higher to make sure it really is stable (the gain from going that one step lower usually isn't worth the risk of instability). For me, the final curve on my 1070 looks like this:

Like I said earlier, this curve varies from GPU to GPU, so my curve might not work for you, or you may be able to go even lower to achieve a stable setting.

When you are done with testing and have found a good setting, you can save the profile in MSI Afterburner (so you don't have to redo it on every boot). To save a profile, make sure the lock over "profile" (below the voltage and temperature gauges) is unlocked, click the save icon to the right and select one of the blinking numbers 1-5. After saving the profile, you can also enable the "Startup" button on the left; this will make Afterburner apply the frequency and voltage settings when Windows boots (you need to have MSI Afterburner set to run on Windows startup for this to work).

Congratulations, you have successfully undervolted your GPU!

Spoiler

AMD

While looking for some pictures to use for the AMD part, I discovered that the folks at AMD actually seem to care and extensively explain how their driver features work. Yes, you read that right – as I mentioned earlier, AMD users have a much easier life when it comes to modifying their GPUs compared to Nvidia owners.

AMD includes all major functionality of MSI Afterburner right in their Adrenaline drivers, so you don't need to download any additional software and also don't need to have that software running in the background at all times.

The driver feature we want to use is called "Radeon WattMan".

To access Radeon WattMan, open your Radeon control panel. The easiest way, which should be enabled by default, is to open the context menu on your desktop (right-click) and click AMD Radeon Settings. You will then be greeted with a screen like this:

You will want to select "Gaming" as highlighted in the screenshot. The next screen will display software the driver recognized on its own and lets you configure a specific profile for each game or other graphics-intensive software you may be running. You can also manually add an .exe for software the driver didn't recognize by pressing "Add" in the top right corner. Since we want to apply the undervolt to every application, we will ignore this step and just go with the "Global Settings" option instead. You will get a warning message; read it CAREFULLY and accept it if you agree with what it tells you.

Here is a quote from the official WattMan manual (I highly recommend keeping this in mind):

NOTE! AMD PRODUCT WARRANTY DOES NOT COVER DAMAGES CAUSED BY OVERCLOCKING, EVEN WHEN OVERCLOCKING IS ENABLED VIA AMD HARDWARE AND/OR SOFTWARE. DO NOT ACCEPT THESE TERMS UNLESS YOU AGREE TO VOID YOUR WARRANTY.

Once you have accepted that you are the only one responsible for what you do in these settings, you will see a number of options and pieces of information on the screen. In the case of my RX 570, the graph displaying all current information was unusable and mostly quite inaccurate, especially for finding a suitable voltage.

The parts we are actually interested in are the frequency and voltage sections.

I couldn't quite find a proper method to apply the voltage back then. It seems that at least some AMD chips will ignore specific voltages they believe to be too low to properly work, so keep an eye on the power draw in the "Performance Monitoring" section of your "Global Settings" while testing the new voltages. If the power draw with changed voltage doesn't differ from the default setting at all, the voltage change didn't apply!

Spoiler

Radeon VII and Radeon RX5700 series

For Radeon VII and Radeon RX 5700 series graphics cards, this section differs slightly from the older cards. Since I have never used any of these newer cards, all I can provide here is guesswork based on what I see in the screenshots from AMD!

First, you want to change the mode from "Automatic" to "Manual". The procedure here appears to be similar to the one used by MSI Afterburner for Nvidia chips. The only differences seem to be the number of default setup points and the swapped x and y axes (honestly, this layout looks more like what I originally expected when I opened the curve editor in Afterburner for the first time).
It appears that moving the right bracket lets you change the frequency, while moving the orange dots modifies the voltage tied to the specific frequency at the bottom. Since we want to reduce power consumption, I advise you to stick with the default clock (it's also much easier to find a suitable voltage without going beyond the default frequency).

Spoiler

R7 260 and newer, but older than the cards mentioned above

In case of older Radeon cards, the section you need to look out for in WattMan looks like the following:

Ignore the Frequency (%) highlighted in the screenshot and leave it at 0%.

In this case, we want to edit the voltage applied to the clocks in the different GPU states of our chip. To make this happen, switch "Voltage Control" from "Automatic" to "Manual". Read on below.

If you set the frequency of your highest state below the frequency of the preceding states, you also need to reduce the voltage of those preceding states in order for the voltage setting to be applied, otherwise the GPU will likely just use the higher voltage of the lower state!
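This "lower states must not end up above the top state" rule can be expressed as a tiny helper. Here is a hedged Python sketch; the state table below is invented for illustration and does not reflect real per-state values of any card:

```python
# Each state is (clock_mhz, voltage_mv), ordered from lowest to highest state.
def clamp_lower_states(states):
    """Cap every lower state's voltage at the top state's voltage, so the
    undervolt applied to the highest state actually takes effect."""
    top_mv = states[-1][1]
    return [(clock, min(mv, top_mv)) for clock, mv in states]

# Hypothetical table where only the top state was undervolted to 900 mV:
states = [(300, 800), (1000, 950), (1168, 1000), (1244, 900)]
print(clamp_lower_states(states))
```

After clamping, no lower state sits above 900 mV, so the GPU can no longer fall back to a higher voltage from a preceding state.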

As you may have read in the theory section, you can't just jump down to any value that sounds like your lucky number. You need to reduce the voltage in steps and check stability at every step. To begin, I recommend checking which state/frequency your GPU is most likely to enter with the software you plan to use to put maximum load on the chip (read below). Once you know the state (likely the highest one), you can start editing the voltage. To speed things up a little, take off 80-100 mV for the first one or two steps. If it's stable, go on. If not, go back to the last stable value and reduce the step size (e.g. 10-30 mV). I just mentioned you need to check for stability; by this, I mean you need to put some load on the GPU. The - in my opinion - best way of doing this is with a real-world test rather than a benchmark. Pick a game of your choice that you are sure can max out the GPU load without being bottlenecked by your CPU. Let it run for a few minutes and walk around/change the scene a bit so you can also "simulate" small changes in GPU load.
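The coarse-then-fine search described above can be sketched in a few lines of Python. The is_stable stub stands in for your real load test (running a game for a few minutes), and the 868 mV "true" limit is an invented number purely for illustration:

```python
def is_stable(voltage_mv: int) -> bool:
    # Stand-in for a real stability test; we pretend this particular chip is
    # stable at 868 mV and above. Replace this with an actual load test.
    return voltage_mv >= 868

def find_lowest_stable(start_mv: int, steps=(100, 25)) -> int:
    """Walk down in coarse steps first, then refine with a smaller step,
    always stopping at the last voltage that passed the stability test."""
    v = start_mv
    for step in steps:
        while is_stable(v - step):
            v -= step
    return v

print(find_lowest_stable(1150))  # lowest voltage the search settles on
```

As noted above, you would then bump the result up by one fine step as a safety margin rather than running at the very edge of stability.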

Save your profile to make sure you won't lose your setup once you're finished, and maybe also every now and then between testing steps in case you get a BSOD instead of a driver crash.

If everything went well, you have successfully undervolted your GPU, congratulations!

In case you have any questions, feel free to ask and I will try my best to give you an answer!

This is great stuff especially for those new to GPU tinkering! In my opinion though, due to the general efficiency of NVIDIA architectures, in most cases, people tend to increase power and overclock instead to achieve some notable performance uplift (though equal performance at lower watts is also uplift in a sense). Nonetheless, appreciate the insight here. Undervolting is key on AMD scalable architectures such as Vega, so I’m looking forward to that section!

@mac_editor while my experience with AMD on the RX570 (as well as the comparison of TDPs on their websites) seems to confirm the claim of more efficient architectures on Nvidia's chips, I would still like to point out that the reduction in power consumption for me is about 25% while keeping almost the same clock speed and therefore performance. This definitely doesn't say much about overall efficiency, and it doesn't tell us who took the safer route in making sure all chips work (Nvidia or Gigabyte), but it still is a massive drop in energy consumption. To be fair, on my GTX 1060 3GB the difference isn't as big as on the 1070, and I also have to admit that I didn't expect it to make this much of a difference on the 1070 to begin with.

I also just added a piece of information on the AMD part – I hope this is enough to cover everyone's needs.

I just finished the AMD section and updated the post (also added some readability improvements by using spoiler blocks).

The screenshots used in the AMD section are from the official WattMan manual from AMD found here. If an AMD employee reads this and sees this as copyright infringement, please let me know so I can remove the screenshots from the post!