I have a powerful Nvidia GPU, an old and slow AMD GPU, and a motherboard with two PCIe slots.

I also have an old, obscure video game that has graphical glitches on Nvidia cards. It's a rhythm game that requires tight audio/video sync, so running it in a virtual machine is out of the question. It works fine on my old AMD card.

What is the easiest way to switch to my AMD GPU before running this specific game? Anything quicker than opening up my computer and physically swapping the cards would be an improvement.

1 Answer

In Windows 10, the graphics card connected to the monitor set as your main display is the one used to render the game. If you have two monitors, one connected to each graphics card, select "Make this my main display" on the monitor attached to the AMD card; any game you launch afterwards will use the AMD GPU. When you want to go back to the NVIDIA card, set the monitor attached to it as the main display, and the next game or application you launch will use the NVIDIA GPU.
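
If you end up toggling often, the main-display switch can be scripted. Below is a minimal sketch, assuming Python 3 on Windows: it calls the documented Win32 APIs EnumDisplayDevices, EnumDisplaySettings, and ChangeDisplaySettingsEx through ctypes to make a chosen monitor the main display, then launches the game. The display name (\\.\DISPLAY2 for the AMD-attached monitor) and the game path are assumptions; adjust them for your own setup.

```python
# set_primary.py - sketch: make a chosen monitor the main display, run the
# game, then restore the old main display. Windows only.
import ctypes
import subprocess
from ctypes import wintypes

user32 = ctypes.windll.user32

ENUM_CURRENT_SETTINGS = -1                       # EnumDisplaySettings mode
CDS_UPDATEREGISTRY = 0x00000001                  # ChangeDisplaySettingsEx flags
CDS_SET_PRIMARY = 0x00000010
CDS_NORESET = 0x10000000
DISPLAY_DEVICE_ATTACHED_TO_DESKTOP = 0x00000001
DM_POSITION = 0x00000020                         # DEVMODE dmFields bit

class DISPLAY_DEVICE(ctypes.Structure):
    _fields_ = [("cb", wintypes.DWORD),
                ("DeviceName", wintypes.WCHAR * 32),
                ("DeviceString", wintypes.WCHAR * 128),
                ("StateFlags", wintypes.DWORD),
                ("DeviceID", wintypes.WCHAR * 128),
                ("DeviceKey", wintypes.WCHAR * 128)]

class DEVMODE(ctypes.Structure):
    # Truncated DEVMODEW layout; the union that holds dmPosition is flattened.
    # The APIs honor dmSize, so the missing trailing fields are not a problem.
    _fields_ = [("dmDeviceName", wintypes.WCHAR * 32),
                ("dmSpecVersion", wintypes.WORD),
                ("dmDriverVersion", wintypes.WORD),
                ("dmSize", wintypes.WORD),
                ("dmDriverExtra", wintypes.WORD),
                ("dmFields", wintypes.DWORD),
                ("dmPositionX", wintypes.LONG),
                ("dmPositionY", wintypes.LONG),
                ("dmDisplayOrientation", wintypes.DWORD),
                ("dmDisplayFixedOutput", wintypes.DWORD),
                ("dmColor", ctypes.c_short),
                ("dmDuplex", ctypes.c_short),
                ("dmYResolution", ctypes.c_short),
                ("dmTTOption", ctypes.c_short),
                ("dmCollate", ctypes.c_short),
                ("dmFormName", wintypes.WCHAR * 32),
                ("dmLogPixels", wintypes.WORD),
                ("dmBitsPerPel", wintypes.DWORD),
                ("dmPelsWidth", wintypes.DWORD),
                ("dmPelsHeight", wintypes.DWORD),
                ("dmDisplayFlags", wintypes.DWORD),
                ("dmDisplayFrequency", wintypes.DWORD)]

def attached_displays():
    """Yield (device_name, current DEVMODE) for each display on the desktop."""
    i = 0
    while True:
        dd = DISPLAY_DEVICE()
        dd.cb = ctypes.sizeof(dd)
        if not user32.EnumDisplayDevicesW(None, i, ctypes.byref(dd), 0):
            break
        if dd.StateFlags & DISPLAY_DEVICE_ATTACHED_TO_DESKTOP:
            dm = DEVMODE()
            dm.dmSize = ctypes.sizeof(dm)
            user32.EnumDisplaySettingsW(dd.DeviceName, ENUM_CURRENT_SETTINGS,
                                        ctypes.byref(dm))
            yield dd.DeviceName, dm
        i += 1

def set_primary(target):
    """Make the display named `target` the main display.

    Windows defines the primary monitor as the one at (0, 0), so every
    display is shifted by the target's old offset, and the target also
    gets the CDS_SET_PRIMARY flag."""
    displays = list(attached_displays())
    dx, dy = next((dm.dmPositionX, dm.dmPositionY)
                  for name, dm in displays if name == target)
    for name, dm in displays:
        dm.dmPositionX -= dx
        dm.dmPositionY -= dy
        dm.dmFields |= DM_POSITION
        flags = CDS_UPDATEREGISTRY | CDS_NORESET
        if name == target:
            flags |= CDS_SET_PRIMARY
        user32.ChangeDisplaySettingsExW(name, ctypes.byref(dm), None, flags, None)
    # A final call with all-NULL arguments applies the staged changes at once.
    user32.ChangeDisplaySettingsExW(None, None, None, 0, None)

if __name__ == "__main__":
    set_primary(r"\\.\DISPLAY2")                  # assumption: AMD monitor
    subprocess.run(r"C:\Games\RhythmGame\game.exe", check=False)  # hypothetical
    set_primary(r"\\.\DISPLAY1")                  # restore the NVIDIA monitor
```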

Something I discovered: depending on how the game handles display changes, if the game does not crash when the main display is switched, it keeps using the GPU it started on, even if you switch the main display while the game is still open. Using this method, you can launch one game under one main display so it uses one GPU, switch the main display to a monitor driven by the other GPU, then launch a second game on that GPU. That way, two separate games or applications can each use their own dedicated GPU, as sketched below.
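
Reusing the set_primary helper from the sketch above, that two-game arrangement might look like this (display names and game paths are again hypothetical):

```python
# Two games pinned to two GPUs via main-display switching. Reuses the
# set_primary() helper from the sketch above; paths and names are hypothetical.
import subprocess

set_primary(r"\\.\DISPLAY2")          # AMD-attached monitor becomes main
amd_game = subprocess.Popen(r"C:\Games\RhythmGame\game.exe")

set_primary(r"\\.\DISPLAY1")          # NVIDIA-attached monitor becomes main
nvidia_game = subprocess.Popen(r"C:\Games\OtherGame\other.exe")

# Each game keeps rendering on whichever GPU was primary when it launched,
# as long as it survives the display switch (the caveat described above).
nvidia_game.wait()
amd_game.wait()
```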

(If someone figures out a way to tell Windows which GPU to use for which game without resorting to hacks like switching the main display, please let me know, because I want to know too.)