"-adapter" Unity player command line does not work
The "-adapter" command line argument that is still in the docs does not work any more. It used to work in older builds of my same project. It seemed to work when I was building with DirectX 9 and stopped working in DirectX 11 and DirectX 12. Since DirectX 9 support has been removed since Unity 2017.3, that isn't even an option anymore. Is this a bug? Is there some new way to select which GPU a Unity game will use on a desktop with multiple graphics cards and monitors? It seems crazy to me that such a major game engine would not provide some way to select which graphics card to use.
Answer by radiatoryang · Nov 12, 2021 at 12:13 AM
Update for anyone who reads this in 2021: the argument is now -force-device-index #, where # is the computer's GPU index starting from 0. It seems to be available in 2019 LTS, 2020 LTS, and 2021.
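For example, launching a standalone Windows build on the second GPU (index 1) would look like the line below; "MyGame.exe" is just a placeholder for your build's executable name:

    MyGame.exe -force-device-index 1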