Switching graphics cards makes Executable run in small window instead of full screen
Hi,
I've created a game in Unity 4.5.3f3 that I'm running under Windows 7 on an all-in-one PC. The default graphics card is an Intel(R) HD Graphics (not sure how to find more info about it), which is too slow to run the game smoothly. However, when I switch to the built-in AMD Radeon HD 7600A graphics card using AMD's 'Configure Switchable Graphics' option (right-click on the desktop, then select 'High performance' for the game's executable), the game runs in a small window instead of full screen. I didn't have this problem in the past with the same machine and game, so my best guess is that it is related to the upgrade from Unity 3.x.
Here are some of my 'build settings'/'player settings' in Unity:
Default is Full Screen: [x]
Default is Native Resolution: [x]
Display Resolution Dialog: Disabled
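With the resolution dialog disabled, one possible workaround (not from the original post, just a sketch) is to force the player back to fullscreen from a startup script. `Screen.fullScreen`, `Screen.currentResolution`, and `Screen.SetResolution` are standard Unity scripting APIs available in Unity 4.x; the class name `ForceFullscreen` here is made up for the example:

```csharp
using UnityEngine;

// Hypothetical workaround: attach to an object in the first scene.
// If the driver's graphics switch launches the player windowed,
// this forces fullscreen at the desktop resolution on startup.
public class ForceFullscreen : MonoBehaviour
{
    void Start()
    {
        if (!Screen.fullScreen)
        {
            // Screen.currentResolution reports the desktop resolution
            // while the player is windowed.
            Resolution native = Screen.currentResolution;
            Screen.SetResolution(native.width, native.height, true);
        }
    }
}
```

This doesn't explain why the switch to the AMD card changes the startup mode, but it may restore the expected behavior until the underlying cause is found.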
Any help appreciated. I'm a bit lost on how to attack the problem. Has anyone seen something similar?
PS: Display drivers are up to date according to Windows.