Screen.resolutions doesn't work very well
Hi everyone! I'm making an FPS with some friends, and today I ran into a big problem with the Screen.resolutions property. I'm using this function to check whether a given resolution is supported:
bool supportedRes(int w, int h)
{
    bool found = false;
    int k = 0;
    // Log every resolution the display reports, and check for a match.
    foreach (Resolution r in Screen.resolutions)
    {
        Debug.Log("Index " + (k + 1) + ": " + r.width + "x" + r.height);
        if (r.width == w && r.height == h)
            found = true;
        k++;
    }
    return found;
}
I think that's fine, but when I start the game, Debug.Log reports these supported resolutions:
Index 1: 640x480
Index 2: 480x60
Index 3: 60x800
Index 4: 1024x768
Index 5: 720x60
Index 6: 60x1280
Index 7: 1280x960
Index 8: 1024x60
Index 9: 60x1366
Index 10: 1440x900
Index 11: 900x60
Index 12: 0x0
Index 13: 0x0
Index 14: 0x0
It's very strange. I think the UnityEngine.Resolution values are mixing their fields up with those of other entries: for example, at Index 5 I get 720x60(x1280), but it should be 1280x720(x60). And what about those bogus indexes 12, 13 and 14? And why do I get 900x60(x0) at Index 11 when it should be 1600x900(x60)?
I hope you can help me. Greetings!
Sorry for my bad English; I'm a young Italian guy :D
Did you try the code here? http://docs.unity3d.com/Documentation/ScriptReference/Screen-resolutions.html
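For comparison, here is a minimal sketch based on the enumeration pattern from that manual page (the class name and log formatting are my own). It also prints each entry's refreshRate field, which should make it obvious if refresh rates are what's bleeding into the width/height values you see:

using UnityEngine;

public class ResolutionLogger : MonoBehaviour
{
    void Start()
    {
        Resolution[] resolutions = Screen.resolutions;
        // Print width, height and refresh rate for every mode the display reports.
        for (int i = 0; i < resolutions.Length; i++)
        {
            Resolution r = resolutions[i];
            Debug.Log("Index " + (i + 1) + ": " + r.width + "x" + r.height + " @ " + r.refreshRate + "Hz");
        }
    }
}

If the "@ ...Hz" part lines up with the stray 60s in your log, the display driver is reporting valid modes and the values were just being misread, not corrupted.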