Why do my sprites get distorted in fullscreen mode?
I am trying to get pixel-perfect 2D rendering, and I have a script in the scene that sets the orthographicSize of the main camera like this:
void Start () {
    float pixelToUnits = 100;
    // Divide by 2f so odd screen heights don't truncate via integer division
    float cameraSize = (Screen.height / 2f) / pixelToUnits;
    Camera.main.orthographic = true;
    Camera.main.orthographicSize = cameraSize;
}
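For reference, the arithmetic behind that formula can be checked outside Unity. A minimal sketch in Python (the 100 pixels-per-unit value matches the script above; the screen heights are just example values):

```python
def ortho_size(screen_height_px, pixels_per_unit=100):
    """Orthographic half-height in world units for a pixel-perfect camera.

    orthographicSize is half the vertical view in world units, so a camera
    showing screen_height_px pixels at pixels_per_unit pixels per unit
    needs (screen_height_px / 2) / pixels_per_unit.
    """
    return (screen_height_px / 2) / pixels_per_unit

# A 600-pixel-tall window at 100 PPU needs orthographicSize 3.0;
# a 1200-pixel-tall monitor needs 6.0.
print(ortho_size(600))   # 3.0
print(ortho_size(1200))  # 6.0
```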
If I run this in the editor it all looks good, and it also looks good on my iPad, but at some resolutions (e.g. 800x600) my sprites get stretched wider. I think this is because that resolution has a different aspect ratio than my monitor (the monitor's native resolution is 1920x1200, which gives 1.6, while 800x600 gives 1.33). Am I right?
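That reasoning checks out numerically: if the monitor stretches a 4:3 fullscreen image across its 16:10 panel, everything is widened by the ratio of the two aspects. A quick sketch of the arithmetic (in Python, using the resolutions from the question):

```python
def aspect(width, height):
    """Aspect ratio of a resolution."""
    return width / height

native = aspect(1920, 1200)  # the monitor's native aspect: 1.6
render = aspect(800, 600)    # the fullscreen render aspect: ~1.333

# Horizontal stretch factor when 4:3 content fills a 16:10 panel.
stretch = native / render
print(round(native, 3), round(render, 3), round(stretch, 3))
```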
Is there a good way to fix this so my sprites never get stretched at any resolution?
P.S. I asked this in the forums also, but I think it belongs here.
Thank you Søren
What tools are you using? (Just to help figure out where the problem may lie: Unity's built-in 2D, 2D Toolkit, NGUI, etc.)
The camera's size property probably doesn't have much to do with it, since with an orthographic camera, size pretty much just changes the zoom. Are you changing the viewport Rect values, or any other values?
Do you have any code changing the sprite's size? Any anchors on any images?
I haven't used Unity's new 2D tools, but one way to fix these kinds of issues in the past was to get the aspect ratio you are building for and compute a scale ratio so the sprite stays the same size.
I am using Unity's built-in 2D and I am changing no other values, but I have attached my camera to the player (a sprite), if that matters.
So one solution might be to scale the sprites, but then I would have to do that to every single sprite in the game :(
Try this to keep the same aspect ratio on all device/platform: mobile-device-screen-sizes.
That did not work for me, but I came up with something that I think will work most of the time. I have a script that sets some defaults for my game, and in it I have the following:
Resolution res;

void Start () {
#if UNITY_STANDALONE
    // Screen.resolutions lists supported resolutions; take the last (usually the largest)
    Resolution[] resolutions = Screen.resolutions;
    res = resolutions[resolutions.Length - 1];
#endif
}

void Update () {
    SetCameraAspectRatio ();
}

void SetCameraAspectRatio () {
#if UNITY_STANDALONE
    float targetAspectRatio;
    if (Screen.fullScreen) {
        // Fullscreen: the display stretches to its native resolution
        targetAspectRatio = (float)res.width / (float)res.height;
    } else {
        // Windowed: use the actual window size
        targetAspectRatio = (float)Screen.width / (float)Screen.height;
    }
    Camera.main.aspect = targetAspectRatio;
#endif
}
It only runs in a standalone build, since the problem only exists where the screen resolution can have different aspect ratios (it may be a problem on consoles too; I'll see if I need it there). I try to get the monitor's native resolution in the Start method (I don't think it's guaranteed to be the last entry in Screen.resolutions, but I hope it often is). If the game goes fullscreen I use that to calculate the aspect ratio; if not, I use the actual screen size.
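The selection logic in SetCameraAspectRatio above boils down to picking which resolution determines the aspect. A sketch in Python (is_fullscreen, native_res, and screen_res are hypothetical stand-ins for Screen.fullScreen, the last entry of Screen.resolutions, and Screen.width/height):

```python
def target_aspect(is_fullscreen, native_res, screen_res):
    """Pick the resolution that determines the camera aspect:
    the monitor's native resolution when fullscreen (the display
    stretches to it), otherwise the actual window size."""
    w, h = native_res if is_fullscreen else screen_res
    return w / h

# Fullscreen on a 1920x1200 monitor at 800x600: use the native 1.6.
print(target_aspect(True, (1920, 1200), (800, 600)))   # 1.6
# Windowed at 800x600: use the window's ~1.333.
print(target_aspect(False, (1920, 1200), (800, 600)))
```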
What do you think? It seems to work, but is it a good solution?