iOS "Best Performance" Gives Odd Resolutions (Unity 4.0)?
Short Version:
In Unity 4.0, we get new resolution choices for iOS. However, I'm finding that the "Best Performance" option gives very odd resolution choices, which are NOT the pre-retina resolutions I would have expected. E.g. the iPad 3 gives 1536x1152 in best performance rather than the 1024x768 I would have expected.
Does anyone know what is up with these strange resolution choices under "Best Performance"?
Details:
I ran tests across the set of devices I had available, trying each of the provided settings. Here are the results:
Native:
1136x640, iPhone 5
2048x1536, iPad 3
1024x768, iPad 2
480x320, iPod Touch 3
Auto (Best Performance):
852x480, iPhone 5
1536x1152, iPad 3
768x576, iPad 2
360x240, iPod Touch 3
Auto (Best Quality):
1136x640, iPhone 5
2048x1536, iPad 3
1024x768, iPad 2
480x320, iPod Touch 3
768p (iPad):
(distorted) 1024x768, iPhone 5
1024x768, iPad 3
1024x768, iPad 2
(distorted) 1024x768, iPod Touch 3
320p (iPhone):
(distorted) 480x320, iPhone 5
(distorted) 480x320, iPad 3
(distorted) 480x320, iPad 2
480x320, iPod Touch 3
640p (iPhone Retina):
(distorted) 960x640, iPhone 5
(distorted) 960x640, iPad 3
(distorted) 960x640, iPad 2
(down-sampled) 960x640, iPod Touch 3
Findings:
I gather these are all "virtual" resolutions, as in many cases they do not seem possible on the actual device. It's sort of nice that Unity allows this.
As expected, "native" and "auto (quality)" are redundant for this set of devices. This isn't necessarily an issue, as other devices may be different.
The "768p", "320p" and "640p" options are next to useless when targeting multiple devices. Despite the descriptions, they do NOT define only the vertical resolution, but fix the aspect ratio as well. The result is that "640p (iPhone Retina)" is distorted on a retina iPhone 5 or any other device with different proportions than the older retina iPhones. A really questionable choice to include these given Unity's supposed "build once, deploy everywhere" philosophy.
The thing which really throws me, though, is the set of "best performance" resolutions. Although they are all the correct proportions, I would have expected to get the standard non-retina resolutions here (e.g. 1024x768 for iPad 3), but instead we see some in-between resolution. Surely the lower-res option would be better for performance, so why not choose it? In fact, with the choices available, it is not possible to get both 1024x768 on an iPad 3 and 480x320 on an iPhone 4 with the same build settings — a desirable configuration for apps with intense visuals.
Am I missing a technical reason for this? Is it a bug or oversight of some sort?
Actually, I shouldn't say it's "in-between" standard/old and retina, as for the non-retina devices you actually get something lower than the standard resolution.
The "best performance" resolutions seem to be exactly 3/4 of the native resolution. I guess that's a useful approach as well, but why not let us choose that proportion, or when to apply it? What if I want half the native resolution? Or only want the resolution scaled down on retina devices? Again, there seems to be NO option at all for setting all devices to use the pre-retina resolutions.
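For what it's worth, you can apply an arbitrary multiplier yourself from a script instead of relying on the fixed 0.75 "best performance" factor. A minimal sketch (the 0.5 multiplier is an illustrative choice, not a Unity default):

```csharp
using UnityEngine;

// Attach to any object in the first scene. Scales the render resolution
// by a chosen multiplier instead of Unity's fixed 0.75 "best performance" factor.
public class CustomResolutionScale : MonoBehaviour
{
    // Illustrative choice: half the native resolution.
    const float multiplier = 0.5f;

    void Start()
    {
        // At startup Screen.width/height report the current (native) resolution.
        int w = Mathf.RoundToInt(Screen.width * multiplier);
        int h = Mathf.RoundToInt(Screen.height * multiplier);
        Screen.SetResolution(w, h, true);
    }
}
```

On an iPad 3 this would give 1024x768 instead of the 1536x1152 that "best performance" picks.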
Also of note: none of the iOS devices I tested returned anything for Screen.resolutions (the array of supported resolutions). This makes choosing your own resolution in code a bit difficult.
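Since Screen.resolutions comes back empty on these devices, the only reliable fallback I've found is to derive everything from the current screen size rather than a mode list. A quick sketch:

```csharp
using UnityEngine;

// Sketch: iOS devices may return an empty Screen.resolutions array,
// so treat the current screen size as the only known-good mode.
public class ResolutionProbe : MonoBehaviour
{
    void Start()
    {
        if (Screen.resolutions.Length == 0)
        {
            Debug.Log("No mode list; current screen is "
                      + Screen.width + "x" + Screen.height);
        }
    }
}
```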
I just spent some significant time trying to figure out why my iPhone 5 was distorting sprites in the X-axis. Turns out that setting 640p resolution in the iOS build settings was the culprit -- a setting I'd looked at several times, each time saying "Nah, that couldn't be it". It was literally the last thing I tried changing (I switched it to Native).
I'd upgrade your rating of "next to useless" for this feature to "worse than useless".
Answer by numandina · Jul 04, 2013 at 01:03 PM
Dude you can change this in Xcode. From Unity, set Target Resolution to Best Performance, then in Xcode open AppController.mm and scroll down to here:
SetupTargetResolution(EAGLSurfaceDesc* surface)
{
    // while this may look stupid, we call that function from inside the unity render loop,
    // so don't fiddle with resolution right away, but postpone till the end of the frame
    int targetRes = UnityGetTargetResolution();

    float resMult = 1.0f;
    if (targetRes == kTargetResolutionAutoPerformance)
    {
        switch (UnityGetDeviceGeneration())
        {
            case deviceiPhone4:  resMult = 0.26f; break;
            case deviceiPad1Gen: resMult = 0.5f;  break;
            case deviceiPad3Gen: resMult = 0.5f;  break;
            default:             resMult = 1.0f;
        }
    }
Now you can change the resolution as you see fit!
Is this still the case? I'm not finding this code file or code in the project. I'm on Unity 5.0.1 Pro.
Hey ByteJam.
Looks like they moved it with Unity 5.
Look at DeviceSettings.mm, which appears to have similar code:
int targetRes = UnityGetTargetResolution();

float resMult = 1.0f;
if (targetRes == kTargetResolutionAutoPerformance)
{
    switch (UnityDeviceGeneration())
    {
        case deviceiPhone4: resMult = 0.6f;  break;
        default:            resMult = 0.75f; break;
    }
}

if (targetRes == kTargetResolutionAutoQuality)
{
    switch (UnityDeviceGeneration())
    {
        case deviceiPhone4: resMult = 0.8f; break;
        default:            resMult = 1.0f; break;
    }
}
You can just do this from script with Screen.SetResolution(width, height, fullscreen) - you don't need to muck with the .mm files to do this.
Of course you can. But again, the whole point of having an "auto" resolution choice is to not code your own resolution settings.
It seems that Unity's auto-res logic is very simplistic, setting a fixed multiplier and differentiating only between the iPhone 4 (and below?) and "the rest". Better than nothing, of course, and possibly expanded on in future versions of the engine. Given the plethora of devices out there, an actual performance-test-and-adjust routine would probably be more desirable, but Unity doesn't provide such a utility yet.
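For anyone curious, the kind of performance-test-and-adjust routine mentioned above could be rolled by hand in a script. A rough sketch (the target frame rate, thresholds and step size are all illustrative assumptions):

```csharp
using UnityEngine;

// Illustrative sketch: drop the render resolution when frame time stays high.
public class AdaptiveResolution : MonoBehaviour
{
    const float targetFrameTime = 1f / 30f; // aim for 30 fps (assumption)
    const float stepDown = 0.85f;           // shrink by 15% per adjustment
    const float minScale = 0.5f;            // never go below half native

    float scale = 1f;
    int nativeW, nativeH;
    float smoothed;

    void Start()
    {
        nativeW = Screen.width;
        nativeH = Screen.height;
    }

    void Update()
    {
        // Exponentially smoothed frame time, to ignore single-frame spikes.
        smoothed = Mathf.Lerp(smoothed, Time.unscaledDeltaTime, 0.05f);
        if (smoothed > targetFrameTime * 1.2f && scale > minScale)
        {
            scale = Mathf.Max(minScale, scale * stepDown);
            Screen.SetResolution(Mathf.RoundToInt(nativeW * scale),
                                 Mathf.RoundToInt(nativeH * scale), true);
            smoothed = targetFrameTime; // reset so we don't step down repeatedly
        }
    }
}
```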
In any case, the original question (which is old and fuzzy) amounted to "why these resolutions?" and the answer is in the multiplier code numandina points to.
Answer by Jason-RT-Bond · Mar 13, 2014 at 06:34 PM
Since this question doesn't have an "answer" yet as such, I think I should add one in case someone else stumbles on it:
The "best performance" setting gives you 0.75 times the full resolution across the board. An odd choice perhaps (why not 0.5, or a choice?), but it makes mathematical sense if not practical sense.
The "best quality" option just gives the native resolution it would appear, so it's redundant?
The "640p" and other fixed options imply an EXACT resolution and are ONLY useful if you are targeting that one type of screen, since they are just the native resolutions of specific devices. If you try one across varied phones or tablets with different proportions, you will get distorted results.
As for my original case: If you wanted to choose 1024x768 for both iPad 2 and iPad 3 but retina for iPad 4+, which is a common choice for performance reasons relating to the iPad 3, you need to do it manually in code.
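That manual per-device choice might look something like this sketch, using Unity 4.x's iPhone.generation API (the generation checks are assumptions based on that era's enum names):

```csharp
using UnityEngine;

// Sketch: run iPad 2 and iPad 3 at 1024x768, leave everything else at native.
public class IPadResolutionFix : MonoBehaviour
{
    void Start()
    {
#if UNITY_IPHONE
        var gen = iPhone.generation;
        if (gen == iPhoneGeneration.iPad2Gen || gen == iPhoneGeneration.iPad3Gen)
            Screen.SetResolution(1024, 768, true);
#endif
    }
}
```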
That's current as of Unity 4.3.
Also I should add that I'm a pretty big fan of Unity and don't mean to be so negative about its features. However, I am quite disappointed by these resolution choices. The ultimate answer is "always run native" or "run native and then adjust using your own logic in code". The options presented by the Editor are not practical or sensible at all.