How to force Android to use a 24-bit depth buffer in Unity 5, to prevent z fighting?
I've been seeing problems in my game with "z fighting" between terrain and water. It looks perfect on PC, but on my Samsung Galaxy Note 8 Android device, the resolution of the depth buffer is inadequate and z fighting results in the appearance of what look like horizontal ridges at the water line. I was able to ameliorate this problem by increasing the camera's near clip plane from 0.3 to 50.0, but the problem is still there (See image below).
Since the game looks great on PC but the depth resolution is too coarse on the Android device, it appears that the Android device is using a 16-bit depth buffer rather than the standard 24-bit one.
Unity used to have a setting in the Android Player Settings, called "Use 24-bit Depth Buffer", to force an Android device to use a 24-bit depth buffer rather than 16-bit. This option was removed in Unity 5. The notes on "What's new in Unity 5" (https://unity3d.com/unity/whats-new/unity-5.0) state "Replaced 16bit depth buffer support with the ability to completely disable depth and stencil buffers." This seems to imply that 16-bit depth buffers are out, and all depth buffers will be 24-bit from now on, but nonetheless the poor depth resolution I'm seeing on Android (whereas the same game looks great on PC) indicates that in Unity 5 it's still using a 16-bit depth buffer on this Android device.
Is there any way in Unity 5 to force an Android device to use a 24-bit depth buffer? Or any other solution to this problem?
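For reference, the only related option I can still find in Unity 5 is the "Disable Depth and Stencil" setting that the release notes mention. Below is a minimal editor-script sketch to confirm that option isn't enabled; it assumes the scripting property is still called PlayerSettings.Android.disableDepthAndStencilBuffers, and the menu item name is made up:
using UnityEditor;
using UnityEngine;
// Place in an Editor folder. Untested sketch: just reports whether the
// "Disable Depth and Stencil" player setting is turned on for Android.
public static class CheckAndroidDepthSetting
{
    [MenuItem("Tools/Check Android Depth Setting")]
    static void Check ()
    {
        if (PlayerSettings.Android.disableDepthAndStencilBuffers)
            Debug.LogWarning("Depth and stencil buffers are disabled for Android builds.");
        else
            Debug.Log("Depth and stencil buffers are enabled for Android builds.");
    }
}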
I don't know if that's even the right direction, but I found two entries in the docs that might point you the right way:
http://docs.unity3d.com/ScriptReference/DepthTextureMode.Depth.html
http://docs.unity3d.com/ScriptReference/Camera-depthTextureMode.html
Thanks for the links. I did attach a script to the camera that makes sure the camera's depth texture is turned on, in the Start function:
void Start ()
{
    // Ask the camera to generate a depth texture for shaders that sample it.
    GetComponent<Camera>().depthTextureMode = DepthTextureMode.Depth;
}
This didn't make any difference, however. The problem doesn't seem to be that it's missing a depth buffer; the depth buffer just seems to have too little depth resolution, with distinct visible "steps" of depth at a good distance from the camera. All I can imagine is that it's using a 16-bit depth buffer (on this particular Android device) rather than 24-bit. It looks fine on PC and iOS.
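One workaround I'm thinking of trying is to render the camera into a RenderTexture that explicitly requests a 24-bit depth buffer and then copy that to the screen. This is only an untested sketch (the ForceDepth24 name and the OnPreRender/OnPostRender approach are my own guesses, not anything from the docs), but the RenderTexture constructor does take the depth precision in bits:
using UnityEngine;
// Untested sketch: render through an off-screen RenderTexture that explicitly
// requests 24 bits of depth, then copy the result to the back buffer.
[RequireComponent(typeof(Camera))]
public class ForceDepth24 : MonoBehaviour
{
    RenderTexture rt;

    void OnEnable ()
    {
        // Third constructor argument is the requested depth buffer precision in bits (0, 16 or 24).
        rt = new RenderTexture(Screen.width, Screen.height, 24);
    }

    void OnDisable ()
    {
        GetComponent<Camera>().targetTexture = null;
        if (rt != null) { rt.Release(); rt = null; }
    }

    void OnPreRender ()
    {
        // Redirect this camera into the off-screen texture for this frame.
        GetComponent<Camera>().targetTexture = rt;
    }

    void OnPostRender ()
    {
        GetComponent<Camera>().targetTexture = null;
        // Blit to a null target to copy the off-screen image to the screen.
        Graphics.Blit(rt, (RenderTexture)null);
    }
}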
Just out of curiosity, what is your clip range? As long as you follow the 1:1000 rule, you shouldn't have too much trouble with the z-fighting...
Near clip plane is 50, far is 2000. 1:40 ratio, so I would expect it should be fine. And it does look great on PC and iOS. It's just on Android (particularly the Samsung Galaxy Note 8) that I'm seeing the z fighting -- I think because it's using a 16-bit depth buffer rather than 24-bit.
I was originally using the default 0.3->4000 clip planes, and had much worse results on the Android device (still looked fine on PC and iOS). By decreasing the clipping range the results are greatly improved, but still poor.
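For anyone else running into this, one way to keep the tighter range only on Android is something like the sketch below (the PlatformClipPlanes name is made up; the values are just the ones from my project):
using UnityEngine;
[RequireComponent(typeof(Camera))]
public class PlatformClipPlanes : MonoBehaviour
{
    void Start ()
    {
        Camera cam = GetComponent<Camera>();
#if UNITY_ANDROID
        // Tighter range on Android, where the depth precision seems limited.
        cam.nearClipPlane = 50f;
        cam.farClipPlane = 2000f;
#else
        // PC and iOS look fine with the wider range.
        cam.nearClipPlane = 0.3f;
        cam.farClipPlane = 4000f;
#endif
    }
}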
Answer by PitaNGura · Apr 20, 2020 at 07:24 PM
Did you find any solution to your problem? I have a problem with a UI mask, and I read online that the only fix is changing the depth buffer to 24-bit, but there doesn't seem to be any way of doing that.