Writing depth to render target / iOS
Hey all,
After seeing that cameraDepthMode does not actually let the camera write to the _CameraDepthTexture sampler on iOS, I decided to manually write an object's depth to an RGBA32 fullscreen render target. For this I use a second camera, copied from the main one, with its culling mask set so it renders only the things I want. The camera rendering works fine, but I am having trouble sampling the depth texture correctly. This is the depth-writing shader:
struct v2f {
    float4 pos : POSITION;
    float2 depth : TEXCOORD0;
};

v2f vert(appdata_full v) {
    v2f o;
    o.pos = mul(UNITY_MATRIX_MVP, v.vertex);
    // UNITY_TRANSFER_DEPTH doesn't seem to work on iOS,
    // so I use its equivalent below.
    //UNITY_TRANSFER_DEPTH(o.depth);
    o.depth = o.pos.zw;
    return o;
}

half4 frag(v2f i) : COLOR {
    // UNITY_OUTPUT_DEPTH doesn't seem to work on iOS,
    // so I use its equivalent below.
    //UNITY_OUTPUT_DEPTH(i.depth);
    return half4(i.depth.x / i.depth.y, 0, 0, 1);
}
I read the depth in my water shader in order to draw a foam edge around objects in the water. The sampling code is this, following how it's done in the soft-particle implementation:
float depth = LinearEyeDepth(tex2Dproj(_DepthTexture, UNITY_PROJ_COORD(i.projUV)).r);
where projUV is calculated in the vertex shader:
o.projUV = ComputeScreenPos(o.pos);
COMPUTE_EYEDEPTH(o.projUV.z);
Now, the problem is that I don't get the linear view-space depth as I should. The sampled depth is always 1.0, unless I get really close to the surface, where I can see some gradient.
Am I missing something here?
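To put numbers on the symptom: LinearEyeDepth inverts the nonlinear buffer value via _ZBufferParams, but a single 8-bit channel of an RGBA32 target quantizes that value to 1/255 steps, and past a few dozen units the nonlinear depth sits within one step of 1.0. A rough Python round-trip, assuming Unity's non-reversed _ZBufferParams convention (z = (1 - far/near)/far, w = (far/near)/far) and hypothetical near/far planes of 0.3/1000:

```python
NEAR, FAR = 0.3, 1000.0
ZB_Z = (1.0 - FAR / NEAR) / FAR   # _ZBufferParams.z
ZB_W = (FAR / NEAR) / FAR         # _ZBufferParams.w

def linear_eye_depth(d):
    """Unity's LinearEyeDepth: nonlinear buffer value -> view-space distance."""
    return 1.0 / (ZB_Z * d + ZB_W)

def encode(eye_z):
    """Inverse mapping: view-space distance -> nonlinear buffer value."""
    return (1.0 / eye_z - ZB_W) / ZB_Z

def quantize8(d):
    """Simulate storing the value in one 8-bit channel of an RGBA32 target."""
    return round(d * 255.0) / 255.0

# Without quantization the round trip is exact; through one 8-bit channel,
# everything beyond roughly 70 units collapses onto the far plane.
for z in (1.0, 10.0, 100.0, 500.0):
    print(z, linear_eye_depth(quantize8(encode(z))))
```

With these planes a point 100 units away decodes to roughly 70, and anything much farther lands exactly on the far plane, which matches the "always 1.0 unless really close" behaviour and suggests the precision of the single channel, rather than the sampling code, is the culprit.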