Can camera output negative values to render texture?
I have a render texture that is created in code and set as the targetTexture of a camera. I manually call camera.Render() to write to that texture, and then use the texture later in a compute shader. The camera's culling mask is set to a layer containing only one object, so that is the only object rendered to the texture. The object uses a custom shader whose fragment shader outputs the object's normals. The problem is that I want to detect negative y-components of the normal vector in the compute shader, but every value in the texture appears to be non-negative. Even when I hard-code the shader to output only negative values, the values read in the compute shader still aren't negative. Does the camera clamp the values? If so, can that behavior be changed?
Here's the C# code:
private void Awake() {
    rt = new RenderTexture(DIMENSION, DIMENSION, 32, RenderTextureFormat.ARGBFloat);
    rt.enableRandomWrite = true;
    rt.filterMode = FilterMode.Point;
    rt.Create();
    renderCamera = GetComponent<Camera>();
    renderCamera.targetTexture = rt;
    renderCamera.enabled = false; // rendered manually via Render()
    computeShader = Resources.Load<ComputeShader>("shaders/ComputeShader");
}
void FixedUpdate()
{
    renderCamera.Render();
    computeShader.SetTexture(csKernel, "texture", rt);
    computeShader.Dispatch(csKernel, DIMENSION / 8, DIMENSION / 8, 1);
}
Here's the fragment shader from the debug version of the custom shader:
fixed4 frag(v2f i) : SV_Target
{
    // deliberately output a negative g channel to test for clamping
    return float4(0, -1, 0, 1);
}
And finally, this condition in the compute shader is never true:
float4 original = texture[id.xy];
if (original.g < 0) {
    waveEffectsTexture[transformedCoordinates] = float4(1, 1, 1, 1);
}
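To rule out a bug on the compute-shader side, the raw texel values could also be inspected on the CPU. This is an untested sketch, not part of the original project; it assumes the same rt field from the code above, and LogFirstTexel is a hypothetical helper name:

```csharp
// Hypothetical debug helper (assumption, not in the original project):
// read the render texture back on the CPU to see the raw stored values.
void LogFirstTexel()
{
    var readback = new Texture2D(rt.width, rt.height, TextureFormat.RGBAFloat, false);
    RenderTexture prev = RenderTexture.active;
    RenderTexture.active = rt;
    readback.ReadPixels(new Rect(0, 0, rt.width, rt.height), 0, 0);
    readback.Apply();
    RenderTexture.active = prev;
    // If g already reads as 0 here, the clamp happens before the
    // compute shader ever sees the texture.
    Debug.Log(readback.GetPixel(0, 0));
    Destroy(readback);
}
```

If the readback shows the negative value intact, the problem would be in how the compute shader binds or samples the texture rather than in the camera.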