Multiple RenderTextures in a Shader
Hi, I'm having an issue with RenderTextures in a shader. I have two cameras rendering to two RenderTextures, A and B. I'm having trouble getting these into a shader, basically because I don't know ShaderLab that well.
Shader "Custom/GoToTransparent" {
Properties{
_Color("Color", Color) = (1,1,1,1)
_LeftEye("Left Eye", 2D) = "white" {}
_RightEye("Left Eye", 2D) = "white" {}
_Blend("Cross-eye Blend", Range(0,1)) = 0.0
}
SubShader{
Tags{ "RenderType" = "Opaque" }
LOD 200
CGPROGRAM
#pragma surface surf Lambert
struct Input {
float2 uv_LeftEye;
float2 uv_RightEye;
};
sampler2D _LeftEye;
sampler2D _RightEye;
half _Blend;
half _Color;
void surf(Input IN, inout SurfaceOutput o) {
fixed4 c = lerp(tex2D(_LeftEye, IN.uv_LeftEye), tex2D(_RightEye, IN.uv_RightEye), _Blend.x) * _Color;
o.Albedo = c.rgb;
o.Alpha = c.a;
}
ENDCG
}
Fallback "Diffuse"
}
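On the C# side, something like this would push the two camera RenderTextures into those _LeftEye/_RightEye slots (rough sketch only; the class, field names, and texture sizes are placeholders, and it assumes the material is on the renderer this script is attached to):

using UnityEngine;

// Rough sketch: wire the two eye cameras into the _LeftEye/_RightEye samplers.
public class EyeTextureBinder : MonoBehaviour
{
    public Camera leftCam;
    public Camera rightCam;

    void Start()
    {
        // One RenderTexture per eye camera (sizes are just an example).
        var leftRT = new RenderTexture(Screen.width, Screen.height, 24);
        var rightRT = new RenderTexture(Screen.width, Screen.height, 24);
        leftCam.targetTexture = leftRT;
        rightCam.targetTexture = rightRT;

        // Push them into the material by the property names from the shader above.
        var mat = GetComponent<Renderer>().material;
        mat.SetTexture("_LeftEye", leftRT);
        mat.SetTexture("_RightEye", rightRT);
    }
}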
The idea is to have two separately controlled eyes. The closer the dot product of their forward vectors is to 1, the closer together the two cameras move, removing the "double vision" effect. But when they're not "looking" at the same thing, the two views would blur together in the middle of the screen. The lerp part is entirely wrong and just cribbed from an earlier shader I made; that's what I'll work on after I can just see the two textures in the final output.
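Roughly what I have in mind for the convergence is below (a sketch only: the class name, the convergence mapping, and the fixed 0.5 blend are placeholders, not the final logic):

using UnityEngine;

// Rough sketch: converge the eye cameras as their forward vectors align,
// and drive the shader's _Blend. The mapping here is a placeholder.
public class EyeConvergence : MonoBehaviour
{
    public Transform leftEye;
    public Transform rightEye;
    public Material eyeMaterial;

    Vector3 leftRest, rightRest;

    void Start()
    {
        // Remember where each eye sits when fully diverged.
        leftRest = leftEye.localPosition;
        rightRest = rightEye.localPosition;
    }

    void Update()
    {
        // 1 when both eyes face the same direction, falling toward 0 as they diverge.
        float alignment = Mathf.Clamp01(Vector3.Dot(leftEye.forward, rightEye.forward));

        // Pull both eyes toward their midpoint as alignment approaches 1,
        // which removes the double-vision offset.
        Vector3 midpoint = (leftRest + rightRest) * 0.5f;
        leftEye.localPosition = Vector3.Lerp(leftRest, midpoint, alignment);
        rightEye.localPosition = Vector3.Lerp(rightRest, midpoint, alignment);

        // Even 50/50 mix of the two textures for now; the real blend logic is TBD.
        eyeMaterial.SetFloat("_Blend", 0.5f);
    }
}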
Finally, I'm pretty sure this needs to happen in a shader so I can pseudo-depth-sort the blurred vision, instead of relying on opacity and having one "eye" win.