Render Texture Screen Effect: Custom Shader Not Tiling
I am by no means a shader programmer; I'm just fumbling through. I've managed to implement a camera dissolve effect that gradually makes what a camera sees appear using a noise pattern, superimposed over a static background rendered by another camera.
I've got it working almost exactly as I imagined it in my head, except that the shader I'm using, which I found on the internet, is not tiling the noise image when I blit the Render Texture. No matter what values I use for the noise image's X and Y tiling, it has no effect.
I've tried several changes to the shader based on suggestions I've found, but nothing works. Can I get a hand from the Unity Answers community?
Below is the current working shader code:
Shader "Custom/Unlit/DissolveEffectShader"
{
    Properties
    {
        _MainTex ("Texture", 2D) = "white" {}
        _NoiseTex ("Texture", 2D) = "white" {}
        _EdgeColour1 ("Edge colour 1", Color) = (1.0, 1.0, 1.0, 1.0)
        _EdgeColour2 ("Edge colour 2", Color) = (1.0, 1.0, 1.0, 1.0)
        _Level ("Dissolution level", Range (0.0, 1.0)) = 0.1
        _Edges ("Edge width", Range (0.0, 1.0)) = 0.1
    }
    SubShader
    {
        Tags { "Queue"="Transparent" "RenderType"="Transparent" }
        LOD 100
        Pass
        {
            Blend SrcAlpha OneMinusSrcAlpha
            Cull Off
            Lighting Off
            ZWrite Off
            ZTest Always
            Fog { Mode Off }

            CGPROGRAM
            #pragma vertex vert
            #pragma fragment frag
            #include "UnityCG.cginc"

            struct appdata
            {
                float4 vertex : POSITION;
                float2 uv : TEXCOORD0;
            };

            struct v2f
            {
                float2 uv : TEXCOORD0;
                float4 vertex : SV_POSITION;
            };

            sampler2D _MainTex;
            sampler2D _NoiseTex;
            float4 _EdgeColour1;
            float4 _EdgeColour2;
            float _Level;
            float _Edges;
            float4 _MainTex_ST;

            v2f vert (appdata v)
            {
                v2f o;
                o.vertex = UnityObjectToClipPos(v.vertex);
                o.uv = TRANSFORM_TEX(v.uv, _MainTex);
                return o;
            }

            fixed4 frag (v2f i) : SV_Target
            {
                // sample the noise and main textures
                float cutout = tex2D(_NoiseTex, i.uv).r;
                fixed4 col = tex2D(_MainTex, i.uv);
                if (cutout < _Level)
                    discard;
                if (cutout < col.a && cutout < _Level + _Edges)
                    col = lerp(_EdgeColour1, _EdgeColour2, (cutout - _Level) / _Edges);
                return col;
            }
            ENDCG
        }
    }
}
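One possible explanation, offered as a sketch rather than a verified fix: the shader only declares `_MainTex_ST` and only applies `_MainTex`'s tiling in the vertex shader, so the noise texture's Tiling values on the material are never read at all. Declaring a `_NoiseTex_ST` vector and applying it when sampling the noise might restore tiling. The snippet below assumes Unity fills `_NoiseTex_ST` from the material's Tiling/Offset fields the same way it does for `_MainTex_ST`:

```hlsl
// Assumed change (untested against this exact setup): declare the ST
// vector for the noise texture so Unity passes its per-material
// Tiling (xy) and Offset (zw) into the shader.
sampler2D _NoiseTex;
float4 _NoiseTex_ST;

fixed4 frag (v2f i) : SV_Target
{
    // Apply the noise texture's tiling/offset manually in the fragment
    // shader, since Blit drives the screen quad's UVs and the vertex
    // shader only ever transformed them by _MainTex_ST.
    float2 noiseUV = i.uv * _NoiseTex_ST.xy + _NoiseTex_ST.zw;
    float cutout = tex2D(_NoiseTex, noiseUV).r;
    fixed4 col = tex2D(_MainTex, i.uv);
    if (cutout < _Level)
        discard;
    if (cutout < col.a && cutout < _Level + _Edges)
        col = lerp(_EdgeColour1, _EdgeColour2, (cutout - _Level) / _Edges);
    return col;
}
```

If the material's Tiling field is also ignored by Blit (as the comment below reports for some Unity versions), setting the vector from script with `material.SetVector("_NoiseTex_ST", ...)` on a custom property would sidestep the material UI entirely.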
Think I might have the same issue as you...
We've just upgraded from 5.5.1p1 to 2017.1.0f3 and found that a number of our blit materials are no longer working as expected. Having spent the day investigating it, the issue appears to be that the Tiling value (scale in script) on the material is ignored and (1, 1) is always used.
In the sample shader program at https://docs.unity3d.com/Manual/ShaderTut2.html the member of the appdata parameter passed into TRANSFORM_TEX is v.texcoord, but I see you pass in v.uv. I am NOT clear what the difference between these two is (based on their NAMES they sure sound like the same thing), but perhaps changing it will help.
Edit: oh, never mind, they are using appdata_base as the input type: missed that, sorry.
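For what it's worth, the naming difference puzzled over above is cosmetic: TRANSFORM_TEX only cares about the HLSL semantic (TEXCOORD0), not the field name. A minimal illustration, assuming the `appdata_base` definition from Unity's UnityCG.cginc:

```hlsl
// appdata_base (built into UnityCG.cginc) names its UV member "texcoord":
//   struct appdata_base {
//       float4 vertex   : POSITION;
//       float3 normal   : NORMAL;
//       float4 texcoord : TEXCOORD0;
//   };
// A custom struct can call the same input "uv"; only the semantic matters:
struct appdata
{
    float4 vertex : POSITION;
    float2 uv     : TEXCOORD0;  // same vertex data as appdata_base.texcoord
};

v2f vert (appdata v)
{
    v2f o;
    o.vertex = UnityObjectToClipPos(v.vertex);
    // TRANSFORM_TEX(v.uv, _MainTex) expands to
    // (v.uv * _MainTex_ST.xy + _MainTex_ST.zw) regardless of the name.
    o.uv = TRANSFORM_TEX(v.uv, _MainTex);
    return o;
}
```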