Understanding UV coord differences between shaders and Unity.
I'm trying to figure out what's happening to the UVs in my scene when rendering to a texture, because I'm getting unexpected behaviour that seems to be related to flipping the Y coordinate of the UVs in my shader.
Basically, when I look at the output of a camera in the editor, it looks fine, but when I look at the output of the same camera on a render texture, it's upside down.
However, if I remove a line of code that I added to my shader (`uv.y = 1.0f - uv.y;`), the opposite happens: the camera view is upside down, but the render texture is correct.
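For context, here is a minimal sketch of where a flip like that typically lives in a Unity image-effect shader. This is not my actual shader (the shader name and structure here are just illustrative); only the flip line is the real one I'm describing:

```
Shader "Hidden/FlipYExample"
{
    Properties { _MainTex ("Texture", 2D) = "white" {} }
    SubShader
    {
        Cull Off ZWrite Off ZTest Always
        Pass
        {
            CGPROGRAM
            #pragma vertex vert_img
            #pragma fragment frag
            #include "UnityCG.cginc"   // provides vert_img and v2f_img

            sampler2D _MainTex;

            fixed4 frag (v2f_img i) : SV_Target
            {
                float2 uv = i.uv;
                uv.y = 1.0f - uv.y;          // the line in question: flips the source vertically
                return tex2D(_MainTex, uv);
            }
            ENDCG
        }
    }
}
```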
Can someone explain why this is happening? I thought the output of a camera would match 1:1 with the render texture it generates, but this is not the case.
I can add pictures to illustrate my point if necessary - I'm thinking this might be a common issue though (my Google-fu is weak and I couldn't find anything about this elsewhere).
Hi, the documentation page on platform-specific rendering differences might explain what you're experiencing. On Direct3D-like platforms, Unity renders into render textures upside down (the texture coordinate origin is at the top), so a hard-coded flip will look correct in one place and inverted in the other.
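The approach that page suggests is to flip only when the source texture has actually been rendered upside down, rather than unconditionally. A rough sketch of the fragment function, assuming your shader samples `_MainTex` (declaring `_MainTex_TexelSize` lets you detect the flip; its `y` component is negative when the texture is flipped):

```
sampler2D _MainTex;
float4 _MainTex_TexelSize;

fixed4 frag (v2f_img i) : SV_Target
{
    float2 uv = i.uv;
#if UNITY_UV_STARTS_AT_TOP
    // Only flip when Unity has rendered this texture upside down
    // (Direct3D-like platforms with certain render-texture setups).
    if (_MainTex_TexelSize.y < 0)
        uv.y = 1.0 - uv.y;
#endif
    return tex2D(_MainTex, uv);
}
```

With this, you shouldn't need to choose between the camera view and the render texture being correct; the flip only happens where it's needed.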