Modifying a quad's UVs to display a screen-fixed texture
I wish to display a full-screen texture on a quad (an actual object in the scene). The texture must stay locked to the viewport, and the quad would be the only window onto it. Think of the quad as a green screen that displays the texture.
Every frame I do:
Transform every vertex of the quad into world space using the quad transform's TransformPoint function.
Transform each resulting vertex into screen space using the camera's WorldToScreenPoint function.
Map the screen coordinates into UV space by dividing them by the screen's dimensions (0.0 to 1.0).
Use these normalized screen points as the new UVs for the original vertices (dropping the z).
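The steps above can be sketched numerically. This is a minimal stand-in, not Unity code: a simple pinhole projection plays the role of Camera.WorldToScreenPoint (camera at the origin looking down -z), and the vertices are assumed to already be in world space, i.e. step 1's TransformPoint has been applied.

```python
import math

def world_to_screen(x, y, z, fov_deg, screen_w, screen_h):
    """Pinhole stand-in for Camera.WorldToScreenPoint.
    Camera at the origin looking down -z; returns pixel coordinates."""
    f = 1.0 / math.tan(math.radians(fov_deg) / 2.0)   # focal length from vertical FOV
    aspect = screen_w / screen_h
    ndc_x = (f / aspect) * x / -z                      # perspective divide by depth
    ndc_y = f * y / -z
    sx = (ndc_x * 0.5 + 0.5) * screen_w                # NDC [-1, 1] -> pixels
    sy = (ndc_y * 0.5 + 0.5) * screen_h
    return sx, sy

def screen_uvs(world_verts, fov_deg, screen_w, screen_h):
    """Steps 2-4: project each world-space vertex, then normalise to 0..1 UVs
    by dividing by the screen dimensions (the z is dropped)."""
    return [(sx / screen_w, sy / screen_h)
            for sx, sy in (world_to_screen(x, y, z, fov_deg, screen_w, screen_h)
                           for x, y, z in world_verts)]

# A unit quad 5 units in front of the camera, facing it head-on:
quad = [(-1, -1, -5), (1, -1, -5), (1, 1, -5), (-1, 1, -5)]
print(screen_uvs(quad, 90, 640, 480))
```

For this head-on quad the resulting UVs are symmetric around (0.5, 0.5), as expected; the per-vertex math is fine. The trouble described below only appears once the quad is tilted.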
The problem is that I get unexpected results: the texture doesn't look sharp even though it should be rendered at screen resolution, and worse, it looks (and animates) quite wrong. Any ideas about what's off in the algorithm?
Answer by Bunny83 · Sep 25, 2015 at 08:46 PM
Well, that doesn't work like a green screen, since you still render your quad in world space. So if the quad is tilted, it will be rendered with perspective. What you want to do is either:
render another quad in screen space that matches your four screen-space corners, or
use a green-screen-like approach: first render your screen-space texture as the background, then render your masking quad with a depth-mask shader, and finally render your normal scene.
The second approach should work for my purposes, but I was more interested in doing it without any shaders. I was hoping someone could tell me what's so horribly wrong with my algorithm.
Again, for clarification: I wish to render any object as a green-screen texture, an overlay for a backdrop if you will. Simply put, it's like having an unlit material that renders a texture mapped to the screen.
EDIT: sorry for replying so late, I had some issues with this answering system (I didn't use the comment button, I used the reply one)
Sorry, just looked through my open tabs and found your reply ^^.
You said
Use these normalized screen points as new UVs for the original vertices (minus the z).
I'm not sure what you mean by "minus the z". UV coordinates are local to the surface of the object. If you render the object tilted, the parts that are further away need perspective correction; plain affine interpolation isn't enough. You project a screen-space texture linearly onto the tilted quad as if it were a screen-space quad, but the graphics hardware still accounts for the perspective of the tilted quad when interpolating the UVs. As I said, one solution is to disable the renderer of your quad and instead render a screen-space "quad" where you adjust each corner to fit the projected coordinates.
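The perspective-correction point can be shown with plain numbers. The algorithm assigns screen-space UVs per vertex and implicitly assumes they will be interpolated linearly in screen space (affine), but the GPU interpolates uv/w and 1/w and divides per fragment (perspective-correct). A sketch for one edge of a tilted quad, with the depth values chosen just for illustration:

```python
def lerp(a, b, t):
    """Linear interpolation between a and b."""
    return a + (b - a) * t

# One edge of a tilted quad: near vertex at depth 1, far vertex at depth 4,
# with screen-space UVs 0.0 and 1.0 assigned per the algorithm above.
uv0, w0 = 0.0, 1.0   # near vertex: uv, depth
uv1, w1 = 1.0, 4.0   # far vertex: uv, depth

t = 0.5  # halfway along the edge in screen space

# What the algorithm assumes: plain (affine) interpolation in screen space.
affine_uv = lerp(uv0, uv1, t)                                     # -> 0.5

# What the GPU actually does: interpolate uv/w and 1/w, then divide.
persp_uv = lerp(uv0 / w0, uv1 / w1, t) / lerp(1 / w0, 1 / w1, t)  # -> 0.2

print(affine_uv, persp_uv)
```

Halfway across the edge the GPU samples at uv 0.2 instead of the expected 0.5, so the texture slides and warps exactly as described whenever the quad's vertices sit at different depths.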
With a shader it's quite easy to use screen-space coordinates for texture mapping, as you can see in the "Detail Texture in Screen Space" shader example (scroll down).
Why do you want to avoid using a special shader? You always use some kind of shader anyway, and normal perspective-correct texture mapping isn't appropriate for your use case.
Thank you! That affine projection thing seemed to be the culprit here. I'm going to try both solutions to become a bit more familiar with Unity. Thanks again!