Project Texture Coordinates From Camera
Hi All,
I'm just getting into Unity, my experience lies mainly with Blender.
I would like to update UV coordinates in realtime, projected from a camera. In Blender you can do this (though not in realtime) with 'Project from View'. To get something similar in realtime in Blender, you can change the texture's map input to use screen coordinates.
Any help would be wonderful, thanks :)
Answer by Owen-Reynolds · Dec 06, 2011 at 04:32 PM
Assuming you want this in real-time, the fastest (in machine time) way is with a shader (on the Material.) The shader is already computing screen coords, so it's not too much extra work to replace the UV coords with those new numbers.
The Unity shader drop-down doesn't have a "UV from view" shader, but there are various shader snippets floating around. If you can't find one mostly written, learning shaders isn't trivial -- not worth it for just this one effect.
It is also possible to modify mesh.uv (see the Mesh class) each frame. One obvious problem is that multiple instances of the mesh will have different camera views, so you'd probably need to make copies of the mesh. Changing mesh.uv is "slow" in the sense that you may lose one whole frame/sec(?)
You'd just need to pull out the corresponding xyz vert, apply the parent transform to get world space, then Camera.WorldToViewportPoint to get screen space. Then scale that however and set the UV (see the sketch below.) That seems like more work, but there may be more, easier-to-read examples of mesh manipulation than of shaders. Plus, playing with model/world/camera/screen coords is good practice if you do want to learn shaders.
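A minimal C# sketch of that per-frame approach, assuming the object has a MeshFilter; the script name and the projectionCamera field are placeholders for illustration, not anything from the thread:

```csharp
using UnityEngine;

// Minimal sketch: recompute mesh UVs from the camera view each frame.
public class ProjectUVFromCamera : MonoBehaviour
{
    public Camera projectionCamera; // assign in the Inspector (hypothetical field)

    Mesh mesh;
    Vector3[] vertices;
    Vector2[] uvs;

    void Start()
    {
        // Use .mesh (not .sharedMesh) so each instance gets its own copy
        // and other instances keep their own UVs.
        mesh = GetComponent<MeshFilter>().mesh;
        vertices = mesh.vertices;
        uvs = new Vector2[vertices.Length];
    }

    void Update()
    {
        for (int i = 0; i < vertices.Length; i++)
        {
            // Local -> world space via the object's transform...
            Vector3 world = transform.TransformPoint(vertices[i]);
            // ...then world -> viewport space (0..1 across the screen),
            // which maps directly onto the UV range.
            Vector3 vp = projectionCamera.WorldToViewportPoint(world);
            uvs[i] = new Vector2(vp.x, vp.y);
        }
        mesh.uv = uvs; // reassigning the UV array each frame is the "slow" part
    }
}
```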
Thanks so much for the detailed response, it's quite reaffirming to know I was on the right track after asking the question.
Which of these methods would be recommended for cross-platform dev? Also, would any of the visual shader editors that are floating around be capable of such an effect?
Thanks!
If they allow you to replace the UV channel (I'm sure they do) and you can find CameraView.xyz or ScreenView.xyz, then a drag&drop shader editor should be fine. Have the vertex shader assign the UVs (it can be done in the pixel/fragment shader, but won't be any better and will run slower.)
It's not a very heavy-weight effect. I can't imagine that even a bad cell-phone GPU couldn't do it, and if the device ends up faking the shader on the CPU, you're no worse off. But I'm a terrible person to ask about multiple platforms.
Thanks, that's great. I'll try out some experiments with drag and drop shader editors.
As I'm projecting onto a 3D mesh, would a vertex shader give me distortion compared to a pixel shader?
Thanks again, your replies have been really enlightening.
A regular unwrap only assigns UV coords to each vertex, so your assignments will look exactly as distorted as any Project From View unwrap.
The pixel shader is for things that aren't uniform: the middle of a tri always uses the average of the corner UV-coords, but it does not use the average specular light or the average color. You have to compute and look those up per pixel.
Hi! I'm trying to write a surface shader that projects the UVs from the camera view. Given the following, what would that look like as surface shader code? Could you write an example?
"You'd just need to pull out the corresponding xyz vert, apply the parent transform to get world space, then Camera.WorldToViewportPoint to get screen space. Then scale that however and set the UV."
Answer by thehen · Dec 08, 2011 at 08:50 AM
Excellent stuff, thanks so much for the help, you've really helped me out :)