Can normal maps be used to create a 2D sprite lighting system?
Hello! A friend was kind enough to create a sprite shader for me that can also take normal maps. I'm trying to achieve something like this example: http://robotloveskitty.tumblr.com/post/33164532086/legend-of-dungeon-dynamic-lighting-on-sprites But my results have been less than optimal. Could it simply be that I'm creating lousy normal maps? This is what I have going so far: https://www.dropbox.com/s/j46krzl1v1whdv7/2013-12-04%2016_05_16-nDo%203D..png The normal maps just look like some kind of artefact in the middle of the sprite; they don't actually produce that convincing shaded effect.
Any suggestions?
Answer by Kencho · Dec 05, 2013 at 03:33 AM
It's perfectly possible. I did a couple of successful experiments in a matter of hours (video 1, video 2). In the end, sprites are no different from bump-mapped walls as far as the GPU is concerned. There are, however, some things to take into consideration:
You have to write your own shader. I started by downloading the default shaders' sources to use as a base and integrated the lighting model into them. First of all, enable lighting. Then modify the default vertex shader to define the tangent vector for each vertex (the default shader defines v.normal as float3(0, 0, -1); define v.tangent as float4(-1, 0, 0, 1)). You will need it for the bump mapping.
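For reference, the vertex-level setup described above could be sketched like this in Cg/HLSL (a sketch only: the struct and function names are illustrative, not Unity's actual default shader source):

```hlsl
// Sketch of the per-vertex basis described above. A sprite quad faces the
// camera down +Z, so its normal points back at the camera and its tangent
// runs along -X, with tangent.w storing handedness for the bitangent.
struct appdata
{
    float4 vertex   : POSITION;
    float2 texcoord : TEXCOORD0;
    float3 normal   : NORMAL;
    float4 tangent  : TANGENT;
};

void FillSpriteBasis(inout appdata v)
{
    v.normal  = float3(0, 0, -1);   // matches the default sprite shader
    v.tangent = float4(-1, 0, 0, 1); // added: needed for bump mapping
}
```

Since sprite meshes carry no tangent data, writing the values in the vertex shader like this avoids needing them in the mesh at all (with the batching caveat discussed further down).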
Normal maps cannot be defined per-renderer the way the sprite atlas sheet is, which is a problem, as you would have to define a new material per sprite atlas, which is inefficient. However, this is only half true, as you can define other per-renderer textures manually using Renderer.SetPropertyBlock(). The bad news is that they aren't serialized. The good news is that you can find alternative ways to have those textures serialized and applied to the SpriteRenderer, for instance on Awake() ;)
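The Awake() trick could be sketched as a small component like this (a sketch only: the "_BumpMap" property name and the normalMap field are assumptions that your own sprite shader would need to match):

```csharp
using UnityEngine;

// Sketch of the workaround above: serialize the normal map on the component,
// then push it to the SpriteRenderer at runtime via a MaterialPropertyBlock,
// so every sprite can share one material while using its own normal map.
[RequireComponent(typeof(SpriteRenderer))]
public class SpriteNormalMap : MonoBehaviour
{
    public Texture2D normalMap; // serialized in the inspector

    void Awake()
    {
        var spriteRenderer = GetComponent<SpriteRenderer>();
        var block = new MaterialPropertyBlock();
        spriteRenderer.GetPropertyBlock(block); // keep existing overrides
        block.SetTexture("_BumpMap", normalMap); // shader must declare this
        spriteRenderer.SetPropertyBlock(block);
    }
}
```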
It won't work with automatic packing of sprites. I think this isn't available in the free version, but if you throw a bunch of separate sprites at Unity and let it stitch them together into a single sheet, you'll have trouble using normal maps, mostly because the UV coordinates of each sub-sprite in the sheet must correspond exactly with its UV coordinates in the normal map. But remember: UV coordinates are resolution-independent (they range from 0 to 1). For instance, your sprite sheet can be 2048x2048 and its normal map 512x256.
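To illustrate the resolution independence of UVs, here's a small Python sketch (the helper name and rectangle convention are mine, not Unity's): the same sub-sprite occupies the same UV rectangle on a 2048x2048 sheet and on a quarter-resolution 512x512 normal map.

```python
def pixel_rect_to_uv(rect_px, sheet_size_px):
    """Convert a sub-sprite's pixel rectangle (x, y, w, h) into UV bounds
    (u0, v0, u1, v1) on a sheet of the given (width, height) in pixels."""
    x, y, w, h = rect_px
    sw, sh = sheet_size_px
    return (x / sw, y / sh, (x + w) / sw, (y + h) / sh)

# Same sub-sprite, two resolutions: the UV rectangle is identical, which is
# why a lower-resolution normal map can still line up with the sprite sheet.
hi = pixel_rect_to_uv((512, 512, 256, 256), (2048, 2048))
lo = pixel_rect_to_uv((128, 128, 64, 64), (512, 512))
assert hi == lo == (0.25, 0.25, 0.375, 0.375)
```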
There seems to be a glitch when looking for the nearest lights, so in certain cases you may see lighting inconsistencies near the end of a light's range (as if lights suddenly stopped lighting the sprite, etc.). Also, Unity's sprites and their lighting don't get along with scaling, so try to keep your sprites at a uniform (1, 1, 1) scale if you plan to light them. My theory is that it's a bug in the recalculation of the sprites' boundaries that doesn't take scale into consideration. Translation and rotation seem to work fine though :)
As excellently pointed out by Jessy in the comments below, as soon as batching happens, the vertex data fed to the vertex shader is no longer in object space (relative to each sprite separately) but in "batch space" (relative to the batch itself, most likely the same as world space). Thus, using the left vector for tangents may be wrong if a sprite was rotated or scaled and batching happened.
Whew! Hope that helps :)
You are THAT guy! Somebody posted your video 1 at the end of this thread: http://forum.unity3d.com/threads/210399-4-3-Sprites-and-Lighting And I was already looking for ways to contact you.
Thanks so much for the info! :) I found it super helpful for understanding the problem, and I'm sure many others will too. In the end, my friend's shader was working; it was my normal maps that were messed up. But I wasn't aware of the other considerations you mentioned. Great job.
Hehe, thanks a lot! :) Now that Google+ is integrated with YouTube it should've been easier to contact me, or so I thought. I'll drop by that thread to let them know there are some details here.
If you're interested, I also posted some details on the construction of the normal map (and a download of the textures used for the cel shading test) on the comments of the videos. Very artist-oriented, like everything sprite-ish, hahaha.
Programmatically defining tangents doesn't work with batching. There's no use for normals. All that's needed is a 2x2 matrix; for sprites, I think Unity should provide that matrix as the "tangent".
Jessy, the tangents and normals are defined in the shader, not through scripting. It works with batching because it's done after batching, in the polygon soup that the GPU processes. Normals and tangents are required for the per-pixel normal calculation (and binormals/bitangents too if you want to do parallax mapping). It has nothing to do with sprites; it's the same requirement as for any arbitrary normal-mapped 3D model.
Normals are only required if you need to perform a cross product (not defined for 2D) to create the bitangent. The bitangent in 2D is just vec2(-tangent.y, tangent.x) * tangent.w; storing both vectors in one vec4 would be better.
Unity doesn't create tangents for sprite meshes at present. Rotate or invert the scale of your sprite, and now you have no idea what the tangent actually is if it got batched, because local space is now the same thing as world space in the shader. You can't pull the tangent out of the model matrix like you could otherwise.
You should probably take this to the forum and link here if you don't understand; I can try to answer any questions you have there.
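For what it's worth, Jessy's 2D bitangent formula from above can be sketched in a few lines of Python (the function name is mine):

```python
def bitangent_2d(tangent):
    """2D bitangent per the formula above: the perpendicular of the
    tangent's xy part, scaled by the handedness stored in tangent.w."""
    tx, ty, tw = tangent  # (tangent.x, tangent.y, tangent.w)
    return (-ty * tw, tx * tw)

# For the sprite tangent float4(-1, 0, 0, 1) used earlier in this thread,
# the xy part is (-1, 0) with handedness w = 1.
b = bitangent_2d((-1.0, 0.0, 1.0))
assert b == (0.0, -1.0)
```

No cross product, hence no normal needed: rotating a 2D vector by 90 degrees is just swapping components and negating one.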
Answer by Jessy · Dec 04, 2013 at 07:58 PM
Unity currently doesn't do a good job of supporting this, for these chief reasons:
A sprite can't be defined from multiple textures.
A sprite mesh stores no normals or tangents.
In fairness, my friend's script does work in making the sprite able to store normals and bump. But I think what I need is a Transparent/Bump Diffuse kind of shader, and that's why it's not working.