Confused by Blender to Unity workflow regarding textures
Hello guys. I'm fairly new to creating materials in Blender, although I can make decent models. I've been searching Google for hours on end, but I still haven't found a satisfactory explanation of this. I'll try to explain my problem as simply as I can. Keep in mind that I'm a self-taught modeler, so I may be missing a lot of terminology, even though I have a rough understanding of the whole process.
So, I've sculpted a nice rock model in Blender. I find a suitable texture for it, unwrap the model, and begin editing the material in the node editor. When I'm done, I'm aware that I can only export the mesh itself to Unity, which won't contain the material and texture information. I'm also aware of baking, but this is where the confusion begins, since I honestly don't know what to bake. My understanding is that once the final edited texture is finished, I can bake what I see to a new texture that also carries the UV map information, so that when I import the model into Unity and put my texture on it, it will automatically use the combined texture. But:
Unity's standard shader contains a field for a normal map and an albedo map. Do I bake all that information into a single image in Blender and drag it onto Albedo, or should I make a separate normal map if I want my model to carry bump information in Unity?
If I use combined baking in Blender, I also get pre-calculated shading from Blender's light source. I know I can just delete the light, but then I end up with a dark texture.
My rough, overall question would be: how do I make my model, create a texture for it that I like, and have it respond to light in Unity, including bumps, displacement, etc.?
Thank you very much, and excuse my ignorance!
If you are using Cycles, there is a nice shader node available online that is designed to closely mirror Unity's standard shader. If you stick to using that, you can bake all the inputs to that shader to a texture by connecting it to an emissive shader, and then baking only emission.
I find this is a nice workflow to produce reasonably sophisticated materials in blender (that use normal maps, smoothness/metallic, AO, emission etc), and have them work in Unity.
Answer by termway · May 31, 2017 at 04:34 PM
If you want to use Unity rendering, you need a Unity mindset: treat Blender as a helper tool. Ask yourself how you would do a particular task/render in Unity, then ask how Blender can help you with that task. For example, you don't lightmap your object in Blender; you create an additional UV channel in Blender, and then Unity itself performs the lightmapping and bakes the texture using that UV channel.
You should bake all your textures separately (normal map, height map, ...). The Standard shader will then combine those textures to perform the rendering according to your scene configuration (lights, reflections, fog, ...).
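To keep track of which separately baked map feeds which field, the pairing can be sketched as a small lookup table. This is a rough guide based on the slot names in the Standard shader's material inspector, not an official list:

```python
# Rough guide: which separately-baked Blender map goes in which
# Unity Standard shader slot. Slot names follow the material inspector.
BAKE_TO_STANDARD_SLOT = {
    "diffuse/base color": "Albedo",
    "normal map":         "Normal Map",
    "height map":         "Height Map",
    "ambient occlusion":  "Occlusion",
    "metallic":           "Metallic",
    "roughness":          "Smoothness (inverted: smoothness = 1 - roughness)",
    "emission":           "Emission",
}

def standard_slot(baked_map: str) -> str:
    """Return the Standard shader slot a baked map belongs in."""
    return BAKE_TO_STANDARD_SLOT[baked_map]

print(standard_slot("normal map"))  # Normal Map
```

The one non-obvious pairing is roughness: Blender materials typically use roughness, while the Standard shader expects smoothness, so the baked map needs to be inverted.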
If you want to bake shadow information, use Unity's lightmapping so that Unity's own light sources are used. A dark texture is often a symptom of material/scene misconfiguration.
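As a toy illustration of why baking Blender's shading into the albedo tends to come out dark: if the light term is already baked into the texture and Unity then lights the surface again, the light term is applied twice. The numbers and the simple Lambert model below are illustrative assumptions, not Unity's actual shading code:

```python
# Illustrative only: simple Lambert diffuse, final = albedo * max(0, n.l).
# If shading was already baked into the "albedo", the renderer multiplies
# by the light term a second time and the result is too dark.
def lambert(albedo: float, n_dot_l: float) -> float:
    return albedo * max(0.0, n_dot_l)

albedo = 0.8    # unlit base color
n_dot_l = 0.5   # cosine of the angle between normal and light

correct = lambert(albedo, n_dot_l)   # light applied once -> 0.4
baked = lambert(albedo, n_dot_l)     # pretend this was baked into the texture
double = lambert(baked, n_dot_l)     # light applied twice -> 0.2, too dark

print(correct, double)  # 0.4 0.2
```

This is why the albedo texture should stay light-free: Unity's Standard shader re-applies lighting from the scene's own light sources at render time.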
You should check this documentation page for more information about material parameters, such as the texture formats supported by Unity: https://docs.unity3d.com/Manual/StandardShaderMaterialParameters.html
Here is an example of a simplified workflow between Blender and Unity.
In Blender:
- Create the mesh geometry
- Create your UV mapping according to your texture (or the other way around)
- Optionally bake additional maps (normal map, ...)
In Unity:
- Configure scene parameters (light sources, environment lighting, fog, ...)
- Configure the material (assign the corresponding baked textures, tune shader parameters)
What you do not want is to waste time tweaking parameters in your modeling software (Blender) that cannot be used by your target rendering software (Unity).