Why are there stretch marks when using my triplanar shader?
I found this triplanar shader: http://www.martinpalko.com/triplanar-mapping/
I wanted it to work on a terraformed sphere (which I have already generated procedurally, along with correct normals and tangents), and I found an old post here that I thought could help: https://answers.unity.com/questions/754969/how-to-apply-a-tri-planar-shader-to-a-planet.html
End goal: apply the cliff texture to the sides (normal toward x and z) and blend the grass along the top (normal close to y).
However, combining the two gives texture stretching in SOME areas. I put the code from that post at the top of the "surf" function of the triplanar shader and converted it to work in model space. It seems OK on some sides, but the transition stretches badly in some spots; perhaps the shader is fine and my normals are bad? I tested my normals and tangents before, and they seem just fine (lighting works well). Here is my combined shader:
Shader "Custom/Triplanar/Planet"
{
    Properties
    {
        _DiffuseMapX ("Diffuse Map X", 2D) = "white" {}
        _DiffuseMapY ("Diffuse Map Y", 2D) = "white" {}
        _DiffuseMapZ ("Diffuse Map Z", 2D) = "white" {}
        _TextureScale ("Texture Scale", float) = 1
        _TriplanarBlendSharpness ("Blend Sharpness", float) = 1
    }
    SubShader
    {
        Tags { "RenderType" = "Opaque" }
        LOD 200

        CGPROGRAM
        #pragma target 3.0
        // Surface shaders declare a custom vertex function with vertex:vert
        // on the surface pragma (a separate "#pragma vertex vert" is for
        // vertex/fragment shaders).
        #pragma surface surf Lambert vertex:vert

        sampler2D _DiffuseMapX;
        sampler2D _DiffuseMapY;
        sampler2D _DiffuseMapZ;
        float _TextureScale;
        float _TriplanarBlendSharpness;

        struct Input
        {
            float3 worldPos;
            float3 worldNormal;
            float3 vertex;
            float3 normal;
        };

        void vert (inout appdata_full v, out Input o)
        {
            UNITY_INITIALIZE_OUTPUT(Input, o);
            o.vertex = v.vertex;
            o.normal = v.normal;
        }

        void surf (Input IN, inout SurfaceOutput o)
        {
            // Build a tangent frame around the radial "up" vector from the planet's center.
            float3 up = normalize(IN.vertex);
            float3 right = normalize(cross(up, float3(0, 1, 0)));
            float3 forward = normalize(cross(right, up));
            // Rotate the model-space normal into that frame.
            float3 localNormal = float3(dot(IN.normal, right), dot(IN.normal, up), dot(IN.normal, forward));

            // Find our UVs for each axis based on the model-space position of the fragment.
            half2 yUV = IN.vertex.xz / _TextureScale;
            half2 xUV = IN.vertex.zy / _TextureScale;
            half2 zUV = IN.vertex.xy / _TextureScale;

            // Sample the diffuse maps with each of the 3 UV sets we've just made.
            half3 yDiff = tex2D(_DiffuseMapY, yUV);
            half3 xDiff = tex2D(_DiffuseMapX, xUV);
            half3 zDiff = tex2D(_DiffuseMapZ, zUV);

            // Take the absolute value of the rotated normal and raise it to the power
            // of BlendSharpness; the higher the value, the sharper the transition
            // between the planar maps.
            half3 blendWeights = pow(abs(localNormal), _TriplanarBlendSharpness);

            // Divide the blend mask by the sum of its components so that x + y + z = 1.
            blendWeights = blendWeights / (blendWeights.x + blendWeights.y + blendWeights.z);

            // Finally, blend the three samples using the blend mask.
            o.Albedo = xDiff * blendWeights.x + yDiff * blendWeights.y + zDiff * blendWeights.z;
        }
        ENDCG
    }
}
This is for a Unity game I’m building here: http://martianworlds.com
Edit: I think my issue was that the coordinate planes used for the UVs were not rotated with the normal, but doing that would always result in a UV of (0, 0) on the X,Z plane (the Y axis). Turns out plan B is working best, with very little FPS drop, so not bad; I'll work in this direction. For those who wish to know, the solution for me was to work in local object space, using the triplanar routine for a single texture. I did this for two textures and blended between them using the dot product of the surface normal and the normalized local vertex position (the "up" vector from the planet object's center to the surface).
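That plan-B blend can be sketched outside the shader. The snippet below is a standalone Python sketch (not Unity code; all names and color values are hypothetical) of the idea: triplanar-sample each texture on its own, then lerp between the two results by the dot product of the surface normal and the radial "up" vector:

```python
# Sketch of the plan-B blend described above (hypothetical helper names).
# Each texture is assumed to have been triplanar-sampled already; the two
# resulting colors are blended by dot(localNormal, up), with
# up = normalize(localPos), i.e. the direction from the planet's center.

def normalize(v):
    l = sum(c * c for c in v) ** 0.5
    return tuple(c / l for c in v)

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def lerp(a, b, t):
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def blend_cliff_grass(cliff_rgb, grass_rgb, local_pos, local_normal):
    up = normalize(local_pos)                        # radial "up" from planet center
    t = max(0.0, dot(normalize(local_normal), up))   # ~1 on flat ground, ~0 on cliffs
    return lerp(cliff_rgb, grass_rgb, t)

# Flat ground: the normal is parallel to the radial direction -> pure grass.
color = blend_cliff_grass((0.4, 0.3, 0.2), (0.1, 0.6, 0.1),
                          local_pos=(0.0, 10.0, 0.0),
                          local_normal=(0.0, 1.0, 0.0))
```

On a vertical cliff face the normal is perpendicular to the radial direction, so t is near 0 and the cliff texture dominates, regardless of where on the sphere the cliff sits.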
Answer by Bunny83 · Jul 26, 2018 at 09:59 AM
Because you do not base your blending on the actual surface normal the way the original shader does; instead you use a rotated local space. That doesn't make much sense, because you have to pick the right texture based on the actual orientation. In your example you have a very steep hill / cliff, so viewed from the top there is very little difference in the world-space projection, and the corresponding texture lookup will sample just a few pixels across the whole surface. That is usually not a problem, because the world-space normal is used to select the right projection: the stretched portion would get a blend factor close to 0, so it wouldn't be visible at all. But since you use a normal that is local to a fairly arbitrary local tangent space, the x, y, and z values of the normal no longer correspond to the actual surface angle relative to your 3 projections.
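That weight selection can be verified with plain numbers. The snippet below is a standalone Python sketch (not shader code; the normal and sharpness values are made up) of the usual pow/normalize triplanar weighting, showing that for a steep cliff the actual surface normal drives the weight of the stretched top-down (y) projection toward zero:

```python
# Demonstration: why blend weights from the actual surface normal hide the
# stretched projection on a steep cliff, using the standard
# pow(abs(normal), sharpness) / sum weighting.

def blend_weights(normal, sharpness):
    """Triplanar blend weights (x, y, z) from an approximately unit normal."""
    w = [abs(c) ** sharpness for c in normal]
    s = sum(w)
    return [c / s for c in w]

# A cliff face whose normal points almost along +x: the top-down (y)
# projection would stretch badly here, but its weight collapses to ~0,
# so the x projection is sampled instead.
cliff_normal = (0.995, 0.1, 0.0)
wx, wy, wz = blend_weights(cliff_normal, sharpness=4)
```

With a rotated local-space normal, the same fragment can end up with a large y component even on a cliff, so the stretched projection receives a visible weight; that is the stretching in the screenshot.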
What exactly is the reason to calculate that strange tangent space? You only seem to use it for the triplanar mapping, and there it's just wrong.
If you want something like what IndieLegion roughly explained in his post, the way he phrased it doesn't make too much sense. One way to achieve a two-texture approach would be to do the "unaltered" triplanar mapping using the world-space normal. Do this for both textures, the inner and the outer one. Then blend between the surface and the core texture based on the height of the fragment. Keep in mind, though, that with a single threshold for the blend between core and surface texture you get the blending seam at the same height everywhere on the planet. One could use a blend texture which acts like a height map to apply different blending thresholds to various locations on the planet.
Of course you could also do the triplanar mapping for 3 textures, but it quickly gets out of hand. Keep in mind that each triplanar lookup requires 3 texture lookups.
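The height-based blend suggested above could be sketched like this (standalone Python, hypothetical names, colors, and thresholds; in the real shader this would run per fragment in Cg). Both colors are assumed to come from full triplanar lookups, and a smoothstep around the threshold softens the seam:

```python
# Sketch of the height-blend idea: triplanar-sample BOTH textures, then blend
# by the fragment's height (distance from planet center) with a smoothstep
# band around a threshold.

def smoothstep(edge0, edge1, x):
    t = min(max((x - edge0) / (edge1 - edge0), 0.0), 1.0)
    return t * t * (3.0 - 2.0 * t)

def height_blend(core_rgb, surface_rgb, height, threshold, band):
    # t goes 0 -> 1 across the blend band centered on the threshold.
    t = smoothstep(threshold - band, threshold + band, height)
    return tuple(c + (s - c) * t for c, s in zip(core_rgb, surface_rgb))

# Well below the threshold: pure core texture; well above: pure surface.
low  = height_blend((0.5, 0.4, 0.3), (0.1, 0.6, 0.1),
                    height=8.0,  threshold=10.0, band=0.5)
high = height_blend((0.5, 0.4, 0.3), (0.1, 0.6, 0.1),
                    height=12.0, threshold=10.0, band=0.5)
```

Sampling a per-location threshold from a blend texture, as suggested above, would simply replace the constant `threshold` argument.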
The reason is that the triplanar shader works fine where Y is always up (0, 1, 0), but that means the mountains on my spherical terrain only render correctly at the "North Pole" and not at the sides. By rotating the normal, the idea was to redirect the sampling (the blend weights) so that the normal is relative to the up vector at its position on the sphere.
Thanks again for your input. My issue has nothing to do with world space. I can texture it just fine with triplanar using the local object normal and coordinates (the shading must be consistent with object space, not the world, since the object can rotate within it). Turns out my issue mostly seems to be that the axis planes do not get rotated with the normal, but if I did that then the UV on the Y plane would always be (0, 0), lol. Your last point was already my plan B before I posted this. ;) In fact I even considered "painting" using UVs or vertex color channels, but there can be issues with that too. I'll see if I can use multiple triplanar levels and blend based on normals and height. I already have a working shader that textures based on height from the planet center, but the effect obviously causes "rings" around steep mountains. With some randomness it looks a bit better, but not realistic enough, so I'll have to experiment a bit more with plan B. :)
Answer by hexagonius · Jul 26, 2018 at 06:14 AM
As far as I understand it, you picked two online sources, merged them, and now you don't know what you ended up with? Maybe these sites will help you understand the shader you created:
https://unity3d.com/learn/tutorials/topics/graphics/gentle-introduction-shaders
http://developer.download.nvidia.com/CgTutorial/cg_tutorial_chapter01.html
I'm not new to shaders. I just recently spent a lot of time understanding the triplanar concept, and it works great; the other Unity post shed interesting light on translating the normals for use on terraformed spheres (planets). It should work properly, and the picture clearly shows it mostly does. I'm still investigating the issue, and I'm hoping someone with more years of experience can see what I can't.