terrainData.heightmapTexture float value range
While rendering to terrainData.heightmapTexture I discovered that writing 1.0f to a pixel doesn't result in terrain at maximum height (as specified in the "Terrain Height" inspector field), but writing 0.5f does (1.0 is twice that, and not reachable with manual brush edits). It seems odd/surprising, but I expect there is a sensible reason behind it. Can somebody explain this behavior?
(The image shows the terrain after rendering a sine wave (0.0-1.0 range) to it. I was using a compute shader > Graphics.CopyTexture > terrainData.DirtyHeightmapRegion path.)
Btw, anyone can reproduce this using my code sample from this post; a rough CPU-only sketch of the same idea is included below.
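If you don't want to wire up the compute shader, a minimal sketch like the following should show the same behavior from the CPU side (this is not my exact sample; it assumes Unity 2019.1+ for TerrainData.DirtyHeightmapRegion, and the class name is just illustrative). Note the 0.5f factor, which is the subject of this question:

```csharp
using UnityEngine;

// Fills the terrain heightmap with a sine wave written directly into heightmapTexture.
[RequireComponent(typeof(Terrain))]
public class HeightmapSineWaveTest : MonoBehaviour
{
    void Start()
    {
        TerrainData terrainData = GetComponent<Terrain>().terrainData;
        RenderTexture heightmapRT = terrainData.heightmapTexture;   // R16_UNorm render texture
        int res = terrainData.heightmapResolution;

        // Build a matching R16 texture on the CPU and fill it with a sine wave.
        var tex = new Texture2D(res, res, TextureFormat.R16, false, true);
        var raw = tex.GetRawTextureData<ushort>();
        for (int y = 0; y < res; y++)
        {
            for (int x = 0; x < res; x++)
            {
                float wave01 = 0.5f + 0.5f * Mathf.Sin(x * 0.05f);  // 0..1
                float value = wave01 * 0.5f;                        // 0..0.5 spans the full "Terrain Height" range
                raw[y * res + x] = (ushort)(value * 65535f);
            }
        }
        tex.Apply();

        // Copy onto the GPU heightmap and tell the terrain system which region changed.
        Graphics.CopyTexture(tex, 0, 0, heightmapRT, 0, 0);
        terrainData.DirtyHeightmapRegion(new RectInt(0, 0, res, res),
                                         TerrainHeightmapSyncControl.HeightAndLod);
    }
}
```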
Answer by andrew-lukasik · May 08, 2019 at 09:55 PM
Correction: not a bug, it's a feature:
"The heightmap implementation itself is signed but is treated as unsigned when rendering so we only have half the precision available to use for height values. That's why all of our Terrain painting shaders clamp the returned value between 0f and .5f so that we don't end up writing signed values into the heightmap. If you were to put in values greater than .5, you'll see the Terrain surface "wrap" to negative height values. I can't say why this was done but it probably has stayed this way because it would take a lot of code changes to make either of them signed or unsigned to match.
The values are normalized so that we can get the most precision we can out of the .5f for a given Terrain's max height. 0 being a world height offset of 0 and .5f being terrain.terrainData.size.y (the max height)"
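Based on that explanation, converting a desired world-space height into a value you can write into heightmapTexture would look roughly like this (my own helper, not a Unity API):

```csharp
using UnityEngine;

public static class TerrainHeightUtil
{
    // 0.0 in the heightmap = world height offset 0,
    // 0.5 = terrainData.size.y (the "Terrain Height" inspector value).
    public static float WorldHeightToHeightmapValue(TerrainData terrainData, float worldHeight)
    {
        float normalized = Mathf.Clamp01(worldHeight / terrainData.size.y); // 0..1 of max height
        return normalized * 0.5f;                                           // remap into the usable 0..0.5 range
    }
}
```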
Answer by dan_wipf · May 06, 2019 at 06:13 AM
I think it's due to the heightmap using an unsigned 16-bit int, which has a range from 0 to 65535. I came across this when I wanted to SetHeights/GetHeights and the output was a value between 0 and 0.65535 (as the highest point on the terrain), not 0-1 as Unity says.
That is my suspicion too. For the CPU this data is formatted, like you said, as an unsigned 16-bit integer (range 0 to 65535). But for the GPU it's R16_UNORM, where UNORM describes the hardware conversion: unsigned & normalized (i.e. 0.0-1.0 as a float on read/write).
From what I can deduce, for this conversion to go astray like this the value must be (wrongly?) cast to a signed int16 just before the GPU conversion does the rest, treating 32767 as 1.0 and not 0.5 (?). But... that would mean there is a bug somewhere in the pipeline, and I don't want to jump to that conclusion too hastily.
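Just to illustrate what I mean by "wrap" (this is only my guess about the internals, not confirmed Unity behavior): if the unsigned 16-bit value were reinterpreted as signed somewhere, anything above 0.5 would flip to a negative height:

```csharp
using UnityEngine;

public class SignedWrapCheck : MonoBehaviour
{
    void Start()
    {
        // Purely illustrative: reinterpret an unsigned 16-bit UNORM value as signed.
        ushort raw = (ushort)(0.75f * 65535f);   // 49151 -- the bits a 0.75 UNORM write would store
        short asSigned = unchecked((short)raw);  // -16385 when the same bits are read as signed
        Debug.Log($"raw={raw} signed={asSigned} normalized={asSigned / 32767f}"); // ~ -0.5, i.e. a negative height
    }
}
```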