Var types and math; unexpected results?
I'm generating a seeded "random" value within a shader, over an arbitrary grid size. For example, the grid size can be 50 units, and every 50-unit range returns an identical pseudorandom value within that range.
float multiple = 50.0;
float grid = floor(vertexPosition.x / multiple);
// grid is the cell index, e.g. x position 120 -> grid 2
float p = distance(vertexPosition.x, grid * multiple) / multiple; // fraction across the cell, expected 0..1
float distortion = lerp(psuedorand(grid), psuedorand(grid + 1), p);
The problem is that p is returning unexpected values (over 1), because the distance between vertexPosition.x and grid*multiple comes out larger than 50. I can't figure out why this would happen. Surely grid*multiple must equal the start of the 50-unit cell that the vertex resides in?