Detecting corners in a grid
My problem is a little complex, so bear with me as I explain it. I'm building a Perlin noise-based procedural generation algorithm that goes through the following steps:
Generates a list of Perlin noise values.
Flattens those values into discrete values for tiles.
Instantiates a tile based on each value.
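In code, steps 1 and 2 look roughly like this (this uses Unity's Mathf.PerlinNoise; the grid size, noise scale, and number of height levels are just placeholder parameters):

using UnityEngine;

public static class HeightMapGenerator {
    // Step 1: sample Perlin noise into a grid of floats in roughly [0, 1].
    // Step 2: flatten each sample into one of a few discrete tile levels.
    public static int[,] Generate(int width, int height, float scale, int levels) {
        var tiles = new int[width, height];
        for (int x = 0; x < width; x++) {
            for (int y = 0; y < height; y++) {
                float sample = Mathf.PerlinNoise(x * scale, y * scale);
                // Quantize the continuous noise value to an integer level,
                // clamping because PerlinNoise can return values slightly outside [0, 1].
                tiles[x, y] = Mathf.Clamp(Mathf.FloorToInt(sample * levels), 0, levels - 1);
            }
        }
        return tiles;
    }
}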
The generated terrain looks something like this:
![alt text][1]

The problem I can't solve is that I want the algorithm to instantiate tiles that slope down to the lower tiles at the edges of tile regions, in areas like where the red tiles meet the green and the green meets the yellow. In short, I would like to be able to do the following:
Loop through the list of tile values.
Determine whether the tiles bordering a given tile are lower than it, and where they sit relative to the tile in question.
Based on the locations of those tiles, determine whether the tile is on an edge, a convex corner, or a concave corner.
I know this is a bit of a complicated request, but I've tried to think of a way to do this without any luck. If someone knows of a way to do this, I would appreciate it.

[1]: /storage/temp/104625-2017-10-28-progress.png
Answer by nihohit1 · Oct 29, 2017 at 07:21 AM
I assume you have a two-dimensional array of height values, generated in step 2 of your algorithm. If run time is not an issue, you can run another scan after the initial array is generated and manually check whether each tile neighbours a lower tile, using if statements that encode the conditions you described, and build a second array using the enum described below.
A more time-efficient solution would be to run a compute shader using a kernel that describes each slope direction, but that's not very simple, and I would advise trying to brute-force the issue before attempting that.
enum SlopeDirection {
    Flat,
    Up,
    Down,
    Left,
    Right,
    UpLeft,
    UpRight,
    DownLeft,
    DownRight
}
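For illustration, the scan might look something like this (a minimal sketch assuming the heights array from step 2; it only checks the four cardinal neighbours, so distinguishing concave corners would additionally require checking the diagonal neighbours):

static SlopeDirection[,] ClassifySlopes(int[,] heights) {
    int w = heights.GetLength(0), h = heights.GetLength(1);
    var slopes = new SlopeDirection[w, h];
    for (int x = 0; x < w; x++) {
        for (int y = 0; y < h; y++) {
            // A neighbour outside the grid counts as not lower.
            bool up    = y + 1 < h  && heights[x, y + 1] < heights[x, y];
            bool down  = y - 1 >= 0 && heights[x, y - 1] < heights[x, y];
            bool left  = x - 1 >= 0 && heights[x - 1, y] < heights[x, y];
            bool right = x + 1 < w  && heights[x + 1, y] < heights[x, y];

            // Two adjacent lower cardinal neighbours form a convex corner;
            // a single lower neighbour is a straight edge.
            if (up && left)         slopes[x, y] = SlopeDirection.UpLeft;
            else if (up && right)   slopes[x, y] = SlopeDirection.UpRight;
            else if (down && left)  slopes[x, y] = SlopeDirection.DownLeft;
            else if (down && right) slopes[x, y] = SlopeDirection.DownRight;
            else if (up)            slopes[x, y] = SlopeDirection.Up;
            else if (down)          slopes[x, y] = SlopeDirection.Down;
            else if (left)          slopes[x, y] = SlopeDirection.Left;
            else if (right)         slopes[x, y] = SlopeDirection.Right;
            else                    slopes[x, y] = SlopeDirection.Flat;
        }
    }
    return slopes;
}

Your step 3 could then read slopes[x, y] to pick which sloped tile to instantiate.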
Sorry for the late reply, but your first solution worked for me. Thank you!