[Shader included] Geometry shader draw order reversed in vertices above camera y-position
I've run into an issue with rendering GPU point clouds via a geometry shader.
My goal is to render large transparent point clouds with multiplicative alpha blending. I have already tried particle systems as well as CPU-side meshes that render only vertices, but neither scales well: at around 12 million voxels/particles my framerate really starts to tank. So I am now exploring compute buffers and geometry shaders, and once I have those down I will likely add compute shaders to speed things along further.
So here's the issue: It would appear that all particles above the y-position of the camera render in reverse draw order, meaning that the furthest particles are rendered on top!
I've been scratching my head over this for almost a week now, so I figured I'd see if anyone out there might be able to throw me a bone. I've tried applying this shader to a non-procedural mesh to rule out the buffers, and I've tried enabling ZWrite, but that kills the transparency. I suspect my problem involves the UnityObjectToClipPos() function, but unfortunately my unfamiliarity with matrix algebra has kept me from investigating thoroughly.
Just as a reminder: these quads need to stay transparent, and other alpha blending techniques don't satisfy the task. I want to figure out what's causing this systematic issue at its root.
Please feel free to grab the C# script and geometry shader to check it out in Unity (tested in 5.6.4f1). Simply place the C# script on an empty game object and assign the geometry shader to the script's shader field. Hitting the play button should instantiate a million quads on the GPU (takes about 2 seconds on my GTX 970).
I'm really curious why this issue is presenting itself. Thanks in advance!
-Peter
Answer by Buckslice · Nov 29, 2017 at 12:57 AM
The reason you are seeing that line in the middle is the order you spawned the particles into the buffer. When viewed from below, you are looking at the first particles to be drawn, so they can't be blended with the ones drawn behind them. When viewed from above it looks correct, because the particles at the top of the cube are drawn last.
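To see why draw order matters at all: the standard "over" alpha blend (ShaderLab's Blend SrcAlpha OneMinusSrcAlpha) is not commutative, so compositing the same two particles in opposite orders gives different results. Here's a tiny single-channel sketch (the values are made up purely for illustration):

```python
def over(dst, src_color, src_alpha):
    # standard "over" compositing: blend src onto dst
    return src_color * src_alpha + dst * (1.0 - src_alpha)

bg = 0.0                            # black background
far_color, far_alpha = 1.0, 0.5     # white particle, farther from camera
near_color, near_alpha = 0.0, 0.5   # black particle, nearer to camera

# correct: draw back to front (far first, then near)
back_to_front = over(over(bg, far_color, far_alpha), near_color, near_alpha)

# wrong: draw front to back (near first, then far)
front_to_back = over(over(bg, near_color, near_alpha), far_color, far_alpha)

print(back_to_front, front_to_back)  # 0.25 0.5 -- same particles, different result
```

This is exactly what happens above the camera's y-position: the buffer order ends up front-to-back there, so the farther particles get composited on top.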
Giving your particles a completely random position inside the cube will at least get rid of the line, but it still won't be perfect.
inputData[i].position = new Vector3(
    Random.Range(-1f, 1f) * size.x,
    Random.Range(-1f, 1f) * size.y,
    Random.Range(-1f, 1f) * size.z);
To get true translucent blending you need to draw your particles back to front relative to the camera. So somehow sort them in a compute shader, I guess? Good luck!
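The sorting criterion itself is simple: order particles by distance from the camera, farthest first. Here's a minimal CPU-side sketch in Python (the function name is mine, not from the attached scripts); at 12 million particles you'd want to do this on the GPU instead, e.g. a bitonic sort in a compute shader, rather than sorting on the CPU every frame:

```python
def sort_back_to_front(positions, cam):
    """Return positions ordered farthest-to-nearest from the camera,
    so later (nearer) particles composite over earlier (farther) ones."""
    def sq_dist(p):
        # squared distance is enough for ordering; skip the sqrt
        return sum((a - b) ** 2 for a, b in zip(p, cam))
    return sorted(positions, key=sq_dist, reverse=True)

cam = (0.0, 0.0, 0.0)
points = [(1, 0, 0), (3, 0, 0), (2, 0, 0)]
print(sort_back_to_front(points, cam))  # [(3, 0, 0), (2, 0, 0), (1, 0, 0)]
```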
-Buck
Ahh, thank you so much! In all my research into depth buffers and draw order, never did I think to look into the order of vertex creation itself. Brilliant! Time to play around with compute shaders :) Thanks again!