Particle Systems, sorting, draw order and performance
Dear all,
I'm sweating a little over how to display some sensordata in a sensible manner these days. The sensor data are basically 2D images that make up cross-sections of a body of matter inside a container, and my task is to recreate, with 3D geometry, the objects that the sensor detected. The assignment is totally analogous to the tomographical cross sections generated by a CAT scanner in hospitals, and the algorithms that subsequently reconstruct the organ that was scanned.
I'm aware that 3D geometry is commonly reconstructed from cross sections using marching cubes and edge detection, but the time I've been given doesn't allow me to implement that algorithm properly just yet, so I've opted for something simpler to begin with: particles. To fake volumetric reconstruction from the tomograms, I've instantiated collections of particles arranged in a slice of space aligned with the position where each 2D image was detected. I then run through all the particles in the slice and give each one the color of the pixel it maps to in the tomogram. I made a little sketch (all hail PowerPoint) to try and illustrate the idea. Imagine you run this sensor past a pyramid lying on its side; the resulting tomograms might look like those in the image. I then color grids of particles like the ones below. When placed close together and given a bit of jitter, they can fake the reconstruction of the pyramid well enough to be satisfactory:
Obviously, in the application, there are far more particles per slice; 4096, to be exact. To render them, I have a single particle system with a MeshParticleEmitter attached, whose mesh property is null and whose Emit property is disabled. Its animator has the "Does Animate" property set to false as well. Then, in a script, every Update I call MeshParticleEmitter.Emit a number of times with the particle positions and the color of the corresponding pixel, so I have full control over the colors and positions the particles end up at. This actually works pretty well, especially since I was able to multithread the instantiation and destruction of new slices and particle positions, so it doesn't impact the framerate.
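For clarity, the per-slice coloring step described above can be sketched in plain, engine-agnostic Python (function and parameter names are my own, not Unity API; in the real application this feeds the positions and colors to Emit):

```python
import random

def slice_to_particles(image, z, spacing=1.0, jitter=0.1):
    """Turn one 2D tomogram slice into a list of (position, color) particles.

    image   -- 2D list of pixel colors (e.g. RGB tuples), one row per scan line
    z       -- world-space position of this slice along the scan axis
    spacing -- distance between neighbouring particles in the grid
    jitter  -- max random offset per axis, to hide the regular grid pattern
    """
    particles = []
    for row, pixels in enumerate(image):
        for col, color in enumerate(pixels):
            x = col * spacing + random.uniform(-jitter, jitter)
            y = row * spacing + random.uniform(-jitter, jitter)
            particles.append(((x, y, z), color))
    return particles

# A tiny 2x2 "tomogram": red border pixels, one green interior pixel
tomogram = [[(255, 0, 0), (255, 0, 0)],
            [(255, 0, 0), (0, 255, 0)]]
particles = slice_to_particles(tomogram, z=0.0)
```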
Now, for the problem:
Particles aren't depth sorted. So when you view the slices from one of the sides, you frequently see particles belonging to slices further away drawn on top of particles belonging to slices closer to the camera. This means that looking at the slices from the far right and down "through" them, you don't see a large green square with a red border, like you're supposed to, but a constantly green particle in the middle (because all slices have a green particle in the middle) and then a mishmash of flickering red/green particles, because there's no telling which slice happened to Emit a particle first at any given time. So how do I fix that?
Are particles belonging to different emitters sorted with respect to their emitter? I.e., what if I use one particle system per slice instead of one overall particle system for all slices? Would the slices then at least be depth sorted with respect to one another (but still not individual particles in the same slice, of course)?
To be visually optimal, I need to display a total of about 16,000 particles at a time. Can I depth-sort those with bubble sort with any kind of reasonable performance, like Daniel Brauer suggested in this question? And if I do try that, what would I sort? The emitter's array, i.e. this one? Are particles rendered in the order they appear in that array, so that I would want closer particles at higher indices, overdrawing the ones further away?
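For reference, the sort I have in mind is a painter's-algorithm ordering: farthest particles first, nearest last. A minimal bubble-sort sketch over bare position tuples (plain Python, just to illustrate the ordering; the real thing would operate on the emitter's particle array):

```python
def depth_sort(particles, camera_pos):
    """Bubble-sort particle positions by squared distance to the camera,
    descending, so the farthest come first and the nearest are drawn last."""
    def sq_dist(p):
        return sum((a - b) ** 2 for a, b in zip(p, camera_pos))

    n = len(particles)
    swapped = True
    while swapped:
        swapped = False
        for i in range(n - 1):
            if sq_dist(particles[i]) < sq_dist(particles[i + 1]):
                particles[i], particles[i + 1] = particles[i + 1], particles[i]
                swapped = True
        n -= 1  # the farthest-out element has bubbled into place
    return particles
```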
...Is there an easier way to accomplish this?
If you made it this far, thanks a lot for taking the time to read this incredibly verbose question. (Brevity doesn't seem to be my strong suit) ;)
Bonus info:
The ParticleRenderer component offers the option "Sorted Billboard" under its "Stretch Particles"-property. This actually sorts the particles correctly, but, as expected, it butchers my framerate. Does anyone happen to know what algorithm it uses?
Answer by aldonaletto · Sep 16, 2011 at 11:41 AM
I'm not sure I understand exactly what you're doing, but if the particle slices were generated by different objects, you could use renderer.material.renderQueue to sort the slices. Higher values are rendered over lower ones, so you should decrement the renderQueue value with increasing distance from the camera (typical values: 2800-3000).
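A sketch of that assignment (plain Python, hypothetical helper name; the resulting values would be written to each slice's renderer.material.renderQueue):

```python
def assign_render_queues(slice_distances, base=3000):
    """Map each slice to a renderQueue value: the farthest slice gets the
    lowest queue (drawn first), the nearest gets the highest (drawn last)."""
    # Indices sorted farthest-first
    order = sorted(range(len(slice_distances)),
                   key=lambda i: slice_distances[i], reverse=True)
    queues = [0] * len(slice_distances)
    for rank, i in enumerate(order):
        queues[i] = base - len(slice_distances) + 1 + rank
    return queues
```

With three slices at distances [5, 1, 3], the farthest gets queue 2998 and the nearest 3000, staying inside the typical 2800-3000 band.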
This would partially solve the problem: the slices would be drawn in the correct order, but within each slice the particles would remain unordered. Observed at nearly perpendicular angles, particles in the same slice could overlap each other randomly.
A complete solution would require sorting all particles, as you've said - but this is very time consuming! The bubble sort algorithm could become a nightmare - it's acceptable only for small arrays: the average time spent is proportional to N², so big arrays will take an indecent amount of time. There's a better algorithm called quicksort: it's based on binary division and takes time proportional to N*log(N). Besides these two, you can find several other algorithms in this article.
Thanks a lot for answering, aldonaletto. :) I know about quicksort as well, but that algorithm is in fact not well suited to this particular purpose, for two important reasons.
First of all, the use of bubble sort is not necessarily limited to small arrays. As Daniel Brauer pointed out in the related question I linked to, bubble sort runs in near-linear time when the array is almost sorted to begin with. When depth sorting particles, that is very often the case, because the camera doesn't drastically change position between frames, so particle depths barely change. It therefore often hits its best-case complexity, which outperforms quicksort.
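To illustrate the point, here is a small sketch (plain Python, hypothetical helper name) that counts full passes; on an already-sorted array it stops after a single pass, i.e. close to O(N) work:

```python
def bubble_sort_passes(a):
    """Bubble sort that counts full passes; returns (sorted list, passes).
    It terminates as soon as a pass performs no swap, so a nearly sorted
    input needs only one or two passes instead of ~N of them."""
    a = list(a)
    passes = 0
    while True:
        passes += 1
        swapped = False
        for i in range(len(a) - 1):
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
                swapped = True
        if not swapped:
            break
    return a, passes
```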
Second of all, quicksort's basic version doesn't run in place; it requires allocating additional arrays (the less-than and greater-than partitions around the pivot) for each recursive call. It is therefore memory-inefficient when sorting large numbers of objects. Bubble sort suffers no such limitation because it sorts in place. Quicksort has a more advanced in-place sibling that mitigates this, but it's harder to implement.
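For reference, that in-place sibling (sketched here in plain Python with the Lomuto partition scheme) avoids the auxiliary arrays entirely, partitioning by swapping within the array itself:

```python
def quicksort_inplace(a, lo=0, hi=None):
    """In-place quicksort (Lomuto partition): no auxiliary arrays are
    allocated; only the recursion stack grows, O(log N) deep on average."""
    if hi is None:
        hi = len(a) - 1
    if lo >= hi:
        return a
    pivot = a[hi]
    i = lo
    for j in range(lo, hi):
        if a[j] < pivot:
            a[i], a[j] = a[j], a[i]
            i += 1
    a[i], a[hi] = a[hi], a[i]  # pivot into its final position
    quicksort_inplace(a, lo, i - 1)
    quicksort_inplace(a, i + 1, hi)
    return a
```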
The depth of particles is mostly an issue when the slices are viewed from the ends. Particles within the same slice overdrawing each other don't concern me nearly as much, so I think I'll end up going for a solution that only sorts the slices. Unfortunately, they are not currently individual GameObjects, so I can't access their renderQueue. The reason is that I'm updating their positions on a separate thread to save performance; I would lose this optimization if I made them individual GameObjects, because I can't call Instantiate or translate Transforms from a separate thread.
If you don't mind too much about the order inside a slice, you could rearrange the particles array in index ranges: place the nearest particles at 0 to 4095, the next nearest slice at 4096 to 8191, and so on. I don't know if they are drawn in index order; if they are, this could have the same effect as ordering the render queue.
+1'ing you for the help so far. :) I have a feeling we're on the same page, but I'm not entirely sure how rearranging particle array indices differs from outright sorting them. Are you talking about some sort of batch rearrangement based on their membership in the slices? Can you elaborate on this?
I don't know exactly how the particles are added to the array at first, but if you generate one slice at a time, these particles are probably in slice order: the first slice occupies the first 4096 positions, the second slice the next 4096, and so on. If this is true - and supposing particles are drawn starting at index 0 - when the camera is looking at the last slice, it will be okay, because the last slices are drawn over the previous ones. If the camera goes to the other side, you just have to swap the slice order - using your picture: swap layers 1 and 5, then swap 2 and 4 (the 3rd layer doesn't move, of course). It's just simple element swapping: swap each element from 0 to 4095 (1st slice) with its equivalent from 16384 to 20479 (5th slice), and so on. There's no need to compare distances for each element, so it may be fast enough.
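The range-swap idea can be sketched like this (plain Python, hypothetical helper name, slice size reduced for illustration; the real arrays would use 4096-particle slices):

```python
def reverse_slice_order(particles, slice_size):
    """Reverse the order of whole slices inside a flat particle array by
    swapping index ranges: slice 0 <-> slice N-1, slice 1 <-> slice N-2, ...
    No per-particle distance comparisons are needed."""
    n_slices = len(particles) // slice_size
    for s in range(n_slices // 2):
        a = s * slice_size                   # start of the front slice
        b = (n_slices - 1 - s) * slice_size  # start of the matching back slice
        for k in range(slice_size):
            particles[a + k], particles[b + k] = particles[b + k], particles[a + k]
    return particles
```

With five 2-element slices, the middle slice stays put and the outer pairs trade places, mirroring the "swap layers 1 and 5, then 2 and 4" description.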
Ahhh, I understand what you mean now. Yeah, assuming the particles are drawn in index order, that would definitely work. The problem is that the particles aren't static; they get killed and respawned all the time. I call Emit to spawn them manually, and then the particle system kills them automatically. Only their positions are pre-generated (in a separate, non-Unity class I wrote), not the actual particles themselves. And because I pick a random slice (in the separate class) to instantiate a particle in every Update, the particles can't be assumed to be in slice order in the array.
I have fixed this entire situation by running with your first suggestion, though: using a separate, prefabbed GameObject with a particle system for each slice instead of one overall system. Unity then sorts those GameObjects, giving me the correct order when viewed from the ends, and unsorted (which was okay) when viewed perpendicularly. So I consider the question closed. :-)