3D array flattening / unflattening to / from compute shader
Hello,
I'm currently using compute shaders to compute 3D arrays of voxels, and I'm running into some kind of "reversed" 3D coordinate issue: the X and Z coordinates seem to be swapped. I suspect this is caused by the way Unity transfers a 3D array between the CPU and the GPU through a compute buffer.
My question is: how does Unity flatten (and "unflatten") a 3D array when calling ComputeBuffer.GetData() or ComputeBuffer.SetData()?
When I do this on the CPU, I nest the loops in X, Y, Z order, i.e.:
// Flatten with x as the fastest-varying index:
// index = x + size * y + size * size * z
for (int x = 0; x < size; ++x)
{
    for (int y = 0; y < size; ++y)
    {
        for (int z = 0; z < size; ++z)
        {
            flattened3DArray[x + size * (y + size * z)] = actual3DArray[x, y, z];
        }
    }
}
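A quick way to sanity-check what convention this loop produces (sketched in Python, since the index arithmetic is language-agnostic): with the formula index = x + size * (y + size * z), x is the fastest-varying coordinate in the flat buffer, regardless of the loop nesting order.

```python
# Reproduce the flattening loop above and check which coordinate
# varies fastest in the flat buffer.
size = 4
# actual[x][y][z] encodes its own coordinates as 100*x + 10*y + z.
actual = [[[100 * x + 10 * y + z for z in range(size)]
           for y in range(size)]
          for x in range(size)]

flattened = [0] * size**3
for x in range(size):
    for y in range(size):
        for z in range(size):
            flattened[x + size * (y + size * z)] = actual[x][y][z]

# Consecutive flat indices step through x first, then y, then z:
assert flattened[1] == actual[1][0][0]            # index 1      -> (1, 0, 0)
assert flattened[size] == actual[0][1][0]         # index size   -> (0, 1, 0)
assert flattened[size * size] == actual[0][0][1]  # index size^2 -> (0, 0, 1)
```

So the loop order only changes the order of the writes, not the layout; the layout is entirely determined by the index formula.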
I suspect that, internally, Unity does this in Z, Y, X order instead of X, Y, Z, which would explain my issue.
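That suspicion can be checked without Unity at all. The sketch below (Python, hypothetical formulas just for illustration) writes a buffer with the x-fastest formula from the loop above, then reads it back with the opposite, z-fastest convention. The result is exactly the reported symptom: the value read at (x, y, z) is the one that was written at (z, y, x).

```python
# Demonstrate the X/Z swap that appears when the CPU write convention
# and the (hypothetical) GPU read convention disagree.
size = 4
value = lambda x, y, z: 100 * x + 10 * y + z  # encodes coordinates

# CPU side: write with x fastest, index = x + size * (y + size * z)
buf = [0] * size**3
for x in range(size):
    for y in range(size):
        for z in range(size):
            buf[x + size * (y + size * z)] = value(x, y, z)

# Shader side (hypothetical): read with z fastest,
# index = z + size * (y + size * x)
def read_mismatched(x, y, z):
    return buf[z + size * (y + size * x)]

# X and Z come back swapped:
assert read_mismatched(1, 2, 3) == value(3, 2, 1)
```

So a "reversed X and Z" symptom points at the two sides using mirror-image index formulas, not at Unity reordering anything during the copy.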
You're going to say that I should just try it and see for myself. That's true, but I'm still asking because I'm wondering whether Unity's behaviour here is well-defined, or whether it's some kind of OS- or hardware-dependent thing, which isn't something I can check by myself. And also because it's painful to set up. :D
Thanks for your help!
Answer by D43DB33F · Jun 14, 2018 at 03:58 PM
Sorry, I didn't search enough before posting:
https://stackoverflow.com/questions/21596373/compute-shaders-input-3d-array-of-floats
That seems to be what I suspected.
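One related detail worth knowing: ComputeBuffer.SetData() copies the array's elements linearly, in memory order, and C# rectangular arrays (T[x, y, z]) are stored row-major, with the last index varying fastest. That is the same layout as a default C-order NumPy array, which makes it easy to inspect. So handing a [size, size, size] array straight to SetData gives the effective formula z + size * (y + size * x), the mirror image of the x-fastest loop in the question:

```python
import numpy as np

size = 3
# a[x, y, z] in default C order mirrors how a C# T[x, y, z] array
# is laid out in memory: last index (z) varies fastest.
a = np.arange(size**3).reshape(size, size, size)
flat = a.ravel()

assert flat[1] == a[0, 0, 1]            # index 1      -> z advances first
assert flat[size] == a[0, 1, 0]         # index size   -> then y
assert flat[size * size] == a[1, 0, 0]  # index size^2 -> then x
# i.e. flat index = z + size * (y + size * x)
assert flat[2 + size * (1 + size * 0)] == a[0, 1, 2]
```

So whichever convention you pick, the safe approach is to use the same index formula on both the C# side and the HLSL side, rather than relying on the array layout implicitly.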