How to pass a RInt RenderTexture to a ComputeShader?
Hi,
I've been trying to pull Int32 data out of fragment shaders for a while now, and the most successful approach I've found so far is to write the data to RenderTextures with the ARGBFloat pixel format and then convert (cast) it in a compute shader. What I'd really like is to have the fragment shader write the data to a RenderTexture with the RInt pixel format. I've only found one post where the person claims to have done this, but that post never actually explained how they got the data out of the RInt RenderTexture afterwards.
Now, I've written fragment shaders that just write 42 to every pixel of the RenderTexture, only to get nothing but 0s when I try to read those pixels in a compute shader. I'm trying a compute shader because the Texture2D.ReadPixels method (as far as I can tell) doesn't work with pixels of type RInt. Since all I ever get is zeros, I can't tell whether I'm failing to read the data from the RenderTexture or failing to write the data into it in the first place.
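To make that concrete, the C# side of the write path looks roughly like this (a simplified sketch with placeholder names, not my actual project code):

```csharp
using UnityEngine;

public class RIntWriteSketch : MonoBehaviour
{
    // Placeholder: a material whose fragment shader just returns 42 for every pixel.
    public Material writeFortyTwoMaterial;

    public RenderTexture Target { get; private set; }

    void Start()
    {
        // RInt render target that the fragment shader writes into.
        Target = new RenderTexture(256, 256, 0, RenderTextureFormat.RInt);
        Target.Create();

        // Run the fragment shader over the whole render target.
        Graphics.Blit(null, Target, writeFortyTwoMaterial);
    }
}
```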
Does anyone know how to actually pull data from a RenderTexture with the RInt pixel format and get it into a readable format in system memory? The ARGBFloat-then-cast-to-int approach works for some of my use cases, but it's not ideal, and like I said, it only covers some of them. I have found threads where people who ran into this problem were simply told to write a native plugin for Unity, but after looking at a few tutorials on writing native plugins, I've concluded those answers aren't much help: there's a sizable gap between what a tutorial teaches you and what you need to know to access RenderTexture data. Not to mention, it would be a bit of a shock to go through all that work only to find out the issue was with the fragment shader.
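For completeness, the read side of what I'm attempting looks roughly like this (again a simplified sketch, not my actual code; my understanding is that the compute kernel should declare the texture as Texture2D<int> and write into a RWStructuredBuffer<int>, but I could be wrong about that):

```csharp
using UnityEngine;

public class RIntReadbackSketch : MonoBehaviour
{
    // Placeholder: kernel "CSMain" copies each texel of an integer texture into a structured buffer.
    public ComputeShader copyToBufferShader;

    public int[] ReadBack(RenderTexture source)
    {
        int width = source.width;
        int height = source.height;

        // Buffer the compute shader writes into (one int per pixel).
        var buffer = new ComputeBuffer(width * height, sizeof(int));

        int kernel = copyToBufferShader.FindKernel("CSMain");
        copyToBufferShader.SetTexture(kernel, "_Source", source);
        copyToBufferShader.SetBuffer(kernel, "_Result", buffer);
        copyToBufferShader.SetInt("_Width", width);

        // Assumes [numthreads(8,8,1)] in the kernel and dimensions divisible by 8.
        copyToBufferShader.Dispatch(kernel, width / 8, height / 8, 1);

        // Pull the results back into system memory.
        var values = new int[width * height];
        buffer.GetData(values);
        buffer.Release();

        return values; // every entry comes back as 0 instead of 42
    }
}
```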
What are you trying to do? There might be a simpler solution than whatever you're trying to pull out of the ComputeShader.
I'm trying to pull int values out of a fragment shader and read them on the CPU (that's the end result, at least). The ComputeShader is just an in-between step I'm trying out. In my latest use case, we want the vert IDs of the visible verts, but using a float that gets cast to an int isn't an option because the mesh has more than 2^16 verts. The reason I'm trying the RenderTexture is that RInt is one of its available pixel formats, which is exactly what I need. The RenderTexture lets me send that info somewhere that isn't a computer monitor, but I still need to get it to the CPU.