glReadPixels from a RenderTexture
Hi.
I have a camera that renders to a RenderTexture.
I would like to read that data back with glReadPixels. My native plugin gets called via GL.IssuePluginEvent in the OnPostRender method of a script attached to that camera.
Unfortunately, the pixels I get back are from my main camera view, even if I set RenderTexture.active = myRenderTexture before my native call.
The documentation at https://docs.unity3d.com/Manual/NativePluginInterface.html that discusses plugin callbacks on the rendering thread uses the words "the camera". Either they actually mean "all the cameras" or I am doing something wrong.
Has anyone been able to call glReadPixels on a RenderTexture? There are many similar questions on the site, but all of the ones I've found either predate multithreaded rendering or are unanswered.
Please help me. Thanks!
Answer by SkavenPlanet · Feb 13, 2017 at 12:12 PM
I call it in Update, but you should be able to call it in any function. You won't always get the data back immediately, though, so you should not try to read into a function's local variables. All I did to test it was download the native plugin example, modify the SetTextureFromUnity and OnRenderEvent functions, and add a ReadPixels function (and delete the code for APIs other than OpenGL, etc.), which should make it easy to replicate.
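The reason you can't read into a local variable is that GL.IssuePluginEvent only schedules the native call; the render thread fills the buffer some time later. This is a minimal C++ sketch of that deferred-write pattern, with hypothetical names (RegisterDestination, RunDeferredEvent) standing in for SetTextureFromUnity and the plugin event; it is not the Unity API itself:

```cpp
#include <cstddef>
#include <vector>

// The plugin only remembers the destination pointer now; the deferred
// render event writes into it later. The destination must therefore
// outlive the registering call -- a field, not a stack local.
static float* g_Dest = nullptr;
static std::size_t g_Count = 0;

// Analogous to SetTextureFromUnity: only records where to write.
void RegisterDestination(float* dest, std::size_t count) {
    g_Dest = dest;
    g_Count = count;
}

// Analogous to the plugin event eventually running on the render thread;
// the loop stands in for glReadPixels filling the registered buffer.
void RunDeferredEvent() {
    for (std::size_t i = 0; i < g_Count; ++i)
        g_Dest[i] = 1.0f;
}
```

If `g_Dest` pointed at a stack local that had already gone out of scope by the time the event ran, the write would be undefined behavior, which is exactly why the C# side keeps `data` as a field.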
This is the SetTextureFromUnity from RenderingPlugin.cpp
// --------------------------------------------------------------------------
// SetTextureFromUnity, an example function we export which is called by one of the scripts.
static void* g_DataHandle = 0;
static int g_TextureWidth = 0;
static int g_TextureHeight = 0;

extern "C" void UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API SetTextureFromUnity(void* dataHandle, int w, int h)
{
    g_DataHandle = dataHandle;
    g_TextureWidth = w;
    g_TextureHeight = h;
}
This is the OnRenderEvent from RenderingPlugin.cpp
static void UNITY_INTERFACE_API OnRenderEvent(int eventID)
{
    // Unknown / unsupported graphics device type? Do nothing.
    if (s_CurrentAPI == NULL)
        return;

    s_CurrentAPI->ReadPixels(g_DataHandle, g_TextureWidth, g_TextureHeight);
}
This is the actual read pixels in RenderAPI_OpenGLCoreES.cpp
void RenderAPI_OpenGLCoreES::ReadPixels(void* data, int textureWidth, int textureHeight)
{
    int currentFBORead;
    int currentFBOWrite;
    glGetIntegerv(GL_READ_FRAMEBUFFER_BINDING, &currentFBORead);
    glGetIntegerv(GL_DRAW_FRAMEBUFFER_BINDING, &currentFBOWrite);

    glBindFramebuffer(GL_READ_FRAMEBUFFER, currentFBOWrite);
    glReadPixels(0, 0, textureWidth, textureHeight, GL_RGBA, GL_FLOAT, data);
    glBindFramebuffer(GL_READ_FRAMEBUFFER, currentFBORead);

    // *You can uncomment this and pass in a pointer to a Texture2D "textureHandle" to copy the data directly into it here*
    // GLuint texResult = (GLuint)(size_t)(textureHandle);
    // glBindTexture(GL_TEXTURE_2D, texResult);
    // glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, textureWidth, textureHeight, GL_RGBA, GL_FLOAT, data);
}
I don't know if you can read directly into a Unity Texture2D, so I added code that copies the data (a float array) into one.
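The buffer glReadPixels fills with GL_RGBA/GL_FLOAT is tightly packed, with pixel (x, y) starting at index (y * width + x) * 4, which is the layout the C# unpacking loop in the script below relies on. As a hedged illustration (plain C++, not Unity or GL code), the same unpacking looks like this:

```cpp
#include <vector>

struct Color { float r, g, b, a; };

// Unpack a tightly packed RGBA float buffer (as produced by
// glReadPixels(..., GL_RGBA, GL_FLOAT, data)) into per-pixel colors.
// Pixel i occupies data[i*4] .. data[i*4 + 3] in R, G, B, A order;
// note that GL fills rows bottom-up.
std::vector<Color> UnpackRGBA(const std::vector<float>& data, int width, int height) {
    std::vector<Color> colors(static_cast<std::size_t>(width) * height);
    for (std::size_t i = 0; i < colors.size(); ++i) {
        const float* p = &data[i * 4];
        colors[i] = Color{p[0], p[1], p[2], p[3]};
    }
    return colors;
}
```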
My test Unity script.
using System.Collections;
using System.Collections.Generic;
using UnityEngine;
using System.Runtime.InteropServices;
using System;

public class Test : MonoBehaviour {

    [DllImport("RenderingPlugin")]
    private static extern void SetTextureFromUnity(float[] data, int w, int h);

    [DllImport("RenderingPlugin")]
    private static extern IntPtr GetRenderEventFunc();

    public RenderTexture rTex;
    public Texture2D tex;

    float[] data = new float[0];
    int size = 1024;

    // Use this for initialization
    void Start () {
        tex = new Texture2D (size, size, TextureFormat.RGBAFloat, false);
        tex.Apply ();
        rTex = new RenderTexture (size, size, 0, RenderTextureFormat.ARGBFloat);
        data = new float[size * size * 4];
    }

    // Update is called once per frame
    void Update () {
        ReadBack ();
    }

    void ReadBack () {
        RenderTexture.active = rTex;
        SetTextureFromUnity (data, size, size);
        GL.IssuePluginEvent (GetRenderEventFunc (), 1);

        Color[] colors = new Color[data.Length / 4];
        for (int i = 0; i < data.Length; i += 4) {
            colors [i / 4] = new Color (data [i], data [i + 1], data [i + 2], data [i + 3]);
        }
        tex.SetPixels (colors);
        tex.Apply ();
    }

    void OnGUI () {
        GUI.DrawTexture (new Rect (Vector2.zero, Vector2.one * size), rTex);
        GUI.DrawTexture (new Rect (Vector2.right * size, Vector2.one * size), tex);
    }

    void OnRenderImage (RenderTexture src, RenderTexture dest) {
        Graphics.Blit (src, rTex);
        Graphics.Blit (src, dest);
    }
}
Although I'm basically only reading the framebuffer here, you can invert the colors of rTex (or use a different RenderTexture, etc.) to see that it is actually reading from the one set active in ReadBack(). There's also code in there that turns the data array back into a texture, but that is just to show that it works.
Hi, it took them 15 hours to let your answer through. I see a couple of small differences; I'll have a look tonight and get back to you. Thanks!
Same result, Unity just crashes :( Here is my project with your changes to the native plugin example; please check whether it works on your side. https://drive.google.com/file/d/0By43hTsy2rhrVExYNmJ5WElz$$anonymous$$0E/view?usp=sharing
To be honest, I don't even think I saw the dll or bundle in the Plugins folder, but there are several errors in your project source anyway (I could not build it immediately). First, you'll want to delete all of the other APIs since we're only worried about OpenGL: delete their .cpp files and remove them from RenderAPI.cpp. Second, you're still using "g_textureHandle" as the pointer to the float array from Unity, but passing a null "g_dataHandle" pointer into the read pixels function. These should be the same variable; it doesn't matter what the name is, but it has to be declared before the SetTextureFromUnity function (look at my SetTextureFromUnity again). Then it should work.
EDIT: And if you're on Windows you might have to make sure Unity is forced to run in OpenGL mode: https://docs.unity3d.com/Manual/CommandLineArguments.html I don't know whether this is absolutely necessary, though.
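To spell out the bug described above: whatever global SetTextureFromUnity stores must be exactly what the render event later hands to ReadPixels. A minimal, non-GL sketch of the fixed pattern (OnRenderEvent here returns a bool instead of crashing, purely for illustration):

```cpp
#include <cstddef>

// One shared global, declared before SetTextureFromUnity, written by
// SetTextureFromUnity and read by the render event. Storing into one
// global (g_textureHandle) but reading another (g_dataHandle) hands a
// null pointer to glReadPixels and crashes.
static void* g_DataHandle = nullptr;
static int g_TextureWidth = 0;
static int g_TextureHeight = 0;

void SetTextureFromUnity(void* dataHandle, int w, int h) {
    g_DataHandle = dataHandle;
    g_TextureWidth = w;
    g_TextureHeight = h;
}

// Stand-in for OnRenderEvent calling s_CurrentAPI->ReadPixels(...).
bool OnRenderEvent() {
    if (g_DataHandle == nullptr)
        return false;  // the crash case: handle was never stored
    // glReadPixels(0, 0, g_TextureWidth, g_TextureHeight,
    //              GL_RGBA, GL_FLOAT, g_DataHandle);
    return true;
}
```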
Ok, yes, I used an uninitialized pointer; after I fixed that I had your example running. Knowing it's supposed to work, I tried to figure out why it's not working in my code. It seems to have something to do with GLEW: I can call base OpenGL functions like glGetIntegerv, but the moment I call a GL function defined in GLEW (e.g. glBindFramebuffer) Unity crashes, even though I have it running in OpenGL 4.5+. I'm sure I'll solve that. Many thanks for your time!!
Hey @SkavenPlanet, thank you so much for your example! This could greatly improve my performance by completely removing Texture2D.ReadPixels!
I do have one question though: I deleted all the DirectX files from the RenderingPlugin project and copied ReadPixels into both OpenGL2 and OpenGLCoreES, which works just fine. But in ReadPixels there is a commented section saying "You can uncomment this and pass in a pointer to a texture2D "textureHandle" and copy data directly into it here". Since the ReadPixels signature is
ReadPixels(void* data, int textureWidth, int textureHeight)
and since it is called with
s_CurrentAPI->ReadPixels(g_TextureHandle, g_TextureWidth, g_TextureHeight);
does that mean that the void* data from ReadPixels is the "textureHandle" Texture2D that you talked about in the ReadPixels comment? If so, then
GLuint texResult = (GLuint)(size_t)(textureHandle);
should be changed to
GLuint texResult = (GLuint)(size_t)(data);
Or did I misunderstand it?
Also, does that comment mean I won't need the following code in order to fill a Texture2D? That "tex", from "Test.cs", will be filled directly in the C++ code?
Color[] colors = new Color[data.Length / 4];
for (int i = 0; i < data.Length; i += 4)
{
    colors[i / 4] = new Color(data[i], data[i + 1], data[i + 2], data[i + 3]);
}
tex.SetPixels(colors);
tex.Apply();
Once again, thank you for your contribution! =)
For the ReadPixels signature, the "data" pointer should point to the array you want to fill with the values read from the GPU. If you pass in a textureHandle (a pointer to a Unity texture), it fills the texture by filling the underlying float array with those values, in which case you would not need to do SetPixels()/Apply() in C#. Alternatively, you can pass in a pointer directly to a float array and fill that instead, so you don't have to do GetPixels() if you just want the values. In the code I posted, you do the former with the "textureHandle" pointer and keep the line: GLuint texResult = (GLuint)(size_t)(textureHandle);
and the latter with the "data" pointer and use: GLuint texResult = (GLuint)(size_t)(data);
Thanks for the detailed explanation, I got it to work in no time!
Just one question: when I enable anti-aliasing for the RenderTexture it gives me a blank screen, some OpenGL error going on there. I found an answer at the following link but couldn't quite make it work.
Any suggestions?
Answer by ReynV · Feb 11, 2017 at 11:41 PM
It seems the FBO Unity binds to GL_READ_FRAMEBUFFER_BINDING in OnPostRender is not the same as the one being written to.
When I try to bind the read framebuffer to the same one as the write framebuffer, Unity crashes.
static void UNITY_INTERFACE_API OnGLReadPixelEvent(int eventID)
{
    int currentFBORead;
    int currentFBOWrite;
    glGetIntegerv(GL_READ_FRAMEBUFFER_BINDING, &currentFBORead);
    glGetIntegerv(GL_DRAW_FRAMEBUFFER_BINDING, &currentFBOWrite);

    // Set the read frame buffer
    glBindFramebuffer(GL_READ_FRAMEBUFFER, currentFBOWrite);

    // Read the pixels
    glReadPixels(readPixelsX, readPixelsY, readPixelsWidth, readPixelsHeight, readPixelsFormat, readPixelsType, readPixelsDestination);

    // Restore the read frame buffer
    glBindFramebuffer(GL_READ_FRAMEBUFFER, currentFBORead);
}
Is it at all possible to use glReadPixels? It seems RenderTextures need to be written to a texture before we can read them. Texture2D does not support reading ARGBInt, and even if it did, glReadPixels should be a lot faster.
Am I using glBindFramebuffer correctly? Is there a way to get Unity to bind the read framebuffer as well?
Thanks!
I just tested your code and it seems to work fine. Are you manually setting the active render texture in Unity before calling glReadPixels in your plugin?
EDIT: My bad, I just re-read the initial post. The error might be something else entirely? Do you have to call it in OnPostRender?
Hi, thanks for the response; it's so good to hear someone has something working. No, I don't need to call the plugin in OnPostRender. Where do you call it? I believe I tried it in OnPreRender as well. Sorry, a power outage is preventing me from testing at the moment. Any chance you can share your project? Thanks!!
I submitted an answer with the necessary code but I've never posted here so it's awaiting moderation or something.
Answer by XinYueVR · Aug 28, 2017 at 06:40 AM
void RenderAPI_D3D9::ReadPixels(void* bufferHandle, int textureWidth, int textureHeight)
{
    // @TODO how?
}

void RenderAPI_D3D12::ReadPixels(void* bufferHandle, int textureWidth, int textureHeight)
{
    // @TODO how?
}

void RenderAPI_D3D11::ReadPixels(void* bufferHandle, int textureWidth, int textureHeight)
{
    // @TODO how?
}
I don't think you're supposed to post that as an "answer", but take a look at this: https://forum.unity3d.com/threads/asynchronously-getting-data-from-the-gpu-directx-11-with-rendertexture-or-computebuffer.281346/