Reading RenderTexture with glReadPixels has weird alpha blending
I developed an Android plugin in Java that reads pixels from a RenderTexture using GLES2 glReadPixels.
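For context, here is a minimal sketch of what that readback might look like on the Java side. It assumes the RenderTexture's color texture ID and its dimensions are passed in from Unity (e.g. via RenderTexture.GetNativeTexturePtr()), and that the call runs on a thread with a GL context that shares the texture; the class, method and variable names are illustrative, not necessarily my actual plugin code:

```java
import android.opengl.GLES20;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public final class RenderTextureReader {

    // Hypothetical helper: textureId, width and height are assumed to come
    // from the Unity side. Attaches the RenderTexture's color texture to a
    // temporary FBO and reads its RGBA contents back into a ByteBuffer.
    public static ByteBuffer readPixels(int textureId, int width, int height) {
        // Create a temporary FBO and attach the color texture to it.
        int[] fbo = new int[1];
        GLES20.glGenFramebuffers(1, fbo, 0);
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fbo[0]);
        GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER, GLES20.GL_COLOR_ATTACHMENT0,
                GLES20.GL_TEXTURE_2D, textureId, 0);

        // glReadPixels reads from the color attachment of the currently
        // bound framebuffer; depth and stencil are not read here.
        ByteBuffer pixels = ByteBuffer.allocateDirect(width * height * 4)
                .order(ByteOrder.nativeOrder());
        GLES20.glReadPixels(0, 0, width, height,
                GLES20.GL_RGBA, GLES20.GL_UNSIGNED_BYTE, pixels);

        // Clean up the temporary FBO.
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
        GLES20.glDeleteFramebuffers(1, fbo, 0);
        return pixels;
    }
}
```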
I have elements in the scene that use alpha and different kinds of blending, for example the UI fonts, or a mask that I implemented using the stencil buffer.
I'm rendering with a camera to a RenderTexture, and that part works correctly: I can display it in a RawImage on the screen and it looks good during the game. But when I read the RenderTexture with glReadPixels, alpha blending seems to go wrong. When the background is black or very dark, the result looks perfect, but where there are light colors and white, I see weird colors in the pixels where alpha blending should occur.
As far as I understand, the color buffer of that RenderTexture should hold the final result of the render. The depth and stencil buffers are used during rendering, but the color buffer should contain the finished image at the end. Is this true, or do I have an incorrect idea about the graphics pipeline? When I call glReadPixels on the attached FBO, does that operation read only the color buffer, or does it use the depth and stencil buffers for something?
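In case it helps diagnose this, one way to see what the color buffer actually holds after rendering is to inspect the alpha channel of the read-back bytes. The sketch below assumes the ByteBuffer returned by the readback sketch above; the class and method names are made up for illustration:

```java
import android.util.Log;
import java.nio.ByteBuffer;

public final class AlphaCheck {

    // Counts opaque vs. non-opaque pixels in an RGBA readback. If alpha is
    // below 255 where the blended UI or the masked areas were drawn, the
    // blend state wrote non-opaque alpha into the color buffer, and anything
    // that re-applies that alpha when displaying the bytes will shift the
    // colors compared to what the RawImage shows.
    public static void logAlphaStats(ByteBuffer pixels) {
        pixels.rewind();
        int opaque = 0, translucent = 0;
        while (pixels.remaining() >= 4) {
            pixels.position(pixels.position() + 3); // skip R, G, B
            int alpha = pixels.get() & 0xFF;        // read A
            if (alpha == 255) opaque++; else translucent++;
        }
        Log.d("AlphaCheck", "opaque=" + opaque + ", translucent=" + translucent);
    }
}
```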
I really need to understand this better so I can find what the problem is. Thanks!