How to pass a PNG texture created in managed Unity code and render it from inside a C++ plugin?
If I load a PNG sprite file in Unity (via Resources.Load()), how do I pass sprite.texture to C++ plugin code and render it with OpenGL? I tried passing sprite.texture.GetNativeTexturePtr() to the native code and then using glBindTexture and glDrawArrays to draw the texture, but nothing shows on the screen. A rectangle appears in the correct place when I disable texturing, but nothing appears when texturing is enabled.
(The Unity native rendering plugin example project does not pass texture data from managed code; it creates the texture data inside the plugin instead.)
Native code snippet:
    // Fixed-function setup: enable vertex/texcoord arrays and 2D texturing.
    glEnableClientState(GL_TEXTURE_COORD_ARRAY);
    glEnableClientState(GL_VERTEX_ARRAY);
    glEnable(GL_TEXTURE_2D);

    // texID is the GL texture name obtained from GetNativeTexturePtr().
    GLuint texID = (GLuint)(size_t)(texturePtr);
    glBindTexture(GL_TEXTURE_2D, texID);
    glColor4f(1.0f, 1.0f, 1.0f, 1.0f);

    // Quad as a triangle strip: (x,y), (x+w,y), (x,y+h), (x+w,y+h).
    const float verts[] = {x, y, x+w, y, x, y+h, x+w, y+h};
    const float texCoords[] = {0,0, 1,0, 0,1, 1,1};

    glVertexPointer(2, GL_FLOAT, 0, verts);
    glTexCoordPointer(2, GL_FLOAT, 0, texCoords);
    glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
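
For context, the texture pointer crosses the managed/native boundary roughly as sketched below: the C# side calls an exported setter with sprite.texture.GetNativeTexturePtr(), and the drawing runs later on the render thread (e.g. via GL.IssuePluginEvent). The names SetTextureFromUnity and DoRender are only illustrative, not the exact exports in my project:

    // Minimal sketch of the managed/native handover; names are illustrative.
    #if defined(_WIN32)
    #define EXPORT_API __declspec(dllexport)
    #else
    #define EXPORT_API
    #endif

    static void* g_TexturePtr = nullptr; // value from sprite.texture.GetNativeTexturePtr()

    // Called from C# once the sprite has been loaded.
    extern "C" EXPORT_API void SetTextureFromUnity(void* texturePtr)
    {
        g_TexturePtr = texturePtr;
    }

    // Called on Unity's render thread (e.g. via GL.IssuePluginEvent) so the
    // GL context that owns the texture is current when drawing happens.
    extern "C" EXPORT_API void DoRender(int /*eventID*/)
    {
        // The glBindTexture / glDrawArrays code from the snippet above runs
        // here, using g_TexturePtr as texturePtr.
    }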