Is RenderTexture simultaneous writing and sampling possible?
Hi,
I've been struggling with this one for a while: I have a camera rendering a scene into a render texture. This rendering uses Unity's shader replacement mechanism to render objects with a custom shader. The pixel shader first samples this render texture (the very same texture the camera is rendering to) and then modulates the sampled pixel color (a paint application). This results in a kind of incremental modification of the render texture. Everything works as expected with OpenGL, but D3D(11) seems a little more picky: it looks like the texture being sampled is... just another arbitrary texture.
Hope this makes sense, kinda difficult to explain.
I of course can provide source code if relevant.
Answer by wibble82 · Dec 15, 2015 at 11:55 AM
Short answer: not really!
Longer answer: The behavior is undefined. Even if it appears to work, you almost certainly aren't truly reading and writing the texture simultaneously. Depending on the underlying implementation, you may be getting away with reading last frame's result while writing this frame's.
To do this correctly, look into how post-effect shaders work in Unity. It is entirely possible to define extra buffers and explicitly save last frame's result so it can be sampled this frame. Here's a great tutorial on the subject:
http://www.alanzucconi.com/2015/07/08/screen-shaders-and-postprocessing-effects-in-unity3d/
Thanks for your answer. "Simultaneous" is indeed incorrect, "sampling the render texture currently being rendered" might be more relevant.
I read through the link you provided but I could not find any mention of frame buffer management. Did I miss something?
Hmmm - it doesn't actually show the saving of frame buffers, now that I think about it - just the use of the OnRenderImage function (which is the key bit).
If you look at that tutorial, you will notice that what they're doing is:
- Taking 2 render textures as inputs to the function
- The source is what the camera generates through rendering
- The destination is what will be output (either to the screen, or the next camera)
- They then 'blit' the image from source to destination, using a shader to apply extra effects
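The steps above can be sketched as a minimal script on the camera. This is a generic post-effect skeleton, not the tutorial's exact code; the material/shader it uses is up to you:

```csharp
using UnityEngine;

// Attach this to the camera. Unity calls OnRenderImage after the camera
// finishes rendering, handing us its output in 'source'.
[RequireComponent(typeof(Camera))]
public class SimplePostEffect : MonoBehaviour
{
    public Material effectMaterial; // material built from your custom shader

    void OnRenderImage(RenderTexture source, RenderTexture destination)
    {
        // Blit copies 'source' into 'destination', running the
        // material's shader over every pixel along the way.
        Graphics.Blit(source, destination, effectMaterial);
    }
}
```

Note that source and destination are distinct textures here, which is exactly what sidesteps the read/write hazard from the original question.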
If you have a script like they do, you can just as easily define your own extra RenderTextures, literally by creating them as needed. In your case you'd probably want code that:
- Keeps an extra buffer called 'previous'
- Uses a shader that takes the source and the previous buffer as input, and uses them to blit to the destination
- Blits the source to previous (or maybe the destination to previous), so you can use it next frame
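Those three steps might look roughly like this. It's a sketch, not tested code: the `_PrevTex` property name is an assumption, and your shader would need to declare and sample it:

```csharp
using UnityEngine;

// Keeps a copy of last frame's output in 'previous' so the shader can
// sample it this frame, instead of reading the texture being written.
[RequireComponent(typeof(Camera))]
public class AccumulatingEffect : MonoBehaviour
{
    public Material effectMaterial; // shader samples _MainTex and _PrevTex
    private RenderTexture previous;

    void OnRenderImage(RenderTexture source, RenderTexture destination)
    {
        if (previous == null)
        {
            previous = new RenderTexture(source.width, source.height, 0, source.format);
            Graphics.Blit(source, previous); // seed with the first frame
        }

        // The shader reads both the fresh render and last frame's result.
        effectMaterial.SetTexture("_PrevTex", previous);
        Graphics.Blit(source, destination, effectMaterial);

        // Save this frame's output so it's available next frame.
        Graphics.Blit(destination, previous);
    }

    void OnDestroy()
    {
        if (previous != null) previous.Release();
    }
}
```

This gives you the incremental "paint" accumulation explicitly, rather than relying on driver-dependent behavior.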
Check out the post-effect shaders that ship with Unity for extra examples - the motion blur post-effect scripts do exactly this.
-Chris
Ok, Graphics.Blit() seems to be the key. Will try asap. Thanks!