How can I set the exact camera render size for render textures?
I'm looking to have a camera render only at a certain size (for example, 512x512), but specifying a RenderTexture size doesn't change the camera bounds. How would I go about changing the bounds to reflect this? And how can I be sure that the camera is only rendering at this size and nothing more?
I'm asking for optimization reasons, among others: I want a small rendering area so it renders quickly and efficiently.
If the camera is writing to a 512x512 texture (it actually isn't, but let's assume it is), then that's exactly what happens. The camera doesn't render any more pixels than its render target has.
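A minimal Unity C# sketch of this setup (the class and field names here are just illustrative; `RenderTexture`, `Camera.targetTexture`, and `RenderTexture.Release` are standard UnityEngine API): assigning a 512x512 RenderTexture as the camera's target means the camera rasterizes exactly those 512x512 pixels.

```csharp
using UnityEngine;

public class RenderToSmallTexture : MonoBehaviour
{
    public Camera cam;        // assign the camera in the Inspector
    RenderTexture rt;

    void Start()
    {
        // width, height, depth-buffer bits
        rt = new RenderTexture(512, 512, 24);
        // the camera now renders only into rt, never to the screen
        cam.targetTexture = rt;
    }

    void OnDestroy()
    {
        cam.targetTexture = null;
        rt.Release();
    }
}
```

With `targetTexture` set, the camera's pixel rect is the texture's size, so the fill cost is bounded by the texture, not the screen resolution.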
But when the camera has a RenderTexture target, it still shows its full bounds in the editor, as if it were rendering to the screen, and the RenderTexture doesn't receive the full render area of the camera — it's always smaller and cut off. This is the part that is confusing me.
You just have to keep the camera viewport and the render texture at the same proportions (or the same sizes) to get the same image in both. The camera renders to the texture as if the viewport had the render texture's proportions.
But it doesn't do that. If the camera has a RenderTexture as its target, it just renders into that. If the RenderTexture is 512x512, then those are the camera's pixel dimensions.
I think I've just found what I wanted. If I force camera.aspect to the value I want, it renders to all the texture's pixels instead of cropping to the 2:1 aspect ratio of the texture.
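The fix above can be sketched like this (assumed standard UnityEngine API; `target` is a hypothetical field for whatever RenderTexture you use): setting `Camera.aspect` to the texture's width/height ratio stops the output being cropped to the camera's previous aspect.

```csharp
using UnityEngine;

public class MatchAspectToTexture : MonoBehaviour
{
    public Camera cam;
    public RenderTexture target;   // e.g. a 512x512 texture

    void Start()
    {
        cam.targetTexture = target;
        // force the camera's aspect ratio to match the texture,
        // e.g. 512f / 512f = 1.0, so the full texture area is used
        cam.aspect = (float)target.width / target.height;
    }
}
```

Calling `Camera.ResetAspect()` later would restore the automatically computed aspect if the camera goes back to rendering to the screen.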