Best way to downsample camera viewport?
I am making a mobile FPS with a retro aesthetic and am trying to figure out a good way to render the camera output at a quarter of the native resolution, then scale it up to fill the screen - both to achieve the look I want and for performance reasons.
I have tried these methods:
1. Simply change the viewport rect values of the camera to 0.5. This downscales it well, but I can't figure out a way to then actually scale the rect up to fill the screen.
2. Change the resolution of the game overall. This has two issues: it doesn't downsample using point interpolation, so it looks blurred rather than pixelated, and it downsamples everything including the UI, which I do not want.
3. Use a render texture and a separate camera. This works the best of the approaches I've tried and produces a result that looks exactly like I want - with no AA or blurred interpolation and working on a specific camera. The issue with this method, though, is twofold: Partly I worry about performance since I'm now just trading off resolution for texture memory, and partly it's practically frustrating to work with two cameras rather than one.
Is there a better, simpler way to do this? Ideally I'd like a straightforward solution that does nothing but render a quarter-sized viewport and then upscale it to fill the screen. Interestingly, if I change the viewport rect of the camera as in method 1 above and then use the editor "scale" slider, I can sort of emulate the look I'm going for - is there a native way to do this at runtime, perhaps?
As a side note I also attempted using an image effect to achieve a "pixelated" look but since this doesn't offer any actual performance boosts it seems like a bad approach in this case.
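For reference, method 3 can be collapsed onto a single camera by rendering into a reduced-size RenderTexture and stretching it back over the screen with IMGUI instead of a second camera. This is a minimal sketch, not a standard Unity recipe; the class name and the OnGUI blit are my own choices:

```csharp
using UnityEngine;

// Sketch of the render-texture approach with one camera: render into a
// half-size-per-axis (i.e. a quarter of the pixels) RenderTexture with
// point filtering, then stretch it over the full window via IMGUI.
[RequireComponent(typeof(Camera))]
public class QuarterResBlit : MonoBehaviour
{
    [Range(0.1f, 1.0f)]
    public float renderScale = 0.5f; // 0.5 per axis = 1/4 of the pixels

    private Camera cam;
    private RenderTexture lowRes;

    void OnEnable()
    {
        cam = GetComponent<Camera>();
        int w = Mathf.Max(1, (int)(Screen.width * renderScale));
        int h = Mathf.Max(1, (int)(Screen.height * renderScale));
        lowRes = new RenderTexture(w, h, 24);
        lowRes.filterMode = FilterMode.Point; // pixelated upscale, no blur
        cam.targetTexture = lowRes;           // camera now renders off-screen
    }

    void OnDisable()
    {
        cam.targetTexture = null;
        if (lowRes != null) lowRes.Release();
    }

    // IMGUI draws to the screen, so it can stretch the low-res result over
    // the full window without a second camera. Screen-space UI stays at
    // native resolution.
    void OnGUI()
    {
        GUI.DrawTexture(new Rect(0, 0, Screen.width, Screen.height),
                        lowRes, ScaleMode.StretchToFill);
    }
}
```

This keeps one camera, but note that creating the RenderTexture still trades resolution for texture memory, as the question points out.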
Answer by kataS_94 · Aug 26, 2017 at 08:20 PM
I had the same problem until I finally found a really simple way to do it. However, it still has the problem of blurred interpolation during the upscale, and I don't really know how to solve it at the moment. Here is my script:
using UnityEngine;

[ExecuteInEditMode]
[RequireComponent(typeof(Camera))]
public class KCameraResolutionScaler : MonoBehaviour
{
    public new Camera camera;
    [Range(0.01f, 1.0f)]
    public float renderScale = 1.0f;
    public FilterMode filterMode = FilterMode.Bilinear;

    private Rect originalRect;
    private Rect scaledRect;

    void Awake()
    {
        // Fall back to the camera on this GameObject if none was assigned.
        if (camera == null)
            camera = GetComponent<Camera>();
    }

    void OnDestroy()
    {
        camera.rect = originalRect;
    }

    // Shrink the viewport rect just before the camera renders...
    void OnPreRender()
    {
        originalRect = camera.rect;
        scaledRect.Set(originalRect.x, originalRect.y,
                       originalRect.width * renderScale,
                       originalRect.height * renderScale);
        camera.rect = scaledRect;
    }

    // ...then restore it and upscale the reduced render to the full viewport.
    void OnRenderImage(RenderTexture src, RenderTexture dest)
    {
        camera.rect = originalRect;
        src.filterMode = filterMode;
        Graphics.Blit(src, dest);
    }
}
It basically sets a rescaled viewport rect on the camera before every render and then restores the original value before the result goes to the screen. I have tested it in a mobile game that I am currently developing and noticed a great boost in performance without losing resolution on the game UI.
EDIT: To be honest, I didn't find a way to set the scaling filter because I didn't try. I have edited the code above so it supports a preferred filter mode x).
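For the pixelated look the question asks for, the filter mode can be set to Point, either in the inspector or from code. A hypothetical setup snippet, assuming the scaler component from the script above:

```csharp
// Illustrative setup: attach the scaler to the main camera at runtime and
// configure it for quarter resolution with point (nearest) filtering.
var scaler = Camera.main.gameObject.AddComponent<KCameraResolutionScaler>();
scaler.camera = Camera.main;
scaler.renderScale = 0.5f;            // half per axis = a quarter of the pixels
scaler.filterMode = FilterMode.Point; // pixelated upscale, no bilinear blur
```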
This is brilliant, and the ONLY way to create dynamic resolution in DX11 on the built-in rendering pipeline. (Unity's DX12 is unusable even in 2022! And their SRP pipelines are still not production ready...)