Why is Deferred rendering making everything so pixelated?
So I'm experimenting a little with SSR, and as soon as I flip the rendering path over to Deferred, all edges start to look jagged. I assume there's some sort of setting for this, but I discover that anti-aliasing won't make a difference anymore. I ask a programmer (I'm an artist) and he tells me deferred rendering draws to an off-screen buffer, so anti-aliasing doesn't work on the same principle as it does in forward rendering. I'm thinking "okay, so maybe this off-screen buffer has a setting somewhere?" so that Unity doesn't look like seven harsh years of starvation. But no, I can't find anything. I look at tech demos, and most of them use SSR but have no jagged lines(!). Am I missing something, or am I just naive in thinking that Unity would actually let me have the nice rendering they flaunt in their demos?
Answer by tanoshimi · Sep 28, 2016 at 08:36 PM
If you use the anti-aliasing image effect attached to your camera (imported from Standard Assets), it should work whichever rendering path you use (unlike the old anti-aliasing quality setting, which is MSAA and only works in forward rendering).
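For reference, a minimal sketch of what that looks like in a script, assuming you've imported the legacy Standard Assets "Effects" package (which provides the `Antialiasing` component in the `UnityStandardAssets.ImageEffects` namespace); the component can equally be added by hand in the Inspector:

```csharp
using UnityEngine;
using UnityStandardAssets.ImageEffects; // assumes Standard Assets Effects package is imported

[RequireComponent(typeof(Camera))]
public class EnableDeferredAA : MonoBehaviour
{
    void Start()
    {
        // Add the post-process anti-aliasing image effect to the camera.
        // Unlike the MSAA quality setting, this runs as a screen-space pass,
        // so it works with the deferred rendering path.
        if (GetComponent<Antialiasing>() == null)
            gameObject.AddComponent<Antialiasing>();
    }
}
```

The key point is that this is post-process (screen-space) anti-aliasing, applied to the finished image, rather than hardware MSAA, which deferred rendering can't use.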