How to properly set up quad positions for 2d with an orthographic camera?
For once, a brief question, as a corollary to my longer one here:
http://answers.unity3d.com/questions/20459/how-to-retain-full-quality-on-a-rotated-textured-quad
Basically, I'm still seeing quality-drop issues where my sprites look fuzzy (see the examples in the question linked above), but ONLY at specific resolutions. Or rather, I'm seeing it at all resolutions except a few odd ones. This is most evident when resizing the Unity editor window and watching what happens. At 1267x721 it is as crystal clear as can be. At any sane resolution, and most other nonstandard resolutions to boot, it looks muddy; depending on the resolution it looks more or less so. Oddly, the closer a resolution's dimensions trend toward prime numbers, the clearer it seems to be.
The code I'm using to set my orthographic camera size is simply:
double orthoSize = Game.Instance.MainCameraCamera.pixelHeight / 2.0;
Game.Instance.MainCameraCamera.orthographicSize = (float)orthoSize;
And then I'm using Game.Instance.MainCameraCamera.ScreenToWorldPoint in order to translate my screen coordinates to world coordinates. I've tried doing some stuff with the following:
WorldUnitsPerScreenPixel = ( Game.Instance.MainCameraCamera.orthographicSize * 2f )
    / Game.Instance.MainCameraCamera.pixelHeight;
But no matter what I multiply by this (quad scale, position, etc), it doesn't seem to really have any effect.
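For example, one thing I tried was snapping quad positions to whole pixels with that factor, roughly like the sketch below (quadTransform is just an illustrative name, not the actual code from my project):
// Sketch of the snapping I tried: round a quad's world position to whole screen pixels.
// Assumes the orthographic setup above, so worldUnitsPerScreenPixel normally comes out to 1.
Camera cam = Game.Instance.MainCameraCamera;
float worldUnitsPerScreenPixel = ( cam.orthographicSize * 2f ) / cam.pixelHeight;
Vector3 p = quadTransform.position;   // quadTransform: whatever transform the quad uses (illustrative)
p.x = Mathf.Round( p.x / worldUnitsPerScreenPixel ) * worldUnitsPerScreenPixel;
p.y = Mathf.Round( p.y / worldUnitsPerScreenPixel ) * worldUnitsPerScreenPixel;
quadTransform.position = p;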
Presumably I'm missing some critical step for fully setting up an orthographic camera for use in 2D. Any ideas? Or, for that matter, if there's a setup pattern for using a perspective camera, then I'd be fine with that, too. I've also experimented around there, but I can't figure out the relationship between Z depth, fov, and the screen resolution for getting pixel-perfect results.
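For reference, my understanding of that perspective-camera relationship is roughly the following sketch (I haven't verified it's exactly right): pull the camera back far enough that the vertical field of view covers exactly Screen.height pixels at the z = 0 plane.
// Rough sketch, not verified: distance at which 1 world unit == 1 screen pixel at z = 0.
Camera cam = Game.Instance.MainCameraCamera;
float halfFovRadians = cam.fieldOfView * 0.5f * Mathf.Deg2Rad;   // fieldOfView is the vertical FOV, in degrees
float distance = ( Screen.height * 0.5f ) / Mathf.Tan( halfFovRadians );
cam.transform.position = new Vector3( Screen.width * 0.5f, Screen.height * 0.5f, -distance );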
Update
So very close on this one. The example project I put together shows something extremely curious, though: textures positioned on half-pixel coordinates show up pixel-perfect, whereas textures positioned on whole-pixel coordinates show up blurry.
The camera transform that Eric posted below is absolutely awesome, and it let me stop doing the stupid Camera.ScreenToWorldPoint conversion all over the place, which is wonderful. Those ideas are incorporated into the example project.
I should note that the example project uses Graphics.DrawMeshNow and imports the textures asynchronously at runtime via the WWW class, but as you can see, the texture itself draws perfectly fine... as long as its transform is at a half-pixel offset.
I suppose I can just offset everything by 0.5 if I have to, and that seems to work reliably, but it's just hokey enough that it makes me worry that some user, somewhere, is going to get majorly messed up because of that. Any ideas on this last part?
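To be concrete, the workaround amounts to something like this when drawing each quad with Graphics.DrawMeshNow (the names here are illustrative, not copied from the project):
// Illustrative sketch of the half-pixel workaround when drawing with DrawMeshNow.
// pixelX / pixelY, quadMaterial, and quadMesh are placeholder names.
Vector3 drawPos = new Vector3( pixelX + 0.5f, pixelY + 0.5f, 0f );
quadMaterial.SetPass( 0 );                                   // material pass must be set before DrawMeshNow
Graphics.DrawMeshNow( quadMesh, drawPos, Quaternion.identity );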
Answer by Eric5h5 · Oct 07, 2010 at 02:43 AM
For the camera, I use
camera.orthographicSize = Screen.height/2; // half the screen height: 1 world unit == 1 screen pixel
transform.position = Vector3(Screen.width/2, Screen.height/2, -1); // camera centered on the screen, pulled back on z
Then transform.position.x and transform.position.y for a quad are the number of pixels across and up. e.g.,
transform.position = Vector2(100, 35);
positions the quad 100 pixels over and 35 pixels up. Exactly how the quad is positioned depends on where its pivot point is.
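A rough C# equivalent of the setup above, attached to the camera object, would be something like the following (the class name is just illustrative):
// Rough C# equivalent of the camera setup above.
using UnityEngine;

public class PixelPerfectCameraSetup : MonoBehaviour {
    void Start () {
        Camera cam = GetComponent<Camera>();
        cam.orthographic = true;
        cam.orthographicSize = Screen.height / 2f;                                      // 1 world unit == 1 pixel
        transform.position = new Vector3( Screen.width / 2f, Screen.height / 2f, -1f ); // centered on the screen
    }
}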
Hi Eric, thanks so much for posting this. I've posted an update above, with an example project, showing something kind of curious: I was able to get this working with your approach here, but only if I add 0.5f to the x/y coordinates of every quad I render. Any ideas on that one part? The example project shows the sort of code path I'm using and demonstrates the issue -- hopefully that makes it easier to see. Thanks again for your help!
@x4000: a bit hard to say since you're making the mesh on the fly, but I'd guess it has to do with the way the mesh is constructed (related to my comment about the pivot point). If you offset the quads instead, then you shouldn't have to offset the coordinates.
Answer by Anton Grigoryev · Dec 30, 2010 at 10:10 AM
In the code from your example project, in MainCameraScript.cs, set
texture1.mipMapBias = 0;
and everything will be pixel-perfect in your project! You don't need to add the magic 0.5f to the position anymore! Be happy!)) Merry Christmas and Happy New Year!
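I don't know exactly how MainCameraScript.cs creates the texture, but I guess it is something like this with the WWW class, and the fix goes right after (the surrounding lines are my guess, not copied from your project):
// My guess at where this goes in MainCameraScript.cs.
Texture2D texture1 = www.texture;   // texture created from the asynchronous WWW download
texture1.mipMapBias = 0;            // neutral mip bias; no magic 0.5f offset needed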
Unfortunately, that doesn't work when I test it. I'd tried that in the past, and simply setting that and removing the 0.5f doesn't give any improvement over the original complaint. Sorry -- but thanks for the note!
It's strange, because in my Unity 3.1 it works! What version do you have?