GameObject GUI?
From what I've gathered around the internet, Unity's built-in GUI leaves a lot to be desired and isn't very efficient. Instead, building GUIs out of GameObjects with textures and colliders seems to be the way to go. I'm guessing input handling works through raycasting? I just need some pointers on how to go about making this (rough sketch of what I mean below).
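For reference, this is a minimal, untested sketch of the collider/raycast approach I have in mind: each "button" is a quad with a collider, and a script on the camera raycasts on click. The WorldSpaceMenu class and OnButtonHit message name are just placeholders I made up.

```csharp
using UnityEngine;

// Attach to the camera that looks at the menu. Each button quad needs a collider
// and its own small script with an OnButtonHit() method to respond to clicks.
public class WorldSpaceMenu : MonoBehaviour
{
    void Update()
    {
        if (Input.GetMouseButtonDown(0))
        {
            Ray ray = Camera.main.ScreenPointToRay(Input.mousePosition);
            RaycastHit hit;
            if (Physics.Raycast(ray, out hit, 100f))
            {
                // Let whatever was clicked decide what to do about it.
                hit.collider.SendMessage("OnButtonHit", SendMessageOptions.DontRequireReceiver);
            }
        }
    }
}
```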
I realize that Unity is currently remaking their GUI, but for the time being, I need a GUI that'll be flexible and easy to change.
If you are an adept programmer, then you can roll your own, but there are some complicated things that packages like NGUI and EZGUI do that are not quickly duplicated. For example:
They build and use texture atlases. That is, large parts of the UI are drawn with a single material whose large texture is UV-mapped into individual quads (see the first sketch below).
They use a bitmapped font and display the text as quads in a dynamically created mesh, using kerning to lay out the quads (see the second sketch below).
There are lots of other things they do, but at a minimum I'd think you need the first, and probably the second, for building an interface out of world objects to make sense. If you don't do these things, you'll end up with an interface that is no more efficient than Unity's GUI in terms of draw calls.
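To make the first point concrete, here is a rough sketch (not taken from any particular package) of a single quad whose UVs index into a shared atlas texture. AtlasQuad, atlasMaterial, and atlasRect are made-up names; every UI quad would reference the same material so they can batch into one draw call.

```csharp
using UnityEngine;

// One UI quad whose UVs select a sub-rectangle of a big shared atlas texture.
public class AtlasQuad : MonoBehaviour
{
    public Material atlasMaterial;                       // shared by every UI quad
    public Rect atlasRect = new Rect(0f, 0f, 0.25f, 0.25f); // normalized (0..1) atlas coords
    public Vector2 size = new Vector2(1f, 1f);           // quad size in world units

    void Start()
    {
        Mesh mesh = new Mesh();
        mesh.vertices = new Vector3[] {
            new Vector3(0f, 0f, 0f),
            new Vector3(size.x, 0f, 0f),
            new Vector3(0f, size.y, 0f),
            new Vector3(size.x, size.y, 0f)
        };
        // The UVs pick one sprite's sub-rectangle out of the atlas.
        mesh.uv = new Vector2[] {
            new Vector2(atlasRect.xMin, atlasRect.yMin),
            new Vector2(atlasRect.xMax, atlasRect.yMin),
            new Vector2(atlasRect.xMin, atlasRect.yMax),
            new Vector2(atlasRect.xMax, atlasRect.yMax)
        };
        mesh.triangles = new int[] { 0, 2, 1, 2, 3, 1 };

        gameObject.AddComponent<MeshFilter>().mesh = mesh;
        // sharedMaterial keeps all quads on the literal same material so they batch.
        gameObject.AddComponent<MeshRenderer>().sharedMaterial = atlasMaterial;
    }
}
```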
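And a similarly rough sketch of the second point: one quad per character, all in a single dynamically built mesh, laid out by advancing a cursor per glyph. The BitmapText class, Glyph struct, and glyphs table are hypothetical; a real implementation reads glyph UVs, advances, and a kerning pair table from the file exported by a bitmap font tool.

```csharp
using System.Collections.Generic;
using UnityEngine;

public class BitmapText : MonoBehaviour
{
    public struct Glyph { public Rect uv; public float advance; }

    public Material fontAtlasMaterial;                 // the font's atlas texture
    public Dictionary<char, Glyph> glyphs = new Dictionary<char, Glyph>();

    public void Build(string text, float charHeight)
    {
        var verts = new List<Vector3>();
        var uvs = new List<Vector2>();
        var tris = new List<int>();
        float cursorX = 0f;

        foreach (char c in text)
        {
            Glyph g;
            if (!glyphs.TryGetValue(c, out g)) continue;

            int i = verts.Count;
            float w = charHeight * (g.uv.width / g.uv.height); // rough aspect from UVs
            verts.Add(new Vector3(cursorX, 0f, 0f));
            verts.Add(new Vector3(cursorX + w, 0f, 0f));
            verts.Add(new Vector3(cursorX, charHeight, 0f));
            verts.Add(new Vector3(cursorX + w, charHeight, 0f));

            uvs.Add(new Vector2(g.uv.xMin, g.uv.yMin));
            uvs.Add(new Vector2(g.uv.xMax, g.uv.yMin));
            uvs.Add(new Vector2(g.uv.xMin, g.uv.yMax));
            uvs.Add(new Vector2(g.uv.xMax, g.uv.yMax));

            tris.AddRange(new int[] { i, i + 2, i + 1, i + 2, i + 3, i + 1 });

            cursorX += g.advance; // per-glyph spacing; real kerning adds a pair adjustment
        }

        Mesh mesh = new Mesh();
        mesh.vertices = verts.ToArray();
        mesh.uv = uvs.ToArray();
        mesh.triangles = tris.ToArray();

        gameObject.AddComponent<MeshFilter>().mesh = mesh;
        gameObject.AddComponent<MeshRenderer>().sharedMaterial = fontAtlasMaterial;
    }
}
```

Because every character quad shares the one font material, an entire string renders in a single draw call instead of one per label.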
I'm looking to make a main menu with the GUI, so efficiency isn't too much of an issue, despite what I said in the question. Looking at EZGUI and NGUI, I'm sure I can recreate most of their basic effects myself, so I can't really justify spending that much on them. Great answer though, thanks!