Only RGBA32 works when setting Texture2D pixels on iOS?
I recently hit an issue on a personal project where one of my textures wasn't showing up in the iOS build. The texture in question is a Texture2D that is created manually from a sprite sheet via Texture2D.SetPixels and then applied to a GUITexture icon on demand.
This requires that the texture be read/write enabled and have an alpha channel.
This works fine with RGBA Compressed DXT5 on OS X, but on iOS I have only been able to use this method with RGBA32 (which is a whopping 1 MB for a 512x512 texture). It doesn't seem to work with any of the compressed formats, and to my surprise it didn't even work with RGBA16.
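For reference, this is roughly what I'm doing. A simplified sketch; the sheet and cell fields are placeholders for my actual setup:

```csharp
using UnityEngine;

public class SpriteSheetIcon : MonoBehaviour
{
    public Texture2D spriteSheet; // must be read/write enabled in its import settings
    public int cellX = 0;         // column of the icon within the sheet
    public int cellY = 0;         // row of the icon within the sheet
    public int cellSize = 64;     // square cell size in pixels

    void Start()
    {
        // Copy one cell's worth of pixels out of the sheet.
        Color[] pixels = spriteSheet.GetPixels(cellX * cellSize, cellY * cellSize,
                                               cellSize, cellSize);

        // RGBA32 is the only format this has worked with on iOS so far.
        Texture2D icon = new Texture2D(cellSize, cellSize, TextureFormat.RGBA32, false);
        icon.SetPixels(pixels);
        icon.Apply();

        GetComponent<GUITexture>().texture = icon;
    }
}
```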
Is this really a restriction of this method on iOS? It almost seems worth my while to break the sprite sheet up into micro textures instead.
Regards, Hobsie
I don't know the "official" answer, but I suspect that to write pixels to a compressed texture, the texture first has to be decompressed; you then write the pixels, and the texture is re-compressed. That means Unity would need to know how to decompress and re-compress textures at runtime. I suspect that Unity cannot do this since, I believe, conversion into iOS compressed formats is handled by ImgTech's PVRTexTool when you make a build.
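If it helps, I believe SetPixels only supports a few uncompressed formats anyway, so you could fail fast with a format check before writing. An untested sketch; the list of formats is illustrative, not exhaustive:

```csharp
using UnityEngine;

public static class TextureWriteCheck
{
    // SetPixels works on a handful of uncompressed formats; compressed
    // textures (DXT, PVRTC, ...) reject writes at runtime.
    public static bool IsWritableFormat(Texture2D tex)
    {
        switch (tex.format)
        {
            case TextureFormat.RGBA32:
            case TextureFormat.ARGB32:
            case TextureFormat.RGB24:
            case TextureFormat.Alpha8:
                return true;
            default:
                return false;
        }
    }
}
```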
That seems rather annoying but makes a lot of sense given what I'm seeing.
I was hoping to build my GUI without OnGUI methods (for performance on iOS) or 3D primitives, so the GUITexture system seemed well suited to my purposes. However, since you can't specify UV co-ordinates directly on a GUITexture, it seemed reasonable to create textures from a sprite sheet using SetPixels when loading the scene.
I guess if I want to continue using this method I might need to put up with a bloated on-disk size if I have several sprite sheets?
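For what it's worth, the rough arithmetic per 512x512 sheet (no mipmaps) is: RGBA32 at 4 bytes per pixel is 512 x 512 x 4 = 1,048,576 bytes (~1 MB), RGBA16 at 2 bytes per pixel is ~512 KB, and PVRTC 4bpp at half a byte per pixel is ~128 KB, so staying uncompressed costs roughly eight times the memory of PVRTC per sheet.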
Edit: Thinking about it a little more, why does it fail with RGBA16 but work with RGBA32? Strange.