Why does an ARGB32 texture have signed components?
After some experimenting with a native plugin that renders to a texture (the same principle as in the native plugin example), I found that all components of the native texture's pixels (both with Direct3D on Windows and with OpenGL on Linux) have the range 0-127, i.e. a fully white, non-transparent color is (0x7f, 0x7f, 0x7f, 0x7f). Incrementing a component further, up to 0xff, has no effect. It seems the components are treated as signed 8-bit values, with all negative values clamped to the maximum. The texture is created in C# with format TextureFormat.ARGB32. So why does it drop the most significant bit and thereby shrink the available color range?
I tested on several machines, and the behavior seems to depend on the hardware: some machines give the full range (0-255), others only the half range. Could this be a Unity bug (improper format initialization)?