GetPixels returning NaN on iOS
I'm sampling part of a Texture2D (provided by Vuforia, to display their webcam feed), and in the editor it works fine.
However, on iOS I always get NaN for the pixels. The Texture2D itself is NOT null and displays perfectly - it's just that I can't read anything from it. The pixel format is RGB565.
So tripped out by this... anyone got any ideas?
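Roughly what I'm doing (a minimal sketch - the field name vuforiaTexture is just a placeholder for the texture Vuforia hands me at runtime, not a real Vuforia API name):

    using UnityEngine;

    public class PixelSampler : MonoBehaviour
    {
        // Placeholder: assigned at runtime from the Vuforia video background texture.
        public Texture2D vuforiaTexture;

        void Update()
        {
            if (vuforiaTexture == null) return;

            // Works in the editor; on iOS every component comes back as NaN.
            Color pixel = vuforiaTexture.GetPixel(
                vuforiaTexture.width / 2, vuforiaTexture.height / 2);
            Debug.Log("Center pixel: " + pixel);
        }
    }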
Start with the script reference:
http://docs.unity3d.com/Documentation/ScriptReference/Texture2D.GetPixels.html
Is the texture marked r/w enabled?
Beyond that, I know that SetPixel doesn't work w/ compressed textures. It's possible GetPixel would have problems with this particular compression, though I don't see why.
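If the texture turns out not to be readable from the CPU, one common workaround (just a sketch, nothing Vuforia-specific) is to blit it into a temporary RenderTexture and read that back into a fresh, readable Texture2D:

    using UnityEngine;

    public static class TextureReader
    {
        public static Texture2D MakeReadableCopy(Texture source)
        {
            // GPU-side copy: works even when the source isn't CPU-readable.
            RenderTexture tmp = RenderTexture.GetTemporary(
                source.width, source.height, 0, RenderTextureFormat.ARGB32);
            Graphics.Blit(source, tmp);

            RenderTexture previous = RenderTexture.active;
            RenderTexture.active = tmp;

            // ReadPixels reads from RenderTexture.active into the new texture.
            Texture2D copy = new Texture2D(source.width, source.height,
                TextureFormat.ARGB32, false);
            copy.ReadPixels(new Rect(0, 0, tmp.width, tmp.height), 0, 0);
            copy.Apply();

            RenderTexture.active = previous;
            RenderTexture.ReleaseTemporary(tmp);
            return copy;
        }
    }

Note the read-back lands in ARGB32, so you'd sample the copy rather than the original RGB565 texture.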
How can I tell if the texture is r/w? It's not my code that creates the texture.. and Vuforia is pretty big.. so a hint on what to look for would be awesome.
And to clarify - this texture is not an "asset" in my project; it comes downstream from the Vuforia library and is created at runtime, by their code.
(FYI, I converted your answer to a comment)
You know, I went to look at the script reference, and while I was sure there was a flag you could check, I no longer see it. It's been a while since that was something I wanted to do, so I'll have to defer to someone else :)
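For what it's worth, newer Unity versions expose Texture.isReadable, which would let you check this directly - though I believe that property was added later and may not exist in the version you're on:

    // isReadable is a later Unity addition; this may not compile here.
    if (!vuforiaTexture.isReadable)
        Debug.LogWarning("Texture is not CPU-readable; GetPixels won't return valid data.");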
I've converted my answer to a comment, since it obviously doesn't answer your question. Hopefully someone else will have come across this behaviour.