cast of Int to Long reads garbage
Hello all,
I've come across a very weird issue. I was tasked with investigating a crash of a game on iOS, and after much debugging I found that a method expecting a long was receiving a strange value.
The caller was passing an int through, and up to the call site the data was intact. So after more testing I printed the numbers in binary and found that the leading bits of the number were correct, but the trailing bits were a random sequence that I can only assume came from adjacent memory.
In short, if the number I passed was 10010101010, the receiver got 100101010100001011 (not the actual numbers); as you can see, the leading bits are right but the rest is something else. As a temporary patch I shifted the value right by 32 bits (x >> 32), and this stopped the crash. But it is just a patch: it doesn't fix the underlying issue that data is being corrupted, and if it happens here it might be happening somewhere else too.
So my questions are: why would this happen only on iOS? Does it have to do with endianness? And finally, how could I go about fixing or testing this?
Definitely sounds like a bug. It would be great if you could make a mini project to reproduce it and file a report.