High number of 4096 atlases crashing Unity at build time
We've recently had to convert a rather large project to "retina quality", which meant bumping the max sprite atlas size from 2048 to 4096. Using 2DToolkit, this wasn't a huge deal, just took a bit of time. For reference, think a Talking Tom style game, with hundreds of large frames of animation built into sprite sheets, with each animation loaded and unloaded from memory on the fly. Tap his belly, load the tickle animation, play it, swap back to the always-loaded idle animation, and unload the tickle anim. Memory use is totally fine with this approach, and surprisingly the load times for each anim were more than acceptable.
However, that was before retina came into the picture. There are approximately 75 of these 4096x4096 textures in the project. Using Unity Cloud Build service, we've been able to compress the atlases over the course of 8-9 hours of build time. At the end of the build process, it crashes with the old "out of memory" error.
I can understand how that might be possible, but at the same time, why is Unity ever trying to load multiple atlases into memory during the build process? We had to take some careful steps just to ensure we could build these sprite atlases without the editor itself running out of memory, but never envisioned that the build process would have the same problems.
Has anyone else been so crazy that they did something similar, and perhaps found a workaround for this?
An image showing the memory crash, for reference: http://puu.sh/cVPkk/e70d8746e7.jpg
While we've been hesitant to go there, to avoid releasing a fairly big project on beta software, I'm starting to think you're right. Working on getting the project into Unity 5 now. Here's hoping Prime31 has kept up-to-date with their plugins too!
Answer by smoggach · Nov 18, 2014 at 02:47 PM
I've been here before. My project had a lot of high res 3d animations, and of course my first attempt was to use SpritePacker. However, as far as I know, both SpritePacker and Resources need to load their entire contents into memory in order to function, which fails once you hit your system's memory limit.
Unity will just crash if SpritePacker does it. Resources won't crash, but it will give you a corrupted file (because it couldn't get the memory to properly fill it).
The only way I've found is to use StreamingAssets and load the frames via an asynchronous WWW request. (WWW.textureNonReadable decompresses fastest.)
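A minimal sketch of that approach, assuming the frames sit under StreamingAssets with hypothetical names like "frame_000.png" (the class and method names here are illustrative, not from the original post; the WWW API shown is the Unity 4.x-era one the answer refers to):

```csharp
using System.Collections;
using System.IO;
using UnityEngine;

// Hypothetical loader: streams one animation frame at a time from
// StreamingAssets instead of keeping a whole atlas resident in memory.
public class FrameStreamer : MonoBehaviour
{
    public IEnumerator LoadFrame(string fileName, System.Action<Texture2D> onLoaded)
    {
        // On iOS/standalone this is a plain file path, so prefix file:// for WWW.
        // On Android, streamingAssetsPath is already a jar:// URL.
        string path = Path.Combine(Application.streamingAssetsPath, fileName);
        if (!path.Contains("://"))
            path = "file://" + path;

        WWW www = new WWW(path);
        yield return www; // asynchronous; the main thread keeps rendering

        if (string.IsNullOrEmpty(www.error))
        {
            // textureNonReadable skips keeping a CPU-side readable copy,
            // which is the fastest path and uses less memory.
            onLoaded(www.textureNonReadable);
        }
        www.Dispose();
    }
}
```

You'd kick this off with StartCoroutine for each frame as the animation needs it, trading load latency for a much smaller resident footprint.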
Interesting, thanks for the input. I have also tried loading each frame via StreamingAssets, but we found that because of the size of each frame, we couldn't get acceptable framerates for the animations that way.
Regarding SpritePacker, isn't that only used during the initial creation of sprite sheets? I have run into issues with building too many frames into too many sprite atlases at once, causing the editor to run out of memory as it loads each frame into RAM before determining how to pack them. I wouldn't expect that same process to happen at build time, though? Perhaps I'm wrong?
Sprite packing has the option to happen automatically when you update it, or only at build time. If you found a way to do it incrementally, that's cool, but I didn't notice the problem until I'd lost all my built atlases and Unity tried to rebuild them all at the same time.
The best framerate I could achieve with StreamingAssets was 12fps, which was good enough for our purposes, but I would not take this path again.
I eventually gave up and started looking into video playback, which looked promising. Sparse textures also looked interesting, but I didn't have time to investigate those either.