Can I determine optimum WebGL Memory Size?
We have some customers (schools) with older computers that are running into memory allocation problems. Our applications are pretty trivial compared to the slick games most of you all create.
Is there some way to look at our content specifically and make a good guess as to what setting "WebGL Memory Size" should be?
Is it simply a matter of lowering the setting to see if our game still works properly or would that cause problems in some cases that we would not even know about?
FWIW, this question is related ...
http://answers.unity3d.com/questions/918663/what-is-a-safe-maximum-memory-size-for-webgl-build.html
Thanks!
Answer by valyard · Jun 03, 2015 at 06:04 PM
Well, I don't really understand why you are asking the same question again.
Think about it the same way as you do when building for a mobile device. Will this device with 256MB RAM run my game? What should I do to optimize my memory consumption? What if I have a 128MB RAM device, will it work there? It's the same here, except you control how much memory your device has. You need to optimize for that.
But the problem is not really the WebGL memory block size now. That is just the memory Unity thinks it has available at runtime.
A WebGL game also has a .data file which is loaded by the browser and never unloaded. It's like the hard drive in your mobile device, except this time the hard drive (with its virtual file system) has to be held in browser memory.
When loading data and compiling code in browser memory, some duplication is unavoidable, and right now Chrome is very bad at it. While it's loading a game it grows to more than 2x what will be needed after loading is complete. At this step, Chrome tabs on most old PCs and laptops just crash.
If your school uses Chrome and has <4GB RAM, your game probably won't work there. Use Firefox, use the "Fastest" optimization mode, don't make Development builds, turn on stripping, turn off exceptions, remove code, remove images, and check for half-empty atlases (a rough editor sketch of these build settings follows).
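As a minimal sketch of that build-settings advice, assuming Unity 5.x-era editor APIs (the class and menu names here are made up, and properties such as PlayerSettings.WebGL.memorySize may differ or be absent in newer Unity versions), something like this in an Editor folder applies the relevant player settings:

```csharp
// Hypothetical editor helper: applies the memory-related WebGL player settings
// discussed above. Unity 5.x-era API; verify property names for your version.
using UnityEditor;

public static class WebGLBuildTweaks
{
    [MenuItem("Build/Apply WebGL Memory Tweaks")]
    public static void Apply()
    {
        // "WebGL Memory Size": size of the Unity heap, in MB.
        PlayerSettings.WebGL.memorySize = 256;

        // Disable exception support to cut generated code size.
        PlayerSettings.WebGL.exceptionSupport = WebGLExceptionSupport.None;

        // Strip unused engine code to shrink the build.
        PlayerSettings.stripEngineCode = true;

        // Make sure we are not producing a Development build.
        EditorUserBuildSettings.development = false;
    }
}
```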
Thanks for your thoughtful answer. I really appreciate it!
The question I am trying to wrap my head around is this ... can I look at my own project somehow and determine how much memory the browser needs for MY game?
With regard to the WebGL Memory Size, the docs say:
You should choose this value carefully: if it is too low, you will get out-of-memory errors because your loaded content and scenes won’t fit into the available memory. However, if you request too much memory then some browser/platform combinations might not be able to provide it and consequently fail to load the player.
I think in our case we are requesting too much memory, but the docs provide no help in understanding how to actually select this value "carefully".
You mentioned that when we build a project a data directory is created ... I wonder if this data directory sheds some light on the answer? In my simple (preliminary) project, my "Release" directory has about 30MB of stuff in it. You mentioned that Chrome grows more than 2X what will actually be needed after loading is complete. So, could I then assume that for this rather simple project I need about 2 x 30MB = 60MB of WebGL memory size?
No. WebGL memory size is the size of that specific block of memory Unity uses. The .data file is loaded into browser memory outside this block.
The 2x memory usage is what you will most likely see if you look at Chrome in Activity Monitor. If your game requires 1GB of memory after it has loaded, Chrome will probably grow to 2-2.5GB while it is loading and shrink back to 1GB afterwards. Check with your game.
So I brought up my "game" in Chrome, then opened the Chrome Task Manager, which told me that my game was using about 500MB of memory. I then brought up the same "game" but this time using the web player plugin. Now, the task manager said my game was using about 100MB of memory.
So, can I use this information to assume that Chrome needs about 400MB of "WebGL Memory Size" to run my game? Is this 400MB what I need to enter for WebGL Memory Size when publishing my game in WebGL?
If so, it seems quite odd that 256MB would be the default in Unity. This "game" of mine is super light at the moment - nothing like a real game, which I would expect to need much more memory than mine ...
Again, no (8
400MB is actually very low. Check what else Task Manager says.
(1) The memory Chrome is showing you consists of:
- That 256MB chunk of Unity memory,
- Memory which stores the .data file,
- Memory used to parse and store JS,
- Memory used to store any other loaded data.
256MB is basically how much memory the virtual device running in your browser has. Set it as low as you can without your game crashing. The memory shown by Chrome is everything in list (1) above combined.
Actually, you can get two types of "out of memory" exceptions, which is confusing:
- When your game can't fit into that 256MB block because you have too many textures and whatnot,
- When the browser can't allocate enough memory to hold all the data in list (1) above.
You can connect the Profiler to your WebGL game and see how much memory it actually consumes, then set a slightly higher value in Player Settings (a rough runtime sketch follows).
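As a rough sketch of the measuring side, assuming the Unity 5.x Profiler API (later versions moved it to UnityEngine.Profiling and added ...Long variants; the MemoryReport class name is made up), a script like this logs the Unity heap usage so you can size "WebGL Memory Size" a bit above the peak you observe:

```csharp
// Hypothetical runtime helper: periodically logs how much of the Unity heap
// is in use. Profiler values are generally only available in Development builds.
using UnityEngine;

public class MemoryReport : MonoBehaviour
{
    void Update()
    {
        // Log roughly once per second at 60 fps.
        if (Time.frameCount % 60 != 0) return;

        // Memory Unity has actually allocated inside its heap (bytes).
        uint allocated = Profiler.GetTotalAllocatedMemory();
        // Memory Unity has reserved in its heap, allocated or not (bytes).
        uint reserved = Profiler.GetTotalReservedMemory();

        Debug.Log(string.Format("Unity heap: {0} MB allocated, {1} MB reserved",
            allocated / (1024 * 1024), reserved / (1024 * 1024)));
    }
}
```

Note that this only measures the Unity heap (the 256MB block); the browser-side memory for the .data file and the compiled JS is on top of that.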