How do I know if there's enough graphics card memory to allocate a new texture?
I'm creating a Unity Web program that is texture-heavy and may be opened in multiple browser tabs at once. Thus, the total amount of graphics card memory required is large and unpredictable.
I want to know during the program's operation whether loading additional textures will cause the application to run out of texture memory. (The user could then be prompted to close other viewers that are using a lot of texture memory, or the program could down-res the textures it is about to load.)
Things I have tried:
Creating the Texture2D and then catching an exception if it fails (see the first sketch below). This doesn't appear to work: an OutOfMemoryException is thrown, but the program crashes before the "new Texture2D" call returns.
Using SystemInfo.graphicsMemorySize (http://docs.unity3d.com/Documentation/ScriptReference/SystemInfo-graphicsMemorySize.html), as in the second sketch below. Unfortunately this would only work if the user is not running other 3D content: the property reports the total graphics memory, so the program cannot tell how much is already allocated. In our case, several instances of our viewer are often running simultaneously.
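
For reference, here is roughly what the first attempt looks like. This is a minimal sketch (the class and method names are mine); on our platform the crash happens inside the constructor, so the catch block is never reached.

```csharp
using System;
using UnityEngine;

public class TextureLoader : MonoBehaviour
{
    // Attempt 1: wrap the allocation and fall back when it throws.
    // In practice the crash occurs inside the constructor, before
    // control ever reaches the catch block.
    Texture2D TryAllocate(int width, int height)
    {
        try
        {
            return new Texture2D(width, height, TextureFormat.RGBA32, false);
        }
        catch (OutOfMemoryException)
        {
            Debug.LogWarning("Out of memory allocating a " + width + "x" + height + " texture.");
            return null;
        }
    }
}
```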
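And here is a sketch of how the second attempt could be used for budgeting. The 50% margin is an arbitrary assumption, and the fundamental limitation remains: the property cannot see what other tabs or applications have already allocated.

```csharp
using UnityEngine;

public class TextureBudget
{
    // Assumed safety margin: this instance only spends half of the
    // reported graphics memory. The fraction is a guess.
    const float BudgetFraction = 0.5f;

    long bytesAllocatedHere; // bytes this instance has allocated so far

    public bool CanAfford(int width, int height, int bytesPerPixel)
    {
        long requested = (long)width * height * bytesPerPixel;
        // SystemInfo.graphicsMemorySize is reported in megabytes.
        long totalBytes = (long)SystemInfo.graphicsMemorySize * 1024L * 1024L;
        return bytesAllocatedHere + requested <= (long)(totalBytes * BudgetFraction);
    }

    public void NoteAllocation(long bytes)
    {
        bytesAllocatedHere += bytes;
    }
}
```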
Possible approaches:
Is there a more recoverable way to allocate texture memory directly, so that I can test-allocate the memory before creating the Texture2D?
Is there a way that I can detect (directly or indirectly) when the system is close to running out of texture memory?
Is there a relatively easy way for all instances of our viewer in different browser tabs to talk to each other locally? Each instance could then share how much memory it is using, and (presuming no other 3D content is running) the program could arrive at an estimate of total texture memory used. A sketch of this idea follows the list.
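
To make the third idea concrete, here is a rough sketch of the per-instance accounting. The shared store itself (localStorage via JavaScript interop, a cookie, a small local server, etc.) is deliberately left open; WriteSharedValue and ReadAllSharedValues are hypothetical placeholders, not Unity APIs.

```csharp
using UnityEngine;

// Hypothetical sketch of the "instances share their usage" idea.
// Each instance publishes its own texture byte count under a per-tab key
// in some browser-local shared store, then sums everyone's entries to
// estimate total usage across tabs.
public class SharedUsageLedger
{
    readonly string instanceKey = System.Guid.NewGuid().ToString();

    // Publish this instance's current texture memory usage.
    public void PublishUsage(long bytes)
    {
        WriteSharedValue("texmem_" + instanceKey, bytes.ToString());
    }

    // Sum the usage reported by every live instance.
    public long EstimateTotalUsage()
    {
        long total = 0;
        foreach (string value in ReadAllSharedValues("texmem_"))
        {
            long bytes;
            if (long.TryParse(value, out bytes)) total += bytes;
        }
        return total;
    }

    // These two would be implemented with whatever cross-tab channel is
    // available (e.g. JavaScript interop to window.localStorage); they are
    // stubs here, not real Unity APIs.
    void WriteSharedValue(string key, string value) { /* ... */ }
    string[] ReadAllSharedValues(string keyPrefix) { return new string[0]; }
}
```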