Why does OpenGL use 4 to 8 times the VRAM compared to DirectX?
We are working on a game with Unity3D.
The past few weeks I have been polishing the game, adding textures, effects, particles, etc. With the Unity3D stats view I have managed to keep the memory usage around 4-8 MB.
When we wanted to export the game to iPhone, we stumbled on the fact that the game used 20 to 40 MB of texture memory.
We have already found out that this is because the PC uses DirectX and the Mac uses OpenGL.
When we forced my PC into OpenGL mode, the memory usage on the Mac was pretty much the same as on the PC (20 to 40 MB).
So my questions are:
Why does OpenGL use so much texture memory? (It's 8x at most points.)
How can we properly fix this?
Is it causing a problem? The file size isn't any worse? This is only while running? To me, it sounds like DirectX is not properly detecting available memory. In theory, OpenGL is running faster/cooler as a result.
No, it's not causing a problem (yet?); we just tested on iPhone 4 hardware.
The file size is the same.
I guess this is only while running. When running I can see the VRAM usage in the stats window.
As for DirectX not properly detecting available memory: is it possible to verify this, so we can mark it as the cause of the problem?
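One way to start checking this is to log what the graphics driver actually reports to Unity under each API and compare the two runs. A minimal sketch (the component name is mine, not something from the project):

    using UnityEngine;

    // Logs what Unity sees from the graphics driver, so the DirectX
    // and OpenGL runs can be compared on the same machine.
    public class VramReport : MonoBehaviour
    {
        void Start()
        {
            // graphicsMemorySize is the VRAM the driver reports, in MB.
            Debug.Log("Graphics device: " + SystemInfo.graphicsDeviceName);
            Debug.Log("Reported VRAM (MB): " + SystemInfo.graphicsMemorySize);
            // graphicsDeviceVersion names the API actually in use
            // (e.g. a Direct3D or OpenGL version string).
            Debug.Log("Graphics API: " + SystemInfo.graphicsDeviceVersion);
        }
    }

If the reported VRAM differs wildly between the two APIs on the same hardware, that would support the faulty-detection theory; if it matches, the difference is more likely in how the textures themselves are stored.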
I'd check to see how the textures are compressed in each case.
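For example, a throwaway debug component along these lines can dump every loaded texture with its format and runtime size, so the DirectX and OpenGL builds can be diffed texture by texture. A sketch, assuming a Unity version where the Profiler API lives in UnityEngine.Profiling (older versions expose Profiler.GetRuntimeMemorySize directly in UnityEngine):

    using UnityEngine;
    using UnityEngine.Profiling;

    // Dumps name, dimensions, format, and runtime memory size for
    // every Texture2D currently loaded.
    public class TextureAudit : MonoBehaviour
    {
        void Start()
        {
            foreach (Texture2D tex in Resources.FindObjectsOfTypeAll<Texture2D>())
            {
                long bytes = Profiler.GetRuntimeMemorySizeLong(tex);
                Debug.Log(tex.name + "  " + tex.width + "x" + tex.height
                          + "  " + tex.format + "  " + (bytes / 1024) + " KB");
            }
        }
    }

Any texture whose format changes between the two runs (say, a compressed format on one API and an uncompressed one on the other) is a likely culprit.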
Answer by Owen-Reynolds · Aug 07, 2013 at 08:25 PM
I'm out-of-date on video hardware, but:
Regular computer programs list the minimum memory required and the ideal amount. If you have only the minimum, the program needs to make more frequent disk swaps, etc., and is slower. If you have even more than the ideal, you may continue to see minor speed-ups (like going from area 2 back to area 1 has no load time.)
If you really need to know the exact amount of memory required, you have to start with very little, and keep adding more and more until it feels "playable." But there is no real number.
A graphics card manager does the same thing. If you have extra texture memory lying around, it will fill it up. If you have less than "needed", it will still run, but slower.
I don't know how Unity counts video-RAM usage. Maybe by counting the size of textures used this frame, or during the most recent second (a more realistic measure.) On an iPhone, it may be switching to the amount actually used. So it probably still "needs" 4-8 MB, but it finds a way to run faster by using 20 MB more. Plus, iPhones have that crazy image format (PVRTC.) It compresses amazingly small, but it may just take more VRAM.
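To get a feel for how much the format alone can change the footprint, here is a back-of-the-envelope sketch (the bit depths are the standard ones for each format; a full mip chain adds roughly a third on top):

    using UnityEngine;

    // Rough texture sizes: bytes = width * height * bitsPerPixel / 8.
    public class TextureSizeMath : MonoBehaviour
    {
        static float SizeMB(int width, int height, float bitsPerPixel)
        {
            return width * height * bitsPerPixel / 8f / (1024f * 1024f);
        }

        void Start()
        {
            const int w = 1024, h = 1024;
            Debug.Log("RGBA32 (32 bpp): " + SizeMB(w, h, 32f) + " MB"); // 4 MB
            Debug.Log("DXT5    (8 bpp): " + SizeMB(w, h, 8f) + " MB");  // 1 MB
            Debug.Log("DXT1    (4 bpp): " + SizeMB(w, h, 4f) + " MB");  // 0.5 MB
            Debug.Log("PVRTC   (4 bpp): " + SizeMB(w, h, 4f) + " MB");  // 0.5 MB
        }
    }

If one runtime falls back from a 4 bpp compressed format to uncompressed RGBA32, that alone is an 8x jump, which would line up with the factor mentioned in the question.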
Answer by FulkoW · Aug 12, 2013 at 09:43 AM
Thank you all for your answers (I am replying to this for Phim).
@Ralph: The textures look fine in both DirectX and OpenGL.
@Owen: We tested it on Mac AND PC, in different versions of Unity even, and it keeps showing the same memory amount, so it would be strange if all the machines detected the same amount of memory incorrectly.
@Owen: So basically you are saying that it will work like old DOS games: tied to the machine's memory, so it behaves differently on a faster machine? I do like the second part of your post, but I find it strange if it really works like that; my PC has a LOT more memory than the 40 MB it comes up with, and for testing purposes this doesn't seem like an ideal way for Unity to implement this.
It would mean you can never get proper data out of it, so you will always have to keep testing every little aspect along the way on all devices.
Thanks for everything though,
Fulko