Performance Differences in Standalone player versus Web Browser?
Summary: I'm seeing performance differences between the web-browser version and the standalone-executable version of a Unity build, and I don't know why. :)
Situation: I've created the simplest scene I can - a 50-yard terrain with an FPS controller at one end, plus a frames-per-second guiText. I build both a standalone executable and an HTML file for testing. I have not changed any of the default Unity parameters.
In the standalone build, I run (via the FPS controller) from one end of the ramp to the other; it takes 7-8 seconds. The frames-per-second readout shows values in the thousands.
In a web browser (both Opera and IE) I do the same run down the ramp, and it takes 14-15 seconds to complete. The frames-per-second readout generally shows 100+.
Why is there a time difference between these two?
At first I thought the difference in frame rate would explain the difference in run speed. But I looked over the FPSWalker script: it uses FixedUpdate(), and it applies Time.deltaTime to the speed. So I would have thought that compensated for the frame-rate difference.
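(To see why that expectation is reasonable, here is a minimal sketch in plain Python - not Unity code, and the speeds and frame rates are made up for illustration - of why scaling movement by the frame's delta time should make distance depend only on wall-clock time, not on frame rate:)

```python
def simulate_run(fps, speed=10.0, duration=1.0):
    """Move a position forward by speed * dt once per frame,
    mimicking the `speed * Time.deltaTime` pattern in FPSWalker."""
    dt = 1.0 / fps                 # delta time of one frame at this rate
    steps = round(duration * fps)  # frames rendered during the run
    position = 0.0
    for _ in range(steps):
        position += speed * dt     # bigger dt per frame at lower fps
    return position

# Whether the game renders 1000 frames/sec (standalone) or 100 (browser),
# the distance covered in one second of wall-clock time is the same:
print(simulate_run(fps=1000))  # ~10.0
print(simulate_run(fps=100))   # ~10.0
```

(If the walk really were deltaTime-scaled like this, a lower frame rate couldn't double the travel time - which is exactly what makes the 7-8 s versus 14-15 s result puzzling.)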
I previously asked a question about Animation clips playing for different times, and it was suggested it might be a bug. But just moving the FPS controller around, in standalone versus a browser - it's too basic. I can't see how something that fundamental would be a bug. :)
Which means it's something in the configuration, in the scripts - or possibly in my assumptions. :) My first assumption is that they should perform the same - but I don't really know that. How should it work, given identical scenes, and no Unity configuration changes? And if they should work the same - what could cause it to not do so?
Update: mea culpa... I thought the code for reporting my Framerate was working, but maybe not. In any case, I tried setting the framerate to 30 frames-per-second, and both the standalone executable and Web browser then performed at the same speed. Sigh...
Answer by duck · Mar 08, 2010 at 10:00 PM
The frame rate is capped to a maximum when running in windowed mode in a web browser (i.e. not fullscreen).
This was implemented by UT so that the web browser doesn't end up eating all the CPU power just because Unity is running (which, in turn, helps the Unity plug-in avoid becoming known as a CPU hog). If you switch to full-screen mode you should see the frame rate cap lifted.
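(For readers wondering why a frame-rate cap shouldn't change physics at all: FixedUpdate steps physics at a constant timestep regardless of render rate. Here is a rough sketch of that accumulator pattern in plain Python - illustrative only, not engine source - where FIXED_DT mirrors Unity's default 0.02 s fixed timestep:)

```python
FIXED_DT = 0.02  # Unity's default fixed timestep (Time.fixedDeltaTime)

def run_physics(render_fps, duration=1.0, speed=10.0):
    """Render at render_fps, but advance physics only in FIXED_DT steps."""
    frame_dt = 1.0 / render_fps
    accumulator = 0.0
    position = 0.0
    for _ in range(round(duration * render_fps)):
        accumulator += frame_dt           # real time elapsed this frame
        while accumulator >= FIXED_DT:    # catch up in fixed-size steps
            position += speed * FIXED_DT  # the FixedUpdate analogue
            accumulator -= FIXED_DT
    return position

# A capped ~50 fps browser and an uncapped 1000 fps standalone build
# should produce (nearly) the same physics distance per second:
print(run_physics(50))
print(run_physics(1000))
```

(Any leftover time sits in the accumulator between frames, so two runs can differ by at most one fixed step - negligible over a whole run, which is why the cap alone shouldn't double the time taken.)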
The difference in physics behaviour is a little strange though. (I thought I had spotted the reason for it, but my theory turned out incorrect so I have deleted that portion of the answer!)
I'm not sure why that's an error. The reference page for fixedDeltaTime actually says, "For reading the delta time it is recommended to use Time.deltaTime instead because it automatically returns the right delta time if you are inside a FixedUpdate function or Update function."
Hehe yes I was just discussing this in IRC and discovered that it's not an error. I wasn't aware that deltaTime had this extra functionality!
Deleted incorrect portion. The reason behind the frame rate difference remains correct as far as I'm aware, though :)
Hm, I probably should get an IRC client... :) I forgot to add to my first comment: I did switch to full-screen, and didn't see a major frame-rate increase in the browser. Still roughly in the 120+ range.
I was under the impression that the frame rate cap was actually 50fps for webplayer builds, so I'm wondering how you're seeing measurements that high - perhaps post your fps measurement code? (and click the "irc" link in my unityanswers profile for information about the unity irc channel, including a link to a browser based client)
Answer by Ashkan_gc · Jun 01, 2010 at 09:16 AM
The web player has other small differences. The input code and a few other pieces are removed from the web player's engine, and it uses JavaScript in the browser to detect them instead.