Unity 3.5 webcam framerate question
Hello everyone, I'm having somewhat of a problem using the new WebCamTexture feature in Unity 3.5.
I have implemented a webcam in a Win32/Android app, and it works seamlessly on both platforms. However, I am only able to receive frame updates at around 7-8 frames per second, using both a generic built-in laptop webcam on the PC and the back-facing camera on an Android Galaxy S2, in a well-lit scene. This is the case when initializing the camera at 320x240 or at 160x120 (the Galaxy S2's lowest native resolution is actually a bit higher, 177x140 I think?), both when setting the requested FPS to 30 and when leaving it out (which should default to the highest available framerate).
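For reference, my setup looks roughly like this (a minimal sketch, not my full code; the class name is just for illustration, and the requested resolution/FPS are the values mentioned above, which Unity treats as hints the driver may override):

```csharp
using UnityEngine;

public class WebcamGrabber : MonoBehaviour
{
    WebCamTexture cam;

    void Start()
    {
        // Request low resolution and 30 FPS; the driver picks the
        // nearest supported mode, so these are not guaranteed.
        cam = new WebCamTexture(320, 240, 30);
        cam.Play();
    }

    void Update()
    {
        // Only grab pixels when the camera has delivered a new frame.
        if (cam.didUpdateThisFrame)
        {
            Color32[] pixels = cam.GetPixels32();
            // ... image processing here ...
        }
    }
}
```

Checking `didUpdateThisFrame` each `Update()` is how I'm measuring the ~7-8 updates per second.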
Now, I assume there is some overhead in converting the camera stream into a Texture2D object, but is it really that significant? Is this a Unity-related issue or a hardware limitation?
Also, are there any cross-platform alternatives for reading a video stream at low resolution and high framerate? I don't actually need to display the image as a texture; I just need to do some image processing on it. Right now I'm converting from the Texture2D's Color32 format to raw bitmap data for processing, which adds yet another overhead.
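The conversion step I mean is essentially this (a sketch; the buffer is preallocated to avoid per-frame GC pressure, and the helper name is just for illustration):

```csharp
using UnityEngine;

static class PixelConversion
{
    // Flatten a Color32[] from GetPixels32() into a raw RGBA byte
    // buffer for the image-processing code. The copy itself is O(n),
    // but it runs every frame on top of the GetPixels32() readback.
    public static byte[] ToRawBytes(Color32[] pixels, byte[] buffer)
    {
        for (int i = 0; i < pixels.Length; i++)
        {
            buffer[i * 4 + 0] = pixels[i].r;
            buffer[i * 4 + 1] = pixels[i].g;
            buffer[i * 4 + 2] = pixels[i].b;
            buffer[i * 4 + 3] = pixels[i].a;
        }
        return buffer;
    }
}
```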
Finally, if this problem cannot be handled at Unity's level, what would be good library choices for accessing a camera input at ~160x120@30fps? OpenCV? DirectShow? DirectShow(dot)Net?
Any pointers are highly appreciated :)
[edit:] DirectShow dot Net was automatically converted to link :/