WebCamTexture iOS blurry snapshot
I am currently using the WebCamTexture class to display live camera feed until the user takes a snapshot, and then I use said snapshot as a Texture in my app.
Here's the code I'm using at the moment:
private WebCamTexture cameraTexture;
private Texture2D snapshot;
public UITexture webCamTexture;

// Copy the current camera frame into a Texture2D and display it.
snapshot = new Texture2D(cameraTexture.width, cameraTexture.height);
snapshot.SetPixels(cameraTexture.GetPixels());
snapshot.Apply();
webCamTexture.mainTexture = snapshot;
Note: the UITexture class comes from NGUI; it's only used to display the texture in the scene.
On Android devices there appears to be no problem. However, when I use this on iOS devices (tested on iPad 2 and iPad 3), the texture instantly becomes blurry when I set it. Is this a focus problem?
I've tried a couple things, mainly waiting for the end of the frame before taking the shot and calling cameraTexture.Pause() before getting the pixels, to no avail.
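For reference, the end-of-frame/pause attempt looked roughly like this (a sketch only; the `SnapshotTaker` class name and `TakeSnapshot` coroutine are assumptions, the fields match the code above):

```csharp
using System.Collections;
using UnityEngine;

public class SnapshotTaker : MonoBehaviour
{
    private WebCamTexture cameraTexture;
    private Texture2D snapshot;
    public UITexture webCamTexture; // NGUI widget used for display

    // Sketch of the attempted fix: wait until the frame has finished
    // rendering, pause the live feed, then read the pixels back.
    private IEnumerator TakeSnapshot()
    {
        yield return new WaitForEndOfFrame();
        cameraTexture.Pause();
        snapshot = new Texture2D(cameraTexture.width, cameraTexture.height);
        snapshot.SetPixels(cameraTexture.GetPixels());
        snapshot.Apply();
        webCamTexture.mainTexture = snapshot;
    }
}
```

Started with `StartCoroutine(TakeSnapshot())`, the result on iOS was the same blurry texture.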
Why does the iOS texture become blurry?
I've done some further testing. Apparently the maximum resolution Unity supports for a WebCamTexture is 1280x720; I don't know whether that matters here, because the device I tested on only has a 1024x768 camera resolution... Anyone?
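One way to check what resolution Unity actually delivers is to request a size and then log what comes back once the feed is live (a sketch; using device index 0 and a 30 fps request are assumptions):

```csharp
using UnityEngine;

public class CameraResolutionCheck : MonoBehaviour
{
    private WebCamTexture cam;

    void Start()
    {
        // Request 1280x720; the device is free to return something else.
        cam = new WebCamTexture(WebCamTexture.devices[0].name, 1280, 720, 30);
        cam.Play();
    }

    void Update()
    {
        // width/height report placeholder values until the first frame
        // arrives, so only log them once the feed has updated.
        if (cam.didUpdateThisFrame)
        {
            Debug.Log("Actual WebCamTexture size: " + cam.width + "x" + cam.height);
            enabled = false; // log once
        }
    }
}
```

If the reported size is smaller than the screen, the live preview is being upscaled, which would explain a soft-looking image.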
What device are you testing on, what resolution does Unity return for the WebCamTexture, and what resolution do you think the device's front-facing camera has? Maybe your device has a VGA camera and you are scaling that up to fit the screen? That would make the video feed look blurry.
As stated in my question, I am testing on both the iPad 2 and iPad 3. I am unsure about the resolution returned by Unity, but I will come back to you on that once I've tested it. However, it isn't the video feed that's blurry, only the snapshot I'm taking when I Get/Set the pixels, which leads me to believe resolution isn't the problem, as the live camera feed looks decent.
Is there another free solution using Unity's default API?