How to render device camera feed to a texture without using WebCamTexture
Hello Unity Developers,
I want to render the device camera feed to a texture in Unity with all of the camera's features, like autofocus, focus on a particular area, flash, etc. Autofocus is mandatory for me, and since WebCamTexture doesn't support autofocus, I can't use it.
I have already looked at many links but haven't found a solution. However, this answer looks like it could help: https://stackoverflow.com/questions/35766200/texture-rendering-on-ios-using-opengl-es-in-unity-project — I just haven't been able to get it working in Unity.
Please let me know how I can do that, and share some sample code for the iOS (Objective-C) side that creates the camera session and renders its output to the texture ID (from GetNativeTexturePtr()) received from Unity.
Edit: Target platforms are Android and iOS.
Any help will be appreciated. I am waiting for your valuable comments.
Thank you in advance.
Valuable comments would be to look into using a plugin, like Natcam for instance. It is not free but it is worth the $40 or so that it costs and does exactly what you are after.
I want to do this without a plugin, so please tell me how I can do it myself.
Well, it's going to take more than just an answer on this forum. You'd need to access the hardware camera via a native plugin, then read the camera buffer and convert it to a format Unity can handle. Finally, you'd turn your byte array into a Texture2D, taking into account the flipped UVs of the native camera depending on whether you are using the front or rear camera. That's just the tip of the iceberg. Way too much to ask on a forum.
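To give a rough idea of the Unity-side half of those steps, here is a minimal sketch. It assumes a hypothetical native plugin (the `StartCamera`/`FillCameraBuffer` entry points are made-up names, not a real API — you would have to implement them yourself in Objective-C on iOS and Java/NDK on Android) that copies the latest RGBA frame into a buffer Unity provides:

```csharp
using System;
using System.Runtime.InteropServices;
using UnityEngine;

public class NativeCameraFeed : MonoBehaviour
{
    // Hypothetical native plugin entry points -- these do NOT exist;
    // you would implement them yourself in Objective-C / Java.
    [DllImport("__Internal")] private static extern void StartCamera(int width, int height);
    [DllImport("__Internal")] private static extern void FillCameraBuffer(IntPtr buffer, int byteCount);

    public Material targetMaterial;  // material that will display the feed

    private Texture2D _tex;
    private byte[] _pixels;
    private GCHandle _handle;

    void Start()
    {
        const int w = 1280, h = 720;  // assumed capture resolution
        _tex = new Texture2D(w, h, TextureFormat.RGBA32, false);
        _pixels = new byte[w * h * 4];
        // Pin the managed array so the native side can write into it safely.
        _handle = GCHandle.Alloc(_pixels, GCHandleType.Pinned);

        targetMaterial.mainTexture = _tex;
        // Native camera frames are typically upside-down relative to Unity's
        // UV origin (and mirrored on the front camera); flipping the UVs on
        // the material is cheaper than reordering bytes on the CPU.
        targetMaterial.mainTextureScale = new Vector2(1, -1);

        StartCamera(w, h);
    }

    void Update()
    {
        // Ask the native side to copy the latest frame into our pinned buffer,
        // then upload it to the GPU.
        FillCameraBuffer(_handle.AddrOfPinnedObject(), _pixels.Length);
        _tex.LoadRawTextureData(_pixels);
        _tex.Apply();
    }

    void OnDestroy() => _handle.Free();
}
```

Uploading a full frame from the CPU every Update is the simple route; the zero-copy alternative (rendering directly into the texture from GetNativeTexturePtr() on the native side, as in the linked Stack Overflow answer) avoids the copy but means dealing with the graphics API and render-thread synchronization yourself.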