How to simultaneously render multiple (web)cam streams on Android
Hello everyone, I'm trying to render two cameras of my Android device simultaneously as 2D textures on two different RawImages. Unfortunately, it seems that the provided WebCamTexture API only allows one camera to play at a time:
WebCamDevice[] devices = WebCamTexture.devices;
// Open one WebCamTexture per physical camera
rightCam = new WebCamTexture(devices[0].name, Screen.width, Screen.height);
leftCam = new WebCamTexture(devices[1].name, Screen.width, Screen.height);
// Start both streams (only one ends up actually playing)
rightCam.Play();
leftCam.Play();
Namely, rightCam becomes inactive (rightCam.isPlaying returns false) as soon as leftCam becomes active, and vice versa.
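For reference, here is a minimal sketch of the full setup I'm describing, assuming two RawImage components assigned in the Inspector (the class and field names are just placeholders):

using UnityEngine;
using UnityEngine.UI;

// Minimal repro: open two cameras and display each on its own RawImage.
public class DualCamTest : MonoBehaviour
{
    public RawImage rightImage;
    public RawImage leftImage;

    private WebCamTexture rightCam;
    private WebCamTexture leftCam;

    void Start()
    {
        WebCamDevice[] devices = WebCamTexture.devices;
        if (devices.Length < 2)
        {
            Debug.LogWarning("Fewer than two cameras detected.");
            return;
        }

        rightCam = new WebCamTexture(devices[0].name, Screen.width, Screen.height);
        leftCam = new WebCamTexture(devices[1].name, Screen.width, Screen.height);

        // Show each stream on its own UI element
        rightImage.texture = rightCam;
        leftImage.texture = leftCam;

        rightCam.Play();
        leftCam.Play();
    }

    void Update()
    {
        if (rightCam == null || leftCam == null) return;
        // On my device only one of these is ever true at a time
        Debug.Log($"right: {rightCam.isPlaying}  left: {leftCam.isPlaying}");
    }
}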
Are there any workarounds, or other plugins/SDKs I can use to make this work? I've found nothing online or on the forum; it seems other people have had similar issues but haven't found a solution either.
Thank you for any suggestion/help! Francesco