How to integrate Unity with live video using OVRvision
Hi all,
A Japanese dev team created an add-on for the Oculus Rift called OVRvision, which mounts two webcams on the front of the Rift. This creates a cool opportunity to combine virtual reality with live video. http://ovrvision.com/
How could one overlay the live video feed from the OVRvision onto a game environment? I'm hoping to use the Oculus's new positional tracking system in tandem with Sixense and Dexmo body- and hand-motion tracking devices.
Besides understanding the basics of a web stack, I don't have any coding knowledge, but I want to fundamentally understand the basic technology behind (and difficulties of) an experiment like this. Any help is appreciated!
Hi! I already have two webcams streaming live video onto two different quads in Unity, but how can I send the video from the left webcam to the left screen of the Oculus, and the video from the right webcam to the right screen?
Answer by Sylux102 · Mar 19, 2015 at 07:49 AM
Use WebCamTexture and assign each camera's WebCamTexture to a quad's renderer.material. Child each quad to its respective eye camera, and that will do what you need.
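A minimal sketch of that approach, assuming a rig that exposes separate left- and right-eye Camera objects (as the Oculus Unity integration of that era did). The layer names "LeftEyeOnly"/"RightEyeOnly" and the device ordering are placeholders you would set up in your own project:

```
using UnityEngine;

// Sketch: stream each physical webcam onto its own quad, then use
// Unity layers and per-camera culling masks so each eye camera only
// renders its matching quad. Field references are assigned in the
// Inspector; layer names here are illustrative assumptions.
public class StereoWebcamFeed : MonoBehaviour
{
    public Camera leftEyeCamera;   // the rig's left-eye camera
    public Camera rightEyeCamera;  // the rig's right-eye camera
    public Renderer leftQuad;      // quad childed to the left-eye camera
    public Renderer rightQuad;     // quad childed to the right-eye camera

    void Start()
    {
        WebCamDevice[] devices = WebCamTexture.devices;
        if (devices.Length < 2)
        {
            Debug.LogError("Two webcams are required for a stereo feed.");
            return;
        }

        // One WebCamTexture per physical camera. Which device index is
        // "left" depends on your hardware; you may need to swap them.
        var leftTex = new WebCamTexture(devices[0].name);
        var rightTex = new WebCamTexture(devices[1].name);
        leftQuad.material.mainTexture = leftTex;
        rightQuad.material.mainTexture = rightTex;
        leftTex.Play();
        rightTex.Play();

        // Put each quad on its own layer, then remove the opposite
        // eye's layer from each camera's culling mask.
        int leftLayer = LayerMask.NameToLayer("LeftEyeOnly");
        int rightLayer = LayerMask.NameToLayer("RightEyeOnly");
        leftQuad.gameObject.layer = leftLayer;
        rightQuad.gameObject.layer = rightLayer;
        leftEyeCamera.cullingMask &= ~(1 << rightLayer);
        rightEyeCamera.cullingMask &= ~(1 << leftLayer);
    }
}
```

With the quads childed to their eye cameras (so they stay locked in front of each eye) and the culling masks set, each eye only ever sees its own webcam's feed.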