Livestream of virtual camera to FIFO file
Hey there,
as the title may imply, I am trying to grab the framebuffer of my main camera, encode it to PNG or maybe MP4, and send it via a FIFO pipe or UDP over the network.
I started with ReadPixels from the camera in a coroutine, used EncodeToPNG, and tried to send the result via WebSocket. But that's obviously way too slow :D
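For reference, that slow approach looks roughly like this (a minimal sketch using the standard Unity APIs; `SendFrame` is a hypothetical stand-in for whatever WebSocket/pipe send you use):

```csharp
using System.Collections;
using UnityEngine;

public class FrameStreamer : MonoBehaviour
{
    Texture2D tex;

    IEnumerator Start()
    {
        tex = new Texture2D(Screen.width, Screen.height, TextureFormat.RGB24, false);
        while (true)
        {
            // Wait until rendering is finished so ReadPixels sees the final frame.
            yield return new WaitForEndOfFrame();

            // Copy the framebuffer into the texture (slow: GPU -> CPU readback).
            tex.ReadPixels(new Rect(0, 0, Screen.width, Screen.height), 0, 0);
            tex.Apply();

            // Encode to PNG (also slow, and it runs on the main thread).
            byte[] png = tex.EncodeToPNG();

            SendFrame(png); // hypothetical: push the bytes to your transport
        }
    }

    void SendFrame(byte[] data) { /* transport-specific */ }
}
```

Both the readback and the PNG encode stall the main thread every frame, which is where the slowness comes from.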
Unfortunately I am super new to video stuff, so I don't know whether it would be better to stick with the MP4 idea or go with the EncodeToPNG version. I guess if the connection worked over UDP it could be much faster and okayish, but the MP4 version would also be way better in terms of compression.
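One thing worth knowing before going the UDP route: a single UDP datagram tops out around 65 KB, so a PNG frame would have to be split into chunks and reassembled on the receiver. A minimal sketch of that idea in plain C# (no Unity APIs; the chunk size and the in-order/lossless assumption are my own simplifications, not something real UDP guarantees):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public static class Chunker
{
    // Well under the ~65507-byte UDP payload limit, leaving room for headers.
    public const int ChunkSize = 60000;

    // Split a frame into datagram-sized chunks.
    public static List<byte[]> Split(byte[] frame)
    {
        var chunks = new List<byte[]>();
        for (int off = 0; off < frame.Length; off += ChunkSize)
        {
            int len = Math.Min(ChunkSize, frame.Length - off);
            var chunk = new byte[len];
            Buffer.BlockCopy(frame, off, chunk, 0, len);
            chunks.Add(chunk);
        }
        return chunks;
    }

    // Reassemble chunks. Assumes in-order, lossless delivery -- a real
    // implementation needs sequence numbers and frame IDs in each packet.
    public static byte[] Join(List<byte[]> chunks)
    {
        return chunks.SelectMany(c => c).ToArray();
    }
}
```

This is part of why "just use UDP" ends up being more work than it sounds: lost or reordered datagrams corrupt the whole frame unless you add your own framing on top.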
So it would be interesting to know what you guys think!
P.S.: In case somebody is interested in how the MP4 capturing works, there's a plugin for that ;) https://github.com/unity3d-jp/FrameCapturer
Answer by maechtigerhoros · Mar 14, 2017 at 12:29 PM
Just in case someone wants to achieve something similar, here's how I do it now and how I got there.
After a horrible day of UDP connection errors, caused by crazy IP and network magic happening at my uni, I found an article by a senior developer at Google (maybe interesting for someone: https://medium.com/google-developers/real-time-image-capture-in-unity-458de1364a4c#.uhpegugfd ) describing how they tried to achieve something similar. Since that was way too advanced for me, talking about garbage collection and overheating, I remembered the awesome Syphon pipe from when I was playing with projection mapping.
Turns out this is exactly what I need: an easy-to-use stream out of Unity without any noticeable performance loss, even with almost 60 FPS coming in on an iOS device.
Unity side:
https://github.com/keijiro/Funnel
Then use either UDP Syphon or TCP Syphon (iOS compatible) to set up the client and server.
Of course, this is not a solution for streaming PNG or MP4 to a FIFO pipe, but it provides the outcome I was looking for ;)