Good way to pipe the ARKit video feed into RenderSettings.customReflection
I'm trying to figure out a way to pipe ARKit's camera video feed into RenderSettings.customReflection so I can fake reflections in AR! Any ideas? The main components I'm starting from come from Unity's own ARKit plugin. First is the YUVShader.
This shader seems to combine the two textures that make up the video feed (which, for some reason, comes in a biplanar YCbCr, also called YUV, format; more on that here).
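For anyone unfamiliar with it, the core of that shader is a per-pixel YCbCr-to-RGB conversion. A minimal fragment-function sketch, assuming the plugin's `_textureY` / `_textureCbCr` texture names and a full-range BT.601 matrix (not a verbatim copy of the plugin's shader):

```hlsl
sampler2D _textureY;     // luma plane
sampler2D _textureCbCr;  // interleaved chroma plane

fixed4 frag(v2f i) : SV_Target
{
    // Build (Y, Cb, Cr, 1) from the two camera planes.
    float y = tex2D(_textureY, i.uv).r;
    float4 ycbcr = float4(y, tex2D(_textureCbCr, i.uv).rg, 1.0);

    // Full-range BT.601 conversion; the constant column folds in the
    // -0.5 chroma offset so Cb/Cr can stay in [0, 1].
    const float4x4 ycbcrToRGB = float4x4(
        1.0,  0.0,     1.402,  -0.701,
        1.0, -0.3441, -0.7141,  0.5291,
        1.0,  1.772,   0.0,    -0.886,
        0.0,  0.0,     0.0,     1.0);

    return mul(ycbcrToRGB, ycbcr);
}
```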
Then there's a component that uses this shader.
Together they magically put the feed from the device camera behind everything in the scene.
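The "behind everything" part boils down to a command buffer that blits through the YUV material before opaque geometry renders. A rough sketch of that component, with illustrative names (the real UnityARVideo also updates the ARKit textures and a display transform, omitted here):

```csharp
using UnityEngine;
using UnityEngine.Rendering;

[RequireComponent(typeof(Camera))]
public class VideoBackgroundSketch : MonoBehaviour
{
    public Material yuvMaterial;  // material using the YUV shader above
    CommandBuffer buf;

    void Start()
    {
        buf = new CommandBuffer { name = "AR video background" };
        // Blit the material's _textureY/_textureCbCr (updated elsewhere from
        // the ARKit frame) into the camera target before anything else draws.
        buf.Blit(null, BuiltinRenderTextureType.CurrentActive, yuvMaterial);
        GetComponent<Camera>().AddCommandBuffer(CameraEvent.BeforeForwardOpaque, buf);
    }

    void OnDestroy()
    {
        if (buf != null)
            GetComponent<Camera>().RemoveCommandBuffer(CameraEvent.BeforeForwardOpaque, buf);
    }
}
```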
So what I would like to do is also pipe this texture into something that standard shaded objects will pick up properly. My guess is that I want to feed it into RenderSettings.customReflection, although I have no idea whether Unity will actually work correctly with a shader or command buffer updating that texture every frame.
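In case it helps frame the question, here is the kind of thing I'm imagining: copy the already-converted RGB feed into a cube render texture and hand that to RenderSettings.customReflection. This is an untested sketch with assumed names; it naively stamps the same image onto all six faces, assumes the source matches the face size/format (Graphics.CopyTexture requires that), and assumes a Unity version where customReflection accepts a RenderTexture rather than only a Cubemap asset:

```csharp
using UnityEngine;
using UnityEngine.Rendering;

public class CameraFeedReflectionSketch : MonoBehaviour
{
    public RenderTexture videoTexture;  // the combined RGB camera feed
    RenderTexture cubeRT;

    void Start()
    {
        cubeRT = new RenderTexture(256, 256, 0, RenderTextureFormat.ARGB32)
        {
            dimension = TextureDimension.Cube
        };
        cubeRT.Create();

        RenderSettings.defaultReflectionMode = DefaultReflectionMode.Custom;
        RenderSettings.customReflection = cubeRT;
    }

    void Update()
    {
        // Crudely copy the feed onto every face; a real solution would at
        // least orient/crop per face, or only update the forward-facing one.
        for (int face = 0; face < 6; face++)
            Graphics.CopyTexture(videoTexture, 0, cubeRT, face);
    }
}
```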
Any tips at all would be very much appreciated! Thank you!