Approximating light source intensity and direction via a grayscale camera image
Hey Guys,
So I'm thinking about a concept for AR. Imagine a typical AR scene in a smartphone app: a cube projected onto an AR marker card, lit by a directional light in the scene.
Could there be a way to render the smartphone's camera image to an offscreen buffer in grayscale, and then drive the intensity of the directional light (DL) from the number of bright pixels (above a defined threshold)?
And could the rotation of the DL be derived from the region with the highest concentration of bright pixels (assuming there is only one light source)?
How could one approach that?
Maybe some of you understand what I'm trying to achieve here and could give advice.
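The two ideas above (intensity from the bright-pixel count, direction from where those pixels cluster) can be sketched in plain Python. This is a minimal illustration, not Unity code: the frame is assumed to arrive as rows of 8-bit luminance values, and `BRIGHT_THRESHOLD` is an arbitrary tunable cutoff, not a value from the post.

```python
# Sketch: estimate light intensity and a direction hint from a grayscale frame.
# Assumptions: frame = list of rows of 8-bit luminance values (0..255);
# BRIGHT_THRESHOLD is an arbitrary, tunable cutoff.

BRIGHT_THRESHOLD = 200  # pixels at or above this count as "bright"

def estimate_light(frame):
    """Return (intensity in 0..1, centroid (cx, cy) of bright pixels or None)."""
    bright = [(x, y)
              for y, row in enumerate(frame)
              for x, value in enumerate(row)
              if value >= BRIGHT_THRESHOLD]
    total = sum(len(row) for row in frame)
    intensity = len(bright) / total if total else 0.0
    if not bright:
        return intensity, None
    cx = sum(x for x, _ in bright) / len(bright)
    cy = sum(y for _, y in bright) / len(bright)
    return intensity, (cx, cy)

# Toy 4x4 frame with a bright patch in the top-right corner.
frame = [
    [10, 10, 230, 240],
    [10, 10, 220, 250],
    [10, 10,  10,  10],
    [10, 10,  10,  10],
]
intensity, centroid = estimate_light(frame)
# 4 of 16 pixels are bright -> intensity 0.25, centroid (2.5, 0.5)
```

The centroid's offset from the image center could then be mapped to a yaw/pitch for the DL (e.g. centroid in the top-right of the frame suggests light coming from that direction); the exact mapping would depend on the camera's field of view.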
If you are trying to match the AR objects' illumination with the scene (in this case the real world), this is already implemented in the ARCore SDK; I suggest you check it here.
If you can't use ARCore for some reason, you can still take a picture from the camera, convert it to grayscale, and read the pixel data one by one. But that will be very heavy on the CPU without low-level programming, especially if you do it every frame; it may even be impossible on current mobile devices.
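As a rough sketch of the steps mentioned (grayscale conversion, then cutting the per-frame CPU cost by not reading every pixel), the usual mitigation is to sample a coarse grid instead of the full image. The Rec. 601 luma weights and the stride value below are standard/arbitrary choices on my part, not anything from the answer above.

```python
# Sketch: grayscale conversion plus sparse sampling to reduce per-frame cost.
# STEP is an arbitrary stride; larger values trade accuracy for speed.

STEP = 4  # examine every 4th pixel in each dimension

def to_gray(r, g, b):
    """Standard Rec. 601 luma approximation for RGB -> grayscale."""
    return 0.299 * r + 0.587 * g + 0.114 * b

def mean_brightness_downsampled(frame, step=STEP):
    """Average luminance over a sparse grid of a grayscale frame."""
    samples = [row[x]
               for y, row in enumerate(frame) if y % step == 0
               for x in range(0, len(row), step)]
    return sum(samples) / len(samples) if samples else 0.0

# An 8x8 frame of constant luminance 100 averages to 100 at any stride,
# while touching only 4 of the 64 pixels at STEP = 4.
frame = [[100] * 8 for _ in range(8)]
```

In practice on a phone you would do this on the GPU instead (render the camera texture to a small render target and read that back), which avoids the per-pixel CPU loop entirely.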