How to convert a depth image into a 3D point cloud in Unity or HoloLens?
I am trying to feed a depth image into the HoloLens and display a 3D point cloud generated from it. I think the HoloLens should already have functions that can convert depth data into 3D points, since it uses a depth sensor. If possible, can someone tell me how to use those functions to convert my own depth image? Or, any idea how to write such a function in a C# script?
Thanks.
Changlin
Answer by thelghome · Mar 09, 2020 at 03:22 AM
You have to convert the depth map into a 3D mesh first; then you can visualise the points in the mesh with shaders.
In FM POINTS, we use a similar method to convert any 3D scene into points.
asset store: http://u3d.as/1uHj
demo tutorial: https://youtu.be/AK3gWnlIsBM
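For the do-it-yourself route the original poster asked about, here is a minimal sketch (not the FM POINTS implementation, just the general idea): assuming a pinhole camera model with known intrinsics fx, fy, cx, cy, and a depth texture whose red channel stores depth in metres, each pixel can be back-projected to a 3D point and the result rendered as a Mesh using MeshTopology.Points. The intrinsic values and depth encoding below are placeholders; adjust them to match your own sensor.

```csharp
using UnityEngine;

// Hypothetical example: builds a point-cloud Mesh from a single-channel depth texture
// using pinhole camera intrinsics (fx, fy, cx, cy). The depth encoding (metres in the
// red channel) and the intrinsic values are assumptions; adapt them to your data.
[RequireComponent(typeof(MeshFilter))]
public class DepthToPointCloud : MonoBehaviour
{
    public Texture2D depthTexture;          // depth image, one depth value per pixel
    public float fx = 525f, fy = 525f;      // focal lengths in pixels (assumed)
    public float cx = 319.5f, cy = 239.5f;  // principal point in pixels (assumed)

    void Start()
    {
        int w = depthTexture.width;
        int h = depthTexture.height;
        Color[] pixels = depthTexture.GetPixels();

        var vertices = new System.Collections.Generic.List<Vector3>(w * h);
        for (int v = 0; v < h; v++)
        {
            for (int u = 0; u < w; u++)
            {
                float z = pixels[v * w + u].r;   // depth in metres (assumed encoding)
                if (z <= 0f) continue;           // skip invalid pixels

                // Back-project pixel (u, v) through the pinhole model.
                float x = (u - cx) * z / fx;
                float y = (v - cy) * z / fy;
                vertices.Add(new Vector3(x, y, z));
            }
        }

        // Index every vertex once and render the mesh as raw points.
        var indices = new int[vertices.Count];
        for (int i = 0; i < indices.Length; i++) indices[i] = i;

        var mesh = new Mesh();
        mesh.indexFormat = UnityEngine.Rendering.IndexFormat.UInt32; // allow > 65k points
        mesh.SetVertices(vertices);
        mesh.SetIndices(indices, MeshTopology.Points, 0);
        GetComponent<MeshFilter>().mesh = mesh;
    }
}
```

The resulting mesh can be drawn with a simple point or vertex-colour shader. On a real HoloLens the intrinsics should come from the device's camera/sensor projection rather than the hard-coded values used here.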