NVidia Flex VR Shader rendering issue
I am working with NVidia Flex for a fluid simulation, using the FlexDrawFluid2 shader, which is working great. However, I am running into an issue when I add Vive VR to my scene. The fluid simulation is rendering in two locations in the HMD. In an attempt to resolve the issue, I also implemented NVidia VRWorks, but the SPS and LMS modes did not help like I was hoping they would.
After looking around, I am fairly sure that the Flex shader I am using is not set up to work in VR, but I am not familiar enough with shader programming to understand how the suggested fixes would resolve my issue.
Has anyone else had this issue and successfully resolved it with a fluid shader that will run in VR? If so, I would appreciate any assistance in implementing a fix.
Thanks in advance for the help!
Answer by vizitechusa · Sep 10, 2018 at 08:13 PM
After researching other efforts around this sort of issue, I found a workaround. Create an empty GameObject under SteamVRObjects; I called mine FluidVRCameras. Under that, create two new camera objects, LeftCamera and RightCamera. Set each camera's Target Eye to "Left Only" and "Right Only" respectively, instead of "Both." Also, add the SteamVR_Camera script from the SDK to each camera.
DO NOT make these new cameras children of the default VRCamera (eye) object. That will cause double translation to affect your cameras. In fact, it seems I no longer need the VRCamera (eye) object at all: I deleted it from the hierarchy and replaced the Hmd Transforms public GameObject reference with my FluidVRCameras object. It has been working well so far.
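In case it helps, here is a rough sketch of the rig above in code. This is just my interpretation of the manual setup, not official SteamVR code: the object names are the ones I chose, `SteamVR_Camera` comes from the SteamVR plugin (depending on the plugin version you may need a `using Valve.VR;` directive for it), and you'd attach this script to the empty FluidVRCameras object.

```csharp
using UnityEngine;

// Sketch of the workaround rig described above.
// Attach to an empty GameObject under SteamVRObjects (I called it FluidVRCameras).
public class FluidVRCameraRig : MonoBehaviour
{
    void Awake()
    {
        CreateEyeCamera("LeftCamera", StereoTargetEyeMask.Left);
        CreateEyeCamera("RightCamera", StereoTargetEyeMask.Right);
    }

    Camera CreateEyeCamera(string name, StereoTargetEyeMask eye)
    {
        var go = new GameObject(name);
        go.transform.SetParent(transform, false); // keep local pose at identity

        var cam = go.AddComponent<Camera>();
        cam.stereoTargetEye = eye; // "Left Only" / "Right Only" in the Inspector

        // SteamVR_Camera ships with the SteamVR plugin; it handles HMD tracking
        // and submitting the eye textures to the compositor.
        go.AddComponent<SteamVR_Camera>();
        return cam;
    }
}
```

Doing it in the editor by hand, as described above, works just as well; the script is only a record of the same steps.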
I am able to see the Flex fluid shader in SteamVR in Unity 2017.4. I have not worked on optimizing it yet, so there may still be issues, but it is a big step in the right direction for me.
From reading various posts, I suspect there is a much more economical way to solve the shader's screen-space rendering issues. I am not a shader expert, but it seems there should be a way to make the fluid shader render to the left and right eyes directly. Then you would not need to build a camera rig like the one above and could just use the standard SteamVR setup, I believe.
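For anyone who wants to attempt that route: Unity documents a set of single-pass stereo macros for exactly this kind of screen-space sampling problem. The sketch below is generic boilerplate from that documentation, NOT the actual FlexDrawFluid2 source; the texture and struct names are placeholders I made up, and the real shader would need these macros threaded through each of its passes.

```hlsl
// Generic single-pass stereo pattern (UnityCG.cginc macros) -- a sketch,
// not the FlexDrawFluid2 code. _FluidTex is a placeholder name.
UNITY_DECLARE_SCREENSPACE_TEXTURE(_FluidTex); // instead of a plain sampler2D

struct appdata
{
    float4 vertex : POSITION;
    float2 uv     : TEXCOORD0;
    UNITY_VERTEX_INPUT_INSTANCE_ID
};

struct v2f
{
    float4 pos : SV_POSITION;
    float2 uv  : TEXCOORD0;
    UNITY_VERTEX_OUTPUT_STEREO // carries the eye index to the fragment stage
};

v2f vert(appdata v)
{
    v2f o;
    UNITY_SETUP_INSTANCE_ID(v);
    UNITY_INITIALIZE_VERTEX_OUTPUT_STEREO(o);
    o.pos = UnityObjectToClipPos(v.vertex);
    // Remap screen-space UVs so each eye samples its own half of the
    // double-wide render target instead of the full texture.
    o.uv = UnityStereoTransformScreenSpaceTex(v.uv);
    return o;
}

fixed4 frag(v2f i) : SV_Target
{
    UNITY_SETUP_STEREO_EYE_INDEX_POST_VERTEX(i);
    return UNITY_SAMPLE_SCREENSPACE_TEXTURE(_FluidTex, i.uv);
}
```

If someone gets this working inside the Flex shader itself, that would make the two-camera rig unnecessary.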
If someone reads this and succeeds in creating a fix for the FlexDrawFluid2 shader itself, I would really appreciate it if you shared it here.