Displaying the VR user's feed in a screen UI element performantly
I'm working on an asymmetric 2-player SteamVR experience in which one player uses an HMD, while the other one controls their experience with a mouse and a regular screen.
The non-VR user is shown a complicated UI, in which the central frame is supposed to show the VR user's perspective.
I know I can do it by:
1. Using a UI canvas in 'Screen Space - Overlay' mode and overlaying the UI on top of the regular non-VR display. The regular VR feed then shows through the parts of the UI that I leave transparent. However, that's not what I want: I want the VR user's full view in a scaled frame, without losing the peripheral parts that would be covered by the UI in this case.
2. Using an additional camera that is parented to the Head object and renders to a RenderTexture, which is then displayed in a RawImage UI element (roughly as in the first sketch below). This works, but it is very costly to do at good quality, since the scene gets rendered a second time.
3. Using a post-process effect to blit the VR user's feed into a RenderTexture (second sketch below). This works, but the image comes out fish-eye distorted (the distortion is later corrected by the HMD's lenses). I could use a shader to undistort it, which is currently my best bet, but I'm wondering: isn't there a simpler and less expensive way?
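For reference, this is roughly how I set up option 2. It's a minimal sketch; `mirrorCamera`, `mirrorImage` and the resolution are just placeholders from my project:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Option 2 sketch: a second camera parented to the VR Head object renders
// into a RenderTexture, which a RawImage displays inside the non-VR UI.
public class VRMirrorToUI : MonoBehaviour
{
    [SerializeField] private Camera mirrorCamera;   // child of the Head/HMD object (placeholder)
    [SerializeField] private RawImage mirrorImage;  // central frame in the non-VR UI (placeholder)
    [SerializeField] private int width = 1280;      // placeholder resolution
    [SerializeField] private int height = 720;

    private RenderTexture mirrorRT;

    private void Start()
    {
        mirrorRT = new RenderTexture(width, height, 24);
        mirrorCamera.stereoTargetEye = StereoTargetEyeMask.None; // render for the 2D screen only
        mirrorCamera.targetTexture = mirrorRT;
        mirrorImage.texture = mirrorRT;
    }

    private void OnDestroy()
    {
        if (mirrorRT != null)
        {
            mirrorRT.Release();
        }
    }
}
```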
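And this is roughly what I mean by option 3. Again a minimal sketch, assuming the built-in render pipeline (OnRenderImage); `mirrorImage` is a placeholder reference:

```csharp
using UnityEngine;
using UnityEngine.UI;

// Option 3 sketch: copy the VR camera's rendered frame into a RenderTexture
// during post-processing and show it in a RawImage. In my case the copy is
// still lens-distorted at this point, so it would need an undistort pass.
public class VRFeedCopy : MonoBehaviour // attached to the VR user's camera
{
    [SerializeField] private RawImage mirrorImage; // central frame in the non-VR UI (placeholder)

    private RenderTexture copyRT;

    private void OnRenderImage(RenderTexture source, RenderTexture destination)
    {
        // (Re)create the copy target to match the source resolution.
        if (copyRT == null || copyRT.width != source.width || copyRT.height != source.height)
        {
            if (copyRT != null) copyRT.Release();
            copyRT = new RenderTexture(source.width, source.height, 0);
            mirrorImage.texture = copyRT;
        }

        Graphics.Blit(source, copyRT);       // grab a copy for the UI
        Graphics.Blit(source, destination);  // pass the frame through unchanged
    }

    private void OnDestroy()
    {
        if (copyRT != null) copyRT.Release();
    }
}
```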
For example, do you know a way to access the final texture that is displayed as the VR user's stream on 2D screens? The thing that's behind my UI in option 1 is all I need, if I could only scale it to show fully in a UI element.