Is it possible to get the actual simulated lens info?
The camera in Unity has an FOV and not lens info as a film camera would have. I don't know enough about the camera back of the Unity camera to calculate its focal length. I would like to drive the FOV using the focal length, similar to the way Maya handles camera settings. Is it possible to get the lens info in Unity?
There is no lens; in real-time rendering there is no way of simulating a lens. Maya uses raytraced rendering, which is why it can simulate a lens, with a focal point and everything. In real time this is impossible to do, so depth is done using the FOV. There is no focal point.
What do you want to use the focal point for? Can't you just define it?
Well, the goal is to mimic the attributes from the camera setup in Maya, which does, to some degree, model a real-world 35mm camera back. It is the correlation between the FOV and how that translates into the lens info... i.e., the FOV in Unity translates to about a 22mm lens in Maya. I'm simply trying to find a way to either calculate this data within Unity, or bring that information over directly from Maya, and have it affect the FOV of the Unity camera.
Answer by hoy_smallfry · Apr 03, 2013 at 10:08 PM
@BenProductions1 is correct that real time rendering uses a different rendering method compared to the ray tracing of Maya. It's apples and oranges there.
However, there is a way to convert the two values similar to how radians/degrees or Celsius/Fahrenheit can be converted. This link explains the relationship:
http://paulbourke.net/miscellaneous/lens/
Keep in mind that this conversion is many-to-one: you can go from a focal length (plus a film back) to an FOV, but you can't recover a unique focal length from the FOV alone, because different focal length/film back combinations produce the same FOV (as explained at the bottom of the link).
Since this isn't something Unity has built in, if you really want conversion from focal length to FOV, you have to write your own function to do it. You could even go so far as to write your own inspector script for Camera components, which would give you the option of using either measurement. There are plenty of examples out there for writing your own custom scripts.
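The relationship from that link boils down to one line of trigonometry. Here is a minimal sketch in Python (the function name and values are mine, for illustration); a Unity version would do the same math in a C# script and assign the result to `Camera.fieldOfView`:

```python
import math

def focal_length_to_fov(focal_length_mm, film_back_mm):
    """Pinhole-camera relation: fov = 2 * atan(film_back / (2 * focal_length)).

    Pass the film back *height* to get a vertical FOV (which is what
    Unity's Camera.fieldOfView expects), or the *width* for a horizontal FOV.
    """
    return math.degrees(2.0 * math.atan(film_back_mm / (2.0 * focal_length_mm)))

# A 20mm lens on a 35mm full-frame back (24mm tall) gives roughly
# a 62-degree vertical FOV.
print(round(focal_length_to_fov(20.0, 24.0), 1))  # -> 61.9
```

Note that the answer depends on both arguments, which is exactly why the reverse mapping from FOV alone is ambiguous.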
I'm not sure about the "ray tracing" comment, as I am talking about OpenGL rendering via playblasting, so it is comparable to any other realtime rendering solution in that sense. Sorry for the confusion there. Nonetheless, it is the camera focal length in Maya that drives the FOV when you are looking through the camera in Maya in your panel window.
Wanting to basically mimic a Maya camera, verbatim, in Unity seems a bit backwards, I know, but as I am working to utilize Unity for more film-related purposes, the closer I can get the Unity camera to accurately mimic what I do with the camera setup in Maya, the more likely Unity could serve as a realtime rendering solution for us. BTW, I don't have many problems with writing my own scripts; however, trigonometry is not my strong suit. :)
Thanks for the comments and the link. Cheers!
Answer by backcode1 · Dec 20, 2013 at 10:17 PM
All of these systems (Maya, Unity, etc.) use a pinhole camera model. Maya does for both software and hardware rendering, and OpenGL & DirectX do on hardware. So if they are the same, why can't you just enter a focal length and get the perspective camera to look right?
The problem with most hardware rendering solutions is that they ignore the film back and just start from the convergence point. To calculate your field of view (frustum) you need to know how far the convergence point is from the film back and how wide the film back is. The same 20mm lens on a 70mm film back will look wider than it does on a 35mm film back.
If you look at digital SLR cameras, it was a big deal when someone came out with a "full-frame" sensor. This meant the sensor in the camera was the same size as the 35mm film back. Before that, sensors were smaller and would make images look more telephoto than they did on film.
Since Maya is used in Visual Effects it often has to match the film size the live action plate was shot on. For this reason, Maya has the ability to both set the size of the film back and the size of the lens. This is then used to figure out the FOV and then the FOV is used in the actual calculation to render the image. Since Unity and other realtime engines don't care about this, they simply leave it out of the calculation.
Easiest solve is to write a component that takes into account the film back and focal length and sets the camera's FOV accordingly.
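As a sanity check on the film-back point above, here is a small Python sketch of the math such a component would run. The frame widths are nominal assumptions of mine (about 36mm for a 35mm full-frame back, about 52.5mm for a 5-perf 65/70mm negative); in Unity this would live in a C# MonoBehaviour that writes `Camera.fieldOfView`:

```python
import math

def horizontal_fov(focal_length_mm, film_back_width_mm):
    # Same pinhole relation as for vertical FOV, applied to the film back width.
    return math.degrees(2.0 * math.atan(film_back_width_mm / (2.0 * focal_length_mm)))

# The same 20mm lens gets wider as the film back grows:
fov_35mm = horizontal_fov(20.0, 36.0)   # 35mm full-frame width (assumed 36mm)
fov_70mm = horizontal_fov(20.0, 52.5)   # 5-perf 65/70mm width (assumed ~52.5mm)
print(round(fov_35mm, 1), round(fov_70mm, 1))  # roughly 84.0 vs 105.4 degrees
```

This is why the component needs both inputs: the focal length alone does not determine the FOV.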
Here is a good explanation: http://cs.wellesley.edu/~cs307/readings/06-camera.html
Raytracing is irrelevant when it comes to figuring out Focal Length to FOV conversion.