Rendering a GameObject on the same frame it is created
For my project I need to create thumbnail images of various assets. I'm doing this on demand, instantiating the asset in front of a dedicated camera, rendering the camera to a texture, and then destroying the asset instance.
I would like to do this in a single script, but it seems that GameObjects do not actually appear in the scene until the frame after they're created. If I pause the editor at the point of instantiation, nothing has been added to the hierarchy; if I then step forward one frame, it appears. As a result, I render a blank texture.
Is there any way to force a region of my hierarchy to update, so I can create, render and destroy the asset all in one frame?
If not, does anyone know of a different approach?
Update: I managed to make it work. The issue was that I didn't call camera.Render(), so the camera's texture probably still contained data from the previous frame, which was blank.
Answer by Statement · Feb 24, 2016 at 09:59 AM
This should work (at least it does on my machine). The example loads an asset called "Model" from Resources, creates a render texture, and renders the model to that texture. The texture is then fed to a RawImage for display on screen.
To set it up, make sure you have a model called "Model" in a Resources folder (or create a cube, drag it into Resources, and rename it "Model"). You also need a disabled camera, a location for the thumb object, and a RawImage UI component to see the result.
Press R to generate the thumb texture.
using UnityEngine;
using UnityEngine.UI;

public class RenderThumb : MonoBehaviour
{
    public Transform thumbLoc;
    public Camera thumbCam;
    public RawImage thumbImage;

    void Update()
    {
        if (Input.GetKeyDown(KeyCode.R))
        {
            Destroy(thumbImage.texture);
            thumbImage.texture = MakeThumb("Model", 64, 64);
        }
    }

    private Texture MakeThumb(string asset, int width, int height)
    {
        // Load
        var thumbAsset = Resources.Load<GameObject>(asset);
        var thumbObj = Instantiate(thumbAsset, thumbLoc.position, thumbLoc.rotation);

        // Render
        thumbCam.targetTexture = new RenderTexture(width, height, 24);
        thumbCam.Render();

        // Unload
        Destroy(thumbObj);
        Resources.UnloadUnusedAssets();

        return thumbCam.targetTexture;
    }
}
In case any future visitors run into the same problem I had: the camera must exist for a frame before the rendering can happen. The inactive camera in the scene looks like an implementation detail, but for some reason it's a technical requirement.
If you need the position of the camera to be controlled by the newly spawned object, you can move the camera to the location of a transform in the prefab.
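A minimal sketch of that idea, using the fields from the example above — the child name "ThumbCamAnchor" is only an illustration, not something from the original prefab:

    // Sketch only: assumes the spawned prefab contains a child transform
    // named "ThumbCamAnchor" (hypothetical) marking where the thumbnail
    // camera should sit for this particular model.
    var thumbObj = Instantiate(thumbAsset, thumbLoc.position, thumbLoc.rotation);
    var anchor = thumbObj.transform.Find("ThumbCamAnchor");
    if (anchor != null)
    {
        thumbCam.transform.SetPositionAndRotation(anchor.position, anchor.rotation);
    }
    thumbCam.Render();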
Answer by Shlemon · Feb 24, 2016 at 12:20 PM
My code follows the same structure as yours, but it doesn't work.
private static void CreatePatchThumbnail(PatchTemplate patch) {
    // Create asset and texture
    PatchView patchView = new PatchView(patch, contentsRoot.transform);
    Texture2D thumbnail = new Texture2D(128, 128);

    // Render asset to texture
    camera.aspect = 1.13f;
    RenderTexture.active = camera.targetTexture;
    thumbnail.ReadPixels(new Rect(0, 0, 128, 128), 0, 0);
    thumbnail.Apply();
    RenderTexture.active = null;

    // Store texture and destroy asset
    patchThumbnails[patch] = thumbnail;
    patchView.Destroy();
}
I have a camera with a RenderTexture set as the source, and I then use ReadPixels to extract the data to a Texture2D. Also, my asset is not a single file in memory, but rather a procedurally generated compound of multiple files - so I have more complex instantiation code.
But otherwise, our programs are functionally the same. Mine does not work, though: I can render things that have existed for at least a frame, but in the same frame as the instantiation call there is nothing on screen (or in my scene hierarchy!). If I step through the program in the debugger, nothing appears after the instantiation call; the object only shows up once the frame is complete.
Is there an obvious mistake I've made somewhere?
EDIT: The problem was that I didn't call camera.Render(), as in your example. Thanks!
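For completeness, here is roughly what the fixed method looks like — the only change is the explicit camera.Render() call before ReadPixels; the class and field names are taken from my snippet above, so treat this as a sketch rather than tested code:

    private static void CreatePatchThumbnail(PatchTemplate patch) {
        // Create asset and texture
        PatchView patchView = new PatchView(patch, contentsRoot.transform);
        Texture2D thumbnail = new Texture2D(128, 128);

        // Render asset to texture. The missing piece was this explicit
        // Render() call, which draws the just-instantiated object into the
        // camera's target texture this frame instead of waiting for the
        // normal render loop.
        camera.aspect = 1.13f;
        camera.Render();
        RenderTexture.active = camera.targetTexture;
        thumbnail.ReadPixels(new Rect(0, 0, 128, 128), 0, 0);
        thumbnail.Apply();
        RenderTexture.active = null;

        // Store texture and destroy asset
        patchThumbnails[patch] = thumbnail;
        patchView.Destroy();
    }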