Drawing a continuous texture with OnGUI, need help
Hello,
I am trying to draw a continuous texture with OnGUI. I have a script that draws a texture wherever I touch the screen. While I keep moving my finger, the texture follows it, as if I were dragging it. What I want instead is to draw a texture, leave it there, and have a new texture drawn as I keep moving my finger across the screen. That is what I want to achieve, but I cannot figure out how to do it. Whatever I tried, I just ended up dragging the single texture around.
Any ideas so far?
Thank you!
Sorry, but I have no idea what you are asking. Do you have code you've written that doesn't work?
Oh, I am sorry, of course. Here is what I have for now. This just draws the texture wherever my touch is on the screen, and when I move my finger around, the texture follows. What I want instead is something like this:
#pragma strict

var image : Texture;

function OnGUI ()
{
    if (Input.touchCount > 0)
    {
        for (var t : Touch in Input.touches)
        {
            if (t.phase != TouchPhase.Ended)
            {
                GUI.DrawTexture(Rect(t.position.x, Screen.height - t.position.y, image.width, image.height),
                                image, ScaleMode.ScaleToFit, true, 0);
            }
        }
    }
}
Answer by MikeNewall · Apr 24, 2014 at 11:14 AM
Using OnGUI, one way to create a line would be to draw multiple textures. You can cache the touch position every few frames and draw a texture at each cached position. Bear in mind there will be a draw call for each texture, and OnGUI is called multiple times per frame, so on mobile this is a bad idea for performance reasons. You can reduce the number of textures making up the line by increasing the interval between recorded touch positions, which gives a less dense line (imagine lowering the flow property of a brush in Photoshop) and thus fewer draw calls. A better solution, though, is to use meshes, since they can be batched.
That said, you might be able to get away with a single draw call if you combined the textures into one each time you add a new point, using GetPixels and SetPixels, but that seems overcomplicated.
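A minimal sketch of the cached-positions approach, building on the asker's script (untested; the `interval` variable and its default value are my own assumptions for thinning the line):

```
#pragma strict

var image : Texture;          // brush texture
var interval : int = 3;       // record a point every N frames (lower = denser line)

private var points : Array = new Array();  // cached screen positions

function Update ()
{
    // Record the touch position every `interval` frames instead of every frame
    if (Input.touchCount > 0 && Time.frameCount % interval == 0)
    {
        var t : Touch = Input.GetTouch(0);
        if (t.phase != TouchPhase.Ended)
            points.Push(t.position);
    }
}

function OnGUI ()
{
    // Draw one texture per cached position; each one is a separate draw call
    for (var p : Vector2 in points)
    {
        GUI.DrawTexture(Rect(p.x, Screen.height - p.y, image.width, image.height),
                        image, ScaleMode.ScaleToFit, true, 0);
    }
}
```

Because the positions accumulate in the array, each texture stays where it was drawn instead of following the finger.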
Using the mesh approach you could use a line renderer which draws a mesh between points. If you convert the touch positions to world space you can use them as points for the line renderer.
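A sketch of the LineRenderer approach (untested; assumes a LineRenderer assigned in the inspector and a `distanceFromCamera` depth value chosen to suit your scene — the SetVertexCount call is the Unity 4.x API):

```
#pragma strict

var line : LineRenderer;
var distanceFromCamera : float = 10.0;  // depth at which to place line points

private var pointCount : int = 0;

function Update ()
{
    if (Input.touchCount > 0)
    {
        var t : Touch = Input.GetTouch(0);
        if (t.phase != TouchPhase.Ended)
        {
            // Convert the screen-space touch to a world-space point
            var worldPos : Vector3 = Camera.main.ScreenToWorldPoint(
                Vector3(t.position.x, t.position.y, distanceFromCamera));

            // Append the point to the line
            pointCount++;
            line.SetVertexCount(pointCount);
            line.SetPosition(pointCount - 1, worldPos);
        }
    }
}
```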
What I think I would do, though, is use decals. If you create a prefab from a quad with a decal shader applied, you can instantiate one at each new touch position. The decal meshes will be batched because they share the same material, so it's far better than using OnGUI. The only issue you might have is Z-fighting, but this shader shows you how to solve that:
http://wiki.unity3d.com/index.php?title=BlendedDecal
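A sketch of the decal approach (untested; `decalPrefab` is an assumed quad prefab with the blended decal material applied, and `distanceFromCamera` is an assumed depth value):

```
#pragma strict

var decalPrefab : GameObject;           // quad prefab with the decal material
var distanceFromCamera : float = 10.0;  // depth at which to place decals

function Update ()
{
    if (Input.touchCount > 0)
    {
        var t : Touch = Input.GetTouch(0);
        if (t.phase != TouchPhase.Ended)
        {
            var worldPos : Vector3 = Camera.main.ScreenToWorldPoint(
                Vector3(t.position.x, t.position.y, distanceFromCamera));

            // One decal per touch position; sharing one material lets Unity batch them
            Instantiate(decalPrefab, worldPos, Quaternion.identity);
        }
    }
}
```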
Hope that wasn't too long winded :p
Edit: When you convert the touch positions to world space, the position of the decal / line renderer point won't exactly match the position of the user's finger on screen because of the perspective applied by the camera. Rendering the decals with an orthographic camera fixes this.