Blur a texture?
I know about full-screen blur, but what I need is to blur a regular texture. Can that be done? Or does that only work full-screen? If it can be done, any samples out there?
Answer by Hoeloe · Apr 22, 2014 at 09:55 AM
Why would it only work for full screen?
Think about it. What are you doing when you apply the blur to the full screen?
First, render the camera to a RenderTexture. Then, draw that texture to the screen with the blur applied.
What you're doing here is converting the screen to a texture, and then drawing it as you would a regular texture. Just skip the first step, and use a normal texture.
I'm not sure what you mean by 'skip the first step'. Maybe I should add: this isn't really for a blurred final effect. I need to blur a low-res 16-bit grey image and use the result as the alpha channel of another, higher-res RGB image (masking).
Skip rendering the screen to a texture, and just draw a regular texture.
In this case, you can do two things. Either, a) blur it in a program like Photoshop and import it (if this is possible, do this). Or b) render the image to a texture. If you have Pro, you can use a RenderTexture for this. If you don't then you can still do it, but it's a little more complicated.
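Option b) above can be sketched in a few lines. This is a rough Unity (C#) sketch, not a complete solution: it assumes you already have a Material ("blurMaterial") whose shader performs the blur, and it is exactly the second step of the full-screen technique with the camera render replaced by a regular texture.

```csharp
using UnityEngine;

public class TextureBlur : MonoBehaviour
{
    public Texture2D source;      // the regular texture to blur
    public Material blurMaterial; // material using a blur shader (assumed to exist)
    public RenderTexture blurred; // the result; usable like any other texture

    void Start()
    {
        blurred = new RenderTexture(source.width, source.height, 0);
        // Draw 'source' into 'blurred' with the blur shader applied -
        // the same draw a full-screen blur does, minus the camera capture.
        Graphics.Blit(source, blurred, blurMaterial);
    }
}
```

The resulting RenderTexture can then be assigned to any material slot, the same as an imported texture.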
If you don't have Pro, you'll need some ugly tricks to get this to work. That might mean doing the blur on the CPU rather than with a shader (NOT recommended), or drawing the blurred texture to the screen and then copying the screen back into a Texture2D with ReadPixels.
I have Pro. I am currently blurring on CPU but frame rate is less than optimal even with multithreading. Blurring must be real time as it's live video (depth) frames. So, how do I 'apply blur' is the question. Seems like a multi-tap but I thought those were for 'the screen' and I'm trying to avoid SetPixel calls as much as possible.
Blurring on the CPU will not get you good framerates even with threading, because your CPU doesn't have nearly enough cores to run all those per-pixel operations at once. As I've said already, you need to use a RenderTexture: set it as the render target (or as a camera's target texture) so that rendering goes directly into the texture rather than to the screen. The best thing you can do is look up the documentation for these.
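For the live-video case, that per-frame loop might look something like this. Again a sketch only: "blurMaterial" is assumed to hold the blur shader, "maskedMaterial" is the material that consumes the mask, and the "_MaskTex" property name is hypothetical.

```csharp
using UnityEngine;

public class LiveFrameBlur : MonoBehaviour
{
    public Material blurMaterial;   // shader performing the blur (assumed)
    public Material maskedMaterial; // material that samples the mask (assumed)
    RenderTexture rt;

    // Call this with each incoming depth/video frame.
    public void OnNewFrame(Texture frame)
    {
        if (rt == null)
            rt = new RenderTexture(frame.width, frame.height, 0);
        // Blur entirely on the GPU - no SetPixel/GetPixel calls involved.
        Graphics.Blit(frame, rt, blurMaterial);
        maskedMaterial.SetTexture("_MaskTex", rt); // "_MaskTex" is hypothetical
    }
}
```

Because the source is low-res, the blit is cheap; the blurred RenderTexture stays on the GPU the whole time, which is what keeps this real-time.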
Ummm... It sounds, from what you've just said, that you have absolutely no idea what a shader is.
When drawing something, to the screen or a texture buffer, the GPU goes through various stages. Two of these involve defining a mesh (for 2D objects, this is just a quad), then drawing each pixel of the texture to that mesh. Both of these stages can have shaders applied to them, which alter the behaviour of that stage. Vertex shaders can, for example, change the positions of vertices. This is useful for effects such as fur or water. Fragment shaders (the second kind) have access to the texture being drawn, and the UV coordinates of the mesh at the point it's being drawn onto. You can manipulate this any way you like. That's the point of a shader, that's why they exist.
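To make the fragment-shader stage concrete, here is a minimal sketch of a blur written as one: it averages five taps around each UV coordinate. _MainTex_TexelSize is a value Unity supplies for the sampled texture; the tap count and equal weights are illustrative only, not a production-quality blur.

```shaderlab
Shader "Hidden/SimpleBlurSketch"
{
    Properties { _MainTex ("Texture", 2D) = "white" {} }
    SubShader
    {
        Pass
        {
            CGPROGRAM
            #pragma vertex vert_img
            #pragma fragment frag
            #include "UnityCG.cginc"

            sampler2D _MainTex;
            float4 _MainTex_TexelSize; // (1/width, 1/height, width, height)

            fixed4 frag(v2f_img i) : SV_Target
            {
                // Sample the centre pixel plus its four neighbours,
                // one texel away in each direction, and average them.
                float2 px = _MainTex_TexelSize.xy;
                fixed4 c = tex2D(_MainTex, i.uv);
                c += tex2D(_MainTex, i.uv + float2( px.x, 0));
                c += tex2D(_MainTex, i.uv + float2(-px.x, 0));
                c += tex2D(_MainTex, i.uv + float2(0,  px.y));
                c += tex2D(_MainTex, i.uv + float2(0, -px.y));
                return c / 5.0;
            }
            ENDCG
        }
    }
}
```

Put this shader on a material and pass that material to Graphics.Blit, and it runs once per output pixel, whether the target is the screen or a RenderTexture.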
You may start to see why it doesn't make sense to say the shader is "attached to the camera". The camera renders through the GPU pipeline like anything else - shaders on the camera are exactly the same as other shaders, they just are applied to the image containing the scene, rather than to a pre-defined texture.