"Blend SrcFactor DstFactor, SrcFactorA DstFactorA" Does not work
Hi,
I've just observed that the "Blend SrcFactor DstFactor, SrcFactorA DstFactorA" form does not work in the Sprite-Default (and Diffuse) types of shaders. More specifically, the alpha channel blending part (after the comma) does nothing for me.
I have two sprites: 1) a large background image, 2) a small white, opaque circle. I want the background to be dark (through lowering its alpha channel) and then make it lighter (i.e., bump its alpha channel) using the circle. My intent was to use a shader for the circle that would take the already-rendered background color and set its alpha channel to 1. So I copied the Sprite-Default shader and changed:
Blend One OneMinusSrcAlpha
to
Blend Zero One, One Zero
For the background shader I again copied Sprite-Default and lowered the alpha manually in the frag function:
c.a = 0.2;
No effect. After testing other possibilities, I observed that changing the alpha channel blending part (the one after the comma) to "One One" or "Zero Zero" had no effect on the result at all, as if it simply wasn't applied.
Am I doing something wrong? Is this a bug, or intended behaviour?
Thank you for reading,
"I want the background to be dark (through lowering its alpha channel) and then make it lighter (i.e., bump its alpha channel) using the circle." - I think there's some misunderstanding here - the alpha channel does not affect the lightness/darkness of the image. You'd get that by changing the RGB values.
Answer by joons · Sep 28, 2015 at 06:26 AM
(This is not the correct answer to the question. I just want to say, for Unity beginners like me, that "Blend SrcFactor DstFactor, SrcFactorA DstFactorA" does work in some situations, because I was misled by the topic title.)
The alpha factors work when I draw onto a transparent RenderTexture (ARGB32).
The alpha factors don't work when I draw onto the screen (opaque background), maybe because the destination alpha there is always 1.0.
To test the separated alpha factors easily:
create a sub camera,
set the sub camera's Target Texture to the RenderTexture,
set its Clear Flags to Solid Color,
set its Background color to be transparent (A < 255),
draw the object with the alpha factors,
and draw a Quad (or Plane) textured with the RenderTexture on the main camera.
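The observation above can be sketched numerically (plain Python, not Unity code; the pixel values are made up). On an opaque backbuffer the destination alpha is pinned at 1.0 and the stored alpha is never displayed, so changing SrcFactorA/DstFactorA only alters a channel nobody reads; on a RenderTexture, that alpha is sampled later and the factors matter:

```python
# Sketch: why the separate alpha factors seem to "do nothing" on screen.
src = (1.0, 0.5, 0.0, 0.5)   # shader output
dst = (0.1, 0.2, 0.3, 1.0)   # opaque backbuffer pixel (alpha pinned at 1.0)

results = []
for src_fa, dst_fa in [(1.0, 1.0), (0.0, 0.0), (1.0, 0.0)]:
    # RGB uses the usual SrcAlpha / OneMinusSrcAlpha factors throughout.
    rgb = tuple(s * src[3] + d * (1.0 - src[3]) for s, d in zip(src[:3], dst[:3]))
    a = src[3] * src_fa + dst[3] * dst_fa   # only the stored alpha differs
    results.append((rgb, a))

for rgb, a in results:
    print(rgb, a)   # identical RGB every time; only the unseen alpha varies
```

The displayed RGB is identical for all three alpha-factor choices; the difference only shows up if something later reads the target's alpha channel.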
Answer by Bunny83 · Sep 26, 2015 at 01:28 PM
Like @tanoshimi said, you misunderstood how the blending works. Blending is only done by the blend function; your alpha factors have no effect on the visible output. All you're doing is using the alpha channel of the source (the shader output) but the color of the background (dest). The alpha channel of the framebuffer has no effect on how the color is displayed; it's just another channel.
Blending happens when a new color output from a shader is written to the framebuffer.
The source factor defines how the source color should affect the output while the destination factor defines how the original / current color should be treated.
How the two colors are actually combined after they get multiplied by their factors is defined by the blend operation (BlendOp).
The term alpha blending usually refers to this setup:
BlendOp Add
Blend SrcAlpha OneMinusSrcAlpha
This simply adds the source color and dest color together, but multiplies the source with its own alpha value and the destination color by "1 - source alpha".
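As a worked example of that formula (plain Python, not shader code; the pixel values are made up — note the alpha channel uses the same factors, so the SrcAlpha factor applied to alpha is src.a itself):

```python
# Sketch of "BlendOp Add" + "Blend SrcAlpha OneMinusSrcAlpha" on one RGBA pixel.
src = (1.0, 0.0, 0.0, 0.5)   # shader output: red at 50% alpha
dst = (0.0, 0.0, 1.0, 1.0)   # framebuffer: opaque blue

# out = src * src.a + dst * (1 - src.a), applied to every channel
out = tuple(s * src[3] + d * (1.0 - src[3]) for s, d in zip(src, dst))
print(out)  # (0.5, 0.0, 0.5, 0.75) — half red, half blue
```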
This
Blend Zero One, One Zero
simply says:
Ignore the source color
Use only the destination color
With blend operation "add" the resulting color will be just the destination color.
The 3rd and 4th parameters just affect the alpha value that gets written into the framebuffer. There you say:
Use the source alpha
Ignore the destination alpha
This will result in just the source alpha value.
Again, at this point the "blending" is finished. The outputted color is the final color.
You might want to take a look at Unity's documentation.
I've found this article where the actual factors are explained well. The article focuses on OpenGL, but it almost always works the same way, since that's what the GPU provides.
I'm not sure I understood what you want to do. However, if your background actually contains "something" (texture / color information) and you render it "black" / "dark", that information is lost. Something like that can be done the other way around: draw something on top of the background that darkens everything except the thing you want to see. This can be done by drawing a fullscreen mesh on top and placing your "hotspot" at the desired point with vertex and / or UV manipulation.
Another way is to use a 3 pass method.
First draw the background at full brightness. Write 0 alpha to the framebuffer
Draw your hotspot object. Have that object write 1 alpha to the framebuffer
Draw another full screen dark quad with a shader that blends based on the destination alpha value.
So the background would be drawn with
Blend One Zero
The hotspot would be drawn with
Blend Zero One, One Zero
Make sure your background has its alpha set to 0 or close to 0.
The "darkening" quad would be drawn with
Blend OneMinusDstAlpha DstAlpha
This would do the following:
Simply draw the background as it is, ignore whatever was in the framebuffer.
When drawing the hotspot, ignore the color of the hotspot and only use its alpha channel. Only the alpha channel is written to the framebuffer; the color stays the same. The resulting framebuffer just contains the full-bright background texture and our alpha mask, which is 0 for the background region and 1 for the hotspot.
When drawing the "darkening" quad, it will blend itself with the background texture based on the alpha value of the framebuffer. So wherever there is a "0" it will render the darkening quad on top. Wherever there's a "1" it won't draw anything.
The result would be a dark screen where you see the background only at the point where you've drawn the hotspot.
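The three passes can be traced numerically (plain Python, not shader code; all pixel values and helper names here are illustrative, not Unity API):

```python
# Sketch of the 3-pass setup on one pixel, inside vs. outside the hotspot.
def blend(src, dst, sf, df, sfa=None, dfa=None):
    """BlendOp Add with per-channel factors; each factor takes (src, dst)."""
    sfa, dfa = sfa or sf, dfa or df
    rgb = tuple(s * sf(src, dst) + d * df(src, dst)
                for s, d in zip(src[:3], dst[:3]))
    a = src[3] * sfa(src, dst) + dst[3] * dfa(src, dst)
    return rgb + (a,)

ONE  = lambda s, d: 1.0
ZERO = lambda s, d: 0.0
ONE_MINUS_DST_ALPHA = lambda s, d: 1.0 - d[3]
DST_ALPHA           = lambda s, d: d[3]

bg      = (0.8, 0.6, 0.4, 0.0)   # background, writes alpha 0
hotspot = (1.0, 1.0, 1.0, 1.0)   # hotspot, writes alpha 1
dark    = (0.0, 0.0, 0.0, 1.0)   # full-screen darkening quad

# Pass 1: Blend One Zero — background replaces the framebuffer.
fb = blend(bg, (0.0, 0.0, 0.0, 1.0), ONE, ZERO)
# Pass 2 (inside the hotspot only): Blend Zero One, One Zero.
inside  = blend(hotspot, fb, ZERO, ONE, ONE, ZERO)
outside = fb
# Pass 3: Blend OneMinusDstAlpha DstAlpha — dark quad where mask alpha is 0.
print(blend(dark, inside,  ONE_MINUS_DST_ALPHA, DST_ALPHA))  # background survives
print(blend(dark, outside, ONE_MINUS_DST_ALPHA, DST_ALPHA))  # goes dark
```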
To draw those 3 things in order you can use either the queue tag in the shader or use 3 cameras + 3 layers.
Thanks, but I just wanted to say that 'Blend SrcFactor DstFactor, SrcFactorA DstFactorA' works.
After reading the API reference, I searched for examples that use separated alpha blending, but I couldn't find any.
And when I found this topic title, I thought: 'Does the separated alpha factor not always work because of a bug? Does no answer mean no solution? Or does no one use it?'
I know how the blend function works in OpenGL, but I don't know much about Unity's drawing mechanism.
In my case, I wanted to port my project from OpenGL to Unity; I use glBlendFuncSeparate() to draw with multiply blending, with alpha as transparency:
DstColor OneMinusSrcAlpha, Zero One
My project is not so small and porting is not easy for me. So I was confused about Unity shaders, and I was misled by this topic title for an hour.
That's why I wrote 'it works in some situations'.
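For reference, that Unity blend line corresponds to glBlendFuncSeparate(GL_DST_COLOR, GL_ONE_MINUS_SRC_ALPHA, GL_ZERO, GL_ONE); a numeric sketch of the result (plain Python, not shader code; the pixel values are made up):

```python
# Sketch of "Blend DstColor OneMinusSrcAlpha, Zero One" on one RGBA pixel:
# multiply blending weighted by source alpha, destination alpha preserved.
src = (0.5, 0.5, 0.5, 0.75)  # multiply layer at 75% opacity
dst = (0.8, 0.4, 0.2, 1.0)   # current render target pixel

rgb = tuple(s * d + d * (1.0 - src[3])           # DstColor, OneMinusSrcAlpha
            for s, d in zip(src[:3], dst[:3]))
a = src[3] * 0.0 + dst[3] * 1.0                   # Zero, One
out = rgb + (a,)
print(out)
```

With src alpha at 1.0 this degenerates to a pure multiply (src * dst); with src alpha at 0.0 and a black source it leaves dst untouched.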