Fading Issue
I've tried two methods to fade a sprite's alpha to 0 so that it fades away. Both are in the code below - I just comment one or the other out as I test. In either case, as soon as the fade code starts, the sprite disappears completely. I can see the alpha value I'm trying to set change, but there is no fade, even if I change the fade duration. It just instantly blinks out of existence. Any thoughts? Thanks!
using UnityEngine;
using System.Collections;

public static class ExtensionMethods {
    public static void SetAlpha (this Material material, float value) {
        Color color = material.color;
        color.a = value;
        material.color = color;
    }
}
public class FadeAway : MonoBehaviour {
    public bool autoDestroy = true;
    public float delay = 1.0f;

    // Use this for initialization
    void Start () {
        // StartCoroutine(FadeTo(0.0f, 1.0f));
        StartCoroutine(FadeScript(1.0f));
    }

    IEnumerator FadeTo(float aValue, float aTime)
    {
        Debug.Log("Entered FadeTo");
        yield return new WaitForSeconds(delay);
        Debug.Log("Delay Done");
        float alpha = transform.renderer.material.color.a;
        for (float t = 0.0f; t <= 1.0f; t += Time.deltaTime / aTime)
        {
            Color newColor = new Color(1, 1, 1, Mathf.Lerp(alpha, aValue, t));
            Debug.Log("About to transform...");
            transform.renderer.material.color = newColor;
            Debug.Log(newColor.a);
            if (newColor.a <= 0.05)
            {
                Debug.Log("Destroyed");
                Destroy(gameObject);
            }
            yield return null;
        }
    }

    IEnumerator FadeScript(float aTime)
    {
        Debug.Log("Entered FadeScript");
        yield return new WaitForSeconds(delay);
        Debug.Log("Delay Done");
        for (float t = 0.0f; t <= 1.0f; t += Time.deltaTime / aTime)
        {
            renderer.material.SetAlpha(renderer.material.color.a - .1F);
            Debug.Log(renderer.material.color.a);
        }
    }
}
Answer by Xepherys · Feb 17, 2014 at 02:01 AM
I was able to resolve this myself - the issue was with the sprite's order in the layer. For whatever reason it was visible when it spawned in, but as soon as the alpha blending started it was no longer in the right draw order. I changed the order from -1 to 1 on the prefab, and it works great. Thanks for the help!
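For readers hitting the same symptom: the same fix can also be applied from code rather than on the prefab. A minimal sketch, assuming a modern Unity SpriteRenderer (the class name SortingFix is just illustrative, not from this thread):

```csharp
using UnityEngine;

public class SortingFix : MonoBehaviour {
    void Start () {
        // A sprite with a negative order in layer can end up drawn
        // behind other geometry once alpha blending kicks in; raising
        // the order keeps it in front while it fades.
        SpriteRenderer sr = GetComponent<SpriteRenderer>();
        sr.sortingOrder = 1; // was -1 on the prefab in this thread
    }
}
```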
You should accept this answer. If you don't know how, watch the broken video tutorial to the right and read the FAQ.
Answer by Benproductions1 · Feb 16, 2014 at 05:41 AM
yield, in the context you're using it, causes the function to return and then continue from the point of the yield after the condition is met, i.e. the delay has ended. So if we take a look at one of your two "methods" (they cause the same result either way), let's pick the smaller one: FadeScript.

Since we have a problem, let's debug it. The first debugging tool we have at our disposal is our own brain, so let's act like C# and step through FadeScript() to figure out what the problem is.

First we call Debug.Log - irrelevant to our problem. Next we yield return a WaitForSeconds for delay seconds, which means we wait for a certain amount of time... irrelevant to our problem. Then we have another Debug.Log - again irrelevant, nothing wrong here. Then we run a for loop over t; for each iteration while t <= 1 we add some value to t. During the loop we continually decrease the alpha of the material. We do all of this during one frame, because nowhere were we instructed to wait or do anything else.

Can you spot the problem? You need to revise how to use yield.
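Ben's point can be reproduced outside Unity with a plain C# iterator: a loop with no yield inside it runs to completion within a single MoveNext() call (one "frame"), while yielding each iteration spreads the work across calls. A minimal sketch, with illustrative names and an exactly-representable 0.25 step to avoid float drift:

```csharp
using System;
using System.Collections;

static class YieldDemo
{
    // Mirrors the broken FadeScript: the fade loop contains no yield,
    // so the entire fade runs inside one MoveNext() call.
    public static IEnumerator FadeAllAtOnce()
    {
        yield return null;      // stands in for WaitForSeconds(delay)
        float alpha = 1.0f;
        while (alpha > 0.0f)
            alpha -= 0.25f;     // no yield: drains in one "frame"
        yield return alpha;     // by now alpha is already 0
    }

    // The corrected pattern: yield once per iteration, so each
    // decrement lands on its own MoveNext() call ("frame").
    public static IEnumerator FadePerFrame()
    {
        float alpha = 1.0f;
        while (alpha > 0.0f)
        {
            alpha -= 0.25f;
            yield return alpha; // hand control back each step
        }
    }

    public static int CountSteps(IEnumerator e)
    {
        int steps = 0;
        while (e.MoveNext()) steps++;
        return steps;
    }

    static void Main()
    {
        Console.WriteLine(CountSteps(FadeAllAtOnce())); // 2: fade finished between two yields
        Console.WriteLine(CountSteps(FadePerFrame()));  // 4: one step per decrement
    }
}
```

Unity drives coroutines the same way, calling MoveNext() once per frame, which is why the material's alpha hits its final value before the next frame is ever rendered.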
Hmmm, but that isn't what is wrong with the other method he tried. Which looks like it should work to me. @Xepherys - what happened when you tried the FadeTo method?
Both methods instantly make the sprite not visible.
Ben, I'd love to agree with you, but that isn't the way it appears in either instance, as I can watch my console and see the value of the alpha drop. The problem is that the first time the for loop begins, the sprite is given an alpha value, but disappears from view. Each iteration of the loop thereafter decreases the alpha value (the value held there correctly decrements) but is irrelevant since the sprite isn't visible anyhow.
And your debug shows an instant 0 for the alpha value? (Please post comments using the Add New Comment button, I converted your last one from an Answer).
@Xepherys The for loop runs all at once, with no delay between iterations. That doesn't mean your Debug.Log output won't show up. If that is not what happens, then you should make sure that you are actually calling the FadeScript function and that you posted the correct code.
I am definitely calling FadeScript. I even changed it to the following:
IEnumerator FadeScript(float aTime)
{
    Debug.Log("Entered FadeScript");
    yield return new WaitForSeconds(delay);
    Debug.Log("Delay Done");
    for (float t = 0.0f; t <= 1.0f; t += Time.deltaTime / aTime)
    {
        yield return new WaitForSeconds(renderer.material.color.a);
        renderer.material.SetAlpha(renderer.material.color.a - .1F);
        Debug.Log(renderer.material.color.a);
    }
}
Still disappears instantly the first time through. I can see that the yield is working, because my debug of renderer.material.color.a counts down much more slowly. If this isn't what you mean by adding a delay, please let me know what I'm doing wrong. The way I read this, the delay should tick each time the for loop runs, and the delay should get shorter and shorter with each run of the loop. Regardless, the very first time renderer.material.SetAlpha ticks, the whole sprite goes invisible.
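For completeness, here is one way the fade loop could be restructured so the alpha ramps down over aTime seconds with one yield per frame. This is a sketch assuming the SetAlpha extension method and fields from the question's code; it addresses the missing per-frame yield, not the sorting-order issue that turned out to be the actual cause of the invisible sprite:

```csharp
IEnumerator FadeOut(float aTime)
{
    yield return new WaitForSeconds(delay);
    float startAlpha = renderer.material.color.a;
    for (float t = 0.0f; t <= 1.0f; t += Time.deltaTime / aTime)
    {
        // Lerp from the starting alpha down to fully transparent,
        // then hand control back to Unity until the next frame.
        renderer.material.SetAlpha(Mathf.Lerp(startAlpha, 0.0f, t));
        yield return null;
    }
    renderer.material.SetAlpha(0.0f); // snap to the final value
}
```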