AudioSource and AudioSettings help needed to keep perfect sync between music audio loops.
I'm working on a game for my master's project; unfortunately I had no game development or scripting experience before I started. Right now I'm working on its most complex aspect. The concept is to deliver a composition interactively, using loops of audio which are played, muted and changed (changed only by area) by the player. Because each loop will carry one instrument, or a small group of instruments, I need all the playing clips to stay in synchronisation with each other. I've started writing something, but I need more information about the functions than the Unity docs seem to provide; their audio sections are very limited, and this game will essentially have to function as a music sequencer.
Because I want to configure how each loop is played in musical time, I'll need some maths. I'm thinking of using AudioSettings.dspTime as the absolute clock that all the audio-emitting objects take their reference from. Playing a loop would be initiated with AudioSource.PlayScheduled, to make sure the audio starts on a configured bar/beat/other division, while AudioSource.SetScheduledEndTime would be used to make sure the loop doesn't overrun, and another play isn't called, until the next division of time I want the music to play on is reached. As I understand it, the only way to make anything run faster than the Update function is to use the OnAudioFilterRead MonoBehaviour callback; unfortunately I've only been working in C#, the documentation for it is only in JavaScript, and I still don't quite understand how it works, especially as the example in the AudioSettings.dspTime documentation doesn't work at all when I paste it into my project; it just comes back with an error.
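To show roughly what I mean by taking a reference from the DSP clock, here's a minimal sketch of scheduling a loop to start exactly on the next bar boundary. All the field names (bpm, beatsPerBar, startDspTime) are illustrative, and it assumes the clip is trimmed to an exact number of bars:

```csharp
using UnityEngine;

// Sketch: start an AudioSource exactly on the next bar boundary,
// using AudioSettings.dspTime as the shared clock.
[RequireComponent(typeof(AudioSource))]
public class ScheduledLoop : MonoBehaviour
{
    public double bpm = 120.0;
    public int beatsPerBar = 4;

    private AudioSource source;
    private double startDspTime;   // dspTime at which the song's bar 0 began

    void Start()
    {
        source = GetComponent<AudioSource>();
        startDspTime = AudioSettings.dspTime;
        ScheduleOnNextBar();
    }

    void ScheduleOnNextBar()
    {
        double secondsPerBar = (60.0 / bpm) * beatsPerBar;
        double elapsed = AudioSettings.dspTime - startDspTime;
        // Round up to the next whole bar since the song started.
        double barsElapsed = System.Math.Ceiling(elapsed / secondsPerBar);
        double nextBar = startDspTime + barsElapsed * secondsPerBar;
        source.PlayScheduled(nextBar);
        // Optionally cut playback at the end of that bar:
        // source.SetScheduledEndTime(nextBar + secondsPerBar);
    }
}
```

Note that PlayScheduled and SetScheduledEndTime take times in seconds on the dspTime clock, not in samples, so all the arithmetic stays in seconds.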
I tried asking for help on the forum with no luck. I don't expect anyone to write the scripts for me; what I would like are more examples of how these functions can be used for the type of game I'm building, that is, using scripts to control every aspect of playback, including the points at which to loop, play and mute/unmute audio, while making sure all the objects containing audio (of which there will be many) stay in strict synchronisation with each other.
I've been trying to put some of this into my mute script; this is how far I've got so far.
using UnityEngine;
using System.Collections;

[RequireComponent(typeof(AudioSource))]
public class Audiomute : MonoBehaviour
{
    private GameObject player;
    private Crosshair looking;
    private AudioSource source;

    public int bPMCalibration = 120;
    public int beatsInBar = 4;
    public int division = 4;
    public int clipLength = 4;      // length of the loop, in divisions
    public int startDivision = 1;
    public bool triplet = false;

    private bool isPlaying;
    private double sampleRate;
    private double beat;            // length of one division, in samples
    private double startDelay;      // in samples
    private double stopAfter;       // in samples

    void Awake()
    {
        source = GetComponent<AudioSource>();
        // Assign to the field rather than declaring a local that shadows it.
        player = GameObject.Find("Player/Camera");
        looking = player.GetComponent<Crosshair>();
    }

    void Start()
    {
        source.loop = true;
        source.playOnAwake = true;
        isPlaying = true;

        double startTick = AudioSettings.dspTime;
        sampleRate = AudioSettings.outputSampleRate;
        double nextTick = startTick * sampleRate;              // current position in samples
        double qbeat = (sampleRate * 60.0) / bPMCalibration;   // samples per quarter note
        double bar = ((qbeat * 4.0) / division) * beatsInBar;  // samples per bar

        if (!triplet)
        {
            beat = bar / division;
            startDelay = bar / startDivision;
        }
        else
        {
            beat = (bar / division) / 3.0;
            startDelay = (bar / startDivision) / 3.0;
        }
        stopAfter = beat * clipLength;
        // Note: PlayScheduled/SetScheduledEndTime take seconds of dspTime,
        // so these sample counts need dividing by sampleRate before use.
    }

    void OnAudioFilterRead(float[] data, int channels)
    {
        // Runs on the audio thread, in between game frames. Not used yet.
        double samplesPerTick = sampleRate * 60.0 / bPMCalibration * 4.0 / division;
        double sample = AudioSettings.dspTime * sampleRate;
    }

    void OnMouseDown()
    {
        if (looking.imnear)
        {
            source.mute = !source.mute;
            // maybe change colour of halo
            Debug.Log(source.mute ? "Sound Off" : "Sound On");
        }
    }
}
I should add that in this script I want the mute function to work only at points set by startDivision, and the looping must be controlled by PlayScheduled and SetScheduledEndTime, set by clipLength, with reference to the BPM and time signature.
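One way to restrict the mute to division points might be to record the request and only apply it when the next boundary passes. This is a rough sketch, not sample-accurate (it lands within one frame of the boundary, which may be acceptable for a control change like mute); the songStartDspTime field and the division maths are assumptions standing in for whatever shared metronome actually exists:

```csharp
using UnityEngine;

// Sketch: quantise a requested mute toggle to the next division boundary.
[RequireComponent(typeof(AudioSource))]
public class QuantisedMute : MonoBehaviour
{
    public double bpm = 120.0;
    public int beatsInBar = 4;
    public int division = 4;          // divisions per bar, illustrative
    public double songStartDspTime;   // set by whatever starts the song

    private AudioSource source;
    private bool togglePending;

    void Awake() { source = GetComponent<AudioSource>(); }

    // Call this from OnMouseDown instead of muting immediately.
    public void RequestToggle() { togglePending = true; }

    void Update()
    {
        if (!togglePending) return;
        double secondsPerBar = (60.0 / bpm) * beatsInBar;
        double secondsPerDivision = secondsPerBar / division;
        double elapsed = AudioSettings.dspTime - songStartDspTime;
        double sinceBoundary = elapsed % secondsPerDivision;
        // A boundary passed within the last frame: apply the toggle now.
        if (sinceBoundary < Time.deltaTime)
        {
            source.mute = !source.mute;
            togglePending = false;
        }
    }
}
```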
Would procedural audio help me at all if I want to use loops? (As a musician I'm not keen on it; it seems to me like the kind of thing you do when you don't want to employ a composer or sound designer.)
Thanks for the Events tip, I'll certainly look into that; maybe I can use it to set up a metronome which the other objects can use as a reference for when things can happen. The only thing that worries me is whether you can use it with OnAudioFilterRead, because for me triggering audio events needs to happen faster than game frames. Ideally I need sample-level accuracy, or close to it if it can keep readjusting itself.
Events should work from OnAudioFilterRead; the way I've understood it, they're basically lists of functions. You just loop over the list and call every function in it. So whether that happens every game frame, or less or more frequently, the listener is none the wiser: it just knows its event has been triggered.
Thanks for that. I think what I'll try to do is build a metronome that derives bars and beats from dspTime, to be used by the scripts on the objects that play audio. Each object's script will then take the time of the last bar or beat and the time at which an action was triggered, and delay the desired action by calculating the difference between the last event, the player's action, and the set time after an event at which the player's action may be executed.
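The metronome idea above could look something like this: Update() watches the DSP clock and raises an event whenever a beat boundary has been crossed, passing along the exact dspTime of that beat. The event fires on a game frame, but because listeners receive the precise beat time they can still schedule ahead with PlayScheduled and stay sample-accurate. All names here (Metronome, OnBeat) are illustrative:

```csharp
using UnityEngine;
using System;

// Sketch of a dspTime-driven metronome.
public class Metronome : MonoBehaviour
{
    public double bpm = 120.0;
    public int beatsPerBar = 4;

    // (bar, beat, exact dspTime of the beat)
    public static event Action<int, int, double> OnBeat;

    private double startDspTime;
    private int nextBeatIndex;

    void Start()
    {
        startDspTime = AudioSettings.dspTime;
        nextBeatIndex = 0;
    }

    void Update()
    {
        double secondsPerBeat = 60.0 / bpm;
        // Fire every beat whose exact time has already passed on the DSP clock.
        while (AudioSettings.dspTime >= startDspTime + nextBeatIndex * secondsPerBeat)
        {
            double beatTime = startDspTime + nextBeatIndex * secondsPerBeat;
            int bar = nextBeatIndex / beatsPerBar;
            int beat = nextBeatIndex % beatsPerBar;
            if (OnBeat != null)
                OnBeat(bar, beat, beatTime);
            nextBeatIndex++;
        }
    }
}
```

A listener would subscribe in OnEnable, unsubscribe in OnDisable, and when it wants to act on the next beat, call PlayScheduled(beatTime + secondsPerBeat) using the beat time it was handed, rather than reading the clock again.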