“I want to create a musical experience for the player where they are driving the melody.” James outlines how he uses Unity3D and MIDIUnified to achieve this.
James Bowling is an independent game developer from Blocky Pixel working on a rhythm-based adventure game called deSYNC. You can follow his progress here.
Being part rhythm game, sound is one of the most important aspects of deSYNC. I want the player’s interactions with the environment to be the driving force behind the music: the better the player does, the more layered the music becomes. The path the player takes changes the melody – loop the same area and the melody will repeat; go backwards and you’ll hear the melody in reverse.
Check out Build 3 here.
Keep in mind this is all “programmer tunes”, so it’s more to demonstrate the tech than to put together something that sounds awesome. I will probably gratuitously misuse music terminology here too.
The first thing you’ll notice with this build is that I’ve replaced the metronome tick with a bass-kick sound. It’s important that I get in that half beat. I still need to work on the readability of the beat patterns, but the backing rhythm is the first part of that. I’m also planning on getting those half-beat markers pulsing to their rhythm, which should make things a little clearer.
As you move the marker around the map, you’ll hear the melody start to play.
When I first started thinking about how to solve this, I thought I would need to sample each instrument, use some audio sources, and try to pick random notes to create the melody. There are a few problems with this – first, it’s a pain in the arse. I’d need to create a different audio file for each note, and there are eight of them per octave in each scale. Then I’d need new audio files for each instrument I want in the melody too. I didn’t want to spend a day exporting audio and tweaking it in Audacity.
Then I had a thought. What if I kick it old school? What if I reach back to the days of gaming-old and dust off… MIDI. Good ol’ MIDI! After a quick search on the Unity Asset Store I found a plugin called MIDIUnified for the low-low price of $10.
Bear with me for a bit, because this will take a little to get through. If you’re not a dev, you may want to tune out.
I’ve used a few music toys on the iPad, and what a lot of them do is give you a scale (e.g. C Major) and let you play around within an octave. It turns out you can move up and down a scale and it sounds not-terrible. Change the scale, keep the pattern, and it still sounds cool, just feels different – a C Major scale has a different feel to a C Minor scale. So, if I just define the melody as steps in a scale, I can swap out the scale and it will still sound nice. Not only that, if every eight beats I shift the octave, I add another layer of dynamics to what’s being generated. Using MIDI allows me to get these features easily.
MIDI works by passing a number representing a note to the MIDI controller. Middle C is MIDI note 60, so I can map every note from middle C upwards with 12 numbers and chuck them in an enum. I can then use each of these enum values to play a note through the MIDI controller.
public enum Note
{
    C = 60, // middle C
    CSharp = 61, // Db
    D = 62,
    DSharp = 63, // Eb
    E = 64,
    F = 65,
    FSharp = 66, // Gb
    G = 67,
    GSharp = 68, // Ab
    A = 69,
    ASharp = 70, // Bb
    B = 71
}
Once I had these values, I threw them into a List<> so I could access them by index. This is what I’ll reference to get the note value to pass to the MIDI controller.
List<Note> NoteMap = Enum.GetValues(typeof(Note)).Cast<Note>().ToList();
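As a quick sanity check, indexing into the list and casting back to an int gives the raw MIDI number:

int middleC = (int)NoteMap[0]; // Note.C = 60
int e = (int)NoteMap[4];       // Note.E = 64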
I created a second map to make things a little less confusing when configuring the scales. Musical notes have multiple names, and different scales will use different names.
private static class NoteMapOffset
{
    public static int BSharp = 0;
    public static int C = 0;
    public static int CSharp = 1;
    public static int Db = 1;
    public static int D = 2;
    public static int DSharp = 3;
    public static int Eb = 3;
    public static int E = 4;
    public static int Fb = 4;
    public static int ESharp = 5;
    public static int F = 5;
    public static int FSharp = 6;
    public static int Gb = 6;
    public static int G = 7;
    public static int GSharp = 8;
    public static int Ab = 8;
    public static int A = 9;
    public static int ASharp = 10;
    public static int Bb = 10;
    public static int B = 11;
    public static int Cb = 11;
}
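The enharmonic spellings resolve to the same offset, so each scale definition can use whichever name is conventional for its key:

// CSharp and Db name the same pitch – both index entry 1 in NoteMap.
bool samePitch = NoteMapOffset.CSharp == NoteMapOffset.Db; // true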
So now that I have my mapping sorted out, I can start to add scales, which look something like this:
public const int OctaveLength = 12;
public static int[] CMajor = {
    NoteMapOffset.C,
    NoteMapOffset.D,
    NoteMapOffset.E,
    NoteMapOffset.F,
    NoteMapOffset.G,
    NoteMapOffset.A,
    NoteMapOffset.B,
    OctaveLength + NoteMapOffset.C
};
public static int[] CbMajor = {
    NoteMapOffset.Cb,
    OctaveLength + NoteMapOffset.Db,
    OctaveLength + NoteMapOffset.Eb,
    OctaveLength + NoteMapOffset.Fb,
    OctaveLength + NoteMapOffset.Gb,
    OctaveLength + NoteMapOffset.Ab,
    OctaveLength + NoteMapOffset.Bb,
    OctaveLength + NoteMapOffset.Cb
};
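Written out as raw offsets, CbMajor comes to { 11, 13, 15, 16, 18, 20, 22, 23 } – the root Cb (11) sits at the top of the base octave, so every later degree gets the OctaveLength bump to keep the scale ascending.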
Next up I created a Controller to manage all of these melodic sequences, notes and scales. I added a method to it called “GetMidiNote”. It takes the scale to play, the index of the note within that scale, and an optional octave shift.
public int GetMidiNote(int[] scale, int index, int octave = 0)
{
    var note = 0;
    var noteMapIndex = scale[index];
    // Offsets past the base octave wrap around to the start of NoteMap,
    // then get bumped up an octave.
    if (noteMapIndex >= OctaveLength)
    {
        note = (int)NoteMap[noteMapIndex - OctaveLength];
        note += OctaveLength;
    }
    else
    {
        note = (int)NoteMap[noteMapIndex];
    }
    // Shift the result by whole octaves if requested.
    note += (octave * OctaveLength);
    return note;
}
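A couple of quick examples of how that resolves, using the CMajor array from above:

var e = GetMidiNote(CMajor, 2);      // scale[2] is E’s offset (4), so NoteMap[4] = 64
var highC = GetMidiNote(CMajor, 7);  // scale[7] is 12 + C, which wraps to NoteMap[0] + 12 = 72
var cUp = GetMidiNote(CMajor, 0, 1); // the octave parameter shifts the result: 60 + 12 = 72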
And that’s it! Now I can have a melody, say “3, 2, 3, 0, 1, 2”, and play it in any scale and in any octave. I can change the feel of the sound by simply changing the scale I’m currently using, and I can use MIDIUnified to play that MIDI note value.
The final step of the process was integrating it into the player movement. Each of the connectors has a melody value for each of its beats. As the player taps to the rhythm, I tell the MIDI controller to play the connector’s note.
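Roughly, the tap handler ends up looking something like this sketch – Connector.MelodyIndices and PlayMidiNote are stand-ins for my actual connector data and the MIDIUnified note-on call, so treat the names as illustrative:

// Called when the player taps in time while travelling a connector.
void OnBeatTapped(Connector connector, int beatIndex)
{
    int melodyIndex = connector.MelodyIndices[beatIndex]; // step in the scale for this beat
    int midiNote = GetMidiNote(currentScale, melodyIndex, currentOctave);
    PlayMidiNote(midiNote); // hand the note off to MIDIUnified
}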
This ended up a lot longer than I planned, but hopefully it gives some insight into how I’m generating the audio in deSYNC.