Having released Let There Be Smite into the world I’ve started work on a couple of new games. One is a more straightforward one based on those amazing safety instructions you get on the plane – that’s the one I’m “really” working on. But I’ve also been going through the interesting new experience of producing kind of “technical prototypes” for another game, tentatively titled The Biggest Opa, which is about dancing to the Zorba the Greek song.
In particular, I’ve had to spend time investigating ways to synchronise animation with music in Flash, specifically in FlashBuilder (the non-graphical IDE for coding in AS3). I’ve always been deeply inspired by the work of that amazing guy who produced sweet synchronised music-with-words (if you remember this stuff, please send me a link to jog my memory), but I’ve never really understood how you could sync such things.
Butt-loads of research later, I’m still not that much more enlightened, but I’ve at least tried a couple of things out. The main approach I’ve run into on the internet uses a program called FlashAmp to list the amplitude of the music per “frame”. You can then trigger stuff based on the current amplitude. Of course, you can also do that directly in code by checking the amplitude of the sound as it plays, but the list has the advantage of foreknowledge – you know when certain amplitudes will be hit and can prepare accordingly. Since The Biggest Opa is a rhythm game, I need to know when particular beats are coming so I can animate some kind of icon telling the player when to hit the right buttons. But of course FlashAmp is Windows-only and I’m not prepared to go there, so…
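To make the foreknowledge point concrete, here’s roughly what an amplitude list buys you, sketched in TypeScript rather than AS3 (the numbers, the threshold, and the helper name are all made up for illustration, not anything FlashAmp actually outputs):

```typescript
// Precomputed amplitude per frame – the kind of list FlashAmp would export.
// These values are invented for the sake of the example.
const amplitudes = [0.02, 0.05, 0.9, 0.1, 0.04, 0.85, 0.03];

const THRESHOLD = 0.5; // treat anything above this as a beat

// Foreknowledge in action: scan the whole list up front to find the
// frames where beats will land, before the music even starts playing.
function beatFrames(amps: number[], threshold: number): number[] {
  const frames: number[] = [];
  amps.forEach((amp, frame) => {
    if (amp >= threshold) frames.push(frame);
  });
  return frames;
}

console.log(beatFrames(amplitudes, THRESHOLD)); // frames 2 and 5
```

Checking the live amplitude instead would only tell you about a beat at the instant it happens, which is too late to animate anything leading up to it.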
I came up with a pleasingly hacky solution that, so far, is working. Basically, while listening to the Zorba music I record a “clap track” where I clap into the microphone at the moments when I want a button to be pressed in the game. This generates a mostly silent audio track with occasional spikes of maximum amplitude that is synchronised to the original track (after some fiddling – Audacity doesn’t seem to manage to keep them synched after the initial recording).
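If you squint, turning the recorded clap track into usable data is just spike detection. A TypeScript sketch of the idea (the sample rate, frame rate, threshold and the ringing gap are all assumptions for illustration, not what my actual code does):

```typescript
// Map spikes in raw clap-track samples to game frames.
// 44.1 kHz audio and a 30 fps game are assumed values.
const SAMPLE_RATE = 44100;
const FRAME_RATE = 30;
const MIN_GAP_FRAMES = 5; // a clap rings for a while, so ignore echoes

function clapsToFrames(samples: number[], threshold: number): number[] {
  const frames: number[] = [];
  let nextAllowedFrame = 0;
  for (let i = 0; i < samples.length; i++) {
    if (Math.abs(samples[i]) >= threshold) {
      const frame = Math.floor((i / SAMPLE_RATE) * FRAME_RATE);
      // Skip further spikes for a few frames after each detected clap.
      if (frame >= nextAllowedFrame) {
        frames.push(frame);
        nextAllowedFrame = frame + MIN_GAP_FRAMES;
      }
    }
  }
  return frames;
}
```

A clap at the one-second mark of the recording comes out as frame 30, which is exactly the kind of list the FlashAmp approach gives you, minus the Windows.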
Now, if I play the clap track and the original track at the same time (starting at the same frame of the game) I can theoretically check the amplitude of the clap track in order to know when a button should be pressed, and that will correspond to some appropriate beat of the music. Perfect. I currently have this working for a short clap track but need to test it out with a track lasting for the whole song to check whether it falls out of sync at some point (probably will, and then I’ll be totally screwed). Further, I need to start the clap track earlier than the real music by some known offset so that it’s predicting the coming beat (in 10 frames or whatever) rather than occurring at the same time – that allows me to animate the symbols rather than pop them up at the instant they need pressing.
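The per-frame logic of that offset trick, again sketched in TypeScript rather than AS3 (in the real Flash version the amplitude would come from the clap track’s SoundChannel, e.g. its leftPeak property; LEAD_FRAMES, the threshold and the Cue shape are all guesses for illustration):

```typescript
// The clap track runs ahead of the music, so a spike now means
// "press the button LEAD_FRAMES from now". 10 is an assumed value.
const LEAD_FRAMES = 10;
const CLAP_THRESHOLD = 0.5;

interface Cue { pressFrame: number } // an icon animating toward its press moment

const pendingCues: Cue[] = [];
let currentFrame = 0;

// Called once per game frame. clapAmplitude would be read from the
// playing clap track's channel; here it's just a parameter.
function onFrame(clapAmplitude: number): Cue[] {
  if (clapAmplitude >= CLAP_THRESHOLD) {
    // Spike heard: the real beat lands LEAD_FRAMES later, so spawn a
    // cue now and start animating it toward its press frame.
    pendingCues.push({ pressFrame: currentFrame + LEAD_FRAMES });
  }
  // Cues whose moment has arrived: the player should press right now.
  const due = pendingCues.filter(c => c.pressFrame === currentFrame);
  currentFrame++;
  return due;
}
```

So a clap on frame 0 produces a cue that falls due on frame 10, and the ten frames in between are where the icon animation lives.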
In short, I’m rather pleased with this little foray into solving a technical problem – it’s way outside anything I might consider my area of expertise, but so far I’m successfully muddling through. I feel so… technical!