A follow-up to my previous post, in which I covered the basics of getting an image loaded into Phaser. This time I add touch and sound.
As I continued my adventure in migrating my Phaser game Word Fall from desktop to iOS, I spun my wheels trying to achieve something that should have been easy. It was a classic case of a problem that is impossibly hard right up until the moment it turns out to be easy.
I'll start with the step which I completed exactly as expected. If you read my First Steps article, you know I finished with a Phaser stage displaying a moving sprite like this:
Enabling touch to make this an interactive demonstration is done in a straightforward Phaser way. Just enable touch on the sprite and assign a handler to the sprite's "onInputDown" event to change the sign of the delta position.
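Roughly, that looks like this (a sketch assuming Phaser 2 / Phaser CE; the "mushroom" key and the deltaX variable are my own names, standing in for the moving-sprite setup from the First Steps article):

```javascript
// A sketch of the interactive demo, assuming Phaser 2 / Phaser CE. The
// 'mushroom' key and deltaX variable are my own names, standing in for the
// moving-sprite setup from the First Steps article.
var deltaX = 2;
var sprite;

var game = new Phaser.Game(window.innerWidth, window.innerHeight, Phaser.AUTO, '', {
    create: function () {
        sprite = game.add.sprite(0, game.world.centerY, 'mushroom');
        sprite.inputEnabled = true;              // turn on touch/click for this sprite
        sprite.events.onInputDown.add(function () {
            deltaX = -deltaX;                    // change the sign of the delta position
        });
    },
    update: function () {
        sprite.x += deltaX;                      // move the sprite each frame
    }
});
```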
Beautiful! This is how game dev should be...
In my previous article I described how so many of the challenges I ran into were related to timing - to how important it is to ensure a file is fully loaded before trying to use or access it. With that in mind, I hacked together a process to ensure I waited to instantiate the game object until both the image and the audio were loaded. The audio tracks used in this article were provided by Andrew Martin (@7HzResearch) of 7Hz Research for use in Word Fall.
This doesn't work, both because the "onload" event doesn't apply to the <audio> (or <video>) tag and because you have to manually load the audio file using the "load" function (at least this is true for mobile Safari).
Without the "onload" event, how do you know when an audio file has completely loaded? It seems like this should be incredibly easy, and I don't understand why there isn't an event for such a significant milestone, but I couldn't find one. There are a bunch of events for the <audio> tag as described here. The "loadeddata", "canplay" and "canplaythrough" events seem very useful, particularly for video, but none of them are described as doing what I need (i.e. signaling the completion of the load action).
The "canplaythrough" event is the closest to what I'm looking for. It is triggered when the system estimates enough of the file has loaded that it will likely be able to play through to the end without having to stop for further buffering. In the testing I did, it was only triggered after the entire audio file was loaded. I suspect this might not be true for a video file or over a slow network connection. Fortunately, for this case, this is an audio file and it resides locally.
The final step here is to start playback; the "play" function should do the trick...
So there's no audio to go with that picture. That's because mobile Safari requires a user action to start playback. Fortunately, we already have an object which responds to touch. Moving the "audio.play()" call into the sprite's "onInputDown" handler works: clicking on the mushroom as it moves across the screen reverses its direction just as it did before, and now it also starts the music. That feels like real progress. The problem is that this is done entirely outside of Phaser.
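In other words, reusing the sprite and audio element from the sketches above:

```javascript
// The first touch both reverses the sprite and satisfies Safari's
// user-action requirement for starting playback.
sprite.events.onInputDown.add(function () {
    deltaX = -deltaX;     // reverse direction, as before
    audio.play();         // allowed here because it runs inside a touch event
});
```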
In order to use this method for audio in my game, I would need to completely re-write my game's music manager. One of the goals of this project is to see if I can create some boilerplate code that will allow me to drop a functioning Phaser game into Xcode without having to re-write a bunch of code. I've already added the code that should load the music file into Phaser's cache and then add it to the game so it can be played using a more Phaser-like approach. This doesn't work.
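For reference, the Phaser-like version is just a few lines (the key and file name are my own); this is the code that fails inside the WKWebView:

```javascript
// The more Phaser-like approach. This is what fails in this environment,
// as explained below.
function preload() {
    game.load.audio('music', 'music.mp3');    // should place the file in Phaser's cache
}

function create() {
    var music = game.add.audio('music');      // build a Phaser.Sound from the cache
    music.play();                             // silence - the cache entry never appears
}
```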
For a long time I didn't understand why it doesn't work. I can add an <audio> tag, I can trigger its play function and get sound out of my speakers, so why can't I get a Phaser.Sound object to work? The parallels to moving an <img> tag over to a Phaser.Sprite were unmistakable, and the fact that this didn't work was baffling.
This is starting to feel like one of those bad daytime soap operas that keeps replaying the same scene over and over again. This time, the fix wasn't as simple as finding a function on an existing object which grants access to local files. To figure this one out, I first identified that my audio file was never making it into Phaser's cache, so the problem was with the "game.load.audio" function. To figure out why, I eventually dug into the internals of Phaser itself (thankfully it is open source, so this is an option). By searching through Phaser's source code, I found that there are two paths by which audio files can be loaded into the cache: one for when the audio is played using HTML5 Audio Elements (i.e. <audio>) and one for when it is played using Web Audio.
All modern iOS devices support Web Audio, and given the additional power and versatility of this API, it is the default audio option for Phaser. As a result, the "usingWebAudio" code path was being executed. This code path uses the XMLHttpRequest API (hence the "xhr" variable name). The problem is that, at least as far as I can tell, the XMLHttpRequest API cannot be given permission to access local files (the thread on this was opened in March 2016, and as of November 2017 there was clearly still no solution), which explains the failure to load my audio files into the cache.
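As a rough illustration (the path here is a placeholder of my own), this is the kind of request the Web Audio loader path makes, and inside a WKWebView loaded from the app bundle it errors out instead of delivering the file:

```javascript
// Inside a WKWebView served from the app bundle, this request resolves to a
// file:// URL and fails rather than returning the audio data.
var xhr = new XMLHttpRequest();
xhr.open('GET', 'music.mp3', true);
xhr.responseType = 'arraybuffer';     // what the Web Audio path needs for decoding
xhr.onerror = function () {
    console.log('XHR cannot read local files, so the cache stays empty');
};
xhr.send();
```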
Since I already have <audio> tags working in my Phaser project, and it appears the XMLHttpRequest API is the culprit in Web Audio failing, I looked for a way to force the use of <audio> tags instead. For the second time in this post something went just swimmingly - I came across a short thread on the HTML5 Game Devs Forum which suggested setting the window's "webkitAudioContext" property to null. This worked beautifully!
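The whole fix is one line, run before the game is created (nulling the unprefixed name as well is my own extra precaution, not part of the forum suggestion):

```javascript
// Null out the Web Audio constructor before creating the game so Phaser's
// device check finds no Web Audio support and falls back to HTML5 Audio.
window.webkitAudioContext = null;
window.AudioContext = null;    // extra precaution; the forum fix only nulls the prefixed name

var game = new Phaser.Game(/* ... */);
```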
Withholding all audio until the user interacts with the app works, but it isn't exactly a "native" experience and would set the game apart from its competitors in the wrong way. Waiting for user interaction is perfect for the sound effects, since they only occur in response to user action, but it isn't what I want for the background music. In all likelihood, I'll move the background music to the native audio manager available via Swift and iOS so it can start on launch, and leave the SFX in Phaser to minimize the amount of code I'm rewriting.
Between this article and the previous one, I wrote about getting a WKWebView up and running in Xcode and connecting it to a local HTML file. Then I discussed modifying that HTML file via JavaScript contained in a separate file in the app's main bundle. Combined, these enabled me to get Phaser running and to add a moving sprite to the game's stage. This article covered adding touch interaction and audio. Technically, that should be everything we need to get a full-fledged game running. In the next article I'll attempt to do exactly that. I hope you'll join me as I discuss the details with a practical example.