Featured Blog | This community-written post highlights the best of what the game industry has to offer.
Writing a streaming MMO patcher is a major undertaking. Determined to make your own? Here are some things to consider.
Originally from: onemanmmo.com
I needed a way to deliver my game to players, but I had a fundamental problem: the game is really, really big. That makes it completely impractical for players to download the whole thing in one go, so it needs a streaming installer.
I did a lot of research on patching and installers over quite a few months. There are so, so many approaches you can take. Windows Installer is a great tool; use it if you can. I tried really hard to find an existing solution, but I'm sad to say, none of them suited my game.
The biggest roadblock to using an existing solution was that I have a lot of files. My test world has around 18,000 files and the full dataset will have many, many more. With that many files, a regular single-file self-installer would be gigantic.
You wouldn't believe how much work adding this one progress bar was!
One of the best articles I found on the subject was a blog post on the EverQuest II streaming client by one of its developers.
What spurred me to finally start development was having to manually copy the game data over the LAN every time I wanted to do a multiplayer test.
I worked out these requirements:
Minimal initial installer size.
Streaming installation during gameplay.
Supports 100,000+ files.
HTTP server for file delivery.
Bandwidth-efficient file transfer.
Optional verification of file content during load (cheat prevention).
Encryption of some files on the server.
Chain of trust security for binaries and UAC elevation.
Support for multiple simultaneous server connections to maximize client performance.
I spent a lot of time investigating binary diffs because they have the potential to reduce the size of patch downloads, particularly for big files. I found the bsdiff algorithm, which seemed to be just the ticket, but there was a fundamental problem: players would be upgrading from any one of many previous builds, which meant the number of diff files to manage on the server could quickly become unmanageable. That complexity might be worth it if you have a lot of big files, since binary diffs work best on large files, but I don't; I have many tiny files. In the end I settled on having each release be a self-contained, complete dataset for the game.
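To see why the diff-file count gets out of hand, here is a back-of-the-envelope sketch (my arithmetic, not the author's): if players may jump from any older build straight to the latest, the server needs one diff per older build, and across the life of the project the total number of diffs ever generated grows quadratically.

```cpp
#include <cstddef>

// With `builds` total releases, the latest release needs one diff from
// each of the builds that came before it.
std::size_t diffsForLatest(std::size_t builds) {
    return builds == 0 ? 0 : builds - 1;
}

// Over all releases: 0 + 1 + 2 + ... + (builds - 1) diff files generated,
// a triangular number, so roughly builds^2 / 2.
std::size_t diffsEverGenerated(std::size_t builds) {
    return builds * (builds - 1) / 2;
}
```

With ten releases that is already 45 diff files to build, host, and test, which is why a self-contained dataset per release is the simpler trade.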
The first thing to tackle was the manifest of all the files in a release. Something I discovered while building my asset system is that with so many files, just the file name, date, size, and flag data alone gets very large. One idea I took from the EverQuest II article was splitting the manifest into smaller pieces. That way, for each new release you only need to download the portions of the manifest that have changed; the large majority of the file data should be constant between releases. I have a main manifest file with the data to verify all the rest of the manifest files, plus all the directory tree information in an ultra-compact format. I also have a file for each directory, and a file per release listing the files that need to be deleted. All of the manifest files are compressed.
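A minimal sketch of the split-manifest idea (the names and layout are illustrative, not the author's actual format, and a toy hash stands in for a real digest): the root manifest stores a hash per directory manifest, and the client re-fetches only the pieces whose hash changed.

```cpp
#include <cstdint>
#include <functional>
#include <map>
#include <string>
#include <vector>

// One entry per file inside a directory manifest.
struct FileEntry {
    std::string name;
    std::uint64_t size;
    std::int64_t mtimeUtc;   // seconds since epoch, UTC
    std::uint32_t flags;     // e.g. compressed / encrypted / verify-on-load
};

struct DirectoryManifest {
    std::vector<FileEntry> files;
};

// Placeholder FNV-1a-style hash; a real patcher would use a
// cryptographic digest such as SHA-256.
std::uint64_t hashManifest(const DirectoryManifest& m) {
    std::uint64_t h = 1469598103934665603ull;
    auto mix = [&h](std::uint64_t v) { h = (h ^ v) * 1099511628211ull; };
    for (const auto& f : m.files) {
        mix(std::hash<std::string>{}(f.name));
        mix(f.size);
        mix(static_cast<std::uint64_t>(f.mtimeUtc));
        mix(f.flags);
    }
    return h;
}

// Compare the old and new root manifests (directory -> hash) and report
// which per-directory manifest pieces must be downloaded again.
std::vector<std::string> changedPieces(
        const std::map<std::string, std::uint64_t>& oldRoot,
        const std::map<std::string, std::uint64_t>& newRoot) {
    std::vector<std::string> out;
    for (const auto& [dir, hash] : newRoot) {
        auto it = oldRoot.find(dir);
        if (it == oldRoot.end() || it->second != hash)
            out.push_back(dir);
    }
    return out;
}
```

Directories whose contents didn't change between releases hash identically, so most of the manifest never has to be re-downloaded.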
Within the manifest I support any number of filesets. Each fileset represents a directory tree below a specific directory on the target system. I use one for my application data, and another for the application binaries.
Quite a bit of development time went into the HTTP downloading system. I built a system which can retrieve files from any one of several HTTP servers in parallel. I put a priority queue in front of it, so the asset system has the ability to download something right now if it needs to.
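One way to sketch that priority queue (names and structure are my assumptions, not the author's code): requests carry a priority, idle connections pull the most urgent job, and a sequence number keeps equal-priority requests in first-come-first-served order.

```cpp
#include <cstdint>
#include <queue>
#include <string>
#include <vector>

// A queued download request. Larger priority = more urgent;
// `seq` breaks ties in favor of the request queued first.
struct Request {
    int priority;
    std::uint64_t seq;
    std::string url;
};

struct ByUrgency {
    bool operator()(const Request& a, const Request& b) const {
        if (a.priority != b.priority) return a.priority < b.priority;
        return a.seq > b.seq;  // earlier request wins on equal priority
    }
};

class DownloadScheduler {
public:
    void enqueue(std::string url, int priority) {
        queue_.push(Request{priority, nextSeq_++, std::move(url)});
    }

    // Called by an idle HTTP connection to grab the next job;
    // returns false when there is nothing to do.
    bool next(std::string& url) {
        if (queue_.empty()) return false;
        url = queue_.top().url;
        queue_.pop();
        return true;
    }

private:
    std::uint64_t nextSeq_ = 0;
    std::priority_queue<Request, std::vector<Request>, ByUrgency> queue_;
};
```

The asset system's "I need this right now" path is then just an enqueue with a high priority; background patching uses a low one.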
I built the individual HTTP file downloader to support HTTP 1.1 pipelining, chunked transfer, and gzip/deflate compression (great for saving bandwidth on text-based assets). Cheers to the Zlib developers for making decoding the compressed streams so easy.
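Of those, chunked transfer is the part you typically have to hand-roll. A minimal decoder sketch (chunk extensions and trailers are ignored for brevity): each chunk is a hex length, CRLF, that many bytes of data, CRLF, and a zero-length chunk ends the body.

```cpp
#include <cstddef>
#include <string>

// Decode an HTTP/1.1 chunked transfer body into `out`.
// Returns false on malformed input.
bool decodeChunked(const std::string& body, std::string& out) {
    std::size_t pos = 0;
    out.clear();
    for (;;) {
        std::size_t eol = body.find("\r\n", pos);
        if (eol == std::string::npos) return false;
        std::size_t len = 0;
        for (std::size_t i = pos; i < eol; ++i) {  // parse hex chunk size
            char c = body[i];
            int d;
            if (c >= '0' && c <= '9') d = c - '0';
            else if (c >= 'a' && c <= 'f') d = c - 'a' + 10;
            else if (c >= 'A' && c <= 'F') d = c - 'A' + 10;
            else return false;  // this sketch rejects chunk extensions
            len = len * 16 + static_cast<std::size_t>(d);
        }
        pos = eol + 2;
        if (len == 0) return true;  // final zero-length chunk
        if (pos + len + 2 > body.size()) return false;
        out.append(body, pos, len);
        pos += len;
        if (body.compare(pos, 2, "\r\n") != 0) return false;
        pos += 2;
    }
}
```

For the gzip/deflate side, zlib's inflate handles both formats, which is presumably what made it "so easy".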
People always complain that remaining-time estimates on computers are terrible, and now I understand why. I started out with a system that figured out the average download rate so far, then calculated the remaining time from the number of bytes left. That worked terribly: the estimated time fluctuated wildly as the download progressed. I then refined the estimate with an Exponential Moving Average, which is the recommended way to improve download estimates. It biases the average toward the most recent download speed. That totally doesn't work either. Looking through the logs I quickly discovered why: the download rate fluctuates wildly, from a few hundred KB per second up to 8 MB/s. Estimates gonna suck, get used to it.
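For reference, the exponential moving average version looks something like this (a sketch; the smoothing factor `alpha` is my assumption, not the author's value):

```cpp
// Remaining-time estimator using an exponential moving average of
// throughput samples. Higher alpha weights recent speeds more heavily.
class EtaEstimator {
public:
    explicit EtaEstimator(double alpha = 0.2) : alpha_(alpha) {}

    void addSample(double bytesPerSecond) {
        if (!hasRate_) { rate_ = bytesPerSecond; hasRate_ = true; }
        else rate_ = alpha_ * bytesPerSecond + (1.0 - alpha_) * rate_;
    }

    // Returns -1 when no estimate is possible yet.
    double secondsRemaining(double bytesLeft) const {
        return (hasRate_ && rate_ > 0.0) ? bytesLeft / rate_ : -1.0;
    }

private:
    double alpha_;
    double rate_ = 0.0;
    bool hasRate_ = false;
};
```

As the post says, no amount of smoothing saves you when the underlying rate swings by an order of magnitude between samples.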
Security is a big consideration with a patching system. (Don't believe me? Read this for a good scare.) I'm using Crypto++ for RSA file signing and for file verification hashes. The manifest is signed using RSA so the client knows the patch comes from Secret Lair Games. The manifest contains hashes of all the other files that hold manifest data, as well as hashes for executables and for files where the asset system wants to verify content on load (for anti-cheat).
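The verify-on-load step can be sketched like this (a toy hash stands in for the Crypto++ digest, and the RSA check on the root manifest is elided): the signed manifest carries an expected digest per protected file, and the loader refuses anything that doesn't match.

```cpp
#include <cstdint>
#include <map>
#include <string>

// Placeholder FNV-1a hash; the real system would use a cryptographic
// digest whose expected value is covered by the RSA-signed manifest.
std::uint64_t digest(const std::string& bytes) {
    std::uint64_t h = 1469598103934665603ull;
    for (unsigned char c : bytes) h = (h ^ c) * 1099511628211ull;
    return h;
}

// Reject files that are missing from the manifest or whose bytes do not
// match the expected digest (tampered or corrupted on disk).
bool verifyOnLoad(const std::map<std::string, std::uint64_t>& manifest,
                  const std::string& path, const std::string& bytes) {
    auto it = manifest.find(path);
    return it != manifest.end() && it->second == digest(bytes);
}
```

Because the manifest itself is signed, trusting the signature transitively means trusting every hash it contains.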
For application binaries I use the "Chain of Trust" approach to make sure that only safe binaries make it into Windows' Program Files folder. An article from Microsoft has additional details.
The final stage of the patching process is the ElevatedFileCopier. On Windows, you need administrator access to write files to some areas. Unfortunately, those credentials apply to a whole process, so you have to start another one. The easiest way to start a new application with UAC prompts is ShellExecute. UAC prompting is controlled by the UAC Execution Level setting in the application's manifest; ShellExecute checks the manifest and takes care of prompting the user for their administrator credentials. Changing this setting is pretty simple, as long as you completely ignore everything about it on the web (which is all outdated and wrong). If you have Visual Studio 2010 you don't need to make any .manifest files: just open your Project Properties, go to Linker > Manifest File > UAC Execution Level, and set it to requireAdministrator. Tada, you're done.
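For reference, what that linker setting ultimately embeds is a standard Windows application-manifest fragment along these lines (shown here as an assumption about the generated output, using the documented `trustInfo` schema):

```xml
<trustInfo xmlns="urn:schemas-microsoft-com:asm.v3">
  <security>
    <requestedPrivileges>
      <requestedExecutionLevel level="requireAdministrator" uiAccess="false"/>
    </requestedPrivileges>
  </security>
</trustInfo>
```

When ShellExecute launches a binary carrying this manifest, Windows shows the UAC consent or credential prompt before the process starts.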
The only gotcha with this is: What if you want to replace the ElevatedFileCopier application itself? Windows normally blocks any attempt to modify a running binary. The trick is that on Windows an application can rename its binary while running. So just have ElevatedFileCopier.exe rename itself to DeleteMeNow.exe and write the new version to ElevatedFileCopier.exe. Next time ElevatedFileCopier runs it will be the new version and it can look for DeleteMeNow.exe and delete it if it finds it.
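The rename trick boils down to a few file operations (a sketch with illustrative paths, using std::filesystem rather than Win32 calls; the key fact is that Windows allows renaming a running .exe but not overwriting it):

```cpp
#include <filesystem>
#include <fstream>
#include <string>

namespace fs = std::filesystem;

// Rename the (possibly running) updater aside, then write the new
// version under the original name.
void replaceSelf(const fs::path& exe, const std::string& newBytes) {
    fs::path aside = exe.parent_path() / "DeleteMeNow.exe";
    fs::rename(exe, aside);                  // allowed even while running
    std::ofstream out(exe, std::ios::binary);
    out << newBytes;                         // drop in the new version
}

// On its next run, the new updater tidies up the leftover binary.
void cleanupLeftover(const fs::path& exeDir) {
    fs::path aside = exeDir / "DeleteMeNow.exe";
    if (fs::exists(aside)) fs::remove(aside);
}
```

The deletion has to wait for the next run because the old binary may still be mapped while the renamed process is alive.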
One thing I didn't expect to be as difficult as it was: file times. I figured I'd put the file time in the manifest in UTC, check it against the time I got from the web server (which is always in UTC), and check the local file time for mismatches. First problem: to set the file time I tried _utime from the C library, which is supposed to set a file time from UTC. It turns out that Windows' _utime mishandles Daylight Saving Time and sets the file time one hour off. So I replaced _utime with SystemTimeToFileTime and SetFileTime. Once that was done, I discovered that _stat has the same problem.
While I was dealing with file times I also needed gmtime to convert C time to UTC. To convert back you use timegm, which Windows unfortunately doesn't implement at all, so I got to implement it myself. The next problem was that file systems record times at different resolutions, from millisecond accuracy all the way up to accuracy within one day. I settled on matching file times within a ten-second range.
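A portable timegm can be written without touching the local time zone at all. This sketch (my implementation, not the author's) uses the well-known days-from-civil date algorithm:

```cpp
#include <cstdint>
#include <ctime>

// Convert a broken-down UTC time to seconds since the Unix epoch,
// independent of the local time zone (a timegm replacement for Windows).
std::int64_t timegmPortable(const std::tm& t) {
    std::int64_t y = t.tm_year + 1900;
    std::int64_t m = t.tm_mon + 1;          // 1..12
    std::int64_t d = t.tm_mday;
    // Shift the year so the leap day lands at the end: March = month 0.
    y -= m <= 2;
    const std::int64_t era = (y >= 0 ? y : y - 399) / 400;
    const std::int64_t yoe = y - era * 400;                        // [0, 399]
    const std::int64_t doy = (153 * (m + (m > 2 ? -3 : 9)) + 2) / 5 + d - 1;
    const std::int64_t doe = yoe * 365 + yoe / 4 - yoe / 100 + doy;
    const std::int64_t days = era * 146097 + doe - 719468;  // since 1970-01-01
    return days * 86400 + t.tm_hour * 3600 + t.tm_min * 60 + t.tm_sec;
}
```

Because no call to localtime or mktime is involved, Daylight Saving Time can't skew the result the way _utime and _stat do.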
The last piece of the patcher was individual asset streaming. This feature came together much easier than I could have hoped. I was able to modify an existing asset container within my asset management system to simply call the patcher to download asset files as they are needed by the game and provide a callback when they are. Not all the systems in the game support assets which aren't immediately available, but the terrain loader already queued blocks for loading, so getting it to wait a little longer was easy.
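The request-with-callback flow can be sketched as follows (names are illustrative; the real patcher would also enqueue a high-priority download inside `request`):

```cpp
#include <functional>
#include <map>
#include <string>
#include <vector>

// The asset container asks for a file and registers a callback; when the
// downloader reports the file has landed on disk, every waiting callback
// for that path is fired exactly once.
class StreamingAssets {
public:
    using Callback = std::function<void(const std::string& path)>;

    void request(const std::string& path, Callback onReady) {
        pending_[path].push_back(std::move(onReady));
        // Real patcher: enqueue `path` for download at high priority here.
    }

    // Called by the downloader when a file has finished downloading.
    void notifyDownloaded(const std::string& path) {
        auto it = pending_.find(path);
        if (it == pending_.end()) return;
        for (auto& cb : it->second) cb(path);
        pending_.erase(it);
    }

private:
    std::map<std::string, std::vector<Callback>> pending_;
};
```

A loader that already queues work, like the terrain system described above, only has to hold its block in the queue until the callback fires.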
One additional piece that would be nice to have is a downloadable one-file installer. For now I'm going to start with a simple Windows installer. To have a one-file installer I would need to build the patching code into a tiny Windows app that can also select the destination folder under Program Files, set up the file permissions for the game's ApplicationData folder, and register the program in Programs and Features in Control Panel (and do the reverse for removal).
Don't write your own patcher. I can't stress this enough. It's a big, complicated problem with lots of moving parts. So much so that I'm thinking about licensing the patcher; indies shouldn't have to do this. I haven't decided for sure: the potential market is microscopic, it wouldn't be free, and I really need to finish my game first. But if you're interested, add a comment below and let me know.