
The Joy and Tragedy of Flash Programming

I recount a few of my experiences programming Flash games (which I loved!) and touch on the political nature of the 'Flash is dead' meme (which fills me with sorrow!)

Brendan Vance, Blogger

September 18, 2015

5 Min Read

The phrase 'Flash is dead!' isn't so much a declaration of fact as it is an expression of one's political alignment. To discover the nature of this alignment we need only ask ourselves: if indeed Flash were dead—if, somehow, a medium for creative expression were capable of experiencing death—who would we say had killed it? Was it Steve Jobs in the hardware biz with the iPhone? Or was it the front-end web person in tech services, scheming to fix the broken relationship between HTML and CSS? Was it Google Chrome, whose commitment to 'openness' has predictably come to preclude any software its parent company can't manipulate? Or was it JavaScript, that mangy nightmare of a programming language whose hunger will consume the world?

Should we rejoice in our ever-impending freedom from all of Flash's dreadful security problems (to be replaced, one assumes, by every other platform's dreadful security problems)? Or in the tremendous memory management and performance benefits made possible by... uh... cross-compiling to JavaScript? Shall we shower ourselves in the splendour of CSS3 on Chrome, then shower ourselves in the splendour of CSS3 on Firefox, then shower ourselves in the splendour of CSS3 on IE 10.0 and up, then media query into a set of stylesheets that works on more than fifty percent of recent Android phones?

As I said, it's very much a question of politics.

What gets lost in the ideological shuffle, though, is how wonderful a programming language Flash's ActionScript 3 is. It's both powerful and flexible, which is nice; yet beyond that, AS3 is fun. Where Java is verbose, consistent and largely insufferable, AS3 gives you getters and setters to break up the monotony. Where C# gives you a giant kitchen sink in which to deploy any programming pattern ever conceived by humankind, AS3 lends itself to a more particular, off-beat style of code.
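Here's the sort of thing I mean (a minimal sketch; the Player class and its clamping rule are invented for illustration):

```actionscript
package {
    public class Player {
        private var _health:int = 100;

        // Reads look like plain property access from the outside...
        public function get health():int {
            return _health;
        }

        // ...while writes can validate or clamp without changing any call sites.
        public function set health(value:int):void {
            _health = value < 0 ? 0 : value;
        }
    }
}
```

Callers just write player.health -= 10; and the setter quietly does the right thing.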

Most centrally, AS3 provides all "The Good Parts" of JavaScript while at every turn being worlds better than JavaScript. You of course get all the whacky closure-based nonsense you'd find in JS and any other function-y language; yet AS3 provides syntax for strongly-typed variables, permitting software devs to write honest-to-goodness APIs with coherent, half-readable method signatures! You get IDEs capable of auto-completion, tool-assisted refactoring and reference lookups! You get the gift of sanity! Do you know what sort of creature writes software in Notepad++ with naught but some crummy syntax highlighting plugin? A WILD ANIMAL. Possibly a rabid one.
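For instance (a sketch with invented names, not anyone's real API), you can spell out an honest signature and still lean on closures whenever you feel like it:

```actionscript
import flash.geom.Point;

// A strongly-typed signature: the method documents itself.
function nearestPoint(origin:Point, candidates:Vector.<Point>):Point {
    var best:Point = null;
    var bestDistance:Number = Number.MAX_VALUE;
    for each (var candidate:Point in candidates) {
        var distance:Number = Point.distance(origin, candidate);
        if (distance < bestDistance) {
            bestDistance = distance;
            best = candidate;
        }
    }
    return best;
}

// ...and the same JS-style closures are right there when you want them.
function makeCounter():Function {
    var count:int = 0;
    return function():int { return ++count; };
}
```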

Then when it comes time to program some ambiguously-typed, data-structure-melting atrocity, it's as simple as casting all your variables to * and hacking the night away. I must concede that kitchen-sink languages like C# present all sorts of interesting rope with which to hang yourself in these situations—generics, operator overloads, reflection, occult contracts and so on—yet I'm not certain a jungle of generics has ever proven much more comprehensible than a few AS3 dictionaries packed full of mystery meat. In neither case is anyone going to understand anything you just wrote two months from now.
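Something like this (a contrived sketch, not anyone's real code):

```actionscript
import flash.utils.Dictionary;

// The '*' type is the escape hatch: anything goes in, type checks wait until runtime.
var mysteryMeat:Dictionary = new Dictionary();
mysteryMeat["score"] = 9001;          // String key, int value
mysteryMeat[Math.PI] = [1, 2, 3];     // Number key, Array value

var anything:* = mysteryMeat["score"];
var score:int = int(anything);        // cast back out when you need a real type again
```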

Speaking of dictionaries: AS3's dictionaries are obtuse and absurd and absolutely marvellous. For some reason you use 'for (x in dict)' loops to iterate through keys but 'for each (x in dict)' loops to iterate through values. Anything can be a key, and anything can be a value; strict typing is not permitted, so the language almost begs you to do something cool/weird. Astonishingly, dictionaries can use weak-referenced keys! In fact, any AS3 event dispatcher can use weak-referenced callbacks! Not even C#, that notorious syntactic sugar thief, is able to do that!
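To illustrate (a small sketch of the features described above; the key object and event are invented):

```actionscript
import flash.utils.Dictionary;
import flash.events.Event;
import flash.events.EventDispatcher;

var dict:Dictionary = new Dictionary(true); // true = weakly-referenced keys

var key:Object = { label: "anything can be a key" };
dict[key] = 42;

// 'for..in' walks the keys...
for (var k:Object in dict) {
    trace("key:", k);
}

// ...while 'for each..in' walks the values.
for each (var v:* in dict) {
    trace("value:", v);
}

// The final 'true' asks the dispatcher to hold the listener weakly,
// so it won't keep the listening object alive all by itself.
var dispatcher:EventDispatcher = new EventDispatcher();
dispatcher.addEventListener(Event.COMPLETE, onComplete, false, 0, true);

function onComplete(e:Event):void {
    trace("done");
}
```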

There is a sublime quality to Flash's outrageous RAM and CPU overhead that doesn't quite come across until you've used it to program games. I once built a Flash game that streams hand-painted full-resolution environment art from the internet and loads all of it into memory at once.
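Roughly how that kind of streaming looks with Flash's Loader (a sketch with an invented URL and helper, not the game's actual code):

```actionscript
import flash.display.Bitmap;
import flash.display.Loader;
import flash.events.Event;
import flash.net.URLRequest;

// Every decoded background stays resident for the rest of the session.
var loadedArt:Vector.<Bitmap> = new Vector.<Bitmap>();

function streamEnvironment(url:String):void {
    var loader:Loader = new Loader();
    loader.contentLoaderInfo.addEventListener(Event.COMPLETE, function(e:Event):void {
        loadedArt.push(loader.content as Bitmap); // full resolution, fully decoded, in RAM
    });
    loader.load(new URLRequest(url));
}

streamEnvironment("http://example.com/art/forest_01.png"); // hypothetical URL
```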

I once programmed a dynamic 2d lighting engine that renders circular gradients into a bitmap every frame, blitting it back across the screen as an overlay.
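The core of that technique looks something like this (a sketch assuming it runs inside the document class, with invented dimensions and a made-up light format; this isn't the engine's actual code):

```actionscript
import flash.display.Bitmap;
import flash.display.BitmapData;
import flash.display.BlendMode;
import flash.display.GradientType;
import flash.display.Shape;
import flash.geom.Matrix;

// One full-screen buffer, redrawn every frame and composited over the scene.
var lightBuffer:BitmapData = new BitmapData(800, 600, false, 0xFF000000);
var overlay:Bitmap = new Bitmap(lightBuffer);
overlay.blendMode = BlendMode.MULTIPLY; // darkness wherever no light reaches
addChild(overlay);

var brush:Shape = new Shape();

function drawLight(x:Number, y:Number, radius:Number):void {
    var box:Matrix = new Matrix();
    box.createGradientBox(radius * 2, radius * 2, 0, x - radius, y - radius);
    brush.graphics.clear();
    brush.graphics.beginGradientFill(GradientType.RADIAL,
        [0xFFFFFF, 0x000000], [1, 0], [0, 255], box);
    brush.graphics.drawCircle(x, y, radius);
    brush.graphics.endFill();
}

function renderLights(lights:Array):void {
    lightBuffer.fillRect(lightBuffer.rect, 0xFF000000); // reset to darkness
    for each (var light:Object in lights) {
        drawLight(light.x, light.y, light.radius);
        lightBuffer.draw(brush, null, null, BlendMode.ADD); // accumulate the glow
    }
}
```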

I wrote all this using Flash's excellent 2d graphics API, which treats me like a human being by asking for shapes, fills and gradients rather than vertices, triangle maps and my first-born's blood. I never had to explain the foreign notion of a 'line segment' to some illiterate thousand-threaded polygon pump, futzing with triangle fans and clipper libraries and "programmable" shaders. The libraries I used were entirely CPU-based—which is to say slow as hell—and I still got away with murder.
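That API asks for exactly the things you'd want to describe (a trivial sketch, assuming it runs inside the document class):

```actionscript
import flash.display.Shape;

// Strokes, fills and curves, described as shapes rather than triangles.
var badge:Shape = new Shape();
badge.graphics.lineStyle(2, 0x222222);
badge.graphics.beginFill(0xFFCC00);
badge.graphics.moveTo(0, -40);
badge.graphics.lineTo(35, 20);
badge.graphics.curveTo(0, 40, -35, 20);
badge.graphics.lineTo(0, -40);
badge.graphics.endFill();
addChild(badge);
```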

To use my favourite Bogost-ism, I find a certain squalid grace in hogging 2 gigabytes of RAM and 3 gigahertz of computing power just to draw a 2d videogame. It speaks to a promise fulfilled: a world where we can make sprawling, beautiful, outrageous software without fear of over- or under-utilizing the hardware against which we labour. A world free of scarcity, in which computers serve programmers rather than the other way around. A world without limits.

Yet in the shadow of Flash's ever-impending death—an era of innovation for its own sake, leaving everything behind amidst our fumble for the bleeding edge—that world cannot exist. The hardware will always be too slow; the software will always be half-broken and impliable. This is not cause for celebration; if anything, it's cause for sorrow. There is value in the old and the venerable. There is culture. There is expressiveness. There is joy. We'd be wise to treat it with respect.
