Has the age of so-called "2D" programming drawn to a close? Certainly for some genres the twilight has come -- a designer suggesting a side-scrolling adventure game would likely have just recovered from a long coma. But for other categories, a 3D environment is not an easy fit. Titles such as Sid Meier's Alpha Centauri (and his Civilization series), the Age of Empires games, Talonsoft's Operational Art of War series, and the entire Red Alert series all use a 2D interface and map to good effect. That is not to say everything must look flat, for instance Age of Empires 2 and Red Alert 2 make use of detailed graphics for units, buildings, and landscape that have the appearance of depth. However, this does not change the fact that they are still 2D game engines. More than that, it can be strongly argued that a move to a "full 3D" environment would reduce the quality of gameplay on these products. For instance, Sierra's Ground Control could be considered the 3D evolution of the RTS concept, yet it was Red Alert 2 that beat it out to win IGN's recent Readers Choice Award for Best Strategy Game of 2000.
Having determined that the opportunities for a 2D engine are still alive and well, PC programmers are currently facing three problems: first, Microsoft's recently released DirectX update has focused all its new API development into the area of 3D support; next, game players are clamoring for higher graphics quality and special effects; and finally, there is a substantial learning curve for a 2D programmer to move to the world of 3D graphics APIs such as Direct3D and OpenGL. Yet the newest release of Microsoft's DirectX, version 8, offers some features to help make the transition a bit easier.
A Brief Overview of Traditional 2D Methods
The king of 2D graphics concepts is the sprite. At its very simplest, a 2D engine is capable of drawing a background image and then drawing individual graphic objects -- sprites -- onto that image. The use of color-keying allows a specific color in a sprite to be considered transparent, which lets sprites take any irregular shape. And the use of a series of sprites to represent an object, for instance a number of different facings of a unit, allows the simulation of animation and action when they are drawn in sequence. The fundamental programming process for 2D graphics is the blit -- bit block transfer -- which is used to move the bit image of a sprite onto the display area. Finally, to provide a smooth transition of the display from frame to frame, graphics operations (primarily blits) are usually done to a back buffer, which is basically a working copy of the screen area. Once the frame is complete, the back buffer is flipped to the display area (front buffer), and the next frame begins from scratch. Almost everything else in 2D graphics is built on these base concepts.
In DirectX, Microsoft provided a capable (but sometimes limited) interface for 2D graphics, which they called DirectDraw. This API provides all the basics a 2D engine requires. A DIRECTDRAWSURFACE is used to represent both raw-data graphic surfaces (filled with background images or sprites) and the primary display surface (front and back buffers). Most of these surfaces can be in either video memory or system memory (with the exception of the active front buffer of the display, which is of course always in video memory). The DirectDraw Blt() command is used to move images or portions of images between surfaces. While DirectDraw has no specific support for sprites as a distinct object type, it is a trivial matter to create a library that splits one large surface into an array of rectangular cells for use as sprite objects. Take a look at some of the data files on disk used by many commercial games and you'll find these graphic "template" files of object sprites.
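As an illustration of that template-file idea (this helper is my own, not from the article's code), the cells of a regular grid can be addressed with nothing more than a little rectangle arithmetic:

// Illustrative helper: compute the source RECT of cell (col, row) in a sprite
// template laid out as a regular grid of fixed-size frames.
#include <windows.h>

RECT GetSpriteCell(int col, int row, int cellWidth, int cellHeight)
{
    RECT rc;
    rc.left   = col * cellWidth;
    rc.top    = row * cellHeight;
    rc.right  = rc.left + cellWidth;
    rc.bottom = rc.top  + cellHeight;
    return rc;
}

// Example: frame 7 of an animation stored eight cells per row, 64x64 each:
// RECT src = GetSpriteCell(7 % 8, 7 / 8, 64, 64);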
The DirectDraw Blt() function handles color keying and also provides support for a few more advanced options such as mirroring, stretching, and shrinking of bitmaps. These features, however, are often driver-dependent and are supported to different extents on different video hardware, providing inconsistent results if used. A simplified code snippet of drawing a frame in DirectDraw is provided in Listing 1.
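Listing 1 is not reproduced here, but the frame it describes follows a pattern along these lines (a rough sketch only; the surface pointers are assumed to have been created during initialization):

// Rough sketch of a classic DirectDraw (DirectX 7) frame: draw the background,
// blit one color-keyed sprite, then flip the back buffer to the screen.
#include <ddraw.h>

void DrawFrame(LPDIRECTDRAWSURFACE7 lpFrontBuffer,
               LPDIRECTDRAWSURFACE7 lpBackBuffer,
               LPDIRECTDRAWSURFACE7 lpBackground,
               LPDIRECTDRAWSURFACE7 lpSprites)
{
    // Copy the full background image to the back buffer.
    lpBackBuffer->Blt(NULL, lpBackground, NULL, DDBLT_WAIT, NULL);

    // Blit one 64x64 sprite cell at (100, 100), honoring the source color key.
    RECT src = { 0, 0, 64, 64 };
    RECT dst = { 100, 100, 164, 164 };
    lpBackBuffer->Blt(&dst, lpSprites, &src, DDBLT_WAIT | DDBLT_KEYSRC, NULL);

    // Present the finished frame.
    lpFrontBuffer->Flip(NULL, DDFLIP_WAIT);
}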
Changes in DirectX 8
With the release of DirectX 8, Microsoft made some significant changes in the design and implementation of the graphics engine. They replaced the separate DirectDraw and Direct3D components of DirectX 7 with a new component officially called "DirectX Graphics." It does not take long, however, to notice that all the focus in this new component is on 3D functionality, and DirectX Graphics is often interchangeably called "Direct3D 8." The interfaces of DirectX Graphics are not derived from the interfaces of previous DirectX versions, but are an entirely new design. As in all releases of DirectX, older interfaces remain available -- meaning that you can install DirectX 8 but still program against the older version 7 (or earlier) functions. And this is, in fact, the oft-suggested solution when DirectDraw programmers take a look at the new API in version 8. To quote from Microsoft's DirectX FAQ, posted February 1, 2001:
"What happened to DirectDraw? Much of the functionality of DirectDraw has now been subsumed into the new Direct3D 8 interfaces. Developers working on purely 2D applications may wish to continue using the old DirectX 7 interfaces. Developers working on 3D applications with some 2D elements are encouraged to use Direct3D alternatives (point sprites and billboard textures, for example) as this will result in improved performance and flexibility."
While this is the official line, there are a few problems with this answer that cause confusion among 2D programmers looking to port their code to the latest API. First, it is misleading to say that "much of the functionality of DirectDraw" has been included in the new Direct3D 8 (possibly explaining their use of the uncommon word "subsumed"). Most significantly, there is no longer a blit command or equivalent function. Early in the beta process of DirectX 8, developers looked at new functions such as CopyRects() to provide some blit features, but these lack support for important capabilities such as color keying. Point sprites, a new feature in DirectX 8, are technically quite limited and do not provide the required functionality. That leaves "billboard textures," a concept with little information in the DirectX SDK and no provided examples for a 2D programmer to follow.
There is, however, the carrot at the end of the stick: "improved performance and flexibility." Clearly DirectX 8 is designed to get the most out of modern video cards, utilizing all the hardware acceleration available. Scaling, rotation, and most importantly alpha blending are all standard features with 3D graphics, and all hardware-supported. Alpha blending was defined in the DirectDraw Blt() function, but listed as "Feature Unsupported in DirectX 7," leaving programmers wishing to use this feature quite unsatisfied. So, how can a 2D programmer take advantage of these DirectX 8 features?
Implementing the Fundamentals in Direct3D 8
From this point forward I will refer to the new DirectX Graphics component as Direct3D 8, for the sake of clarity. Some of the changes made with Direct3D 8 result in a much simpler and cleaner setup and initialization. To some extent the philosophy is a "black box" approach where many details are hidden from the developer, and this does mean that less code is required to get the API up and running. See Listing 2 for an entire Windows application that initializes Direct3D 8 and gets us ready to roll. For the sake of this article, only a bare minimum is presented here, and only for full-screen mode, but you can see that the startup sequence is quite straightforward.
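Listing 2 itself is not reproduced here; the core of such a startup sequence looks roughly like the following sketch, with the 800x600, 32-bit, full-screen settings chosen arbitrarily for illustration:

// Bare-bones Direct3D 8 startup for an 800x600 full-screen device.
// Error handling is minimal; hWnd is assumed to be a valid window handle.
#include <d3d8.h>

IDirect3D8*       g_pD3D    = NULL;
IDirect3DDevice8* g_pDevice = NULL;

bool InitD3D(HWND hWnd)
{
    g_pD3D = Direct3DCreate8(D3D_SDK_VERSION);
    if (!g_pD3D) return false;

    D3DPRESENT_PARAMETERS pp;
    ZeroMemory(&pp, sizeof(pp));
    pp.Windowed         = FALSE;                 // full-screen only, as in the text
    pp.BackBufferWidth  = 800;
    pp.BackBufferHeight = 600;
    pp.BackBufferFormat = D3DFMT_X8R8G8B8;       // 32-bit; use D3DFMT_R5G6B5 for 16-bit
    pp.BackBufferCount  = 1;
    pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
    pp.hDeviceWindow    = hWnd;

    return SUCCEEDED(g_pD3D->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL,
                                          hWnd, D3DCREATE_SOFTWARE_VERTEXPROCESSING,
                                          &pp, &g_pDevice));
}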
Now, how do you get those graphics to the screen? In 3D terminology, what needs to be done is to "render a textured quad with an orthogonal view." This is actually quite simple in plain English -- a quad is a square or rectangle, and a texture is just a bitmap image. Drawing textures on polygons (triangles, squares, whatever) is a basic function of 3D hardware. The orthogonal view means that you'll be looking straight down on it, no perspective effects. This concept is also sometimes called "billboard textures," referring to the fact that the object you are looking at is actually flat and two-dimensional, like a billboard sign.
If you are particularly ambitious you can implement a 2D engine in Direct3D 8 by writing a library of routines to create, calculate, and display textured quads. Be prepared, however, for some late-night sessions of learning new terminology, new concepts, and few good examples. Luckily, the Microsoft DirectX team has given us some hidden gems inside their D3DX Utility Library. This is an additional library of code separate from the DirectX API itself, but providing some very useful helper and support functions. Some of the functions will load your graphics to a surface from a good selection of file formats (including .JPEG, .PNG and .TGA as well as the standard .BMP bitmaps), and some will do various conversions and calculations automatically. For the 2D programmer, the most significant D3DX utility functions are part of the ID3DXSprite interface, which "provides a set of methods that simplify the process of drawing sprites using Microsoft Direct 3D."
If you research this interface in the DirectX SDK docs, the first thing you'll notice is that it is rather sparsely documented. The second is a complete lack of examples -- in fact, none of the three dozen or so example programs provided with the SDK uses this interface. Then, if you dig deeper and try things out, you'll find that the documentation that is there is actually incorrect in a few places. But none of this should discourage you, because of one fact: it does work, and in fact it works quite well.
The heart of the ID3DXSprite interface is the Draw() function, which provides all the functionality to do a good ol' blit. Inside this routine there is a lot going on: it sets a series of "render states" to get the hardware ready to draw the way you want, creates the vertices that define the location of the on-screen quad, sets the texture coordinates, sets the camera view, and then sends all the information to the 3D hardware, which draws the texture (your image) in the right place on the screen. But all this happens under the hood, and you can get it to work even if you can't tell a vertex from a vortex. We will shortly write a wrapper function around ID3DXSprite::Draw() that mimics all the functionality of the old DirectDraw Blt().
Before we go any further, it is time to clear up some potential confusion. Some of you may be asking if the ID3DXSprite interface has anything to do with point sprites, which are often mentioned in DirectX docs that discuss the new features of version 8. The answer is nothing at all, nada, zip. They're entirely different things, so forget you ever heard about point sprites. You may also be wondering about the ID3DXSprite interface itself -- a pointer to this interface is created by calling D3DXCreateSprite(). The docs say that D3DXCreateSprite() "creates a sprite object," and that it returns an address "representing the created sprite." This is confusing and inaccurate terminology, since the function doesn't actually do anything with "sprites" at all, and (with the exception of some internal data structures) it doesn't even "create" anything. Instead, it returns a pointer to the ID3DXSprite interface that you use for accessing the sprite functions. All you need to do is call it once when you create (or reset) the device, and release it when you close your application.
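In code, that lifecycle amounts to very little (a sketch, assuming the device pointer from the earlier initialization is available):

// The ID3DXSprite interface is created once against the device and released
// at shutdown (or before a device reset).
#include <d3dx8.h>

ID3DXSprite* g_pSprite = NULL;

bool CreateSpriteInterface(IDirect3DDevice8* pDevice)
{
    return SUCCEEDED(D3DXCreateSprite(pDevice, &g_pSprite));
}

void ReleaseSpriteInterface()
{
    if (g_pSprite) { g_pSprite->Release(); g_pSprite = NULL; }
}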
Another area of potential confusion, especially for programmers from the world of DirectDraw, is the difference between surfaces and textures. A texture is really just an object that contains one or more surfaces. It may contain multiple surfaces in the case of MIP-mapping, which is storing smaller versions of a bitmap for use when an object is in the distance. MIP-mapping is used in 3D work to minimize the size of bitmaps that are moved around and used when objects are far away and detail is not needed. We don't care about this for 2D work, so we will always use textures with a single top-level surface, called level 0. The GetSurfaceLevel() method retrieves a pointer to the surface of a texture, so you can work with the surface data if need be.
There is, however, one more big difference between surfaces and textures, and you actually have to pay some attention to this one: size restrictions. The width and height of a texture is restricted in most hardware to being a power of two for each dimension, and some hardware imposes maximums and other rules on these sizes. For the width and height, power-of-two sizing means that sizes like 128, 256, 512 and 1024 are acceptable numbers. So a texture of 256 pixels wide by 256 pixels high is acceptable, and most cards will allow different width/height values such as 256x128 or 64x16. This power-of-two limitation shouldn't be a major concern, because your bitmap doesn't need to fill the available space of the texture. If you have a 96x80 pixel graphic, put it in a 128x128 texture and just specify the location of your graphic when you use the texture. If you have two 60x60 images, template them together into a single graphic and load it into a 128x64 texture. There will be some wasted space, but in the future you'll start thinking about creating graphics in sizes that will work well with texture limits.
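A tiny helper (my own, not from the article's listings) makes the rounding rule concrete:

// Round a sprite dimension up to the next power of two so it fits a legal
// texture size.
unsigned int NextPowerOfTwo(unsigned int n)
{
    unsigned int p = 1;
    while (p < n)
        p <<= 1;
    return p;
}

// Example: a 96x80 graphic goes into a NextPowerOfTwo(96) x NextPowerOfTwo(80)
// = 128x128 texture, matching the example in the text above.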
The maximum-size restriction is a bigger issue. Modern hardware allows textures of 1024x1024 or larger, even as big as 4096x4096 (which takes 64MB of video memory in 32-bit color), but some older hardware -- particularly 3dfx Voodoo products -- has a maximum texture size of 256x256. If you currently load larger images, either as a background image or as a larger templated file of small sprites, then you'll have to chop these into smaller images to be compatible. Most game designs today no longer load a large background image, since maps are usually made up of a collection of individual tiles and smaller images. There are CAPS details you can get from Direct3D 8 that will report the maximum texture sizes and other restrictions (see D3DPTEXTURECAPS_POW2, D3DPTEXTURECAPS_SQUAREONLY, MaxTextureWidth / MaxTextureHeight, and MaxTextureAspectRatio in the SDK docs).
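A sketch of how those caps might be queried (the variable names are mine):

// Query the texture restrictions mentioned above from the device caps.
#include <d3d8.h>

void CheckTextureLimits(IDirect3DDevice8* pDevice)
{
    D3DCAPS8 caps;
    pDevice->GetDeviceCaps(&caps);

    DWORD maxWidth   = caps.MaxTextureWidth;     // e.g. 256 on older Voodoo cards
    DWORD maxHeight  = caps.MaxTextureHeight;
    DWORD maxAspect  = caps.MaxTextureAspectRatio;
    BOOL  powerOfTwo = (caps.TextureCaps & D3DPTEXTURECAPS_POW2)       != 0;
    BOOL  squareOnly = (caps.TextureCaps & D3DPTEXTURECAPS_SQUAREONLY) != 0;

    // A 2D engine can use these values to decide how to slice its source art.
}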
The D3DX utility library also lets you get a bit lazy and will automatically calculate acceptable sizes for you, so if you call D3DXCreateTexture() and ask for a 100x60 texture, the function will check the hardware capabilities and create a legal size behind your back. Same with the functions that load an image into a texture from a file. And should you have a large source image that you can't or don't want to break up into smaller parts, you can load the large file into a surface with D3DXLoadSurfaceFromFile(), and then use CopyRects() to move partial correctly-sized sections of the image to the surface of smaller textures, which you then draw with.
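Here is a rough sketch of that load-and-chop approach; the file name, and the assumption that the destination texture is 256x256 and shares the A8R8G8B8 format, are illustrative only:

// Sketch: load a large image into a system-memory surface, then copy one
// 256x256 region of it into the top level of a smaller texture.
#include <d3dx8.h>

bool LoadTileFromBigImage(IDirect3DDevice8* pDevice,
                          IDirect3DTexture8* pSmallTexture, // assumed 256x256, A8R8G8B8
                          const char* filename,
                          const RECT& srcRect)              // 256x256 region of the image
{
    // Find out how big the source image is, then create a matching surface.
    D3DXIMAGE_INFO info;
    if (FAILED(D3DXGetImageInfoFromFile(filename, &info)))
        return false;

    IDirect3DSurface8* pBigSurface = NULL;
    if (FAILED(pDevice->CreateImageSurface(info.Width, info.Height,
                                           D3DFMT_A8R8G8B8, &pBigSurface)))
        return false;

    // Load the file into the surface (D3DX converts the pixel format).
    if (FAILED(D3DXLoadSurfaceFromFile(pBigSurface, NULL, NULL, filename,
                                       NULL, D3DX_FILTER_NONE, 0, NULL)))
    {
        pBigSurface->Release();
        return false;
    }

    // Copy the requested region into the texture's level-0 surface.
    IDirect3DSurface8* pTexSurface = NULL;
    pSmallTexture->GetSurfaceLevel(0, &pTexSurface);
    POINT dstPoint = { 0, 0 };
    HRESULT hr = pDevice->CopyRects(pBigSurface, &srcRect, 1, pTexSurface, &dstPoint);

    pTexSurface->Release();
    pBigSurface->Release();
    return SUCCEEDED(hr);
}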
A final warning before we get to the code: be careful about the amount of video memory you are using. If you have a 1024x1024 image template of sprites that you are loading, 32 bits per pixel, and an 800x600 screen resolution, then you will need almost 6MB of video memory (room for the front buffer, back buffer, and the texture). However, if you use a number of smaller 256x256 textures, DirectX will automatically remove (flush) textures from video memory if the space is required to load new ones. Note that if this happens multiple times each frame, you will take a performance hit.
Implementing a 2D Wrapper for Direct3D 8
Listing 3 provides a set of wrapper functions to simplify implementing a 2D engine in Direct3D 8. I already incorporated the ID3DXSprite initialization and release code into the program skeleton I presented in Listing 2. The D3DX utility library provides a lot of functions to help get things rolling, such as the D3DXCreateTextureFromFileEx(), which will easily load a bitmap image into a texture for you. As this function supports a lot of options you may not need, I've simplified it into a LoadTexture() function and commented some of the options you may consider using. The main blit routine is BltSprite(). This version of BltSprite() is designed to mimic the functionality of the old DirectDraw Blt() routine, renamed so that you don't entirely forget that you're not using DirectDraw anymore. This routine covers most of the features used in the DirectX 7 function with the exception of filling a surface using the DDBLT_COLORFILL flag, which is discussed below. Now, believe it or not, you have a fully working 2D engine implemented in Direct3D 8. Boy, that was easy, wasn't it?
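Listing 3 is not reproduced here, but a rough reconstruction of the two wrappers it describes might look like the following (the exact parameter lists are my own guesses, and the Begin()/End() and BeginScene()/EndScene() bookkeeping is assumed to happen once per frame around the blits):

// Rough sketch of the wrappers described above: LoadTexture() simplifies
// D3DXCreateTextureFromFileEx(), and BltSprite() mimics a color-keyed
// DirectDraw-style blit via ID3DXSprite::Draw().
#include <d3dx8.h>

extern IDirect3DDevice8* g_pDevice;   // created during initialization
extern ID3DXSprite*      g_pSprite;   // created with D3DXCreateSprite()

IDirect3DTexture8* LoadTexture(const char* filename, D3DCOLOR colorKey)
{
    IDirect3DTexture8* pTexture = NULL;
    if (FAILED(D3DXCreateTextureFromFileEx(
            g_pDevice, filename,
            D3DX_DEFAULT, D3DX_DEFAULT,   // take width/height from the file
            1,                            // a single mip level -- no MIP-mapping
            0,                            // usage
            D3DFMT_A8R8G8B8,              // 32-bit ARGB, as discussed below
            D3DPOOL_MANAGED,              // let DirectX manage video memory
            D3DX_FILTER_NONE,             // image filter
            D3DX_FILTER_NONE,             // mip filter
            colorKey,                     // e.g. 0xFF000000 for opaque black
            NULL, NULL, &pTexture)))
        return NULL;
    return pTexture;
}

// Draw srcRect of pTexture with its top-left corner at (x, y).
void BltSprite(IDirect3DTexture8* pTexture, const RECT* srcRect, float x, float y)
{
    D3DXVECTOR2 pos(x, y);
    g_pSprite->Draw(pTexture, srcRect,
                    NULL,          // no scaling
                    NULL,          // no rotation center
                    0.0f,          // no rotation
                    &pos,          // screen position
                    0xFFFFFFFF);   // no color/alpha modulation
}

In use, a frame would then typically consist of Clear(), BeginScene(), g_pSprite->Begin(), a series of BltSprite() calls, g_pSprite->End(), EndScene(), and Present().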
Now that you've taken a moment to celebrate, it's time to point out a few important things. One is that color keying is no longer part of the implementation; alpha operations are used instead. Color keys are not actually understood by 3D hardware; instead, everything is expressed in varying amounts of transparency. In a D3DCOLOR ARGB value, the fundamental color format for Direct3D 8, the alpha value is in the most significant byte and is represented by a value of 255 (0xFF) for opaque and 0 for fully transparent. Since by far the most common use of a color key is as a source color key (the color key value is associated with the source image), this is implemented in D3DXCreateTextureFromFileEx() by asking for a ColorKey value, which is then searched for in the image and replaced with a D3DCOLOR of 0x00000000 (transparent black). So if your source image uses black as the color key value, specify a ColorKey value of 0xFF000000 (opaque black) and, upon loading the file, D3DX will make all the black pixels transparent. Many developers use magenta for the color key; for that, specify 0xFFFF00FF for the value, and DirectX will replace the magenta with transparent black. Specifying a key value of 0 will disable any transparency replacement.
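Using the hypothetical LoadTexture() wrapper sketched above, the magenta case is a one-liner:

// Magenta (0xFFFF00FF) in the source image becomes fully transparent on load.
IDirect3DTexture8* pUnits = LoadTexture("units.bmp", 0xFFFF00FF);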
I've also taken a shortcut and specified 32-bit ARGB as the internal format for loaded textures. This is not necessarily your screen format, as the video hardware will automatically convert between texture and screen formats when rendering. In addition, if my chosen format is not supported, D3DXCreate* should adjust the format of the texture to suit your hardware -- you can do this yourself with the D3DXCheckTextureRequirements() function, which will return acceptable sizes and formats based on the closest match to what you asked for. In an actual program you would likely choose between 16-bit and 32-bit formats based on total video memory or system speed.
Special Effects: Alpha Blending, Scaling, and Rotation
So far I've touched on how to replace DirectDraw with Direct3D 8 functions, but now that we've done so we have exposed some new functionality that is very easy to exploit. One feature is alpha blending, which provides the ability to make graphics partially or fully transparent. Depending upon the graphics format of your textures, you may have up to 256 levels of alpha, as described earlier, ranging anywhere from 0 for fully transparent to 255 for opaque. A value of 128 would provide 50 percent transparency, and experimenting within the 0-255 range provides the opportunity for some very interesting effects. It is now possible to draw graphics and sprites with a see-through effect, suitable for anything from smoke and fire to heads-up-display (HUD)-type graphics to… you get the idea.
There are two ways to take advantage of alpha values. First, you can create your source graphics with an alpha component, allowing you to make a single bitmap with varying degrees of transparency throughout the image. Many graphics editors support alpha values, and you'll need to save in a file format that stores this data (.PNG or .TGA, for example). When you load your image into a texture which has an alpha value in its format, the alpha information is maintained. Then, when you use ID3DXSprite::Draw(), the alpha value is used to decide how to blend your image with the background graphics. The second way to take advantage of alpha values is to use the Color parameter of the ID3DXSprite::Draw() function, which "modulates" the color and alpha channels by the value provided. Ordinarily, you would specify 0xFFFFFFFF for this value to get a standard blit. By changing this to 0x80FFFFFF, you will draw your image at 50 percent transparency. It is also possible to change the other bytes to affect the color of your drawn image, making it very easy and fast to draw a single object in a variety of color shades.
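A small fragment shows the second technique (assuming the sprite interface and texture were created as before, and that this is called between Begin() and End() inside a scene):

// Draw the same sprite three times: opaque, at 50 percent transparency, and
// with a color tint, by changing only the Color argument of Draw().
#include <d3dx8.h>

void DrawWithModulation(ID3DXSprite* pSprite, IDirect3DTexture8* pTexture,
                        const RECT* srcRect)
{
    D3DXVECTOR2 pos(200.0f, 150.0f);
    pSprite->Draw(pTexture, srcRect, NULL, NULL, 0.0f, &pos, 0xFFFFFFFF); // normal blit
    pos.x += 80.0f;
    pSprite->Draw(pTexture, srcRect, NULL, NULL, 0.0f, &pos, 0x80FFFFFF); // 50% transparent
    pos.x += 80.0f;
    pSprite->Draw(pTexture, srcRect, NULL, NULL, 0.0f, &pos, 0xFFFF8080); // reddish tint
}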
The next effect is scaling. This is actually available in the DirectX 7 Blt() command by way of the automatic stretching and shrinking operations, but Direct3D 8 does it much better. In DirectX 7 the results were often unexpected on different video hardware and drivers -- some operations were not supported between video memory surfaces, and sometimes the result of a stretch or shrink would be graphically poor. With 3D hardware these effects are accelerated and usually filtered to improve the display quality -- for instance, a stretch of a texture would have algorithms applied that would reduce pixellation and smooth out the resulting image.
Next up in the effects list is rotation. This is another effect that was available in DirectX 7, but support was very haphazard. For hardware that didn't support rotation directly, DirectX only emulated 90-degree angle rotations, and so the effect was not really usable in production code. In Direct3D 8, rotation is a standard feature, calculated into the transformation matrix (which we'll discuss in a moment) and so is universally supported. In Listing 4 I've provided an upgraded version of our BltSprite function, BltSpriteEx(), which supports rotation and color modulation values.
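Listing 4 is not reproduced here; in the same spirit, an extended blit might look something like this sketch, where the parameter set and the choice to rotate about the (scaled) sprite center are my own:

// Extended blit in the spirit of BltSpriteEx(): scaling, rotation about the
// sprite's center, and color/alpha modulation in one call.
#include <d3dx8.h>

void BltSpriteEx(ID3DXSprite* pSprite, IDirect3DTexture8* pTexture,
                 const RECT* srcRect, float x, float y,
                 float scale, float angleRadians, D3DCOLOR modulate)
{
    // Rotation center, taken here to be the middle of the scaled sprite.
    float w = float(srcRect->right  - srcRect->left) * scale;
    float h = float(srcRect->bottom - srcRect->top)  * scale;

    D3DXVECTOR2 scaling(scale, scale);
    D3DXVECTOR2 center(w * 0.5f, h * 0.5f);
    D3DXVECTOR2 pos(x, y);

    pSprite->Draw(pTexture, srcRect, &scaling, &center, angleRadians, &pos, modulate);
}

// Example: double size, rotated 45 degrees, at half opacity.
// BltSpriteEx(g_pSprite, pTexture, &src, 320.0f, 240.0f, 2.0f,
//             D3DX_PI / 4.0f, 0x80FFFFFF);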
Finally, for those wishing to go one step further, and maybe even dip your toes into 3D waters, the ID3DXSprite interface offers an alternate version of the Draw() function, called DrawTransform(). Instead of taking scaling, rotation, and translation (positioning) information, the DrawTransform() function takes a transformation matrix, which defines the geometrical transformation for every pixel using a 4x4 matrix. Using this you gain access to effects involving the Z-axis, meaning distances closer to or farther from the viewer. A popular effect is perspective, which would give your image the impression of receding into the distance. The use of transformation matrices is far beyond the scope of this article, but those interested can start with "About 3-D Transformations" in the DirectX 8 SDK and go from there. Be prepared to dig out your old math textbooks.
Replacing GDI Functions with Direct3D 8
For DirectDraw programmers moving to Direct3D 8, I've pointed out that the first question is usually "Where's the blit function?" We've covered that topic, but the second question, almost as common, is "How do I get GDI functions to work?" In DirectX 7, programmers could get the GDI DC (device context) handle and then use Win32 GDI commands directly on the surface. While GDI is generally slow, in many situations (particularly where execution speed is not a big issue) it provides very robust functions that handle a lot of graphics basics, saving programmers much of the effort of creating routines from scratch. If nothing else, GDI routines could be used during prototype development, to be replaced by custom optimized routines later on. With Direct3D 8, that option is gone. It is no longer possible to "legally" get a DC handle to the primary surface, so GDI commands to the screen are not possible. There are two possible solutions. First, you can create an off-screen bitmap with the CreateCompatibleDC()/CreateCompatibleBitmap() combination, issue your GDI commands to this bitmap, and then copy the bitmap data to a texture that you can draw to the screen. Second, you can replace the GDI commands with new custom versions.
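As a sketch of the first approach, the following uses a 32-bit DIB section rather than CreateCompatibleBitmap() -- a substitution of my own, chosen because it exposes the pixel bits directly -- draws with GDI, and copies the result into a locked texture assumed to be the same size and in X8R8G8B8 format:

// Draw with GDI into an off-screen 32-bit DIB, then copy the rows into a
// texture that can be rendered with the sprite routines.
#include <windows.h>
#include <string.h>
#include <d3d8.h>

bool GdiToTexture(IDirect3DTexture8* pTexture, int width, int height)
{
    // 1. Create a top-down 32-bit DIB section and a memory DC to draw into.
    BITMAPINFO bmi;
    ZeroMemory(&bmi, sizeof(bmi));
    bmi.bmiHeader.biSize        = sizeof(BITMAPINFOHEADER);
    bmi.bmiHeader.biWidth       = width;
    bmi.bmiHeader.biHeight      = -height;      // negative = top-down rows
    bmi.bmiHeader.biPlanes      = 1;
    bmi.bmiHeader.biBitCount    = 32;
    bmi.bmiHeader.biCompression = BI_RGB;

    void*   pBits = NULL;
    HDC     hdc   = CreateCompatibleDC(NULL);
    HBITMAP hbm   = CreateDIBSection(hdc, &bmi, DIB_RGB_COLORS, &pBits, NULL, 0);
    HGDIOBJ old   = SelectObject(hdc, hbm);

    // 2. Ordinary GDI calls go here.
    RECT rc = { 0, 0, width, height };
    FillRect(hdc, &rc, (HBRUSH)GetStockObject(WHITE_BRUSH));
    DrawText(hdc, "GDI says hello", -1, &rc, DT_CENTER | DT_VCENTER | DT_SINGLELINE);
    GdiFlush();

    // 3. Copy the DIB rows into the texture (assumed X8R8G8B8, same size).
    D3DLOCKED_RECT lr;
    bool ok = SUCCEEDED(pTexture->LockRect(0, &lr, NULL, 0));
    if (ok)
    {
        for (int y = 0; y < height; ++y)
            memcpy((BYTE*)lr.pBits + y * lr.Pitch,
                   (BYTE*)pBits    + y * width * 4,
                   width * 4);
        pTexture->UnlockRect(0);
    }

    SelectObject(hdc, old);
    DeleteObject(hbm);
    DeleteDC(hdc);
    return ok;
}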
In graphics, the two most common uses of GDI appear to be drawing rectangles (filled and unfilled) and drawing text. In Listing 5, I've provided a basic rectangle routine that cheats a bit and uses features of the Direct3D 8 Clear() function to implement filled and unfilled rectangles. The documentation indicates the main use of Clear() would be to, as the name implies, clear a surface, but since you can specify the clear location and color, this also makes a good general-purpose rectangle fill, duplicating the features of the DirectX 7 Blt() with the DDBLT_COLORFILL flag. The Clear() function should use hardware acceleration where available, making it a much better choice than any brute-force pixel routine. For applications that are fully 3D, there are better ways of doing this, but for our purposes this should suffice.
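A sketch of such a Clear()-based fill (the function name is mine):

// Rectangle fill built on IDirect3DDevice8::Clear(); D3DRECT uses x1/y1/x2/y2
// screen coordinates.
#include <d3d8.h>

void FillRectangle(IDirect3DDevice8* pDevice, int x1, int y1, int x2, int y2,
                   D3DCOLOR color)
{
    D3DRECT rc = { x1, y1, x2, y2 };
    pDevice->Clear(1, &rc, D3DCLEAR_TARGET, color, 1.0f, 0);
}

// An unfilled rectangle can be drawn as four thin filled rectangles
// (top, bottom, left, and right edges) using the same call.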
For text, the D3DX library provides us with the ID3DXFont interface, which works in a similar way to the ID3DXSprite interface. You pass the D3DXCreateFont() function a standard GDI font handle (HFONT), and in return you get a pointer to an ID3DXFont interface that has one main function, DrawText(). This function mimics the GDI DrawTextEx() very closely; the parameters are almost identical. (The DirectX 8 SDK actually documents the return value incorrectly -- as with the GDI function, the real return value is the height of the drawn text.) The visual quality and spacing of text output from this function is excellent, and there are lots of formatting options, but the drawback is speed. Under the hood, ID3DXFont::DrawText() actually does what I discussed above: it creates a GDI-compatible bitmap, draws the text to the bitmap, copies the bitmap to textures, and renders the textures to the screen. So you get all the sluggishness of the original GDI functions, combined with a lot of overhead -- in the end, this function is up to six times slower than the original GDI DrawTextEx().
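A minimal sketch of that usage, assuming the device exists and that the text call happens inside a BeginScene()/EndScene() pair:

// Wrap a GDI font in an ID3DXFont, then draw a string into a rectangle.
// Error handling omitted; font size and face name are arbitrary.
#include <windows.h>
#include <d3dx8.h>

ID3DXFont* CreateTextFont(IDirect3DDevice8* pDevice)
{
    HFONT hFont = CreateFont(20, 0, 0, 0, FW_NORMAL, FALSE, FALSE, FALSE,
                             DEFAULT_CHARSET, OUT_DEFAULT_PRECIS,
                             CLIP_DEFAULT_PRECIS, DEFAULT_QUALITY,
                             DEFAULT_PITCH | FF_DONTCARE, "Arial");
    ID3DXFont* pFont = NULL;
    D3DXCreateFont(pDevice, hFont, &pFont);
    return pFont;
}

void DrawScore(ID3DXFont* pFont, int score)
{
    char text[64];
    wsprintf(text, "Score: %d", score);
    RECT rc = { 10, 10, 300, 40 };
    pFont->DrawText(text, -1, &rc, DT_LEFT | DT_TOP, 0xFFFFFF00); // opaque yellow
}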
The DirectX 8 SDK provides a sample that offers a solution to this -- the "3D Text" sample program defines a class called CD3DFont, which implements a much faster, though less robust, text solution. It uses GDI only at the time of initialization to create a master texture of available characters, which is then used to draw each individual character from the texture, not requiring further calls to GDI. You lose some nice features such as font kerning and formatting options, but you can always customize the CD3DFont routines to meet your needs.
2D Or Not 2D?
Despite the difficulties that DirectDraw programmers voice when attempting to move to the new Direct3D 8 API, the transition is actually quite painless. The techniques presented here are oriented towards this goal, and as such are poorly suited for being combined with true 3D programming. In particular, 3D programmers must learn to minimize state changes and operations that may cause "pipeline stalls," which significantly affect the ability of 3D hardware to work independently of your program. Yet, as discussed at the very start of this article, true 3D is not beneficial to every gaming situation, and developers of traditional 2D genres need to think very carefully before adding visual glitz that may, in fact, be detrimental to gameplay.
Possible next steps beyond what is demonstrated in this article involve taking a more serious step into the Direct3D 8 API, using vertex buffers and the DrawPrimitive() call to draw polygons on the screen. While at first this seems very intimidating to 2D programmers, it is actually not overly complex to rewrite the ID3DXSprite routines from scratch and thereby gain full control (and some improved performance).
The last word likely belongs to hardware. Video hardware, and the differences from machine to machine, have long been the curse of game programmers. DirectX, and in particular version 8, has tried to level the playing field and offer more consistent features and performance across the board. The downside is the increased minimum hardware requirement. To take advantage of Direct3D 8, the user must have a 3D-accelerated video card with at least a DirectX 6-class driver available. Using the techniques listed here in a full product would likely result in a minimum 8MB video memory requirement as well. Unlike previous DirectX releases, version 8 does not provide a software rasterizer, so emulating 3D hardware on older video cards is not an option. (DirectX 8 does support plug-in software rasterizers, but at the time of this writing none have been announced or are expected.) Some software allows the user to choose between two versions of the graphics engine, one using 2D features and the other enhanced with many of the 3D-accelerated effects described above. Developers must consider whether such duplicated effort is really worthwhile, or whether the growing number of users with 3D hardware is sufficient to concentrate on for new projects.