
Wings. Part 1 – Leap of Faith

This is the first part (of six) of the story of making the game 'Skydive: Proximity Flight'.

Kirill Yudintsev, Blogger

May 10, 2011


crossposted from here

Ever since the dawn of time, man has gazed dreamily into the sky, hoping that one day he would be able to fly.


Part 1 – Leap of Faith

This story began in 2003.

Gaijin was a freshly created start-up.

In the early 2000s, a certain art project caught my eye.

I’m talking about Zack Booth Simpson’s project (http://video.google.com/videoplay?docid=-7901131669838021630#): a person approaches the screen, and hundreds of butterflies created by the projector settle on his or her shadow.

This idea captured my imagination. Today we’d call it Augmented Reality or Mixed Reality, but back then I was simply imagining all kinds of possibilities: a game is running on a screen, and the person standing next to it controls it with their body. No controllers; you just walk up and start playing. Anyone could play: a kid who isn’t comfortable with a joystick, an adult who doesn’t feel like learning the controls, anyone. It feels like magic; exceeding everyone’s expectations and making magic possible for people is definitely worth working for.

We immediately had plenty of project ideas: drawing magic spells, dancing, playing goalkeeper (I’ll talk about those later). None of it looks all that original or new today, but this was the beginning of 2003, and Kinect didn’t exist. Even the Wii and EyeToy didn’t exist yet.

'Magican', 2004

'Soccer', 2004



In 2003, we started working on our project. We decided from the start that it should work even with the cheapest webcams (which were far worse back then). The challenge was to recognize the human silhouette, at least the arms and head; in short, what Kinect does now, but we’ll get to that later.

Soon we understood that the problem was virtually unsolvable, at least in real time. Recognizing a shadow against a white background is one thing; trying to recognize a person against a varied background, at a resolution of 320×240 and a frame rate of 20 fps, is another thing altogether. Figuring out where the user’s arms and head are can be hard even for the human eye!

We discarded solutions that required a white background. We also discovered that any IR remote control worked really well: to a webcam, an IR LED is a bright white spot under any lighting conditions. There were other ideas involving all kinds of accessories that worked well too, but reluctantly we discarded them all.

We wanted people to be able to just walk up and play.
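(As an aside: tracking a spot like that is almost trivial. The sketch below is purely illustrative, written with modern OpenCV rather than anything we had in 2003; it just grabs the brightest, near-saturated pixel in each frame and treats it as the IR marker.)

```python
# Illustrative sketch only (not the original 2003 code): track an IR LED
# as the brightest near-saturated spot in each webcam frame.
import cv2

cap = cv2.VideoCapture(0)            # cheapest webcam available

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    # An IR LED tends to saturate the sensor, so the global maximum
    # of the grayscale image is a good guess for its position.
    _, max_val, _, max_loc = cv2.minMaxLoc(gray)
    if max_val > 240:                # accept only near-saturated pixels
        cv2.circle(frame, max_loc, 10, (0, 0, 255), 2)
    cv2.imshow("ir-spot", frame)
    if cv2.waitKey(1) == 27:         # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```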

Another idea was much better: detecting image change. Motion detection, followed by recognizing the player by the contours of the changing pixels. A lot like EyeToy, right? Well, just as we got started on the idea, EyeToy was released.
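(A rough sketch of that idea, again in modern OpenCV purely for illustration and assuming the OpenCV 4 API: diff consecutive frames, threshold the changed pixels, and take their contours as the moving silhouette.)

```python
# Illustrative sketch of frame-differencing motion detection (OpenCV 4 API).
import cv2

cap = cv2.VideoCapture(0)
prev = None

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.GaussianBlur(gray, (5, 5), 0)      # tame cheap-webcam noise
    if prev is not None:
        diff = cv2.absdiff(prev, gray)            # what changed since last frame
        _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
        # Contours of the changed pixels roughly outline the moving player.
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                       cv2.CHAIN_APPROX_SIMPLE)
        cv2.drawContours(frame, contours, -1, (0, 255, 0), 2)
    prev = gray
    cv2.imshow("motion", frame)
    if cv2.waitKey(1) == 27:                      # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```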

to be continued... 
