Archive for the ‘Technical’ Category
My First GDC
Friday, March 19th, 2010
I returned home from the Game Developers Conference (GDC) nearly a week ago, but it has taken me this long to recover from the late nights, the jet lag, and the cold I caught, and to put things into perspective. I thought I’d share a summary of my experience there, for those who are thinking of going next year.
Executive Summary: SO AWESOME!!!
GDC is held every year in San Francisco. I’ve been in the games industry for over 6 years now, but this was my first time at the conference, so I wasn’t really sure what to expect. I managed to get an All Access Pass to the conference, so I was there for the summits and tutorials, as well as the main conference.
Let me back up a little and talk about my reasoning for going, as that will help you understand why the conference was so valuable to me. At the beginning of the year I started thinking about what conferences I wanted to attend. 360iDev was a must-attend for me, so I booked that first. However, I was torn between attending WWDC (Apple’s big annual conference) or GDC. I attended WWDC last year and it was great. But this year I felt like what I really needed was general game design inspiration, and less Apple-specific technical inspiration. With that in mind, I chose GDC. My goal for the conference was to focus mainly on game design sessions and take in a few technical and business sessions.
So, I arrived in San Francisco Monday, March 8th, the day before the Summits started. I managed to meet up with a bunch of iPhone devs I know from Toronto, other conferences, or Twitter. We had a few beers and tried to adjust to west coast time. It was a good way to ease myself into the week.
Tuesday and Wednesday were the Summit & Tutorial days at GDC. There were two summits I was interested in: the iPhone Summit, and the Independent Games Summit (IGS). I think I spent about 60% of my time at the IGS and about 40% at the iPhone Summit. I saw some great technical iPhone talks by Noel Llopis from SnappyTouch and Phil Hassey from Galcon. I also saw some great IGS talks that ranged in topic from managing an independent game studio’s creative process, to how to better design indie games. I saw a session by Ron Carmel from 2D Boy, several awesome sessions by the people at thatgamecompany (Flower is one of my favourite games), and a terrific session by Randy Smith from Tiger Style (among so many others!). By the end of the Summits, my head was already spinning with inspiration. The IGS design talks in particular were extremely motivating for me. Getting a chance to meet and hear amazing indie game designers/developers talk about their processes was fantastic. It started me thinking about a lot of things as they relate to my own processes. More on that later…
Thursday through Saturday were the main conference, expo, and Independent Games Festival Awards. I sat in session after amazing session listening to industry leaders in game design, technical development, and business talk about their processes. I saw Peter Molyneux talk, Sid Meier talk, and even Will Wright talk. I saw a moving and inspirational talk by Brenda Brathwaite on her exploration into board games with serious themes. I saw a headache-inducing (in a good way!) talk on PixelJunk Shooter’s real-time fluid dynamics system that made me really miss doing PS3 SPU programming. I saw an in-depth and honest look at the successes and problems encountered in Naughty Dog’s attempts to create an active cinematic experience for Uncharted 2. I was blown away by the quality of the content, and I was left reeling by how the talks started forcing me to think about the direction I want to take with my own games.
But of course, the sessions are only part of GDC. The other part comes from meetings and parties. I was able to set up a few meetings with iPhone press to show them my new game. It was really great to be able to demo the game in person, and I think it was extremely valuable. Then each night there were countless parties happening. Each party was a great chance to meet people in person who I’d only communicated with on Twitter or via email. It was a chance to discuss iPhone development with other people going through the same thing as me. It was a chance to discuss game design in general with other game designers and developers. It was a chance to have fun with people who share in the same daily challenges that I do.
For me, I got out of GDC exactly what I wanted: design inspiration, new friends, new business connections, and a wealth of knowledge. But perhaps most importantly, GDC helped put me back on track with where I want to take my games. When I decided to go indie in 2008, it was because I wanted to make the games that I was compelled to make. What I’ve noticed is that I’ve been making more and more design decisions lately based on what I think will sell well. This isn’t how I want to make games. I want to make the games that I have to make, not the games I think I should make because they might make some money, even when the idea doesn’t excite me. Granted, I would love to be able to make the games that I feel compelled to make and have them also become a financial success. And obviously I can’t ignore the fact that I’m running a business. But GDC helped to remind me of what I want my priorities to be, and that, to me, is the most important part of having gone.
Owen
GDC, 360iDev, and More
Friday, February 19th, 2010
It has been quite a while since my last post. I would have been posting, but I’ve been doing some contract work, so I haven’t had a lot of my own news to talk about. I’ve also been doing my year-end bookkeeping, and as exciting as that is, I’m not sure anyone wants to read about my adventures in recording business receipts from 2009.
However, over the last few days I’ve been returning to my own projects and getting back into the swing of things. As you might recall, I took part (remotely) in September’s 360iDev Game Jam night and created a prototype for a puzzle game idea I had. I’m happy to report that I’ve been developing that idea further and it’s coming along nicely. The game now looks very different from the screenshot I posted on Touch Arcade, by the way. As for a preliminary look at what the game is becoming, I might have something to show early next week, so keep an eye out for that!
Yesterday I decided to rewrite the rendering part of my engine to take advantage of a bunch of optimizations I had been putting off making. It turns out that the changes I made over the last day reduced my render time by about 40%! That means that I can render nearly double the sprites on the screen without dropping my framerate. This is great news, and I’m looking at porting the changes back into Monkeys in Space at some point to help out with that game’s performance.
In other news, I thought I’d mention that I’ve decided to attend two conferences this spring: the Game Developers Conference (GDC) in March and 360iDev in April. I’m looking forward to both conferences, but I’m especially excited about GDC as I’ve been in the games industry for over 6 years now, but I’ve never gone to GDC! I’m really excited to get a chance to finally go and see what all the fuss is about. I’m also looking forward to attending sessions on game design that are more broad than just iPhone games.
That being said, I’m also happy to be attending 360iDev again. It will be great to see, in person, the iPhone developers I speak with every day on Twitter. Last year I had a fantastic time at the conference and I expect no less this time. I’ll also be speaking at 360iDev. For my presentation I will attempt to create an iPhone game prototype in 80 minutes based on audience suggestions, while highlighting some of my best practices for rapid prototyping. If you’re attending the conference, I hope you’ll check it out. If you’re not attending the conference, why not? Check out this amazing schedule of speakers. And if you’re thinking about it, go register! 🙂
Owen
I’m Still Here…Making Progress
Wednesday, October 21st, 2009
It’s been a little quiet here on the blog and video blog lately. I apologize for not being more communicative. I’m hoping to get a new video blog posted in the next couple of days. The gameplay for the game has been nailed down and I’m really close to having the in-game art locked. Once I finish up a HUD for the game I’ll post some screenshots and even some in-game video! Woo!
However, before that can happen I’ve been taking care of some less exciting stuff. For the last couple of days I’ve been working on a save system for the game. It turns out that saving the state of a real-time game with a physics engine is a lot more complicated than saving a turn-based puzzle game. Who knew? I kid. But seriously, what I hadn’t anticipated was having to build a whole new save game framework to handle it nicely.
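To give a rough idea of what’s involved, here’s a minimal sketch of snapshotting the dynamic state of physics bodies. This is purely illustrative C, not my actual framework, and the struct and function names are made up; a real physics engine also has joints, contacts, and sleeping state to worry about.

```c
#include <string.h>

/* Hypothetical snapshot of one physics body's dynamic state.
   A real save system would also capture joints, contacts, etc. */
typedef struct {
    float px, py;      /* position */
    float vx, vy;      /* linear velocity */
    float angle;       /* rotation in radians */
    float angularVel;  /* angular velocity */
} BodyState;

/* Serialize an array of body states into a raw byte buffer.
   Returns the number of bytes written. */
size_t save_bodies(const BodyState *bodies, int count, unsigned char *buf) {
    size_t bytes = sizeof(BodyState) * (size_t)count;
    memcpy(buf, bodies, bytes);
    return bytes;
}

/* Restore body states from a buffer written by save_bodies. */
void load_bodies(const unsigned char *buf, int count, BodyState *bodies) {
    memcpy(bodies, buf, sizeof(BodyState) * (size_t)count);
}
```

Even a round-trip this simple gets tricky once the engine owns internal state you can’t just memcpy, which is why a dedicated framework ends up being necessary.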
At this point I’m getting a little concerned with the fact that there are only 10 days left in October. If I’m going to submit by the end of the month, I’ve got a lot of work to do. If I really buckle down I still might make it, barring any horrible problems [knocks on wood]. Otherwise I might end up pushing a week into November. Here’s hoping the next two weeks go well!
The good news is that my play testers seem to be really enjoying the game! There are a couple that keep battling for the high score in the leaderboards. It’s exciting to see! A week ago I thought I was Mr. BigShot with the high score in the game for the first level. Now I’m 5th. I love it when that happens. It means the people playing the game are discovering strategies that I hadn’t anticipated.
It should be an exciting next couple of weeks!
Owen
Video Blog – Episode 6
Wednesday, September 2nd, 2009
It’s been 2 weeks, but it’s time for another episode of the Streaming Colour Video Blog. This week I talk about the new texture atlas and bitmap font systems and show what the textures look like.
Owen
Texture Atlases and Bitmap Fonts
Thursday, August 27th, 2009
I realise that I haven’t been posting a lot of text posts about the new game’s progress since I started posting video blogs (or “vlogs” as the kids call them these days). However, I spent the last two days working on some code that doesn’t really feel appropriate for the vlog, so I thought I’d write about it here. Hooray!
I decided to take a bit of a break from gameplay coding and instead focus on some performance stuff. I want the game to run at 60 fps, even on a 1st gen iPod touch, if I can. On Monday I took a look at the frame rate and it was around 40 fps on my iPod. I booted up Shark (a performance analysis tool for Mac OS X and iPhone) and took a look at what was slowing things down. The game’s not doing a whole lot right now, so there’s no reason it shouldn’t be running at 60 Hz. Plus, the physics engine uses a fixed time-step, so the frame rate needs to maintain 60 fps as much as possible to keep the physics in check.
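For anyone unfamiliar with the pattern: a fixed time-step usually means accumulating the time each rendered frame took and consuming it in fixed increments. This is a generic sketch of the idea, with illustrative names, not my engine’s actual code.

```c
/* Classic fixed-timestep update loop (sketch).
   Rendering runs as fast as it can, while physics always advances
   in fixed 1/60 s increments, consuming accumulated frame time. */
#define FIXED_DT (1.0 / 60.0)

typedef struct {
    double accumulator; /* unconsumed wall-clock time */
    int steps;          /* total physics steps taken */
} Simulation;

/* Feed one rendered frame's elapsed time; returns how many
   fixed physics steps were run for that frame. */
int sim_advance(Simulation *sim, double frameTime) {
    int ran = 0;
    sim->accumulator += frameTime;
    while (sim->accumulator >= FIXED_DT) {
        sim->accumulator -= FIXED_DT;
        sim->steps++;   /* physics_step(FIXED_DT) would go here */
        ran++;
    }
    return ran;
}
```

You can see why dropping to 40 fps hurts: the loop has to run multiple physics steps per rendered frame to catch up, which makes each frame even more expensive.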
What Shark told me was that my font rendering was taking up about 40% of my frame time! At 40 fps, each frame is taking about 25 ms, which means about 10 ms was devoted to rendering text on the screen! That’s insane! Especially since I was rendering two strings: “Score: 5” and “Debug”.
The code I was using for text rendering is pretty inefficient, and I had always intended to replace it. I just hadn’t realised how inefficient it was. The code was from the old CrashLander example app that Apple pulled off the dev site a long time ago (because of stuff like this). It was using CoreGraphics to dynamically render the text out to a texture, which was then used as an alpha mask to render the text to the screen. This was all extremely slow.
So, it looked like I needed to implement my bitmap font system sooner than I had planned. A bitmap font is a fast way of drawing text, but it has some drawbacks. A bitmap font is created by rendering out the characters of a font at a particular size to a texture; this is all done ahead of time on your computer. (The big downside of a bitmap font is that it doesn’t scale well, since the characters are rendered out at a specific size.) You end up with a texture atlas (more on that in a minute) that contains all the characters you want to be able to render on one big texture (or several smaller textures, if you want).
So what’s a texture atlas, you ask? A texture atlas is when you cram a bunch of smaller textures into one big texture. You end up with one large texture (that has power of 2 dimensions for optimal video memory usage) that has all the smaller textures laid out next to each other. A texture atlas also requires a data file that describes the atlas. The data file will contain information on where each sub-texture lives in the atlas. This data can be used in the game to determine which small portion of the atlas to draw onto a polygon that gets drawn to the screen.
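To make that concrete, here’s a sketch of the lookup: the data file gives each sub-texture’s pixel rectangle inside the atlas, and at draw time you convert it to normalized UV coordinates for the quad. The structs here are illustrative; my actual data layout differs.

```c
/* A sub-texture's pixel rectangle inside the big atlas texture,
   as read from the atlas data file (illustrative names). */
typedef struct {
    int x, y, w, h;       /* pixel rect inside the atlas */
} AtlasEntry;

typedef struct {
    float u0, v0, u1, v1; /* normalized texture coordinates */
} UVRect;

/* Convert an atlas entry's pixel rect into the normalized UVs
   that get handed to OpenGL for the sprite's quad. */
UVRect atlas_uvs(AtlasEntry e, int atlasW, int atlasH) {
    UVRect r;
    r.u0 = (float)e.x / (float)atlasW;
    r.v0 = (float)e.y / (float)atlasH;
    r.u1 = (float)(e.x + e.w) / (float)atlasW;
    r.v1 = (float)(e.y + e.h) / (float)atlasH;
    return r;
}
```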
The reason this is done is that every time OpenGL has to change which texture it’s currently drawing with, it takes time. So if you draw, say, 100 different little sprites every frame, and each one is in its own texture, OpenGL has to switch textures 100 times. This adds up and can actually significantly slow your rendering code. But if you put those 100 sprites into one big texture atlas, then OpenGL doesn’t need to swap textures at all; it just uses different coordinates into the current texture for each sprite.
So for texture atlases to really work, you want sprites that need to be drawn together grouped into the same atlas. You lose all the efficiency gains if you have two atlases and every alternating draw call is in a different atlas. In huge 3D games, this usually means putting all of a character model’s textures in one atlas (so a soldier gets his uniform textures, facial textures, etc all put into one atlas), since the character is rendered all at once. In a small game like mine, I can generally fit all the sprites I need into one texture atlas.
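You can see the effect by counting texture switches. This is a quick illustrative sketch (not code from my engine): a bind happens whenever a draw uses a different texture than the previous draw, so sorting draws by texture, or packing everything into one atlas so every draw shares a texture, collapses the bind count.

```c
#include <stdlib.h>

/* Counts how many glBindTexture-style switches a draw order causes:
   a switch happens whenever the texture differs from the previous draw. */
int count_binds(const int *textureIds, int n) {
    int binds = 0, last = -1;
    for (int i = 0; i < n; i++) {
        if (textureIds[i] != last) {
            binds++;
            last = textureIds[i];
        }
    }
    return binds;
}

static int cmp_int(const void *a, const void *b) {
    return *(const int *)a - *(const int *)b;
}

/* Group draws that share a texture next to each other,
   minimizing the number of switches. */
void sort_by_texture(int *textureIds, int n) {
    qsort(textureIds, (size_t)n, sizeof(int), cmp_int);
}
```

Alternating between two atlases every draw call is the worst case: every single draw is a bind. Grouped, the same draws cost one bind per atlas.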
Finally, the other big benefit of texture atlases is that they can be more video-memory efficient. You can do a much better job of fitting non-power-of-2 textures into one giant power-of-2 atlas than of padding out each smaller texture to a power of 2. This means you’ll have more VRAM available for other things.
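A quick way to see the savings is to compute the padded footprint of a standalone texture (hypothetical helper code, not from my engine). For example, a 100x40 sprite padded on its own occupies a 128x64 texture, so ten of them cost 81,920 pixels of VRAM individually, while all ten pack comfortably into a single 256x256 (65,536 pixel) atlas.

```c
/* Round a dimension up to the next power of two, as the GPU
   requires for standalone texture dimensions on these devices. */
int next_pow2(int v) {
    int p = 1;
    while (p < v) p <<= 1;
    return p;
}

/* Bytes of VRAM consumed by a texture once padded to
   power-of-2 dimensions, at a given bytes-per-pixel. */
int padded_bytes(int w, int h, int bpp) {
    return next_pow2(w) * next_pow2(h) * bpp;
}
```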
Building a texture atlas creation and rendering system was the first thing I did this week. To do the actual packing of the texture, I use a Python tool that my friend Noel pointed me to:
- AtlasGen (atlasgen.svn.sourceforge.net)
Then I parse the atlas data into a plist which I can load at runtime. Writing all this code would allow me to speed up the rendering of my sprites in general, and I could reuse the rendering code for my font rendering system.
So, back to the bitmap font system. I considered building my own bitmap font generation system, but that seemed silly. I poked around on the internet looking for Mac tools available, but couldn’t find any. Then Noel pointed me at some tools for Windows. Earlier this year I bought a copy of VMWare Fusion, so I can run Windows programs on my Mac. Hooray, it’s coming in handy! I did some more poking around and found this tool, which I quite like:
- Font Studio (www.nitrogen.za.org)
One of the nice things about this tool is that it seems to handle kerning (the spacing between adjacent characters) quite nicely, even for italic fonts. The tool allows me to export a font texture and also generates the data file for me. Then it was just a matter of parsing the data file in the game and reusing the texture atlas code I had already written. Finally, rendering a string is just a matter of iterating over it, grabbing each character, looking it up in the font data, and drawing the appropriate part of the texture onto an appropriately sized quad.
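That loop boils down to something like this. The glyph metrics and names are made up for illustration, and I’ve left kerning out for simplicity: each glyph just carries a fixed pen advance, whereas a full version would also apply the per-pair kerning adjustments from the data file.

```c
#include <string.h>

/* Per-character glyph record from the font data file (illustrative).
   Each glyph knows where it lives in the font atlas and how far
   to advance the pen afterwards. */
typedef struct {
    int atlasX, atlasY;   /* glyph's position in the font atlas */
    int width, height;    /* glyph size in pixels */
    int advance;          /* pen advance to the next character */
} Glyph;

/* Walk a string, look up each character's glyph, and accumulate
   pen positions; at each step you'd draw a quad of width x height
   using the glyph's atlas rect, then advance the pen.
   Returns the total rendered width in pixels. */
int draw_string(const char *text, const Glyph glyphs[128]) {
    int penX = 0;
    for (size_t i = 0; i < strlen(text); i++) {
        Glyph g = glyphs[(unsigned char)text[i] & 0x7F];
        /* draw_quad(penX, 0, g.width, g.height, g.atlasX, g.atlasY); */
        penX += g.advance;
    }
    return penX;
}
```

The expensive CoreGraphics rasterization is gone entirely: every character becomes one cheap textured quad from an atlas that was built offline.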
The net result of this? My frame rate is holding at 60fps most of the time now. I still get some spikes, but it’s good enough for now. Shark tells me that my font rendering now takes about 9% of my frame time. On a 16.7 ms frame (60 Hz), that’s about 1.5 ms. And digging further into the profile, it looks like a significant portion of that time is actually spent inside NSString operations. The actual rendering is about half the time. That’s a huge reduction in render time! On an iPhone 3GS, this thing will fly!
So things are looking good. The frame rate is back to a point where it matches the physics system, which means I can do some proper tuning of the physics now. Thanks for sticking with me through such a technical post. Look for another video blog episode in the next day or two.
Owen