My Experience as an IGF Judge

7 01 2010

[This opinion piece is part of a Gamasutra article. The article below is the original, full piece.]

This week the finalists of the Independent Games Festival 2010 were announced. I was one of the 150+ judges asked to score just over a dozen of the 301 indie games entered into the competition. I’ve been asked by quite a number of people – both developers and gamers – how exactly it all happens, so I decided that writing up my experience of the process, and how I felt about it all, might make for interesting reading. Maybe even enlightening? Either way, my thoughts are below!

This was my first time judging the competition, so everything was pretty new to me. The contest is split into two rounds – the first selects all the finalists, while the second reveals the winners in each category. In round one, each judge is given 7 weeks to play through just over a dozen randomly-picked entries. I was assigned quite a variety, with a mixture of puzzlers, platformers, shmups and first-person shooters, as well as some… oddities.

I remember looking down my list for the first time and spotting some names I recognised and others I had not a clue about. I decided at that point that the only way I was going to judge this list of games fairly was if I played them from top to bottom, not picking out the ones I had already come across or was previously excited to play. More on my personal methods of judging later, though.

For each game, there were five scores out of 100 to give, one for each of the following categories:
Excellence In Design – game mechanics, level design, difficulty balancing
Excellence In Audio – music and sound effects
Excellence In Visual Art – appearance, visual effects
Technical Excellence – technical aspects, e.g. game engine use and code base
Overall Rating – based on overall impressions of the game

Each of these individual scores then feeds into the award category of the same name (except ‘Overall Rating’ of course, which becomes the coveted Seumas McNally Grand Prize).

Scoring is as you’d expect: 0 is abysmal, 50 is an average experience and 100 is perfect execution. Of course, perceptions of a score out of 100 vary from person to person – I may consider a game to be average and give it a 60, while someone else might feel exactly the same way yet give the same game a 40 – and so a sort of Idiot’s Guide to Scoring is helpfully supplied, so judges can try to match their impressions to a specific scoring range.

So, back to my scoring. The first obstacle to overcome is having misgivings about a game before even playing it. We’ve all done it at some point – be it a screenshot, a clumsy game description or a trailer of suspect quality, it’s easy to conclude that you’re not going to enjoy a game before even installing it. Obviously that isn’t fair in the slightest, so for me it was very important to set those kinds of thoughts aside entirely.

My range of titles turned out to be quite the mixed bag, with a number of superb gaming experiences slotted in between some others that were not so fantastic. My personal method of scoring each game was to keep a pen and paper at the ready, note good and bad points as I went along, and use them to come to a conclusion at the end.

Along with the scoring, there was also ‘Anonymous Feedback’ to be given – obligatory for the first time since the competition began. This, I felt, was incredibly important. To understand how important the feedback was, I put myself in the shoes of a developer. I’ve just submitted what I believe is my best work ever. More than anything now, I want to know what people think. I don’t just want a string of numbers thrown back at me with no explanation as to what they mean. If I’m scoring low in the Audio section, I want to know why!

With this in mind, I made sure to give each of my entries a decent amount of feedback, be it praise or constructive criticism. I didn’t dance around the subject though – if something was good I said so, and if something was bad I made sure the developer understood that I didn’t enjoy that specific area as much as I would have liked.

An area I felt mildly confused about was the topic of length. I had games on my list which were over in a matter of minutes, and other titles which went on for hours. Now clearly these shorter games weren’t short because the developers were lazy or ran out of ideas – it’s simply how those developers chose to express themselves. But if one developer has put in, say, a month of work and produced something short but sweet, while another has slaved away for a whole year, crafting something wonderful with a good few hours of play to explore, should one take precedence over the other? It’s a tricky one, I believe.

The other feature for judges to make use of was the ‘Judge Notes’. At the bottom of each game page was a comment box allowing judges to discuss that game. This was an area I had a bit of an issue with. See, my belief was that these notes were for discussing technical issues – for example, if a particular judge couldn’t get the game to run, they could post their difficulties in the notes, and other judges could jump in and try to help them out.

However, judges were also encouraged to discuss gameplay, the strengths and weaknesses of the game, and so on. This is the part I had a problem with. If, say, the tenth game I came to judge already had a bunch of comments on it saying ‘erugh this is horrible’, ‘really poor gameplay’ and the like, that would automatically put a bias in my mind before I’d even booted the game up. Of course the answer is simply not to read the comments until I’d played the game, but since they were situated right beneath the game description, it was a little difficult not to! Maybe I was in the minority, but I would much rather the judges’ conversing was strictly for helping each other out, and nothing more.

Other than that, however, I felt that every game had as much chance as any other, which really is a remarkable achievement considering there were 300+ games and 150+ judges to co-ordinate. The judging was a very painless experience, making it easy to slot my assigned games in around the rest of my work.

So that’s where the whole process is up to right now. The next step, which begins soon, will involve each judge scoring around 20 of the finalists (using the same categories as before), with these scores eventually being tallied up and the winners announced in March. Exciting stuff – and good luck to all the finalists!

[Part of this article can be found at Gamasutra, IGF.com and on the Indie Games Blog.]
