Thanks to all the judges for their effort! Remember, the judges did their best given what they were handed. I hope everyone walks away satisfied. There might be a handful of games that are missing a review from a judge or two due to problems running them, but I do not feel that it affected the outcome. If anyone thinks their game was treated unfairly, they can contact me.

I think I was wronged, since ChrisM was not able to play my game. I attached it in the previous thread; could you please review it and add a score?! Having zero in all categories dropped PacPitfall way down the list.

Also, there were other games that ChrisM was not able to play. Since the other judges were able to play them, I think it's not fair to us!

Thanks, but I really think the order is fine - a lot of people had trouble playing 4x4k due to its sensitivity; it's just a bit too hard.

I'm never sure whether including zero scores (due to failure to run) in the averages makes sense. I don't think it's worth revisiting, since the contest has never really been about the results - but maybe next time use that old remove-the-lowest-score tactic?

Congrats again to everyone that was part of the contest this year. Fantastic quality and loads of fun games to show off (let's port them all to Android now and make millions!).

Also a big thanks to the judges that evidently spent ages going through every game and giving thoughtful feedback!

Well, I can't say that Frequent Flier was unfairly judged - the reviews follow the rules. It's just a shame that it came last, because now no one will try it out, and I think the game has enough merit that it's definitely worth playing at least once (if you have a mic).

Well, about the zero-scored games: what other score could I possibly give them? Obviously there were issues with launching these, since all of the other games launched. I hated to score these three this way, but there was no alternative. And I am not the only judge who had issues.

I tried running them in Firefox and Safari. No dice. As for PacPitfall, I was able to play the jar file you posted, and it did work - just after the competition ended. It was a good attempt at the Pitfall gameplay: pits, vines, etc. Just not very engaging. 78% Overall, 83% Presentation, 75% Technical. There you go.

I think Kev's suggestion might be acceptable for next year. Or, judging can be done by splitting the games evenly across multiple judges.

As for people defending their games against the scores they received - that is expected! Remember, we are judging based on our personal observations, and our reviews are just that: personal observations.

I didn't expect such detailed reviews! The wait was really worth it, and thanks to the judges for their time. The results seem fair, and in particular the first place is completely deserved - congratulations to Markus, who left everyone but himself... dead!

However, I must admit I was a bit disappointed by the results for some of my favourite games, such as World Rally Driver 4k, which I found almost perfect both technically and on the gameplay level, and 4x4k, which was not so hard once you remembered to push [space]... Not to mention Asteroid Alert! with its impressive Elite-like radar; Q*Bert4K, Jetp4k and Deathchase for their perfect conversions; Gravitational Fourks for its physics and graphics; and I suppose J-Type suffered from being Squarius's direct sequel...

My final comment is that next year it would be nice to reconsider the hosting solution for the games, as it seems to have caused a few problems in this 2009 contest. See you in 2010!

Of course, for some of the judges some games did not work, so they gave them 0%, and that's fine, but I think the overall score of a game should not be affected by it. It's rather a suggestion for future contests, because I think that once the results are officially published they should not be touched (unless someone cheated or something like that).

I agree that nothing is ever perfect. But as long as we try to achieve perfection, time is on our side.

Hopefully we'll be able to use what we've learned this year (the huge number of games, better-formulated rules for judges) and apply it to next year's judging process. So next year we might have a different kind of judging process - better and faster.

For games receiving a zero score in all categories from one judge, I recommend reading what the other judges had to say. Perhaps the judging process is missing a feature that lets a judge "pass" on a game, so that his zero score won't count.

Yeah, last time the results started getting fiddled with we ended up in a bickering mess - two years ago, was it?

I want to avoid fiddling with the results. It's never possible to please everyone. I'm sure there are games there that ARE better than their score indicates and sit lower in the score order than lesser games, but that's life. At some point we'll have user feedback/scores on all these games so the community can decide on its own; until then we'll have to use this imperfect judging review process, which does at least give all games an equal, fair shake.

Can I make a suggestion that next year the handling of "can't run" games be changed? I'm a bit miffed, because I tested on a whole bunch of machines and never found anyone who had a problem. It would have been nice to have been contacted and informed of the problem rather than getting a (very score-damaging) 0/0/0.

Score aside, I'd still like to fix whatever issue darkfrog had so no one else bumps into it in the future; knowing the platform and VM version would go a long way.

I have finished reading all the reviews, and I agree with userek: a game's score should not have been affected by a judge's inability to launch it. Now, I wouldn't be shocked if the rankings were recalculated without counting the 0%s... However, the reviews are of course more important than the scores, and even if a recalculation were made, I suppose the top 3 in every ranking wouldn't be affected. This problem aside, the organization of the contest was very good and certainly represented a huge amount of work, so please don't take any offense at this very final comment!

Thanks all.

I very much think you should've notified the community about this issue before posting the scores and asked for their opinion. Also, we've seen the 0-score "couldn't run" issue several times before, and solutions to the problem were posted every time (including the median, an average of the middle three scores, or counting these scores as a dash (-) that doesn't affect the average). I also think you could've contacted the authors to give them a chance to fix the problem.

Two of my games wouldn't run on darkfrog's machine (with Vista x64). I think this is at least as likely to be an issue with the JVM or the obfuscator as with my code. Either of the solutions above *or* a standard platform for the judges to test on would've solved this issue.

In this case, the result lowered the average score for these two games (Bridge4k and Pixeloids4k) from around 90% (or slightly above) to around 75%, in effect moving them about 30 places down in the score list. And I'm not the only one with this problem.

Frankly, I'm astonished this wasn't fixed beforehand because we've seen this issue before.

With that said, I appreciate all the work of both the organizers and the judges. But I do think this particular issue could've been solved in a hundred different ways that would've caused far less bitterness and given us a fairer scoring system.

EDIT: Out of curiosity, could you post a list of the scores as they would've been with 0% ("couldn't run") scores not counted? I'd like to see it, at least...
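For what it's worth, the alternatives mentioned above are all easy to compute. Here's a minimal sketch in Java of how the different aggregation policies compare; the judge scores are invented purely for illustration:

```java
import java.util.Arrays;

public class ScoreAggregation {
    // Plain average: a single 0 ("couldn't run") drags the score down hard.
    static double average(double[] scores) {
        return Arrays.stream(scores).average().orElse(0);
    }

    // Median: robust against one or two outlier scores.
    static double median(double[] scores) {
        double[] s = scores.clone();
        Arrays.sort(s);
        int n = s.length;
        return n % 2 == 1 ? s[n / 2] : (s[n / 2 - 1] + s[n / 2]) / 2.0;
    }

    // Treat 0 as "no score" (the dash option): average only the non-zero scores.
    static double averageSkippingZeros(double[] scores) {
        return Arrays.stream(scores).filter(v -> v > 0).average().orElse(0);
    }

    public static void main(String[] args) {
        double[] scores = {90, 88, 92, 0, 85}; // one judge couldn't run the game
        System.out.printf("average:    %.2f%n", average(scores));             // 71.00
        System.out.printf("median:     %.2f%n", median(scores));              // 88.00
        System.out.printf("skip zeros: %.2f%n", averageSkippingZeros(scores)); // 88.75
    }
}
```

With one "couldn't run" zero among five otherwise high scores, the plain average drops to 71 while the median and the skip-zeros average stay near 88, which is roughly the 90%-to-75% drop described above.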

I admit this is a problem. The reviews got finished so late that there was little time for afterthoughts.

I am now pondering the idea of skipping these zero scores and averaging only the scores above zero. This WILL affect the order of the games, even in the top 10 and even in the top 3, although the #1 spot is firmly held by Left 4k Dead.

I can easily correct this, and in fact have the code ready for it, but I need "go ahead" from you guys.

- Remove the highest score for each entry
- Remove the lowest score for each entry
- Remove both the highest and the lowest for each entry
- Remove a particular judge's results for each entry
- Make all the scores for every game but the one selected zero (ok, maybe not this one).

It'd hardly be fair to judge some entries on four judges' results and some on five, given the disparity in the judges' score ranges and approaches to judging.

This could go on and on and on...

[size=5pt]One could also argue that it's a coding contest, and failing with an NPE is a failure to code to some API or framework, given how many other games managed to run on the same setup.[/size]

Kev
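The drop-highest/drop-lowest options in the list above are all variants of a trimmed mean, which can be expressed in one small method. A sketch (the scores are made up for illustration):

```java
import java.util.Arrays;

public class TrimmedMean {
    // Average after dropping the dropLow lowest and dropHigh highest scores.
    static double trimmedAverage(double[] scores, int dropLow, int dropHigh) {
        double[] s = scores.clone();
        Arrays.sort(s);
        // Stream over the sorted slice [dropLow, length - dropHigh).
        return Arrays.stream(s, dropLow, s.length - dropHigh).average().orElse(0);
    }

    public static void main(String[] args) {
        double[] scores = {0, 85, 88, 90, 92}; // one judge couldn't run the game
        System.out.println(trimmedAverage(scores, 1, 0)); // drop lowest  -> 88.75
        System.out.println(trimmedAverage(scores, 0, 1)); // drop highest -> 65.75
        System.out.println(trimmedAverage(scores, 1, 1)); // drop both    -> 87.67
    }
}
```

Note that dropping the lowest score for every entry also discards a legitimate low review from games that ran fine everywhere, which is part of the fairness disparity mentioned above.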

I would certainly appreciate it, appel. Let's see what the rest of the community thinks. Perhaps a poll?

EDIT:

I see you just changed it. Thanks a bunch!

Please, just keep in mind that I want the community to have a say in this! If they preferred the old version, I will accept it (although I don't have to like it!). I don't want to be the one ruining the contest for somebody.

No, this was clearly an error, which has now been corrected.

An error doesn't become a mistake until you refuse to correct it, and I don't want to turn this into a mistake.
