MAKERSCORE

An explanation of what it is and how it is accumulated

Makerscore is a quantitative representation of your contributions to the site. Each type of contribution nets you a certain amount of makerscore: for example, a tutorial will get you 30, while a completed 5-star game will get you 750! You can find your total makerscore under your name, both in your user profile and under your avatar in posts on the site. You can see a summary of makerscore earned by clicking on Submissions in the toolbar above.

As you garner more makerscore, the space in your personal locker grows (at roughly 40 KB per point of makerscore).

Makerscore is recalculated in an overnight batch job (so if you don't see immediate changes in your makerscore, go to bed! Everything will be better in the morning).

SUBMISSION         MAKERSCORE
Article            30
Tutorial           30
Script             30
Utility            30
Review             50
Game Media         15
Game Image         1
Game Blog Entry    4
Game Page          2
Game Download      15

GAME + RATING      MAKERSCORE
Unrated game       10
0.5 star game      25
1.0 star game      50
1.5 star game      75
2.0 star game      110
2.5 star game      160
3.0 star game      220
3.5 star game      280
4.0 star game      340
4.5 star game      400
5.0 star game      500

COMPLETED GAME*    MAKERSCORE
Unrated game       15
0.5 star game      37
1.0 star game      75
1.5 star game      112
2.0 star game      165
2.5 star game      240
3.0 star game      330
3.5 star game      420
4.0 star game      510
4.5 star game      600
5.0 star game      750

ACHIEVEMENTS           MAKERSCORE
Event contest entry    Varies
Event contest winner   Varies
Had a game featured    75

*A completed game receives a 50% bonus in makerscore.
**A user marked as 'Tester' or 'Other' on a gameprofile will receive 25% of that gameprofile's makerscore value.

So Anaryu brought up an important point: actually making games is the riskiest/least rewarding way of generating makerscore.

This isn't ideal.

Let's say you have a completed game (unreviewed). That means you have a gameprofile (15), say, 10 blog posts (40), a download (15), 12 images (12), a few pages (6), and a media (15) == 103 makerscore (with obviously some variability). What should we do to raise the base score of a game? The gameprofile of a completed game? Should we bump that up to 100? 250? 500? Or perhaps we should make the "main" download worth 150 (and additional downloads 15)? Should there be different amounts based on role? Leads get the full 100%, developers 80%, testers 25%? Or should a game be worth 103 makerscore / # of devs = your portion?
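As a sanity check, that example can be tallied with a quick sketch. The per-item values come from the tables at the top of the thread; treating "a few pages" as 3 pages at 2 points each is an assumption that makes the arithmetic land on 103:

```python
# Per-item makerscore values, taken from the submission table above.
VALUES = {"blog": 4, "image": 1, "page": 2, "download": 15, "media": 15}

def game_total(profile_score, counts):
    """Sum a gameprofile's base score plus its per-item contributions."""
    return profile_score + sum(VALUES[item] * n for item, n in counts.items())

# The completed-but-unreviewed example: profile (15), 10 blogs, 12 images,
# 3 pages ("a few"), a download, and a media.
total = game_total(15, {"blog": 10, "image": 12, "page": 3, "download": 1, "media": 1})
print(total)  # 103
```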

What do you think we should do?

So we should make games worth a fixed higher rate, and then perhaps apply a bonus for reviews?

I do believe that incomplete games should rely on other methods to generate their makerscore - blogs, media, pages, etc., so the base score for their profile should be low. However, completed games should get a significant bump, regardless of whether or not they are reviewed.

Also, Anaryu suggested that perhaps reviews are just bonus makerscore, and that each review should generate bonus score. So if you got a half dozen low-scoring reviews, at least you've proven that the game is interesting/buzzing enough to generate that! Worth considering, anyway.

One more wrinkle in all of this is that episodic games can be both "complete" and "incomplete" at the same time - you can have an episode that is worthy of being a complete game, but the OVERALL series is still incomplete. What to do then?

Lastly, the makerscore is calculated by person one by one, so making it so that a team of developers split a pot of makerscore per game might not be truly feasible, from an algorithmic perspective. I am not opposed to adding a new role though called "Participant" or "Contributor" for gameprofiles for these collaboration games, and they can receive a reduced percentage of the total makerscore value of a gameprofile (I believe Testers and Others receive 25%, for example).

author=kentona
Lastly, the makerscore is calculated by person one by one, so making it so that a team of developers split a pot of makerscore per game might not be truly feasible, from an algorithmic perspective.

In the algorithm, you don't need to know anything about the other developers for the game, or how much makerscore they're getting for it. All you need to know is the number of developers for the game. If you made a game that has 4 developers, you get 25% as much makerscore from it; number of developers would simply modify the game's value, the same way the number of stars the game has does. This should make the algorithm work.
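That modifier idea can be sketched in a few lines (purely illustrative, not the actual site algorithm; the 750 value is the completed 5-star game entry from the table above):

```python
def developer_share(game_value, num_developers):
    """Scale a game's makerscore value by its developer count, so each
    developer's per-person calculation needs only that count, not the
    identities of the other developers."""
    return game_value / num_developers

# A completed 5-star game (750) with 4 developers: each gets 25% of its value.
print(developer_share(750, 4))  # 187.5
```

The point of the design is that the per-person batch job stays per-person: the number of developers just modifies the game's value, the same way its star rating does.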

It comes down to what you/we/RMN wants to encourage, but personally I definitely believe that a Complete Game with a Download should be worth the most of anything, probably by a large margin. Finishing a project is a really difficult thing to accomplish and I think it makes sense to encourage that.

Miscellaneous stuff like blog posts, extra downloads (past the first), images, media, etc. could be worth a few extra points, but could even be capped per game, to prevent abuse (not sure how prevalent abuse is, tho). Uncompleted games are worth points, but the bonus for completing a game should definitely be significant (100%? 200%?), just in case someone needs additional inspiration. It's easy to start a game, but very difficult to finish.

author=kentona
Anaryu suggested that perhaps reviews are just bonus makerscore, and that each review should generate bonus score. So if you got a half dozen of low scoring reviews, at least you've proven that the game is interesting/buzzing enough to generate that! Worth considering, anyway.

It's definitely worth considering. I like the idea of games getting small bonuses for being "popular" or whatever (like if they're buzzing, or get a lot of reviews, or get good reviews). However, I don't want to discourage people from making small games for jams or events... but it makes sense to reward games that have seen a lot of quality effort and polish, as well.

author=kentona
Lastly, the makerscore is calculated by person one by one, so making it so that a team of developers split a pot of makerscore per game might not be truly feasible, from an algorithmic perspective.

Damn, I was just going to suggest this.

author=kentona
I am not opposed to adding a new role though called "Participant" or "Contributor" for gameprofiles for these collaboration games, and they can receive a reduced percentage of the total makerscore value of a gameprofile (I believe Testers and Others receive 25%, for example)

Would something like this make sense for the bigger projects (like Befuddle Quest)? Again I don't think it's right to discourage group efforts, but at the same time, making a single map is pretty different than making an entire game. Is this something that's abused enough to be a problem? Maybe games with >10 people behind them grant reduced Makerscore? I really can't say and I'm worried about solutions that encourage abuse. Just a thought.

For the record, I definitely like the achievement/MS reward system for events and I think those should stick around. They encourage people to participate in events and make friends :)

author=kentona
Also, Anaryu suggested that perhaps reviews are just bonus makerscore, and that each review should generate bonus score. So if you got a half dozen of low scoring reviews, at least you've proven that the game is interesting/buzzing enough to generate that! Worth considering, anyway.

I dunno. If you review a game specifically just to point out that the other review is biased and that the game is really nowhere near as good that review claims it is, it shouldn't raise the game's score. "Buzz" on its own isn't a measure of effort or quality, it's just a measure of buzz.

author=kentona
Also, Anaryu suggested that perhaps reviews are just bonus makerscore, and that each review should generate bonus score. So if you got a half dozen of low scoring reviews, at least you've proven that the game is interesting/buzzing enough to generate that! Worth considering, anyway.

I dunno. If you review a game specifically just to point out that the other review is biased and that the game is really nowhere near as good that review claims it is, it shouldn't raise the game's score. "Buzz" on its own isn't a measure of effort or quality, it's just a measure of buzz.

Speaking from the website's point of view, though, buzz is important, and the website wants to reward buzzed-about games with makerscore, so that people are encouraged to create more buzz?

author=LockeZ
An unrated incomplete game with no download is not worth a hundred makerscore. That definitely takes less effort to make than a review or tutorial, and it really has no benefit to the community.

Once it has a download, it can count as a game instead of just a game page. But with no download, 10 points is about right.

I didn't say it was. The top one is unfinished games, the bottom one completed games.

author=kentona
I am not opposed to adding a new role though called "Participant" or "Contributor" for gameprofiles for these collaboration games, and they can receive a reduced percentage of the total makerscore value of a gameprofile (I believe Testers and Others receive 25%, for example)

Would something like this make sense for the bigger projects (like Befuddle Quest)? Again I don't think it's right to discourage group efforts, but at the same time, making a single map is pretty different than making an entire game. Is this something that's abused enough to be a problem? Maybe games with >10 people behind them grant reduced Makerscore? I really can't say and I'm worried about solutions that encourage abuse. Just a thought.

Well, I guess there aren't that many "collaborative" games like this - there are the Super RMN Bros games, the couple of Mario Mansion games, and the six Befuddle Quest games, and maybe a handful more. (My suggestion is to PARTICIPATE in them. If you want easier to get makerscore, then join up! That would be my preferred solution going forward...)

I will see how easy it is to detect whether a game has a main download - because I am liking the general numbers Fomar is suggesting.

But what is the consensus on "reviews as bonus score" vs "average review score determines final makerscore"?

If there's not that many collaborative games, then yea, participants deserve the Makerscore. If your score is partially a representation of your contribution to the community, then getting a lot of MS for contributing to a community project makes total sense.

I kind of like the "reviews as bonus score" idea, only because it puts the focus on finishing a game instead of on reviews - good reviews are just an extra benefit. Maybe each review would grant that game a bonus of X makerscore (based on the review score) up to a limit. That would reward games both for getting good reviews and getting lots of reviews (an indication of popularity). I definitely think the bulk of the score should come from completing a project.
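One way that capped "reviews as bonus score" idea could look as a sketch (purely illustrative; the per-review bonus and the cap are made-up numbers, not anything RMN has decided on):

```python
def review_bonus(review_scores, per_review=10, cap=100):
    """Grant a flat bonus per review, weighted by its star score (out of 5),
    capped so a pile of reviews can't be farmed indefinitely."""
    bonus = sum(per_review * (score / 5.0) for score in review_scores)
    return min(bonus, cap)

# Six middling (2.5-star) reviews still earn something: 6 * 10 * 0.5 = 30.
print(review_bonus([2.5] * 6))  # 30.0
```

This rewards both good reviews and lots of reviews, while the cap keeps the bulk of a game's score coming from completing the project itself.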

I don't mind Fomar's numbers, but I'm wondering if the difference between finished and unfinished should be larger, to further emphasize finishing projects. Maybe completing a project isn't as important as I think it is, but it seems like a good habit to establish (and reward).

It seems that any calculation of makerscore is going to be arbitrary until you qualify what the values represent. Is it community improvement, content creation, or simple participation? It's hard to judge someone's role on the site based off their makerscore, since there are so many sources, each weighted in a particular way, which may or may not imply meaningful contribution. It defeats the purpose of a highly specific score.

I'd caution against using makerscore to drive community participation. Quality games should drive participation. Post quality is diluted by the extent to which the user was driven to post for the points. You're more likely to get rushed reviews and redundant screenshots. Not that many people would go through the trouble just for a few points, but it does encourage quantity over quality, and since there is no feasible way to measure quality at this point, you end up with people soft-spamming for boredom points. And nothing is more discouraging than getting half-butted reviews for a project you worked hard on.

Have you considered awarding points only for review scores and game downloads? Everything else should follow if people are making games other people want to (and actually do) play.

Lastly, I'm under the impression that games are scored according to their average review. This can be arbitrary in that a rushed, little-played game with a single five-star rating from a friend of the developer beats out a popular, quality title with many four-star reviews. Giving the developer points for each additional review would encourage dev teams to make games worth taking the time to review.

Actually I think something is up with MakerScore addition. I haven't had a review MS addition since the 20th and I've had 3 reviews approved since then. Normally it takes anywhere from an hour to half a day for it to show up, but a week?