We currently have a review queue for first posts as well as for low quality posts.

We currently also have a problem with the signal-to-noise ratio of new questions, and the current system seems insufficient to alleviate this problem.

I propose to amend the current review system in a manner similar to how journal articles are peer-reviewed before they are published.

TL;DR: I suggest that questions by new users, or by users with a history of bad questions, must be reviewed before they can be answered.

My proposal:

Questions by users with bad previous questions (and possibly by new users) do not appear on the frontpage or in search results by default.

The only place those questions show up is in a special review queue. Reviewers can vote to publish those questions, suggest how the question can be improved, or skip. Reviewers by default get questions tailored to their interests using a similar algorithm as currently used to select questions for the "interesting" tab on the frontpage.

Questions with positive reviews get unlocked and published. Questions with requests for improvements get sent back to the author, who can then edit the question and resubmit it. To avoid questions in low-traffic tags hanging around in limbo forever, questions with no negative reviews automatically get published after x minutes.

Once a user has a certain (very low) number of questions that passed review on the first attempt, their questions get published automatically. If they post a number of closed or downvoted questions, this is reverted.
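To make the lifecycle concrete, here is a minimal sketch of the proposed rules in Python. Everything in it is hypothetical: the class names, states, and thresholds are invented for illustration and do not correspond to any actual Stack Exchange mechanism.

```python
# Hypothetical sketch of the proposed review lifecycle.
# All names and threshold values are invented for illustration.

AUTO_PUBLISH_PASSES = 2   # first-attempt passes before review is waived
REVOKE_BAD_QUESTIONS = 2  # closed/downvoted questions before review is re-imposed

class Asker:
    def __init__(self):
        self.first_attempt_passes = 0
        self.bad_questions = 0

    @property
    def needs_review(self):
        # Users earn auto-publish after a few clean passes,
        # and lose it again after a run of bad questions.
        if self.bad_questions >= REVOKE_BAD_QUESTIONS:
            return True
        return self.first_attempt_passes < AUTO_PUBLISH_PASSES

class Question:
    def __init__(self, asker):
        self.asker = asker
        self.state = "in_review" if asker.needs_review else "published"
        self.attempts = 1

    def approve(self):
        self.state = "published"
        if self.attempts == 1:
            self.asker.first_attempt_passes += 1

    def request_improvement(self):
        # Sent back to the author; editing and resubmitting re-enters the queue.
        self.state = "needs_edit"

    def resubmit(self):
        self.attempts += 1
        self.state = "in_review"

    def timeout(self, has_negative_review):
        # Low-traffic safety valve: publish automatically after x minutes
        # if nobody objected.
        if self.state == "in_review" and not has_negative_review:
            self.state = "published"
```

The point of the sketch is that the whole proposal reduces to a small per-question state machine plus two counters per user, so it should be cheap to implement and reason about.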

Gamification: To attract enough reviewers and to prevent review-bots, reviewers get reputation for questions they approved: if the question they approved is later upvoted, they gain reputation; if it is downvoted, they lose reputation. In essence the reviewers share responsibility for the quality of the question with the asker. This avoids some of the downsides of the badge-reward / audit-punishment system in our current review queues.
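The shared-responsibility rule could be sketched like this (again purely illustrative: the function and the reward values are invented, not actual Stack Exchange reputation amounts):

```python
# Hypothetical sketch: reviewers who approved a question share in its
# later vote outcomes. Reward values are invented for illustration.

REVIEWER_UPVOTE_BONUS = 2     # small compared to the asker's own gain
REVIEWER_DOWNVOTE_PENALTY = 1

def apply_vote(question_votes, approvers, reputations, vote):
    """Apply one up/down vote (+1/-1) on a published question and
    propagate a small share of it to every approving reviewer."""
    question_votes += vote
    for user in approvers:
        if vote > 0:
            reputations[user] = reputations.get(user, 0) + REVIEWER_UPVOTE_BONUS
        else:
            reputations[user] = reputations.get(user, 0) - REVIEWER_DOWNVOTE_PENALTY
    return question_votes
```

Making the upvote reward larger than the downvote penalty is a deliberate assumption in this sketch: it keeps reviewing attractive overall while still making rubber-stamping bad questions a losing strategy.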

For the initial test phase this review system would only apply to users with a history of bad questions, i.e. users close to a question ban. If it works well, it should probably be extended to all new users, otherwise it is too easy to circumvent.

Possible drawbacks

This would be a massive barrier to entry if extended to all new users. But perhaps that is what we need?

Instead of being judged by the broad masses of Stack Overflow peers, each question in the review queue would depend on the whims of a small random sample of users. Of course, a user could always try again if their question fails review, and thereby get some new reviewers, but the fundamental problem remains. There will probably be a lot of outcry over rejected questions.

We currently have no hope of ever emptying the close-review queue, so a queue for all questions by new users may be too much to handle. I hope that with the proposed gamification we would attract enough reviewers to keep the queue short. And if we automatically publish questions that have been in the queue for 30 minutes, then the worst that could happen is that some users would have to wait half an hour for their answers.

It is not that different from the current system. We already review all questions and close the ones deemed bad, and we ban users with excessively many bad questions. The only difference is that currently everyone can participate in the review of every question, and the proposed system adds a lot of complications on top of that. However, the current system is not working as intended. Bad questions do not reliably get downvoted or closed; often they get answered instead. The number of views on bad questions is staggering. If we could concentrate all these views on better questions instead, we would improve the overall quality of the site.

Related proposals:
I hope this theme has not been beaten to death before. I could find no direct duplicate; the closest was the question "Why not peer-review all new questions". However, that question proposes a very different review system based on social networks, which boils down to more specific filters for each user.

Please do not take this proposal as a final draft; it is more of an early concept. Perhaps some of the ideas in here can be used, most likely not everything as it stands. After writing all this, I am suddenly having doubts about whether this could have some big downsides I am not seeing yet.

"the reviewers share responsibility for the quality of the question with the asker." Nooooooo. Becomes a disincentive for reviewing at all and will lead to edit wars as well.
– Ňɏssa Pøngjǣrdenlarp May 31 '14 at 19:11

Well, the reviewers would get rewarded if they approve of a good question, and would only get punished for approving bad questions. The idea is to give reviewers an incentive to approve good questions and reject bad ones. The reputation gains/losses could be relatively small.
– HugoRune May 31 '14 at 19:15

Considering the thorough, thoughtful, and vigilant reviewing </sarcastic> we see in the suggested edits queue, I have no faith that this would work out at present. It's an interesting idea, however, and one that's well worth its own Meta question.
– jscs May 31 '14 at 19:36

The suggested edits queue is particularly bad, I agree. But the current system rewards reviews only with a boring badge for number of reviews; and the edit queue audits are trivial to spot. If each reviewer had a chance of discovering and sponsoring the next top-scoring question, now that would be a proper incentive to do a good job.
– HugoRune May 31 '14 at 19:42

Did you watch the latest podcast? Joel raises some interesting ideas there about this sort of thing.
– Benjamin Gruenbaum May 31 '14 at 19:57

the reviewer should see the question and some buttons: "just awful", "bad indentation", "unclear what you are asking (e.g. no '?' contained in question)", "too broad", "i don't understand the question, but it might be okay", "i understand the question and i like it", "i understand the question but i don't like it, but i think it's valid". ;)
– Michael May 31 '14 at 20:13

As a relative n00b, this sounds like a really good idea. I'd rather ask a bad question and have to wait for a while than see the quality of SO go down because we're too afraid to offend new users. Anyone who really cares about SO continuing to exist and improve would probably be in favor of such a change, (as long as it is implemented properly).
– Chris Middleton Jun 1 '14 at 23:10

This appears to be a repeat of Daniel's "show queue" idea, shown here - Ooops, sorry @DanielVérité, you caught this already. But, you see, I remembered the idea because I liked it!
– matt Jun 2 '14 at 14:54

kuro5hin.org, the "alternative" to slashdot, implemented this idea of a peer-review queue on his popular blog fairly early (think mid-1990s). I wonder if Rusty is still around to chime in.
– Paul Jun 3 '14 at 8:43

8 Answers

I can see your perspective on this. I can also see that this could be a useful feature to have, as long as it is implemented properly.

Looking through the comments, I can see people are pretty divided, and it's fairly obvious why: if this were implemented incorrectly, it could destroy the site. Let's take an example. User X posts a badly-worded, grammatically-incorrect question. He has a history of bad questions and has just been put on this 'review-before-publishing' list. He's used to quick responses and high-quality answers to his questions; and given that he's consistent with his bad questions, he's also used to people editing them to correct them.

With this question, he gets no answers and little attention, and he doesn't know why. After a while he finds the answer elsewhere and leaves it, moves on. The process repeats over several subsequent questions.

Over time, this would give him the impression that it is better to find answers elsewhere.

Personally, I like the idea. I think it would eliminate a lot of bad questions and make it easier for answerers to find good questions to answer.

However, I think there should be a few things added to it before it goes live.

1. Notifications
There should be a notification to the user when they get put on this list. That would mean people wouldn't get confused about why they aren't getting attention.

2. Reviews
There should be a frequent review of the list. This means that every week? month? the list would be reviewed, and users whose question quality has improved would be taken off. This might be harder to implement though - once people are on the list are they going to continue asking questions? How do we judge whether the quality has really improved?

3. Search
You mentioned the possibility that these questions would not appear in search. I would tend to agree, with one caveat: there should be an 'include hidden questions' checkbox that would include questions asked by people on this list.

There are probably several more things that would need to be worked out. However, as an initial idea, I like it.

"Over time, this would give him the impression that it is better to find answers elsewhere." - wait, is that a bad thing? How many people post inappropriate and beyond-salvageable-through-editing (OP clarification required) (which is what we should be targeting) but otherwise particularly useful questions?
– Bernhard Barker Jun 1 '14 at 19:53

So, your "bad scenario" is that this will be bad for users who ask badly formatted questions about things that they could have answered themselves if they were not so lazy? And that such users might leave SO as a result, or will be forced to do things properly? I do not know if I will be able to stand a decrease in the number of questions asking "why 5/6 returns 0".
– SJuan76 Jun 1 '14 at 20:02

Regarding the list of users who cannot post questions without review: this could easily be automated - e.g. post one question that is approved without changes and you are off the list; then post two consecutive questions that are heavily downvoted or closed, and you are back on it. | Regarding user X getting used to reviewers editing his questions to correct them: I don't think reviewers should edit-and-approve questions; they should either approve or suggest improvements. When a question needs improvement it is sent back to the original author, which allows him to learn from the review process.
– HugoRune Jun 1 '14 at 22:18

@SJuan76: I don't deny that a decrease in such questions would be a good thing. However, badly-written questions are not necessarily bad/easy questions: they could be intelligent and hitting a very valid point, but would still be reviewed for the bad writing. Though in principle, I agree: these suggestions are going to need refining before they come out live on SO.
– ArtOfCode Jun 2 '14 at 12:10

@ArtOfCode It would be stupid (IMO) to force users to improve their clear question prior to it getting posted just because of a few spelling or grammar issues that can easily get fixed by anyone (these are not the posts we should be targeting with this). If the question is unclear because of these issues it's not a good question - there may be an underlying good question, but until the asker edits it, it isn't and it shouldn't get answers or get seen by many people.
– Bernhard Barker Jun 2 '14 at 12:53

If this is implemented correctly, we wouldn't need notifications, as posts should clear the review queue within minutes (and either be sent back to the poster or published). If this is not implemented correctly, the queue will build up fairly quickly and then we should probably just get rid of it.
– Bernhard Barker Jun 2 '14 at 13:02

@Dukeling: Yeah, I see that. I think this is a good idea generally, and it is going to need refining. There are points from several answers and comments here that could be included, but there's going to be a bit of debate before anything gets trialled, let alone let loose on the site.
– ArtOfCode Jun 2 '14 at 14:09

@ArtOfCode: But everyone knows all answers on all sites ultimately lead back to StackOverflow!
– Ian Jun 3 '14 at 10:44

The notification thing could be like the pending edit one. You know, the "This won't be visible to everyone until it's been approved". Something similar to that could be used for the "unpublished" reviews.
– starsplusplus Jun 4 '14 at 16:02

Props for "if this was implemented incorrectly, it could destroy the site". We've been working hard on balancing the existing review queues for two years now, and lemme tell ya: it is really hard to get it right. As we've gotten better at it, we've been feeding progressively more things into them, which I think is a sane way to proceed here - dumping 9K questions a day into a new system would make it insanely hard to correct even small miscalculations.
– Shog9 Jun 4 '14 at 17:47

Requiring more attention by the community to deal with low-quality questions

Promoting abuse of the reviews to keep users out of my Stack Overflow

Diluting the signal to prevent new users and the community from learning

If you can find a way to implement this without falling into any of those traps, I am all for it. And maybe someone will find a great way to do that. In the meantime I think it's more effective for us to focus our time on reducing the burden of signaling that content is bad, and making that bad content less visible to the users most perturbed by it.

For anything on SO to be sustainable, it needs to scale. The internet is getting bigger. The number of things to ask questions about is increasing as new technology is invented. To just keep pace with the growth of the community, the number of committed users needs to increase at the same rate as the number of casual users. People are feeling the strain and making suggestions like this because they feel that balance is currently off.

By comparison, the close vote review queue doesn't even grow at that rate. And the consequences of a botched close vote review aren't that severe (the close system is designed to be a temporary thing that gives the author a chance to fix their post or ask about it on meta), whereas the proposed queue would (theoretically) stop the content from being contributed in the first place (otherwise you may as well propose dumping every new question in to the close vote review queue by default).

Those of us committed to improving the content and the site have limited time. We should be looking for ways to do more with less, and that means focusing more of our attention on the places where we have the largest positive impact, and less on the places where we have the least returns for that effort.

That's why we have suspensions. And audits. And moderators. And multiple reviews from different people. And all sorts of other checks in place to make sure that even if someone goes entirely off the deep end, they won't cause lasting harm to the community.

I will 100% guarantee you that there will be people who:

Reject every question that comes in and doesn't show 'effort' by their subjective determination

Reject every question that even has a hint of being part of a homework assignment

Reject every question that they assume would be found by even the most basic google query (without even checking to make sure it is that easy to find an answer)

Reject every question that doesn't have proper code formatting rather than edit it in or see if the community is willing

Yes, there are ways to reduce this risk, but going back to the first point, do we really want to spend more resources to review reviews that are likely made in good faith using good judgment just to weed out these bad eggs? Because unlike a close vote (where people can see the content, edit if they feel the desire, comment to help out the poster), these questions would be invisible to the community unless they exert additional effort to look for them.

Even if the system was designed to scale, will it scale while mitigating the potential harm from people who are trying to mold Stack Overflow in their own image?

Diluting the Signal

So much of what allows SO to be successful is the ability to look at the huge amount of content that exists, to analyze it, to see trends, and to figure out ways to use that information to help make it easier to do these things in the future. The whole concept of Stack Overflow was that questions matched with answers should have clear indications of their quality and whether or not they actually solve the problem. This is a two-way street -- SO learns from their users through the data, and users learn from SO as they browse around and see the signal from the community.

Look at the current issue with comments -- we have a single signal (upvoting) which makes it difficult to actually figure out which comments are worth showing/saving, and which ones are rubbish. We can make guesstimates, but that ends up pushing out potentially good content that doesn't nicely fit the imperfect algorithm. Look at the specific-question tag. Often this is a learning process for the whole community because a specific question helps spark a discussion that better refines what we do/don't want as a community. If we give the ability to prevent the borderline cases from seeing the light of day, we lose the ability to learn from those borderline cases and grow because of it.

And the new users lose out too. Even though some of us may have asked bad questions, we used the feedback we got from the community to figure out how to improve our content, and ended up becoming active members of this site in no small part because of that feedback. We are a very complicated community with a ton of contradictory rules and a lot of confusion even among regular users. Potentially great new users need feedback to improve because it's confusing here, and even a great contributor may slip up at first because of how different SO is from what they're used to.

Those new users are denied the downvotes, the comments, the friendly edits, the close reasons, and most importantly the feeling that they are part of the community. And that has a huge chance of diluting the signal that we get.

The 4 points you've listed are the actual problems of such a queue. Well caught +1
– user2140173 Jun 3 '14 at 15:28

Excellent summary. This is a viable solution for a group that has no desire for outside influence, but breaks down if the goal is to capture knowledge that does not already exist within the (small) group of insiders. Can you imagine a Stack Overflow where every new tool, library, language or API had to be vetted by the existing crowd before questions would be allowed? I can't; if SO had been launched that way, it would still be a tiny site for Microsoft / .NET programmers.
– Shog9 Jun 4 '14 at 17:36

Add #4. It will make the site seem less responsive for new users, since their questions will not appear immediately. Do we want to turn new users away? Also, new users are not necessarily clueless users. They could be super-experts just discovering SO.
– PM 77-1 Jun 5 '14 at 20:39

@PM77-1, it may make the site seem less responsive (though how much? Does it really take less than 10 minutes for most good questions to get an answer?), but that is only a problem if it drives away the good new users we want to keep. We don't want everyone, which is why there are bans, suspensions, rate throttling, etc.
– jmac Jun 6 '14 at 2:24

From the perspective of a new user, the downvote isn't enough on its own. A short comment explaining why (it doesn't need to be long or detailed), so I can figure out what I'm doing wrong and do the appropriate reading, is what helps me understand SO's aims and culture. The off-topic rule is often too subtle for us. Take this as an example: stackoverflow.com/questions/37142879/… I thought it was completely OT since it's about electronics, but it's answered by a ~10K user.
– Niall Cosgrove May 12 '16 at 22:22

In short: no! Questions already are peer-reviewed: that's what the voting system is for.

The current system might not be perfect, but I doubt adding another layer of management would help.

I certainly cannot say I speak for everyone, but my main participation on SO consists of answering questions, and I'm not particularly interested in the management side of things. I very rarely use the review queue, although I do cast close votes quite regularly (and raise a few flags once in a while). I barely participate on Meta too (some meta questions, like this one, get my attention because I see them advertised on the side of the main site). Don't get me wrong, I am really grateful for those who do this, in particular moderators.

However, answering questions is taking enough of my spare time as it is. I regularly monitor the ssl tag: I have written answers to 5.5% of all questions there, and frankly, that's enough. I suppose monitoring a specific tag is close enough to a specific review queue as it is. I'm quite happy to downvote, comment and vote to close when required, but I do all this when I have a spare moment, and I wouldn't go to another reviewing place. (I do answer other questions from the main page once in a while.)

One of the downsides of a separate review queue is that those reviewing might have no knowledge of the field they're reviewing. (The problem already exists in the ssl tag to some extent.) In this context, reviewers might just skip such a question (so it could stay in the queue forever), reject questions that are reasonable but phrased with jargon they're not familiar with, or not know which questions to ask the OP in comments to improve the question with the required details. Telling the OP which details they need to give is actually a very important aspect of improving a question (if you go to the doctor and say "I'm coughing", their main job is to ask you various questions to find out under which circumstances that happens; only then will they suggest a fix).

There's also relatively little traffic on this tag as it is. There are many questions that I've voted to close or move, which didn't get enough other close votes to do so.

Another downside would be the delay to answer. Assuming I'd be more active on the reviewing side of the tags I know about, if I see a question for which an answer comes to my mind, I'd want to write that answer there and then. Having a step to review the question and having to come back later to answer that question would just be a waste of time, and I don't think I'd bother.

Sure, you could have a system where high-rep users have both the power to review and to answer as they review, but there could be a side-effect: we'd keep the interesting questions to ourselves, and knowledgeable newcomers would have no chance to gain rep by answering interesting questions (it can be difficult to provide a competing answer, especially when a high-rep user has already provided one). I'm happy with the principles of meritocracy, but I wouldn't want to see SO become more elitist.

You raise some good points. On the topic of obscure questions staying in the queue forever, I think this could be solved by a maximum time limit: if no reviewer wants to handle a given question for, say, 10 to 30 minutes, just put it on the front page and let the masses deal with it like today. Of course that does not solve the problem that reviewers might simply reject questions they do not understand. Regarding reviewers immediately wanting to answer the question, I do not really have a good solution, but I would consider it important to separate the two to prevent "question-hoarding".
– HugoRune Jun 2 '14 at 17:50

I'm a bit surprised by your time estimates. Do you really think that a 30-minute timeout is realistic, even for less obscure questions? Assuming that every new question (or at least most of them) will be reviewed within 1/2 hour sounds a bit optimistic to me.
– Bruno Jun 2 '14 at 18:14

Just a thought, if your proposal was ever implemented. Perhaps having the option to see/hide the non-reviewed (or rejected) questions in the main list via a user preference would be good in this case, instead of closing them permanently. (Perhaps they could be closed automatically if not answered within a week or so.)
– Bruno Jun 2 '14 at 18:17

I am not sure whether a 30 minute window is realistic, but I do think if we cannot deal with the majority of questions in 10 minutes, we should scrap the whole idea. The way I see it, either we get enough reviewers to keep the size of the queue near zero, as with most review queues today, or we do not have enough reviewers and the queue will keep on growing to infinity, as with the close-votes-queue. So either we can deal with new questions quickly, or not at all. (Note that not all new questions would enter this queue, only the fraction by users without good previous questions)
– HugoRune Jun 2 '14 at 18:52

I can tell you right now that we don't have enough reviewers to review every single question entering the site, @Hugo. Not for any definition of "review" that isn't "random person sez pass/fail" at least. That's why all of the existing queues are fed based on heuristics: first/late post, low quality, etc.
– Shog9 Jun 4 '14 at 17:29

Yes!

... with a few adjustments.

My thoughts / suggestions / a bit of brainstorming:

Never mind not showing up in certain places; these questions should only appear in the review queue (and shouldn't be answerable until approved) - essentially they start off closed (or they could actually start off closed).

We shouldn't force people who don't want to see these questions into a little corner of the site, we should rather stop letting users who can't be bothered to put a decent amount of effort into their questions essentially run the site. There will be people wanting to answer their questions, thus they'd have no motivation to improve their questions if we allow them to be found and answered.

Reviewers can vote to publish those questions, suggest how the question can be improved, or skip.

Perhaps we should include (1) editing the question and (2) rejecting it (not salvageable - e.g. a cooking question) (although, if we do as below, a suggested improvement will essentially serve the purpose of a reject).

With regard to suggesting improvements - these can show up as comments, and if it needs improving, the question stays in limbo (not visible anywhere) until edited (or perhaps we can just close it and have it show up normally). Once it has been edited, it can perhaps go into the reopen queue (since that already essentially deals with this scenario).

Perhaps we should also include a "this question is good enough" check-box when suggesting improvements, which will essentially just cause a comment to be posted (and treat it as a vote to publish) (or allow for normal commenting from that queue).

Reviewers by default get questions tailored to their interests using a similar algorithm as currently used to select questions for the "interesting" tab on the frontpage.

I think I'd prefer being able to pick which tags I see (like the close vote review queue, but perhaps more than that - include favourites / ignored tags, if desired) (but we could certainly allow switching between the two).

It might also be good to allow reviewers to see everything. I don't really know JavaScript, but I'm sure I can judge when a JavaScript question is absolutely horrible.

Questions with positive votes get unlocked and published.

I don't think this is a good idea, at least not as stated. Users posting bad questions have bootstrapped themselves to a point where they can just upvote each others' questions (with the help of those being "nice" to them), so upvotes don't mean all that much.

We could perhaps consider only votes by users above a certain reputation.

But with my suggestion for how this will work above, this isn't relevant.

To avoid questions in low traffic tags from hanging around in limbo for ever, questions with no negative reviews automatically get published after x minutes.

Gamification: To attract enough reviewers and to prevent review-bots, reviewers get reputation for questions they approved: If the question they approved is later upvoted, they gain reputation; if it is downvoted they lose reputation. In essence the reviewers share responsibility for the quality of the question with the asker. This avoids some of the downsides of the badge-reward / audit-punishment system in our current review queues.

I like this idea, but I think it's idealistic: simply using votes could be a problem (because of the above reasons), and imagine someone creating a sock-puppet account to upvote questions they've reviewed - that could be extremely difficult to detect.

We could reward reputation for unopposed reviews (approved with no rejections, or rejected with no approvals), or perhaps have trusted users - those who've failed no or very few audits - and only reward reputation for reviews in which these users are involved (subject to the above criteria). Then we could revert the reputation (with a punishment?) when a rejected question is reopened without having been edited, or an approved question is closed.

The audit system is a good idea; the only reason it isn't working, IMO, is that the audits are badly chosen (with no functionality to deal with bad ones) (close votes) / badly generated / too easy (suggested edits). We just need to fix this first / make sure we have proper audits with this review queue.

"Questions with positive votes get unlocked and published." That was intended to mean positive review results, not upvotes. Since those questions only appear in the review queue before publishing, and that queue would need some minimum reputation to view, I don't think there is any danger of bootstrapping. Regarding sock-puppet accounts to upvote reviewed questions, this does not seem to be much harder to detect than current sock-puppet voting patterns.
– HugoRune Jun 1 '14 at 22:30

"With regard to suggesting improvements - these can show up as comments, and if it needs improving, the question stays in limbo (not visible anywhere) until edited" - So in general, if the improvement is made the comment becomes obsolete, and if the improvement is not made the question never leaves queue. Maybe it would be tidier to implement suggested improvements separately from the existing comment system; this would obviate the need for the "good enough" check box as well, since you could just use comments when the suggestion is non-critical.
– Air Jun 4 '14 at 15:47

"Heavy reputation loss for failing audits" - that's a great way to make sure no one who cares about the reputation they've earned by laboring at answering questions ever touches review.
– Shog9 Jun 4 '14 at 17:23

@Shog9 Not if you can also gain reputation for reviewing. Even if you take away my ability to delete my own answers, I would still post them, regardless of the fact that I can get downvoted into oblivion if my answer is sufficiently wrong, because I'd be sure enough that it isn't (and I'm sure many would say the same) - the exact same thing holds for reviewing (or at least it should).
– Bernhard Barker Jun 4 '14 at 17:28

Yeah, that's even worse. "I earned 100K by writing detailed answers to hundreds of questions" vs. "I earned 100K by rubber-stamping questions that other people wrote that ended up being useful because other people answered them." You're essentially proposing the establishment of a class of bureaucrats who contribute nothing of value to the site beyond adding resistance to the folks who do. Extrinsic motivation (gamification) can be powerful, but divorced of intrinsic motivation it becomes extremely dangerous; it's all too easy for folks to forget why they're playing.
– Shog9 Jun 4 '14 at 17:43

@Shog9 It sounds like you're saying improving site quality isn't important. Admittedly improving the site and answering programming questions are fairly disjoint, so perhaps using the same metric for both isn't ideal, but I recall that all requests to create more metrics have been declined / ignored. We need more reviewers - we need to motivate them with something.
– Bernhard Barker Jun 4 '14 at 17:50

If we can't motivate them with a desire to see quality questions on the site, then we're screwed - that's all there is to it. If I don't care which questions get approved or rejected, then this becomes a prisoner's dilemma: I win by correctly predicting what my peers will do. Am I gonna stick my neck out on principle when doing so stands to drastically hurt my status on the site? I don't think so.
– Shog9 Jun 4 '14 at 18:08

@Shog9 I guess I have a stronger belief in the power of fake internet points (and even for those who do want to improve the site, it's nice to get something a little more quantifiable than simply knowing you've improved it). I tried to focus on subtracting reputation only for failed audits, specifically so as not to punish people for holding a different opinion than others. Perhaps we could also reward passing audits instead of unopposed reviews, but rewarding something that doesn't help anyone seems counter-intuitive.
– Bernhard Barker Jun 4 '14 at 18:47

You want something quantifiable? Poke around meta for a bit, there's a proposal for a mentoring system that gives folks a view into the progress of those they've helped in some way. This would be a reinforcement to those with an intrinsic desire to help, in the same way that reputation is currently. What you're proposing doesn't do that though; there's no connection between a given review and how useful that question ends up being. You can "gamify" anything by attaching points to it, but you have to be very careful about creating a feedback loop that excludes your actual goal.
– Shog9 Jun 4 '14 at 19:16

@Shog9 The goal is to consistently have enough quality reviewers who will be able to deal with the increased number of review tasks that throwing all questions by new users into a review queue will create. I believe this proposal, if implemented well, has a pretty good chance of reaching that goal - we get the numbers through reputation rewards, and the quality through decent audits. And if we have quality reviewers reviewing most content from our biggest source of low quality content [citation needed], we should almost immediately get a massive drop in low quality content.
– Bernhard Barker Jun 4 '14 at 21:07
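The reward scheme debated in the comments above, where a reviewer shares the fate of the questions they approve, could be sketched as follows. This is only an illustration: the function name and the +5/-2 amounts are invented here, not part of any proposal.

```python
# Hypothetical shared-responsibility reward: a reviewer who approved a
# question later gains or loses reputation in proportion to how the
# community votes on it. Amounts are invented for illustration.
REVIEW_UPVOTE_BONUS = 5       # assumed reward per upvote on an approved question
REVIEW_DOWNVOTE_PENALTY = 2   # assumed penalty per downvote

def reviewer_rep_delta(upvotes: int, downvotes: int) -> int:
    """Reputation change for a reviewer who approved this question."""
    return upvotes * REVIEW_UPVOTE_BONUS - downvotes * REVIEW_DOWNVOTE_PENALTY
```

Under this sketch a reviewer profits only when the questions they wave through are later judged useful, which is the coupling Shog9's objection is about.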

Especially if the site authorities gather enough courage to face the facts and admit that 90% of site traffic is "Help me fix the code!" questions. This peer review thing will just move that kind of question somewhere else, to some other Stack Exchange site.

Really,

we have thousands of people who just need "another pair of eyes"

we have hundreds of such eyes, hungry to solve noob problems

we have site rules which forbid these two movements

WHY do we have such contradicting rules?

Why not let these poor folks find each other and feel satisfaction at solving the problem and pride in helping people, instead of feeling guilty? Why not make everyone happy?

More often than not however, it's better for the user to arrive at the answer themselves, as this is more conducive to learning. When you simply give the person an answer, even if you explain it well, they aren't as likely to remember it. IMHO, that's why this rule needs to continue to exist.
– Leon Newswanger Jun 1 '14 at 19:52

HA. HA. HA. And again I hear the voice from Imaginary Stack Overflow. This kind of answer will NEVER exist here. I cannot even count how many times I've been attacked by the local Tonton Macoutes with their ridiculous "This doesn't answer the question. To critique or request clarification you should comment" claim when I tried to make a user "arrive at the answer themselves".
– Your Common Sense Jun 2 '14 at 4:26

Is there a "debug my code" exchange? Has anyone proposed one on Area 51?
– RubberDuck Jun 2 '14 at 10:49

To be fair, I do really enjoy the slow pace and quality on CR.
– RubberDuck Jun 2 '14 at 10:57

I never said we should adjust the rules to allow someone to more or less tutor someone else. This is a Q&A site for people with minimal understanding, not a find-a-free-tutor site. That doesn't mean we should treat these questions in a manner that will, say, help someone pass a CS class without having any idea what they're doing, by simply providing an answer to a person who CLEARLY lacks understanding.
– Leon Newswanger Jun 2 '14 at 14:36

@LeonNewswanger JFYI: presently Stack Overflow works EXACTLY AS YOU DESCRIBED - simply providing an answer to a person who CLEARLY lacks understanding. If you are ready to fight these two parties I described in my answer - you are welcome. While I am not. And all I want is less litter on the main site.
– Your Common Sense Jun 2 '14 at 14:46

@YourCommonSense I actually think that depends on the tag. Each group that frequents a specific tag appears to apply the rules in the way they see fit. I don't think SO is as unified on any subject as people seem to think it is.
– Leon Newswanger Jun 2 '14 at 14:50

@LeonNewswanger Besides, the behavior you described is actively opposed by the community.
– Your Common Sense Jun 2 '14 at 14:51

@YourCommonSense You can find a group of people who oppose almost every rule on SO. There no longer is a single SO "community". SO has become a metropolis with many smaller communities inside it, and each and every one of those communities opposes and supports different rules.
– Leon Newswanger Jun 2 '14 at 14:56

@LeonNewswanger Yes, you are right in general but wrong for this particular problem. You'll be torn to pieces if you try to answer without giving a direct answer.
– Your Common Sense Jun 2 '14 at 15:06

@YourCommonSense I was reading through your other posts proposing a system to judge question quality, and I stand by your point: this seems like an excellent filter before posting a question to the main answering queue. See my meta.stackoverflow.com/questions/258035/… (I called it mentoring; here someone called it sponsoring).
– NirMH Jun 2 '14 at 18:06

@NirMH Well, as you can see, they are too stubborn to accept the facts.
– Your Common Sense Jun 2 '14 at 18:12

@NirMH Sorry, but I am not a meta guy. I come here occasionally, but all the local bureaucracy and endless talk of nothing drive me mad. I'd rather answer a question or two on SO.
– Your Common Sense Jun 3 '14 at 11:33

It might be possible to get closer to what you are suggesting, without a large change in features.

Assuming your goal is to see more interesting questions on the frontpage, and see fewer poor-quality questions:

Tweak the frontpage display algorithm so it shows fewer questions with 0 score, and more questions with positive score.

The algorithm might aim for a 60:40 ratio, or better when possible, so for each viewer at least 60% of the visible questions would have 1 vote or more.

The 0 vote questions could be selected at random, but it would be fairer to take some sort of round-robin approach, so each 0-score question will appear to an equal number of users.

The only issue is that if the site really is being flooded with 0-vote questions, they may fall off the bottom of the page before anyone gets to see them. But that is already a danger, I suppose, and perhaps algorithms are already in place to ensure questions get seen before they disappear.
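As a rough illustration, the mixing described in this answer might look like the sketch below. The 60:40 split, the page size, and all names are assumptions for illustration only, and a deque stands in for whatever queue the site would actually use.

```python
# Sketch of the frontpage mix: fill at least ~60% of the page with
# positive-score questions, then cycle through the 0-score questions
# round-robin so each gets shown to a similar number of viewers.
from collections import deque

def build_frontpage(positive, zero_score, page_size=10, positive_share=0.6):
    """positive: questions with score >= 1, best first.
    zero_score: deque of 0-score questions, rotated between viewers."""
    n_positive = min(len(positive), max(1, int(page_size * positive_share)))
    page = list(positive[:n_positive])
    # Round-robin: take 0-score questions from the front and requeue them
    # at the back, so the next viewer sees a different slice.
    for _ in range(min(page_size - n_positive, len(zero_score))):
        q = zero_score.popleft()
        page.append(q)
        zero_score.append(q)
    return page
```

Because each 0-score question is requeued after being shown, successive viewers see different 0-score questions, which is the fairness property the answer asks for.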

Very good idea, also bearing in mind the suggestions by ArtOfCode in his answer. I once made a simple calculation of what would happen if new users couldn't ask questions immediately, in this answer: Huge close votes review queue on Stack Overflow. Although the idea was different, those calculations partially apply here as well; they suggest that only a small number of good questions would be lost, while a huge number of bad questions would never enter the system and spoil the close votes review queue.

Alternative: "on hold" users

If a user has a history of bad questions, put the user's questions "on hold" (e.g. for 1 day), not for a specific "peer review before publishing" but for a simple hold, an eventual vote, or an eventual peer review.

The "wait 1 day" or "wait 2 days"... Is only a penalty for people who make no effort to do better. Not a penalty for the site or the administrators or reviewers.

The @HugoRune proposal of "peer review questions before publishing" is like the scientific publishing strategy, which has high transaction costs...