Already I have lost all hope of looking at late answers etc. It's just a parade of meh. I can't make any kind of intelligent decision because I can't see the other answers to the question, for instance. There's no Looks Good, only Skip, and those feel different to me. So I've been more interested in reviewing edits.

But people keep approving crap while I'm looking at it. Two or three "the post was already approved; please visit to edit" and I'm just "screw this whole thing".

Can we do something like having one reject bounce all existing votes, so that an edit needs three approves in a row? A lot of people are stupidly approving, and it's actually driving me away from the queue.

Note this is less about letting bad edits through and more about keeping reviewers motivated.

I agree with you both but this is less about letting bad edits through and more about keeping reviewers motivated
– Kate Gregory, Nov 13 '12 at 17:00

Yeah, "Looks Good" was referring to late answers. I'm depressed by the number of queues I'm just no longer willing to look at. I am trying to stop Suggested Edits from being one of them.
– Kate Gregory, Nov 13 '12 at 17:04

Seems to me that those are basically the same thing, in this case.
– Pops♦, Nov 13 '12 at 17:05

Sorry, that was confusion on my part. After re-reading I realized you were still talking about Late Answers at that point. Carry on. :)
– Bill the Lizard, Nov 13 '12 at 17:07

+1 for the sentiment at least, even if a dupe.
– Bart, Nov 13 '12 at 17:07

Ever since the badges (especially the gold), I haven't wanted to review anything.
– random, Nov 13 '12 at 17:11

@Servy Precisely what I'm doing as well. No more review queues for me. Just general cleanup and searches for potentially bad content.
– Bart, Nov 13 '12 at 17:35

This review business will get out of control. It will get out of control and we'll be lucky to live through it.
– NullUserException อ_อ, Nov 13 '12 at 18:22

One thing that might help is adding a flagging option to each review action, then letting moderators block users from reviewing suggested edits for a while. Pretty sure this was already suggested, though.
– Shadow Wizard, Nov 13 '12 at 18:24

@gnat Kate just earned a Great Question badge for this post, and you think it "has not received enough attention"? (I'm kidding, I know what you mean. It just struck me as a funny incongruity.)
– Pops♦, Dec 5 '12 at 17:32

@PopularDemand LOL, but of course. I for one see this as a very interesting, clearly presented design challenge. ;) I wonder why readers don't give more solutions. At first I even thought about re-posting it at Programmers: "Imagine a site, like SO. Imagine a suggested edits review system like SO's. Imagine there are 'bad' and 'good' guys..." and so forth.
– gnat, Dec 5 '12 at 18:19

8 Answers

The best solution I've found to this problem is to not review posts from a queue.

Some of the other ways that I've found to find content to review are:

Participating in tag cleanups. It's not just about removing tags from posts; it's about fixing all of the problems with a post. Since there are no queues for this, it's all voluntary, so most of the other people doing this with you will be enthusiastic and capable reviewers.

Just scan through old posts. There are enough things that need fixing that random searches are reasonably productive.

Look through the questions of people posting on meta with "I can't ask questions anymore, WTF!?" (They tend to have lots of content that needs reviewing.)

Before answering questions in general, or immediately after answering any question, edit the questions as appropriate.

I stopped using the queues weeks ago, but I won't let that stop me from reviewing content anyway.

When I do go to the queues to review (which is less and less often every week), it's usually to find problematic reviews that need fixing. For example, I look for edits that shouldn't be approved, favorite the link, and then comment on or roll back the edit once it's approved. Or, if the reviews tend to be merely incomplete rather than abusive, I just improve the posts (now that the race conditions of doing so are lessened). I may not get review points for it, but the edits still take effect, which is enough for me.

Unanswered Questions by Tag is a query that shows problem tags that could use some clean-up help. Some of the help needed is in the form of answering questions, but quite a lot of it is in the form of editing or closing/deleting bad questions.
– Bill the Lizard, Nov 13 '12 at 18:26

"The best solution I've found to this problem is to not review posts from a queue." If that is the best solution that exists, that shows an epic failure of the system. The whole point of the queue is to make it easy to find questions that need reviews.
– Nicol Bolas, Nov 15 '12 at 6:47

@NicolBolas That's actually kinda the point of this answer, I just avoided saying it explicitly.
– Servy, Nov 15 '12 at 14:33

people keep approving crap while I'm looking at it. Two or three "the post was already approved; please visit to edit" and I'm just "screw this whole thing".

The above is pretty frustrating in my experience. It is especially painful with blatantly obvious crap that takes me just a few seconds to reject.

It would be interesting to try an exclusive review period to remedy this. By this I mean that a picked suggested edit is taken off the queue for 2-3 minutes, so that no one else can review it until the timeout expires.

This would guarantee that my decisions (at least reasonably quick ones) could only clash with those of users who held the suggested edit longer than that timeout, which would weed out mindless click-through robo-approvers.

The current delay is 2 seconds. Assuming a thorough reviewer spends one minute analyzing and improving an edit, a robo-approver is 30 (thirty) times more "performant".

Think of it: 3 (three) rubber-stampers acting in parallel can blindly approve 30 edits a minute, potentially destroying the efforts of 30 (thirty) responsible reviewers who could have spent that minute working on those edits.

Increasing the action-button delay to, say, 20-30 seconds could help to somewhat level the playing field in favor of the "good guys".
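The arithmetic above can be checked with a quick sketch (a toy calculation using only the figures quoted in this answer: a 2-second button delay and a one-minute thorough review):

```python
# Toy throughput comparison using the figures quoted above.
BUTTON_DELAY_S = 2        # current enforced delay before action buttons enable
THOROUGH_REVIEW_S = 60    # assumed time for a careful, hands-on review

robo_per_minute = 60 // BUTTON_DELAY_S          # 30 rubber-stamp reviews/min
careful_per_minute = 60 // THOROUGH_REVIEW_S    # 1 thorough review/min

advantage = robo_per_minute // careful_per_minute
print(advantage)  # → 30
```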

For the sake of completeness: I am aware of the various ideas for detecting and blocking fake reviewers (listed, e.g., in the linked questions), but here I focus exclusively on how to block them from messing with my review process.

Update, Dec 20 '12:

A new feature has been introduced that will likely help responsible reviewers part ways with robo-clickers:

An optimization to the Late Answers, First Posts, and Low Quality review queues has been deployed that will keep a single review from being shown to multiple people at the same time...

Update, Aug 31 '14:

...each person reviewing suggested edits has a good opportunity to accurately review them.

It's a bit like Ticketmaster, or any type of reservation system. When you visit a suggested edit review task, it's now "checked out" to you for that time; previously, the counter didn't reflect the number of "checked out" tasks.

Mutual exclusion? Starving editors? WHAT SORT OF SORCERY IS THIS? I like it.
– Tim Post♦, Nov 14 '12 at 9:56

@TimPost From a programmer's perspective, this is just an interesting concurrency problem. Dining philosophers and such... :)
– gnat, Nov 14 '12 at 10:03

That's why I mentioned starving editors (philosophers). If an AJAX call checked whether the post was locked in redis (a key with a very short TTL) and, if not, locked it by setting such a key, it would probably work and be rather cheap.
– Tim Post♦, Nov 14 '12 at 10:06

@TimPost I was rather thinking in terms of the item being taken out of the queue and re-appended after a timeout (lockless), but your approach would likely work too. And yes, you got it exactly right that a cheap solution is what I'm after here (not trying to solve world hunger).
– gnat, Nov 14 '12 at 10:09
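Tim's redis idea above might be sketched roughly like this: a minimal in-memory stand-in for a redis `SET key value NX EX ttl` style lock. The class name, the 180-second TTL, and the item ids are all illustrative assumptions, not the real implementation:

```python
import time

class ReviewLock:
    """In-memory stand-in for a redis SET-NX-with-TTL lock on review items."""

    def __init__(self, ttl_seconds=180):
        self.ttl = ttl_seconds
        self._locks = {}  # item_id -> expiry timestamp

    def try_acquire(self, item_id, now=None):
        """Return True and check the item out if it isn't currently held."""
        now = time.time() if now is None else now
        expiry = self._locks.get(item_id)
        if expiry is not None and expiry > now:
            return False  # another reviewer holds it; serve a different item
        self._locks[item_id] = now + self.ttl
        return True

locks = ReviewLock(ttl_seconds=180)
print(locks.try_acquire("edit-42", now=0.0))    # → True  (checked out)
print(locks.try_acquire("edit-42", now=60.0))   # → False (still held)
print(locks.try_acquire("edit-42", now=200.0))  # → True  (TTL expired)
```

Expired keys are simply overwritten on the next acquire, which mirrors how a redis TTL key quietly disappears; nothing has to actively release the lock if a reviewer wanders off.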

I just passed 2000 rep on Ask Ubuntu and tried to make meaningful edits to some proposed edits... no way, "they were already accepted" every time. Lost time. We really need to lock the reviews: put a reasonable timeout on it (5 minutes) and there should be no problem.
– Rmano, Feb 20 '14 at 0:11

Make better edit review audits. The current ones are terribly bad: for example, "vandalism" is simulated by inserting some random words, which is so obvious you might as well put THIS IS AN AUDIT in big text within the post.

Make some audits with more finesse. For example, change a link to point to some off-topic page (or to a tinyURL-like redirect to an image explaining that this is only an audit). This would check whether people notice that someone has changed a link, and whether they actually click it to see what's behind it.

Make some audits that change closing tags in XML, remove dots from method calls, change math operators, or remove spaces from Python code. Check whether people actually analyse such "improvements" to code.

Change 'I' to 'i', 'you' to 'U', or add 'Thanx for help!', 'Urgent!', or similar. Check whether people notice and reject.

There are many things you can do with edit review audits. As it stands, this is the most poorly audited review queue.

@bjb568 Much as I dislike "tag-slappers", I disagree that it's always too minor. A post could really need a tag, but not have any other issues. This almost never happens in practice, but it's possible.
– S.L. Barth, Jun 11 '14 at 8:21

@S.L.Barth Every second you take deciding whether there are other things to improve (there always are) is a second not devoted to cat-human hybrid research.
– bjb568, Jun 11 '14 at 8:24

This is still a problem. If changes were made to the site in late 2012, they did not solve it.

The solution seems obvious to me. If you have taken the effort not only to review edits but also to open a post to manually improve it, you are obviously more seriously concerned about its quality than the robots are. So when you open a post to manually edit it, it should be locked against approving/rejecting by others for 10 minutes or so.

Perhaps with the exception of the original poster themselves and moderators.

Limit the Number of Reviews non-Moderators can do to 1 per X Seconds

Limit the number of reviews a reviewer can perform to one review per {amount of time between 10 and 60 seconds}. Don't enforce an on-page penalty, but an overall one, applied across all types of reviews. This will at least give anyone a chance to grab a review. I can't add a comment until 15 seconds pass; this is the same sort of "penalty", one that would only truly affect automatons.

Mechanics

The {10-60} second counter starts counting down when the review is presented to the reviewer. The review itself can take less than 10-60 seconds, and the review action can be submitted at any time, but the reviewer will be unable to access any item to review from any queue until that counter hits zero.

The time limit on the counter could depend on the type of review presented, gauged from the overall average time needed to process that review type. A First Post could have a longer counter applied to it than, say, a Low Quality Post or a Suggested Edit.

Real reviewers won't feel the penalty because they should actually be taking the time to review.

Even if this doesn't stop the robo-reviewers, it will likely allow more reviews to be available for non-robo-reviewers to access and provide some atmosphere of fairness.

There is also the potential to reduce the penalty based on reputation. 60 seconds may be too much time, but 10 is (in my opinion) not enough; it could be 30 or 45 seconds, but it must be some amount that's semi-substantial.
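The mechanics above can be sketched minimally. This assumes illustrative per-queue cooldowns (the 45/20-second figures are this answer's examples, nothing deployed), and the reviewer ids are hypothetical:

```python
class ReviewCooldown:
    """Per-reviewer cooldown: once an item is presented, no further item is
    served from ANY queue until that item's counter runs out."""

    # Illustrative per-queue cooldowns; heavier review types get longer ones.
    COOLDOWN_S = {"first-posts": 45, "suggested-edits": 20, "low-quality": 20}

    def __init__(self):
        self._next_allowed = {}  # reviewer_id -> earliest time for next item

    def serve_item(self, reviewer_id, queue, now):
        """Return True if a review item may be presented at time `now`."""
        if now < self._next_allowed.get(reviewer_id, 0.0):
            return False  # still cooling down, regardless of queue
        self._next_allowed[reviewer_id] = now + self.COOLDOWN_S[queue]
        return True

cd = ReviewCooldown()
print(cd.serve_item("reviewer-1", "suggested-edits", now=0))  # → True
print(cd.serve_item("reviewer-1", "first-posts", now=10))     # → False
print(cd.serve_item("reviewer-1", "first-posts", now=25))     # → True
```

Note the counter starts when the item is *presented*, not when the review is submitted, so submitting quickly doesn't shorten the wait; that is the whole point of the proposal.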

Even if someone is taking a lot of time to review properly, there will still be a noticeable percentage of reviews that take less than a minute. Even if a fair number of reviews should take that long or longer, not all need to.
– Servy, Jun 14 '13 at 15:53

I never said to force anyone to review an item for a minute. They can take a second to review. Just like I can't post another comment until X seconds pass, reviewers who have been presented with an item less than 1 minute ago won't be able to view another item to review (FROM ANY QUEUE) until that minute passes.
– JoshDM, Jun 14 '13 at 15:55

And that will be very annoying and disruptive for people reviewing properly. We don't want that. We want to be annoying and disruptive for people not reviewing properly, while not affecting, or only minimally affecting, people who are.
– Servy, Jun 14 '13 at 15:58

Which is why the penalty reduces with higher reputations.
– JoshDM, Jun 14 '13 at 15:59

It has been shown on a lot of occasions that the quality of reviews has very little correlation with reputation. If it did, the problem would be easy to solve (just raise the rep requirement to review, or otherwise be more restrictive to lower-rep users). The problem is that lots of high-rep users are bad reviewers, and plenty of not-so-high-rep users are good reviewers.
– Servy, Jun 14 '13 at 16:01

It doesn't have to be a full minute, just something, ANYTHING, to prevent 20 reviews being performed in under 2 minutes.
– JoshDM, Jun 14 '13 at 16:01

Note there is already a delay when reviewing edits. The approve/reject buttons aren't enabled for several seconds.
– Servy, Jun 14 '13 at 16:10

Just so you know, your average review time is well under 1 minute. You spend the most time reviewing first posts (kudos), averaging 61 seconds per review, but in the queue where you're most active (edits), you average a hair under 26 seconds.
– Shog9♦, Jun 14 '13 at 16:11

@Shog9 Does that average count the times when I edit the items in the queue, and the amount of time spent editing? Because I edit when I can. I've got an outstanding beef here where you get kicked out of the review queue for editing and re-editing items.
– JoshDM, Jun 14 '13 at 16:13

@Shog9 With your detailed account of a user's activity, now I wonder what SE doesn't know about me.
– Antony, Jun 14 '13 at 16:14

Heh... Well, we specifically track review times in order to watch for signs of abuse, @Antony. Very fast reviewers find themselves held to a higher standard when it comes to audits and such. Also: other things we're learning from watching you...
– Shog9♦, Jun 14 '13 at 16:16

I just reviewed some Suggested Edits and can see why I'm faster there than on First Posts; some Suggested Edit rejects are obvious, while First Posts definitely need more editing involvement. I admit I'm a badge hunter, hungry for items from those lesser queues (Low Quality Posts and Late Answers), but I think some form of presentation delay (in conjunction with the button delay), possibly skewed based on the type of review presented (45 seconds for a First Post, 20 for a Suggested Edit), would add some semblance of fairness between those who automate and those who participate. I'm a victim of incentives, but I want to get there fairly.
– JoshDM, Aug 12 '13 at 15:01

When you view a pending suggested edit, the system will avoid assigning that edit to any other reviewers until you've submitted your review or a reasonable period of time (currently 3 minutes) has passed. The number of "in review" tasks is tracked, and the main /review page updates the counts accordingly.

This is... considerably less dramatic than your suggestion, but it's much easier to calculate and explain, and so far appears to be quite effective.

"so far appears to be quite effective" - do we have any data on this? Just asking out of curiosity - not sure what type of data would help confirm this, though.
– hichris123, Oct 7 '14 at 23:06

Data isn't really punchy here, but I'll put something together as part of a larger report in a few weeks once the full set of changes has had time to bake. In the meantime, here's a little test you can do on your own: jump into /review/suggested-edits. There will very nearly always be edits to review, even if the counter says 0. Why? Because there are constantly locks being released as folks finish their reviews, and the delay in updating the page ensures that most reviewers aren't "bottoming out" as soon as they try to review. Review queues were suffering from a classic scheduling problem.
– Shog9♦, Oct 8 '14 at 15:22

Delay assigning points towards a reviewer's badge progress until a decision is made about whether the suggested edit is accepted or rejected.

If the reviewer's Accept/Reject decision agrees with the actual result of the overall review, then count that review towards badge progress.

Certainly there will be those who are illegitimately penalized by this, but of all the queues besides Close Votes, Suggested Edits is the most active, and any perceived loss in badge progress would eventually be regained. And real users shouldn't care as much about a minor loss of badge progress if it helps prevent abusers.
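The deferred crediting could be sketched like this; a toy model with hypothetical names that simply holds each vote until the edit's final outcome is known, then credits only the reviewers who agreed with it:

```python
def credit_badge_progress(pending_votes, final_outcome):
    """Credit badge progress only for votes matching the final outcome.

    pending_votes: (reviewer, vote) pairs held back until the suggested
    edit is finally decided; vote and final_outcome are "approve"/"reject".
    """
    return [reviewer for reviewer, vote in pending_votes
            if vote == final_outcome]

# Votes are queued while the edit is pending, then reconciled at the end.
votes = [("alice", "approve"), ("bob", "reject"), ("carol", "approve")]
print(credit_badge_progress(votes, "approve"))  # → ['alice', 'carol']
print(credit_badge_progress(votes, "reject"))   # → ['bob']
```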

Alas, it's very common for the robo-approvers to unanimously agree to do the wrong thing. In fact, that's why I started this question: you have a TERRIBLE review and people just all approve it. The actual result of the overall review is not a good measure of individual reviewer skill.
– Kate Gregory, Jun 14 '13 at 15:27