Stack Exchange cares about links. Awesome. Now removing rel=nofollow would be the next logical step (since that obviously destroys linking as far as search engines are concerned).
– Konrad Rudolph, May 14 '12 at 19:55

Wait a second. The number 130398 hasn't been changed since the question was posted. So you timed it perfectly so that this would be post #130398, and then set up the bot to link to this question?
– Mechanical snail, Jul 26 '12 at 11:30

2 Answers

Stack Exchange runs a bot that validates all the external links across our network of sites. It performs HEAD requests in a heavily throttled way, and was created to combat link rot in our network.

Links are tested once every 3 months.
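As a rough sketch of what such a throttled checker might look like (in Python, for illustration only; the delay, timeout, and function names are assumptions, since the bot's actual code isn't public):

```python
import time
import urllib.error
import urllib.request

def link_is_alive(url, timeout=10):
    """Return True if `url` answers a HEAD request without an error."""
    request = urllib.request.Request(url, method="HEAD")
    try:
        urllib.request.urlopen(request, timeout=timeout)
        return True
    except (urllib.error.URLError, ValueError):
        return False

def crawl(urls, delay_seconds=5.0):
    """Yield (url, alive) pairs, sleeping between requests to stay throttled."""
    for url in urls:
        yield url, link_is_alive(url)
        time.sleep(delay_seconds)
```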

If you notice any issues with this bot, please email team@stackexchange.com.

The proposed operational mechanism (this is not final as of April 26, 2012) is:

It's all being planned at the moment. In a nutshell: Community will add a "special comment" to the post asking the creator to fix it, the post will also be added to a list in /review (and we will remove one of the tabs), and we will add a couple of badges to help drive the fixing.

Namely, the bot:

Adds a comment to the post asking the author to fix it

Adds the post (question or answer) to a special list on the /review tab

What does it do when it finds a dead link? Is it flagged at least? It would be awesome if posts with dead links appeared in the review page (or 10k).
– Jeff Mercado, Apr 26 '12 at 7:00


@JeffMercado It's all being planned at the moment. In a nutshell: Community will add a "special comment" to the post asking the creator to fix it, the post will also be added to a list in /review (and we will remove one of the tabs), and we will add a couple of badges to help drive the fixing.
– waffles, Apr 26 '12 at 7:03

I hope there will be badges for fixing links in your own posts, and that the auto-added comment will be auto-removed once the link has been fixed. It may be worth only adding items to the review tab if the author has not fixed the link themselves within n days. Also, should the review tab only show posts with lots of views that have broken links, so the most important posts are fixed first?
– Ian Ringrose, Apr 26 '12 at 11:30

I see answers that merely point to external links all the time. This is something that should be fought more vigorously on the moderation side. But this addition is also welcome!
– UncleZeiv, Apr 26 '12 at 11:34


Can you make this flag / remove known URL shorteners, since this pretty much kills the existing "too expensive" argument?
– Flexo, Apr 26 '12 at 12:38


@AviD Actually, you can sometimes find the important content; I saved a link-rotted post thanks to the Wayback Machine the other day.
– Ben Brocka, Apr 26 '12 at 13:36


@waffles So I should deliberately add a dead link, wait 3 months and then edit it to get the badge? ;)
– Lorem Ipsum, Apr 26 '12 at 23:36


@yoda I think a better approach is to add a good link, then launch a DDoS on the target site, take it down forever, then replace the link with a link to the Wayback Machine. AND PROFIT. I hear botnets can be rented for quite cheap these days.
– waffles, Apr 26 '12 at 23:45


Curious: are you also doing this for images? And if you're adding @awoodland's URL-shortener handling, then maybe personal storage such as Dropbox and the like could get some attention too. (Especially for images, I feel.) (Nice, by the way!)
– Arjan, Apr 27 '12 at 7:13

It might make sense to incorporate this into the crawl: if we can't reach the original source link during the crawl, ask the Wayback Machine if they have a copy of said link, and if they do, we can fix the post ourselves without bugging users about it.

The crawl is of course still necessary to check the validity of the link. But this gives us a much better fallback that can save everyone a lot of work and reduce load on the human-powered queues. Robot work is always preferable to human work.
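For illustration, a minimal sketch of this fallback using the Wayback Machine's public Availability API (the function and variable names are mine, not Stack Exchange's actual implementation):

```python
import json
import urllib.parse
import urllib.request

WAYBACK_API = "https://archive.org/wayback/available"

def find_wayback_copy(dead_url, post_created):
    """Return the URL of the archived snapshot closest to `post_created`
    (a 'YYYYMMDD' string), or None if the Wayback Machine has nothing."""
    query = urllib.parse.urlencode({"url": dead_url, "timestamp": post_created})
    with urllib.request.urlopen(f"{WAYBACK_API}?{query}", timeout=10) as resp:
        data = json.load(resp)
    snapshot = data.get("archived_snapshots", {}).get("closest")
    if snapshot and snapshot.get("available"):
        return snapshot["url"]
    return None
```

The API returns the snapshot closest to the requested timestamp, which is why the post's creation date is passed along.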

This is awesome. Many users fix broken links by replacing them with a link to the Wayback copy, since it's the simplest thing to do. Hooray for robots! Sticking this on our feature agenda.
– Tim Post♦, Sep 26 '13 at 5:17


This is going to the devs sometime today to look into implementing. It should fix the immediate problem, provided that the Wayback copy is the same age or newer than the post at the time the link was made, and we only automatically re-write links after several failed checks. There will be a unique PostHistory entry when this happens, so we can conceivably find / improve these posts if they depend too heavily on the now-fixed link (but later, another project, another time). Hopefully nothing will get in the way of it working.
– Tim Post♦, Nov 22 '13 at 15:31
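To make the conditions in that comment concrete, here is a hypothetical sketch of the gating logic described (the threshold and all names are assumptions, not the shipped behavior):

```python
# Only rewrite after several consecutive failed checks, and only when the
# snapshot is the same age or newer than the post. Threshold is assumed.
FAILED_CHECKS_REQUIRED = 3  # "several failed checks"

def should_auto_rewrite(consecutive_failures, snapshot_timestamp, post_created):
    """Timestamps are 'YYYYMMDD...' strings, so lexicographic comparison
    matches chronological order."""
    return (consecutive_failures >= FAILED_CHECKS_REQUIRED
            and snapshot_timestamp[:8] >= post_created[:8])
```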

@JeffAtwood It went on the core call, but there's a bit of resistance. The Wayback API gives us everything we need to know in order to responsibly re-write the link (e.g. that we're not linking to a version much older than what the author originally linked to), but the Community user would be making a ton of edits. Even though I specified a PostHistory event for Community fixing a link so these could be reviewed if needed, we'd need to better address the 'if needed' before this could go in. (1/2)
– Tim Post♦, Mar 24 '14 at 4:31

The other concern was that this would (at least in part) prevent posts that depend too heavily on a link from being surfaced and fixed into fundamentally better posts; I don't entirely agree with that. Anyway, it's all leading back to finally getting the link review queue in shape to actually be used, at which point I'll propose this again in the context of actually having that in place, where we put the 'fixed' posts in front of people for edits to make them fundamentally better where possible. (2/2)
– Tim Post♦, Mar 24 '14 at 4:35