You can wait until the crawl finishes and reaches 100%, or you can view 404 broken links while the crawl is still running by navigating to the ‘Response Codes’ tab and using the filter for ‘Client Error (4XX)’.

There are two ways to do this. You can simply click on the tab at the top and use the drop-down filter –

Alternatively, you can use the right-hand crawl overview pane and click directly on ‘Client Error (4xx)’ in the tree view under the ‘Response Codes’ folder. Both show the same results, whichever way you navigate.

This crawl overview pane updates while crawling, so you can see the number of client error 4XX links you have at a glance. In the example above, there are 4 client errors, which is 0.58% of the links discovered in the crawl.

3) View The Source Of The Broken Links By Clicking The ‘Inlinks’ Tab

Obviously you’ll want to know the source of the broken links discovered (which URLs on the website link to these broken links), so they can be fixed. To do this, simply click on a URL in the top window pane and then click on the ‘Inlinks’ tab at the bottom to populate the lower window pane.

You can click on the above to view a larger image. As you can see in this example, there is a broken link to the BrightonSEO website (http://www.brightonseo.com/category/videos/), which is linked to from this page – https://www.screamingfrog.co.uk/15-top-quotes-and-takeaways-from-brighton-seo/.

Here’s a closer view of the lower window pane which details the ‘inlinks’ data –

‘From’ is the source where the 404 broken link can be found, while ‘To’ is the broken link. You can also see the anchor text, alt text (if it’s an image which is hyperlinked) and whether the link is followed (true) or nofollow (false).
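As an aside, the underlying idea is easy to sketch outside the tool. This isn’t the SEO Spider’s code – just a minimal, hypothetical Python example (the class and function names are made up) of pulling the ‘to’ URL, anchor text and follow status out of a page’s HTML with the standard library:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects (href, anchor text, followed) tuples from a page's HTML,
    roughly mirroring the 'To' / anchor / follow columns above."""
    def __init__(self):
        super().__init__()
        self.links = []       # finished (href, text, followed) tuples
        self._current = None  # the link currently being read

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            attrs = dict(attrs)
            if "href" in attrs:
                followed = "nofollow" not in (attrs.get("rel") or "")
                self._current = [attrs["href"], "", followed]

    def handle_data(self, data):
        if self._current is not None:
            self._current[1] += data  # accumulate the anchor text

    def handle_endtag(self, tag):
        if tag == "a" and self._current is not None:
            href, text, followed = self._current
            self.links.append((href, text.strip(), followed))
            self._current = None

def extract_links(source_url, html):
    """Return rows shaped like the inlinks pane: from / to / anchor / follow."""
    parser = LinkExtractor()
    parser.feed(html)
    return [{"from": source_url, "to": href, "anchor": text, "follow": followed}
            for href, text, followed in parser.links]
```

In practice the SEO Spider handles all of this (plus rendering, encodings and the various edge cases) for you – the sketch is only meant to make the ‘From’/‘To’ data above concrete.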

It looks like the only broken links on our website are external links (sites we link out to), but obviously the SEO Spider will discover any internal broken links if you have any.

If you’d rather view the data in a spreadsheet, you can export both the ‘source’ URLs and broken links via ‘Bulk Export > Response Codes > Client Error (4XX) Inlinks’ in the top-level menu.
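The export itself is just a CSV of source/destination pairs, so it’s easy to post-process or reproduce in the same shape. A minimal sketch (the function name and column layout here are illustrative, not the tool’s exact export format):

```python
import csv
import io

def write_inlinks_csv(rows, fileobj):
    """Write (source, destination, status) rows in a
    'Client Error (4XX) Inlinks'-style shape."""
    writer = csv.writer(fileobj)
    writer.writerow(["Source", "Destination", "Status Code"])
    writer.writerows(rows)

# Example: one broken outlink and the page it was found on.
buf = io.StringIO()
write_inlinks_csv(
    [("https://www.screamingfrog.co.uk/15-top-quotes-and-takeaways-from-brighton-seo/",
      "http://www.brightonseo.com/category/videos/", 404)],
    buf)
```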

There are a number of ways you can export data from the Screaming Frog SEO Spider, so I recommend reading our user guide on exporting.

Crawling A List Of URLs For Broken Links

Finally, if you have a list of URLs you’d like to check for broken links instead of crawling a website, then you can simply upload them in list mode.

To switch to ‘list’ mode, simply click on ‘mode > list’ in the top level navigation and you’ll then be able to choose to paste in the URLs or upload via a file.
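For context, the check list mode performs is conceptually just ‘fetch each URL, keep the 4XX ones’. A hedged Python sketch of that idea (these helper names are made up, and the status function is injectable so the filtering logic can be tested without a network):

```python
import urllib.error
import urllib.request

def http_status(url):
    """Fetch a URL with a HEAD request and return its HTTP status code."""
    try:
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # 404 and friends arrive as exceptions in urllib

def broken_links(urls, status=http_status):
    """Return the subset of urls whose status is a 4XX client error."""
    return [url for url in urls if 400 <= status(url) < 500]
```

A real crawler also needs timeouts, retries, redirects, politeness delays and so on – which is exactly what the SEO Spider's list mode gives you out of the box.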

Hopefully the above guide helps illustrate how to use the SEO Spider tool to check for broken links efficiently.

Please also read our Screaming Frog SEO Spider FAQs and full user guide for more information.

Dan Sharp is founder & Director of Screaming Frog. He has developed search strategies for a variety of clients from international brands to small and medium sized businesses and designed and managed the build of the innovative SEO Spider software.

239 Comments

Good tutorial. Screaming Frog has become one of the most widely used SEO tools in my arsenal. Hopefully, we will see similar updates and tutorials.
4XX error code inspection, as well as 3XX, has saved me a ton of time analyzing websites.

This is a useful post for finding broken links within a website, but what about links pointing outwards that are broken? I can use a free web service, but wondered if this was possible within Screaming Frog.

Hey Dan,
love the tool but I can’t figure out how to identify pages that have external 404 errors (to other sites) from a list of URLs that I import into Screaming Frog. (The ‘Outlinks’ tab does not give me 404 errors.)

Hello, I was wondering if, after getting the broken links list, it is possible to export them together with the inlinks info. That way you have, in one report, the broken link and where on my page it is located. Thanks

They might not be found if they are not linked to internally, if they are blocked by robots.txt, or if you’re using the trial version with its 500 URL crawl limit and you reach that limit, etc. (More over here – https://www.screamingfrog.co.uk/seo-spider/faq/#14).

Hi,
I am sorry, I need more details about this one too. When we run a report in crawl mode and go to response codes, does it also scan the img src tags?
I wonder if it is possible to detect broken images inside img src.
Thank you.
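Conceptually, checking images just means collecting every img src on a page and then status-checking each of those URLs separately from the page that embeds them. A minimal Python sketch of the collection step (the names are illustrative, not the SEO Spider’s code):

```python
from html.parser import HTMLParser

class ImageSrcExtractor(HTMLParser):
    """Collects every src found in <img> tags, so each image URL can be
    status-checked independently of the page that embeds it."""
    def __init__(self):
        super().__init__()
        self.sources = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            src = dict(attrs).get("src")
            if src:
                self.sources.append(src)

def image_sources(html):
    parser = ImageSrcExtractor()
    parser.feed(html)
    return parser.sources
```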

Great tool! I am using a redirect plugin to send all my 404s to my home page, but I think it’s slacking sometimes. I noticed that some other sites have somehow taken over my 404s (I see them in Google search with my title but their link).

I have a problem with my Screaming Frog software. There are so many links on my website, but when I enter my site and press the start button, it only shows one web link in the table below, not the complete website. Please advise how I can fix this problem.

Appreciate the post. I’ve been trying to find a more effective way of finding broken links on sites. Screaming Frog accomplishes that with ease… now I just have to figure out the rest of the features.

Thanks for this. Someone introduced me to Screaming Frog last month and I’ve been using it for 3–4 weeks now. Locating and sorting out broken links will be a useful addition to my knowledge of SF. Cheers

I hope this doesn’t sound like a silly question, but I’m assuming the program ignores (doesn’t “click on” or “follow”) Google Adsense ads or any other pay-per-click ads one has displayed on their site?

The reason this isn’t found as a broken link is that it doesn’t return a typical 4XX error response – instead, the request fails with a ‘DNS lookup failed’ status.

If you look under the ‘External’ tab you can see it, but the best way to catch these (different) errors is also to look at ‘No Response’, which can be found under the ‘Response Codes’ tab. You can see this example here –
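The distinction is essentially between getting an HTTP status back (even a 4XX) and the request failing before any HTTP answer arrives at all. A rough Python sketch of that split (hypothetical helper names, not the SEO Spider’s implementation):

```python
import socket
import urllib.error
import urllib.request

def classify_error(exc):
    """Label a failed request the way the 'Response Codes' tab splits
    things up: a numeric status, or a no-response category."""
    if isinstance(exc, urllib.error.HTTPError):
        return exc.code                  # e.g. 404 client error
    if isinstance(exc, urllib.error.URLError) and \
            isinstance(exc.reason, socket.gaierror):
        return "DNS lookup failed"       # hostname never resolved
    return "No Response"                 # timeout, refused connection, etc.

def classify(url):
    """Return the HTTP status of a URL, or a no-response label."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status
    except (urllib.error.URLError, socket.timeout) as exc:
        return classify_error(exc)
```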

I am able to use the tool to crawl the source site and it works great! Unfortunately, when I try crawling the target site, it only finds the one URL, with a status of ‘forbidden’. I have double-checked the permissions on both sites, so I am not sure why I am getting only one URL and how to get past this error.

Will this software find broken links from other websites that point to mine? Like if a person posts a link to a product in their blog but that product no longer exists? I’ve moved to a new website and want to link up old urls to the new ones. There are lots of pins on Pinterest going to the old product urls.

If you don’t link to pages that error internally on your website, it won’t find any broken links.

So if they are old URLs, which have external links pointing to them, then I’d recommend using –

1) Google Search Console – It’s free. Google crawls the whole web and will show you any errors from external links.
2) A tool like Majestic (my fav) or Open Site Explorer (which are backlink analysis tools). You can view a ‘top pages’ report and they show status codes, too.

No worries. Yeah, you can see 404s in Google Search Console (formerly Webmaster Tools) and I highly recommend using it as well.

So a good question. The benefits of using the Screaming Frog SEO Spider are –

1) It’s instant. Obviously you have to wait for Google to crawl your site (or page!) to find the broken link(s). With the Spider, you can crawl instantly at any time – whether that’s pre-launch, during a site migration on a staging server, or when you’ve just made a lot of changes. So you can fix them before publishing and before any potential problems. Conversely, if you or a developer has fixed some broken links, you also have to wait for Google to re-crawl them for them to disappear from Search Console. So how do you know they are all fixed and none have been missed? In the Spider, you can verify it by re-crawling and double-checking they have been fixed straight away.

2) Google doesn’t show you the ‘inlinks’ (the pages which link to the 404s) very well. It’s really clunky – you have to click each individual broken link to view it in a little window, and you can’t bulk export errors alongside the source ‘inlinks’ like the Spider can, to simply fire over to a developer. They also have a limit of 1k errors, while the SEO Spider will show ALL found (some large sites obviously have thousands). We have given John Mueller and their team feedback on these points, as it’s frustrating.

3) Google doesn’t show you external broken links. I.e, if you link out to external websites and those links are broken, you wouldn’t know in Search Console.

4) It’s easier to prioritise which broken links to fix in the Spider. In the ‘lite’ free version, you can analyse the number of ‘inlinks’ a page has, or the ‘level’ (how deep a page is) which are both useful measures to consider of how important a page is, particularly on large sites. If you have the premium version, you can match against Google Analytics or Search Analytics (from Search Console) data, so you can fix broken links which historically have the most traffic, conversion, rankings etc.

5) The SEO Spider also crawls directives like canonicals or rel next/prev and will tell you if the URLs there are broken. Google doesn’t show you those. The configuration options in the premium version also allow you to crawl links with ‘nofollow’ etc., which, again, you’d have to remove to get Google to crawl.

6) To use GSC, the site will need to be verified by you, or you’ll need to get the site owner to verify and grant you access. This can be a pain if you’re an external party etc and waiting on slow clients / devs. With the SEO Spider, you can just start crawling.

I think those are the main points! Obviously the SEO Spider will also report on all response codes (like redirects, whether they are 301s or 302s, server errors etc) and plenty of other data at the same time anyway, so the tool is more than just finding broken links.

There are benefits of using Google Search Console as well, though, for balance – Google crawls the web and can find broken links via other websites, which link to old pages (or just incorrectly). For big audits, we often combine the two.

I have just started using Screaming Frog and I absolutely love it! I love that it’s so fast and easy to use. By far one of the key tools I have now. Thank you for developing it and making it available for others to use.

Someone recommended Xenu for checking for broken links, then I remembered this post. I use Screaming Frog almost daily now when quoting clients and diagnosing on-page SEO issues. Thanks again for making such a great tool.

I want to use this tool as a way to check whether the external links in my content pieces return a 200 response, a 404, etc. I understand that this can be done using spider mode, but when I try using list mode, it gives confirmation of the page response (as described in this article) but it doesn’t check the external links on the page. Make sense?

So is there a way I can upload a list of pages and have the tool crawl the external links on that page too?

Is it possible to show the text that a broken link is linked to? So far it seems as if the Broken Link Report produces a list of broken links, but does not show the text that the broken links were found in.

If this is possible, can you explain how I can do this? Thank you for any help that you can provide.

I am a newbie to Screaming Frog, but my overall first impression is that it is a valuable SEO tool. My 404 errors do not seem to be so straightforward. The pages linking to the page not found either do not exist or, after searching high and low, I simply cannot find the suspect link to the page not found. Do you have any articles on how to use Screaming Frog for ideas on further analysis?

Thanks for the comment. When we updated the SEO Spider to allow crawling of canonicals, we decided not to class them as an ‘inlink’, as they are not a proper HTML anchor tag obviously. So they don’t appear in the usual ‘4xx client error inlinks’ export currently, or as an ‘inlink’ in the lower window tab. This is something we are debating internally, as it would be useful, even if the definition is not completely correct etc.

However, you can still get this data – Use the ‘Reports > Canonical Errors’ export from the top level menu.

This will show you the ‘source URL’ of any canonicals that are not a 200 response (i.e. a 3XX, 4XX or 5XX), and let you know if they are ‘unlinked’ (meaning the only way we found the URL was via a canonical – it doesn’t have a proper link on the site to the URL).
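For anyone curious, the canonical check boils down to pulling the href out of the rel="canonical" link tag and then status-checking that URL like any other. A minimal sketch of the extraction step (illustrative names only, not the SEO Spider’s code):

```python
from html.parser import HTMLParser

class CanonicalExtractor(HTMLParser):
    """Pulls the href out of <link rel="canonical" ...> so the canonical
    target can be status-checked like any other URL."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        if tag == "link":
            attrs = dict(attrs)
            if (attrs.get("rel") or "").lower() == "canonical":
                self.canonical = attrs.get("href")

def canonical_url(html):
    """Return the page's canonical URL, or None if it has no canonical tag."""
    parser = CanonicalExtractor()
    parser.feed(html)
    return parser.canonical
```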

Great tool! Not only is it good for checking broken links (as described in the article), but it’s also been very helpful with identifying many other issues with clients’ websites. Keep up the good work, Screaming Frog!

When I use Google Search Console / Webmaster Tools, it reports that I have 138,000 broken urls on my website. When I did a crawl using ScreamingFrog and filtered for 404 errors, I find fewer than 2,000. What explains this discrepancy?

I think Screaming Frog is one of the best software out there for on-page analysis. Thank you. One question: how does it find broken links pointing to a website (404 errors)? Does Screaming Frog use third-party software like Ahrefs? Thanks

In my case I extract a big list of errors from Google Search Console, and I upload this list to Screaming Frog to check its status, but if I want to know which URLs those errors come from, I can’t.
I can only find out which URL an error came from if I crawl the entire site, but for websites with a huge number of URLs that is too complicated….

Hi guys, I use the Screaming Frog tool to find attractive expired domains, which I use to build my PBNs. We can also use found domains to get some strong BLs by 301 redirection. SF seems very useful for searching for expired links on edu or gov websites. Regards, Pawel

Hello, I am a digital marketer who is new to your tool – great job, by the way. I am currently trying to make my way through your user guide. However, one question I cannot find the answer to is: why does the tool report 404 errors for some URLs with ” at the end? I have checked the corresponding pages on the back end with WordPress and they don’t seem to exist. Any idea why?

OK, I’m really impressed with this 404 broken link checker. It really helped us regain our search engine ranking, which dropped as a result of a menu link change that rendered many pages on our site unavailable, but we were able to find all the broken links using Screaming Frog.

By the way, I’ve got to say this app is really fast compared with other tools I have used in the past, too.

Can you add an option to find 404 pages from the Google SERP? For example, if I search Google for “site:ry.ae”, I see a lot of 404 pages crawled. It would be a great and amazing help for us SEO experts to analyse crawled pages from Google directly.

Just had to say thank you for Screaming Frog, and this tutorial.
I’ve been scratching my head a little when it comes to tracking down broken links.
This tutorial was a great way to discover why so many SEOs whom I respect are always recommending Screaming Frog.

May I know if your tool can do the following?
1) Search for broken links on a website using different IP addresses? The reason is that my site has links that redirect users to different links based on their country, so I would need to check whether the various redirects work when users click on the link.
2) Search hidden pages on my website for broken links.

Great Tutorial. Thank you very much for detailed guidance. It is very helpful. I love the screaming frog tool for finding broken links and doing keyword research on seasonal pages, really speeds up the process. :)

When in ‘Spider’ mode, looking at the directories, I see ‘https’ & ‘http’ folders with the structure of the site and all the same links.
Shouldn’t it show only ‘https’, since the website has switched from http and is now running only on https?

I want to thank Screaming Frog because it has helped me a lot to determine which of my content is duplicated and which URLs should be removed permanently. With the help of your software, I solved my problems related to duplicate content, which were flagged by a Google AdSense reviewer. Thanks again to Screaming Frog.

When I first came to your software it was a little bit difficult to work out where my problem would be solved. I spent a lot of time understanding the structure of your software, but finally I solved my problems.