
202 error page blocked in robots.txt versus using a crawlable 404 error page

We currently have our error page set up to return a 202 status, and it is unreachable by search engines because it is blocked in our robots.txt file. Should the error page return a 404 status instead and be reachable by search engines?

Is there more value in, or is it better practice to use, a 404 rather than a 202?

We noticed in our Google Webmaster account that a number of broken links point to the site, but the 404 error page was not accessible to crawlers.

Any insight would be great, and if you have any questions, please let me know.

2 Responses

A 202 isn't actually an error at all. It's "Accepted," a 2xx success code, so serving your error page with a 202 tells search engines the page exists and is fine, which miscategorizes it. A 404 correctly tells them the page doesn't exist, which is better. Best of all, where a similar, relevant page exists, redirect the broken URL to it with a 301.
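If it helps to see why the 202 is misleading, the status-code classes are encoded in Python's standard library; here's a quick sketch (no real URLs involved, just the code definitions):

```python
from http import HTTPStatus

# 202 sits in the 2xx "success" range: the request was accepted,
# so an error page served with a 202 looks perfectly healthy to a crawler.
print(int(HTTPStatus.ACCEPTED), HTTPStatus.ACCEPTED.phrase)   # 202 Accepted
print(200 <= HTTPStatus.ACCEPTED < 300)                       # True

# 404 sits in the 4xx "client error" range: the URL does not exist,
# which is exactly what search engines need to hear for a dead link.
print(int(HTTPStatus.NOT_FOUND), HTTPStatus.NOT_FOUND.phrase)  # 404 Not Found
print(400 <= HTTPStatus.NOT_FOUND < 500)                       # True
```

So a crawler hitting a 202 error page indexes it as a live page; a 404 (or a 301 to a relevant page) is what actually cleans up the broken URLs in Webmaster Tools.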
