How to check which URLs have been indexed by Google using Python

There are three main components to organic search: crawling, indexing and ranking. When a search engine like Google arrives at your website, it crawls all of the links it finds. Information about what it finds is then entered into the search engine’s index, where various factors determine which pages to return, and in what order, for a particular search query.

As SEOs, we tend to focus our efforts on the ranking component, but if a search engine isn’t able to crawl and index the pages on your site, you’re not going to receive any traffic from Google. Clearly, ensuring your site is properly crawled and indexed by search engines is an important part of SEO.

But how can you tell if your site is indexed properly?

If you have access to Google Search Console, it tells you how many pages are contained in your XML sitemap and how many of them are indexed. Unfortunately, it doesn’t go as far as to tell you which pages aren’t indexed.

This can leave you with a lot of guesswork or manual checking. It’s like looking for a needle in a haystack. No good! Let’s solve this problem with a little technical ingenuity and another free SEO tool of mine.

Determining if a single URL has been indexed by Google

To determine if an individual URL has been indexed by Google, we can use the “info:” search operator, like so:
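For example, to check a page on your own site, you would search for the operator immediately followed by the full URL (a placeholder URL is used here), with no space after the colon:

```
info:https://www.example.com/page/
```

If the URL is indexed, Google returns information about that specific page.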

However, if the URL is not indexed, Google will return an error saying there is no information available for that URL:

Using Python to bulk-check index status of URLs

Now that we know how to check if a single URL has been indexed, you might be wondering how you can do this en masse. You could have 1,000 little workers check each one — or, if you prefer, you could use my Python solution:
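The original script is not reproduced in this excerpt, so here is a minimal sketch of the approach, not the author's exact code. It assumes a urls.txt input file (one URL per line) and, optionally, the Tor/Polipo proxy described below listening on Polipo's default port, 8123. Note that Google may temporarily block IPs that issue automated queries, which is why the sketch throttles requests and why the article routes traffic through a proxy.

```python
# Sketch of a bulk "info:" index checker. Assumptions (not from the
# original script): input file urls.txt, optional local proxy on
# 127.0.0.1:8123 (Polipo's default port).
import random
import time
import urllib.parse
import urllib.request

from bs4 import BeautifulSoup

PROXY = "http://127.0.0.1:8123"  # set to None to connect directly

def google_info_url(url):
    """Build the Google search URL for an info: query on a single page."""
    return "https://www.google.com/search?q=" + urllib.parse.quote("info:" + url)

def is_indexed(html):
    """Google shows a 'did not match any documents' style message when a
    URL is not in the index; treat its absence as 'indexed'."""
    text = BeautifulSoup(html, "html.parser").get_text()
    return "did not match any documents" not in text

def check_urls(path="urls.txt"):
    handler = (urllib.request.ProxyHandler({"http": PROXY, "https": PROXY})
               if PROXY else urllib.request.ProxyHandler({}))
    opener = urllib.request.build_opener(handler)
    opener.addheaders = [("User-Agent", "Mozilla/5.0")]
    with open(path) as f:
        for url in (line.strip() for line in f if line.strip()):
            html = opener.open(google_info_url(url)).read().decode("utf-8", "ignore")
            print(url, "INDEXED" if is_indexed(html) else "NOT INDEXED")
            time.sleep(random.uniform(10, 30))  # throttle to avoid blocks
```

The random delay between requests is deliberate: firing queries as fast as possible is the quickest way to get the proxy IP captcha-blocked.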

To use the Python script above, make sure you have Python 3 installed. You will also have to install the BeautifulSoup library. To do this, open up a terminal or command prompt and execute:

pip install beautifulsoup4

You can then download the script to your computer. In the same folder as the script, create a text file with a list of URLs, listing each URL on a separate line.
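For example, a urls.txt file (the filename is up to you) might look like:

```
https://www.example.com/
https://www.example.com/about/
https://www.example.com/blog/first-post/
```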

Now that your script is ready, we need to set up Tor to run as our free proxy. On Windows, download the Tor Expert Bundle. Extract the ZIP archive to a local folder and run tor.exe. Feel free to minimize the window.

Next, we have to install Polipo, which lets us use Tor as an HTTP proxy (Tor itself only exposes a SOCKS interface). Download the latest Windows binary (it will be named “polipo-1.x.x.x-win32.zip”) and unzip it to a folder.

In your Polipo folder, create a text file (e.g. config.txt) with the following contents:
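The excerpt cuts off here, but a typical minimal Polipo configuration for chaining to Tor's default SOCKS port (9050) looks like the following; this is the standard Tor-plus-Polipo setup, not necessarily the author's exact file:

```
socksParentProxy = "localhost:9050"
socksProxyType = socks5
diskCacheRoot = ""
```

The first two lines forward Polipo's HTTP traffic to Tor's SOCKS5 listener, and the empty diskCacheRoot disables Polipo's disk cache so responses aren't stored locally.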