Introduction

First of all, English is not my first language, so if you find any mistakes, please bear with me. I started this project simply to learn how to develop Windows applications in C#. The application finds the position of your site's URL in Google search results for given keywords. I got the idea when I came across the free Google Monitor application, which does the same thing, and decided to test my C# knowledge by writing an application with the same capabilities.

Using the code

This application can be used as-is as a utility to check Google position. The coding is straightforward: submit a query for a keyword to Google using the URL http://www.google.com/search?q=keyword, then parse the returned page and check it against the given site URL to identify the position. Navigating to the next pages of the search results is just as simple: submitting the same URL with a start offset returns the appropriate page.
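As a rough sketch of the idea (this is a hypothetical helper, not the article's actual code; the class and method names are my own), the query URL can be built from the keyword and a zero-based result offset, and the page fetched in one call:

```csharp
using System;
using System.Net;

// Hypothetical helper: builds the Google query URL for a keyword and a
// zero-based result offset, then fetches the raw HTML. The "start"
// parameter selects which page of results is returned.
class GoogleQuery
{
    public static string BuildUrl(string keyword, int start)
    {
        return "http://www.google.com/search?q=" +
               Uri.EscapeDataString(keyword) +
               "&start=" + start;
    }

    public static string FetchPage(string url)
    {
        using (WebClient client = new WebClient())
        {
            // Identify as a common browser; Google may serve different
            // (or no) markup to unknown user agents.
            client.Headers[HttpRequestHeader.UserAgent] =
                "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)";
            return client.DownloadString(url);
        }
    }
}
```

For example, `GoogleQuery.BuildUrl("c# tutorial", 10)` produces the URL for the second page of ten results, with the keyword properly URL-encoded.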

I used many code snippets from The Code Project itself to implement features such as retrieving HTML pages from the web server and parsing the HTML. You may find some of the functions useful as-is. Again, I am not a seasoned C# developer, so if you feel some part of the code could be done in another way, feel free to send me a mail.
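The parsing step can be sketched roughly as follows. This is an illustrative simplification under my own assumptions, not the article's parser: it treats every absolute link on the page as a result, whereas a real Google page also contains ads and navigation links that would need to be filtered out first.

```csharp
using System;
using System.Text.RegularExpressions;

// Hypothetical sketch of the position check: scan the result HTML for
// href attributes and report the 1-based position of the first link
// that starts with the target site URL, or -1 if it is not on the page.
class ResultParser
{
    static readonly Regex HrefPattern =
        new Regex("href=\"(http[^\"]+)\"", RegexOptions.IgnoreCase);

    public static int FindPosition(string html, string siteUrl)
    {
        int position = 0;
        foreach (Match m in HrefPattern.Matches(html))
        {
            position++;
            if (m.Groups[1].Value.StartsWith(siteUrl,
                    StringComparison.OrdinalIgnoreCase))
                return position;
        }
        return -1; // site not found on this page
    }
}
```

To get the absolute position across pages, you would add the page's start offset to the position found on that page.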

Points of Interest

I wanted to write this utility to support different search engines, but the Yahoo! and MSN result pages contain embedded scripts that need to be resolved to identify the actual URL. If someone has already done this kind of work, I would be happy to see it.

History

This is the first version. I will update this section once I have enough things to mention here.

License

This article has no explicit license attached to it but may contain usage terms in the article text or the download files themselves. If in doubt please contact the author via the discussion board below.

This is something many people will need.
It would be even better with these features:

- Import a list of keywords, for example by dragging and dropping a text file.
- Export results to CSV for Excel analysis.
- For each keyword result, parse the page count returned by Google. Example: "Results 1 - 13 of about 39,800"; in this example, the number 39,800.
- At a later stage, put it in the system tray, schedule scans, and alert on changes in Google position.
- Add a delay of a few seconds between keyword queries, to make Google think a person is searching rather than a spider. Make it a random delay.
- Make sure Google sees your software as if it were MS IE.
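Two of these suggestions can be sketched briefly. This is a hypothetical illustration under my own assumptions (class and method names are invented, and the "Results 1 - 13 of about 39,800" phrasing is taken from the suggestion above):

```csharp
using System;
using System.Text.RegularExpressions;
using System.Threading;

// Hypothetical sketch of two suggested features: extracting the
// approximate result count from Google's "Results 1 - 13 of about
// 39,800" line, and pausing randomly between keyword queries.
class SuggestedFeatures
{
    static readonly Regex CountPattern =
        new Regex(@"of about ([\d,]+)", RegexOptions.IgnoreCase);
    static readonly Random Rng = new Random();

    // Returns the total result count, or -1 if the phrase is not found.
    public static long ParseResultCount(string html)
    {
        Match m = CountPattern.Match(html);
        if (!m.Success)
            return -1;
        return long.Parse(m.Groups[1].Value.Replace(",", ""));
    }

    // Random 2-6 second pause between queries, to look less like a spider.
    public static void DelayBetweenQueries()
    {
        Thread.Sleep(Rng.Next(2000, 6000));
    }
}
```

The random delay matters because fixed intervals are themselves a recognizable pattern; varying the pause makes the traffic look more like a person browsing.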

Isn't Google's motto "Don't be evil"? I somehow doubt that going public is going to let them keep to it. They will have to succumb to their shareholders' needs, which is $$$$. I agree with the post about how Google dissects our pages, and we should be able to do the same. The API is a joke because it limits you to 1,000 queries. That wouldn't be so bad, except they also limit each query to a maximum of 10 results! Go figure. Great piece of code if you ask me.

You should use the Google Web APIs[^] instead of the screen-scraping technique. They have a lot of options to play around with, and you don't have to worry about your utility being illegal, as someone suggested.

Exactly, I fail to see the difference. When Google indexes my site, it too simply fetches my web pages and throws away the ads. It then makes money from the advertising generated by the people using its service to find my site.

Surely it's not unreasonable to do the same to Google's results, provided it's done in a system-friendly manner (i.e., it doesn't hammer their servers with tons of rapid or simultaneous requests, and it doesn't scour hundreds of pages). Put simply, if it doesn't adversely affect their performance, I fail to see any objection.

I fail to see how this can be illegal, any more than some new browser code which doesn't render images.