Some companies use bots to surf competitor websites and automatically collect their prices. This makes a ton of sense. If you want to price competitively, you need to know where your competition is pricing. If you collect this data often, you can detect when your competition changes its prices. And if you want to show the lowest price on comparison sites like PriceGrabber, you need to know how much your competitors charge.

Some retailers don’t want their competitors knowing their prices, so they use technology to make web scraping very difficult. Then the scrapers figure out how to get around it. Then the retailers get better at blocking bots. This data war will go on, maybe forever. Absolutely fascinating.

Unless you’re a high-volume retailer, this probably isn’t directly applicable to you, but there are some great lessons we can take away.

First, you should put together a program to systematically and periodically collect your competitors’ prices. This is not always easy, but understanding your competitors’ pricing is crucial to understanding your buyers’ purchase decisions.

Second, your competitors change prices once in a while. You get to respond, but only if you know it happened. Watch for price changes.
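Once you have periodic snapshots of competitor prices, spotting a change is a simple diff. Here is a minimal sketch; the product names and prices are hypothetical, and in practice the snapshots would come from your collection program rather than being hard-coded.

```python
# Minimal sketch: diff two snapshots of competitor prices.
# The product names and prices below are hypothetical examples.

def price_changes(yesterday, today):
    """Return {product: (old_price, new_price)} for every changed price."""
    return {
        product: (yesterday[product], price)
        for product, price in today.items()
        if product in yesterday and yesterday[product] != price
    }

yesterday = {"widget-a": 19.99, "widget-b": 34.50, "widget-c": 7.25}
today = {"widget-a": 17.99, "widget-b": 34.50, "widget-c": 7.25}

print(price_changes(yesterday, today))  # {'widget-a': (19.99, 17.99)}
```

Run this after each collection pass and you have a basic price-change alert: an empty result means nothing moved; anything else is a change you can respond to.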

Third, web scraping is awesome. Unless you’re in the data-intensive, hyper-competitive world of retail, odds are very good that you can use a bot to scrape your competitors’ prices. Look into it. Several years ago, my company (a B2B company) used a bot to scrape prices from a distributor that published prices on its website. The result was fabulous information for intelligent pricing decisions.
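A basic price-scraping bot is less exotic than it sounds. Here is a minimal sketch using only the Python standard library; the HTML snippet and the `price` class name are assumptions about what a competitor’s page might look like, and in practice you would fetch the page (e.g. with `urllib.request`) rather than parse a hard-coded string.

```python
# Minimal scraping sketch using only the standard library.
# The sample HTML and the "price" class name are hypothetical assumptions
# about the target page's markup; a real bot would fetch live pages.
from html.parser import HTMLParser

class PriceParser(HTMLParser):
    """Collect the text of every element whose class includes 'price'."""

    def __init__(self):
        super().__init__()
        self.in_price = False
        self.prices = []

    def handle_starttag(self, tag, attrs):
        classes = dict(attrs).get("class", "")
        if "price" in classes.split():
            self.in_price = True

    def handle_endtag(self, tag):
        self.in_price = False

    def handle_data(self, data):
        if self.in_price and data.strip():
            # Strip the currency symbol and keep the numeric price.
            self.prices.append(float(data.strip().lstrip("$")))

sample_html = """
<div class="product"><h2>Widget A</h2><span class="price">$17.99</span></div>
<div class="product"><h2>Widget B</h2><span class="price">$34.50</span></div>
"""

parser = PriceParser()
parser.feed(sample_html)
print(parser.prices)  # [17.99, 34.5]
```

For real sites you would point this at fetched pages on a schedule and store each run’s prices, which is exactly the snapshot data a price-change watch needs. Check the site’s terms of service and robots.txt before scraping.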

You probably won’t get into an Internet arms race with your competitors, trying to scrape or hide your prices, but this article reminds us of the power of knowing our competitors’ prices, and that we can likely find them relatively easily. Put together your competitor pricing data collection program.