The Moz Q&A Community


canonical for stupid _GET parameters or not? [deep technical details]

I'm currently working on www.kupwakacje.pl, which is something like a travel agency: people can search for holidays and buy or reserve them. I know about plenty of problems on my website, and thanks to SEOmoz I will hopefully be able to fix them, but one is crucial and, I think, quite hard to fix. The search engine is provided by an external party in the form of a simple API that ultimately responds with formatted HTML, which is completely pointless, but that's not the main problem. Let's dive in:

So, for example, a visitor goes to the homepage, selects Egypt, and hits the search button. He will be redirected to

'wczasy-egipt' is my invention, obviously, and it means 'holidays-egypt'. I've tried to have at least 'something' in the URL that makes Google think it's related to Egypt. The rest, the complicated ep3[] thing, is a bunch of encoded parameters. In the first step this renders a list of hotels, in the next one a hotel-specific offer, and in the next one the reservation page. The problem is that all the links generated by this so-called API only change sub-parameters inside the ep3[] parameter, so, for example, clicking on a single hotel changes the URL to:

which obviously doesn't look very different from the first one. What I would like to know is: should I make all pages starting with 'wczasy-egipt' rel-canonical to the first one (www.kupwakacje.pl/wczasy-egipt), or shouldn't I? Google recognizes the page according to Webmaster Central, and recognizes the URL, but reports mass duplicate content. And what about positioning my website for the hotel names, i.e. long-tail optimization?
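For reference, the rel-canonical you're describing is a single tag served on every parameterized search URL, pointing back at the clean landing page (a sketch; /wczasy-egipt is from the question above, the rest is generic):

```html
<!-- In the <head> of every page whose URL starts with /wczasy-egipt,
     regardless of the ep3[] query string it was requested with: -->
<link rel="canonical" href="http://www.kupwakacje.pl/wczasy-egipt" />
```

One caveat on the long-tail question: canonicalizing every hotel page to the one Egypt URL tells Google those pages are duplicates of it, so they would effectively drop out of the running for individual hotel-name queries.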

I know it's a long and complicated post; thanks for reading, and I would be very happy with any tip or response.

4 Responses

OK, I will try all of this advice, to be honest. I'm 99% sure I can't do much about the GET parameters, but I will check.

The second idea, making some kind of static pages and loading the search response into them with an iframe, seems really nice and is definitely doable. I will dive into that.

The third one is the most obvious, but I doubt I will manage to do it (even though I'm really not a bad developer ;)). There are about 30 parameters that would probably need to be rewritten. It might be better to rewrite just a few main ones (like which step the user is at, the destination, the hotel, etc.). But can Apache decode JavaScript? :> Hmm...

Yeah, the iframe idea seems to be the easiest to implement and would give you a nice amount of control over both the URLs and the content on the pages. Generally Google tries to avoid indexing other sites' internal search results pages, so if you can add content around the iframe that helps make those search pages unique, that will help.

"Google has often taken a dim view of internal search results (sometimes called “search within search”, although that term has also been applied to Google’s direct internal search boxes). Essentially, they don’t want people to jump from their search results to yours – they want search users to reach specific, actionable information.

While Google certainly has their own self-interest in mind in some of these cases, it’s true that internal search can create tons of near duplicates, once you tie in filters, sorts, and pagination. It’s also arguable that these pages create a poor search experience for Google users.

The Solution

This can be a tricky situation. On the one hand, if you have clear conceptual duplicates, like search sorts, you should consider blocking or NOINDEXing them. Having the ascending and descending version of a search page in the Google index is almost always low value.

Likewise, filters and tags can often create low-value paths to near duplicates.

Search pagination is a difficult issue and beyond the scope of this post, although I'm often in favor of NOINDEXing pages 2+ of search results. They tend to convert poorly and often look like duplicates."
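The NOINDEX recommended above for sorts and pages 2+ of search results is just a meta tag in the page head (a minimal sketch; the "follow" keeps crawlers following links on the page even though the page itself stays out of the index):

```html
<!-- On sort variants and paginated search pages (page 2 and beyond): -->
<meta name="robots" content="noindex, follow">
```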

First, I'd look for a way to shorten the URL via the API. There are a TON of blank variables in that URL so I'm guessing the API has everything turned on, even though you're not pulling results for all those variables. If you can, get it to return data on only the things being searched for.
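As an illustration of how much of such a URL can be dead weight, here is a small Python sketch that drops every blank query variable before rebuilding the query string. The parameter names are hypothetical, since the real ep3[] keys aren't shown in the thread:

```python
from urllib.parse import urlencode, parse_qsl

def strip_blank_params(query: str) -> str:
    """Rebuild a query string, keeping only parameters that have a value."""
    pairs = parse_qsl(query, keep_blank_values=True)
    kept = [(k, v) for k, v in pairs if v != ""]
    # urlencode percent-encodes characters like [ and ] in the keys
    return urlencode(kept)

# Hypothetical example: most variables the API emits are empty.
q = "ep3[country]=egypt&ep3[hotel]=&ep3[stars]=&ep3[adults]=2&ep3[child]="
print(strip_blank_params(q))
```

Of course this only helps if the API accepts the shortened query; the point is just that the blank variables carry no information.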

Next, if the API is just too unmanageable, I'd look into building static pages that pull search results into them via an iFrame. That way you could control all the URLs and content for several hundred popular searches, have nice clean URLs, but still have the dynamic search results as a portion of the page.
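A static landing page along those lines could be as simple as this sketch (the /wczasy-egipt path comes from the question; the iframe src and the surrounding copy are placeholders, not the site's real API endpoint):

```html
<!-- Static, crawlable page served at a clean URL such as /wczasy-egipt -->
<h1>Holidays in Egypt (wczasy w Egipcie)</h1>
<p>Unique editorial copy about Egypt trips goes here, so the page
   is not just a frame around someone else's search results.</p>
<!-- The dynamic results from the third-party API load inside the frame: -->
<iframe src="http://example.com/search-api?country=egypt"
        width="100%" height="800"
        title="Egypt holiday search results"></iframe>
```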

A last option, if possible, would be to set up URL rewrites to change the popular searches into normal-sounding pages, but that could be difficult and could break if the API changes suddenly or throws more random variables into the mix.
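A rewrite along those lines might look like this in Apache (a sketch only; the internal search.php endpoint and its country parameter are assumptions, not the site's real API):

```apache
# .htaccess sketch: map a clean URL onto the parameterized search endpoint.
# /wczasy-egipt -> /search.php?country=egypt (internally; visible URL unchanged)
RewriteEngine On
RewriteRule ^wczasy-egipt/?$ /search.php?country=egypt [L,QSA]
```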
