
Slide 16
overdose.digital // @jasonmun // #smssyd18
• Every website is allocated a finite crawl budget
• The budget is allocated based on the technical capacity and
popularity of the site
• You want Google to crawl your important pages more
frequently – not crappy duplicated or thin pages
https://webmasters.googleblog.com/2017/01/what-crawl-budget-means-for-googlebot.html
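One common way to protect crawl budget is to block known thin or duplicate URL patterns in robots.txt. A minimal sketch – the parameter names (`sort`, `sessionid`) and the `/search/` path are hypothetical examples, not from the deck:

```
User-agent: *
# Block faceted/sort parameters that generate thin duplicate pages
Disallow: /*?sort=
Disallow: /*?sessionid=
# Block internal search result pages
Disallow: /search/
```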

Slide 24
• It confuses the hell out of search engine bots
• You have no control over which version gets indexed and
displayed in SERPs
• If you are checking your SERPs in Google and notice
Google's "omitted results" message, chances are you have an
internal duplicate content issue:
Source: Moz.com
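The usual fix is a rel="canonical" tag telling Google which version to index. A minimal sketch, assuming the duplicates should consolidate to a hypothetical https://example.com/page/:

```html
<!-- In the <head> of every duplicate variant of the page -->
<link rel="canonical" href="https://example.com/page/">
```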

Slide 37
• Pagination is common on all e-commerce websites
• Instead of showing all of your products at once, they are split
across multiple pages
• Most of the time, these elements remain consistent across the
paginated pages: title tag, meta description, H1, text content
• Hint: that makes them duplicates… or near duplicates…

Slide 38
• Google launched support for the pagination tags
rel="next" and rel="prev" in 2011
• They indicate to Google that a collection of pages is
part of the same paginated series
• They are a strong hint to Google to consolidate indexing
properties, such as links, across the series
https://support.google.com/webmasters/answer/1663744?hl=en
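In practice the tags sit in the `<head>` of each page in the series. A minimal sketch for page 2 of a hypothetical /category/ listing (the URLs and `page` parameter are illustrative):

```html
<!-- <head> of https://example.com/category/?page=2 -->
<link rel="prev" href="https://example.com/category/?page=1">
<link rel="next" href="https://example.com/category/?page=3">
```

The first page of the series carries only rel="next", and the last page only rel="prev".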

Slide 45
Disable JavaScript in your browser; if NO pagination links are
available, it is a problem
Check the source code for the pagination tags
rel="next" and rel="prev". If they are missing, it is a problem
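The source-code check can be automated. A small sketch using Python's standard-library HTML parser – the sample markup and `has_pagination_tags` helper are illustrative, not from the deck:

```python
from html.parser import HTMLParser


class PaginationTagFinder(HTMLParser):
    """Collects rel="next" / rel="prev" <link> tags from page source."""

    def __init__(self):
        super().__init__()
        self.found = set()

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") in ("next", "prev"):
            self.found.add(attrs["rel"])


def has_pagination_tags(page_source: str) -> bool:
    """True if the page declares at least one pagination tag."""
    finder = PaginationTagFinder()
    finder.feed(page_source)
    return bool(finder.found)


page = '<head><link rel="next" href="https://example.com/?page=2"></head>'
print(has_pagination_tags(page))            # True: rel="next" is present
print(has_pagination_tags("<head></head>"))  # False: missing tags -> a problem
```

Run it against the raw HTML of a category page (e.g. fetched with `urllib.request`) rather than the browser-rendered DOM, since the point is what Google sees in the source.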

Slide 54
• Google can only rank a website that it can crawl
• Country-specific websites will struggle to rank
• The result: VERY little traffic and low indexation
(e.g. the .com site redirecting every visitor to .com/au/)

Slide 57
https://support.google.com/webmasters/answer/1061943?hl=en
• Exclude search engines from the redirect rule
• Make the exception based on user agent
• This allows Googlebot to crawl and index all country-specific
websites
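As a sketch, the exception might look like this in Apache's mod_rewrite – the GeoIP variable (requires mod_geoip), the AU country code, and the /au/ path are all hypothetical and should be adapted to the actual redirect rule:

```
RewriteEngine On
# Skip the geo-redirect for search engine crawlers, matched by user agent
RewriteCond %{HTTP_USER_AGENT} !(Googlebot|bingbot|Slurp) [NC]
# Hypothetical GeoIP condition: only redirect Australian visitors
RewriteCond %{ENV:GEOIP_COUNTRY_CODE} ^AU$
RewriteRule ^(.*)$ /au/$1 [R=302,L]
```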