Google's Sandbox Still Exists: Exemplified by Grader.com

The author's views are entirely his or her own (excluding the unlikely event of hypnosis) and may not always reflect the views of Moz.

For many in the SEO field, Google's "sandbox," a filter the search quality team created to help fight spam, is a relic of days gone by. However, we've been spotting new cases over the last few years, and I finally found a great example to share publicly (and got permission from the site owner). Grader.com, and the four subdomains underneath it - Twitter.Grader.com, PressRelease.Grader.com, Website.Grader.com & Facebook.Grader.com - all show pronounced effects that highlight this algorithmic filter's impact.

How do we know? There are a few common signals:

The domain is relatively new (usually less than 1 year old, but sometimes as old as 2)

Pages on the site are unable to rank for even exact title tag matches

Other sites/pages outrank it for search queries with clear navigational intent

A temporal cycle in which the site initially ranks quite competitively for relevant queries, then suddenly drops off 30-500+ ranking positions

Clearly, Google is penalizing or filtering this site in some fashion. They've got the pages in the index, but the pages can't rank for exact title tag matches, while the other engines show them consistently in top positions. The homepage is a PageRank 5 (according to the toolbar) and shows ~450 unique domains linking to it. It's clearly a well-regarded site that's earned natural, editorial links for providing valuable content, but Google's "sandbox" algorithm is restricting the site to search obscurity. All the subdomains on grader.com - website.grader.com, pressrelease.grader.com & facebook.grader.com - perform similarly, suggesting this penalty is tied to the root domain, not any single subdomain.

The big frustration for site owners is how to escape this purgatory. As with most penalties from search engines, there's no clear path to resolving the issue - I've talked to Dharmesh Shah, who runs Hubspot (which owns grader.com), and there's been no messaging in Google's Webmaster Tools. Grader.com also experienced the common "sandbox cycle" of initially ranking, then losing their search traffic and seeing even exact match queries show the site in positions 50+.

I don't have any foolproof methodology to work your way out of the box, but we have noticed (through many painful campaigns for clients and Q+A with PRO members) that some common themes emerge:

You never "pop out" alone - it seems that Google has certain internally triggered events where a bundle of sites suffering from this issue all "emerge" to their expected rankings on the same day. You'll sometimes see forum threads and chatter about these "sandbox releases."

Earning higher quantities of links, particularly from trusted sources, seems to be a common path to emergence, though grader.com certainly has its share of excellent links from places like C|Net, MSDN, Mashable, ReadWriteWeb & more.

It seems you can lengthen your stay in the box by exclusively attracting the more typical "low quality" links that signal manual link building campaigns - such as built-for-SEO directories, article submission sites, reciprocal links, dofollow blog comments, forum signature links, etc. (this is speculation on my part, and more correlation than causation, IMO).

Re-consideration requests through Google's Webmaster Tools appear to sometimes (but rarely) help, and it's always hard to say if the request itself had any action above and beyond what would have happened normally (as the communication system provides no feedback).

In the SEO world, the speculation is that Google created this filter to combat the rise of newer sites spamming the index, and while it's certainly been effective for that (think back to how new, very low quality sites could rank from 2001-2004), there's a case of throwing out the baby with the bathwater. The best we can do is be aware of the issue, mindful of how it might affect website launches, and tuned in to its evolution for signs of how to avoid and fix it. As always, I'd love your feedback about how the sandbox has affected your sites (or those of your clients) and any tactics you've used to successfully escape.

p.s. On a personal note, the "sandbox" was really what brought me into the SEO community more formally and in a participatory role. In 2004 & 2005, while fighting against it for two of our clients, I began blogging, creating tools and posting in forums hoping to figure out the problem. I fought for a long time to gain any credibility (as most established SEOs did not believe in its existence until later on), but owe a debt of thanks to Aaron D'Souza from Google who, during a meeting at my first SES San Jose conference, confirmed its existence and effect (though he was naturally coy about revealing any particular details). He also indicated that inside Google, it had a different name (though I still don't know what that was/is). I still feel a great thankfulness whenever I think about Aaron and that event, as it completely changed my mentality & focus (from proving its existence to actually finding ways out).

p.p.s. Update in October 2009: Grader.com's content was moved to WebsiteGrader.com and they suddenly rank normally for all their content. Though others may still disagree, I'm tempted to say that's the final nail in the "does the sandbox exist" coffin. Whatever penalty/filter was on Grader.com doesn't apply to the content or the links, just the domain. Moving content to an un-boxed site has solved this issue time and again (Grader.com is just the latest public example).

Hey Rand, it's my rule of thumb, so honestly it's really, really accurate ;) - others can disagree, but if you give free links away you run the risk of getting shafted. Don't get me wrong, I'm not having a go at Grader, but I think they have a bad footprint, that's all.

I would Nofollow the OBL's on the http://twitter.grader.com/davenaylor

I can already hear ..

a) FFS Dave has given another one away

b) quick get a script written

The badge scam thing wasn't directed at SEOmoz's badges or Grader's badges, and sorry if I implied that. But remember, I still call them badges of dishonour - I got offered £200 to run one last week on a gadget blog I own, and that's the 6th one this year lol. I just see them as paid links :) sorry, that's my twisted viewpoint.

No worries Dave - I know you didn't mean to indict us or Hubspot, I guess I've just come across so few badges that anyone paid for and so very many that are organic that it seemed a weird "rule" to me.

Nice catch on the dup content, though - certainly something to address (though many sites have it far worse and can still rank for their exact title tag matches).

I was wondering last night why that swap from websitegrader.com to website.grader.com decimated their search traffic the way it did. It seems pretty strange to me - there's still plenty of branded search traffic looking for the tool that probably can't find it via Google :(

Second, check the Y link data (true, it's clearly a different dataset to G's, but it is indicative of what's going on) - 200k+ links for the whole domain, but sub-1k to the root. And 213k to twitter.grader.com (whole site), but only 13k to the twitter.grader.com root? I don't know what the graph of d(links)/dt looks like, or the second derivative graph, but I bet they spike like hell.

Third, the grader.com homepage image is a cross-domain call from uitest.hubteam.com? Which appears to be an uncrawled subdomain of a badly regarded root domain. And that same page contains one line of text not involved in nav markup, then 4 links to sub-domains? With fat anchor text?

What exactly is it about this domain that a search engine is supposed to like? Every signal of quality I looked at is transmitting "SPAM!" on multiple frequencies, and with such amplitude, my eyes are ringing. I can't say it any better than this

If it ranks in Y / MSN, that's simply further evidence, if such was needed, that their respective algos suck (still). MSN has a definition of onpage relevance my 4 year old could spot the flaws in, and Y apparently still can't handle temporal analysis, despite having the widest selection of crawl data in the world via various purchases (maybe none of it's got historical timestamps, I dunno - my tame Y engineer doesn't work with crawl data), and is still vulnerable to sitewides.

Lastly, if I ever see toolman again, I think I'll slap him for ever mentioning the word "sandbox", then cry for all the pain he's caused with that one post
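The link-velocity spikes mentioned above (d(links)/dt and its second derivative) can be sketched as simple finite differences. This is only an illustration - the monthly link counts below are made-up numbers, not Grader.com's actual data:

```python
# Finite-difference sketch of d(links)/dt and its second derivative.
# The link-count series is purely hypothetical, for illustration only.

def diffs(series):
    """Return first differences of a numeric sequence."""
    return [b - a for a, b in zip(series, series[1:])]

# Hypothetical cumulative link counts per month for a new domain
links = [50, 120, 300, 9000, 9200, 9300]

velocity = diffs(links)         # d(links)/dt: links gained per month
acceleration = diffs(velocity)  # second derivative: change in growth rate

print(velocity)      # [70, 180, 8700, 200, 100]
print(acceleration)  # [110, 8520, -8500, -100]
```

A sudden viral burst shows up exactly as TallTroll suggests: one huge positive spike in the first derivative, bracketed by a spike and a crash in the second - the kind of temporal signature an engine could flag.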

This is a solid analysis of what I think may be an algo penalty that presents as though the patient website was sandboxed.

FWIW, I like playing with the site a lot and use it to scan for new potential Twitterers.

I would think that between the Hubspot Board and the actual consultancy that the redirect and link building issues would have been addressed.

To Rand's comment about sandboxing, yes, I absolutely believe it still goes on, but I believe some sites skate through while others get nailed. And if we talk about "signals and cues to search engines" (which is how I explain this business to small biz owners), I think our entire industry got a flashing red signal with a siren blaring on duplicate content and canonical issues. I would say that any new site that shows strong on-page SEO, lots of cross-linking to domains on its own servers and doesn't address canonical issues is going to be blasted.

Here is the odd thing. The Twitter grader subdomain is probably strong enough riding the Twitter Viral Train to get amazingly good traffic. I'm pretty sure I know why they used a subdomain, but hmmm, I wonder if there would have been any impact as a subdirectory?

Good points. We should have been better about some of the core SEO issues. This was a bit of an experimental project and it ended up just taking off. I spent too much time on coding the features instead of thinking (as I should have) more about the SEO.

The rationale for the sub-domain was that the name of the site "reads" better. We wanted to move away from new domains for each app in the family (TwitterGrader.com, WebsiteGrader.com, etc.) to at least have a common base domain: website.grader.com, twitter.grader.com - as in our experience, getting sub-domains to rank is a much easier process. And we expect to launch several more "graders" over time.

The rationale for sub-domains vs. sub-folders was more out of user-friendliness. We felt that for an application called "Twitter Grader", the URL twitter.grader.com would make more sense than "grader.com/twitter". It just "reads" better as a sub-domain.

Based on all the comments here, I'm making another go at getting some of the SEO issues cleared up. That, combined with some other powerful links that came in today (including the one from SEOmoz) should help things along.

That is really possible, but it's ridiculous if Google marked them as a phishing scheme. A site with a lot of inbound links from all over the world, a site without (probably) any phishing complaints, a site which is (probably) bookmarked hundreds or thousands of times at Google Bookmarks etc. ... how could this site be algorithmically penalized? Yes, theoretically it's possible, but in this circumstance Google's algorithm really sucks.

Many of them use every link building tactic possible to accumulate traffic, including attempts at viral e-mail and social media notes to get thousands of people to link to a fake login. The proliferation of social media linking to these sub-domains is also exactly what most of these schemes aim for.

Alternately, the use of sub-FOLDERS instead of sub-domains gives a more legitimate signal about your service.

There are so many other factors involved in ranking beyond age of the domain that I don't think you can isolate that as the reason these subdomains are having trouble.

Rae brings up a good point (http://twitter.com/sugarrae/statuses/1327578001) that many of the pages may appear to be search results or autogenerated with low content value.

I wouldn't expect this page to rank for the text string in its title tag:

http://grader.com/

as it has little content.

I would advise them to add a few informative articles about "measuring what matters in inbound marketing", for instance.

And the other sites also don't follow many of the fundamental best practices of SEO.

http://website.grader.com/

for instance, has very little content on it to index and no links on the home page to valuable pages. The home page is a form (which would have little ranking value) and the only links are to a login page (little ranking value) and a badge (little ranking value).

They could try adding a few informative articles about what the grading is about and why it's valuable, how it's actionable, etc.

This site seems to have lots of duplicate content issues. All the site reports are public and have very similar information on them.

I think someone in the comments earlier noted that the badge for this site has the link in js, so those aren't counting for PR.

Also, just a quick glance shows that they're not necessarily implementing good technical SEO best practices. For instance, the home page loads from both http://website.grader.com/ and http://website.grader.com/default.aspx

And the "free seo tool" link in the sidebar on the home page (also the only text on the page) is to the non-canonical version of the ... home page.
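The duplicate-URL problem described above is easy to reason about with a small normalization routine. This is just a sketch: the set of default-document names is an assumption (based on the default.aspx example), not anything Grader.com actually configures:

```python
# Sketch: collapse common duplicate-URL variants onto one canonical form,
# so e.g. /default.aspx and / are treated as the same page.
from urllib.parse import urlsplit, urlunsplit

# Assumed default-document filenames that duplicate the directory URL
DEFAULT_DOCS = {"default.aspx", "index.html", "index.php"}

def canonicalize(url):
    """Lowercase the host and strip default-document filenames."""
    scheme, netloc, path, query, _ = urlsplit(url)
    netloc = netloc.lower()
    parts = path.rsplit("/", 1)
    if parts[-1].lower() in DEFAULT_DOCS:
        path = parts[0] + "/"
    if not path:
        path = "/"
    return urlunsplit((scheme, netloc, path, query, ""))

print(canonicalize("http://website.grader.com/default.aspx"))
# http://website.grader.com/
```

In practice the fix is server-side (a 301 from the non-canonical variant to the canonical one), but the principle is the same: every internal link should point at exactly one version of each page.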

I'm not sure about the link profile. More than half of the links (10k+) are from the grader.com domain. And some are from pages like this: http://www.online-college-blog.com/

I dunno. I would have to dive a lot deeper into this (what queries do they want to be found for? what pages do they want to rank for those queries? do those pages have the most relevant content compared to the competition? what is the tech infrastructure? what's the link profile? etc.) to have ideas on what might be going on.

If there's filtering or penalties here, I would suspect it's for reasons other than age. (paisleyseo may be onto something...)

I highly agree, one thing hubspot's grader tool websites have always been lacking in is content.

It is honestly surprising that HubSpot expects to rank those domains when they lack the fundamental aspect that 99% of other websites have: textual content.

Honestly, I don't see why Google tolerates any of the SEO companies. They all know we do automated queries.

Let's also not forget how generic a term like "grader" is; this has a huge relational categorical problem. I mean, "grader" and "grade" are highly linked to the education community.

It takes a while to reprogram the internet to make "grader" mean something entirely different than it currently does. So I think that's the main sandbox issue.

If I were to search Google for "facebook grader", I would expect to find a teacher or a teacher's aide talking about being a grader of student papers - not some stupid lame tool (meaning I am not part of the internet marketing community, but in the education industry) telling me how my marketing efforts are doing on Facebook.

Honestly, how on earth do they expect to outrank Twitter, Facebook, etc. for "grader" terms? Those websites get USED! That is another factor in the algorithm Google can, or thinks it can, measure.

Wow, I could rant on and on about the many factors that lean toward there not being a sandbox issue here, but rather a case of Google and the internet saying other websites are more relevant for certain searches.

I can pretty much confirm this with a test site that I made. I wondered if dropping a ton of links into a site might force it out, but like Rand says, it just keeps you in the sandbox for longer. Pretty obvious really, but just wanted to add confirmation.

Also, Google never seems to give you any messages through Webmaster Tools if your site is just sandboxed and not downright penalised/infected with malware.

I have seen some rather confusing times where pages will appear back in the index and rank well for, say, a couple of weeks, and then without any warning will disappear again... extremely frustrating!

I think that brand name mentions (without links) from a variety of unique domains will get you out of the sandbox.

My personal site www.wesupchurch.com (a campaign site) never experienced many of the sandbox effects (even with low quality link building). I didn't receive many high quality links to it, but I received tons of mentions of my name (or brand, if you will). This was because every newspaper in the state and every county clerk published a list of candidates.

I think this goes to show that Google IS paying more attention to brand names.

I've experienced this with several client sites as well, some in the movie industry (getting mentions about their releases) and others in the political field.

But many sites where I've done massive manual link building, are still sandboxed.

I'm testing this theory with some newly registered domain names and will happily report back to the SEOmoz community when I have more solid statistics to back up my claim.

But in what I've noticed with client sites, the domain isn't as important as the mentions for getting out of the sandbox. This might largely be due to Google just considering the buzz factor.

So Microsoft (not that they'd ever be sandboxed)... might be seeking mentions of the brand name Microsoft and not just microsoft.com.

Again this is just a theory, but it goes back to brand building:

Obviously, page position is largely (arguably mostly) determined by backlinks. But my theory here is that brand development (alongside links) will get you out of the sandbox faster. I think that both are important in terms of overall internet marketing.

The Google sandbox is an interesting phenomenon. I understand the motivation behind it, but it seems that in many situations (like with grader.com), there are sufficient signals of quality and non-spaminess that the site should at least be able to rank for non-competitive keywords.

Makes no sense to me that twitter.grader.com (a PR5 and a mozRank of 4.7) can fail to rank well for a search on "twitter grader".

Will be interesting to see how this plays out and how long it takes.

Meanwhile, if anyone has been tracking the frequency of these "sandbox releases", I'd be curious to hear about it.

I wonder if grader.com triggered a bunch of Google's spamminess warnings, precisely because its viral propagation (at least for the Twitter grader) was so rapid.

Regardless, I think it's a mistake for Google not to rank http://twitter.grader.com/ as the top result for searches for "twitter grader" (which lead to the relevant but not optimal http://twitter.com/grader instead). I've noticed this for several months now.

I am about 95% in agreement with Dave Naylor here. This site just suffers from bad SEO!

Could it have anything to do with 50,000 internal pages that are essentially EMPTY results pages???

The concept is interesting enough, but Twitter Grader ignores just about everything an "old school" SEO would advise them to do. Simple things like content on every page (at least the homepage having the words "twitter" and "grader" SOMEWHERE on the page) and a good PR distribution/crawl strategy.

Oh yes, it wouldn't hurt for the link from grader.com to twitter.grader.com to say "Twitter Grader" instead of "twitter.grader.com".

It seems that the branding on this site pretty much ignores SEO as does the content strategy. Good thing it is a social media play, not an SEO driven strategy.

Don't blame a mysterious "sand box effect" until you fix what's obviously bad about the site.

We've found that Google has become extremely slow at identifying and completing 301 redirects. We've got pages that were 301ed 3-4 months ago and are still listed. Eventually the cached version link disappears, but it can still take weeks for the old page to disappear.

And who's to know whether the link juice transferred properly? I would not trust any juice from a 301 that's less than 3 months old, for sure, and maybe older.

Hey guys, just thought I'd throw another example into the mixer, backing up my belief in the so-called sandbox.

Our website is www.epictureframes.co.uk, we launched in May 2009 with a brand new domain name to match.

Whilst I may be a bit biased for obvious reasons, I firmly believe that our website is one of the UK's leaders when it comes to delivering what the user wants when googling the keyword 'picture frames'. After all, this is what Google advises you to do in order to rank well.

After 7 months live we are still only found on page 16 of Google for 'picture frames'. We have over 1,000 pages' worth of quality, relevant content but still rank behind websites containing very little, irrelevant content.

I can see that there are serious flaws in G's algos when this is the case! We have some very high quality links coming into the site from the home pages of other market-leading websites, and still this is the case!

Here's a little note in addition to things said by both DaveN and TallTroll: grader.com is registered for less than a year out.

Expiration date: 10 Feb 2010 05:00:00

Pre-sandbox results were basically the work of domainers buying keyword-rich URLs that they could interlink amongst their other domains (presenting a hearty link graph), but they were always cheap when it came to registering the domain. Often they'd taste for as long as they could and only register the ones that hit for ad revenue. You'd never see the domain registered for more than a year, though.

So, like others have pointed out about cleaning up your own sites' inter-linking and spam signals, register the domain for a few more years. It's hokey, but it's yet another signal.

Yeah, I saw the short time-to-live, but that is on an initial reg in 1999 (I think) - I'm not sure if that would trip the TTL flag or not. I also couldn't be bothered to do the research on ownership history, so I don't know if the domain has had a recent data reset, which would just exaggerate the link data spikes even more

The Google sandbox effect/filter has always caused some angst for SEOs, but I've noticed that the added filtering based on brands in competitive verticals, the paid-linking devaluation schema and an added latency time have made us as marketers (depending, of course, on the specific client vertical) very hypersensitive.

Despite abiding by guidelines, best practices and good logical advice, something doesn't add up. The sandbox should be holding sites in abeyance for a cycle in which quality checks are conducted algorithmically, then releasing them. It should not be holding established sites that are making new marketing efforts and adding valuable content... especially when they have not had the ability to go "viral". Clearly some of the wrong types of exploits or favoritisms get by Google, when they should be sticking to their company mission statement.

All this ranting to basically say, I think they need a new Algo update ;)

TallTroll and DaveN have nailed it. There always seems to be a solid reason for why a site is penalized. Does this bring up the old question: why isn't Google more forthcoming or transparent with its info on webmaster tools?

>> why isn't Google more forthcoming or transparent with its info on webmaster tools?

Because such transparency would be abused. As much as Matt's close-lipped stance on stuff can be an irritant on occasion, I totally understand it. And if the console delivered good info on bans/penalties, there would simply be a rush of sites stepping up spammy activity to determine thresholds and timescales for responses.

If it was known exactly what percentage of spammy links tripped a penalty, how many sites would there be with (x% - 1) links in the world? And all of them would be above you... That's not a Google any of us want to see, really.

On the subject of the sandbox: when I started to learn SEO, two of my websites were put in the sandbox. They came out of it in a month. At that time I observed that my domain's rankings reached the 1st page within a month, and then suddenly it was gone from Google's SERPs. That's when I started doing proper, quality link building, which was helpful to me in the case of the sandbox.

I also observed that sometimes Google puts only a single URL in the sandbox, if you are getting a great many links to that particular URL.

In our experience, when a domain transfers to a new owner, with clearly different content on it from what was there, it gets treated as "new" and goes through the sandbox process much like a brand-new domain.

For domains that are new but just a "rebranding" (only the domain name changes, no content change), the transfer of link value through the 301 still took a few months, and rankings were lost during that period.

I'm in the sandbox right now, and it is driving me crazy!!! I have a blog about chicken coops ("kippenhokken" - it's in Dutch), but here in the Netherlands too, Google can drive you mad. I will try to get better quality links. Have you got any tips on how to get more natural links?

Great article.
I feel much less alone with my current situation.
I transferred a blog domain to a new domain (http://lifeinthefastlane.com). No longer found in search, although still being crawled.
I tried everything with SEO, re-organising the site etc., all to no avail.
Only 6 months in, and we have all pages ranking well on Yahoo and Bing - all on the front page at #1 or #2 - but on Google the same pages list at #180.
So disappointing to have so much good content not listed on Google, with other sites with little or very poor quality content ranking ahead in search.
Having read all the comments above - I can see we are not alone!
Thank you so much for the article

Oct 12, 2010: our website drops down in Gg rankings (but is OK in Bing and Yahoo). Same day: I receive a 75€ free card for AdWords from Gg... I checked everything on the website (seems OK), but I didn't use the AdWords card.

Dec 12, 2010: the website is back in the (good) positions: cool.

Jan 12, 2011: it drops down again (Bing & Yahoo OK). AdWords card received again (in my mailbox), same amount... I didn't use it. My website is clean; there's no cheating or whatever.

So what can I say from my own experience? "The Google sandbox effect" is only a matter of $$: Google makes money with AdWords. Who needs AdWords the most but the small companies who suddenly have their websites go from the first page to nowhere? Think of it. Bye for now.

So I just got done reading all of this post and all the comments (whew)...

Question is the following: When Google does all of its updates (Farmer, etc.), is that when the new "batch" of websites becomes sandboxed? Is there a mythical timeframe or does it happen daily/weekly for sites to become sandboxed?

Is the sandbox filter becoming more prominent again? I have noticed over the last 6 weeks that it seems harder now to get a new domain to rank even for its own name if the keywords are spaced ("your website name") vs. one merged word ("yourwebsitename"). Normally I would expect new sites to rank quite high for their name quickly, but lately I have struggled to get new sites up there.

I stumbled on this post because, guess what, I was doing some research on the sandbox effect, given that I suspected we started suffering from it a few days back.

After reading the article I am 100% sure! We launched the website at the beginning of the year and did some promotion at launch, but with not much effect on rankings (the website is Italian and targets medium-tail keywords). Then we did more campaigns before the summer, and over the last 2-3 months several keywords went to page 1 and traffic started to flow in nicely. The content on the website is very good, as the small bounce rate shows (around 50% in a very competitive arena). Then, all of a sudden, all the main keywords dropped from the Google rankings (while they held their rankings on Yahoo and Bing), even though the pages are still indexed, as very focused phrase and branded searches show.

Important to say that these are all commercial keywords. We did all the types of promotion you can do when you are young and no big website will link to you. It is easy to say "acquire good quality links", but in an online world where reputable websites are scared by Google or simply do not care about small players, getting relevant links from reputable websites is hard. In my experience (broader, as SEO for other larger websites), you can write good and useful content as much as you want, but getting "unsolicited" links is a slow and very unpredictable process.

It would be great and very helpful if anybody else who has been moved into the sandbox when they were previously ranking can share their experience with us!

The sandbox can be fatal for those who have a branded domain so centrally tied to their web presence and identity. (However, any wise entity that relies on that ought to have registered a handful of domains to protect their brand regardless - and should have alternate domains they can utilize if and when needed.)

For others: cut your losses, build a new, clean site on a fresh domain and get on with your life - and business. The penalized domain may still earn its keep on Bing/Yahoo and other tertiary links regardless.

You can kneel in Goog's reinclusion confessional and beg, hope, wait. Or think you can 'link your way out'. And maybe you can. But if you're stuck on page 5 of the SERPs and beyond for months on end with a domain over a year old, cut your losses and move on. Maybe it will magically rise out of the ashes someday, but the opportunity loss in the interim isn't worth the sacrifice.

I thought I had done a considerable amount of work in making sure not to have "bad SEO", but looks like I missed a few spots.

I'm not sure there are duplicate content issues (if you saw any, would appreciate your pointing them out).

I've gone through just about all the links and ensured that I'm not giving away clean (dofollow) links where we shouldn't be. That should be fixed.

As for the badges problem, I don't see this as being a "scam". Badges are installed at a user's discretion (they're not auto-installed as they would be for a theme/skin or something else). So, the user is aware of them and makes a choice. My understanding was that as long as users are not "tricked" into having badges on their site, it's OK.

On the topic of badges, I'd expect those backlinks to get discounted heavily (makes sense), but do you think they actually hurt? Should I put a no-follow on the badge code that I provide to users? It'll feel a little weird no-following my own links. :)

In any case, thanks for your help. Will get some of that stuff cleaned up.
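For what it's worth, if HubSpot did decide to nofollow the badge embeds, the change is mechanical: add rel="nofollow" to the anchor in the embed snippet users copy. A sketch of that post-processing step, using a made-up badge snippet rather than any real Grader badge code:

```python
# Sketch: add rel="nofollow" to anchor tags in a badge embed snippet.
# Regex-based and deliberately simple; real badge markup may differ.
import re

def nofollow_links(html):
    """Add rel="nofollow" to every <a> tag that has no rel attribute."""
    def patch(match):
        tag = match.group(0)
        if "rel=" in tag:
            return tag  # leave existing rel attributes alone
        return tag[:-1] + ' rel="nofollow">'
    return re.sub(r"<a\b[^>]*>", patch, html)

# Hypothetical badge embed code
badge = '<a href="http://twitter.grader.com/"><img src="badge.png" alt="Twitter Grader"></a>'
print(nofollow_links(badge))
# <a href="http://twitter.grader.com/" rel="nofollow"><img src="badge.png" alt="Twitter Grader"></a>
```

The design question from the thread remains a judgment call: nofollow forfeits any link equity from the badges, but it also removes the "free clean links with no editorial process" footprint Dave is objecting to.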

"Rule of thumb: if you're going to give clean links away with no editorial process, then you deserve to be under a penalty"

It may well be that some sites are penalized for providing clean links without a good editorial process, however, given the hundreds (probably thousands) of examples of sites that do this and clearly have no penalty, it cannot possibly be called a "rule of thumb."

And to suggest their badges are paid is a weird thing to do without evidence. Granted, technically Hubspot's an SEOmoz competitor, but your comments get parsed by some of the field's most important people - unless you've got evidence of this, it seems a bit witch-hunt-like to me. We've got badges on SEOmoz for our quiz and for Linkscape scores and the little ones in the sidebar, but we don't pay anyone for these. In fact, I think if you were to look across the web at all the "badges," you'd find that far under 5% (probably less than 1%) try to pay users to adopt them.

I've only personally launched relatively niche sites and haven't experienced much of a wait at all before I see results in Google, but I've been very mindful of trying to attain only PR 2 and above links unless I feel a link will be a source of legitimate traffic. In other words, I'm not submitting to every PR 0 directory I can find. It'd be nice to see more evidence of whether this is more than correlation, because it would turn the internet on its side if everyone stopped wasting their time submitting to less-than-stellar directories and chasing worthless dofollow comments.

I think grader.com is a good example of why Google should consider removing the sandbox altogether. If a site is bringing in high-PR backlinks, then Google should automatically be taking note and bumping it up the SERPs regardless of its age. Yes, use age and the number of indexed pages as factors, but not so heavily weighted that new sites can't break through quickly if they are truly more valuable (as indicated by a massive number of high-quality, legitimate backlinks).

One of the websites I am working on seems to have been hit by the latest sandbox update, or whatever it is called.

An e-commerce website with ~30 products used to rank on the first page for most of them, but everything changed approximately 10 days ago with a sharp drop in Google rankings.

It is a bit strange that the domain still ranks somewhere in the first 10 pages (mostly 3rd to 5th) for those keywords, but the landing page is now "www.domain.com" instead of "www.domain.com/product.html" as it used to be.

Google's Terms of Service do not allow the sending of automated queries of any sort to our system without express permission in advance from Google. Sending automated queries absorbs resources and includes using any software (such as WebPosition Gold™) to send automated queries to Google to determine how a website or webpage ranks in Google search results for various queries.

Ok, I am scared now. I am a new webmaster (still learning) with a site only 3 months old. As I was totally unaware of this sandbox, it was shocking to find myself a victim of it. At first I was confused and thought my site might have been penalised for something, because it was ranking well and then suddenly this sandbox thing happened. And I couldn't believe it when I was told that my site had been enjoying the so-called Google "honeymoon period." Google forum repliers also told me that this sandbox is normal and that it is there to ensure that each new site fights its way to the top. Some suggested I get backlinks to my site, so I thought it was normal and started planning to get lots of backlinks.

Now my question to RANDFISH and the other SEOs on this great site is:

As my site is new, with only 1 backlink, by natural SEO logic I am supposed to work on getting backlinks, and that is exactly what I was planning to do. But now my site is sandboxed, and RANDFISH says (and CHEWIE confirms) that "It seems you can lengthen your stay in the box by exclusively attracting the more typical "low quality" links that signal manual link building campaigns - such as built-for-SEO directories, article submission sites, reciprocal links, dofollow blog comments, forum signature links, etc. (this is speculation on my part, and more correlation than causation, IMO)."

So what am I supposed to do with my site now? Should I stop working on getting BLs? I guess I should, because those low-quality sources are the only options I have for starting to earn BLs. I can only dream of respected, high-quality sites giving me a link - they wouldn't! And if I stop getting BLs, I will have to sit with 1 backlink until my site comes out of the sandbox (and nobody knows when that will be, for sure). Any help or answers would be much appreciated. Thanks.

But the key is to build QUALITY backlinks. Backlinks that are low quality and look like search engine spam are not going to help you get out of the sandbox. In fact they may keep you there even longer.

It's better to spend your time developing quality content that people will want to link to. Rand did a video a while back explaining the difference between good links and bad links, called "Dude, Your Links Kinda Suck." It's a good place to start, and I'm sure the more you read here at SEOmoz, the more you'll see the difference.

I have a site that is 13 months old. As it was built in a subfolder of a domain, I decided to buy a domain especially for the site and migrated it over about 6 weeks ago. It was a WordPress site, so I just imported the exact same structure and put a 301 redirect on the old site. Everything was going swimmingly for about 4 weeks, and then my traffic crashed. It seems that pages published since the move to the new domain are ranking, but the rankings I had for the pre-switch pages have fallen from page 1 to page 19! As I am still ranking, I assume this isn't a sandbox issue, but it seems to be a similar effect - i.e., a drastic reduction in traffic. Anyone got any ideas on what I might have done wrong and how I could recover my rankings?
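For reference, a whole-site, path-for-path 301 from the old domain to the new one in Apache's .htaccess typically looks something like this (the domain names are placeholders, not the commenter's actual sites):

```apache
# Hypothetical example: permanently redirect every URL on the old
# domain to the same path on the new domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-example\.com$ [NC]
RewriteRule ^(.*)$ https://www.new-example.com/$1 [R=301,L]
```

If the redirects are path-for-path like this, Google should eventually carry the old rankings over, though the reshuffling period after a domain migration can look a lot like the sandbox effect described above.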

>Earning higher quantities of links, particularly from trusted sources seems to be a common path to emergence, though grader.com certainly has its share of excellent links from places like C|Net, MSDN, Mashable, ReadWriteWeb & more.

Here's your answer - that's all there is to it. Google sees fresh content and ranks it for a bit, but if there are no links to support that fresh content, it becomes not-so-fresh after a while, so Google drops the site until it has enough links to rank properly. I have seen people with banned sites claiming they were in a sandbox, and I have seen website owners not doing shite about their sites claiming they were in a sandbox - bullshit in both cases.

As much as I would love to have google's algorithm, I'm glad they keep it a secret for reasons TallTroll described. The Sandbox definitely protects the 'old boys' from the hotshot new sites, but since I deal almost exclusively with optimizing new sites, I think there should be relaxed requirements for sites like grader.com.

Many of you brought up some great points about the horrible SEO of the grader site, and I agree that those should be fixed, especially the dupes and www 301 issues for starters.

Well, we don't have to call it a sandbox, but I'm convinced there's *something* out there that feels an awful lot like the sandbox phenomenon the way people describe it.

I don't think it's a total myth.

Even putting this particular case/example aside, I've experienced similar things before and the behavior seems to be identical to what we're talking about here. Seems to apply only to new domains (or newly transferred domains) and the effect is very, very real.

But, that's just me. We're all entitled to our opinions and theories. Mine just happen to be less informed than most (but I love them just the same). :)

Totally true, it's not a bad functional definition in some ways. The problem is that as the perception of the "sandbox" has grown up, a lot of other mythology has grown up with it, like the perception that new sites *have* to age for a while before they can perform, which isn't the case.

It's completely possible to take a freshly registered site and rank it well within weeks or even days of launch, but if you don't perceive that as a possibility, you're less likely to even try. And then people see that new sites don't rank well quickly, which must therefore be due to the Sandbox...

The primary factor in the sandbox is simply the age of the site. New sites (especially if the content is not frequently updated) will have difficulty ranking for keywords.

If the domain has never been registered or if it was recently dropped, you might experience the sandbox effect.

The rules, however, are not universal, as some sites seem to avoid the effect altogether. It seems to be linked to the difficulty of the keyword: more competitive keywords will produce a longer sandbox effect for new sites than less competitive ones. This helps keep search engine spam down on common searches.

For an SEO community, I don't see many techniques here on how to escape this stupid filter. I also think it is primarily about link velocity, despite what fat gutts says, because I have sites which are hanging on page 70. The question remains, though: you must be able to sandbox competitors by pointing tons of low-quality links at them, because how could Google know it was me doing it?