Court recently talked about the Google Sandbox on his blog. If you're not familiar with it, read his post; he gives a great introduction. I don't want to be a Monkey See, Monkey Do like RT talked about recently, but I wanted to expand on my comment on Court's post. In that comment I mentioned that the sandbox isn't a simple on/off switch but rather the combined effect of several different flags and algorithms used to limit where new or otherwise suspicious websites appear in search results.
While there is probably a lot of code behind the sandbox algorithms, taking into account perhaps hundreds of different parameters, four main factors seem to carry the most weight in determining whether, for how long, and to what degree a domain sits in the Google sandbox.
1. Domain Age
The first is domain age. One of the easiest and best-known ways to avoid the sandbox is to buy an aged domain, since Google does give a pass, of sorts, to aged domains. If things click just right, this is the way to go, but don't depend on it.
Buying an aged domain isn't always practical. It can be hard to find just the right aged domain name that fits your blog's branding strategy. The price, particularly for a domain with high PR or a desirable name, can be too steep. The domain might also have been blacklisted at some point in the past, so do your research before buying.
Also, keep in mind that an aged but dormant domain that suddenly gets a lot of links and/or has a recent ownership change can also trigger the sandbox algorithms.
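If you want to automate part of that pre-purchase research, here's a minimal sketch of an age check. It assumes the third-party python-whois package, which is just one convenient option; WHOIS records vary wildly by registrar, so treat the result as a rough estimate, not a verdict on how Google sees the domain.

```python
# Rough pre-purchase age check. Assumes the third-party python-whois
# package (pip install python-whois); WHOIS data is messy and varies
# by registrar, so treat the result as approximate.
from datetime import datetime

import whois


def domain_age_years(domain):
    """Return the domain's approximate age in years, or None if unknown."""
    record = whois.whois(domain)
    created = record.creation_date
    if isinstance(created, list):  # some registrars return several dates
        created = min(created)
    if created is None:
        return None
    return (datetime.now() - created).days / 365.25


age = domain_age_years("example.com")
if age is None:
    print("No creation date on record; research this one by hand.")
elif age < 1:
    print("Brand new domain; expect possible sandbox treatment.")
else:
    print("About %.1f years old; age helps but is no guarantee." % age)
```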
2. Speed of Link Growth
Next, the speed at which a site acquires links, and how many it acquires, is a significant factor.
It is well known that if a new site shows up and suddenly has 10,000 backlinks within a few days, or even hours, Google will probably sandbox it for years, if not deindex it entirely as a spam site. That many links that quickly just screams "black hat". In your link building you must build slowly and somewhat naturally: a link here, a couple of links there. Unnatural, rapid, large-scale link building rarely works for very long, although there are some exceptions.
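To make "slowly and somewhat naturally" concrete, here's a toy sketch in Python. The 30-day window and the 10x spike threshold are numbers I made up for illustration; nobody outside Google knows the real ones.

```python
# Toy link-velocity check. The 30-day window and 10x spike factor are
# made-up illustrative numbers, not anything Google has published.
from datetime import date


def is_suspicious_spike(daily_new_links, window=30, spike_factor=10.0):
    """Flag any day whose new-link count dwarfs the recent daily average."""
    days = sorted(daily_new_links)
    for i, day in enumerate(days):
        recent = [daily_new_links[d] for d in days[max(0, i - window):i]]
        baseline = (sum(recent) / len(recent)) if recent else 1.0
        if daily_new_links[day] > spike_factor * max(baseline, 1.0):
            return True
    return False


history = {date(2008, 5, d): n for d, n in [(1, 3), (2, 5), (3, 4), (4, 2500)]}
print(is_suspicious_spike(history))  # True: 2,500 links in one day screams spam
```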
3. Trustworthiness of Links
Another thing Google's algorithms consider is how trustworthy or important the sites linking to a new site are. This is how certain commercial viral marketing sites have avoided significant sandboxing: they get links from a trusted parent site, like a movie studio, or buy a lot of trusted links via paid reviews and the like.
If you already own an aged and trusted authority domain, then working natural links from it into your niche blogs can help them avoid or lessen the impact of the sandbox. If you don't own one, there are other techniques you can use to develop this kind of link, although buying them outright is a bit risky, and often more so for the seller than the buyer. This is why Google gets so aggressive about going after PR selling: it sells trust in the quality of a site and breaks their algorithms.
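You can picture the difference with a toy trust-weighted score, loosely in the spirit of TrustRank. The trust values below are invented for illustration; Google's real weighting is obviously far more involved.

```python
# Toy trust-weighted score: each inbound link contributes the linking
# site's (assumed known) trust value. A couple of trusted links easily
# outweigh hundreds of junk ones. All numbers here are invented.
def trust_score(inbound_links):
    """Sum of linking-site trust values for a list of (site, trust) pairs."""
    return sum(trust for _site, trust in inbound_links)


viral_site = [("moviestudio.example", 0.9), ("newspaper.example", 0.8)]
spam_site = [("splog%d.example" % i, 0.001) for i in range(500)]

print(trust_score(viral_site))  # 1.7 from just two trusted links
print(trust_score(spam_site))   # 0.5 from five hundred junk links
```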
4. Anchor Text Variety
Lastly, Google looks for variety in your backlinks. If you suddenly have 500+ backlinks with the same anchor text, guess what: you stand a very good chance of getting sent to the sandbox. Even established sites can get bitten by this one; just ask John Chow. That's why directory submission services rarely provide great value. If they submit your niche blog everywhere with the same competitive anchor text, for example "car insurance", all they're doing is heaping sand upon your head.
A good backlink mix seems to be no more than about 30-50 percent using your primary keywords, with the rest consisting of long-tail links that point to individual post pages. This variety, particularly when spread out among what Google considers trusted sites, seems to be the most effective way to limit the sandbox effect.
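Here's a quick sketch of that 30-50 percent rule of thumb in Python. The backlink data is invented; in practice you'd pull the anchors from a backlink report.

```python
# Sketch of the 30-50 percent anchor-text guideline described above.
# The backlink list is invented sample data.
from collections import Counter


def primary_anchor_share(anchors, primary):
    """Fraction of backlinks whose anchor text is exactly the primary keyword."""
    counts = Counter(a.lower() for a in anchors)
    return counts[primary.lower()] / len(anchors)


backlinks = (["car insurance"] * 40
             + ["cheap car insurance for new drivers"] * 25
             + ["compare auto coverage"] * 20
             + ["myblog.example"] * 15)

share = primary_anchor_share(backlinks, "car insurance")
print("%.0f%% primary-keyword anchors" % (share * 100))  # 40%: inside the band
```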
If you have anything to add about the Google sandbox or have any questions, feel free to ask. I just go by what I’ve observed with my sites and sites where I’ve had a consulting role.