
Understanding Optimum Link Growth

Good evening all and Merry Christmas to all those who celebrate this time of year (you Pagans, you!). Rather than sit around the fire talking about yesteryear and smashing whiskey glasses into the flames, I’d like to talk to you about the far more interesting subject of link growth.

Link Growth on The Intertubes
For the context of this conversation (and by that I mean one-way lecture), I am assuming that everyone defines link growth as the rate at which a domain as a whole, and specific pages on it, gain new backlinks – and, more importantly, how quickly search engines discover and “count” these backlinks.
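
If you want to put a number on it, link velocity is just new links counted per unit of time. Here’s a tiny sketch with made-up discovery dates (assume you’ve exported the date each backlink was first seen from whatever tool you use):

    from collections import Counter
    from datetime import date

    # Hypothetical export: the date each new backlink was first discovered.
    discovered = [
        date(2008, 10, 3), date(2008, 10, 17), date(2008, 11, 2),
        date(2008, 11, 5), date(2008, 11, 21), date(2008, 12, 1),
        date(2008, 12, 4), date(2008, 12, 9),
    ]

    # Link velocity here is simply the count of new links per calendar month.
    per_month = Counter((d.year, d.month) for d in discovered)
    for (year, month), count in sorted(per_month.items()):
        print(f"{year}-{month:02d}: {count} new links")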

I’ve blogged about link velocity before and generally summarised that it is, of course, a factor in how well your website ranks. However, as with most SEO topics, the devil is in the detail and there are a lot of myths about that detail. So I would like to discuss:

1) What signals do “good” links and “not-so-good” links give to your website?

2) How do domain age and your current backlink count play a part in determining your “optimal” link velocity?

3) Can you be harmed by incoming links?

These are what I believe to be some of the most important factors (definitely not all of them) contributing to link growth / velocity. As I want to have this blog post finished by Christmas, I’m going to try and stick to these three core points, although I’m sure I’ll end up going off at a tangent like I usually do. If, however, you think I’ve missed something critical, drop me a comment and I’ll see if I can do a follow-up.

The difference between trust & popularity
When talking about links, it’s important to realise that there is a world of difference between a signal of trust and a signal of popularity. They are not mutually exclusive and to rank competitively, you’ll need signals of both trust and popularity, but for now realising they are different is enough.


For instance: Michael Jackson is still (apparently) very popular, but you wouldn’t trust him to babysit your kids now, would you? The guy down the road in your new neighbourhood might be the most popular guy in your street, but you’re not going to trust him until someone you know well gives him the thumbs up.

So for your site to rank well, Google needs to have a degree of trust in it (e.g. the sources of incoming links, domain age, site footprints) to ensure you’re not just another piece of two-bit webscum, and it needs to know your content is popular (i.e. good content, link velocity, types of links). As I’ve already said, I’m not going to get into a drawn-out debate about content here; I’m just looking at links.

What comes first, trust or popularity?
It doesn’t make much logical sense that you could launch a website with no fanfare and immediately get a stream of hundreds of low quality links every week.

This sits well with the original design of the PageRank algorithm, which, let’s not forget, was originally trying to calculate the chance that a random surfer clicking around the web would bump into your site. This notion of a random surfer clicking random links gave Google an excellent abstraction for working out the whole “page authority” concept that the lion’s share of their algorithm sprang from.

Nowadays, you’ll hear lots of people trumpeting about going after quality (i.e. high PR links) rather than lots of “low quality” (low PR) links while trying to remain relevant. From the algorithm’s origins point of view, higher PR pages simply have more of these virtual random surfers landing on them, so there’s more chance of a random surfer clicking your link.

Looking back at “time zero”, when PageRank first started to propagate around the web, all sites were equal (apart from internal PR stacking), so PageRank was actually accumulated through raw numbers of links rather than this “quality” (high PR) angle, which is really just a cumulative effect of the PageRank algorithm (at least in its original form).
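
If you fancy seeing that cumulative effect for yourself, here is a toy power-iteration version of the originally published PageRank formula (the four-site link graph and site names are entirely made up for illustration; 0.85 is the damping factor from the paper). Notice that at “time zero” every page starts equal, and the “high PR” pages only emerge after the scores have been pumped around the graph a few times:

    # Toy power-iteration PageRank over a hand-made link graph.
    links = {
        "bbc": ["yoursite", "blog_a"],
        "blog_a": ["yoursite"],
        "blog_b": ["blog_a"],
        "yoursite": ["blog_b"],
    }
    nodes = list(links)
    n = len(nodes)
    damping = 0.85
    pr = {node: 1.0 / n for node in nodes}  # "time zero": all pages equal

    for _ in range(50):
        new_pr = {}
        for node in nodes:
            # PR flowing in from every page that links to `node`, split
            # evenly across each linking page's outbound links.
            inbound = sum(pr[src] / len(outs)
                          for src, outs in links.items() if node in outs)
            new_pr[node] = (1 - damping) / n + damping * inbound
        pr = new_pr

    for node, score in sorted(pr.items(), key=lambda kv: -kv[1]):
        print(f"{node}: {score:.3f}")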

Hopefully you’re still with me and not bored of going over fundamentals, but without this level of understanding you’ll have a job getting your head around the more advanced concepts of link growth. Keep in mind that I’m talking about pure PageRank in its original form (I’m sure it’s been updated since it was published), not ranking factors as a whole. To be honest, when I’m ranking websites (which I’m pretty good at), PageRank normally plays a very, very small role in my decision making; it is, however, useful as an abstract concept when planning linking strategies.

The point I’ve been alluding to here is that for Google to buy into the fact that, yes, your site is getting lots of natural “run of the mill” links, you will first need links from higher PageRank pages (or authoritative pages, which are slightly different – bear with me). This line of thinking of course assumes you don’t use a product like Google Analytics (“Googlebot: Hmm, 58 visitors per month and 1,200 new incoming links per month – makes perfect sense!”).

Google is also pretty good at identifying “types” of websites and marrying this up to trust relationships. For instance, I think most people would like a link from the homepage of the BBC News website; it’s a whopping PR9 and has bucketloads of trust. Here’s a question though: is it a “relevant” link? The BBC News website covers a massive variety of topics, as most news sites do, so what is relevant and what is not depends pretty much on the story – and the stories, of course, cover all topics. Does a link from the BBC News site mean your site is “popular”? No (although it might make it so). Here’s a good question to ask yourself: of these two scenarios, which is the more believable?

1) Brand new site launched :: Couple of links from small blogs :: Gets 2,000 links in first month

2) Brand new site launched :: 1 link from BBC News Homepage :: Gets 2,000 links in first month

Of course, you’ve hopefully identified situation 2 as the far more likely candidate. Let’s consider what Google “knows” about the BBC website:

Googlebot says:

1) I know it’s a news website (varied topics)

2) I know millions of other sites link to it (it’s incredibly popular)

3) Lots of people reference deep pages (the content is of great quality)

4) I see new content hourly as well as all the syndicated content I’m tracking (Fresh – as a news site should be)

5) It’s been around for years and never tried to trick me (another indicator of trust)

6) If they link to somebody, they are likely to send them lots of traffic (PR)

7) If they link to somebody, I can be pretty sure that I can trust whoever they link to

Despite its critics, I’m a big believer in (at least some kind of) TrustRank system. It makes perfect sense, and if you haven’t read the PDF, it’s very much worth doing so. In a hat tip to the critics: it is incredibly hard to prove because of the dynamic nature of the web; it is almost impossible to separate the effects of PageRank, relevance, timing, content and the myriad of other glossary terms you could throw at any argument. However, without leaps of faith no progress would be made, as we’re all building on theory here.
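
For the curious, the paper’s core idea boils down to something like this sketch (the five-page chain and the seed choice are made up by me): run PageRank as usual, except the random surfer only ever teleports back to a hand-reviewed seed set, so trust flows outwards from the seeds and decays with every hop away from them:

    # Rough sketch of the TrustRank idea on a made-up five-page chain.
    links = {
        "bbc": ["news_blog"],        # the hand-reviewed trusted seed
        "news_blog": ["my_site"],
        "my_site": ["link_partner"],
        "link_partner": ["spam_page"],
        "spam_page": [],             # dangling page, links to nobody
    }
    nodes = list(links)
    seeds = {"bbc"}
    damping = 0.85

    trust = {node: (1.0 if node in seeds else 0.0) for node in nodes}
    for _ in range(50):
        new = {}
        for node in nodes:
            inbound = sum(trust[src] / len(outs)
                          for src, outs in links.items() if node in outs)
            # Teleport mass goes to the seed set only, never to random pages.
            teleport = (1 - damping) / len(seeds) if node in seeds else 0.0
            new[node] = teleport + damping * inbound
        trust = new

    for node, score in sorted(trust.items(), key=lambda kv: -kv[1]):
        print(f"{node}: {score:.4f}")

Run it and you’ll see the trust score drop with every hop away from the seed – which is exactly why links from trusted pages matter so much early on.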

Side Note: While I’m talking about experimentation and proof, I’m still chipping away at my SEO Ranking Factors project (albeit more slowly than I’d like) and I’ll be willing to share some scripts for “tracking TrustRank” in the new year – dead useful stuff.

Okay, the point I’m making here is that these high trust/authority (whatever you want to call them) sites are a stepping stone to greater things. I would agree with the whitehat doctrine that yes, if it’s your own domain at least, you will require links from these sources if you are to rank well in the future. We’ll look at some examples of how to rank without those links later (:

Trust needs to come before mass popularity, and there are other things you may want to consider apart from just scanning websites and looking for as much green bar as possible. There are other mechanisms too, although I don’t believe Google is using them to the full extent it should (even when they play around with that goddamn WikiSearch – mustn’t get started on that).

Looking at it from a Wikinomics aspect: while they are less trustworthy, being on the front page of Digg, being popular on Stumble or having lots of delicious bookmarks could all be signals of trust as well as popularity (although, at the moment at least, they are easier to game). I would expect that before Google can use these types of signals as strong search factors, there will need to be more accountability (i.e. a mass information empire) behind user accounts. This is perhaps one of the things that could make WikiSearch work: being linked to your Google Account means Google can see if you use Gmail, search, Docs, video, Blogger, Analytics – the list goes on – so it’s going to be much harder to create “fake” accounts to boost your popularity.

Domain age and link profiles
Domain age definitely has its foot in the door in terms of ranking; however, having an old domain doesn’t give you a laminated backstage pass to the Google rankings. The most sense you’re going to get out of domain age comes from overlaying it with a link growth profile, which is essentially the time aspect of your link building operation.

Your natural link growth, when averaged out, should follow a fairly obvious curve, probably something like this:

This roughly shows that during natural (normalised) organic growth, the number of links you gain per day/week/month will increase (your link velocity goes up). This is an effect of natural link growth, discovery and more visitors coming to your site. Even if you excuse my horrific graph drawing skills, the graph is pretty simplified.
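
If the graph doesn’t do it for you, here’s the same shape as a quick script. The numbers are completely made up; the point is simply that each existing link makes you a little more discoverable, so the weekly gain compounds:

    # Crude simulation of natural link growth: weekly new links compound
    # because every existing link makes the site easier to discover.
    total_links = 10        # links at launch (made-up figure)
    growth_rate = 0.08      # 8% compounding discovery effect per week

    for week in range(1, 25):
        new_links = max(1, round(total_links * growth_rate))
        total_links += new_links
        print(f"week {week:2d}: +{new_links:2d} new links, {total_links:4d} total")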

How does this fit into link growth then?
I’ll be bold and make a couple of statements:

1) When you have established trust, even the crappiest of crap links will help you rank (proof to come)

2) The more trustage (that’s my new term for trust over time (age)), the greater the “buffer” you have for building links quickly

This also brings us to two conclusions:

3) Straying outside of this “buffer zone” (e.g. 15,000 new low quality links in week 1) can see you penalised.

4) If you’ve got great trust you can really improve your rankings just by hammering any crap links you like at the site.

So, going along with my crap-o-matic graphs:

As I’ve crudely tried to demonstrate in graphical form, your “buffer zone” for links increases almost on a log scale, along with your natural links. Once you’ve established a nice domain authority, it’s pretty much fair game with links, within reason.
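
If it helps, here’s the same mental model as a pseudo-formula – and let me stress this is purely illustrative, my own rule of thumb rather than anything Google has published. The function name, the square-root trustage curve and the multiplier are all invented:

    # Purely hypothetical "buffer zone" heuristic -- my own mental model,
    # not anything Google has published or confirmed.
    def weekly_link_buffer(domain_age_weeks, natural_links_per_week,
                           trust_multiplier=5.0):
        """Rough ceiling on new links per week before growth looks unnatural."""
        trustage = domain_age_weeks ** 0.5   # trust compounds with age
        return natural_links_per_week * trust_multiplier * trustage

    for age in (1, 4, 26, 104):  # 1 week, 1 month, 6 months, 2 years
        print(f"{age:3d} weeks old: ~{weekly_link_buffer(age, 20):,.0f} links/week buffer")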

I s’pose you’re going to want some proof for all these wild claims, aren’t you?

Can incoming links harm your website?
The logical answer to this would be “no”. Why would Google have a system in place that penalises you for bad incoming links? If Google did this, they would actually make their job of ranking decent pages much harder, with SEOs focusing on damaging the competition rather than working on their own sites. It would be a nightmare, with a whole sub-economy of competitor disruption springing up.

That’s the logical answer. Unfortunately, the correct answer is yes. I’ll say it again for the scan readers:

It is possible to damage the rankings of other websites with incoming links

Quote me if you like.

Now, by “bad links” I don’t mean the local blackhat viagra site linking to you; that will most likely have absolutely no effect whatsoever. The kinds of sites Google classes as “bad neighbourhood” can’t spread their filth just by linking to you, let’s be clear on that. You’re more at risk of someone tricking you into linking out to a bad site with some kind of Jedi mind trick.

There are two ways I’ve seen websites’ rankings damaged by incoming links:

1) Hopefully this one is obvious. I experienced it myself after registering a new domain and putting a site up 2 days later, which ranked great for the first couple of weeks. Then, well... I “accidentally” built 15,000 links to it in a single day. Whoops. I never saw that site in the top 100 again.

2) There is a reliable method of knocking pages out of the index, which I’ve used (only once) and seen others use many, many times. Basically, you’re not using “bad” links as such – by which I mean they’re not from dodgy/blackhat or banned sites; they are links from normal sites. Say, for instance, you find a sub-page of a website ranking for a term like “elvis t-shirts” (a random term – I don’t even know what the SERPs are for it), with 500 incoming links to that page. If you get some nice scripts and programs (I won’t open Pandora’s Box here – if you know what I’m talking about then great) and drop 50,000 links on it over a 2 week period with the anchor text “buy viagra”, you’ll find, quite magically, that you have totally screwed Google’s relevancy for that page.

I’ve seen pages absolutely destroyed by this technique, going from the 1st page to not ranking in the top 500 – inside of a week. Pretty powerful stuff. You’ll struggle with root domains (homepages) but sub-pages can drop like flies without too much problem. Obviously, the younger the site, the easier this technique is to pull off.
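
To see why this kills relevancy, look at it from the anchor text side. Here’s a back-of-envelope sketch (all the numbers are invented) of what the flood does to the page’s anchor profile:

    from collections import Counter

    # Made-up backlink profile for a page ranking for "elvis t-shirts".
    before = Counter({"elvis t-shirts": 350, "elvis tees": 100, "click here": 50})
    after = before + Counter({"buy viagra": 50_000})  # the two-week flood

    for label, profile in (("before", before), ("after", after)):
        total = sum(profile.values())
        share = profile["elvis t-shirts"] / total
        print(f"{label}: 'elvis t-shirts' is {share:.1%} of {total:,} anchors")

The target term goes from 70% of the profile to well under 1% overnight – as far as the algorithm can see, the page is no longer “about” elvis t-shirts at all.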

You said you could just rank with shoddy links?
Absolutely true. Once you’ve got domain authority, it’s pretty easy to rank with any type of link you can get your hands on, which means blackhat scripts and programs come in very useful. To see this in effect, all you have to do is keep your eye on the blackhat SERPs. “Buy viagra” is always a good search term to see what the BHs are up to. It is pretty common to see Bebo pages, Don’t Stay In pages, or pages on the myriad of other authoritative domains with User Generated Content ranking in the top 10 for “buy viagra”. If you check out the backlink profiles of these pages you will see, surprise, surprise, they are utter crap low quality links.

The domains already have trust and authority – all they need is popularity to rank.

Trust & Popularity are two totally different signals.

Which does your site need?

We have learnt:

1) You can damage sites with incoming links

2) Trust & Authority are two totally different things – Don’t just clump it all in as “PageRank”

3) You can rank pages on authority domains with pure crap spam links (:

Good night.




18 responses to “Understanding Optimum Link Growth”

  • tim says:

    Hmm, interesting. I’m curious to see if Eli from BlueHatSEO will weigh in on this; I know you guys have a level of respect for each other. He seems to be pretty firm that no links can harm your rankings, though you have both mentioned the unfair sabotage aspect of it. Or maybe I’ve taken some of his posts out of context and he really means those links won’t hurt an already established site.
    Long time lurker. I enjoy your stuff, keep up the good posts.

    Comment by tim
    December 12th, 2008 @ 7:52 am

  • Mark says:

    @tim

    I haven’t spoken to Eli about it before. I’d be interested to see what he has to say as well; the fact remains, however – I’ve seen it done, I’ve done it and I could do it again, so there are no “ifs” for me in this case.

    As I’ve said in the post, I’m not saying “bad” incoming links can hurt you, or that incoming links can hurt you per se – I’m saying that if you build thousands of incoming links with off-topic anchor text, you can stop a page ranking for a search term.

    If someone has a page they want to sacrifice, I’d be more than willing to demonstrate it with a public experiment.

    Comment by Mark
    December 12th, 2008 @ 10:03 am

  • sloth says:

    That’s very interesting. Would you say building high authority links first is pretty much essential, or have you been successful ranking without bothering with this bit before the spam links?

    In other words, do you always build quality links first?

    I haven’t done enough of my own projects to be sure.

    Comment by sloth
    December 12th, 2008 @ 3:18 pm

  • Mark says:

    @sloth

    It totally depends on what you are trying to achieve. If you want to run a BH operation, then you’re going to need trust before you can rank with spam links. That’s why you see so many blackhats parasite hosting pages on trusted domains, like Squidoo and Bebo.

    If you’re just whitehat link building (if there is such a thing?), then you don’t really need to worry – but trusted links with a lot of fat on them will, of course, help you rank! (:

    Comment by Mark
    December 12th, 2008 @ 4:13 pm

  • Paul says:

    It’s been quite some time since your last post, but this was certainly worth the wait. I agree about the domain strength and I’m looking forward to your “tracking TrustRank” in the new year. Keep up the excellent work.

    Comment by Paul
    December 12th, 2008 @ 4:54 pm

  • Dan says:

    Crushing post! This makes total sense. Thanks, Mark.

    Comment by Dan
    December 12th, 2008 @ 11:45 pm

  • Richard says:

    Hands down, the best post in modern times I’ve read on linking. And yes… the graph about the buffer zone is what made it come together for me. I’m extremely visual and graphs like that help me grasp concepts.

    Comment by Richard
    December 15th, 2008 @ 5:53 pm

  • Richard says:

    “You’ll struggle with root domains (homepages) but sub-pages can drop like flies without too much problem.”

    Where do subdomains fit into this? Are they hard to knock off like root domains or easy like sub-pages?

    Thanks.

    Comment by Richard
    December 16th, 2008 @ 6:16 pm

  • Mark says:

    Sub domains are treated very differently to individual pages on a site.

    Usually, a sub-domain is there because that part of the content demands a new section; as such, it is likely to have a different set of link sources, focused on the sub-domain’s default page.

    In short, unless it’s a weak sub-domain, I imagine you’d be in the same boat as with a root domain.

    That’s just theory, though, as I’ve never tested the method, or seen it tested, on a sub-domain before.

    Comment by Mark
    December 17th, 2008 @ 2:15 pm

  • Rich says:

    Great post…

    Can you point in the right direction to find Pandora’s box?

    “If you get some nice scripts and programs (I won’t open Pandora’s Box here – if you know what I’m talking about then great) and drop 50,000 links over a 2 week period with the anchor text buy viagra, you’ll find quite magically you have totally screwed Google’s relevancy for that page.”

    Comment by Rich
    January 9th, 2009 @ 5:09 pm

  • White Hat to-be Black Hat says:

    Wow, your article made my day – it covers a lot of the factors I suspected were in play myself, and now I have someone backing those up.

    Now something more important:

    I had 10 domains, all for different niches. I put them online, but the marketing work was never finished because focusing on one single site was never possible.

    Then one day I stumbled onto a post like this one – a great piece. The moral of the story was: “Instead of working on 10 sites, focus on building one big site and with time it will rank for everything.”

    There is no shortage of proof that this works, so I started working on this theory and am currently starting to see nice results.

    Now comes the heartbreaker:

    As the individual pages start to rank in competitive niches, I fear they could be targeted to lose their rankings, which could also damage the trust of the whole domain.

    Could you suggest a way to make unethical attacks like this more difficult, if not impossible?

    Waiting for your reply.

    Comment by White Hat to-be Black Hat
    January 17th, 2009 @ 4:46 am

  • thrifty says:

    I’m glad that at this festive time of year you choose to talk about link growth. Do you think SEO is getting a bit over-complicated? Is it not just about getting people who search for stuff relevant content, or is that just a pipe dream? I am so Ficked off with searching for stuff and getting back a load of trash I didn’t ask for. If those who write content and links concentrated on the “business in mind”, they would write and optimise for that particular business more, rather than fookin around and waffling about a load of twaddle that means jack Shat to the user.

    Comment by thrifty
    January 25th, 2009 @ 11:34 pm

  • thrifty says:

    By the way, I have no chance to edit that comment :-)

    Comment by thrifty
    January 25th, 2009 @ 11:35 pm

  • Digerati Marketing » Using Twitter To Power Spam says:

    [...] and Spam Although I’ve only really talked about parasite hosting indirectly, when looking at ranking factors to do with age and trust, I think it’s a point briefly worth [...]

    Comment by Digerati Marketing » Using Twitter To Power Spam
    March 3rd, 2009 @ 9:56 pm

  • SEO Pakistan says:

    Wow… an information-filled article. It contains inside knowledge about link building and all the aspects associated with it. I am impressed.

    Comment by SEO Pakistan
    November 5th, 2009 @ 11:05 pm

  • Steven Air says:

    Well written, informative article that appears to boil down to a basic fact – if your website has interesting content then more people will stumble upon it and increase your PageRank… thing is, I am running out of stuff to write about!!

    Comment by Steven Air
    November 10th, 2009 @ 11:31 am

  • david says:

    Very detailed article and the kind of information that is really useful. I agree about domain age, as I am always amazed how many times in a search I stumble across a highly ranked page where the information is seriously out of date.

    Comment by david
    December 10th, 2009 @ 12:41 pm