
Archive for the 'Blogging' Category

SEO Ranking Factors

Saturday, May 31st, 2008

Right, let’s kick this thing in the nuts. Wouldn’t it be great if you could have a decent list of SEO Ranking Factors and, more specifically, one that told you exactly what you need to rank for a key phrase?

Well, SEOMoz went and done this.

You’ve probably all seen it before: the famous SEOMoz Search Ranking Factors, the highly regarded opinions of 37 leaders of search spread over a bunch of questions. It sounds slick, it looks cool and it’s a great introduction to SEO. There is, however, a rather major problem. None of them pissing agree! 37 leaders in search, closed-ended questions, yet almost ALL of the answers have only “average agreement”. Just look at the pie charts at the end: there is massive dispute over the correct answer.

I find this interesting. It leaves two possibilities:

1) SEOMoz’s questions are flawed and there is no “correct” answer – this kind of kills the whole point of the project.

2) If there is a “correct” answer, then it would seem that 25%-50% of “leading people in search” don’t know WTF they are talking about.

Now before I continue, I’m not going to claim I have all the answers, far, far from it. I do some stuff and that stuff works well for me. The other thing I would like to point out is that I actually really like the SEOMoz blog and I think they provide extremely high quality content in high frequency, which is bloody hard to do. So please no flaming when I seem to be bashing their hard work, I’m simply pointing out a few things rather crudely. Oh, they’re nice people too, Jane is very polite when I stalk her on Facebook IM.

Anyway, back to slating. I think it is very hard to give quality answers to questions such as: how does page update frequency affect ranking? From my experience, I’ve found Google quite adaptive in knowing, based on my search query, whether it should serve me a “fresh” page or one that’s collecting dust. Eli from BlueHatSEO has also made some convincing arguments that the “optimum” update frequency of a page depends on your sector/niche.

Also, these things change. Regularly. Those clever beardies at Google are playing with those knobs and dials all the time. Bastards.

Okay, I now hate you for slating SEOMoz, do you have anything useful to say?
Maybe? Maybe not. As I mentioned in my last post, I’m going to talk about some projects I’m working on at the moment and one of these is specifically aimed at getting some SEO Ranking Factors answers.

I could of course just give what I believe to be the “correct” answers to the SEO Ranking Factors questions, but like everyone else, I’d be limited to my own SEO experience. We need more data, more testing, more evidence.

There are loads of little tools floating around the net that will tell you little things like whether you have duplicate meta descriptions, your “keyword density” (hah), how many links you have, all that stuff. Then you’ll get some really helpful advice like “ShitBOT has detected your keyword only 3.22% on this page; you should mention your keyword 4.292255% for optimum Googleness”. Yes, well. Time to fuck off, ShitBOT. These tools are kind of fragmented over the net, so it would take ages to run all 101 of them to build up a complete “profile” of your website, which really… wouldn’t tell you all that much. It wouldn’t tell you much because you’re only looking at your own website, your own ripples in the pond. You need to zoom out a bit: get in a ship and sail back a bit, then maybe put your ship in a shuttle and blast off until you can see the entire ocean.

Well, crap. It all looks different from here..

Creating a Technological Terror
I can’t do this project alone. Fortunately, one of the smartest SEO people I know moved all the way across the country to my fine city and is going to help.

Here we go….

1) Enter the keyword you would like to rank for.

2) We will grab the top 50 sites in Google for this search term.

2) i) First of all, we will do a basic profile of these sites – very similar to, but in a bit more depth than, the data SEOQuake will give you. So things like domain age, number of sites linking to the domain, how these links are spread within the site, page titles, amount of content, update frequency, PageRank etc. We’ll also dig a bit deeper and take titles and content from pages that rank for these key phrases and store them for later.

2) ii) The real work begins here. For each one of these sites that rank, we are going to look at the second tier, which I don’t see many people doing. We are going to analyse all of the types of sites that link to these sites that rank well. This will involve doing the basics, such as looking at their vital stats: their PR, links, age of domain, TLD and indexed pages.

Then we’re going to take this a step further. We are going to be scanning for footprints to work out the type of link. This means: is it an image link? Is it a link from a known social news site like Digg or Reddit? Is it a link from a social bookmarking site like StumbleUpon or Delicious? Is it a link from a blog? Is it a link from a forum? A known news site? Is it a link from a generic content page? If so, let’s use some language processing and try to determine whether it’s a link from a related content page, or a random ringtones page. Cache all of this data.

3) We have a huge amount of data now and we need to process it. Take the sites ranking for the key term “casino”: let’s put them onto a graph showing their actual ranking for this key term vs their on-page vital stats. Let’s see the ranking vs the types of links they have. Let’s see how the sites rank vs the amount of links, the age of links, etc. etc…


4) We can take this processing to any level needed. Let’s pool together all the data we have on the 50 sites and take averages. What do they have in common for this search term? Are these common ranking factors shared between totally different niches and keywords?
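As a taster, the footprint scanning in step 2) ii) could be sketched roughly like this. The domain lists and URL patterns here are my own illustrative guesses, not a definitive taxonomy, and a real version would look at far more signals:

```python
import re

# Very rough footprint rules for classifying a backlink by its source URL
# and the HTML of the link itself. The domain lists and regexes below are
# illustrative assumptions only.
SOCIAL_NEWS = ("digg.com", "reddit.com")
BOOKMARKING = ("stumbleupon.com", "del.icio.us")

def classify_link(source_url, link_html):
    """Return a coarse link-type label for a single backlink."""
    host = re.sub(r"^https?://(www\.)?", "", source_url).split("/")[0]
    if any(host.endswith(d) for d in SOCIAL_NEWS):
        return "social news"
    if any(host.endswith(d) for d in BOOKMARKING):
        return "social bookmarking"
    if "<img" in link_html.lower():
        return "image link"
    if re.search(r"/(blog|weblog)s?/|wp-content", source_url):
        return "blog"
    if re.search(r"/(forum|board|thread|viewtopic)", source_url):
        return "forum"
    return "generic content page"

print(classify_link("http://www.digg.com/story/123", '<a href="#">cool</a>'))
# "social news"
```

Anything falling through to “generic content page” would then be handed to the language-processing step to judge topical relevance.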

This is the type of information that I think I know. I think it would be valuable to actually know it (=

So I guess you can expect a lot of playing with the Google Charts API, scatter graphs showing link velocity against domain age and total links and all that shit.
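A scatter graph on the old image-based Google Chart API is only a URL away. This is just one way of building it, assuming the text (`chd=t:`) encoding, which wants values scaled into 0–100:

```python
def scatter_chart_url(xs, ys, size="400x300"):
    """Build a Google Chart API URL for a scatter plot (cht=s) using the
    text data encoding, scaling each series into the API's 0-100 range."""
    def scale(vals):
        lo, hi = min(vals), max(vals)
        span = (hi - lo) or 1
        return ",".join("%.1f" % (100.0 * (v - lo) / span) for v in vals)
    return ("http://chart.apis.google.com/chart?cht=s&chs=%s&chd=t:%s|%s"
            % (size, scale(xs), scale(ys)))

# e.g. domain age (years) vs ranking position for a handful of sites
url = scatter_chart_url([1, 3, 5, 8, 10], [42, 30, 18, 7, 2])
print(url)
```

Drop the resulting URL into an img tag and Google renders the graph for you.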


You get the idea.

There are actually all kinds of other secondary analysis that can be pumped into this data. For instance, even though it’s a kind of made-up term, I think “TrustRank” has some sauce behind it. (There’s a good PDF on TrustRank here.) Let’s think of it in very, very simple, non-mathematical terms for a moment.

One fairly basic rule of thumb for the web is that a trusted (“good”) site will generally not link to a “bad” (spam, malware, crap) site. It makes sense: generally, very high quality websites vet the other sites that they link to. So it makes sense that Google would select a number of “seed” sites and give them a special bit of “trust” juice, which says that whatever site this one links to is very likely to be of good quality. This trend continues down the chain, but obviously the further down the chain you get, the more likely it is that this rule will be broken and someone (maybe even accidentally) will link to what Google considers a “bad” website. For this reason, the (and I use this terminology loosely) “Trust” that is passed on is dampened at each tier. This allows a margin for calculated error, so even if the chain is in essence broken, the algorithm maintains its quality, because it allows for this.
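In code, that dampening-down-the-chain intuition might look something like this toy version. To be clear, this is not the actual TrustRank algorithm from the paper (which splits trust among outlinks, PageRank-style); it just shows trust decaying by a damping factor per tier:

```python
def propagate_trust(graph, seeds, beta=0.85, tiers=4):
    """Toy tier-by-tier trust propagation: each seed starts with trust 1.0,
    and a page inherits the best trust among the pages linking to it,
    dampened by beta per hop. graph maps a page to the pages it links to."""
    trust = {s: 1.0 for s in seeds}
    frontier = set(seeds)
    for _ in range(tiers):
        nxt = set()
        for page in frontier:
            for linked in graph.get(page, ()):
                dampened = trust[page] * beta
                if dampened > trust.get(linked, 0.0):
                    trust[linked] = dampened
                    nxt.add(linked)
        frontier = nxt
    return trust

graph = {"seed.com": ["good.com"], "good.com": ["ok.com"], "ok.com": ["iffy.com"]}
print(propagate_trust(graph, ["seed.com"]))
# trust falls off: 1.0, 0.85, 0.7225, 0.614125
```

Three hops out, a page only sees about 61% of the seed’s trust – which is exactly the margin for error described above.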

I think most people could name some big, trusted websites. Why not take the time to research these sites, really trusted authority sites – ones that, it’s at least a fair bet, have some of this magical Trust? Say we have a list of ten of these sites. Why not crawl them and get a list of every URL that they link to? Why not then crawl all of these URLs and get a list of all the sites THEY link to? Why not grab the first 3 or 4 “tiers” of sites? Great, now you’ve probably got a few million URLs. Why not let Google help us? Let’s query these URLs against the keywords we’re targeting. What you’re left with is a list of pages from (hopefully) trusted domains that are related to your niche. The holy grail of whitehat link building. Now pester them like a bastard for links! Offer content, blowjobs, whatever it takes!
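The tiered grab is just a breadth-first walk outwards from the seeds. A minimal sketch, where get_links stands in for whatever fetch-and-parse function you'd actually use (stubbed with a dict here for illustration):

```python
from collections import deque

def collect_tiers(get_links, seeds, max_tier=3):
    """Breadth-first collection of URLs up to max_tier hops from trusted
    seeds. Returns {url: tier}, where seeds sit at tier 0."""
    seen = {url: 0 for url in seeds}
    queue = deque(seeds)
    while queue:
        url = queue.popleft()
        tier = seen[url]
        if tier == max_tier:
            continue  # don't expand beyond the last tier we want
        for linked in get_links(url):
            if linked not in seen:
                seen[linked] = tier + 1
                queue.append(linked)
    return seen

# Hypothetical link data standing in for real crawl results
links = {"trusted.edu": ["a.com"], "a.com": ["b.com"],
         "b.com": ["c.com"], "c.com": ["d.com"]}
print(collect_tiers(lambda u: links.get(u, []), ["trusted.edu"]))
# d.com never appears: it sits beyond tier 3
```

From there you’d feed the collected URLs into the keyword queries mentioned above.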

Wouldn’t it be interesting if we took this list of possible Trusted sites and tied this theory in with how many of our tendrils of trusted networks link to our high-ranking pages? There are a lot of possibilities here.

This project will be taking up a significant chunk of my time over the next few months. Maybe the data will be shit and we won’t find any patterns and it will be a giant waste of time. At least then I can say with confidence that SEO actually is the charm-clasping, pointy-hat-wearing, pole-chanting black art that so many businesses seem to think it is. At least I’ll be one step closer to finding out.

Apologies once again to SEOMoz if you took offense. I love you x

Posted in Blogging, Google, Marketing Insights, Research & Analytics, Search Engine Optimisation, Social Marketing, White Hat | 10 Comments »

I ask you.

Tuesday, May 27th, 2008

This blog is changing focus. I haven’t posted since I returned from Tulum (yes, I know I said Cancun, but it turns out I don’t listen as well as I should) because I’ve stopped myself from doing so. There have been a lot of things I’ve wanted to talk about, such as Lyndon’s run-in with Google over hoax linkbait, Google really getting to grips with forms, big sites that are cloaking and even geo-hashing. It’s like a trap for me to fall into: seeing all these opinions flying around and wanting to throw my 2 cents in. Of course, I have opinions on all of these topics, but one of my only objectives when starting this blog was to keep every post as informative as possible and try to dig up some strategies, techniques, theories or research that isn’t in a million other places. There’s an incredibly annoying echo effect on my RSS reader as these stories reverberate around the (gag) “blogosphere”. So, rather than post what everyone else is posting, I’d rather post nothing at all, so when you are here, hopefully you’ll get something really….. nice.

That being said, any decent-sized post I do takes around 4 hours in total, as I try to decode the gibberish noise in my head into something tangible enough to put on display. The process helps me organise my thoughts; however, it is time-consuming and at the moment I’m incredibly time-starved.

At the moment, my time is broken down between a few major projects:

1) A massive (as of yet unnamed) project to analyse search ranking factors

2) The further development of my currently released SEO Tools

3) The growth and refinement of AutoStumble (now over 300 users!)

4) I’ve also just started work on a niche, white-hat community site, which will need a lot of attention.

5) Various other websites/maintaining current web property

The new focus for this blog will be looking more closely at the SEO, programming, technical and marketing principles behind these projects – which will benefit everyone, as once again blogging will become “integrated” into what I do, and there’s still a lot of valuable stuff to share.

I ask you.
You’ve taken the time to subscribe to (or at least visit) my blog – thank you. If you have a specific topic you’d like to see me write about, let me know and I’ll see what I can do. Would you guys like these single, sporadic and very detailed posts on more advanced SEO concepts on their own? Or would you like “lighter” reading for insights into current issues as well? Answers on a postcard, or in the comment box – whichever is easiest for you. I’m going to be checking out the sites and blogs of everyone who comments here, so I can see who’s really reading this stuff :)

I’ll shortly be posting detailed overviews of the above projects.

Posted in Blogging, Community Sites, Digerati News, Marketing Insights, Search Engine Optimisation, Social Marketing | 9 Comments »

Will It Make Money? Top 3 Considerations

Wednesday, December 5th, 2007

Every single day I probably come up with three or four new ideas for websites. Every single year, I probably come up with three or four good ideas for websites. So how do you separate “good” ideas from “notsogood” ideas? There’s definitely a process, which most experienced developers/marketers do without even realising it. I’m going to try and outline my thought process and some of the tools I use to judge whether ideas make it to the web or to the recycle bin.

Consideration 1: Has it been done before?
Sounds obvious, huh? I really hate pissing on people’s parades, but working as a consultant I’m probably approaching triple figures for the number of times I’ve been told about the “next big thing”, only to have to show people a Google search results page with a dozen established websites already on it.

If you’re planning a fairly large project, it really does pay to load up Google and hammer it with everything you can think of which might possibly be related to your idea. Oh, your idea’s been done before? No biggie – my mantra here is: do it different, or do it better!

Different? That doesn’t just mean the core idea! For instance, you could do the basic idea but target it at a different audience. A great example of this is Sphinn.

Sphinn versus Digg?

Well, here’s the thing – there isn’t really a “Sphinn versus Digg”. Sphinn isn’t very different from Digg at all; however, it is aimed at Internet Marketers, a crowd that isn’t always welcomed with open arms over at Digg. It seems obvious now, but what would your first reaction have been in a pre-Sphinn world if someone came to you and said “I’ve got this idea for a website, it’s a social site where people vote on news stories and…”? It would have been very easy to scrap the idea without further thought.

Better? Surf the web looking for opportunities. Just as Danny realised that Digg could be better for search marketers, I could go and find a list of 10 sites now which I could use and say “this really could be better if…” – that’s where these “simple but great” ideas come from. Who, two years ago, thought MySpace would be dominated by another social networking site?

Facebook was not designed as a competitor to MySpace; it began its life in the halls of Harvard as a way for students to connect with each other. The idea slowly expanded to more Ivy League schools, then universities, then companies, until it reached the colossal size it is today. The idea started out with a similar premise to MySpace, but again a different audience. It just so turns out it performs the function of MySpace, but in a much better way: greater connectivity and less spam (for now at least).



This is one of the reasons we can see MySpace’s brand searches suffer in Google as people leave in their droves and head for Facebook. You can see that around 2007 MySpace really began to suffer and its search popularity started to decline, which spells out a bleak future for them. I don’t want to get into a big MySpace vs. Facebook debate; I want to say: it doesn’t matter how big your competitor is, if you can do something genuinely better, you’ve got a chance.

Consideration 2: Intelligent monetisation

There are a whole bunch of ways you can make money from a website and one of the biggest mistakes I see is people just defaulting to the Adsense crutch. Don’t get me wrong, I’m a big Adsense fan, but it has its uses and it’s certainly not a silver bullet solution for monetisation.

Before you even get into monetisation, you should ask yourself the question: should you be trying to monetise a site from the kick-off anyway? Obvious monetisation can adversely affect the credibility of your site, or worse yet, drive users away as you sell off the traffic that you’ve worked so hard to draw in.

I’ve mentioned before that I don’t use Adsense on this blog – and I think it’s a pretty good example. I don’t do sponsored posts, sell links or show Adsense, because all of these things would drive users away from my blog, which I’m writing to get them here in the first place! I want you here to read this information, not to con you into coming here for a few vague tips just so I can pawn you off to the highest bidder.

I imagine most of my readers know about Adsense, so most probably wouldn’t click on it anyway – meaning I wouldn’t make much money. I guess I could blend it in and maybe get a few misclicks, but what’s the point in that? When I recommend certain products or schemes, I sometimes use an affiliate link, which I mark as (aff) to let people know what it is. This way, I add value for readers, rather than trying to get them to buy/subscribe/use something that’s not relevant to the post. If they’re going to perform that action anyway, why not use an affiliate link? Marking the links with (aff) is just my way of telling my readers that they have the option of typing in the URL if they really don’t want me to get a commission – that’s their choice at the end of the day.

If you can “build in” a monetisation stream to your site, i.e. make it part of the integral process that 1) does not require the user to do more than they usually would and 2) still sees the user perform the actions you want them to, you’re on a winner.

There are tertiary methods of generating revenue which can be very lucrative but will never be core to functionality, such as CPM (cost per thousand impressions) banners. If you run a community-based website with 1,000 uniques per day and an average of 10 page views each, there’s a fair bit of money to be had from site-wide CPM advertising. There’s even more money to be had if you can sell these banner impressions directly to interested parties, rather than through the sometimes rather low-paying CPM networks.
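To put rough numbers on that example (the $2 CPM rate and 100% fill are purely illustrative; real rates vary wildly by niche and network):

```python
def monthly_cpm_revenue(daily_uniques, pages_per_visit, cpm_rate, fill=1.0):
    """Rough monthly revenue from site-wide CPM banners. cpm_rate is the
    payout per 1,000 impressions; fill is the fraction of impressions
    actually sold."""
    daily_impressions = daily_uniques * pages_per_visit * fill
    return daily_impressions / 1000.0 * cpm_rate * 30

# 1,000 uniques/day x 10 page views = 10,000 impressions/day
print(monthly_cpm_revenue(1000, 10, 2.00))  # -> 600.0 dollars/month
```

Halve the fill rate or the CPM and that $600 shrinks fast, which is why selling impressions directly matters.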

Do you like banners, though? When was the last time you went to a site and thought, “Wow, I’m really pleased that banner advert is there!”? Rarely; probably never. As a rule of thumb, people don’t like banners – however, they can pay the bills, so there has to be some kind of balance.

In the above example, we’re talking about building a community site, which is a damn hard thing to do – reaching that “critical mass” of users, where your user count will self-replicate and you don’t have to keep your foot on the pedal to keep the thing alive. So, at these tender stages of your website’s life, is it a good idea to expose people to banner adverts? Unlikely.

Monetisation can be a bit of a gamble and there are loads of examples we could work through, but there are a few key rules to keep in mind:

1) Can you integrate your monetisation into the core functionality of your site?

2) Should you be using “push” monetisation straight away?

3) How will your users react and interact with different monetisation streams?

4) How do other sites in your niche monetise their presence?

5) What actions do you want a user to take on your site and does your monetisation work against these?

6) Have you considered:

> Affiliate deals to monetise content
> Contextual advertising such as Adsense, Adbrite, PeakClick? (CPC)
> Cost per thousand impression (CPM) advertising such as TribalFusion, Casale, BurstMedia
> Having other sites or companies sponsor sections of your website?
> Could your site take voluntary donations?
> What about subscription based systems?
> Can you monetise RSS or syndicated feeds?
> Can you do sponsored content? (Nofollowed of course!)

What I’m tarting on about is that you can’t make anything without visitors, so put them first. Maybe I should have just written that half an hour ago? (:

Consideration 3: Time vs Profit Ratio

Avid readers of my blog (I love you guys) will know I’m a big fan of “quick buck” ideas. These are ideas which are quick and easy to implement and will earn you a bit of pocket money. When building a web portfolio, diversification is the key factor for income stability. Although I have a few “battleship” sites, I’ve also got a million dinghies floating about, so if a few Google bombs go off here and there, I’m still in pretty good shape.

A lot of people ask the question “I want to make money online, should I make one big site, or loads of little ones?” My answer is, both! (and everything between them for that matter). Small sites are a great way of testing ideas, monetisation streams, SEO techniques, designs, you name it. You can increase your overall chance of success by lowering risks early on. If you spend all of your time, money and resources on building your first battleship site and for whatever reason, it sinks – that leaves you in a nasty place. If you can get up and running with a few quick wins, you can use this revenue as a “margin of error” to play with when working on larger projects.

My most successful “dinghy” site took about 20 minutes to build, about 20 minutes of promotion, and it makes about $300 a month with no work whatsoever. I’d say that’s a pretty good investment by whatever yardstick you’re using. So what makes a “dinghy” site?

It’s not size, that’s for sure. Some of the quickest projects may be database-driven sites with a million pages, built just to catch long-tail queries. I generally class a site by three factors:

1) How long it will take to build, design and develop

2) How many visitors it will take to make the site consistently earn money

3) What ongoing maintenance and time will the site take?

The first is fairly simple and easily written off. If you’re confident you can design and develop the site, you’re onto a winner. A lot of the time it’s easy to pick up a CMS such as WordPress, Drupal, Joomla or Pligg and smack a site together in no time. A real issue is how many visitors it’s going to take to make the site earn money. This depends on our earlier points about monetisation streams: if you’re relying on CPM, it will take a hell of a lot; if you’re relying on single high-paying affiliate commissions, probably not so many.

The most important by far, for me, is what time, on an ongoing basis, the site will eat up. As much as I love community-type sites, they take a bastard amount of TLC to get off the ground. With many projects on the go, you really need to do some time planning to make sure you’ve got enough spare (or can outsource) to see these things through. An early mistake I made was building loads of sites and not giving them the attention they needed to grow. You won’t get a second chance to impress a lot of visitors, so make sure you’ve got resources to spare to make it work first time round.

If, however, you spend a little more time, you’ll see there are loads of drag-and-drop projects that you can set up and leave running with no further time expenditure. Quick wins, like Google navigation queries (:

I hope these seeds give you some solid logic to build on. To be honest, I was going to do a top 5, but I’ve just moved house and I’m on “free city wifi” until I get broadband installed here. Unfortunately “free shitty wifi” would be more accurate, as I’m getting about 33.6kbps modem speeds (remember them??). Oh, I’ve also got some dinghies to inflate (:

Posted in Adsense, Advertising, Affiliate Marketing, Black Hat, Blogging, Community Sites, Google, Grey Hat, Marketing Insights, Paid Search, Research & Analytics, Search Engine Optimisation, Social Marketing, Splogs, Viral Marketing, White Hat, Yahoo | 7 Comments »

Getting Started: Making Money Online

Wednesday, August 1st, 2007

This is a jumbo post which I have contributed to Jon Waraas’ Blog so you’ll have to pop over there to read it. It’s a bit of a biggie (about 3,000 words).

I’ve also been working on that, as well as the next part of the Making Money With An Affiliate Empire series, so with a bit of luck that should be live by the end of the week.

I also have a special announcement later in the week, which you’ll like. That’s going to be first come, first served though :)

Posted in Adsense, Advertising, Affiliate Marketing, Black Hat, Blogging, Community Sites, Google, Grey Hat, Marketing Insights, Microsoft, Research & Analytics, Search Engine Optimisation, Social Marketing, Splogs, Viral Marketing, White Hat, Yahoo | 6 Comments »