
Archive for the 'Social Marketing' Category

How To Make Money With An Automated Blog & AutoStumble

Wednesday, August 13th, 2008

Welcome to another “how to” post. If you follow the recipe here, you’ll be onto Stage 3 = Profit in no time. This ties in quite nicely with the Blackhat SEO Tools post and the AutoStumble post, for those who haven’t read them. It’s a little blackhat, but nothing to lose sleep over (hah, as if!), and this is really, really, reaaallyy easy stuff. Sitting comfortably? Let us begin…

What is the end goal?

The end-game of this post is to have a fully automated blog which generates shitloads of traffic via StumbleUpon, referrals and Google Blogsearch. In the process, you’ll also gain loads of subscribers and generate some nice, easy revenue. Once it’s built, the entire thing is just about hands-free.

What you need before you begin…
To complete this project you will need:

1) Nice clean installation of WordPress

2) The Digerati Blackhat SEO Tool Set

3) A registered copy of AutoStumble

Let’s get started
I’m going to assume you know the basics of setting up a WordPress blog. If not, you can get more detail from Making Money With a Video Blog, or if you’re totally new, check out the official WordPress documentation. So yeah, if you’re that new, please RTFM.

Once you’ve got your WordPress blog installed and running, do the basics such as setting the permalinks to be the post title, so you get those little extra keywords in the URL. You’ll also need to find yourself a theme. As discussed in Making Money With a Video Blog, the layout is really, really important for getting clicks on your ads. You could start with a template like ProSense; I’ve found the click-through rate to be pretty low, but at least it’s quick. Ideally, have a hunt around for a theme that shows your content above the fold and centrally, with your ads nicely surrounded and blended in. The key here is to experiment and see what works well for you.

Plugins FTW
There’s a whole crapload of plugins that will make your life a lot easier. We’ll start off with the important one, FeedWordPress, which is included in the Digerati Blackhat SEO Tool Set if you don’t already have it.

Upload the feedwordpress folder, as usual, to your wp-content/plugins directory. You’ll also need to move the two files from the feedwordpress “Magpie” subfolder into your “wp-includes” directory, which will overwrite some default WordPress files. Don’t miss that step…

Once it’s installed and you’ve activated the plugin via your WordPress dashboard, you’ll have a new option on your main navigation.



Just like that. So give that a click and then go into the “Syndication Options” menu. From here you’ll be able to configure FeedWordPress to do your bidding.

You should get an option screen like this:


So let’s run through these options.

1) The first thing you want to change is the “Check For New Posts” option. You’ll want to set this to “automatic”. This will go sniff your RSS feeds at an interval you specify to grab new content. You can leave it on every 10 minutes for now.

2) Make sure the next three boxes are checked; this will keep your feed information bang up to date.

3) You should set syndicated posts to be published immediately. This will allow you to get your content live ASAP, which is always a plus.

4) Permalinks. This is basically: when somebody clicks on the post, do they go to the original website that you, er… borrowed the content from, or do they go to the scraped version on your site? For this example (which I’ll give the gonadless among you an ethical loophole for later), set it to “this website”.

5) I always set FeedWordpress to create new categories. I never display categories in the menu, but it gives the post a few more keywords and a bit more relevance for search. So, if someone else has gone to the effort of writing a tag, it would just be wasteful of you not to use it!

Okay, that’s set up… What exactly are we scraping?
To be honest, I’m not a big fan of scraping content that other people have sweated over. However, one thing I don’t mind doing is thieving from thieves.

You’re on the hunt for “disposable” content – generally not text-based. Think along the lines of Flash games, funny videos, funny pictures, hypnomagical-optical-illusions – that kind of thing. The Internet is awash with blogs that showcase this stuff. Check out Google Blogsearch and try a search like funny pictures blog. There’s hundreds of the leeching bastards showcasing other people’s pictures, videos, games and hypnomagical-optical-illusions for their website. They can hardly call it “their” content. With this ethical pebble tossed aside, we can go and grab some content.

There’s loads of ways you can hunt down potential content. You’re on the lookout for RSS feeds with this rich media. So you could try: Google Blogsearch, Technorati, MyBlogLog – basically any site that lets you search the blogosphere.

Once you’ve got the location of about a dozen or so RSS feeds, you can go to your Syndication menu again and “add a new syndicated site”. It’s a simple matter of pasting in the RSS feed location and hitting “syndicate”. Once you’ve added them all, hit “update”. Boom, shake the room, you’ve probably got a couple of hundred “new posts”.

New posts, no traffic
You’ll want, of course, to set up your WordPress RSS feed. Something like FeedBurner is dead easy to set up and will get Google interested off the bat. Make sure you have a nice big RSS button and offer e-mail subscription (FeedBurner does this) for those who don’t have a clue what the hell RSS is.

The cool thing about services like Google Blogsearch is that they’re pretty much chronologically sorted. So as long as you have a steady stream of posts, you’re guaranteed at least a trickle of traffic from long-tail searches.

Hot potato, grab and switch
If you really want to get some serious traffic, you’re going to need some “pillar” posts – content that you know for sure is strong. The easiest way to do this is to keep an eye on sites like Digg and Reddit. Check out what’s going hot on there, what’s new and what’s going viral. Probably the easiest thing to do is subscribe to the Digg Offbeat / Comedy RSS. This will give you constant updates on what’s upcoming.

Due to the differences in the types of people, there doesn’t tend to be as much overlap between hubs such as Digg, Reddit & StumbleUpon as you might first think. I’ve seen things go viral on Reddit and then take two or three days to make it onto the frontpage of Digg. So, you can grab content that’s going hot from one of these hubs – your proverbial “hot potato” – and put it in front of the nose of another audience.

Here’s where AutoStumble comes in
This is probably the easiest way to use AutoStumble. Grab your hot potato content from Digg and do a manual post on your blog. Submit this page to StumbleUpon.

AutoStumble costs £20 and is a desktop application which allows you to automatically pool hundreds of StumbleUpon votes with other users. I.e., this is your quick way of getting your content to go viral on StumbleUpon. If you purchase and download AutoStumble, it is simply a matter of pasting in the URL you want to go viral on StumbleUpon and hitting “AutoStumble”.

A few hundred votes later. Voila. You have traffic.

The value of StumbleUpon traffic
1) The most I’ve had is just over 70,000 unique visitors over a 3 day spike from StumbleUpon. So firstly, you can generate a fairly decent bit of green from your initial CPM ad impressions and clicks on things like Adsense. (StumbleUpon users don’t tend to be as picky about clicking on ads as Diggers).

2) With this volume of traffic, you’ll likely find a few people who really like your content. You’ll get RSS / Email subscribers who will be a permanent addition to your monthly traffic (and revenue).

3) A lot of these social sites are populated with pretty tech savvy people. A lot of these people run their own blogs, forums, websites – or at least add content somewhere themselves on the web. If you get 10,000 visitors from StumbleUpon, you can expect a decent amount of lovely natural links from around the web. Links mean better website authority, better rankings, better traffic and better revenue. The value for me at least, is really long-term.

Making things easy for yourself
You’ll probably want to install some extra plugins such as:

  • WordPress Automatic Update – This will update your WordPress installation as well as plugins. Generally, it will save you a lot of time.
  • Clean Archives Reloaded – I use this on my archive page. It’s a nice way to lay out all of your blog posts with clean anchor text, improving relevance with some internal linking.
  • Sitemap Generator – I don’t really bother with Sitemaps, but for those who do – saves you generating one from scratch.

Don’t forget, if you’re going to be switching content onto platforms like Digg or Reddit, make sure you have their native vote button included in the post! You want to make it as easy as possible to grab all of the votes you can. Again, personally – I don’t bother with the generic social bookmarking plugins for WordPress, as I find nobody actually seems to use them.

Oh, and before anyone chirps in trying to be clever, saying “(sniffle) won’t duplicate content be an issue?” No, it won’t, fucktard! Get back in your hole. Aside from the dupe content filters being primarily built on shit, you’ll be posting mostly rich media. Google’s not too great at working out the exact content of pictures and videos… yet. Yes, it will probably change one day in the future, and we’ll all look back on this post and laugh. At the moment, it’s not something they do well, so, well… ching, ching.

Taking it one step further
This whole project should take you less than 30 minutes, from sitting down at your computer to having a fully automated blog posting and promotion system set up. If you like the idea, it’s worth packaging everything I’ve mentioned here into your own custom install file, so you can deploy new sites in under 15 minutes.

If you’re going to do this, you may as well make your cookie-cutter solution as good as it can be. Hopefully, if you’re thinking down the right road, you can come up with some of your own ideas to improve on these techniques (there are loads).

Why not look at only showing social voting buttons from sites you know your visitors actually use? Here’s some code.
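
Something along these lines does the trick – a rough sketch for your theme’s single.php rather than a finished snippet. The submit URLs and button markup below are placeholders, so swap in whatever embed code Digg, Reddit and StumbleUpon actually give you:

<?php
// Rough sketch: only show the vote button for the network the visitor
// actually came from, based on the HTTP referrer. Drop into your theme's
// single.php inside The Loop. The submit URLs / markup are placeholders.
$referrer  = isset($_SERVER['HTTP_REFERER']) ? $_SERVER['HTTP_REFERER'] : '';
$host      = strtolower((string) parse_url($referrer, PHP_URL_HOST));
$permalink = urlencode(get_permalink());
$title     = urlencode(get_the_title());

if (strpos($host, 'digg.com') !== false) {
    echo '<a href="http://digg.com/submit?url=' . $permalink . '&amp;title=' . $title . '">Digg this</a>';
} elseif (strpos($host, 'reddit.com') !== false) {
    echo '<a href="http://reddit.com/submit?url=' . $permalink . '&amp;title=' . $title . '">Vote on Reddit</a>';
} elseif (strpos($host, 'stumbleupon.com') !== false) {
    echo '<a href="http://www.stumbleupon.com/submit?url=' . $permalink . '">Stumble it</a>';
}
// No recognised referrer? Show nothing and keep the page clean.
?>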

Enjoy.

Posted in Adsense, Advertising, Black Hat, Blogging, Google, Grey Hat, Search Engine Optimisation, Social Marketing, Splogs, Viral Marketing | 33 Comments »

Blackhat SEO Tools & Scripts – The Digerati Blackbox

Thursday, June 12th, 2008

Buenos días, friends!

I’ve put together a little treat for all of you budding and new blackhats out there. I got quite annoyed this week with the whitehattards on Sphinn.

Those of you who actually know me will know I believe whitehat stuff is very important to building a web business. However, I also believe there is a strong case for at least experimenting with gray/blackhat (whatever you want to call it). There are some markets you literally cannot touch without getting off your rainbow-shitting whitehat unicorn of light. Unfortunately, there’s a lot of, erm, “dedicated” whitehats out there that refuse to even learn what blackhat is. I’d like to take this opportunity to dispel some myths (AKA venting) about blackhat. For those of you who don’t enjoy reading pissed-off rants (I believe the whitehat word for pissed is “snarky” – thanks, Matt C.), feel free to skip down the page to the goodies.

Things that whitehattards believe to be true:

1. That “on page” SEO is some uber-skill which takes years to learn.

False. If you actually get a good web developer, the chances are he (or she!) will make a decent crawlable website. You might be able to help them out with some keyword research to help target title/header tags, or give them a little advice on PR sculpting with nofollow for large sites. Good internal linking structures are pretty well known – at least among the web developers I know. If any pure whitehat starts talking about precise keyword density, just laugh in their face.

2. The main thing about SEO is creating good content.
Good content gets links, yes. Well done. Why are you doing SEO when so many crimes are going unsolved around the world? Good content is important for a whitehat site, yes. However, good content is not bloody SEO! How do I know this? Would you bother writing good content if search engines didn’t exist? Yes, you would. Therefore it is actually a component of web design, not SEO!

3. There’s no point in blackhat, you’ll just get banned.
This little corker comes from two types of people: firstly, people who have never tried blackhat (glad they’re qualified to comment – why not go give a lecture on brain surgery while you’re at it?); or, secondly, people who have tried some very, very basic blackhat, done it badly and left footprints like a crack-addicted yeti storming around the web. I know of many blackhat sites that have enjoyed top positions for years without getting caught, for competitive key phrases those whitehats couldn’t touch with a NASA-sized hard drive full of great content.

4. I’m a good whitehat SEO because “I know” where to get links from
Aww now, c’mon. Not really a “core” SEO skill, is it? I’ll give you that it helps. I think what you’re trying to say is “I understand how the web works and where it is possible to drop links” or “I use social news/community sites”. I know people who have never built a link in their life and would make great whitehat SEO link builders, because they spend ages writing content for blogs and taking part in Digg, Reddit, Stumble, blaahh, blahhh. At best, it’s a transferable skill.

5. Blackhat SEOs only resort to blackhat because they can’t produce good websites
This one (which I saw several times on Sphinn) just leaves my jaw on the floor. Generally, blackhats are far more accomplished programmers than whitehats and can build much cleaner and more efficient websites if they wish (and a lot do). The fact is, through scripts and automation they’ve found a way to make a decent income without burning the midnight oil writing content about the new “diamond goat hoof jewellery” niche they’ve found. This comment normally comes from whitehats who wouldn’t know a blackhat if they spammed them in the face.

There is, however, advanced whitehat SEO, as Eli kindly demonstrates in his painfully bastardish, always-right way.

Ahem. Anyway…..

The Digerati Blackbox

So, I’ve collected together a set of tools, scripts, databases and tutorials which will help the beginner blackhat find their feet. Some of the stuff is pretty good, albeit fairly basic. You should be able to make something decent if you combine some of these scripts, or strip out some of the code into your own creations.

Blackbox Contents:

Cloaking & Content Generation:

cloakgen1.zip:
This is a cloak / dynamic content generation script. To use it, you simply add a small piece of code to the top of each page you wish to be cloaked. When someone accesses your page, cloakgen runs; if the user-agent suggests the visitor is a standard user, they are simply shown your standard page. However, if the user-agent suggests the visitor is a search engine, it starts doing the business. It first works out which page called it, then opens that page and finds the most common words on it. Once it has worked this out, it scrapes some content about that word from Wikipedia and adds it to your normal page content. Each keyword is emphasised in a random way – for example, the keyword could be bold, or in a red font, etc. The final page will be output in the following way:

Title of the page in capital letters
Large title at the top of the page
Content of the website with emphasis and wiki content
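
For the curious, the heart of a script like this is nothing more exotic than a user-agent check at the top of the page – everything else (the keyword counting and the Wikipedia scrape) hangs off that branch. Here’s a stripped-down sketch of just the detection part; buildStuffedPage() is a made-up placeholder for the generation step and the bot strings are only examples:

<?php
// Stripped-down sketch of the user-agent branch a cloakgen-style script
// hangs off. buildStuffedPage() is a made-up placeholder for the keyword
// counting / Wikipedia scraping step described above.
function visitor_is_search_engine() {
    $ua   = isset($_SERVER['HTTP_USER_AGENT']) ? strtolower($_SERVER['HTTP_USER_AGENT']) : '';
    $bots = array('googlebot', 'msnbot', 'slurp', 'teoma');   // example 2008-era crawler strings
    foreach ($bots as $bot) {
        if (strpos($ua, $bot) !== false) {
            return true;
        }
    }
    return false;
}

if (visitor_is_search_engine()) {
    echo buildStuffedPage(__FILE__);   // hypothetical generation step
    exit;
}
// Otherwise fall through and serve the normal page below.
?>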

padkit.zip:
PAD is the Portable Application Description format. It helps authors provide product descriptions and specifications to online sources in a standard way, using a standard data format that allows webmasters and program librarians to automate program listings. PAD saves time for both authors and webmasters. This is what you want to use with the databases below.

yacg.zip:
You should have heard of Yet Another Content Generator (YACG). It’s a beautifully easy way to get websites up and running in minutes with mashed up scraped content.

Databases:

articles.zip:
A database of 23,770 different articles on a variety of topics.

bashquotes.zip:
This is a huge database of every single quote on Bash.org as of May 1st, 2007!

KJV_bible.zip:
The whole King James Bible – Old & New Testaments.

medical_dictionary.sql.zip:
Over 130,000 rows of medical A-Z

Keyword Scripts:

ask-single-keyword-scraper.zip:
This script allows you to scrape a range of similar keywords to your original keyword from Ask.com.

google-single-keyword-scraper.zip:
This script will take a base keyword and then scrape similar keywords from Google (there’s a bare-bones sketch of this sort of scraper at the end of this section).

msn-live-api-scraper.zip:
This script uses PHP cURL to scrape search results from the MSN Live Search API.

overture-single-keyword-scraper.zip:
Enter one base keyword and scrape similar keywords from Overture.
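
To give you a feel for how simple these keyword scrapers are under the hood, here’s a bare-bones sketch along the same lines, pulling related phrases from the (unofficial) Google Suggest endpoint with cURL. The endpoint and the shape of its response are assumptions from memory, so treat it as illustrative rather than gospel:

<?php
// Bare-bones single-keyword scraper sketch: feed in one base keyword,
// get back a list of related phrases from the Google Suggest endpoint.
// The endpoint and response format are assumptions - adjust if they change.
function scrape_related_keywords($keyword) {
    $url = 'http://suggestqueries.google.com/complete/search?client=firefox&q='
         . urlencode($keyword);

    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0');  // look like a browser
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);
    $response = curl_exec($ch);
    curl_close($ch);

    if ($response === false) {
        return array();
    }
    // Expected response is JSON: ["base keyword", ["suggestion 1", "suggestion 2", ...]]
    $data = json_decode($response, true);
    return isset($data[1]) ? $data[1] : array();
}

print_r(scrape_related_keywords('blue widgets'));
?>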

Linkbuilding Scripts:

dity.zip:
A very easy to use (and old) multi guestbook spammer.

logscraper.zip:
Nifty little internal linker (read more about it here)

trackback.zip:
Very powerful trackback poster. Trackback Solution is 100% multithreaded and very efficient at automatically locating and posting trackback links on blogs.

xml-feed-link-builder-z.zip:
Very nice script to generate links to your site from people scraping RSS.

Misc Scripts:

alexa-rank-cheater1.zip:
Automate the false increase of your Alexa rating/rank.

typo-generator-esruns.zip:
Create typos of a competitive keyword and rank easily!
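
The typo generator idea is simple enough to sketch in a few lines: take a keyword and churn out the common fat-finger variations (dropped letters, swapped neighbours, doubled characters). Roughly:

<?php
// Quick typo-generator sketch: produce common misspellings of a keyword
// (dropped letters, transposed neighbours, doubled letters).
function generate_typos($keyword) {
    $typos = array();
    $len = strlen($keyword);

    for ($i = 0; $i < $len; $i++) {
        // Dropped letter: "casino" -> "csino"
        $typos[] = substr($keyword, 0, $i) . substr($keyword, $i + 1);

        // Doubled letter: "casino" -> "cassino"
        $typos[] = substr($keyword, 0, $i + 1) . $keyword[$i] . substr($keyword, $i + 1);

        // Transposed neighbours: "casino" -> "acsino"
        if ($i < $len - 1) {
            $swapped = $keyword;
            $swapped[$i] = $keyword[$i + 1];
            $swapped[$i + 1] = $keyword[$i];
            $typos[] = $swapped;
        }
    }
    return array_unique($typos);
}

print_r(generate_typos('casino'));
?>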

Scraping:

feedwordpress.0.993.zip:
WordPress plugin that makes scraping the easiest thing in the world.

Proxies:

proxy_url_maker.zip:
Create a list of web proxy URLs used for negative SEO purposes or spam.

proxygrabber.zip:
A script to download proxies from the samair proxy list site.

CAPTCHAs:

delicious.zip:
A breaker for the Delicious CAPTCHA, written in Python.

smfcaptchacrack.zip:
A Simple Machines Forum CAPTCHA breaker, compiled and designed to run on Linux but portable to Windows.

Tutorials:

curl_multi_example.zip:
What it says on the tin. Examples of m-m-m-multi cURL! (There’s a stripped-down sketch at the end of this section.)

superbasiccurl.zip:
4 super basic tutorials on using curl/regex.
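
If you’ve never touched curl_multi, the gist of those examples is this: register a handful of cURL handles, then let them all download in parallel instead of one after another. A minimal sketch (the URLs are obviously just placeholders):

<?php
// Minimal curl_multi sketch: fetch several URLs in parallel rather than
// looping over curl_exec() one at a time. URLs below are placeholders.
$urls = array(
    'http://www.example.com/',
    'http://www.example.org/',
    'http://www.example.net/',
);

$multi   = curl_multi_init();
$handles = array();

foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_TIMEOUT, 10);
    curl_multi_add_handle($multi, $ch);
    $handles[$url] = $ch;
}

// Pump the multi handle until every request has finished.
$running = 0;
do {
    curl_multi_exec($multi, $running);
    curl_multi_select($multi, 1);   // wait for activity instead of busy-looping
} while ($running > 0);

// Collect results and clean up.
foreach ($handles as $url => $ch) {
    $html = curl_multi_getcontent($ch);
    echo $url . ' returned ' . strlen($html) . " bytes\n";
    curl_multi_remove_handle($multi, $ch);
    curl_close($ch);
}
curl_multi_close($multi);
?>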

I’d like to give special thanks to everyone who donated and contributed their stuff here:

Steve – For the majority of the scripting here.
Rob – For the databases.
Eli – For the Delicious CAPTCHA breaker.
Rob – For the trackback magic.
Harry – For the proxygrabber and Linux CAPTCHA scripts.

Here it is:

blackhat seo tools
Download Digerati Blackbox Toolkit (51.4Mb)



Disclaimer: I’m not offering support on any of these tools or scripts, although I might do a couple of tutorial posts on how to use them. So don’t ask me how to use them; check out the respective author’s website if you get stuck. Obviously, neither Digerati Marketing Ltd, I, my dog, nor anyone else can be held responsible for any type of loss or damages of any kind (even an act of God Google) if you choose to use them. At your own risk, blah blah blah. Zzzzzz. Enjoy.

Posted in Black Hat, Grey Hat, Marketing Insights, Research & Analytics, Search Engine Optimisation, Social Marketing, Splogs, White Hat | 64 Comments »

Last Chance For £10 – AutoStumble

Monday, June 9th, 2008

I’ve been really happy with the progress of AutoStumble. At the time of writing, we’ve got over 300 users and the network has produced 38,734 stumble votes!

For those of you who haven’t seen it before, check out my original post on AutoStumble. This should give you a pretty good idea of how it works.

In a nutshell, for those who can’t be bothered to read it – AutoStumble is a desktop application whereby you are linked up with everybody else running the program. You enter the URL you wish to promote on StumbleUpon and it is then put into a “pool”, which is automatically and randomly Stumbled (voted on) by other users. To cover any patterns, we also introduce random URLs that will be added to everybody’s list, so it would be very, very difficult for the activity to be spotted algorithmically (or by hand, for that matter!).

The end result is that you get dozens (or hundreds, depending how long you leave your URL in for) of StumbleUpon votes for your URL, which gives you a great head start to make your content go hot on StumbleUpon.

No, it doesn’t work by magic fairy dust, so you do actually need decent content to make it go fully viral on Stumble. Your page trying to flog Viagra, or your boring analytical flow-diagram of your daily food intake, isn’t going to take off. If you’ve got content with viral potential, however, AutoStumble is the perfect way to get you up those first difficult rungs of the social ladder.

Since we’ve got a lot of network users now and the network is running smoothly, I’m going to jack the price up to £20, being the horrible businessman I am.

So if you want to grab AutoStumble before the price doubles – you’ve got until Wednesday afternoon (GMT).

Or you can read the reviews.

Posted in Digerati News, Social Marketing | 19 Comments »

SEO Ranking Factors

Saturday, May 31st, 2008

Right, let’s kick this thing in the nuts. Wouldn’t it be great if you could have a decent list of SEO Ranking Factors and, more specifically, be told exactly what you need to rank for a key phrase?

Well, SEOMoz went and did this.

You’ve probably all seen it before: the famous SEOMoz Search Ranking Factors, the highly regarded opinions of 37 leaders in search spread over a bunch of questions. It sounds slick, it looks cool and it’s a great introduction to SEO. There is, however, a rather major problem. None of them pissing agree! 37 leaders in search, closed-ended questions, yet almost ALL of the answers have only “average agreement”. Just look at the pie charts at the end; there is massive dispute over the correct answers.

I find this interesting. It leaves two possibilities:

1) SEOMoz’s questions are flawed and there is no “correct” answer – this kind of kills the whole point of the project.

2) If there is a “correct” answer, then it would seem that 25%-50% of “leading people in search” don’t know WTF they are talking about.

Now, before I continue: I’m not going to claim I have all the answers – far, far from it. I do some stuff and that stuff works well for me. The other thing I would like to point out is that I actually really like the SEOMoz blog and I think they provide extremely high-quality content at high frequency, which is bloody hard to do. So please, no flaming when I seem to be bashing their hard work; I’m simply pointing out a few things rather crudely. Oh, and they’re nice people too – Jane is very polite when I stalk her on Facebook IM.

Anyway, back to slating. I think it is very hard to give quality answers to questions such as: how does page update frequency affect ranking? From my experience, I’ve found Google quite adaptive in knowing, based on my search query, whether it should serve me a “fresh” page or one that’s collecting dust. Eli from BlueHatSEO has also made some convincing arguments that the “optimum” update frequency of a page depends on your sector/niche.

Also, these things change. Regularly. Those clever beardies at Google are playing with those knobs and dials all the time. Bastards.

Okay, I now hate you for slating SEOMoz. Do you have anything useful to say?
Maybe? Maybe not. As I mentioned in my last post, I’m going to talk about some projects I’m working on at the moment and one of these is specifically aimed at getting some SEO Ranking Factors answers.

I could of course just give what I believe to be the “correct” answers to the SEO Ranking Factors questions, but like everyone else, I’d be limited to my own SEO experience. We need more data, more testing, more evidence.

There’s loads of little tools floating around the net that will tell you little things like whether you have duplicate meta descriptions, your “keyword density” (hah), how many links you have – all that stuff. Then you’ll get some really helpful advice like “ShitBOT has detected your keyword only 3.22% on this page, you should mention your keyword 4.292255% for optimum Googleness”. Yes, well. Time to fuck off, ShitBOT. These tools are kind of fragmented across the net, so it would take ages to run all 101 of them to build up a complete “profile” of your website, which really… wouldn’t tell you all that much. It wouldn’t tell you much because you’re only looking at your own website, your own ripples in the pond. You need to zoom out a bit: get in a ship and sail back a bit, then maybe put your ship in a shuttle and blast off until you can see the entire ocean.

Well, crap. It all looks different from here..

Creating a Technological Terror
I can’t do this project alone. Fortunately, one of the smartest SEO people I know moved all the way across the country to my fine city and is going to help.

Here we go….

1) Enter the keyword you would like to rank for.

2) We will grab the top 50 sites in Google for this search term.

2) i) First of all, we will do a basic profile of these sites – very similar to, but in a bit more depth than, the data SEOQuake will give you. So: things like domain age, number of sites linking to the domain, how these links are spread within the site, page titles, amount of content, update frequency, PageRank, etc. We’ll also dig a bit deeper and take titles and content from the pages that rank for these key phrases and store them for later.

2) ii) The real work begins here. For each one of these sites that rank, we are going to look at the second tier, which I don’t see many people doing. We are going to analyse all of the types of sites that link to these sites that rank well. This will involve doing the basics, such as looking at their vital stats: their PR, links, age of domain, TLD and indexed pages.

Then we’re going to take this a step further. We are going to be scanning for footprints to work out the type of link. This means: is it an image link? Is it a link from a known social news site like Digg or Reddit? Is it a link from a social bookmarking site like StumbleUpon or Delicious? Is it a link from a blog? Is it a link from a forum? A known news site? Is it a link from a generic content page? If so, let’s use some language processing and try to determine if it’s a link from a related content page, or a random ringtones page. Cache all of this data.
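
To make that concrete: the footprint detection doesn’t need to be clever to be useful. A first pass can be as dumb as matching the linking page’s host and markup against a list of known patterns – something like this rough sketch (the patterns here are examples, not the actual list we’ll use):

<?php
// Rough sketch of first-pass link-type classification by footprint.
// $linking_url is a page that links to the site we're profiling; $html is
// its fetched source. Patterns below are examples, not an exhaustive list.
function classify_link_source($linking_url, $html) {
    $host = strtolower((string) parse_url($linking_url, PHP_URL_HOST));

    if (strpos($host, 'digg.com') !== false || strpos($host, 'reddit.com') !== false) {
        return 'social news';
    }
    if (strpos($host, 'stumbleupon.com') !== false || strpos($host, 'del.icio.us') !== false) {
        return 'social bookmarking';
    }
    // Generator meta tags and platform paths give blogs and forums away.
    if (stripos($html, 'wp-content') !== false || stripos($html, 'generator" content="WordPress') !== false) {
        return 'blog';
    }
    if (stripos($html, 'phpBB') !== false || stripos($html, 'Simple Machines') !== false) {
        return 'forum';
    }
    return 'generic content page';   // hand off to language processing from here
}
?>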

3) We have a huge amount of data now; we need to process it. Take the sites ranking for the key term “casino”: let’s put them onto a graph showing their actual ranking for this key term vs. their on-page vital stats. Let’s see the ranking vs. the types of links they have. Let’s see how the sites rank vs. the number of links, the age of links, etc., etc…


4) We can take this processing to any level needed. Let’s pool together all the data we have for the 50 sites and take averages. What do they have in common for this search term? Are these common ranking factors shared between totally different niches and keywords?

This is the type of information that I think I know. I think it would be valuable to actually know the information I think I know (=

So I guess you can expect a lot of playing with the Google Charts API, scatter graphs showing link velocity against domain age and total links and all that shit.
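
As a taster, the image-based Charts API is just a URL you build up, so plotting, say, total links against ranking position takes a few lines once the data is scaled to 0–100. A quick sketch with dummy numbers (parameter names are from memory of the old image charts API, so check them against Google’s docs):

<?php
// Quick sketch: build a Google image-chart URL for a scatter plot of
// ranking position (x) vs. total links (y), both scaled to 0-100.
// Data below is made up; parameter names are from memory of the image API.
$positions = array(1, 5, 10, 20, 35, 50);             // ranking positions
$links     = array(12000, 8000, 3000, 900, 400, 150); // total links per site

$x = array();
$y = array();
$max_links = max($links);
foreach ($positions as $i => $pos) {
    $x[] = round(($pos / 50) * 100);                  // scale positions onto 0-100
    $y[] = round(($links[$i] / $max_links) * 100);    // scale link counts onto 0-100
}

$chart_url = 'http://chart.apis.google.com/chart'
           . '?cht=s'                                  // scatter chart
           . '&chs=400x300'                            // image size
           . '&chxt=x,y'                               // show both axes
           . '&chd=t:' . implode(',', $x) . '|' . implode(',', $y);

echo '<img src="' . htmlspecialchars($chart_url) . '" alt="links vs ranking" />';
?>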


You get the idea.

There’s actually all kinds of other secondary analysis that can be pumped into this data. For instance, even though it’s a kind of made-up term, I think “TrustRank” has some sauce behind it. (There’s a good PDF on TrustRank here.) Let’s think of it in very, very simple, non-mathematical terms for a moment.

One fairly basic rule of thumb for the web is that a trusted (“good”) site will generally not link to a “bad” (spam, malware, crap) site. It makes sense: generally, very high-quality websites vet the other sites that they link to. So it makes sense that Google selects a number of “seed” sites and gives them a special bit of “trust” juice, which says that whatever site this one links to is very likely to be of good quality. This trend continues down the chain, but obviously the further down the chain you get, the more likely it is that this rule will be broken and someone (maybe even accidentally) will link to what Google considers a “bad” website. For this reason, the (and I use this terminology loosely) “Trust” that is passed on will be dampened at each tier. This allows a margin for calculated error, so even if the chain is, in essence, broken, the algorithm maintains its quality, because it allows for this.

I think most people could name some big, trusted websites. Why not take time to research these sites – really trusted authority sites, ones that it’s at least a fair bet have some of this magical Trust? Say we have a list of ten of these sites; why not crawl them and get a list of every URL that they link to? Why not then crawl all of these URLs and get a list of all the sites THEY link to? Why not grab the first 3 or 4 “tiers” of sites? Great, now you’ve probably got a few million URLs. Why not let Google help us? Let’s query these URLs against the keywords we’re targeting. What you’re left with is a list of pages from (hopefully) trusted domains that are related to your niche. The holy grail of whitehat link building. Now pester them like a bastard for links! Offer content, blowjobs, whatever it takes!
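
The crawl itself is the easy part – it’s just a breadth-first walk out from the seed list, tagging each URL with how many hops it sits from a seed, so you can dampen your confidence the further out you go. A cut-down sketch, minus the politeness, robots.txt handling and per-domain deduping a real crawler needs:

<?php
// Cut-down sketch: breadth-first crawl out from trusted seed sites,
// recording how many "tiers" each discovered URL is from a seed.
// No throttling, robots.txt handling or deduping by domain - add those.
function crawl_trust_tiers(array $seeds, $max_tiers = 3) {
    $seen  = array();
    $queue = array();
    foreach ($seeds as $seed) {
        $queue[] = array($seed, 0);   // (url, tier)
    }

    while ($queue) {
        list($url, $tier) = array_shift($queue);
        if (isset($seen[$url]) || $tier > $max_tiers) {
            continue;
        }
        if (count($seen) > 5000) {
            break;   // crude safety cap for the sketch
        }
        $seen[$url] = $tier;

        $html = @file_get_contents($url);
        if ($html === false) {
            continue;
        }
        // Pull out absolute links; crude, but fine for a sketch.
        if (preg_match_all('/href="(https?:\/\/[^"]+)"/i', $html, $matches)) {
            foreach ($matches[1] as $link) {
                $queue[] = array($link, $tier + 1);
            }
        }
    }
    return $seen;   // url => tier; lower tier = closer to a trusted seed
}

// Usage: seed with a handful of sites you're confident carry trust,
// then filter the result against your target keywords via Google.
$tiers = crawl_trust_tiers(array('http://www.example.edu/'), 2);
?>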

Wouldn’t it be interesting if we took this list of possible Trusted sites and tied this theory in with how many of our tendrils of trusted networks link to our high-ranking pages? There’s a lot of possibilities here.

This project will be taking up a significant chunk of my time over the next few months. Maybe the data will be shit and we won’t find any patterns and it will be a giant waste of time. At least then I can say with confidence that SEO actually is just the charm-clasping, pointy-hat-wearing, pole-chanting black art that so many businesses seem to think it is. At least I’ll be one step closer to finding out.

Apologies once again to SEOMoz if you took offense. I love you x

Posted in Blogging, Google, Marketing Insights, Research & Analytics, Search Engine Optimisation, Social Marketing, White Hat | 10 Comments »

I ask you.

Tuesday, May 27th, 2008

This blog is changing focus. I haven’t posted since I returned from Tulum (yes, I know I said Cancun, but it turns out I don’t listen as well as I should) because I’ve stopped myself from doing so. There’s been a lot of things I’ve wanted to talk about, such as Lyndon’s run-in with Google over hoax linkbait, Google really getting to grips with forms, big sites that are cloaking and even geo-hashing. It’s like a trap for me to fall into, seeing all these opinions flying around and wanting to throw my 2 cents in. Of course, I have opinions on all of these topics, but one of my only objectives when starting this blog was to keep every post as informative as possible and try to dig up some strategies, techniques, theories or research that isn’t in a million other places. There’s an incredibly annoying echo effect on my RSS reader as these stories reverberate around the (gag) “blogosphere”. So, rather than post what everyone else is posting, I’d rather post nothing at all, so when you are here, hopefully you’ll get something really… nice.

That being said, any decent-sized post I do takes around 4 hours in total, as I try to decode the gibberish noise in my head into something tangible enough to put on display. The process helps me organise my thoughts; however, it is time-consuming and at the moment I’m incredibly time-starved.

At the moment, my time is broken down between a few major projects:

1) A massive (as of yet unnamed) project to analyse search ranking factors

2) The further development of my currently released SEO Tools

3) The growth and refinement of AutoStumble (now over 300 users!)

4) I’ve also just started work on a niche, white-hat community site which will need a lot of attention.

5) Various other websites/maintaining current web property

The change of focus for this blog is going to be looking more closely at the SEO, programming, technical and marketing principles behind these projects – which will benefit everyone, as once again blogging will become “integrated” with what I do, and there’s still a lot of valuable stuff to share.

I ask you.
You’ve taken the time to subscribe to (or at least visit) my blog, so thank you. If you have a specific topic you’d like to see me write about, let me know and I’ll see what I can do. Would you guys like these single, sporadic and very detailed posts on more advanced SEO concepts on their own? Or would you like “lighter” reading with insights into current issues as well? Answers on a postcard, or in the comment box – whichever is easiest for you. I’m going to be checking out the sites and blogs of everyone who comments here, so I can see who’s really reading this stuff :)

I’ll shortly be posting detailed overviews of the above projects.

Posted in Blogging, Community Sites, Digerati News, Marketing Insights, Search Engine Optimisation, Social Marketing | 9 Comments »