
How does Google determine domain age?

This week I’ve given some thought to how Google determines domain age. I recently picked up a dropped domain that had been registered since 2007. I was feeling quite chuffed about acquiring an aged domain, but then I gave some thought to how Google would determine the age of a domain, and perhaps my acquisition wasn’t as good as it first seemed.

Market Samurai, an SEO tool, has a feature entitled ‘SEO Competition’. It essentially collects data on a variety of on-page and off-page factors, allowing you to quickly assess how easy it is to target a niche. One of the factors it collects data for is domain age. Market Samurai uses the earliest record in archive.org to suggest how old a domain is.
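Out of curiosity, that archive.org check is easy to replicate. Below is a minimal sketch in Python using the Wayback Machine’s CDX API (the endpoint and parameters are my assumption here, not anything Market Samurai documents) to fetch the year of the earliest snapshot on record for a domain:

# Minimal sketch: ask the Wayback Machine's CDX API (assumed endpoint
# and parameters) for the earliest capture of a domain.
import json
import urllib.request

def earliest_capture_year(domain):
    # limit=1 returns only the oldest capture on record
    url = ("http://web.archive.org/cdx/search/cdx"
           "?url=%s&output=json&limit=1" % domain)
    with urllib.request.urlopen(url) as response:
        rows = json.load(response)
    if len(rows) < 2:  # row 0 is a header; anything less means no captures
        return None
    timestamp = rows[1][1]  # e.g. '20070101123456'
    return int(timestamp[:4])

print(earliest_capture_year("bowdeni.com"))

Of course, Google almost certainly has better signals than an archive.org snapshot (its own crawl history, and presumably WHOIS registration data), which is exactly why a dropped domain’s archive.org age may flatter it.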

Continue Reading


Five reasons why your SEO agency sucks

I’ve been working on one client’s account and encountered some consequences of the previous SEO agency’s work. The agency in question has actually won awards, which comes as a great surprise given that the quality of the SEO and the value generated for the client are questionable. Occasionally I get contacted by other SEO companies proposing to outsource work to me, but all too often it’s the same old story: poor SEO practices. I therefore thought I’d take some time to compile some of the most common characteristics of poor SEO agencies.


1) They do blackhat SEO
Some of the other reasons on this list stem from the dysfunctions of practising blackhat SEO, but it warrants a point of its own. There is a time and a place for blackhat SEO, but client accounts aren’t one of them. With the size of the accounts I work on, there are various stakeholder interests to take into consideration, and the risks of a strategy featuring blackhat far, far outweigh the advantages. The consequence of a major client being penalised is that people could lose their jobs and investors could lose capital. These are not risks that can be taken, so it’s essential that strategies are white hat.


2) … and they do it badly

Not only do they do blackhat, they do it badly. Your inbound link portfolio consists largely of links placed with no context, with exact-match anchor text, on high-PR but low domain authority pages. They pay no regard to maintaining an organic-looking anchor text distribution, nor a natural PR distribution of links. Probably all of your keyphrase-targeted links appear on pages of PR3 and above, setting off alarm bells at Google.

You’re likely to have a range of links on penalised pages too, lowering your domain authority and link reputation.


3) They will turn off all your links when you switch agency
Unless you plan to stay with your SEO agency for the rest of time, one day you won’t renew your contract, and when that day comes your agency will stop paying for all the links that have been propping up your rankings. You’ll lose your rankings, you’ll lose your links, and you’ll be back to square one. At no point are you building any sustainable competitive advantage. Indeed, you’re only building up barriers to switching supplier, which in turn makes you more dependent on the agency and places you in a worse position to negotiate new terms.


4) They target vanity terms with no consideration of ROI
It’s common for management to take great pleasure in vanity terms, and sometimes as an agency you have to fulfil this. However, it’s the responsibility of the SEO agency to convey the importance of running an ROI-focused campaign that delivers returns, not boasting material for management. SEO isn’t just about attaining high positions in Google, but high positions that deliver a return on investment. 80% of traffic can come from the long tail, and ignoring this can be a huge mistake too. I’d always recommend tracking search traffic through to sales, so you can see which keywords are delivering sales, and target those accordingly.


5) They undervalue good on-page optimisation
It’s true that the Google algorithm heavily weights inbound links in determining rankings, but on-page optimisation should not be neglected. As previously stated, 80% of traffic is generally derived from the long tail, and accordingly on-page work retains great importance. Getting a well-built, indexable website is crucial, and an approach to content creation that gives consideration to semantics allows you to far better target the long tail; there’s a sketch of what I mean below.
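The sketch below is a hypothetical product page (not any client’s markup): headings and copy that use natural variations of the head term pick up long-tail queries almost as a side effect.

<!-- Hypothetical product page: natural variations of the head term
     in headings and copy help capture long-tail searches -->
<h1>Running shoes</h1>
<h2>Lightweight running shoes for beginners</h2>
<p>Our most popular trainers for road running, trail running and marathon training.</p>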

No doubt I’ll keep coming up with reasons why your SEO agency sucks, but this list will do for now. If you’re looking for an SEO agency, make sure you’re satisfied on these points before you pick one.


RE: Is social media the future of SEO?

For a few months now I have been subscribed to Web Designer magazine, and generally it’s a good read. The current month’s issue was of great interest to me, covering jQuery and featuring an article titled ‘Is social media the future of SEO?’.

As an SEO executive, my view of the article is probably a little biased; however, I felt the author either got a couple of points wrong or didn’t express himself well. I’d therefore like to take the opportunity to give my thoughts on a couple of those points.

The author describes the rise of social media as having two key impacts where SEO is concerned. The first is that users spend more time on these channels and so, if I follow the author’s somewhat ambiguous logic, are less likely to search for things using search engines. The second is that on Twitter and Facebook users tend to share more material. This is true, but the author fails to mention what is probably the biggest challenge social media presents to SEO.

With more discussion and sharing being conducted on social networks, and less on blogs and general websites, Google will be more inclined to adopt social networks as reference points in its algorithm. Currently, professionals use link building as a strong arm of SEO strategy, but if the weighting of reference points shifts dramatically towards social networks, where linking is more sporadic, organic and harder to replicate, then that does indeed represent a challenge to SEO.

Later, writing about personalized search, the author states:

How can the SEO of today deal with this? It simply can’t.

My rejection of this argument is twofold. Firstly, the notion of SEO as a marketing channel comprising a static, never-changing strategy is nonsense. SEO has continuously changed in reaction to modifications to the Google algorithm, and the introduction of personalized search is no different. Indeed, SEO strategies have already been devised to leverage personalized search to people’s advantage, for instance encouraging users to search for brands.

Secondly, the author’s perception of personalized search is overstated. Our tests of personalized search, along with wider commentary, provide evidence that previous search behaviour affects rankings only minimally. Even then, it is limited to markets the user has previously searched within. If someone decides to purchase a desktop computer and hasn’t made any computer-related searches in the past, how does Google personalize those results? Unless Google starts generating search results largely on preference, the current ‘traditional algorithm’ will still be hugely influential.

It’s probably the outright assertion that SEO can’t react to personalized search that leaves the author open to criticism. I agree that personalized search presents new challenges, but it also provides new opportunities to gain competitive advantage. The conclusion of the article is good: SEO has to give consideration to social media. But the general theme that social media is increasing in importance at the expense of SEO is, I believe, unfounded.

Oh, and the irony that I had to resort to social media in order to bring this to the attention of the author is not lost on me. 😀



Is Meta data still important in the Google algorithm?

You could almost be forgiven for thinking that in this day and age Meta data has no real importance. Ever since Matt Cutts confirmed that Google no longer uses Meta data directly in the algorithm, many webmasters have started to neglect it, in particular Meta descriptions. However, I’m going to present the case that Meta descriptions still matter a great deal to your Google rankings.

Well-written and accurate Meta descriptions and titles can seriously improve CTR, and in turn an improved CTR can improve your rankings. The slide below is from an SEOmoz presentation by Rand Fishkin.

In a survey of 72 SEO professionals, traffic and CTR were deemed, on average, to carry a 6.29% weighting in the Google algorithm, so indirectly Meta data still has importance for rankings. You have to take this with a pinch of salt, especially since I don’t know how knowledgeable these SEO professionals are, nor how they derived their weightings. In addition, the figure suffers from the usual problems of a simple mean; the mode and median would be interesting to see.

Regardless, there is still a consensus that CTR remains an important enough factor in SEO to put effort into Meta data. For those looking to improve their capabilities in writing Meta data, I’d recommend this article.
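To make this concrete, here’s the sort of thing I mean: a hypothetical snippet for one of my own posts, with the title and Meta description written to earn the click rather than to stuff keywords.

<!-- Hypothetical example: title and Meta description written for the
     searcher, to improve CTR, rather than stuffed with keywords -->
<title>Five reasons why your SEO agency sucks | bowdeni.com</title>
<meta name="description" content="Award-winning agency, questionable results? Five warning signs of a poor SEO agency, and what to check before you sign a contract." />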


Deciding whether or not to use WWW in a URL

Over the past couple of days I’ve been deciding how to structure my blog URL. Prior to today I had two versions of my blog:

http://bowdeni.com
http://www.bowdeni.com

I obviously have just one blog, but to search engines these are two different sites. This presents problems for duplicate content and for link building. On the former, search engines don’t like duplicate content, and with two variants of the site, it’s essentially duplicated perfectly. In terms of link building, you could dilute your efforts and fail to benefit fully from genuine links if they point at the two different versions. I therefore set out to build a rationale for picking one of the two to keep.

Advantages of the www variant:

  • Branding. A lot of web users have come to expect www as the prefix to a website, and will include the www subdomain when entering a URL into the address bar.
  • Natural links. Usually, when people link to you organically without being paid to do so (shock horror!), they’ll include the www subdomain.

Advantages of the non-www variant:

  • Link juice. My understanding of the search algorithms is that if you receive a link to the non-www version of your site, Google will automatically credit the link equity to the www version too.
  • Fewer characters. Omitting the www. subdomain shortens your links by four characters. On social networks, especially Twitter where you have a limited number of characters, this frees up more characters for tweeting you!

With this in mind, I chose http://bowdeni.com. The rationale for this choice was that the nature of my content means it’s likely to be linked to on social networks. I’m at an early stage of running this blog and I believe the trend will be towards the non-www variant. I therefore added a canonical tag to the header of the blog as follows:

<link rel="canonical" href="http://bowdeni.com/" />

When search engines read this canonical tag, any links to the www variant will have their link equity credited to the non-www variant. There is, however, a price: it seems that Google reduces the link equity passed through a canonical tag, to about the same degree as a 301 redirect. To make everything watertight, I’ve 301’d the www variant to the non-www variant anyway!
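For reference, the 301 itself is only a couple of lines of server configuration. A rough sketch, assuming an Apache server with mod_rewrite enabled (your hosting setup may differ):

# .htaccess sketch: permanently redirect the www variant to non-www
# (assumes Apache with mod_rewrite enabled)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.bowdeni\.com$ [NC]
RewriteRule ^(.*)$ http://bowdeni.com/$1 [R=301,L]

The R=301 flag is what marks the redirect as permanent, which is the signal search engines need to pass link equity to the new URL.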

If I had a giant website such as the BBC, where people write out the www variant in their links and there is already a huge portfolio of links, then I’d keep the www variant. However, my blog has no such portfolio of links (sad), and accordingly I can exert a bit more control over the link building.

I must confess that I had done a tiny bit of link building to my blog with the www variant as the target URL, so this very moment I’m going to change all those backlinks where possible, to ensure I don’t lose any link juice through the canonical tag or 301 redirect!


Campaign to include the Mars drink in the Boots meal deal

I sent this email to Boots today.

Dear Boots,

I’m as big a fan of the Boots meal deal as anyone, but I feel, as both a shareholder and a customer, that sustainable competitive advantage at a corporate level could be obtained by including the Mars drink in the Boots meal deal.

May I remind your board of directors that they must act in the interest of shareholders, and the decision to include the Mars drink (preferably the thick variety) would deliver unprecedented added value.

Continue Reading


8 techniques for link building (Part One)

SEOmoz recently ran a webinar on link building, outlining eight major methods. I thought I’d write a bit about them and add my experience of using each. This is a two-part post; the other four methods will follow soon.

1. Manual Link Submissions/Requests

This technique refers to approaching relevant sites and either attempting to leave a link or asking the webmaster for one. I have used this technique a great deal but haven’t gained many high-quality links with it. It’s a pretty tedious method of link building too.

Continue Reading


Learning some CSS

One of the books that I received for Christmas was CSS: The Missing Manual. I did know some CSS, having built a 200-page medical website with it, but my knowledge was rusty. This book is an excellent resource for CSS, whether you are learning it from scratch or already have a little knowledge.

Working in search marketing, it’s my aim to become fluent in HTML/CSS. Having witnessed first-hand the massive impact landing pages have on converting traffic, I’d like to position myself to be able to create and amend client landing pages. Using Google’s Website Optimizer tool, I’ll be able to split test different landing pages and not only deliver beautiful traffic, but also spoon-feed visitors into making purchases :D.

Knowledge of CSS will additionally be useful for my own projects that I’d like to develop, though what I really need is a book on finding the time!

