
An Unexpected Ranking Signal – _trackPageview

Here at Arena Quantum we like to do multi-click attribution, providing insight into the true value of generic keywords. To do this, we require an ad server. Through a mistake made while implementing tracking, we discovered a very unlikely ranking signal we had not considered before: evidence suggesting that Google treats the URL you specify in _trackPageview in the same way as a canonical tag.

For example, we placed the following code on superwidgets.com/redwidgets. Note that the domain we placed it on differs from the URL we wanted to track it as.
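The code would have looked something like the classic asynchronous Google Analytics snippet below; the account ID and the tracked URL are illustrative stand-ins rather than the real values:

    <script type="text/javascript">
      var _gaq = _gaq || [];
      _gaq.push(['_setAccount', 'UA-XXXXXXX-1']);
      // The pageview is tracked as a URL on an entirely different domain we own.
      _gaq.push(['_trackPageview', 'http://www.cheapwidgets.com/redwidgets']);

      (function () {
        var ga = document.createElement('script');
        ga.type = 'text/javascript';
        ga.async = true;
        ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
        document.getElementsByTagName('head')[0].appendChild(ga);
      })();
    </script>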

In this example, we also own cheapwidgets.com. Only when examining the inbound links reported by Google Webmaster Tools did I notice a reported link from cheapwidgets.com.

I have scoured the web page on superwidgets.com and the only reference to cheapwidgets.com is in the tracking code. Therefore it looks like _trackPageview can act like a cross-domain canonical. The key takeaway? Double-check your tracking code to make sure you aren’t leaking any link juice; a quick sketch for doing so follows.
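As a quick way of making that check, here’s a minimal sketch (a hypothetical helper, not part of any particular toolset) that flags _trackPageview calls whose argument is a full URL on a domain other than your own:

    // Find _trackPageview arguments that point at another domain.
    // Matches both pageTracker._trackPageview('...') and _gaq.push(['_trackPageview', '...']).
    function findForeignTrackPageviews(html, ownDomain) {
      var re = /_trackPageview['"]?\s*[,(]\s*['"]([^'"]+)['"]/g;
      var foreign = [];
      var match;
      while ((match = re.exec(html)) !== null) {
        var url = match[1];
        if (/^https?:\/\//.test(url) && url.indexOf(ownDomain) === -1) {
          foreign.push(url);
        }
      }
      return foreign;
    }

    // Example: returns ['http://www.cheapwidgets.com/redwidgets'] for the snippet above.
    // findForeignTrackPageviews(pageHtml, 'superwidgets.com');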


Evidence of Twitter Influence on UK Financial SERPs

Historically, Google has used links as a proxy to determine the most relevant and authoritative websites to return for a user’s search query. In late 2010, Google and Bing confirmed that they do indeed now use social signals as a ranking factor, but only now are those in the SEO community starting to identify case studies where social signals have a clear influence on search results.

A new case study can be added: Money Supermarket. Between 10th January and 16th January, Money Supermarket held a free prize draw. Users had to retweet a message (see below) containing a link to the car insurance product page for a chance of winning a year’s free car insurance.

[Image: Money Supermarket tweet]

I believe this generated around 2,500 retweets over the seven days. The impact it had on Money Supermarket’s ranking for the search query ‘car insurance’ is most interesting.

Between 21st September and 11th January, Money Supermarket held an average position of 6th for the search query ‘car insurance’. During this time, their best position was 4th, held for just a couple of days, while their lowest was 9th. Just two days after the competition ended, Money Supermarket started ranking 1st.

[Image: rankings graph]

This 1st place ranking was held until 14th March, when Money Supermarket dropped back down to 3rd. How did Money Supermarket react to the drop? With another Twitter competition, running from 14th to 20th March. Money Supermarket are now ranking 2nd, and I’ll update this post after the competition has ended.

As always, correlation does not necessarily imply causation. There may have been other signals playing an influential role, but this case study certainly adds further evidence of the importance of social signals.




5 Predictions for the Google algorithm in 2011

It seems like a common thing to do, so I thought I’d lay down some predictions for 2011. One might argue that my predictions are based on optimism more than anything else: I believe December 2009 – June 2010 was a good period for the Google algorithm, but the second half of 2010 was disappointing.

Google made changes to which page on a domain it ranked for a search query. For instance, Google started to display a more relevant deep page when it was the home page on the domain that was actually ranking. Unfortunately, it seems it also did the reverse: in UK financial SERPs, the home page of insurance websites appeared in search results when it was actually the deep page that was ranking. This had a few consequences.

Many in the SEO community pulled the backlink profiles of the pages displayed in the SERPs and were amazed to find natural link profiles, mainly composed of brand anchor text on inbound links. It was proclaimed that Google had made great strides against web spam, and that capitalising on brand was now important. Alas not: it was the deep pages, with their spammy backlink profiles, providing the rankings.

This brief discussion of 2010 helps provide the context in which my predictions for 2011 are made. I’ll let other people judge whether they are realistic or just hopeful.

1) There will be a significant update to the Google algorithm in January

As in previous years, Google made few changes to the algorithm in the lead-up to Christmas, and it usually rolls out a large change in January. The absence of notable updates in December 2010 leads me to think it will be the same case in 2011, and we’ll see a significant update in January.

2) Efforts to combat web spam will step up in 2011

At Pubcon, Matt Cutts stated that web spam resources had been taken away and deployed elsewhere in 2010, and towards the end of 2010 it showed. Look through the backlink profiles of sites in the UK insurance vertical and you will struggle to find a clean, natural one, yet for certain competitive commercial keywords, sites are ranking with horrific backlinks. Such sites don’t usually hang around on page one for long, but recently these link building tactics have been proving successful. The good news is that Matt Cutts has said the resources have been restored to tackling web spam, and accordingly I expect big strides to be taken early in 2011. Maybe we’ll even see an update to the way spam is reported.

3) There will be another Toolbar PageRank update

There was only one major update to Toolbar PageRank in 2010. I’m sure that, while implementing the May Day update and Instant, exporting PageRank to the toolbar was the least of Google’s concerns. Some are questioning whether it will ever be updated again. If not, I’d like to see Google remove it completely rather than leave the legacy of an out-of-date metric. With more resources available, I predict an update in the first half of 2011.

4) 2011 is not the year of social signals

Google has confirmed that the open graph is now working alongside the link graph, and in 2010 Bing struck a deal with Facebook. Even with both of these things in mind, I still don’t believe 2011 is the year social becomes an important ranking signal. Some individuals have demonstrated that you can get pages indexed via social platforms, and even rank for terms. However, I assume the weighting of social signals versus the link graph is minute, so those operating on competitive keywords won’t see much of an impact.

5) Brand will continue to grow in importance

Ever since the Vince update, brand has played a significant role in the algorithm, and I predict that this will continue in 2011. It will be best demonstrated by a reduction in the relevancy derived from anchor text. To take the insurance vertical again, few if any people naturally link to insurance providers with generic, keyword-targeted anchor text; they are far more likely to link with the brand. The algorithm should therefore devalue relevancy derived from anchor text and increase the value of authority from genuine brand links. I believe this is a logical progression for the algorithm and, although long overdue, it will finally happen in 2011.

So those are my predictions for 2011. They’re more specific than many other people’s, but I’m fed up of reading that ambiguous aspects such as ‘local’ and ‘mobile’ will become more important. I’ll leave readers with the ever-true comment of Matt Cutts, who at Pubcon emphasised the importance of not chasing the algorithm but predicting where it is going to go. It’s this that gives me hope that 2011 will favour sites with natural backlink profiles, and relevant results that are useful to users.


Can 302 Redirects Pass PageRank?

Many webmasters would agree that a 302 does not pass PageRank. However, a recent blog post at Search Engine Land and our own experience at Arena Quantum may suggest otherwise. A 302 is a temporary redirect, and accordingly on paper it should not pass PageRank. But what if the 302 redirect has been in place for over two years? Would Google ignore the temporary classification and treat the redirect as permanent? Our experience indicates so. Consider the following example:

For whatever reason, when a high street brand created its website, it was decided that a request for the root URL would redirect and serve content from two subfolders deep, with each subfolder containing relevant keywords (facepalm). This redirect was done with a 302, as seen below:

302 redirect: www.example.com/ → www.example.com/widgets/blue/index.php

Overlooking the fact that the content of the homepage should be served without redirects (constraints were imposed on us), our recommendation was that this redirect be changed to a 301 permanent. In theory, all the links pointing to www.example.com would make it the strongest page on the site, but the link equity would not pass to the page that served the content, thus creating a PageRank dam. This theory is consistent with the experience of Tedster, who stated he had worked with several large sites where “the PageRank stayed on the domain root for all of them”. The root domain for our client had 618 different domains pointing to it, with links from major publications including The Guardian. Potentially, this could be big.
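We don’t know the brand’s actual server stack, but as a minimal sketch (a hypothetical Node.js server using the hypothetical URLs above), the fix amounts to swapping the status code on the root redirect:

    // Redirect the root URL permanently (301) rather than temporarily (302),
    // so the link equity pointing at the root can flow to the page serving the content.
    var http = require('http');

    http.createServer(function (req, res) {
      if (req.url === '/') {
        // Previously a 302 was issued here, creating the PageRank dam.
        res.writeHead(301, { 'Location': 'http://www.example.com/widgets/blue/index.php' });
        res.end();
        return;
      }
      res.writeHead(200, { 'Content-Type': 'text/html' });
      res.end('<!doctype html><title>Blue widgets</title>');
    }).listen(8080);

A quick curl -I www.example.com/ before and after confirms which status code and Location header the root actually returns.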

Imagine our excitement. We’d identified the required change of redirect, had it implemented, and waited for the dam of PageRank to be unleashed, making the deeper pages much stronger. Alas not. It’s been three months now and we have seen no impact. Given that the 302 had been in place for several years, this supports the theory that, after a given period of time, Google may start treating a 302 redirect as if it were a 301. Unsurprisingly, this client had a number of other old 302 redirects on deeper pages; we changed these to 301s too, with no identifiable impact.

Do longstanding 302 redirects start to flow PageRank? It’s something that needs considering. Perhaps treating aged 302 redirects as permanent improves the quality of the link graph as a reference for Google, countering improper use of redirects.


Free Anchor Text Distribution Excel Spreadsheet

[Image: Anchor Text Distribution spreadsheet for OSE]

This spreadsheet will allow you to quickly profile your competitors’ backlinks and identify which keywords they are likely to be targeting.

Simply copy and paste Open Site Explorer data into the spreadsheet, refresh the pivot table, and you’ll see the top ten anchor texts used to link to a given website. In addition, a graph visually represents the distribution of anchor text, giving some initial indication of how natural the website’s link profile is. Download the link profile spreadsheet for free; further instructions on how to use it can be found in the full post.
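If you’d rather script the same calculation than refresh a pivot table, it is a simple tally. A minimal sketch, assuming each exported OSE row has been parsed into an object with an anchorText field (a hypothetical shape; adjust to the column names in your export):

    // Count occurrences of each anchor text and return the top n as [text, count] pairs.
    function topAnchors(rows, n) {
      var counts = {};
      rows.forEach(function (row) {
        counts[row.anchorText] = (counts[row.anchorText] || 0) + 1;
      });
      return Object.keys(counts)
        .map(function (text) { return [text, counts[text]]; })
        .sort(function (a, b) { return b[1] - a[1]; })
        .slice(0, n);
    }

    // Example:
    // topAnchors([{anchorText: 'acme'}, {anchorText: 'acme'}, {anchorText: 'cheap widgets'}], 10)
    // -> [['acme', 2], ['cheap widgets', 1]]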




Reviewing Advanced Web Ranking

I use a range of tools on a daily basis, including SEOmoz and Raven, but neither is great for tracking rankings. SEOmoz has limits and can be expensive if you want to track hundreds of keywords. Raven’s data output is pretty poor, and the interface isn’t so slick. Step in Advanced Web Ranking.

I work agency side in Central London and have a lot of accounts, and one aspect of agency-side SEO is that there is a lot of reporting involved. Transparency is key, and we report ranking progress across all the keywords we optimise for. Raven is OK, but I needed a tool that would run in the background and, with a few clicks, let me export the data into an Excel pivot table to send over to the client. That’s exactly what Advanced Web Ranking allows me to do.



Hotmail SEO Fail

You would have thought that live.com would serve an HTTP header informing Googlebot not to cache the page.
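One way to do that is the X-Robots-Tag response header, which supports a noarchive directive telling Google not to show a cached copy. A minimal sketch, assuming a hypothetical Node.js server:

    var http = require('http');

    http.createServer(function (req, res) {
      res.writeHead(200, {
        'Content-Type': 'text/html',
        // Ask search engines not to keep a cached copy of this page.
        'X-Robots-Tag': 'noarchive'
      });
      res.end('<!doctype html><title>Inbox</title>');
    }).listen(8080);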

