
Setting up an environment to code PHP

Had a very frustrating day, as I’ve been trying to set up a local environment to code some PHP. I was using MAMP, which should make it quite simple. I could connect to the database in the terminal but not by calling mysqli_connect(). Adding these two lines of code helped find the problem:

ini_set('display_errors', 1);
error_reporting(E_ALL);

When these two lines of code were added, I was able to see the following error message:


Warning: mysqli_connect(): [2002] No such file or directory (trying to connect via unix:///var/mysql/mysql.sock) in /Library/WebServer/Documents/connect/index.php on line 6

Warning: mysqli_connect(): (HY000/2002): No such file or directory in /Library/WebServer/Documents/connect/index.php on line 6
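
The [2002] error above means PHP is looking for MySQL’s socket at /var/mysql/mysql.sock, which is not where MAMP puts it. Here is a minimal sketch of connecting to MAMP’s MySQL directly, assuming MAMP’s default credentials (root/root), port (8889) and socket path — these are assumptions about a default install, so check your own setup:

```php
<?php
// MAMP's default socket path, port and credentials are assumptions;
// verify them in the MAMP preferences before relying on this.
$socket = '/Applications/MAMP/tmp/mysql/mysql.sock';
$link = mysqli_connect('localhost', 'root', 'root', 'mysql', 8889, $socket);

if (!$link) {
    die('Connect failed: ' . mysqli_connect_error());
}
echo 'Connected successfully';
```

Alternatively, pointing mysqli.default_socket in php.ini at the same path fixes a plain mysqli_connect() call without any code changes.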

I never really did find a fix, but I did find a deeper problem. I was putting files in the localhost directory and they were being executed fine. However, I then realised that the environment MAMP had created was in fact accessible by including the port, i.e. localhost:8888/. After shutting down the Apache and MySQL servers with MAMP, I loaded up my files on localhost and confirmed what I suspected: something else was executing them. I figure I must have some old legacy servers running in the background. Either way, I gave up.

Got a shiny brand new MacBook Air. Within 5 minutes I was able to connect to MySQL both in the terminal and by calling mysqli_connect(). The default password for MAMP is root, so when you change it with the command

mysqladmin -u root -p password "newpassword"

make sure that you update the new password in the following two locations:

/Applications/MAMP/bin/phpmyadmin/config.inc.php (line 86)


If you don’t do the latter, you’ll receive an error saying it can’t connect to the MySQL database. All in all, quite a frustrating experience today. I’d prefer to code on the iMac, but ultimately the stress associated with getting a ‘clean start’ there would be rather annoying.

5 Predictions for the Google algorithm in 2011

Seems like a common thing to do, so I thought I’d lay down some predictions for 2011. One may argue that my predictions are based on optimism more than anything else. I believe December 2009 – June 2010 was a good period for the Google algorithm, but the second half of 2010 was disappointing.

Google made changes to which page on a domain it ranked for a search query. For instance, Google started to display a more relevant deep page when it was the home page of the domain that had been ranking. Unfortunately, it seems it also did the reverse: in UK financial SERPs, the home page of insurance websites appeared in search results when it was actually a deep page that was ranking. This had a few consequences.

Many in the SEO community took the backlink profiles of the ranking pages and were amazed to find that the sites ranking had natural link profiles, mainly composed of brand anchor text in their inbound links. It was proclaimed that Google had made great strides against web spam, and that capitalising on brand was now important. Alas not. It was the deep pages, with their spammy backlink profiles, providing the rankings.

This brief discussion of 2010 helps provide the context in which my predictions for 2011 are made. I’ll let other people judge whether they are realistic or just hopeful.

1) There will be a significant update to the Google algorithm in January

As in previous years, Google made few changes to the algorithm in the lead-up to Christmas, and usually rolls out a large change in January. The absence of notable updates in December ’10 leads me to think it will be the same case in 2011 and we’ll see a significant update in January.

2) Efforts to combat web spam will step up in 2011

Matt Cutts stated at Pubcon that web spam resources had been taken away and deployed elsewhere in 2010, and towards the end of 2010 especially it showed. Take a look through the backlink profiles of those in the UK insurance verticals and you will struggle to find a clean, natural profile; for certain competitive commercial keywords, sites are ranking with horrific backlinks. Such sites don’t usually hang around on page one for long, but recently these link building tactics have been proving successful. The good news is that Matt Cutts has said the resources have been restored to tackling web spam, and accordingly I expect big strides to be taken early in 2011. Maybe we’ll even see an update to the way spam is reported.

3) There will be another Toolbar PageRank update

There was only one major update to Toolbar PageRank in 2010. I’m sure that while implementing the May Day update and Instant, exporting PageRank to the toolbar was the least of Google’s concerns. Some are questioning whether it will ever be updated again. If not, I’d like to see Google remove it completely rather than leave the legacy of an out-of-date metric. With more resources available, I predict an update in the first half of 2011.

4) 2011 is not the year of social signals

Google has confirmed that the open graph is now working alongside the link graph, and in 2010 Bing struck a deal with Facebook. However, with both of these things in mind, I still don’t believe 2011 is the year for social being used as an important ranking signal. Some individuals have demonstrated that you can get indexed from social platforms, and even rank for terms. However, I assume the weighting of social signals versus the link graph is minute, and so those operating in competitive keywords won’t see much of an impact.

5) Brand will continue to grow in importance

Ever since the Vince update, brand has played a significant role in the algorithm, and I predict that this will continue in 2011. This will be best demonstrated in a reduction of the relevancy derived from anchor text. To take the example of the insurance vertical again, few people naturally link to insurance providers with generic targeted anchor text; they are far more likely to link with brand terms. Therefore, as a reference point, the algorithm should devalue relevancy derived from anchor text and increase the value of authority from genuine brand links. I believe this is a logical progression for the algorithm and, although long overdue, will finally happen in 2011.

So that forms my predictions for 2011. They’re more specific than many other people’s predictions, but I’m fed up with reading about how ambiguous aspects such as ‘local’ and ‘mobile’ will become more important. I’ll leave readers with the ever-true comment of Matt Cutts, who at Pubcon emphasised the importance of not chasing the algorithm, but instead predicting where it is going to go. It’s this that gives me hope that 2011 will feature sites with natural backlink profiles, and relevant results useful to users.

Can 302 Redirects Pass PageRank?

Many webmasters would agree that a 302 does not pass PageRank. However, a recent blog post at Search Engine Land and our own experiences at Arena Quantum may suggest otherwise. A 302 is a temporary redirect, and accordingly, on paper, should not pass PageRank. But what if the 302 redirect has been in place for over two years? Would Google ignore the temporary classification of the redirect and treat it as permanent? Our experience would indicate so. Consider the following example:

For whatever reason, when a high street brand created its website, it was decided that when a request for the root URL was made, it would redirect and serve content from a page two subfolders deep, with each subfolder containing relevant keywords (facepalm). This redirect was done with a 302, as seen below:

302 redirect: / → www.example.com/widgets/blue/index.php

Overlooking the fact that the content of the homepage should be served without redirects (constraints were imposed), it was our recommendation that this redirect be changed to a permanent 301. In theory, all links pointing to www.example.com would make it the strongest page on the site, but the link equity would not pass to the page that served the content, thus creating a PageRank dam. This theory is consistent with the experience of Tedster, who stated he had worked with several large sites where “the PageRank stayed on the domain root for all of them”. The root domain for our client had 618 different domains pointing to it, including links from major publications such as The Guardian. Potentially, this could be big.
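
In Apache terms, the change we recommended looks something like the following sketch (the rule and URL are illustrative only, using mod_alias; the client’s actual implementation differed):

```apache
# Before: temporary redirect from the root URL to the deep page
# RedirectMatch 302 ^/$ /widgets/blue/index.php

# After: permanent redirect, which on paper allows link equity to flow
RedirectMatch 301 ^/$ /widgets/blue/index.php
```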

Imagine our excitement. We’d identified the required change of redirect, got it implemented, and waited for the ‘dam of PageRank’ to be unleashed, making the deeper pages much stronger. Alas not. It’s been three months now and we have seen no impact. Given that the 302 had been in place for several years, this supports the theory that, after a given period of time, Google may start treating a 302 redirect as if it were a 301. It’s no surprise that this client had a number of other old 302 redirects on deeper pages. We changed these to 301 redirects too, with no identifiable impact.

Do longstanding 302 redirects start to flow PageRank? It’s something that needs considering. Perhaps treating aged 302 redirects as permanent improves the quality of the link graph as a reference for Google to counter improper use of redirects.

Free Anchor Text Distribution Excel Spreadsheet

Anchor Text Distribution for OSE

This spreadsheet will allow you to quickly profile your competitor backlinks and identify what keywords they are likely to be targeting.

Simply copy and paste Open Site Explorer data into this spreadsheet, refresh the pivot table and you’ll see the top ten anchor texts used to link to a given website. In addition, a graph visually represents the distribution of anchor text. This graph will give some initial indicators of how natural the link profile of the website is. Download the link profile spreadsheet for free. Further instructions on how to use the spreadsheet can be found below.
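
For those who prefer scripting to spreadsheets, the same top-ten tally can be sketched in a few lines of PHP — a rough illustration only; the file name and the “Anchor Text” column header are assumptions about the OSE export format:

```php
<?php
// Tally the anchor text column of an Open Site Explorer CSV export
// and print the ten most common values, as the pivot table does.
$handle = fopen('ose-export.csv', 'r');       // hypothetical file name
$header = fgetcsv($handle);
$col = array_search('Anchor Text', $header);  // assumed column header

$counts = array();
while (($row = fgetcsv($handle)) !== false) {
    $anchor = trim($row[$col]);
    $counts[$anchor] = isset($counts[$anchor]) ? $counts[$anchor] + 1 : 1;
}
fclose($handle);

arsort($counts);  // most common first, keys preserved
foreach (array_slice($counts, 0, 10, true) as $anchor => $n) {
    echo $anchor . "\t" . $n . "\n";
}
```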


Reviewing Advanced Web Ranking

I use a range of tools on a daily basis, including SEOmoz and Raven, but neither is great for tracking rankings. SEOmoz has limits and can be expensive if you want to track hundreds of keywords. The data output of Raven is pretty poor, and the interface isn’t so slick. Step in Advanced Web Ranking.

I work agency side in Central London. I have a lot of accounts, and one aspect of agency-side SEO is that there is a lot of reporting involved. Transparency is key, and we report ranking progress across all the keywords we optimise for. Raven is OK, but I needed a tool that would run in the background and, with a few clicks, let me export the data into an Excel pivot table to send over to the client. That’s exactly what Advanced Web Ranking allows me to do.


Hotmail SEO Fail

You would have thought that live.com would have served an HTTP header that told Googlebot not to cache the page.
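
The way to do this is with the X-Robots-Tag response header (there is no status code for it). A sketch of how it could be set in an Apache configuration, assuming mod_headers is enabled:

```apache
# Ask search engines not to keep a cached copy of these responses
Header set X-Robots-Tag "noarchive"
```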