An infographic about F-Commerce
Catching dropped domains can instantly provide you not only with a decent backlink profile but also referral traffic. In this blog post I provide some advice on how to catch them and bring them back to life, illustrated with a real-life example of my own.
Within a niche one of my clients works in, a satirical one-page website generated hundreds of authoritative links (TBPR 5, if that floats your boat). I first saw it on Reddit and wished I had come up with the idea, as it was a rare testament to the fact that content is king.
In a moment of idle web surfing, I went back to revisit the site only to see that the domain was pending deletion. For those unaware, this is generally the ‘lifecycle’ of a domain:
- Expiration (around 40 days)
- Redemption period (around 30 days)
- Pending deletion (around 5 days)
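The phase lengths above can be turned into a quick back-of-envelope calculator. A minimal sketch, assuming the approximate day counts from the list (actual timings vary by registrar and registry, so treat the result as a ballpark):

```javascript
// Rough sketch: estimate when an expired domain will finally drop, using
// the approximate phase lengths above (these vary by registrar/registry).
const EXPIRATION_DAYS = 40;     // grace period after expiry
const REDEMPTION_DAYS = 30;     // owner can still claw the domain back
const PENDING_DELETE_DAYS = 5;  // point of no return

function estimatedDropDate(expiryDate) {
  const totalDays = EXPIRATION_DAYS + REDEMPTION_DAYS + PENDING_DELETE_DAYS;
  const drop = new Date(expiryDate.getTime());
  drop.setUTCDate(drop.getUTCDate() + totalDays);
  return drop;
}

// A domain expiring on 1st January 2012 would be expected to drop roughly
// 75 days later, in mid-March.
console.log(estimatedDropDate(new Date(Date.UTC(2012, 0, 1))).toISOString().slice(0, 10));
```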
Until it hits pending deletion, the owner can claw back their domain. Fortunately for me, it was already pending deletion, so I knew it was only a matter of time before it became available.
At this point, I wouldn’t recommend just hanging around. Instead, use a number of backordering services, and where possible, all of them. Three notable companies are Pool.com, NameJet.com and SnapNames.com. Generally you don’t pay unless they catch it, in which case you are quids in. If two or more people attempt to backorder it, it goes to auction. That’s what happened to me on Pool.com, and so it was set to go to the highest bidder.
My auction went on for about 45 minutes and ended up at around £230 ($400). Anyone familiar with paid linking will know this to be good value. Not that I was concerned; I wasn’t buying it for link equity, just for fun and lulz.
I haven’t used this technique enough times to say that what I did is definitely what resuscitated it, but there is logic behind it. When I speak of resuscitating a dropped domain, I mean that TBPR returns. From this, I take it that Google has gone back to algorithmically valuing the domain for page and domain authority, plus TrustRank, as it did before it dropped. Here is what I did…
- Visit archive.org and restore whatever content you can find
- This includes page titles and meta descriptions
- My site was ODP listed, so I matched natural search copy with that
- Don’t add any links (yet) until the domain has been brought back to life
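For the first step, the Wayback Machine’s public availability API is handy for locating the archived copy to restore from. A minimal sketch (the domain queried is hypothetical); the API returns JSON describing the closest archived snapshot:

```javascript
// Sketch: find the closest archived copy of a dropped domain via the
// Wayback Machine availability API (https://archive.org/wayback/available).
// The domain used below is hypothetical.
function availabilityUrl(domain, timestamp) {
  return "https://archive.org/wayback/available?url=" +
    encodeURIComponent(domain) + "&timestamp=" + timestamp;
}

// Extract the snapshot URL from the API's JSON response;
// returns null if nothing was archived.
function closestSnapshot(response) {
  const closest = response.archived_snapshots &&
                  response.archived_snapshots.closest;
  return closest && closest.available ? closest.url : null;
}

// Example response shape, as returned by the API:
const sample = {
  archived_snapshots: {
    closest: {
      available: true,
      url: "http://web.archive.org/web/20110101000000/http://example.com/",
      timestamp: "20110101000000",
      status: "200"
    }
  }
};
console.log(availabilityUrl("example.com", "20110101"));
console.log(closestSnapshot(sample));
```

From the snapshot you can recover the page copy, titles and meta descriptions mentioned above.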
Sure enough, come the next TBPR update, that little green box was back. The site was receiving around 3,000 visits a month from referral links and continues to grow. Think about how you can use this approach, but don’t abuse it.
- Scrape around the Internet for sites with authoritative links that have dropped
- Use automated tools to keep an eye on content that goes genuinely (unintentionally) viral but whose domain may be likely to drop in the future
- Harvest a list of dropping domains and pull in SEOmoz data to analyse their strength, drawing up a list of acquisition targets
- (My favourite) Take the referral traffic the site was getting and use it to get eyeballs on your new content. If you’ve got a ton of referral traffic from places like Reddit, invite people to check out new content. With that, you can amplify the new content you are creating and squeeze out more benefit.
Haven’t seen this before.
This is what appears when you click on request call…
… and finally request email.
EDIT: Our Google rep has just confirmed this to be new. They’re called communication ad extensions; they are a free way to get leads, and therefore won’t appear in the AdWords UI, and they only appear for 10% of queries. Currently in alpha.
Another awesome infographic Arena Quantum have produced for a client, congratulations to those who have worked on it.
Grow your own infographic from LoveTheGarden.com
Here at Arena Quantum we like to do multi-click attribution, providing an insight into the true value of generics. To do this, we require an ad server. Through a mistake made while implementing tracking, we discovered a very unlikely ranking signal we had not considered before. We uncovered evidence to suggest that Google treats the URL you tell the tracking code to record the pageview as in much the same way as a canonical tag.
For example, we placed the following code on superwidgets.com/redwidgets. Note that the domain we placed it on differs from the URL we wanted to track it as.
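The snippet itself isn’t preserved in this post, but with the classic asynchronous Google Analytics code it would look something like this (the account ID is hypothetical):

```javascript
// Reconstructed sketch of the classic asynchronous Google Analytics
// snippet (account ID hypothetical). Placed on superwidgets.com/redwidgets,
// it records the pageview against a URL on a different domain entirely:
// passing an explicit URL to _trackPageview overrides the URL of the page
// the code actually runs on.
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXXXX-1']);
_gaq.push(['_trackPageview', 'http://cheapwidgets.com/redwidgets']);
```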
In this example, we also own cheapwidgets.com. It was only when examining the inbound links reported by Google Webmaster Tools that I noticed a link reported from cheapwidgets.com.
I have scoured the web page on superwidgets.com, and the only reference to cheapwidgets.com is in the tracking code. It therefore looks like _trackPageview can act like a cross-domain canonical. Key takeaway? Double-check your tracking code to make sure you aren’t leaking any link juice.
Historically, Google has used links as a proxy to determine the most relevant and authoritative websites to return for a user’s search query. In late 2010, Google and Bing confirmed that they do indeed use social signals as a ranking factor, but only now are those in the SEO community starting to identify case studies where social signals have a clear influence on search results.
A new case study can be added: Money Supermarket. Between 10th and 16th January, Money Supermarket held a free prize draw. Users had to retweet a message (see below) containing a link to the car insurance product page for a chance of winning a year’s free car insurance.
I believe this generated around 2,500 RTs over the 7 days. The impact it had on Money Supermarket’s ranking for the search query ‘car insurance’ is most interesting.
Between 21st September and 11th January, Money Supermarket held an average position of 6th for the search query ‘car insurance’. During this time, their best position was 4th, held for just a couple of days, while their lowest was 9th. Just two days after the competition ended, Money Supermarket started ranking 1st.
This 1st place ranking was held until 14th March, when Money Supermarket dropped back down to 3rd. How did Money Supermarket react to this drop? With another Twitter competition, running from 14th to 20th March. Money Supermarket are now ranking 2nd, and I’ll update this post after the competition has ended.
As always, correlation doesn’t necessarily imply causation. There may have been other signals playing an influential role, but this case study certainly adds further evidence of the importance of social signals.
A lot of effort has gone into producing this infographic, hats off to some of the team here.
Infographic brought to you by flythomascook.com
Had a very frustrating day trying to set up a local environment to code some PHP, using MAMP to keep things simple. I could connect to the database in the terminal, but not by calling mysqli_connect(). Adding these two lines of code helped find the problem:
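The two lines themselves haven’t survived in this post, but the standard pair for surfacing errors like the warnings below is to turn PHP’s error display on at the top of the script:

```php
// Likely reconstruction: show all errors, including mysqli_connect() warnings
error_reporting(E_ALL);
ini_set('display_errors', '1');
```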
When these two lines of code were added, I was able to see the following error message:
Warning: mysqli_connect(): No such file or directory (trying to connect via unix:///var/mysql/mysql.sock) in /Library/WebServer/Documents/connect/index.php on line 6
Warning: mysqli_connect(): (HY000/2002): No such file or directory in /Library/WebServer/Documents/connect/index.php on line 6
I never really did find a fix, but I did find a deeper problem. I was putting files in the localhost directory and they were executing fine. However, I then realised that the environment MAMP had created was in fact accessible by including the port, i.e. localhost:8888/. After shutting down the Apache and MySQL servers with MAMP, I loaded my files on localhost and confirmed what I suspected: something else was executing them. I figure I must have some old legacy servers running in the background. Either way, I gave up.
Got a shiny brand new MacBook Air. Within 5 minutes I was able to connect to MySQL both in the terminal and by calling mysqli_connect(). The default MAMP password is root, so when you change it with the command
mysqladmin -u root -p password "newpassword"
Make sure that you update the new password in the two following locations
If you don’t do the latter, you’ll receive an error saying it can’t connect to the MySQL database. All in all, quite a frustrating experience today. I’d prefer to code on the iMac, but ultimately the hassle of getting a ‘clean start’ there would be rather annoying.