Category Archives: SEO News

I Wish That The Google Sandbox Was Less Harsh

Kevin has just tagged me in the following post: 5 Wishes for Google AdWords Editor. The meme was originally started here: If There Was an SEO Genie My One Wish Would Be, by Kelvin Shuffle, and now I have got to wish for something SEO related.

So here goes: I wish that the Google sandbox/no-trust period, or whatever you want to call it, was less harsh. I understand why it is there and all; it is just that it seems to be taking longer and longer for sites to come out. I know the older a site is, the more trust it will have and the higher it will be able to rank, but there is definitely a fair bit of feeling like you are banging your head against a brick wall before you even see anything.

Using traditional link building techniques, I would say it takes 8 months before you get any real results. I know that if you linkbait on a regular basis you can see results much sooner, but this isn't always possible. I think that 4-5 months would be a lot more reasonable.

I would like to know what the following people would wish for: Nowsourcing, aimClear, Sebastian and OnReact.

Posted in SEO News By David Eaves a UK search engine optimisation specialist.

So Called SEO Forgets To Optimise His Own Blog

When I first started this blog, a couple of people warned me about the duplicate content issues with WordPress. I sort of put it on the backburner because I was busy, and then just never got around to dealing with it. I was checking my rankings the other day for certain phrases related to some of my recent blog posts, and half of them are getting duped out by either the category pages or the archive pages, because those pages have more PageRank. The fact that this blog does not get posted on too often has probably been a factor.

The easiest way to fix this issue would be to add the more tag to my posts, so that you would have to click through to a post to see the entire article. I tried that and I did not like it; I didn't know where to put the tag, and when I tested it, it looked funny.
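For anyone who does want to go down that route, the more tag is just an HTML comment dropped into the post body in the WordPress editor; a minimal sketch (the paragraph text is a placeholder):

```html
<p>Opening paragraph, shown on the homepage and in the archives.</p>
<!--more-->
<p>The rest of the post, visible only on the single post page.</p>
```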

Here are the two main things that I have done to this blog to try and solve the duplicate content issues without sacrificing functionality:

Replaced the old WordPress archives with a new handmade monthly archive; the new archive simply offers links to the posts that I have done for each month. This means that searchers can find the posts for each month, extra PageRank gets passed to the posts because they are not a million miles from the blog homepage, and there should be no archive duplicate content.

Nofollowed all of the links to the category pages; I really wanted to keep the category pages because I think they are useful, so I decided that nofollow was the best road to go down. I couldn't find any plugin for this, so I just did it by hand. This should eventually cause the category pages to disappear from the search engines, leaving them there just for my visitors.
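Doing it by hand just means adding a rel="nofollow" attribute to each category link in the theme templates; something like this (the category URL is a placeholder):

```html
<!-- Before: search engines follow the link and pass PageRank -->
<a href="/category/seo-news/">SEO News</a>

<!-- After: the link still works for visitors but is nofollowed -->
<a href="/category/seo-news/" rel="nofollow">SEO News</a>
```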

Hopefully once the search engines pick up on these things all of my posts will start to rank properly.

There are a few other things that I have done as well, three of them have nothing to do with SEO but may help my traffic:

Validated the homepage; this will almost definitely not have any impact on my rankings, but there were about 50 errors and it is better to be safe than sorry. I have still not been able to completely validate the posts, but I am sure I will get my head around it eventually.

Nofollowed the links to all of the subscribe buttons and the feed; this may mean I have a bit more PageRank for the posts, it was basically a waste of link juice.

Added a MyBlogLog widget; mainly because it will be nice to see who some of the people are who are visiting my blog. Please feel free to join my community.

Added a subscribe to the feed link to the bottom of every post; I got this idea off Patrick who I consider to be an expert blogger.

Added a button showing how many subscribers I have; people may feel sorry for me and subscribe out of pity.


Google Indexing And 301 Redirect Nightmares

Google has indexed the wrong version of the homepage for 3 different client websites in the last 3 months. It is my fault really, because I should have taken care of things before it happened. In all 3 cases the client's homepage rankings on Google almost completely disappeared (scary stuff). Here is some information on how I handled each situation.

The first one was really easy: the client's regular homepage www.domain.com had been de-indexed and replaced by www.domain.com/home. I set up a 301 redirect using htaccess from www.domain.com/home to www.domain.com. Because this particular site had navigation to both versions of the homepage, Google recognized the redirect within a couple of days and the client regained their rankings.
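For anyone wanting to do the same, the htaccess rule is a one-liner; a sketch assuming an Apache server with mod_alias enabled (the domain is a placeholder):

```apache
# Permanently redirect the duplicate homepage URL to the canonical one
Redirect 301 /home http://www.domain.com/
```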

The second one took me about two weeks to sort out. Again Google de-indexed the client's regular homepage www.domain.com and replaced it with a different version, this time www.domain.com/index.html. This particular site had no navigation to the other version, and there were no external links to it either (how Google picked it up I don't know). I set up a 301 redirect using htaccess just like I did the last time, thinking it would be sorted in a couple of days. It wasn't; in fact, nearly two weeks went by and it still wasn't fixed, and the client was understandably starting to get a little anxious. I then realised that Google just wasn't going to recognize the redirect unless I placed a link to the version that was redirecting. Fortunately I have a directory at my disposal that gets re-cached by Google every other day, so I added a temporary entry linking to the redirecting version (www.domain.com/index.html). Within less than 48 hours Google picked up the directory listing, recognized the redirect, and once again the client got their rankings back. If I had been a bit more on the ball, this one could have been sorted out a lot quicker.
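One thing to watch with the index.html version: a plain Redirect rule can loop, because Apache serves index.html internally whenever the root URL is requested. A mod_rewrite sketch that avoids this by only matching explicit browser requests (assuming Apache with mod_rewrite enabled; the domain is a placeholder):

```apache
RewriteEngine On
# Only redirect when the browser explicitly asked for /index.html,
# so the internal DirectoryIndex lookup for / is left alone
RewriteCond %{THE_REQUEST} ^[A-Z]+\ /index\.html
RewriteRule ^index\.html$ http://www.domain.com/ [R=301,L]
```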

The third one was also a bit of a nightmare. This time Google had de-indexed www.domain.com and replaced it with domain.com, without the www. I thought no problem, I will just set up a redirect with htaccess like I had done the previous times, but the client had Windows hosting and htaccess was not supported. I tried to set one up using the Internet Services Manager, but every time I went into the file properties to set it up, it said "This server does not support file changing permissions". Neither I nor the client could work out how to fix it, the web hosting company was totally useless, and changing web hosts would have been a big job. Instead, I registered the site with Google Webmaster Central, created an XML sitemap and set the preferred domain to www.domain.com. Within a couple of days Google re-indexed the right version of the homepage and the client regained their rankings.
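An XML sitemap does not need to be anything fancy; a minimal example in the sitemaps.org format (the URLs are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.domain.com/</loc>
  </url>
  <url>
    <loc>http://www.domain.com/contact.html</loc>
  </url>
</urlset>
```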

If this has happened, or happens, to you: get the 301 set up and, if necessary, build some links to the redirecting URL so that Google recognizes it.

To prevent this happening in the first place, make sure that the versions of your homepage that you don't want are 301'd to the one that you do want; i.e. if you want www.domain.com, 301 all non-www URLs to the www ones, and 301 www.domain.com/home to www.domain.com. If you have this set up you will have no problems. If you are having problems setting it up, contact your web host; hopefully they will be more helpful than the one that I dealt with, and if they are not, consider getting a new host like my client has now done.
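On an Apache server the non-www to www redirect takes only a couple of lines of htaccess; a sketch (assuming mod_rewrite is enabled; the domain is a placeholder):

```apache
RewriteEngine On
# 301 any request for domain.com to the www.domain.com equivalent
RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.domain.com/$1 [R=301,L]
```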

Here are some useful links:

WebConfs.com – How to create Redirects

Bob Mutch on Redirects

Google Webmaster Central (Get yourself registered)


Alexa Data Coming On Leaps And Bounds

I think that sometimes Alexa gets a bum rap, and all too often I hear people coming out with things like:

Alexa is useless
Alexa is easy to manipulate
Alexa is biased
Alexa sucks
In days gone by I would have to agree with most of these statements but recently I am finding that I am using the tool more and more often.

With Internet Explorer, the Alexa toolbar would slow you down quite a lot, not to mention take up valuable screen space, and many people would be put off installing it because lots of anti-virus programmes saw it as spyware.

More and more people are switching to Mozilla and Firefox browsers all of the time, and more and more of them are installing the hassle-free SearchStatus extension that feeds data to Alexa. I believe that this is making the data they provide more accurate, and harder to manipulate, every day.

Also, the Alexa information is now updated daily and is just one day behind, so on Sunday you can see Friday's statistics, which I think is pretty impressive; in days gone by it was not updated for long periods of time.

Finally, the Alexa site used to be really slow. It is still not the fastest website in the world, but it is certainly much faster than it was 6 months ago.

So to all of you Alexa haters out there: get off their case, I think they are doing a pretty good job.

Visit Alexa’s Homepage – Alexa.com


Google Ranking Penalties

Recently I have found that Google have been penalizing sites for using various link building and SEO techniques that in days gone by worked a charm. The obvious thing that comes to mind is paid links; Google have certainly had a big crackdown on paid links, and many sites that have been buying them have dropped significantly in the rankings. However, this is just the tip of the iceberg as far as I am concerned, because many other sites that have not bought paid links have also been penalized (including some of the sites I have been working on). I have been studying the sites that have dropped, and they seem to be the ones that have been worked on very recently; they also seem to be the ones with lots of anchor text links using a particular search phrase.

In days gone by, when I was building links to a site I would get as many exact-match search phrase text links as possible pointing to it; the more links I got using a particular search phrase, the higher the site would rank for that phrase.

Now, the sites that have dropped have only been penalized on the one phrase that was targeted and overdone; all of the other positions have stayed the same. What I have noticed is that the sites have only dropped to the position they were in prior to the new SEO links being built. Google are very clever: they have set up the algorithm so that it would be very difficult for you to hurt one of your competitors' rankings.

It is still very early days, and I am trying to work out whether the sites that have been hit with ranking penalties have simply had the links sandboxed (devalued for a certain period of time) or been penalized altogether, so that something would have to be done about the search phrase text links in order to get the sites to rank high again. I am running a couple of experiments where we are not removing the links to the penalized sites but changing the anchor text to something more natural, like the company name; un-SEOing the link profile, if you will. I am not doing anything too hasty: I am taking it one link at a time, monitoring the effect, and building fresh links at the same time using just the company name as anchor text.

I haven't noticed anybody else talking about this in detail yet, but I certainly feel that we will not be the only SEOs who have been hit by this. I will post more of my thoughts about Google ranking penalties in the future.

Update: I am pleased to report that the experiments have gone really well and all of the sites bar one are back on track.
