There is a huge choice of SEO tools out there, and most promise to be the SEO elixir that will deliver top rankings in Google at the click of a button.
As an agency, we have been using Raven Tools for a number of years. Whilst not perfect, it suits our needs pretty well, and we top it up with injections of insight from more specific tools such as Screaming Frog, the IIS Search Engine Optimization Toolkit and Link Research Tools.
I have, however, been looking at other options over the past few weeks on behalf of a client. It has been an interesting process, and I cannot deny that there are some good-looking products available to the modern digital marketer. I have always been a fan of browser-based solutions, and most of the leading names are cloud/browser based and polished.
An email and subsequent perusal of one of the tools prompted this blog post, which I hope serves as a reminder that no tool is perfect: they are just tools, and the art is in how you use them.
At this point, I want to make it clear that this is not an attack on Moz.com. Moz is the solution I recommended to the client after looking at many. Of all the products, Moz is the best known, and it is clear from the outset that it has been designed and built by people who understand search engine marketing. I like it and would happily use it.
I should also caveat the post: this is a very limited test, and the result could be a freak one because the particular keyword/phrase in question is not very competitive.
I do, however, hope that it helps to show that you should not treat any automated tool as the gospel and follow every command blindly. Good tools help you in your role as a digital marketer but you must apply the grey cells when interpreting the data.
This weekend, I received an email from Moz.com with a snapshot of on-page grades. For the purposes of testing the various SEO tools, I used our own website with a few selected keywords. The email therefore showed pages on our own domain:
Straight away, my brain started to question why the home page and link detox pages on our site were deemed to be A grade, with no issues at all, yet don't rank very well. Compare that to the ethical SEO page, which has a lower page grade and one 'high priority' issue, yet enjoys the elusive no.1 spot on Google.
Clicking on the email took me to the page optimization score report:
I actually think that the page is laid out extremely well and manages to present quite a lot of information in a very friendly / unintimidating manner. For anyone who is not 100% confident in their on-page SEO, this is a very sensible report that introduces issues relevant to the page under scrutiny and shows how to fix the issues.
The big issue, in this case, is the fact that 'ethical SEO agency' (the phrase that was used to check rankings) is not actually present in the body content. There is also advice to use the target phrase in the meta description, URL and H1 heading.
I would agree with all of the above, although I do not personally believe in the advice about URLs; I have seen far too many people destroy rankings by changing URLs after seeing such a report, without any consideration of existing visibility or 301 redirects.
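To make the kind of check described above a little more concrete, here is a rough sketch in standard-library Python of how a tool might test whether a target phrase appears in a page's title, H1 and body text, and in its meta description. This is my own approximation for illustration only, not Moz's actual algorithm; real tools weight these signals rather than just flagging presence.

```python
from html.parser import HTMLParser

class OnPageChecker(HTMLParser):
    """Collects the text inside a few on-page elements as the HTML is parsed."""

    TRACKED = {"title", "h1", "body"}

    def __init__(self):
        super().__init__()
        self.depth = {tag: 0 for tag in self.TRACKED}  # nesting count per tracked tag
        self.text = {tag: "" for tag in self.TRACKED}
        self.meta_description = ""

    def handle_starttag(self, tag, attrs):
        if tag in self.TRACKED:
            self.depth[tag] += 1
        elif tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "description":
                self.meta_description = attrs.get("content") or ""

    def handle_endtag(self, tag):
        if tag in self.TRACKED and self.depth[tag] > 0:
            self.depth[tag] -= 1

    def handle_data(self, data):
        # Accumulate text for every tracked element we are currently inside.
        for tag in self.TRACKED:
            if self.depth[tag] > 0:
                self.text[tag] += data

def phrase_report(html, phrase):
    """Report where (if anywhere) the target phrase appears on the page."""
    checker = OnPageChecker()
    checker.feed(html)
    needle = phrase.lower()
    report = {tag: needle in text.lower() for tag, text in checker.text.items()}
    report["meta_description"] = needle in checker.meta_description.lower()
    return report
```

Run against a page where the phrase only appears in the meta description and body copy, a report like this would flag the title and H1 as missing the phrase, which is exactly the sort of output that should be weighed against how the page actually ranks rather than acted on blindly.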
The fact is, however, the page enjoys extremely good rankings for the keyword in question. It is pretty much always in the no.1 spot when I have looked.
The same report for a different page shows a much higher overall score and only two recommendations:
I don’t want to blow our own trumpet, but that is a great score. There is very, very little more that can be done in terms of on-page optimisation according to the gospel of Moz.com.
The page doesn’t rank very well though.
There are a few possible conclusions that can be drawn from this limited example:
To be honest, I think all of these conclusions have some truth to them. The reality is that getting to the sought-after no.1 spot on Google is very complex, and it simply isn't as black and white as any of the conclusions above.
I also think that the evolution of Google introduces some question marks about the value of some automated reports. I have seen plenty of examples of pages ranking very well for phrases that they really shouldn’t be ranking for (according to automated reports) as the keywords are barely mentioned at all. Google is getting better and better at understanding pages rather than simply analysing keywords.
What I do hope this small example proves is that you should not blindly accept what tools tell you. They do not always get it right. If you were to prioritise work in this example, you would be tinkering with a page that is doing very well, despite not having the best score according to Moz.com.
Tools should help you but the best digital marketers will know when to challenge the data tools are reporting. My advice? Use tools to do the heavy lifting but always take the time to analyse the reports carefully and do not be afraid to apply common sense.
What do you think? What tools could you not live without and how often do you see reports that you think are next to useless?