Google has seen some pretty drastic search system transformations in the past year, and Matt Cutts, the head of Google’s web spam team, is often the anchorman behind the breaking news and other important developments in the SEO industry.

And through the Googlewebmasterhelp channel on YouTube, you’ll also find him in front of the camera answering your SEO queries and positively reassuring you about your Google-related concerns. At least that’s the intention, I believe.

Matt’s videos often get mixed reactions from those in the SEO industry, due to some nifty sidestepping, the obvious promotion of Google’s products and services, and the inability to give a straight answer when the only answer available is “Google has got it wrong”.

Even so, there’s a lot to be learned from the videos, some of it right there in black and white and the rest hiding deep beneath the surface of Mr Cutts’ fear-inducing silver tongue. You just have to look hard enough.

So, keeping that in mind and with 2013 almost out of the way, let’s take a look back at the things we’ve learned/heard from Matt Cutts in the past twelve months from his YouTube videos. Make of these what you will, and I’d love to hear your opinions on these too.

Note: I’ll continue adding to this as Matt makes new videos towards the end of the year (updated 16th December 2013).

1. Are there any specific tips for news sites with regards to having current or developing news stories? Should the user keep updating the same page, or create a completely new article when new information becomes available?

For a news story, one page is all you really need. That’s where the PageRank accumulates, and it makes the topic very easy to share, all from one URL. Micro articles that link from one to the next also tend to lose people along the way, so it’s best to add updates and new information to the same URL. That one URL becomes your site’s authority page on the subject and will provide a much higher value to your site.

2. What should be included in a reconsideration request?

From a high level viewpoint the idea of a reconsideration request is to:

1. Tell Google you’ve stopped/fixed whatever violations were present within your website.

2. Give Google a good faith assurance that this won’t happen again.

Good things to include would be details of the sites you contacted to remove offending links, whether or not you hired a poor SEO firm that implemented black-hat techniques on your behalf, and what you’ve done to ensure it doesn’t happen again.

Basically include as much information as possible to convince Google you’re doing things right this time and intend to do so for the foreseeable future.

3. What is Google doing about negative SEO?

Google has put a lot of thought into its algorithms, not only for the rank calculation side, but also with regards to what one user could do to another in an intentionally harmful way. They have tried to make the algorithms robust and resistant to negative SEO, but of course this is not always possible 100% of the time.

As Google evolved to target link networks, they also implemented the disavow tool, giving the user the ability to remove any unnatural links. This is where the user should start if they suspect foul play.

4. Does Google take action on sites that keyword stuff?

Yes. Google does not reward keyword stuffing, and it’s a complaint Google will always take very seriously. For sites that read almost as if a dictionary of similar words had been thrown up on the web one after another, penalties are applied and the user will often receive warnings in Webmaster Tools.

5. What percentage of page rank is lost via a 301 redirect?

The amount of PageRank that dissipates through a 301 is roughly the same as the amount that dissipates from one link to another.

6. Should you use the autocompletetype attribute on web forms?

Filling out web forms can be a real pain, so it’s highly recommended that you use the attribute on your forms. There is no SEO value in using the attribute, but from a user’s point of view you’re likely to get more completed forms if the browser does most of the work for them.
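As a sketch, a form using the experimental attribute might look something like this (the field names and values here are illustrative, and the attribute was later standardised as the plain autocomplete attribute):

```html
<!-- Hypothetical sign-up form using Google's experimental
     x-autocompletetype attribute (later standardised as autocomplete).
     Field names and values are illustrative. -->
<form method="post" action="/signup">
  <input type="text"  name="fullname" x-autocompletetype="name">
  <input type="email" name="email"    x-autocompletetype="email">
  <input type="text"  name="city"     x-autocompletetype="city">
  <button type="submit">Sign up</button>
</form>
```

With hints like these in place, the browser can pre-fill most of the fields for the user.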

7. If you have a ccTLD, can you indicate your geographic location for SEO purposes?

Google automatically assumes that websites using a country-code TLD are located in that country, but this also depends on the percentage of websites actually using it from that location.

For example, the .co domain extension used to be used exclusively for Colombia, but is now being used more and more as a .com-style extension for companies. If 90% of .co domains were registered from, say, the United Kingdom, Google wouldn’t pay as much attention to the .co extension as a location marker for Colombia.

Be careful though. With the above points in mind, choosing a ccTLD without researching it carefully can obviously have a negative SEO impact.

8. Does Google use a set standard for manually removing webspam?

The ‘human’ webspam team have very good training on how to spot webspam, and quite often run spot checks to ascertain how consistent the team members are. For anything that seems against the spirit of the quality guidelines, Google reserve the right to take action and will do so.

9. Is WordPress or Blogger better for SEO?

Both have advantages. On one hand, Blogger is very easy to get started with and perfect for casual blogging. WordPress, on the other hand, has more flexibility. Both can rank very well if used correctly.

On the whole, a standalone WordPress installation seems to be the preferred choice of SEOs due to its SEO plugins, rather than the hosted blogging services mentioned in the question.

10. Can someone other than me verify ownership of my site by adding a “verify” meta tag to the content they add?

Google only look for the verification meta tag in the HEAD section of the site, so it’s not possible unless the user has access to it.
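A minimal sketch of where the tag has to live (the content token below is a placeholder, not a real verification code):

```html
<!-- Google only honour the verification tag inside <head>;
     the content value is a placeholder. -->
<head>
  <meta name="google-site-verification" content="PLACEHOLDER-TOKEN">
</head>
```

Anything a commenter or contributor injects into the page body is ignored for verification purposes.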

11. What does Google think of single page websites that use one page and CSS/JavaScript?

Google has improved where this is concerned, but it’s advisable to run a test first before you adopt a one page process. Each implementation (WordPress themes for example) is different, so make sure your site implements these techniques correctly, has an identifiable structure and can be indexed correctly.

12. If my website is down for a few days or longer, is there any way to get my Google rankings back?

Firstly make sure everything is back up and where it was before and that the problem won’t keep happening. Google keeps a record of sites that disappear, and assuming it’s for a legitimate reason, your rankings should recover to where they were previously over time.

13. Where will Google search be in ten years?

It’s difficult to ascertain what the future holds for Google, but most predict full voice recognition, a greater ability to understand context, and even conversational skills with multiple back and forths will be present. Google is aiming for a higher level of artificial intelligence and the ability to synthesise information by looking at multiple data sources.

In the future it might just be possible to ask:

“Find me a web programmer who lives in England and label the programmers in red that have WordPress knowledge”

14. Why do sites with bad backlinks STILL rank highly?

As good as it is, Google is still learning. The algorithms aren’t good enough yet.

It’s important to remember that while it may appear that you can view a site’s backlinks through a number of tools widely available, unless you’re actually the website owner it’s unlikely you will see all the backlinks that Google does, no matter what methods you use to list them.

15. How can a true authentic business compete with one that’s just SEO’d to the max?

Google is “trying” to mirror the world. If your business is huge offline, make sure you have a good, properly optimised website and give users a platform to share. The rest will take care of itself.

16. Why does a new page’s ranking slowly decrease over time after ranking highly for the first few weeks?

This is often because better versions of content similar to your web page become available over time. For example, in the event of a natural disaster, if you’re one of the first reporting the event you’ll rank highly by being a big fish in a small pond. Once the event becomes more widely reported, you essentially become the small fish in a big pond.

17. Should I buy a domain name that has a lot of spam links attached to it?

Probably not. Don’t be left “holding the bag”. You’d need to do a lot of work documenting what you have done to fix things and file a very solid reconsideration request. Google won’t make it easy to repair domains with bad backlink profiles, because that would create a loophole for negative SEO techniques to avoid punishment.

Matt Cutts himself states he would pass on buying a good domain with spam links.

18. Do internal website links with the same anchor text affect your SEO and count as duplication?

Typically not. Internal links are treated differently in this case, since Google expects duplication throughout a website, but it’s still not wise to overly stuff your internal content using the same anchor text over and over. Do it naturally to primarily enhance the user experience and don’t try to over optimise your website this way.

19. What are the top areas that SEOs make the most mistakes?

SEOs make mistakes across the board, but here are the five mistakes Google sees most frequently according to Matt.

1. Not making a site crawlable or not having a domain at all.

2. Not including the right words on the page, or not making sure it’s readable.

3. Focusing on link building instead of creating quality content.

4. Not paying enough attention to your page titles and page descriptions.

5. Not using webmaster resources that Google provides online.

20. Should I still use the HTTP Vary: User-Agent header for specific mobile websites after many big players said they don’t support it?

Google still supports the Vary: User-Agent header, and it’s still an important consideration when choosing which content to deliver. Again, however, it’s important to use this in a genuine way. Exploits exist, but Google are aware of them and will penalise you accordingly.
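For illustration, a response using the header might look like this (a hedged sketch; the exact headers your server sends will differ):

```http
HTTP/1.1 200 OK
Content-Type: text/html; charset=utf-8
Vary: User-Agent
```

The Vary: User-Agent line tells caches and crawlers that the HTML served at this URL can differ depending on the requesting user agent, for example between desktop and mobile browsers.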

21. What should we expect in the coming months (as of May 2013)?

What’s likely coming is changes to queries to avoid another payday-loans fiasco, eliminating pages and sites that rank well due to spam links, better detection of hacked sites, and a greater emphasis on determining the authority of sites and users. Paid advertising will no longer pass PageRank in the future.

Another change is that multiple search results from one site that are similar will be replaced with just one result – the one that’s most relevant to the context of the query at that time.

22. Why does Google continue to show multiple results from one domain?

This one’s great since it directly relates to the question above – something Google didn’t properly address or just hadn’t got around to changing yet.

Google introduced “host clustering” which stopped this to some extent and attempted to only produce a maximum of two quality results. This was quickly exploited by varying host names and sub-domains, but today you will typically see just one result per domain and only see more than this if the results are truly high quality or there is no competition. Google want diversity in all searches.

Even so, the further you go down the search results pages, the more likely you are to see duplications.

23. How can Google be truly confident with their SERPs if the subjective signals that influence rankings are created by humans themselves, who each have a unique perspective or bias towards any given subject and can’t possibly accurately understand the exact context of any given query?

Each search engine has its own philosophy and perspective and Google is no different. There is no scientifically provable “best way” to rank websites, but by listening to outside feedback and using an experienced infrastructure of information research analysts, Google is hoping to make SERPs return a set of results that are as logically correct as they possibly can be.

24. Is the colour/style of your shirt related to the SEO topic you are talking about?

Yes, sometimes. Every now and then we like to throw in an “inside joke” or two. Take a look at the image below for both 2013 videos on duplicate content. Funny, Matt. Funny.


25. What are advertorials?

Advertorials are native advertisements that appear on a website only because someone paid a sum of money for them to be there. They’re not classed as genuine recommendations or links to follow, since paid links that pass PageRank change the Google landscape and mean those with the most money win every time. Paid links should always carry the rel=nofollow attribute.
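As a sketch, a compliant advertorial link would look something like this (the URL and anchor text are placeholders):

```html
<!-- A paid placement marked so it passes no PageRank;
     href and anchor text are placeholders. -->
<a href="" rel="nofollow">Sponsored: Example Advertiser</a>
```

The rel=nofollow attribute tells Google not to flow PageRank through the link, which keeps paid placements from influencing rankings.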

26. Which aspects of Google updates do the SEO industry seem to perceive incorrectly? Where do SEOs waste the most energy?

Firstly, that Penguin, Panda and Hummingbird were just changes designed to increase Google’s revenue from AdWords. Revenue actually decreased after each update. Short-term revenue isn’t important to Google when compared to long-term customer loyalty.

As for wasting energy, many SEOs think the whole industry is about just building more and more links. This shouldn’t be the central focus. Genuine quality content, a great user experience and authoritative social signals should be the areas they are addressing. The high quality related backlinks will follow naturally from that if the strategy is implemented correctly.

27. What are the common mistakes of people using the disavow tool?

1. The file should be a regular text file, not a spreadsheet or Word document.

2. People tend to pick and choose links, treading carefully before disavowing uncertain ones. If you have a bad backlink profile it’s often better to “take a machete” to the links and go a little deeper. If you’ve used bad SEO in the past, Google prefer a much more “serious” approach to disavowing, to show you’re serious about the cleaning process.

3. The syntax of the text in the files needs to be correct.

4. The story of why you are disavowing should be via the reconsideration request, not inside the disavow tool as a comment.
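Putting those points together, a well-formed disavow file is nothing more than a plain text file like this (the domains and URL below are placeholders):

```text
# disavow.txt - plain UTF-8 text, one entry per line.
# Comment lines start with "#"; keep the explanation of *why*
# for the reconsideration request, not this file.
```

Upload the file through the disavow tool in Webmaster Tools; the syntax errors Matt mentions usually come from the stray formatting a spreadsheet or Word document adds.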

28. Has Google improved manual web spam actions and messages?

Google has recently increased the number of web spam actions and recommendations via Webmaster Tools to be much more specific, providing more and more information as to why your link or links are being flagged. This will continue to improve in the future.

29. Does using stock photos on your website have a negative SEO effect? Are original photos better?

Not really. Google don’t use this factor in their algorithmic web ranking right now, but it may be something they will consider in the future.

30. Why are some Google services unavailable in Albania? Does Google hate Albania?

Not at all. Google only launched services there in 2013 and is continuing to increase support in Albania. There’s still a long way to go and work to be done, but it takes time to be able to support countries properly.

31. Will the Webmaster Tools API be updated allowing the user to retrieve “search queries” and “backlinks” data?

Yes. Google are working on it and will release the updates shortly. The following links will help users until these updates are rolled out.

Download the Search Queries data using Python:

Download Webmaster Tools data using PHP (not an official Google project):

Ask an API question in the Webmaster Help Forum:!forum/webmasters

32. If I write about another article and provide credit to it, should I link it in anchor text in the opening paragraph or in a separate source link at the bottom?

PageRank flows either way, so it doesn’t matter where the link appears. It’s often a better user experience to state the source at the top, but that’s a matter of preference.

33. If my site is down for a day, will my rankings be affected?

If it’s down for a day, it’s unlikely to be affected. Google try to compensate for short periods of downtime.

34. Does Google need lots of text to understand my website?

Google still needs text to index your site correctly, and would not recommend turning your website into an entirely image based resource, no matter how pretty it looks. The value you get from text is infinitely higher than anything else crawled on your website.

35. Why doesn’t Google provide examples of bad incoming links to my website in Webmaster Tools?

Google are working on providing more transparent data and will continue to keep improving this.

36. If I have multiple domains should I link them all together?

If you’re trying to “work” Google by linking keyword focused domains together to generate PageRank for each, it’s not a good practice. The only time this should be applicable and won’t appear “spammy” is if you’re using top level domains for a number of different countries.

Either way, unless the sites are closely related it’s unlikely PageRank will flow as you’d intend it to.

37. How does duplicate content that’s legally required and appears on many pages (Terms and Conditions for example) affect your site in the SERPs?

Unless the content that’s duplicated is spammy or keyword stuffed, you shouldn’t worry. Nonetheless, you are probably better off having one Terms and Conditions page and linking to that page where necessary.

38. How does Google treat hidden content which becomes visible when clicking a button?

As long as the user experience is high quality and you’re not trying to hide a mass of keyword-related content to manipulate Google, your site won’t be affected too much. As long as Googlebot can access the information in the first place, of course.

39. Will Webmaster Tools ever give examples of which links of pages cause a penalty?

Google sometimes already manually supplies examples, and over time you will see more and more automated information giving you an idea of where to look to clean up your backlink profile.

40. Does the manual webspam team take the same actions toward counter spam for countries such as India as they do in the US?

Google target all areas and every language. Where possible, Google always try to internationalise the algorithms, and already have manual teams catering for 40 languages.

41. Should I be worried that my competition is creating negative backlinks to my website?

As long as you follow the correct guidelines to disavow and generate a reconsideration request you won’t have a long term problem. The responsibility falls on the user to clean up the mess though, and more importantly to monitor it actively.

42. Should I add rel=nofollow to widgets and infographics?

The abuse of widgets and infographics started to become widespread, and so today a widget or infographic won’t pass on PageRank nearly as much as an editorial link would. Including nofollow in your embeddable content is actually today’s best practice to be on the safe side.

43. Is site loading speed more important for mobile devices?

All things being equal, slow loading sites will rank lower. This isn’t specific to mobile devices however and is a factor that’s consistent across the board. That said, it’s advisable to make your website load as fast as possible for mobile devices to improve the user experience.

44. What is Google doing to manually support webmasters? Why don’t I get manual responses to my reconsideration requests?

The problem relates to scale. With over 250 million domains in existence, Google simply wouldn’t have enough staff to handle such a high volume. Google took manual action on over 400,000 sites up until 2013 so the manual service obviously exists, but with over 2 billion queries a day, one on one threaded conversations are unrealistic.

45. Does Google take action on sites that automatically pick up search queries and then provide a page for your exact query with no results and that’s full of adverts or unrelated content?

Nobody likes following a link that’s provided by Google only to see a ‘0 results found for searchterm’ message.

If you see something like that, please send a spam report to Google. Google doesn’t yet eliminate these things effectively, but is willing to take action, and this will be addressed further in the future.

46. Can nofollow links harm my site’s rankings?

There are hardly any cases that nofollow links can hurt your website, but if you’re manually spamming comments and it becomes noticeable to Google, manual webspam action may follow.

47. How will a webmaster know if Panda has affected their website(s)?

If you think you might be affected, the overall goal should simply be to ensure you have high quality content on your site and that your backlink profile is authentic. Try to eliminate duplicate content, or content that’s derivative or just not useful.

48. Will having a website on both IPv4 (Internet Protocol version 4) and IPv6 cause duplication issues?

No, it won’t be considered duplicate content.

49. What can e-commerce sites do to avoid duplicate content when the same information may apply to a wide range of products?

If you’re using the same content that everyone else has on the web (affiliate marketing for example) you will find duplication penalties apply. You should try to differentiate yourselves from the competition and vary internal content wherever possible.

50. Does Googlebot really care about valid HTML?

There are many reasons to ensure your site has valid HTML, but Googlebot compensates for HTML errors and doesn’t penalise you for it. As long as the site is able to be crawled correctly the validity of your HTML won’t be a factor.

51. Should I use rel=nofollow for internal links – for example links to a login page?

For links internally you will typically want the PageRank to flow, so it’s good to leave off the nofollow. There are cases where this may not be appropriate, but in general it’s good to let Google assess what’s important on your website.

52. Why isn’t my site’s PageRank changing?

Google PageRank is only updated periodically and isn’t something you should focus upon frequently. Focus on the quality of your website and how you share the content before anything else.

53. Is using Geo-Location scripts considered spam?

Geo-location isn’t considered spam if it’s used as it should be. However, if you treat Googlebot differently to regular users you will have a problem. Google considers this cloaking, and it will result in penalties. Treat Googlebot as you would a user and you will have no problems.

54. It’s hard to trust the results Google provides today. Should webmasters who are having problems getting quality rankings focus on more than just search engine results and work on getting leads from social media and other avenues?

Google will continue to improve its search service, but if you’re serious about running your business you should already be diversifying your marketing as much as possible. Don’t put all your eggs in one basket. Quality external marketing often translates into a positive impact in the online market too.

55. How can I guest blog without looking like I’ve paid for links?

Usually there’s a clear distinction between someone who is a high quality and genuine guest blogger and someone who does this just for spam. Google can detect these things quite effectively and this area will continue to be vastly improved as guest blogging increases. Guest blog in moderation and do it selflessly and of the highest quality and you won’t find an issue. Low quality guest blogging won’t pass on PageRank in the same way.

56. In terms of SEO, what is the difference between the B and STRONG tags in text?

In SEO terms these don’t have any real measurable difference and both do the same thing, but it’s important to realise we’re in the age of semantic search. If bold text stands out to users and is designed to provide emphasis, there’s the possibility that Google will also see this as a marker and record it accordingly in the future.

57. Does a site rank better if it has a lot of indexed pages?

The number alone won’t be enough to rank you higher. However, if you have more pages with different keywords on them you’ll likely find a wider scope of search results will include your website’s content.

58. Should I add markup on my videos, even if they’re on Youtube?

The more markup there is, the easier it is for search engines to ascertain exactly what’s going on in a video. It’s also essential that you allow Googlebot to crawl your Javascript and CSS for the same reason.
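One common way to add that markup is schema.org’s VideoObject vocabulary; a hedged sketch, with every value a placeholder, might look like this:

```html
<!-- Illustrative schema.org VideoObject markup (JSON-LD);
     all values are placeholders. -->
<script type="application/ld+json">
{
  "@context": "",
  "@type": "VideoObject",
  "name": "Example video title",
  "description": "A one-line summary of the video.",
  "thumbnailUrl": "",
  "uploadDate": "2013-12-16",
  "embedUrl": ""
}
</script>
```

Even for a YouTube embed, markup like this on your own page helps search engines connect the video to your content.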

59. Is responsive design or traditional design better for SEO?

If you create a dedicated mobile site (which is essentially a standalone entity with its own SEO value and optimisation), and it’s optimised in the exact same way as the desktop site and has the same content, then there will be no real difference in SEO. Responsive sites are for the most part completely identical to the desktop site, so with regards to SEO there is no shortfall to begin with.

60. Are all comments with links considered spam?

If your comment is authentic, isn’t using obvious keyword based anchor text and adds value to the source content, then it’s a nice social signal that can add a little to your overall authority level, whether the comment itself is nofollow or otherwise. The link itself may not pass on PageRank, but it will indicate you’re engaging constructively with others in your genre and your opinion has value.

61. Do all pages need a meta description?

Google does a pretty good job of creating its own page description, so unless you want to be really specific with something it’s often just as useful to leave it blank.

62. Should I use the disavow tool even if no penalty has been applied?

If you’re unsure of any incoming links to your site, feel free to just go ahead and disavow. Google appreciates the effort in creating a clean backlink profile.

63. How many links on a page should we have? Is there a limit?

There used to be a limit of around 100 links, but today it’s much larger. Even so it’s important that you keep it to a sensible number to improve the user experience first and foremost.

If you’re doing it to gain PageRank elsewhere, it’s important to know that if you have 100 outgoing links on a page, the PageRank you pass on is shared equally between all 100 of those links, so each destination URL receives 1/100th of the PageRank the page can pass on.
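The arithmetic is simple enough to sketch (this is a simplified illustration of equal splitting, not Google’s actual formula, which also involves damping and other factors):

```python
# Simplified illustration: a page's passable PageRank is split
# equally across its outbound links. This ignores damping and
# every other factor in Google's real calculation.
def pagerank_share(passable_rank: float, outbound_links: int) -> float:
    """Return the fraction of rank each linked page receives."""
    return passable_rank / outbound_links

# With 100 outgoing links, each destination gets 1/100th.
print(pagerank_share(1.0, 100))  # 0.01
```

So doubling the number of links on a page halves what each individual link passes on.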

64. Is it a good practice to combine small portions of content from many sources to create your own article if you credit the source articles?

“Stitching” is something search engines really don’t like and won’t give you the SEO benefit you’d desire. Writing unique articles based on the portions you would copy in your own completely original way and providing additional thoughts or information is really the only way to do this and gain SEO value.

65. How can a site recover from a period of spamming links?

This really depends on how severe the offence is. The more severe the offence, the harder you will have to work to convince Google that (a) you won’t do it again, and (b) you’ve worked hard to clean it up and reverse the damage. Large scale action such as disavowing huge amounts of backlinks helps Google build a little trust and have faith in you.

66. What should I know if I am considering guest blogging?

Basically the number one rule is to provide high quality content that engages the user and to avoid using “spammy” practices. With guest blogging becoming something of a fad at the moment this is an area Google is actively monitoring, and even if you’re guest blogging on a high PageRank site you won’t always get that PageRank passed on if the article you provide is of no value.

67. How does Google handle duplicate content?

When Google sees duplicate content it attempts to analyse and group it all together to give the user diversity for the search term. It doesn’t treat the content in the same way as spam, but instead tries to cluster the information appropriately, only showing the best result for the user.

Obviously it doesn’t treat the duplication of Googlewebmasterhelp Youtube videos about the subject in the same way.

68. With a number of top level country based domains, is it okay to have the same IP address for each?

In an ideal world it would be great if you could have a separate IP address for each domain, but in general, as long as you have different top level domains, Google is able to distinguish between them.

Final Thoughts

After trawling through five hours of Matt Cutts sound bites for the benefit of this article, it was apparent that Matt provided very useful information in 2013 via these videos. That’s a given. Some of these things might be eye-openers to even the most knowledgeable of SEOs out there.

Even so, the ‘take home’ from watching these videos again wasn’t an increase in SEO knowledge for me personally. I actually started to feel uneasy the more videos I watched, and was left with a tangible level of distrust. And it’s not that I don’t like Matt Cutts. Matt has continually provided a great service to the SEO community, one I’m thankful for. But the one thing you’ll start to understand when you really look closer, or read between the lines in these videos, is that Matt isn’t actually delivering on Google’s promise of being totally transparent – far from it. As I see it, and feel free to correct me if your opinion differs, he’s actually being failed by the Google products he’s promoting and is being forced to sidestep as a result.

Google are well aware of their shortcomings, but won’t actively admit to them. Of course, admitting your product isn’t as good as the public perceives isn’t exactly common business practice, but nevertheless hiding it isn’t anything like the transparency we were promised as customers.

In today’s SEO world it’s disappointing to me that black hat SEO and spam techniques are still rampant. If you compare some of the techniques used and measured as effective in black hat forums of late to Matt’s comments that claim they’re simply not possible, it becomes clear that some of the information provided in Matt’s videos amounts to little more than bravado or misdirection.

Black hat exploiters winning the spam war I can live with, but if we’re fighting Google’s failings and misdirection too, then that’s a whole different ball game.