An Illustrated Guide to Matt Cutts’ Comments on Crawling & Indexation

Posted by randfish

Late last week, Eric Enge of Stone Temple (and a co-author of mine on The Art of SEO) published a fascinating interview with Google’s head of Webspam, Matt Cutts. I think the whole of the SEO community can agree that Matt taking time for these types of interviews is phenomenal and I can only hope he does more of them in the future. Understanding more about Google’s positions, their technology and their goals will benefit website creators and marketers dramatically.

The interview itself is certainly worth a read, but as one mozzer noted to me in the email thread on the subject, "I’m embarrassed to say I couldn’t make it all the way through." Fair enough; that’s why I’m presenting Matt’s primary points in graphical, cartoon format. I’ve also included some ad-libbing, interpretation and fun in these. Only the bits surrounded by quotes were taken directly from Matt’s words, so please keep in mind that this is my opinion of what Matt means (along with the occasional editorial).

#1 – There is No Hard Indexation Cap; But Indexation Has Limits

#2 – Duplicate Content Might Hurt Your Indexation

#3 – Lots of Qualifiers on Whether Affiliate Links Count

#4 – 301 Redirects Pass Some, But Not All of a Page’s Link Juice

#5 – Low Quality, Non-Unique Pages Might Drop Your Indexation

#6 – Faceted Navigation and PageRank Sculpting are Thorny Issues

Personally, I liked how much Eric pushed Matt with scenarios that would require some advanced methods of showing faceted navigation to users but not search engines. However, I also understand that Matt needs to take a position that’s right for 95% of site owners 95% of the time or risk creating a new "PR sculpting" issue.

One other item that really stood out and got me excited was this response:

Matt Cutts: (with regard to links in ads) Our stance has not changed on that, and in fact we might put out a call for people to report more about link spam in the coming months. We have some new tools and technology coming online with ways to tackle that. We might put out a call for some feedback on different types of link spam sometime down the road.

That sounds really good. A huge frustration for the SEO world has been that so many SEOs perceive their competitors to be outranking them with black/gray hat linking techniques and feel they must engage as well in order to stay competitive. Shutting this down, or making SEOs feel that Google takes consistent action when obvious manipulation is reported, would go a long way toward quelling this thorny problem.

My last recommendation is that you check out Eric’s 29 Tidbits from my Interview with Matt Cutts, a post that neatly summarizes a lot of the critical information and takeaways.

To end, I thought I’d add the four questions I wish Eric would have asked Matt (maybe next time!):

  1. With Google’s new recognition of internal anchor links and listings of those URLs in the search results, is it still safe to link to internal anchors on pages and trust that the link juice will flow to the page as a whole, or are content blocks inside individual pages now being treated as unique entities?
  2. With the handling of nofollow changing and Google crawling/executing Javascript, what’s the best way to link to a document on the web so human visitors can access it but search engines cannot WITHOUT wasting link juice/PageRank (robots.txt, for example, couldn’t do this) or cloaking?
  3. Does Google now (or will you in the future) consider the sharing/linking activities happening on Twitter, Facebook, etc. to have any impact on the overall link graph of the web (assuming we’re talking only about those links that don’t make their way onto standard web documents)?
  4. When people ask the question, "why is my competitor ranking so well with low quality/manipulative links?" you often reply that they should be careful in presuming that Google hasn’t already discounted the value of spammy links and the competitor is actually ranking on the basis of quality link sources. This creates an environment where marketers are constantly trying to discern which links pass value and which don’t – could you give advice for relatively savvy, experienced SEOs to help them make those determinations so they can pursue the right links and stop paying spammers for the wrong ones?

If you’ve got thoughts to share, outstanding questions from the interview, comments on my amateur drawings, or things you wish Eric had asked Matt, feel free to post them below.


