OK, so the world of search has had something juicy to get its teeth into for the last week or so, in the form of what Danny Sullivan named the 'Farmer' update, mainly because it was meant to stop content farms from dominating the SERPs.
Wired.com got a great interview with Matt Cutts and Amit Singhal, who are the heads of spam and quality for Google. I am not going to go into too much depth, as those who want the full transcript can read it here. So here are the salient elements, some known, some suspected, but all now confirmed.

The Caffeine update brought faster crawling and a far deeper crawl, which resulted in more pages of poor quality content in the index. Google already have an 'unreadable gobbledygook' filter, but this content was semi-readable, and that was a whole new issue. As Matt said, it was like, "What's the bare minimum that I can do that's not spam?". So there you have it; has he indicated that spun content is not spam if it is readable?
When asked how they recognise shallow content, the reply was clear: they can't do it algorithmically. So they went back to the old TrustRank idea that I posted about many years ago and got people to manually seed the trust. Amit said:
"We basically sent out documents to outside testers. Then we asked the raters questions like: 'Would you be comfortable giving this site your credit card? Would you be comfortable giving medicine prescribed by this site to your kids?'"
but then Matt dived in with:
"There was an engineer who came up with a rigorous set of questions, everything from: 'Do you consider this site to be authoritative? Would it be okay if this was in a magazine? Does this site have excessive ads?' Questions along those lines."
We should look at that in detail, because what appears as a throwaway remark is in fact telling. DOES THIS SITE HAVE EXCESSIVE ADS!!! That right there is an element some of us have been discussing.
Matt then stated that they had compared the overlap between the Chrome site blocker and the new Panda update (named after the Google engineer who created the specific technology to run this within the algorithm). When asked how you implement that algorithmically, Matt said it can be done if you rely on intuitive elements. This limits it to things like the over-commercialisation of content, for example.
Another comment of note! Matt said:
"I got an e-mail from someone who wrote out of the blue and said, 'Hey, a couple months ago, I was worried that my daughter had pediatric multiple sclerosis, and the content farms were ranking above government sites.' Now, she said, the government sites are ranking higher. So I just wanted to write and say thank you."
Note the use of the term 'content farms'!
When asked why Suite101 got tanked, yet Demand Media didn't, Matt replied:
“I feel pretty confident about the algorithm on Suite 101.”
This to me indicates they consider Suite101 to be a content farm and poor quality.
Not a lot there, other than maybe a couple of 'between the lines' comments that can be taken this way or that. My personal tests tell me that Google has hit the rankings, but has not stopped the sites that were hit from passing link juice. Probably once the dust settles, that will be the next step. ALTHOUGH it should be noted that, miraculously, just before the update, Google Knol changed all its outbound links to nofollow. Maybe they knew something the rest of us didn't 😉
A very good read - well done, Old Welsh Guy!
Todd Johnson
VERY interesting. Half of this stuff I'm still trying to grasp. I listened in to that interview. It sounds like you are trying to pick up inferred information based on some of the things these guys said. Makes sense. On first thought, I wouldn't have attempted to see it that way. On another note, it makes me wonder if I should put nofollow on some of my outgoing attribution links.
Having just started the SEO game and writing articles etc., whilst I applaud Google trying to hit copied content and content farms, it feels a bit hard on the people who write good content. Hopefully Google will make some adjustments, because article directories have lost traffic, and copied stuff was ranking higher than genuine content, which is silly.
@Danny,
The only reason for placing nofollow on an outbound link is if you don't trust the target page, which then raises the question: why link to it at all? Nofollow is ideal where, for example, you are writing about the spamming methods used, and this requires you to link to bad neighbourhoods.
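For anyone unsure of the markup, nofollow is just a value of the `rel` attribute on an ordinary anchor tag (the URL below is a made-up example):

```html
<!-- A normal outbound link: passes link juice to the target page -->
<a href="http://example.com/good-resource/">a trusted resource</a>

<!-- A nofollow link: tells Google not to pass trust/link juice,
     useful when citing a bad neighbourhood you don't vouch for -->
<a href="http://example.com/spammy-page/" rel="nofollow">a spam example</a>
```

The attribute goes on each individual link, so you can nofollow one outbound link on a page while leaving your attribution links untouched.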
@Kevin
My guess is Google will mess about with this a fair bit. If they thought they had got it right first time, they would have rolled it out across all their datacentres by now.