Rohan Ayyar works at E2M, a premium digital marketing firm specializing in creative content strategy, Web analytics and conversion rate optimization for startups.
Penguin 1.0 targeted sites with backlinks whose anchor text was overly optimized for money keywords. Penguin 2.0 went deeper and attacked sites that engaged in “black hat” practices – comments, directory listings, articles, internal pages of blog networks, and mass link buying of pretty much any sort all came under its purview. And from first impressions, Penguin 3.0 doesn’t appear to behave very differently from its earlier editions.
More importantly, sites that did good work after being burnt by earlier updates are seeing solid recoveries in visibility and rankings.
However, Google has passed over a menace it’s been threatening to exterminate for quite some time now: guest blogging for links.
In his Happy New Year to SEO message earlier this year, Matt Cutts minced no words in advising marketers to stick a fork in it. But even though we witnessed the subsequent drama of the My Blog Guest penalty, in my opinion, all hell (in the true sense of the word) is yet to break loose.
Penalties dished out for guest blogging have mostly been manual up to this point. Google is yet to punish us with a guest blog penalty that scales.
So what I’m actually wondering is, “What if Google had penalized sites for links from guest posts with Penguin 3.0?”
Why would it do that?
Google’s stated goal is to get rid of “unnatural” and “manipulative” link building. And one has no option but to take its ever-changing definitions of these terms at face value.
The closest you can come to guessing what Google intends (rather than what it says) is to read the Link Schemes section of its Quality Guidelines, where Google says it looks to deal severely with several types of link schemes.
The keywords for me here are large scale, campaigns, and anchor text links. These are what the algorithm will be looking to analyze.
But I digress. The point is, whether a link is unnatural or manipulative is open to Google’s interpretation, not yours.
Penalizing sites for guest posts with plugged links would be a utopian achievement for Google’s spam detection algorithm, because the patterns are harder to detect than those of directory and article sites. But that’s not going to stop Google. It’s got a job to do.
- “Land backlinks from authority sites?” Eh?
- “Part of your SEO campaign?” No!
- “Guest blogging domination!” Please!
Let’s face it: whether you target “high authority” sites, get nofollowed links (more on this later), say “contribute” instead of “guest post,” go niche, or include infographics, there is no such thing as safe guest blogging.
Can Google do that?
Penguin could even incorporate Panda-like text/image signals (it already reads anchor text) to raise the bar for naturalness. Google would basically look at the sites from which you’re getting your links, and ask these questions:
- Are these sites of good quality in terms of perceived trust and editorial control?
- Are they in the same niche as, or do they have a category topically relevant to, the entity being linked to?
- On a balance of probabilities, does it appear they might accept compensation for links?
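The three questions above can be read as a crude classifier. Here is a minimal, purely hypothetical sketch of that idea: all of the signal names, values and thresholds below are invented for illustration, since Google's actual signals are unknown.

```python
# Hypothetical sketch of the three-question check above.
# Signal names and thresholds are invented; Google's real signals are unknown.
from dataclasses import dataclass

@dataclass
class LinkingSite:
    trust_score: float            # 0..1, perceived trust / editorial control
    topical_overlap: float        # 0..1, topic similarity with the linked entity
    paid_link_probability: float  # 0..1, estimated odds that links are compensated

def looks_unnatural(site: LinkingSite) -> bool:
    """Flag a linking site if it fails any of the three checks."""
    return (
        site.trust_score < 0.3
        or site.topical_overlap < 0.2
        or site.paid_link_probability > 0.7
    )

print(looks_unnatural(LinkingSite(0.9, 0.8, 0.1)))   # trusted, on-topic: not flagged
print(looks_unnatural(LinkingSite(0.9, 0.05, 0.2)))  # badly off-topic: flagged
```

The point of the sketch is only that each question maps naturally to a threshold test, and failing any one of them is enough to raise suspicion.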
Google might even go for a new kind of penalty involving manual actions depending on flags that are raised algorithmically when a set level of unnaturalness is detected, provided they have a well-defined process in place.
How would Google do that?
This is where I begin the wild speculations. Anyone who’s done any amount of serious Web marketing or blogging can tell when a link is plugged into a guest post. And the way to tell that is by observing patterns.
There are a few linking patterns characteristic of guest posts that Google can identify (and penalize) with little effort. In fact, it’s already doing so in many cases.
The first? Links from author bio. Duh. And while you’re at it, don’t think links with “brand anchor text” are safe. We all know how linking with exact-match anchor text was considered a no-brainer as opposed to “Read more” for so long.
Google can figure out, with the same ease as the author bio, patterns in the backlink profile (authority range), speed and interval of acquisition (“campaign” alert), and authorship of the content. James Finlayson warned us about it quite some time ago.
Google might also be able to figure out if different authors associated with the same company are linking to the same sites from their articles. This is one pattern that unambiguously associates a marketing agency with their client.
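The multi-author pattern described above is trivially detectable once you have (author, destination) pairs. A toy sketch, with entirely made-up data, of how several distinct authors all linking to one domain would stand out:

```python
# Illustrative sketch: several authors all linking to the same destination
# domain -- the pattern that ties a marketing agency to its client.
# All author names and domains below are invented.
from collections import defaultdict

# (author, destination_domain) pairs extracted from guest posts
links = [
    ("alice", "client-site.com"), ("bob", "client-site.com"),
    ("carol", "client-site.com"), ("alice", "wikipedia.org"),
    ("dave", "some-blog.net"),
]

authors_by_domain = defaultdict(set)
for author, domain in links:
    authors_by_domain[domain].add(author)

# A domain that three or more distinct authors keep linking to looks
# suspiciously like a shared client.
suspects = {d for d, a in authors_by_domain.items() if len(a) >= 3}
print(suspects)  # {'client-site.com'}
```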
Google has gathered all the user profile information it can from “content marketers” with its three-year-long authorship scheme (which, incidentally, in Google’s own words, wasn’t “adopted” by anyone outside the SEO community). That it has stopped showing author information in search results is no reason to assume it isn’t using that data internally for other purposes.
There are the in-between sites referred to as “community” or “niche” blogs. I like to think of them as the poor man’s Buzzfeed. They are usually well designed and have lots of engagement and inputs from the niche/community they cater to. It’s easy to get your post on them, because they publish members’ posts with little to no moderation.
It’s a cakewalk to spot unnatural guest posts on such sites. The classic modus operandi that spammers (I hate calling my bedfellows that) follow is to sign up, fill out a profile, and publish a post with typically two “contextual” links: one to a site you’ve never heard of and the other to Wikipedia.
Google has been increasingly targeting community sites with manual penalties, so they are now an endangered species.
When SEOs graduate to linking from within the content on blogs with editorial control, the first thing they do is list posts with a client’s site as one of the items. You’ve all seen the “5 Tools that Project Managers are Raving About” post (with aesthetic screenshots) delving into the nitty-gritty of Basecamp, Asana, Trello, Zoho and BrixHQ.
BrixHQ? That’s the one. How difficult do you think it is for Google to distinguish software that has 100 users from four others that have thousands? Or to pick a site ranking on page 15 from six others ranking on page 1 or 2?
Google targets not only sites with a bad inbound link profile, but also those with a bad outbound link profile. And these sites have – hold your breath – site-wide patterns.
Some host blogs require the guest blogger to include at least one link to another post on their site. Matt Cutts has said that internal links typically don’t get you into trouble, but it appears he was referring to links generated by templates, as opposed to those inserted by editors.
Other sites destined for grim penalties are those with posts unrelated to their core theme.
“Where should you buy your next home?” is not going to cut it on a site with Hardware, Infographics, Mobile, Social and Gaming on its menu.
Clicking through to the post, you find that it links to Shea Homes, and is authored by Now Sourcing (or their president, depending on the way you look at it). They’re a social media firm specializing in infographic design. You bet they know all about real estate!
Do they seriously think Google can’t see through that? Why wouldn’t Google port its text, ad, image and context reading abilities from a page to the whole domain?
So Google can spot a client’s link in a list of tools or tell it apart from Wikipedia. Then what does a smart guest blogger do?
Have more “researched” links to posts on other blogs and sneak in one to their client. Bad luck: the client wants a branded link. And so you have one link to a tool alongside five others to articles. Odd one out!
Okay, you convince your client you can get them a link from an “authority” site. Bad luck: They’re new to the business and don’t have a blog. Our indefatigable guest blogger decides to ride his “authority” anyway and implant a link with – what else – keyword matching anchor text.
The example above links to Hostt.com, which, as of this writing, bears no relation to Peter Daisyme (the subject of the paragraph), to being action oriented (the section heading), or to the traits of a successful entrepreneur (the title of the post).
The other links in the above article are still bona fide, similar to those in point #1. But, you (and Google) can make out what stands out.
Say Googlebot spots this link and decides to investigate further. It might find that another authoritative author has linked to it from another authority site:
Although it could be said that I don’t have the authority or expertise to say this, I will: there is no credible reason for this article to mention, let alone link to, this site using the anchor text it has. Free Web hosting companies have little to do with reasons not to invest in startups (the topic of the post), even if they happen to be founded by the post author.
Again, the anchor text screams keyword-rich. (“a” is a stop word that Google doesn’t count in queries, so why should it do so in anchor text?) And of course, all other links are to “well-researched” articles.
Google could also look at the relative placement or order of the unnatural links. In both the examples in point #2 above, the suspect link was placed at #2. Throw in a genuine-looking link first to put the editor at ease, and then get in the link you want. Then add four more links to taste. Easy-peasy.
Other, less careful authors tend not to place any link after they’ve slipped in theirs. That explains those posts with five links in the first half and none in the second.
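The placement pattern just described (all links crowded into the first half of a post, none after the plug) is easy to express as a heuristic. A minimal sketch, with invented data, where each link is represented by its position in the post normalized to 0–1:

```python
# Rough sketch of the placement heuristic above: flag posts whose links
# all cluster in the first half of the text. Each number is a link's
# position as a fraction of post length; the data is invented.
def links_front_loaded(link_positions: list[float]) -> bool:
    """True if the post has links and every one sits in its first half."""
    return bool(link_positions) and max(link_positions) < 0.5

print(links_front_loaded([0.05, 0.12, 0.2, 0.3, 0.42]))  # five links, all early
print(links_front_loaded([0.1, 0.5, 0.9]))               # spread out: fine
```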
Patterns, patterns, patterns
Time to move on to blogs that link out or promote spam (or black hat SEO, whatever you call it). We all know only too well that MyBlogGuest and PostJoint were penalized manually for being blog networks.
That would mean any site offering “guest blogging services” (having a page for the service or linking to it from a menu) or promoting a blog network is done for. And Marie Haynes has also said you could be targeted just for promoting ways to game Google. Scary.
Genius is as genius does
Back to tools. Now, I keep giving examples of sites that sell Web-based products or services, because those are the ones we can easily identify with. I could dig up links to real estate, insurance or auto sites that have been just as cleverly plugged, but as online marketers our most creative link building genius is reserved for Web/mobile products.
Let me start out by saying that Matthew Barby is a brainy guy and I’ve chosen some of his posts as examples because I know he deserves to be cited in a section with “Genius” in its header.
Matthew writes for Search Engine Land, the Moz blog and SEMrush. Just assuming he does link building for Buzzsumo (an excellent tool that helps you find high performing content and key influencers for any topic), you find the following stats for posts he has done in 2014:
- Search Engine Land: 3 out of 4 posts link to BuzzSumo
- Moz blog: 2 out of 3 posts link to BuzzSumo (in 2014)
- SEMrush: The only post here links to BuzzSumo (Okay, I lied. It doesn’t link, it cites. But perhaps SEMrush removed the link? They did that to me once.)
Six out of eight posts linking to BuzzSumo – are we on to something here? Could Google identify one or more sites an author cites or links to predominantly from his articles and run link:post ratios to determine if they’re “unnatural”?
| Site | Post Title | Section Heading |
| --- | --- | --- |
| SEL | How To Get Your Content Linked To From Top-Tier Websites | Finding Content Gaps |
| SEL | How Press Requests Can Be A Link Building Gold Mine | How To Respond To Requests |
| SEL | Why SEOs Need To Stop Automating Email Outreach For Links | Stop Over-Automating |
| Moz | The Power of Authors and Content for Link Building | Finding popular content |
| Moz | My Recipe for Success: How to Launch a Successful Blog | Content analysis |
| SEMrush | What Your Company Blog Is Doing Wrong, And How To Fix It | Understand the current landscape |
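The link:post ratio mentioned above is a one-liner to compute once you know which domains each post links to. A sketch with illustrative data (the counts mirror the six-of-eight figure above, but everything else is invented):

```python
# Sketch of the link:post ratio idea. Given the set of domains each post
# links to, compute what fraction of an author's posts link to each domain.
# Data is illustrative: 6 of 8 posts link to buzzsumo.com.
from collections import Counter

posts = [  # one entry per post: the set of domains linked from it
    {"buzzsumo.com"}, {"buzzsumo.com"}, {"buzzsumo.com", "moz.com"},
    {"buzzsumo.com"}, {"buzzsumo.com"}, {"buzzsumo.com"},
    {"example.org"}, {"another.net"},
]

domain_counts = Counter(d for post in posts for d in post)
ratios = {d: n / len(posts) for d, n in domain_counts.items()}

# Any domain cited in, say, more than half of an author's posts gets flagged.
flagged = {d for d, r in ratios.items() if r > 0.5}
print(flagged)  # {'buzzsumo.com'}
```

The threshold of 0.5 is arbitrary; the point is that a disproportionate ratio for one domain is exactly the kind of signal an algorithm could compute at scale.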
Clever link builders never mention a product, service or tool by itself. They always club it with similar (which is different from competing) sites, so that the link/citation doesn’t stick out. Let’s call these winglinks.
Supposing Google sees two or more tools being cited or linked together in a lot of articles, they might figure out, based on ratios incalculable by us non-PhDs, which domains to penalize and which ones to ignore.
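A speculative sketch of how such co-citation (“winglink”) counting might work: tally how often pairs of domains appear together in the same article, then inspect the pairs that co-occur most. All domain names and data below are invented.

```python
# Speculative sketch of winglink detection: count how often pairs of
# domains are linked together in the same article. Pairs that co-occur
# far more often than chance would warrant closer inspection.
from itertools import combinations
from collections import Counter

articles = [  # one entry per article: the set of domains linked from it
    {"toolA.com", "toolB.com", "toolC.com"},
    {"toolA.com", "toolB.com"},
    {"toolA.com", "toolB.com", "toolD.com"},
    {"toolC.com", "toolD.com"},
]

pair_counts = Counter()
for domains in articles:
    for pair in combinations(sorted(domains), 2):
        pair_counts[pair] += 1

# The most frequently co-cited pair
print(pair_counts.most_common(1))  # [(('toolA.com', 'toolB.com'), 3)]
```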
Social engineering, in the context of information security, refers to psychological manipulation. It is a type of confidence trick, which is, in turn, an attempt to defraud a person or group after first gaining their confidence (“confidence” used in the classical sense of trust).
Once an author has gained full posting rights to a system, and the host blog (especially one that has a thorough vetting process) is confident their articles are of a reasonably high quality, there is little to stop the author from blatantly promoting their own or clients’ sites from their posts.
We saw examples of this in the “Getting Cleverer” section above. Even reputed SEO sites are not immune to this practice.
The rationality of linking to a client site when discussing a typical practice in its industry is debatable at best. However, Google isn’t known to give you the benefit of the doubt.
And so, it follows that even guest posts on respected SEO sites will not be spared.
But it was not to be! (This time)
The nightmare hasn’t come true (yet). Long live guest posts! Penguin 3.0 looks to be mostly harmless.
But you never know. Penguin 1.1 came out just a month after Penguin 1.0, and it was a targeted update that used data processed outside of the main search index. Google sent out a significant message then.
Still want to risk it?
Well, what can you do?
If you manage a multi-author blog that accepts guest posts:
- However painstaking your editorial screening might be, it’s nearly impossible to match Google’s thinking every time. A few bad apples might slip through now and then.
- You might switch all links to nofollow, like Econsultancy did. But that puts your editorial links in the same bracket as sponsored/untrusted links.
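In markup terms, “switching all links to nofollow” just means every outbound anchor tag gains a `rel="nofollow"` attribute. A deliberately simplistic sketch (regex on well-formed tags only; a real implementation would use an HTML parser):

```python
# Simplistic sketch of switching links to nofollow: add rel="nofollow"
# to <a> tags that don't already carry a rel attribute. Regex-based HTML
# rewriting is fragile; this is for illustration only.
import re

def nofollow_all(html: str) -> str:
    """Add rel="nofollow" to <a> tags with no existing rel attribute."""
    return re.sub(r'<a (?![^>]*\brel=)', '<a rel="nofollow" ', html)

print(nofollow_all('<a href="https://example.com">Example</a>'))
# <a rel="nofollow" href="https://example.com">Example</a>
```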
If you’re a link builder or content marketer who contributes a lot of posts, articles or stories to various sites and links back to your own site from them:
- If you get an unnatural links warning from Google, don’t panic. Whatever the extent of the penalty you get, you can recover from it.
- Building nofollowed links might work for you. That said, the jury is out on how Google’s outlook on nofollowed links changes over time, should folks start building them in the thousands. I’ve personally seen Google cite a nofollowed link as an example in the “Sample URLs” they give when they reject your reconsideration request.
- Take a holistic look at your backlink profile. As Marie declared, “It is not the source of a link that makes it unnatural.” Understand that Google tries to analyze linking patterns to determine if there is intent to manipulate their search results.
- Don’t go downhill.