At a recent SMX Israel, Google's Webmaster Trends Analyst, Gary Illyes, announced that the next Penguin update would arrive before the year's end and that it should enable the already impressively capable algorithm to run continuously and work in real time. The rollout date for Penguin 4.2 has since been pushed back to an unspecified time next year, but it's the promise of its new properties that's really interesting.

If what's claimed about them sounds a bit vague, Barry Schwartz from Search Engine Roundtable has been vigilantly tracking the announcements and news coming out of Google and calling the company out on the thin, filler content it has, somewhat ironically, been using to stuff those announcements.

We'll try to summarize the conclusions of the discussion that's been going on for a while now, and quickly go over the steps you should take in anticipation of the change.

Real-time for real this time?

The term real-time shouldn't be that difficult to get your head around. Yet somehow, the people in charge of finding, specifying and cataloguing the meaning behind every word and notion a user might enter in a query have not only failed to provide a reasonably comprehensive and clear explanation of how exactly they are using the term in relation to their algorithms (Panda is not excused from this debate), they've also gone back and forth on whether they have already blessed their pets with this nigh-mythical property.

Their interest in keeping us as confused as possible is obvious, but assuming that all this double talk comes solely from a desire to deceive or manipulate might be giving them too much, or too little credit.

Webmasters and SEOs have been (rightfully) complaining about the fact that you need to wait quite a while for the next Penguin update to see if your efforts to redeem yourself in those cold, avian eyes have been successful. When you consider that some updates have been more than a year apart, it's not difficult to imagine that the traditionally unapologetic Google has been pressured into appeasing the public with a promise of a better tomorrow today. Rescheduling the update now is likely to cause quite a commotion among those who had started hoping they might recover by the holidays, but to be fair, Google's spokespersons were unanimously non-committal when it came to exact dates.

What they seem to mean when they say ‘real-time’

At first, people took the statement at face value: you make a change, or one gets imposed upon you – you disavow a link, someone adds or removes a link to your site, some anchors get altered, and so on – Google registers the alterations, and before you can say "I've been a victim of negative SEO", your rankings reflect the new state of affairs.
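For reference, the disavow step itself amounts to uploading a plain text file through Search Console's disavow tool, with one URL or domain per line and comments prefixed with #. A minimal, purely illustrative example (the domains below are placeholders, not real offenders) might look like this:

# Directory that ignored a link removal request (placeholder domain)
domain:spammy-directory.example
# A single offending page rather than the whole site
http://link-farm.example/our-favourite-sites/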

Awed by the resources this kind of approach would require, and wiser for knowing what became of similar promises Google has made before, most people seem to find this too optimistic to even ask for. The best-case scenario seems to be that data is collected in real time, but your site's ranking only gets modified the next time it's crawled. Even if the delay between the changes and their effect were longer than it usually takes your site to get re-indexed, most webmasters would still greet the update with great enthusiasm. For the purposes of this article, we'll assume this is the end goal, and try to take a look at what it would mean.

Implications

The most obvious benefit for webmasters and business owners would be that they wouldn’t have to wait as long for their site to recover from a penalty.

Likewise, it would allow everyone to be a bit more 'experimental' with their link building efforts. Black hats who have been using the churn and burn method – creating terrible sites that manage to survive and rank only until the next update – won't be able to use that tactic anymore, but chances are they will still find the change extremely beneficial. With faster processing of modifications to a website, trial and error will become a much more practical method of determining what goes with Google and what doesn't.

Negative SEO might become more potent in the short term – the results of a negative campaign should be able to hit you sooner – but damage control should also be easier, as you won't be condemned to Google Hell until the next update, but only until the next time your site is crawled.

As far as the non-black-hat folk are concerned, the only obvious change should be in their vigilance. Staying in Penguin's good graces will require that you monitor every change in your link portfolio and react accordingly as soon as possible (most people concerned with their online visibility have already been doing this anyway), but again, even if the damage gets done, rectifying the situation shouldn't be as problematic as it used to be.
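As a rough illustration of what that monitoring can look like, here is a minimal sketch in Python that diffs two backlink exports. It assumes you periodically save your list of linking URLs (from Search Console or any backlink tool) to a plain text file, one URL per line; the file names are purely hypothetical.

# backlink_diff.py - flag new and lost backlinks between two exports
# Assumes each file lists one linking URL per line; file names are placeholders.

def load_links(path):
    # Read a backlink export into a set of URLs, skipping blank lines.
    with open(path, encoding="utf-8") as f:
        return {line.strip() for line in f if line.strip()}

old_links = load_links("backlinks_last_week.txt")
new_links = load_links("backlinks_today.txt")

gained = sorted(new_links - old_links)
lost = sorted(old_links - new_links)

print(f"{len(gained)} new linking URLs to review:")
for url in gained:
    print("  +", url)

print(f"{len(lost)} linking URLs that have disappeared:")
for url in lost:
    print("  -", url)

Anything that turns up in the "new" column gets reviewed, and anything you don't want pointing at your site can go straight into a disavow file like the one shown earlier.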

Radomir Basta CEO and Co-founder
Radomir is a well-known regional digital marketing industry expert and the CEO and co-founder of Four Dots with 15 years of experience in agency digital marketing and SEO strategy, SaaS startup dev and launch, and AI solutions advocacy.
