“Nothing gold can stay,” said the poet almost a whole century ago. Go figure.

Even if this means that Frost would have been insulted by the longevity his poem has achieved (the poem obviously condemning itself by the very standard it describes), he’d have to agree that just writing or publishing a poem isn’t worth quite as much as having it remain in people’s minds.

The same goes for links – as a lot of SEOs are all too eager to forget – just creating a link won’t do you much good if that link is removed soon after, or if the site holding it goes to the dogs in terms of quality.

That’s why the links you’ve created (or hired someone to create) will only keep working for you if you: A) Make sure you are only reaching out to relevant and reputable websites; and B) Keep periodically coming back to those links and checking if they have been removed, or if the site linking to you has changed so much that you would prefer them removed.

If you are doing SEO for yourself or for just a couple of clients, you can do both manually. However, inbound marketing agencies with dozens or hundreds of clients couldn’t even dream of spending this much time on prospect assessment or backlink revisions, and usually have to look for SEO tools that can help them do this – or, like in our case, develop those tools from scratch.

So, regardless of whether you are a one-man SEO agency, a huge comprehensive inbound services provider, or a business owner who either wants to take care of your own SEO or simply monitor the performance of someone you’ve hired to take care of it for you, here’s how to keep your backlink portfolio pristine, with or without help from specialized tools.

Prospect quality indicators – Manual

The number of conditions that a site needs to meet in order to qualify as an interesting backlink opportunity can seem daunting, especially when you consider that, while you might benefit enormously from a link on a particular site, not everyone would get the same value from it. That’s why manual prospecting is not always about determining which sites meet all the required conditions; instead, it often turns into scanning the site you’re analyzing for unforgivable faults and, once you find any, removing the site from the list of potential prospects. After all, it’s much easier to determine that a site has one negative feature than it is to verify that it has a bunch of the ones recommending it. This process of elimination begins the second you start your search.

Google SERP

Once you enter your query, you can whittle down the offered results to only the most pertinent ones. So, before you even begin looking at specific results, you can eliminate the ones you know you don’t want in a number of ways:

  • If your location isn’t the same as that of your target audience, you might want to change your region in the search settings, use one of the safest VPNs, one of many free local search tools, or a number of other methods. While in search settings, feel free to set the number of results to 100; you can always do that later by appending the &num=100 search parameter to the end of your search URL string (see the sketch after this list).
  • Search tools allow you to specify the date range – again, something that you can also do with the help of advanced search operators (by combining years with a range operator – 2011..2013 – or through the more convoluted, Julian-date-reliant daterange: operator), or parameters (as_qdr=x, replacing “x” with “d” for Day, “w” for Week, “m” for Month and “y” for Yeah, we get it!), whichever way is more convenient for you.
    You might want to search for date ranges in the more distant past, if, for instance, you want a selection of sites that you know have been around for a while; or you could focus on the more recent periods, if you want to be sure you’ll only get currently active websites in the results.
  • While you can do this when composing the query, there’s no harm in checking the initial results first. Either way, your final query should include negative specifiers, excluding certain sites or specific notorious TLDs from being displayed in the results. You can do either by adding the -site: operator to your query, followed either by the specific site you don’t want to see in the SERP (-site:facebook.com) or an infamous TLD (-site:.info).
  • Finally, there are vertical search options, allowing you to focus on images, videos, news, etc., but to be fair, they are only useful in very specific circumstances.
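
For the curious, here is roughly what these settings look like when assembled programmatically. A minimal Python sketch; the query and exclusions are hypothetical examples, but num, as_qdr and -site: are exactly the knobs described above:

    from urllib.parse import urlencode

    def build_search_url(query, excluded_sites=(), num=100, date_range=None):
        """Compose a Google search URL with the filters described above."""
        for site in excluded_sites:
            query += " -site:" + site          # exclude specific domains or whole TLDs
        params = {"q": query, "num": num}      # num=100 -> up to 100 results per page
        if date_range:
            params["as_qdr"] = date_range      # "d", "w", "m" or "y"
        return "https://www.google.com/search?" + urlencode(params)

    # e.g. a hypothetical niche query, minus Facebook and .info domains, from the past year
    print(build_search_url("organic gardening blog", ["facebook.com", ".info"], date_range="y"))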

Link prospecting by elimination

Once you have a final set of results, it’s time to take a look at each of them individually. You can do this from the SERP itself, or you can export them to a spreadsheet and work from there (the Scraper Chrome extension does a wonderful job of this, allowing you to copy not only result URLs but also page title and description elements). For the sake of simplicity, let’s assume you’ll be looking directly at the SERP. So, what is it that you can tell about a website, without even clicking on a result?

  • Domain name. A treasure trove of information. If it’s too long, contains too many hyphens or numbers, or sports one of the spammy TLDs that you forgot to exclude with the -site: operator, you might consider skipping the site. It takes some experience to reliably decide how many (total characters, numbers or hyphens) is “too many”, and even then, you might miss out on a great prospect just because the webmaster decided to get creative with their domain name, but in the long run, the time you save will more than make up for the wasted opportunities. Likewise, if the root domain contains words like article directory, submit, bookmarks, infographics, etc., they might just be remnants from the Dark Ages of SEO, and such sites should be either skipped or, at the very least, scrutinized with extra caution. (A sketch automating these quick checks follows this list.)
  • Used protocol. While taking a look at the domain, you can’t fail to notice the http:// or https:// preceding it. The secure https:// protocol used to be more of a recommendation than a necessity, and only important for certain websites (those requiring textual input from visitors), but Google seems to be nudging the entire internet towards adopting this standard, and is likely to take everyone’s willingness to comply into consideration when tallying the final account.
  • Meta elements. Spammy sites write spammy meta descriptions and titles, there are no two ways about it. If you notice signs of keyword stuffing or extreme illiteracy, you might want to just move on.
  • Snippets. While the absence of rich snippets is not a definite sign of a site’s inferiority, their presence is usually indicative of adherence to webmaster best practices, and a signal that you are dealing with someone taking good care of their site.   
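
These at-a-glance eliminations lend themselves to a quick script. A minimal Python sketch follows; every threshold and word list in it is an illustrative assumption, since, as noted above, deciding how many is “too many” takes experience:

    from urllib.parse import urlparse

    SPAMMY_TLDS = (".info", ".biz")    # example TLDs; tune to taste
    SPAM_WORDS = ("articledirectory", "submit", "bookmarks", "infographics")

    def reason_to_skip(url, max_length=25, max_hyphens=2, max_digits=2):
        """Return a reason to skip a prospect, or None if it passes the quick checks."""
        parsed = urlparse(url)
        domain = parsed.netloc.lower().removeprefix("www.")
        if parsed.scheme != "https":                   # Google's nudge towards HTTPS
            return "no https"
        if len(domain) > max_length:
            return "domain too long"
        if domain.count("-") > max_hyphens:
            return "too many hyphens"
        if sum(ch.isdigit() for ch in domain) > max_digits:
            return "too many numbers"
        if domain.endswith(SPAMMY_TLDS):
            return "spammy TLD"
        if any(word in domain for word in SPAM_WORDS):
            return "remnant from the Dark Ages of SEO"
        return None

    print(reason_to_skip("http://best-cheap-seo-links-4-you.info"))   # -> "no https"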

Once you’re done with the browsing stage, it’s time to do some serious clicking. Now, we are aware that a lot of results will lead to deep pages, but let’s assume that your first direct contact with a site will be made through its:

Homepage

You’ve read the advert, and now you’ve bought the car, so how does it feel? Did you get what you expected when you clicked on the result in the SERP? If not, you’ve experienced the same kind of user frustration that others visiting the site may feel. This can be a red flag on its own, but it is often purely context-dependent, so it shouldn’t cause you too much worry. However, site homepages are exactly where you’ll usually find the most blatant and unforgivable SEO transgressions, common sense violations, and link etiquette faux pas. Here’s how to identify them.

  • Too many external links. Allowing for exceptions, the homepage is usually a site’s main link juice silo, one that webmasters use to boost their own pages, not promote other sites. If you notice that someone is being extremely generous with external links from the homepage, you are completely in the right to assume that something fishy is going on. Likewise, if you see similarly unrelated external links in the site’s main navigation, give it a wide berth. (A sketch for counting these follows this list.)
  • Suspicious anchors. A homepage doesn’t need to have many links before you start worrying about the site’s suitability as a backlink opportunity. As a matter of fact, even if it has only one external link, as long as that link has an anchor suggesting adult or gambling related content, you probably don’t want to have anything to do with the site.
  • Design, responsiveness and layout. While noticeable from other pages as well, you’ll pay attention to these elements as soon as you hit the homepage. If a site’s homepage is difficult to navigate, displays poorly on smaller screens, has unforgivable aesthetics, or simply looks like something that was just thrown together, it’s relatively safe to assume that the rest of the site won’t look or perform much better.
  • Contact information. They don’t need to have it directly on their homepage, but if not, they had better have a link to a “Contact Us” or “About Us” page, which should list at least a phone number or a physical address for the entity you are considering for collaboration. While there may be valid reasons for the omission of this info (you wouldn’t expect it from someone running a personal blog, for instance), if you run across someone representing themselves as a company, agency, government or nongovernment organization, startup, or basically any kind of legal entity, and you can’t seem to find any verifiable contact info apart from an email address, give them a wide berth.
  • Too many ads. Again, something that you’ll have to look out for with other pages as well, but it is never as unbecoming as when it happens on the homepage. If you come to a website and see more than four or five ads above the fold, the site really needs to be somehow exceptional to still keep your interest.
  • Blog archive. If it’s a blog, of course. Usually found in the right sidebar, the blog archive is the fastest way to determine how long they’ve been around, what their publishing frequency is, and whether they are still active.
  • Social media profiles. Which social networks is the site active on? How many followers do they have, how many shares does their domain get, how engaged is their audience? You can’t see all of this from the homepage, but it’s a great place to start exploring this aspect of a website.
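
As promised, a minimal Python sketch of the external-link check, using the requests and BeautifulSoup packages; the URL and the threshold of 15 are illustrative assumptions, not hard rules:

    import requests
    from bs4 import BeautifulSoup
    from urllib.parse import urlparse, urljoin

    def external_links(homepage_url):
        """Collect external links from a homepage - the 'link juice silo' check above."""
        html = requests.get(homepage_url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        own_domain = urlparse(homepage_url).netloc
        links = []
        for a in soup.find_all("a", href=True):
            target = urlparse(urljoin(homepage_url, a["href"]))
            if target.netloc and target.netloc != own_domain:
                links.append((target.geturl(), a.get_text(strip=True)))  # (URL, anchor)
        return links

    # Flag homepages that are suspiciously generous with external links, and scan
    # the collected anchor texts for the adult/gambling red flags mentioned above.
    links = external_links("https://example.com/")
    if len(links) > 15:                                # threshold is an assumption
        print("possible link farm:", len(links), "external links on the homepage")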

If you’ve gone through these points and the site still hasn’t disqualified itself as a prospect, you’re almost there, but not before you examine another one of its critical segments.

Content

Ideally, beautiful, informative, heartfelt, accurate and engaging content would be everything you need to reach people. But with Google ignoring you or misinterpreting your intentions if you’ve failed to mention the relevant keywords often enough, and readers passing you over if you don’t have plenty of images, minuscule paragraphs, and enough shiny things to keep their attention by slightly distracting them, simply enjoying someone’s content is, sadly, not nearly reason enough to try and get a link from them. Here are the other conditions they have to meet.

  • Relevance. Does it make sense for you to get a link from the site? Are they talking about your industry, or at least a closely related one? Would their audience have a genuine interest in your link, and then perhaps, the rest of your site? Are you just trying to shoehorn your way into the site, or would its webmaster find it perfectly natural that you are inquiring about cooperation? Naturally, just like with most of the other entries, this is not something to be fanatical about. Yes, a huge majority of your backlink portfolio should be made up of links from relevant sites, but it would be unnatural if absolutely all of them came only from sites that have a direct topical correlation. As a matter of fact, the more successful a site is within its own field, the more it stands to reason that it will eventually extend beyond the tight bounds of its niche, well into other disciplines, industries and spheres of interest. Of course, you can’t use this as an excuse to try and get a link to your car parts page from a site dealing with postnatal depression.
  • Content quality. You could claim that this is a fairly subjective category, but consider what Google can already do: see the word count of individual articles; calculate their readability based on sentence and even word length; detect most grammatical or orthographic errors; recognize synonyms on the lexical, phrasal and idiomatic levels; flag you for plagiarism; and pretty much everything else apart from accusing you of selling out, because “You used to write from the heart, man”. Complaints about the subtlety and tenuousness of artistic expression will only fall on the cold dead ears of a mechanical spider. To make matters worse, while the spider is incapable of subjectivity, it also paradoxically abhors exactness, which is why no one can say exactly how many words your content should have, what its Flesch–Kincaid readability score should be, or how informal you can get with your writing before Google starts believing you’re illiterate. While numerous studies have tried to correlate these elements with rankings, and while longer content of average readability, written in a semi-formal register, seems to perform the best, the studies are fairly inconclusive. Basically, if what you’ve read there is literate, informative, and seems to engage the site’s audience – through comments or social network activity – you shouldn’t worry too much about Google being pedantic about it. (A sketch for pulling these readability numbers follows this list.)
  • Sponsored posts. They’re everywhere, and they’re not always disclosed, either. Popular and established websites with a faithful audience can get away with openly publishing paid-for content, but smaller blogs doing the same can easily find themselves in Google’s disfavor. So, while seeing something like a paid post, sponsored post, paid review, etc. shouldn’t immediately send you running, it should serve as an incentive for you to be additionally cautious when assessing the site.         
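
If you want the same machine-measurable numbers for a piece of content you’re assessing, here is a minimal Python sketch using the textstat package; the sample text is obviously a placeholder, and, as noted above, there are no magic target values:

    # pip install textstat
    import textstat

    def content_signals(text):
        """Machine-measurable proxies for the quality signals discussed above."""
        return {
            "word_count": len(text.split()),
            "flesch_reading_ease": textstat.flesch_reading_ease(text),
            "flesch_kincaid_grade": textstat.flesch_kincaid_grade(text),
        }

    # No magic thresholds exist - these numbers are for a human to interpret.
    print(content_signals("Shorter sentences read more easily. Long, winding ones do not."))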

If you’ve ever read anything about content before, chances are you’ve seen people attributing it with royal origin. And what is true royalty if not honest, direct, attention-grabbing and eloquent – but not to the point where the people can no longer understand its proclamations? If that’s the kind of content the site you’re considering is publishing, you just might want to go ahead and try to get in their good graces.

How legible is your content

Links

All of them. The ones leading from the site, and the ones coming to it. Are they linking out to suspicious sites, which would place you in a bad neighborhood? Have they resorted to dubious link building tactics in the past, leaving their backlink portfolio as sketchy as a drawing of a stick figure? Have they been getting or handing out links with suspicious, unnatural anchors, or have they gone so far as to actually cloak outgoing links by hiding them from visitors? Is their ratio of total backlinks to unique referring domains skewed too heavily in favor of the former? Are you likely to get a dofollow or a nofollow link from them? If you really want to get into it, this part of the analysis is usually the most time-consuming one; and while you can allow yourself a slightly more casual review of this aspect of a site, you must never disregard it completely if you are hoping for any kind of success with your link building. (The ratio check, at least, is trivial to script, as sketched below.)
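
A minimal sketch of that ratio check, assuming you can export total backlinks and unique referring domains from your backlink tool of choice; the cut-off of 20 is an illustrative assumption:

    def skewed_backlink_profile(total_backlinks, referring_domains, max_ratio=20):
        """Flag profiles where a handful of domains supply almost all the backlinks."""
        ratio = total_backlinks / max(referring_domains, 1)
        return ratio > max_ratio, round(ratio, 1)

    # e.g. 40,000 backlinks from only 150 domains -> ratio ~266.7,
    # almost certainly sitewide (footer or sidebar) links
    print(skewed_backlink_profile(40000, 150))   # -> (True, 266.7)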

Done?

There you go. Now do this for each and every prospect you come across, and in just a couple of short hours, you may have yourself a respectable list of some 10 to 20 decent prospects. Ok, this might be a bit of an exaggeration – you’ll probably do a bit better than that even with completely manual prospecting – but now have a look at how it all works when you have a specialized link prospecting tool at your disposal, in this case, our own Dibz.

Prospect quality indicators – Tool-Assisted

People are often wary of automation because they feel like it precludes subtlety. And while some processes cannot be automated without making them clunkier or less sophisticated, there are those that can. Remember how in the manual prospecting section we mentioned that finding the right result is done by eliminating all the rest? This kind of approach is perfect for automation, as it allows you to only eliminate the sites that you are certain are undesirable. So, by raising or lowering your standards, you’ll have fewer or more websites to go through and directly evaluate, but if you are cautious, you’ll never have to worry about missing a potentially interesting prospect. So, here’s how the described process of elimination works with Dibz.

Remember all those vertical searches, parameters, operators and social media info that you had to think about? Most of this will be taken care of by the tool; you just need to set the guidelines. The first part of this is done through the spam filter page.

Dibz SEO spam factors

This page allows you to set your preferred standards when it comes to 17 separate ranking factors (apart from the ones you can see in the image, this also includes things like domain name length; presence of cloaked links, social profiles or sponsored content; number of external homepage links; total number of the site’s indexed pages, etc.). You can attribute a spam value to each entry, depending on how important you believe it to be. Say, for instance, you don’t like websites with too many ad blocks. Dibz gives you a chance to decide how many is too many (in this case, 5) and to determine how many “spam points” (36, in our case) will be added to the site’s total if it has more than that. You can specify these values for each of the 17 factors, and then decide how many points a site can have before it is removed from the list of filtered results in Dibz (you can still find these sites in the ‘all results’ tab). So, even before you begin your search, you’ve taken care of 17 separate considerations. The logic is sketched below.
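
In code terms, the scoring works roughly like this minimal Python sketch. Only the ad-block example (5 blocks, 36 points) comes from the text above; every other factor name, weight and the removal threshold are illustrative assumptions, not Dibz’s actual internals:

    SPAM_RULES = {
        # factor:                  (threshold, points added when exceeded)
        "ad_blocks":               (5, 36),   # the example from the text
        "external_homepage_links": (15, 20),  # assumed values
        "domain_name_length":      (25, 10),  # assumed values
    }
    REMOVAL_THRESHOLD = 100                   # assumed cap before a site is filtered out

    def spam_score(site_metrics):
        """Sum the spam points for every factor the site exceeds."""
        return sum(points
                   for factor, (threshold, points) in SPAM_RULES.items()
                   if site_metrics.get(factor, 0) > threshold)

    site = {"ad_blocks": 7, "external_homepage_links": 40, "domain_name_length": 12}
    score = spam_score(site)                  # 36 + 20 = 56
    print(score, "->", "filtered out" if score > REMOVAL_THRESHOLD else "kept")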

The same page also allows you to provide a list of preferred TLDs, so you can request them from the search page; to define research type templates; and to compose a list of websites that you never want to see returned as results (for instance social networks, which otherwise thickly populate results pages, and are not interesting as link building prospects).

Once you’ve done this, it’s time to move on to search proper. Here’s what this looks like in Dibz.

Dibz prospects search interface

While the right-hand side provides great time-savers on its own – preferred language for the results, desired TLD and date range – the truly important part is on the left, where you enter the actual values: basic search terms, as well as the specific modifiers that you want to combine with those terms.

What does this do that Google can’t? Nothing, but it does some things that Google won’t:

  • it combines each row from the Search Term field with each row from the Custom Search Parameters field (or the ones from our Custom search templates), as sketched below;
  • performs a search for each of the combinations, with all the modifiers on the right considered;
  • pools all of those results together, removes duplicates, and hides the sites that don’t meet the standards you defined through the spam filter;
  • gives you a convenient, export-ready list of prospects with the last remaining bits of info that you still need about particular sites, including a list of their top anchors, the spam filter value for the site, Domain Rank value, number of unique referring domains and pages, contact email, as well as the number of domain shares and followers on all major social networks.
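
The combination step is simple to picture in code. A minimal Python sketch, with hypothetical terms and parameters:

    from itertools import product

    def build_queries(search_terms, custom_parameters):
        """Cross every search term with every custom search parameter."""
        return [f"{term} {param}" for term, param in product(search_terms, custom_parameters)]

    queries = build_queries(
        ["organic gardening", "home composting"],          # Search Term rows
        ['intitle:"write for us"', "inurl:guest-post"],    # Custom Search Parameter rows
    )
    # -> 4 searches, whose results are then pooled, deduplicated and
    #    run through the spam filter, as the list above describes
    print(queries)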

The convenience of different filters and clearly displayed essential metrics aside, what we are most proud of, and what is probably the main benefit of Dibz, is the ability to perform a string of operator-modified searches in succession – something that you couldn’t do in Google without having to complete a new captcha every couple of minutes. It’s difficult to believe how much time this saves you until you try it.

The sites in the list you are left with satisfy all the machine-measurable standards mentioned in the manual prospecting section, and since the spam filter only needs to be set once, all you had to do to spare yourself the trouble of going through countless sub-par sites was to enter your search terms and custom operators.

You still have to visit each of the sites, decide if they are relevant enough, and generally, if they are someone you’d want to deal with. However, since a huge portion of unsuitable websites has been removed by our utility, you’ll get a much higher percentage of valid prospects than you would with even the most refined Google search.

Link Monitoring – Manual    

So you’ve made a beautiful link; now you just have to make a record of it. You fire up your trusty spreadsheet of choice, copy the URL of the page linking to you; perhaps make a column for the root domain, purely for organizational purposes; make a note of the anchor text and target; whether the link is dofollow or nofollow; the contact details of the person you negotiated the link with; the date the link went live; if you’re part of an agency, the name of the link builder responsible for the link; the name of the client the link was made for; a brief note about the link; and anything else you might need. (A minimal layout is sketched below.)
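
For illustration, here is that record as a minimal CSV layout in Python; the column names follow the paragraph above, and the sample row is, of course, hypothetical:

    import csv

    COLUMNS = ["link_url", "root_domain", "anchor_text", "target_url", "rel",
               "contact", "live_date", "link_builder", "client", "notes"]

    with open("links.csv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=COLUMNS)
        writer.writeheader()
        writer.writerow({
            "link_url": "https://example-blog.com/best-tools/",
            "root_domain": "example-blog.com",
            "anchor_text": "link prospecting tool",
            "target_url": "https://yoursite.com/tool/",
            "rel": "dofollow",
            "contact": "editor@example-blog.com",
            "live_date": "2018-03-01",
            "link_builder": "J. Doe",
            "client": "Acme Co",
            "notes": "guest post",
        })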

Now, if you are only doing this for yourself, or a couple of clients, creating this kind of list and retrieving info from it is fairly manageable; but for anyone with more than two or three clients, and employees creating links for them, this quickly turns into an organizational nightmare.

Even if you simply had to record all the links and never look at them again, doing it manually would be a hassle. However, you are not etching them in stone for future archaeologists to find; you will have to keep coming back to them, assessing their suitability for other clients, taking stock of used anchors and targets to prevent over-optimization and ensure appropriate diversity, and, most importantly, checking their health, i.e. whether they are still live and dofollow, and whether the site holding them is still acceptable.

Now, this is not even going into monitoring the actual benefits you are getting from a link, like the traffic it is sending your way, which you’d have to turn to Google Analytics and Console for. So, just to check if your links are still there, you’d have to visit each of the sites from your sheet, check your link’s target and anchor, and at least get the site’s DR. Doesn’t sound too bad? Not even when you realize that you’d have to do this for every link you’ve ever made, and then repeat the entire process at least once a month? We thought so. (The per-link check itself is easy enough to script, as sketched below; it’s the volume and the repetition that get you.)
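
A minimal Python sketch of one such health check, using the requests and BeautifulSoup packages; the URLs are hypothetical:

    import requests
    from bs4 import BeautifulSoup

    def check_link(page_url, target_url):
        """Is our link still live on the page, and is it dofollow?
        Returns (found, dofollow, anchor_text)."""
        resp = requests.get(page_url, timeout=10)
        if resp.status_code != 200:
            return False, False, None                     # page gone or erroring
        soup = BeautifulSoup(resp.text, "html.parser")
        for a in soup.find_all("a", href=True):
            if a["href"].rstrip("/") == target_url.rstrip("/"):
                dofollow = "nofollow" not in (a.get("rel") or [])
                return True, dofollow, a.get_text(strip=True)
        return False, False, None                         # the link was removed

    # Run this over every row of the spreadsheet above, once a month, and you
    # have exactly the chore the next section automates.
    print(check_link("https://example-blog.com/best-tools/", "https://yoursite.com/tool/"))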

Naturally, there are agencies that believe their job is done the moment a link goes live, but they, and by extension their customers, are thus deprived of a complete overview of their backlink portfolio, and of the ability to draft appropriate future strategies. Basically, you can’t plan a trip if you don’t know your starting point.

So, how do you eliminate the monthly spreadsheet juggling, and a carpal tunnel’s worth of clicking?

Link Monitoring – Tool Assisted

As our agency grew, so did the number of our clients and employees. Synchronising the efforts of dozens of link builders, some of whom were sharing clients and link contacts, turned out to be an incredibly time-consuming chore, one that opened countless opportunities for human error and miscommunication. Do we already have a link for a client on a particular site? Did we reach out to a site for a client, and how did it go if we did? How many links were created with a specific anchor in a specific period? And finally, what’s the current status of those links, and of the sites they’re on?

After listing each and every issue we encountered with the manual approach, we set out to create a tool that would eliminate, or at least alleviate, all of those problems. That’s when Base.me was born. Its main purpose is to facilitate link data entry and retrieval, but it grew into a self-regulating system which organizes the workflow, performs scheduled or on-demand health checks of our links, and has basically solved most of our issues with link building management and monitoring. However, since this utility is still being jealously kept for internal use and for a handful of our friends and partners, the only way for you to see what it looks like in practice is to register your interest and perhaps qualify for beta access.

Until that happens, or we make the tool available to everyone interested, how do you replicate some of the features it offers, and cover other essential areas of link monitoring it was never meant to address?

It’s simple: you turn to our latest utility, Reportz, a beautifully versatile digital marketing KPI dashboard and reporting tool. We decided to develop this tool as soon as we took exact stock of how much time we were wasting on manual reporting. While we were working with a more modest number of clients, patching reports together by copying data from the different sources we used to track campaign performance, and organizing it so that clients had no problem understanding it, wasn’t too unbearable, even though it could take up to several hours per client. However, when you consider the fact that reports had to be created, or at least checked, by people in middle to upper management, whose time is too precious to be spent on hours of copy/paste/format/repeat, and when you also account for the steady, fast-paced growth of our client base, the urgency of finding a way to automate as much of the process as possible couldn’t be ignored for long.

We solved this by creating a tool which can be connected to the data sources you usually rely on for your KPIs; extract the data you specify, through one of our offered templates or your own custom setup; organize, contrast and compare that data in a way that a particular client finds optimal; and either send scheduled reports without you needing to move a muscle, or simply remain constantly available to clients, who can check up on the exact, current state of their KPIs.

While it was initially conceived as an SEO agency tool, the fact that it allows for convenient tracking of metrics from a single dashboard makes it just as suitable for business owners who want to keep a watchful eye over the performance of their digital marketing campaigns, whether they are run by someone in-house, or by an outside agency.

This is what the process looks like in practice, for agencies and for DIY SEOs alike – if you want to follow along, Reportz has a free trial, which we encourage you to make use of.

  • You start by creating a dashboard. Dashboards are pages containing all the data you want to observe. You can have one dashboard for several projects or clients, or several dashboards for a single project or client – it’s completely up to you. They can be set to observe any time range you’re interested in, customized with your or your clients’ branding, password protected, and easily shared with your clients.
  • If you didn’t do so earlier, you will be asked to add your data sources. Even though we already integrate most of the major players (Google Analytics, Console and AdWords, Ahrefs, Rank Ranger, etc.), we don’t intend to stop adding new sources any time soon.
  • Data from these sources is displayed in individual, customizable widgets, and you can have an unlimited number of them in a single dashboard. So, for instance, one widget can display a particular metric, graph, or list from Google Analytics (with the option of hiding data segments you are not interested in); another can be the overview of your rankings from Rank Ranger; and a third could be a merger of two separate Google Console widgets, for instance, Top Clicks (Pages) and Top Impressions (Pages).
  • Once you’ve populated the dashboard with all the metrics you want to track or report on, all that’s left to do is customize the dashboard and share it with your clients or management. If you want the same kind of report for several clients, you can easily clone the dashboard, keeping its layout and changing only the actual values.

What you can do with a Reportz KPI dashboard

If you’re working in an agency, you’ll probably be able to figure out how to do all of this on your own, but if you can’t, or if you are not a professional SEO, but just want to monitor a campaign someone else is running for you, you can contact us for a live demo, where we’ll guide you through each step of the process.

So, let’s say that you know how to do all of this – which metrics should you take a look at? While you could connect your Base account (if you have been granted access) and monitor your links from a Reportz dashboard, that alone would be missing an opportunity to do so much more. Namely, Reportz gives you a way to extract and organize all the metrics you need to track the efficiency of individual links, and of your entire link-building strategy.

Ahrefs – New and lost domains

If you want to be alerted when a new domain links to you, or when you lose links from a domain, having a widget with Ahrefs referring domains data is not just convenient, but essential. It allows you to promptly intervene if you want removed links to be restored, or if you realize that domains you would rather not be associated with are linking to you.

Rank Ranger – Page Rankings

Not much to say here – by setting custom dates in Reportz, you can observe the rankings of your target pages in the period you were building links for them, and get another piece of the puzzle.

Google Console – Avg position (pages)

Similar to what you’re getting from Rank Ranger, but coming from your Console, and showing the average page position in the selected period, along with the position change. The same logic applies – if you see that a page was dropping in rankings while, and soon after, you were creating a bunch of links for it, you might want to go back and give those links another look. Likewise, if you see a page has skyrocketed in rankings, and you don’t have anything else to attribute it to, examining the links created in that period might reveal some sites that you might want to contact again in the future.

Google Console – Top Clicks (queries and pages)

All the SEO talk can make people forget that links are there for the traffic; that is still their purpose and their greatest value. So it makes sense to find out which of your pages people like coming to, and which anchors seem to be doing a good job of directing them there. Crucial for your overall SEO strategy, this data, when observed over appropriate periods, also shows which of the links you’ve created are actually doing their job instead of just showing up for it.

Google Analytics – Top keywords and landing pages from organic

Pretty much the same as above, but from a slightly different angle. Might show you data that the previous combination didn’t, so it’s definitely worth your time to add these two widgets as well.

Google Analytics – Organic visits rate

Shows fluctuations in the number of organic visits you were getting in the observed period. Again, correlating your rankings with your link building efforts at a certain time is never completely straightforward, but when you account for algorithm updates, the effects of your other inbound channels, etc., you can get genuinely valuable insights.

Is that it?

Well, yes. Abstain from creating suspicious links, stay vigilant when it comes to removing those created without your intervention, and you should be fine. As long as the number of these links is relatively low, you can manage without specialized tools; but for larger volumes, even if you are not an SEO agency but a business owner who wants to keep an eye on the way their site is being promoted, do yourself a favour and give our utilities a go.

Radomir Basta
CEO and lead SEO strategist at Four Dots and lecturer at the Digital Marketing Institute. Also an angry driver and huge tattoo fan. In love with growth hacking.
