A Publisher’s Guide To Bot Traffic

An unexpected surge in visitors is exhilarating for any publisher. After all, more traffic almost always means more ad revenue. However, make no mistake: in many cases, the source of that traffic may not even be human. It can be bot traffic, and it can have adverse effects on a publisher’s business. Let’s take a closer look.

Is bot traffic bad or good?

Not all bots are designed to inflate traffic or defraud marketers. In our increasingly automated environment, some bots exist to perform specific, repetitive tasks that would be difficult or impossible for humans to execute at high speed, such as harvesting content, scraping data, and capturing analytics.

Distinguishing good bots from bad bots requires a bit of a discerning eye, but it can be done if you have the right information.

Here are the different kinds of bots that you should keep an eye on:

Imposter Bots/ Impersonators

These are the most dangerous bots. Imposter bots mask themselves as legitimate visitors. Their intent is far more malicious than generating a false click count: their purpose is to bypass online security measures. They’re often the culprit behind distributed denial-of-service (DDoS) attacks. They may also inject spyware onto your site or pose as a fake search engine, among other things.

Click Bots

Click bots are the kind that fraudulently click on ads, skewing website analytics. If your analytics data show incorrect metrics, you may end up making the wrong marketing decisions. This malicious bot is especially harmful to marketers running pay-per-click campaigns: the clicks these bots generate add up to wasted dollars on fake visits that didn’t even come from humans, let alone their target audience.

Download Bots

Like click bots, these bots fraudulently alter engagement data, but they inflate download counts instead of website visits.

Scraper Bots

Web scrapers do the opposite of copyright bots. Instead of protecting proprietary content, scraper bots steal content and repurpose it elsewhere.

Spam Bots/ Spammers

These are the most common bots. They distribute “spammy” content: sending unsolicited emails, posting spam in comments, spreading phishing scams, and engaging in negative SEO against competitors.

Spy Bots

This is the kind of bot that mines data about individuals and businesses. It can gather email addresses from websites, newsgroups, chat-room conversations, and many other sources.

Why are bots bad for publishers?

From the definitions above, you can see why some bots can negatively affect your publishing business. Some pose no real risk and cause nothing more than annoyance. However, as a publisher, it is important to remember that ad networks are concerned not only with publishers but with advertisers as well.

They run regular quality checks to ensure that the websites they add to their networks are free from fraudulent bots. Sadly, whether or not the publisher is aware of the invalid traffic running on their site, ad networks are inclined to protect their advertisers first.

They would not think twice about banning a website that does not adhere to their policies. Publishers are entirely responsible for ensuring that their site is free from these unwanted bots.

How do you detect unwanted bots?

If you have noticed an unusual increase or decrease in your analytics metrics, it may be a sign that your site is susceptible to fraudulent traffic. There are usually telltale signs that your site is being overrun by bots, and you need to be aware of them. Watch out for the following (a small scripted check based on these signals follows the list):

#1: Excessively high bounce rate and new session rate.

#2: Site content is showing up elsewhere online. This may indicate that scraper bots have visited your site.

#3: Poor site performance. If you notice that the website runs slower and crashes frequently, it might be overrun with bots.

#4: Traffic comes from unexpected sources (irrelevant geographic locations). For instance, your site is in English, but most of your traffic comes from non-English-speaking countries.

#5: Your top referring domains (referral links) are one of the following:

- Spammy (not at all related to the site; pages of pure links without content)

- Malicious (may contain malware)

- Suspended sites

#6: Average time on site of less than one minute. If your content is lengthy and requires your audience to stay on your pages for a while, a very short average visit duration may mean that your visitors are not human at all.
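To make these signals actionable, here is a minimal sketch of what a scripted check might look like. It assumes you can export sessions (bounce flag, duration, country, and referrer) from your analytics tool to a CSV file; the column names, thresholds, and referrer hints are illustrative assumptions, not values from any particular product.

```python
import csv

# Illustrative thresholds -- tune them to your own audience (assumptions, not standards).
MAX_BOUNCE_RATE = 0.85          # flag if the bounce rate climbs above 85%
MIN_AVG_DURATION_SECONDS = 60   # flag if average time on site drops under a minute
EXPECTED_COUNTRIES = {"US", "GB", "CA", "AU"}  # where your audience normally lives
SUSPICIOUS_REFERRER_HINTS = ("free-traffic", "seo-offer", ".xyz/click")  # made-up examples

def audit_sessions(csv_path):
    """Scan an exported sessions CSV and report the bot-like signals listed above."""
    with open(csv_path, newline="", encoding="utf-8") as f:
        sessions = list(csv.DictReader(f))
    if not sessions:
        return

    bounces = sum(1 for s in sessions if s["bounced"] == "1")
    avg_duration = sum(float(s["duration_seconds"]) for s in sessions) / len(sessions)
    odd_geo = sum(1 for s in sessions if s["country"] not in EXPECTED_COUNTRIES)
    shady_referrers = {
        s["referrer"]
        for s in sessions
        if any(hint in s["referrer"] for hint in SUSPICIOUS_REFERRER_HINTS)
    }

    if bounces / len(sessions) > MAX_BOUNCE_RATE:
        print(f"High bounce rate: {bounces / len(sessions):.0%}")
    if avg_duration < MIN_AVG_DURATION_SECONDS:
        print(f"Low average time on site: {avg_duration:.0f}s")
    if odd_geo / len(sessions) > 0.5:
        print(f"{odd_geo} of {len(sessions)} sessions came from unexpected countries")
    if shady_referrers:
        print("Suspicious referrers:", ", ".join(sorted(shady_referrers)))

if __name__ == "__main__":
    audit_sessions("sessions_export.csv")  # hypothetical export file name
```

A script like this won’t catch sophisticated imposter bots, but it turns the warning signs above into something you can run regularly against your own exports.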

What do I do?

After learning about the dangers of bad web bots and how they can hurt your business, you will likely want to take every precaution possible to defend against bot attacks. There are some preventive measures you can take on your own, like implementing CAPTCHAs on forms and using a quality website builder that protects your site against hackers and their malware.
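As a sketch of the CAPTCHA measure, the snippet below verifies a Google reCAPTCHA v2 token on the server side using the third-party requests library. The secret key and the way you obtain the token from your form are placeholders to adapt to your own setup.

```python
import requests  # third-party HTTP client (pip install requests)

# Placeholder: your private reCAPTCHA key from the Google admin console.
RECAPTCHA_SECRET = "your-secret-key"
VERIFY_URL = "https://www.google.com/recaptcha/api/siteverify"

def is_human(captcha_token, client_ip=None):
    """Return True if Google confirms the CAPTCHA token came from a real person."""
    payload = {"secret": RECAPTCHA_SECRET, "response": captcha_token}
    if client_ip:
        payload["remoteip"] = client_ip
    result = requests.post(VERIFY_URL, data=payload, timeout=5).json()
    return result.get("success", False)

# Usage sketch inside a form handler: reject the submission if is_human(token) is False.
```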

You may try filtering known bots out of your analytics reports, but it is practically impossible to block all bot traffic, and, to be honest, you shouldn’t preemptively block bots since not all of them are bad. Addressing bot traffic as needed is the better approach.
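For example, a first pass at filtering self-declared crawlers out of a raw access-log export could look like the sketch below. The log file name and the list of user-agent substrings are assumptions to adapt to your own stack, and this only separates honest crawlers; imposter bots that spoof browser user agents will slip through.

```python
import re

# Substrings that well-known, self-declaring crawlers put in their user-agent header.
KNOWN_BOT_PATTERN = re.compile(
    r"googlebot|bingbot|ahrefsbot|semrushbot|duckduckbot|python-requests|curl",
    re.IGNORECASE,
)

def split_log(lines):
    """Separate access-log lines into (human-looking, declared-bot) buckets."""
    humans, bots = [], []
    for line in lines:
        (bots if KNOWN_BOT_PATTERN.search(line) else humans).append(line)
    return humans, bots

with open("access.log", encoding="utf-8", errors="replace") as log:  # hypothetical path
    humans, bots = split_log(log)

print(f"{len(bots)} declared-bot requests, {len(humans)} remaining requests")
```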

To ensure your site and business have the best protection available, you need up-to-date security tools to counter these potentially destructive scripts. It’s important to choose a solution that offers very high accuracy, provides real-time detection and mitigation, and learns and improves continually.

Need help kicking invalid traffic and bots to the curb? Sign up for Traffic Cop today!


FAQ

How can you detect bot traffic?

Detecting bot traffic isn’t always easy. Look for signs such as very high bounce rates, strange referral traffic sources, and traffic from irrelevant geographic locations in your Google Analytics account. We cover detecting bot traffic in more detail above.

How do I stop bot traffic?

To reduce bot traffic, you can ask specific crawlers not to access your site via your robots.txt file, though only well-behaved bots respect it. To prevent invalid traffic and bot traffic that can harm your publisher business by clicking on your ads, consider using an invalid traffic detection and prevention service such as Traffic Cop. Traffic Cop uses machine learning and fingerprinting algorithms to detect and block invalid traffic from viewing and clicking on your ads.
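Beyond robots.txt, you can also refuse unwanted requests at the application layer. Here is a minimal, hypothetical WSGI middleware sketch that returns 403 Forbidden for a denylist of user-agent substrings; it is an illustration of the idea, not how Traffic Cop works.

```python
# A hypothetical denylist -- only bots you have decided you never want to serve.
BLOCKED_UA_SUBSTRINGS = ("badbot", "content-thief", "click-fraud")

class BlockBadBots:
    """WSGI middleware that rejects requests from denylisted user agents with 403."""

    def __init__(self, app):
        self.app = app

    def __call__(self, environ, start_response):
        user_agent = environ.get("HTTP_USER_AGENT", "").lower()
        if any(bad in user_agent for bad in BLOCKED_UA_SUBSTRINGS):
            start_response("403 Forbidden", [("Content-Type", "text/plain")])
            return [b"Forbidden"]
        return self.app(environ, start_response)

# Usage sketch: wrap your existing WSGI app, e.g. app = BlockBadBots(app)
```

User-agent denylists are easy to evade, which is why fingerprinting and machine-learning approaches exist; treat something like this as a first line of defense only.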

Why do bots visit my site?

Generally, bots visit your site to gather information or perform a specific task. There are many different types of bots, with some being good and others bad. Examples of good or normal bots include SEO crawlers, search engine bots, and copyright bots, to mention a few. Bots that you don’t want on your site are web scrapers, spambots, and those that click on your ads.

Kean Graham

CEO and Founder at MonetizeMore

Kean is the resident expert in Ad Optimization covering areas like AdSense Optimization, DFP Management, and third-party ad network partnerships. Kean believes in the supremacy of direct publisher deals and holistic optimization as keys to effective and consistent ad revenue increases.

