A slow website can frustrate your users and can hurt your search engine rankings, because it affects user experience, something search engines such as Google pay close attention to.
Here are a few bots you should be blocking and why! At the bottom of this list is a link to a blacklist to help block these swines and speed up your site!
Scrapers downloading your website content are just a waste of your resources, especially if they then reuse YOUR content on their own websites… nobody wants that!
The Getty Images bot lurks around the internet finding infringing images so webmasters can be sued!!! Again, this slows down YOUR site, and if you do have any “infringing” images the company behind the bot may come after you… it is possible to block these bots via .htaccess if you know the right code!
Spammers… ScrapeBox, XRumer, GSA Search Engine Ranker… there are hundreds of spam bots ready to fill your website with garbage that can hurt your search engine rankings too… so keep them out!
Dead link checkers… Xenu! These bots crawl just about the entire web looking for expired domains. Xenu bots aren’t real people, so why should they be allowed onto your site to slow things down for real visitors?
Link checking bots… hiding your links is a good idea if you run a PBN, as it helps keep your backlink profile safe from prying eyes. These bots make money off your link data too!
Make sure you block link checker bots from your website. Why block bots like Ahrefs and Majestic? It is quite simple: they use up your bandwidth, slowing down your website. What is more, if you run a private blog network, the data these bots collect can be harmful, making your main website vulnerable to negative SEO: it allows your competitors to see your most powerful links, which they can then spam with thousands of junk links, creating a very effective (even if blunt) negative SEO attack. Even if you do not have a private blog network, the exposed links can be seen by your competition on these websites and then “reported” to Google.
Robots.txt or .htaccess, it is up to you… For link checking bots the preferred option is to block the main bots via robots.txt, as .htaccess rules may fail to catch bots that change IP addresses or user agents. However, it has to be mentioned that some bots, such as Majestic SEO’s, are cheeky (greedy) and do not give webmasters the easy option of refusing their bots via robots.txt, so you then have to find a way to block them through .htaccess instead.
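For bots that ignore robots.txt, one common approach on Apache servers is to deny requests by user agent in .htaccess. Here is a minimal sketch, assuming mod_rewrite is enabled; the bot names shown are just examples, and user-agent strings change and can be spoofed, so check the current strings before relying on this:

```apache
# Return 403 Forbidden to matching user agents (example names only)
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (MJ12bot|AhrefsBot|SemrushBot) [NC]
RewriteRule .* - [F,L]
```

The [NC] flag makes the match case-insensitive, and [F] sends a 403 so the bot gets nothing but a refusal.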
You can block bots from crawling your site using the robots.txt file. Each bot needs a different command added to robots.txt, which can usually be found on the annoying link checking website itself… most probably hidden away somewhere.
To block Ahrefs, for example, use this in your robots.txt!!! It should do the trick, but make sure that the robots.txt file is accessible, otherwise the bot will carry on as normal and crawl your site.
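Ahrefs documents its crawler’s user-agent token as AhrefsBot, so a minimal robots.txt rule would look like the sketch below (this assumes the bot honours robots.txt, which Ahrefs states its crawler does):

```text
User-agent: AhrefsBot
Disallow: /
```

Put this at the root of your site (e.g. example.com/robots.txt); the `Disallow: /` line tells that one bot to stay out of everything while leaving other crawlers unaffected.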
YOU HAVE BEEN WARNED!