Forum Discussion

doejohn0696
Copper Contributor
Sep 10, 2024

Managing Googlebot/Bingbot Exclusions in Security JavaScript without Impacting SEO

I need to add an important security-related JavaScript snippet to my HTML pages that detects a few signals, such as the presence of Selenium variables in the window/document objects. Once something is detected, a request is sent to my backend to capture this data.
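
For context, the detection logic looks roughly like the sketch below. The /api/bot-signals endpoint is just a placeholder, and the exact Selenium-injected property names vary by driver version:

```js
// Check a few commonly cited automation signals and report any hits
// to a placeholder backend endpoint (/api/bot-signals).
(function () {
  const signals = [];

  // Standard flag set by WebDriver-based automation (Selenium, etc.).
  if (navigator.webdriver) signals.push('navigator.webdriver');

  // ChromeDriver commonly injects a "$cdc_..." key onto document.
  for (const key of Object.keys(document)) {
    if (key.startsWith('$cdc_')) signals.push('document.' + key);
  }

  // Older Selenium globals occasionally left on window.
  for (const name of ['_selenium', 'callSelenium', '__webdriver_evaluate']) {
    if (name in window) signals.push('window.' + name);
  }

  if (signals.length > 0) {
    const payload = JSON.stringify({ url: location.href, signals });
    // sendBeacon survives navigation; fall back to fetch on older browsers.
    if (navigator.sendBeacon) {
      navigator.sendBeacon('/api/bot-signals', payload);
    } else {
      fetch('/api/bot-signals', { method: 'POST', body: payload });
    }
  }
})();
```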

Googlebot/Bingbot may also emit some of these signals (I am tracking 20+ signals), and these bots make thousands of visits to my various webpages, so I want to avoid executing the script for these bots.

1. If I use the user agent, either on the backend to omit this script entirely for Googlebot or on the frontend to skip executing it (see the sketch after this list), will it be safe for my SEO? Could Google penalize this on the assumption that the script is being used for cloaking?

2. How do bot detection companies like Human Security (PerimeterX) manage this? Do they track even Googlebot activity?
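
To illustrate question 1, the frontend gate I have in mind is roughly the following; the crawler list is illustrative, and runDetectionScript() stands in for the detection logic above:

```js
// Skip the whole detection script when the user agent claims to be a
// known crawler. UA strings are spoofable, so this only reduces noise
// from well-behaved bots; it is not a security boundary.
const CRAWLER_UA = /googlebot|bingbot/i;

if (!CRAWLER_UA.test(navigator.userAgent)) {
  runDetectionScript(); // placeholder for the checks sketched earlier
}
```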

  • lewis8653g5415
    Copper Contributor

    Managing Googlebot and Bingbot exclusions in security-focused JavaScript requires a careful approach to avoid negatively impacting SEO. It's essential that your security measures, such as bot detection and access restrictions, do not inadvertently block legitimate search engine crawlers like Googlebot and Bingbot, which are crucial for indexing and ranking your site. To achieve this, configure your security logic to recognize and allow these bots while restricting harmful or unwanted automated traffic (see the verification sketch below).
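
    One dependable way to recognize the real crawlers is the reverse-DNS verification that Google documents, rather than trusting the user-agent string alone. A minimal Node.js sketch, assuming the check runs on your backend (the name isVerifiedCrawler is just illustrative):

    ```js
    // Verify a claimed crawler the way Google documents: reverse-DNS the
    // caller's IP, check the hostname's domain, then forward-resolve the
    // hostname and confirm it maps back to the same IP.
    const dns = require('dns').promises;

    async function isVerifiedCrawler(ip) {
      try {
        const [hostname] = await dns.reverse(ip);
        const trusted =
          /\.googlebot\.com$|\.google\.com$/.test(hostname) || // Googlebot
          /\.search\.msn\.com$/.test(hostname);                // Bingbot
        if (!trusted) return false;

        // Forward-confirm so a spoofed PTR record alone cannot pass.
        const { address } = await dns.lookup(hostname);
        return address === ip;
      } catch {
        return false; // unresolvable IPs stay unverified
      }
    }
    ```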

  • AdnanQ275
    Copper Contributor

    To safely add security-related JavaScript for detecting signals like Selenium, using the user agent to exclude bots like Googlebot can help protect your SEO. As long as the script doesn't change the content served to search engines or mislead them, it shouldn't be treated as cloaking; cloaking is about showing crawlers different content than users see, not about skipping a diagnostic script. Companies like Human Security (PerimeterX) manage this by allowing verified legitimate bots through while blocking suspicious activity, and you can combine a cheap user-agent check with proper crawler verification on the backend (see the sketch below).
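
    A rough backend sketch of that combination, using Express only as an example framework (the route, the /detect.js path, and the isVerifiedCrawler helper from the reply above are all placeholders):

    ```js
    const express = require('express');
    const app = express();

    app.get('/some-page', async (req, res) => {
      const ua = req.get('User-Agent') || '';
      const claimsCrawler = /googlebot|bingbot/i.test(ua);
      // Only run the slower DNS check when the UA claims to be a crawler;
      // isVerifiedCrawler is the reverse-DNS helper sketched above.
      const verifiedCrawler = claimsCrawler && (await isVerifiedCrawler(req.ip));

      // Ship the detection script only to visitors who are not verified crawlers.
      const detectionTag = verifiedCrawler ? '' : '<script src="/detect.js"></script>';
      res.send(`<!doctype html><html><head>${detectionTag}</head><body>Hello</body></html>`);
    });

    app.listen(3000);
    ```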
