The current robots.txt disallows the ScannerBot and Cookiebot user agents, i.e. the crawlers that major consent management platforms (Termly, CookieYes, Cookiebot, etc.) use to audit cookies on our sites. As a result, we can't run automated compliance scans to keep our cookie policies accurate and up to date.
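The effect is easy to verify with Python's standard-library robots.txt parser. The `ROBOTS_TXT` string below is a hypothetical reconstruction of the rules described above (not our actual file), and `example.com` stands in for a gallery domain:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical reconstruction of the disallow rules described above.
ROBOTS_TXT = """\
User-agent: ScannerBot
Disallow: /

User-agent: Cookiebot
Disallow: /
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Consent-platform scanners are blocked from every path...
print(rp.can_fetch("Cookiebot", "https://example.com/exhibitions"))  # False
print(rp.can_fetch("ScannerBot", "https://example.com/"))            # False
# ...while crawlers with no matching rule are unaffected.
print(rp.can_fetch("Googlebot", "https://example.com/"))             # True
```

Any well-behaved scanner honors these directives, which is why the consent platforms' audits silently come back empty rather than erroring out.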
This is a compliance issue, not a nice-to-have. GDPR and several US state privacy laws require accurate cookie disclosures. If our consent management tools can't scan our sites, we're left manually auditing cookies through browser DevTools, which isn't scalable and introduces compliance risk.
ArtCloud's built-in cookie policy tool is not a substitute. It offers no granular consent controls, no per-category opt-in/opt-out, and no script blocking. It doesn't meet GDPR requirements for meaningful consent. That's exactly why we use third-party consent management platforms, and why those platforms need scanner access to function.
Request: Remove the ScannerBot and Cookiebot disallow rules from robots.txt, or provide a mechanism for galleries to whitelist specific bots for their own domains.
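For concreteness, a sketch of what the fixed file could look like. The stanzas below are illustrative only (the catch-all rule is a placeholder, not our actual robots.txt); explicit `Allow` records for the two scanners would also serve as the per-domain whitelist mechanism:

```
# Hypothetical robots.txt after the fix: consent scanners explicitly allowed.
User-agent: ScannerBot
Allow: /

User-agent: Cookiebot
Allow: /

# Existing rules for all other crawlers stay as-is (placeholder example).
User-agent: *
Disallow:
```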