

I'm struggling to find it, but there's an "AI tarpit" that causes scrapers to get stuck, something like that? I'm sure I saw it posted on Lemmy recently. Hopefully someone can link it.
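In the meantime, the basic trick is easy to sketch. This is a toy, not whatever tool I saw (the port and paths here are made up): answer every request with an endless, deliberately slow page of links, so a crawler that follows them burns time and connections on nothing.

```python
import time
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

class TarpitHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Respond to every path, so followed links land back in the pit.
        self.send_response(200)
        self.send_header("Content-Type", "text/html")
        self.end_headers()
        try:
            n = 0
            while True:
                # Drip one meaningless link every few seconds, forever.
                self.wfile.write(f'<a href="/page/{n}">next</a>\n'.encode())
                self.wfile.flush()
                time.sleep(5)
                n += 1
        except (BrokenPipeError, ConnectionResetError):
            pass  # scraper gave up; let this thread exit

    def log_message(self, *args):
        pass  # keep the console quiet

if __name__ == "__main__":
    ThreadingHTTPServer(("0.0.0.0", 8080), TarpitHandler).serve_forever()
```

Real tarpits are smarter about it (generated filler text, rate limiting), but that's the shape of the idea.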
Yes, I did read OP.

Edit: I see this was downvoted without a response, but I'll put this out there anyway.
If you host a public site that you expect anyone to be able to access, there is very little you can do to exclude an AI scraper specifically.
Hosting your own site just for personal use? IP blocks and the like will keep scrapers out; a rough sketch of what I mean is below.
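To be concrete about the personal-use case, here's a minimal allowlist check in Python. The CIDR ranges are documentation placeholders, so substitute your own; in practice you'd enforce this at the firewall or reverse proxy rather than in the app.

```python
from ipaddress import ip_address, ip_network

# Only networks you trust get through; everything else is refused.
ALLOWED_NETS = [ip_network("192.0.2.0/24"), ip_network("203.0.113.0/24")]

def is_allowed(client_ip: str) -> bool:
    """Return True if client_ip falls inside any trusted network."""
    addr = ip_address(client_ip)
    return any(addr in net for net in ALLOWED_NETS)

# Wire this into whatever fronts your site (reverse proxy, WSGI hook, ...):
assert is_allowed("192.0.2.10")        # inside an allowed range
assert not is_allowed("198.51.100.7")  # everyone else is refused
```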
For a public site, though, how do you tell legitimate users from scrapers? It's very difficult.
They will use up your traffic either way. Don't want that? You can waste their time (a tarpit, like the one mentioned above) or take your hosting out of public access.
Downvoter: what's your alternative?