How to combat large amounts of AI scrapers
Drunk & Root@sh.itjust.works to Selfhosted@lemmy.world · 3 days ago
Every time I check my nginx logs it's more scrapers than I can count, and I couldn't find any good open source solutions.
Sheldan@lemmy.world · 17 hours ago
Some of them are at least honest and have it as a user agent.
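For the bots that do identify themselves, a minimal sketch of that approach in nginx: map the User-Agent header against a few crawler names and return 403 on a match. The names below (GPTBot, CCBot, ClaudeBot, Bytespider) are common examples only, not a complete or current list, and this does nothing against scrapers that spoof a browser user agent.

    # In the http block: flag requests whose User-Agent matches known AI crawlers.
    map $http_user_agent $block_ai_scraper {
        default       0;
        ~*GPTBot      1;
        ~*CCBot       1;
        ~*ClaudeBot   1;
        ~*Bytespider  1;
    }

    server {
        # ... existing listen / server_name / location config ...

        # Reject flagged requests before they reach the backend.
        if ($block_ai_scraper) {
            return 403;
        }
    }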
krakenfury@lemmy.sdf.org · 16 hours ago
Is ignoring robots.txt considered "honest"?
Sheldan@lemmy.world · 13 hours ago
That's not what I was talking about.