I find the idea of self-hosting really appealing, but at the same time incredibly scary. Not because I lack the technical expertise, but because I've gotten the impression that the moment a server goes online, everyone on the Internet starts trying to hack it into their botnet. I would have to stay constantly vigilant, while any one of countless attackers only has to succeed once. Dealing with that threat sounds frightening enough as a full-time job, and this would only be a hobby project for me.

How do the self-hosters on Lemmy avoid becoming one with the botnet?

  • wonderingwanderer@sopuli.xyz
    9 hours ago

    Would something like Anubis or Iocaine prevent what you’re worried about?

    I haven’t used either, but from what I understand they’re both lightweight programs to prevent bot scraping. I think Anubis analyzes web traffic and blocks bots when it detects them, and Iocaine does something similar but also redirects those bots into a maze of garbage data, to poison the AI models being trained and to waste resources on the part of the companies doing the scraping.
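    Deployment-wise, tools like these typically sit as a reverse proxy in front of the actual service, so only requests that pass their checks ever reach your app. A minimal sketch of that pattern in nginx, assuming the bot-filtering proxy listens on localhost port 8923 and the domain/ports are placeholders:

    ```nginx
    # Public-facing server: all traffic hits the bot filter first.
    server {
        listen 443 ssl;
        server_name example.com;  # placeholder domain

        location / {
            # The filtering proxy (e.g. Anubis or Iocaine) listens here;
            # it forwards traffic it deems legitimate on to the real app.
            proxy_pass http://127.0.0.1:8923;
            proxy_set_header Host $host;
            proxy_set_header X-Real-IP $remote_addr;
        }
    }
    ```

    The point of the layering is that your application never sees raw Internet traffic directly.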

    Obviously what others have said about firewalls, VPNs, and antivirus still applies; maybe also a rootkit hunter and Linux Malware Detect? I’m still new to this though, so you probably know more about all that than I do. Sorry if I’m stating the obvious.
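    On the firewall point, a common baseline is default-deny inbound with only the services you actually expose opened up. A sketch using ufw (these are example ports; adjust for whatever you actually run), not something to paste blindly:

    ```shell
    # Deny all inbound traffic by default, allow outbound.
    sudo ufw default deny incoming
    sudo ufw default allow outgoing

    # Open only the services you expose (example ports).
    sudo ufw limit 22/tcp    # SSH, rate-limited against brute force
    sudo ufw allow 443/tcp   # HTTPS

    sudo ufw enable
    ```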

    Not sure if this is overkill, but Network Security Toolkit might have some helpful tools as well?