

Companies are throwing away old hardware (like 8th/9th gen Core i5) that’s perfect for running Home Assistant. See if there’s an e-waste recycler near you - they might let you buy an old system for a nominal fee.
Aussie living in the San Francisco Bay Area.
Coding since 1998.
.NET Foundation member. C# fan
https://d.sb/
Mastodon: @dan@d.sb


In the context of Debian, “stable” means it doesn’t change often. Debian stable doesn’t have major version changes within a particular release.
Unstable has major changes all the time, hence the name.
I think testing is a good middle ground. Packages migrate from unstable to testing after ~10 days in unstable, provided no release-critical bugs are found.


Use a page caching plugin that writes HTML files to disk. I don’t do a lot with WordPress any more, but my preferred one was WP Super Cache. Then, you need to configure Nginx to serve pages directly from disk if they exist. By doing this, page loads don’t need to hit PHP and you effectively get the same performance as if it were a static site.
See how you go with just that, with no other changes. You shouldn’t need FastCGI caching. If you can get most page loads hitting static HTML files, you likely won’t need any other optimizations.
One issue you’ll hit is if there’s any highly dynamic content on the page, that’s generated on the server. You’ll need to use JavaScript to load any dynamic bits. Normal article editing is fine, as WordPress will automatically clear related caches on publish.
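As a rough sketch, the Nginx side looks something like this (assuming WP Super Cache’s default cache directory layout; real configs also bypass the cache for logged-in users and POST requests):

```nginx
# Serve WP Super Cache's pre-rendered HTML if it exists,
# otherwise fall through to WordPress/PHP.
location / {
    try_files /wp-content/cache/supercache/$http_host/$request_uri/index.html
              $uri $uri/ /index.php?$args;
}
```

With a rule like that, cached page loads never touch PHP at all, which is where the static-site-like performance comes from.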
For the server, make sure it’s hosted near the region where the majority of your users are. For 200k monthly hits, I doubt you’d need a machine as powerful as the Hetzner one you mentioned. What are you using currently?
If your current setup works well for you, there’s no reason to change it.
You could try Debian in a VM (virtual machine) if you want to. If you’re running a desktop environment, GNOME Boxes makes it pretty easy to create VMs. It works even if you don’t use GNOME.
If you want to run it as a headless server (no screen plugged in to it), I’d install Proxmox on the system, and use VMs or LXC containers for everything. Proxmox gives you a web UI to manage VMs and containers.


You can somewhat avoid the issue of old packages by running the testing version instead of stable, but in that case you should ensure you get security updates from unstable: https://github.com/khimaros/debian-hybrid
I used to run some systems on Debian testing and never had any issues.
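The gist of that hybrid setup is roughly this (illustrative priorities; the linked repo has the actual details):

```
# /etc/apt/sources.list — track testing, with unstable available
deb http://deb.debian.org/debian testing main
deb http://deb.debian.org/debian unstable main

# /etc/apt/preferences.d/hybrid — prefer testing; only pull a
# package from unstable when testing doesn't have it (e.g. a
# security fix that hasn't migrated yet)
Package: *
Pin: release a=testing
Pin-Priority: 900

Package: *
Pin: release a=unstable
Pin-Priority: 300
```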


Once you’re past that, the userbase is smaller than Ubuntu’s
Is it? I feel like there’s far more Debian systems in the world, if you include servers.
Use a VPN that supports port forwarding, like AirVPN.
Blue Iris is by far the most capable NVR, but it’s Windows-only so you’d need a Windows or Windows Server VM. For a basic setup, Frigate is more than sufficient.
I’d say try Frigate on your ThinkCentre and see how well it runs. I wouldn’t buy new hardware prematurely.
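If you want to give Frigate a quick try, a minimal Docker Compose sketch looks roughly like this (image tag, device path, and port are assumptions; check the Frigate docs for your version):

```yaml
services:
  frigate:
    image: ghcr.io/blakeblackshear/frigate:stable
    restart: unless-stopped
    shm_size: "256mb"            # sized for a few cameras; see the docs
    devices:
      - /dev/dri/renderD128      # Intel iGPU for hardware video decode
    volumes:
      - ./config:/config
      - ./storage:/media/frigate
    ports:
      - "8971:8971"              # web UI on recent versions
```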
Do I understand that I could then share the igpu between Jellyfin and Docker/Frigate?
I’m not sure about containers like LXC, but generally you need SR-IOV or GVT-g support to share a GPU across multiple VMs. I think your CPU supports GVT-g, so you should be able to find a guide on setting it up.
It must be a lot of work to self-host DigitalOcean.


There’s a great Android TV app called Tivimate. You can record shows to a hard drive attached to the Android box.
If you’re in the USA (or maybe Canada?), the Walmart Onn 4K is a very good device for the price ($30). The Nvidia Shield Pro ($200) is more premium. I have two of them.
The best IPTV services are secretive and require an invite from an existing user. The one I use hasn’t accepted new customers for a few years, so unfortunately I can’t refer you.
Fibre optic is generally better for this use case, but Ethernet would work fine too. Be sure to use CMX-rated cable, as it’s rated for outdoor use (it has a more durable, UV-resistant jacket and is suitable for direct burial).
Run it in conduit so you can easily replace it in the future if needed.


I know this comment is a bit old, but do you have any recommendations on how to learn about building custom Odoo modules? I’m an experienced developer (over 20 years’ experience) but am new to Odoo. I’ve learnt some things by looking at the code for OCA modules (I had to debug an issue with the Plaid bank statement integration) but am interested in any resources you found useful.


Samba is good too, but needs some config tweaking to hit top speeds on faster networks (5Gbps, 10Gbps or more). Probably not relevant here since the Pi only has a gigabit Ethernet port.
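For reference, the kind of smb.conf tweaks I mean look like this (illustrative [global] settings; benchmark on your own network before keeping them):

```
# /etc/samba/smb.conf — [global] tuning sketch for fast links
server multi channel support = yes   # use multiple TCP connections
use sendfile = yes                   # zero-copy reads where possible
aio read size = 1                    # async I/O for all read sizes
aio write size = 1                   # async I/O for all write sizes
socket options = TCP_NODELAY IPTOS_LOWDELAY
```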


I was thinking more about metadata for the torrent client, or for other apps, like Plex or whatever else is running on the Pi. Logs, but also databases (if they store any) and things like that.


Get rid of the SD card and only use the SSDs. It’s a common point of failure with Pis - SD cards aren’t designed for frequent writes.
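On a Pi 4, for example, you can tell the bootloader to boot straight from USB (from memory; double-check against the official Raspberry Pi documentation):

```
# Edit the bootloader config (opens in $EDITOR)
sudo rpi-eeprom-config --edit

# BOOT_ORDER is read right to left:
# try USB (4) first, then SD (1), then retry (f)
BOOT_ORDER=0xf14
```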


Consider using NFS instead of sshfs for more reliability.
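A minimal client-side fstab entry looks something like this (hostname and export path are placeholders):

```
# /etc/fstab — mount an NFS export from the Pi
pi.local:/srv/media  /mnt/media  nfs  defaults,vers=4.2  0  0
```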
Definitely… I use Borgbackup for my backups, which encrypts the backups before sending them to the remote server. Not all use cases can do that though, so sometimes it’s useful to have filesystem-level encryption.
Oops, I didn’t know about the SX line. Thanks!! I’m not familiar with all of Hetzner’s products.
For pure file storage (i.e. you’re only using SFTP, Borgbackup, restic, NFS, Samba, etc.) I still think the storage boxes are a good deal, as you don’t have to worry about server maintenance (since it’s a shared environment). I’m not sure if it supports encryption though, which is probably where a dedicated server would be useful.
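For what it’s worth, pointing Borgbackup at a storage box looks roughly like this (username and hostname are placeholders; Hetzner’s storage boxes use SSH on port 23):

```
# Create an encrypted repository (keys stay on your machine)
borg init --encryption=repokey-blake2 \
    ssh://u123456@u123456.your-storagebox.de:23/./backups

# Back up /home; {hostname}-{now} expands to a unique archive name
borg create --stats \
    ssh://u123456@u123456.your-storagebox.de:23/./backups::{hostname}-{now} \
    /home
```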


One of SQLite’s recommended use cases is as an alternative to proprietary binary formats: https://sqlite.org/appfileformat.html. Programs often store data in binary files for performance, but SQLite gives you a lot of the same functionality (fast random access, concurrent usage, atomicity, updates that don’t need to rewrite the whole file, etc.) without having to implement a file format yourself.
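As a minimal sketch of the idea in Python (the "notes" schema is made up for illustration):

```python
import sqlite3

def open_document(path):
    """Open (or create) an application 'document' stored as a SQLite file."""
    conn = sqlite3.connect(path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS notes ("
        " id INTEGER PRIMARY KEY,"
        " title TEXT NOT NULL,"
        " body TEXT NOT NULL)"
    )
    return conn

doc = open_document(":memory:")  # use a file path for a real on-disk document

# Each transaction is atomic: a crash mid-write never corrupts the file,
# and updates touch only the affected pages rather than rewriting everything.
with doc:
    doc.execute(
        "INSERT INTO notes (title, body) VALUES (?, ?)",
        ("hello", "first note"),
    )

# Fast random access by key, with no need to parse the whole file.
row = doc.execute(
    "SELECT body FROM notes WHERE title = ?", ("hello",)
).fetchone()
print(row[0])  # first note
```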
I’m not sure if this is still the case, but Facebook’s HHVM used to store the compiled bytecode for the whole site in a single SQLite database: https://docs.hhvm.com/docs/hhvm/advanced-usage/repo-authoritative/. Every page load fetched the bytecode for all required files from the DB.
Even if you build your own thing to communicate with the AC, Home Assistant is still useful since it lets you easily automate things and interact with other devices, and you get a bunch of things included (nice UI, storage of historical data, dashboards, etc). You could build your thing as a Home Assistant integration.