I actually just switched a backup script of mine from using a tar.gz to just a regular tar file. It’s a little bigger, but overall the process is so much faster that I don’t even care about the tiny extra bit of compression (100 GB vs 120 GB transferred over a 1 Gbit connection). The entire reason I do this is, like you said, that transferring files over the Internet is a billion times faster as one file, BUT you don’t need the gzip step just for that.
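For anyone curious, the change amounts to dropping one flag. Here's a minimal sketch, using a throwaway temp directory as a stand-in for the real backup source (the paths and the rsync host are placeholders, not my actual setup):

```shell
#!/bin/sh
# Sketch of the tar.gz -> tar switch; paths are placeholders.
set -e
src=$(mktemp -d)                    # stand-in for the real data directory
echo "hello" > "$src/file.txt"

# Plain tar: no -z flag, so the gzip pass is skipped entirely.
tar -cf backup.tar -C "$src" file.txt

# The old version was effectively: tar -czf backup.tar.gz -C "$src" file.txt
# Either way it's one file to push over the wire, e.g.:
#   rsync backup.tar user@backup-host:/backups/    (placeholder host)

tar -tf backup.tar                  # list the archive's contents
```

You still get the one-big-file transfer benefit; you just skip the CPU-bound compression pass on both ends.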
You probably know of this already, but you might consider using Borg Backup instead. It only sends the changed bits, which is even faster.
I’ll have to take a look. I was trying to keep it simple but I’m not opposed to giving that tool a shot. Thanks. :)