This article is currently an experimental machine translation and may contain errors. If anything is unclear, please refer to the original Chinese version. I am continuously working to improve the translation.
For tinkerers, switching devices or reinstalling the system might feel like a casual decision, but transferring and backing up the data quickly turns into a headache.
For example, your PC gets infected with some nasty malware that just won’t go away — might as well do a clean OS reinstall. But then you realize migrating all those app settings and data from the C drive is a nightmare. You finally finish moving everything after half a day, wipe the old disk… only to suddenly remember you forgot something important.
This exact scenario has happened to me more than once. Phones are no safer: valuable data kept solely on a phone is always one random mishap away from being lost.
So my solution? Set up an image backup server, combined with file versioning, to prevent data loss once and for all.
Here’s how I did it — feel free to take inspiration.
Server
Hardware
I’m using an Orange Pi connected to an external hard drive via a USB hub. For now, I’m testing with a regular 500GB HDD. If it fills up later, I’ll probably switch to high-capacity, cost-effective secondhand SAS server drives — they’re reliable, too. The hard drive needs a separate 12 V power supply. The Orange Pi connects to my home network through a gigabit router with 5 GHz Wi-Fi coverage.
Software
The Orange Pi runs Armbian. After mounting the hard drive, I use Docker to run a vsftpd server exposed on the local network. For external access, I rely on free public frp relays (free servers aren’t always stable, so I run several frpc instances against different relays for redundancy). To keep things secure, I don’t expose the FTP port directly. Instead, I use OpenVPN with certificate-based authentication to connect to the internal network first.
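To sketch the frp side of this (the relay address, token, and ports below are placeholders, not my real setup), a classic frpc.ini that forwards only the OpenVPN UDP port through a public relay might look like:

```ini
# frpc.ini — illustrative only; relay address, token and ports are placeholders
[common]
# a free public frps relay
server_addr = frp.example.com
server_port = 7000
token = shared-secret

# forward only the OpenVPN port — the FTP port itself is never exposed
[openvpn]
type = udp
local_ip = 127.0.0.1
local_port = 1194
remote_port = 21194
```

Running a second frpc with its `[common]` section pointed at a different free relay is what gives the redundancy when one relay goes down.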
I also wrote a Python script that encrypts the data using OpenSSL every two weeks and uploads it to OneDrive — just in case everything else fails.
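I haven’t published that script, but a minimal sketch of the idea — archive the backup directory, encrypt it with the openssl CLI, then hand the result to an uploader — might look like this. The paths, the passphrase file, and the `upload` hook are placeholders; scheduling every two weeks would be handled by cron, not by the script itself:

```python
import datetime
import subprocess
import tarfile


def archive_name(prefix="backup", when=None):
    """Dated archive filename, e.g. backup-2024-01-01.tar.gz."""
    when = when or datetime.date.today()
    return f"{prefix}-{when.isoformat()}.tar.gz"


def make_archive(src_dir, out_path):
    """Pack the backup directory into a gzip'd tarball."""
    with tarfile.open(out_path, "w:gz") as tar:
        tar.add(src_dir, arcname=".")


def encrypt_cmd(src, dst, passphrase_file):
    """Command line for symmetric encryption with the openssl CLI."""
    return ["openssl", "enc", "-aes-256-cbc", "-pbkdf2", "-salt",
            "-in", src, "-out", dst,
            "-pass", f"file:{passphrase_file}"]


def run_backup(src_dir, passphrase_file, upload):
    """Archive, encrypt, and upload one backup snapshot."""
    name = archive_name()
    make_archive(src_dir, name)
    subprocess.run(encrypt_cmd(name, name + ".enc", passphrase_file),
                   check=True)
    upload(name + ".enc")  # e.g. a OneDrive API or rclone wrapper
```

Decryption is the same `openssl enc` invocation with `-d` added, so the archive can be recovered on any machine with OpenSSL installed.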
Clients
Windows
I use GoodSync to enable real-time file backup. Since continuously monitoring file changes can be resource-heavy, I configured it to scan the entire drive every two hours and perform incremental backups. All historical versions of files are preserved indefinitely.
Linux
I do have a dedicated Linux machine, but it’s mainly for coding — all my projects are pushed to a Git server hosted on the Orange Pi, so no extra backup is needed. That said, GoodSync does offer a command-line version for Linux, which you can try out if needed.
Android
Android is a bit trickier. First, you need to root the device. Then install Titanium Backup to back up all data partition contents to the SD card every night. After that, use FolderSync to sync the backup files to the FTP server. As for system partition settings? Nah, not worth the hassle.
One downside: this setup doesn’t support true incremental backups, so it eats up a lot of storage. (Unless you’re willing to give up file version history.)
iOS
iOS is by far the most complicated when it comes to backing up to your own FTP server. Luckily, iCloud’s 5GB is enough for my 32GB iPhone. Apple only backs up app data, not the apps themselves — which actually helps save space. In the end, I just went with iCloud backups and called it a day.
This article is licensed under the CC BY-NC-SA 4.0 license.
Author: lyc8503, Article link: https://blog.lyc8503.net/en/post/backup-solution/
If this article was helpful or interesting to you, consider buying me a coffee ¬_¬
Feel free to comment in English below o/