I’m confused about protecting backups from ransomware. Online, people say that backups are the most critical aspect to recovering from a ransomware attack.
But how do you protect the backups themselves from becoming encrypted too? Is it simply a matter of having totally unique and secure credentials for the backup medium?
Like, say I had a Synology NAS as a backup target for my production environment’s shared storage, VM backups, etc., connected to the network over gigabit Ethernet. What stops the ransomware from encrypting that Synology too?
Thanks in advance for the feedback!
If your backups are visible from the systems being targeted, you are doing it wrong. Done right, the backup server at most uses an agent on the client systems so it can contact them and pull the data, and the backup store itself is not reachable from those clients.
Have a look at how BackupPC works. It doesn’t even need an agent; it reads the data straight from network shares.
I’ll check out BackupPC. What is the most common/best-practice way to make sure the backup medium isn’t accessible from any endpoint on the network?
Unplug it after the backup.
Immutable/offline backups. If you back up to local physical media (HDD/tape), physically disconnect/eject it and store it somewhere safe. If you back up to cloud storage (S3, etc.), many providers have immutability options. Configured properly, nobody (not even you) can delete or modify the backups within the specified retention period.
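As a sketch of the S3 route (bucket name and the 30-day retention are placeholders; other providers offer the same idea under different names):

```shell
# Object Lock must be enabled when the bucket is created
aws s3api create-bucket --bucket my-backup-bucket \
    --object-lock-enabled-for-bucket

# Default retention: no one, including the account owner, can delete or
# overwrite an object version until 30 days after it was uploaded
aws s3api put-object-lock-configuration --bucket my-backup-bucket \
    --object-lock-configuration \
    '{"ObjectLockEnabled": "Enabled", "Rule": {"DefaultRetention": {"Mode": "COMPLIANCE", "Days": 30}}}'

# Backups uploaded from now on are immutable for the retention window
aws s3 cp backup-2024-01-01.tar.gz s3://my-backup-bucket/
```

Note the mode matters: COMPLIANCE can’t be bypassed at all during the retention period, while GOVERNANCE can be lifted by users with a special permission.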
Backups serve different purposes, and if encryption by malware is the threat, you have to do backups differently than for, say, hardware failure, where your NAS is a valid approach. To protect against encryption malware, you must make your backups inaccessible. One example is read-only media like write-once DVDs. Another is to make regular backups on tapes or HDDs and lock them up somewhere, only taking them out after you have wiped every computer that was affected by the malware.
What about simulated air gaps? That is, a backup system that disables its own networking once it’s done with the current backup and only re-enables it when it’s ready to start the next one?
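On a Linux-based backup box, that could be sketched as a cron job that only raises the link for the backup window (interface name, hosts, and paths here are placeholders):

```shell
#!/bin/sh
# "Simulated air gap": the backup host keeps its NIC down except while
# a backup is actually running.
set -e
trap 'ip link set eth0 down' EXIT   # go dark again even if the backup fails
ip link set eth0 up
dhclient eth0                       # or however the link gets an address
rsync -a backupuser@fileserver:/srv/data/ /backups/current/
```

The caveat is in the word "simulated": malware that compromises the backup host itself can simply re-enable the interface, so this narrows the attack window but is weaker than media that is physically unplugged.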
It depends on where you want to focus on protecting against ransomware. You can protect the NAS itself and you can focus on protecting network mount points hosted on the NAS. I would suggest planning for both since it’ll also protect you if your NAS dies/catches fire/etc.
To protect my home NAS itself from getting ransomwared, I use two external HDDs on a rotating basis stored in a safe. Every 6 months I plug the drive with the oldest backup into my NAS, overwrite the backup with the storage mount points I care about, print out a new label with the date, and put it in a safe. Ransomware can’t attack an unplugged drive.
If you’re a business or if the costs work out for you as a home user, you could just go cloud. Some cloud providers allow you to write to the store in an append-only manner. For example, Backblaze B2 lets you store data in a way where it won’t delete old data until after a certain amount of time passes. Depending on what you’re using to back up, you may be able to configure your NAS’s access to your cloud provider to only allow writes but not overwriting/deletes, or at least some form of file versioning.
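One hedged sketch of the B2 approach: the trick is to give the NAS an application key that simply lacks the delete capability (the capability names are Backblaze’s; the exact CLI syntax varies between b2 CLI versions, so treat this as illustrative):

```shell
# Create a key scoped to one bucket that can list, read, and write,
# but has no deleteFiles capability: a compromised NAS can add backups
# but cannot destroy existing ones.
b2 create-key --bucket my-backup-bucket nas-backup-key \
    listBuckets,listFiles,readFiles,writeFiles
```

Because B2 keeps old file versions, even an "overwrite" from the client just adds a new version on top; combine this with a lifecycle rule so old versions are only cleaned up after your retention window.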
To protect data from clients getting ransomwared, I turn on snapshots. With snapshots, when data gets written to your NAS, it keeps a copy of the old version of a file as well as the new one. You can restore snapshots fairly quickly because you just tell the file system to go back to the old versions. This gives you a near-instant way to restore data if a client becomes compromised and overwrites shared network storage. But note: this is not a replacement for a backup that is disconnected and offline.
Some Synology devices let you turn on BTRFS snapshots and set a schedule for when they’re taken and when you want to clean them up. The issue is that this can “waste” a lot of space if you frequently change large files or add/delete large amounts of data. You will want to balance how often you create snapshots with how long you wish to retain the snapshots.
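Synology exposes this through its GUI, but on a generic Linux box the same schedule-plus-cleanup idea looks roughly like this (paths and the 14-day retention are assumptions for the sketch):

```shell
#!/bin/sh
# Run from cron. Take a read-only snapshot of the shared subvolume;
# read-only matters, because a client with write access to the share
# still can't modify the snapshot's contents.
SRC=/volume1/shared
DEST=/volume1/.snapshots
btrfs subvolume snapshot -r "$SRC" "$DEST/shared-$(date +%Y%m%d-%H%M)"

# Prune snapshots older than 14 days to cap the space they consume
find "$DEST" -maxdepth 1 -name 'shared-*' -mtime +14 \
    -exec btrfs subvolume delete {} \;
```

The snapshot frequency and the retention window in the prune step are exactly the two knobs being balanced here.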
Using both of these mechanisms, I get the ability to quickly undo ransomware if my laptop gets hit and starts encrypting all of the data on its mounted network shares. If, for whatever reason, my NAS itself gets compromised, I’m only out 6 months of data. Could I back up more often? Sure, but my data doesn’t change that much anyways.
Look into the 3-2-1 strategy. Also: at least one copy should be taken offline after the backup is done. This might be done via tapes in a tape library, where you put your used tapes into a fireproof safe (one certified for tape fire protection - ask me if you don't know what that means). Backups that are not connected to a network are the most reliable in this scenario. Most encrypters encrypt right away, so offline/archived backups are most likely not already affected.
If the trojan kept itself silent for a couple of months (some specialized ones do that), even your archives are at risk. In such a situation, often the only solution is to rebuild from scratch.
The backups are on a separate system with different credentials. One copy of the backups is sent to online storage that is immutable. You set a retention policy and then you can’t delete, overwrite, or change the backups.
3-2-1 standard is what saves you.
If you want an automated system that can protect against ransomware, your backups need to be hosted in a way where the backup server, not the client, controls retention (a NAS share or local disk the client can write to is not sufficient). If your NAS supports automated snapshots that can't be deleted by the backup user, that can mostly fill this gap, but check how it handles snapshots when the disk fills up.
For self-hosted solutions I've used BURP, Amanda, and Borg Backup in the past, but I have switched to Proxmox Backup Server since my VMs all run in Proxmox. You still need to consider full disaster-recovery scenarios where both your primary and backup systems fail; for this, PBS supports both tape and remote-server replication.
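For Borg specifically, the "client can't control retention" property comes from running the server side in append-only mode. A sketch of the entry in the backup server's `~/.ssh/authorized_keys` (repository path and key are placeholders):

```shell
# Every connection from this client key is forced into an append-only,
# path-restricted "borg serve": the client can create new backups but
# cannot delete or prune existing ones.
command="borg serve --append-only --restrict-to-path /backups/laptop",restrict ssh-ed25519 AAAA... laptop-backup-key
```

Pruning then happens only from a trusted admin session on the server, never with credentials the client holds.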
There are also many cloud solutions that do this automatically. For cloud I would always use them in tandem with some kind of local backup.
For all of these they should have an admin account that has strong protection and doesn’t share credentials with any of the primary systems.
It's actually fairly simple. You just set up a backup server that connects to a network share and reads the data.
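A minimal sketch of that pull model (share name and paths are placeholders): the backup server mounts the share read-only, copies it into a dated directory, and unmounts. The key point is that all credentials live on the backup server, so malware on a client has nothing it could use to reach the backup store.

```shell
#!/bin/sh
# Pull-based backup run from the backup server's cron.
set -e
mount -t cifs -o ro,credentials=/root/.smbcreds //fileserver/share /mnt/share
rsync -a /mnt/share/ "/backups/fileserver/$(date +%F)/"
umount /mnt/share
```

This is essentially what BackupPC automates, with deduplication and a web UI on top.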