Solved

Veeam - From Backup Copy Job to another on-site NAS.


Userlevel 7
Badge +1

Hello,

Is there any advice on how to get better performance for our “air-gapped” or “immutable” backup on site?

We have two locations, Austria and Serbia.
The VBR server is located in Austria, where our critical infrastructure also resides.
Veeam is configured to run regular backups to a Synology NAS, also located in Austria.

After that, we configured immediate Backup Copy Jobs from Austria to Serbia.

In Serbia we have one main Synology NAS and a second Synology for the air-gapped copy.

I would try to do something like this:
1. Configure a Power Schedule on the NAS so it is powered on only at weekends.
2. Configure a Backup Copy Job that takes the Serbian backups (already copied over from Austria) as its source and sends them to the smaller NAS in Vienna.

There are also a few other options: automating this with rsync if possible, manually copying from one NAS to the other, or just creating regular backup jobs, but those would go through the IPsec tunnel and be a lot slower.
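
For the rsync route, something like this minimal sketch, run as a scheduled task on the air-gapped NAS while it is powered on, is what I have in mind. The hostnames, share paths, and SSH key below are just placeholders:

```python
# Minimal sketch: mirror the backup copies from the main Serbian NAS onto the
# air-gapped NAS with rsync over SSH. All hostnames, paths, and the key file
# are placeholders for illustration.
import subprocess

SRC = "backupuser@nas-main:/volume1/VeeamCopies/"  # main NAS (hypothetical share)
DST = "/volume1/VeeamAirgap/"                      # local path on the air-gapped NAS

subprocess.run(
    [
        "rsync",
        "-a",        # archive mode: preserve permissions, timestamps, etc.
        "--delete",  # keep an exact mirror (removes files no longer at the source)
        "-e", "ssh -i /root/.ssh/backup_key",
        SRC,
        DST,
    ],
    check=True,      # raise if rsync reports an error
)
```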

Is there maybe an option to connect them directly, or what would be the best approach?


Thanks everybody


Best answer by dloseke 18 August 2023, 16:34


2 comments

Userlevel 7
Badge +20

If you are using Synology, they have replication services built in for volumes, etc. I have two Synology NAS devices at home, a DS923+ and a DS920+. The DS920+ is older but has a ton more storage, so I sync the iSCSI volumes used for vSphere in my homelab to the DS920+ as an extra layer of protection. That way I can always sync them back if something happens on the newer DS923+.

Maybe that will work better than trying to craft something with Veeam, since you already have the Backup Copy Job from the main office to the DR site.

Userlevel 7
Badge +6


There are a few creative ways to hack things together. I have a client that has tried one or two of them, but in the end, what we found was that the best solution is to use a Linux Hardened Repository (LHR) to connect to their NAS. The other creative solutions included things like:

  1. Connect the NAS to a mechanical or digital timer and cut power when it is not in use. Downside: the NAS shutdowns are not necessarily graceful. Using a timer also means you can’t connect on-demand unless you’re able to manually flip the switch on the timer, and the timer could cut power while you’re reading or writing data, causing failures. Not a great idea at all.
  2. Leave the NAS powered on, but connect its network switch to a timer or smart switch. Downside: same issues as above, but at least without the data-corruption possibility of the NAS going hard down. You still have the same issues with interruption of service, etc.
  3. Same as the above solutions, but use a smart outlet and integrate scripting with your smart-device services to turn the outlet on and off as needed. Downside: automation is required to integrate with the smart services, and you still have the same disadvantages as a mechanical or digital timer.
  4. Use a managed network switch to connect the NAS. Use two scripts: a pre-job script that administratively enables the network ports the NAS is connected to before the job runs, and a post-job script that disables them afterwards (see the sketch after this list). Downside: I’m not sure there’s a way to connect to the NAS for maintenance tasks outside of running backup and copy jobs. Also, you need to store the switch credentials in the script, or get creative with how the script logs into the switch to bring the ports up and down.
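
For illustration, here is a rough sketch of what the pre-job script for option 4 could look like. It assumes a Cisco-style switch CLI and the netmiko library; the host, credentials, and interface names are all placeholders, and a matching post-job script would call set_ports(False).

```python
# Hedged sketch of option 4: bring the NAS-facing switch ports up before a
# backup copy job runs. Assumes a Cisco-style CLI reachable via netmiko; the
# host, credentials, and interface names below are placeholders.
from netmiko import ConnectHandler

SWITCH = {
    "device_type": "cisco_ios",
    "host": "10.0.0.2",
    "username": "backup-automation",
    "password": "********",   # better: pull from a secrets store, not the script
}
NAS_PORTS = ["GigabitEthernet1/0/10", "GigabitEthernet1/0/11"]

def set_ports(enabled: bool) -> None:
    """Administratively enable or disable the NAS-facing ports."""
    commands = []
    for port in NAS_PORTS:
        commands += [f"interface {port}", "no shutdown" if enabled else "shutdown"]
    with ConnectHandler(**SWITCH) as conn:
        conn.send_config_set(commands)   # push the interface changes
        conn.save_config()               # persist the running config

if __name__ == "__main__":
    set_ports(True)   # pre-job; the post-job counterpart calls set_ports(False)
```

A wrapper around something like this can be attached to the job itself via the pre-job and post-job script options in the job’s advanced settings.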

 

 

For really any of these “solutions”, there are some pretty severe disadvantages around accessing the repository on-demand. It can be scheduled and scripted, and scripting was the best of these approaches because, if a job runs long or you need to do an on-demand restore, it can still be automated to some degree. However, it’s not very elegant.

Fortunately, Veeam released the hardened repository feature, which uses the native immutability of the XFS file system, so we didn’t end up needing these hacked-together solutions. There are still certain risks, because it’s not technically a true air-gapped solution like having a copy on tape media stored off-site, but it’s much better than trying to cobble something together.
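
For context, the immutability the hardened repository relies on is the Linux immutable attribute on files stored on the XFS volume. Here is a toy illustration of that mechanism only; the path is hypothetical, it must run as root, and on a real hardened repository Veeam sets and clears the flag itself rather than you doing it by hand:

```python
# Toy illustration: set the Linux immutable attribute on a file and show that
# writes are then refused. Hypothetical path; requires root; on a real
# hardened repository the flag is managed by Veeam, not manually.
import subprocess

backup_file = "/mnt/xfs-repo/example.vbk"  # hypothetical file on an XFS volume

subprocess.run(["chattr", "+i", backup_file], check=True)  # mark immutable

try:
    with open(backup_file, "a") as f:  # any modification attempt now fails
        f.write("tampering")
except PermissionError as err:
    print(f"write refused, as expected: {err}")

subprocess.run(["lsattr", backup_file], check=True)  # the 'i' flag is visible
# Once retention expires, the flag is removed (chattr -i) so the file can age out.
```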
