Solved

File Share vs Volume


Userlevel 3

If I’m creating a Windows backup for a particular drive on a file server, is it best to create a File Share backup or a Volume-level backup (essentially both back up the same set of files)? i.e. which is most efficient or recommended as a best practice?


Best answer by MicoolPaul 6 June 2023, 10:17


14 comments

Userlevel 7
Badge +20

Hi, do you mean file-level in the agent, or file share as in NAS backup?

 

If you are looking at the agent, then I’d recommend protecting the volume unless you specifically need to protect only a subset of data. It’s faster: https://helpcenter.veeam.com/docs/backup/agents/agent_job_mode.html?ver=120

  • File-level backup is typically slower than volume-level backup. Depending on the performance capabilities of your computer and backup environment, the difference between file-level and volume-level backup job performance may increase significantly. If you plan to back up all folders with files on a specific volume or back up large amount of data, we recommend that you configure volume-level backup instead of file-level backup.
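To give a rough feel for why that’s typically true (a toy illustration only, not how Veeam or any backup engine actually works): file-level backup pays a per-file cost (open, read, close, metadata) for every file it walks, while volume-level backup can stream large sequential reads from a snapshot of the block device. A minimal Python sketch of that difference, with arbitrary file counts and sizes:

```python
# Toy illustration only: per-file overhead vs. one large sequential read.
# This is NOT how Veeam implements backups; it just shows why opening many
# small files tends to be slower than streaming the same bytes as one large
# object. File counts and sizes below are arbitrary.
import os, tempfile, time

TOTAL_MB = 32            # total data to "back up" (arbitrary)
SMALL_FILE_KB = 16       # size of each small file (arbitrary)
CHUNK = 1024 * 1024

with tempfile.TemporaryDirectory() as root:
    # Lay out the same amount of data twice: many small files, one big file.
    small_dir = os.path.join(root, "small")
    os.makedirs(small_dir)
    n_small = TOTAL_MB * 1024 // SMALL_FILE_KB
    payload = os.urandom(SMALL_FILE_KB * 1024)
    for i in range(n_small):
        with open(os.path.join(small_dir, f"f{i:05d}"), "wb") as f:
            f.write(payload)
    big_path = os.path.join(root, "big.bin")
    with open(big_path, "wb") as f:
        for _ in range(TOTAL_MB):
            f.write(os.urandom(CHUNK))

    # "File-level": walk the directory and read every file individually.
    t0 = time.perf_counter()
    for name in os.listdir(small_dir):
        with open(os.path.join(small_dir, name), "rb") as f:
            while f.read(CHUNK):
                pass
    t_files = time.perf_counter() - t0

    # "Volume-level": one large sequential read of the same amount of data.
    t0 = time.perf_counter()
    with open(big_path, "rb") as f:
        while f.read(CHUNK):
            pass
    t_volume = time.perf_counter() - t0

    print(f"{n_small} small files: {t_files:.3f}s   one large file: {t_volume:.3f}s")
```

The gap you see will depend heavily on disk type and OS caching, but the per-file overhead is the same reason the documentation recommends volume-level when you’re protecting everything on the volume anyway.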
Userlevel 3

I didn’t mean the Agent file-level backup that shows up, as it says that is slower, but the new option, File Share (I’m assuming it’s new in v12, but I’ve only just installed it; I think I read it was).

 

Userlevel 7
Badge +20

How large is the data set you’re trying to protect?

Userlevel 3

It can vary between 750GB and 1.5TB.

Userlevel 7
Badge +20

So you’ve just got to consider that file shares are licensed per 500GB (rounded down to the nearest 500GB).
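As a quick worked example of that rule as stated here (hypothetical share sizes taken from the 750GB–1.5TB range above; the exact rounding Veeam applies is worth confirming against the current licensing documentation):

```python
# Instance math for NAS backup using the rule as described in this thread:
# capacity is rounded down to the nearest 500GB, and each 500GB consumes
# one instance. Verify against current Veeam licensing docs before relying on it.
def nas_instances(share_gb: int) -> int:
    rounded_gb = (share_gb // 500) * 500   # round down to the nearest 500GB
    return max(1, rounded_gb // 500)       # at least one instance per share (assumption)

for size_gb in (750, 1000, 1500):          # hypothetical sizes from the range above
    print(f"{size_gb} GB share -> {nas_instances(size_gb)} instance(s)")
# 750 GB share -> 1 instance(s)
# 1000 GB share -> 2 instance(s)
# 1500 GB share -> 3 instance(s)
```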

 

Functionality-wise, your differences are that agent-based backups support instant VM/Disk recovery if it’s a VM, and NAS backups support instant file share recovery. If the server is physical then you’d be better off with NAS for the instant file share recovery, as you won’t be able to instant VM/Disk recover back to the source; you’d have to attach it to a virtual resource or wait for all the files to be restored.

 

The other consideration is around the backups and their processes. An agent backup will only require a repository to write to, plus any additional repositories for a backup copy job; a NAS backup job will require a file proxy as well, though this role could co-exist on another server if appropriate to your architecture. NAS backup jobs let you create secondary copies within the primary job, as well as use an archive repository. It’s just worth a quick read to understand how retention can differ if you need anything stored for the long term.

 

I did a post on this topic recently that might also provide insights into what is better for you: 

 

Userlevel 3

Essentially I have 5 file servers which I want to back up separately as an internal DR (quite a convoluted structure that we’re having to move to), so I’m using the Community Edition.

I want to keep these copies for perhaps two years; shadow copies are limited to a max of 512. So essentially I want an alternative to these to facilitate quick and easy restoration of files/folders in the case of accidental deletion etc. This isn’t for DR purposes as such.
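Just to put numbers on that gap (plain arithmetic, nothing Veeam-specific): two years of daily copies needs more restore points than the 512-shadow-copy ceiling per volume allows, so VSS alone can’t cover that retention.

```python
# Daily restore points needed for ~2 years of retention vs. the VSS limit
# of 512 shadow copies per volume mentioned above.
DAYS_PER_YEAR = 365
retention_years = 2
vss_max_shadow_copies = 512

points_needed = retention_years * DAYS_PER_YEAR   # 730 daily restore points
print(f"Daily points for {retention_years} years: {points_needed}")
print(f"VSS shadow copy limit per volume:  {vss_max_shadow_copies}")
print(f"Shortfall if relying on VSS alone: {points_needed - vss_max_shadow_copies}")
```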

Userlevel 7
Badge +20

And how frequently do you want to protect the file shares, daily?

Userlevel 3

Yes

Userlevel 7
Badge +20

Then I’d just go down the agent route here, my justifications being:

Community is limited to 10 instances. An instance can be 1 server or 500GB of NAS data. You said you’ve got 5 servers ranging from 750-1500GB, so you could end up over on Community licenses now or in the future with NAS backup, whereas agent backup will be a fixed 5 servers = 5 instances (see the sketch after these points).

Agent backup processing will consume CPU/RAM on the file server when it’s backing up, but it’s daily so this could be shifted out of hours if you have resource contention.

You don’t need to create a file proxy for agent-based backups.

You’re nowhere near the Microsoft VSS size limit (64TB) that would necessitate an alternative architecture.
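
A rough sketch of the instance maths behind the first point, assuming the per-500GB NAS rule described earlier in the thread and some hypothetical share sizes from the 750–1500GB range (check actual consumption against Veeam’s licensing documentation):

```python
# Community Edition instance budget: agent-based vs. NAS backup for 5 file
# servers. Sizes are hypothetical examples from the 750GB-1.5TB range in this
# thread; the per-500GB rounding rule is as described above, so verify it
# against current Veeam licensing docs.
COMMUNITY_INSTANCE_CAP = 10

def nas_instances(share_gb: int) -> int:
    # Round down to the nearest 500GB, one instance per 500GB (minimum 1).
    return max(1, share_gb // 500)

servers_now_gb = [750, 900, 1100, 1300, 1500]           # hypothetical current sizes
servers_later_gb = [gb + 500 for gb in servers_now_gb]  # hypothetical growth

agent_total = len(servers_now_gb)                       # 1 instance per protected server
nas_now = sum(nas_instances(gb) for gb in servers_now_gb)
nas_later = sum(nas_instances(gb) for gb in servers_later_gb)

print(f"Agent-based:        {agent_total} instances (cap {COMMUNITY_INSTANCE_CAP})")
print(f"NAS backup (now):   {nas_now} instances (cap {COMMUNITY_INSTANCE_CAP})")
print(f"NAS backup (grown): {nas_later} instances (cap {COMMUNITY_INSTANCE_CAP})")
```

With those example sizes the NAS approach already sits at 9 of the 10 Community instances and tips over once the shares grow, while the agent route stays fixed at 5.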

 

There are other questions I’d explore if this was a project I was working on, but it’s quite safe to say this will work well for you without needing to complicate things!

Userlevel 3

Fabulous, thanks ever so much for your input.

 

PS I thought the article on licensing was extremely clear… any chance you could do the same for Microsoft? 😁

Userlevel 7
Badge +20

Thank you @Ian C 😆 

Microsoft have made careers out of making their licensing complex! I can actually say I’m not certified to write such a thing; they have extensive training on that stuff: https://www.microsoft.com/en-us/licensing/learn-more/training-accreditation

Userlevel 7
Badge +6


Personally, I love that Microsoft has certifications on licensing. As such, I actually have a credential via one of our distributors for understanding NCE licensing. Boggles my mind that they can make things complex enough for this.

Userlevel 7
Badge +6

I suppose @MicoolPaul has answered everything for you, but just as my own input: if it is a virtual machine, I would probably set up a VM-level backup, and if you’re not looking to back up the entire VM, just the data volume, create an exclusion so that only the volume in question is backed up. I’m diving more into the agent on a future project that was recently signed off, but since I know VM-level backups much better, that’s the way I generally lean. Honestly though, either is probably a good option for file restores. I prefer not to use the agent for bare-metal restores in which the entire machine would need to be recovered, but I feel like they could go either way for file-level recoveries.

Userlevel 7
Badge +6


I can take a crack at MS licensing if you have a specific question.  😁  
