Solved

Veeam Backup for M365 - Archival Options


Userlevel 3

Hi Team

We have a customer with VBO deployed on-premises, with a local repository extended with object storage.

The customer is requesting the option to perform a monthly full backup to the cloud, as well as a yearly backup to the cloud.

Would this be possible? Please confirm what the options are.

I have created an archival object storage repository for this purpose.

Based on my understanding of the technology, only a backup copy to archival storage can be configured, which performs a direct copy of the existing backup job's data.

 


Best answer by MicoolPaul 2 August 2022, 17:35


10 comments

Userlevel 7
Badge +8

Hi @karun.keeriot,

 

What are you currently using for object storage? I'm interested in this part of your comment:

The customer is requesting the option to perform a monthly full backup to the cloud, as well as a yearly backup to the cloud.

Does this mean the object storage you’re using is local too? So you want a cloud backup option as an off-site backup copy?

 

Due to the way the data is processed and stored, the concept of full/incremental is slightly different to VBR. Within VBR you're working with blocks of data, whereas with VB365 you're working with files. So incrementals within VBR read the changed blocks to reconstruct an entire VM, whereas within VB365, you're just tracking file version history.

 

You should use a backup copy job to create a separate copy, as that seems to be the goal, but there's no native support to re-process all objects as a brand new full backup within an object storage repository, as that's wasteful and expensive IMO. If you want multiple full backups for redundancy, then you're best creating multiple backup/backup copy jobs.

 

More context would help, so I can provide better insights/direction on what’s possible.

 

Some articles that might help on this subject are below:

https://helpcenter.veeam.com/docs/vbo365/guide/object_storage_retention.html?ver=60

https://helpcenter.veeam.com/docs/vbo365/guide/retention_policy.html?ver=60

 

Userlevel 7
Badge +2

I feel it should be noted that it works this way in every M365 backup product I've seen. They all do, more or less, incremental backups to grab each file that has been added or changed. M365 backups don't run in the same fashion as traditional backups, like you would use if you were backing up an Exchange server as a VM, or even with an agent. They only look inside the mailboxes/OneDrive/SharePoint accounts/repositories for the data that has changed, etc.

Userlevel 7
Badge +1

@karun.keeriot Hello,
Since Veeam Backup for Microsoft 365 v6, you have the ability to create a backup copy job; for long-term retention purposes, AWS Glacier storage and Azure Archive can be used to archive data. But I haven't seen a method to schedule GFS-style archival. As mentioned by @MicoolPaul, the data is processed and stored differently in Veeam Backup for Microsoft 365.

https://helpcenter.veeam.com/docs/vbo365/guide/vbo_new_copy_job.html?ver=60


When using this type of job, you have to create a retrieval job if you need to restore from it.

Userlevel 3

Hi Guys

Thanks for the replies, much appreciated.

To give you some background on the requirement:

The customer has VBO deployed on-premises, with a local repository extended with Azure Blob storage. There is also an additional archive Blob storage configured under the object storage repositories.

Requirement 1: The customer needs 10 years of his email data backed up. (He has a similar backup policy on VBR, where we have configured backups to run on the last day of the month with a retention policy of 12 restore points.)

Requirement 2: Monthly full backups for 1 year. (A yearly backup policy configured for the month of December to the archival tier, using the proxy appliance to move data from Blob storage to Archive storage on Azure, with 10 restore points.)

Since the concept of a full backup does not arise with VBO, what would be the best way to proceed with data backup and archival?

Based on my understanding, there is a limitation with archival of email data, as there is no GFS-style policy available. The best option is to create a backup copy operation, which will be an additional copy of the primary incremental data.

Since I don't want to put load on the primary on-premises local storage, should I consider creating a new primary repository on the Azure Blob directly, and just create a backup copy from it onto the archive tier?

Also, is there a specific mode in which the archive storage needs to be created? The archive Blob storage does not show up when I go to create the backup copy job (i.e. it does not display in the dropdown to select the target storage repository).

Please confirm whether the archive storage also needs to be added as an additional extent to a regular repository, as the option to create scale-out repositories does not come up in VBO.

 

If someone could give me a workflow, that would be great.

 

Thx again.

Cheers ! 

Userlevel 7
Badge +5

Hi @karun.keeriot 


If you need independent monthly archival restore points, then the only way is to use local disk repositories (Jet databases) and a VBR/agent backup job. That will give you GFS restore points of the VB365 VM with all backed-up data.

With Object Storage based repositories, there is no monthly/yearly archival option. 

 

Also, is there a specific mode in which the archive storage needs to be created? The archive Blob storage does not show up when I go to create the backup copy job (i.e. it does not display in the dropdown to select the target storage repository).

 

It will show up if you have added AWS S3 Glacier or Azure Archive Blob object storage and created a new repository with that archive object storage. It must also use the same retention type (item/snapshot) as the backup job repository. Only backup job repositories with object storage can be used as a backup copy source.
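As a quick sanity check before the copy job wizard, you can compare the retention types of the two repositories with the VB365 PowerShell module. This is only a sketch: the repository names are placeholders, and the exact cmdlet and property details should be verified with Get-Help for your version.

```powershell
# Load the Veeam Backup for Microsoft 365 PowerShell module
Import-Module Veeam.Archiver.PowerShell

# Placeholder names - substitute your own repositories
$source  = Get-VBORepository -Name "Blob Repository"
$archive = Get-VBORepository -Name "Archive Repository"

# The archive repository only appears as a valid copy target when its
# retention type (Item/Snapshot) matches the source repository's
$source.RetentionType
$archive.RetentionType
```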

Userlevel 7
Badge +8

Hi @karun.keeriot 

 

Thanks for the detail. @Mildur has already provided some great input on this, so I’d like to take a different approach to looking at this and try to add further value.

 

Requirement 1: The customer needs 10 years of his email data backed up. (He has a similar backup policy on VBR, where we have configured backups to run on the last day of the month with a retention policy of 12 restore points.)

This sounds like everything needs to be retained for 10 years. Email backup isn't instantaneous as it would be if it were part of an SMTP relay, so it has to happen on a scheduled frequency. If you only processed emails monthly, you could miss a lot of content within that duration.

Requirement 2: Monthly full backups for 1 year. (A yearly backup policy configured for the month of December to the archival tier, using the proxy appliance to move data from Blob storage to Archive storage on Azure, with 10 restore points.)

 

I would tip this on its head and consider the following structure:

 

Backup Frequency: Daily

Repository Configuration: Snapshot-level retention; this will be similar to image-based backups.

Backup Repository: Object Storage with 1 Year retention

Backup Copy Job: To a repository extended with AWS Glacier / Azure Archive, with 10-year retention, set to copy immediately.

 

This way, you initially create a backup to your faster and immediately accessible object storage. This has the benefit that, if you're on a metered API, access and egress of data are typically cheaper than with archive storage. As we're using snapshot-based backups in this scenario, it doesn't matter what the age of the email is, just when the email was deleted from the M365 mailbox. You could have a 10-year-old email that was deleted yesterday, so it will sit within your main object repository for a year.

 

Then I'd utilise a backup copy job with 10-year retention, performing backup copies immediately. This gives the immediate benefit of having multiple backup copies, were you to suffer corruption or some other misfortune. It also means you can meet your 10-year retention requirement if necessary. There's some ambiguity in your statement that you "need email for 10 years": whether that's based on the age of the email (if so, swap snapshot-based retention for item-level retention), or on when it was deleted from the mailbox.

Userlevel 3

A query on the creation of the archival object storage repository:

Once we create the object storage utilising the Azure archival Blob object storage container, do we need to create a second local storage repository and extend it with the archival object storage just created?

The reason I ask is that there are no clear details on this in the guide.

Also, when creating a backup copy job from an existing job, the archival object storage repository does not show up… 😐

Could someone shed some light on this?

Both the Blob and archival object storage repositories have been deployed with the same retention type.

 

Userlevel 7
Badge +1

Hello,
Backup copy capabilities are only available if you have specified an extended backup repository as a target for your backup jobs.
You have to create another backup repository extended with the Azure Blob Storage Archive as a target for the backup copy job.

The extended backup repository where you keep your backups and the target backup repository must be located on the same backup proxy server and have the same retention type.

After that, you just have to follow this: https://helpcenter.veeam.com/docs/vbo365/guide/vbo_new_copy_job.html?ver=60

 

Userlevel 3

Hi Guys

OK, so I just tested this.

The workflow is as below:

  • Configured local repository 1 with an Azure Blob storage extent - retention period 1 year
  • Configured local repository 2 with an Azure Archive storage extent - retention period 10 years
  • Created a backup job to local repository 1
  • Created a backup copy job of the same to local repository 2

Please let me know if this methodology looks right, or if there is something else to be done. The retention policy setting is left at the default snapshot-based retention, considering the additional egress costs with item-level retention.
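For reference, steps 3 and 4 of the workflow above can be sketched with the VB365 PowerShell module. The job and repository names are placeholders, and the exact parameters should be confirmed with Get-Help New-VBOCopyJob for your version.

```powershell
Import-Module Veeam.Archiver.PowerShell

# Existing backup job writing to local repository 1 (Blob extent)
$job = Get-VBOJob -Name "M365 Backup"

# Local repository 2, extended with the Azure Archive storage
$archiveRepo = Get-VBORepository -Name "Archive Repository 10y"

# Create the backup copy job targeting the archive-extended repository;
# both repositories must share the same retention type
New-VBOCopyJob -BackupJob $job -Repository $archiveRepo
```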

 

Thx

Karun

Userlevel 3

Hello,
Backup copy capabilities are only available if you have specified an extended backup repository as a target for your backup jobs.
You have to create another backup repository extended with the Azure Blob Storage Archive as a target for the backup copy job.

The extended backup repository where you keep your backups and the target backup repository must be located on the same backup proxy server and have the same retention type.

After that, you just have to follow this: https://helpcenter.veeam.com/docs/vbo365/guide/vbo_new_copy_job.html?ver=60

 

Thanks Stabz..👍
