
Hi all, 

I’m working on an object storage repository sizing, and I’m trying to use both the Veeam calculator and the Object First calculator.

 

These are the starting criteria:

Size data source: 200TB

Retention: 14 days

Immutability: 14 days

Forecast: 5%

Annual Growth rate: 3%

 

Entering these configurations, I received two different results from the Veeam and Object First calculators.

The Veeam calculator returned 217.6TB of repository space, while the Object First calculator returned 436TB. An Object First SE suggested an advanced configuration to get a better sizing, but the final result was still 335TB.

That is a big difference between the two calculators.

Referring to BP page: https://bp.veeam.com/vbr/2_Design_Structures/D_Veeam_Components/D_backup_repositories/nasrepo.html
and using this formula: 

Object Storage Repository Sizing

Full Backup = Source size - data reduction (%)
Incremental Backups = (Full Backup * change rate) * retention (days)
Backup size = Full Backup + Incremental Backups
Metadata = Backup size * 5%
Repository Capacity = Backup size + Metadata + Workspace

NOTE: Workspace space is only required for backup on disk and it’s not required for backup to Object Storage. In the same way, for NAS Backup to Object Storage, the Metadata will be kept in the Cache Repository AND in the Object Storage itself.
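The BP-page formula above can be sketched in a few lines. This is only an illustration of that formula, not the Veeam or Object First calculator internals; the function name, the 5% metadata rate default, and the assumption that no workspace is added for object storage all come from the table and note above.

```python
def object_repo_capacity(source_tb, reduction, change_rate, retention_days,
                         metadata_rate=0.05):
    """Sketch of the BP-page object storage sizing formula (illustrative).

    reduction, change_rate and metadata_rate are fractions (0.15 = 15%).
    Workspace is omitted, per the note that object storage does not need it.
    """
    full = source_tb * (1 - reduction)                  # Full Backup
    incrementals = full * change_rate * retention_days  # Incremental Backups
    backup = full + incrementals                        # Backup size
    metadata = backup * metadata_rate                   # Metadata
    return backup + metadata                            # Repository Capacity

# With the thread's inputs (200TB source, 15% reduction, 10% daily change,
# 14 days retention) this lands at roughly 428TB:
print(round(object_repo_capacity(200, 0.15, 0.10, 14), 1))
```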

 

Considering a data reduction of 15%, the calculation would be:

 

200 - 15% = 170TB FULL

170*0.1*14 = 238TB incrementals

Full + incrementals = 408TB + 20 TB metadata
Total: 428 TB. 
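Re-running the manual math above step by step (the 10% daily change rate is assumed, since it does not appear in the stated inputs):

```python
full = 200 * (1 - 0.15)        # 170.0 TB full backup after 15% reduction
incr = full * 0.10 * 14        # 238.0 TB of incrementals over 14 days
backup = full + incr           # 408.0 TB backup size
metadata = backup * 0.05       # 20.4 TB metadata (rounded to 20 in the post)
print(round(backup + metadata, 1))  # 428.4 TB total
```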

So can we consider this an error in the Veeam calculator?

Thanks. 

I don’t think you can consider Veeam’s calculator to be in error; I am sure the formulas differ in how they take the underlying storage into account, which is what Object First is doing. I cannot say this for certain and will let those who wrote the calculator comment - @david.tosoff



Hi @Chris.Childerhose, but the formula on the Veeam BP site seems to describe sizing for object storage in general, not only Object First, and the Veeam calculator did not match it.

If the formula on the Veeam BP site is wrong, it would be helpful for us to have more correct information.

 

 



Then you will need to wait for someone who helped with the calculator to answer this question. Like I said, I am not giving a definitive answer here, just noting that each calculator can work differently.


Hi @Andanet, can you clarify your input values?

You specified these inputs:

Size data source: 200TB

Retention: 14 days

Immutability: 14 days

Forecast: 5%

Annual Growth rate: 3%

 

And your manual math:

200 - 15% = 170TB FULL

170*0.1*14 = 238TB incrementals

Full + incrementals = 408TB + 20 TB metadata
Total: 428 TB. 

 

I see three different rates with three different meanings/purposes…?

Forecast & AGR are used to project into the future: Forecast would be an integer (years) rather than a percent. Daily Change Rate is what determines normal incremental size.
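Under that reading, Forecast and AGR only grow the source size over time. A minimal sketch, assuming the annual growth rate compounds over the forecast horizon (an assumed interpretation, not calculator internals):

```python
def projected_source_tb(source_tb, annual_growth, forecast_years):
    # Compound the source size by the annual growth rate over the
    # forecast horizon; the result then feeds the sizing formula.
    return source_tb * (1 + annual_growth) ** forecast_years

# 200TB source at 3% AGR over a 5-year forecast:
print(round(projected_source_tb(200, 0.03, 5), 2))  # ≈ 231.85 TB
```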

 

---

That said, if I go strictly by the 10% daily change rate, no forecast, and the 15% compression in your manual math, the calculator arrives at 390TB, which, without going into all the internal complexities, is within range.

Note that 20TB of metadata for 400TB of data is quite high; the BP site may need updating if that is where it’s from. The calculation would be closer to <1TB (file count dependent).