damien commenge wrote:
OK, I just finished looking at both documents. There is something I totally don't understand: why are they both only talking about agent backup?
As a Service Provider, it's possible to install the Veeam Agent on individual servers and back up directly to object storage. The Service Provider Console provides the repository information, including the credentials, but the data doesn't flow through the console; the agent connects directly to the object repository at your provider of choice. I'm not sure about doing this outside of the service provider realm, but that's how I view it, and likely why both documents talk about the agent connecting to the repository in this manner.
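Just to illustrate what "direct to object" means for the data path, here's a trivial sketch; the endpoint, bucket, and keys are made-up placeholders, and in reality the Veeam Agent consumes the credentials the console hands out rather than a script like this:

# Sketch of the direct-to-object data path (hypothetical endpoint, bucket,
# and credentials). The agent talks straight to the S3-compatible endpoint;
# nothing goes through the Service Provider Console.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.example-provider.com",  # provider's S3-compatible endpoint
    aws_access_key_id="TENANT_ACCESS_KEY",
    aws_secret_access_key="TENANT_SECRET_KEY",
)

# Listing the bucket proves the credentials reach the repository directly;
# backup traffic follows the same path, bypassing the console entirely.
resp = s3.list_objects_v2(Bucket="tenant-backups", MaxKeys=5)
print(resp.get("Contents", []))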
damien commenge wrote:
On my side, it's just about VMs and NAS today. I haven't seen anything about "standard" backup in the PDF or in your video. Is it not compatible?
My object storage will be used as a backup copy destination.
Backup to a hardened Linux repository, and backup copy to NetApp S3-compatible storage.
SOBR is expected here.
Is it applicable to my scenario too?
VBR can certainly back up directly to object, or copy directly to object. SOBR is no longer needed, but is still possible as well. Multiple buckets or folders aren't needed for a single repository: as long as all data is handled by a single server, a single bucket is fine. However, if you have multiple servers accessing the account, a separate bucket is recommended for each server. Each server keeps a database of the data in its bucket, so if one server uploads data into a bucket, performance will be affected as the other servers accessing that bucket encounter foreign data and, I believe, need to index it. It doesn't sound like this is your scenario, but I wanted to call it out. One thing I haven't investigated is whether the same effect occurs if each server uses the same bucket but a different folder within it. That said, I prefer to keep my buckets separate, using different credentials for each bucket with policies set to isolate the buckets (rough sketch of that isolation below).
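To make that "different credentials for each bucket" idea concrete, here's roughly how I'd script it with boto3: one IAM user per backup server, with an inline policy limited to that server's bucket. The bucket/user names and the IAM endpoint are placeholders for illustration, not my actual setup:

# One IAM user per server, each locked to exactly one bucket.
import json
import boto3

iam = boto3.client(
    "iam",
    endpoint_url="https://iam.example-provider.com",  # provider's IAM-compatible endpoint
    aws_access_key_id="ADMIN_ACCESS_KEY",
    aws_secret_access_key="ADMIN_SECRET_KEY",
)

def isolate_bucket(user_name: str, bucket: str) -> dict:
    """Create a user that can access exactly one bucket, and return its keys."""
    policy = {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": ["s3:ListBucket", "s3:GetBucketLocation"],
                "Resource": f"arn:aws:s3:::{bucket}",
            },
            {
                "Effect": "Allow",
                "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
                "Resource": f"arn:aws:s3:::{bucket}/*",
            },
        ],
    }
    iam.create_user(UserName=user_name)
    iam.put_user_policy(
        UserName=user_name,
        PolicyName=f"{bucket}-only",
        PolicyDocument=json.dumps(policy),
    )
    # These are the credentials that get plugged into the backup server's repository config.
    return iam.create_access_key(UserName=user_name)["AccessKey"]

# One server, one bucket, one set of keys:
keys = isolate_bucket("veeam-server-a", "veeam-server-a-backups")
print(keys["AccessKeyId"])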
I just realized that a while back I wrote a blog post about my setup. It's probably a bit dated, because I believe there's more capability available now, and Wasabi now supplies WACM for free (they charged $100/month when I posted this), so you can actually create separate sub-accounts for each tenant/customer/reseller/whatever you see fit. I posted the link below for reference.
Also, the way I've done this pretty much completely ignores the IAM/STS policies that Michael mentions in his response above. Reading his response, my assumption is that I'm doing everything more manually than what appears to be possible using IAM/STS and letting Veeam manage the limited-scope keys to folders within a bucket. Not that I think the way I'm doing things is wrong, but there may be a better way.
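Just to make my assumption concrete, here's roughly what I picture that IAM/STS-style scoping looking like: temporary keys limited to a single folder (prefix) inside a shared bucket. I haven't tried this myself, and I'm not claiming this is how Veeam does it; it's generic AWS-style STS via boto3, and the endpoint, bucket, and folder names are all placeholders:

# Rough, untested sketch: temporary credentials scoped to one folder in a bucket.
import json
import boto3

sts = boto3.client(
    "sts",
    endpoint_url="https://sts.example-provider.com",  # provider's STS endpoint, if offered
    aws_access_key_id="ADMIN_ACCESS_KEY",
    aws_secret_access_key="ADMIN_SECRET_KEY",
)

folder_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
            "Resource": "arn:aws:s3:::shared-backups/tenant-a/*",  # one folder only
        },
        {
            "Effect": "Allow",
            "Action": ["s3:ListBucket"],
            "Resource": "arn:aws:s3:::shared-backups",
            "Condition": {"StringLike": {"s3:prefix": ["tenant-a/*"]}},
        },
    ],
}

# Temporary, folder-scoped credentials that expire on their own.
creds = sts.get_federation_token(
    Name="tenant-a-session",
    Policy=json.dumps(folder_policy),
    DurationSeconds=3600,
)["Credentials"]
print(creds["AccessKeyId"], creds["Expiration"])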