Veeam Amazing Object Storage Tips & Techniques Part 2

  • 23 December 2022

One of my roles at Veeam is to work with our Technology Alliance Partners, resellers, and customers, assisting them with their object storage-related questions. Quite often I need to analyze the S3 API requests that Veeam's software sends to the object storage repository. One of my favorite tools for this analysis is S3 bucket access logging.

By enabling this feature on a bucket, you will be able to capture information on all requests made to that bucket. Requests like PUT, GET, and DELETE are all gathered and stored in a log location that you specify when you enable logging.

Please Note: Not all object storage solutions support S3 bucket access logging, so check your vendor's documentation and/or support to verify that it is supported.

You can enable it either through the interface provided by your object storage solution or with the AWS CLI. In this article I will use the AWS CLI.

First, we need a dedicated bucket to store our S3 logs in. It is a best practice to keep your logs in a different bucket than the monitored bucket. If you were to store the logs in the same bucket, each log write would itself generate a log entry, creating a recursive loop that causes your object storage to grow exponentially. This could crash your on-prem object storage solution or become extremely costly on a public cloud platform.

So, let’s create our log bucket:

aws s3api create-bucket --bucket s3-log-bucket --endpoint-url https://<your object storage endpoint>

#Result should look like this:

{
    "Location": "/s3-log-bucket"
}

Now we’ll create the bucket whose S3 API requests we want to monitor:

aws s3api create-bucket --bucket s3-bucket-logging-demo --endpoint-url https://<your object storage endpoint>

#Result should look like this:

{
    "Location": "/s3-bucket-logging-demo"
}

We have now created the two buckets that we will use to set up S3 bucket access logging.  To recap, the S3 API requests sent to the bucket s3-bucket-logging-demo will be captured in the bucket s3-log-bucket.

This is the command that will configure the logging scenario I just described:

aws s3api put-bucket-logging --bucket s3-bucket-logging-demo --bucket-logging-status '{"LoggingEnabled":{"TargetBucket":"s3-log-bucket","TargetPrefix":"logs/"}}' --endpoint-url https://<your object storage endpoint>

Notice that the "TargetPrefix" keyword specifies where in the "TargetBucket" the logs will be stored. In this example we are storing the logs under a folder/prefix named "logs" within the bucket "s3-log-bucket".

Since the put-bucket-logging command doesn't return any output when it executes successfully, we can use the following command to verify that logging was configured as intended:

aws s3api get-bucket-logging --bucket s3-bucket-logging-demo --endpoint-url https://<your object storage endpoint>

#You should see results similar to this:

{
    "LoggingEnabled": {
        "TargetBucket": "s3-log-bucket",
        "TargetPrefix": "logs/"
    }
}

Now that we have logging configured, let's use the put-object command to write an object to our monitored bucket.

aws s3api put-object --bucket s3-bucket-logging-demo --key testfiles/testput1.xml --body c:\capacity.xml --endpoint-url https://<your object storage endpoint>

#You should see results similar to this:

{
    "ETag": "\"b388f86fb56ef977ce0fd33c86d5c02a\""
}

Let’s take a look in our log bucket to see if any logs have been created:

aws s3 ls s3-log-bucket/ --recursive --endpoint-url https://<your object storage endpoint>

#You should see results similar to this:

0 logs/
10980 logs/2022-12-22-13-13-41-E65BWIZP2H2F7I57

If you don't immediately see results, don't panic; step away and try again in a few minutes. Depending on your object storage platform, it can take up to an hour for the logs to be populated with results.

Once you have a log file in your log bucket, you can use your editor of choice to open the log file. 
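
If you would rather work with a local copy, a download along these lines should do it (the log object key here is the one from my example listing above; yours will differ):

aws s3 cp s3://s3-log-bucket/logs/2022-12-22-13-13-41-E65BWIZP2H2F7I57 access-log.txt --endpoint-url https://<your object storage endpoint>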

Wasabi has created a Wasabi Bucket Log Viewer, which presents the log contents in a user-friendly format.

Inside the log file there may already be numerous entries, which can make it difficult to find, for example, the single put command I ran earlier. So, I will search for log entries that contain the name of the object, "testput1.xml", that I put into the bucket.
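
A plain text search does the job; for instance, assuming you downloaded the log file as access-log.txt as sketched above:

grep "testput1.xml" access-log.txt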

The following entry was found:

2761027DB3AF893727E96FA81146161DDE9A3E23AFFBA62314DA700549D94A34 s3-bucket-logging-demo [22/Dec/2022:13:08:18 +0000] 75.131.150.183 2761027DB3AF893727E96FA81146161DDE9A3E23AFFBA62314DA700549D94A34 3593E240BE0826D1 REST.PUT.OBJECT testfiles%2Ftestput1.xml "PUT /s3-bucket-logging-demo/testfiles/testput1.xml" 200 - - 173 113 0 "" "aws-cli/2.7.7 Python/3.9.11 Windows/10 exe/AMD64 prompt/off command/s3api.put-object" -

There is a lot of information packed into that one log entry line.  I created the table below to help make the entry easier to understand.

Bucket Owner (Canonical ID)   2761027DB3AF893727E96FA81146161DDE9A3E23AFFBA62314DA700549D94A34
Bucket Name                   s3-bucket-logging-demo
Time                          [22/Dec/2022:13:08:18 +0000]
RemoteIP                      75.131.150.183
Requester                     2761027DB3AF893727E96FA81146161DDE9A3E23AFFBA62314DA700549D94A34
RequestId                     3593E240BE0826D1
Operation                     REST.PUT.OBJECT
Key                           testfiles%2Ftestput1.xml
Request-URI                   "PUT /s3-bucket-logging-demo/testfiles/testput1.xml"
HttpStatus                    200
ErrorCode                     -
BytesSent                     -
ObjectSize                    173
TotalTime (ms)                113
Turn-AroundTime (ms)          0
Referrer                      ""
User-Agent                    "aws-cli/2.7.7 Python/3.9.11 Windows/10 exe/AMD64 prompt/off command/s3api.put-object"
VersionId                     - (in this example versioning is off)

I won't dive into a detailed explanation of every data element in this article, but if you ask in the comments for a data element to be explained, I will do my best to provide one.

There are a few, though, that I use often, so I will point them out now.

"Operation" is the S3 API request that was captured. In this example it was REST.PUT.OBJECT, which corresponds to the put-object command we entered in the AWS CLI.

"User-Agent" confirms that the AWS CLI was used and that the command entered was in fact put-object. You can use this User-Agent field to identify objects written by Veeam as well. For example, "APN/1.0 Veeam/1.0 Backup/12.0" is what you will see for an object written by Veeam Backup & Replication version 12.0. If you were to examine a log entry for Veeam Backup for Microsoft 365, you would see "APN/1.0 Veeam/1.0 Office365/6.0".
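
For instance, to pull every Veeam Backup & Replication v12 entry out of a downloaded log file (again assuming the access-log.txt name from the earlier sketch):

grep "APN/1.0 Veeam/1.0 Backup/12.0" access-log.txt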

"Request-URI" is the request's Uniform Resource Identifier, and it shows, in S3 protocol terms, exactly what was executed. In this example we can see the PUT API was used to place "testput1.xml" into the bucket "s3-bucket-logging-demo" under the "testfiles" prefix ("PUT /s3-bucket-logging-demo/testfiles/testput1.xml").

Hopefully you can now see the value of using S3 bucket access logging for troubleshooting, or simply for learning what requests are made to your buckets. There is a ton of valuable information that can be extracted from these logs, but there are additional benefits to enabling this feature as well. It is also a recommended best practice for security purposes. The logs I showed you can help your data security teams audit who is accessing the data in your bucket(s). This ability can assist with tracking down data breaches as well as generating reports for compliance purposes.
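
As a quick illustration of that kind of auditing, here is one way to summarize who performed which operations across a set of downloaded log files. This is a rough sketch that assumes the logs were downloaded into a local logs/ directory, and that the requester and operation sit in the sixth and eighth whitespace-separated fields, as they do in the sample entry above:

awk '{ print $6, $8 }' logs/* | sort | uniq -c | sort -rn

Each output line shows a count, a requester canonical ID, and an operation, giving a simple picture of who is doing what against the bucket.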

In a future post, I will show you how you can use Veeam's native S3 logging capabilities in Veeam Backup & Replication v12 to see both the requests made to the target buckets and the responses from the object storage back to VBR.


3 comments


@SteveF Great post! Very insightful and a great way to get ready for this new V12 feature!


Thanks @SteveF, this kind of deep dive is really helpful. I am especially interested in trying this with Object First next week when we get our appliance up and going in the lab!


I haven’t needed logging like this, but I did see the option in Wasabi and briefly wondered what it was all about.  Thanks for this helpful info!
