Solved

Start-VBRComputerBackupJob returns when the job starts; how to run and wait until the job completes


Userlevel 2

hello, thanks, first-time poster here.

problem: `rclone copy` will run before the job completes.

$JobObject = Get-VBRComputerBackupJob | where {$_.Name -eq "ABP_EN10"}

# start job but do not wait until job completes
Start-VBRComputerBackupJob $JobObject

# rclone will not find any new files, as the job has not completed yet
rclone copy .....

how can i have the equivalent of

$JobObject = Get-VBRComputerBackupJob | where {$_.Name -eq "ABP_EN10"}

# run job, wait until completed
RunWait-VBRComputerBackupJob $JobObject

# now, rclone will find new files
rclone copy .....

 

 

 


Best answer by asdffdsa 11 April 2023, 02:49


This topic has been closed for comments

14 comments

Userlevel 7
Badge +20

You might need to put in a wait timer before rclone. I think that is the only way.

Userlevel 7
Badge +14

Manipulating the backup files manually looks like a bad idea to me.

You could run a backup copy job to copy the backups to a second location, chained to the backup job itself. This way you should be able to achieve the same goal.

Userlevel 7
Badge +17

Start-VBRComputerBackupJob just starts the job.

Check the status of the session of the running job. While the job is running, the session is active. When the session is finished, you can start your follow-up actions.

 

These are the commands for “normal” VBR jobs:

$job = Get-VBRJob -Name "Backup Copy Job"

$session = Get-VBRSession -Job $job

I am not at a VBR server at the moment to look up the commands for the Agent jobs.

Userlevel 7
Badge +17

After starting the job you can get the session with

Get-VBRComputerBackupJobSession -Name "YourJobName"

Look for the session in state “Running” and query this session in a while loop until it is in state “Stopped”. Then you can check its Result property and run your follow-up actions depending on the result of this session.

Perhaps there is a more elegant way to do this, but this will work.
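A minimal sketch of that polling approach, using the job name from the question. The way the running session is matched to the job (JobName) and the property names used (State, Result, CreationTime, Id) are assumptions that may need adjusting to your VBR version:

$jobName = "ABP_EN10"
$job = Get-VBRComputerBackupJob | Where-Object { $_.Name -eq $jobName }

# start the job; the cmdlet returns as soon as the job has started
Start-VBRComputerBackupJob $job

# give the session a moment to appear, then pick the newest session for this job
# (matching on JobName is an assumption; other versions may expose JobId instead)
Start-Sleep -Seconds 10
$session = Get-VBRComputerBackupJobSession |
    Where-Object { $_.JobName -eq $jobName } |
    Sort-Object CreationTime -Descending |
    Select-Object -First 1

# poll until the session is no longer running
while ($session -and $session.State -eq "Running") {
    Start-Sleep -Seconds 30
    $session = Get-VBRComputerBackupJobSession | Where-Object { $_.Id -eq $session.Id }
}

# only copy when the run is usable
if ($session.Result -in "Success", "Warning") {
    rclone copy x:/path/to/veeam.repository wasabi: --immutable --max-age=90d
}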

Userlevel 2

@mkevenaar- “Manipulating the backup files manually looks like a bad idea to me.”

i get your point, however, for over four years, this has been working 100% predictably on multiple servers and clients.

once veeam writes a .vib|.vbk, veeam will never modify the file again.
in fact, if one of the veeam backup files changes, that is a major problem!

so manually pruning old backup files and/or running commands like these is fine.

# backup files newer than 90 days, copy to wasabi
rclone copy x:/path/to/veeam.repository wasabi: --immutable --max-age=90d

# backup files older than 91 days, copy to aws deep glacier
rclone copy x:/path/to/veeam.repository aws.deep.glacier --immutable --min-age=91d

# check the source and dest
rclone check x:/path/to/veeam.repository wasabi:

note: by using `--immutable`, if a source file was modified as compared to the dest, rclone will fail with a hard error.
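For what it’s worth, a minimal PowerShell sketch of acting on that hard error: rclone returns a non-zero exit code when `--immutable` detects a changed source file, so a wrapper can check `$LASTEXITCODE` before continuing (the repository path and remote name are the placeholders from the commands above):

$repo = "x:/path/to/veeam.repository"

rclone copy $repo wasabi: --immutable --max-age=90d
if ($LASTEXITCODE -ne 0) {
    # --immutable turns a modified source file into a hard error, so stop here
    throw "rclone copy to wasabi failed with exit code $LASTEXITCODE"
}

rclone check $repo wasabi:
if ($LASTEXITCODE -ne 0) {
    throw "rclone check against wasabi failed with exit code $LASTEXITCODE"
}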

 

Userlevel 2

@JMeixner, thanks,

“Perhaps there is a more elegant way to do this, but this will work.”

yes, i found example code such as https://community.veeam.com/discussion-boards-66/vbr-powershell-command-to-fetch-agent-based-backup-2598

imho, that seems like a lot of scripting for what should be a simple task:
run a script after a backup completes.

anyhoo, i cannot complain, i am using an awesome free product and an active forum.

so i guess i will have to learn enough powershell to get this done.

 

Userlevel 7
Badge +14


This is exactly what a Scale-Out Backup Repository can do for you out of the box, without scripting, and with the possibility to restore from S3 if needed. The only caveat is that you would have to switch to an AWS S3 bucket for your short-term storage, I believe.

Userlevel 2

i thought SOBR and object storage require a paid license; i am using community edition.

tho, i have used `rclone mount` to mount the wasabi bucket and, from that, perform veeam instant recovery.

 

 

Userlevel 7
Badge +17


Ok, but you will need a kind of loop during the runtime of the process. The command in the other thread gives you the sessions from the last 24 hours.

Backup jobs are asynchronous processes. So, one command starts them and then you have to query the session.

 

Btw, did you experiment with pre- and post-backup scripts?

Userlevel 7
Badge +20


Post-job script sounds like a smart move on this one, though you’d need logic to ensure the job was successful before doing any copying.
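A minimal sketch of that success check, as it might look in a post-job script. The cmdlet output properties used here (JobName, Result, CreationTime) are assumptions and may differ between VBR versions:

$jobName = "ABP_EN10"

# grab the most recent session for this job (property names are assumptions)
$lastSession = Get-VBRComputerBackupJobSession |
    Where-Object { $_.JobName -eq $jobName } |
    Sort-Object CreationTime -Descending |
    Select-Object -First 1

if ($lastSession -and $lastSession.Result -in "Success", "Warning") {
    rclone copy x:/path/to/veeam.repository wasabi: --immutable --max-age=90d
}
else {
    Write-Warning "Last run of '$jobName' did not end with Success/Warning; skipping rclone copy."
}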

 

Also, @asdffdsa, I understand you’re using this for free, so you’re coming up with creative ways to achieve this, but I would never recommend performing these copies yourself, as all of Veeam’s backup validation logic is missing. And if this is a production environment, streamlining this in a supported way (and with support) makes the license cost worth it IMO.

Userlevel 2

“the license cost worth it IMO”
of course, i expected that response at some point.

anyhoo, was just trying to run a simple script after a backup completes,
not get pushed into purchasing a license.

 

“all of Veeam’s backup validation logic is missing”
sorry, not sure what `all` means?
as “perform backup files health check (detects and auto-heals corruptions)” is enabled for each and every backup.

i simply disable:
“create synthetic”
“remove deleted items”
“defragment and compact full backup file”

so veeam will never modify a backup file, which makes it easy to manually prune, as each .vbk is a complete full backup.

Userlevel 7
Badge +20


At the risk of being accused of being “that guy” pushing you into a license, which I’m not actually trying to do here, I do want to make you aware of the EULA since you mentioned multiple clients: the EULA of the Community Edition prohibits it being used as part of a service offering:

 

https://www.veeam.com/eula.html

 

“You may not use the Free and Community Edition Licenses to provide services to third parties (including support and consulting services for existing Free and Community Edition License installations) or to process third-party data.”

 

Now, I’m aware that other clients could’ve been licensed etc etc, but just making sure you’re not accidentally getting into license non-compliance.

As for the backup validation logic, I’m referring to Veeam’s internal checks that a data transfer is good, plus all the features you’ve mentioned there to make sure the data is usable.

 

It also looks like you’re using object storage completely differently from how Veeam works with object storage, so I’d be mindful of the maximum object size and your VBK files.

Userlevel 2

sorry but you are “that guy”, multiple times over.

yes, i used the word `clients` in the context of
`this has been working 100% predictably on multiple servers and clients`
in that context, `client` is a desktop machine.

yet, somehow you twisted that into pushing licensing at me.

i came here with a simple question, and your pontificating ego turned that into a way off-topic discussion about licenses.

in the end, thanks to another forum member, i got the answer i needed.

Userlevel 7
Badge +20

sorry but you are “that guy”, multiple times over.

i came here with a simple question, and your pontificating ego turned that into a way off-topic discussion about licenses.

in the end, thanks to another forum member, i got the answer i needed.

please delete this topic.

😆

 

So you come to the community for advice, get that answer plus additional insights from people trying to help in their own free time, and your response is to insult.

 

Good luck in life with that attitude! 👍