Question

Need advice on speeding up retrieval of backup stats via PowerShell


Userlevel 1
  • Not a newbie anymore
  • 3 comments

Hi, everyone! I have a PowerShell script that I use to retrieve some information about the daily backup sessions. It works well, but I was trying to transition it from the Get-VBRJob cmdlet to the Get-VBRBackup cmdlet. Everything works, but the newer script is very slow; it went from a few seconds with the old script to 5-10 minutes with the new one. I was wondering if anyone could give me pointers on a more efficient way to get the session information or otherwise speed it up.

 

Thanks in advance!!

Robb

Original script

# Backup jobs
foreach ($JobObject in Get-VBRJob) {
    # write-host "loop number ${$jobobject.indexof()}"
    $session = $JobObject.FindLastSession()
    if (!($JobObject.JobType -eq "BackupSync")) {
        $JobOutput = New-Object -TypeName PSObject
        if (((Get-Date).AddDays(-1) -lt $session.Progress.StopTimeLocal) -or ($session.Result -in "Failed", "None")) {
            $JobOutput | Add-Member -Name "Job Name" -MemberType NoteProperty -Value $JobObject.Name
            $JobOutput | Add-Member -Name "Start Time" -MemberType NoteProperty -Value $session.Progress.StartTimeLocal
            $JobOutput | Add-Member -Name "End Time" -MemberType NoteProperty -Value $session.Progress.StopTimeLocal
            $JobOutput | Add-Member -Name "Duration" -MemberType NoteProperty -Value $session.Progress.Duration
            if ($session.Progress.TransferedSize -lt 1KB) {
                $xsize = $session.Progress.TransferedSize
                $sz = "Bytes"
            } elseif ($session.Progress.TransferedSize -lt 1MB) {
                $xsize = $session.Progress.TransferedSize / 1KB
                $sz = "KB"
            } elseif ($session.Progress.TransferedSize -lt 1GB) {
                $xsize = $session.Progress.TransferedSize / 1MB
                $sz = "MB"
            } elseif ($session.Progress.TransferedSize -lt 1TB) {
                $xsize = $session.Progress.TransferedSize / 1GB
                $sz = "GB"
            } else {
                $xsize = $session.Progress.TransferedSize / 1TB
                $sz = "TB"
            }
            $xsize = "{0:n2}" -f $xsize
            $JobOutput | Add-Member -Name "Data Transferred in GB" -MemberType NoteProperty -Value "$xsize$sz"
            $JobOutput | Add-Member -Name "Result" -MemberType NoteProperty -Value $session.Result
            if ((New-TimeSpan -Start $session.Progress.StartTimeLocal -End (Get-Date -f "MM/dd/yyyy HH:mm:ss tt")) -lt "24:00:00") {
                $JobsOutput += $JobOutput
            }
        }
    }
}
$JobsOutput | Sort-Object -Property 'Job Name' | Export-Csv -Path $bfile -NoTypeInformation

New script

 

# Backup jobs
foreach ($JobObject in Get-VBRBackup) {
    if ($JobObject.JobType -ne "SimpleBackupCopyPolicy") {
        if ($JobObject.FindJob()) {
            $sess = Get-VBRSession -Job $JobObject.FindJob() | Sort-Object -Property CreationTime -Descending | Select-Object -First 1
        } else {
            continue
        }
        $session = Get-VBRTaskSession -Session $sess
        $JobOutput = New-Object -TypeName PSObject
        if (((Get-Date).AddDays(-1) -lt $session.Progress.StopTimeLocal) -or ($session.Status -in "Failed", "None")) {
            $JobOutput | Add-Member -Name "Job Name" -MemberType NoteProperty -Value $JobObject.Name
            $JobOutput | Add-Member -Name "Start Time" -MemberType NoteProperty -Value $session.Progress.StartTimeLocal
            $JobOutput | Add-Member -Name "End Time" -MemberType NoteProperty -Value $session.Progress.StopTimeLocal
            $JobOutput | Add-Member -Name "Duration" -MemberType NoteProperty -Value $session.Progress.Duration
            # Note: break is needed so only the first matching size range applies
            switch ($session.Progress.TransferedSize) {
                { $_ -lt 1KB } { $xsize = $session.Progress.TransferedSize; $sz = "Bytes"; break }
                { $_ -lt 1MB } { $xsize = $session.Progress.TransferedSize / 1KB; $sz = "KB"; break }
                { $_ -lt 1GB } { $xsize = $session.Progress.TransferedSize / 1MB; $sz = "MB"; break }
                { $_ -lt 1TB } { $xsize = $session.Progress.TransferedSize / 1GB; $sz = "GB"; break }
                Default { $xsize = $session.Progress.TransferedSize / 1TB; $sz = "TB" }
            }
            $xsize = "{0:n2}" -f $xsize
            $JobOutput | Add-Member -Name "Data Transferred in GB" -MemberType NoteProperty -Value "$xsize$sz"
            $JobOutput | Add-Member -Name "Result" -MemberType NoteProperty -Value $session.Status
            if ((New-TimeSpan -Start $session.Progress.StartTimeLocal -End (Get-Date -f "MM/dd/yyyy HH:mm:ss tt")) -lt "24:00:00") {
                $JobsOutput += $JobOutput
            }
        }
    }
}
$JobsOutput | Sort-Object -Property 'Job Name' | Export-Csv -Path $bfile -NoTypeInformation

 


8 comments

Userlevel 7
Badge +17

Hi @safiya - I think this post may be best served in the YARA/Script Library group.

Maybe @SteveHeart can assist here?

Userlevel 7
Badge +20

What do you need to change the command for - is there something the new one gives you that the old one does not? Just curious, and very interested in finding out why this is happening.

Userlevel 1

@Chris.Childerhose When I run the original one I get a message that the cmdlet is no longer supported for Computer backup jobs. My (possibly incorrect) assumption was that the command may be phased out in a future release. I just wanted to make sure I kept my script up to date. I am still running the original day-to-day for now because of the speed issue.

Userlevel 7
Badge +20

That definitely makes sense as I am sure it will be phased out. Hopefully someone can chime in on the performance.

Hello,

I recommend being more specific in your request, because currently those scripts are incomplete and you don't give an example of the result you want to achieve.

In the examples I'm leaving you here, the recommendation for improving performance is to first get the full list of sessions and then filter the result per backup job.

That way you don't have to run a query per backup job, which is a bit slower than the "$JobObject.FindLastSession()" call, which to my understanding is backed by a stored procedure/index that is automatically updated after each backup job finishes.
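
In short, the idea looks roughly like this (a minimal sketch; the complete scripts are below):

$Sessions = Get-VBRBackupSession   # query the session list once, up front
foreach ($Job in Get-VBRBackup) {
    # filter the pre-fetched list in memory instead of querying per job
    $Sessions | Where-Object { $_.OrigJobName -eq $Job.Name } |
        Sort-Object -Property CreationTime -Descending |
        Select-Object -First 1
}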

I hope this can help you!

Greetings,

 

# Session per Backup Jobs

# Start helper functions
function ConvertTo-FileSizeString {
    <#
    .SYNOPSIS
        Used by As Built Report to convert bytes automatically to GB/TB/etc based on input object.
    .DESCRIPTION
    .NOTES
    .EXAMPLE
        ConvertTo-FileSizeString -Size $Size
    .LINK
    #>
    [CmdletBinding()]
    [OutputType([String])]
    Param
    (
        [Parameter (
            Position = 0,
            Mandatory)]
        [int64]
        $Size
    )

    switch ($Size) {
        { $_ -gt 1TB } { [string]::Format("{0:0} TB", $Size / 1TB); break }
        { $_ -gt 1GB } { [string]::Format("{0:0} GB", $Size / 1GB); break }
        { $_ -gt 1MB } { [string]::Format("{0:0} MB", $Size / 1MB); break }
        { $_ -gt 1KB } { [string]::Format("{0:0} KB", $Size / 1KB); break }
        { $_ -gt 0 }   { [string]::Format("{0} B", $Size); break }
        { $_ -eq 0 }   { "0 KB"; break }
        default        { "0 KB" }
    }
}

# End helper functions

# Main script starts here

# CSV output file variable
$bfile = "$env:USERPROFILE\jobs_new.csv"

# Array used to save each backup job's session information
$JobsOutput = @()

# Used to store the backup job sessions
$Sessions = Get-VBRBackupSession

foreach ($jobobject in (Get-VBRBackup | Where-Object { $_.JobType -ne "SimpleBackupCopyPolicy" } | Sort-Object -Property 'JobName')) {
    if ($Session = $Sessions |
        Where-Object { ($_.OrigJobName -eq $jobobject.Name) -and (((Get-Date).AddDays(-1) -lt $_.Progress.StopTimeLocal) -and ((New-TimeSpan -Start $_.Progress.StartTimeLocal -End (Get-Date -f "MM/dd/yyyy HH:mm:ss tt")) -lt "24:00:00") -or ($_.Result -in "Failed", "None")) } |
        Sort-Object -Property CreationTime -Descending |
        Select-Object -First 1) {
        $JobOutput = New-Object -TypeName PSObject
        $JobOutput | Add-Member -Name "Job Name" -MemberType NoteProperty -Value $Session.OrigJobName
        $JobOutput | Add-Member -Name "Start Time" -MemberType NoteProperty -Value $Session.Progress.StartTimeLocal
        $JobOutput | Add-Member -Name "End Time" -MemberType NoteProperty -Value $Session.Progress.StopTimeLocal
        $JobOutput | Add-Member -Name "Duration" -MemberType NoteProperty -Value ("{0:dd}d:{0:hh}h:{0:mm}m:{0:ss}s" -f $Session.Progress.Duration)
        $JobOutput | Add-Member -Name "Data Transferred in GB" -MemberType NoteProperty -Value (ConvertTo-FileSizeString -Size $Session.Progress.TransferedSize)
        $JobOutput | Add-Member -Name "Result" -MemberType NoteProperty -Value $Session.Result
        $JobsOutput += $JobOutput
    }
}

# Output variable content to CSV file
$JobsOutput | Export-Csv -Path $bfile -NoTypeInformation
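
For reference, the ConvertTo-FileSizeString helper by itself returns strings like these (a couple of quick illustrative calls):

PS C:\Users\jocolon> ConvertTo-FileSizeString -Size 532480
520 KB
PS C:\Users\jocolon> ConvertTo-FileSizeString -Size 75GB
75 GB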

 

# Task Session per Backup Job Session

# Start helper functions
function ConvertTo-FileSizeString {
    <#
    .SYNOPSIS
        Used by As Built Report to convert bytes automatically to GB/TB/etc based on input object.
    .DESCRIPTION
    .NOTES
    .EXAMPLE
        ConvertTo-FileSizeString -Size $Size
    .LINK
    #>
    [CmdletBinding()]
    [OutputType([String])]
    Param
    (
        [Parameter (
            Position = 0,
            Mandatory)]
        [int64]
        $Size
    )

    switch ($Size) {
        { $_ -gt 1TB } { [string]::Format("{0:0} TB", $Size / 1TB); break }
        { $_ -gt 1GB } { [string]::Format("{0:0} GB", $Size / 1GB); break }
        { $_ -gt 1MB } { [string]::Format("{0:0} MB", $Size / 1MB); break }
        { $_ -gt 1KB } { [string]::Format("{0:0} KB", $Size / 1KB); break }
        { $_ -gt 0 }   { [string]::Format("{0} B", $Size); break }
        { $_ -eq 0 }   { "0 KB"; break }
        default        { "0 KB" }
    }
}

# End helper functions

# Main script starts here

# CSV output file variable
$bfile = "$env:USERPROFILE\jobs_task_new.csv"

# Array used to save each backup job's session information
$JobsOutput = @()

# Used to store the backup job sessions
$Sessions = Get-VBRBackupSession

foreach ($jobobject in (Get-VBRBackup | Where-Object { $_.JobType -ne "SimpleBackupCopyPolicy" } | Sort-Object -Property 'JobName')) {
    if ($Session = $Sessions |
        Where-Object { ($_.OrigJobName -eq $jobobject.Name) -and (((Get-Date).AddDays(-100) -lt $_.Progress.StopTimeLocal) -or ($_.Result -in "Failed", "None")) } |
        Sort-Object -Property CreationTime -Descending |
        Select-Object -First 1) {
        if ($TaskSessions = Get-VBRTaskSession -Session $Session | Where-Object { ((New-TimeSpan -Start $_.Progress.StartTimeLocal -End (Get-Date -f "MM/dd/yyyy HH:mm:ss tt")) -lt "24:00:00") }) {
            foreach ($TaskSession in $TaskSessions) {
                $JobOutput = New-Object -TypeName PSObject
                $JobOutput | Add-Member -Name "Job Name" -MemberType NoteProperty -Value $TaskSession.JobName
                $JobOutput | Add-Member -Name "Task Name" -MemberType NoteProperty -Value $TaskSession.Name
                $JobOutput | Add-Member -Name "Start Time" -MemberType NoteProperty -Value $TaskSession.Progress.StartTimeLocal
                $JobOutput | Add-Member -Name "End Time" -MemberType NoteProperty -Value $TaskSession.Progress.StopTimeLocal
                $JobOutput | Add-Member -Name "Duration" -MemberType NoteProperty -Value ("{0:dd}d:{0:hh}h:{0:mm}m:{0:ss}s" -f $TaskSession.Progress.Duration)
                $JobOutput | Add-Member -Name "Data Transferred in GB" -MemberType NoteProperty -Value (ConvertTo-FileSizeString -Size $TaskSession.Progress.TransferedSize)
                $JobOutput | Add-Member -Name "Result" -MemberType NoteProperty -Value $TaskSession.Status
                $JobsOutput += $JobOutput
            }
        }
    }
}

# Output variable content to CSV file
$JobsOutput | Export-Csv -Path $bfile -NoTypeInformation

 

Userlevel 1

@jcolonfzenpr Thank you for your suggestions! Yes, this is part of a larger script. Really, what I am after is the start time, stop time, duration, bytes transferred, and result or status of any backup sessions (physical or VM) that have ended within the last 24 hours. It is part of a script that I run automatically each day.
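
Something along these lines, roughly (just sketching the 24-hour filter; session properties as used in the scripts above):

$cutoff = (Get-Date).AddDays(-1)
# keep only sessions that finished within the last 24 hours
$recent = Get-VBRBackupSession | Where-Object { $_.Progress.StopTimeLocal -gt $cutoff }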

 

Hello,

Based on what you describe I still believe that the best option is to get the complete list of sessions and then use the Where-Object cmdlet to filter the content for each backup job.

As shown in this code block, I first get the list of all the sessions ("$Sessions = Get-VBRBackupSession") and then filter the content by backup job ("$Sessions | Where-Object { $_.OrigJobName -eq $Job.Name }").

In this case, it took 1 second to finish:

PS C:\Users\jocolon> "{0:dd}d:{0:hh}h:{0:mm}m:{0:ss}s" -f (Measure-Command -Expression { & {
    $Sessions = Get-VBRBackupSession
    $Jobs = Get-VBRBackup
    foreach ($Job in $Jobs) {
        $Sessions | Where-Object { $_.OrigJobName -eq $Job.Name }
    }
}
})

# Elapsed time to completion
00d:00h:00m:01s
PS C:\Users\jocolon>

Here is the version where the "Get-VBRBackupSession" cmdlet is called once for each backup job. In my case, since I have 20 backup jobs configured in my homelab, "Get-VBRBackupSession" is executed 20 times!

In this case, it took 32 seconds to finish:

PS C:\Users\jocolon> "{0:dd}d:{0:hh}h:{0:mm}m:{0:ss}s" -f (Measure-Command -Expression { & {
    $Jobs = Get-VBRBackup
    foreach ($Job in $Jobs) {
        Get-VBRBackupSession | Where-Object { $_.OrigJobName -eq $Job.Name }
    }
}
})

# Elapsed time to completion
00d:00h:00m:32s
PS C:\Users\jocolon>

However, you should run these tests in your own environment, because in development there are many ways to solve a problem!

Here is an example of the code I used:

# Session per Backup Jobs

# Helper functions start here
function ConvertTo-FileSizeString {
    <#
    .SYNOPSIS
        Used by As Built Report to convert bytes automatically to GB/TB/etc based on input object.
    .DESCRIPTION
    .NOTES
    .EXAMPLE
        ConvertTo-FileSizeString -Size $Size
    .LINK
    #>
    [CmdletBinding()]
    [OutputType([String])]
    Param
    (
        [Parameter (
            Position = 0,
            Mandatory)]
        [int64]
        $Size
    )

    switch ($Size) {
        { $_ -gt 1TB } { [string]::Format("{0:0} TB", $Size / 1TB); break }
        { $_ -gt 1GB } { [string]::Format("{0:0} GB", $Size / 1GB); break }
        { $_ -gt 1MB } { [string]::Format("{0:0} MB", $Size / 1MB); break }
        { $_ -gt 1KB } { [string]::Format("{0:0} KB", $Size / 1KB); break }
        { $_ -gt 0 }   { [string]::Format("{0} B", $Size); break }
        { $_ -eq 0 }   { "0 KB"; break }
        default        { "0 KB" }
    }
}

# Helper functions end here

# Main script starts here

# CSV output file variable
$bfile = "$env:USERPROFILE\jobs_new.csv"

# Array used to store each backup job PSObject
$JobsOutput = @()

# Variable used to store backup job sessions
$Sessions = Get-VBRBackupSession

foreach ($jobobject in (Get-VBRBackup | Where-Object { $_.JobType -ne "SimpleBackupCopyPolicy" } | Sort-Object -Property 'JobName')) {
    if ($Session = $Sessions |
        Where-Object { ($_.OrigJobName -eq $jobobject.Name) -and (((Get-Date).AddDays(-10000) -lt $_.Progress.StopTimeLocal)) } |
        Sort-Object -Property CreationTime -Descending |
        Select-Object -First 1) {
        $JobOutput = New-Object -TypeName PSObject
        $JobOutput | Add-Member -Name "Job Name" -MemberType NoteProperty -Value $Session.OrigJobName
        $JobOutput | Add-Member -Name "Start Time" -MemberType NoteProperty -Value $Session.Progress.StartTimeLocal
        $JobOutput | Add-Member -Name "End Time" -MemberType NoteProperty -Value $Session.Progress.StopTimeLocal
        $JobOutput | Add-Member -Name "Duration" -MemberType NoteProperty -Value ("{0:dd}d:{0:hh}h:{0:mm}m:{0:ss}s" -f $Session.Progress.Duration)
        $JobOutput | Add-Member -Name "Data Transferred in GB" -MemberType NoteProperty -Value (ConvertTo-FileSizeString -Size $Session.Progress.TransferedSize)
        $JobOutput | Add-Member -Name "Result" -MemberType NoteProperty -Value $Session.Result
        $JobsOutput += $JobOutput
    }
}

# Output variable content to CSV file
$JobsOutput | Export-Csv -Path $bfile -NoTypeInformation

I hope this can help you find the solution you are looking for.

Userlevel 1

@jcolonfzenpr Awesome! I will incorporate those changes in my script and report back. I believe you are right. The culprit seems to be getting the session information dynamically inside the loop. After putting timing counters on different parts of the script, I found that that lookup takes about 15 seconds on every loop iteration, which seems fairly obvious in hindsight :D Thank you for your help!
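
(For anyone curious, the timing counters were just something along these lines; a rough sketch, not the exact code:)

$timer = [System.Diagnostics.Stopwatch]::StartNew()
foreach ($jobobject in Get-VBRBackup) {
    $timer.Restart()
    # the per-job session lookup that turned out to be the slow part
    $sess = Get-VBRSession -Job $jobobject.FindJob() | Sort-Object -Property CreationTime -Descending | Select-Object -First 1
    Write-Host ("{0}: {1:n1} s" -f $jobobject.Name, $timer.Elapsed.TotalSeconds)
}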

 
