
Hello

I'm trying to set up a Proxmox backup for a cluster in a POC:
fresh install of Veeam with the latest patch
fresh install of Proxmox 8 with the latest patch (experimental, no-subscription repository)

I created the PVE proxies with a dedicated user, as described in the Veeam documentation. Here is the config.

As root in the PVE shell:

useradd svc-prx-veeam
passwd svc-prx-veeam

nano /etc/sudoers.d/veeam

and add these lines:
svc-prx-veeam ALL=(root) PASSWD: /usr/sbin/dmidecode -s system-uuid
svc-prx-veeam ALL=(root) PASSWD: /usr/bin/kvm -S *
svc-prx-veeam ALL=(root) PASSWD: /usr/bin/qemu-img info *
svc-prx-veeam ALL=(root) PASSWD: /usr/bin/qemu-img create *
svc-prx-veeam ALL=(root) PASSWD: /usr/sbin/qm create *
svc-prx-veeam ALL=(root) PASSWD: /usr/sbin/qm ^showcmd [0-9]+ --pretty$
svc-prx-veeam ALL=(root) PASSWD: /usr/sbin/qm ^unlock [0-9]+$
svc-prx-veeam ALL=(root) PASSWD: /usr/bin/socat ^TCP-LISTEN:[0-9]+,bind=127\.0\.0\.1 UNIX-CONNECT:/[a-zA-Z0-9_./-]+$
svc-prx-veeam ALL=(root) PASSWD: /usr/bin/mkdir -p /var/lib/vz/snippets/
svc-prx-veeam ALL=(root) PASSWD: /usr/bin/pvenode cert info --output-format json
svc-prx-veeam ALL=(root) PASSWD: /usr/bin/pvesh ^get storage/([a-zA-Z0-9_-]+) --output json$
svc-prx-veeam ALL=(root) PASSWD: /usr/bin/pvesh ^set /nodes/([a-zA-Z0-9_-]+)/qemu/([0-9]+)/config --lock ([a-zA-Z]+)$
svc-prx-veeam ALL=(root) PASSWD: /usr/bin/pkill -9 -e -f -x socat *
svc-prx-veeam ALL=(root) PASSWD: /usr/sbin/lvchange -ay *
svc-prx-veeam ALL=(root) PASSWD: /usr/sbin/lvchange -an *
svc-prx-veeam ALL=(root) PASSWD: /usr/bin/rbd device map *
svc-prx-veeam ALL=(root) PASSWD: /usr/bin/mv ^-n /tmp/([a-zA-Z0-9_-]+\.config) /var/lib/vz/snippets/([a-zA-Z0-9_-]+\.config)$
svc-prx-veeam ALL=(root) PASSWD: /usr/bin/rm ^/[a-zA-Z0-9_/-]+/VeeamTmp[a-zA-Z0-9_.-]+$
svc-prx-veeam ALL=(root) PASSWD: /usr/bin/rm ^-f /[a-zA-Z0-9_/-]+/VeeamTmp[a-zA-Z0-9_.-]+$
svc-prx-veeam ALL=(root) PASSWD: /usr/bin/rm ^-f /var/lib/vz/snippets/[a-zA-Z0-9_-]+\.config$
svc-prx-veeam ALL=(root) PASSWD: /usr/bin/rm ^-f /var/lib/vz/template/iso/[a-zA-Z0-9_.-]+\.img$
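A quick way to sanity-check the drop-in after editing (file name and user name as above); a syntax error in any sudoers fragment can break sudo for every user, so it is worth validating before logging out:

```shell
# Validate only this fragment for syntax errors.
visudo -c -f /etc/sudoers.d/veeam

# List exactly what the service account is allowed to run via sudo.
sudo -l -U svc-prx-veeam
```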


Note that useradd this way doesn't create the home directory. Adding the host works fine, but the job then fails with an error about the home directory.
I added the home directory with mkhomedir_helper.
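For reference, a sketch of that step (account name as above); mkhomedir_helper populates the home directory from /etc/skel with the right ownership:

```shell
# Create /home/svc-prx-veeam from /etc/skel for the existing account.
mkhomedir_helper svc-prx-veeam

# Roughly equivalent by hand:
# cp -rT /etc/skel /home/svc-prx-veeam
# chown -R svc-prx-veeam:svc-prx-veeam /home/svc-prx-veeam
```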

The job still fails, but the error is different.

Before:

Failed to reach the hypervisor. Error output: Could not chdir to home directory /home/svc-prx-veeam: No such file or directory

Now:

Error output: sh: 1: Syntax error: "(" unexpected
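One thing worth checking for this particular message: `sh: 1: Syntax error: "(" unexpected` is typical of dash (`/bin/sh` on Debian/PVE) being handed bash-only syntax, and `useradd` without `-s` leaves the account with the distribution's default shell, which on Debian is usually /bin/sh. This is only a hypothesis to verify, not confirmed from the logs (user name as above):

```shell
# Show the login shell recorded for the account; dash rejects
# bash-only constructs such as process substitution <( ).
getent passwd svc-prx-veeam | cut -d: -f7

# If it prints /bin/sh, switching the shell may be enough:
# usermod -s /bin/bash svc-prx-veeam
```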

2025-07-05 00:52:25.8176 00006 01521] ERROR | OSshQmpSocket]: Failed to connect to the QMP device using SSH: Veeam.Vbf.Common.Exceptions.ExceptionWithDetail: Failed to reach the hypervisor. Error output: sh: 1: Syntax error: "(" unexpected
2025-07-05 00:52:25.8176 00006 01521] ERROR | OSshQmpSocket]:
2025-07-05 00:52:25.8176 00006 01521] ERROR | OSshQmpSocket]: at Veeam.Vbf.Project.Utilities.Ssh.CommandExecutors.CommandExecutorExtensions.ExecuteCommandEnsureExitSuccess(ICommandExecutor commandExecutor, String command, Int32I] successStatusCodes)
2025-07-05 00:52:25.8176 00006 01521] ERROR | OSshQmpSocket]: at Veeam.Vbf.Project.Utilities.Ssh.CommandExecutors.CommandExecutorExtensions.ExecuteCommandEnsureExitSuccess(ICommandExecutor commandExecutor, String command)
2025-07-05 00:52:25.8176 00006 01521] ERROR | OSshQmpSocket]: at Veeam.Vbf.Project.Utilities.Ssh.ProxmoxSshClient.FindFreePortOnConnectedHost(UInt32 rangeBegin, UInt32 rangeEnd)
2025-07-05 00:52:25.8176 00006 01521] ERROR | OSshQmpSocket]: at Veeam.Vbf.BackupAgent.BackupProxmox.ProxmoxBackupRestoreUtils.FindFreePortOnConnectedHost(ProxmoxSshClient sshClient, UInt32 rangeBegin, UInt32 rangeEnd)
2025-07-05 00:52:25.8176 00006 01521] ERROR | OSshQmpSocket]: at Veeam.Vbf.BackupAgent.BackupProxmox.Qmp.SshQmpSocket.ConnectAsync(String path, CancellationToken cancellationToken)
2025-07-05 00:52:25.8243 00006 01521] ERROR | ONbdEngine]: Failed to connect the NBD server to the hypervisor host: Veeam.Vbf.Common.Exceptions.ExceptionWithDetail: Failed to reach the hypervisor. Error output: sh: 1: Syntax error: "(" unexpected
2025-07-05 00:52:25.8243 00006 01521] ERROR | ONbdEngine]:
2025-07-05 00:52:25.8243 00006 01521] ERROR | ONbdEngine]: at Veeam.Vbf.Project.Utilities.Ssh.CommandExecutors.CommandExecutorExtensions.ExecuteCommandEnsureExitSuccess(ICommandExecutor commandExecutor, String command, Int32I] successStatusCodes)
2025-07-05 00:52:25.8243 00006 01521] ERROR | ONbdEngine]: at Veeam.Vbf.Project.Utilities.Ssh.CommandExecutors.CommandExecutorExtensions.ExecuteCommandEnsureExitSuccess(ICommandExecutor commandExecutor, String command)
2025-07-05 00:52:25.8243 00006 01521] ERROR | ONbdEngine]: at Veeam.Vbf.Project.Utilities.Ssh.ProxmoxSshClient.FindFreePortOnConnectedHost(UInt32 rangeBegin, UInt32 rangeEnd)
2025-07-05 00:52:25.8243 00006 01521] ERROR | ONbdEngine]: at Veeam.Vbf.BackupAgent.BackupProxmox.ProxmoxBackupRestoreUtils.FindFreePortOnConnectedHost(ProxmoxSshClient sshClient, UInt32 rangeBegin, UInt32 rangeEnd)
2025-07-05 00:52:25.8243 00006 01521] ERROR | ONbdEngine]: at Veeam.Vbf.BackupAgent.BackupProxmox.Qmp.SshQmpSocket.ConnectAsync(String path, CancellationToken cancellationToken)
2025-07-05 00:52:25.8243 00006 01521] ERROR | ONbdEngine]: at Veeam.Vbf.BackupAgent.BackupProxmox.Engine.NbdEngine.ConnectToHostAsync(Int32 vmId, Boolean closeOtherBackup, CancellationToken cancellationToken)
2025-07-05 00:52:25.8293 00006 01521] ERROR | OProxmoxBackupManager]: Failed to perform backup: System.Exception: Failed to connect the NBD server to the hypervisor host
2025-07-05 00:52:25.8293 00006 01521] ERROR | OProxmoxBackupManager]: ---> Veeam.Vbf.Common.Exceptions.ExceptionWithDetail: Failed to reach the hypervisor. Error output: sh: 1: Syntax error: "(" unexpected
2025-07-05 00:52:25.8293 00006 01521] ERROR | OProxmoxBackupManager]:
2025-07-05 00:52:25.8293 00006 01521] ERROR | OProxmoxBackupManager]: at Veeam.Vbf.Project.Utilities.Ssh.CommandExecutors.CommandExecutorExtensions.ExecuteCommandEnsureExitSuccess(ICommandExecutor commandExecutor, String command, Int32I] successStatusCodes)
2025-07-05 00:52:25.8293 00006 01521] ERROR | OProxmoxBackupManager]: at Veeam.Vbf.Project.Utilities.Ssh.CommandExecutors.CommandExecutorExtensions.ExecuteCommandEnsureExitSuccess(ICommandExecutor commandExecutor, String command)
2025-07-05 00:52:25.8293 00006 01521] ERROR | OProxmoxBackupManager]: at Veeam.Vbf.Project.Utilities.Ssh.ProxmoxSshClient.FindFreePortOnConnectedHost(UInt32 rangeBegin, UInt32 rangeEnd)
2025-07-05 00:52:25.8293 00006 01521] ERROR | OProxmoxBackupManager]: at Veeam.Vbf.BackupAgent.BackupProxmox.ProxmoxBackupRestoreUtils.FindFreePortOnConnectedHost(ProxmoxSshClient sshClient, UInt32 rangeBegin, UInt32 rangeEnd)
2025-07-05 00:52:25.8293 00006 01521] ERROR | OProxmoxBackupManager]: at Veeam.Vbf.BackupAgent.BackupProxmox.Qmp.SshQmpSocket.ConnectAsync(String path, CancellationToken cancellationToken)
2025-07-05 00:52:25.8293 00006 01521] ERROR | OProxmoxBackupManager]: at Veeam.Vbf.BackupAgent.BackupProxmox.Engine.NbdEngine.ConnectToHostAsync(Int32 vmId, Boolean closeOtherBackup, CancellationToken cancellationToken)
2025-07-05 00:52:25.8293 00006 01521] ERROR | OProxmoxBackupManager]: --- End of inner exception stack trace ---
2025-07-05 00:52:25.8293 00006 01521] ERROR | OProxmoxBackupManager]: at Veeam.Vbf.BackupAgent.BackupProxmox.Engine.NbdEngine.ConnectToHostAsync(Int32 vmId, Boolean closeOtherBackup, CancellationToken cancellationToken)
2025-07-05 00:52:25.8293 00006 01521] ERROR | OProxmoxBackupManager]: at Veeam.Vbf.BackupAgent.BackupProxmox.ProxmoxBackupManager.PerformBackupAsync(Int32 vmId, String vmName, String vmNodeId, List`1 disksToBackup, VbrVmBackupSession vbrBackupSession, VmBackupParams vmBackupParams, ProxmoxSshClient sshClient, CancellationToken ct)
2025-07-05 00:52:25.8302 00006 01521] INFO | OProxmoxBackupManager]: Finalize backup

This happens just after the "Performing backup..." line in the log; there are successful SSH interactions before that line:

2025-07-05 00:52:07.4388 00006 81521] INFO  | ISshClientExtensions]: Starting shell creation
2025-07-05 00:52:07.8179 00006 91521] INFO | ISshClientExtensions]: Successfully created the shell stream
2025-07-05 00:52:07.8350 00006 01521] INFO | ISshClientExtensions]: Executing the following SSH command with sudo using the shell stream: "dmidecode -s system-uuid"
2025-07-05 00:52:08.1737 00006 71521] INFO | ITaskService]: Registering the task 15baac90-f2e5-474b-9f02-beee2f2fd833...
2025-07-05 00:52:11.2200 00025 01521] ERROR | EProxmoxBackupManager]MProxmoxRestClient]: <== Request "Get" "https://.....local:8006/api2/json/nodes/..../qemu/101/config", body: "{}"
2025-07-05 00:52:11.2200 00023 01521] ERROR | EProxmoxBackupManager]MProxmoxRestClient]: <== Request "Get" "https://.....local:8006/api2/json/nodes/..../qemu/101/pending", body: "{}"
2025-07-05 00:52:11.2200 00025 01521] ERROR | EProxmoxBackupManager]MProxmoxRestClient]: ==> Response "Get" "https://....local:8006/api2/json/nodes/..../qemu/101/config", "code: 401 - No ticket", duration: "3 sec 142 msec", body: ""
2025-07-05 00:52:11.2200 00023 01521] ERROR | EProxmoxBackupManager]MProxmoxRestClient]: ==> Response "Get" "https://....local:8006/api2/json/nodes/..../qemu/101/pending", "code: 401 - No ticket", duration: "3 sec 55 msec", body: ""
2025-07-05 00:52:11.2257 00025 71521] INFO | IProxmoxRestClientFactoryBase]oRetry]: Authorization token is outdated during request https://.....local:8006/api2/json/nodes/..../qemu/101/config. Try relogin
2025-07-05 00:52:11.2257 00023 71521] INFO | IProxmoxRestClientFactoryBase]oRetry]: Authorization token is outdated during request https://.....local:8006/api2/json/nodes/..../qemu/101/pending. Try relogin
2025-07-05 00:52:11.5875 00022 51521] INFO | IProxmoxBackupRestoreUtils]: Start registering host info
2025-07-05 00:52:11.7680 00022 01521] INFO | ISshClientExtensions]: Starting shell creation
2025-07-05 00:52:11.8689 00022 91521] INFO | ISshClientExtensions]: Successfully created the shell stream
2025-07-05 00:52:11.8751 00022 11521] INFO | ISshClientExtensions]: Executing the following SSH command with sudo using the shell stream: "dmidecode -s system-uuid"
2025-07-05 00:52:11.9894 00022 41521] INFO | IProxmoxBackupRestoreUtils]: Using cluster for host info registration
2025-07-05 00:52:12.1538 00006 81521] INFO | IProxmoxBackupRestoreUtils]: Finished registering host info
2025-07-05 00:52:12.1554 00006 41521] INFO | IVbrBackupProgressReporter]: OnStartVmBackup
2025-07-05 00:52:12.1554 00006 41521] INFO | IVbrBackupProgressReporter]: Start "Progress report"
2025-07-05 00:52:12.1635 00006 51521] INFO | IProxmoxBackupRestoreUtils]: Creating VMB metadata
2025-07-05 00:52:12.2103 00006 31521] INFO | IProxmoxBackupRestoreUtils]: Processed VMB metadata: {"pDisplayName":"XXX","pUserData":"{\"type\":\"vm_pve\",\"backup_config\":{\"proxmox_cluster_id\":\"238891ca-419b-e891-83d7-de7db1bd0888\",\"proxmox_node_id\":\"4c4c4544-0039-3110-8050-c6c04f384432\",\"proxmox_node_name\":\"....\",\"vm_config\":{\"Id\":\"e947ff08-e358-49f7-9404-474cb4521ba1:101\",\"Name\":\"XXXXXX\",\"CoresPerCpu\":4,\"CpuCount\":1,\"MemoryMb\":8192,\"OsType\":\"win11\",\"BiosId\":\"e947ff08-e358-49f7-9404-474cb4521ba1\",\"Numa\":false,\"Kvm\":false,\"Boot\":\"order=scsi0;ide0;net0\",\"ScsiHw\":\"virtio-scsi-single\",\"NetworkAdapters\":{\"0\":{\"Model\":\"virtio\",\"NetworkId\":\"vmbr0\",\"Firewall\":true,\"MacAddress\":\"BC:24:11:67:73:0C\",\"Disconnected\":false}},\"Disks\":D{\"Id\":\"none\",\"BusType\":\"Ide\",\"Index\":0,\"StorageId\":\"none\",\"Size\":0,\"IsCdRom\":true,\"Format\":\"\"},{\"Id\":\"NFS_INFRA:101/vm-101-disk-1.qcow2\",\"BusType\":\"Scsi\",\"Index\":0,\"StorageId\":\"NFS_INFRA\",\"Size\":96636764160,\"IsCdRom\":false,\"Format\":\"qcow2\",\"Path\":\"/mnt/pve/NFS_INFRA/images/101/vm-101-disk-1.qcow2\",\"StorageType\":\"nfs\"}],\"NodeName\":\"....\",\"Template\":false,\"Properties\":{\"cores\":\"4\",\"vmgenid\":\"4fb2604c-3633-480d-8f62-1f2c0d1377a3\",\"boot\":\"order=scsi0;ide0;net0\",\"ostype\":\"win11\",\"memory\":\"8192\",\"scsi0\":\"NFS_INFRA:101/vm-101-disk-1.qcow2,iothread=1,size=90G\",\"name\":\"XXX\",\"net0\":\"virtio=BC:24:11:67:73:0C,bridge=vmbr0,firewall=1\",\"tpmstate0\":\"NFS_INFRA:101/vm-101-disk-0.raw,size=4M,version=v2.0\",\"agent\":\"1\",\"machine\":\"pc-q35-9.2+pve1\",\"digest\":\"68ee621c14f961d3931ff46b56a515e74fb19ddd\",\"efidisk0\":\"NFS_INFRA:101/vm-101-disk-0.qcow2,efitype=4m,pre-enrolled-keys=1,size=528K\",\"numa\":\"0\",\"sockets\":\"1\",\"bios\":\"ovmf\",\"ide0\":\"none,media=cdrom\",\"meta\":\"creation-qemu=9.2.0,ctime=1751390317\",\"smbios1\":\"uuid=e947ff08-e358-49f7-9404-474cb4521ba1\",\"cpu\":\"host\",\"s
csihw\":\"virtio-scsi-single\"},\"Pending\":n{\"key\":\"digest\",\"value\":\"68ee621c14f961d3931ff46b56a515e74fb19ddd\",\"delete\":false},{\"key\":\"efidisk0\",\"value\":\"NFS_INFRA:101/vm-101-disk-0.qcow2,efitype=4m,pre-enrolled-keys=1,size=528K\",\"delete\":false},{\"key\":\"numa\",\"value\":\"0\",\"delete\":false},{\"key\":\"sockets\",\"value\":\"1\",\"delete\":false},{\"key\":\"bios\",\"value\":\"ovmf\",\"delete\":false},{\"key\":\"ide0\",\"value\":\"none,media=cdrom\",\"delete\":false},{\"key\":\"meta\",\"value\":\"creation-qemu=9.2.0,ctime=1751390317\",\"delete\":false},{\"key\":\"smbios1\",\"value\":\"uuid=e947ff08-e358-49f7-9404-474cb4521ba1\",\"delete\":false},{\"key\":\"cpu\",\"value\":\"host\",\"delete\":false},{\"key\":\"scsihw\",\"value\":\"virtio-scsi-single\",\"delete\":false},{\"key\":\"cores\",\"value\":\"4\",\"delete\":false},{\"key\":\"vmgenid\",\"value\":\"4fb2604c-3633-480d-8f62-1f2c0d1377a3\",\"delete\":false},{\"key\":\"boot\",\"value\":\"order=scsi0;ide0;net0\",\"delete\":false},{\"key\":\"ostype\",\"value\":\"win11\",\"delete\":false},{\"key\":\"memory\",\"value\":\"8192\",\"delete\":false},{\"key\":\"scsi0\",\"value\":\"NFS_INFRA:101/vm-101-disk-1.qcow2,iothread=1,size=90G\",\"delete\":false},{\"key\":\"name\",\"value\":\"XXX\",\"delete\":false},{\"key\":\"net0\",\"value\":\"virtio=BC:24:11:67:73:0C,bridge=vmbr0,firewall=1\",\"delete\":false},{\"key\":\"tpmstate0\",\"value\":\"NFS_INFRA:101/vm-101-disk-0.raw,size=4M,version=v2.0\",\"delete\":false},{\"key\":\"agent\",\"value\":\"1\",\"delete\":false},{\"key\":\"machine\",\"value\":\"pc-q35-9.2+pve1\",\"delete\":false}],\"Firmware\":1,\"IsProtected\":false},\"node_version\":\"8.4.1\",\"pool_id\":\"POOL-INFRA\"},\"vm_metadata\":{\"clusterId\":\"\",\"cpuCoresCount\":4,\"cpuSocketsCount\":1,\"disks\":d{\"diskName\":\"\",\"busType\":\"Ide\",\"diskUid\":\"none\",\"vmUuid\":\"\",\"storageContainerUid\":\"none\",\"storageContainerName\":\"\",\"targetId\":\"\",\"deviceIndex\":0,\"isCdrom\":true,\"si
ze\":0,\"isBootDisk\":false},{\"diskName\":\"\",\"busType\":\"Scsi\",\"diskUid\":\"NFS_INFRA_101_vm-101-disk-1.qcow2\",\"vmUuid\":\"\",\"storageContainerUid\":\"NFS_INFRA\",\"storageContainerName\":\"\",\"targetId\":\"\",\"deviceIndex\":0,\"isCdrom\":false,\"size\":96636764160,\"isBootDisk\":false}],\"networkAdapters\":p{\"ipAddress\":\"\",\"macAddress\":\"BC:24:11:67:73:0C\",\"networkUid\":\"vmbr0\",\"isConnected\":true}],\"networks\":w{\"networkUid\":\"vmbr0\",\"networkName\":\"vmbr0\",\"networkAddress\":\"\",\"isIpManagementOn\":false,\"gatewayIp\":\"\",\"ipPool\":p]}],\"totalMemoryInBytes\":8589934592,\"virtualMachineName\":\"XXX\",\"virtualMachineUid\":\"e947ff08-e358-49f7-9404-474cb4521ba1:101\",\"firmware\":\"uefi\",\"engine_id\":\"\",\"agent_version\":\"12.1.3.217\"},\"network_config\":o{\"Address\":\"y.y.y.y\",\"iface\":\"vmbr0\",\"Type\":\"bridge\"}],\"storage_config\":o{\"storage\":\"NFS_INFRA\",\"Type\":\"nfs\",\"Content\":\"iso,images\",\"Active\":true,\"Avail\":1470775164928,\"Enabled\":true,\"Shared\":true,\"Total\":1759218761728,\"Used\":288443596800,\"used_fraction\":0.163961187246932}],\"node_info\":{\"node\":\"....\",\"ssl_fingerprint\":\"94:E5:34:8B:1E:87:0C:4B:3A:F4:18:66:AC:4D:7A:70:4E:3E:F5:B7:81:E8:F6:D9:99:31:A9:8A:28:20:06:89\",\"status\":1,\"maxcpu\":64,\"maxmem\":270345654272},\"version\":\"1.0\"}","pDisks":{"value":0},"disksCount":0,"pOsType":"windows11_64Guest","pFqdn":"","kbBlockSize":2,"pHostInstallationId":"238891ca-419b-e891-83d7-de7db1bd0888","pObjectTag":"238891ca-419b-e891-83d7-de7db1bd0888e947ff08-e358-49f7-9404-474cb4521ba1:101"}
2025-07-05 00:52:12.2111 00006 11521] INFO | IProxmoxBackupRestoreUtils]: Creating VMB disks metadata
2025-07-05 00:52:12.2257 00006 71521] INFO | IProxmoxBackupRestoreUtils]: Processed VMB disks metadata: t{"pDiskKey":"NFS_INFRA_101_vm-101-disk-1.qcow2","id":{"busNumber":0,"targetId":0,"lun":0},"capacity":96636764160,"isBackedUp":true}]
2025-07-05 00:52:12.2269 00006 91521] INFO | IProxmoxBackupRestoreUtils]: Creating VMB backup task data
2025-07-05 00:52:12.2403 00006 31521] INFO | IProxmoxBackupRestoreUtils]: Processed backup task data: "{"structSize":0,"pInstanceId":"e947ff08-e358-49f7-9404-474cb4521ba1","pPolicy":{"value":0},"pMetadata":{"value":0},"snapshotCreationTimeUtc":0,"forceFullBackup":false,"useCallbackWriting":true,"paramsList":{"value":0},"vbrParamsList":{"value":0},"enableBitLooker":true,"compressionType":0,"pPolicySessionTag":"0c1edea2-09b3-4911-938b-f84ec3a85287","backupObjectType":{"value":0},"forceNewBackup":false,"immutableTillUtc":0,"backupId":{}}"
2025-07-05 00:52:12.2403 00006 31521] INFO | IProxmoxBackupRestoreUtils]: Creating VMB backup policy
2025-07-05 00:52:12.2452 00006 21521] INFO | IProxmoxBackupRestoreUtils]: Processed VMB backup policy: "{"structSize":0,"pPolicyTag":"56f83abf-409c-4088-b378-83a48ebe6c88","pDisplayName":"Backup-INFRA","repositoryId":{},"pDescription":"Created by at 7/5/2025 12:26 AM."}"
2025-07-05 00:52:12.2634 00006 41521] INFO | IVmbApiBackup]: Starting the VmbApi.StartInstanceBackup operation...
2025-07-05 00:52:25.2477 00006 71521] INFO | IVmbApiBackup]: The VmbApi.StartInstanceBackup operation has been executed
2025-07-05 00:52:25.5340 00006 01521] INFO | IProxmoxBackupManager]: Backup session created
2025-07-05 00:52:25.5374 00006 41521] INFO | IVbrBackupProgressReporter]: OnStartDisksBackup
2025-07-05 00:52:25.5374 00006 41521] INFO | IVbrBackupProgressReporter]: Start "Progress report"
2025-07-05 00:52:25.5413 00006 31521] INFO | IProxmoxBackupManager]: Performing backup...


What could be the cause? Is there any way to debug that syntax error?

Any help is welcome.

Chris

Hello,

Don’t know if you solved it but we had the same behaviour.

We needed to delete our “xxveeam” local user on PVE and create it with the “adduser” command, not the useradd one.

And of course “usermod -aG sudo xxveeam”.
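As a sketch of the steps above (run as root, user name from this thread): the Debian adduser wrapper, unlike bare useradd with Debian defaults, creates the home directory and sets /bin/bash as the login shell, which matches both errors seen earlier in the thread.

```shell
# Remove the account created with useradd, then recreate it with the
# Debian adduser wrapper (creates /home/<user>, sets /bin/bash).
deluser svc-prx-veeam
adduser svc-prx-veeam

# Add sudo group membership as described above.
usermod -aG sudo svc-prx-veeam
```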

 


Tremendously helpful. This solved my error: Failed to perform backup: Failed to connect the NBD server to the hypervisor host. Thank you very much!

Also, you may restrict the sudo privileges of the Veeam user according to KB4701.

