
I have an internal K8s cluster (built with K3s) with one default StorageClass backed by csi-driver-nfs; the NFS server is a QNAP NAS device. I installed K10 with Helm 3.

I can manually back up a simple nginx app (a small 50MB volume for /var/logs/nginx), export it to Amazon S3, and restore the app to a new cluster successfully.

Now I am trying to manually back up and export my HashiCorp Vault to Amazon S3. The backup runs OK, but the export fails:

phases:
- attempt: 3
endTime: 2023-10-06T04:08:19Z
errors:
- cause: '{"cause":{"cause":{"message":"Failure in exporting
restorepoint"},"fields":[{"name":"FailedSubPhases","value":[{"Err":{"cause":{"cause":{"cause":{"cause":{"cause":{"cause":{"file":"kasten.io/k10/kio/kanister/function/kio_copy_volume_data.go:319","function":"kasten.io/k10/kio/kanister/function.CopyVolumeData.copyVolumeDataPodExecFunc.func1","linenumber":319,"message":"Failed
to get snapshot ID from create snapshot
output"},"file":"kasten.io/k10/kio/kanister/function/kio_copy_volume_data.go:129","function":"kasten.io/k10/kio/kanister/function.CopyVolumeData","linenumber":129,"message":"Failed
to execute copy volume data pod
function"},"file":"kasten.io/k10/kio/exec/phases/phase/copy_snapshots.go:1635","function":"kasten.io/k10/kio/exec/phases/phase.(*gvcConverter).Convert","linenumber":1635,"message":"Error
creating portable
snapshot"},"fields":[{"name":"type","value":"CSI"},{"name":"id","value":"k10-csi-snap-2w7cw2rbbpq4c22m"}],"file":"kasten.io/k10/kio/exec/phases/phase/copy_snapshots.go:442","function":"kasten.io/k10/kio/exec/phases/phase.(*ArtifactCopier).convertSnapshots.func1","linenumber":442,"message":"Failed
to export snapshot
data"},"file":"kasten.io/k10/kio/exec/phases/phase/copy_snapshots.go:210","function":"kasten.io/k10/kio/exec/phases/phase.(*ArtifactCopier).copy","linenumber":210,"message":"Error
converting
snapshots"},"file":"kasten.io/k10/kio/exec/phases/phase/export.go:168","function":"kasten.io/k10/kio/exec/phases/phase.(*exportRestorePointPhase).Run","linenumber":168,"message":"Failed
to copy artifacts"},"fields":[],"message":"Job failed to be
executed"},"ID":"95db58bb-63fd-11ee-8867-1a89428adbaf","Phase":"Exporting
RestorePoint"}]}],"file":"kasten.io/k10/kio/exec/phases/phase/queue_and_wait_children.go:196","function":"kasten.io/k10/kio/exec/phases/phase.(*queueAndWaitChildrenPhase).processGroup","linenumber":196,"message":"Failure
in exporting
restorepoint"},"fields":[{"name":"manifestID","value":"8d9fa0be-63fd-11ee-8867-1a89428adbaf"},{"name":"jobID","value":"8da08c09-63fd-11ee-81c1-a63a2ec5ff99"},{"name":"groupIndex","value":0}],"file":"kasten.io/k10/kio/exec/phases/phase/queue_and_wait_children.go:96","function":"kasten.io/k10/kio/exec/phases/phase.(*queueAndWaitChildrenPhase).Run","linenumber":96,"message":"Failed
checking jobs in group"}'
message: Job failed to be executed
- cause: '{"cause":{"cause":{"message":"Failure in exporting
restorepoint"},"fields":[{"name":"FailedSubPhases","value":[{"Err":{"cause":{"cause":{"cause":{"cause":{"cause":{"cause":{"file":"kasten.io/k10/kio/kanister/function/kio_copy_volume_data.go:319","function":"kasten.io/k10/kio/kanister/function.CopyVolumeData.copyVolumeDataPodExecFunc.func1","linenumber":319,"message":"Failed
to get snapshot ID from create snapshot
output"},"file":"kasten.io/k10/kio/kanister/function/kio_copy_volume_data.go:129","function":"kasten.io/k10/kio/kanister/function.CopyVolumeData","linenumber":129,"message":"Failed
to execute copy volume data pod
function"},"file":"kasten.io/k10/kio/exec/phases/phase/copy_snapshots.go:1635","function":"kasten.io/k10/kio/exec/phases/phase.(*gvcConverter).Convert","linenumber":1635,"message":"Error
creating portable
snapshot"},"fields":[{"name":"type","value":"CSI"},{"name":"id","value":"k10-csi-snap-2w7cw2rbbpq4c22m"}],"file":"kasten.io/k10/kio/exec/phases/phase/copy_snapshots.go:442","function":"kasten.io/k10/kio/exec/phases/phase.(*ArtifactCopier).convertSnapshots.func1","linenumber":442,"message":"Failed
to export snapshot
data"},"file":"kasten.io/k10/kio/exec/phases/phase/copy_snapshots.go:210","function":"kasten.io/k10/kio/exec/phases/phase.(*ArtifactCopier).copy","linenumber":210,"message":"Error
converting
snapshots"},"file":"kasten.io/k10/kio/exec/phases/phase/export.go:168","function":"kasten.io/k10/kio/exec/phases/phase.(*exportRestorePointPhase).Run","linenumber":168,"message":"Failed
to copy artifacts"},"fields":[],"message":"Job failed to be
executed"},"ID":"95db58bb-63fd-11ee-8867-1a89428adbaf","Phase":"Exporting
RestorePoint"}]}],"file":"kasten.io/k10/kio/exec/phases/phase/queue_and_wait_children.go:196","function":"kasten.io/k10/kio/exec/phases/phase.(*queueAndWaitChildrenPhase).processGroup","linenumber":196,"message":"Failure
in exporting
restorepoint"},"fields":[{"name":"manifestID","value":"8d9fa0be-63fd-11ee-8867-1a89428adbaf"},{"name":"jobID","value":"8da08c09-63fd-11ee-81c1-a63a2ec5ff99"},{"name":"groupIndex","value":0}],"file":"kasten.io/k10/kio/exec/phases/phase/queue_and_wait_children.go:96","function":"kasten.io/k10/kio/exec/phases/phase.(*queueAndWaitChildrenPhase).Run","linenumber":96,"message":"Failed
checking jobs in group"}'
message: Job failed to be executed
- cause: '{"cause":{"cause":{"message":"Failure in exporting
restorepoint"},"fields":[{"name":"FailedSubPhases","value":[{"Err":{"cause":{"cause":{"cause":{"cause":{"cause":{"cause":{"file":"kasten.io/k10/kio/kanister/function/kio_copy_volume_data.go:319","function":"kasten.io/k10/kio/kanister/function.CopyVolumeData.copyVolumeDataPodExecFunc.func1","linenumber":319,"message":"Failed
to get snapshot ID from create snapshot
output"},"file":"kasten.io/k10/kio/kanister/function/kio_copy_volume_data.go:129","function":"kasten.io/k10/kio/kanister/function.CopyVolumeData","linenumber":129,"message":"Failed
to execute copy volume data pod
function"},"file":"kasten.io/k10/kio/exec/phases/phase/copy_snapshots.go:1635","function":"kasten.io/k10/kio/exec/phases/phase.(*gvcConverter).Convert","linenumber":1635,"message":"Error
creating portable
snapshot"},"fields":[{"name":"type","value":"CSI"},{"name":"id","value":"k10-csi-snap-2w7cw2rbbpq4c22m"}],"file":"kasten.io/k10/kio/exec/phases/phase/copy_snapshots.go:442","function":"kasten.io/k10/kio/exec/phases/phase.(*ArtifactCopier).convertSnapshots.func1","linenumber":442,"message":"Failed
to export snapshot
data"},"file":"kasten.io/k10/kio/exec/phases/phase/copy_snapshots.go:210","function":"kasten.io/k10/kio/exec/phases/phase.(*ArtifactCopier).copy","linenumber":210,"message":"Error
converting
snapshots"},"file":"kasten.io/k10/kio/exec/phases/phase/export.go:168","function":"kasten.io/k10/kio/exec/phases/phase.(*exportRestorePointPhase).Run","linenumber":168,"message":"Failed
to copy artifacts"},"fields":[],"message":"Job failed to be
executed"},"ID":"95db58bb-63fd-11ee-8867-1a89428adbaf","Phase":"Exporting
RestorePoint"}]}],"file":"kasten.io/k10/kio/exec/phases/phase/queue_and_wait_children.go:196","function":"kasten.io/k10/kio/exec/phases/phase.(*queueAndWaitChildrenPhase).processGroup","linenumber":196,"message":"Failure
in exporting
restorepoint"},"fields":[{"name":"manifestID","value":"8d9fa0be-63fd-11ee-8867-1a89428adbaf"},{"name":"jobID","value":"8da08c09-63fd-11ee-81c1-a63a2ec5ff99"},{"name":"groupIndex","value":0}],"file":"kasten.io/k10/kio/exec/phases/phase/queue_and_wait_children.go:96","function":"kasten.io/k10/kio/exec/phases/phase.(*queueAndWaitChildrenPhase).Run","linenumber":96,"message":"Failed
checking jobs in group"}'
message: Job failed to be executed
name: Exporting Metadata
startTime: 2023-10-06T04:05:28Z
state: failed
updatedTime: 2023-10-06T04:08:19Z
- attempt: 3
endTime: 2023-10-06T04:07:11Z
errors:
- cause: '{"cause":{"cause":{"cause":{"cause":{"cause":{"file":"kasten.io/k10/kio/kanister/function/kio_copy_volume_data.go:319","function":"kasten.io/k10/kio/kanister/function.CopyVolumeData.copyVolumeDataPodExecFunc.func1","linenumber":319,"message":"Failed
to get snapshot ID from create snapshot
output"},"file":"kasten.io/k10/kio/kanister/function/kio_copy_volume_data.go:129","function":"kasten.io/k10/kio/kanister/function.CopyVolumeData","linenumber":129,"message":"Failed
to execute copy volume data pod
function"},"file":"kasten.io/k10/kio/exec/phases/phase/copy_snapshots.go:1635","function":"kasten.io/k10/kio/exec/phases/phase.(*gvcConverter).Convert","linenumber":1635,"message":"Error
creating portable
snapshot"},"fields":[{"name":"type","value":"CSI"},{"name":"id","value":"k10-csi-snap-2w7cw2rbbpq4c22m"}],"file":"kasten.io/k10/kio/exec/phases/phase/copy_snapshots.go:442","function":"kasten.io/k10/kio/exec/phases/phase.(*ArtifactCopier).convertSnapshots.func1","linenumber":442,"message":"Failed
to export snapshot
data"},"file":"kasten.io/k10/kio/exec/phases/phase/copy_snapshots.go:210","function":"kasten.io/k10/kio/exec/phases/phase.(*ArtifactCopier).copy","linenumber":210,"message":"Error
converting
snapshots"},"file":"kasten.io/k10/kio/exec/phases/phase/export.go:168","function":"kasten.io/k10/kio/exec/phases/phase.(*exportRestorePointPhase).Run","linenumber":168,"message":"Failed
to copy artifacts"}'
message: Job failed to be executed
- cause: '{"cause":{"cause":{"cause":{"cause":{"cause":{"file":"kasten.io/k10/kio/kanister/function/kio_copy_volume_data.go:319","function":"kasten.io/k10/kio/kanister/function.CopyVolumeData.copyVolumeDataPodExecFunc.func1","linenumber":319,"message":"Failed
to get snapshot ID from create snapshot
output"},"file":"kasten.io/k10/kio/kanister/function/kio_copy_volume_data.go:129","function":"kasten.io/k10/kio/kanister/function.CopyVolumeData","linenumber":129,"message":"Failed
to execute copy volume data pod
function"},"file":"kasten.io/k10/kio/exec/phases/phase/copy_snapshots.go:1635","function":"kasten.io/k10/kio/exec/phases/phase.(*gvcConverter).Convert","linenumber":1635,"message":"Error
creating portable
snapshot"},"fields":[{"name":"type","value":"CSI"},{"name":"id","value":"k10-csi-snap-2w7cw2rbbpq4c22m"}],"file":"kasten.io/k10/kio/exec/phases/phase/copy_snapshots.go:442","function":"kasten.io/k10/kio/exec/phases/phase.(*ArtifactCopier).convertSnapshots.func1","linenumber":442,"message":"Failed
to export snapshot
data"},"file":"kasten.io/k10/kio/exec/phases/phase/copy_snapshots.go:210","function":"kasten.io/k10/kio/exec/phases/phase.(*ArtifactCopier).copy","linenumber":210,"message":"Error
converting
snapshots"},"file":"kasten.io/k10/kio/exec/phases/phase/export.go:168","function":"kasten.io/k10/kio/exec/phases/phase.(*exportRestorePointPhase).Run","linenumber":168,"message":"Failed
to copy artifacts"}'
message: Job failed to be executed
- cause: '{"cause":{"cause":{"cause":{"cause":{"cause":{"file":"kasten.io/k10/kio/kanister/function/kio_copy_volume_data.go:319","function":"kasten.io/k10/kio/kanister/function.CopyVolumeData.copyVolumeDataPodExecFunc.func1","linenumber":319,"message":"Failed
to get snapshot ID from create snapshot
output"},"file":"kasten.io/k10/kio/kanister/function/kio_copy_volume_data.go:129","function":"kasten.io/k10/kio/kanister/function.CopyVolumeData","linenumber":129,"message":"Failed
to execute copy volume data pod
function"},"file":"kasten.io/k10/kio/exec/phases/phase/copy_snapshots.go:1635","function":"kasten.io/k10/kio/exec/phases/phase.(*gvcConverter).Convert","linenumber":1635,"message":"Error
creating portable
snapshot"},"fields":[{"name":"type","value":"CSI"},{"name":"id","value":"k10-csi-snap-2w7cw2rbbpq4c22m"}],"file":"kasten.io/k10/kio/exec/phases/phase/copy_snapshots.go:442","function":"kasten.io/k10/kio/exec/phases/phase.(*ArtifactCopier).convertSnapshots.func1","linenumber":442,"message":"Failed
to export snapshot
data"},"file":"kasten.io/k10/kio/exec/phases/phase/copy_snapshots.go:210","function":"kasten.io/k10/kio/exec/phases/phase.(*ArtifactCopier).copy","linenumber":210,"message":"Error
converting
snapshots"},"file":"kasten.io/k10/kio/exec/phases/phase/export.go:168","function":"kasten.io/k10/kio/exec/phases/phase.(*exportRestorePointPhase).Run","linenumber":168,"message":"Failed
to copy artifacts"}'
message: Job failed to be executed
name: Exporting RestorePoint
startTime: 2023-10-06T04:05:28Z
state: failed
updatedTime: 2023-10-06T04:07:11Z

Please give me some advice. Thank you very much.

Hi, I have the same error.

Can you try to do the same with a PVC in ReadWriteOnce?

For me, the export works in that case.
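Something like this minimal ReadWriteOnce claim is what I tested with (the names and storage class here are only examples; adapt them to your cluster):

```yaml
# Hypothetical test claim -- back it up and export it in isolation
apiVersion: v1
kind: PersistentVolumeClaim
metadata:
  name: export-test-pvc
  namespace: default
spec:
  accessModes:
    - ReadWriteOnce
  storageClassName: storageclass-csi-nfs-sgnnas03
  resources:
    requests:
      storage: 1Gi
```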


Hi @Vecteur IT,
I don't understand your suggestion: my HashiCorp Vault was also installed with Helm 3, and it already uses ReadWriteOnce.
This is the PV YAML for Vault:

apiVersion: v1
kind: PersistentVolume
metadata:
  name: vault-pv
spec:
  capacity:
    storage: 10Gi
  accessModes:
    - ReadWriteOnce
  persistentVolumeReclaimPolicy: Retain
  storageClassName: storageclass-csi-nfs-sgnnas03
  claimRef:
    name: data-vault-0
    namespace: vault-csi
  mountOptions:
    - hard
    - nfsvers=4.1
  csi:
    driver: nfs.csi.k8s.io
    readOnly: false
    volumeHandle: 192.168.7.107#k8s-nfs-sgnnas03#vault-pv
    volumeAttributes:
      server: 192.168.7.107
      share: /k8s-nfs-sgnnas03
      subdir: vault-pv


These are the Vault Helm values related to the PV above:

dataStorage:
  enabled: true
  # Size of the PVC created
  size: 10Gi
  # Location where the PVC will be mounted.
  mountPath: "/vault/data"
  # Name of the storage class to use. If null it will use the
  # configured default Storage Class.
  #storageClass: nfs-sgnnas03
  storageClass: storageclass-csi-nfs-sgnnas03
  volumeName: vault-pv
  # Access Mode of the storage device being used for the PVC
  accessMode: ReadWriteOnce
  # Annotations to apply to the PVC
  annotations: {}

This is the StorageClass and VolumeSnapshotClass:

---
allowVolumeExpansion: true
apiVersion: storage.k8s.io/v1
kind: StorageClass
metadata:
  name: storageclass-csi-nfs-sgnnas03
  annotations:
    storageclass.kubernetes.io/is-default-class: "true"
provisioner: nfs.csi.k8s.io
parameters:
  server: "192.168.7.107"
  share: /k8s-nfs-sgnnas03
reclaimPolicy: Retain
volumeBindingMode: Immediate
mountOptions:
  - hard
  - nfsvers=4.1
---
apiVersion: snapshot.storage.k8s.io/v1
kind: VolumeSnapshotClass
metadata:
  name: csi-nfs-snapclass
  annotations:
    snapshot.storage.kubernetes.io/is-default-class: "true"
    k10.kasten.io/is-snapshot-class: "true"
  labels:
    velero.io/csi-volumesnapshot-class: "true"
driver: nfs.csi.k8s.io
deletionPolicy: Delete
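To rule out the snapshot path itself, I can also create a VolumeSnapshot of the Vault PVC by hand with something like this (the snapshot name is just an example; the class and PVC names are the ones above):

```yaml
apiVersion: snapshot.storage.k8s.io/v1
kind: VolumeSnapshot
metadata:
  name: vault-manual-snap   # example name
  namespace: vault-csi
spec:
  volumeSnapshotClassName: csi-nfs-snapclass
  source:
    persistentVolumeClaimName: data-vault-0
```

and then watch it with `kubectl -n vault-csi get volumesnapshot vault-manual-snap -w` until readyToUse is true.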


These are my pre-flight checks:

curl https://docs.kasten.io/tools/k10_primer.sh | bash
Namespace option not provided, using default namespace
Checking for tools
--> Found kubectl
--> Found helm
--> Found jq
--> Found cat
--> Found base64
Checking if the Kasten Helm repo is present
--> The Kasten Helm repo was found
Checking for required Helm version (>= v3.0.0)
--> No Tiller needed with Helm v3.12.3
K10Primer image
--> Using Image (gcr.io/kasten-images/k10tools:6.0.8) to run test
Checking access to the Kubernetes context default
--> Able to access the default Kubernetes namespace
K10 Kanister tools image
--> Using Kanister tools image (ghcr.io/kanisterio/kanister-tools:0.96.0) to run test

Running K10Primer Job in cluster with command
./k10tools primer
serviceaccount/k10-primer created
clusterrolebinding.rbac.authorization.k8s.io/k10-primer created
job.batch/k10primer created
Pod k10primer-s27x8 is in Pending phase
Pod Ready!
================================================================
Kubernetes Version Check:
Valid kubernetes version (v1.27.5+k3s1) - OK

RBAC Check:
Kubernetes RBAC is enabled - OK

Aggregated Layer Check:
The Kubernetes Aggregated Layer is enabled - OK

CSI Capabilities Check:
Using CSI GroupVersion snapshot.storage.k8s.io/v1 - OK

Validating Provisioners:
nfs.csi.k8s.io:
Is a CSI Provisioner - OK
Storage Classes:
storageclass-csi-nfs-sgnnas03
Valid Storage Class - OK
storageclass-csi-nfs-hq-k10
Valid Storage Class - OK
Volume Snapshot Classes:
k10-clone-csi-nfs-snapclass
csi-nfs-snapclass
Has k10.kasten.io/is-snapshot-class annotation set to true - OK
Has deletionPolicy 'Delete' - OK

Validate Generic Volume Snapshot:
Pod created successfully - OK
GVS Backup command executed successfully - OK
Pod deleted successfully - OK
================================================================
serviceaccount "k10-primer" deleted
clusterrolebinding.rbac.authorization.k8s.io "k10-primer" deleted
job.batch "k10primer" deleted
curl -s https://docs.kasten.io/tools/k10_primer.sh  | bash /dev/stdin csi -s storageclass-csi-nfs-sgnnas03
Using default user ID (1000)
Namespace option not provided, using default namespace
Checking for tools
--> Found kubectl
--> Found helm
--> Found jq
--> Found cat
--> Found base64
Checking if the Kasten Helm repo is present
--> The Kasten Helm repo was found
Checking for required Helm version (>= v3.0.0)
--> No Tiller needed with Helm v3.12.3
K10Primer image
--> Using Image (gcr.io/kasten-images/k10tools:6.0.8) to run test
Checking access to the Kubernetes context default
--> Able to access the default Kubernetes namespace
K10 Kanister tools image
--> Using Kanister tools image (ghcr.io/kanisterio/kanister-tools:0.96.0) to run test

Running K10Primer Job in cluster with command
./k10tools primer storage check csi
serviceaccount/k10-primer created
clusterrolebinding.rbac.authorization.k8s.io/k10-primer created
job.batch/k10primer created
Pod k10primer-sjhdl is in Pending phase
Pod Ready!
================================================================
Using "K10_PRIMER_CONFIG_YAML" env var content as config source
Using "K10_PRIMER_CONFIG_YAML" env var content as config source
Creating application
-> Created pod (kubestr-csi-original-podn6bdm) and pvc (kubestr-csi-original-pvcxzrvj)
Taking a snapshot
-> Created snapshot (kubestr-snapshot-20231009015644)
Restoring application
-> Restored pod (kubestr-csi-cloned-podqtrk7) and pvc (kubestr-csi-cloned-pvcckpbh)
Cleaning up resources
CSI Snapshot Walkthrough:
Using annotated VolumeSnapshotClass (csi-nfs-snapclass)
Successfully tested snapshot restore functionality. - OK
================================================================
serviceaccount "k10-primer" deleted
clusterrolebinding.rbac.authorization.k8s.io "k10-primer" deleted
job.batch "k10primer" deleted

I don't know what else to try to troubleshoot this.
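In case it is useful, these are the kinds of commands I have been running to watch what K10 creates during an export (the `component=executor` label is my assumption about how the K10 pods are labelled, so it may need adjusting):

```shell
# Watch the short-lived VolumeSnapshots and scratch PVCs K10 creates
kubectl -n kasten-io get volumesnapshot,pvc -w

# Describe one of the snapshot copies to see its events (name is an example)
kubectl -n kasten-io describe volumesnapshot snapshot-copy-z9sph6kg

# Logs of the K10 executor pods, which drive the export phases
kubectl -n kasten-io logs -l component=executor --tail=200
```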


This is the log from csi-driver-nfs when the export fails:

kubectl -n nfs-provisioner get pod
NAME READY STATUS RESTARTS AGE
csi-nfs-node-2r4nm 3/3 Running 0 12d
snapshot-controller-7c47788447-n8s6k 1/1 Running 1 (6d14h ago) 12d
csi-nfs-controller-86548b95f7-dtwtq 4/4 Running 4 (6d14h ago) 12d
kubectl -n nfs-provisioner logs -f snapshot-controller-7c47788447-n8s6k

I1009 02:16:52.762140 1 event.go:285] Event(v1.ObjectReference{Kind:"VolumeSnapshot", Namespace:"kasten-io", Name:"snapshot-copy-z9sph6kg", UID:"7a52c450-ae0d-4c88-ba7a-00a1cbcf87c2", APIVersion:"snapshot.storage.k8s.io/v1", ResourceVersion:"3804888", FieldPath:""}): type: 'Normal' reason: 'SnapshotCreated' Snapshot kasten-io/snapshot-copy-z9sph6kg was successfully created by the CSI driver.
I1009 02:16:52.762227 1 event.go:285] Event(v1.ObjectReference{Kind:"VolumeSnapshot", Namespace:"kasten-io", Name:"snapshot-copy-z9sph6kg", UID:"7a52c450-ae0d-4c88-ba7a-00a1cbcf87c2", APIVersion:"snapshot.storage.k8s.io/v1", ResourceVersion:"3804888", FieldPath:""}): type: 'Normal' reason: 'SnapshotReady' Snapshot kasten-io/snapshot-copy-z9sph6kg is ready to use.
E1009 02:17:08.332044 1 snapshot_controller_base.go:403] could not sync snapshot "kasten-io/snapshot-copy-z9sph6kg": snapshot controller failed to update snapshot-copy-z9sph6kg on API server: Operation cannot be fulfilled on volumesnapshots.snapshot.storage.k8s.io "snapshot-copy-z9sph6kg": StorageError: invalid object, Code: 4, Key: /registry/snapshot.storage.k8s.io/volumesnapshots/kasten-io/snapshot-copy-z9sph6kg, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 7a52c450-ae0d-4c88-ba7a-00a1cbcf87c2, UID in object meta:
I1009 02:17:08.338611 1 snapshot_controller_base.go:269] deletion of snapshot "kasten-io/snapshot-copy-z9sph6kg" was already processed
I1009 02:17:09.332555 1 snapshot_controller_base.go:269] deletion of snapshot "kasten-io/snapshot-copy-z9sph6kg" was already processed
I1009 02:17:38.614619 1 event.go:285] Event(v1.ObjectReference{Kind:"VolumeSnapshot", Namespace:"kasten-io", Name:"snapshot-copy-98j6njz9", UID:"3c79de2a-c870-4258-afe9-d917d93e5051", APIVersion:"snapshot.storage.k8s.io/v1", ResourceVersion:"3805001", FieldPath:""}): type: 'Normal' reason: 'SnapshotCreated' Snapshot kasten-io/snapshot-copy-98j6njz9 was successfully created by the CSI driver.
I1009 02:17:38.614697 1 event.go:285] Event(v1.ObjectReference{Kind:"VolumeSnapshot", Namespace:"kasten-io", Name:"snapshot-copy-98j6njz9", UID:"3c79de2a-c870-4258-afe9-d917d93e5051", APIVersion:"snapshot.storage.k8s.io/v1", ResourceVersion:"3805001", FieldPath:""}): type: 'Normal' reason: 'SnapshotReady' Snapshot kasten-io/snapshot-copy-98j6njz9 is ready to use.
E1009 02:17:55.616242 1 snapshot_controller.go:1319] getSnapshotDriverName: failed to get snapshotContent: snapshot-copy-98j6njz9-content-e8b6ff32-be1d-47af-be2a-2f2f6ae83403
E1009 02:17:55.621443 1 snapshot_controller_base.go:403] could not sync snapshot "kasten-io/snapshot-copy-98j6njz9": snapshot controller failed to update snapshot-copy-98j6njz9 on API server: Operation cannot be fulfilled on volumesnapshots.snapshot.storage.k8s.io "snapshot-copy-98j6njz9": StorageError: invalid object, Code: 4, Key: /registry/snapshot.storage.k8s.io/volumesnapshots/kasten-io/snapshot-copy-98j6njz9, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: 3c79de2a-c870-4258-afe9-d917d93e5051, UID in object meta:
E1009 02:17:55.621550 1 snapshot_controller.go:1319] getSnapshotDriverName: failed to get snapshotContent: snapshot-copy-98j6njz9-content-e8b6ff32-be1d-47af-be2a-2f2f6ae83403
I1009 02:17:55.621725 1 snapshot_controller_base.go:269] deletion of snapshot "kasten-io/snapshot-copy-98j6njz9" was already processed
I1009 02:17:56.622217 1 snapshot_controller_base.go:269] deletion of snapshot "kasten-io/snapshot-copy-98j6njz9" was already processed
I1009 02:18:26.790799 1 event.go:285] Event(v1.ObjectReference{Kind:"VolumeSnapshot", Namespace:"kasten-io", Name:"snapshot-copy-d7lzfhqx", UID:"fd00d6c8-5ff5-400b-a71e-17a817eab19e", APIVersion:"snapshot.storage.k8s.io/v1", ResourceVersion:"3805120", FieldPath:""}): type: 'Normal' reason: 'SnapshotCreated' Snapshot kasten-io/snapshot-copy-d7lzfhqx was successfully created by the CSI driver.
I1009 02:18:26.790852 1 event.go:285] Event(v1.ObjectReference{Kind:"VolumeSnapshot", Namespace:"kasten-io", Name:"snapshot-copy-d7lzfhqx", UID:"fd00d6c8-5ff5-400b-a71e-17a817eab19e", APIVersion:"snapshot.storage.k8s.io/v1", ResourceVersion:"3805120", FieldPath:""}): type: 'Normal' reason: 'SnapshotReady' Snapshot kasten-io/snapshot-copy-d7lzfhqx is ready to use.
E1009 02:18:47.440838 1 snapshot_controller_base.go:403] could not sync snapshot "kasten-io/snapshot-copy-d7lzfhqx": snapshot controller failed to update snapshot-copy-d7lzfhqx on API server: Operation cannot be fulfilled on volumesnapshots.snapshot.storage.k8s.io "snapshot-copy-d7lzfhqx": StorageError: invalid object, Code: 4, Key: /registry/snapshot.storage.k8s.io/volumesnapshots/kasten-io/snapshot-copy-d7lzfhqx, ResourceVersion: 0, AdditionalErrorMsg: Precondition failed: UID in precondition: fd00d6c8-5ff5-400b-a71e-17a817eab19e, UID in object meta:
E1009 02:18:47.440951 1 snapshot_controller.go:1319] getSnapshotDriverName: failed to get snapshotContent: snapshot-copy-d7lzfhqx-content-cb57916a-78f9-4ac5-bbdd-8e546a720580
I1009 02:18:47.441039 1 snapshot_controller_base.go:335] deletion of content "snapshot-copy-d7lzfhqx-content-cb57916a-78f9-4ac5-bbdd-8e546a720580" was already processed
I1009 02:18:48.441838 1 snapshot_controller_base.go:269] deletion of snapshot "kasten-io/snapshot-copy-d7lzfhqx" was already processed
kubectl -n nfs-provisioner logs -f csi-nfs-controller-86548b95f7-dtwtq

I1009 02:16:52.874730 1 controller.go:1359] provision "kasten-io/kanister-pvc-6wdcg" class "storageclass-csi-nfs-sgnnas03": started
I1009 02:16:52.875021 1 event.go:298] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"kasten-io", Name:"kanister-pvc-6wdcg", UID:"52374e38-8562-4f26-809f-d1f858188578", APIVersion:"v1", ResourceVersion:"3804897", FieldPath:""}): type: 'Normal' reason: 'Provisioning' External provisioner is provisioning volume for claim "kasten-io/kanister-pvc-6wdcg"
W1009 02:16:52.890675 1 controller.go:1165] requested volume size 10737418240 is greater than the size 0 for the source snapshot snapshot-copy-z9sph6kg. Volume plugin needs to handle volume expansion.
I1009 02:16:53.842588 1 controller.go:923] successfully created PV pvc-52374e38-8562-4f26-809f-d1f858188578 for PVC kanister-pvc-6wdcg and csi volume name 192.168.7.107#k8s-nfs-sgnnas03#pvc-52374e38-8562-4f26-809f-d1f858188578##
I1009 02:16:53.842651 1 controller.go:1442] provision "kasten-io/kanister-pvc-6wdcg" class "storageclass-csi-nfs-sgnnas03": volume "pvc-52374e38-8562-4f26-809f-d1f858188578" provisioned
I1009 02:16:53.842671 1 controller.go:1455] provision "kasten-io/kanister-pvc-6wdcg" class "storageclass-csi-nfs-sgnnas03": succeeded
I1009 02:16:53.848631 1 event.go:298] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"kasten-io", Name:"kanister-pvc-6wdcg", UID:"52374e38-8562-4f26-809f-d1f858188578", APIVersion:"v1", ResourceVersion:"3804897", FieldPath:""}): type: 'Normal' reason: 'ProvisioningSucceeded' Successfully provisioned volume pvc-52374e38-8562-4f26-809f-d1f858188578
I1009 02:17:01.986130 1 controller.go:1359] provision "kasten-io/kanister-pvc-vmgvd" class "storageclass-csi-nfs-sgnnas03": started
I1009 02:17:01.986655 1 event.go:298] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"kasten-io", Name:"kanister-pvc-vmgvd", UID:"cf5a9272-3827-4b0e-ba3b-8dc6538fa1cc", APIVersion:"v1", ResourceVersion:"3456198", FieldPath:""}): type: 'Normal' reason: 'Provisioning' External provisioner is provisioning volume for claim "kasten-io/kanister-pvc-vmgvd"
W1009 02:17:01.992457 1 controller.go:934] Retrying syncing claim "cf5a9272-3827-4b0e-ba3b-8dc6538fa1cc", failure 870
E1009 02:17:01.992513 1 controller.go:957] error syncing claim "cf5a9272-3827-4b0e-ba3b-8dc6538fa1cc": failed to provision volume with StorageClass "storageclass-csi-nfs-sgnnas03": error getting handle for DataSource Type VolumeSnapshot by Name snapshot-copy-97sq5mkc: error getting snapshot snapshot-copy-97sq5mkc from api server: volumesnapshots.snapshot.storage.k8s.io "snapshot-copy-97sq5mkc" not found
I1009 02:17:01.992562 1 event.go:298] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"kasten-io", Name:"kanister-pvc-vmgvd", UID:"cf5a9272-3827-4b0e-ba3b-8dc6538fa1cc", APIVersion:"v1", ResourceVersion:"3456198", FieldPath:""}): type: 'Warning' reason: 'ProvisioningFailed' failed to provision volume with StorageClass "storageclass-csi-nfs-sgnnas03": error getting handle for DataSource Type VolumeSnapshot by Name snapshot-copy-97sq5mkc: error getting snapshot snapshot-copy-97sq5mkc from api server: volumesnapshots.snapshot.storage.k8s.io "snapshot-copy-97sq5mkc" not found
I1009 02:17:08.210011 1 controller.go:1502] delete "pvc-52374e38-8562-4f26-809f-d1f858188578": started
I1009 02:17:08.210099 1 controller.go:1279] volume pvc-52374e38-8562-4f26-809f-d1f858188578 does not need any deletion secrets
I1009 02:17:08.513172 1 controller.go:1517] delete "pvc-52374e38-8562-4f26-809f-d1f858188578": volume deleted
I1009 02:17:08.522130 1 controller.go:1562] delete "pvc-52374e38-8562-4f26-809f-d1f858188578": persistentvolume deleted succeeded
I1009 02:17:38.733271 1 controller.go:1359] provision "kasten-io/kanister-pvc-m5v6b" class "storageclass-csi-nfs-sgnnas03": started
I1009 02:17:38.733633 1 event.go:298] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"kasten-io", Name:"kanister-pvc-m5v6b", UID:"b61cc8ac-1ac9-4df1-9dbd-b0386a835cd5", APIVersion:"v1", ResourceVersion:"3805012", FieldPath:""}): type: 'Normal' reason: 'Provisioning' External provisioner is provisioning volume for claim "kasten-io/kanister-pvc-m5v6b"
W1009 02:17:38.765057 1 controller.go:1165] requested volume size 10737418240 is greater than the size 0 for the source snapshot snapshot-copy-98j6njz9. Volume plugin needs to handle volume expansion.
I1009 02:17:39.799207 1 controller.go:923] successfully created PV pvc-b61cc8ac-1ac9-4df1-9dbd-b0386a835cd5 for PVC kanister-pvc-m5v6b and csi volume name 192.168.7.107#k8s-nfs-sgnnas03#pvc-b61cc8ac-1ac9-4df1-9dbd-b0386a835cd5##
I1009 02:17:39.799259 1 controller.go:1442] provision "kasten-io/kanister-pvc-m5v6b" class "storageclass-csi-nfs-sgnnas03": volume "pvc-b61cc8ac-1ac9-4df1-9dbd-b0386a835cd5" provisioned
I1009 02:17:39.799278 1 controller.go:1455] provision "kasten-io/kanister-pvc-m5v6b" class "storageclass-csi-nfs-sgnnas03": succeeded
I1009 02:17:39.806493 1 event.go:298] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"kasten-io", Name:"kanister-pvc-m5v6b", UID:"b61cc8ac-1ac9-4df1-9dbd-b0386a835cd5", APIVersion:"v1", ResourceVersion:"3805012", FieldPath:""}): type: 'Normal' reason: 'ProvisioningSucceeded' Successfully provisioned volume pvc-b61cc8ac-1ac9-4df1-9dbd-b0386a835cd5
I1009 02:17:54.537285 1 controller.go:1502] delete "pvc-b61cc8ac-1ac9-4df1-9dbd-b0386a835cd5": started
I1009 02:17:54.537354 1 controller.go:1279] volume pvc-b61cc8ac-1ac9-4df1-9dbd-b0386a835cd5 does not need any deletion secrets
I1009 02:17:54.801304 1 controller.go:1517] delete "pvc-b61cc8ac-1ac9-4df1-9dbd-b0386a835cd5": volume deleted
I1009 02:17:54.809019 1 controller.go:1562] delete "pvc-b61cc8ac-1ac9-4df1-9dbd-b0386a835cd5": persistentvolume deleted succeeded
I1009 02:18:11.802158 1 controller.go:1359] provision "kasten-io/kanister-pvc-vr5nh" class "storageclass-csi-nfs-sgnnas03": started
I1009 02:18:11.802524 1 event.go:298] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"kasten-io", Name:"kanister-pvc-vr5nh", UID:"74be25d4-2722-493e-a744-c16fe06fa316", APIVersion:"v1", ResourceVersion:"3459689", FieldPath:""}): type: 'Normal' reason: 'Provisioning' External provisioner is provisioning volume for claim "kasten-io/kanister-pvc-vr5nh"
W1009 02:18:11.807400 1 controller.go:934] Retrying syncing claim "74be25d4-2722-493e-a744-c16fe06fa316", failure 862
E1009 02:18:11.807446 1 controller.go:957] error syncing claim "74be25d4-2722-493e-a744-c16fe06fa316": failed to provision volume with StorageClass "storageclass-csi-nfs-sgnnas03": error getting handle for DataSource Type VolumeSnapshot by Name snapshot-copy-9fn7t9w9: error getting snapshot snapshot-copy-9fn7t9w9 from api server: volumesnapshots.snapshot.storage.k8s.io "snapshot-copy-9fn7t9w9" not found
I1009 02:18:11.807473 1 event.go:298] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"kasten-io", Name:"kanister-pvc-vr5nh", UID:"74be25d4-2722-493e-a744-c16fe06fa316", APIVersion:"v1", ResourceVersion:"3459689", FieldPath:""}): type: 'Warning' reason: 'ProvisioningFailed' failed to provision volume with StorageClass "storageclass-csi-nfs-sgnnas03": error getting handle for DataSource Type VolumeSnapshot by Name snapshot-copy-9fn7t9w9: error getting snapshot snapshot-copy-9fn7t9w9 from api server: volumesnapshots.snapshot.storage.k8s.io "snapshot-copy-9fn7t9w9" not found
I1009 02:18:26.904136 1 controller.go:1359] provision "kasten-io/kanister-pvc-xqnqw" class "storageclass-csi-nfs-sgnnas03": started
I1009 02:18:26.904864 1 event.go:298] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"kasten-io", Name:"kanister-pvc-xqnqw", UID:"391c6f09-0bfc-4f3a-86de-b6f911ba9556", APIVersion:"v1", ResourceVersion:"3805129", FieldPath:""}): type: 'Normal' reason: 'Provisioning' External provisioner is provisioning volume for claim "kasten-io/kanister-pvc-xqnqw"
W1009 02:18:26.921678 1 controller.go:1165] requested volume size 10737418240 is greater than the size 0 for the source snapshot snapshot-copy-d7lzfhqx. Volume plugin needs to handle volume expansion.
I1009 02:18:27.798412 1 controller.go:923] successfully created PV pvc-391c6f09-0bfc-4f3a-86de-b6f911ba9556 for PVC kanister-pvc-xqnqw and csi volume name 192.168.7.107#k8s-nfs-sgnnas03#pvc-391c6f09-0bfc-4f3a-86de-b6f911ba9556##
I1009 02:18:27.798468 1 controller.go:1442] provision "kasten-io/kanister-pvc-xqnqw" class "storageclass-csi-nfs-sgnnas03": volume "pvc-391c6f09-0bfc-4f3a-86de-b6f911ba9556" provisioned
I1009 02:18:27.798486 1 controller.go:1455] provision "kasten-io/kanister-pvc-xqnqw" class "storageclass-csi-nfs-sgnnas03": succeeded
I1009 02:18:27.805876 1 event.go:298] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"kasten-io", Name:"kanister-pvc-xqnqw", UID:"391c6f09-0bfc-4f3a-86de-b6f911ba9556", APIVersion:"v1", ResourceVersion:"3805129", FieldPath:""}): type: 'Normal' reason: 'ProvisioningSucceeded' Successfully provisioned volume pvc-391c6f09-0bfc-4f3a-86de-b6f911ba9556
I1009 02:18:47.314169 1 controller.go:1502] delete "pvc-391c6f09-0bfc-4f3a-86de-b6f911ba9556": started
I1009 02:18:47.314410 1 controller.go:1279] volume pvc-391c6f09-0bfc-4f3a-86de-b6f911ba9556 does not need any deletion secrets
I1009 02:18:47.584930 1 controller.go:1517] delete "pvc-391c6f09-0bfc-4f3a-86de-b6f911ba9556": volume deleted
I1009 02:18:47.592197 1 controller.go:1562] delete "pvc-391c6f09-0bfc-4f3a-86de-b6f911ba9556": persistentvolume deleted succeeded
I1009 02:19:30.071250 1 controller.go:1359] provision "kasten-io/kanister-pvc-ctl27" class "storageclass-csi-nfs-sgnnas03": started
I1009 02:19:30.071545 1 event.go:298] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"kasten-io", Name:"kanister-pvc-ctl27", UID:"2c475943-1927-4c87-a8b1-58955ba1ea7e", APIVersion:"v1", ResourceVersion:"3414684", FieldPath:""}): type: 'Normal' reason: 'Provisioning' External provisioner is provisioning volume for claim "kasten-io/kanister-pvc-ctl27"
W1009 02:19:30.076612 1 controller.go:934] Retrying syncing claim "2c475943-1927-4c87-a8b1-58955ba1ea7e", failure 973
I1009 02:19:30.076651 1 event.go:298] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"kasten-io", Name:"kanister-pvc-ctl27", UID:"2c475943-1927-4c87-a8b1-58955ba1ea7e", APIVersion:"v1", ResourceVersion:"3414684", FieldPath:""}): type: 'Warning' reason: 'ProvisioningFailed' failed to provision volume with StorageClass "storageclass-csi-nfs-sgnnas03": error getting handle for DataSource Type VolumeSnapshot by Name snapshot-copy-687bvz4q: error getting snapshot snapshot-copy-687bvz4q from api server: volumesnapshots.snapshot.storage.k8s.io "snapshot-copy-687bvz4q" not found
E1009 02:19:30.076667 1 controller.go:957] error syncing claim "2c475943-1927-4c87-a8b1-58955ba1ea7e": failed to provision volume with StorageClass "storageclass-csi-nfs-sgnnas03": error getting handle for DataSource Type VolumeSnapshot by Name snapshot-copy-687bvz4q: error getting snapshot snapshot-copy-687bvz4q from api server: volumesnapshots.snapshot.storage.k8s.io "snapshot-copy-687bvz4q" not found
I1009 02:19:58.253318 1 controller.go:1359] provision "kasten-io/kanister-pvc-kgwjz" class "storageclass-csi-nfs-sgnnas03": started
I1009 02:19:58.253692 1 event.go:298] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"kasten-io", Name:"kanister-pvc-kgwjz", UID:"dba64628-de26-459f-a88b-0a0da18874f0", APIVersion:"v1", ResourceVersion:"3416004", FieldPath:""}): type: 'Normal' reason: 'Provisioning' External provisioner is provisioning volume for claim "kasten-io/kanister-pvc-kgwjz"
W1009 02:19:58.327351 1 controller.go:934] Retrying syncing claim "dba64628-de26-459f-a88b-0a0da18874f0", failure 970
E1009 02:19:58.327440 1 controller.go:957] error syncing claim "dba64628-de26-459f-a88b-0a0da18874f0": failed to provision volume with StorageClass "storageclass-csi-nfs-sgnnas03": error getting handle for DataSource Type VolumeSnapshot by Name snapshot-copy-82qwqv8j: error getting snapshot snapshot-copy-82qwqv8j from api server: volumesnapshots.snapshot.storage.k8s.io "snapshot-copy-82qwqv8j" not found
I1009 02:19:58.327533 1 event.go:298] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"kasten-io", Name:"kanister-pvc-kgwjz", UID:"dba64628-de26-459f-a88b-0a0da18874f0", APIVersion:"v1", ResourceVersion:"3416004", FieldPath:""}): type: 'Warning' reason: 'ProvisioningFailed' failed to provision volume with StorageClass "storageclass-csi-nfs-sgnnas03": error getting handle for DataSource Type VolumeSnapshot by Name snapshot-copy-82qwqv8j: error getting snapshot snapshot-copy-82qwqv8j from api server: volumesnapshots.snapshot.storage.k8s.io "snapshot-copy-82qwqv8j" not found
I1009 02:20:25.264842 1 controller.go:1359] provision "kasten-io/kanister-pvc-jdgkg" class "storageclass-csi-nfs-sgnnas03": started
I1009 02:20:25.265240 1 event.go:298] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"kasten-io", Name:"kanister-pvc-jdgkg", UID:"dec0665b-e312-4140-a981-64719855fb5a", APIVersion:"v1", ResourceVersion:"3417325", FieldPath:""}): type: 'Normal' reason: 'Provisioning' External provisioner is provisioning volume for claim "kasten-io/kanister-pvc-jdgkg"
W1009 02:20:25.270951 1 controller.go:934] Retrying syncing claim "dec0665b-e312-4140-a981-64719855fb5a", failure 967
E1009 02:20:25.270994 1 controller.go:957] error syncing claim "dec0665b-e312-4140-a981-64719855fb5a": failed to provision volume with StorageClass "storageclass-csi-nfs-sgnnas03": error getting handle for DataSource Type VolumeSnapshot by Name snapshot-copy-ss2zkx8t: error getting snapshot snapshot-copy-ss2zkx8t from api server: volumesnapshots.snapshot.storage.k8s.io "snapshot-copy-ss2zkx8t" not found
I1009 02:20:25.271051 1 event.go:298] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"kasten-io", Name:"kanister-pvc-jdgkg", UID:"dec0665b-e312-4140-a981-64719855fb5a", APIVersion:"v1", ResourceVersion:"3417325", FieldPath:""}): type: 'Warning' reason: 'ProvisioningFailed' failed to provision volume with StorageClass "storageclass-csi-nfs-sgnnas03": error getting handle for DataSource Type VolumeSnapshot by Name snapshot-copy-ss2zkx8t: error getting snapshot snapshot-copy-ss2zkx8t from api server: volumesnapshots.snapshot.storage.k8s.io "snapshot-copy-ss2zkx8t" not found
I1009 02:22:01.992628 1 controller.go:1359] provision "kasten-io/kanister-pvc-vmgvd" class "storageclass-csi-nfs-sgnnas03": started
I1009 02:22:01.992946 1 event.go:298] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"kasten-io", Name:"kanister-pvc-vmgvd", UID:"cf5a9272-3827-4b0e-ba3b-8dc6538fa1cc", APIVersion:"v1", ResourceVersion:"3456198", FieldPath:""}): type: 'Normal' reason: 'Provisioning' External provisioner is provisioning volume for claim "kasten-io/kanister-pvc-vmgvd"
W1009 02:22:01.997696 1 controller.go:934] Retrying syncing claim "cf5a9272-3827-4b0e-ba3b-8dc6538fa1cc", failure 871
E1009 02:22:01.997736 1 controller.go:957] error syncing claim "cf5a9272-3827-4b0e-ba3b-8dc6538fa1cc": failed to provision volume with StorageClass "storageclass-csi-nfs-sgnnas03": error getting handle for DataSource Type VolumeSnapshot by Name snapshot-copy-97sq5mkc: error getting snapshot snapshot-copy-97sq5mkc from api server: volumesnapshots.snapshot.storage.k8s.io "snapshot-copy-97sq5mkc" not found
I1009 02:22:01.997760 1 event.go:298] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"kasten-io", Name:"kanister-pvc-vmgvd", UID:"cf5a9272-3827-4b0e-ba3b-8dc6538fa1cc", APIVersion:"v1", ResourceVersion:"3456198", FieldPath:""}): type: 'Warning' reason: 'ProvisioningFailed' failed to provision volume with StorageClass "storageclass-csi-nfs-sgnnas03": error getting handle for DataSource Type VolumeSnapshot by Name snapshot-copy-97sq5mkc: error getting snapshot snapshot-copy-97sq5mkc from api server: volumesnapshots.snapshot.storage.k8s.io "snapshot-copy-97sq5mkc" not found
I1009 02:23:11.807769 1 controller.go:1359] provision "kasten-io/kanister-pvc-vr5nh" class "storageclass-csi-nfs-sgnnas03": started
I1009 02:23:11.808129 1 event.go:298] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"kasten-io", Name:"kanister-pvc-vr5nh", UID:"74be25d4-2722-493e-a744-c16fe06fa316", APIVersion:"v1", ResourceVersion:"3459689", FieldPath:""}): type: 'Normal' reason: 'Provisioning' External provisioner is provisioning volume for claim "kasten-io/kanister-pvc-vr5nh"
W1009 02:23:11.814167 1 controller.go:934] Retrying syncing claim "74be25d4-2722-493e-a744-c16fe06fa316", failure 863
E1009 02:23:11.814215 1 controller.go:957] error syncing claim "74be25d4-2722-493e-a744-c16fe06fa316": failed to provision volume with StorageClass "storageclass-csi-nfs-sgnnas03": error getting handle for DataSource Type VolumeSnapshot by Name snapshot-copy-9fn7t9w9: error getting snapshot snapshot-copy-9fn7t9w9 from api server: volumesnapshots.snapshot.storage.k8s.io "snapshot-copy-9fn7t9w9" not found
I1009 02:23:11.814248 1 event.go:298] Event(v1.ObjectReference{Kind:"PersistentVolumeClaim", Namespace:"kasten-io", Name:"kanister-pvc-vr5nh", UID:"74be25d4-2722-493e-a744-c16fe06fa316", APIVersion:"v1", ResourceVersion:"3459689", FieldPath:""}): type: 'Warning' reason: 'ProvisioningFailed' failed to provision volume with StorageClass "storageclass-csi-nfs-sgnnas03": error getting handle for DataSource Type VolumeSnapshot by Name snapshot-copy-9fn7t9w9: error getting snapshot snapshot-copy-9fn7t9w9 from api server: volumesnapshots.snapshot.storage.k8s.io "snapshot-copy-9fn7t9w9" not found
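The repeated `ProvisioningFailed ... "snapshot-copy-..." not found` errors above suggest the VolumeSnapshot that the Kanister PVC references no longer exists by the time the provisioner retries. A quick, hedged way to check whether these snapshot copies exist and are ready (the snapshot name below is taken from the log output; substitute the one from your own errors):

```shell
# List the VolumeSnapshots K10/Kanister creates in its own namespace
kubectl get volumesnapshot -n kasten-io

# Check whether a specific snapshot copy from the errors above exists
# and is marked ReadyToUse ("true" means the snapshot is usable as a
# PVC data source)
kubectl get volumesnapshot snapshot-copy-9fn7t9w9 -n kasten-io \
  -o jsonpath='{.status.readyToUse}'

# The bound VolumeSnapshotContent carries the CSI driver's snapshot handle
kubectl get volumesnapshotcontent
```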

 


@jackchuong Thank you for posting your question here.

I see that the NFS provisioner only recently added VolumeSnapshot support (https://github.com/kubernetes-csi/csi-driver-nfs/pull/430).
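Since snapshot support in csi-driver-nfs is recent, it is also worth confirming that the snapshot CRDs, the snapshot controller, and a VolumeSnapshotClass for the NFS driver are all in place. A minimal sanity check (the snapshot controller's namespace varies by distribution; on K3s it may differ from `kube-system`):

```shell
# Confirm the external-snapshotter CRDs are installed
kubectl get crd volumesnapshots.snapshot.storage.k8s.io \
  volumesnapshotcontents.snapshot.storage.k8s.io \
  volumesnapshotclasses.snapshot.storage.k8s.io

# Confirm a VolumeSnapshotClass exists for the nfs.csi.k8s.io driver
kubectl get volumesnapshotclass

# Verify the snapshot controller pod is running
kubectl get pods -A | grep -i snapshot-controller
```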

From the above primer output, the snapshotting functionality seems to be working properly.

From the provisioner's events and logs, I can see that the snapshot copy also becomes ready.

We will need to look at the debug logs to investigate the issue further.
There are a few recent changes in K10 that now run backups in rootless mode, which could also be the reason for the failure.
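One hedged way to see whether the data movers run rootless is to inspect the security context of the short-lived Kanister pods while an export is in progress (this is a generic sketch; the exact pod names and labels K10 uses may differ):

```shell
# While an export is running, dump each pod's name and pod-level
# security context in kasten-io to see if the data movers run as non-root
kubectl get pods -n kasten-io -o \
  jsonpath='{range .items[*]}{.metadata.name}{"\t"}{.spec.securityContext}{"\n"}{end}'
```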

Would you be able to open a case with K10 support by accessing `my.veeam.com`?

Please select `Veeam Kasten K10 by Trial` and upload the debug logs (https://docs.kasten.io/latest/operating/support.html#gathering-debugging-information) to the same case.
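The linked documentation describes the official way to gather debug information; as a rough fallback sketch, the K10 pod logs can also be collected manually:

```shell
# Sketch: collect logs from every pod in the kasten-io namespace
# into one file per pod, for attaching to the support case
for p in $(kubectl get pods -n kasten-io -o name); do
  kubectl logs -n kasten-io "$p" --all-containers > "${p#pod/}.log"
done
```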

 

