Hi,
I have created an immutable bucket, with object lock enabled, on a local test MinIO Operator setup. The location profile validates successfully; however, when the job runs, the export fails with:
Job failed to be executed
ActionSet Failed
Failed to stream backup to kopia repository
Failed to exec command in pod: command terminated with exit code 1
Details:
cause:
  fields:
  - name: message
    value: '{"message":"Failed to stream backup to kopia repository","function":"kasten.io/k10/kio/kanister/function.moverBackup","linenumber":192,"file":"kasten.io/k10/kio/kanister/function/mover_backup.go:192","cause":{"message":"Failed to exec command in pod: command terminated with exit code 1"}}'
  - name: actionSet
    value:
      metadata:
        creationTimestamp: 2022-06-28T21:59:28Z
        generateName: k10-backup-k10-dr-bp-2.1.1-7wmsz-catalog-svc-kasten-io-deployment-
        generation: 4
        labels:
          kanister.io/JobID: 417a734f-f72d-11ec-9a4b-2ef2d43bb0e6
        managedFields:
        - apiVersion: cr.kanister.io/v1alpha1
          fieldsType: FieldsV1
          fieldsV1:
            f:metadata:
              f:generateName: {}
              f:labels:
                .: {}
                f:kanister.io/JobID: {}
            f:spec:
              .: {}
              f:actions: {}
            f:status:
              .: {}
              f:actions: {}
              f:error:
                .: {}
                f:message: {}
              f:state: {}
          manager: Go-http-client
          operation: Update
          time: 2022-06-28T21:59:28Z
        name: k10-backup-k10-dr-bp-2.1.1-7wmsz-catalog-svc-kasten-io-dep7lkrt
        namespace: kasten-io
        resourceVersion: "4650"
        uid: 1c3a3df8-7ca9-4299-962f-a1d3d7638cd2
      spec:
        actions:
        - blueprint: k10-dr-bp-2.1.1-7wmsz
          name: backup
          object:
            apiVersion: ""
            group: ""
            kind: deployment
            name: catalog-svc
            namespace: kasten-io
            resource: ""
          options:
            cacheDirectory: /tmp/kopia-cache
            dataSource: http://localhost:8000/v0/backup
            filePath: kasten-io/catalog/model-store.db
            hostName: fcb37209-ee14-4adf-8d4b-678674f2303d.catalog-svc.catalog-pv-claim
            mountPath: /mnt/k10state
            objectStorePath: e4ac9b0c-1f53-42c3-9257-f32360bbfced/k10/repo/
            pod: catalog-svc-66b7c5f978-9xwcg
            userName: k10-admin
          podOverride:
            securityContext:
              runAsNonRoot: false
              runAsUser: 0
            tolerations:
            - effect: NoExecute
              key: node.kubernetes.io/not-ready
              operator: Exists
              tolerationSeconds: 300
            - effect: NoExecute
              key: node.kubernetes.io/unreachable
              operator: Exists
              tolerationSeconds: 300
          preferredVersion: v1.0.0-alpha
          profile:
            apiVersion: v1alpha1
            group: ""
            kind: profile
            name: dr-backup-k10-location-ccpj7
            namespace: kasten-io
            resource: ""
          secrets:
            artifactKey:
              apiVersion: ""
              group: ""
              kind: secret
              name: k10-dr-secret
              namespace: kasten-io
              resource: ""
      status:
        actions:
        - artifacts:
            snapshot:
              keyValue:
                backupIdentifier: "{{ .Phases.backupCatalog.Output.snapshotID }}"
                backupPath: "{{ .Options.mountPath }}"
                funcVersion: "{{ .Phases.backupCatalog.Output.version }}"
                objectStorePath: "{{ .Options.objectStorePath }}"
          blueprint: k10-dr-bp-2.1.1-7wmsz
          deferPhase:
            name: ""
            state: ""
          name: backup
          object:
            apiVersion: ""
            group: ""
            kind: deployment
            name: catalog-svc
            namespace: kasten-io
            resource: ""
          phases:
          - name: backupCatalog
            state: failed
        error:
          message: '{"message":"Failed to stream backup to kopia repository","function":"kasten.io/k10/kio/kanister/function.moverBackup","linenumber":192,"file":"kasten.io/k10/kio/kanister/function/mover_backup.go:192","cause":{"message":"Failed to exec command in pod: command terminated with exit code 1"}}'
        state: failed
  file: kasten.io/k10/kio/kanister/operation.go:114
  function: kasten.io/k10/kio/kanister.(*Operation).Execute
  linenumber: 114
  message: ActionSet Failed
message: Job failed to be executed
fields: []
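Happy to gather more detail if it helps. The backup command is exec'd inside the catalog-svc pod named in the ActionSet above, so commands along these lines should surface the underlying kopia error (the grep keyword filter is just my guess at relevant terms):

```shell
# Inspect the failed ActionSet in full (name taken from the dump above):
kubectl -n kasten-io describe actionset \
  k10-backup-k10-dr-bp-2.1.1-7wmsz-catalog-svc-kasten-io-dep7lkrt

# The kopia command runs inside the catalog-svc pod; its logs should
# carry the underlying error (keyword filter is an assumption):
kubectl -n kasten-io logs catalog-svc-66b7c5f978-9xwcg --all-containers \
  | grep -iE 'kopia|lock|retention|exit'
```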
The local backups use the csi-hostpath driver, and local snapshots complete successfully.
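For reference, the bucket setup followed this general pattern; the mc alias, bucket name, and retention mode/duration below are placeholders, not the exact values from my environment:

```shell
# Sketch of the MinIO bucket setup (alias/bucket/retention values are
# placeholders). Object lock can only be enabled at bucket creation:
mc mb --with-lock local-minio/k10-immutable

# Optional default retention so newly written objects are protected:
mc retention set --default GOVERNANCE 30d local-minio/k10-immutable
```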