Solved

Immutable export failing to S3 Compatible



Hi,

 

I have created an immutable bucket, with object lock enabled, on a local test MinIO Operator setup. The location profile validates, but when the job runs, the export fails with:

 

Job failed to be executed

 ActionSet Failed

 Failed to stream backup to kopia repository

 Failed to exec command in pod: command terminated with exit code 1 

Details:

 

cause:
  fields:
  - name: message
    value: '{"message":"Failed to stream backup to kopia repository","function":"kasten.io/k10/kio/kanister/function.moverBackup","linenumber":192,"file":"kasten.io/k10/kio/kanister/function/mover_backup.go:192","cause":{"message":"Failed to exec command in pod: command terminated with exit code 1"}}'
  - name: actionSet
    value:
      metadata:
        creationTimestamp: 2022-06-28T21:59:28Z
        generateName: k10-backup-k10-dr-bp-2.1.1-7wmsz-catalog-svc-kasten-io-deployment-
        generation: 4
        labels:
          kanister.io/JobID: 417a734f-f72d-11ec-9a4b-2ef2d43bb0e6
        managedFields:
        - apiVersion: cr.kanister.io/v1alpha1
          fieldsType: FieldsV1
          fieldsV1:
            f:metadata:
              f:generateName: {}
              f:labels:
                .: {}
                f:kanister.io/JobID: {}
            f:spec:
              .: {}
              f:actions: {}
            f:status:
              .: {}
              f:actions: {}
              f:error:
                .: {}
                f:message: {}
              f:state: {}
          manager: Go-http-client
          operation: Update
          time: 2022-06-28T21:59:28Z
        name: k10-backup-k10-dr-bp-2.1.1-7wmsz-catalog-svc-kasten-io-dep7lkrt
        namespace: kasten-io
        resourceVersion: "4650"
        uid: 1c3a3df8-7ca9-4299-962f-a1d3d7638cd2
      spec:
        actions:
        - blueprint: k10-dr-bp-2.1.1-7wmsz
          name: backup
          object:
            apiVersion: ""
            group: ""
            kind: deployment
            name: catalog-svc
            namespace: kasten-io
            resource: ""
          options:
            cacheDirectory: /tmp/kopia-cache
            dataSource: http://localhost:8000/v0/backup
            filePath: kasten-io/catalog/model-store.db
            hostName: fcb37209-ee14-4adf-8d4b-678674f2303d.catalog-svc.catalog-pv-claim
            mountPath: /mnt/k10state
            objectStorePath: e4ac9b0c-1f53-42c3-9257-f32360bbfced/k10/repo/
            pod: catalog-svc-66b7c5f978-9xwcg
            userName: k10-admin
          podOverride:
            securityContext:
              runAsNonRoot: false
              runAsUser: 0
            tolerations:
            - effect: NoExecute
              key: node.kubernetes.io/not-ready
              operator: Exists
              tolerationSeconds: 300
            - effect: NoExecute
              key: node.kubernetes.io/unreachable
              operator: Exists
              tolerationSeconds: 300
          preferredVersion: v1.0.0-alpha
          profile:
            apiVersion: v1alpha1
            group: ""
            kind: profile
            name: dr-backup-k10-location-ccpj7
            namespace: kasten-io
            resource: ""
          secrets:
            artifactKey:
              apiVersion: ""
              group: ""
              kind: secret
              name: k10-dr-secret
              namespace: kasten-io
              resource: ""
      status:
        actions:
        - artifacts:
            snapshot:
              keyValue:
                backupIdentifier: "{{ .Phases.backupCatalog.Output.snapshotID }}"
                backupPath: "{{ .Options.mountPath }}"
                funcVersion: "{{ .Phases.backupCatalog.Output.version }}"
                objectStorePath: "{{ .Options.objectStorePath }}"
          blueprint: k10-dr-bp-2.1.1-7wmsz
          deferPhase:
            name: ""
            state: ""
          name: backup
          object:
            apiVersion: ""
            group: ""
            kind: deployment
            name: catalog-svc
            namespace: kasten-io
            resource: ""
          phases:
          - name: backupCatalog
            state: failed
        error:
          message: '{"message":"Failed to stream backup to kopia repository","function":"kasten.io/k10/kio/kanister/function.moverBackup","linenumber":192,"file":"kasten.io/k10/kio/kanister/function/mover_backup.go:192","cause":{"message":"Failed to exec command in pod: command terminated with exit code 1"}}'
        state: failed
  file: kasten.io/k10/kio/kanister/operation.go:114
  function: kasten.io/k10/kio/kanister.(*Operation).Execute
  linenumber: 114
  message: ActionSet Failed
message: Job failed to be executed
fields: []

 

The local backup is using the csi-hostpath driver, and local snapshots complete successfully.
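For reference, the bucket in this test was created along these lines. This is a sketch using the MinIO `mc` client; the alias, endpoint, credentials, and bucket name below are placeholders, not the actual values from this setup:

```shell
# Point mc at the local MinIO endpoint (placeholder URL and credentials)
mc alias set localminio http://minio.example.local:9000 ACCESS_KEY SECRET_KEY

# Object lock can only be enabled at bucket creation time
mc mb --with-lock localminio/k10-immutable

# Optionally apply a default retention so new objects are locked automatically
mc retention set --default GOVERNANCE 30d localminio/k10-immutable
```

Note that `--with-lock` must be passed when the bucket is created; object lock cannot be turned on for an existing bucket.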


Best answer by Geoff Burke 7 August 2022, 14:02


3 comments


I got some more time to dig into this. On a MinIO Operator multi-tenant setup in Kubernetes this works. So my hunch is that either it does not like the region setting (in the standalone command-line version you set the region through an environment variable), or MinIO immutability with Kasten only works with distributed MinIO, which I now think I might have seen in the docs somewhere. Still, you would think the bucket would not validate successfully in Kasten if something about the other setup bothered it.
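For anyone comparing setups: on a standalone (non-operator) MinIO the region is set through an environment variable before starting the server. A sketch; the exact variable name has changed across MinIO releases, so check the docs for the version in use:

```shell
# Standalone MinIO: set the server region before startup.
# Newer releases read MINIO_SITE_REGION; older ones used MINIO_REGION_NAME.
export MINIO_SITE_REGION=us-east-1
minio server /data
```

Whatever region the server reports needs to match the region configured in the K10 location profile.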


Hello @Geoff Burke,

 

Could you please perform the action that caused the above error and attach your debug logs? We will take a look at the process that is running at the time you get the error "Failed to exec command in pod: command terminated with exit code 1".

 

Thanks

Emmanuel
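The kind of detail being asked for here can be pulled with standard kubectl commands. A sketch, assuming a default K10 install in the kasten-io namespace (the ActionSet name is the one from the error output above; the executor label value may differ between K10 releases):

```shell
# Inspect the failed ActionSet reported in the error details
kubectl describe actionsets.cr.kanister.io \
  k10-backup-k10-dr-bp-2.1.1-7wmsz-catalog-svc-kasten-io-dep7lkrt \
  -n kasten-io

# Pull recent logs from the K10 executor pods, which drive the export job
kubectl logs -n kasten-io -l component=executor --tail=500
```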



Hi EBrockman,

 

I have not had this issue again after updating to the latest K10 release, and I have since done a number of test installs and runs, so all good now.

 

Thanks
