kasten access error upstream request timeout


Userlevel 1
The login page http://192.168.10.222:30948/k10/dex/auth/ldap/login?back=&state=szusjmwgx5uuwxqbrxq6fwcgq fails with the error: upstream request timeout

Environment: k8s v1.28.2; K10 v6.5.10

 kubectl -n kasten-io logs auth-svc-5577fdff4b-94pfg 

{"File":"kasten.io/k10/kio/auth/handlers/ok.go","Function":"kasten.io/k10/kio/auth/handlers.(*OKHandler).ServeHTTP","Line":29,"cluster_name":"4de5112d-6615-497b-b647-f7a36d074f47","hostname":"auth-svc-5577fdff4b-94pfg","level":"info","msg":"Authenticated: no auth required","path":"/v0/authz/k10/dex/auth/ldap/login","status":200,"time":"2024-03-30T14:25:18.562Z","version":"6.5.10"}


8 comments

Userlevel 1

[root@master01 ~]# kubectl logs aggregatedapis-svc-8476dc4c59-8s6ct -n kasten-io
I0407 15:17:41.440031       1 serving.go:374] Generated self-signed cert (/tmp/apiserver.local.config/certificates/apiserver.crt, /tmp/apiserver.local.config/certificates/apiserver.key)
I0407 15:17:42.101942       1 handler.go:275] Adding GroupVersion apps.kio.kasten.io v1alpha1 to ResourceManager
I0407 15:17:42.261959       1 handler.go:275] Adding GroupVersion actions.kio.kasten.io v1alpha1 to ResourceManager
I0407 15:17:42.262570       1 handler.go:275] Adding GroupVersion vault.kio.kasten.io v1alpha1 to ResourceManager
I0407 15:17:42.268200       1 handler.go:275] Adding GroupVersion repositories.kio.kasten.io v1alpha1 to ResourceManager
{"File":"kasten.io/k10/kio/tracing/tracing.go","Function":"kasten.io/k10/kio/tracing.StartProfileBuffers","Line":109,"cluster_name":"7bc5d90f-8060-41f3-b616-b40ed48d5667","hostname":"aggregatedapis-svc-8476dc4c59-8s6ct","level":"info","msg":"no profile buffers configured","time":"2024-04-07T15:17:42.291Z","version":"6.5.9"}
Log message dropped (buffer): {"File":"kasten.io/k10/kio/tracing/tracing.go","Function":"kasten.io/k10/kio/tracing.StartProfileBuffers","Level":"info","Line":109,"Message":"no profile buffers configured","Time":"2024-04-07T15:17:42.291665344Z","cluster_name":"7bc5d90f-8060-41f3-b616-b40ed48d5667","hostname":"aggregatedapis-svc-8476dc4c59-8s6ct","version":"6.5.9"}
 Error: {"message":"Fluentbit connection error","function":"kasten.io/k10/kio/log/hooks/fluentbit.(*Hook).handle","linenumber":97,"file":"kasten.io/k10/kio/log/hooks/fluentbit/fluentbit.go:97","cause":{"message":"Fluentbit connection problem","function":"kasten.io/k10/kio/log/hooks/fluentbit.(*Hook).dial","linenumber":88,"file":"kasten.io/k10/kio/log/hooks/fluentbit/fluentbit.go:88","cause":{"message":"dial tcp 10.96.95.215:24224: connect: connection refused"}}}
I0407 15:17:42.354359       1 secure_serving.go:213] Serving securely on [::]:10250
I0407 15:17:42.354724       1 tlsconfig.go:240] "Starting DynamicServingCertificateController"
I0407 15:17:42.354780       1 requestheader_controller.go:169] Starting RequestHeaderAuthRequestController
I0407 15:17:42.354945       1 dynamic_serving_content.go:132] "Starting controller" name="serving-cert::/tmp/apiserver.local.config/certificates/apiserver.crt::/tmp/apiserver.local.config/certificates/apiserver.key"
I0407 15:17:42.356239       1 configmap_cafile_content.go:202] "Starting controller" name="client-ca::kube-system::extension-apiserver-authentication::client-ca-file"
I0407 15:17:42.356720       1 shared_informer.go:311] Waiting for caches to sync for RequestHeaderAuthRequestController
I0407 15:17:42.356763       1 shared_informer.go:311] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
I0407 15:17:42.357765       1 configmap_cafile_content.go:202] "Starting controller" name="client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file"
I0407 15:17:42.357784       1 shared_informer.go:311] Waiting for caches to sync for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file
I0407 15:17:42.465930       1 shared_informer.go:318] Caches are synced for RequestHeaderAuthRequestController
I0407 15:17:42.466097       1 shared_informer.go:318] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::client-ca-file
I0407 15:17:42.468171       1 shared_informer.go:318] Caches are synced for client-ca::kube-system::extension-apiserver-authentication::requestheader-client-ca-file
{"File":"kasten.io/k10/kio/aggapi/util/errors.go","Function":"kasten.io/k10/kio/aggapi/util.WrapError","Line":25,"cluster_name":"7bc5d90f-8060-41f3-b616-b40ed48d5667","error":{"message":"Could not retrieve artifacts for prefix search","function":"kasten.io/k10/kio/rest/clients.FetchArtifactsForSearchPrefix","linenumber":63,"file":"kasten.io/k10/kio/rest/clients/catalogclient.go:63","fields":[{"name":"config","value":{}},{"name":"retries","value":1}],"cause":{"message":"Get \"http://10.96.101.255:8000/v0/artifacts/prefixSearch?key=api-list-prefix-key-restorepointcontents.apps.kio.kasten.io\u0026limit=500\u0026reverseOrder=true\u0026value=%40\": dial tcp 10.96.101.255:8000: connect: connection refused"}},"hostname":"aggregatedapis-svc-8476dc4c59-8s6ct","level":"error","msg":"Internal Error","time":"2024-04-07T15:17:43.452Z","version":"6.5.9"}
{"File":"kasten.io/k10/kio/aggapi/util/errors.go","Function":"kasten.io/k10/kio/aggapi/util.WrapError","Line":25,"cluster_name":"7bc5d90f-8060-41f3-b616-b40ed48d5667","error":{"message":"Could not retrieve artifacts for prefix search","function":"kasten.io/k10/kio/rest/clients.FetchArtifactsForSearchPrefix","linenumber":63,"file":"kasten.io/k10/kio/rest/clients/catalogclient.go:63","fields":[{"name":"config","value":{}},{"name":"retries","value":1}],"cause":{"message":"Get \"http://10.96.101.255:8000/v0/artifacts/prefixSearch?key=api-list-prefix-key-clusterrestorepoints.apps.kio.kasten.io\u0026limit=500\u0026reverseOrder=true\u0026value=%40\": dial tcp 10.96.101.255:8000: connect: connection refused"}},"hostname":"aggregatedapis-svc-8476dc4c59-8s6ct","level":"error","msg":"Internal Error","time":"2024-04-07T15:17:43.454Z","version":"6.5.9"}
Log message dropped (buffer): {"File":"kasten.io/k10/kio/aggapi/util/errors.go","Function":"kasten.io/k10/kio/aggapi/util.WrapError","Level":"error","Line":25,"Message":"Internal Error","Time":"2024-04-07T15:17:43.452681822Z","cluster_name":"7bc5d90f-8060-41f3-b616-b40ed48d5667","error":{"message":"Could not retrieve artifacts for prefix search","function":"kasten.io/k10/kio/rest/clients.FetchArtifactsForSearchPrefix","linenumber":63,"file":"kasten.io/k10/kio/rest/clients/catalogclient.go:63","fields":[{"name":"config","value":{}},{"name":"retries","value":1}],"cause":{"message":"Get \"http://10.96.101.255:8000/v0/artifacts/prefixSearch?key=api-list-prefix-key-restorepointcontents.apps.kio.kasten.io\u0026limit=500\u0026reverseOrder=true\u0026value=%40\": dial tcp 10.96.101.255:8000: connect: connection refused"}},"hostname":"aggregatedapis-svc-8476dc4c59-8s6ct","version":"6.5.9"}
{"File":"kasten.io/k10/kio/aggapi/util/errors.go","Function":"kasten.io/k10/kio/aggapi/util.WrapError","Level":"error","Line":25,"Message":"Internal Error","Time":"2024-04-07T15:17:43.454788807Z","cluster_name":"7bc5d90f-8060-41f3-b616-b40ed48d5667","error":{"message":"Could not retrieve artifacts for prefix search","function":"kasten.io/k10/kio/rest/clients.FetchArtifactsForSearchPrefix","linenumber":63,"file":"kasten.io/k10/kio/rest/clients/catalogclient.go:63","fields":[{"name":"config","value":{}},{"name":"retries","value":1}],"cause":{"message":"Get \"http://10.96.101.255:8000/v0/artifacts/prefixSearch?key=api-list-prefix-key-clusterrestorepoints.apps.kio.kasten.io\u0026limit=500\u0026reverseOrder=true\u0026value=%40\": dial tcp 10.96.101.255:8000: connect: connection refused"}},"hostname":"aggregatedapis-svc-8476dc4c59-8s6ct","version":"6.5.9"}
 Error: {"message":"Fluentbit connection error","function":"kasten.io/k10/kio/log/hooks/fluentbit.(*Hook).handle","linenumber":97,"file":"kasten.io/k10/kio/log/hooks/fluentbit/fluentbit.go:97","cause":{"message":"Fluentbit connection problem","function":"kasten.io/k10/kio/log/hooks/fluentbit.(*Hook).dial","linenumber":88,"file":"kasten.io/k10/kio/log/hooks/fluentbit/fluentbit.go:88","cause":{"message":"dial tcp 10.96.95.215:24224: connect: connection refused"}}}
{"File":"kasten.io/k10/kio/aggapi/util/errors.go","Function":"kasten.io/k10/kio/aggapi/util.WrapError","Line":25,"cluster_name":"7bc5d90f-8060-41f3-b616-b40ed48d5667","error":{"message":"Could not retrieve artifacts for prefix search","function":"kasten.io/k10/kio/rest/clients.FetchArtifactsForSearchPrefix","linenumber":63,"file":"kasten.io/k10/kio/rest/clients/catalogclient.go:63","fields":[{"name":"config","value":{}},{"name":"retries","value":1}],"cause":{"message":"Get \"http://10.96.101.255:8000/v0/artifacts/prefixSearch?key=api-list-prefix-key-backupactions.actions.kio.kasten.io\u0026limit=300\u0026reverseOrder=true\u0026value=%40\": dial tcp 10.96.101.255:8000: connect: connection refused"}},"hostname":"aggregatedapis-svc-8476dc4c59-8s6ct","level":"error","msg":"Internal Error","time":"2024-04-07T15:18:07.821Z","version":"6.5.9"}

Userlevel 7
Badge +7

@jaiganeshjk 

Userlevel 6
Badge +2

@lidw123 Would you be able to share a brief explanation of the issue that you are seeing in K10?

Were you not able to access the K10 dashboard? Are any particular jobs failing?

The above logs from the aggregatedapis-svc pod show that the connection from that pod to catalog-svc is being refused.

If the catalog-svc pod is not running, that would explain this error.

If the pod has no issues and is ready, there could be a problem with the network connectivity between the pods.
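A quick way to rule out the first case, as a sketch (nothing K10-specific beyond the kasten-io namespace):

```
# Is the catalog-svc pod Ready, and has it been restarting?
kubectl -n kasten-io get pods | grep catalog-svc

# Recent warnings (failed probes, scheduling problems, etc.) mentioning the catalog
kubectl -n kasten-io get events --sort-by=.lastTimestamp | grep -i catalog
```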

You can also manually call the catalog service's healthz endpoint from the aggregatedapis-svc pod:

```

kubectl exec -it deploy/aggregatedapis-svc -n kasten-io -- curl http://catalog-svc:8000/v0/healthz

```

Userlevel 1

I have configured Active Directory authentication.

I am not able to access the K10 dashboard.

The page opens with an error: upstream request timeout
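Since the timeout happens on the /k10/dex/auth/ldap/login path, it is worth checking which upstream the gateway is timing out on, and whether the cluster can reach the AD/LDAP server at all. A sketch; ldap.example.com:389 is a placeholder for the host and port configured in your dex LDAP connector:

```
# What the gateway reports about the timed-out upstream
kubectl -n kasten-io logs deploy/gateway --since=1h | grep -i timeout

# Basic TCP reachability test to the directory server from inside the cluster
kubectl -n kasten-io run ldap-test --rm -it --restart=Never --image=busybox -- \
  nc -zv ldap.example.com 389
```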

[root@master01 ~]# kubectl exec -it deploy/aggregatedapis-svc -n kasten-io -- curl http://catalog-svc:8000/v0/healthz
{"namespace":"kasten-io","version":"6.5.10"}


[root@master01 ~]# kubectl get pods -n kasten-io
NAME                                     READY   STATUS    RESTARTS   AGE
aggregatedapis-svc-7ddfdddbd6-8xdz2      1/1     Running   0          10h
auth-svc-67b5696786-jrl9m                2/2     Running   0          10h
catalog-svc-6c66bbcbcf-6th8k             2/2     Running   0          10h
controllermanager-svc-69dbd6f6ff-blwf7   1/1     Running   0          10h
crypto-svc-d49b96b96-6v77j               4/4     Running   0          10h
dashboardbff-svc-7cf585d7c-dk6k8         2/2     Running   0          10h
executor-svc-cd4577cf8-dt46s             1/1     Running   0          10h
executor-svc-cd4577cf8-k526t             1/1     Running   0          10h
executor-svc-cd4577cf8-xr9v5             1/1     Running   0          10h
frontend-svc-778bc8ffd7-td8kp            1/1     Running   0          10h
gateway-6f4c79cfb8-n875n                 1/1     Running   0          10h
jobs-svc-54864bf959-d4br4                1/1     Running   0          10h
k10-grafana-86f8c8798-mdwms              1/1     Running   0          10h
kanister-svc-67d4588546-wjpff            1/1     Running   0          10h
logging-svc-67b56c969f-wcvhh             1/1     Running   0          10h
metering-svc-5d9d4b447f-cjl42            1/1     Running   0          10h
prometheus-server-5467dd9c77-gwdkt       2/2     Running   0          11h
state-svc-787f8b867-ctq29                3/3     Running   0          10h

Userlevel 1

@jaiganeshjk 

Userlevel 1

[root@master01 ~]# kubectl exec -it deploy/aggregatedapis-svc -n kasten-io -- curl http://auth-svc:8000/v0/healthz

{"namespace":"kasten-io","version”:”6.5.10”}

[root@master01 ~]# kubectl exec -it deploy/aggregatedapis-svc -n kasten-io -- curl http://catalog-svc:8000/v0/healthz

{"namespace":"kasten-io","version”:”6.5.10”}

[root@master01 ~]# kubectl exec -it deploy/aggregatedapis-svc -n kasten-io -- curl http://controllermanager-svc:8000/v0/healthz

{"namespace":"kasten-io","version":"6.5.10"}

[root@master01 ~]# kubectl exec -it deploy/aggregatedapis-svc -n kasten-io -- curl http://crypto-svc:8000/v0/healthz

{"namespace":"kasten-io","version":"6.5.10"}

 

 

[root@master01 ~]# kubectl exec -it deploy/aggregatedapis-svc -n kasten-io -- curl http://dex:8000/v0/healthz

 

<No information>

 

[root@master01 ~]# kubectl exec -it deploy/aggregatedapis-svc -n kasten-io -- curl http://executor-svc:8000/v0/healthz

{"namespace":"kasten-io","version":"6.5.10"}

[root@master01 ~]# kubectl exec -it deploy/aggregatedapis-svc -n kasten-io -- curl http://frontend-svc:8000/v0/healthz

<html>

<head><title>301 Moved Permanently</title></head>

<body>

<center><h1>301 Moved Permanently</h1></center>

<hr><center>nginx</center>

</body>

</html>

[root@master01 ~]# kubectl exec -it deploy/aggregatedapis-svc -n kasten-io -- curl http://jobs-svc:8000/v0/healthz

{"namespace":"kasten-io","version":"6.5.10"}

[root@master01 ~]# kubectl exec -it deploy/aggregatedapis-svc -n kasten-io -- curl http://kanister-svc:8000/v0/healthz

{"alive":true,"version

[root@master01 ~]# kubectl exec -it deploy/aggregatedapis-svc -n kasten-io -- curl http://logging-svc:8000/v0/healthz

{"namespace":"kasten-io","version":"6.5.10"}

[root@master01 ~]# kubectl exec -it deploy/aggregatedapis-svc -n kasten-io -- curl http://state-svc:8000/v0/healthz

{"namespace":"kasten-io","version":"6.5.10"}

Userlevel 6
Badge +2

Would you be able to share the logs from auth-svc, since the authentication seems to be causing the issue?

Since this might involve more in-depth debugging, would you be able to open a case at https://my.veeam.com/, selecting Kasten by Veeam K10 Trial under product?
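For reference, a quick way to bundle the logs such a case usually needs, as a sketch (assuming the dex container lives in the auth-svc pod):

```
# Auth stack (auth-svc and, if present, the dex sidecar) plus the gateway
kubectl -n kasten-io logs deploy/auth-svc --all-containers --since=24h > auth-svc.log
kubectl -n kasten-io logs deploy/gateway --since=24h > gateway.log
```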

 

Userlevel 1

@jaiganeshjk   

I've already opened a case.

Case #07219420

 

Thank you
