I have installed Kasten K10 on a Kubernetes cluster on AWS and am trying to access the dashboard with:
kubectl --namespace kasten-io port-forward service/gateway 8080:8000
I get the errors below and can't get it to work.
kubectl get pods -n kasten-io
NAME READY STATUS RESTARTS AGE
aggregatedapis-svc-bc84999d9-j9l2z 1/1 Running 0 61m
auth-svc-584d66855c-nc89j 1/1 Running 0 61m
catalog-svc-5495cd9758-pjkbr 2/2 Running 0 61m
controllermanager-svc-5f4bf655bf-v5wn2 1/1 Running 0 61m
crypto-svc-6d495cb5b4-mx52b 4/4 Running 0 61m
dashboardbff-svc-8684d85777-zbwcr 2/2 Running 0 61m
executor-svc-7bc57f5c54-7zjdx 1/1 Running 0 61m
executor-svc-7bc57f5c54-jn4d6 1/1 Running 0 61m
executor-svc-7bc57f5c54-kdprp 1/1 Running 0 61m
frontend-svc-7c95c877db-nhbs8 1/1 Running 0 61m
gateway-748544f574-8rtmn 1/1 Running 0 5m59s
jobs-svc-79547bfb5d-2zjzz 1/1 Running 0 61m
k10-grafana-55b8c5bd8f-g4hm8 1/1 Running 0 61m
kanister-svc-6659cdb9cd-cjkqm 1/1 Running 0 61m
logging-svc-644799b5f6-nl5k8 1/1 Running 0 61m
metering-svc-599bfd5d68-2f8bg 1/1 Running 0 61m
prometheus-server-66bc65bd44-n2vnh 2/2 Running 0 61m
state-svc-78854d9c6d-2kw9h 3/3 Running 0 61m
kubectl get svc -n kasten-io
NAME TYPE CLUSTER-IP EXTERNAL-IP PORT(S) AGE
aggregatedapis-svc ClusterIP 172.20.198.161 <none> 443/TCP 62m
auth-svc ClusterIP 172.20.249.241 <none> 8000/TCP 63m
catalog-svc ClusterIP 172.20.15.168 <none> 8000/TCP 63m
controllermanager-svc ClusterIP 172.20.154.149 <none> 8000/TCP,18000/TCP 63m
crypto-svc ClusterIP 172.20.12.246 <none> 8000/TCP,8001/TCP,8002/TCP,8003/TCP 63m
dashboardbff-svc ClusterIP 172.20.107.124 <none> 8000/TCP,8001/TCP 63m
executor-svc ClusterIP 172.20.226.51 <none> 8000/TCP 63m
frontend-svc ClusterIP 172.20.233.228 <none> 8000/TCP 62m
gateway NodePort 172.20.203.155 <none> 80:32175/TCP 63m
gateway-admin ClusterIP 172.20.250.102 <none> 8877/TCP 63m
jobs-svc ClusterIP 172.20.3.190 <none> 8000/TCP 63m
k10-grafana ClusterIP 172.20.221.174 <none> 80/TCP 63m
kanister-svc ClusterIP 172.20.149.44 <none> 8000/TCP 62m
logging-svc ClusterIP 172.20.139.224 <none> 8000/TCP,24224/TCP,24225/TCP 62m
metering-svc ClusterIP 172.20.6.26 <none> 8000/TCP 62m
prometheus-server ClusterIP 172.20.182.213 <none> 80/TCP 63m
prometheus-server-exp ClusterIP 172.20.115.38 <none> 80/TCP 63m
state-svc ClusterIP 172.20.217.68 <none> 8000/TCP,8001/TCP,8002/TCP 62m
Logs of dashboardbff-svc pod
Error: {"message":"Fluentbit connection error","function":"kasten.io/k10/kio/log/hooks/fluentbit.(*Hook).handle","linenumber":97,"file":"kasten.io/k10/kio/log/hooks/fluentbit/fluentbit.go:97","cause":{"message":"Fluentbit connection problem","function":"kasten.io/k10/kio/log/hooks/fluentbit.(*Hook).dial","linenumber":88,"file":"kasten.io/k10/kio/log/hooks/fluentbit/fluentbit.go:88","cause":{"message":"dial tcp 172.20.139.224:24224: connect: connection refused"}}}
{"File":"kasten.io/k10/kio/bff/artifact_type_poller.go","Function":"kasten.io/k10/kio/bff.(*ArtifactTypeMetrics).Run","Line":89,"cluster_name":"e64de663-d66a-4f1e-8b3b-f24789bd70a0","error":"Internal error occurred: {\"message\":\"Could not retrieve artifacts for prefix search\",\"function\":\"kasten.io/k10/kio/rest/clients.FetchArtifactsForSearchPrefix\",\"linenumber\":63,\"file\":\"kasten.io/k10/kio/rest/clients/catalogclient.go:63\",\"fields\":/{\"name\":\"config\",\"value\":{}},{\"name\":\"retries\",\"value\":1}],\"cause\":{\"message\":\"Get \\\"http://172.20.15.168:8000/v0/artifacts/prefixSearch?key=api-list-prefix-key-backupactions.actions.kio.kasten.io\\u0026limit=300\\u0026reverseOrder=true\\u0026value=%40\\\": dial tcp 172.20.15.168:8000: connect: connection refused\"}}","hostname":"dashboardbff-svc-8684d85777-zbwcr","level":"error","msg":"failed to get BackupActions","time":"2024-02-13T19:08:50.355Z","version":"6.5.4"}
Log message dropped (buffer): {"File":"kasten.io/k10/kio/bff/artifact_type_poller.go","Function":"kasten.io/k10/kio/bff.(*ArtifactTypeMetrics).Run","Level":"error","Line":89,"Message":"failed to get BackupActions","Time":"2024-02-13T19:08:50.355117471Z","cluster_name":"e64de663-d66a-4f1e-8b3b-f24789bd70a0","error":"Internal error occurred: {\"message\":\"Could not retrieve artifacts for prefix search\",\"function\":\"kasten.io/k10/kio/rest/clients.FetchArtifactsForSearchPrefix\",\"linenumber\":63,\"file\":\"kasten.io/k10/kio/rest/clients/catalogclient.go:63\",\"fields\":o{\"name\":\"config\",\"value\":{}},{\"name\":\"retries\",\"value\":1}],\"cause\":{\"message\":\"Get \\\"http://172.20.15.168:8000/v0/artifacts/prefixSearch?key=api-list-prefix-key-backupactions.actions.kio.kasten.io\\u0026limit=300\\u0026reverseOrder=true\\u0026value=%40\\\": dial tcp 172.20.15.168:8000: connect: connection refused\"}}","hostname":"dashboardbff-svc-8684d85777-zbwcr","version":"6.5.4"}
Error: {"message":"Fluentbit connection error","function":"kasten.io/k10/kio/log/hooks/fluentbit.(*Hook).handle","linenumber":97,"file":"kasten.io/k10/kio/log/hooks/fluentbit/fluentbit.go:97","cause":{"message":"Fluentbit connection problem","function":"kasten.io/k10/kio/log/hooks/fluentbit.(*Hook).dial","linenumber":88,"file":"kasten.io/k10/kio/log/hooks/fluentbit/fluentbit.go:88","cause":{"message":"dial tcp 172.20.139.224:24224: connect: connection refused"}}}
{"File":"kasten.io/k10/kio/bff/artifact_type_poller.go","Function":"kasten.io/k10/kio/bff.(*ArtifactTypeMetrics).Run","Line":89,"cluster_name":"e64de663-d66a-4f1e-8b3b-f24789bd70a0","error":"Internal error occurred: {\"message\":\"Could not retrieve artifacts for prefix search\",\"function\":\"kasten.io/k10/kio/rest/clients.FetchArtifactsForSearchPrefix\",\"linenumber\":63,\"file\":\"kasten.io/k10/kio/rest/clients/catalogclient.go:63\",\"fields\":/{\"name\":\"config\",\"value\":{}},{\"name\":\"retries\",\"value\":1}],\"cause\":{\"message\":\"Get \\\"http://172.20.15.168:8000/v0/artifacts/prefixSearch?key=api-list-prefix-key-backupactions.actions.kio.kasten.io\\u0026limit=300\\u0026reverseOrder=true\\u0026value=%40\\\": dial tcp 172.20.15.168:8000: connect: connection refused\"}}","hostname":"dashboardbff-svc-8684d85777-zbwcr","level":"error","msg":"failed to get BackupActions","time":"2024-02-13T19:09:20.358Z","version":"6.5.4"}
Log message dropped (buffer): {"File":"kasten.io/k10/kio/bff/artifact_type_poller.go","Function":"kasten.io/k10/kio/bff.(*ArtifactTypeMetrics).Run","Level":"error","Line":89,"Message":"failed to get BackupActions","Time":"2024-02-13T19:09:20.358230604Z","cluster_name":"e64de663-d66a-4f1e-8b3b-f24789bd70a0","error":"Internal error occurred: {\"message\":\"Could not retrieve artifacts for prefix search\",\"function\":\"kasten.io/k10/kio/rest/clients.FetchArtifactsForSearchPrefix\",\"linenumber\":63,\"file\":\"kasten.io/k10/kio/rest/clients/catalogclient.go:63\",\"fields\":o{\"name\":\"config\",\"value\":{}},{\"name\":\"retries\",\"value\":1}],\"cause\":{\"message\":\"Get \\\"http://172.20.15.168:8000/v0/artifacts/prefixSearch?key=api-list-prefix-key-backupactions.actions.kio.kasten.io\\u0026limit=300\\u0026reverseOrder=true\\u0026value=%40\\\": dial tcp 172.20.15.168:8000: connect: connection refused\"}}","hostname":"dashboardbff-svc-8684d85777-zbwcr","version":"6.5.4"}
Error: {"message":"Fluentbit connection error","function":"kasten.io/k10/kio/log/hooks/fluentbit.(*Hook).handle","linenumber":97,"file":"kasten.io/k10/kio/log/hooks/fluentbit/fluentbit.go:97","cause":{"message":"Fluentbit connection problem","function":"kasten.io/k10/kio/log/hooks/fluentbit.(*Hook).dial","linenumber":88,"file":"kasten.io/k10/kio/log/hooks/fluentbit/fluentbit.go:88","cause":{"message":"dial tcp 172.20.139.224:24224: connect: connection refused"}}}
{"File":"kasten.io/k10/kio/bff/artifact_type_poller.go","Function":"kasten.io/k10/kio/bff.(*ArtifactTypeMetrics).Run","Line":89,"cluster_name":"e64de663-d66a-4f1e-8b3b-f24789bd70a0","error":"Internal error occurred: {\"message\":\"Could not retrieve artifacts for prefix search\",\"function\":\"kasten.io/k10/kio/rest/clients.FetchArtifactsForSearchPrefix\",\"linenumber\":63,\"file\":\"kasten.io/k10/kio/rest/clients/catalogclient.go:63\",\"fields\":/{\"name\":\"config\",\"value\":{}},{\"name\":\"retries\",\"value\":1}],\"cause\":{\"message\":\"Get \\\"http://172.20.15.168:8000/v0/artifacts/prefixSearch?key=api-list-prefix-key-backupactions.actions.kio.kasten.io\\u0026limit=300\\u0026reverseOrder=true\\u0026value=%40\\\": dial tcp 172.20.15.168:8000: connect: connection refused\"}}","hostname":"dashboardbff-svc-8684d85777-zbwcr","level":"error","msg":"failed to get BackupActions","time":"2024-02-13T19:09:50.348Z","version":"6.5.4"}
Log message dropped (buffer): {"File":"kasten.io/k10/kio/bff/artifact_type_poller.go","Function":"kasten.io/k10/kio/bff.(*ArtifactTypeMetrics).Run","Level":"error","Line":89,"Message":"failed to get BackupActions","Time":"2024-02-13T19:09:50.348380383Z","cluster_name":"e64de663-d66a-4f1e-8b3b-f24789bd70a0","error":"Internal error occurred: {\"message\":\"Could not retrieve artifacts for prefix search\",\"function\":\"kasten.io/k10/kio/rest/clients.FetchArtifactsForSearchPrefix\",\"linenumber\":63,\"file\":\"kasten.io/k10/kio/rest/clients/catalogclient.go:63\",\"fields\":o{\"name\":\"config\",\"value\":{}},{\"name\":\"retries\",\"value\":1}],\"cause\":{\"message\":\"Get \\\"http://172.20.15.168:8000/v0/artifacts/prefixSearch?key=api-list-prefix-key-backupactions.actions.kio.kasten.io\\u0026limit=300\\u0026reverseOrder=true\\u0026value=%40\\\": dial tcp 172.20.15.168:8000: connect: connection refused\"}}","hostname":"dashboardbff-svc-8684d85777-zbwcr","version":"6.5.4"}
Error: {"message":"Fluentbit connection error","function":"kasten.io/k10/kio/log/hooks/fluentbit.(*Hook).handle","linenumber":97,"file":"kasten.io/k10/kio/log/hooks/fluentbit/fluentbit.go:97","cause":{"message":"Fluentbit connection problem","function":"kasten.io/k10/kio/log/hooks/fluentbit.(*Hook).dial","linenumber":88,"file":"kasten.io/k10/kio/log/hooks/fluentbit/fluentbit.go:88","cause":{"message":"dial tcp 172.20.139.224:24224: connect: connection refused"}}}
{"File":"kasten.io/k10/kio/bff/artifact_type_poller.go","Function":"kasten.io/k10/kio/bff.(*ArtifactTypeMetrics).Run","Line":89,"cluster_name":"e64de663-d66a-4f1e-8b3b-f24789bd70a0","error":"Internal error occurred: {\"message\":\"Could not retrieve artifacts for prefix search\",\"function\":\"kasten.io/k10/kio/rest/clients.FetchArtifactsForSearchPrefix\",\"linenumber\":63,\"file\":\"kasten.io/k10/kio/rest/clients/catalogclient.go:63\",\"fields\":/{\"name\":\"config\",\"value\":{}},{\"name\":\"retries\",\"value\":1}],\"cause\":{\"message\":\"Get \\\"http://172.20.15.168:8000/v0/artifacts/prefixSearch?key=api-list-prefix-key-backupactions.actions.kio.kasten.io\\u0026limit=300\\u0026reverseOrder=true\\u0026value=%40\\\": dial tcp 172.20.15.168:8000: connect: connection refused\"}}","hostname":"dashboardbff-svc-8684d85777-zbwcr","level":"error","msg":"failed to get BackupActions","time":"2024-02-13T19:10:20.353Z","version":"6.5.4"}
Log message dropped (buffer): {"File":"kasten.io/k10/kio/bff/artifact_type_poller.go","Function":"kasten.io/k10/kio/bff.(*ArtifactTypeMetrics).Run","Level":"error","Line":89,"Message":"failed to get BackupActions","Time":"2024-02-13T19:10:20.353711765Z","cluster_name":"e64de663-d66a-4f1e-8b3b-f24789bd70a0","error":"Internal error occurred: {\"message\":\"Could not retrieve artifacts for prefix search\",\"function\":\"kasten.io/k10/kio/rest/clients.FetchArtifactsForSearchPrefix\",\"linenumber\":63,\"file\":\"kasten.io/k10/kio/rest/clients/catalogclient.go:63\",\"fields\":o{\"name\":\"config\",\"value\":{}},{\"name\":\"retries\",\"value\":1}],\"cause\":{\"message\":\"Get \\\"http://172.20.15.168:8000/v0/artifacts/prefixSearch?key=api-list-prefix-key-backupactions.actions.kio.kasten.io\\u0026limit=300\\u0026reverseOrder=true\\u0026value=%40\\\": dial tcp 172.20.15.168:8000: connect: connection refused\"}}","hostname":"dashboardbff-svc-8684d85777-zbwcr","version":"6.5.4"}
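Both connection-refused targets above are ClusterIPs from the service list (172.20.139.224 is logging-svc, 172.20.15.168 is catalog-svc), so a first check is whether those services had ready endpoints at the time. A generic way to verify, nothing K10-specific:
kubectl get endpoints logging-svc catalog-svc -n kasten-io
kubectl describe svc logging-svc catalog-svc -n kasten-io
Empty endpoints would mean the backing pods were not ready yet; populated endpoints with continued refusals would point at a network or policy problem instead.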
Below are the logs from the gateway pod. Is this an RBAC issue?
ACCESS [2024-02-13T22:33:04.517Z] "GET / HTTP/1.1" 404 NR 0 0 0 - "10.112.16.131" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/121.0.0.0 Safari/537.36" "c98ddae0-08ae-465f-9b90-c3f6153bf03f" "10.112.16.131:31644" "-"
ACCESS [2024-02-13T22:33:06.412Z] "GET / HTTP/1.1" 404 NR 0 0 0 - "10.112.16.131" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/121.0.0.0 Safari/537.36" "396af7b9-53ee-4f06-b1bc-ac1060169be4" "10.112.16.131:31644" "-"
2024-02-13 22:33:29 diagd 3.9.1 [P16TThreadPoolExecutor-0_0] INFO: 495610EF-9F4D-4753-AE15-8FE6AA5C8B13: 127.0.0.1 "GET /metrics" 20ms 200 success
2024-02-13 22:33:33 diagd 3.9.1 [P16TAEW] INFO: -global-: <RichStatus BAD error='Pod labels are not mounted in the Ambassador container; Kubernetes Ingress support is likely to be limited' hostname='gateway-65cb864bf7-nsphd' version='3.9.1'>
2024-02-13 22:33:33 diagd 3.9.1 [P16TAEW] ERROR: ERROR ERROR ERROR Starting with configuration errors
2024-02-13 22:33:33 diagd 3.9.1 [P16TAEW] INFO: EnvoyConfig: Generating V3
2024-02-13 22:33:33 diagd 3.9.1 [P16TAEW] INFO: V3Listener <V3Listener HTTP ambassadorlistener on 0.0.0.0:8000 [XFP]>: generated ===========================
2024-02-13 22:33:33 diagd 3.9.1 [P16TAEW] INFO: V3Ready: ==== listen on 127.0.0.1:8006
time="2024-02-13 22:33:33.4078" level=info msg="Loaded file /ambassador/envoy/envoy.json" func=github.com/emissary-ingress/emissary/v3/pkg/ambex.Decode file="/go/pkg/ambex/main.go:281" CMD=entrypoint PID=1 THREAD=/ambex/main-loop
time="2024-02-13 22:33:33.4089" level=info msg="Saved snapshot v9" func=github.com/emissary-ingress/emissary/v3/pkg/ambex.csDump file="/go/pkg/ambex/main.go:351" CMD=entrypoint PID=1 THREAD=/ambex/main-loop
time="2024-02-13 22:33:33.4094" level=info msg="Pushing snapshot v9" func=github.com/emissary-ingress/emissary/v3/pkg/ambex.updaterWithTicker file="/go/pkg/ambex/ratelimit.go:159" CMD=entrypoint PID=1 THREAD=/ambex/updater
2024-02-13 22:33:33 diagd 3.9.1 [P16TAEW] INFO: configuration updated (incremental) from snapshot snapshot (S18 L1 G10 C9)
2024-02-13 22:33:33 diagd 3.9.1 [P16TAEW] INFO: error Pod labels are not mounted in the Ambassador container; Kubernetes Ingress support is likely to be limited
We recently moved the gateway svc port from 8000 to 80.
The port-forward command below should let you access K10:
kubectl --namespace kasten-io port-forward service/gateway 8080:80
Documentation - https://docs.kasten.io/latest/access/dashboard.html#access-via-kubectl
Please try it out and let me know if this doesn’t work.
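For reference, if the gateway port ever changes again, you can confirm what the Service currently exposes before port-forwarding (a generic check, not K10-specific):
kubectl get svc gateway -n kasten-io
kubectl get svc gateway -n kasten-io -o jsonpath='{.spec.ports[*].port}'
Then forward your local port to whichever service port is listed.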
Hello,
Thanks for the response. I was able to port-forward and get the dashboard, but there are error logs in the dashboardbff-svc and gateway pods, and the dashboard also shows a network error.
dashboardbff-svc pod logs -
Error: {"message":"Fluentbit connection error","function":"kasten.io/k10/kio/log/hooks/fluentbit.(*Hook).handle","linenumber":97,"file":"kasten.io/k10/kio/log/hooks/fluentbit/fluentbit.go:97","cause":{"message":"Fluentbit connection problem","function":"kasten.io/k10/kio/log/hooks/fluentbit.(*Hook).dial","linenumber":88,"file":"kasten.io/k10/kio/log/hooks/fluentbit/fluentbit.go:88","cause":{"message":"dial tcp 172.20.139.224:24224: i/o timeout"}}}
Log message dropped (buffer): {"File":"kasten.io/k10/rest/srv/dashboardbffserver/kio_frontend_log_handler.go","Function":"kasten.io/k10/rest/srv/dashboardbffserver.(*frontendLogHandler).log","Level":"error","Line":34,"Message":"Response data","Time":"2024-02-14T17:05:39.097668169Z","cluster_name":"e64de663-d66a-4f1e-8b3b-f24789bd70a0","data":a"upstream request timeout"],"hostname":"dashboardbff-svc-74477d7749-lxwjz","origin":"frontend","sessionId":"5f4294d0-e66c-4fc0-a864-3dab4a12efc9","version":"6.5.4"}
{"File":"kasten.io/k10/rest/srv/dashboardbffserver/kio_frontend_log_handler.go","Function":"kasten.io/k10/rest/srv/dashboardbffserver.(*frontendLogHandler).log","Level":"info","Line":30,"Message":"Network error debug info","Time":"2024-02-14T17:05:39.09785307Z","cluster_name":"e64de663-d66a-4f1e-8b3b-f24789bd70a0","data":a],"hostname":"dashboardbff-svc-74477d7749-lxwjz","origin":"frontend","sessionId":"5f4294d0-e66c-4fc0-a864-3dab4a12efc9","version":"6.5.4"}
{"File":"kasten.io/k10/rest/srv/dashboardbffserver/kio_frontend_log_handler.go","Function":"kasten.io/k10/rest/srv/dashboardbffserver.(*frontendLogHandler).log","Level":"error","Line":34,"Message":"Response status","Time":"2024-02-14T17:05:39.114611471Z","cluster_name":"e64de663-d66a-4f1e-8b3b-f24789bd70a0","data":a504],"hostname":"dashboardbff-svc-74477d7749-lxwjz","origin":"frontend","sessionId":"5f4294d0-e66c-4fc0-a864-3dab4a12efc9","version":"6.5.4"}
{"File":"kasten.io/k10/rest/srv/dashboardbffserver/kio_frontend_log_handler.go","Function":"kasten.io/k10/rest/srv/dashboardbffserver.(*frontendLogHandler).log","Level":"error","Line":34,"Message":"Response headers","Time":"2024-02-14T17:05:39.133562583Z","cluster_name":"e64de663-d66a-4f1e-8b3b-f24789bd70a0","data":a{"content-length":"24","content-type":"text/plain","date":"Wed, 14 Feb 2024 17:05:38 GMT","server":"envoy"}],"hostname":"dashboardbff-svc-74477d7749-lxwjz","origin":"frontend","sessionId":"5f4294d0-e66c-4fc0-a864-3dab4a12efc9","version":"6.5.4"}
Dashboard error-
Gateway pod logs -
ERROR: ERROR ERROR ERROR Starting with configuration errors
2024-02-14 17:05:04 diagd 3.9.1 [P16TAEW] INFO: EnvoyConfig: Generating V3
2024-02-14 17:05:04 diagd 3.9.1 [P16TAEW] INFO: V3Listener <V3Listener HTTP ambassadorlistener on 0.0.0.0:8000 [XFP]>: generated ===========================
2024-02-14 17:05:04 diagd 3.9.1 [P16TAEW] INFO: V3Ready: ==== listen on 127.0.0.1:8006
2024-02-14 17:05:04 diagd 3.9.1 [P16TAEW] INFO: configuration updated (incremental) from snapshot snapshot (S18 L1 G10 C9)
time="2024-02-14 17:05:04.6205" level=info msg="Loaded file /ambassador/envoy/envoy.json" func=github.com/emissary-ingress/emissary/v3/pkg/ambex.Decode file="/go/pkg/ambex/main.go:281" CMD=entrypoint PID=1 THREAD=/ambex/main-loop
time="2024-02-14 17:05:04.6229" level=info msg="Saved snapshot v6" func=github.com/emissary-ingress/emissary/v3/pkg/ambex.csDump file="/go/pkg/ambex/main.go:351" CMD=entrypoint PID=1 THREAD=/ambex/main-loop
2024-02-14 17:05:04 diagd 3.9.1 [P16TAEW] INFO: error Pod labels are not mounted in the Ambassador container; Kubernetes Ingress support is likely to be limited
time="2024-02-14 17:05:04.6310" level=info msg="Pushing snapshot v6" func=github.com/emissary-ingress/emissary/v3/pkg/ambex.updaterWithTicker file="/go/pkg/ambex/ratelimit.go:159" CMD=entrypoint PID=1 THREAD=/ambex/updater
2024-02-14 17:05:04 diagd 3.9.1 [P16TAEW] INFO: -global-: <RichStatus BAD error='Pod labels are not mounted in the Ambassador container; Kubernetes Ingress support is likely to be limited' hostname='gateway-5784df8fbb-x2vmm' version='3.9.1'>
Gateway pod logs -
"GET /k10/dashboardbff-svc/v0/licenseStatus HTTP/1.1" 200 - 0 321 68 64 "10.112.16.161" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/121.0.0.0 Safari/537.36" "aa945eae-16e9-46c4-a1a5-dcef95116844" "127.0.0.1:8080" "172.20.107.124:8000"
ACCESS [2024-02-14T17:04:38.709Z] "GET /k10/dashboardbff-svc/v0/bff/policycrs/kasten-io HTTP/1.1" 200 - 0 3 33 31 "10.112.16.161" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/121.0.0.0 Safari/537.36" "05c54010-8936-4a4d-a0a6-ae7f16d1dc61" "127.0.0.1:8080" "172.20.107.124:8000"
2024-02-14 17:04:52 diagd 3.9.1 [P16TThreadPoolExecutor-0_0] INFO: B9D8651D-8E09-48D6-87DD-1C5C091D0904: 127.0.0.1 "GET /metrics" 9ms 200 success
ACCESS [2024-02-14T17:04:56.721Z] "POST /k10/dashboardbff-svc/v0/log HTTP/1.1" 201 - 262 0 11 8 "10.112.16.161" "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/121.0.0.0 Safari/537.36" "b9ccba0a-6430-4c2e-ad16-7935b9d219cc" "127.0.0.1:8080" "172.20.107.124:8000"
2024-02-14 17:05:04 diagd 3.9.1 [P16TAEW] INFO: -global-: <RichStatus BAD error='Pod labels are not mounted in the Ambassador container; Kubernetes Ingress support is likely to be limited' hostname='gateway-5784df8fbb-x2vmm' version='3.9.1'>
2024-02-14 17:05:04 diagd 3.9.1 [P16TAEW] ERROR: ERROR ERROR ERROR Starting with configuration errors
services.dashboardbff.hostNetwork
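Assuming this refers to the K10 Helm chart value of the same name, it would typically be toggled with an upgrade along these lines (a sketch only; confirm the value exists in your chart version first):
helm upgrade k10 kasten/k10 --namespace=kasten-io --reuse-values --set services.dashboardbff.hostNetwork=true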
Thanks
Ahmed Hagag
You might want to verify that the ingress class has been deployed and is the one referenced by the ingress (see the check after the spec below):
kubectl get ingress k10-ingress -oyaml -n kasten-io
spec:
  ingressClassName: nginx
  rules:
  - host: primary.xxxx.net
    http:
      paths:
      - backend:
          service:
            name: gateway
            port:
              number: 80
        path: /k10/
        pathType: ImplementationSpecific
  tls:
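A quick way to confirm the referenced class actually exists in the cluster (a generic check, assuming the class name nginx shown above):
kubectl get ingressclass
kubectl get ingressclass nginx -o yaml
If no class named nginx exists, no controller will pick up the ingress and the dashboard will not be reachable through it.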
Hello,
I am facing a similar issue and can't get the dashboard working.
I installed Kasten via Helm with "helm install k10 kasten/k10 --namespace=kasten-io",
then upgraded the release as per the documentation:
helm upgrade k10 kasten/k10 --namespace=kasten-io \
  --reuse-values \
  --set externalGateway.create=true \
  --set auth.tokenAuth.enabled=true
It just doesn't load.
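One thing to note: with auth.tokenAuth.enabled=true the dashboard also expects a bearer token once it loads. On Kubernetes 1.24+ a token can be generated like this (a sketch; it assumes the default release name k10 and its k10-k10 service account, so check kubectl get sa -n kasten-io first):
kubectl --namespace kasten-io create token k10-k10 --duration=24h
Paste the output into the dashboard's token prompt.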