(Kubernetes + Docker) Skaffold keeps terminating my deployment files : Error: could not stabilize within 2m0s: context deadline exceeded

I'm trying to deploy a MicroServices system on my local machine using Skaffold.

ingress-srv.yaml

apiVersion: extensions/v1beta1
kind: Ingress
metadata:
  name: ingress-service
  annotations:
    kubernetes.io/ingress.class: nginx
    nginx.ingress.kubernetes.io/use-regex: 'true'
spec:
  rules:
    - host: ticketing.dot
      http:
        paths:
          - path: /api/users/?(.*)
            backend:
              serviceName: auth-srv
              servicePort: 3000

auth-depl.yaml

apiVersion: apps/v1
kind: Deployment
metadata:
  name: auth-depl
spec:
  replicas: 1
  selector:
    matchLabels:
      app: auth
  template:
    metadata:
      labels:
        app: auth
    spec:
      containers:
        - name: auth
          image: ****MYDOCKERID****/auth
          env:
            - name: JWT_KEY
              valueFrom:
                secretKeyRef:
                  name: jwt-secret
                  key: JWT_KEY

---
apiVersion: v1
kind: Service
metadata:
  name: auth-srv
spec:
  selector:
    app: auth
  ports:
    - name: auth
      protocol: TCP
      port: 3000
      targetPort: 3000

auth-mongo-depl.yaml:

apiVersion: apps/v1
kind: Deployment
metadata:
  name: auth-mongo-depl
spec:
  replicas: 1
  selector:
    matchLabels:
      app: auth-mongo
  template:
    metadata:
      labels:
        app: auth-mongo
    spec:
      containers:
        - name: auth-mongo
          image: mongo
---
apiVersion: v1
kind: Service
metadata:
  name: auth-mongo-srv
spec:
  selector:
    app: auth-mongo
  ports:
    - name: db
      protocol: TCP
      port: 27017
      targetPort: 27017

I've followed through the guidelines in the manual:

https://kubernetes.github.io/ingress-nginx/deploy/

and hit:

kubectl apply -f https://raw.githubusercontent.com/kubernetes/ingress-nginx/controller-v0.34.0/deploy/static/provider/cloud/deploy.yaml
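For reference, the controller pods created by that manifest live in the ingress-nginx namespace (that namespace comes from the standard deploy.yaml) and can be checked with:

kubectl get pods -n ingress-nginx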

However, Skaffold keeps terminating the deployment:

Listing files to watch...
 - ****MYDOCKERID****/auth
Generating tags...
 - ****MYDOCKERID****/auth -> ****MYDOCKERID****/auth:683e8db
Checking cache...
 - ****MYDOCKERID****/auth: Found Locally
Tags used in deployment:
 - ****MYDOCKERID****/auth -> ****MYDOCKERID****/auth:3c4bb66ff693320b5fac3fde91906768f8b54b968813b226822d057d1dd3a995
Starting deploy...
 - deployment.apps/auth-depl created
 - service/auth-srv created
 - deployment.apps/auth-mongo-depl created
 - service/auth-mongo-srv created
 - ingress.extensions/ingress-service created
Waiting for deployments to stabilize...
 - deployment/auth-depl:
 - deployment/auth-mongo-depl:
 - deployment/auth-depl: waiting for rollout to finish: 0 of 1 updated replicas are available...
 - deployment/auth-mongo-depl: waiting for rollout to finish: 0 of 1 updated replicas are available...
 - deployment/auth-mongo-depl is ready. [1/2 deployment(s) still pending]
 - deployment/auth-depl failed. Error: could not stabilize within 2m0s: context deadline exceeded.
Cleaning up...
 - deployment.apps "auth-depl" deleted
 - service "auth-srv" deleted
 - deployment.apps "auth-mongo-depl" deleted
 - service "auth-mongo-srv" deleted
 - ingress.extensions "ingress-service" deleted
exiting dev mode because first deploy failed: 1/2 deployment(s) failed

How can we fix this annoying issue?

EDIT 9:44 AM ISRAEL TIME:

C:\Development-T410\Micro Services - JAN>kubectl get pods
NAME                                 READY   STATUS                       RESTARTS   AGE
auth-depl-645bbf7b9d-llp2q           0/1     CreateContainerConfigError   0          115s
auth-depl-c6c765d7c-7wvcg            0/1     CreateContainerConfigError   0          28m
auth-mongo-depl-6b594c4847-4kzzt     1/1     Running                      0          115s
client-depl-5888f95b59-vznh6         1/1     Running                      0          114s
nats-depl-7dfccdf5-874vm             1/1     Running                      0          114s
orders-depl-74f4d48559-cbwlp         0/1     CreateContainerConfigError   0          114s
orders-depl-78fc845b4-9tfml          0/1     CreateContainerConfigError   0          28m
orders-mongo-depl-688676d675-lrvhp   1/1     Running                      0          113s
tickets-depl-7cc7ddbbff-z9pvc        0/1     CreateContainerConfigError   0          113s
tickets-depl-8574fc8f9b-tm6p4        0/1     CreateContainerConfigError   0          28m
tickets-mongo-depl-b95f45947-hf6wq   1/1     Running                      0          113s

C:\Development-T410\Micro Services>kubectl logs auth-depl-c6c765d7c-7wvcg
Error from server (BadRequest): container "auth" in pod "auth-depl-c6c765d7c-7wvcg" is waiting to start: CreateContainerConfigError
Transpicuous answered 14/7, 2020 at 4:9 Comment(1)
What does kubectl describe pod auth-depl-645bbf7b9d-llp2q show? I'm mostly interested in the Events section. – Nappe

Looks like your auth-depl deployment is failing; possibly the container is crashing or erroring out. To debug, you can check the pod logs:

$ kubectl logs auth-depl-xxxxxxxxxx-xxxxx

Make sure you run Skaffold with the --cleanup=false option so the failed resources aren't deleted when skaffold dev exits and you can still inspect them. For example:

$ skaffold dev --cleanup=false
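If the logs come back empty because the container never actually started (which is typical for CreateContainerConfigError), kubectl describe pod usually names the missing reference in its Events section:

$ kubectl describe pod auth-depl-xxxxxxxxxx-xxxxx

Look for an event that mentions a Secret or ConfigMap that couldn't be found.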

Update:

Based on the logs, it looks like an issue with your Kubernetes Secret and how it's defined, possibly its format. This answer sheds some light on what the problem may be: Pod status as `CreateContainerConfigError` in Minikube cluster
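In your case auth-depl references a Secret named jwt-secret with a JWT_KEY entry, so if that Secret doesn't exist in the cluster (or sits in a different namespace), CreateContainerConfigError is exactly what you'd see. You can check for it and, if it's missing, create it; the key value below is just a placeholder:

$ kubectl get secrets
$ kubectl create secret generic jwt-secret --from-literal=JWT_KEY=<your-key>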

Keffer answered 14/7, 2020 at 5:57 Comment(3)
Thanks, I've added the logs to the question above (in the EDIT section); however, it doesn't really help much. – Transpicuous
It looks like it's an issue with your Secret and how it's defined. #50425254 – Keffer
You were right, thanks, but now I have a bigger problem; I'll open another question. Thanks @Keffer – Transpicuous

You should add the environment variables for the mongo image in your deployment file:

env:
  - name: MONGO_INITDB_ROOT_USERNAME
    value: root
  - name: MONGO_INITDB_ROOT_PASSWORD
    value: "rootuser"
Lucky answered 2/8, 2020 at 16:41 Comment(0)
