argo-cd: ArgoCD 1.4 Resources synchronized but still OutOfSync/Missing

Checklist:

  • I’ve searched in the docs and FAQ for my answer: http://bit.ly/argocd-faq.
  • I’ve included steps to reproduce the bug.
  • I’ve pasted the output of argocd version.

Hello everyone,

I deployed Argo CD 1.4 in our OpenShift cluster, which already has applications deployed with Helm 3. I wanted to bring all of our applications into Argo CD. Since Argo CD does not support the Helm v3 API, after some research on my side I changed the apiVersion in my charts to v1.
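For reference, the only change to each chart is the apiVersion field in Chart.yaml; the name, version and description below are placeholders, not our real chart:

```yaml
# Chart.yaml -- apiVersion switched from v2 (Helm 3) back to v1 so that
# Argo CD's bundled Helm 2 tooling can render the chart.
# Name, version and description are placeholders.
apiVersion: v1
name: my-chart
version: 0.1.0
description: Example chart showing only the apiVersion change
```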

After that, I created my application in Argo CD and configured it to point to my Helm chart (which lives in a Git repository). The application is created and its state is OutOfSync. When I click Synchronize, Argo CD does its job and reports the synchronisation as OK, but my application is still OutOfSync/Missing.
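The Application is roughly equivalent to the manifest below (the path, targetRevision and Argo CD namespace are placeholders; the other values match what appears in the logs further down):

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Application
metadata:
  name: ebportal-dev
  namespace: argocd            # namespace where Argo CD is installed (assumed)
spec:
  project: default
  source:
    repoURL: https://github.com/thedigitalstudio/EBP-charts
    targetRevision: HEAD       # placeholder
    path: charts/ebportal      # placeholder path inside the repo
  destination:
    server: https://kubernetes.default.svc
    namespace: eb-portal-eu-dev-axa-ebp-fr
```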

I also tried deleting some resources from the UI and synchronizing again, but I get the same result.

Can someone help me understand what's going on? I'm really blocked and can't use Argo CD.

UPDATE: When I deploy my application into the namespace where Argo CD itself is running, it works: the application shows up as Healthy and Synced. But when I do the same thing in ANOTHER NAMESPACE, it does not work. It's so weird.

I'm deploying to my local cluster. I created a new cluster entry for kubernetes.default.svc to replace the default one, so that I could set it up with a namespaced scope: I created a dedicated ServiceAccount and put the corresponding token in the cluster configuration. Then, since I'm on OpenShift, I gave my new ServiceAccount the admin role. Finally, for each destination namespace I create a RoleBinding which binds the admin role to the service account system:serviceaccount:<NAMESPACE_OF_SA>:<DEST_NAMESPACE>.
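Roughly, the RoleBinding I create in each destination namespace looks like the sketch below (the ServiceAccount name argocd-manager is a placeholder for ours, and the <...> values stay as placeholders):

```yaml
# RoleBinding created per destination namespace: grants the OpenShift
# admin ClusterRole to the Argo CD ServiceAccount in that namespace.
# argocd-manager and the <...> placeholders are not our real names.
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: argocd-manager-admin
  namespace: <DEST_NAMESPACE>
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: ClusterRole
  name: admin
subjects:
  - kind: ServiceAccount
    name: argocd-manager
    namespace: <NAMESPACE_OF_SA>
```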

Argo CD is able to run all of its kubectl apply commands against this destination namespace, but the UI does not seem able to fetch the resources, and I don't know why. Is it an RBAC problem or something else? What can I do to troubleshoot this? I'm stuck.
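One thing I thought of trying (not sure it's the right approach) is a SubjectAccessReview to check whether the ServiceAccount is actually allowed to list resources in the destination namespace; the names below are placeholders:

```yaml
# SubjectAccessReview: the API server answers in status.allowed whether
# the given user may perform the action. Names are placeholders.
apiVersion: authorization.k8s.io/v1
kind: SubjectAccessReview
spec:
  user: system:serviceaccount:<NAMESPACE_OF_SA>:argocd-manager
  resourceAttributes:
    namespace: <DEST_NAMESPACE>
    verb: list
    group: apps
    resource: deployments
```

Creating it with `kubectl create -f sar.yaml -o yaml` prints the review back with status.allowed filled in.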

To Reproduce

  • Create a Helm 3 chart
  • Change apiVersion in Chart.yaml to v1
  • Create an application in Argo CD
  • Synchronize your application

Expected behavior

The UI should show my application as Synced and Healthy.

Screenshots

screen1 screen2

Version

argocd: v1.4.0-rc1+5af52f6
  BuildDate: 2020-01-13T17:23:04Z
  GitCommit: 5af52f66988ad8fa0d6b977d7f5aedcdb9f5a521
  GitTreeState: clean
  GoVersion: go1.12.6
  Compiler: gc
  Platform: darwin/amd64
argocd-server: v1.4.0+2d02948
  BuildDate: 2020-01-18T05:55:02Z
  GitCommit: 2d029488aba6e5ad48b2a756bfcf43d5cb7abcee
  GitTreeState: clean
  GoVersion: go1.12.6
  Compiler: gc
  Platform: linux/amd64
  Ksonnet Version: v0.13.1
  Kustomize Version: Version: {Version:kustomize/v3.2.1 GitCommit:d89b448c745937f0cf1936162f26a5aac688f840 BuildDate:2019-09-27T00:10:52Z GoOs:linux GoArch:amd64}
  Helm Version: v2.15.2
  Kubectl Version: v1.14.0

Logs

argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=info msg="Comparing app state (cluster: https://kubernetes.default.svc, namespace: eb-portal-eu-dev-axa-ebp-fr)" application=ebportal-dev
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="https://github.com/thedigitalstudio/EBP-charts has credentials"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Generated config manifests" application=ebportal-dev
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Retrieved lived manifests" application=ebportal-dev
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="built managed objects list" application=ebportal-dev
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Could not create new object of type bitnami.com/v1alpha1, Kind=SealedSecret: no kind \"SealedSecret\" is registered for version \"bitnami.com/v1alpha1\" in scheme \"k8s.io/client-go/kubernetes/scheme/register.go:67\""
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Could not create new object of type bitnami.com/v1alpha1, Kind=SealedSecret: no kind \"SealedSecret\" is registered for version \"bitnami.com/v1alpha1\" in scheme \"k8s.io/client-go/kubernetes/scheme/register.go:67\""
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Could not create new object of type bitnami.com/v1alpha1, Kind=SealedSecret: no kind \"SealedSecret\" is registered for version \"bitnami.com/v1alpha1\" in scheme \"k8s.io/client-go/kubernetes/scheme/register.go:67\""
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Could not create new object of type apps.openshift.io/v1, Kind=DeploymentConfig: no kind \"DeploymentConfig\" is registered for version \"apps.openshift.io/v1\" in scheme \"k8s.io/client-go/kubernetes/scheme/register.go:67\""
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Could not create new object of type apps.openshift.io/v1, Kind=DeploymentConfig: no kind \"DeploymentConfig\" is registered for version \"apps.openshift.io/v1\" in scheme \"k8s.io/client-go/kubernetes/scheme/register.go:67\""
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Could not create new object of type route.openshift.io/v1, Kind=Route: no kind \"Route\" is registered for version \"route.openshift.io/v1\" in scheme \"k8s.io/client-go/kubernetes/scheme/register.go:67\""
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Could not create new object of type route.openshift.io/v1, Kind=Route: no kind \"Route\" is registered for version \"route.openshift.io/v1\" in scheme \"k8s.io/client-go/kubernetes/scheme/register.go:67\""
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Could not create new object of type route.openshift.io/v1, Kind=Route: no kind \"Route\" is registered for version \"route.openshift.io/v1\" in scheme \"k8s.io/client-go/kubernetes/scheme/register.go:67\""
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Could not create new object of type route.openshift.io/v1, Kind=Route: no kind \"Route\" is registered for version \"route.openshift.io/v1\" in scheme \"k8s.io/client-go/kubernetes/scheme/register.go:67\""
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=info msg=syncing application=ebportal-dev isSelectiveSync=false skipHooks=false started=false syncId=00001-DZdfY

argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="tasks from hooks" application=ebportal-dev hookTasks="[]" syncId=00001-DZdfY
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Chose API 'sealedsecrets' for bitnami.com/v1alpha1, Kind=SealedSecret"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Chose API 'sealedsecrets' for bitnami.com/v1alpha1, Kind=SealedSecret"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Chose API 'sealedsecrets' for bitnami.com/v1alpha1, Kind=SealedSecret"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Chose API 'cronjobs' for batch/v1beta1, Kind=CronJob"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Chose API 'deploymentconfigs' for apps.openshift.io/v1, Kind=DeploymentConfig"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Chose API 'deploymentconfigs' for apps.openshift.io/v1, Kind=DeploymentConfig"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Chose API 'routes' for route.openshift.io/v1, Kind=Route"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Chose API 'routes' for route.openshift.io/v1, Kind=Route"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Chose API 'services' for /v1, Kind=Service"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Chose API 'services' for /v1, Kind=Service"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Chose API 'routes' for route.openshift.io/v1, Kind=Route"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:32Z" level=debug msg="Chose API 'routes' for route.openshift.io/v1, Kind=Route"

argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:37Z" level=info msg="Applying resource CronJob/eb-portal-fileloader-france-dev in cluster: https://kubernetes.default.svc, namespace: eb-portal-eu-dev-axa-ebp-fr"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:37Z" level=debug msg="{\"apiVersion\":\"batch/v1beta1\",\"kind\":\"CronJob\",\"metadata\":{\"labels\":{\"app.kubernetes.io/instance\":\"ebportal-dev\"},\"name\":\"eb-portal-fileloader-france-dev\",\"namespace\":\"eb-portal-eu-dev-axa-ebp-fr\"},\"spec\":{\"concurrencyPolicy\":\"Allow\",\"jobTemplate\":{\"spec\":{\"template\":{\"metadata\":{\"labels\":{\"parent\":\"cronjobpi\"}},\"spec\":{\"containers\":[{\"env\":[{\"name\":\"EMAIL_TO\",\"value\":\"service.digitalstudio@axa.fr\"},{\"name\":\"S3_BUCKET\",\"value\":\"s3://eb-portal-eu-dev-ods/rec\"},{\"name\":\"S3_KEY\",\"value\":\"AKIAWSR7GURUKAHJCLAA\"},{\"name\":\"S3_ROLE\",\"value\":\"arn:aws:iam::452177863784:role/eb-portal-eu-dev_appservice\"},{\"name\":\"POSTGRES_URI_SECRET\",\"valueFrom\":{\"secretKeyRef\":{\"key\":\"POSTGRES_URI_SECRET\",\"name\":\"fileloader-eb-secret-france-dev\"}}},{\"name\":\"S3_SECRET\",\"valueFrom\":{\"secretKeyRef\":{\"key\":\"S3_SECRET\",\"name\":\"fileloader-eb-secret-france-dev\"}}},{\"name\":\"HTTP_PROXY\",\"value\":\"http://proxy:8080\"},{\"name\":\"HTTPS_PROXY\",\"value\":\"http://proxy:8080\"},{\"name\":\"http_proxy\",\"value\":\"http://proxy:8080\"},{\"name\":\"https_proxy\",\"value\":\"http://proxy:8080\"},{\"name\":\"NODE_ENV\",\"value\":\"production\"},{\"name\":\"SMTP_HOST\",\"value\":\"10.142.76.184\"},{\"name\":\"SMTP_PORT\",\"value\":\"25\"}],\"image\":\"docker.io/thedigitalstudio/france-file-loader:2.0.0\",\"imagePullPolicy\":\"Always\",\"name\":\"eb-portal-fileloader-france-dev\",\"resources\":{\"limits\":{\"cpu\":\"1\",\"memory\":\"1536Mi\"}},\"terminationMessagePath\":\"/dev/termination-log\"}],\"dnsPolicy\":\"ClusterFirst\",\"restartPolicy\":\"OnFailure\",\"securityContext\":{},\"terminationGracePeriodSeconds\":30}}}},\"schedule\":\"45 2 * * *\",\"suspend\":false}}"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:39Z" level=debug msg=applying application=ebportal-dev dryRun=true syncId=00001-DZdfY task="Sync/0 resource bitnami.com/SealedSecret:eb-portal-eu-dev-axa-ebp-fr/core-eb-secret-france-dev nil->obj (,,)"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:39Z" level=info msg="Applying resource SealedSecret/core-eb-secret-france-dev in cluster: https://kubernetes.default.svc, namespace: eb-portal-eu-dev-axa-ebp-fr"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:39Z" level=debug msg=applying application=ebportal-dev dryRun=true syncId=00001-DZdfY task="Sync/0 resource bitnami.com/SealedSecret:eb-portal-eu-dev-axa-ebp-fr/connector-eb-secret-france-dev nil->obj (,,)"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:39Z" level=info msg="Applying resource SealedSecret/connector-eb-secret-france-dev in cluster: https://kubernetes.default.svc, namespace: eb-portal-eu-dev-axa-ebp-fr"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:56Z" level=debug msg="Failed to watch Ingress.extensions on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com: Watch Ingress.extensions on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com has closed, retrying in 1s"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:56Z" level=debug msg="Failed to watch PodDisruptionBudget.policy on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com: Watch PodDisruptionBudget.policy on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com has closed, retrying in 1s"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:56Z" level=debug msg="Failed to watch HorizontalPodAutoscaler.autoscaling on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com: Watch HorizontalPodAutoscaler.autoscaling on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com has closed, retrying in 1s"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:56Z" level=debug msg="Failed to watch ServiceAccount on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com: Watch ServiceAccount on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com has closed, retrying in 1s"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:56Z" level=debug msg="Failed to watch CronJob.batch on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com: Watch CronJob.batch on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com has closed, retrying in 1s"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:56Z" level=debug msg="Failed to watch Route.route.openshift.io on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com: Watch Route.route.openshift.io on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com has closed, retrying in 1s"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:56Z" level=debug msg="Failed to watch ReplicationController on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com: Watch ReplicationController on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com has closed, retrying in 1s"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:56Z" level=debug msg="Failed to watch Deployment.extensions on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com: Watch Deployment.extensions on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com has closed, retrying in 1s"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:56Z" level=debug msg="Failed to watch StatefulSet.apps on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com: Watch StatefulSet.apps on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com has closed, retrying in 1s"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:56Z" level=debug msg="Failed to watch RoleBinding.rbac.authorization.k8s.io on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com: Watch RoleBinding.rbac.authorization.k8s.io on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com has closed, retrying in 1s"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:56Z" level=debug msg="Failed to watch ReplicaSet.extensions on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com: Watch ReplicaSet.extensions on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com has closed, retrying in 1s"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:56Z" level=debug msg="Failed to watch ImageStream.image.openshift.io on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com: Watch ImageStream.image.openshift.io on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com has closed, retrying in 1s"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:56Z" level=debug msg="Failed to watch DeploymentConfig.apps.openshift.io on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com: Watch DeploymentConfig.apps.openshift.io on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com has closed, retrying in 1s"
argocd-application-controller-658f85b459-hn5m2 argocd-application-controller time="2020-01-21T15:19:56Z" level=debug msg="Failed to watch Job.batch on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com: Watch Job.batch on https://osconsole.pink.ap-southeast-1.aws.openpaas.axa-cloud.com has closed, retrying in 1s"

About this issue

  • Original URL
  • State: closed
  • Created 4 years ago
  • Reactions: 1
  • Comments: 20 (2 by maintainers)

Most upvoted comments

Had this issue in 1.7.10, but it turned out to be caused by a startupProbe config in the deployment, which is not available until a K8s 1.17 -> 1.18 upgrade. It looks like kubectl quietly ignored the field, while Argo CD noticed the difference.
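For anyone hitting the same thing, the offending field was along these lines (a minimal sketch; the values are illustrative, not our real manifest):

```yaml
# On an older API server the startupProbe field is silently dropped from
# the stored object, so the live resource never matches the desired
# manifest and the app stays OutOfSync. Values below are illustrative.
containers:
  - name: app
    image: example/app:1.0
    startupProbe:
      httpGet:
        path: /healthz
        port: 8080
      failureThreshold: 30
      periodSeconds: 10
```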

Hi @alexmt. Paul and I are working on the same team/project. We're currently running OKD v3.11 with Kubernetes v1.11. We've noticed several resources have inconsistent behaviour (mostly PVCs, Routes and Bitnami SealedSecrets) and they appear to be out of sync. In the Argo CD UI the resources look like they aren't present in the cluster at all, so it's not just a diffing customization issue, but on closer inspection both Argo CD and OpenShift recognize that the resources are present/synced.

As @paulcrecan mentioned, if we restart the argocd-application-controller pod, the resources are fully in sync for a short period of time (under an hour), regardless of whether the application was just created or already existed in Argo CD. We've also increased the resource limits on the controller, but to no avail. Our current config for the pod is as follows:

- argocd-application-controller
- '--status-processors'
- '30'
- '--operation-processors'
- '30'
- '--repo-server-timeout-seconds'
- '180'
- '--loglevel'
- debug

If any other details are necessary, please let me know!