amazon-ecs-agent: Missing Cloudwatch metrics for ECS service
Summary
After upgrading from ami-0b97931838a22e7c1 to ami-0e6de310858faf4dc, we are missing some CloudWatch metrics for ECS services, even though the services have running tasks and nothing about them has changed.
The ECS agent logs sometimes contain:
msg="Error encountered processing metrics stream from docker, this may affect cloudwatch metric accuracy: DockerGoClient: Unable to decode stats for container ac6fa14827e76eed82dd30e5406c7d59146262948dde1de4b135aead0f4b8dea: context canceled" module=container.go
Running docker stats hangs on this instance but works on another one.
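For reference, here is a minimal Go sketch of the failure mode, using the Docker Go SDK directly (not the agent's actual code; the container ID is hypothetical): a streaming stats read whose context is cancelled while waiting on the next sample returns the same "context canceled" error seen in the agent logs.

```go
package main

import (
	"context"
	"encoding/json"
	"fmt"
	"time"

	"github.com/docker/docker/api/types"
	"github.com/docker/docker/client"
)

func main() {
	cli, err := client.NewClientWithOpts(client.FromEnv, client.WithAPIVersionNegotiation())
	if err != nil {
		panic(err)
	}

	// Cancel the context after 10s to mimic the agent tearing down a stats
	// stream; a Decode that is still waiting on the stream then returns
	// "context canceled".
	ctx, cancel := context.WithTimeout(context.Background(), 10*time.Second)
	defer cancel()

	containerID := "ac6fa14827e7" // hypothetical ID, for illustration only

	// Open a streaming stats connection, roughly as the agent's
	// DockerGoClient does.
	resp, err := cli.ContainerStats(ctx, containerID, true)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	dec := json.NewDecoder(resp.Body)
	for {
		var stats types.StatsJSON
		if err := dec.Decode(&stats); err != nil {
			// On a healthy daemon this only happens at shutdown; on the
			// affected instance it fires repeatedly.
			fmt.Println("unable to decode stats:", err)
			return
		}
		fmt.Printf("cpu total usage: %d\n", stats.CPUStats.CPUUsage.TotalUsage)
	}
}
```

On the affected instance, where docker stats itself hangs, the Decode above would block and eventually fail the same way, so the agent may never accumulate the two data points per container it needs to publish CPU metrics (see the engine.go log lines below).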
Expected Behavior
Metrics are sent to CloudWatch.
Observed Behavior
Metrics are missing from CloudWatch.
Environment Details
Server:
Containers: 48
Running: 26
Paused: 0
Stopped: 22
Images: 28
Server Version: 19.03.6-ce
Storage Driver: overlay2
Backing Filesystem: extfs
Supports d_type: true
Native Overlay Diff: true
Logging Driver: json-file
Cgroup Driver: cgroupfs
Plugins:
Volume: local
Network: bridge host ipvlan macvlan null overlay
Log: awslogs fluentd gcplogs gelf journald json-file local logentries splunk syslog
Swarm: inactive
Runtimes: runc
Default Runtime: runc
Init Binary: docker-init
containerd version: ff48f57fc83a8c44cf4ad5d672424a98ba37ded6
runc version: dc9208a3303feef5b3839f4323d9beb36df0a9dd
init version: fec3683
Security Options:
seccomp
Profile: default
Kernel Version: 4.14.203-156.332.amzn2.x86_64
Operating System: Amazon Linux 2
OSType: linux
Architecture: x86_64
CPUs: 8
Total Memory: 14.68GiB
Name: ****.eu-central-1.compute.internal
ID: PEGR:3OUX:JUTP:MYUL:JOCZ:2WMU:O2F2:5PUH:QKYN:ODFE:F7DX:MOSY
Docker Root Dir: /var/lib/docker
Debug Mode: false
Registry: https://index.docker.io/v1/
Labels:
Experimental: false
Insecure Registries:
127.0.0.0/8
Live Restore Enabled: false
{"Cluster":"ecs-cluster-**-prod","ContainerInstanceArn":"arn:aws:ecs:eu-central-1:***:container-instance/ecs-cluster-***-prod/b6f9599b3f0f4f4d8cd0b304e892d6ab","Version":"Amazon ECS Agent - v1.48.0 (6b10efac)"}
Filesystem Size Used Avail Use% Mounted on
devtmpfs 7,4G 0 7,4G 0% /dev
tmpfs 7,4G 0 7,4G 0% /dev/shm
tmpfs 7,4G 2,1M 7,4G 1% /run
tmpfs 7,4G 0 7,4G 0% /sys/fs/cgroup
/dev/xvda1 30G 4,9G 25G 17% /
tmpfs 1,5G 0 1,5G 0% /run/user/1000
More Agent Logs:
level=warn time=2020-11-30T10:34:53Z msg="Error publishing metrics: stats engine: no task metrics to report" module=client.go
level=warn time=2020-11-30T10:34:54Z msg="Error encountered processing metrics stream from docker, this may affect cloudwatch metric accuracy: DockerGoClient: Unable to decode stats for container 9aa7a27148355e85e25a3349ed8557cc937b8213acadd32c16d3f6276bfdd752: context canceled" module=container.go
level=info time=2020-11-30T10:34:54Z msg="DockerGoClient: Starting streaming metrics for container 3438926061af5d86abb39a9ba36ccb883a7d8874ae5f15cfd970bd1da81062ef" module=docker_client.go
level=info time=2020-11-30T10:34:55Z msg="DockerGoClient: Starting streaming metrics for container bbafaf04e73178cc5076fdcb81b98c81ca54e68683497de8fc715b92dff3cdf0" module=docker_client.go
level=warn time=2020-11-30T10:34:57Z msg="Error encountered processing metrics stream from docker, this may affect cloudwatch metric accuracy: DockerGoClient: Unable to decode stats for container 235c1239f9845171c4745b1c328699f20cbcb944bce07561c88de9a22f5b8f50: context canceled" module=container.go
level=warn time=2020-11-30T10:34:59Z msg="Error encountered processing metrics stream from docker, this may affect cloudwatch metric accuracy: DockerGoClient: Unable to decode stats for container e49849f57f47253b828e049fba22f0f952d1416a53b58c379121b324c4bb1f49: context canceled" module=container.go
level=info time=2020-11-30T10:35:00Z msg="DockerGoClient: Starting streaming metrics for container 6d8f73959468e3f63bb3df2654fb61a11181490c4924ef081d141124d1177a85" module=docker_client.go
level=info time=2020-11-30T10:35:00Z msg="DockerGoClient: Starting streaming metrics for container 66973f83de37ec8ee97fe988798b5011df9349c4b54ef579b1f6402e546d3703" module=docker_client.go
level=info time=2020-11-30T10:35:01Z msg="DockerGoClient: Starting streaming metrics for container 60040e025fac197b8a26a38bb40e47ff9e793a7f6a30f63df8ae59dff95d2ce8" module=docker_client.go
level=info time=2020-11-30T10:35:03Z msg="DockerGoClient: Starting streaming metrics for container 17c3d17afb4847ebf3bc260dff2d09fbc1ac841f94a70812128e7ccd5223fda4" module=docker_client.go
level=warn time=2020-11-30T10:35:05Z msg="Error encountered processing metrics stream from docker, this may affect cloudwatch metric accuracy: DockerGoClient: Unable to decode stats for container 7ee18aff2068a8d24435249c50854bfe3f0387c936deb9c6aba2b56d588469fe: context canceled" module=container.go
level=warn time=2020-11-30T10:35:05Z msg="Error encountered processing metrics stream from docker, this may affect cloudwatch metric accuracy: DockerGoClient: Unable to decode stats for container 6b96d5829e625dfceec43cc32bfec89f1a1ce60b24ea924c922b9c2a6df484e6: context canceled" module=container.go
level=info time=2020-11-30T10:35:05Z msg="DockerGoClient: Starting streaming metrics for container b400a6298cb4e9b8e231e5eb3b42e66fe026e0abdda93aceed9b6e05139e3b88" module=docker_client.go
level=info time=2020-11-30T10:35:06Z msg="DockerGoClient: Starting streaming metrics for container 9aa7a27148355e85e25a3349ed8557cc937b8213acadd32c16d3f6276bfdd752" module=docker_client.go
level=warn time=2020-11-30T10:35:06Z msg="Error encountered processing metrics stream from docker, this may affect cloudwatch metric accuracy: DockerGoClient: Unable to decode stats for container a61da7e8a1063d84804441c65c1d9278da077d00a83715c7bc55b64d2d603163: context canceled" module=container.go
level=warn time=2020-11-30T10:35:08Z msg="Error encountered processing metrics stream from docker, this may affect cloudwatch metric accuracy: DockerGoClient: Unable to decode stats for container cad42c5896b29e80bee45df3f09caca64ae3062793738c5ec1951bd0ad74451c: context canceled" module=container.go
level=info time=2020-11-30T10:35:08Z msg="DockerGoClient: Starting streaming metrics for container 235c1239f9845171c4745b1c328699f20cbcb944bce07561c88de9a22f5b8f50" module=docker_client.go
level=info time=2020-11-30T10:35:11Z msg="DockerGoClient: Starting streaming metrics for container e49849f57f47253b828e049fba22f0f952d1416a53b58c379121b324c4bb1f49" module=docker_client.go
level=warn time=2020-11-30T10:35:11Z msg="Error encountered processing metrics stream from docker, this may affect cloudwatch metric accuracy: DockerGoClient: Unable to decode stats for container f3059a19e304fa6a291f053a709ea8b9c5b4e296bd5f58b627bd8dcea26de723: context canceled" module=container.go
level=info time=2020-11-30T10:35:13Z msg="cloudwatch metrics for container 60040e025fac197b8a26a38bb40e47ff9e793a7f6a30f63df8ae59dff95d2ce8 not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:13Z msg="cloudwatch metrics for container bbafaf04e73178cc5076fdcb81b98c81ca54e68683497de8fc715b92dff3cdf0 not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:13Z msg="cloudwatch metrics for container 6b96d5829e625dfceec43cc32bfec89f1a1ce60b24ea924c922b9c2a6df484e6 not collected, reason (cpu): need at least 1 non-NaN data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:13Z msg="cloudwatch metrics for container 2e9f29437bf8908dd91737ee87f7f93712c89403e280c41a9ee68509a197b73a not collected, reason (cpu): need at least 1 non-NaN data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:13Z msg="cloudwatch metrics for container b2e0b5a816a6bc956578ca9c4881c3e20e6364764f53fb535488b38bbd3c476e not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:13Z msg="cloudwatch metrics for container 235c1239f9845171c4745b1c328699f20cbcb944bce07561c88de9a22f5b8f50 not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:13Z msg="cloudwatch metrics for container 17c3d17afb4847ebf3bc260dff2d09fbc1ac841f94a70812128e7ccd5223fda4 not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:13Z msg="cloudwatch metrics for container f3059a19e304fa6a291f053a709ea8b9c5b4e296bd5f58b627bd8dcea26de723 not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:13Z msg="cloudwatch metrics for container e49849f57f47253b828e049fba22f0f952d1416a53b58c379121b324c4bb1f49 not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:13Z msg="cloudwatch metrics for container 7ee18aff2068a8d24435249c50854bfe3f0387c936deb9c6aba2b56d588469fe not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:13Z msg="cloudwatch metrics for container c66a7105d6d84c97bc744a1902f98b71501c0a638140bd18b773e0616e74d1e7 not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:13Z msg="cloudwatch metrics for container 9aa7a27148355e85e25a3349ed8557cc937b8213acadd32c16d3f6276bfdd752 not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:13Z msg="cloudwatch metrics for container cad42c5896b29e80bee45df3f09caca64ae3062793738c5ec1951bd0ad74451c not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:13Z msg="cloudwatch metrics for container 5aa005aa5df465be200b11bf68572ef58f8bcff2dd94228cfb76a65cbfabcc57 not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:13Z msg="cloudwatch metrics for container 150cf1d112a83bb6a1cbf5a34b49f38f4ac22576cf65868e15a072563f3cf388 not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:13Z msg="cloudwatch metrics for container ba18fb30d1fe640885bbb2d91f06e248d408a466e1b30336e1b695b8d225a8d0 not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:13Z msg="cloudwatch metrics for container b400a6298cb4e9b8e231e5eb3b42e66fe026e0abdda93aceed9b6e05139e3b88 not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:13Z msg="cloudwatch metrics for container 6d8f73959468e3f63bb3df2654fb61a11181490c4924ef081d141124d1177a85 not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:13Z msg="cloudwatch metrics for container c0c78f46b12e4b44cd1d2a787e54e048d247a8d35785fc8a0d8ed148755e312a not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:13Z msg="cloudwatch metrics for container 6443a05106d3ef0ed6b815b813f723a1633dc6c27b66c54e85423f38613276de not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:13Z msg="cloudwatch metrics for container a61da7e8a1063d84804441c65c1d9278da077d00a83715c7bc55b64d2d603163 not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:13Z msg="cloudwatch metrics for container 2a88f03d38051c070af13dbab3b1c0815c123f4ce9cc01f787da52492c03b48d not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:13Z msg="cloudwatch metrics for container 66973f83de37ec8ee97fe988798b5011df9349c4b54ef579b1f6402e546d3703 not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:13Z msg="cloudwatch metrics for container 3438926061af5d86abb39a9ba36ccb883a7d8874ae5f15cfd970bd1da81062ef not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=warn time=2020-11-30T10:35:13Z msg="Error publishing metrics: stats engine: no task metrics to report" module=client.go
level=warn time=2020-11-30T10:35:13Z msg="Error encountered processing metrics stream from docker, this may affect cloudwatch metric accuracy: DockerGoClient: Unable to decode stats for container 150cf1d112a83bb6a1cbf5a34b49f38f4ac22576cf65868e15a072563f3cf388: context canceled" module=container.go
level=warn time=2020-11-30T10:35:13Z msg="Error encountered processing metrics stream from docker, this may affect cloudwatch metric accuracy: DockerGoClient: Unable to decode stats for container c66a7105d6d84c97bc744a1902f98b71501c0a638140bd18b773e0616e74d1e7: context canceled" module=container.go
level=warn time=2020-11-30T10:35:16Z msg="Error encountered processing metrics stream from docker, this may affect cloudwatch metric accuracy: DockerGoClient: Unable to decode stats for container b2e0b5a816a6bc956578ca9c4881c3e20e6364764f53fb535488b38bbd3c476e: context canceled" module=container.go
level=info time=2020-11-30T10:35:17Z msg="DockerGoClient: Starting streaming metrics for container 6b96d5829e625dfceec43cc32bfec89f1a1ce60b24ea924c922b9c2a6df484e6" module=docker_client.go
level=info time=2020-11-30T10:35:18Z msg="DockerGoClient: Starting streaming metrics for container 7ee18aff2068a8d24435249c50854bfe3f0387c936deb9c6aba2b56d588469fe" module=docker_client.go
level=warn time=2020-11-30T10:35:18Z msg="Error encountered processing metrics stream from docker, this may affect cloudwatch metric accuracy: DockerGoClient: Unable to decode stats for container 5aa005aa5df465be200b11bf68572ef58f8bcff2dd94228cfb76a65cbfabcc57: context canceled" module=container.go
level=info time=2020-11-30T10:35:19Z msg="DockerGoClient: Starting streaming metrics for container a61da7e8a1063d84804441c65c1d9278da077d00a83715c7bc55b64d2d603163" module=docker_client.go
level=warn time=2020-11-30T10:35:19Z msg="Error encountered processing metrics stream from docker, this may affect cloudwatch metric accuracy: DockerGoClient: Unable to decode stats for container 6443a05106d3ef0ed6b815b813f723a1633dc6c27b66c54e85423f38613276de: context canceled" module=container.go
level=warn time=2020-11-30T10:35:19Z msg="Error encountered processing metrics stream from docker, this may affect cloudwatch metric accuracy: DockerGoClient: Unable to decode stats for container 2a88f03d38051c070af13dbab3b1c0815c123f4ce9cc01f787da52492c03b48d: context canceled" module=container.go
level=info time=2020-11-30T10:35:22Z msg="DockerGoClient: Starting streaming metrics for container cad42c5896b29e80bee45df3f09caca64ae3062793738c5ec1951bd0ad74451c" module=docker_client.go
level=info time=2020-11-30T10:35:22Z msg="Managed task [arn:aws:ecs:eu-central-1:**:task/ecs-cluster-**-prod/8e19100b6b914e49848566ea97fb2b83]: checking to verify it's still at steady state." module=task_manager.go
level=info time=2020-11-30T10:35:22Z msg="Managed task [arn:aws:ecs:eu-central-1:**:task/ecs-cluster-**-prod/8e19100b6b914e49848566ea97fb2b83]: task at steady state: RUNNING" module=task_manager.go
level=info time=2020-11-30T10:35:22Z msg="Managed task [arn:aws:ecs:eu-central-1:**:task/ecs-cluster-**-prod/8e19100b6b914e49848566ea97fb2b83]: waiting for event for task" module=task_manager.go
level=info time=2020-11-30T10:35:22Z msg="Managed task [arn:aws:ecs:eu-central-1:**:task/ecs-cluster-**-prod/8e19100b6b914e49848566ea97fb2b83]: Container [name=fluentbit runtimeID=150cf1d112a83bb6a1cbf5a34b49f38f4ac22576cf65868e15a072563f3cf388]: handling container change event [RUNNING]" module=task_manager.go
level=info time=2020-11-30T10:35:22Z msg="Managed task [arn:aws:ecs:eu-central-1:**:task/ecs-cluster-**-prod/8e19100b6b914e49848566ea97fb2b83]: Container [name=fluentbit runtimeID=150cf1d112a83bb6a1cbf5a34b49f38f4ac22576cf65868e15a072563f3cf388]: container change RUNNING->RUNNING is redundant" module=task_manager.go
level=info time=2020-11-30T10:35:22Z msg="Managed task [arn:aws:ecs:eu-central-1:**:task/ecs-cluster-**-prod/8e19100b6b914e49848566ea97fb2b83]: task at steady state: RUNNING" module=task_manager.go
level=info time=2020-11-30T10:35:22Z msg="Managed task [arn:aws:ecs:eu-central-1:**:task/ecs-cluster-**-prod/8e19100b6b914e49848566ea97fb2b83]: waiting for event for task" module=task_manager.go
level=info time=2020-11-30T10:35:22Z msg="Managed task [arn:aws:ecs:eu-central-1:**:task/ecs-cluster-**-prod/8e19100b6b914e49848566ea97fb2b83]: Container [name=prod-dashboard-proxy runtimeID=ba18fb30d1fe640885bbb2d91f06e248d408a466e1b30336e1b695b8d225a8d0]: handling container change event [RUNNING]" module=task_manager.go
level=info time=2020-11-30T10:35:22Z msg="Managed task [arn:aws:ecs:eu-central-1:**:task/ecs-cluster-**-prod/8e19100b6b914e49848566ea97fb2b83]: Container [name=prod-dashboard-proxy runtimeID=ba18fb30d1fe640885bbb2d91f06e248d408a466e1b30336e1b695b8d225a8d0]: container change RUNNING->RUNNING is redundant" module=task_manager.go
level=info time=2020-11-30T10:35:22Z msg="Managed task [arn:aws:ecs:eu-central-1:**:task/ecs-cluster-**-prod/8e19100b6b914e49848566ea97fb2b83]: task at steady state: RUNNING" module=task_manager.go
level=info time=2020-11-30T10:35:22Z msg="Managed task [arn:aws:ecs:eu-central-1:**:task/ecs-cluster-**-prod/8e19100b6b914e49848566ea97fb2b83]: waiting for event for task" module=task_manager.go
level=warn time=2020-11-30T10:35:23Z msg="Error encountered processing metrics stream from docker, this may affect cloudwatch metric accuracy: DockerGoClient: Unable to decode stats for container 2e9f29437bf8908dd91737ee87f7f93712c89403e280c41a9ee68509a197b73a: context canceled" module=container.go
level=warn time=2020-11-30T10:35:24Z msg="Error encountered processing metrics stream from docker, this may affect cloudwatch metric accuracy: DockerGoClient: Unable to decode stats for container c0c78f46b12e4b44cd1d2a787e54e048d247a8d35785fc8a0d8ed148755e312a: context canceled" module=container.go
level=info time=2020-11-30T10:35:26Z msg="DockerGoClient: Starting streaming metrics for container c66a7105d6d84c97bc744a1902f98b71501c0a638140bd18b773e0616e74d1e7" module=docker_client.go
level=info time=2020-11-30T10:35:26Z msg="DockerGoClient: Starting streaming metrics for container f3059a19e304fa6a291f053a709ea8b9c5b4e296bd5f58b627bd8dcea26de723" module=docker_client.go
level=info time=2020-11-30T10:35:26Z msg="DockerGoClient: Starting streaming metrics for container 150cf1d112a83bb6a1cbf5a34b49f38f4ac22576cf65868e15a072563f3cf388" module=docker_client.go
level=warn time=2020-11-30T10:35:28Z msg="Error encountered processing metrics stream from docker, this may affect cloudwatch metric accuracy: DockerGoClient: Unable to decode stats for container ba18fb30d1fe640885bbb2d91f06e248d408a466e1b30336e1b695b8d225a8d0: context canceled" module=container.go
level=info time=2020-11-30T10:35:29Z msg="DockerGoClient: Starting streaming metrics for container b2e0b5a816a6bc956578ca9c4881c3e20e6364764f53fb535488b38bbd3c476e" module=docker_client.go
level=warn time=2020-11-30T10:35:30Z msg="Error encountered processing metrics stream from docker, this may affect cloudwatch metric accuracy: DockerGoClient: Unable to decode stats for container 3438926061af5d86abb39a9ba36ccb883a7d8874ae5f15cfd970bd1da81062ef: context canceled" module=container.go
level=info time=2020-11-30T10:35:30Z msg="DockerGoClient: Starting streaming metrics for container 6443a05106d3ef0ed6b815b813f723a1633dc6c27b66c54e85423f38613276de" module=docker_client.go
level=info time=2020-11-30T10:35:30Z msg="DockerGoClient: Starting streaming metrics for container 5aa005aa5df465be200b11bf68572ef58f8bcff2dd94228cfb76a65cbfabcc57" module=docker_client.go
level=info time=2020-11-30T10:35:30Z msg="DockerGoClient: Starting streaming metrics for container 2a88f03d38051c070af13dbab3b1c0815c123f4ce9cc01f787da52492c03b48d" module=docker_client.go
level=warn time=2020-11-30T10:35:31Z msg="Error encountered processing metrics stream from docker, this may affect cloudwatch metric accuracy: DockerGoClient: Unable to decode stats for container bbafaf04e73178cc5076fdcb81b98c81ca54e68683497de8fc715b92dff3cdf0: context canceled" module=container.go
level=info time=2020-11-30T10:35:33Z msg="cloudwatch metrics for container e49849f57f47253b828e049fba22f0f952d1416a53b58c379121b324c4bb1f49 not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:33Z msg="cloudwatch metrics for container 7ee18aff2068a8d24435249c50854bfe3f0387c936deb9c6aba2b56d588469fe not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:33Z msg="cloudwatch metrics for container c66a7105d6d84c97bc744a1902f98b71501c0a638140bd18b773e0616e74d1e7 not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:33Z msg="cloudwatch metrics for container 9aa7a27148355e85e25a3349ed8557cc937b8213acadd32c16d3f6276bfdd752 not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:33Z msg="cloudwatch metrics for container 5aa005aa5df465be200b11bf68572ef58f8bcff2dd94228cfb76a65cbfabcc57 not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:33Z msg="cloudwatch metrics for container cad42c5896b29e80bee45df3f09caca64ae3062793738c5ec1951bd0ad74451c not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:33Z msg="cloudwatch metrics for container 150cf1d112a83bb6a1cbf5a34b49f38f4ac22576cf65868e15a072563f3cf388 not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:33Z msg="cloudwatch metrics for container ba18fb30d1fe640885bbb2d91f06e248d408a466e1b30336e1b695b8d225a8d0 not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:33Z msg="cloudwatch metrics for container b400a6298cb4e9b8e231e5eb3b42e66fe026e0abdda93aceed9b6e05139e3b88 not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:33Z msg="cloudwatch metrics for container 6d8f73959468e3f63bb3df2654fb61a11181490c4924ef081d141124d1177a85 not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:33Z msg="cloudwatch metrics for container c0c78f46b12e4b44cd1d2a787e54e048d247a8d35785fc8a0d8ed148755e312a not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:33Z msg="cloudwatch metrics for container 6443a05106d3ef0ed6b815b813f723a1633dc6c27b66c54e85423f38613276de not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:33Z msg="cloudwatch metrics for container a61da7e8a1063d84804441c65c1d9278da077d00a83715c7bc55b64d2d603163 not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:33Z msg="cloudwatch metrics for container 2a88f03d38051c070af13dbab3b1c0815c123f4ce9cc01f787da52492c03b48d not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:33Z msg="cloudwatch metrics for container 66973f83de37ec8ee97fe988798b5011df9349c4b54ef579b1f6402e546d3703 not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:33Z msg="cloudwatch metrics for container 3438926061af5d86abb39a9ba36ccb883a7d8874ae5f15cfd970bd1da81062ef not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:33Z msg="cloudwatch metrics for container 17c3d17afb4847ebf3bc260dff2d09fbc1ac841f94a70812128e7ccd5223fda4 not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:33Z msg="cloudwatch metrics for container f3059a19e304fa6a291f053a709ea8b9c5b4e296bd5f58b627bd8dcea26de723 not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:33Z msg="cloudwatch metrics for container 60040e025fac197b8a26a38bb40e47ff9e793a7f6a30f63df8ae59dff95d2ce8 not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:33Z msg="cloudwatch metrics for container bbafaf04e73178cc5076fdcb81b98c81ca54e68683497de8fc715b92dff3cdf0 not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:33Z msg="cloudwatch metrics for container 6b96d5829e625dfceec43cc32bfec89f1a1ce60b24ea924c922b9c2a6df484e6 not collected, reason (cpu): need at least 1 non-NaN data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:33Z msg="cloudwatch metrics for container 2e9f29437bf8908dd91737ee87f7f93712c89403e280c41a9ee68509a197b73a not collected, reason (cpu): need at least 1 non-NaN data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:33Z msg="cloudwatch metrics for container b2e0b5a816a6bc956578ca9c4881c3e20e6364764f53fb535488b38bbd3c476e not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=info time=2020-11-30T10:35:33Z msg="cloudwatch metrics for container 235c1239f9845171c4745b1c328699f20cbcb944bce07561c88de9a22f5b8f50 not collected, reason (cpu): need at least 2 data points in queue to calculate CW stats set" module=engine.go
level=warn time=2020-11-30T10:35:33Z msg="Error publishing metrics: stats engine: no task metrics to report" module=client.go
level=info time=2020-11-30T10:35:35Z msg="DockerGoClient: Starting streaming metrics for container c0c78f46b12e4b44cd1d2a787e54e048d247a8d35785fc8a0d8ed148755e312a" module=docker_client.go
level=warn time=2020-11-30T10:35:36Z msg="Error encountered processing metrics stream from docker, this may affect cloudwatch metric accuracy: DockerGoClient: Unable to decode stats for container 6d8f73959468e3f63bb3df2654fb61a11181490c4924ef081d141124d1177a85: context canceled" module=container.go
level=warn time=2020-11-30T10:35:36Z msg="Error encountered processing metrics stream from docker, this may affect cloudwatch metric accuracy: DockerGoClient: Unable to decode stats for container 66973f83de37ec8ee97fe988798b5011df9349c4b54ef579b1f6402e546d3703: context canceled" module=container.go
level=info time=2020-11-30T10:35:36Z msg="DockerGoClient: Starting streaming metrics for container 2e9f29437bf8908dd91737ee87f7f93712c89403e280c41a9ee68509a197b73a" module=docker_client.go
level=warn time=2020-11-30T10:35:37Z msg="Error encountered processing metrics stream from docker, this may affect cloudwatch metric accuracy: DockerGoClient: Unable to decode stats for container 60040e025fac197b8a26a38bb40e47ff9e793a7f6a30f63df8ae59dff95d2ce8: context canceled" module=container.go
level=warn time=2020-11-30T10:35:39Z msg="Error encountered processing metrics stream from docker, this may affect cloudwatch metric accuracy: DockerGoClient: Unable to decode stats for container 17c3d17afb4847ebf3bc260dff2d09fbc1ac841f94a70812128e7ccd5223fda4: context canceled" module=container.go
level=warn time=2020-11-30T10:35:41Z msg="Error encountered processing metrics stream from docker, this may affect cloudwatch metric accuracy: DockerGoClient: Unable to decode stats for container b400a6298cb4e9b8e231e5eb3b42e66fe026e0abdda93aceed9b6e05139e3b88: context canceled" module=container.go
level=warn time=2020-11-30T10:35:42Z msg="Error encountered processing metrics stream from docker, this may affect cloudwatch metric accuracy: DockerGoClient: Unable to decode stats for container 9aa7a27148355e85e25a3349ed8557cc937b8213acadd32c16d3f6276bfdd752: context canceled" module=container.go
About this issue
- State: closed
- Created 4 years ago
- Comments: 23 (13 by maintainers)
We are also having this issue. Increasing reserved memory didn’t help either. This is a serious bug and only encourages us to use another orchestrator.
Hey @angelcar, we’re also experiencing this issue. We’ve previously pinned our source AMI to ami-0e5fb9632ceee168f; however, every time we update to a source AMI newer than that, we see a tremendous spike in log messages saying "Error encountered processing metrics stream from docker, this may affect cloudwatch metric accuracy: DockerGoClient: Unable to decode stats for container". We are not running fluentbit, and we’ve previously tried increasing ECS_RESERVED_MEMORY without success. Would it be alright to reopen this issue, and is there any information I can provide to help debug this?