aiobotocore: TypeError: 'coroutine' object is not subscriptable

Describe the bug

I get the error TypeError: 'coroutine' object is not subscriptable when I try to read a CSV from S3. It does not occur if I set AWS_DEFAULT_REGION as an environment variable. This did not seem to happen until about a week or so ago; I am now using the latest versions of aiobotocore and s3fs, installed via pip.

Minimal reproducer using aiobotocore:

import asyncio

import aiobotocore

async def main():
    session = aiobotocore.get_session()
    # Raises "TypeError: 'coroutine' object is not subscriptable"
    # unless AWS_DEFAULT_REGION is set in the environment.
    async with session.create_client('s3') as client:
        return await client.head_object(Bucket="bucket", Key="some_csv.csv")

asyncio.run(main())
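For reference, this is the workaround I am currently using: pinning the region via the environment before the client is created avoids the error. The region name below is a placeholder; substitute the bucket's actual region.

```python
import os

# Workaround sketch: set the default region up front so the client
# does not need to resolve the bucket's region itself.
# "us-east-1" is a placeholder value, not the real bucket region.
os.environ["AWS_DEFAULT_REGION"] = "us-east-1"
```

The same effect can be had by exporting AWS_DEFAULT_REGION in the shell before running the script.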

Thanks to @martindurant for helping me with the MRE.

Checklist

  • I have reproduced in an environment where pip check passes without errors
  • I have provided pip freeze results
  • I have provided sample code or detailed way to reproduce
  • I have tried the same code in botocore to ensure this is an aiobotocore specific issue
  • I have tried similar code in aiohttp to ensure this is an aiobotocore specific issue
  • I have checked the latest and older versions of aiobotocore/aiohttp/python to see if this is a regression / injection

pip freeze results

aiobotocore==1.1.0
aiohttp==3.6.2
aioitertools==0.7.0
async-timeout==3.0.1
asyncio==3.4.3
attrs==20.1.0
botocore==1.17.44
certifi==2020.6.20
chardet==3.0.4
click==7.1.2
cloudpickle @ file:///home/conda/feedstock_root/build_artifacts/cloudpickle_1598400192773/work
cytoolz==0.10.1
dask @ file:///home/conda/feedstock_root/build_artifacts/dask-core_1598124697190/work
distributed @ file:///home/conda/feedstock_root/build_artifacts/distributed_1598124720266/work
docutils==0.15.2
fastavro @ file:///home/conda/feedstock_root/build_artifacts/fastavro_1598244349821/work
fastrlock==0.5
fsspec @ file:///home/conda/feedstock_root/build_artifacts/fsspec_1596221475257/work
GDAL==3.0.4
HeapDict==1.0.1
idna==2.10
Jinja2==2.11.2
jmespath==0.10.0
joblib @ file:///home/conda/feedstock_root/build_artifacts/joblib_1593624380152/work
llvmlite==0.33.0
locket==0.2.0
MarkupSafe==1.1.1
msgpack==1.0.0
multidict==4.7.6
numba @ file:///home/conda/feedstock_root/build_artifacts/numba_1594681180710/work
numpy @ file:///home/conda/feedstock_root/build_artifacts/numpy_1597938346492/work
olefile==0.46
packaging @ file:///home/conda/feedstock_root/build_artifacts/packaging_1589925210001/work
pandas @ file:///home/conda/feedstock_root/build_artifacts/pandas_1598294454723/work
partd==1.1.0
pickle5 @ file:///home/conda/feedstock_root/build_artifacts/pickle5_1592859247555/work
Pillow @ file:///home/conda/feedstock_root/build_artifacts/pillow_1594213010297/work
psutil @ file:///home/conda/feedstock_root/build_artifacts/psutil_1594826921622/work
pyarrow==0.17.1
pynvml @ file:///home/conda/feedstock_root/build_artifacts/pynvml_1594169898909/work
pyparsing==2.4.7
python-dateutil==2.8.1
pytz==2020.1
PyYAML==5.3.1
s3fs==0.5.0
scikit-learn @ file:///home/conda/feedstock_root/build_artifacts/scikit-learn_1596546074663/work
scipy @ file:///home/conda/feedstock_root/build_artifacts/scipy_1595583586868/work
six @ file:///home/conda/feedstock_root/build_artifacts/six_1590081179328/work
sortedcontainers @ file:///home/conda/feedstock_root/build_artifacts/sortedcontainers_1591999956871/work
tblib==1.6.0
threadpoolctl @ file:///tmp/tmp79xdzxkt/threadpoolctl-2.1.0-py3-none-any.whl
toolz==0.10.0
tornado==6.0.4
treelite==0.92
treelite-runtime==0.92
typing-extensions @ file:///home/conda/feedstock_root/build_artifacts/typing_extensions_1588470653596/work
urllib3==1.25.10
wrapt==1.12.1
yarl==1.5.1
zict==2.0.0

Environment:

  • Python Version: 3.7
  • OS name and version: Ubuntu 18.04

About this issue

  • Original URL
  • State: closed
  • Created 4 years ago
  • Comments: 18

Most upvoted comments

Thanks from me too, on behalf of s3fs

@martindurant working for me as well!

@thehesiod thanks for the fix. 🎖️🎖️🎖️

@martindurant Yes, it works for me! Woohoo! Note that aiobotocore=1.1.1 is on conda-forge now also!

@chinmaychandak @rsignell-usgs @inderpartap : the fix was released as aiobotocore 1.1.1 (on PyPI, not yet conda). Would each of you please test with the latest version (it probably needs a newer botocore too, please check)?

@thehesiod - thank you! The code looks much more complicated than I would have thought. Being relatively new in async-land, I much appreciate your work here.

Just a note from another s3fs user that it would be great not to have to specify the region when we didn’t have to previous versions! https://github.com/intake/filesystem_spec/issues/386#issuecomment-683787104

@thehesiod I now tried it on an EC2 outside EKS in a different region than the bucket, and get the same error. So I think it might not be related to running inside containers/on Kubernetes.

thanks @chinmaychandak that will help us to repro

or if you’re running this on a specially configured ec2 instance, etc

I am running this inside a Docker container on AWS Kubernetes (EKS). The bucket is in a different region than my Kubernetes cluster.

I just tried it on an EC2 instance outside of Kubernetes in the same region as the bucket, and I don’t seem to see this issue.