Steps to reproduce
- Using google.cloud.bigquery v1.23.0, the import below fails.
- v1.22.0 works fine instead.
Code example
from concurrent.futures import ThreadPoolExecutor
from logging import StreamHandler, Formatter, INFO, getLogger
from datetime import datetime, timedelta, timezone
from google.cloud import bigquery
from google.cloud.bigquery import LoadJobConfig
from google.cloud.bigquery import SchemaField
Stack trace
---------------------------------------------------------------------------
ImportError Traceback (most recent call last)
<command-778378> in <module>
3 from datetime import datetime, timedelta, timezone
4
----> 5 from google.cloud import bigquery
6 from google.cloud.bigquery import LoadJobConfig
7 from google.cloud.bigquery import SchemaField
/databricks/python/lib/python3.7/site-packages/google/cloud/bigquery/__init__.py in <module>
33 __version__ = get_distribution("google-cloud-bigquery").version
34
---> 35 from google.cloud.bigquery.client import Client
36 from google.cloud.bigquery.dataset import AccessEntry
37 from google.cloud.bigquery.dataset import Dataset
/databricks/python/lib/python3.7/site-packages/google/cloud/bigquery/client.py in <module>
56 from google.cloud.bigquery._helpers import _verify_job_config_type
57 from google.cloud.bigquery._http import Connection
---> 58 from google.cloud.bigquery import _pandas_helpers
59 from google.cloud.bigquery.dataset import Dataset
60 from google.cloud.bigquery.dataset import DatasetListItem
/databricks/python/lib/python3.7/site-packages/google/cloud/bigquery/_pandas_helpers.py in <module>
38 pyarrow = None
39
---> 40 from google.cloud.bigquery import schema
41
42
/databricks/python/lib/python3.7/site-packages/google/cloud/bigquery/schema.py in <module>
15 """Schemas for BigQuery tables / queries."""
16
---> 17 from six.moves import collections_abc
18
19 from google.cloud.bigquery_v2 import types
ImportError: cannot import name 'collections_abc' from 'six.moves' (unknown location)
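A quick way to confirm the root cause from the affected environment (a diagnostic sketch; as the comments below note, six.moves.collections_abc only exists in six >= 1.13.0):

```python
# Check the installed six before importing google.cloud.bigquery.
import six

print("six version:", six.__version__)

try:
    from six.moves import collections_abc  # alias added in six 1.13.0
    print("collections_abc import OK")
except ImportError:
    print("six is too old for google-cloud-bigquery >= 1.23.0; upgrade six")
```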
I am reproducing this issue with these versions:
google-cloud-bigquery==1.24.0
six==1.14.0

Adding six==1.13.0 to the requirements file of the main project worked for me.
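A minimal sketch of that workaround, assuming a plain requirements.txt layout (the surrounding entries are illustrative; only the six pin comes from the comment above):

```
# requirements.txt (sketch)
google-cloud-bigquery==1.24.0
# Explicit pin works around the missing lower bound in the BigQuery/core packages;
# any six >= 1.13.0 should also satisfy the collections_abc import.
six==1.13.0
```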
Hi,
Would it be possible for either google-cloud-bigquery to explicitly list that it requires six >= 1.13.0, or for google-cloud-core to increase its lower bound and google-cloud-bigquery to require the new version?

The problem is that pip install --upgrade on projects that depend upon google-cloud-bigquery now fails, because google-cloud-core incorrectly asserts that it only needs six >= 1.10.0 (see here). I agree that this problem can be fixed by manually upgrading six, but the point of listing dependencies and their versions is that this happens automatically.
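A diagnostic sketch showing the declared bound that pip resolves against (package names are the ones discussed in this thread):

```python
# Inspect the dependency metadata pip uses during resolution.
import pkg_resources

for name in ("google-cloud-bigquery", "google-cloud-core", "six"):
    dist = pkg_resources.get_distribution(name)
    print(name, dist.version)

# What google-cloud-core claims to need: with the loose "six >= 1.10.0" bound,
# pip install --upgrade is free to leave an old six in place.
core = pkg_resources.get_distribution("google-cloud-core")
print([str(req) for req in core.requires()])
```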
@HemangChothani Shouldn't it work out of the box after installing google.cloud.bigquery, without any need to install further packages?

Still facing this issue. My Jenkins deployment pipeline broke with the same message.
Packages on my server:
Please advise.
Sorry that you have encountered this issue. I should have checked the minimum version of six for the collections.abc fix for Python 3.8. I do see it was added in 1.13.0: https://github.com/benjaminp/six/blob/203b81c2a719466ed13681f0062a4426c07c7481/CHANGES#L11

Indeed, the correct fix is for bigquery to add six >=1.13.0,<2.0.0dev as a dependency in setup.py.
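A sketch of what that setup.py change might look like (illustrative only, not the actual upstream diff):

```python
# setup.py (sketch of the proposed dependency bump)
from setuptools import setup, find_packages

setup(
    name="google-cloud-bigquery",
    # ... other metadata elided ...
    install_requires=[
        # Raised lower bound so six.moves.collections_abc is always importable.
        "six >=1.13.0,<2.0.0dev",
        # ... remaining dependencies unchanged ...
    ],
    packages=find_packages(),
)
```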
My pipeline broke today. I install google-cloud-bigquery using a bootstrap script while creating the Dataproc cluster, and I'm getting the same error after submitting my Spark job. So I've migrated to the previous version (1.22.0) and it's working fine.
@jordangonzales The latest version of the google.cloud.bigquery package is 1.23.1. Try downgrading it to 1.23.0, as that is working with six==1.14.
@smdmts @shihabuddinbuet @SaschaHeyer Please update or install the latest version of six, which is 1.13.0. The error is due to an older version of six, such as 1.12.0.

google-cloud-bigquery==1.24.0 with six==1.14 works with Python 2.7 but not Python 3.7.
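For context on the Python 2.7 vs 3.7 difference, a small sketch of what the failing import resolves to once a new enough six is installed (per the six changelog, collections_abc aliases collections.abc on Python 3.3+ and collections on older interpreters; the snippet is illustrative):

```python
# Verify what six.moves.collections_abc points at on this interpreter.
import sys
from six.moves import collections_abc  # requires six >= 1.13.0

print(sys.version_info)
print(collections_abc)          # collections.abc on Python 3.3+, collections on Python 2
print(collections_abc.Mapping)  # one of the ABCs exposed through the alias
```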