superset: Unable to query BigQuery data - queries execute successfully in BigQuery but results are not returned in Superset
Unable to query BigQuery data. I can see the executed queries in the BigQuery project history, but even simple SQL statements do not return results in Superset.
How to reproduce the bug
- Installed the latest version of Apache Superset (2.1); also tried with an older version of Superset and with Python 3.8
- Installed the SQLAlchemy BigQuery connector (see the sketch after this list)
- Created the database connection successfully and was able to see the datasets and tables
- Creating a dataset, creating charts, or running simple SQL statements in SQL Lab produces a Gateway Timeout error.
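For context, a minimal sketch of the connector installation and the SQLAlchemy URI format used for the Superset database connection (the project ID below is a placeholder, not taken from this report):

```bash
# Install the SQLAlchemy BigQuery dialect in the Superset environment
pip install sqlalchemy-bigquery

# SQLAlchemy URI entered in Superset's "Connect a database" form (placeholder project ID):
#   bigquery://my-gcp-project
```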
Expected results
A result set returned in SQL Lab, and data returned in Superset when creating charts.
Actual results
Gateway Timeout error - queries appear to run forever in Superset and then fail once the timeout threshold is reached.
Screenshots

Environment
- Browser: Tested with Google Chrome, Edge, Safari
- superset version: Superset 2.1.0 / Tested with Superset 2.0
- python version: Python 3.10 / Tested with Python 3.8
- bigquery connector: sqlalchemy-bigquery 1.6.1 / Tested with Pybigquery
- OS version: Ubuntu 22.04.2 LTS
- Hosted on: GCP VM
Checklist
- [x] I have checked the superset logs for python stacktraces and included it here as text if there are any.
- [x] I have reproduced the issue with at least the latest released version of superset.
- [x] I have checked the issue tracker for the same issue and I haven’t found one similar.
Additional context
The VM is hosted on GCP Compute Engine and access to BigQuery has been verified - the BigQuery API is enabled and data can be queried from the VM using the same connector (sqlalchemy-bigquery), as shown below.
from sqlalchemy.engine import create_engine
from pprint import pprint

# Placeholders - replace with the actual GCP project and BigQuery dataset
project = "my-gcp-project"
dataset = "my_dataset"

# The sqlalchemy-bigquery URI takes the form bigquery://<project>/<dataset>
bigquery_uri = f"bigquery://{project}/{dataset}"
engine = create_engine(bigquery_uri)

# Table and partition-column names below are placeholders
query = 'SELECT * FROM my_table WHERE DATE(partition_date) = "2023-05-05" LIMIT 10'
rows = engine.execute(query).fetchall()  # engine.execute() works on SQLAlchemy 1.x
rows = [dict(row) for row in rows]
pprint(rows)
This does not look like a connectivity issue, as I can see the datasets and tables listed for the database, and once queries are executed from Superset, I can see the same queries completing successfully in BigQuery.

Please let me know if you require any additional information. Your assistance would be greatly appreciated. Thank you.
About this issue
- State: closed
- Created a year ago
- Comments: 15 (6 by maintainers)
@rathishkumar
I found the issue is related to `gevent`; as you guessed, it is related to the WSGI server. When I use the `gthread` worker instead of the `gevent` worker type on `gunicorn`, it works well!

@okayhooni & @rathishkumar thank you both! Setting the worker type to `gthread` completely resolves the issue we were having.

@rusackas I already updated the descriptions related to this issue in the official docs: https://github.com/apache/superset/pull/24564
@joev-indx thanks for replying to Alex 👍
@AlexMSm I cannot speak to using Cloud Run; we are running it as a GKE cluster. When initiating the server, something like this should work for a simple case:
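(The exact command from this comment was not captured here; the following is a minimal sketch of a `gunicorn` invocation using the `gthread` worker class, with the worker count, thread count, timeout, and bind address chosen purely as example values.)

```bash
# Run Superset under gunicorn with the gthread worker class instead of gevent
# (worker count, thread count, timeout, and bind address are example values)
gunicorn \
  -w 10 \
  -k gthread \
  --threads 20 \
  --timeout 120 \
  -b 0.0.0.0:8088 \
  "superset.app:create_app()"
```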
It's the `-k` option where you specify the `gthread` option. I hope this helps.

After a week of trial and error, I discovered that the issue only occurs when running with gunicorn. I was able to resolve the issue by switching to the waitress WSGI server. However, I am uncertain as to why running on gunicorn does not return results. If anyone else has observed a similar issue, please feel free to comment here.
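For reference, a minimal sketch of serving Superset with waitress instead of gunicorn (the host and port below are example values, not taken from this thread):

```bash
# Serve the Superset app factory with waitress instead of gunicorn
# (host and port are example values)
waitress-serve --host=0.0.0.0 --port=8088 --call "superset.app:create_app"
```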
Was facing the same issue on GCP Cloud Run & BigQuery. Please see https://apache-superset.slack.com/archives/C015WAZL0KH/p1686930010216179?thread_ts=1686908792.979199&cid=C015WAZL0KH for details.
Summary:
- The service account needs `roles/bigquery.resourceViewer`, otherwise there are IAM errors regarding missing `jobservice.getqueryresults` permissions.
- Set `--min-instances=` to anything > 0; e.g. `--min-instances=1` works.
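A sketch of the corresponding `gcloud` commands, assuming Superset runs as a Cloud Run service (the project, service-account, service, and region names below are placeholders):

```bash
# Grant the BigQuery Resource Viewer role to the service account Superset runs as
# (project and service-account names are placeholders)
gcloud projects add-iam-policy-binding my-gcp-project \
  --member="serviceAccount:superset-sa@my-gcp-project.iam.gserviceaccount.com" \
  --role="roles/bigquery.resourceViewer"

# Keep at least one Cloud Run instance running
# (service name and region are placeholders)
gcloud run services update superset --min-instances=1 --region=us-central1
```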
Hi @rusackas
Our team is currently in the process of migrating Tableau & Looker Studio reports to Apache Superset. However, this issue prevented us from moving forward with our proof of concept. We would like to ask whether any enterprise support is available to help us resolve this issue as soon as possible. Thanks!