I am trying to use BigQuery from an AI Platform Notebooks instance, but I am running into a ContextualVersionConflict. In this toy example, I am trying to pull two columns of data from the table bgt_all in the bq_bgt_storage BigQuery dataset of the job2vec project.
from google.cloud import bigquery
client = bigquery.Client()
aaa="""
SELECT BGTJobId, soc6 FROM `job2vec.bq_bgt_storage.bgt_all` LIMIT 100
"""
df = client.query(aaa).to_dataframe()
df.head()
This returns:
---------------------------------------------------------------------------
ContextualVersionConflict Traceback (most recent call last)
<ipython-input-25-7bdfe216bcc8> in <module>
7 SELECT BGTJobId, soc6 FROM `job2vec.bq_bgt_storage.bgt_all` LIMIT 100
8 """
----> 9 df = client.query(aaa).to_dataframe()
10 df.head()
/opt/conda/lib/python3.7/site-packages/google/cloud/bigquery/job.py in to_dataframe(self, bqstorage_client, dtypes, progress_bar_type, create_bqstorage_client, date_as_object)
3381 progress_bar_type=progress_bar_type,
3382 create_bqstorage_client=create_bqstorage_client,
-> 3383 date_as_object=date_as_object,
3384 )
3385
/opt/conda/lib/python3.7/site-packages/google/cloud/bigquery/table.py in to_dataframe(self, bqstorage_client, dtypes, progress_bar_type, create_bqstorage_client, date_as_object)
1725 progress_bar_type=progress_bar_type,
1726 bqstorage_client=bqstorage_client,
-> 1727 create_bqstorage_client=create_bqstorage_client,
1728 )
1729 df = record_batch.to_pandas(date_as_object=date_as_object)
/opt/conda/lib/python3.7/site-packages/google/cloud/bigquery/table.py in to_arrow(self, progress_bar_type, bqstorage_client, create_bqstorage_client)
1535 owns_bqstorage_client = False
1536 if not bqstorage_client and create_bqstorage_client:
-> 1537 bqstorage_client = self.client._create_bqstorage_client()
1538 owns_bqstorage_client = bqstorage_client is not None
1539
/opt/conda/lib/python3.7/site-packages/google/cloud/bigquery/client.py in _create_bqstorage_client(self)
402 """
403 try:
--> 404 from google.cloud import bigquery_storage_v1
405 except ImportError:
406 warnings.warn(
/opt/conda/lib/python3.7/site-packages/google/cloud/bigquery_storage_v1/__init__.py in <module>
20
21 __version__ = pkg_resources.get_distribution(
---> 22 "google-cloud-bigquery-storage"
23 ).version # noqa
24
/opt/conda/lib/python3.7/site-packages/pkg_resources/__init__.py in get_distribution(dist)
478 dist = Requirement.parse(dist)
479 if isinstance(dist, Requirement):
--> 480 dist = get_provider(dist)
481 if not isinstance(dist, Distribution):
482 raise TypeError("Expected string, Requirement, or Distribution", dist)
/opt/conda/lib/python3.7/site-packages/pkg_resources/__init__.py in get_provider(moduleOrReq)
354 """Return an IResourceProvider for the named module or requirement"""
355 if isinstance(moduleOrReq, Requirement):
--> 356 return working_set.find(moduleOrReq) or require(str(moduleOrReq))[0]
357 try:
358 module = sys.modules[moduleOrReq]
/opt/conda/lib/python3.7/site-packages/pkg_resources/__init__.py in require(self, *requirements)
897 included, even if they were already activated in this working set.
898 """
--> 899 needed = self.resolve(parse_requirements(requirements))
900
901 for dist in needed:
/opt/conda/lib/python3.7/site-packages/pkg_resources/__init__.py in resolve(self, requirements, env, installer, replace_conflicting, extras)
788 # Oops, the "best" so far conflicts with a dependency
789 dependent_req = required_by[req]
--> 790 raise VersionConflict(dist, req).with_context(dependent_req)
791
792 # push the new requirements onto the stack
ContextualVersionConflict: (google-api-core 1.22.1 (/opt/conda/lib/python3.7/site-packages), Requirement.parse('google-api-core[grpc]<2.0.0dev,>=1.22.2'), {'google-cloud-bigquery-storage'})
This is strange, because when I run !pip install --upgrade google-api-core it reports version 1.24.1, so I don't quite understand why the conflict says 1.22.1.
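One way to see why pip and the error disagree is to check what the running kernel's package metadata actually resolves, and whether stale metadata directories are left behind by mixed pip/conda installs; a minimal diagnostic sketch (the site-packages path is copied from the traceback above):
import glob
import pkg_resources

# This is the same lookup that raised the ContextualVersionConflict:
# it reflects the metadata the current kernel sees, not what pip just installed.
dist = pkg_resources.get_distribution("google-api-core")
print(dist.version, dist.location)

# Leftover or duplicate metadata folders from mixed pip/conda installs can
# explain a mismatch between this value and what pip reports.
print(glob.glob("/opt/conda/lib/python3.7/site-packages/google_api_core-*"))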
Edit: when I run !conda list | grep google, the following shows up:
google-api-core-grpcio-gcp 1.16.0 1 conda-forge
google-api-python-client 1.9.1 pyh9f0ad1d_0 conda-forge
google-apitools 0.5.31 pypi_0 pypi
google-auth 1.24.0 pypi_0 pypi
google-auth-httplib2 0.0.3 py_3 conda-forge
google-auth-oauthlib 0.4.1 py_2 conda-forge
google-cloud-bigquery 1.24.0 pypi_0 pypi
google-cloud-bigquery-storage 2.1.0 pypi_0 pypi
google-cloud-bigtable 1.0.0 pypi_0 pypi
google-cloud-core 1.3.0 pypi_0 pypi
google-cloud-dataproc 1.1.1 pypi_0 pypi
google-cloud-datastore 1.7.4 pypi_0 pypi
google-cloud-dlp 0.13.0 pypi_0 pypi
google-cloud-firestore 1.8.1 pypi_0 pypi
google-cloud-kms 1.4.0 pypi_0 pypi
google-cloud-language 1.3.0 pypi_0 pypi
google-cloud-logging 1.15.1 pypi_0 pypi
google-cloud-pubsub 1.0.2 pypi_0 pypi
google-cloud-scheduler 1.3.0 pypi_0 pypi
google-cloud-spanner 1.17.1 pypi_0 pypi
google-cloud-speech 1.3.2 pypi_0 pypi
google-cloud-storage 1.30.0 pypi_0 pypi
google-cloud-tasks 1.5.0 pypi_0 pypi
google-cloud-translate 2.0.2 pypi_0 pypi
google-cloud-videointelligence 1.13.0 pypi_0 pypi
google-cloud-vision 0.42.0 pypi_0 pypi
google-crc32c 0.1.0 pypi_0 pypi
google-pasta 0.2.0 pypi_0 pypi
google-resumable-media 0.7.1 pypi_0 pypi
googleapis-common-protos 1.51.0 py37hc8dfbb8_2 conda-forge
grpc-google-iam-v1 0.12.3 pypi_0 pypi
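Since pip and conda each keep their own metadata, and the listing above mixes pypi and conda-forge packages, comparing pip's view of the same packages can help show which side is stale (a quick check, not part of the original post):
# Run in a notebook cell: pip's view of the packages involved in the conflict
!pip list | grep -i -E "google-api-core|bigquery"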
Posted on 2021-01-05 10:19:10
To contribute further to the community, I am posting an answer based on my comments above.
First, you should try upgrading the packages with the following command:
pip install --upgrade pandas-gbq 'google-cloud-bigquery[bqstorage,pandas]'
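After upgrading, the notebook kernel usually has to be restarted before the new versions are actually imported; a quick way to confirm what the restarted kernel resolves (a sketch, not part of the original answer):
import pkg_resources

# Print the versions the current kernel's package metadata resolves to
for name in ("google-api-core", "google-cloud-bigquery", "google-cloud-bigquery-storage"):
    print(name, pkg_resources.get_distribution(name).version)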
Then, instead of using the to_dataframe() method, you can use pandas.read_gbq(), which loads the data from BigQuery using the environment's default project, like this:
import pandas
sql = """
SELECT name
FROM `bigquery-public-data.usa_names.usa_1910_current`
WHERE state = 'TX'
LIMIT 100
"""
# Run a Standard SQL query using the environment's default project
df = pandas.read_gbq(sql, dialect='standard')
# Run a Standard SQL query with the project set explicitly
project_id = 'your-project-id'
df = pandas.read_gbq(sql, project_id=project_id, dialect='standard')
The code above is taken from the documentation at https://cloud.google.com/bigquery/docs/pandas-gbq-migration#standard-sql-query.
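Applied to the query from the question, the same approach would look roughly like this (assuming the notebook has default application credentials, as it does on AI Platform Notebooks):
import pandas

sql = """
SELECT BGTJobId, soc6
FROM `job2vec.bq_bgt_storage.bgt_all`
LIMIT 100
"""

# Uses the environment's default project for billing, like the original client.query() call
df = pandas.read_gbq(sql, dialect='standard')
df.head()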
Original question: https://stackoverflow.com/questions/65515464