I'm unable to install the Google provider package for Airflow. The command I used is
pip3 install apache-airflow-backport-providers-google
It gives me the error:
ERROR: Could not find a version that satisfies the requirement apache-airflow-backport-providers-google (from versions: none)
ERROR: No matching distribution found for apache-airflow-backport-providers-google
Because of this.
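One likely cause, stated as an assumption rather than a diagnosis: the `apache-airflow-backport-providers-*` packages were built only for Airflow 1.10 and are no longer published on PyPI, which produces exactly this "from versions: none" failure. On Airflow 2.x the package name dropped the `backport-` prefix:

```shell
# Assumption: you are running Airflow 2.x. The backport provider
# packages targeted Airflow 1.10 only; on 2.x the provider is
# published under a different name:
pip3 install apache-airflow-providers-google

# If you really are on Airflow 1.10.x, also check that pip is running
# under Python 3.6+; an old Python/pip can report "from versions: none"
# even for packages that do exist.
pip3 --version
```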
I'd like to know whether it's possible to get the total number of compute hours used by Dataproc instances by looking at the billing data.
Note: to reiterate, I'm not interested in the number of hours the cluster has existed; I want the total compute hours.
We export our billing data to BigQuery, and I ran the following query:
select cost_grouping,cast(sum(hours) as int64) as hours
from (
select case when sku_description like 'Licensing Fee for Google Cloud Dataproc%' then sku_description
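As a minimal sketch of the arithmetic behind such a query: in the BigQuery billing export, each row carries a `usage.amount` in the units given by `usage.unit`, and for vCPU-time SKUs the exported unit is typically `seconds`, so total compute hours is the summed amount divided by 3600. The row format below is a simplification for illustration, not the export schema itself:

```python
# Sketch: convert billing-export usage rows into compute hours.
# Assumption: rows are (usage_amount, usage_unit) pairs, as a stand-in
# for the export's usage.amount / usage.unit columns.
def usage_to_hours(rows):
    """Sum usage amounts across rows, converting seconds to hours."""
    total_seconds = 0.0
    for usage_amount, usage_unit in rows:
        if usage_unit == "seconds":
            total_seconds += usage_amount
        elif usage_unit == "hours":
            total_seconds += usage_amount * 3600
        else:
            raise ValueError(f"unhandled unit: {usage_unit}")
    return total_seconds / 3600

# Two vCPU entries of 3600 seconds each -> 2.0 compute hours
print(usage_to_hours([(3600, "seconds"), (3600, "seconds")]))  # 2.0
```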
I am following this to submit a job to an existing Dataproc cluster through Dataproc.
For the following lines of code:
// Configure the settings for the job controller client.
JobControllerSettings jobControllerSettings =
JobControllerSettings.newBuilder().setEndpoint(myEndpoint).build();
I receive the following error:
SEVERE: Servlet.service() for servlet [dispatcherServlet] in
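One detail worth checking in code like the above, offered as a hedged sketch rather than a diagnosis of this particular error: `setEndpoint` expects a regional Dataproc endpoint of the form `<region>-dataproc.googleapis.com:443`, and a malformed or mismatched endpoint is a common source of client-construction failures. A small stdlib-only helper illustrating the expected format (`endpointFor` is a name introduced here, not part of the client library):

```java
// Sketch: build the regional endpoint string that Dataproc clients expect.
// Assumption: the region matches the cluster's region; "global" uses the
// un-prefixed endpoint.
public class DataprocEndpoint {
    static String endpointFor(String region) {
        if (region.equals("global")) {
            return "dataproc.googleapis.com:443";
        }
        return region + "-dataproc.googleapis.com:443";
    }

    public static void main(String[] args) {
        // The result is what you would pass to
        // JobControllerSettings.newBuilder().setEndpoint(...).
        System.out.println(endpointFor("us-central1"));
    }
}
```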
I am trying to create a cluster in Dataproc using the client library; however, when I set region = 'us-central1', I get the following exception:
google.api_core.exceptions.InvalidArgument: 400 Region 'us-central1' is invalid.
Please see https://cloud.google.com/dataproc/docs/concepts/regional-endpoints
for additional information on regional endpoints
Code (based on):
#!/u
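The error message's pointer to the regional-endpoints page suggests the usual fix: for any region other than `global`, the client must be constructed against that region's endpoint via `client_options`. A minimal sketch, assuming `google-cloud-dataproc`; the client-construction lines are shown as comments because they require the library and credentials:

```python
# Sketch: derive the api_endpoint for a Dataproc region.
def regional_endpoint(region):
    """Return the regional endpoint; 'global' uses the bare endpoint."""
    if region == "global":
        return "dataproc.googleapis.com:443"
    return f"{region}-dataproc.googleapis.com:443"

# Assumed usage with google-cloud-dataproc (not executed here):
# from google.cloud import dataproc_v1
# client = dataproc_v1.ClusterControllerClient(
#     client_options={"api_endpoint": regional_endpoint("us-central1")}
# )

print(regional_endpoint("us-central1"))
```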
I am trying to use Dataproc by converting gcloud commands into API calls, but I can't find a good example of this in the documentation.
%pip install google-cloud-dataproc
The only good example I've found is this one, and it works fine:
from google.cloud import dataproc_v1
client = dataproc_v1.ClusterControllerClient()
project_id = 'test-project'
region = 'global'
for element in client.list_clusters(proje
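To sketch how a gcloud command maps onto the API: a command like `gcloud dataproc jobs submit pyspark <uri> --cluster=<name>` corresponds to building a job message and passing it to the job controller. The helper below only builds the request dict (so it runs without GCP); the bucket URI and cluster name are placeholders, and the actual submission call is shown as a comment because its exact signature varies across library versions:

```python
# Sketch: the job body roughly corresponding to
# `gcloud dataproc jobs submit pyspark gs://my-bucket/job.py --cluster=my-cluster`.
# All names here are illustrative placeholders.
def pyspark_job_request(cluster_name, main_python_file_uri):
    """Build the job dict to hand to the Dataproc job controller."""
    return {
        "placement": {"cluster_name": cluster_name},
        "pyspark_job": {"main_python_file_uri": main_python_file_uri},
    }

job = pyspark_job_request("my-cluster", "gs://my-bucket/job.py")

# Assumed usage with google-cloud-dataproc (not executed here):
# job_client = dataproc_v1.JobControllerClient(...)
# job_client.submit_job(project_id=project_id, region=region, job=job)
print(job["placement"]["cluster_name"])
```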
I have a Python Cloud Function that was originally deployed to us-central1. When trying to deploy this function to us-east4, I get the following error while the code is being loaded:
ERROR: (gcloud.functions.deploy) OperationError: code=3, message=Function failed on loading user code. This is likely due to a bug in the user code. Error message: with_scopes_if_required() got an unexpected keyword argument 'de
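If the truncated keyword argument is `default_scopes`, this pattern matches a known incompatibility: newer `google-api-core` releases pass `default_scopes=` to `with_scopes_if_required()`, which older `google-auth` versions do not accept. Pinning compatible versions in the function's `requirements.txt` is the usual remedy; the version floors below are an assumption, not taken from the question:

```text
# requirements.txt — assumed pins; the point is that google-auth must be
# new enough to accept the default_scopes keyword.
google-auth>=1.25.0
google-api-core>=1.26.0
```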
I'm writing some code for a class project that submits jobs to a Dataproc cluster in GCP. I recently ran into a strange error that I'm having a hard time wrapping my head around. The error is as follows:
Exception in thread "Thread-5" java.lang.NoClassDefFoundError: io/grpc/CallCredentials2
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:756)
at java.security.Sec
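`NoClassDefFoundError` on an `io.grpc` class usually points to a grpc-java version clash on the classpath: `CallCredentials2` existed only in a window of grpc-java releases, so a google-cloud client compiled against one grpc version can break when a build pulls in another. One common fix, sketched here as an assumption about a Maven build, is to let the Google Cloud libraries BOM align all `com.google.cloud` and `io.grpc` versions (the BOM version shown is illustrative):

```xml
<!-- Sketch: align google-cloud and io.grpc versions via the BOM.
     The version number is illustrative, not taken from the question. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>com.google.cloud</groupId>
      <artifactId>libraries-bom</artifactId>
      <version>26.0.0</version>
      <type>pom</type>
      <scope>import</scope>
    </dependency>
  </dependencies>
</dependencyManagement>
```

With the BOM in place, the individual `google-cloud-dataproc` dependency is declared without a version and inherits a mutually compatible set.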