
Dataproc ignoring Spark configuration

Stack Overflow user
Asked on 2020-12-10 14:58:59
2 answers · 429 views · 0 followers · 1 vote

I am running the spark-submit command below on a Dataproc cluster, but I noticed that a few of the Spark configurations are being ignored. What is the reason they are ignored?

gcloud dataproc jobs submit spark --cluster=<Cluster> --class=<class_name> --jars=<list_of_jars> --region=<region> --files=<list_of_files> --properties=spark.driver.extraJavaOptions="-Dconfig.file=application_dev.json -Dlog4j.configuration=log4j.properties",spark.executor.extraJavaOptions="-Dconfig.file=application_dev.json -Dlog4j.configuration=log4j.properties, spark.executor.instances=36, spark.executor.cores=4, spark.executor.memory=4G, spark.driver.memory=8G, spark.shuffle.service.enabled=true, spark.yarn.maxAppAttempts=1, spark.sql.shuffle.partitions=200, spark.executor.memoryOverhead=7680, spark.driver.maxResultSize=0, spark.port.maxRetries=250, spark.dynamicAllocation.initialExecutors=20, spark.dynamicAllocation.minExecutors=10"


Warning: Ignoring non-Spark config property:  spark.driver.maxResultSize
Warning: Ignoring non-Spark config property:  spark.driver.memory
Warning: Ignoring non-Spark config property:  spark.dynamicAllocation.minExecutors
Warning: Ignoring non-Spark config property:  spark.executor.cores
Warning: Ignoring non-Spark config property:  spark.port.maxRetries
Warning: Ignoring non-Spark config property:  spark.yarn.maxAppAttempts
Warning: Ignoring non-Spark config property:  spark.dynamicAllocation.initialExecutors
Warning: Ignoring non-Spark config property:  spark.executor.memory
Warning: Ignoring non-Spark config property:  spark.executor.memoryOverhead
Warning: Ignoring non-Spark config property:  spark.sql.shuffle.partitions
Warning: Ignoring non-Spark config property:  spark.executor.instances
2 Answers

Stack Overflow user

Accepted answer

Posted on 2020-12-10 15:10:11

Try the one below. Those settings are not extraJavaOptions; they belong under properties in their own right. In your command the second closing quote only appears at the very end and each key after a comma starts with a space, which is most likely why spark-submit does not recognize them as Spark properties.

gcloud dataproc jobs submit spark --cluster=<Cluster> --class=<class_name> --jars=<list_of_jars> --region=<region> --files=<list_of_files> --properties=spark.driver.extraJavaOptions="-Dconfig.file=application_dev.json -Dlog4j.configuration=log4j.properties",spark.executor.extraJavaOptions="-Dconfig.file=application_dev.json -Dlog4j.configuration=log4j.properties",spark.executor.instances=36,spark.executor.cores=4,spark.executor.memory=4G,spark.driver.memory=8G,spark.shuffle.service.enabled=true,spark.yarn.maxAppAttempts=1,spark.sql.shuffle.partitions=200,spark.executor.memoryOverhead=7680,spark.driver.maxResultSize=0,spark.port.maxRetries=250,spark.dynamicAllocation.initialExecutors=20,spark.dynamicAllocation.minExecutors=10

In a more readable form:

gcloud dataproc jobs submit spark --cluster=<Cluster> --class=<class_name> --jars=<list_of_jars> --region=<region> --files=<list_of_files> 
--properties=spark.driver.extraJavaOptions="
    -Dconfig.file=application_dev.json
    -Dlog4j.configuration=log4j.properties
",spark.executor.extraJavaOptions="
    -Dconfig.file=application_dev.json
    -Dlog4j.configuration=log4j.properties
",
spark.executor.instances=36,
spark.executor.cores=4,
spark.executor.memory=4G,
spark.driver.memory=8G,
spark.shuffle.service.enabled=true,
spark.yarn.maxAppAttempts=1,
spark.sql.shuffle.partitions=200,
spark.executor.memoryOverhead=7680,
spark.driver.maxResultSize=0,
spark.port.maxRetries=250,
spark.dynamicAllocation.initialExecutors=20,
spark.dynamicAllocation.minExecutors=10
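
If you want to confirm which properties actually reached the job, one option (a sketch; the job ID is whatever gcloud printed when you submitted) is to read the job back and inspect the sparkJob.properties section of the output:

# Hypothetical job ID and region placeholders; the describe output should include
# the sparkJob.properties map that Dataproc attached to the submission.
gcloud dataproc jobs describe <job_id> --region=<region>

Each key/value pair passed via --properties should appear there verbatim; anything missing or mangled points back at quoting or delimiter issues in the submit command.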
Votes: 2

Stack Overflow user

Posted on 2020-12-15 05:25:33

Can you try this?

gcloud dataproc jobs submit spark \
  --cluster=<Cluster> \
  --class=<class_name> \
  --jars=<list_of_jars> \
  --region=<region> \
  --files=<list_of_files> \
  --properties=^#^spark.driver.extraJavaOptions="-Dconfig.file=application_dev.json -Dlog4j.configuration=log4j.properties"#spark.executor.extraJavaOptions="-Dconfig.file=application_dev.json -Dlog4j.configuration=log4j.properties"#spark.executor.instances=36#spark.executor.cores=4#spark.executor.memory=4G#spark.driver.memory=8G#spark.shuffle.service.enabled=true#spark.yarn.maxAppAttempts=1#spark.sql.shuffle.partitions=200#spark.executor.memoryOverhead=7680#spark.driver.maxResultSize=0#spark.port.maxRetries=250#spark.dynamicAllocation.initialExecutors=20#spark.dynamicAllocation.minExecutors=10
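
For reference, the leading ^#^ uses gcloud's alternative-delimiter escaping (documented under gcloud topic escaping): it tells gcloud to split the --properties list on '#' instead of ',', so values that themselves contain spaces or commas, like the extraJavaOptions strings, survive intact. A minimal sketch with placeholder values:

# The ^#^ prefix switches the list delimiter to '#', so the two properties below
# are split on '#' rather than on ','. Cluster, region, class and jar values are placeholders.
gcloud dataproc jobs submit spark \
  --cluster=<Cluster> \
  --region=<region> \
  --class=<class_name> \
  --jars=<list_of_jars> \
  --properties=^#^spark.driver.memory=8G#spark.executor.memory=4G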
Votes: 0
Original content provided by Stack Overflow.
Original link: https://stackoverflow.com/questions/65229816
