NoSuchMethodError - org.apache.spark.util.Utils$.withDummyCallSite
Stack Overflow user
Asked on 2015-11-20 02:07:47
1 answer · 1.7K views · 0 following · 1 vote

I am trying to launch a Spark job (Spark 1.4.0) on a cluster. Whether I launch it from the command line or from Eclipse, I get an error about the withDummyCallSite function missing from the Spark Utils class. In the Maven dependencies I can see that spark-core_2.10-1.4.0.jar is loaded, which should contain this function. I am running Java 1.7, the same Java version the code was previously compiled with. I can see on the Spark Master monitor that the job is started, so it does not look like a firewall issue. Here is the error I see in the console (from both the command line and Eclipse):

ERROR 09:53:06,314  Logging.scala:75 -- Task 0 in stage 1.0 failed 4 times; aborting job
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
java.lang.NoSuchMethodError: org.apache.spark.util.Utils$.withDummyCallSite(Lorg/apache/spark/SparkContext;Lscala/Function0;)Ljava/lang/Object;
    at org.apache.spark.sql.parquet.ParquetRelation2.buildScan(newParquet.scala:269)
    at org.apache.spark.sql.sources.HadoopFsRelation.buildScan(interfaces.scala:530)
    at org.apache.spark.sql.sources.DataSourceStrategy$$anonfun$8.apply(DataSourceStrategy.scala:98)
    at org.apache.spark.sql.sources.DataSourceStrategy$$anonfun$8.apply(DataSourceStrategy.scala:98)
    at org.apache.spark.sql.sources.DataSourceStrategy$$anonfun$pruneFilterProject$1.apply(DataSourceStrategy.scala:266)
    at org.apache.spark.sql.sources.DataSourceStrategy$$anonfun$pruneFilterProject$1.apply(DataSourceStrategy.scala:265)
    at org.apache.spark.sql.sources.DataSourceStrategy$.pruneFilterProjectRaw(DataSourceStrategy.scala:296)
    at org.apache.spark.sql.sources.DataSourceStrategy$.pruneFilterProject(DataSourceStrategy.scala:261)
    at org.apache.spark.sql.sources.DataSourceStrategy$.apply(DataSourceStrategy.scala:94)
    at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:58)
    at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:58)
    at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371)
    at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:59)
    at org.apache.spark.sql.catalyst.planning.QueryPlanner.planLater(QueryPlanner.scala:54)
    at org.apache.spark.sql.execution.SparkStrategies$HashAggregation$.apply(SparkStrategies.scala:162)
    at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:58)
    at org.apache.spark.sql.catalyst.planning.QueryPlanner$$anonfun$1.apply(QueryPlanner.scala:58)
    at scala.collection.Iterator$$anon$13.hasNext(Iterator.scala:371)
    at org.apache.spark.sql.catalyst.planning.QueryPlanner.plan(QueryPlanner.scala:59)
    at org.apache.spark.sql.SQLContext$QueryExecution.sparkPlan$lzycompute(SQLContext.scala:932)
    at org.apache.spark.sql.SQLContext$QueryExecution.sparkPlan(SQLContext.scala:930)
    at org.apache.spark.sql.SQLContext$QueryExecution.executedPlan$lzycompute(SQLContext.scala:936)
    at org.apache.spark.sql.SQLContext$QueryExecution.executedPlan(SQLContext.scala:936)
    at org.apache.spark.sql.DataFrame.collect(DataFrame.scala:1255)
    at org.apache.spark.sql.DataFrame.count(DataFrame.scala:1269)

(Log truncated for brevity)

Thanks in advance for any pointers!
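A NoSuchMethodError at runtime usually means the class was loaded from a different jar than the one compiled against. One way to check which jar actually supplies a class at runtime is to ask its ProtectionDomain; a minimal sketch (the WhichJar class name is illustrative — in the real job you would pass org.apache.spark.util.Utils$ via Class.forName):

```java
public class WhichJar {
    // Returns the classpath location a class was loaded from, or a marker
    // string for classes supplied by the bootstrap class loader.
    static String locate(Class<?> cls) {
        java.security.CodeSource src = cls.getProtectionDomain().getCodeSource();
        return src == null ? "<bootstrap classpath>" : src.getLocation().toString();
    }

    public static void main(String[] args) throws Exception {
        // In a Spark job, replace with:
        //   locate(Class.forName("org.apache.spark.util.Utils$"))
        // to see which spark-core jar is actually winning on the classpath.
        System.out.println(locate(WhichJar.class));
    }
}
```

If the printed location is not the spark-core_2.10-1.4.0.jar you expect, a stale or conflicting jar is shadowing it.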

1 Answer

Stack Overflow user

Answered on 2015-11-20 12:57:41

Check how your class is being resolved by Maven (in Eclipse, use Ctrl+Shift+T to open the type and see which jar it comes from). Make sure it is not being resolved from two different jars on the classpath.

If your class is pulled in through a transitive dependency, add the jar you need as a direct dependency with the required version.
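For example, the advice above could be applied by pinning spark-core as a direct dependency in the pom (the version shown matches the question; adjust scope and version to your build):

```xml
<!-- Direct dependency wins Maven's "nearest definition" mediation over
     any transitive spark-core pulled in by other artifacts. -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.10</artifactId>
  <version>1.4.0</version>
</dependency>
```

To see which versions are competing, `mvn dependency:tree -Dverbose -Dincludes=org.apache.spark` lists every path by which Spark artifacts enter the build, marking omitted conflicting versions.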

You can refer to these links for more details:

mockito test gives no such method error when run as junit test but when jars are added manually in run confugurations, it runs well

Exception in thread "main" java.lang.NoSuchMethodError: org.slf4j.impl.StaticLoggerBinder.getSingleton()Lorg/slf4j/impl/StaticLoggerBinder

1 vote
Original content provided by Stack Overflow.
Original link: https://stackoverflow.com/questions/33811286
