Exceptions: scala.Predef$.$scope()Lscala/xml/TopScope$ and not found: type Application

A program developed with IntelliJ IDEA + Scala + Spark had been working fine, but today it reported the errors below.

Problem 1: java.lang.NoSuchMethodError: scala.Predef$.$scope()Lscala/xml/TopScope$;

Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/10/03 22:35:16 INFO SparkContext: Running Spark version 2.1.0
17/10/03 22:35:16 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/10/03 22:35:17 INFO SecurityManager: Changing view acls to: Administrator
17/10/03 22:35:17 INFO SecurityManager: Changing modify acls to: Administrator
17/10/03 22:35:17 INFO SecurityManager: Changing view acls groups to: 
17/10/03 22:35:17 INFO SecurityManager: Changing modify acls groups to: 
17/10/03 22:35:17 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users  with view permissions: Set(Administrator); groups with view permissions: Set(); users  with modify permissions: Set(Administrator); groups with modify permissions: Set()
17/10/03 22:35:18 INFO Utils: Successfully started service 'sparkDriver' on port 63233.
17/10/03 22:35:18 INFO SparkEnv: Registering MapOutputTracker
17/10/03 22:35:18 INFO SparkEnv: Registering BlockManagerMaster
17/10/03 22:35:18 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
17/10/03 22:35:18 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
17/10/03 22:35:18 INFO DiskBlockManager: Created local directory at C:\Users\Administrator\AppData\Local\Temp\blockmgr-7d37f54c-7f7d-4452-bbe1-edd74a1b3cef
17/10/03 22:35:18 INFO MemoryStore: MemoryStore started with capacity 908.1 MB
17/10/03 22:35:18 INFO SparkEnv: Registering OutputCommitCoordinator
Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.$scope()Lscala/xml/TopScope$;
    at org.apache.spark.ui.jobs.AllJobsPage.<init>(AllJobsPage.scala:39)
    at org.apache.spark.ui.jobs.JobsTab.<init>(JobsTab.scala:38)
    at org.apache.spark.ui.SparkUI.initialize(SparkUI.scala:65)
    at org.apache.spark.ui.SparkUI.<init>(SparkUI.scala:82)
    at org.apache.spark.ui.SparkUI$.create(SparkUI.scala:220)
    at org.apache.spark.ui.SparkUI$.createLiveUI(SparkUI.scala:162)
    at org.apache.spark.SparkContext.<init>(SparkContext.scala:452)
    at cn.hadron.JoinDemo$.main(JoinDemo.scala:10)
    at cn.hadron.JoinDemo.main(JoinDemo.scala)
17/10/03 22:35:18 INFO DiskBlockManager: Shutdown hook called
17/10/03 22:35:18 INFO ShutdownHookManager: Shutdown hook called
17/10/03 22:35:18 INFO ShutdownHookManager: Deleting directory C:\Users\Administrator\AppData\Local\Temp\spark-fa8aeada-59ea-402b-98cd-1f0424746877\userFiles-e03aaa25-fd89-45a5-9917-bde095172ac8
17/10/03 22:35:18 INFO ShutdownHookManager: Deleting directory C:\Users\Administrator\AppData\Local\Temp\spark-fa8aeada-59ea-402b-98cd-1f0424746877

Process finished with exit code 1

Solution

Original pom.xml:

    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>2.1.0</version>
    </dependency>

Changed to:

    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.11</artifactId>
      <version>2.1.1</version>
    </dependency>
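
The root cause is a Scala binary-version mismatch: the _2.10 / _2.11 suffix of the Spark artifactId must match the Scala version the project itself is built with. Below is a minimal sketch of keeping the two aligned through a Maven property (the scala.binary.version property and the 2.11.8 scala-library version are illustrative assumptions, not taken from the original project):

    <properties>
      <scala.binary.version>2.11</scala.binary.version>
    </properties>

    <dependencies>
      <!-- Scala standard library: a 2.11.x release, matching the _2.11 suffix below -->
      <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>2.11.8</version>
      </dependency>
      <!-- Spark core built for the same Scala binary version -->
      <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${scala.binary.version}</artifactId>
        <version>2.1.1</version>
      </dependency>
    </dependencies>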

Running again, the problem above is gone, but the following one appears.

Problem 2: not found: type Application

Error:(7, 20) not found: type Application
object App extends Application {

Solution

Reference: http://stackoverflow.com/questions/26176509/why-does-2-11-1-fail-with-error-not-found-type-application. Application has been deprecated since Scala 2.9 and was removed in Scala 2.11 (it still exists in Scala 2.10); use App instead.

The Scala 2.11 branch on GitHub contains only App.scala, while the 2.10 branch has both App.scala and Application.scala (the latter marked with a deprecation warning).

Since Application is gone, simply delete the auto-generated App.scala file, or rewrite it to extend App as sketched below. Running again, everything works normally.
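
If you want to keep the entry point instead of deleting it, the generated object can extend App, which still exists in Scala 2.11. A minimal sketch (the package name follows the cn.hadron package from the stack trace; the body is illustrative):

    package cn.hadron

    // scala.Application was removed in Scala 2.11; scala.App provides the same
    // "no explicit main method" style of entry point.
    object App extends App {
      println("Scala version: " + scala.util.Properties.versionNumberString)
    }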
