I'm trying to run a sample Scala Spark program in IntelliJ. I created a Maven project and added the Scala nature to it. I can run a Scala hello-world program, but when I try to run Spark with Scala it throws the following exception:
Exception in thread "main" java.lang.VerifyError: class scala.collection.mutable.WrappedArray overrides final method toBuffer.()Lscala/collection/mutable/Buffer;
at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:756)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:468)
at java.net.URLClassLoader.access$100(URLClassLoader.java:74)
at java.net.URLClassLoader$1.run(URLClassLoader.java:369)
at java.net.URLClassLoader$1.run(URLClassLoader.java:363)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:362)
at java.lang.ClassLoader.loadClass(ClassLoader.java:418)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:355)
at java.lang.ClassLoader.loadClass(ClassLoader.java:351)
at org.apache.spark.SparkConf.loadFromSystemProperties(SparkConf.scala:65)
at org.apache.spark.SparkConf.<init>(SparkConf.scala:60)
at org.apache.spark.SparkConf.<init>(SparkConf.scala:55)
at com.dnb.dsl.test.SparkDemo$.main(SparkDemo.scala:7)
Here is the program:
import org.apache.spark.{SparkConf, SparkContext}

object SparkDemo {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("SparkDemo").setMaster("local")
    val sc = new SparkContext(conf)
    val input = sc.parallelize(Array(1, 2, 3, 4, 5, 6, 7, 8, 9, 10))
    input.foreach(println)
  }
}

pom.xml:
<properties>
<spark.version>2.0.1</spark.version>
<scala.version>2.11</scala.version>
</properties>
<dependencies>
<dependency> <!-- Spark dependency -->
<groupId>org.apache.spark</groupId>
<artifactId>spark-core_${scala.version}</artifactId>
<version>${spark.version}</version>
</dependency>
</dependencies>
<build>
<pluginManagement>
<plugins>
<plugin>
<groupId>org.apache.maven.plugins</groupId>
<artifactId>maven-compiler-plugin</artifactId>
<configuration>
<encoding>UTF-8</encoding>
<source>1.8</source>
<target>1.8</target>
<compilerArgument>-Werror</compilerArgument>
</configuration>
</plugin>
</plugins>
</pluginManagement>
</build>

I've attached the Scala SDK version configuration from IntelliJ. Java version used: 1.8.

Posted on 2020-04-18 22:01:31
I would try a different approach: as @mazaneicha suggested, lower your Scala version. For Scala I would use SBT rather than Maven; IntelliJ integrates fully with SBT and Scala, and SBT is very easy to use.
The build.sbt for your example would be:
name := "my-project"
version := "0.1"
scalaVersion := "2.11.10"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0"
libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "2.2.0"
libraryDependencies += "org.apache.spark" %% "spark-hive" % "2.2.0"要查找和下载spark项目的库,可以使用Maven存储库:
https://stackoverflow.com/questions/61283331
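A quick note on the build.sbt above: the %% operator appends the Scala binary suffix taken from scalaVersion to the artifact name, so with scalaVersion := "2.11.10" the two styles of dependency declaration below resolve to the same _2.11 artifact (the spark-sql lines are just an illustration of the syntax):

// Equivalent when scalaVersion := "2.11.10":
libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "2.2.0"  // explicit Scala suffix
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0"      // %% appends _2.11 automatically

Either form is fine as long as the suffix matches the project's Scala version; putting a 2.12+ Scala library on the classpath of _2.11 Spark artifacts is exactly the kind of mismatch that produces the WrappedArray VerifyError shown above.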