I am trying to create a SparkContext in an IntelliJ 14 Scala Worksheet.
Here are my dependencies:
name := "LearnSpark"
version := "1.0"
scalaVersion := "2.11.7"
// for working with Spark API
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.0"
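Since scalaVersion is 2.11.7, the %% operator resolves this to the spark-core_2.11 artifact. For clarity, a sketch of the same dependency with the Scala binary version written out by hand (equivalent to the %% form above):

// Equivalent to the %% form: the Scala binary version spelled out explicitly
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.4.0"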
Below is the code I run in the worksheet:

import org.apache.spark.{SparkContext, SparkConf}
val conf = new SparkConf().setMaster("local").setAppName("spark-play")
val sc = new SparkContext(conf)
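As an aside: the worksheet re-evaluates the whole sheet on every run, and Spark 1.x by default allows only one SparkContext per JVM, so if the worksheet process is reused it may help to stop the context at the end of the sheet. A minimal sketch:

sc.stop() // release the context so a later evaluation can create a fresh one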
And this is the error:

15/08/24 14:01:59 ERROR SparkContext: Error initializing SparkContext.
java.lang.ClassNotFoundException: rg.apache.spark.rpc.akka.AkkaRpcEnvFactory
    at java.net.URLClassLoader$1.run(URLClassLoader.java:372)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:360)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
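The ClassNotFoundException suggests that spark-core is not visible to the classloader the worksheet uses (note the class name in the message has even lost the leading "o" of "org"). A hypothetical sanity check, run on the same classpath, to confirm whether the class can be loaded at all:

object ClasspathCheck {
  def main(args: Array[String]): Unit = {
    // Try to load the class the worksheet fails to find; if this throws,
    // spark-core is missing from the runtime classpath entirely.
    val cls = Class.forName("org.apache.spark.rpc.akka.AkkaRpcEnvFactory")
    println(s"Loaded ${cls.getName} from ${cls.getProtectionDomain.getCodeSource}")
  }
}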
When I run Spark as a standalone application, it works fine. For example:

import org.apache.spark.{SparkContext, SparkConf}
// stops verbose logs
import org.apache.log4j.{Level, Logger}
object TestMain {
  Logger.getLogger("org").setLevel(Level.OFF)

  def main(args: Array[String]): Unit = {
    // Create SparkContext
    val conf = new SparkConf()
      .setMaster("local[2]")
      .setAppName("mySparkApp")
      .set("spark.executor.memory", "1g")
      .set("spark.rdd.compress", "true")
      .set("spark.storage.memoryFraction", "1")
    val sc = new SparkContext(conf)

    val data = sc.parallelize(1 to 10000000).collect().filter(_ < 1000)
    data.foreach(println)
  }
}
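One way to narrow this down (a sketch, assuming the build.sbt above) would be to evaluate the same three worksheet lines in sbt console, which runs on the project classpath directly; if they work there but fail in the worksheet, the problem is specific to the worksheet's classloader:

import org.apache.spark.{SparkContext, SparkConf}

val conf = new SparkConf().setMaster("local").setAppName("spark-play")
val sc = new SparkContext(conf)
println(sc.parallelize(1 to 10).sum()) // quick smoke test: prints 55.0
sc.stop() // stop the context before leaving the REPL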
Can anyone provide some guidance on where I should look to resolve this exception?

Thanks.
https://stackoverflow.com/questions/32189206