My small Apache Spark project in Scala was running fine until I added MLlib.
My sbt build file looks like the following, but I am getting compilation errors. Can't I build Apache Spark MLlib with Scala 2.11.x? Any pointers would be helpful.
[error] Modules were resolved with conflicting cross-version suffixes in {file::
[error] org.apache.spark:spark-launcher _2.11, _2.10
[error] org.apache.spark:spark-sketch _2.11, _2.10
[error] org.json4s:json4s-ast _2.11, _2.10
[error] org.apache.spark:spark-catalyst _2.11, _2.10
[error] org.apache.spark:spark-network-shuffle _2.11, _2.10
[error] org.scalatest:scalatest _2.11, _2.10
[error] com.twitter:chill _2.11, _2.10
[error] org.apache.spark:spark-sql _2.11, _2.10
[error] org.json4s:json4s-jackson _2.11, _2.10
[error] com.fasterxml.jackson.module:jackson-module-scala _2.11, _2.10
[error] org.json4s:json4s-core _2.11, _2.10
[error] org.apache.spark:spark-unsafe _2.11, _2.10
[error] org.apache.spark:spark-tags _2.11, _2.10
[error] org.apache.spark:spark-core _2.11, _2.10
[error] org.apache.spark:spark-network-common _2.11, _2.10
[trace] Stack trace suppressed: run last *:update for the full output.
[error] (*:update) Conflicting cross-version suffixes in: org.apache.spark:spark-launcher, org.apache.spark:spark-sketch, org.json4s:json4s-ast, org.apache.spark:spark-catalyst, org.apache.spark:spark-network-shuffle, org.scalatest:scalatest, com.twitter:chill, org.apache.spark:spark-sql, org.json4s:json4s-jackson, com.fasterxml.jackson.module:jackson-module-scala, org.json4s:json4s-core, org.apache.spark:spark-unsafe, org.apache.spark:spark-tags, org.apache.spark:spark-core, org.apache.spark:spark-network-common
[error] Total time: 18 s, completed 10-Mar-2017 20:41:51
version := "1.0"
scalaVersion := "2.11.8"
libraryDependencies += "org.scalactic" %% "scalactic" % "3.0.1"
libraryDependencies += "org.scalatest" %% "scalatest" % "3.0.1" % "test"
//libraryDependencies += "com.typesafe.scala-logging" %% "scala-logging" % "3.5.0"
//libraryDependencies += "ch.qos.logback" % "logback-classic" % "1.1.7"
libraryDependencies += "org.scala-lang.modules" %% "scala-xml" % "1.0.6"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.1.0"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.1.0"
// https://mvnrepository.com/artifact/org.apache.spark/spark-mllib_2.10
libraryDependencies += "org.apache.spark" % "spark-mllib_2.10" % "2.1.0"
Posted on 2017-03-11 14:28:00
You can absolutely build Apache Spark MLlib with Scala 2.11.x. To do so, you have to change the Spark MLlib library dependency from:
libraryDependencies += "org.apache.spark" % "spark-mllib_2.10" % "2.1.0"
to
libraryDependencies += "org.apache.spark" % "spark-mllib_2.11" % "2.1.0"
or
libraryDependencies += "org.apache.spark" %% "spark-mllib" % "2.1.0"
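For context, the `%%` operator tells sbt to append the project's Scala binary version suffix to the artifact name automatically, which is why the last form is usually preferred: it keeps every dependency on the same suffix when `scalaVersion` changes. A minimal sketch of the equivalence (using the coordinates from the build file above):

```scala
// build.sbt sketch. With scalaVersion := "2.11.8", sbt's %% operator
// appends the Scala binary version (_2.11) to the artifact name, so:
libraryDependencies += "org.apache.spark" %% "spark-mllib" % "2.1.0"
// resolves to the same artifact as the explicit, hard-coded form:
//   "org.apache.spark" % "spark-mllib_2.11" % "2.1.0"
// The original error occurred because spark-mllib_2.10 dragged in _2.10
// transitive dependencies alongside the _2.11 ones from spark-core/spark-sql.
```

If the Scala version is later bumped (e.g. to 2.12), the `%%` form picks up the new suffix automatically, whereas a hard-coded `_2.11` suffix would reintroduce the same cross-version conflict.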
https://stackoverflow.com/questions/42729182