I am trying to compile a standalone application using Spark and Scala. I don't know why I am getting this error:
topicModel.scala:2: ';' expected but 'import' found.
[error] import org.apache.spark.mllib.clustering.LDA
[error] ^
[error] one error found
[error] (compile:compileIncremental) Compilation failed

This is the build.sbt code:
name := "topicModel"
version := "1.0"
scalaVersion := "2.11.6"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.3.1"
libraryDependencies += "org.apache.spark" %% "spark-graphx" % "1.3.1"
libraryDependencies += "org.apache.spark" %% "spark-mllib" % "1.3.1"这些是导入:
import scala.collection.mutable
import org.apache.spark.mllib.clustering.LDA
import org.apache.spark.mllib.linalg.{Vector, Vectors}
import org.apache.spark.rdd.RDD
object Simple {
  def main(args: Array[String]) {

Posted on 2015-05-25 14:45:55
Could it be that your file has old Macintosh line endings (\r only)? With a lone \r as the line terminator, the Scala compiler sees the whole file as a single line, so it expects a ';' between the statements.
For more details, see Why do I need semicolons after these imports?
https://stackoverflow.com/questions/30422750
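
If you want to confirm that diagnosis before changing any editor settings, the minimal sketch below (not from the original post) scans a source file for bare \r characters and, if it finds any, rewrites the file with Unix \n line endings. The file name topicModel.scala is only an assumption taken from the error message; adjust the path to your project.

import java.nio.charset.StandardCharsets
import java.nio.file.{Files, Paths}

object FixLineEndings {
  def main(args: Array[String]): Unit = {
    // Assumed path to the offending source file; change it to match your project layout.
    val path = Paths.get("topicModel.scala")
    val text = new String(Files.readAllBytes(path), StandardCharsets.UTF_8)

    // Count carriage returns that are not followed by a line feed:
    // old Macintosh files use a lone \r as the line terminator.
    val bareCRs = (0 until text.length).count { i =>
      text(i) == '\r' && (i + 1 >= text.length || text(i + 1) != '\n')
    }

    if (bareCRs > 0) {
      println(s"Found $bareCRs old-Mac line endings; rewriting the file with \\n.")
      // Normalize \r\n first, then any remaining lone \r.
      val fixed = text.replace("\r\n", "\n").replace('\r', '\n')
      Files.write(path, fixed.getBytes(StandardCharsets.UTF_8))
    } else {
      println("No bare \\r found; the ';' expected error probably has another cause.")
    }
  }
}

Alternatively, most editors can save the file with Unix line endings directly, and the dos2unix package ships a mac2unix tool that performs the same conversion in place.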