I am trying to integrate Spark Streaming and Kafka. I wrote my source code in the IntelliJ IDEA IDE, and the compiler compiles the code without any errors, but when I try to build the jar file I get an error message:
Error:scalac: bad symbolic reference. A signature in KafkaUtils.class refers to term kafka
in package <root> which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling KafkaUtils.class.
I researched this on Google, and many people say it is caused by a version mismatch between the Scala version and the spark-streaming-kafka jar file. But I checked the versions and they are the same.
Does anyone know why this happens?
Here are more details. Scala version: 2.10. Spark Streaming Kafka jar versions: spark_streaming_kafka_2.10-1.20.jar, spark_streaming_2.10-1.20.jar.
My source code:
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka.KafkaUtils

object Kafka {
  def main(args: Array[String]) {
    val master = "local[*]"
    val zkQuorum = "localhost:2181"
    val group = ""
    val topics = "test"
    val numThreads = 1

    val conf = new SparkConf().setAppName("Kafka").setMaster(master)
    val ssc = new StreamingContext(conf, Seconds(2))

    // Map each topic name to the number of consumer threads reading it
    val topicMap = topics.split(",").map((_, numThreads)).toMap

    // createStream yields (key, message) pairs; keep only the message payload
    val lines = KafkaUtils.createStream(ssc, zkQuorum, group, topicMap).map(_._2)
    val words = lines.flatMap(_.split(" "))
    words.print()

    ssc.start()
    ssc.awaitTermination()
  }
}
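The "bad symbolic reference … refers to term kafka" error typically means the kafka client classes that KafkaUtils.class was compiled against are not on the build classpath at all, rather than being the wrong version. One way to rule this out is to let a build tool resolve the transitive kafka dependency instead of adding jars by hand. Below is a minimal build.sbt sketch; it assumes sbt and Spark 1.2.0 built for Scala 2.10, and the version numbers are illustrative, not taken from the original post:

```scala
// build.sbt -- a minimal sketch, assuming sbt and Spark 1.2.0 for Scala 2.10.
// Adjust the version numbers to match the Spark distribution you run against.
name := "kafka-streaming-demo"

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  // "provided" because spark-core and spark-streaming ship with the cluster
  "org.apache.spark" %% "spark-core"            % "1.2.0" % "provided",
  "org.apache.spark" %% "spark-streaming"       % "1.2.0" % "provided",
  // spark-streaming-kafka pulls in the kafka client jar transitively;
  // it must be packaged into the application jar (e.g. via sbt-assembly)
  // so it is on the classpath both at compile time and at runtime
  "org.apache.spark" %% "spark-streaming-kafka" % "1.2.0"
)
```

The `%%` operator appends the Scala binary version (here `_2.10`) to each artifact name, which keeps the Scala version and the spark-streaming-kafka jar consistent automatically.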
Posted on 2018-07-30 16:54:38
I am facing the same problem. Does anyone know how to solve it?
https://stackoverflow.com/questions/-100004980