When I call RDD.mapValues(...).reduceByKey(...), my code does not compile. But when I reverse the order — RDD.reduceByKey((x, _) ⇒ x).mapValues(...) — it compiles fine. The compiler reports:

Test.scala:7: error: value reduceByKey is not a member of org.apache.spark.rdd.RDD[(Long, E)]
possible cause: maybe a semicolon is missing before `value reduceByKey'?
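One common cause of this error: `reduceByKey` is not defined on `RDD` itself but added by the implicit conversion `rddToPairRDDFunctions`, which requires a `ClassTag` for the value type. If `E` is a method-level type parameter (or `mapValues` widens the values to a supertype `E`), no `ClassTag[E]` is in scope, so the conversion does not apply and the compiler reports "not a member". A minimal sketch, assuming `E` is a type parameter of your own method (`collapse` is a hypothetical name):

```scala
import scala.reflect.ClassTag
import org.apache.spark.rdd.RDD

// The `E: ClassTag` context bound supplies the ClassTag[E] that
// rddToPairRDDFunctions needs; without it, reduceByKey on
// RDD[(Long, E)] fails to compile with "not a member".
def collapse[E: ClassTag](rdd: RDD[(Long, E)]): RDD[(Long, E)] =
  rdd.mapValues(identity).reduceByKey((x, _) => x)
```

This also explains why the reversed order compiles: `reduceByKey` is invoked on the original RDD, whose concrete value type already has a `ClassTag`, before `mapValues` widens it.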
I am trying to run an ETL job on a DStream, but I get the following error. Could you help me fix it? Thanks.

KafkaCardCount.scala:56:28: value reduceByKey is not a member of org.apache.spark.streaming.dstream.DStream

import org.apache.spark.SparkConf
import org.apache.spark.streaming.kafka010.Con
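A likely cause here: `reduceByKey` only exists on a DStream of key/value pairs (via `PairDStreamFunctions`), while a Kafka direct stream yields `DStream[ConsumerRecord[K, V]]`. Mapping each record to a tuple first makes the pair operations available. A minimal sketch, assuming a hypothetical broker address, group id, and topic name:

```scala
import org.apache.kafka.clients.consumer.ConsumerRecord
import org.apache.kafka.common.serialization.StringDeserializer
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kafka010._

val kafkaParams = Map[String, Object](
  "bootstrap.servers"  -> "localhost:9092",        // hypothetical broker
  "key.deserializer"   -> classOf[StringDeserializer],
  "value.deserializer" -> classOf[StringDeserializer],
  "group.id"           -> "card-count"             // hypothetical group id
)

val conf = new SparkConf().setAppName("KafkaCardCount")
val ssc  = new StreamingContext(conf, Seconds(5))

val stream = KafkaUtils.createDirectStream[String, String](
  ssc,
  LocationStrategies.PreferConsistent,
  ConsumerStrategies.Subscribe[String, String](Seq("cards"), kafkaParams)
)

val counts = stream
  .map(record => (record.value, 1)) // DStream[(String, Int)] — now a pair stream
  .reduceByKey(_ + _)               // compiles: PairDStreamFunctions applies
```

If line 56 in KafkaCardCount.scala calls `reduceByKey` directly on the raw stream (or on a `DStream` of a non-tuple type), inserting a `map` to pairs as above should resolve the error.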
Edit:

[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler - Missing parents: List()
[dag-scheduler-event-loop] INFO org.apache.spark.scheduler.DAGScheduler -