"); System.out.println(total);我得到的例外如下(SparkContext.scala:2131)在org.apache.spark.rdd.RDD$$anonfun$reduce$1.apply(RDD.scala:1029) at org.apache.spark.rdd.RDDOper
我想在文件中打印内容,下面的代码是我如何做到这一点的。rdd.map( x => field + x )}这两行注释掉的代码行运行良好,但是,这条mappedRDD.reduce( (a, b) => println(a) )build file:/home/cliu/Documents/github/Apache-Spark/)
[info] Compiling 1 Scala
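One likely problem, sketched under the assumption that `mappedRDD` is an `RDD[String]`: `reduce` expects a function that combines two elements into one value of the *same* type, but `println` returns `Unit`, so `(a, b) => println(a)` is not a valid reduction. To print every element, `foreach` (or `collect().foreach(println)` on a real RDD, to print on the driver) is the usual pattern. A minimal plain-Scala sketch, using a `List` to stand in for the RDD (the names `field` and `data` are placeholders, not from the original code):

```scala
object PrintSketch {
  def main(args: Array[String]): Unit = {
    val field  = "line: "             // placeholder for the question's `field`
    val data   = List("a", "b", "c")  // stand-in for the RDD's contents
    val mapped = data.map(x => field + x)

    // reduce must combine two elements into one of the same type:
    // (String, String) => String, so concatenation works.
    val total = mapped.reduce((a, b) => a + b)
    println(total)

    // To print every element, use foreach instead of reduce.
    // With a real RDD this would be: mappedRDD.collect().foreach(println)
    mapped.foreach(println)
  }
}
```

With a real Spark RDD, calling `println` inside a distributed operation prints on the executors, not the driver, which is another reason `collect().foreach(println)` is the common idiom for small results.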