Exception when reading a multi-line JSON file in Spark 2.0:

val data = spark.read.json("C:\\user\\…")

The stack trace points at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala) and org.apache.spark.input.StreamFileInputFormat.set…
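Spark's JSON reader expects one JSON document per line (JSON Lines format); when a single record spans multiple lines, the whole file typically collapses into a single `_corrupt_record` column. A minimal sketch of how this is usually handled — note that the `multiLine` option only exists from Spark 2.2 onward (in Spark 2.0 each record must sit on one physical line, or be read via `SparkContext.wholeTextFiles` and parsed manually); the file name here is hypothetical:

```scala
import org.apache.spark.sql.SparkSession

object ReadMultiLineJson {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("read-multiline-json")
      .master("local[*]") // assumption: local run, for illustration only
      .getOrCreate()

    // Spark 2.2+: let the reader parse records that span multiple lines.
    // Without this option, a pretty-printed JSON file yields _corrupt_record.
    val data = spark.read
      .option("multiLine", true)
      .json("C:\\user\\accounts.json") // hypothetical path

    // If the schema shows only _corrupt_record, parsing still failed.
    data.printSchema()

    spark.stop()
  }
}
```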
accountBal.createOrReplaceTempView("accntBal")
" SELECT CTC_ID, ACCNT_BAL FROM accntBal WHERE PAID_THRU_DT <= CURRENT_DATE AND PAID_THRU_DT > '01/01/2000' AND PAID_THRU_DT IS NOT NULL "
org.apache.spark.sql.AnalysisException
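An `AnalysisException` at this point usually means a referenced column cannot be resolved — consistent with the JSON having parsed into a lone `_corrupt_record` column, so that `CTC_ID`, `ACCNT_BAL`, and `PAID_THRU_DT` do not exist in the view. A sketch of registering the view and running the query, assuming the DataFrame parsed correctly and those columns exist (column names are taken from the question); the date literal is rewritten to ISO format, since comparing a date column against the string `'01/01/2000'` may not order the way one expects:

```scala
// Assumes `spark` is an active SparkSession and `accountBal` is the
// DataFrame loaded from the JSON file, with the columns named in the question.
accountBal.createOrReplaceTempView("accntBal")

val result = spark.sql(
  """SELECT CTC_ID, ACCNT_BAL
    |FROM accntBal
    |WHERE PAID_THRU_DT <= CURRENT_DATE
    |  AND PAID_THRU_DT > '2000-01-01'  -- ISO format compares reliably with dates
    |  AND PAID_THRU_DT IS NOT NULL
    |""".stripMargin)

result.show()
```

If the exception complains about `_corrupt_record` instead, fix the read step first (see the multi-line JSON note above the read call); the SQL itself is then straightforward.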