I am trying to install pyspark with pip install, as shown below, but I get the following error:

(python_virenv)edamame$ pip install pyspark
Could not find a version that satisfies the requirement pyspark (from versions: )

Does anyone know what the problem is?
Writing an RDD[String] to S3 via a DataFrame on EMR Spark. The save mode is SaveMode.Overwrite, and s3n://my-bucket/some/new/path does not exist yet. The stack trace includes:

at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Ta
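
For reference, a minimal sketch of the kind of job the question seems to describe, assuming Spark 2.x on EMR; the application name, sample data, and column name are placeholders, and only the bucket path is taken from the question:

    import org.apache.spark.sql.{SaveMode, SparkSession}

    object WriteRddToS3 {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder().appName("write-rdd-to-s3").getOrCreate()
        import spark.implicits._

        // Stand-in for the RDD[String] mentioned in the question.
        val rdd = spark.sparkContext.parallelize(Seq("a", "b", "c"))

        // Convert to a single-column DataFrame and write it out with
        // SaveMode.Overwrite; the target prefix does not need to exist first.
        val df = rdd.toDF("value")
        df.write
          .mode(SaveMode.Overwrite)
          .text("s3n://my-bucket/some/new/path")

        spark.stop()
      }
    }

Whether this runs as-is depends on the cluster having the s3n:// filesystem configured; on newer EMR releases s3:// (EMRFS) or s3a:// is more common, but the scheme above is kept as in the question.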