> library('BBmisc')
> library('sparklyr')
> sc <- spark_connect(master = 'local')
Error in start_shell(master = master, spark_home = spark_home, spark_version = version, :
Failed to find 'spark-submit2.cmd' under 'C:\Users\Owner\AppData\Local\spark\spark-3.0.0-bin-hadoop2.7', please verify - SPARK_HOME.
> spark_home_dir()
[1] "C:\\Users\\Owner\\AppData\\Local/spark/spark-3.0.0-bin-hadoop2.7"
> spark_installed_versions()
  spark hadoop                                                              dir
1 3.0.0    2.7 C:\\Users\\Owner\\AppData\\Local/spark/spark-3.0.0-bin-hadoop2.7
> spark_home_set()
Setting SPARK_HOME environment variable to C:\Users\Owner\AppData\Local/spark/spark-3.0.0-bin-hadoop2.7
> sc <- spark_connect(master = 'local')
Error in start_shell(master = master, spark_home = spark_home, spark_version = version, :
Failed to find 'spark-submit2.cmd' under 'C:\Users\Owner\AppData\Local\spark\spark-3.0.0-bin-hadoop2.7', please verify - SPARK_HOME.
Source: https://github.com/englianhu/binary.com-interview-question/issues/1#issue-733943885

Could anyone tell me how to fix "Failed to find 'spark-submit2.cmd' under 'C:\Users\Owner\AppData\Local\spark\spark-3.0.0-bin-hadoop2.7', please verify - SPARK_HOME."?
Reference: Need help getting started with Spark and sparklyr
Posted on 2020-11-06 21:13:41
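Before the fix below, a quick diagnostic (not from the original thread, just a minimal sketch) is to check whether the launcher script sparklyr is asking for actually exists under the directory it resolved as SPARK_HOME:

library(sparklyr)
home <- spark_home_dir()                                   # directory sparklyr resolved
normalizePath(home, winslash = "/", mustWork = FALSE)      # note the mixed \ and / separators in the output above
file.exists(file.path(home, "bin", "spark-submit2.cmd"))   # FALSE is consistent with the error above
list.files(file.path(home, "bin"))                         # what the bin/ folder actually contains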
Solved!
Steps:
1. Download/extract the newer Spark build (spark-3.0.1-bin-hadoop3.2) into 'C:/Users/scibr/AppData/Local/spark/spark-3.0.1-bin-hadoop3.2'.
2. spark_home_set('C:/Users/scibr/AppData/Local/spark/spark-3.0.1-bin-hadoop3.2')
(A sparklyr sketch of these steps follows the source links below.)
Source: https://github.com/englianhu/binary.com-interview-question/issues/1#event-3968919946
https://stackoverflow.com/questions/64632416
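A minimal sparklyr sketch of the steps above, assuming the newer 3.0.1/Hadoop 3.2 build is fetched through spark_install() rather than a manual download; the version numbers and the 'scibr' path come from the answer, everything else is an assumption:

library(sparklyr)
# Assumption: install the newer build via sparklyr instead of extracting a manual download.
spark_install(version = "3.0.1", hadoop_version = "3.2")
# Point SPARK_HOME at the 3.0.1 build (path taken from the answer above).
spark_home_set("C:/Users/scibr/AppData/Local/spark/spark-3.0.1-bin-hadoop3.2")
# Reconnect; spark_connect() should now find bin/spark-submit2.cmd under SPARK_HOME.
sc <- spark_connect(master = "local")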