On the machine where I am installing Apache Livy (Ubuntu 16.04):
(a) Is it possible to run it against Spark in standalone mode?
I am considering Spark 1.6.3, pre-built for Hadoop 2.6, which can be downloaded from https://spark.apache.org/downloads.html.
(b) If so, how do I configure it?
(c) What should HADOOP_CONF_DIR be for Spark standalone? The page https://github.com/cloudera/livy mentions the following environment variables:
export SPARK_HOME=/usr/lib/spark
export HADOOP_CONF_DIR=/etc/hadoop/conf
Except for the last task, I have built Livy successfully; that task has not completed yet because the Spark installation is still pending:
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] livy-api ........................................... SUCCESS [ 9.984 s]
[INFO] livy-client-common ................................. SUCCESS [ 6.681 s]
[INFO] livy-test-lib ...................................... SUCCESS [ 0.647 s]
[INFO] livy-rsc ........................................... SUCCESS [01:08 min]
[INFO] livy-core_2.10 ..................................... SUCCESS [ 7.225 s]
[INFO] livy-repl_2.10 ..................................... SUCCESS [02:42 min]
[INFO] livy-core_2.11 ..................................... SUCCESS [ 56.400 s]
[INFO] livy-repl_2.11 ..................................... SUCCESS [03:06 min]
[INFO] livy-server ........................................ SUCCESS [02:12 min]
[INFO] livy-assembly ...................................... SUCCESS [ 15.959 s]
[INFO] livy-client-http ................................... SUCCESS [ 25.377 s]
[INFO] livy-scala-api_2.10 ................................ SUCCESS [ 40.336 s]
[INFO] livy-scala-api_2.11 ................................ SUCCESS [ 40.991 s]
[INFO] minicluster-dependencies_2.10 ...................... SUCCESS [ 24.400 s]
[INFO] minicluster-dependencies_2.11 ...................... SUCCESS [ 5.489 s]
[INFO] livy-integration-test .............................. SUCCESS [ 37.473 s]
[INFO] livy-coverage-report ............................... SUCCESS [ 3.062 s]
[INFO] livy-examples ...................................... SUCCESS [ 6.841 s]
[INFO] livy-python-api .................................... FAILURE [ 8.053 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 13:59 min
[INFO] Finished at: 2016-11-29T13:14:10-08:00
[INFO] Final Memory: 76M/2758M
[INFO] ------------------------------------------------------------------------
Thanks.
Posted on 2020-05-05 01:28:46
For future reference, here are the detailed steps you need to follow (on Ubuntu):
Spark
export JAVA_HOME="/lib/jvm/jdk1.8.0_251"
export PATH=$PATH:$JAVA_HOME/bin
export SPARK_HOME=/opt/hadoop/spark-2.4.5-bin-hadoop2.7
export PATH=$PATH:$SPARK_HOME/bin:$SPARK_HOME/sbin
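Because the goal here is standalone mode (no YARN), the Spark master and at least one worker have to be running before Livy can submit anything to them. A minimal sketch using the scripts shipped in $SPARK_HOME/sbin; <your-host> is a placeholder for the machine's hostname:
# start the standalone master; its URL defaults to spark://<your-host>:7077
$SPARK_HOME/sbin/start-master.sh
# start a worker and register it with that master
$SPARK_HOME/sbin/start-slave.sh spark://<your-host>:7077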
export LIVY_HOME=/opt/hadoop/apache-livy-0.7.0-incubating-bin
export PATH=$PATH:$LIVY_HOME/bin
export HADOOP_CONF_DIR=/etc/hadoop/conf  <- (optional)
Inside $LIVY_HOME we need to create a folder named "logs" and give it write permission, otherwise we get an error when starting livy-server. Since $LIVY_HOME/bin is already on the PATH, just start it.
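A minimal sketch of this step, assuming the default conf/ layout of the Livy distribution and a standalone master already running at spark://<your-host>:7077 (placeholder); livy.spark.master and livy.spark.deploy-mode are keys from conf/livy.conf.template:
# create the logs folder Livy writes to
mkdir -p $LIVY_HOME/logs
chmod 775 $LIVY_HOME/logs
# point Livy at the standalone master instead of YARN
cp $LIVY_HOME/conf/livy.conf.template $LIVY_HOME/conf/livy.conf
echo "livy.spark.master = spark://<your-host>:7077" >> $LIVY_HOME/conf/livy.conf
echo "livy.spark.deploy-mode = client" >> $LIVY_HOME/conf/livy.conf
# start the server; it listens on port 8998 by default
livy-server start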
Access localhost:8998 in a browser to check that the Livy server is up.
There are plenty of REST endpoints: https://livy.incubator.apache.org/docs/latest/rest-api.html
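For instance, an interactive session can be created and driven entirely over that API. A sketch based on the linked docs; the session id 0 and the one-line Scala statement are only examples:
# create an interactive Spark (Scala) session
curl -s -X POST -H "Content-Type: application/json" -d '{"kind": "spark"}' http://localhost:8998/sessions
# once the session (id 0 here) is idle, run a statement in it
curl -s -X POST -H "Content-Type: application/json" -d '{"code": "sc.parallelize(1 to 10).sum()"}' http://localhost:8998/sessions/0/statements
# poll for the result
curl -s http://localhost:8998/sessions/0/statements/0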
If you are interested in running a JAR, use a batch instead of a session.
Create a simple Spark application in which the master of the conf is passed in as an argument, so that it stays dynamic (you can pass any master URL); a sketch of such an application follows the sbt settings below.
scalaVersion := "2.11.12"
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.4.5"
libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.4.5"
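A minimal sketch of such an application, assuming the package and object names match the com.company.Main used in the spark-submit and Livy calls below; the job itself (a trivial count) is only an illustration:
package com.company

import org.apache.spark.sql.SparkSession

object Main {
  def main(args: Array[String]): Unit = {
    // the master URL (e.g. spark://abhishek-desktop:7077) is passed as the first argument
    val master = args(0)

    val spark = SparkSession.builder()
      .appName("scala_demo")
      .master(master)
      .getOrCreate()

    // trivial job just to confirm the cluster is reachable
    val count = spark.sparkContext.parallelize(1 to 1000).count()
    println(s"count = $count")

    spark.stop()
  }
}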
spark-submit --class com.company.Main file:///home/user_name/Desktop/scala_demo.jar spark://abhishek-desktop:7077
For Livy it is:
POST localhost:8998/batches
{
"className": "com.company.Main",
"executorMemory": "20g",
"args": [
"spark://abhishek-desktop:7077"
],
"file": "local:/home/user_name/Desktop/scala_demo.jar"
}
Executing the above request submits the JAR as a batch job.
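The same call as a curl sketch; the batch id 0 in the status checks is only an example, the real id comes back in the response to the POST:
# submit the JAR as a Livy batch
curl -s -X POST -H "Content-Type: application/json" -d '{
  "className": "com.company.Main",
  "executorMemory": "20g",
  "args": ["spark://abhishek-desktop:7077"],
  "file": "local:/home/user_name/Desktop/scala_demo.jar"
}' http://localhost:8998/batches
# check the state and the driver log of the batch
curl -s http://localhost:8998/batches/0
curl -s http://localhost:8998/batches/0/log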
Posted on 2018-04-18 09:50:44
It is probably a missing Python module. Take a look at the log of the failure.
Traceback (most recent call last):
File "setup.py", line 18, in <module>
from setuptools import setup
ImportError: No module named setuptools
In this case you need to install the setuptools module:
pip install setuptools
Collecting setuptools
Downloading https://files.pythonhosted.org/packages/20/d7/04a0b689d3035143e2ff288f4b9ee4bf6ed80585cc121c90bfd85a1a8c2e/setuptools-39.0.1-py2.py3-none-any.whl (569kB)
100% |████████████████████████████████| 573kB 912kB/s
Installing collected packages: setuptools
Successfully installed setuptools-20.7.0
Posted on 2017-05-15 16:24:05
Make sure you have set
export SPARK_HOME=/path/to/spark/home
Then run mvn -DskipTests package.
It should work even without HADOOP_CONF_DIR.
https://stackoverflow.com/questions/40876586