Hadoop+Hive+HBase+Spark Cluster Deployment (Part 3)

Tags: hadoop, hive, hbase, spark

2. Spark

spark-env.sh

export SCALA_HOME=/opt/soft/scala-2.12.6
export JAVA_HOME=/usr/java/jdk1.8.0_162
export HADOOP_HOME=/opt/soft/hadoop-2.8.3
export HADOOP_CONF_DIR=$HADOOP_HOME/etc/hadoop
export SPARK_HOME=/opt/soft/spark-2.3.0-bin-hadoop2.7
export SPARK_MASTER_IP=node
export SPARK_EXECUTOR_MEMORY=4G

slaves

node1
node2
node3
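
The same spark-env.sh and slaves files have to be present on every node. A minimal sketch for pushing the whole Spark directory from node to the workers, assuming passwordless SSH is already set up and /opt/soft exists on node1, node2 and node3:

# run on node; assumes root SSH access to the workers
for host in node1 node2 node3; do
  scp -r /opt/soft/spark-2.3.0-bin-hadoop2.7 root@$host:/opt/soft/
done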

Start / Stop Commands

  • start-all.sh clashes with Hadoop's script of the same name, so start Spark with the absolute path /opt/soft/spark-2.3.0-bin-hadoop2.7/sbin/start-all.sh (a quick verification sketch follows this list)
  • Stop with /opt/soft/spark-2.3.0-bin-hadoop2.7/sbin/stop-all.sh
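
A quick sanity check after starting, as a sketch: in a standalone deployment the master runs a Master process and each worker a Worker process; the ssh loop assumes passwordless root access to the workers.

# on node: expect a "Master" entry in the jps output
jps
# on each worker: expect a "Worker" entry
for host in node1 node2 node3; do ssh root@$host jps; done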

Spark Web UI Ports

  • 8080 (standalone Master web UI; a quick reachability check is sketched below)
  • Spark context Web UI available at http://node:4040
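
Assuming the master is up and the hostname node resolves, a simple reachability check (the 4040 UI exists only while an application such as spark-shell is running):

curl -sI http://node:8080 | head -n 1    # standalone Master web UI
curl -sI http://node:4040 | head -n 1    # application UI, only while a job or shell is active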

spark-shell

[root@node ~]# spark-shell
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://node:4040
Spark context available as 'sc' (master = local[*], app id = local-1525334225269).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.3.0
      /_/
         
Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_162)
Type in expressions to have them evaluated.
Type :help for more information.

scala> 8*8
res0: Int = 64
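
The shell above runs in local mode (master = local[*]). To exercise the standalone cluster itself, a minimal sketch that submits the bundled SparkPi example to the master; the default master port 7077 and the examples jar path are assumptions based on the 2.3.0 bin-hadoop2.7 layout:

/opt/soft/spark-2.3.0-bin-hadoop2.7/bin/spark-submit \
  --master spark://node:7077 \
  --class org.apache.spark.examples.SparkPi \
  /opt/soft/spark-2.3.0-bin-hadoop2.7/examples/jars/spark-examples_2.11-2.3.0.jar 100
# on success the driver log ends with a "Pi is roughly ..." line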

This article was written by bytebye. Unless marked as a reprint or credited to another source, articles on this site are original or translated; please credit the author when reprinting.
