cd /root/app/spark/sbin
./start-all.sh
spark-submit --class org.apache.spark.examples.SparkPi --master spark://aliyun:7077 /root/app/spark/examples/jars/spark-examples_2.11-2.3.3.jar
Job output:
2019-08-05 18:11:05 INFO DAGScheduler:54 - Job 0 finished: reduce at SparkPi.scala:38, took 5.065242 s
Pi is roughly 3.146655733278666
Command breakdown:
- spark-submit: the command that submits an application; it lives under bin/ in the Spark installation directory
- --class org.apache.spark.examples.SparkPi: the application's main class
- --master spark://aliyun:7077: the master URL to run against
- /root/app/spark/examples/jars/spark-examples_2.11-2.3.3.jar: path to the application jar
# spark-shell
2019-08-05 19:31:54 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://aliyun:4040
Spark context available as 'sc' (master = local[*], app id = local-1565004727280).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.3.3
      /_/
Using Scala version 2.11.8 (OpenJDK 64-Bit Server VM, Java 1.8.0_222)
Type in expressions to have them evaluated.
Type :help for more information.
Spark local run mode: starting spark-shell without a --master option runs the driver locally with master = local[*], i.e. one worker thread per CPU core, as the startup lines show:
Spark context available as 'sc' (master = local[*], app id = local-1565004727280).
Spark session available as 'spark'.
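In local mode you can paste code straight into spark-shell. To make the result above less mysterious, here is a sketch in plain Scala (no cluster required) of the Monte Carlo computation that the SparkPi example performs; the object name, seed, and sample count are choices made for illustration only:

```scala
import scala.util.Random

object PiEstimate {
  // Estimate Pi by sampling n random points in the square [-1, 1] x [-1, 1]
  // and counting how many land inside the unit circle. The fraction inside
  // approaches Pi/4, so multiplying by 4 yields the estimate.
  def estimate(n: Int, seed: Long = 42L): Double = {
    val rng = new Random(seed)
    val inside = (1 to n).count { _ =>
      val x = rng.nextDouble() * 2 - 1
      val y = rng.nextDouble() * 2 - 1
      x * x + y * y <= 1
    }
    4.0 * inside / n
  }

  def main(args: Array[String]): Unit =
    println(f"Pi is roughly ${estimate(1000000)}%.6f")
}
```

The real SparkPi distributes the sampling across the cluster with sc.parallelize and combines the partial counts with reduce; the arithmetic is the same.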
Spark artifacts are published at https://mvnrepository.com/artifact/org.apache.spark; Scala sources in a Maven project are compiled with the scala-maven-plugin. A Scala project skeleton can be generated from the scala-archetype-simple Maven archetype:
<dependency>
    <groupId>net.alchim31.maven</groupId>
    <artifactId>scala-archetype-simple</artifactId>
    <version>1.7</version>
    <type>maven-archetype</type>
</dependency>
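For reference, the scala-maven-plugin itself is declared in the build/plugins section of pom.xml rather than as a dependency; a minimal sketch (the version shown is an assumption, check mvnrepository for the latest):

```xml
<plugin>
    <groupId>net.alchim31.maven</groupId>
    <artifactId>scala-maven-plugin</artifactId>
    <version>3.4.6</version>
    <executions>
        <execution>
            <goals>
                <!-- Compile main and test Scala sources -->
                <goal>compile</goal>
                <goal>testCompile</goal>
            </goals>
        </execution>
    </executions>
</plugin>
```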