
Installing Spark

汤高 · Published 2018-01-11

The release I installed is spark-1.6.1-bin-hadoop2.6.tgz, which requires JDK 1.7 or later.

Spark also needs Scala support; the version I installed is scala-2.11.8.tgz.
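Once the REPL is installed (next section), a quick way to confirm that both of these requirements are met is to query the versions from inside it. This is a minimal sketch, not from the original post; scala.util.Properties ships with the Scala standard library:

```scala
// Run at the scala> prompt to verify the toolchain versions.
scala.util.Properties.versionString      // Scala runtime, e.g. "version 2.11.8"
System.getProperty("java.version")       // JVM version; should be 1.7 or later
```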

Installing Scala
----------------

Extract the archive:

```
tg@master:/software$ tar -zxvf scala-2.11.8.tgz
tg@master:/software/scala-2.11.8$ ls
bin  doc  lib  man
```

Add the environment variables:

```
tg@master:/$ sudo gedit /etc/profile
```

Append:

```
export SCALA_HOME=/software/scala-2.11.8
export PATH=$SCALA_HOME/bin:$PATH
```

Reload the profile:

```
tg@master:/$ source /etc/profile
```

Start scala:

```
tg@master:/$ scala
Welcome to Scala 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_80).
Type in expressions for evaluation. Or try :help.

scala> 9*9
res0: Int = 81
```

Installing Spark
----------------

Copy and extract the archive:

```
tg@master:~$ cp ~/Desktop/spark-1.6.1-bin-hadoop2.6.tgz /software/
tg@master:~$ cd /software/
tg@master:/software$ ls
apache-hive-2.0.0-bin         jdk-7u80-linux-x64.tar.gz
apache-hive-2.0.0-bin.tar.gz  scala-2.11.8
hadoop-2.6.4                  scala-2.11.8.tgz
hadoop-2.6.4.tar.gz           spark-1.6.1-bin-hadoop2.6.tgz
hbase-1.2.1                   zookeeper-3.4.8
hbase-1.2.1-bin.tar.gz        zookeeper-3.4.8.tar.gz
jdk1.7.0_80
tg@master:/software$ tar -zxvf spark-1.6.1-bin-hadoop2.6.tgz
```

Add the environment variables:

```
sudo gedit /etc/profile
```

Append:

```
export SPARK_HOME=/software/spark-1.6.1-bin-hadoop2.6
export PATH=$SPARK_HOME/bin:$PATH
```

Reload:

```
source /etc/profile
```

Edit spark-env.sh:

```
tg@master:~$ cd /software/spark-1.6.1-bin-hadoop2.6/conf/
tg@master:/software/spark-1.6.1-bin-hadoop2.6/conf$ ls
docker.properties.template  metrics.properties.template   spark-env.sh.template
fairscheduler.xml.template  slaves.template
log4j.properties.template   spark-defaults.conf.template
tg@master:/software/spark-1.6.1-bin-hadoop2.6/conf$ cp spark-env.sh.template spark-env.sh
tg@master:/software/spark-1.6.1-bin-hadoop2.6/conf$ sudo gedit spark-env.sh
```

Append:

```
export SCALA_HOME=/software/scala-2.11.8
export JAVA_HOME=/software/jdk1.7.0_80
export SPARK_MASTER_IP=192.168.52.140
export SPARK_WORKER_MEMORY=512m
export master=spark://192.168.52.140:7070
```

(Note: the standalone master listens on port 7077 by default, so the master URL on the last line should presumably read spark://192.168.52.140:7077.)

Edit slaves:

```
tg@master:/software/spark-1.6.1-bin-hadoop2.6/conf$ cp slaves.template slaves
tg@master:/software/spark-1.6.1-bin-hadoop2.6/conf$ sudo gedit slaves
```

Its content is simply:

```
master
```

Start the cluster:

```
tg@master:/software/spark-1.6.1-bin-hadoop2.6$ sbin/start-all.sh
starting org.apache.spark.deploy.master.Master, logging to /software/spark-1.6.1-bin-hadoop2.6/logs/spark-tg-org.apache.spark.deploy.master.Master-1-master.out
master: starting org.apache.spark.deploy.worker.Worker, logging to /software/spark-1.6.1-bin-hadoop2.6/logs/spark-tg-org.apache.spark.deploy.worker.Worker-1-master.out
```

Check the processes with jps; Worker and Master are now present:

```
tg@master:/software/hbase-1.2.1/conf$ jps
4400 HRegionServer
3033 DataNode
5794 Jps
4793 Main
3467 ResourceManager
5652 SparkSubmit
5478 Master
3591 NodeManager
3240 SecondaryNameNode
3910 QuorumPeerMain
2911 NameNode
5567 Worker
4246 HMaster
```

Launch spark-shell:

```
tg@master:/software/spark-1.6.1-bin-hadoop2.6$ spark-shell
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's repl log4j profile: org/apache/spark/log4j-defaults-repl.properties
To adjust logging level use sc.setLogLevel("INFO")
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.1
      /_/

Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_80)
Type in expressions to have them evaluated.
Type :help for more information.
Spark context available as sc.
16/05/31 02:17:47 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
16/05/31 02:17:49 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
16/05/31 02:18:02 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
16/05/31 02:18:03 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
16/05/31 02:18:10 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
16/05/31 02:18:11 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
16/05/31 02:18:19 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
16/05/31 02:18:19 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
SQL context available as sqlContext.

scala>
```

Note that the banner reports "Using Scala version 2.10.5": the prebuilt spark-1.6.1-bin-hadoop2.6 package bundles its own Scala 2.10 runtime, separate from the Scala 2.11.8 installed above.

(Screenshot: the Spark web UI.)
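With the shell up and `sc` available, a quick smoke test is a word count over a small local file. This is a minimal sketch, not from the original post; the input path is a hypothetical choice, and any local text file will do:

```scala
// Run inside spark-shell (Spark 1.6.x); sc is the SparkContext the shell provides.
// Hypothetical input: file:///etc/profile, chosen only because it exists on this box.
val lines = sc.textFile("file:///etc/profile")
val counts = lines
  .flatMap(_.split("\\s+"))        // split each line on whitespace
  .filter(_.nonEmpty)              // drop empty tokens
  .map(word => (word, 1))          // pair each word with a count of 1
  .reduceByKey(_ + _)              // sum the counts per word
counts.take(10).foreach(println)   // show the first ten (word, count) pairs
```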

(Screenshot: viewing the job in the Spark web UI.)
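Every action executed in the shell shows up as a job in this view; by default the standalone master UI is served on port 8080 and the running application's UI on port 4040. To generate a job to look at, any action will do (a minimal sketch, not from the original post):

```scala
// Trigger one job: parallelize a range, square each element, and sum the results.
// The completed job then appears in the web UI's job list.
val sumOfSquares = sc.parallelize(1L to 1000L).map(i => i * i).reduce(_ + _)
println(s"sum of squares = $sumOfSquares")
```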
