
HDFS Shell Operations (a Development Focus)

Author: 用户7656790 · Published 2020-09-10

(Image credit: pixabay)

1. Basic Syntax

bin/hadoop fs [command] OR bin/hdfs dfs [command] — dfs is an implementation class of fs, so the two forms are equivalent.

2. Full Command Reference
[hadoop@hadoop103 hadoop-2.7.2]$ bin/hadoop fs
Usage: hadoop fs [generic options]
    [-appendToFile <localsrc> ... <dst>]
    [-cat [-ignoreCrc] <src> ...]
    [-checksum <src> ...]
    [-chgrp [-R] GROUP PATH...]
    [-chmod [-R] <MODE[,MODE]... | OCTALMODE> PATH...]
    [-chown [-R] [OWNER][:[GROUP]] PATH...]
    [-copyFromLocal [-f] [-p] [-l] <localsrc> ... <dst>]
    [-copyToLocal [-p] [-ignoreCrc] [-crc] <src> ... <localdst>]
    [-count [-q] [-h] <path> ...]
    [-cp [-f] [-p | -p[topax]] <src> ... <dst>]
    [-createSnapshot <snapshotDir> [<snapshotName>]]
    [-deleteSnapshot <snapshotDir> <snapshotName>]
    [-df [-h] [<path> ...]]
    [-du [-s] [-h] <path> ...]
    [-expunge]
    [-find <path> ... <expression> ...]
    [-get [-p] [-ignoreCrc] [-crc] <src> ... <localdst>]
    [-getfacl [-R] <path>]
    [-getfattr [-R] {-n name | -d} [-e en] <path>]
    [-getmerge [-nl] <src> <localdst>]
    [-help [cmd ...]]
    [-ls [-d] [-h] [-R] [<path> ...]]
    [-mkdir [-p] <path> ...]
    [-moveFromLocal <localsrc> ... <dst>]
    [-moveToLocal <src> <localdst>]
    [-mv <src> ... <dst>]
    [-put [-f] [-p] [-l] <localsrc> ... <dst>]
    [-renameSnapshot <snapshotDir> <oldName> <newName>]
    [-rm [-f] [-r|-R] [-skipTrash] <src> ...]
    [-rmdir [--ignore-fail-on-non-empty] <dir> ...]
    [-setfacl [-R] [{-b|-k} {-m|-x <acl_spec>} <path>]|[--set <acl_spec> <path>]]
    [-setfattr {-n name [-v value] | -x name} <path>]
    [-setrep [-R] [-w] <rep> <path> ...]
    [-stat [format] <path> ...]
    [-tail [-f] <file>]
    [-test -[defsz] <path>]
    [-text [-ignoreCrc] <src> ...]
    [-touchz <path> ...]
    [-truncate [-w] <length> <path> ...]
    [-usage [cmd ...]]

Generic options supported are
-conf <configuration file>     specify an application configuration file
-D <property=value>            use value for given property
-fs <local|namenode:port>      specify a namenode
-jt <local|resourcemanager:port>    specify a ResourceManager
-files <comma separated list of files>    specify comma separated files to be copied to the map reduce cluster
-libjars <comma separated list of jars>    specify comma separated jar files to include in the classpath.
-archives <comma separated list of archives>    specify comma separated archives to be unarchived on the compute machines.

The general command line syntax is
bin/hadoop command [genericOptions] [commandOptions]

3. Common Commands in Practice

First, start the cluster:

On hadoop102:

[hadoop@hadoop102 ~]$ cd /opt/module/hadoop-2.7.2/
[hadoop@hadoop102 hadoop-2.7.2]$ sbin/start-dfs.sh 


On hadoop103:

[hadoop@hadoop103 ~]$ cd /opt/module/hadoop-2.7.2/
[hadoop@hadoop103 hadoop-2.7.2]$ sbin/start-yarn.sh 


(1) -help: print usage information for a command

[hadoop@hadoop103 hadoop-2.7.2]$ hadoop fs -help rm

(2) -ls: list directory contents

[hadoop@hadoop103 hadoop-2.7.2]$ hadoop fs -ls /

(3) -mkdir: create a directory on HDFS

[hadoop@hadoop103 hadoop-2.7.2]$ hadoop fs -mkdir -p /chongqing/chengkou

(4) -moveFromLocal: cut and paste a file from local to HDFS

[hadoop@hadoop103 hadoop-2.7.2]$ touch bossxiang.txt
[hadoop@hadoop103 hadoop-2.7.2]$ vi bossxiang.txt 
woshibossxiang
[hadoop@hadoop103 hadoop-2.7.2]$ hadoop fs -moveFromLocal ./bossxiang.txt /chongqing/chengkou

(5) -appendToFile: append a local file to the end of an existing HDFS file

[hadoop@hadoop103 hadoop-2.7.2]$ touch yuan.txt
[hadoop@hadoop103 hadoop-2.7.2]$ vi yuan.txt 
wo shi chen yuan
[hadoop@hadoop103 hadoop-2.7.2]$ hadoop fs -appendToFile ./yuan.txt /chongqing/chengkou/bossxiang.txt
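In effect, -appendToFile works like shell append redirection: the local file's contents are added to the end of the file already on HDFS. A purely local sketch of the same idea (plain files standing in for HDFS paths):

```shell
# Local analogue of -appendToFile: append one file's contents to another.
printf 'woshibossxiang\n'   > /tmp/bossxiang.txt
printf 'wo shi chen yuan\n' > /tmp/yuan.txt
cat /tmp/yuan.txt >> /tmp/bossxiang.txt   # append, don't overwrite
cat /tmp/bossxiang.txt
```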

(6) -cat: display a file's contents

[hadoop@hadoop103 hadoop-2.7.2]$ hadoop fs -cat /chongqing/chengkou/bossxiang.txt
(7) -chgrp, -chmod, -chown: same usage as in the Linux file system; change a file's group, permissions, or owner

[hadoop@hadoop103 hadoop-2.7.2]$ hadoop fs -chgrp hadoop /chongqing/chengkou/bossxiang.txt
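hadoop fs -chgrp, -chmod, and -chown accept the same MODE/OWNER syntax as their Linux counterparts, so the octal and symbolic forms can be previewed on a local file first (a local sketch; the same modes apply to HDFS paths):

```shell
# Octal and symbolic modes work as in Linux chmod;
# hadoop fs -chmod accepts the identical MODE syntax for HDFS paths.
touch /tmp/perm_demo.txt
chmod 644 /tmp/perm_demo.txt   # rw-r--r--
chmod u+x /tmp/perm_demo.txt   # add execute for the owner: rwxr--r--
ls -l /tmp/perm_demo.txt
```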


(8) -copyFromLocal: copy a file from the local file system to an HDFS path

[hadoop@hadoop102 hadoop-2.7.2]$ touch xinyue.txt
[hadoop@hadoop102 hadoop-2.7.2]$ vi xinyue.txt 
wo shi xinyue
[hadoop@hadoop102 hadoop-2.7.2]$ hadoop fs -copyFromLocal ./xinyue.txt  /chongqing/chengkou/

(9) -copyToLocal: copy from HDFS to local

[hadoop@hadoop102 hadoop-2.7.2]$ hadoop fs -copyToLocal /chongqing/chengkou/bossxiang.txt  ./

(10) -cp: copy from one HDFS path to another HDFS path

[hadoop@hadoop102 hadoop-2.7.2]$ hadoop fs -cp /chongqing/chengkou/xinyue.txt /chongqing/


(11) -mv: move files within HDFS

[hadoop@hadoop102 hadoop-2.7.2]$ hadoop fs -mv /chongqing/xinyue.txt /


(12) -get: equivalent to copyToLocal; download a file from HDFS to local

[hadoop@hadoop102 hadoop-2.7.2]$ rm xinyue.txt
[hadoop@hadoop102 hadoop-2.7.2]$ hadoop fs -get /xinyue.txt

(13) -getmerge: merge and download multiple files; for example, the HDFS directory /user/atguigu/test contains multiple files: log.1, log.2, log.3, …

[hadoop@hadoop102 hadoop-2.7.2]$ hadoop fs -getmerge /chongqing/chengkou/* ./zaiyiqi.txt
[hadoop@hadoop102 hadoop-2.7.2]$ cat zaiyiqi.txt 
woshibossxiang
wo shi xinyue
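Conceptually, -getmerge reads each matched source file in order and concatenates the contents into one local destination file. A purely local sketch of the same idea (plain files and cat standing in for HDFS reads):

```shell
# Local sketch of what -getmerge does: concatenate several
# source files, in order, into a single destination file.
rm -rf /tmp/getmerge_demo
mkdir -p /tmp/getmerge_demo
printf 'woshibossxiang\n' > /tmp/getmerge_demo/bossxiang.txt
printf 'wo shi xinyue\n'  > /tmp/getmerge_demo/xinyue.txt
cat /tmp/getmerge_demo/*.txt > /tmp/zaiyiqi.txt
cat /tmp/zaiyiqi.txt
```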


(14) -put: equivalent to copyFromLocal

[hadoop@hadoop102 hadoop-2.7.2]$ hadoop fs -put ./LICENSE.txt /chongqing/chengkou


(15) -tail: display the end of a file

[hadoop@hadoop102 hadoop-2.7.2]$ hadoop fs -tail /chongqing/chengkou/LICENSE.txt
mentation and/or other materials provided with the
   distribution.

   THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS
   "AS IS" AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT
   LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR
   A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT
   OWNER OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL,
   SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT
   LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE,
   DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY
   THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT
   (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
   OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.

   You can contact the author at :
   - LZ4 source repository : http://code.google.com/p/lz4/
   - LZ4 public forum : https://groups.google.com/forum/#!forum/lz4c
*/

(16) -rm: delete files or directories

[hadoop@hadoop102 hadoop-2.7.2]$ hadoop fs -rm /chongqing/chengkou/LICENSE.txt
20/08/08 04:21:42 INFO fs.TrashPolicyDefault: Namenode trash configuration: Deletion interval = 0 minutes, Emptier interval = 0 minutes.
Deleted /chongqing/chengkou/LICENSE.txt

(17) -rmdir: delete an empty directory

[hadoop@hadoop102 hadoop-2.7.2]$ hadoop fs -mkdir /test
[hadoop@hadoop102 hadoop-2.7.2]$ hadoop fs -rmdir /test
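Like the Linux rmdir, hadoop fs -rmdir refuses a non-empty directory (unless --ignore-fail-on-non-empty is passed). The local analogue behaves the same way:

```shell
# rmdir succeeds only on empty directories (same rule as hadoop fs -rmdir).
rm -rf /tmp/rmdir_empty /tmp/rmdir_full
mkdir -p /tmp/rmdir_empty /tmp/rmdir_full
touch /tmp/rmdir_full/keep.txt
rmdir /tmp/rmdir_empty                          # removed: it was empty
rmdir /tmp/rmdir_full 2>/dev/null || echo "refused: directory not empty"
```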

(18) -du: report the size of files and directories

[hadoop@hadoop102 hadoop-2.7.2]$ hadoop fs -du /
29         /chongqing
212046774  /hadoop-2.7.2.tar.gz
13         /sanguo
37         /wc.input
14         /xinyue.txt
[hadoop@hadoop102 hadoop-2.7.2]$ hadoop fs -du -h /
29       /chongqing
202.2 M  /hadoop-2.7.2.tar.gz
13       /sanguo
37       /wc.input
14       /xinyue.txt
[hadoop@hadoop102 hadoop-2.7.2]$ hadoop fs -du -h -s /
202.2 M  /
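The -s and -h flags mirror the Linux du flags, so their effect can be previewed locally (local du shown here as a sketch; hadoop fs -du takes HDFS paths instead):

```shell
# -s summarizes a directory into one total line; -h prints human-readable sizes.
mkdir -p /tmp/du_demo
printf 'hello hdfs\n' > /tmp/du_demo/a.txt
du -s  /tmp/du_demo    # one total line for the directory
du -sh /tmp/du_demo    # same total, human-readable (e.g. 4.0K)
```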

(19) -setrep: set the replication factor of a file in HDFS. Note: the value set here is only recorded in the NameNode's metadata; whether that many replicas actually exist depends on the number of DataNodes available.

[hadoop@hadoop102 hadoop-2.7.2]$ cd data/tmp/dfs/data/current/BP-1454198558-117.59.224.141-1595323236787/current/finalized/subdir0/subdir0/
[hadoop@hadoop102 subdir0]$ hadoop fs -rm -R /chongqing
[hadoop@hadoop102 subdir0]$ hadoop fs -rm /hadoop-2.7.2.tar.gz
[hadoop@hadoop102 subdir0]$ hadoop fs -rm -R /sanguo
[hadoop@hadoop102 subdir0]$ ll
total 207096
-rw-rw-r--. 1 hadoop hadoop        37 Jul 21 09:10 blk_1073741825
-rw-rw-r--. 1 hadoop hadoop        11 Jul 21 09:10 blk_1073741825_1001.meta
-rw-rw-r--. 1 hadoop hadoop        14 Aug  8 02:57 blk_1073741831
-rw-rw-r--. 1 hadoop hadoop        11 Aug  8 02:57 blk_1073741831_1011.meta
drwxr-xr-x. 9 hadoop hadoop       149 Jan 25  2016 hadoop-2.7.2
-rw-rw-r--. 1 hadoop hadoop 212046774 Jul 21 09:26 tmp.txt
[hadoop@hadoop102 subdir0]$ cat blk_1073741825
xiang
xiang
lin 
lin yuan chen  yuan
[hadoop@hadoop102 subdir0]$ cat blk_1073741831
wo shi xinyue
[hadoop@hadoop102 subdir0]$ 
[hadoop@hadoop102 finalized]$ hadoop fs -setrep 2 /xinyue.txt

Every coincidence is either destined by fate, or the result of someone quietly working hard.

The end!

Reference: 尚硅谷 Hadoop tutorial
