HDFS Introduction: Basic Shell Operations

1. Shell command-line client

Hadoop provides a shell command-line client, used as follows:
hadoop fs <args>
The file system shell includes a variety of shell-like commands that interact directly with the Hadoop Distributed File System (HDFS) as well as the other file systems Hadoop supports (such as the local FS, HFTP FS, S3 FS, etc.). All FS shell commands take path URIs as arguments.

The URI format is scheme://authority/path. For HDFS the scheme is hdfs, and for the local FS the scheme is file. The scheme and authority are optional; if not specified, the default scheme from the configuration is used.

For HDFS, example commands are as follows (the second form relies on the default file system configured in fs.defaultFS):
hadoop fs -ls hdfs://namenodehost/parent/child
hadoop fs -ls /parent/child
For the local file system, an example command is as follows:
hadoop fs -ls file:///root/
If the file system in use is HDFS, hdfs dfs works as well; in that case
hadoop fs <args> = hdfs dfs <args>
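For instance, on a cluster whose fs.defaultFS points at HDFS, the two clients list the same directory (the path is the hypothetical one used above):
hadoop fs -ls /parent/child
hdfs dfs -ls /parent/child
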
2. Shell command options

Option           Usage format                                                                    Meaning
-ls              -ls <path>                                                                      List the contents of the specified path
-lsr             -lsr <path>                                                                     Recursively list the directory structure of the specified path
-du              -du <path>                                                                      Show the size of each file under the directory
-dus             -dus <path>                                                                     Show the total size of the files (folders) under the directory
-count           -count [-q] <path>                                                              Count the number of files (folders)
-mv              -mv <source path> <destination path>                                            Move
-cp              -cp <source path> <destination path>                                            Copy
-rm              -rm [-skipTrash] <path>                                                         Delete a file or empty folder
-rmr             -rmr [-skipTrash] <path>                                                        Delete recursively
-put             -put <one or more local files> <hdfs path>                                      Upload files
-copyFromLocal   -copyFromLocal <one or more local files> <hdfs path>                            Copy from the local file system
-moveFromLocal   -moveFromLocal <one or more local files> <hdfs path>                            Move from the local file system
-getmerge        -getmerge <source path> <local path>                                            Merge files and download to the local file system
-cat             -cat <hdfs file path>                                                           Show file contents
-text            -text <hdfs file path>                                                          Show file contents (decoding if needed)
-copyToLocal     -copyToLocal [-ignoreCrc] [-crc] <hdfs source path> <local destination path>    Copy to the local file system
-moveToLocal     -moveToLocal [-crc] <hdfs source path> <local destination path>                 Move to the local file system
-mkdir           -mkdir <hdfs path>                                                              Create an empty folder
-setrep          -setrep [-R] [-w] <replication factor> <path>                                   Change the replication factor
-touchz          -touchz <file path>                                                             Create an empty file
-stat            -stat [format] <path>                                                           Show file statistics
-tail            -tail [-f] <file>                                                               Show the last part of a file
-chmod           -chmod [-R] <permission mode> [path]                                            Change permissions
-chown           -chown [-R] [owner][:[group]] <path>                                            Change the owner
-chgrp           -chgrp [-R] <group name> <path>                                                 Change the group
-help            -help [command option]                                                          Show help
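
As a quick illustration of a few of the options above, the following short session creates a directory, uploads a local file, counts its contents, and removes everything again; all paths and the local file name demo.txt are hypothetical, and -rm -r is the modern spelling of -rmr:
hadoop fs -mkdir -p /user/hadoop/demo          # hypothetical HDFS directory
hadoop fs -touchz /user/hadoop/demo/empty.txt  # create an empty file
hadoop fs -put demo.txt /user/hadoop/demo      # demo.txt is a hypothetical local file
hadoop fs -count /user/hadoop/demo             # count directories, files, and bytes
hadoop fs -rm -r -skipTrash /user/hadoop/demo  # recursive delete, bypassing the trash
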
3. Introduction to common shell commands

-ls
Usage: hadoop fs -ls [-h] [-R] <path>
Function: list file and directory information.
Example: hadoop fs -ls /user/hadoop/file1
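A variant with the optional flags, where -h prints human-readable sizes and -R lists recursively (the directory is hypothetical and a Hadoop 2.x or newer client is assumed):
hadoop fs -ls -h -R /user/hadoop  # hypothetical directory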

-mkdir
Usage: hadoop fs -mkdir [-p] <path>
Function: create a directory on HDFS; -p creates all missing parent directories along the path.
Example: hadoop fs -mkdir -p /user/hadoop/dir1

-put
Usage: hadoop fs -put [-f] [-p] [ - | <localsrc> ... ] <dst>
Function: copy a single src, or multiple srcs, from the local file system to the destination file system.
-p: preserve access and modification times, ownership, and permissions.
-f: overwrite the destination if it already exists.
Example: hadoop fs -put -f localfile1 localfile2 /user/hadoop/hadoopdir
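A variant that additionally preserves the local file's timestamps, ownership, and permissions, reusing the names from the example above:
hadoop fs -put -p localfile1 /user/hadoop/hadoopdir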

-get
Usage: hadoop fs -get [-ignorecrc] [-crc] [-p] [-f] <src> <localdst>
Function: copy files to the local file system.
-ignorecrc: skip the CRC check.
-crc: write CRC checksums for the downloaded files.
Example: hadoop fs -get hdfs://host:port/user/hadoop/file localfile
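A variant combining the flags above, overwriting any existing local copy and skipping the CRC check (paths reuse the example above, with the host:port placeholder left as-is):
hadoop fs -get -f -ignorecrc hdfs://host:port/user/hadoop/file localfile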

-appendToFile
Usage: hadoop fs -appendToFile <localsrc> ... <dst>
Function: append one or more local files to the end of an existing file.
Example: hadoop fs -appendToFile localfile /hadoop/hadoopfile
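The local source can also be stdin when "-" is given in its place, which lets the output of another command be appended directly (the echoed text is only an illustration):
echo "extra line" | hadoop fs -appendToFile - /hadoop/hadoopfile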

-cat
Usage: hadoop fs -cat [-ignoreCrc] URI [URI ...]
Function: print file contents to stdout.
Example: hadoop fs -cat /hadoop/hadoopfile
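Several URIs may be given at once and are printed in order (the second file name is hypothetical):
hadoop fs -cat /hadoop/hadoopfile /hadoop/hadoopfile2  # hadoopfile2 is hypothetical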

-tail
Usage: hadoop fs -tail [-f] URI
Function: display the last kilobyte of the file on stdout.
-f: output appended data as the file grows.
Example: hadoop fs -tail /hadoop/hadoopfile
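With -f the command keeps running and prints new data as it is appended, much like the Unix tail -f:
hadoop fs -tail -f /hadoop/hadoopfile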

-chgrp
Usage: hadoop fs -chgrp [-R] GROUP URI [URI ...]
Function: change the group association of files. The user must be the owner of the file, or a superuser.
-R: apply the change recursively through the directory structure.
Example: hadoop fs -chgrp othergroup /hadoop/hadoopfile
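A recursive variant that changes the group for an entire directory tree (the directory name is hypothetical):
hadoop fs -chgrp -R othergroup /hadoop/dir  # /hadoop/dir is hypothetical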

-chmod
Function: change file permissions. With -R, the change is applied recursively through the directory structure.
Example: hadoop fs -chmod 666 /hadoop/hadoopfile
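A recursive variant over a directory tree (the directory name is hypothetical):
hadoop fs -chmod -R 755 /hadoop/dir  # /hadoop/dir is hypothetical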

-chown
Function: change the owner of files. With -R, the change is applied recursively through the directory structure.
Example: hadoop fs -chown someuser:somegrp /hadoop/hadoopfile
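A recursive variant over a directory tree (the directory name is hypothetical):
hadoop fs -chown -R someuser:somegrp /hadoop/dir  # /hadoop/dir is hypothetical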

-copyFromLocal
Usage: hadoop fs -copyFromLocal <localsrc> URI
Function: copy files from the local file system to an HDFS path.
Example: hadoop fs -copyFromLocal /root/1.txt /

-copyToLocal
Function: copy from HDFS to the local file system.
Example: hadoop fs -copyToLocal /aaa/jdk.tar.gz
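An explicit local destination directory can also be given (the /root/ target is hypothetical):
hadoop fs -copyToLocal /aaa/jdk.tar.gz /root/  # /root/ is a hypothetical local directory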

-cp
Function: copy from one HDFS path to another HDFS path.
Example: hadoop fs -cp /aaa/jdk.tar.gz /bbb/jdk.tar.gz.2

-mv
Function: move files between HDFS directories.
Example: hadoop fs -mv /aaa/jdk.tar.gz /

-getmerge
Function: merge multiple files and download them as a single local file.
Example: suppose the HDFS directory /aaa/ contains multiple files: log.1, log.2, log.3, ...
hadoop fs -getmerge /aaa/log.* ./log.sum

-rm
Function: delete the specified files (or empty directories); -r deletes recursively.
Example: hadoop fs -rm -r /aaa/bbb/

-df
Function: show the available space of the file system.
Example: hadoop fs -df -h /

-du
Function: show the size of each file in the directory; when a single file is specified, show the size of that file.
Example: hadoop fs -du /user/hadoop/dir1
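With -s the sizes are summarized for the whole directory rather than listed per file, and -h prints them in human-readable units (assuming a Hadoop 2.x or newer client):
hadoop fs -du -s -h /user/hadoop/dir1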

-setrep
Function: change the replication factor of a file. The -R option recursively changes the replication factor of all files under the directory.
Example: hadoop fs -setrep -R -w 3 /user/hadoop/dir1

If this article helped you, please like, comment, and follow.



Thank you for reading. I hope this helps, and thanks for your support!