HDFS Introduction: Basic Shell Operations

1. Shell command-line client

Hadoop provides a shell command-line client, invoked as follows:
hadoop fs <args>
The file system shell includes various shell-like commands that interact directly with the Hadoop Distributed File System (HDFS) as well as other file systems Hadoop supports (such as the local FS, HFTP FS, S3 FS, etc.). All FS shell commands take path URIs as arguments.

The URI format is scheme://authority/path. For HDFS, the scheme is hdfs; for the local FS, the scheme is file. The scheme and authority are optional; if not specified, the default scheme from the configuration is used.
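The scheme/authority/path split can be illustrated without a Hadoop cluster at all, using plain POSIX parameter expansion (a sketch; the URI value and variable names are just examples):

```shell
#!/bin/sh
# Split an FS shell URI of the form scheme://authority/path
uri="hdfs://namenodehost:8020/parent/child"

scheme="${uri%%://*}"       # everything before "://"   -> hdfs
rest="${uri#*://}"          # authority/path            -> namenodehost:8020/parent/child
authority="${rest%%/*}"     # up to the first slash     -> namenodehost:8020
path="/${rest#*/}"          # the remainder, rooted     -> /parent/child

echo "$scheme $authority $path"
```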

For HDFS, example commands are as follows:
hadoop fs -ls hdfs://namenodehost/parent/child
hadoop fs -ls /parent/child
(the second form uses the default scheme from the fs.defaultFS configuration)
For the local file system, an example command is as follows:
hadoop fs -ls file:///root/
If the file system in use is HDFS, hdfs dfs can be used instead; in that case
hadoop fs <args> = hdfs dfs <args>
2. Shell command options

Option name      Usage format                                               Meaning
-ls              -ls <path>                                                 List the contents of the specified path
-lsr             -lsr <path>                                                Recursively list the directory structure of the specified path
-du              -du <path>                                                 Show the size of each file under the directory
-dus             -dus <path>                                                Show the total size of the files/directories under the path
-count           -count [-q] <path>                                         Count the number of files and directories
-mv              -mv <src path> <dst path>                                  Move
-cp              -cp <src path> <dst path>                                  Copy
-rm              -rm [-skipTrash] <path>                                    Delete a file or empty directory
-rmr             -rmr [-skipTrash] <path>                                   Delete recursively
-put             -put <one or more local files> <dst path>                  Upload files
-copyFromLocal   -copyFromLocal <one or more local files> <dst path>        Copy from local
-moveFromLocal   -moveFromLocal <one or more local files> <dst path>        Move from local
-getmerge        -getmerge <src path> <local dst>                           Merge and download to local
-cat             -cat <file>                                                View file contents
-text            -text <file>                                               View file contents
-copyToLocal     -copyToLocal [-ignoreCrc] [-crc] <HDFS src> <local dst>    Copy to local
-moveToLocal     -moveToLocal [-crc] <HDFS src> <local dst>                 Move to local
-mkdir           -mkdir <path>                                              Create an empty directory
-setrep          -setrep [-R] [-w] <replication> <path>                     Change the replication factor
-touchz          -touchz <file path>                                        Create an empty file
-stat            -stat [format] <path>                                      Show file statistics
-tail            -tail [-f] <file>                                          View the end of a file
-chmod           -chmod [-R] <mode> <path>                                  Change permissions
-chown           -chown [-R] [owner][:[group]] <path>                       Change the owner
-chgrp           -chgrp [-R] <group> <path>                                 Change the group
-help            -help [command option]                                     Show help
3. Introduction to common shell commands

-ls
Usage: hadoop fs -ls [-h] [-R] <args>
Function: Display file and directory information.
Example: hadoop fs -ls /user/hadoop/file1

-mkdir
Usage: hadoop fs -mkdir [-p] <paths>
Function: Create directories on HDFS; -p creates all missing parent directories along the path.
Example: hadoop fs -mkdir -p /user/hadoop/dir1
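HDFS borrows the -p flag directly from POSIX mkdir, so its behavior can be previewed on the local file system (a local sketch; no cluster required, and the paths are just examples):

```shell
#!/bin/sh
# Local analog of `hadoop fs -mkdir -p`: create every missing
# parent directory along the path in a single call.
workdir=$(mktemp -d)

mkdir -p "$workdir/user/hadoop/dir1"   # user/ and user/hadoop/ are created too

ls "$workdir/user/hadoop"              # prints: dir1
```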

-put
Usage: hadoop fs -put [-f] [-p] [ - | <localsrc> ... ] <dst>
Function: Copy one or more srcs from the local file system to the target file system.
-p: Preserve access and modification times, ownership, and permissions.
-f: Overwrite the destination if it already exists.
Example: hadoop fs -put -f localfile1 localfile2 /user/hadoop/hadoopdir

-get
Usage: hadoop fs -get [-ignorecrc] [-crc] [-p] [-f] <src> <localdst>
Function: Copy files to the local file system.
-ignorecrc: Skip the CRC check.
-crc: Write CRC checksums for the downloaded files.
Example: hadoop fs -get hdfs://host:port/user/hadoop/file localfile

-appendToFile
Usage: hadoop fs -appendToFile <localsrc> ... <dst>
Function: Append one or more local files to the end of an existing file.
Example: hadoop fs -appendToFile localfile /hadoop/hadoopfile
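On a local file system, the same append semantics are what `>>` gives you. This sketch (local files only, no cluster; file names are just examples) shows the effect -appendToFile has on the target:

```shell
#!/bin/sh
# Local sketch of append semantics: the source file is added to the
# end of the target, and the target's existing content is preserved.
workdir=$(mktemp -d)
printf 'first\n'  > "$workdir/hadoopfile"
printf 'second\n' > "$workdir/localfile"

cat "$workdir/localfile" >> "$workdir/hadoopfile"   # append, do not overwrite

cat "$workdir/hadoopfile"   # prints: first, then second
```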

-cat
Usage: hadoop fs -cat [-ignoreCrc] URI [URI ...]
Function: Print file contents to stdout.
Example: hadoop fs -cat /hadoop/hadoopfile

-tail
Usage: hadoop fs -tail [-f] URI
Function: Print the last kilobyte of the file to stdout.
-f: Keep outputting data as the file grows.
Example: hadoop fs -tail /hadoop/hadoopfile
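The "last kilobyte" behavior matches local `tail -c 1024`, which can be checked without a cluster. A sketch with a file of known size (the sizes and the END marker are arbitrary examples):

```shell
#!/bin/sh
# Build a 2003-byte file: 2000 'a' bytes followed by the marker "END",
# then take the last kilobyte, as `hadoop fs -tail` would.
workdir=$(mktemp -d)
{ printf 'a%.0s' $(seq 1 2000); printf 'END'; } > "$workdir/f"

tail -c 1024 "$workdir/f" | wc -c   # prints: 1024
tail -c 3 "$workdir/f"              # prints: END
```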

-chgrp
Usage: hadoop fs -chgrp [-R] GROUP URI [URI ...]
Function: Change the group association of files. The user must be the owner of the file or a superuser.
-R: Apply the change recursively through the directory structure.
Example: hadoop fs -chgrp othergroup /hadoop/hadoopfile

-chmod
Function: Change file permissions. With -R, the change is applied recursively through the directory structure.
Example: hadoop fs -chmod 666 /hadoop/hadoopfile
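-chmod takes the same octal modes as POSIX chmod, so the meaning of a mode like 666 can be checked locally (a sketch; GNU stat is assumed for the verification step):

```shell
#!/bin/sh
# Mode 666 means read + write for owner, group, and other (rw-rw-rw-).
workdir=$(mktemp -d)
touch "$workdir/hadoopfile"

chmod 666 "$workdir/hadoopfile"

stat -c '%a %A' "$workdir/hadoopfile"   # prints: 666 -rw-rw-rw-
```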

-chown
Function: Change the owner of a file. With -R, the change is applied recursively through the directory structure.
Example: hadoop fs -chown someuser:somegrp /hadoop/hadoopfile

-copyFromLocal
Usage: hadoop fs -copyFromLocal <localsrc> URI
Function: Copy files from the local file system to an HDFS path.
Example: hadoop fs -copyFromLocal /root/1.txt /

-copyToLocal
Function: Copy from HDFS to the local file system.
Example: hadoop fs -copyToLocal /aaa/jdk.tar.gz

-cp
Function: Copy from one HDFS path to another HDFS path.
Example: hadoop fs -cp /aaa/jdk.tar.gz /bbb/jdk.tar.gz.2

-mv
Function: Move files within HDFS.
Example: hadoop fs -mv /aaa/jdk.tar.gz /

-getmerge
Function: Merge multiple files and download the result as one local file.
Example: suppose the HDFS directory /aaa/ contains multiple files: log.1, log.2, log.3, ...
hadoop fs -getmerge /aaa/log.* ./log.sum

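What -getmerge produces is a simple concatenation of the source files; on a local FS, `cat` gives the same result. A sketch with three small log files (local only; names chosen to mirror the example above):

```shell
#!/bin/sh
# Local analog of `hadoop fs -getmerge /aaa/log.* ./log.sum`:
# concatenate the sources, in name order, into a single file.
workdir=$(mktemp -d)
printf 'one\n'   > "$workdir/log.1"
printf 'two\n'   > "$workdir/log.2"
printf 'three\n' > "$workdir/log.3"

cat "$workdir"/log.* > "$workdir/log.sum"   # merge in name order

wc -l < "$workdir/log.sum"                  # prints: 3
```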
-rm
Function: Delete the specified files; use -r to delete a directory and its contents recursively.
Example: hadoop fs -rm -r /aaa/bbb/

-df
Function: Show the available space of the file system.
Example: hadoop fs -df -h /

-du
Function: Show the size of each file in a directory; when a single file is specified, show that file's size.
Example: hadoop fs -du /user/hadoop/dir1
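Like `hadoop fs -du`, GNU `du --bytes` reports per-file sizes in bytes (apparent size, not disk blocks), which makes the output easy to sanity-check locally (a sketch; GNU coreutils assumed, file name is an example):

```shell
#!/bin/sh
# Write exactly 5 bytes, then confirm du reports them.
workdir=$(mktemp -d)
printf '12345' > "$workdir/data"

du -b "$workdir/data" | cut -f1   # prints: 5
```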

-setrep
Function: Change the replication factor of a file. The -R option recursively changes the replication factor of all files in a directory.
Example: hadoop fs -setrep -w 3 -R /user/hadoop/dir1

If this helped, please like, comment, and follow.

Thank you for reading. I hope it was useful; thanks for your support!