[antares@hadoop1 hadoop-3.3.5]$ mkdir wcinput
[antares@hadoop1 hadoop-3.3.5]$ cd wcinput
[antares@hadoop1 wcinput]$ vim word.txt
The content can be anything; for example, write the following lines:
启动: systemctl start docker
停止: systemctl stop docker
重启: systemctl restart docker
查看: systemctl status docker
开机: systemctl enable docker
[antares@hadoop1 hadoop-3.3.5]$ hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-3.3.5.jar wordcount wcinput wcoutput
Note: the wcoutput directory is created by the job itself while the command runs; do not create it in advance.
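If you want to re-run the job, delete the old output directory first, because Hadoop refuses to write into an output path that already exists (the job aborts with a FileAlreadyExistsException). A minimal cleanup-and-rerun, assuming the same paths as above:
[antares@hadoop1 hadoop-3.3.5]$ rm -rf wcoutput
[antares@hadoop1 hadoop-3.3.5]$ hadoop jar share/hadoop/mapreduce/hadoop-mapreduce-examples-3.3.5.jar wordcount wcinput wcoutput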
[antares@hadoop1 hadoop-3.3.5]$ cat wcoutput/part-r-00000
docker	5
enable	1
restart	1
start	1
status	1
stop	1
systemctl	5
停止:	1
启动:	1
开机:	1
查看:	1
重启:	1
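As a quick cross-check (just a shell sketch, not part of the Hadoop workflow: it splits word.txt on spaces roughly the way wordcount tokenizes, then tallies the tokens):
[antares@hadoop1 hadoop-3.3.5]$ tr ' ' '\n' < wcinput/word.txt | sort | uniq -c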
[antares@hadoop1 hadoop-3.3.5]$ vim kang.txt
[antares@hadoop1 hadoop-3.3.5]$ sudo chown antares:antares -R /opt/module
If you were already logged in as this user (antares) in the first place, there is no need to change the ownership again before running commands.
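To double-check the ownership of the directory tree (the exact output depends on your system), something like:
[antares@hadoop2 module]$ ls -ld /opt/module/hadoop-3.3.5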
[antares@hadoop2 hadoop-3.3.5]$ ls
bin include lib LICENSE-binary LICENSE.txt NOTICE.txt sbin wcinput
etc kang.txt libexec licenses-binary NOTICE-binary README.txt share wcoutput
[antares@hadoop2 hadoop-3.3.5]$ pwd
/opt/module/hadoop-3.3.5
[antares@hadoop2 hadoop-3.3.5]$ scp -r /opt/module/hadoop-3.3.5/kang.txt antares@hadoop3:/opt/module/hadoop-3.3.5/
The authenticity of host 'hadoop3 (192.168.193.176)' can't be established.
ECDSA key fingerprint is SHA256:HmeFoPbjR1dLiPcwjnlsYhOq3EiaJirR7H9jcjQnBfU.
ECDSA key fingerprint is MD5:d2:d9:4f:61:0b:5a:65:c1:c0:48:d7:b4:c2:f2:1f:1a.
Are you sure you want to continue connecting (yes/no)? yes
Warning: Permanently added 'hadoop3,192.168.193.176' (ECDSA) to the list of known hosts.
antares@hadoop3's password:
kang.txt
For the other host, run the same command: answer yes at the prompt, then enter the password, and the copy completes. You can log in to the target VM to confirm that the file arrived.
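Besides pushing from the local host, scp can also pull from a remote host or copy directly between two remote hosts. A sketch of the three forms, reusing the hostnames and paths of this cluster (adjust as needed):
# push: run on hadoop2, copy to hadoop3
[antares@hadoop2 hadoop-3.3.5]$ scp -r /opt/module/hadoop-3.3.5/kang.txt antares@hadoop3:/opt/module/hadoop-3.3.5/
# pull: run on hadoop3, fetch from hadoop2
[antares@hadoop3 hadoop-3.3.5]$ scp -r antares@hadoop2:/opt/module/hadoop-3.3.5/kang.txt /opt/module/hadoop-3.3.5/
# third-party: run on hadoop1, copy from hadoop2 to hadoop3
[antares@hadoop1 ~]$ scp -r antares@hadoop2:/opt/module/hadoop-3.3.5/kang.txt antares@hadoop3:/opt/module/hadoop-3.3.5/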
rsync is mainly used for backup and mirroring. Its advantages are speed, avoiding copying identical content, and support for symbolic links.
Difference between rsync and scp: copying files with rsync is faster than with scp, because rsync only transfers files that differ, while scp copies every file.
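The basic form, with the options used below (-a archive mode, which preserves permissions, timestamps and symbolic links; -v verbose output); the paths here are only placeholders:
rsync -av /path/to/src/ user@host:/path/to/dst/
A trailing slash on the source means "copy the contents of the directory" rather than the directory itself, which is why the command below syncs into the existing hadoop-3.3.5 directory.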
Delete wcinput and wcoutput under /opt/module/hadoop-3.3.5/ on hadoop3:
[antares@hadoop3 hadoop-3.3.5]$ rm -rf wcinput wcoutput
Sync /opt/module/hadoop-3.3.5 from hadoop2 to hadoop3:
[antares@hadoop2 module]$ rsync -av hadoop-3.3.5/ antares@hadoop3:/opt/module/hadoop-3.3.5/
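After the sync, check the result on hadoop3; wcinput and wcoutput should be present there again, since rsync with these options adds and updates files (it does not delete extras unless --delete is passed):
[antares@hadoop3 hadoop-3.3.5]$ ls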