[TOC]

> [Reference](https://www.jianshu.com/p/9c8a0f7b98cf)

## Create a user

```
export username=im_user
useradd -d /home/${username} -m ${username}
passwd ${username}
echo "${username} ALL = (root) NOPASSWD:ALL" | sudo tee /etc/sudoers.d/${username}
```

## Set up passwordless SSH login

Add the host to `/etc/hosts`:

```
cat >> /etc/hosts <<EOF
192.168.0.110 h1
EOF
```

Generate a key pair and distribute the public key:

```
su ${username}
ssh-keygen
# Copy the public key generated on the management node to the other nodes
ssh-copy-id -i /home/${username}/.ssh/id_rsa.pub ${username}@h1
```

Test the passwordless login: `ssh h1`

## Install Java

`yum install java -y`

`vim ~/.bashrc`

```
export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.232.b09-0.el7_7.x86_64
export JRE_HOME=$JAVA_HOME/jre
export PATH=$JAVA_HOME/bin:$JRE_HOME/bin:$PATH
export CLASSPATH=$JAVA_HOME/lib:$JRE_HOME/lib
```

`source ~/.bashrc`

## Install Hadoop

[Download Hadoop 3.1 from a mirror](https://mirrors.tuna.tsinghua.edu.cn/apache/hadoop/common/)

```
sudo tar -zxf ./hadoop-3.1.1.tar.gz -C /usr/local
cd /usr/local
sudo mv ./hadoop-3.1.1 ./hadoop        # rename
sudo chown -R ${username} ./hadoop     # change ownership to the user created above
cd /usr/local/hadoop
./bin/hadoop version
```

`vim ~/.bashrc`

```
export HADOOP_HOME=/usr/local/hadoop
export PATH=$PATH:/usr/local/hadoop/bin
```

### Edit the configuration files

`vim /usr/local/hadoop/etc/hadoop/core-site.xml`

```
<configuration>
    <property>
        <name>hadoop.tmp.dir</name>
        <value>file:/usr/local/hadoop/tmp</value>
        <description>A base for other temporary directories.</description>
    </property>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>
```

`vim /usr/local/hadoop/etc/hadoop/hdfs-site.xml`

```
<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
    <property>
        <name>dfs.namenode.name.dir</name>
        <value>file:/usr/local/hadoop/tmp/dfs/name</value>
    </property>
    <property>
        <name>dfs.datanode.data.dir</name>
        <value>file:/usr/local/hadoop/tmp/dfs/data</value>
    </property>
</configuration>
```

### Format the NameNode

`/usr/local/hadoop/bin/hdfs namenode -format`

![UTOOLS1576056391244.png](http://yanxuan.nosdn.127.net/210b0d578b9df87046424995f1e2c991.png)

### Start Hadoop

`/usr/local/hadoop/sbin/start-dfs.sh`

### Stop Hadoop

`/usr/local/hadoop/sbin/stop-dfs.sh`

### View HDFS information in the web UI

http://localhost:9870
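
### Note on JAVA_HOME for the Hadoop scripts

If `start-dfs.sh` cannot locate Java, keep in mind that the Hadoop start/stop scripts read Java's location from `etc/hadoop/hadoop-env.sh` rather than from `~/.bashrc`. The snippet below is a minimal sketch, not part of the original guide; it reuses the OpenJDK path from the Java installation step above, which may differ on your system.

```
# /usr/local/hadoop/etc/hadoop/hadoop-env.sh
# Point the Hadoop scripts at the JDK installed via yum above (adjust the path if yours differs)
export JAVA_HOME=/usr/lib/jvm/java-1.8.0-openjdk-1.8.0.232.b09-0.el7_7.x86_64
```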
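
### Verify HDFS

The following smoke test is not part of the original guide; it is a minimal sketch assuming `start-dfs.sh` has completed, `/usr/local/hadoop/bin` is on `PATH` as configured above, and the `username` variable from the first section is still set in the current shell (otherwise substitute `im_user`). The directory and file names (`/user/${username}`, `test.txt`) are only illustrative.

```
# List the running Java daemons; NameNode, DataNode and SecondaryNameNode should appear
# (jps ships with the full JDK; install java-1.8.0-openjdk-devel if it is missing)
jps

# Create a home directory in HDFS for the current user and round-trip a small file
hdfs dfs -mkdir -p /user/${username}
echo "hello hdfs" > test.txt
hdfs dfs -put test.txt /user/${username}/
hdfs dfs -ls /user/${username}
hdfs dfs -cat /user/${username}/test.txt
```

If the `-cat` command prints the file contents, HDFS is reading and writing correctly; the same file should also be visible under "Utilities → Browse the file system" in the web UI at http://localhost:9870.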