# Hadoop Single-Node Setup Notes (Ubuntu)
**Download Hadoop**
Hadoop download page:
[http://www.apache.org/dyn/closer.cgi/hadoop/core/](http://www.apache.org/dyn/closer.cgi/hadoop/core/)
The version downloaded here is 1.0.3.
$ mkdir hadoop
$ cd hadoop
$ wget http://www.fayea.com/apache-mirror/hadoop/core/stable/hadoop-1.0.3.tar.gz
**Install Java**
First run `java -version` to check whether Java is already installed; output such as `java version "1.7.0_147-icedtea"` means it is.
If it is not installed, see: [http://blog.csdn.net/yang_hui1986527/article/details/6677450](http://blog.csdn.net/yang_hui1986527/article/details/6677450)
After installing, you must set JAVA_HOME and CLASSPATH.
My configuration:
export PATH=${PATH}:/usr/lib/jvm/java-6-openjdk-amd64/bin
export JAVA_HOME=/usr/lib/jvm/java-6-openjdk-amd64/
export JRE_HOME=${JAVA_HOME}/jre
export CLASSPATH=.:${JRE_HOME}/lib:${JAVA_HOME}/lib/dt.jar:${JAVA_HOME}/lib/tools.jar
Put these lines in ~/.bashrc.
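The settings above can be sanity-checked with a short script. This is a sketch: the JDK path is the one from my configuration above and may differ on your machine.

```shell
#!/bin/sh
# Check that JAVA_HOME points at a usable JDK. The default path below is an
# assumption matching the configuration above; adjust it to your installation.
JAVA_HOME=${JAVA_HOME:-/usr/lib/jvm/java-6-openjdk-amd64}
if [ -x "$JAVA_HOME/bin/java" ]; then
    JAVA_OK=yes
else
    JAVA_OK=no
fi
echo "JAVA_HOME=$JAVA_HOME java_ok=$JAVA_OK"
```

If this prints `java_ok=no`, re-check the path with `whereis java` before continuing.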
**Install ssh**
$ sudo apt-get install ssh
After installation, confirm that you can ssh to localhost without a password:
$ ssh localhost
If a password is required, set up key-based login:
$ ssh-keygen -t dsa -P '' -f ~/.ssh/id_dsa
$ cat ~/.ssh/id_dsa.pub >> ~/.ssh/authorized_keys
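The passwordless login can also be verified non-interactively (a sketch): with `BatchMode`, ssh fails instead of prompting for a password, so success means key-based login is in place.

```shell
#!/bin/sh
# BatchMode=yes forbids password prompts, so this only succeeds when
# key-based (passwordless) login to localhost is set up.
if ssh -o BatchMode=yes -o StrictHostKeyChecking=no localhost true 2>/dev/null; then
    SSH_OK=yes
else
    SSH_OK=no
fi
echo "passwordless ssh to localhost: $SSH_OK"
```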
**Install rsync**
rsync is a remote file synchronization tool for Linux.
$ sudo apt-get install rsync
**Configure and start Hadoop**
Unpack the tarball:
`$ tar -zxvf hadoop-1.0.3.tar.gz`
Set JAVA_HOME:
Edit conf/hadoop-env.sh and find the line:
`# export JAVA_HOME=/usr/lib/j2sdk1.5-sun`
Change it to:
`export JAVA_HOME=/usr/lib/jvm/java-6-openjdk-amd64/`
(If you do not know where Java is installed, find it with `whereis java`.)
Edit the configuration files.
Edit conf/core-site.xml:
~~~
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
~~~
Edit conf/hdfs-site.xml:
~~~
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/home/work/hadoop_tmp</value>
  </property>
</configuration>
~~~
Edit conf/mapred-site.xml:
~~~
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
~~~
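The three edits above can also be scripted with here-documents. This is a sketch: `conf` is assumed to be the configuration directory inside the unpacked hadoop-1.0.3 tree, and the property values mirror the snippets above (adjust `hadoop.tmp.dir` for your machine).

```shell
#!/bin/sh
# Write the three single-node config files in one go. Values mirror the
# manual edits above; CONF_DIR defaults to ./conf as in the hadoop tree.
CONF_DIR=${CONF_DIR:-conf}
mkdir -p "$CONF_DIR"

cat > "$CONF_DIR/core-site.xml" <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
EOF

cat > "$CONF_DIR/hdfs-site.xml" <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/home/work/hadoop_tmp</value>
  </property>
</configuration>
EOF

cat > "$CONF_DIR/mapred-site.xml" <<'EOF'
<?xml version="1.0"?>
<configuration>
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>
EOF

echo "wrote core-site.xml, hdfs-site.xml, mapred-site.xml to $CONF_DIR"
```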
Format the Hadoop NameNode:
`$ bin/hadoop namenode -format`
Start all daemons:
`$ bin/start-all.sh`
Confirm they are running:
~~~
$ jps
5146 Jps
4538 TaskTracker
4312 JobTracker
4015 DataNode
4228 SecondaryNameNode
3789 NameNode
~~~
Seeing all five daemons (plus Jps itself) means the startup succeeded.
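The jps check above can be scripted (a sketch; it assumes the JDK's `jps` is on PATH, and reports which of the five daemons are not running instead of failing):

```shell
#!/bin/sh
# Compare jps output against the five Hadoop 1.x daemons expected on a
# single-node setup. If jps is absent or the cluster is stopped, every
# daemon is simply reported as missing.
EXPECTED="NameNode DataNode SecondaryNameNode JobTracker TaskTracker"
RUNNING=$(jps 2>/dev/null || true)
MISSING=""
for d in $EXPECTED; do
    # -w matches whole words, so "SecondaryNameNode" does not count as "NameNode"
    echo "$RUNNING" | grep -qw "$d" || MISSING="$MISSING $d"
done
if [ -z "$MISSING" ]; then
    echo "all daemons up"
else
    echo "missing:$MISSING"
fi
```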
**Add the following aliases to ~/.bashrc:**
~~~
alias hadoop='/home/zxm/hadoop/hadoop-1.0.3/bin/hadoop'
alias hls='hadoop fs -ls'
alias hlsr='hadoop fs -lsr'
alias hcp='hadoop fs -cp '
alias hmv='hadoop fs -mv'
alias hget='hadoop fs -get'
alias hput='hadoop fs -put'
alias hrm='hadoop fs -rm'
alias hmkdir='hadoop fs -mkdir'
alias hcat='hadoop fs -cat'
alias hrmr='hadoop fs -rmr'
alias hstat='hadoop fs -stat'
alias htest='hadoop fs -test'
alias htext='hadoop fs -text'
alias htouchz='hadoop fs -touchz'
alias hdu='hadoop fs -du'
alias hdus='hadoop fs -dus'
alias hchmod='hadoop fs -chmod'
alias hchgrp='hadoop fs -chgrp'
alias hchown='hadoop fs -chown'
alias htail='hadoop fs -tail'
~~~
Common problems and fixes:
Problem 1: Running a hadoop command prints the warning "Warning: $HADOOP_HOME is deprecated."
Fix: add `export HADOOP_HOME_WARN_SUPPRESS=TRUE` to conf/hadoop-env.sh.
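Problem 1's fix can be applied idempotently with a small script. This is a sketch; it assumes you run it from the hadoop-1.0.3 directory, and it only appends the line when it is not already there.

```shell
#!/bin/sh
# Append the warning-suppression flag to conf/hadoop-env.sh unless present.
ENV_FILE=${ENV_FILE:-conf/hadoop-env.sh}
mkdir -p "$(dirname "$ENV_FILE")"   # no-op inside a real hadoop tree
touch "$ENV_FILE"
grep -q 'HADOOP_HOME_WARN_SUPPRESS' "$ENV_FILE" || \
  echo 'export HADOOP_HOME_WARN_SUPPRESS=TRUE' >> "$ENV_FILE"
```

Running it twice leaves a single copy of the line, so it is safe to re-run.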
Problem 2: The NameNode fails to start.
Fix: delete /tmp/hadoop* (note: this wipes any existing HDFS data), then run `bin/hadoop namenode -format` again.