### 1. Prepare the installation package:
hadoop-2.6.0-cdh5.14.2.tar.gz
2. Extract the installation package (see the commands below)
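A typical extraction sequence, assuming the tarball was uploaded to /opt/soft and the directory is renamed to hadoop260 to match the paths used in the configs below (both the location and the rename are assumptions):
# extract into /opt/soft
tar -zxvf hadoop-2.6.0-cdh5.14.2.tar.gz -C /opt/soft/
# rename so the path matches /opt/soft/hadoop260 used later
mv /opt/soft/hadoop-2.6.0-cdh5.14.2 /opt/soft/hadoop260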
3. Edit the configuration files
Go into etc/hadoop under the Hadoop installation directory
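For example, with the install path assumed above:
cd /opt/soft/hadoop260/etc/hadoop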
3-1 vi hadoop-env.sh
Set JAVA_HOME to your local JDK directory
export JAVA_HOME=/opt/soft/jdk180
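Optional sanity check that the JDK path from the line above actually exists (the path comes from this guide; adjust it to your machine):
ls /opt/soft/jdk180/bin/java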
3-2 vi core-site.xml
<configuration>
  <property>
    <name>fs.defaultFS</name>
    <value>hdfs://your-local-ip:9000</value>
  </property>
  <property>
    <name>hadoop.tmp.dir</name>
    <value>/opt/soft/hadoop260/tmp</value>
  </property>
  <property>
    <name>hadoop.proxyuser.root.hosts</name>
    <value>*</value>
  </property>
  <property>
    <name>hadoop.proxyuser.root.groups</name>
    <value>*</value>
  </property>
</configuration>
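The hadoop.tmp.dir directory referenced above can be created up front (a precaution, not a step from the original list):
mkdir -p /opt/soft/hadoop260/tmp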
3-3 vi hdfs-site.xml
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
3-4 Make a copy of mapred-site.xml.template and edit it
cp mapred-site.xml.template mapred-site.xml
vi mapred-site.xml
<configuration>
  <property>
    <name>mapreduce.framework.name</name>
    <value>yarn</value>
  </property>
</configuration>
3-5 vi yarn-site.xml
<configuration>
  <property>
    <name>yarn.resourcemanager.hostname</name>
    <value>localhost</value>
  </property>
  <property>
    <name>yarn.nodemanager.aux-services</name>
    <value>mapreduce_shuffle</value>
  </property>
</configuration>
4. Configure the environment variables: vi /etc/profile
export HADOOP_HOME=your local Hadoop installation path
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
export HADOOP_INSTALL=$HADOOP_HOME
Apply the environment variables
source /etc/profile
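A quick check that PATH now resolves the Hadoop binaries:
hadoop version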
Format the NameNode
hadoop namenode -format
Start the services
start-all.sh
5. Edit the hosts file
vi /etc/hosts
Append a line at the end: the machine's IP address followed by its hostname
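For example (the address and hostname here are placeholders; use your own):
192.168.1.100 hadoop01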
6. Verify
Browser: http://your-local-ip:50070 (the NameNode web UI)
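You can also run jps to confirm the daemons started by start-all.sh are up; on a single-node setup you should normally see NameNode, DataNode, SecondaryNameNode, ResourceManager, and NodeManager (this extra check is not part of the original steps):
jps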