
Installing Hadoop on macOS: passwordless SSH and Hadoop setup

Configure passwordless SSH login

Enable Remote Login on macOS

System Preferences -> Sharing -> Remote Login
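Remote Login can also be toggled from the terminal with the stock systemsetup utility; a minimal sketch:

# Check whether Remote Login (the SSH server) is enabled
sudo systemsetup -getremotelogin

# Enable it
sudo systemsetup -setremotelogin on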

Authorize passwordless login

# Generate an SSH key pair (skip if you already have one)
ssh-keygen -t rsa -P ''

# Authorize passwordless login
cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys

# Test the passwordless login
ssh localhost
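If ssh localhost still prompts for a password, overly permissive file modes are a common cause, since sshd ignores an authorized_keys file it considers unsafe; a quick fix to try:

# Tighten permissions so sshd will accept the key
chmod 700 ~/.ssh
chmod 600 ~/.ssh/authorized_keys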

Install Hadoop

brew install hadoop      
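To confirm the install is on the PATH, a quick check (the version printed depends on the formula, e.g. 3.1.1 here):

# Print the installed Hadoop version
hadoop version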

Configuration

# Show the Hadoop install path
brew info hadoop

# Show the Java path
which java
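On macOS, /usr/libexec/java_home is usually a more direct way to find the value for JAVA_HOME than following the which java shim; for example, assuming a JDK 8 install:

# Print the JDK 8 home directory (adjust -v to your installed version)
/usr/libexec/java_home -v 1.8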

Go to the configuration file directory

/usr/local/Cellar/hadoop/3.1.1/libexec/etc/hadoop

1. Configure the Java environment

hadoop-env.sh
export HADOOP_OPTS="$HADOOP_OPTS -Djava.net.preferIPv4Stack=true -Djava.security.krb5.realm= -Djava.security.krb5.kdc="
export JAVA_HOME="/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home"      

2. Configure the HDFS address and port

core-site.xml

<configuration>
  <property>
     <name>hadoop.tmp.dir</name>
     <value>/usr/local/Cellar/hadoop/hdfs/tmp</value>
  </property>

  <property>
     <name>fs.default.name</name>
     <value>hdfs://localhost:8020</value>
  </property>
</configuration>      
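Note that fs.default.name is the legacy key; newer Hadoop releases prefer fs.defaultFS, and the old name still works only as a deprecated alias. Once the configuration is in place, you can verify what Hadoop actually resolved:

# Should print hdfs://localhost:8020
hdfs getconf -confKey fs.defaultFS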

3. Configure the JobTracker address and port

mapred-site.xml

<configuration>
      <property>
        <name>mapred.job.tracker</name>
        <value>localhost:8021</value>
      </property>
</configuration>      
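mapred.job.tracker is a Hadoop 1.x property; on Hadoop 3.x, MapReduce jobs normally run on YARN instead, which is selected with mapreduce.framework.name. If you intend to submit MapReduce jobs, a sketch of the extra property for the same file:

<property>
  <name>mapreduce.framework.name</name>
  <value>yarn</value>
</property>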

4. Change the HDFS replication factor

hdfs-site.xml

<configuration>
   <property>
     <name>dfs.replication</name>
     <value>1</value>
    </property>

    <property>
       <name>dfs.name.dir</name>
       <value>/usr/local/Cellar/hadoop/hdfs/name</value>
    </property>

    <property>
       <name>dfs.data.dir</name>
       <value>/usr/local/Cellar/hadoop/hdfs/data</value>
    </property>

    <property>
      <name>dfs.http.address</name>
      <value>localhost:50070</value>
    </property>
</configuration>      
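The tmp, name, and data directories referenced in these files are not necessarily created automatically; if formatting or startup later fails with missing-directory or permission errors, creating them up front is one thing to try (optional, paths taken from the config above):

# Create the local directories used by hadoop.tmp.dir, dfs.name.dir and dfs.data.dir
mkdir -p /usr/local/Cellar/hadoop/hdfs/{tmp,name,data}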

Add environment variables

vim ~/.bash_profile

export HADOOP_HOME=/usr/local/Cellar/hadoop/3.1.1
export HADOOP_MAPRED_HOME=$HADOOP_HOME
export HADOOP_COMMON_HOME=$HADOOP_HOME
export HADOOP_HDFS_HOME=$HADOOP_HOME
export YARN_HOME=$HADOOP_HOME
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export PATH=$PATH:$HADOOP_HOME/sbin:$HADOOP_HOME/bin
      

Apply the changes

source ~/.bash_profile      
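A quick sanity check that the variables took effect in the current shell:

# Should print the Hadoop prefix and resolve hdfs from the new PATH
echo $HADOOP_HOME
which hdfs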

Format HDFS

hdfs namenode -format      

Start and stop

start-dfs.sh  
stop-dfs.sh

start-yarn.sh
stop-yarn.sh      
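After starting, jps (bundled with the JDK) should show the expected daemons:

# Expect NameNode, DataNode and SecondaryNameNode after start-dfs.sh,
# plus ResourceManager and NodeManager after start-yarn.sh
jps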

Hadoop (HDFS NameNode) web UI

http://localhost:50070

YARN (ResourceManager) web UI

http://localhost:8088

List files

hdfs dfs -ls  /      
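A few more basic hdfs dfs operations to confirm reads and writes work (the /user path below is just an example):

# Round-trip a small local file through HDFS
echo "hello hdfs" > /tmp/hello.txt
hdfs dfs -mkdir -p /user/$(whoami)
hdfs dfs -put /tmp/hello.txt /user/$(whoami)/
hdfs dfs -cat /user/$(whoami)/hello.txt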

If a command does not respond, try prefixing it with sudo.
