
Compiling Hadoop 2.7.2 on CentOS 7.2

The binary package downloaded from the official Hadoop site ships with 32-bit (x86) native libraries, so for a 64-bit system you need to compile Hadoop yourself.

Build environment

   Operating system: CentOS 7.2, 64-bit (an Internet connection is required)

To check the CentOS version:

   # cat /etc/redhat-release

      CentOS Linux release 7.2.1511 (Core)

Hadoop source version: 2.7.2

1. Install basic packages

   # sudo yum -y install svn ncurses-devel gcc*

   # sudo yum -y install lzo-devel zlib-devel autoconf automake libtool cmake openssl-devel
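
   As a quick sanity check that the toolchain landed (my own addition, not part of the original steps), the compilers should now report their versions:

     # gcc --version
     # cmake --version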

2. Install the JDK

  JDK version: JDK 8u91 (either the RPM or the tar archive will do).

   Install the JDK:

   #  rpm -ivh jdk-8u91-linux-x64.rpm  

   Set the Java environment variables:

     #vi ~/.bash_profile

   Add the following lines:

     export JAVA_HOME=/usr/java/jdk1.8.0_91  

     export CLASSPATH=.:$CLASSPATH:$JAVA_HOME/lib:$JAVA_HOME/jre/lib  

     export PATH=$PATH:$JAVA_HOME/bin:$JAVA_HOME/jre/bin  
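
   To apply the change and confirm the JDK is picked up (assuming the RPM installed to the default /usr/java/jdk1.8.0_91 path used above):

     #source ~/.bash_profile
     #echo $JAVA_HOME
     #java -version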

3. Install protobuf 2.5.0

  Download protobuf-2.5.0.tar.gz.

  Extract the archive:

    #tar zxvf protobuf-2.5.0.tar.gz  

  Enter the directory:

    #cd protobuf-2.5.0  

  Run configure:

    #./configure --prefix=/usr/local/protoc

  Build:

    #sudo make  

  Install:

    #sudo make install  

  Add the following to /etc/profile:

  export PATH=/usr/local/protoc/bin:$PATH

  Apply the change immediately:

    #source /etc/profile

  Verify the installation:

    #protoc --version
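
  If the installation succeeded, the command should print the version that was just built:

    libprotoc 2.5.0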

4. Install Maven

    Download apache-maven-3.2.3-bin.tar.gz.

    Extract it into the directory that MAVEN_HOME will point to:

        #mkdir -p /usr/local/program/maven
        #tar zxvf apache-maven-3.2.3-bin.tar.gz -C /usr/local/program/maven/

    Add the following to /etc/profile:

        export MAVEN_HOME=/usr/local/program/maven/apache-maven-3.2.3

        export PATH=$PATH:$MAVEN_HOME/bin   

    Apply the change:

        #source /etc/profile  

    Verify the installation:

        #mvn -version 
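
        If everything is on the PATH, the first lines of the output should look roughly like this (exact build metadata will differ):

            Apache Maven 3.2.3 (...)
            Java version: 1.8.0_91, vendor: Oracle Corporation

        If a different JDK shows up here, recheck JAVA_HOME.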

5. Install Ant

    Download apache-ant-1.9.4-bin.tar.gz.

    Extract the archive:

    #tar zxvf apache-ant-1.9.4-bin.tar.gz -C /usr/local/

    Add the following to /etc/profile:

        export ANT_HOME=/usr/local/apache-ant-1.9.4  

        export PATH=$PATH:$ANT_HOME/bin  

    Apply the change:

        #source /etc/profile  

    Verify the installation:

        #ant -version

6. Compile Hadoop

        #cd /home/grid/hadoop-2.7.2-src

       #chmod -R 777 *

        #mvn package -Pdist,native -DskipTests -Dtar  

    Or run a clean build:

        #mvn clean package -Pdist,native -DskipTests -Dtar
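
    The build downloads a large number of dependencies and can take well over ten minutes. If it aborts with java.lang.OutOfMemoryError, giving Maven more heap before rerunning is a common workaround (a general Maven setting, not part of the original steps):

        #export MAVEN_OPTS="-Xms256m -Xmx1536m"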

A successful build ends with a reactor summary like the following:
[INFO] Reactor Summary:      
[INFO]       
[INFO] Apache Hadoop Main ................................ SUCCESS [1.231s]      
[INFO] Apache Hadoop Project POM ......................... SUCCESS [1.439s]      
[INFO] Apache Hadoop Annotations ......................... SUCCESS [3.263s]      
[INFO] Apache Hadoop Assemblies .......................... SUCCESS [0.194s]      
[INFO] Apache Hadoop Project Dist POM .................... SUCCESS [2.009s]      
[INFO] Apache Hadoop Maven Plugins ....................... SUCCESS [4.012s]      
[INFO] Apache Hadoop MiniKDC ............................. SUCCESS [3.469s]      
[INFO] Apache Hadoop Auth ................................ SUCCESS [4.303s]      
[INFO] Apache Hadoop Auth Examples ....................... SUCCESS [3.684s]      
[INFO] Apache Hadoop Common .............................. SUCCESS [1:52.170s]      
[INFO] Apache Hadoop NFS ................................. SUCCESS [9.035s]      
[INFO] Apache Hadoop KMS ................................. SUCCESS [12.988s]      
[INFO] Apache Hadoop Common Project ...................... SUCCESS [0.048s]      
[INFO] Apache Hadoop HDFS ................................ SUCCESS [3:15.174s]      
[INFO] Apache Hadoop HttpFS .............................. SUCCESS [20.688s]      
[INFO] Apache Hadoop HDFS BookKeeper Journal ............. SUCCESS [8.065s]      
[INFO] Apache Hadoop HDFS-NFS ............................ SUCCESS [4.789s]      
[INFO] Apache Hadoop HDFS Project ........................ SUCCESS [0.045s]      
[INFO] hadoop-yarn ....................................... SUCCESS [0.074s]      
[INFO] hadoop-yarn-api ................................... SUCCESS [50.090s]      
[INFO] hadoop-yarn-common ................................ SUCCESS [42.626s]      
[INFO] hadoop-yarn-server ................................ SUCCESS [0.045s]      
[INFO] hadoop-yarn-server-common ......................... SUCCESS [15.496s]      
[INFO] hadoop-yarn-server-nodemanager .................... SUCCESS [20.052s]      
[INFO] hadoop-yarn-server-web-proxy ...................... SUCCESS [3.146s]      
[INFO] hadoop-yarn-server-applicationhistoryservice ...... SUCCESS [9.605s]      
[INFO] hadoop-yarn-server-resourcemanager ................ SUCCESS [27.570s]      
[INFO] hadoop-yarn-server-tests .......................... SUCCESS [5.965s]      
[INFO] hadoop-yarn-client ................................ SUCCESS [6.913s]      
[INFO] hadoop-yarn-server-sharedcachemanager ............. SUCCESS [3.334s]      
[INFO] hadoop-yarn-applications .......................... SUCCESS [0.017s]      
[INFO] hadoop-yarn-applications-distributedshell ......... SUCCESS [2.388s]      
[INFO] hadoop-yarn-applications-unmanaged-am-launcher .... SUCCESS [1.789s]      
[INFO] hadoop-yarn-site .................................. SUCCESS [0.041s]      
[INFO] hadoop-yarn-registry .............................. SUCCESS [5.992s]      
[INFO] hadoop-yarn-project ............................... SUCCESS [3.458s]      
[INFO] hadoop-mapreduce-client ........................... SUCCESS [0.037s]      
[INFO] hadoop-mapreduce-client-core ...................... SUCCESS [29.678s]      
[INFO] hadoop-mapreduce-client-common .................... SUCCESS [23.665s]      
[INFO] hadoop-mapreduce-client-shuffle ................... SUCCESS [5.847s]      
[INFO] hadoop-mapreduce-client-app ....................... SUCCESS [10.471s]      
[INFO] hadoop-mapreduce-client-hs ........................ SUCCESS [7.129s]      
[INFO] hadoop-mapreduce-client-jobclient ................. SUCCESS [4.546s]      
[INFO] hadoop-mapreduce-client-hs-plugins ................ SUCCESS [1.708s]      
[INFO] Apache Hadoop MapReduce Examples .................. SUCCESS [6.265s]      
[INFO] hadoop-mapreduce .................................. SUCCESS [2.146s]      
[INFO] Apache Hadoop MapReduce Streaming ................. SUCCESS [4.554s]      
[INFO] Apache Hadoop Distributed Copy .................... SUCCESS [8.177s]      
[INFO] Apache Hadoop Archives ............................ SUCCESS [2.076s]      
[INFO] Apache Hadoop Rumen ............................... SUCCESS [7.427s]      
[INFO] Apache Hadoop Gridmix ............................. SUCCESS [4.028s]      
[INFO] Apache Hadoop Data Join ........................... SUCCESS [2.624s]      
[INFO] Apache Hadoop Ant Tasks ........................... SUCCESS [2.132s]      
[INFO] Apache Hadoop Extras .............................. SUCCESS [3.122s]      
[INFO] Apache Hadoop Pipes ............................... SUCCESS [1.904s]      
[INFO] Apache Hadoop OpenStack support ................... SUCCESS [4.887s]      
[INFO] Apache Hadoop Amazon Web Services support ......... SUCCESS [1:09.582s]      
[INFO] Apache Hadoop Azure support ....................... SUCCESS [30.537s]      
[INFO] Apache Hadoop Client .............................. SUCCESS [7.480s]      
[INFO] Apache Hadoop Mini-Cluster ........................ SUCCESS [0.097s]      
[INFO] Apache Hadoop Scheduler Load Simulator ............ SUCCESS [4.930s]      
[INFO] Apache Hadoop Tools Dist .......................... SUCCESS [10.072s]      
[INFO] Apache Hadoop Tools ............................... SUCCESS [0.032s]      
[INFO] Apache Hadoop Distribution ........................ SUCCESS [1:32.461s]      
[INFO] ------------------------------------------------------------------------      
[INFO] BUILD SUCCESS      
[INFO] ------------------------------------------------------------------------      
[INFO] Total time: 15:35.148s      
[INFO] Finished at: Tue Jun 07 18:53:21 CST 2016      
[INFO] Final Memory: 109M/446M          
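
After a successful build, the packaged distribution should sit under the source tree, typically at hadoop-dist/target/hadoop-2.7.2.tar.gz (location based on the standard Hadoop build layout; confirm on your machine). Once that tarball is deployed, the 64-bit native libraries can be checked with:

    #hadoop checknative -a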

7. FAQ (these are problems I ran into during my own build; hopefully they help others)

1)[ERROR] Failed to execute goal org.apache.hadoop:hadoop-maven-plugins:2.7.2:protoc (compile-protoc) on project hadoop-common: org.apache.maven.plugin.MojoExecutionException: 'protoc --version' did not return a version -> [Help 1]

This one cost me a lot of time. The root cause is that protobuf is either not installed or not configured correctly. After reading a few articles online, I reinstalled protobuf and added it to the environment variables,

and then ran the build with plain mvn instead of sudo (sudo typically resets PATH, so the protoc installed under /usr/local/protoc/bin is no longer found).
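
A quick way to confirm that the user running the build can actually see protoc (the same shell that runs mvn has to resolve it on its PATH):

    #which protoc
    #protoc --version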
