Setting Up a Single-Node Hadoop and Spark Environment on Ubuntu 18.04

Hadoop occupies a crucial position in the big-data technology stack; it is the foundation and the entry point, and how well you master the Hadoop basics will to a large extent determine how far you can go with big-data technology. I recently wanted to learn Spark, which first requires setting up a Spark environment. Spark has a number of dependencies, including Java JDK and Hadoop, so we will walk through the installation and configuration of each dependency step by step. I installed a fresh Linux Ubuntu 18.04 system for this setup and recorded the whole process in detail.


1. Installing Java JDK 8

Go to the Oracle website and download JDK 8, choosing the build that matches your operating system; here we choose Linux x64: https://www.oracle.com/technetwork/java/javase/downloads/jdk8-downloads-2133151.html


Put the downloaded archive in a directory of your choice; here we use /opt/java:

linuxidc@linuxidc:~/www.linuxidc.com$ sudo cp /home/linuxidc/www.linuxidc.com/jdk-8u231-linux-x64.tar.gz /opt/java/

[sudo] password for linuxidc:

linuxidc@linuxidc:~/www.linuxidc.com$ cd /opt/java/

linuxidc@linuxidc:/opt/java$ ls

jdk-8u231-linux-x64.tar.gz

Extract it with tar -zxf:

linuxidc@linuxidc:/opt/java$ sudo tar -zxf jdk-8u231-linux-x64.tar.gz

linuxidc@linuxidc:/opt/java$ ls

jdk1.8.0_231  jdk-8u231-linux-x64.tar.gz


Edit the configuration file /etc/profile:

linuxidc@linuxidc:/opt/java$ sudo nano /etc/profile

Append the following at the end of the file (adjust the exact paths to your environment):

export JAVA_HOME=/opt/java/jdk1.8.0_231

export JRE_HOME=/opt/java/jdk1.8.0_231/jre

export PATH=${JAVA_HOME}/bin:$PATH


Save and exit, then run source /etc/profile in the terminal to make the changes take effect. Use java -version to verify the installation; output like the following means it succeeded:

linuxidc@linuxidc:/opt/java$ source /etc/profile

linuxidc@linuxidc:/opt/java$ java -version

java version "1.8.0_231"

Java(TM) SE Runtime Environment (build 1.8.0_231-b11)

Java HotSpot(TM) 64-Bit Server VM (build 25.231-b11, mixed mode)

linuxidc@linuxidc:/opt/java$


2. Installing Hadoop

Go to the official site https://hadoop.apache.org/releases.html and download Hadoop; here we choose version 2.7.7: http://www.apache.org/dist/hadoop/core/hadoop-2.7.7/hadoop-2.7.7.tar.gz

Hadoop needs passwordless SSH login and related functionality, so install ssh first:

linuxidc@linuxidc:~/www.linuxidc.com$ sudo apt-get install ssh


linuxidc@linuxidc:~/www.linuxidc.com$ sudo apt-get install rsync


Put the downloaded archive in a directory; here we use /opt/hadoop:

linuxidc@linuxidc:~/www.linuxidc.com$ sudo cp /home/linuxidc/www.linuxidc.com/hadoop-2.7.7.tar.gz /opt/hadoop/


Extract it with: tar -zxvf hadoop-2.7.7.tar.gz

Here we choose the pseudo-distributed installation mode. In the extracted directory, edit the file etc/hadoop/hadoop-env.sh and change JAVA_HOME to this machine's JAVA_HOME path, for example:
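Using the JDK path configured earlier in this guide, the default line export JAVA_HOME=${JAVA_HOME} would become:

export JAVA_HOME=/opt/java/jdk1.8.0_231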


Configure Hadoop's environment variables:

linuxidc@linuxidc:/opt/hadoop/hadoop-2.7.7/etc/hadoop$ sudo nano /etc/profile

Add the following, extending the PATH variable with Hadoop's bin directory:

export HADOOP_HOME=/opt/hadoop/hadoop-2.7.7

export PATH=${JAVA_HOME}/bin:${HADOOP_HOME}/bin:$PATH
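After another source /etc/profile, a quick optional check (not in the original transcript) confirms the PATH change took effect:

hadoop version

It should report Hadoop 2.7.7.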


Edit etc/hadoop/core-site.xml in the extracted directory, pointing the default filesystem at a local HDFS instance:

linuxidc@linuxidc:/opt/hadoop/hadoop-2.7.7/etc/hadoop$ sudo nano core-site.xml

<configuration>
    <property>
        <name>fs.defaultFS</name>
        <value>hdfs://localhost:9000</value>
    </property>
</configuration>


Edit etc/hadoop/hdfs-site.xml, setting the block replication factor to 1 (a single-node cluster has only one DataNode to hold replicas):

linuxidc@linuxidc:/opt/hadoop/hadoop-2.7.7/etc/hadoop$ sudo nano hdfs-site.xml

<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
</configuration>


Set up passwordless SSH login:

linuxidc@linuxidc:~/www.linuxidc.com$ ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa

Generating public/private rsa key pair.

Your identification has been saved in /home/linuxidc/.ssh/id_rsa.

Your public key has been saved in /home/linuxidc/.ssh/id_rsa.pub.

The key fingerprint is:

SHA256:zY+ELQc3sPXwTBRfKlTwntek6TWVsuQziHtu3N/6L5w linuxidc@linuxidc

The key's randomart image is:

+---[RSA 2048]----+

|        . o.*+. .|

|        + B o o.|

|        o o =o+.o|

|        B..+oo=o|

|        S.*. ==.+|

|        +.o .oo.|

|        .o.o... |

|          oo .E .|

|          ..  o==|

+----[SHA256]-----+

linuxidc@linuxidc:~/www.linuxidc.com$ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys

linuxidc@linuxidc:~/www.linuxidc.com$ chmod 0600 ~/.ssh/authorized_keys
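If ssh localhost still asks for a password after this, the usual culprit is directory permissions: sshd ignores authorized_keys when the home or .ssh directory is group- or world-writable, so chmod 700 ~/.ssh is a common fix.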


Verify it with: ssh localhost. If you can log in without being asked for a password, the setup succeeded.

linuxidc@linuxidc:~/www.linuxidc.com$ ssh localhost

Welcome to Ubuntu 18.04.3 LTS (GNU/Linux 5.4.0-999-generic x86_64)

 * Documentation:  https://help.ubuntu.com

 * Management:    https://landscape.canonical.com

 * Support:        https://ubuntu.com/advantage

 * Canonical Livepatch is available for installation.

  - Reduce system reboots and improve kernel security. Activate at:

    https://ubuntu.com/livepatch

188 packages can be updated.

0 updates are security updates.

Your Hardware Enablement Stack (HWE) is supported until April 2023.

Last login: Sat Nov 30 23:25:35 2019 from 127.0.0.1


Next, verify the Hadoop installation.

a. Format the filesystem:

linuxidc@linuxidc:/opt/hadoop/hadoop-2.7.7$ bin/hdfs namenode -format

19/11/30 23:29:06 INFO namenode.NameNode: STARTUP_MSG:

/************************************************************

STARTUP_MSG: Starting NameNode

STARTUP_MSG:  host = linuxidc/127.0.1.1

STARTUP_MSG:  args = [-format]

STARTUP_MSG:  version = 2.7.7

......
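The log is truncated here; on a successful format it should include, before the shutdown message, a line similar to "Storage directory ... has been successfully formatted."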


b. Start the NameNode and DataNode:

linuxidc@linuxidc:/opt/hadoop/hadoop-2.7.7$ sbin/start-dfs.sh

Starting namenodes on [localhost]

localhost: starting namenode, logging to /opt/hadoop/hadoop-2.7.7/logs/hadoop-linuxidc-namenode-linuxidc.out

localhost: starting datanode, logging to /opt/hadoop/hadoop-2.7.7/logs/hadoop-linuxidc-datanode-linuxidc.out

Starting secondary namenodes [0.0.0.0]

The authenticity of host '0.0.0.0 (0.0.0.0)' can't be established.

ECDSA key fingerprint is SHA256:OSXsQK3E9ReBQ8c5to2wvpcS6UGrP8tQki0IInUXcG0.

Are you sure you want to continue connecting (yes/no)? yes

0.0.0.0: Warning: Permanently added '0.0.0.0' (ECDSA) to the list of known hosts.

0.0.0.0: starting secondarynamenode, logging to /opt/hadoop/hadoop-2.7.7/logs/hadoop-linuxidc-secondarynamenode-linuxidc.out

c. Open http://localhost:50070 in a browser to view the NameNode web UI.
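As an extra sanity check (my own addition; these are standard HDFS commands, not part of the original transcript), you can confirm the daemons are up with jps and exercise HDFS a little:

linuxidc@linuxidc:/opt/hadoop/hadoop-2.7.7$ jps

linuxidc@linuxidc:/opt/hadoop/hadoop-2.7.7$ bin/hdfs dfs -mkdir -p /user/linuxidc

linuxidc@linuxidc:/opt/hadoop/hadoop-2.7.7$ bin/hdfs dfs -put etc/hadoop/core-site.xml /user/linuxidc

linuxidc@linuxidc:/opt/hadoop/hadoop-2.7.7$ bin/hdfs dfs -ls /user/linuxidc

jps should list NameNode, DataNode, and SecondaryNameNode. When you are finished working, sbin/stop-dfs.sh shuts the daemons down.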


3. Installing Scala

Download it from: https://www.scala-lang.org/download/2.11.8.html


After downloading, extract it to /opt/scala:

linuxidc@linuxidc:~/Downloads$ sudo tar zxf scala-2.11.8.tgz -C /opt/scala

[sudo] password for linuxidc:

linuxidc@linuxidc:~/Downloads$ cd /opt/scala

linuxidc@linuxidc:/opt/scala$ ls

scala-2.11.8

Configure the environment variable:

linuxidc@linuxidc:/opt/scala$ sudo nano /etc/profile

Add:

export SCALA_HOME=/opt/scala/scala-2.11.8


source /etc/profile
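Only SCALA_HOME is set here. If you also want to invoke scala directly from the shell, you can additionally extend PATH (an optional extra step, in the same style as the PATH lines above) and then verify:

export PATH=${SCALA_HOME}/bin:$PATH

scala -version

This should report Scala version 2.11.8.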

4. Installing Spark

Go to the Spark website, https://spark.apache.org/downloads.html, and download Spark; here we choose spark-2.4.4-bin-hadoop2.7. Put the archive in a directory of your choice, here /opt/spark, then extract it with tar -zxf spark-2.4.4-bin-hadoop2.7.tgz:

linuxidc@linuxidc:~/www.linuxidc.com$ sudo cp /home/linuxidc/www.linuxidc.com/spark-2.4.4-bin-hadoop2.7.tgz /opt/spark/

[sudo] password for linuxidc:

linuxidc@linuxidc:~/www.linuxidc.com$ cd /opt/spark/

linuxidc@linuxidc:/opt/spark$ ls

spark-2.4.4-bin-hadoop2.7.tgz


linuxidc@linuxidc:/opt/spark$ sudo tar -zxf spark-2.4.4-bin-hadoop2.7.tgz

[sudo] password for linuxidc:

linuxidc@linuxidc:/opt/spark$ ls

spark-2.4.4-bin-hadoop2.7  spark-2.4.4-bin-hadoop2.7.tgz


Test the Spark installation with: ./bin/run-example SparkPi 10 (see the note after the PATH export below for what a successful run prints). Then configure the SPARK_HOME environment variable:

linuxidc@linuxidc:/opt/spark/spark-2.4.4-bin-hadoop2.7$ sudo nano /etc/profile

export SPARK_HOME=/opt/spark/spark-2.4.4-bin-hadoop2.7

export PATH=${JAVA_HOME}/bin:${HADOOP_HOME}/bin:${SPARK_HOME}/bin:$PATH
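A successful SparkPi run prints, amid a large amount of log output, a line of roughly this form (the estimate varies from run to run):

Pi is roughly 3.14...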


source /etc/profile

Next configure spark-env.sh. Go into spark/conf/ and copy the template:

sudo cp /opt/spark/spark-2.4.4-bin-hadoop2.7/conf/spark-env.sh.template /opt/spark/spark-2.4.4-bin-hadoop2.7/conf/spark-env.sh

linuxidc@linuxidc:/opt/spark/spark-2.4.4-bin-hadoop2.7/conf$ sudo nano spark-env.sh

export JAVA_HOME=/opt/java/jdk1.8.0_231

export HADOOP_HOME=/opt/hadoop/hadoop-2.7.7

export HADOOP_CONF_DIR=/opt/hadoop/hadoop-2.7.7/etc/hadoop

export SPARK_HOME=/opt/spark/spark-2.4.4-bin-hadoop2.7

export SCALA_HOME=/opt/scala/scala-2.11.8

export SPARK_MASTER_IP=127.0.0.1

export SPARK_MASTER_PORT=7077

export SPARK_MASTER_WEBUI_PORT=8099

export SPARK_WORKER_CORES=3

export SPARK_WORKER_INSTANCES=1

export SPARK_WORKER_MEMORY=5G

export SPARK_WORKER_WEBUI_PORT=8081

export SPARK_EXECUTOR_CORES=1

export SPARK_EXECUTOR_MEMORY=1G

export LD_LIBRARY_PATH=${LD_LIBRARY_PATH}:$HADOOP_HOME/lib/native
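Note that the SPARK_MASTER_* and SPARK_WORKER_* settings above only come into play if you start Spark's standalone master and worker daemons (e.g. with sbin/start-all.sh under SPARK_HOME); running spark-shell in local mode, as below, does not require them.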


Set the Java, Hadoop, and other paths according to your actual environment. Then launch spark-shell from the bin directory (./bin/spark-shell).


You can see that we have entered the Scala REPL, and can now start writing code. The spark-shell web UI is at http://127.0.0.1:4040.
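As a quick smoke test at the scala> prompt (my own minimal example; sc is the SparkContext that spark-shell creates automatically):

scala> sc.parallelize(1 to 100).reduce(_ + _)
res0: Int = 5050

The corresponding job should also appear in the web UI above.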


That's all for now. If you have any questions, please raise them in the comments below.
