
HiBench + Kafka + Flink streaming test: commands and steps

The roles of the test machines are as follows:


Hadoop1 [flink]

Internal IP enp5s0f1: 192.168.1.1

External IP enp9s0f0: 10.18.182.102

BMC : 10.18.182.109

Hadoop2 [kafka]

Internal IP enp5s0f1: 192.168.1.2

External IP enp9s0f0: 10.18.182.104

BMC : 10.18.182.113 

Hadoop3 [dataGen]

Internal IP enp5s0f1: 192.168.1.3

External IP enp9s0f0: 10.18.182.105

BMC : 10.18.182.114

1. hadoop2 (Kafka server)

   sudo chown -R yjiang2:yjiang2 /tmp/zookeeper

   a.zookeeper   $ZOOKEEPER_HOME/bin/zkServer.sh start

   b.kafka       $KAFKA_HOME/bin/kafka-server-start.sh $KAFKA_HOME/config/server.properties

                 ./bin/kafka-server-start.sh config/server.properties 
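Before kicking off the benchmark from the other nodes it helps to confirm the broker is actually accepting connections. A minimal sketch of a port probe, assuming bash's built-in `/dev/tcp` redirection and Kafka's default port 9092 (the host/port values are this cluster's, the helper itself is not part of HiBench):

```shell
#!/bin/bash
# wait_for_port HOST PORT [TIMEOUT_SECS] — poll once per second until the
# TCP port accepts a connection, or give up after the timeout.
wait_for_port() {
    local host=$1 port=$2 timeout=${3:-30} elapsed=0
    # (exec 3<>/dev/tcp/...) tries to open a TCP connection in a subshell;
    # it exits non-zero if the connect fails.
    while ! (exec 3<>"/dev/tcp/$host/$port") 2>/dev/null; do
        sleep 1
        elapsed=$((elapsed + 1))
        if [ "$elapsed" -ge "$timeout" ]; then
            echo "timed out waiting for $host:$port" >&2
            return 1
        fi
    done
    echo "$host:$port is up"
}

# e.g. on hadoop1/hadoop3, before starting the workload:
#   wait_for_port 192.168.1.2 9092 60
```

ZooKeeper (port 2181) can be probed the same way before starting the broker.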

2. hadoop1 (Flink server)

   a. start Hadoop single node

    $HADOOP_HOME/sbin/start-dfs.sh

    $HADOOP_HOME/sbin/start-yarn.sh

   b. start a Flink session on YARN

   $FLINK_HOME/bin/yarn-session.sh -d -s 16 -tm 10240 -n 1 -jm 8192

The command above starts the test session; its status can be viewed at http://<flink-server-ip>:8088/cluster. Once it starts normally, it can be stopped from the command line, e.g.: yarn application -kill application_1572875276237_0004.
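The application ID to kill can also be grepped straight out of `yarn application -list` instead of being read off the web UI. A sketch of the extraction (the sample line below is illustrative, not captured from this cluster):

```shell
#!/bin/bash
# YARN application IDs have the form application_<clusterTimestamp>_<seq>;
# a hypothetical sample line stands in for real `yarn application -list` output.
sample='application_1572875276237_0004  Flink session cluster  Apache Flink  yjiang2'

app_id=$(echo "$sample" | grep -oE 'application_[0-9]+_[0-9]+' | head -n 1)
echo "$app_id"

# On hadoop1 this becomes:
#   yarn application -list | grep -oE 'application_[0-9]+_[0-9]+' \
#     | xargs -r -n1 yarn application -kill
```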

3. hadoop3 (HiBench server)

   a. start Hadoop single node

      $HADOOP_HOME/sbin/start-dfs.sh

      $HADOOP_HOME/sbin/start-yarn.sh

   b. HiBench

      ./bin/workloads/streaming/identity/prepare/genSeedDataset.sh

      ./bin/workloads/streaming/identity/prepare/dataGen.sh                  ### single process

      ./bin/workloads/streaming/identity/prepare/dataGenP.sh                 ### multiple processes

#!/bin/bash
# dataGenP.sh — launch <proc_num> dataGen.sh instances in parallel
proc_num=${1:?usage: $0 <proc_num>}
echo "launching $proc_num dataGen processes"
for i in $(seq 1 "$proc_num")
do
    ./bin/workloads/streaming/identity/prepare/dataGen.sh &
done
wait    # block until all generators have exited
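The launch-and-wait pattern in dataGenP.sh can be exercised without a HiBench install by substituting a placeholder workload. A sketch, where `worker` (a name invented here) stands in for dataGen.sh:

```shell
#!/bin/bash
# Parallel-launch sketch: start N background workers, then wait for all of
# them before reporting completion. `worker` is a placeholder, not HiBench.
worker() {
    sleep 0.2          # stand-in for the real data generator's work
    echo "worker $1 done"
}

proc_num=${1:-4}       # default to 4 workers when no argument is given
for i in $(seq 1 "$proc_num"); do
    worker "$i" &
done
wait                   # returns only after every background worker exits
echo "all $proc_num workers finished"
```

`wait` with no arguments blocks on every background job of the script, which is what keeps the script alive until the last generator is done.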
           

4. hadoop1 (Flink server)

   ./bin/flink run -m yarn-cluster -yn 10 -c com.intel.hibench.flinkbench.RunBench /home/yjiang2/HiBench-7.0/flinkbench/streaming/target/flinkbench-streaming-7.1-SNAPSHOT-jar-with-dependencies.jar /home/yjiang2/HiBench-7.0/report/identity/flink/conf/sparkbench/sparkbench.conf

5. hadoop2 (Kafka server)

   check the results:

   ./bin/workloads/streaming/identity/common/metrics_reader.sh

   enter the topic name shown at the bottom of the list

///


// fix ownership of the mounted mnt directory

sudo chown -R yjiang2:yjiang2 /home/yjiang2/mnt/


// startup commands

$HADOOP_HOME/sbin/start-dfs.sh

$HADOOP_HOME/sbin/start-yarn.sh        

$HADOOP_HOME/sbin/mr-jobhistory-daemon.sh start historyserver

$HADOOP_HOME/sbin/stop-dfs.sh

chmod 700 -R .ssh

ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa

cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys

chmod 0600 ~/.ssh/authorized_keys

ssh localhost

ssh 0.0.0.0

// mount directory permissions

sudo chmod a+rw /home/yjiang2/mnt/ -R

:%s/yjiang2/yjiang2/g

grep yjiang2 -rn ./*

scp hadoop-2.7.1.tgz script.tgz [email protected]:~/
