1. Environment
- One Mac or Linux machine as the server; one client machine (Windows, Mac, or Linux)
- JDK 1.8.x
- HBase 2.2.2, download: https://hbase.apache.org/downloads.html
- Hadoop 3.2.1, download: https://hadoop.apache.org/releases.html
- hadoop-common-bin utilities (needed on Windows), download: https://download.csdn.net/download/shuaidan19920412/12080258
2. Installation and testing
- Extract the archives to a directory of your choice; /opt/apache/ is used as the example below.
- Configure hbase-2.2.2/conf/hbase-env.sh:
      export JAVA_HOME=/usr/java/jdk1.8.0_161   (path to your JDK)
- Configure hbase-2.2.2/conf/hbase-site.xml:
<configuration>
  <property>
    <name>hbase.zookeeper.quorum</name>
    <!-- change this to the server's hostname -->
    <value>ubuntu</value>
  </property>
  <property>
    <name>hbase.rootdir</name>
    <!-- customize this path -->
    <value>file:///data/apache/hbase/root</value>
  </property>
  <property>
    <name>hbase.tmp.dir</name>
    <!-- customize this path -->
    <value>/data/apache/hbase/tmp</value>
  </property>
  <property>
    <name>hbase.zookeeper.property.dataDir</name>
    <!-- customize this path -->
    <value>/data/apache/hbase/zoo</value>
  </property>
  <property>
    <name>hbase.unsafe.stream.capability.enforce</name>
    <value>false</value>
  </property>
</configuration>
- Configure the hosts file; this step affects remote access. On Linux (vi /etc/hosts),
  add the machine's IP and hostname, e.g. 192.168.3.1 ubuntu
- Start and test
  (1) Run hbase-2.2.2/bin/start-hbase.sh from the command line.
      If you see output like: running master, logging to /opt/apache/hbase-2.2.2/bin/../logs/hbase-devin-master-ubxxxxxxxxxx
  (2) and the jps command shows an HMaster process, the startup succeeded.
  (3) Open http://192.168.3.31:16010/master-status in a browser (replace with your own IP or hostname).
- Local shell test on the server
  (1) Launch the shell: hbase-2.2.2/bin/hbase shell
  (2) Check that it responds by entering the command: list
  (3) You can also test commands such as table creation, e.g.
      create 'spark_test','name','sex'
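As a further sanity check, a quick write/read round-trip in the same shell confirms the table is actually being served. These are standard HBase shell commands; the table and the 'name' column family come from the create statement above, while the row key 'row1' and the cell value are illustrative:

```
put 'spark_test', 'row1', 'name:first', 'devin'
get 'spark_test', 'row1'
scan 'spark_test'
```

The scan should list the single row just written; an error here usually points at the region server rather than the shell connection.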
3. Remote connection
- On Windows, set the HADOOP_HOME environment variable to point to the hadoop-common-bin directory prepared earlier.
- Configure the local hosts file the same way as on the server: map the server's IP address to its hostname, e.g. the same 192.168.3.1 ubuntu entry.
- Disable anything on the local network that might block the connection (firewall, VPN, etc.).
- Test that the port is reachable: telnet 192.xxx 2181
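The telnet check can also be done from code. The sketch below uses only plain JDK sockets (no HBase dependency) to verify that the ZooKeeper client port is reachable before creating an HBase connection; the hostname "ubuntu" and port 2181 are the same assumptions as in the telnet command above:

```scala
import java.net.{InetSocketAddress, Socket}

// Returns true if a TCP connection to host:port succeeds within timeoutMs.
def portReachable(host: String, port: Int, timeoutMs: Int = 3000): Boolean = {
  val socket = new Socket()
  try {
    socket.connect(new InetSocketAddress(host, port), timeoutMs)
    true
  } catch {
    case _: Exception => false
  } finally {
    socket.close()
  }
}

// Example: check the ZooKeeper port used by HBase (replace with your server's hostname).
println(portReachable("ubuntu", 2181))
```

If this prints false, fix the hosts entry or firewall before debugging the HBase client itself.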
- Coding: the relevant pom.xml fragment is as follows
<properties>
  <maven.compiler.source>1.8</maven.compiler.source>
  <maven.compiler.target>1.8</maven.compiler.target>
  <encoding>UTF-8</encoding>
  <scala.version>2.11.12</scala.version>
  <scala.compat.version>2.11</scala.compat.version>
  <hadoop.version>3.2.1</hadoop.version>
  <spark.version>2.4.3</spark.version>
  <hbase.version>2.2.2</hbase.version>
</properties>
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>${hadoop.version}</version>
  <exclusions>
    <exclusion>
      <artifactId>commons-httpclient</artifactId>
      <groupId>commons-httpclient</groupId>
    </exclusion>
    <exclusion>
      <artifactId>httpcore</artifactId>
      <groupId>org.apache.httpcomponents</groupId>
    </exclusion>
    <exclusion>
      <artifactId>hadoop-common</artifactId>
      <groupId>org.apache.hadoop</groupId>
    </exclusion>
  </exclusions>
</dependency>
<dependency>
  <groupId>org.apache.hbase</groupId>
  <artifactId>hbase-client</artifactId>
  <version>${hbase.version}</version>
</dependency>
<dependency>
  <groupId>org.apache.hbase</groupId>
  <artifactId>hbase-common</artifactId>
  <version>${hbase.version}</version>
</dependency>
<dependency>
  <groupId>org.apache.hbase</groupId>
  <artifactId>hbase-server</artifactId>
  <version>${hbase.version}</version>
</dependency>
- Sample test code
import org.apache.hadoop.hbase.{HBaseConfiguration, TableName}
import org.apache.hadoop.hbase.client.{Admin, ColumnFamilyDescriptor, ColumnFamilyDescriptorBuilder, ConnectionFactory, TableDescriptorBuilder}
import org.apache.hadoop.hbase.util.Bytes

val tableName = "spark_test"
val columnFamilys = List("a", "b", "c")
val conf = HBaseConfiguration.create()
conf.set("hbase.zookeeper.quorum", "ubuntu")
conf.set("hbase.zookeeper.property.clientPort", "2181")
val hbaseconn = ConnectionFactory.createConnection(conf)
val admin: Admin = hbaseconn.getAdmin()
val myTableName: TableName = TableName.valueOf(tableName)
if (admin.tableExists(myTableName)) {
  println(tableName + " Table exists!")
  // Legacy 1.x API alternative (deprecated in 2.x):
  //val tableDesc: HTableDescriptor = new HTableDescriptor(TableName.valueOf(tableName))
  //tableDesc.addCoprocessor("org.apache.hadoop.hbase.coprocessor.AggregateImplementation")
} else {
  println(tableName + " Table not exists!")
  // table descriptor builder
  val tdb = TableDescriptorBuilder.newBuilder(TableName.valueOf(tableName))
  if (null != columnFamilys)
    for (columnFamily <- columnFamilys) {
      // column family descriptor builder
      val cdb = ColumnFamilyDescriptorBuilder.newBuilder(Bytes.toBytes(columnFamily))
      // build the column family descriptor
      val cfd: ColumnFamilyDescriptor = cdb.build
      // add the column family to the table
      tdb.setColumnFamily(cfd)
    }
  // build the table descriptor
  val td = tdb.build
  admin.createTable(td)
  println("create successful!! ")
}
admin.close()
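Beyond creating a table, a write/read round-trip exercises the full client path. The sketch below is a minimal illustration using the same hbase-client 2.x API, assuming the hbaseconn connection from the snippet above and the 'spark_test' table with its 'name' column family; the row key and value are illustrative:

```scala
import org.apache.hadoop.hbase.TableName
import org.apache.hadoop.hbase.client.{Connection, Get, Put}
import org.apache.hadoop.hbase.util.Bytes

// Assumes `hbaseconn` is the Connection created above and the table exists.
def writeAndRead(hbaseconn: Connection): Unit = {
  val table = hbaseconn.getTable(TableName.valueOf("spark_test"))
  try {
    // Write one cell: row "row1", column family "name", qualifier "first".
    val put = new Put(Bytes.toBytes("row1"))
    put.addColumn(Bytes.toBytes("name"), Bytes.toBytes("first"), Bytes.toBytes("devin"))
    table.put(put)

    // Read the cell back with a Get.
    val result = table.get(new Get(Bytes.toBytes("row1")))
    val value = Bytes.toString(result.getValue(Bytes.toBytes("name"), Bytes.toBytes("first")))
    println(s"row1 name:first = $value")
  } finally {
    table.close()
  }
}
```

If the Get returns the value just written, the remote connection, hosts mapping, and ZooKeeper configuration are all working end to end.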
Possible problems
* If Maven fails to parse its XML configuration, the settings.xml may contain Chinese characters (including in comments) or be saved in a non-UTF-8 encoding.
* If you get the error "Could not initialize class org.fusesource.jansi.internal.Kernel32", Windows is probably missing jansi-1.4.jar. Fix: download jansi-1.4.jar, place it in hbase-2.2.2\lib, and restart.