
Integrating Spring with HBase

§ System Environment

  • Ubuntu

    • Hadoop 2.7.3

    • HBase 1.2.3

    • JDK 1.8

  • Windows

    • IDEA 16

    • Spring 4.3.2.RELEASE

    • Spring Data Hadoop 2.4.0.RELEASE

§ Setting Up the HBase Runtime Environment

Hadoop and HBase both run inside an Ubuntu virtual machine; HBase runs in pseudo-distributed mode, using HDFS as its storage layer.

First, install the required packages:

    $ apt-get install ssh
    $ apt-get install rsync

Also make sure ssh can connect to localhost without a passphrase. Set it up as follows:

    $ ssh-keygen -t rsa -P '' -f ~/.ssh/id_rsa
    $ cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys
    $ chmod 0600 ~/.ssh/authorized_keys
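
If the keys are in place, connecting to localhost should now succeed without a password prompt:

    $ ssh localhost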

Find the Ubuntu machine's IP address, then edit /etc/hosts: comment out the other host mappings and replace <ip> and <domain> with the machine's IP address and a domain of your choosing:

    #127.0.0.1 localhost
    #127.0.1.1 <domain>
    <ip> <domain> localhost
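
For example, assuming the VM's address is 192.168.56.101 and the chosen domain is hbase-vm (both hypothetical values), the resulting entry would be:

    # example values only
    192.168.56.101 hbase-vm localhost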

§ Configuring Hadoop

<HADOOP_DIR> below denotes the Hadoop root directory. Edit <HADOOP_DIR>/etc/hadoop/hadoop-env.sh and set JAVA_HOME to the JDK installation directory.

Edit <HADOOP_DIR>/etc/hadoop/core-site.xml; fs.defaultFS is the HDFS URI that clients (including HBase) will connect to, and hadoop.tmp.dir is the base directory for HDFS data:

    <configuration>
        <property>
            <name>fs.defaultFS</name>
            <value>hdfs://<ip>:9000</value>
        </property>
        <property>
            <name>hadoop.tmp.dir</name>
            <value>/home/<user>/data</value>
        </property>
    </configuration>

Edit <HADOOP_DIR>/etc/hadoop/hdfs-site.xml; a replication factor of 1 is all a single-node deployment can satisfy:

    <configuration>
        <property>
            <name>dfs.replication</name>
            <value>1</value>
        </property>
    </configuration>

§ Configuring HBase

<HBASE_DIR> denotes the HBase root directory. Edit <HBASE_DIR>/conf/hbase-env.sh and add JAVA_HOME there as well.

Edit <HBASE_DIR>/conf/hbase-site.xml, again replacing <ip> and <domain> with the values chosen above. This points HBase's root directory at HDFS and leaves HBase to manage its own ZooKeeper instance (the HQuorumPeer process seen later):

    <configuration>
        <property>
            <name>hbase.cluster.distributed</name>
            <value>true</value>
        </property>
        <property>
            <name>hbase.rootdir</name>
            <value>hdfs://<ip>:9000/hbase</value>
        </property>
        <property>
            <name>hbase.zookeeper.quorum</name>
            <value><domain></value>
        </property>
        <property>
            <name>hbase.zookeeper.property.clientPort</name>
            <value>2181</value>
        </property>
    </configuration>

§ Starting Hadoop and HBase

Format the file system before starting Hadoop for the first time:

    $ ./<HADOOP_DIR>/bin/hdfs namenode -format

Then start Hadoop and HBase:

    $ ./<HADOOP_DIR>/sbin/start-dfs.sh
    $ ./<HBASE_DIR>/bin/start-hbase.sh

Use the jps command to check that all the expected processes are running:

    $ jps
    2786 NameNode
    2914 DataNode
    6259 HQuorumPeer
    6324 HMaster
    3083 SecondaryNameNode
    6411 HRegionServer
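
As an extra sanity check, HBase creates its root directory in HDFS on startup, so the hbase.rootdir setting can be verified by listing it:

    $ ./<HADOOP_DIR>/bin/hdfs dfs -ls /hbase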

§ Creating the Maven Project

Create a new Maven project with the following dependencies:

    <properties>
        <slf4j.version>1.7.21</slf4j.version>
        <spring.version>4.3.2.RELEASE</spring.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>junit</groupId>
            <artifactId>junit</artifactId>
            <scope>test</scope>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-core</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-context</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-tx</artifactId>
        </dependency>
        <dependency>
            <groupId>org.springframework.data</groupId>
            <artifactId>spring-data-hadoop</artifactId>
            <exclusions>
                <exclusion>
                    <groupId>org.springframework</groupId>
                    <artifactId>spring-context-support</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>slf4j-log4j12</artifactId>
                </exclusion>
            </exclusions>
        </dependency>
        <dependency>
            <groupId>org.springframework</groupId>
            <artifactId>spring-test</artifactId>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-auth</artifactId>
        </dependency>
        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase-client</artifactId>
            <version>1.2.3</version>
            <scope>compile</scope>
            <exclusions>
                <exclusion>
                    <groupId>log4j</groupId>
                    <artifactId>log4j</artifactId>
                </exclusion>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>slf4j-log4j12</artifactId>
                </exclusion>
            </exclusions>
        </dependency>

        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>jcl-over-slf4j</artifactId>
        </dependency>
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-api</artifactId>
        </dependency>
        <dependency>
            <groupId>org.slf4j</groupId>
            <artifactId>slf4j-log4j12</artifactId>
        </dependency>
        <dependency>
            <groupId>log4j</groupId>
            <artifactId>log4j</artifactId>
        </dependency>
    </dependencies>
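
Most of the entries above omit <version> tags, which implies a <dependencyManagement> section that the original listing does not show. A minimal sketch that pins them using the declared properties follows; the junit, log4j, and hadoop-auth versions are assumptions, not stated in the original:

    <dependencyManagement>
        <dependencies>
            <!-- Pin all org.springframework artifacts via the Spring BOM -->
            <dependency>
                <groupId>org.springframework</groupId>
                <artifactId>spring-framework-bom</artifactId>
                <version>${spring.version}</version>
                <type>pom</type>
                <scope>import</scope>
            </dependency>
            <dependency>
                <groupId>org.springframework.data</groupId>
                <artifactId>spring-data-hadoop</artifactId>
                <version>2.4.0.RELEASE</version>
            </dependency>
            <dependency>
                <groupId>org.slf4j</groupId>
                <artifactId>slf4j-api</artifactId>
                <version>${slf4j.version}</version>
            </dependency>
            <dependency>
                <groupId>org.slf4j</groupId>
                <artifactId>jcl-over-slf4j</artifactId>
                <version>${slf4j.version}</version>
            </dependency>
            <dependency>
                <groupId>org.slf4j</groupId>
                <artifactId>slf4j-log4j12</artifactId>
                <version>${slf4j.version}</version>
            </dependency>
            <!-- Versions below are assumed, not stated in the original post -->
            <dependency>
                <groupId>junit</groupId>
                <artifactId>junit</artifactId>
                <version>4.12</version>
            </dependency>
            <dependency>
                <groupId>log4j</groupId>
                <artifactId>log4j</artifactId>
                <version>1.2.17</version>
            </dependency>
            <dependency>
                <groupId>org.apache.hadoop</groupId>
                <artifactId>hadoop-auth</artifactId>
                <version>2.7.3</version>
            </dependency>
        </dependencies>
    </dependencyManagement>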

Copy HBase's hbase-site.xml into the resources directory, then create the Spring configuration file applicationContext.xml:

    <?xml version="1.0" encoding="UTF-8"?>
    <beans xmlns="http://www.springframework.org/schema/beans"
           xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
           xmlns:context="http://www.springframework.org/schema/context"
           xmlns:hdp="http://www.springframework.org/schema/hadoop"
           xsi:schemaLocation="http://www.springframework.org/schema/beans http://www.springframework.org/schema/beans/spring-beans.xsd
               http://www.springframework.org/schema/context http://www.springframework.org/schema/context/spring-context.xsd
               http://www.springframework.org/schema/hadoop http://www.springframework.org/schema/hadoop/spring-hadoop.xsd">

        <context:annotation-config/>
        <context:component-scan base-package="com.sample.hbase"/>
        <hdp:configuration resources="hbase-site.xml"/>
        <hdp:hbase-configuration configuration-ref="hadoopConfiguration"/>
        <bean id="hbaseTemplate" class="org.springframework.data.hadoop.hbase.HbaseTemplate">
            <property name="configuration" ref="hbaseConfiguration"/>
        </bean>
    </beans>
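
The hdp:configuration element registers a Hadoop Configuration bean under the default name hadoopConfiguration, and hdp:hbase-configuration likewise registers hbaseConfiguration, which is why those names can be referenced directly. Outside of tests, the template can be obtained from the context as usual; a minimal bootstrap sketch (the Bootstrap class name is arbitrary):

    import org.springframework.context.support.ClassPathXmlApplicationContext;
    import org.springframework.data.hadoop.hbase.HbaseTemplate;

    public class Bootstrap {
        public static void main(String[] args) {
            // Load the XML context above from the classpath.
            ClassPathXmlApplicationContext ctx =
                    new ClassPathXmlApplicationContext("applicationContext.xml");
            try {
                // Fetch the template bean declared in applicationContext.xml.
                HbaseTemplate template = ctx.getBean("hbaseTemplate", HbaseTemplate.class);
                System.out.println("HbaseTemplate ready: " + template);
            } finally {
                ctx.close();
            }
        }
    }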

Create the test cases (both assume an existing table; see the shell commands after the listing):

    import java.util.List;

    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.util.Bytes;
    import org.junit.Assert;
    import org.junit.Test;
    import org.junit.runner.RunWith;
    import org.springframework.beans.factory.annotation.Autowired;
    import org.springframework.data.hadoop.hbase.HbaseTemplate;
    import org.springframework.data.hadoop.hbase.RowMapper;
    import org.springframework.test.context.ContextConfiguration;
    import org.springframework.test.context.junit4.SpringJUnit4ClassRunner;

    @RunWith(SpringJUnit4ClassRunner.class)
    @ContextConfiguration(locations = {"classpath*:applicationContext.xml"})
    public class BaseTest {

        @Autowired
        private HbaseTemplate template;

        @Test
        public void testFind() {
            // Scan the "cf:name" column of every row in the "user" table.
            List<String> rows = template.find("user", "cf", "name", new RowMapper<String>() {
                public String mapRow(Result result, int i) throws Exception {
                    return result.toString();
                }
            });
            Assert.assertNotNull(rows);
        }

        @Test
        public void testPut() {
            // Write the value "Alice" into row "1", column "cf:name".
            template.put("user", "1", "cf", "name", Bytes.toBytes("Alice"));
        }
    }

That completes the integration.
