
Adapting Hudi 0.9.0 to HBase 2.2.6

Author: 从大数据到人工智能
Published: 2022-01-19

Overview

In Hudi, HBase can be used as the storage for index data; the HBase version Hudi depends on by default is 1.2.3.

Since the HBase API changed substantially between 1.x and 2.x, simply bumping the HBase version in Hudi's build is not enough: the project no longer compiles.

This article modifies part of the Hudi source code so that Hudi 0.9.0 works with HBase 2.2.6.

Compilation errors

If we simply change the HBase version to 2.2.6, the following compilation errors appear:

[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.8.0:compile (default-compile) on project hudi-common: Compilation failure: Compilation failure: 
[ERROR] /root/hudi-0.9.0/hudi-common/src/main/java/org/apache/hudi/common/bootstrap/index/HFileBootstrapIndex.java:[181,34] no suitable method found for createReader(org.apache.hadoop.fs.FileSystem,org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex.HFilePathForReader,org.apache.hadoop.hbase.io.hfile.CacheConfig,org.apache.hadoop.conf.Configuration)
[ERROR]     method org.apache.hadoop.hbase.io.hfile.HFile.createReader(org.apache.hadoop.fs.FileSystem,org.apache.hadoop.fs.Path,org.apache.hadoop.hbase.io.FSDataInputStreamWrapper,long,org.apache.hadoop.hbase.io.hfile.CacheConfig,boolean,org.apache.hadoop.conf.Configuration) is not applicable
[ERROR]       (actual and formal argument lists differ in length)
[ERROR]     method org.apache.hadoop.hbase.io.hfile.HFile.createReader(org.apache.hadoop.fs.FileSystem,org.apache.hadoop.fs.Path,org.apache.hadoop.conf.Configuration) is not applicable
[ERROR]       (actual and formal argument lists differ in length)
[ERROR]     method org.apache.hadoop.hbase.io.hfile.HFile.createReader(org.apache.hadoop.fs.FileSystem,org.apache.hadoop.fs.Path,org.apache.hadoop.hbase.io.hfile.CacheConfig,boolean,org.apache.hadoop.conf.Configuration) is not applicable
[ERROR]       (actual and formal argument lists differ in length)
[ERROR] /root/hudi-0.9.0/hudi-common/src/main/java/org/apache/hudi/common/bootstrap/index/HFileBootstrapIndex.java:[309,93] cannot find symbol
[ERROR]   symbol:   method getKeyValue()
[ERROR]   location: variable scanner of type org.apache.hadoop.hbase.io.hfile.HFileScanner
[ERROR] /root/hudi-0.9.0/hudi-common/src/main/java/org/apache/hudi/common/bootstrap/index/HFileBootstrapIndex.java:[534,51] incompatible types: org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex.HoodieKVComparator cannot be converted to org.apache.hadoop.hbase.CellComparator
[ERROR] /root/hudi-0.9.0/hudi-common/src/main/java/org/apache/hudi/common/bootstrap/index/HFileBootstrapIndex.java:[537,51] incompatible types: org.apache.hudi.common.bootstrap.index.HFileBootstrapIndex.HoodieKVComparator cannot be converted to org.apache.hadoop.hbase.CellComparator
[ERROR] /root/hudi-0.9.0/hudi-common/src/main/java/org/apache/hudi/io/storage/HoodieHFileReader.java:[72,24] no suitable method found for createReader(org.apache.hadoop.fs.FileSystem,org.apache.hadoop.fs.Path,org.apache.hadoop.hbase.io.hfile.CacheConfig,org.apache.hadoop.conf.Configuration)
[ERROR]     method org.apache.hadoop.hbase.io.hfile.HFile.createReader(org.apache.hadoop.fs.FileSystem,org.apache.hadoop.fs.Path,org.apache.hadoop.hbase.io.FSDataInputStreamWrapper,long,org.apache.hadoop.hbase.io.hfile.CacheConfig,boolean,org.apache.hadoop.conf.Configuration) is not applicable
[ERROR]       (actual and formal argument lists differ in length)
[ERROR]     method org.apache.hadoop.hbase.io.hfile.HFile.createReader(org.apache.hadoop.fs.FileSystem,org.apache.hadoop.fs.Path,org.apache.hadoop.conf.Configuration) is not applicable
[ERROR]       (actual and formal argument lists differ in length)
[ERROR]     method org.apache.hadoop.hbase.io.hfile.HFile.createReader(org.apache.hadoop.fs.FileSystem,org.apache.hadoop.fs.Path,org.apache.hadoop.hbase.io.hfile.CacheConfig,boolean,org.apache.hadoop.conf.Configuration) is not applicable
[ERROR]       (actual and formal argument lists differ in length)
[ERROR] /root/hudi-0.9.0/hudi-common/src/main/java/org/apache/hudi/io/storage/HoodieHFileReader.java:[80,24] no suitable method found for createReader(org.apache.hadoop.fs.FileSystem,org.apache.hadoop.fs.Path,org.apache.hadoop.hbase.io.FSDataInputStreamWrapper,int,org.apache.hadoop.hbase.io.hfile.CacheConfig,org.apache.hadoop.conf.Configuration)
[ERROR]     method org.apache.hadoop.hbase.io.hfile.HFile.createReader(org.apache.hadoop.fs.FileSystem,org.apache.hadoop.fs.Path,org.apache.hadoop.hbase.io.FSDataInputStreamWrapper,long,org.apache.hadoop.hbase.io.hfile.CacheConfig,boolean,org.apache.hadoop.conf.Configuration) is not applicable
[ERROR]       (actual and formal argument lists differ in length)
[ERROR]     method org.apache.hadoop.hbase.io.hfile.HFile.createReader(org.apache.hadoop.fs.FileSystem,org.apache.hadoop.fs.Path,org.apache.hadoop.conf.Configuration) is not applicable
[ERROR]       (actual and formal argument lists differ in length)
[ERROR]     method org.apache.hadoop.hbase.io.hfile.HFile.createReader(org.apache.hadoop.fs.FileSystem,org.apache.hadoop.fs.Path,org.apache.hadoop.hbase.io.hfile.CacheConfig,boolean,org.apache.hadoop.conf.Configuration) is not applicable
[ERROR]       (actual and formal argument lists differ in length)
[ERROR] /root/hudi-0.9.0/hudi-common/src/main/java/org/apache/hudi/io/storage/HoodieHFileReader.java:[114,56] incompatible types: org.apache.hadoop.hbase.io.hfile.HFileBlock cannot be converted to java.nio.ByteBuffer
[ERROR] /root/hudi-0.9.0/hudi-common/src/main/java/org/apache/hudi/io/storage/HoodieHFileReader.java:[149,27] cannot find symbol
[ERROR]   symbol:   method getKeyValue()
[ERROR]   location: variable scanner of type org.apache.hadoop.hbase.io.hfile.HFileScanner
[ERROR] /root/hudi-0.9.0/hudi-common/src/main/java/org/apache/hudi/io/storage/HoodieHFileReader.java:[180,54] cannot find symbol
[ERROR]   symbol:   method getKeyValue()
[ERROR]   location: variable scanner of type org.apache.hadoop.hbase.io.hfile.HFileScanner
[ERROR] /root/hudi-0.9.0/hudi-common/src/main/java/org/apache/hudi/io/storage/HoodieHFileReader.java:[200,50] cannot find symbol
[ERROR]   symbol:   method getKeyValue()
[ERROR]   location: variable scanner of type org.apache.hadoop.hbase.io.hfile.HFileScanner
[ERROR] /root/hudi-0.9.0/hudi-common/src/main/java/org/apache/hudi/io/storage/HoodieHFileReader.java:[224,28] cannot find symbol
[ERROR]   symbol:   method getKeyValue()
[ERROR]   location: variable keyScanner of type org.apache.hadoop.hbase.io.hfile.HFileScanner
[ERROR] -> [Help 1]
[ERROR] 
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR] 
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR] 
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :hudi-common

Looking at the errors above, there are four main compatibility problems:

  1. The arguments passed to HFile#createReader in HFileBootstrapIndex no longer match any available overload
  2. HFileScanner#getKeyValue no longer exists in HBase 2.2.6
  3. HFile#withComparator now expects a CellComparator, whereas the source code passes a HoodieKVComparator
  4. The return type of HFileReaderImpl#getMetaBlock changed from ByteBuffer to HFileBlock

Modifying the Hudi source code

To address these problems, we make the following changes:
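The first problem (the createReader parameter mismatch) is not covered by the changes shown below. As the error output above shows, HBase 2.2.6 no longer offers a createReader(FileSystem, Path, CacheConfig, Configuration) overload, but it does offer one that additionally takes a boolean primaryReplicaReader flag. The following is only a minimal sketch of adapting such a call to that overload, not the exact Hudi diff; the helper class, its name and its parameters are illustrative:

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.io.hfile.CacheConfig;
import org.apache.hadoop.hbase.io.hfile.HFile;

// Hypothetical helper, only to illustrate the HBase 2.x overload.
public class HFileReaderSketch {
  public static HFile.Reader openReader(FileSystem fs, Path path, CacheConfig cacheConfig,
                                         Configuration conf) throws IOException {
    // HBase 2.2.x dropped createReader(fs, path, cacheConfig, conf); the closest overload
    // adds a boolean primaryReplicaReader flag, which is passed as true for a normal read.
    return HFile.createReader(fs, path, cacheConfig, true, conf);
  }
}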

  1. HFileScanner#getKeyValue no longer exists in HBase 2.2.6; HFileScanner#getCell returns the Cell at the scanner's current position, so we can change line 309 of HFileBootstrapIndex to:

          keys.add(converter.apply(getUserKeyFromCellKey(CellUtil.getCellKeyAsString(scanner.getCell()))));

(Other occurrences of the same getKeyValue problem may come up during compilation; fix them the same way.)
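For those other call sites, the replacement pattern is the same: getCell() returns the Cell at the scanner's current position. A short, hedged example (the variable names are illustrative, not taken from the Hudi source):

// Assumes org.apache.hadoop.hbase.Cell, org.apache.hadoop.hbase.CellUtil and
// org.apache.hadoop.hbase.io.hfile.HFileScanner are imported, and scanner is an HFileScanner.
// Before (HBase 1.x):  Cell cell = scanner.getKeyValue();
Cell cell = scanner.getCell();               // HBase 2.x replacement
byte[] rowKey = CellUtil.cloneRow(cell);     // e.g. when only the row key bytes are needed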

  2. After the HBase upgrade, HFile#withComparator expects a CellComparator as its argument.

So we can change lines 585-586 of HFileBootstrapIndex to make HoodieKVComparator extend CellComparatorImpl:

  public static class HoodieKVComparator extends CellComparatorImpl {
  }

(Other occurrences of the same problem may come up during compilation; fix them the same way.)
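For context, this comparator is what ultimately gets handed to the HFile writer, which is why the CellComparator requirement appears at all. Below is only a minimal sketch of such a write path against HBase 2.2.x's HFile.WriterFactory, not the actual Hudi code; the helper class, method and parameters are illustrative:

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.io.hfile.CacheConfig;
import org.apache.hadoop.hbase.io.hfile.HFile;
import org.apache.hadoop.hbase.io.hfile.HFileContext;

public class HFileWriterSketch {
  // Because HoodieKVComparator now extends CellComparatorImpl, it satisfies the
  // CellComparator parameter type that withComparator(...) expects in HBase 2.x.
  static HFile.Writer newWriter(Configuration conf, CacheConfig cacheConfig, FileSystem fs,
                                Path path, HFileContext fileContext) throws IOException {
    return HFile.getWriterFactory(conf, cacheConfig)
        .withPath(fs, path)
        .withFileContext(fileContext)
        .withComparator(new HoodieKVComparator())
        .create();
  }
}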

  3. The return type of HFileReaderImpl#getMetaBlock changed from ByteBuffer to HFileBlock, so we can change line 115 of HoodieHFileReader to the following:

      ByteBuff serializedFilter = reader.getMetaBlock(KEY_BLOOM_FILTER_META_BLOCK, false).getBufferWithoutHeader();

(The same problem may come up elsewhere during compilation; apply the same fix.)
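Note that getBufferWithoutHeader() returns org.apache.hadoop.hbase.nio.ByteBuff (HBase's own buffer abstraction) rather than java.nio.ByteBuffer, so the import and any downstream code that expected a ByteBuffer may need a small adjustment as well. A hedged follow-up sketch, not the exact Hudi change, assuming the whole meta block fits comfortably in memory:

// Assumes: import org.apache.hadoop.hbase.nio.ByteBuff; and serializedFilter from the line above.
// If later code expects raw bytes rather than a ByteBuff, the contents can be copied out:
byte[] filterBytes = serializedFilter.toBytes(0, serializedFilter.limit());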

Other issues

Because the jetty version used by Hudi 0.9.0 conflicts with the jetty version that HBase 2.2.6 brings in, we also need to exclude HBase's jetty dependencies so that the jetty version Hudi requires is used.

This problem may not show up at compile time, but it will cause errors at runtime.
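A minimal sketch of what such an exclusion could look like in the affected module's pom.xml; the hbase-server artifact and the org.eclipse.jetty group are used here only as an example, and the exact artifacts that drag in jetty should be checked with mvn dependency:tree:

<dependency>
  <groupId>org.apache.hbase</groupId>
  <artifactId>hbase-server</artifactId>
  <version>2.2.6</version>
  <exclusions>
    <!-- Exclude HBase's bundled jetty so the jetty version required by Hudi is used instead. -->
    <exclusion>
      <groupId>org.eclipse.jetty</groupId>
      <artifactId>*</artifactId>
    </exclusion>
  </exclusions>
</dependency>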

With all of the above resolved, Hudi can now run with HBase 2.2.6!

This is an original article by 「xiaozhch5」, author of the 从大数据到人工智能 blog, licensed under CC 4.0 BY-SA. Please include the original source link and this notice when reposting.

Original link: https://cloud.tencent.com/developer/article/1936506

