
Importing Data from HDFS Directly into HBase in the Hadoop Mapper Phase


The source data format is as follows:

20130512	1	-1	-1	13802	1	2013-05-12 07:26:22	
20130512	1	-1	-1	13802	1	2013-05-12 11:18:24

The goal is to read the data from HDFS and write it directly into HBase, with no reduce phase.

The code is as follows:

package WebsiteAnalysis;

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.KeyValue;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.io.ImmutableBytesWritable;
import org.apache.hadoop.hbase.mapreduce.TableOutputFormat;
import org.apache.hadoop.hbase.util.Bytes;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.io.Writable;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.util.GenericOptionsParser;

public class Map2Hdfs {
	public static final String NAME = "ImportFromFile";

	public enum Counters {
		LINES
	}

	static class ImportMapper extends Mapper<LongWritable, Text, ImmutableBytesWritable, Writable> {
		private byte[] family = null;
		private byte[] qualifier = null;

		@Override
		protected void setup(Context context) throws IOException, InterruptedException {
			// Parse the target column ("family" or "family:qualifier") from the job configuration.
			String column = context.getConfiguration().get("conf.column");
			byte[][] colkey = KeyValue.parseColumn(Bytes.toBytes(column));
			family = colkey[0];
			if (colkey.length > 1) {
				qualifier = colkey[1];
			}
		}

		@Override
		public void map(LongWritable offset, Text line, Context context) throws IOException {
			try {
				// Input lines are tab-separated; only the last field (the timestamp) is stored.
				String[] lineArr = line.toString().split("\t");
				// Row key: the byte offset of the line within the input file, as a string.
				Put put = new Put(Bytes.toBytes(offset + ""));
				// The qualifier is hard-coded to "time"; the qualifier parsed in setup() is not used here.
				put.add(family, Bytes.toBytes("time"), Bytes.toBytes(lineArr[lineArr.length - 1]));
				context.write(new ImmutableBytesWritable(Bytes.toBytes(offset + "")), put);
				context.getCounter(Counters.LINES).increment(1);
			} catch (Exception e) {
				e.printStackTrace();
			}
		}
	}

	public static void main(String[] args) throws Exception {
		Configuration conf = HBaseConfiguration.create();
		String[] otherArgs = new GenericOptionsParser(conf, args).getRemainingArgs();
		// Column family the mapper writes to; the target table must already contain it.
		conf.set("conf.column", "cf");
		String inputPath = "/dsap/middata/lj/ooxx/pv";
		Job job = new Job(conf, "TestMap2Hdfs");

		job.setJarByClass(Map2Hdfs.class);
		job.setMapperClass(ImportMapper.class);
		// Mapper output goes straight into the HBase table via TableOutputFormat.
		job.setOutputFormatClass(TableOutputFormat.class);
		job.getConfiguration().set(TableOutputFormat.OUTPUT_TABLE, "TestMap2Hdfs");
		job.setOutputKeyClass(ImmutableBytesWritable.class);
		job.setOutputValueClass(Writable.class);
		// Map-only job: no reduce phase.
		job.setNumReduceTasks(0);
		FileInputFormat.addInputPath(job, new Path(inputPath + "/" + otherArgs[0]));
		System.exit(job.waitForCompletion(true) ? 0 : 1);
	}
}
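Note that TableOutputFormat writes into an existing table, so the TestMap2Hdfs table with the cf column family has to be created before the job is submitted, either from the hbase shell or with the client API. Below is a minimal sketch, assuming the same old-style HBase client API generation as the code above (HBaseAdmin, HTableDescriptor); the class name CreateMap2HdfsTable is just illustrative, and newer HBase versions use Admin and TableDescriptorBuilder instead:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.client.HBaseAdmin;

public class CreateMap2HdfsTable {
	public static void main(String[] args) throws Exception {
		Configuration conf = HBaseConfiguration.create();
		HBaseAdmin admin = new HBaseAdmin(conf);
		// Table name and column family must match the job above:
		// TableOutputFormat.OUTPUT_TABLE = "TestMap2Hdfs", conf.column = "cf"
		if (!admin.tableExists("TestMap2Hdfs")) {
			HTableDescriptor desc = new HTableDescriptor("TestMap2Hdfs");
			desc.addFamily(new HColumnDescriptor("cf"));
			admin.createTable(desc);
		}
		admin.close();
	}
}

The job itself takes one argument, the subdirectory under /dsap/middata/lj/ooxx/pv to import. Because each row key is the line's byte offset, re-running the job over the same input file overwrites the same rows instead of producing duplicates.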

REF:

http://stackoverflow.com/questions/11061854/hadoop-writing-to-hbase-directly-from-the-mapper

http://blog.sina.com.cn/s/blog_62a9902f0101904h.html  writing to HBase by creating the table in the job

http://blog.pureisle.net/archives/1938.html  a summary of the common HBase/HDFS MapReduce read/write scenarios

http://blog.csdn.net/kirayuan/article/details/7001278  sample code for copying an HBase table
