
Integrating Flume with Kafka

Author: WindWant
Published: 2020-09-11

Flume + Kafka integration:

  • Flume: a highly available, highly reliable, distributed system for collecting, aggregating, and transporting large volumes of log data.
  • Kafka: a distributed streaming platform.
  • Flume collects the application logs and ships them to Kafka.

Part 1: Install and deploy Kafka

Download

Kafka 1.0.0 is the latest and current stable release. You can verify your download using the project's release KEYS.

Kafka is built for multiple Scala versions. This only matters if you use Scala and want a build matching your Scala version; otherwise any build works (2.11 is recommended).

1. Extract the archive (the Scala 2.11 build is used throughout):

tar zxvf kafka_2.11-1.0.0.tgz

2. Move it into place:

mv kafka_2.11-1.0.0 /usr/local/kafka2.11

3. Start ZooKeeper

Kafka ships a bundled ZooKeeper; it is typically started with bin/zookeeper-server-start.sh config/zookeeper.properties before starting the broker.

4. Start Kafka:

#nohup bin/kafka-server-start.sh config/server.properties &

5. Create a topic:

#bin/kafka-topics.sh --create --zookeeper localhost:2181 --partitions 1 --replication-factor 1 --topic test

Created topic "test".

6. List topics:

# bin/kafka-topics.sh --list --zookeeper localhost:2181

__consumer_offsets

test

7. Test producing messages:

#bin/kafka-console-producer.sh --broker-list localhost:9092 --topic test

Then type a message, e.g.: my test

8. Test consuming messages:

#bin/kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic test --from-beginning

Part 2: Install and deploy Flume

Download

Apache Flume is distributed under the Apache License, version 2.0. It is essential to verify the integrity of downloaded files using the PGP signatures (.asc) or the MD5/SHA1 checksums; these link to the originals on the main distribution server.

Apache Flume binary (tar.gz): apache-flume-1.8.0-bin.tar.gz (plus .md5, .sha1, .asc)
Apache Flume source (tar.gz): apache-flume-1.8.0-src.tar.gz (plus .md5, .sha1, .asc)

1. Download the binary tarball from an Apache mirror, e.g.:

wget https://archive.apache.org/dist/flume/1.8.0/apache-flume-1.8.0-bin.tar.gz

2. Extract the archive:

tar zxvf apache-flume-1.8.0-bin.tar.gz

3. Move it into place:

mv apache-flume-1.8.0-bin /usr/local/flume1.8

4. Prerequisites:

Install Java, then set the Java and Flume environment variables by adding the following to `/etc/profile`:

export JAVA_HOME=/usr/java/jdk1.8.0_65
export FLUME_HOME=/usr/local/flume1.8
export PATH=$PATH:$JAVA_HOME/bin:$FLUME_HOME/bin

Run source /etc/profile to apply the changes. (Note the PATH entry is $FLUME_HOME/bin, so that flume-ng resolves on the command line.)

5. Create the log file that Flume will collect from:

/tmp/logs/kafka.log
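Creating the directory and an empty file up front avoids any surprises when the exec source starts tailing. A minimal sketch using the path from this article:

```shell
#!/bin/bash
# Create the collection directory and an empty log file for Flume to tail
mkdir -p /tmp/logs
touch /tmp/logs/kafka.log
```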

6. Configuration

Copy the configuration templates (the agent config is named kafka.properties here so it matches the start command in step 7):

# cp conf/flume-conf.properties.template conf/kafka.properties
# cp conf/flume-env.sh.template conf/flume-env.sh

Edit conf/kafka.properties as follows:

agent.sources = s1
agent.channels = c1
agent.sinks = k1

agent.sources.s1.type=exec
# where the logs are collected from
agent.sources.s1.command=tail -F /tmp/logs/kafka.log
agent.sources.s1.channels=c1

agent.channels.c1.type=memory
agent.channels.c1.capacity=10000
agent.channels.c1.transactionCapacity=100

agent.sinks.k1.type=org.apache.flume.sink.kafka.KafkaSink
# Kafka broker address
agent.sinks.k1.brokerList=localhost:9092
# Kafka topic
agent.sinks.k1.topic=test
agent.sinks.k1.serializer.class=kafka.serializer.StringEncoder
agent.sinks.k1.channel=c1
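In this config the source s1 writes into channel c1 and the sink k1 drains the same channel; if the sink pointed at a different channel than the source fills, events would never reach Kafka. A quick wiring check (a sketch; only the two relevant lines of the config are embedded for illustration):

```shell
#!/bin/bash
# Check that source and sink reference the same channel (abbreviated config)
cfg=$(mktemp)
cat > "$cfg" <<'EOF'
agent.sources.s1.channels=c1
agent.sinks.k1.channel=c1
EOF
src_ch=$(grep '^agent.sources.s1.channels=' "$cfg" | cut -d= -f2)
sink_ch=$(grep '^agent.sinks.k1.channel=' "$cfg" | cut -d= -f2)
echo "source channel: $src_ch, sink channel: $sink_ch"
rm -f "$cfg"
```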

Verification

7. Start the Flume agent:

# bin/flume-ng agent --conf ./conf/ -f conf/kafka.properties -Dflume.root.logger=DEBUG,console -n agent

Runtime logs are written to the logs directory. The -Dflume.root.logger=...,console option keeps the agent in the foreground and prints its logs to the console, which is the easiest way to find the cause when the service misbehaves.

8. Create a test log generator, log_producer_test.sh:

#!/bin/bash
for ((i=0; i<=1000; i++)); do
  echo "kafka_flume_test-$i" >> /tmp/logs/kafka.log
done

9. Generate the logs:

./log_producer_test.sh

Then watch the messages show up on the Kafka console consumer.
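As a sanity check on the generator itself: the loop runs i from 0 through 1000, so each run appends 1001 lines. That can be confirmed locally against a temp file, independent of Kafka:

```shell
#!/bin/bash
# Replay the generator loop against a temp file and count the output lines
tmp=$(mktemp)
for ((i=0; i<=1000; i++)); do
  echo "kafka_flume_test-$i" >> "$tmp"
done
lines=$(wc -l < "$tmp")
echo "generated $lines lines"   # 0..1000 inclusive -> 1001 lines
rm -f "$tmp"
```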

Originally published 2017-11-22 on the author's personal site/blog.
