
Flume 1.8 Installation, Configuration, and Getting-Started Examples

Author: 程裕强
Published 2018-01-02 16:57:08

1. Download

http://flume.apache.org/download.html


http://mirrors.tuna.tsinghua.edu.cn/apache/flume/1.8.0/apache-flume-1.8.0-bin.tar.gz

[root@node1 ~]# wget http://mirrors.tuna.tsinghua.edu.cn/apache/flume/1.8.0/apache-flume-1.8.0-bin.tar.gz
--2017-12-20 09:19:18--  http://mirrors.tuna.tsinghua.edu.cn/apache/flume/1.8.0/apache-flume-1.8.0-bin.tar.gz
Resolving mirrors.tuna.tsinghua.edu.cn (mirrors.tuna.tsinghua.edu.cn)... 101.6.6.178, 2402:f000:1:416:101:6:6:178
Connecting to mirrors.tuna.tsinghua.edu.cn (mirrors.tuna.tsinghua.edu.cn)|101.6.6.178|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 58688757 (56M) [application/octet-stream]
Saving to: ‘apache-flume-1.8.0-bin.tar.gz’

100%[=====================================================================================================>] 58,688,757  1.44MB/s   in 73s    

2017-12-20 09:20:32 (783 KB/s) - ‘apache-flume-1.8.0-bin.tar.gz’ saved [58688757/58688757]

[root@node1 ~]# 

2. Installation and Configuration

Extract the archive:

[root@node1 ~]# tar -zxvf apache-flume-1.8.0-bin.tar.gz

Move it to /opt and rename it:

[root@node1 ~]# mv apache-flume-1.8.0-bin /opt/flume-1.8.0

Directory layout:

[root@node1 ~]# cd /opt/flume-1.8.0/
[root@node1 flume-1.8.0]# ll
total 148
drwxr-xr-x  2 root root    62 Dec 20 09:30 bin
-rw-r--r--  1 root root 81264 Sep 15 08:26 CHANGELOG
drwxr-xr-x  2 root root   127 Dec 20 09:30 conf
-rw-r--r--  1 root root  5681 Sep 15 08:26 DEVNOTES
-rw-r--r--  1 root root  2873 Sep 15 08:26 doap_Flume.rdf
drwxr-xr-x 10 root root  4096 Sep 15 08:48 docs
drwxr-xr-x  2 root root  8192 Dec 20 09:30 lib
-rw-r--r--  1 root root 27663 Sep 15 08:26 LICENSE
-rw-r--r--  1 root root   249 Sep 15 08:26 NOTICE
-rw-r--r--  1 root root  2483 Sep 15 08:26 README.md
-rw-r--r--  1 root root  1588 Sep 15 08:26 RELEASE-NOTES
drwxr-xr-x  2 root root    68 Dec 20 09:30 tools
[root@node1 flume-1.8.0]#

Configure flume-env.sh

[root@node1 flume-1.8.0]# cd conf
[root@node1 conf]# ls
flume-conf.properties.template  flume-env.ps1.template  flume-env.sh.template  log4j.properties
[root@node1 conf]# cp flume-env.sh.template flume-env.sh
[root@node1 conf]# vi flume-env.sh

Edit it as follows:

# Environment variables can be set here.
export JAVA_HOME=/opt/jdk1.8.0_112

# Give Flume more memory and pre-allocate, enable remote monitoring via JMX
export JAVA_OPTS="-Xms100m -Xmx2000m -Dcom.sun.management.jmxremote"
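The -Dcom.sun.management.jmxremote flag on its own only enables local JMX attach. To monitor the agent from another host you would also pin a JMX port; the sketch below is a hypothetical extension of the setting above (port 5445 and the disabled auth/SSL are illustrative choices, not from this article, and are only acceptable inside a trusted network):

```shell
# Hypothetical flume-env.sh addition: pin the JMX port so jconsole/VisualVM
# can attach remotely. Port 5445 is an arbitrary example choice.
# Disabling authentication and SSL is insecure outside a trusted network.
export JAVA_OPTS="-Xms100m -Xmx2000m \
  -Dcom.sun.management.jmxremote \
  -Dcom.sun.management.jmxremote.port=5445 \
  -Dcom.sun.management.jmxremote.authenticate=false \
  -Dcom.sun.management.jmxremote.ssl=false"
```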

Configure flume-conf.properties

[root@node1 conf]# cp flume-conf.properties.template flume-conf.properties

View the available commands and options:

[root@node1 flume-1.8.0]# bin/flume-ng
Error: Unknown or unspecified command ''

Usage: bin/flume-ng <command> [options]...

commands:
  help                      display this help text
  agent                     run a Flume agent
  avro-client               run an avro Flume client
  version                   show Flume version info

global options:
  --conf,-c <conf>          use configs in <conf> directory
  --classpath,-C <cp>       append to the classpath
  --dryrun,-d               do not actually start Flume, just print the command
  --plugins-path <dirs>     colon-separated list of plugins.d directories. See the
                            plugins.d section in the user guide for more details.
                            Default: $FLUME_HOME/plugins.d
  -Dproperty=value          sets a Java system property value
  -Xproperty=value          sets a Java -X option

agent options:
  --name,-n <name>          the name of this agent (required)
  --conf-file,-f <file>     specify a config file (required if -z missing)
  --zkConnString,-z <str>   specify the ZooKeeper connection to use (required if -f missing)
  --zkBasePath,-p <path>    specify the base path in ZooKeeper for agent configs
  --no-reload-conf          do not reload config file if changed
  --help,-h                 display help text

avro-client options:
  --rpcProps,-P <file>   RPC client properties file with server connection params
  --host,-H <host>       hostname to which events will be sent
  --port,-p <port>       port of the avro source
  --dirname <dir>        directory to stream to avro source
  --filename,-F <file>   text file to stream to avro source (default: std input)
  --headerFile,-R <file> File containing event headers as key/value pairs on each new line
  --help,-h              display help text

  Either --rpcProps or both --host and --port must be specified.

Note that if <conf> directory is specified, then it is always included first
in the classpath.

[root@node1 flume-1.8.0]#

Check the version:

[root@node1 flume-1.8.0]# bin/flume-ng version
Flume 1.8.0
Source code repository: https://git-wip-us.apache.org/repos/asf/flume.git
Revision: 99f591994468633fc6f8701c5fc53e0214b6da4f
Compiled by denes on Fri Sep 15 14:58:00 CEST 2017
From source with checksum fbb44c8c8fb63a49be0a59e27316833d
[root@node1 flume-1.8.0]#

3. Example 1: Avro

Avro is a data serialization system designed for applications that exchange data in bulk.

(1) Create avro.conf. In Flume's conf directory, create a new configuration file named avro.conf:

[root@node1 conf]# vi avro.conf
[root@node1 conf]# cat avro.conf 
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Describe/configure the source
a1.sources.r1.type = avro
a1.sources.r1.channels = c1
a1.sources.r1.bind = 0.0.0.0
a1.sources.r1.port = 4141

# Describe the sink
a1.sinks.k1.type = logger

# Use a channel which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
[root@node1 conf]# 

(2) Start Flume agent a1 with: bin/flume-ng agent -c /opt/flume-1.8.0/conf/ -f /opt/flume-1.8.0/conf/avro.conf -n a1 -Dflume.root.logger=INFO,console

[root@node1 flume-1.8.0]# bin/flume-ng agent -c /opt/flume-1.8.0/conf/ -f /opt/flume-1.8.0/conf/avro.conf -n a1 -Dflume.root.logger=INFO,console
Info: Sourcing environment configuration script /opt/flume-1.8.0/conf/flume-env.sh
Info: Including Hadoop libraries found via (/opt/hadoop-2.7.3/bin/hadoop) for HDFS access
Info: Including HBASE libraries found via (/opt/hbase-1.2.6/bin/hbase) for HBASE access
Info: Including Hive libraries found via () for Hive access
+ exec /opt/jdk1.8.0_112/bin/java -Xms100m -Xmx2000m -Dcom.sun.management.jmxremote -Dflume.root.logger=INFO,console -cp '<classpath elided>' -Djava.library.path=:/opt/hadoop-2.7.3/lib/native:/opt/hadoop-2.7.3/lib/native org.apache.flume.node.Application -f /opt/flume-1.8.0/conf/avro.conf -n a1
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/flume-1.8.0/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop-2.7.3/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hbase-1.2.6/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2017-12-20 09:50:23,141 (lifecycleSupervisor-1-0) [INFO - org.apache.flume.node.PollingPropertiesFileConfigurationProvider.start(PollingPropertiesFileConfigurationProvider.java:62)] Configuration provider starting
2017-12-20 09:50:23,158 (conf-file-poller-0) [INFO - org.apache.flume.node.PollingPropertiesFileConfigurationProvider$FileWatcherRunnable.run(PollingPropertiesFileConfigurationProvider.java:134)] Reloading configuration file:/opt/flume-1.8.0/conf/avro.conf
2017-12-20 09:50:23,176 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty(FlumeConfiguration.java:930)] Added sinks: k1 Agent: a1
2017-12-20 09:50:23,176 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty(FlumeConfiguration.java:1016)] Processing:k1
2017-12-20 09:50:23,177 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty(FlumeConfiguration.java:1016)] Processing:k1
2017-12-20 09:50:23,212 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration.validateConfiguration(FlumeConfiguration.java:140)] Post-validation flume configuration contains configuration for agents: [a1]
2017-12-20 09:50:23,212 (conf-file-poller-0) [INFO - org.apache.flume.node.AbstractConfigurationProvider.loadChannels(AbstractConfigurationProvider.java:147)] Creating channels
2017-12-20 09:50:23,255 (conf-file-poller-0) [INFO - org.apache.flume.channel.DefaultChannelFactory.create(DefaultChannelFactory.java:42)] Creating instance of channel c1 type memory
2017-12-20 09:50:23,270 (conf-file-poller-0) [INFO - org.apache.flume.node.AbstractConfigurationProvider.loadChannels(AbstractConfigurationProvider.java:201)] Created channel c1
2017-12-20 09:50:23,271 (conf-file-poller-0) [INFO - org.apache.flume.source.DefaultSourceFactory.create(DefaultSourceFactory.java:41)] Creating instance of source r1, type avro
2017-12-20 09:50:23,344 (conf-file-poller-0) [INFO - org.apache.flume.sink.DefaultSinkFactory.create(DefaultSinkFactory.java:42)] Creating instance of sink: k1, type: logger
2017-12-20 09:50:23,348 (conf-file-poller-0) [INFO - org.apache.flume.node.AbstractConfigurationProvider.getConfiguration(AbstractConfigurationProvider.java:116)] Channel c1 connected to [r1, k1]
2017-12-20 09:50:23,383 (conf-file-poller-0) [INFO - org.apache.flume.node.Application.startAllComponents(Application.java:137)] Starting new configuration:{ sourceRunners:{r1=EventDrivenSourceRunner: { source:Avro source r1: { bindAddress: 0.0.0.0, port: 4141 } }} sinkRunners:{k1=SinkRunner: { policy:org.apache.flume.sink.DefaultSinkProcessor@9838ca6 counterGroup:{ name:null counters:{} } }} channels:{c1=org.apache.flume.channel.MemoryChannel{name: c1}} }
2017-12-20 09:50:23,398 (conf-file-poller-0) [INFO - org.apache.flume.node.Application.startAllComponents(Application.java:144)] Starting Channel c1
2017-12-20 09:50:23,400 (conf-file-poller-0) [INFO - org.apache.flume.node.Application.startAllComponents(Application.java:159)] Waiting for channel: c1 to start. Sleeping for 500 ms
2017-12-20 09:50:23,404 (lifecycleSupervisor-1-0) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.register(MonitoredCounterGroup.java:119)] Monitored counter group for type: CHANNEL, name: c1: Successfully registered new MBean.
2017-12-20 09:50:23,407 (lifecycleSupervisor-1-0) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.start(MonitoredCounterGroup.java:95)] Component type: CHANNEL, name: c1 started
2017-12-20 09:50:23,901 (conf-file-poller-0) [INFO - org.apache.flume.node.Application.startAllComponents(Application.java:171)] Starting Sink k1
2017-12-20 09:50:23,902 (conf-file-poller-0) [INFO - org.apache.flume.node.Application.startAllComponents(Application.java:182)] Starting Source r1
2017-12-20 09:50:23,903 (lifecycleSupervisor-1-1) [INFO - org.apache.flume.source.AvroSource.start(AvroSource.java:234)] Starting Avro source r1: { bindAddress: 0.0.0.0, port: 4141 }...
2017-12-20 09:50:24,546 (lifecycleSupervisor-1-1) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.register(MonitoredCounterGroup.java:119)] Monitored counter group for type: SOURCE, name: r1: Successfully registered new MBean.
2017-12-20 09:50:24,547 (lifecycleSupervisor-1-1) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.start(MonitoredCounterGroup.java:95)] Component type: SOURCE, name: r1 started
2017-12-20 09:50:24,549 (lifecycleSupervisor-1-1) [INFO - org.apache.flume.source.AvroSource.start(AvroSource.java:260)] Avro source r1 started.

The final line, Avro source r1 started., shows that agent a1 started successfully.
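When scripting, it can be handy to start the agent in the background with its output redirected to a log file, then wait for that marker line instead of watching the console. A minimal helper sketch (the function name and timeout handling are my own, not part of Flume):

```shell
# wait_for_line FILE PATTERN TIMEOUT_SECONDS
# Polls FILE once per second until PATTERN appears, or gives up after
# TIMEOUT_SECONDS. Returns 0 if the line showed up, 1 otherwise.
wait_for_line() {
  file=$1; pattern=$2; timeout=$3; i=0
  while [ "$i" -lt "$timeout" ]; do
    grep -q "$pattern" "$file" 2>/dev/null && return 0
    sleep 1; i=$((i + 1))
  done
  return 1
}

# Example usage after launching the agent in the background:
#   bin/flume-ng agent -c conf -f conf/avro.conf -n a1 \
#     -Dflume.root.logger=INFO,console > /tmp/flume-a1.log 2>&1 &
#   wait_for_line /tmp/flume-a1.log "Avro source r1 started" 30
```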

(3) Create a test file

[root@node1 ~]# echo "hello world" > test.log

(4) Send the file with avro-client

[root@node1 flume-1.8.0]# bin/flume-ng avro-client -c /opt/flume-1.8.0/conf/ -H node1 -p 4141 -F /root/test.log 
Info: Sourcing environment configuration script /opt/flume-1.8.0/conf/flume-env.sh
Info: Including Hadoop libraries found via (/opt/hadoop-2.7.3/bin/hadoop) for HDFS access
Info: Including HBASE libraries found via (/opt/hbase-1.2.6/bin/hbase) for HBASE access
Info: Including Hive libraries found via () for Hive access
+ exec /opt/jdk1.8.0_112/bin/java -Xms100m -Xmx2000m -Dcom.sun.management.jmxremote -cp '<classpath elided>' -Djava.library.path=:/opt/hadoop-2.7.3/lib/native:/opt/hadoop-2.7.3/lib/native org.apache.flume.client.avro.AvroCLIClient -H node1 -p 4141 -F /root/test.log
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/flume-1.8.0/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop-2.7.3/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hbase-1.2.6/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
[root@node1 flume-1.8.0]# 

(5) Received events. In the console where the Flume agent is running, you should now see output like the following; note one line in particular:

2017-12-20 09:53:05,347 (lifecycleSupervisor-1-4) [INFO - org.apache.flume.source.AvroSource.start(AvroSource.java:260)] Avro sour
2017-12-20 10:03:10,594 (New I/O server boss #5) [INFO - org.apache.avro.ipc.NettyServer$NettyServerAvroHandler.handleUpstream(NettyServer.java:171)] [id: 0x0492d472, /192.168.80.131:53314 => /192.168.80.131:4141] OPEN
2017-12-20 10:03:10,601 (New I/O worker #1) [INFO - org.apache.avro.ipc.NettyServer$NettyServerAvroHandler.handleUpstream(NettyServer.java:171)] [id: 0x0492d472, /192.168.80.131:53314 => /192.168.80.131:4141] BOUND: /192.168.80.131:4141
2017-12-20 10:03:10,602 (New I/O worker #1) [INFO - org.apache.avro.ipc.NettyServer$NettyServerAvroHandler.handleUpstream(NettyServer.java:171)] [id: 0x0492d472, /192.168.80.131:53314 => /192.168.80.131:4141] CONNECTED: /192.168.80.131:53314
2017-12-20 10:03:11,124 (SinkRunner-PollingRunner-DefaultSinkProcessor) [INFO - org.apache.flume.sink.LoggerSink.process(LoggerSink.java:95)] Event: { headers:{} body: 68 65 6C 6C 6F 20 77 6F 72 6C 64                hello world }
2017-12-20 10:03:11,134 (New I/O worker #1) [INFO - org.apache.avro.ipc.NettyServer$NettyServerAvroHandler.handleUpstream(NettyServer.java:171)] [id: 0x0492d472, /192.168.80.131:53314 :> /192.168.80.131:4141] DISCONNECTED
2017-12-20 10:03:11,134 (New I/O worker #1) [INFO - org.apache.avro.ipc.NettyServer$NettyServerAvroHandler.handleUpstream(NettyServer.java:171)] [id: 0x0492d472, /192.168.80.131:53314 :> /192.168.80.131:4141] UNBOUND
2017-12-20 10:03:11,135 (New I/O worker #1) [INFO - org.apache.avro.ipc.NettyServer$NettyServerAvroHandler.handleUpstream(NettyServer.java:171)] [id: 0x0492d472, /192.168.80.131:53314 :> /192.168.80.131:4141] CLOSED
2017-12-20 10:03:11,135 (New I/O worker #1) [INFO - org.apache.avro.ipc.NettyServer$NettyServerAvroHandler.channelClosed(NettyServer.java:209)] Connection to /192.168.80.131:53314 disconnected.

That key line, isolated:

2017-12-20 10:03:11,124 (SinkRunner-PollingRunner-DefaultSinkProcessor) [INFO - org.apache.flume.sink.LoggerSink.process(LoggerSink.java:95)] Event: { headers:{} body: 68 65 6C 6C 6F 20 77 6F 72 6C 64                hello world }
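The logger sink prints the event body as hex bytes alongside a short preview. Those bytes decode back to exactly what was sent; a quick check (assuming xxd is installed, as it is on most distributions that ship vim):

```shell
# Decode the hex body from the logger-sink output back into text.
# xxd -r -p reads plain whitespace-separated hex and reverses it to bytes.
printf '68 65 6C 6C 6F 20 77 6F 72 6C 64' | xxd -r -p
echo   # trailing newline for readability
```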

4. Example 2: Spool

The spooldir source monitors a configured directory for newly added files and reads the data out of them.

Two caveats: 1) a file must not be opened and edited again after it has been copied into the spool directory; 2) the spool directory must not contain subdirectories.
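Because of the first caveat, files should be written elsewhere and then moved into the spool directory in a single step: a rename within the same filesystem is atomic, so the source never observes a half-written file. A sketch (./logs stands in for the article's /root/logs; the file names are illustrative):

```shell
SPOOL_DIR=./logs                      # stands in for /root/logs
mkdir -p "$SPOOL_DIR"

# Write the file OUTSIDE the watched directory first...
echo "spool test line" > app.log.tmp

# ...then move it in. mv on the same filesystem is a rename(2), which is
# atomic, so the spooldir source only ever sees the complete file.
mv app.log.tmp "$SPOOL_DIR/app.log"
```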

(1) Create spool.conf

Create the agent configuration file spool.conf:

[root@node1 flume-1.8.0]# vi conf/spool.conf
[root@node1 flume-1.8.0]# cat conf/spool.conf 
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Describe/configure the source
a1.sources.r1.type = spooldir
a1.sources.r1.channels = c1
a1.sources.r1.spoolDir = /root/logs
a1.sources.r1.fileHeader = true

# Describe the sink
a1.sinks.k1.type = logger

# Use a channel which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
[root@node1 flume-1.8.0]# 

Here /root/logs is the directory where the monitored log files are placed.

(2) Start Flume agent a1

[root@node1 flume-1.8.0]# bin/flume-ng agent -c /opt/flume-1.8.0/conf/ -f /opt/flume-1.8.0/conf/spool.conf -n a1 -Dflume.root.logger=INFO,console
Info: Sourcing environment configuration script /opt/flume-1.8.0/conf/flume-env.sh
Info: Including Hadoop libraries found via (/opt/hadoop-2.7.3/bin/hadoop) for HDFS access
Info: Including HBASE libraries found via (/opt/hbase-1.2.6/bin/hbase) for HBASE access
Info: Including Hive libraries found via () for Hive access
+ exec /opt/jdk1.8.0_112/bin/java -Xms100m -Xmx2000m -Dcom.sun.management.jmxremote -Dflume.root.logger=INFO,console -cp '<classpath elided>' ...
-2.7.3/share/hadoop/yarn/*:/opt/hadoop-2.7.3/share/hadoop/mapreduce/lib/*:/opt/hadoop-2.7.3/share/hadoop/mapreduce/*:/opt/hadoop-2.7.3/contrib/capacity-scheduler/*.jar:/opt/hbase-1.2.6/conf:/lib/*' -Djava.library.path=:/opt/hadoop-2.7.3/lib/native:/opt/hadoop-2.7.3/lib/native org.apache.flume.node.Application -f /opt/flume-1.8.0/conf/spool.conf -n a1
*/
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/flume-1.8.0/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop-2.7.3/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hbase-1.2.6/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2017-12-20 10:15:22,365 (lifecycleSupervisor-1-0) [INFO - org.apache.flume.node.PollingPropertiesFileConfigurationProvider.start(PollingPropertiesFileConfigurationProvider.java:62)] Configuration provider starting
2017-12-20 10:15:22,376 (conf-file-poller-0) [INFO - org.apache.flume.node.PollingPropertiesFileConfigurationProvider$FileWatcherRunnable.run(PollingPropertiesFileConfigurationProvider.java:134)] Reloading configuration file:/opt/flume-1.8.0/conf/spool.conf
2017-12-20 10:15:22,383 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty(FlumeConfiguration.java:930)] Added sinks: k1 Agent: a1
2017-12-20 10:15:22,384 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty(FlumeConfiguration.java:1016)] Processing:k1
2017-12-20 10:15:22,384 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty(FlumeConfiguration.java:1016)] Processing:k1
2017-12-20 10:15:22,412 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration.validateConfiguration(FlumeConfiguration.java:140)] Post-validation flume configuration contains configuration for agents: [a1]
2017-12-20 10:15:22,412 (conf-file-poller-0) [INFO - org.apache.flume.node.AbstractConfigurationProvider.loadChannels(AbstractConfigurationProvider.java:147)] Creating channels
2017-12-20 10:15:22,422 (conf-file-poller-0) [INFO - org.apache.flume.channel.DefaultChannelFactory.create(DefaultChannelFactory.java:42)] Creating instance of channel c1 type memory
2017-12-20 10:15:22,432 (conf-file-poller-0) [INFO - org.apache.flume.node.AbstractConfigurationProvider.loadChannels(AbstractConfigurationProvider.java:201)] Created channel c1
2017-12-20 10:15:22,434 (conf-file-poller-0) [INFO - org.apache.flume.source.DefaultSourceFactory.create(DefaultSourceFactory.java:41)] Creating instance of source r1, type spooldir
2017-12-20 10:15:22,458 (conf-file-poller-0) [INFO - org.apache.flume.sink.DefaultSinkFactory.create(DefaultSinkFactory.java:42)] Creating instance of sink: k1, type: logger
2017-12-20 10:15:22,464 (conf-file-poller-0) [INFO - org.apache.flume.node.AbstractConfigurationProvider.getConfiguration(AbstractConfigurationProvider.java:116)] Channel c1 connected to [r1, k1]
2017-12-20 10:15:22,480 (conf-file-poller-0) [INFO - org.apache.flume.node.Application.startAllComponents(Application.java:137)] Starting new configuration:{ sourceRunners:{r1=EventDrivenSourceRunner: { source:Spool Directory source r1: { spoolDir: /root/logs } }} sinkRunners:{k1=SinkRunner: { policy:org.apache.flume.sink.DefaultSinkProcessor@3969eb61 counterGroup:{ name:null counters:{} } }} channels:{c1=org.apache.flume.channel.MemoryChannel{name: c1}} }
2017-12-20 10:15:22,500 (conf-file-poller-0) [INFO - org.apache.flume.node.Application.startAllComponents(Application.java:144)] Starting Channel c1
2017-12-20 10:15:22,503 (conf-file-poller-0) [INFO - org.apache.flume.node.Application.startAllComponents(Application.java:159)] Waiting for channel: c1 to start. Sleeping for 500 ms
2017-12-20 10:15:22,506 (lifecycleSupervisor-1-0) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.register(MonitoredCounterGroup.java:119)] Monitored counter group for type: CHANNEL, name: c1: Successfully registered new MBean.
2017-12-20 10:15:22,509 (lifecycleSupervisor-1-0) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.start(MonitoredCounterGroup.java:95)] Component type: CHANNEL, name: c1 started
2017-12-20 10:15:23,004 (conf-file-poller-0) [INFO - org.apache.flume.node.Application.startAllComponents(Application.java:171)] Starting Sink k1
2017-12-20 10:15:23,005 (conf-file-poller-0) [INFO - org.apache.flume.node.Application.startAllComponents(Application.java:182)] Starting Source r1
2017-12-20 10:15:23,006 (lifecycleSupervisor-1-4) [INFO - org.apache.flume.source.SpoolDirectorySource.start(SpoolDirectorySource.java:83)] SpoolDirectorySource source starting with directory: /root/logs
2017-12-20 10:15:23,058 (lifecycleSupervisor-1-4) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.register(MonitoredCounterGroup.java:119)] Monitored counter group for type: SOURCE, name: r1: Successfully registered new MBean.
2017-12-20 10:15:23,059 (lifecycleSupervisor-1-4) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.start(MonitoredCounterGroup.java:95)] Component type: SOURCE, name: r1 started

(2) Add a log file

To simulate log generation, open another terminal and copy a file into the /root/logs directory:

[root@node1 ~]# cp test.log /root/logs

(3) Data collected by Flume. Back in the console of the agent started in (1), observe the output:

2017-12-20 10:15:36,829 (pool-3-thread-1) [INFO - org.apache.flume.client.avro.ReliableSpoolingFileEventReader.readEvents(ReliableSpoolingFileEventReader.java:324)] Last read took us just up to a file boundary. Rolling to the next file, if there is one.
2017-12-20 10:15:36,829 (pool-3-thread-1) [INFO - org.apache.flume.client.avro.ReliableSpoolingFileEventReader.rollCurrentFile(ReliableSpoolingFileEventReader.java:433)] Preparing to move file /root/logs/test.log to /root/logs/test.log.COMPLETED
2017-12-20 10:15:38,031 (SinkRunner-PollingRunner-DefaultSinkProcessor) [INFO - org.apache.flume.sink.LoggerSink.process(LoggerSink.java:95)] Event: { headers:{file=/root/logs/test.log} body: 68 65 6C 6C 6F 20 77 6F 72 6C 64                hello world }

(4) The COMPLETED suffix. After Flume finishes transferring a file, it renames the file by appending the .COMPLETED suffix (the suffix can also be customized in the configuration file):

[root@node1 ~]# ll /root/logs
total 4
-rw-r--r-- 1 root root 12 Dec 20 10:14 test.log.COMPLETED
[root@node1 ~]#
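
The spooling-directory source accepts several optional properties that control this renaming behavior. A hedged sketch (property names as documented for the Flume 1.8 Spooling Directory Source; the values shown are examples, not the settings used above):

```properties
# Optional spooldir source properties (example values)
a1.sources.r1.fileSuffix = .DONE      # suffix appended after ingestion; default is .COMPLETED
a1.sources.r1.deletePolicy = never    # "immediate" deletes files after ingestion instead of renaming
a1.sources.r1.fileHeader = true      # adds a "file" header with the absolute path, as seen in the event above
```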

5. Example 3: Syslogtcp

The syslogtcp source listens on a TCP port and uses incoming syslog messages as the data source.

(1) Create the agent configuration file syslogtcp.conf

[root@node1 flume-1.8.0]# vi conf/syslogtcp.conf
[root@node1 flume-1.8.0]# cat conf/syslogtcp.conf 
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Describe/configure the source
a1.sources.r1.type = syslogtcp
a1.sources.r1.port = 5140
a1.sources.r1.host = localhost
a1.sources.r1.channels = c1

# Describe the sink
a1.sinks.k1.type = logger

# Use a channel which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

# Bind the source and sink to the channel
a1.sources.r1.channels = c1
a1.sinks.k1.channel = c1
[root@node1 flume-1.8.0]#

(2) Start the Flume agent: bin/flume-ng agent -c /opt/flume-1.8.0/conf/ -f /opt/flume-1.8.0/conf/syslogtcp.conf -n a1 -Dflume.root.logger=INFO,console

[root@node1 flume-1.8.0]# bin/flume-ng agent -c /opt/flume-1.8.0/conf/ -f /opt/flume-1.8.0/conf/syslogtcp.conf -n a1 -Dflume.root.logger=INFO,console
Info: Sourcing environment configuration script /opt/flume-1.8.0/conf/flume-env.sh
Info: Including Hadoop libraries found via (/opt/hadoop-2.7.3/bin/hadoop) for HDFS access
Info: Including HBASE libraries found via (/opt/hbase-1.2.6/bin/hbase) for HBASE access
Info: Including Hive libraries found via () for Hive access
/*
* Note: this line was auto-commented by the publishing platform because it is too long and is not highlighted. One-click copy removes this auto-comment.
* + exec /opt/jdk1.8.0_112/bin/java -Xms100m -Xmx2000m -Dcom.sun.management.jmxremote -Dflume.root.logger=INFO,console -cp '/opt/flume-1.8.0/conf:/opt/flume-1.8.0/lib/*:/opt/hadoop-2.7.3/etc/hadoop:/opt/hadoop-2.7.3/share/hadoop/common/lib/*:/opt/hadoop-2.7.3/share/hadoop/common/*:/opt/hadoop-2.7.3/share/hadoop/hdfs:/opt/hadoop-2.7.3/share/hadoop/hdfs/lib/*:/opt/hadoop-2.7.3/share/hadoop/hdfs/*:/opt/hadoop-2.7.3/share/hadoop/yarn/lib/*:/opt/hadoop-2.7.3/share/hadoop/yarn/*:/opt/hadoop-2.7.3/share/hadoop/mapreduce/lib/*:/opt/hadoop-2.7.3/share/hadoop/mapreduce/*:/opt/hadoop-2.7.3/contrib/capacity-scheduler/*.jar:/opt/hbase-1.2.6/conf:/opt/jdk1.8.0_112/lib/tools.jar:/opt/hbase-1.2.6:/opt/hbase-1.2.6/lib/activation-1.1.jar:/opt/hbase-1.2.6/lib/aopalliance-1.0.jar:/opt/hbase-1.2.6/lib/apacheds-i18n-2.0.0-M15.jar:/opt/hbase-1.2.6/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/opt/hbase-1.2.6/lib/api-asn1-api-1.0.0-M20.jar:/opt/hbase-1.2.6/lib/api-util-1.0.0-M20.jar:/opt/hbase-1.2.6/lib/asm-3.1.jar:/opt/hbase-1.2.6/lib/avro-1.7.4.jar:/opt/hbase-1.2.6/lib/commons-beanutils-1.7.0.jar:/opt/hbase-1.2.6/lib/commons-beanutils-core-1.8.0.jar:/opt/hbase-1.2.6/lib/commons-cli-1.2.jar:/opt/hbase-1.2.6/lib/commons-codec-1.9.jar:/opt/hbase-1.2.6/lib/commons-collections-3.2.2.jar:/opt/hbase-1.2.6/lib/commons-compress-1.4.1.jar:/opt/hbase-1.2.6/lib/commons-configuration-1.6.jar:/opt/hbase-1.2.6/lib/commons-daemon-1.0.13.jar:/opt/hbase-1.2.6/lib/commons-digester-1.8.jar:/opt/hbase-1.2.6/lib/commons-el-1.0.jar:/opt/hbase-1.2.6/lib/commons-httpclient-3.1.jar:/opt/hbase-1.2.6/lib/commons-io-2.4.jar:/opt/hbase-1.2.6/lib/commons-lang-2.6.jar:/opt/hbase-1.2.6/lib/commons-logging-1.2.jar:/opt/hbase-1.2.6/lib/commons-math-2.2.jar:/opt/hbase-1.2.6/lib/commons-math3-3.1.1.jar:/opt/hbase-1.2.6/lib/commons-net-3.1.jar:/opt/hbase-1.2.6/lib/disruptor-3.3.0.jar:/opt/hbase-1.2.6/lib/findbugs-annotations-1.3.9-1.jar:/opt/hbase-1.2.6/lib/guava-12.0.1.jar:/opt/hbase-1.2.6/lib/guice-3.0.jar:/opt/hbase-1.2.6/li
b/guice-servlet-3.0.jar:/opt/hbase-1.2.6/lib/hadoop-annotations-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-auth-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-client-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-common-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-hdfs-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-mapreduce-client-app-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-mapreduce-client-common-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-mapreduce-client-core-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-mapreduce-client-jobclient-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-mapreduce-client-shuffle-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-yarn-api-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-yarn-client-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-yarn-common-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-yarn-server-common-2.5.1.jar:/opt/hbase-1.2.6/lib/hbase-annotations-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-annotations-1.2.6-tests.jar:/opt/hbase-1.2.6/lib/hbase-client-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-common-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-common-1.2.6-tests.jar:/opt/hbase-1.2.6/lib/hbase-examples-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-external-blockcache-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-hadoop2-compat-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-hadoop-compat-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-it-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-it-1.2.6-tests.jar:/opt/hbase-1.2.6/lib/hbase-prefix-tree-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-procedure-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-protocol-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-resource-bundle-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-rest-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-server-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-server-1.2.6-tests.jar:/opt/hbase-1.2.6/lib/hbase-shell-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-thrift-1.2.6.jar:/opt/hbase-1.2.6/lib/htrace-core-3.1.0-incubating.jar:/opt/hbase-1.2.6/lib/httpclient-4.2.5.jar:/opt/hbase-1.2.6/lib/httpcore-4.4.1.jar:/opt/hbase-1.2.6/lib/jackson-core-asl-1.9.13.jar:/opt/hbase-1.2.6/lib/jackson-jaxrs-1.9.13.jar:/opt/hbase-1.2.6/lib/jackson-mapper-asl-1.9.13.jar:/op
t/hbase-1.2.6/lib/jackson-xc-1.9.13.jar:/opt/hbase-1.2.6/lib/jamon-runtime-2.4.1.jar:/opt/hbase-1.2.6/lib/jasper-compiler-5.5.23.jar:/opt/hbase-1.2.6/lib/jasper-runtime-5.5.23.jar:/opt/hbase-1.2.6/lib/javax.inject-1.jar:/opt/hbase-1.2.6/lib/java-xmlbuilder-0.4.jar:/opt/hbase-1.2.6/lib/jaxb-api-2.2.2.jar:/opt/hbase-1.2.6/lib/jaxb-impl-2.2.3-1.jar:/opt/hbase-1.2.6/lib/jcodings-1.0.8.jar:/opt/hbase-1.2.6/lib/jersey-client-1.9.jar:/opt/hbase-1.2.6/lib/jersey-core-1.9.jar:/opt/hbase-1.2.6/lib/jersey-guice-1.9.jar:/opt/hbase-1.2.6/lib/jersey-json-1.9.jar:/opt/hbase-1.2.6/lib/jersey-server-1.9.jar:/opt/hbase-1.2.6/lib/jets3t-0.9.0.jar:/opt/hbase-1.2.6/lib/jettison-1.3.3.jar:/opt/hbase-1.2.6/lib/jetty-6.1.26.jar:/opt/hbase-1.2.6/lib/jetty-sslengine-6.1.26.jar:/opt/hbase-1.2.6/lib/jetty-util-6.1.26.jar:/opt/hbase-1.2.6/lib/joni-2.1.2.jar:/opt/hbase-1.2.6/lib/jruby-complete-1.6.8.jar:/opt/hbase-1.2.6/lib/jsch-0.1.42.jar:/opt/hbase-1.2.6/lib/jsp-2.1-6.1.14.jar:/opt/hbase-1.2.6/lib/jsp-api-2.1-6.1.14.jar:/opt/hbase-1.2.6/lib/junit-4.12.jar:/opt/hbase-1.2.6/lib/leveldbjni-all-1.8.jar:/opt/hbase-1.2.6/lib/libthrift-0.9.3.jar:/opt/hbase-1.2.6/lib/log4j-1.2.17.jar:/opt/hbase-1.2.6/lib/metrics-core-2.2.0.jar:/opt/hbase-1.2.6/lib/netty-all-4.0.23.Final.jar:/opt/hbase-1.2.6/lib/paranamer-2.3.jar:/opt/hbase-1.2.6/lib/protobuf-java-2.5.0.jar:/opt/hbase-1.2.6/lib/servlet-api-2.5-6.1.14.jar:/opt/hbase-1.2.6/lib/servlet-api-2.5.jar:/opt/hbase-1.2.6/lib/slf4j-api-1.7.7.jar:/opt/hbase-1.2.6/lib/slf4j-log4j12-1.7.5.jar:/opt/hbase-1.2.6/lib/snappy-java-1.0.4.1.jar:/opt/hbase-1.2.6/lib/spymemcached-2.11.6.jar:/opt/hbase-1.2.6/lib/xmlenc-0.52.jar:/opt/hbase-1.2.6/lib/xz-1.0.jar:/opt/hbase-1.2.6/lib/zookeeper-3.4.6.jar:/opt/hadoop-2.7.3/etc/hadoop:/opt/hadoop-2.7.3/share/hadoop/common/lib/*:/opt/hadoop-2.7.3/share/hadoop/common/*:/opt/hadoop-2.7.3/share/hadoop/hdfs:/opt/hadoop-2.7.3/share/hadoop/hdfs/lib/*:/opt/hadoop-2.7.3/share/hadoop/hdfs/*:/opt/hadoop-2.7.3/share/hadoop/yarn/lib/*:/opt/hadoop
-2.7.3/share/hadoop/yarn/*:/opt/hadoop-2.7.3/share/hadoop/mapreduce/lib/*:/opt/hadoop-2.7.3/share/hadoop/mapreduce/*:/opt/hadoop-2.7.3/contrib/capacity-scheduler/*.jar:/opt/hbase-1.2.6/conf:/lib/*' -Djava.library.path=:/opt/hadoop-2.7.3/lib/native:/opt/hadoop-2.7.3/lib/native org.apache.flume.node.Application -f /opt/flume-1.8.0/conf/syslogtcp.conf -n a1
*/
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/flume-1.8.0/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop-2.7.3/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hbase-1.2.6/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2017-12-21 07:23:50,152 (lifecycleSupervisor-1-0) [INFO - org.apache.flume.node.PollingPropertiesFileConfigurationProvider.start(PollingPropertiesFileConfigurationProvider.java:62)] Configuration provider starting
2017-12-21 07:23:50,165 (conf-file-poller-0) [INFO - org.apache.flume.node.PollingPropertiesFileConfigurationProvider$FileWatcherRunnable.run(PollingPropertiesFileConfigurationProvider.java:134)] Reloading configuration file:/opt/flume-1.8.0/conf/syslogtcp.conf
2017-12-21 07:23:50,184 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty(FlumeConfiguration.java:930)] Added sinks: k1 Agent: a1
2017-12-21 07:23:50,184 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty(FlumeConfiguration.java:1016)] Processing:k1
2017-12-21 07:23:50,184 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty(FlumeConfiguration.java:1016)] Processing:k1
2017-12-21 07:23:50,206 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration.validateConfiguration(FlumeConfiguration.java:140)] Post-validation flume configuration contains configuration for agents: [a1]
2017-12-21 07:23:50,206 (conf-file-poller-0) [INFO - org.apache.flume.node.AbstractConfigurationProvider.loadChannels(AbstractConfigurationProvider.java:147)] Creating channels
2017-12-21 07:23:50,245 (conf-file-poller-0) [INFO - org.apache.flume.channel.DefaultChannelFactory.create(DefaultChannelFactory.java:42)] Creating instance of channel c1 type memory
2017-12-21 07:23:50,252 (conf-file-poller-0) [INFO - org.apache.flume.node.AbstractConfigurationProvider.loadChannels(AbstractConfigurationProvider.java:201)] Created channel c1
2017-12-21 07:23:50,253 (conf-file-poller-0) [INFO - org.apache.flume.source.DefaultSourceFactory.create(DefaultSourceFactory.java:41)] Creating instance of source r1, type syslogtcp
2017-12-21 07:23:50,272 (conf-file-poller-0) [INFO - org.apache.flume.sink.DefaultSinkFactory.create(DefaultSinkFactory.java:42)] Creating instance of sink: k1, type: logger
2017-12-21 07:23:50,277 (conf-file-poller-0) [INFO - org.apache.flume.node.AbstractConfigurationProvider.getConfiguration(AbstractConfigurationProvider.java:116)] Channel c1 connected to [r1, k1]
2017-12-21 07:23:50,295 (conf-file-poller-0) [INFO - org.apache.flume.node.Application.startAllComponents(Application.java:137)] Starting new configuration:{ sourceRunners:{r1=EventDrivenSourceRunner: { source:org.apache.flume.source.SyslogTcpSource{name:r1,state:IDLE} }} sinkRunners:{k1=SinkRunner: { policy:org.apache.flume.sink.DefaultSinkProcessor@7a7b110d counterGroup:{ name:null counters:{} } }} channels:{c1=org.apache.flume.channel.MemoryChannel{name: c1}} }
2017-12-21 07:23:50,334 (conf-file-poller-0) [INFO - org.apache.flume.node.Application.startAllComponents(Application.java:144)] Starting Channel c1
2017-12-21 07:23:50,335 (conf-file-poller-0) [INFO - org.apache.flume.node.Application.startAllComponents(Application.java:159)] Waiting for channel: c1 to start. Sleeping for 500 ms
2017-12-21 07:23:50,337 (lifecycleSupervisor-1-0) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.register(MonitoredCounterGroup.java:119)] Monitored counter group for type: CHANNEL, name: c1: Successfully registered new MBean.
2017-12-21 07:23:50,340 (lifecycleSupervisor-1-0) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.start(MonitoredCounterGroup.java:95)] Component type: CHANNEL, name: c1 started
2017-12-21 07:23:50,835 (conf-file-poller-0) [INFO - org.apache.flume.node.Application.startAllComponents(Application.java:171)] Starting Sink k1
2017-12-21 07:23:50,837 (conf-file-poller-0) [INFO - org.apache.flume.node.Application.startAllComponents(Application.java:182)] Starting Source r1
2017-12-21 07:23:51,005 (lifecycleSupervisor-1-4) [INFO - org.apache.flume.source.SyslogTcpSource.start(SyslogTcpSource.java:125)] Syslog TCP Source starting...
2017-12-21 07:23:51,030 (lifecycleSupervisor-1-4) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.register(MonitoredCounterGroup.java:119)] Monitored counter group for type: SOURCE, name: r1: Successfully registered new MBean.
2017-12-21 07:23:51,030 (lifecycleSupervisor-1-4) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.start(MonitoredCounterGroup.java:95)] Component type: SOURCE, name: r1 started

(3) Send data

[root@node1 ~]# echo "hello world" | nc localhost 5140

(4) Console output

2017-12-21 07:23:51,030 (lifecycleSupervisor-1-4) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.start(MonitoredCounterGroup.java:95)] Component type: SOURCE, name: r1 started
2017-12-21 07:25:30,542 (New I/O worker #1) [WARN - org.apache.flume.source.SyslogUtils.buildEvent(SyslogUtils.java:317)] Event created from Invalid Syslog data.
2017-12-21 07:25:32,999 (SinkRunner-PollingRunner-DefaultSinkProcessor) [INFO - org.apache.flume.sink.LoggerSink.process(LoggerSink.java:95)] Event: { headers:{Severity=0, Facility=0, flume.syslog.status=Invalid} body: 68 65 6C 6C 6F 20 77 6F 72 6C 64                hello world }
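
The "Event created from Invalid Syslog data" warning appears because the plain string sent above carries no RFC 3164 syslog header; Flume still delivers the body, but tags the event with flume.syslog.status=Invalid. A minimal sketch of composing a well-formed message (the facility/severity choice, timestamp, and hostname here are illustrative assumptions):

```shell
# RFC 3164 priority = facility * 8 + severity.
# Facility 1 (user-level) and severity 5 (notice) give PRI 13.
pri=$(( 1 * 8 + 5 ))
msg="<$pri>Dec 21 07:30:00 node1 test: hello flume"
echo "$msg"
# Send it to the running agent (requires nc, as in step (3)):
#   echo "$msg" | nc localhost 5140
```

With a valid &lt;PRI&gt; header, the source parses Facility and Severity into event headers instead of flagging the event as Invalid.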

6. Example 4: Exec

The exec source runs a given command and consumes its output. The following example monitors the file /root/test.log in real time with tail -F and writes the collected events to the /tmp directory.

(1) Create the agent configuration file exec.conf

[root@node1 flume-1.8.0]# vi conf/exec.conf
[root@node1 flume-1.8.0]# cat conf/exec.conf 
# Describe the agent
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Describe the source
a1.sources.r1.type = exec
a1.sources.r1.shell = /bin/bash -c
a1.sources.r1.channels = c1
a1.sources.r1.command = tail -F /root/test.log

# Use a channel which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

# Describe the sink
a1.sinks.k1.type = file_roll
a1.sinks.k1.channel = c1
a1.sinks.k1.sink.directory = /tmp
[root@node1 flume-1.8.0]#
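
Two optional tuning points are worth noting for this configuration (property names as documented for Flume 1.8; the values are examples): the file_roll sink rolls to a new output file every 30 seconds by default, which can litter /tmp with small files, and the exec source does not restart tail -F if the command dies unless told to.

```properties
# Optional tuning (example values)
a1.sinks.k1.sink.rollInterval = 60     # roll every 60 s instead of the default 30 s; 0 disables rolling
a1.sources.r1.restart = true           # restart the command if it exits
a1.sources.r1.restartThrottle = 10000  # wait 10 s (in ms) before restarting
```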

(2) Start the agent: bin/flume-ng agent -c /opt/flume-1.8.0/conf/ -f /opt/flume-1.8.0/conf/exec.conf -n a1 -Dflume.root.logger=INFO,console

[root@node1 flume-1.8.0]# bin/flume-ng agent -c /opt/flume-1.8.0/conf/ -f /opt/flume-1.8.0/conf/exec.conf -n a1 -Dflume.root.logger=INFO,console
Info: Sourcing environment configuration script /opt/flume-1.8.0/conf/flume-env.sh
Info: Including Hadoop libraries found via (/opt/hadoop-2.7.3/bin/hadoop) for HDFS access
Info: Including HBASE libraries found via (/opt/hbase-1.2.6/bin/hbase) for HBASE access
Info: Including Hive libraries found via () for Hive access
/*
* Note: this line was auto-commented by the publishing platform because it is too long and is not highlighted. One-click copy removes this auto-comment.
* + exec /opt/jdk1.8.0_112/bin/java -Xms100m -Xmx2000m -Dcom.sun.management.jmxremote -Dflume.root.logger=INFO,console -cp '/opt/flume-1.8.0/conf:/opt/flume-1.8.0/lib/*:/opt/hadoop-2.7.3/etc/hadoop:/opt/hadoop-2.7.3/share/hadoop/common/lib/*:/opt/hadoop-2.7.3/share/hadoop/common/*:/opt/hadoop-2.7.3/share/hadoop/hdfs:/opt/hadoop-2.7.3/share/hadoop/hdfs/lib/*:/opt/hadoop-2.7.3/share/hadoop/hdfs/*:/opt/hadoop-2.7.3/share/hadoop/yarn/lib/*:/opt/hadoop-2.7.3/share/hadoop/yarn/*:/opt/hadoop-2.7.3/share/hadoop/mapreduce/lib/*:/opt/hadoop-2.7.3/share/hadoop/mapreduce/*:/opt/hadoop-2.7.3/contrib/capacity-scheduler/*.jar:/opt/hbase-1.2.6/conf:/opt/jdk1.8.0_112/lib/tools.jar:/opt/hbase-1.2.6:/opt/hbase-1.2.6/lib/activation-1.1.jar:/opt/hbase-1.2.6/lib/aopalliance-1.0.jar:/opt/hbase-1.2.6/lib/apacheds-i18n-2.0.0-M15.jar:/opt/hbase-1.2.6/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/opt/hbase-1.2.6/lib/api-asn1-api-1.0.0-M20.jar:/opt/hbase-1.2.6/lib/api-util-1.0.0-M20.jar:/opt/hbase-1.2.6/lib/asm-3.1.jar:/opt/hbase-1.2.6/lib/avro-1.7.4.jar:/opt/hbase-1.2.6/lib/commons-beanutils-1.7.0.jar:/opt/hbase-1.2.6/lib/commons-beanutils-core-1.8.0.jar:/opt/hbase-1.2.6/lib/commons-cli-1.2.jar:/opt/hbase-1.2.6/lib/commons-codec-1.9.jar:/opt/hbase-1.2.6/lib/commons-collections-3.2.2.jar:/opt/hbase-1.2.6/lib/commons-compress-1.4.1.jar:/opt/hbase-1.2.6/lib/commons-configuration-1.6.jar:/opt/hbase-1.2.6/lib/commons-daemon-1.0.13.jar:/opt/hbase-1.2.6/lib/commons-digester-1.8.jar:/opt/hbase-1.2.6/lib/commons-el-1.0.jar:/opt/hbase-1.2.6/lib/commons-httpclient-3.1.jar:/opt/hbase-1.2.6/lib/commons-io-2.4.jar:/opt/hbase-1.2.6/lib/commons-lang-2.6.jar:/opt/hbase-1.2.6/lib/commons-logging-1.2.jar:/opt/hbase-1.2.6/lib/commons-math-2.2.jar:/opt/hbase-1.2.6/lib/commons-math3-3.1.1.jar:/opt/hbase-1.2.6/lib/commons-net-3.1.jar:/opt/hbase-1.2.6/lib/disruptor-3.3.0.jar:/opt/hbase-1.2.6/lib/findbugs-annotations-1.3.9-1.jar:/opt/hbase-1.2.6/lib/guava-12.0.1.jar:/opt/hbase-1.2.6/lib/guice-3.0.jar:/opt/hbase-1.2.6/li
b/guice-servlet-3.0.jar:/opt/hbase-1.2.6/lib/hadoop-annotations-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-auth-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-client-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-common-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-hdfs-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-mapreduce-client-app-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-mapreduce-client-common-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-mapreduce-client-core-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-mapreduce-client-jobclient-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-mapreduce-client-shuffle-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-yarn-api-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-yarn-client-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-yarn-common-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-yarn-server-common-2.5.1.jar:/opt/hbase-1.2.6/lib/hbase-annotations-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-annotations-1.2.6-tests.jar:/opt/hbase-1.2.6/lib/hbase-client-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-common-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-common-1.2.6-tests.jar:/opt/hbase-1.2.6/lib/hbase-examples-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-external-blockcache-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-hadoop2-compat-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-hadoop-compat-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-it-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-it-1.2.6-tests.jar:/opt/hbase-1.2.6/lib/hbase-prefix-tree-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-procedure-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-protocol-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-resource-bundle-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-rest-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-server-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-server-1.2.6-tests.jar:/opt/hbase-1.2.6/lib/hbase-shell-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-thrift-1.2.6.jar:/opt/hbase-1.2.6/lib/htrace-core-3.1.0-incubating.jar:/opt/hbase-1.2.6/lib/httpclient-4.2.5.jar:/opt/hbase-1.2.6/lib/httpcore-4.4.1.jar:/opt/hbase-1.2.6/lib/jackson-core-asl-1.9.13.jar:/opt/hbase-1.2.6/lib/jackson-jaxrs-1.9.13.jar:/opt/hbase-1.2.6/lib/jackson-mapper-asl-1.9.13.jar:/op
t/hbase-1.2.6/lib/jackson-xc-1.9.13.jar:/opt/hbase-1.2.6/lib/jamon-runtime-2.4.1.jar:/opt/hbase-1.2.6/lib/jasper-compiler-5.5.23.jar:/opt/hbase-1.2.6/lib/jasper-runtime-5.5.23.jar:/opt/hbase-1.2.6/lib/javax.inject-1.jar:/opt/hbase-1.2.6/lib/java-xmlbuilder-0.4.jar:/opt/hbase-1.2.6/lib/jaxb-api-2.2.2.jar:/opt/hbase-1.2.6/lib/jaxb-impl-2.2.3-1.jar:/opt/hbase-1.2.6/lib/jcodings-1.0.8.jar:/opt/hbase-1.2.6/lib/jersey-client-1.9.jar:/opt/hbase-1.2.6/lib/jersey-core-1.9.jar:/opt/hbase-1.2.6/lib/jersey-guice-1.9.jar:/opt/hbase-1.2.6/lib/jersey-json-1.9.jar:/opt/hbase-1.2.6/lib/jersey-server-1.9.jar:/opt/hbase-1.2.6/lib/jets3t-0.9.0.jar:/opt/hbase-1.2.6/lib/jettison-1.3.3.jar:/opt/hbase-1.2.6/lib/jetty-6.1.26.jar:/opt/hbase-1.2.6/lib/jetty-sslengine-6.1.26.jar:/opt/hbase-1.2.6/lib/jetty-util-6.1.26.jar:/opt/hbase-1.2.6/lib/joni-2.1.2.jar:/opt/hbase-1.2.6/lib/jruby-complete-1.6.8.jar:/opt/hbase-1.2.6/lib/jsch-0.1.42.jar:/opt/hbase-1.2.6/lib/jsp-2.1-6.1.14.jar:/opt/hbase-1.2.6/lib/jsp-api-2.1-6.1.14.jar:/opt/hbase-1.2.6/lib/junit-4.12.jar:/opt/hbase-1.2.6/lib/leveldbjni-all-1.8.jar:/opt/hbase-1.2.6/lib/libthrift-0.9.3.jar:/opt/hbase-1.2.6/lib/log4j-1.2.17.jar:/opt/hbase-1.2.6/lib/metrics-core-2.2.0.jar:/opt/hbase-1.2.6/lib/netty-all-4.0.23.Final.jar:/opt/hbase-1.2.6/lib/paranamer-2.3.jar:/opt/hbase-1.2.6/lib/protobuf-java-2.5.0.jar:/opt/hbase-1.2.6/lib/servlet-api-2.5-6.1.14.jar:/opt/hbase-1.2.6/lib/servlet-api-2.5.jar:/opt/hbase-1.2.6/lib/slf4j-api-1.7.7.jar:/opt/hbase-1.2.6/lib/slf4j-log4j12-1.7.5.jar:/opt/hbase-1.2.6/lib/snappy-java-1.0.4.1.jar:/opt/hbase-1.2.6/lib/spymemcached-2.11.6.jar:/opt/hbase-1.2.6/lib/xmlenc-0.52.jar:/opt/hbase-1.2.6/lib/xz-1.0.jar:/opt/hbase-1.2.6/lib/zookeeper-3.4.6.jar:/opt/hadoop-2.7.3/etc/hadoop:/opt/hadoop-2.7.3/share/hadoop/common/lib/*:/opt/hadoop-2.7.3/share/hadoop/common/*:/opt/hadoop-2.7.3/share/hadoop/hdfs:/opt/hadoop-2.7.3/share/hadoop/hdfs/lib/*:/opt/hadoop-2.7.3/share/hadoop/hdfs/*:/opt/hadoop-2.7.3/share/hadoop/yarn/lib/*:/opt/hadoop
-2.7.3/share/hadoop/yarn/*:/opt/hadoop-2.7.3/share/hadoop/mapreduce/lib/*:/opt/hadoop-2.7.3/share/hadoop/mapreduce/*:/opt/hadoop-2.7.3/contrib/capacity-scheduler/*.jar:/opt/hbase-1.2.6/conf:/lib/*' -Djava.library.path=:/opt/hadoop-2.7.3/lib/native:/opt/hadoop-2.7.3/lib/native org.apache.flume.node.Application -f /opt/flume-1.8.0/conf/exec.conf -n a1
*/
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/flume-1.8.0/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop-2.7.3/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hbase-1.2.6/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2017-12-21 07:41:19,266 (lifecycleSupervisor-1-0) [INFO - org.apache.flume.node.PollingPropertiesFileConfigurationProvider.start(PollingPropertiesFileConfigurationProvider.java:62)] Configuration provider starting
2017-12-21 07:41:19,274 (conf-file-poller-0) [INFO - org.apache.flume.node.PollingPropertiesFileConfigurationProvider$FileWatcherRunnable.run(PollingPropertiesFileConfigurationProvider.java:134)] Reloading configuration file:/opt/flume-1.8.0/conf/exec.conf
2017-12-21 07:41:19,279 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty(FlumeConfiguration.java:1016)] Processing:k1
2017-12-21 07:41:19,279 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty(FlumeConfiguration.java:930)] Added sinks: k1 Agent: a1
2017-12-21 07:41:19,280 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty(FlumeConfiguration.java:1016)] Processing:k1
2017-12-21 07:41:19,280 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty(FlumeConfiguration.java:1016)] Processing:k1
2017-12-21 07:41:19,297 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration.validateConfiguration(FlumeConfiguration.java:140)] Post-validation flume configuration contains configuration for agents: [a1]
2017-12-21 07:41:19,297 (conf-file-poller-0) [INFO - org.apache.flume.node.AbstractConfigurationProvider.loadChannels(AbstractConfigurationProvider.java:147)] Creating channels
2017-12-21 07:41:19,308 (conf-file-poller-0) [INFO - org.apache.flume.channel.DefaultChannelFactory.create(DefaultChannelFactory.java:42)] Creating instance of channel c1 type memory
2017-12-21 07:41:19,313 (conf-file-poller-0) [INFO - org.apache.flume.node.AbstractConfigurationProvider.loadChannels(AbstractConfigurationProvider.java:201)] Created channel c1
2017-12-21 07:41:19,314 (conf-file-poller-0) [INFO - org.apache.flume.source.DefaultSourceFactory.create(DefaultSourceFactory.java:41)] Creating instance of source r1, type exec
2017-12-21 07:41:19,324 (conf-file-poller-0) [INFO - org.apache.flume.sink.DefaultSinkFactory.create(DefaultSinkFactory.java:42)] Creating instance of sink: k1, type: file_roll
2017-12-21 07:41:19,338 (conf-file-poller-0) [INFO - org.apache.flume.node.AbstractConfigurationProvider.getConfiguration(AbstractConfigurationProvider.java:116)] Channel c1 connected to [r1, k1]
2017-12-21 07:41:19,345 (conf-file-poller-0) [INFO - org.apache.flume.node.Application.startAllComponents(Application.java:137)] Starting new configuration:{ sourceRunners:{r1=EventDrivenSourceRunner: { source:org.apache.flume.source.ExecSource{name:r1,state:IDLE} }} sinkRunners:{k1=SinkRunner: { policy:org.apache.flume.sink.DefaultSinkProcessor@1e4b4b8d counterGroup:{ name:null counters:{} } }} channels:{c1=org.apache.flume.channel.MemoryChannel{name: c1}} }
2017-12-21 07:41:19,360 (conf-file-poller-0) [INFO - org.apache.flume.node.Application.startAllComponents(Application.java:144)] Starting Channel c1
2017-12-21 07:41:19,363 (lifecycleSupervisor-1-0) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.register(MonitoredCounterGroup.java:119)] Monitored counter group for type: CHANNEL, name: c1: Successfully registered new MBean.
2017-12-21 07:41:19,370 (lifecycleSupervisor-1-0) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.start(MonitoredCounterGroup.java:95)] Component type: CHANNEL, name: c1 started
2017-12-21 07:41:19,389 (conf-file-poller-0) [INFO - org.apache.flume.node.Application.startAllComponents(Application.java:171)] Starting Sink k1
2017-12-21 07:41:19,391 (conf-file-poller-0) [INFO - org.apache.flume.node.Application.startAllComponents(Application.java:182)] Starting Source r1
2017-12-21 07:41:19,392 (lifecycleSupervisor-1-3) [INFO - org.apache.flume.sink.RollingFileSink.start(RollingFileSink.java:110)] Starting org.apache.flume.sink.RollingFileSink{name:k1, channel:c1}...
2017-12-21 07:41:19,393 (lifecycleSupervisor-1-3) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.register(MonitoredCounterGroup.java:119)] Monitored counter group for type: SINK, name: k1: Successfully registered new MBean.
2017-12-21 07:41:19,394 (lifecycleSupervisor-1-3) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.start(MonitoredCounterGroup.java:95)] Component type: SINK, name: k1 started
2017-12-21 07:41:19,399 (lifecycleSupervisor-1-3) [INFO - org.apache.flume.sink.RollingFileSink.start(RollingFileSink.java:142)] RollingFileSink k1 started.
2017-12-21 07:41:19,406 (lifecycleSupervisor-1-1) [INFO - org.apache.flume.source.ExecSource.start(ExecSource.java:168)] Exec source starting with command: tail -F /root/test.log
2017-12-21 07:41:19,407 (lifecycleSupervisor-1-1) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.register(MonitoredCounterGroup.java:119)] Monitored counter group for type: SOURCE, name: r1: Successfully registered new MBean.
2017-12-21 07:41:19,408 (lifecycleSupervisor-1-1) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.start(MonitoredCounterGroup.java:95)] Component type: SOURCE, name: r1 started

(3)Generate data in real time. Open another terminal and use a script to write lines to /root/test.log.

Write a script that appends one line to /root/test.log every second:

[root@node1 ~]# vi createData.sh
[root@node1 ~]# cat createData.sh
#!/bin/bash
for i in {1..1000}
do 
   sleep 1
   echo "exec$i" >> /root/test.log
done
[root@node1 ~]# 

Run the script:

[root@node1 ~]# sh createData.sh

(4)View the data collected in real time

[root@node1 tmp]# ll
total 4
-rw-r--r-- 1 root root  1 Dec 21 07:57 1513861046584-1
drwxr-xr-x 2 root root 18 Dec 21 07:57 hsperfdata_root
[root@node1 tmp]# ll
total 4
-rw-r--r-- 1 root root 25 Dec 21 07:57 1513861046584-1
-rw-r--r-- 1 root root  0 Dec 21 07:57 1513861046584-2
drwxr-xr-x 2 root root 18 Dec 21 07:57 hsperfdata_root
[root@node1 tmp]# ll
total 8
-rw-r--r-- 1 root root 25 Dec 21 07:57 1513861046584-1
-rw-r--r-- 1 root root 58 Dec 21 07:58 1513861046584-2
drwxr-xr-x 2 root root 18 Dec 21 07:57 hsperfdata_root
[root@node1 tmp]# ll
total 8
-rw-r--r-- 1 root root  25 Dec 21 07:57 1513861046584-1
-rw-r--r-- 1 root root 121 Dec 21 07:58 1513861046584-2
drwxr-xr-x 2 root root  18 Dec 21 07:57 hsperfdata_root
[root@node1 tmp]# cat 1513861046584-1
exec1
exec2
exec3
exec4
[root@node1 tmp]# 
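For reference, the rolled files above follow the file_roll sink's default naming scheme: a millisecond timestamp taken when the sink starts, a dash, and a serial number that increments on every roll (by default every 30 seconds, even when no events arrived, which is why empty files can appear). A minimal sketch of that naming, assuming this behavior (the function name is mine, not Flume's):

```python
import itertools


def file_roll_names(start_millis, count):
    """Mimic the file_roll sink's naming: a fixed timestamp prefix taken
    once at sink startup, plus a serial incremented on every roll."""
    serial = itertools.count(1)
    return [f"{start_millis}-{next(serial)}" for _ in range(count)]


names = file_roll_names(1513861046584, 3)
print(names)  # ['1513861046584-1', '1513861046584-2', '1513861046584-3']
```

This matches the `1513861046584-1`, `1513861046584-2` files listed above: the prefix stays constant for the life of the agent and only the suffix changes.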

7、Example 5: Writing data to HDFS

(1)Create an HDFS directory

[root@node1 ~]# hdfs dfs -mkdir /flume

(2)Create hdfs.conf

[root@node1 flume-1.8.0]# vi conf/hdfs.conf 
[root@node1 flume-1.8.0]# cat conf/hdfs.conf 
# Describe the agent
a1.sources = r1
a1.sinks = k1
a1.channels = c1

# Describe the source
a1.sources.r1.type = exec
a1.sources.r1.shell = /bin/bash -c
a1.sources.r1.channels = c1
a1.sources.r1.command = tail -F /root/test.log


# Use a channel which buffers events in memory
a1.channels.c1.type = memory
a1.channels.c1.capacity = 1000
a1.channels.c1.transactionCapacity = 100

# sink
a1.sinks.k1.type = hdfs
a1.sinks.k1.channel = c1
a1.sinks.k1.hdfs.path = hdfs://cetc/flume
a1.sinks.k1.hdfs.filePrefix = Syslog
a1.sinks.k1.hdfs.round = true
a1.sinks.k1.hdfs.roundValue = 10
a1.sinks.k1.hdfs.roundUnit = minute
[root@node1 flume-1.8.0]# 
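One caveat on the config above: hdfs.round, hdfs.roundValue, and hdfs.roundUnit only take effect when hdfs.path contains time escape sequences such as %Y%m%d/%H%M; with the literal path hdfs://cetc/flume they change nothing. When escapes are present, the sink rounds the event timestamp down to the nearest 10 minutes before expanding the path. A rough sketch of that rounding, under that assumption (the helper name is mine):

```python
def round_down(epoch_millis, round_value=10):
    """Round a millisecond timestamp down to the nearest `round_value`
    minutes, as hdfs.round = true does for time escapes in hdfs.path
    when hdfs.roundUnit = minute."""
    bucket = round_value * 60 * 1000  # bucket width in milliseconds
    return (epoch_millis // bucket) * bucket


# e.g. minute 11 falls into the minute-10 bucket when rounding by 10 minutes
assert round_down(11 * 60 * 1000) == 10 * 60 * 1000
```

So with a path like `hdfs://cetc/flume/%Y%m%d/%H%M`, all events in the same 10-minute window would land in one directory.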

(3)Start the Flume agent

[root@node1 flume-1.8.0]# bin/flume-ng agent -c /opt/flume-1.8.0/conf/ -f /opt/flume-1.8.0/conf/hdfs.conf -n a1 -Dflume.root.logger=INFO,console
Info: Sourcing environment configuration script /opt/flume-1.8.0/conf/flume-env.sh
Info: Including Hadoop libraries found via (/opt/hadoop-2.7.3/bin/hadoop) for HDFS access
Info: Including HBASE libraries found via (/opt/hbase-1.2.6/bin/hbase) for HBASE access
Info: Including Hive libraries found via () for Hive access
/*
* Note: this line is too long, so the site automatically commented it out and skips syntax highlighting. One-click copy removes this system comment.
* + exec /opt/jdk1.8.0_112/bin/java -Xms100m -Xmx2000m -Dcom.sun.management.jmxremote -Dflume.root.logger=INFO,console -cp '/opt/flume-1.8.0/conf:/opt/flume-1.8.0/lib/*:/opt/hadoop-2.7.3/etc/hadoop:/opt/hadoop-2.7.3/share/hadoop/common/lib/*:/opt/hadoop-2.7.3/share/hadoop/common/*:/opt/hadoop-2.7.3/share/hadoop/hdfs:/opt/hadoop-2.7.3/share/hadoop/hdfs/lib/*:/opt/hadoop-2.7.3/share/hadoop/hdfs/*:/opt/hadoop-2.7.3/share/hadoop/yarn/lib/*:/opt/hadoop-2.7.3/share/hadoop/yarn/*:/opt/hadoop-2.7.3/share/hadoop/mapreduce/lib/*:/opt/hadoop-2.7.3/share/hadoop/mapreduce/*:/opt/hadoop-2.7.3/contrib/capacity-scheduler/*.jar:/opt/hbase-1.2.6/conf:/opt/jdk1.8.0_112/lib/tools.jar:/opt/hbase-1.2.6:/opt/hbase-1.2.6/lib/activation-1.1.jar:/opt/hbase-1.2.6/lib/aopalliance-1.0.jar:/opt/hbase-1.2.6/lib/apacheds-i18n-2.0.0-M15.jar:/opt/hbase-1.2.6/lib/apacheds-kerberos-codec-2.0.0-M15.jar:/opt/hbase-1.2.6/lib/api-asn1-api-1.0.0-M20.jar:/opt/hbase-1.2.6/lib/api-util-1.0.0-M20.jar:/opt/hbase-1.2.6/lib/asm-3.1.jar:/opt/hbase-1.2.6/lib/avro-1.7.4.jar:/opt/hbase-1.2.6/lib/commons-beanutils-1.7.0.jar:/opt/hbase-1.2.6/lib/commons-beanutils-core-1.8.0.jar:/opt/hbase-1.2.6/lib/commons-cli-1.2.jar:/opt/hbase-1.2.6/lib/commons-codec-1.9.jar:/opt/hbase-1.2.6/lib/commons-collections-3.2.2.jar:/opt/hbase-1.2.6/lib/commons-compress-1.4.1.jar:/opt/hbase-1.2.6/lib/commons-configuration-1.6.jar:/opt/hbase-1.2.6/lib/commons-daemon-1.0.13.jar:/opt/hbase-1.2.6/lib/commons-digester-1.8.jar:/opt/hbase-1.2.6/lib/commons-el-1.0.jar:/opt/hbase-1.2.6/lib/commons-httpclient-3.1.jar:/opt/hbase-1.2.6/lib/commons-io-2.4.jar:/opt/hbase-1.2.6/lib/commons-lang-2.6.jar:/opt/hbase-1.2.6/lib/commons-logging-1.2.jar:/opt/hbase-1.2.6/lib/commons-math-2.2.jar:/opt/hbase-1.2.6/lib/commons-math3-3.1.1.jar:/opt/hbase-1.2.6/lib/commons-net-3.1.jar:/opt/hbase-1.2.6/lib/disruptor-3.3.0.jar:/opt/hbase-1.2.6/lib/findbugs-annotations-1.3.9-1.jar:/opt/hbase-1.2.6/lib/guava-12.0.1.jar:/opt/hbase-1.2.6/lib/guice-3.0.jar:/opt/hbase-1.2.6/li
b/guice-servlet-3.0.jar:/opt/hbase-1.2.6/lib/hadoop-annotations-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-auth-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-client-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-common-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-hdfs-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-mapreduce-client-app-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-mapreduce-client-common-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-mapreduce-client-core-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-mapreduce-client-jobclient-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-mapreduce-client-shuffle-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-yarn-api-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-yarn-client-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-yarn-common-2.5.1.jar:/opt/hbase-1.2.6/lib/hadoop-yarn-server-common-2.5.1.jar:/opt/hbase-1.2.6/lib/hbase-annotations-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-annotations-1.2.6-tests.jar:/opt/hbase-1.2.6/lib/hbase-client-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-common-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-common-1.2.6-tests.jar:/opt/hbase-1.2.6/lib/hbase-examples-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-external-blockcache-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-hadoop2-compat-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-hadoop-compat-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-it-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-it-1.2.6-tests.jar:/opt/hbase-1.2.6/lib/hbase-prefix-tree-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-procedure-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-protocol-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-resource-bundle-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-rest-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-server-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-server-1.2.6-tests.jar:/opt/hbase-1.2.6/lib/hbase-shell-1.2.6.jar:/opt/hbase-1.2.6/lib/hbase-thrift-1.2.6.jar:/opt/hbase-1.2.6/lib/htrace-core-3.1.0-incubating.jar:/opt/hbase-1.2.6/lib/httpclient-4.2.5.jar:/opt/hbase-1.2.6/lib/httpcore-4.4.1.jar:/opt/hbase-1.2.6/lib/jackson-core-asl-1.9.13.jar:/opt/hbase-1.2.6/lib/jackson-jaxrs-1.9.13.jar:/opt/hbase-1.2.6/lib/jackson-mapper-asl-1.9.13.jar:/op
t/hbase-1.2.6/lib/jackson-xc-1.9.13.jar:/opt/hbase-1.2.6/lib/jamon-runtime-2.4.1.jar:/opt/hbase-1.2.6/lib/jasper-compiler-5.5.23.jar:/opt/hbase-1.2.6/lib/jasper-runtime-5.5.23.jar:/opt/hbase-1.2.6/lib/javax.inject-1.jar:/opt/hbase-1.2.6/lib/java-xmlbuilder-0.4.jar:/opt/hbase-1.2.6/lib/jaxb-api-2.2.2.jar:/opt/hbase-1.2.6/lib/jaxb-impl-2.2.3-1.jar:/opt/hbase-1.2.6/lib/jcodings-1.0.8.jar:/opt/hbase-1.2.6/lib/jersey-client-1.9.jar:/opt/hbase-1.2.6/lib/jersey-core-1.9.jar:/opt/hbase-1.2.6/lib/jersey-guice-1.9.jar:/opt/hbase-1.2.6/lib/jersey-json-1.9.jar:/opt/hbase-1.2.6/lib/jersey-server-1.9.jar:/opt/hbase-1.2.6/lib/jets3t-0.9.0.jar:/opt/hbase-1.2.6/lib/jettison-1.3.3.jar:/opt/hbase-1.2.6/lib/jetty-6.1.26.jar:/opt/hbase-1.2.6/lib/jetty-sslengine-6.1.26.jar:/opt/hbase-1.2.6/lib/jetty-util-6.1.26.jar:/opt/hbase-1.2.6/lib/joni-2.1.2.jar:/opt/hbase-1.2.6/lib/jruby-complete-1.6.8.jar:/opt/hbase-1.2.6/lib/jsch-0.1.42.jar:/opt/hbase-1.2.6/lib/jsp-2.1-6.1.14.jar:/opt/hbase-1.2.6/lib/jsp-api-2.1-6.1.14.jar:/opt/hbase-1.2.6/lib/junit-4.12.jar:/opt/hbase-1.2.6/lib/leveldbjni-all-1.8.jar:/opt/hbase-1.2.6/lib/libthrift-0.9.3.jar:/opt/hbase-1.2.6/lib/log4j-1.2.17.jar:/opt/hbase-1.2.6/lib/metrics-core-2.2.0.jar:/opt/hbase-1.2.6/lib/netty-all-4.0.23.Final.jar:/opt/hbase-1.2.6/lib/paranamer-2.3.jar:/opt/hbase-1.2.6/lib/protobuf-java-2.5.0.jar:/opt/hbase-1.2.6/lib/servlet-api-2.5-6.1.14.jar:/opt/hbase-1.2.6/lib/servlet-api-2.5.jar:/opt/hbase-1.2.6/lib/slf4j-api-1.7.7.jar:/opt/hbase-1.2.6/lib/slf4j-log4j12-1.7.5.jar:/opt/hbase-1.2.6/lib/snappy-java-1.0.4.1.jar:/opt/hbase-1.2.6/lib/spymemcached-2.11.6.jar:/opt/hbase-1.2.6/lib/xmlenc-0.52.jar:/opt/hbase-1.2.6/lib/xz-1.0.jar:/opt/hbase-1.2.6/lib/zookeeper-3.4.6.jar:/opt/hadoop-2.7.3/etc/hadoop:/opt/hadoop-2.7.3/share/hadoop/common/lib/*:/opt/hadoop-2.7.3/share/hadoop/common/*:/opt/hadoop-2.7.3/share/hadoop/hdfs:/opt/hadoop-2.7.3/share/hadoop/hdfs/lib/*:/opt/hadoop-2.7.3/share/hadoop/hdfs/*:/opt/hadoop-2.7.3/share/hadoop/yarn/lib/*:/opt/hadoop
-2.7.3/share/hadoop/yarn/*:/opt/hadoop-2.7.3/share/hadoop/mapreduce/lib/*:/opt/hadoop-2.7.3/share/hadoop/mapreduce/*:/opt/hadoop-2.7.3/contrib/capacity-scheduler/*.jar:/opt/hbase-1.2.6/conf:/lib/*' -Djava.library.path=:/opt/hadoop-2.7.3/lib/native:/opt/hadoop-2.7.3/lib/native org.apache.flume.node.Application -f /opt/flume-1.8.0/conf/hdfs.conf -n a1
*/
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/opt/flume-1.8.0/lib/slf4j-log4j12-1.6.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hadoop-2.7.3/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/opt/hbase-1.2.6/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
2017-12-21 08:57:35,748 (lifecycleSupervisor-1-0) [INFO - org.apache.flume.node.PollingPropertiesFileConfigurationProvider.start(PollingPropertiesFileConfigurationProvider.java:62)] Configuration provider starting
2017-12-21 08:57:35,758 (conf-file-poller-0) [INFO - org.apache.flume.node.PollingPropertiesFileConfigurationProvider$FileWatcherRunnable.run(PollingPropertiesFileConfigurationProvider.java:134)] Reloading configuration file:/opt/flume-1.8.0/conf/hdfs.conf
2017-12-21 08:57:35,765 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty(FlumeConfiguration.java:1016)] Processing:k1
2017-12-21 08:57:35,767 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty(FlumeConfiguration.java:1016)] Processing:k1
2017-12-21 08:57:35,768 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty(FlumeConfiguration.java:930)] Added sinks: k1 Agent: a1
2017-12-21 08:57:35,768 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty(FlumeConfiguration.java:1016)] Processing:k1
2017-12-21 08:57:35,768 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty(FlumeConfiguration.java:1016)] Processing:k1
2017-12-21 08:57:35,769 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty(FlumeConfiguration.java:1016)] Processing:k1
2017-12-21 08:57:35,769 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty(FlumeConfiguration.java:1016)] Processing:k1
2017-12-21 08:57:35,769 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration$AgentConfiguration.addProperty(FlumeConfiguration.java:1016)] Processing:k1
2017-12-21 08:57:35,791 (conf-file-poller-0) [INFO - org.apache.flume.conf.FlumeConfiguration.validateConfiguration(FlumeConfiguration.java:140)] Post-validation flume configuration contains configuration for agents: [a1]
2017-12-21 08:57:35,791 (conf-file-poller-0) [INFO - org.apache.flume.node.AbstractConfigurationProvider.loadChannels(AbstractConfigurationProvider.java:147)] Creating channels
2017-12-21 08:57:35,838 (conf-file-poller-0) [INFO - org.apache.flume.channel.DefaultChannelFactory.create(DefaultChannelFactory.java:42)] Creating instance of channel c1 type memory
2017-12-21 08:57:35,844 (conf-file-poller-0) [INFO - org.apache.flume.node.AbstractConfigurationProvider.loadChannels(AbstractConfigurationProvider.java:201)] Created channel c1
2017-12-21 08:57:35,845 (conf-file-poller-0) [INFO - org.apache.flume.source.DefaultSourceFactory.create(DefaultSourceFactory.java:41)] Creating instance of source r1, type exec
2017-12-21 08:57:35,863 (conf-file-poller-0) [INFO - org.apache.flume.sink.DefaultSinkFactory.create(DefaultSinkFactory.java:42)] Creating instance of sink: k1, type: hdfs
2017-12-21 08:57:35,885 (conf-file-poller-0) [INFO - org.apache.flume.node.AbstractConfigurationProvider.getConfiguration(AbstractConfigurationProvider.java:116)] Channel c1 connected to [r1, k1]
2017-12-21 08:57:35,899 (conf-file-poller-0) [INFO - org.apache.flume.node.Application.startAllComponents(Application.java:137)] Starting new configuration:{ sourceRunners:{r1=EventDrivenSourceRunner: { source:org.apache.flume.source.ExecSource{name:r1,state:IDLE} }} sinkRunners:{k1=SinkRunner: { policy:org.apache.flume.sink.DefaultSinkProcessor@26a730b6 counterGroup:{ name:null counters:{} } }} channels:{c1=org.apache.flume.channel.MemoryChannel{name: c1}} }
2017-12-21 08:57:35,938 (conf-file-poller-0) [INFO - org.apache.flume.node.Application.startAllComponents(Application.java:144)] Starting Channel c1
2017-12-21 08:57:35,941 (lifecycleSupervisor-1-0) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.register(MonitoredCounterGroup.java:119)] Monitored counter group for type: CHANNEL, name: c1: Successfully registered new MBean.
2017-12-21 08:57:35,941 (lifecycleSupervisor-1-0) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.start(MonitoredCounterGroup.java:95)] Component type: CHANNEL, name: c1 started
2017-12-21 08:57:35,942 (conf-file-poller-0) [INFO - org.apache.flume.node.Application.startAllComponents(Application.java:171)] Starting Sink k1
2017-12-21 08:57:35,943 (conf-file-poller-0) [INFO - org.apache.flume.node.Application.startAllComponents(Application.java:182)] Starting Source r1
2017-12-21 08:57:35,947 (lifecycleSupervisor-1-2) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.register(MonitoredCounterGroup.java:119)] Monitored counter group for type: SINK, name: k1: Successfully registered new MBean.
2017-12-21 08:57:35,947 (lifecycleSupervisor-1-2) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.start(MonitoredCounterGroup.java:95)] Component type: SINK, name: k1 started
2017-12-21 08:57:35,950 (lifecycleSupervisor-1-1) [INFO - org.apache.flume.source.ExecSource.start(ExecSource.java:168)] Exec source starting with command: tail -F /root/test.log
2017-12-21 08:57:35,951 (lifecycleSupervisor-1-1) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.register(MonitoredCounterGroup.java:119)] Monitored counter group for type: SOURCE, name: r1: Successfully registered new MBean.
2017-12-21 08:57:35,952 (lifecycleSupervisor-1-1) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.start(MonitoredCounterGroup.java:95)] Component type: SOURCE, name: r1 started
2017-12-21 08:57:39,975 (SinkRunner-PollingRunner-DefaultSinkProcessor) [INFO - org.apache.flume.sink.hdfs.HDFSSequenceFile.configure(HDFSSequenceFile.java:63)] writeFormat = Writable, UseRawLocalFileSystem = false

(4)Generate data in real time

[root@node1 ~]# sh createData.sh

(5)Check the collected data

[root@node1 ~]# hdfs dfs -ls /flume
Found 3 items
-rw-r--r--   3 root supergroup        385 2017-12-21 08:57 /flume/Syslog.1513864659974
-rw-r--r--   3 root supergroup        386 2017-12-21 08:58 /flume/Syslog.1513864659975
-rw-r--r--   3 root supergroup          0 2017-12-21 08:58 /flume/Syslog.1513864659976.tmp
[root@node1 ~]# hdfs dfs -ls /flume
Found 3 items
-rw-r--r--   3 root supergroup        385 2017-12-21 08:57 /flume/Syslog.1513864659974
-rw-r--r--   3 root supergroup        386 2017-12-21 08:58 /flume/Syslog.1513864659975
-rw-r--r--   3 root supergroup        271 2017-12-21 08:58 /flume/Syslog.1513864659976.tmp
[root@node1 ~]# hdfs dfs -ls /flume
Found 4 items
-rw-r--r--   3 root supergroup        385 2017-12-21 08:57 /flume/Syslog.1513864659974
-rw-r--r--   3 root supergroup        386 2017-12-21 08:58 /flume/Syslog.1513864659975
-rw-r--r--   3 root supergroup        395 2017-12-21 08:58 /flume/Syslog.1513864659976
-rw-r--r--   3 root supergroup        245 2017-12-21 08:58 /flume/Syslog.1513864659977.tmp
[root@node1 ~]# hdfs dfs -ls /flume
Found 5 items
-rw-r--r--   3 root supergroup        385 2017-12-21 08:57 /flume/Syslog.1513864659974
-rw-r--r--   3 root supergroup        386 2017-12-21 08:58 /flume/Syslog.1513864659975
-rw-r--r--   3 root supergroup        395 2017-12-21 08:58 /flume/Syslog.1513864659976
-rw-r--r--   3 root supergroup        395 2017-12-21 08:58 /flume/Syslog.1513864659977
-rw-r--r--   3 root supergroup          0 2017-12-21 08:58 /flume/Syslog.1513864659978.tmp
[root@node1 ~]# hdfs dfs -ls /flume
Found 6 items
-rw-r--r--   3 root supergroup        385 2017-12-21 08:57 /flume/Syslog.1513864659974
-rw-r--r--   3 root supergroup        386 2017-12-21 08:58 /flume/Syslog.1513864659975
-rw-r--r--   3 root supergroup        395 2017-12-21 08:58 /flume/Syslog.1513864659976
-rw-r--r--   3 root supergroup        395 2017-12-21 08:58 /flume/Syslog.1513864659977
-rw-r--r--   3 root supergroup        395 2017-12-21 08:58 /flume/Syslog.1513864659978
-rw-r--r--   3 root supergroup          0 2017-12-21 08:58 /flume/Syslog.1513864659979.tmp
[root@node1 ~]# 

(6)Flume console output

2017-12-21 08:57:35,942 (conf-file-poller-0) [INFO - org.apache.flume.node.Application.startAllComponents(Application.java:171)] Starting Sink k1
2017-12-21 08:57:35,943 (conf-file-poller-0) [INFO - org.apache.flume.node.Application.startAllComponents(Application.java:182)] Starting Source r1
2017-12-21 08:57:35,947 (lifecycleSupervisor-1-2) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.register(MonitoredCounterGroup.java:119)] Monitored counter group for type: SINK, name: k1: Successfully registered new MBean.
2017-12-21 08:57:35,947 (lifecycleSupervisor-1-2) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.start(MonitoredCounterGroup.java:95)] Component type: SINK, name: k1 started
2017-12-21 08:57:35,950 (lifecycleSupervisor-1-1) [INFO - org.apache.flume.source.ExecSource.start(ExecSource.java:168)] Exec source starting with command: tail -F /root/test.log
2017-12-21 08:57:35,951 (lifecycleSupervisor-1-1) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.register(MonitoredCounterGroup.java:119)] Monitored counter group for type: SOURCE, name: r1: Successfully registered new MBean.
2017-12-21 08:57:35,952 (lifecycleSupervisor-1-1) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.start(MonitoredCounterGroup.java:95)] Component type: SOURCE, name: r1 started
2017-12-21 08:57:39,975 (SinkRunner-PollingRunner-DefaultSinkProcessor) [INFO - org.apache.flume.sink.hdfs.HDFSSequenceFile.configure(HDFSSequenceFile.java:63)] writeFormat = Writable, UseRawLocalFileSystem = false
2017-12-21 08:57:40,365 (SinkRunner-PollingRunner-DefaultSinkProcessor) [INFO - org.apache.flume.sink.hdfs.BucketWriter.open(BucketWriter.java:251)] Creating hdfs://cetc/flume/Syslog.1513864659974.tmp
2017-12-21 08:57:48,825 (SinkRunner-PollingRunner-DefaultSinkProcessor) [INFO - org.apache.flume.sink.hdfs.BucketWriter.close(BucketWriter.java:393)] Closing hdfs://cetc/flume/Syslog.1513864659974.tmp
2017-12-21 08:57:48,902 (hdfs-k1-call-runner-3) [INFO - org.apache.flume.sink.hdfs.BucketWriter$8.call(BucketWriter.java:655)] Renaming hdfs://cetc/flume/Syslog.1513864659974.tmp to hdfs://cetc/flume/Syslog.1513864659974
2017-12-21 08:57:49,073 (SinkRunner-PollingRunner-DefaultSinkProcessor) [INFO - org.apache.flume.sink.hdfs.BucketWriter.open(BucketWriter.java:251)] Creating hdfs://cetc/flume/Syslog.1513864659975.tmp
2017-12-21 08:58:00,883 (SinkRunner-PollingRunner-DefaultSinkProcessor) [INFO - org.apache.flume.sink.hdfs.BucketWriter.close(BucketWriter.java:393)] Closing hdfs://cetc/flume/Syslog.1513864659975.tmp
2017-12-21 08:58:01,003 (hdfs-k1-call-runner-8) [INFO - org.apache.flume.sink.hdfs.BucketWriter$8.call(BucketWriter.java:655)] Renaming hdfs://cetc/flume/Syslog.1513864659975.tmp to hdfs://cetc/flume/Syslog.1513864659975
2017-12-21 08:58:01,171 (SinkRunner-PollingRunner-DefaultSinkProcessor) [INFO - org.apache.flume.sink.hdfs.BucketWriter.open(BucketWriter.java:251)] Creating hdfs://cetc/flume/Syslog.1513864659976.tmp
2017-12-21 08:58:09,989 (SinkRunner-PollingRunner-DefaultSinkProcessor) [INFO - org.apache.flume.sink.hdfs.BucketWriter.close(BucketWriter.java:393)] Closing hdfs://cetc/flume/Syslog.1513864659976.tmp
2017-12-21 08:58:10,051 (hdfs-k1-call-runner-3) [INFO - org.apache.flume.sink.hdfs.BucketWriter$8.call(BucketWriter.java:655)] Renaming hdfs://cetc/flume/Syslog.1513864659976.tmp to hdfs://cetc/flume/Syslog.1513864659976
2017-12-21 08:58:10,115 (SinkRunner-PollingRunner-DefaultSinkProcessor) [INFO - org.apache.flume.sink.hdfs.BucketWriter.open(BucketWriter.java:251)] Creating hdfs://cetc/flume/Syslog.1513864659977.tmp
2017-12-21 08:58:19,054 (SinkRunner-PollingRunner-DefaultSinkProcessor) [INFO - org.apache.flume.sink.hdfs.BucketWriter.close(BucketWriter.java:393)] Closing hdfs://cetc/flume/Syslog.1513864659977.tmp
2017-12-21 08:58:19,493 (hdfs-k1-call-runner-8) [INFO - org.apache.flume.sink.hdfs.BucketWriter$8.call(BucketWriter.java:655)] Renaming hdfs://cetc/flume/Syslog.1513864659977.tmp to hdfs://cetc/flume/Syslog.1513864659977
2017-12-21 08:58:19,534 (SinkRunner-PollingRunner-DefaultSinkProcessor) [INFO - org.apache.flume.sink.hdfs.BucketWriter.open(BucketWriter.java:251)] Creating hdfs://cetc/flume/Syslog.1513864659978.tmp
2017-12-21 08:58:31,105 (SinkRunner-PollingRunner-DefaultSinkProcessor) [INFO - org.apache.flume.sink.hdfs.BucketWriter.close(BucketWriter.java:393)] Closing hdfs://cetc/flume/Syslog.1513864659978.tmp
2017-12-21 08:58:31,157 (hdfs-k1-call-runner-3) [INFO - org.apache.flume.sink.hdfs.BucketWriter$8.call(BucketWriter.java:655)] Renaming hdfs://cetc/flume/Syslog.1513864659978.tmp to hdfs://cetc/flume/Syslog.1513864659978
2017-12-21 08:58:31,201 (SinkRunner-PollingRunner-DefaultSinkProcessor) [INFO - org.apache.flume.sink.hdfs.BucketWriter.open(BucketWriter.java:251)] Creating hdfs://cetc/flume/Syslog.1513864659979.tmp
2017-12-21 08:58:40,141 (SinkRunner-PollingRunner-DefaultSinkProcessor) [INFO - org.apache.flume.sink.hdfs.BucketWriter.close(BucketWriter.java:393)] Closing hdfs://cetc/flume/Syslog.1513864659979.tmp
2017-12-21 08:58:40,202 (hdfs-k1-call-runner-8) [INFO - org.apache.flume.sink.hdfs.BucketWriter$8.call(BucketWriter.java:655)] Renaming hdfs://cetc/flume/Syslog.1513864659979.tmp to hdfs://cetc/flume/Syslog.1513864659979
2017-12-21 08:58:40,309 (SinkRunner-PollingRunner-DefaultSinkProcessor) [INFO - org.apache.flume.sink.hdfs.BucketWriter.open(BucketWriter.java:251)] Creating hdfs://cetc/flume/Syslog.1513864659980.tmp
2017-12-21 08:58:49,199 (SinkRunner-PollingRunner-DefaultSinkProcessor) [INFO - org.apache.flume.sink.hdfs.BucketWriter.close(BucketWriter.java:393)] Closing hdfs://cetc/flume/Syslog.1513864659980.tmp
2017-12-21 08:58:49,287 (hdfs-k1-call-runner-3) [INFO - org.apache.flume.sink.hdfs.BucketWriter$8.call(BucketWriter.java:655)] Renaming hdfs://cetc/flume/Syslog.1513864659980.tmp to hdfs://cetc/flume/Syslog.1513864659980
2017-12-21 08:58:49,343 (SinkRunner-PollingRunner-DefaultSinkProcessor) [INFO - org.apache.flume.sink.hdfs.BucketWriter.open(BucketWriter.java:251)] Creating hdfs://cetc/flume/Syslog.1513864659981.tmp
^C2017-12-21 08:58:58,913 (agent-shutdown-hook) [INFO - org.apache.flume.lifecycle.LifecycleSupervisor.stop(LifecycleSupervisor.java:78)] Stopping lifecycle supervisor 11
2017-12-21 08:58:58,942 (agent-shutdown-hook) [INFO - org.apache.flume.node.PollingPropertiesFileConfigurationProvider.stop(PollingPropertiesFileConfigurationProvider.java:84)] Configuration provider stopping
2017-12-21 08:58:58,945 (pool-5-thread-1) [INFO - org.apache.flume.source.ExecSource$ExecRunnable.run(ExecSource.java:372)] Command [tail -F /root/test.log] exited with 130
2017-12-21 08:58:58,946 (agent-shutdown-hook) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.stop(MonitoredCounterGroup.java:149)] Component type: CHANNEL, name: c1 stopped
2017-12-21 08:58:58,946 (agent-shutdown-hook) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.stop(MonitoredCounterGroup.java:155)] Shutdown Metric for type: CHANNEL, name: c1. channel.start.time == 1513864655941
2017-12-21 08:58:58,946 (agent-shutdown-hook) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.stop(MonitoredCounterGroup.java:161)] Shutdown Metric for type: CHANNEL, name: c1. channel.stop.time == 1513864738946
2017-12-21 08:58:58,946 (agent-shutdown-hook) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.stop(MonitoredCounterGroup.java:177)] Shutdown Metric for type: CHANNEL, name: c1. channel.capacity == 1000
2017-12-21 08:58:58,946 (agent-shutdown-hook) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.stop(MonitoredCounterGroup.java:177)] Shutdown Metric for type: CHANNEL, name: c1. channel.current.size == 0
2017-12-21 08:58:58,946 (agent-shutdown-hook) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.stop(MonitoredCounterGroup.java:177)] Shutdown Metric for type: CHANNEL, name: c1. channel.event.put.attempt == 80
2017-12-21 08:58:58,946 (agent-shutdown-hook) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.stop(MonitoredCounterGroup.java:177)] Shutdown Metric for type: CHANNEL, name: c1. channel.event.put.success == 80
2017-12-21 08:58:58,947 (agent-shutdown-hook) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.stop(MonitoredCounterGroup.java:177)] Shutdown Metric for type: CHANNEL, name: c1. channel.event.take.attempt == 92
2017-12-21 08:58:58,947 (agent-shutdown-hook) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.stop(MonitoredCounterGroup.java:177)] Shutdown Metric for type: CHANNEL, name: c1. channel.event.take.success == 74
2017-12-21 08:58:58,947 (agent-shutdown-hook) [INFO - org.apache.flume.source.ExecSource.stop(ExecSource.java:188)] Stopping exec source with command: tail -F /root/test.log
2017-12-21 08:58:58,948 (agent-shutdown-hook) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.stop(MonitoredCounterGroup.java:149)] Component type: SOURCE, name: r1 stopped
2017-12-21 08:58:58,948 (agent-shutdown-hook) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.stop(MonitoredCounterGroup.java:155)] Shutdown Metric for type: SOURCE, name: r1. source.start.time == 1513864655952
2017-12-21 08:58:58,948 (agent-shutdown-hook) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.stop(MonitoredCounterGroup.java:161)] Shutdown Metric for type: SOURCE, name: r1. source.stop.time == 1513864738948
2017-12-21 08:58:58,948 (agent-shutdown-hook) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.stop(MonitoredCounterGroup.java:177)] Shutdown Metric for type: SOURCE, name: r1. src.append-batch.accepted == 0
2017-12-21 08:58:58,948 (agent-shutdown-hook) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.stop(MonitoredCounterGroup.java:177)] Shutdown Metric for type: SOURCE, name: r1. src.append-batch.received == 0
2017-12-21 08:58:58,948 (agent-shutdown-hook) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.stop(MonitoredCounterGroup.java:177)] Shutdown Metric for type: SOURCE, name: r1. src.append.accepted == 0
2017-12-21 08:58:58,948 (agent-shutdown-hook) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.stop(MonitoredCounterGroup.java:177)] Shutdown Metric for type: SOURCE, name: r1. src.append.received == 0
2017-12-21 08:58:58,948 (agent-shutdown-hook) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.stop(MonitoredCounterGroup.java:177)] Shutdown Metric for type: SOURCE, name: r1. src.events.accepted == 80
2017-12-21 08:58:58,948 (agent-shutdown-hook) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.stop(MonitoredCounterGroup.java:177)] Shutdown Metric for type: SOURCE, name: r1. src.events.received == 80
2017-12-21 08:58:58,948 (agent-shutdown-hook) [INFO - org.apache.flume.instrumentation.MonitoredCounterGroup.stop(MonitoredCounterGroup.java:177)] Shutdown Metric for type: SOURCE, name: r1. src.open-connection.count == 0
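The Creating / Closing / Renaming lines above show the HDFS sink's BucketWriter lifecycle: data is written to a file with an in-use .tmp suffix, and the file is renamed to its final name when it rolls, so downstream consumers can simply skip *.tmp files. A local sketch of the same pattern, using plain files instead of HDFS (names and helper are illustrative, not Flume code):

```python
import os
import tempfile


def write_bucket(directory, name, lines):
    """Write to <name>.tmp, then rename to <name> on close,
    mirroring BucketWriter's in-use suffix handling."""
    tmp_path = os.path.join(directory, name + ".tmp")
    final_path = os.path.join(directory, name)
    with open(tmp_path, "w") as f:
        f.write("\n".join(lines) + "\n")
    os.rename(tmp_path, final_path)  # atomic on the same filesystem
    return final_path


d = tempfile.mkdtemp()
path = write_bucket(d, "Syslog.1513864659974", ["exec1", "exec2"])
print(os.path.basename(path))  # Syslog.1513864659974
```

This is also why the `hdfs dfs -ls /flume` listings earlier show one trailing `.tmp` file: it is the bucket currently being written.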
This article is part of the Tencent Cloud self-media sharing plan and is shared from the author's personal blog.
Originally published 2017-12-20. For infringement concerns, contact cloudcommunity@tencent.com for removal.

