06 . ELK Stack + Kafka cluster

Introduction

Filebeat collects log data from local files. Installed as an agent on the server, it watches log directories or specific log files, tails them, and forwards the events to Elasticsearch or Logstash for indexing. Both Logstash and Filebeat can collect logs: Filebeat is the lighter of the two, written in Go, uses fewer resources and copes well with high concurrency, while Logstash has a filter stage that can parse and analyze the logs. A common architecture is therefore: Filebeat collects the logs and ships them to a message queue such as Redis or Kafka; Logstash then consumes from the queue, parses and filters the events with its filter stage, and stores them in Elasticsearch. Kafka is a distributed publish-subscribe messaging system open-sourced by LinkedIn and now an Apache top-level project. It uses a pull-based consumption model and is designed for high throughput; it was originally built for log collection and transport. Replication is supported since version 0.8, transactions are not, and there are no strict guarantees against duplicated, lost or corrupted messages, which makes Kafka well suited to collecting the large data volumes produced by Internet services.

Environment

| IP             | Role                      | Software              | Specs                 | Network               |
|----------------|---------------------------|-----------------------|-----------------------|-----------------------|
| 192.168.43.176 | ES / data storage         | elasticsearch-7.2     | 2 GB RAM / 40 GB disk | NAT, internal network |
| 192.168.43.215 | Kibana / UI               | kibana-7.2            | 2 GB RAM / 40 GB disk | NAT, internal network |
| 192.168.43.164 | Filebeat / log collection | Filebeat-7.2 / nginx  | 2 GB RAM / 40 GB disk | NAT, internal network |
| 192.168.43.30  | Logstash / data pipeline  | logstash-7.2          | 2 GB RAM / 40 GB disk | NAT, internal network |
| 192.168.43.86  | Kibana / UI               | kibana-7.2            | 2 GB RAM / 40 GB disk | NAT, internal network |
| 192.168.43.47  | Kafka / message queue     | Kafka 2.12 / zk 3.4   | 2 GB RAM / 40 GB disk | NAT, internal network |
| 192.168.43.151 | Kafka / message queue     | Kafka 2.12 / zk 3.4   | 2 GB RAM / 40 GB disk | NAT, internal network |
| 192.168.43.43  | Kafka / message queue     | Kafka 2.12 / zk 3.4   | 2 GB RAM / 40 GB disk | NAT, internal network |
| 192.168.43.194 | Tomcat                    | tomcat8.5             | 2 GB RAM / 40 GB disk | NAT, internal network |
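Before wiring the pipeline together, it helps to confirm that the relevant ports are reachable between the hosts. A minimal sketch run from the Logstash host, assuming the default port for each component and that nc (netcat) is available:

# Connectivity check from the Logstash host (192.168.43.30); adjust IPs/ports to your layout
nc -zv 192.168.43.176 9200        # Elasticsearch HTTP
nc -zv 192.168.43.215 5601        # Kibana UI
for broker in 192.168.43.47 192.168.43.151 192.168.43.43; do
  nc -zv "$broker" 9092           # Kafka broker
  nc -zv "$broker" 2181           # ZooKeeper
done
nc -zv 192.168.43.194 8080        # Tomcat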

For the ELK cluster deployment itself, see the previous post:

https://cloud.tencent.com/developer/article/1706592

For setting up ZooKeeper and Kafka, see my other post:

https://cloud.tencent.com/developer/article/1706684

Using Logstash with Kafka
Edit the Logstash configuration file
input {
  stdin {}
}

output {
  kafka {
    topic_id => "kafkatest"
    bootstrap_servers => "192.168.43.47:9092"
    batch_size => 5
  }
  stdout {
    codec => "rubydebug"
  }
}
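If topic auto-creation is disabled on the brokers (auto.create.topics.enable=false), the topic has to exist before Logstash can write to it. A minimal sketch for creating it up front; the partition count and replication factor here are arbitrary choices, not taken from the original setup:

./bin/kafka-topics.sh --create \
  --bootstrap-server 192.168.43.47:9092,192.168.43.151:9092,192.168.43.43:9092 \
  --partitions 3 --replication-factor 2 \
  --topic kafkatest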
Start Logstash and type in some test data
./bin/logstash -f kafka.conf 
zhoujian
{
    "@timestamp" => 2020-07-24T07:11:26.235Z,
       "message" => "zhoujian",
          "host" => "logstash-30",
      "@version" => "1"
}
youmen
{
    "@timestamp" => 2020-07-24T07:11:29.441Z,
       "message" => "youmen",
          "host" => "logstash-30",
      "@version" => "1"
}
Check the data written into Kafka
# List the existing Kafka topics
./bin/kafka-topics.sh --list --bootstrap-server 192.168.43.47:9092,192.168.43.151:9092,192.168.43.43:9092 
kafkatest
test-you-io

# View the messages in the kafkatest topic
./bin/kafka-console-consumer.sh --bootstrap-server 192.168.43.47:9092,192.168.43.151:9092,192.168.43.43:9092 --topic kafkatest --from-beginning
2020-07-24T07:13:59.461Z logstash-30 zhoujian
2020-07-24T07:14:01.518Z logstash-30 youmen

The data was written successfully, so the Kafka side of the setup is complete.
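Optionally, the partition and replica layout of the new topic can be inspected as well, for example:

./bin/kafka-topics.sh --describe \
  --bootstrap-server 192.168.43.47:9092,192.168.43.151:9092,192.168.43.43:9092 \
  --topic kafkatest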

Configure Filebeat
Ship logs to Kafka
# /etc/filebeat/filebeat.yml (output section excerpt): the Elasticsearch and Logstash outputs stay commented out, the Kafka output is added
#output.elasticsearch:
  # hosts: ["localhost:9200"]

  # Optional protocol and basic auth credentials.
  #protocol: "https"
  #username: "elastic"
  #password: "changeme"

#----------------------------- Logstash output --------------------------------
#output.logstash:
  # The Logstash hosts
  #hosts: ["localhost:5044"]

  # Optional SSL. By default is off.
  # List of root certificates for HTTPS server verifications
  #ssl.certificate_authorities: ["/etc/pki/root/ca.pem"]

  # Certificate for SSL client authentication
  #ssl.certificate: "/etc/pki/client/cert.pem"

  # Client Certificate Key
  #ssl.key: "/etc/pki/client/cert.key"

output.kafka:
  enabled: true
  hosts: ["192.168.43.47:9092","192.168.43.151:9092","192.168.43.43:9092"]
  topic: "tomcat-filebeat"
  partition.hash:
    reachable_only: true
  compression: gzip
  max_message_bytes: 1000000
  required_acks: 1
#================================ Processors =====================================

# Configure processors to enhance or manipulate events generated by the beat.

processors:
  - add_host_metadata: ~
  - add_cloud_metadata: ~
[root@tomcat-194 logs]# cat /etc/filebeat/filebeat.yml
filebeat.inputs:
- type: log
  enabled: true 
  paths:
    - /usr/local/tomcat/logs/localhost_access_log.2020-07*
filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false
setup.template.settings:
  index.number_of_shards: 1
setup.kibana:
output.kafka:
  enabled: true
  hosts: ["192.168.43.47:9092","192.168.43.151:9092","192.168.43.43:9092"]
  topic: "tomcat-filebeat"
  partition.hash:
    reachable_only: true
  compression: gzip
  max_message_bytes: 1000000
  required_acks: 1
processors:
  - add_host_metadata: ~
  - add_cloud_metadata: ~
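Before restarting Filebeat, the configuration and the connection to the Kafka brokers can be verified. A minimal sketch, assuming Filebeat was installed from the package and runs under systemd:

filebeat test config -c /etc/filebeat/filebeat.yml   # validate the configuration file
filebeat test output -c /etc/filebeat/filebeat.yml   # check connectivity to the configured Kafka output
systemctl restart filebeat
systemctl status filebeat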
Check Kafka for the Tomcat logs
./bin/kafka-console-consumer.sh --bootstrap-server 192.168.43.47:9092,192.168.43.151:9092,192.168.43.43:9092 --topic tomcat-filebeat --from-beginning
      
{"@timestamp":"2020-07-24T06:35:24.294Z","@metadata":{"beat":"filebeat","type":"_doc","version":"7.2.0","topic":"tomcat-filebeat"},"message":"{\"client\":\"192.168.43.84\",  \"client user\":\"-\",   \"authenticated\":\"-\",   \"access time\":\"[24/Jul/2020:14:35:10 +0800]\",     \"method\":\"GET /docs/config/ HTTP/1.1\",   \"status\":\"200\",  \"send bytes\":\"6826\",  \"Query?string\":\"\",  \"partner\":\"http://192.168.43.194:8080/\",  \"Agent version\":\"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_5) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/84.0.4147.89 Safari/537.36\"}","input":{"type":"log"},"ecs":{"version":"1.0.0"},"host":{"name":"tomcat-194","id":"b029c3ce28374f7db698c050e342457f","containerized":false,"hostname":"tomcat-194","architecture":"x86_64","os":{"platform":"centos","version":"7 (Core)","family":"redhat","name":"CentOS Linux","kernel":"3.10.0-514.el7.x86_64","codename":"Core"}},"agent":{"hostname":"tomcat-194","id":"cfe87df5-c912-49d0-8758-b73e917a6c9c","version":"7.2.0","type":"filebeat","ephemeral_id":"894657d2-af1a-4660-a3eb-98602bc3d1d7"},"log":{"offset":19393,"file":{"path":"/usr/local/tomcat/logs/localhost_access_log.2020-07-24.log"}}}
{"@timestamp":"2020-07-24T06:38:29.339Z","@metadata":{"beat":"filebeat","type":"_doc","version":"7.2.0","topic":"tomcat-filebeat"},"host":{"id":"b029c3ce28374f7db698c050e342457f","containerized":false,"hostname":"tomcat-194","name":"tomcat-194","architecture":"x86_64","os":{"family":"redhat","name":"CentOS Linux","kernel":"3.10.0-514.el7.x86_64","codename":"Core","platform":"centos","version":"7 (Core)"}},"agent":{"ephemeral_id":"894657d2-af1a-4660-a3eb-98602bc3d1d7","hostname":"tomcat-194","id":"cfe87df5-c912-49d0-8758-b73e917a6c9c","version":"7.2.0","type":"filebeat"},"ecs":{"version":"1.0.0"},"message":"{\"client\":\"192.168.43.84\",  \"client user\":\"-\",   \"authenticated\":\"-\",   \"access time\":\"[24/Jul/2020:14:38:18 +0800]\",     \"method\":\"GET /manager/status HTTP/1.1\",   \"status\":\"403\",  \"send bytes\":\"3446\",  \"Query?string\":\"\",  \"partner\":\"http://192.168.43.194:8080/\",  \"Agent version\":\"Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_5) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/84.0.4147.89 Safari/537.36\"}","log":{"offset":19797,"file":{"path":"/usr/local/tomcat/logs/localhost_access_log.2020-07-24.log"}},"input":{"type":"log"}}
^CProcessed a total of 66 messages
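If no messages show up here, the quickest way to generate fresh access-log entries for Filebeat to pick up is to hit Tomcat directly, for example:

# Generate a few new lines in Tomcat's access log
curl -s -o /dev/null http://192.168.43.194:8080/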
Use Logstash to read logs from Kafka into Elasticsearch
Configure Logstash to read the Kafka logs
cat kafka-es.conf
input {
  kafka {
    bootstrap_servers => "192.168.43.47:9092,192.168.43.151:9092,192.168.43.43:9092"
    topics => "tomcat-filebeat"
    consumer_threads => 1
    decorate_events => true
    codec => "json"
    auto_offset_reset => "latest"
  }
}

output {
  elasticsearch {
    hosts => ["192.168.43.176:9200"]
    index => "tomcat-filebeat-%{+YYYY.MM.dd}"
  }
  stdout {
    codec => "rubydebug"
  }
}
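The pipeline syntax can be validated before starting it for real, for example:

./bin/logstash -f kafka-es.conf --config.test_and_exit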
Run Logstash in the foreground to make sure the logs come through correctly
./bin/logstash -f kafka-es.conf 
constant ::Fixnum is deprecated
{
         "input" => {
        "type" => "log"
    },
      "@version" => "1",
       "message" => "{\"client\":\"192.168.43.227\",  \"client user\":\"-\",   \"authenticated\":\"-\",   \"access time\":\"[24/Jul/2020:15:47:08 +0800]\",     \"method\":\"GET / HTTP/1.1\",   \"status\":\"200\",  \"send bytes\":\"11215\",  \"Query?string\":\"\",  \"partner\":\"-\",  \"Agent version\":\"curl/7.29.0\"}",
         "agent" => {
             "version" => "7.2.0",
            "hostname" => "tomcat-194",
        "ephemeral_id" => "894657d2-af1a-4660-a3eb-98602bc3d1d7",
                  "id" => "cfe87df5-c912-49d0-8758-b73e917a6c9c",
                "type" => "filebeat"
    },
          "host" => {
                 "name" => "tomcat-194",
                   "os" => {
             "version" => "7 (Core)",
                "name" => "CentOS Linux",
            "codename" => "Core",
              "family" => "redhat",
            "platform" => "centos",
              "kernel" => "3.10.0-514.el7.x86_64"
                             },
                   "id" => "b029c3ce28374f7db698c050e342457f",
        "containerized" => false,
             "hostname" => "tomcat-194",
         "architecture" => "x86_64"
    },
    "@timestamp" => 2020-07-24T07:47:11.857Z,
           "log" => {
        "offset" => 20203,
          "file" => {
            "path" => "/usr/local/tomcat/logs/localhost_access_log.2020-07-24.log"
        }
    },
           "ecs" => {
        "version" => "1.0.0"
    }
}
     
# Check on a Kafka node (console consumer output)


{"@timestamp":"2020-07-24T07:53:11.944Z","@metadata":{"beat":"filebeat","type":"_doc","version":"7.2.0","topic":"tomcat-filebeat"},"host":{"id":"b029c3ce28374f7db698c050e342457f","containerized":false,"hostname":"tomcat-194","architecture":"x86_64","name":"tomcat-194","os":{"codename":"Core","platform":"centos","version":"7 (Core)","family":"redhat","name":"CentOS Linux","kernel":"3.10.0-514.el7.x86_64"}},"agent":{"type":"filebeat","ephemeral_id":"894657d2-af1a-4660-a3eb-98602bc3d1d7","hostname":"tomcat-194","id":"cfe87df5-c912-49d0-8758-b73e917a6c9c","version":"7.2.0"},"log":{"file":{"path":"/usr/local/tomcat/logs/localhost_access_log.2020-07-24.log"},"offset":20462},"message":"{\"client\":\"192.168.43.227\",  \"client user\":\"-\",   \"authenticated\":\"-\",   \"access time\":\"[24/Jul/2020:15:53:06 +0800]\",     \"method\":\"GET / HTTP/1.1\",   \"status\":\"200\",  \"send bytes\":\"11215\",  \"Query?string\":\"\",  \"partner\":\"-\",  \"Agent version\":\"curl/7.29.0\"}","input":{"type":"log"},"ecs":{"version":"1.0.0"}}
Check the indices in Elasticsearch
curl -XGET "http://127.0.0.1:9200/_cat/indices?v"
health status index                           uuid                   pri rep docs.count docs.deleted store.size pri.store.size
green  open   .monitoring-es-7-2020.07.24     z0Ff-j7WSlSm4ZBH6IhZaw   1   1        185           60      3.7mb            2mb
green  open   .monitoring-kibana-7-2020.07.24 PWqXvObhSRazQn4CY8Z2lg   1   1          3            0    216.3kb         73.4kb
green  open   .kibana_task_manager            Ptj7ydZmQqGG7hWxK2NbSg   1   1          2            0     61.2kb         45.5kb
green  open   .kibana_2                       fot9Sk6jRWa2vS5cQGvOeQ   1   1          5            0     68.6kb         34.3kb
green  open   .kibana_1                       jYD4jXLVTeeAMImEz9NEVA   1   1          1            0     18.7kb          9.3kb
green  open   .tasks                          NIwDk-PYQT-d-njh3g0t0g   1   1          1            0     12.7kb          6.3kb
green  open   tomcat-filebeat-2020.07.24      s3aB-c6GSemUHvaurYQ8Zw   1   1         38            0    227.4kb         80.3kb
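To confirm that the documents are actually searchable (before creating a Kibana index pattern on tomcat-filebeat-*), the new index can be queried directly. A minimal sketch:

# Count the documents and fetch one sample hit from the new index
curl -s "http://127.0.0.1:9200/tomcat-filebeat-2020.07.24/_count?pretty"
curl -s "http://127.0.0.1:9200/tomcat-filebeat-*/_search?size=1&pretty"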