ElasticSearch + Logstash + Kibana Log Collection

This article is excerpted from 《Netkiller Monitoring 手札》.

ElasticSearch + Logstash + Kibana one-click installation

Configure Logstash to import local log files into Elasticsearch

input {
  file {
    type => "syslog"
    path => [ "/var/log/maillog", "/var/log/messages", "/var/log/secure" ]
    start_position => "beginning"
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch { 
    hosts => ["127.0.0.1:9200"] 
  }
}		
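To try this configuration out, save it to a file and point Logstash at it. A minimal sketch, assuming a package installation of Logstash 5.x under /usr/share/logstash and a hypothetical config file name:

# /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/file.conf --config.test_and_exit
# /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/file.conf

The first command only validates the configuration syntax and exits; the second runs the pipeline in the foreground, where the rubydebug codec prints every event to stdout while it is also indexed into Elasticsearch.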

19.3. Receive logs via TCP/UDP and write them to Elasticsearch

input {
  file {
    type => "syslog"
    path => [ "/var/log/auth.log", "/var/log/messages", "/var/log/syslog" ]
  }
  tcp {
    port => "5145"
    type => "syslog-network"
  }
  udp {
    port => "5145"
    type => "syslog-network"
  }
}
output {
  elasticsearch { 
    hosts => ["127.0.0.1:9200"] 
  }
}		
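To check that the TCP listener is receiving data, you can push a test line into port 5145 by hand; a minimal sketch using netcat (the message text is arbitrary):

# echo "test syslog message" | nc 127.0.0.1 5145

To ship logs from other hosts, rsyslog can forward everything to this port. In /etc/rsyslog.conf, @@ forwards over TCP and a single @ forwards over UDP (replace 127.0.0.1 with the address of the Logstash host):

*.* @@127.0.0.1:5145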

19.4. Configure a Broker (Redis)

19.4.1. indexer

/etc/logstash/conf.d/indexer.conf

input {
  redis {
    host => "127.0.0.1"
    port => "6379" 
    key => "logstash:demo"
    data_type => "list"
    codec  => "json"
    type => "logstash-redis-demo"
    tags => ["logstashdemo"]
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
  }
}	

Test

			# redis-cli 
127.0.0.1:6379> RPUSH logstash:demo "{\"time\": \"2012-01-01T10:20:00\", \"message\": \"logstash demo message\"}"
(integer) 1
127.0.0.1:6379> exit			

If it runs successfully, the Logstash log will look like this:

			# cat /var/log/logstash/logstash-plain.log 
[2017-03-22T15:54:36,491][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://127.0.0.1:9200/]}}
[2017-03-22T15:54:36,496][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://127.0.0.1:9200/, :path=>"/"}
[2017-03-22T15:54:36,600][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>#<URI::HTTP:0x20dae6aa URL:http://127.0.0.1:9200/>}
[2017-03-22T15:54:36,601][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2017-03-22T15:54:36,686][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-03-22T15:54:36,693][INFO ][logstash.outputs.elasticsearch] Installing elasticsearch template to _template/logstash
[2017-03-22T15:54:36,780][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>[#<URI::Generic:0x2f9efc89 URL://127.0.0.1>]}
[2017-03-22T15:54:36,787][INFO ][logstash.pipeline        ] Starting pipeline {"id"=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>1000}
[2017-03-22T15:54:36,792][INFO ][logstash.inputs.redis    ] Registering Redis {:identity=>"redis://@127.0.0.1:6379/0 list:logstash:demo"}
[2017-03-22T15:54:36,793][INFO ][logstash.pipeline        ] Pipeline main started
[2017-03-22T15:54:36,838][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2017-03-22T15:55:10,018][WARN ][logstash.runner          ] SIGTERM received. Shutting down the agent.
[2017-03-22T15:55:10,024][WARN ][logstash.agent           ] stopping pipeline {:id=>"main"}			
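After the event has been pulled from Redis, it should also be searchable in Elasticsearch. A quick check, assuming the default logstash-* index naming:

# curl 'http://127.0.0.1:9200/logstash-*/_search?pretty&q=message:demo'

The hit returned should contain the "logstash demo message" pushed via redis-cli above.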

19.4.2. shipper

input {
  file {
    path => [ "/var/log/nginx/access.log" ]
    start_position => "beginning"
  }
}

filter {
  grok {
    match => { "message" => "%{NGINXACCESS}" }
    add_field => { "type" => "access" }
  }
  date {
    match => [ "timestamp" , "dd/MMM/YYYY:HH:mm:ss Z" ]
  }
  geoip {
    source => "clientip"
  }
}

output {
  redis {
    host => "127.0.0.1"
    port => 6379
    data_type => "list"
    key => "logstash:demo"
  }
}			
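Note that %{NGINXACCESS} is not one of the grok patterns that ship with Logstash; it has to be defined in a patterns file, and the grok filter must be pointed at that directory with patterns_dir. A minimal sketch, assuming the default nginx combined log format and a hypothetical /etc/logstash/patterns directory, with this single line in /etc/logstash/patterns/nginx:

NGINXACCESS %{IPORHOST:clientip} %{USER:ident} %{USER:auth} \[%{HTTPDATE:timestamp}\] "%{WORD:verb} %{URIPATHPARAM:request} HTTP/%{NUMBER:httpversion}" %{NUMBER:response} (?:%{NUMBER:bytes}|-) %{QS:referrer} %{QS:agent}

The grok block in the shipper then becomes:

  grok {
    patterns_dir => ["/etc/logstash/patterns"]
    match => { "message" => "%{NGINXACCESS}" }
    add_field => { "type" => "access" }
  }

If the pattern does not match your actual nginx log_format, grok tags the event with _grokparsefailure instead of extracting clientip, timestamp, and the other fields.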

19.5. Kafka

input {
  kafka {
    zk_connect => "kafka:2181"
    group_id => "logstash"
    topic_id => "apache_logs"
    consumer_threads => 16
  }
}
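The zk_connect and topic_id options above belong to older releases of the logstash-input-kafka plugin, which consumed through ZooKeeper (Kafka 0.8). On Logstash 5.x the plugin talks to the brokers directly; a roughly equivalent sketch, where the broker address kafka:9092 is an assumption:

input {
  kafka {
    bootstrap_servers => "kafka:9092"
    group_id => "logstash"
    topics => ["apache_logs"]
    consumer_threads => 16
  }
}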

19.8. FAQ

19.8.1. View the Kibana data

			# curl 'http://localhost:9200/_search?pretty'
{
  "took" : 1,
  "timed_out" : false,
  "_shards" : {
    "total" : 1,
    "successful" : 1,
    "failed" : 0
  },
  "hits" : {
    "total" : 1,
    "max_score" : 1.0,
    "hits" : [
      {
        "_index" : ".kibana",
        "_type" : "config",
        "_id" : "5.2.2",
        "_score" : 1.0,
        "_source" : {
          "buildNum" : 14723
        }
      }
    ]
  }
}			
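This only shows Kibana's own settings stored in the .kibana index. To see the log data itself, list all indices; the events written by Logstash end up in daily logstash-YYYY.MM.DD indices by default:

# curl 'http://localhost:9200/_cat/indices?v'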

19.8.2. Logstash cannot write to Elasticsearch

Port 9200 must not be omitted from the elasticsearch output configuration; otherwise Logstash will be unable to connect to Elasticsearch.

  elasticsearch {
    hosts => ["127.0.0.1:9200"]
  }			
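A quick way to confirm that Elasticsearch is actually listening on that address and port is to query it directly; it should answer with its cluster name and version information:

# curl http://127.0.0.1:9200/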

Originally published on the WeChat public account Netkiller (netkiller-ebook).

Original publication date: 2017-03-23
