2016-02-26

Using the ELK Log Service: shipper to indexer

The previous article covered syncing logs to the ELK server with rsyslog, in two ways: shipping an nginx log file, or sending logs via rsyslog directly to a port on the ELK host where a logstash syslog input receives them. This article covers shipping logs with logstash's shipper-to-indexer pattern.
Notes on logstash's type setting (from the logstash input plugin documentation):

type
- Value type is string
- There is no default value for this setting.

Add a type field to all events handled by this input.

Types are used mainly for filter activation.

The type is stored as part of the event itself, so you can also use the type to search for it in Kibana.

If you try to set a type on an event that already has one (for example when you send an event from a shipper to an indexer) then a new input will not override the existing type. A type set at the shipper stays with that event for its life even when sent to another Logstash server.
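As a hypothetical sketch of that non-override behavior (the host, key, and type values below are illustrative, not from the article):

```conf
# Indexer-side input that also declares a type. Events arriving from a
# shipper that already set, say, type => "nginx-access" keep that type;
# the value here applies only to events that arrive without one.
input {
  redis {
    host      => "127.0.0.1"
    data_type => "list"
    key       => "logstash"
    type      => "from-redis"   # ignored for events that already carry a type
  }
}
```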

The example below works as follows: the logstash shipper reads logs with an input, filters them, and then sends them out through a redis output channel.
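A minimal sketch of such a shipper config, assuming an nginx access log in the default combined format and a redis instance on the indexer host (the path, host address, and key name are assumptions, not values from the article):

```conf
# shipper.conf (hypothetical): read nginx logs, parse with grok,
# push events into a redis list on the indexer host.
input {
  file {
    path => "/var/log/nginx/access.log"   # adjust to your actual log path
    type => "nginx-access"                # type set here stays with the event
  }
}
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }  # match your real log format
  }
}
output {
  redis {
    host      => "192.168.1.10"   # indexer host running redis (assumed address)
    data_type => "list"
    key       => "logstash"       # any name, as long as the indexer uses the same
  }
}
```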

The logstash indexer collects the logs sent by multiple shippers (here, of course, just one). Redis is usually installed on this same host; after receiving the logs, the indexer sends them to the local elasticsearch.
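The matching indexer side can be sketched like this (again hypothetical; the key must match the shipper's, and the index name and elasticsearch address are assumptions):

```conf
# indexer.conf (hypothetical): pop events from the same redis list
# and forward them to elasticsearch on the local machine.
input {
  redis {
    host      => "127.0.0.1"   # redis runs on the indexer itself
    data_type => "list"
    key       => "logstash"    # must match the key used by the shipper
  }
}
output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    index => "nginx-access-%{+YYYY.MM.dd}"   # index name is up to you
  }
}
```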

As always, match the log format against your own actual logs; type, tags, and index can be named however you like. The architecture diagram below is borrowed from Ian Unruh: https://ianunruh.com/

[Figure: ELK shipper-to-indexer architecture diagram, from Ian Unruh]

Author: bbotte
