All articles on this site are original notes from 愚人乙's hands-on operations work; please credit the source when reprinting. For discussion, feel free to join the Ops Architecture QQ group: 991904631

Notes on filebeat 6.2.4: collecting multiple log sets and shipping them to different Kafka topics

Log Analysis | 愚人乙


The logs in question are nginx logs. Each business domain has its own pair of access.log and error.log files, and the requirement is for filebeat to collect them and push each set to a different Kafka topic. The configuration is as follows:

filebeat.prospectors:
- type: log
  enabled: true
  paths:
    - /home/work/log/nginx/api.xmpush.xiaomi.com-access.log
    - /home/work/log/nginx/register.xmpush.xiaomi.com-access.log
    - /home/work/log/nginx/feedback.xmpush.xiaomi.com-access.log
  fields:
    ttopic: sre_xmpush_mt_api_nginx_access
  tail_files: true

- type: log
  enabled: true
  paths:
    - /home/work/log/nginx/api.xmpush.xiaomi.com-error.log
    - /home/work/log/nginx/register.xmpush.xiaomi.com-error.log
    - /home/work/log/nginx/feedback.xmpush.xiaomi.com-error.log
  fields:
    ttopic: sre_xmpush_mt_api_nginx_error
  tail_files: true
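With `fields` configured as above (and `fields_under_root` left unset), filebeat nests the custom key under a top-level `fields` object on every event it ships to Kafka. An access-log event would look roughly like this (abridged; the timestamp and message values are illustrative):

```json
{
  "@timestamp": "2018-06-01T10:00:00.000Z",
  "source": "/home/work/log/nginx/api.xmpush.xiaomi.com-access.log",
  "message": "<raw nginx access log line>",
  "fields": {
    "ttopic": "sre_xmpush_mt_api_nginx_access"
  }
}
```

This nested `fields.ttopic` key is what the Kafka output's topic format string refers to later in the config.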
  
.......................

output.kafka:
  enabled: true
  hosts: ["1.1.1.1:6666","2.2.2.2:6666","3.3.3.3:6666","4.4.4.4:6666"]
  topic: '%{[fields][ttopic]}'
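The `topic` setting is a filebeat event format string: `'%{[fields][ttopic]}'` is resolved per event against the custom field each prospector attached, so access and error events land on separate topics even though they share one `output.kafka` section. A minimal Python sketch of that per-event resolution (the `resolve_topic` helper is hypothetical, written for illustration; it is not filebeat code):

```python
import re

def resolve_topic(fmt, event):
    """Resolve a filebeat-style format string such as '%{[fields][ttopic]}'
    against a nested event dict. Illustration only, not filebeat's implementation."""
    def repl(match):
        value = event
        # Walk each [key] segment into the nested event dict.
        for key in re.findall(r"\[([^\]]+)\]", match.group(1)):
            value = value[key]
        return str(value)
    # Replace every %{[...][...]} reference with the looked-up value.
    return re.sub(r"%\{(\[[^}]+\])\}", repl, fmt)

access_event = {"message": "GET /status", "fields": {"ttopic": "sre_xmpush_mt_api_nginx_access"}}
error_event = {"message": "upstream timed out", "fields": {"ttopic": "sre_xmpush_mt_api_nginx_error"}}

print(resolve_topic("%{[fields][ttopic]}", access_event))  # sre_xmpush_mt_api_nginx_access
print(resolve_topic("%{[fields][ttopic]}", error_event))   # sre_xmpush_mt_api_nginx_error
```

Because the topic is computed from event metadata rather than hard-coded, adding another log set only requires a new prospector with its own `fields.ttopic` value; the output section stays untouched.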

Original work of 运维网咖社. Reprinting is permitted, but reprints must include a hyperlink to the original source, the author information, and this notice; otherwise legal liability may be pursued. http://www.net-add.com
