SIEM Logstash parsing for more than one hundred technologies

Overview

LogIndexer Pipeline

Logstash Parsing Configurations for Elasticsearch SIEM and OpenDistro for Elasticsearch SIEM

Why this project exists

The overhead of implementing Logstash parsing and applying Elastic Common Schema (ECS) across audit, security, and system logs can be a large drawback when using Elasticsearch as a SIEM (Security Information and Event Management). The Cargill SIEM team has spent significant time developing quality Logstash parsing processors for many well-known log vendors and wants to share this work with the community. In addition to Logstash processors, we have also included log collection programs for API-based log collection, as well as the setup scripts used to generate our pipeline-to-pipeline architecture.

Quick start Instructions

"Quick start" mostly depends on how your Logstash configuration is set up. If you have your own setup already established, it might be best to use the processors that apply to your organization's log collection (found in the "config" directory). If you are seeking to use the architecture in this repo, consult the README found in the build_scripts directory. We will be adding an elaborate setup guide soon.

Contributions

We welcome and encourage individual contributions to this repo. Please see the Contribution.md guide in the root of the repo. Please note that we reserve the right to close pull requests or issues that appear to be out of scope for our project, or for other reasons not specified.

Questions, Comments & Expected Level of Attention

Please create an issue and someone will try to respond to it within 5 business days. However, while we will try to revisit the repository semi-regularly, we are not beholden to this response time (life happens). We welcome other individuals' answers and input as well.

Licensing

Apache-2.0

Comments
  • improved cisco ACI processor

    improved cisco ACI processor

    Improved the Cisco ACI processor with the following changes:

    1. simplified grok parsing
    2. removed complex logic used to detect event and error messages
    3. fixed broken parsing of the device hostname sending logs
    4. tmp.rule does NOT represent a username; it is instead the event.reason as described by Cisco: the action or condition that caused the event, such as a component failure or a threshold crossing.

    Sample messages used for testing:

    <186>Dec 08 21:20:20.614 ABC-DCA-NPRD-ACILEF-104 %LOG_LOCAL7-2-SYSTEM_MSG [F0532][raised][interface-physical-down][critical][sys/phys-[eth1/47]/phys/fault-F0532] Port is down, reason being suspended(no LACP PDUs)(connected), used by EPG on node 104 of fabric ACI Fabric1 with hostname ABC-DCA-NPRD-ACILEF-104
    
    <190>Nov 24 18:20:53.237 ABC-DCB-ACIAPC-003 %LOG_LOCAL7-6-SYSTEM_MSG [E4206143][transition][info][fwrepo/fw-aci-apic-dk9.5.2.6e] Firmware aci-apic-dk9.5.2.6e created
    
    opened by anubisg1 3
  • [Help / Documentation] - how to classify incoming syslog messages

    [Help / Documentation] - how to classify incoming syslog messages

    As per the title, how would we classify incoming syslog messages so that they end up in the proper processor pipeline?

    Let's take a common use case where the network has Cisco IOS routers and switches, Cisco ACI, Cisco WLC and ISE, plus Checkpoint firewalls, F5 load balancers, etc.

    Generally those devices would all send logs to the syslog server IP on port 514, but how would we classify where each message is coming from in order to send it to the specific processor?

    Are we supposed to set up a different input queue for each processor (for example, a different port on the syslog server, so that ACI goes to 192.168.10.10 port 5514 while Checkpoint goes to port 5515)?

    Or is there an IP filter that says: if the source IP is X, send to the ACI processor; if it is Y, send to the Checkpoint processor?

    Or what other options are there?
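
    For illustration only, one common pattern (not necessarily how this repository wires it) is a single syslog input whose output routes events to per-vendor pipelines based on the sender's address; the addresses and pipeline names below are placeholders:

      input {
        syslog { port => 514 }
      }
      output {
        # route by the sending device's address (placeholder values)
        if [host] == "192.168.10.10" {
          pipeline { send_to => ["cisco_aci_processor"] }
        } else if [host] == "192.168.10.20" {
          pipeline { send_to => ["checkpoint_processor"] }
        } else {
          pipeline { send_to => ["catch_all"] }
        }
      }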

    question 
    opened by anubisg1 2
  • host split enrichment error

    host split enrichment error

    For certain hostnames the host split enrichment is causing the pipeline to be blocked until grok times out.

    [2022-06-10T15:54:58,563][WARN ][org.logstash.plugins.pipeline.PipelineBus][processor] Attempted to send event to 'enrichments' but that address was unavailable. Maybe the destination pipeline is down or stopping? Will Retry.
    [2022-06-10T15:57:22,451][WARN ][logstash.filters.grok ][enrichments] Timeout executing grok '^(?<[host][tmp]>.*?)\.(?<[host][domain]>.*?)$' against field '[host][hostname]' with value 'abc-name123-xyz.domain.com'!
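
    For illustration only, a split based on a character class instead of lazy wildcards avoids the backtracking that can push grok past its timeout; this is a sketch, not the repository's actual enrichment code:

      grok {
        # assumes the intent is "everything before the first dot" vs. "the rest"
        match          => { "[host][hostname]" => "^(?<[host][tmp]>[^.]+)\.(?<[host][domain]>.+)$" }
        tag_on_failure => [ "_host_split_failure" ]
      }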

    opened by nnovaes 2
  • Fix deprecation warnings

    Fix deprecation warnings

    User Story - details

    For translate we should use source and target instead of field and destination. On boot, Logstash 15 shows these warnings:

    [2021-11-09T16:53:33,518][WARN ][logstash.filters.translate] You are using a deprecated config setting "destination" set in translate. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Use `target` option instead. If you have any questions about this, please visit the #logstash channel on freenode irc.
    [2021-11-09T16:53:33,519][WARN ][logstash.filters.translate] You are using a deprecated config setting "field" set in translate. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Use `source` option instead. If you have any questions about this, please visit the #logstash channel on freenode irc.
    

    At least up to Logstash 13 the new options are not supported, so let's make this change when we upgrade.
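
    For reference, a sketch of the same filter written with the deprecated options and with their replacements (the field names and dictionary path are placeholders):

      translate {
        # deprecated option names
        field           => "[destination][port]"
        destination     => "[destination][application]"
        dictionary_path => "/etc/logstash/dictionaries/ports.yml"
      }

      translate {
        # replacement option names
        source          => "[destination][port]"
        target          => "[destination][application]"
        dictionary_path => "/etc/logstash/dictionaries/ports.yml"
      }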

    Tasks

    • [ ]
    • [ ]

    X-Reference Issues

    Related Code

    << Any related code here... >>
    
    opened by KrishnanandSingh 2
  • native vlan mismatch and other improvements

    native vlan mismatch and other improvements

    Description

    • Parsing for Native VLAN mismatch error messages, e.g.:
      2021-10-14T13:28:06.497Z {name=abc.com} <188>132685: Oct 14 21:28:07.975 GMT: %CDP-4-NATIVE_VLAN_MISMATCH: Native VLAN mismatch discovered on FastEthernet0/1 (1), with xyz GigabitEthernet1/0/1 (36).
    • Lowercase [actual_msg] field
    • fix typo on timestamp
    • add the timezone to [tmp][devicetimestamp]
    • removed the old parser code for native vlan mismatch
    • removed a catch all condition in the old parser
    • lowercase [rule.category]
    opened by nnovaes 2
  • Feature Request: Add known applications + risk score field based off destination.port fields

    Feature Request: Add known applications + risk score field based off destination.port fields

    User Story - details

    As a SIEM engineer, I want to know the applications associated with port numbers in the destination.port field. This will allow me to quickly identify potential applications communicating in the session and also the risk of the traffic I'm observing.

    Tasks

    • Create a port lookup translation.
    • Add risk category score to application (scale of 1-10 or severity name).

    Examples:

    3389      -> Remote Desktop Protocol (high risk)
    22        -> Secure Shell (high risk)
    3306      -> MySQL (medium risk)
    6881-6889 -> BitTorrent (high risk)
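
    A minimal sketch of such a lookup, assuming a translate filter with an inline dictionary (the field names, entries, and risk labels are illustrative, and the port is assumed to already be a string):

      translate {
        source     => "[destination][port]"
        target     => "[destination][application]"
        dictionary => {
          "3389" => "Remote Desktop Protocol (high risk)"
          "22"   => "Secure Shell (high risk)"
          "3306" => "MySQL (medium risk)"
        }
        fallback   => "unknown application"
      }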
    
    opened by ryanpodonnell1 2
  • Cisco IOS (cisco.router and cisco.switch) new rules

    Cisco IOS (cisco.router and cisco.switch) new rules

    Description

    New parsing rules for cisco.router and cisco.switch. The old version of this processor needs some rework; however, there are functioning bits of it that I have preserved since they mostly work. The new rules provide a good foundation for future "full" parsing and also cover BGP and interface up/down messages. The lookup database for the translate filters is static.

    opened by nnovaes 2
  • Update syslog_log_security_sdwan.app.conf

    Update syslog_log_security_sdwan.app.conf

    Description

    These updates correct the assignment of Versa fields to the ECS model. They also add Versa-specific fields that do not map to ECS back into a separate [labels][all] field that works like tags. I couldn't find a clean way to implement this without using the add_tag option, so I saved the event tags to another field and then restored them afterwards.
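
    A rough sketch of that save/restore trick, with placeholder field names rather than the actual configuration:

      mutate { rename => { "tags" => "[tmp][saved_tags]" } }     # stash the real tags
      mutate { add_tag => [ "%{[versa][some_field]}" ] }         # collect non-ECS values (placeholder field)
      mutate { rename => { "tags" => "[labels][all]" } }         # move the collected values
      mutate { rename => { "[tmp][saved_tags]" => "tags" } }     # restore the original tags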

    @Akhila-Y please review as well.

    opened by nnovaes 1
  • added space, testing new IDE

    added space, testing new IDE

    Description

    Please provide a description of your proposed changes - providing obfuscated log/code examples is highly encouraged.

    Related Issues

    Are there any Issues to this PR?

    Todos

    Are there any additional items that must be completed before this PR gets merged in?

    • [ ]
    • [ ]
    opened by MehaSal 1
  • added new ECS fields to .csv file

    added new ECS fields to .csv file

    Description

    Please provide a description of your proposed changes - providing obfuscated log/code examples is highly encouraged.

    Related Issues

    Are there any Issues to this PR?

    Todos

    Are there any additional items that must be completed before this PR gets merged in?

    • [ ]
    • [ ]
    opened by MehaSal 1
  • added missing fields for coverage reporting to aws cloudtrail

    added missing fields for coverage reporting to aws cloudtrail

    Description

    Please provide a description of your proposed changes - providing obfuscated log/code examples is highly encouraged.

    Related Issues

    Are there any Issues to this PR?

    Todos

    Are there any additional items that must be completed before this PR gets merged in?

    • [ ]
    • [ ]
    opened by MehaSal 1
  • [[enrichments]>worker22] ruby - Ruby exception occurred: no implicit conversion of nil into String

    [[enrichments]>worker22] ruby - Ruby exception occurred: no implicit conversion of nil into String

    Describe the bug

    [ERROR] 2022-11-26 07:54:05.540 [[enrichments]>worker14] ruby - Ruby exception occurred: no implicit conversion of nil into String {:class=>"TypeError", :backtrace=>["(ruby filter code):68:in `block in filter_method'", "org/jruby/RubyArray.java:1865:in `each'", "(ruby filter code):67:in `block in filter_method'", "/usr/share/logstash/vendor/bundle/jruby/2.6.0/gems/logstash-filter-ruby-3.1.8/lib/logstash/filters/ruby.rb:96:in `inline_script'", "/usr/share/logstash/vendor/bundle/jruby/2.6.0/gems/logstash-filter-ruby-3.1.8/lib/logstash/filters/ruby.rb:89:in `filter'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:159:in `do_filter'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:178:in `block in multi_filter'", "org/jruby/RubyArray.java:1865:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:175:in `multi_filter'", "org/logstash/config/ir/compiler/AbstractFilterDelegatorExt.java:134:in `multi_filter'", "/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:301:in `block in start_workers'"]}
    

    X-Reference issues

    (Cross reference any user stories that this bug might be affecting)

    Steps To Reproduce

    Start the enrichment pipeline. I'm using Logstash 8.5.2.

    Expected behavior

    No error should be seen.

    Additional context

    The following components in the enrichment pipeline make use of the ruby filter, but I don't understand which one is the culprit:

    ./02_ecs_data_type.conf
    ./04_timestamp.conf
    ./11_related_hosts.conf
    ./12_related_user.conf
    ./13_related_ip.conf
    ./14_related_hash.conf
    ./16_related_mac.conf
    ./93_mitre.conf
    ./94_remove_empty_n_truncate.conf
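
    For context only, this class of error usually comes from calling a String method on a nil element inside a ruby filter loop; a guard of roughly this shape (the field name is hypothetical) avoids it:

      ruby {
        code => '
          values = event.get("[related][ip]")   # hypothetical field
          if values.is_a?(Array)
            values.each do |v|
              next if v.nil?                    # skip nil entries instead of raising TypeError
              # ... string handling on v ...
            end
          end
        '
      }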
    
    bug wontfix 
    opened by anubisg1 2
  • cisco processor fails because of missing hostname and lowercase date

    cisco processor fails because of missing hostname and lowercase date

    I'm working with syslog_audit_cisco.switch.conf and I found the following issues:

    1. The syslog message format is assumed here https://github.com/Cargill/OpenSIEM-Logstash-Parsing/blob/1.0/config/processors/syslog_audit_cisco.switch.conf#L52 to be:
      # {timestamp} {facility} {severity} {mnemonic} {description}
      # seq no:timestamp: %facility-severity-MNEMONIC:description
    

    In reality, most people would configure "logging origin-id hostname", which changes the log format to:

      # {hostname} {timestamp} {facility} {severity} {mnemonic} {description}
      # seq no: hostname: timestamp: %facility-severity-MNEMONIC:description
    
    2. The parser at line https://github.com/Cargill/OpenSIEM-Logstash-Parsing/blob/1.0/config/processors/syslog_audit_cisco.switch.conf#L33 is modifying the hostname field before that field is parsed (maybe this is assumed to come from Kafka, instead of being taken from the logs?).

    3. At https://github.com/Cargill/OpenSIEM-Logstash-Parsing/blob/1.0/config/processors/syslog_audit_cisco.switch.conf#L48 the message is converted to lowercase, but that causes date parse failures later on because of the case mismatch.

    "Nov 17 11:44:46.490 UTC" matches, but when I have "nov 17 11:44:46.490 utc" it fails on the date parsing here: https://github.com/Cargill/OpenSIEM-Logstash-Parsing/blob/1.0/config/processors/syslog_audit_cisco.switch.conf#L77

    Sample log entry for reference:

    <14>4643: Switch-core01: Nov 17 11:44:46.490 UTC: %LINK-3-UPDOWN: Interface GigabitEthernet1/0/27, changed state to up
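
    As an illustration only (not the processor's actual fix), restoring the case of the month and timezone tokens before the existing date filter makes the lowercased timestamp parse again; the field name below is an assumption:

      mutate { capitalize => [ "[tmp][devicetimestamp]" ] }                  # "nov 17 11:44:46.490 utc" -> "Nov 17 11:44:46.490 utc"
      mutate { gsub       => [ "[tmp][devicetimestamp]", "utc$", "UTC" ] }   # restore the timezone token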

    opened by anubisg1 0
  • GeoLitePrivate2-City.mmdb doesn't exist

    GeoLitePrivate2-City.mmdb doesn't exist

    To use the geoip enrichment, you need two files, specifically:

              database => "/mnt/s3fs_geoip/GeoLite2-City.mmdb"
              database => "/mnt/s3fs_geoip/GeoLitePrivate2-City.mmdb"
    

    Unfortunately, it seems that GeoLitePrivate2-City.mmdb doesn't exist anywhere on the internet, and MaxMind only provides:

    • GeoLite2-ASN.mmdb
    • GeoLite2-City.mmdb
    • GeoLite2-Country.mmdb

    I'd expect either that more information on where to find GeoLitePrivate2-City.mmdb is added to the documentation, or that the enrichment pipeline is updated to function without that file.
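
    Until that is clarified, a sketch of a geoip lookup that relies only on the public database (the field names are placeholders):

      geoip {
        source   => "[source][ip]"
        target   => "[source][geo]"
        database => "/mnt/s3fs_geoip/GeoLite2-City.mmdb"
      }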

    documentation question 
    opened by anubisg1 3
  • Validate ECS fields

    Validate ECS fields

    User Story - details

    There should be an enrichment checking that only permitted values are stored in ECS fields that have a predefined set of values, so those fields stay compliant with ECS. See https://www.elastic.co/guide/en/ecs/1.9/ecs-event.html for more info. I believe the event.* fields are the only ones with predefined values. If that's true, the sample code below should take care of this validation.

    Tasks

    • [ ]
    • [ ]

    X-Reference Issues

    Related Code

    The sample configuration below takes the event.type value that came from the processors and populates ecs_status with either valid or event.type-invalid_field_value. If ecs_status is not valid, a tag containing event.type-invalid_field_value is added; e.g., if event.type is "process", because "process" is not among the allowed values for event.type, a tag event.type-invalid_field_value: process will be added.

     translate {
                field => "event.type"
                dictionary => [
                "access", "valid", 
                "admin", "valid", 
                "allowed", "valid", 
                "change", "valid", 
                "connection", "valid", 
                "creation", "valid", 
                "deletion", "valid", 
                "denied", "valid", 
                "end", "valid", 
                "error", "valid", 
                "group", "valid", 
                "info", "valid", 
                "installation", "valid", 
                "protocol", "valid", 
                "start", "valid", 
                "user", "valid"
                ]
                exact => true
                # [field]-[error]
                fallback => "event.type-invalid_field_value"
                destination => "ecs_status"
            }
        if [ecs_status] !~ "valid" {
            mutate {
                add_tag => [ "%{ecs_status}: %{event.type}" ]
                remove_field => [ "ecs_status", "event.type"]
            }
        }
    
        #EVENT.CATEGORY
        translate {
                field => "event.category"
                dictionary => [
                "authentication", "valid", 
                "configuration", "valid", 
                "driver", "valid", 
                "database", "valid", 
                "file", "valid", 
                "host", "valid", 
                "iam", "valid", 
                "intrusion_detection", "valid", 
                "malware", "valid", 
                "network", "valid", 
                "package", "valid", 
                "process", "valid", 
                "web", "valid"
                ]
                exact => true
                # [field]-[error]
                fallback => "event.category-invalid_field_value"
                destination => "ecs_status"
            }
        if [ecs_status] !~ "valid" {
            mutate {
                add_tag => [ "%{ecs_status}: %{event.category}" ]
                remove_field => [ "ecs_status", "event.category"]
    
            }
        }
    
        # event.kind
         translate {
                field => "event.kind"
                dictionary => [
                "alert", "valid", 
                "event", "valid", 
                "metric", "valid", 
                "state", "valid", 
                "pipeline_error", "valid", 
                "signal", "valid"
                ]
                exact => true
                # [field]-[error]
                fallback => "event.kind-invalid_field_value"
                destination => "ecs_status"
            }
        if [ecs_status] !~ "valid" {
            mutate {
                add_tag => [ "%{ecs_status}: %{event.kind}" ]
                remove_field => [ "ecs_status", "event.kind"]
    
            }
        }
    
    
        # event.outcome
         translate {
                field => "event.outcome"
                dictionary => [
                "failure", "valid", 
                "success", "valid", 
                "unknown", "valid"
                ]
                exact => true
                # [field]-[error]
                fallback => "event.outcome-invalid_field_value"
                destination => "ecs_status"
            }
        if [ecs_status] !~ "valid" {
            mutate {
                add_tag => [ "%{ecs_status}: %{event.outcome}" ]
                remove_field => [ "ecs_status", "event.outcome"]
    
            }
        }
    
    opened by nnovaes 0
Releases (v0.1-beta)
  • v0.1-beta (May 19, 2021)

    This release lacks elaborate usage documentation, so it is marked as beta. Users can still work with it by going through the Python script. Documentation will be added soon.

    Source code(tar.gz)
    Source code(zip)
Owner
Working to nourish the world. Committed to helping the world thrive