Sending Snort logs to Logstash



  • Hi all,

I'm trying to collect and visualize my Snort logs with an ELK cluster. I've set everything up, but Logstash only collects logs with Facility_label: kernel and occasionally Facility_label: local5, which is the nginx log from pfSense. Snort is configured to log to LOG_LOCAL6, but I'm not receiving anything from that facility, even though my syslog is configured to forward everything to ELK.

Has anyone faced this issue before? I imagine many people have set this up already, so if you have any experience configuring this it would be great if you could share it with us.

    Thanks in advance!



So I managed to collect the Snort logs, but now I'm facing another issue. For some reason my grok parse fails every time, and I believe it's because of my "match" syntax. This is what my filter looks like:

    filter {
      syslog_pri { }
      if [type] == "syslog" {
        grok {
          match => {
            # The optional PID suffix needs escaped brackets: ([\d+])? only
            # matched a single digit or "+", so "snort[12345]:" never matched.
            "message" => "<%{INT:syslog_pri}>%{SYSLOGTIMESTAMP:syslog_timestamp} %{WORD:program}(\[%{POSINT:pid}\])?: %{GREEDYDATA:message}"
          }
          # Without this, grok appends to "message" instead of replacing it.
          overwrite => [ "message" ]
        }
        mutate {
          add_field => {
               "received_at"   => "%{@timestamp}"
               "received_from" => "%{host}"
               "logger"        => "%{program}"
          }
        }
      }
    }

Do you have any recommendations for what my "message" pattern should look like for Snort logs in pfSense?

    Thanks



  • I set something up with ELK quite a long time ago as an experiment. It worked, but to get it working I did a wholesale "copy and paste" operation on the log filters from someone's Google page. That was several years ago.

    I suggest using a Google search to see if you can locate some recent examples of using ELK to parse Snort data from syslog logs. These filters are highly dependent on the precise content of the log data and the layout of the various "fields".
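    For reference, Snort's syslog alerts usually follow the fast-alert layout, something like "[1:2100498:7] GPL ATTACK_RESPONSE id check returned root [Classification: Potentially Bad Traffic] [Priority: 2] {TCP} 192.168.1.10:80 -> 10.0.0.5:52312". A filter along these lines is a reasonable starting point, but treat it as a sketch only: the field names are my own choices, and the exact layout depends on your Snort output settings, so verify against your actual log lines.

    filter {
      # Assumes an earlier filter has already extracted the syslog program
      # name into a "program" field.
      if [program] == "snort" {
        grok {
          match => {
            "message" => "\[%{INT:gid}:%{INT:sid}:%{INT:rev}\] %{DATA:alert_msg} \[Classification: %{DATA:classification}\] \[Priority: %{INT:priority}\] \{%{WORD:protocol}\} %{IP:src_ip}(:%{INT:src_port})? -> %{IP:dst_ip}(:%{INT:dst_port})?"
          }
        }
      }
    }

    The port captures are optional because portless protocols such as ICMP log only the addresses.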

