Netgate Discussion Forum

Grok Filter OpenVPN and Snort for Logstash with JSON Dashboard

Bounties | 4 Posts | 2 Posters | 5.6k Views
killmasta93 | May 30, 2016, 11:54 PM

Hi,
I have been trying for many months with no luck to get this working: filtering the logs from pfSense and sending them to ELK. I am not sure what the price should be, so message me and we can negotiate; payment will be through PayPal.

Thank you

Tutorials:

https://www.mediafire.com/folder/v329emaz1e9ih/Tutorials

killmasta93 | Jun 8, 2016, 12:49 AM

BUMP?


Noebas | Jun 19, 2016, 10:56 AM

I have this running with this dashboard: http://imgur.com/5gaJ7ZY

It shows failed logins and the geo-locations of logins, but you could build any dashboard once the data is in.
If this is what you need, I am willing to share it; I don't want anything for it, just a thank you.

Noebas | Jun 25, 2016, 5:57 PM

I do use Kibana 4, and I would recommend upgrading to it.

Do you use syslog? The filter first strips the date and time from the rest of the message and converts it to the correct Kibana timestamp. Then I filter the filterlog and OpenVPN messages.
I am working on the 2.3 gateway log. For the filterlog you need the patterns file, and for the geo lookups you need the GeoIP data file; see the sketches below.
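
For reference, here is a minimal sketch of the input and output that would sit around this filter, assuming pfSense's remote logging is pointed at this host; the port number and the Elasticsearch address are assumptions, not values from the post:

    input {
      # Hypothetical listener for pfSense remote syslog; 5140 is an assumed port.
      udp {
        port => 5140
        type => "syslog"
      }
    }
    output {
      # Ship parsed events to Elasticsearch; the host is an assumption.
      elasticsearch {
        hosts => ["localhost:9200"]
      }
    }

Note that the filter below keys on a [program] field; a plain udp input like this one sets [type] instead, so the first conditional may need adjusting to match whatever input you actually use.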

This is my code:

    filter {
      # Date/time translation: pull the syslog timestamp off the front of
      # the message and use it as the event's @timestamp.
      if [program] == "syslog" {
        grok {
          match => [ "message", "(?<datetime>(?:Jan(?:uary)?|Feb(?:ruary)?|Mar(?:ch)?|Apr(?:il)?|May|Jun(?:e)?|Jul(?:y)?|Aug(?:ust)?|Sep(?:tember)?|Oct(?:ober)?|Nov(?:ember)?|Dec(?:ember)?)\s+(?:(?:0[1-9])|(?:[12][0-9])|(?:3[01])|[1-9]) (?:2[0123]|[01]?[0-9]):(?:[0-5][0-9]):(?:[0-5][0-9])) (?<prog>.*?): (?<msg>.*)" ]
        }
        mutate {
          # Collapse the double space syslog uses to pad single-digit days.
          gsub => [ "datetime", "  ", " " ]
        }
        date {
          match => [ "datetime", "MMM dd HH:mm:ss" ]
        }
        mutate {
          # Keep only the payload in "message" ...
          replace => [ "message", "%{msg}" ]
        }
        mutate {
          # ... and drop the scratch fields.
          remove_field => [ "msg", "datetime" ]
        }
      }

      # Filterlog (pfSense firewall log)
      if "filterlog" in [prog] {
        grok {
          patterns_dir => "/etc/logstash/conf.d/patterns"
          match => [ "message", "%{PFSENSE_LOG_DATA}%{PFSENSE_IP_SPECIFIC_DATA}%{PFSENSE_IP_DATA}%{PFSENSE_PROTOCOL_DATA}",
                     "message", "%{PFSENSE_LOG_DATA}%{PFSENSE_IPv4_SPECIFIC_DATA_ECN}%{PFSENSE_IP_DATA}%{PFSENSE_PROTOCOL_DATA}" ]
        }
        mutate {
          lowercase => [ "proto" ]
        }
      }

      # OpenVPN: match either an authentication line or the address-pool
      # assignment line (grok stops at the first pattern that matches).
      if "openvpn" in [prog] {
        grok {
          match => [ "message", "user '%{WORD:openvpn_user}'",
                     "message", "%{WORD:openvpn_user}/%{IP:openvpn_src_ip}:%{INT:openvpn_src_port} MULTI_sva: pool returned IPv4=%{IP:openvpn_ip}" ]
        }
        # Geo data for the client's source address
        geoip {
          source => "openvpn_src_ip"
          database => "/etc/logstash/GeoLiteCity.dat"
        }
      }
    }
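
The PFSENSE_* patterns above come from a custom patterns file in patterns_dir, which you must create yourself. As an illustrative sketch only (the field names are assumptions; the documented pfSense 2.2+ filterlog format starts with rule number, sub-rule, anchor, tracker ID, interface, reason, action, direction, and IP version as CSV), the first pattern could look roughly like:

    # /etc/logstash/conf.d/patterns/pfsense -- illustrative sketch, not the poster's file
    PFSENSE_LOG_DATA (%{INT:rule}),(%{INT:sub_rule}),,(%{INT:tracker}),(%{WORD:iface}),(%{WORD:reason}),(%{WORD:action}),(%{WORD:direction}),(%{INT:ip_ver}),

The geoip database "/etc/logstash/GeoLiteCity.dat" is MaxMind's legacy binary city database, which had to be downloaded separately and placed at that path. (MaxMind has since retired the legacy .dat databases in favor of GeoLite2 .mmdb files.)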
