Sagan - a multi-threaded, high-performance log analysis engine

Overview
,-._,-.    Sagan, the advanced Suricata/Snort like log analysis engine!
\/)"(\/ 
 (_o_)     Champ Clark III & The Quadrant InfoSec Team [quadrantsec.com]
 /   \/)   Copyright (C) 2009-2021 Quadrant Information Security, et al.
(|| ||) 
 oo-oo  

Join the Sagan Discord channel


Sagan Documentation

Sagan "Read The Docs!": https://sagan.readthedocs.io

What is Sagan?

Sagan is an open source (GNU/GPLv2), high-performance, real-time log analysis & correlation engine. It is written in C and uses a multi-threaded architecture to deliver high-performance log & event analysis. The Sagan structure and Sagan rules work similarly to the Suricata & Snort IDS engines. This was done intentionally to maintain compatibility with rule management software (oinkmaster/pulledpork/etc) and to allow Sagan to correlate log events with your IDS/IPS system.
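For example, a Sagan rule reads almost exactly like a Snort/Suricata rule, except that matching is done against log text and a syslog program name. A minimal illustrative sketch (hypothetical msg and sid; the `content`, `program`, `parse_src_ip`, `classtype`, `sid`, and `rev` options appear in the real rules shown further below):

```
alert any $EXTERNAL_NET any -> $HOME_NET any (msg:"[EXAMPLE] SSH authentication failure"; \
    content: "Failed password"; program: sshd; classtype: unsuccessful-user; \
    parse_src_ip: 1; sid: 9999999; rev: 1;)
```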

Sagan can write out to databases via the Suricata EVE format and/or Unified2, making it compatible with all Snort & Suricata consoles. Sagan can also write out JSON, which can be ingested by Elasticsearch and viewed with consoles like Kibana, EVEbox, etc.

Sagan supports many different output formats, log normalization (via liblognorm), GeoIP detection, script execution on events, and automatic firewall support via "Snortsam" (see http://www.snortsam.net).
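For illustration, liblognorm normalization is driven by a rulebase that maps free-form log text to named fields. A minimal sketch of a legacy-format rulebase entry (hypothetical file and field names), matching a typical sshd failure line:

```
# normalize.rulebase (hypothetical) -- extracts username/src-ip/src-port
# from: "Failed password for admin from 203.0.113.7 port 4242 ssh2"
rule=: Failed password for %username:word% from %src-ip:ipv4% port %src-port:number% ssh2
```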

Sagan uses the GNU "artistic style".

Sagan Features:

  • Sagan’s multi-threaded architecture allows it to use all CPUs / cores for real-time log processing.
  • Sagan is lightweight in CPU and memory usage.
  • Sagan uses a rule syntax similar to Cisco’s “Snort” & Suricata, which allows for easy rule management and correlation with Snort or Suricata IDS / IPS systems.
  • Sagan can store alert data in Cisco’s “Snort” native “unified2” binary data format or Suricata's JSON format for easier log-to-packet correlation.
  • Sagan is compatible with popular graphical security consoles like Snorby, BASE, Sguil, and EveBox.
  • Sagan can easily export data to other SIEMs via syslog.
  • Sagan can track events based on geographic location via IP address source or destination data (e.g., identifying logins from strange geographic locations).
  • Sagan can monitor usage based on time of day (e.g., a rule that triggers when an administrator logs in at 3:00 AM).
  • Sagan has multiple means of parsing and extracting data, through liblognorm or built-in parsing rule options like parse_src_ip, parse_dst_ip, parse_port, parse_string, and parse_hash (MD5, SHA1, SHA256).
  • Sagan can query custom blacklists, Bro Intel subscriptions like Critical Stack, and “Bluedot”, Quadrant Information Security's threat intelligence feed, by IP address, hashes (MD5, SHA1, SHA256), URLs, emails, usernames, and much more.
  • Sagan’s “client tracking” can inform you when machines start or stop logging. This helps you verify that you are getting the data you need.
  • Sagan uses “xbits” to correlate data between log events, which allows Sagan to “remember” and flag events across multiple log lines and sources.
  • Sagan uses inter-process communication to share data between Sagan processes. Sagan can also use Redis (beta) to share data between Sagan instances within a network.
  • To help reduce “alert fatigue”, Sagan can “threshold” alerts or only alert “after” certain criteria have been met.
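As a sketch of how the xbits, "after", and threshold-style features combine, two hypothetical rules (invented msgs and sids; the xbits and after option forms are modeled on real rules quoted in the issues on this page): the first flags a source IP after repeated failures, the second fires only for a later success from a flagged source.

```
# Set an xbit once a source IP fails 5 times in 5 minutes.
alert any $EXTERNAL_NET any -> $HOME_NET any (msg:"[EXAMPLE] SSH brute force"; \
    content: "Failed password"; program: sshd; xbits: set, ssh_brute, track ip_src, expire 21600; \
    after: track by_src, count 5, seconds 300; classtype: brute-force; sid: 9999001; rev: 1;)

# Fire only if a later success comes from a source flagged above.
alert any $EXTERNAL_NET any -> $HOME_NET any (msg:"[EXAMPLE] Successful login after brute force"; \
    content: "Accepted password"; program: sshd; xbits: isset, ssh_brute, track ip_src; \
    classtype: successful-user; sid: 9999002; rev: 1;)
```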

Where can I get help with Sagan?

For more general Sagan information, please visit the official Sagan web site: https://sagan.quadrantsec.com.

For Sagan documentation to assist with installation, rule writing, etc., check out: https://sagan.readthedocs.io/en/latest/

For help & assistance, check out the Sagan mailing list, located at: https://groups.google.com/forum/#!forum/sagan-users. You can also ask questions on the Sagan Discord channel at https://discord.gg/n6ReCZED

If you're looking for Sagan rule sets on Github, they are located at: https://github.com/beave/sagan-rules

Comments
  • External output plugin produces invalid JSON

    External output plugin produces invalid JSON

The external output plugin sends JSON with a dangling "normalize" key, making the entire JSON message invalid.

Looks like https://github.com/quadrantsec/sagan/blob/d264dc3081715d7f360740fd53fded5de2ed7ff1/src/output-plugins/external.c#L192 might be at fault. I'm unsure, though, if Event->json_normalize should be true for rules not using the normalize; keyword.

    { "signature_id": 505234, "signature": "Authentication failure", "rev": 1, "severity": 1, "category": "unsuccessful-user", "priority": 1, "timestamp": "01-07-2022 12:32:21.859190", "drop": "false", "flow_id": 852827190, "in_iface": "", "src_ip": "192.168.10.33", "src_port": 514, "dest_ip": "192.168.10.33", "dest_port": 514, "xff": "192.168.10.33", "proto": "UDP", "syslog_facility": "local7", "syslog_level": "info", "syslog_priority": "info", "syslog_message": "[2022\/01\/07 12:32:15.228367,  3] ..\/..\/source4\/auth\/kerberos\/krb5_init_context.c:80(smb_krb5_debug_wrapper)#012  Kerberos: Failed to decrypt PA-DATA -- [email protected] (enctype aes256-cts-hmac-sha1-96) error Decrypt integrity check failed for checksum type hmac-sha1-96-aes256, key type aes256-cts-hmac-sha1-96#", "normalize":  }
    
    alert any $EXTERNAL_NET any -> $HOME_NET any (msg: "Authentication failure"; content:"Kerberos: Failed to decrypt PA-DATA -- "; program: samba; classtype: unsuccessful-user; sid: 505234; rev: 1; external: /usr/local/sagan/external.py)
    
    opened by wokis 10
  • Event_id - detection doesn't work

    Event_id - detection doesn't work

Hello, I am using Sagan's JSON input module, but I pull the event ID from the message rather than from a JSON field. The event_id is at the beginning of the message field. In version 2.0.1 the event_id field is UNDEFINED. After the patch ( https://groups.google.com/g/sagan-users/c/ju-3g2vIYgE ) the result is that event_id is an empty string.

    Best Regards Ivan


    [D] Data in _Sagan_Proc_Syslog (including extracted JSON)
    [D] -----------------------------------------------------------------------------
    [D] * message: " 4727: A security-enabled global group was created. Subject: Security ID: S-1-5-21-3641769155-4107095991-1524253519-8107 Account Name: USER Account Domain: DOMAIN Logon ID: 0x4DCCB98 New Group: Security ID: S-1-5-21-3641769155-4107095991-1524253519-31261 Group Name: dddddd Group Domain: DOMAIN Attributes: SAM Account Name: dddddd SID History: - Additional Information: Privileges: -"
    [D] * program: "Security"
    [D] * host: "10.XXX.XXX.XX"
    [D] * level: "info"
    [D] * facility: "user"
    [D] * priority: "14"
    [D] * tag: "UNDEFINED"
    [D] * time: "20:51:58"
    [D] * date: "2021-01-28"
    [D] * src_ip : ""
    [D] * dst_ip : ""
    [D] * src_port : "0"
    [D] * dst_port : "0"
    [D] * proto : "0"
    [D] * ja3: ""
    [D] * event_id: ""
    [D] * md5: ""
    [D] * sha1: ""
    [D] * sha256: ""
    [D] * filename: ""
    [D] * hostname: ""
    [D] * url: ""
    [D] * username: ""

    opened by iku899 9
  • fails to build with GCC 10: multiple definitions of the same variable

    fails to build with GCC 10: multiple definitions of the same variable

    As documented at https://gcc.gnu.org/gcc-10/porting_to.html GCC 10 defaults to -fno-common which reveals issues with the code in sagan.

    See https://bugs.debian.org/957771 for more details.

The workaround is to build with -fcommon, but it would be better to clean up the code.

    opened by jonassmedegaard 8
  • [sagan-users] SEGV version 2.0.0

    [sagan-users] SEGV version 2.0.0

    Hello all,

    Today I tried updating to version 2.0.0 from the new github repo.

If I start sagan from the shell it starts up OK, but as soon as sagan is started as a daemon with systemctl start sagan I get a failure (signal=SEGV).

If I revert to earlier versions in the old repo, commit c9c22a5ea0882b5954e8dba94dcbf533a75cd0c1 from 26-10 runs fine, but commit 43cd81adafdf6686584d40e8d1bc64fb120ddba6 from 28-10 gives me the same SEGV message as the most recent version.

    I'm wondering if someone has also encountered the same issue and knows a solution.

The OS I'm running on is Ubuntu 18.04 LTS, and I have attached the core dump to this message.

    Best regards,

    Stef stacktrace.txt

    opened by beave 4
  • Better checking for alert type.

    Better checking for alert type.

    "alert ip" isn't valid in Sagan.

"Valid options for this field are any, tcp, udp or icmp. In most cases, you will likely want to specify any. The protocol is determined by the parse_proto or parse_program_proto rule options."

Sagan should validate this field and error out if the type is invalid.

    opened by quadrantsec 3
  • Rule errors with fresh build from source and rules from current repo

    Rule errors with fresh build from source and rules from current repo

    I followed the install guide at https://sagan.readthedocs.io/en/latest/install.html but got the source from quadrantsec instead of beave. It compiled and installed on Debian 10. I downloaded the updated rules from quadrantsec repo and it kept giving me errors when starting it. I downloaded the rules from beave and it started without problems.

    The errors were something about "expected 'track'" on several rules like cisco-correlation and something else.

    opened by J-Beavers 3
  • sagan fails to start

    sagan fails to start

    Hi,

    I have just installed sagan(1.2.2-1) on Arch Linux(5.10.56-1-lts) , no install errors to report, but as I am going through the "Post-installation setup and testing" section from the Sagan User guide(readthedocs):

  • "To test, run sagan --debug syslog,engine as the root user. It will switch to the sagan user when ready, and remain running in the foreground."

    command: $ sudo sagan --debug syslog,engine
    output: sagan: error while loading shared libraries: libesmtp.so.6: cannot open shared object file: No such file or directory
    issue: sagan fails to start
    libesmtp version: community/libesmtp 1.1.0-1 [installed]

    libesmtp directories:

    /usr/include/auth-client.h
    /usr/include/libesmtp.h
    /usr/lib/
    /usr/lib/esmtp-plugins-6.2.0/
    /usr/lib/esmtp-plugins-6.2.0/sasl-crammd5.so
    /usr/lib/esmtp-plugins-6.2.0/sasl-login.so
    /usr/lib/esmtp-plugins-6.2.0/sasl-ntlm.so
    /usr/lib/esmtp-plugins-6.2.0/sasl-plain.so
    /usr/lib/libesmtp.so
    /usr/lib/libesmtp.so.6.2.0

    At first sight it looks like sagan cannot find the libesmtp.so.6 library. Do you have any suggestions?

    Thanks,

    opened by Haxx 3
  • json_pcre { } range error

    json_pcre { } range error

    Example: json_pcre:".QuestionName", "/^.*\w{1,50}\.(\w+\.\w{2,10})$/";

    In the above example, if you use {1,50} sagan errors with [rules.c, line 2354] Missing last '/' in json_pcre. This doesn't affect the normal pcre keyword. A single digit like {50} works just fine, but if a range is specified the error occurs.

    opened by bryant-smith 2
  • Found an issue with json_meta_contains

    Found an issue with json_meta_contains

    Hello,

    Was trying to change sid:5003377 "[WINDOWS-AUTH] Suspicious network login from non-RFC1918" to leverage the json parsing.

    Given an event like this:

    {
      "syslog-source-ip": "10.2.33.41",
      "WorkstationName": "WORKSTATION",
      "VirtualAccount": "%%1843",
      "Version": "2",
      "TransmittedServices": "-",
      "TaskValue": "12544",
      "TargetUserSid": "WINDOWS\\auser",
      "TargetUserName": "auser",
      "TargetOutboundUserName": "-",
      "TargetOutboundDomainName": "-",
      "TargetLogonId": "0xasdfasdf",
      "TargetLinkedLogonId": "0x0",
      "TargetDomainName": "WINDOWS",
      "SubjectUserSid": "NULL SID",
      "SubjectUserName": "-",
      "SubjectLogonId": "0x0",
      "SubjectDomainName": "-",
      "SourceName": "Microsoft-Windows-Security-Auditing",
      "SourceModuleType": "im_msvistalog",
      "SourceModuleName": "winlog",
      "SeverityValue": "2",
      "Severity": "INFO",
      "RestrictedAdminMode": "-",
      "ProviderGuid": "{REDACTED}",
      "ProcessName": "-",
      "ProcessId": "0x0",
      "OrgName": "redacted",
      "OpcodeValue": "0",
      "Opcode": "Info",
      "Message": "An account was successfully logged on.  Subject:  Security ID:  S-1-0-0  Account Name:  -  Account Domain:  -  Logon ID:  0x0  Logon Information:  Logon Type:  3  Restricted Admin Mode: -  Virtual Account:  No  Elevated Token:  No  Impersonation Level:  Impersonation  New Logon:  Security ID: REDACTED Account Name:  auser  Account Domain:  WINDOWS  Logon ID:  0xasdfasdf Linked Logon ID:  0x0  Network Account Name: -  Network Account Domain: -  Logon GUID:  {00000000-0000-0000-0000-000000000000}  Process Information:  Process ID:  0x0  Process Name:  -  Network Information:  Workstation Name: WORKSTATION  Source Network Address: 24.18.x.x  Source Port:  64491  Detailed Authentication Information:  Logon Process:  NtLmSsp   Authentication Package: NTLM  Transited Services: -  Package Name (NTLM only): NTLM V2  Key Length:  128  This event is generated when a logon session is created. It is generated on the computer that was accessed.  The subject fields indicate the account on the local system which requested the logon. This is most commonly a service such as the Server service, or a local process such as Winlogon.exe or Services.exe.  The logon type field indicates the kind of logon that occurred. The most common types are 2 (interactive) and 3 (network).  The New Logon fields indicate the account for whom the new logon was created, i.e. the account that was logged on.  The network fields indicate where a remote logon request originated. Workstation name is not always available and may be left blank in some cases.  The impersonation level field indicates the extent to which a process in the logon session can impersonate.  The authentication information fields provide detailed information about this specific logon request.  - Logon GUID is a unique identifier that can be used to correlate this event with a KDC event.  - Transited services indicate which intermediate services have participated in this logon request.  
- Package name indicates which sub-protocol was used among the NTLM protocols.  - Key length indicates the length of the generated session key. This will be 0 if no session key was requested.",
      "LogonType": "3",
      "LogonProcessName": "NtLmSsp ",
      "LogonGuid": "{00000000-0000-0000-0000-000000000000}",
      "LmPackageName": "NTLM V2",
      "Keywords": "REDACTED",
      "KeyLength": "128",
      "IpPort": "64491",
      "IpAddress": "24.18.x.x",
      "ImpersonationLevel": "%%1833",
      "Hostname": "REDACTED",
      "ExecutionThreadID": "6024",
      "ExecutionProcessID": "752",
      "Evtid": " 4624: ",
      "EventType": "AUDIT_SUCCESS",
      "EventTime": "2021-02-26T16:31:08.165307-08:00",
      "EventReceivedTime": "2021-02-26T16:42:10.861381-08:00",
      "EventID": "4624",
      "ElevatedToken": "%%1843",
      "Channel": "Security",
      "Category": "Logon",
      "AuthenticationPackageName": "NTLM",
    
    }
    

    While working on the meta_content portion:

     meta_content:!"Source Network Address|3a| %sagan%",10.,192.168.,-,|3a 3a|1,127.0.0.1,172.16.,172.17.,172.18.,172.19.,172.20.,172.21.,172.22.,172.23.,172.24.,172.25.,172.26.,172.27.,172.28.,172.29.,172.30.,172.31,169.254,fe80,|3a 3a|1;
    

    I found a discrepancy in the application of json_meta_contains. It's necessary to use either json_pcre or json_meta_contains, since we just want to match on the beginning of the key IpAddress's value.

    When I tried to use:

    json_meta_content:!"IpAddress",10.,192.168.,-,|3a 3a|1,127.0.0.1,172.16.,172.17.,172.18.,172.19.,172.20.,172.21.,172.22.,172.23.,172.24.,172.25.,172.26.,172.27.,172.28.,172.29.,172.30.,172.31,169.254,fe80,|3a 3a|1; json_meta_contains;
    

    I was unable to load the signature receiving this error:

    [E] [rules.c, line 3538] Got bad rule option 'json_meta_contains' on line 268 of /usr/local/etc/sagan-rules/windows-auth.rules. Abort.
    

    The fully modified signature is :

    alert any any any -> any any (msg: "[WINDOWS-AUTH] Suspicious network login from non-RFC1918"; program: *Security*; event_id: 4624; json_content:".LogonType","3"; parse_src_ip: 1; \
    json_meta_content:!"IpAddress",10.,192.168.,-,|3a 3a|1,127.0.0.1,172.16.,172.17.,172.18.,172.19.,172.20.,172.21.,172.22.,172.23.,172.24.,172.25.,172.26.,172.27.,172.28.,172.29.,172.30.,172.31,169.254,fe80,|3a 3a|1; json_meta_contains; json_meta_nocase; \
    reference: url,findingbad.blogspot.cz/2017/12/a-few-of-my-favorite-things-continued.html; reference: url,wiki.quadrantsec.com/bin/view/Main/5003377; \
    reference: url,www.quadrantsec.com/about/blog/using_jack_crooks_log_analysis_concepts_with_sagan; sid:5003377; classtype:suspicious-login; rev:6;)
    

    When I looked at the source code pertaining to the json_meta_contains declaration:

                        /* Set the previous "json_meta_strstr" to use strstr instead of strcmp */
    
                        /* TODO: Remove "json_meta_strstr" */
    
                        if (!strcmp(rulesplit, "json_meta_strstr") || !strcmp(rulesplit, "json_meta_contains") )
                            {
                                strtok_r(NULL, ":", &saveptrrule2);
                                rulestruct[counters->rulecount].json_meta_strstr[json_content_count-1] = 1;
                            }
    

    Shouldn't the use of "json_content_count-1" actually be "json_meta_content_count-1"?

    Similar to the way the json_meta_nocase modifier is coded:

    
                        if (!strcmp(rulesplit, "json_meta_nocase"))
                            {
                                strtok_r(NULL, ":", &saveptrrule2);
                                rulestruct[counters->rulecount].json_meta_content_case[json_meta_content_count-1] = true;
                            }
    
    
    opened by CyberTaoFlow 2
  • -L command line option log directory override

    -L command line option log directory override

    There is a -l option to send the sagan.log file to a separate location, bypassing the config file. Could -l be changed to redirect all logs to the new location, or could a -L option be added for that?

    opened by bryant-smith 1
  • Ability to read in gzip compressed files.

    Ability to read in gzip compressed files.

    Sagan can read in files on disk. It would be great if Sagan could automatically read in gzip-compressed files. libz (??) has gzfopen() and a similar API for file calls. Might not be hard to do.

    opened by quadrantsec 1
  • meta_content and json_meta_content modifier

    meta_content and json_meta_content modifier

    The meta_content and json_meta_content options both accept multiple values and are compared using OR logic, meaning only one needs to match to make the statement true. What about a & modifier to change them to AND logic, requiring all values to match? This can of course be done with multiple json_content statements, but the modifier may look cleaner for certain usages.

    This can also be accomplished with json_pcre, but this approach may be faster than using pcre.

    Current Usage: json_content:".SourceImage", "|5c|AppData|5c|Local|5c|"; json_content:".SourceImage","|2e|exe"; distance:0; within:100;

    Proposed Usage: json_meta_content:&".SourceImage",|5c|AppData|5c|Local|5c|, |2e|exe;

    The proposed usage could even be processed in order. In this example it would have to match |5c|AppData|5c|Local|5c| first, and then |2e|exe would have to be found after the first match. This may need an additional modifier like json_meta_ordered;

    json_meta_content:&".SourceImage",|5c|AppData|5c|Local|5c|, |2e|exe; json_meta_ordered;

    opened by bryant-smith 2
  • Sagan loads rules with "threshold" written incorrectly

    Sagan loads rules with "threshold" written incorrectly

    Observed behavior

    Two rules in the https://github.com/quadrantsec/sagan-rules repo are written incorrectly, possibly in an older format:

    alert any $EXTERNAL_NET any -> $HOME_NET any (msg:"[ASTERISK] Login session failed [0/5]"; content: "Wrong password"; classtype: unsuccessful-user; program: asterisk; reference: url,wiki.quadrantsec.com/bin/view/Main/5000179; threshold:suppress, track by_src, count 5, seconds 900; sid:5000179; rev:4;)
    
    alert any $EXTERNAL_NET any -> $HOME_NET any (msg:"[ASTERISK] Brute force login session failed [5/5]"; content: "Wrong password"; xbits: set, brute_force ,track ip_src, expire 21600; classtype: brute-force; program: asterisk; reference: url,wiki.quadrantsec.com/bin/view/Main/5002942; after: track by_src, count 5, seconds 300; threshold:suppress, track by_src, count 5, seconds 900; sid:5002942; rev:4;)
    

    Sagan runs successfully when loading these rules, but the threshold is not parsed, resulting in no threshold being applied. Changing to the documented threshold: type suppress, ... syntax makes the rules perform as expected.
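For example, the first rule's threshold option, rewritten in the documented form:

```
threshold: type suppress, track by_src, count 5, seconds 900;
```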

    Expected behavior

    Sagan fails to load due to the rule being written incorrectly

    Sagan version and OS version

    Sagan v2.1.0 Debian 11

    opened by wrharding 0
  • after and threshold tracking keys from json_map/normalization

    after and threshold tracking keys from json_map/normalization

    It could be useful to grab the key used in the json_map field and be able to track by whatever the user defined in json_map. Here's an example:

    alert any $HOME_NET any -> $HOME_NET any (msg:"[WINDOWS-AUTH] Domain Controller Blocked Audit: Audit NTLM authentication to this domain controller [100/1]"; \
    program:Microsoft-Windows-NTLM/Operational; \
    json_map:"event_id",".EventID"; \
    event_id:8004; \
    json_map:"host.workstationName",".WorkstationName"; \
    json_map:"host.username",".Hostname"; \
    after: track by_host.workstationName&by_host.username, count 10, seconds 60; \
    threshold: type suppress, track by_host.workstationName&by_host.username, count 1, seconds 60; \
    classtype:attempted-user; sid:1111111; rev:1;)
    

    In the example we can use the predetermined fields, like event_id, or a user-defined variable. This gives the ability to track by two or more strings with after and/or threshold. If this can be done, maybe the same can be done with liblognorm: any fields defined using liblognorm could be tracked with after and/or threshold.

    opened by bryant-smith 0
  • strip character transformation

    strip character transformation

    Similar to https://suricata.readthedocs.io/en/suricata-6.0.6/rules/transforms.html?highlight=transform

    Since some Windows logs contain tab characters and some don't, it could be handy to have the ability to remove them and somewhat normalize the logs. This would strip all matches of the given character values, in either hex or ASCII format.

    strip_char: { [hex values|ascii values], [...] } ;

    Ex.

    strip_char: |0d 0a|;          remove carriage return / newline
    strip_char: |09|;             remove tab
    strip_char: "\t";
    strip_char: |20|, |60|, ^;    remove spaces, backticks, and carets

    The backticks and carets can be used to obfuscate commands, and having them removed can help with detections.

    A --debug transform command line option: being able to see how the log looks after the transformation would help with rule writing if something isn't working.

    opened by bryant-smith 0
  • Compilation Fails When Configuring without Normalization or JSON Output

    Compilation Fails When Configuring without Normalization or JSON Output

    I have been attempting to compile Sagan on an Alpine Linux container to realize container disk size improvements over Debian. However, I have run into an issue, outlined at wrharding/sagan-docker#1

    My configure statement is like so: ./configure --disable-lognorm --disable-libfastjson

    When attempting to run make in the outlined compilation steps, I run into the below error:

    /sagan # make
    make  all-recursive
    make[1]: Entering directory '/sagan'
    Making all in src
    make[2]: Entering directory '/sagan/src'
    gcc -DHAVE_CONFIG_H -I. -I..  -I..     -g -O2 -D__Linux__ -MT sagan-sagan.o -MD -MP -MF .deps/sagan-sagan.Tpo -c -o sagan-sagan.o `test -f 'sagan.c' || echo './'`sagan.c
    In file included from sagan.c:98:
    rules.h:293:29: error: 'MAX_JSON_DECODE_BASE64' undeclared here (not in a function)
      293 |     bool json_decode_base64[MAX_JSON_DECODE_BASE64];
          |                             ^~~~~~~~~~~~~~~~~~~~~~
    make[2]: *** [Makefile:801: sagan-sagan.o] Error 1
    make[2]: Leaving directory '/sagan/src'
    make[1]: *** [Makefile:392: all-recursive] Error 1
    make[1]: Leaving directory '/sagan'
    make: *** [Makefile:333: all] Error 2
    /sagan #
    

    I've observed the same issue on an Ubuntu 20.04 Desktop environment:

    [email protected]:~/Projects/sagan$ make
    make  all-recursive
    make[1]: Entering directory '/home/wharding/Projects/sagan'
    Making all in src
    make[2]: Entering directory '/home/wharding/Projects/sagan/src'
    gcc -DHAVE_CONFIG_H -I. -I..  -I..     -g -O2 -D__Linux__ -MT sagan-sagan.o -MD -MP -MF .deps/sagan-sagan.Tpo -c -o sagan-sagan.o `test -f 'sagan.c' || echo './'`sagan.c
    In file included from sagan.c:95:
    rules.h:293:29: error: ‘MAX_JSON_DECODE_BASE64’ undeclared here (not in a function)
      293 |     bool json_decode_base64[MAX_JSON_DECODE_BASE64];
          |                             ^~~~~~~~~~~~~~~~~~~~~~
    make[2]: *** [Makefile:800: sagan-sagan.o] Error 1
    make[2]: Leaving directory '/home/wharding/Projects/sagan/src'
    make[1]: *** [Makefile:390: all-recursive] Error 1
    make[1]: Leaving directory '/home/wharding/Projects/sagan'
    make: *** [Makefile:331: all] Error 2
    
    opened by wrharding 1
  • Centos 7 - protocol.map error: 'type' not specified at line 1

    Centos 7 - protocol.map error: 'type' not specified at line 1

    On a CentOS 7 box with updated build tools installed (yum groupinstall "Development Tools") I get the following error compiling sagan from source:

    ./configure
    make
    

    error:

    make  all-recursive
    make[1]: Entering directory `/root/sagan'
    Making all in src
    make[2]: Entering directory `/root/sagan/src'
    gcc -DHAVE_CONFIG_H -I. -I..  -I.. -I/usr/include/libfastjson    -I/usr/local/include  -g -O2 -D__Linux__ -MT sagan-sagan.o -MD -MP -MF .deps/sagan-sagan.Tpo -c -o sagan-sagan.o `test -f 'sagan.c' || echo './'`sagan.c
    sagan.c: In function ‘main’:
    sagan.c:1184:13: error: ‘for’ loop initial declarations are only allowed in C99 mode
                 for (size_t z = 0; z != globbuf.gl_pathc; ++z)
                 ^
    sagan.c:1184:13: note: use option -std=c99 or -std=gnu99 to compile your code
    make[2]: *** [sagan-sagan.o] Error 1
    make[2]: Leaving directory `/root/sagan/src'
    make[1]: *** [all-recursive] Error 1
    make[1]: Leaving directory `/root/sagan'
    make: *** [all] Error 2
    

    Make version: GNU Make 3.82, built for x86_64-redhat-linux-gnu. Compilation succeeds if I do:

    make CFLAGS=-std=c99
    

    Checking the sagan compilation:

    sagan -h
    
    --[Sagan version 2.1.0 | Help/usage screen]--------------------------------
    
    -h, --help		Help (this screen).
    -C, --credits		Sagan credits.
    -d, --debug [type]	Types: engine, syslog, load, external, threads, ipc, limits, malformed, xbit, flexbit, brointel, parse_ip, client-stats, track-clients, normalize, json.
    -D, --daemon		Make process a daemon (fork to the background).
    -u, --user [username]	Run as user (defaults to 'sagan').
    -c, --chroot [dir]	Chroot Sagan to specified directory.
    -f, --config [file]	Sagan configuration file to load.
    -F, --file [file]	FIFO over ride.  This reads a file in rather than reading
    			from a FIFO.  The file must be in the Sagan format!
    -l, --log [file]	sagan.log location [default: /var/log/sagan/sagan.log].
    -Q, --quiet		Run Sagan in 'quiet' mode (no console output)
    -t, --threads [#]	Set number of threads to use (overrides the sagan.yaml).
    -r, --rules [file]	Use a specific rule set (overrides the sagan.yaml).
    -T, --test		Test configuration (sagan.yaml) and rule sets.
    
    * liblognorm (log normalization) support is included.
    * libfastjson support is included.
    * Syslog output is included.
    * Using PCRE JIT.
    
    * Compiled on Aug  7 2022 at 14:54:28.
    

    Then I installed sagan-rules as per the documentation, but running sagan --debug syslog,engine or sagan -T gives the error below.

    [*] Loading protocol map file. [/usr/local/etc/sagan-rules/protocol.map]
    [E] [protocol-map.c, line 104] 'type' not specified at line 1
    

    I checked line 104 of protocol-map.c and it seems to be an error decoding JSON into a C struct while reading the protocol.map file:

     while(fgets(mapbuf, 1024, mapfile) != NULL)
            {
                /* Skip comments and blank linkes */
                if (mapbuf[0] == '#' || mapbuf[0] == 10 || mapbuf[0] == ';' || mapbuf[0] == 32)
                    {
                        line_number++;
                        continue;
                    }
                else
                    {
                        line_number++;
                        json_obj = json_tokener_parse(mapbuf);
                        if (json_object_object_get_ex(json_obj, "type", &tmp))
                            {
                                type = json_object_get_string(tmp);
                            }
                        else
                            {
                                Sagan_Log(ERROR, "[%s, line %d] 'type' not specified at line %d", __FILE__, __LINE__, line_number );
                            }
    

    Suspecting outdated CentOS libraries, I compiled and installed an updated version of libfastjson from git clone https://github.com/rsyslog/libfastjson.git and also recompiled and reinstalled liblognorm from source.

    libfastjson: make check
    ============================================================================
    Testsuite summary for libfastjson 1.2101.0.master
    ============================================================================
    # TOTAL: 17
    # PASS:  17
    # SKIP:  0
    # XFAIL: 0
    # FAIL:  0
    # XPASS: 0
    # ERROR: 0
    ============================================================================
    liblognorm: make check
    ============================================================================
    Testsuite summary for liblognorm 2.0.7.master
    ============================================================================
    # TOTAL: 115
    # PASS:  115
    # SKIP:  0
    # XFAIL: 0
    # FAIL:  0
    # XPASS: 0
    # ERROR: 0
    

    Then I compiled sagan again, pointing it at the new libraries (libfastjson installs into the standard /usr directory):

    make clean
    ./configure --with-lognorm-includes=/usr/local/include --with-lognorm-libraries=/usr/local/lib 
    make CFLAGS=-std=c99
    make install
    sagan -T 
    

    The error is still there:

    [*] *******************************************************************************
    [*] ** Running Sagan in 'test'. Engine will not start after testing is complete. **
    [*] *******************************************************************************
    [*] Sagan's PID is 9646
    [*] Loading classifications.conf file. [/usr/local/etc/sagan-rules/classification.config]
    [*] 53 classifications loaded
    [*] Loading references.conf file. [/usr/local/etc/sagan-rules/reference.config]
    [*] 6 references loaded.
    [*] Loading protocol map file. [/usr/local/etc/sagan-rules/protocol.map]
    [E] [protocol-map.c, line 104] 'type' not specified at line 1
    

    Any help would be appreciated.

    opened by egargale 1
Releases (v2.0.2)
  • v2.0.2 (Dec 29, 2021)

    Sagan 2.0.2 released.

                * Fixes that allow Sagan to compile using GCC 10.

                  https://github.com/quadrantsec/sagan/commit/21f753d2ad0f1c4fe5488ad5e325b9ddb3b8f2c7

                * When Sagan finds a "correlated event" (via an "xbit" or "flexbit"), Sagan will store
                  the correlated data within the fired alert EVE.  This means you don't have to search
                  for the data!
      
                  https://github.com/quadrantsec/sagan/commit/efed225c0e90b8ea9d975fed1efd390d9c6d2345
      
                * Patch from Stef Roskam changing the engine order and improving JSON parsing. Thanks Stef!!
      
                  https://github.com/quadrantsec/sagan/pull/14
      
                * Various minor JSON fixes.
      
                  https://github.com/quadrantsec/sagan/commit/ac447fb1b75f5d260e761d161167fa82c8bbe53f
                  https://github.com/quadrantsec/sagan/commit/7060725730a1311de7cfc8912f4fcc5b495fa1b4
                  https://github.com/quadrantsec/sagan/commit/e2e70565fe8f159ae4c249e585ca0129377ac053
      
                * Major code cleanup in processors/engine.c.  Over time, this code had become
                  harder to maintain.  This cleanup makes the code more maintainable and
                  more efficient, resulting in improved performance and a smaller memory
                  footprint.  Various other code cleanups as well to improve performance and
                  reduce memory usage!
      
                  https://github.com/quadrantsec/sagan/commit/ac6dcf754d1476ed7e4ceebff317a40f9f19eaf9
                  https://github.com/quadrantsec/sagan/commit/90f479b28ef14e55f7fd0652c0a6fd3c90d0485e
                  https://github.com/quadrantsec/sagan/commit/54ab349c5f0c07b1c251e874cd55bd7228f27ab4
                  https://github.com/quadrantsec/sagan/commit/21f753d2ad0f1c4fe5488ad5e325b9ddb3b8f2c7
      
                * Allow message "mapping" to take place in the signature.  For example:

                  json_map: "src_ip", ".ClientIP"

                  This maps the JSON value of ".ClientIP" to Sagan's internal "src_ip"
                  field.  That is, ".ClientIP" becomes what Sagan knows as "src_ip",
                  which can then be used with other keywords (threshold, after, etc).
                  Removed the "json-message.map" code, as this is a much more efficient
                  way to map JSON data.
      
                  https://github.com/quadrantsec/sagan/commit/2382f87c187bccadb453b5aa8287952290906896
                  https://github.com/quadrantsec/sagan/commit/977668e9f2e9f0b042ca59518d949263a68e3a1a
      
                * Fixed an issue when a JSON value is "null".
      
                  https://github.com/quadrantsec/sagan/commit/475cbf97518a6b3b8b0c95cf7192daf66f105e8f
                  https://github.com/quadrantsec/sagan/commit/ce9a6d791b8ef6a7232a5d66d462cba0299f590f
                  https://github.com/quadrantsec/sagan/commit/54ab349c5f0c07b1c251e874cd55bd7228f27ab4
                  https://github.com/quadrantsec/sagan/commit/350edda012b6588b81d1b165b8e7e495e92168b3
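
    As a concrete illustration, the json_map option drops straight into an ordinary Snort-style Sagan signature. The rule below is a hypothetical sketch, not a shipped rule: the msg, sid, classtype, and the ".ClientIP" field name are illustrative assumptions.

    ```
    alert any $EXTERNAL_NET any -> $HOME_NET any (msg: "[EXAMPLE] Repeated logins from mapped ClientIP"; json_map: "src_ip", ".ClientIP"; threshold: type limit, track by_src, count 5, seconds 300; classtype: suspicious-login; sid: 5009999; rev: 1;)
    ```

    Because ".ClientIP" is mapped to "src_ip" before detection runs, the threshold's "track by_src" correlates on the JSON field rather than on the syslog sender.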
      
    Source code(tar.gz)
    Source code(zip)
  • v2.0.1(Feb 8, 2021)

    2021/02/08 - Sagan 2.0.1 released.

                * Multiple bug fixes that address compile time issues with GCC 10. 
    
                * Can now compile with Google's TCMalloc (--enable-tcmalloc).  This 
                  might result in less memory usage and a minor increase in performance.
    
                * Bug fix for "event_id" not working in certain situations.  Thanks to
                  Ivan Kuncl (iku899) at Github for reporting this issue. 
    
                  https://github.com/quadrantsec/sagan/issues/8
    
                * Bug fix for segfault when running with --daemon flag.  Thanks to 
                  Stef Roskam (smr1983) for reporting and patching this.
    
                  https://github.com/quadrantsec/sagan/issues/2
    
                * A lot of "cleanup" work provided by Jonas Smedegaad (jonassmedegaard). 
                  This involved proper git "tagging", typo's, dirty source trees, etc. 
    
                * Removed unneeded pthread_mutex_lock() calls in bluedot.c.  This should
                  cause a minor performance increase.  Also some other minor Bluedot
                  performance enhancements.
    
                * Removed the "perfmon" function.  Use "stats-json" instead!
    
                * Added a "Max threads used" statistics.  This assists with properly
                  tuning the number of threads in your sagan.yaml.  It displays the 
                  max number of threads during the lifetime of Sagan.
    
                * Bypass content/pcre when syslog "message" is null. 
    
                  https://github.com/quadrantsec/sagan/commit/261adc243a4a43dd5c87483d31c1aacce73b95d2
    
                * Simplified the "client-stats" functions.  Sagan now writes out one JSON
                  object for each log source detected.  This change is also reflected in
                  Meer.
    
                * Sagan now records its PID on startup & minor typos fixed.
    
    Source code(tar.gz)
    Source code(zip)
  • v2.0.0(Jan 27, 2021)

    Quadrant Information Security (https://quadrantsec.com) is proud to release version 2.0.0 of the Sagan log analysis engine! Some of the major updates to this release are:

    • The Sagan repos have moved! They can now be found at:

    https://github.com/quadrantsec/sagan https://github.com/quadrantsec/sagan-rules

    • New JSON parsing options (json_content, json_pcre, etc). This makes decoding and writing rules for JSON-based logs easier. See https://sagan.readthedocs.io/en/latest/sagan-json.html#sagan-json for more details.

    • Sagan EVE now stores more GeoIP information (if available). With the use of the Maxmind “city” GeoIP2 databases, Sagan will record “city”, “postal codes”, “latitude”, “longitude”, etc.

    • Statistics are now written in a JSON format similar to Suricata JSON stats. This will replace the legacy “perfmon” stats output in 2.0.1.

    • Introduced the “event_id” rule option to automagically parse Windows event IDs from logs.

    • New “metadata” rule option for rules. This works the same as Suricata’s “metadata” rule options.

    • Added “normalization” data to EVE output.

    • New “append_program” rule option. This option appends the “program” field to the end of the syslog message. This can be useful when program fields are erratic and cannot be depended on.

    • Removed “Snortsam” and “Unified2” support.

    • Rewrote the way EVE files are written to better handle file rotation and automatic EVE file recreation.

    • Statistics now record “bytes_total” and “bytes_ignored”. This can be useful to determine how much data Sagan has processed.

    • New “client-stats” configuration option. This option will take a single log message every few minutes (user specified) and record it in a separate file. This can be useful for providing an “example” of the types of data a host is sending.

    • Better validation of signatures upon start up.

    • A lot of stability, memory, and CPU enhancements to keep Sagan as stable and efficient as possible.
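
    As an illustration of the “event_id” and “metadata” options above, a signature might look like the following hypothetical sketch (the msg, sid, and classtype values are made-up examples, not shipped rules):

    ```
    alert any $EXTERNAL_NET any -> $HOME_NET any (msg: "[EXAMPLE][WINDOWS] Failed logon"; event_id: 4625; metadata: created_at 2021_01_27; classtype: unsuccessful-user; sid: 5009998; rev: 1;)
    ```

    Here event_id matches on Windows Event ID 4625 (a failed logon) without needing a content match against the raw log text.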

    More ChangeLog information is at: https://github.com/quadrantsec/sagan/blob/main/ChangeLog

    Source code(tar.gz)
    Source code(zip)
Owner
Quadrant Information Security
Quadrant Information Security is a consulting company based in Jacksonville, FL. We operate a 24/7 SOC/MSSP and develop the Sagan Log Analysis Engine - sagan.io