From e547308882e8fca2ceb0f3bc8e16f130852c4ab1 Mon Sep 17 00:00:00 2001
From: LmR
Date: Thu, 19 Sep 2019 08:51:53 +0000
Subject: [PATCH 1/9] Add all information needed to configure the new Splunk analyzer
---
 analyzer_requirements.md | 66 +++++++++++++++++++++++++++++++++++++++-
 1 file changed, 65 insertions(+), 1 deletion(-)

diff --git a/analyzer_requirements.md b/analyzer_requirements.md
index 697a52a..ec791ff 100644
--- a/analyzer_requirements.md
+++ b/analyzer_requirements.md
@@ -80,6 +80,7 @@ on is free or requires special access or valid subscription or product license.
 * [MnemonicPDNS](#mnemonicpdns)
 * [SinkDB](#sinkdb)
 * [Shodan](#shodan)
+ * [Splunk] (#splunk)
 * [Subscription and License\-based Analyzers](#subscription-and-license-based-analyzers)
 * [DNSDB](#dnsdb)
 * [DomainTools](#domaintools)
@@ -796,6 +797,69 @@ level account, otherwise a free one can be used.
 Supply the API key as the value for the `key` parameter.

+### Splunk
+This analyzer allows you to execute a list of searches in Splunk by passing the element you are looking for as a parameter.
+
+This analyzer comes in 10 flavors:
+- Splunk_Search_**Domain_FQDN**: Dispatch a list of saved searches on a given domain/fqdn
+- Splunk_Search_**File_Filename**: Dispatch a list of saved searches on a given file/filename
+- Splunk_Search_**Hash**: Dispatch a list of saved searches on a given hash
+- Splunk_Search_**IP**: Dispatch a list of saved searches on a given IP (IPv4 only)
+- Splunk_Search_**Mail_Email**: Dispatch a list of saved searches on a given mail/email
+- Splunk_Search_**Mail_Subject**: Dispatch a list of saved searches on a given mail_subject
+- Splunk_Search_**Other**: Dispatch a list of saved searches on given data (any type)
+- Splunk_Search_**Registry**: Dispatch a list of saved searches on a given registry
+- Splunk_Search_**URL_URI_Path**: Dispatch a list of saved searches on a given url/uri_path
+- Splunk_Search_**User_Agent**: Dispatch a list of saved searches on a given 
user_agent

#### Requirements
You need to have access to a Splunk instance with a dedicated account. All the saved searches you want to use have to be grouped in the same application, with the same owner.
When you configure the analyzer, it will ask you for this information:
- host: This is the domain name or the IP of your Splunk instance.
- port: This is the port used to access Splunk (HTTPS or API).
- username (optional): If your Splunk instance has authentication, you need an account to access it (and the indexes you want to search). Please avoid using admin.
- password (optional): If your Splunk instance has authentication, this is the password of the previous account. Please avoid using admin and respect password complexity. Token access is not supported.
- application: This is the application in which all the saved searches are stored on your Splunk instance.
- owner: This is the owner of the saved searches; it must be the same for all of them. It can be different from the username mentioned above, but you will then need shared rights.
- savedsearches: A list of all the saved searches you want to execute. You just have to put the names of the saved searches here. **Each saved search will be executed/dispatched in parallel (and so each becomes a job), but the Cortex job will finish only once all Splunk jobs are done**.
- max_count: This parameter is set to 1,000 by default. It is the maximum number of results to recover from each job. A limit is set to avoid overloading the TheHive/Cortex GUI. If the value is set to 0, all available results are returned.

#### How to recover arguments in Splunk?
All arguments can be retrieved using "$args.DATATYPE$". As an example is better than a long speech, here is one:
Imagine that you have a search with this query:
```index=myindex_internet sourcetype=mysourcetype url=$args.url$*
| stats count by user, url, src_ip```
This query will recover the data using $args.url$.
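To make the substitution concrete, here is a short illustrative sketch — not the analyzer's actual code — of what Splunk effectively does when it expands `$args.DATATYPE$` tokens in a dispatched saved search. The query string is the example above; the observable value is invented:

```python
# Illustrative sketch only: Splunk itself performs this expansion when the
# analyzer dispatches a saved search with arguments. The query is the example
# from the text above; the URL value is made up.
def expand_args(query, args):
    """Replace each $args.NAME$ token with the supplied observable value."""
    for name, value in args.items():
        query = query.replace("$args.%s$" % name, value)
    return query

query = ("index=myindex_internet sourcetype=mysourcetype url=$args.url$* "
         "| stats count by user, url, src_ip")
expanded = expand_args(query, {"url": "http://example.com/bad"})
# expanded now begins: index=myindex_internet sourcetype=mysourcetype url=http://example.com/bad*
```

The effect is that the dispatched search runs with the observable in place of the token, so the saved search itself never needs to know which observable it is analyzing.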
So, you can recover your data using:
- $arg.type$: This parameter indicates the type of data (if you need so)

- $arg.domain$: This parameter contains the data for an analysis over a domain
- $arg.fqdn$: This parameter contains the data for an analysis over a fqdn
- $arg.file$: This parameter contains the data for an analysis over a file
- $arg.filename$: This parameter contains the data for an analysis over a filename
- $arg.hash$: This parameter contains the data for an analysis over a hash
- $arg.ip$: This parameter contains the data for an analysis over a ip
- $arg.mail$: This parameter contains the data for an analysis over a mail
- $arg.email$: This parameter contains the data for an analysis over a email
- $arg.email_subject$: This parameter contains the data for an analysis over a email_subject
- $arg.other$: This parameter contains the data for an analysis over a other
- $arg.registry$: This parameter contains the data for an analysis over a registry
- $arg.url$: This parameter contains the data for an analysis over a url
- $arg.uri_path$: This parameter contains the data for an analysis over a uri_path
- $arg.user-agent$: This parameter contains the data for an analysis over a user-agent

#### Taxonomies
There are 5 taxonomies available for this analyzer:
- **Splunk:Results**: Indicates the total number of results found by all the saved searches
- **Splunk:Info** (optional): Indicates the total number of results which have a field "level" set to "info"
- **Splunk:Safe** (optional): Indicates the total number of results which have a field "level" set to "safe"
- **Splunk:Suspicious** (optional): Indicates the total number of results which have a field "level" set to "suspicious"
- **Splunk:Malicious** (optional): Indicates the total number of results which have a field "level" set to "malicious"

As mentioned above, your saved searches can return a field named "level" which will be interpreted by Cortex/TheHive as a taxonomy 
and will create reports according to the value (info, safe, suspicious or malicious)
+
+
## Subscription and License-based Analyzers
### DNSDB
Leverage Farsight Security's [DNSDB](https://www.dnsdb.info/) for Passive DNS.
@@ -1058,4 +1122,4 @@ Add domain from observables in cases to Umbrella blacklist.
#### Requirements
-To configure the responder, provide the url of the service as a value for the `integration_url` parameter.
\ No newline at end of file
+To configure the responder, provide the url of the service as a value for the `integration_url` parameter.

From d5989538f6aa7ccd269cf9fff00244b461844663 Mon Sep 17 00:00:00 2001
From: LmR
Date: Thu, 19 Sep 2019 08:53:47 +0000
Subject: [PATCH 2/9] Fix bug on the summary
---
 analyzer_requirements.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/analyzer_requirements.md b/analyzer_requirements.md
index ec791ff..fad5069 100644
--- a/analyzer_requirements.md
+++ b/analyzer_requirements.md
@@ -80,7 +80,7 @@ on is free or requires special access or valid subscription or product license.
 * [MnemonicPDNS](#mnemonicpdns)
 * [SinkDB](#sinkdb)
 * [Shodan](#shodan)
- * [Splunk] (#splunk)
+ * [Splunk](#splunk)
 * [Subscription and License\-based Analyzers](#subscription-and-license-based-analyzers)
 * [DNSDB](#dnsdb)
 * [DomainTools](#domaintools)

From 620fc0b9379ed1a3407624d7fd9ce8ead977e52c Mon Sep 17 00:00:00 2001
From: LmR
Date: Thu, 19 Sep 2019 08:57:01 +0000
Subject: [PATCH 3/9] Fix some bugs
---
 analyzer_requirements.md | 22 ++++++++++++----------
 1 file changed, 12 insertions(+), 10 deletions(-)

diff --git a/analyzer_requirements.md b/analyzer_requirements.md
index fad5069..30acf05 100644
--- a/analyzer_requirements.md
+++ b/analyzer_requirements.md
@@ -815,20 +815,22 @@ #### Requirements
You need to have access to a Splunk instance with a dedicated account. All the saved searches you want to use have to be grouped in the same application, with the same owner. 
When you configure the analyzer, it will ask you for this information:
-- host: This is the domain name or the IP of your Splunk instance.
-- port: This is the port used to access Splunk (HTTPS or API).
-- username (optional): If your Splunk instance has authentication, you need an account to access it (and the indexes you want to search). Please avoid using admin.
-- password (optional): If your Splunk instance has authentication, this is the password of the previous account. Please avoid using admin and respect password complexity. Token access is not supported.
-- application: This is the application in which all the saved searches are stored on your Splunk instance.
-- owner: This is the owner of the saved searches; it must be the same for all of them. It can be different from the username mentioned above, but you will then need shared rights.
-- savedsearches: A list of all the saved searches you want to execute. You just have to put the names of the saved searches here. **Each saved search will be executed/dispatched in parallel (and so each becomes a job), but the Cortex job will finish only once all Splunk jobs are done**.
-- max_count: This parameter is set to 1,000 by default. It is the maximum number of results to recover from each job. A limit is set to avoid overloading the TheHive/Cortex GUI. If the value is set to 0, all available results are returned.
+- **host**: This is the domain name or the IP of your Splunk instance.
+- **port**: This is the port used to access Splunk (HTTPS or API).
+- **username** (optional): If your Splunk instance has authentication, you need an account to access it (and the indexes you want to search). Please avoid using admin.
+- **password** (optional): If your Splunk instance has authentication, this is the password of the previous account. Please avoid using admin and respect password complexity. Token access is not supported. 
+- **application**: This is the application in which all the saved searches are stored on your Splunk instance.
+- **owner**: This is the owner of the saved searches; it must be the same for all of them. It can be different from the username mentioned above, but you will then need shared rights.
+- **savedsearches**: A list of all the saved searches you want to execute. You just have to put the names of the saved searches here. **Each saved search will be executed/dispatched in parallel (and so each becomes a job), but the Cortex job will finish only once all Splunk jobs are done**.
+- **max_count**: This parameter is set to 1,000 by default. It is the maximum number of results to recover from each job. A limit is set to avoid overloading the TheHive/Cortex GUI. If the value is set to 0, all available results are returned.

#### How to recover arguments in Splunk?
All arguments can be retrieved using "$args.DATATYPE$". As an example is better than a long speech, here is one:
Imagine that you have a search with this query:
-```index=myindex_internet sourcetype=mysourcetype url=$args.url$*
-| stats count by user, url, src_ip```
+```
+index=myindex_internet sourcetype=mysourcetype url=$args.url$*
+| stats count by user, url, src_ip
+```
This query will recover the data using $args.url$.

From 862ecbea2d959940af7cf8d973d88ac308f04600 Mon Sep 17 00:00:00 2001
From: LmR
Date: Thu, 19 Sep 2019 08:58:22 +0000
Subject: [PATCH 4/9] Missing some 's'
---
 analyzer_requirements.md | 32 ++++++++++++++++----------------
 1 file changed, 16 insertions(+), 16 deletions(-)

diff --git a/analyzer_requirements.md b/analyzer_requirements.md
index 30acf05..e6b3976 100644
--- a/analyzer_requirements.md
+++ b/analyzer_requirements.md
@@ -834,22 +834,22 @@ index=myindex_internet sourcetype=mysourcetype url=$args.url$*
This query will recover the data using $args.url$. 
So, you can recover your data using:
-- $arg.type$: This parameter indicates the type of data (if you need so)
-
-- $arg.domain$: This parameter contains the data for an analysis over a domain
-- $arg.fqdn$: This parameter contains the data for an analysis over a fqdn
-- $arg.file$: This parameter contains the data for an analysis over a file
-- $arg.filename$: This parameter contains the data for an analysis over a filename
-- $arg.hash$: This parameter contains the data for an analysis over a hash
-- $arg.ip$: This parameter contains the data for an analysis over a ip
-- $arg.mail$: This parameter contains the data for an analysis over a mail
-- $arg.email$: This parameter contains the data for an analysis over a email
-- $arg.email_subject$: This parameter contains the data for an analysis over a email_subject
-- $arg.other$: This parameter contains the data for an analysis over a other
-- $arg.registry$: This parameter contains the data for an analysis over a registry
-- $arg.url$: This parameter contains the data for an analysis over a url
-- $arg.uri_path$: This parameter contains the data for an analysis over a uri_path
-- $arg.user-agent$: This parameter contains the data for an analysis over a user-agent
+- $args.type$: This parameter indicates the type of data (if you need it)
+
+- $args.domain$: This parameter contains the data for an analysis over a domain
+- $args.fqdn$: This parameter contains the data for an analysis over an fqdn
+- $args.file$: This parameter contains the data for an analysis over a file
+- $args.filename$: This parameter contains the data for an analysis over a filename
+- $args.hash$: This parameter contains the data for an analysis over a hash
+- $args.ip$: This parameter contains the data for an analysis over an ip
+- $args.mail$: This parameter contains the data for an analysis over a mail
+- $args.email$: This parameter contains the data for an analysis over an email
+- $args.email_subject$: This parameter contains the data for 
an analysis over an email_subject
+- $args.other$: This parameter contains the data for an analysis over other data
+- $args.registry$: This parameter contains the data for an analysis over a registry
+- $args.url$: This parameter contains the data for an analysis over a url
+- $args.uri_path$: This parameter contains the data for an analysis over a uri_path
+- $args.user-agent$: This parameter contains the data for an analysis over a user-agent

#### Taxonomies
There are 5 taxonomies available for this analyzer:

From 5e0755cc96775b1ed99156faefdd70665b737086 Mon Sep 17 00:00:00 2001
From: LmR
Date: Mon, 30 Sep 2019 13:00:54 +0000
Subject: [PATCH 5/9] Modify the documentation concerning the earliest/latest time possibility
---
 analyzer_requirements.md | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/analyzer_requirements.md b/analyzer_requirements.md
index e6b3976..5196855 100644
--- a/analyzer_requirements.md
+++ b/analyzer_requirements.md
@@ -822,6 +822,9 @@ When you configure the analyzer, it will ask you for this information:
- **application**: This is the application in which all the saved searches are stored on your Splunk instance.
- **owner**: This is the owner of the saved searches; it must be the same for all of them. It can be different from the username mentioned above, but you will then need shared rights.
- **savedsearches**: A list of all the saved searches you want to execute. You just have to put the names of the saved searches here. **Each saved search will be executed/dispatched in parallel (and so each becomes a job), but the Cortex job will finish only once all Splunk jobs are done**.
+- **earliest_time**: If not empty, this parameter will specify the earliest time to use for all searches. If empty, the earliest time set in the saved search will be used by Splunk
+- **latest_time**: If not empty, this parameter will specify the latest time to use for all searches. 
If empty, the latest time set in the saved search will be used by Splunk + - **max_count**: This parameter is set to 1,000 by default. It's the number of results to recover from the job. A limit is set to avoid any trouble in TheHive/Cortex on the GUI. If value is set to 0, then all available results are returned. #### How to recover arguments in Splunk ? From 5547835a468014b3a3c8f146edd259dd5d59e784 Mon Sep 17 00:00:00 2001 From: LmR Date: Mon, 30 Sep 2019 13:02:58 +0000 Subject: [PATCH 6/9] Fix added space --- analyzer_requirements.md | 1 - 1 file changed, 1 deletion(-) diff --git a/analyzer_requirements.md b/analyzer_requirements.md index 5196855..6a610bd 100644 --- a/analyzer_requirements.md +++ b/analyzer_requirements.md @@ -824,7 +824,6 @@ When you configure an analyzer, it will ask you these information: - **savedsearches**: A list of all saved searches you want to execute. You just have to put the name of the saved searches here. **Each saved search will be executed/dispatch in parallel (and so they will become jobs) but the Cortex job will finish once all Splunk jobs are done**. - **earliest_time**: If not empty, this parameter will specify the earliest time to use for all searches. If empty, the earliest time set in the saved search will be used by Splunk - **latest_time**: If not empty, this parameter will specify the latest time to use for all searches. If empty, the latest time set in the saved search will be used by Splunk - - **max_count**: This parameter is set to 1,000 by default. It's the number of results to recover from the job. A limit is set to avoid any trouble in TheHive/Cortex on the GUI. If value is set to 0, then all available results are returned. #### How to recover arguments in Splunk ? 
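The dispatch behaviour documented in the patches above — saved searches run in parallel, optional `earliest_time`/`latest_time` overrides, `max_count` as a result cap with `0` meaning "all" — can be sketched as follows. This is an illustration, not the analyzer's actual code: `dispatch_one` is a placeholder for the real Splunk REST calls, the saved-search names are invented, and the `dispatch.*` option names follow Splunk's saved-search dispatch options.

```python
# Hedged sketch of the dispatch model, not the analyzer's actual implementation:
# `dispatch_one` stands in for the real Splunk REST calls.
from concurrent.futures import ThreadPoolExecutor

def build_dispatch_options(earliest_time, latest_time, max_count):
    """Empty time values defer to the saved search; max_count 0 means 'all'."""
    options = {}
    if earliest_time:
        options["dispatch.earliest_time"] = earliest_time
    if latest_time:
        options["dispatch.latest_time"] = latest_time
    options["max_count"] = max_count if max_count > 0 else None
    return options

def dispatch_one(name, options):
    # Placeholder: dispatch the saved search, poll until done, fetch results.
    return {"savedsearch": name, "options": options, "results": []}

def run_all(savedsearches, options):
    # Each saved search becomes its own Splunk job, run in parallel; map()
    # only returns once every job is done, like the overall Cortex job.
    with ThreadPoolExecutor(max_workers=max(1, len(savedsearches))) as pool:
        return list(pool.map(lambda n: dispatch_one(n, options), savedsearches))

opts = build_dispatch_options("-24h", "", 1000)
reports = run_all(["search_proxy_hits", "search_dns_hits"], opts)
```

Leaving a time value empty simply omits the override, so Splunk falls back to whatever window the saved search itself defines.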
From cb1a8b9cd082634f64d87300b3a1465add321b71 Mon Sep 17 00:00:00 2001 From: LmR Date: Wed, 9 Oct 2019 11:49:26 +0000 Subject: [PATCH 7/9] Add information about the "GUI port" --- analyzer_requirements.md | 3 ++- 1 file changed, 2 insertions(+), 1 deletion(-) diff --git a/analyzer_requirements.md b/analyzer_requirements.md index 6a610bd..316e342 100644 --- a/analyzer_requirements.md +++ b/analyzer_requirements.md @@ -816,7 +816,8 @@ This analyzer comes in 10 flavors: You need to have access to a Splunk instance with a dedicated account. For any saved search you want to use, you have to group them in the same Application and with the same owner. When you configure an analyzer, it will ask you these information: - **host**: This is the domain name or the IP of your Splunk instance. -- **port**: This is the port to reach to access Splunk (HTTPS or API). +- **port**: This is the port to reach to access Splunk (API). +- **port_gui**: This is the port to reach to access Splunk (HTTP(s)) - **username** (optional): If your Splunk instance has authentication, you need an account to access to it (and to the indexes you want to search). Please avoid to use admin. - **password** (optional): If your Splunk instance has authentication, this is the password of the previous account. Please avoid to use admin and respect password complexity. No token access is supported. - **application**: This is the application in which all the saved searches are stored on your Splunk instance. From 680bbb82c3382c8220398f2e2d8c609adc5c4d55 Mon Sep 17 00:00:00 2001 From: LmR Date: Thu, 10 Oct 2019 06:28:10 +0000 Subject: [PATCH 8/9] Little fix --- analyzer_requirements.md | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/analyzer_requirements.md b/analyzer_requirements.md index 316e342..4d1c310 100644 --- a/analyzer_requirements.md +++ b/analyzer_requirements.md @@ -817,7 +817,7 @@ You need to have access to a Splunk instance with a dedicated account. 
For any s
When you configure the analyzer, it will ask you for this information:
- **host**: This is the domain name or the IP of your Splunk instance.
- **port**: This is the port to reach to access Splunk (API).
-- **port_gui**: This is the port to reach to access Splunk (HTTP(s))
+- **port_gui**: This is the port to reach to access Splunk (HTTP)
- **username** (optional): If your Splunk instance has authentication, you need an account to access it (and the indexes you want to search). Please avoid using admin.
- **password** (optional): If your Splunk instance has authentication, this is the password of the previous account. Please avoid using admin and respect password complexity. Token access is not supported.
- **application**: This is the application in which all the saved searches are stored on your Splunk instance.

From 356d78e1b8cfe5fc30a2e9954a3985d16b62005f Mon Sep 17 00:00:00 2001
From: LmR
Date: Mon, 28 Oct 2019 14:13:08 +0000
Subject: [PATCH 9/9] Add documentation about user
---
 analyzer_requirements.md | 1 +
 1 file changed, 1 insertion(+)

diff --git a/analyzer_requirements.md b/analyzer_requirements.md
index 4d1c310..c0ee966 100644
--- a/analyzer_requirements.md
+++ b/analyzer_requirements.md
@@ -811,6 +811,7 @@ This analyzer comes in 10 flavors:
- Splunk_Search_**Registry**: Dispatch a list of saved searches on a given registry
- Splunk_Search_**URL_URI_Path**: Dispatch a list of saved searches on a given url/uri_path
- Splunk_Search_**User_Agent**: Dispatch a list of saved searches on a given user_agent
+- Splunk_Search_**User**: Dispatch a list of saved searches on a given user id (variable name is 'other')

#### Requirements
You need to have access to a Splunk instance with a dedicated account. All the saved searches you want to use have to be grouped in the same application, with the same owner.
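As a closing illustration of the taxonomy behaviour documented in patch 1: a saved search may return a "level" field, and the per-level counts become the optional taxonomies alongside the total result count. The helper below is a hypothetical sketch of that roll-up, using mock result rows rather than real Splunk output:

```python
# Hypothetical sketch of rolling the optional "level" field up into the five
# taxonomies described earlier (Results plus one count per level). The rows
# are mock data; this is not the analyzer's actual code.
from collections import Counter

def build_taxonomies(results):
    levels = Counter(row.get("level") for row in results if row.get("level"))
    taxonomies = {"Splunk:Results": len(results)}
    for level in ("info", "safe", "suspicious", "malicious"):
        if levels[level]:  # the per-level taxonomies are optional: only emitted when present
            taxonomies["Splunk:%s" % level.capitalize()] = levels[level]
    return taxonomies

rows = [{"level": "malicious"}, {"level": "info"}, {"user": "jdoe"}]
print(build_taxonomies(rows))
# -> {'Splunk:Results': 3, 'Splunk:Info': 1, 'Splunk:Malicious': 1}
```

Rows without a "level" field still count toward Splunk:Results but contribute to none of the optional level taxonomies.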