fix(savedsearches): disable schedule by default
commit d677af89d9
parent d5c2348beb
@@ -15,9 +15,9 @@
 Provide custom search commands to update [malware-filter](https://gitlab.com/malware-filter) lookups. Each command downloads from a source CSV and emits rows as events, which can then be piped to a lookup file or used as a subsearch. Each command is exported globally and can be used in any app. This add-on currently does not have any UI.
 
+Source CSVs will be downloaded via a proxy if configured in "$SPLUNK_HOME/etc/system/local/[server.conf](https://docs.splunk.com/Documentation/Splunk/latest/Admin/Serverconf#Splunkd_http_proxy_configuration)".
+
+[Lookup files](./lookups/) can be updated using the bundled scheduled reports every 12 hours (every 15 minutes for botnet_ip.csv and opendbl_ip.csv). The scheduled reports are disabled by default. Enable the relevant schedule that corresponds to the required lookup file. Modify the search string to add [optional arguments](#usage).
-By default, [lookup files](./lookups/) will be updated using scheduled reports every 12 hours (every 15 minutes for botnet_ip.csv and opendbl_ip.csv). Modify the relevant saved searches to add [optional arguments](#usage).
-
-Source CSVs will be downloaded via a proxy if configured in "$SPLUNK_HOME/etc/system/local/[server.conf](https://docs.splunk.com/Documentation/Splunk/latest/Admin/Serverconf#Splunkd_http_proxy_configuration)".
 
 Refer to [this article](https://mdleom.com/blog/2023/04/16/splunk-lookup-malware-filter/) for a more comprehensive guide on detecting malicious domains, URLs, IPs and CIDR ranges.

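Because the bundled reports now ship with enableSched = 0, a deployment that still wants automatic lookup updates has to switch the relevant report back on. A minimal sketch, assuming the usual Splunk pattern of overriding in the add-on's local directory; the app directory, stanza name and the example argument below are placeholders for illustration, not taken from this commit:

# $SPLUNK_HOME/etc/apps/<add-on directory>/local/savedsearches.conf
# The stanza name must match the bundled report that maintains the lookup
# you need; check default/savedsearches.conf for the exact name.
[<name of bundled report, e.g. the one updating urlhaus-filter-splunk-online.csv>]
enableSched = 1
# Optionally override the search string to pass optional arguments from the
# Usage section; "some_option=value" is a hypothetical placeholder.
# search = | geturlhausfilter some_option=value

Overriding in local/ leaves the shipped default/ copy untouched across upgrades, which is the standard way to enable a saved search that an app ships disabled.
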
@@ -4,7 +4,7 @@ action.lookup.filename = botnet_ip.csv
 cron_schedule = */15 * * * *
 description = Update lookup every 15 minutes from 00:00
 dispatch.earliest_time = -1h
-enableSched = 1
+enableSched = 0
 schedule_window = 60
 search = | getbotnetip

@@ -14,7 +14,7 @@ action.lookup.filename = botnet-filter-splunk.csv
 cron_schedule = 0 */12 * * *
 description = Update lookup every 12 hours from 00:00
 dispatch.earliest_time = -12h
-enableSched = 1
+enableSched = 0
 schedule_window = 60
 search = | getbotnetfilter

@@ -24,7 +24,7 @@ action.lookup.filename = opendbl_ip.csv
 cron_schedule = */15 * * * *
 description = Update lookup every 15 minutes from 00:00
 dispatch.earliest_time = -1h
-enableSched = 1
+enableSched = 0
 schedule_window = 60
 search = | getopendbl

@@ -34,7 +34,7 @@ action.lookup.filename = phishing-filter-splunk.csv
 cron_schedule = 0 */12 * * *
 description = Update lookup every 12 hours from 00:00
 dispatch.earliest_time = -12h
-enableSched = 1
+enableSched = 0
 schedule_window = 60
 search = | getphishingfilter

@@ -44,7 +44,7 @@ action.lookup.filename = pup-filter-splunk.csv
 cron_schedule = 0 */12 * * *
 description = Update lookup every 12 hours from 00:00
 dispatch.earliest_time = -12h
-enableSched = 1
+enableSched = 0
 schedule_window = 60
 search = | getpupfilter

@@ -54,7 +54,7 @@ action.lookup.filename = urlhaus-filter-splunk-online.csv
 cron_schedule = 0 */12 * * *
 description = Update lookup every 12 hours from 00:00
 dispatch.earliest_time = -12h
-enableSched = 1
+enableSched = 0
 schedule_window = 60
 search = | geturlhausfilter

@@ -64,6 +64,6 @@ action.lookup.filename = vn-badsite-filter-splunk.csv
 cron_schedule = 0 */12 * * *
 description = Update lookup every 12 hours from 00:00
 dispatch.earliest_time = -12h
-enableSched = 1
+enableSched = 0
 schedule_window = 60
 search = | getvnbadsitefilter