Compare commits

...

16 Commits

Author SHA1 Message Date

Ming Di Leom 23e3238c2b
release: 0.2.0
2024-01-26 04:04:51 +00:00

Ming Di Leom 521012f9cd
refactor(savedsearches): move action.lookup to outputlookup
enables on-demand lookup update
override_if_empty=false prevents lookup from being overwritten with empty result
2024-01-26 03:55:22 +00:00

Ming Di Leom 716f9a521f
fix(transforms): leave batch_index_query to default
2024-01-26 03:48:37 +00:00

Ming Di Leom da853d5e9b
docs: example usage
2024-01-26 02:12:01 +00:00

Ming Di Leom 36fd29f277
chore(vscode): code action
https://code.visualstudio.com/updates/v1_85#_code-actions-on-save-and-auto
2024-01-26 02:08:42 +00:00

Ming Di Leom 313ee66590
release: 0.1.1
2023-11-14 07:30:07 +00:00

Ming Di Leom 1787e5e2de
fix: schedule_window should be less than cron frequency
2023-11-14 07:28:06 +00:00

Ming Di Leom 93b7e2a44c
ci: replace pylint with pre-commit
pylint has been replaced by ruff
2023-11-11 04:47:56 +00:00

Ming Di Leom 9b04a100db
ci: splunk-sdk does not support python 3.12
https://github.com/splunk/splunk-sdk-python/issues/548
2023-11-11 04:44:07 +00:00

Ming Di Leom 444b92a837
release: 0.1.0
2023-11-11 01:34:54 +00:00

Ming Di Leom 1cd2ec36a8
fix: set time range to all time
https://docs.splunk.com/Documentation/Splunk/9.1.1/SearchReference/Collect#Events_without_timestamps
2023-11-11 01:33:10 +00:00

Ming Di Leom 097ba9a3df
chore(pre-commit): update hooks
2023-10-01 10:10:49 +00:00

Ming Di Leom 81ee292f1d
build: reset file permission
running in windows may set execution permission
that is not allowed in splunk cloud
2023-10-01 10:10:24 +00:00

Ming Di Leom 4e084978b5
docs: add splunkbase page
2023-07-20 10:57:59 +00:00

Ming Di Leom b9d7dae295
release: 0.0.13
2023-07-18 10:37:40 +00:00

Ming Di Leom d677af89d9
fix(savedsearches): disable schedule by default
2023-07-18 10:27:04 +00:00
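
Commit 1787e5e2de is worth a note before the diffs: a report's schedule_window must close before the next cron run is due, so it has to be shorter than the cron frequency. The savedsearches.conf diff below applies this; a minimal sketch of the corrected pairing:

```
# 15-minute cron: the window must close well before the next run
cron_schedule = */15 * * * *
schedule_window = 5

# 12-hour cron: a 60-minute window fits comfortably
cron_schedule = 0 */12 * * *
schedule_window = 60
```
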
9 changed files with 95 additions and 70 deletions

View File: .gitlab-ci.yml

@@ -1,4 +1,4 @@
-image: python:slim
+image: python:3.11-slim
 
 variables:
   PIP_CACHE_DIR: "$CI_PROJECT_DIR/.cache/pip"
@@ -15,19 +15,24 @@ stages:
 lint:
   stage: test
+  variables:
+    PRE_COMMIT_HOME: ${CI_PROJECT_DIR}/.cache/pre-commit
   cache:
     paths:
       - .cache/pip
       - .venv/
+      - ${PRE_COMMIT_HOME}
   before_script:
+    - apt-get update && apt-get install -y --no-install-recommends git
     - python --version
     - python -m venv .venv
     - source .venv/bin/activate
     - pip install -r requirements-dev.txt -U
   script:
-    - pylint $(find -type f -name "*.py" ! -path "./.venv/**" ! -path "./lib/**")
+    - pre-commit run --all-files
 
 build:
   stage: build
@@ -56,7 +61,8 @@ release_job:
   stage: release
   image: registry.gitlab.com/gitlab-org/release-cli:latest
   rules:
-    - if: $CI_COMMIT_TAG # Run this job when a tag is created manually
+    # Run this job when a tag is created manually
+    - if: $CI_COMMIT_TAG
   script:
     - echo "Running the release job."
   release:
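
Two CI changes travel together here: the image is pinned to python:3.11-slim (splunk-sdk does not yet support Python 3.12, per commit 9b04a100db), and the direct pylint invocation gives way to pre-commit, which now drives ruff and the formatters. The same checks can be reproduced locally; a minimal sketch, assuming pre-commit ships in requirements-dev.txt:

```
# install the dev toolchain, then run every configured hook on the full tree
pip install -r requirements-dev.txt
pre-commit run --all-files
```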

View File: .pre-commit-config.yaml

@@ -18,15 +18,15 @@ repos:
       - id: end-of-file-fixer
       - id: trailing-whitespace
   - repo: https://github.com/charliermarsh/ruff-pre-commit
-    rev: "v0.0.254"
+    rev: "v0.0.291"
     hooks:
       - id: ruff
         args: [--fix, --exit-non-zero-on-fix]
   - repo: https://github.com/psf/black
-    rev: 22.12.0
+    rev: 23.9.1
     hooks:
       - id: black
   - repo: https://github.com/pre-commit/mirrors-prettier
-    rev: "v2.7.1"
+    rev: "v3.0.3"
     hooks:
       - id: prettier
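
These rev bumps pin each hook to a newer release. pre-commit can produce exactly this kind of change on its own, so a plausible workflow for future updates is:

```
# rewrite each rev: in .pre-commit-config.yaml to the latest tag
pre-commit autoupdate
# confirm the updated hooks still pass
pre-commit run --all-files
```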

View File: .vscode/settings.json

@@ -3,14 +3,13 @@
   "files.trimTrailingWhitespace": true,
   "files.insertFinalNewline": true,
   "files.trimFinalNewlines": true,
-  "python.formatting.provider": "none",
   "[python]": {
     "editor.defaultFormatter": "ms-python.black-formatter",
     "editor.formatOnSave": true,
     "editor.tabSize": 4,
     "editor.codeActionsOnSave": {
-      "source.fixAll": true,
-      "source.organizeImports": true
+      "source.fixAll": "explicit",
+      "source.organizeImports": "explicit"
     }
   },
   "[markdown]": {

View File: README.md

@@ -15,9 +15,9 @@
 Provide custom search commands to update [malware-filter](https://gitlab.com/malware-filter) lookups. Each command downloads from a source CSV and emit rows as events which can then be piped to a lookup file or used as a subsearch. Each command is exported globally and can be used in any app. This add-on currently does not have any UI.
 
-Source CSVs will be downloaded via a proxy if configured in "$SPLUNK_HOME/etc/system/local/[server.conf](https://docs.splunk.com/Documentation/Splunk/latest/Admin/Serverconf#Splunkd_http_proxy_configuration)".
+[Lookup files](./lookups/) can be updated using the bundled scheduled reports every 12 hours, every 15 minutes for botnet_ip.csv and opendbl_ip.csv. The scheduled reports are disabled by default. Enable the relevant schedule that corresponds to the required lookup file. Modify the search string to add [optional arguments](#usage).
 
-By default, [lookup files](./lookups/) will be updated using scheduled reports every 12 hours, every 15 minutes for botnet_ip.csv and opendbl_ip.csv. Modify the relevant saved searches to add [optional arguments](#usage).
+Source CSVs will be downloaded via a proxy if configured in "$SPLUNK_HOME/etc/system/local/[server.conf](https://docs.splunk.com/Documentation/Splunk/latest/Admin/Serverconf#Splunkd_http_proxy_configuration)".
 
 Refer to [this article](https://mdleom.com/blog/2023/04/16/splunk-lookup-malware-filter/) for a more comprehensive guide on detecting malicious domain, URL, IP and CIDR range.
@@ -25,7 +25,7 @@ Tested on Splunk 9.x.
 ## Installation
 
-Releases are available at https://gitlab.com/malware-filter/splunk-malware-filter/-/releases
+Releases are available at [Splunkbase](https://splunkbase.splunk.com/app/6970) and [GitLab](https://gitlab.com/malware-filter/splunk-malware-filter/-/releases)
 
 Instruction to build the main branch is available at the [Build](#build) section.
@@ -156,6 +156,27 @@ Recommend to update the lookup file "opendbl_ip.csv" every 15 minutes (cron `*/15 * * * *`).
 
 Source: https://opendbl.net/
 
+## Example usage
+
+```
+| tstats summariesonly=true allow_old_summaries=true count FROM datamodel=Web WHERE Web.action="allowed"
+BY Web.user, Web.src, Web.dest, Web.site, Web.url, Web.category, Web.action, index, _time span=1s
+| rename Web.* AS *
+| lookup urlhaus-filter-splunk-online host AS site, host AS dest OUTPUT message AS description, updated
+| lookup urlhaus-filter-splunk-online path_wildcard_prefix AS vendor_url, host AS site, host AS dest OUTPUT message AS description2, updated AS updated2
+| lookup phishing-filter-splunk host AS site, host AS dest OUTPUT message AS description3, updated AS updated3
+| lookup phishing-filter-splunk path_wildcard_prefix AS vendor_url, host AS site, host AS dest OUTPUT message AS description4, updated AS updated4
+| lookup pup-filter-splunk host AS site, host AS dest OUTPUT message AS description5, updated AS updated5
+| lookup vn-badsite-filter-splunk host AS site, host AS dest OUTPUT message AS description6, updated AS updated6
+| lookup botnet_ip dst_ip AS dest OUTPUT malware AS description7, updated AS updated7
+| eval Description=coalesce(description, description2, description3, description4, description5, description6, description7)
+| search Description=*
+| eval updated=coalesce(updated, updated2, updated3, updated4, updated5, updated6, updated7), "Signature Last Updated"=strftime(strptime(updated." +0000","%Y-%m-%dT%H:%M:%SZ %z"),"%Y-%m-%d %H:%M:%S %z"), Time=strftime(_time, "%Y-%m-%d %H:%M:%S %z"), "Source IP"=src, Username=user, Domain=site, "Destination IP"=dest, URL=url, Action=action
+| table Time, index, "Signature Last Updated", "Source IP", Username, Domain, "Destination IP", Description, Action, URL
+```
+
+It is not recommended to use subsearch (e.g. `[| inputlookup urlhaus-filter-splunk-online.csv | fields host ]`) for these [lookup tables](./lookups/) especially [urlhaus-filter](./lookups/urlhaus-filter-splunk-online.csv) and [phishing-filter](./lookups/phishing-filter-splunk.csv) because they usually have more than 30,000 rows, which exceed the soft-limit of [10,000 rows](https://docs.splunk.com/Documentation/SplunkCloud/latest/Search/Aboutsubsearches#Subsearch_performance_considerations) returned by subsearch.
+
 ## Disable individual commands
 
 Settings -> All configurations -> filter by "malware_filter" app

View File: app.manifest

@@ -5,7 +5,7 @@
   "id": {
     "group": null,
     "name": "TA-malware-filter",
-    "version": "0.0.12"
+    "version": "0.2.0"
   },
   "author": [
     {

View File: build.py

@@ -5,7 +5,9 @@
 import tarfile
 from configparser import ConfigParser
 from os import environ, path
-from re import search, sub
+from pathlib import PurePath
+from posixpath import join as posixjoin
+from re import search
 from subprocess import check_call
 from sys import executable
@@ -50,18 +52,31 @@ def exclusion(tarinfo):
     # exclude certain folders/files
     pathname = tarinfo.name
     if search(
-        r"/\.|\\\.|__pycache__|pyproject\.toml|requirements|build\.py|tar\.gz", pathname
+        r"/\.|\\\.|__pycache__|pyproject\.toml|requirements|build\.py|\.tar\.gz|\.tgz",
+        pathname,
     ):
         return None
 
-    # rename parent folder as "TA-malware-filter"
-    tarinfo.name = sub(r"^.", "TA-malware-filter", pathname)
+    app = PurePath(pathname).parts[0]
 
     # reset file stats
     # based on https://splunkbase.splunk.com/app/833
-    tarinfo.uid = 1001
-    tarinfo.gid = 123
+    tarinfo.uid = 0
+    tarinfo.gid = 0
     tarinfo.uname = tarinfo.gname = ""
+    if tarinfo.isfile():
+        # remove execution permission
+        tarinfo.mode = 0o644
+        # except for scripts
+        # tarinfo uses posix (not nt)
+        if (
+            tarinfo.name.startswith(posixjoin(app, "bin"))
+            and path.splitext(tarinfo.name)[-1] == ".py"
+        ):
+            tarinfo.mode = 0o744
+    if tarinfo.isdir():
+        # remove write permission from group & world
+        tarinfo.mode = 0o755
 
     return tarinfo
@@ -84,4 +99,4 @@ check_call(
 pkg_file = f"TA-malware-filter-{version()}.tar.gz"
 print(f"Creating {pkg_file}...")
 with tarfile.open(pkg_file, "w:gz") as tar:
-    tar.add(".", filter=exclusion)
+    tar.add(".", filter=exclusion, arcname="TA-malware-filter")

View File: default/app.conf

@@ -1,6 +1,3 @@
-#
-# App configuration file
-#
 [install]
 is_configured = false
@@ -9,7 +6,7 @@ id = TA-malware-filter
 [id]
 name = TA-malware-filter
-version = 0.0.12
+version = 0.2.0
 
 [ui]
 is_visible = false
@@ -18,4 +15,4 @@ label = malware-filter Add-on
 [launcher]
 author = Ming Di Leom
 description = Update malware-filter lookups. https://gitlab.com/malware-filter
-version = 0.0.12
+version = 0.2.0

View File: default/savedsearches.conf

@@ -1,69 +1,63 @@
 [malware-filter Update botnet_ip.csv]
-action.lookup = 1
-action.lookup.filename = botnet_ip.csv
 cron_schedule = */15 * * * *
 description = Update lookup every 15 minutes from 00:00
-dispatch.earliest_time = -1h
-enableSched = 1
-schedule_window = 60
-search = | getbotnetip
+# https://docs.splunk.com/Documentation/Splunk/latest/SearchReference/Collect#Events_without_timestamps
+dispatch.earliest_time = 0
+enableSched = 0
+schedule_window = 5
+search = | getbotnetip\
+| outputlookup override_if_empty=false botnet_ip.csv
 
 [malware-filter Update botnet-filter-splunk.csv]
-action.lookup = 1
-action.lookup.filename = botnet-filter-splunk.csv
 cron_schedule = 0 */12 * * *
 description = Update lookup every 12 hours from 00:00
-dispatch.earliest_time = -12h
-enableSched = 1
+dispatch.earliest_time = 0
+enableSched = 0
 schedule_window = 60
-search = | getbotnetfilter
+search = | getbotnetfilter\
+| outputlookup override_if_empty=false botnet-filter-splunk.csv
 
 [malware-filter Update opendbl_ip.csv]
-action.lookup = 1
-action.lookup.filename = opendbl_ip.csv
 cron_schedule = */15 * * * *
 description = Update lookup every 15 minutes from 00:00
-dispatch.earliest_time = -1h
-enableSched = 1
-schedule_window = 60
-search = | getopendbl
+dispatch.earliest_time = 0
+enableSched = 0
+schedule_window = 5
+search = | getopendbl\
+| outputlookup override_if_empty=false opendbl_ip.csv
 
 [malware-filter Update phishing-filter-splunk.csv]
-action.lookup = 1
-action.lookup.filename = phishing-filter-splunk.csv
 cron_schedule = 0 */12 * * *
 description = Update lookup every 12 hours from 00:00
-dispatch.earliest_time = -12h
-enableSched = 1
+dispatch.earliest_time = 0
+enableSched = 0
 schedule_window = 60
-search = | getphishingfilter
+search = | getphishingfilter\
+| outputlookup override_if_empty=false phishing-filter-splunk.csv
 
 [malware-filter Update pup-filter-splunk.csv]
-action.lookup = 1
-action.lookup.filename = pup-filter-splunk.csv
 cron_schedule = 0 */12 * * *
 description = Update lookup every 12 hours from 00:00
-dispatch.earliest_time = -12h
-enableSched = 1
+dispatch.earliest_time = 0
+enableSched = 0
 schedule_window = 60
-search = | getpupfilter
+search = | getpupfilter\
+| outputlookup override_if_empty=false pup-filter-splunk.csv
 
 [malware-filter Update urlhaus-filter-splunk-online.csv]
-action.lookup = 1
-action.lookup.filename = urlhaus-filter-splunk-online.csv
 cron_schedule = 0 */12 * * *
 description = Update lookup every 12 hours from 00:00
-dispatch.earliest_time = -12h
-enableSched = 1
+dispatch.earliest_time = 0
+enableSched = 0
 schedule_window = 60
-search = | geturlhausfilter
+search = | geturlhausfilter\
+| outputlookup override_if_empty=false urlhaus-filter-splunk-online.csv
 
 [malware-filter Update vn-badsite-filter-splunk.csv]
-action.lookup = 1
-action.lookup.filename = vn-badsite-filter-splunk.csv
 cron_schedule = 0 */12 * * *
 description = Update lookup every 12 hours from 00:00
-dispatch.earliest_time = -12h
-enableSched = 1
+dispatch.earliest_time = 0
+enableSched = 0
 schedule_window = 60
-search = | getvnbadsitefilter
+search = | getvnbadsitefilter\
+| outputlookup override_if_empty=false vn-badsite-filter-splunk.csv
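
Replacing the action.lookup post-processing with an explicit | outputlookup inside the search (commit 521012f9cd) is what enables the on-demand updates mentioned in the commit message: the same pipeline can be pasted into the search bar without waiting for the scheduler, and override_if_empty=false stops a failed or empty download from wiping the existing lookup. For example:

```
| geturlhausfilter
| outputlookup override_if_empty=false urlhaus-filter-splunk-online.csv
```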

View File: default/transforms.conf

@@ -1,39 +1,32 @@
 [urlhaus-filter-splunk-online]
-batch_index_query = 0
 case_sensitive_match = 1
 filename = urlhaus-filter-splunk-online.csv
 max_matches = 1
 
 [phishing-filter-splunk]
-batch_index_query = 0
 case_sensitive_match = 1
 filename = phishing-filter-splunk.csv
 max_matches = 1
 
 [pup-filter-splunk]
-batch_index_query = 0
 case_sensitive_match = 1
 filename = pup-filter-splunk.csv
 max_matches = 1
 
 [vn-badsite-filter-splunk]
-batch_index_query = 0
 case_sensitive_match = 1
 filename = vn-badsite-filter-splunk.csv
 max_matches = 1
 
 [botnet-filter-splunk]
-batch_index_query = 0
 case_sensitive_match = 1
 filename = botnet-filter-splunk.csv
 
 [botnet_ip]
-batch_index_query = 0
 case_sensitive_match = 1
 filename = botnet_ip.csv
 
 [opendbl_ip]
-batch_index_query = 0
 case_sensitive_match = 1
 filename = opendbl_ip.csv
 min_matches = 1
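
Dropping the explicit batch_index_query = 0 (commit 716f9a521f) simply leaves that setting at Splunk's default instead of forcing it off. The stanzas are otherwise unchanged and are consumed the same way as in the README example, e.g.:

```
| lookup urlhaus-filter-splunk-online host AS site, host AS dest OUTPUT message AS description, updated
```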