refactor: deploy filters to gitlab pages

- number of commits is becoming unwieldy
- gitlab pages is much faster than downloading from raw git
  * main download link is still curben.gitlab.io/malware-filter/
  * the con is relying on gitlab ci/pages; the previous approach was more portable
MDLeom 2022-01-08 02:09:42 +00:00
commit 8c94ddba40
No known key found for this signature in database
GPG Key ID: 32D3E28E96A695E8
9 changed files with 1193 additions and 0 deletions

4
.gitignore vendored Normal file

@@ -0,0 +1,4 @@
tmp/
.vscode/
public/
node_modules/

30
.gitlab-ci.yml Normal file

@@ -0,0 +1,30 @@
image: alpine:latest # Use the latest version of Alpine Linux docker image
build_job:
stage: build
before_script:
- apk update && apk add brotli curl grep
script:
- sh src/script.sh
- find public -type f -regex '.*\.\(txt\|conf\|tpl\|rules\)$' -exec gzip -f -k -9 {} \;
- find public -type f -regex '.*\.\(txt\|conf\|tpl\|rules\)$' -exec brotli -f -k -9 {} \;
artifacts:
paths:
- tmp
- public
pages:
stage: deploy
script:
- echo
artifacts:
paths:
- public
include:
- template: Security/Secret-Detection.gitlab-ci.yml
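The `find`/`gzip`/`brotli` step above pre-compresses every filter file so Pages can serve `.gz`/`.br` variants alongside the originals. A minimal local sketch of the gzip half (throwaway file, GNU find assumed):

```shell
# Create a sample file tree like the CI job's "public/" output
mkdir -p public
printf 'example filter line\n' > public/sample.txt

# Pre-compress every matching file, keeping the original (-k)
find public -type f -regex '.*\.\(txt\|conf\|tpl\|rules\)$' -exec gzip -f -k -9 {} \;

ls public   # both sample.txt and sample.txt.gz are now present
```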

1
.nvmrc Normal file

@@ -0,0 +1 @@
lts/*

42
LICENSE.md Normal file

@@ -0,0 +1,42 @@
CC0 1.0 Universal
==================
Statement of Purpose
---------------------
The laws of most jurisdictions throughout the world automatically confer exclusive Copyright and Related Rights (defined below) upon the creator and subsequent owner(s) (each and all, an "owner") of an original work of authorship and/or a database (each, a "Work").
Certain owners wish to permanently relinquish those rights to a Work for the purpose of contributing to a commons of creative, cultural and scientific works ("Commons") that the public can reliably and without fear of later claims of infringement build upon, modify, incorporate in other works, reuse and redistribute as freely as possible in any form whatsoever and for any purposes, including without limitation commercial purposes. These owners may contribute to the Commons to promote the ideal of a free culture and the further production of creative, cultural and scientific works, or to gain reputation or greater distribution for their Work in part through the use and efforts of others.
For these and/or other purposes and motivations, and without any expectation of additional consideration or compensation, the person associating CC0 with a Work (the "Affirmer"), to the extent that he or she is an owner of Copyright and Related Rights in the Work, voluntarily elects to apply CC0 to the Work and publicly distribute the Work under its terms, with knowledge of his or her Copyright and Related Rights in the Work and the meaning and intended legal effect of CC0 on those rights.
1. Copyright and Related Rights.
--------------------------------
A Work made available under CC0 may be protected by copyright and related or neighboring rights ("Copyright and Related Rights"). Copyright and Related Rights include, but are not limited to, the following:
i. the right to reproduce, adapt, distribute, perform, display, communicate, and translate a Work;
ii. moral rights retained by the original author(s) and/or performer(s);
iii. publicity and privacy rights pertaining to a person's image or likeness depicted in a Work;
iv. rights protecting against unfair competition in regards to a Work, subject to the limitations in paragraph 4(a), below;
v. rights protecting the extraction, dissemination, use and reuse of data in a Work;
vi. database rights (such as those arising under Directive 96/9/EC of the European Parliament and of the Council of 11 March 1996 on the legal protection of databases, and under any national implementation thereof, including any amended or successor version of such directive); and
vii. other similar, equivalent or corresponding rights throughout the world based on applicable law or treaty, and any national implementations thereof.
2. Waiver.
-----------
To the greatest extent permitted by, but not in contravention of, applicable law, Affirmer hereby overtly, fully, permanently, irrevocably and unconditionally waives, abandons, and surrenders all of Affirmer's Copyright and Related Rights and associated claims and causes of action, whether now known or unknown (including existing as well as future claims and causes of action), in the Work (i) in all territories worldwide, (ii) for the maximum duration provided by applicable law or treaty (including future time extensions), (iii) in any current or future medium and for any number of copies, and (iv) for any purpose whatsoever, including without limitation commercial, advertising or promotional purposes (the "Waiver"). Affirmer makes the Waiver for the benefit of each member of the public at large and to the detriment of Affirmer's heirs and successors, fully intending that such Waiver shall not be subject to revocation, rescission, cancellation, termination, or any other legal or equitable action to disrupt the quiet enjoyment of the Work by the public as contemplated by Affirmer's express Statement of Purpose.
3. Public License Fallback.
----------------------------
Should any part of the Waiver for any reason be judged legally invalid or ineffective under applicable law, then the Waiver shall be preserved to the maximum extent permitted taking into account Affirmer's express Statement of Purpose. In addition, to the extent the Waiver is so judged Affirmer hereby grants to each affected person a royalty-free, non transferable, non sublicensable, non exclusive, irrevocable and unconditional license to exercise Affirmer's Copyright and Related Rights in the Work (i) in all territories worldwide, (ii) for the maximum duration provided by applicable law or treaty (including future time extensions), (iii) in any current or future medium and for any number of copies, and (iv) for any purpose whatsoever, including without limitation commercial, advertising or promotional purposes (the "License"). The License shall be deemed effective as of the date CC0 was applied by Affirmer to the Work. Should any part of the License for any reason be judged legally invalid or ineffective under applicable law, such partial invalidity or ineffectiveness shall not invalidate the remainder of the License, and in such case Affirmer hereby affirms that he or she will not (i) exercise any of his or her remaining Copyright and Related Rights in the Work or (ii) assert any associated claims and causes of action with respect to the Work, in either case contrary to Affirmer's express Statement of Purpose.
4. Limitations and Disclaimers.
--------------------------------
a. No trademark or patent rights held by Affirmer are waived, abandoned, surrendered, licensed or otherwise affected by this document.
b. Affirmer offers the Work as-is and makes no representations or warranties of any kind concerning the Work, express, implied, statutory or otherwise, including without limitation warranties of title, merchantability, fitness for a particular purpose, non infringement, or the absence of latent or other defects, accuracy, or the present or absence of errors, whether or not discoverable, all to the greatest extent permissible under applicable law.
c. Affirmer disclaims responsibility for clearing rights of other persons that may apply to the Work or any use thereof, including without limitation any person's Copyright and Related Rights in the Work. Further, Affirmer disclaims responsibility for obtaining any necessary consents, permissions or other rights required for any use of the Work.
d. Affirmer understands and acknowledges that Creative Commons is not a party to this document and has no duty or obligation with respect to this CC0 or use of the Work.
For more information, please see
https://creativecommons.org/publicdomain/zero/1.0/

588
README.md Normal file

@@ -0,0 +1,588 @@
# Malicious URL Blocklist
> Edit 2022/01/08: the default branch has changed to **main**.
A blocklist of malicious websites that are being used for malware distribution, based on the **Database dump (CSV)** of Abuse.ch [URLhaus](https://urlhaus.abuse.ch/). The blocklist is updated twice a day.
There are multiple formats available, refer to the appropriate section according to the program used:
- uBlock Origin (uBO) -> [URL-based](#url-based) section (recommended)
- Pi-hole -> [Domain-based](#domain-based) or [Hosts-based](#hosts-based) section
- AdGuard Home -> [Domain-based (AdGuard Home)](#domain-based-adguard-home) or [Hosts-based](#hosts-based) section
- AdGuard (browser extension) -> [URL-based (AdGuard)](#url-based-adguard)
- Vivaldi -> [URL-based (Vivaldi)](#url-based-vivaldi)
- [Hosts](#hosts-based)
- [Dnsmasq](#dnsmasq)
- BIND -> BIND [zone](#bind) or [RPZ](#response-policy-zone)
- [Unbound](#unbound)
- [dnscrypt-proxy](#dnscrypt-proxy)
- Internet Explorer -> [Tracking Protection List (IE)](#tracking-protection-list-ie)
- [Snort2](#snort2)
- [Snort3](#snort3)
- [Suricata](#suricata)
Not sure which format to choose? See [Compatibility](https://gitlab.com/curben/urlhaus-filter/wikis/compatibility) page in the wiki.
Check out my other filters:
- [phishing-filter](https://gitlab.com/curben/phishing-filter)
- [pup-filter](https://gitlab.com/curben/pup-filter)
- [tracking-filter](https://gitlab.com/curben/tracking-filter)
## URL-based
Import the following URL into uBO to subscribe (includes online and **offline** malicious websites):
- https://curben.gitlab.io/malware-filter/urlhaus-filter.txt
<details>
<summary>Mirrors</summary>
- https://curben.gitlab.io/urlhaus-filter/urlhaus-filter.txt
</details>
<br />
Lite version (**online** links only):
_enabled by default in uBO >=[1.28.2](https://github.com/gorhill/uBlock/releases/tag/1.28.2)_
- https://curben.gitlab.io/malware-filter/urlhaus-filter-online.txt
<details>
<summary>Mirrors</summary>
- https://curben.gitlab.io/urlhaus-filter/urlhaus-filter-online.txt
</details>
**Note:** The lite version is 99% smaller because it excludes offline URLs. URL status is determined by the upstream Abuse.ch; that check is not 100% accurate, so some malicious URLs that remain accessible may be missed. If bandwidth (9 MB/day) is not a constraint, I recommend the regular version; browser extensions may utilise [HTTP compression](https://developer.mozilla.org/en-US/docs/Web/HTTP/Compression), which can save 70% of bandwidth.
The regular version contains >260K filters; note that uBO can [easily handle](https://github.com/uBlockOrigin/uBlock-issues/issues/338#issuecomment-452843669) 500K filters.
If you've installed the lite version but prefer the regular version, remove the lite version first. Having both enabled at the same time causes no conflict, as uBO detects duplicate network filters and adjusts accordingly, but it wastes bandwidth.
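As a rough local illustration of how well repetitive filter text compresses (made-up data, not the real list):

```shell
# 1,000 identical filter-style lines as a stand-in for a blocklist
seq 1000 | sed 's/.*/||example-malware.top^$all/' > filter-sample.txt
gzip -k -9 filter-sample.txt

wc -c < filter-sample.txt      # uncompressed bytes
wc -c < filter-sample.txt.gz   # a small fraction of the above
```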
**AdGuard Home** users should use [this blocklist](#domain-based-adguard-home).
## URL-based (AdGuard)
Import the following URL into AdGuard browser extensions to subscribe (includes online and **offline** malicious websites):
- https://curben.gitlab.io/malware-filter/urlhaus-filter-ag.txt
<details>
<summary>Mirrors</summary>
- https://curben.gitlab.io/urlhaus-filter/urlhaus-filter-ag.txt
</details>
<br />
Lite version (**online** links only):
- https://curben.gitlab.io/malware-filter/urlhaus-filter-ag-online.txt
<details>
<summary>Mirrors</summary>
- https://curben.gitlab.io/urlhaus-filter/urlhaus-filter-ag-online.txt
</details>
## URL-based (Vivaldi)
_Requires Vivaldi Desktop/Android 3.3+, blocking level must be at least "Block Trackers"_
Import the following URL into Vivaldi's **Tracker Blocking Sources** to subscribe (includes online and **offline** malicious websites):
- https://curben.gitlab.io/malware-filter/urlhaus-filter-vivaldi.txt
<details>
<summary>Mirrors</summary>
- https://curben.gitlab.io/urlhaus-filter/urlhaus-filter-vivaldi.txt
</details>
<br />
Lite version (**online** links only):
- https://curben.gitlab.io/malware-filter/urlhaus-filter-vivaldi-online.txt
<details>
<summary>Mirrors</summary>
- https://curben.gitlab.io/urlhaus-filter/urlhaus-filter-vivaldi-online.txt
</details>
## Domain-based
This blocklist includes domains and IP addresses.
- https://curben.gitlab.io/malware-filter/urlhaus-filter-domains.txt
<details>
<summary>Mirrors</summary>
- https://curben.gitlab.io/urlhaus-filter/urlhaus-filter-domains.txt
</details>
<br />
Lite version (online domains/IPs only):
- https://curben.gitlab.io/malware-filter/urlhaus-filter-domains-online.txt
<details>
<summary>Mirrors</summary>
- https://curben.gitlab.io/urlhaus-filter/urlhaus-filter-domains-online.txt
</details>
## Domain-based (AdGuard Home)
This AdGuard Home-compatible blocklist includes domains and IP addresses.
- https://curben.gitlab.io/malware-filter/urlhaus-filter-agh.txt
<details>
<summary>Mirrors</summary>
- https://curben.gitlab.io/urlhaus-filter/urlhaus-filter-agh.txt
</details>
<br />
Lite version (online domains/IPs only):
- https://curben.gitlab.io/malware-filter/urlhaus-filter-agh-online.txt
<details>
<summary>Mirrors</summary>
- https://curben.gitlab.io/urlhaus-filter/urlhaus-filter-agh-online.txt
</details>
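As the build script does with `sed`, the AdGuard Home lists wrap each bare domain in AdGuard's `||domain^` rule syntax. A one-liner sketch with a made-up domain:

```shell
# Turn a bare domain into an AdGuard Home blocking rule
printf 'example-malware.top\n' | sed 's/^/||/; s/$/^/'
# ||example-malware.top^
```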
## Hosts-based
This blocklist includes domains only.
- https://curben.gitlab.io/malware-filter/urlhaus-filter-hosts.txt
<details>
<summary>Mirrors</summary>
- https://curben.gitlab.io/urlhaus-filter/urlhaus-filter-hosts.txt
</details>
<br />
Lite version (online domains only):
- https://curben.gitlab.io/malware-filter/urlhaus-filter-hosts-online.txt
<details>
<summary>Mirrors</summary>
- https://curben.gitlab.io/urlhaus-filter/urlhaus-filter-hosts-online.txt
</details>
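Hosts-style lists conventionally map each blocked domain to the unroutable address `0.0.0.0`; assuming that convention, the rule shape is:

```shell
# Turn a bare domain into a hosts-file entry (assumed 0.0.0.0 sinkhole)
printf 'example-malware.top\n' | sed 's/^/0.0.0.0 /'
# 0.0.0.0 example-malware.top
```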
## Dnsmasq
This blocklist includes domains only.
### Install
```
# Create a new folder to store the blocklist
mkdir -p /usr/local/etc/dnsmasq/
# Create a new cron job for daily update
printf '#!/bin/sh\ncurl -L "https://curben.gitlab.io/malware-filter/urlhaus-filter-dnsmasq.conf" -o "/usr/local/etc/dnsmasq/urlhaus-filter-dnsmasq.conf"\n' > /etc/cron.daily/urlhaus-filter
# cron job requires execution permission
chmod 755 /etc/cron.daily/urlhaus-filter
# Configure dnsmasq to use the blocklist
printf "\nconf-file=/usr/local/etc/dnsmasq/urlhaus-filter-dnsmasq.conf\n" >> /etc/dnsmasq.conf
```
- https://curben.gitlab.io/malware-filter/urlhaus-filter-dnsmasq.conf
<details>
<summary>Mirrors</summary>
- https://curben.gitlab.io/urlhaus-filter/urlhaus-filter-dnsmasq.conf
</details>
<br />
Lite version (online domains only):
- https://curben.gitlab.io/malware-filter/urlhaus-filter-dnsmasq-online.conf
<details>
<summary>Mirrors</summary>
- https://curben.gitlab.io/urlhaus-filter/urlhaus-filter-dnsmasq-online.conf
</details>
## BIND
This blocklist includes domains only.
### Install
```
# Create a new folder to store the blocklist
mkdir -p /usr/local/etc/bind/
# Create a new cron job for daily update
printf '#!/bin/sh\ncurl -L "https://curben.gitlab.io/malware-filter/urlhaus-filter-bind.conf" -o "/usr/local/etc/bind/urlhaus-filter-bind.conf"\n' > /etc/cron.daily/urlhaus-filter
# cron job requires execution permission
chmod 755 /etc/cron.daily/urlhaus-filter
# Configure BIND to use the blocklist
printf '\ninclude "/usr/local/etc/bind/urlhaus-filter-bind.conf";\n' >> /etc/bind/named.conf
```
Add this to "/etc/bind/null.zone.file" (skip this step if the file already exists):
```
$TTL 86400 ; one day
@ IN SOA ns.nullzone.loc. ns.nullzone.loc. (
2017102203
28800
7200
864000
86400 )
NS ns.nullzone.loc.
A 0.0.0.0
@ IN A 0.0.0.0
* IN A 0.0.0.0
```
Zone file is derived from [here](https://github.com/tomzuu/blacklist-named/blob/master/null.zone.file).
- https://curben.gitlab.io/malware-filter/urlhaus-filter-bind.conf
<details>
<summary>Mirrors</summary>
- https://curben.gitlab.io/urlhaus-filter/urlhaus-filter-bind.conf
</details>
<br />
Lite version (online domains only):
- https://curben.gitlab.io/malware-filter/urlhaus-filter-bind-online.conf
<details>
<summary>Mirrors</summary>
- https://curben.gitlab.io/urlhaus-filter/urlhaus-filter-bind-online.conf
</details>
## Response Policy Zone
This blocklist includes domains only.
- https://curben.gitlab.io/malware-filter/urlhaus-filter-rpz.conf
<details>
<summary>Mirrors</summary>
- https://curben.gitlab.io/urlhaus-filter/urlhaus-filter-rpz.conf
</details>
<br />
Lite version (online domains only):
- https://curben.gitlab.io/malware-filter/urlhaus-filter-rpz-online.conf
<details>
<summary>Mirrors</summary>
- https://curben.gitlab.io/urlhaus-filter/urlhaus-filter-rpz-online.conf
</details>
## Unbound
This blocklist includes domains only.
### Install
```
# Create a new folder to store the blocklist
mkdir -p /usr/local/etc/unbound/
# Create a new cron job for daily update
printf '#!/bin/sh\ncurl -L "https://curben.gitlab.io/malware-filter/urlhaus-filter-unbound.conf" -o "/usr/local/etc/unbound/urlhaus-filter-unbound.conf"\n' > /etc/cron.daily/urlhaus-filter
# cron job requires execution permission
chmod 755 /etc/cron.daily/urlhaus-filter
# Configure Unbound to use the blocklist
printf '\n include: "/usr/local/etc/unbound/urlhaus-filter-unbound.conf"\n' >> /etc/unbound/unbound.conf
```
- https://curben.gitlab.io/malware-filter/urlhaus-filter-unbound.conf
<details>
<summary>Mirrors</summary>
- https://curben.gitlab.io/urlhaus-filter/urlhaus-filter-unbound.conf
</details>
<br />
Lite version (online domains only):
- https://curben.gitlab.io/malware-filter/urlhaus-filter-unbound-online.conf
<details>
<summary>Mirrors</summary>
- https://curben.gitlab.io/urlhaus-filter/urlhaus-filter-unbound-online.conf
</details>
## dnscrypt-proxy
### Install
```
# Create a new folder to store the blocklist
mkdir -p /etc/dnscrypt-proxy/
# Create a new cron job for daily update
printf '#!/bin/sh\ncurl -L "https://curben.gitlab.io/malware-filter/urlhaus-filter-dnscrypt-blocked-names.txt" -o "/etc/dnscrypt-proxy/urlhaus-filter-dnscrypt-blocked-names.txt"\n' > /etc/cron.daily/urlhaus-filter
printf '\ncurl -L "https://curben.gitlab.io/malware-filter/urlhaus-filter-dnscrypt-blocked-ips.txt" -o "/etc/dnscrypt-proxy/urlhaus-filter-dnscrypt-blocked-ips.txt"\n' >> /etc/cron.daily/urlhaus-filter
# cron job requires execution permission
chmod 755 /etc/cron.daily/urlhaus-filter
```
Configure dnscrypt-proxy to use the blocklist:
``` diff
[blocked_names]
+ blocked_names_file = '/etc/dnscrypt-proxy/urlhaus-filter-dnscrypt-blocked-names.txt'
[blocked_ips]
+ blocked_ips_file = '/etc/dnscrypt-proxy/urlhaus-filter-dnscrypt-blocked-ips.txt'
```
- https://curben.gitlab.io/malware-filter/urlhaus-filter-dnscrypt-blocked-names.txt
- https://curben.gitlab.io/malware-filter/urlhaus-filter-dnscrypt-blocked-ips.txt
<details>
<summary>Mirrors</summary>
- https://curben.gitlab.io/urlhaus-filter/urlhaus-filter-dnscrypt-blocked-names.txt
- https://curben.gitlab.io/urlhaus-filter/urlhaus-filter-dnscrypt-blocked-ips.txt
</details>
<br />
Lite version (online domains only):
- https://curben.gitlab.io/malware-filter/urlhaus-filter-dnscrypt-blocked-names-online.txt
- https://curben.gitlab.io/malware-filter/urlhaus-filter-dnscrypt-blocked-ips-online.txt
<details>
<summary>Mirrors</summary>
- https://curben.gitlab.io/urlhaus-filter/urlhaus-filter-dnscrypt-blocked-names-online.txt
- https://curben.gitlab.io/urlhaus-filter/urlhaus-filter-dnscrypt-blocked-ips-online.txt
</details>
## Tracking Protection List (IE)
This blocklist includes domains only. Supported in Internet Explorer 9+.
- https://curben.gitlab.io/malware-filter/urlhaus-filter.tpl
<details>
<summary>Mirrors</summary>
- https://curben.gitlab.io/urlhaus-filter/urlhaus-filter.tpl
</details>
<br />
Lite version (online domains only):
- https://curben.gitlab.io/malware-filter/urlhaus-filter-online.tpl
<details>
<summary>Mirrors</summary>
- https://curben.gitlab.io/urlhaus-filter/urlhaus-filter-online.tpl
</details>
## Snort2
This ruleset includes online URLs only. Not compatible with [Snort3](#snort3).
### Install
```
# Download ruleset
curl -L "https://curben.gitlab.io/malware-filter/urlhaus-filter-snort2-online.rules" -o "/etc/snort/rules/urlhaus-filter-snort2-online.rules"
# Create a new cron job for daily update
printf '#!/bin/sh\ncurl -L "https://curben.gitlab.io/malware-filter/urlhaus-filter-snort2-online.rules" -o "/etc/snort/rules/urlhaus-filter-snort2-online.rules"\n' > /etc/cron.daily/urlhaus-filter
# cron job requires execution permission
chmod 755 /etc/cron.daily/urlhaus-filter
# Configure Snort to use the ruleset
printf "\ninclude \$RULE_PATH/urlhaus-filter-snort2-online.rules\n" >> /etc/snort/snort.conf
```
- https://curben.gitlab.io/malware-filter/urlhaus-filter-snort2-online.rules
<details>
<summary>Mirrors</summary>
- https://curben.gitlab.io/urlhaus-filter/urlhaus-filter-snort2-online.rules
</details>
## Snort3
This ruleset includes online URLs only. Not compatible with [Snort2](#snort2).
### Install
```
# Download ruleset
curl -L "https://curben.gitlab.io/malware-filter/urlhaus-filter-snort3-online.rules" -o "/etc/snort/rules/urlhaus-filter-snort3-online.rules"
# Create a new cron job for daily update
printf '#!/bin/sh\ncurl -L "https://curben.gitlab.io/malware-filter/urlhaus-filter-snort3-online.rules" -o "/etc/snort/rules/urlhaus-filter-snort3-online.rules"\n' > /etc/cron.daily/urlhaus-filter
# cron job requires execution permission
chmod 755 /etc/cron.daily/urlhaus-filter
```
Configure Snort to use the ruleset:
``` diff
# /etc/snort/snort.lua
ips =
{
variables = default_variables,
+ include = 'rules/urlhaus-filter-snort3-online.rules'
}
```
- https://curben.gitlab.io/malware-filter/urlhaus-filter-snort3-online.rules
<details>
<summary>Mirrors</summary>
- https://curben.gitlab.io/urlhaus-filter/urlhaus-filter-snort3-online.rules
</details>
## Suricata
This ruleset includes online URLs only.
### Install
```
# Download ruleset
curl -L "https://curben.gitlab.io/malware-filter/urlhaus-filter-suricata-online.rules" -o "/etc/suricata/rules/urlhaus-filter-suricata-online.rules"
# Create a new cron job for daily update
printf '#!/bin/sh\ncurl -L "https://curben.gitlab.io/malware-filter/urlhaus-filter-suricata-online.rules" -o "/etc/suricata/rules/urlhaus-filter-suricata-online.rules"\n' > /etc/cron.daily/urlhaus-filter
# cron job requires execution permission
chmod 755 /etc/cron.daily/urlhaus-filter
```
Configure Suricata to use the ruleset:
``` diff
# /etc/suricata/suricata.yaml
rule-files:
- local.rules
+ - urlhaus-filter-suricata-online.rules
```
- https://curben.gitlab.io/malware-filter/urlhaus-filter-suricata-online.rules
<details>
<summary>Mirrors</summary>
- https://curben.gitlab.io/urlhaus-filter/urlhaus-filter-suricata-online.rules
</details>
## Third-party mirrors
<details>
<summary>iosprivacy/urlhaus-filter-mirror</summary>
TBC
</details>
## Issues
This blocklist operates by blocking the **whole** website instead of specific webpages; exceptions are made for popular websites (e.g. `https://docs.google.com/`), for which individual webpages are listed instead (e.g. `https://docs.google.com/malware-page`). Malicious webpages are only listed in the [URL-based](#url-based) filter; popular websites are excluded from the other filters.
*Popular* websites are those listed in the [Umbrella Popularity List](https://s3-us-west-1.amazonaws.com/umbrella-static/index.html) (top 1M domains + subdomains), the [Tranco List](https://tranco-list.eu/) (top 1M domains) and this [custom list](src/exclude.txt).
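The popularity check described above can be sketched with `grep -Fx` (fixed-string, whole-line match) on made-up data:

```shell
# Hypothetical extracted domains and popularity list
printf 'docs.google.com\nexample-malware.top\n' > domains.txt
printf 'docs.google.com\ngoogle.com\n' > top-1m.txt

# Whole-line fixed-string match: keeps only the popular domains
grep -Fx -f top-1m.txt domains.txt
# docs.google.com
```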
If you wish to exclude certain website(s) that you believe are sufficiently well-known, please create an [issue](https://gitlab.com/curben/urlhaus-filter/issues) or [merge request](https://gitlab.com/curben/urlhaus-filter/merge_requests). If the website is quite obscure but you still want to visit it, add the line `||legitsite.com^$badfilter` to the "My filters" tab of uBO; use a subdomain if relevant, e.g. `||sub.legitsite.com^$badfilter`.
This filter **only** accepts new malware URLs from [URLhaus](https://urlhaus.abuse.ch/).
Please report new malware URLs to the upstream maintainer through https://urlhaus.abuse.ch/api/#submit.
## Cloning
Since the filter is updated frequently, cloning the repo becomes slower over time as revisions accumulate.
Use a shallow clone to fetch the recent revisions only; the last five revisions are sufficient for a valid MR.
`git clone --depth 5 https://gitlab.com/curben/urlhaus-filter.git`
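If a shallow clone later turns out to need more history (e.g. for an interactive rebase), it can be deepened in place rather than re-cloned. A self-contained demo with a throwaway local repo (the depth values are arbitrary examples):

```shell
# Create a tiny repo with three commits, shallow-clone it, then deepen
git init -q src
git -C src -c user.email=you@example.com -c user.name=you commit -q --allow-empty -m one
git -C src -c user.email=you@example.com -c user.name=you commit -q --allow-empty -m two
git -C src -c user.email=you@example.com -c user.name=you commit -q --allow-empty -m three

git clone -q --depth 1 "file://$PWD/src" shallow
git -C shallow rev-list --count HEAD   # 1 commit visible
git -C shallow fetch -q --unshallow    # fetch the rest of the history
git -C shallow rev-list --count HEAD   # all 3 commits
```

`git fetch --deepen=<n>` fetches `n` more commits instead of the full history.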
## FAQ
See [FAQ](https://gitlab.com/curben/urlhaus-filter/wikis/faq).
## License
[Creative Commons Zero v1.0 Universal](LICENSE.md)
[badge.sh](src/badge.sh) & [.gitlab/](.gitlab/) contain badges that are licensed by [Shields.io](https://shields.io) under [CC0 1.0](LICENSE.md)
[URLhaus](https://urlhaus.abuse.ch/): [CC0](https://creativecommons.org/publicdomain/zero/1.0/)
[Tranco List](https://tranco-list.eu/): [MIT License](https://choosealicense.com/licenses/mit/)
[Umbrella Popularity List](https://s3-us-west-1.amazonaws.com/umbrella-static/index.html): Available free of charge by Cisco Umbrella
This repository is not endorsed by Abuse.ch.

14
package.json Normal file

@@ -0,0 +1,14 @@
{
"name": "urlhaus-filter",
"private": true,
"scripts": {
"build": "node src/build.js"
},
"dependencies": {
"extract-zip": "^2.0.1",
"got": "^11.8.3"
},
"engines": {
"node": ">= 14.15.0"
}
}

30
src/build.js Normal file

@@ -0,0 +1,30 @@
'use strict'
// for deployment outside of GitLab CI, e.g. Cloudflare Pages and Netlify
const { stream: gotStream } = require('got')
const unzip = require('extract-zip')
const { join } = require('path')
const { mkdir } = require('fs/promises')
const { createWriteStream } = require('fs')
const { pipeline } = require('stream/promises')
const rootPath = join(__dirname, '..')
const tmpPath = join(rootPath, 'tmp')
const zipPath = join(tmpPath, 'artifacts.zip')
const artifactsUrl = 'https://gitlab.com/curben/urlhaus-filter/-/jobs/artifacts/main/download?job=pages'
const f = async () => {
await mkdir(tmpPath, { recursive: true })
console.log(`Downloading artifacts.zip from "${artifactsUrl}"`)
await pipeline(
gotStream(artifactsUrl),
createWriteStream(zipPath)
)
console.log('Extracting artifacts.zip...')
await unzip(zipPath, { dir: rootPath })
}
f()

77
src/exclude.txt Normal file

@@ -0,0 +1,77 @@
# Exclusion list
# malicious links are still included in "urlhaus-filter.txt" & "urlhaus-filter-online.txt"
shellter-static.s3.amazonaws.com
void.cat
lists.infradead.org
cdimage.debian.org
pdesaa.cimaa.pt
shengen.ru
cdn-frm-eu.wargaming.net
hampsteadclinic.co.uk
users.skynet.be
users.telenet.be
u.teknik.io
digitalschnitt.de
debonosu.jp
cd.textfiles.com
a.uguu.se
s3.us-east-2.amazonaws.com
s3.us-east-1.amazonaws.com
s3.amazonaws.com
s3.us-west-1.amazonaws.com
s3.us-west-2.amazonaws.com
s3.af-south-1.amazonaws.com
s3.ap-east-1.amazonaws.com
s3.ap-south-1.amazonaws.com
s3.ap-northeast-3.amazonaws.com
s3.ap-northeast-2.amazonaws.com
s3.ap-southeast-1.amazonaws.com
s3.ap-southeast-2.amazonaws.com
s3.ap-northeast-1.amazonaws.com
s3.ca-central-1.amazonaws.com
s3.cn-north-1.amazonaws.com.cn
s3.cn-northwest-1.amazonaws.com.cn
s3.eu-central-1.amazonaws.com
s3.eu-west-1.amazonaws.com
s3.eu-west-2.amazonaws.com
s3.eu-south-1.amazonaws.com
s3.eu-west-3.amazonaws.com
s3.eu-north-1.amazonaws.com
s3.sa-east-1.amazonaws.com
s3.me-south-1.amazonaws.com
s3.us-gov-east-1.amazonaws.com
s3.us-gov-west-1.amazonaws.com
s3-us-east-2.amazonaws.com
s3-us-east-1.amazonaws.com
s3-us-west-1.amazonaws.com
s3-us-west-2.amazonaws.com
s3-af-south-1.amazonaws.com
s3-ap-east-1.amazonaws.com
s3-ap-south-1.amazonaws.com
s3-ap-northeast-3.amazonaws.com
s3-ap-northeast-2.amazonaws.com
s3-ap-southeast-1.amazonaws.com
s3-ap-southeast-2.amazonaws.com
s3-ap-northeast-1.amazonaws.com
s3-ca-central-1.amazonaws.com
s3-cn-north-1.amazonaws.com.cn
s3-cn-northwest-1.amazonaws.com.cn
s3-eu-central-1.amazonaws.com
s3-eu-west-1.amazonaws.com
s3-eu-west-2.amazonaws.com
s3-eu-south-1.amazonaws.com
s3-eu-west-3.amazonaws.com
s3-eu-north-1.amazonaws.com
s3-sa-east-1.amazonaws.com
s3-me-south-1.amazonaws.com
s3-us-gov-east-1.amazonaws.com
s3-us-gov-west-1.amazonaws.com
srv-store4.gofile.io
srv-store6.gofile.io
srv-file9.gofile.io
blog.theodo.com
ronnietucker.co.uk
cdn.tmooc.cn
104.244.72.54
dl.packetstormsecurity.net
screenshield.us

407
src/script.sh Normal file

@@ -0,0 +1,407 @@
#!/bin/sh
set -efux -o pipefail
## Create a temporary working folder
mkdir -p "tmp/"
cd "tmp/"
## Prepare datasets
curl -L "https://urlhaus.abuse.ch/downloads/csv/" -o "urlhaus.zip"
curl -L "https://s3-us-west-1.amazonaws.com/umbrella-static/top-1m.csv.zip" -o "top-1m-umbrella.zip"
curl -L "https://tranco-list.eu/top-1m.csv.zip" -o "top-1m-tranco.zip"
cp -f "../src/exclude.txt" "."
## Prepare URLhaus.csv
unzip -p "urlhaus.zip" | \
# Convert DOS to Unix line ending
dos2unix | \
tr "[:upper:]" "[:lower:]" | \
# Remove comment
sed "/^#/d" > "URLhaus.csv"
## Parse URLs
cat "URLhaus.csv" | \
cut -f 6 -d '"' | \
cut -f 3- -d "/" | \
# Domain must have at least a 'dot'
grep -F "." | \
# Remove invalid protocol, see #32
sed -E "s/^(ttps:\/\/|https:\/|http\/)//g" | \
# Remove www.
sed "s/^www\.//g" | \
sort -u > "urlhaus.txt"
## Parse domain and IP address only
cat "urlhaus.txt" | \
cut -f 1 -d "/" | \
cut -f 1 -d ":" | \
# Remove invalid domains, see #15
grep -vF "??" | \
cut -f 1 -d "?" | \
sort -u > "urlhaus-domains.txt"
## Parse online URLs only
cat "URLhaus.csv" | \
grep '"online"' | \
cut -f 6 -d '"' | \
cut -f 3- -d "/" | \
sed "s/^www\.//g" | \
sort -u > "urlhaus-online.txt"
cat "urlhaus-online.txt" | \
cut -f 1 -d "/" | \
cut -f 1 -d ":" | \
grep -vF "??" | \
cut -f 1 -d "?" | \
sort -u > "urlhaus-domains-online.txt"
## Parse the Umbrella 1 Million
unzip -p "top-1m-umbrella.zip" | \
dos2unix | \
tr "[:upper:]" "[:lower:]" | \
# Parse domains only
cut -f 2 -d "," | \
grep -F "." | \
# Remove www.
sed "s/^www\.//g" | \
sort -u > "top-1m-umbrella.txt"
## Parse the Tranco 1 Million
unzip -p "top-1m-tranco.zip" | \
dos2unix | \
tr "[:upper:]" "[:lower:]" | \
# Parse domains only
cut -f 2 -d "," | \
grep -F "." | \
# Remove www.
sed "s/^www\.//g" | \
sort -u > "top-1m-tranco.txt"
# Merge Umbrella and self-maintained top domains
cat "top-1m-umbrella.txt" "top-1m-tranco.txt" "exclude.txt" | \
sort -u > "top-1m-well-known.txt"
## Parse popular domains from URLhaus
cat "urlhaus-domains.txt" | \
# grep match whole line
grep -Fx -f "top-1m-well-known.txt" > "urlhaus-top-domains.txt"
## Parse domains from URLhaus excluding popular domains
cat "urlhaus-domains.txt" | \
grep -F -vf "urlhaus-top-domains.txt" | \
# Remove blank lines
sed "/^$/d" > "malware-domains.txt"
cat "urlhaus-domains-online.txt" | \
grep -F -vf "urlhaus-top-domains.txt" | \
sed "/^$/d" > "malware-domains-online.txt"
## Parse malware URLs from popular domains
cat "urlhaus.txt" | \
grep -F -f "urlhaus-top-domains.txt" | \
sed "s/^/||/g" | \
sed "s/$/\$all/g" > "malware-url-top-domains.txt"
cat "urlhaus-online.txt" | \
grep -F -f "urlhaus-top-domains.txt" | \
sed "s/^/||/g" | \
sed "s/$/\$all/g" > "malware-url-top-domains-online.txt"
cat "urlhaus-online.txt" | \
grep -F -f "urlhaus-top-domains.txt" > "malware-url-top-domains-raw-online.txt"
## Merge malware domains and URLs
CURRENT_TIME="$(date -R -u)"
FIRST_LINE="! Title: Malicious URL Blocklist"
SECOND_LINE="! Updated: $CURRENT_TIME"
THIRD_LINE="! Expires: 1 day (update frequency)"
FOURTH_LINE="! Homepage: https://gitlab.com/curben/urlhaus-filter"
FIFTH_LINE="! License: https://gitlab.com/curben/urlhaus-filter#license"
SIXTH_LINE="! Source: https://urlhaus.abuse.ch/api/"
ANNOUNCEMENT_1="\n! 2022/01/08: There has been a major change to the mirrors, check the repo for the new mirrors."
ANNOUNCEMENT_2="! Old mirrors will be deprecated in 3 months. The main download link \"curben.gitlab.io/malware-filter/\" _is not affected_."
COMMENT_ABP="$FIRST_LINE\n$SECOND_LINE\n$THIRD_LINE\n$FOURTH_LINE\n$FIFTH_LINE\n$SIXTH_LINE\n$ANNOUNCEMENT_1\n$ANNOUNCEMENT_2"
mkdir -p "../public/"
cat "malware-domains.txt" "malware-url-top-domains.txt" | \
sort | \
sed '1 i\'"$COMMENT_ABP"'' > "../public/urlhaus-filter.txt"
cat "malware-domains-online.txt" "malware-url-top-domains-online.txt" | \
sort | \
sed '1 i\'"$COMMENT_ABP"'' | \
sed "1s/Malicious/Online Malicious/" > "../public/urlhaus-filter-online.txt"
# Adguard Home (#19, #22)
cat "malware-domains.txt" | \
sed "s/^/||/g" | \
sed "s/$/^/g" > "malware-domains-adguard-home.txt"
cat "malware-domains-online.txt" | \
sed "s/^/||/g" | \
sed "s/$/^/g" > "malware-domains-online-adguard-home.txt"
cat "malware-domains-adguard-home.txt" | \
sort | \
sed '1 i\'"$COMMENT_ABP"'' | \
sed "1s/Blocklist/Blocklist (AdGuard Home)/" > "../public/urlhaus-filter-agh.txt"
cat "malware-domains-online-adguard-home.txt" | \
sort | \
sed '1 i\'"$COMMENT_ABP"'' | \
sed "1s/Malicious/Online Malicious/" | \
sed "1s/Blocklist/Blocklist (AdGuard Home)/" > "../public/urlhaus-filter-agh-online.txt"
# Adguard browser extension
cat "malware-domains.txt" | \
sed "s/^/||/g" | \
sed "s/$/\$all/g" > "malware-domains-adguard.txt"
cat "malware-domains-online.txt" | \
sed "s/^/||/g" | \
sed "s/$/\$all/g" > "malware-domains-online-adguard.txt"
cat "malware-domains-adguard.txt" "malware-url-top-domains.txt" | \
sort | \
sed '1 i\'"$COMMENT_ABP"'' | \
sed "1s/Blocklist/Blocklist (AdGuard)/" > "../public/urlhaus-filter-ag.txt"
cat "malware-domains-online-adguard.txt" "malware-url-top-domains-online.txt" | \
sort | \
sed '1 i\'"$COMMENT_ABP"'' | \
sed "1s/Malicious/Online Malicious/" | \
sed "1s/Blocklist/Blocklist (AdGuard)/" > "../public/urlhaus-filter-ag-online.txt"
# Vivaldi
cat "malware-domains.txt" | \
sed "s/^/||/g" | \
sed "s/$/\$document/g" > "malware-domains-vivaldi.txt"
cat "malware-domains-online.txt" | \
sed "s/^/||/g" | \
sed "s/$/\$document/g" > "malware-domains-online-vivaldi.txt"
cat "malware-domains-vivaldi.txt" "malware-url-top-domains.txt" | \
sed "s/\$all$/\$document/g" | \
sort | \
sed '1 i\'"$COMMENT_ABP"'' | \
sed "1s/Blocklist/Blocklist (Vivaldi)/" > "../public/urlhaus-filter-vivaldi.txt"
cat "malware-domains-online-vivaldi.txt" "malware-url-top-domains-online.txt" | \
sed "s/\$all$/\$document/g" | \
sort | \
sed '1 i\'"$COMMENT_ABP"'' | \
sed "1s/Malicious/Online Malicious/" | \
sed "1s/Blocklist/Blocklist (Vivaldi)/" > "../public/urlhaus-filter-vivaldi-online.txt"
## Domains-only blocklist
# awk + head is a workaround for sed prepend
COMMENT=$(printf "$COMMENT_ABP" | sed "s/^!/#/g" | sed "1s/URL/Domains/" | awk '{printf "%s\\n", $0}' | head -c -2)
COMMENT_ONLINE=$(printf "$COMMENT" | sed "1s/Malicious/Online Malicious/" | awk '{printf "%s\\n", $0}' | head -c -2)
cat "malware-domains.txt" | \
sort | \
sed '1 i\'"$COMMENT"'' > "../public/urlhaus-filter-domains.txt"
cat "malware-domains-online.txt" | \
sort | \
sed '1 i\'"$COMMENT_ONLINE"'' > "../public/urlhaus-filter-domains-online.txt"
## Hosts only
cat "malware-domains.txt" | \
sort | \
# Remove IPv4 address
grep -vE "^([0-9]{1,3}[\.]){3}[0-9]{1,3}$" > "malware-hosts.txt"
cat "malware-domains-online.txt" | \
sort | \
# Remove IPv4 address
grep -vE "^([0-9]{1,3}[\.]){3}[0-9]{1,3}$" > "malware-hosts-online.txt"
## Hosts file blocklist
cat "malware-hosts.txt" | \
sed "s/^/0.0.0.0 /g" | \
# Re-insert comment
sed '1 i\'"$COMMENT"'' | \
sed "1s/Domains/Hosts/" > "../public/urlhaus-filter-hosts.txt"
cat "malware-hosts-online.txt" | \
sed "s/^/0.0.0.0 /g" | \
sed '1 i\'"$COMMENT_ONLINE"'' | \
sed "1s/Domains/Hosts/" > "../public/urlhaus-filter-hosts-online.txt"
## Dnsmasq-compatible blocklist
cat "malware-hosts.txt" | \
sed "s/^/address=\//g" | \
sed "s/$/\/0.0.0.0/g" | \
sed '1 i\'"$COMMENT"'' | \
sed "1s/Blocklist/dnsmasq Blocklist/" > "../public/urlhaus-filter-dnsmasq.conf"
cat "malware-hosts-online.txt" | \
sed "s/^/address=\//g" | \
sed "s/$/\/0.0.0.0/g" | \
sed '1 i\'"$COMMENT_ONLINE"'' | \
sed "1s/Blocklist/dnsmasq Blocklist/" > "../public/urlhaus-filter-dnsmasq-online.conf"
## BIND-compatible blocklist
cat "malware-hosts.txt" | \
sed 's/^/zone "/g' | \
sed 's/$/" { type master; notify no; file "null.zone.file"; };/g' | \
sed '1 i\'"$COMMENT"'' | \
sed "1s/Blocklist/BIND Blocklist/" > "../public/urlhaus-filter-bind.conf"
cat "malware-hosts-online.txt" | \
sed 's/^/zone "/g' | \
sed 's/$/" { type master; notify no; file "null.zone.file"; };/g' | \
sed '1 i\'"$COMMENT_ONLINE"'' | \
sed "1s/Blocklist/BIND Blocklist/" > "../public/urlhaus-filter-bind-online.conf"
## DNS Response Policy Zone (RPZ)
CURRENT_UNIX_TIME="$(date +%s)"
# SOA fields: serial (unix timestamp), refresh, retry, expire, minimum TTL
RPZ_SYNTAX="\n\$TTL 30\n@ IN SOA rpz.curben.gitlab.io. hostmaster.rpz.curben.gitlab.io. $CURRENT_UNIX_TIME 86400 3600 604800 30\n NS localhost.\n"
cat "malware-hosts.txt" | \
sed "s/$/ CNAME ./g" | \
sed '1 i\'"$RPZ_SYNTAX"'' | \
sed '1 i\'"$COMMENT"'' | \
sed "s/^#/;/g" | \
sed "1s/Blocklist/RPZ Blocklist/" > "../public/urlhaus-filter-rpz.conf"
cat "malware-hosts-online.txt" | \
sed "s/$/ CNAME ./g" | \
sed '1 i\'"$RPZ_SYNTAX"'' | \
sed '1 i\'"$COMMENT_ONLINE"'' | \
sed "s/^#/;/g" | \
sed "1s/Blocklist/RPZ Blocklist/" > "../public/urlhaus-filter-rpz-online.conf"
## Unbound-compatible blocklist
cat "malware-hosts.txt" | \
sed 's/^/local-zone: "/g' | \
sed 's/$/" always_nxdomain/g' | \
sed '1 i\'"$COMMENT"'' | \
sed "1s/Blocklist/Unbound Blocklist/" > "../public/urlhaus-filter-unbound.conf"
cat "malware-hosts-online.txt" | \
sed 's/^/local-zone: "/g' | \
sed 's/$/" always_nxdomain/g' | \
sed '1 i\'"$COMMENT_ONLINE"'' | \
sed "1s/Blocklist/Unbound Blocklist/" > "../public/urlhaus-filter-unbound-online.conf"
## dnscrypt-proxy blocklists
# name-based
cat "malware-hosts.txt" | \
sed '1 i\'"$COMMENT"'' | \
sed "1s/Domains/Names/" > "../public/urlhaus-filter-dnscrypt-blocked-names.txt"
cat "malware-hosts-online.txt" | \
sed '1 i\'"$COMMENT_ONLINE"'' | \
sed "1s/Domains/Names/" > "../public/urlhaus-filter-dnscrypt-blocked-names-online.txt"
# IPv4-based
cat "malware-domains.txt" | \
sort | \
grep -E "^([0-9]{1,3}[\.]){3}[0-9]{1,3}$" | \
sed '1 i\'"$COMMENT"'' | \
sed "1s/Domains/IPs/" > "../public/urlhaus-filter-dnscrypt-blocked-ips.txt"
cat "malware-domains-online.txt" | \
sort | \
grep -E "^([0-9]{1,3}[\.]){3}[0-9]{1,3}$" | \
sed '1 i\'"$COMMENT_ONLINE"'' | \
sed "1s/Domains/IPs/" > "../public/urlhaus-filter-dnscrypt-blocked-ips-online.txt"
## Temporarily disable command print
set +x
# Snort & Suricata
rm -f "../public/urlhaus-filter-snort2-online.rules" \
"../public/urlhaus-filter-snort3-online.rules" \
"../public/urlhaus-filter-suricata-online.rules"
SID="100000001"
# Generate one rule per online malicious domain
while read DOMAIN; do
SN_RULE="alert tcp \$HOME_NET any -> \$EXTERNAL_NET [80,443] (msg:\"urlhaus-filter malicious website detected\"; flow:established,from_client; content:\"GET\"; http_method; content:\"$DOMAIN\"; content:\"Host\"; http_header; classtype:trojan-activity; sid:$SID; rev:1;)"
SN3_RULE="alert http \$HOME_NET any -> \$EXTERNAL_NET any (msg:\"urlhaus-filter malicious website detected\"; http_header:field host; content:\"$DOMAIN\",nocase; classtype:trojan-activity; sid:$SID; rev:1;)"
SR_RULE="alert http \$HOME_NET any -> \$EXTERNAL_NET any (msg:\"urlhaus-filter malicious website detected\"; flow:established,from_client; http.method; content:\"GET\"; http.host; content:\"$DOMAIN\"; classtype:trojan-activity; sid:$SID; rev:1;)"
echo "$SN_RULE" >> "../public/urlhaus-filter-snort2-online.rules"
echo "$SN3_RULE" >> "../public/urlhaus-filter-snort3-online.rules"
echo "$SR_RULE" >> "../public/urlhaus-filter-suricata-online.rules"
SID=$(( $SID + 1 ))
done < "malware-domains-online.txt"
# Generate one rule per malicious URL hosted on a popular domain
while read URL; do
HOST=$(echo "$URL" | cut -d"/" -f1)
# Strip the hostname and escape ";" (reserved in rule syntax)
URI=$(echo "$URL" | sed -e "s/^$HOST//" -e "s/;/\\\;/g")
# Snort2 only supports <=2047 characters of `content`
SN_RULE="alert tcp \$HOME_NET any -> \$EXTERNAL_NET [80,443] (msg:\"urlhaus-filter malicious website detected\"; flow:established,from_client; content:\"GET\"; http_method; content:\"$(echo $URI | cut -c -2047)\"; http_uri; nocase; content:\"$HOST\"; content:\"Host\"; http_header; classtype:trojan-activity; sid:$SID; rev:1;)"
SN3_RULE="alert http \$HOME_NET any -> \$EXTERNAL_NET any (msg:\"urlhaus-filter malicious website detected\"; http_header:field host; content:\"$HOST\",nocase; http_uri; content:\"$URI\",nocase; classtype:trojan-activity; sid:$SID; rev:1;)"
SR_RULE="alert http \$HOME_NET any -> \$EXTERNAL_NET any (msg:\"urlhaus-filter malicious website detected\"; flow:established,from_client; http.method; content:\"GET\"; http.uri; content:\"$URI\"; endswith; nocase; http.host; content:\"$HOST\"; classtype:trojan-activity; sid:$SID; rev:1;)"
echo "$SN_RULE" >> "../public/urlhaus-filter-snort2-online.rules"
echo "$SN3_RULE" >> "../public/urlhaus-filter-snort3-online.rules"
echo "$SR_RULE" >> "../public/urlhaus-filter-suricata-online.rules"
SID=$(( $SID + 1 ))
done < "malware-url-top-domains-raw-online.txt"
## Re-enable command print
set -x
sed -i '1 i\'"$COMMENT_ONLINE"'' "../public/urlhaus-filter-snort2-online.rules"
sed -i "1s/Domains Blocklist/URL Snort2 Ruleset/" "../public/urlhaus-filter-snort2-online.rules"
sed -i '1 i\'"$COMMENT_ONLINE"'' "../public/urlhaus-filter-snort3-online.rules"
sed -i "1s/Domains Blocklist/URL Snort3 Ruleset/" "../public/urlhaus-filter-snort3-online.rules"
sed -i '1 i\'"$COMMENT_ONLINE"'' "../public/urlhaus-filter-suricata-online.rules"
sed -i "1s/Domains Blocklist/URL Suricata Ruleset/" "../public/urlhaus-filter-suricata-online.rules"
## IE blocklist
COMMENT_IE="msFilterList\n$COMMENT\n: Expires=1\n#"
COMMENT_ONLINE_IE="msFilterList\n$COMMENT_ONLINE\n: Expires=1\n#"
cat "malware-hosts.txt" | \
sed "s/^/-d /g" | \
sed '1 i\'"$COMMENT_IE"'' | \
sed "2s/Domains Blocklist/Hosts Blocklist (IE)/" > "../public/urlhaus-filter.tpl"
cat "malware-hosts-online.txt" | \
sed "s/^/-d /g" | \
sed '1 i\'"$COMMENT_ONLINE_IE"'' | \
sed "2s/Domains Blocklist/Hosts Blocklist (IE)/" > "../public/urlhaus-filter-online.tpl"
## Clean up artifacts
rm -f "URLhaus.csv" "top-1m-umbrella.zip" "top-1m-umbrella.txt" "top-1m-tranco.txt"
cd ../