NotaInutilis / Super-SEO-Spam-Suppressor
An anticapitalist blocklist targeting websites that abuse SEO tactics to spam web searches with data pollution and security risks: content farms, scrapers, copycats, generative AIs, scams, advertisements, malware, and useless wasteful garbage in general. It is best used with uBlacklist.
# Super SEO Spam Suppressor (SSSS[^SSSS])

An anticapitalist blocklist targeting websites that abuse SEO tactics to spam web searches with data pollution and security risks: junk news, content farms, scams, impersonations, fads and bubbles (Web3 or genAI), and all other kinds of useless wasteful garbage. It is best used with uBlacklist.

[^SSSS]: It's a Gridman reference. I'm spelling it out because it's also the name of a skin disease: don't go looking for SSSS on image search.

> Our website is now optimized to supply content to Google, not build an audience of its own.
>
> Mia Sato, "The Perfect Webpage", *The Verge*.

Since 2019, Google's search functions have been steadily destroyed: it is now scientifically proven that the biggest search engine on the internet has become a barely usable, terminally enshittified mess, a mere husk of the wonderful discovery tool it once was. Do you want to learn about *thing*? How about **buying** *thing* and **consuming** *thing* instead? Google's drive to commercialize our every online interaction also affects other search engines, whose indexers crawl through content optimized for Google's terrible algorithm. On top of that, the latest trend of so-called "artificial intelligence" generative models produces even more garbage at an ever-growing pace.

This list is, like any good adblocking tool, an attempt to escape this never-ending capitalist coercion and attention theft. All of the tech giants play this game, so consider also using a social media blocklist.

This blocklist is left in the public domain (Do What The Fuck You Want To Public License).
| Formats | Headers | Domains | TLDs | URLs | Pages | Expressions | Regular expressions |
|---|---|---|---|---|---|---|---|
| uBlacklist | YAML frontmatter | ✓ Match patterns | ✓ Match patterns | ✓ Match patterns | ✓ Match patterns | ✓ See documentation | ✓ See documentation |
| AdBlock Plus | Special comments | ✓ Document option | ✓ Document option | ✓ Document option | ✓ Document option | | ✓ See documentation |
| Hosts | Commented | ✓ | | | | | |
| Dnsmasq | Commented | ✓ | ✓ | | | | |
| pdnsd | Commented | ✓ | ✓ | | | | |
| Unbound | Commented | ✓ | ✓ | | | | |
| Mastodon | CSV | ✓ Fediverse only | | | | | |
| Fediblockhole | CSV | ✓ Fediverse only | | | | | |
| Domains | Commented | ✓ | | | | | |

## Browser extensions

### uBlacklist syntax

Blocklist in uBlacklist format to use with uBlacklist. It filters blocked sites from results on several search engines. Click here to subscribe. Note that subscription links need to be enabled for this to work.

### AdBlock Plus syntax

Blocklist in AdBlock format to use with an adblocker (uBlock Origin, Adguard…) or Adguard Home. It uses a strict blocking rule to block access to those sites in your browser. Click here to subscribe.
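For illustration (with `example.com` standing in for a blocked site, not an entry from the actual list), blocking the same domain looks like this as a uBlacklist match pattern and as an AdBlock strict-blocking rule with the document option, respectively:

```
*://*.example.com/*
||example.com^$document
```

The uBlacklist pattern hides matching results on supported search engines, while the AdBlock rule blocks navigation to the site itself.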
## DNS blockers

### Hosts format

Blocklist in Hosts format to use in a hosts file or Pi-hole. An IPv6 version is also available. **Known issue:** Firefox's DNS over HTTPS option bypasses the computer's hosts file ruleset.

### Dnsmasq format

Blocklist in Dnsmasq format to use with the Dnsmasq DNS server software.

### pdnsd format

Blocklist in pdnsd format to use with the pdnsd caching DNS proxy server software.

### Unbound format

Blocklist in Unbound configuration format to use with the Unbound validating, recursive, caching DNS resolver.

## Fediverse formats

### Mastodon

Blocklist in Mastodon format to use with Mastodon and other federated services. It will defederate from blocked instances.

### Fediblockhole

Blocklist in FediBlockHole format to use with the FediBlockHole tool for Mastodon. It will defederate from blocked instances.

## How to contribute

Clone this repository and add one domain per line in files stored in the folder. Blocked sites are organized using subfolders and files within the folder. Use comments ( ) and markdown files ( ) to add more information and references.

> For the website, add on a new line of the file.

You can paste the full URL: the update script will clean it and make it a domain. As the hosts format does not automatically block subdomains (e.g. ), they have to be explicitly added to the list to maintain compatibility. It is possible to add TLDs (e.g. , without the dot) to the list: they will be blocked by Dnsmasq, adblockers and uBlacklist. Domains related to Fediverse instances (Mastodon, Peertube, etc.) should be put in files with in their names (e.g. ) so that they are included in the Fediverse blocklists.

Then, when you push your changes to the folder, GitHub Actions automatically generates new versions of the blocklists. Should you want to generate them yourself, you can run the script (prerequisites: bash, python). Finally, make a pull request: it will be reviewed and approved within a few days.
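The update script itself isn't reproduced here, but as a rough sketch of the kind of cleaning described above (a pasted URL reduced to a bare domain), assuming only POSIX shell and `sed`:

```shell
#!/bin/sh
# Hypothetical illustration (not the repository's actual script):
# reduce a pasted URL to a bare domain, as the update script is
# described as doing for contributed lines.
url='https://spam.example.com/some/page?utm_source=seo'

# Strip the scheme, then everything from the first "/" or "?" onward.
domain=$(printf '%s\n' "$url" | sed -E 's#^[a-zA-Z]+://##; s#[/?].*$##')

echo "$domain"   # spam.example.com
```

Running it on the sample URL above leaves just `spam.example.com`, which is the form the blocklist files expect (one domain per line).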
## Importing an external list

External lists can be imported by adding them as a new line in the following format: . They are automatically downloaded twice a day, cleaned (some formats only), copied to the folder and thus added to the list-generation database. The domain list in the file serves as an exception ruleset for imported lists.

## How to contribute (easy mode)

If you have no idea how Git works, you can still contribute! Just open an issue with the URLs you would like to add to the list…