rasulkireev / TuxSEO
Automated Blog Content Creation for Founders Who Hate Writing
AI Architecture Analysis
This repository is indexed by RepoMind. By analyzing rasulkireev/TuxSEO in our AI interface, you can instantly generate complete architecture diagrams, visualize control flows, and perform automated security audits across the entire codebase.
Our Agentic Context Augmented Generation (Agentic CAG) engine loads full source files into context on-demand, avoiding the fragmentation of traditional RAG systems. Ask questions about the architecture, dependencies, or specific features to see it in action.
Repository Overview (README excerpt)
Overview

• TuxSEO learns about your business and analyzes the market, which lets you...
• Generate content ideas for your business blog to drive traffic from searches.
• Stop wasting time and money on research and writing; let TuxSEO do it for you.
• TuxSEO is open-source and self-hostable. Always.
• Run it privately on your computer or try it on our cloud app.

TOC

• Overview
• TOC
• Deployment
  • Render
  • Docker Compose
  • Pure Python / Django deployment
  • Custom Deployment on Caprover
• Local Development
• Star History

Deployment

Render

The only required env vars are:

• OPENAI_API_KEY
• TAVILY_API_KEY
• GEMINI_API_KEY
• PERPLEXITY_API_KEY
• JINA_READER_API_KEY
• KEYWORDS_EVERYWHERE_API_KEY

The rest are optional.

**Note:** This should work out of the box with Render's free tier if you provide the AI API keys. Here's what you need to know about the limitations:

• **Worker Service Limitation**: The worker service is not a dedicated worker type (those are only available on paid plans). For the free tier, I had to use a web service through a small hack, but it works fine for most use cases.
• **Memory Constraints**: The free web service has a 512 MB RAM limit, which can cause issues with **automated background tasks only**. When you add a project, it runs a suite of background tasks to analyze your website and generate articles, keywords, and other content. These automated processes can hit memory limits and potentially cause failures.
• **Manual Tasks Work Fine**: If you perform tasks manually (like generating a single article), these typically use the web service instead of the worker and should work reliably, since it's one request at a time.
• **Upgrade Recommendation**: If you do upgrade to a paid plan, use the actual worker service instead of the web-service workaround for better automated-task reliability.
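Before deploying, it can help to verify that the required keys listed above are actually set. The following is a minimal, hypothetical startup check; the `REQUIRED_KEYS` list mirrors the variables above, but the helper itself is an illustration, not part of the TuxSEO codebase:

```python
import os

# The six API keys the Render deployment requires (from the list above).
REQUIRED_KEYS = [
    "OPENAI_API_KEY",
    "TAVILY_API_KEY",
    "GEMINI_API_KEY",
    "PERPLEXITY_API_KEY",
    "JINA_READER_API_KEY",
    "KEYWORDS_EVERYWHERE_API_KEY",
]

def missing_env_vars(required, environ=os.environ):
    """Return the names of required variables that are unset or blank."""
    return [name for name in required if not environ.get(name, "").strip()]

if __name__ == "__main__":
    missing = missing_env_vars(REQUIRED_KEYS)
    if missing:
        raise SystemExit(f"Missing required env vars: {', '.join(missing)}")
    print("All required env vars are set.")
```

Running a check like this before the first deploy surfaces a forgotten key immediately, rather than as a confusing failure inside a background task.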
**Reality Check**: The website functionality should be usable on the free tier; you'll only pay for API costs. Manual operations work fine, but automated background tasks (especially when adding multiple projects) may occasionally fail due to memory constraints. It's not very comfortable for heavy automated use, but perfectly functional for manual content generation.

If you know of any other services like Render that allow deployment via a button and provide free Redis, Postgres, and web services, please let me know in the Issues section, and I can try to create deployments for those. Bear in mind that free services are usually not large enough to run this application reliably.

Docker Compose

This should also be pretty streamlined. On your server, create a folder in which you will have 2 files:

• Copy the contents of into and update all the necessary values.
• Copy the contents of into and run the suggested command from the top of the file.

How you expose the backend container is up to you. I usually do it via an Nginx reverse proxy with UPSTREAM_HTTP_ADDRESS.

Pure Python / Django deployment

Not recommended: it is not very safe for production and has not been tested by me. If you are not into Docker or Render and just want to run this via regular commands, you will need to have 5 processes running:

•
•
•
•
•

You'd still need to make sure .env has correct values.

Custom Deployment on Caprover

• Create 4 apps on CapRover:
  •
  •
• Create a new CapRover app token for:
  •
• Add Environment Variables to those same apps from .
• Create a new GitHub Actions secret with the following:
  •
  •
• Then just push the main branch.
• The GitHub Workflow in this repo should take care of the rest.

Local Development

All the information on how to run, develop, and update your new application can be found in the documentation.

• Update the name of the to and update relevant variables.
• Run
• Run just in case; it sometimes has trouble connecting to Redis on first deployment.

CI pre-PR runbook (required before opening a PR)

Run the same pytest command as CI locally:

What this does:

• Boots the local Postgres/Redis services used by tests.
• Runs with the same strict flags as CI ( ).
• Pins for deterministic hash ordering.

If this fails, fix it locally before pushing.

Analytics ingestion jobs (GA4, GSC, Plausible)

TuxSEO now ships background ingestion for connected analytics providers:

• Google Analytics 4 (GA4)
• Google Search Console (GSC)
• Plausible

Implementation highlights:

• Incremental sync cursor per with a rolling lookback window.
• Raw source snapshots ( ) plus normalized daily facts ( ).
• Idempotent upsert semantics for safe retries.
• Provider-aware retry/backoff behavior with observable cursor status and sanitized errors.
• Missing/disconnected integrations are skipped without failing the whole scheduling run.
• Project Home includes an **Analytics (GA4/GSC/Plausible)** section that surfaces:
  • connected-source status badges
  • 30-day KPI rollups (clicks, impressions, sessions, users, conversions)
  • derived rates (CTR, engagement, conversion) and average GSC position
  • trend deltas (recent 7d vs prior 7d)
  • top low-CTR/high-impression opportunities with actionable on-page SEO suggestions

Scheduled entrypoint:

• Worker task entrypoint:

Deterministic content quality evaluation

Run the deterministic rubric evaluation locally:

This executes , scores / / fixtures with fixed rubric weights, and writes .

Safe baseline update procedure:

• Run and confirm the change is expected.
• Review the generated score deltas.
• Re-run with explicit baseline update mode:
• Commit together with the scoring-logic change.

Internal API (BlogPost CRUD)

Internal blog post management endpoints are available under and are protected by the **superuser API key** query param ( ).
• — create (existing endpoint, kept for compatibility)
• — list
• — retrieve
• — full update
• — partial update
• — delete

Requests with a non-superuser or missing API key are rejected with unauthorized responses.

Star History
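The superuser API key gate described in the Internal API section above could be sketched roughly as follows. This is a hypothetical illustration, not the repository's actual view code; the `api_key` query-param name and the shape of the configured key are assumptions:

```python
import hmac

def is_superuser_request(query_params: dict, expected_key: str) -> bool:
    """Hypothetical check for the superuser API key query param.

    Returns True only when the param is present, a key is configured,
    and the two match; hmac.compare_digest keeps the comparison
    constant-time to avoid leaking key material via timing.
    """
    supplied = query_params.get("api_key")  # param name is an assumption
    if not supplied or not expected_key:
        return False
    return hmac.compare_digest(supplied, expected_key)
```

A view wrapping a check like this would return an unauthorized response whenever the key is missing or wrong, which matches the behavior described above.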