
hendrikbgr / Free-Proxy-Repo

Get a fresh list of proxies every couple of hours, scraped from over 60 websites and more than 360 links. The list is updated once all scraped proxies have been checked, and then scraping starts again.

87 stars
12 forks
0 issues

AI Architecture Analysis

This repository is indexed by RepoMind. By analyzing hendrikbgr/Free-Proxy-Repo in our AI interface, you can instantly generate complete architecture diagrams, visualize control flows, and perform automated security audits across the entire codebase.

Our Agentic Context Augmented Generation (Agentic CAG) engine loads full source files into context on-demand, avoiding the fragmentation of traditional RAG systems. Ask questions about the architecture, dependencies, or specific features to see it in action.

Source files are only loaded when you start an analysis to optimize performance.

Embed this Badge

Showcase RepoMind's analysis directly in your repository's README.

[![Analyzed by RepoMind](https://img.shields.io/badge/Analyzed%20by-RepoMind-4F46E5?style=for-the-badge)](https://repomind.in/repo/hendrikbgr/Free-Proxy-Repo)

Repository Overview (README excerpt)


FREE FRESH DAILY PROXY LIST — UPDATED EVERY HOUR -> Download Now

Proxy-Pool-API | Powered by this Repo

Access a 100% free proxy pool API! Quickly and easily integrate proxies into your projects. Visit proxy-pool-api.vercel.app to get started.

🔥 Version 0.0.3 🔥

This is the first version of my fully automated GitHub repo & proxy scraper. My proxy scraper & checker runs on my local Raspberry Pi 4+ and updates the proxy list once everything has been scraped and checked; after that, the whole process restarts. My script crawls over 60 websites and more than 270 URLs to find all publicly available proxies. All proxies are checked before being updated here.

Proxy Scraper & Checker

You can find the Proxy Scraper's repo here: Proxy-Scraper

🚀 Automate your Proxy Scraping 🚀
📌 Ver. 0.0.3 📌

Features
• Scrape all public proxies from preset URLs.
• Scrape all links from the preset URLs.
• Scrape all public proxies from the discovered links.
• Save all scraped proxies to a file.
• Remove duplicate proxies.
• Check all proxies and save them to a file.
• Proxies are checked against www.google.com.

Support 👨‍💻 Contact 📩

If you encounter any issues with running the script or have questions, feel free to reach out to me:
• **Twitter:** @hendrik_bgr
• **Email:** hendriksdevmail@gmail.com
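The pipeline above (scrape, deduplicate, check) produces plain-text lists of proxies. As a minimal sketch of consuming such a list, assuming one `ip:port` entry per line (the exact file names in this repo may differ), the parsing and de-duplication step could look like:

```python
"""Sketch: parse and de-duplicate a scraped `ip:port` proxy list.
Assumes one proxy per line; the input format is an assumption, not
taken verbatim from this repo's files."""
import ipaddress


def parse_proxies(text: str) -> list[str]:
    """Return well-formed, de-duplicated ip:port entries in original order."""
    seen = set()
    proxies = []
    for line in text.splitlines():
        line = line.strip()
        if not line or ":" not in line:
            continue
        host, _, port = line.rpartition(":")
        try:
            ipaddress.ip_address(host)        # must be a valid IPv4/IPv6 address
            if not 0 < int(port) < 65536:     # must be a valid TCP port
                continue
        except ValueError:
            continue
        if line not in seen:                  # remove duplicate proxies
            seen.add(line)
            proxies.append(line)
    return proxies


raw = """
1.2.3.4:8080
1.2.3.4:8080
not-a-proxy
10.0.0.1:3128
"""
print(parse_proxies(raw))  # ['1.2.3.4:8080', '10.0.0.1:3128']
```

A parsed entry `p` could then be handed to an HTTP client, e.g. `requests.get(url, proxies={"http": f"http://{p}"})`, mirroring the repo's own check against www.google.com.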