BodhiSearch / BodhiApp
Run Open Source/Open Weight LLMs locally with OpenAI compatible APIs
AI Architecture Analysis
This repository is indexed by RepoMind. By analyzing BodhiSearch/BodhiApp in our AI interface, you can instantly generate complete architecture diagrams, visualize control flows, and perform automated security audits across the entire codebase.
Our Agentic Context Augmented Generation (Agentic CAG) engine loads full source files into context on-demand, avoiding the fragmentation of traditional RAG systems. Ask questions about the architecture, dependencies, or specific features to see it in action.
Repository Overview (README excerpt)
# Bodhi App

## Table of Contents

• Overview
• Features
• Installation
• Usage
• Documentation
• API Client
• Community
• Powered By

## Overview

Bodhi App allows you to run Open Source LLMs locally. It utilizes the Huggingface ecosystem for accessing open-source LLM weights and information and is powered by llama.cpp.

While many apps that help you run LLMs locally are targeted at technical users, Bodhi App is designed with both technical and non-technical users in mind.

For technical users, it provides OpenAI-compatible chat completions and models API endpoints. It includes comprehensive API documentation following OpenAPI standards and features a built-in SwaggerUI that allows developers to explore and test all API endpoints live.

For non-technical users, it comes with a built-in Chat UI that is quick to start and easy to understand. Users can quickly get started with open-source models and adjust various settings to suit their needs. The app also enables users to discover, explore, and download new open-source models that fit their requirements and are compatible with their local hardware.

## Features

• **Built-in Chat UI:** Enjoy an intuitive, responsive chat interface with real-time streaming, markdown support, and customizable settings
• **Model Management:** Download and manage GGUF model files directly from HuggingFace
• **API Token Management:** Securely generate and manage API tokens for external integrations
• **Dynamic App Settings:** Easily adjust application parameters (like execution variant and idle timeout) on the fly
• **Responsive Design:** A fully adaptive layout that works seamlessly across desktop and mobile devices
• **Robust Error Handling:** Comprehensive error logging and troubleshooting guides to help quickly identify and resolve issues

## Installation

Bodhi App is currently released only for the Mac platform. You can install it either by downloading the release from the GitHub release page or using Homebrew.
### Homebrew

Bodhi App hosts its external cask at . Install Bodhi App using this command:

Once installed, launch from the folder. You should see the Bodhi App icon in your system tray. Launch the homepage from the system tray menu by selecting .

### GitHub Releases

Download the latest release for your platform from the Releases page. Unzip and move to your folder, then launch it. You should see the Bodhi App icon in your system tray. Launch the homepage from the system tray menu by selecting .

### Docker

Bodhi App is available as Docker images with multiple hardware acceleration variants. Each variant is optimized for specific hardware configurations to provide the best performance.

#### Available Variants

• **CPU Variant:** Standard CPU-only inference for maximum compatibility (multi-platform: AMD64 + ARM64)
• **CUDA Variant:** NVIDIA GPU acceleration for faster inference on NVIDIA hardware
• **ROCm Variant:** AMD GPU acceleration for AMD graphics cards
• **Vulkan Variant:** Cross-vendor GPU acceleration supporting multiple GPU vendors

#### Quick Start

**CPU Variant (Most Compatible - Auto-detects AMD64/ARM64):**

**CUDA Variant (NVIDIA GPU):**

**ROCm Variant (AMD GPU):**

**Vulkan Variant (Cross-vendor GPU):**

#### Hardware Requirements

• **CPU:** Standard x86_64 (AMD64) or ARM64 processor (auto-detected)
• **CUDA:** NVIDIA GPU with CUDA 12.4+ support and compatible drivers
• **ROCm:** AMD GPU with ROCm 6.4+ support and compatible drivers
• **Vulkan:** GPU with Vulkan API support and compatible drivers

#### Volume Mounts

• : Application data, configuration, and downloaded models
• : HuggingFace cache directory for model downloads

After starting the container, Bodhi App will be available at .

### Setup

On first launch, Bodhi App starts with a setup flow. Follow this process to configure and install Bodhi App for your local machine and get started.
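Whichever install method you use, a quick way to confirm the app is up is to probe its OpenAI-compatible API. The sketch below is an illustration under stated assumptions, not a confirmed route: the base URL `http://localhost:1135` is taken from the documentation URLs in this README, and the `GET /v1/models` path is assumed from the app's claimed OpenAI compatibility. Verify the actual endpoints in the built-in Swagger UI.

```typescript
// Hedged smoke-test sketch for a locally running Bodhi App instance.
// Assumptions: base URL http://localhost:1135 (from the docs links in this
// README) and an OpenAI-style GET /v1/models route; confirm real paths in
// the app's Swagger UI before relying on them.

const BODHI_BASE_URL = "http://localhost:1135";

// Build a plain request descriptor; kept pure so it is easy to inspect.
function buildModelsRequest(baseUrl: string): {
  url: string;
  method: string;
  headers: Record<string, string>;
} {
  return {
    url: `${baseUrl}/v1/models`,
    method: "GET",
    headers: { Accept: "application/json" },
  };
}

// Usage (requires a running Bodhi App instance):
// const req = buildModelsRequest(BODHI_BASE_URL);
// const res = await fetch(req.url, { method: req.method, headers: req.headers });
// console.log(await res.json());
```

Keeping the request construction separate from the `fetch` call makes it trivial to point the same helper at a container, a remote host, or a non-default port.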
## Documentation

Bodhi App comes with built-in documentation:

• User Guide: Access at http://localhost:1135/docs/
• Technical Documentation: Available as OpenAPI Swagger UI at http://localhost:1135/swagger-ui/

## API Client

Bodhi App provides a TypeScript client for easy integration with the API:

### Installation

### Usage

For more information, see the ts-client documentation.

## Community

### Web & Desktop

Open WebUI

(Open up a pull request on README.md to include community integrations)

## Powered By

• llama.cpp
• huggingface.co
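The crawled page elides the ts-client install and usage snippets, so here is a hedged sketch of talking to the API directly over HTTP instead. The `/v1/chat/completions` path is an assumption based on the app's OpenAI compatibility, and the Bearer token scheme is an assumption based on the API Token Management feature; confirm both against the Swagger UI at http://localhost:1135/swagger-ui/ and prefer the official ts-client where it fits.

```typescript
// Hedged sketch: chat completion request against Bodhi App's
// OpenAI-compatible API. The /v1/chat/completions route and Bearer auth
// scheme are assumptions, not confirmed from the repo.

type ChatMessage = { role: "system" | "user" | "assistant"; content: string };

function buildChatCompletionRequest(
  baseUrl: string,
  apiToken: string,
  model: string,
  messages: ChatMessage[],
): { url: string; method: string; headers: Record<string, string>; body: string } {
  return {
    url: `${baseUrl}/v1/chat/completions`,
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // Token generated via the app's API Token Management screen.
      Authorization: `Bearer ${apiToken}`,
    },
    // OpenAI-style chat payload: model alias plus message history.
    body: JSON.stringify({ model, messages }),
  };
}

// Usage (requires a running instance, a token, and a downloaded model):
// const req = buildChatCompletionRequest(
//   "http://localhost:1135", "<your-token>", "<model-alias>",
//   [{ role: "user", content: "Hello!" }],
// );
// const res = await fetch(req.url, { method: req.method, headers: req.headers, body: req.body });
// console.log(await res.json());
```

Because the API follows the OpenAI wire format, existing OpenAI SDKs pointed at the local base URL should also work, subject to the same caveats about the exact routes.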