
Automatable GenAI Scripting for programmatically assembling prompts for LLMs using JavaScript.
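Tools like this treat prompts as code rather than static text. As a rough, tool-agnostic sketch of the idea (the helper names below are illustrative, not the project's actual API, and Python stands in for its JavaScript interface), programmatic prompt assembly composes reusable blocks into a final prompt string:

```python
# Hypothetical sketch of programmatic prompt assembly; none of these
# helpers belong to any specific tool's API.

def system_block(role: str) -> str:
    """Render a system instruction block."""
    return f"[SYSTEM]\nYou are {role}.\n"

def context_block(documents: list[str]) -> str:
    """Inline retrieved documents, numbered so the model can cite them."""
    lines = [f"[DOC {i + 1}]\n{doc}" for i, doc in enumerate(documents)]
    return "\n".join(lines) + "\n"

def assemble_prompt(role: str, documents: list[str], question: str) -> str:
    """Compose the final prompt from reusable blocks."""
    return system_block(role) + context_block(documents) + f"[USER]\n{question}"

prompt = assemble_prompt(
    role="a concise technical assistant",
    documents=["Ollama serves local models over HTTP."],
    question="How do I query a local model?",
)
```

Because each block is an ordinary function, prompts can be versioned, tested, and parameterized like any other code.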

Fully local web research assistant using LLMs for generating queries, summarizing results, and writing reports.
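The query-generate / summarize / report loop such assistants run can be sketched generically. The `llm` and `search` callables below are stand-ins for a local model call and a pluggable search backend; this is the shape of the pipeline, not the tool's actual code:

```python
from typing import Callable

def research(
    topic: str,
    llm: Callable[[str], str],
    search: Callable[[str], list[str]],
    rounds: int = 2,
) -> str:
    """Iteratively query, search, and summarize into a final report."""
    notes: list[str] = []
    for _ in range(rounds):
        # 1. Ask the model for the next search query, conditioned on notes so far.
        query = llm(f"Topic: {topic}\nNotes so far: {notes}\nNext search query:")
        # 2. Run the search backend on that query.
        results = search(query)
        # 3. Summarize the results into a note for the next round.
        notes.append(llm(f"Summarize these results for the report: {results}"))
    # 4. Write the final report from the accumulated notes.
    return llm(f"Write a report on {topic} from these notes: {notes}")
```

Keeping the model and search backend as injected callables makes the loop fully local: both can point at an on-device LLM and a self-hosted search index.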

Copilot for Obsidian is an open-source AI assistant integrated into Obsidian for enhanced note-taking and knowledge management.

Bridge between Ollama and MCP servers, enabling local LLMs to use Model Context Protocol tools.
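MCP is built on JSON-RPC 2.0, so a bridge like this essentially translates a local model's tool-use output into MCP messages such as `tools/call`. A minimal sketch of constructing one such message follows; the field layout reflects my reading of the MCP spec, so verify it against the protocol revision your server implements:

```python
import json

def mcp_tool_call(request_id: int, tool_name: str, arguments: dict) -> str:
    """Build an MCP tools/call request as a JSON-RPC 2.0 message.

    Field names follow the MCP specification as I understand it;
    check your server's protocol version before relying on this.
    """
    message = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool_name, "arguments": arguments},
    }
    return json.dumps(message)

# A bridge would send this over the server's transport (e.g. stdio or HTTP)
# whenever the local model emits a tool-use request, then feed the JSON-RPC
# result back into the model's context.
payload = mcp_tool_call(1, "get_weather", {"city": "Berlin"})
```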

WhatsApp MCP server enabling users to interact with WhatsApp messages, contacts, and media through an MCP interface.

Documentation on setting up an LLM server on Debian from scratch, using Ollama/vLLM, Open WebUI, OpenedAI Speech/Kokoro FastAPI, and ComfyUI.

A flexible framework for optimizing local deployments of large language models with cutting-edge inference techniques.

LLMFarm is an iOS and macOS app for running large language models offline using the GGML library.

EdgePersona is a fully local intelligent digital human that runs entirely offline with low computational requirements.

Easily discover, download, and run local LLMs like Llama and DeepSeek on your computer.