mcp-ollama

Free · ollama · local-llm · privacy +1

A local LLM runtime MCP server built on Ollama, supporting Llama, Mistral, Qwen, and other open-source models with complete data locality: prompts and responses never leave your machine.

Use Cases

  • Local AI inference
  • Privacy-sensitive data
  • Regulated domains: medical, legal, finance
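To illustrate the data-locality claim above, here is a minimal sketch of the request shape Ollama's local HTTP API expects; everything is addressed to `localhost:11434` (Ollama's default port), so no data leaves the machine. The model name and prompt are placeholder assumptions, and `build_request` is a hypothetical helper, not part of mcp-ollama itself.

```python
import json

# Ollama's default local endpoint; an MCP server wrapping Ollama
# would forward requests here rather than to any cloud API.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(model: str, prompt: str) -> dict:
    """Build a non-streaming generate request for the local Ollama API."""
    return {"model": model, "prompt": prompt, "stream": False}

# Hypothetical usage: a privacy-sensitive prompt that stays on-device.
payload = build_request("mistral", "Summarize this contract clause.")
print(json.dumps(payload))
```

Posting this payload to `OLLAMA_URL` (e.g. with `urllib.request` or `curl`) returns the model's completion locally, which is what makes the privacy-sensitive use cases above viable.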
