# Obsidian AI Providers

By Pavel Frankov

A hub for setting up AI providers (OpenAI-like, Ollama, and more) in one place.
⚠️ Important Note: This plugin is a configuration tool - it helps you manage your AI settings in one place.
Think of it like a control panel where you can:
- Store your API keys and settings for AI services
- Share these settings with other Obsidian plugins
- Avoid entering the same AI settings multiple times
The plugin itself doesn't do any AI processing - it just helps other plugins connect to AI services more easily.
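As a rough sketch of what "storing settings in one place" amounts to, a provider entry can be modeled as a small record. The type and helper below are illustrative only, not the plugin's actual API:

```typescript
// Hypothetical shape of a stored provider entry (illustrative;
// the plugin's real internal types may differ).
interface AIProviderConfig {
    type: "openai" | "ollama" | "groq" | "gemini" | "lmstudio";
    url: string;      // Provider URL, e.g. "https://api.openai.com/v1"
    apiKey?: string;  // not needed for local servers like Ollama or LM Studio
    model?: string;   // e.g. "gpt-4o" or "gemma2"
}

// Most providers listed below speak the OpenAI-compatible API, so the
// chat endpoint is just the base URL plus a fixed path.
function chatEndpoint(config: AIProviderConfig): string {
    return config.url.replace(/\/+$/, "") + "/chat/completions";
}

console.log(chatEndpoint({ type: "openai", url: "https://api.openai.com/v1" }));
// "https://api.openai.com/v1/chat/completions"
```

Other plugins can then read one such record instead of asking you to re-enter the URL and key.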
## Supported providers
- OpenAI
- OpenRouter
- Anthropic
- Google Gemini
- Mistral AI
- Together AI
- Fireworks AI
- Perplexity AI
- DeepSeek
- xAI (Grok)
- Cerebras
- Z.AI
- Groq
- 302.AI
- Novita AI
- DeepInfra
- SambaNova
- LM Studio
- Ollama (and Open WebUI)
- OpenAI compatible API
## Features
- Fully encapsulated API for working with AI providers
- Develop AI plugins faster without dealing directly with provider-specific APIs
- Easily extend support for additional AI providers in your plugin
- Available in 11 languages: English, Spanish, French, Italian, Portuguese, German, Russian, Chinese, Japanese, Korean, and Dutch
## Installation

### Obsidian plugin store (recommended)

This plugin is available in the Obsidian community plugin store: https://obsidian.md/plugins?id=ai-providers

### BRAT

You can install this plugin via BRAT: `pfrankov/obsidian-ai-providers`
## Create AI provider

### Ollama

1. Install Ollama.
2. Install Gemma 2: `ollama pull gemma2`, or any preferred model from the library.
3. Select `Ollama` in `Provider type`.
4. Click the refresh button and select the model that suits your needs (e.g. `gemma2`).

Additional: if you have issues with streaming completion with Ollama, try setting the environment variable `OLLAMA_ORIGINS` to `*`:

- For macOS, run `launchctl setenv OLLAMA_ORIGINS "*"`.
- For Linux and Windows, check the docs.
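The refresh button in the last step presumably queries Ollama's local API for installed models. Ollama lists them at `GET http://localhost:11434/api/tags`; parsing that response might look like the sketch below (an illustration, not the plugin's code):

```typescript
// Extract model names from an Ollama /api/tags response.
// Ollama answers GET http://localhost:11434/api/tags with
// { "models": [{ "name": "gemma2:latest", ... }, ...] }.
interface OllamaTagsResponse {
    models: { name: string }[];
}

function listOllamaModels(response: OllamaTagsResponse): string[] {
    return response.models.map((m) => m.name);
}

// With Ollama running, the full round trip would be:
// const res = await fetch("http://localhost:11434/api/tags");
// const names = listOllamaModels(await res.json());
console.log(listOllamaModels({ models: [{ name: "gemma2:latest" }] }));
// [ 'gemma2:latest' ]
```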
### OpenAI

1. Select `OpenAI` in `Provider type`.
2. Set `Provider URL` to `https://api.openai.com/v1`.
3. Retrieve and paste your `API key` from the API keys page.
4. Click the refresh button and select the model that suits your needs (e.g. `gpt-4o`).
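With those settings, a request to OpenAI boils down to a POST with a bearer token. A hedged sketch of the payload (field names follow OpenAI's chat completions API; this is not the plugin's actual code):

```typescript
// Build the request an OpenAI-style provider expects from the
// settings above: the Provider URL, the API key as a bearer token,
// and the chosen model in the JSON body.
function buildChatRequest(apiKey: string, model: string, prompt: string) {
    return {
        url: "https://api.openai.com/v1/chat/completions",
        headers: {
            "Content-Type": "application/json",
            Authorization: `Bearer ${apiKey}`,
        },
        body: JSON.stringify({
            model,
            messages: [{ role: "user", content: prompt }],
        }),
    };
}

const req = buildChatRequest("sk-...", "gpt-4o", "Hello");
// Send with: fetch(req.url, { method: "POST", headers: req.headers, body: req.body })
```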
### OpenAI compatible server

There are several options to run a local OpenAI-like server:

- Open WebUI
- llama.cpp
- llama-cpp-python
- LocalAI
- Oobabooga Text generation web UI
- LM Studio
- ...maybe more
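Whichever server you pick, you can sanity-check it before pointing the plugin at it by listing its models from the OpenAI-compatible `GET /v1/models` endpoint. A minimal sketch of parsing that response (shape per the OpenAI API; the helper name is made up):

```typescript
// OpenAI-compatible servers answer GET {Provider URL}/models with
// { "object": "list", "data": [{ "id": "<model>", ... }, ...] }.
interface ModelsResponse {
    data: { id: string }[];
}

function listModels(response: ModelsResponse): string[] {
    return response.data.map((m) => m.id);
}

// e.g. for LM Studio: fetch("http://localhost:1234/v1/models")
console.log(listModels({ data: [{ id: "gemma2" }] })); // [ 'gemma2' ]
```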
### OpenRouter

1. Select `OpenRouter` in `Provider type`.
2. Set `Provider URL` to `https://openrouter.ai/api/v1`.
3. Retrieve and paste your `API key` from the API keys page.
4. Click the refresh button and select the model that suits your needs (e.g. `anthropic/claude-3.7-sonnet`).
### Google Gemini

1. Select `Google Gemini` in `Provider type`.
2. Set `Provider URL` to `https://generativelanguage.googleapis.com/v1beta/openai`.
3. Retrieve and paste your `API key` from the API keys page.
4. Click the refresh button and select the model that suits your needs (e.g. `gemini-1.5-flash`).
### LM Studio

1. Select `LM Studio` in `Provider type`.
2. Set `Provider URL` to `http://localhost:1234/v1`.
3. Click the refresh button and select the model that suits your needs (e.g. `gemma2`).
### Groq

1. Select `Groq` in `Provider type`.
2. Set `Provider URL` to `https://api.groq.com/openai/v1`.
3. Retrieve and paste your `API key` from the API keys page.
4. Click the refresh button and select the model that suits your needs (e.g. `llama3-70b-8192`).
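All of the hosted providers above are configured the same way and differ mainly in `Provider URL`. As an illustration, here are the URLs from this README collected into one lookup (the helper is hypothetical, not part of the plugin):

```typescript
// Provider URLs collected from the setup sections above.
const PROVIDER_URLS: Record<string, string> = {
    openai: "https://api.openai.com/v1",
    openrouter: "https://openrouter.ai/api/v1",
    gemini: "https://generativelanguage.googleapis.com/v1beta/openai",
    lmstudio: "http://localhost:1234/v1",
    groq: "https://api.groq.com/openai/v1",
};

function providerUrl(name: string): string {
    const url = PROVIDER_URLS[name];
    if (!url) throw new Error(`Unknown provider: ${name}`);
    return url;
}

console.log(providerUrl("groq")); // "https://api.groq.com/openai/v1"
```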
## For plugin developers

Docs: How to integrate AI Providers in your plugin.

Quick reference (details in the SDK docs):
```typescript
try {
    const finalText = await aiProviders.execute({
        provider,
        prompt: "Hello",
        onProgress: (chunk, full) => {/* stream UI update */},
        abortController
    });
    // use finalText
} catch (e) {
    // handle error / abort
}
```
Removed callbacks: `onEnd` and `onError`. The promise's resolve/reject covers both, so only `onProgress` remains for streaming. The legacy chainable handler is also deprecated.
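The migration is easier to see with a runnable stand-in. The mock below simulates `execute`'s streaming behavior (the chunked text is invented; only the promise/`onProgress` shape mirrors the SDK):

```typescript
// Minimal stand-in for aiProviders.execute: onProgress streams partial
// text, the resolved promise replaces onEnd, and rejection replaces onError.
interface ExecuteParams {
    prompt: string;
    onProgress?: (chunk: string, full: string) => void;
    abortController?: AbortController;
}

async function mockExecute(params: ExecuteParams): Promise<string> {
    const chunks = ["Hel", "lo, ", "world"]; // simulated stream
    let full = "";
    for (const chunk of chunks) {
        if (params.abortController?.signal.aborted) {
            throw new Error("Aborted"); // surfaces in catch, like a real abort
        }
        full += chunk;
        params.onProgress?.(chunk, full);
    }
    return full; // what onEnd used to deliver
}

// Usage mirrors the snippet above:
mockExecute({
    prompt: "Hello",
    onProgress: (_chunk, full) => console.log(full), // stream UI update
})
    .then((finalText) => console.log("final:", finalText)) // replaces onEnd
    .catch((e) => console.error(e)); // replaces onError
```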
## Roadmap
- Docs for devs
- Ollama context optimizations
- German translations
- Chinese translations
- Update to latest OpenAI version and embedding models
- Russian translations
- Groq Provider support
- Passing messages instead of one prompt
- Anthropic Provider support
- Shared embeddings to avoid re-embedding the same documents multiple times
- Spanish, Italian, French, Dutch, Portuguese, Japanese, Korean translations
- Encapsulated basic RAG search with optional BM25 search
## My other Obsidian plugins
- Local GPT that assists with local AI for maximum privacy and offline access.
- Colored Tags that colorizes tags in distinguishable colors.