# Ollama Sidechat for Obsidian

By Leonyx · MIT License

Chat with your local Ollama instance in a side window.
A side panel chat interface for Ollama in Obsidian. Chat with local LLMs directly in your vault.
## Features
- 💬 **Side Panel Chat** – chat with Ollama without leaving your notes
- 📝 **Note Context** – automatically includes the active note's content as context
- 💾 **Chat History** – saves conversations as searchable Markdown files
- ⚡ **Streaming Responses** – see responses as they're generated
- 🔄 **Model Switching** – switch between available models on the fly
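Streaming responses from Ollama arrive as newline-delimited JSON: one object per chunk, each carrying a fragment of the reply in `message.content`, with a final `"done": true` chunk. A minimal sketch of reassembling a reply from such a stream (the chunk shape follows Ollama's `/api/chat` streaming format; the helper name is illustrative, not the plugin's actual code):

```typescript
// Shape of one streamed chunk from Ollama's /api/chat endpoint, e.g.:
//   {"message":{"role":"assistant","content":"Hel"},"done":false}
interface ChatChunk {
  message?: { role: string; content: string };
  done: boolean;
}

// Illustrative helper: join the content fragments from a raw NDJSON body.
function collectStreamedReply(ndjsonBody: string): string {
  let reply = "";
  for (const line of ndjsonBody.split("\n")) {
    if (!line.trim()) continue; // skip blank lines between chunks
    const chunk: ChatChunk = JSON.parse(line);
    if (chunk.message) reply += chunk.message.content;
  }
  return reply;
}

// Example: three chunks as Ollama would stream them
const body = [
  '{"message":{"role":"assistant","content":"Hello"},"done":false}',
  '{"message":{"role":"assistant","content":" world"},"done":false}',
  '{"done":true}',
].join("\n");

console.log(collectStreamedReply(body)); // "Hello world"
```

Rendering fragments as they arrive (rather than waiting for `done`) is what makes the side panel feel live.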
## Installation
1. Install Ollama on your system
2. Pull a model: `ollama pull llama3.1`
3. Copy this plugin to `.obsidian/plugins/ollama-sidechat/`
4. Enable the plugin in Obsidian Settings → Community Plugins
## Usage
1. Start Ollama on your machine (if it isn't already running)
2. Click the brain icon in the ribbon (or run the command "Open Ollama Chat")
3. Type your message and press Enter to send
4. Use Shift+Enter for multi-line input
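Under the hood, sending a message is an HTTP POST to Ollama's `/api/chat` endpoint. A sketch of how the request body might be assembled, with the active note prepended as a system message when note context is enabled (the field names follow Ollama's API; `buildChatRequest` and the system-prompt wording are illustrative assumptions):

```typescript
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

// Illustrative helper: assemble the JSON body for POST {ollamaUrl}/api/chat.
function buildChatRequest(
  model: string,
  messages: ChatMessage[],
  noteContext?: string // active note content, if "Include Note Context" is on
): object {
  const all: ChatMessage[] = noteContext
    ? [
        { role: "system", content: `Context from the active note:\n${noteContext}` },
        ...messages,
      ]
    : messages;
  // stream: true asks Ollama for newline-delimited JSON chunks
  return { model, messages: all, stream: true };
}

const req = buildChatRequest(
  "llama3.1",
  [{ role: "user", content: "Summarize this note" }],
  "# My note"
);
console.log(JSON.stringify(req));
```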
## Settings
| Setting | Description |
|---|---|
| Ollama URL | Ollama API endpoint (default: `http://localhost:11434`) |
| Default Model | Model used for new chats |
| Temperature | Sampling temperature; higher values give more varied responses (0–1) |
| Max Tokens | Token limit per response (`-1` = unlimited) |
| Chat History Folder | Where to save chat files |
| Include Note Context | Send active note as context |
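The table above can be pictured as a settings object with its defaults (the interface and field names are an illustrative sketch, not the plugin's actual source; the temperature and model defaults are assumptions, since only the URL and token-limit defaults are stated):

```typescript
// Sketch of the plugin's settings, mirroring the table above.
interface SidechatSettings {
  ollamaUrl: string;         // Ollama API endpoint
  defaultModel: string;      // model used for new chats
  temperature: number;       // 0–1
  maxTokens: number;         // -1 = unlimited
  chatHistoryFolder: string; // where chat files are saved
  includeNoteContext: boolean;
}

const DEFAULT_SETTINGS: SidechatSettings = {
  ollamaUrl: "http://localhost:11434",
  defaultModel: "llama3.1",  // assumed; any pulled model works
  temperature: 0.7,          // assumed midpoint default
  maxTokens: -1,
  chatHistoryFolder: "Ollama Chats",
  includeNoteContext: true,
};

console.log(DEFAULT_SETTINGS.ollamaUrl);
```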
## Chat History
Chats are saved as Markdown files organized by month:

    Ollama Chats/
    └── 2026-01/
        └── 2026-01-22_14-30-15_my-question.md
Each file includes YAML frontmatter with metadata and links to context notes.
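The month folder and timestamped filename shown above can be derived from the chat's start time and first message. A sketch of that derivation (the slug rules are an illustrative guess at the plugin's behavior, not its actual source):

```typescript
// Illustrative sketch: derive "2026-01/2026-01-22_14-30-15_my-question.md"
// from a date and the first user message.
function chatFilePath(date: Date, firstMessage: string): string {
  const pad = (n: number) => String(n).padStart(2, "0");
  const month = `${date.getFullYear()}-${pad(date.getMonth() + 1)}`;
  const stamp =
    `${month}-${pad(date.getDate())}_` +
    `${pad(date.getHours())}-${pad(date.getMinutes())}-${pad(date.getSeconds())}`;
  const slug = firstMessage
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "-") // runs of non-alphanumerics become one hyphen
    .replace(/^-|-$/g, "")       // trim leading/trailing hyphens
    .slice(0, 40);               // keep filenames short
  return `${month}/${stamp}_${slug}.md`;
}

console.log(chatFilePath(new Date(2026, 0, 22, 14, 30, 15), "My question?"));
// → "2026-01/2026-01-22_14-30-15_my-question.md"
```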
## Requirements
- Obsidian 0.15.0+
- Ollama installed locally
## License
MIT