Ollama Sidechat


by Leonyx

Chat with your local Ollama instance in a side window

Updated 2 months ago · MIT

Ollama Sidechat for Obsidian

A side panel chat interface for Ollama in Obsidian. Chat with local LLMs directly in your vault.

Features

  • šŸ’¬ Side Panel Chat — Chat with Ollama without leaving your notes
  • šŸ“ Note Context — Automatically includes active note content as context
  • šŸ’¾ Chat History — Saves conversations as searchable markdown files
  • ⚔ Streaming Responses — See responses as they're generated
  • šŸ”„ Model Switching — Switch between available models on the fly
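
Streaming responses from Ollama's /api/chat endpoint arrive as newline-delimited JSON, one object per line, with the final chunk marked `done: true`. As a rough sketch (the chunk shape follows Ollama's documented API; the helper name is illustrative, not the plugin's actual code), assembling a reply from such a stream might look like:

```typescript
// Each streamed chunk from /api/chat carries a partial assistant message.
interface ChatChunk {
  message?: { role: string; content: string };
  done: boolean;
}

// Illustrative helper: fold a raw NDJSON buffer into the assembled reply.
function assembleStream(ndjson: string): string {
  return ndjson
    .split("\n")
    .filter((line) => line.trim().length > 0)
    .map((line) => JSON.parse(line) as ChatChunk)
    .map((chunk) => chunk.message?.content ?? "")
    .join("");
}
```

In the plugin these chunks would be consumed incrementally from a fetch ReadableStream to render text as it is generated; here the whole buffer is parsed at once for clarity.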

Installation

  1. Install Ollama on your system
  2. Pull a model: ollama pull llama3.1
  3. Copy this plugin to .obsidian/plugins/ollama-sidechat/
  4. Enable the plugin in Obsidian Settings → Community Plugins

Usage

  1. Start Ollama on your machine (if running it locally)
  2. Click the brain icon in the ribbon (or run the command "Open Ollama Chat")
  3. Type your message and press Enter to send
  4. Use Shift+Enter for multi-line input

Settings

  • Ollama URL — API endpoint (default: http://localhost:11434)
  • Default Model — Model used for chat
  • Temperature — Response creativity (0–1)
  • Max Tokens — Token limit per response (-1 = unlimited)
  • Chat History Folder — Where chat files are saved
  • Include Note Context — Send the active note as context
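As a sketch of how these settings map onto a request (field names follow Ollama's public API, where `options.temperature` and `options.num_predict` control sampling, with `num_predict: -1` meaning unlimited; the helper and interface names are illustrative, not the plugin's actual code):

```typescript
// Illustrative settings shape mirroring the options above.
interface Settings {
  ollamaUrl: string;
  defaultModel: string;
  temperature: number;
  maxTokens: number;
}

// Build a /api/chat request from the settings, tolerating a
// trailing slash in the configured base URL.
function buildChatRequest(s: Settings, prompt: string) {
  return {
    url: s.ollamaUrl.replace(/\/+$/, "") + "/api/chat",
    body: {
      model: s.defaultModel,
      messages: [{ role: "user", content: prompt }],
      stream: true,
      options: { temperature: s.temperature, num_predict: s.maxTokens },
    },
  };
}
```

With the defaults, both http://localhost:11434 and http://localhost:11434/ resolve to the same http://localhost:11434/api/chat endpoint.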

Chat History

Chats are saved as markdown files organized by month:

Ollama Chats/
└── 2026-01/
    └── 2026-01-22_14-30-15_my-question.md

Each file includes YAML frontmatter with metadata and links to context notes.
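
For illustration, a path like the one above could be derived from a timestamp and a chat title roughly as follows (a hypothetical sketch; the slugging rules and helper name are assumptions, not the plugin's documented behavior):

```typescript
// Hypothetical helper: derive the monthly folder, timestamped filename,
// and slug for a saved chat, matching the layout shown above.
function chatFilePath(folder: string, date: Date, title: string): string {
  const pad = (n: number) => String(n).padStart(2, "0");
  const ym = `${date.getFullYear()}-${pad(date.getMonth() + 1)}`;
  const stamp =
    `${ym}-${pad(date.getDate())}_` +
    `${pad(date.getHours())}-${pad(date.getMinutes())}-${pad(date.getSeconds())}`;
  // Lowercase the title and collapse non-alphanumeric runs into hyphens.
  const slug = title
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "-")
    .replace(/^-|-$/g, "");
  return `${folder}/${ym}/${stamp}_${slug}.md`;
}
```

For example, a chat titled "My Question?" saved on 2026-01-22 at 14:30:15 would land at Ollama Chats/2026-01/2026-01-22_14-30-15_my-question.md.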

Requirements

  • Obsidian 0.15.0+
  • Ollama installed locally

License

MIT
