LLM Shortcut
Approved by Viktor Chernodub
Provides a way to create shortcuts for commands powered by LLM capabilities.
Turn Markdown prompt files into Obsidian commands.
LLM Shortcut maps your prompt library folder to command palette entries, then runs the selected prompt against the active note using any OpenAI-compatible provider.
Why this plugin
If you keep reusing prompts ("improve writing", "translate", "make this a bullet-list"), copy-paste gets tedious quickly.
This plugin lets you:
- keep prompts as plain `.md` files in your vault;
- organize them in folders;
- run them like native Obsidian commands.
I used OpenRouter's `google/gemini-3-flash-preview` for the demo.
Prerequisites
You'll need to bring your own LLM provider and API keys :)
Features
- Use your own OpenAI-compatible providers (OpenAI, OpenRouter, and others with compatible endpoints)
- Prompt files become commands automatically (including nested folders)
- Streaming output directly into the editor selection/cursor
- Custom prompt command for one-off prompts without creating a file
- Local-first behavior: your prompt files stay in your vault
2-minute quick start
- Install LLM Shortcut from Community Plugins.
- Open plugin settings and fill in:
  - API key
  - Base URL (example: `https://api.openai.com/v1`)
  - Model name (example: `gpt-4.1-mini`)
- Create a prompt folder (default: `_prompts`).
- Add a prompt file, for example `_prompts/Writing/Improve.md`:
Improve the selected text.
Keep the original meaning, but make it clearer and more concise.
- Open any note, select text (or place the cursor), then run the command:
LLM Shortcut: Writing / Improve
The better your prompting, the better your results; it's mostly on you :)
Advanced prompt features
info-mode popup
This feature is a "show result, don't edit my note" mode. Normally, this plugin writes the AI response directly into your note (at the cursor/selection).
If you set this in your prompt file's frontmatter:
---
llm-shortcut-prompt-response-processing-mode: info
---
the response is shown in a popup window instead.
The popup opens with your prompt name as the title, shows a loading state, then streams in the AI answer live.
The answer is rendered as Markdown (so headings/lists/tables display nicely).
Your note content is not replaced in this mode.
Good use case: dictionary/explanation prompts (like prompt-examples/Foreign word explanation.md) where you want to read info quickly without changing the document.
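For example, a complete info-mode prompt file (the wording below is illustrative, not taken from prompt-examples) could look like:
---
llm-shortcut-prompt-response-processing-mode: info
---
Explain the selected word: give a short definition, its typical usage, and one example sentence.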
Selection-Only Commands
Some prompts work best when applied to a specific selection of text. You can mark a command as selection-only by adding frontmatter to your prompt file:
---
llm-shortcut-selection-mode: selection-only
---
Your prompt content here...
When a command is marked as selection-only, it will:
- Require text to be selected before execution
- Show an error notification if you try to run it without a selection
- Only process the selected text (and the document context) when executed
This is useful for prompts that are designed to transform, analyze, or modify specific portions of text rather than working with the entire document.
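The selection-only behavior described above amounts to a simple guard before the command runs. Here is an illustrative sketch in TypeScript; `canRun` is a hypothetical helper, not the plugin's actual code:

```typescript
// Hypothetical guard for a selection-only command (illustrative only).
// Returns whether the command may run, and an error message if not.
function canRun(
  mode: string,          // value of llm-shortcut-selection-mode
  selectedText: string,  // current editor selection ("" if none)
): { ok: boolean; error?: string } {
  if (mode === "selection-only" && selectedText.length === 0) {
    return { ok: false, error: "This command requires a text selection." };
  }
  return { ok: true };
}
```

With this shape, the plugin would show the error as a notification and skip the LLM call entirely when no text is selected.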
Context for LLM
By default, the plugin sends the entire file content to the LLM, marking the areas that should be modified (either a text selection or the caret position). The LLM uses the full file as context when making modifications.
You can limit the context window by specifying the number of characters to include before and after the selection or caret position. This is particularly useful when working with very long documents or when you want to focus the LLM's attention on a specific area.
To configure the context size, add these parameters to your prompt file's frontmatter:
---
llm-shortcut-context-size-before: 256
llm-shortcut-context-size-after: 0
---
Your prompt content here...
- `llm-shortcut-context-size-before`: Number of characters to include before the selection (default: entire file)
- `llm-shortcut-context-size-after`: Number of characters to include after the selection (default: entire file)
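The trimming described above can be sketched as follows. This is a minimal TypeScript illustration under stated assumptions: `buildContext` and the `<selection>` marker are invented for this example and are not the plugin's real implementation:

```typescript
// Hypothetical sketch of trimming the context sent to the LLM around a
// selection, per the context-size frontmatter settings (illustrative only).
function buildContext(
  doc: string,        // full note content
  selStart: number,   // selection start offset
  selEnd: number,     // selection end offset
  sizeBefore: number, // llm-shortcut-context-size-before (Infinity = whole file)
  sizeAfter: number,  // llm-shortcut-context-size-after (Infinity = whole file)
): string {
  const before = doc.slice(Math.max(0, selStart - sizeBefore), selStart);
  const after = doc.slice(selEnd, selEnd + sizeAfter);
  // Mark the area the LLM should modify, with only the trimmed context around it.
  return `${before}<selection>${doc.slice(selStart, selEnd)}</selection>${after}`;
}
```

For instance, with `sizeBefore: 256` and `sizeAfter: 0` only the 256 characters preceding the selection would accompany it, which keeps the model's attention local in very long documents.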
Built-in command: custom prompt
The plugin also adds a command for ad-hoc prompting (default label: Custom prompt).
You can rename this in settings via the Command label option.
Our use cases
- Improve clarity and grammar in selected paragraphs
- Translate selected text while preserving formatting
- Convert free text into a table
- Explain unfamiliar words in context
- Generate concise summaries/checklists from meeting notes
Ready-made examples are available in prompt-examples.
License
MIT