Mini-RAG

by John Wheatley
Score: 30/100

Description

Category: Learning & Knowledge Management

The Mini-RAG plugin enables local retrieval augmented generation by connecting your notes to a locally running LLM through Ollama. You can start a chat in the context of a specific note or folder, allowing the model to reference only relevant content when generating responses. It supports any Ollama-installed model and provides controls for model selection, temperature adjustment, and even context-free chatting when you want unconstrained responses. Interactions can be initiated directly from right-click menus in the editor or sidebar, and conversations can be saved for later reference.

Reviews

No reviews yet.

Stats

  • 16 stars
  • 1,289 downloads
  • 3 forks
  • 274 days
  • 300 days
  • 332 days
  • 0 total PRs
  • 0 open PRs
  • 0 closed PRs
  • 0 merged PRs
  • 1 total issue
  • 1 open issue
  • 0 closed issues
  • 0 commits

Requirements (Experimental)

  • Ollama installed and running locally

Latest Version

a year ago

Changelog

Description

First release of the Mini-RAG plugin for Obsidian.

Features

  • Context-Sensitive Chats with Local LLMs
  • Context-Free Chats with Local LLMs

README file from GitHub

Mini-RAG

Local Retrieval Augmented Generation for your Obsidian notes


What is Mini-RAG?

Mini-RAG lets you chat with a locally running LLM in the context of selected Obsidian notes and folders. For the LLM, you can select any locally installed Ollama model (see Configure Mini-RAG below).
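The retrieval-augmented pattern described above boils down to: gather the text of the selected notes, then prepend it to the question sent to the model. The sketch below is purely illustrative of that idea, assuming hypothetical names like `build_rag_prompt`; it is not the plugin's actual implementation.

```python
# Illustrative sketch of the retrieval-augmented pattern Mini-RAG describes:
# the selected notes become context prepended to the user's question.
# Not the plugin's actual code.

def build_rag_prompt(notes: dict, question: str) -> str:
    """Assemble a prompt whose context is limited to the selected notes."""
    # Join each note's title and body into one context block.
    context = "\n\n".join(
        f"## {title}\n{body}" for title, body in notes.items()
    )
    return (
        "Answer using only the notes below.\n\n"
        f"{context}\n\n"
        f"Question: {question}"
    )

notes = {"Gardening.md": "Tomatoes need full sun and regular watering."}
prompt = build_rag_prompt(notes, "What do tomatoes need?")
```

A context-free chat (see Configure Mini-RAG) would simply skip the context block and send the question alone.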

Setting Up Mini-RAG

Install Ollama

If you don't already have Ollama installed, you can download and install it from the official Ollama website.

This is necessary because Mini-RAG relies on a locally running Ollama instance for its responses, which is also why Mini-RAG is currently a desktop-only plugin.
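Because everything depends on that local Ollama instance, it can help to confirm Ollama is actually reachable before opening a chat. A minimal check, assuming Ollama's default address of `http://localhost:11434` (a GET on the root path returns a short status message when the server is up):

```python
import urllib.request
from urllib.error import URLError

def ollama_reachable(url: str = "http://localhost:11434",
                     timeout: float = 2.0) -> bool:
    """Return True if an Ollama server answers at the given base URL."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (URLError, OSError):
        # Connection refused, timeout, etc.: no server is listening there.
        return False
```

If this returns `False`, start Ollama (or check the URL configured in the plugin's options) before trying to chat.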

Configure Mini-RAG

Open "options" by clicking on the gear icon then navigate to Community Plugins > Mini-RAG > Options. Here you can set the:

  • Ollama URL: If left unset, Ollama's default URL is used
  • Model: From a dropdown list of AI Models installed on your local Ollama setup
  • Temperature: Higher temperatures give more creative responses, but also lead to more hallucinations
  • Enable context-free chats: Provides the option to chat with an LLM without the context of a note or folder
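These settings map naturally onto Ollama's HTTP API: requests go to the configured URL (falling back to Ollama's default, `http://localhost:11434`), and temperature is passed in the request's `options` field. Below is a hedged sketch of what such a request body could look like, using Ollama's documented `/api/generate` endpoint; the plugin's actual request shape may differ.

```python
import json

# Ollama's default base URL, used when the plugin's URL setting is unset.
OLLAMA_DEFAULT_URL = "http://localhost:11434"

def build_request(prompt: str, model: str, temperature: float,
                  url: str = "") -> tuple:
    """Build the endpoint and JSON body for an Ollama /api/generate call."""
    base = url or OLLAMA_DEFAULT_URL  # unset URL falls back to the default
    body = {
        "model": model,                           # e.g. a model from `ollama list`
        "prompt": prompt,
        "stream": False,
        "options": {"temperature": temperature},  # higher = more creative
    }
    return f"{base}/api/generate", json.dumps(body).encode()

endpoint, payload = build_request("Summarize my note.", "llama3", 0.2)
```

The model name here (`llama3`) is just an example; any model installed in your local Ollama setup would appear in the plugin's dropdown.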

Using Mini-RAG

Opening a Mini-RAG Chat

This is done from the right-click context menu. You will see the "Mini-RAG" option when you:

  • Right-Click within a note
  • Right-Click a note in the sidebar
  • Right-Click a folder in the sidebar
  • Open a note's triple-dot menu

Saving Conversations

To save a Mini-RAG conversation, click the "Save" button. If you continue the conversation afterwards, you will need to click "Save" again to update the saved conversation.


Author

For more about the author visit JJWheatley.com

Similar Plugins

Similar plugins are suggested based on tags the plugins have in common.