Official Plugins

Official plugins are maintained by the SimpleContext core team and distributed through the SimpleContext-Plugin repository.


vector-search

Semantic similarity search as a complement to keyword-based retrieval.

Tested with SimpleContext-Bot · Zero dependency

The Problem It Solves

SimpleContext already has strong keyword retrieval. But keywords fail when meaning differs from wording:

Query  : "laptop keeps freezing"
Memory : "notebook errors during multitasking"
Keyword: ❌ no match (different words)
Vector : ✅ match (same meaning)

This plugin adds a semantic similarity layer on top of keyword retrieval. Both run together — the LLM gets context from both approaches.
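One way the two result sets could be combined is a simple union that deduplicates by node id, with keyword hits kept first. This is an illustrative sketch — `merge_hits` and the hit dictionaries are hypothetical names, not the plugin's actual API:

```python
def merge_hits(keyword_hits, vector_hits, top_k=5):
    """Union keyword and vector results, deduplicating by node id.

    Keyword hits come first so exact matches keep priority; vector hits
    then fill in semantically similar nodes that the keywords missed.
    """
    seen = set()
    merged = []
    for hit in keyword_hits + vector_hits:
        if hit["id"] not in seen:
            seen.add(hit["id"])
            merged.append(hit)
    return merged[:top_k]

keyword = [{"id": 1, "text": "laptop hang fix"}]
vector = [{"id": 2, "text": "notebook errors during multitasking"},
          {"id": 1, "text": "laptop hang fix"}]
merge_hits(keyword, vector)  # node 1 appears only once
```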

Install

# From the plugin registry
simplecontext-bot plugins install vector-search

# Or manually, from the repository
git clone https://github.com/zacxyonly/SimpleContext-Plugin
cp SimpleContext-Plugin/official/plugin-vector-search/vector_search_plugin.py ./plugins/

Configuration

# config.yaml
plugins:
  vector_search_plugin:
    enabled: true
    provider: local        # local | openai | ollama
    top_k: 5
    min_score: 0.15
    inject_as_system: true
    tiers: [semantic, episodic]
Option            Type    Default                  Description
provider          string  local                    Embedding engine: local, openai, ollama
top_k             int     5                        Max results injected into context
min_score         float   0.15                     Minimum cosine similarity (0.0–1.0)
inject_as_system  bool    true                     Inject results into the system message
tiers             list    ["semantic", "episodic"] Memory tiers to index and search
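Resolving user config over these defaults might look like the sketch below. The `resolve_config` function and validation rules are assumptions drawn from the table above, not the plugin's internals:

```python
# Defaults mirroring the options table; user config overrides them.
DEFAULTS = {
    "enabled": False,
    "provider": "local",
    "top_k": 5,
    "min_score": 0.15,
    "inject_as_system": True,
    "tiers": ["semantic", "episodic"],
}

def resolve_config(user_cfg):
    """Merge user settings over defaults and validate the result."""
    cfg = {**DEFAULTS, **user_cfg}
    if cfg["provider"] not in ("local", "openai", "ollama"):
        raise ValueError(f"unknown provider: {cfg['provider']}")
    if not 0.0 <= cfg["min_score"] <= 1.0:
        raise ValueError("min_score must be within 0.0-1.0")
    return cfg

cfg = resolve_config({"enabled": True, "provider": "ollama"})
```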

Embedding Providers

local (default)

TF-IDF cosine similarity. Zero dependencies, zero setup.

vector_search_plugin:
  provider: local
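A zero-dependency TF-IDF provider could be implemented with only the standard library, roughly as sketched below. This is a minimal illustration of the technique, not the plugin's actual code:

```python
import math
from collections import Counter

def tfidf_vectors(docs):
    """Return a sparse TF-IDF vector (dict term -> weight) per document."""
    tokenized = [doc.lower().split() for doc in docs]
    df = Counter()                      # document frequency per term
    for tokens in tokenized:
        df.update(set(tokens))
    n = len(docs)
    vecs = []
    for tokens in tokenized:
        tf = Counter(tokens)
        vecs.append({t: (c / len(tokens)) * math.log((1 + n) / (1 + df[t]))
                     for t, c in tf.items()})
    return vecs

def cosine(a, b):
    """Cosine similarity between two sparse vectors."""
    dot = sum(w * b.get(t, 0.0) for t, w in a.items())
    na = math.sqrt(sum(w * w for w in a.values()))
    nb = math.sqrt(sum(w * w for w in b.values()))
    return dot / (na * nb) if na and nb else 0.0
```

Documents that share weighted terms score above unrelated ones, which is what the min_score threshold filters on.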

openai

OpenAI's text-embedding-3-small — high quality, requires an API key.

vector_search_plugin:
  provider: openai
  openai_api_key: sk-...
  openai_model: text-embedding-3-small

ollama

A local model served via the Ollama REST API. No API key needed.

# Pull the embedding model first
ollama pull nomic-embed-text

vector_search_plugin:
  provider: ollama
  ollama_url: http://localhost:11434
  ollama_model: nomic-embed-text
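Calling Ollama needs nothing beyond the standard library. The sketch below targets Ollama's `/api/embeddings` endpoint with its `model`/`prompt` request fields; that shape follows Ollama's published REST API at the time of writing and may change, and `build_payload`/`ollama_embed` are illustrative names:

```python
import json
import urllib.request

def build_payload(text, model="nomic-embed-text"):
    """Request body for Ollama's embeddings endpoint."""
    return {"model": model, "prompt": text}

def ollama_embed(text, url="http://localhost:11434",
                 model="nomic-embed-text"):
    """POST the text to Ollama and return the embedding vector."""
    req = urllib.request.Request(
        f"{url}/api/embeddings",
        data=json.dumps(build_payload(text, model)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["embedding"]  # list of floats
```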

How It Works

New message saved
on_message_saved
  └── embed content → store in vector index (persisted to DB)

Context being built for LLM
on_context_build
  ├── embed latest user query
  ├── cosine similarity against full index
  ├── select top-K hits (score ≥ min_score)
  └── inject into system message:
      • [Semantic | 84%] user prefers Python for scripting...
      • [Episodic | 67%] session: fixed import error in numpy...
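The context-build steps above — score, threshold, sort, format — can be sketched as follows. The `top_hits` function and the `(tier, text, vector)` index layout are illustrative assumptions, not the plugin's internals:

```python
import math

def top_hits(query_vec, index, top_k=5, min_score=0.15):
    """Rank index entries by cosine similarity to the query.

    index: list of (tier, text, vector) triples; vectors are
    dense lists of floats of equal length.
    """
    def cos(a, b):
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(y * y for y in b))
        return dot / (na * nb) if na and nb else 0.0

    scored = [(cos(query_vec, vec), tier, text)
              for tier, text, vec in index]
    scored = [s for s in scored if s[0] >= min_score]
    scored.sort(reverse=True)
    # Format like the injected bullet lines shown above.
    return [f"[{tier.title()} | {score:.0%}] {text}"
            for score, tier, text in scored[:top_k]]
```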

Bot Commands

When loaded in SimpleContext-Bot, this plugin auto-registers /semantic:

/semantic holiday memories
→ 1. [Semantic | ████░ 84%] user went to Bali last year...
→ 2. [Episodic | ███░░ 67%] talked about beach holidays...
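The five-block score bar in that output could be rendered like this (a hypothetical helper, not part of the plugin's API):

```python
def score_bar(score, width=5):
    """Render a 0.0-1.0 score as filled/empty blocks, e.g. ████░."""
    filled = round(score * width)
    return "█" * filled + "░" * (width - filled)

score_bar(0.84)  # → "████░"
score_bar(0.67)  # → "███░░"
```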

API

plugin = sc._plugins.get("vector_search_plugin")

# Index size for a user
plugin.index_size(user_id)   # → 142

# Clear index (rebuilds automatically as messages arrive)
plugin.clear_index(user_id)

# Reindex from existing memory (run after first install)
nodes = sc.context(user_id).get_all_active()
plugin.reindex(user_id, nodes)