Quick Start

Get up and running with Hindsight in 60 seconds.

Start the Server

pip install hindsight-all
export HINDSIGHT_API_LLM_PROVIDER=groq
export HINDSIGHT_API_LLM_API_KEY=gsk_xxxxxxxxxxxx

hindsight-api

API available at http://localhost:8888

LLM Provider

Hindsight requires an LLM provider with structured output support. Groq with gpt-oss-20b is recommended for fast, cost-effective inference; OpenAI and Ollama are also supported.

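To point Hindsight at a different provider, the same two environment variables should be all that changes. A minimal sketch for OpenAI, assuming "openai" is an accepted value for HINDSIGHT_API_LLM_PROVIDER (only "groq" is shown above); Ollama presumably follows the same pattern with your local setup:

export HINDSIGHT_API_LLM_PROVIDER=openai   # assumption: provider value is "openai"
export HINDSIGHT_API_LLM_API_KEY=sk-xxxxxxxxxxxx

hindsight-api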

Use the Client

pip install hindsight-client

from hindsight_client import Hindsight

client = Hindsight(base_url="http://localhost:8888")

# Retain: Store information
client.retain(bank_id="my-bank", content="Alice works at Google as a software engineer")

# Recall: Search memories
client.recall(bank_id="my-bank", query="What does Alice do?")

# Reflect: Generate personality-aware response
client.reflect(bank_id="my-bank", query="Tell me about Alice")
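
The calls above discard their return values. To inspect what comes back, capture and print the results; the exact response shape isn't shown on this page, so this minimal sketch just prints the raw objects:

results = client.recall(bank_id="my-bank", query="What does Alice do?")
print(results)  # raw search results for the query

answer = client.reflect(bank_id="my-bank", query="Tell me about Alice")
print(answer)   # raw personality-aware response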

What's Happening

Operation | What it does
--------- | ------------
Retain    | Content is processed, facts are extracted, and entities are identified and linked in a knowledge graph
Recall    | Four search strategies (semantic, keyword, graph, temporal) run in parallel to find relevant memories
Reflect   | Retrieved memories are used to generate a personality-aware response

Next Steps

  • Retain — Advanced options for storing memories
  • Recall — Search and retrieval strategies
  • Reflect — Personality-aware reasoning
  • Memory Banks — Configure personality and background
  • Server Deployment — Docker Compose, Helm, and production setup