PROJECT.md — mem0-python-server

Purpose

mem0-python-server is a lightweight FastAPI service that wraps the mem0 library to provide persistent memory over a REST API. It powers OpenClaw’s memory workflows by offering two distinct memory collections (conversational + knowledge), local reranking, and a stable HTTP contract for storage, recall, and cleanup.

Scope

This project focuses on:

  • Running a reliable HTTP API for memory storage and search
  • Keeping conversational vs. knowledge memories separate
  • Providing predictable, structured responses for OpenClaw and ingest pipelines
  • Supporting local reranking and metadata passthrough

Where the API is documented

The full endpoint reference (requests, responses, reranker contract, testing notes) lives in API.md. The README provides a lightweight overview and quick links to both PROJECT.md and API.md so the entry point stays concise.

Important files

  • mem0server.py — entrypoint, exposes app for uvicorn
  • mem0core/ — core FastAPI app, routes, prompts, reranker, storage
  • README.md — architecture overview, quickstart notes, and doc links
  • API.md — canonical endpoint reference for all routes
  • reset_memory.py — resets a Chroma collection and restarts the container
  • tests.sh — endpoint smoke tests

Operating assumptions

  • Chroma runs at 192.168.0.200:8001
  • Embedder runs via Ollama at 192.168.0.200:11434
  • Reranker at 192.168.0.200:5200 (optional; server falls back if offline)
  • OpenAI/Groq API key available in environment (see README)

Conventions

  • Conversational memory routes are mounted at /memories.
  • Knowledge memory routes are mounted at /knowledge.
  • A merged search is exposed at /search.
  • The API is stable and intended to remain backward-compatible for OpenClaw.