Show HN: I Replaced Vector Databases with Git for AI Memory (PoC)

DiffMem is a lightweight, git-based memory backend designed for AI agents and conversational systems. It uses Markdown files for human-readable storage, Git for tracking temporal evolution through differentials, and an in-memory BM25 index for fast, explainable retrieval.

This project is a proof-of-concept (PoC) exploring how version control systems can serve as a foundation for efficient, scalable memory in AI applications. At its core, DiffMem treats memory as a versioned repository: the "current state" of knowledge is stored in editable files, while historical changes are preserved in Git's commit graph.

This separation allows agents to query and search against a compact, up-to-date surface without the overhead of historical data, while enabling deep dives into evolution when needed.

Traditional memory systems for AI agents often rely on databases, vector stores, or graph structures. These work well at certain scales but can become bloated or inefficient when dealing with long-term, evolving personal knowledge.

DiffMem takes a different path by leveraging Git's strengths:

Current-State Focus - Memory files store only the "now" view of information (e.g., current relationships, facts, or timelines). This reduces the surface area for queries and searches, making operations faster and more token-efficient in LLM contexts. Historical states are not loaded by default - they live in Git's history, accessible on demand.
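As an illustration, a current-state entity file might look like this (a hypothetical layout - the actual schema is described in repo_guide.md):

```markdown
# Alice Chen

## Core
- Relationship: close friend, met 2019
- Location: Berlin (moved 2024)

## Timeline
- 2024-06: Started a new role at a robotics startup
```

Only this "now" view gets indexed; how Alice's location arrived at Berlin lives in the file's commit history.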

Differential Intelligence - Git diffs and logs provide a natural way to track how memories evolve. Agents can ask "How has this fact changed over time?" without scanning entire histories, pulling only relevant commits. This mirrors how human memory reconstructs events from cues, not full replays.
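Conceptually, the diff between two committed versions of a memory file is all an agent needs to answer "what changed?". A minimal sketch using Python's difflib as a stand-in for `git diff` (file contents are hypothetical):

```python
import difflib

# Two snapshots of the same memory file, as git would store them.
before = """# Alice
- Location: London
- Role: analyst
""".splitlines(keepends=True)

after = """# Alice
- Location: Berlin
- Role: analyst
""".splitlines(keepends=True)

# The unified diff is the "differential" an agent pulls for temporal
# questions: only the changed lines, never the full historical file.
diff = "".join(difflib.unified_diff(
    before, after, fromfile="2023/alice.md", tofile="2024/alice.md"))
print(diff)
```

In DiffMem the same information comes from Git's commit graph, so the agent can walk from cue ("Location") to the specific commits that touched it.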

Durability and Portability - Plaintext Markdown ensures memories are human-readable and tool-agnostic. Git's distributed nature means your data is backup-friendly and not locked into proprietary formats.

Efficiency for Agents - By separating "surface" (current files) from "depth" (git history), agents can be selective - load the now for quick responses, dive into diffs for analytical tasks. This keeps context windows lean while enabling rich temporal reasoning.

This approach shines for long-horizon AI systems where memories accumulate over years: it scales without sprawl, maintains auditability, and allows "smart forgetting" through pruning while preserving reconstructability.

Key Components

Writer Agent (writer_agent): Analyzes conversation transcripts, identifies/creates entities, stages updates in Git's working tree. Commits are explicit, ensuring atomic changes.
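The staging step can be sketched as a plain file edit in the working tree - here with a hypothetical `update_section` helper (the real writer_agent is LLM-driven, and the explicit `git commit` happens separately):

```python
import tempfile
from pathlib import Path

def update_section(path: Path, section: str, lines: list[str]) -> None:
    """Replace the body of one '## section' in a Markdown entity file.

    This mimics the writer agent's job: edit the current-state file in
    the working tree so the change shows up as a clean, reviewable diff
    before an explicit commit.
    """
    text = path.read_text() if path.exists() else f"## {section}\n"
    out, in_target = [], False
    for line in text.splitlines():
        if line.startswith("## "):
            in_target = line[3:].strip() == section
            out.append(line)
            if in_target:
                out.extend(lines)   # write the new body right after the header
            continue
        if not in_target:           # keep everything outside the target section
            out.append(line)
    if f"## {section}" not in text: # section missing entirely: append it
        out += [f"## {section}", *lines]
    path.write_text("\n".join(out) + "\n")

# Demo: stage an update to Alice's current state.
repo = Path(tempfile.mkdtemp())
f = repo / "alice.md"
f.write_text("# Alice\n## Core\n- Location: London\n")
update_section(f, "Core", ["- Location: Berlin (moved 2024)"])
print(f.read_text())
```

Because the old value is simply overwritten in the file, the "before" state survives only in Git history - which is exactly the separation of surface and depth described above.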

Context Manager (context_manager): Assembles query-relevant context at varying depths (basic: core blocks; wide: semantic search; deep: full files; temporal: with git history).
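The depth levels could be dispatched roughly like this (a sketch with stubbed retrieval layers; the names are illustrative, not the actual context_manager API):

```python
def assemble_context(query: str, depth: str = "basic") -> str:
    # Stubbed retrieval layers; in DiffMem these would read Markdown
    # files, run BM25 search, or consult git log for history.
    def core_blocks():   return f"[core blocks for: {query}]"
    def semantic_hits(): return f"[BM25 snippets for: {query}]"
    def full_files():    return f"[full entity files for: {query}]"
    def git_history():   return f"[relevant commits for: {query}]"

    parts = [core_blocks()]                 # "basic": core blocks only
    if depth in ("wide", "deep", "temporal"):
        parts.append(semantic_hits())       # "wide" adds search snippets
    if depth in ("deep", "temporal"):
        parts.append(full_files())          # "deep" adds whole files
    if depth == "temporal":
        parts.append(git_history())         # "temporal" adds git history
    return "\n".join(parts)
```

Each level is a superset of the previous one, so the agent pays for history only when the task actually calls for temporal reasoning.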

Searcher Agent (searcher_agent): LLM-orchestrated BM25 search - distills queries from conversations, retrieves snippets, synthesizes responses.
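The BM25 core of that search is simple enough to sketch from scratch in pure Python (for illustration only - DiffMem's index and the LLM orchestration around it are more involved):

```python
import math
from collections import Counter

def bm25_scores(query_terms, docs, k1=1.5, b=0.75):
    """Score each tokenized doc against the query with Okapi BM25."""
    N = len(docs)
    avgdl = sum(len(d) for d in docs) / N
    df = Counter()                       # document frequency per term
    for d in docs:
        for t in set(d):
            df[t] += 1
    scores = []
    for d in docs:
        tf = Counter(d)                  # term frequency in this doc
        s = 0.0
        for t in query_terms:
            if t not in tf:
                continue
            idf = math.log(1 + (N - df[t] + 0.5) / (df[t] + 0.5))
            s += idf * tf[t] * (k1 + 1) / (
                tf[t] + k1 * (1 - b + b * len(d) / avgdl))
        scores.append(s)
    return scores

# Demo corpus of tokenized memory snippets.
docs = [
    "alice moved to berlin in 2024".split(),
    "bob still lives in london".split(),
    "alice met bob in 2019".split(),
]
scores = bm25_scores(["alice", "berlin"], docs)
best = max(range(len(docs)), key=scores.__getitem__)
```

The searcher agent's value-add is upstream and downstream of this: distilling the query terms from a conversation, and synthesizing an answer from the top-scoring snippets.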

API Layer (api.py): Clean interface for read/write operations.

Example

The repo follows a structured layout (see repo_guide.md for details), with current states in Markdown files and their evolution in Git commits. Indexing is in-memory for speed and rebuilt on demand.
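Rebuilding such an in-memory index is just a walk over the current-state files - a sketch (the file layout here is hypothetical):

```python
import re
import tempfile
from pathlib import Path

def build_index(root: Path) -> dict[str, list[Path]]:
    """Map each lowercased token to the Markdown files containing it.

    Only current-state *.md files are indexed; history stays in git and
    is never loaded here, which keeps the index small and cheap to
    rebuild on demand.
    """
    index: dict[str, list[Path]] = {}
    for md in sorted(root.rglob("*.md")):
        for token in set(re.findall(r"[a-z0-9]+", md.read_text().lower())):
            index.setdefault(token, []).append(md)
    return index

# Tiny demo repo with two entity files.
repo = Path(tempfile.mkdtemp())
(repo / "alice.md").write_text("# Alice\n- Location: Berlin\n")
(repo / "bob.md").write_text("# Bob\n- Location: London\n")
index = build_index(repo)
print(sorted(p.name for p in index["location"]))
```

A real index would store term frequencies for BM25 rather than bare file lists, but the shape is the same: a dictionary rebuilt from the "now" surface whenever the working tree changes.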

Reduced Query Surface - DiffMem's git-centric design addresses a key challenge in AI memory systems: only current-state files are indexed and searched by default. This minimizes noise in BM25 results and keeps LLM contexts concise - crucial for staying within token limits.

Future Vision

DiffMem points to a future where AI memory is as versioned and collaborative as code. Imagine:

  • Agent-Driven Pruning: LLMs that "forget" low-strength memories by archiving to git branches, mimicking neural plasticity.
  • Collaborative Memories: Multi-agent systems sharing repos, with merge requests for "memory reconciliation."
  • Temporal Agents: Specialized models that query git logs to answer "how did I change?" - enabling self-reflective AI.
  • Hybrid Stores: Combine with vector embeddings for semantic depth, using git as the "diff layer" over embeddings.
  • Open-Source Ecosystem: Plugins for voice input, mobile sync, or integration with tools like Obsidian.

We're excited to see where the community takes this - perhaps toward distributed, privacy-first personal AIs. This is an R&D project from Growth Kinetics, a boutique data solutions agency specializing in AI enablement. We're exploring how differential memory can power next-gen agents. We'd love collaborations, PRs, or honest feedback to improve it.

Fork, experiment, and open a PR - issues and pull requests are welcome. Let's build the future of AI memory together.

License: MIT. © Growth Kinetics 2025