Show HN: I replaced vector databases with Git for AI memory (PoC)
Hacker News (score: 51)
The insight: Git already solved versioned document management. Why are we building complex vector stores when we could just use markdown files with Git's built-in diff/blame/history?
How it works:
- Memories are stored as markdown files in a Git repo
- Each conversation = one commit
- git diff shows how understanding evolves over time
- BM25 for search (no embeddings needed)
- LLMs generate search queries from conversation context

Example: ask "how has my project evolved?" and it uses git diff to show actual changes in understanding, not just similarity scores.
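For concreteness, here is a minimal sketch of that storage-and-retrieval loop using GitPython and rank-bm25 (the two libraries named in the stack below). The file paths, commit message, and whitespace tokenizer are illustrative assumptions, not DiffMem's actual layout or API.

```python
# Sketch only: store a memory as markdown, commit it, and search with BM25.
from pathlib import Path
from git import Repo                 # GitPython
from rank_bm25 import BM25Okapi      # rank-bm25

repo_dir = Path("memory-repo")
repo = Repo.init(repo_dir)

# 1. Store a memory as a markdown file and commit it (one conversation = one commit).
memory = repo_dir / "memories" / "project.md"
memory.parent.mkdir(parents=True, exist_ok=True)
memory.write_text("# Project\nSwitched retrieval from embeddings to BM25.\n")
repo.index.add([str(memory.relative_to(repo_dir))])
repo.index.commit("conversation 2024-05-01: retrieval discussion")

# 2. Build a BM25 index over all memory files (no embeddings involved).
docs = sorted((repo_dir / "memories").glob("*.md"))
corpus = [d.read_text() for d in docs]
bm25 = BM25Okapi([text.lower().split() for text in corpus])

# 3. Score a query against the memories. In DiffMem the query comes from an
#    LLM reading the conversation context; here it is hard-coded.
query = "why did retrieval change".lower().split()
for doc, score in zip(docs, bm25.get_scores(query)):
    print(f"{score:6.2f}  {doc.name}")
```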
This is very much a PoC - rough edges everywhere, not production ready. But it's been working surprisingly well for personal use. The entire index for a year of conversations fits in ~100MB RAM with sub-second retrieval.
The cool part: You can git checkout to any point in time and see exactly what the AI knew then. Perfect reproducibility, human-readable storage, and you can manually edit memories if needed.
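A hedged sketch of that time-travel idea with GitPython: read a memory file as it existed at an older commit (without touching the working tree) and diff it against HEAD. The path is an illustrative assumption carried over from the sketch above.

```python
# Sketch only: "what did the AI know at time T?" for one memory file.
from git import Repo

repo = Repo("memory-repo")
commits = list(repo.iter_commits(paths="memories/project.md"))
old = commits[-1]            # earliest commit that touched this memory

# Read the file as it existed at that commit (no working-tree checkout needed).
blob = old.tree / "memories/project.md"
print(blob.data_stream.read().decode())

# Show how the memory evolved since then, as plain `git diff` output.
print(repo.git.diff(old.hexsha, "HEAD", "--", "memories/project.md"))
```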
GitHub: https://github.com/Growth-Kinetics/DiffMem
Stack: Python, GitPython, rank-bm25, OpenRouter for LLM orchestration. MIT licensed.
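And a rough sketch of the "LLM generates the search query" step via OpenRouter's OpenAI-compatible chat completions endpoint. The model id and prompt are placeholders, not what DiffMem actually uses.

```python
# Sketch only: turn conversation context into a BM25 keyword query via OpenRouter.
import os
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key=os.environ["OPENROUTER_API_KEY"],
)

conversation_context = "User has been refactoring retrieval away from embeddings."
resp = client.chat.completions.create(
    model="openai/gpt-4o-mini",  # placeholder; any OpenRouter model id
    messages=[
        {"role": "system", "content": "Turn the conversation context into a short keyword query for BM25 search."},
        {"role": "user", "content": conversation_context},
    ],
)
query = resp.choices[0].message.content
print(query)  # feed this to the BM25 index from the first sketch
```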
Would love feedback on the approach. Is this crazy or clever? What am I missing that will bite me later?