Show HN: Object database for LLMs that persists across chats (MCP server)
Show HN (score: 6)
For example, if I find a great coding tutorial in chat, or tell Claude how much I ran yesterday, it forgets all of that when I close the chat. Even if I keep the chat history, I still have to scour through many messages to find the data I want. Ideally, Claude would remember all of this, and I'd be able to find it later with ease. That is what my team built.
It is a collaborative database you can add to any LLM that supports MCP (Claude Code, Claude Desktop, and Claude Pro for now; ChatGPT support is coming soon). You can add, update, and search for items in the database from inside a chat, and you can easily create your own object schemas. A web UI for using the database is generated automatically, with maps, charts, calendars, tables, lists, and other UI elements. You can also share or publish the database.
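The product's internals aren't shown here, but the chat-facing operations it describes (define a schema, then add, update, and search items) can be sketched with a toy in-memory store. Everything below is illustrative: the class, method names, and the `run` schema are hypothetical, not the actual MCP tool surface.

```python
from dataclasses import dataclass, field
from typing import Any


@dataclass
class ObjectStore:
    """Toy stand-in for the MCP-backed object database."""
    schemas: dict[str, set[str]] = field(default_factory=dict)
    items: dict[int, dict[str, Any]] = field(default_factory=dict)
    _next_id: int = 0

    def define_schema(self, name: str, fields: set[str]) -> None:
        # User-defined object schema: just a named set of allowed fields.
        self.schemas[name] = fields

    def add(self, schema: str, **values: Any) -> int:
        # Reject fields the schema doesn't know about.
        unknown = set(values) - self.schemas[schema]
        if unknown:
            raise ValueError(f"fields not in schema {schema!r}: {unknown}")
        self._next_id += 1
        self.items[self._next_id] = {"_schema": schema, **values}
        return self._next_id

    def update(self, item_id: int, **values: Any) -> None:
        self.items[item_id].update(values)

    def search(self, schema: str, **filters: Any) -> list[dict[str, Any]]:
        # Exact-match filtering; the real product presumably does richer search.
        return [
            item for item in self.items.values()
            if item["_schema"] == schema
            and all(item.get(k) == v for k, v in filters.items())
        ]
```

For instance, "tell it how much I ran yesterday" would map to defining a `run` schema once, then calling `add("run", date="2024-05-01", distance_km=5.0)` and later `search("run", date="2024-05-01")` from any chat.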
Over time, we want to make this database powerful enough to simplify our lives by letting LLMs replace many of the apps and software services we use daily.
More from Show HN
Show HN: Ragnerock, an AI data analysis tool
Hi HN, I'm Matt Mahowald, and together with my cofounder John, we're launching the public beta of Ragnerock today.

As a data scientist, you spend the majority of your time wrangling data. Even though you might have a set of techniques and tricks you like to use, how exactly you treat a particular source of data tends to be fairly bespoke, so you end up writing custom logic each time.

Ragnerock was born from the observation that modern LLMs can automate a lot of the grunt work involved in this process, while still allowing for fully customizable pipelines. What's more, by leveraging techniques like constrained decoding, it's possible to provide a unified query interface regardless of the data source, bridging raw sources like text and images with the structured data already living in your databases.

Ragnerock has four main components:

- A workflow designer that lets you build LLM-driven data processing and analysis pipelines
- A job orchestration layer that runs those workflows
- A query interface that lets you inspect the results of those workflows with plain SQL
- A notebook system that is 100% API-compatible with Jupyter and runs on your existing kernels, so you can easily pull data into your existing environments and analyses

Ragnerock also supports bring-your-own AI (OpenAI, Anthropic, and Google APIs), databases, and blob storage, so you can join with your existing datasets and have all outputs flow to your data lake. We're particularly excited about our web crawling feature, which lets you scrape websites and trigger workflows on updates: for example, you might point Ragnerock at your favorite blog and run a workflow to assess posts for topic and sentiment.

You can try it out at https://www.ragnerock.com; no credit card is needed, and the first 20 hours of compute are free.

It's an early-stage product, so we're especially interested in feedback. Happy to answer any questions; John and I will be around in the comments today.
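To make the "inspect workflow results with plain SQL" idea concrete, here is a minimal stdlib sketch assuming the crawling-plus-analysis example above lands its outputs in a relational table. The table and column names (`post_analysis`, `topic`, `sentiment`) are hypothetical, not Ragnerock's actual schema.

```python
import sqlite3

# Hypothetical workflow output: one row per crawled blog post, with
# LLM-extracted topic and sentiment scores written by the pipeline.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE post_analysis (url TEXT, topic TEXT, sentiment REAL)"
)
conn.executemany(
    "INSERT INTO post_analysis VALUES (?, ?, ?)",
    [
        ("https://blog.example/a", "databases", 0.8),
        ("https://blog.example/b", "llms", -0.2),
        ("https://blog.example/c", "llms", 0.6),
    ],
)

# Plain SQL over the results: average sentiment per topic.
rows = conn.execute(
    "SELECT topic, AVG(sentiment) FROM post_analysis "
    "GROUP BY topic ORDER BY topic"
).fetchall()
```

The appeal of this design is that the LLM-driven part (turning raw text and images into rows) stays separate from the analysis part, which remains ordinary SQL you can join against existing datasets.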