A Simple and Universal Swarm Intelligence Engine, Predicting Anything.
Updated Mar 20, 2026 · Python
Dolt – Git for Data
An extremely fast, scalable memory engine and app. The Memory API for the AI era.
Memory for 24/7 proactive agents like openclaw (moltbot, clawdbot).
SQL Native Memory Layer for LLMs, AI Agents & Multi-Agent Systems
A lightweight, lightning-fast, in-process vector database
AI memory OS for LLM and agent systems (moltbot, clawdbot, openclaw), enabling persistent skill memory for cross-task skill reuse and evolution.
A memory OS that makes your OpenClaw agents more personal while saving tokens.
🧠 Make your agents learn from experience. Now available as a hosted solution at kayba.ai
Open-source persistent memory for AI agent pipelines (LangGraph, CrewAI, AutoGen) and Claude. REST API + knowledge graph + autonomous consolidation.
The context development platform. Store, enrich, and retrieve structured knowledge with graph-native infrastructure, semantic retrieval, and portable context cores.
Memory library for building stateful agents
A Markdown-first memory system, a standalone library for any AI agent. Inspired by OpenClaw.
Semantica 🧠 — A framework for building semantic layers, context graphs, and decision intelligence systems with explainability and provenance.
Awesome AI Memory | LLM Memory: a curated, continuously updated knowledge base on AI memory for LLMs and agents, covering long-term memory, reasoning, retrieval, and memory-native system design. It systematically collects cutting-edge research, engineering frameworks, system designs, evaluation benchmarks, and real-world application practices related to LLM memory and agent memory.
AgentChat is an LLM-based agent communication platform with built-in default agents and support for user-defined agents. Through multi-turn dialogue and task collaboration, agents can understand and help complete complex tasks. The project integrates LangChain, Function Call, the MCP protocol, RAG, Memory, Milvus, and ElasticSearch for efficient knowledge retrieval and tool calling, with a high-performance backend built on FastAPI.
The Enterprise-Ready OpenClaw. Your Proactive AI Assistant That Remembers Everything
Open-source cross-agent memory layer for coding agents via MCP. Compatible with Cursor, Claude Code, Codex, Windsurf, Gemini CLI, GitHub Copilot, Kiro, OpenCode, Antigravity, and Trae.
Curated systems, benchmarks, papers, and more on memory for LLMs/MLLMs: long-term context, retrieval, and reasoning.
Give your AI agent a brain that remembers. Local memory system for Claude Code — 100% private, GPU-accelerated, zero cloud dependency.