memU-server: Local Backend Service for AI Memory System

memU-server is the backend management service for MemU. It provides API endpoints, data storage, and management capabilities, and integrates deeply with the core memU framework. It powers the memU-ui frontend with reliable data services, ensuring efficient reading, writing, and maintenance of Agent memories. memU-server can be deployed locally or in private environments and supports quick startup and configuration via Docker, enabling developers to manage the AI memory system in a secure environment.

memU-server backend service overview

Key Features

Quick Deployment

  • Provides API endpoints compatible with memU-ui, ensuring stable and reliable data services
  • Launch backend service and database with a single command
  • Docker image provided
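As an illustration of what single-command startup could look like, here is a minimal Compose sketch. The image names, ports, and environment variables below are placeholders, not memU-server's published configuration; check the project's repository for the actual values.

```yaml
# docker-compose.yml (illustrative; image names, ports, and credentials are placeholders)
services:
  memu-server:
    image: memu/memu-server:latest   # placeholder image tag
    ports:
      - "8000:8000"
    environment:
      DATABASE_URL: postgres://memu:memu@db:5432/memu
    depends_on:
      - db
  db:
    image: postgres:16
    environment:
      POSTGRES_USER: memu
      POSTGRES_PASSWORD: memu
      POSTGRES_DB: memu
```

With a file like this in place, `docker compose up -d` launches the backend service and its database together.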

Comprehensive Memory Management

(Some features planned for future releases)

Memory Data Management
  • Create, read, and delete Memory Submissions
  • Full create, read, update, and delete (CRUD) operations on Memorize results
  • Query and track Retrieve records
  • Track LLM token usage for transparent, controllable costs
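To make the data model above concrete, here is a self-contained in-memory sketch of the three kinds of records the backend manages: memorize results with full CRUD, retrieve records that are logged for tracking, and a running LLM token count. All class and method names are illustrative; they are not memU-server's actual API.

```python
import itertools

class MemoryStore:
    """Illustrative in-memory model of the records memU-server manages:
    memorize results (CRUD), retrieve records, and LLM token usage."""

    def __init__(self):
        self._ids = itertools.count(1)
        self.memories = {}        # id -> memorize result
        self.retrieve_log = []    # every query, so retrieve records can be tracked
        self.tokens_used = 0      # running LLM token total

    def create(self, content, tokens=0):
        mem_id = next(self._ids)
        self.memories[mem_id] = content
        self.tokens_used += tokens  # account for tokens spent producing this memory
        return mem_id

    def read(self, mem_id):
        return self.memories.get(mem_id)

    def update(self, mem_id, content):
        if mem_id in self.memories:
            self.memories[mem_id] = content
            return True
        return False

    def delete(self, mem_id):
        return self.memories.pop(mem_id, None) is not None

    def retrieve(self, query):
        # Log the query first, then return matching memories (naive substring match).
        self.retrieve_log.append(query)
        return [m for m in self.memories.values() if query in m]
```

A real deployment would back these operations with a database and expose them over HTTP, but the record types and their lifecycle are the same.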
User and Permission Management
  • User login and registration system
  • Role-based access control: Developer / Admin / Regular User
  • Backend-managed access scopes and permissions for secure operations
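The role-based access control described above can be sketched as a mapping from roles to permitted actions. The roles come from the list above; the permission names and the `can` helper are hypothetical, chosen only to illustrate the idea.

```python
from enum import Enum

class Role(Enum):
    REGULAR = "regular"
    DEVELOPER = "developer"
    ADMIN = "admin"

# Hypothetical permission map: action names are illustrative,
# not memU-server's actual permission scheme.
PERMISSIONS = {
    Role.REGULAR:   {"memory:read"},
    Role.DEVELOPER: {"memory:read", "memory:write", "retrieve:query"},
    Role.ADMIN:     {"memory:read", "memory:write", "retrieve:query", "user:manage"},
}

def can(role: Role, action: str) -> bool:
    """Return True if the role's access scope includes the action."""
    return action in PERMISSIONS[role]
```

Under this scheme a Developer can write memories but cannot manage users, while an Admin can do both; the backend enforces these scopes on every request.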

MemU Repositories: What They Do and How to Use Them

memU — Core algorithm engine
  Key features:
  • Core algorithms
  • Memory extraction
  • Multi-strategy retrieval
  Best for: Developers/teams who want to embed AI memory algorithms into their product
  Usage: Core algorithms can be used standalone or integrated into the server

memU-server — Memory data backend service
  Key features:
  • Memory CRUD
  • Retrieve record tracking
  • Token usage & billing tracking
  • User system
  • RBAC permission system
  • Security boundary controls
  Best for: Teams that want to self-host a memory backend (internal tools, research, enterprise setups)
  Usage: Self-hostable; works together with memU

memU-ui — Front-end dashboard
  Key features:
  • Front-end interface
  • Visual memory viewer
  • User management UI
  • Data retrieval UI
  • Easy self-hosting experience
  Best for: Developers/teams looking for a ready-to-use memory console
  Usage: Self-hostable; integrates with memU

memU, memU-server, and memU-ui together form a flexible memory ecosystem for LLMs and AI agents.

Join the Growing memU Open-Source Community

For more information, please contact info@nevamind.ai.

GitHub Issues
Report bugs, request features, and track development. Submit an issue.
Discord
Get real-time support, chat with the community, and stay updated. Join us.
X (Twitter)
Follow for updates, AI insights, and key announcements. Follow us.

Start memU-server in minutes

Deploy a secure backend for memory APIs with Docker, and connect memU-ui for real-time operations.