@@ -7,24 +7,15 @@ A Redis-powered memory server built for AI agents and applications. It manages b
 - **Working Memory**
 
   - Session-scoped storage for messages, structured memories, context, and metadata
-  - Automatically summarizes conversations when they exceed a client-configured window size
+  - Automatically summarizes conversations when they exceed a client-configured (or server-managed) window size
   - Supports all major OpenAI and Anthropic models
   - Automatic (background) promotion of structured memories to long-term storage
 
 - **Long-Term Memory**
 
   - Persistent storage for memories across sessions
-  - **Pluggable Vector Store Backends** - Support for multiple vector databases through LangChain VectorStore interface:
-    - **Redis** (default) - RedisStack with RediSearch
-    - **Chroma** - Open-source vector database
-    - **Pinecone** - Managed vector database service
-    - **Weaviate** - Open-source vector search engine
-    - **Qdrant** - Vector similarity search engine
-    - **Milvus** - Cloud-native vector database
-    - **PostgreSQL/PGVector** - PostgreSQL with vector extensions
-    - **LanceDB** - Embedded vector database
-    - **OpenSearch** - Open-source search and analytics suite
-  - Semantic search to retrieve memories with advanced filtering system
+  - Pluggable Vector Store Backends - Support for any LangChain VectorStore (defaults to Redis)
+  - Semantic search to retrieve memories with advanced filtering
   - Filter by session, user ID, namespace, topics, entities, timestamps, and more
   - Supports both exact match and semantic similarity search
   - Automatic topic modeling for stored memories with BERTopic or configured LLM
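The "Pluggable Vector Store Backends" and filtering bullets describe long-term memory talking to an interchangeable vector store and filtering results by metadata such as user ID and namespace. A minimal sketch of that shape is below; the names (`VectorStoreLike`, `InMemoryStore`) are hypothetical stand-ins invented for illustration, not the project's API, and LangChain's real `VectorStore` interface is considerably richer:

```python
import math
from dataclasses import dataclass, field
from typing import Protocol


class VectorStoreLike(Protocol):
    """Minimal protocol a pluggable backend would satisfy (hypothetical)."""

    def add(self, text: str, vector: list[float], metadata: dict) -> None: ...
    def search(self, vector: list[float], k: int, filters: dict) -> list[str]: ...


@dataclass
class InMemoryStore:
    """Toy backend: exact-match metadata filters + cosine-similarity ranking."""

    _items: list[tuple[str, list[float], dict]] = field(default_factory=list)

    def add(self, text: str, vector: list[float], metadata: dict) -> None:
        self._items.append((text, vector, metadata))

    def search(self, vector: list[float], k: int, filters: dict) -> list[str]:
        def cosine(a: list[float], b: list[float]) -> float:
            dot = sum(x * y for x, y in zip(a, b))
            na = math.sqrt(sum(x * x for x in a))
            nb = math.sqrt(sum(y * y for y in b))
            return dot / (na * nb)

        # Keep only memories whose metadata matches every filter
        # (session, user_id, namespace, ...), then rank by similarity.
        hits = [(t, v) for t, v, m in self._items
                if all(m.get(key) == val for key, val in filters.items())]
        hits.sort(key=lambda tv: cosine(tv[1], vector), reverse=True)
        return [t for t, _ in hits[:k]]


# Any object satisfying the protocol can back long-term memory.
store: VectorStoreLike = InMemoryStore()
store.add("user prefers dark mode", [1.0, 0.0], {"user_id": "u1", "namespace": "prefs"})
store.add("user lives in Oslo", [0.0, 1.0], {"user_id": "u1", "namespace": "profile"})
store.add("other user likes tea", [1.0, 0.1], {"user_id": "u2", "namespace": "prefs"})

print(store.search([1.0, 0.0], k=1, filters={"user_id": "u1"}))
# → ['user prefers dark mode']
```

Because the store is addressed only through the protocol, swapping the toy backend for Redis (or any other LangChain `VectorStore`) changes configuration, not calling code, which is the point of the diff's consolidation to "any LangChain VectorStore (defaults to Redis)".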