
Commit 849cadf

Add neptune example notebook and documentation (mem0ai#3224)
Signed-off-by: Andrew Carbonetto <[email protected]>
1 parent 2638487 commit 849cadf

File tree

7 files changed: +218 -342 lines changed

docs/examples/aws_example.mdx

Lines changed: 24 additions & 12 deletions
````diff
@@ -1,17 +1,17 @@
 ---
-title: AWS Bedrock and AOSS
+title: "Amazon Stack: AWS Bedrock, AOSS, and Neptune Analytics"
 ---
 
 <Snippet file="blank-notif.mdx" />
 
-This example demonstrates how to configure and use the `mem0ai` SDK with **AWS Bedrock** and **OpenSearch Service (AOSS)** for persistent memory capabilities in Python.
+This example demonstrates how to configure and use the `mem0ai` SDK with **AWS Bedrock**, **OpenSearch Service (AOSS)**, and **AWS Neptune Analytics** for persistent memory capabilities in Python.
 
 ## Installation
 
-Install the required dependencies:
+Install the required dependencies for the Amazon data stack, including **boto3**, **opensearch-py**, and **langchain-aws**:
 
 ```bash
-pip install mem0ai boto3 opensearch-py
+pip install "mem0ai[graph,extras]"
 ```
 
 ## Environment Setup
````
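As a quick sanity check after running the new install command, the optional Amazon-stack packages named in the hunk above can be probed without importing them. This is a hypothetical snippet, not part of the diff; the expected package list is taken from the prose (**boto3**, **opensearch-py**, **langchain-aws**) and may differ from what the `graph,extras` extras actually pin.

```python
from importlib.util import find_spec

# Import names the "mem0ai[graph,extras]" install is expected to provide,
# per the install section above (assumed list; adjust to your lockfile).
EXPECTED = ["boto3", "opensearchpy", "langchain_aws"]

def missing_packages(names):
    """Return the subset of `names` that cannot be resolved to a module."""
    return [n for n in names if find_spec(n) is None]

if __name__ == "__main__":
    missing = missing_packages(EXPECTED)
    if missing:
        print(f"Missing optional dependencies: {missing}")
    else:
        print("All Amazon-stack dependencies are installed.")
```

`find_spec` only locates the module; it avoids paying the import cost (boto3 is slow to import) just to check availability.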
````diff
@@ -34,11 +34,15 @@ print(os.environ['AWS_SECRET_ACCESS_KEY'])
 
 ## Configuration and Usage
 
-This sets up Mem0 with AWS Bedrock for embeddings and LLM, and OpenSearch as the vector store.
+This sets up Mem0 with:
+- [AWS Bedrock for LLM](https://docs.mem0.ai/components/llms/models/aws_bedrock)
+- [AWS Bedrock for embeddings](https://docs.mem0.ai/components/embedders/models/aws_bedrock#aws-bedrock)
+- [OpenSearch as the vector store](https://docs.mem0.ai/components/vectordbs/dbs/opensearch)
+- [Neptune Analytics as your graph store](https://docs.mem0.ai/open-source/graph_memory/overview#initialize-neptune-analytics)
 
 ```python
 import boto3
-from opensearchpy import OpenSearch, RequestsHttpConnection, AWSV4SignerAuth
+from opensearchpy import RequestsHttpConnection, AWSV4SignerAuth
 from mem0.memory.main import Memory
 
 region = 'us-west-2'
````
````diff
@@ -56,7 +60,7 @@ config = {
     "llm": {
         "provider": "aws_bedrock",
         "config": {
-            "model": "anthropic.claude-3-5-haiku-20241022-v1:0",
+            "model": "us.anthropic.claude-3-7-sonnet-20250219-v1:0",
             "temperature": 0.1,
             "max_tokens": 2000
         }
````
````diff
@@ -68,21 +72,29 @@ config = {
             "host": "your-opensearch-domain.us-west-2.es.amazonaws.com",
             "port": 443,
             "http_auth": auth,
-            "embedding_model_dims": 1024,
             "connection_class": RequestsHttpConnection,
             "pool_maxsize": 20,
             "use_ssl": True,
-            "verify_certs": True
+            "verify_certs": True,
+            "embedding_model_dims": 1024,
         }
-    }
+    },
+    "graph_store": {
+        "provider": "neptune",
+        "config": {
+            "endpoint": "neptune-graph://my-graph-identifier",
+        },
+    },
 }
 
-# Initialize memory system
+# Initialize the memory system
 m = Memory.from_config(config)
 ```
 
 ## Usage
 
+See the [notebook example](https://github.com/mem0ai/mem0/blob/main/examples/graph-db-demo/neptune-example.ipynb).
+
 #### Add a memory:
 
 ```python
````
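Read across the hunks above, the committed config assembles into roughly the following sketch. The `"opensearch"` provider key and the omitted auth/connection fields are assumptions (those lines fall outside the visible hunks), and the host and graph identifier are the diff's own placeholders, so the dict can be inspected offline:

```python
# Sketch assembled from the diff hunks above. http_auth and connection_class
# are omitted here so no AWS session is needed; the "opensearch" provider key
# is an assumption, since that line is not visible in this diff.
config = {
    "llm": {
        "provider": "aws_bedrock",
        "config": {
            "model": "us.anthropic.claude-3-7-sonnet-20250219-v1:0",
            "temperature": 0.1,
            "max_tokens": 2000,
        },
    },
    "vector_store": {
        "provider": "opensearch",  # assumed; elided from the visible hunks
        "config": {
            "host": "your-opensearch-domain.us-west-2.es.amazonaws.com",
            "port": 443,
            "pool_maxsize": 20,
            "use_ssl": True,
            "verify_certs": True,
            "embedding_model_dims": 1024,
        },
    },
    "graph_store": {
        "provider": "neptune",
        "config": {"endpoint": "neptune-graph://my-graph-identifier"},
    },
}
```

With real credentials, `http_auth`, and `connection_class` restored, this is the dict `Memory.from_config(config)` consumes; the elided usage sections then call `m.add(...)` and `m.get(memory_id)`, as the hunk headers show.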
````diff
@@ -117,4 +129,4 @@ memory = m.get(memory_id)
 
 ## Conclusion
 
-With Mem0 and AWS services like Bedrock and OpenSearch, you can build intelligent AI companions that remember, adapt, and personalize their responses over time. This makes them ideal for long-term assistants, tutors, or support bots with persistent memory and natural conversation abilities.
+With Mem0 and AWS services like Bedrock, OpenSearch, and Neptune Analytics, you can build intelligent AI companions that remember, adapt, and personalize their responses over time. This makes them ideal for long-term assistants, tutors, or support bots with persistent memory and natural conversation abilities.
````

docs/open-source/graph_memory/overview.mdx

Lines changed: 3 additions & 5 deletions
````diff
@@ -236,6 +236,8 @@ m = Memory.from_config(config_dict=config)
 
 Mem0 now supports Amazon Neptune Analytics as a graph store provider. This integration allows you to use Neptune Analytics for storing and querying graph-based memories.
 
+You can use Neptune Analytics as part of an Amazon tech stack; see [Setup AWS Bedrock, AOSS, and Neptune](https://docs.mem0.ai/examples/aws_example#aws-bedrock-and-aoss).
+
 #### Instance Setup
 
 Create an Amazon Neptune Analytics instance in your AWS account following the [AWS documentation](https://docs.aws.amazon.com/neptune-analytics/latest/userguide/get-started.html).
````
````diff
@@ -266,12 +268,8 @@ The Neptune memory store uses AWS LangChain Python API to connect to Neptune ins
 ```python Python
 from mem0 import Memory
 
-# This example must connect to a neptune-graph instance with 1536 vector dimensions specified.
+# The provided neptune-graph instance must have the same vector dimensions as the embedder provider.
 config = {
-    "embedder": {
-        "provider": "openai",
-        "config": {"model": "text-embedding-3-large", "embedding_dims": 1536},
-    },
     "graph_store": {
         "provider": "neptune",
         "config": {
````
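The hunk above removes the hard-coded 1536-dimension OpenAI embedder and instead requires the neptune-graph vector index to match whatever embedder you configure. A hypothetical helper (not part of mem0) makes that invariant explicit, so a mismatch fails fast at startup rather than at query time:

```python
def check_dims(embedder_dims: int, graph_index_dims: int) -> None:
    """Raise early if the embedder output size and the neptune-graph
    vector index size disagree (the graph's dimension is fixed at creation)."""
    if embedder_dims != graph_index_dims:
        raise ValueError(
            f"Embedder produces {embedder_dims}-dim vectors but the "
            f"neptune-graph index was created with {graph_index_dims} dims"
        )

# Example: a 1024-dim embedder against a graph created with 1024 dims passes.
check_dims(1024, 1024)
```

The graph's vector dimension is chosen when the Neptune Analytics instance is created, so if you switch embedders you must recreate the graph with matching dimensions.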
