36 changes: 24 additions & 12 deletions docs/examples/aws_example.mdx
@@ -1,17 +1,17 @@
---
title: AWS Bedrock and AOSS
title: "Amazon Stack: AWS Bedrock, AOSS, and Neptune Analytics"
---

<Snippet file="security-compliance.mdx" />

This example demonstrates how to configure and use the `mem0ai` SDK with **AWS Bedrock** and **OpenSearch Service (AOSS)** for persistent memory capabilities in Python.
This example demonstrates how to configure and use the `mem0ai` SDK with **AWS Bedrock**, **OpenSearch Service (AOSS)**, and **AWS Neptune Analytics** for persistent memory capabilities in Python.

## Installation

Install the required dependencies:
Install the required dependencies for the Amazon data stack, including **boto3**, **opensearch-py**, and **langchain-aws**:

```bash
pip install mem0ai boto3 opensearch-py
pip install "mem0ai[graph,extras]"
```

## Environment Setup
@@ -34,11 +34,15 @@ print(os.environ['AWS_SECRET_ACCESS_KEY'])

## Configuration and Usage

This sets up Mem0 with AWS Bedrock for embeddings and LLM, and OpenSearch as the vector store.
This sets up Mem0 with:
- [AWS Bedrock for LLM](https://docs.mem0.ai/components/llms/models/aws_bedrock)
- [AWS Bedrock for embeddings](https://docs.mem0.ai/components/embedders/models/aws_bedrock#aws-bedrock)
- [OpenSearch as the vector store](https://docs.mem0.ai/components/vectordbs/dbs/opensearch)
- [Neptune Analytics as your graph store](https://docs.mem0.ai/open-source/graph_memory/overview#initialize-neptune-analytics)

```python
import boto3
from opensearchpy import OpenSearch, RequestsHttpConnection, AWSV4SignerAuth
from opensearchpy import RequestsHttpConnection, AWSV4SignerAuth
from mem0.memory.main import Memory

region = 'us-west-2'
@@ -56,7 +60,7 @@ config = {
"llm": {
"provider": "aws_bedrock",
"config": {
"model": "anthropic.claude-3-5-haiku-20241022-v1:0",
"model": "us.anthropic.claude-3-7-sonnet-20250219-v1:0",
"temperature": 0.1,
"max_tokens": 2000
}
@@ -68,21 +72,29 @@ config = {
"host": "your-opensearch-domain.us-west-2.es.amazonaws.com",
"port": 443,
"http_auth": auth,
"embedding_model_dims": 1024,
"connection_class": RequestsHttpConnection,
"pool_maxsize": 20,
"use_ssl": True,
"verify_certs": True
"verify_certs": True,
"embedding_model_dims": 1024,
}
}
},
"graph_store": {
"provider": "neptune",
"config": {
"endpoint": f"neptune-graph://my-graph-identifier",
},
},
}

# Initialize memory system
# Initialize the memory system
m = Memory.from_config(config)
```
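A mismatch between the vector store's `embedding_model_dims` and the embedder's actual output dimensionality typically surfaces only when the first memory is inserted. As a sketch, you could fail fast with a small up-front check (the helper below is illustrative, not part of the `mem0ai` SDK; the 1024 value assumes a Titan-v2-style embedder with default dimensions):

```python
# Illustrative helper (not part of mem0ai): fail fast if the vector store's
# declared embedding_model_dims disagrees with the embedder's output size.
def check_embedding_dims(config: dict, embedder_dims: int) -> None:
    declared = config["vector_store"]["config"]["embedding_model_dims"]
    if declared != embedder_dims:
        raise ValueError(
            f"vector_store declares {declared}-dim vectors, "
            f"but the embedder produces {embedder_dims}-dim vectors"
        )

# Excerpt of the config above; 1024 assumes the embedder's default output size.
config = {"vector_store": {"config": {"embedding_model_dims": 1024}}}
check_embedding_dims(config, embedder_dims=1024)  # passes silently
```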

## Usage

See the [notebook example](https://github.com/mem0ai/mem0/blob/main/examples/graph-db-demo/neptune-example.ipynb) for an end-to-end walkthrough.

#### Add a memory:

```python
@@ -117,4 +129,4 @@ memory = m.get(memory_id)

## Conclusion

With Mem0 and AWS services like Bedrock and OpenSearch, you can build intelligent AI companions that remember, adapt, and personalize their responses over time. This makes them ideal for long-term assistants, tutors, or support bots with persistent memory and natural conversation abilities.
With Mem0 and AWS services like Bedrock, OpenSearch, and Neptune Analytics, you can build intelligent AI companions that remember, adapt, and personalize their responses over time. This makes them ideal for long-term assistants, tutors, or support bots with persistent memory and natural conversation abilities.
8 changes: 3 additions & 5 deletions docs/open-source/graph_memory/overview.mdx
@@ -236,6 +238,8 @@ m = Memory.from_config(config_dict=config)

Mem0 now supports Amazon Neptune Analytics as a graph store provider. This integration allows you to use Neptune Analytics for storing and querying graph-based memories.

You can use Neptune Analytics as part of an Amazon tech stack; see [Set up AWS Bedrock, AOSS, and Neptune](https://docs.mem0.ai/examples/aws_example#aws-bedrock-and-aoss).

#### Instance Setup

Create an Amazon Neptune Analytics instance in your AWS account following the [AWS documentation](https://docs.aws.amazon.com/neptune-analytics/latest/userguide/get-started.html).
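The `graph_store` endpoint is simply the Neptune Analytics graph identifier prefixed with the `neptune-graph://` scheme. A tiny illustrative helper (not part of the SDK) for building it from a graph ID:

```python
# Illustrative helper (not part of mem0ai): build the endpoint string that
# the Neptune graph_store config expects from a Neptune Analytics graph ID.
def neptune_endpoint(graph_identifier: str) -> str:
    return f"neptune-graph://{graph_identifier}"

endpoint = neptune_endpoint("my-graph-identifier")
print(endpoint)  # neptune-graph://my-graph-identifier
```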
@@ -266,12 +268,8 @@ The Neptune memory store uses AWS LangChain Python API to connect to Neptune ins
```python Python
from mem0 import Memory

# This example must connect to a neptune-graph instance with 1536 vector dimensions specified.
# The provided neptune-graph instance must use the same vector dimensions as the embedder provider.
config = {
"embedder": {
"provider": "openai",
"config": {"model": "text-embedding-3-large", "embedding_dims": 1536},
},
"graph_store": {
"provider": "neptune",
"config": {