14 changes: 13 additions & 1 deletion labs/module4/README.md
# Module 4: Advanced Agent Features

## Overview
This module explores advanced agent capabilities through Model Context Protocol (MCP), multi-agent systems, and secure code execution. You'll learn how to combine different agent frameworks, create specialized tool servers, build interoperable AI systems that can collaborate effectively, and leverage AWS Bedrock AgentCore for secure code interpretation.

## Learning Objectives

* Mix and match different agent frameworks (PydanticAI, LangChain, etc.)
* Build collaborative multi-agent systems
* Integrate third-party MCP servers into your applications
* Use AWS Bedrock AgentCore Code Interpreter for secure code execution
* Perform data processing and file operations in sandboxed environments
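
The "specialized tool servers" mentioned above can be sketched with the official MCP Python SDK. This is a minimal illustration, not the module's own code; it assumes the SDK is installed as the `mcp` package and exposes the `FastMCP` helper — verify against the SDK documentation for your version.

```python
# Sketch of a tiny MCP tool server (assumption: official Python SDK installed
# as `mcp`, exposing FastMCP). The tool logic itself is plain Python.

def add(a: int, b: int) -> int:
    """Add two integers -- the function any MCP client can call as a tool."""
    return a + b

def build_server():
    # Imported lazily so the tool logic stays usable without the SDK installed.
    from mcp.server.fastmcp import FastMCP

    server = FastMCP("calc-tools")
    server.tool()(add)  # register `add` as an MCP tool
    return server

if __name__ == "__main__":
    build_server().run()  # serves over stdio by default
```

Because the tool is an ordinary function, any MCP-aware framework (PydanticAI, LangChain, etc.) can invoke it, which is what makes mixing frameworks practical.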

## Prerequisites

- AWS Bedrock access
- Basic understanding of async programming

## Notebooks

### 4_agentcore_tool_code_interpreter.ipynb
Learn how to use AWS Bedrock AgentCore Code Interpreter for secure code execution:
- Set up and configure AgentCore Code Interpreter with boto3
- Execute Python and JavaScript code in secure sandboxes
- Perform statistical calculations and data analysis
- Handle file operations (read, write, process)
- Process large files: up to 100 MB inline, or up to 5 GB via S3
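
The session flow the notebook walks through can be sketched with boto3. Treat everything here as an assumption to verify against the AWS documentation for your boto3 version: the client name `"bedrock-agentcore"`, the interpreter identifier `"aws.codeinterpreter.v1"`, the session/invoke call names, and the streamed response shape are all illustrative, and the call requires valid AWS credentials.

```python
# Hedged sketch of an AgentCore Code Interpreter round trip (client name,
# identifier, call names, and response shape are assumptions -- verify
# against the boto3 / AgentCore docs before relying on them).

def build_execute_args(code: str, language: str = "python") -> dict:
    """Build the arguments payload for an executeCode invocation."""
    if language not in {"python", "javascript"}:
        raise ValueError(f"unsupported language: {language}")
    return {"language": language, "code": code}

def run_in_sandbox(code: str, language: str = "python",
                   region: str = "us-west-2") -> str:
    """Start a sandbox session, run `code`, and return its text output."""
    import boto3  # imported lazily; the call needs AWS credentials

    client = boto3.client("bedrock-agentcore", region_name=region)
    session_id = client.start_code_interpreter_session(
        codeInterpreterIdentifier="aws.codeinterpreter.v1",  # assumed id
        name="readme-demo",
        sessionTimeoutSeconds=900,
    )["sessionId"]
    try:
        response = client.invoke_code_interpreter(
            codeInterpreterIdentifier="aws.codeinterpreter.v1",
            sessionId=session_id,
            name="executeCode",
            arguments=build_execute_args(code, language),
        )
        # Collect text content from the streamed result events.
        parts = []
        for event in response["stream"]:
            for item in event.get("result", {}).get("content", []):
                if item.get("type") == "text":
                    parts.append(item["text"])
        return "".join(parts)
    finally:
        # Always tear the session down so the sandbox is not left running.
        client.stop_code_interpreter_session(
            codeInterpreterIdentifier="aws.codeinterpreter.v1",
            sessionId=session_id,
        )
```

The notebook covers the same steps interactively; the sketch just makes the session lifecycle (start, invoke, stop) explicit.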

## Installation

```bash
```