Merged
2 changes: 1 addition & 1 deletion .doc_gen/metadata/medical-imaging_metadata.yaml
Original file line number Diff line number Diff line change
@@ -1136,7 +1136,7 @@ medical-imaging_Scenario_ImageSetsAndFrames:
title_abbrev: Get started with image sets and image frames
synopsis: >
import DICOM files and download image frames in &AHI;.</para>
-<para>The implementation is structured as a workflow command-line
+<para>The implementation is structured as a command-line
application.
synopsis_list:
- Set up resources for a DICOM import.
2 changes: 1 addition & 1 deletion .doc_gen/metadata/s3_metadata.yaml
@@ -3447,7 +3447,7 @@ s3_Scenario_ObjectLock:
sdkguide:
excerpts:
- description: |
-Entrypoint for the workflow (<noloc>index.js</noloc>). This orchestrates all of the steps.
+Entrypoint for the scenario (<noloc>index.js</noloc>). This orchestrates all of the steps.
Visit GitHub to see the implementation details for Scenario, ScenarioInput, ScenarioOutput, and ScenarioAction.
snippet_files:
- javascriptv3/example_code/s3/scenarios/object-locking/index.js
10 changes: 5 additions & 5 deletions .doc_gen/metadata/scheduler_metadata.yaml
@@ -156,9 +156,9 @@ scheduler_DeleteScheduleGroup:
- python.example_code.scheduler.DeleteScheduleGroup
services:
scheduler: {DeleteScheduleGroup}
-scheduler_ScheduledEventsWorkflow:
-title: A complete &EVS; Scheduled Events workflow using an &AWS; SDK
-title_abbrev: Scheduled Events workflow
+scheduler_ScheduledEventsScenario:
+title: A complete &EVS; Scheduled Events scenario using an &AWS; SDK
+title_abbrev: Scheduled Events
synopsis_list:
- Deploy a &CFN; stack with required resources.
- Create a &EVS; schedule group.
@@ -173,7 +173,7 @@ scheduler_ScheduledEventsWorkflow:
- sdk_version: 2
github: javav2/example_code/scheduler
excerpts:
-- description: Run the workflow.
+- description: Run the scenario.
genai: most
snippet_tags:
- scheduler.javav2.scenario.main
@@ -186,7 +186,7 @@ scheduler_ScheduledEventsWorkflow:
- sdk_version: 3
github: dotnetv3/EventBridge Scheduler
excerpts:
-- description: Run the workflow.
+- description: Run the scenario.
genai: most
snippet_tags:
- Scheduler.dotnetv3.SchedulerWorkflow
8 changes: 4 additions & 4 deletions .doc_gen/metadata/sesv2_metadata.yaml
@@ -418,17 +418,17 @@ sesv2_DeleteEmailTemplate:
services:
sesv2: {DeleteEmailTemplate}
sesv2_NewsletterWorkflow:
-title: A complete &SESv2; Newsletter workflow using an &AWS; SDK
-title_abbrev: Newsletter workflow
-synopsis: run the &SESv2; newsletter workflow.
+title: A complete &SESv2; Newsletter scenario using an &AWS; SDK
+title_abbrev: Newsletter scenario
+synopsis: run the &SESv2; newsletter scenario.
category: Scenarios
languages:
.NET:
versions:
- sdk_version: 3
github: dotnetv3/SESv2
excerpts:
-- description: Run the workflow.
+- description: Run the scenario.
genai: most
snippet_tags:
- SESWorkflow.dotnetv3.NewsletterWorkflow
4 changes: 2 additions & 2 deletions .doc_gen/metadata/sqs_metadata.yaml
@@ -989,10 +989,10 @@ sqs_Scenario_TopicsAndQueues:
github: javascriptv3/example_code/cross-services/wkflw-topics-queues
sdkguide:
excerpts:
-- description: This is the entry point for this workflow.
+- description: This is the entry point for this scenario.
snippet_tags:
- javascript.v3.wkflw.topicsandqueues.index
-- description: The preceding code provides the necessary dependencies and starts the workflow. The next section
+- description: The preceding code provides the necessary dependencies and starts the scenario. The next section
contains the bulk of the example.
snippet_tags:
- javascript.v3.wkflw.topicsandqueues.wrapper
2 changes: 1 addition & 1 deletion .github/workflows/lint-javascript.yml
@@ -16,7 +16,7 @@ jobs:
sparse-checkout: |
.github
javascriptv3
-workflows
+scenarios
- name: Get changed files
id: changed-files
uses: tj-actions/changed-files@e9772d140489982e0e3704fea5ee93d536f1e275
2 changes: 1 addition & 1 deletion .tools/ailly/dotnet-prompts.yaml
@@ -152,7 +152,7 @@ code:

'''
/// <summary>
-/// Run the preparation step of the workflow. Should return successful.
+/// Run the preparation step of the scenario. Should return successful.
/// </summary>
/// <returns>Async task.</returns>
[Fact]
Original file line number Diff line number Diff line change
@@ -221,7 +221,7 @@ Resources:
sleep 30 # prevent "Error: Rpmdb changed underneath us"
yum install python-pip -y
python3 -m pip install boto3 ec2-metadata
-wget -O server.py https://raw.githubusercontent.com/awsdocs/aws-doc-sdk-examples/main/workflows/resilient_service/resources/server.py
+wget -O server.py https://raw.githubusercontent.com/awsdocs/aws-doc-sdk-examples/main/scenarios/features/resilient_service/resources/server.py
python3 server.py 80

# 4. An Auto Scaling group that starts EC2 instances, one in each of three Availability Zones.
4 changes: 2 additions & 2 deletions cpp/example_code/iot/things_and_shadows_workflow/README.md
@@ -1,10 +1,10 @@
-## AWS IoT Things and Shadows Workflow
+## AWS IoT Things and Shadows Scenario
This example demonstrates various interactions with the AWS Internet of Things (IoT) Core service using the AWS SDK.
The program guides you through a series of steps, showcasing AWS IoT capabilities and providing a comprehensive example for developers.



-### Workflow Steps
+### Scenario Steps

#### Create an AWS IoT thing:

4 changes: 2 additions & 2 deletions cpp/example_code/medical-imaging/README.md
@@ -93,7 +93,7 @@ This example shows you how to get started using HealthImaging.
#### Get started with image sets and image frames

This example shows you how to import DICOM files and download image frames in HealthImaging.</para>
-<para>The implementation is structured as a workflow command-line
+<para>The implementation is structured as a command-line
application.


@@ -141,4 +141,4 @@ This example shows you how to import DICOM files and download image frames in HealthImaging.

Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.

-SPDX-License-Identifier: Apache-2.0
+SPDX-License-Identifier: Apache-2.0
Original file line number Diff line number Diff line change
@@ -16,20 +16,20 @@ This workflow runs as a command-line application prompting for user input.
3. An Amazon S3 output bucket for a DICOM import job.
4. An AWS Identity and Access Management (IAM) role with the appropriate permissions for a DICOM import job.

-![CloudFormation stack diagram](../../../../workflows/healthimaging_image_sets/.images/cfn_stack.png)
+![CloudFormation stack diagram](../../../../scenarios/features/healthimaging_image_sets/.images/cfn_stack.png)

2. The user chooses a DICOM study to copy from the [National Cancer Institute Imaging Data Commons (IDC) Collections](https://registry.opendata.aws/nci-imaging-data-commons/)' public S3 bucket.
3. The chosen study is copied to the user's input S3 bucket.

-![DICOM copy diagram](../../../../workflows/healthimaging_image_sets/.images/copy_dicom.png)
+![DICOM copy diagram](../../../../scenarios/features/healthimaging_image_sets/.images/copy_dicom.png)

4. A HealthImaging DICOM import job is run.

-![DICOM import diagram](../../../../workflows/healthimaging_image_sets/.images/dicom_import.png)
+![DICOM import diagram](../../../../scenarios/features/healthimaging_image_sets/.images/dicom_import.png)

5. The workflow retrieves the IDs for the HealthImaging image frames created by the DICOM import job.

-![Image frame ID retrieval diagram](../../../../workflows/healthimaging_image_sets/.images/get_image_frame_ids.png)
+![Image frame ID retrieval diagram](../../../../scenarios/features/healthimaging_image_sets/.images/get_image_frame_ids.png)

6. The HealthImaging image frames are downloaded, decoded to a bitmap format, and verified using a CRC32 checksum.
7. The created resources can then be deleted, if the user chooses.
14 changes: 7 additions & 7 deletions dotnetv3/EventBridge Scheduler/README.md
@@ -48,7 +48,7 @@ Code excerpts that show you how to call individual service functions.
Code examples that show you how to accomplish a specific task by calling multiple
functions within the same service.

-- [Scheduled Events workflow](Scenarios/SchedulerWorkflow.cs)
+- [Scheduled Events](Scenarios/SchedulerWorkflow.cs)


<!--custom.examples.start-->
@@ -85,7 +85,7 @@ This example shows you how to get started using EventBridge Scheduler.



-#### Scheduled Events workflow
+#### Scheduled Events

This example shows you how to do the following:

@@ -96,12 +96,12 @@ This example shows you how to do the following:
- Delete the EventBridge Scheduler schedule and schedule group.
- Clean up resources and delete the stack.

-<!--custom.scenario_prereqs.scheduler_ScheduledEventsWorkflow.start-->
-<!--custom.scenario_prereqs.scheduler_ScheduledEventsWorkflow.end-->
+<!--custom.scenario_prereqs.scheduler_ScheduledEventsScenario.start-->
+<!--custom.scenario_prereqs.scheduler_ScheduledEventsScenario.end-->


-<!--custom.scenarios.scheduler_ScheduledEventsWorkflow.start-->
-<!--custom.scenarios.scheduler_ScheduledEventsWorkflow.end-->
+<!--custom.scenarios.scheduler_ScheduledEventsScenario.start-->
+<!--custom.scenarios.scheduler_ScheduledEventsScenario.end-->

### Tests

@@ -129,4 +129,4 @@ in the `dotnetv3` folder.

Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.

-SPDX-License-Identifier: Apache-2.0
+SPDX-License-Identifier: Apache-2.0
4 changes: 2 additions & 2 deletions dotnetv3/EventBridge Scheduler/Scenarios/README.md
@@ -5,7 +5,7 @@ This example shows how to use AWS SDKs to work with Amazon EventBridge Scheduler

The target SNS topic and the AWS Identity and Access Management (IAM) role used with the schedules are created as part of an AWS CloudFormation stack that is deployed at the start of the workflow, and deleted when the workflow is complete.

-![Object Lock Features](../../../workflows/eventbridge_scheduler/resources/scheduler-workflow.png)
+![Object Lock Features](../../../scenarios/features/eventbridge_scheduler/resources/scheduler-workflow.png)

This workflow demonstrates the following steps and tasks:

@@ -65,7 +65,7 @@ This workflow uses the following AWS services:

### Resources

-The workflow scenario deploys the AWS CloudFormation stack with the required resources.
+The feature scenario deploys the AWS CloudFormation stack with the required resources.

### Instructions

26 changes: 13 additions & 13 deletions dotnetv3/EventBridge Scheduler/Scenarios/SchedulerWorkflow.cs
@@ -1,4 +1,4 @@
-// Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
+// Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
// SPDX-License-Identifier: Apache-2.0

// snippet-start:[Scheduler.dotnetv3.SchedulerWorkflow]
@@ -27,8 +27,8 @@ public class SchedulerWorkflow
- Prompt the user for an email address to use for the subscription for the SNS topic subscription.
- Prompt the user for a name for the CloudFormation stack.
- Deploy the CloudFormation template in resources/cfn_template.yaml for resource creation.
-- Store the outputs of the stack into variables for use in the workflow.
-- Create a schedule group for all workflow schedules.
+- Store the outputs of the stack into variables for use in the scenario.
+- Create a schedule group for all schedules.

2. Create one-time Schedule:
- Create a one-time schedule to send an initial event.
@@ -55,9 +55,9 @@ public class SchedulerWorkflow
private static string _snsTopicArn = null!;

public static bool _interactive = true;
-private static string _stackName = "default-scheduler-workflow-stack-name";
-private static string _scheduleGroupName = "workflow-schedules-group";
-private static string _stackResourcePath = "../../../../../../workflows/eventbridge_scheduler/resources/cfn_template.yaml";
+private static string _stackName = "default-scheduler-scenario-stack-name";
+private static string _scheduleGroupName = "scenario-schedules-group";
+private static string _stackResourcePath = "../../../../../../scenarios/features/eventbridge_scheduler/resources/cfn_template.yaml";

public static async Task Main(string[] args)
{
@@ -83,7 +83,7 @@ public static async Task Main(string[] args)
}

Console.WriteLine(new string('-', 80));
-Console.WriteLine("Welcome to the Amazon EventBridge Scheduler Workflow.");
+Console.WriteLine("Welcome to the Amazon EventBridge Scheduler Scenario.");
Console.WriteLine(new string('-', 80));

try
@@ -109,12 +109,12 @@ public static async Task Main(string[] args)
}
catch (Exception ex)
{
-_logger.LogError(ex, "There was a problem with the workflow, initiating cleanup...");
+_logger.LogError(ex, "There was a problem with the scenario, initiating cleanup...");
_interactive = false;
await Cleanup();
}

-Console.WriteLine("Amazon EventBridge Scheduler workflow completed.");
+Console.WriteLine("Amazon EventBridge Scheduler scenario completed.");
}

/// <summary>
@@ -141,7 +141,7 @@ public static async Task<bool> PrepareApplication()

if (deploySuccess)
{
-// Create a schedule group for all workflow schedules
+// Create a schedule group for all schedules
await _schedulerWrapper.CreateScheduleGroupAsync(_scheduleGroupName);

Console.WriteLine("Application preparation complete.");
@@ -414,14 +414,14 @@ public static async Task<bool> CreateRecurringSchedule()
}

/// <summary>
-/// Cleans up the resources created during the workflow.
+/// Cleans up the resources created during the scenario.
/// </summary>
/// <returns>True if the cleanup was successful.</returns>
public static async Task<bool> Cleanup()
{
// Prompt the user to confirm cleanup.
var cleanup = !_interactive || GetYesNoResponse(
-"Do you want to delete all resources created by this workflow? (y/n) ");
+"Do you want to delete all resources created by this scenario? (y/n) ");
if (cleanup)
{
try
@@ -441,7 +441,7 @@ public static async Task<bool> Cleanup()
return false;
}
}
-_logger.LogInformation("EventBridge Scheduler workflow is complete.");
+_logger.LogInformation("EventBridge Scheduler scenario is complete.");
return true;
}

Original file line number Diff line number Diff line change
@@ -1,4 +1,4 @@
-// Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
+// Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
// SPDX-License-Identifier: Apache-2.0

using Amazon.CloudFormation;
@@ -18,7 +18,7 @@ public class SchedulerWorkflowTests
private SchedulerWrapper _schedulerWrapper = null!;

/// <summary>
-/// Verifies the workflow with an integration test. No errors should be logged.
+/// Verifies the scenario with an integration test. No errors should be logged.
/// </summary>
/// <returns>Async task.</returns>
[Fact]
Original file line number Diff line number Diff line change
@@ -1,4 +1,4 @@
-// Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
+// Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
// SPDX-License-Identifier: Apache-2.0

using Amazon.S3;
@@ -46,7 +46,7 @@ public S3ConditionalRequestsScenarioTests()
}

/// <summary>
-/// Run the setup step of the workflow. Should return successful.
+/// Run the setup step of the scenario. Should return successful.
/// </summary>
/// <returns>Async task.</returns>
[Fact]
4 changes: 2 additions & 2 deletions dotnetv3/S3/scenarios/S3ObjectLockScenario/README.md
@@ -6,7 +6,7 @@ This example shows how to use AWS SDKs to work with Amazon Simple Storage Service

[Amazon S3 Object Lock](https://docs.aws.amazon.com/AmazonS3/latest/userguide/object-lock.html) can help prevent Amazon S3 objects from being deleted or overwritten for a fixed amount of time or indefinitely. Object Lock can help meet regulatory requirements or protect against object changes or deletion.

-![Object Lock Features](../../../../workflows/s3_object_lock/resources/Diagram_Amazon-S3-Object-Lock.png)
+![Object Lock Features](../../../../scenarios/features/s3_object_lock/resources/Diagram_Amazon-S3-Object-Lock.png)

This workflow demonstrates the following steps and tasks:
1. Add object lock settings to both new and existing S3 buckets.
@@ -31,7 +31,7 @@ For general prerequisites, see the [README](../../../README.md) in the `dotnetv3` folder.

### Resources

-The workflow scenario steps create the buckets and objects needed for the example. No additional resources are required.
+The feature scenario steps create the buckets and objects needed for the example. No additional resources are required.

This workflow includes an optional step to add a governance mode retention period of one day to objects in an S3 bucket. In order to delete these objects, you must have the `s3:BypassGovernanceRetention` permission. If you do not have this permission, you will be unable to delete these objects until the retention period has expired.

Original file line number Diff line number Diff line change
@@ -1,4 +1,4 @@
-// Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
+// Copyright Amazon.com, Inc. or its affiliates. All Rights Reserved.
// SPDX-License-Identifier: Apache-2.0

using Amazon.S3;
@@ -41,7 +41,7 @@ public S3ObjectLockScenarioTests()
}

/// <summary>
-/// Run the setup step of the workflow. Should return successful.
+/// Run the setup step of the scenario. Should return successful.
/// </summary>
/// <returns>Async task.</returns>
[Fact]
@@ -68,7 +68,7 @@ public async Task TestSetup()
}

/// <summary>
-/// Run the list object step of the workflow. Should return successful.
+/// Run the list object step of the scenario. Should return successful.
/// </summary>
/// <returns>Async task.</returns>
[Fact]
@@ -88,7 +88,7 @@ public async Task TestObjects()


/// <summary>
-/// Run the cleanup step of the workflow. Should return successful.
+/// Run the cleanup step of the scenario. Should return successful.
/// </summary>
/// <returns>Async task.</returns>
[Fact]