AWS Step Functions Developer Guide

about creating a Lambda function, see Step 4: Configure the Lambda function in the Getting started with using Distributed Map state tutorial.

2. Copy the following code for the Lambda function and paste it into the Code source section of your Lambda function.

```python
import json

def lambda_handler(event, context):
    multiplication_factor = event['BatchInput']['MyMultiplicationFactor']
    items = event['Items']
    results = [multiplication_factor * item for item in items]
    return {
        'statusCode': 200,
        'multiplied': results
    }
```

3. After you create your Lambda function, copy the function's ARN displayed in the upper-right corner of the page. The following is an example ARN, where function-name is the name of the Lambda function (in this case, ProcessEntireBatch):

arn:aws:lambda:region:123456789012:function:function-name

You'll need to provide the function ARN in the state machine you created in Step 1.

4. Choose Deploy to deploy the changes.

Step 3: Run the state machine

When you run the state machine, the Distributed Map state starts four child workflow executions: three of the executions process three items each, while the fourth processes a single item. The following example shows the data passed to the ProcessEntireBatch function by one of the child workflow executions.

```json
{
  "BatchInput": {
    "MyMultiplicationFactor": 7
  },
  "Items": [1, 2, 3]
}
```

Given this input, the following example shows the output array named multiplied that is returned by the Lambda function.

```json
{
  "statusCode": 200,
  "multiplied": [7, 14, 21]
}
```

The state machine returns the following output, which contains four arrays named multiplied, one for each of the four child workflow executions. These arrays contain the multiplication results of the individual input items.
```json
[
  { "statusCode": 200, "multiplied": [7, 14, 21] },
  { "statusCode": 200, "multiplied": [28, 35, 42] },
  { "statusCode": 200, "multiplied": [49, 56, 63] },
  { "statusCode": 200, "multiplied": [70] }
]
```

To combine all the array items returned into a single output array, you can use the ResultSelector field. Define this field inside the Distributed Map state to find all the multiplied arrays, extract the items inside these arrays, and then combine them into a single output array. To use the ResultSelector field, update your state machine definition as shown in the following example.

```json
{
  "StartAt": "Pass",
  "States": {
    ...
    ...
    "Map": {
      "Type": "Map",
      ...
      ...
      "ItemsPath": "$.MyItems",
      "ResultSelector": {
        "multiplied.$": "$..multiplied[*]"
      }
    }
  }
}
```

The updated state machine returns a consolidated output array as shown in the following example.

```json
{
  "multiplied": [7, 14, 21, 28, 35, 42, 49, 56, 63, 70]
}
```

Processing individual items with a Lambda function in Step Functions

In this tutorial, you use the Distributed Map state's ItemBatcher (Map) field to iterate over the individual items present in a batch using a Lambda function. The Distributed Map state starts four child workflow executions. Each of these child workflows runs an Inline Map state. For each of its iterations, the Inline Map state invokes a Lambda function and passes a single item from the batch to the function. The Lambda function then processes the item and returns the result.

You'll create a state machine that performs multiplication on an array of integers. Say that the integer array you provide as input is [1, 2, 3, 4, 5, 6, 7, 8, 9, 10] and the multiplication factor is 7. Then the resulting array, formed after multiplying these integers by a factor of 7, will be [7, 14, 21, 28, 35, 42, 49, 56, 63, 70].
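Before you build the workflow, it can help to trace this data flow in plain Python. The following sketch is illustrative only (the function and variable names are not part of the tutorial); it mimics how ItemBatcher splits the ten items into batches of three, how each child workflow's Inline Map invokes the function once per item, and how ResultSelector consolidates the results:

```python
def batch(items, max_items_per_batch):
    """Mimic ItemBatcher's MaxItemsPerBatch: split items into batches."""
    return [items[i:i + max_items_per_batch]
            for i in range(0, len(items), max_items_per_batch)]

def process_single_item(factor, item):
    """Stand-in for one ProcessSingleItem Lambda invocation."""
    return {"statusCode": 200, "multiplied": factor * item}

def child_workflow(batch_input, items):
    """Mimic one child workflow execution: an Inline Map that invokes
    the function once per item in the batch it received."""
    factor = batch_input["MyMultiplicationFactor"]
    return [process_single_item(factor, item) for item in items]

my_items = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
batch_input = {"MyMultiplicationFactor": 7}

# Four child executions: three receive 3 items, the last receives 1.
outputs = [child_workflow(batch_input, b) for b in batch(my_items, 3)]

# Rough equivalent of "ResultSelector": {"multiplied.$": "$..multiplied"}:
combined = {"multiplied": [r["multiplied"] for out in outputs for r in out]}
print(combined["multiplied"])  # [7, 14, 21, 28, 35, 42, 49, 56, 63, 70]
```

The last two lines correspond to the ResultSelector consolidation step that this tutorial covers at the end.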
Step 1: Create the state machine

In this step, you create the workflow prototype of the state machine that passes a single item from a batch of items to each invocation of the Lambda function you'll create in Step 2.

• Use the following definition to create a state machine using the Step Functions console. For information about creating a state machine, see Step 1: Create the workflow prototype in the Getting started with using Distributed Map state tutorial.

In this state machine, you define a Distributed Map state that accepts an array of 10 integers as input and passes these array items to the child workflow executions in batches. Each child workflow execution receives a batch of three items as input and runs an Inline Map state. Every iteration of the Inline Map state invokes a Lambda function and passes an item from the batch to the function. This function then multiplies the item by a factor of 7 and returns the result. The output of each child workflow execution is a JSON array that contains the multiplication result for each of the items passed.

Important
Make sure to replace the Amazon Resource Name (ARN) of the Lambda function in the following code with the ARN of the function you'll create in Step 2.

```json
{
  "StartAt": "Pass",
  "States": {
    "Pass": {
      "Type": "Pass",
      "Next": "Map",
      "Result": {
        "MyMultiplicationFactor": 7,
        "MyItems": [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
      }
    },
    "Map": {
      "Type": "Map",
      "ItemProcessor": {
        "ProcessorConfig": {
          "Mode": "DISTRIBUTED",
          "ExecutionType": "STANDARD"
        },
        "StartAt": "InnerMap",
        "States": {
          "InnerMap": {
            "Type": "Map",
            "ItemProcessor": {
              "ProcessorConfig": {
                "Mode": "INLINE"
              },
              "StartAt": "Lambda Invoke",
              "States": {
                "Lambda Invoke": {
                  "Type": "Task",
                  "Resource": "arn:aws:states:::lambda:invoke",
                  "OutputPath": "$.Payload",
                  "Parameters": {
                    "Payload.$": "$",
                    "FunctionName": "arn:aws:lambda:region:account-id:function:functionName"
                  },
                  "Retry": [
                    {
                      "ErrorEquals": [
                        "Lambda.ServiceException",
                        "Lambda.AWSLambdaException",
                        "Lambda.SdkClientException",
                        "Lambda.TooManyRequestsException"
                      ],
                      "IntervalSeconds": 2,
                      "MaxAttempts": 6,
                      "BackoffRate": 2
                    }
                  ],
                  "End": true
                }
              }
            },
            "End": true,
            "ItemsPath": "$.Items",
            "ItemSelector": {
              "MyMultiplicationFactor.$": "$.BatchInput.MyMultiplicationFactor",
              "MyItem.$": "$$.Map.Item.Value"
            }
          }
        }
      },
      "End": true,
      "Label": "Map",
      "MaxConcurrency": 1000,
      "ItemsPath": "$.MyItems",
      "ItemBatcher": {
        "MaxItemsPerBatch": 3,
        "BatchInput": {
          "MyMultiplicationFactor.$": "$.MyMultiplicationFactor"
        }
      }
    }
  }
}
```

Step 2: Create the Lambda function

In this step, you create the Lambda function that processes each item passed from the batch.

Important
Ensure that your Lambda function is in the same AWS Region as your state machine.

To create the Lambda function

1. Use the Lambda console to create a Python Lambda function named ProcessSingleItem. For information about creating a Lambda function, see Step 4: Configure the Lambda function in the Getting started with using Distributed Map state tutorial.

2. Copy the following code for the Lambda function and paste it into the Code source section of your Lambda function.

```python
import json

def lambda_handler(event, context):
    multiplication_factor = event['MyMultiplicationFactor']
    item = event['MyItem']
    result = multiplication_factor * item
    return {
        'statusCode': 200,
        'multiplied': result
    }
```

3. After you create your Lambda function, copy the function's ARN displayed in the upper-right corner of the page. The following is an example ARN, where function-name is the name of the Lambda function (in this case, ProcessSingleItem):

arn:aws:lambda:region:123456789012:function:function-name

You'll need to provide the function ARN in the state machine you created in Step 1.

4. Choose Deploy to deploy the changes.

Step 3: Run the state machine

When you run the state machine, the Distributed Map state starts four child workflow executions: three of the executions process three items each, while the fourth processes a single item. The following example shows the data passed to one of the ProcessSingleItem function invocations inside a child workflow execution.

```json
{
  "MyMultiplicationFactor": 7,
  "MyItem": 1
}
```

Given this input, the following example shows the output that is returned by the Lambda function.
```json
{
  "statusCode": 200,
  "multiplied": 7
}
```

The following example shows the output JSON array for one of the child workflow executions.

```json
[
  { "statusCode": 200, "multiplied": 7 },
  { "statusCode": 200, "multiplied": 14 },
  { "statusCode": 200, "multiplied": 21 }
]
```

The state machine returns the following output, which contains four arrays, one for each of the four child workflow executions. These arrays contain the multiplication results of the individual input items.

```json
[
  [
    { "statusCode": 200, "multiplied": 7 },
    { "statusCode": 200, "multiplied": 14 },
    { "statusCode": 200, "multiplied": 21 }
  ],
  [
    { "statusCode": 200, "multiplied": 28 },
    { "statusCode": 200, "multiplied": 35 },
    { "statusCode": 200, "multiplied": 42 }
  ],
  [
    { "statusCode": 200, "multiplied": 49 },
    { "statusCode": 200, "multiplied": 56 },
    { "statusCode": 200, "multiplied": 63 }
  ],
  [
    { "statusCode": 200, "multiplied": 70 }
  ]
]
```

To combine all the multiplication results returned by the child workflow executions into a single output array, you can use the ResultSelector field. Define this field inside the Distributed Map state to find all the results, extract the individual results, and then combine them into a single output array named multiplied. To use the ResultSelector field, update your state machine definition as shown in the following example.

```json
{
  "StartAt": "Pass",
  "States": {
    ...
    ...
    "Map": {
      "Type": "Map",
      ...
      ...
      "ItemBatcher": {
        "MaxItemsPerBatch": 3,
        "BatchInput": {
          "MyMultiplicationFactor.$": "$.MyMultiplicationFactor"
        }
      },
      "ItemsPath": "$.MyItems",
      "ResultSelector": {
        "multiplied.$": "$..multiplied"
      }
    }
  }
}
```

The updated state machine returns a consolidated output array as shown in the following example.

```json
{
  "multiplied": [7, 14, 21, 28, 35, 42, 49, 56, 63, 70]
}
```

Starting a Step Functions workflow in response to events

You can run an AWS Step Functions state machine in response to an event routed by an Amazon EventBridge rule to Step Functions as a target. The following tutorial shows you how to configure a state machine as a target of an Amazon EventBridge rule. Whenever files are added to an Amazon Simple Storage Service (Amazon S3) bucket, the EventBridge rule starts the state machine. A practical example of this approach could be a state machine that runs Amazon Rekognition analysis on image files added to the bucket to categorize and assign keywords.

In this tutorial, you start the execution of a Helloworld state machine by uploading a file to an Amazon S3 bucket.
Then you review the example input of that execution to identify the information that is included in the input from the Amazon S3 event notification delivered to EventBridge.

Prerequisite: Create a State Machine

Before you can configure a state machine as an Amazon EventBridge target, you must create the state machine.

• To create a basic state machine, use the Creating a state machine that uses a Lambda function tutorial.
• If you already have a Helloworld state machine, proceed to the next step.

Step 1: Create a Bucket in Amazon S3

Now that you have a Helloworld state machine, you need to create an Amazon S3 bucket to store your files. In Step 3 of this tutorial, you set up a rule so that when a file is uploaded to this bucket, EventBridge triggers an execution of your state machine.

1. Navigate to the Amazon S3 console, and then choose Create bucket to create the bucket in which you want to store your files and trigger an Amazon S3 event rule.
2. Enter a Bucket name, such as username-sfn-tutorial.

Note
Bucket names must be unique across all existing bucket names in all AWS Regions in Amazon S3. Use your own username to make this name unique. You need to create all resources in the same AWS Region.

3. Keep all the default selections on the page, and choose Create bucket.

Step 2: Enable Amazon S3 Event Notification with EventBridge

After you create the Amazon S3 bucket, configure it to send events to EventBridge whenever certain events happen in your S3 bucket, such as file uploads.

1. Navigate to the Amazon S3 console.
2. In the Buckets list, choose the name of the bucket that you want to enable events for.
3. Choose Properties.
4. Scroll down the page to view the Event Notifications section, and then choose Edit in the Amazon EventBridge subsection.
5. Under Send notifications to Amazon EventBridge for all events in this bucket, choose On.
6. Choose Save changes.
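With EventBridge notifications turned on, every upload to the bucket emits an Object Created event. The rule you build in Step 3 selects those events with an event pattern; the console choices described there amount to a pattern like the following (shown here as a sketch, using this tutorial's placeholder bucket name):

```json
{
  "source": ["aws.s3"],
  "detail-type": ["Object Created"],
  "detail": {
    "bucket": {
      "name": ["username-sfn-tutorial"]
    }
  }
}
```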
Note
After you enable EventBridge, it takes around five minutes for the changes to take effect.

Step 3: Create an Amazon EventBridge Rule

After you have a state machine, and have created the Amazon S3 bucket and configured it to send event notifications to EventBridge, create an EventBridge rule.

Note
You must configure the EventBridge rule in the same AWS Region as the Amazon S3 bucket.

To create the rule

1. Navigate to the Amazon EventBridge console, and choose Create rule.

Tip
Alternatively, in the navigation pane on the EventBridge console, choose Rules under Buses, and then choose Create rule.

2. Enter a Name for your rule (for example, S3StepFunctions) and optionally enter a Description for the rule.
3. For Event bus and Rule type, keep the default selections.
4. Choose Next. This opens the Build event pattern page.
5. Scroll down to the Event pattern section, and do the following:

   a. For Event source, keep the default selection of AWS events or EventBridge partner events.
   b. For AWS service, choose Simple Storage Service (S3).
   c. For Event type, choose Amazon S3 Event Notification.
   d. Choose Specific event(s), and then choose Object Created.
   e. Choose Specific bucket(s) by name and enter the name of the bucket you created in Step 1 (username-sfn-tutorial) to store your files.
   f. Choose Next. This opens the Select target(s) page.

To create the target

1. In Target 1, keep the default selection of AWS service.
2. In the Select a target dropdown list, select Step Functions state machine.
3. In the State machine list, select the state machine that you created earlier (for example, Helloworld).
4. Keep all the default selections on the page, and choose Next. This opens the Configure tags page.
5. Choose Next again. This opens the Review and create page.
6. Review the details of the rule and choose Create rule. The rule is created and the Rules page is displayed, listing all your Amazon EventBridge rules.

Step 4: Test the Rule

Now that everything is in place, test adding a file to the Amazon S3 bucket, and then look at the input of the resulting state machine execution.

1. Add a file to your Amazon S3 bucket.
Navigate to the Amazon S3 console, choose the bucket you created to store files (username-sfn-tutorial), and then choose Upload.

2. Add a file, for example test.png, and then choose Upload. This starts an execution of your state machine, with information about the uploaded file passed as the input.
3. Check the execution for your state machine. Navigate to the Step Functions console and select the state machine used in your Amazon EventBridge rule (Helloworld).
4. Select the most recent execution of that state machine and expand the Execution Input section. This input includes information such as the bucket name and the object name. In a real-world use case, a state machine can use this input to perform actions on that object.

Example of Execution Input

The following example shows a typical input to the state machine execution.

```json
{
  "version": "0",
  "id": "6c540ad4-0671-9974-6511-756fbd7771c3",
  "detail-type": "Object Created",
  "source": "aws.s3",
  "account": "123456789012",
  "time": "2023-06-23T23:45:48Z",
  "region": "us-east-2",
  "resources": [
    "arn:aws:s3:::username-sfn-tutorial"
  ],
  "detail": {
    "version": "0",
    "bucket": {
      "name": "username-sfn-tutorial"
    },
    "object": {
      "key": "test.png",
      "size": 800704,
      "etag": "f31d8546bb67845b4d3048cde533b937",
      "sequencer": "00621049BA9A8C712B"
    },
    "request-id": "79104EXAMPLEB723",
    "requester": "123456789012",
    "source-ip-address": "200.0.100.11",
    "reason": "PutObject"
  }
}
```

Creating a Step Functions API using API Gateway

You can use Amazon API Gateway to associate your AWS Step Functions APIs with methods in an API Gateway API. When an HTTPS request is sent to an API method, API Gateway invokes your Step Functions API actions. This tutorial shows you how to create an API that uses one resource and the POST method to communicate with the StartExecution API action. You'll use the AWS Identity and Access Management (IAM) console to create a role for API Gateway.
Then, you'll use the API Gateway console to create an API Gateway API, create a resource and method, and map the method to the StartExecution API action. Finally, you'll deploy and test your API.

Note
Although Amazon API Gateway can start a Step Functions execution by calling StartExecution, you must call DescribeExecution to get the result.

Step 1: Create an IAM Role for API Gateway

Before you create your API Gateway API, you need to give API Gateway permission to call Step Functions API actions.

To set up permissions for API Gateway

1. Sign in to the IAM console and choose Roles, Create role.
2. On the Select trusted entity page, do the following:

   a. For Trusted entity type, keep the default selection of AWS service.
   b. For Use case, choose API Gateway from the dropdown list.

3. Select API Gateway, and then choose Next.
4. On the Add permissions page, choose Next.
5. (Optional) On the Name, review, and create page, enter details, such as the role name. For example, enter APIGatewayToStepFunctions.
6. Choose Create role. The IAM role appears in the list of roles.
7. Choose the name of your role and note the Role ARN, as shown in the following example.

arn:aws:iam::123456789012:role/APIGatewayToStepFunctions

To attach a policy to the IAM role

1. On the Roles page, search for your role (APIGatewayToStepFunctions), and then choose the role.
2. On the Permissions tab, choose Add permissions, and then choose Attach policies.
3. On the Attach Policy page, search for AWSStepFunctionsFullAccess, choose the policy, and then choose Add permissions.

Step 2: Create your API Gateway API

After you create your IAM role, you can create your custom API Gateway API.

To create the API

1. Open the Amazon API Gateway console, and then choose Create API.
2. On the Choose an API type page, in the REST API pane, choose Build.
3. On the Create REST API page, select New API, and then enter StartExecutionAPI for the API name.
4. Keep API endpoint type as Regional, and then choose Create API.

To create a resource

1. On the Resources page of StartExecutionAPI, choose Create resource.
2. On the Create resource page, enter execution for Resource name, and then choose Create resource.

To create a POST method

1. Choose the /execution resource, and then choose Create method.
2. For Method type, choose POST.
3. For Integration type, choose AWS service.
4. For AWS Region, choose a Region from the list.
5. For AWS service, choose Step Functions from the list.
6. Keep AWS subdomain blank.
7. For HTTP method, choose POST from the list.

Note
All Step Functions API actions use the HTTP POST method.

8. For Action type, select Use action name.
9. For Action name, enter StartExecution.
10. For Execution role, enter the role ARN of the IAM role that you created earlier, as shown in the following example.

arn:aws:iam::123456789012:role/APIGatewayToStepFunctions

11. Keep the default options for Credential cache and Default timeout, and then choose Save. The visual mapping between API Gateway and Step Functions is displayed on the /execution - POST - Method execution page.

Step 3: Test and Deploy the API Gateway API

Once you have created the API, test and deploy it.

To test the communication between API Gateway and Step Functions

1. On the /execution - POST - Method Execution page, choose the Test tab. You might need to choose the right arrow button to show the tab.
2. On the /execution - POST - Method Test tab, copy the following request parameters into the Request body section using the ARN of an existing state machine (or create a new state machine that uses a Lambda function), and then choose Test.

```json
{
  "input": "{}",
  "name": "MyExecution",
  "stateMachineArn": "arn:aws:states:region:123456789012:stateMachine:HelloWorld"
}
```

For more information, see the StartExecution Request Syntax in the AWS Step Functions API Reference.

Note
If you don't want to include the ARN of your state machine in the body of your API Gateway call, you can configure a mapping template in the Integration request tab, as shown in the following example.
```json
{
  "input": "$util.escapeJavaScript($input.json('$'))",
  "stateMachineArn": "$util.escapeJavaScript($stageVariables.arn)"
}
```

With this approach, you can specify the ARNs of different state machines based on your development stage (for example, dev, test, and prod). For more information about specifying stage variables in a mapping template, see $stageVariables in the API Gateway Developer Guide.

3. The execution starts, and the execution ARN and its epoch date are displayed under Response body.

```json
{
  "executionArn": "arn:aws:states:region:123456789012:execution:HelloWorld:MyExecution",
  "startDate": 1486768956.878
}
```

Note
You can view the execution by choosing your state machine on the AWS Step Functions console.

To deploy your API

1. On the Resources page of StartExecutionAPI, choose Deploy API.
2. For Stage, select New stage.
3. For Stage name, enter alpha.
4. (Optional) For Description, enter a description.
5. Choose Deploy.

To test your deployment

1. On the Stages page of StartExecutionAPI, expand alpha, /, /execution, POST, and then choose the POST method.
2. Under Method overrides, choose the copy icon to copy your API's invoke URL. The full URL should look like the following example.

https://a1b2c3d4e5.execute-api.region.amazonaws.com/alpha/execution

3. From the command line, run the curl command using the ARN of your state machine, and then invoke the URL of your deployment, as shown in the following example.

```shell
curl -X POST -d '{"input": "{}","name": "MyExecution","stateMachineArn": "arn:aws:states:region:123456789012:stateMachine:HelloWorld"}' https://a1b2c3d4e5.execute-api.region.amazonaws.com/alpha/execution
```

The execution ARN and its epoch date are returned, as shown in the following example.

```json
{"executionArn":"arn:aws:states:region:123456789012:execution:HelloWorld:MyExecution","startDate":1.486772644911E9}
```

Note
If you get a "Missing Authentication Token" error, make sure that the invoke URL ends with /execution.

Handling error conditions using a Step Functions state machine

In this tutorial, you create an AWS Step Functions state machine with a Catch field that routes to fallback states. The Catch field uses an AWS Lambda function to respond with conditional logic based on the error message type. This is a technique called function error handling. For more information, see AWS Lambda function errors in Node.js in the AWS Lambda Developer Guide.
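Conceptually, a Catch field is a first-match filter over the error name that a failed task reports. The following Python sketch is illustrative only — it is not how Step Functions is implemented, and it simplifies matching by treating States.ALL as the only wildcard — but it shows the routing order for the three catchers used later in this tutorial:

```python
def route_error(error_name, catchers):
    """Return the Next state of the first catcher whose ErrorEquals
    matches the reported error name; None means the error is unhandled
    and the execution fails. Simplified: in the real service,
    States.TaskFailed also acts as a wildcard for any task failure."""
    for catcher in catchers:
        names = catcher["ErrorEquals"]
        if error_name in names or "States.ALL" in names:
            return catcher["Next"]
    return None

# The Catch rules from this tutorial's state machine definition.
catchers = [
    {"ErrorEquals": ["CustomError"], "Next": "CustomErrorFallback"},
    {"ErrorEquals": ["States.TaskFailed"], "Next": "ReservedTypeFallback"},
    {"ErrorEquals": ["States.ALL"], "Next": "CatchAllFallback"},
]

print(route_error("CustomError", catchers))     # CustomErrorFallback
print(route_error("States.Timeout", catchers))  # CatchAllFallback
```

Because catchers are evaluated in order, the most specific error names should come first, with States.ALL as the final catch-all.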
Note
You can also create state machines that Retry on timeouts, or that use Catch to transition to a specific state when an error or timeout occurs. For examples of these error handling techniques, see Examples Using Retry and Using Catch.

Step 1: Create a Lambda function that fails

Use a Lambda function to simulate an error condition.

Important
Ensure that your Lambda function is under the same AWS account and AWS Region as your state machine.

1. Open the AWS Lambda console at https://console.aws.amazon.com/lambda/.
2. Choose Create function.
3. Choose Use a blueprint, enter step-functions into the search box, and then choose the Throw a custom error blueprint.
4. For Function name, enter FailFunction.
5. For Role, keep the default selection (Create a new role with basic Lambda permissions).
6. The following code is displayed in the Lambda function code pane.

```javascript
exports.handler = async (event, context) => {
    function CustomError(message) {
        this.name = 'CustomError';
        this.message = message;
    }
    CustomError.prototype = new Error();

    throw new CustomError('This is a custom error!');
};
```

The context object returns the error message This is a custom error!.

7. Choose Create function.
8. After your Lambda function is created, copy the function's Amazon Resource Name (ARN) displayed in the upper-right corner of the page. The following is an example ARN:

arn:aws:lambda:region:123456789012:function:FailFunction

9. Choose Deploy.

Step 2: Test the Lambda function

Test your Lambda function to see it in operation.

1. On the FailFunction page, choose the Test tab, and then choose Test. You don't need to create a test event.
2. To review the test results (the simulated error), under Execution result, expand Details.
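The blueprint above is written in Node.js. For comparison, a hypothetical Python handler with the same behavior would look like the following — an unhandled exception's class name (CustomError) becomes the error name that Catch rules can match:

```python
class CustomError(Exception):
    """Same role as the CustomError constructor in the Node.js blueprint."""

def lambda_handler(event, context):
    # Raising an unhandled exception fails the invocation; Lambda reports
    # the exception class name as the errorType in the error output.
    raise CustomError('This is a custom error!')
```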
Step 3: Create a state machine with a Catch field

Use the Step Functions console to create a state machine that uses a Task state with a Catch field. Add a reference to your Lambda function in the Task state. The state machine invokes the Lambda function, which fails during execution. Step Functions retries the function twice using exponential backoff between retries.

1. Open the Step Functions console and choose Create state machine.
2. In the Choose a template dialog box, select Blank.
3. Choose Select to open Workflow Studio in Design mode.
4. Choose Code to open the code editor. In the code editor, you write and edit the Amazon States Language (ASL) definition of your workflows.
5. Paste the following code, but replace the ARN in the Resource field with the ARN of the Lambda function that you created earlier.

```json
{
  "Comment": "A Catch example of the Amazon States Language using an AWS Lambda function",
  "StartAt": "CreateAccount",
  "States": {
    "CreateAccount": {
      "Type": "Task",
      "Resource": "arn:aws:lambda:region:123456789012:function:FailFunction",
      "Catch": [
        {
          "ErrorEquals": ["CustomError"],
          "Next": "CustomErrorFallback"
        },
        {
          "ErrorEquals": ["States.TaskFailed"],
          "Next": "ReservedTypeFallback"
        },
        {
          "ErrorEquals": ["States.ALL"],
          "Next": "CatchAllFallback"
        }
      ],
      "End": true
    },
    "CustomErrorFallback": {
      "Type": "Pass",
      "Result": "This is a fallback from a custom Lambda function exception",
      "End": true
    },
    "ReservedTypeFallback": {
      "Type": "Pass",
      "Result": "This is a fallback from a reserved error code",
      "End": true
    },
    "CatchAllFallback": {
      "Type": "Pass",
      "Result": "This is a fallback from any error code",
      "End": true
    }
  }
}
```

This is a description of your state machine using the Amazon States Language. It defines a single Task state named CreateAccount. For more information, see State Machine Structure. For more information about the syntax of the Retry field, see State machine examples using Retry and using Catch.

Note
Unhandled errors in Lambda runtimes were historically reported only as Lambda.Unknown. In newer runtimes, timeouts are reported as Sandbox.Timedout in the error output. When Lambda exceeds the maximum number of invocations, the reported error will be Lambda.TooManyRequestsException. Match on Lambda.Unknown, Sandbox.Timedout, States.ALL, and States.TaskFailed to handle possible errors. For more information about Lambda Handled and Unhandled errors, see FunctionError in the AWS Lambda Developer Guide.

6. (Optional) In the Graph visualization, see the real-time graphical visualization of your workflow.
7. Specify a name for your state machine.
To do this, choose the edit icon next to the default state machine name of MyStateMachine. Then, in State machine configuration, specify a name in the State machine name box. For this tutorial, enter Catchfailure.
8. (Optional) In State machine configuration, specify other workflow settings, such as state machine type and its execution role. For this tutorial, keep all the default selections in State machine settings.
9. In the Confirm role creation dialog box, choose Confirm to continue. You can also choose View role settings to go back to State machine configuration.

Note
If you delete the IAM role that Step Functions creates, Step Functions can't recreate it later. Similarly, if you modify the role (for example, by removing Step Functions from the principals in the IAM policy), Step Functions can't restore its original settings later.

Step 4: Run the state machine

After you create your state machine, you can run it.

1. On the State machines page, choose Catchfailure.
2. On the Catchfailure page, choose Start execution. The Start execution dialog box is displayed.
3. In the Start execution dialog box, do the following:
   1. (Optional) Enter a custom execution name to override the generated default.
      Non-ASCII names and logging
      Step Functions accepts names for state machines, executions, activities, and labels that contain non-ASCII characters. Because such characters will not work with Amazon CloudWatch, we recommend using only ASCII characters so you can track metrics in CloudWatch.
   2. (Optional) In the Input box, enter input values in JSON format to run your workflow.
   3. Choose Start execution.
4. The Step Functions console directs you to a page that's titled with your execution ID. This page is known as the Execution Details page. On this page, you can review the execution results as the execution progresses or after it's complete.
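The console steps above can also be scripted. The following is a minimal sketch using the AWS SDK for Python (Boto3), not part of the tutorial, that starts a Catchfailure execution and prints the output produced by whichever fallback state ran. The state machine ARN is a placeholder you must replace, and boto3 is imported inside the function so the helper loads without AWS credentials.

```python
# Hypothetical helper script (not part of the tutorial): start an execution of
# the Catchfailure state machine and print the fallback state's output.
import json
import time

def fallback_result(output_json):
    # A Pass state's Result here is a plain string, so the execution output
    # is a JSON-encoded string; decode it to get the message.
    return json.loads(output_json)

def run(state_machine_arn):
    import boto3  # requires AWS credentials; replace the placeholder ARN
    sfn = boto3.client("stepfunctions")
    execution = sfn.start_execution(stateMachineArn=state_machine_arn)
    while True:
        desc = sfn.describe_execution(executionArn=execution["executionArn"])
        if desc["status"] != "RUNNING":
            # "output" is only present for successful executions
            return fallback_result(desc.get("output", "null"))
        time.sleep(1)

if __name__ == "__main__":
    print(run("arn:aws:states:region:123456789012:stateMachine:Catchfailure"))
```

Because every Catch rule in this tutorial routes to a Pass state, the execution succeeds and the script prints the fallback message rather than raising.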
To review the execution results, choose individual states on the Graph view, and then choose the individual tabs on the Step details pane to view each state's details including input, output, and definition respectively. For details about the execution information you can view on the Execution Details page, see Execution details overview.

For example, to view your custom error message, choose the CreateAccount step in Graph view, and then choose the Output tab.

Note
You can preserve the state input with the error by using ResultPath. See Use ResultPath to include both error and input in a Catch.

Creating an Activity state machine using Step Functions

This tutorial shows you how to create an activity-based state machine using Java and AWS Step Functions. Activities allow you to control worker code that runs somewhere else from your state machine. For an overview, see Learn about Activities in Step Functions in Learn about state machines in Step Functions.

To complete this tutorial, you need the following:
• The SDK for Java. The example activity in this tutorial is a Java application that uses the AWS SDK for Java to communicate with AWS.
• AWS credentials in the environment or in the standard AWS configuration file.
For more information, see Set Up Your AWS Credentials in the AWS SDK for Java Developer Guide.

Step 1: Create an Activity

You must make Step Functions aware of the activity whose worker (a program) you want to create. Step Functions responds with an Amazon Resource Name (ARN) that establishes an identity for the activity. Use this identity to coordinate the information passed between your state machine and worker.

Important
Ensure that your activity task is under the same AWS account as your state machine.

1. In the Step Functions console, in the navigation pane on the left, choose Activities.
2. Choose Create activity.
3. Enter a Name for the activity, for example, get-greeting, and then choose Create activity.
4. When your activity task is created, make a note of its ARN, as shown in the following example.

arn:aws:states:region:123456789012:activity:get-greeting

Step 2: Create a state machine

Create a state machine that determines when your activity is invoked and when your worker should perform its primary work, collect its results, and return them. To create the state machine, you'll use the Code editor of Workflow Studio.

1. In the Step Functions console, in the navigation pane on the left, choose State machines.
2. On the State machines page, choose Create state machine.
3. In the Choose a template dialog box, select Blank.
4.
Choose Select to open Workflow Studio in Design mode.
5. For this tutorial, you'll write the Amazon States Language (ASL) definition of your state machine in the code editor. To do this, choose Code.
6. Remove the existing boilerplate code and paste the following code. Remember to replace the example ARN in the Resource field with the ARN of the activity task that you created earlier in Step 1: Create an Activity.

{
  "Comment": "An example using a Task state.",
  "StartAt": "getGreeting",
  "Version": "1.0",
  "TimeoutSeconds": 300,
  "States": {
    "getGreeting": {
      "Type": "Task",
      "Resource": "arn:aws:states:region:123456789012:activity:get-greeting",
      "End": true
    }
  }
}

This is a description of your state machine using the Amazon States Language (ASL). It defines a single Task state named getGreeting. For more information, see State Machine Structure.

7. On the Graph visualization, make sure the workflow graph for the ASL definition you added looks similar to the following graph.
8. Specify a name for your state machine. To do this, choose the edit icon next to the default state machine name of MyStateMachine. Then, in State machine configuration, specify a name in the State machine name box. For this tutorial, enter the name ActivityStateMachine.
9. (Optional) In State machine configuration, specify other workflow settings, such as state machine type and its execution role. For this tutorial, keep all the default selections in State machine settings. If you've previously created an IAM role with the correct permissions for your state machine and want to use it, in Permissions, select Choose an existing role, and then select a role from the list. Or select Enter a role ARN and then provide an ARN for that IAM role.
10. In the Confirm role creation dialog box, choose Confirm to continue.
You can also choose View role settings to go back to State machine configuration.

Note
If you delete the IAM role that Step Functions creates, Step Functions can't recreate it later. Similarly, if you modify the role (for example, by removing Step Functions from the principals in the IAM policy), Step Functions can't restore its original settings later.

Step 3: Implement a Worker

Create a worker. A worker is a program that is responsible for:
• Polling Step Functions for activities using the GetActivityTask API action.
• Performing the work of the activity using your code (for example, the getGreeting() method in the following code).
• Returning the results using the SendTaskSuccess, SendTaskFailure, and SendTaskHeartbeat API actions.

Note
For a more complete example of an activity worker, see Example: Activity Worker in Ruby. This example provides an implementation based on best practices, which you can use as a reference for your activity worker.
The code implements a consumer-producer pattern with a configurable number of threads for pollers and activity workers.

To implement the worker

1. Create a file named GreeterActivities.java.
2. Add the following code to it.

import com.amazonaws.ClientConfiguration;
import com.amazonaws.auth.EnvironmentVariableCredentialsProvider;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.stepfunctions.AWSStepFunctions;
import com.amazonaws.services.stepfunctions.AWSStepFunctionsClientBuilder;
import com.amazonaws.services.stepfunctions.model.GetActivityTaskRequest;
import com.amazonaws.services.stepfunctions.model.GetActivityTaskResult;
import com.amazonaws.services.stepfunctions.model.SendTaskFailureRequest;
import com.amazonaws.services.stepfunctions.model.SendTaskSuccessRequest;
import com.amazonaws.util.json.Jackson;
import com.fasterxml.jackson.databind.JsonNode;
import java.util.concurrent.TimeUnit;

public class GreeterActivities {

    public String getGreeting(String who) throws Exception {
        return "{\"Hello\": \"" + who + "\"}";
    }

    public static void main(final String[] args) throws Exception {
        GreeterActivities greeterActivities = new GreeterActivities();
        ClientConfiguration clientConfiguration = new ClientConfiguration();
        clientConfiguration.setSocketTimeout((int)TimeUnit.SECONDS.toMillis(70));

        AWSStepFunctions client =
            AWSStepFunctionsClientBuilder.standard()
                .withRegion(Regions.US_EAST_1)
                .withCredentials(new EnvironmentVariableCredentialsProvider())
                .withClientConfiguration(clientConfiguration)
                .build();

        while (true) {
            GetActivityTaskResult getActivityTaskResult =
                client.getActivityTask(
                    new GetActivityTaskRequest().withActivityArn(ACTIVITY_ARN));

            if (getActivityTaskResult.getTaskToken() != null) {
                try {
                    JsonNode json = Jackson.jsonNodeOf(getActivityTaskResult.getInput());
                    String greetingResult =
                        greeterActivities.getGreeting(json.get("who").textValue());
                    client.sendTaskSuccess(
                        new SendTaskSuccessRequest().withOutput(
                            greetingResult).withTaskToken(getActivityTaskResult.getTaskToken()));
                } catch (Exception e) {
                    client.sendTaskFailure(
                        new SendTaskFailureRequest().withTaskToken(
                            getActivityTaskResult.getTaskToken()));
                }
            } else {
                Thread.sleep(1000);
            }
        }
    }
}

Note
The EnvironmentVariableCredentialsProvider class in this example assumes that the AWS_ACCESS_KEY_ID (or AWS_ACCESS_KEY) and AWS_SECRET_KEY (or AWS_SECRET_ACCESS_KEY) environment variables are set. For more information about providing the required credentials to the factory, see AWSCredentialsProvider in the AWS SDK for Java API Reference and Set Up AWS Credentials and Region for Development in the AWS SDK for Java Developer Guide.

By default, the AWS SDK will wait up to 50 seconds to receive data from the server for any operation. The GetActivityTask operation is a long-poll operation that waits up to 60 seconds for the next available task. To prevent receiving a SocketTimeoutException error, set SocketTimeout to 70 seconds.

3. In the parameter list of the GetActivityTaskRequest().withActivityArn() constructor, replace the ACTIVITY_ARN value with the ARN of the activity task that you created earlier in Step 1: Create an Activity.
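If you work in Python rather than Java, a rough Boto3 equivalent of the worker above might look like the following. This is a sketch, not part of the tutorial: the activity ARN is a placeholder, and boto3 is imported inside the polling function so the module loads without AWS credentials.

```python
# Sketch of a Python (Boto3) activity worker mirroring GreeterActivities.java.
import json

def get_greeting(who):
    # Mirrors the Java getGreeting(): build the activity's JSON result string.
    return json.dumps({"Hello": who})

def poll_forever(activity_arn):
    import boto3
    from botocore.config import Config
    # GetActivityTask long-polls for up to 60 seconds, so allow a 70-second
    # read timeout, matching the SocketTimeout setting in the Java example.
    sfn = boto3.client("stepfunctions", config=Config(read_timeout=70))
    while True:
        task = sfn.get_activity_task(activityArn=activity_arn)
        token = task.get("taskToken")
        if not token:
            continue  # the long poll returned no task; poll again
        try:
            who = json.loads(task["input"])["who"]
            sfn.send_task_success(taskToken=token, output=get_greeting(who))
        except Exception:
            sfn.send_task_failure(taskToken=token)

if __name__ == "__main__":
    # Replace with the ARN from Step 1: Create an Activity.
    poll_forever("arn:aws:states:region:123456789012:activity:get-greeting")
```

As in the Java version, the loop blocks on the long poll, reports success with the greeting JSON, and reports failure if the input can't be processed.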
Step 4: Run the state machine

When you start the execution of the state machine, your worker polls Step Functions for activities, performs its work (using the input that you provide), and returns its results.

1. On the ActivityStateMachine page, choose Start execution. The Start execution dialog box is displayed.
2. In the Start execution dialog box, do the following:
   a. (Optional) Enter a custom execution name to override the generated default.
      Non-ASCII names and logging
      Step Functions accepts names for state machines, executions, activities, and labels that contain non-ASCII characters. Because such characters will not work with Amazon CloudWatch, we recommend using only ASCII characters so you can track metrics in CloudWatch.
   b. In the Input box, enter the following JSON input to run your workflow.

      {
        "who": "AWS Step Functions"
      }

   c. Choose Start execution.
   d. The Step Functions console directs you to a page that's titled with your execution ID. This page is known as the Execution Details page. On this page, you can review the execution results as the execution progresses or after it's complete. To review the execution results, choose individual states on the Graph view, and then choose the individual tabs on the Step details pane to view each state's details including input, output, and definition respectively. For details about the execution information you can view on the Execution Details page, see Execution details overview.

Step 5: Run and Stop the Worker

To have the worker poll your state machine for activities, you must run the worker.

1. On the command line, navigate to the directory in which you created GreeterActivities.java.
2. To use the AWS SDK, add the full path of the lib and third-party directories to the dependencies of your build file and to your Java CLASSPATH.
For more information, see Downloading and Extracting the SDK in the AWS SDK for Java Developer Guide.
3. Compile the file.

$ javac GreeterActivities.java

4. Run the file.

$ java GreeterActivities

5. On the Step Functions console, navigate to the Execution Details page.
6. When the execution completes, examine the results of your execution.
7. Stop the worker.

View X-Ray traces in Step Functions
In this tutorial, you will learn how to use X-Ray to trace errors that occur when running a state machine. You can use AWS X-Ray to visualize the components of your state machine, identify performance bottlenecks, and troubleshoot requests that resulted in an error. In this tutorial, you will create several Lambda functions that randomly produce errors, which you can then trace and analyze using X-Ray.

The Creating a Step Functions state machine that uses Lambda tutorial walks you through creating a state machine that calls a Lambda function. If you have completed that tutorial, skip to Step 2 and use the AWS Identity and Access Management (IAM) role that you previously created.

Step 1: Create an IAM role for Lambda

Both AWS Lambda and AWS Step Functions can run code and access AWS resources (for example, data stored in Amazon S3 buckets). To maintain security, you must grant Lambda and Step Functions access to these resources. Lambda requires you to assign an AWS Identity and Access Management (IAM) role when you create a Lambda function, in the same way Step Functions requires you to assign an IAM role when you create a state machine. You use the IAM console to create a service-linked role.

To create a role (console)

1. Sign in to the AWS Management Console and open the IAM console at https://console.aws.amazon.com/iam/.
2. In the navigation pane of the IAM console, choose Roles.
Then choose Create role.
3. Choose the AWS Service role type, and then choose Lambda.
4. Choose the Lambda use case. Use cases are defined by the service to include the trust policy required by the service. Then choose Next: Permissions.
5. Choose one or more permissions policies to attach to the role (for example, AWSLambdaBasicExecutionRole). See AWS Lambda Permissions Model. Select the box next to the policy that assigns the permissions that you want the role to have, and then choose Next: Review.
6. Enter a Role name.
7. (Optional) For Role description, edit the description for the new service-linked role.
8. Review the role, and then choose Create role.

Step 2: Create a Lambda function

Your Lambda function will randomly throw errors or time out, producing example data to view in X-Ray.

Important
Ensure that your Lambda function is under the same AWS account and AWS Region as your state machine.

1. Open the Lambda console and choose Create function.
2. In the Create function section, choose Author from scratch.
3. In the Basic information section, configure your Lambda function:
   a. For Function name, enter TestFunction1.
   b. For Runtime, choose Node.js 18.x.
   c. For Role, select Choose an existing role.
   d. For Existing role, select the Lambda role that you created earlier.
      Note
      If the IAM role that you created doesn't appear in the list, the role might still need a few minutes to propagate to Lambda.
   e. Choose Create function.

   When your Lambda function is created, note its Amazon Resource Name (ARN) in the upper-right corner of the page. For example:

   arn:aws:lambda:region:123456789012:function:TestFunction1

4. Copy the following code for the Lambda function into the Function code section of the TestFunction1 page.
function getRandomSeconds(max) {
    return Math.floor(Math.random() * Math.floor(max)) * 1000;
}

function sleep(ms) {
    return new Promise(resolve => setTimeout(resolve, ms));
}

export const handler = async (event) => {
    if (getRandomSeconds(4) === 0) {
        throw new Error("Something went wrong!");
    }
    let wait_time = getRandomSeconds(5);
    await sleep(wait_time);
    return { 'response': true }
};

This code creates randomly timed failures, which will be used to generate example errors in your state machine that can be viewed and analyzed using X-Ray traces.

5. Choose Save.

Step 3: Create two more Lambda functions

Create two more Lambda functions.

1. Repeat Step 2 to create two more Lambda functions. For the next function, in Function name, enter TestFunction2. For the last function, in Function name, enter TestFunction3.
2. In the Lambda console, check that you now have three Lambda functions: TestFunction1, TestFunction2, and TestFunction3.

Step 4: Create a state machine

In this step, you'll use the Step Functions console to create a state machine with three Task states.
Each Task state will reference one of your three Lambda functions.

1. Open the Step Functions console and choose Create state machine.

   Important
   Make sure that your state machine is under the same AWS account and Region as the Lambda functions you created earlier in Step 2 and Step 3.

2. In the Choose a template dialog box, select Blank.
3. Choose Select to open Workflow Studio in Design mode.
4. For this tutorial, you'll write the Amazon States Language (ASL) definition of your state machine in the Code editor. To do this, choose Code.
5. Remove the existing boilerplate code and paste the following code. In the Task state definition, remember to replace the example ARNs with the ARNs of the Lambda functions you created.
{
  "StartAt": "CallTestFunction1",
  "States": {
    "CallTestFunction1": {
      "Type": "Task",
      "Resource": "arn:aws:lambda:region:123456789012:function:test-function1",
      "Catch": [
        {
          "ErrorEquals": ["States.TaskFailed"],
          "Next": "AfterTaskFailed"
        }
      ],
      "Next": "CallTestFunction2"
    },
    "CallTestFunction2": {
      "Type": "Task",
      "Resource": "arn:aws:lambda:region:123456789012:function:test-function2",
      "Catch": [
        {
          "ErrorEquals": ["States.TaskFailed"],
          "Next": "AfterTaskFailed"
        }
      ],
      "Next": "CallTestFunction3"
    },
    "CallTestFunction3": {
      "Type": "Task",
      "Resource": "arn:aws:lambda:region:123456789012:function:test-function3",
      "TimeoutSeconds": 5,
      "Catch": [
        {
          "ErrorEquals": ["States.Timeout"],
          "Next": "AfterTimeout"
        },
        {
          "ErrorEquals": ["States.TaskFailed"],
          "Next": "AfterTaskFailed"
        }
      ],
      "Next": "Succeed"
    },
    "Succeed": {
      "Type": "Succeed"
    },
    "AfterTimeout": {
      "Type": "Fail"
    },
    "AfterTaskFailed": {
      "Type": "Fail"
    }
  }
}

This is a description of your state machine using the Amazon States Language. It defines three Task states named CallTestFunction1, CallTestFunction2, and CallTestFunction3. Each calls one of your three Lambda functions. For more information, see State Machine Structure.

6. Specify a name for your state machine. To do this, choose the edit icon next to the default state machine name of MyStateMachine. Then, in State machine configuration, specify a name in the State machine name box. For this tutorial, enter the name TraceFunctions.
7. (Optional) In State machine configuration, specify other workflow settings, such as state machine type and its execution role. For this tutorial, under Additional configuration, choose Enable X-Ray tracing. Keep all the other default selections in State machine settings.
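If you create the state machine with the SDK instead of the console, X-Ray tracing is enabled through the tracingConfiguration parameter of the CreateStateMachine API. The following Boto3 sketch is illustrative only; the role ARN is a placeholder and the definition is abbreviated.

```python
# Hypothetical sketch: create the TraceFunctions state machine with X-Ray
# tracing enabled via the CreateStateMachine API. ARNs are placeholders.
import json

def build_request(name, definition, role_arn):
    # tracingConfiguration.enabled turns on X-Ray tracing for the machine,
    # matching the console's Enable X-Ray tracing checkbox.
    return {
        "name": name,
        "definition": json.dumps(definition),
        "roleArn": role_arn,
        "tracingConfiguration": {"enabled": True},
    }

def create(name, definition, role_arn):
    import boto3  # requires AWS credentials
    sfn = boto3.client("stepfunctions")
    return sfn.create_state_machine(**build_request(name, definition, role_arn))

if __name__ == "__main__":
    definition = {"StartAt": "CallTestFunction1", "States": {}}  # abbreviated
    print(create("TraceFunctions", definition,
                 "arn:aws:iam::123456789012:role/service-role/TraceFunctions-role"))
```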
If you've previously created an IAM role with the correct permissions for your state machine and want to use it, in Permissions, select Choose an existing role, and then select a role from the list. Or select Enter a role ARN and then provide an ARN for that IAM role.
8. In the Confirm role creation dialog box, choose Confirm to continue. You can also choose View role settings to go back to State machine configuration.

Note
If you delete the IAM role that Step Functions creates, Step Functions can't recreate it later. Similarly, if you modify the role (for example, by removing Step Functions from the principals in the IAM policy), Step Functions can't restore its original settings later.

Step 5: Run the state machine

State machine executions are instances where you run your workflow to perform tasks.

1. On the TraceFunctions page, choose Start execution. The New execution page is displayed.
2. In the Start execution dialog box, do the following:
   a. (Optional) Enter a custom execution name to override the generated default.
      Non-ASCII names and logging
      Step Functions accepts names for state machines, executions, activities, and labels that contain non-ASCII characters. Because such characters will not work with Amazon CloudWatch, we recommend using only ASCII characters so you can track metrics in CloudWatch.
   b. Choose Start execution.
   c. The Step Functions console directs you to a page that's titled with your execution ID. This page is known as the Execution Details page. On this page, you can review the execution results as the execution progresses or after it's complete. To review the execution results, choose individual states on the Graph view, and then choose the individual tabs on the Step details pane to view each state's details including input, output, and definition respectively.
For details about the execution information you can view on the Execution Details page, see Execution details overview.

Run several (at least three) executions.
3. After the executions have finished, follow the X-Ray trace map link. You can view the trace while an execution is still running, but you may want to see the execution results before viewing the X-Ray trace map.
4. View the service map to identify where errors are occurring, connections with high latency, or traces for requests that were unsuccessful. In this example, you can see how much traffic each function is receiving. TestFunction2 was called more often than TestFunction3, and TestFunction1 was called more than twice as often as TestFunction2.

   The service map indicates the health of each node by coloring it based on the ratio of successful calls to errors and faults:
   • Green for successful calls
   • Red for server faults (500 series errors)
   • Yellow for client errors (400 series errors)
   • Purple for throttling errors (429 Too Many Requests)

   You can also choose a service node to view requests for that node, or an edge between two nodes to view requests that traveled that connection.
5. View the X-Ray trace map to see what has happened for each execution. The Timeline view shows a hierarchy of segments and subsegments. The first entry in the list is the segment, which represents all data recorded by the service for a single request. Below the segment are subsegments.
This example shows subsegments recorded by the Lambda functions. For more information on understanding X-Ray traces and using X-Ray with Step Functions, see Trace Step Functions request data in AWS X-Ray.

Gather Amazon S3 bucket info using AWS SDK service integrations

This tutorial shows you how to perform an AWS SDK integration with Amazon Simple Storage Service. The state machine you create in this tutorial gathers information about your Amazon S3 buckets, then lists your buckets along with version information for each bucket in the current region.

Step 1: Create the state machine

Using the Step Functions console, you'll create a state machine that includes a Task state to list all the Amazon S3 buckets in the current account and region. Then, you'll add another Task state that invokes the HeadBucket API to verify whether the returned bucket is accessible in the current region. If the bucket isn't accessible, the HeadBucket API call returns the S3.S3Exception error. You'll include a Catch block to catch this exception and a Pass state as the fallback state.

1. Open the Step Functions console and choose Create state machine.
2. In the Choose a template dialog box, select Blank.
3. Choose Select to open Workflow Studio in Design mode.
4. For this tutorial, you'll write the Amazon States Language (ASL) definition of your state machine in the Code editor. To do this, choose Code.
5. Remove the existing boilerplate code and paste the following state machine definition.
{
  "Comment": "A description of my state machine",
  "StartAt": "ListBuckets",
  "States": {
    "ListBuckets": {
      "Type": "Task",
      "Parameters": {},
      "Resource": "arn:aws:states:::aws-sdk:s3:listBuckets",
      "Next": "Map"
    },
    "Map": {
      "Type": "Map",
      "ItemsPath": "$.Buckets",
      "ItemProcessor": {
        "ProcessorConfig": {
          "Mode": "INLINE"
        },
        "StartAt": "HeadBucket",
        "States": {
          "HeadBucket": {
            "Type": "Task",
            "ResultPath": null,
            "Parameters": {
              "Bucket.$": "$.Name"
            },
            "Resource": "arn:aws:states:::aws-sdk:s3:headBucket",
            "Catch": [
              {
                "ErrorEquals": ["S3.S3Exception"],
                "ResultPath": null,
                "Next": "Pass"
              }
            ],
            "Next": "GetBucketVersioning"
          },
          "GetBucketVersioning": {
            "Type": "Task",
            "End": true,
            "Parameters": {
              "Bucket.$": "$.Name"
            },
            "ResultPath": "$.BucketVersioningInfo",
            "Resource": "arn:aws:states:::aws-sdk:s3:getBucketVersioning"
          },
          "Pass": {
            "Type": "Pass",
            "End": true,
            "Result": {
              "Status": "Unknown"
            },
            "ResultPath": "$.BucketVersioningInfo"
          }
        }
      },
      "End": true
    }
  }
}

6. Specify a name for your state machine. To do this, choose the edit icon next to the default state machine name of MyStateMachine. Then, in State machine configuration, specify a name in the State machine name box. For this tutorial, enter the name Gather-S3-Bucket-Info-Standard.
7. (Optional) In State machine configuration, specify other workflow settings, such as state machine type and its execution role. Keep all the default selections in State machine settings.
If you've previously created an IAM role with the correct permissions for your state machine and want to use it, in Permissions, select Choose an existing role, and then select a role from the list. Or select Enter a role ARN and then provide an ARN for that IAM role.
8. In the Confirm role creation dialog box, choose Confirm to continue. You can also choose View role settings to go back to State machine configuration.

Note
If you delete the IAM role that Step Functions creates, Step Functions can't recreate it later. Similarly, if you modify the role (for example, by removing Step Functions from the principals in the IAM policy), Step Functions can't restore its original settings later. In Step 2, you'll add the missing permissions to the state machine role.

Step 2: Add the necessary IAM role permissions

To gather information about the Amazon S3 buckets in your current region, you must provide your state machine the necessary permissions to access the Amazon S3 buckets.

1. On the state machine page, choose IAM role ARN to open the Roles page for the state machine role.
2. Choose Add permissions and then choose Create inline policy.
3. Choose the JSON tab, and then paste the following permissions into the JSON editor.
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": [
        "s3:ListAllMyBuckets",
        "s3:ListBucket",
        "s3:GetBucketVersioning"
      ],
      "Resource": "*"
    }
  ]
}

4. Choose Review policy.
5. Under Review policy, for the policy Name, enter s3-bucket-permissions.
6. Choose Create policy.

Step 3: Run a Standard state machine execution

1. On the Gather-S3-Bucket-Info-Standard page, choose Start execution.
2. In the Start execution dialog box, do the following:
   a. (Optional) Enter a custom execution name to override the generated default.
      Non-ASCII names and logging: Step Functions accepts names for state machines, executions, activities, and labels that contain non-ASCII characters. Because such characters will not work with Amazon CloudWatch, we recommend using only ASCII characters so you can track metrics in CloudWatch.
   b. Choose Start execution.
   c. The Step Functions console directs you to a page that's titled with your execution ID. This page is known as the Execution Details page. On this page, you can review the execution results as the execution progresses or after it's complete.
      To review the execution results, choose individual states on the Graph view, and then choose the individual tabs on the Step details pane to view each state's details, including input, output, and definition. For details about the execution information you can view on the Execution Details page, see Execution details overview.

Step 4: Run an Express state machine execution

1. Create an Express state machine using the state machine definition provided in Step 1. Make sure that you also include the necessary IAM role permissions as explained in Step 2.
   Tip: To distinguish it from the Standard state machine you created earlier, name the Express state machine Gather-S3-Bucket-Info-Express.
2. On the Gather-S3-Bucket-Info-Express page, choose Start execution.
3. In the Start execution dialog box, do the following:
   a. (Optional) Enter a custom execution name to override the generated default.
      Non-ASCII names and logging: Step Functions accepts names for state machines, executions, activities, and labels that contain non-ASCII characters. Because such characters will not work with Amazon CloudWatch, we recommend using only ASCII characters so you can track metrics in CloudWatch.
   b. Choose Start execution.
   c. The Step Functions console directs you to a page that's titled with your execution ID. This page is known as the Execution Details page. On this page, you can review the execution results as the execution progresses or after it's complete.
      To review the execution results, choose individual states on the Graph view, and then choose the individual tabs on the Step details pane to view each state's details, including input, output, and definition. For details about the execution information you can view on the Execution Details page, see Execution details overview.

Continue long-running workflows using Step Functions API (recommended)

AWS Step Functions is designed to run workflows with a finite duration and number of steps. Standard workflow executions have a maximum duration of one year and 25,000 events (see Step Functions service quotas). For long-running executions, you can avoid reaching the hard quota by starting a new workflow execution from the Task state. To do this, break your workflows up into smaller state machines that continue ongoing work in a new execution. To start new workflow executions, call the StartExecution API action from your Task state and pass the necessary parameters. Step Functions can start workflow executions by calling its own API as an integrated service. We recommend this approach to avoid exceeding service quotas for long-running executions.

Step 1: Create a long-running state machine

Create a long-running state machine that you want to start from the Task state of a different state machine. For this tutorial, use the state machine that uses a Lambda function.

Note: Make sure to copy the name and Amazon Resource Name of this state machine in a text file for later use.

Step 2: Create a state machine to call the Step Functions API action

To start workflow executions from a Task state:

1. Open the Step Functions console and choose Create state machine.
2. In the Choose a template dialog box, select Blank.
3.
Choose Select to open Workflow Studio in Design mode.
4. From the Actions tab, drag the StartExecution API action and drop it on the empty state labeled Drag first state here.
5. Choose the StartExecution state and do the following in the Configuration tab in Design mode:
   a. Rename the state to Start nested execution.
   b. For Integration type, choose AWS SDK - new from the dropdown list.
   c. In API Parameters, do the following:
      i. For StateMachineArn, replace the sample Amazon Resource Name with the ARN of your state machine. For example, enter the ARN of the state machine that uses Lambda.
      ii. For the Input node, replace the existing placeholder text with the following value:
          "Comment": "Starting workflow execution using a Step Functions API action"
      iii. Make sure your inputs in API Parameters look similar to the following:

{
  "StateMachineArn": "arn:aws:states:us-east-2:123456789012:stateMachine:LambdaStateMachine",
  "Input": {
    "Comment": "Starting workflow execution using a Step Functions API action",
    "AWS_STEP_FUNCTIONS_STARTED_BY_EXECUTION_ID.$": "$$.Execution.Id"
  }
}

6. (Optional) Choose Definition on the Inspector panel to view the automatically generated Amazon States Language (ASL) definition of your workflow.
   Tip: You can also view the ASL definition in the Code editor of Workflow Studio. In the code editor, you can also edit the ASL definition of your workflow.
7. Specify a name for your state machine. To do this, choose the edit icon next to the default state machine name of MyStateMachine. Then, in State machine configuration, specify a name in the State machine name box. For this tutorial, enter the name ParentStateMachine.
8. (Optional) In State machine configuration, specify other workflow settings, such as state machine type and its execution role. For this tutorial, keep all the default selections in State machine settings.
   If you've previously created an IAM role with the correct permissions for your state machine and want to use it, in Permissions, select Choose an existing role, and then select a role from the list. Or select Enter a role ARN and then provide an ARN for that IAM role.
9. In the Confirm role creation dialog box, choose Confirm to continue. You can also choose View role settings to go back to State machine configuration.
   Note: If you delete the IAM role that Step Functions creates, Step Functions can't recreate it later. Similarly, if you modify the role (for example, by removing Step Functions from the principals in the IAM policy), Step Functions can't restore its original settings later.

Step 3: Update the IAM policy

To make sure your state machine has permissions to start the execution of the state machine that uses a Lambda function, you need to attach an inline policy to your state machine's IAM role. For more information, see Embedding Inline Policies in the IAM User Guide.

1. On the ParentStateMachine page, choose the IAM role ARN to navigate to the IAM Roles page for your state machine.
2. Assign an appropriate permission to the IAM role of the ParentStateMachine so that it can start the execution of another state machine. To assign the permission, do the following:
   a. On the IAM Roles page, choose Add permissions, and then choose Create inline policy.
   b. On the Create policy page, choose the JSON tab.
   c. Replace the existing text with the following policy.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "states:StartExecution"
      ],
      "Resource": [
        "arn:aws:states:region:123456789012:stateMachine:LambdaStateMachine"
      ]
    }
  ]
}

   d. Choose Review policy.
   e. Specify a name for the policy, and then choose Create policy.

Step 4: Run the state machine

State machine executions are instances where you run your workflow to perform tasks.

1. On the ParentStateMachine page, choose Start execution. The Start execution dialog box is displayed.
2. In the Start execution dialog box, do the following:
   a. (Optional) Enter a custom execution name to override the generated default.
      Non-ASCII names and logging: Step Functions accepts names for state machines, executions, activities, and labels that contain non-ASCII characters.
Because such characters will not work with Amazon CloudWatch, we recommend using only ASCII characters so you can track metrics in CloudWatch.
   b. (Optional) In the Input box, enter input values in JSON format to run your workflow.
   c. Choose Start execution.
   d. The Step Functions console directs you to a page that's titled with your execution ID. This page is known as the Execution Details page. On this page, you can review the execution results as the execution progresses or after it's complete.
      To review the execution results, choose individual states on the Graph view, and then choose the individual tabs on the Step details pane to view each state's details, including input, output, and definition. For details about the execution information you can view on the Execution Details page, see Execution details overview.
3. Open the LambdaStateMachine page and notice a new execution triggered by the ParentStateMachine.

Using a Lambda function to continue a new execution in Step Functions

Tip: The following approach uses a Lambda function to start a new workflow execution. We recommend using a Step Functions Task state to start new workflow executions. See how in the preceding tutorial, Continue long-running workflows using Step Functions API (recommended).

You can create a state machine that uses a Lambda function to start a new execution before the current execution terminates. With this approach to continuing ongoing work in a new execution, you can break large jobs into smaller workflows, or run a workflow indefinitely.

This tutorial builds on the concept of using an external Lambda function to modify your workflow, which was demonstrated in the Iterate a loop with a Lambda function in Step Functions tutorial. You use the same Lambda function (Iterator) to iterate a loop for a specific number of times.
In addition, you create another Lambda function to start a new execution of your workflow, and to decrement a count each time it starts a new execution. By setting the number of executions in the input, this state machine ends and restarts an execution a specified number of times.

The state machine you'll create implements the following states.

• ConfigureCount: A Pass state that configures the count, index, and step values that the Iterator Lambda function uses to step through iterations of work.
• Iterator: A Task state that references the Iterator Lambda function.
• IsCountReached: A Choice state that uses a Boolean value from the Iterator function to decide whether the state machine should continue the example work, or move to the ShouldRestart state.
• ExampleWork: A Pass state that represents the Task state that would perform work in an actual implementation.
• ShouldRestart: A Choice state that uses the executionCount value to decide whether it should end one execution and start another, or simply end.
• Restart: A Task state that uses a Lambda function to start a new execution of your state machine. Like the Iterator function, this function also decrements a count. The Restart state passes the decremented value of the count to the input of the new execution.

Prerequisites

Before you begin, go through the Creating a Step Functions state machine that uses Lambda tutorial to ensure that you're familiar with using Lambda and Step Functions together.

Step 1: Create a Lambda function to iterate a count

Note: If you have completed the Iterate a loop with a Lambda function in Step Functions tutorial, you can skip this step and use that Lambda function.

This section and the Iterate a loop with a Lambda function in Step Functions tutorial show how you can use a Lambda function to track a count, for example, the number of iterations of a loop in your state machine.

The following Lambda function receives input values for count, index, and step. It returns these values with an updated index and a Boolean named continue. The Lambda function sets continue to true if the index is less than count. Your state machine then implements a Choice state that executes some application logic if continue is true, or moves on to ShouldRestart if continue is false.
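The loop contract these states rely on can be checked locally before anything is deployed. The following is a simplified stand-in for the Iterator function and the IsCountReached loop, using a small count for readability:

```javascript
// Simplified local sketch of the Iterator/IsCountReached loop described
// above: bump index by step, and keep continue true while index < count.
function iterator({ index, step, count }) {
  index = index + step;
  return { index, step, count, continue: index < count };
}

// Like ConfigureCount, but with a small count for this sketch.
let state = { index: -1, step: 1, count: 3 };
let workRuns = 0;

state = iterator(state);          // Iterator
while (state.continue) {          // IsCountReached
  workRuns += 1;                  // ExampleWork
  state = iterator(state);        // back to Iterator
}
console.log(workRuns);            // 3
```

With count set to 3, the work state runs exactly three times before control falls through to ShouldRestart, which mirrors how the real state machine steps through its iterations.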
Create the Iterate Lambda function

1. Open the Lambda console, and then choose Create function.
2. On the Create function page, choose Author from scratch.
3. In the Basic information section, configure your Lambda function, as follows:
   a. For Function name, enter Iterator.
   b. For Runtime, choose Node.js 16.x.
   c. Keep all the default selections on the page, and then choose Create function.
   When your Lambda function is created, make a note of its Amazon Resource Name (ARN) in the upper-right corner of the page, for example:
   arn:aws:lambda:region:123456789012:function:Iterator
4. Copy the following code for the Lambda function into the Code source section of the Iterator page in the Lambda console.

exports.handler = function iterator(event, context, callback) {
  let index = event.iterator.index;
  let step = event.iterator.step;
  let count = event.iterator.count;

  index = index + step;

  callback(null, {
    index,
    step,
    count,
    continue: index < count
  });
}

   This code accepts input values for count, index, and step. It increments the index by the value of step and returns these values, and the Boolean value of continue. The value of continue is true if index is less than count.
5. Choose Deploy to deploy the code.

Test the Iterate Lambda function

To see your Iterate function working, run it with numeric values. You can provide input values for your Lambda function that mimic an iteration to see what output you get with specific input values.

To test your Lambda function

1. In the Configure test event dialog box, choose Create new test event, and then type TestIterator for Event name.
2. Replace the example data with the following.

{
  "Comment": "Test my Iterator function",
  "iterator": {
    "count": 10,
    "index": 5,
    "step": 1
  }
}

   These values mimic what would come from your state machine during an iteration. The Lambda function increments the index and returns continue as true.
When the index is not less than the count, it returns continue as false. For this test, the index has already been incremented to 5. The results should increment the index to 6 and set continue to true.

3. Choose Create.
4. On the Iterator page in your Lambda console, be sure TestIterator is listed, and then choose Test. The results of the test are displayed at the top of the page. Choose Details and review the result.

{
  "index": 6,
  "step": 1,
  "count": 10,
  "continue": true
}

Note: If you set index to 9 for this test, the index increments to 10, and continue is false.

Step 2: Create a Restart Lambda function to start a new Step Functions execution

1. Open the Lambda console, and then choose Create function.
2. On the Create function page, choose Author from scratch.
3. In the Basic information section, configure your Lambda function, as follows:
   a. For Function name, enter Restart.
   b. For Runtime, choose Node.js 16.x.
4. Keep all the default selections on the page, and then choose Create function. When your Lambda function is created, make a note of its Amazon Resource Name (ARN) in the upper-right corner of the page, for example:
   arn:aws:lambda:region:123456789012:function:Restart
5. Copy the following code for the Lambda function into the Code source section of the Restart page in the Lambda console. The following code decrements a count of the number of executions, and starts a new execution of your state machine, including the decremented value.

var aws = require('aws-sdk');
var sfn = new aws.StepFunctions();

exports.restart = function(event, context, callback) {
  let StateMachineArn = event.restart.StateMachineArn;
  event.restart.executionCount -= 1;
  event = JSON.stringify(event);
  let params = {
    input: event,
    stateMachineArn: StateMachineArn
  };
  sfn.startExecution(params, function(err, data) {
    if (err) callback(err);
    else callback(null, event);
  });
}

6. Choose Deploy to deploy the code.

Step 3: Create a state machine

Now that you've created your two Lambda functions, create a state machine.
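You can also exercise the Restart function's bookkeeping offline before wiring it into a state machine. The following sketch uses a stub in place of the real AWS SDK client (the stub and the local restart wrapper are assumptions for testing only, not part of the deployed function):

```javascript
// Records StartExecution calls instead of contacting AWS.
const capturedStarts = [];
const sfnStub = {
  startExecution(params, cb) {
    capturedStarts.push(params);
    cb(null, {});
  }
};

// Same bookkeeping as the Restart function above, wired to the stub.
function restart(event, callback) {
  let StateMachineArn = event.restart.StateMachineArn;
  event.restart.executionCount -= 1;       // decrement before restarting
  const input = JSON.stringify(event);
  sfnStub.startExecution({ input, stateMachineArn: StateMachineArn },
    (err) => err ? callback(err) : callback(null, input));
}

restart(
  { restart: { StateMachineArn: 'arn:aws:states:us-east-2:123456789012:stateMachine:ContinueAsNew',
               executionCount: 4 } },
  () => {}
);

const nextInput = JSON.parse(capturedStarts[0].input);
console.log(nextInput.restart.executionCount); // 3
```

Starting from an executionCount of 4, the next execution's input carries 3, which is exactly the handoff the Restart state performs between executions.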
In this state machine, the ShouldRestart and Restart states are how you break your work across multiple executions.

Example: ShouldRestart Choice state

The following excerpt shows the ShouldRestart Choice state. This state determines whether or not you should restart the execution.

"ShouldRestart": {
  "Type": "Choice",
  "Choices": [
    {
      "Variable": "$.restart.executionCount",
      "NumericGreaterThan": 1,
      "Next": "Restart"
    }
  ],

The $.restart.executionCount value is included in the input of the initial execution. It's decremented by one each time the Restart function is called, and then placed into the input for each subsequent execution.

Example: Restart Task state

The following excerpt shows the Restart Task state. This state uses the Lambda function you created earlier to restart the execution, and to decrement the count to track the remaining number of executions to start.

"Restart": {
  "Type": "Task",
  "Resource": "arn:aws:lambda:region:123456789012:function:Restart",
  "Next": "Done"
},

To create the state machine

1. Open the Step Functions console and choose Create state machine.
   Important: Make sure that your state machine is under the same AWS account and Region as the Lambda functions you created earlier in Step 1 and Step 2.
2. In the Choose a template dialog box, select Blank.
3. Choose Select to open Workflow Studio in Design mode.
4. For this tutorial, you'll write the Amazon States Language (ASL) definition of your state machine in the Code editor. To do this, choose Code.
5. Remove the existing boilerplate code and paste the following code. Remember to replace the ARNs in this code with the ARNs of the Lambda functions you created.
{
  "Comment": "Continue-as-new State Machine Example",
  "StartAt": "ConfigureCount",
  "States": {
    "ConfigureCount": {
      "Type": "Pass",
      "Result": {
        "count": 100,
        "index": -1,
        "step": 1
      },
      "ResultPath": "$.iterator",
      "Next": "Iterator"
    },
    "Iterator": {
      "Type": "Task",
      "Resource": "arn:aws:lambda:region:123456789012:function:Iterator",
      "ResultPath": "$.iterator",
      "Next": "IsCountReached"
    },
    "IsCountReached": {
      "Type": "Choice",
      "Choices": [
        {
          "Variable": "$.iterator.continue",
          "BooleanEquals": true,
          "Next": "ExampleWork"
        }
      ],
      "Default": "ShouldRestart"
    },
    "ExampleWork": {
      "Comment": "Your application logic, to run a specific number of times",
      "Type": "Pass",
      "Result": {
        "success": true
      },
      "ResultPath": "$.result",
      "Next": "Iterator"
    },
    "ShouldRestart": {
      "Type": "Choice",
      "Choices": [
        {
          "Variable": "$.restart.executionCount",
          "NumericGreaterThan": 1,
          "Next": "Restart"
        }
      ],
      "Default": "Done"
    },
    "Restart": {
      "Type": "Task",
      "Resource": "arn:aws:lambda:region:123456789012:function:Restart",
      "Next": "Done"
    },
    "Done": {
      "Type": "Pass",
      "End": true
    }
  }
}

6. Specify a name for your state machine. To do this, choose the edit icon next to the default state machine name of MyStateMachine. Then, in State machine configuration, specify a name in the State machine name box. For this tutorial, enter the name ContinueAsNew.
7. (Optional) In State machine configuration, specify other workflow settings, such as state machine type and its execution role. For this tutorial, keep all the default selections in State machine settings.
   If you've previously created an IAM role with the correct permissions for your state machine and want to use it, in Permissions, select Choose an existing role, and then select a role from the list. Or select Enter a role ARN and then provide an ARN for that IAM role.
8. In the Confirm role creation dialog box, choose Confirm to continue. You can also choose View role settings to go back to State machine configuration.
   Note: If you delete the IAM role that Step Functions creates, Step Functions can't recreate it later. Similarly, if you modify the role (for example, by removing Step Functions from the principals in the IAM policy), Step Functions can't restore its original settings later.
9. Save the Amazon Resource Name (ARN) of this state machine in a text file. You'll need to provide the ARN while providing permission to the Lambda function to start a new Step Functions execution.

Step 4: Update the IAM Policy

To make sure your Lambda function has permissions to start a new Step Functions execution, attach an inline policy to the IAM role you use for your Restart Lambda function. For more information, see Embedding Inline Policies in the IAM User Guide.

Note: You can update the Resource line in the following policy to reference the ARN of your ContinueAsNew state machine.
This restricts the policy so that it can only start an execution of that specific state machine.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "VisualEditor0",
      "Effect": "Allow",
      "Action": [
        "states:StartExecution"
      ],
      "Resource": "arn:aws:states:us-east-2:account-id:stateMachine:ContinueAsNew"
    }
  ]
}

Step 5: Run the state machine

To start an execution, provide input that includes the ARN of the state machine and an executionCount for how many times it should start a new execution.

1. On the ContinueAsNew page, choose Start execution. The Start execution dialog box is displayed.
2. In the Start execution dialog box, do the following:
   a. (Optional) Enter a custom execution name to override the generated default.
      Non-ASCII names and logging: Step Functions accepts names for state machines, executions, activities, and labels that contain non-ASCII characters. Because such characters will not work with Amazon CloudWatch, we recommend using only ASCII characters so you can track metrics in CloudWatch.
   b. In the Input box, enter the following JSON input to run your workflow.

{
  "restart": {
    "StateMachineArn": "arn:aws:states:region:account-id:stateMachine:ContinueAsNew",
    "executionCount": 4
  }
}

   c. Update the StateMachineArn field with the ARN for your ContinueAsNew state machine.
   d. Choose Start execution.
   e. The Step Functions console directs you to a page that's titled with your execution ID. This page is known as the Execution Details page. On this page, you can review the execution results as the execution progresses or after it's complete.
      To review the execution results, choose individual states on the Graph view, and then choose the individual tabs on the Step details pane to view each state's details, including input, output, and definition. For details about the execution information you can view on the Execution Details page, see Execution details overview.
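The arithmetic behind the number of executions can be checked in a few lines, assuming the NumericGreaterThan 1 threshold shown in the ShouldRestart excerpt:

```javascript
// Each execution checks ShouldRestart; while executionCount is greater
// than 1, the Restart function decrements it and starts the next run.
let executionCount = 4;  // the value in the input above
let executions = 1;      // the execution you start by hand
while (executionCount > 1) {
  executionCount -= 1;   // done by the Restart Lambda function
  executions += 1;       // the new execution it starts
}
console.log(executions); // 4
```

An initial executionCount of 4 therefore yields exactly four executions, with the last one receiving an executionCount of 1 and ending at Done.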
The Graph view displays the first of the four executions. Before it completes, it will pass through the Restart state and start a new execution.

As this execution completes, you can look at the next execution that's running. Select the ContinueAsNew link at the top to see the list of executions. You should see both the recently closed execution, and an ongoing execution that the Restart Lambda function started.

When all the executions are complete, you should see four successful executions in the list. The first execution that was started displays the name you chose, and subsequent executions have a generated name.

Accessing cross-account AWS resources in Step Functions

With the cross-account access support in Step Functions, you can share resources configured in different AWS accounts. In this tutorial, we walk you through the process of accessing a cross-account Lambda function defined in an account called Production. This function is invoked from a state machine in an account called Development. In this tutorial, the Development account is referred to as the source account, and the Production account is the target account containing the target IAM role.

To start, in your Task state's definition, you specify the target IAM role the state machine must assume before invoking the cross-account Lambda function. Then, modify the trust policy in the target IAM role to allow the source account to assume the target role temporarily. Also, to call the AWS resource, define the appropriate permissions in the target IAM role. Finally, update the source account's execution role to specify the required permission to assume the target role.

You can configure your state machine to assume an IAM role for accessing resources from multiple AWS accounts. However, a state machine can assume only one IAM role at a given time based on the Task state's definition.

Note: Currently, cross-Region AWS SDK integration and cross-Region AWS resource access aren't available in Step Functions.

Prerequisites

• This tutorial uses the example of a Lambda function for demonstrating how to set up cross-account access. You can use any other AWS resource, but make sure you've configured the resource in a different account.
  Important: IAM roles and resource-based policies delegate access across accounts only within a single partition. For example, assume that you have an account in US West (N. California) in the standard aws partition. You also have an account in China (Beijing) in the aws-cn partition. You can't use an Amazon S3 resource-based policy in your account in China (Beijing) to allow access for users in your standard aws account.

• Make a note of the cross-account resource's Amazon Resource Name (ARN) in a text file. Later in this tutorial, you'll provide this ARN in your state machine's Task state definition. The following is an example of a Lambda function ARN:
  arn:aws:lambda:us-east-2:account-id:function:functionName

• Make sure you've created the target IAM role that the state machine needs to assume.

Step 1: Update the Task state definition to specify the target role

In the Task state of your workflow, add a Credentials field containing the identity the state machine must assume before invoking the cross-account Lambda function. The following procedure demonstrates how to access a cross-account Lambda function called Echo. You can call any AWS resource by following these steps.

1. Open the Step Functions console and choose Create state machine.
2. On the Choose authoring method page, choose Design your workflow visually and keep all the default selections.
3. To open Workflow Studio, choose Next.
4. On the Actions tab, drag and drop a Task state on the canvas. This invokes the cross-account Lambda function that's using this Task state.
5. On the Configuration tab, do the following:
   a. Rename the state to Cross-account call.
   b. For Function name, choose Enter function name, and then enter the Lambda function ARN in the box. For example, arn:aws:lambda:us-east-2:111122223333:function:Echo.
   c. For Provide IAM role ARN, specify the target IAM role ARN. For example, arn:aws:iam::111122223333:role/LambdaRole.
Tip: Alternatively, you can specify a reference path to an existing key-value pair in the state's JSON input that contains the IAM role ARN. To do this, choose Get IAM role ARN at runtime from state input. For an example of specifying a value by using a reference path, see Specifying JSONPath as IAM role ARN.

6. Choose Next.
7. On the Review generated code page, choose Next.
8. On the Specify state machine settings page, specify details for the new state machine, such as a name, permissions, and logging level.
9. Choose Create state machine.
10. Make a note of the state machine's IAM role ARN and the state machine ARN in a text file. You'll need to provide these ARNs in the target account's trust policy.

Your Task state definition should now look similar to the following definition.

{
  "StartAt": "Cross-account call",
  "States": {
    "Cross-account call": {
      "Type": "Task",
      "Resource": "arn:aws:states:::lambda:invoke",
      "Credentials": {
        "RoleArn": "arn:aws:iam::111122223333:role/LambdaRole"
      },
      "Parameters": {
        "FunctionName": "arn:aws:lambda:us-east-2:111122223333:function:Echo"
      },
      "End": true
    }
  }
}
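The runtime-role option mentioned in the Tip can be expressed in the definition by replacing the static RoleArn with a JSONPath reference. Here is a minimal sketch in Python, building the same definition as a dict; the $.roleArn input key is an assumed example:

```python
import json

# Sketch: a Task state whose Credentials.RoleArn is resolved at runtime
# from the state input via a JSONPath reference. The ".$" suffix tells
# Step Functions to read the value from the execution input, e.g.
# {"roleArn": "arn:aws:iam::111122223333:role/LambdaRole"}.
definition = {
    "StartAt": "Cross-account call",
    "States": {
        "Cross-account call": {
            "Type": "Task",
            "Resource": "arn:aws:states:::lambda:invoke",
            "Credentials": {
                "RoleArn.$": "$.roleArn"  # assumed input key for illustration
            },
            "Parameters": {
                "FunctionName": "arn:aws:lambda:us-east-2:111122223333:function:Echo"
            },
            "End": True
        }
    }
}
print(json.dumps(definition, indent=2))
```

With this variant, each execution can target a different account's role by passing a different roleArn in its input.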
Step 2: Update the target role's trust policy

The IAM role must exist in the target account, and you must modify its trust policy to allow the source account to assume this role temporarily. Additionally, you can control who can assume the target IAM role. After you create the trust relationship, a user from the source account can use the AWS Security Token Service (AWS STS) AssumeRole API operation. This operation provides temporary security credentials that enable access to AWS resources in a target account.

1. Open the IAM console at https://console.aws.amazon.com/iam/.
2. On the navigation pane of the console, choose Roles and then use the Search box to search for the target IAM role. For example, LambdaRole.
3. Choose the Trust relationships tab.
4. Choose Edit trust policy and paste the following trust policy. Make sure to replace the AWS account number and IAM role ARN. The sts:ExternalId field further controls who can assume the role. The state machine's name must include only characters that the AWS Security Token Service AssumeRole API supports. For more information, see AssumeRole in the AWS Security Token Service API Reference.
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "sts:AssumeRole",
      "Principal": {
        "AWS": "arn:aws:iam::account-id:role/ExecutionRole" // The source account's state machine execution role ARN
      },
      "Condition": { // Control which account and state machine can assume the target IAM role
        "StringEquals": {
          "sts:ExternalId": "arn:aws:states:region:account-id:stateMachine:testCrossAccount" // ARN of the state machine that will assume the role
        }
      }
    }
  ]
}

5. Keep this window open and proceed to the next step for further actions.

Step 3: Add the required permission in the target role

Permissions in the IAM policies determine whether a specific request is allowed or denied. The target IAM role must have the correct permission to invoke the Lambda function.

1. Choose the Permissions tab.
2. Choose Add permissions and then choose Create inline policy.
3. Choose the JSON tab and replace the existing content with the following permission. Make sure to replace your Lambda function ARN.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "lambda:InvokeFunction",
      "Resource": "arn:aws:lambda:us-east-2:111122223333:function:Echo" // The cross-account AWS resource being accessed
    }
  ]
}

4. Choose Review policy.
5. On the Review policy page, enter a name for the permission, and then choose Create policy.

Step 4: Add permission in execution role to assume the target role

Step Functions doesn't automatically generate the AssumeRole policy for all cross-account service integrations. You must add the required permission in the state machine's execution role to allow it to assume a target IAM role in one or more AWS accounts.

1. Open your state machine's execution role in the IAM console at https://console.aws.amazon.com/iam/. To do this:
a.
Open the state machine that you created in Step 1 in the source account.
b. On the State machine detail page, choose IAM role ARN.
2. On the Permissions tab, choose Add permissions and then choose Create inline policy.
3. Choose the JSON tab and replace the existing content with the following permission. Make sure to replace your Lambda function ARN.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "sts:AssumeRole",
      "Resource": "arn:aws:iam::111122223333:role/LambdaRole" // The target role to be assumed
    }
  ]
}

4. Choose Review policy.
5. On the Review policy page, enter a name for the permission, and then choose Create policy.

Workshops for learning Step Functions

Workshop: The Step Functions Workshop

In this workshop, you will learn to use the primary features of Step Functions while building workflows. A series of interactive modules starts by introducing you to basic workflows, task states, and error handling. You can continue to learn choice states for branch logic, map states for processing arrays, and parallel states for running multiple branches in parallel.

Workshop: Large-scale Data Processing with Step Functions

Learn how serverless technologies such as Step Functions and Lambda can simplify management and scaling, offload undifferentiated tasks, and address the challenges of large-scale distributed data processing. Along the way, you will work with distributed map for high concurrency
processing. The workshop also presents best practices for optimizing your workflows, and practical use cases for claims processing, vulnerability scanning, and Monte Carlo simulation.

Deploy a state machine using a starter template for Step Functions

To deploy state machines for a variety of example use cases and patterns, you can choose one of the following starter templates in the AWS Step Functions console. These starter templates are ready-to-run sample projects that automatically create the workflow prototype and definition, and all related AWS resources for the project. You can use these sample projects to deploy and run them as is, or use the workflow prototypes to build on them. If you build upon these projects, Step Functions creates the workflow prototype, but doesn't deploy the resources listed in the workflow definition. When you deploy the sample projects, they provision a fully functional state machine and create the related resources for the state machine to run. When you create a sample project, Step Functions uses AWS CloudFormation to create the related resources referenced by the state machine.
List of starter templates

• Manage a container task with Amazon ECS and Amazon SNS
• Transfer data records with Lambda, DynamoDB, and Amazon SQS
• Poll for job status with Lambda and AWS Batch
• Create a task timer with Lambda and Amazon SNS
• Create a callback pattern example with Amazon SQS, Amazon SNS, and Lambda
• Manage an Amazon EMR job
• Run an EMR Serverless job
• Start a workflow within a workflow with Step Functions and Lambda
• Process data from a queue with a Map state in Step Functions
• Process a CSV file from Amazon S3 using a Distributed Map
• Process data in an Amazon S3 bucket with Distributed Map
• Train a machine learning model using Amazon SageMaker AI
• Tune the hyperparameters of a machine learning model in SageMaker AI
• Perform AI prompt-chaining with Amazon Bedrock
• Process high-volume messages from Amazon SQS with Step Functions Express workflows
• Perform selective checkpointing using Standard and Express workflows
• Build an AWS CodeBuild project using Step Functions
• Preprocess data and train a machine learning model with Amazon SageMaker AI
• Orchestrate AWS Lambda functions with Step Functions
• Start an Athena query and send a results notification
• Execute queries in sequence and parallel using Athena
• Query large datasets using an AWS Glue crawler
• Keep data in a target table updated with AWS Glue and Athena
• Create and manage an Amazon EKS cluster with a node group
• Interact with an API managed by API Gateway
• Call a microservice running on Fargate using API Gateway integration
• Send a custom event to an EventBridge event bus
• Invoke Synchronous Express Workflows through API Gateway
• Run an ETL/ELT workflow using Step Functions and the Amazon Redshift API
• Manage a batch job with AWS Batch and Amazon SNS
• Fan out batch jobs with Map state
• Run an AWS Batch job with Lambda

Manage a container task with Amazon ECS and Amazon SNS

This sample project demonstrates how to run an AWS
Fargate task, and then send an Amazon SNS notification based on whether that job succeeds or fails. Deploying this sample project will create an AWS Step Functions state machine, a Fargate cluster, and an Amazon SNS topic.

In this project, Step Functions uses a state machine to call the Fargate task synchronously. It then waits for the task to succeed or fail, and it publishes a message to an Amazon SNS topic about whether the job succeeded or failed.

Step 1: Create the state machine

1. Open the Step Functions console and choose Create state machine.
2. Choose Create from template and find the related starter template. Choose Next to continue.
3. Choose how to use the template:
a. Run a demo – creates a read-only state machine. After review, you can create the workflow and all related resources.
b. Build on it – provides an editable workflow
definition that you can review, customize, and deploy with your own resources. (Related resources, such as functions or queues, will not be created automatically.)
4. Choose Use template to continue with your selection.

Note: Standard charges apply for services deployed to your account.

Step 2: Run the demo state machine

If you chose the Run a demo option, all related resources will be deployed and ready to run. If you chose the Build on it option, you might need to set placeholder values and create additional resources before you can run your custom workflow.

1. Choose Deploy and run.
2. Wait for the AWS CloudFormation stack to deploy. This can take up to 10 minutes.
3. After the Start execution option appears, review the Input and choose Start execution.

Congratulations! You should now have a running demo of your state machine. You can choose states in the Graph view to review input, output, variables, definition, and events.

Transfer data records with Lambda, DynamoDB, and Amazon SQS

This sample project demonstrates how to iteratively read items from an Amazon DynamoDB table and send these items to an Amazon SQS queue using a Step Functions state machine. Deploying this sample project will create a Step Functions state machine, a DynamoDB table, an AWS Lambda function, and an Amazon SQS queue.
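Before deploying, it can help to see the read-and-forward pattern in plain Python, with no AWS calls. This sketch uses stand-in records and groups them into batches of up to 10 entries, the maximum for an SQS SendMessageBatch request; the helper name and sample data are hypothetical:

```python
# Sketch of the transfer pattern (no AWS calls): read table items,
# then hand them to a queue in SendMessageBatch-sized chunks of 10.
# The item data below stands in for rows the Lambda function would
# write to the DynamoDB table.

def batch_entries(items, batch_size=10):
    """Group items into chunks of at most batch_size entries."""
    return [items[i:i + batch_size] for i in range(0, len(items), batch_size)]

table_items = [{"Id": str(n), "MessageBody": f"record-{n}"} for n in range(25)]
batches = batch_entries(table_items)
print(len(batches))  # 3 batches: 10 + 10 + 5 entries
```

In the sample project itself, the state machine loops over entries one at a time; batching is an optional optimization when you build on it.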
In this project, Step Functions uses the Lambda function to populate the DynamoDB table. The state machine also uses a for loop to read each of the entries, and then sends each entry to an Amazon SQS queue.

Step 1: Create the state machine

1. Open the Step Functions console and choose Create state machine.
2. Choose Create from template and find the related starter template. Choose Next to continue.
3. Choose how to use the template:
a. Run a demo – creates a read-only state machine. After review, you can create the workflow and all related resources.
b. Build on it – provides an editable workflow definition that you can review, customize, and deploy with your own resources. (Related resources, such as functions or queues, will not be created automatically.)
4. Choose Use template to continue with your selection.

Note: Standard charges apply for services deployed to your account.

Step 2: Run the demo state machine

If you chose the Run a demo option, all related resources will be deployed and ready to run. If you chose the Build on it option, you might need to set placeholder values and create additional resources before you can run your custom workflow.

1. Choose Deploy and run.
2. Wait for the AWS CloudFormation stack to deploy. This can take up to 10 minutes.
3. After the Start execution option appears, review the Input and choose Start execution.

Congratulations! You should now have a running demo of your state machine. You can choose states in the Graph view to review input, output, variables, definition, and events.

Poll for job status with Lambda and AWS Batch

This sample project creates an AWS Batch job poller. It implements an AWS Step Functions state machine that uses AWS Lambda to create a Wait state loop that checks on an AWS Batch job.
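The poller's control flow (Wait, check status, then a Choice on the result) can be sketched in plain Python. Here get_status stands in for the Lambda function that would call the AWS Batch DescribeJobs API, and the status sequence is fabricated for illustration:

```python
# Plain-Python sketch of the Wait-state polling loop. In the real
# workflow, the pause happens in a Wait state (e.g. 30 seconds) and
# the status check is a Lambda task; here both are simulated.

def poll_until_done(get_status, max_polls=100):
    """Loop until the job reaches a terminal state."""
    for _ in range(max_polls):
        status = get_status()
        if status in ("SUCCEEDED", "FAILED"):
            return status
        # A real workflow would pause here in a Wait state before retrying.
    raise TimeoutError("job did not reach a terminal state")

# Fabricated status sequence, mimicking an AWS Batch job lifecycle.
fake_statuses = iter(["SUBMITTED", "RUNNABLE", "RUNNING", "SUCCEEDED"])
result = poll_until_done(lambda: next(fake_statuses))
print(result)  # SUCCEEDED
```

The Choice state in the sample project plays the role of the `if status in (...)` test, routing to a success or failure state.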
This sample project creates and configures all resources so that your Step Functions workflow will submit an AWS Batch job, and will wait for that job to complete before ending successfully.

Note: You can also implement this pattern without using a Lambda function. For information about controlling AWS Batch directly, see Integrating services with Step Functions.

This sample project creates the state machine, two Lambda functions, and an AWS Batch queue, and configures the related IAM permissions. For more information about how AWS Step Functions can control other AWS services, see Integrating services with Step Functions.

Step 1: Create the state machine

1. Open the Step Functions console and choose Create state machine.
2. Choose Create from template and find the related starter template. Choose Next to continue.
3. Choose how to use the template:
a. Run a demo – creates a read-only state machine. After review, you can create
the workflow and all related resources.
b. Build on it – provides an editable workflow definition that you can review, customize, and deploy with your own resources. (Related resources, such as functions or queues, will not be created automatically.)
4. Choose Use template to continue with your selection.

Note: Standard charges apply for services deployed to your account.

Step 2: Run the demo state machine

If you chose the Run a demo option, all related resources will be deployed and ready to run. If you chose the Build on it option, you might need to set placeholder values and create additional resources before you can run your custom workflow.

1. Choose Deploy and run.
2. Wait for the AWS CloudFormation stack to deploy. This can take up to 10 minutes.
3. After the Start execution option appears, review the Input and choose Start execution.

Congratulations! You should now have a running demo of your state machine. You can choose states in the Graph view to review input, output, variables, definition, and events.

Create a task timer with Lambda and Amazon SNS

This sample project creates a task timer. It implements an AWS Step Functions state machine with a Wait state, and uses an AWS Lambda function that sends an Amazon Simple Notification Service (Amazon SNS) notification.
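At its core, the timer is a Wait state followed by a notification task. The following is a minimal sketch of such a definition as a Python dict; the topic ARN is a placeholder, and this variant publishes to Amazon SNS directly through the optimized service integration instead of through a Lambda function:

```python
import json

# Minimal timer sketch (placeholder ARNs): pause for a fixed time,
# then publish directly to an Amazon SNS topic.
timer_definition = {
    "StartAt": "WaitForTimer",
    "States": {
        "WaitForTimer": {
            "Type": "Wait",
            "Seconds": 900,  # pause for 15 minutes
            "Next": "SendTimerNotification"
        },
        "SendTimerNotification": {
            "Type": "Task",
            "Resource": "arn:aws:states:::sns:publish",
            "Parameters": {
                "TopicArn": "arn:aws:sns:us-east-2:111122223333:TaskTimerTopic",
                "Message": "Timer expired"
            },
            "End": True
        }
    }
}
print(json.dumps(timer_definition, indent=2))
```

A Wait state can also pause until an absolute time (Timestamp) or a value read from the input (SecondsPath, TimestampPath) instead of a fixed Seconds value.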
A Wait state is a state type that pauses the workflow for a specified time before continuing.

Note: This sample project implements an AWS Lambda function to send an Amazon Simple Notification Service (Amazon SNS) notification. You can also send an Amazon SNS notification directly from the Amazon States Language. See Integrating services with Step Functions.

This sample project creates the state machine, a Lambda function, and an Amazon SNS topic, and configures the related AWS Identity and Access Management (IAM) permissions. For more information about the resources that are created with the Task Timer sample project, see the following:

• AWS CloudFormation User Guide
• Amazon Simple Notification Service Developer Guide
• AWS Lambda Developer Guide
• IAM Getting Started Guide

For more information about how AWS Step Functions can control other AWS services, see Integrating services with Step Functions.

Step 1: Create the state machine

1. Open the Step Functions console and choose Create state machine.
2. Choose Create from template and find the related starter template. Choose Next to continue.
3. Choose how to use the template:
a. Run a demo – creates a read-only state machine. After review, you can create the workflow and all related resources.
b. Build on it – provides an editable workflow definition that you can review, customize, and deploy with your own resources. (Related resources, such as functions or queues, will not be created automatically.)
4. Choose Use template to continue with your selection.

Note: Standard charges apply for services deployed to your account.

Step 2: Run the demo state machine

If you chose the Run a demo option, all related resources will be deployed and ready to run. If you chose the Build on it option, you might need to set placeholder values and create additional resources before you can run your custom workflow.

1. Choose Deploy and run.
2.
Wait for the AWS CloudFormation stack to deploy. This can take up to 10 minutes.
3. After the Start execution option appears, review the Input and choose Start execution.

Congratulations! You should now have a running demo of your state machine. You can choose states in the Graph view to review input, output, variables, definition, and events.

Create a callback pattern example with Amazon SQS, Amazon SNS, and Lambda

This sample project demonstrates how to have AWS Step Functions pause during a task, and wait for an external process to return a task token that was generated when the task started. To learn how to implement the callback pattern in Step Functions, see Wait for a Callback with Task Token. For more information about how AWS Step Functions can control other AWS services, see Integrating services
with Step Functions.

Step 1: Create the state machine

1. Open the Step Functions console and choose Create state machine.
2. Choose Create from template and find the related starter template. Choose Next to continue.
3. Choose how to use the template:
a. Run a demo – creates a read-only state machine. After review, you can create the workflow and all related resources.
b. Build on it – provides an editable workflow definition that you can review, customize, and deploy with your own resources. (Related resources, such as functions or queues, will not be created automatically.)
4. Choose Use template to continue with your selection.

Note: Standard charges apply for services deployed to your account.

Step 2: Run the demo state machine

If you chose the Run a demo option, all related resources will be deployed and ready to run. If you chose the Build on it option, you might need to set placeholder values and create additional resources before you can run your custom workflow.

1. Choose Deploy and run.
2. Wait for the AWS CloudFormation stack to deploy. This can take up to 10 minutes.
3. After the Start execution option appears, review the Input and choose Start execution.

Congratulations! You should now have a running demo of your state machine.
You can choose states in the Graph view to review input, output, variables, definition, and events.

Manage an Amazon EMR job

This sample project demonstrates Amazon EMR and AWS Step Functions integration. The project creates an Amazon EMR cluster, adds multiple steps and runs them, and then terminates the cluster.

Important: Amazon EMR does not have a free pricing tier. Running the sample project will incur costs. You can find pricing information on the Amazon EMR pricing page. The availability of Amazon EMR service integration is subject to the availability of Amazon EMR APIs. Because of this, this sample project might not work correctly in some AWS Regions. See the Amazon EMR documentation for limitations in special Regions.

Step 1: Create the state machine

1. Open the Step Functions console and choose Create state machine.
2. Choose Create from template and find the related starter template. Choose Next to continue.
3. Choose how to use the template:
a. Run a demo – creates a read-only state machine. After review, you can create the workflow and all related resources.
b. Build on it – provides an editable workflow definition that you can review, customize, and deploy with your own resources. (Related resources, such as functions or queues, will not be created automatically.)
4. Choose Use template to continue with your selection.

Note: Standard charges apply for services deployed to your account.

Step 2: Run the demo state machine

If you chose the Run a demo option, all related resources will be deployed and ready to run. If you chose the Build on it option, you might need to set placeholder values and create additional resources before you can run your custom workflow.

1. Choose Deploy and run.
2. Wait for the AWS CloudFormation stack to deploy. This can take up to 10 minutes.
3. After the Start execution option appears, review the Input and choose Start execution.

Congratulations!
You should now have a running demo of your state machine. You can choose states in the Graph view to review input, output, variables, definition, and events.

Run an EMR Serverless job

This sample project demonstrates how to create and start an EMR Serverless application and run multiple jobs within it. This sample project creates the state machine, the supporting AWS resources, and configures the related IAM permissions. Explore this sample project to learn about running EMR Serverless jobs using Step Functions state machines, or use it as a starting point for your own projects.

Important: EMR Serverless does not have a free pricing tier. Running the sample project will incur costs. You can find pricing information on the Amazon EMR Serverless pricing page. In addition, the availability of EMR Serverless service integration is subject to the availability of EMR Serverless APIs. Because of this, this sample project might not
work correctly or be available in some AWS Regions. See the Other considerations topic for information about availability of EMR Serverless in AWS Regions.

Step 1: Create the state machine

1. Open the Step Functions console and choose Create state machine.
2. Choose Create from template and find the related starter template. Choose Next to continue.
3. Choose how to use the template:
a. Run a demo – creates a read-only state machine. After review, you can create the workflow and all related resources.
b. Build on it – provides an editable workflow definition that you can review, customize, and deploy with your own resources. (Related resources, such as functions or queues, will not be created automatically.)
4. Choose Use template to continue with your selection.

Note: Standard charges apply for services deployed to your account.

Step 2: Run the demo state machine

If you chose the Run a demo option, all related resources will be deployed and ready to run. If you chose the Build on it option, you might need to set placeholder values and create additional resources before you can run your custom workflow.

1. Choose Deploy and run.
2. Wait for the AWS CloudFormation stack to deploy. This can take up to 10 minutes.
3.
After the Start execution option appears, review the Input and choose Start execution.

Congratulations! You should now have a running demo of your state machine. You can choose states in the Graph view to review input, output, variables, definition, and events.

Start a workflow within a workflow with Step Functions and Lambda

This sample project demonstrates how to use an AWS Step Functions state machine to start other state machine executions. For information about starting state machine executions from another state machine, see Start workflow executions from a task state in Step Functions.

Step 1: Create the state machine

1. Open the Step Functions console and choose Create state machine.
2. Choose Create from template and find the related starter template. Choose Next to continue.
3. Choose how to use the template:
a. Run a demo – creates a read-only state machine. After review, you can create the workflow and all related resources.
b. Build on it – provides an editable workflow definition that you can review, customize, and deploy with your own resources. (Related resources, such as functions or queues, will not be created automatically.)
4. Choose Use template to continue with your selection.

Note: Standard charges apply for services deployed to your account.

Step 2: Run the demo state machine

If you chose the Run a demo option, all related resources will be deployed and ready to run. If you chose the Build on it option, you might need to set placeholder values and create additional resources before you can run your custom workflow.

1. Choose Deploy and run.
2. Wait for the AWS CloudFormation stack to deploy. This can take up to 10 minutes.
3. After the Start execution option appears, review the Input and choose Start execution.

Congratulations! You should now have a running demo of your state machine.
You can choose states in the Graph view to review input, output, variables, definition, and events.

Process data from a queue with a Map state in Step Functions

In this sample workflow, a Map state processes data from a queue, sending messages to subscribers and storing them in a database. Step Functions uses an optimized integration to pull messages from an Amazon SQS queue. When messages are available, a Choice state passes an array of JSON messages to a Map state for processing. For each message, the state machine writes the message to DynamoDB, removes the message from the queue, and publishes the message to an Amazon SNS topic.

Step 1: Create the state machine

1. Open the Step Functions console and choose Create state machine.
2. Choose Create from template and find the related starter template. Choose Next to continue.
3. Choose how to use the template:
a.
Run a demo – creates a read-only state machine. After review, you can create the workflow and all related resources.
b. Build on it – provides an editable workflow definition that you can review, customize, and deploy with your own resources. (Related resources, such as functions or queues, will not be created automatically.)
4. Choose Use template to continue with your selection.

Note: Standard charges apply for services deployed to your account.

Step 2: Subscribe to the Amazon SNS topic

Tip: Subscribe to the Amazon SNS topic and add items to the Amazon SQS queue before you run your state machine.

1. Open the Amazon SNS console.
2. Choose Topics and find the topic that was created by the sample project.
3. Choose Create subscription, and for Protocol, choose Email.
4. Under Endpoint, enter your email address to subscribe to the topic.
5. Choose Create subscription.
6. Confirm the subscription in your email to activate the subscription.

Step 3: Add messages to the Amazon SQS queue

1. Open the Amazon SQS console.
2. Choose the queue that was created by the sample project.
3. Choose Send and receive messages, enter a message, and choose Send message. Repeat this step to add several messages to the queue.

Step 4: Run the state machine

Tip: Amazon SQS queues are eventually consistent.
You may need to wait a few minutes after sending messages to the queue before running your state machine. If you chose the Run a demo option, all related resources will be deployed and ready to run. If you chose the Build on it option, you might need to set placeholder values and create additional resources before you can run your custom workflow. 1. Choose Deploy and run. 2. Wait for the AWS CloudFormation stack to deploy. This can take up to 10 minutes. 3. After the Start execution option appears, review the Input and choose Start execution. Congratulations! You should now have a running demo of your state machine. You can choose states in the Graph view to review input, output, variables, definition, and events. Step 3: Add messages to the Amazon SQS queue 281 AWS Step Functions Developer Guide Process a CSV file from Amazon S3 using a Distributed Map This sample project demonstrates how you can use the Distributed Map state to iterate over 10,000 rows of a CSV file that is generated using a Lambda function. The CSV file contains shipping information of customer orders and is stored in an Amazon S3 bucket. The Distributed Map iterates over a batch of 10 rows in the CSV file for data analysis. The Distributed Map contains a Lambda function to detect any delayed orders. The Distributed Map also contains an Inline Map to process the delayed orders in a batch and returns these delayed orders in an array. For each delayed order, the Inline Map sends a message to an Amazon SQS queue. Finally, this sample project stores the Map Run results to another Amazon S3 bucket in your AWS account. With Distributed Map, you can run up to 10,000 parallel child workflow executions at a time. In this sample project, the maximum concurrency of Distributed Map is set at 1000 that limits it to 1000 parallel child workflow executions. This sample project creates the state machine, the supporting AWS resources, and configures the related IAM permissions. 
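To make the batching arithmetic concrete, the following Python sketch shows how 10,000 rows split into batches of 10 map onto child workflow executions, and how a per-batch function might flag delayed orders. The helper names and the `order_id`/`delayed` fields are illustrative assumptions, not code from the sample project.

```python
# Hypothetical sketch of the Distributed Map's batching behavior:
# 10,000 CSV rows split into batches of 10 yields 1,000 batches, and with a
# maximum concurrency of 1,000 each batch can run as its own child workflow
# execution. Field names below are illustrative only.

def batch_rows(rows, batch_size=10):
    """Split rows into fixed-size batches, mirroring the Map state's batching."""
    return [rows[i:i + batch_size] for i in range(0, len(rows), batch_size)]

def find_delayed_orders(batch):
    """Stand-in for the Lambda function that detects delayed orders in a batch."""
    return [row["order_id"] for row in batch if row["delayed"]]

rows = [{"order_id": i, "delayed": (i % 7 == 0)} for i in range(10000)]
batches = batch_rows(rows)
print(len(batches))                      # 1000 batches of 10 rows each
print(find_delayed_orders(batches[0]))  # [0, 7]
```

Each batch here corresponds to the `Items` array a child workflow execution would receive, as in the Lambda example earlier in this guide.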
Explore this sample project to learn about using the Distributed Map for orchestrating large-scale, parallel workloads, or use it as a starting point for your own projects.

Step 1: Create the state machine

1. Open the Step Functions console and choose Create state machine.
2. Choose Create from template and find the related starter template. Choose Next to continue.
3. Choose how to use the template:
   a. Run a demo – creates a read-only state machine. After review, you can create the workflow and all related resources.
   b. Build on it – provides an editable workflow definition that you can review, customize, and deploy with your own resources. (Related resources, such as functions or queues, will not be created automatically.)
4. Choose Use template to continue with your selection.

Note: Standard charges apply for services deployed to your account.
Step 2: Run the demo state machine

If you chose the Run a demo option, all related resources will be deployed and ready to run. If you chose the Build on it option, you might need to set placeholder values and create additional resources before you can run your custom workflow.

1. Choose Deploy and run.
2. Wait for the AWS CloudFormation stack to deploy. This can take up to 10 minutes.
3. After the Start execution option appears, review the Input and choose Start execution.

Congratulations! You should now have a running demo of your state machine. You can choose states in the Graph view to review input, output, variables, definition, and events.

Process data in an Amazon S3 bucket with Distributed Map

This sample project demonstrates how you can use the Distributed Map state to process large-scale data, for example, analyze historical weather data and identify the weather station that has the highest average temperature on the planet each month. The weather data is recorded in over 12,000 CSV files, which in turn are stored in an Amazon S3 bucket. This sample project includes two Distributed Map states named Distributed S3 copy NOA Data and ProcessNOAAData.
Distributed S3 copy NOA Data iterates over the CSV files in a public Amazon S3 bucket named noaa-gsod-pds and copies them to an Amazon S3 bucket in your AWS account. ProcessNOAAData iterates over the copied files and includes a Lambda function that performs the temperature analysis.

The sample project first checks the contents of the Amazon S3 bucket with a call to the ListObjectsV2 API action. Based on the number of keys returned in response to this call, the sample project makes one of the following decisions:

• If the key count is greater than or equal to 1, the project transitions to the ProcessNOAAData state. This Distributed Map state includes a Lambda function named TemperatureFunction that finds the weather station that had the highest average temperature each month. This function returns a dictionary with year-month as the key and a dictionary that contains information about the weather station as the value.

• If the returned key count doesn't exceed 1, the Distributed S3 copy NOA Data state lists all objects from the public bucket noaa-gsod-pds and iteratively copies the individual objects to another bucket in your account in batches of 100. An Inline Map performs the iterative copying of the objects. After all objects are copied, the project transitions to the ProcessNOAAData state for processing the weather data.

The sample project finally transitions to a reducer Lambda function that performs a final aggregation of the results returned by the TemperatureFunction function and writes the results to an Amazon DynamoDB table.

With Distributed Map, you can run up to 10,000 parallel child workflow executions at a time. In this sample project, the maximum concurrency of the ProcessNOAAData Distributed Map is set at 3,000, which limits it to 3,000 parallel child workflow executions.
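The final aggregation step can be pictured with a small Python sketch: given per-file results that each map a year-month to that file's best station, keep the overall winner per month. The record layout and function name here are assumptions for illustration, not the project's actual TemperatureFunction or reducer code.

```python
# Illustrative reducer: each partial result maps "YYYY-MM" to the station
# with the highest average temperature found in one file; the reducer keeps
# the overall winner per month. The record shape is assumed, not taken from
# the sample project's Lambda code.

def reduce_results(partial_results):
    best = {}
    for result in partial_results:
        for month, station in result.items():
            if month not in best or station["avg_temp"] > best[month]["avg_temp"]:
                best[month] = station
    return best

partials = [
    {"2023-01": {"station": "USW00094728", "avg_temp": 3.1}},
    {"2023-01": {"station": "USW00023174", "avg_temp": 14.6},
     "2023-02": {"station": "USW00023174", "avg_temp": 15.2}},
]
print(reduce_results(partials)["2023-01"]["station"])  # USW00023174
```

In the sample project, a result dictionary like this would then be written to the Amazon DynamoDB table.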
This sample project creates the state machine, the supporting AWS resources, and configures the related IAM permissions. Explore this sample project to learn about using the Distributed Map for orchestrating large-scale, parallel workloads, or use it as a starting point for your own projects.

Important: This sample project is only available in the US East (N. Virginia) Region.

Step 1: Create the state machine

1. Open the Step Functions console and choose Create state machine.
2. Choose Create from template and find the related starter template. Choose Next to continue.
3. Choose how to use the template:
   a. Run a demo – creates a read-only state machine. After review, you can create the workflow and all related resources.
   b. Build on it – provides an editable workflow definition that you can review, customize, and deploy with your own resources. (Related resources, such as functions or queues, will not be created automatically.)
4. Choose Use template to continue with your selection.

Note: Standard charges apply for services deployed to your account.
Step 2: Run the demo state machine

If you chose the Run a demo option, all related resources will be deployed and ready to run. If you chose the Build on it option, you might need to set placeholder values and create additional resources before you can run your custom workflow.

1. Choose Deploy and run.
2. Wait for the AWS CloudFormation stack to deploy. This can take up to 10 minutes.
3. After the Start execution option appears, review the Input and choose Start execution.

Congratulations! You should now have a running demo of your state machine. You can choose states in the Graph view to review input, output, variables, definition, and events.

Train a machine learning model using Amazon SageMaker AI

This sample project demonstrates how to use SageMaker AI and AWS Step Functions to train a machine learning model and how to batch transform a test dataset. In this project, Step Functions uses a Lambda function to seed an Amazon S3 bucket with a test dataset. It then trains a machine learning model and performs a batch transform, using the SageMaker AI service integration.
For more information about SageMaker AI and Step Functions service integrations, see the following:

• Integrating services with Step Functions
• Create and manage Amazon SageMaker AI jobs with Step Functions

Note: This sample project may incur charges. For new AWS users, a free usage tier is available. On this tier, services are free below a certain level of usage. For more information about AWS costs and the Free Tier, see SageMaker AI Pricing.

Step 1: Create the state machine

1. Open the Step Functions console and choose Create state machine.
2. Choose Create from template and find the related starter template. Choose Next to continue.
3. Choose how to use the template:
   a. Run a demo – creates a read-only state machine. After review, you can create the workflow and all related resources.
   b. Build on it – provides an editable workflow definition that you can review, customize, and deploy with your own resources. (Related resources, such as functions or queues, will not be created automatically.)
4. Choose Use template to continue with your selection.

Note: Standard charges apply for services deployed to your account.

Step 2: Run the demo state machine

If you chose the Run a demo option, all related resources will be deployed and ready to run. If you chose the Build on it option, you might need to set placeholder values and create additional resources before you can run your custom workflow.

1. Choose Deploy and run.
2. Wait for the AWS CloudFormation stack to deploy. This can take up to 10 minutes.
3. After the Start execution option appears, review the Input and choose Start execution.

Congratulations! You should now have a running demo of your state machine. You can choose states in the Graph view to review input, output, variables, definition, and events.
Tune the hyperparameters of a machine learning model in SageMaker AI

This sample project demonstrates using SageMaker AI to tune the hyperparameters of a machine learning model, and to batch transform a test dataset. In this project, Step Functions uses a Lambda function to seed an Amazon S3 bucket with a test dataset. It then creates a hyperparameter tuning job using the SageMaker AI service integration, uses a Lambda function to extract the data path, saves the tuning model, extracts the model name, and then runs a batch transform job to perform inference in SageMaker AI.

For more information about SageMaker AI and Step Functions service integrations, see the following:

• Integrating services with Step Functions
• Create and manage Amazon SageMaker AI jobs with Step Functions

Note: This sample project may incur charges. For new AWS users, a free usage tier is available. On this tier, services are free below a certain level of usage. For more information about AWS costs and the Free Tier, see SageMaker AI Pricing.

Step 1: Create the state machine

1.
Open the Step Functions console and choose Create state machine.
2. Choose Create from template and find the related starter template. Choose Next to continue.
3. Choose how to use the template:
   a. Run a demo – creates a read-only state machine. After review, you can create the workflow and all related resources.
   b. Build on it – provides an editable workflow definition that you can review, customize, and deploy with your own resources. (Related resources, such as functions or queues, will not be created automatically.)
4. Choose Use template to continue with your selection.

Note: Standard charges apply for services deployed to your account.

Step 2: Run the demo state machine

If you chose the Run a demo option, all related resources will be deployed and ready to run. If you chose the Build on it option, you might need to set placeholder values and create additional resources before you can run your custom workflow.

1. Choose Deploy and run.
2. Wait for the AWS CloudFormation stack to deploy. This can take up to 10 minutes.
3. After the Start execution option appears, review the Input and choose Start execution.

Congratulations! You should now have a running demo of your state machine. You can choose states in the Graph view to review input, output, variables, definition, and events.
Perform AI prompt-chaining with Amazon Bedrock

This sample project demonstrates how you can integrate with Amazon Bedrock to perform AI prompt-chaining and build high-quality chatbots using Amazon Bedrock. The project chains together some prompts and resolves them in the sequence in which they're provided. Chaining of these prompts augments the ability of the language model being used to deliver a highly-curated response.

This sample project creates the state machine, the supporting AWS resources, and configures the related IAM permissions. Explore this sample project to learn about using the Amazon Bedrock optimized service integration with Step Functions state machines, or use it as a starting point for your own projects.

Prerequisites

This sample project uses the Cohere Command large language model (LLM). To successfully run this sample project, you must add access to this LLM from the Amazon Bedrock console. To add the model access, do the following:

1. Open the Amazon Bedrock console.
2. On the navigation pane, choose Model access.
3. Choose Manage model access.
4. Select the check box next to Cohere.
5. Choose Request access.

The Access status for the Cohere model shows as Access granted.

Step 1: Create the state machine

1. Open the Step Functions console and choose Create state machine.
2. Choose Create from template and find the related starter template. Choose Next to continue.
3. Choose how to use the template:
   a. Run a demo – creates a read-only state machine. After review, you can create the workflow and all related resources.
   b. Build on it – provides an editable workflow definition that you can review, customize, and deploy with your own resources. (Related resources, such as functions or queues, will not be created automatically.)
4. Choose Use template to continue with your selection.

Note: Standard charges apply for services deployed to your account.
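Conceptually, prompt chaining feeds each model response into the next prompt. The following minimal Python sketch shows that pattern with a stub standing in for the Amazon Bedrock InvokeModel call; the stub, prompt texts, and function names are illustrative assumptions, not the sample project's code.

```python
# Minimal prompt-chaining sketch. A real implementation would call the
# Amazon Bedrock InvokeModel API; here a stub stands in for the model so
# the control flow is visible.

def stub_model(prompt):
    # Deterministic placeholder for a language model response.
    return f"response to [{prompt}]"

def chain_prompts(prompts, model=stub_model):
    """Resolve prompts in order, carrying each response into the next prompt."""
    context = ""
    for prompt in prompts:
        context = model(f"{context} {prompt}".strip())
    return context

result = chain_prompts(["Name a topic.", "Summarize it.", "Suggest a title."])
print(result)
```

Because each call sees the accumulated context, later prompts refine earlier responses, which is what lets the chain deliver a more curated final answer.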
Step 2: Run the demo state machine

If you chose the Run a demo option, all related resources will be deployed and ready to run. If you chose the Build on it option, you might need to set placeholder values and create additional resources before you can run your custom workflow.

1. Choose Deploy and run.
2. Wait for the AWS CloudFormation stack to deploy. This can take up to 10 minutes.
3. After the Start execution option appears, review the Input and choose Start execution.

Congratulations! You should now have a running demo of your state machine. You can choose states in the Graph view to review input, output, variables, definition, and events.

Process high-volume messages from Amazon SQS with Step Functions Express workflows

This sample project demonstrates how to use an AWS Step Functions Express Workflow to process messages or data from a high-volume event source, such as Amazon
Simple Queue Service (Amazon SQS). Because Express Workflows can be started at a very high rate, they are ideal for high-volume event processing or streaming data workloads.

Here are two commonly used methods to execute your state machine from an event source:

• Configure an Amazon CloudWatch Events rule to start a state machine execution whenever the event source emits an event. For more information, see Creating a CloudWatch Events Rule That Triggers on an Event.

• Map the event source to a Lambda function, and write function code to execute your state machine. The AWS Lambda function is invoked each time your event source emits an event, in turn starting a state machine execution. For more information, see Using AWS Lambda with Amazon SQS.

This sample project uses the second method to start an execution each time the Amazon SQS queue sends a message. You can use a similar configuration to trigger Express Workflows executions from other event sources, such as Amazon Simple Storage Service (Amazon S3), Amazon DynamoDB, and Amazon Kinesis.
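A sketch of the second method follows: a Lambda handler that starts one state machine execution per SQS record. The Step Functions client is injected so the sketch stays self-contained; in a deployed function you would pass `boto3.client('stepfunctions')` and your real state machine ARN. All names and the placeholder ARN are illustrative assumptions.

```python
import json

def handler(event, sfn_client, state_machine_arn):
    """Start one Express Workflow execution per SQS record (illustrative sketch)."""
    started = 0
    for record in event.get("Records", []):
        # Each SQS record body becomes the input of one execution.
        sfn_client.start_execution(
            stateMachineArn=state_machine_arn,
            input=json.dumps({"input": record["body"]}),
        )
        started += 1
    return {"startedExecutions": started}

# A stub client records the calls without touching AWS.
class StubClient:
    def __init__(self):
        self.calls = []
    def start_execution(self, **kwargs):
        self.calls.append(kwargs)

stub = StubClient()
result = handler({"Records": [{"body": "hello"}]}, stub, "arn:aws:states:example")
print(result)  # {'startedExecutions': 1}
```

Injecting the client keeps the handler trivially testable; the `start_execution` keyword arguments match the boto3 Step Functions client.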
For more information about Express Workflows and Step Functions service integrations, see the following:

• Choosing workflow type in Step Functions
• Integrating services with Step Functions
• Step Functions service quotas

Step 1: Create the state machine

1. Open the Step Functions console and choose Create state machine.
2. Choose Create from template and find the related starter template. Choose Next to continue.
3. Choose how to use the template:
   a. Run a demo – creates a read-only state machine. After review, you can create the workflow and all related resources.
   b. Build on it – provides an editable workflow definition that you can review, customize, and deploy with your own resources. (Related resources, such as functions or queues, will not be created automatically.)
4. Choose Use template to continue with your selection.

Note: Standard charges apply for services deployed to your account.

Step 2: Trigger the state machine execution

1. Open the Amazon SQS console.
2. Select the queue that was created by the sample project. The name will be similar to Example-SQSQueue-wJalrXUtnFEMI.
3. In the Queue Actions list, select Send a Message.
4. Use the copy button to copy the following message, and on the Send a Message window, enter it, and choose Send Message.

Note: In this sample message, the input: line has been formatted with line breaks to fit the page. Use the copy button or otherwise ensure that it is entered as a single line with no breaks.
{
  "input": "QW5kIGxpa2UgdGhlIGJhc2VsZXNzIGZhYnJpYyBvZiB0aGlzIHZpc2lvbiwgVGhlIGNsb3VkLWNhcHBlZCB0b3dlcnMsIHRoZSBnb3JnZW91cyBwYWxhY2VzLCBUaGUgc29sZW1uIHRlbXBsZXMsIHRoZSBncmVhdCBnbG9iZSBpdHNlbGbigJQgWWVhLCBhbGwgd2hpY2ggaXQgaW5oZXJpdOKAlHNoYWxsIGRpc3NvbHZlLCBBbmQgbGlrZSB0aGlzIGluc3Vic3RhbnRpYWwgcGFnZWFudCBmYWRlZCwgTGVhdmUgbm90IGEgcmFjayBiZWhpbmQuIFdlIGFyZSBzdWNoIHN0dWZmIEFzIGRyZWFtcyBhcmUgbWFkZSBvbiwgYW5kIG91ciBsaXR0bGUgbGlmZSBJcyByb3VuZGVkIHdpdGggYSBzbGVlcC4gU2lyLCBJIGFtIHZleGVkLiBCZWFyIHdpdGggbXkgd2Vha25lc3MuIE15IG9sZCBicmFpbiBpcyB0cm91YmxlZC4gQmUgbm90IGRpc3R1cmJlZCB3aXRoIG15IGluZmlybWl0eS4gSWYgeW91IGJlIHBsZWFzZWQsIHJldGlyZSBpbnRvIG15IGNlbGwgQW5kIHRoZXJlIHJlcG9zZS4gQSB0dXJuIG9yIHR3byBJ4oCZbGwgd2FsayBUbyBzdGlsbCBteSBiZWF0aW5nIG1pbmQu"
}

5. Choose Close.
6. Open the Step Functions console.
7. Go to your Amazon CloudWatch Logs log group and inspect the logs. The name of the log group will look like example-ExpressLogGroup-wJalrXUtnFEMI.

Perform selective checkpointing using Standard and Express workflows

This sample project demonstrates how to combine Standard and Express Workflows by running a mock e-commerce workflow that does selective checkpointing. Deploying this sample project creates a Standard workflows state machine, a nested Express Workflows state machine, an AWS Lambda function, an Amazon Simple Queue Service (Amazon SQS) queue, and an Amazon Simple Notification Service (Amazon SNS) topic.

For more information about Express Workflows, nested workflows, and Step Functions service integrations, see the following:

• Choosing workflow type in Step Functions
• Start workflow executions from a task state in Step Functions
• Integrating services with Step Functions

Step 1: Create the State Machine

1. Open the Step Functions console and choose Create state machine.
2. Choose Create from template and find the related starter template. Choose Next to continue.
3.
Choose how to use the template:
   a. Run a demo – creates a read-only state machine. After review, you can create the workflow and all related resources.
   b. Build on it – provides an editable workflow definition that you can review, customize, and deploy with your own resources. (Related resources, such as functions or queues, will not be created automatically.)
4. Choose Use template to continue with your
selection.

Note: Standard charges apply for services deployed to your account.

Step 2: Run the demo state machine

If you chose the Run a demo option, all related resources will be deployed and ready to run. If you chose the Build on it option, you might need to set placeholder values and create additional resources before you can run your custom workflow.

1. Choose Deploy and run.
2. Wait for the AWS CloudFormation stack to deploy. This can take up to 10 minutes.
3. After the Start execution option appears, review the Input and choose Start execution.

Congratulations! You should now have a running demo of your state machine. You can choose states in the Graph view to review input, output, variables, definition, and events.

Build an AWS CodeBuild project using Step Functions

This sample project demonstrates how to use AWS Step Functions to build an AWS CodeBuild project, run tests, and then send an Amazon SNS notification based on the results.

Step 1: Create the state machine

1. Open the Step Functions console and choose Create state machine.
2. Choose Create from template and find the related starter template. Choose Next to continue.
3. Choose how to use the template:
   a. Run a demo – creates a read-only state machine. After review, you can create the workflow and all related resources.
   b.
Build on it – provides an editable workflow definition that you can review, customize, and deploy with your own resources. (Related resources, such as functions or queues, will not be created automatically.)
4. Choose Use template to continue with your selection.

Note: Standard charges apply for services deployed to your account.

Step 2: Run the demo state machine

If you chose the Run a demo option, all related resources will be deployed and ready to run. If you chose the Build on it option, you might need to set placeholder values and create additional resources before you can run your custom workflow.

1. Choose Deploy and run.
2. Wait for the AWS CloudFormation stack to deploy. This can take up to 10 minutes.
3. After the Start execution option appears, review the Input and choose Start execution.

Congratulations! You should now have a running demo of your state machine. You can choose states in the Graph view to review input, output, variables, definition, and events.

Preprocess data and train a machine learning model with Amazon SageMaker AI

This sample project demonstrates how to use SageMaker AI and AWS Step Functions to preprocess data and train a machine learning model. In this project, Step Functions uses a Lambda function to seed an Amazon S3 bucket with a test dataset and a Python script for data processing. It then trains a machine learning model and performs a batch transform, using the SageMaker AI service integration.

For more information about SageMaker AI and Step Functions service integrations, see the following:

• Integrating services with Step Functions
• Create and manage Amazon SageMaker AI jobs with Step Functions

Note: This sample project may incur charges. For new AWS users, a free usage tier is available. On this tier, services are free below a certain level of usage. For more information about AWS costs and the Free Tier, see SageMaker AI Pricing.
Step 1: Create the state machine

1. Open the Step Functions console and choose Create state machine.
2. Choose Create from template and find the related starter template. Choose Next to continue.
3. Choose how to use the template:
   a. Run a demo – creates a read-only state machine. After review, you can create the workflow and all related resources.
   b. Build on it – provides an editable workflow definition that you can review, customize, and deploy with your own resources. (Related resources, such as functions or queues, will not be created automatically.)
4. Choose Use template to continue with your selection.

Note: Standard charges apply for services deployed to your account.

Step 2: Run the demo state machine

If you chose the Run
a demo option, all related resources will be deployed and ready to run. If you chose the Build on it option, you might need to set placeholder values and create additional resources before you can run your custom workflow.

1. Choose Deploy and run.
2. Wait for the AWS CloudFormation stack to deploy. This can take up to 10 minutes.
3. After the Start execution option appears, review the Input and choose Start execution.

Congratulations! You should now have a running demo of your state machine. You can choose states in the Graph view to review input, output, variables, definition, and events.

Orchestrate AWS Lambda functions with Step Functions

The Orchestrate Lambda functions template uses several Lambda functions in a sample stock trading workflow. One function checks a stock price, then a human is prompted to choose to buy or sell the stock. A choice state selects the next function based on the recommended_type variable to complete the purchase or sale. After either function finishes, the result of the trade is then published before reaching the end of the workflow.

To implement the human approval step, the workflow execution pauses until a unique TaskToken is returned. In this project, the workflow passes a message with the task token to an Amazon SQS queue.
The message triggers another Lambda function that's configured to handle a callback based on the payload of the message. The workflow pauses until it receives the task token back from a SendTaskSuccess API call. For more information about task tokens, see Wait for a Callback with Task Token.

Step 1: Create the state machine

1. Open the Step Functions console and choose Create state machine.
2. Choose Create from template and find the related starter template. Choose Next to continue.
3. Choose how to use the template:
   a. Run a demo – creates a read-only state machine. After review, you can create the workflow and all related resources.
   b. Build on it – provides an editable workflow definition that you can review, customize, and deploy with your own resources. (Related resources, such as functions or queues, will not be created automatically.)
4. Choose Use template to continue with your selection.

Note: Standard charges apply for services deployed to your account.

Step 2: Run the demo state machine

If you chose the Run a demo option, all related resources will be deployed and ready to run. If you chose the Build on it option, you might need to set placeholder values and create additional resources before you can run your custom workflow.

1. Choose Deploy and run.
2. Wait for the AWS CloudFormation stack to deploy. This can take up to 10 minutes.
3. After the Start execution option appears, review the Input and choose Start execution.

Congratulations! You should now have a running demo of your state machine. You can choose states in the Graph view to review input, output, variables, definition, and events.

For more information about Step Functions service integrations, see Integrating services with Step Functions.
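The callback half of this pattern can be sketched in Python as follows. The message shape and the injected client are assumptions for illustration; a deployed function would use `boto3.client('stepfunctions')` and the real queue message produced by the workflow.

```python
import json

def callback_handler(event, sfn_client):
    """Resume a paused workflow by returning its task token (illustrative sketch)."""
    for record in event["Records"]:
        message = json.loads(record["body"])
        # The waiting execution resumes once SendTaskSuccess is called with
        # the token the workflow emitted alongside its task.
        sfn_client.send_task_success(
            taskToken=message["TaskToken"],
            output=json.dumps({"approved": True}),
        )

# A stub client records the returned tokens without touching AWS.
class StubClient:
    def __init__(self):
        self.tokens = []
    def send_task_success(self, taskToken, output):
        self.tokens.append(taskToken)

stub = StubClient()
callback_handler({"Records": [{"body": json.dumps({"TaskToken": "abc123"})}]}, stub)
print(stub.tokens)  # ['abc123']
```

The `send_task_success` keyword arguments match the boto3 Step Functions client; the `TaskToken` field name inside the queued message is this sketch's own convention.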
Start an Athena query and send a results notification

This sample project demonstrates how to use Step Functions and Amazon Athena to start an Athena query and send a notification with query results using Standard workflows. In this project, Step Functions uses Lambda functions and an AWS Glue crawler to generate a set of example data. It then performs a query using the Athena service integration and returns the results using an SNS topic.

For more information about Athena and Step Functions service integrations, see the following:

• Integrating services with Step Functions
• Run Athena queries with Step Functions

Step 1: Create the state machine

1. Open the Step Functions console and choose Create state machine.
2. Choose Create from template and find the related starter template. Choose Next to continue.
3. Choose how to use the template:
   a. Run a demo – creates a read-only state machine. After review, you can create the workflow and all related resources.
   b. Build on it –
provides an editable workflow definition that you can review, customize, and deploy with your own resources. (Related resources, such as functions or queues, will not be created automatically.)
4. Choose Use template to continue with your selection.

Note
Standard charges apply for services deployed to your account.

Step 2: Run the demo state machine

If you chose the Run a demo option, all related resources will be deployed and ready to run. If you chose the Build on it option, you might need to set placeholder values and create additional resources before you can run your custom workflow.

1. Choose Deploy and run.
2. Wait for the AWS CloudFormation stack to deploy. This can take up to 10 minutes.
3. After the Start execution option appears, review the Input and choose Start execution.

Congratulations! You should now have a running demo of your state machine. You can choose states in the Graph view to review input, output, variables, definition, and events.

Execute queries in sequence and parallel using Athena

This sample project demonstrates how to run Athena queries in succession and then in parallel, handle errors, and then send an Amazon SNS notification based on whether the queries succeed or fail.

In this project, Step Functions uses a state machine to run Athena queries synchronously.
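A synchronous Athena query of this kind is typically a Task state that uses the athena:startQueryExecution.sync integration. The following fragment is a sketch; the query, workgroup, and output location are placeholders, not the project's actual values:

```json
{
  "Start Athena query": {
    "Type": "Task",
    "Resource": "arn:aws:states:::athena:startQueryExecution.sync",
    "Parameters": {
      "QueryString": "SELECT * FROM \"example_db\".\"example_table\" LIMIT 10",
      "WorkGroup": "primary",
      "ResultConfiguration": {
        "OutputLocation": "s3://amzn-s3-demo-bucket/results/"
      }
    },
    "Next": "Get query results"
  }
}
```

The `.sync` suffix makes the workflow wait until the query execution finishes before moving to the next state.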
After the query results are returned, the workflow enters a Parallel state with two Athena queries executing in parallel. It then waits for the queries to succeed or fail, and it publishes a message to an Amazon SNS topic about whether they succeeded or failed.

Step 1: Create the state machine

1. Open the Step Functions console and choose Create state machine.
2. Choose Create from template and find the related starter template. Choose Next to continue.
3. Choose how to use the template:
   a. Run a demo – creates a read-only state machine. After review, you can create the workflow and all related resources.
   b. Build on it – provides an editable workflow definition that you can review, customize, and deploy with your own resources. (Related resources, such as functions or queues, will not be created automatically.)
4. Choose Use template to continue with your selection.

Note
Standard charges apply for services deployed to your account.

Step 2: Run the demo state machine

If you chose the Run a demo option, all related resources will be deployed and ready to run. If you chose the Build on it option, you might need to set placeholder values and create additional resources before you can run your custom workflow.

1. Choose Deploy and run.
2. Wait for the AWS CloudFormation stack to deploy. This can take up to 10 minutes.
3. After the Start execution option appears, review the Input and choose Start execution.

Congratulations! You should now have a running demo of your state machine. You can choose states in the Graph view to review input, output, variables, definition, and events.

Query large datasets using an AWS Glue crawler

This sample project demonstrates how to ingest a large dataset in Amazon S3 and partition it through AWS Glue crawlers, then execute Amazon Athena queries against that partition.
In this project, the Step Functions state machine invokes an AWS Glue crawler that partitions a large dataset in Amazon S3. Once the AWS Glue crawler returns a success message, the workflow executes Athena queries against that partition. Once query execution completes successfully, an Amazon SNS notification is sent to an Amazon SNS topic.

Step 1: Create the state machine

1. Open the Step Functions console and choose Create state machine.
2. Choose Create from template and find the related starter template. Choose Next to continue.
3. Choose how to use the template:
   a. Run a demo – creates a read-only state machine. After review, you can create the workflow and all related resources.
   b. Build on it – provides an editable workflow definition that you can review, customize, and deploy with your own resources. (Related resources, such as functions or queues, will not be created automatically.)
4. Choose Use template to continue with your selection.

Note
Standard charges apply for services deployed to your account.

Step 2: Run the demo state machine

If you chose the Run a demo option, all related resources will be deployed and ready to run. If you chose the Build on it option, you might need to set placeholder values and create additional resources before you can run your custom workflow.

1. Choose Deploy and run.
2. Wait for the AWS CloudFormation stack to deploy. This can take up to 10 minutes.
3. After the Start execution option appears, review the Input and choose Start execution.

Congratulations! You should now have a running demo of your state machine. You can choose states in the Graph view to review input, output, variables, definition, and events.

Keep data in a target table updated with AWS Glue and Athena

This sample project demonstrates how to query a target table to get current data with AWS Glue Catalog, then update it with new data from other sources using Amazon Athena.

In this project, the Step Functions state machine calls AWS Glue Catalog to verify whether a target table exists in an Amazon S3 bucket. If no table is found, it creates a new table.
Then, Step Functions runs an Athena query to add rows to the target table from a different data source: first querying the target table to get the most recent date, then querying the source table for more recent data and inserting it into the target table.

Step 1: Create the state machine

1. Open the Step Functions console and choose Create state machine.
2. Choose Create from template and find the related starter template. Choose Next to continue.
3. Choose how to use the template:
   a. Run a demo – creates a read-only state machine. After review, you can create the workflow and all related resources.
   b. Build on it – provides an editable workflow definition that you can review, customize, and deploy with your own resources. (Related resources, such as functions or queues, will not be created automatically.)
4. Choose Use template to continue with your selection.

Note
Standard charges apply for services deployed to your account.

Step 2: Run the demo state machine

If you chose the Run a demo option, all related resources will be deployed and ready to run. If you chose the Build on it option, you might need to set placeholder values and create additional resources before you can run your custom workflow.

1. Choose Deploy and run.
2. Wait for the AWS CloudFormation stack to deploy. This can take up to 10 minutes.
3. After the Start execution option appears, review the Input and choose Start execution.

Congratulations! You should now have a running demo of your state machine. You can choose states in the Graph view to review input, output, variables, definition, and events.

Create and manage an Amazon EKS cluster with a node group

This sample project demonstrates how to use Step Functions and Amazon Elastic Kubernetes Service to create an Amazon EKS cluster with a node group, run a job on Amazon EKS, then examine the output. When finished, it removes the node groups and Amazon EKS cluster.
For more information about Step Functions and Step Functions service integrations, see the following:

• Integrating services with Step Functions
• Create and manage Amazon EKS clusters with Step Functions

Note
This sample project may incur charges. For new AWS users, a free usage tier is available. On this tier, services are free below a certain level of usage. For more information about AWS costs and the Free Tier, see Amazon EKS Pricing.

Step 1: Create the state machine

1. Open the Step Functions console and choose Create state machine.
2. Choose Create from template and find the related starter template. Choose Next to continue.
3. Choose how to use the template:
   a. Run a demo – creates a read-only state machine. After review, you can create the workflow
and all related resources.
   b. Build on it – provides an editable workflow definition that you can review, customize, and deploy with your own resources. (Related resources, such as functions or queues, will not be created automatically.)
4. Choose Use template to continue with your selection.

Note
Standard charges apply for services deployed to your account.

Step 2: Run the demo state machine

If you chose the Run a demo option, all related resources will be deployed and ready to run. If you chose the Build on it option, you might need to set placeholder values and create additional resources before you can run your custom workflow.

1. Choose Deploy and run.
2. Wait for the AWS CloudFormation stack to deploy. This can take up to 10 minutes.
3. After the Start execution option appears, review the Input and choose Start execution.

Congratulations! You should now have a running demo of your state machine. You can choose states in the Graph view to review input, output, variables, definition, and events.

Interact with an API managed by API Gateway

This sample project demonstrates how to use Step Functions to make a call to API Gateway and check whether the call succeeded.
For more information about API Gateway and Step Functions service integrations, see the following:

• Integrating services with Step Functions
• Create API Gateway REST APIs with Step Functions

Step 1: Create the state machine

1. Open the Step Functions console and choose Create state machine.
2. Choose Create from template and find the related starter template. Choose Next to continue.
3. Choose how to use the template:
   a. Run a demo – creates a read-only state machine. After review, you can create the workflow and all related resources.
   b. Build on it – provides an editable workflow definition that you can review, customize, and deploy with your own resources. (Related resources, such as functions or queues, will not be created automatically.)
4. Choose Use template to continue with your selection.

Note
Standard charges apply for services deployed to your account.

Step 2: Run the demo state machine

If you chose the Run a demo option, all related resources will be deployed and ready to run. If you chose the Build on it option, you might need to set placeholder values and create additional resources before you can run your custom workflow.

1. Choose Deploy and run.
2. Wait for the AWS CloudFormation stack to deploy. This can take up to 10 minutes.
3. After the Start execution option appears, review the Input and choose Start execution.

Congratulations! You should now have a running demo of your state machine. You can choose states in the Graph view to review input, output, variables, definition, and events.

Call a microservice running on Fargate using API Gateway integration

This sample project demonstrates how to use Step Functions to make a call to API Gateway in order to interact with a service on AWS Fargate, and also to check whether the call succeeded.
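In both of these API Gateway projects, the call itself is a Task state that uses the apigateway:invoke integration. The fragment below is a sketch; the endpoint, stage, and path are placeholders rather than values from either project:

```json
{
  "Call API Gateway": {
    "Type": "Task",
    "Resource": "arn:aws:states:::apigateway:invoke",
    "Parameters": {
      "ApiEndpoint": "abc123.execute-api.us-east-1.amazonaws.com",
      "Method": "GET",
      "Stage": "prod",
      "Path": "/pets",
      "AuthType": "IAM_ROLE"
    },
    "End": true
  }
}
```

A non-2xx response from the endpoint surfaces as a task error, which the workflow can catch to decide whether the call succeeded.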
For more information about API Gateway and Step Functions service integrations, see the following:

• Integrating services with Step Functions
• Create API Gateway REST APIs with Step Functions

Step 1: Create the state machine

1. Open the Step Functions console and choose Create state machine.
2. Choose Create from template and find the related starter template. Choose Next to continue.
3. Choose how to use the template:
   a. Run a demo – creates a read-only state machine. After review, you can create the workflow and all related resources.
   b. Build on it – provides an editable workflow definition that you can review, customize, and deploy with your own resources. (Related resources, such as functions or queues, will not be created automatically.)
4. Choose Use template to continue with your selection.

Note
Standard charges apply for services deployed to your account.

Step 2: Run the demo state machine

If you chose the Run a demo option, all related resources will be deployed and ready to
run. If you chose the Build on it option, you might need to set placeholder values and create additional resources before you can run your custom workflow.

1. Choose Deploy and run.
2. Wait for the AWS CloudFormation stack to deploy. This can take up to 10 minutes.
3. After the Start execution option appears, review the Input and choose Start execution.

Congratulations! You should now have a running demo of your state machine. You can choose states in the Graph view to review input, output, variables, definition, and events.

Send a custom event to an EventBridge event bus

This sample project demonstrates how to use Step Functions to send a custom event to an event bus that matches a rule with multiple targets (Amazon EventBridge, AWS Lambda, Amazon Simple Notification Service, Amazon Simple Queue Service).

For more information about Step Functions and Step Functions service integrations, see the following:

• Integrating services with Step Functions
• Add EventBridge events with Step Functions

Note
This sample project may incur charges. For new AWS users, a free usage tier is available. On this tier, services are free below a certain level of usage. For more information about AWS costs and the Free Tier, see EventBridge Pricing.

Step 1: Create the state machine

1.
Open the Step Functions console and choose Create state machine.
2. Choose Create from template and find the related starter template. Choose Next to continue.
3. Choose how to use the template:
   a. Run a demo – creates a read-only state machine. After review, you can create the workflow and all related resources.
   b. Build on it – provides an editable workflow definition that you can review, customize, and deploy with your own resources. (Related resources, such as functions or queues, will not be created automatically.)
4. Choose Use template to continue with your selection.

Note
Standard charges apply for services deployed to your account.

Step 2: Run the demo state machine

If you chose the Run a demo option, all related resources will be deployed and ready to run. If you chose the Build on it option, you might need to set placeholder values and create additional resources before you can run your custom workflow.

1. Choose Deploy and run.
2. Wait for the AWS CloudFormation stack to deploy. This can take up to 10 minutes.
3. After the Start execution option appears, review the Input and choose Start execution.

Congratulations! You should now have a running demo of your state machine. You can choose states in the Graph view to review input, output, variables, definition, and events.

Invoke Synchronous Express Workflows through API Gateway

This sample project demonstrates how to invoke Synchronous Express Workflows through Amazon API Gateway to manage an employee database.

In this project, Step Functions uses API Gateway endpoints to start Step Functions Synchronous Express Workflows. These then use DynamoDB to search for, add, and remove employees in an employee database.

For more information about Step Functions Synchronous Express Workflows, see Synchronous and Asynchronous Express Workflows in Step Functions.

Note
This sample project may incur charges.
For new AWS users, a free usage tier is available. On this tier, services are free below a certain level of usage. For more information about AWS costs and the Free Tier, see Step Functions Pricing.

Step 1: Create the state machine

1. Open the Step Functions console and choose Create state machine.
2. Choose Create from template and find the related starter template. Choose Next to continue.
3. Choose how to use the template:
   a. Run a demo – creates a read-only state machine. After review, you can create the workflow and all related resources.
   b. Build on it – provides an editable workflow definition that you can review, customize, and deploy with your own resources. (Related resources, such as functions or queues, will not be created automatically.)
4. Choose Use template to continue with your selection.

Note
Standard charges apply for services deployed to your account.
Step 2: Run the demo state machine

If you chose the Run a demo option, all related resources will be deployed and ready to run. If you chose the Build on it option, you might need to set placeholder values and create additional resources before you can run your custom workflow.

1. Choose Deploy and run.
2. Wait for the AWS CloudFormation stack to deploy. This can take up to 10 minutes.
3. After the Start execution option appears, review the Input and choose Start execution.

Congratulations! You should now have a running demo of your state machine. You can choose states in the Graph view to review input, output, variables, definition, and events.

Run an ETL/ELT workflow using Step Functions and the Amazon Redshift API

This sample project demonstrates how to use Step Functions and the Amazon Redshift Data API to run an ETL/ELT workflow that loads data into the Amazon Redshift data warehouse.

In this project, Step Functions uses an AWS Lambda function and the Amazon Redshift Data API to create the required database objects and to generate a set of example data, then executes two jobs in parallel that load the dimension tables. Once both dimension load jobs end successfully, Step Functions executes the load job for the fact table, runs the validation job, and then pauses the Amazon Redshift cluster.
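Under the hood, the Lambda function drives Amazon Redshift through the Redshift Data API. The following is a hedged sketch of that pattern, not the project's actual function; the cluster identifier, database, user, and SQL are all placeholders:

```python
def build_statement(sql, cluster_id="example-cluster",
                    database="dev", db_user="awsuser"):
    # Build the parameters for the Redshift Data API ExecuteStatement call.
    # All identifiers here are placeholders, not the sample project's values.
    return {
        "ClusterIdentifier": cluster_id,
        "Database": database,
        "DbUser": db_user,
        "Sql": sql,
    }

def run_statement(sql):
    # Deferred import so build_statement stays testable offline.
    import boto3
    client = boto3.client("redshift-data")
    response = client.execute_statement(**build_statement(sql))
    # The call is asynchronous; poll describe_statement(Id=...) until the
    # status reaches FINISHED (or FAILED) before reading results.
    return response["Id"]
```

Because ExecuteStatement returns immediately, the workflow (or the Lambda function) must poll for completion before the next load job starts.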
Note
You can modify the ETL logic to receive data from other sources such as Amazon S3, which can use the COPY command to copy data from Amazon S3 to an Amazon Redshift table.

For more information about Amazon Redshift and Step Functions service integrations, see the following guides:

• Integrating services with Step Functions
• Using the Amazon Redshift Data API
• Amazon Redshift Data API service
• Creating a Step Functions state machine that uses Lambda

For more information about IAM policies for Lambda and Amazon Redshift, see the following guides:

• IAM policies for calling AWS Lambda
• Authorizing access to the Amazon Redshift Data API

Note
This sample project may incur charges. For new AWS users, a free usage tier is available. On this tier, services are free below a certain level of usage. For more information about AWS costs and the Free Tier, see AWS Step Functions pricing.

Step 1: Create the state machine

1. Open the Step Functions console and choose Create state machine.
2. Choose Create from template and find the related starter template. Choose Next to continue.
3. Choose how to use the template:
   a. Run a demo – creates a read-only state machine. After review, you can create the workflow and all related resources.
   b. Build on it – provides an editable workflow definition that you can review, customize, and deploy with your own resources. (Related resources, such as functions or queues, will not be created automatically.)
4. Choose Use template to continue with your selection.

Note
Standard charges apply for services deployed to your account.

Step 2: Run the demo state machine

If you chose the Run a demo option, all related resources will be deployed and ready to run.
If you chose the Build on it option, you might need to set placeholder values and create additional resources before you can run your custom workflow.

1. Choose Deploy and run.
2. Wait for the AWS CloudFormation stack to deploy. This can take up to 10 minutes.
3. After the Start execution option appears, review the Input and choose Start execution.

Congratulations! You should now have a running demo of your state machine. You can choose states in the Graph view to review input, output, variables, definition, and events.

Manage a batch job with AWS Batch and Amazon SNS

This sample project demonstrates how to submit an AWS Batch job, and then send an Amazon SNS notification based on whether that job succeeds or fails. Deploying this sample project creates an AWS Step Functions state machine, an AWS Batch job, and an Amazon SNS topic.

In this project, Step Functions uses a state machine to call the AWS
Batch job synchronously. It then waits for the job to succeed or fail, and it publishes a message to an Amazon SNS topic about whether the job succeeded or failed.

Step 1: Create the state machine

1. Open the Step Functions console and choose Create state machine.
2. Choose Create from template and find the related starter template. Choose Next to continue.
3. Choose how to use the template:
   a. Run a demo – creates a read-only state machine. After review, you can create the workflow and all related resources.
   b. Build on it – provides an editable workflow definition that you can review, customize, and deploy with your own resources. (Related resources, such as functions or queues, will not be created automatically.)
4. Choose Use template to continue with your selection.

Note
Standard charges apply for services deployed to your account.

Step 2: Run the demo state machine

If you chose the Run a demo option, all related resources will be deployed and ready to run. If you chose the Build on it option, you might need to set placeholder values and create additional resources before you can run your custom workflow.

1. Choose Deploy and run.
2. Wait for the AWS CloudFormation stack to deploy. This can take up to 10 minutes.
3. After the Start execution option appears, review the Input and choose Start execution.
Congratulations! You should now have a running demo of your state machine. You can choose states in the Graph view to review input, output, variables, definition, and events.

Fan out batch jobs with Map state

This sample project demonstrates how to use the Step Functions Map workflow state to fan out AWS Batch jobs.

In this project, Step Functions uses a state machine to invoke a Lambda function to do simple pre-processing, then invokes multiple AWS Batch jobs in parallel using the Map workflow state.

Step 1: Create the state machine

1. Open the Step Functions console and choose Create state machine.
2. Choose Create from template and find the related starter template. Choose Next to continue.
3. Choose how to use the template:
   a. Run a demo – creates a read-only state machine. After review, you can create the workflow and all related resources.
   b. Build on it – provides an editable workflow definition that you can review, customize, and deploy with your own resources. (Related resources, such as functions or queues, will not be created automatically.)
4. Choose Use template to continue with your selection.

Note
Standard charges apply for services deployed to your account.

Step 2: Run the demo state machine

If you chose the Run a demo option, all related resources will be deployed and ready to run. If you chose the Build on it option, you might need to set placeholder values and create additional resources before you can run your custom workflow.

1. Choose Deploy and run.
2. Wait for the AWS CloudFormation stack to deploy. This can take up to 10 minutes.
3. After the Start execution option appears, review the Input and choose Start execution.

Congratulations! You should now have a running demo of your state machine. You can choose states in the Graph view to review input, output, variables, definition, and events.
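The fan-out step in this project can be sketched as a Map state whose iterator submits one AWS Batch job per input item. This is an illustrative fragment only; the input path, job queue, and job definition ARNs are placeholders, not the demo's resources:

```json
{
  "Fan out Batch jobs": {
    "Type": "Map",
    "ItemsPath": "$.jobs",
    "Iterator": {
      "StartAt": "Submit job",
      "States": {
        "Submit job": {
          "Type": "Task",
          "Resource": "arn:aws:states:::batch:submitJob.sync",
          "Parameters": {
            "JobName.$": "$.jobName",
            "JobQueue": "arn:aws:batch:us-east-1:123456789012:job-queue/example-queue",
            "JobDefinition": "arn:aws:batch:us-east-1:123456789012:job-definition/example-def:1"
          },
          "End": true
        }
      }
    },
    "End": true
  }
}
```

Each iteration runs the iterator against one element of the `$.jobs` array, so the jobs are submitted and awaited in parallel.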
Run an AWS Batch job with Lambda

This sample project demonstrates how to use Step Functions to pre-process data with AWS Lambda functions and then orchestrate AWS Batch jobs.

In this project, Step Functions uses a state machine to invoke a Lambda function to do simple pre-processing before an AWS Batch job is submitted. Multiple jobs may be invoked depending on the result or success of the previous one.

Step 1: Create the state machine

1. Open the Step Functions console and choose Create state machine.
2. Choose Create from template and find the related starter template. Choose Next to continue.
3. Choose how to use the template:
   a. Run a demo – creates a read-only state machine. After review,
you can create the workflow and all related resources.
   b. Build on it – provides an editable workflow definition that you can review, customize, and deploy with your own resources. (Related resources, such as functions or queues, will not be created automatically.)
4. Choose Use template to continue with your selection.

Note
Standard charges apply for services deployed to your account.

Step 2: Run the demo state machine

If you chose the Run a demo option, all related resources will be deployed and ready to run. If you chose the Build on it option, you might need to set placeholder values and create additional resources before you can run your custom workflow.

1. Choose Deploy and run.
2. Wait for the AWS CloudFormation stack to deploy. This can take up to 10 minutes.
3. After the Start execution option appears, review the Input and choose Start execution.

Congratulations! You should now have a running demo of your state machine. You can choose states in the Graph view to review input, output, variables, definition, and events.

Developing workflows with Step Functions

We recommend starting to build workflows in the Step Functions console and Workflow Studio visual editor. You can start from a blank canvas or choose starter templates for common scenarios.
Building your workflows requires the following tasks:

• Defining your workflow
• Running and debugging your workflow
• Deploying your workflow

You define a state machine in Amazon States Language. You can manually create your Amazon States Language definitions, but Workflow Studio will be featured in tutorials. With Workflow Studio, you can define your state machine, visualize and edit the steps, run and debug your workflow, and view the results, all from within the Step Functions console.

Working with Workflow Studio in Visual Studio Code

With the AWS Toolkit, you can use Workflow Studio from within VS Code to visualize, build, and even test individual states in your state machines. You provide state inputs and set variables, start the test, then you can see how your data is transformed. You can adjust the workflow and re-test. When finished, you can apply the changes to update the state machine. For more information, see Working with Workflow Studio in the AWS Toolkit for Visual Studio Code.

You can also use many Step Functions features from the AWS Command Line Interface (AWS CLI). For example, you can create a state machine and list your existing state machines. You can use Step Functions commands in the AWS CLI to start and manage executions, poll for activities, record task heartbeats, and more. For a complete list of Step Functions commands, descriptions of the available arguments, and examples showing their use, see the AWS CLI Command Reference.

AWS CLI commands follow the Amazon States Language closely, so you can use the AWS CLI to learn about the Step Functions API actions. You can also use your existing API knowledge to prototype code or perform Step Functions actions from the command line.

Validating state machine definitions

You can use the API to validate state machines and find potential problems before creating your workflow.
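You can also catch the most basic structural problems locally before calling the service. The sketch below is only a local pre-check written for illustration (it mirrors a few obvious rules, not the API's behavior); the commented-out call shows where the service-side ValidateStateMachineDefinition API would run the full validation:

```python
import json

def local_sanity_check(definition_str):
    # Return a list of basic structural problems in an ASL definition.
    # This is an illustrative pre-check only; the service-side
    # ValidateStateMachineDefinition API performs the real validation.
    try:
        definition = json.loads(definition_str)
    except ValueError as err:
        return [f"not valid JSON: {err}"]
    problems = []
    if "StartAt" not in definition:
        problems.append("missing top-level StartAt")
    states = definition.get("States")
    if not isinstance(states, dict) or not states:
        problems.append("missing or empty States")
    elif definition.get("StartAt") not in states:
        problems.append("StartAt does not name a state in States")
    return problems

# Service-side validation (requires AWS credentials):
# import boto3
# sfn = boto3.client("stepfunctions")
# result = sfn.validate_state_machine_definition(
#     definition=definition_str, type="STANDARD")
```

A local check like this is useful in unit tests, while the API call reports the diagnostics Step Functions itself would raise at creation time.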
To learn more about validating workflows, see ValidateStateMachineDefinition in the Step Functions API Reference.

To get started with minimal setup, you can follow the Creating a Lambda State Machine tutorial, which shows you how to define a workflow with a single step that calls a Lambda function, then run the workflow and view the results.

Defining your workflow

The first step in developing your workflow is defining the steps in Amazon States Language. Depending on your preference and tool, you can define your Step Functions state machines in JSON, YAML, or as a stringified Amazon States Language (ASL) definition. The following table shows ASL-based definition format support by tool.

AWS Tool                              Supported format(s)
Step Functions Console                JSON
HTTPS Service API                     Stringified ASL
AWS CLI                               Stringified ASL
Step Functions Local                  Stringified ASL
AWS Toolkit for Visual Studio Code    JSON, YAML
AWS SAM                               JSON, YAML
AWS CloudFormation                    JSON, YAML, Stringified ASL

Note: YAML single-line comments in the state machine definition of a template will
not be carried forward into the created resource's definition. If you need to persist a comment, you should use the Comment property within the state machine definition. For information, see State machine structure.

With AWS CloudFormation and AWS SAM, you can upload your state machine definitions to Amazon S3 (JSON or YAML format) and provide the definition's Amazon S3 location in the template. For information, see the AWS::StepFunctions::StateMachine S3Location page.

The following example AWS CloudFormation templates show how you can provide the same state machine definition using different input formats.
JSON with Definition

{
  "AWSTemplateFormatVersion": "2010-09-09",
  "Description": "AWS Step Functions sample template.",
  "Resources": {
    "MyStateMachine": {
      "Type": "AWS::StepFunctions::StateMachine",
      "Properties": {
        "RoleArn": { "Fn::GetAtt": [ "StateMachineRole", "Arn" ] },
        "TracingConfiguration": { "Enabled": true },
        "Definition": {
          "StartAt": "HelloWorld",
          "States": {
            "HelloWorld": {
              "Type": "Pass",
              "End": true
            }
          }
        }
      }
    },
    "StateMachineRole": {
      "Type": "AWS::IAM::Role",
      "Properties": {
        "AssumeRolePolicyDocument": {
          "Version": "2012-10-17",
          "Statement": [
            {
              "Action": [ "sts:AssumeRole" ],
              "Effect": "Allow",
              "Principal": { "Service": [ "states.amazonaws.com" ] }
            }
          ]
        },
        "ManagedPolicyArns": [],
        "Policies": [
          {
            "PolicyName": "StateMachineRolePolicy",
            "PolicyDocument": {
              "Statement": [
                {
                  "Action": [ "lambda:InvokeFunction" ],
                  "Resource": "*",
                  "Effect": "Allow"
                }
              ]
            }
          }
        ]
      }
    }
  },
  "Outputs": {
    "StateMachineArn": {
      "Value": { "Ref": "MyStateMachine" }
    }
  }
}

JSON with DefinitionString

{
  "AWSTemplateFormatVersion": "2010-09-09",
  "Description": "AWS Step Functions sample template.",
  "Resources": {
    "MyStateMachine": {
      "Type": "AWS::StepFunctions::StateMachine",
      "Properties": {
        "RoleArn": { "Fn::GetAtt": [ "StateMachineRole", "Arn" ] },
        "TracingConfiguration": { "Enabled": true },
        "DefinitionString": "{\n \"StartAt\": \"HelloWorld\",\n \"States\": {\n \"HelloWorld\": {\n \"Type\": \"Pass\",\n \"End\": true\n }\n }\n}"
      }
    },
    "StateMachineRole": {
      "Type": "AWS::IAM::Role",
      "Properties": {
        "AssumeRolePolicyDocument": {
          "Version": "2012-10-17",
          "Statement": [
            {
              "Action": [ "sts:AssumeRole" ],
              "Effect": "Allow",
              "Principal": { "Service": [ "states.amazonaws.com" ] }
            }
          ]
        },
        "ManagedPolicyArns": [],
        "Policies": [
          {
            "PolicyName": "StateMachineRolePolicy",
            "PolicyDocument": {
              "Statement": [
                {
                  "Action": [
"lambda:InvokeFunction" ], "Resource": "*", "Effect": "Allow" } ] } } ] } } }, "Outputs": { "StateMachineArn": { "Value": { "Ref": "MyStateMachine" } } } } YAML with Definition AWSTemplateFormatVersion: 2010-09-09 Description: AWS Step Functions sample template. Resources: MyStateMachine: Type: 'AWS::StepFunctions::StateMachine' Properties: RoleArn: !GetAtt - StateMachineRole - Arn TracingConfiguration: Enabled: true Definition: # This is a YAML comment. This will not be preserved in the state machine resource's definition. Comment: This is an ASL comment. This will be preserved in the state machine resource's definition. StartAt: HelloWorld States: Defining your workflow 321 AWS Step Functions Developer Guide HelloWorld: Type: Pass End: true StateMachineRole: Type: 'AWS::IAM::Role' Properties: AssumeRolePolicyDocument: Version: 2012-10-17 Statement: - Action: - 'sts:AssumeRole' Effect: Allow Principal: Service: - states.amazonaws.com ManagedPolicyArns: [] Policies: - PolicyName: StateMachineRolePolicy PolicyDocument: Statement: - Action: - 'lambda:InvokeFunction' Resource: "*" Effect: Allow Outputs: StateMachineArn: Value: Ref: MyStateMachine YAML with DefinitionString AWSTemplateFormatVersion: 2010-09-09 Description: AWS Step Functions sample template. 
Resources:
  MyStateMachine:
    Type: 'AWS::StepFunctions::StateMachine'
    Properties:
      RoleArn: !GetAtt
        - StateMachineRole
        - Arn
      TracingConfiguration:
        Enabled: true
      DefinitionString: |
        {
          "StartAt": "HelloWorld",
          "States": {
            "HelloWorld": {
              "Type": "Pass",
              "End": true
            }
          }
        }
  StateMachineRole:
    Type: 'AWS::IAM::Role'
    Properties:
      AssumeRolePolicyDocument:
        Version: 2012-10-17
        Statement:
          - Action:
              - 'sts:AssumeRole'
            Effect: Allow
            Principal:
              Service:
                - states.amazonaws.com
      ManagedPolicyArns: []
      Policies:
        - PolicyName: StateMachineRolePolicy
          PolicyDocument:
            Statement:
              - Action:
                  - 'lambda:InvokeFunction'
                Resource: "*"
                Effect: Allow
Outputs:
  StateMachineArn:
    Value:
      Ref: MyStateMachine

Develop workflows with AWS SDKs

Step Functions is supported by the AWS SDKs for Java, .NET, Ruby, PHP, Python (Boto 3), JavaScript, Go, and C++. These SDKs provide a convenient way to use the Step Functions HTTPS API actions in multiple programming languages. You can develop state machines, activities, or state machine starters using the API actions exposed by these SDK libraries. You can also access visibility operations using these libraries to develop your own Step Functions monitoring and reporting tools. See the reference documentation for the current AWS SDKs and Tools for Amazon Web Services.

Develop workflows through HTTPS requests

Step Functions provides service operations that are accessible through HTTPS requests. You can use these operations to communicate directly with Step Functions from
your own libraries. You can develop state machines, workers, or state machine starters using the service API actions. You can also access visibility operations through the API actions to develop your own monitoring and reporting tools. For details, see the AWS Step Functions API Reference.

Develop workflows with the AWS Step Functions Data Science SDK

Data scientists can create workflows that process and publish machine learning models using SageMaker AI and Step Functions. You can also create multi-step machine learning workflows in Python that orchestrate AWS infrastructure at scale. The AWS Step Functions Data Science SDK provides a Python API that can create and invoke Step Functions workflows. You can manage and execute these workflows directly in Python, as well as in Jupyter notebooks. For more information, see the AWS Step Functions Data Science Project on GitHub, the data science SDK documentation, and the example Jupyter notebooks and SageMaker AI examples on GitHub.

Running and debugging your workflows

You can start workflows in a number of ways, including from the console, from an API call (for example, from a Lambda function), from Amazon EventBridge and EventBridge Scheduler, or from another Step Functions state machine.
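When you start a workflow from code, the execution input is passed to the StartExecution API as a JSON string, and each execution name must be unique for the state machine. The following sketch shows one way you might assemble those arguments; the helper name and the ARN are illustrative placeholders, and actually sending the request requires a boto3 Step Functions client and AWS credentials.

```python
import json
import uuid

def build_start_execution_args(state_machine_arn, payload, name=None):
    """Assemble keyword arguments for a StartExecution call.
    The API expects the execution input as a JSON string, and each
    execution name must be unique for the state machine."""
    return {
        "stateMachineArn": state_machine_arn,
        "name": name or "run-" + uuid.uuid4().hex,
        "input": json.dumps(payload),
    }

args = build_start_execution_args(
    "arn:aws:states:us-east-1:123456789012:stateMachine:HelloWorld",  # placeholder ARN
    {"MyMultiplicationFactor": 7, "Items": [1, 2, 3]},
)
# With boto3 installed and credentials configured, you would then call:
# boto3.client("stepfunctions").start_execution(**args)
print(args["input"])  # {"MyMultiplicationFactor": 7, "Items": [1, 2, 3]}
```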
Running workflows can connect to third-party services, use AWS SDKs, and manipulate data while running. Various tools exist to both run and debug the execution steps and data flowing through your state machine. The following sections provide additional resources for running and debugging your workflows. To learn more about the ways to start state machine executions, see Starting state machines.

Choose an endpoint to run your workflows

To reduce latency and store data in a location that meets your requirements, Step Functions provides endpoints in different AWS Regions. Each endpoint in Step Functions is completely independent. A state machine or activity exists only within the Region where it was created. Any state machines and activities that you create in one Region do not share any data or attributes with those created in another Region. For example, you can register a state machine named STATES-Flows-1 in two different Regions. The STATES-Flows-1 state machine in one Region won't share data or attributes with the STATES-Flows-1 state machine in the other Region. For a list of Step Functions endpoints, see AWS Step Functions Regions and Endpoints in the AWS General Reference.

Development with VS Code

With the AWS toolkit, you can use Workflow Studio from within VS Code to visualize, build, and even test individual states in your state machines. You can also use your SAM and CloudFormation definition substitutions. You provide state inputs and set variables, start the test, and then you can see how your data is transformed. In the State definition tab, you can adjust the workflow and re-test. When finished, you can apply the changes to update the state machine. For more information, see Working with Step Functions and Working with Workflow Studio in the AWS Toolkit for Visual Studio Code.
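Testing an individual state comes down to checking how it transforms input into output. As a rough mental model only (this is our own sketch, not the Toolkit's actual test harness), a Pass state with no Result forwards its input unchanged, while a declared Result replaces it:

```python
def run_pass_state(state, state_input):
    """Minimal model of a Pass state: a declared Result becomes the
    output; otherwise the input passes through unchanged. (Real Pass
    states also support ResultPath, Parameters, and other fields that
    this sketch does not model.)"""
    if state.get("Type") != "Pass":
        raise ValueError("this sketch only models Pass states")
    return state.get("Result", state_input)

print(run_pass_state({"Type": "Pass", "End": True}, {"hello": "world"}))
# {'hello': 'world'}
print(run_pass_state({"Type": "Pass", "Result": {"fixed": 42}, "End": True}, {"hello": "world"}))
# {'fixed': 42}
```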
Deploying your workflows

After you have defined and debugged your workflows, you'll probably want to deploy them using an Infrastructure as Code (IaC) framework. You can choose to deploy your state machines using a variety of IaC options, including AWS Serverless Application Model, AWS CloudFormation, AWS CDK, and Terraform.

AWS Serverless Application Model

You can use AWS Serverless Application Model with Step Functions to build workflows and deploy the infrastructure you need, including Lambda functions, APIs, and events, to create serverless applications. You can also use the AWS SAM CLI in conjunction with the AWS Toolkit for Visual Studio Code as part of an integrated experience. For more information, see Using AWS SAM to build Step Functions workflows.

AWS CloudFormation

You can use your state machine definitions directly in AWS CloudFormation templates. For more information, see Using AWS CloudFormation to create a workflow in Step Functions.

AWS CDK

You can build Standard and Express state machines with AWS CDK. To build a Standard workflow, see Using CDK to create a Standard workflow. To build an Express workflow, see Using CDK to create an Express workflow.

Terraform

Terraform by HashiCorp is a framework for building applications using infrastructure as code (IaC). With
Terraform, you can create state machines and use features, such as previewing infrastructure deployments and creating reusable templates. Terraform templates help you maintain and reuse the code by breaking it down into smaller chunks. For more information, see Using Terraform to deploy state machines in Step Functions.

Developing workflows in Step Functions Workflow Studio

When editing a workflow in the AWS Step Functions console, you'll use a visual tool called Workflow Studio. With Workflow Studio, you can drag and drop states onto a canvas to build your workflows. You can add, edit, and configure states, set input and output filters, transform results, and set up error handling. As you modify states in your workflow, Workflow Studio will validate and auto-generate the state machine definition. You can review the generated code, edit the configuration, and even modify the text definition with the built-in code editor. When you're finished, you can save your workflow, run it, and then examine the results.

You can access Workflow Studio from the Step Functions console when you create or edit a workflow. You can also use Workflow Studio from within AWS Infrastructure Composer, a visual designer to create infrastructure as code with AWS Serverless Application Model and AWS CloudFormation.
To discover the benefits of this approach, see Using Workflow Studio in Infrastructure Composer.

Workflow Studio has three modes: Design, Code, and Config. In Design mode, you can drag and drop states onto the canvas. Code mode provides a built-in code editor for editing your workflow definitions within the console. In Config mode, you can manage your workflow configuration.

Working with Workflow Studio in Visual Studio Code

With the AWS toolkit, you can use Workflow Studio from within VS Code to visualize, build, and even test individual states in your state machines. You provide state inputs and set variables, start the test, and then you can see how your data is transformed. You can adjust the workflow and re-test. When finished, you can apply the changes to update the state machine. For more information, see Working with Workflow Studio in the AWS Toolkit for Visual Studio Code.

Design mode

Design mode provides a graphical interface to visualize your workflows as you build their prototypes. The following image shows the states browser, workflow canvas, inspector, and contextual help panels in the Design mode of Workflow Studio.

1. Mode buttons switch between the three modes. You cannot switch modes if your ASL workflow definition is invalid.
2. The States browser contains the following three tabs:
   • The Actions tab provides a list of AWS APIs that you can drag and drop into your workflow graph in the canvas. Each action represents a Task workflow state.
   • The Flow tab provides a list of flow states that you can drag and drop into your workflow graph in the canvas.
   • The Patterns tab provides several ready-to-use, reusable building blocks that you can use for a variety of use cases. For example, you can use these patterns to iteratively process data in an Amazon S3 bucket.
3.
The Canvas and workflow graph is where you drag and drop states into your workflow graph, change the order of states, and select states to configure or view.
4. The Inspector panel is where you can view and edit the properties of any state you've selected on the canvas. Turn on the Definition toggle to view the Amazon States Language code for your workflow, with the currently selected state highlighted.
5. Info links open a panel with contextual information when you need help. These panels also include links to related topics in the Step Functions documentation.
6. Design toolbar – Contains a set of buttons to perform common actions, such as undo, delete, and zoom in.
7. Utility buttons – A set of buttons to perform tasks, such as saving your workflows or exporting their ASL definitions in a JSON or YAML file.

States browser

From the States browser, you can select states to drag and drop onto your workflow canvas. The Actions tab provides a list of task states that connect to third-party HTTP endpoints and
AWS APIs. The Flow tab provides a list of states with which you can direct and control your workflow. Flow states include Choice, Parallel, Map, Pass, Wait, Success, and Fail. The Patterns tab provides ready-to-use, reusable pre-defined building blocks. You can search among all state types with the search box at the top of the panel.

Canvas and workflow graph

After you choose a state to add to your workflow, you can drag it to the canvas and drop it into your workflow graph. You can also drag and drop states to move them within your workflow. If your workflow is large, you can zoom in or out to view different parts of your workflow graph in the canvas.

Inspector panel

You can configure any states that you add to your workflow from the Inspector panel on the right. Choose the state you want to configure, and you will see its configuration options in the Inspector panel. To see the auto-generated ASL definition for your workflow code, turn on the Definition toggle. The ASL definition associated with the state you've selected will appear highlighted.

Code mode

In Code mode of Workflow Studio, you can use an integrated code editor to view, write, and edit the Amazon States Language (ASL) definition of your workflows within the Step Functions console.
The following screenshot shows the components in the Code mode.

1. Mode buttons switch between the three modes. You cannot switch modes if your ASL workflow definition is invalid.
2. The Code editor is where you write and edit the ASL definition of your workflows within the Workflow Studio. The code editor also provides features, such as syntax highlighting and auto-completion.
3. Graph visualization – Shows a real-time graphical visualization of your workflow.
4. Utility buttons – A set of buttons to perform tasks, such as saving your workflows or exporting their ASL definitions in a JSON or YAML file.
5. Code toolbar – Contains a set of buttons to perform common actions, such as undoing an action or formatting the code.
6. Graph toolbar – Contains a set of buttons to perform common actions, such as zooming in and zooming out the workflow graph.

Code editor

The code editor provides an IDE-like experience to write and edit your workflow definitions using JSON within the Workflow Studio. The code editor includes several features, such as syntax highlighting, auto-complete suggestions, ASL definition validation, and context-sensitive help display.

As you update your workflow definition, the Graph visualization renders a real-time graph of your workflow. You can also see the updated workflow graph in the Design mode. If you select a state in the Design mode or the graph visualization pane, the ASL definition of that state appears highlighted in the code editor. The ASL definition of your workflow is automatically updated if you reorder, delete, or add a state in the Design mode or the graph visualization pane.

The code editor can make suggestions to auto-complete fields and states.

• To see a list of fields you can include within a specific state, press Ctrl+Space.
• To generate a code snippet for a new state in your workflow, press Ctrl+Space after the current state's definition.
• To display a list of all available commands and keyboard shortcuts, press F1.

Graph visualization

The graph visualization panel shows your workflow in a graphical format. When you write your workflow definitions in the Code editor of Workflow Studio, the graph visualization pane renders a real-time graph of your workflow. As you reorder, delete, or duplicate a state in the graph visualization pane, the workflow definition in the Code editor is automatically updated. Similarly, as you update your workflow definitions, reorder, delete, or add a state in the Code editor, the visualization is automatically updated. If the JSON in the ASL definition of your workflow is invalid, the graph visualization panel pauses the rendering and displays a status message at the bottom of the pane.

Config mode

In
the Config mode of Workflow Studio, you can manage the general configuration of your state machines. In this mode, you can specify settings, such as the following:

• Details: Set the workflow name and type. Note that neither can be changed after you create the state machine.
• Permissions: You can create a new role (recommended), choose an existing role, or enter an ARN for a specific role. If you select the option to create a new role, Step Functions creates an execution role for your state machines using least privileges. The generated IAM roles are valid for the AWS Region in which you create the state machine. Prior to creation, you can review the permissions that Step Functions will automatically generate for your state machine.
• Logging: You can enable and set a log level for your state machine. Step Functions logs the execution history events based on your selection. You can optionally use a customer managed key to encrypt your logs. For more information about log levels, see Log levels for Step Functions execution events.

In Additional configuration, you can set one or more of the following optional configuration options:

• Enable X-Ray tracing: You can send traces to X-Ray for state machine executions, even when a trace ID is not passed by an upstream service.
For more information, see Trace Step Functions request data in AWS X-Ray.

• Publish version on creation: A version is a numbered, immutable snapshot of a state machine that you can run. Choose this option to publish a version of your state machine while creating the state machine. Step Functions publishes version 1 as the first revision of the state machine. For more information about versions, see State machine versions in Step Functions workflows.
• Encrypt with customer managed key: You can provide a key that you manage directly to encrypt your data. For information, see Data at rest encryption.
• Tags: Choose this box to add tags that can help you track and manage the costs associated with your resources, and provide better security in your IAM policies. For more information about tags, see Tagging state machines and activities in Step Functions.

Creating a workflow with Workflow Studio in Step Functions

Learn to create, edit, and run workflows using Step Functions Workflow Studio. After your workflow is ready, you can save, run, and export it.

In this topic
• Create a state machine
• Design a workflow
• Run your workflow
• Edit your workflow
• Export your workflow
• Creating a workflow prototype with placeholders

Create a state machine

In Workflow Studio, you can either choose a starter template or a blank template to create a workflow. A starter template is a ready-to-run sample project that automatically creates the workflow prototype and definition, and deploys all the related AWS resources that your project needs to your AWS account. You can use these starter templates to deploy and run them as is, or use the workflow prototypes to build on them. For more information about starter templates, see Deploy a state machine using a starter template for Step Functions. With a blank template, you use the Design or Code mode to create your custom workflow.

Create a state machine using a starter template

1.
Open the Step Functions console and choose Create state machine.
2. In the Choose a template dialog box, do one of the following to choose a sample project:
   • Type Task Timer in the Search by keyword box, and then choose Task Timer from the search results.
   • Browse through the sample projects listed under All on the right pane, and then choose Task Timer.
3. Choose Next to continue.
4. Choose how to use the template:
   a. Run a demo – creates a read-only state machine. After review, you can create the workflow and all related resources.
   b. Build on it – provides an editable workflow definition that you can review, customize, and deploy with your own resources. (Related resources, such as functions or queues, will not be created automatically.)
5. Choose Use template to continue with your selection.

Create a workflow using a blank template

When you want to start from a clean canvas, create a workflow from the blank template.

1. Open the Step Functions console.
2. Choose Create state machine.
3. In the Choose a template dialog box, select Blank.
4. Choose Select to open Workflow Studio in Design mode. You can now start designing your workflow in Design mode or writing your workflow definition in Code mode.
5. Choose Config to manage the configuration of your workflow in the Config mode. For example, provide a name for your workflow and choose its type.

Design a workflow

When you know the name of the state you want to add, use the search box at the top of the States browser to find it. Otherwise, look for the state you need in the browser and add it onto the canvas. You can reorder states in your workflow by dragging them to a different location in your workflow. As you drag a state onto the canvas, a line appears to show where the state will be inserted into your workflow, as shown in the following screenshot:
You can choose Code mode to edit the definition with the built-in code editor.

After you drop a state onto the canvas, you can configure it in the Inspector panel on the right. This panel contains the Configuration, Input, Output, and Error Handling tabs for each state or API action that you place on the canvas. You configure the states you include in your workflows in the Configuration tab. For example, the Configuration tab for the Lambda Invoke API action provides the following options:

• State name: You can identify the state with a custom name or accept the default generated name.
• API shows which API action is used by the state.
• Integration type: You can choose the service integration type used to call API actions on other services.
• Function name provides options to:
   • Enter a function name: You can enter your function name or its ARN.
   • Get function name at runtime from state input: You can use this option to dynamically get the function name from the state input based on the path you specify.
   • Select function name: You can directly select from the functions available in your account and region.
• Payload: You can choose to use the state input, a JSON object, or no payload to pass as the payload to your Lambda function. If you choose JSON, you can include both static values and values selected from the state input.
• (Optional) Some states will have an option to select Wait for task to complete or Wait for callback. When available, you can choose one of the following service integration patterns:
   • No option selected: Step Functions will use the Request Response integration pattern. Step Functions will wait for an HTTP response and then progress to the next state. Step Functions will not wait for a job to complete. When no options are available, the state will use this pattern.
   • Wait for task to complete: Step Functions will use the Run a Job (.sync) integration pattern.
   • Wait for callback: Step Functions will use the Wait for a Callback with Task Token integration pattern.
• (Optional) To access resources configured in different AWS accounts within your workflows, Step Functions provides cross-account access. IAM role for cross-account access provides options to:
   • Provide IAM role ARN: Specify the IAM role that contains appropriate resource access permissions. These resources are available in a target account, which is an AWS account to which you make cross-account calls.
   • Get IAM role ARN at runtime from state input: Specify a reference path to an existing key-value pair in the state's JSON input which contains the IAM role.
• Next state lets
you select the state you want to transition to next.
• (Optional) Comment field will not affect the workflow, but you can use it to annotate your workflow.

Some states will have additional generic configuration options. For example, the Amazon ECS RunTask state configuration contains an API Parameters field populated with placeholder values. For these states, you can replace the placeholder values with configurations that are suited to your needs.

To delete a state, you can press backspace, right-click and choose Delete state, or choose Delete on the Design toolbar.

Run your workflow

When your workflow is ready to go, you can run it and view its execution from the Step Functions console.

To run a workflow in Workflow Studio

1. In the Design, Code, or Config mode, choose Execute. The Start execution dialog box opens in a new tab.
2. In the Start execution dialog box, do the following:
   1. (Optional) Enter a custom execution name to override the generated default.

      Non-ASCII names and logging: Step Functions accepts names for state machines, executions, activities, and labels that contain non-ASCII characters. Because such characters will not work with Amazon CloudWatch, we recommend using only ASCII characters so you can track metrics in CloudWatch.

   2.
(Optional) In the Input box, enter input values in JSON format to run your workflow. 3. Choose Start execution. 4. The Step Functions console directs you to a page that's titled with your execution ID. This page is known as the Execution Details page. On this page, you can review the execution results as the execution progresses or after it's complete. To review the execution results, choose individual states on the Graph view, and then choose the individual tabs on the Step details pane to view each state's details including input, output, and definition respectively. For details about the execution information you can view on the Execution Details page, see Execution details overview. Edit your workflow You can edit an existing workflow visually in the Design mode of Workflow Studio. Create a workflow 338 AWS Step Functions Developer Guide In the Step Functions console, choose the workflow you want to edit from the State machines page. The workflow opens in Design mode of Workflow Studio. You can also edit the workflow definition in Code mode. Choose the Code button to view or edit the workflow definition in Workflow Studio. Note If you see errors in your workflow, you must fix them in Design mode. You can't switch to the Code or Config mode if any errors exist in your workflow. When you save changes to your workflow, you have the option to also publish a new version. With versions, you can choose to run the original or alternate versions of your workflow. To learn more about managing workflows with versions, see State machine versions in Step Functions workflows Export your workflow You can export your workflow's Amazon States Language (ASL) definition and your workflow graph: 1. Choose your workflow in the Step Functions console. 2. On the State machine detail page, choose Edit. 3. Choose the Actions dropdown button, and then do one or both of the following: • To export the workflow graph to an SVG or PNG file, under Export graph, select the format you want. 
• To export the workflow definition as a JSON or YAML file, under Export definition, select the format you want. Creating a workflow prototype with placeholders You can use Workflow Studio or Workflow Studio in Infrastructure Composer to create prototypes of new workflows that contain placeholder resources which are named resources that do not exist yet. To create a workflow prototype: 1. Sign in to the Step Functions console. Create a workflow 339 AWS Step Functions 2. Choose Create state machine. 3. In the Choose a template dialog box, select Blank. 4. Choose Select to open Workflow Studio in Design mode. Developer Guide 5. The Design mode of Workflow Studio opens. Design your workflow in Workflow Studio. To include placeholder resources: a. Choose the state for which you want to include a placeholder resource, and then in Configuration: • For Lambda Invoke states, choose Function name, then
choose Enter function name. You can also enter a custom name for your function. • For Amazon SQS Send Message states, choose Queue URL, then choose Enter queue URL. Enter a placeholder queue URL. • For Amazon SNS Publish states, from Topic, choose a topic ARN. • For all other states listed under Actions, you can use the default configuration. Note If you see errors in your workflow, you must fix them in Design mode. You can't switch to the Code or Config mode if any errors exist in your workflow. b. (Optional) To view the auto-generated ASL definition of your workflow, choose Definition. c. (Optional) To update the workflow definition in Workflow Studio, choose the Code button. Note If you see errors in your workflow definition, you must fix them in Code mode. You can't switch to the Design or Config mode if any errors exist in your workflow definition. 6. (Optional) To edit the state machine name, choose the edit icon next to the default state machine name of MyStateMachine and specify a name in the State machine name box. You can also switch to the Config mode to edit the default state machine name. 7. Specify your workflow settings, such as state machine type and its execution role. 8. Choose Create. You've now created a new workflow with placeholder resources that can be used to prototype.
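For reference, the ASL definition behind such a prototype might look like the following minimal sketch. The state name ProcessItem and the function name MyPlaceholderFunction are illustrative placeholders, not existing resources:

```json
{
  "Comment": "Workflow prototype with a placeholder Lambda resource (illustrative only)",
  "StartAt": "ProcessItem",
  "States": {
    "ProcessItem": {
      "Type": "Task",
      "Resource": "arn:aws:states:::lambda:invoke",
      "Parameters": {
        "FunctionName": "MyPlaceholderFunction",
        "Payload.$": "$"
      },
      "End": true
    }
  }
}
```

Before you can run the workflow, replace the placeholder with the name or ARN of a deployed function.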
You can export your workflow definition and the workflow graph. • To export your workflow definition as a JSON or YAML file, in the Design or Code mode, choose the Actions dropdown button. Then, under Export definition, select the format you want to export. You can use this exported definition as the starting point for local development with the AWS Toolkit for Visual Studio Code. • To export your workflow graph to an SVG or PNG file, in the Design or Code mode, choose the Actions dropdown button. Then, under Export graph, select the format you want. Configure states inputs and outputs with Workflow Studio in Step Functions Managing state and transforming data Learn about Passing data between states with variables and Transforming data with JSONata. Each state makes a decision or performs an action based on input that it receives. In most cases, it then passes output to other states. In Workflow Studio, you can configure how a state filters and manipulates its input and output data in the Input and Output tabs of the Inspector panel. Use the Info links to access contextual help when configuring inputs and outputs. Configure input and output 341 AWS Step Functions Developer Guide For detailed information about how Step Functions processes input and output, see Processing input and output in Step Functions. Configure input to a state Each state receives input from the previous state as JSON. If you want to filter the input, you can use the InputPath filter under the Input tab in the Inspector panel. The InputPath is a string, beginning with $, that identifies a specific JSON node. These are called reference paths, and they follow JsonPath syntax. To filter the input: • Choose Filter input with InputPath. • Enter a valid JsonPath for the InputPath filter. For example, $.data. Your InputPath filter will be added to your workflow. Example Example 1: Use InputPath filter in Workflow Studio Say the input to your state includes the following JSON data.
{ "comment": "Example for InputPath", "dataset1": { "val1": 1, "val2": 2, "val3": 3 }, "dataset2": { "val1": "a", "val2": "b", "val3": "c" } } To apply the InputPath filter, choose Filter input with InputPath, then enter an appropriate reference path. If you enter $.dataset2.val1, the following JSON is passed as input to the state. "a" A reference path can also have a selection of values. If the data you reference is { "a": [1, 2, 3, 4] } and you apply the reference path $.a[0:2] as the InputPath filter, the following is the result. [ 1, 2 ] Parallel workflow state, Map workflow state, and Pass workflow state flow states have an additional input filtering option called Parameters under
their Input tab. This filter takes effect after the InputPath filter and can be used to construct a custom JSON object consisting of one or more key-value pairs. The values of each pair can be static values, values selected from the input, or values selected from the Context object with a path (see Accessing execution data from the Context object in Step Functions). Note To specify that a parameter uses a reference path to point to a JSON node in the input, the parameter name must end with .$. Example Example 2: Create custom JSON input for Parallel state Say the following JSON data is the input to a Parallel state. { "comment": "Example for Parameters", "product": { "details": { "color": "blue", "size": "small", "material": "cotton" }, "availability": "in stock", "sku": "2317", "cost": "$23" } } To select part of this input and pass additional key-value pairs with a static value, you can specify the following in the Parameters field, under the Parallel state’s Input tab. { "comment": "Selecting what I care about.", "MyDetails": { "size.$": "$.product.details.size", "exists.$": "$.product.availability", "StaticValue": "foo" } } The following JSON data will be the result.
{ "comment": "Selecting what I care about.", "MyDetails": { "size": "small", "exists": "in stock", "StaticValue": "foo" } } Configure output of a state Each state produces JSON output that can be filtered before it is passed to the next state. There are several filters available, and each affects the output in a different way. Output filters available for each state are listed under the Output tab in the Inspector panel. For Task workflow state states, any output filters you select are processed in this order: 1. ResultSelector: Use this filter to manipulate the state’s result. You can construct a new JSON object with parts of the result. 2. Specifying state output using ResultPath in Step Functions: Use this filter to select a combination of the state input and the task result to pass to the output. 3. Filtering state output using OutputPath: Use this filter to filter the JSON output to choose which information from the result will be passed to the next state. Use ResultSelector ResultSelector is an optional output filter for the following states: • Task workflow state states, which are all states listed in the Actions tab of the States browser. • Map workflow state states, in the Flow tab of the States browser. • Parallel workflow state states, in the Flow tab of the States browser. ResultSelector can be used to construct a custom JSON object consisting of one or more key-value pairs. The values of each pair can either be static values or selected from the state's result with a path. Note To specify that a parameter uses a path to reference a JSON node in the result, the parameter name must end with .$. Example Example to use ResultSelector filter In this example, you use ResultSelector to manipulate the response from the Amazon EMR CreateCluster API call for an Amazon EMR CreateCluster state. The following is the result from the Amazon EMR CreateCluster API call.
{ "resourceType": "elasticmapreduce", "resource": "createCluster.sync", "output": { "SdkHttpMetadata": { "HttpHeaders": { "Content-Length": "1112", "Content-Type": "application/x-amz-JSON-1.1", "Date": "Mon, 25 Nov 2019 19:41:29 GMT", "x-amzn-RequestId": "1234-5678-9012" }, "HttpStatusCode": 200 }, "SdkResponseMetadata": { "RequestId": "1234-5678-9012" }, "ClusterId": "AKIAIOSFODNN7EXAMPLE" } } Configure input and output 345 AWS Step Functions Developer Guide To select part of this information and pass an additional key-value pair with a static value, specify the following in the ResultSelector field, under the state’s Output tab. { "result": "found", "ClusterId.$": "$.output.ClusterId", "ResourceType.$": "$.resourceType" } Using ResultSelector produces the following result. { "result": "found", "ClusterId": "AKIAIOSFODNN7EXAMPLE", "ResourceType": "elasticmapreduce" } Use ResultPath The output of a state can be a copy of its input, the result it produces, or a combination of its input and result. Use ResultPath to control which combination of these is passed to the state output. For more use cases of ResultPath, see Specifying state output using ResultPath in Step Functions. ResultPath is an optional output filter for the following states: • Task workflow state states,
which are all states listed in the Actions tab of the States browser. • Map workflow state states, in the Flow tab of the States browser. • Parallel workflow state states, in the Flow tab of the States browser. • Pass workflow state states, in the Flow tab of the States browser. ResultPath can be used to add the result into the original state input. The specified path indicates where to add the result. Example Example to use ResultPath filter Say the following is the input to a Task state. { "details": "Default example", "who": "AWS Step Functions" } The result of the Task state is the following. Hello, AWS Step Functions! You can add this result to the state’s input by applying ResultPath and entering a reference path that indicates where to add the result, such as $.taskresult. With this ResultPath, the following is the JSON that is passed as the state’s output. { "details": "Default example", "who": "AWS Step Functions", "taskresult": "Hello, AWS Step Functions!" } Use OutputPath The OutputPath filter lets you filter out unwanted information, and pass only the portion of JSON that you need. The OutputPath is a string, beginning with $, that identifies nodes within JSON text.
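As a sketch of how these filters fit together, the following Task state (with a hypothetical function named MyFunction) applies all three in order: ResultSelector keeps only the Lambda payload, ResultPath nests that selection into the original input at $.taskresult, and OutputPath then passes only that node to the next state:

```json
{
  "Type": "Task",
  "Resource": "arn:aws:states:::lambda:invoke",
  "Parameters": {
    "FunctionName": "MyFunction",
    "Payload.$": "$"
  },
  "ResultSelector": {
    "result.$": "$.Payload"
  },
  "ResultPath": "$.taskresult",
  "OutputPath": "$.taskresult",
  "End": true
}
```

With this configuration, the state output is an object with a single result key holding the Lambda payload, regardless of the metadata the Lambda Invoke API call returns.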
Example Example to use OutputPath filter Imagine a Lambda Invoke API call returns metadata in addition to the Lambda function’s result. { "ExecutedVersion": "$LATEST", "Payload": { "foo": "bar", "colors": [ "red", "blue", "green" ], "car": { "year": 2008, "make": "Toyota", "model": "Matrix" } }, "SdkHttpMetadata": { "AllHttpHeaders": { "X-Amz-Executed-Version": ["$LATEST"] ... You can use OutputPath to filter out the additional metadata. By default, the value of the OutputPath filter for Lambda Invoke states created through the Workflow Studio is $.Payload. This default value removes the additional metadata and returns an output equivalent to running the Lambda function directly. For the Lambda Invoke task result shown above, the $.Payload value for the OutputPath filter passes the following JSON data as the output. { "foo": "bar", "colors": [ "red", "blue", "green" ], "car": { "year": 2008, "make": "Toyota", "model": "Matrix" } } Note The OutputPath filter is the last output filter to take effect, so if you use additional output filters such as ResultSelector or ResultPath, you should modify the default value of $.Payload for the OutputPath filter accordingly. Set up execution roles with Workflow Studio in Step Functions You can use Workflow Studio to set up execution roles for your workflows. Every Step Functions state machine requires an AWS Identity and Access Management (IAM) role which grants the state machine permission to perform actions on AWS services and resources or call HTTPS APIs. This role is called an execution role. The execution role must contain IAM policies for each action, for example, policies that allow the state machine to invoke an AWS Lambda function, run an AWS Batch job, or call the Stripe API.
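For example, a least-privilege policy for a workflow that only invokes one Lambda function might look like the following sketch; the Region, account ID, and function name are placeholders:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "lambda:InvokeFunction",
      "Resource": "arn:aws:lambda:us-east-1:123456789012:function:MyFunction"
    }
  ]
}
```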
Step Functions requires you to provide an execution role in the following cases: • You create a state machine in the console, AWS SDKs or AWS CLI using the CreateStateMachine API. • You test a state in the console, AWS SDKs, or AWS CLI using the TestState API. Topics • About auto-generated roles • Automatically generating roles • Resolving role generation problems • Role for testing HTTP Tasks in Workflow Studio • Role for testing an optimized service integration in Workflow Studio • Role for testing an AWS SDK service integration in Workflow Studio • Role for testing flow states in Workflow Studio About auto-generated roles When you create a state machine in the Step Functions console, Workflow Studio can automatically create an execution role for you which contains the necessary IAM policies. Workflow Studio analyzes your state machine definition and generates policies with the least privileges necessary to execute your workflow. Workflow Studio can generate IAM policies for the following: • HTTP Tasks that call HTTPS APIs. • Task states that call other AWS services using optimized integrations, such as Lambda Invoke, DynamoDB GetItem, or AWS Glue StartJobRun. • Task states that run nested
workflows. • Distributed Map states, including policies to start child workflow executions, list Amazon S3 buckets, and read or write S3 objects. • X-Ray tracing. Every role that is auto-generated in Workflow Studio contains a policy which grants permissions for the state machine to send traces to X-Ray. • Using CloudWatch Logs to log execution history in Step Functions when logging is enabled on the state machine. Workflow Studio can't generate IAM policies for Task states that call other AWS services using AWS SDK integrations. Automatically generating roles 1. Open the Step Functions console and choose Create state machine. You can also update an existing state machine. Refer to Step 4 if you're updating a state machine. 2. In the Choose a template dialog box, select Blank. 3. Choose Select to open Workflow Studio in Design mode. 4. Choose the Config tab. 5. Scroll down to the Permissions section, and do the following: a. For Execution role, make sure you keep the default selection of Create new role. Workflow Studio automatically generates all the required IAM policies for every valid state in your state machine definition. It displays a banner with the message, An execution role will be created with full permissions.
Tip To review the permissions that Workflow Studio automatically generates for your state machine, choose Review auto-generated permissions. Note If you delete the IAM role that Step Functions creates, Step Functions can't recreate it later. Similarly, if you modify the role (for example, by removing Step Functions from the principals in the IAM policy), Step Functions can't restore its original settings later. If Workflow Studio can't generate all the required IAM policies, it displays a banner with the message Permissions for certain actions cannot be auto-generated. An IAM role will be created with partial permissions only. For information about how to add the missing permissions, see Resolving role generation problems. b. Choose Create if you're creating a state machine. Otherwise, choose Save. c. Choose Confirm in the dialog box that appears. Workflow Studio saves your state machine and creates the new execution role. Resolving role generation problems Workflow Studio can't automatically generate an execution role with all the required permissions in the following cases: • There are errors in your state machine. Make sure to resolve all validation errors in Workflow Studio. Also, make sure that you address any server-side errors you encounter in the course of saving. • Your state machine contains tasks that use AWS SDK integrations. Workflow Studio can't auto-generate IAM policies in this case. Workflow Studio displays a banner with the message, Permissions for certain actions cannot be auto-generated. An IAM role will be created with partial permissions only. In the Review auto-generated permissions table, choose the content in Status for more information about the policies your execution role is missing. Workflow Studio can still generate an execution role, but this role will not contain IAM policies for all actions.
See the links under Documentation links to write your own policies and add them to the role after it is generated. These links are available even after you save the state machine. Role for testing HTTP Tasks in Workflow Studio Testing an HTTP Task state requires an execution role. If you don’t have a role with sufficient permissions, use one of the following options to create a role: • Auto-generate a role with Workflow Studio (recommended) – This is the secure option. Close the Test state dialog box and follow the instructions in Automatically generating roles. This will require you to create or update your state machine first, then go back into Workflow Studio to test your state. • Use a role with Administrator access – If you have permissions to create a role with full access to all services and resources in AWS, you can use that role to test any type of state in your workflow. To do this, you can create a Step Functions service role and add the AdministratorAccess policy to it in the IAM console https://console.aws.amazon.com/iam/. Set up execution
roles 352 AWS Step Functions Developer Guide Role for testing an optimized service integration in Workflow Studio Task states that call optimized service integrations require an execution role. If you don’t have a role with sufficient permissions, use one of the following options to create a role: • Auto-generate a role with Workflow Studio (recommended) – This is the secure option. Close the Test state dialog box and follow the instructions in Automatically generating roles. This will require you to create or update your state machine first, then go back into Workflow Studio to test your state. • Use a role with Administrator access – If you have permissions to create a role with full access to all services and resources in AWS, you can use that role to test any type of state in your workflow. To do this, you can create a Step Functions service role and add the AdministratorAccess policy to it in the IAM console https://console.aws.amazon.com/iam/. Role for testing an AWS SDK service integration in Workflow Studio Task states that call AWS SDK integrations require an execution role. If you don’t have a role with sufficient permissions, use one of the following options to create a role: • Auto-generate a role with Workflow Studio (recommended) – This is the secure option. Close the Test state dialog box and follow the instructions in Automatically generating roles.
This will require you to create or update your state machine first, then go back into Workflow Studio to test your state. Do the following: 1. Close the Test state dialog box 2. Choose the Config tab to view the Config mode. 3. Scroll down to the Permissions section. 4. Workflow Studio displays a banner with the message, Permissions for certain actions cannot be auto-generated. An IAM role will be created with partial permissions only. Choose Review auto-generated permissions. 5. The Review auto-generated permissions table displays a row that shows the action corresponding to the task state you want to test. See the links under Documentation links to write your own IAM policies into a custom role. • Use a role with Administrator access – If you have permissions to create a role with full access to all services and resources in AWS, you can use that role to test any type of state in your workflow. To do this, you can create a Step Functions service role and add the AdministratorAccess policy to it in the IAM console https://console.aws.amazon.com/iam/. Set up execution roles 353 AWS Step Functions Developer Guide Role for testing flow states in Workflow Studio You require an execution role to test flow states in Workflow Studio. Flow states are those states that direct execution flow, such as Choice workflow state, Parallel workflow state, Map workflow state, Pass workflow state, Wait workflow state, Succeed workflow state, or Fail workflow state. The TestState API doesn't work with Map or Parallel states. Use one of the following options to create a role for testing a flow state: • Use any role in your AWS account (recommended) – Flow states do not require any specific IAM policies, because they don’t call AWS actions or resources. Therefore, you can use any IAM role in your AWS account. 1. In the Test state dialog box, select any role from the Execution role dropdown list. 2. If no roles appear in the dropdown list, do the following: a. 
In the IAM console https://console.aws.amazon.com/iam/, choose Roles. b. Choose a role from the list, and copy its ARN from the role details page. You will need to provide this ARN in the Test state dialog box. c. In the Test state dialog box, select Enter a role ARN from the Execution role dropdown list. d. Paste the ARN in Role ARN. • Use a role with Administrator access – If you have permissions to create a role with full access to all services and resources in AWS, you can use that role to test any type of state in your workflow. To do this, you can create a Step Functions service role and add the AdministratorAccess policy to it in the IAM console https://console.aws.amazon.com/iam/. Configure error handling with Workflow Studio in Step Functions Managing state and
transforming data Learn about Passing data between states with variables and Transforming data with JSONata. You can configure error handling within the Workflow Studio visual editor. By default, when a state reports an error, Step Functions causes the workflow execution to fail entirely. For actions and some flow states, you can configure how Step Functions handles errors. Even if you have configured error handling, some errors may still cause a workflow execution to fail. For more information, see Handling errors in Step Functions workflows. In Workflow Studio, configure error handling in the Error handling tab of the Inspector panel. Retry on errors You can add one or more rules to action states and the Parallel workflow state flow state to retry the task when an error occurs. These rules are called retriers. To add a retrier, choose the edit icon in the Retrier #1 box, then configure its options: • (Optional) In the Comment field, add your comment. It will not affect the workflow, but can be used to annotate your workflow. • Place the cursor in the Errors field and choose an error that will trigger the retrier, or enter a custom error name. You can choose or add multiple errors. • (Optional) Set an Interval. This is the time in seconds before Step Functions makes its first retry.
Additional retries will follow at intervals that you can configure with Max attempts and Backoff rate. • (Optional) Set Max attempts. This is the maximum number of retries before Step Functions will cause the execution to fail. • (Optional) Set the Backoff rate. This is a multiplier that determines by how much the retry interval will increase with each attempt. Note Not all error handling options are available for all states. Lambda Invoke has one retrier configured by default. Catch errors You can add one or more rules to action states and to the Parallel workflow state and Map workflow state flow states to catch an error. These rules are called catchers. To add a catcher, choose Add new catcher, then configure its options: • (Optional) In the Comment field, add your comment. It will not affect the workflow, but can be used to annotate your workflow. • Place the cursor in the Errors field and choose an error that will trigger the catcher, or enter a custom error name. You can choose or add multiple errors. • In the Fallback state field, choose a fallback state. This is the state that the workflow will move to next, after an error is caught. • (Optional) In the ResultPath field, add a ResultPath filter to add the error to the original state input. The ResultPath must be a valid JsonPath. This will be sent to the fallback state. Timeouts You can configure a timeout for action states to set the maximum number of seconds your state can run before it fails. Use timeouts to prevent stuck executions. To configure a timeout, enter the number of seconds your state should wait before the execution fails. For more information about timeouts, see TimeoutSeconds in Task workflow state state. HeartbeatSeconds You can configure a Heartbeat or periodic notification sent by your task.
If you set a heartbeat interval, and your state doesn't send heartbeat notifications in the configured intervals, the task is marked as failed. To configure a heartbeat, set a positive, non-zero integer number of seconds. For more information, see HeartBeatSeconds in Task workflow state state. Using Workflow Studio in Infrastructure Composer to build Step Functions workflows Workflow Studio is available in Infrastructure Composer to help you design and build your workflows. Workflow Studio in Infrastructure Composer provides a visual infrastructure as code (IaC) environment that makes it easy for you to incorporate workflows in your serverless applications built using IaC tools, such as CloudFormation templates. AWS Infrastructure Composer is a visual builder that helps you develop AWS SAM and AWS CloudFormation templates using a simple graphical interface. With Infrastructure Composer, you design an application architecture by dragging, grouping, and connecting AWS services in a visual canvas. Infrastructure Composer then creates an IaC template from your design
Workflow Studio is available in Infrastructure Composer to help you design and build your workflows. Workflow Studio in Infrastructure Composer provides a visual infrastructure as code (IaC) environment that makes it easy for you to incorporate workflows in your serverless applications built using IaC tools, such as CloudFormation templates. AWS Infrastructure Composer is a visual builder that helps you develop AWS SAM and AWS CloudFormation templates using a simple graphical interface. With Infrastructure Composer, you design an application architecture by dragging, grouping, and connecting AWS services in a visual canvas. Infrastructure Composer then creates an IaC template from your design that you can use to deploy your application with the AWS SAM Command Line Interface (AWS SAM CLI) or CloudFormation. To learn more about Infrastructure Composer, see What is Infrastructure Composer. Using Workflow Studio in Infrastructure Composer 356 AWS Step Functions Developer Guide When you use Workflow Studio in Infrastructure Composer, Infrastructure Composer connects the individual workflow steps to AWS resources and generates the resource configurations in an AWS SAM template. Infrastructure Composer also adds the IAM permissions required for your workflow to run. Using Workflow Studio in Infrastructure Composer, you can create prototypes of your applications and turn them into production-ready applications. When you use Workflow Studio in Infrastructure Composer, you can switch back and forth between the Infrastructure Composer canvas and Workflow Studio. 
Topics • Using Workflow Studio in Infrastructure Composer to build a serverless workflow • Dynamically reference resources using CloudFormation definition substitutions in Workflow Studio • Connect service integration tasks to enhanced component cards • Import existing projects and sync them locally • Export Step Functions workflows directly into AWS Infrastructure Composer • Unavailable Workflow Studio features in AWS Infrastructure Composer Using Workflow Studio in Infrastructure Composer to build a serverless workflow 1. Open the Infrastructure Composer console and choose Create project to create a project. 2. In the search field in the Resources palette, enter state machine. 3. Drag the Step Functions State machine resource onto the canvas. 4. Choose Edit in Workflow Studio to edit your state machine resource. The following animation shows how you can switch to the Workflow Studio for editing your state machine definition. An animation that illustrates how you can use Workflow Studio in Infrastructure Composer. The integration with Workflow Studio to edit state machines resources created in Infrastructure Composer is only available for AWS::Serverless::StateMachine resource. This integration is not available for templates that use the AWS::StepFunctions::StateMachine resource. Using Workflow Studio in Infrastructure Composer 357 AWS Step Functions Developer Guide Dynamically reference resources using CloudFormation definition substitutions in Workflow Studio In Workflow Studio, you can use CloudFormation definition substitutions in your workflow definition to dynamically reference resources that you've defined in your IaC template. You can add placeholder substitutions to your workflow definition using the ${dollar_sign_brace} notation and they are replaced with actual values during the CloudFormation stack creation process. For more information about definition substitutions, see DefinitionSubstitutions in AWS SAM templates. 
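For example, a Task state in the workflow definition can reference a Lambda function through a placeholder instead of a hard-coded ARN. In this sketch, the substitution name MyFunctionArn is illustrative:

```json
{
  "Process item": {
    "Type": "Task",
    "Resource": "arn:aws:states:::lambda:invoke",
    "Parameters": {
      "FunctionName": "${MyFunctionArn}",
      "Payload.$": "$"
    },
    "End": true
  }
}
```

The same key must also appear in the DefinitionSubstitutions property of the state machine resource, mapped to the real ARN (for example, with !GetAtt), so that CloudFormation replaces the placeholder during stack creation.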
The following animation shows how you can add placeholder substitutions for the resources in your state machine definition. Animation showing how to add placeholder substitutions for resources in your state machine. Connect service integration tasks to enhanced component cards You can connect the tasks that call optimized service integrations to enhanced component cards in Infrastructure Composer canvas. Doing this automatically maps any placeholder substitutions specified by the ${dollar_sign_brace} notation in your workflow definition and the DefinitionSubstitution property for your StateMachine resource. It also adds the appropriate AWS SAM policies for the state machine. If you map optimized service integration tasks with standard component cards, the connection line doesn't appear on the Infrastructure Composer canvas. The following animation shows how you can connect an optimized task to an enhanced component card and view the changes in Change Inspector. Animation showing how to connect tasks and optimized service integrations. You can't connect AWS SDK integrations in your Task state with enhanced component cards or optimized service integrations with standard component cards. For these tasks, you can map the substitutions in the Resource properties panel in Infrastructure Composer canvas, and add policies in the AWS SAM template. Tip Alternatively, you can also map placeholder substitutions for your state machine under Definition Substitutions in the Resource properties panel. When you do this, you must add the required permissions for the AWS service your Task state calls in the state machine Using Workflow Studio in Infrastructure Composer 358 AWS Step Functions Developer Guide execution role. For information about permissions your execution role might need, see Set up execution roles with Workflow Studio in Step Functions. The following animation shows how you can manually update the placeholder substitution mapping in the Resource properties panel. 
Animation showing how to update placeholder substitution mapping in the resource properties panel. Import existing projects and sync them locally
You can open existing CloudFormation and AWS SAM projects in Infrastructure Composer to visualize them for better understanding and modify their designs. Using Infrastructure Composer's local sync feature, you can automatically sync and save your template and code files to your local build machine. Using the local sync mode can complement your existing development flows. Make sure that your browser supports the File System Access API, which allows web applications to read, write, and save files in your local file system. We recommend using either Google Chrome or Microsoft Edge. Export Step Functions workflows directly into AWS Infrastructure Composer The AWS Step Functions console provides the ability to export a saved state machine workflow as a template that's recognized as an advanced IaC resource by Infrastructure Composer. This feature creates an IaC template as an AWS SAM schema and navigates you to Infrastructure Composer. For more information, see Exporting your workflow to IaC templates. Unavailable Workflow Studio features in AWS Infrastructure Composer When you use Workflow Studio in Infrastructure Composer, some of the Workflow Studio features are unavailable. In addition, the API Parameters section available in the Inspector panel supports CloudFormation definition substitutions.
You can add the substitutions in the Code mode using the ${dollar_sign_brace} notation. For more information about this notation, see DefinitionSubstitutions in AWS SAM templates. The following list describes the Workflow Studio features that are unavailable when you use Workflow Studio in Infrastructure Composer:
• Starter templates – Starter templates are ready-to-run sample projects that automatically create the workflow prototypes and definitions. These templates deploy all the related AWS resources that your project needs to your AWS account.
• Config mode – This mode lets you manage the configuration of your state machines. You can update your state machine configurations in your IaC templates or use the Resource properties panel in Infrastructure Composer canvas. For information about updating configurations in the Resource properties panel, see Connect service integration tasks to enhanced component cards.
• TestState API
• Option to import or export workflow definitions from the Actions dropdown button in Workflow Studio. Instead, from the Infrastructure Composer menu, select Open > Project folder. Make sure that you've enabled the local sync mode to automatically save your changes in the Infrastructure Composer canvas directly to your local machine.
• Execute button. When you use Workflow Studio in Infrastructure Composer, Infrastructure Composer generates the IaC code for your workflow. Therefore, you must first deploy the template. Then, run the workflow in the console or through the AWS Command Line Interface (AWS CLI).
Using AWS SAM to build Step Functions workflows
You can use AWS Serverless Application Model with Step Functions to build workflows and deploy the infrastructure you need, including Lambda functions, APIs and events, to create serverless applications.
You can also use the AWS Serverless Application Model CLI in conjunction with the AWS Toolkit for Visual Studio Code as part of an integrated experience to build and deploy AWS Step Functions state machines. You can build a serverless application with AWS SAM, then build out your state machine in the VS Code IDE. Then you can validate, package, and deploy your resources.
Tip To deploy a sample serverless application that starts a Step Functions workflow using AWS SAM, see Deploy with AWS SAM in The AWS Step Functions Workshop.
Why use Step Functions with AWS SAM?
When you use Step Functions with AWS SAM you can:
• Get started using an AWS SAM sample template.
• Build your state machine into your serverless application.
• Use variable substitution to substitute ARNs into your state machine at the time of deployment. AWS CloudFormation supports DefinitionSubstitutions that let you add dynamic references in your workflow definition to a value that you provide in your CloudFormation template. You can add dynamic references by adding substitutions to your workflow definition using the ${dollar_sign_brace} notation. You also need to define these dynamic references in the DefinitionSubstitutions property for the StateMachine resource in your CloudFormation template. These substitutions are replaced with actual values during the CloudFormation stack creation process. For more information, see DefinitionSubstitutions in AWS SAM templates.
• Specify your state machine's role using AWS SAM policy templates.
• Initiate state machine executions with API Gateway, EventBridge events, or on a schedule within your AWS SAM template.
Step Functions integration with the AWS SAM specification
You can use the AWS SAM Policy Templates to add permissions to your state machine. With these permissions, you can orchestrate Lambda functions and other AWS resources to form complex and robust workflows.
Step Functions integration with the SAM CLI
Step Functions is integrated with the AWS SAM CLI. Use this integration to quickly develop a state machine in your serverless application. Try the Create a Step Functions state machine using AWS SAM tutorial to learn how to use AWS SAM to create state machines. Supported AWS SAM CLI functions include:
• sam init – Initializes a serverless application with an AWS SAM template. Can be used with a SAM template for Step Functions.
• sam validate – Validates an AWS SAM template.
• sam package – Packages an AWS SAM application. It creates a ZIP file of your code and dependencies, and then uploads it to Amazon S3. It then returns a copy of your AWS SAM template, replacing references to local artifacts with the Amazon S3 location where the command uploaded the artifacts.
• sam deploy – Deploys an AWS SAM application.
• sam publish – Publishes an AWS SAM application to the AWS Serverless Application Repository. This command takes a packaged AWS SAM template and publishes the application to the specified region.
Note When using AWS SAM local, you can emulate Lambda and API Gateway locally. However, you can't emulate Step Functions locally using AWS SAM.
DefinitionSubstitutions in AWS SAM templates
You can define state machines using CloudFormation templates with AWS SAM. Using AWS SAM, you can define the state machine inline in the template or in a separate file. The following AWS SAM template includes a state machine that simulates a stock trading workflow. This state machine invokes three Lambda functions to check the price of a stock and determine whether to buy or sell the stock. This transaction is then recorded in an Amazon DynamoDB table. The ARNs for the Lambda functions and DynamoDB table in the following template are specified using DefinitionSubstitutions.
AWSTemplateFormatVersion: '2010-09-09' Transform: AWS::Serverless-2016-10-31 Description: | DefinitionSubstitutions in AWS SAM templates 362 AWS Step Functions Developer Guide step-functions-stock-trader Sample SAM Template for step-functions-stock-trader Resources: StockTradingStateMachine: Type: AWS::Serverless::StateMachine Properties: DefinitionSubstitutions: StockCheckerFunctionArn: !GetAtt StockCheckerFunction.Arn StockSellerFunctionArn: !GetAtt StockSellerFunction.Arn StockBuyerFunctionArn: !GetAtt StockBuyerFunction.Arn DDBPutItem: !Sub arn:${AWS::Partition}:states:::dynamodb:putItem DDBTable: !Ref TransactionTable Policies: - DynamoDBWritePolicy: TableName: !Ref TransactionTable - LambdaInvokePolicy: FunctionName: !Ref StockCheckerFunction - LambdaInvokePolicy: FunctionName: !Ref StockBuyerFunction - LambdaInvokePolicy: FunctionName: !Ref StockSellerFunction DefinitionUri: statemachine/stock_trader.asl.json StockCheckerFunction: Type: AWS::Serverless::Function Properties: CodeUri: functions/stock-checker/ Handler: app.lambdaHandler Runtime: nodejs18.x Architectures: - x86_64 StockSellerFunction: Type: AWS::Serverless::Function Properties: CodeUri: functions/stock-seller/ Handler: app.lambdaHandler Runtime: nodejs18.x Architectures: - x86_64 StockBuyerFunction: Type: AWS::Serverless::Function Properties: CodeUri: functions/stock-buyer/ Handler: app.lambdaHandler Runtime: nodejs18.x DefinitionSubstitutions in AWS SAM templates 363 AWS Step Functions Architectures: - x86_64 TransactionTable: Type: AWS::DynamoDB::Table Properties: AttributeDefinitions: - AttributeName: id AttributeType: S Developer Guide The following code is the state machine definition in the file stock_trader.asl.json which is used in the Create a Step Functions state machine using AWS SAM tutorial.This state machine definition contains several DefinitionSubstitutions denoted by the ${dollar_sign_brace} notation. 
For example, instead of specifying a static Lambda function ARN for the Check Stock Value task, the substitution ${StockCheckerFunctionArn} is used. This substitution is defined in the DefinitionSubstitutions property of the template. DefinitionSubstitutions is a map of key-value pairs for the state machine resource.
In DefinitionSubstitutions, ${StockCheckerFunctionArn} maps to the ARN of the StockCheckerFunction resource using the CloudFormation intrinsic function !GetAtt. When you deploy the AWS SAM template, the DefinitionSubstitutions in the template are replaced with the actual values. { "Comment": "A state machine that does mock stock trading.", "StartAt": "Check Stock Value", "States": { "Check Stock Value": { "Type": "Task", "Resource": "arn:aws:states:::lambda:invoke", "OutputPath": "$.Payload", "Parameters": { "Payload.$": "$", "FunctionName": "${StockCheckerFunctionArn}" }, "Next": "Buy or Sell?" }, "Buy or Sell?": { "Type": "Choice", "Choices": [ { "Variable": "$.stock_price", "NumericLessThanEquals": 50, DefinitionSubstitutions in AWS SAM templates 364 AWS Step Functions Developer Guide "Next": "Buy Stock" } ], "Default": "Sell Stock" }, "Buy Stock": { "Type": "Task", "Resource": "arn:aws:states:::lambda:invoke", "OutputPath": "$.Payload", "Parameters": { "Payload.$": "$", "FunctionName": "${StockBuyerFunctionArn}" }, "Retry": [ { "ErrorEquals": [ "Lambda.ServiceException", "Lambda.AWSLambdaException", "Lambda.SdkClientException", "Lambda.TooManyRequestsException" ], "IntervalSeconds": 1, "MaxAttempts": 3, "BackoffRate": 2 } ], "Next": "Record Transaction" }, "Sell Stock": { "Type": "Task", "Resource": "arn:aws:states:::lambda:invoke", "OutputPath": "$.Payload", "Parameters": { "Payload.$": "$", "FunctionName": "${StockSellerFunctionArn}" }, "Next": "Record Transaction" }, "Record Transaction": { "Type": "Task", "Resource": "arn:aws:states:::dynamodb:putItem", "Parameters": { "TableName": "${DDBTable}", "Item": { DefinitionSubstitutions in AWS SAM templates 365 AWS Step Functions Developer Guide "Id": { "S.$": "$.id" }, "Type": { "S.$": "$.type" }, "Price": { "N.$": "$.price" }, "Quantity": { "N.$": "$.qty" }, "Timestamp": { "S.$": "$.timestamp" } } }, "End": true } } } Next steps You can learn more about using Step Functions with AWS SAM with the following 
resources: • Complete the Create a Step Functions state machine using AWS SAM tutorial to create a state machine with AWS SAM. • Specify a AWS::Serverless::StateMachine resource. • Find AWS SAM Policy Templates to use. • Use AWS Toolkit for Visual Studio Code with Step Functions. • Review the AWS SAM CLI reference to learn more about the features available in AWS SAM. You can also design and build your workflows in infrastructure as code (IaC) using visual builders, such as Workflow Studio in Infrastructure Composer. For more information, see Using Workflow Studio in Infrastructure Composer to build Step Functions workflows. Next steps 366 AWS Step Functions Developer Guide Using AWS CloudFormation to create a workflow in Step Functions In this tutorial, you will create a AWS Lambda function using AWS CloudFormation. You'll use the AWS CloudFormation console and a YAML template to create a stack (IAM roles, the Lambda function, and the state machine). Then, you'll use the Step Functions console to start the state machine execution. For more information, see Working with CloudFormation Templates and the AWS::StepFunctions::StateMachine resource in the AWS CloudFormation User Guide. Step 1: Set up your AWS CloudFormation template Before you use the example templates, you should understand how to declare the different parts of an AWS CloudFormation template. To create an IAM role for Lambda Define the trust policy associated with the IAM role for the Lambda function. The following examples define a trust policy using either YAML or JSON. 
YAML LambdaExecutionRole: Type: "AWS::IAM::Role" Properties: AssumeRolePolicyDocument: Version: "2012-10-17" Statement: - Effect: Allow Principal: Service: lambda.amazonaws.com Action: "sts:AssumeRole" JSON "LambdaExecutionRole": { "Type": "AWS::IAM::Role", "Properties": { "AssumeRolePolicyDocument": { Create a state machine with CloudFormation 367 AWS Step Functions Developer Guide "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Principal": { "Service": "lambda.amazonaws.com" }, "Action": "sts:AssumeRole" } ] } } To create a Lambda function Define the following properties for a Lambda function that will print the message Hello World. Important Ensure that your Lambda function is under the same AWS account and AWS Region as your state machine. YAML MyLambdaFunction: Type: "AWS::Lambda::Function" Properties: Handler: "index.handler" Role: !GetAtt [ LambdaExecutionRole, Arn ] Code: ZipFile: | exports.handler = (event, context, callback) => { callback(null, "Hello World!"); }; Runtime: "nodejs12.x" Timeout: "25" JSON "MyLambdaFunction": { Step 1: Set up your AWS CloudFormation template 368 AWS Step Functions Developer Guide "Type": "AWS::Lambda::Function", "Properties": { "Handler": "index.handler", "Role": { "Fn::GetAtt": [ "LambdaExecutionRole", "Arn" ] }, "Code": { "ZipFile": "exports.handler = (event, context, callback) => {\n callback(null, \"Hello World!\");\n};\n" }, "Runtime": "nodejs12.x", "Timeout": "25" } }, To create an IAM role for the state machine execution Define the trust policy associated with the IAM role for the state machine execution. 
YAML StatesExecutionRole: Type: "AWS::IAM::Role" Properties: AssumeRolePolicyDocument: Version: "2012-10-17" Statement: - Effect: "Allow" Principal: Service: - !Sub states.${AWS::Region}.amazonaws.com
Action: "sts:AssumeRole" Path: "/" Policies: - PolicyName: StatesExecutionPolicy PolicyDocument: Version: "2012-10-17" Statement: - Effect: Allow Step 1: Set up your AWS CloudFormation template 369 AWS Step Functions Developer Guide Action: - "lambda:InvokeFunction" Resource: "*" JSON "StatesExecutionRole": { "Type": "AWS::IAM::Role", "Properties": { "AssumeRolePolicyDocument": { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Principal": { "Service": [ { "Fn::Sub": "states. ${AWS::Region}.amazonaws.com" } ] }, "Action": "sts:AssumeRole" } ] }, "Path": "/", "Policies": [ { "PolicyName": "StatesExecutionPolicy", "PolicyDocument": { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": [ "lambda:InvokeFunction" ], "Resource": "*" } ] } } Step 1: Set up your AWS CloudFormation template 370 AWS Step Functions Developer Guide ] } }, To create a Lambda state machine Define the Lambda state machine. YAML MyStateMachine: Type: "AWS::StepFunctions::StateMachine" Properties: DefinitionString: !Sub - |- { "Comment": "A Hello World example using an AWS Lambda function", "StartAt": "HelloWorld", "States": { "HelloWorld": { "Type": "Task", "Resource": "${lambdaArn}", "End": true } } } - {lambdaArn: !GetAtt [ MyLambdaFunction, Arn ]} RoleArn: !GetAtt [ StatesExecutionRole, Arn ] JSON "MyStateMachine": { "Type": "AWS::StepFunctions::StateMachine", "Properties": { "DefinitionString": { "Fn::Sub": [ "{\n \"Comment\": \"A Hello World example using an AWS Lambda function\",\n \"StartAt\": \"HelloWorld\",\n \"States\": {\n \"HelloWorld\": {\n \"Type\": \"Task\",\n \"Resource\": \"${lambdaArn}\", \n \"End\": true\n }\n }\n}", { Step 1: Set up your AWS CloudFormation template 371 AWS Step Functions Developer Guide "lambdaArn": { "Fn::GetAtt": [ "MyLambdaFunction", "Arn" ] } } ] }, "RoleArn": { "Fn::GetAtt": [ "StatesExecutionRole", "Arn" ] } } } Step 2: Use the AWS CloudFormation template to create a Lambda State Machine Once you understand the components 
of the AWS CloudFormation template, you can put them together and use the template to create an AWS CloudFormation stack. To create the Lambda state machine 1. Copy the following example data to a file named MyStateMachine.yaml for the YAML example, or MyStateMachine.json for JSON. YAML AWSTemplateFormatVersion: "2010-09-09" Description: "An example template with an IAM role for a Lambda state machine." Resources: LambdaExecutionRole: Type: "AWS::IAM::Role" Properties: AssumeRolePolicyDocument: Version: "2012-10-17" Statement: Step 2: Use the AWS CloudFormation template to create a Lambda State Machine 372 AWS Step Functions Developer Guide - Effect: Allow Principal: Service: lambda.amazonaws.com Action: "sts:AssumeRole" MyLambdaFunction: Type: "AWS::Lambda::Function" Properties: Handler: "index.handler" Role: !GetAtt [ LambdaExecutionRole, Arn ] Code: ZipFile: | exports.handler = (event, context, callback) => { callback(null, "Hello World!"); }; Runtime: "nodejs12.x" Timeout: "25" StatesExecutionRole: Type: "AWS::IAM::Role" Properties: AssumeRolePolicyDocument: Version: "2012-10-17" Statement: - Effect: "Allow" Principal: Service: - !Sub states.${AWS::Region}.amazonaws.com Action: "sts:AssumeRole" Path: "/" Policies: - PolicyName: StatesExecutionPolicy PolicyDocument: Version: "2012-10-17" Statement: - Effect: Allow Action: - "lambda:InvokeFunction" Resource: "*" MyStateMachine: Type: "AWS::StepFunctions::StateMachine" Properties: DefinitionString: Step 2: Use the AWS CloudFormation template to create a Lambda State Machine 373 AWS Step Functions Developer Guide !Sub - |- { "Comment": "A Hello World example using an AWS Lambda function", "StartAt": "HelloWorld", "States": { "HelloWorld": { "Type": "Task", "Resource": "${lambdaArn}", "End": true } } } - {lambdaArn: !GetAtt [ MyLambdaFunction, Arn ]} RoleArn: !GetAtt [ StatesExecutionRole, Arn ] JSON { "AWSTemplateFormatVersion": "2010-09-09", "Description": "An example template with an IAM role for a Lambda state 
machine.", "Resources": { "LambdaExecutionRole": { "Type": "AWS::IAM::Role", "Properties": { "AssumeRolePolicyDocument": { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Principal": { "Service": "lambda.amazonaws.com" }, "Action": "sts:AssumeRole" } ] } } }, "MyLambdaFunction": { "Type": "AWS::Lambda::Function", "Properties": { Step 2: Use the AWS CloudFormation template to create a Lambda State Machine 374 AWS Step Functions Developer Guide "Handler": "index.handler", "Role": { "Fn::GetAtt": [ "LambdaExecutionRole", "Arn" ] }, "Code": { "ZipFile": "exports.handler = (event, context, callback) => {\n callback(null, \"Hello World!\");\n};\n" }, "Runtime": "nodejs12.x", "Timeout": "25" } }, "StatesExecutionRole": { "Type": "AWS::IAM::Role", "Properties": { "AssumeRolePolicyDocument": { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Principal": { "Service": [ { "Fn::Sub": "states. ${AWS::Region}.amazonaws.com" } ] }, "Action": "sts:AssumeRole" } ] }, "Path": "/", "Policies": [ { "PolicyName": "StatesExecutionPolicy", "PolicyDocument": { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", Step 2: Use the AWS CloudFormation template to create a Lambda State Machine 375 AWS Step Functions Developer Guide "Action": [ "lambda:InvokeFunction" ], "Resource": "*" } ] } } ] } }, "MyStateMachine": { "Type": "AWS::StepFunctions::StateMachine", "Properties": { "DefinitionString": { "Fn::Sub": [ "{\n \"Comment\": \"A Hello World example using an AWS Lambda function\",\n \"StartAt\": \"HelloWorld\",\n \"States\": {\n \"HelloWorld\": {\n \"Type\": \"Task\",\n \"Resource\": \"${lambdaArn}\",\n \"End\": true\n }\n }\n}", { "lambdaArn": { "Fn::GetAtt": [ "MyLambdaFunction", "Arn" ] } } ] }, "RoleArn": { "Fn::GetAtt": [ "StatesExecutionRole", "Arn" ] } } } } } 2. Open the AWS CloudFormation console and choose Create Stack. Step 2: Use the AWS CloudFormation template to create a Lambda State Machine 376 AWS Step Functions Developer Guide 3. 
On the Select Template page, choose Upload a template to Amazon S3. Choose your MyStateMachine file, and then choose Next. 4. On the Specify Details page, for Stack name, enter MyStateMachine, and then choose Next. 5. On the Options page, choose Next. 6.
On the Review page, choose I acknowledge that AWS CloudFormation might create IAM resources, and then choose Create. AWS CloudFormation begins to create the MyStateMachine stack and displays the CREATE_IN_PROGRESS status. When the process is complete, AWS CloudFormation displays the CREATE_COMPLETE status. 7. (Optional) To display the resources in your stack, select the stack and choose the Resources tab. Step 3: Start a State Machine execution After you create your Lambda state machine, you can start its execution. To start the state machine execution 1. Open the Step Functions console and choose the name of the state machine that you created using AWS CloudFormation. 2. On the MyStateMachine-ABCDEFGHIJ1K page, choose New execution. The New execution page is displayed. 3. (Optional) Enter a custom execution name to override the generated default. Non-ASCII names and logging Step Functions accepts names for state machines, executions, activities, and labels that contain non-ASCII characters. Because such characters will not work with Amazon CloudWatch, we recommend using only ASCII characters so you can track metrics in CloudWatch. 4. Choose Start Execution. A new execution of your state machine starts, and a new page showing your running execution is displayed. 5.
(Optional) In the Execution Details, review the Execution Status and the Started and Closed timestamps. 6. To view the results of your execution, choose Output.
Using AWS CDK to create a Standard workflow in Step Functions
You can use the AWS Cloud Development Kit (AWS CDK) infrastructure as code (IaC) framework to create an AWS Step Functions state machine that contains an AWS Lambda function. You will define AWS infrastructure using one of the CDK's supported languages. After you define your infrastructure, you will synthesize your app to an AWS CloudFormation template and deploy it to your AWS account. You will use this method to define a Step Functions state machine containing a Lambda function, and then run the state machine from the Step Functions console. Before you begin this tutorial, you must set up your AWS CDK development environment as described in Getting Started With the AWS CDK - Prerequisites in the AWS Cloud Development Kit (AWS CDK) Developer Guide. Then, install the AWS CDK with the following command at the command line: npm install -g aws-cdk
This tutorial produces the same result as the section called “Create a state machine with CloudFormation”. However, in this tutorial, the AWS CDK doesn't require you to create any IAM roles; the AWS CDK does it for you. The AWS CDK version also includes a Succeed workflow state step to illustrate how to add additional steps to your state machine.
Tip To deploy a sample serverless application that starts a Step Functions workflow using AWS CDK with TypeScript, see Deploy with AWS CDK in The AWS Step Functions Workshop.
Step 1: Set up your AWS CDK project
1. In your home directory, or another directory if you prefer, run the following command to create a directory for your new AWS CDK app. Important Be sure to name the directory step.
The AWS CDK application template uses the name of the directory to generate names for source files and classes. If you use a different name, your app will not match this tutorial. TypeScript mkdir step && cd step JavaScript mkdir step && cd step Python mkdir step && cd step Java mkdir step && cd step C# Make sure you've installed .NET version 6.0 or higher. For information, see Supported versions. mkdir step && cd step 2. Initialize the app by using the cdk init command. Specify the desired template ("app") and programming language as shown in the following examples. Step 1: Set up your AWS CDK project 379 AWS Step Functions Developer Guide TypeScript cdk init --language typescript JavaScript cdk init --language javascript Python cdk init --language python After the project is initialized, activate the project's virtual environment and install the AWS CDK's baseline dependencies. source .venv/bin/activate python -m pip install -r requirements.txt Java cdk init --language java C# cdk init --language csharp Step 2: Use AWS
CDK to create a state machine First, we'll present the individual pieces of code that define the Lambda function and the Step Functions state machine. Then, we'll explain how to put them together in your AWS CDK app. Finally, you'll see how to synthesize and deploy these resources. To create a Lambda function The following AWS CDK code defines the Lambda function, providing its source code inline.
Step 2: Use AWS CDK to create a state machine 380 AWS Step Functions Developer Guide TypeScript const helloFunction = new lambda.Function(this, 'MyLambdaFunction', { code: lambda.Code.fromInline(` exports.handler = (event, context, callback) => { callback(null, "Hello World!"); }; `), runtime: lambda.Runtime.NODEJS_18_X, handler: "index.handler", timeout: cdk.Duration.seconds(3) }); JavaScript const helloFunction = new lambda.Function(this, 'MyLambdaFunction', { code: lambda.Code.fromInline(` exports.handler = (event, context, callback) => { callback(null, "Hello World!"); }; `), runtime: lambda.Runtime.NODEJS_18_X, handler: "index.handler", timeout: cdk.Duration.seconds(3) }); Python hello_function = lambda_.Function( self, "MyLambdaFunction", code=lambda_.Code.from_inline(""" exports.handler = (event, context, callback) => { callback(null, "Hello World!"); }"""), runtime=lambda_.Runtime.NODEJS_18_X, handler="index.handler", timeout=Duration.seconds(25)) Java final Function helloFunction = Function.Builder.create(this, "MyLambdaFunction") .code(Code.fromInline( "exports.handler = (event, context, callback) => { callback(null, 'Hello World!' );}")) .runtime(Runtime.NODEJS_18_X) .handler("index.handler") .timeout(Duration.seconds(25)) .build(); Step 2: Use AWS CDK to create a state machine 381 AWS Step Functions Developer Guide C# var helloFunction = new Function(this, "MyLambdaFunction", new FunctionProps { Code = Code.FromInline(@"exports.handler = (event, context, callback) => { callback(null, 'Hello World!'); }"), Runtime = Runtime.NODEJS_18_X, Handler = "index.handler", Timeout = Duration.Seconds(25) }); You can see in this short example: • The function's logical name, MyLambdaFunction. • The source code for the function, embedded as a string in the source code of the AWS CDK app. • Other function attributes, such as the runtime to be used (Node 18.x), the function's entry point, and a timeout.
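All five variants embed the same Node.js source, which follows Lambda's callback signature: the handler receives (event, context, callback) and reports success by passing null as the error and "Hello World!" as the result. The following Python sketch is purely illustrative — the invoke helper is a hypothetical name, not part of the tutorial's code or any AWS API — and only models that callback contract so you can see what a successful invocation hands back to the caller:

```python
# Illustrative model of Lambda's Node.js callback contract.
# The real handler in this tutorial is the inline JavaScript shown above;
# "invoke" is a hypothetical helper, not an AWS API.

def handler(event, context, callback):
    # Mirrors: exports.handler = (event, context, callback) => callback(null, "Hello World!");
    callback(None, "Hello World!")

def invoke(fn, event=None, context=None):
    """Capture the (error, result) pair the handler reports through its callback."""
    outcome = {}

    def callback(error, result):
        outcome["error"], outcome["result"] = error, result

    fn(event, context, callback)
    if outcome.get("error") is not None:
        raise RuntimeError(outcome["error"])
    return outcome["result"]

print(invoke(handler))  # Hello World!
```

A non-null first argument to the callback would instead surface as a function error, which a calling Step Functions task would see as a failed Lambda invocation.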
To create a state machine Our state machine has two states: a Lambda function task and a Succeed workflow state. To invoke our function, we create a Step Functions Task state. This Task state is used as the first step in the state machine. The success state is added to the state machine using the Task state's next() method. The following code first defines a Task state named MyLambdaTask that invokes the function, then uses the next() method to add a Succeed state named GreetedWorld. TypeScript const stateMachine = new sfn.StateMachine(this, 'MyStateMachine', { definition: new tasks.LambdaInvoke(this, "MyLambdaTask", { lambdaFunction: helloFunction }).next(new sfn.Succeed(this, "GreetedWorld")) }); Step 2: Use AWS CDK to create a state machine 382 AWS Step Functions Developer Guide JavaScript const stateMachine = new sfn.StateMachine(this, 'MyStateMachine', { definition: new tasks.LambdaInvoke(this, "MyLambdaTask", { lambdaFunction: helloFunction }).next(new sfn.Succeed(this, "GreetedWorld")) }); Python state_machine = sfn.StateMachine( self, "MyStateMachine", definition=tasks.LambdaInvoke( self, "MyLambdaTask", lambda_function=hello_function) .next(sfn.Succeed(self, "GreetedWorld"))) Java final StateMachine stateMachine = StateMachine.Builder.create(this, "MyStateMachine") .definition(LambdaInvoke.Builder.create(this, "MyLambdaTask") .lambdaFunction(helloFunction) .build() .next(new Succeed(this, "GreetedWorld"))) .build(); C# var stateMachine = new StateMachine(this, "MyStateMachine", new StateMachineProps { DefinitionBody = DefinitionBody.FromChainable(new LambdaInvoke(this, "MyLambdaTask", new LambdaInvokeProps { LambdaFunction = helloFunction }) .Next(new Succeed(this, "GreetedWorld"))) }); Step 2: Use AWS CDK to create a state machine 383 AWS Step Functions Developer Guide To build and deploy the AWS CDK app In your newly created AWS CDK project, edit the file that contains the stack's definition to look like the following example
code. You'll recognize the definitions of the Lambda function and the Step Functions state machine from previous sections. 1. Update the stack as shown in the following examples. TypeScript Update lib/step-stack.ts with the following code. import * as cdk from 'aws-cdk-lib'; import * as lambda from 'aws-cdk-lib/aws-lambda'; import * as sfn from 'aws-cdk-lib/aws-stepfunctions'; import * as tasks from 'aws-cdk-lib/aws-stepfunctions-tasks'; export class StepStack extends cdk.Stack { constructor(app: cdk.App, id: string) { super(app, id); const helloFunction = new lambda.Function(this, 'MyLambdaFunction', { code: lambda.Code.fromInline(` exports.handler = (event, context, callback) => { callback(null, "Hello World!"); }; `), runtime: lambda.Runtime.NODEJS_18_X, handler: "index.handler", timeout: cdk.Duration.seconds(3) }); const stateMachine = new sfn.StateMachine(this, 'MyStateMachine', { definition: new tasks.LambdaInvoke(this, "MyLambdaTask", { lambdaFunction: helloFunction }).next(new sfn.Succeed(this, "GreetedWorld")) }); } } JavaScript Update lib/step-stack.js with the following code. Step 2: Use AWS CDK to create a state machine 384 AWS Step Functions Developer Guide import * as cdk from 'aws-cdk-lib'; import * as lambda from 'aws-cdk-lib/aws-lambda'; import * as sfn from 'aws-cdk-lib/aws-stepfunctions'; import * as tasks from 'aws-cdk-lib/aws-stepfunctions-tasks'; export class StepStack extends cdk.Stack
{ constructor(app, id) { super(app, id); const helloFunction = new lambda.Function(this, 'MyLambdaFunction', { code: lambda.Code.fromInline(` exports.handler = (event, context, callback) => { callback(null, "Hello World!"); }; `), runtime: lambda.Runtime.NODEJS_18_X, handler: "index.handler", timeout: cdk.Duration.seconds(3) }); const stateMachine = new sfn.StateMachine(this, 'MyStateMachine', { definition: new tasks.LambdaInvoke(this, "MyLambdaTask", { lambdaFunction: helloFunction }).next(new sfn.Succeed(this, "GreetedWorld")) }); } } Python Update step/step_stack.py with the following code.
from aws_cdk import ( Duration, Stack, aws_stepfunctions as sfn, aws_stepfunctions_tasks as tasks, aws_lambda as lambda_ ) from constructs import Construct class StepStack(Stack): def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None: super().__init__(scope, construct_id, **kwargs) hello_function = lambda_.Function( self, "MyLambdaFunction", code=lambda_.Code.from_inline(""" exports.handler = (event, context, callback) => { callback(null, "Hello World!"); }"""), runtime=lambda_.Runtime.NODEJS_18_X, handler="index.handler", timeout=Duration.seconds(25)) state_machine = sfn.StateMachine( self, "MyStateMachine", definition=tasks.LambdaInvoke( self, "MyLambdaTask", lambda_function=hello_function) .next(sfn.Succeed(self, "GreetedWorld"))) Step 2: Use AWS CDK to create a state machine 385 AWS Step Functions Developer Guide Java Update src/main/java/com.myorg/StepStack.java with the following code. Step 2: Use AWS CDK to create a state machine 386 AWS Step Functions Developer Guide package com.myorg; import software.constructs.Construct; import software.amazon.awscdk.Stack; import software.amazon.awscdk.StackProps; import software.amazon.awscdk.Duration; import software.amazon.awscdk.services.lambda.Code; import software.amazon.awscdk.services.lambda.Function; import software.amazon.awscdk.services.lambda.Runtime; import software.amazon.awscdk.services.stepfunctions.StateMachine; import software.amazon.awscdk.services.stepfunctions.Succeed; import software.amazon.awscdk.services.stepfunctions.tasks.LambdaInvoke; public class StepStack extends Stack { public StepStack(final Construct scope, final String id) { this(scope, id, null); } public StepStack(final Construct scope, final String id, final StackProps props) { super(scope, id, props); final Function helloFunction = Function.Builder.create(this, "MyLambdaFunction") .code(Code.fromInline( "exports.handler = (event, context, callback) => { callback(null, 'Hello World!'
);}")) .runtime(Runtime.NODEJS_18_X) .handler("index.handler") .timeout(Duration.seconds(25)) .build(); final StateMachine stateMachine = StateMachine.Builder.create(this, "MyStateMachine") .definition(LambdaInvoke.Builder.create(this, "MyLambdaTask") .lambdaFunction(helloFunction) .build() .next(new Succeed(this, "GreetedWorld"))) .build(); } } C# Update src/Step/StepStack.cs with the following code. using Amazon.CDK; using Constructs; using Amazon.CDK.AWS.Lambda; using Amazon.CDK.AWS.StepFunctions; using Amazon.CDK.AWS.StepFunctions.Tasks; namespace Step { public class StepStack : Stack { internal StepStack(Construct scope, string id, IStackProps props = null) : base(scope, id, props) { var helloFunction = new Function(this, "MyLambdaFunction", new FunctionProps { Step 2: Use AWS CDK to create a state machine 387 AWS Step Functions Developer Guide Code = Code.FromInline(@"exports.handler = (event, context, callback) => { callback(null, 'Hello World!'); }"), Runtime = Runtime.NODEJS_18_X, Handler = "index.handler", Timeout = Duration.Seconds(25) }); var stateMachine = new StateMachine(this, "MyStateMachine", new StateMachineProps { DefinitionBody = DefinitionBody.FromChainable(new LambdaInvoke(this, "MyLambdaTask", new LambdaInvokeProps { LambdaFunction = helloFunction }) .Next(new Succeed(this, "GreetedWorld"))) }); } } } 2. Save the source file, and then run the cdk synth command in the app's main directory. AWS CDK runs the app and synthesizes an AWS CloudFormation template from it. AWS CDK then displays the template. Note If you used TypeScript to create your AWS CDK project, running the cdk synth command may return the following error. TSError: # Unable to compile TypeScript: bin/step.ts:7:33 - error TS2554: Expected 2 arguments, but got 3. Modify the bin/step.ts file as shown in the following example to resolve this error. 
#!/usr/bin/env node import 'source-map-support/register'; import * as cdk from 'aws-cdk-lib'; import { StepStack } from '../lib/step-stack'; Step 2: Use AWS CDK to create a state machine 388 AWS Step Functions Developer Guide const app = new cdk.App(); new StepStack(app, 'StepStack'); app.synth(); 3. To deploy the Lambda function and the Step Functions state machine to your AWS account, issue cdk deploy. You'll be asked to approve the IAM policies the AWS CDK has generated. Step 3: Start a state machine execution After you create your state machine, you can start its execution. To start the state machine execution 1. Open the Step Functions console and choose the name of the state machine that you created using AWS CDK. 2. On the state machine page, choose Start execution. The Start execution dialog box is displayed. 3. (Optional) Enter a custom execution name to override the generated default. Non-ASCII names and logging Step Functions accepts names for state machines, executions, activities, and labels that contain non-ASCII characters. Because such characters will not work with Amazon CloudWatch, we recommend using only ASCII characters so you can track metrics in CloudWatch. 4. Choose Start Execution. Your state machine's execution starts, and a new page showing your running execution is displayed. 5. The Step Functions console directs you to a page that's titled with your execution ID. This page is known as the Execution Details page. On this page, you can review the execution results as the execution progresses or
after it's complete. To review the execution results, choose individual states on the Graph view, and then choose the individual tabs on the Step details pane to view each state's details including input, output, and definition respectively. Step 3: Start a state machine execution 389 AWS Step Functions Developer Guide For details about the execution information you can view on the Execution Details page, see Execution details overview. Step 4: Clean Up After you've tested your state machine, we recommend that you remove both your state machine and the related Lambda function to free up resources in your AWS account. Run the cdk destroy command in your app's main directory to remove your state machine. Next steps To learn more about developing AWS infrastructure using AWS CDK, see the AWS CDK Developer Guide.
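As a recap, the state machine you deployed in this tutorial corresponds roughly to the following Amazon States Language definition. The sketch below is hand-written, not CDK output: the real synthesized definition uses the deployed function's actual ARN (the region and account below are placeholders), and the CDK also adds invocation parameters and a default Retry policy for the LambdaInvoke task, so the generated document will not match character for character. It is expressed in Python only so its shape is easy to print and inspect:

```python
import json

# Hand-written approximation of the synthesized definition; REGION and the
# account number are placeholders, not values from this tutorial.
definition = {
    "StartAt": "MyLambdaTask",
    "States": {
        "MyLambdaTask": {
            "Type": "Task",
            "Resource": "arn:aws:states:::lambda:invoke",
            "Parameters": {
                "FunctionName": "arn:aws:lambda:REGION:123456789012:function:MyLambdaFunction",
            },
            "Next": "GreetedWorld",
        },
        "GreetedWorld": {"Type": "Succeed"},
    },
}

print(json.dumps(definition, indent=2))
```

Seeing the two states laid out this way makes clear what next() did in the stack code: it set the Task state's Next field to the Succeed state's name.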
For information about writing AWS CDK apps in your language of choice, see: TypeScript Working with AWS CDK in TypeScript JavaScript Working with AWS CDK in JavaScript Python Working with AWS CDK in Python Java Working with AWS CDK in Java C# Working with AWS CDK in C# For more information about the AWS Construct Library modules used in this tutorial, see the following AWS CDK API Reference overviews: • aws-lambda • aws-stepfunctions • aws-stepfunctions-tasks Step 4: Clean Up 390 AWS Step Functions Developer Guide Using AWS CDK to create an Express workflow in Step Functions In this tutorial, you learn how to create an API Gateway REST API with a Synchronous Express state machine as the backend integration, using the AWS Cloud Development Kit (AWS CDK) Infrastructure as Code (IaC) framework. You will use the StepFunctionsRestApi construct to connect the state machine to the API Gateway. The StepFunctionsRestApi construct sets up a default input/output mapping and the API Gateway REST API, with required permissions and an HTTP “ANY” method. With the AWS CDK, you define an app in one of the CDK's supported languages, synthesize the code into an AWS CloudFormation template, and then deploy the infrastructure to your AWS account. You will define an API Gateway REST API integrated with a Synchronous Express state machine as the backend, and then use the AWS Management Console to initiate an execution. Before starting this tutorial, set up your AWS CDK development environment as described in Getting Started With the AWS CDK - Prerequisites, then install the AWS CDK by issuing: npm install -g aws-cdk Step 1: Set Up Your AWS CDK Project First, create a directory for your new AWS CDK app and initialize the project.
TypeScript mkdir stepfunctions-rest-api cd stepfunctions-rest-api cdk init --language typescript JavaScript mkdir stepfunctions-rest-api cd stepfunctions-rest-api cdk init --language javascript Using CDK to create an Express workflow 391 AWS Step Functions Developer Guide Python mkdir stepfunctions-rest-api cd stepfunctions-rest-api cdk init --language python After the project has been initialized, activate the project's virtual environment and install the AWS CDK's baseline dependencies. source .venv/bin/activate python -m pip install -r requirements.txt Java mkdir stepfunctions-rest-api cd stepfunctions-rest-api cdk init --language java C# mkdir stepfunctions-rest-api cd stepfunctions-rest-api cdk init --language csharp Go mkdir stepfunctions-rest-api cd stepfunctions-rest-api cdk init --language go Note Be sure to name the directory stepfunctions-rest-api. The AWS CDK application template uses the name of the directory to generate names for source files and classes. If you use a different name, your app will not match this tutorial. Step 1: Set Up Your AWS CDK Project 392 AWS Step Functions Developer Guide Now install the construct library modules for AWS Step Functions and Amazon API Gateway. TypeScript npm install @aws-cdk/aws-stepfunctions @aws-cdk/aws-apigateway JavaScript npm install @aws-cdk/aws-stepfunctions @aws-cdk/aws-apigateway Python python -m pip install aws-cdk.aws-stepfunctions python -m pip install aws-cdk.aws-apigateway Java Edit the project's pom.xml to add the following dependencies inside the existing <dependencies> container. <dependency> <groupId>software.amazon.awscdk</groupId> <artifactId>stepfunctions</artifactId> <version>${cdk.version}</version> </dependency> <dependency> <groupId>software.amazon.awscdk</groupId> <artifactId>apigateway</artifactId> <version>${cdk.version}</version> </dependency> Maven automatically installs these dependencies the next time you build your app.
To build, issue mvn compile or use your Java IDE's Build command. C# dotnet add src/StepfunctionsRestApi package Amazon.CDK.AWS.Stepfunctions dotnet add src/StepfunctionsRestApi package Amazon.CDK.AWS.APIGateway You may also install the indicated packages using the Visual Studio NuGet GUI, available via Tools > NuGet Package Manager >
Manage NuGet Packages for Solution. Step 1: Set Up Your AWS CDK Project 393 AWS Step Functions Developer Guide Once you have installed the modules, you can use them in your AWS CDK app by importing the following packages. TypeScript @aws-cdk/aws-stepfunctions @aws-cdk/aws-apigateway JavaScript @aws-cdk/aws-stepfunctions @aws-cdk/aws-apigateway Python aws_cdk.aws_stepfunctions aws_cdk.aws_apigateway Java software.amazon.awscdk.services.apigateway.StepFunctionsRestApi software.amazon.awscdk.services.stepfunctions.Pass software.amazon.awscdk.services.stepfunctions.StateMachine software.amazon.awscdk.services.stepfunctions.StateMachineType C# Amazon.CDK.AWS.StepFunctions Amazon.CDK.AWS.APIGateway Go Add the following to import inside stepfunctions-rest-api.go.
"github.com/aws/aws-cdk-go/awscdk/awsapigateway" "github.com/aws/aws-cdk-go/awscdk/awsstepfunctions" Step 1: Set Up Your AWS CDK Project 394 AWS Step Functions Developer Guide Step 2: Use the AWS CDK to create an API Gateway REST API with Synchronous Express State Machine backend integration First, we'll present the individual pieces of code that define the Synchronous Express State Machine and the API Gateway REST API, then explain how to put them together into your AWS CDK app. Then you'll see how to synthesize and deploy these resources. Note The State Machine that we will show here will be a simple State Machine with a Pass state. To create an Express State Machine This is the AWS CDK code that defines a simple state machine with a Pass state. TypeScript const machineDefinition = new stepfunctions.Pass(this, 'PassState', { result: {value:"Hello!"}, }) const stateMachine = new stepfunctions.StateMachine(this, 'MyStateMachine', { definition: machineDefinition, stateMachineType: stepfunctions.StateMachineType.EXPRESS, }); JavaScript const machineDefinition = new sfn.Pass(this, 'PassState', { result: {value:"Hello!"}, }) const stateMachine = new sfn.StateMachine(this, 'MyStateMachine', { definition: machineDefinition, stateMachineType: stepfunctions.StateMachineType.EXPRESS, }); Python machine_definition = sfn.Pass(self,"PassState", Step 2: Use the AWS CDK to create an API Gateway REST API with Synchronous Express State Machine backend integration 395 AWS Step Functions Developer Guide result = sfn.Result("Hello")) state_machine = sfn.StateMachine(self, 'MyStateMachine', definition = machine_definition, state_machine_type = sfn.StateMachineType.EXPRESS) Java C# Pass machineDefinition = Pass.Builder.create(this, "PassState") .result(Result.fromString("Hello")) .build(); StateMachine stateMachine = StateMachine.Builder.create(this, "MyStateMachine") .definition(machineDefinition) .stateMachineType(StateMachineType.EXPRESS) .build(); var machineDefinition = new 
Pass(this, "PassState", new PassProps { Result = Result.FromString("Hello") }); var stateMachine = new StateMachine(this, "MyStateMachine", new StateMachineProps { Definition = machineDefinition, StateMachineType = StateMachineType.EXPRESS }); Go var machineDefinition = awsstepfunctions.NewPass(stack, jsii.String("PassState"), &awsstepfunctions.PassProps { Result: awsstepfunctions.NewResult(jsii.String("Hello")), }) var stateMachine = awsstepfunctions.NewStateMachine(stack, jsii.String("StateMachine"), &awsstepfunctions.StateMachineProps { Definition: machineDefinition, StateMachineType: awsstepfunctions.StateMachineType_EXPRESS, }) Step 2: Use the AWS CDK to create an API Gateway REST API with Synchronous Express State Machine backend integration 396 AWS Step Functions Developer Guide You can see in this short snippet: • The machine definition named PassState, which is a Pass state. • The State Machine’s logical name, MyStateMachine. • The machine definition is used as the State Machine definition. • The State Machine Type is set as EXPRESS because StepFunctionsRestApi only allows a Synchronous Express state machine. To create the API Gateway REST API using StepFunctionsRestApi construct We will use the StepFunctionsRestApi construct to create the API Gateway REST API with required permissions and default input/output mapping.
TypeScript const api = new apigateway.StepFunctionsRestApi(this, 'StepFunctionsRestApi', { stateMachine: stateMachine }); JavaScript const api = new apigateway.StepFunctionsRestApi(this, 'StepFunctionsRestApi', { stateMachine: stateMachine }); Python api = apigw.StepFunctionsRestApi(self, "StepFunctionsRestApi", state_machine = state_machine) Java StepFunctionsRestApi api = StepFunctionsRestApi.Builder.create(this, "StepFunctionsRestApi") .stateMachine(stateMachine) .build(); Step 2: Use the AWS CDK to create an API Gateway REST API with Synchronous Express State Machine backend integration 397 AWS Step Functions Developer Guide C# var api = new StepFunctionsRestApi(this, "StepFunctionsRestApi", new StepFunctionsRestApiProps { StateMachine = stateMachine }); Go awsapigateway.NewStepFunctionsRestApi(stack, jsii.String("StepFunctionsRestApi"), &awsapigateway.StepFunctionsRestApiProps { StateMachine: stateMachine, }) To build and deploy the AWS CDK app In the AWS CDK project you created, edit the file containing the definition of the stack to look like the code below. You'll recognize the definitions of the Step Functions state machine and the API Gateway from above. TypeScript Update lib/stepfunctions-rest-api-stack.ts to read as follows.
import * as cdk from 'aws-cdk-lib'; import * as stepfunctions from 'aws-cdk-lib/aws-stepfunctions' import * as apigateway from 'aws-cdk-lib/aws-apigateway'; export class StepfunctionsRestApiStack extends cdk.Stack { constructor(scope: cdk.App, id: string, props?: cdk.StackProps) { super(scope, id, props); const machineDefinition = new stepfunctions.Pass(this, 'PassState', { result: {value:"Hello!"}, }); const stateMachine = new stepfunctions.StateMachine(this, 'MyStateMachine', { definition: machineDefinition, stateMachineType: stepfunctions.StateMachineType.EXPRESS, }); const api = new apigateway.StepFunctionsRestApi(this, 'StepFunctionsRestApi', { stateMachine: stateMachine }); } } Step 2: Use the AWS CDK to create an API Gateway REST API with Synchronous Express State Machine backend integration 398 AWS Step Functions Developer Guide JavaScript Update lib/stepfunctions-rest-api-stack.js to read as follows. const cdk = require('@aws-cdk/core'); const stepfunctions = require('@aws-cdk/aws-stepfunctions'); const
apigateway = require('@aws-cdk/aws-apigateway'); class StepfunctionsRestApiStack extends cdk.Stack { constructor(scope, id, props) { super(scope, id, props); const machineDefinition = new stepfunctions.Pass(this, "PassState", { result: {value:"Hello!"}, }) const stateMachine = new stepfunctions.StateMachine(this, 'MyStateMachine', { definition: machineDefinition, stateMachineType: stepfunctions.StateMachineType.EXPRESS, }); const api = new apigateway.StepFunctionsRestApi(this, 'StepFunctionsRestApi', { stateMachine: stateMachine }); } } module.exports = { StepfunctionsRestApiStack } Python Update stepfunctions_rest_api/stepfunctions_rest_api_stack.py to read as follows.
from aws_cdk import App, Stack from constructs import Construct from aws_cdk import aws_stepfunctions as sfn from aws_cdk import aws_apigateway as apigw class StepfunctionsRestApiStack(Stack): def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None: super().__init__(scope, construct_id, **kwargs) machine_definition = sfn.Pass(self,"PassState", result = sfn.Result("Hello")) state_machine = sfn.StateMachine(self, 'MyStateMachine', definition = machine_definition, state_machine_type = sfn.StateMachineType.EXPRESS) api = apigw.StepFunctionsRestApi(self, "StepFunctionsRestApi", state_machine = state_machine) Step 2: Use the AWS CDK to create an API Gateway REST API with Synchronous Express State Machine backend integration 399 AWS Step Functions Developer Guide Java Update src/main/java/com.myorg/StepfunctionsRestApiStack.java with the following code. Step 2: Use the AWS CDK to create an API Gateway REST API with Synchronous Express State Machine backend integration 400 AWS Step Functions Developer Guide package com.myorg; import software.amazon.awscdk.core.Construct; import software.amazon.awscdk.core.Stack; import software.amazon.awscdk.core.StackProps; import software.amazon.awscdk.services.stepfunctions.Pass; import software.amazon.awscdk.services.stepfunctions.Result; import software.amazon.awscdk.services.stepfunctions.StateMachine; import software.amazon.awscdk.services.stepfunctions.StateMachineType; import software.amazon.awscdk.services.apigateway.StepFunctionsRestApi; public class StepfunctionsRestApiStack extends Stack { public StepfunctionsRestApiStack(final Construct scope, final String id) { this(scope, id, null); } public StepfunctionsRestApiStack(final Construct scope, final String id, final StackProps props) { super(scope, id, props); Pass machineDefinition = Pass.Builder.create(this, "PassState") .result(Result.fromString("Hello")) .build(); StateMachine stateMachine = StateMachine.Builder.create(this, "MyStateMachine") .definition(machineDefinition) .stateMachineType(StateMachineType.EXPRESS)
.build(); StepFunctionsRestApi api = StepFunctionsRestApi.Builder.create(this, "StepFunctionsRestApi") .stateMachine(stateMachine) .build(); } } C# Update src/StepfunctionsRestApi/StepfunctionsRestApiStack.cs to read as follows. using Amazon.CDK; using Constructs; using Amazon.CDK.AWS.StepFunctions; using Amazon.CDK.AWS.APIGateway; namespace StepfunctionsRestApi { public class StepfunctionsRestApiStack : Stack { internal StepfunctionsRestApiStack(Construct scope, string id, IStackProps props = null) : base(scope, id, props) { var machineDefinition = new Pass(this, "PassState", new PassProps { Result = Result.FromString("Hello") }); var stateMachine = new StateMachine(this, "MyStateMachine", new StateMachineProps { Definition = machineDefinition, StateMachineType = StateMachineType.EXPRESS }); Step 2: Use the AWS CDK to create an API Gateway REST API with Synchronous Express State Machine backend integration 401 AWS Step Functions Developer Guide var api = new StepFunctionsRestApi(this, "StepFunctionsRestApi", new StepFunctionsRestApiProps { StateMachine = stateMachine }); } } } Go Update stepfunctions-rest-api.go to read as follows.
package main

import (
	"github.com/aws/aws-cdk-go/awscdk"
	"github.com/aws/aws-cdk-go/awscdk/awsapigateway"
	"github.com/aws/aws-cdk-go/awscdk/awsstepfunctions"
	"github.com/aws/constructs-go/constructs/v3"
	"github.com/aws/jsii-runtime-go"
)

type StepfunctionsRestApiGoStackProps struct {
	awscdk.StackProps
}

func NewStepfunctionsRestApiGoStack(scope constructs.Construct, id string, props *StepfunctionsRestApiGoStackProps) awscdk.Stack {
	var sprops awscdk.StackProps
	if props != nil {
		sprops = props.StackProps
	}
	stack := awscdk.NewStack(scope, &id, &sprops)

	// The code that defines your stack goes here
	var machineDefinition = awsstepfunctions.NewPass(stack, jsii.String("PassState"), &awsstepfunctions.PassProps{
		Result: awsstepfunctions.NewResult(jsii.String("Hello")),
	})

	var stateMachine = awsstepfunctions.NewStateMachine(stack, jsii.String("StateMachine"), &awsstepfunctions.StateMachineProps{
		Definition:       machineDefinition,
		StateMachineType: awsstepfunctions.StateMachineType_EXPRESS,
	})

	awsapigateway.NewStepFunctionsRestApi(stack, jsii.String("StepFunctionsRestApi"), &awsapigateway.StepFunctionsRestApiProps{
		StateMachine: stateMachine,
	})

	return stack
}

func main() {
	app := awscdk.NewApp(nil)

	NewStepfunctionsRestApiGoStack(app, "StepfunctionsRestApiGoStack", &StepfunctionsRestApiGoStackProps{
		awscdk.StackProps{
			Env: env(),
		},
	})

	app.Synth(nil)
}

// env determines the AWS environment (account+region) in which our stack is to
// be deployed. For more information see: https://docs.aws.amazon.com/cdk/latest/guide/environments.html
func env() *awscdk.Environment {
	// If unspecified, this stack will be "environment-agnostic".
	// Account/Region-dependent features and context lookups will not work, but a
	// single synthesized template can be deployed anywhere.
	//---------------------------------------------------------------------------
	return nil

	// Uncomment if you know exactly what account and region you want to deploy
	// the stack to. This is the recommendation for production stacks.
	//---------------------------------------------------------------------------
	// return &awscdk.Environment{
	//  Account: jsii.String("account-id"),
	//  Region:  jsii.String("us-east-1"),
	// }

	// Uncomment to specialize this stack for the AWS Account and Region that are
	// implied by the current CLI configuration. This is recommended for dev
	// stacks.
	//---------------------------------------------------------------------------
	// return &awscdk.Environment{
	//  Account: jsii.String(os.Getenv("CDK_DEFAULT_ACCOUNT")),
	//  Region:  jsii.String(os.Getenv("CDK_DEFAULT_REGION")),
	// }
}

Save the source file, then issue cdk synth in the app's main directory. The AWS CDK runs the app and synthesizes an AWS CloudFormation template from it, then displays the template. To actually deploy the Amazon API Gateway and the AWS Step Functions state machine to your AWS account,
issue cdk deploy. You'll be asked to approve the IAM policies the AWS CDK has generated.

Step 3: Test the API Gateway

After you create your API Gateway REST API with Synchronous Express State Machine as the backend integration, you can test the API Gateway.

To test the deployed API Gateway using API Gateway console

1. Open the Amazon API Gateway console and sign in.
2. Choose your REST API named StepFunctionsRestApi.
3. In the Resources pane, choose the ANY method.
4. Choose the Test tab. You might need to choose the right arrow button to show the tab.
5. For Method, choose POST.
6. For Request body, copy the following request parameters.

{
    "key": "Hello"
}

7. Choose Test. The following information will be displayed:
• Request is the resource's path that was called for the method.
• Status is the response's HTTP status code.
• Latency is the time between the receipt of the request from the caller and the returned response.
• Response body is the HTTP response body.
• Response headers are the HTTP response headers.
• Log shows the simulated Amazon CloudWatch Logs entries that would have been written if this method were called outside of the API Gateway console.

Note
Although the CloudWatch Logs entries are simulated, the results of the method call are real.

The Response body output should be something like this: "Hello"

Tip
Try the API Gateway with different methods and an invalid input to see the error output. You may want to change the state machine to look for a particular key and during testing provide the wrong key to fail the State Machine execution and generate an error message in the Response body output.

To test the deployed API using cURL

1. Open a terminal window.
2. Copy the following cURL command and paste it into the terminal window, replacing <api-id> with your API's API ID and <region> with the region where your API is deployed.

curl -X POST \
  'https://<api-id>.execute-api.<region>.amazonaws.com/prod' \
  -d '{"key":"Hello"}' \
  -H 'Content-Type: application/json'

The Response Body output should be something like this: "Hello"

Tip
Try the API Gateway with different methods and an invalid input to see the error output. You may want to change the state machine to look for a particular key and during testing provide the wrong key to fail the State Machine execution and generate an error message in the Response Body output.

Step 4: Clean Up

When you're done trying out your API Gateway, you can tear down both the state machine and the API Gateway using the AWS CDK. Issue cdk destroy in your app's main directory.

Using Terraform to deploy state machines in Step Functions

Terraform by HashiCorp is a framework for building applications using infrastructure as code (IaC). With Terraform, you can create state machines and use features, such as previewing infrastructure deployments and creating reusable templates.
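As a sketch of that template reuse, the following hedged Python example renders a state machine definition with variable substitution, which is the same idea Terraform's templatefile function applies to exported ASL definitions later in this topic. The definition and the ${lambda_arn} placeholder are illustrative, not taken from the AWS documentation.

```python
import json
from string import Template

# Illustrative ASL definition, as might be exported from Workflow Studio.
# The ${lambda_arn} placeholder is an assumption; Terraform's templatefile
# function performs an equivalent substitution at plan time.
definition_template = """
{
  "StartAt": "InvokeFunction",
  "States": {
    "InvokeFunction": {
      "Type": "Task",
      "Resource": "${lambda_arn}",
      "End": true
    }
  }
}
"""

def render_definition(lambda_arn):
    # Substitute the variable, then parse to confirm the result is valid JSON.
    rendered = Template(definition_template).substitute(lambda_arn=lambda_arn)
    return json.loads(rendered)

definition = render_definition(
    "arn:aws:lambda:us-east-1:123456789012:function:myFunction")
print(definition["States"]["InvokeFunction"]["Resource"])
```

Keeping the definition in a separate template file like this, rather than inlining it, is what makes parameter substitution straightforward in Terraform projects.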
Terraform templates help you maintain and reuse the code by breaking it down into smaller chunks. If you're familiar with Terraform, you can follow the development lifecycle described in this topic as a model for creating and deploying your state machines in Terraform. If you aren't familiar with Terraform, we recommend that you first complete the workshop Introduction to Terraform on AWS for getting acquainted with Terraform.

Tip
To deploy an example of a state machine built using Terraform, see Deploy with Terraform in The AWS Step Functions Workshop.

In this topic
• Prerequisites
• State machine development lifecycle with Terraform
• IAM roles and policies for your state machine

Prerequisites

Before you get started, make sure you complete the following prerequisites:
• Install Terraform on your machine. For information about installing Terraform, see Install Terraform.
• Install Step Functions Local on your machine. We recommend that you install the Step Functions Local Docker image to use Step Functions Local. For more information, see Testing state machines with Step Functions Local (unsupported).
• Install AWS SAM CLI. For installation information, see Installing the AWS SAM CLI
in the AWS Serverless Application Model Developer Guide.
• Install the AWS Toolkit for Visual Studio Code to view the workflow diagram of your state machines. For installation information, see Installing the AWS Toolkit for Visual Studio Code in the AWS Toolkit for Visual Studio Code User Guide.

State machine development lifecycle with Terraform

The following procedure explains how you can use a state machine prototype that you build using Workflow Studio in the Step Functions console as a starting point for local development with Terraform and the AWS Toolkit for Visual Studio Code. To view the complete example that discusses the state machine development with Terraform and presents the best practices in detail, see Best practices for writing Step Functions Terraform projects.

To start the development lifecycle of a state machine with Terraform

1. Bootstrap a new Terraform project with the following command.

terraform init

2. Open the Step Functions console to create a prototype for your state machine.
3. In Workflow Studio, do the following:
a. Create your workflow prototype.
b. Export the Amazon States Language (ASL) definition of your workflow. To do this, choose the Import/Export dropdown list, and then select Export JSON definition.
4.
Save the exported ASL definition within your project directory. You pass the exported ASL definition as an input parameter to the aws_sfn_state_machine Terraform resource that uses the templatefile function. This function is used inside the definition field that passes the exported ASL definition and any variable substitutions.

Tip
Because the ASL definition file can contain lengthy blocks of text, we recommend you avoid the inline EOF method. This makes it easier to substitute parameters into your state machine definition.

5. (Optional) Update the ASL definition within your IDE and visualize your changes using the AWS Toolkit for Visual Studio Code. To avoid continuously exporting your definition and refactoring it into your project, we recommend that you make updates locally in your IDE and track these updates with Git.
6. Test your workflow using Step Functions Local.

Tip
You can also locally test service integrations with Lambda functions and API Gateway APIs in your state machine using AWS SAM CLI Local.

7. Preview your state machine and other AWS resources before deploying the state machine. To do this, run the following command.

terraform plan

8. Deploy your state machine from your local environment or through CI/CD pipelines using the following command.

terraform apply

9. (Optional) Clean up your resources and delete the state machine using the following command.

terraform destroy

IAM roles and policies for your state machine

Use the Terraform service integration policies to add necessary IAM permissions to your state machine, for example, permission to invoke Lambda functions. You can also define explicit roles and policies and associate them with your state machine. The following IAM policy example grants your state machine access to invoke a Lambda function named myFunction.
{ "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": [ "lambda:InvokeFunction" ], "Resource": "arn:aws:lambda:region:account-id:function:myFunction" IAM roles and policies for your state machine 409 AWS Step Functions } ] } Developer Guide We also recommend using the aws_iam_policy_document data source when defining IAM policies for your state machines in Terraform. This helps you check if your policy is malformed and substitute any resources with variables. The following IAM policy example uses the aws_iam_policy_document data source and grants your state machine access to invoke a Lambda function named myFunction. data "aws_iam_policy_document" "state_machine_role_policy" { statement { effect = "Allow" actions = [ "lambda:InvokeFunction" ] resources = ["${aws_lambda_function.function-1.arn}:*"] } } Tip To view more advanced AWS architectural patterns deployed with Terraform, see Terraform examples at Serverless Land Workflows Collection. Exporting your workflow to IaC templates The AWS Step Functions console provides the ability to export and download saved workflows as AWS CloudFormation or AWS SAM (SAM) templates. For AWS Regions that support AWS Infrastructure Composer, it additionally provides the ability to export your workflows to Infrastructure Composer and navigates to the Infrastructure Composer console, where you can continue to work with the newly generated template. Exporting to IaC templates 410 AWS Step Functions Developer Guide Template configuration options The
following options are available with this feature. If you select to export and download an IaC template file, the console displays the options that apply to your saved state machine for selection. If you're exporting to Infrastructure Composer, the Step Functions console automatically implements the configurations that apply to your state machine.

• Include IAM role created by console on your behalf – This option exports the execution role policies. It constructs an IAM role in the template and attaches it to the state machine resource. This option is only applicable if the state machine has an execution role that's created by the console.
• Include CloudWatch Log Group – Constructs a CloudWatch log group in the template and attaches it to the state machine resource. This option is only applicable if the state machine has a CloudWatch log group attached to it and the log level is not set to OFF.
• Replace resource references with DefinitionSubstitutions – This option generates DefinitionSubstitutions for the following components:
• Distributed Map S3 fields.
• Activity resources. The export includes Activity resources in the AWS CloudFormation template for any Run Activity task. The export also provides DefinitionSubstitutions referencing the created Activity resources.
• Any ARN or S3URI in the Payload field for all service integrations.
• In addition to the ARN and S3URI fields, the export generates DefinitionSubstitutions for other frequently used service integration payload fields. The specific service integrations are the following:
• athena:startQueryExecution
• batch:submitJob
• dynamodb:getItem, dynamodb:updateItem, dynamodb:deleteItem
• ecs:runTask
• glue:startJobRun
• http:invoke
• lambda:invoke
• sns:publish
• sqs:sendMessage
• states:startExecution

Export and download your workflow's IaC template

To export your workflow into an IaC template file

1. Open the Step Functions console and select the state machine you want to work with. Make sure that any changes to the state machine are saved before you proceed to the next step.
2. Select Export to CloudFormation or SAM template from the Actions menu.
3. Select Type as either SAM or CloudFormation from the dialog box that appears.
• If you selected the CloudFormation template, next choose either the JSON or YAML file format.
• If you selected the SAM template, no format choices are presented. The SAM template defaults to YAML file format.
4. Expand Additional configurations. By default all of the options are selected. Review and update the selection of options for your IaC template. The options are described in detail in the previous section titled Template configuration options. If an option doesn't apply to your specific workflow, then it won't display in the dialog box.
5. Choose Download to export and download your generated IaC template file.

Export your workflow directly into AWS Infrastructure Composer

To export your workflow into Infrastructure Composer

1. Open the Step Functions console and select the state machine you want to work with. Make sure that any changes to the state machine are saved before you proceed to the next step.
2. Select Export to Infrastructure Composer from the Actions menu. The Export to Infrastructure Composer dialog box displays.
3. You can use the default name that displays in the Transfer bucket name field or enter a new name. Amazon S3 bucket names must be globally unique and follow the bucket naming rules.
4. Choose Confirm and create project to export your workflow to Infrastructure Composer.
5. To save your project and workflow definition in Infrastructure Composer, activate local sync mode.

Note
If you've used the Export to Infrastructure Composer feature before and created an Amazon S3 bucket using the default name, Step Functions can re-use this bucket if it still exists. Accept the default bucket name in the dialog box to re-use the existing bucket.

Amazon S3 transfer bucket configuration

The Amazon S3 bucket that Step Functions creates to transfer your workflow automatically encrypts objects using the AES 256 encryption standard. Step Functions also configures the bucket to use the bucket owner condition to ensure that only your AWS account is able to add objects to the bucket. The default bucket name uses the prefix states-templates, a 10-digit
alphanumeric string, and the AWS Region you created your workflow in: states-templates-amzn-s3-demo-bucket-us-east-1. To avoid additional charges being added to your AWS account, we recommend that you delete the Amazon S3 bucket as soon as you have finished exporting your workflow to Infrastructure Composer. Standard Amazon S3 pricing applies.

Required permissions

To use this Step Functions export feature with Infrastructure Composer, you need certain permissions to download an AWS SAM template and to write your template configuration to Amazon S3. To download an AWS SAM template, you must have permission to use the following API actions:
• iam:GetPolicy
• iam:GetPolicyVersion
• iam:GetRole
• iam:GetRolePolicy
• iam:ListAttachedRolePolicies
• iam:ListRolePolicies
• iam:ListRoles

For Step Functions to write your workflow's configuration to Amazon S3, you must have permission to use the following API actions:
• s3:PutObject
• s3:CreateBucket
• s3:PutBucketEncryption

If you are unable to export your workflow's configuration to Infrastructure Composer, check that your account has the required permissions for these operations.
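The two action lists above can be collected into a single reference policy. The following Python sketch assembles that policy document and checks that every required action is granted; the wildcard resource is illustrative, and a real policy should be scoped to specific resources.

```python
import json

# Actions listed in this section, grouped as in the documentation.
sam_template_actions = [
    "iam:GetPolicy", "iam:GetPolicyVersion", "iam:GetRole",
    "iam:GetRolePolicy", "iam:ListAttachedRolePolicies",
    "iam:ListRolePolicies", "iam:ListRoles",
]
s3_transfer_actions = ["s3:PutObject", "s3:CreateBucket", "s3:PutBucketEncryption"]

# Illustrative policy document; "Resource": "*" is a placeholder only.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {"Effect": "Allow", "Action": sam_template_actions, "Resource": "*"},
        {"Effect": "Allow", "Action": s3_transfer_actions, "Resource": "*"},
    ],
}

# Verify every required action appears somewhere in the policy.
granted = {action for stmt in policy["Statement"] for action in stmt["Action"]}
missing = set(sam_template_actions + s3_transfer_actions) - granted
print(json.dumps(sorted(missing)))
```

A check like this is most useful in CI, where the action lists can be compared against the policy actually attached to the console user's role.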
Starting state machine executions in Step Functions

A state machine execution occurs when an AWS Step Functions state machine runs and performs its tasks. Each Step Functions state machine can have multiple simultaneous executions, which you can initiate from the Step Functions console, or by using the AWS SDKs, the Step Functions API actions, or the AWS Command Line Interface (AWS CLI). An execution receives JSON input and produces JSON output. You can start a Step Functions execution in the following ways:

• Start an execution in the Step Functions console. You can start a state machine in the console, watch the execution, and debug failures.
• Call the StartExecution API action.
• Use Amazon EventBridge to start an execution in response to an event.
• Use Amazon EventBridge Scheduler to start a state machine execution on a schedule.
• Start a nested workflow execution from a Task state.
• Start an execution with Amazon API Gateway.

Tip
To learn how to monitor running executions, see the tutorial: the section called “Examine executions”

Start workflow executions from a task state in Step Functions

AWS Step Functions can start workflow executions directly from a Task state of a state machine. This allows you to break your workflows into smaller state machines, and to start executions of these other state machines. By starting these new workflow executions you can:

• Separate higher level workflow from lower level, task-specific workflows.
• Avoid repetitive elements by calling a separate state machine multiple times.
• Create a library of modular reusable workflows for faster development.
• Reduce complexity and make it easier to edit and troubleshoot state machines.

Step Functions can start these workflow executions by calling its own API as an integrated service.
Simply call the StartExecution API action from your Task state and pass the necessary parameters. You can call the Step Functions API using any of the service integration patterns.

Tip
To deploy an example nested workflow, see Optimizing costs in The AWS Step Functions Workshop.

To start a new execution of a state machine, use a Task state similar to the following example:

{
   "Type":"Task",
   "Resource":"arn:aws:states:::states:startExecution",
   "Parameters":{
      "StateMachineArn":"arn:aws:states:region:account-id:stateMachine:HelloWorld",
      "Input":{
         "Comment":"Hello world!"
      }
   },
   "Retry":[
      {
         "ErrorEquals":[
            "StepFunctions.ExecutionLimitExceeded"
         ]
      }
   ],
   "End":true
}

This Task state will start a new execution of the HelloWorld state machine, and will pass the JSON comment as input.

Note
The StartExecution API action quotas can limit the number of executions that you can start. Use a Retry on StepFunctions.ExecutionLimitExceeded to ensure your execution is started. See the following.
• Quotas related to API action throttling
• Handling errors in Step Functions workflows

Associate Workflow Executions

To associate a started workflow execution with the execution that started it, pass the execution ID from the Context object to the execution input. You can access the ID from the Context object from your Task state in a running execution. Pass the execution ID by appending .$ to the parameter name, and referencing the
ID in the Context object with $$.Execution.Id.

"AWS_STEP_FUNCTIONS_STARTED_BY_EXECUTION_ID.$": "$$.Execution.Id"

You can use a special parameter named AWS_STEP_FUNCTIONS_STARTED_BY_EXECUTION_ID when you start an execution. If included, this association provides links in the Step details section of the Step Functions console. When provided, you can easily trace the executions of your workflows from starting executions to their started workflow executions. Using the previous example, associate the execution ID with the started execution of the HelloWorld state machine, as follows.

{
   "Type":"Task",
   "Resource":"arn:aws:states:::states:startExecution",
   "Parameters":{
      "StateMachineArn":"arn:aws:states:region:account-id:stateMachine:HelloWorld",
      "Input": {
         "Comment": "Hello world!",
         "AWS_STEP_FUNCTIONS_STARTED_BY_EXECUTION_ID.$": "$$.Execution.Id"
      }
   },
   "End":true
}

For more information, see the following:
• Integrating services
• Passing parameters to a service API in Step Functions
• Accessing the Context object
• AWS Step Functions

Using Amazon EventBridge Scheduler to start a Step Functions state machine execution

Amazon EventBridge Scheduler is a serverless scheduler that allows you to create, run, and manage tasks from one central, managed service.
With EventBridge Scheduler, you can create schedules using cron and rate expressions for recurring patterns, or configure one-time invocations. You can set up flexible time windows for delivery, define retry limits, and set the maximum retention time for failed API invocations. For example, with EventBridge Scheduler, you can start a state machine execution on a schedule when a security related event occurs or to automate a data processing job. This page explains how to use EventBridge Scheduler to start execution of a Step Functions state machine on a schedule.

Topics
• Set up the execution role
• Create a schedule
• Related resources

Set up the execution role

When you create a new schedule, EventBridge Scheduler must have permission to invoke its target API operation on your behalf. You grant these permissions to EventBridge Scheduler using an execution role. The permission policy you attach to your schedule's execution role defines the required permissions. These permissions depend on the target API you want EventBridge Scheduler to invoke.

When you use the EventBridge Scheduler console to create a schedule, as in the following procedure, EventBridge Scheduler automatically sets up an execution role based on your selected target. If you want to create a schedule using one of the EventBridge Scheduler SDKs, the AWS CLI, or AWS CloudFormation, you must have an existing execution role that grants the permissions EventBridge Scheduler requires to invoke a target. For more information about manually setting up an execution role for your schedule, see Setting up an execution role in the EventBridge Scheduler User Guide.

Create a schedule

To create a schedule by using the console

1. Open the Amazon EventBridge Scheduler console at https://console.aws.amazon.com/scheduler/home.
2. On the Schedules page, choose Create schedule.
3.
On the Specify schedule detail page, in the Schedule name and description section, do the following:
a. For Schedule name, enter a name for your schedule. For example, MyTestSchedule.
b. (Optional) For Description, enter a description for your schedule. For example, My first schedule.
c. For Schedule group, choose a schedule group from the dropdown list. If you don't have a group, choose default. To create a schedule group, choose create your own schedule. You use schedule groups to add tags to groups of schedules.
4. Choose your schedule options.

One-time schedule – A one-time schedule invokes a target only once at the date and time that you specify. For Date and time, do the following:
• Enter a valid date in YYYY/MM/DD format.
• Enter a timestamp in 24-hour hh:mm format.
• For Timezone, choose the timezone.

Recurring schedule – A recurring schedule invokes a target at a rate that you specify using a cron expression or rate expression.
a. For Schedule type, do one of the following:
• To use a cron expression to define the schedule, choose Cron-based schedule and enter the cron expression.
• To use a rate expression to define the schedule, choose Rate-based schedule and enter the rate expression.
For more information about cron and rate expressions, see Schedule types on EventBridge Scheduler in the Amazon EventBridge Scheduler User
Guide.
b. For Flexible time window, choose Off to turn off the option, or choose one of the pre-defined time windows. For example, if you choose 15 minutes and you set a recurring schedule to invoke its target once every hour, the schedule runs within 15 minutes after the start of every hour.
5. (Optional) If you chose Recurring schedule in the previous step, in the Timeframe section, do the following:
a. For Timezone, choose a timezone.
b. For Start date and time, enter a valid date in YYYY/MM/DD format, and then specify a timestamp in 24-hour hh:mm format.
c. For End date and time, enter a valid date in YYYY/MM/DD format, and then specify a timestamp in 24-hour hh:mm format.
6. Choose Next.
7. On the Select target page, choose the AWS API operation that EventBridge Scheduler invokes:
a. Choose AWS Step Functions StartExecution.
b. In the StartExecution section, select a state machine or choose Create new state machine. Currently, you can't run Synchronous Express workflows on a schedule.
c. Enter a JSON payload for the execution. Even if your state machine doesn't require any JSON payload, you must still include input in JSON format as shown in the following example.

{
    "Comment": "sampleJSONData"
}

8. Choose Next.
9. On the Settings page, do the following:
a. To turn on the schedule, under Schedule state, toggle Enable schedule.
b. To configure a retry policy for your schedule, under Retry policy and dead-letter queue (DLQ), do the following:
• Toggle Retry.
• For Maximum age of event, enter the maximum hour(s) and min(s) that EventBridge Scheduler must keep an unprocessed event. The maximum time is 24 hours.
• For Maximum retries, enter the maximum number of times EventBridge Scheduler retries the schedule if the target returns an error. The maximum value is 185 retries.
With retry policies, if a schedule fails to invoke its target, EventBridge Scheduler re-runs the schedule. If configured, you must set the maximum retention time and retries for the schedule.
c. Choose where EventBridge Scheduler stores undelivered events.
• Don't store – Choose None.
• Store the event in the same AWS account where you're creating the schedule – Choose Select an Amazon SQS queue in my AWS account as a DLQ, then choose the Amazon Resource Name (ARN) of the Amazon SQS queue.
• Store the event in a different AWS account from where you're creating the schedule – Choose Specify an Amazon SQS queue in other AWS accounts as a DLQ, then enter the Amazon Resource Name (ARN) of the Amazon SQS queue.
d. To use a customer managed key to encrypt your target input, under Encryption, choose Customize encryption settings (advanced). If you choose this option, enter an existing KMS key ARN or choose Create an AWS KMS key to navigate to the AWS KMS console. For more information about how EventBridge Scheduler encrypts your data at rest, see Encryption at rest in the Amazon EventBridge Scheduler User Guide.
e. To have EventBridge Scheduler create a new execution role for you, choose Create new role for this schedule. Then, enter a name for Role name.
If you choose this option, EventBridge Scheduler attaches the required permissions necessary for your templated target to the role.
10. Choose Next.
11. In the Review and create schedule page, review the details of your schedule. In each section, choose Edit to go back to that step and edit its details.
12. Choose Create schedule.

You can view a list of your new and existing schedules on the Schedules page. Under the Status column, verify that your new schedule is Enabled. To confirm that EventBridge Scheduler invoked the state machine, check the state machine's Amazon CloudWatch logs.

Related resources

For more information about EventBridge Scheduler, see the following:
• EventBridge Scheduler User Guide
• EventBridge Scheduler API Reference
• EventBridge Scheduler Pricing

Viewing execution details in the Step Functions console

You can view in-progress and past executions of workflows in the Executions section of the Step Functions
console. In the Executions details, you can view the state machine's definition, execution status, ARN, number of state transitions, and the inputs and outputs for individual states in the workflow.

Standard workflow execution details are recorded in Step Functions, but the history of Express workflow executions is not. To record Express workflow executions, you must configure your Express state machines to send logs to Amazon CloudWatch. See Logging in CloudWatch Logs to set up logging for Step Functions. The console experience to view both types of workflow executions is similar, but there are some limitations for Express workflows. See the section called “Standard and Express differences”.

Note
Because execution data for Express workflows is displayed using CloudWatch Logs Insights, scanning the logs will incur charges. By default, your log group only lists executions completed in the last three hours. If you specify a larger time range that includes more execution events, your costs will increase. For more information, see Vended Logs under the Logs tab on the CloudWatch Pricing page.
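As a rough illustration of what such a log query involves, the following sketch assembles a CloudWatch Logs Insights query string and its time window. This is a hypothetical sketch only: the field names (`type`, `execution_arn`) are assumptions about the vended log format rather than something stated in this guide, and the three-hour default mirrors the note above.

```python
from datetime import datetime, timedelta, timezone

def build_insights_query(window_hours=3, now=None):
    """Build a (query, start_epoch, end_epoch) triple for querying an Express
    workflow's log group. Field names below are assumptions; adjust them to
    match your log group's actual schema."""
    end = now or datetime.now(timezone.utc)
    start = end - timedelta(hours=window_hours)  # console default: 3 hours
    query = (
        "fields @timestamp, type, execution_arn "
        "| filter type like /Execution/ "
        "| sort @timestamp desc"
    )
    return query, int(start.timestamp()), int(end.timestamp())
```

Widening `window_hours` increases the amount of log data scanned per query, which is exactly the cost increase the note warns about.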
Execution details overview

The execution details link and page title use the unique execution ID generated by Step Functions or the custom ID you provided when starting the workflow. The Execution Details page includes metrics and the following options to manage your state machine:
• Stop execution – Stop an in-progress execution. (Unavailable for completed executions.)
• Start new execution – Start a new execution of your state machine.
• Redrive – Redrive executions of Standard Workflows that did not complete successfully in the last 14 days, including failed, aborted, or timed out executions. For more information, see Redriving state machines.
• Export – Export the execution details in JSON format to share or perform offline analysis.

Viewing executions started with a version or alias

You can also view the executions started with a version or an alias in the Step Functions console. For more information, see Listing executions for versions and aliases.

The Execution Details console page contains the following sections:
1. Execution summary
2. Error message
3. View mode
4. Step details
5. Events

Execution summary

The Execution summary provides an overview of the execution details of your workflow, in the following tabs:

Details
Shows information, such as the execution's status, ARN, and timestamps for execution start and end time. You can also view the total count of State transitions that occurred while running the state machine execution. You can also view the links for X-Ray trace map and Amazon CloudWatch Execution Logs if you enabled tracing or logs for your state machine. If your state machine execution was started by another state machine, you can view the link for the parent state machine on this tab. If your state machine execution was redriven, this tab displays redrive related information, for example Redrive count.
Execution input and output
Shows the state machine execution input and output side-by-side.

Definition
Shows the state machine's Amazon States Language definition.

Error message

If your state machine execution failed, the Execution Details page displays an error message. Choose Cause or View step details in the error message to view the reason for execution failure or the step that caused the error. If you choose View step details, Step Functions highlights the step that caused the error in the Step details, Graph view, and Table view tabs. If the step is a Task, Map, or Parallel state for which you've defined retries, the Step details pane displays the Retry tab for the step. Additionally, if you've redriven the execution, you can see the retries and redrive execution details in the Retries & redrives tab of the Step details pane. From the Recover dropdown button on this error message, you can either redrive your unsuccessful executions or start a new execution. For more information, see Redriving state machines. The error message also has a link to the step that caused the execution failure.

View mode

The View mode section contains two different visualizations
for your state machine. You can choose to view a graphic representation of the workflow, a table outlining the states in your workflow, or a list of the events associated with your state machine's execution:

Graph view

The Graph view mode displays a graphical representation of your workflow. A legend is included at the bottom that indicates the execution status of the state machine. It also contains buttons that let you zoom in, zoom out, center align the full workflow, or view the workflow in full-screen mode. From the graph view, you can choose any step in your workflow to view details about its execution in the Step details component. When you choose a step in the Graph view, the Table view also shows that step. This is true in reverse as well. If you choose a step from Table view, the Graph view shows the same step.

If your state machine contains a Map state, Parallel state, or both, you can view their names in the workflow in the Graph view. In addition, for the Map state, the Graph view lets you move across different iterations of the Map state execution data. For example, if your Map state has five iterations and you want to view the execution data for the third and fourth iterations, do the following:
1. Choose the Map state whose iteration data you want to view.
2.
From Map iteration viewer, choose #2 from the dropdown list for the third iteration. This is because iterations are counted from zero. Likewise, choose #3 from the dropdown list for the fourth iteration of the Map state. Alternatively, use the up arrow and down arrow controls to move between different iterations of the Map state.

Note
If your state machine contains nested Map states, the dropdown lists for the parent and child Map state iterations will be displayed to represent the iteration data.

3. (Optional) If one or more of your Map state iterations failed to execute, or the execution was stopped, you can view its data by choosing those iteration numbers under Failed or Aborted in the dropdown list.

Finally, you can use the Export and Layout buttons to export the workflow graph as an SVG or PNG image. You can also switch between horizontal and vertical views of your workflow.

Table view

The Table view mode displays a tabular representation of the states in your workflow. In this View mode, you can see the details of each state that was executed in your workflow, including its name, the name of any resource it used (such as an AWS Lambda function), and whether the state executed successfully. From this view, you can choose any state in your workflow to view details about its execution in the Step details component. When you choose a step in the Table view, the Graph view also shows that step. This is true in reverse as well. If you choose a step from Graph view, the Table view shows the same step.

You can also limit the amount of data displayed in the Table view mode by applying filters to the view. You can create a filter for a specific property, such as Status or Redrive attempt. For more information, see Examine executions. By default, this mode displays the Name, Type, Status, Resource, and Started After columns. You can configure the columns you want to view using the Preferences dialog box.
The selections that you make on this dialog box persist for future state machine executions until they are changed again. If you add the Timeline column, the execution duration of each state is shown with respect to the runtime for the entire execution. This is displayed as a color-coded, linear timeline. This can help you identify any performance-related issues with a specific state's execution. The color-coded segments for each state on the timeline help you identify the state's execution status, such as in-progress, failed, or aborted. For example, if you've defined execution retries for a state in your state machine, these retries are shown in the timeline. Red segments represent the failed Retry
attempts, while light gray segments represent the BackoffRate between each Retry attempt.

If your state machine contains a Map state, Parallel state, or both, you can view their names in the workflow in Table view. For Map and Parallel states, the Table view mode displays the execution data for their iterations and parallel branches as nodes inside a tree view. You can choose each node in these states to view their individual details in the Step details section. For example, you can review the data for a specific Map state iteration that caused the state to fail. Expand the node for the Map state, and then view the status for each iteration in the Status column.

Step details

The Step details section opens on the right when you choose a state in the Graph view or Table view. This section contains the following tabs, which provide in-depth information about the selected state:

Input
Shows the input details of the selected state. If there is an error in the input, it is indicated with an error icon on the tab header. In addition, you can view the reason for the error in this tab. You can also choose the Advanced view toggle button to see the input data transfer path as the data passed through the selected state.
This lets you identify how your input was processed as one or more of the fields, such as InputPath, Parameters, ResultSelector, OutputPath, and ResultPath, were applied to the data.

Output
Shows the output of the selected state. If there is an error in the output, it is indicated with an error icon on the tab header. In addition, you can view the reason for the error in this tab. You can also choose the Advanced view toggle button to see the output data transfer path as the data passed through the selected state. This lets you identify how your input was processed as one or more of the fields, such as InputPath, Parameters, ResultSelector, OutputPath, and ResultPath, were applied to the data.

Details
Shows information, such as the state type, its execution status, and execution duration. For Task states that use a resource, such as AWS Lambda, this tab provides links to the resource definition page and Amazon CloudWatch logs page for the resource invocation. It also shows values, if specified, for the Task state's TimeoutSeconds and HeartbeatSeconds fields. For Map states, this tab shows information regarding the total count of a Map state's iterations. Iterations are categorized as Failed, Aborted, Succeeded, or InProgress.

Definition
Shows the Amazon States Language definition corresponding to the selected state.

Retry
Note
This tab appears only if you have defined a Retry field in your state machine's Task or Parallel state.

Shows the initial and subsequent retry attempts for a selected state in its original execution attempt. For the initial and all the subsequent failed attempts, choose the arrow icon next to Type to view the Reason for failure that appears in a dropdown box. If the retry attempt succeeds, you can view the Output that appears in a dropdown box.
If you've redriven your execution, this tab header displays the name Retries & redrives and displays the retry attempt details for each redrive.

Events
Shows a filtered list of the events associated with the selected state in an execution. The information you see on this tab is a subset of the complete execution event history you see in the Events table.

Events

The Events table displays the complete history for the selected execution as a list of events spanning multiple pages. Each page contains up to 25 events. This section also displays the total event count, which can help you determine if you exceeded the maximum event history count of 25,000 events.

By default, the results in the Events table are displayed in ascending order based on the Timestamp of the events. You can change the execution event history's sorting to descending order by clicking on the Timestamp column header. In the Events table, each event is color-coded
to indicate its execution status. For example, events that failed appear in red. To view additional details about an event, choose the arrow icon next to the event ID. Once open, the event details show the input, output, and resource invocation for the event. In addition, in the Events table, you can apply filters to limit the execution event history results that are displayed. You can choose properties such as ID or Redrive attempt. For more information, see Examine executions.

Standard and Express console experience differences

Standard workflows

The execution histories for Standard Workflows are always available for executions completed in the last 90 days.

Express workflows

For Express workflows, the Step Functions console retrieves log data gathered through a CloudWatch Logs log group to show execution history. The histories for executions completed in the last three hours are available by default. You can customize the time range. If you specify a larger time range that includes more execution events, the cost to scan the logs will increase. For more information, see Vended Logs under the Logs tab on the CloudWatch Pricing page and Logging in CloudWatch Logs.
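The two availability windows above can be captured in a tiny sketch. This is an illustrative model only: the `RETENTION` table simply encodes the 90-day and 3-hour figures from the text, and the function name is an invention for this example, not an AWS API.

```python
from datetime import datetime, timedelta, timezone

# Availability windows from the text above: Standard histories are kept for
# 90 days; Express histories default to the last 3 hours of CloudWatch Logs.
RETENTION = {"STANDARD": timedelta(days=90), "EXPRESS": timedelta(hours=3)}

def history_visible(workflow_type, completed_at, now=None):
    """Is a completed execution still visible with default console settings?

    For Express, this window is only the console default; you can widen the
    queried time range, at additional log-scanning cost."""
    now = now or datetime.now(timezone.utc)
    return now - completed_at <= RETENTION[workflow_type]
```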
Considerations and limitations for viewing Express workflow executions

When viewing Express workflow executions on the Step Functions console, keep in mind the following considerations and limitations:

Availability of Express workflow execution details relies on Amazon CloudWatch Logs

For Express workflows, execution history and detailed execution information are gathered through CloudWatch Logs Insights. This information is kept in the CloudWatch Logs log group that you specify when you create the state machine. The state machine's execution history is shown under the Executions tab on the Step Functions console.

Warning
If you delete the CloudWatch Logs for an Express workflow, it won't be listed under the Executions tab.

We recommend that you use the default log level of ALL for logging all execution event types. You can update the log level as required for your existing state machines when you edit them. For more information, see Using CloudWatch Logs to log execution history in Step Functions and Event log levels.

Partial Express workflow execution details are available if logging level is ERROR or FATAL

By default, the logging level for Express workflow executions is set to ALL. If you change the log level, the execution histories and execution details for completed executions won't be affected. However, all new executions will emit logs based on the updated log level. For more information, see Using CloudWatch Logs to log execution history in Step Functions and Event log levels.

For example, if you change the log level from ALL to either ERROR or FATAL, the Executions tab on the Step Functions console only lists failed executions. In the Event view tab, the console shows only the event details for the state machine steps that failed. We recommend that you use the default log level of ALL for logging all execution event types.
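The effect of the log level on what the console can list might be modeled as follows. This is a hypothetical sketch, not Step Functions behavior code: the status strings and the mapping are assumptions that mirror the paragraphs above (ALL shows everything; ERROR or FATAL leaves only failed executions visible).

```python
def visible_executions(executions, log_level):
    """Sketch: which Express executions the console can list, given the
    state machine's logging level (hypothetical model of the text above)."""
    if log_level == "ALL":
        return list(executions)  # every execution event type is logged
    if log_level in ("ERROR", "FATAL"):
        return [e for e in executions if e["status"] == "FAILED"]
    return []  # OFF: nothing is logged, so nothing is listed
```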
You can update the log level as required for your existing state machines when you edit the state machine.

State machine definition for a prior execution can't be viewed after the state machine has been modified

State machine definitions for past executions are not stored for Express workflows. If you change your state machine definition, you can only view the state machine definition for executions using the most current definition. For example, if you remove one or more steps from your state machine definition, Step Functions detects a mismatch between the definition and prior execution events. Because previous definitions are not stored for Express workflows, Step Functions can't display the state machine definition for executions run on an earlier version of the state machine definition. As a result, the Definition, Graph view, and Table view tabs are unavailable for executions run on previous versions of a state machine definition.

Restarting state machine executions with redrive in Step Functions

You can use redrive to restart executions of Standard Workflows that didn't complete successfully in the last 14 days. These include failed, aborted, or timed out executions. When you redrive an execution, Step Functions continues
the failed execution from the unsuccessful step and uses the same input. Step Functions preserves the results and execution history of the successful steps, which are not rerun when you redrive an execution. For example, say that your workflow contains two states: a Pass state followed by a Task state. If your workflow execution fails at the Task state, and you redrive the execution, the execution reschedules and then reruns the Task state.

Redriven executions use the same state machine definition and execution ARN that was used for the original execution attempt. If your original execution attempt was associated with a version, alias, or both, the redriven execution is associated with the same version, alias, or both. Even if you update your alias to point to a different version, the redriven execution continues to use the version associated with the original execution attempt. Because redriven executions use the same state machine definition, you must start a new execution if you update your state machine definition.

When you redrive an execution, the state machine level timeout, if defined, is reset to 0. For more information about state machine level timeout, see TimeoutSeconds.
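The resume-from-failure behavior described above can be modeled with a small sketch. This is an illustrative model under stated assumptions (the step list and status strings are inventions for this example, not a Step Functions API): successful steps keep their recorded results, and the redrive resumes at the first step that didn't succeed.

```python
def steps_to_rerun(steps):
    """Return the steps a redrive would run again, given an ordered list of
    dicts like {"name": ..., "status": ...}. Successful steps before the
    first failure keep their results and are not rerun (hypothetical model)."""
    for i, step in enumerate(steps):
        if step["status"] != "SUCCEEDED":
            # Resume here: this step and everything after it runs again,
            # with the same original execution input.
            return steps[i:]
    return []  # nothing to rerun; a fully successful execution isn't redrivable
```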
Execution redrives are considered state transitions. For information about how state transitions affect billing, see Step Functions Pricing.

Redrive eligibility for unsuccessful executions

You can redrive executions if your original execution attempt meets the following conditions:
• You started the execution on or after November 15, 2023. Executions that you started prior to this date aren't eligible for redrive.
• The execution status isn't SUCCEEDED.
• The workflow execution hasn't exceeded the redrivable period of 14 days. Redrivable period refers to the time during which you can redrive a given execution. This period starts from the day a state machine completes its execution.
• The workflow execution hasn't exceeded the maximum open time of one year. For information about state machine execution quotas, see Quotas related to state machine executions.
• The execution event history count is less than 24,999. Redriven executions append their event history to the existing event history. Make sure your workflow execution contains less than 24,999 events to accommodate the ExecutionRedriven history event and at least one other history event.

Redrive behavior of individual states

Depending on the state that failed in your workflow, the redrive behavior for all unsuccessful states varies. The following table describes the redrive behavior for all the states.

Pass workflow state – If a preceding step fails or the state machine times out, the Pass state is exited and isn't executed on redrive.
Task workflow state – Schedules and starts the Task state again. When you redrive an execution that reruns a Task state, the TimeoutSeconds for the state, if defined, is reset to 0. For more information about timeout, see Task state.
Choice workflow state – Reevaluates the Choice state rules.
Wait workflow state – If the state specifies Timestamp or TimestampPath that refers to a timestamp in the past, redrive causes the Wait state to be exited and enters the state specified in the Next field.
Succeed workflow state – Doesn't redrive state machine executions that enter the Succeed state.
Fail workflow state – Reenters the Fail state and fails again.
Parallel workflow state – Reschedules and redrives only those branches that failed or aborted. If the state failed because of a States.DataLimitExceeded error, the Parallel state is rerun, including the branches that were successful in the original execution attempt.
Inline Map state – Reschedules and redrives only those iterations that failed or aborted. If the state failed because of a States.DataLimitExceeded error, the Inline Map state is rerun, including the iterations that were successful in the original execution attempt.
Distributed Map state – Redrives the unsuccessful child workflow executions in a Map Run. For more information, see Redriving Map Runs in Step Functions executions. If the state failed because of a States.DataLimitExceeded error, the Distributed Map state is rerun. This includes the child workflows that were successful in the original execution attempt.

IAM permission to redrive an execution

Step Functions needs appropriate permission to redrive an execution. The following IAM policy example grants the least privilege required to your state machine for redriving an execution. Remember to replace the italicized text with your resource-specific information.

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "states:RedriveExecution"
            ],
            "Resource": "arn:aws:states:region:account-id:execution:myStateMachine:*"
        }
    ]
}

For an example of the permission you need to redrive a Map Run, see Example of IAM policy for redriving a Distributed Map.

Redriving executions in console

You can redrive eligible executions from the Step Functions console. For example, imagine that you run a state machine and a parallel state fails to run. The following image shows that a Lambda Invoke step named Do square number inside a Parallel state has failed. This caused the Parallel state to fail as well. The branches whose executions were in progress or not started are stopped, and the state machine execution fails.

To redrive an execution from the console
1.
Open the Step Functions console, and then choose an existing state machine that failed execution.
2. On the state machine detail page, under Executions, choose a failed execution instance.
3. Choose Redrive.
4. In the Redrive dialog box, choose Redrive execution.

Tip
If you're on the Execution Details page of a failed execution, do one of the following to redrive the execution:
• Choose Recover, and then select Redrive from failure.
• Choose Actions, and then select Redrive.

Notice that redrive uses the same state machine definition and ARN. It continues running the execution from the step that failed in the original execution attempt. In this example, it's the Do square number step and Wait 3 sec branch inside the Parallel state. After restarting the execution of these unsuccessful steps in the Parallel state, redrive will continue execution for the Done step.

5. Choose the execution to open the Execution Details page. On this page, you can view the results of the redriven execution. For example, in the Execution summary section, you can see Redrive count, which represents the number of times an execution has been redriven. In the Events section, you can see the redrive related execution events appended to the events of the original execution attempt. For example, the ExecutionRedriven event.

Redriving executions using API

You can redrive eligible executions using the RedriveExecution API. This API restarts unsuccessful executions of Standard Workflows from the step that failed, aborted, or timed out. In the AWS Command Line Interface (AWS CLI), run the following command to redrive an unsuccessful state machine execution. Remember to replace the italicized text with your resource-specific information.
aws stepfunctions redrive-execution --execution-arn arn:aws:states:us-east-2:account-id:execution:myStateMachine:foo

Examining redriven executions

You can examine a redriven execution in the console or using the APIs: GetExecutionHistory and DescribeExecution.

Examine redriven executions on console
1. Open the Step Functions console, and then choose an existing state machine for which you've redriven an execution.
2. Open the Execution Details page. On this page, you can view the results of the redriven execution. For example, in the Execution summary section, you can see Redrive count, which represents the number of times an execution has been redriven. In the Events section, you can see the redrive related execution events appended to the events of the original execution attempt. For example, the ExecutionRedriven event.

Examine redriven executions using APIs

If you've redriven a state machine execution, you can use one of the following APIs to view details about the redriven execution. Remember to replace the italicized text with your resource-specific information.
• GetExecutionHistory – Returns the history of the specified execution as a list of events. This API also returns the details about the redrive attempt of an execution, if available. In the AWS CLI, run the following command.

aws stepfunctions get-execution-history --execution-arn arn:aws:states:us-east-2:account-id:execution:myStateMachine:foo

• DescribeExecution – Provides information about a state machine execution. This can be the state machine associated with
the execution, the execution input and output, execution redrive details, if available, and relevant execution metadata. In the AWS CLI, run the following command.

aws stepfunctions describe-execution --execution-arn arn:aws:states:us-east-2:account-id:execution:myStateMachine:foo

Retry behavior of redriven executions

If your redriven execution reruns a Task state, Parallel state, or Inline Map state for which you have defined retries, the retry attempt count for these states is reset to 0 to allow for the maximum number of attempts on redrive. For a redriven execution, you can track individual retry attempts of these states using the console.

To examine the individual retry attempts in the console
1. On the Execution Details page of the Step Functions console, choose a state that was retried on redrive.
2. Choose the Retries & redrives tab.
3. Choose the arrow icon next to each retry attempt to view its details. If the retry attempt succeeded, you can view the results in Output that appears in a dropdown box.

The following image shows an example of the retries performed for a state in the original execution attempt and the redrives of that execution.
In this image, three retries are performed in the original and redrive execution attempts. The execution succeeds in the fourth redrive attempt and returns an output of 16.

Viewing a Distributed Map Run execution in Step Functions

The Step Functions console provides a Map Run Details page, which displays all the information related to a Distributed Map state execution. For example, you can view the status of the Distributed Map state's execution, the Map Run's ARN, and the statuses of the items processed in the child workflow executions started by the Distributed Map state. You can also view a list of all child workflow executions and access their details. If your Map Run was redriven, you will also see redrive details in the Map Run execution summary.

When you run a Map state in Distributed mode, Step Functions creates a Map Run resource. A Map Run refers to a set of child workflow executions that a Distributed Map state starts, and the runtime settings that control these executions. Step Functions assigns an Amazon Resource Name (ARN) to your Map Run. You can examine a Map Run in the Step Functions console. You can also invoke the DescribeMapRun API action.

Map Runs do not emit metrics to CloudWatch. However, child workflow executions of a Map Run do emit metrics to CloudWatch. These metrics carry a state machine ARN label with the following format:

arn:partition:states:region:account:stateMachine:stateMachineName/MapRunLabel or UUID

The Map Run Details page has three sections: Map Run execution summary, Item processing status, and Listing executions.

Map Run execution summary

The Map Run execution summary provides an overview of the execution details of the Distributed Map state.
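The metric ARN format shown above can be split into its parts with plain string handling. This is an illustrative sketch, not an AWS API; the ARN below is a made-up example following that format.

```python
# Sketch: split a child workflow metric state machine ARN of the form
# arn:partition:states:region:account:stateMachine:stateMachineName/Label
# into its state machine name and Map Run label (or UUID).

def parse_map_run_metric_arn(arn):
    # The resource part after "stateMachine:" is "name/label"; the label
    # portion is empty if the ARN has no "/label" suffix.
    resource = arn.split(":stateMachine:", 1)[1]
    name, _, label = resource.partition("/")
    return name, label

name, label = parse_map_run_metric_arn(
    "arn:aws:states:us-east-2:123456789012:stateMachine:myStateMachine/myMapRunLabel"
)
print(name, label)
```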
Details

Shows the execution status of the Distributed Map state, the Map Run ARN, and the type of the child workflow executions started by the Distributed Map state. You can view additional configurations, such as the tolerated failure threshold for the Map Run and the maximum concurrency specified for child workflow executions.

Input and output

Shows the input received by the Distributed Map state and the corresponding output that it generates. You can view the input dataset and its location, and the input filters applied to the individual data items in that dataset. If you export the output of the Distributed Map state execution, this tab shows the path to the Amazon S3 bucket that contains the execution results. Otherwise, it points you to the parent workflow's Execution Details page to view the execution output.

Error message

If your Map Run failed, the Map Run Details page displays an error message with the reason for failure. From the Recover dropdown button on this error message, you can either redrive the unsuccessful child workflow executions started by this Map Run or start a new execution of the parent workflow. See Redriving Map Runs to learn how to restart your workflow.

Item processing status
The Item processing status section displays the status of the items processed in a Map Run. For example, Pending indicates that a child workflow execution hasn't started processing the item yet.

Item statuses are dependent on the status of the child workflow executions processing the items. If a child workflow execution fails, times out, or a user cancels the execution, Step Functions doesn't receive any information about the processing result of the items inside that child workflow execution. All items processed by that execution share the child workflow execution's status.

For example, say that you want to process 100 items in two child workflow executions, where each execution processes a batch of 50 items. If one of the executions fails and the other succeeds, you'll have 50 successful and 50 failed items.

The following list explains the types of processing statuses available for all items:

• Pending – Indicates an item that the child workflow execution hasn't started processing. If a Map Run stops, fails, or a user cancels the execution before processing of an item starts, the item remains in Pending status.
For example, if a Map Run fails with 10 unprocessed items, these 10 items remain in the Pending status.

• Running – Indicates an item currently being processed by the child workflow execution.

• Succeeded – Indicates that the child workflow execution successfully processed the item. A successful child workflow execution can't have any failed items. If one item in the dataset fails during execution, the entire child workflow execution fails.

• Failed – Indicates that the child workflow execution either failed to process the item, or the execution timed out. If any one item processed by a child workflow execution fails, the entire child workflow execution fails. For example, consider a child workflow execution that processed 1000 items. If any one item in that dataset fails during execution, Step Functions considers the entire child workflow execution as failed. When you redrive a Map Run, the count of items with this status is reset to 0.

• Aborted – Indicates that the child workflow execution started processing the item, but either the user cancelled the execution, or Step Functions stopped the execution because the Map Run failed. For example, consider a Running child workflow execution that's processing 50 items. If the Map Run stops because of a failure or because a user cancelled the execution, the child workflow execution and the status of all 50 items change to Aborted. If you use a child workflow execution of the Express type, you can't stop the execution. When you redrive a Map Run that starts child workflow executions of type Express, the count of items with this status is reset to 0. This is because Express child workflows are restarted using the StartExecution API action instead of being redriven.

Listing executions

The Executions section lists all of the child workflow executions for a specific Map Run. Use the Search by exact execution name field to search for a specific child workflow execution.
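The batch behavior described above — every item inherits the status of the child workflow execution that processed it — can be sketched in a few lines. The data shapes here are illustrative, not part of the Step Functions API.

```python
# Sketch: items inherit their child workflow execution's status, so the
# item-level counts are just the per-batch item counts grouped by status.

from collections import Counter

def item_status_counts(batches):
    """batches: list of (execution_status, item_count) pairs."""
    counts = Counter()
    for status, item_count in batches:
        counts[status] += item_count
    return dict(counts)

# Two child workflow executions of 50 items each: one succeeds, one fails,
# mirroring the 100-item example above.
print(item_status_counts([("Succeeded", 50), ("Failed", 50)]))
```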
To see details about a specific execution, select a child workflow execution from the list and choose the View details button to open its Execution Details page.

You can also use the API or AWS CLI to list the child workflow executions started by a Map Run:

• Using the API, call ListExecutions with the mapRunArn parameter set to the ARN of the Map Run.
• Using the AWS CLI, call list-executions with the map-run-arn parameter set to the ARN of the Map Run.

Important
The retention policy for child workflow executions is 90 days. Completed child workflow executions that are older will not be displayed in the Executions table, even if the Distributed Map state or parent workflow continues to run longer than the retention period. You can view execution details, including results, of these child workflow executions if you export the Distributed Map state output to an Amazon S3 bucket using ResultWriter (Map).

Tip
Choose the refresh button to view the most current list of all child workflow executions.
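Because ListExecutions paginates its results, listing every child workflow execution means following nextToken until it is absent. The sketch below exercises that loop against a stub object; with the real boto3 Step Functions client you would pass the client in and call it the same way. The ARN and execution names are made up for illustration.

```python
# Sketch: page through ListExecutions results for a Map Run. StubClient
# mimics the shape of boto3's stepfunctions client responses.

def list_child_executions(client, map_run_arn):
    executions, token = [], None
    while True:
        kwargs = {"mapRunArn": map_run_arn}
        if token:
            kwargs["nextToken"] = token
        page = client.list_executions(**kwargs)
        executions.extend(page["executions"])
        token = page.get("nextToken")
        if not token:
            return executions

class StubClient:
    """Returns two pages of fake results for illustration."""
    def __init__(self):
        self.pages = [
            {"executions": [{"name": "child-1"}], "nextToken": "t1"},
            {"executions": [{"name": "child-2"}]},
        ]
    def list_executions(self, **kwargs):
        return self.pages[1] if kwargs.get("nextToken") else self.pages[0]

names = [e["name"] for e in list_child_executions(
    StubClient(),
    "arn:aws:states:us-east-2:123456789012:mapRun:myStateMachine/myLabel:abcd1234",
)]
print(names)
```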
Redriving Map Runs in Step Functions executions

You can restart unsuccessful child workflow executions in a Map Run by redriving your parent workflow. A redriven parent workflow redrives all the unsuccessful states, including Distributed Map.

A parent workflow redrives unsuccessful states if there's no <stateType>Exited event corresponding to the <stateType>Entered event for a state when the parent workflow completed its execution. For example, if the event history doesn't contain the MapStateExited event for a MapStateEntered event, you can redrive the parent workflow to redrive all the unsuccessful child workflow executions in the Map Run.

A Map Run either isn't started or fails in the original execution attempt when the state machine doesn't have the required permission to access the ItemReader (Map), ResultWriter (Map), or both. If the Map Run wasn't started in the original execution attempt of the parent workflow, redriving the parent workflow starts the Map Run for the first time. To resolve this, add the required permissions to your state machine role, and then redrive the parent workflow. If you redrive the parent workflow without adding the required permissions, it attempts to start a new Map Run that will fail again. For information about the permissions that you might need, see IAM policies for using Distributed Map states.
Topics
• Redrive eligibility for child workflows in a Map Run
• Child workflow execution redrive behavior
• Scenarios of input used on Map Run redrive
• IAM permission to redrive a Map Run
• Redriving Map Run in console
• Redriving Map Run using API

Redrive eligibility for child workflows in a Map Run

You can redrive the unsuccessful child workflow executions in a Map Run if the following conditions are met:

• You started the parent workflow execution on or after November 15, 2023. Executions that you started prior to this date aren't eligible for redrive.
• You haven't exceeded the hard limit of 1000 redrives of a given Map Run. If you've exceeded this limit, you'll receive the States.Runtime error.
• The parent workflow is redrivable. If the parent workflow isn't redrivable, you can't redrive the child workflow executions in a Map Run. For more information about redrive eligibility of a workflow, see Redrive eligibility for unsuccessful executions.
• The child workflow executions of type Standard in your Map Run haven't exceeded the 25,000 execution event history limit. Child workflow executions that have exceeded the event history limit are counted towards the tolerated failure threshold and considered as failed. For more information about the redrive eligibility of an execution, see Redrive eligibility for unsuccessful executions.

A new Map Run is started, and the existing Map Run isn't redriven, in the following cases even if the Map Run failed in the original execution attempt:

• The Map Run failed because of the States.DataLimitExceeded error.
• The Map Run failed because of the JSON data interpolation error, States.Runtime. For example, you selected a non-existent JSON node in Filtering state output using OutputPath.

A Map Run can continue to run even after the parent workflow stops or times out.
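A subset of the eligibility conditions above can be expressed as a simple predicate. This is an illustrative-only sketch, not an AWS API; the parameter names are assumptions, and it deliberately omits the per-child 25,000-event check, which applies to individual Standard child workflow executions.

```python
# Illustrative-only check mirroring three of the redrive eligibility
# conditions described above. Not an AWS API.

from datetime import datetime, timezone

REDRIVE_CUTOFF = datetime(2023, 11, 15, tzinfo=timezone.utc)
MAX_REDRIVES = 1000

def map_run_redrive_eligible(start_date, redrive_count, parent_redrivable):
    if start_date < REDRIVE_CUTOFF:
        return False  # executions started before November 15, 2023 aren't eligible
    if redrive_count >= MAX_REDRIVES:
        return False  # exceeding 1000 redrives raises States.Runtime
    return parent_redrivable  # the parent workflow itself must be redrivable

print(map_run_redrive_eligible(datetime(2024, 1, 1, tzinfo=timezone.utc), 3, True))
```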
In the following scenarios, the redrive doesn't happen immediately:

• The Map Run might still be canceling in-progress child workflow executions of type Standard, or waiting for child workflow executions of type Express to complete their executions.
• The Map Run might still be writing results to the ResultWriter (Map), if you configured it to export results.

In these cases, the running Map Run completes its operations before attempting to redrive.

Child workflow execution redrive behavior

The redriven child workflow executions in a Map Run exhibit the behavior described below.
Express child workflows: All child workflow executions that failed or timed out in the original execution attempt are started using the StartExecution API action. The first state in ItemProcessor is run first. Unsuccessful executions can always be redriven, because Express child workflow executions are always started as a new execution using the StartExecution API action. Express child workflow executions use the same execution ARN as the original execution attempt, but you can't distinctly identify their individual redrives.

Standard child workflows: All child workflow executions that failed, timed out, or were canceled in the original execution attempt are redriven using the RedriveExecution API action. These child workflows are redriven from the last state in ItemProcessor that resulted in their unsuccessful execution. Unsuccessful Standard child workflow executions can't always be redriven. If an execution isn't redrivable, it won't be attempted again; the last error or output of the execution is permanent. This is possible when an execution exceeds 25,000 history events, or its redrivable period of 14 days has expired. A Standard child workflow execution might not be redrivable if the parent workflow execution closed within the last 14 days, but the child workflow execution closed more than 14 days ago. Standard child workflow executions use the same execution ARN as the original execution attempt. You can distinctly identify the individual redrives in the console and using APIs, such as GetExecutionHistory and DescribeExecution. For more information, see the section called “Examining redriven executions”.
If you've redriven a Map Run and it has reached its concurrency limit, the child workflow executions in that Map Run transition to the pending state. The execution status of the Map Run also transitions to the Pending redrive state. Until the specified concurrency limit can allow for more child workflow executions to run, the execution remains in the Pending redrive state.

For example, say that the concurrency limit of the Distributed Map in your workflow is 3000, and the number of child workflows to be rerun is 6000. This causes 3000 child workflows to run in parallel while the remaining 3000 workflows remain in the Pending redrive state. After the first batch of 3000 child workflows complete their execution, the remaining 3000 child workflows are run.

When a Map Run has completed its execution or is aborted, the count of child workflow executions in the Pending redrive state is reset to 0.

Scenarios of input used on Map Run redrive

Depending on how you provided input to the Distributed Map in the original execution attempt, a redriven Map Run uses the input as described below.

• Input passed from a previous state or the execution input – The redriven Map Run uses the same input.

• Input passed using ItemReader (Map), where the Map Run didn't start the child workflow executions because one of the following conditions is true – The redriven Map Run uses the input in the Amazon S3 bucket.
  • The Map Run failed with the States.ItemReaderFailed error.
  • The Map Run failed with the States.ResultWriterFailed error.
  • The parent workflow execution timed out or was canceled before the Map Run was started.

• Input passed using ItemReader,
and the Map Run failed after starting or attempting to start child workflow executions – The redriven Map Run uses the same input provided in the original execution attempt.

IAM permission to redrive a Map Run

Step Functions needs appropriate permission to redrive a Map Run. The following IAM policy example grants the least privilege required to your state machine for redriving a Map Run. Remember to replace the italicized text with your resource-specific information.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "states:RedriveExecution"
      ],
      "Resource": "arn:aws:states:us-east-2:account-id:execution:stateMachineName/myMapRunLabel:*"
    }
  ]
}

Redriving Map Run in console

The following image shows the execution graph of a state machine that contains a Distributed Map. This execution failed because the Map Run failed. To redrive the Map Run, you must redrive the parent workflow.

To redrive a Map Run from the console

1. Open the Step Functions console, and then choose an existing state machine that contains a Distributed Map that failed execution.
2. On the state machine detail page, under Executions, choose a failed execution instance of this state machine.
3. Choose Redrive.
4. In the Redrive dialog box, choose Redrive execution.
Tip
You can also redrive a Map Run from the Execution Details or Map Run Details page. If you're on the Execution Details page, do one of the following to redrive the execution:

• Choose Recover, and then select Redrive from failure.
• Choose Actions, and then select Redrive.

If you're on the Map Run Details page, choose Recover, and then select Redrive from failure.

Notice that redrive uses the same state machine definition and ARN. It continues running the execution from the step that failed in the original execution attempt. In this example, that's the Distributed Map step named Map and the Process input step inside it. After restarting the unsuccessful child workflow executions of the Map Run, redrive continues the execution with the Done step.

5. From the Execution Details page, choose Map Run to see the details of the redriven Map Run. On this page, you can view the results of the redriven execution. For example, in the Map Run execution summary section, you can see Redrive count, which represents the number of times the Map Run has been redriven. In the Events section, you can see the redrive-related execution events appended to the events of the original execution attempt, for example, the MapRunRedriven event.
After you've redriven a Map Run, you can examine its redrive details in the console or using the GetExecutionHistory and DescribeExecution API actions. For more information about examining a redriven execution, see Examining redriven executions.

Redriving Map Run using API

You can redrive an eligible Map Run using the RedriveExecution API on the parent workflow. This API restarts unsuccessful child workflow executions in a Map Run.

In the AWS Command Line Interface (AWS CLI), run the following command to redrive an unsuccessful state machine execution. Remember to replace the italicized text with your resource-specific information.

aws stepfunctions redrive-execution --execution-arn arn:aws:states:us-east-2:account-id:execution:myStateMachine:foo

After you have redriven a Map Run, you can examine its redrive details in the console or using the DescribeMapRun API action. To examine the redrive details of Standard workflow executions in a Map Run, you can use the GetExecutionHistory or DescribeExecution API action. For more information about examining a redriven execution, see the section called “Examining redriven executions”.

You can examine the redrive details of Express workflow executions in a Map Run on the Step Functions console if you've enabled logging on the parent workflow. For more information, see Using CloudWatch Logs to log execution history in Step Functions.

Processing input and output in Step Functions

Managing state with variables and JSONata
Step Functions recently added variables and JSONata to manage state and transform data. Learn more in the blog post Simplifying developer experience with variables and JSONata in AWS Step Functions.

When a Step Functions execution receives JSON input, it passes that data to the first state in the workflow as input. With JSONata, you can retrieve state input from $states.input.
Your state machine executions also provide that initial input data in the Context object. You can retrieve the original state machine input at any point in your workflow from $states.context.Execution.Input.

When states exit, their output is available to the very next state in your state machine. Your state input passes through as state output by default, unless you modify the state output. For data that you might need in later steps, consider storing it in variables. For more information, see the section called “Passing data with variables”.

QueryLanguage recommendation
For new state machines, we recommend the JSONata query language. State machines that do not specify a query language default to JSONPath for backward compatibility. You must opt in to use JSONata for your state machines or individual states.

Processing input and output with JSONata

With JSONata expressions, you can select and transform data. In the Arguments field, you can customize the data sent to the action. The result can be transformed into custom state output in the Output field. You can also store data in variables in the Assign field.
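The Arguments, Output, and Assign fields can be combined in a single JSONata Task state. The following is a minimal sketch; the state name, Lambda function name, and variable name are illustrative, not part of the service API:

```json
{
  "Get Price": {
    "Type": "Task",
    "QueryLanguage": "JSONata",
    "Resource": "arn:aws:states:::lambda:invoke",
    "Arguments": {
      "FunctionName": "myPriceFunction",
      "Payload": "{% $states.input.order %}"
    },
    "Assign": {
      "retrievedAt": "{% $states.context.State.EnteredTime %}"
    },
    "Output": "{% $states.result.Payload %}",
    "End": true
  }
}
```

Here, Arguments shapes the Lambda request from the state input, Assign stores a value from the Context object for later steps, and Output reduces the raw Lambda result to its Payload.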
For more information, see Transforming data with JSONata. The following diagram shows how JSON information moves through a JSONata task state.

Processing input and output with JSONPath

Managing state and transforming data
Learn about Passing data between states with variables and Transforming data with JSONata.

For state machines that use JSONPath, the following fields control the flow of data from state to state: InputPath, Parameters, ResultSelector, ResultPath, and OutputPath. Each JSONPath field can manipulate JSON as it moves through each state in your workflow.

JSONPath fields can use paths to select portions of the JSON from the input or the result. A path is a string, beginning with $, that identifies nodes within JSON text. Step Functions paths use JsonPath syntax. The following diagram shows how JSON information moves through a JSONPath task state.

The InputPath selects the parts of the JSON input to pass to the task of the Task state (for example, an AWS Lambda function). You can adjust the data that is sent to your action in the Parameters field. Then, with ResultSelector, you can select portions of the action result to carry forward. ResultPath then selects the combination of state input and task results to pass to the output.
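These JSONPath fields can appear together in one Task state. The following sketch is illustrative (the state name, function name, and input shape are assumptions) and shows the fields in the order they are applied:

```json
{
  "Process Order": {
    "Type": "Task",
    "Resource": "arn:aws:states:::lambda:invoke",
    "InputPath": "$.order",
    "Parameters": {
      "FunctionName": "myOrderFunction",
      "Payload.$": "$.items"
    },
    "ResultSelector": {
      "total.$": "$.Payload.total"
    },
    "ResultPath": "$.result",
    "OutputPath": "$",
    "End": true
  }
}
```

Assuming input like {"order": {"items": [...]}}, InputPath narrows the input to $.order, Parameters builds the Lambda request from it, ResultSelector keeps only the total from the raw result, and ResultPath merges that selection into the state input at $.result before OutputPath passes everything on.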
OutputPath can filter the JSON output to further limit the information that's passed to the output.

Topics
• Passing data between states with variables
• Transforming data with JSONata in Step Functions
• Accessing execution data from the Context object in Step Functions
• Using JSONPath paths
• Manipulate parameters in Step Functions workflows
• Example: Manipulating state data with paths in Step Functions workflows
• Specifying state output using ResultPath in Step Functions
• Map state input and output fields in Step Functions

Passing data between states with variables

Managing state with variables and JSONata
Step Functions recently added variables and JSONata to manage state and transform data. Learn more in the blog post Simplifying developer experience with variables and JSONata in AWS Step Functions.

The following video describes variables and JSONata in Step Functions with a DynamoDB example: Enhanced Data Flow in AWS Step Functions.

With variables and state output, you can pass data between the steps of your workflow. Using workflow variables, you can store data in a step and retrieve that data in future steps. For example, you could store an API response that contains data you might need later. Conversely, state output can only be used as input to the very next step.

Conceptual overview of variables

With workflow variables, you can store data to reference later. For example, Step 1 might store the result from an API request so a part of that request can be reused later in Step 5.

In the following scenario, the state machine fetches data from an API once. In Step 1, the workflow stores the returned API data (up to 256 KiB per state) in a variable x to use in later steps. Without variables, you would need to pass the data through output from Step 1 to Step 2 to Step 3 to Step 4 to use it in Step 5. What if those intermediate steps do not need the data?
Passing data from state to state through outputs and input would be unnecessary effort. With variables, you can store data and use it in any future step. You can also modify, rearrange, or add steps without disrupting the flow of your data. Given the flexibility of variables, you might only need to use Output to return data from Parallel and Map sub-workflows, and at the end of your state machine execution.

States that support variables

The following state types support Assign to declare and assign values to variables: Pass, Task, Map, Parallel, Choice, and Wait. To set a variable, provide a JSON object with variable names and values:

"Assign": {
  "productName": "product1",
  "count": 42,
  "available": true
}

To reference a variable, prepend the name with a dollar sign ($), for example, $productName.

Reserved variable: $states
Step Functions defines a single reserved variable called $states. In JSONata states, the following structures are assigned to $states for use in JSONata expressions:

# Reserved $states variable in JSONata states
$states = {
  "input":       // Original input to the state
  "result":      // API or sub-workflow's result (if successful)
  "errorOutput": // Error Output (only available in a Catch)
  "context":     // Context object
}

On state entry, Step Functions assigns the state input to $states.input. The value of $states.input can be used in all fields that accept JSONata expressions. $states.input always refers to the original state input.

For Task, Parallel, and Map states:

• $states.result refers to the API or sub-workflow's raw result if successful.
• $states.errorOutput refers to the Error Output if the API or sub-workflow failed. $states.errorOutput can be used in the Catch field's Assign or Output.

Attempting to access $states.result or $states.errorOutput in fields and states where they are not accessible will be caught at creation, update, or validation of the state machine.

The $states.context object provides your workflows information about their specific execution, such as StartTime, task token, and initial workflow input. To learn more, see Accessing execution data from the Context object in Step Functions.
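As a sketch of using $states.errorOutput inside a Catch, the following JSONata Task state stores the Error field of the error output into a variable when the task fails. The state names, function name, and variable name are illustrative:

```json
{
  "Call API": {
    "Type": "Task",
    "QueryLanguage": "JSONata",
    "Resource": "arn:aws:states:::lambda:invoke",
    "Arguments": { "FunctionName": "myFunction" },
    "Catch": [
      {
        "ErrorEquals": ["States.ALL"],
        "Assign": {
          "lastError": "{% $states.errorOutput.Error %}"
        },
        "Next": "Handle Failure"
      }
    ],
    "Next": "Success Step"
  }
}
```

Because $states.errorOutput is only available inside a Catch, referencing it in the state's top-level Assign or Output would be rejected when the state machine is created or validated.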
Variable name syntax

Variable names follow the rules for Unicode Identifiers as described in Unicode® Standard Annex #31. The first character of a variable name must be a Unicode ID_Start character, and the second and subsequent characters must be Unicode ID_Continue characters. The maximum length of a variable name is 80 characters. The variable name convention is similar to the rules for JavaScript and other programming languages.

Variable scope

Step Functions workflows avoid race conditions with variables by using a workflow-local scope. Workflow-local scope includes all states inside a state machine's States field, but not states inside Parallel or Map states. States inside Parallel or Map states can refer to outer scope variables, but they create and maintain their own separate workflow-local variables and values.

Parallel branches and Map iterations can access variable values from outer scopes, but they do not have access to variable values from other concurrent branches or iterations. When handling errors, the Assign field in a Catch can assign values to variables in the outer scope, that is, the scope in which the Parallel or Map state exists. Exception: Distributed Map states cannot currently reference variables in outer scopes.

A variable exists in a scope if any state in the scope assigns a value to it. To help avoid common errors, a variable assigned in an inner scope cannot have the same name as one assigned in an outer scope. For example, if the top-level scope assigns a value to a variable called myVariable, then no other scope (inside a Map or Parallel) can also assign to myVariable.

Access to variables depends on the current scope. Parallel and Map states have their own scope, but can access variables in outer scopes. When a Parallel or Map state completes, all of its variables go out of scope and stop being accessible.
Use the Output field to pass data out of Parallel branches and Map iterations.

Assign field in ASL

The Assign field in ASL is used to assign values to one or more variables. The Assign field is available at the top level of each state (except Succeed and Fail), inside Choice state rules, and inside Catch fields. For example:

# Example of Assign with JSONata
"Store inputs": {
  "Type": "Pass",
  "Next": "Get Current Price",
  "Comment": "Store the input desired price into a variable: $desiredPrice",
  "Assign": {
    "desiredPrice": "{% $states.input.desired_price %}",
    "maximumWait": "{% $states.input.max_days %}"
  }
},

The Assign field takes a JSON object. Each top-level field names a variable to assign. In the previous examples, the variable names are desiredPrice and maximumWait. When using JSONata, {% ... %} indicates a JSONata expression which might contain variables or more complex expressions. For more information about JSONata expressions, refer to the JSONata.org documentation. When using JSONata as the query language, the following diagram shows how Assign and Output fields are processed in parallel. Note the implication: assigning variable values will
not affect state Output.

The following JSONata example retrieves order.product from the state input. The variable currentPrice is set to a value from the result of the task.

# Example of Task with JSONata assignment from result
{
  "Type": "Task",
  ...
  "Assign": {
    "product": "{% $states.input.order.product %}",
    "currentPrice": "{% $states.result.Payload.current_price %}"
  },
  "Next": "the next state"
}

Note: You cannot assign a value to a part of a variable. For example, you can "Assign": {"x": 42}, but you cannot "Assign": {"x.y": 42} or "Assign": {"x[2]": 42}.

Evaluation order in an assign field

All variable references in Step Functions states use the values as they were on state entry. The previous fact is important to understand how the Assign field assigns values to one or more variables. First, new values are calculated, then Step Functions assigns the new values to the variables. The new variable values will be available starting with the next state. For example, consider the following Assign field:

# Starting values: $x=3, $a=6
"Assign": {
  "x": "{% $a %}",
  "nextX": "{% $x %}"
}
# Ending values: $x=6, $nextX=3

In the preceding example, the variable x is both assigned and referenced.
Remember, all expressions are evaluated first, then assignments are made, and newly assigned values become available in the next state. Let's go through the example in detail. Assume that in a previous state, $x was assigned a value of three (3) and $a was assigned a value of six (6). The following steps describe the process:

1. All expressions are evaluated, using the current values of all variables. The expression "{% $a %}" will evaluate to 6, and "{% $x %}" will evaluate to 3.
2. Next, assignments are made: $x will be assigned the value six (6), and $nextX will be assigned three (3).

Note: If $x had not been previously assigned, the example would fail because $x would be undefined. In summary, Step Functions evaluates all expressions and then makes assignments. The order in which the variables occur in the Assign field does not matter.

Limits

The maximum size of a single variable is 256 KiB, for both Standard and Express workflows. The maximum combined size for all variables in a single Assign field is also 256 KiB. For example, you could assign 128 KiB each to X and Y, but you could not assign 256 KiB to both X and Y in the same Assign field. The total size of all stored variables cannot exceed 10 MiB per execution.

Using variables in JSONPath states

Variables are also available in states that use JSONPath for their query language. You can reference a variable in any field that accepts a JSONPath expression ($. or $$. syntax), with the exception of ResultPath, which specifies a location in state input to inject the state's result. Variables cannot be used in ResultPath. In JSONPath, the $ symbol refers to the 'current' value and $$ represents the Context object. JSONPath expressions can start with $. as in $.customer.name. You can access context with $$. as in $$.Execution.Id. To reference a variable, you also use the $ symbol before a variable name, for example, $x or $order.numItems.
In JSONPath fields that accept intrinsic functions, variables can be used in the arguments, for example States.Format('The order number is {}', $order.number). The following diagram illustrates how the assign step in a JSONPath task occurs at the same time as the ResultSelector:

Assigning variables in JSONPath

JSONPath variable assignments behave similarly to payload templates. Fields that end with .$ indicate the value is a JSONPath expression which Step Functions evaluates to a value during state machine execution (for example: $.order..product and $.order.total).

# Example of Assign with JSONPath
{
  "Type": "Task",
  ...
  "Assign": {
    "products.$": "$.order..product",
    "orderTotal.$": "$.order.total"
  },
  "Next": "the next state"
}

For JSONPath states, the value of $ in an Assign field depends on the state type. In Task, Map, and Parallel states, $ refers to the API/sub-workflow result. In Choice and Wait states, $ refers to the effective input,
which is the value after InputPath has been applied to the state input. For Pass, $ refers to the result, whether generated by the Result field or the InputPath/Parameters fields.

The following JSONPath example assigns a JSON object to the details variable, the result of the JSONPath expression $.result.code to resultCode, and the result of the JSONPath expression States.Format('Hello {}', $customer.name) to message. If this were in a Task state, then $ in $.order.items and $.result.code would refer to the API result. The startTime variable is assigned a value from the Context object, $$.Execution.StartTime.

"Assign": {
  "details": {
    "status": "SUCCESS",
    "lineItems.$": "$.order.items"
  },
  "resultCode.$": "$.result.code",
  "message.$": "States.Format('Hello {}', $customer.name)",
  "startTime.$": "$$.Execution.StartTime"
}

Transforming data with JSONata in Step Functions

With JSONata, you gain a powerful open source query and expression language to select and transform data in your workflows. For a brief introduction and complete JSONata reference, see the JSONata.org documentation. The following video describes variables and JSONata in Step Functions with a DynamoDB example: Enhanced Data Flow in AWS Step Functions.

You must opt in to use the JSONata query and transformation language for existing workflows.
When creating a workflow in the console, we recommend choosing JSONata for the top-level state machine QueryLanguage. For existing or new workflows that use JSONPath, the console provides an option to convert individual states to JSONata. After selecting JSONata, your workflow fields will be reduced from five JSONPath fields (InputPath, Parameters, ResultSelector, ResultPath, and OutputPath) to only two fields: Arguments and Output. Also, you will not use .$ on JSON object key names. If you are new to Step Functions, you only need to know that JSONata expressions use the following syntax:

JSONata syntax: "{% <JSONata expression> %}"

The following code samples show a conversion from JSONPath to JSONata:

# Original sample using JSONPath
{
  "QueryLanguage": "JSONPath", // Set explicitly; could be set and inherited from top-level
  "Type": "Task",
  ...
  "Parameters": {
    "static": "Hello",
    "title.$": "$.title",
    "name.$": "$customerName", // With $customerName declared as a variable
    "not-evaluated": "$customerName"
  }
}

# Sample after conversion to JSONata
{
  "QueryLanguage": "JSONata", // Set explicitly; could be set and inherited from top-level
  "Type": "Task",
  ...
  "Arguments": { // JSONata states do not have Parameters
    "static": "Hello",
    "title": "{% $states.input.title %}",
    "name": "{% $customerName %}", // With $customerName declared as a variable
    "not-evaluated": "$customerName"
  }
}

Given input { "title": "Doctor" } and the variable customerName assigned to "María", both state machines will produce the following JSON result:

{
  "static": "Hello",
  "title": "Doctor",
  "name": "María",
  "not-evaluated": "$customerName"
}

In the next diagram, you can see a graphical representation showing how converting JSONPath (left) to JSONata (right) will reduce the complexity of the steps in your state machines. You can (optionally) select and transform data from the state input into Arguments to send to your integrated action. With JSONata, you can then (optionally) select and transform the results from the action for assigning to variables and for state Output.

Note: Assign and Output steps occur in parallel. If you choose to transform data during variable assignment, that transformed data will not be available in the Output step. You must reapply the JSONata transformation in the Output step.

QueryLanguage field

In your workflow ASL definitions, there is a QueryLanguage field at the top level of a state machine definition and in individual states. By setting QueryLanguage inside individual states, you can incrementally adopt JSONata in an existing state machine rather than upgrading the state machine all at once. The QueryLanguage field can be set to "JSONPath" or "JSONata". If the top-level QueryLanguage field is omitted, it defaults to "JSONPath". If a state contains a state-level QueryLanguage field, Step Functions will use the specified query language for that state. If the state does not contain a QueryLanguage field, then it will use the query language specified in the top-level QueryLanguage field.
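The incremental-adoption pattern described above can be sketched as a JSONPath state machine in which a single state opts into JSONata (state names are illustrative):

```json
{
  "QueryLanguage": "JSONPath",
  "StartAt": "Legacy State",
  "States": {
    "Legacy State": {
      "Type": "Pass",
      "Parameters": { "title.$": "$.title" },
      "Next": "Converted State"
    },
    "Converted State": {
      "QueryLanguage": "JSONata",
      "Type": "Pass",
      "Output": { "title": "{% $states.input.title %}" },
      "End": true
    }
  }
}
```

Here Legacy State keeps its JSONPath fields, while Converted State uses the JSONata-only Output field.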
Writing JSONata expressions in JSON strings

When a string in the value of an ASL field, a JSON object field, or a JSON array element is surrounded by {% %} characters, that string will be
evaluated as JSONata. Note, the string must start with {% with no leading spaces, and must end with %} with no trailing spaces. Improperly opening or closing the expression will result in a validation error. Some examples:

• "TimeoutSeconds": "{% $timeout %}"
• "Arguments": {"field1": "{% $name %}"} in a Task state
• "Items": [1, "{% $two %}", 3] in a Map state

Not all ASL fields accept JSONata. For example, each state's Type field must be set to a constant string. Similarly, the Task state's Resource field must be a constant string. The Map state Items field will accept a JSON array or a JSONata expression that must evaluate to an array.

Reserved variable: $states

Step Functions defines a single reserved variable called $states. In JSONata states, the following structures are assigned to $states for use in JSONata expressions:

# Reserved $states variable in JSONata states
$states = {
  "input":       // Original input to the state
  "result":      // API or sub-workflow's result (if successful)
  "errorOutput": // Error Output (only available in a Catch)
  "context":     // Context object
}

On state entry, Step Functions assigns the state input to $states.input. The value of $states.input can be used in all fields that accept JSONata expressions. $states.input always refers to the original state input.
For Task, Parallel, and Map states:

• $states.result refers to the API or sub-workflow's raw result if successful.
• $states.errorOutput refers to the Error Output if the API or sub-workflow failed. $states.errorOutput can be used in the Catch field's Assign or Output.

Attempting to access $states.result or $states.errorOutput in fields and states where they are not accessible will be caught at creation, update, or validation of the state machine. The $states.context object provides your workflows with information about their specific execution, such as StartTime, task token, and initial workflow input. To learn more, see Accessing execution data from the Context object in Step Functions.

Handling expression errors

At runtime, JSONata expression evaluation might fail for a variety of reasons, such as:

• Type error - An expression, such as {% $x + $y %}, will fail if $x or $y is not a number.
• Type incompatibility - An expression might evaluate to a type that the field will not accept. For example, the field TimeoutSeconds requires a numeric input, so the expression {% $timeout %} will fail if $timeout returns a string.
• Value out of range - An expression that produces a value that is outside the acceptable range for a field will fail. For example, an expression such as {% $evaluatesToNegativeNumber %} will fail in the TimeoutSeconds field.
• Failure to return a result - JSON cannot represent an undefined value, so the expression {% $data.thisFieldDoesNotExist %} would result in an error.

In each case, the interpreter will throw the error States.QueryEvaluationError. Your Task, Map, and Parallel states can provide a Catch field to catch the error, and a Retry field to retry on the error.

Converting from JSONPath to JSONata

The following sections compare and explain the differences between code written with JSONPath and JSONata.
No more path fields

ASL requires developers to use Path versions of fields, such as TimeoutSecondsPath, to select a value from the state data when using JSONPath. When you use JSONata, you no longer use Path fields because ASL will interpret {% %}-enclosed JSONata expressions automatically for you in non-Path fields, such as TimeoutSeconds.

• JSONPath legacy example: "TimeoutSecondsPath": "$timeout"
• JSONata: "TimeoutSeconds": "{% $timeout %}"

Similarly, the Map state ItemsPath has been replaced with the Items field, which accepts a JSON array or a JSONata expression that must evaluate to an array.

JSON Objects

ASL uses the term payload template to describe a JSON object that can contain JSONPath expressions for Parameters and ResultSelector field values. ASL does not use the term payload template for JSONata because JSONata evaluation happens for all strings, whether they occur on their own or inside a JSON object or a JSON array.

No more .$

ASL requires you to
append '.$' to field names in payload templates to use JSONPath and Intrinsic Functions. When you specify "QueryLanguage": "JSONata", you no longer use the '.$' convention for JSON object field names. Instead, you enclose JSONata expressions in {% %} characters. You use the same convention for all string-valued fields, regardless of how deeply the object is nested inside other arrays or objects.

Arguments and Output Fields

When the QueryLanguage is set to JSONata, the old I/O processing fields will be disabled (InputPath, Parameters, ResultSelector, ResultPath, and OutputPath) and most states will get two new fields: Arguments and Output. JSONata provides a simpler way to perform I/O transformations compared to the fields used with JSONPath. JSONata's features make Arguments and Output more capable than the previous five fields with JSONPath. These new field names also help simplify your ASL and clarify the model for passing and returning values. The Arguments and Output fields (and other similar fields such as the Map state's ItemSelector) will accept either a JSON object such as:

"Arguments": {
  "field1": 42,
  "field2": "{% jsonata expression %}"
}

Or, you can use a JSONata expression directly, for example:

"Output": "{% jsonata expression %}"

Output can also accept any type of JSON value, for example: "Output": true, "Output": 42.
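Combining the two forms above, a complete Task state might look like the following sketch (the state names and the function name GetPriceFunction are assumptions):

```json
"Get Price": {
  "Type": "Task",
  "Resource": "arn:aws:states:::lambda:invoke",
  "Arguments": {
    "FunctionName": "GetPriceFunction",
    "Payload": { "product": "{% $states.input.order.product %}" }
  },
  "Output": "{% $states.result.Payload %}",
  "Next": "Check Price"
}
```

Arguments shapes what is sent to the Lambda function, and Output reduces the raw invoke result to just the function's payload.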
The Arguments and Output fields only support JSONata, so it is invalid to use them with workflows that use JSONPath. Conversely, InputPath, Parameters, ResultSelector, ResultPath, OutputPath, and other JSONPath fields are only supported in JSONPath, so it is invalid to use path-based fields when using JSONata as your top-level workflow or state query language.

Pass state

The optional Result in a Pass state was previously treated as the output of a virtual task. With JSONata selected as the workflow or state query language, you can now use the new Output field.

Choice state

When using JSONPath, Choice states have an input Variable and numerous comparison paths, such as the following NumericLessThanEqualsPath:

# JSONPath choice state sample, with Variable and comparison path
"Check Price": {
  "Type": "Choice",
  "Default": "Pause",
  "Choices": [
    {
      "Variable": "$.current_price.current_price",
      "NumericLessThanEqualsPath": "$.desired_price",
      "Next": "Send Notification"
    }
  ]
}

With JSONata, the Choice state has a Condition where you can use a JSONata expression:

# Choice state after JSONata conversion
"Check Price": {
  "Type": "Choice",
  "Default": "Pause",
  "Choices": [
    {
      "Condition": "{% $current_price <= $states.input.desired_price %}",
      "Next": "Send Notification"
    }
  ]
}

Note: Variables and comparison fields are only available for JSONPath. Condition is only available for JSONata.

JSONata examples

The following examples can be created in Workflow Studio to experiment with JSONata. You can create and execute the state machines, or use the Test state to pass in data and even modify the state machine definition.

Example: Input and Output

This example shows how to use $states.input to use the state input and the Output field to specify the state output when you opt into JSONata.
{
  "Comment": "Input and Output example using JSONata",
  "QueryLanguage": "JSONata",
  "StartAt": "Basic Input and Output",
  "States": {
    "Basic Input and Output": {
      "QueryLanguage": "JSONata",
      "Type": "Succeed",
      "Output": {
        "lastName": "{% 'Last=>' & $states.input.customer.lastName %}",
        "orderValue": "{% $states.input.order.total %}"
      }
    }
  }
}

When the workflow is executed with the following as input:

{
  "customer": {
    "firstName": "Martha",
    "lastName": "Rivera"
  },
  "order": {
    "items": 7,
    "total": 27.91
  }
}

Test state or state machine execution will return the following JSON output:

{
  "lastName": "Last=>Rivera",
  "orderValue": 27.91
}

Example: Filtering with JSONata

You can filter your data with JSONata Path operators. For example, imagine you have a list of products for input, and you only want to process products that contain zero calories. You can create a state machine definition with the following ASL and test the FilterDietProducts state with the sample input that follows.

State machine definition for filtering with JSONata

{
  "Comment": "Filter products using JSONata",
  "QueryLanguage": "JSONata",
  "StartAt": "FilterDietProducts",
  "States": {
    "FilterDietProducts": {
      "Type": "Pass",
      "Output": {
        "dietProducts": "{% $states.input.products[calories=0] %}"
      },
      "End": true
    }
  }
}

Sample input for the test

{
  "products": [
    {
      "calories": 140,
      "flavour": "Cola",
      "name": "Product-1"
    },
    {
      "calories": 0,
      "flavour": "Cola",
      "name": "Product-2"
    },
    {
      "calories": 160,
      "flavour": "Orange",
      "name": "Product-3"
    },
    {
      "calories": 100,
      "flavour": "Orange",
      "name": "Product-4"
    },
    {
      "calories": 0,
      "flavour": "Lime",
      "name": "Product-5"
    }
  ]
}

Output from testing the step in your state machine

{
  "dietProducts": [
    {
      "calories": 0,
      "flavour": "Cola",
      "name": "Product-2"
    },
    {
      "calories": 0,
      "flavour": "Lime",
      "name": "Product-5"
    }
  ]
}

JSONata functions provided by Step Functions

JSONata contains function libraries for String, Numeric, Aggregation, Boolean, Array, Object, Date/Time, and Higher Order functions. Step Functions provides additional JSONata functions that you can use in your JSONata expressions. These built-in functions serve as replacements for Step Functions intrinsic functions. Intrinsic functions are only available in states that use the JSONPath query language.

Note: Built-in JSONata functions that require integer values as parameters will automatically round down any non-integer numbers provided.
$partition - JSONata equivalent of the States.ArrayPartition intrinsic function to partition a large array. The first parameter is the array to partition, and the second parameter is an integer representing the chunk size. The return value is a two-dimensional array. The interpreter chunks the input array into multiple arrays of the size specified by chunk size. The length of the last array chunk may be less than the length of the previous array chunks if the number of remaining items in the array is smaller than the chunk size.

"Assign": {
  "arrayPartition": "{% $partition([1,2,3,4], $states.input.chunkSize) %}"
}

$range - JSONata equivalent of the States.ArrayRange intrinsic function to generate an array of values. This function takes three arguments. The first argument is an integer representing the first element of the new array, the second argument is an integer representing the final element of the new array, and the third argument is the delta value integer for the elements in the new array. The return value is a newly generated array of values ranging from the first argument of the function to the second argument of the function, with elements in between adjusted by the delta. The delta value can be positive or negative, which will increment or decrement each element from the last until the end value is reached or exceeded.

"Assign": {
  "arrayRange": "{% $range(0, 10, 2) %}"
}

$hash - JSONata equivalent of the States.Hash intrinsic function to calculate the hash value of a given input. This function takes two arguments. The first argument is the source string to be hashed. The second argument is a string representing the hashing algorithm to use for the hash calculation. The hashing algorithm must be one of the following values: "MD5", "SHA-1", "SHA-256", "SHA-384", "SHA-512". The return value is a string of the calculated hash of the data.
This function was created because JSONata does not natively support the ability to calculate hashes.

"Assign": {
  "myHash": "{% $hash($states.input.content, $hashAlgorithmName) %}"
}

$random - JSONata equivalent of the States.MathRandom intrinsic function to return a random number n where 0 ≤ n < 1. The function takes an optional integer argument representing the seed value of the random function. If you use this function with the same seed value, it returns an identical number. This overloaded function was created because the built-in JSONata function $random does not accept a seed value.

"Assign": {
  "randNoSeed": "{% $random() %}",
  "randSeeded": "{% $random($states.input.seed) %}"
}

$uuid - JSONata version of the States.UUID intrinsic function. The function takes no arguments and returns a v4 UUID. This function was created because JSONata does not natively support the ability to generate UUIDs.

"Assign": {
  "uniqueId": "{% $uuid() %}"
}

$parse - JSONata function to deserialize JSON strings. The function takes a stringified JSON as its only argument. JSONata supports this functionality via $eval; however, $eval is not supported in Step Functions workflows.

"Assign": {
  "deserializedPayload": "{% $parse($states.input.json_string) %}"
}

Accessing execution data from the Context object in Step Functions

Managing state and transforming data: Learn about Passing data between states with variables and Transforming data with JSONata.

The Context object is
an internal JSON structure that is available during an execution, and contains information about your state machine and execution. The context provides your workflows with information about their specific execution. Your workflows can reference the Context object in a JSONata expression with $states.context.

Accessing the Context object

To access the Context object in JSONata states, use $states.context in a JSONata expression.

{
  "ExecutionID": "{% $states.context.Execution.Id %}"
}

To access the Context object in JSONPath, you first append .$ to the end of the key to indicate the value is a path. Then, prepend the value with $$. to select a node in the Context object.

{
  "ExecutionID.$": "$$.Execution.Id"
}

JSONPath states can refer to the context ($$.) from the following JSONPath fields:

• InputPath
• OutputPath
• ItemsPath (in Map states)
• Variable (in Choice states)
• ResultSelector
• Parameters
• Variable to variable comparison operators

Context object fields

The Context object includes information about the state machine, state, execution, and task. This JSON object includes nodes for each type of data, and is in the following format.
{
  "Execution": {
    "Id": "String",
    "Input": {},
    "Name": "String",
    "RoleArn": "String",
    "StartTime": "Format: ISO 8601",
    "RedriveCount": Number,
    "RedriveTime": "Format: ISO 8601"
  },
  "State": {
    "EnteredTime": "Format: ISO 8601",
    "Name": "String",
    "RetryCount": Number
  },
  "StateMachine": {
    "Id": "String",
    "Name": "String"
  },
  "Task": {
    "Token": "String"
  }
}

During an execution, the Context object is populated with relevant data. The RedriveTime Context object is only available if you've redriven an execution. If you've redriven a Map Run, the RedriveTime context object is only available for child workflows of type Standard. For a redriven Map Run with child workflows of type Express, RedriveTime isn't available. Content from a running execution includes specifics in the following format.

{
  "Execution": {
    "Id": "arn:aws:states:region:123456789012:execution:stateMachineName:executionName",
    "Input": {
      "key": "value"
    },
    "Name": "executionName",
    "RoleArn": "arn:aws:iam::123456789012:role...",
    "StartTime": "2019-03-26T20:14:13.192Z"
  },
  "State": {
    "EnteredTime": "2019-03-26T20:14:13.192Z",
    "Name": "Test",
    "RetryCount": 3
  },
  "StateMachine": {
    "Id": "arn:aws:states:region:123456789012:stateMachine:stateMachineName",
    "Name": "stateMachineName"
  },
  "Task": {
    "Token": "h7XRiCdLtd/83p1E0dMccoxlzFhglsdkzpK9mBVKZsp7d9yrT1W"
  }
}

Note: For Context object data related to Map states, see Context object data for Map states.

Context object data for Map states

Managing state and transforming data: Learn about Passing data between states with variables and Transforming data with JSONata.

There are two additional items available in the Context object when processing a Map state: Index and Value. For each Map state iteration, Index contains the index number for the array item that is being currently processed, while Value contains the array item being processed.
Within a Map state, the Context object includes the following data:

"Map": {
  "Item": {
    "Index": Number,
    "Value": "String"
  }
}

These are available only in a Map state, and can be specified in the ItemSelector (Map) field.

Note: You must define parameters from the Context object in the ItemSelector block of the main Map state, not within the states included in the ItemProcessor section.

Given a state machine using a JSONPath Map state, you can inject information from the Context object as follows.

{
  "StartAt": "ExampleMapState",
  "States": {
    "ExampleMapState": {
      "Type": "Map",
      "ItemSelector": {
        "ContextIndex.$": "$$.Map.Item.Index",
        "ContextValue.$": "$$.Map.Item.Value"
      },
      "ItemProcessor": {
        "ProcessorConfig": {
          "Mode": "INLINE"
        },
        "StartAt": "TestPass",
        "States": {
          "TestPass": {
            "Type": "Pass",
            "End": true
          }
        }
      },
      "End": true
    }
  }
}

If you execute the previous state machine with the following input, Index and Value are inserted in the output.

[
  { "who": "bob" },
  { "who": "meg" },
  { "who": "joe" }
]

The output for the execution returns the values of Index and Value items for each of the three iterations as follows:

[
  {
    "ContextIndex": 0,
    "ContextValue": { "who": "bob" }
  },
  {
    "ContextIndex": 1,
    "ContextValue": { "who": "meg" }
  },
  {
    "ContextIndex": 2,
    "ContextValue": { "who": "joe" }
  }
]

Using JSONPath paths

Managing state and transforming data Learn about Passing
machine with the following input, Index and Value are inserted in the output. [ { "who": "bob" }, { "who": "meg" }, { "who": "joe" } ] The output for the execution returns the values of Index and Value items for each of the three iterations as follows: [ { "ContextIndex": 0, "ContextValue": { "who": "bob" } }, { "ContextIndex": 1, "ContextValue": { "who": "meg" } }, { "ContextIndex": 2, Context object data for Map states 483 AWS Step Functions "ContextValue": { "who": "joe" } } ] Using JSONPath paths Developer Guide Managing state and transforming data Learn about Passing data between states with variables and Transforming data with JSONata. In the Amazon States Language, a path is a string beginning with $ that you can use to identify components within JSON text. Paths follow JsonPath syntax, which is only available when the QueryLanguage is set to JSONPath. You can specify a path to access subsets of the input when specifying values for InputPath, ResultPath, and OutputPath. You must use square bracket notation if your field name contains any character that is not included in the member-name-shorthand definition of the JsonPath ABNF rule. Therefore, to encode special characters, such as punctuation marks (excluding _), you must use square bracket notation. For example, $.abc.['def ghi']. Reference Paths A reference path is a path whose syntax is limited in such a way that it can identify only a single node in a JSON structure: • You can access object fields using only dot (.) and square bracket ([ ]) notation. • Functions such as length() aren't supported. • Lexical operators, which are non-symbolic, such as subsetof aren't supported. • Filtering by regular expression or by referencing another value in the JSON structure is not supported. • The operators @, ,, :, and ? 
are not supported For example, if state input data contains the following values: Using JSONPath paths 484 Developer Guide AWS Step Functions { "foo": 123, "bar": ["a", "b", "c"], "car": { "cdr": true } } The following reference paths would return the following. $.foo => 123 $.bar => ["a", "b", "c"] $.car.cdr => true Certain states use paths and reference paths to control the flow of a state machine or configure a state's settings or options. For more information, see Modeling workflow input and output path processing with data flow simulator and Using JSONPath effectively in AWS Step Functions. Flattening an array of arrays If the Parallel workflow state or Map workflow state state in your state machines return an array of arrays, you can transform them into a flat array with the ResultSelector field. You can include this field inside the Parallel or Map state definition to manipulate the result of these states. To flatten arrays, use the syntax: [*] in the ResultSelector field as shown in the following example. "ResultSelector": { "flattenArray.$": "$[*][*]" } For examples that show how to flatten an array, see Step 3 in the following tutorials: • Processing batch data with a Lambda function in Step Functions • Processing individual items with a Lambda function in Step Functions Reference Paths 485 AWS Step Functions Developer Guide Manipulate parameters in Step Functions workflows Managing state and transforming data Learn about Passing data between states with variables and Transforming data with JSONata. The InputPath, Parameters and ResultSelector fields provide a way to manipulate JSON as it moves through your workflow. InputPath can limit the input that is passed by filtering the JSON notation by using a path (see Using JSONPath paths). With the Parameters field, you can pass a collection of key-value pairs, using either static values or selections from the input using a path. 
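The `$[*][*]` syntax shown earlier for flattening an array of arrays behaves like a nested list comprehension. The following is an illustrative Python equivalent of that transformation, not the Step Functions implementation itself:

```python
# Illustrative only: mimics the "$[*][*]" ResultSelector syntax, which
# flattens an array of arrays (such as the result of a Parallel or Map state).
def flatten(array_of_arrays):
    return [item for inner in array_of_arrays for item in inner]

result = flatten([[1, 2], [3, 4], [5]])
# result == [1, 2, 3, 4, 5]
```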
The ResultSelector field provides a way to manipulate the state’s result before ResultPath is applied. AWS Step Functions applies the InputPath field first, and then the Parameters field. You can first filter your raw input to a selection you want using InputPath, and then apply Parameters to manipulate that input further, or add new values. You can then use the ResultSelector field to manipulate the state's output before ResultPath is applied. InputPath Use InputPath to select a portion of the state input. For example, suppose the input to your state includes the following. { "comment": "Example for InputPath.", "dataset1": { "val1": 1, "val2": 2, "val3": 3 }, "dataset2": { "val1": "a", "val2": "b", "val3": "c" Manipulate parameters with paths 486 AWS Step Functions } } You could apply the InputPath. "InputPath": "$.dataset2", Developer Guide With the previous InputPath, the following is the JSON that is passed as the input. { "val1": "a", "val2": "b", "val3": "c" } Note A path can yield a selection of values. Consider the following example. { "a": [1, 2, 3, 4] } If you apply the path $.a[0:2], the following is the result. [ 1,
includes the following. { "comment": "Example for InputPath.", "dataset1": { "val1": 1, "val2": 2, "val3": 3 }, "dataset2": { "val1": "a", "val2": "b", "val3": "c" Manipulate parameters with paths 486 AWS Step Functions } } You could apply the InputPath. "InputPath": "$.dataset2", Developer Guide With the previous InputPath, the following is the JSON that is passed as the input. { "val1": "a", "val2": "b", "val3": "c" } Note A path can yield a selection of values. Consider the following example. { "a": [1, 2, 3, 4] } If you apply the path $.a[0:2], the following is the result. [ 1, 2 ] Parameters This section describes the different ways you can use the Parameters field. Key-value pairs Use the Parameters field to create a collection of key-value pairs that are passed as input. The values of each can either be static values that you include in your state machine definition, or selected from either the input or the Context object with a path. For key-value pairs where the value is selected using a path, the key name must end in .$. For example, suppose you provide the following input. Parameters 487 AWS Step Functions Developer Guide { "comment": "Example for Parameters.", "product": { "details": { "color": "blue", "size": "small", "material": "cotton" }, "availability": "in stock", "sku": "2317", "cost": "$23" } } To select some of the information, you could specify these parameters in your state machine definition. "Parameters": { "comment": "Selecting what I care about.", "MyDetails": { "size.$": "$.product.details.size", "exists.$": "$.product.availability", "StaticValue": "foo" } }, Given the previous input and the Parameters field, this is the JSON that is passed. { "comment": "Selecting what I care about.", "MyDetails": { "size": "small", "exists": "in stock", "StaticValue": "foo" } }, In addition to the input, you can access a special JSON object, known as the Context object. The Context object includes information about your state machine execution. 
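The Parameters selection shown above can be mirrored in plain Python to see what the interpreter passes on. This is an illustrative trace only; Step Functions resolves the paths internally:

```python
# Illustrative trace of the Parameters example above: keys ending in ".$"
# are resolved against the state input; other values are passed as-is.
state_input = {
    "product": {
        "details": {"color": "blue", "size": "small", "material": "cotton"},
        "availability": "in stock",
    }
}

my_details = {
    "size": state_input["product"]["details"]["size"],   # "$.product.details.size"
    "exists": state_input["product"]["availability"],    # "$.product.availability"
    "StaticValue": "foo",                                # static value from the definition
}
# my_details == {"size": "small", "exists": "in stock", "StaticValue": "foo"}
```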
See Accessing execution data from the Context object in Step Functions . Parameters 488 AWS Step Functions Connected resources Developer Guide The Parameters field can also pass information to connected resources. For example, if your task state is orchestrating an AWS Batch job, you can pass the relevant API parameters directly to the API actions of that service. For more information, see: • Passing parameters to a service API in Step Functions • Integrating services Amazon S3 If the Lambda function data you are passing between states might grow to more than 262,144 bytes, we recommend using Amazon S3 to store the data, and implement one of the following methods: • Use the Distributed Map state in your workflow so that the Map state can read input directly from Amazon S3 data sources. For more information, see Distributed mode. • Parse the Amazon Resource Name (ARN) of the bucket in the Payload parameter to get the bucket name and key value. For more information, see Using Amazon S3 ARNs instead of passing large payloads in Step Functions. Alternatively, you can adjust your implementation to pass smaller payloads in your executions. ResultSelector Use the ResultSelector field to manipulate a state's result before ResultPath is applied. The ResultSelector field lets you create a collection of key value pairs, where the values are static or selected from the state's result. Using the ResultSelector field, you can choose what parts of a state's result you want to pass to the ResultPath field. Note With the ResultPath field, you can add the output of the ResultSelector field to the original input. ResultSelector is an optional field in the following states: ResultSelector 489 AWS Step Functions • Map workflow state • Task workflow state • Parallel workflow state Developer Guide For example, Step Functions service integrations return metadata in addition to the payload in the result. 
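As noted above, payloads passed between states are limited to 262,144 bytes. The following is a hypothetical helper (the function name and threshold handling are illustrative, not part of any AWS SDK) that a Lambda function could use to decide whether a response fits inline or should be written to Amazon S3 instead:

```python
import json

# Illustrative helper: Step Functions limits data passed between states
# to 262,144 bytes of UTF-8 encoded JSON. A Lambda function can check its
# response size before returning and fall back to Amazon S3 when needed.
PAYLOAD_LIMIT_BYTES = 262144

def fits_in_payload(data):
    return len(json.dumps(data).encode("utf-8")) <= PAYLOAD_LIMIT_BYTES

print(fits_in_payload({"ok": True}))           # True
print(fits_in_payload({"big": "x" * 300000}))  # False
```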
ResultSelector can select portions of the result and merge them with the state input with ResultPath. In this example, we want to select just the resourceType and ClusterId, and merge that with the state input from an Amazon EMR createCluster.sync. Given the following: { "resourceType": "elasticmapreduce", "resource": "createCluster.sync", "output": { "SdkHttpMetadata": { "HttpHeaders": { "Content-Length": "1112", "Content-Type": "application/x-amz-JSON-1.1", "Date": "Mon, 25 Nov 2019 19:41:29 GMT", "x-amzn-RequestId": "1234-5678-9012" }, "HttpStatusCode": 200 }, "SdkResponseMetadata": { "RequestId": "1234-5678-9012" }, "ClusterId": "AKIAIOSFODNN7EXAMPLE" } } You can then select the resourceType and ClusterId using ResultSelector: "Create Cluster": { "Type": "Task", "Resource": "arn:aws:states:::elasticmapreduce:createCluster.sync", "Parameters": { <some parameters> }, "ResultSelector": { "ClusterId.$": "$.output.ClusterId", "ResourceType.$": "$.resourceType" }, ResultSelector 490 AWS Step Functions Developer Guide "ResultPath": "$.EMROutput", "Next": "Next Step" } With the given input, using ResultSelector produces: { "OtherDataFromInput": {}, "EMROutput": { "ClusterId": "AKIAIOSFODNN7EXAMPLE", "ResourceType": "elasticmapreduce", } } Flattening an array of arrays If the Parallel workflow state or Map workflow state state in your state machines return an array of arrays, you can transform them into a flat array with the ResultSelector
}, "ClusterId": "AKIAIOSFODNN7EXAMPLE" } } You can then select the resourceType and ClusterId using ResultSelector: "Create Cluster": { "Type": "Task", "Resource": "arn:aws:states:::elasticmapreduce:createCluster.sync", "Parameters": { <some parameters> }, "ResultSelector": { "ClusterId.$": "$.output.ClusterId", "ResourceType.$": "$.resourceType" }, ResultSelector 490 AWS Step Functions Developer Guide "ResultPath": "$.EMROutput", "Next": "Next Step" } With the given input, using ResultSelector produces: { "OtherDataFromInput": {}, "EMROutput": { "ClusterId": "AKIAIOSFODNN7EXAMPLE", "ResourceType": "elasticmapreduce", } } Flattening an array of arrays If the Parallel workflow state or Map workflow state state in your state machines return an array of arrays, you can transform them into a flat array with the ResultSelector field. You can include this field inside the Parallel or Map state definition to manipulate the result of these states. To flatten arrays, use the syntax: [*] in the ResultSelector field as shown in the following example. "ResultSelector": { "flattenArray.$": "$[*][*]" } For examples that show how to flatten an array, see Step 3 in the following tutorials: • Processing batch data with a Lambda function in Step Functions • Processing individual items with a Lambda function in Step Functions ResultSelector 491 AWS Step Functions Developer Guide Example: Manipulating state data with paths in Step Functions workflows Managing state and transforming data Learn about Passing data between states with variables and Transforming data with JSONata. This topic contains examples of how to manipulate state input and output JSON using the InputPath, ResultPath, and OutputPath fields. Any state other than a Fail workflow state state or a Succeed workflow state state can include the input and output processing fields, such as InputPath, ResultPath, or OutputPath. 
Additionally, the Wait workflow state and Choice workflow state states don't support the ResultPath field. With these fields, you can use a JsonPath to filter the JSON data as it moves through your workflow. You can also use the Parameters field to manipulate the JSON data as it moves through your workflow. For information about using Parameters, see Manipulate parameters in Step Functions workflows. For example, start with the AWS Lambda function and state machine described in the Creating a Step Functions state machine that uses Lambda tutorial. Modify the state machine so that it includes the following InputPath, ResultPath, and OutputPath. { "Comment": "A Hello World example of the Amazon States Language using an AWS Lambda function", "StartAt": "HelloWorld", "States": { "HelloWorld": { "Type": "Task", "Resource": "arn:aws:lambda:region:123456789012:function:HelloFunction", "InputPath": "$.lambda", "ResultPath": "$.data.lambdaresult", "OutputPath": "$.data", "End": true } } Example: Manipulating state data with paths 492 Developer Guide AWS Step Functions } Start an execution using the following input. { "comment": "An input comment.", "data": { "val1": 23, "val2": 17 }, "extra": "foo", "lambda": { "who": "AWS Step Functions" } } Assume that the comment and extra nodes can be discarded, but that you want to include the output of the Lambda function, and preserve the information in the data node. In the updated state machine, the Task state is altered to process the input to the task. "InputPath": "$.lambda", This line in the state machine definition limits the task input to only the lambda node from the state input. The Lambda function receives only the JSON object {"who": "AWS Step Functions"} as input. "ResultPath": "$.data.lambdaresult", This ResultPath tells the state machine to insert the result of the Lambda function into a node named lambdaresult, as a child of the data node in the original state machine input. 
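The combined effect of the three fields in this example can be traced in plain Python. This is an illustrative walkthrough of the documented behavior, not how the service is implemented:

```python
import copy

# Illustrative trace of the InputPath / ResultPath / OutputPath example above.
state_input = {
    "comment": "An input comment.",
    "data": {"val1": 23, "val2": 17},
    "extra": "foo",
    "lambda": {"who": "AWS Step Functions"},
}

# "InputPath": "$.lambda" -- only this node reaches the Lambda function
task_input = state_input["lambda"]
task_result = f"Hello, {task_input['who']}!"  # what the function returns

# "ResultPath": "$.data.lambdaresult" -- insert the result into the original input
combined = copy.deepcopy(state_input)
combined["data"]["lambdaresult"] = task_result

# "OutputPath": "$.data" -- keep only the data node as the state output
state_output = combined["data"]
# state_output == {"val1": 23, "val2": 17,
#                  "lambdaresult": "Hello, AWS Step Functions!"}
```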
Because you are not performing any other manipulation on the original input and the result using OutputPath, the output of the state now includes the result of the Lambda function with the original input. { "comment": "An input comment.", "data": { "val1": 23, "val2": 17, Example: Manipulating state data with paths 493 AWS Step Functions Developer Guide "lambdaresult": "Hello, AWS Step Functions!" }, "extra": "foo", "lambda": { "who": "AWS Step Functions" } } But, our goal was to preserve only the data node, and include the result of the Lambda function. OutputPath filters this combined JSON before passing it to the state output. "OutputPath": "$.data", This selects only the data node from the original input (including the lambdaresult child inserted by ResultPath) to be passed to the output. The state output is filtered to the following. { "val1": 23, "val2": 17, "lambdaresult": "Hello, AWS Step Functions!" } In this Task state: 1. InputPath sends only the lambda node from the input to the Lambda function. 2. ResultPath inserts the result as a child of the data node in the original input. 3. OutputPath filters the state input (which now includes the result of the Lambda function) so that it passes only the data node to the state output. Example to manipulate original state machine input, result, and final output using JsonPath Consider the following state machine that verifies an insurance applicant's identity and address. Note To view the complete
"val2": 17, "lambdaresult": "Hello, AWS Step Functions!" } In this Task state: 1. InputPath sends only the lambda node from the input to the Lambda function. 2. ResultPath inserts the result as a child of the data node in the original input. 3. OutputPath filters the state input (which now includes the result of the Lambda function) so that it passes only the data node to the state output. Example to manipulate original state machine input, result, and final output using JsonPath Consider the following state machine that verifies an insurance applicant's identity and address. Note To view the complete example, see How to use JSON Path in Step Functions. { Example: Manipulating state data with paths 494 AWS Step Functions Developer Guide "Comment": "Sample state machine to verify an applicant's ID and address", "StartAt": "Verify info", "States": { "Verify info": { "Type": "Parallel", "End": true, "Branches": [ { "StartAt": "Verify identity", "States": { "Verify identity": { "Type": "Task", "Resource": "arn:aws:states:::lambda:invoke", "Parameters": { "Payload.$": "$", "FunctionName": "arn:aws:lambda:us-east-2:111122223333:function:check- identity:$LATEST" }, "End": true } } }, { "StartAt": "Verify address", "States": { "Verify address": { "Type": "Task", "Resource": "arn:aws:states:::lambda:invoke", "Parameters": { "Payload.$": "$", "FunctionName": "arn:aws:lambda:us-east-2:111122223333:function:check- address:$LATEST" }, "End": true } } } ] } } } Example: Manipulating state data with paths 495 AWS Step Functions Developer Guide If you run this state machine using the following input, the execution fails because the Lambda functions that perform verification only expect the data that needs to be verified as input. Therefore, you must specify the nodes that contain the information to be verified using an appropriate JsonPath. 
{ "data": { "firstname": "Jane", "lastname": "Doe", "identity": { "email": "[email protected]", "ssn": "123-45-6789" }, "address": { "street": "123 Main St", "city": "Columbus", "state": "OH", "zip": "43219" }, "interests": [ { "category": "home", "type": "own", "yearBuilt": 2004 }, { "category": "boat", "type": "snowmobile", "yearBuilt": 2020 }, { "category": "auto", "type": "RV", "yearBuilt": 2015 }, ] } } To specify the node that the check-identity Lambda function must use, use the InputPath field as follows: Example: Manipulating state data with paths 496 AWS Step Functions Developer Guide "InputPath": "$.data.identity" And to specify the node that the check-address Lambda function must use, use the InputPath field as follows: "InputPath": "$.data.address" Now if you want to store the verification result within the original state machine input, use the ResultPath field as follows: "ResultPath": "$.results" However, if you only need the identity and verification results and discard the original input, use the OutputPath field as follows: "OutputPath": "$.results" For more information, see Processing input and output in Step Functions. Filtering state output using OutputPath With OutputPath you can select a portion of the state output to pass to the next state. With this approach, you can filter out unwanted information, and pass only the portion of JSON that you need. If you don't specify an OutputPath the default value is $. This passes the entire JSON node (determined by the state input, the task result, and ResultPath) to the next state. Specifying state output using ResultPath in Step Functions Managing state and transforming data This page refers to JSONPath. Step Functions recently added variables and JSONata to manage state and transform data. Learn about Passing data with variables and Transforming data with JSONata. 
Filtering state output 497 AWS Step Functions Developer Guide The output of a state can be a copy of its input, the result it produces (for example, output from a Task state’s Lambda function), or a combination of its input and result. Use ResultPath to control which combination of these is passed to the state output. The following state types can generate a result and can include ResultPath: • Pass workflow state • Task workflow state • Parallel workflow state • Map workflow state Use ResultPath to combine a task result with task input, or to select one of these. The path you provide to ResultPath controls what information passes to the output. Note ResultPath is limited to using reference paths, which limit scope so the path must identify only a single node in JSON. See Reference Paths in the Amazon States Language. Use ResultPath to replace input with the task result If you do not specify a ResultPath, the default behavior is the same as "ResultPath": "$". The state will replace the entire state input with the result from the task. # State Input { "comment": "This is a test", "details": "Default example", "who" : "Step Functions" } # Path "ResultPath": "$" # Task result "Hello, Step Functions!" # State Output Replace input with result 498 AWS Step Functions "Hello, Step Functions!" Note Developer Guide ResultPath is used to include content from the result with the input, before passing it to the output. But, if ResultPath isn't specified, the default action is to replace the entire input.
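The three ResultPath behaviors described in this section (replace, discard, and combine) can be compared side by side in plain Python. This sketch only illustrates the documented outcomes:

```python
# Illustrative comparison of the ResultPath behaviors described above.
state_input = {
    "comment": "This is a test",
    "details": "Default example",
    "who": "Step Functions",
}
task_result = "Hello, Step Functions!"

# "ResultPath": "$" (the default) -- the result replaces the entire input
output_replace = task_result

# "ResultPath": null -- the result is discarded; the original input passes through
output_keep_input = state_input

# "ResultPath": "$.taskresult" -- the result is inserted under a new key
output_combined = {**state_input, "taskresult": task_result}
# output_combined == {"comment": "This is a test", "details": "Default example",
#                     "who": "Step Functions", "taskresult": "Hello, Step Functions!"}
```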
ResultPath, the default behavior is the same as "ResultPath": "$". The state will replace the entire state input with the result from the task. # State Input { "comment": "This is a test", "details": "Default example", "who" : "Step Functions" } # Path "ResultPath": "$" # Task result "Hello, Step Functions!" # State Output Replace input with result 498 AWS Step Functions "Hello, Step Functions!" Note Developer Guide ResultPath is used to include content from the result with the input, before passing it to the output. But, if ResultPath isn't specified, the default action is to replace the entire input. Discard the result and keep the original input If you set ResultPath to null, the state will pass the original input to the output. The state's input payload will be copied directly to the output, with no regard for the task result. # State Input { "comment": "This is a test", "details": "Default example", "who" : "Step Functions" } # Path "ResultPath": null # Task result "Hello, Step Functions!" # State Output { "comment": "This is a test", "details": "Default example", "who" : "Step Functions" } Use ResultPath to include the result with the input If you specify a path for ResultPath, the state output will combine the state input and task result: # State Input { "comment": "This is a test", Discard Result and Keep Input 499 AWS Step Functions Developer Guide "details": "Default example", "who" : "Step Functions" } # Path "ResultPath": "$.taskresult" # Task result "Hello, Step Functions!" # State Output { "comment": "This is a test", "details": "Default example", "who" : "Step Functions", "taskresult" : "Hello, Step Functions!" } You can also insert the result into a child node of the input. Set the ResultPath to the following. 
"ResultPath": "$.strings.lambdaresult" Given the following input: { "comment": "An input comment.", "strings": { "string1": "foo", "string2": "bar", "string3": "baz" }, "who": "AWS Step Functions" } The task result would be inserted as a child of the strings node in the input. { "comment": "An input comment.", "strings": { "string1": "foo", "string2": "bar", "string3": "baz", Include Result with Input 500 AWS Step Functions Developer Guide "lambdaresult": "Hello, Step Functions!" }, "who": "AWS Step Functions" } The state output now includes the original input JSON with the result as a child node. Use ResultPath to update a node in the input with the result If you specify an existing node for ResultPath, the task result will replace that existing node: # State Input { "comment": "This is a test", "details": "Default example", "who" : "Step Functions" } # Path "ResultPath": "$.comment" # Task result "Hello, Step Functions!" # State Output { "comment": "Hello, Step Functions!", "details": "Default example", "who" : "Step Functions" } Use ResultPath to include both error and input in a Catch In some cases, you might want to preserve the original input with the error. Use ResultPath in a Catch to include the error with the original input, instead of replacing it. "Catch": [{ "ErrorEquals": ["States.ALL"], "Next": "NextTask", "ResultPath": "$.error" }] Update a Node in Input with Result 501 AWS Step Functions Developer Guide If the previous Catch statement catches an error, it includes the result in an error node within the state input. For example, with the following input: {"foo": "bar"} The state output when catching an error is the following. 
{ "foo": "bar", "error": { "Error": "Error here" } } For more information about error handling, see the following: • Handling errors in Step Functions workflows • Handling error conditions using a Step Functions state machine Map state input and output fields in Step Functions Managing state and transforming data Learn about Passing data between states with variables and Transforming data with JSONata. Map states concurrently iterate over a collection of items in a dataset, such as a JSON array, a list of Amazon S3 objects, or the rows of JSON Lines or a CSV file in an Amazon S3 bucket. It repeats a set of steps for each item in the collection. You can configure the input that the Map state receives and the output it generates using these fields. Step Functions applies each field in your Distributed Map state in the order shown in the following list and illustration: Note Based on your use case, you may not need to apply all of these fields. Map state input and output fields in Step Functions 502 AWS Step Functions 1. ItemReader (Map) 2. ItemsPath (Map, JSONPath only) 3. ItemSelector (Map) 4. ItemBatcher (Map) 5. ResultWriter (Map) Developer Guide Note These Map state input and output fields are currently unavailable in the data flow simulator in the Step Functions console. Map state input and output fields in Step Functions 503 AWS Step Functions ItemReader (Map) Developer Guide The ItemReader field is a JSON object, which
in the following list and illustration: Note Based on your use case, you may not need to apply all of these fields. Map state input and output fields in Step Functions 502 AWS Step Functions 1. ItemReader (Map) 2. ItemsPath (Map, JSONPath only) 3. ItemSelector (Map) 4. ItemBatcher (Map) 5. ResultWriter (Map) Developer Guide Note These Map state input and output fields are currently unavailable in the data flow simulator in the Step Functions console. Map state input and output fields in Step Functions 503 AWS Step Functions ItemReader (Map) Developer Guide The ItemReader field is a JSON object, which specifies a dataset and its location. A Distributed Map state uses this dataset as its input. The following example shows the syntax of the ItemReader field in a JSONPath-based workflow, for a dataset in a text delimited file that's stored in an Amazon S3 bucket. "ItemReader": { "ReaderConfig": { "InputType": "CSV", "CSVHeaderLocation": "FIRST_ROW" }, "Resource": "arn:aws:states:::s3:getObject", "Parameters": { "Bucket": "myBucket", "Key": "csvDataset/ratings.csv", "VersionId": "BcK42coT2jE1234VHLUvBV1yLNod2OEt" } } The following example shows that in JSONata-based workflows, Parameters is replaced with Arguments. "ItemReader": { "ReaderConfig": { "InputType": "CSV", "CSVHeaderLocation": "FIRST_ROW" }, "Resource": "arn:aws:states:::s3:getObject", "Arguments": { "Bucket": "amzn-s3-demo-bucket", "Key": "csvDataset/ratings.csv" } } Tip In Workflow Studio, you specify the dataset and its location in the Item source field. ItemReader 504 AWS Step Functions Contents • Contents of the ItemReader field • Examples of datasets • IAM policies for datasets Contents of the ItemReader field Developer Guide Depending on your dataset, the contents of the ItemReader field varies. For example, if your dataset is a JSON array passed from a previous step in the workflow, the ItemReader field is omitted. 
If your dataset is an Amazon S3 data source, this field contains the following sub-fields. ReaderConfig A JSON object that specifies the following details: • InputType Accepts one of the following values: CSV, JSON, JSONL,MANIFEST. Specifies the type of Amazon S3 data source, such as a text delimited file (CSV), object, JSON file, JSON Lines, or an Amazon S3 inventory list. In Workflow Studio, you can select an input type from the Amazon S3 item source dropdown list under the Item source field. • CSVDelimiter Specify this field only if you use CSV as InputType, which indicates a text delimited file. Accepts one of the following values: COMMA (default), PIPE, SEMICOLON, SPACE, TAB. Note The CSVDelimiter field enables ItemReader more flexibility to support files that are delimited by other characters besides the comma. Therefore, assume that our references to CSV files in relation to ItemReader also include files that use delimiters accepted by the CSVDelimiter field. • CSVHeaderLocation Required if InputType is CSV, which indicates a text delimited file with delimiters accepted by the CSVDelimiter field. ItemReader 505 AWS Step Functions Developer Guide Accepts one of the following values to specify the location of the column header: • FIRST_ROW – Use this option if the first line of the file is the header. • GIVEN – Use this option to specify the header within the state machine definition. For example, if your file contains the following data. 1,307,3.5,1256677221 1,481,3.5,1256677456 1,1091,1.5,1256677471 ... Provide the following JSON array as a CSV header. "ItemReader": { "ReaderConfig": { "InputType": "CSV", "CSVHeaderLocation": "GIVEN", "CSVHeaders": [ "userId", "movieId", "rating", "timestamp" ] } } Important Currently, Step Functions supports headers of up to 10 KiB for text delimited files. Tip In Workflow Studio, you can find this option under Additional configuration in the Item source field. 
• MaxItems ItemReader 506 AWS Step Functions Developer Guide Limits the number of data items passed to the Map state. For example, suppose that you provide a text delimited file that contains 1000 rows and specify a limit of 100. Then, the interpreter passes only 100 rows to the Map state. The Map state processes items in sequential order, starting after the header row. By default, the Map state iterates over all the items in the specified dataset. Note Currently, you can specify a limit of up to 100,000,000. The Distributed Map state stops reading items beyond this limit. Tip In Workflow Studio, you can find this option under Additional configuration in the Item source field. Alternatively, you can specify a reference path to an existing key-value pair in your Distributed Map state input. This path must resolve to a positive integer. You specify the reference path in the MaxItemsPath sub-field. Important You can specify either the MaxItems or the MaxItemsPath sub-field, but not both. Resource The Amazon S3 API action that Step Functions must invoke depending on the specified dataset. Parameters A JSON object that specifies the Amazon S3 bucket name and object key that the dataset is stored in. In this field, you can also provide the Amazon S3 object version, if the bucket has versioning enabled. ItemReader 507 AWS Step Functions Important Developer
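For the MaxItems option described above, the following sketch shows how it fits into the ReaderConfig object alongside the other sub-fields (the bucket name and object key are placeholders):

```json
"ItemReader": {
  "ReaderConfig": {
    "InputType": "CSV",
    "CSVHeaderLocation": "FIRST_ROW",
    "MaxItems": 100
  },
  "Resource": "arn:aws:states:::s3:getObject",
  "Parameters": {
    "Bucket": "amzn-s3-demo-bucket",
    "Key": "csvDataset/ratings.csv"
  }
}
```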
key-value pair in your Distributed Map state input. This path must resolve to a positive integer. You specify the reference path in the MaxItemsPath sub-field. Important You can specify either the MaxItems or the MaxItemsPath sub-field, but not both. Resource The Amazon S3 API action that Step Functions must invoke depending on the specified dataset. Parameters A JSON object that specifies the Amazon S3 bucket name and object key that the dataset is stored in. In this field, you can also provide the Amazon S3 object version, if the bucket has versioning enabled. ItemReader 507 AWS Step Functions Important Developer Guide Make sure that your Amazon S3 buckets are in the same AWS account and AWS Region as your state machine. Note that even though your state machine may be able to access files in buckets across different AWS accounts that are in the same AWS Region, Step Functions only supports state machines to list objects in S3 buckets that are in both the same AWS account and the same AWS Region as the state machine. Examples of datasets You can specify one of the following options as your dataset: • JSON array from a previous step • A list of Amazon S3 objects • JSON file in an Amazon S3 bucket • JSON Lines file in an Amazon S3 bucket • CSV file in an Amazon S3 bucket • Amazon S3 inventory list Important Step Functions needs appropriate permissions to access the Amazon S3 datasets that you use. For information about IAM policies for the datasets, see IAM policies for datasets. JSON array from a previous step A Distributed Map state can accept a JSON input passed from a previous step in the workflow. This input must either be an array, or must contain an array within a specific node. To select a node that contains the array, you can use the ItemsPath (Map, JSONPath only) field. To process individual items in the array, the Distributed Map state starts a child workflow execution for each array item. 
The following tabs show examples of the input passed to the Map state and the corresponding input to a child workflow execution. ItemReader 508 AWS Step Functions Note Developer Guide Step Functions omits the ItemReader field when your dataset is a JSON array from a previous step. Input passed to the Map state Consider the following JSON array of three items. "facts": [ { "verdict": "true", "statement_date": "6/11/2008", "statement_source": "speech" }, { "verdict": "false", "statement_date": "6/7/2022", "statement_source": "television" }, { "verdict": "mostly-true", "statement_date": "5/18/2016", "statement_source": "news" } ] Input passed to a child workflow execution The Distributed Map state starts three child workflow executions. Each execution receives an array item as input. The following example shows the input received by a child workflow execution. { "verdict": "true", "statement_date": "6/11/2008", "statement_source": "speech" } ItemReader 509 AWS Step Functions Amazon S3 objects example Developer Guide A Distributed Map state can iterate over the objects that are stored in an Amazon S3 bucket. When the workflow execution reaches the Map state, Step Functions invokes the ListObjectsV2 API action, which returns an array of the Amazon S3 object metadata. In this array, each item contains data, such as ETag and Key, for the data stored in the bucket. To process individual items in the array, the Distributed Map state starts a child workflow execution. For example, suppose that your Amazon S3 bucket contains 100 images. Then, the array returned after invoking the ListObjectsV2 API action contains 100 items. The Distributed Map state then starts 100 child workflow executions to process each array item. Note • Currently, Step Functions also includes an item for each folder you create in a specific Amazon S3 bucket using the Amazon S3 console. This results in an extra child workflow execution started by the Distributed Map state. 
To avoid creating an extra child workflow execution for the folder, we recommend that you use the AWS CLI to create folders. For more information, see High-level Amazon S3 commands in the AWS Command Line Interface User Guide. • Step Functions needs appropriate permissions to access the Amazon S3 datasets that you use. For information about IAM policies for the datasets, see IAM policies for datasets. The following tabs show examples of the ItemReader field syntax and the input passed to a child workflow execution for this dataset. ItemReader syntax In this example, you've organized your data, which includes images, JSON files, and objects, within a prefix named processData in an Amazon S3 bucket named amzn-s3-demo-bucket. "ItemReader": { "Resource": "arn:aws:states:::s3:listObjectsV2", "Parameters": { "Bucket": "amzn-s3-demo-bucket", "Prefix": "processData" } } ItemReader 510 AWS Step Functions Developer Guide Input passed to a child workflow execution The Distributed Map state starts as many child workflow executions as the number of items present in the Amazon S3 bucket. The
following example shows the input received by a child workflow execution. { "Etag": "\"05704fbdccb224cb01c59005bebbad28\"", "Key": "processData/images/n02085620_1073.jpg", "LastModified": 1668699881, "Size": 34910, "StorageClass": "STANDARD" } JSON file in an Amazon S3 bucket A Distributed Map state can accept a JSON file that's stored in an Amazon S3 bucket as a dataset. The JSON file must contain an array. When the workflow execution reaches the Map state, Step Functions invokes the GetObject API action to fetch the specified JSON file. The Map state then iterates over each item in the array and starts a child workflow execution for each item. For example, if your JSON file contains 1000 array items, the Map state starts 1000 child workflow executions. Note • The execution input used to start a child workflow execution can't exceed 256 KiB. However, Step Functions supports reading an item of up to 8 MB from a text delimited file, JSON, or JSON Lines file if you then apply the optional ItemSelector field to reduce the item's size. • Currently, Step Functions supports 10 GB as the maximum size of an individual file in Amazon S3. • Step Functions needs appropriate permissions to access the Amazon S3 datasets that you use.
For information about IAM policies for the datasets, see IAM policies for datasets. The following tabs show examples of the ItemReader field syntax and the input passed to a child workflow execution for this dataset. ItemReader 511 AWS Step Functions Developer Guide For this example, imagine you have a JSON file named factcheck.json. You've stored this file within a prefix named jsonDataset in an Amazon S3 bucket. The following is an example of the JSON dataset. [ { "verdict": "true", "statement_date": "6/11/2008", "statement_source": "speech" }, { "verdict": "false", "statement_date": "6/7/2022", "statement_source": "television" }, { "verdict": "mostly-true", "statement_date": "5/18/2016", "statement_source": "news" }, ... ] ItemReader syntax "ItemReader": { "Resource": "arn:aws:states:::s3:getObject", "ReaderConfig": { "InputType": "JSON" }, "Parameters": { "Bucket": "amzn-s3-demo-bucket", "Key": "jsonDataset/factcheck.json" } } Input to a child workflow execution The Distributed Map state starts as many child workflow executions as the number of array items present in the JSON file. The following example shows the input received by a child workflow execution. ItemReader 512 AWS Step Functions Developer Guide { "verdict": "true", "statement_date": "6/11/2008", "statement_source": "speech" } JSON Lines file in an Amazon S3 bucket A Distributed Map state can accept a JSON Lines file that's stored in an Amazon S3 bucket as a dataset. Note • The execution input used to start a child workflow execution can't exceed 256 KiB. However, Step Functions supports reading an item of up to 8 MB from a text delimited file, JSON, or JSON Lines file if you then apply the optional ItemSelector field to reduce the item's size. • Currently, Step Functions supports 10 GB as the maximum size of an individual file in Amazon S3. • Step Functions needs appropriate permissions to access the Amazon S3 datasets that you use. 
For information about IAM policies for the datasets, see IAM policies for datasets. The following tabs show examples of the ItemReader field syntax and the input passed to a child workflow execution for this dataset. For this example, imagine you have a JSON Lines file named factcheck.jsonl. You've stored this file within a prefix named jsonlDataset in an Amazon S3 bucket. The following is an example of the file's contents. {"verdict": "true", "statement_date": "6/11/2008", "statement_source": "speech"} {"verdict": "false", "statement_date": "6/7/2022", "statement_source": "television"} {"verdict": "mostly-true", "statement_date": "5/18/2016", "statement_source": "news"} ItemReader syntax "ItemReader": { "Resource": "arn:aws:states:::s3:getObject", "ReaderConfig": { "InputType": "JSONL" }, "Parameters": { "Bucket": "amzn-s3-demo-bucket", "Key": "jsonlDataset/factcheck.jsonl" } } Input to a child workflow execution The Distributed Map state starts as many child workflow executions as the number of lines present in the JSONL file. The following example shows the input received by a child workflow execution. { "verdict": "true", "statement_date": "6/11/2008", "statement_source": "speech" } CSV file in an Amazon S3 bucket Note The CSVDelimiter field gives ItemReader the flexibility to support files that are delimited by characters other than the comma. Therefore, assume that our references to CSV files in relation to ItemReader also include files that use delimiters accepted by the CSVDelimiter field. A Distributed Map
state can accept a text delimited file that's stored in an Amazon S3 bucket as a dataset. If you use a text delimited file as your dataset, you need to specify a column header. For information about how to specify a header, see Contents of the ItemReader field. Step Functions parses text delimited files based on the following rules: • The delimiter that separates fields is specified by CSVDelimiter in ReaderConfig. The delimiter defaults to COMMA. • A newline is the delimiter that separates records. • Fields are treated as strings. For data type conversions, use the States.StringToJson intrinsic function in ItemSelector (Map). • Double quotation marks (" ") are not required to enclose strings. However, strings that are enclosed by double quotation marks can contain commas and newlines without acting as record delimiters. • You can preserve double quotes by repeating them. • If the number of fields in a row is less than the number of fields in the header, Step Functions provides empty strings for the missing values. • If the number of fields in a row is more than the number of fields in the header, Step Functions skips the additional fields. For more information about how Step Functions parses a text delimited file, see Example of parsing an input CSV file.
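As a rough illustration of these rules, the following Python sketch parses a pipe-delimited file with Python's csv module, then pads or truncates each record against the header. This is a simplified model for illustration only, not the actual Step Functions parser, and the sample data is made up.

```python
# Illustrative model of the parsing rules above (not the actual parser):
# quoted fields may contain the delimiter, short rows are padded with
# empty strings, and extra fields beyond the header are skipped.
import csv
import io

def parse_delimited(text, delimiter=","):
    rows = list(csv.reader(io.StringIO(text), delimiter=delimiter))
    header, records = rows[0], rows[1:]
    items = []
    for record in records:
        # Pad missing values with empty strings; drop extra fields.
        record = (record + [""] * len(header))[:len(header)]
        # Every field remains a string; no type conversion is applied.
        items.append(dict(zip(header, record)))
    return items

sample = 'rating|movieId|userId\n"3.5"|307|1\n4.0|42\n5.0|99|7|extra\n'
items = parse_delimited(sample, delimiter="|")
# The second row gains an empty userId; the third row loses the extra field.
```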
When the workflow execution reaches the Map state, Step Functions invokes the GetObject API action to fetch the specified file. The Map state then iterates over each row in the file and starts a child workflow execution to process the items in each row. For example, suppose that you provide a text delimited file that contains 100 rows as input. Then, the interpreter passes each row to the Map state. The Map state processes items in sequential order, starting after the header row. Note • The execution input used to start a child workflow execution can't exceed 256 KiB. However, Step Functions supports reading an item of up to 8 MB from a text delimited file, JSON, or JSON Lines file if you then apply the optional ItemSelector field to reduce the item's size. • Currently, Step Functions supports 10 GB as the maximum size of an individual file in Amazon S3. • Step Functions needs appropriate permissions to access the Amazon S3 datasets that you use. For information about IAM policies for the datasets, see IAM policies for datasets. The following tabs show examples of the ItemReader field syntax and the input passed to a child workflow execution for this dataset. ItemReader syntax For example, say that you have a CSV file named ratings.csv. Then, you've stored this file within a prefix that's named csvDataset in an Amazon S3 bucket. { "ItemReader": { "ReaderConfig": { "InputType": "CSV", "CSVHeaderLocation": "FIRST_ROW", "CSVDelimiter": "PIPE" }, "Resource": "arn:aws:states:::s3:getObject", "Parameters": { "Bucket": "amzn-s3-demo-bucket", "Key": "csvDataset/ratings.csv" } } } Input to a child workflow execution The Distributed Map state starts as many child workflow executions as the number of rows present in the CSV file, excluding the header row, if the file contains one. The following example shows the input received by a child workflow execution.
{ "rating": "3.5", "movieId": "307", "userId": "1", "timestamp": "1256677221" } S3 inventory example A Distributed Map state can accept an Amazon S3 inventory manifest file that's stored in an Amazon S3 bucket as a dataset. When the workflow execution reaches the Map state, Step Functions invokes the GetObject API action to fetch the specified Amazon S3 inventory manifest file. The Map state then iterates over the objects in the inventory to return an array of Amazon S3 inventory object metadata. ItemReader 516 AWS Step Functions Note Developer Guide • Currently, Step Functions supports 10 GB as the maximum size of an individual file in an Amazon S3 inventory report after decompression. However, Step Functions can process more than 10 GB if each individual file is under 10 GB. • Step Functions needs appropriate permissions to access the Amazon S3 datasets that you use. For information about IAM policies for the datasets, see IAM policies for datasets. The following is an example of an inventory file in CSV format. This file includes
the objects named csvDataset and imageDataset, which are stored in an Amazon S3 bucket that's named amzn-s3-demo-source-bucket. "amzn-s3-demo-source-bucket","csvDataset/","0","2022-11-16T00:27:19.000Z" "amzn-s3-demo-source-bucket","csvDataset/titles.csv","3399671","2022-11-16T00:29:32.000Z" "amzn-s3-demo-source-bucket","imageDataset/","0","2022-11-15T20:00:44.000Z" "amzn-s3-demo-source-bucket","imageDataset/n02085620_10074.jpg","27034","2022-11-15T20:02:16.000Z" ... Important Currently, Step Functions doesn't support a user-defined Amazon S3 inventory report as a dataset. You must also make sure that the output format of your Amazon S3 inventory report is CSV. For more information about Amazon S3 inventories and how to set them up, see Amazon S3 Inventory in the Amazon S3 User Guide. The following example of an inventory manifest file shows the CSV headers for the inventory object metadata.
{ "sourceBucket" : "amzn-s3-demo-source-bucket", "destinationBucket" : "arn:aws:s3:::amzn-s3-demo-inventory", "version" : "2016-11-30", "creationTimestamp" : "1668560400000", "fileFormat" : "CSV", "fileSchema" : "Bucket, Key, Size, LastModifiedDate", "files" : [ { "key" : "amzn-s3-demo-bucket/destination-prefix/data/20e55de8-9c21-45d4-99b9-46c732000228.csv.gz", "size" : 7300, "MD5checksum" : "a7ff4a1d4164c3cd55851055ec8f6b20" } ] } The following tabs show examples of the ItemReader field syntax and the input passed to a child workflow execution for this dataset. ItemReader syntax { "ItemReader": { "ReaderConfig": { "InputType": "MANIFEST" }, "Resource": "arn:aws:states:::s3:getObject", "Parameters": { "Bucket": "amzn-s3-demo-destination-bucket", "Key": "destination-prefix/amzn-s3-demo-bucket/config-id/YYYY-MM-DDTHH-MMZ/manifest.json" } } } Input to a child workflow execution { "LastModifiedDate": "2022-11-16T00:29:32.000Z", "Bucket": "amzn-s3-demo-source-bucket", "Size": "3399671", "Key": "csvDataset/titles.csv" } Depending on the fields you selected while configuring the Amazon S3 inventory report, the contents of your manifest.json file may vary from the example shown. IAM policies for datasets When you create workflows with the Step Functions console, Step Functions can automatically generate IAM policies based on the resources in your workflow definition. These policies include the least privileges necessary to allow the state machine role to invoke the StartExecution API action for the Distributed Map state. These policies also include the least privileges necessary for Step Functions to access AWS resources, such as Amazon S3 buckets and objects and Lambda functions. We highly recommend that you include only those permissions that are necessary in your IAM policies.
For example, if your workflow includes a Map state in Distributed mode, scope your policies down to the specific Amazon S3 bucket and folder that contains your dataset. Important If you specify an Amazon S3 bucket and object, or prefix, with a reference path to an existing key-value pair in your Distributed Map state input, make sure that you update the IAM policies for your workflow. Scope the policies down to the bucket and object names the path resolves to at runtime. The following IAM policy examples grant the least privileges required to access your Amazon S3 datasets using the ListObjectsV2 and GetObject API actions. Example IAM policy for Amazon S3 object as dataset The following example shows an IAM policy that grants the least privileges to access the objects organized within processImages in an Amazon S3 bucket named amzn-s3-demo-bucket. { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": [ "s3:ListBucket" ], "Resource": [ "arn:aws:s3:::amzn-s3-demo-bucket" ], "Condition": { "StringLike": { ItemReader 519 AWS Step Functions Developer Guide "s3:prefix": [ "processImages" ] } } } ] } Example IAM policy for a CSV file as dataset The following example shows an IAM policy that grants least privileges to access a CSV file named ratings.csv. { "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": [ "s3:GetObject" ], "Resource": [ "arn:aws:s3:::amzn-s3-demo-bucket/csvDataset/ratings.csv" ] } ] } Example IAM policy for an Amazon S3 inventory as dataset The following example shows an IAM policy that grants least privileges to access an Amazon S3 inventory report. 
{ "Version": "2012-10-17", "Statement": [ { "Effect": "Allow", "Action": [ "s3:GetObject" ], "Resource": [ "arn:aws:s3:::destination-prefix/amzn-s3-demo-bucket/config-id/YYYY-MM-DDTHH-MMZ/manifest.json", "arn:aws:s3:::destination-prefix/amzn-s3-demo-bucket/config-id/data/*" ] } ] } ItemsPath (Map, JSONPath only) Managing state and transforming data This page refers to JSONPath. Step Functions recently added variables and JSONata to manage state and transform data. Learn about Passing data with variables and Transforming data with JSONata. In JSONPath-based states, use the ItemsPath field to select an array within a JSON input provided to a Map state. The Map state repeats a set of steps for each item in the array. By default, the Map state sets ItemsPath to $, which selects the entire input. If the input to the Map state is a JSON
array, it runs an iteration for each item in the array, passing that item to the iteration as input. Note You can use ItemsPath in the Distributed Map state only if you use a JSON input passed from a previous state in the workflow. You can use the ItemsPath field to specify a location in the input that points to the JSON array used for iterations. The value of ItemsPath must be a Reference Path, and that path must point to a JSON array. For instance, consider input to a Map state that includes two arrays, like the following example. { "ThingsPiratesSay": [ { "say": "Avast!" }, { "say": "Yar!" }, { "say": "Walk the Plank!" } ], "ThingsGiantsSay": [ { "say": "Fee!" }, { "say": "Fi!" }, { "say": "Fo!" }, { "say": "Fum!" } ] } In this case, you could specify which array to use for Map state iterations by selecting it with ItemsPath. The following state machine definition specifies the ThingsPiratesSay array in the input using ItemsPath. It then runs an iteration of the SayWord pass state for each item in the ThingsPiratesSay array. { "StartAt": "PiratesSay", "States": { "PiratesSay": { "Type": "Map", "ItemsPath": "$.ThingsPiratesSay", "ItemProcessor": { "StartAt": "SayWord", "States": { "SayWord": { "Type": "Pass", "End": true } } }, "End": true } } } When processing input, the Map state applies ItemsPath after InputPath.
It operates on the effective input to the state after InputPath filters the input. For more information on Map states, see the following: • Map state • Map state processing modes • Repeat actions with Inline Map • Inline Map state input and output processing ItemSelector (Map) Managing state and transforming data Learn about Passing data between states with variables and Transforming data with JSONata. By default, the effective input for the Map state is the set of individual data items present in the raw state input. The ItemSelector field lets you override the data items’ values before they’re passed on to the Map state. To override the values, specify a valid JSON input that contains a collection of key-value pairs. These pairs can be static values provided in your state machine definition, values selected from the state input using a path, or values accessed from the Context object. If you specify key-value pairs using a path or Context object, the key name must end in .$. Note The ItemSelector field replaces the Parameters field within the Map state. If you use the Parameters field in your Map state definitions to create custom input, we highly recommend that you replace them with ItemSelector. You can specify the ItemSelector field in both an Inline Map state and a Distributed Map state. For example, consider the following JSON input that contains an array of three items within the imageData node. For each Map state iteration, an array item is passed to the iteration as input. [ { "resize": "true", "format": "jpg" }, { "resize": "false", "format": "png" }, { "resize": "true", "format": "jpg" } ] Using the ItemSelector field, you can define a custom JSON input to override the original input as shown in the following example. Step Functions then passes this custom input to each Map state iteration. The custom input contains a static value for size and a value obtained from the Context object for the Map state.
The $$.Map.Item.Value Context object contains the value of each individual data item. { "ItemSelector": { "size": 10, "value.$": "$$.Map.Item.Value" } } The following example shows the input received by one iteration of the Inline Map state: { "size": 10, "value": { "resize": "true", "format": "jpg" } ItemSelector 524 AWS Step Functions } Tip Developer Guide For a complete example of a Distributed Map state that uses the ItemSelector field, see Copy large-scale CSV using Distributed Map. ItemBatcher (Map) Managing state and transforming data Learn about Passing data between states with variables and Transforming data with JSONata. The ItemBatcher field is a JSON object, which specifies to process a group of items in a single child workflow execution. Use batching when processing large CSV files or JSON
arrays, or large sets of Amazon S3 objects. The following example shows the syntax of the ItemBatcher field. In the following syntax, the maximum number of items that each child workflow execution should process is set to 100. { "ItemBatcher": { "MaxItemsPerBatch": 100 } } By default, each item in a dataset is passed as input to individual child workflow executions. For example, assume you specify a JSON file as input that contains the following array: [ { "verdict": "true", "statement_date": "6/11/2008", "statement_source": "speech" }, { "verdict": "false", "statement_date": "6/7/2022", "statement_source": "television" }, { "verdict": "true", "statement_date": "5/18/2016", "statement_source": "news" }, ... ] For the given input, each child workflow execution receives an array item as its input. The following example shows the input of a child workflow execution: { "verdict": "true", "statement_date": "6/11/2008", "statement_source": "speech" } To help optimize the performance and cost of your processing job, select a batch size that balances the number of items against each item's processing time. If you use batching, Step Functions adds the items to an Items array. It then passes the array as input to each child workflow execution.
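The grouping behavior can be sketched in Python. This is an illustrative model of count-based batching only, not the Step Functions implementation; the batch_items helper is an assumption made for the example.

```python
# Sketch of count-based batching (illustrative, not the actual interpreter):
# items are grouped into "Items" arrays of at most max_items_per_batch
# entries, and each batch becomes one child workflow execution's input.
def batch_items(items, max_items_per_batch):
    if max_items_per_batch < 1:
        raise ValueError("MaxItemsPerBatch must be a positive integer")
    return [
        {"Items": items[i:i + max_items_per_batch]}
        for i in range(0, len(items), max_items_per_batch)
    ]

# 1130 items with a maximum of 100 per batch yields 12 batches:
# 11 batches of 100 items and a final batch of the remaining 30 items.
batches = batch_items(list(range(1130)), 100)
```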
The following example shows a batch of two items passed as input to a child workflow execution: { "Items": [ { "verdict": "true", "statement_date": "6/11/2008", "statement_source": "speech" }, { "verdict": "false", "statement_date": "6/7/2022", "statement_source": "television" } ] } ItemBatcher 526 AWS Step Functions Tip Developer Guide To learn more about using the ItemBatcher field in your workflows, try the following tutorials and workshop: • Process an entire batch of data within a Lambda function • Iterate over items in a batch inside child workflow executions • Distributed map and related resources in The AWS Step Functions Workshop Contents • Fields to specify item batching Fields to specify item batching To batch items, specify the maximum number of items to batch, the maximum batch size, or both. You must specify one of these values to batch items. Max items per batch Specifies the maximum number of items that each child workflow execution processes. The interpreter limits the number of items batched in the Items array to this value. If you specify both a batch number and size, the interpreter reduces the number of items in a batch to avoid exceeding the specified batch size limit. If you don't specify this value but provide a value for maximum batch size, Step Functions processes as many items as possible in each child workflow execution without exceeding the maximum batch size in bytes. For example, imagine you run an execution with an input JSON file that contains 1130 nodes. If you specify a maximum items value for each batch of 100, Step Functions creates 12 batches. Of these, 11 batches contain 100 items each, while the twelfth batch contains the remaining 30 items. Alternatively, you can specify the maximum items for each batch as a reference path to an existing key-value pair in your Distributed Map state input. This path must resolve to a positive integer. 
For example, given the following input: { "maxBatchItems": 500 } You can specify the maximum number of items to batch using a reference path (JSONPath only) as follows: { ... "Map": { "Type": "Map", "MaxConcurrency": 2000, "ItemBatcher": { "MaxItemsPerBatchPath": "$.maxBatchItems" } ... ... } } For JSONata-based states, you can also provide a JSONata expression that evaluates to a positive integer. Important You can specify either the MaxItemsPerBatch or the MaxItemsPerBatchPath (JSONPath only) sub-field, but not both. Max KiB per batch Specifies the maximum size of a batch in bytes, up to 256 KiB. If you specify both a maximum batch number and size, Step Functions reduces the number of items in a batch to avoid exceeding the specified batch size limit. Alternatively, you can specify the maximum batch size as a reference path to an existing key-value pair in your Distributed Map state input. This path must resolve to a positive integer. Note If you use batching and don't specify a maximum batch size, the interpreter processes as many items as it can, up to 256 KiB
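The interaction between the item-count cap and the byte-size cap can be sketched in Python. This is an illustrative model only, not the Step Functions interpreter; the batch_by_size helper and its inputs are assumptions made for the example.

```python
# Sketch of size-aware batching (illustrative): items are added to the
# current batch until adding one more would exceed either the item-count
# cap or the serialized byte-size cap (256 KiB mirrors the documented limit).
import json

def batch_by_size(items, max_items=None, max_bytes=256 * 1024):
    batches, current = [], []
    for item in items:
        candidate = current + [item]
        too_many = max_items is not None and len(candidate) > max_items
        too_big = len(json.dumps({"Items": candidate}).encode("utf-8")) > max_bytes
        if current and (too_many or too_big):
            batches.append({"Items": current})
            current = [item]
        else:
            current = candidate
    if current:
        batches.append({"Items": current})
    return batches

batches = batch_by_size([{"id": i} for i in range(10)], max_items=4)
# Ten small items with a cap of 4 per batch yield batches of 4, 4, and 2 items.
```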