create_workflow
- Transfer.Client.create_workflow(**kwargs)
- Allows you to create a workflow with specified steps and step details that the workflow invokes after file transfer completes. After creating a workflow, you can associate it with any transfer server by specifying the workflow-details field in the CreateServer and UpdateServer operations.
- See also: AWS API Documentation
- Request Syntax

      response = client.create_workflow(
          Description='string',
          Steps=[
              {
                  'Type': 'COPY'|'CUSTOM'|'TAG'|'DELETE'|'DECRYPT',
                  'CopyStepDetails': {
                      'Name': 'string',
                      'DestinationFileLocation': {
                          'S3FileLocation': {
                              'Bucket': 'string',
                              'Key': 'string'
                          },
                          'EfsFileLocation': {
                              'FileSystemId': 'string',
                              'Path': 'string'
                          }
                      },
                      'OverwriteExisting': 'TRUE'|'FALSE',
                      'SourceFileLocation': 'string'
                  },
                  'CustomStepDetails': {
                      'Name': 'string',
                      'Target': 'string',
                      'TimeoutSeconds': 123,
                      'SourceFileLocation': 'string'
                  },
                  'DeleteStepDetails': {
                      'Name': 'string',
                      'SourceFileLocation': 'string'
                  },
                  'TagStepDetails': {
                      'Name': 'string',
                      'Tags': [
                          {
                              'Key': 'string',
                              'Value': 'string'
                          },
                      ],
                      'SourceFileLocation': 'string'
                  },
                  'DecryptStepDetails': {
                      'Name': 'string',
                      'Type': 'PGP',
                      'SourceFileLocation': 'string',
                      'OverwriteExisting': 'TRUE'|'FALSE',
                      'DestinationFileLocation': {
                          'S3FileLocation': {
                              'Bucket': 'string',
                              'Key': 'string'
                          },
                          'EfsFileLocation': {
                              'FileSystemId': 'string',
                              'Path': 'string'
                          }
                      }
                  }
              },
          ],
          OnExceptionSteps=[
              {
                  'Type': 'COPY'|'CUSTOM'|'TAG'|'DELETE'|'DECRYPT',
                  'CopyStepDetails': {
                      'Name': 'string',
                      'DestinationFileLocation': {
                          'S3FileLocation': {
                              'Bucket': 'string',
                              'Key': 'string'
                          },
                          'EfsFileLocation': {
                              'FileSystemId': 'string',
                              'Path': 'string'
                          }
                      },
                      'OverwriteExisting': 'TRUE'|'FALSE',
                      'SourceFileLocation': 'string'
                  },
                  'CustomStepDetails': {
                      'Name': 'string',
                      'Target': 'string',
                      'TimeoutSeconds': 123,
                      'SourceFileLocation': 'string'
                  },
                  'DeleteStepDetails': {
                      'Name': 'string',
                      'SourceFileLocation': 'string'
                  },
                  'TagStepDetails': {
                      'Name': 'string',
                      'Tags': [
                          {
                              'Key': 'string',
                              'Value': 'string'
                          },
                      ],
                      'SourceFileLocation': 'string'
                  },
                  'DecryptStepDetails': {
                      'Name': 'string',
                      'Type': 'PGP',
                      'SourceFileLocation': 'string',
                      'OverwriteExisting': 'TRUE'|'FALSE',
                      'DestinationFileLocation': {
                          'S3FileLocation': {
                              'Bucket': 'string',
                              'Key': 'string'
                          },
                          'EfsFileLocation': {
                              'FileSystemId': 'string',
                              'Path': 'string'
                          }
                      }
                  }
              },
          ],
          Tags=[
              {
                  'Key': 'string',
                  'Value': 'string'
              },
          ]
      )

- Parameters:
- Description (string) – A textual description for the workflow. 
- Steps (list) – [REQUIRED]
- Specifies the details for the steps that are in the specified workflow. The TYPE specifies which of the following actions is being taken for this step. An illustrative Steps list is sketched at the end of this parameter's description, after the step detail fields below.
- COPY - Copy the file to another location.
- CUSTOM - Perform a custom step with a Lambda function target.
- DECRYPT - Decrypt a file that was encrypted before it was uploaded.
- DELETE - Delete the file.
- TAG - Add a tag to the file.
- Note: Currently, copying and tagging are supported only on S3.
- For file location, you specify either the Amazon S3 bucket and key, or the Amazon EFS file system ID and path.
- (dict) – The basic building block of a workflow.
- Type (string) – Currently, the following step types are supported.
- COPY - Copy the file to another location.
- CUSTOM - Perform a custom step with a Lambda function target.
- DECRYPT - Decrypt a file that was encrypted before it was uploaded.
- DELETE - Delete the file.
- TAG - Add a tag to the file.
 
- CopyStepDetails (dict) – Details for a step that performs a file copy. Consists of the following values:
- A description
- An Amazon S3 location for the destination of the file copy.
- A flag that indicates whether to overwrite an existing file of the same name. The default is FALSE.
 - Name (string) – - The name of the step, used as an identifier. 
- DestinationFileLocation (dict) – Specifies the location for the file being copied. Use ${Transfer:UserName} or ${Transfer:UploadDate} in this field to parametrize the destination prefix by username or upload date.
- Set the value of DestinationFileLocation to ${Transfer:UserName} to copy uploaded files to an Amazon S3 bucket that is prefixed with the name of the Transfer Family user that uploaded the file.
- Set the value of DestinationFileLocation to ${Transfer:UploadDate} to copy uploaded files to an Amazon S3 bucket that is prefixed with the date of the upload.
- Note: The system resolves UploadDate to a date format of YYYY-MM-DD, based on the date the file is uploaded in UTC.
- S3FileLocation (dict) – Specifies the details for the Amazon S3 file that's being copied or decrypted.
- Bucket (string) – Specifies the S3 bucket for the customer input file.
- Key (string) – - The name assigned to the file when it was created in Amazon S3. You use the object key to retrieve the object. 
 
- EfsFileLocation (dict) – - Specifies the details for the Amazon Elastic File System (Amazon EFS) file that’s being decrypted. - FileSystemId (string) – - The identifier of the file system, assigned by Amazon EFS. 
- Path (string) – - The pathname for the folder being used by a workflow. 
 
 
- OverwriteExisting (string) – A flag that indicates whether to overwrite an existing file of the same name. The default is FALSE. If the workflow is processing a file that has the same name as an existing file, the behavior is as follows:
- If OverwriteExisting is TRUE, the existing file is replaced with the file being processed.
- If OverwriteExisting is FALSE, nothing happens, and the workflow processing stops.

- SourceFileLocation (string) – Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
- To use the previous file as the input, enter ${previous.file}. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
- To use the originally uploaded file location as input for this step, enter ${original.file}.
 
 
- CustomStepDetails (dict) – Details for a step that invokes a Lambda function. Consists of the Lambda function's name, target, and timeout (in seconds).
- Name (string) – The name of the step, used as an identifier.
- Target (string) – - The ARN for the Lambda function that is being called. 
- TimeoutSeconds (integer) – - Timeout, in seconds, for the step. 
- SourceFileLocation (string) – Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
- To use the previous file as the input, enter ${previous.file}. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
- To use the originally uploaded file location as input for this step, enter ${original.file}.
 
 
- DeleteStepDetails (dict) – - Details for a step that deletes the file. - Name (string) – - The name of the step, used as an identifier. 
- SourceFileLocation (string) – Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
- To use the previous file as the input, enter ${previous.file}. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
- To use the originally uploaded file location as input for this step, enter ${original.file}.
 
 
- TagStepDetails (dict) – - Details for a step that creates one or more tags. - You specify one or more tags. Each tag contains a key-value pair. - Name (string) – - The name of the step, used as an identifier. 
- Tags (list) – Array that contains from 1 to 10 key/value pairs.
- (dict) – Specifies the key-value pairs that are assigned to a file during the execution of a Tagging step.
- Key (string) – [REQUIRED] - The name assigned to the tag that you create.
- Value (string) – [REQUIRED] - The value that corresponds to the key. 
 
 
- SourceFileLocation (string) – Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
- To use the previous file as the input, enter ${previous.file}. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
- To use the originally uploaded file location as input for this step, enter ${original.file}.
 
 
- DecryptStepDetails (dict) – Details for a step that decrypts an encrypted file. Consists of the following values:
- A descriptive name
- An Amazon S3 or Amazon Elastic File System (Amazon EFS) location for the source file to decrypt.
- An S3 or Amazon EFS location for the destination of the file decryption.
- A flag that indicates whether to overwrite an existing file of the same name. The default is FALSE.
- The type of encryption that’s used. Currently, only PGP encryption is supported. 
 - Name (string) – - The name of the step, used as an identifier. 
- Type (string) – [REQUIRED] - The type of encryption used. Currently, this value must be PGP.
- SourceFileLocation (string) – Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
- To use the previous file as the input, enter ${previous.file}. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
- To use the originally uploaded file location as input for this step, enter ${original.file}.
 
- OverwriteExisting (string) – A flag that indicates whether to overwrite an existing file of the same name. The default is FALSE. If the workflow is processing a file that has the same name as an existing file, the behavior is as follows:
- If OverwriteExisting is TRUE, the existing file is replaced with the file being processed.
- If OverwriteExisting is FALSE, nothing happens, and the workflow processing stops.

- DestinationFileLocation (dict) – [REQUIRED] - Specifies the location for the file being decrypted. Use ${Transfer:UserName} or ${Transfer:UploadDate} in this field to parametrize the destination prefix by username or upload date.
- Set the value of DestinationFileLocation to ${Transfer:UserName} to decrypt uploaded files to an Amazon S3 bucket that is prefixed with the name of the Transfer Family user that uploaded the file.
- Set the value of DestinationFileLocation to ${Transfer:UploadDate} to decrypt uploaded files to an Amazon S3 bucket that is prefixed with the date of the upload.
- Note: The system resolves UploadDate to a date format of YYYY-MM-DD, based on the date the file is uploaded in UTC.
- S3FileLocation (dict) – Specifies the details for the Amazon S3 file that's being copied or decrypted.
- Bucket (string) – Specifies the S3 bucket for the customer input file.
- Key (string) – - The name assigned to the file when it was created in Amazon S3. You use the object key to retrieve the object. 
 
- EfsFileLocation (dict) – - Specifies the details for the Amazon Elastic File System (Amazon EFS) file that’s being decrypted. - FileSystemId (string) – - The identifier of the file system, assigned by Amazon EFS. 
- Path (string) – - The pathname for the folder being used by a workflow. 
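To illustrate the Steps parameter as a whole, here is a minimal sketch of a Steps list that copies each upload into a per-user prefix, tags the copy, and deletes the original. The bucket name, step names, and tag values are hypothetical placeholders, not values defined by the API.

    # Hypothetical Steps list: copy to a per-user prefix, tag the copy, delete the original.
    # 'DOC-EXAMPLE-BUCKET', the step names, and the tag values are placeholders.
    example_steps = [
        {
            'Type': 'COPY',
            'CopyStepDetails': {
                'Name': 'CopyToArchive',
                'DestinationFileLocation': {
                    'S3FileLocation': {
                        'Bucket': 'DOC-EXAMPLE-BUCKET',
                        # ${Transfer:UserName} resolves to the user who uploaded the file.
                        'Key': 'archive/${Transfer:UserName}/',
                    },
                },
                'OverwriteExisting': 'FALSE',
                # ${original.file} refers to the originally uploaded file.
                'SourceFileLocation': '${original.file}',
            },
        },
        {
            'Type': 'TAG',
            'TagStepDetails': {
                'Name': 'TagAsArchived',
                'Tags': [{'Key': 'status', 'Value': 'archived'}],
                # ${previous.file} (the default) is the output of the preceding step.
                'SourceFileLocation': '${previous.file}',
            },
        },
        {
            'Type': 'DELETE',
            'DeleteStepDetails': {
                'Name': 'DeleteOriginal',
                'SourceFileLocation': '${original.file}',
            },
        },
    ]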
 
 
 
 
 
- OnExceptionSteps (list) – Specifies the steps (actions) to take if errors are encountered during execution of the workflow. An illustrative OnExceptionSteps list is sketched at the end of this parameter's description, after the step detail fields below.
- Note: For custom steps, the Lambda function needs to send FAILURE to the callback API to kick off the exception steps. Additionally, if the Lambda function does not send SUCCESS before it times out, the exception steps are executed.
- (dict) – The basic building block of a workflow.
- Type (string) – Currently, the following step types are supported.
- COPY - Copy the file to another location.
- CUSTOM - Perform a custom step with a Lambda function target.
- DECRYPT - Decrypt a file that was encrypted before it was uploaded.
- DELETE - Delete the file.
- TAG - Add a tag to the file.
 
- CopyStepDetails (dict) – Details for a step that performs a file copy. Consists of the following values:
- A description
- An Amazon S3 location for the destination of the file copy.
- A flag that indicates whether to overwrite an existing file of the same name. The default is FALSE.
 - Name (string) – - The name of the step, used as an identifier. 
- DestinationFileLocation (dict) – Specifies the location for the file being copied. Use ${Transfer:UserName} or ${Transfer:UploadDate} in this field to parametrize the destination prefix by username or upload date.
- Set the value of DestinationFileLocation to ${Transfer:UserName} to copy uploaded files to an Amazon S3 bucket that is prefixed with the name of the Transfer Family user that uploaded the file.
- Set the value of DestinationFileLocation to ${Transfer:UploadDate} to copy uploaded files to an Amazon S3 bucket that is prefixed with the date of the upload.
- Note: The system resolves UploadDate to a date format of YYYY-MM-DD, based on the date the file is uploaded in UTC.
- S3FileLocation (dict) – Specifies the details for the Amazon S3 file that's being copied or decrypted.
- Bucket (string) – Specifies the S3 bucket for the customer input file.
- Key (string) – - The name assigned to the file when it was created in Amazon S3. You use the object key to retrieve the object. 
 
- EfsFileLocation (dict) – - Specifies the details for the Amazon Elastic File System (Amazon EFS) file that’s being decrypted. - FileSystemId (string) – - The identifier of the file system, assigned by Amazon EFS. 
- Path (string) – - The pathname for the folder being used by a workflow. 
 
 
- OverwriteExisting (string) – A flag that indicates whether to overwrite an existing file of the same name. The default is FALSE. If the workflow is processing a file that has the same name as an existing file, the behavior is as follows:
- If OverwriteExisting is TRUE, the existing file is replaced with the file being processed.
- If OverwriteExisting is FALSE, nothing happens, and the workflow processing stops.

- SourceFileLocation (string) – Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
- To use the previous file as the input, enter ${previous.file}. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
- To use the originally uploaded file location as input for this step, enter ${original.file}.
 
 
- CustomStepDetails (dict) – Details for a step that invokes a Lambda function. Consists of the Lambda function's name, target, and timeout (in seconds).
- Name (string) – The name of the step, used as an identifier.
- Target (string) – - The ARN for the Lambda function that is being called. 
- TimeoutSeconds (integer) – - Timeout, in seconds, for the step. 
- SourceFileLocation (string) – Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
- To use the previous file as the input, enter ${previous.file}. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
- To use the originally uploaded file location as input for this step, enter ${original.file}.
 
 
- DeleteStepDetails (dict) – - Details for a step that deletes the file. - Name (string) – - The name of the step, used as an identifier. 
- SourceFileLocation (string) – Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
- To use the previous file as the input, enter ${previous.file}. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
- To use the originally uploaded file location as input for this step, enter ${original.file}.
 
 
- TagStepDetails (dict) – - Details for a step that creates one or more tags. - You specify one or more tags. Each tag contains a key-value pair. - Name (string) – - The name of the step, used as an identifier. 
- Tags (list) – Array that contains from 1 to 10 key/value pairs.
- (dict) – Specifies the key-value pairs that are assigned to a file during the execution of a Tagging step.
- Key (string) – [REQUIRED] - The name assigned to the tag that you create.
- Value (string) – [REQUIRED] - The value that corresponds to the key. 
 
 
- SourceFileLocation (string) – Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
- To use the previous file as the input, enter ${previous.file}. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
- To use the originally uploaded file location as input for this step, enter ${original.file}.
 
 
- DecryptStepDetails (dict) – Details for a step that decrypts an encrypted file. Consists of the following values:
- A descriptive name
- An Amazon S3 or Amazon Elastic File System (Amazon EFS) location for the source file to decrypt.
- An S3 or Amazon EFS location for the destination of the file decryption.
- A flag that indicates whether to overwrite an existing file of the same name. The default is FALSE.
- The type of encryption that’s used. Currently, only PGP encryption is supported. 
 - Name (string) – - The name of the step, used as an identifier. 
- Type (string) – [REQUIRED] - The type of encryption used. Currently, this value must be PGP.
- SourceFileLocation (string) – Specifies which file to use as input to the workflow step: either the output from the previous step, or the originally uploaded file for the workflow.
- To use the previous file as the input, enter ${previous.file}. In this case, this workflow step uses the output file from the previous workflow step as input. This is the default value.
- To use the originally uploaded file location as input for this step, enter ${original.file}.
 
- OverwriteExisting (string) – A flag that indicates whether to overwrite an existing file of the same name. The default is FALSE. If the workflow is processing a file that has the same name as an existing file, the behavior is as follows:
- If OverwriteExisting is TRUE, the existing file is replaced with the file being processed.
- If OverwriteExisting is FALSE, nothing happens, and the workflow processing stops.

- DestinationFileLocation (dict) – [REQUIRED] - Specifies the location for the file being decrypted. Use ${Transfer:UserName} or ${Transfer:UploadDate} in this field to parametrize the destination prefix by username or upload date.
- Set the value of DestinationFileLocation to ${Transfer:UserName} to decrypt uploaded files to an Amazon S3 bucket that is prefixed with the name of the Transfer Family user that uploaded the file.
- Set the value of DestinationFileLocation to ${Transfer:UploadDate} to decrypt uploaded files to an Amazon S3 bucket that is prefixed with the date of the upload.
- Note: The system resolves UploadDate to a date format of YYYY-MM-DD, based on the date the file is uploaded in UTC.
- S3FileLocation (dict) – Specifies the details for the Amazon S3 file that's being copied or decrypted.
- Bucket (string) – Specifies the S3 bucket for the customer input file.
- Key (string) – - The name assigned to the file when it was created in Amazon S3. You use the object key to retrieve the object. 
 
- EfsFileLocation (dict) – - Specifies the details for the Amazon Elastic File System (Amazon EFS) file that’s being decrypted. - FileSystemId (string) – - The identifier of the file system, assigned by Amazon EFS. 
- Path (string) – - The pathname for the folder being used by a workflow. 
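As a sketch of the exception path, a hypothetical OnExceptionSteps list might simply tag the in-flight file so it can be reviewed later; the step name, tag key, and tag value below are placeholders. For CUSTOM steps, the Lambda function reports its outcome through the SendWorkflowStepState callback API (send_workflow_step_state in boto3); sending FAILURE, or timing out without any callback, is what triggers these exception steps. The event field names in the handler are an approximation of the shape Transfer Family passes to custom-step functions.

    import boto3

    # Hypothetical exception handling: tag the in-flight file as failed for later review.
    example_on_exception_steps = [
        {
            'Type': 'TAG',
            'TagStepDetails': {
                'Name': 'TagAsFailed',
                'Tags': [{'Key': 'workflow-status', 'Value': 'failed'}],
                'SourceFileLocation': '${previous.file}',
            },
        },
    ]

    transfer = boto3.client('transfer')

    # Sketch of a custom-step Lambda handler reporting its outcome back to Transfer Family.
    def lambda_handler(event, context):
        execution = event['serviceMetadata']['executionDetails']
        transfer.send_workflow_step_state(
            WorkflowId=execution['workflowId'],
            ExecutionId=execution['executionId'],
            Token=event['token'],
            Status='SUCCESS',   # send 'FAILURE' instead to run OnExceptionSteps
        )
        return {'statusCode': 200}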
 
 
 
 
 
- Tags (list) – Key-value pairs that can be used to group and search for workflows. Tags are metadata attached to workflows for any purpose.
- (dict) – Creates a key-value pair for a specific resource. Tags are metadata that you can use to search for and group a resource for various purposes. You can apply tags to servers, users, and roles. A tag key can take more than one value. For example, to group servers for accounting purposes, you might create a tag called Group and assign the values Research and Accounting to that group.
- Key (string) – [REQUIRED] - The name assigned to the tag that you create.
- Value (string) – [REQUIRED] - Contains one or more values that you assigned to the key name you create. 
 
 
 
- Return type:
- dict 
- Returns:
- Response Syntax

      {
          'WorkflowId': 'string'
      }

- Response Structure
- (dict) –
- WorkflowId (string) – A unique identifier for the workflow.
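The returned WorkflowId is typically attached to a server through the WorkflowDetails parameter of UpdateServer. Below is a minimal sketch, assuming an existing server ID, an IAM execution role, and the example_steps and example_on_exception_steps lists sketched earlier; all of these names and ARNs are placeholders.

    import boto3

    transfer = boto3.client('transfer')

    # Placeholders: a real server ID and an IAM role that Transfer Family can assume
    # to run the workflow steps.
    server_id = 's-1234567890abcdef0'
    execution_role = 'arn:aws:iam::111122223333:role/transfer-workflow-role'

    created = transfer.create_workflow(
        Description='Archive and tag uploads',
        Steps=example_steps,
        OnExceptionSteps=example_on_exception_steps,
        Tags=[{'Key': 'project', 'Value': 'ingest'}],
    )

    # Associate the new workflow with an existing server so it runs after each upload.
    transfer.update_server(
        ServerId=server_id,
        WorkflowDetails={
            'OnUpload': [
                {
                    'WorkflowId': created['WorkflowId'],
                    'ExecutionRole': execution_role,
                },
            ],
        },
    )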
 
 
- Exceptions
- Transfer.Client.exceptions.InvalidRequestException
- Transfer.Client.exceptions.ThrottlingException
- Transfer.Client.exceptions.InternalServiceError
- Transfer.Client.exceptions.ServiceUnavailableException
- Transfer.Client.exceptions.ResourceExistsException
- Transfer.Client.exceptions.AccessDeniedException
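A brief, non-prescriptive sketch of handling the most common failures from this call; the client variable is assumed to be boto3.client('transfer'), and example_steps is the placeholder list from the earlier sketch.

    import boto3
    import botocore.exceptions

    client = boto3.client('transfer')

    try:
        response = client.create_workflow(
            Description='Archive and tag uploads',
            Steps=example_steps,
        )
        workflow_id = response['WorkflowId']
    except client.exceptions.InvalidRequestException:
        # The step definitions failed validation; inspect the message and correct the request.
        raise
    except client.exceptions.ThrottlingException:
        # Request rate exceeded; retry with backoff (boto3's standard retry mode also handles this).
        raise
    except botocore.exceptions.ClientError:
        # InternalServiceError, ServiceUnavailableException, ResourceExistsException,
        # AccessDeniedException, and other service errors surface here.
        raise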