AWS S3 Connector 1.0.0.0
The AWS S3 Connector lets you connect to S3 buckets and:

- List bucket contents.
- Upload, download, delete, move, and rename AWS S3 objects.
- Copy objects between AWS S3 buckets.
Prerequisites
- RunMyJobs 9.2.9 or higher.
- Connection Management Extension 1.0.0.4 or later. This will be installed or updated automatically if necessary when you install the AWS S3 Connector.
- AWS S3 Utilities 1.0.0.0 or later. This will be installed or updated automatically if necessary when you install the AWS S3 Connector.
- An AWS Connection.
- Privileges Required to Use the AWS S3 Connector.
Contents of the Component
Object Type | Name | Description |
---|---|---|
Application | GLOBAL.Redwood.REDWOOD.AWS.REDWOOD.S3 | Enables connection with AWS S3 |
Constraint Definition | REDWOOD.Redwood_AWS_S3_EncryptionTypeConstraint | Constraint for AWS S3 Encryption Type field |
Constraint Definition | REDWOOD.Redwood_AWS_S3_StorageClassConstraint | Constraint for AWS S3 Storage Class field |
Job Definition | REDWOOD.Redwood_AWS_S3_CopyObject | Copy, move, or rename objects in an S3 bucket |
Job Definition | REDWOOD.Redwood_AWS_S3_DeleteObject | Delete selected object(s) in AWS S3 |
Job Definition | REDWOOD.Redwood_AWS_S3_DownloadFile | Download file(s) from AWS S3 |
Job Definition | REDWOOD.Redwood_AWS_S3_ListBucket | List the contents of an AWS S3 Bucket |
Job Definition | REDWOOD.Redwood_AWS_S3_UploadFile | Upload selected file(s) to AWS S3 |
Job Definition | REDWOOD.Redwood_AWS_S3_UploadLocalFiles | Upload local files from a Platform Agent to an S3 bucket |
Job Definition | REDWOOD.Redwood_AWS_S3_UploadLocalFilesTemplate | Template for uploading local files from a Platform Agent to an S3 bucket |
Job Definition Type | REDWOOD.Redwood_AWS_S3 | Amazon Web Services Connector |
Library | REDWOOD.Redwood_AWS_S3 Library | Library for Amazon Web Services S3 connector |
Redwood_AWS_S3_CopyObject
Lets you copy, move, and rename an object or objects in an AWS S3 bucket.
Parameters
Redwood_AWS_S3_DeleteObject
Lets you delete objects in AWS S3.
Parameters
Tab | Name | Description | Documentation | Data Type | Direction | Default Expression | Values
---|---|---|---|---|---|---|---
Parameters | connection | Connection | The AWS Connection to use for the operation | String | In | |
Parameters | regionName | Region | The region that the S3 Bucket is in | String | In | |
Parameters | bucketName | Bucket Name | The name of the S3 bucket that the object is stored in | String | In | |
Parameters | recursive | Recursive | When set to Yes, the delete is performed recursively. When set to No, the delete is only performed inside the folder specified by the Key Prefix (or at the root if no Key Prefix is supplied). Note: When set to No, if a key prefix is given, it must be the full key of a folder object. If it is not a valid folder key, nothing is deleted. | String | In | N | Y=Yes, N=No
Parameters | keyPrefix | Key Prefix | The key prefix to search under. Whether objects below the prefix are included is controlled by Recursive. | String | In | |
Parameters | objectNameFilter | Object Name | The name of the object to delete. Wildcards such as * are allowed. If a wildcard is used, all objects under the specified Key Prefix that match the pattern are deleted. | String | In | |
Redwood_AWS_S3_DownloadFile
Lets you download files from AWS S3.
Parameters
Tab | Name | Description | Documentation | Data Type | Direction | Default Expression | Values
---|---|---|---|---|---|---|---
Parameters | connection | Connection | The AWS Connection to use for the operation | String | In | |
Parameters | regionName | Region | The region that the S3 Bucket is in | String | In | |
Parameters | bucketName | Bucket Name | The name of the S3 bucket that the file is stored in | String | In | |
Parameters | keyPrefix | Key Prefix | The key prefix to search under | String | In | |
Parameters | objectName | Object Name | The name of the object to download. Wildcards such as * are allowed. If a wildcard is used, all objects under the specified Key Prefix that match the pattern are downloaded. | String | In | |
Parameters | file | Downloaded File | Link to the downloaded file. If there is more than one file, they are all streamed into a single zip file called output.zip. | File | Out | |
Redwood_AWS_S3_ListBucket
Lets you list the contents of an AWS S3 Bucket.
Parameters
Tab | Name | Description | Documentation | Data Type | Direction | Default Expression | Values
---|---|---|---|---|---|---|---
Parameters | connection | Connection | The AWS Connection to use for the operation | String | In | |
Parameters | regionName | Region | The region that the S3 Bucket is in | String | In | |
Parameters | bucketName | Bucket Name | The name of the S3 bucket to list the contents of | String | In | |
Parameters | recursive | Recursive | When set to Yes, the listing is performed recursively. When set to No, the listing is only performed inside the folder specified by the Key Prefix (or at the root if no Key Prefix is supplied). Note: When set to No, if a key prefix is given, it must be the full key of a folder object. If it is not a valid folder key, nothing is returned. | String | In | N | Y, N
Parameters | keyPrefix | Key Prefix | The key prefix to search under | String | In | |
Parameters | objectNameFilter | Object Name Filter | Only return objects whose names match the filter. Wildcards * and ? are supported. | String | In | |
Parameters | listing | Bucket Listing | Link to the generated RTX file containing the listing output | Table | Out | |
Redwood_AWS_S3_UploadFile
Lets you upload files to AWS S3, either directly from the file system or as part of a Chain Definition.
Parameters
Tab | Name | Description | Documentation | Data Type | Direction
---|---|---|---|---|---
Parameters | connection | Connection | The AWS Connection to use for the operation | String | In
Parameters | regionName | Region | The region that the S3 Bucket is in | String | In
Parameters | bucketName | Bucket Name | The name of the S3 bucket that the object is stored in | String | In
Parameters | file | File to Upload | The job file to upload | File | In
Parameters | keyPrefix | Key Prefix | The key prefix to upload the file to. If omitted, the file is uploaded to the root of the bucket. | String | In
Parameters | fileName | Uploaded File Name | The new name to assign to the uploaded file. If omitted, the original filename is retained. | String | In
Parameters | metadata | Metadata | User-defined metadata to set on the uploaded files. Each entry must be in key=value format, where the key is the name of the metadata entry and the value is its value. If the x-amz-meta- prefix is not supplied, it is added automatically. Example: x-amz-meta-custom-entry=value | String | In
System Metadata | encryptionType | Server Side Encryption | The encryption settings to use for the uploaded file(s). If SSE-KMS or DSSE-KMS is selected, use the KMS Key ARN parameter to supply the ARN of the key to use for encryption. | String | In
System Metadata | kmsKeyArn | KMS Key ARN | The ARN of the KMS key to use for encryption. Example: arn:aws:kms:region-name:xxxxxxx:key/xxxxxx-xxxx-xxxx-xxxx-xxxxxxxx | String | In
System Metadata | storageClass | Storage Class | The storage class to use for the uploaded file(s) | String | In
System Metadata | tags | Tags | The set of tags to add to the uploaded objects. Each entry in the array should be specified as a Key=Value pair, where the Key is the name of the tag and the Value is the tag's value. | String | In
System Metadata | contentDisposition | Content Disposition | The value to set for the Content Disposition of the uploaded file(s) | String | In
System Metadata | contentType | Content Type | The value to set for the Content Type of the uploaded file(s) | String | In
Redwood_AWS_S3_UploadLocalFiles
Lets you upload local files from a Platform Agent to an S3 bucket.
Parameters
Tab | Name | Description | Documentation | Data Type | Direction
---|---|---|---|---|---
AWS S3 Target | connection | Connection | The AWS Connection to use for the operation | String | In
AWS S3 Target | regionName | Region | The region that the S3 Bucket is in | String | In
AWS S3 Target | bucketName | Bucket Name | The name of the S3 bucket that the object is stored in | String | In
AWS S3 Target | keyPrefix | Key Prefix | The key prefix to include with the file | String | In
AWS S3 Target | metadata | Metadata | User-defined metadata to set on the uploaded files. Each entry must be in key=value format, where the key is the name of the metadata entry and the value is its value. If the x-amz-meta- prefix is not supplied, it is added automatically. Example: x-amz-meta-custom-entry=value | String | In
System Metadata | encryptionType | Server Side Encryption | The encryption settings to use for the uploaded file(s). If SSE-KMS or DSSE-KMS is selected, use the KMS Key ARN parameter to supply the ARN of the key to use for encryption. | String | In
System Metadata | kmsKeyArn | KMS Key ARN | The ARN of the KMS key to use for encryption. Example: arn:aws:kms:region-name:xxxxxxx:key/xxxxxx-xxxx-xxxx-xxxx-xxxxxxxx | String | In
System Metadata | storageClass | Storage Class | The storage class to use for the uploaded file(s) | String | In
System Metadata | tags | Tags | The set of tags to add to the uploaded objects. Each entry in the array should be specified as a Key=Value pair, where the Key is the name of the tag and the Value is the tag's value. | String | In
System Metadata | contentDisposition | Content Disposition | The value to set for the Content Disposition of the uploaded file(s) | String | In
System Metadata | contentType | Content Type | The value to set for the Content Type of the uploaded file(s) | String | In
Source Files | processServer | Process Server | The name of the Process Server linked to the Platform Agent you would like to access | String | In
Source Files | filePath | Folder Path | The path to the folder where the local files are located | String | In
Source Files | fileName | File Name | The name of the file(s) to upload to the S3 bucket. Wildcards are supported. If omitted, the folder is uploaded. | String | In
Redwood_AWS_S3_UploadLocalFilesTemplate
Template for uploading local files from a Platform Agent to an S3 bucket.
Parameters
Tab | Name | Description | Documentation | Data Type | Direction
---|---|---|---|---|---
AWS S3 Target | connection | Connection | The AWS Connection to use for the operation | String | In
AWS S3 Target | regionName | Region | The region that the S3 Bucket is in | String | In
AWS S3 Target | bucketName | Bucket Name | The name of the S3 bucket that the object is stored in | String | In
AWS S3 Target | keyPrefix | Key Prefix | The key prefix to include with the file | String | In
AWS S3 Target | metadata | Metadata | User-defined metadata to set on the uploaded files. Each entry must be in key=value format, where the key is the name of the metadata entry and the value is its value. If the x-amz-meta- prefix is not supplied, it is added automatically. Example: x-amz-meta-custom-entry=value | String | In
System Metadata | encryptionType | Server Side Encryption | The encryption settings to use for the uploaded file(s). If SSE-KMS or DSSE-KMS is selected, use the KMS Key ARN parameter to supply the ARN of the key to use for encryption. | String | In
System Metadata | kmsKeyArn | KMS Key ARN | The ARN of the KMS key to use for encryption. Example: arn:aws:kms:region-name:xxxxxxx:key/xxxxxx-xxxx-xxxx-xxxx-xxxxxxxx | String | In
System Metadata | storageClass | Storage Class | The storage class to use for the uploaded file(s) | String | In
System Metadata | tags | Tags | The set of tags to add to the uploaded objects. Each entry in the array should be specified as a Key=Value pair, where the Key is the name of the tag and the Value is the tag's value. | String | In
System Metadata | contentDisposition | Content Disposition | The value to set for the Content Disposition of the uploaded file(s) | String | In
System Metadata | contentType | Content Type | The value to set for the Content Type of the uploaded file(s) | String | In
Source Files | processServer | Process Server | The name of the Process Server linked to the Platform Agent you would like to access | String | In
Source Files | filePath | Folder Path | The path to the folder where the local files are located | String | In
Source Files | fileName | File Name | The name of the file(s) to upload to the S3 bucket. Wildcards are supported. If omitted, the folder is uploaded. | String | In
Listing the Contents of an AWS S3 Bucket
You can use the Redwood_AWS_S3_ListBucket Process Definition to query a particular AWS S3 bucket, optionally with a key prefix. This Process Definition returns the contents of the bucket in RTX format, so that you can use it in a Chain Definition.
To list the contents of an AWS S3 bucket:

1. In the AWS > S3 Application, submit the Redwood_AWS_S3_ListBucket Process Definition.
2. Choose the Connection.
3. Choose the Region the bucket is in.
4. Enter the Bucket Name.
5. To list bucket contents recursively, enter `Y` in the Recursive field. Otherwise, enter `N`.
   Note: If you set this to `N` and provide a Key Prefix, it must be the full key prefix of a folder object. If it is not a valid folder key prefix, nothing will be listed.
6. If you want to list only objects that have a particular key prefix, enter it in the Key Prefix field.
7. To filter the response, enter a filter string in the Object Name Filter field. Wildcards * and ? are supported.
8. Submit the Process Definition.
9. In the Process Monitor, select the process, then look at the Detail View. Under Files, the `listing.rtx` file contains the response (if any).
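The * and ? wildcards in the Object Name Filter behave like standard glob patterns. As an illustration only (not the connector's actual implementation), the matching can be sketched with Python's fnmatch module; the object names are hypothetical:

```python
from fnmatch import fnmatchcase

# Hypothetical object keys returned from a bucket listing.
keys = ["reports/2024-01.csv", "reports/2024-02.csv", "reports/readme.txt"]

def filter_names(keys, pattern):
    """Keep only keys whose final name component matches the glob pattern."""
    return [k for k in keys if fnmatchcase(k.rsplit("/", 1)[-1], pattern)]

print(filter_names(keys, "2024-??.csv"))  # both CSV files
print(filter_names(keys, "*.txt"))        # only the readme
```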
Copying Objects in AWS S3
You can copy objects both within buckets and to other buckets. You can even copy objects to a bucket in a different region.
To copy objects in S3:

1. In the AWS > S3 Application, submit the Redwood_AWS_S3_CopyObject Process Definition.
2. In the Parameters tab, identify the object or objects you want to copy.
   - Choose the Connection.
   - Choose the Bucket Region the bucket is in.
   - Enter the Bucket Name.
   - To copy objects recursively, enter `Y` in the Recursive field. Otherwise, enter `N`.
     Note: If you set this to `N` and provide an Origin Key Prefix, it must be the full key prefix of a folder object. If it is not a valid folder key prefix, nothing will be copied.
   - If the object or objects you want to copy have a key prefix, enter it in the Origin Key Prefix field. If the object or objects you want to copy are at the root level of the bucket, leave this field blank.
   - To indicate the name of the file or files to be copied, enter a name in the Object Name field. The wildcard character * is supported.
     Note: If you use wildcard characters, all objects in the specified Origin Key Prefix (if you supply one) that match the pattern will be copied.
3. In the Destination tab, indicate where you want the copied object or objects to be put.
   - If you want to copy the objects to a different region, choose the Destination Bucket Region to use. Otherwise, leave this field blank, and the objects will be copied to the region identified in the Parameters tab.
   - If you want to copy the objects to a different bucket, enter the Destination Bucket Name to use. Otherwise, leave this field blank, and the objects will be copied to the bucket identified in the Parameters tab.
   - If you want the copied objects to have a particular key prefix, enter it in the Destination Key Prefix field.
   - If you are copying a single object (that is, if you have not used any wildcard characters in the Object Name field in the Parameters tab), you can set the name of the copied object by entering a name in the Destination Object Name field. If you leave this field blank, the original object name is kept.
   - To delete the source object after it has been successfully copied, enter `Y` in the Delete Source Object field. Otherwise, enter `N`.
4. Submit the Process Definition.
Moving Objects in AWS S3
In AWS S3, there is currently no native "move" operation. However, you can move an object by running the Redwood_AWS_S3_CopyObject Process Definition with Delete Source Object set to `Y`.
Note: Because this is not an atomic operation, a failure between the copy and the delete can leave the object copied without the original being deleted.
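The move-via-copy pattern can be sketched with a plain dict standing in for a bucket. This illustrates the semantics and the non-atomicity, not the connector's code:

```python
# Toy in-memory "bucket": key -> object body.
bucket = {"in/a.txt": b"hello"}

def move_object(bucket, src_key, dst_key):
    """Copy then delete. NOT atomic: a failure after the copy step
    would leave the object present under both keys."""
    bucket[dst_key] = bucket[src_key]   # copy step
    del bucket[src_key]                 # delete step (may never run on failure)

move_object(bucket, "in/a.txt", "archive/a.txt")
print(sorted(bucket))  # ['archive/a.txt']
```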
Deleting Objects in AWS S3
To delete an object or objects in AWS S3:

1. In the AWS > S3 Application, submit the Redwood_AWS_S3_DeleteObject Process Definition.
2. In the Parameters tab, identify the object or objects you want to delete.
   - Choose the Connection.
   - Choose the Region the object or objects are in.
   - Enter the Bucket Name.
   - To delete objects recursively, enter `Y` in the Recursive field. Otherwise, enter `N`.
     Note: If you set this to `N` and provide a Key Prefix, it must be the full key prefix of a folder object. If it is not a valid folder key prefix, nothing will be deleted.
   - If the object or objects you want to delete have a key prefix, enter it in the Key Prefix field. If the object or objects you want to delete are at the root level of the bucket, leave this field blank.
   - To indicate the name of the file or files to be deleted, enter a name in the Object Name field. The wildcard character * is supported.
     Note: If you use wildcard characters, all objects in the specified Key Prefix (if you supply one) that match the pattern will be deleted.
3. Submit the Process Definition.
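How Key Prefix, Recursive, and a wildcard Object Name combine to select keys can be sketched as follows. This is assumed semantics inferred from the parameter descriptions above, not the connector's implementation:

```python
from fnmatch import fnmatchcase

def select_for_delete(keys, key_prefix="", pattern="*", recursive=False):
    """Pick the keys a delete would affect under the assumed semantics."""
    prefix = key_prefix.rstrip("/") + "/" if key_prefix else ""
    selected = []
    for key in keys:
        if not key.startswith(prefix):
            continue
        rest = key[len(prefix):]
        # Non-recursive: only objects directly inside the prefix folder.
        if not recursive and "/" in rest:
            continue
        if fnmatchcase(rest.rsplit("/", 1)[-1], pattern):
            selected.append(key)
    return selected

keys = ["logs/app.log", "logs/old/app.log", "data/app.log"]
print(select_for_delete(keys, "logs", "*.log"))                  # ['logs/app.log']
print(select_for_delete(keys, "logs", "*.log", recursive=True))  # both logs/ keys
```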
Downloading Objects from AWS S3
To download an object or objects from AWS S3:

1. In the AWS > S3 Application, submit the Redwood_AWS_S3_DownloadFile Process Definition.
2. In the Parameters tab, identify the object or objects you want to download.
   - Choose the Connection.
   - Choose the Region the object or objects are in.
   - Enter the Bucket Name.
   - If the object or objects you want to download have a key prefix, enter it in the Key Prefix field. If the object or objects you want to download are at the root level of the bucket, leave this field blank.
   - To indicate the name of the file or files to be downloaded, enter a name in the Object Name field. The wildcard character * is supported.
     Note: If you use wildcard characters, all objects in the specified Key Prefix (if you supply one) that match the pattern will be downloaded.
3. Submit the Process Definition.
4. In the Process Monitor, select the process, then look at the Detail View under Files.
   - If you download a single file, it is attached to the process in its original format.
   - If you download more than one file, they are attached to the process as a single zip file.

Note: If the download fails, the Redwood_AWS_S3_DownloadFile process goes into Error status.
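The multiple-file case, where everything is streamed into one archive such as output.zip, can be sketched in memory with Python's zipfile module. The file names and contents here are illustrative only:

```python
import io
import zipfile

# Hypothetical downloaded objects: name -> bytes.
downloads = {"a.csv": b"1,2\n", "b.csv": b"3,4\n"}

buf = io.BytesIO()
with zipfile.ZipFile(buf, "w", zipfile.ZIP_DEFLATED) as zf:
    for name, body in downloads.items():
        zf.writestr(name, body)  # each object becomes one zip entry

# 'buf' now holds the bytes of a single archive like output.zip.
with zipfile.ZipFile(io.BytesIO(buf.getvalue())) as zf:
    print(sorted(zf.namelist()))  # ['a.csv', 'b.csv']
```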
Uploading a File to AWS S3
To upload a file to AWS S3, either directly from the file system or with a Chain Definition:

1. In the AWS > S3 Application, submit the Redwood_AWS_S3_UploadFile Process Definition.
2. In the Parameters tab, identify the file you want to upload and indicate where it should be uploaded.
   - Choose the Connection.
   - Choose the Region the file should be uploaded to.
   - Enter the Bucket Name.
   - In the File to Upload field, click Upload and select the file to upload.
     Note: RunMyJobs will attempt to guess the `Content-Type` for the upload from the file suffix. However, you can override this manually with the Content Type field in the System Metadata tab.
   - If you want to upload the file with a particular key prefix, enter it in the Key Prefix field.
   - To set the name of the uploaded file in AWS S3, enter a name in the Uploaded File Name field. The wildcard character * is supported.
   - To specify user-defined AWS S3 metadata for the uploaded file, enter key=value pairs in the Metadata field, where the key is the name of each entry.
     Note: Key names must be prefixed with `x-amz-meta-`. If you do not supply this prefix, RunMyJobs adds it automatically.
3. In the System Metadata tab, enter any AWS S3 system metadata you want applied to the uploaded file.
   - In the Server Side Encryption field, enter the encryption settings for the uploaded file. If you enter SSE-KMS or DSSE-KMS, use the KMS Key ARN field to supply the ARN of the key to use for encryption.
   - In the KMS Key ARN field, optionally enter the ARN of the key to use for server-side encryption.
   - In the Storage Class field, enter the AWS S3 storage class to be used for the uploaded file.
   - In the Tags field, optionally enter Key=Value pairs to specify a set of tags to be applied to the uploaded file.
   - In the Content Disposition field, optionally enter presentation information for the uploaded file.
   - In the Content Type field, optionally enter a `Content-Type` header value. If you do not supply a value, RunMyJobs attempts to guess the `Content-Type` from the file's suffix.
4. Submit the Process Definition.
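Two of the behaviors above, auto-prefixing metadata keys with x-amz-meta- and guessing the Content-Type from the file suffix, can be sketched as below. This is assumed behavior based on the descriptions in this section, using Python's mimetypes module for the guess:

```python
import mimetypes

def normalize_metadata(entries):
    """Parse key=value entries and add the x-amz-meta- prefix when missing."""
    meta = {}
    for entry in entries:
        key, _, value = entry.partition("=")
        if not key.startswith("x-amz-meta-"):
            key = "x-amz-meta-" + key
        meta[key] = value
    return meta

def guess_content_type(filename):
    """Guess from the suffix; fall back to a generic type when unknown."""
    guessed, _ = mimetypes.guess_type(filename)
    return guessed or "application/octet-stream"

print(normalize_metadata(["custom-entry=value"]))  # {'x-amz-meta-custom-entry': 'value'}
print(guess_content_type("report.csv"))            # text/csv
```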
Uploading Local Files to AWS S3
This section describes how to upload files from a specific directory on a Platform Agent computer to AWS S3. There are three parts to this process:

- Use the AWS S3 Target tab to specify where the file or files should go.
- Use the System Metadata tab to apply AWS S3 metadata to the file or files you are uploading.
- Use the Source Files tab to identify the files to be uploaded. The directory containing these files must be listed in the `server_root` file on the Platform Agent for that location.
To upload one or more local files:

1. In the AWS > S3 Application, submit the Redwood_AWS_S3_UploadLocalFiles Process Definition.
2. In the AWS S3 Target tab, identify where the files should be uploaded to.
   - Choose the Connection.
   - Choose the Region the object or objects should be uploaded to.
   - Enter the Bucket Name.
   - If you want to upload files with a particular key prefix, enter it in the Key Prefix field.
   - To specify user-defined AWS S3 metadata for the uploaded file or files, enter key=value pairs in the Metadata field, where the key is the name of each entry.
     Note: Key names must be prefixed with `x-amz-meta-`. If you do not supply this prefix, RunMyJobs adds it automatically.
3. In the System Metadata tab, enter any AWS S3 system metadata you want applied to the uploaded file or files.
   - In the Server Side Encryption field, enter the encryption settings for the uploaded files. If you enter SSE-KMS or DSSE-KMS, use the KMS Key ARN field to supply the ARN of the key to use for encryption.
   - In the KMS Key ARN field, optionally enter the ARN of the key to use for server-side encryption.
   - In the Storage Class field, enter the AWS S3 storage class to be used for the uploaded files.
   - In the Tags field, optionally enter Key=Value pairs to specify a set of tags to be applied to the uploaded files.
   - In the Content Disposition field, optionally enter presentation information for the uploaded files.
   - In the Content Type field, optionally enter a `Content-Type` header value.
     Note: If you do not supply a Content Type value, RunMyJobs attempts to guess the `Content-Type` from each file's suffix.
4. In the Source Files tab, indicate the file or files you want to upload.
   - From the Process Server dropdown list, choose the Process Server hosting the files to be uploaded.
   - In the Folder Path field, enter the full OS path of the folder containing the files to be uploaded.
     Note: This folder must be included in the `server_root` file on the Platform Agent.
   - In the File Name field, do one of the following:
     - To upload a single file, enter the name of the file.
     - To upload multiple files, enter a string that uses one or more * wildcard characters.
     - To upload the entire folder, leave the File Name field blank.
5. Submit the Process Definition.
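The server_root restriction, where the source folder must be listed before the agent can read from it, amounts to an allow-list check on paths. A minimal sketch of that idea follows; the helper and the list of roots are hypothetical, and this is not the actual server_root file syntax:

```python
import os

# Hypothetical allow-list, standing in for entries read from server_root.
allowed = ["/data/exports", "/var/spool/out"]

def is_allowed(folder, allowed_roots):
    """True if folder equals or sits below one of the configured roots."""
    folder = os.path.normpath(folder)
    for root in allowed_roots:
        root = os.path.normpath(root)
        if folder == root or folder.startswith(root + os.sep):
            return True
    return False

print(is_allowed("/data/exports/daily", allowed))  # True
print(is_allowed("/etc", allowed))                 # False
```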
Uploading Local Files to AWS S3 with a Template
To create a customized Process Definition, optionally with default values, for uploading files from a Process Server to AWS S3:

1. Right-click the Redwood_AWS_S3_UploadLocalFilesTemplate Process Definition and choose New (from Template) from the context menu. The New Process Definition pop-up window displays.
2. Choose a Partition.
3. Enter a Name.
4. Delete the default Application value (if any) and substitute your own Application name if desired.
5. In the Parameters tab, enter any Default Expressions you want to use.
   - When specifying the Connection value, use the format `EXTCONNECTION:<partition>.<connection name>`.
6. Save and then submit the new Process Definition.
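The Connection default expression from step 5 can be assembled with a small helper. The helper name is hypothetical; the EXTCONNECTION: format itself comes from the step above:

```python
def connection_expression(partition, connection_name):
    """Build an EXTCONNECTION:<partition>.<connection name> default expression."""
    return f"EXTCONNECTION:{partition}.{connection_name}"

print(connection_expression("GLOBAL", "MyAWSConnection"))
# EXTCONNECTION:GLOBAL.MyAWSConnection
```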