S3

Learn how to use the Amazon S3 storage action to build your workflow

Introduction

The Amazon S3 storage action in DBSync Cloud Workflow enables users to upload files directly to an Amazon S3 bucket as part of their integration processes. This is useful for scalable storage, backup, compliance, and sharing of files and reports generated by workflows.

Use Cases

  • Store integration logs and audit data in an S3 bucket.

  • Upload CSV or JSON files for reporting or downstream processing.

  • Archive transformed data for compliance or historical access.

Use Case Scenario

Scenario: Archiving Daily Sales Reports to Amazon S3

A retail company runs a DBSync workflow every night to generate a sales report in CSV format. To comply with data retention policies and provide easy access to historical reports, the company uploads the file to an Amazon S3 bucket structured by date.

Workflow Configuration

  1. Query daily sales records from the database.

  2. Format the data using a CSV or JSON formatter.

  3. Upload the formatted file to Amazon S3 using the S3 Storage Action.
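The formatting step above can be sketched in Python using only the standard library. The sample rows and the `format_sales_csv` helper are hypothetical stand-ins for the workflow's query and formatter steps, not DBSync APIs:

```python
import csv
import io

def format_sales_csv(rows, fieldnames):
    """Step 2: format queried records as CSV text ready for upload."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=fieldnames)
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

# Stand-in for step 1's database query result.
rows = [
    {"order_id": "1001", "sku": "A-1", "amount": "19.99"},
    {"order_id": "1002", "sku": "B-2", "amount": "5.00"},
]
csv_body = format_sales_csv(rows, ["order_id", "sku", "amount"])
```

The resulting `csv_body` string is what step 3 hands to the S3 Storage Action as the file content.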

S3 Action Settings

  • Bucket Name: company-sales-data

  • Object Key: reports/sales/${date}/SalesReport_${date}.csv

  • File Content: Output from CSV formatter

  • Content Type: text/csv

  • ACL (Optional): private or public-read
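The `${date}` placeholders in the Object Key resolve to the run date, producing one dated folder per report. A minimal sketch of that expansion using Python's `string.Template` (the fixed date is illustrative):

```python
from datetime import date
from string import Template

# Same key pattern as the S3 Action Settings above.
key_template = Template("reports/sales/${date}/SalesReport_${date}.csv")

# Substitute the run date into both placeholders.
object_key = key_template.substitute(date=date(2024, 1, 15).isoformat())
```

Each nightly run therefore writes to a unique key, so older reports are never overwritten.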

Prerequisites

  • An active AWS account with S3 access.

  • IAM credentials (Access Key ID and Secret Access Key) with PutObject permissions.

  • Configured S3 connection in DBSync.
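A minimal IAM policy granting the PutObject permission mentioned above could look like the following; the bucket name is the example from this article and should be replaced with your own:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject"],
      "Resource": "arn:aws:s3:::company-sales-data/*"
    }
  ]
}
```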

Configuration Steps

  1. Create an S3 Connection

    • Navigate to the Connections section in DBSync.

    • Select Amazon S3 as the connection type.

    • Enter the Access Key ID, Secret Access Key, and region.

  2. Add the S3 Storage Action

    • Add an Action Step in your workflow.

    • Choose S3 Storage as the action type.

  3. Set Action Properties

    • Bucket Name: Target S3 bucket.

    • Object Key: File path and name in the bucket.

    • File Content: Data to be uploaded.

    • Content Type: MIME type (e.g., text/csv, application/json).

    • ACL (Optional): Access level such as private or public-read.
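These properties correspond to the parameters of Amazon S3's PutObject operation. As an illustration only (the values are examples from this article; a real upload would pass them to an S3 client, e.g. boto3's `put_object`):

```python
# Illustrative mapping of the action properties to S3 PutObject parameters.
put_object_params = {
    "Bucket": "company-sales-data",                                 # Bucket Name
    "Key": "reports/sales/2024-01-15/SalesReport_2024-01-15.csv",   # Object Key
    "Body": "order_id,sku,amount\n1001,A-1,19.99\n",                # File Content
    "ContentType": "text/csv",                                      # Content Type
    "ACL": "private",                                               # ACL (optional)
}
```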

  4. Run and Validate

    • Execute the workflow.

    • Check the S3 bucket to verify successful file upload.

Best Practices

  • Use environment variables or encrypted credentials for AWS keys.

  • Structure S3 keys (paths) using date or record identifiers to organize files.

  • Use lifecycle rules in S3 for auto-archiving or deletion.
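For example, a lifecycle configuration (in the form accepted by the AWS CLI's `put-bucket-lifecycle-configuration`) that moves reports to Glacier after 90 days and deletes them after a year; the prefix and day counts are illustrative:

```json
{
  "Rules": [
    {
      "ID": "archive-sales-reports",
      "Filter": { "Prefix": "reports/sales/" },
      "Status": "Enabled",
      "Transitions": [
        { "Days": 90, "StorageClass": "GLACIER" }
      ],
      "Expiration": { "Days": 365 }
    }
  ]
}
```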

Troubleshooting

  • Access Denied: Ensure IAM policy includes necessary S3 permissions.

  • Invalid Bucket: Verify the bucket name and region.

  • Upload Fails: Check internet connectivity and file size limits.

Limitations

  • Large file uploads may require multipart upload, which is not currently supported.

  • Region-specific endpoints must be configured correctly.

  • Client-side encryption is not supported in the current version.
