# Amazon S3

## Overview

The Amazon S3 Connector in DBSync Cloud Workflow enables secure, flexible integration between Amazon Simple Storage Service (S3) and local file systems. It supports operations such as uploading and downloading files and directories, enhancing your data management capabilities. Use it to transfer files, archive data, trigger workflows based on file events, or synchronize file content with other systems, all without custom coding or infrastructure setup.

***

## Key Benefits

**For Business Users:**

* **Centralized File Integration:** Easily connect enterprise applications with S3 for unified data storage and retrieval.
* **Automated Data Workflows:** Eliminate manual upload/download processes by automating file interactions.
* **Supports Multiple Scenarios:** From backup and archival to downstream data processing and analytics.

**For Technical Teams:**

* **Secure AWS Authentication:** Uses AWS Access Keys and IAM roles for controlled access.
* **Flexible Operations:** Supports file upload, download, listing, and copying as workflow tasks.
* **Event-Driven Actions:** Integrates with AWS event triggers for real-time workflows.

***

## Prerequisites

Before configuring the Amazon S3 Connector, ensure the following prerequisites are met:

1. **DBSync Access**
   * A valid license for this connector and access to the DBSync Cloud Workflow platform.
2. **AWS Credentials**
   * An AWS Access Key ID and Secret Access Key with IAM permissions to access and manage S3 resources.
3. **AWS IAM Permissions**
   * The IAM user or role must have permissions such as `s3:ListBucket`, `s3:GetObject`, `s3:PutObject`, and others as needed by your workflows.
4. **S3 Bucket Created**
   * At least one pre-configured S3 bucket where objects will be read or written.
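The IAM permissions listed above can be granted as a least-privilege policy. The sketch below builds such a policy document in Python; the bucket name `my-dbsync-bucket` is a placeholder, and you should adjust the actions and resources to what your workflows actually need.

```python
import json

# A minimal least-privilege policy covering the S3 actions mentioned above.
# "my-dbsync-bucket" is a placeholder -- substitute your own bucket name.
policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "ListBucket",
            "Effect": "Allow",
            "Action": ["s3:ListBucket"],
            # Bucket-level action: resource is the bucket ARN itself
            "Resource": ["arn:aws:s3:::my-dbsync-bucket"],
        },
        {
            "Sid": "ReadWriteObjects",
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject"],
            # Object-level actions: resource is the objects under the bucket
            "Resource": ["arn:aws:s3:::my-dbsync-bucket/*"],
        },
    ],
}

print(json.dumps(policy, indent=2))
```

Note that `s3:ListBucket` applies to the bucket ARN while `s3:GetObject`/`s3:PutObject` apply to object ARNs (`/*`); mixing these up is a common cause of AccessDenied errors during connection validation.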

|                                                                                                                                                                                                            |
| ---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------- |
| <p><strong>PRO TIP</strong></p><p>Use IAM roles with appropriate permissions for enhanced security when accessing your S3 buckets. This minimizes the risk of exposing your Access Key and Secret Key.</p> |

## Connector Configuration

Each Amazon S3 connector instance represents a single AWS account’s access configuration. To connect multiple AWS accounts or credential sets, create separate connector instances.

After configuration, validate the connection to ensure the credentials are correct and have sufficient permissions.

***

### Quick Setup Guide

Follow these steps to configure your Amazon S3 connector:

1. **Login to DBSync Cloud Workflow**
   * Use your DBSync credentials to access the platform.
2. **Add the Amazon S3 Connector**
   * Navigate to the Connectors list in the Apps pane and select S3.
3. **Enter Your AWS Credentials**
   * **Authorization Type:** Select **AccessKey** as the authorization type.
   * **AWS Key:** Enter your AWS access key here. This key identifies your AWS account.
   * **AWS Secret:** Provide your AWS secret here. This secret is associated with your AWS access key and should be kept confidential.
   * **AWS Region:** Specify the AWS Region where your Amazon S3 bucket resides. For example, us-east-1.
   * **AWS SSE Key** (Optional): If you are using Server-Side Encryption (SSE), enter your AWS SSE Key here. This field is optional.
   * **Restrict to Bucket** (Optional): You can restrict the connector to a specific Amazon S3 bucket by entering the bucket name. Leave it blank if no restriction is required.
   * **Restrict to Path** (Optional): Enter a specific path within your bucket to further restrict access. Leave this blank to allow access to all paths in the bucket.

<figure><img src="https://1036205596-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2Fv9avy716UiAsS24zOznZ%2Fuploads%2Fsqb2jtGhzNlEmmWKrtQt%2Funknown.png?alt=media&#x26;token=b44180da-a684-48c3-89e9-cf5a2a8f2bdb" alt=""><figcaption></figcaption></figure>

4. **Validate Connection**
   * Use the Test Connection button to verify AWS access permissions and connectivity.
5. **Save Configuration**
   * Once validated, save the connector for use in workflows and projects.

After successful setup, your S3 storage is available for file and object operations in workflows.
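The Test Connection step handles validation inside DBSync, but the same check can be reproduced outside the platform with the AWS SDK if you need to troubleshoot credentials independently. The sketch below is illustrative only, not part of the connector: `looks_like_region` is a hypothetical sanity-check helper, and `my-dbsync-bucket` is a placeholder bucket name.

```python
import re

def looks_like_region(region: str) -> bool:
    """Loose sanity check for an AWS region string such as 'us-east-1'."""
    return re.fullmatch(r"[a-z]{2}(-[a-z]+)+-\d", region) is not None

def check_s3_access(bucket: str, region: str) -> bool:
    """Rough equivalent of the connector's Test Connection step:
    HeadBucket succeeds only if the credentials can reach the bucket."""
    import boto3  # AWS SDK for Python; requires configured credentials
    from botocore.exceptions import ClientError
    s3 = boto3.client("s3", region_name=region)
    try:
        s3.head_bucket(Bucket=bucket)
        return True
    except ClientError:
        # 403 means the credentials lack permission; 404 means no such bucket
        return False

if __name__ == "__main__":
    print(check_s3_access("my-dbsync-bucket", "us-east-1"))
```

If this check fails where the connector also fails, the problem is in the IAM permissions or region, not in DBSync.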

***

## Common Use Cases

The Amazon S3 Connector can support a variety of business scenarios:

* **Automate File Backups**\
  Schedule workflows to upload database exports, logs, or reports to S3 for backup and archival.
* **Data Ingestion for Analytics**\
  Move raw or processed data from applications into S3 buckets for downstream analytics or data lake ingestion.
* **Automated Invoice Archival to Amazon S3**

<details>

<summary><strong>Problem:</strong> A business generates hundreds of invoice PDFs daily and stores them in a local folder. Over time, managing and backing up these files becomes difficult.</summary>

**Solution using DBSync:**

* Set up a File System connector to watch the invoice directory.
* Use the Amazon S3 connector to upload all files in that folder to a specified bucket daily.
* Schedule the process to run at the end of each day.

**Outcome:**

* Automated daily backup of critical files
* Improved data availability and disaster recovery
* Reduced manual effort in managing invoice archives

</details>
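The invoice-archival logic that the workflow automates can be sketched in plain Python with boto3 (the AWS SDK). This is an illustration, not DBSync code; the watch folder `/data/invoices` and the bucket `my-dbsync-bucket` are hypothetical names.

```python
from pathlib import Path
from datetime import date

def archive_key(prefix: str, day: date, filename: str) -> str:
    """Build a date-partitioned object key, e.g. invoices/2024/05/17/inv-001.pdf."""
    return f"{prefix}/{day:%Y/%m/%d}/{filename}"

def find_invoices(folder: str) -> list:
    """All PDF files in the watched directory (non-recursive), sorted by name."""
    return sorted(Path(folder).glob("*.pdf"))

if __name__ == "__main__":
    import boto3  # upload step; requires AWS credentials
    s3 = boto3.client("s3")
    for pdf in find_invoices("/data/invoices"):        # hypothetical watch folder
        s3.upload_file(str(pdf), "my-dbsync-bucket",   # placeholder bucket
                       archive_key("invoices", date.today(), pdf.name))
```

Date-partitioned keys (`invoices/YYYY/MM/DD/...`) keep the bucket browsable and make later lifecycle rules or analytics queries on the archive straightforward.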

***

## Uses

* **Scheduled Auto Backups:** Automate backups and archives from file systems or databases to Amazon S3, reducing manual intervention.
* **Improved File Sharing:** Enhance file sharing capabilities between databases and Amazon S3.

## Frequently Asked Questions (FAQ)

**What is Amazon S3?**

Amazon Simple Storage Service (S3) is a scalable storage solution provided by AWS that allows users to store and retrieve any amount of data at any time from anywhere on the web. The S3 Connector uses AWS APIs with secure credentials to manage objects in S3 buckets.

**Can this connector be used for bi-directional integration?**

Yes, the connector supports both bi-directional and uni-directional integrations.
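In the download direction (S3 to local), one pitfall worth knowing about is mapping object keys to local paths safely, since a key like `../../etc/passwd` must not escape the download directory. The sketch below, with hypothetical names, shows one defensive approach using boto3's `download_file`.

```python
from pathlib import PurePosixPath, Path

def local_path_for(key: str, download_dir: str) -> Path:
    """Map an S3 object key to a path under the local download directory,
    rejecting keys that would escape it (e.g. '../../etc/passwd')."""
    parts = PurePosixPath(key).parts
    if ".." in parts or key.startswith("/"):
        raise ValueError(f"unsafe key: {key!r}")
    return Path(download_dir).joinpath(*parts)

if __name__ == "__main__":
    import boto3  # requires AWS credentials
    s3 = boto3.client("s3")
    # placeholder bucket, key, and local directory
    target = local_path_for("exports/2024/orders.csv", "/data/inbound")
    target.parent.mkdir(parents=True, exist_ok=True)
    s3.download_file("my-dbsync-bucket", "exports/2024/orders.csv", str(target))
```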

**How frequently can integrations run?**

DBSync can trigger data integration every minute.

**Can I connect to multiple AWS accounts?**\
Yes - create separate connector configurations using different IAM credentials.

**Is my data secure?**\
Yes - your AWS keys are securely stored, and all S3 communication occurs over encrypted HTTPS connections.

**Can I transfer large objects (e.g., multi-GB files)?**\
Yes - the connector supports transfer of large files, subject to AWS S3 service limits and throughput.
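For context on those AWS limits: S3 multipart uploads require each part except the last to be at least 5 MiB, allow at most 10,000 parts, and cap objects at 5 TiB. The SDKs (and the connector) handle the splitting automatically, but the arithmetic behind the limits can be sketched as:

```python
import math

# S3 multipart upload limits, per AWS documentation
MIN_PART_SIZE = 5 * 1024**2      # 5 MiB minimum for every part but the last
MAX_PARTS = 10_000
MAX_OBJECT_SIZE = 5 * 1024**4    # 5 TiB maximum object size

def part_count(object_size: int, part_size: int) -> int:
    """Number of parts a multipart upload of object_size bytes needs."""
    if part_size < MIN_PART_SIZE:
        raise ValueError("part size below S3's 5 MiB minimum")
    if object_size > MAX_OBJECT_SIZE:
        raise ValueError("object exceeds S3's 5 TiB limit")
    n = math.ceil(object_size / part_size)
    if n > MAX_PARTS:
        raise ValueError("too many parts; increase part size")
    return n
```

For example, a 100 MiB file with 8 MiB parts needs 13 parts; very large objects force a larger part size so the total stays within 10,000 parts.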

**Can workflows be triggered by new uploads in S3?**\
While S3 does not natively push events to DBSync, you can poll the bucket or integrate with AWS event systems to trigger workflows.
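A simple polling approach keeps a "last seen" timestamp and processes only objects modified after it. The sketch below shows the core filtering logic over a ListObjectsV2-style response; the bucket and prefix are placeholders.

```python
from datetime import datetime, timezone

def new_objects(listing, watermark):
    """From a ListObjectsV2-style listing, keep objects modified after the
    watermark, and return them together with the advanced watermark."""
    fresh = [o for o in listing if o["LastModified"] > watermark]
    new_mark = max((o["LastModified"] for o in fresh), default=watermark)
    return fresh, new_mark

if __name__ == "__main__":
    import boto3  # requires AWS credentials
    s3 = boto3.client("s3")
    page = s3.list_objects_v2(Bucket="my-dbsync-bucket",  # placeholder
                              Prefix="incoming/")
    start = datetime.min.replace(tzinfo=timezone.utc)
    fresh, mark = new_objects(page.get("Contents", []), start)
    for obj in fresh:
        print("trigger workflow for", obj["Key"])
    # persist `mark` somewhere durable and reuse it on the next poll
```

The watermark must be persisted between polls (a file, database row, or workflow variable); otherwise every poll reprocesses the whole prefix.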

**Does this work with IAM roles and temporary credentials?**\
Yes - you can provide temporary session tokens or use roles via federated access methods.

***

## Summary

The Amazon S3 Connector in DBSync Cloud Workflow provides a powerful, secure, and flexible way to integrate your object storage with the rest of your business systems. Whether you need to automate backups, support analytics pipelines, or manage file lifecycle across platforms, this connector simplifies file-based integration and workflows without custom development.

## Related Links

**Monitoring and Alerts Setup**: Set up alerts to monitor your workflows proactively.

**Scheduler:** By establishing a schedule, you can automate your workflows to run without manual intervention, allowing you to manage recurring tasks more efficiently.
