CSV

Using the CSV Connector in DBSync Cloud Workflow

Overview

The CSV Connector in DBSync Cloud Workflow allows users to perform operations on CSV files, such as parsing (reading CSV content by rows) and composing (creating a CSV file from a data stream). This functionality enables seamless integration and manipulation of CSV data within workflows.

Operations

  • Parse CSV: This operation reads and interprets the content of a CSV file.

  • Compose CSV: This operation creates or generates a CSV file from the provided data stream.

Steps to Use the CSV Connector

Configure the CSV Action

  1. Select Operation: Choose either Parse or Compose.

Parse CSV

  1. Parse From: Specify the source of the CSV content you want to parse. This can be a hardcoded file path, a data-stream variable, or a regular expression that matches file name patterns.

  2. Delimiter: Choose the delimiter used in your CSV file from the dropdown list (e.g., Comma, Semicolon, Tab, Pipe, Space). You can also select Custom to specify a different separator.

  3. Include Header Row: Select Yes if the first row of the CSV file contains header information (column names). If selected, the column names will be displayed in the Header Layout section.

  4. Header Layout: If the header row is included, enter the column names that need to be processed. This field is user-editable.

  5. File Encoding: Specify the character encoding for the CSV file (e.g., UTF-8, ASCII) to ensure proper data handling.

  6. Row Batch Size: Set the number of rows to read per batch. Processing multiple rows at once improves performance on large files (a minimal sketch of these settings follows this list).
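The connector handles the parsing internally, but the settings above correspond to standard CSV-reading concepts. The Python sketch below is only an illustration of how the delimiter, header row, and encoding settings interact; the file name contacts.csv and the setting values are assumptions, not part of the product.

```python
import csv

# Hypothetical local file standing in for the "Parse From" source.
SOURCE_FILE = "contacts.csv"

# Assumed values mirroring the connector settings described above.
DELIMITER = ","        # Delimiter
HAS_HEADER = True      # Include Header Row
ENCODING = "utf-8"     # File Encoding

with open(SOURCE_FILE, newline="", encoding=ENCODING) as f:
    reader = csv.reader(f, delimiter=DELIMITER)
    header = next(reader) if HAS_HEADER else None  # Header Layout (column names)
    for row in reader:
        # Pair each row with the header when one is present;
        # otherwise fall back to positional column values.
        record = dict(zip(header, row)) if header else row
        print(record)
```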

Compose CSV

  1. Compose To: Define a variable that will store the composed CSV content. The variable name follows the pattern fileContent_Step{actionRowNumber}, where {actionRowNumber} is the row number of the Compose action (e.g., $file_content_step2).

  2. Select Separator: Choose the delimiter for the composed CSV file. You can select from common delimiters or specify a custom one.

  3. Include Header Row: Check this option if you want to include a header row in the composed CSV. If checked, enter the column names in the Header Layout section.

  4. Map: If the header row is included, map the fields from the data stream to the corresponding columns in the CSV. If not included, the columns will be labeled as Column<1>, Column<2>, etc.

  5. CSV Output: This variable stores the composed CSV data stream, which can then be used for further processing or uploaded to a target (see the sketch after this list).
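As a rough illustration of the Compose settings (not the connector's actual implementation), the sketch below writes a header row and mapped fields to an in-memory CSV. The column names, sample records, and variable names are assumptions chosen for the example.

```python
import csv
import io

# Assumed "Header Layout" and a data stream coming from an earlier step.
HEADER_LAYOUT = ["FirstName", "LastName", "Email"]   # hypothetical columns
data_stream = [
    {"FirstName": "Ada", "LastName": "Lovelace", "Email": "ada@example.com"},
    {"FirstName": "Alan", "LastName": "Turing", "Email": "alan@example.com"},
]

SEPARATOR = ","          # Select Separator
INCLUDE_HEADER = True    # Include Header Row

# Compose the CSV in memory; the result plays the role of the
# fileContent_Step{actionRowNumber} / CSV Output variable.
buffer = io.StringIO()
writer = csv.DictWriter(buffer, fieldnames=HEADER_LAYOUT, delimiter=SEPARATOR)
if INCLUDE_HEADER:
    writer.writeheader()
for record in data_stream:
    writer.writerow(record)   # "Map" step: data-stream fields -> CSV columns

csv_output = buffer.getvalue()
print(csv_output)
```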

Example Workflow

  1. Log in to DBSync Cloud Workflow.

  2. Create a new project and add a task for CSV operations.

  3. Add the CSV Connector and configure it for either parsing or composing CSV files.

  4. Run the workflow to execute the CSV operations and monitor the results.

PRO TIP

When working with large CSV files, use the Row Batch Size setting to process multiple rows at once. This can significantly improve performance and reduce processing time.
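Conceptually, Row Batch Size works like reading the file in fixed-size chunks rather than loading every row at once. The sketch below illustrates the idea only; the batch size of 500, the file name, and the helper function are assumptions for illustration, not connector internals.

```python
import csv
from itertools import islice

def read_in_batches(path, batch_size=500, delimiter=",", encoding="utf-8"):
    """Yield lists of up to batch_size rows, mirroring the Row Batch Size setting."""
    with open(path, newline="", encoding=encoding) as f:
        reader = csv.reader(f, delimiter=delimiter)
        while True:
            batch = list(islice(reader, batch_size))
            if not batch:
                break
            yield batch

# Hypothetical usage: process a large export 500 rows at a time.
for batch in read_in_batches("large_export.csv"):
    print(f"processing a batch of {len(batch)} rows")
```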

Frequently Asked Questions (FAQ)

1. Can I use the CSV Connector to both read and write CSV files?

Yes, the CSV Connector supports both parsing (reading) and composing (writing) operations, allowing you to manage CSV files in both directions within your workflows.

2. What delimiters are supported by the CSV Connector?

The CSV Connector supports common delimiters such as Comma, Semicolon, Tab, Pipe, and Space. You can also specify a custom delimiter if needed.

3. How do I handle CSV files with different encodings?

You can specify the file encoding (e.g., UTF-8, ASCII) in the File Encoding field to ensure proper data handling for different encodings.

4. Can I include a header row when composing a CSV file?

Yes, you can choose to include a header row by selecting the Include Header Row option and specifying the column names in the Header Layout section.

  • DBSync Query Documentation: Learn how to add and configure queries to retrieve data from various sources, enabling you to pull in the necessary information for your workflows.
