FTP/SFTP Integration

Connect to FTP and SFTP servers to upload and download files seamlessly. Perfect for legacy system integrations, batch file transfers, and automated data exchange workflows.

Overview

This integration package provides secure file transfer capabilities using both traditional FTP and the more secure SFTP protocol. It supports uploading from strings or binary data, downloading file contents, and streaming large files to optimize memory usage. Ideal for integrating with legacy systems, EDI file exchanges, or any scenario requiring automated file transfers.

Key Features

🔐 Secure Transfers

Use SFTP for encrypted file transfers over SSH, ensuring data security.

⚡ Streaming Support

Process large files efficiently with streaming mode to reduce memory consumption.

📤 Flexible Uploads

Upload files from strings, binary data, or streams for maximum flexibility.

📥 Download Options

Download files as strings or binary streams based on your processing needs.

🔄 Dual Protocol

Support for both FTP and SFTP protocols with the same configuration structure.

🎯 Path Control

Specify origin paths for downloads and destination paths for uploads with precision.

Authentication Setup

SFTP Configuration

For SFTP (SSH File Transfer Protocol, which runs over SSH), use this configuration structure:

{
  "sftp": {
    "host": "sftp.example.com",
    "user": "your-username",
    "password": "your-password",
    "use_streaming": true  // Optional: Enable for files >20MB
  }
}
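As a rough illustration of how this configuration might be consumed, the sketch below uses the open-source ssh2-sftp-client package (an assumption; the integration may wrap a different client internally). Note that this configuration uses "user" while that client expects "username":

import Client from "ssh2-sftp-client";

// Minimal sketch, assuming the ssh2-sftp-client npm package.
const config = {
  host: "sftp.example.com",
  user: "your-username",
  password: "your-password",
};

async function listRemoteDirectory(remoteDir: string) {
  const sftp = new Client();
  try {
    await sftp.connect({
      host: config.host,
      port: 22,                  // default SFTP port
      username: config.user,     // map "user" to the client's "username" field
      password: config.password,
    });
    return await sftp.list(remoteDir);   // list remote files to verify access
  } finally {
    await sftp.end();                    // always close the connection
  }
}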

FTP Configuration

For traditional FTP connections:

{
  "ftp": {
    "host": "ftp.example.com",
    "user": "your-username",
    "password": "your-password"
  }
}
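A comparable sketch for plain FTP, assuming the open-source basic-ftp package (again an assumption, shown only to illustrate how the fields map onto a client):

import { Client } from "basic-ftp";

// Minimal sketch, assuming the basic-ftp npm package.
async function downloadViaFtp(remotePath: string, localPath: string) {
  const client = new Client();
  try {
    await client.access({
      host: "ftp.example.com",
      user: "your-username",
      password: "your-password",
      secure: false,             // plain FTP; prefer SFTP where possible
    });
    await client.downloadTo(localPath, remotePath);   // write remote file to disk
  } finally {
    client.close();
  }
}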

💡 Streaming for Large Files

When working with files larger than 20MB, enable use_streaming: true to process files in chunks instead of loading them entirely into memory. This keeps memory usage roughly constant regardless of file size and prevents out-of-memory errors.
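In practice, streaming amounts to piping the file through a stream rather than buffering it whole. A sketch, assuming ssh2-sftp-client (the exact mechanism inside the integration may differ):

import Client from "ssh2-sftp-client";
import { createReadStream } from "node:fs";

// Sketch only: stream a large local file to the server in chunks
// instead of reading it fully into memory first.
async function uploadLarge(localPath: string, remotePath: string) {
  const sftp = new Client();
  await sftp.connect({
    host: "sftp.example.com",
    username: "your-username",
    password: "your-password",
  });
  try {
    const source = createReadStream(localPath);   // chunked reads, constant memory
    await sftp.put(source, remotePath);           // put() accepts a Readable stream
  } finally {
    await sftp.end();
  }
}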

Common Use Cases

📊 EDI File Exchange

Automate the exchange of EDI files (850 POs, 856 ASNs, 810 Invoices) with trading partners who use FTP/SFTP for B2B data exchange.

Flow: ERP → Generate EDI → Upload to Partner SFTP → Partner Downloads → Process

🏢 Legacy System Integration

Connect modern applications to legacy systems that only support file-based data exchange via FTP servers.

Flow: Download CSV from Legacy FTP → Transform → Push to Modern API

📦 Product Catalog Updates

Receive product catalog updates, price lists, or inventory files from suppliers via SFTP drops.

Flow: Download Supplier Files → Parse CSV/XML → Update Product Database

🎨 Media Asset Distribution

Upload product images, videos, or documents to partner FTP servers for distribution or processing.

Flow: Resize Images → Upload to Partner SFTP → Partner Publishes to Marketplace

📈 Report Distribution

Generate reports and automatically deliver them to client SFTP servers at scheduled intervals.

Flow: Generate Daily Sales Report → Export CSV → Upload to Client SFTP

💾 Data Backup

Archive data exports and backups to secure SFTP storage for compliance or disaster recovery.

Flow: Export Database → Compress → Upload to Backup SFTP → Retention Policy

Best Practices

Use SFTP Over FTP

Always prefer SFTP over traditional FTP. SFTP encrypts credentials and file contents in transit, protecting sensitive information from interception; plain FTP sends both in cleartext.

Enable Streaming for Large Files

Set use_streaming: true when working with files larger than 20MB to prevent memory issues and improve performance.

Implement Retry Logic

Network issues can cause FTP transfers to fail. Implement retry logic with exponential backoff for production reliability.
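A generic retry wrapper with exponential backoff might look like the following; the operation being retried and the delay values are placeholders, not part of this integration:

// Illustrative retry helper with exponential backoff.
async function withRetry<T>(operation: () => Promise<T>, maxAttempts = 3): Promise<T> {
  let lastError: unknown;
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    try {
      return await operation();
    } catch (err) {
      lastError = err;
      if (attempt === maxAttempts) break;
      const delayMs = 1000 * 2 ** (attempt - 1);   // 1s, 2s, 4s, ...
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
  throw lastError;
}

// Usage: await withRetry(() => uploadLarge("/tmp/orders.csv", "/outgoing/orders.csv"));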

Use Descriptive File Names

Include timestamps and identifiers in file names (e.g., orders_2024-12-10_143052.csv) to avoid overwrites and enable tracking.
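A small hypothetical helper for building such names (the prefix and extension are whatever your workflow needs):

// Build a timestamped file name such as orders_2024-12-10_143052.csv.
function timestampedName(prefix: string, extension: string, when = new Date()): string {
  const pad = (n: number) => String(n).padStart(2, "0");
  const date = `${when.getFullYear()}-${pad(when.getMonth() + 1)}-${pad(when.getDate())}`;
  const time = `${pad(when.getHours())}${pad(when.getMinutes())}${pad(when.getSeconds())}`;
  return `${prefix}_${date}_${time}.${extension}`;
}

// timestampedName("orders", "csv") -> "orders_2024-12-10_143052.csv" (at that moment)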

Validate Files After Transfer

After uploading, verify the file size or checksum to ensure the transfer completed successfully before marking the workflow as complete.
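One way to do a size check, sketched with ssh2-sftp-client (an assumption; any client that can stat a remote file works the same way):

import Client from "ssh2-sftp-client";
import { statSync } from "node:fs";

// Sketch: compare local and remote sizes after an upload before
// treating the transfer as complete.
async function verifyUpload(sftp: Client, localPath: string, remotePath: string): Promise<boolean> {
  const localSize = statSync(localPath).size;
  const remoteStats = await sftp.stat(remotePath);
  return remoteStats.size === localSize;
}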

Archive Processed Files

Move or rename processed files instead of deleting them immediately. This allows for troubleshooting and prevents duplicate processing.
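For example, a processed file can be renamed into an archive directory on the server; the sketch below assumes ssh2-sftp-client and an /archive directory that may not exist on your server:

import Client from "ssh2-sftp-client";

// Sketch: move a processed file into an archive directory instead of deleting it.
async function archiveProcessedFile(sftp: Client, remotePath: string) {
  const fileName = remotePath.split("/").pop() ?? remotePath;
  await sftp.rename(remotePath, `/archive/${fileName}`);   // keep it for troubleshooting
}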

Configuration Examples

Download Configuration

When downloading files, specify the origin_field to indicate the remote file path:

{
  "sftp": {
    "host": "sftp.example.com",
    "user": "downloader",
    "password": "secure-password",
    "origin_field": "/incoming/products/catalog.csv"
  }
}
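A sketch of how a download configuration like this might be applied, again assuming ssh2-sftp-client (with that client, get() without a destination resolves to a Buffer):

import Client from "ssh2-sftp-client";

// Sketch: fetch the file named by origin_field and return its text contents.
async function downloadCatalog(config: { host: string; user: string; password: string; origin_field: string }) {
  const sftp = new Client();
  try {
    await sftp.connect({ host: config.host, username: config.user, password: config.password });
    const contents = (await sftp.get(config.origin_field)) as Buffer;   // file contents in memory
    return contents.toString("utf8");                                    // e.g. CSV text for parsing
  } finally {
    await sftp.end();
  }
}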

Upload Configuration

When uploading files, specify the destination_path indicating where the file should be placed:

{
  "sftp": {
    "host": "sftp.partner.com",
    "user": "uploader",
    "password": "secure-password",
    "destination_path": "/outgoing/orders/order_.xml",
    "use_streaming": true
  }
}
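And a matching upload sketch, assuming ssh2-sftp-client, where streaming is approximated by passing a Readable stream; the remote path simply mirrors the destination_path shown above:

import Client from "ssh2-sftp-client";
import { createReadStream } from "node:fs";

// Sketch: stream a local file to the configured destination path.
async function uploadOrder(localPath: string) {
  const sftp = new Client();
  try {
    await sftp.connect({ host: "sftp.partner.com", username: "uploader", password: "secure-password" });
    await sftp.put(createReadStream(localPath), "/outgoing/orders/order_.xml");
  } finally {
    await sftp.end();
  }
}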

Troubleshooting

🔴 Issue: Connection Timeout

Cause: Firewall blocking FTP/SFTP ports or incorrect host.

Solution:

  • Verify host and port (FTP: 21, SFTP: 22) are correct
  • Check if firewall allows outbound connections on required ports
  • Test connectivity using command-line FTP/SFTP tools first
  • Confirm the server is accepting connections from your IP

🔴 Issue: Authentication Failed

Cause: Incorrect credentials or permission issues.

Solution:

  • Verify username and password are correct
  • Check if account requires SSH key authentication (SFTP)
  • Ensure account has not expired or been locked
  • Test credentials using FileZilla or command-line tools

🔴 Issue: Permission Denied When Uploading

Cause: User lacks write permissions on destination directory.

Solution:

  • Verify user has write permissions on the target directory
  • Check if directory exists (some servers require manual creation)
  • Try uploading to a different directory to isolate the issue
  • Contact server administrator to adjust permissions

🔴 Issue: File Not Found When Downloading

Cause: Incorrect path or file hasn't been uploaded yet.

Solution:

  • Double-check the origin_field path is correct
  • List directory contents to verify file exists and get exact name
  • Check for case sensitivity in file paths (Linux servers are case-sensitive)
  • Verify file upload completed before attempting download

🔴 Issue: Out of Memory Errors

Cause: Large files loaded entirely into memory.

Solution:

  • Enable use_streaming: true in configuration
  • Use binary/stream actions instead of string actions for large files
  • Increase the Node.js memory limit if necessary (e.g. node --max-old-space-size=4096)
  • Process files in chunks if possible