
Complete backup solution for n8n workflows & credentials (local/FTP)

Florent
2/3/2026

Automated n8n Workflows & Credentials Backup to Local/Server Disk & FTP

Complete backup solution that saves both workflows and credentials to local/server disk with optional FTP upload for off-site redundancy.

What makes this workflow different:

  • Backs up workflows AND credentials together
  • Saves to local/server disk (not Git, GitHub, or any cloud services)
  • Optional FTP upload for redundancy (disabled by default)
  • Comprehensive error handling and email notifications
  • Timezone-aware scheduling
  • Ready to use with minimal configuration

How it works

Backup Process (Automated Daily at 4 AM):

  1. Initialization - Sets up timezone-aware timestamps and configurable backup paths for both local/server disk and FTP destinations
  2. Folder Creation - Creates date-stamped backup directories (YYYY-MM-DD format) on local/server disk
  3. Dual Backup Operations - Processes credentials and workflows in two separate branches:
    • Credentials Branch:
      • Exports n8n credentials using the built-in CLI command with backup flag
      • Lists exported credential files in the credentials folder
      • Reads each credential file from disk
      • Optional: Uploads to FTP server (disabled by default)
      • Optional: Logs FTP upload results for credentials
    • Workflows Branch:
      • Retrieves all workflows via n8n API
      • Cleans workflow names for cross-platform compatibility
      • Converts workflows to formatted JSON files
      • Writes files to local/server disk
      • Optional: Uploads to FTP server (disabled by default)
      • Optional: Logs FTP upload results for workflows
  4. Data Aggregation - Combines all workflow data with binary attachments for comprehensive reporting
  5. Results Merging - Consolidates credentials FTP logs, workflows FTP logs, and aggregated workflow data
  6. Summary Generation - Creates detailed backup logs including:
    • Statistics (file counts, sizes, durations)
    • Success/failure tracking for local and FTP operations
    • Error tracking with detailed messages
    • Timezone-aware timestamps
  7. Notifications - Sends comprehensive email reports with log files attached and saves execution logs to disk
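The initialization and folder-creation steps above can be sketched in plain Node.js. This is an illustrative approximation, not the workflow's actual Init-node code; `buildBackupPath` and its arguments are assumed names:

```javascript
// Hypothetical sketch of steps 1-2: derive a timezone-aware date stamp
// and a YYYY-MM-DD backup directory path. Plain Node.js, not the actual
// Init node code; function and argument names are assumptions.
function buildBackupPath(backupFolder, timeZone, now = new Date()) {
  // The 'en-CA' locale formats dates as YYYY-MM-DD out of the box.
  const dateStamp = new Intl.DateTimeFormat('en-CA', {
    timeZone, year: 'numeric', month: '2-digit', day: '2-digit',
  }).format(now);
  return `${backupFolder}/${dateStamp}`;
}

// A fixed instant keeps the example deterministic (4 AM Paris time).
console.log(buildBackupPath(
  '/files/n8n-backups',
  'Europe/Paris',
  new Date('2026-02-03T04:00:00+01:00'),
));
// → /files/n8n-backups/2026-02-03
```

Using a fixed IANA time zone rather than the server's local time is what keeps the date stamp stable when n8n runs in a UTC container.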

How to use

Initial Setup:

  1. Configure the Init Node - Open the "Init" node and customize these key parameters in the "Workflow Standard Configuration" section:

    // Admin email for notifications
    const N8N_ADMIN_EMAIL = $env.N8N_ADMIN_EMAIL || 'youremail@world.com';
    
    // Workflow name (auto-detected)
    const WORKFLOW_NAME = $workflow.name;
    
    // Projects root directory on your server
    const N8N_PROJECTS_DIR = $env.N8N_PROJECTS_DIR || '/files/n8n-projects-data';
    // projects-root-folder/
    //   └── Your-project-folder-name/
    //       ├── logs/
    //       ├── reports/
    //       ├── ...
    //       └── [other project files]
    
    // Project folder name for this backup workflow
    const PROJECT_FOLDER_NAME = "Workflow-backups";
    

    Then customize these parameters in the "Workflow Custom Configuration" section:

    // Local backup folder (must exist on your server)
    const BACKUP_FOLDER = $env.N8N_BACKUP_FOLDER || '/files/n8n-backups';
    
    // FTP backup folder (root path on your FTP server)
    const FTP_BACKUP_FOLDER = $env.N8N_FTP_BACKUP_FOLDER || '/n8n-backups';
    
    // FTP server name for logging (display purposes only)
    const FTPName = 'Synology NAS 2To';
    

    These variables can also be set as environment variables in your n8n configuration.

  2. Set Up Credentials:

    • Configure n8n API credentials for the "Fetch Workflows" node
    • Configure SMTP credentials for email notifications
    • Optional: Configure FTP credentials if you want to enable off-site backups
  3. Configure Backup Folder:

    • Ensure the backup folder path exists on your server
    • Verify proper write permissions for the n8n process
    • If running in Docker, ensure volume mapping is correctly configured
  4. Customize Email Settings:

    • Update the "Send email" node with your recipient email address or your "N8N_ADMIN_EMAIL" environment value
    • Adjust email subject and body text as needed

Enabling FTP Upload (Optional):

By default, FTP upload nodes are disabled for easier setup. To enable off-site FTP backups:

  1. Simply activate these 4 nodes (no other changes needed):

    • "Upload Credentials To FTP"
    • "FTP Logger (credentials)"
    • "Upload Workflows To FTP"
    • "FTP Logger (workflows)"
  2. Configure FTP credentials in the two upload nodes

  3. The workflow will automatically handle FTP operations and include upload status in reports

Requirements

  • n8n API credentials (for workflow fetching)
  • SMTP server configuration (for email notifications)
  • Adequate disk space for local backup storage
  • Proper file system permissions for backup folder access
  • Docker environment with volume mapping (if running n8n in Docker)
  • Optional: FTP server access and credentials (for off-site backups)

Good to know

  • Security: Credentials are exported using n8n's secure backup format - actual credential values are not exposed in plain text
  • Timezone Handling: All timestamps respect configured timezone settings (defaults to Europe/Paris, configurable in Init node)
  • File Naming: Automatic sanitization ensures backup files work across different operating systems (removes forbidden characters, limits length to 180 characters)
  • FTP Upload: Disabled by default for easier setup - simply activate 4 nodes to enable off-site backups without any code changes
  • Connection Resilience: FTP operations include error handling for timeout and connection issues without failing the entire backup
  • Graceful Degradation: If FTP nodes are disabled, the workflow completes successfully with local backups only and indicates FTP status in logs
  • Error Handling: Comprehensive error catching with detailed logging and email notifications
  • Dual Logging: Creates both JSON logs (for programmatic parsing) and plain text logs (for human readability)
  • Storage: Individual workflow JSON files allow for selective restore and easier version control integration
  • Scalability: Handles any number of workflows efficiently with detailed progress tracking
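The file-naming rule above can be illustrated with a small sketch. The workflow's actual sanitization code may apply different rules; `sanitizeWorkflowName` and its character set are assumptions:

```javascript
// Illustrative sketch of the sanitization described above: replace
// characters forbidden on Windows/macOS/Linux with underscores and cap
// the length at 180. Not the workflow's exact code.
function sanitizeWorkflowName(name, maxLength = 180) {
  return name
    .replace(/[<>:"\/\\|?*\x00-\x1f]/g, '_') // invalid on Windows
    .replace(/\s+/g, ' ')                    // collapse whitespace runs
    .trim()
    .slice(0, maxLength);
}

console.log(sanitizeWorkflowName('Backup: workflows/credentials'));
// → Backup_ workflows_credentials
```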

This automated backup workflow saves your n8n data to both local disk and FTP server. To restore your backups, use:

n8n Workflow: Complete Backup Solution for n8n Workflows

This n8n workflow provides a robust and automated solution for backing up your n8n workflows and credentials. It leverages local file storage and FTP to ensure your critical automation definitions are securely saved, with email notifications for success or failure.

What it does

This workflow automates the following steps:

  1. Triggers on a Schedule: The workflow starts based on a predefined schedule (e.g., daily, weekly).
  2. Retrieves n8n Workflows: It fetches all active workflows from your n8n instance.
  3. Retrieves n8n Credentials: It fetches all credentials configured in your n8n instance.
  4. Converts to File: The retrieved workflows and credentials (JSON data) are converted into a file format.
  5. Saves to Local Disk: The backup files are saved to a specified directory on the local disk where your n8n instance is running.
  6. Uploads to FTP: The backup files are then uploaded to a remote FTP server for off-site storage.
  7. Sends Success Email: Upon successful completion of the backup process, an email notification is sent.
  8. Handles Errors: If any step in the backup process fails, the workflow stops and sends an error notification email.
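The "Converts to File" step can be approximated outside n8n with plain Node.js. Inside the workflow this is handled by nodes using n8n's binary-data handling; `workflowToFile` and its shape are illustrative assumptions:

```javascript
// Hypothetical stand-in for step 4: serialize a workflow object to
// pretty-printed JSON and wrap it as a named file buffer, roughly the
// shape a Read/Write Files from Disk node would then persist.
function workflowToFile(workflow) {
  const json = JSON.stringify(workflow, null, 2);
  return {
    fileName: `${workflow.name}.json`,
    buffer: Buffer.from(json, 'utf8'), // byte size feeds backup statistics
  };
}

const file = workflowToFile({ name: 'Daily-report', active: true, nodes: [] });
console.log(file.fileName); // → Daily-report.json
```

One JSON file per workflow, as noted earlier, is what makes selective restore possible: you can re-import a single file without touching the rest of the backup.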

Prerequisites/Requirements

  • n8n Instance: An active n8n instance where this workflow will run.
  • Local Disk Access: The n8n instance needs write access to a local directory for temporary file storage.
  • FTP Server: Access to an FTP server with appropriate credentials for storing backup files.
  • SMTP Server: An SMTP server configured in n8n for sending email notifications.

Setup/Usage

  1. Import the Workflow: Import the provided JSON into your n8n instance.
  2. Configure Schedule Trigger: Adjust the "Schedule Trigger" node (ID: 839) to your desired backup frequency (e.g., daily, weekly).
  3. Configure n8n Nodes:
    • Get All Workflows (n8n node, ID: 826): Ensure this node is configured to retrieve workflows from your n8n instance. You might need to set up an n8n API credential if not already done.
    • Get All Credentials (n8n node, ID: 826): Configure this node to retrieve credentials. It is a second instance of the same node type as "Get All Workflows", with its own configuration.
  4. Configure Read/Write Files from Disk:
    • Save Workflows to Disk (Read/Write Files from Disk node, ID: 1233): Specify the local path where workflow backups should be saved.
    • Save Credentials to Disk (Read/Write Files from Disk node, ID: 1233): Specify the local path where credential backups should be saved. This is a second instance of the same node type, with its own configuration.
  5. Configure FTP Node:
    • Upload to FTP (FTP node, ID: 350): Set up your FTP credentials and the remote path where the backup files should be uploaded.
  6. Configure Send Email Nodes:
    • Send Success Email (Send Email node, ID: 11): Configure your SMTP credentials and the recipient email address for successful backup notifications.
    • Send Error Email (Send Email node, ID: 11): Configure your SMTP credentials and the recipient email address for error notifications. This is a second instance of the same node type, with its own configuration.
  7. Activate the Workflow: Once configured, activate the workflow in n8n.
