Restore workflows & credentials from FTP - Remote Backup Solution
This n8n template provides a safe and intelligent restore solution for self-hosted n8n instances, allowing you to restore workflows and credentials from FTP remote backups.
Perfect for disaster recovery or migrating between environments, this workflow automatically identifies your most recent FTP backup and provides a manual restore capability that intelligently excludes the current workflow to prevent conflicts. Works seamlessly with date-organized backup folders stored on any FTP/SFTP server.
Good to know
- This workflow uses n8n's native import commands (n8n import:workflow and n8n import:credentials); an example is shown after this list
- Works with date-formatted backup folders (YYYY-MM-DD) stored on FTP servers
- The restore process intelligently excludes the current workflow to prevent overwriting itself
- Requires FTP/SFTP server access and proper Docker volume configuration
- All downloaded files are temporarily stored server-side before import
- Compatible with backups created by n8n's export commands and uploaded to FTP
- Supports selective restoration: restore only credentials, only workflows, or both
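For reference, these are the n8n CLI commands the workflow runs under the hood via Execute Command nodes; the paths below are only illustrative (the real ones are built by the Init node at runtime) and the exact flags in your copy may differ slightly:
# Import every workflow JSON file from the temporary restore folder (illustrative path)
n8n import:workflow --separate --input=/files/n8n-projects-data/Workflow-backups/2025-01-15-restore-workflows/
# Import all credential files from the credentials restore folder (illustrative path)
n8n import:credentials --separate --input=/files/n8n-projects-data/Workflow-backups/2025-01-15-restore-credentials/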
How it works
Restore Process (Manual)
- Manual trigger with configurable pinned data options (credentials: true/false, worflows: true/false; the worflows key keeps the spelling used in the workflow itself)
- The Init node sets up all necessary paths, timestamps, and configuration variables using your environment settings
- The workflow connects to your FTP server and scans for available backup dates
- Automatically identifies the most recent backup folder (latest YYYY-MM-DD date)
- Creates temporary restore folders on your local server for downloaded files
- If restoring credentials:
- Lists all credential files from FTP backup folder
- Downloads credential files to temporary local folder
- Writes files to disk using "Read/Write Files from Disk" node
- Direct import using n8n's import command
- Credentials are imported with their encrypted format intact
- If restoring workflows:
- Lists all workflow JSON files from FTP backup folder
- Downloads workflow files to temporary local folder
- Filters out the credentials subfolder to prevent importing it as a workflow
- Writes workflow files to disk
- Intelligently excludes the current restore workflow to prevent conflicts
- Imports all other workflows using n8n's import command
- Optional email notifications provide detailed restore summaries with command outputs
- Temporary files remain on server for verification (manual cleanup recommended)
How to use
Prerequisites
- Existing n8n backups on the FTP server in a date-organized folder structure (format: /ftp-backup-folder/YYYY-MM-DD/); a sample layout is sketched after this list
- Workflow backups as JSON files in the date folder
- Credentials backups in the subfolder /ftp-backup-folder/YYYY-MM-DD/n8n-credentials/
- FTP/SFTP access credentials configured in n8n
- For new environments: N8N_ENCRYPTION_KEY from the source environment (see the dedicated section below)
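A hypothetical example of the expected layout on the FTP server (folder and file names are illustrative):
/n8n-backups/                      (FTP_BACKUP_FOLDER)
  2025-01-14/
    My_first_workflow.json
    My_second_workflow.json
    n8n-credentials/
      My_credential.json
  2025-01-15/                      (most recent date folder, selected automatically)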
Initial Setup
- Configure your environment variables (an example .env snippet is shown after this setup list):
  - N8N_ADMIN_EMAIL: Your email for notifications (optional)
  - FTP_BACKUP_FOLDER: FTP path where backups are stored (e.g., /n8n-backups)
  - N8N_PROJECTS_DIR: Projects root directory (e.g., /files/n8n-projects-data)
  - GENERIC_TIMEZONE: Your local timezone (e.g., Europe/Paris)
  - N8N_ENCRYPTION_KEY: Required if restoring credentials to a new environment (see the dedicated section below)
- Create your FTP credential in n8n:
- Add a new FTP/SFTP credential
- Configure host, port, username, and password/key
- Test the connection
- Update the Init node:
  - (Optional) Configure your email here: const N8N_ADMIN_EMAIL = $env.N8N_ADMIN_EMAIL || 'youremail@world.com';
  - Set PROJECT_FOLDER_NAME to "Workflow-backups" (or your preferred name)
  - Set FTP_BACKUP_FOLDER to match your FTP backup path (default: /n8n-backups)
  - Set credentials to "n8n-credentials" (or your backup credentials folder name)
  - Set FTPName to a descriptive name for your FTP server (used in notifications)
- Configure FTP credentials in nodes:
- Update the FTP credential in "List Credentials Folders" node
- Verify all FTP nodes use the same credential
- Test connection by executing "List Credentials Folders" node
- Optional: Configure SMTP for email notifications:
- Add SMTP credential in n8n
- Activate "SUCCESS email Credentials" and "SUCCESS email Workflows" nodes
- Or remove email nodes if not needed
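For reference, a minimal .env snippet covering the variables from the first setup step might look like this (all values are examples; adjust them to your setup and pass them through to the n8n container in docker-compose.yml as shown in the encryption key section below):
N8N_ADMIN_EMAIL=admin@example.com
FTP_BACKUP_FOLDER=/n8n-backups
N8N_PROJECTS_DIR=/files/n8n-projects-data
GENERIC_TIMEZONE=Europe/Paris
N8N_ENCRYPTION_KEY=your_key_from_the_source_environment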
Performing a Restore
- Open the workflow and locate the "Start Restore" manual trigger node
- Edit the pinned data to choose what to restore:
  { "credentials": true, "worflows": true }
  - credentials: true - Restore credentials from FTP
  - worflows: true - Restore workflows from FTP (note: the key's typo is preserved from the original workflow)
  - Set both to true to restore everything
- Update the node's notes to reflect your choice (for documentation)
- Click "Execute workflow" on the "Start Restore" node
- The workflow will:
- Connect to FTP and find the most recent backup
- Download selected files to temporary local folders
- Import credentials and/or workflows
- Send success email with detailed operation logs
- Check the console logs or email for detailed restore summary
Important Notes
- The workflow automatically excludes itself during restore to prevent conflicts
- Credentials are restored with their encryption intact. If restoring to a new environment, you must configure the N8N_ENCRYPTION_KEY from the source environment (see the dedicated section below)
- Existing workflows/credentials with the same names will be overwritten
- Temporary folders are created with a date prefix (e.g., 2025-01-15-restore-credentials)
- Test in a non-production environment first if unsure
Critical: N8N_ENCRYPTION_KEY Configuration
Why this is critical: n8n generates an encryption key automatically on first launch and saves it in the ~/.n8n/config file. However, if this file is lost (for example, due to missing Docker volume persistence), n8n will generate a NEW key, making all previously encrypted credentials inaccessible.
When you need to configure N8N_ENCRYPTION_KEY:
- Restoring to a new n8n instance
- When your data directory is not persisted between container recreations
- Migrating from one server to another
- As a best practice to ensure key persistence across updates
How credentials encryption works:
- Credentials are encrypted with a specific key unique to each n8n instance
- This key is auto-generated on first launch and stored in /home/node/.n8n/config
- When you back up credentials, they remain encrypted but the key is NOT included
- If the key file is lost or a new key is generated, restored credentials cannot be decrypted
- Setting N8N_ENCRYPTION_KEY explicitly ensures the key remains consistent
Solution: Retrieve and configure the encryption key
Step 1: Get the key from your source environment
# Check if the key is defined in environment variables
docker-compose exec n8n printenv N8N_ENCRYPTION_KEY
If this command returns nothing, the key is auto-generated and stored in n8n's data volume:
# Enter the container
docker-compose exec n8n sh
# Check configuration file
cat /home/node/.n8n/config
# Exit container
exit
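If you read the key from the config file, note that it is a small JSON document; in typical installations the key sits in a field like this (value redacted here):
{
  "encryptionKey": "..."
}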
Step 2: Configure the key in your target environment
Option A: Using .env file (recommended for security)
# Add to your .env file
N8N_ENCRYPTION_KEY=your_retrieved_key_here
Then reference it in docker-compose.yml:
services:
  n8n:
    environment:
      - N8N_ENCRYPTION_KEY=${N8N_ENCRYPTION_KEY}
Option B: Directly in docker-compose.yml (less secure)
services:
  n8n:
    environment:
      - N8N_ENCRYPTION_KEY=your_retrieved_key_here
Step 3: Restart n8n
docker-compose restart n8n
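To confirm the key was picked up, you can re-run the check from Step 1; it should now print the key you configured:
docker-compose exec n8n printenv N8N_ENCRYPTION_KEY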
Step 4: Now restore your credentials
Only after configuring the encryption key, run the restore workflow with credentials: true.
Best practice for future backups:
- Always save your N8N_ENCRYPTION_KEY in a secure location alongside your backups
- Consider storing it in a password manager or secure vault
- Document it in your disaster recovery procedures
Requirements
FTP Server
- FTP or SFTP server with existing n8n backups
- Read access to backup folder structure
- Network connectivity from n8n instance to FTP server
Existing Backups on FTP
- Date-organized backup folders (YYYY-MM-DD format)
- Backup files created by n8n's export commands or compatible format
- Credentials in subfolder structure: YYYY-MM-DD/n8n-credentials/
Environment
- Self-hosted n8n instance (Docker recommended)
- Docker volumes mounted with write access to project folder
- Access to n8n CLI commands (n8n import:credentials and n8n import:workflow); a quick sanity check is shown after this list
- Proper file system permissions for temporary folder creation
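As a quick sanity check on a Docker-based install, the following commands should succeed (illustrative, assuming the docker-compose service is named n8n and the projects directory is mounted at /files/n8n-projects-data):
# Confirm the n8n CLI is available inside the container
docker-compose exec n8n n8n --version
# Confirm the projects directory is mounted and writable by the container user
docker-compose exec n8n sh -c "touch /files/n8n-projects-data/.write-test && rm /files/n8n-projects-data/.write-test"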
Credentials
- FTP/SFTP credential configured in n8n
- Optional: SMTP credentials for email notifications
Technical Notes
FTP Connection and Download Process
- Uses n8n's built-in FTP node for all remote operations
- Supports both FTP and SFTP protocols
- Downloads files as binary data before writing to disk
- Temporary local storage required for import process
Smart Workflow Exclusion
- During workflow restore, the current workflow's name is cleaned and matched against backup files
- This prevents the restore workflow from overwriting itself
- The exclusion logic handles special characters and spaces in workflow names
- A bash command removes the current workflow from the temporary restore folder before import
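A simplified sketch of that exclusion step (the actual node builds the folder and file name dynamically from the Init node and the workflow's own name; the path below is hypothetical):
# Remove the restore workflow's own JSON file from the temporary folder before importing
rm -f "/files/n8n-projects-data/Workflow-backups/2025-01-15-restore-workflows/Restore_workflows_credentials_from_FTP.json"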
Credentials Subfolder Filtering
- The "Filter out Credentials sub-folder" node checks for binary data presence
- Only items with binary data (actual files) proceed to disk write
- Prevents the credentials subfolder from being imported as a workflow
Timezone Handling
- All timestamps use UTC for technical operations
- Display times use local timezone for user-friendly readability
- FTP backup folder scanning works with YYYY-MM-DD format regardless of timezone
Security
- FTP connections should use SFTP or FTPS for encrypted transmission
- Credentials are imported in n8n's encrypted format (encryption preserved)
- Temporary files stored in project-specific folders
- Consider access controls for who can trigger restore operations
- No sensitive credential data is logged in console output
Troubleshooting
Common Issues
- FTP connection fails: Verify FTP credentials are correctly configured and server is accessible
- No backups found: Ensure the FTP_BACKUP_FOLDER path is correct and contains date-formatted folders (YYYY-MM-DD)
- Permission errors: Ensure the Docker user has write access to N8N_PROJECTS_DIR for temporary folders (a possible fix is sketched after this list)
- Path not found: Verify all volume mounts in docker-compose.yml match your project folder location
- Import fails: Check that backup files are in valid n8n export format
- Download errors: Verify the FTP path structure matches the expected format (date folder / credentials subfolder / files)
- Workflow conflicts: The workflow automatically excludes itself, but ensure backup files are properly named
- Credentials not restored: Verify the FTP backup contains an n8n-credentials subfolder with credential files
- Credentials decrypt error: Ensure N8N_ENCRYPTION_KEY matches the source environment
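If permission errors persist, one possible fix on the official Docker image (which runs as the node user) is to re-own the projects directory; the path below is illustrative:
docker-compose exec -u root n8n chown -R node:node /files/n8n-projects-data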
Error Handling
- "Find Last Backup" node has error output configured to catch FTP listing issues
- "Download Workflow Files" node continues on error to handle presence of credentials subfolder
- All critical nodes log detailed error information to console
- Email notifications include stdout and stderr from import commands
Version Compatibility
- Tested with n8n version 1.113.3
- Compatible with Docker-based n8n installations
- Requires n8n CLI access (available in official Docker images)
- Works with any FTP/SFTP server (Synology NAS, dedicated FTP servers, cloud FTP services)
This workflow is designed for FTP/SFTP remote backup restoration. For local disk backups, see the companion workflow "n8n Restore from Disk".
Works best with backups from: "Automated n8n Workflows & Credentials Backup to Local/Server Disk & FTP"
n8n Workflow: Restore Workflows and Credentials from Remote FTP Backup Storage
This n8n workflow provides a manual trigger to initiate the restoration of n8n workflows and credentials from a remote FTP backup. It's designed to help administrators recover their n8n instance configuration in case of data loss or migration.
What it does
This workflow performs the following steps:
- Manual Trigger: The workflow is initiated manually by clicking "Execute workflow" in the n8n editor.
- Retrieve Backup Files from FTP: It connects to a specified FTP server to download backup files.
- Read Backup Data: The downloaded files are then read from the local disk.
- Process Workflows:
  - It filters for workflow backup files (e.g., workflows.json).
  - For each workflow file, it executes a shell command to import the workflow into the n8n instance.
  - If the import command fails, it stops the workflow and sends an error email.
- Process Credentials:
  - It filters for credential backup files (e.g., credentials.json).
  - For each credential file, it executes a shell command to import the credentials into the n8n instance.
  - If the import command fails, it stops the workflow and sends an error email.
- Error Handling and Notification: If any step involving shell command execution fails, the workflow stops, and an email notification is sent to a configured recipient with details about the error.
Prerequisites/Requirements
To use this workflow, you will need:
- n8n Instance: An active n8n instance where this workflow will be imported and executed.
- FTP Server: Access to an FTP server containing your n8n workflow and credential backup files.
- You will need to configure an FTP Credential within n8n.
- SMTP Server: Access to an SMTP server for sending email notifications in case of errors.
- You will need to configure an Email Send Credential within n8n.
- n8n CLI Access: The n8n instance must have access to its own CLI (Command Line Interface) to execute the n8n import:workflow and n8n import:credentials commands. This typically means the n8n user has the necessary permissions on the host system.
- Backup Files: Your n8n workflow and credential backup files (e.g., workflows.json, credentials.json) should be present on the FTP server in a predictable structure.
Setup/Usage
- Import the Workflow:
- Download the JSON content of this workflow.
- In your n8n instance, go to "Workflows" and click "New".
- Click the "Import from JSON" button and paste the workflow JSON.
- Configure Credentials:
- FTP Node (ID: 350): Configure the FTP credential to connect to your backup storage.
- Send Email Node (ID: 11): Configure the Email Send credential for sending error notifications. Update the "To" email address to your desired recipient.
- Customize FTP Paths:
- FTP Node (ID: 350): Adjust the "Remote Path" to point to the directory on your FTP server where the backup files are located.
- Read/Write Files from Disk Node (ID: 1233): Ensure the "File Path" for reading files matches where the FTP node saves them locally.
- Review and Customize Code Nodes (ID: 834):
- The "Code" nodes contain the
n8n import:workflowandn8n import:credentialcommands. Review these commands to ensure they match your n8n CLI setup and the expected filenames of your backup files. - The current implementation assumes
workflows.jsonandcredentials.jsonas filenames. Adjust if your backup files have different names.
- The "Code" nodes contain the
- Execute the Workflow:
- Once configured, click the "Execute workflow" button on the "When clicking ‘Execute workflow’" node (ID: 838) to start the restoration process.
- Monitor the execution for any errors. If an error occurs, an email will be sent.