
Sync multi-bank balance data to BigQuery using Plaid

Fahmi Fahreza
2/3/2026

Automated Multi-Bank Balance Sync to BigQuery

This workflow automatically fetches balances from multiple financial institutions (RBC, Amex, Wise, PayPal) using Plaid, maps them to QuickBooks account names, and loads structured records into Google BigQuery for analytics.

Who's it for?

Finance teams, accountants, and data engineers managing consolidated bank reporting in Google BigQuery.

How it works

  1. The Schedule Trigger runs weekly.
  2. Four Plaid API calls fetch balances from RBC, Amex, Wise, and PayPal.
  3. Each response is split into individual accounts, which are mapped to QuickBooks account names.
  4. All accounts are merged into one dataset.
  5. The workflow structures the account data, generates UUIDs, and formats SQL insert values (see the sketch after this list).
  6. BigQuery node uploads the finalized records.
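
As an illustration of steps 3 to 5, here is a minimal sketch of what the structuring Code node might look like. The QuickBooks name mapping, field names, and output shape are assumptions for illustration only, not the template's actual code.

```javascript
// Hypothetical Code node (JavaScript, "Run Once for All Items").
// The mapping table and output fields below are placeholders; adjust them
// to your own QuickBooks account names and BigQuery schema.
const { randomUUID } = require('crypto'); // built-in module; must be allowed on self-hosted n8n

// Example mapping from Plaid account names to QuickBooks account names.
const qbNames = {
  'RBC Chequing': 'RBC Operating',
  'Platinum Card': 'Amex Platinum',
  'Wise USD': 'Wise USD Balance',
  'PayPal Balance': 'PayPal Clearing',
};

const rows = [];
for (const item of $input.all()) {
  // Plaid's /accounts/balance/get returns an `accounts` array per institution.
  for (const account of item.json.accounts ?? []) {
    rows.push({
      id: randomUUID(),                                    // unique row id
      quickbooks_account: qbNames[account.name] ?? account.name,
      plaid_account_id: account.account_id,
      current_balance: account.balances.current,
      currency: account.balances.iso_currency_code,
      fetched_at: new Date().toISOString(),
    });
  }
}

// One n8n item per BigQuery row.
return rows.map(r => ({ json: r }));
```

The BigQuery table columns would need to match these field names. Depending on how the BigQuery node is configured, the rows can either be inserted directly or formatted into a SQL INSERT statement (see the sketch at the end of the Setup/Usage section).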

How to set up

Add Plaid and Google BigQuery credentials, replace client IDs and secrets with variables, test each connection, and schedule the trigger for your reporting cadence.

n8n Workflow: Sync Multi-Bank Balance Data to BigQuery Using Plaid

This n8n workflow automates the process of fetching balance data for multiple bank accounts via the Plaid API, processing the data, and then syncing it to Google BigQuery. It's designed to provide a scheduled, consolidated view of financial balances across various institutions.

What it does

This workflow performs the following key steps:

  1. Triggers on a Schedule: The workflow starts automatically on a predefined schedule (e.g., daily, hourly).
  2. Fetches Plaid Access Tokens: It makes an HTTP request to retrieve a list of Plaid access tokens. These tokens are essential for accessing financial data for different bank accounts.
  3. Processes Each Access Token: For each access token retrieved:
    • It makes an HTTP request to the Plaid API's /accounts/balance/get endpoint, using the access token to fetch the current balance data for the associated bank accounts (a request sketch follows this list).
    • It then transforms the raw Plaid balance data using a Code node to extract relevant information and format it for BigQuery. This likely includes flattening nested structures and selecting specific fields.
  4. Consolidates Data: The processed balance data from all access tokens is merged into a single dataset.
  5. Uploads to Google BigQuery: The consolidated and formatted balance data is then inserted or updated into a specified table in Google BigQuery.
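
For illustration, the balance request in step 3 might be configured along these lines. The expression syntax and variable names are assumptions about how credentials are stored; in practice the Client ID and Secret should come from n8n credentials or variables rather than being hard-coded.

```javascript
// Hypothetical configuration of the "Get Plaid Balance Data" HTTP Request node,
// expressed as a plain object. {{ ... }} is n8n expression syntax; the variable
// names are placeholders for wherever you keep the Plaid credentials.
const plaidBalanceRequest = {
  method: 'POST',
  url: 'https://production.plaid.com/accounts/balance/get',
  headers: { 'Content-Type': 'application/json' },
  body: {
    client_id: '{{ $vars.PLAID_CLIENT_ID }}',
    secret: '{{ $vars.PLAID_SECRET }}',
    access_token: '{{ $json.access_token }}', // from the "Fetch Plaid Access Tokens" node
  },
};
```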

Prerequisites/Requirements

To use this workflow, you will need:

  • n8n Instance: A running n8n instance to import and execute the workflow.
  • Plaid Account: An active Plaid account with access to the Plaid API.
    • You will need Plaid API keys (Client ID and Secret).
    • You will need to have generated and stored Plaid access_tokens for the bank accounts you wish to monitor. The workflow expects an external mechanism to provide these tokens (e.g., another API endpoint or a static list).
  • Google Cloud Project: A Google Cloud Project with the BigQuery API enabled.
  • Google BigQuery Credentials: An n8n Google BigQuery credential configured (likely using a Service Account or OAuth 2.0) with permissions to write to your BigQuery dataset and table.
  • Google BigQuery Dataset and Table: A pre-existing BigQuery dataset and table configured to receive the balance data. The table schema should match the output format of the Code node.
  • HTTP Endpoint for Plaid Tokens: An accessible HTTP endpoint that returns a list of Plaid access tokens (an example response shape is sketched below).
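
Since the token source is external to the workflow, its exact format is up to you. A minimal example of a response shape that would work with the HTTP Request and Code nodes described above (institution names and token values are placeholders):

```javascript
// Hypothetical response from the access-token endpoint; one token per
// linked institution. Token values shown are placeholders.
const exampleTokenResponse = [
  { institution: 'RBC',    access_token: 'access-production-xxxx' },
  { institution: 'Amex',   access_token: 'access-production-xxxx' },
  { institution: 'Wise',   access_token: 'access-production-xxxx' },
  { institution: 'PayPal', access_token: 'access-production-xxxx' },
];
```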

Setup/Usage

  1. Import the Workflow:
    • Download the provided JSON workflow definition.
    • In your n8n instance, go to "Workflows" and click "New".
    • Click the three-dot menu in the top right and select "Import from JSON".
    • Paste the workflow JSON and click "Import".
  2. Configure Credentials:
    • Google BigQuery: Set up a Google BigQuery credential in n8n.
    • Plaid API: The HTTP Request nodes for Plaid will need to be configured with your Plaid Client ID and Secret, typically as headers or part of the request body, depending on your Plaid API version and authentication method.
  3. Update Node Settings:
    • Schedule Trigger (Node 839): Adjust the cron expression or interval to your desired schedule for data synchronization.
    • HTTP Request (Node 19 - "Fetch Plaid Access Tokens"):
      • Update the URL to your endpoint that provides Plaid access tokens.
      • Configure any necessary authentication (e.g., API keys, basic auth) for this endpoint.
    • HTTP Request (Node 19 - "Get Plaid Balance Data"):
      • Ensure the URL points to the Plaid /accounts/balance/get endpoint.
      • Configure the request body to include the access_token from the previous node and your Plaid Client ID and Secret.
    • Code (Node 834 - "Process Plaid Data"): Review and modify the JavaScript code if your Plaid data structure or desired BigQuery schema differs. This node is crucial for transforming the data.
    • Google BigQuery (Node 479):
      • Select your configured Google BigQuery credential.
      • Specify your Google Cloud Project ID, BigQuery Dataset ID, and Table ID.
      • Choose the appropriate operation (e.g., Insert All, or Execute Query with a formatted SQL insert; see the sketch after these steps) and configure how the data should be mapped to your BigQuery table columns.
  4. Activate the Workflow: Once all configurations are complete, activate the workflow to start scheduled data synchronization.
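
If you prefer the "formats SQL inserts" approach from the overview rather than the Insert All operation, a follow-up Code node could build a single INSERT statement for BigQuery's Execute Query operation. The project, dataset, table, and column names below are placeholders, and the naive quoting is only a sketch; adapt it to your schema and sanitize values properly.

```javascript
// Hypothetical Code node that turns structured rows into one BigQuery INSERT.
// Project, dataset, table, and column names are placeholders.
const rows = $input.all().map(i => i.json);

const values = rows
  .map(r =>
    `('${r.id}', '${String(r.quickbooks_account).replace(/'/g, "\\'")}', ` +
    `${Number(r.current_balance)}, '${r.currency}', CURRENT_TIMESTAMP())`
  )
  .join(',\n  ');

const sql = `INSERT INTO \`my_project.finance.bank_balances\`
  (id, quickbooks_account, current_balance, currency, fetched_at)
VALUES
  ${values};`;

// The BigQuery node (Execute Query operation) can then run {{ $json.sql }}.
return [{ json: { sql } }];
```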

Related Templates

  • Generate song lyrics and music from text prompts using OpenAI and Fal.ai Minimax (by Daniel Nkencho)
  • Auto-reply & create Linear tickets from Gmail with GPT-5, gotoHuman & human review (by gotoHuman)
  • Dynamic Hubspot lead routing with GPT-4 and Airtable sales team distribution (by MANISH KUMAR)