11 templates found

Automate Figma versioning and Jira updates with n8n webhook integration

How It Works:
This n8n template automates the process of tracking design changes in Figma and updating the relevant Jira issues. The template is triggered when a new version is created in Figma via a custom plugin. Once the version is committed, the plugin sends the design details to an n8n workflow using a webhook (a sample payload sketch follows this description). The workflow then performs the following actions:
- Fetches the Jira issue based on the issue link provided from Figma.
- Adds the design changes as a comment to the Jira issue.
- Updates the status of the Jira issue based on the provided task status (e.g., "In Progress", "Done").

This streamlines the workflow, reducing the need for manual updates and keeping both the design team and developers in sync on the latest design changes and task statuses.

How to Use It:
Set up the Figma plugin:
- Install the Figma Commit Plugin from GitHub.
- In the plugin, fill out the version name, design link, Jira issue link, and the task status.
- Commit the changes in Figma, which will trigger the webhook.

Set up the n8n workflow:
- Import this template into your n8n instance.
- Connect the Figma Trigger node to capture version updates from Figma.
- Configure the Jira nodes to retrieve the issue and update the status/comment based on the data sent from the plugin.

Automate:
Once the version is committed in Figma, the workflow automatically updates the Jira issue and keeps your Figma design and Jira tasks in sync.

By integrating Figma, Jira, and n8n through this template, you'll eliminate manual steps, making collaboration between design and development teams more efficient.
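For reference, a minimal sketch of the kind of payload the Figma plugin might send to the n8n webhook. The field names (versionName, designLink, jiraIssueLink, taskStatus) and the webhook path are illustrative assumptions rather than the plugin's documented schema:

```javascript
// Hypothetical payload sent by the Figma plugin to the n8n webhook URL.
// Field names and URL are illustrative assumptions, not the plugin's documented schema.
const payload = {
  versionName: "v2.3 - Checkout redesign",
  designLink: "https://www.figma.com/file/abc123/Checkout",
  jiraIssueLink: "https://yourcompany.atlassian.net/browse/DES-42",
  taskStatus: "In Progress",
};

fetch("https://your-n8n-instance.example.com/webhook/figma-commit", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify(payload),
}).then((res) => console.log("Webhook responded with", res.status));
```

Inside the workflow, the Jira nodes would then read these fields from the incoming webhook item to locate the issue, post the comment, and set the status.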

By omid dev
5815

Auto-post platform-optimized content to X and Threads with late API and Google Sheets

X (Twitter) and Threads (by Meta) have different maximum character lengths. This n8n template demonstrates how to post different content, optimized for X (Twitter) and Meta Threads, using the Late API. You can use it for any niche. For example: posting AI news to X and Threads.

Possible use cases:
- Schedule your posts to X and Threads.
- Use this workflow as a content calendar and automated posting system.
- Apply it across different content niches.

How it works
- The automation runs according to the time defined in the Schedule Trigger node.
- Content is pulled from Google Sheets.
- Any URL is shortened using your preferred short-URL API.
- Images are uploaded to Late's server first.
- Content for X is posted in Step 2. The workflow checks that the content length is under 280 characters (a sketch of this check follows this description).
- Content for Threads is posted in Step 3. The workflow checks that the content length is under 500 characters.
- Posts on X are published as threaded posts, while on Threads they are single posts.
- Once posted, the Google Sheets content database is updated.

Requirements
- Google OAuth credentials with the Google Sheets API enabled
- Bitly account and access token (or OAuth)
- GetLate API connected to your X and Threads accounts

HOW TO USE
STEP 1
- Adjust the settings in the Schedule Trigger node to define when the workflow runs.
- Open this Google Sheets template, then go to File → Make a copy, and update the settings in the Get Topic node.
- Get your Bitly OAuth or access token and add the credentials in the Short Link node.
- Get your API key from getlate.dev and add the credentials in the Upload IMG node.
STEP 2
- Add your Late credentials to the Post Twitter node.
- Get your Twitter account ID from Late, and update it in the JSON Body section of the Post Twitter node.
STEP 3
- Add your Late credentials to the Post Threads node.
- Get your Threads account ID from Late, and update it in the JSON Body section of the Post Threads node.
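To make the length checks in Steps 2 and 3 concrete, here is a hedged sketch written as an n8n Code node. The input field names (xContent, threadsContent) are assumptions, so adjust them to your Google Sheets columns:

```javascript
// Hypothetical n8n Code node: validate platform character limits before posting.
// The input field names (xContent, threadsContent) are illustrative assumptions.
const X_LIMIT = 280;
const THREADS_LIMIT = 500;

return $input.all().map((item) => {
  const { xContent = "", threadsContent = "" } = item.json;
  return {
    json: {
      ...item.json,
      xOk: xContent.length <= X_LIMIT,                   // safe to post to X
      threadsOk: threadsContent.length <= THREADS_LIMIT, // safe to post to Threads
      xLength: xContent.length,
      threadsLength: threadsContent.length,
    },
  };
});
```

Downstream IF nodes can then route items with xOk or threadsOk set to false back to the sheet for editing instead of posting them.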

By Fariez
2854

Extract customer pain points from support forums with Bright Data & GPT-4

This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automatically monitors customer support forums and Q&A platforms to extract valuable customer insights and pain points. It saves you time by eliminating the need to manually browse through forum discussions and provides structured analysis of customer questions, answers, and recurring issues.

Overview
This workflow automatically scrapes customer support forums like Stack Exchange and SuperUser to find questions and discussions related to specific topics or brands. It uses AI to analyze forum content, extract customer pain points, and identify recurring issues, then sends structured insights directly to your product team via email.

Tools Used
- n8n: The automation platform that orchestrates the workflow
- Bright Data: For scraping forum pages and Q&A platforms without being blocked
- OpenAI: AI agent for intelligent forum content analysis and insight extraction
- Gmail: For sending automated insight reports to your team

How to Install
- Import the Workflow: Download the .json file and import it into your n8n instance
- Configure Bright Data: Add your Bright Data credentials to the MCP Client node
- Set Up OpenAI: Configure your OpenAI API credentials
- Configure Gmail: Connect your Gmail account for sending team notifications
- Customize: Set target forum URLs and define the topics or brands to monitor

Use Cases
- Product Teams: Identify customer pain points and feature requests from forum discussions
- Customer Support: Monitor common issues and questions customers are asking
- Market Research: Understand customer needs and challenges in your industry
- Competitive Analysis: Track how customers discuss competitor products and services

Connect with Me
- Website: https://www.nofluff.online
- YouTube: https://www.youtube.com/@YaronBeen/videos
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Get Bright Data: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

By Yaron Been
2382

Enrich and manage candidates data in Notion

This workflow allows you to add candidates' profile assessments to Notion before an interview.

Prerequisites
- Add an input field on your Calendly invite page where the candidate can enter their LinkedIn URL.
- Create credentials for your Calendly account. Follow the steps mentioned in the documentation to learn how to do that.
- Create credentials for Humantic AI following the steps mentioned here.
- Create a page on Notion similar to this page.
- Create credentials for the Notion node by following the steps in the documentation.

Nodes
- Calendly Trigger node: Triggers the workflow when an interview gets scheduled. Make sure to add a field to collect the candidate's LinkedIn URL on your invite page (a sketch of how that field might be read out of the event follows this description).
- Humantic AI: Uses the LinkedIn URL received from the previous node to create a candidate profile in Humantic AI.
- Humantic AI1: Analyzes the candidate's profile.
- Notion node: Creates a new page in Notion using the information from the previous node.
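As a hedged illustration of how the LinkedIn URL might be read out of the Calendly event, here is a sketch written as an n8n Code node. The payload path (payload.questions_and_answers) and the question label are assumptions, so adjust them to whatever your Calendly Trigger node actually outputs:

```javascript
// Hypothetical n8n Code node: pull the LinkedIn URL from the Calendly invitee payload.
// The exact payload path and question label are assumptions, not the template's
// actual expressions; adjust them to your Calendly Trigger output.
return $input.all().map((item) => {
  const answers = item.json.payload?.questions_and_answers ?? [];
  const linkedin = answers.find((qa) =>
    (qa.question || "").toLowerCase().includes("linkedin")
  );
  return {
    json: { ...item.json, linkedinUrl: linkedin ? linkedin.answer : null },
  };
});
```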

By Harshil Agrawal
2008

Automate HR Q&A sessions with AI question clustering and Google Calendar integration

This workflow helps HR teams run smoother monthly Q&A sessions with employees.

Who's it for
HR teams and managers who want to centralize employee questions, avoid duplicates, and keep meetings focused.

How it works
- Employees submit questions through a styled form.
- Questions are stored in a database.
- HR selects a date range to review collected questions.
- An AI Agent deduplicates and clusters similar questions, then generates a meeting script in Markdown format.
- The Agent automatically creates a Google Calendar event (with a Google Meet link) on the last Friday of the current month at 16:00–17:00 (a sketch of this scheduling rule follows this description).
- The script is returned as a downloadable .txt file for HR to guide the session.

Requirements
- MySQL (or compatible DB) for storing questions
- Google Calendar credentials
- OpenAI (or another supported LLM provider)

How to customize
- Adjust the meeting day/time in the Set node expressions
- Change the database/table name in the MySQL nodes
- Modify the clustering logic in the AI Agent prompt
- Replace the form styling with your company's branding

This template ensures no repeated questions, keeps HR better prepared with a structured script, and automates meeting scheduling in just one click.
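For reference, a minimal sketch of the scheduling rule described above (last Friday of the current month, 16:00 to 17:00), written in plain JavaScript. This is an illustration, not the template's actual Set node expression:

```javascript
// Minimal sketch: compute the last Friday of the current month, 16:00-17:00 local time.
function lastFridayOfMonth(date = new Date()) {
  // Day 0 of next month is the last day of this month.
  const lastDay = new Date(date.getFullYear(), date.getMonth() + 1, 0);
  // Walk back to the nearest Friday (getDay() === 5).
  const offset = (lastDay.getDay() - 5 + 7) % 7;
  lastDay.setDate(lastDay.getDate() - offset);
  return lastDay;
}

const start = lastFridayOfMonth();
start.setHours(16, 0, 0, 0);
const end = new Date(start.getTime() + 60 * 60 * 1000); // one hour later

console.log(start.toISOString(), "to", end.toISOString());
```

The same logic could live in a Set node expression or a small Code node feeding the Google Calendar node's start/end fields.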

By Gabriel Santos
1924

Beginner data analysis: merge, filter & summarize in Google Sheets with GPT-4o

This beginner-friendly n8n workflow teaches essential data manipulation techniques using Google Sheets and AI. You'll learn how to:
- ✅ Merge two datasets by a shared column (Channel)
- 🔍 Filter rows based on performance metrics (Clicks, Spend)
- 🔀 Branch logic into "Great" vs. "Poor" outcomes (a minimal sketch follows this description)
- 📊 Summarize results by team leader
- 🤖 Use an OpenAI-powered agent to generate a written analysis highlighting the best and worst performers

Perfect for marketers, analysts, or anyone learning how to clean, transform, and interpret data inside n8n.

Includes:
- 📁 Sample Google Sheet to copy
- 🛠 Setup instructions for Google Sheets & OpenAI
- ✨ AI summary powered by GPT-4o-mini

👋 Questions or Feedback?
Feel free to reach out — I'm happy to help!
Robert Breen, Founder, Ynteractive
🌐 ynteractive.com
📧 robert@ynteractive.com
📺 YouTube: YnteractiveTraining
🔗 LinkedIn: linkedin.com/in/robertbreen
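To make the branching step concrete, here is a minimal sketch written as an n8n Code node. The threshold values and the exact "Great"/"Poor" rule are assumptions for illustration; the template's real IF-node conditions may differ:

```javascript
// Minimal sketch of the filter/branch step as an n8n Code node.
// Thresholds and the labeling rule are illustrative assumptions.
const CLICK_THRESHOLD = 1000;
const SPEND_THRESHOLD = 500;

return $input.all().map((item) => {
  const { Channel, Clicks = 0, Spend = 0 } = item.json;
  const outcome =
    Clicks >= CLICK_THRESHOLD && Spend <= SPEND_THRESHOLD ? "Great" : "Poor";
  return { json: { Channel, Clicks, Spend, Outcome: outcome } };
});
```

In the actual workflow the same decision would typically be an IF node, with the two output branches feeding separate "Great" and "Poor" summary paths.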

By Robert Breen
1639

AI invoice agent

---
📄 AI Invoice Agent

The AI Invoice Agent automates the invoice creation, email delivery, and status tracking process for client billing. It ensures invoices are generated, sent professionally, and updated in Google Sheets with minimal manual work.

---
🔹 How It Works
1. Trigger: Activated manually (Execute Workflow) when you want to process invoices.
2. Fetch Invoices: Reads client invoice data from a Google Sheet (Client Invoices).
3. Filter Pending Invoices: Passes through only invoices with Status = Pending.
4. Prepare Invoice Data: Collects and formats details: Invoice ID, Client Name & Address, Project Name, Amount (USD), Invoice Date (today's date), Due Date (7 days later). A sketch of this step follows this description.
5. Loop Over Invoices: Processes each invoice one by one.
6. AI Email Draft: Uses GPT-4.1-mini to generate a polite, professional email. Tone: friendly but business-oriented. Signed as Upward Engine Team.
7. Extract Email Parts: Separates subject and body from the AI output using an Information Extractor.
8. Generate Invoice PDF: Uses CraftMyPDF to create a formatted invoice PDF with company details (Upward Engine), client details, Invoice ID, Date, Due Date, the amount due, and a footer message.
9. Send Email to Client: Sends the invoice email via Gmail, attaching the PDF invoice.
10. Update Invoice Status: Updates Google Sheets to mark the invoice as Completed. Saves Invoice ID, Date, Due Date, and the updated status.
11. Loop Continuation: Continues until all pending invoices are processed.

---
🔹 Tools & Integrations
- Google Sheets → Stores client & invoice data
- Filter Node → Selects only Pending invoices
- GPT-4.1-mini (OpenAI) → Generates professional emails
- Information Extractor → Separates subject & body
- CraftMyPDF → Creates PDF invoices
- Gmail → Sends invoice emails with PDF attachments

---
🔹 Example Workflow
✅ Google Sheets: Invoice marked as Pending
➡️ AI generates email → "Invoice INV-1023 for Web Design Project – Due Sep 5"
➡️ PDF invoice created & attached
➡️ Email sent to client with subject + body
➡️ Status updated in Google Sheet → Completed

---
⚡ This agent ensures zero missed invoices, professional client communication, and up-to-date tracking — fully automated for agencies and small businesses.
---
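As a hedged illustration of the "Prepare Invoice Data" step, here is a sketch written as an n8n Code node. The sheet column names and output field names are assumptions, so map them to your actual Client Invoices sheet:

```javascript
// Minimal sketch of the "Prepare Invoice Data" step as an n8n Code node.
// Column and field names are illustrative assumptions, not the template's exact schema.
return $input.all().map((item) => {
  const today = new Date();
  const due = new Date(today);
  due.setDate(due.getDate() + 7); // due date is 7 days after the invoice date

  const fmt = (d) => d.toISOString().slice(0, 10); // YYYY-MM-DD

  return {
    json: {
      invoiceId: item.json["Invoice ID"],
      clientName: item.json["Client Name"],
      clientAddress: item.json["Client Address"],
      projectName: item.json["Project Name"],
      amountUsd: item.json["Amount (USD)"],
      invoiceDate: fmt(today),
      dueDate: fmt(due),
    },
  };
});
```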

By Rakin Jakaria
1337

Multi-LLM customer support chatbot for WordPress & webhook integrations

AI Chat Bot workflow for WordPress & Webhook Live Chats

This workflow powers a versatile AI chatbot that can be integrated into any live chat interface, such as our free Forerunner™ AI Chat Bot for WordPress. It's designed to automate customer support and lead generation by handling a variety of user queries independently.

The setup process is straightforward and typically takes less than five minutes. It involves connecting your preferred Large Language Model (LLM) and a live chat platform to the workflow via webhooks.

How the Workflow Works
The core of this workflow is an AI Agent that acts as the brain of the chatbot. It processes user input and generates responses based on predefined rules and your chosen language model.
- User Input: When a user sends a message through your live chat, it's sent to the workflow via a webhook. The message is then passed to the AI Agent for processing.
- AI Response Generation: The AI Agent analyzes the message, retrieves relevant conversational history from the Simple Memory node to maintain context, and uses the selected Large Language Model (e.g., OpenAI, Gemini, or Claude) to formulate a response.
- Conditional Logic: After the response is generated, the workflow uses an If node to check whether the conversation should end. If the response contains the specific tag [ENDOFCONVERSATION], the workflow prepares to end the chat; otherwise, the conversation continues (a sketch of this check follows this description).
- Send to Client: The final response is sent back to the live chat interface, where it is displayed to the user. This completes the loop, allowing the chatbot to engage in a continuous conversation until the task is complete.
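For reference, a minimal sketch of the end-of-conversation check as an n8n Code node. The field name holding the agent's reply (output) is an assumption and may differ in the actual workflow:

```javascript
// Minimal sketch of the [ENDOFCONVERSATION] check described above.
// The field name `output` for the agent's reply is an illustrative assumption.
const END_TAG = "[ENDOFCONVERSATION]";

return $input.all().map((item) => {
  const reply = item.json.output ?? "";
  return {
    json: {
      // Strip the control tag before the text is sent back to the live chat.
      reply: reply.replace(END_TAG, "").trim(),
      endConversation: reply.includes(END_TAG),
    },
  };
});
```

An If node can then branch on endConversation to either close the session or keep looping.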

By Design for Online
1330

Track US fintech & healthtech funding rounds: Crunchbase to Google Sheets

Track US Fintech & Healthtech Funding Rounds: Crunchbase to Google Sheets

🌍 Overview
This workflow fetches the latest funding rounds from Crunchbase (filtered by industry + location), formats the results, and saves them neatly into Google Sheets every day. You end up with a live deal flow tracker that updates itself!

---
🟢 Section 1: Schedule & Data Fetch
🔗 Nodes:
1️⃣ Daily Check for New Funding Rounds (Schedule Trigger) ⏰
Runs automatically every morning at 8 AM. Why it's useful: you don't need to run it manually.
2️⃣ Fetch Crunchbase Funding Rounds (HTTP Request) 🌐
Pulls funding rounds from Crunchbase's API. Filters: 📍 Location → United States · 🏭 Industry → Fintech, Healthtech · ⏳ Sorted → newest first · 📄 Limit → 25 per run
💡 Beginner Benefit:
✅ No coding required to hit the Crunchbase API
✅ Automated — always fetches fresh funding data

---
🔵 Section 2: Extract & Format
🔗 Nodes:
3️⃣ Extract & Format Funding Data (Code) 📑
Converts Crunchbase API JSON into clean, readable rows (a sketch of this node follows this description) with:
- 🏢 Company Name
- 🏭 Industry
- 💵 Money Raised (USD)
- 📅 Announced Date
- 🏷️ Funding Round Type
- 👥 Investors
- 🔗 Crunchbase URL
💡 Beginner Benefit:
✅ No messy JSON → clean structured output
✅ Auto-creates a link to each funding round

---
🟣 Section 3: Save to Sheets
🔗 Nodes:
4️⃣ Save to Google Sheets 📊
Appends the formatted funding round data into your Google Sheet. Columns: Company, Industry, Investors, Amount, Date, URL, etc.
💡 Beginner Benefit:
✅ Data goes directly into Google Sheets → no copy-paste
✅ You can filter, chart, or connect Sheets to dashboards

---
📊 Final Overview

| Section | What Happens | Why It's Useful |
| --- | --- | --- |
| 🟢 Fetch | Scheduler + API fetch | Always pulls new Crunchbase deals automatically |
| 🔵 Format | Extract + clean JSON | Turns raw API data into readable rows |
| 🟣 Save | Google Sheets | Creates your own funding tracker sheet |

---
🚀 Why This Workflow is a Game-Changer
⏱️ Zero manual work → Wake up to fresh funding data daily
📊 Deal flow in Sheets → Perfect for VCs, founders, analysts
🔍 Customizable filters → Change the location, industry, or number of results
🔗 Action-ready → Use Sheets to track trends, reach out to investors, or monitor competitors

---
✨ With this workflow, you've basically built your own Crunchbase alerts dashboard — no coding required!
---
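To illustrate what the "Extract & Format Funding Data" Code node does, here is a hedged sketch. The response paths (entities[].properties.*) are assumptions about the Crunchbase API shape and should be adjusted to match what the HTTP Request node actually returns:

```javascript
// Minimal sketch of the "Extract & Format Funding Data" Code node.
// The response paths below are assumptions about the Crunchbase API response,
// not the template's exact mapping; adjust them to the real HTTP Request output.
const rounds = $input.first().json.entities ?? [];

return rounds.map((round) => {
  const p = round.properties ?? {};
  return {
    json: {
      company: p.funded_organization_identifier?.value ?? "",
      moneyRaisedUsd: p.money_raised?.value_usd ?? null,
      announcedDate: p.announced_on ?? "",
      roundType: p.investment_type ?? "",
      investors: (p.investor_identifiers ?? []).map((i) => i.value).join(", "),
      url: `https://www.crunchbase.com/funding_round/${round.uuid ?? ""}`,
    },
  };
});
```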

By Yaron Been
517

Automatic email invoice archiving & data extraction with Gmail, Drive & AI

Automated Invoice Archiving

Automatically fetch, store, and extract key information from invoices received via email from your ISP or utility provider (electricity, gas, telecom, water, etc.). The workflow saves the invoices to Google Drive (or optionally to your personal FTP/SFTP server) and logs all invoice details into Google Sheets via AI-powered extraction.

Read: Full setup guide

How it works
1. Scheduled Trigger: Runs the workflow at a selected interval (e.g., every hour). You can freely adjust the timing.
2. Gmail – Fetch Messages: Reads your Gmail inbox and retrieves only messages coming from your ISP/utility provider's email address, filtering for messages with PDF attachments.
3. Gmail – Download Invoice: Fetches the full email content and downloads the attached invoice (PDF).
4. Google Drive – Upload File: Uploads the invoice into a specific Google Drive folder of your choice.
5. (Optional) Upload to FTP/SFTP: Sends a copy of the invoice to your personal server via secure FTP/SFTP.
6. AI Extraction Pipeline:
   - Extract PDF Text – converts the PDF into text (OCR not required if text-based).
   - AI Agent (OpenRouter) – understands the invoice content and extracts structured fields (invoice number, date, provider name, total amount, tax info, line items, etc.).
   - Code Node – sanitizes and parses the JSON from the AI model.
7. Google Sheets – Append Invoice Data: Adds a new row to your Google Sheet with all parsed invoice fields.
8. (Optional) Cleanup: Automatically deletes the Gmail message and the temporary file in Google Drive. (Useful when you only want your FTP or Sheets copy.)

Parameters to configure

| Parameter | Description | Recommended configuration |
| --- | --- | --- |
| Gmail Credentials | OAuth2 credentials needed to read and delete emails. | Create OAuth credentials on Google Cloud → enable Gmail API → paste Client ID & Secret into n8n → "Connect OAuth2". |
| Sender Email Filter | Email address your provider uses to send invoices. | Example: billing@your-isp.com, invoices@utility.it, ciao@octopusenergy.it |
| Google Drive Folder | Destination folder for saving invoices. | Copy the folder ID from the Drive URL and paste it into folderId. |
| Google Drive Credentials | OAuth2 connection for file uploads/deletions. | Same Google Cloud project → enable Drive API → OAuth connect in n8n. |
| FTP/SFTP Server (optional) | Upload invoices to your private server. | Host / IP · Port · Username · Password or SSH key · Destination path (e.g. /home/user/invoices/). |
| AI Model (OpenRouter) | Large-language model used to parse invoice text. | Example: gpt-4.1, llama-3.1, or any preferred OpenRouter model. |
| Google Sheets Document | Destination spreadsheet for structured data. | Create a Sheet → add columns (Vendor, Invoice Number, Date, Amount, Service Type, etc.) → insert documentId & sheet name. |
| Sheets Credentials (Service Account) | Used for writing into Google Sheets. | Create Service Account → download JSON → add to n8n → share the Sheet with the Service Account email. |
| Trigger Interval | How often the workflow checks for new invoices. | Every hour · every 30 minutes · daily at a set time. |

Node-by-node breakdown
- Schedule Trigger: Runs at the interval you choose (default: hourly) and triggers the entire workflow.
- Gmail – Get Many Messages: Filters inbox items by sender email (your ISP/utility address), "has attachment", and unread or recent messages. Downloads metadata + attachment references.
- Filter – Contains Attachment: Ensures only messages with binary attachments continue.
- Gmail – Get Invoice: Downloads the full email JSON and the invoice PDF (binary data).
- Google Drive – Upload File: Uploads the invoice PDF with a dynamic filename: {{ $json.from.value[0].name }}-{{ $json.date }}.pdf. Requires Google Drive OAuth2 credentials and the folder ID (destination directory).
- HTTP Request – Download File: Retrieves the raw PDF file from Google Drive for further processing.
- (Optional) FTP/SFTP Upload: Uploads the PDF to your server using host / IP, port (default 22), username, password or private key, and destination path. The filename is sanitized to ensure Unix compatibility.
- (Optional) Delete Temporary File: Deletes the Google Drive file if you don't want duplicates.
- (Optional) Delete Gmail Message: Removes the original email once processed (optional inbox cleanup).
- Extract from File (PDF → Text): Reads the PDF and extracts raw text for AI processing.
- OpenRouter Chat Model: LLM backend for the AI agent. Provides invoice parsing, field extraction, and structured reasoning.
- AI Agent – Extract Invoice Fields: The agent is instructed to return strict JSON only, containing keys such as vendor_name, invoice_number, invoice_date, total_amount, tax_details, line_items[], po_number, po_date. Works for most standard PDF invoices.
- Code – Clean & Parse JSON: Sanitizes the AI output: removes markdown fences, extracts valid JSON, and parses it into a clean JS object. If the AI output is malformed, debugging info is returned. (See the sketch after this description.)
- Google Sheets – Append Data: Appends the extracted fields into a structured row. Example mappings: Vendor → {{ $json.vendor_name }}, Invoice Number → {{ $json.invoice_number }}, Date → {{ $json.invoice_date }}, Amount → {{ $json.total_amount }}, Service Type → {{ $json.line_items[0].description }}

💡 Tips & best practices
- Add multiple sender filters if you have more than one utility provider.
- Ensure invoices are text-based PDFs for best extraction results.
- Use Google Drive as a reliable long-term archive, or keep only FTP if you prefer local storage.
- Create charts in Google Sheets for tracking: monthly utility cost trends, year-over-year comparison, and consumption spikes (if included in invoices).

⚠️ Important notes
- Utility invoices contain personal and financial data. Keep your FTP/SFTP server secure.
- Google APIs require proper OAuth2 or Service Account setup; misconfiguration may cause permission errors.
- This workflow is for personal automation, not a replacement for official fiscal archiving.
- AI extraction quality depends on invoice formatting and the model you choose.
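For illustration, a minimal sketch of the "Code – Clean & Parse JSON" node described above. The input field name (output) is an assumption and may differ from the template's actual AI Agent output field:

```javascript
// Hypothetical n8n Code node: strip markdown fences from the AI output and parse JSON.
// The input field name (`output`) is an illustrative assumption.
const raw = $input.first().json.output ?? "";

// Remove ```json ... ``` fences if the model wrapped its answer in them.
const cleaned = raw.replace(/```(?:json)?/gi, "").trim();

// Grab the first {...} block in case the model added extra prose around the JSON.
const match = cleaned.match(/\{[\s\S]*\}/);

try {
  return [{ json: JSON.parse(match ? match[0] : cleaned) }];
} catch (err) {
  // Return debugging info instead of failing the whole workflow.
  return [{ json: { parseError: err.message, rawOutput: raw } }];
}
```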

By Paolo Ronco
328

Template-based Google Drive folder generation with Forms and Apps Script

Overview
Stop manually creating folder structures for every new client or project. This workflow provides a simple form where users enter a name, and automatically duplicates your template folder structure in Google Drive—replacing all placeholders with the submitted name.

What This Workflow Does
- Displays a form where users enter a name (client, project, event, etc.)
- Creates a new main folder in Google Drive
- Calls Google Apps Script to duplicate your entire template structure
- Replaces all {{NAME}} placeholders in files and folder names

Key Features
- Simple form interface — No technical knowledge required to use
- Recursive duplication — Copies all subfolders and files
- Smart placeholders — Automatically replaces {{NAME}} everywhere
- Production-ready — Works immediately after setup

Prerequisites
- Google Drive account with OAuth2 credentials in n8n
- Google Apps Script deployment (code below)
- Template folder in Drive using {{NAME}} as placeholder

Setup

Step 1: Create your template folder
📁 {{NAME}} - Project Files
├── 📁 01. {{NAME}} - Documents
├── 📁 02. {{NAME}} - Assets
├── 📁 03. Deliverables
└── 📄 {{NAME}} - Brief.gdoc

Step 2: Deploy Apps Script
1. Go to script.google.com
2. Create new project → Paste code below
3. Deploy → New deployment → Web app
4. Execute as: Me | Access: Anyone
5. Copy the deployment URL

Step 3: Configure workflow
Replace these placeholders:
- DESTINATION_PARENT_FOLDER_ID — Where new folders are created
- YOUR_APPS_SCRIPT_URL — URL from Step 2
- YOUR_TEMPLATE_FOLDER_ID — Folder to duplicate

Step 4: Test
Activate workflow → Open form URL → Submit a name → Check Drive!

---

Apps Script Code

```javascript
function doPost(e) {
  try {
    var params = e.parameter;
    var templateFolderId = params.templateFolderId;
    var name = params.name;
    var destinationFolderId = params.destinationFolderId;

    if (!templateFolderId || !name) {
      return jsonResponse({ success: false, error: 'Missing required parameters: templateFolderId and name are required' });
    }

    var templateFolder = DriveApp.getFolderById(templateFolderId);

    if (destinationFolderId) {
      // Copy the template contents into an existing destination folder.
      var destinationFolder = DriveApp.getFolderById(destinationFolderId);
      copyContentsRecursively(templateFolder, destinationFolder, name);
      return jsonResponse({
        success: true,
        id: destinationFolder.getId(),
        url: destinationFolder.getUrl(),
        name: destinationFolder.getName(),
        mode: 'copied_to_existing',
        timestamp: new Date().toISOString()
      });
    } else {
      // Create a new folder next to the template and copy everything into it.
      var parentFolder = templateFolder.getParents().next();
      var newFolderName = replacePlaceholders(templateFolder.getName(), name);
      var newFolder = parentFolder.createFolder(newFolderName);
      copyContentsRecursively(templateFolder, newFolder, name);
      return jsonResponse({
        success: true,
        id: newFolder.getId(),
        url: newFolder.getUrl(),
        name: newFolder.getName(),
        mode: 'created_new',
        timestamp: new Date().toISOString()
      });
    }
  } catch (error) {
    return jsonResponse({ success: false, error: error.toString() });
  }
}

// Replace {{NAME}}, {{name}}, and {{Name}} placeholders with the submitted name.
function replacePlaceholders(text, name) {
  var result = text;
  result = result.replace(/\{\{NAME\}\}/g, name);
  result = result.replace(/\{\{name\}\}/g, name.toLowerCase());
  result = result.replace(/\{\{Name\}\}/g, name);
  return result;
}

// Recursively copy every file and subfolder, renaming as it goes.
function copyContentsRecursively(sourceFolder, destinationFolder, name) {
  var files = sourceFolder.getFiles();
  while (files.hasNext()) {
    try {
      var file = files.next();
      var newFileName = replacePlaceholders(file.getName(), name);
      file.makeCopy(newFileName, destinationFolder);
      Utilities.sleep(150); // throttle to avoid Drive rate limits
    } catch (error) {
      Logger.log('Error copying file: ' + error.toString());
    }
  }

  var subfolders = sourceFolder.getFolders();
  while (subfolders.hasNext()) {
    try {
      var subfolder = subfolders.next();
      var newSubfolderName = replacePlaceholders(subfolder.getName(), name);
      var newSubfolder = destinationFolder.createFolder(newSubfolderName);
      Utilities.sleep(200);
      copyContentsRecursively(subfolder, newSubfolder, name);
    } catch (error) {
      Logger.log('Error copying subfolder: ' + error.toString());
    }
  }
}

function jsonResponse(data) {
  return ContentService
    .createTextOutput(JSON.stringify(data))
    .setMimeType(ContentService.MimeType.JSON);
}
```

---

Use Cases
- Agencies — Client folder structure on new signup
- Freelancers — Project folders from intake form
- HR Teams — Employee onboarding folders
- Schools — Student portfolio folders
- Event Planners — Event documentation folders

Notes
- Apps Script may take 60+ seconds for large structures
- Timeout is set to 5 minutes for complex templates
- Your Google account needs edit access to the template and destination folders

By Antonio Gasso
88