14 templates found

Export n8n Cloud execution data to CSV

Overview
This template helps n8n Cloud plan users export all executions to a CSV for easy data analysis. Identify which workflows generate the most executions or could be optimized.

How this workflow works
- Click "Test Workflow" to manually execute the workflow
- Open the "Convert to CSV" node to access the binary data of the CSV file
- Download the CSV file

Nodes included:
- n8n node
- Convert to File
- No Operation, do nothing (replace with another node)

Set up steps
- Import the workflow to your workspace
- Add your n8n API credential

Benefits of Exporting n8n Cloud Executions to CSV
Exporting n8n Cloud executions to CSV offers significant advantages for workflow management and data analysis. Here are three key benefits:

Enhanced Data Analysis
- Comprehensive insights: Exporting execution data allows for in-depth analysis of workflow performance, helping identify bottlenecks and optimize processes.
- Custom reporting: CSV files can be easily imported into various data analysis tools (e.g., Excel, Google Sheets, or BI software) to create custom reports and visualizations tailored to specific business needs.

Improved Workflow Monitoring
- Historical data review: Accessing historical execution data enables users to track workflow changes and their impacts over time, facilitating better decision-making.
- Error tracking and debugging: By reviewing execution logs, users can quickly identify and address errors or failures, ensuring smoother and more reliable workflow operations.

Regulatory Compliance and Auditing
- Audit trails: Keeping a record of all executions provides a clear audit trail, essential for regulatory compliance and internal audits.
- Data retention: Exported data ensures that execution records are preserved according to organizational data retention policies, safeguarding against data loss.

By leveraging CSV exports, users can gain valuable insights, streamline workflow management, and ensure robust data handling practices, ultimately driving better performance and efficiency in their n8n Cloud operations.
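
For readers who want to see what the n8n node and Convert to File pair automate, here is a minimal sketch that pulls executions from the n8n public API and flattens them into CSV rows. The base URL, API-key environment variable, and chosen columns are assumptions for illustration; check the API's execution schema for the exact field names available on your instance.

```typescript
// Hedged sketch: fetch recent executions from the n8n public API and emit CSV.
// N8N_BASE_URL and N8N_API_KEY are placeholder environment variables.
const baseUrl = process.env.N8N_BASE_URL ?? "https://your-instance.app.n8n.cloud";
const apiKey = process.env.N8N_API_KEY ?? "";

async function exportExecutionsToCsv(): Promise<string> {
  const res = await fetch(`${baseUrl}/api/v1/executions?limit=100`, {
    headers: { "X-N8N-API-KEY": apiKey },
  });
  if (!res.ok) throw new Error(`n8n API returned ${res.status}`);
  const { data } = (await res.json()) as { data: Array<Record<string, unknown>> };

  // Column choice is illustrative; adjust to the fields you want to analyze.
  const header = ["id", "workflowId", "status", "startedAt", "stoppedAt"];
  const rows = data.map((e) => header.map((k) => String(e[k] ?? "")).join(","));
  return [header.join(","), ...rows].join("\n");
}

exportExecutionsToCsv().then((csv) => console.log(csv));
```

In the template itself the same result is produced inside n8n, with the Convert to File node handling the CSV serialization.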

By Ludwig
4207

Slack slash commands AI chat bot

This is a chatbot that responds in public channels through slash commands. I explain it in more detail in the YouTube video, but it's only available in Korean.

How it works
When you request the created slash command in Slack, the request comes to the webhook. The Switch node then branches according to each slash command request. Here, a slash command called /ask is connected to the chatbot, and the chatbot generates answers to the questions asked. The final node responds to the channel.

Set up steps
1. Create a Slack app.
2. Add the chat:write permission in Slack OAuth & Permissions > Scopes.
3. Create a Command in the Slack Slash Commands menu and enter the n8n Webhook node's URL.
4. Complete creating the Slash Command.
5. Enter the created command in the Switch node.
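
To make the webhook-to-reply path concrete, here is a hedged sketch of the same flow outside n8n. Slack delivers slash commands as form-encoded fields, and the field names used below (command, text, response_url) follow Slack's slash-command documentation; the askBot function stands in for the chatbot node.

```typescript
// Hedged sketch of webhook -> switch -> reply for a Slack slash command.
interface SlashCommand {
  command: string;      // e.g. "/ask"
  text: string;         // the user's question
  response_url: string; // temporary URL Slack provides for posting the reply
}

async function handleSlashCommand(
  body: SlashCommand,
  askBot: (question: string) => Promise<string>,
) {
  let reply: string;
  switch (body.command) {              // the Switch node's branching
    case "/ask":
      reply = await askBot(body.text); // chatbot generates the answer
      break;
    default:
      reply = `Unknown command: ${body.command}`;
  }
  // Final step: respond into the channel via Slack's response_url
  await fetch(body.response_url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ response_type: "in_channel", text: reply }),
  });
}
```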

By InfoGrab
3718

Automated resume job matching engine with Bright Data MCP & OpenAI 4o mini

Notice
Community nodes can only be installed on self-hosted instances of n8n.

Who this is for
The Automated Resume Job Matching Engine is an intelligent workflow designed for career platforms, HR tech startups, recruiting firms, and AI developers who want to streamline job-resume matching using real-time data from LinkedIn and job boards. This workflow is tailored for:
- HR Tech Founders - building next-gen recruiting products
- Recruiters & Talent Sourcers - seeking automated candidate-job fit evaluation
- Job Boards & Portals - enriching user experience with AI-driven job recommendations
- Career Coaches & Resume Writers - offering personalized job fit analysis
- AI Developers - automating large-scale matching tasks using LinkedIn and job data

What problem is this workflow solving?
Manually matching a resume to a job description is time-consuming, biased, and inefficient. Additionally, accessing live job postings and candidate profiles requires overcoming web scraping limitations. This workflow solves:
- Automated LinkedIn profile and job post data extraction using Bright Data MCP infrastructure
- Semantic matching between job requirements and the candidate's resume using OpenAI 4o mini
- Pagination handling for high-volume job data
- End-to-end automation from scraping to delivery via webhook, with the matched-job response persisted to disk

What this workflow does
Bright Data MCP for Job Data Extraction
- Uses Bright Data MCP clients to extract multiple job listings (supports pagination)
- Pulls job data from LinkedIn with the pre-defined filtering criteria

OpenAI 4o mini LLM Matching Engine
- Extracts paginated job data and the textual job descriptions from the scraped pages using the Bright Data MCP scrape_as_html tool
- The AI Job Matching node compares the job description with the candidate's resume to generate match scores with insights

Data Delivery
- Sends the final match report to a webhook notification endpoint
- Persists the AI-matched job response to disk

Pre-conditions
- Knowledge of the Model Context Protocol (MCP) is highly essential. Please read this blog post: model-context-protocol
- You need a Bright Data account and the setup described in the Setup section below.
- You need a Google Gemini API key. Visit Google AI Studio.
- You need to install the Bright Data MCP Server @brightdata/mcp
- You need to install n8n-nodes-mcp

Setup
- Set up n8n locally with MCP servers by navigating to n8n-nodes-mcp.
- Install the Bright Data MCP Server @brightdata/mcp on your local machine.
- Sign up at Bright Data.
- Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
- Create a Web Unlocker proxy zone called mcp_unlocker in the Bright Data control panel.
- In n8n, configure the OpenAI account credentials.
- In n8n, configure the credentials to connect the MCP Client (STDIO) account with the Bright Data MCP Server as shown below.
- Make sure to copy the Bright Data API_TOKEN into the Environments textbox as API_TOKEN=<your-token>.
- Update the Set input fields for the candidate resume, keywords, and other filtering criteria.
- Update the Webhook HTTP Request node with the webhook endpoint of your choice.
- Update the file name and path to persist on disk.
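
As a hedged illustration of the MCP Client (STDIO) credential values, the sketch below shows one plausible configuration for launching the Bright Data MCP server. The command, arguments, and environment variable names (API_TOKEN, WEB_UNLOCKER_ZONE) are assumptions drawn from @brightdata/mcp's documentation, not values guaranteed by this template; adjust them to your own setup.

```typescript
// Illustrative values for the MCP Client (STDIO) credential in n8n-nodes-mcp.
// Everything here is a placeholder; verify names against @brightdata/mcp docs.
const mcpStdioCredential = {
  command: "npx",
  args: ["-y", "@brightdata/mcp"],
  environments: [
    "API_TOKEN=<your-bright-data-api-token>",
    "WEB_UNLOCKER_ZONE=mcp_unlocker", // the Web Unlocker zone created in the control panel
  ].join("\n"),
};
console.log(mcpStdioCredential);
```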
How to customize this workflow to your needs
Target Different Job Boards
- Set input fields with sites like Indeed, ZipRecruiter, or Monster
Customize Matching Criteria
- Adjust the prompt inside the AI Job Match node
- Include scoring metrics like skills match %, experience relevance, or cultural fit
Automate Scheduling
- Use a Cron node to periodically check for new jobs matching a profile
- Set triggers based on webhook or input form submissions
Output Customization
- Add Markdown/PDF formatting for report summaries
- Extend with Google Sheets export for internal analytics
Enhance Data Security
- Mask personal info before sending to external endpoints

By Ranjan Dailata
2888

Archive empty pages in Notion database

This workflow archives empty pages in your Notion databases. Add your n8n integration to the Notion databases that you want to process. To configure this workflow, set the Notion credentials in the four Notion nodes and, if needed, change the time in the Cron node. The default is to run at 2am every day.
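
For orientation, here is a hedged sketch of the archiving step the Notion nodes perform: query a database, check whether each page has any child blocks, and archive the ones that don't. Endpoint shapes follow Notion's public API; NOTION_TOKEN and the database ID are placeholders, and the "empty" check here is intentionally simple.

```typescript
// Hedged sketch of archiving empty Notion pages via the Notion REST API.
const headers = {
  Authorization: `Bearer ${process.env.NOTION_TOKEN}`,
  "Notion-Version": "2022-06-28",
  "Content-Type": "application/json",
};

async function archiveEmptyPages(databaseId: string) {
  const query = await fetch(`https://api.notion.com/v1/databases/${databaseId}/query`, {
    method: "POST",
    headers,
    body: JSON.stringify({}),
  }).then((r) => r.json());

  for (const page of query.results ?? []) {
    // Treat a page with no child blocks as empty.
    const children = await fetch(
      `https://api.notion.com/v1/blocks/${page.id}/children?page_size=1`,
      { headers },
    ).then((r) => r.json());

    if ((children.results ?? []).length === 0) {
      await fetch(`https://api.notion.com/v1/pages/${page.id}`, {
        method: "PATCH",
        headers,
        body: JSON.stringify({ archived: true }),
      });
    }
  }
}
```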

By Jonathan
1986

Get the last five SpaceX launches from the spacex.land API using GraphQL

Companion workflow for GraphQL node docs
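
As a hedged illustration of what the GraphQL node sends, the sketch below posts a launchesPast query to the spacex.land endpoint. The field names follow that community schema as commonly documented, and the service may no longer be maintained, so treat this purely as an example of the request shape.

```typescript
// Hedged sketch of the GraphQL request for the last five SpaceX launches.
const query = `
  {
    launchesPast(limit: 5) {
      mission_name
      launch_date_utc
      rocket { rocket_name }
    }
  }
`;

fetch("https://api.spacex.land/graphql/", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ query }),
})
  .then((r) => r.json())
  .then((data) => console.log(data.data.launchesPast));
```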

By amudhan
1588

Auto-generate social media posts from URLs with AI, Telegram & multi-platform posting

How it works
This workflow turns any URL sent to a Telegram bot into ready-to-publish social posts:
- Trigger: Telegram message (checks if it contains a URL).
- Fetch & parse: Downloads the page and extracts readable text + title.
- AI writing: Generates platform-specific copy (Facebook, Instagram, LinkedIn).
- Image: Creates an AI image and stores it in Supabase Storage.
- Publish: Posts to Facebook Pages, Instagram Business, LinkedIn.
- Logging: Updates Google Sheets with post URLs and sends a Telegram confirmation (image + links).

Setup
- Telegram – create a bot, connect via n8n Telegram credentials.
- OpenAI / Gemini – add the API key in n8n Credentials and select it in the AI nodes.
- Facebook/Instagram (Graph API) – create a credential called facebookGraph with:
  • accessToken (page-scoped or system user)
  • pageId (for Facebook Page photos)
  • igUserId (Instagram Business account ID)
  • optional fbApiVersion (default v19.0)
- LinkedIn – connect with OAuth2 in the LinkedIn node (leave as credential).
- Supabase – credential supabase with url and apiKey. Ensure a bucket exists (the default used in the Set node is social-media).
- Google Sheets – replace YOUR_GOOGLE_SHEET_ID and Sheet1. Grant your n8n Google OAuth2 access.

Notes
• No API keys are stored in the template. Everything runs via n8n Credentials.
• You can change the bucket name, image size/quality, and AI prompts in the respective nodes.
• The confirmation message on Telegram includes direct permalinks to the published posts.

Required credentials
• Telegram Bot
• OpenAI (or Gemini)
• Facebook/Instagram Graph
• LinkedIn OAuth2
• Supabase (url + apiKey)
• Google Sheets OAuth2

Inputs
• A Telegram message that contains a URL.

Outputs
• Social posts published on Facebook, Instagram, LinkedIn.
• Row appended/updated in Google Sheets with post URLs and image link.
• Telegram confirmation with the generated image + post links.
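
To make the Instagram Business publish step concrete, here is a hedged sketch of the two-call container/publish flow against the Graph API (v19.0, matching the default fbApiVersion above). The flow follows Meta's documented pattern; igUserId, accessToken, imageUrl, and caption are placeholders.

```typescript
// Hedged sketch of publishing an image post to an Instagram Business account.
async function publishToInstagram(
  igUserId: string,
  accessToken: string,
  imageUrl: string,
  caption: string,
): Promise<string> {
  const base = "https://graph.facebook.com/v19.0";

  // 1) Create a media container for the hosted image.
  const container = await fetch(`${base}/${igUserId}/media`, {
    method: "POST",
    body: new URLSearchParams({ image_url: imageUrl, caption, access_token: accessToken }),
  }).then((r) => r.json());

  // 2) Publish the container to the feed.
  const published = await fetch(`${base}/${igUserId}/media_publish`, {
    method: "POST",
    body: new URLSearchParams({ creation_id: container.id, access_token: accessToken }),
  }).then((r) => r.json());

  return published.id; // media ID, later used to build the confirmation permalink
}
```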

By Karol Otręba
1095

Repurpose YouTube videos to multiple content types with OpenRouter AI and Airtable

YouTube Content Repurposing Automation

Who's it for
This workflow is for content creators, marketers, agencies, coaches, and businesses who want to maximize their YouTube content ROI by automatically generating multiple content assets from single videos. It's especially useful for professionals who want to:
- Repurpose YouTube videos into blogs, social posts, newsletters, and tutorials without manual effort
- Scale their content production across multiple channels and platforms
- Create consistent, high-quality content derivatives while saving time and resources
- Build automated content systems that generate multiple revenue streams
- Maintain an active presence across social media, email, and blog platforms simultaneously

What problem is this workflow solving
Content creators face significant challenges when trying to maximize their video content:
- Time-intensive manual repurposing: Converting one YouTube video into multiple content formats traditionally requires hours of manual writing, editing, and formatting across different platforms.
- Inconsistent content quality: Manual repurposing often leads to varying quality levels and missed opportunities to optimize content for specific platforms.
- High costs for content services: Hiring ghostwriters or content agencies to repurpose videos can cost thousands of dollars monthly.
- Scaling bottlenecks: Manual processes prevent creators from efficiently scaling their content across multiple channels and formats.
This workflow solves these problems by automatically extracting YouTube video transcripts, using AI to generate multiple high-quality content formats (tutorials, blog posts, social media content, newsletters), and organizing everything in Airtable for easy management and distribution.

How it works
Automated Video Processing
Starts with a manual trigger and retrieves YouTube URLs from your Airtable configuration, processing only videos marked as "selected" while filtering out those marked for deletion.
Intelligent Transcript Extraction
Uses the Scrape Creators API to extract video transcripts, automatically cleaning and formatting the text for optimal AI processing and content generation.
Multi-Format Content Generation
Leverages OpenRouter models, so you can easily test different AI models and choose the one that delivers the best results for your needs:
- Step-by-step tutorials with code snippets and technical details
- YouTube scripts with hooks, titles, and conclusions
- Blog posts optimized for lead generation
- Structured summaries with key takeaways
- LinkedIn posts with engagement triggers
- Newsletter content for email marketing
- Twitter/X posts for social media
Smart Content Filtering
Processes only the content types you've selected in Airtable, ensuring efficient resource usage and faster execution times.
Automated Content Organization
Matches and combines all generated content pieces by URL, then updates your Airtable with complete, ready-to-use content assets organized by type and source video.
How to set up

Required credentials
- OpenRouter API key
- Airtable Personal Access Token
- Scrape Creators API key - for YouTube transcript extraction and processing

Airtable base setup
Create an Airtable base with one main table, Videos, containing:
- title (Single line text): Video title for reference
- url (URL): YouTube video URL to process
- Status (Single select): Options: "selected", "delete", "processed"
- output (Multiple select): Content types to generate: summary, tutorial, blog-post, linkedin, newsletter, tweeter, youtube
- summary (Long text): Generated video summary
- tutorial (Long text): Generated step-by-step tutorial
- key_takeaways (Long text): Extracted key insights
- blog_post (Long text): Generated blog post content
- linkedin (Long text): LinkedIn post content
- newsletter (Long text): Email newsletter content
- tweeter (Long text): Twitter/X post content
- youtube_titles (Long text): YouTube video title suggestions
- youtube_hook (Long text): Video opening hooks
- youtube_steps (Long text): Video step breakdowns
- youtube_conclusion (Long text): Video ending/CTAs

API Configuration
Scrape Creators setup:
- Sign up for the Scrape Creators API
- Obtain your API key from the dashboard
- Configure the HTTP Request node with your credentials
- Set the endpoint to: https://api.scrapecreators.com/v1/youtube/video/transcript
OpenRouter setup:
- Create an OpenRouter account and generate an API key

Workflow Configuration
- Import the workflow JSON into your n8n instance
- Update all credential references with your API keys
- Configure the Airtable nodes with your base and table IDs
- Test the workflow with a single video URL first

Requirements
- n8n instance (self-hosted or cloud)
- Active API subscriptions for OpenRouter (or the LLM of your choice), Airtable, and Scrape Creators
- YouTube video URLs - must be publicly accessible videos with available transcripts
- Airtable account - the free tier is sufficient for most use cases

How to customize the workflow
Modify content generation prompts
Edit the LLM Chain nodes to customize content style and format:
- Tutorial node: Adjust technical depth and formatting preferences
- Blog post node: Modify tone, length, and CTA strategies
- LinkedIn node: Customize engagement hooks and professional tone
- Newsletter node: Tailor subject lines and email marketing approach
Adjust AI model selection
- Update the OpenRouter Chat Model to use different models
Add new content formats
Create additional LLM Chain nodes for new content types:
- Instagram captions
- TikTok scripts
- Podcast descriptions
- Course outlines
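
For the transcript request configured in the API Configuration section above, here is a hedged sketch of the call the HTTP Request node makes. The endpoint comes from the setup; the auth header name (x-api-key), the url query parameter, and the response shape are assumptions, so confirm the exact contract in the Scrape Creators dashboard.

```typescript
// Hedged sketch of fetching and cleaning a YouTube transcript via Scrape Creators.
async function fetchTranscript(videoUrl: string): Promise<string> {
  const res = await fetch(
    `https://api.scrapecreators.com/v1/youtube/video/transcript?url=${encodeURIComponent(videoUrl)}`,
    { headers: { "x-api-key": process.env.SCRAPE_CREATORS_API_KEY ?? "" } },
  );
  if (!res.ok) throw new Error(`Transcript request failed: ${res.status}`);
  const body = await res.json();

  // Clean and join transcript segments before handing the text to the LLM chains.
  return (body.transcript ?? [])
    .map((segment: { text: string }) => segment.text)
    .join(" ")
    .replace(/\s+/g, " ")
    .trim();
}
```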

By Alexandra Spalato
876

Get all the entries from Contentful

No description available.

By Harshil Agrawal
847

Declutter Gmail: archive inactive emails with GPT-4 classification

Who is this for?
This workflow is for professionals, entrepreneurs, or anyone overwhelmed by a cluttered Gmail inbox. If you want to automatically archive low-priority emails using AI, this is the perfect hands-free solution.

What does it solve?
Your inbox fills up with old, read emails that no longer need your attention, but manually archiving them takes time. This workflow uses AI to scan each email and intelligently decide whether it should be archived, needs a reply, or is spam. It helps you:
- Declutter your Gmail inbox automatically
- Identify important vs. unimportant emails
- Save time with smart email triage

How it works
- A scheduled trigger runs the workflow (you set how often).
- It fetches all read emails older than 45 days from Gmail.
- Each email is passed to an AI model (GPT-4) that classifies it as Actionable or Archive.
- If the AI recommends archiving, the workflow archives the email from your inbox.
- All other emails are left untouched so you can review them as needed.

How to set up?
- Connect your Gmail (OAuth2) and OpenAI API credentials.
- Open the "Schedule Trigger" node and choose how often the workflow should run (e.g., daily, weekly).
- Optionally adjust the Gmail filter in the “List Old Emails” node to change which emails are targeted.
- Start the workflow and let AI clean up your inbox automatically.

How to customize this workflow to your needs
- Change the Gmail filter: Edit the query in the Gmail node to include other conditions (e.g., older_than:30d, specific labels, unread only).
- Update the AI prompt: Modify the prompt in the Function node to detect more nuanced categories like “Meeting Invite” or “Newsletter.”
- Adjust schedule frequency: Change how often the cleanup runs (e.g., hourly, daily).
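
A minimal sketch of the triage loop follows, to show how the pieces fit: list read mail older than 45 days, ask the model for a single label, and archive only what it marks "Archive". The Gmail query string mirrors the "List Old Emails" node; listMessages, classify, and archive are placeholders for the Gmail and OpenAI steps, not real node names.

```typescript
// Hedged sketch of the classify-then-archive decision.
type Label = "Actionable" | "Archive" | "Spam";

async function triage(
  listMessages: (query: string) => Promise<Array<{ id: string; snippet: string }>>,
  classify: (snippet: string) => Promise<Label>,
  archive: (id: string) => Promise<void>, // e.g. Gmail modify with removeLabelIds: ["INBOX"]
) {
  const messages = await listMessages("is:read older_than:45d");
  for (const msg of messages) {
    const label = await classify(msg.snippet);
    if (label === "Archive") {
      await archive(msg.id); // only archive what the model flags; everything else stays put
    }
  }
}
```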

By Matt Chong | n8n Creator
403

Complete Lyft API integration for AI agents with 16 operations using MCP

Need help? Want access to this workflow plus many more paid workflows and live Q&A sessions with a top verified n8n creator? Join the community.

Complete MCP server exposing 16 Lyft API operations to AI agents.

⚡ Quick Setup
- Import this workflow into your n8n instance
- Add Lyft credentials
- Activate the workflow to start your MCP server
- Copy the webhook URL from the MCP trigger node
- Connect AI agents using the MCP URL

🔧 How it Works
This workflow converts the Lyft API into an MCP-compatible interface for AI agents.
• MCP Trigger: Serves as your server endpoint for AI agent requests
• HTTP Request Nodes: Handle API calls to https://api.lyft.com/v1
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Returns responses directly to the AI agent

📋 Available Operations (16 total)
🔧 Cost (1 endpoint)
• GET /cost: Retrieve Cost Estimate
🔧 Drivers (1 endpoint)
• GET /drivers: List Nearby Drivers
🔧 Eta (1 endpoint)
• GET /eta: Retrieve Pickup ETA
🔧 Profile (1 endpoint)
• GET /profile: Retrieve User Profile
🔧 Rides (7 endpoints)
• GET /rides: Get the passenger's ride history
• POST /rides: Request a Lyft
• GET /rides/{id}: Get the ride detail of a given ride ID
• POST /rides/{id}/cancel: Cancel an ongoing requested ride
• PUT /rides/{id}/destination: Update the destination of the ride
• PUT /rides/{id}/rating: Add the passenger's rating, feedback, and tip
• GET /rides/{id}/receipt: Get the receipt of the ride
🔧 Ridetypes (1 endpoint)
• GET /ridetypes: List available ride types
🔧 Sandbox (4 endpoints)
• PUT /sandbox/primetime: Set Prime Time Percentage
• PUT /sandbox/rides/{id}: Propagate ride through ride status
• PUT /sandbox/ridetypes: Preset types of rides for sandbox
• PUT /sandbox/ridetypes/{ride_type}: Driver availability for processing ride request

🤖 AI Integration
Parameter Handling: AI agents automatically provide values for:
• Path parameters and identifiers
• Query parameters and filters
• Request body data
• Headers and authentication
Response Format: Native Lyft API responses with full data structure
Error Handling: Built-in n8n HTTP request error management

💡 Usage Examples
Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to its configuration
• Cursor: Add the MCP server SSE URL to its configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n HTTP request handling and logging
• Extensible: Easily modify or add custom logic

🆓 Free for community use! Ready to deploy in under 2 minutes.
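
As an illustration of the $fromAI() pattern mentioned above, here is a hedged sketch of how one ride-detail HTTP Request node might be parameterized. The key name ride_id and the credential setting are examples, not the template's exact configuration; the expression is shown as a plain string.

```typescript
// Illustrative shape of one HTTP Request node behind the MCP trigger: the ride ID
// is filled in by the calling AI agent through n8n's $fromAI() placeholder.
const getRideDetailNode = {
  method: "GET",
  url: "=https://api.lyft.com/v1/rides/{{ $fromAI('ride_id', 'ID of the ride to look up', 'string') }}",
  authentication: "predefinedCredentialType", // Lyft credentials configured in n8n
};
console.log(getRideDetailNode);
```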

By David Ashby
113

Generate news cards from Spotify emotions with LLM, Google News and APITemplate.io

📄 Workflow Overview
Title: Spotify Emotion-to-News Card Generator (APITemplate.io + Slack)
What it does: This workflow analyzes the emotion of your recently played Spotify track using OpenRouter (LLM), fetches a related trending Google News article, generates a visual news card with APITemplate.io, and posts it to Slack.

👥 Who’s it for
Music lovers, marketers, and developers who want to automatically turn their listening mood into a visual daily digest or Slack update.

⚙️ How it works
1. Spotify Trigger — Fetch your recently played tracks.
2. LLM (Emotion Analyzer) — Infer the main emotion from the track title and artist.
3. Google News Query — Build an RSS URL based on the emotion keyword.
4. RSS Reader — Retrieve trending news headlines.
5. APITemplate.io — Render the top article into an image card.
6. Slack — Post title, link, and card image into your channel.

🧰 Requirements
- Spotify API credentials
- OpenRouter API key
- APITemplate.io account (with template ID)
- Slack OAuth2 connection

🪄 How to customize
- Replace the APITemplate.io template ID with your own.
- Adjust the RSS URL language (hl=en-US → hl=ja-JP for Japanese news).
- Modify the Slack message text for your preferred channel tone.

⚠️ Disclaimer
If you use community nodes (LangChain), this template is for self-hosted n8n only.
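
To show the Google News Query step concretely, here is a hedged sketch of turning the emotion keyword from the LLM into a Google News RSS search URL. The hl/gl/ceid parameters follow Google News RSS conventions; the exact expression used inside the template's node may differ.

```typescript
// Hedged sketch: build a Google News RSS search URL from an emotion keyword.
function buildNewsRssUrl(
  emotion: string,
  locale = { hl: "en-US", gl: "US", ceid: "US:en" }, // swap to ja-JP / JP / JP:ja for Japanese news
): string {
  const params = new URLSearchParams({
    q: emotion,
    hl: locale.hl,
    gl: locale.gl,
    ceid: locale.ceid,
  });
  return `https://news.google.com/rss/search?${params.toString()}`;
}

console.log(buildNewsRssUrl("nostalgic")); // fed into the RSS Reader step
```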

By noda
63

Inventory reconciliation between Notion & Airtable with GPT-4o Slack alerts

📘 Description
This workflow performs automated inventory reconciliation between Notion (physical counts) and Airtable (system counts), ensuring both databases stay synchronized. It fetches records from both systems, merges them into a unified comparison payload, validates the structure, and calculates discrepancies. If a mismatch is detected, the workflow automatically updates Airtable with the corrected count and notifies the operations team on Slack. If everything matches, a simple “No action needed” Slack message is sent. Any malformed or incomplete payloads are logged into Google Sheets for audit tracking.

⚙️ What This Workflow Does (Step-by-Step)
🟢 Manual Trigger – Execute Workflow
Starts the reconciliation process on demand.
📥 Fetch Records from Notion
Retrieves physical stock data (cycle count) stored in Notion.
📦 Fetch Records from Airtable
Loads inventory data from Airtable’s system-of-record table.
🔀 Merge Notion + Airtable Inputs
Combines both datasets into a single payload for unified processing.
🔍 Validate Payload Structure (IF Node)
Ensures that key fields (like id) exist. Valid → continue. Invalid → logged to Google Sheets.
🧾 Log Invalid Versioning Requests to Google Sheets
Stores broken or incomplete payload entries for later review.
🧮 Build Combined Notion + Airtable Payload (Code Node)
Constructs the structured comparison object: { notion: {...}, airtable: [...] }
📊 Compare Notion Record With Airtable Record (Code Node)
Performs the core reconciliation logic: matches items by name, compares physical vs. system count, calculates the difference, and determines if a correction is needed. If there is a mismatch, the record is flagged for update.
🔎 Check If Record Requires Update (IF Node)
Branches the logic: Mismatch → Update Airtable + Alert. Match → No action summary.
🛠️ Update Airtable Record With Corrected Count
Writes the accurate physical count from Notion into Airtable.
🧠 Configure GPT-4o – Slack Summary Models
Two models: one for “no action needed” summaries and one for “Airtable updated” discrepancy alerts.
🤖 Generate Slack Summary / Generate Slack Summary1
AI produces short, precise, operations-friendly Slack messages based on whether a discrepancy existed.
💬 Slack – Send Summary Notification / Send Update Notification
Sends the final Slack message to the operations user, confirming stock match status, updates made, item details, and difference values.

🧩 Prerequisites
- Notion API integration
- Airtable API credentials
- Azure OpenAI GPT-4o
- Slack API connection
- Google Sheets OAuth

💡 Key Benefits
✔ Eliminates manual reconciliation errors
✔ Keeps Airtable continuously aligned with real physical counts
✔ Provides instant Slack visibility to operations teams
✔ Logs all invalid or malformed cases
✔ Centralizes Notion + Airtable consistency checks

👥 Perfect For
- Operations teams managing multi-system inventory
- Warehouse cycle count workflows
- Audit-driven companies needing accurate stock data
- Businesses using Notion + Airtable as parallel systems
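
For readers curious about the comparison Code node, here is a hedged sketch of the reconciliation logic: match the Notion cycle count against the Airtable system count by item name and flag records that need correction. The field names (item, physicalCount, systemCount) are illustrative, not the template's exact keys.

```typescript
// Hedged sketch of the "Compare Notion Record With Airtable Record" step.
interface NotionRecord { item: string; physicalCount: number; }
interface AirtableRecord { id: string; item: string; systemCount: number; }

function reconcile(notion: NotionRecord, airtable: AirtableRecord[]) {
  const match = airtable.find(
    (r) => r.item.trim().toLowerCase() === notion.item.trim().toLowerCase(),
  );
  if (!match) return { item: notion.item, status: "missing-in-airtable" as const };

  const difference = notion.physicalCount - match.systemCount;
  return {
    item: notion.item,
    airtableId: match.id,
    physicalCount: notion.physicalCount,
    systemCount: match.systemCount,
    difference,
    needsUpdate: difference !== 0, // drives the "Check If Record Requires Update" branch
  };
}
```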

By Rahul Joshi
61