7 templates found

Export n8n Cloud execution data to CSV

Overview

This template helps n8n Cloud plan users export all executions to a CSV file for easy data analysis, so you can identify which workflows generate the most executions or could be optimized.

How this workflow works

1. Click "Test Workflow" to manually execute the workflow
2. Open the "Convert to CSV" node to access the binary data of the CSV file
3. Download the CSV file

Nodes included:

- n8n node
- Convert to File
- No Operation, do nothing (replace with another node as needed)

Set up steps

1. Import the workflow to your workspace
2. Add your n8n API credential

Benefits of Exporting n8n Cloud Executions to CSV

Exporting n8n Cloud executions to CSV offers significant advantages for workflow management and data analysis. Here are three key benefits:

Enhanced Data Analysis
- Comprehensive Insights: Exporting execution data allows in-depth analysis of workflow performance, helping identify bottlenecks and optimize processes.
- Custom Reporting: CSV files can be easily imported into data analysis tools (e.g., Excel, Google Sheets, or BI software) to create custom reports and visualizations tailored to specific business needs.

Improved Workflow Monitoring
- Historical Data Review: Access to historical execution data lets users track workflow changes and their impact over time, facilitating better decision-making.
- Error Tracking and Debugging: By reviewing execution logs, users can quickly identify and address errors or failures, ensuring smoother and more reliable workflow operations.

Regulatory Compliance and Auditing
- Audit Trails: Keeping a record of all executions provides a clear audit trail, essential for regulatory compliance and internal audits.
- Data Retention: Exported data ensures that execution records are preserved according to organizational data retention policies, safeguarding against data loss.
By leveraging the capabilities of CSV exports, users can gain valuable insights, streamline workflow management, and ensure robust data handling practices, ultimately driving better performance and efficiency in their n8n Cloud operations.
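Outside n8n, the flattening step of this template can be sketched in Python. This is an illustration only, not part of the workflow; the field names are assumptions about what your instance's execution records contain, so adjust them to the columns your n8n API actually returns.

```python
import csv
import io

def executions_to_csv(executions):
    """Flatten execution records into CSV text for spreadsheet analysis.

    The field list is illustrative; match it to the records returned by
    your n8n instance's executions endpoint.
    """
    fields = ["id", "workflowId", "status", "startedAt", "stoppedAt"]
    buf = io.StringIO()
    # extrasaction="ignore" drops any keys not in our chosen column set
    writer = csv.DictWriter(buf, fieldnames=fields, extrasaction="ignore")
    writer.writeheader()
    for record in executions:
        writer.writerow(record)
    return buf.getvalue()

sample = [
    {"id": "1001", "workflowId": "42", "status": "success",
     "startedAt": "2024-05-01T10:00:00Z", "stoppedAt": "2024-05-01T10:00:03Z"},
]
print(executions_to_csv(sample))
```

Once exported this way, the CSV opens directly in Excel or Google Sheets for per-workflow aggregation.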

By Ludwig
4207

Build your own SQLite MCP server

This template is for self-hosted n8n instances only.

This n8n template demonstrates how to build a simple SQLite MCP server to perform local database operations as well as support business intelligence. This MCP example is based on an official MCP reference implementation, which can be found here: https://github.com/modelcontextprotocol/servers/tree/main/src/sqlite

How it works

- An MCP Server trigger is used and connected to 5 tools: 2 Code node tools and 3 Custom Workflow tools.
- The 2 Code node tools use the SQLite3 library for simple read-only queries, so a plain Code node suffices.
- The 3 Custom Workflow tools are used for select, insert and update queries, as these are operations which require a bit more discretion. Whilst it may be easier to let the agent run raw SQL queries, it is a little safer to accept only parameters. The Custom Workflow tool lets us define a restricted schema for tool input, which we use to construct the SQL statement ourselves.
- All 3 Custom Workflow tools trigger the same "Execute Workflow" trigger in this very template, which has a Switch node to route each operation to the correct handler.
- Finally, Code nodes handle the select, insert and update operations, and the responses are sent back to the MCP client.

How to use

This SQLite MCP server allows any compatible MCP client to manage a SQLite database by supporting select, create and update operations. You will need to have a SQLite database available before you can use this server.

Connect your MCP client by following the n8n guidelines here: https://docs.n8n.io/integrations/builtin/core-nodes/n8n-nodes-langchain.mcptrigger/integrating-with-claude-desktop

Try the following queries in your MCP client:
- "Please create a table to store business insights and add the following..."
- "What business insights do we have on current retail trends?"
- "Who has contributed the most business insights in the past week?"

Requirements

- SQLite for the database.
- An MCP client or agent, such as Claude Desktop: https://claude.ai/download

Customising this workflow

If the scope of schemas or tables is too open, try restricting it so the MCP server serves a specific business purpose, e.g. confine querying and editing to HR-only tables before providing access to people in that department. Remember to set the MCP server to require credentials before going to production and sharing this MCP server with others!
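The "restricted schema" idea behind the custom workflow tools can be sketched outside n8n. The template's Code nodes are JavaScript; the sketch below is Python with the standard sqlite3 module, and the whitelist table/column names are invented for illustration. The point is the pattern: validate the agent-supplied table and columns against a whitelist, then build the SQL yourself rather than executing raw SQL from the agent.

```python
import sqlite3

# Hypothetical restricted schema: the only table/columns the agent may touch.
ALLOWED_TABLES = {"insights": {"title", "author", "content"}}

def safe_insert(conn, table, values):
    """Insert only into whitelisted tables/columns, constructing SQL ourselves."""
    cols = ALLOWED_TABLES.get(table)
    if cols is None or not set(values) <= cols:
        raise ValueError("table or column not allowed")
    names = sorted(values)
    placeholders = ", ".join("?" for _ in names)
    sql = f"INSERT INTO {table} ({', '.join(names)}) VALUES ({placeholders})"
    conn.execute(sql, [values[k] for k in names])  # values bound, never interpolated
    conn.commit()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE insights (title TEXT, author TEXT, content TEXT)")
safe_insert(conn, "insights",
            {"title": "Retail trends", "author": "Sam", "content": "..."})
print(conn.execute("SELECT COUNT(*) FROM insights").fetchone()[0])
```

An attempt to insert into any other table, or with an unexpected column, fails validation before any SQL is built, which is the safety property the custom workflow tools aim for.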

By Jimleuk
2607

Call & SMS πŸ› οΈ Twilio tool MCP server

πŸ› οΈ Twilio Tool MCP Server Complete MCP server exposing all Twilio Tool operations to AI agents. Zero configuration needed - all 2 operations pre-built. ⚑ Quick Setup Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator.. All 100% free? Join the community Import this workflow into your n8n instance Activate the workflow to start your MCP server Copy the webhook URL from the MCP trigger node Connect AI agents using the MCP URL πŸ”§ How it Works β€’ MCP Trigger: Serves as your server endpoint for AI agent requests β€’ Tool Nodes: Pre-configured for every Twilio Tool operation β€’ AI Expressions: Automatically populate parameters via $fromAI() placeholders β€’ Native Integration: Uses official n8n Twilio Tool tool with full error handling πŸ“‹ Available Operations (2 total) Every possible Twilio Tool operation is included: πŸ”§ Call (1 operations) β€’ Make a call πŸ”§ Sms (1 operations) β€’ Send an SMS/MMS/WhatsApp message πŸ€– AI Integration Parameter Handling: AI agents automatically provide values for: β€’ Resource IDs and identifiers β€’ Search queries and filters β€’ Content and data payloads β€’ Configuration options Response Format: Native Twilio Tool API responses with full data structure Error Handling: Built-in n8n error management and retry logic πŸ’‘ Usage Examples Connect this MCP server to any AI agent or workflow: β€’ Claude Desktop: Add MCP server URL to configuration β€’ Custom AI Apps: Use MCP URL as tool endpoint β€’ Other n8n Workflows: Call MCP tools from any workflow β€’ API Integration: Direct HTTP calls to MCP endpoints ✨ Benefits β€’ Complete Coverage: Every Twilio Tool operation available β€’ Zero Setup: No parameter mapping or configuration needed β€’ AI-Ready: Built-in $fromAI() expressions for all parameters β€’ Production Ready: Native n8n error handling and logging β€’ Extensible: Easily modify or add custom logic > πŸ†“ Free for community use! Ready to deploy in under 2 minutes.

By David Ashby
1167

Repurpose YouTube videos to multiple content types with OpenRouter AI and Airtable

YouTube Content Repurposing Automation

Who's it for

This workflow is for content creators, marketers, agencies, coaches, and businesses who want to maximize their YouTube content ROI by automatically generating multiple content assets from single videos. It's especially useful for professionals who want to:

- Repurpose YouTube videos into blogs, social posts, newsletters, and tutorials without manual effort
- Scale their content production across multiple channels and platforms
- Create consistent, high-quality content derivatives while saving time and resources
- Build automated content systems that generate multiple revenue streams
- Maintain an active presence across social media, email, and blog platforms simultaneously

What problem is this workflow solving

Content creators face significant challenges when trying to maximize their video content:

- Time-intensive manual repurposing: Converting one YouTube video into multiple content formats traditionally requires hours of manual writing, editing, and formatting across different platforms.
- Inconsistent content quality: Manual repurposing often leads to varying quality levels and missed opportunities to optimize content for specific platforms.
- High costs for content services: Hiring ghostwriters or content agencies to repurpose videos can cost thousands of dollars monthly.
- Scaling bottlenecks: Manual processes prevent creators from efficiently scaling their content across multiple channels and formats.

This workflow solves these problems by automatically extracting YouTube video transcripts, using AI to generate multiple high-quality content formats (tutorials, blog posts, social media content, newsletters), and organizing everything in Airtable for easy management and distribution.

How it works

Automated Video Processing: Starts with a manual trigger and retrieves YouTube URLs from your Airtable configuration, processing only videos marked as "selected" while filtering out those marked for deletion.

Intelligent Transcript Extraction: Uses the Scrape Creators API to extract video transcripts, automatically cleaning and formatting the text for optimal AI processing and content generation.

Multi-Format Content Generation: Leverages OpenRouter models, so you can easily test different AI models and choose the one that delivers the best results for your needs. Generated formats include:
- Step-by-step tutorials with code snippets and technical details
- YouTube scripts with hooks, titles, and conclusions
- Blog posts optimized for lead generation
- Structured summaries with key takeaways
- LinkedIn posts with engagement triggers
- Newsletter content for email marketing
- Twitter/X posts for social media

Smart Content Filtering: Processes only the content types you've selected in Airtable, ensuring efficient resource usage and faster execution times.

Automated Content Organization: Matches and combines all generated content pieces by URL, then updates your Airtable with complete, ready-to-use content assets organized by type and source video.

How to set up

Required credentials
- OpenRouter API key
- Airtable Personal Access Token
- Scrape Creators API key (for YouTube transcript extraction and processing)

Airtable base setup

Create an Airtable base with one main table:

Videos Table:
- title (Single line text): Video title for reference
- url (URL): YouTube video URL to process
- Status (Single select): Options: "selected", "delete", "processed"
- output (Multiple select): Content types to generate: summary, tutorial, blog-post, linkedin, newsletter, tweeter, youtube
- summary (Long text): Generated video summary
- tutorial (Long text): Generated step-by-step tutorial
- keytakeaways (Long text): Extracted key insights
- blog_post (Long text): Generated blog post content
- linkedin (Long text): LinkedIn post content
- newsletter (Long text): Email newsletter content
- tweeter (Long text): Twitter/X post content
- youtube_titles (Long text): YouTube video title suggestions
- youtube_hook (Long text): Video opening hooks
- youtube_steps (Long text): Video step breakdowns
- youtube_conclusion (Long text): Video ending/CTAs

API configuration

Scrape Creators setup:
1. Sign up for the Scrape Creators API
2. Obtain your API key from the dashboard
3. Configure the HTTP Request node with your credentials
4. Set the endpoint to: https://api.scrapecreators.com/v1/youtube/video/transcript

OpenRouter setup:
1. Create an OpenRouter account and generate an API key

Workflow configuration
1. Import the workflow JSON into your n8n instance
2. Update all credential references with your API keys
3. Configure the Airtable nodes with your base and table IDs
4. Test the workflow with a single video URL first

Requirements
- n8n instance (self-hosted or cloud)
- Active API subscriptions for OpenRouter (or the LLM of your choice), Airtable, and Scrape Creators
- YouTube video URLs: must be publicly accessible videos with available transcripts
- Airtable account: the free tier is sufficient for most use cases

How to customize the workflow

Modify content generation prompts: edit the LLM Chain nodes to customize content style and format:
- Tutorial node: Adjust technical depth and formatting preferences
- Blog post node: Modify tone, length, and CTA strategies
- LinkedIn node: Customize engagement hooks and professional tone
- Newsletter node: Tailor subject lines and email marketing approach

Adjust AI model selection: update the OpenRouter Chat Model to use different models.

Add new content formats: create additional LLM Chain nodes for new content types:
- Instagram captions
- TikTok scripts
- Podcast descriptions
- Course outlines
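The transcript-cleaning step mentioned above can be sketched in a few lines of Python. This is illustrative only: the segment shape (a list of objects with a "text" key) is an assumption about the transcript API's response, and the filler patterns removed are examples, not an exhaustive list.

```python
import re

def clean_transcript(segments):
    """Join raw transcript segments and normalise the text for LLM input.

    Assumes each segment is a dict with a "text" key; adjust to the actual
    response shape of your transcript provider.
    """
    text = " ".join(s.get("text", "") for s in segments)
    # Strip common stage-direction markers such as [Music] or [Applause]
    text = re.sub(r"\[(?:Music|Applause)\]", "", text, flags=re.I)
    # Collapse runs of whitespace left over from joining segments
    text = re.sub(r"\s+", " ", text).strip()
    return text

segments = [
    {"text": "welcome  back"},
    {"text": "[Music]"},
    {"text": "today we build a workflow"},
]
print(clean_transcript(segments))
```

Feeding the LLM a cleaned, continuous transcript rather than raw timed segments generally produces more coherent derivative content.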

By Alexandra Spalato
876

Tesla 1hour indicators tool (mid-term technical analysis AI)

πŸ•’ Evaluate Tesla (TSLA) price action and market structure on the 1-hour timeframe using 6 real-time indicators. This sub-agent is designed to feed mid-term technical insights into the Tesla Financial Market Data Analyst Tool. It uses GPT-4.1 to interpret Alpha Vantage indicator data delivered via secure webhooks.

⚠️ This workflow is not standalone and is executed via Execute Workflow.

πŸ”Œ Requires:
- Tesla Quant Technical Indicators Webhooks Tool
- Alpha Vantage Premium API Key

---

πŸ”§ Connected Indicators

This tool fetches and analyzes the latest 20 datapoints for:
- RSI (Relative Strength Index)
- MACD (Moving Average Convergence Divergence)
- BBANDS (Bollinger Bands)
- SMA (Simple Moving Average)
- EMA (Exponential Moving Average)
- ADX (Average Directional Index)

---

πŸ“‹ Sample Output

```json
{
  "summary": "TSLA is gaining strength on the 1-hour chart. RSI is rising, MACD has crossed bullish, and BBANDS are widening.",
  "timeframe": "1h",
  "indicators": {
    "RSI": 62.1,
    "BBANDS": { "upper": 176.90, "lower": 169.70, "middle": 173.30, "close": 176.30 },
    "SMA": 174.20,
    "EMA": 175.60,
    "ADX": 27.5,
    "MACD": { "macd": 0.84, "signal": 0.65, "histogram": 0.19 }
  }
}
```

---

🧠 Agent Components

| Component | Role |
| --- | --- |
| 1hour Data | Pulls Alpha Vantage indicator data via webhook |
| Tesla 1hour Indicators Agent | Interprets signals using structured GPT-4.1 prompt |
| OpenAI Chat Model | GPT-4.1 LLM performs analysis |
| Simple Memory | Maintains session context |

---

πŸ› οΈ Setup Instructions

1. Import the workflow into n8n and name it: Tesla1hourIndicators_Tool
2. Install the webhook fetcher tool. πŸ‘‰ Required: TeslaQuantTechnicalIndicatorsWebhooks_Tool. This agent expects the webhook /1hourData to return pre-cleaned data.
3. Add credentials: Alpha Vantage Premium API Key (via HTTP Query Auth) and OpenAI GPT-4.1 credentials.
4. Configure for sub-agent use: triggered only via Execute Workflow from πŸ‘‰ Tesla Financial Market Data Analyst Tool.

Inputs:
- message (optional)
- sessionId (required for memory linkage)

---

πŸ“Œ Sticky Notes Overview
- 🟒 Trigger Setup – Activated only by the parent agent
- πŸ“Š 1h Webhook Fetcher – Calls Alpha Vantage via secured endpoint
- 🧠 AI Agent Summary – Interprets trend/momentum from indicator data
- πŸ”— GPT Model Notes – GPT-4.1 parses and explains technical alignment
- πŸ“˜ Documentation Sticky – Embedded in canvas with full walkthrough

---

πŸ” Licensing & Support

Β© 2025 Treasurium Capital Limited Company. This tool is part of a proprietary multi-agent AI architecture. No commercial reuse or redistribution permitted.

πŸ”— Author: Don Jayamaha
πŸ”— Templates: https://n8n.io/creators/don-the-gem-dealer/

---

πŸš€ Detect TSLA trend shifts and validate setups with 1-hour technical clarity, powered by Alpha Vantage + GPT-4.1. This tool is required by the Tesla Financial Market Data Analyst Tool.
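In the template itself, GPT-4.1 interprets the indicator payload; as a rough illustration of what a deterministic reading of the sample output might look like, here is a toy heuristic in Python. The thresholds (RSI 50, MACD histogram sign) are textbook conventions used for the sketch, not the agent's actual logic.

```python
import json

# Payload in the shape of the sample output above (values copied from it).
payload = json.loads("""
{
  "timeframe": "1h",
  "indicators": {
    "RSI": 62.1,
    "ADX": 27.5,
    "MACD": { "macd": 0.84, "signal": 0.65, "histogram": 0.19 }
  }
}
""")

def simple_bias(ind):
    """Toy read: RSI above 50 with a positive MACD histogram leans bullish;
    RSI below 50 with a negative histogram leans bearish; otherwise neutral."""
    rsi = ind["RSI"]
    hist = ind["MACD"]["histogram"]
    if rsi > 50 and hist > 0:
        return "bullish"
    if rsi < 50 and hist < 0:
        return "bearish"
    return "neutral"

print(simple_bias(payload["indicators"]))
```

For the sample values this reads bullish, matching the summary text in the example; the real agent of course weighs all six indicators with far more nuance.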

By Don Jayamaha Jr
805

Smart Customer Support System with GPT-4o, Gmail, Slack & Drive Knowledge Base

The AI Support Agent combines Gmail, Slack, and Google Drive into a seamless support workflow powered by GPT-4o and Pinecone.

🧠 Email Monitoring – New support emails are pulled from Gmail every minute.
πŸ“€ Classification – AI categorizes emails (e.g., billing, support, spam, urgent).
πŸ“š Knowledge-Based Replies – GPT-4o drafts personalized replies using your support documents synced from Google Drive and stored in Pinecone.
πŸ“© Automatic Response – The agent replies to the customer in the same Gmail thread.
🚨 Escalation Detection – If human support is needed, Slack is notified instantly.
πŸ“Š Logging – Each interaction is logged in Google Sheets for tracking and analysis.
πŸ” Live Sync – Any document added to your Google Drive folder is auto-loaded into the knowledge base for future AI responses.

πŸ› οΈ Quick Setup Checklist

⏱ Time to Deploy: ~10–15 minutes

πŸ”Œ 1. Connect Integrations
βœ… Gmail (OAuth2)
βœ… Google Drive (OAuth2)
βœ… Google Sheets (OAuth2)
βœ… OpenAI API Key
βœ… Pinecone API Key
βœ… Slack Webhook (for alerts)

πŸ—‚οΈ 2. Update Workflow IDs
Replace the sample IDs in your nodes:
πŸ“ Google Drive Folder ID β†’ Where your KB lives
πŸ“Š Google Sheet ID β†’ Where interactions are logged
🚨 Slack Webhook URL β†’ Where urgent alerts go
πŸ”Ž Pinecone Index β†’ Your vector storage index

🎨 3. Customize Prompt & Tone
Go to the πŸ”§ β€œResponse Agent” node and update the System Prompt to reflect your brand’s tone, e.g. β€œWe’re always here to help, and we reply fast.”

πŸ“‚ 4. Upload Your Docs
Add .pdf, .txt, or .docx files to your synced Google Drive folder. The agent will auto-read and embed them into Pinecone for AI-powered replies.

▢️ 5. Run & Test
Send a test email from another account:
βœ… Watch the reply come through Gmail
βœ… Check Slack for the urgent alert
βœ… Confirm logging in Google Sheets
βœ… Done!
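In the workflow, classification is done by GPT-4o; to make the routing shape concrete, here is a toy keyword-based stand-in in Python. The categories come from the description above, but the keyword lists and the default "support" bucket are illustrative assumptions.

```python
# Toy stand-in for the GPT-4o classification step. Checked in priority order:
# urgent first, then billing, then spam; everything else falls back to support.
KEYWORDS = {
    "urgent": ["urgent", "asap", "immediately"],
    "billing": ["invoice", "refund", "charge", "payment"],
    "spam": ["lottery", "winner", "crypto giveaway"],
}

def classify_email(subject, body):
    """Return the first matching category, or 'support' as the default."""
    text = f"{subject} {body}".lower()
    for category, words in KEYWORDS.items():
        if any(w in text for w in words):
            return category
    return "support"

print(classify_email("Refund request", "I was charged twice last month."))
```

In the real workflow the classifier's output decides whether the reply is drafted automatically or escalated to Slack, so the category set should match your escalation policy.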

By David Olusola
468

WhatsApp outbound messaging with Baserow & WasenderAPI

Master Outbound WhatsApp: Baserow & WasenderAPI

This workflow integrates with your Baserow 'Messages' table, triggering on 'Sent' status. Messages fire via WasenderAPI and are rigorously logged as 'Outbound' in Baserow. Gain total control; drive results.

How it works
1. Monitors the Baserow 'Messages' table for 'Sent' status.
2. Sends messages via WasenderAPI.
3. Logs outbound details in Baserow.

Who's it for
For teams dominating outbound WhatsApp and centralizing Baserow logging. Demand communication efficiency? This is your solution.

Setup Steps
Rapid implementation. Action plan:
1. Activate all critical workflow nodes.
2. Copy the Sent_whatsapp webhook URL.
3. Configure a Baserow automation (on 'Sent' status) to trigger the webhook.
4. Ensure the Baserow 'Messages' table includes 'Status' (with a 'Sent' option), a linked 'WhatsApp Number' field, and a 'Message Content' field. (Optional: a Baserow Message Form for input.)
5. Embed your WasenderAPI and Baserow API tokens in n8n Credentials. Security is non-negotiable.

Requirements
- Active n8n instance (self-hosted/cloud).
- WasenderAPI.com trial/subscription.
- Baserow account with pre-configured, linked 'Contacts' and 'Messages' tables.
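The status-gated send-and-log flow can be sketched in Python. This is a shape illustration only: the Baserow field names come from the setup steps above, but the outbound request body is an assumption, so check the WasenderAPI docs for the real payload before wiring anything up.

```python
def rows_to_send_jobs(rows):
    """Filter 'Messages' rows to those marked 'Sent' and map each to an
    outbound job. Payload keys ("to", "text") are illustrative assumptions,
    not WasenderAPI's documented schema.
    """
    jobs = []
    for row in rows:
        if row.get("Status") != "Sent":
            continue  # only the 'Sent' status should fire a message
        jobs.append({
            "to": row["WhatsApp Number"],
            "text": row["Message Content"],
            "log_direction": "Outbound",  # what gets written back to Baserow
        })
    return jobs

rows = [
    {"Status": "Sent", "WhatsApp Number": "+15550001111", "Message Content": "Hi!"},
    {"Status": "Draft", "WhatsApp Number": "+15550002222", "Message Content": "..."},
]
print(rows_to_send_jobs(rows))
```

Filtering on status before building any request mirrors the workflow's trigger condition, so drafts never reach the send step.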

By Stephan Koning
272