9 templates found

Fetch dynamic prompts from GitHub and auto-populate n8n expressions in prompt

Who Is This For?
This workflow is designed for AI engineers, automation specialists, and content creators who need a scalable system to dynamically manage prompts stored in GitHub. It eliminates manual updates, enforces required variable checks, and ensures that AI interactions always receive fully processed prompts.

---

🚀 What Problem Does This Solve?
Manually managing AI prompts can be inefficient and error-prone. This workflow:
✅ Fetches dynamic prompts from GitHub
✅ Auto-populates placeholders with values from the setVars node
✅ Ensures all required variables are present before execution
✅ Processes the formatted prompt through an AI agent

---

🛠 How This Workflow Works
This workflow consists of three key branches, ensuring smooth prompt retrieval, variable validation, and AI processing.

1️⃣ Retrieve the Prompt from GitHub (HTTP Request → Extract from File → SetPrompt)
The workflow starts manually or via an external trigger and fetches a text-based prompt stored in a GitHub repository. The Extract from File node retrieves the content from the GitHub file, and the SetPrompt node stores the prompt, making it accessible for processing.
📌 Note: The prompt must contain variables in n8n expression format (e.g., {{ $json.company }}) so they can be dynamically replaced.

2️⃣ Extract & Auto-Populate Variables (Check All Prompt Vars → Replace Variables)
A Code node scans the prompt for placeholders in the n8n expression format ({{ $json.variableName }}). The workflow then compares the required variables against the setVars node:
✅ If all variables are present, it proceeds to variable replacement.
❌ If any variables are missing, the workflow stops and returns an error listing them.
The Replace Variables node replaces all placeholders with values from setVars.
📌 Example of a properly formatted GitHub prompt:
Hello {{ $json.company }}, your product {{ $json.features }} launches on {{ $json.launch_date }}.
This ensures seamless replacement when processed in n8n.

3️⃣ AI Processing & Output (AI Agent → Prompt Output)
The Set Completed Prompt node stores the final, processed prompt. The AI Agent node (Ollama Chat Model) processes it, and the Prompt Output node returns the fully formatted response.
📌 Optional: Swap in OpenAI, Claude, or another AI model.

---

⚠️ Error Handling: Missing Variables
If a required variable is missing, the workflow stops execution and returns an error message:
⚠️ Missing Required Variables: ["launch_date"]
This ensures no incomplete prompts are sent to AI agents.

---

✅ Example Use Case
📜 GitHub prompt file (using n8n expressions):
Hello {{ $json.company }}, your product {{ $json.features }} launches on {{ $json.launch_date }}.
🔹 Variables in the setVars node:
{ "company": "PropTechPro", "features": "AI-powered Property Management", "launch_date": "March 15, 2025" }
✅ Successful output:
Hello PropTechPro, your product AI-powered Property Management launches on March 15, 2025.
🚨 Error output (if launch_date is missing):
⚠️ Missing Required Variables: ["launch_date"]

---

🔧 Setup Instructions
1️⃣ Connect your GitHub repository: Store your prompt in a public or private GitHub repo. The workflow fetches the raw file via the GitHub API.
2️⃣ Configure the setVars node: Define the required variables and make sure their names match those used in the prompt.
3️⃣ Test & run: Click Test Workflow to execute. If variables are missing, the workflow shows an error; if everything is correct, it outputs the fully formatted prompt.

---

⚡ How to Customize This Workflow
💡 Need CRM or database integration? Connect the setVars node to an Airtable, Google Sheets, or HubSpot API to pull variables dynamically.
💡 Want to change the AI model? Replace the Ollama Chat Model with OpenAI, Claude, or a custom LLM endpoint.

---

📌 Why Use This Workflow?
✅ No manual updates required – fetches prompts dynamically from GitHub.
✅ Prevents broken prompts – ensures required variables exist before execution.
✅ Works for any use case – handles AI chat prompts, marketing messages, and chatbot scripts.
✅ Compatible with all n8n deployments – works on Cloud, Self-Hosted, and Desktop versions.
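The placeholder scan and replacement described above can be sketched in plain JavaScript (the language of n8n Code nodes). This is an illustrative approximation of the "Check All Prompt Vars" and "Replace Variables" steps, not the template's actual node code; the function names are made up.

```javascript
// Sketch of the placeholder scan and replacement described above.
// Function names are illustrative, not the template's actual node code.

// Collect every variable name used as {{ $json.name }} in the prompt.
function extractVars(prompt) {
  const re = /\{\{\s*\$json\.(\w+)\s*\}\}/g;
  const names = new Set();
  let match;
  while ((match = re.exec(prompt)) !== null) names.add(match[1]);
  return [...names];
}

// Replace placeholders with values, failing loudly on missing variables.
function fillPrompt(prompt, vars) {
  const missing = extractVars(prompt).filter((name) => !(name in vars));
  if (missing.length > 0) {
    throw new Error(`Missing Required Variables: ${JSON.stringify(missing)}`);
  }
  return prompt.replace(/\{\{\s*\$json\.(\w+)\s*\}\}/g, (_, name) => vars[name]);
}

const prompt =
  "Hello {{ $json.company }}, your product {{ $json.features }} launches on {{ $json.launch_date }}.";
const vars = {
  company: "PropTechPro",
  features: "AI-powered Property Management",
  launch_date: "March 15, 2025",
};
console.log(fillPrompt(prompt, vars));
```

With the example variables above, this prints the successful output shown earlier; dropping launch_date from vars throws the missing-variables error instead.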

By RealSimple Solutions

Generate cheap viral AI videos for TikTok with Google Veo3 Fast and Postiz

This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automates the entire process of generating short AI videos with Google Veo3 Fast, enhancing them with SEO-optimized titles, and uploading them directly to TikTok via Postiz, all triggered from a central Google Sheet. This setup ensures a seamless pipeline from video creation to TikTok upload, with minimal manual intervention.

---

Benefits
- Full automation from prompt input to social media publishing.
- Cheaper video generation with Veo3 Fast than with traditional AI video tools.
- Centralized management through Google Sheets – no coding required for end users.
- SEO-enhanced titles with GPT-4o to boost engagement.
- Scheduled or manual triggering, perfect for batch operations.
- No manual uploads – the Postiz integration publishes content hands-free.

---

How It Works
Trigger: The workflow can be started manually or on a schedule (e.g., every 5 minutes) to check for new video requests in a Google Sheet.
Video Generation: The workflow retrieves a video prompt and duration from the Google Sheet, sends the prompt to Google Veo3 Fast via the Fal.ai API, and periodically checks the generation status until the video is completed.
Post-Processing: Once the video is ready, it is downloaded and uploaded to Google Drive, and a YouTube-optimized title is generated with GPT-4o Mini based on the video prompt.
TikTok Upload: The video is uploaded to Postiz, a social media scheduling tool, which then publishes it to the connected TikTok account with the generated title.
Tracking: The Google Sheet is updated with the video URL for record-keeping.

---

Set Up Steps
1. Prepare the Google Sheet: Create a Google Sheet with columns PROMPT (description of the video), DURATION (length of the video), and VIDEO (leave empty; auto-filled by the workflow).
2. Obtain API Keys: Sign up at Fal.ai to get an API key for Google Veo3 Fast, then replace YOURAPIKEY in the "Create Video" node's HTTP header (Authorization: Key YOURAPIKEY).
3. Configure Postiz for TikTok: Create a Postiz account (free trial available), connect your TikTok account in Postiz and note the Channel ID, replace XXX in the "TikTok" node with your TikTok Channel ID, and set the Postiz API key in the "Upload Video to Postiz" node.
4. Set Up Google Drive & Sheets Access: Ensure the workflow has OAuth access to Google Sheets (to read/write video data) and Google Drive (to store generated videos).
5. Schedule or Run Manually: The workflow can be triggered manually or on a schedule (e.g., every 5 minutes) to process new video requests.

Note: This workflow requires self-hosted n8n due to community node dependencies.

---

Need help customizing? Contact me for consulting and support, or add me on LinkedIn.
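The "check status until completed" step above is essentially a polling loop. Here is a minimal sketch, assuming the status response carries a `status` field with values like IN_PROGRESS / COMPLETED / FAILED (the real field names and values come from the Fal.ai queue API, so verify them there); `getStatus` stands in for the HTTP Request node.

```javascript
// Generic polling loop for the "check generation status" step.
// Assumption: the status response has a `status` field with values like
// IN_PROGRESS / COMPLETED / FAILED; check the Fal.ai docs for the real shape.
async function pollUntilDone(getStatus, { intervalMs = 5000, maxTries = 60 } = {}) {
  for (let attempt = 0; attempt < maxTries; attempt++) {
    const result = await getStatus(); // stands in for the HTTP Request node
    if (result.status === "COMPLETED") return result;
    if (result.status === "FAILED") throw new Error("video generation failed");
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
  throw new Error("timed out waiting for the video");
}
```

In n8n this pattern is usually built from a Wait node plus an IF node looping back, but the logic is the same: poll, branch on status, give up after a bound.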

By Davide

Reddit lead finder: Automated prospecting with GPT-4, Supabase and Gmail alerts

This workflow monitors targeted subreddits for potential sales leads using Reddit's API, AI content analysis, Supabase, and Google Sheets. It is built specifically to discover posts from Reddit users who may benefit from a particular product or service, and it can be easily customized for any market.

---

🔍 Features
Targeted Subreddit Monitoring: Searches multiple niche subreddits such as smallbusiness, startup, and sweatystartup using relevant keywords.
AI-Powered Relevance Scoring: Uses OpenAI GPT to analyze each post and determine whether it was written by someone who might benefit from your product, returning a simple "yes" or "no."
Duplicate Lead Filtering with Supabase: Ensures you don't email the same lead more than once by storing already-processed Reddit post IDs in a Supabase table.
Content Filtering: Filters out posts with no body text or no upvotes so that only high-quality content is processed.
Lead Storage in Google Sheets: Saves qualified leads into a connected Google Sheet with key data (URL, post content, subreddit, and timestamp).
Email Digest Alerts: Compiles relevant leads and sends a daily digest of matched posts to your team's inbox for review or outreach.
Manual or Scheduled Trigger: Can be triggered manually or automatically on a schedule (via the built-in Schedule Trigger node).

---

⚙️ Tech Stack
Reddit API – post discovery
OpenAI Chat Model – AI-based relevance filtering
Supabase – lead de-duplication
Google Sheets – lead detail storage
Gmail API – email alerts

---

🔧 Customization Tips
Adjust Audience: Modify the subreddits and keywords in the initial Code node to match your market.
Change the AI Prompt: Customize the prompt in the "Analysis Content by AI" node to describe your product or service.
Search Comments Instead: To monitor comments instead of posts, change type=link to type=comment in the Reddit Search node.
Change Email Recipients: Edit the Gmail node to send leads to a different email address or format.
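The content filter and Supabase de-duplication described above reduce to a simple predicate over each post. A sketch in JavaScript, where seenIds stands in for the post IDs already stored in the Supabase table and selftext/ups/id follow Reddit's listing API field names (the function itself is illustrative, not the template's code):

```javascript
// Sketch of the content filter plus Supabase de-duplication described above.
// seenIds stands in for post IDs already stored in the Supabase table;
// selftext/ups/id follow Reddit's listing API field names.
function qualifyPosts(posts, seenIds) {
  const seen = new Set(seenIds);
  return posts.filter(
    (post) =>
      post.selftext &&
      post.selftext.trim().length > 0 && // must have body text
      post.ups > 0 &&                    // must have at least one upvote
      !seen.has(post.id)                 // must not be already processed
  );
}
```

Only the posts that survive this filter would be passed on to the AI relevance check, keeping OpenAI costs down.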

By BluePro

Get updates when a response is created in SurveyMonkey

Companion workflow for SurveyMonkey node docs

By amudhan

Edit & deliver images with DALL-E 2, Google Drive & Telegram messaging

🎨 AI Image Editor with Form Upload + Telegram Delivery 🚀

Who's it for? 👥
This workflow is built for content creators, social media managers, designers, and agencies who need fast, AI-powered image editing without the hassle. Whether you're batch-editing for clients or spicing up personal projects, this tool gets it done – effortlessly.

What it does 🛠️
A seamless pipeline that:
📥 Accepts uploads + prompts via a clean form
☁️ Saves images to Google Drive automatically
🧠 Edits images with OpenAI's image API
📁 Converts results to downloadable PNGs
📬 Delivers the final image instantly via Telegram
Perfect for AI-enhanced workflows that need speed, structure, and simplicity.

How it works ⚙️
1. User Uploads: Fill a form with an image + editing prompt
2. Cloud Save: Auto-upload to your Google Drive folder
3. AI Editing: OpenAI processes the image with your prompt
4. Convert & Format: Image saved as PNG
5. Telegram Delivery: Final result sent straight to your chat 💬

You'll need ✅
🔑 OpenAI API key
📂 Google Drive OAuth2 setup
🤖 Telegram bot token & chat ID
⚙️ n8n instance (self-hosted or cloud)

Setup in 4 Easy Steps 🛠️
1. Connect APIs: Add OpenAI, Google Drive, and Telegram credentials to n8n; store keys securely (avoid hardcoding!)
2. Configure Settings: Set the Google Drive folder ID, add the Telegram chat ID, and tweak the image size (default: 1024×1024)
3. Deploy the Form: Add a Webhook Trigger node, test with a sample image, and share the form link with users 🎯
4. Fine-Tune Variables: In the Set node, customize 📐 image size, 📁 folder path, 📲 delivery options, and ⏱️ timeout duration

Want to customize more? 🎛️
🖼️ Image Settings: Change the size (e.g., 512×512 or 2048×2048) or update the model (when new versions drop)
📂 Storage: Auto-organize files by date/category; add dynamic file names using n8n expressions
📤 Delivery: Swap Telegram for Slack, email, or Discord; add multiple delivery channels; include the image prompt or metadata in messages
📝 Form Upgrades: Add fields for advanced editing; validate file types (e.g., PNG/JPEG only); show a progress bar for long edits
⚡ Advanced Features: Add error handling or retry flows; support batch editing; include approvals or watermarking before delivery

⚠️ Notes & Best Practices
✅ Check your OpenAI credit balance
🖼️ Test with different image sizes/types
⏱️ Adjust timeout settings for larger files
🔐 Always secure your API keys
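The "Convert & Format" step above boils down to decoding the image API's base64 payload into raw PNG bytes. A minimal sketch, assuming the OpenAI images response shape ({ data: [{ b64_json: "..." }] }); in n8n, the resulting buffer would become the binary property that the Telegram node sends:

```javascript
// Decode the image API's base64 result into raw PNG bytes.
// Assumes the OpenAI images response shape: { data: [{ b64_json: "..." }] }.
function toPngBuffer(apiResponse) {
  return Buffer.from(apiResponse.data[0].b64_json, "base64");
}
```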

By David Olusola

Create nested data processing loops using n8n sub-workflows

Nested Loops with Sub-workflows Template

Description
This template provides a practical solution for a common n8n challenge: creating nested loops. Although powerful, n8n's standard Loop nodes don't behave as expected when nested. This template demonstrates the reliable workaround: a main workflow that calls a separate sub-workflow for each iteration.

Purpose
The template is designed to help you handle scenarios where you need to iterate through one list of data for every item in another list. This is a crucial pattern for combining or processing related data, and it keeps your workflows clean and modular.

Instructions for Setup
This template contains both the main workflow and the sub-workflow on a single canvas.
1. Copy the sub-workflow part of this template (starting with the Execute Sub-workflow Trigger node) and paste it into a new, empty canvas.
2. In the Execute Sub-workflow node in the main workflow on this canvas, update the Sub-workflow field to link to the new workflow you just created.
3. Run the main workflow to see the solution in action.
For a detailed walkthrough of this solution, check out the full blog post.
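In plain code, the pattern this template implements is a cross-product over two lists: for every item in the outer list, the main workflow invokes the sub-workflow once per item in the inner list. A sketch (names are illustrative; processPair stands in for the sub-workflow's body):

```javascript
// The pattern the template implements, in plain code: for every item in
// the outer list, the main workflow invokes the sub-workflow once per item
// in the inner list. processPair stands in for the sub-workflow's body.
function nestedProcess(outerItems, innerItems, processPair) {
  const results = [];
  for (const outer of outerItems) {
    // In n8n, this inner pass runs inside the separate sub-workflow,
    // which is what keeps the two loops from interfering with each other.
    for (const inner of innerItems) {
      results.push(processPair(outer, inner));
    }
  }
  return results;
}
```

Moving the inner loop into a sub-workflow gives each inner pass its own execution context, which is why the workaround is reliable where two nested Loop nodes on one canvas are not.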

By Viktor Klepikovskyi

Convert Slack messages to Notion todos using 📝 emoji reactions

This n8n workflow turns Slack messages into actionable Notion todos using nothing more than a simple emoji reaction. By reacting to any Slack message with the 📝 emoji, the workflow automatically captures the message, extracts its content, and creates a new Notion to_do item with a link back to the original message. A daily scheduled Slack message then reminds you of any tasks left unchecked in Notion.

Perfect for async teams who live in Slack but organize work in Notion, this template helps you bridge the gap between communication and execution without switching tools or relying on memory.

Who's it for
Teams that use Slack and Notion daily
Product managers, leads, and async-first teams who want quick capture of action items
Anyone tired of copy-pasting Slack messages into Notion manually

How it works
Emoji-triggered Notion capture: The workflow listens for the reaction_added event from Slack. When a user reacts to a message with :memo:, it fetches the full message content and permalink, then creates a to_do block in Notion with the message and a direct link to the original Slack thread.
Daily Slack reminder: Every day at 08:00, the workflow scans all to_do blocks in a designated Notion page, keeps only those that are still unchecked, aggregates them, and sends a single Slack message to the user with the list of open todos.

How to set up
1. Connect your Slack account and configure the trigger to watch for the :memo: reaction.
2. Connect your Notion account and select a page where todos should be created.
3. Customize the schedule time if needed (default is 08:00).
4. (Optional) Set up the final Slack message node to send reminders to yourself or a specific team channel.

Requirements
A Slack app with permission to read messages and reactions
A Notion integration with access to edit the target page
A Notion page with blocks enabled (or create one manually)

How to customize the workflow
Change the emoji from :memo: to another (e.g., :fire: for urgent, :idea: for brainstorms)
Add logic to assign Notion tasks to specific team members
Use Slack threads, tags, or message metadata to auto-categorize tasks
Modify the daily reminder to include due dates, priorities, or Slack buttons to check off tasks
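The to_do item the workflow appends for each 📝 reaction can be sketched as a Notion block payload. The to_do block shape follows Notion's public block API; the message text and permalink values below are examples, and the helper function is illustrative rather than the template's actual code:

```javascript
// Sketch of the Notion block appended for each 📝 reaction. The to_do
// block shape follows Notion's public block API; the link on the text
// points back to the Slack permalink, as described above.
function buildTodoBlock(messageText, permalink) {
  return {
    object: "block",
    type: "to_do",
    to_do: {
      rich_text: [
        {
          type: "text",
          text: { content: messageText, link: { url: permalink } },
        },
      ],
      checked: false,
    },
  };
}
```

The daily reminder then only has to read these blocks back and filter on `checked === false`.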

By Guilherme Campos

Connect AI agents to eBay deal API with MCP server

Need help? Want access to this workflow, many more paid workflows, and live Q&A sessions with a top verified n8n creator? Join the community.

Complete MCP server exposing 4 Deal API operations to AI agents.

⚡ Quick Setup
1. Import this workflow into your n8n instance
2. Add Deal API credentials
3. Activate the workflow to start your MCP server
4. Copy the webhook URL from the MCP trigger node
5. Connect AI agents using the MCP URL

🔧 How it Works
This workflow converts the Deal API into an MCP-compatible interface for AI agents.
• MCP Trigger: Serves as your server endpoint for AI agent requests
• HTTP Request Nodes: Handle API calls to https://api.ebay.com{basePath}
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Returns responses directly to the AI agent

📋 Available Operations (4 total)
🔧 Deal_Item (1 endpoint)
• GET /deal_item: List deal items
🔧 Event (2 endpoints)
• GET /event: List event items
• GET /event/{event_id}: Retrieves the details for an eBay event
🔧 Event_Item (1 endpoint)
• GET /event_item: Returns a paginated set of event items

🤖 AI Integration
Parameter Handling: AI agents automatically provide values for:
• Path parameters and identifiers
• Query parameters and filters
• Request body data
• Headers and authentication
Response Format: Native Deal API responses with full data structure
Error Handling: Built-in n8n HTTP request error management

💡 Usage Examples
Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add the MCP server URL to your configuration
• Cursor: Add the MCP server SSE URL to your configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• API Integration: Make direct HTTP calls to the MCP endpoints

✨ Benefits
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n HTTP request handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
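As a concrete illustration of the $fromAI() placeholders mentioned above: in an n8n HTTP Request node behind an MCP trigger, a path parameter such as event_id is typically populated with an expression like the following, so the AI agent supplies the value at call time. The description text here is an example, not copied from this workflow.

```
{{ $fromAI('event_id', 'The identifier of the eBay event to retrieve', 'string') }}
```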

By David Ashby

Generate property investment reports with GPT-4, SerpAPI, Google Docs & Airtable

How It Works ⚙️
This workflow is a comprehensive, multi-AI-agent system that acts as a virtual property analysis team. It automates the entire process, from initial data research to final report delivery.

Data Research (AI Agent 1): The workflow is triggered by the input of a property address. The first AI agent, powered by Google Search, immediately scours the web to gather all relevant public data, including historical sales data, local rental prices, and nearby amenities.
Market Analysis (AI Agent 2): All the raw data from the first agent is consolidated. A second AI agent, powered by OpenAI, then performs a deep, intelligent analysis: it identifies key insights, calculates the potential rental yield, and assigns a definitive "Investment Score."
Report Generation (AI Agent 3): A third AI agent, also using OpenAI, takes the structured analysis and writes a professional, persuasive report. The report is then automatically populated into a pre-designed Google Docs template, ensuring a polished and professional look.
Database Logging & Delivery: The final report is automatically converted to a PDF and sent to the client via Gmail. Simultaneously, the key findings (address, score, key takeaways) are logged into an Airtable database for quick reference and tracking.

How to Set Up 🛠️
1. Import the Workflow: Copy the provided workflow JSON and import it into your n8n instance.
2. Configure Credentials:
Google Search: Add your API key.
OpenAI: Add your API key.
Google Docs: Add your OAuth2 credential.
Airtable: Add your API key and token.
Gmail: Add your OAuth2 credential.
3. Customize Workflow Nodes:
Node 6 (Google Docs): Create a new Google Doc to serve as your report template. Use placeholders like {{ reportcontent }} and {{ propertyaddress }} to mark where the AI-generated text should go. Then copy the document ID from the URL and paste it into this node's parameters.
Node 7 (Airtable): Replace [YOUR AIRTABLE BASE ID] and Property Reports with your specific Airtable base and table names. Ensure your table has columns that match the data you are sending (Property Address, Investment Score, etc.).
4. Save & Activate: Once all settings and credentials are configured, save the workflow and click the "Inactive" toggle in the top-right corner to make it live.
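For the Airtable logging step (Node 7), the record the workflow sends can be sketched as a fields payload. The field names mirror the columns mentioned above ("Key Takeaways" is an assumed column name based on the key findings listed); the helper and its inputs are illustrative, so adjust them to match your actual table:

```javascript
// Sketch of the record the logging step (Node 7) sends to Airtable.
// Field names mirror the columns mentioned above; "Key Takeaways" is an
// assumed column name, so adjust to match your table.
function buildAirtableRecord(report) {
  return {
    fields: {
      "Property Address": report.address,
      "Investment Score": report.score,
      "Key Takeaways": report.takeaways,
    },
  };
}
```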

By Marth