
Monitor competitors' websites for changes with OpenAI and Firecrawl

Who is this template for?
This workflow template is designed for anyone who wants to be alerted when specific changes are made to a web page. Leveraging agentic AI, it analyzes the page every day and autonomously decides whether to send you an e-mail notification.

Example use cases
- Track price changes on [competitor's website]. Notify me when the price drops below €50.
- Monitor new blog posts on [industry leader's website] and summarize key insights.
- Check [competitor's job page] for new job postings related to software development.
- Watch for new product launches on [e-commerce site] and send me a summary.
- Detect any changes in the terms and conditions of [specific website].
- Track customer reviews for [specific product] on [review site] and extract key themes.

How it works
When you click 'Test workflow' in the editor, a new browser tab opens where you can fill in the details of your espionage assignment. Be as concise and specific as possible when instructing the AI (see the example use cases above). After submission, the flow starts by extracting both the relevant website URL and an optimized prompt. OpenAI's structured outputs feature is used, followed by a code node that parses the results for further use. From there, the loop of daily checks begins:
1. Initial scrape
2. 1-day delay
3. Second scrape
4. An AI agent decides whether or not to notify you
5. Back to step 1
You can cancel an espionage assignment at any time in the executions tab.

Set up steps
- Insert your OpenAI API key in the structured outputs node (the second one)
- Create a Firecrawl account and connect your Firecrawl API key in both 'Scrape page' nodes
- Connect your OpenAI account in the AI agent's model node
- Connect your Gmail account in the AI agent's Gmail tool node
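The parsing step described above (an n8n Code node that unpacks the structured output into a URL and prompt) might look like this sketch. The field names "url" and "optimized_prompt" are assumptions for illustration, not the template's actual schema:

```javascript
// Sketch of the Code node that parses OpenAI's structured output.
// Field names ("url", "optimized_prompt") are hypothetical.
function parseStructuredOutput(aiResponse) {
  // Structured outputs arrive as a JSON string in the message content
  const parsed = typeof aiResponse === 'string' ? JSON.parse(aiResponse) : aiResponse;
  return {
    url: parsed.url,
    prompt: parsed.optimized_prompt,
  };
}

// Example: what the extraction might produce for the price-tracking use case
const result = parseStructuredOutput(
  '{"url": "https://example.com/pricing", "optimized_prompt": "Notify when price drops below 50 EUR"}'
);
```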

By Sleak
12444

Create a table in MySQL and insert data

Companion workflow for MySQL node docs

By amudhan
6749

Automate company research using ProspectLens and Google Sheets

This n8n workflow automates the process of researching companies by gathering relevant data such as traffic volume, founding details, funding information, founders, and more. It leverages the ProspectLens API, which is particularly useful for researching companies commonly found on Crunchbase and LinkedIn. ProspectLens is an API that provides very detailed company data; all you need to supply is the company's domain name.

Setup:
1. Obtain your ProspectLens API key here: https://apiroad.net/marketplace/apis/prospectlens
2. In n8n, create a new "HTTP Header" credential. Set x-apiroad-key as the "Name" and enter your APIRoad API key as the "Value".
3. Use this credential in the HTTP Request node of the workflow.
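As a rough sketch, the HTTP Request node configuration amounts to a GET with the x-apiroad-key header and the company domain as a query parameter. The endpoint path below is an assumption; check the APIRoad marketplace page for the real one:

```javascript
// Hypothetical sketch of the ProspectLens request. The header name comes from
// the template description; the URL path is an assumed placeholder.
function buildProspectLensRequest(apiKey, domain) {
  return {
    method: 'GET',
    url: 'https://api.apiroad.net/prospectlens/company', // assumed endpoint
    headers: { 'x-apiroad-key': apiKey },
    qs: { domain },
  };
}

const req = buildProspectLensRequest('YOUR_KEY', 'example.com');
```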

By Anthony
5113

Analyze suspicious email contents with ChatGPT Vision

Phishing Email Detection and Reporting with n8n

Who is this for?
This workflow is designed for IT teams, security professionals, and managed service providers (MSPs) looking to automate the process of detecting, analyzing, and reporting phishing emails.

What problem is this workflow solving?
Phishing emails are a significant cybersecurity threat, and manually detecting and reporting them is time-consuming and error-prone. This workflow streamlines the process by automating email analysis, generating detailed reports, and logging incidents in a centralized system like Jira.

What this workflow does
This workflow automates phishing email detection and reporting by integrating Gmail and Microsoft Outlook email triggers, analyzing the content and headers of incoming emails, and generating Jira tickets for flagged phishing emails. Here's what happens:
1. Email triggers: Captures incoming emails from Gmail or Microsoft Outlook.
2. Email analysis: Extracts email content, headers, and metadata for analysis.
3. HTML screenshot: Converts the email's HTML body into a visual screenshot.
4. AI phishing detection: Leverages ChatGPT to analyze the email and detect potential phishing indicators.
5. Jira integration: Automatically creates a Jira ticket with the detailed analysis and attaches the email screenshot for review by the security team.
6. Customizable reports: Includes options to customize ticket descriptions and adapt the workflow to organizational needs.

Setup
- Authentication: Set up Gmail and Microsoft Outlook OAuth credentials in n8n to access your email accounts securely.
- API keys: Add API credentials for the HTML screenshot service (hcti.io) and ChatGPT.
- Jira integration: Configure your Jira project and issue types in the workflow.
- Workflow configuration: Update sticky notes and nodes to include any additional setup or configuration details unique to your system.

How to customize this workflow to your needs
- Email filters: Modify the email triggers to filter specific subjects or sender addresses.
- Analysis scope: Adjust the ChatGPT prompt to refine the phishing detection logic.
- Integration: Replace Jira with your preferred ticketing system, or modify the ticket fields to include additional information.

This workflow provides an end-to-end automated solution for phishing email management, enhancing efficiency and reducing security risks. It's perfect for teams looking to minimize manual effort and improve incident response times.
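The HTML-screenshot step sends the email's HTML body to hcti.io, which returns an image URL. A minimal sketch of that request, following hcti.io's POST /v1/image endpoint with basic auth; treat the exact field names as assumptions to verify against their docs:

```javascript
// Sketch of the hcti.io screenshot request built from an email's HTML body.
// Auth is HTTP basic (user ID + API key); body fields are per hcti.io docs.
function buildScreenshotRequest(userId, apiKey, emailHtml) {
  return {
    method: 'POST',
    url: 'https://hcti.io/v1/image',
    auth: { user: userId, pass: apiKey },
    body: { html: emailHtml },
  };
}

const req = buildScreenshotRequest('uid', 'key', '<p>Urgent: verify your account</p>');
```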

By Angel Menendez
4617

AI content creation and publishing engine with Mistral, Creatomate, and YouTube

Description
This n8n workflow automates the entire process of creating and publishing AI-generated videos, triggered by a simple message from a Telegram bot (YTAdmin). It transforms a text prompt into a structured video with scenes, visuals, and voiceover, stores assets in MongoDB, renders the final output using Creatomate, and uploads the video to YouTube. Throughout the process, YTAdmin receives real-time updates on the workflow's progress. This is ideal for content creators, marketers, or businesses looking to scale video production using automation and AI.

You can see a video demonstrating this template in action here: https://www.youtube.com/watch?v=EjI-ChpJ4xA&t=200s

How it Works
1. Trigger: The flow starts when YTAdmin (a Telegram bot) sends a content prompt.
2. Generate structured content: A Mistral language model processes the input and outputs structured content, typically broken into scenes.
3. Split and process content into scenes: The content is split into categorized parts for scene generation.
4. Generate media assets: For each scene, images are generated using OpenAI's image model and voiceovers are created using OpenAI's text-to-speech. Audio files are encoded and stored in MongoDB.
5. Scene composition: Assets are grouped into coherent scenes.
6. Render with Creatomate: A complete payload is generated and sent to the Creatomate rendering API to produce the video. Progress messages are sent to YTAdmin, and the flow pauses briefly to avoid rate limits.
7. Render callback: Once Creatomate completes rendering, it sends a callback to the flow. If the render fails, an error message is sent to YTAdmin; if it succeeds, the flow proceeds to post-processing.
8. Generate title & description: A second Mistral prompt generates a compelling title and description for YouTube.
9. Upload to YouTube: The rendered video is retrieved from Creatomate and uploaded to YouTube with the AI-generated metadata.
10. Final update: A success message is sent to YTAdmin, confirming upload completion.

Set Up Steps (Approx. 10–15 Minutes)
Step 1: Set up the YTAdmin bot. Create a Telegram bot via BotFather and get your API token. Add this token in n8n's Telegram credentials and link it to the "Receive Message from YTAdmin" trigger.
Step 2: Connect your AI providers. Mistral: add your API key under the HTTP Request or AI Model nodes. OpenAI: create an account at platform.openai.com and obtain an API key; use it for both image generation and voiceover synthesis.
Step 3: Configure audio file storage with MongoDB via a custom API. The API receives the Base64-encoded audio data in the request body, connects to the configured MongoDB instance (connection details are managed securely within the API), and uses the MongoDB driver and GridFS to store the audio data. It returns the unique _id (ObjectId) of the stored file in GridFS as a response. This _id is crucial, as it is used in subsequent steps to generate the download URL for the audio file. My API code can be found here for reference: https://github.com/nanabrownsnr/YTAutomation.git
Step 4: Set up Creatomate. Create a Creatomate account, define your video templates, and retrieve your API key. Configure the HTTP request node to match your Creatomate payload requirements.
Step 5: Connect YouTube. In n8n, add OAuth2 credentials for your YouTube account. Make sure your Google Cloud project has the YouTube Data API enabled.
Step 6: Deploy and test. Send a message to YTAdmin and monitor the flow in n8n. Verify that content is generated, media is created, and the final video is rendered and uploaded.

Customization Options
- Change the AI prompts: Modify the generation prompts to adjust tone, voice, or content type (e.g., news recaps, product videos, educational summaries).
- Switch messaging platform: Replace Telegram (YTAdmin) with Slack, Discord, or WhatsApp by swapping out the trigger and response nodes.
- Add subtitles or effects: Integrate Whisper or another speech-to-text tool to generate subtitles, or add overlay and transition effects in the Creatomate video payload.
- Use local file storage instead of MongoDB: Swap the MongoDB upload HTTP nodes for filesystem or S3-compatible storage.
- Repurpose for other platforms: Swap the YouTube upload for TikTok, Instagram, or Vimeo endpoints for broader publishing.

Need Help or Want to Customize This Workflow?
If you'd like assistance setting this up or adapting it for a different use case, feel free to reach out to me at nanabrownsnr@gmail.com. I'm happy to help!
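The "complete payload" sent to Creatomate in the render step can be pictured as a template ID plus per-scene modifications. This is an illustrative sketch only; the modification keys ("Image-1", "Voiceover-1") are assumptions that must match the element names in your actual Creatomate template:

```javascript
// Sketch: assemble a Creatomate render payload from generated scene assets.
// Modification key names are hypothetical and template-specific.
function buildRenderPayload(templateId, scenes) {
  const modifications = {};
  scenes.forEach((scene, i) => {
    modifications[`Image-${i + 1}`] = scene.imageUrl;     // OpenAI-generated image
    modifications[`Voiceover-${i + 1}`] = scene.audioUrl; // GridFS download URL
  });
  return { template_id: templateId, modifications };
}

const payload = buildRenderPayload('tmpl_123', [
  { imageUrl: 'https://example.com/scene1.png', audioUrl: 'https://example.com/scene1.mp3' },
]);
```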

By NanaB
3266

Create a RAG system with Paul Essays, Milvus, and OpenAI for cited answers

This workflow automates the process of creating a document-based AI retrieval system using Milvus, an open-source vector database. It consists of two main steps: data collection/processing and retrieval/response generation. The system scrapes Paul Graham essays, processes them, and loads them into a Milvus vector store. When users ask questions, it retrieves relevant information and generates responses with citations.

Step 1: Data Collection and Processing
- Set up a Milvus server using the official guide
- Create a collection named "my_collection"
- Execute the workflow to scrape Paul Graham essays: fetch the essay list, extract names, split content into manageable items, limit results (if needed), fetch the texts, extract content, and load everything into the Milvus vector store
This step uses OpenAI embeddings for vectorization.

Step 2: Retrieval and Response Generation
When a chat message is received, the system:
- Sets the chunks to send to the model
- Retrieves relevant information from the Milvus vector store
- Prepares the chunks
- Answers the query based on those chunks
- Composes citations
- Generates a comprehensive response
This process uses OpenAI embeddings and models to ensure accurate and relevant answers with proper citations.

For more information on vector databases and similarity search, visit the Milvus documentation.
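The "split content into manageable items" step before embedding is, in essence, fixed-size chunking with overlap. A minimal sketch, with illustrative sizes rather than the template's actual settings:

```javascript
// Sketch: split an essay into overlapping chunks before computing embeddings.
// chunkSize/overlap values are illustrative defaults, not the template's.
function chunkText(text, chunkSize = 1000, overlap = 200) {
  const chunks = [];
  for (let start = 0; start < text.length; start += chunkSize - overlap) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break; // last window reached the end
  }
  return chunks;
}

const chunks = chunkText('a'.repeat(2500), 1000, 200);
```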

By Cheney Zhang
2192

Generate BigQuery SQL from natural language queries using GPT-4o chat

Give business users a chat box; get back valid BigQuery SQL and live query results. The workflow:
1. Captures a plain-language question from a chat widget or internal portal.
2. Fetches the current table + column schema from your BigQuery dataset (via INFORMATION_SCHEMA).
3. Feeds both the schema and the question to GPT-4o so it can craft a syntactically correct SQL query using only fields that truly exist.
4. Executes the AI-generated SQL in BigQuery and returns the results.
5. Stores a short-term memory by session, enabling natural follow-up questions.
Perfect for analysts, customer-success teams, or any stakeholder who needs data without writing SQL.

---

⚙️ Setup Instructions

Import the workflow: n8n → Workflows → Import from File (or Paste JSON) → Save

Add credentials:

| Service | Where to create credentials | Node(s) to update |
|---------|----------------------------|-------------------|
| OpenAI | https://platform.openai.com → Create API key | OpenAI Chat Model |
| Google BigQuery | Google Cloud Console → IAM & Admin → Service Account JSON key | Google BigQuery (schema + query) |

Point the schema fetcher to your dataset. In the Google BigQuery1 node you'll see:

```sql
SELECT table_name, column_name, data_type
FROM `n8nautomation-453001.email_leads_schema`.INFORMATION_SCHEMA.COLUMNS
```

Replace n8nautomation-453001.email_leads_schema with YOUR_PROJECT.YOUR_DATASET. Keep the rest of the query the same; BigQuery's INFORMATION_SCHEMA always surfaces table_name, column_name, and data_type.

Update the execution node: Open Google BigQuery (the second BigQuery node). In Project ID, select your project. The SQL Query field is already {{ $json.output.query }}, so it will run whatever the AI returns.

(Optional) Embed the chat interface.

Test end-to-end: Open the embedded chat widget and ask: "How many distinct email leads were created last week?" After a few seconds the workflow will return a table of results, or an error if the schema lacks the requested fields. Ask specific questions about your data.

Activate: Toggle Active so the chat assistant is available 24/7.

🧩 Customization Ideas
- Row-limit safeguard: automatically append LIMIT 1000 to every query.
- Chart rendering: send query results to Google Sheets + Looker Studio for instant visuals.
- Slack bot: forward both the question and the SQL result to a Slack channel for team visibility.
- Schema caching: store the INFORMATION_SCHEMA result for 24 hours to cut BigQuery costs.

---

Contact
Email: rbreen@ynteractive.com
Website: https://ynteractive.com
YouTube: https://www.youtube.com/@ynteractivetraining
LinkedIn: https://www.linkedin.com/in/robertbreen
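Step 3 of the workflow hinges on combining the schema rows with the user's question into a single prompt for GPT-4o. A sketch of what that assembly might look like; the prompt wording is an assumption, while the row field names follow BigQuery's INFORMATION_SCHEMA columns:

```javascript
// Sketch: fold INFORMATION_SCHEMA rows and a user question into an LLM prompt
// so the model only uses columns that actually exist. Prompt text is illustrative.
function buildSqlPrompt(schemaRows, question) {
  const schemaText = schemaRows
    .map(r => `${r.table_name}.${r.column_name} (${r.data_type})`)
    .join('\n');
  return `You write BigQuery SQL. Use only these columns:\n${schemaText}\n\nQuestion: ${question}\nReturn JSON: {"query": "..."}`;
}

const prompt = buildSqlPrompt(
  [{ table_name: 'leads', column_name: 'created_at', data_type: 'TIMESTAMP' }],
  'How many leads were created last week?'
);
```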

By Robert Breen
2018

Convert PPTX to PDF using ConvertAPI

Who is this for? For developers and organizations that need to convert PPTX files to PDF.

What problem is this workflow solving? The file format conversion problem.

What this workflow does
1. Downloads the PPTX file from the web.
2. Converts the PPTX file to PDF.
3. Stores the PDF file in the local file system.

How to customize this workflow to your needs
- Open the HTTP Request node.
- Adjust the URL parameter (all endpoints can be found here).
- Use your API Token for authentication: pass the token in the Authorization header as a Bearer token. You can manage your API Tokens in the User panel → Authentication.
- Optionally, additional Body Parameters can be added for the converter.
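A hedged sketch of the HTTP Request node's shape: ConvertAPI's pptx-to-pdf endpoint with Bearer-token auth, as described above. Verify the exact URL and body fields against ConvertAPI's endpoint list before use:

```javascript
// Sketch: ConvertAPI PPTX-to-PDF request. Endpoint path and Parameters body
// shape should be checked against ConvertAPI's documentation.
function buildConvertRequest(apiToken, pptxUrl) {
  return {
    method: 'POST',
    url: 'https://v2.convertapi.com/convert/pptx/to/pdf',
    headers: { Authorization: `Bearer ${apiToken}` }, // token from User panel → Authentication
    body: { Parameters: [{ Name: 'File', FileValue: { Url: pptxUrl } }] },
  };
}

const req = buildConvertRequest('TOKEN', 'https://example.com/deck.pptx');
```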

By ConvertAPI
1749

Automate assigning GitHub issues

This workflow assigns a user to an issue when they include "assign me" in the issue body or a comment. To use this workflow, update the credentials used for the GitHub nodes.
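The core check can be sketched as follows: test the event text for the phrase and, on a match, build the assignee payload for GitHub's issues API. A minimal sketch, not the template's exact node logic:

```javascript
// Sketch: detect "assign me" (case-insensitive) and build the GitHub
// "add assignees" request body; return null when no assignment is requested.
function maybeAssign(eventText, username) {
  if (!/assign me/i.test(eventText)) return null;
  return { assignees: [username] };
}

const payload = maybeAssign('Please assign me to this bug', 'octocat');
```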

By manohar
1695

Create an issue on GitLab on every GitHub release

For every release on GitHub, this workflow creates an issue on GitLab.
1. Copy the workflow to your n8n.
2. Fill in the missing fields (credentials & repo names).
The workflow is based on a Cron node so that it can track GitHub repos you're not a member of (where you can't create a webhook). If you do own the repo, you could replace the Cron & GitHub nodes with a GitHub Trigger.
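The mapping at the heart of this workflow takes a GitHub release and produces a GitLab issue payload. A sketch, assuming the release object's `tag_name` and `body` fields from GitHub's REST API; the issue wording is illustrative:

```javascript
// Sketch: map a GitHub release object to a GitLab issue payload.
// tag_name/body follow GitHub's release schema; title text is illustrative.
function releaseToIssue(release) {
  return {
    title: `New release: ${release.tag_name}`,
    description: release.body || 'No release notes provided.',
  };
}

const issue = releaseToIssue({ tag_name: 'v1.2.0', body: 'Bug fixes' });
```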

By Manu
1626

Automated birthday emails with Google Sheets, OpenRouter GPT-4o & Gmail

This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Send Automated Personalized Birthday Emails Using Google Sheets, OpenRouter AI, and Gmail

🧠 What This Workflow Does
This workflow sends personalized birthday greetings via email every morning, using data from Google Sheets and messages generated with AI. It's great for communities, schools, small businesses, or anyone who wants to automate meaningful connections.

⚙️ Features
🗓 Daily Birthday Check — Runs every day at 9 AM
📋 Google Sheets Integration — Reads user data: Name, Email, DOB
🔍 Smart Date Matching — Extracts the day & month from the DOB to match today's date
🤖 OpenRouter AI Integration — Generates a custom subject + email message
🛠 Function Node Cleanup — Separates the AI response into subject & body
📬 Gmail Node — Sends personalized birthday wishes instantly

🔧 Tech Stack
Google Sheets, OpenRouter (or an OpenAI-compatible model), Gmail

💡 Use Cases
- Educators sending birthday emails to students
- Team leads acknowledging team members' birthdays
- Freelancers staying in touch with clients
- Coaches or mentors maintaining personal rapport

📝 Requirements
- Google Sheet with columns: Name, DOB (DD/MM/YYYY), and Email
- Gmail account with OAuth2 connected
- OpenRouter (or OpenAI) API key
- Basic understanding of n8n nodes
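The "smart date matching" step boils down to parsing day and month out of the DD/MM/YYYY DOB string and comparing them to the current date. A minimal sketch (the workflow passes today's date implicitly; here it is a parameter so the example is deterministic):

```javascript
// Sketch: does a DD/MM/YYYY date of birth match a given date's day and month?
function isBirthdayToday(dob, today = new Date()) {
  const [day, month] = dob.split('/').map(Number);
  // getMonth() is zero-based, so add 1 before comparing
  return day === today.getDate() && month === today.getMonth() + 1;
}

// Deterministic example with a fixed "today": 14 March 2025
const match = isBirthdayToday('14/03/1995', new Date(2025, 2, 14));
```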

By Parth Pansuriya
1522

Instagram influencer finder with Bright Data (Auto-Filter & Save to Sheets)

This workflow automatically identifies and qualifies Instagram influencers based on your marketing criteria. It saves you hours of manual research by automatically filtering profiles that meet specific engagement, follower, and verification requirements, then storing qualified leads directly in Google Sheets.

Overview
This workflow uses Bright Data to scrape Instagram profile data, then applies smart filters to identify high-quality influencer or brand accounts. Only profiles that meet all of your criteria (verified status, follower count, engagement rate, and account type) are saved to your lead database, keeping your list clean and actionable.

Tools Used
- n8n: The automation platform that orchestrates the workflow
- Bright Data: For scraping Instagram profile data without restrictions
- Google Sheets: For storing qualified influencer leads and profile data

How to Install
1. Import the workflow: Download the .json file and import it into your n8n instance
2. Configure Bright Data: Add your Bright Data credentials to the Instagram scraping node
3. Configure Google Sheets: Connect your Google Sheets account and copy the template spreadsheet
4. Customize filters: Adjust the criteria (followers, engagement rate, etc.) to match your needs
5. Run: Simply paste any Instagram profile URL and execute the workflow

Use Cases
- Influencer marketing: Build a database of qualified influencers for campaigns
- Brand partnerships: Identify potential brand collaboration opportunities
- Competitor analysis: Track competitor accounts and their engagement metrics
- Lead generation: Find business accounts in your niche for B2B outreach
- Market research: Analyze account types and engagement patterns in your industry

Connect with Me
Website: https://www.nofluff.online
YouTube: https://www.youtube.com/@YaronBeen/videos
LinkedIn: https://www.linkedin.com/in/yaronbeen/
Get Bright Data: https://get.brightdata.com/1tndi4600b25 (using this link supports my free workflows with a small commission)
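The qualification filter described above (verified status, follower count, engagement rate, account type) can be sketched as a single predicate over a scraped profile. Field names and thresholds here are illustrative assumptions, not Bright Data's actual response schema:

```javascript
// Sketch: does a scraped profile pass all the qualification criteria?
// Profile field names and default thresholds are hypothetical.
function qualifies(profile, { minFollowers = 10000, minEngagement = 0.02 } = {}) {
  return (
    profile.isVerified === true &&
    profile.followers >= minFollowers &&
    profile.engagementRate >= minEngagement &&
    profile.accountType === 'business'
  );
}

const lead = { isVerified: true, followers: 50000, engagementRate: 0.035, accountType: 'business' };
```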

By Yaron Been
1452