
Competitor price monitoring with web scraping, Google Sheets & Telegram

How it works
- Download the Google Sheet here, upload it to your own Google Sheets account, and point the Google Sheets nodes at your copy.
- Scheduled trigger: runs once a day at 8 AM (server time).
- Fetch product list: reads your "master" sheet (product_url + last known price) from Google Sheets.
- Loop with delay: iterates over each row (product) one at a time, inserting a short pause (20 s) between HTTP requests to avoid blocking.
- Scrape current price: loads each product_url and extracts the current price via a simple CSS selector.
- Compare & normalize: compares the newly scraped price against the last_price from your sheet, calculates the percentage change, and tags items where price_changed == true (a sketch of this step follows this description).
- On price change:
  - Send alert: formats a Telegram message ("Price Drop" or "Price Hike") and pushes it to your configured chat.
  - Log history: appends a new row to a separate "price_tracking" tab with timestamp, old price, new price, and % change.
  - Update master sheet: after a 1 min pause, writes the updated current_price back to your "master" sheet so future runs use it as the new baseline.

Setup steps
Google Sheets credentials (~5 min)
- Create a Google Sheets OAuth credential in n8n.
- Copy your sheet's ID and ensure you have two tabs:
  - product_data (columns: product_url, price)
  - price_tracking (columns: timestamp, product_url, last_price, current_price, price_diff_pct, price_changed)
- Paste the sheet ID into both Google Sheets nodes ("Read" and "Append/Update").
Telegram credentials (~5 min)
- Create a Telegram bot token via BotFather.
- Copy your chat_id (for your target group or personal chat).
- Add those credentials to n8n and drop them into the "Telegram" node.
Workflow parameters (~5 min)
- Verify the schedule in the Schedule Trigger node is set to 08:00 (or adjust to your preferred run time).
- In the Loop Over Items node, confirm "Batch Size" is 1 (to process one URL at a time).
- Adjust the "Delay to avoid Request Blocking" node if your site requires a longer pause (default is 20 s).
- In the "Parse Data From The HTML Page" node, double-check that the CSS selector matches how prices appear on your target site.

Once credentials are in place and your sheet tabs match the expected column names, the flow is ready to activate. Total setup time is under 15 minutes; detailed notes are embedded as sticky comments throughout the workflow to help you tweak selectors, change timeouts, or adjust sheet names without digging into code.
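
As a rough illustration of the compare-and-normalize step, here is a minimal n8n Code-node sketch. The field names last_price and current_price are assumptions based on the column names above, not necessarily the template's exact identifiers.

```javascript
// Minimal sketch of the "Compare & normalize" step (n8n Code node).
// Assumes each item carries last_price (from the sheet) and current_price
// (from the scraper); these names are illustrative, not the template's own.
const results = [];
for (const item of $input.all()) {
  const lastPrice = parseFloat(item.json.last_price);
  const currentPrice = parseFloat(item.json.current_price);
  // Percentage change relative to the previous baseline price.
  const priceDiffPct = ((currentPrice - lastPrice) / lastPrice) * 100;
  results.push({
    json: {
      ...item.json,
      price_diff_pct: Number(priceDiffPct.toFixed(2)),
      price_changed: currentPrice !== lastPrice,
    },
  });
}
return results;
```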

By Tony Paul
5362

Turn YouTube transcripts into newsletter drafts using Dumpling AI + GPT-4o

🧾 What this workflow does
This workflow turns YouTube video links into ready-to-edit newsletter drafts using Dumpling AI and GPT-4o. It reads new video URLs from a Google Sheet, extracts their transcripts, summarizes them into email-friendly content, and logs the finished draft back into the same sheet. An email notification is also sent to alert the user once each draft is created.

---

👤 Who is this for
- Newsletter writers or marketers repurposing video content
- YouTube creators building email follow-ups from videos
- Agencies or VAs batching social → email content
- Automation users streamlining content workflows

---

⚙️ How to set up
✅ Requirements
- Google Sheet with the following columns:
  - link — YouTube video URL
  - blog post — for saving the generated newsletter draft
- Active accounts for:
  - Dumpling AI (API for YouTube transcripts)
  - OpenAI GPT-4 or GPT-4o
  - Google Sheets
  - Gmail (OAuth2 credential)

---

🔧 Setup steps
1. Connect all credentials using n8n's Credential Manager: Google Sheets (OAuth2), Dumpling AI (via HTTP Header Auth), OpenAI, Gmail.
2. Update the sheet ID and tab name in both Google Sheets nodes.
3. Customize the GPT-4o prompt (optional): it is located in the "GPT-4o: Write Newsletter Draft from Transcript" node; you can edit tone, structure, and audience targeting in the system message.
4. Verify the email recipient in the Gmail node and update it if needed.

---

🧠 How it works
1. The workflow is triggered manually or on a schedule.
2. It pulls YouTube links without drafts from the sheet (see the sketch below).
3. Each video's transcript is fetched using Dumpling AI.
4. GPT-4o summarizes the transcript into a clean, friendly newsletter format.
5. The draft is written back to the same row in Google Sheets.
6. An email is sent to notify the user that the draft is ready.

---

🛠️ Customization ideas
- Send finished drafts to Notion or Airtable instead of Sheets
- Generate social media posts from the same transcript
- Add automatic review steps using GPT scoring or editing
- Trigger this on new form submissions or YouTube uploads instead

---

This is a fast, AI-powered way to turn long-form video content into clean, polished newsletters — ready to share or schedule with minimal editing.
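
A rough sketch of step 2 of "How it works", filtering the sheet rows that still need a draft, as an n8n Code node. The column names link and blog post come from the requirements above; the access pattern itself is an assumption, not the template's actual code.

```javascript
// Illustrative n8n Code node: keep only rows whose "blog post" cell is empty,
// so each run drafts only videos that don't have a newsletter yet.
// Column names ("link", "blog post") match the sheet described above;
// everything else here is an assumption about the template's internals.
const pending = [];
for (const item of $input.all()) {
  const draft = (item.json['blog post'] ?? '').toString().trim();
  const link = (item.json.link ?? '').toString().trim();
  if (link && !draft) {
    pending.push(item); // still needs a newsletter draft
  }
}
return pending;
```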

By Yang
2049

Vector database as a big data analysis tool for AI agents [2/3 - anomaly]

Vector Database as a Big Data Analysis Tool for AI Agents

Workflows from the webinar "Build production-ready AI Agents with Qdrant and n8n". This series of workflows shows how to build big data analysis tools for production-ready AI agents with the help of vector databases. These pipelines are adaptable to any dataset of images and, hence, to many production use cases:
1. Uploading (image) datasets to Qdrant
2. Setting up meta-variables for anomaly detection in Qdrant
3. Anomaly detection tool
4. KNN classifier tool

For anomaly detection
- The first pipeline uploads an image dataset to Qdrant.
- This is the second pipeline: it sets up the cluster (class) centres & cluster (class) threshold scores needed for anomaly detection.
- The third is the anomaly detection tool, which takes any image as input and uses all the preparatory work done with Qdrant to detect whether it is an anomaly relative to the uploaded dataset.

For KNN (k nearest neighbours) classification
- The first pipeline uploads an image dataset to Qdrant.
- The second is the KNN classifier tool, which takes any image as input and classifies it against the dataset uploaded to Qdrant.

To recreate both
You'll have to upload the crops and lands datasets from Kaggle to your own Google Cloud Storage bucket, and re-create the APIs/connections to Qdrant Cloud (you can use a Free Tier cluster), the Voyage AI API & Google Cloud Storage.

[This workflow] Setting Up Cluster (Class) Centres & Cluster (Class) Threshold Scores for Anomaly Detection
A preparatory workflow that sets cluster centres and cluster threshold scores so anomalies can be detected against these thresholds. Here, we're using two approaches to set up these centres: the "distance matrix approach" and the "multimodal embedding model approach" (a rough sketch of the latter follows).
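
To make the embedding-based approach concrete, here is a minimal sketch: a cluster centre as the mean of a class's image embeddings, and a threshold as the lowest in-class similarity to that centre. This is a plain-JavaScript sketch of the idea under those assumptions, not the template's code.

```javascript
// Cluster centre: element-wise mean of a class's embedding vectors.
function clusterCentre(embeddings) {
  const dim = embeddings[0].length;
  const centre = new Array(dim).fill(0);
  for (const v of embeddings) {
    for (let i = 0; i < dim; i++) centre[i] += v[i] / embeddings.length;
  }
  return centre;
}

function cosineSimilarity(a, b) {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Threshold score: the worst (lowest) similarity any member of the class
// has to its own centre; new images scoring below it are flagged as anomalies.
function classThreshold(embeddings, centre) {
  return Math.min(...embeddings.map((v) => cosineSimilarity(v, centre)));
}
```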

By Jenny
1864

Detect and store the information about a purchase using the image of a receipt

Companion workflow for a blog post.

By amudhan
1696

Build a RAG agent with n8n, Qdrant & OpenAI

This template helps you create an intelligent document assistant that can answer questions from uploaded files. It shows a complete single-vector RAG (Retrieval-Augmented Generation) system that automatically processes documents, lets you chat with it in natural language, and provides accurate, source-cited responses. The workflow consists of two parts: the data loading pipeline and the RAG AI Agent that answers your questions based on the uploaded documents. To test this workflow, you can use the example files in a shared Google Drive folder. 💡 Find more information on creating RAG AI agents in n8n on the official page.

🔗 Example files
The template uses the following example files in Google Docs format:
- German data protection law: Bundesdatenschutzgesetz (BDSG)
- Computer Security Incident Handling Guide (NIST.SP.800-61r2)
- Berkshire Hathaway letter to shareholders from 2024

🚀 How to get started
1. Copy or import the template to your n8n instance.
2. Create your Google Drive credentials via the Google Cloud Console and add them to the trigger node "Detect New Files". A detailed walk-through can be found in the n8n docs.
3. Create a Qdrant API key and add it to the "Insert into Vector Store" node credentials. The API key will be displayed after you have logged into Qdrant and created a cluster.
4. Create or activate your OpenAI API key.

1️⃣ Import your data and store it in a vector database
✅ Upload files to Google Drive. IMPORTANT: this template supports files in Google Docs format. New files are downloaded in HTML format and converted to Markdown, which preserves the overall document structure and improves the quality of responses.
- Open the shared Google Drive folder.
- Create a new folder on your Google Drive.
- Activate the workflow.
- Copy the files from the shared folder to your new folder.
- The webhook will catch the added files and you will see the execution in your "Executions" tab.
Note: if the webhook doesn't see the files you copied, try adding them to your Google Drive folder from the opened shared files via the Move to feature.
✅ Chunk, embed, and store your data with a connected OpenAI embedding model and Qdrant vector store (a rough chunking sketch follows this description). A Qdrant collection – vector storage for your data – will be created automatically after the n8n webhook has caught your data from Google Drive. You can name your collection in the "Insert into Vector Store" node.

2️⃣ Add retrieval capabilities and chat with your data
✅ Select the database with the imported data in the "Search Documents" sub-node of the AI Agent.
✅ Start a chat with your agent via the chat interface: it will retrieve data from the vector store and provide a response.
❓ You can ask the following questions based on the example files to test this workflow:
- What are the main steps in incident handling?
- What does Warren Buffett say about mistakes at Berkshire?
- What are the requirements for processing personal data?
- Do any documents mention data breach notification?

🌟 Adapt the workflow to your own use case
- Knowledge management - query company docs, policies, and procedures
- Research assistance - search through academic papers and reports
- Customer support - build agents that reference product documentation
- Legal/compliance - query contracts, regulations, and legal documents
- Personal productivity - chat with your notes, articles, and saved content

The workflow automatically detects new files, processes them into searchable vector chunks, and maintains conversation context. Just drop files in your Google Drive folder and start asking questions.
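
As a rough illustration of the chunking step (splitting the converted Markdown into overlapping pieces before embedding), here is a minimal sketch; the chunk size and overlap values are assumptions for demonstration, not the template's settings.

```javascript
// Minimal sketch of text chunking before embedding: fixed-size windows
// with overlap so sentences that straddle a boundary stay retrievable.
// Sizes are illustrative assumptions, not the template's actual settings.
function chunkText(text, chunkSize = 1000, overlap = 200) {
  const chunks = [];
  for (let start = 0; start < text.length; start += chunkSize - overlap) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break;
  }
  return chunks;
}

// Each chunk would then be embedded (e.g. with an OpenAI embedding model)
// and upserted into the Qdrant collection along with its source metadata.
```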
💻📞 Get in touch with me if you want to customise this workflow or have any questions.

By Yulia
494

AI trading assistant for Telegram using GPT-4o (with position sizing)

MT5 AI Trading Assistant - Telegram Bot Workflow with position sizing capabilities

Open trades for forex/XAUUSD/gold with this n8n template. It demonstrates how to automate MetaTrader 5 trade execution through natural-language commands via Telegram, enabling hands-free trade management with AI-powered intent classification and parameter parsing.

Use cases are many: execute market orders right away when you see an opportunity, set up limit orders with precise entry points, or manage pending signals directly from your phone. This workflow handles everything from simple "buy EU 1% 30 pip SL and 3R TP" commands to complex limit orders with risk-reward calculations.

Good to know
- This template is for educational purposes only and requires significant security hardening before live trading use.
- OpenAI API costs apply: approximately $0.0001-0.0005 per message depending on complexity.
- Webhook endpoints are NOT authenticated by default; you must add authentication before production deployment or if you want extra security.
- No built-in position size limits: add validation to cap the maximum risk percentage.
- MT5 Expert Advisor required: this workflow only sends signals; you need a separate EA to execute trades.
- Local/VPS deployment recommended: not suitable for public cloud without security hardening.
- Always test with demo accounts first: automated trading involves substantial risk of loss.

How it works
1. The Telegram Chat Trigger listens for incoming messages from authorized users in your Telegram bot.
2. A reaction emoji (👍) is immediately sent to acknowledge message receipt using the Telegram API.
3. The first AI classifier (GPT-4o mini) analyzes the message intent and categorizes it into six types: trade execution, trade inquiry, help request, signal query, signal clear, or off-topic.
4. For trade execution commands, a regex-based parser extracts parameters (asset, direction, risk, stop loss, take profit) from natural language (see the sketch below).
5. If initial parsing is incomplete or confidence is low, an LLM fallback parser attempts to extract the missing parameters with strict validation.
6. Complete trade parameters are routed through a Switch node to determine the order type (market vs limit).
7. Market orders trigger an HTTP POST to webhook/mt5-market-order with asset, direction, risk, SL, and TP parameters.
8. Limit orders include an additional entry price parameter and route to webhook/mt5-limit-order.
9. For signal queries, the workflow fetches pending signals from webhook/mt5-signals and formats the response.
10. Signal-clearing commands POST to webhook/mt5-clear-all to remove all pending orders.
11. Trade execution success/failure is validated and appropriate confirmation messages are sent back to Telegram.
12. Help requests receive multi-part instructional messages with command examples and formatting guidelines.

How to use
1. Set up your Telegram bot via BotFather and obtain your bot token.
2. Configure the Telegram credentials in both the Chat Trigger and all response nodes.
3. Add your OpenAI API key to the three "4o mini" language model nodes.
4. Deploy the four MT5 webhook endpoints (market-order, limit-order, signals, clear-all) on your local machine or VPS.
5. Critical: implement authentication on all webhook endpoints before connecting to live MT5 accounts.
6. Test commands with a demo account first: try "buy EU 1% 30 pip SL 3R TP" or "show pending signals".
7. Consider adding user-ID whitelisting in the "set credentials and params" node for access control.
8. Modify the AI prompts in the classifier nodes to adjust trading parameter validation rules.

Requirements
- Telegram Bot API account for receiving and sending messages
- OpenAI API key for the GPT-4o mini language model (intent classification and parameter parsing)
- MetaTrader 5 platform with a custom Expert Advisor that listens to webhook signals
- n8n instance running locally or on a VPS (self-hosted or cloud)
- Basic understanding of forex trading concepts (pips, risk management, order types)

Customising this workflow
- Add a user whitelist validation node after the Telegram trigger to restrict access to specific Telegram user IDs.
- Implement maximum risk percentage caps (e.g., 5%) in the parameter validation nodes.
- Create an asset whitelist to only allow the specific currency pairs your broker supports.
- Add audit logging by storing trade commands and execution results in a database.
- Include a trade confirmation mode: send a preview before execution and wait for a "confirm" message.
- Integrate with position sizing calculators based on account equity and volatility.
- Add webhook authentication headers (API keys or JWT tokens) to all HTTP request nodes.
- Create a scheduled workflow to auto-expire pending signals after a certain time period.
- Try a popular use case such as connecting to TradingView alerts instead of manual Telegram commands.

Important Security Disclaimer
⚠️ This workflow is NOT production-ready in its current form. Before using it with live trading:
- Add authentication to all webhook endpoints.
- Implement input validation for risk limits, asset whitelists, and parameter ranges.
- Add rate limiting to prevent spam attacks.
- Set up user authorization to restrict who can execute trades.
- Test extensively with demo accounts for at least 3-6 months.
- Understand that automated trading involves substantial risk of loss.

This template is provided for educational purposes only and does not constitute financial advice. You are solely responsible for implementing security measures, testing thoroughly, and accepting all trading risks.

Purchasing this n8n workflow also includes the "MetaTrader 5 and n8n Integration for Forex and Gold Trading via Webhooks" workflow (the two are sold together, and vice versa), along with the MQL code for the Expert Advisor listener, all for a total price of 120 USD.

Questions? If you have questions or need help with this workflow, feel free to reach out: elijahmamuri@gmail.com, elijahfxtrading@gmail.com
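
To illustrate step 4 of "How it works", here is a hedged sketch of a regex-based parameter parser for commands like "buy EU 1% 30 pip SL 3R TP". The alias map and patterns are illustrative assumptions, not the template's actual parsing rules.

```javascript
// Illustrative regex-based parser for commands like "buy EU 1% 30 pip SL 3R TP".
// The alias map and patterns are assumptions for demonstration only.
const ALIASES = { eu: 'EURUSD', gold: 'XAUUSD', xauusd: 'XAUUSD' };

function parseTradeCommand(text) {
  const direction = /\b(buy|sell)\b/i.exec(text)?.[1]?.toLowerCase();
  const assetRaw = /\b(eu|gold|xauusd|[a-z]{6})\b/i
    .exec(text.replace(/\b(buy|sell)\b/i, ''))?.[1]?.toLowerCase();
  const risk = /([\d.]+)\s*%/.exec(text)?.[1];              // e.g. "1%"
  const slPips = /([\d.]+)\s*pips?\s*sl/i.exec(text)?.[1];  // e.g. "30 pip SL"
  const tpR = /([\d.]+)\s*r\b/i.exec(text)?.[1];            // e.g. "3R TP"

  return {
    asset: ALIASES[assetRaw] ?? assetRaw?.toUpperCase(),
    direction,
    riskPct: risk ? parseFloat(risk) : null,
    slPips: slPips ? parseFloat(slPips) : null,
    tpRMultiple: tpR ? parseFloat(tpR) : null,
    // Downstream, the LLM fallback parser would fill any nulls left here.
    complete: Boolean(direction && assetRaw && risk && slPips && tpR),
  };
}

// parseTradeCommand("buy EU 1% 30 pip SL 3R TP")
// → { asset: 'EURUSD', direction: 'buy', riskPct: 1, slPips: 30, tpRMultiple: 3, complete: true }
```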

By Cj Elijah Garay
97

Gmail to Zendesk: AI-enriched ticket creation with GPT-4o and Google Sheets logging

Description
Turn incoming Gmail messages into structured Zendesk tickets, enriched by Azure OpenAI, and log key details to Google Sheets for tracking. Ideal for IT Support teams needing fast, consistent intake and documentation.

⚡ What This Template Does
- Fetches new emails via the Gmail Trigger. ✉️
- Normalizes Gmail data and formats it for downstream steps (see the sketch below).
- Enriches and structures content with the Azure OpenAI Chat Model and Output Parsers.
- Creates Zendesk tickets from the processed data. 🎫
- Appends or updates logs in Google Sheets for auditing and reporting. 📊

Key Benefits
- Saves time by automating ticket creation and logging. ⏱️
- Improves ticket quality with AI-driven normalization and structure.
- Ensures consistent records in Google Sheets for easy reporting.
- Reduces manual errors in IT Support intake. ✅

Features
- Gmail-triggered intake flow for new messages.
- AI enrichment using the Azure OpenAI Chat Model with parsing and memory tooling.
- Zendesk ticket creation (create: ticket) with structured fields.
- Google Sheets logging (appendOrUpdate: sheet).
- Modular design with Execute Workflow nodes for reuse and scaling.

Requirements
- n8n instance (Cloud or self-hosted).
- Gmail credentials configured in n8n for the Gmail Trigger.
- Zendesk credentials with permission to create tickets.
- Google Sheets credentials with access to the target spreadsheet (append/update enabled).
- Azure OpenAI credentials configured for the Azure OpenAI Chat Model and associated parsing.

Target Audience
- IT Support and Helpdesk teams handling email-based requests. 🛠️
- Operations teams standardizing inbound email workflows.
- Agencies and MSPs offering managed support intake.
- Internal automation teams centralizing ticket capture and logging.

Step-by-Step Setup Instructions
1. Connect Gmail credentials in n8n and select the inbox/label for the Gmail Trigger.
2. Add Zendesk credentials and confirm ticket creation permissions.
3. Configure Google Sheets credentials and select the target sheet for logs.
4. Add Azure OpenAI credentials to the Azure OpenAI Chat Model node and verify the parsing steps.
5. Import the workflow, assign credentials to each node, update any placeholders, and run a test.
6. Rename the final email/logging nodes descriptively (e.g., "Log to Support Sheet") and schedule if needed.
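
A rough sketch of the normalization step, reshaping the Gmail trigger output into the flat fields a ticket and a log row might need. All input and output field names here are assumptions for illustration, not the template's schema.

```javascript
// Illustrative n8n Code node: normalize a Gmail trigger item into flat
// fields a Zendesk ticket and a Sheets log row could use downstream.
// Field names (subject, from, text, snippet, ...) are assumptions.
const results = [];
for (const item of $input.all()) {
  const msg = item.json;
  results.push({
    json: {
      subject: (msg.subject ?? '(no subject)').toString().trim(),
      from: msg.from ?? '',
      body: (msg.text ?? msg.snippet ?? '').toString().trim(),
      receivedAt: msg.date ?? new Date().toISOString(),
      gmailMessageId: msg.id ?? '',
    },
  });
}
return results;
```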

By Rahul Joshi
82

Create Anki language flashcards with GPT-4, DALL-E and ElevenLabs

Turn any topic into a ready-to-study Anki deck. This workflow generates vocabulary flashcards with AI images and native pronunciation, then sends the .apkg file straight to your inbox.

What it does
1. You fill out a simple form (topic, languages, difficulty).
2. GPT-4 creates vocabulary with translations, readings, and example sentences.
3. DALL-E 3 generates a unique image for each word.
4. ElevenLabs adds native pronunciation audio (word + example).
5. Everything gets packaged into a real .apkg file (see the sketch below).
6. The deck lands in your email, ready to import into Anki.
7. A backup copy saves to Google Sheets.

Why I built this
I was spending hours making flashcards by hand for language learning. Finding images, recording audio, formatting everything for Anki... it took forever. This workflow does all of that in about 3 minutes.

Setup (~15 min)
1. Install the npm packages jszip and sql.js.
2. Add OpenAI credentials (for GPT-4 + DALL-E).
3. Add ElevenLabs credentials.
4. Connect Gmail and Google Sheets via OAuth.
5. Update OPENAI_API_KEY in the DALL-E code node.
6. Update the spreadsheet ID in the Sheets node.

Features
- 20 languages supported
- 7 image styles (minimal icons, kawaii, realistic, watercolor, pixel art...)
- 6 difficulty levels (A1 to C2)
- Optional reverse cards (target→native AND native→target)
- Works on Anki desktop and mobile
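
A hedged sketch of the .apkg packaging step using the two npm packages the setup mentions (jszip, sql.js). An .apkg is a zip holding a SQLite file named collection.anki2 plus a "media" JSON that maps numeric file names to real media names. The real workflow must create Anki's full schema (col/notes/cards tables); this skeleton, with an assumed placeholder table, only shows the mechanics.

```javascript
const initSqlJs = require('sql.js');
const JSZip = require('jszip');

async function buildApkg(cards, mediaFiles /* [{ name, data }] */) {
  const SQL = await initSqlJs();
  const db = new SQL.Database();
  // Placeholder table; a working deck needs Anki's actual schema here.
  db.run('CREATE TABLE notes (id INTEGER PRIMARY KEY, flds TEXT)');
  cards.forEach((c, i) =>
    // \x1f is Anki's field separator between front and back.
    db.run('INSERT INTO notes VALUES (?, ?)', [i + 1, `${c.front}\x1f${c.back}`])
  );

  const zip = new JSZip();
  zip.file('collection.anki2', db.export()); // exported SQLite bytes
  // Media files are stored under numeric names, indexed by the media map.
  const mediaMap = {};
  mediaFiles.forEach((m, i) => {
    mediaMap[String(i)] = m.name;
    zip.file(String(i), m.data);
  });
  zip.file('media', JSON.stringify(mediaMap));
  return zip.generateAsync({ type: 'nodebuffer' }); // attachable .apkg buffer
}
```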

By Victor Manuel Lagunas Franco
57