AI-powered price watchdog: competitor monitoring & alerting (Decodo & Gemini)
Never miss a competitor price change again.
This advanced workflow automates the most difficult aspect of market monitoring: intelligently extracting structured pricing data from complex, dynamic competitor websites and comparing it against your historical baseline. It sends instant, conditional alerts only when a significant price shift is detected.
The workflow uses Decodo for dynamic scraping, Gemini for reliable data parsing, and Google Sheets for robust historical state management.
✨ Key Features
- AI-Powered Extraction: Uses Gemini 2.5 Flash to analyze raw, noisy website HTML and output a clean JSON array of plan names, prices, and features, bypassing brittle CSS selectors.
- Historical Comparison: Automatically retrieves the price from the previous workflow run and calculates the percentage difference (diff) for every single plan item.
- Edge Case Handling: Includes specific code logic to prevent errors and flag crucial events such as a "Free-to-Paid" plan transition, which would otherwise cause a division by zero (see the sketch after this list).
- Conditional Alerting: Sends immediate Slack notifications only when the price change exceeds your predefined percentage threshold.
- State Management: Uses Google Sheets to automatically shift data (Current Price → Old Price) to maintain the historical baseline for the next scheduled execution.
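A minimal sketch of what the comparison logic might look like inside the Code node. The field names (`plans`, `last_plans`, `name`, `price`) are assumptions for illustration, not the template's exact schema:

```javascript
// Hypothetical diff calculation for the Code node (field names are assumptions).
const currentPlans = $json.plans; // parsed Gemini output for this run
const lastPlans = JSON.parse($json.last_plans || '[]'); // JSON string stored in Sheets

const priceDiffs = currentPlans.map((plan) => {
  const previous = lastPlans.find((p) => p.name === plan.name);
  if (!previous) return { ...plan, diff: null, note: 'new plan' };
  if (previous.price === 0) {
    // Division by zero guard: a free plan turning paid has no meaningful
    // percentage change, so flag the transition instead of computing a diff.
    return { ...plan, diff: null, note: plan.price > 0 ? 'free-to-paid transition' : 'still free' };
  }
  const diff = ((plan.price - previous.price) / previous.price) * 100;
  return { ...plan, diff: Math.round(diff * 100) / 100 }; // round to 2 decimals
});

return priceDiffs.map((d) => ({ json: d }));
```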
⚙️ How it Works (The Monitoring Loop)
- Setup & Sourcing: The workflow is executed on a schedule, defining the global alert threshold and retrieving the list of target URLs from a Google Sheet.
- Scraping (Dynamic): Decodo runs with JavaScript rendering ON to fetch the complete, dynamic HTML of the pricing page.
- AI Structuring: Gemini receives the raw HTML and uses a strict System Prompt to extract a clean JSON array of all pricing plans (an illustrative example of the output appears after this list).
- Comparison & Calculation: A Code Node parses the current plan list and the list from the previous run (stored in Sheets). It calculates the percentage change for every matching plan.
- Alert Decision: An If Node checks the calculated change against the threshold. If the condition is met, the filtered alert proceeds to Slack.
- Data Shift & Log: The final Sheets Update node shifts the current plan data to the "Last Plans" column and moves the previous "Last Plans" to the "Old Plans" column, setting the new baseline for the next scheduled check.
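For illustration only, the structured array produced by the AI step might look like the following. Plan names, fields, and values here are hypothetical; the actual schema is dictated by the System Prompt:

```json
[
  { "name": "Starter", "price": 0, "currency": "USD", "features": ["1 user", "Community support"] },
  { "name": "Pro", "price": 29, "currency": "USD", "features": ["5 users", "Priority support"] },
  { "name": "Enterprise", "price": 99, "currency": "USD", "features": ["Unlimited users", "Custom SLA"] }
]
```

Serialized as a string, this is also the shape the "Last Plans" and "Old Plans" columns hold between runs.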
📥 Decodo Node Installation
The Decodo node is used three times in this workflow for precision scraping and searching.
- Find the Node: Click the + button in your n8n canvas.
- Search: Search for the Decodo node and select it.
- Credentials: When configuring the first Decodo node, use your API key (obtained with the 80% discount coupon).
🎁 Exclusive Deal for n8n Users
To run this workflow, you need a robust scraping provider. We have secured a massive discount on Decodo for n8n users:
- Get 80% OFF the 23k Advanced Scraping API plan.
- Coupon Code: ATTAN8N
- Sign Up Here: Claim 80% Discount on Decodo
🛠️ Setup Instructions
- Credentials: Obtain credentials for Decodo (using the coupon above), Google Sheets, and Slack.
- Google Sheets Setup: Create a sheet with the following required columns for tracking (one row per URL):
Name, URL, Old Plans (JSON string), Last Plans (JSON string), Updated At (date)
- Global Configuration: Open the Config: Alert Parameters node to set your alert_threshold (e.g., 10).
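Downstream nodes can then reference this value with an n8n expression. As a hypothetical example (the `diff` field is an assumption), an If node condition comparing each plan's change against the threshold could look like:

```
{{ Math.abs($json.diff) > $('Config: Alert Parameters').item.json.alert_threshold }}
```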
➕ How to Adapt the Template
The workflow is currently configured for maximum efficiency and stability. To expand its functionality or change its dependencies, you can implement the following adaptations:
- Change Database for Storage: Replace the Google Sheets nodes with Airtable or Notion nodes for historical storage. Since your comparison logic relies on the JSON string being saved and retrieved, you will only need to change the read/write operations (the Code logic remains the same).
- Change Alert Channel: Easily swap the Slack node with a Gmail, Discord, or Pushover node to deliver critical price alerts to a different team or application.
- Dynamic Thresholds: Modify the Config: Alert Parameters node to include separate fields for price increases (e.g., alert_increase_threshold) and price decreases (e.g., alert_decrease_threshold), allowing you to track competitor sales differently from price hikes (see the sketch after this list).
- Advanced Price Filtering: Adjust the code logic in Code: Isolate Pricing Section to target specific currency symbols (e.g., €, £) or to filter out prices that appear to be promotional (e.g., text containing "SALE" or "Discount").
- Add Advanced Alert Reporting: Instead of sending a simple Slack message, use the full list of price_diffs (which contains all plans) to generate a consolidated daily CSV report or a professional HTML email summarizing all movements, even those below the alert threshold.
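A minimal sketch of the dual-threshold adaptation, assuming the Config node exposes the two illustrative fields named above:

```javascript
// Hypothetical Code-node filter with separate thresholds for increases and decreases.
const cfg = $('Config: Alert Parameters').item.json;

return $input.all().filter((item) => {
  const diff = item.json.diff;
  if (diff === null || diff === undefined) return false; // skip flagged edge cases
  return diff > 0
    ? diff >= cfg.alert_increase_threshold            // price hike
    : Math.abs(diff) >= cfg.alert_decrease_threshold; // price drop (competitor sale)
});
```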
AI-Powered Price Watchdog: Competitor Monitoring & Alerting (Decodo & Gemini)
This n8n workflow automates the process of monitoring competitor product prices, identifying significant changes, and sending alerts. It leverages AI to analyze price data from a Google Sheet and notifies you via Slack when price drops or increases exceed a defined threshold.
What it does
- Triggers on Schedule: The workflow can be configured to run automatically at scheduled intervals (e.g., daily, hourly) or manually.
- Reads Competitor Data: It fetches product data, including competitor URLs and current prices, from a specified Google Sheet.
- Prepares Data for AI: It transforms the incoming Google Sheet data into a structured format suitable for AI processing.
- Analyzes Price Changes with AI: It uses a Google Gemini Chat Model via an AI Agent to compare current prices with the historical data stored in the Google Sheet and identify significant price movements.
- Parses AI Output: A Structured Output Parser extracts key information from the AI's response, such as identified price changes and relevant details.
- Splits Data for Individual Processing: Each identified price change is processed individually.
- Filters for Significant Changes: An "If" node checks if the price change meets a predefined condition (e.g., a certain percentage drop or increase).
- Formats Alert Message: For significant changes, a "Set" node constructs a clear and informative alert message (see the example after this list).
- Sends Slack Notification: If a significant price change is detected, a notification is sent to a designated Slack channel.
- Handles No Significant Changes: If no significant price changes are found, the workflow proceeds without sending an alert.
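As a hypothetical illustration, the Set node's message template might use n8n expressions like the following (the field names assume the structure produced by the AI analysis step, not the template's exact output):

```
🚨 Price change detected for {{ $json.plan_name }}
{{ $json.old_price }} → {{ $json.new_price }} ({{ $json.diff }}%)
URL: {{ $json.url }}
```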
Prerequisites/Requirements
- n8n Instance: A running n8n instance.
- Google Sheets Account: Access to a Google Sheet containing your competitor product data (product name, URL, current price, and potentially historical prices).
- Google Gemini API Key: An API key for the Google Gemini Chat Model, configured as an n8n credential.
- Slack Account: A Slack workspace and a channel where you want to receive alerts, configured as an n8n credential.
Setup/Usage
- Import the Workflow: Import the provided JSON into your n8n instance.
- Configure Credentials:
- Google Sheets: Set up your Google Sheets credential to allow n8n to read data from your spreadsheet.
- Google Gemini Chat Model: Configure your Google Gemini API key as a credential for the "Google Gemini Chat Model" node.
- Slack: Set up your Slack credential for the "Slack" node, specifying the workspace and channel.
- Update Google Sheet Node:
- In the "Google Sheets" node, specify the Spreadsheet ID and the Sheet Name where your competitor data is located.
- Configure AI Agent (Google Gemini Chat Model):
- Review the prompt and instructions within the "AI Agent" and "Google Gemini Chat Model" nodes to ensure they align with how you want the AI to analyze price changes. You might need to adjust the prompt to explicitly guide the AI on how to interpret historical vs. current prices from your Google Sheet data.
- Adjust 'If' Node Conditions:
- In the "If" node, modify the conditions to define what constitutes a "significant" price change (e.g., a price drop of more than 5%, or an increase of more than $10).
- Customize Slack Message:
- In the "Edit Fields" (Set) node before Slack, customize the message content to be sent to Slack, including dynamic data from the AI's analysis.
- Set Schedule Trigger:
- Configure the "Schedule Trigger" node to run the workflow at your desired frequency (e.g., every hour, once a day).
- Activate the Workflow: Once configured, activate the workflow to start monitoring.
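For example, the "significant change" condition from step 5 could be written as a single expression (the `diff`, `new_price`, and `old_price` fields are assumptions for illustration):

```
{{ $json.diff < -5 || ($json.new_price - $json.old_price) > 10 }}
```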