Competitor intelligence agent: SERP monitoring + summary with Thordata + OpenAI
Who is this for?
This workflow is designed for:
- Marketing analysts, SEO specialists, and content strategists who want automated intelligence on their online competitors.
- Growth teams that need quick insights from SERPs (Search Engine Results Pages) without manual data scraping.
- Agencies managing multiple clients’ SEO presence and tracking competitive positioning in real-time.
What problem is this workflow solving?
Manual competitor research is time-consuming, fragmented, and often lacks actionable insights. This workflow automates the entire process by:
- Fetching SERP results from multiple search engines (Google, Bing, Yandex, DuckDuckGo) using Thordata’s Scraper API.
- Using OpenAI GPT-4.1-mini to analyze, summarize, and extract keyword opportunities, topic clusters, and competitor weaknesses.
- Producing structured, JSON-based insights ready for dashboards or reports.
Essentially, it transforms raw SERP data into strategic marketing intelligence — saving hours of research time.
What this workflow does
Here’s a step-by-step overview of how the workflow operates:
Step 1: Manual Trigger
Initiates the process on demand when you click “Execute Workflow.”
Step 2: Set the Input Query
The “Set Input Fields” node defines your search query, such as:
> “Top SEO strategies for e-commerce in 2025”
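Under the hood, this amounts to a single field holding the query string that the SERP tools and prompts reference downstream. A minimal sketch, assuming the field is named search_query (the variable mentioned in the customization section); match it to whatever your Set node actually defines:

```ts
// Illustrative only: "search_query" mirrors the variable referenced later in
// "How to customize this workflow"; rename it to match your Set node.
const inputFields = {
  search_query: "Top SEO strategies for e-commerce in 2025",
};
```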
Step 3: Multi-Engine SERP Fetching
Four HTTP Request tool nodes send the query to the Thordata Scraper API to retrieve results from:
- Google
- Bing
- Yandex
- DuckDuckGo
Each request uses Bearer authentication configured via the “Thordata SERP Bearer Auth Account” credential.
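Conceptually, each tool node performs an authenticated request along the lines of the sketch below. The endpoint URL and body parameter names are placeholders, not Thordata’s documented API; copy the real values from Thordata’s documentation and your HTTP Request node settings.

```ts
// Sketch of one SERP fetch (Node 18+ global fetch). The endpoint URL and body
// parameter names are placeholders/assumptions, not Thordata's documented API;
// only the Bearer token pattern mirrors the workflow's credential.
async function fetchSerp(engine: string, query: string): Promise<unknown> {
  const response = await fetch("https://<thordata-scraper-api-endpoint>", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.THORDATA_API_TOKEN}`, // "Thordata SERP Bearer Auth Account"
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ engine, query }), // assumed parameter names
  });
  return response.json();
}
```

Calling this once per engine (google, bing, yandex, duckduckgo) mirrors the four parallel tool nodes in the workflow.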
Step 4: AI Agent Processing
The LangChain AI Agent orchestrates the data flow, combining inputs and preparing them for structured analysis.
Step 5: SEO Analysis
The SEO Analyst node (powered by GPT-4.1-mini) parses the SERP results into a structured schema (an illustrative sketch follows this list), extracting:
- Competitor domains
- Page titles & content types
- Ranking positions
- Keyword overlaps
- Traffic share estimations
- Strengths and weaknesses
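A minimal sketch of what such a structured schema could look like, using illustrative property names based on the fields above (the template’s actual schema may differ):

```ts
// Illustrative schema only; property names are assumptions based on the fields above.
interface CompetitorSerpEntry {
  domain: string;               // competitor domain
  pageTitle: string;            // title of the ranking page
  contentType: string;          // e.g. "blog post", "product page"
  position: number;             // organic ranking position
  keywordOverlaps: string[];    // keywords shared with your target query
  trafficShareEstimate: number; // rough share of clicks, 0-1
  strengths: string[];
  weaknesses: string[];
}
```

This is the kind of shape the Structured Output Parser in Step 8 can validate before the data moves on to reporting tools.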
Step 6: Summarization
The “Summarize the content” node distills the complex data into a concise executive summary using GPT-4.1-mini.
Step 7: Keyword & Topic Extraction
The Keyword and Topic Analysis node extracts the following (an example output appears after this list):
- Primary and secondary keywords
- Topic clusters and content gaps
- SEO strength scores
- Competitor insights
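For illustration, the node’s output might resemble the following; the field names and values are assumptions modelled on the bullets above:

```ts
// Example output shape; every name and value here is an illustrative assumption.
const keywordAnalysisExample = {
  primaryKeywords: ["ecommerce seo", "seo strategies 2025"],
  secondaryKeywords: ["product page optimization", "technical seo audit"],
  topicClusters: ["on-page SEO", "content marketing", "site speed"],
  contentGaps: ["voice search optimization"],
  seoStrengthScore: 72, // 0-100, higher = stronger competitive position
  competitorInsights: ["Competitor A dominates informational queries"],
};
```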
Step 8: Output Formatting
The Structured Output Parser ensures results are clean, validated JSON objects for further integration (e.g., Google Sheets, Notion, or dashboards).
Setup
Prerequisites
- n8n Cloud or Self-Hosted instance
- Thordata Scraper API Key (for SERP data retrieval)
- OpenAI API Key (for GPT-based reasoning)
Setup Steps
1. Add Credentials
- Go to Credentials → Add New → HTTP Bearer Auth → paste your Thordata API token.
- Add OpenAI API credentials for the GPT model.
2. Import the Workflow
- Copy the provided JSON or upload it into your n8n instance.
3. Set Input
- In the “Set the Input Fields” node, replace the example query with your desired topic, e.g. “Google Search for Top SEO strategies for e-commerce in 2025”.
4. Execute
- Click “Execute Workflow” to run the analysis.
How to customize this workflow to your needs
Modify Search Query
Change the search_query variable in the Set Node to any target keyword or topic.
Change AI Model
In the OpenAI Chat Model nodes, you can switch from gpt-4.1-mini to another model for better quality or lower cost.
Extend Analysis
Edit the JSON schema in the “Information Extractor” nodes to add fields such as the following (a sketch of the extra properties appears after this list):
- Sentiment analysis of top pages
- SERP volatility metrics
- Content freshness indicators
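If you extend the schema this way, the extra properties might look roughly like this; the names are assumptions, so pick whatever fits your reporting:

```ts
// Hypothetical extra properties for an extended extraction schema.
interface ExtendedSerpMetrics {
  sentiment: "positive" | "neutral" | "negative"; // tone of the top-ranking page
  serpVolatility: number;                         // 0-1, how much rankings shifted recently
  contentFreshnessDays: number;                   // days since the page was last updated
}
```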
Export Results
Connect the output to:
- Google Sheets / Airtable for analytics
- Notion / Slack for team reporting
- Webhook / Database for automated storage
Summary
This workflow creates an AI-powered Competitor Intelligence System inside n8n by blending:
- Real-time SERP scraping (Thordata)
- Automated AI reasoning (OpenAI GPT-4.1-mini)
- Structured data extraction (LangChain Information Extractors)
n8n Competitor Intelligence Agent: SERP Monitoring & Summary with Thordata & OpenAI
This n8n workflow automates the process of monitoring search engine results pages (SERPs) for competitor intelligence, extracting key information, and summarizing it using AI. It's designed to help you keep track of competitor movements, new content, and market trends by regularly checking specified search queries and processing the results.
What it does
This workflow performs the following key steps:
- Manual Trigger: The workflow is initiated manually, allowing you to control when the competitor intelligence gathering process begins.
- Google Sheets: Although not explicitly configured in the provided JSON, this node is present, suggesting an intended integration for reading competitor search terms or writing extracted data to a Google Sheet.
- Edit Fields (Set): This node is typically used to manipulate or define data within the workflow, such as setting variables or structuring input for subsequent nodes.
- AI Agent: This node acts as an intelligent agent, likely orchestrating calls to various AI models and tools to achieve a specific goal, such as analyzing search results.
- Basic LLM Chain: This node represents a fundamental chain that processes language model inputs and outputs, likely used for general text processing or instruction following.
- OpenAI Chat Model: This node integrates with OpenAI's chat models (e.g., GPT-3.5, GPT-4) to perform natural language processing tasks like summarization, entity extraction, or content analysis on the SERP data.
- Structured Output Parser: This node is used to parse the output from the AI models into a structured format (e.g., JSON), making it easier to consume and use in subsequent steps.
- Information Extractor: This specialized AI chain node is designed to extract specific pieces of information (e.g., competitor names, product mentions, URLs) from unstructured text, which would be crucial for competitor intelligence (see the sketch after this list).
- Merge: This node is used to combine data from multiple branches or steps within the workflow, ensuring all relevant information is brought together for final processing or output.
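As a rough illustration, an extraction record from the Information Extractor could be shaped like this; the field names are assumptions, not the template’s actual schema:

```ts
// Hypothetical extraction result; adjust the attributes to your own schema.
interface ExtractedCompetitorMention {
  competitorName: string;    // e.g. "Acme Corp"
  productMentions: string[]; // products referenced in the result snippet
  url: string;               // source URL from the SERP result
}
```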
Prerequisites/Requirements
To use this workflow, you will need:
- n8n Instance: A running instance of n8n.
- OpenAI API Key: For the OpenAI Chat Model node.
- Google Sheets Account: (Potentially) If the Google Sheets node is intended to be used for input or output.
- Thordata (or similar SERP API): While not explicitly present in the provided JSON, the workflow's name "SERP monitoring" strongly suggests an external service like Thordata would be used to fetch SERP data. This would likely be integrated via an HTTP Request node (not shown in the provided JSON).
Setup/Usage
- Import the workflow: Import the provided JSON into your n8n instance.
- Configure Credentials:
- Set up your OpenAI API Key credential in n8n.
- (If applicable) Configure your Google Sheets credential.
- Review and Configure Nodes:
- Google Sheets: If you plan to use it, configure the spreadsheet ID, sheet name, and operation (e.g., "Read" for keywords, "Write" for results).
- Edit Fields (Set): Adjust this node to define any initial data or variables needed for your competitor analysis, such as target keywords or competitor names.
- AI Agent, Basic LLM Chain, OpenAI Chat Model, Structured Output Parser, Information Extractor: Review the configurations of these AI-related nodes. You may need to adjust prompts, models, or output schemas based on your specific intelligence gathering needs.
- (Missing): You will likely need to add an HTTP Request node or a dedicated SERP API node (if available) to fetch search results from a service like Thordata. This node would typically come after the trigger and before the AI processing.
- Execute Workflow: Click the "Execute workflow" button to run the workflow manually and begin monitoring.
This workflow provides a robust framework for automating competitor intelligence by leveraging AI to process and summarize SERP data, making it easier to stay informed about your competitive landscape.