
Automate SEO analysis for multiple domains with Ahrefs and Google Sheets

Michael Muenzer
1157 views
2/3/2026

This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Fetch SEO and traffic information from Ahrefs for a list of domains in a Google Sheet. This is great for marketing research and SEO workflow optimization, and it saves a lot of time.

How it works

  • Import the domains from the Google Sheet
  • Use an SEO MCP server to fetch data from Ahrefs' free tooling
  • Write the fetched data back to the Google Sheet (the sketch below shows how a row might look before and after enrichment)
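
For orientation, here is a rough sketch of how a single row might look as it moves through the workflow. The enriched field names (domain_rating, organic_traffic) are assumptions for illustration; the actual columns depend on your sheet template and on what the seo-mcp tools return.

  // Row as read from the Google Sheet (you fill in the first column)
  const inputItem = { json: { domain: 'example.com' } };

  // Row after the SEO MCP lookup, ready to be written back to the sheet
  const enrichedItem = {
    json: {
      domain: 'example.com',
      domain_rating: 57,       // hypothetical value from the SEO tooling
      organic_traffic: 12400,  // hypothetical value from the SEO tooling
    },
  };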

Set up steps

  • Copy the Google Sheet template and select it in all Google Sheets nodes
  • Make sure that n8n has read & write permissions for your Google Sheet
  • Add your list of domains to the first column of the Google Sheet
  • Add MCP credentials for seo-mcp

Automate SEO Analysis for Multiple Domains with Google Sheets

This n8n workflow provides a framework for automating SEO analysis across multiple domains. It reads a list of domains from a Google Sheet, processes each one, and leaves room to integrate with external SEO tools such as Ahrefs through a customizable Code node (the Ahrefs integration itself is not included in the workflow JSON). The workflow uses a manual trigger, allowing you to initiate the analysis on demand.

What it does

  1. Triggers Manually: The workflow starts when you click 'Execute workflow' in n8n.
  2. Reads Domains from Google Sheets: It connects to a specified Google Sheet to retrieve a list of domains or other data points that need analysis.
  3. Loops Over Items: The workflow is designed to process each item (e.g., each domain) from the Google Sheet individually or in batches, allowing for scalable analysis.
  4. Executes Custom Code (Placeholder): A Code node is included, which can be customized to perform specific actions for each domain. This is where you would integrate with external APIs (like Ahrefs) or perform custom data manipulation.

Prerequisites/Requirements

  • n8n Instance: A running instance of n8n.
  • Google Account: A Google account with access to Google Sheets.
  • Google Sheets Credential: An n8n credential configured for Google Sheets.
  • Custom Code Logic (Optional but Recommended): You will need to write JavaScript code within the Code node to define the specific SEO analysis steps you want to perform for each domain. This might involve API calls to SEO tools like Ahrefs, data parsing, or other operations.

Setup/Usage

  1. Import the Workflow:
    • Copy the provided JSON code.
    • In your n8n instance, create a new workflow.
    • Open the workflow menu and import the copied JSON (paste it directly onto the canvas, or save it to a file and use "Import from File...").
  2. Configure Google Sheets Node:
    • Click on the "Google Sheets" node.
    • Select or create a Google Sheets credential.
    • Specify the Spreadsheet ID and Sheet Name where your domain list is located.
    • Ensure the operation is set to "Read" and configure any other options (e.g., range) as needed to fetch your domain data.
  3. Customize the Code Node:
    • Click on the "Code" node.
    • Replace the placeholder JavaScript code with your custom logic. This is where you'd typically:
      • Access the domain name from the incoming item (e.g., item.json.domain).
      • Make API calls to SEO tools (e.g., Ahrefs, SEMrush) using the this.helpers.httpRequest helper.
      • Process the API responses.
      • Format the output data.
    • Example (conceptual; a fuller sketch with error handling and output formatting for the sheet write-back follows these setup steps):
      for (const item of items) {
        const domain = item.json.domain; // Assumes your Google Sheet has a 'domain' column

        // Make an API call to Ahrefs (hypothetical; requires an Ahrefs API key and plan).
        // const ahrefsResponse = await this.helpers.httpRequest({
        //   method: 'GET',
        //   url: `https://api.ahrefs.com/v3/site-explorer/overview?target=${domain}&output=json`,
        //   headers: {
        //     'Authorization': 'Bearer YOUR_AHREFS_API_KEY',
        //   },
        // });
        // item.json.ahrefsData = ahrefsResponse; // httpRequest resolves to the response body

        item.json.processedDomain = `Processed: ${domain}`; // Placeholder for actual processing
      }
      return items;
      
  4. Configure Loop Over Items:
    • The "Loop Over Items" node is pre-configured to handle batches. You can adjust the "Batch Size" if needed based on API rate limits or performance considerations.
  5. Add Output Nodes (Optional):
    • After the "Code" node, you might want to add additional nodes to:
      • Write the processed data back to another Google Sheet.
      • Send notifications (e.g., Slack, Email) with analysis summaries.
      • Store data in a database.
  6. Execute the Workflow:
    • Click the "Execute Workflow" button in the n8n editor to run the workflow manually.
    • Review the output in the n8n interface to ensure it's functioning as expected.
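
To tie steps 3-5 together, here is a slightly fuller, conceptual sketch of what the Code node could look like once a real API call is in place. The endpoint URL, API key placeholder, and the domain_rating response field are assumptions for illustration; adapt them to whichever SEO tool you actually use, and keep the output flat so a downstream Google Sheets node can map each field to a column.

  // n8n Code node, "Run Once for All Items" mode.
  const results = [];

  for (const item of items) {
    const domain = item.json.domain; // assumes a 'domain' column in the sheet

    try {
      // Hypothetical API call; replace URL, auth, and response handling
      // with the SEO tool you actually use.
      const response = await this.helpers.httpRequest({
        method: 'GET',
        url: `https://api.example-seo-tool.com/overview?target=${encodeURIComponent(domain)}`,
        headers: { Authorization: 'Bearer YOUR_API_KEY' },
      });

      // Flatten the (assumed JSON) response into columns for the write-back.
      results.push({
        json: {
          domain,
          domain_rating: response.domain_rating ?? null, // hypothetical field
          status: 'ok',
        },
      });
    } catch (error) {
      // Keep going on failures so one bad domain does not stop the whole batch.
      results.push({ json: { domain, status: `error: ${error.message}` } });
    }
  }

  return results;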

Related Templates

Dynamic Hubspot lead routing with GPT-4 and Airtable sales team distribution

AI Agent for Dynamic Lead Distribution (HubSpot + Airtable). This intelligent n8n workflow automates end-to-end lead qualification and allocation by integrating HubSpot, Airtable, OpenAI, Gmail, and Slack. The system ensures that every new lead is instantly analyzed, scored, and routed to the best-fit sales representative, all powered by AI logic.

Key advantages

  • Real-time lead routing: automatically assigns new leads from HubSpot to the most relevant sales rep based on region, capacity, and expertise.
  • AI qualification engine: an OpenAI-powered agent evaluates the lead's industry, region, and needs to generate a persona summary and routing rationale.
  • Centralized tracking in Airtable: every lead is logged and updated in Airtable with AI insights, rep details, and allocation status for full transparency.
  • Instant notifications: Slack and Gmail integrations alert the assigned rep immediately with full lead details and AI-generated notes.
  • Seamless CRM sync: updates the original HubSpot record with lead persona, routing info, and timeline notes for an audit-ready history.

How it works

  1. HubSpot Trigger: captures a new lead as soon as it is created in HubSpot.
  2. Fetch Contact Data: retrieves all relevant fields such as name, company, and industry.
  3. Clean & Format Data: a Code node standardizes and structures the data for consistency (see the sketch below for a rough illustration).
  4. Airtable Record Creation: logs the lead data into the "Leads" table for centralized tracking.
  5. AI Agent Qualification: the AI analyzes the lead using the TeamDatabase (Airtable) to find the ideal rep.
  6. Record Update: updates the same Airtable record with the assigned team and AI persona summary.
  7. Slack Notification: sends a real-time message tagging the rep with the lead info.
  8. Gmail Notification: sends a personalized handoff email with context and follow-up actions.
  9. HubSpot Sync: updates the original contact in HubSpot with the assignment details and AI rationale.

Setup steps

  1. Trigger node: HubSpot, to detect new leads.
  2. HubSpot node: retrieve complete lead details.
  3. Code node: clean and normalize the data.
  4. Airtable node: log lead info in the "Leads" table.
  5. AI Agent node: process the lead and match it with the sales team.
  6. Slack node: notify the designated representative.
  7. Gmail node: email the rep with details.
  8. HubSpot node: update the CRM with the AI summary and allocation status.

Credentials required

  • HubSpot OAuth2 API, to fetch and update leads
  • Airtable Personal Access Token, to store and update lead data
  • OpenAI API, to power the AI qualification and matching logic
  • Slack OAuth2, for sending team notifications
  • Gmail OAuth2, for automatic email alerts to assigned reps

Ideal for

  • Sales Operations and RevOps teams managing multiple regions
  • B2B SaaS and enterprise teams handling large lead volumes
  • Marketing teams requiring AI-driven, bias-free lead assignment
  • Organizations optimizing CRM efficiency with automation

Bonus tip: you can easily extend this workflow by adding lead scoring logic, language translation for follow-ups, or a Salesforce integration. The entire system is modular, so it scales across global sales teams.
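
As a rough illustration of the "Clean & Format Data" step in that template, a Code node doing this kind of normalization might look like the sketch below. The property names (firstname, lastname, company, industry, region) are assumptions for illustration; match them to the actual HubSpot fields your trigger returns.

  // n8n Code node, "Run Once for All Items": normalize contact fields before logging to Airtable.
  return items.map((item) => {
    const c = item.json;
    return {
      json: {
        name: `${(c.firstname || '').trim()} ${(c.lastname || '').trim()}`.trim(),
        company: (c.company || '').trim(),
        industry: (c.industry || 'unknown').toLowerCase(),
        region: (c.region || 'unassigned').toUpperCase(),
      },
    };
  });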

By MANISH KUMAR
113

Track daily moods with AI analysis & reports using GPT-4o, Data Tables & Gmail

Track your daily mood in one tap and receive automated AI summaries of your emotional trends every week and month. Perfect for self-reflection, wellness tracking, or personal analytics. This workflow logs moods sent through a webhook (/mood) into Data Tables, analyzes them weekly and monthly with OpenAI (GPT-4o), and emails you clear summaries and actionable recommendations via Gmail.

How it works

  • Webhook – Mood: collects new entries (🙂, 😐, or 😩) plus an optional note.
  • Set Mood Data: adds date, hour, and note fields automatically.
  • Insert Mood Row: stores each record in a Data Table.
  • Weekly Schedule (Sunday 20:00): aggregates the last 7 days and sends a summarized report.
  • Monthly Schedule (Day 1 at 08:00): aggregates the last 30 days for a deeper AI analysis.
  • OpenAI Analysis: generates insights, patterns, and 3 actionable recommendations.
  • Gmail: sends the full report (chart + AI text) to your inbox.

Example auto-email

  Weekly Mood Summary (last 7 days)
  🙂 5 ██████████
  😐 2 ████
  😩 0
  Average: 1.7 (Positive 🙂)
  AI Insights: You're trending upward this week; notes show that exercise days improved mood. Try keeping short walks mid-week to stabilize energy.

Requirements

  • n8n Data Tables enabled
  • OpenAI credential (GPT-4o or GPT-4 Turbo)
  • Gmail OAuth2 credential to send summaries

Setup instructions

  1. Connect your credentials: add your own OpenAI and Gmail OAuth2 credentials.
  2. Set your Data Table ID: open the Insert Mood Row node and enter your own Data Table ID. Without this, new moods won't be stored.
  3. Replace the email placeholder: in the Gmail nodes, replace your.email@example.com with your actual address.
  4. Deploy and run: send a test POST request to /mood (e.g. { "mood": "🙂", "note": "productive day" }) to log your first entry; a sketch follows below.

Note: before activating the workflow, make sure the Data Table ID is configured in the "Insert Mood Row" node.

AI analysis: interprets mood patterns using GPT-4o, highlights trends and potential triggers, suggests 3 specific actions, and runs automatically every week and month.

Security: no personal data is exposed outside your n8n instance. Always remove or anonymize credential references before sharing publicly.

Ideal for

  • Personal mood journaling and AI feedback
  • Therapists tracking client progress
  • Productivity or self-quantification projects

Sticky notes guide

  • Mood Logging Webhook: POST /mood receives the mood plus an optional note. Configure your own Data Table ID in the "Insert Mood Row" node before running.
  • Weekly Summary: runs every Sunday at 20:00, aggregates the last 7 days, and emails an AI insights report.
  • Monthly Summary: runs on day 1 at 08:00, aggregates the last 30 days, and creates a monthly reflection.
  • AI Analysis: uses OpenAI GPT-4o to interpret trends and recommend actions.
  • Email Delivery: sends formatted summaries to your inbox automatically.
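
As a quick reference for step 4 above, a test entry could be sent from Node.js (18+) roughly like this. The /mood path and payload shape come from the template description; the base URL is a placeholder for your own n8n instance, and the prefix is typically webhook-test/ while testing in the editor and webhook/ once the workflow is active.

  // Send a test mood entry to the template's webhook.
  fetch('https://your-n8n-instance.example/webhook/mood', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ mood: '🙂', note: 'productive day' }),
  }).then((res) => console.log('Logged mood, HTTP status:', res.status));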

By Jose Castillo
105

Create, update, and get a person from Copper

This workflow allows you to create, update, and get a person from Copper. Copper node: This node will create a new person in Copper. Copper1 node: This node will update the information of the person that we created using the previous node. Copper2 node: This node will retrieve the information of the person that we created earlier.

By Harshil Agrawal
603