SEO Keyword Analysis and Filter
Use case
This workflow is designed for e-commerce brands and content teams who:
- Need to scale SEO content production without sacrificing quality
- Want to eliminate manual keyword filtering (saves 10+ hours/week)
- Aim to dominate niche search terms (e.g., "vegan leather crossbody bags")
What this workflow does
Automates the end-to-end process from keyword discovery to publish-ready articles:
- Keyword Harvesting: Pulls 1,000+ keywords/day from SEMrush/Ahrefs
- Smart Filtering: Blocks competitor brands (e.g., "Zara alternatives"), detects irrelevant demographics ("kids", "petite"), and flags non-compliant colors (any term other than black/white); see the sketch after this list
- AI Content Generation: Turns approved keywords into publish-ready article drafts
- Multi-Channel Output: Formats content for blogs, product descriptions, and email campaigns
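To make the filtering rules concrete, the sketch below shows how they might look if written out in an n8n Code node ("Run Once for All Items" mode). It is illustrative only: the Keyword field name, the term lists, and the Code-node approach are assumptions, not the template's actual configuration.

```javascript
// Minimal sketch of the filtering rules, not the template's actual node code.
// Every term list below is an illustrative assumption.
const blockedBrands = ['zara', 'h&m'];            // competitor brand terms
const blockedDemographics = ['kids', 'petite'];   // irrelevant demographics
const nonCompliantColors = ['red', 'blue', 'green', 'beige']; // anything outside black/white

return $input.all().filter((item) => {
  const kw = String(item.json.Keyword ?? '').toLowerCase();
  const blocked =
    blockedBrands.some((t) => kw.includes(t)) ||
    blockedDemographics.some((t) => kw.includes(t)) ||
    nonCompliantColors.some((t) => kw.includes(t));
  return !blocked; // keep only keywords that pass every rule
});
```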
Setup
- Add Google, SEMrush, and OpenAI credentials
- Set up the rules spreadsheet in Google Drive (see the example layout after this list)
- Test the workflow with a manual run
- Review generated opportunity report in Google Sheets
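If you are unsure how to lay out the rules spreadsheet, one possible structure is shown below. The column names and values are purely illustrative assumptions; adapt them to your own blocking and allow rules.

```
BlockedBrand | BlockedDemographic | AllowedColor
Zara         | kids               | black
H&M          | petite             | white
```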
How to adjust this template
- Change the scenario: replace the filtering rules and define a different target niche
n8n SEO Keyword Analysis and Filter Workflow
This n8n workflow automates the process of analyzing SEO keywords from a Google Sheet, classifying them using an AI model, and then filtering them based on specific criteria. It's designed to help SEO specialists efficiently manage and categorize keywords, ensuring only relevant ones are processed further.
What it does
This workflow performs the following key steps:
- Triggers Manually: The workflow is initiated manually, allowing for on-demand analysis.
- Reads Keywords from Google Sheet: It fetches SEO keywords from a specified Google Sheet.
- Classifies Keywords with AI: Each keyword is sent to an OpenAI Chat Model for classification, determining if it's a "Brand" or "Non-Brand" keyword.
- Filters Keywords: Keywords are then filtered based on their classification.
- One branch processes "Brand" keywords.
- Another branch processes "Non-Brand" keywords.
- Aggregates Filtered Keywords: The filtered keywords from both branches are aggregated back into a single stream.
- Extracts Data from File (Placeholder): A disconnected placeholder node for extracting data from a file, leaving room to process file-based keyword lists later.
- Edits Fields (Placeholder): A disconnected placeholder node for editing fields, leaving room for future data manipulation.
- Writes to Google Drive (Placeholder): A disconnected placeholder node for writing data to Google Drive, leaving room for future output to cloud storage.
Prerequisites/Requirements
- n8n Instance: A running n8n instance.
- Google Sheets Account: To store and retrieve your SEO keywords.
- OpenAI API Key: For the OpenAI Chat Model node to classify keywords.
- Google Drive Account (Optional, for future enhancements): If you plan to use the Google Drive node.
Setup/Usage
- Import the Workflow: Download the provided JSON and import it into your n8n instance.
- Configure Credentials:
- Google Sheets: Set up your Google Sheets credential to allow n8n to read data from your spreadsheet.
- OpenAI Chat Model: Provide your OpenAI API key in the credentials for the OpenAI Chat Model node.
- Configure Google Sheets Node:
- Specify the Spreadsheet ID and Sheet Name where your SEO keywords are located.
- Ensure your keywords are in a column that can be easily referenced (e.g., "Keyword").
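As a quick illustration (the header and rows are examples only, not part of the template), the input sheet can be as simple as a single keyword column:

```
Keyword
vegan leather crossbody bags
Zara alternatives
petite crossbody bag
```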
- Configure OpenAI Chat Model:
- Review the prompt used for classification to ensure it aligns with your definition of "Brand" and "Non-Brand" keywords. Adjust if necessary.
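As a reference point only (this is not the template's bundled prompt, and the Keyword expression is an assumption), a minimal classification prompt could read:

```
Classify the following SEO keyword as either "Brand" or "Non-Brand".
A "Brand" keyword contains a specific brand or retailer name; a "Non-Brand"
keyword describes a product or topic generically.
Reply with exactly one word: Brand or Non-Brand.

Keyword: {{ $json.Keyword }}
```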
- Configure Filter Nodes:
- The "If" node is configured to check the output of the OpenAI Chat Model. Verify the conditions match the expected output format of your AI classification (e.g.,
{{ $json.classification === 'Brand' }}).
- The "If" node is configured to check the output of the OpenAI Chat Model. Verify the conditions match the expected output format of your AI classification (e.g.,
- Execute the Workflow: Click "Execute workflow" on the Manual Trigger node to run the analysis.
- Review Results: After execution, inspect the output of the Aggregate node to see the classified and filtered keywords.
Note: The Extract from File, Edit Fields, and Google Drive nodes are currently disconnected placeholders. To use them, you would need to connect them into the workflow and configure their respective operations.