Find top keywords for YouTube and Google and store them in NocoDB
Template Description
WDF Top Keywords: This workflow streamlines keyword research by automating the generation, filtering, and analysis of Google and YouTube keyword data. Ensure compliance with local regulations and API terms of service when using this workflow.
📌 Purpose
The WDF Top Keywords workflow automates collecting, processing, and managing keyword data for both Google and YouTube. By leveraging multiple data sources and APIs, it offers an efficient, scalable approach to identifying high-impact keywords for SEO, content creation, and marketing campaigns.
Key Features
- Automates the generation of keyword suggestions using autocomplete APIs.
- Integrates with NocoDB to store and manage keyword data.
- Filters keywords based on monthly search volume and cost-per-click (CPC).
- Supports bulk import of keyword data into structured databases.
- Outputs both Google and YouTube keyword insights, enabling informed decision-making.
🎯 Target Audience
This workflow is ideal for:
- Digital marketers aiming to optimize ad campaigns with data-driven insights.
- SEO specialists looking to identify high-potential keywords efficiently.
- Content creators seeking trending and relevant topics for their platforms.
- Agencies managing keyword research for multiple clients.
⚙️ How It Works
- Trigger: The workflow runs on-demand or at scheduled intervals.
- Keyword Generation:
  - Retrieves base keywords from NocoDB.
  - Generates autocomplete suggestions for Google and YouTube.
- Data Processing:
  - Filters and formats keyword data based on specific criteria (e.g., search volume, CPC).
  - Consolidates results for efficient storage and analysis.
- Storage and Output:
  - Saves data into structured NocoDB tables for tracking and reuse.
  - Bulk imports monthly search volume statistics for detailed analysis.
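The filtering step is not spelled out in the template itself. As a rough illustration, an n8n Code node along these lines could apply search-volume and CPC thresholds; the field names (keyword, search_volume, cpc) and the threshold values are assumptions, not taken from the workflow, so adjust them to your data and criteria.

```javascript
// Hedged sketch of the filtering step (n8n Code node, "Run Once for All Items").
// Field names and thresholds are illustrative placeholders.
const MIN_SEARCH_VOLUME = 500; // assumed minimum monthly searches
const MAX_CPC = 2.5;           // assumed maximum CPC in USD

return $input.all()
  .filter((item) => {
    const { search_volume, cpc } = item.json;
    return search_volume >= MIN_SEARCH_VOLUME && (cpc ?? 0) <= MAX_CPC;
  })
  .map((item) => ({
    json: {
      keyword: item.json.keyword,
      search_volume: item.json.search_volume,
      cpc: item.json.cpc,
    },
  }));
```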
🛠️ Key APIs and Tools Used
- NocoDB: Stores and organizes base and processed keyword data.
- DataForSEO API: Provides search volume and keyword performance metrics.
- Google Autocomplete API: Suggests relevant Google search terms.
- YouTube Autocomplete API: Suggests trending YouTube keywords.
- Social Flood Docker Instance: Serves as the local integration hub.
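For illustration only, the snippet below shows how autocomplete suggestions can be fetched directly from the public suggest endpoint that tools like Social Flood wrap. The URL, parameters, and response shape refer to an undocumented endpoint and are assumptions; the workflow itself routes these calls through your Social Flood instance.

```javascript
// Hedged sketch: fetch Google or YouTube autocomplete suggestions.
// Uses the public, undocumented suggest endpoint (subject to change).
async function getSuggestions(term, platform = "google") {
  const ds = platform === "youtube" ? "&ds=yt" : "";
  const url =
    `https://suggestqueries.google.com/complete/search?client=firefox${ds}` +
    `&q=${encodeURIComponent(term)}`;
  const res = await fetch(url);
  // Response shape (assumed): ["query", ["suggestion 1", "suggestion 2", ...]]
  const [, suggestions] = await res.json();
  return suggestions;
}

// Example: getSuggestions("keyword research", "youtube").then(console.log);
```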
Setup Instructions
- Required tools: see the 🛠️ Key APIs and Tools Used section above.
- Create the following NocoDB tables:
  - Base Keyword Search
  - Second Order Google Keywords
  - Second Order YouTube Keywords
  - Search Volume
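As a rough sketch of what the storage step amounts to, the snippet below inserts keyword rows through NocoDB's REST API. The base URL, table ID, API token, and column names are placeholders; in the workflow itself the NocoDB node performs this call for you.

```javascript
// Hedged sketch: bulk-insert keyword rows into a NocoDB table via its REST API.
// All identifiers below are placeholders — use your own instance, table, and token.
const NOCODB_URL = "http://localhost:8080";   // assumed local instance
const TABLE_ID = "YOUR_TABLE_ID";             // e.g. the "Base Keyword Search" table
const API_TOKEN = "YOUR_NOCODB_API_TOKEN";

async function insertKeywords(rows) {
  const res = await fetch(`${NOCODB_URL}/api/v2/tables/${TABLE_ID}/records`, {
    method: "POST",
    headers: { "xc-token": API_TOKEN, "Content-Type": "application/json" },
    // Example rows (hypothetical column names): [{ Keyword: "seo tools", Source: "google" }]
    body: JSON.stringify(rows),
  });
  return res.json();
}
```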
This template empowers users to handle complex keyword research tasks effortlessly, saving time and providing actionable insights. Share this template to enhance your workflow efficiency!
n8n Workflow: Keyword Research and NocoDB Storage (Incomplete)
This n8n workflow is designed to perform keyword research, filter the results, and store them in NocoDB. It is currently incomplete: the nodes that would perform the actual keyword research (e.g., the API call to a keyword research service) are present only as unconfigured placeholders.
What it does (Planned/Incomplete)
The workflow outlines the following steps:
- Trigger: It can be manually executed or scheduled to run at specific intervals.
- HTTP Request (Placeholder): An HTTP Request node is included, likely intended for making an API call to a keyword research service (e.g., YouTube, Google, or a dedicated SEO tool) to fetch keyword data. This node is currently not configured and is a placeholder.
- Code (Placeholder): A Code node is present, suggesting that some custom JavaScript logic might be applied to process or transform the data received from the HTTP Request. This node is currently empty.
- Split Out (Placeholder): A Split Out node is included, which would typically be used to break down an array of items into individual items for further processing. This suggests the API response might contain multiple keywords or data points.
- Loop Over Items (Placeholder): A Loop Over Items (Split in Batches) node is configured to process items in batches, indicating that the workflow expects to handle a list of keywords or data.
- Filter (Placeholder): A Filter node is included, implying that the workflow intends to filter the processed keyword data based on certain criteria (e.g., relevance, search volume, competition). This node is not configured.
- If (Placeholder): An If node is present, suggesting conditional logic will be applied. This could be used to route data based on whether it meets certain criteria after filtering. This node is not configured.
- NocoDB (Placeholder): A NocoDB node is included, indicating the final step is to store the filtered and processed keyword data in a NocoDB database. This node is not configured.
- Sticky Note: A sticky note is included, likely for documentation or explanation within the workflow itself.
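To make the placeholder chain more concrete, here is a hedged sketch of what the empty Code node might contain once the HTTP Request node is wired up: it flattens a keyword-API response into one n8n item per keyword. The tasks[].result[] response shape is an assumption (DataForSEO-style) and should be adjusted to whichever API you choose.

```javascript
// Hedged sketch for the placeholder Code node (n8n, "Run Once for All Items").
// Assumes a DataForSEO-style response: { tasks: [{ result: [{ keyword, search_volume, cpc }, ...] }] }
const out = [];
for (const item of $input.all()) {
  const results = item.json.tasks?.[0]?.result ?? [];
  for (const row of results) {
    out.push({
      json: {
        keyword: row.keyword,
        search_volume: row.search_volume,
        cpc: row.cpc,
      },
    });
  }
}
return out; // one item per keyword, ready for Split Out / Loop Over Items
```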
Prerequisites/Requirements
- n8n Instance: A running instance of n8n.
- NocoDB Account/Instance: Access to a NocoDB instance where the keyword data will be stored.
- Keyword Research API Key/Account (Missing): An API key or account for a keyword research service (e.g., Google Keyword Planner API, YouTube Data API, or a third-party SEO tool API) is required to make the HTTP Request node functional.
Setup/Usage
- Import the Workflow:
  - Download the provided JSON file.
  - In your n8n instance, click "New" in the workflows section, then "Import from JSON".
  - Paste the JSON content or upload the file.
- Configure Credentials:
  - You will need to configure a NocoDB credential for the NocoDB node.
  - Crucially, you will need to set up credentials for the keyword research API you intend to use and configure the HTTP Request node accordingly.
- Configure Placeholder Nodes:
  - HTTP Request Node: Edit the HTTP Request node (ID: 19) to connect to your chosen keyword research API. Define the URL, method, headers (including your API key), and body as required by the API (see the example request after these steps).
  - Code Node: Edit the Code node (ID: 834) to add any custom JavaScript logic needed to parse or transform the API response.
  - Filter Node: Edit the Filter node (ID: 844) to define the conditions for filtering your keywords (e.g., minimum search volume, specific keywords to include/exclude).
  - If Node: Edit the If node (ID: 20) to define any conditional routing based on the filtered data.
  - NocoDB Node: Edit the NocoDB node (ID: 510) to specify the base, table, and columns where the keyword data should be stored.
- Activate the Workflow: Once all nodes are configured, activate the workflow.
- Execute:
  - You can manually execute the workflow by clicking "Execute Workflow" with the Manual Trigger node.
  - If using the Schedule Trigger, the workflow will run automatically at the defined intervals.
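If DataForSEO is the keyword service you configure in the HTTP Request node, the request could look roughly like the sketch below. The endpoint path, payload fields, and Basic-auth scheme follow DataForSEO's v3 conventions but should be verified against the current DataForSEO documentation; the credentials, location code, and language code are placeholders.

```javascript
// Hedged sketch of the call the HTTP Request node could be configured to make
// against DataForSEO's search-volume endpoint. Verify the path and payload
// against the official docs before relying on them.
const DATAFORSEO_LOGIN = "you@example.com";  // placeholder credential
const DATAFORSEO_PASSWORD = "YOUR_PASSWORD"; // placeholder credential

async function fetchSearchVolume(keywords) {
  const auth = Buffer.from(`${DATAFORSEO_LOGIN}:${DATAFORSEO_PASSWORD}`).toString("base64");
  const res = await fetch(
    "https://api.dataforseo.com/v3/keywords_data/google_ads/search_volume/live",
    {
      method: "POST",
      headers: { Authorization: `Basic ${auth}`, "Content-Type": "application/json" },
      // location_code 2840 (United States) and language_code "en" are example values.
      body: JSON.stringify([{ keywords, language_code: "en", location_code: 2840 }]),
    }
  );
  return res.json();
}
```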