Create SEO content brief from keyword to Google Doc
Who’s it for
Content/SEO teams who want a fast, consistent, research-driven brief for copywriters from a single keyword, without manually reviewing and analyzing the SERP (Google results).
How it works / What it does
- Form Trigger collects the keyword/topic and, once the final node finishes, redirects the submitter to your Google Drive folder.
- FireCrawl Search & Scrape pulls the top 5 pages for the chosen keyword.
- AI Agent (with Think + OpenAI Chat Model) analyzes sources and generates an original Markdown brief.
- Markdown to JSON converts the Markdown into Google Docs batchUpdate requests (H1/H2/H3, lists, links, spacing), which Update a document then applies to the empty doc (see the sketch after this list).
- Create a document + Update a document create a Google Doc titled “SEO Brief for <keyword>” in your target Drive folder and fill it with the generated brief.
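The Markdown-to-batchUpdate mapping is the least obvious step, so here is a minimal sketch of the idea, assuming a line-based Markdown subset (headings and bullets only). The function name and the subset handled are illustrative; the workflow's actual node may cover more (links, spacing) and differ in detail.

```typescript
// Minimal sketch: turn simple Markdown lines into Google Docs batchUpdate requests.
// Handles #/##/### headings and "-"/"*" bullets only; links and spacing are omitted.
type DocsRequest = Record<string, unknown>;

function markdownToBatchUpdate(markdown: string): DocsRequest[] {
  const requests: DocsRequest[] = [];
  let index = 1; // a Google Doc's body content starts at index 1

  for (const line of markdown.split("\n")) {
    const heading = line.match(/^(#{1,3})\s+(.*)$/);
    const bullet = line.match(/^[-*]\s+(.*)$/);
    const text = (heading ? heading[2] : bullet ? bullet[1] : line) + "\n";
    const start = index;
    const end = index + text.length;

    // Insert the raw text, then style the paragraph that was just inserted.
    requests.push({ insertText: { location: { index: start }, text } });

    if (heading) {
      requests.push({
        updateParagraphStyle: {
          range: { startIndex: start, endIndex: end },
          paragraphStyle: { namedStyleType: `HEADING_${heading[1].length}` },
          fields: "namedStyleType",
        },
      });
    } else if (bullet) {
      requests.push({
        createParagraphBullets: {
          range: { startIndex: start, endIndex: end },
          bulletPreset: "BULLET_DISC_CIRCLE_SQUARE",
        },
      });
    }

    index = end; // indices are cumulative because each insert shifts the document
  }
  return requests;
}
```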
How to set up
- Add credentials: Firecrawl (Authorization header; see the request sketch after these steps), OpenAI (Chat), Google Docs OAuth2.
- Replace placeholders: {{APIKEY}}, {{googledrivefolderid}}, {{googledrivefolderurl}}.
- Publish and open the Form URL to test.
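For reference, the FireCrawl Search & Scrape node boils down to a single authenticated HTTP call. The sketch below shows the shape of that call as an assumption based on Firecrawl's public v1 API; take the exact endpoint, payload fields, and response format from the node's configuration and Firecrawl's docs.

```typescript
// Hedged sketch of the Firecrawl search-and-scrape call (endpoint and payload
// fields are assumptions). In n8n, {{APIKEY}} lives in the credential; here it
// is just a function argument.
async function searchTopPages(keyword: string, apiKey: string) {
  const res = await fetch("https://api.firecrawl.dev/v1/search", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`, // the "Authorization header" credential
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      query: keyword,
      limit: 5,                                 // the workflow analyzes the top 5 results
      scrapeOptions: { formats: ["markdown"] }, // assumption: results scraped as Markdown
    }),
  });
  if (!res.ok) throw new Error(`Firecrawl search failed: ${res.status}`);
  return (await res.json()).data; // assumed shape: array of { url, markdown, ... }
}
```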
Requirements
Firecrawl API key • OpenAI API key • Google account with access to the target Drive folder.
Resources
Google OAuth2 Credentials Setup - https://docs.n8n.io/integrations/builtin/credentials/google/oauth-generic/
Firecrawl API key - https://take.ms/lGcUp
OpenAI API key - https://docs.n8n.io/integrations/builtin/credentials/openai/
n8n Workflow: Create SEO Content Brief from Keyword to Google Doc
This n8n workflow automates generating a comprehensive SEO content brief from a given keyword and saving it directly into a Google Doc. It leverages AI capabilities to understand the keyword and plan the content structure, and it uses an HTTP Request to the Firecrawl API to search and scrape the top-ranking pages before creating the final document.
What it does
This workflow streamlines the content creation process by:
- Triggering on Form Submission: Initiates when a user submits a keyword via an n8n form.
- Preparing Input for AI: Takes the submitted keyword and formats it for the AI Agent.
- AI-Powered Content Planning: Uses an AI Agent (powered by an OpenAI Chat Model) to "Think" and strategize the content brief based on the keyword. This likely involves outlining sections, identifying key topics, or suggesting angles.
- SERP Research (HTTP Request): Calls the Firecrawl API to search the keyword and scrape the top-ranking pages, so the brief is grounded in what already ranks.
- Generating Google Doc: Creates a new Google Doc and populates it with the structured content brief generated in the previous steps.
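Under the hood, the document steps map onto two Google Docs API calls: documents.create for the empty doc and documents.batchUpdate to write the brief into it. A minimal sketch with the googleapis Node.js client follows; the auth object and the folder placement are assumptions, since the n8n Google Docs node handles both for you.

```typescript
// Sketch of what "Create a document" + "Update a document" amount to via the
// Google Docs API. Illustrative only: in n8n, auth comes from the Google Docs
// OAuth2 credential and folder placement from {{googledrivefolderid}}.
import { google } from "googleapis";

async function writeBrief(auth: any, keyword: string, requests: any[]) {
  const docs = google.docs({ version: "v1", auth });

  // "Create a document": an empty doc titled after the keyword.
  const created = await docs.documents.create({
    requestBody: { title: `SEO Brief for ${keyword}` },
  });
  const documentId = created.data.documentId!;

  // "Update a document": apply the batchUpdate requests built from the Markdown brief.
  await docs.documents.batchUpdate({
    documentId,
    requestBody: { requests },
  });

  // Moving the doc into a specific Drive folder is a separate Drive API call
  // (files.update with addParents); the n8n node takes care of it.
  return documentId;
}
```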
Prerequisites/Requirements
To use this workflow, you will need:
- n8n Instance: A running n8n instance.
- OpenAI API Key: For the "OpenAI Chat Model" node to function.
- Google Account: With access to Google Docs, and appropriate n8n credentials configured for Google Docs.
- Firecrawl API Key: The "HTTP Request" node calls the Firecrawl search/scrape API and authenticates with an Authorization header, so you need a Firecrawl account and API key.
Setup/Usage
- Import the Workflow: Download the provided JSON and import it into your n8n instance.
- Configure Credentials:
- OpenAI: Set up your OpenAI API key credential in n8n.
- Google Docs: Set up your Google Account credential with access to Google Docs.
- Configure the "HTTP Request" Node:
- Update the URL, HTTP Method, headers, and body as required to interact with your specific content generation or data enrichment API.
- Configure the "Google Docs" Node:
- Ensure the Google Docs node creates new documents in your desired folder (replace {{googledrivefolderid}} with your target folder's ID) and names them from the submitted keyword, pulling data from the preceding nodes.
- Activate the Workflow: Once all credentials and configurations are set, activate the workflow.
- Submit a Keyword: Use the URL provided by the "On form submission" trigger node to submit a keyword. The workflow will then execute, generating your SEO content brief.
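If you prefer to trigger a test run without opening the form in a browser, a plain POST to the form URL generally works. Everything in the sketch below is an assumption: the form path, the field name ("Keyword"), and the accepted encoding depend on your n8n instance and how the form is configured, so check a real browser submission first.

```typescript
// Hedged sketch of a programmatic test submission. The field name must match
// your form field's label exactly; verify the request your browser sends before
// relying on this.
async function submitTestKeyword(formUrl: string, keyword: string) {
  const body = new FormData();
  body.append("Keyword", keyword); // assumption: the form's single field is labeled "Keyword"

  const res = await fetch(formUrl, { method: "POST", body });
  if (!res.ok) throw new Error(`Form submission failed: ${res.status}`);
  console.log("Workflow triggered; check n8n executions and your Drive folder.");
}

// Example (hypothetical URL):
// submitTestKeyword("https://your-n8n.example.com/form/abc123", "best crm for startups");
```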
Related Templates
Track competitor SEO keywords with Decodo + GPT-4.1-mini + Google Sheets
This workflow automates competitor keyword research using an OpenAI LLM and Decodo for intelligent web scraping.

Who this is for
- SEO specialists, content strategists, and growth marketers who want to automate keyword research and competitive intelligence.
- Marketing analysts managing multiple clients or websites who need consistent SEO tracking without manual data pulls.
- Agencies or automation engineers using Google Sheets as an SEO data dashboard for keyword monitoring and reporting.

What problem this workflow solves
Tracking competitor keywords manually is slow and inconsistent, and most SEO tools provide limited API access or lack contextual keyword analysis. This workflow solves that by:
- Automatically scraping any competitor's webpage with Decodo.
- Using OpenAI GPT-4.1-mini to interpret keyword intent, density, and semantic focus.
- Storing structured keyword insights directly in Google Sheets for ongoing tracking and trend analysis.

What this workflow does
- Trigger — Manually start the workflow or schedule it to run periodically.
- Input Setup — Define the website URL and target country (e.g., https://dev.to, france).
- Data Scraping (Decodo) — Fetch competitor web content and metadata.
- Keyword Analysis (OpenAI GPT-4.1-mini) — Extract primary and secondary keywords, identify focus topics and semantic entities, generate a keyword density summary and SEO strength score, and recommend optimization and internal linking opportunities.
- Data Structuring — Clean and convert the GPT output into JSON format (sketched below).
- Data Storage (Google Sheets) — Append structured keyword data to a Google Sheet for long-term tracking.

Setup
- Prerequisites: an n8n account with workflow editor access, Decodo API credentials (if you are new to Decodo, sign up at visit.decodo.com), an OpenAI API key, and a Google Sheets account connected via OAuth2. Make sure to install the Decodo Community node.
- Create a Google Sheet: add columns for primarykeywords, seostrengthscore, keyworddensity_summary, etc., and share it with your n8n Google account.
- Connect credentials: Decodo API (register, log in, and obtain the Basic Authentication Token via the Decodo Dashboard), OpenAI API (for GPT-4.1-mini), and Google Sheets OAuth2.
- Configure input fields: edit the "Set Input Fields" node to set your target site and region.
- Run the workflow: click Execute Workflow in n8n and view the structured results in your connected Google Sheet.

How to customize this workflow
- Track multiple competitors → use a Google Sheet or CSV list of URLs and loop through them with the Split In Batches node.
- Add language detection → add a Gemini or GPT node before keyword analysis to detect content language and adjust prompts.
- Enhance the SEO report → expand the GPT prompt to include backlink insights, metadata optimization, or readability checks.
- Integrate visualization → connect your Google Sheet to Looker Studio for SEO performance dashboards.
- Schedule auto-runs → use the Cron node to run weekly or monthly for competitor keyword refreshes.

Summary
This workflow combines Decodo for intelligent web scraping, OpenAI GPT-4.1-mini for keyword and SEO analysis, and Google Sheets for live tracking and reporting: a complete AI-powered SEO intelligence pipeline for teams that want actionable insights on keyword gaps, optimization opportunities, and content focus trends without relying on expensive SEO SaaS tools.
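As a rough picture of the Data Structuring step above, the sketch below shows one way the model's JSON output could be flattened into a Google Sheets row. The response shape and field names are assumptions; align them with whatever your prompt and sheet columns actually use.

```typescript
// Hedged sketch of the Data Structuring step: parse the model's output and
// flatten it into one sheet row. The JSON shape is an assumption.
interface KeywordAnalysis {
  primary_keywords: string[];
  secondary_keywords: string[];
  seo_strength_score: number;       // e.g. 0-100
  keyword_density_summary: string;
  recommendations: string[];
}

function toSheetRow(rawModelOutput: string, sourceUrl: string) {
  const parsed: KeywordAnalysis = JSON.parse(rawModelOutput);
  return {
    url: sourceUrl,
    primarykeywords: parsed.primary_keywords.join(", "),
    secondarykeywords: parsed.secondary_keywords.join(", "),
    seostrengthscore: parsed.seo_strength_score,
    keyworddensity_summary: parsed.keyword_density_summary,
    recommendations: parsed.recommendations.join("; "),
    checked_at: new Date().toISOString(),
  };
}
```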
Automated YouTube video uploads with 12h interval scheduling in JST
This workflow automates a batch upload of multiple videos to YouTube, spacing each upload 12 hours apart in Japan Standard Time (UTC+9) and automatically adding them to a playlist.

⚙️ Workflow Logic
- Manual Trigger — Starts the workflow manually.
- List Video Files — Uses a shell command to find all .mp4 files under the specified directory (/opt/downloads/单词卡/A1-A2).
- Sort and Generate Items — Sorts videos by day number (dayXX) extracted from filenames and assigns a sequential order value.
- Calculate Publish Schedule (+12h Interval) — Computes the next rounded JST hour plus a configurable buffer (default 30 min), staggers each video's scheduled time by order × 12 hours, and converts JST back to UTC for YouTube's publishAt field (sketched below).
- Split in Batches (1 per video) — Iterates over each video item.
- Read Video File — Loads the corresponding video from disk.
- Upload to YouTube (Scheduled) — Uploads the video privately with the computed publishAtUtc.
- Add to Playlist — Adds the newly uploaded video to the target playlist.

🕒 Highlights
- Timezone-safe: pure UTC ↔ JST conversion avoids double-offset errors.
- Sequential scheduling: ensures each upload is 12 hours apart to prevent clustering.
- Customizable: change SPANHOURS, BUFFERMIN, or directory paths easily.
- Retry-ready: each upload and playlist step has retry logic to handle transient errors.

💡 Typical Use Cases
- Multi-part educational video series (e.g., A1–A2 English learning).
- Regular content release cadence without manual scheduling.
- Automated YouTube publishing pipelines for pre-produced content.

---
Author: Zane
Category: Automation / YouTube / Scheduler
Timezone: JST (UTC+09:00)
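The scheduling math in "Calculate Publish Schedule (+12h Interval)" is the subtle part, so here is a hedged sketch of it. The constants mirror SPANHOURS and BUFFERMIN from the description; the actual Code node may name and structure things differently.

```typescript
// Hedged sketch of the publish-time calculation: next rounded JST hour + buffer,
// staggered by order x SPAN_HOURS, then converted back to UTC for publishAt.
const JST_OFFSET_MS = 9 * 60 * 60 * 1000; // UTC+9, no daylight saving time
const SPAN_HOURS = 12;                    // SPANHOURS in the description
const BUFFER_MIN = 30;                    // BUFFERMIN in the description

function publishAtUtc(order: number, now: Date = new Date()): string {
  // Work in "JST rendered as UTC" to avoid double-offset mistakes.
  const jst = new Date(now.getTime() + JST_OFFSET_MS);

  // Round up to the next full JST hour, then add the safety buffer.
  jst.setUTCMinutes(0, 0, 0);
  jst.setUTCHours(jst.getUTCHours() + 1);
  jst.setUTCMinutes(BUFFER_MIN);

  // Stagger: video N goes out N x 12 hours after the first slot.
  jst.setUTCHours(jst.getUTCHours() + order * SPAN_HOURS);

  // Convert back to real UTC for YouTube's publishAt (RFC 3339).
  return new Date(jst.getTime() - JST_OFFSET_MS).toISOString();
}

// Example: the first video (order 0) is scheduled for the next JST hour + 30 min.
```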
Monitor bank transactions with multi-channel alerts for accounting teams
Enhance financial oversight with this automated n8n workflow. Triggered every 5 minutes, it fetches real-time bank transactions via an API, enriches and transforms the data, and applies smart logic to detect critical, high, and medium priority alerts based on error conditions, amounts, or risk scores. It sends multi-channel notifications via email and Slack, logs all data to Google Sheets, and generates summary statistics for comprehensive tracking. 💰🚨

Key Features
- Real-time monitoring every 5 minutes for instant alerts.
- Smart prioritization (Critical, High, Medium) based on risk and errors.
- Multi-channel notifications via email and Slack.
- Detailed logging and summary reports in Google Sheets.

How It Works
- Schedule Trigger: runs every 5 minutes.
- Fetch Transactions: an HTTP request retrieves real-time transaction data.
- API Error?: if the error condition is met, sends an error alert.
- Enrich & Transform Data: advanced risk calculation enhances the data.
- Critical Alert?: if the critical condition (50% or risk > 8) is met, raises a critical alert (the prioritization is sketched below).
- High Priority?: if the high condition (5% or risk > 7) is met, raises a high-priority alert.
- Medium Priority?: if its condition is met, raises a medium-priority alert.
- Log Priority to Sheet: Google Sheets appends the critical, high, or medium priority data.
- Send Critical Email / Send High Priority Email / Send Medium Priority Email: HTML email alerts to the finance team for each priority level.
- Send High Priority Slack: Slack notification to the finance team.
- Merge All Alerts: combines all alerts for comprehensive tracking.
- Generate Summary Stats: Code block for analytics.
- Log Summary to Sheet: stores the summary statistics.

Setup Instructions
- Import the workflow into n8n and configure the bank API credentials in "Fetch Transactions."
- Set up Google Sheets OAuth2 and replace the sheet ID in the logging nodes.
- Configure the Gmail API key and Slack Bot Token for alerts.
- Test the workflow with sample transaction data exceeding the risk or amount thresholds.
- Adjust the priority conditions (e.g., 50%, 5%, risk > 8) to match your risk policy.

Prerequisites
- Bank API access with real-time transaction data (e.g., https://api.bank.com)
- Google Sheets OAuth2 credentials
- Gmail API key for email alerts
- Slack Bot Token (with chat:write permissions)
- Structured transaction data format

Google Sheet Structure
Create a sheet with columns: Transaction ID, Amount, Date, Risk Score, Priority (Critical/High/Medium), Alert Sent, Summary Stats, Updated At.

Modification Options
- Adjust the "Schedule Trigger" interval (e.g., every 10 minutes).
- Modify the "Critical Alert?" and "High Priority?" conditions for custom thresholds.
- Customize email and Slack templates with branded messaging.
- Integrate with fraud detection tools for enhanced risk analysis.
- Enhance "Generate Summary Stats" with additional metrics (e.g., average risk).

Discover more workflows – Get in touch with us
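To make the branching concrete, here is a hedged sketch of how a transaction could be classified before fanning out to the email, Slack, and Sheets nodes. The risk-score cutoffs (8 and 7) come from the description above; the amount thresholds and the medium cutoff are placeholders for your own policy.

```typescript
// Hedged sketch of the alert prioritization. Risk cutoffs follow the description
// (risk > 8 critical, risk > 7 high); amount thresholds and the medium cutoff are
// hypothetical placeholders, not values taken from the workflow.
type Priority = "critical" | "high" | "medium" | "none";

interface Transaction {
  id: string;
  amount: number;
  riskScore: number;  // produced by the Enrich & Transform Data step
  hasError: boolean;  // error condition reported by the bank API
}

const CRITICAL_AMOUNT = 50_000; // placeholder threshold
const HIGH_AMOUNT = 5_000;      // placeholder threshold

function classify(tx: Transaction): Priority {
  if (tx.hasError || tx.riskScore > 8 || tx.amount >= CRITICAL_AMOUNT) return "critical";
  if (tx.riskScore > 7 || tx.amount >= HIGH_AMOUNT) return "high";
  if (tx.riskScore > 5) return "medium"; // placeholder cutoff; not specified above
  return "none";
}

// Each priority then routes to its own notification and logging nodes, as listed above.
```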