# Automated blog generation with Gemini AI, GitHub & Jekyll publishing

## 📝 Use Case

This n8n workflow automates the creation and publication of technical blog posts based on a list of topics stored in Google Sheets. It fetches context using Tavily and Wikipedia, generates Markdown-formatted content with Gemini AI, commits it to a GitHub repository, and updates a Jekyll-powered blog, all without manual intervention. Ideal for developers, bloggers, or content teams who want to streamline technical content creation and publishing.

---

## ⚙️ Setup Instructions

### 🔑 Prerequisites

- n8n (cloud or self-hosted)
- Tavily API key
- Google Sheet with blog topics
- Gemini (Google PaLM) API key
- GitHub repository (Jekyll enabled)
- GitHub OAuth2 credentials
- Google OAuth2 credentials

### 🧩 Setup Steps

1. Import the workflow JSON into your n8n instance.
2. Set up the following credentials in n8n:
   - Tavily API
   - Google Sheets OAuth2
   - Google PaLM/Gemini AI
   - GitHub OAuth2
3. Prepare your Google Sheet:
   - Columns: `Title`, `status`, `row_number`
   - Leave `status` blank for topics that should be picked up.
4. Configure:
   - The GitHub repo and `_posts/` path
   - The Jekyll setup (front matter, `_config.yml`, GitHub Pages)
5. Adjust the prompt or custom parameters if needed.
6. Enable and deploy the workflow.
7. Schedule it daily or trigger it manually.

---

## 🔄 Workflow Details

| Node | Function |
|------|----------|
| Schedule Trigger | Triggers the flow at a set interval |
| Google Sheets (Get Topic) | Fetches the next incomplete blog topic |
| Extract Topic | Parses the topic text from the sheet |
| Tavily Search | Gathers up-to-date content related to the topic |
| Wikipedia Tool | Optionally adds more context or images |
| Summarize Results | Formats the context for the AI |
| Gemini AI Agent (LangChain) | Generates a Markdown blog post with YAML front matter |
| Set File Parameters | Prepares the filename, content, and commit message |
| GitHub Commit | Uploads the `.md` file to the `_posts/` directory |
| Update Google Sheet | Marks the topic as done after a successful commit |

---

## 🛠️ Customization Options

- Change the LLM prompt (e.g. tone, depth, format).
- Use OpenAI instead of Gemini by switching nodes.
- Modify the filename pattern or GitHub repo path.
- Add Slack/Discord notifications after publishing.
- Extend the flow to upload images or embed YouTube links.

---

## ⚠️ Community Nodes Used

This workflow uses the following community nodes:

- `@tavily/n8n-nodes-tavily.tavily` – for deep search

> ⚠️ Ensure these are installed and enabled in your n8n instance.

---

## 💡 Pro Tips

- Use GitHub Actions to trigger an automatic Jekyll build after each commit.
- Structure blog posts with front matter, headings, and a table of contents for SEO.
- Set the Schedule Trigger to run daily at a fixed time to keep content flowing.
- Enhance formatting in the AI output with code blocks, images, and lists.

---

## ✅ Example Output

```markdown
---
title: "How LLMs Are Changing Web Development"
date: "2025-07-25"
categories: [webdev, AI]
tags: [LLM, Gemini, n8n, automation]
excerpt: "Learn how LLMs like Gemini are transforming how we generate and deploy developer content."
author: "Saswat Saubhagya"
---

## Table of Contents
- Introduction
- Understanding LLMs
- Use Cases in Web Development
- Challenges
- Conclusion

...
```
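Jekyll only publishes posts whose filenames follow the `_posts/YYYY-MM-DD-slug.md` convention, which is what the "Set File Parameters" node has to produce. A minimal sketch of that step as an n8n Code-node-style function; the function name, slug rules, and commit-message format here are illustrative assumptions, not taken from the workflow JSON:

```javascript
// Sketch of the "Set File Parameters" step: derive a Jekyll-compatible
// filename (_posts/YYYY-MM-DD-slug.md) and a commit message from a title.
function buildFileParams(title, date = new Date()) {
  const slug = title
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "-")   // collapse non-alphanumerics into hyphens
    .replace(/^-+|-+$/g, "");      // trim leading/trailing hyphens
  const day = date.toISOString().slice(0, 10); // YYYY-MM-DD
  return {
    fileName: `_posts/${day}-${slug}.md`,
    commitMessage: `Add post: ${title}`,
  };
}

const params = buildFileParams(
  "How LLMs Are Changing Web Development",
  new Date("2025-07-25")
);
console.log(params.fileName);
// _posts/2025-07-25-how-llms-are-changing-web-development.md
```

If you change the filename pattern (one of the customization options above), keep the leading date prefix, since Jekyll uses it to derive the post date and URL.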
# Monitor stock market with AI: news analysis & multi-channel alerts via Slack & Telegram
# AI Customer Support Triage with Gmail, OpenAI, Airtable & Slack

## How it Works

This workflow monitors your Gmail support inbox every minute, automatically sending each unread email to OpenAI for intelligent analysis. The AI evaluates sentiment (Positive/Neutral/Negative/Critical), rates urgency (Low/Medium/High/Critical), categorizes the request (Technical/Billing/Feature Request/Bug Report/General), extracts the key issues, and generates a professional response template.

The system calculates a priority score (0-110) by combining urgency and sentiment weights, then routes tickets accordingly. Critical issues trigger immediate Slack alerts with full context and 30-minute SLA reminders, while routine tickets post to standard monitoring channels. Every ticket is logged to Airtable with the complete analysis and thread tracking, and a Google Sheets dashboard is updated for real-time analytics. A secondary AI pass generates strategic insights (trend identification, risk assessment, actionable recommendations) and stores them back in Airtable.

The entire process takes seconds from email arrival to team notification, eliminating manual triage and ensuring critical issues get immediate attention.

---

## Who is this for?

- Customer support teams needing automated prioritization for high email volumes
- SaaS companies tracking support metrics and response times
- Startups with lean teams requiring intelligent ticket routing
- E-commerce businesses managing technical, billing, and return inquiries
- Support managers needing data-driven insights into customer pain points

---

## Setup Steps

**Setup time:** 20-30 minutes
**Requirements:** Gmail, OpenAI API key, Airtable account, Google Sheets, Slack workspace

1. **Monitor Support Emails:** Connect Gmail via OAuth2 and configure INBOX monitoring for unread emails.
2. **AI Analysis Engine:** Add your OpenAI API key; the system prompt is pre-configured for support analysis.
3. **Parse & Enrich Data:** JavaScript code automatically calculates priority scores (no changes needed).
4. **Route by Urgency:** Configure routing rules for critical vs. routine tickets.
5. **Slack Alerts:** Create a Slack app, get the bot token and channel IDs, and replace the placeholders in the nodes.
6. **Airtable Database:** Create a base with a "tblSupportTickets" table, then add your API key and Base ID (replace `appXXXXXXXXXXXXXX`).
7. **Google Sheets Dashboard:** Create a spreadsheet, enable the Sheets API, add OAuth2 credentials, and replace the Sheet ID (`1XXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXXX`).
8. **Generate Insights:** A second OpenAI call analyzes patterns and stores the insights in Airtable.
9. **Test:** Send a test email, verify the Slack alerts, and check the Airtable/Sheets data logging.

---

## Customization Guidance

- **Priority Scoring:** Adjust the urgency weight (25) and sentiment weight (10) in the Code node to match your SLA requirements.
- **Categories:** Modify the AI system prompt to add industry-specific categories (e.g. healthcare: appointments, prescriptions).
- **Routing Rules:** Add paths for High urgency, VIP customers, or specific categories.
- **Auto-Responses:** Insert a Gmail send node after routine tickets for automatic acknowledgment emails.
- **Multi-Language:** Add a Google Translate node for non-English support.
- **VIP Detection:** Query CRM APIs or match email domains to flag enterprise customers.
- **Team Assignment:** Route different categories to dedicated Slack channels by department.
- **Cost Optimization:** Use GPT-3.5 (~$0.001/email) instead of GPT-4, and self-host n8n for unlimited executions.

---

Once configured, this workflow operates as your intelligent support triage layer: analyzing every email instantly, routing urgent issues to the right team, maintaining comprehensive analytics, and generating strategic insights to improve support operations.

---

Built by Daniel Shashko. Connect on LinkedIn.
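The priority-score logic (urgency weight 25, sentiment weight 10, 0-110 range) can be sketched as a small function of the kind the workflow's Code node runs. The per-level multiplier tables below are assumptions for illustration; the Code node in the actual workflow is the source of truth, and these are exactly the values you would tune per the "Priority Scoring" customization note:

```javascript
// Sketch of the priority-score calculation: urgency contributes up to
// 100 points (weight 25) and sentiment up to 10 points (weight 10).
// The level-to-multiplier mappings are assumptions, not copied from
// the workflow (note this sketch's floor is 25, not 0).
const URGENCY = { Low: 1, Medium: 2, High: 3, Critical: 4 };          // x25 -> 25..100
const SENTIMENT = { Positive: 0, Neutral: 0, Negative: 0.5, Critical: 1 }; // x10 -> 0..10

function priorityScore(urgency, sentiment) {
  const u = (URGENCY[urgency] ?? 1) * 25;
  const s = (SENTIMENT[sentiment] ?? 0) * 10;
  return u + s;
}

console.log(priorityScore("Critical", "Critical")); // 110
console.log(priorityScore("Low", "Neutral"));       // 25
```

Raising the urgency weight widens the gap between tiers (useful for strict SLAs), while raising the sentiment weight lets an angry but low-urgency email jump the queue.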
# Smart knowledge base builder — auto-convert websites into AI training data
# AI-Powered Knowledge Base Builder — Turn Any Website into LLM-Optimized Markdown & TXT Files

Automate the entire process of converting any website or domain into clean, structured, AI-ready knowledge bases for Large Language Models (LLMs), semantic search, and chatbot development.

---

## Key Workflow Highlights

- **URL Input via Simple Form** – Paste a single link or a full domain.
- **Automated Link Discovery** – Crawl and map all related pages with the Firecrawl API.
- **Clean Markdown Extraction** – Use the Parsera API for accurate, clutter-free content.
- **LLM-Optimized Formatting** – Standardize with OpenAI GPT-4.1-mini for `llms.txt`.
- **Cloud Storage Integration** – Save directly to Google Drive for instant access.
- **Batch Processing at Scale** – Handle single pages or hundreds of URLs effortlessly.

---

## Perfect For

- AI engineers building domain-specific training datasets
- Data scientists running semantic search & vector database pipelines
- Researchers collecting website archives for AI or analytics
- Automation specialists creating chatbot-ready content libraries

---

## Why This Workflow Outperforms Manual Processes

- **100% Automated** – From link input to a Google Drive-ready `.txt` file
- **Flexible Scope** – Choose between single-page extraction or full-site crawling
- **Clean, AI-Friendly Output** – Markdown converted to a standardized LLM format
- **Scalable & Reliable** – Handles bulk data ingestion without formatting issues
- **Cloud-First** – Centralized storage for team-wide accessibility

---

## Problems Solved

- No more manual copy-paste from dozens of web pages
- Eliminate formatting inconsistencies across datasets
- Avoid scattered files: all output is stored in one central folder

Instead, you get:

- Automated URL mapping for deep data coverage
- Proxy-enabled scraping for accurate extraction
- Ready-to-use `llms.txt` files for chatbots, fine-tuning, and AI pipelines

---

## How It Works — Step-by-Step

1. **Form Submission** – Input your URL and choose "Single Page" or "Full Domain Crawl."
2. **URL Mapping with Firecrawl API** – Automatically discovers all internal links related to the starting URL.
3. **Content Extraction with Parsera API** – Removes ads, navigation clutter, and irrelevant elements to produce clean Markdown.
4. **LLM-Optimized Formatting with OpenAI GPT-4.1-mini** – Generates structured files including:
   - Site title & meta description
   - Page sections with summaries & full text
5. **Cloud Upload to Google Drive** – Final `.txt` or `.md` files are stored in your specified folder.

---

## Business & AI Advantages

- Save 90%+ of the time spent preparing AI training datasets
- Improve AI accuracy with high-quality, consistent input
- Maintain centralized, cloud-based storage
- Scale globally with proxy-based content collection

---

## Setup in Under 10 Minutes

1. Import the workflow into n8n.
2. Add credentials for:
   - Firecrawl API
   - Parsera API
   - OpenAI API key
   - Google Drive (Service Account or OAuth)
3. Update your Google Drive folder ID.
4. Run a test job with a sample URL.
5. Deploy and connect it to your AI pipeline.

---

## Tools & Integrations Used

- **n8n Form Trigger** – For user-friendly input
- **Firecrawl API** – Comprehensive internal link mapping
- **Parsera API** – Clean, structured content extraction
- **OpenAI GPT-4.1-mini** – LLM-optimized formatting
- **Google Drive API** – Secure cloud storage
- **Batch & Switch Logic** – Efficient multi-page processing

---

## Advanced Customization Options

- Change the output format: `.md`, `.json`, `.csv`
- Swap storage to Dropbox, AWS S3, Notion, or Airtable
- Modify the AI prompts for alternative formatting
- Filter by keywords or metadata before saving
- Automate runs via Google Sheets, email triggers, or cron schedules
- Add AI-powered translation for multilingual datasets
- Enrich with SEO metadata or author information
- Push directly to vector databases like Pinecone, Weaviate, or Qdrant

---

## SEO-Optimized Keywords for Maximum Reach

- AI data extraction workflow
- Automated LLM training dataset builder
- Web to Markdown converter for AI
- Firecrawl Parsera OpenAI n8n integration
- llms.txt file generator for chatbots
- Automated website content scraper for AI
- Knowledge base creation automation
- AI-ready data pipeline for semantic search
- Batch website-to-dataset conversion
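The `llms.txt` files this workflow produces follow the common convention of an H1 site title, a blockquote description, and sections of annotated links (the llmstxt.org proposal). A minimal sketch of assembling such an index from already-extracted pages; the function, field names, and exact layout are illustrative assumptions, since the workflow's own formatting is done by the GPT-4.1-mini prompt:

```javascript
// Sketch: assemble an llms.txt-style index from pages already extracted
// as Markdown. The site/page object shapes here are assumptions.
function buildLlmsTxt(site, pages) {
  const lines = [
    `# ${site.title}`,
    "",
    `> ${site.description}`,
    "",
    "## Pages",
    "",
  ];
  for (const p of pages) {
    lines.push(`- [${p.title}](${p.url}): ${p.summary}`);
  }
  return lines.join("\n") + "\n";
}

const txt = buildLlmsTxt(
  { title: "Example Docs", description: "Docs for the Example API." },
  [{
    title: "Quickstart",
    url: "https://example.com/quickstart",
    summary: "Get started in 5 minutes.",
  }]
);
console.log(txt);
```

Keeping one link-per-line with a short summary is what makes the file cheap for an LLM to ingest whole; the full page text can live in the accompanying `.md` files.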