
Automate content research with Reddit scraping, AI analysis, and Google Sheets

By Michael Taleb

Workflow Summary

This workflow automatically scrapes new Reddit posts from your chosen subreddits and keywords, analyzes them with AI to extract summaries, pain points, and content angles, and then saves the insights into a Google Sheet. It’s a fully automated Content Research Engine that delivers fresh marketing ideas and community pain points straight into your database.

Setting up the workflow

  1. Connect Reddit
    • In n8n, create a Reddit credential.
    • Add the subreddits and keywords you want to track.

  2. Connect Google Sheets
    • Make a copy of the database sheet.
    • Connect your Google account in n8n.

  3. Connect OpenAI
    • Add your OpenAI API key as a credential.
    • The AI will summarize posts, extract pain points, and suggest content ideas.

  4. Run the workflow
    • The workflow searches Reddit on a schedule.
    • Results are processed by AI and appended to your Google Sheet automatically (a column-mapping sketch follows these steps).
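To sanity-check what lands in the sheet, a minimal Code-node sketch like the one below maps the merged Reddit and AI data to sheet columns. Every field name here is an illustrative assumption; align them with the headers in your copy of the database sheet.

```javascript
// n8n Code node (Run Once for All Items): shape each merged item into
// one row for the Google Sheets "Append" operation. All field names
// are assumptions -- match them to your sheet's actual headers.
return $input.all().map((item) => ({
  json: {
    date: new Date().toISOString().slice(0, 10),
    subreddit: item.json.subreddit,                 // from the Reddit node
    post_title: item.json.title,
    post_url: item.json.url,
    summary: item.json.summary,                     // from the AI analysis
    pain_points: (item.json.pain_points || []).join('; '),
    content_angles: (item.json.content_angles || []).join('; '),
  },
}));
```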

Automate Content Research with Reddit Scraping, AI Analysis, and Google Sheets

This n8n workflow automates the process of gathering content ideas from Reddit, analyzing them with AI, and storing the results in Google Sheets for further use. It simplifies content research by automatically identifying trending topics and extracting key insights.

What it does

This workflow performs the following steps:

  1. Triggers on a Schedule: The workflow starts automatically at predefined intervals (e.g., daily, weekly).
  2. Scrapes Reddit: It connects to Reddit to fetch posts from specified subreddits or based on certain keywords.
  3. Analyzes with AI: The retrieved Reddit posts are then sent to an AI agent (powered by OpenAI) for analysis. The AI agent extracts relevant information, such as content themes, sentiment, or key takeaways.
  4. Parses AI Output: A structured output parser ensures that the AI's response is formatted consistently, making it easy to process (a defensive normalization sketch follows this list).
  5. Edits Fields: Data is transformed and cleaned to prepare it for storage.
  6. Merges Data: Combines the original Reddit data with the AI-analyzed insights.
  7. Stores in Google Sheets: The final, enriched data is then appended to a Google Sheet, creating a centralized repository for content ideas.
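Because model output can occasionally be malformed, a defensive Code-node sketch along these lines keeps one bad response from breaking the merge or the Sheets append. The summary, pain_points, and content_angles keys are assumptions about your parser's schema, not the template's guaranteed output.

```javascript
// n8n Code node sketch: normalize the parsed AI output so a malformed
// response cannot break downstream nodes. Field names are assumptions
// about the output parser's schema -- adjust them to yours.
return $input.all().map((item) => {
  const ai = item.json.output ?? item.json; // where parsed output lives can vary
  return {
    json: {
      summary: typeof ai.summary === 'string' ? ai.summary.trim() : '',
      pain_points: Array.isArray(ai.pain_points) ? ai.pain_points : [],
      content_angles: Array.isArray(ai.content_angles) ? ai.content_angles : [],
    },
  };
});
```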

Prerequisites/Requirements

To use this workflow, you will need:

  • n8n Instance: A running n8n instance (cloud or self-hosted).
  • Reddit Account & API Credentials: To scrape Reddit posts.
  • OpenAI API Key: For the AI Agent and Chat Model to analyze content.
  • Google Account: With access to Google Sheets to store the research data.

Setup/Usage

  1. Import the Workflow: Download the provided JSON and import it into your n8n instance.
  2. Configure Credentials:
    • Reddit: Set up your Reddit OAuth2 or API credentials in n8n.
    • OpenAI: Add your OpenAI API Key as a credential.
    • Google Sheets: Authenticate your Google account to allow n8n to access your Google Sheets.
  3. Customize Nodes:
    • Schedule Trigger: Adjust the schedule to your preferred frequency for running the research.
    • Reddit Node: Configure the subreddits, keywords, or other filters for the posts you want to scrape.
    • AI Agent Node: Refine the prompt for the AI agent to guide its analysis (e.g., "Summarize the main content idea and identify potential angles for a blog post"); a fuller prompt sketch follows these steps.
    • Google Sheets Node: Specify the Spreadsheet ID and Sheet Name where you want to store the data. Ensure the sheet has appropriate headers matching the data output.
  4. Activate the Workflow: Once configured, activate the workflow. It will now run automatically based on your defined schedule.
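As a starting point for the AI Agent prompt in step 3, something like the following tends to produce output a structured parser can handle. The output keys and the selftext field are assumptions; keep them in sync with your Structured Output Parser and your Reddit node's actual output.

```javascript
// Illustrative prompt for the AI Agent node. The {{ }} placeholders are
// n8n expressions that resolve per item when this text is pasted into the
// node's prompt field; `selftext` is Reddit's post-body field. The output
// keys are assumptions -- keep them in sync with your output parser.
const prompt = `You are a content researcher. For the Reddit post below,
return JSON with exactly these keys:
  "summary"        - a 2-3 sentence summary of the post
  "pain_points"    - pain points the author or commenters raise (array)
  "content_angles" - 2-4 marketing angles a blog post could take (array)

Title: {{ $json.title }}
Body: {{ $json.selftext }}`;

// Returning it as an item also lets a Code node feed the prompt downstream.
return [{ json: { prompt } }];
```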

Related Templates

Track competitor SEO keywords with Decodo + GPT-4.1-mini + Google Sheets

This workflow automates competitor keyword research using an OpenAI LLM and Decodo for intelligent web scraping.

Who this is for
  • SEO specialists, content strategists, and growth marketers who want to automate keyword research and competitive intelligence.
  • Marketing analysts managing multiple clients or websites who need consistent SEO tracking without manual data pulls.
  • Agencies or automation engineers using Google Sheets as an SEO data dashboard for keyword monitoring and reporting.

What problem this workflow solves
Tracking competitor keywords manually is slow and inconsistent, and most SEO tools provide limited API access or lack contextual keyword analysis. This workflow solves that by:
  • Automatically scraping any competitor's webpage with Decodo.
  • Using OpenAI GPT-4.1-mini to interpret keyword intent, density, and semantic focus.
  • Storing structured keyword insights directly in Google Sheets for ongoing tracking and trend analysis.

What this workflow does
  1. Trigger — Manually start the workflow or schedule it to run periodically.
  2. Input Setup — Define the website URL and target country (e.g., https://dev.to, france).
  3. Data Scraping (Decodo) — Fetch competitor web content and metadata.
  4. Keyword Analysis (OpenAI GPT-4.1-mini) — Extract primary and secondary keywords, identify focus topics and semantic entities, generate a keyword density summary and SEO strength score, and recommend optimization and internal linking opportunities.
  5. Data Structuring — Clean and convert the GPT output into JSON (a parsing sketch follows this description).
  6. Data Storage (Google Sheets) — Append structured keyword data to a Google Sheet for long-term tracking.

Setup
  1. Prerequisites: If you are new to Decodo, sign up at visit.decodo.com. You will also need an n8n account with workflow editor access, Decodo API credentials, an OpenAI API key, and a Google Sheets account connected via OAuth2. Make sure to install the Decodo community node.
  2. Create a Google Sheet: Add columns such as primarykeywords, seostrengthscore, and keyworddensity_summary, and share the sheet with your n8n Google account.
  3. Connect Credentials: Add credentials for the Decodo API (register, log in, and obtain the Basic Authentication token via the Decodo Dashboard), OpenAI GPT-4.1-mini, and Google Sheets OAuth2.
  4. Configure Input Fields: Edit the "Set Input Fields" node to set your target site and region.
  5. Run the Workflow: Click Execute Workflow in n8n, then view structured results in your connected Google Sheet.

How to customize this workflow
  • Track Multiple Competitors → Use a Google Sheet or CSV list of URLs and loop through them with the Split In Batches node.
  • Add Language Detection → Add a Gemini or GPT node before keyword analysis to detect content language and adjust prompts.
  • Enhance the SEO Report → Expand the GPT prompt to include backlink insights, metadata optimization, or readability checks.
  • Integrate Visualization → Connect your Google Sheet to Looker Studio for SEO performance dashboards.
  • Schedule Auto-Runs → Use the Cron node to run weekly or monthly for competitor keyword refreshes.

Summary
This workflow combines Decodo for intelligent web scraping, OpenAI GPT-4.1-mini for keyword and SEO analysis, and Google Sheets for live tracking and reporting. It's a complete AI-powered SEO intelligence pipeline, ideal for teams that want actionable insights on keyword gaps, optimization opportunities, and content focus trends without relying on expensive SEO SaaS tools.
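The Data Structuring step is where runs most often break, since models sometimes wrap JSON in code fences. A minimal Code-node sketch, assuming the column names suggested above; adjust the input path to your OpenAI node's actual output:

```javascript
// n8n Code node sketch for the Data Structuring step: strip any code
// fences from the model's reply, parse the JSON, and emit the columns
// suggested above. Adjust the input path (`text`, `message.content`, ...)
// to match your OpenAI node's output.
return $input.all().map((item) => {
  const raw = String(item.json.text || '').replace(/`{3}(json)?/g, '').trim();
  let parsed = {};
  try { parsed = JSON.parse(raw); } catch (e) { /* leave empty on bad JSON */ }
  return {
    json: {
      primarykeywords: (parsed.primary_keywords || []).join(', '),
      seostrengthscore: parsed.seo_strength_score ?? '',
      keyworddensity_summary: parsed.keyword_density_summary || '',
    },
  };
});
```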

By Ranjan Dailata

Create personalized email outreach with AI, Telegram bot & website scraping

Demo Personalized Email

This n8n workflow is built for AI and automation agencies to promote their workflows through an interactive demo that prospects can try themselves. The featured system is a deeply personalized email demo.

🔄 How It Works
  1. Prospect Interaction — A prospect starts the demo via Telegram. The Telegram bot (created with BotFather) connects directly to your n8n instance.
  2. Demo Guidance — The RAG agent and instructor guide the user step by step through the demo. Instructions and responses are generated dynamically from user input.
  3. Workflow Execution — When the user triggers an action (e.g., testing the email demo), n8n runs the workflow and collects website data using Crawl4AI or standard HTTP requests.
  4. Email Demo — The system personalizes and sends a demo email through SparkPost, showing the automation's capability.
  5. Logging and Control — Each user interaction is logged in your database by name and ID, and the workflow checks limits to prevent misuse or spam.
  6. Error Handling — If a low-CPU scraping method fails, the workflow automatically escalates to a higher-CPU method.

⚙️ Requirements
  • n8n — Automation platform to run the workflow
  • Docker — Required to run Crawl4AI
  • Crawl4AI — For intelligent website crawling
  • Telegram account — To create your Telegram bot via BotFather
  • SparkPost account — To send personalized demo emails
  • A database (e.g., PostgreSQL, MySQL, or SQLite) — To store log data such as user name and ID

🚀 Features
  • Telegram interface using the BotFather API
  • Instructor and RAG agent to guide prospects through the demo
  • Flow-generation limits per user ID to prevent abuse
  • Low-cost yet powerful web scraping, escalating from low- to high-CPU flows if earlier ones fail

💡 Development Ideas
  • Replace the RAG logic with your own query-answering and guidance method.
  • Remove the flow limit if you're confident the demo can't be misused.
  • Swap the personalized email demo for any other workflow you want to showcase.

🧠 Technical Notes
  • Telegram bot created with BotFather.
  • Website crawl process: extract sub-links via /sitemap.xml, sitemap_index.xml, or standard HTTP requests; fall back to Crawl4AI if normal requests fail; fetch sub-link content via HTTPS, with Crawl4AI as backup (the escalation logic is sketched below).
  • SparkPost is used to send the demo emails.

⚙️ Setup Instructions
  1. Create a Telegram Bot — Use BotFather on Telegram to create your bot and get the API token; this token connects your n8n workflow to Telegram.
  2. Create a Log Data Table — In your database, create a table for user logs with at least a name column (the user's name or Telegram username) and an id column (the user's unique identifier).
  3. Install Crawl4AI with Docker — Follow the installation guide at https://github.com/unclecode/crawl4ai. Crawl4AI handles website crawling and content extraction in the workflow.

📦 Notes
This setup is optimized for low cost, easy scalability, and real-time interaction with prospects. You can customize each component — Telegram bot behavior, RAG logic, scraping strategy, and email workflow — to fit your agency's demo needs. 👉 Try the live demo here: @emaildemobot
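For reference, the escalating crawl described in the technical notes could look roughly like this. It is a sketch assuming a runtime with global fetch and a default Crawl4AI Docker setup (port 11235, a /crawl endpoint); the request and response shapes are assumptions, so check your container version's API docs.

```javascript
// Sketch of the escalating crawl: try the cheap sitemap fetch first and
// only fall back to the Crawl4AI container when that fails. The Crawl4AI
// endpoint and response shape below are assumptions -- verify them against
// your container's API documentation.
async function getSubLinks(baseUrl) {
  for (const path of ['/sitemap.xml', '/sitemap_index.xml']) {
    try {
      const res = await fetch(baseUrl + path);
      if (res.ok) {
        const xml = await res.text();
        // pull the <loc> entries out of the sitemap
        return [...xml.matchAll(/<loc>(.*?)<\/loc>/g)].map((m) => m[1]);
      }
    } catch (e) { /* low-CPU method failed; try the next path */ }
  }
  // escalate to the higher-CPU Crawl4AI method
  const res = await fetch('http://localhost:11235/crawl', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ urls: [baseUrl] }),
  });
  return (await res.json()).links || [];
}
```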

By Michael A Putra

Automate event RSVPs with email validation & badge generation using VerifiEmail & HTMLCssToImage

Validated RSVP Confirmation with Automated Badge Generation

Overview: This comprehensive workflow automates the entire event RSVP process from form submission to attendee confirmation, including real-time email validation and personalized digital badge generation.

✨ KEY FEATURES:
  • Real-time Email Validation - Verify attendee emails using the VerifiEmail API to prevent fake registrations
  • Automated Badge Generation - Create personalized event badges with attendee details
  • Smart Email Routing - Send confirmation emails with badges for valid emails, rejection notices for invalid ones
  • Comprehensive Logging - Track all RSVPs (both valid and invalid) in Google Sheets for analytics
  • Dual Path Logic - Handle valid and invalid submissions differently with conditional branching
  • Anti-Fraud Protection - Detect disposable emails and invalid domains automatically

🔧 WORKFLOW COMPONENTS:
  1. Webhook Trigger - Receives RSVP submissions
  2. Email Validation - Verifies email authenticity using the VerifiEmail API
  3. Conditional Logic - Separates valid from invalid submissions (a routing sketch follows this description)
  4. Badge Creator - Generates HTML-based personalized event badges
  5. Image Converter - Converts HTML badges to shareable PNG images using HTMLCssToImage
  6. Email Sender - Delivers a confirmation with badge or a rejection notice via Gmail
  7. Data Logger - Records all attempts in Google Sheets for tracking and analytics

🎯 PERFECT FOR:
  • Conference organizers managing hundreds of RSVPs
  • Corporate event planners requiring verified attendee lists
  • Webinar hosts preventing fake registrations
  • Workshop coordinators issuing digital badges
  • Community event managers tracking attendance

💡 BENEFITS:
  • Reduces manual verification time by 95%
  • Eliminates fake email registrations
  • Creates professional branded badges automatically
  • Provides real-time RSVP tracking and analytics
  • Improves attendee experience with instant confirmations
  • Maintains clean, verified contact lists

🛠️ REQUIRED SERVICES:
  • n8n (cloud or self-hosted)
  • VerifiEmail API (https://verifi.email)
  • HTMLCssToImage API (https://htmlcsstoimg.com)
  • Gmail account (OAuth2)
  • Google Sheets

📈 USE CASE SCENARIO:
When someone submits your event RSVP form, this workflow instantly validates their email, generates a personalized badge with their details, and emails them a confirmation, all within seconds. Invalid emails receive a helpful rejection notice, and every submission is logged for your records. No manual work required!

🎨 BADGE CUSTOMIZATION:
The workflow includes a fully customizable HTML badge template featuring:
  • Gradient background with modern design
  • Attendee name, designation, and organization
  • Event name and date
  • Email address and validation timestamp
  • Google Fonts (Poppins) for professional typography

📊 ANALYTICS INCLUDED:
Track metrics like:
  • Total RSVPs received
  • Valid vs. invalid email ratio
  • Event-wise registration breakdown
  • Temporal patterns
  • Organization/company distribution

⚡ PERFORMANCE:
  • Processing time: ~3-5 seconds per RSVP
  • Scales to handle 100+ concurrent submissions
  • Email delivery within 10 seconds
  • Real-time Google Sheets updates

🔄 EASY SETUP:
  1. Import the workflow JSON.
  2. Configure your credentials (detailed instructions included).
  3. Create your form with the required fields (name, email, event, designation, organization).
  4. Connect the webhook.
  5. Activate and start receiving validated RSVPs!

🎓 LEARNING VALUE:
This workflow demonstrates:
  • Webhook integration patterns
  • API authentication methods
  • Conditional workflow branching
  • HTML-to-image conversion
  • Email automation best practices
  • Data logging strategies
  • Error handling techniques
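For the conditional-logic step, a small Code-node sketch like this can collapse the validation response into a single flag for the IF node to branch on. The deliverable and disposable field names are assumptions about the VerifiEmail response; inspect your validation node's real output and adjust.

```javascript
// n8n Code node sketch for the conditional-logic step: derive one boolean
// the IF node can branch on. `deliverable` and `disposable` are assumed
// field names for the VerifiEmail response -- adjust to the real output.
return $input.all().map((item) => {
  const v = item.json;
  const isValid = v.deliverable === true && v.disposable !== true;
  return { json: { ...v, is_valid: isValid } };
});
```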

By Jitesh Dugar