
Monitor RSS feeds, extract full articles with Jina AI, and save to Supabase

automedia

Monitor RSS Feeds, Extract Full Articles, and Save to Supabase

Overview

This workflow solves a common problem with RSS feeds: they often only provide a short summary or snippet of the full article. This template automatically monitors a list of your favorite blog RSS feeds, filters for new content, visits the article page to extract the entire blog post, and then saves the structured data into a Supabase database.

It's designed for content creators, marketers, researchers, and anyone who needs to build a personal knowledge base, conduct competitive analysis, or power a content aggregation system without manual copy-pasting.


Use Cases

  • Content Curation: Automatically gather full-text articles for a newsletter or social media content.
  • Personal Knowledge Base: Create a searchable archive of articles from experts in your field.
  • Competitive Analysis: Track what competitors are publishing without visiting their blogs every day.
  • AI Model Training: Collect a clean, structured dataset of full-text articles to fine-tune an AI model.

How It Works

  1. Scheduled Trigger: The workflow runs automatically on a set schedule (default is once per day).
  2. Fetch RSS Feeds: It takes a list of RSS feed URLs you provide in the "blogs to track" node.
  3. Filter for New Posts: It checks the publication date of each article and only continues with articles published within a specified window (e.g., the last 60 days).
  4. Extract Full Content: For each new article, the workflow uses the Jina AI Reader URL (https://r.jina.ai/) to scrape the full, clean text from the blog post's webpage. This is a free and powerful way to get past the RSS snippet limit (a minimal sketch of the call appears after this list).
  5. Save to Supabase: Finally, it organizes the extracted data and saves it to your chosen Supabase table. The following data is saved by default:
    • title
    • source_url (the link to the original article)
    • content_snippet (the full extracted article text)
    • published_date
    • creator (the author)
    • status (a static value you can set, e.g., "new")
    • content_type (a static value you can set, e.g., "blog")
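
To make step 4 concrete, here is a minimal sketch (plain Node.js with illustrative names, not the workflow's own code) of how a full article is fetched simply by prefixing the article's link with the Jina AI Reader URL:

    // Illustrative sketch: fetch the clean full text of one article via Jina AI Reader.
    // The article URL is an example; in the workflow it comes from the RSS item's link.
    const articleUrl = 'https://blog.n8n.io/some-post/';

    async function extractFullText(url) {
      // r.jina.ai/<page URL> returns the page as clean text/markdown
      const response = await fetch('https://r.jina.ai/' + url);
      if (!response.ok) {
        throw new Error(`Jina Reader request failed: ${response.status}`);
      }
      return response.text();
    }

    extractFullText(articleUrl).then((text) => console.log(text.slice(0, 500)));

The text returned here is what ends up in the content_snippet column in step 5.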

Setup Instructions

You can get this template running in about 10-15 minutes.

  1. Set Up Your RSS Feed List:

    • Navigate to the "blogs to track" Set node.
    • In the source_identifier field, replace the example URLs with the RSS feed URLs for the blogs you want to monitor. You can add as many as you like.
    • Tip: An easy way to find a site's RSS feed is to ask a web-browsing-enabled LLM such as Perplexity.
    // Example list of RSS feeds
    ['https://blog.n8n.io/rss', 'https://zapier.com/blog/feeds/latest/']
    
  2. Configure the Content Age Filter:

    • Go to the "max_content_age_days" Set node.
    • Change the value from the default 60 to your desired timeframe (e.g., 7 to only get articles from the last week).
  3. Connect Your Storage Destination:

    • The template uses the "Save Blog Data to Database" Supabase node.
    • First, ensure you have a table in your Supabase project with columns to match the data (e.g., title, source_url, content_snippet, published_date, creator, etc.).
    • In the n8n node, create new credentials using your Supabase Project URL and Service Role Key.
    • Select your table from the list and map the data fields from the workflow to your table columns (a sketch of the equivalent insert appears after this list).
    • Want to use something else? You can easily replace the Supabase node with a Google Sheets node, an Airtable node, or the built-in n8n Table node. Just drag the final connection to your new node and configure the field mapping.
  4. Set Your Schedule:

    • Click on the first node, "Schedule Trigger".
    • Adjust the trigger interval to your needs. The default is every day at noon.
  5. Activate Workflow:

    • Click the "Save" button, then toggle the workflow to Active. You're all set!
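
If you want to sanity-check your table outside n8n, or just see the shape of the record the Supabase node writes, here is a hedged sketch using the supabase-js client. The table name articles, the environment variable names, and the sample values are placeholders; the column names mirror the defaults listed in "How It Works".

    // Illustrative only (not the n8n Supabase node): insert one article row
    // with @supabase/supabase-js. Table name and credentials are placeholders.
    import { createClient } from '@supabase/supabase-js';

    const supabase = createClient(
      process.env.SUPABASE_URL,         // your Supabase Project URL
      process.env.SUPABASE_SERVICE_KEY  // your Service Role Key (keep it secret)
    );

    const { error } = await supabase.from('articles').insert({
      title: 'Example post title',
      source_url: 'https://blog.n8n.io/some-post/',
      content_snippet: '...full article text extracted via Jina AI...',
      published_date: '2026-01-15T12:00:00Z',
      creator: 'Jane Doe',
      status: 'new',        // static value, as configured in the workflow
      content_type: 'blog', // static value, as configured in the workflow
    });

    if (error) console.error('Insert failed:', error.message);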

n8n Workflow: Monitor RSS Feeds, Extract Full Articles with Jina AI, and Save to Supabase

This n8n workflow automates the process of monitoring multiple RSS feeds, extracting the full content of new articles using Jina AI, and then storing this enriched data into a Supabase database. It's designed to keep a structured record of articles from your chosen feeds, complete with their full content, for further analysis or use.

What it does

This workflow performs the following key steps:

  1. Triggers on a Schedule: The workflow runs at a predefined interval (e.g., every hour, daily) to check for new articles.
  2. Reads RSS Feeds: It reads items from a configured RSS feed URL.
  3. Filters for New Articles: It compares the pubDate of fetched articles with a stored timestamp to identify only new articles. If an article is older than the last run, it's skipped (see the sketch after this list).
  4. Extracts Full Article Content with Jina AI: For each new article, it uses the Jina AI Reader API to fetch the full, clean content of the article from its URL.
  5. Prepares Data: It transforms and structures the extracted data, including the article title, link, publication date, and full content.
  6. Saves to Supabase: The processed article data is then inserted into a specified table in your Supabase database.
  7. Updates Last Run Timestamp: After processing, it updates a timestamp in Supabase to mark the last successful run, ensuring that only genuinely new articles are processed in subsequent runs.
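
A minimal sketch of the comparison described in steps 3 and 7 is below. The field and variable names (last_run, pubDate) follow the description above but are otherwise illustrative, not the exact expressions inside the If node:

    // Illustrative sketch of the "only new articles" check.
    const lastRun = new Date('2026-01-14T12:00:00Z'); // last successful run, read from Supabase

    const items = [
      { title: 'Older post', pubDate: '2026-01-10T08:00:00Z' },
      { title: 'Newer post', pubDate: '2026-01-15T09:30:00Z' },
    ];

    // Keep only items published after the previous run
    const fresh = items.filter((item) => new Date(item.pubDate) > lastRun);
    console.log(fresh.map((i) => i.title)); // ['Newer post']

    // After processing, write the newest pubDate (or the current time) back to
    // Supabase so the next run skips everything handled here.
    const newLastRun = fresh.length
      ? fresh.map((i) => i.pubDate).sort().pop()
      : lastRun.toISOString();
    console.log('next last_run:', newLastRun);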

Prerequisites/Requirements

To use this workflow, you will need:

  • n8n Instance: A running instance of n8n.
  • RSS Feed URL(s): The URL(s) of the RSS feeds you wish to monitor.
  • Jina AI API Key: An API key for Jina AI Reader to extract full article content.
  • Supabase Account: A Supabase project with:
    • A database.
    • A table configured to store article data (e.g., columns for title, link, pubDate, full_content, created_at); a sample row shaped for these columns appears after this list.
    • Supabase API URL and Service Role Key (or equivalent for database access).
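
For reference, a single row shaped for the example columns above might look like the following; the column names are only the examples given in the prerequisites, so adjust them to whatever your table actually uses:

    // Example row matching the example columns (title, link, pubDate, full_content, created_at).
    const articleRow = {
      title: 'Example post title',
      link: 'https://blog.n8n.io/some-post/',
      pubDate: '2026-01-15T12:00:00Z',
      full_content: '...clean article text returned by Jina AI Reader...',
      created_at: new Date().toISOString(),
    };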

Setup/Usage

  1. Import the Workflow:
    • Download the workflow JSON provided.
    • In your n8n instance, go to "Workflows" and click "New".
    • Click the "Import from JSON" button and paste the workflow JSON.
  2. Configure Credentials:
    • Jina AI: Locate the "HTTP Request" node that calls the Jina AI Reader. You will need to add your Jina AI API key as a header or query parameter, as required by Jina AI's documentation (a sketch of the equivalent request appears after this list).
    • Supabase: Configure the "Supabase" nodes with your Supabase credentials (API URL, Service Role Key, and the table name you want to use).
  3. Configure RSS Feed:
    • In the "RSS Read" node, enter the URL of the RSS feed you want to monitor.
  4. Configure Schedule:
    • Adjust the "Schedule Trigger" node to your desired interval (e.g., every 1 hour, once a day).
  5. Configure Supabase Table and Timestamp Logic:
    • Supabase - Get Last Run: Ensure this node correctly queries your Supabase table to retrieve the created_at timestamp of the last processed article, or a dedicated last_run timestamp if you have one.
    • If Node: Verify the condition in the "If" node to compare the RSS item's pubDate with the last_run timestamp from Supabase.
    • Supabase - Insert Article: Map the fields from the "Edit Fields" node to the correct columns in your Supabase table.
    • Supabase - Update Last Run: This node is crucial for maintaining the last_run timestamp. Ensure it updates a designated field in your Supabase database with the current workflow execution time or the pubDate of the latest article processed.
  6. Activate the Workflow: Once all configurations are complete, activate the workflow. It will start running automatically based on your defined schedule.
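
For step 2, the HTTP Request node's call to Jina AI Reader is roughly equivalent to the sketch below. The Bearer-style Authorization header is a common way to pass a Jina API key, but treat it as an assumption and confirm the exact header or query parameter in Jina AI's documentation:

    // Illustrative equivalent of the "HTTP Request" node that calls Jina AI Reader.
    const articleUrl = 'https://blog.n8n.io/some-post/'; // example link from the RSS item

    const response = await fetch('https://r.jina.ai/' + articleUrl, {
      headers: {
        // Assumed Bearer-token form; verify against Jina AI's documentation.
        Authorization: `Bearer ${process.env.JINA_API_KEY}`,
      },
    });

    const fullContent = await response.text();
    console.log(fullContent.slice(0, 500));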

Related Templates

  • Create verified user profiles with email validation, PDF generation & Gmail delivery (by Jitesh Dugar)
  • Automate Reddit brand monitoring & responses with GPT-4o-mini, Sheets & Slack (by Daniel Shashko)
  • Generate influencer posts with GPT-4, Google Sheets, and Media APIs (by Palak Rathor)