
Multi platform content generator from YouTube using AI & RSS

By Budi SJ
2/3/2026

This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Multi Platform Content Generator from YouTube using AI & RSS

This workflow automates content generation by monitoring YouTube channels, extracting transcripts via AI, and creating platform-optimized content for LinkedIn, X/Twitter, Threads, and Instagram. Ideal for creators, marketers, and social media managers aiming to scale content production with minimal effort.

✨ Key Features

  • 🔔 Automated YouTube Monitoring via RSS feed
  • 🧠 AI-Powered Transcript Extraction using Supadata API
  • ✍️ Multi-Platform Content Generation with OpenRouter AI
  • 🎯 Platform Optimization based on tone and character limits
  • 📬 Telegram Notification for easy preview
  • 📊 Centralized Data Management via Google Sheets

> 🗂️ All video data, summaries, and generated content are tracked and stored in a single, centralized Google Sheets template.
> This ensures full visibility, easy access, and smooth collaboration across your team.


⚙️ Workflow Components

1. 🧭 Channel Monitoring

  • Schedule Trigger: Initiates workflow periodically
  • Google Sheets (Read): Pulls YouTube channel URLs
  • HTTP Request + HTML Parser: Extracts channel IDs from URLs
  • RSS Reader: Fetches latest video metadata
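For reference, here is a minimal TypeScript sketch of what the channel-ID extraction step above amounts to. The regex and the helper name are illustrative assumptions; the `https://www.youtube.com/feeds/videos.xml?channel_id=...` format is YouTube's standard per-channel RSS feed URL.

```ts
// Sketch: resolve a YouTube channel URL to its RSS feed URL.
// This mirrors what the HTTP Request + HTML Parser pair does in the workflow,
// assuming the channel page embeds `"channelId":"UC..."` in its HTML.
async function channelUrlToRssFeed(channelUrl: string): Promise<string> {
  const res = await fetch(channelUrl);
  const html = await res.text();

  // Channel IDs are 24 characters long and start with "UC".
  const match = html.match(/"channelId":"(UC[\w-]{22})"/);
  if (!match) {
    throw new Error(`No channel ID found in ${channelUrl}`);
  }

  // YouTube exposes a per-channel RSS feed at this well-known URL.
  return `https://www.youtube.com/feeds/videos.xml?channel_id=${match[1]}`;
}

// Example usage (hypothetical channel handle):
// channelUrlToRssFeed("https://www.youtube.com/@somecreator").then(console.log);
```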

2. 🧾 Content Processing

  • Supadata API: Extracts transcript from YouTube video
  • OpenRouter AI: Summarizes transcript + generates content per platform
  • Conditional Check: Prevents duplicate content by checking existing records

3. 📤 Multi-Platform Output

  • LinkedIn: Story-driven format (≤ 1300 characters)
  • X/Twitter: Short, punchy copy (≤ 280 characters)
  • Threads: Friendly, conversational
  • Instagram: Short captions for visual posts
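A minimal sketch of how these per-platform rules could be kept in one place so that both the AI prompt and a post-generation length check read from the same source. The `tone` strings are illustrative assumptions; only the LinkedIn and X/Twitter limits come from the workflow description above.

```ts
// Sketch: per-platform constraints used both to build prompts and to
// validate generated posts before they are written to the sheet.
type Platform = "linkedin" | "x" | "threads" | "instagram";

interface PlatformRules {
  tone: string;       // guidance injected into the AI prompt
  maxChars?: number;  // hard limit checked after generation, if any
}

const RULES: Record<Platform, PlatformRules> = {
  linkedin:  { tone: "story-driven, professional", maxChars: 1300 },
  x:         { tone: "short, punchy",              maxChars: 280 },
  threads:   { tone: "friendly, conversational" },
  instagram: { tone: "short caption for a visual post" },
};

// True if the post fits the platform limit (or the platform has no hard limit).
function withinLimit(platform: Platform, post: string): boolean {
  const limit = RULES[platform].maxChars;
  return limit === undefined || post.length <= limit;
}
```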

4. 🗃️ Data Management

  • Google Sheets (Write): Stores video metadata + generated posts
  • Telegram Bot: Sends content preview
  • ID Tracking: Avoids reprocessing using video ID
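A minimal sketch of the video-ID tracking idea, assuming the rows read back from Google Sheets expose a `videoId` column. The column name and object shapes are hypothetical; in the workflow this logic sits in the conditional / ID-tracking step.

```ts
// Sketch: keep only RSS items whose video ID is not yet recorded in the sheet.
interface SheetRow { videoId: string }
interface RssItem  { videoId: string; title: string; link: string }

function filterNewVideos(rssItems: RssItem[], sheetRows: SheetRow[]): RssItem[] {
  const seen = new Set(sheetRows.map((row) => row.videoId));
  return rssItems.filter((item) => !seen.has(item.videoId));
}
```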

🔐 Required Credentials

  • Google Sheets OAuth2
  • Supadata API
  • OpenRouter API
  • Telegram Bot Token & Chat ID

🎁 Benefits

  • ⌛ Save Time: Automates transcript + content generation
  • 🔊 Consistent Tone: Adjust AI prompts for brand voice
  • 📑 Multi-Platform Ready: One video → multiple formats
  • 📂 Centralized Logs via Google Sheets: Easily track, audit, and collaborate
  • 🚀 Scalable: Handle many channels with ease

n8n Multi-Platform Content Generator from YouTube using AI and RSS

This n8n workflow automates the generation of multi-platform content from new YouTube videos using AI. It monitors an RSS feed for new videos, extracts the video details, fetches the transcript, generates AI-powered content (such as summaries or social media posts), and then stores and distributes that content.

What it does

This workflow performs the following key steps:

  1. Monitors RSS Feed: Regularly checks a specified RSS feed (likely a YouTube channel's RSS feed) for new video entries.
  2. Filters New Items: Ensures that only newly published items are processed, preventing duplicate content generation.
  3. Extracts Video URL: From the RSS feed item, it extracts the YouTube video URL (see the sketch after this list).
  4. Fetches Video Transcript (via HTTP Request): Makes an HTTP request to an external service (likely a YouTube transcript API or a custom script) to retrieve the video transcript.
  5. Generates AI Content: Uses an AI language model (via LangChain's OpenRouter Chat Model) to process the video transcript and generate various content formats (e.g., summary, social media posts, blog ideas).
  6. Formats Content: Utilizes a "Set" node to structure the AI-generated content into a desired format.
  7. Saves to Google Sheets: Stores the generated content and video details in a Google Sheet for record-keeping or further use.
  8. Publishes to Telegram: Sends a notification or the generated content to a Telegram channel or chat.
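Steps 2 and 3 reduce to isolating the video ID from each feed item's watch URL. A minimal sketch, assuming the RSS node exposes the standard `link` field of YouTube's feed; the function name is illustrative.

```ts
// Sketch: derive the video ID from a YouTube RSS item's watch link,
// e.g. "https://www.youtube.com/watch?v=dQw4w9WgXcQ" -> "dQw4w9WgXcQ".
// The same ID can double as the deduplication key used in step 2.
function videoIdFromLink(link: string): string {
  const id = new URL(link).searchParams.get("v");
  if (!id) {
    throw new Error(`Not a recognizable YouTube watch URL: ${link}`);
  }
  return id;
}
```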

Prerequisites/Requirements

To use this workflow, you will need:

  • n8n Instance: A running instance of n8n.
  • RSS Feed URL: The URL of the YouTube channel's RSS feed you wish to monitor.
  • Google Sheets Account: Configured credentials for Google Sheets to store data.
  • Telegram Bot Token and Chat ID: Configured credentials for Telegram to send messages.
  • OpenRouter API Key: An API key for OpenRouter to access their AI language models.
  • YouTube Transcript Service (External): An external service or API endpoint capable of fetching YouTube video transcripts given a URL. The workflow uses an HTTP Request node, implying an external API call.
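Because the transcript service is external, the workflow only needs an HTTP call shaped roughly like the sketch below. The base URL, header name, and response field are assumptions for illustration, not a documented API; match them to your provider's documentation (for example, Supadata's transcript endpoint used in this template).

```ts
// Sketch: fetch a transcript for a YouTube video from an external service.
// The endpoint, header name, and response field below are assumptions,
// not a documented API; adjust them to match your transcript provider.
async function fetchTranscript(videoUrl: string, apiKey: string): Promise<string> {
  const endpoint = new URL("https://transcript.example.com/v1/transcript");
  endpoint.searchParams.set("url", videoUrl);

  const res = await fetch(endpoint.toString(), { headers: { "x-api-key": apiKey } });
  if (!res.ok) {
    throw new Error(`Transcript service returned ${res.status}`);
  }

  const body = await res.json();
  return body.text as string; // assumed field holding the plain-text transcript
}
```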

Setup/Usage

  1. Import the Workflow:
    • Copy the provided JSON code.
    • In your n8n instance, click on "Workflows" in the left sidebar.
    • Click "New" and then "Import from JSON".
    • Paste the JSON code and click "Import".
  2. Configure Credentials:
    • Google Sheets: Locate the "Google Sheets" node (ID: 18) and configure your Google Sheets credentials. You'll need to specify the Spreadsheet ID and Sheet Name where the data will be stored.
    • Telegram: Locate the "Telegram" node (ID: 49) and configure your Telegram credentials (Bot Token and Chat ID).
    • OpenRouter Chat Model: Locate the "OpenRouter Chat Model" node (ID: 1281) and configure your OpenRouter API key.
  3. Configure RSS Feed:
    • Locate the "RSS Read" node (ID: 37).
    • Update the "URL" field with the RSS feed URL of the YouTube channel you want to monitor.
  4. Configure HTTP Request for Transcript:
    • Locate the "HTTP Request" node (ID: 19).
    • Update the "URL" field to point to your YouTube transcript fetching service. Ensure the request body or parameters are correctly set to pass the YouTube video URL.
  5. Configure AI Prompts:
    • Locate the "Basic LLM Chain" node (ID: 1123).
    • Review and adjust the prompts within this node to guide the AI in generating the desired content (e.g., summary format, social media post length, specific keywords).
  6. Activate the Workflow:
    • Once all credentials and configurations are set, click the "Activate" toggle in the top right corner of the workflow editor to start the automation.

The workflow will now periodically check the RSS feed, process new videos with AI, and publish the generated content.
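As a reference point for step 5, here is a minimal sketch of a per-platform generation call against OpenRouter's OpenAI-compatible chat completions endpoint. The model ID and prompt wording are illustrative and should be replaced by whatever is configured in the Basic LLM Chain node.

```ts
// Sketch: generate one platform-specific post from a transcript summary via
// OpenRouter's OpenAI-compatible chat completions API. The model ID and the
// prompt text are illustrative, not the template's exact configuration.
async function generatePost(
  summary: string,
  platform: string,
  maxChars: number,
  apiKey: string,
): Promise<string> {
  const res = await fetch("https://openrouter.ai/api/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "openai/gpt-4o-mini", // any model ID available on OpenRouter
      messages: [
        { role: "system", content: `Write a ${platform} post under ${maxChars} characters.` },
        { role: "user", content: summary },
      ],
    }),
  });

  if (!res.ok) {
    throw new Error(`OpenRouter returned ${res.status}`);
  }
  const data = await res.json();
  return data.choices[0].message.content;
}
```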

Related Templates

Automate Dutch Public Procurement Data Collection with TenderNed

TenderNed Public Procurement

What This Workflow Does

This workflow automates the collection of public procurement data from TenderNed (the official Dutch tender platform). It:

  • Fetches the latest tender publications from the TenderNed API
  • Retrieves detailed information in both XML and JSON formats for each tender
  • Parses and extracts key information such as organization names, titles, descriptions, and reference numbers
  • Filters results based on your custom criteria
  • Stores the data in a database for easy querying and analysis

Setup Instructions

This template comes with sticky notes providing step-by-step instructions in Dutch and various query options you can customize.

Prerequisites

  • TenderNed API access: register at TenderNed for API credentials

Configuration Steps

  1. Set up TenderNed credentials: Add HTTP Basic Auth credentials with your TenderNed API username and password, then apply them to the three HTTP Request nodes: "Tenderned Publicaties", "Haal XML Details", and "Haal JSON Details".
  2. Customize filters: Modify the "Filter op ..." node to match your specific requirements (for example, specific organizations, contract values, or regions).

How It Works

  1. Trigger: The workflow can be triggered manually for testing or automatically on a daily schedule.
  2. Fetch Publications: Makes an API call to TenderNed to retrieve a list of recent publications (up to 100 per request).
  3. Process & Split: Extracts the tender array from the response and splits it into individual items for processing.
  4. Fetch Details: For each tender, the workflow makes two parallel API calls: the XML endpoint retrieves the complete tender documentation, and the JSON endpoint fetches metadata including reference numbers and keywords.
  5. Parse & Merge: Parses the XML data and merges it with the JSON metadata and batch information into a single data structure.
  6. Extract Fields: Maps the raw API data to clean, structured fields, including publication ID and date, organization name, tender title and description, and reference numbers (kenmerk, TED number).
  7. Filter: Applies your custom filter criteria to focus on relevant tenders only.
  8. Store: Inserts the processed data into your database for storage and future analysis.

Customization Tips

  • Modify API parameters: In the "Tenderned Publicaties" node you can adjust offset (starting position for pagination) and size (number of results per request, max 100), and add query parameters for date ranges, status filters, etc.
  • Add more fields: Extend the "Splits Alle Velden" node to extract additional fields from the XML/JSON data, such as contract value estimates, deadline dates, CPV codes (procurement classification), and contact information.
  • Integrate notifications: Add a Slack, Email, or Discord node after the filter to get notified about new matching tenders.
  • Incremental updates: Modify the workflow to only fetch new tenders by storing the last execution timestamp, adding date filters to the API query, and only processing publications newer than the last run.

Troubleshooting

No data returned?

  • Verify your TenderNed API credentials are correct
  • Check that your filter is set up properly

Need help setting this up or interested in a complete tender analysis solution? Get in touch: 🔗 LinkedIn – Wessel Bulte

By Wessel Bulte

🎓 How to transform unstructured email data into a structured format with AI agent

This workflow automates the process of extracting structured, usable information from unstructured email messages across multiple platforms. It connects directly to Gmail, Outlook, and IMAP accounts, retrieves incoming emails, and sends their content to an AI-powered parsing agent built on OpenAI GPT models. The AI agent analyzes each email, identifies relevant details, and returns a clean JSON structure containing key fields:

  • From – sender's email address
  • To – recipient's email address
  • Subject – email subject line
  • Summary – short AI-generated summary of the email body

The extracted information is then automatically inserted into an n8n Data Table, creating a structured database of email metadata and summaries ready for indexing, reporting, or integration with other tools.

Key Benefits

  • ✅ Full Automation: Eliminates manual reading and data entry from incoming emails.
  • ✅ Multi-Source Integration: Handles data from different email providers seamlessly.
  • ✅ AI-Driven Accuracy: Uses advanced language models to interpret complex or unformatted content.
  • ✅ Structured Storage: Creates a standardized, query-ready dataset from previously unstructured text.
  • ✅ Time Efficiency: Processes emails in real time, improving productivity and response speed.
  • ✅ Scalability: Easily extendable to handle additional sources or extract more data fields.

How it works

  1. Email Triggering: The workflow is initiated by one of three email triggers (Gmail, Microsoft Outlook, or a generic IMAP account), which constantly monitor for new incoming emails.
  2. AI-Powered Parsing & Structuring: When a new email is detected, its raw, unstructured content is passed to a central "Parsing Agent". This agent uses a specified OpenAI language model to intelligently analyze the email text.
  3. Data Extraction & Standardization: Following a predefined system prompt, the AI agent extracts key information from the email, such as the sender, recipient, subject, and a generated summary. It then forces the output into a strict JSON structure using a "Structured Output Parser" node, ensuring data consistency.
  4. Data Storage: Finally, the clean, structured data (the from, to, subject, and summarize fields) is inserted as a new row into a specified n8n Data Table, creating a searchable and reportable database of email information.

Set up steps

  1. Prepare the Data Table: Create a new Data Table within n8n and define string-type columns named From, To, Subject, and Summary.
  2. Configure email credentials: Set up the credential connections for the email services you wish to use (Gmail OAuth2, Microsoft Outlook OAuth2, and/or IMAP). Ensure the accounts have the necessary permissions to read emails.
  3. Configure AI model credentials: Set up the OpenAI API credential with a valid API key. The workflow is configured to use the model, but this can be changed in the respective nodes if needed.
  4. Connect the nodes: The workflow canvas is already correctly wired. Visually confirm that the email triggers are connected to the "Parsing Agent", which is connected to the "Insert row" (Data Table) node, and that the "OpenAI Chat Model" and "Structured Output Parser" are connected to the "Parsing Agent" as its AI model and output parser, respectively.
  5. Activate the workflow: Save the workflow and toggle the "Active" switch to ON. The triggers will begin polling for new emails according to their schedule (e.g., every minute), and the automation will start processing incoming messages.

Need help customizing? Contact me for consulting and support, or add me on LinkedIn.

By Davide

Tax deadline management & compliance alerts with GPT-4, Google Sheets & Slack

AI-Driven Tax Compliance & Deadline Management System

Description

Automate tax deadline monitoring with AI-powered insights. This workflow checks your tax calendar daily at 8 AM, uses GPT-4 to analyze upcoming deadlines across multiple jurisdictions, detects overdue and critical items, and sends intelligent alerts via email and Slack only when immediate action is required. Perfect for finance teams and accounting firms who need proactive compliance management without manual tracking. 🏛️🤖📊

Good to Know

  • AI-Powered: GPT-4 provides risk assessment and strategic recommendations
  • Multi-Jurisdiction: Handles Federal, State, and Local tax requirements automatically
  • Smart Alerts: Only notifies executives when deadlines are overdue or critical (≤ 3 days)
  • Priority Classification: Categorizes deadlines as Overdue, Critical, High, or Medium priority
  • Dual Notifications: Critical alerts to leadership + daily summaries to team channel
  • Complete Audit Trail: Logs all checks and deadlines to Google Sheets for compliance records

How It Works

  1. Daily Trigger: Runs at 8:00 AM every morning
  2. Fetch Data: Pulls tax calendar and company configuration from Google Sheets
  3. Analyze Deadlines: Calculates days remaining, filters by jurisdiction/entity type, categorizes by priority
  4. AI Analysis: GPT-4 provides strategic insights and risk assessment on upcoming deadlines
  5. Smart Routing: Only sends alerts if overdue or critical deadlines exist
  6. Critical Alerts: HTML email to executives + Slack alert for urgent items
  7. Team Updates: Slack summary to finance channel with all upcoming deadlines
  8. Logging: Records compliance check results to Google Sheets for audit trail

Requirements

Google Sheets Structure

Sheet 1: TaxCalendar

| DeadlineID | DeadlineName | DeadlineDate | Jurisdiction | Category | AssignedTo | IsActive |
|------------|--------------|--------------|--------------|----------|------------|----------|
| FED-Q1 | Form 1120 Q1 | 2025-04-15 | Federal | Income | John Doe | TRUE |

Sheet 2: CompanyConfig (single row)

| Jurisdictions | EntityType | FiscalYearEnd |
|---------------|------------|---------------|
| Federal, California | Corporation | 12-31 |

Sheet 3: ComplianceLog (auto-populated)

| Date | AlertLevel | TotalUpcoming | CriticalCount | OverdueCount |
|------|------------|---------------|---------------|--------------|
| 2025-01-15 | HIGH | 12 | 3 | 1 |

Credentials Needed

  • Google Sheets: Service Account OAuth2
  • OpenAI: API Key (GPT-4 access required)
  • SMTP: Email account for sending alerts
  • Slack: Bot Token with chat:write permission

Setup Steps

  1. Import workflow JSON into n8n
  2. Add all 4 credentials
  3. Replace these placeholders:
    • YOURTAXCALENDAR_ID - Tax calendar sheet ID
    • YOURCONFIGID - Company config sheet ID
    • YOURLOGID - Compliance log sheet ID
    • C12345678 - Slack channel ID
    • tax@company.com - Sender email
    • cfo@company.com - Recipient email
  4. Share all sheets with the Google service account email
  5. Invite the Slack bot to channels
  6. Test the workflow manually
  7. Activate the trigger

Customizing This Workflow

  • Change Alert Thresholds: Edit the "Analyze Deadlines" node. Critical: change <= 3 to <= 5 for a 5-day warning; High: change <= 7 to <= 14 for a 2-week notice; Medium: change <= 30 to <= 60 for a 2-month lookout.
  • Adjust Schedule: Edit the "Daily Tax Check" trigger. Change hour/minute for a different run time, or add multiple trigger times for tax season (8 AM, 2 PM, 6 PM).
  • Add More Recipients: Edit the "Send Email" node. To: cfo@company.com, director@company.com; CC: accounting@company.com; BCC: archive@company.com
  • Customize Email Design: Edit the "Format Email" node to change colors, add a logo, or modify the layout.
  • Add SMS Alerts: Insert a Twilio node after "Is Critical" for emergency notifications.
  • Integrate Task Management: Add an HTTP Request node to create tasks in Asana/Jira for critical deadlines.

Troubleshooting

| Issue | Solution |
|-------|----------|
| No deadlines found | Check date format (YYYY-MM-DD) and IsActive = TRUE |
| AI analysis failed | Verify OpenAI API key and account credits |
| Email not sending | Test SMTP credentials and check if critical condition met |
| Slack not posting | Invite bot to channel and verify channel ID format |
| Permission denied | Share Google Sheets with service account email |

📞 Professional Services

Need help with implementation or customization? Our team offers:

  • 🎯 Custom workflow development
  • 🏢 Enterprise deployment support
  • 🎓 Team training sessions
  • 🔧 Ongoing maintenance
  • 📊 Custom reporting & dashboards
  • 🔗 Additional API integrations

Discover more workflows – Get in touch with us

By Oneclick AI Squad