
🎥 Analyze YouTube video for summaries, transcripts & content + Google Gemini AI

Joseph LePage
8153 views
2/3/2026

🎥 Analyze YouTube Video for Summaries, Transcripts & Content + Google Gemini

Who is this for?

This workflow is ideal for content creators, video marketers, and research professionals who need to extract actionable insights, detailed transcripts, or metadata from YouTube videos efficiently. It is particularly useful for those leveraging AI tools to analyze video content and optimize audience engagement.

What problem does this workflow solve? / Use case

Analyzing video content manually can be time-consuming and prone to errors. This workflow automates the process by extracting key metadata, generating summaries, and providing structured transcripts tailored to specific use cases. It helps users save time and ensures accurate data extraction for content optimization.

What this workflow does

  • Extracts audience-specific metadata (e.g., video type, tone, key topics, engagement drivers).
  • Generates customized outputs based on six prompt types:
    • Default: Actionable insights and strategies.
    • Transcribe: Verbatim transcription.
    • Timestamps: Timestamped dialogue.
    • Summary: Concise bullet-point summary.
    • Scene: Visual descriptions of settings and techniques.
    • Clips: High-engagement video segments with timestamps.
  • Saves extracted data as a text file in Google Drive.
  • Sends analyzed outputs via Gmail or provides them in a completion form.
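The six prompt types above can be modeled as a simple lookup that a Code node might use to pick the instruction sent to Gemini. This is an illustrative sketch only: the keys match the dropdown options, but the instruction wording is hypothetical, not the template's exact prompts.

```javascript
// Hypothetical mapping from the dropdown's prompt type to a Gemini instruction.
const PROMPTS = {
  default: "Extract actionable insights and strategies from this video.",
  transcribe: "Provide a verbatim transcription of the video.",
  timestamps: "Provide the dialogue with timestamps.",
  summary: "Summarize the video as concise bullet points.",
  scene: "Describe the visual settings and techniques shown.",
  clips: "List high-engagement segments with start/end timestamps.",
};

function buildPrompt(promptType) {
  const instruction = PROMPTS[promptType];
  if (!instruction) {
    throw new Error(`Unknown prompt type: ${promptType}`);
  }
  return instruction;
}
```

A Switch node in the workflow can achieve the same routing without code; the lookup form just makes the six branches easy to see at a glance.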

Setup

  1. Configure API keys:
    • Add your Google API key as an environment variable.
  2. Input requirements:
    • Provide the YouTube video ID (e.g., wBuULAoJxok).
    • Select a prompt type from the dropdown menu.
  3. Connect credentials:
    • Set up Google Drive and Gmail integrations in n8n.

How to customize this workflow to your needs

  • Modify the metadata prompt to extract additional fields relevant to your use case.
  • Adjust the output format for summaries or transcripts based on your preferences (e.g., structured bullets or plain text).
  • Add nodes to integrate with other platforms like Slack or Notion for further collaboration.

Example Usage

  1. Input: YouTube video ID (wBuULAoJxok) and prompt type (summary).
  2. Output: A concise summary highlighting actionable insights, tools, and resources mentioned in the video.

n8n YouTube Video Analyzer with Google Gemini AI

This n8n workflow provides a streamlined way to analyze YouTube videos using Google Gemini AI, generating summaries and transcripts, and then delivering the results via email. It's designed to automate the process of extracting key information from video content and sharing it efficiently.

What it does

This workflow automates the following steps:

  1. Triggers on Form Submission: The workflow starts when a user submits a form, likely containing a YouTube video URL and a recipient email address.
  2. Extracts Video ID: It processes the provided YouTube URL to extract the video ID, which is essential for interacting with the YouTube API.
  3. Fetches YouTube Transcript: It makes an HTTP request to a YouTube transcript API (likely youtube-transcript-api.vercel.app) to retrieve the full transcript of the specified video.
  4. Generates AI Summary: The retrieved transcript is then sent to a Google Gemini AI model to generate a concise summary of the video content.
  5. Formats Output: The raw transcript and the AI-generated summary are formatted into a readable Markdown document.
  6. Sends Email with Analysis: Finally, the formatted summary and transcript are sent as an email to the specified recipient.
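Steps 2 and 3 hinge on extracting a clean video ID from whatever the form submits. A Code-node sketch of that extraction, assuming the form may receive full watch URLs, short youtu.be links, or bare IDs (the actual node's logic may differ):

```javascript
// Sketch of the "Extracts Video ID" step: pull the 11-character ID
// from common YouTube URL shapes, or accept a bare ID.
function extractVideoId(input) {
  const patterns = [
    /(?:youtube\.com\/watch\?v=)([\w-]{11})/, // standard watch URL
    /(?:youtu\.be\/)([\w-]{11})/,             // short link
    /^([\w-]{11})$/,                          // bare video ID
  ];
  for (const re of patterns) {
    const m = input.match(re);
    if (m) return m[1];
  }
  return null; // no recognizable ID; the workflow should surface an error
}
```

The returned ID then feeds the HTTP Request node that calls the transcript API.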

Prerequisites/Requirements

To use this workflow, you will need:

  • n8n Instance: A running n8n instance to import and execute the workflow.
  • YouTube Transcript API: Access to a YouTube transcript API (the workflow uses youtube-transcript-api.vercel.app).
  • Google Gemini AI: A Google Gemini AI API key and access to the Gemini model for content summarization.
  • Gmail Account: A configured Gmail credential in n8n to send emails.
  • Google Drive (Optional): Although a Google Drive node is present, it's currently disconnected and not actively used in the main flow.

Setup/Usage

  1. Import the Workflow: Download the provided JSON and import it into your n8n instance.
  2. Configure Credentials:
    • Google Gemini AI: Set up a credential for Google Gemini AI. This will likely involve providing your API key.
    • Gmail: Configure your Gmail OAuth2 or API key credential within n8n.
  3. Configure the "On form submission" Trigger:
    • Review the form fields to ensure they align with the expected input (e.g., youtubeUrl, email).
    • Activate the workflow.
  4. Test the Workflow: Submit the form with a YouTube video URL and an email address to test the end-to-end functionality.
  5. Customize (Optional):
    • Modify the "HTTP Request" node if you are using a different YouTube transcript API.
    • Adjust the prompt in the "Google Gemini AI" node to fine-tune the summarization output.
    • Change the email content or recipient logic in the "Gmail" node as needed.
    • If you wish to save the output to Google Drive, connect and configure the "Google Drive" node.

Related Templates

Track software vulnerability patents with ScrapeGraphAI, Matrix, and Intercom

Software Vulnerability Patent Tracker

⚠️ COMMUNITY TEMPLATE DISCLAIMER: This is a community-contributed template that uses ScrapeGraphAI (a community node). Please ensure you have the ScrapeGraphAI community node installed in your n8n instance before using this template.

This workflow automatically tracks newly published patent filings that mention software-security vulnerabilities, buffer-overflow mitigation techniques, and related technology keywords. Every week it aggregates fresh patent data from USPTO and international patent databases, filters it by relevance, and delivers a concise JSON digest (and an optional Intercom notification) to R&D teams and patent attorneys.

Prerequisites/Requirements

  • n8n instance (self-hosted or n8n cloud, v1.7.0+)
  • ScrapeGraphAI community node installed
  • Basic understanding of patent search syntax (for customizing keyword sets)
  • Optional: Intercom account for in-app alerts

Required Credentials

| Credential | Purpose |
|------------|---------|
| ScrapeGraphAI API Key | Enables ScrapeGraphAI nodes to fetch and parse patent-office webpages |
| Intercom Access Token (optional) | Sends weekly digests directly to an Intercom workspace |

Additional Setup Requirements

| Setting | Recommended Value | Notes |
|---------|-------------------|-------|
| Cron schedule | 0 9 * * 1 | Triggers every Monday at 09:00 server time |
| Patent keyword matrix | See example CSV below | List of comma-separated keywords per tech focus |

Example keyword matrix (upload as keywords.csv or paste into the "Matrix" node):

topic,keywords
Buffer Overflow,"buffer overflow, stack smashing, stack buffer"
Memory Safety,"memory safety, safe memory allocation, pointer sanitization"
Code Injection,"SQL injection, command injection, injection prevention"

How it works

  1. Schedule Trigger: Fires weekly based on the configured cron expression.
  2. Matrix (Keyword Loader): Loads the CSV-based technology keyword matrix into memory.
  3. Code (Build Search Queries): Dynamically assembles patent-search URLs for each keyword group.
  4. ScrapeGraphAI (Fetch Results): Scrapes USPTO, EPO, and WIPO result pages and parses titles, abstracts, publication numbers, and dates.
  5. If (Relevance Filter): Removes patents older than one year or without vulnerability-related terms in the abstract.
  6. Set (Normalize JSON): Formats the remaining records into a uniform JSON schema.
  7. Intercom (Notify Team): Sends a summarized digest to your chosen Intercom workspace. (Skip or disable this node if you prefer to consume the raw JSON output instead.)
  8. Sticky Notes: Contain inline documentation and customization tips for future editors.

Data flow: Schedule Trigger → Matrix → Code → ScrapeGraphAI → If → Set → Intercom

Set up steps (10-15 minutes)

  1. Install the community node: Navigate to "Settings → Community Nodes", search for ScrapeGraphAI, and click "Install".
  2. Create credentials: Go to "Credentials" → "New Credential" → select ScrapeGraphAI API → paste your API key. Optionally add an Intercom credential with a valid access token.
  3. Import the workflow: Click "Import" → "Workflow JSON" and paste the template JSON, or drag-and-drop the .json file.
  4. Configure the schedule: Open the Schedule Trigger node and adjust the cron expression if a different frequency is required.
  5. Upload/edit the keyword matrix: Open the Matrix node, paste your custom CSV, or modify the existing topics and keywords.
  6. Review the search logic: In the Code (Build Search Queries) node, review the base URLs and adjust patent databases as needed.
  7. Define the notification channel: If using Intercom, select your Intercom credential in the Intercom node and choose the target channel.
  8. Execute and activate: Click "Execute Workflow" for a trial run and verify the output. If satisfied, switch the workflow to "Active".

Customization Examples

Change the data source to Google Patents:

```javascript
// In the Code node
const base = 'https://patents.google.com/?q=';
items.forEach(item => {
  item.json.searchUrl = `${base}${encodeURIComponent(item.json.keywords)}&oq=${encodeURIComponent(item.json.keywords)}`;
});
return items;
```

Send the digest via Slack instead of Intercom:

```javascript
// Replace the Intercom node with a Slack node
{
  "text": `🚀 New Vulnerability-related Patents (${items.length})\n` +
    items.map(i => `• <${i.json.link}|${i.json.title}>`).join('\n')
}
```

Data Output Format

The workflow outputs structured JSON data:

```json
{
  "topic": "Memory Safety",
  "keywords": "memory safety, safe memory allocation, pointer sanitization",
  "title": "Memory protection for compiled binary code",
  "publicationNumber": "US20240123456A1",
  "publicationDate": "2024-03-21",
  "abstract": "Techniques for enforcing memory safety in compiled software...",
  "link": "https://patents.google.com/patent/US20240123456A1/en",
  "source": "USPTO"
}
```

Troubleshooting

  • Empty result set: Ensure the keywords are specific but not overly narrow; test queries manually on USPTO.
  • ScrapeGraphAI timeouts: Increase the timeout parameter in the ScrapeGraphAI node or reduce concurrent requests.

Performance Tips

  • Limit the keyword matrix to fewer than 50 rows to keep weekly runs under 2 minutes.
  • Schedule the workflow during off-peak hours to reduce load on patent-office servers.

Pro Tips

  • Combine this workflow with a vector database (e.g., Pinecone) to create a semantic patent knowledge base.
  • Add a "Merge" node to correlate new patents with existing vulnerability CVE entries.
  • Use a second ScrapeGraphAI node to crawl citation trees and identify emerging technology clusters.
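As a concrete illustration of the If (Relevance Filter) step described above, here is a minimal sketch. The field names come from the JSON output format shown; the node's actual conditions may be configured differently in the template.

```javascript
// Sketch of the relevance filter: keep patents published within the last
// year whose abstract mentions at least one tracked keyword.
function isRelevant(patent, keywords, now = new Date()) {
  const oneYearAgo = new Date(now);
  oneYearAgo.setFullYear(oneYearAgo.getFullYear() - 1);
  const recent = new Date(patent.publicationDate) >= oneYearAgo;
  const abstractLower = (patent.abstract || "").toLowerCase();
  const matches = keywords.some((k) => abstractLower.includes(k.toLowerCase()));
  return recent && matches;
}
```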

By vinci-king-01
45

Auto Generate Descriptive Node Names with AI for Workflow Readability

⚡Auto Rename n8n Workflow Nodes with AI✨

This workflow uses AI to automatically generate clear and descriptive names for every node in your n8n workflows. It analyzes each node's type, parameters, and connections to create meaningful names, making your workflows instantly readable.

Who is it for?

This workflow is for n8n users who manage complex workflows with dozens of nodes. If you've ever:

  • Built workflows full of generic names like HTTP Request 2 or Edit Fields 1
  • Struggled to understand your own work after a few weeks
  • Copied workflows from others with unclear node names
  • Spent hours manually renaming nodes one by one

...then this workflow will save you significant time and effort.

Requirements

  • n8n API Credentials: Must be configured to allow listing and updating workflows
  • AI Provider Credentials: An API key for your preferred AI provider (OpenRouter is used currently)

How it works

  1. Trigger: Launch via form (select from dropdown) or manual trigger (quick testing with a pre-selected workflow)
  2. Fetch: Retrieve the target workflow's JSON and extract nodes and connections
  3. Generate: Send the workflow JSON to the AI, which creates a unique, descriptive name for every node
  4. Validate: Verify the AI mapping covers all original node names
  5. Apply: If valid, update all node names, parameter references, and connections throughout the workflow
  6. Save: Save/update the workflow with renamed nodes and provide links to both new and previous versions

If validation fails (e.g., the AI missed nodes), the workflow stops with an error. You can modify the error handling to retry or loop back to the AI node.

Setup

  1. Connect n8n API credentials: Open any n8n node in the workflow and make sure your n8n API credential is connected
  2. Configure AI provider credentials: Open the "OpenRouter" node (or replace it with your preferred AI), add your API credentials, and adjust the model if needed (current: openai/gpt-5.1-codex-mini)
  3. Test the workflow: Use the Manual Trigger for quick testing with a pre-selected workflow, or the Form Trigger for a user-friendly interface with workflow selection

Important notice

If you're renaming a currently opened workflow, you must reload the page after execution to see the latest version; n8n doesn't automatically refresh the canvas when workflow versions are updated via API.

Need help? If you're facing any issues using this workflow, join the community discussion on the n8n forum.
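The Validate step mentioned above can be sketched as a simple coverage check over the AI's rename mapping. This is a hypothetical helper, not the template's exact code:

```javascript
// Check that the AI's rename mapping includes every original node name.
// Returns the missing names so the error message can list them.
function validateMapping(originalNames, mapping) {
  const missing = originalNames.filter((name) => !(name in mapping));
  return { valid: missing.length === 0, missing };
}
```

Applying an incomplete mapping would silently leave some nodes (and any expressions referencing them) un-renamed, which is why the workflow stops on failure instead of proceeding.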

By Anan
2650

Automate RSS to Instagram with AI-generated content and Cloudinary

What it does

  • Reads and aggregates news from one or more RSS feeds (customizable by category)
  • Uses AI to select the most relevant or engaging articles
  • Generates a caption and headline with a natural, professional tone
  • Creates a realistic AI-generated image to match the topic
  • Uploads the image to Cloudinary and publishes the post directly to Instagram through the Meta Graph API
  • Runs automatically on schedule (default: every 5 hours) with no manual steps required

Why it's different

  • Works with any subject or niche, from tech to fashion, news, travel, and more
  • Includes a guide with curated RSS feed sources by category, ready to plug in
  • AI-driven content generation for text and visuals, tuned for professional results
  • Fully automated workflow, from discovery to publishing
  • Self-hosted and scalable, with no vendor lock-in

What's included

  • Workflow JSON file (import-ready for n8n)
  • PDF deployment guide covering: how to set up RSS sources by category; configuring APIs (OpenAI, Cloudinary, Meta Graph); scheduling and testing the workflow; recommended best practices for stability and scaling
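Publishing via the Meta Graph API follows Instagram's two-call pattern: first create a media container from the hosted (Cloudinary) image URL, then publish it. A sketch that only builds the two requests; the Graph API version and field names here are assumptions, and the workflow's HTTP Request nodes may be configured differently:

```javascript
// Build the two Instagram Graph API requests for image publishing:
// 1) POST /{ig-user-id}/media        -> returns a creation_id
// 2) POST /{ig-user-id}/media_publish -> publishes that container
function buildPublishRequests(igUserId, imageUrl, caption, creationId) {
  return [
    {
      method: "POST",
      url: `https://graph.facebook.com/v19.0/${igUserId}/media`,
      body: { image_url: imageUrl, caption },
    },
    {
      method: "POST",
      url: `https://graph.facebook.com/v19.0/${igUserId}/media_publish`,
      body: { creation_id: creationId },
    },
  ];
}
```

In practice the second call runs only after the first returns, since its creation_id comes from the container response.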

By Paolo Ronco
65