
Bulk delete all YouTube playlists from your channel

Lucía Maio Brioso

🧑‍💼 Who is this for?

This workflow is for any YouTube user who wants to bulk delete all playlists from their own channel — whether to start fresh, clean up old content, or prepare the account for a new purpose.

It’s useful for:

  • Creators reorganizing their channel
  • People transferring content to another account
  • Anyone who wants to avoid deleting playlists manually one by one

🧠 What problem is this workflow solving?

YouTube does not offer a built-in way to delete multiple playlists at once. If you have dozens or hundreds of playlists, removing them manually is extremely time-consuming.

This workflow automates the entire deletion process in seconds, saving you hours of repetitive effort.

⚙️ What this workflow does

  • Connects to your YouTube account
  • Fetches all playlists you’ve created (excluding system playlists)
  • Deletes them one by one automatically

> ⚠️ This action is irreversible. Once a playlist is deleted, it cannot be recovered. Use with caution.
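For reference, here is a minimal Python sketch (using the `requests` library and a placeholder OAuth2 access token) of the YouTube Data API v3 calls that the two YouTube nodes wrap: `playlists.list` with `mine=true` to fetch your playlists, then `playlists.delete` for each one. In n8n the OAuth2 credential and the nodes handle all of this for you; the sketch only shows what happens under the hood.

```python
# Minimal sketch of the API calls the two YouTube nodes wrap.
# ACCESS_TOKEN is a placeholder for an OAuth2 token with a YouTube scope;
# in n8n the credential supplies this automatically.
import requests

ACCESS_TOKEN = "ya29.your-oauth2-access-token"  # placeholder
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}
API = "https://www.googleapis.com/youtube/v3/playlists"

def list_my_playlists():
    """Page through every playlist created by the authenticated channel."""
    playlists, page_token = [], None
    while True:
        params = {"part": "id,snippet", "mine": "true", "maxResults": 50}
        if page_token:
            params["pageToken"] = page_token
        data = requests.get(API, headers=HEADERS, params=params).json()
        playlists.extend(data.get("items", []))
        page_token = data.get("nextPageToken")
        if not page_token:
            return playlists

for pl in list_my_playlists():
    title = pl["snippet"]["title"]
    # Irreversible: playlists.delete removes the playlist permanently.
    resp = requests.delete(API, headers=HEADERS, params={"id": pl["id"]})
    print(f"Deleted '{title}'" if resp.status_code == 204 else f"Failed: {title}")
```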

🛠️ Setup

  1. 🔐 Create a YouTube OAuth2 credential in n8n for your channel.
  2. 🧭 Assign the credential to both YouTube nodes.
  3. ✅ Click “Test workflow” to execute.

> 🟨 By default, this workflow deletes everything. If you want to be more selective, see the customization tips below.

🧩 How to customize this workflow to your needs

  • Add a confirmation flag: Insert a Set node with a custom field like confirm_delete = true, and follow it with an IF node to prevent accidental execution.

  • ✂️ Delete only some playlists: Add a Filter node after fetching playlists and match by title, ID, or keyword (e.g. only delete playlists containing “old”); the filtering logic is sketched after this list.

  • 🛑 Add a pause before deletion: Insert a Wait or NoOp node to give yourself a moment to cancel before the deletion runs.

  • 🔁 Adapt to scheduled cleanups: Use a Cron trigger if you want to periodically clear temporary playlists.
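As an illustration of the "delete only some playlists" tip, the snippet below shows the kind of title matching a Filter or Code node would perform after the playlists are fetched. The sample playlist items and the "old" keyword are made up for the example; adjust the condition to whatever criterion you need.

```python
# Hypothetical example: keep only playlists whose title contains a keyword,
# mirroring what a Filter (or Code) node would do after the fetch step.
KEYWORD = "old"  # assumption: only playlists mentioning "old" should be deleted

playlists = [
    {"id": "PL111", "snippet": {"title": "Old drafts"}},
    {"id": "PL222", "snippet": {"title": "Publish queue"}},
]

to_delete = [
    pl for pl in playlists
    if KEYWORD.lower() in pl["snippet"]["title"].lower()
]
print([pl["snippet"]["title"] for pl in to_delete])  # ['Old drafts']
```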

Bulk Delete All YouTube Playlists from Your Channel

This n8n workflow provides a simple, manual trigger to interact with your YouTube account. While the workflow's name suggests bulk deletion, the current JSON definition only includes a YouTube node without specific operations configured. This README will describe the workflow as it is defined in the JSON.

What it does

This workflow, in its current state, contains the following components:

  1. Manual Trigger: Initiates the workflow when you manually click 'Execute workflow' in the n8n editor.
  2. YouTube Node: A YouTube node is present, indicating an intention to interact with the YouTube API.
  3. Sticky Note: A sticky note is included, likely for documentation or comments within the workflow canvas.

Note: As per the provided JSON, the YouTube node is present but does not have any specific operation (like listing playlists, deleting videos, etc.) configured. To perform actions like bulk deleting playlists, you would need to configure the YouTube node with the appropriate operation and parameters.

Prerequisites/Requirements

  • n8n Instance: You need a running n8n instance to import and execute this workflow.
  • YouTube Account: A Google account with access to the YouTube channel whose playlists you want to manage.
  • Google OAuth2 Credential for YouTube: To interact with the YouTube API, you will need to set up a Google OAuth2 credential in n8n and connect it to the YouTube node. This credential will grant n8n permission to access your YouTube data.

Setup/Usage

  1. Import the Workflow:
    • In your n8n instance, go to the "Workflows" section.
    • Click "New" or "Import from JSON" and paste the provided JSON content.
  2. Configure YouTube Credentials:
    • Click on the "YouTube" node.
    • Under the "Credentials" section, select an existing Google OAuth2 credential or create a new one.
    • When creating a new credential, follow the n8n documentation for Google OAuth2 (linked in the JSON) to set up your Google Cloud Project, enable the YouTube Data API v3, and obtain your Client ID and Client Secret. Ensure the necessary scopes for YouTube access are included; a quick way to confirm the granted scopes is sketched after these steps.
  3. Configure YouTube Node (for desired action):
    • Currently, the YouTube node is unconfigured for any specific action. To perform tasks like deleting playlists, you would need to:
      • Select the appropriate "Resource" (e.g., "Playlist").
      • Select the desired "Operation" (e.g., "Delete").
      • Provide any necessary parameters (e.g., "Playlist ID").
  4. Execute the Workflow:
    • Click the "Execute Workflow" button (play icon) in the n8n editor to run the workflow manually.
