7 templates found

Move data between JSON and spreadsheets

This workflow shows how to convert JSON data to binary (file) format and how to import JSON data or files into Google Sheets or a local spreadsheet.
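As a rough illustration of the JSON-to-spreadsheet step, here is a minimal sketch (not taken from the workflow itself) of flattening an array of JSON objects into the header-plus-rows shape that spreadsheet nodes expect:

```javascript
// Flatten an array of JSON objects into a header row plus data rows,
// the shape Google Sheets (and most spreadsheet tools) expect.
function jsonToRows(records) {
  // Collect every key that appears in any record, preserving first-seen order.
  const headers = [...new Set(records.flatMap((r) => Object.keys(r)))];
  // Map each record onto the shared header set, blank-filling missing keys.
  const rows = records.map((r) => headers.map((h) => r[h] ?? ""));
  return [headers, ...rows];
}

const rows = jsonToRows([
  { name: "Ada", role: "engineer" },
  { name: "Lin", city: "Oslo" },
]);
console.log(rows);
// → [["name","role","city"],["Ada","engineer",""],["Lin","","Oslo"]]
```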

By Lorena
3639

Generate SEO-optimized blog content with Gemini, Scrapeless and Pinecone RAG

This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

How it works
This advanced automation builds a fully autonomous SEO blog writer using n8n, Scrapeless, LLMs, and the Pinecone vector database. It is powered by a Retrieval-Augmented Generation (RAG) system that collects high-performing blog content, stores it in a vector store, and then generates new blog posts from that knowledge, endlessly.

Part 1: Build a Knowledge Base from Popular Blogs
- Scrape existing articles from a well-established writer (in this case, Mark Manson) using the Scrapeless node.
- Extract content from blog pages and store it in Pinecone, a powerful vector database that supports similarity search.
- Use Gemini Embedding 001 or any other supported embedding model to encode blog content into vectors.
Result: a searchable vector store of expert-level content, ready to be used for content generation and intelligent search.

Part 2: SERP Analysis & AI Blog Generation
- Use Scrapeless' SERP node to fetch search results based on your keyword and search intent.
- Send the results to an LLM (such as Gemini, OpenRouter, or OpenAI) to generate a keyword analysis report in Markdown, then convert it to HTML.
- Extract long-tail keywords, search-intent insights, and content angles from this report.
- Feed everything into another LLM with access to your Pinecone-stored knowledge base, and generate a fully SEO-optimized blog post.

Set up steps
Prerequisites:
- Scrapeless API key
- Pinecone account and index setup
- An embedding model (Gemini, OpenAI, etc.)
- n8n instance with the community node n8n-nodes-scrapeless installed

Credential configuration:
- Add your Scrapeless and Pinecone credentials in n8n under the "Credentials" tab.
- Choose embedding dimensions according to the model you use (e.g., 768 for Gemini Embedding 001).

Key highlights
- Clones a real content creator: replicates knowledge and writing style from top-performing blog authors.
- Auto-scrapes hundreds of blog posts without being blocked.
- Stores expert content in a vector DB to build a reusable knowledge base.
- Performs real-time SERP analysis using Scrapeless to fetch and analyze search data.
- Generates SEO blog drafts using RAG with detailed keyword intelligence.
- Output includes: blog title, HTML summary report, long-tail keywords, and AI-written article body.

RAG + SEO: The Future of Content Creation
This template combines:
- AI reasoning from large language models
- Reliable data scraping from Scrapeless
- Scalable storage via the Pinecone vector DB
- Flexible orchestration using n8n nodes

This is not just an automation: it is a full-stack SEO content machine that enables you to build a domain-specific knowledge base, run intelligent keyword research, and generate traffic-ready content on autopilot.

💡 Use Cases
- SaaS content teams cloning competitor success
- Affiliate marketers scaling high-traffic blog production
- Agencies offering automated SEO content services
- AI researchers building personal knowledge bots
- Writers automating first-draft generation with real-world tone
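The similarity search that powers the RAG retrieval step can be pictured with a toy cosine-similarity ranking. The real workflow delegates this to Pinecone; the three-dimensional vectors below are purely illustrative (real embeddings such as Gemini Embedding 001 have hundreds of dimensions):

```javascript
// Toy cosine similarity, one of the distance metrics vector databases
// like Pinecone use to rank stored vectors against a query embedding.
function cosine(a, b) {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Rank a tiny in-memory "index" against a query vector.
const index = [
  { id: "post-1", vec: [0.9, 0.1, 0.0] },
  { id: "post-2", vec: [0.1, 0.9, 0.1] },
];
const query = [1, 0, 0];
const ranked = [...index].sort(
  (x, y) => cosine(query, y.vec) - cosine(query, x.vec)
);
console.log(ranked[0].id); // → "post-1" (closest to the query)
```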

By scrapeless official
1228

Automated execution cleanup system with n8n API and custom retention rules

Make your n8n instance faster, cleaner, and more efficient by deleting old workflow executions while keeping only the most recent ones you actually need. Whether you're using n8n Cloud or self-hosted, this lightweight workflow helps reduce database/storage usage and improves UI responsiveness, using only official n8n nodes.

🔍 Description
Automatically clean up old executions in your n8n instance using only official nodes, with no external database queries required. Whether you're on the Cloud version or running self-hosted, this workflow helps you optimize performance and keep your instance tidy by maintaining only the most recent executions per workflow. Ideal for users managing dozens or hundreds of workflows, it reduces storage usage and improves the responsiveness of the n8n UI, especially in environments where execution logs accumulate quickly.

✅ What It Does
- Retrieves up to 250 recent executions across all workflows
- Groups executions by workflow
- Keeps only the most recent N executions per workflow (configurable)
- Deletes all older executions, regardless of status (success, error, etc.)
- Works entirely with native n8n nodes; no external database access required
- Optionally: set the number of executions to keep to 0 to delete all past executions from your instance in a single run

🛠️ How to Set Up
1. 🔑 Create a Personal API Key in your n8n instance: go to Settings → API Keys → Create a new key.
2. 🔧 Create a new n8n API credential (used by both nodes). In your n8n credentials panel:
   - Name: anything you like (e.g., "Internal API Access")
   - API Key: paste the Personal API Key you just created
   - Base URL: your full n8n instance URL with the /api/v1 path, e.g. https://your-n8n-instance.com/api/v1
3. ✅ Use this credential in both the Get Many Executions node (to fetch recent executions) and the Delete Many Executions node (to remove outdated executions).
4. 🧩 In the "Set Executions to Keep" node, edit the variable executionsToKeep and set the number of most recent executions to retain per workflow (e.g., 10). Tip: set it to 0 to delete all executions.

📦 Note: the Get Many Executions node retrieves up to 250 executions per run; this is the maximum allowed by the n8n API.
🧠 No further setup is required: the filtering and grouping logic is handled inside the Code node automatically.

🧪 Included Nodes Overview
- 🕒 Schedule Trigger → set to run daily, weekly, etc.
- 📥 Get Many Executions → fetches past executions via the n8n API
- 🛠️ Set Executions to Keep → sets how many recent executions to keep
- 🧠 Code Node → filters out executions to delete per workflow
- 🗑️ Delete Executions → deletes outdated executions

💡 Why Use This?
- Reduce clutter and improve performance in your n8n instance
- Maintain execution logs only while they're useful
- Avoid bloating your storage or database with obsolete data
- Compatible with both n8n Cloud and self-hosted setups
- Uses only official, supported n8n nodes: no SQL, no extra setup

🔒 This workflow modifies and deletes execution data. Always review and test it first on a staging instance or on a limited set of workflows before using it in production.
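The grouping-and-filtering logic inside the Code node can be sketched roughly as follows. The field names workflowId and startedAt follow the n8n executions API, but treat them as assumptions and verify against your own payload:

```javascript
// Given a flat list of executions, keep the newest N per workflow and
// return the rest as deletion candidates.
function findDeletable(executions, keepPerWorkflow) {
  // Group executions by their parent workflow.
  const byWorkflow = new Map();
  for (const ex of executions) {
    if (!byWorkflow.has(ex.workflowId)) byWorkflow.set(ex.workflowId, []);
    byWorkflow.get(ex.workflowId).push(ex);
  }
  // Within each group, sort newest first and mark everything past N.
  const toDelete = [];
  for (const group of byWorkflow.values()) {
    group.sort((a, b) => new Date(b.startedAt) - new Date(a.startedAt));
    toDelete.push(...group.slice(keepPerWorkflow));
  }
  return toDelete;
}
```

Note that passing 0 for keepPerWorkflow marks every execution for deletion, which matches the "delete all" option described above.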

By Arlin Perez
844

Auto-generate WordPress articles from news with Claude AI and LinkedIn sharing

How it works
This workflow sources news from news websites. The information is passed to an LLM, which processes the article's content. An editor then approves or rejects the article. If accepted, the article is first published on the WordPress site and then shared on the LinkedIn page.

Setup Instructions

Credentials
You'll need to add credentials for the following services in your n8n instance:
- News API: a credential for your chosen news provider.
- LLM: your API key for the LLM you want to use.
- Google OAuth: for both Gmail and Google Sheets.
- WordPress OAuth2: to publish articles via the API. See the WordPress Developer Docs.
- LinkedIn OAuth2: to share the post on a company page.

Node Configuration
Don't forget to:
- Fetch News (HTTP Request): set the query parameters (keywords, language, etc.) for your news source.
- Basic LLM Chain: review and customize the prompt to match your desired tone, language, and style.
- Approval request (Gmail): set your email address in the Send To field.
- HTTP Request WP - Push article: replace <site_Id> in the URL with your WordPress Site ID.
- getImageId (Code node): update the array with your image IDs from the WordPress Media Library.
- Create a post (LinkedIn): enter your LinkedIn Organization ID.
- Append row in sheet (Google Sheets): select your Google Sheet file and the target sheet.
- All email nodes: make sure the Send To field contains your email address.
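The getImageId Code node mentioned above can be as simple as picking a random entry from a hard-coded array. The IDs below are placeholders; replace them with real IDs from your own Media Library:

```javascript
// Pick a random featured-image ID from a hard-coded list of
// WordPress Media Library IDs (the values here are placeholders).
const imageIds = [101, 102, 103]; // ← replace with your own media IDs

function getRandomImageId(ids) {
  return ids[Math.floor(Math.random() * ids.length)];
}

console.log(getRandomImageId(imageIds)); // one of 101, 102, or 103
```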

By Marco Venturi
711

Supabase storage tutorial: Upload, fetch, sign & list files

Learn Supabase Storage Fundamentals with n8n
This template demonstrates how to integrate Supabase Storage with n8n for uploading, fetching, generating temporary signed URLs, and listing files. It's a beginner-friendly workflow that helps you understand how to connect Supabase's storage API with n8n automation.

---

Who it's for
- Developers and teams new to Supabase who want a hands-on learning workflow.
- Anyone looking to automate file uploads and retrieval from Supabase Storage.
- Educators or technical teams teaching Supabase fundamentals with practical demos.

---

How it works
1. Upload File – a user uploads a file through an n8n form, which is stored in a Supabase storage bucket.
2. Fetch File – retrieve files by providing their filename.
3. Temporary Access – generate signed URLs with custom expiry for secure file sharing.
4. List Objects – view all stored files in the chosen Supabase bucket.

---

How to set up
1. Create a Supabase account and set up a project.
2. Create a bucket in Supabase (e.g., test-n8n).
3. Get your Project URL and Anon Key from Supabase.
4. In n8n, create a Supabase API credential using your keys.
5. Import this workflow and connect it with your credentials.
6. Run the forms to test file upload, retrieval, and listing.

---

Requirements
- A Supabase project with storage enabled.
- A configured Supabase API credential in n8n.

---

Customization
- Change the bucket name (test-n8n) to your own.
- Adjust signed-URL expiry times for temporary file access.
- Replace Supabase with another S3-compatible storage if needed.
- Extend the workflow with notifications (Slack, email) after file upload.

---

📝 Lessons Included
- Lesson 1 – Upload a file to Supabase storage.
- Lesson 2 – Fetch a file from storage.
- Lesson 3 – Create a temporary signed URL with expiry.
- Lesson 4 – List all items in Supabase storage.

---

🔑 Prerequisites
- Supabase account + project.
- Project URL and API Key (Anon).
- Bucket created in Supabase.
- Policy created to allow read/write access.
- n8n with Supabase API credentials configured.
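Under the hood, the signed-URL step talks to Supabase's Storage REST API. The sketch below shows one plausible shape of that call; the endpoint path, request payload, and response shape are assumptions to verify against the current Supabase Storage documentation:

```javascript
// Build the Supabase Storage "sign object" endpoint URL (assumed path;
// check the Supabase Storage docs for your version).
function signEndpoint(projectUrl, bucket, objectPath) {
  return `${projectUrl.replace(/\/$/, "")}/storage/v1/object/sign/${bucket}/${objectPath}`;
}

// Hypothetical request for a temporary signed URL; anonKey, bucket name,
// and expiry are placeholders.
async function createSignedUrl(projectUrl, anonKey, bucket, objectPath, expiresIn) {
  const res = await fetch(signEndpoint(projectUrl, bucket, objectPath), {
    method: "POST",
    headers: {
      Authorization: `Bearer ${anonKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ expiresIn }), // expiry in seconds
  });
  return res.json(); // assumed shape: { signedURL: "..." }
}

console.log(signEndpoint("https://xyz.supabase.co", "test-n8n", "report.pdf"));
```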

By Alok Kumar
686

Task escalation system with Google Sheets, Gmail, Telegram & Jira automation

Description
This workflow sends an instant email alert when a task in a Google Sheet is marked as Urgent, then sends a Telegram reminder after 2 hours if the task still hasn't been updated. A Jira ticket is then created so the task enters the formal workflow, and another Telegram message is sent with the details of the created issue. It helps teams avoid missed deadlines and ensures urgent tasks get attention, without requiring anyone to refresh or monitor the sheet manually.

Context
In shared task lists, urgent items can be overlooked if team members aren't actively checking the spreadsheet. This workflow solves that by:
- Sending an email as soon as a task becomes Urgent
- Waiting 2 hours
- Checking if the task is still open
- Sending a Telegram reminder only if no action has been taken
- Creating a Jira issue
- Sending a Telegram message with the details of the created issue
This prevents both silence and spam, creating a smart and reliable alert system.

Target Users
- Project managers using Google Sheets
- Team leads managing shared task boards
- Remote teams needing lightweight coordination
- Anyone who wants escalation notifications without complex systems

Technical Requirements
- Google Sheets credential
- Gmail credential
- Telegram bot + chat ID
- Google Sheet with a column named Priority
- Jira credential

Workflow Steps
1. Trigger: Google Sheets Trigger (on update in the "Priority" column)
2. IF node – checks if Priority = Urgent
3. Send Email – sends an alert email with task name, owner, status, and deadline
4. Mark Notified = Yes in the sheet
5. Wait 2 hours
6. IF the status is still not resolved:
   - Send a Telegram reminder
   - Create an issue in Jira based on the information provided
   - Send a Telegram message with the ticket details

Key Features
- Real-time alerts on critical tasks
- Simple logic (no code required)
- Custom email body with dynamic fields
- Works on any Google Sheet with a "Priority" column
- Telegram notification ensures the task doesn't get forgotten

Expected Output
- Personalized email alert when a task is marked as "Urgent"
- Email includes task info: title, owner, deadline, status, next step
- Telegram message after 2 hours if the task is still open
- Automatic creation of a Jira issue with the highest priority
- Telegram message announcing the new Jira ticket

How it works
- Trigger: watches for "Priority" updates
- 🔍 Check: if Priority = Urgent AND Notified is empty
- 📧 Email: sends a personalized alert
- ✏️ Sheet update: marks the task as already notified
- ⏳ Wait: 2-hour delay
- 🤖 Check again: if the status hasn't changed → send a Telegram reminder, create a Jira ticket, and send the details

Tutorial video: watch the YouTube tutorial video.

About me: I'm Yassin, a project and product manager scaling tech products with data-driven project management.
📬 Feel free to connect with me on LinkedIn.
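The two IF checks can be sketched as simple predicates. The column names (Priority, Notified, Status) match the sheet described above, while the "Resolved" status value is an assumption to adapt to your own sheet:

```javascript
// First gate: alert only when a task is Urgent and has not yet been
// notified (Notified column empty).
function needsAlert(task) {
  return task.Priority === "Urgent" && !task.Notified;
}

// Second gate, after the 2-hour wait: escalate to Telegram + Jira only
// if the task is still unresolved ("Resolved" is an assumed status value).
function needsEscalation(task) {
  return task.Priority === "Urgent" && task.Status !== "Resolved";
}
```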

By Yassin Zehar
183

Automatic Gmail unsubscribe detection with AI and Google Sheets contact management

Automatically detect unsubscribe replies in your outreach emails and clean your Google Sheets contact list, keeping your domain reputation and deliverability strong.

---

🎯 Who it's for
This template is designed for freelancers, lead-generation specialists, and outreach managers, particularly those running email outreach campaigns for clients or personal lead-gen projects. If you send cold emails, manage multiple lead lists, or handle outreach at scale, this workflow helps you automatically manage unsubscribe requests to maintain healthy email deliverability and protect your domain reputation.

---

⚙️ How it works
1. Trigger: the workflow starts when a new reply is received in your Gmail inbox.
2. Intent Detection: the email text is analyzed with OpenAI to detect unsubscribe intent ("unsubscribe", "remove me", "opt out", etc.).
3. Normalization & Filtering: a Code node verifies the AI output for accuracy and standardizes the result as either "unsubscribe" or "keep".
4. Check & Update Sheets: if the contact requested removal, the workflow checks your Unsubscribe Sheet to see if they're already listed. If not, the contact is added to the Unsubscribe Sheet and simultaneously removed from your Main Outreach Sheet.
5. Optional Gmail Label: adds an "Unsubscribe" tag in Gmail for quick visual tracking (optional customization).

---

🧩 Requirements
To run this workflow, you'll need:
- Gmail credentials → for reading incoming replies and applying labels.
- Google Sheets credentials → to manage both the "Main" and "Unsubscribe" spreadsheets.
- OpenAI API key → used for detecting unsubscribe intent from message text.
All credentials can be added through the n8n Credentials Manager.

---

🧠 How to Customize
- Polling time: adjust how often the Gmail Trigger checks for new replies (default: every 5 minutes).
- Sheets: replace the linked Google Sheets IDs with your own. You can change sheet names and columns freely.
- Intent rules: modify the Code node to include additional unsubscribe phrases or alternate keywords.
- Optional Gmail tagging: enable or disable tagging for unsubscribed messages.
- Secondary validation: enable the second "If" check after the OpenAI node to double-confirm unsubscribe intent before moving contacts.

---

💡 Why this workflow matters
By automatically managing unsubscribe requests, you:
- Respect recipients' opt-out preferences
- Reduce spam complaints
- Protect your domain reputation and increase deliverability
- Save hours of manual list cleaning
This is a must-have automation for anyone running cold email outreach, especially freelancers managing multiple client inboxes.

---

🪄 Quick Setup Tips
- Replace all "Gmail account" and "Google Service Account account" credential references with your actual credentials.
- Ensure your sheet has an EMAIL column for lookup.
- Test with a few mock replies before activating for production.
- Optional: add a time-based trigger to run the sheet cleanup periodically.
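The normalization step can be sketched as a small function that coerces the model's free-form answer to exactly "unsubscribe" or "keep", with a keyword fallback over the raw email text. The phrase list and function name are illustrative, not taken from the template:

```javascript
// Illustrative unsubscribe phrases; extend to suit your audience.
const PHRASES = ["unsubscribe", "remove me", "opt out", "opt-out", "take me off"];

// Coerce the AI output to "unsubscribe" or "keep"; if the model returned
// anything else, fall back to keyword matching on the raw email text.
function normalizeIntent(aiOutput, emailText) {
  const ai = (aiOutput || "").trim().toLowerCase();
  if (ai === "unsubscribe" || ai === "keep") return ai;
  const text = (emailText || "").toLowerCase();
  return PHRASES.some((p) => text.includes(p)) ? "unsubscribe" : "keep";
}

console.log(normalizeIntent("not sure", "Please remove me from this list"));
// → "unsubscribe" (keyword fallback fires)
```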

By Itunu
98