Auto workflow backup to Google Drive – automated export of all your workflows
n8n Workflow Backup to Google Drive – Automated Export of All Your Workflows

This workflow is designed to automatically create backups of all your workflows in n8n and store them as individual .json files in Google Drive. It's a fully automated system that helps developers, agencies, or automation teams ensure their automation logic is always safe, versioned, and ready to restore or share.

What is this for?
If you’re building and managing multiple automations inside n8n, losing a workflow due to accidental deletion or misconfiguration can cost you hours of work. This template solves that by exporting all your workflows into separate files and storing them in a dated Google Drive folder. It helps with disaster recovery, version tracking, and team collaboration — without any manual exporting.

How this works:
Once triggered (manually or via a schedule), the workflow performs the following steps:
- Creates a new folder in your Google Drive, named with today’s date (e.g. “Workflow Backups Monday 16-05-2025”).
- Connects to your n8n instance using the internal API and retrieves a list of all existing workflows.
- Iterates over each workflow and converts it into a .json file using the built-in file conversion node.
- Uploads each individual .json file to the newly created folder in Google Drive.
- Optionally, finds and deletes old backup folders to keep your Google Drive clean and avoid clutter.

You get a clean, timestamped folder with all your flows — ready to restore, send, or store securely. You can trigger it manually or schedule it (e.g., to run weekly on Monday mornings).

How to set it up:
- Import the provided workflow JSON into your n8n instance.
- Set up your credentials:
  - Replace the placeholder “Google demo” with your actual Google Drive OAuth2 credentials in all Google Drive nodes.
  - Replace the placeholder “n8n demo” with your n8n API credentials so the workflow can fetch your flows.
- Go to the “Create new folder” node and replace the folder ID with your own destination folder in Google Drive where backups should be stored.
- (Optional) Enable the “Schedule Trigger” to run the backup automatically once a week or on your preferred interval.

You’re ready to go — test it with the Manual Trigger first and check your Google Drive for results.
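For reference, here is a minimal TypeScript sketch of the call the “retrieves a list of all existing workflows” step makes behind the scenes, assuming the n8n public REST API (GET /api/v1/workflows) and an API key. In the template itself this is handled by the n8n node with your n8n API credentials, and the instance URL below is only a placeholder.

```typescript
// Minimal sketch of the "retrieve all workflows" step, assuming the n8n public
// REST API. In the template this call is made by the built-in n8n node using
// your n8n API credentials; the base URL below is a hypothetical placeholder.
const N8N_BASE_URL = "https://your-n8n-instance.example.com";
const N8N_API_KEY = process.env.N8N_API_KEY ?? "";

async function listWorkflows(): Promise<{ id: string; name: string }[]> {
  const res = await fetch(`${N8N_BASE_URL}/api/v1/workflows`, {
    headers: { "X-N8N-API-KEY": N8N_API_KEY, Accept: "application/json" },
  });
  if (!res.ok) throw new Error(`n8n API returned ${res.status}`);
  const body = await res.json();
  // Each workflow is later serialized to its own .json file and uploaded to Google Drive.
  return body.data.map((wf: { id: string; name: string }) => ({ id: wf.id, name: wf.name }));
}
```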
Periodically send data from HTTP Request node to Telegram
No description available.
AI client onboarding agent: Auto welcome email generator
AI Client Onboarding Agent: Auto Welcome Email Generator

This workflow automates welcoming new clients. When someone submits a form, their details are pulled from Google Sheets, a personalized onboarding checklist is generated using Google Gemini, and an email is sent directly to the client. It also includes error handling to ensure nothing is missed.

---

🟢 Section 1 – Trigger & Client Data Capture
Nodes:
- ⏰ Trigger on New Client Form Submission → Fires when a new row is added in Google Sheets (from the client’s form).
- 🧍 Extract and Structure Client Data → Collects and formats client details: name, email, company, services, extra info.

✅ Beginner view: This is the doorway. When a client fills the form, their info is automatically pulled into the workflow.

---

📑 Section 2 – Checklist & Personalization
Nodes:
- 📋 Client Checklist → Creates a default onboarding checklist (account setup, welcome call, docs, etc.).
- 🧠 Personalize Using Gemini → Sends client details + checklist to Google Gemini AI → generates a tailored onboarding email body.

✅ Beginner view: This is where the magic happens. Instead of a boring generic email, each client gets a customized message that feels personal.

---

📤 Section 3 – Delivery & Completion
Nodes:
- 📧 Send Email to Client → Sends the personalized onboarding email directly to the client’s inbox.
- ✅ Execution Completed → Marks the workflow as successfully finished.

✅ Beginner view: Think of this as the final handshake with the client — they get a warm, professional onboarding email without you lifting a finger.

---

🚨 Section 4 – Error Handling
Nodes:
- ⚠️ Error Handler → Captures any failure in the workflow.
- ❌ Execution Failure → Defines fallback/alert action if something breaks.

✅ Beginner view: This is your safety net. If an email fails or Gemini is unavailable, the workflow won’t just stop — you can set it up to alert you.

---

📊 Summary Table

| Section | Key Nodes | Purpose | Beginner Benefit |
| --- | --- | --- | --- |
| 🟢 Trigger & Data | Google Sheets Trigger, Data Extract | Capture client info | Auto-collects form submissions |
| 📑 Checklist & AI | Checklist, Gemini | Generate personalized onboarding email | Each client feels special |
| 📤 Delivery | Gmail, Execution Completed | Send email & close flow | Client gets email instantly |
| 🚨 Error Handling | Error Trigger, Failure Node | Catch issues | Ensures nothing is missed |

---

🌟 Why This Workflow Rocks
- Saves hours → no manual onboarding emails
- Personalized at scale → Gemini tailors messages per client
- Error-proof → built-in error handling keeps you safe
- Scalable → works for 10 or 10,000 clients

---

👉 Example Flow in Action:
1. A client named Sarah fills the onboarding form.
2. The workflow captures her details → “Sarah, MarketingPro Agency, Needs Analytics Setup.”
3. Gemini creates a custom email:
   > Hi Sarah, welcome aboard! Here’s your onboarding plan tailored for Analytics Setup…
4. Gmail sends it instantly.
5. You get notified only if something fails.

---
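As a rough illustration of the “Extract and Structure Client Data” step, the sketch below shows the kind of normalization a Code node could perform before the data reaches Gemini. The column names (Name, Email, Company, Services, Notes) are assumptions; match them to the headers of your own Google Sheet.

```typescript
// Illustrative sketch of the "Extract and Structure Client Data" step.
// The column names (Name, Email, Company, Services, Notes) are assumptions;
// adjust them to match the headers in your Google Sheet.
interface ClientRecord {
  name: string;
  email: string;
  company: string;
  services: string[];
  extraInfo: string;
}

function structureClient(row: Record<string, string>): ClientRecord {
  return {
    name: (row["Name"] ?? "").trim(),
    email: (row["Email"] ?? "").trim().toLowerCase(),
    company: (row["Company"] ?? "").trim(),
    // Services arrive as a comma-separated string from the form.
    services: (row["Services"] ?? "").split(",").map(s => s.trim()).filter(Boolean),
    extraInfo: (row["Notes"] ?? "").trim(),
  };
}

// Example:
// structureClient({ Name: "Sarah", Email: "sarah@example.com",
//   Company: "MarketingPro Agency", Services: "Analytics Setup" })
```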
Generate AI News LinkedIn Posts with GPT-4o-mini, NewsAPI, and Qdrant
Overview
Automated LinkedIn content generator that:
- Fetches trending AI news using NewsAPI
- Enhances content with Qdrant vector store context
- Generates professional LinkedIn posts using GPT-4o-mini
- Tracks email interactions in Google Sheets

🛠️ Prerequisites
- API keys: NewsAPI, OpenAI (GPT-4o-mini), Qdrant
- Accounts: Gmail OAuth, Google Sheets, LinkedIn developer API
- Environment variables: OPENAI_API_KEY, NEWSAPI_KEY, QDRANT_URL / QDRANT_API_KEY

📁 Google Sheets Setup
Create a spreadsheet with these columns:
- ISO date
- Email address
- Unique ID
- "Approve" or "Reject"

⚙️ Setup Instructions
1. Pre-populate Qdrant:
   - Create a collection named "posts" with LinkedIn post examples
   - Add 10+ example posts for style reference
2. Node configuration:
   - Update Gmail credentials (OAuth2)
   - Set fromEmail/toEmail in the email nodes
   - Configure the Google Sheets document IDs
3. Test the workflow:
   - Run the Schedule Trigger manually first
   - Verify that email notifications work
   - Check Qdrant vector store connectivity

🎨 Customization Options
- Tone adjustment: Modify the system message in "AI Agent"
- Post style: Update the prompt in "Generate LinkedIn Post"
- Filter criteria: Edit the NewsAPI URL parameters
- Scheduling: Change the interval in the Schedule Trigger
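For orientation, here is a hedged sketch of the NewsAPI request behind “Fetches trending AI news”. The query, language, and page size are illustrative defaults rather than the template’s exact values; adjust the URL parameters in the workflow’s HTTP Request node to change the filter criteria.

```typescript
// Sketch of the NewsAPI call behind "fetches trending AI news".
// The query, language, and page size are illustrative defaults, not the
// template's exact settings; edit the URL parameters in the workflow to change them.
const NEWS_API_KEY = process.env.NEWSAPI_KEY ?? "";

async function fetchAiHeadlines(): Promise<{ title: string; url: string }[]> {
  const params = new URLSearchParams({
    q: '"artificial intelligence" OR "machine learning"',
    language: "en",
    sortBy: "publishedAt",
    pageSize: "10",
  });
  const res = await fetch(`https://newsapi.org/v2/everything?${params}`, {
    headers: { "X-Api-Key": NEWS_API_KEY },
  });
  if (!res.ok) throw new Error(`NewsAPI returned ${res.status}`);
  const body = await res.json();
  return body.articles.map((a: { title: string; url: string }) => ({ title: a.title, url: a.url }));
}
```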
Run bulk RAG queries from CSV with Lookio
This template processes a CSV of questions and returns an enriched CSV with RAG-based answers produced by your Lookio assistant. Upload a CSV that contains a column named Query, and the workflow will loop through every row, call the Lookio API, and append a Response column containing the assistant's answer. It's ideal for batch tasks like drafting RFP responses, pre-filling support replies, generating knowledge-checked summaries, or validating large lists of product/customer questions against your internal documentation.

Who is this for?
- Knowledge managers & technical writers: Produce draft answers to large question sets using your company docs.
- Sales & proposal teams: Auto-generate RFP answer drafts informed by internal docs.
- Support & operations teams: Bulk-enrich FAQs or support ticket templates with authoritative responses.
- Automation builders: Integrate Lookio-powered retrieval into bulk data pipelines.

What it does / What problem does this solve?
- Automates bulk queries: Eliminates the manual process of running many individual lookups.
- Ensures answers are grounded: Responses come from your uploaded documents via Lookio, reducing hallucinations.
- Produces ready-to-use output: Delivers an enriched CSV with a new Response column for downstream use.
- Simple UX: Users only need to upload a CSV with a Query column and download the resulting file.

How it works
1. Form submission: The user uploads a CSV via the Form Trigger.
2. Extract & validate: Extract all rows reads the CSV and Aggregate rows checks for a Query column.
3. Per-row loop: Split Out and Loop Over Queries iterate over the rows; Isolate the Query column normalizes the data.
4. Call Lookio: Lookio API call posts each query to your assistant and returns the answer.
5. Build output: Prepare output appends the Response values and Generate enriched CSV creates the downloadable file delivered by Form ending and file download.

Why use Lookio for high-quality RAG?
While building a native RAG pipeline in n8n offers granular control, achieving consistently high-quality and reliable results requires significant effort in data processing, chunking strategy, and retrieval logic optimization. Lookio is designed to address these challenges by providing a managed RAG service accessible via a simple API. It handles the entire backend pipeline—from processing various document formats to employing advanced retrieval techniques—allowing you to integrate a production-ready knowledge source into your workflows. This approach lets you focus on building your automation in n8n, rather than managing the complexities of a RAG infrastructure.

How to set up
1. Create a Lookio assistant: Sign up at https://www.lookio.app/, upload documents, and create an assistant.
2. Get credentials: Copy your Lookio API Key and Assistant ID.
3. Configure the workflow nodes: In the Lookio API call HTTP Request node, replace the apikey header value with your Lookio API Key and update assistantid with your Assistant ID (replace placeholders like <your-lookio-api-key> and <your-assistant-id>). Ensure the Form Trigger is enabled and accepts a .csv file.
4. CSV format: Ensure the input CSV has a column named Query (case-sensitive as configured).
5. Activate the workflow: Run a test upload and download the enriched CSV.

Requirements
- An n8n instance with the ability to host Forms and run workflows
- A Lookio account (API Key) and an Assistant ID

How to take it further
- Add rate limiting / retries: Insert error handling and delay nodes to respect API limits for large batches.
- Improve the speed: Drastically reduce processing time by parallelizing the queries instead of running them one after another in the loop; for example, use HTTP Request nodes that trigger a sub-workflow for each query.
- Store results: Add an Airtable or Google Sheets node to archive questions and responses for audit and reuse.
- Post-process answers: Add an LLM node to summarize or standardize responses, or to add confidence flags.
- Trigger variations: Replace the Form Trigger with a Google Drive or Airtable trigger to process CSVs automatically from a folder or table.
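To make the per-row loop concrete, here is an illustrative sketch of the validate-and-enrich logic described above. The askLookio function is only a stand-in for the workflow’s “Lookio API call” HTTP Request node, whose actual endpoint and payload are configured in that node with your API key and Assistant ID.

```typescript
// Illustrative sketch of the validate-and-loop logic described above.
// askLookio() stands in for the "Lookio API call" HTTP Request node; its real
// endpoint and payload are configured in the node with your API key and Assistant ID.
type Row = Record<string, string>;

async function enrichRows(
  rows: Row[],
  askLookio: (query: string) => Promise<string>,
): Promise<Row[]> {
  if (rows.length === 0 || !("Query" in rows[0])) {
    // Mirrors the "Aggregate rows" check: the CSV must have a Query column (case-sensitive).
    throw new Error('Input CSV is missing the required "Query" column');
  }
  const enriched: Row[] = [];
  for (const row of rows) {
    // Sequential loop, like "Loop Over Queries"; parallelize here to speed up large batches.
    const answer = await askLookio(row["Query"]);
    enriched.push({ ...row, Response: answer });
  }
  return enriched;
}
```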
Auto-generate sales presentations from Claap calls with GPT-4o and Google Slides
Template presentation
This template generates a sales follow-up presentation in Google Slides after a sales call recorded in Claap. The workflow is simplified to showcase the main use case. You can customize and enrich it by connecting to your CRM, researching data online, or adding more files to the presentation. The presentation template used in this workflow is available here.

Workflow configuration
1. Create a webhook in Claap by following this article.
2. Edit the labels that trigger the workflow and route to the relevant presentation.
3. Fill in your OpenAI credentials by creating an API key in the OpenAI Platform.
4. Edit the presentation personalization details (user set as editor, content, title).
5. Fill in your Slack credentials by following the steps in this video.
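As a hedged sketch of what the Google Slides personalization step amounts to, the snippet below issues a batchUpdate with replaceAllText requests against a copy of the template. The placeholder tokens (e.g. {{company}}, {{next_steps}}) are assumptions; align them with the placeholders used in your own presentation template.

```typescript
// Sketch of the Google Slides personalization step, assuming the presentation
// template contains text placeholders such as {{company}} and {{next_steps}}.
// accessToken must be an OAuth2 token with a Slides scope; presentationId is
// the copy of the template created for this call.
async function fillTemplate(
  presentationId: string,
  accessToken: string,
  replacements: Record<string, string>,
): Promise<void> {
  const requests = Object.entries(replacements).map(([token, value]) => ({
    replaceAllText: {
      containsText: { text: `{{${token}}}`, matchCase: true },
      replaceText: value,
    },
  }));
  const res = await fetch(
    `https://slides.googleapis.com/v1/presentations/${presentationId}:batchUpdate`,
    {
      method: "POST",
      headers: { Authorization: `Bearer ${accessToken}`, "Content-Type": "application/json" },
      body: JSON.stringify({ requests }),
    },
  );
  if (!res.ok) throw new Error(`Slides API returned ${res.status}`);
}
```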
Generate weekly grocery lists in Notion with automated email notifications
Who’s it for
Busy homemakers, creators, and anyone who wants a simple, no-cost way to plan weekly meals and get a ready-to-shop grocery list—without extra apps.

What it does
Runs on a weekly Cron schedule, generates a grocery list, creates rows in your Notion database (Ingredient, Quantity, Status), and emails the list (optional Telegram confirmation). It includes a Notion connection check, detailed error notifications (email and optional Telegram), a success email with timestamp, and optional persistent logging to a Notion “Logs” database.

Requirements
- Notion account + integration connected to your database
- SMTP credentials for the Email node
- (Optional) Telegram bot and chat ID for alerts
- (Optional) Notion log database if you want persistent logs

How to set up
1. Import the workflow and open Set: Configuration to fill in: fromEmail, emailTo, notifyEmail, notionDb, telegramChatId, and (optional) logToNotion (true/false) and notionLogDb.
2. In Notion, create a database with properties: Ingredient (Title), Quantity (Rich text), Status (Select: “To Buy”).
3. Attach your Notion and SMTP credentials (and Telegram if used).
4. Run once to test, then set Cron to your preferred weekly time.

How to customize
- Edit the recipe/grocery items in the Code node.
- Change the Cron schedule (test with “Every minute,” then revert to weekly).
- Enable Telegram alerts, and/or turn on Notion logging for audit trails.
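For context, this is roughly the Notion API call the “creates rows in your Notion database” step performs, using the property names from the setup above (Ingredient, Quantity, Status). In the workflow the Notion node handles this for you; the sketch only shows the shape of the request.

```typescript
// Sketch of the Notion row creation described above, using the property names
// from the setup steps: Ingredient (Title), Quantity (Rich text), Status (Select).
// In the workflow this is handled by the Notion node with your integration credentials.
async function addGroceryItem(
  databaseId: string,
  notionToken: string,
  ingredient: string,
  quantity: string,
): Promise<void> {
  const res = await fetch("https://api.notion.com/v1/pages", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${notionToken}`,
      "Notion-Version": "2022-06-28",
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      parent: { database_id: databaseId },
      properties: {
        Ingredient: { title: [{ text: { content: ingredient } }] },
        Quantity: { rich_text: [{ text: { content: quantity } }] },
        Status: { select: { name: "To Buy" } },
      },
    }),
  });
  if (!res.ok) throw new Error(`Notion API returned ${res.status}`);
}
```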
Connect AI agents to eBay deal API with MCP server
Need help? Want access to this workflow + many more paid workflows + live Q&A sessions with a top verified n8n creator? Join the community

Complete MCP server exposing 4 Deal API operations to AI agents.

⚡ Quick Setup
1. Import this workflow into your n8n instance
2. Credentials: Add Deal API credentials
3. Activate the workflow to start your MCP server
4. Copy the webhook URL from the MCP trigger node
5. Connect AI agents using the MCP URL

🔧 How it Works
This workflow converts the Deal API into an MCP-compatible interface for AI agents.
• MCP Trigger: Serves as your server endpoint for AI agent requests
• HTTP Request Nodes: Handle API calls to https://api.ebay.com{basePath}
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Returns responses directly to the AI agent

📋 Available Operations (4 total)
🔧 Deal_Item (1 endpoint)
• GET /deal_item: List Deal Items
🔧 Event (2 endpoints)
• GET /event: List Event Items
• GET /event/{event_id}: This method retrieves the details for an eBay event
🔧 Event_Item (1 endpoint)
• GET /event_item: This method returns a paginated set of event items

🤖 AI Integration
Parameter Handling: AI agents automatically provide values for:
• Path parameters and identifiers
• Query parameters and filters
• Request body data
• Headers and authentication
Response Format: Native Deal API responses with full data structure
Error Handling: Built-in n8n HTTP request error management

💡 Usage Examples
Connect this MCP server to any AI agent or workflow:
• Claude Desktop: Add MCP server URL to configuration
• Cursor: Add MCP server SSE URL to configuration
• Custom AI Apps: Use MCP URL as tool endpoint
• API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits
• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n HTTP request handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
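To show what one of the generated HTTP Request nodes resolves to once the AI agent has supplied its parameters, here is a hedged sketch of a GET /deal_item call. The /buy/deal/v1 base path, the marketplace header, and the OAuth bearer token are assumptions based on eBay’s Buy APIs; in the workflow the query parameters are filled by $fromAI() expressions rather than hard-coded values.

```typescript
// Hedged sketch of a resolved "List Deal Items" request. The /buy/deal/v1 base
// path and the marketplace header are assumptions based on eBay's Buy APIs;
// in the workflow, query parameters come from $fromAI() expressions.
async function listDealItems(accessToken: string, limit = 20): Promise<unknown> {
  const url = new URL("https://api.ebay.com/buy/deal/v1/deal_item");
  url.searchParams.set("limit", String(limit)); // filled by $fromAI('limit', ...) in the node
  const res = await fetch(url, {
    headers: {
      Authorization: `Bearer ${accessToken}`,           // eBay OAuth application token (assumption)
      "X-EBAY-C-MARKETPLACE-ID": "EBAY_US",             // marketplace header (assumption)
      Accept: "application/json",
    },
  });
  if (!res.ok) throw new Error(`Deal API returned ${res.status}`);
  return res.json();
}
```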
Run AI-powered market research with Groq, OpenAI, Documentero and Gmail
Description
This n8n template demonstrates how to build an AI-powered Market Research Assistant using a multi-agent workflow. It helps you get a 360-degree view of a product idea or research topic by analysing:
- Customer insights and pain points
- Market size and macro/micro-economic trends
- Competitive landscape and alternatives
The workflow mirrors how product managers and strategy teams conduct discovery — by breaking down research into parallel workstreams and then synthesizing insights into a single narrative.

How it works
1. Planner Agent: The main agent receives your research topic as input and defines:
   - Research objective
   - Key areas of focus (Customer, Market, Competition)
   - Assumptions and constraints
2. Parallel Research Agents: Based on the planner’s output, three specialist agents run in parallel:
   - Customer Insights Agent: Researches public sources such as articles and forums to infer customer behaviour, pain points, and existing tools.
   - Market Scan Agent: Analyses macro-economic and micro-economic trends, estimates TAM/SAM/SOM, and highlights key risks and assumptions.
   - Competitor Insights Agent: Identifies existing competitors and substitutes and summarises how they are positioned in the market.
3. Synthesis Agent: The outputs from all research agents are consolidated and analysed by a synthesis agent, which produces a market discovery memo.
4. Final Output: The discovery memo is generated as a document and sent to your email.

How to use
1. Trigger the workflow via the chat message node.
2. Provide your research topic or product idea, along with optional context such as target market.
3. The workflow runs automatically and delivers a structured discovery memo to your inbox.

Setup Steps
- API credentials for:
  - Groq or OpenAI (LLM)
  - Documentero (document generation)
- A configured Documentero template
- Gmail OAuth or email credentials for delivery of the memo
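As an illustration of the hand-off between the Planner Agent and the downstream agents, the sketch below models the plan and the research findings as simple structures and builds a synthesis prompt from them. The field names are assumptions; in the workflow this contract is whatever structured output you define in the planner’s and researchers’ prompts.

```typescript
// Illustrative sketch of the hand-off between the Planner Agent and the three
// parallel research agents. Field names are assumptions; in the workflow this
// contract is defined by the structured output of your agent prompts.
interface ResearchPlan {
  objective: string;                                   // e.g. "Assess demand for <product idea>"
  focusAreas: ("customer" | "market" | "competition")[];
  assumptions: string[];
  constraints: string[];
}

interface AgentFindings {
  area: "customer" | "market" | "competition";
  summary: string;
  keyPoints: string[];
  risks: string[];
}

// The Synthesis Agent consumes the plan plus all findings and drafts the discovery memo.
function buildSynthesisPrompt(plan: ResearchPlan, findings: AgentFindings[]): string {
  const sections = findings
    .map(f => `## ${f.area}\n${f.summary}\n- ${f.keyPoints.join("\n- ")}`)
    .join("\n\n");
  return `Objective: ${plan.objective}\n\nWrite a market discovery memo from these findings:\n\n${sections}`;
}
```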