9 templates found

Automated meeting recording & AI summaries with Google Calendar, Vexa & Llama 3.2

Transform your meetings into actionable insights automatically! This workflow captures meeting audio, transcribes conversations, generates AI summaries, and emails the results to participants—all without manual intervention.

What's the Goal?
- Auto-record meetings when they start and stop when they end
- Transcribe audio to text using the Vexa Bot integration
- Generate intelligent summaries with AI-powered analysis
- Email summaries to meeting participants automatically
- Eliminate manual note-taking and post-meeting admin work
- Never miss important discussions or action items again

Why Does It Matter?
- Save 90% of post-meeting time: no more manual transcription or summary writing
- Never lose key information: automatic capture ensures nothing falls through the cracks
- Improve team productivity: focus on discussions, not note-taking
- Perfect meeting records: searchable transcripts and summaries for future reference
- Instant distribution: summaries reach all participants immediately after meetings

How It Works
Step 1: Meeting Detection & Recording
- Start Meeting Trigger: detects when the meeting begins via a Google Meet webhook
- Launch Vexa Bot: automatically joins the meeting and starts recording
- End Meeting Trigger: detects the meeting end and stops recording

Step 2: Audio Processing & Transcription
- Stop Vexa Bot: ends recording and retrieves the audio file
- Fetch Meeting Audio: downloads the recorded audio from Vexa Bot
- Transcribe Audio: converts speech to text using AI transcription

Step 3: AI Summary Generation
- Prepare Transcript: formats the transcribed text for AI processing
- Generate Summary: the AI model creates a concise meeting summary covering key discussion points, decisions made, action items assigned, and next steps identified (a prompt-building sketch follows this description)

Step 4: Distribution
- Send Email: automatically emails the summary to all meeting participants

Setup Requirements
- Google Meet integration: configure the Google Meet webhook and API credentials, set up meeting detection triggers, and test with a sample meeting
- Vexa Bot configuration: add Vexa Bot API credentials for recording, configure audio file retrieval settings, and set recording quality and format preferences
- AI model setup: configure the AI transcription service (e.g., OpenAI Whisper, Google Speech-to-Text), set up AI summary generation with custom prompts, and define summary format and length preferences
- Email configuration: set up SMTP credentials for email distribution, create email templates for meeting summaries, and configure participant list extraction from the meeting metadata

Import Instructions
1. Get the workflow JSON: copy the workflow JSON code
2. Open the n8n editor: navigate to your n8n dashboard
3. Import the workflow: click the menu (⋯) → "Import from Clipboard" → paste the JSON → Import
4. Configure credentials: add API keys for Google Meet, Vexa Bot, AI services, and SMTP
5. Test the workflow: run a test meeting to verify end-to-end functionality

Your meetings will now automatically transform into actionable summaries delivered to your inbox!
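As a rough illustration of Step 3, here is a minimal n8n Code node sketch that assembles the summarization prompt from the transcript. The transcript and participants field names are assumptions rather than the template's actual fields — adapt them to the output of your transcription node.

```javascript
// Hypothetical Code node: build the summarization prompt from the transcript.
const items = $input.all();

return items.map(item => {
  const transcript = item.json.transcript || '';
  const participants = (item.json.participants || []).join(', ');

  const prompt = [
    'Summarize the following meeting transcript.',
    'Include: key discussion points, decisions made, action items (with owners), and next steps.',
    participants ? `Participants: ${participants}` : '',
    '',
    transcript,
  ].join('\n');

  return { json: { prompt } };
});
```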

By Oneclick AI Squad
3918

Schedule LinkedIn posts with AI content generation and Telegram approval

Overview
This workflow automates LinkedIn posts using OpenAI. The prompts are stored in the workflow and can be customized to fit your needs. The workflow combines a Schedule Trigger, a Code node that determines the day of the week (no posting Friday–Sunday; see the sketch after this description), a Prompts node that holds your OpenAI prompts, and a random prompt selection so the generated content doesn't look repetitive. The selected prompt is sent to the OpenAI API, a random posting time is chosen, the final LinkedIn post is sent to your Telegram for approval, and once approved the workflow waits for the chosen time slot and then posts to your LinkedIn account using the LinkedIn node.

How it works:
1. Run or schedule the workflow in n8n. The automation can be triggered manually or on a custom schedule (excluding weekends if needed). Customize the prompts in the Prompts node to suit your needs.
2. A random LinkedIn post prompt is selected. Pre-written prompts are rotated to keep content fresh and non-repetitive.
3. OpenAI generates the LinkedIn post. The prompt is sent to OpenAI via the API, and the result is returned in clean, ready-to-use form.
4. You receive the draft via Telegram. The post is sent to Telegram for quick approval or review.
5. The post is scheduled or published via the LinkedIn connector. Once approved, the workflow delays until the target time, then sends the content to LinkedIn.

What's needed: an OpenAI API key, a LinkedIn account, and a Telegram account. For Telegram you will need to configure the bot service.

Step-by-Step: Telegram Approval for Your Workflow

A. Set up a Telegram bot
1. Open Telegram and search for "@BotFather".
2. Start a chat and type /newbot to create a bot.
3. Give your bot a name and a unique username (e.g., YourApprovalBot).
4. Copy the API token that BotFather gives you.

B. Add your bot to a private chat (with you)
1. Find your bot in Telegram and click "Start" to activate it.
2. Send a test message (like "hello") so the chat is created.

C. Get your user ID
1. Search for "userinfobot" in Telegram.
2. Type /start and it will reply with your Telegram user ID.

OpenAI powers the LinkedIn post creation
1. Log in to your OpenAI Platform account: https://platform.openai.com/.
2. Go to API keys and create a new secret key.
3. In n8n, create a new "OpenAI API" credential, paste your API key, and give it a name.
4. Apply the credential to the OpenAI Message node.

Finally, connect your LinkedIn account to the LinkedIn node and select your account from the LinkedIn dropdown.
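The weekday check and random prompt selection described above could look roughly like this in an n8n Code node. It assumes the prompts are available as a prompts array on the incoming item (e.g., from a preceding Set/Prompts node); adjust the field name to match your workflow.

```javascript
// Hypothetical Code node: skip Friday–Sunday and pick a random prompt.
const day = new Date().getDay(); // 0 = Sunday ... 6 = Saturday

// No posting Friday (5), Saturday (6), or Sunday (0)
if (day === 0 || day >= 5) {
  return []; // no items -> the rest of the workflow is skipped
}

const prompts = $input.first().json.prompts || [];
const prompt = prompts[Math.floor(Math.random() * prompts.length)];

return [{ json: { prompt, weekday: day } }];
```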

By Chad McGreanor
1153

WordPress content backup to GitHub

This template backs up WordPress content to GitHub.

By boagolden
641

Generate AI News LinkedIn Posts with GPT-4o-mini, NewsAPI, and Qdrant

Overview
Automated LinkedIn content generator that:
- Fetches trending AI news using NewsAPI
- Enhances content with Qdrant vector store context
- Generates professional LinkedIn posts using GPT-4o-mini
- Tracks email interactions in Google Sheets

🛠️ Prerequisites
- API keys: NewsAPI, OpenAI (GPT-4o-mini), Qdrant
- Accounts: Gmail OAuth, Google Sheets, LinkedIn developer API
- Environment variables: OPENAI_API_KEY, NEWSAPI_KEY, QDRANT_URL / QDRANT_API_KEY

📁 Google Sheets Setup
Create a spreadsheet with these columns:
- ISO date
- Email address
- Unique ID
- "Approve" or "Reject"

⚙️ Setup Instructions
- Pre-populate Qdrant: create a collection named "posts" with LinkedIn post examples; add 10+ example posts for style reference
- Node configuration: update the Gmail credentials (OAuth2), set fromEmail/toEmail in the email nodes, and configure the Google Sheets document IDs
- Test the workflow: run the Schedule Trigger manually first, verify that email notifications work, and check Qdrant vector store connectivity

🎨 Customization Options
- Tone adjustment: modify the system message in "AI Agent"
- Post style: update the prompt in "Generate LinkedIn Post"
- Filter criteria: edit the NewsAPI URL parameters (see the sketch after this description)
- Scheduling: change the interval in the Schedule Trigger
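To make the NewsAPI filter criteria concrete, here is a hedged sketch of a Code node that builds the request URL for a following HTTP Request node. The query terms and the use of the NEWSAPI_KEY environment variable are assumptions; match them to your own setup.

```javascript
// Hypothetical Code node: assemble the NewsAPI "everything" query URL.
const query = encodeURIComponent('"artificial intelligence" OR "machine learning"');
const from = new Date(Date.now() - 24 * 60 * 60 * 1000).toISOString().split('T')[0]; // last 24 hours

const url =
  `https://newsapi.org/v2/everything?q=${query}` +
  `&from=${from}&sortBy=popularity&language=en&pageSize=10` +
  `&apiKey=${$env.NEWSAPI_KEY}`;

return [{ json: { url } }];
```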

By Lucien
452

Automate face swapping for GIFs with Fal.run AI and Google Services

This workflow allows you to automatically swap faces in animated GIFs using AI, without writing a single line of code. By simply inserting the URL of a face image and a GIF into a Google Sheet, the automation takes care of everything: it sends the data to the AI platform, monitors the processing status, retrieves the final face-swapped GIF, uploads it to Google Drive, and updates the Google Sheet with the result.

This solution is perfect for content creators, marketers, or developers looking to integrate AI-powered GIF editing into their workflows in a fast and scalable way. Whether used manually or on a schedule, the workflow turns a tedious creative task into a fully automated pipeline. It automates GIF face-swapping by integrating Google Sheets for input/output and Fal.run for AI processing, ensuring seamless execution via scheduled or manual triggers.

Example: a face image and a GIF image produce the face-swapped result (preview images omitted).

How It Works
- Trigger: the workflow can be run manually ("When clicking 'Test workflow'") or automatically via a Schedule Trigger set to run at intervals (e.g., every 5 minutes).
- Data retrieval: the "Google Sheets" node fetches data from a predefined Google Sheet with two input columns: FACE IMAGE (URL of the face image to swap) and GIF IMAGE (URL of the target GIF).
- API request: the "Set data" node formats the retrieved URLs into variables (face_image and gif_image). The "Create Image" node sends a POST request to the Fal.run API (easel-gifswap endpoint) with these URLs to initiate the face-swapping process; the API returns a request_id (see the request sketch after this description).
- Status check: the "Wait 60 sec." node pauses execution for 60 seconds to allow processing time. The "Get status" node queries the Fal.run API with the request_id to check whether the task is COMPLETED. If completed, the "Get Url image" node retrieves the final GIF URL.
- Output handling: the "Upload Image" node saves the resulting GIF to Google Drive, and the "Update result" node writes the output GIF URL back to the Google Sheet in the RESULT column.

Set Up Steps
1. Prepare the Google Sheet: create a sheet with the columns FACE IMAGE, GIF IMAGE, and RESULT. Populate the first two columns with image URLs and leave RESULT empty for the workflow to fill.
2. Configure the API key: sign up to obtain a Fal.run API key. In the "Create Image" node, set HTTP header authentication with Name: Authorization and Value: Key YOUR_API_KEY.
3. Schedule execution: link the "Schedule Trigger" node to run periodically (e.g., every 5 minutes) or trigger the workflow manually for testing.
4. Test and deploy: run the workflow to verify the face-swapping functionality, and monitor the RESULT column of the Google Sheet for the processed GIF URL.

Need help customizing? Contact me for consulting and support, or add me on LinkedIn.
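For orientation, here is a hedged sketch of the data the "Create Image" step sends. The endpoint URL and body field names are assumptions inferred from the description above, not the template's exact values — verify them against the Fal.run easel-gifswap documentation.

```javascript
// Hypothetical Code node: assemble the request that the "Create Image" HTTP node sends.
const { face_image, gif_image } = $input.first().json;

return [{
  json: {
    url: 'https://queue.fal.run/easel-gifswap',           // assumed endpoint path
    headers: { Authorization: 'Key YOUR_API_KEY' },        // header auth from step 2
    body: { face_image_url: face_image, gif_image_url: gif_image }, // assumed field names
  },
}];
```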

By Davide
423

Automate support ticket classification & routing from HubSpot to Jira with GPT

Who is this for?
This n8n workflow template is designed for customer support, CX, and ops teams that manage customer messages through HubSpot and use Jira for internal task management. It is especially useful for SaaS companies aiming to automate ticket triage, sentiment detection, and team assignment using AI agents.

---

🧩 What problem is this workflow solving?
Customer service teams often struggle with manual message classification, delayed reactions to churn signals, and inefficiencies in routing support issues to the right internal teams. This workflow uses LLMs and automated profiling to:
- Detect churn risk or intent in customer messages
- Summarize issues
- Classify tickets into categories (e.g., fulfillment, technical, invoicing)
- Automatically create Jira tickets based on enriched insights

---

🤖 What this workflow does
This AI-powered workflow processes HubSpot support tickets and routes them to Jira based on sentiment and topic. Here's the full breakdown:
- Triggers: either manually or on a schedule (via cron).
- Fetch HubSpot tickets: retrieves new messages and their metadata.
- Run Orchestration Agent: uses the Sentinel Agent to detect emotional tone, churn risk, and purchase intent; calls the Profiler Agent to enrich customer profiles from HubSpot; summarizes the message using OpenAI; and classifies the ticket using a custom classifier (technical, fulfillment, etc.).
- Generate a Jira ticket: the title and description are generated using GPT; the assignee and project are predefined.
- AI agents can be expanded (e.g., add Guide or Facilitator agents).

---

⚙️ Setup
To use this template, you'll need:
- A HubSpot account with OAuth2 credentials in n8n
- A Jira Software Cloud account and project ID
- OpenAI credentials for the GPT-based nodes
- Optional: sub-workflows for additional AI agents

Steps:
1. Clone the workflow in your n8n instance.
2. Replace the placeholder credentials for HubSpot, OpenAI, and Jira.
3. Adjust the Jira project/issue type IDs to match your setup.
4. Test the workflow using the manual trigger or the scheduled trigger node.

---

🛠️ How to customize this workflow to your needs
- Edit category logic: in the "Category Classifier" node, modify the categories and prompt structure to match your internal team structures (e.g., billing, account management, tech support); a routing sketch follows this description.
- Refine AI prompts: customize the agent prompt definitions in Sentinel_agent, Profiler_agent, and Orchestrator to better align with your brand tone or service goals.
- Update Jira integration: route tickets to different projects or team leads by adjusting the "Create an issue in Jira" node based on the classification output.
- Add escalation paths: insert Slack, email, or webhook notifications for specific risk levels or customer segments.

---

This workflow empowers your team with real-time message triage, automated decision-making, and AI-enhanced customer insight, turning every inbound ticket into a data-driven action item.
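One way to adapt the category logic is to map the classifier output onto Jira routing fields in a Code node placed before the Jira node. The category names, project keys, and account IDs below are placeholders, not the template's actual values.

```javascript
// Hypothetical Code node: turn the classifier output into Jira routing fields.
const routing = {
  technical:   { project: 'SUP', assignee: 'tech-lead-account-id' },
  fulfillment: { project: 'OPS', assignee: 'ops-lead-account-id' },
  invoicing:   { project: 'FIN', assignee: 'billing-lead-account-id' },
};

return $input.all().map(item => {
  const category = (item.json.category || 'technical').toLowerCase();
  const target = routing[category] || routing.technical;

  return {
    json: {
      ...item.json,
      jiraProject: target.project,
      jiraAssignee: target.assignee,
    },
  };
});
```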

By PollupAI
284

Automated resume tailoring with Telegram Bot, LinkedIn & OpenRouter AI

This n8n workflow lets you effortlessly tailor your resume for any job using Telegram and LinkedIn. Simply send a LinkedIn job URL or paste a job description to the Telegram bot, and the workflow will:
- Extract the job information (using an optional proxy if needed)
- Fetch your resume in JSON Resume format (hosted on GitHub Gist or elsewhere)
- Use an OpenRouter-powered LLM agent to automatically adapt your resume to match the job requirements (see the sketch after this description)
- Generate both HTML and PDF versions of your tailored resume
- Return the PDF file and shareable download links directly in Telegram

The workflow is open source and designed with privacy in mind: you can host the backend yourself to keep your data entirely under your control. It requires a Telegram bot, a public JSON Resume, and an OpenRouter account. Proxy support is available for LinkedIn scraping. Perfect for anyone looking to quickly customize their resume for multiple roles with minimal manual effort!
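The fetch-and-adapt step could be sketched as a Code node that pulls the public JSON Resume and builds the instruction for the OpenRouter agent. The Gist URL is a placeholder and the jobDescription field name is an assumption.

```javascript
// Hypothetical Code node: fetch the JSON Resume and build the tailoring prompt.
const gistRawUrl = 'https://gist.githubusercontent.com/your-user/your-gist-id/raw/resume.json';

const response = await this.helpers.httpRequest({ url: gistRawUrl });
const resume = typeof response === 'string' ? JSON.parse(response) : response;

const jobDescription = $input.first().json.jobDescription || '';

const prompt =
  'Tailor the following JSON Resume to the job description. ' +
  'Keep the JSON Resume schema intact; only adjust summaries, highlights, and skills.\n\n' +
  `Job description:\n${jobDescription}\n\nResume:\n${JSON.stringify(resume, null, 2)}`;

return [{ json: { prompt } }];
```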

By Daniel Iliesh
209

Automate cancellation feedback collection with Stripe and Google Sheets

Who's it for
This template is perfect for any SaaS business or subscription service using Stripe. Product managers, customer success teams, and founders can use this to automatically collect cancellation feedback without manual follow-up. Ideal for companies looking to reduce churn by understanding why customers leave.

What it does
When a customer cancels their Stripe subscription, this workflow instantly:
- Detects the cancellation via a Stripe webhook
- Fetches customer details from the Stripe API
- Sends a personalized feedback survey email with embedded customer information
- Logs all cancellations to Google Sheets for tracking
- Receives survey responses via webhook
- Automatically routes feedback to different Google Sheets based on the reason (pricing concerns, feature requests, or other feedback)

This organized approach helps you identify patterns in cancellations and prioritize improvements that matter most.

How it works
1. Stripe triggers the workflow when a subscription is canceled.
2. Customer data is fetched from the Stripe API (email, name, plan details).
3. A personalized email is sent with a survey link containing customer data as URL parameters (a sketch of this URL construction follows this description).
4. The cancellation is logged to a Google Sheets "Cancellations" tab.
5. When the customer submits the survey, a webhook receives the response.
6. Feedback is routed to dedicated sheets based on the cancellation reason:
   - Price Concerns → pricing-related issues
   - Feature Requests → missing functionality
   - Other Feedback → everything else

Set up steps
Setup time: ~20 minutes

Prerequisites
- Stripe account (test mode recommended for initial setup)
- Google account with Google Sheets
- Email service (Gmail, Outlook, or SMTP)
- Survey tool with webhook support (Tally or Typeform recommended)

Configuration
1. Stripe webhook: copy the webhook URL from the "Stripe Subscription Canceled" node and add it to your Stripe Dashboard → Webhooks. Select the customer.subscription.deleted event.
2. Email credentials: configure Gmail, Outlook, or SMTP credentials in the "Send Feedback Survey Email" node. Update the fromEmail parameter.
3. Survey form: create a survey form with these fields:
   - Hidden fields (auto-populated from the URL): email, customer_id, name, plan
   - Visible fields: a reason dropdown ("Too Expensive", "Missing Features", "Other") and a comments textarea
   - Configure the webhook to POST responses to the "Survey Response Webhook" URL
4. Google Sheets: create a spreadsheet with four sheets: "Cancellations", "Price Concerns", "Feature Requests", and "Other Feedback". Connect your Google account in the Google Sheets nodes.
5. Survey URL: replace [SURVEY_URL] in the email template with your actual survey form URL.
6. Test: use Stripe test mode to trigger a test cancellation and verify the workflow executes correctly.

Requirements
- Stripe account with API access
- Google Sheets (free)
- Email service: Gmail, Outlook, or SMTP server
- Survey tool: Tally (free), Typeform (paid), or a custom form with webhook capability
- n8n instance: Cloud or self-hosted

How to customize
- Different surveys by plan: add an IF node after getting customer details to send different survey links based on subscription tier
- Slack notifications: add a Slack node after "Route by Feedback Type" to alert your team about price concerns in real time
- Delayed email: insert a Wait node before sending the email to give customers a 24-hour cooldown period
- CRM integration: add nodes to sync cancellation data with your CRM (HubSpot, Salesforce, etc.)
- Follow-up workflow: create a separate workflow that triggers when feedback is received to send personalized follow-up offers
- Custom routing logic: modify the Switch node conditions to match your specific survey options or add more categories

Tips for success
- Use Stripe test mode initially to avoid sending emails to real customers during setup
- Customize the email tone to match your brand voice
- Keep the survey short (2–3 questions max) for higher response rates
- Review feedback weekly to identify patterns and prioritize improvements
- Consider offering a discount or incentive for completing the survey
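The personalized survey link described above amounts to appending the customer fields as query parameters. A minimal sketch, assuming a Tally-style form URL and the hidden field names listed in the configuration:

```javascript
// Hypothetical Code node: build the survey URL with customer data as query parameters.
const customer = $input.first().json;

const params = new URLSearchParams({
  email: customer.email || '',
  customer_id: customer.customer_id || '',
  name: customer.name || '',
  plan: customer.plan || '',
});

const surveyUrl = `https://tally.so/r/your-form-id?${params.toString()}`; // placeholder form URL

return [{ json: { ...customer, surveyUrl } }];
```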

By Daiki Takayama
52

Track software vulnerability patents with ScrapeGraphAI, Matrix, and Intercom

Software Vulnerability Patent Tracker

⚠️ COMMUNITY TEMPLATE DISCLAIMER: This is a community-contributed template that uses ScrapeGraphAI (a community node). Please ensure you have the ScrapeGraphAI community node installed in your n8n instance before using this template.

This workflow automatically tracks newly published patent filings that mention software-security vulnerabilities, buffer-overflow mitigation techniques, and related technology keywords. Every week it aggregates fresh patent data from the USPTO and international patent databases, filters it by relevance, and delivers a concise JSON digest (and an optional Intercom notification) to R&D teams and patent attorneys.

Pre-conditions / Requirements

Prerequisites
- n8n instance (self-hosted or n8n Cloud, v1.7.0+)
- ScrapeGraphAI community node installed
- Basic understanding of patent search syntax (for customizing keyword sets)
- Optional: Intercom account for in-app alerts

Required Credentials
| Credential | Purpose |
|------------|---------|
| ScrapeGraphAI API Key | Enables ScrapeGraphAI nodes to fetch and parse patent-office webpages |
| Intercom Access Token (optional) | Sends weekly digests directly to an Intercom workspace |

Additional Setup Requirements
| Setting | Recommended Value | Notes |
|---------|-------------------|-------|
| Cron schedule | `0 9 * * 1` | Triggers every Monday at 09:00 server time |
| Patent keyword matrix | See example CSV below | List of comma-separated keywords per tech focus |

Example keyword matrix (upload as keywords.csv or paste into the "Matrix" node):

topic,keywords
Buffer Overflow,"buffer overflow, stack smashing, stack buffer"
Memory Safety,"memory safety, safe memory allocation, pointer sanitization"
Code Injection,"SQL injection, command injection, injection prevention"

How it works

Key Steps:
1. Schedule Trigger: fires weekly based on the configured cron expression.
2. Matrix (Keyword Loader): loads the CSV-based technology keyword matrix into memory.
3. Code (Build Search Queries): dynamically assembles patent-search URLs for each keyword group.
4. ScrapeGraphAI (Fetch Results): scrapes USPTO, EPO, and WIPO result pages and parses titles, abstracts, publication numbers, and dates.
5. If (Relevance Filter): removes patents older than one year or without vulnerability-related terms in the abstract (a Code-node sketch of this filter appears at the end of this description).
6. Set (Normalize JSON): formats the remaining records into a uniform JSON schema.
7. Intercom (Notify Team): sends a summarized digest to your chosen Intercom workspace. (Skip or disable this node if you prefer to consume the raw JSON output instead.)
8. Sticky Notes: contain inline documentation and customization tips for future editors.

Set up steps
Setup time: 10–15 minutes
1. Install the community node: navigate to "Settings → Community Nodes", search for ScrapeGraphAI, and click "Install".
2. Create credentials: go to "Credentials" → "New Credential" → select ScrapeGraphAI API → paste your API key. (Optional) Add an Intercom credential with a valid access token.
3. Import the workflow: click "Import" → "Workflow JSON" and paste the template JSON, or drag-and-drop the .json file.
4. Configure the schedule: open the Schedule Trigger node and adjust the cron expression if a different frequency is required.
5. Upload / edit the keyword matrix: open the Matrix node and paste your custom CSV or modify the existing topics and keywords.
6. Review the search logic: in the Code (Build Search Queries) node, review the base URLs and adjust the patent databases as needed.
7. Define the notification channel: if using Intercom, select your Intercom credential in the Intercom node and choose the target channel.
8. Execute & activate: click "Execute Workflow" for a trial run and verify the output. If satisfied, switch the workflow to "Active".

Node Descriptions

Core Workflow Nodes:
- Schedule Trigger – initiates the workflow on a weekly cron schedule.
- Matrix – holds the CSV keyword table and makes each row available as an item.
- Code (Build Search Queries) – generates search URLs and attaches metadata for later nodes.
- ScrapeGraphAI – scrapes patent listings and extracts structured fields (title, abstract, publication date, link).
- If (Relevance Filter) – applies date and keyword relevance filters.
- Set (Normalize JSON) – maps scraped fields into a clean JSON schema for downstream use.
- Intercom – sends formatted patent summaries to an Intercom inbox or channel.
- Sticky Notes – provide inline documentation and edit history markers.

Data Flow:
Schedule Trigger → Matrix → Code → ScrapeGraphAI → If → Set → Intercom

Customization Examples

Change Data Source to Google Patents:

```javascript
// In the Code node
const base = 'https://patents.google.com/?q=';
items.forEach(item => {
  item.json.searchUrl = `${base}${encodeURIComponent(item.json.keywords)}&oq=${encodeURIComponent(item.json.keywords)}`;
});
return items;
```

Send Digest via Slack Instead of Intercom:

```javascript
// Replace the Intercom node with a Slack node
{
  "text": `🚀 New Vulnerability-related Patents (${items.length})\n` +
    items.map(i => `• <${i.json.link}|${i.json.title}>`).join('\n')
}
```

Data Output Format
The workflow outputs structured JSON data:

```json
{
  "topic": "Memory Safety",
  "keywords": "memory safety, safe memory allocation, pointer sanitization",
  "title": "Memory protection for compiled binary code",
  "publicationNumber": "US20240123456A1",
  "publicationDate": "2024-03-21",
  "abstract": "Techniques for enforcing memory safety in compiled software...",
  "link": "https://patents.google.com/patent/US20240123456A1/en",
  "source": "USPTO"
}
```

Troubleshooting

Common Issues
- Empty result set – ensure that the keywords are specific but not overly narrow; test queries manually on USPTO.
- ScrapeGraphAI timeouts – increase the timeout parameter in the ScrapeGraphAI node or reduce concurrent requests.

Performance Tips
- Limit the keyword matrix to fewer than 50 rows to keep weekly runs under 2 minutes.
- Schedule the workflow during off-peak hours to reduce load on patent-office servers.

Pro Tips:
- Combine this workflow with a vector database (e.g., Pinecone) to create a semantic patent knowledge base.
- Add a "Merge" node to correlate new patents with existing vulnerability CVE entries.
- Use a second ScrapeGraphAI node to crawl citation trees and identify emerging technology clusters.
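For reference, here is a hedged Code-node version of the relevance filter; the template itself uses an If node for the same check. The field names follow the "Data Output Format" above, and the keyword list is illustrative.

```javascript
// Hypothetical Code node: drop patents older than one year or without vulnerability terms.
const terms = ['vulnerability', 'buffer overflow', 'memory safety', 'injection'];
const oneYearAgo = new Date();
oneYearAgo.setFullYear(oneYearAgo.getFullYear() - 1);

return $input.all().filter(item => {
  const { publicationDate, abstract = '' } = item.json;
  const recentEnough = new Date(publicationDate) >= oneYearAgo;
  const relevant = terms.some(t => abstract.toLowerCase().includes(t));
  return recentEnough && relevant;
});
```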

By vinci-king-01
45