8 templates found

Automate WhatsApp booking system with GPT-4 Assistant, Cal.com and SMS reminders

AI-powered WhatsApp booking system with instant SMS confirmations

Who is this for?
This workflow is designed for solo entrepreneurs, consultants, coaches, clinics, or any business that handles client appointments and wants to automate the entire scheduling experience via WhatsApp, without the need for live agents.

What problem is this workflow solving?
Responding to inbound messages, collecting booking details, suggesting available times, and sending reminders can be a huge time drain. This workflow eliminates manual handling by:
- Automating WhatsApp conversations with an AI assistant
- Booking appointments directly into Cal.com
- Sending timely SMS reminders before appointments
It ensures you never miss a lead or a follow-up, even while you sleep.

What this workflow does
From a single WhatsApp message, the workflow:
- Triggers via a WhatsApp webhook
- Uses GPT-4 to handle the conversation flow and qualify the prospect
- Collects the name, email, and selected service
- Calls the Cal.com API to fetch available time slots (see the sketch below)
- Books the appointment and stores it in Google Sheets
- Sends a confirmation message via WhatsApp
- Periodically scans for upcoming appointments
- Sends SMS reminders to clients 2 hours before their session

Setup
- Connect your Webhook node to a WhatsApp API (e.g., 360dialog, Twilio, or Ultramsg)
- Add your OpenAI API key for the GPT-4 nodes
- Configure your Cal.com API key and set your calendar ID
- Link your Google Sheets with fields like: name, email, date, time, status, reminder_sent
- Connect your SMS service (e.g., sms77) with API credentials
- Adjust the schedule in the reminder node as needed

How to customize this workflow to your needs
- Change the language or tone of the AI assistant by editing the system prompt in the GPT node
- Filter available time slots by service, team member, or duration
- Modify the reminder timing (e.g., 1 hour before, 24 hours before, etc.)
- Add conditional logic to route users to different booking flows based on their responses
- Integrate additional CRMs or notification channels like email or Slack

📄 Documentation: Notion Guide

---

Need help customizing? Contact me for consulting and support: LinkedIn / YouTube
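
A minimal Node.js sketch of the availability lookup the workflow's HTTP Request node performs, assuming Cal.com's v1 `/slots` endpoint; the event type ID, API key, and response shape are placeholders to verify against the current Cal.com docs:

```javascript
// Sketch: fetch free slots from Cal.com (v1 API assumed).
const CAL_API_KEY = process.env.CAL_API_KEY; // placeholder credential
const EVENT_TYPE_ID = 12345;                 // hypothetical event type

async function fetchSlots(dateFrom, dateTo) {
  const url = new URL('https://api.cal.com/v1/slots');
  url.searchParams.set('apiKey', CAL_API_KEY);
  url.searchParams.set('eventTypeId', String(EVENT_TYPE_ID));
  url.searchParams.set('startTime', dateFrom);
  url.searchParams.set('endTime', dateTo);

  const res = await fetch(url);
  if (!res.ok) throw new Error(`Cal.com returned ${res.status}`);
  return res.json(); // e.g. { slots: { 'YYYY-MM-DD': [{ time: ISO }, ...] } } - shape may vary
}

fetchSlots('2025-08-04T00:00:00Z', '2025-08-05T00:00:00Z')
  .then(s => console.log(s))
  .catch(console.error);
```

The GPT-4 assistant then presents the returned times in the WhatsApp conversation before booking the chosen slot.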

By Dr. Firas
14239

TrustPilot SaaS product review tracker with Bright Data & OpenAI

Who this is for
The TrustPilot SaaS Product Review Tracker is designed for product managers, SaaS growth teams, customer experience analysts, and marketing teams who need to extract, summarize, and analyze customer feedback at scale from TrustPilot.

This workflow is tailored for:
- Product Managers - monitoring feedback to drive feature improvements
- Customer Support & CX Teams - identifying sentiment trends or recurring issues
- Marketing & Growth Teams - leveraging testimonials and market perception
- Data Analysts - tracking competitor reviews and benchmarking
- Founders & Executives - wanting aggregated insights into customer satisfaction

What problem is this workflow solving?
Manually monitoring, extracting, and summarizing TrustPilot reviews is time-consuming, fragmented, and hard to scale across multiple SaaS products. This workflow automates that process, from unlocking the data behind anti-bot layers to summarizing and storing customer insights, enabling teams to respond faster, spot trends, and make data-backed product decisions.

This workflow solves:
- The challenge of scraping protected review data (using Bright Data Web Unlocker)
- The need for structured insights from unstructured review content
- The lack of automated delivery to storage and alerting systems like Google Sheets or webhooks

What this workflow does
- Extract TrustPilot Reviews: uses Bright Data Web Unlocker to bypass anti-bot protections and pull markdown-based content from product review pages (see the request sketch below)
- Convert Markdown to Text: leverages a basic LLM chain to clean and convert scraped markdown into plain text
- Structured Information Extraction: uses OpenAI GPT-4o via the Information Extractor node to extract fields like product name, review date, rating, and reviewer sentiment
- Summarization Chain: generates concise summaries of overall review sentiment and themes using OpenAI
- Merge & Aggregate Output: consolidates individual extracted records into a structured batch output
- Outbound Data Delivery:
  - Google Sheets - appends summary and structured review data
  - Write to Disk - persists raw and processed content locally
  - Webhook Notification - sends a real-time alert with summarized insights

Pre-conditions
- You need a Bright Data account and the setup described in the "Setup" section below.
- You need an OpenAI account.

Setup
- Sign up at Bright Data.
- Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
- In n8n, configure a Header Auth credential (Generic Auth Type: Header Authentication). Set the Value field to Bearer XXXXXXXXXXXXXX, replacing XXXXXXXXXXXXXX with your Web Unlocker token.
- In n8n, configure the Google Sheets credentials with your own account. Follow this documentation: Set Google Sheet Credential
- In n8n, configure the OpenAI account credentials.
- Ensure the URL and Bright Data zone name are correctly set in the Set URL, Filename and Bright Data Zone node.
- Set the desired local path in the Write a file to disk node to save the responses.

How to customize this workflow to your needs
- Target multiple products: configure the Bright Data input URL dynamically for different SaaS product TrustPilot URLs; loop through a product list and run parallel jobs for each
- Customize extraction fields: update the prompt in the Information Extractor to include the review title, the response from the company, specific feature mentions, or competitor references
- Tune summarization style: change the tone (executive summary, customer pain-point focus, or marketing quote extract); enable sentiment aggregation (e.g., 30% negative, 50% neutral, 20% positive)
- Expand output destinations: push to Notion, Airtable, or CRM tools using additional webhook nodes; generate and send PDF reports (via PDFKit or HTML-to-PDF nodes); schedule summary digests via Gmail or Slack
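
A minimal sketch of the Web Unlocker call behind the extraction step, assuming Bright Data's `/request` API; the zone name is a placeholder and the exact parameters (including markdown output support) should be verified against Bright Data's current documentation:

```javascript
// Sketch: fetch a TrustPilot review page through Bright Data Web Unlocker.
const TOKEN = process.env.BRIGHTDATA_TOKEN; // your Web Unlocker token
const ZONE = 'web_unlocker1';               // hypothetical zone name

async function fetchReviewsPage(targetUrl) {
  const res = await fetch('https://api.brightdata.com/request', {
    method: 'POST',
    headers: {
      Authorization: `Bearer ${TOKEN}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      zone: ZONE,
      url: targetUrl,
      format: 'raw',           // raw page body
      data_format: 'markdown', // assumed option for markdown output
    }),
  });
  if (!res.ok) throw new Error(`Bright Data returned ${res.status}`);
  return res.text();
}

fetchReviewsPage('https://www.trustpilot.com/review/example.com')
  .then(md => console.log(md.slice(0, 500)))
  .catch(console.error);
```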

By Ranjan Dailata
1128

Smart RSS feed monitoring with AI filtering, Baserow storage, and Slack alerts

This workflow automates the process of monitoring multiple RSS feeds, intelligently identifying new articles, maintaining a record of processed content, and delivering timely notifications to a designated Slack channel. It leverages AI to ensure only truly new and relevant articles are dispatched, preventing duplicate alerts and information overload.

Main Use Cases
- Automated News Aggregation: continuously monitor industry news, competitor updates, or specific topics from various RSS feeds.
- Content Curation: filter and deliver only new, unprocessed articles to a team or personal Slack channel.
- Duplicate Prevention: maintain a persistent record of seen articles to avoid redundant notifications.
- Enhanced Information Delivery: provide a streamlined and intelligent way to stay updated without manual checking.

How it works
The workflow operates in distinct, interconnected phases to ensure efficient and intelligent article delivery:

RSS Feed Data Acquisition
- Initiation: the workflow is manually triggered to begin the process.
- RSS Link Retrieval: it connects to a Baserow database to fetch a list of configured RSS feed URLs.
- Individual Feed Processing: each RSS feed URL is then processed independently.
- Content Fetching & Parsing: an HTTP Request node downloads the raw XML content of each feed, which is then parsed into a structured JSON format for easy manipulation.

Historical Data Management
- Seen Articles Retrieval: concurrently, the workflow queries another Baserow table to retrieve a comprehensive list of article GUIDs or links that have been previously processed and notified. This forms the basis for duplicate detection.

Intelligent Article Filtering with AI
- Data Structuring for AI: a Code node prepares the newly fetched articles and the list of already-seen articles into the JSON structure required by the AI Agent (see the sketch below).
- AI-Powered Filtering: an AI Agent, powered by an OpenAI Chat Model and supported by a Simple Memory component, receives this structured data. It is prompted to compare the new articles against the historical "seen" list and return only those articles that are genuinely new and unprocessed.
- Output Validation: a Structured Output Parser ensures that the AI Agent's response adheres to a predefined JSON schema, guaranteeing data integrity for subsequent steps.
- JSON Cleaning: a final Code node takes the AI's raw JSON string output, parses it, and formats it into individual n8n items, ready for notification and storage.

Notification & Record Keeping
- Persistent Record: for each newly identified article, its link is saved to the Baserow "seen products" table, marking it as processed and preventing future duplicate notifications.
- Slack Notification: the details of the new article (title, content, link) are then formatted and sent as a rich message to a specified Slack channel, providing real-time updates.

Summary Flow:
Manual Trigger → RSS Link Retrieval (Baserow) → HTTP Request → XML Parsing
Seen Articles Retrieval (Baserow) → Data Structuring (Code) → AI-Powered Filtering (AI Agent, OpenAI, Memory, Parser) → JSON Cleaning (Code) → Save Seen Articles (Baserow) → Slack Notification

Benefits:
- Fully Automated: eliminates manual checking of RSS feeds and Slack notifications.
- Intelligent Filtering: leverages AI to accurately identify and deliver only new content, avoiding duplicates.
- Centralized Data Management: utilizes Baserow for robust storage of RSS feed configurations and processed article history.
- Real-time Alerts: delivers timely updates directly to your team or personal Slack channel.
- Scalable & Customizable: easily adaptable to monitor various RSS feeds and integrate with different Baserow tables and Slack channels.

Setup Requirements:
- Baserow API Key: required for accessing and updating your Baserow databases.
- OpenAI API Key: necessary for the AI Agent to function.
- Slack Credentials: either a Slack OAuth token (recommended for full features) or a Webhook URL for sending messages.
- Baserow Table Configuration: a table with an rssLink column to store your RSS feed URLs, and a table with a Nom column to store the links of processed articles.

---

For any questions or further assistance, feel free to connect with me on LinkedIn: https://www.linkedin.com/in/daniel-shashko/
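
A hedged sketch of what the "Data Structuring for AI" Code node might look like (n8n Code node, "Run Once for All Items" mode). The `source` discriminator and the article field names are assumptions; adjust them to your merge setup and Baserow schema:

```javascript
// Sketch: build the single JSON payload the AI Agent compares.
const items = $input.all();

// Hypothetical partition of the merged input: RSS branch vs. Baserow "seen" branch.
const fresh = items.filter(i => i.json.source === 'rss');
const seen = items.filter(i => i.json.source === 'seen');

const seenLinks = new Set(seen.map(i => i.json.Nom)); // Baserow "Nom" column holds processed links

const payload = {
  newArticles: fresh.map(i => ({
    title: i.json.title,
    link: i.json.link,
    content: i.json.contentSnippet ?? '',
  })),
  seenLinks: [...seenLinks],
};

// Hand one item to the AI Agent with both lists embedded.
return [{ json: payload }];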

By Daniel Shashko
858

Backup n8n workflows with versioning and Notion tracking

Copy n8n workflows to a slave n8n repository

Inspired by Alex Kim's workflow, this version adds the ability to keep multiple versions of the same workflow on the destination instance. Each copied workflow's name is prefixed with the date (YYYYMMDD_), enabling simple version tracking. Process details and workflow counts are recorded centrally in Notion.

How it works
- Workflows from the source n8n instance are copied to the destination using the n8n API node (see the sketch below).
- On the destination, each workflow name is prefixed with the current date (e.g., 20250803_PDF Summarizer), so you can keep multiple daily versions.
- The workflow tracks and saves the date of execution and the number of workflows processed. Both details are recorded in Notion.

Rolling retention policy example:
- Day 1: workflows are saved with the 20250803_ prefix.
- Day 2: a new set is saved with 20250804_.
- Day 3: Day 1's set is deleted, and a new set is saved as 20250805_.
To keep more days, adjust the "Subtract From Date" node.

How to use
- Create a Notion database with one page and three fields:
  - sequence: should contain "prefix".
  - Value: today's date as YYYYMMDD_.
  - Comment: number of saved workflows.
- Configure the Notion node: enter your Notion credentials and link to the created database/page.
- Update the "Subtract From Date" node: set how many days' versions you want to keep (default: 2 days).
- Set the limit to 1 in the "Limit" node for testing.
- Input credentials for both source and destination n8n instances.

Requirements
- Notion for tracking the execution date and workflow count.
- n8n API keys for both source and destination instances. Ensure you have the necessary API permissions (read, create, delete workflows).

n8n version this workflow was tested on: 1.103.2 (Ubuntu)

Need Help?
Comment on this post, contact me on LinkedIn, or ask in the Forum!
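
A minimal sketch of the copy step, assuming n8n's public REST API (`X-N8N-API-KEY` auth on `/api/v1/workflows`); instance URLs and keys are placeholders, and pagination of the workflow list is omitted for brevity:

```javascript
// Sketch: read workflows from the source instance, rename with a date prefix,
// and create them on the destination instance.
const SRC = { url: 'https://source-n8n.example.com', key: process.env.SRC_KEY };
const DST = { url: 'https://dest-n8n.example.com', key: process.env.DST_KEY };

const prefix = new Date().toISOString().slice(0, 10).replace(/-/g, '') + '_'; // e.g. 20250803_

async function api(inst, path, options = {}) {
  const res = await fetch(`${inst.url}/api/v1${path}`, {
    ...options,
    headers: { 'X-N8N-API-KEY': inst.key, 'Content-Type': 'application/json' },
  });
  if (!res.ok) throw new Error(`${path}: ${res.status}`);
  return res.json();
}

async function copyAll() {
  const { data: workflows } = await api(SRC, '/workflows'); // pagination omitted
  for (const wf of workflows) {
    const { name, nodes, connections, settings } = wf;
    await api(DST, '/workflows', {
      method: 'POST',
      body: JSON.stringify({ name: prefix + name, nodes, connections, settings }),
    });
  }
  return workflows.length; // the workflow records this count in Notion
}

copyAll().then(n => console.log(`${n} workflows copied`)).catch(console.error);
```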

By Stéphane Heckel
637

Automate blog engagement with GPT-5 generated comments for WordPress

This workflow automatically generates realistic comments for your WordPress articles using AI. It makes your blog look more active, improves engagement, and can even support SEO by adding keyword-relevant comments.

How It Works
- Fetches all published blog posts from your WordPress site via the REST API.
- Builds a tailored AI prompt using the article's title, excerpt, and content.
- Uses OpenAI to generate a short, natural-sounding comment (some positive, some neutral, some longer, some shorter).
- Assigns a random commenter name and email.
- Posts the generated comment back to WordPress (see the sketch below).

Requirements
- n8n version 1.49.0 or later (recommended).
- Active OpenAI API key.
- WordPress site with the REST API enabled.
- WordPress API credentials (username + application password).

Setup Instructions
- Import this workflow into n8n.
- Add your credentials under n8n > Credentials: OpenAI API (API key) and WordPress API (username + application password).
- Replace the sample URL https://example.com with your own WordPress site URL.
- Execute manually or schedule it to run periodically.

Categories: AI & Machine Learning, WordPress, Content Marketing, Engagement
Tags: ai, openai, wordpress, comments, automation, engagement, n8n
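
A minimal sketch of the final posting step via the WordPress REST API (`wp/v2/comments`), authenticated with an application password over Basic auth. The site URL, credentials, and post ID are placeholders; note that setting `author_name`/`author_email` for arbitrary identities requires a user with sufficient permissions:

```javascript
// Sketch: create a comment on a WordPress post via the REST API.
const SITE = 'https://example.com'; // your WordPress URL
const AUTH = Buffer.from('admin:xxxx xxxx xxxx xxxx').toString('base64'); // user:application-password

async function postComment(postId, name, email, content) {
  const res = await fetch(`${SITE}/wp-json/wp/v2/comments`, {
    method: 'POST',
    headers: {
      Authorization: `Basic ${AUTH}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      post: postId,        // target post ID
      author_name: name,   // randomized commenter name
      author_email: email, // randomized commenter email
      content,             // AI-generated comment text
    }),
  });
  if (!res.ok) throw new Error(`WordPress returned ${res.status}`);
  return res.json();
}

postComment(42, 'Jane Reader', 'jane@example.com', 'Great write-up, thanks!')
  .then(c => console.log(`Comment ${c.id} created`))
  .catch(console.error);
```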

By Ali Khosravani
421

Automated weekly security audit reports with Gmail delivery

🔒 N8N Security Audit Report - Automated Weekly Email

🎯 What does this workflow do?
This workflow automatically generates and emails a comprehensive security audit report for your N8N instance every week. It identifies potential security risks related to:
- Credentials 🔑: exposed or insecure credentials
- Nodes 🧩: sensitive nodes (Code, HTTP Request, SSH, FTP, etc.)
- Instance settings 🏢: global security configuration
- Community nodes 📦: third-party nodes that may pose risks
The report includes direct links to affected workflows, execution statuses, and actionable recommendations.

---

✨ Key Features

📊 Smart Risk Assessment
- Calculates the overall risk level: 🟩 Low / 🟧 Moderate / 🟥 High
- Tracks unique credentials (not just total occurrences)
- Provides a detailed breakdown by node type

🔗 Direct Workflow Links
- Clickable links to each workflow mentioned
- Shows the last execution status (🟢 success / 🔴 failed)
- Displays execution timestamps

🌍 Bilingual Support
- Full support for French and English
- Switch language with a single variable

📧 Beautiful HTML Email
- Clean, professional formatting
- Color-coded risk levels
- Emoji icons for easy scanning

---

🚀 Quick Setup (5 minutes)

1️⃣ Configure Credentials
- N8N API: generate an API key in your N8N settings
- Gmail OAuth2: set up OAuth2 for Gmail sending

2️⃣ Set Your Variables
Edit the "Set Config Variables" node:

```javascript
{
  "email_to": "your.email@domain.com",
  "project_name": "My-N8N-Project",
  "server_url": "https://n8n.yourdomain.com", // NO trailing slash!
  "Language": "EN" // or "FR"
}
```

3️⃣ Test & Activate
- Click "Execute Workflow" to test
- Check your email inbox
- Activate for weekly automation

---

📧 Example Report Output

Subject: 🔒 Audit Report My-Project – Risk 🟧 Moderate

Content:
📊 Summary
• Credentials involved: 8 (5 unique)
• Nodes involved: 12
  💻 code: 4
  🌐 httpRequest: 3
  🔐 ssh: 2
• Community nodes: 1
• Overall risk level: 🟧 Moderate

🔍 Credentials Risk Report
🔹 Credentials with full access
  🔑 My AWS Credentials
  🔑 Database Admin
📋 Workflow: Data Processing Pipeline 🟢 (25-10-2024 06:15 → 06:16)
  💻 Process Data
  🌐 API Call

🧩 Nodes Risk Report
[...detailed node analysis...]

---

🎨 Customization Options
- Change schedule: modify the "Schedule Trigger" node to run daily at 8 AM, monthly on the 1st, or on a custom cron expression
- Add recipients: add multiple emails in the Gmail node's toList parameter
- Adjust risk thresholds: edit the JavaScript in the "Format Audit Report" nodes to customize when risk levels change (see the sketch below)
- Use a different email service: replace the Gmail node with SMTP, Microsoft Outlook, SendGrid, or any email service N8N supports

---

💡 Use Cases
✅ Compliance Monitoring: track security posture for audits
✅ Team Awareness: keep your team informed of security status
✅ Change Detection: notice when new risky nodes are added
✅ Best Practices: get recommendations to improve security
✅ Multi-Environment: run separate instances for dev/staging/prod

---

🔧 Technical Details
- Nodes Used: 8
- Credentials Required: 2 (N8N API + Gmail OAuth2)
- External Dependencies: none
- N8N Version: compatible with the latest N8N versions
- Execution Time: ~10-20 seconds

---

📋 Requirements
- N8N instance with API access
- Gmail account (or other email service)
- N8N API key with audit permissions
- Valid SSL certificate for workflow links (recommended)

---

🐛 Troubleshooting
- Empty report? → Check that your N8N API key has audit permissions
- Workflow links don't work? → Verify server_url is correct and has no trailing slash
- No execution status shown? → Workflows must have been executed at least once
- Wrong language displayed? → Set Language to exactly "FR" or "EN" (uppercase)

---

🌟 Why This Template?
Unlike basic monitoring tools, this workflow:
✅ Provides context-aware security analysis
✅ Links directly to affected workflows
✅ Shows real execution data (not just theoretical risks)
✅ Calculates unique credential exposure (not just counts)
✅ Supports bilingual reports
✅ Delivers actionable recommendations

---

🤝 Feedback & Support
Found this helpful? Please rate the template! Have suggestions? Drop a comment below.
Pro tip: combine this with N8N's native alerting for real-time incident response!

---

Tags: security, audit, monitoring, compliance, automation, email, reporting, credentials, governance

---

📜 License
MIT - feel free to modify and share!
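
A purely hypothetical sketch of the kind of scoring logic the "Format Audit Report" nodes apply; the real node's variable names and weights differ, and these thresholds are the values you would tune when adjusting risk levels:

```javascript
// Sketch: map audit counts to the report's color-coded risk level.
function riskLevel({ uniqueCredentials, sensitiveNodes, communityNodes }) {
  const score =
    uniqueCredentials * 2 + // each exposed credential weighs more
    sensitiveNodes * 1 +    // Code / HTTP Request / SSH / FTP nodes
    communityNodes * 3;     // third-party code is the biggest unknown

  if (score >= 20) return '🟥 High';
  if (score >= 8) return '🟧 Moderate';
  return '🟩 Low';
}

console.log(riskLevel({ uniqueCredentials: 3, sensitiveNodes: 4, communityNodes: 0 })); // 🟧 Moderate
```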

By Matthieu
389

Scrape multi-page websites recursively with Google Sheets storage

Configurable Multi-Page Web Scraper

Introduction
This n8n workflow provides a robust and highly reusable solution for scraping data from paginated websites. Instead of building a complex series of nodes for every new site, you only need to update a simple JSON configuration in the initial Input Node, making your scraping tasks faster and more standardized.

Purpose
The core purpose of this template is to automate the extraction of structured data (e.g., product details, quotes, articles) from websites with multiple pages. It is designed to be fully recursive: it follows the "next page" link until no link is found, aggregates the results from all pages, and cleanly structures the final output into a single list of items. (A standalone sketch of this loop follows below.)

Setup and Configuration
- Locate the Input Node: the entire configuration for the scraper is held within the first node of the workflow.
- Update the JSON: replace the existing JSON content with your target website's details:
  - startUrl: the URL of the first page to begin scraping.
  - nextPageSelector: the CSS selector for the "Next" or "Continue" link element that leads to the next page. This is crucial for the pagination loop.
  - fields: an array of objects defining the data to extract on each page. For each field, specify the name (the output key), the selector (the CSS selector pointing to the data), and the value (the HTML attribute to pull, usually text or href).
- Run the Workflow: after updating the configuration, execute the workflow. It will automatically loop through all pages and deliver a final, structured list of the scraped data.

For a detailed breakdown of the internal logic, including how the loop is constructed using the Set, If, and HTTP Request nodes, please refer to the original blog post: Flexible Web Scraping with n8n: A Configurable, Multi-Page Template
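
A standalone Node.js sketch of the same recursive loop the workflow builds with Set/If/HTTP Request nodes, assuming the `cheerio` package is installed (`npm install cheerio`); the config object mirrors the Input Node JSON, and the row-building assumes each selector matches once per item:

```javascript
// Sketch: config-driven scrape that follows the "next page" link until none is found.
const cheerio = require('cheerio');

const config = {
  startUrl: 'https://quotes.toscrape.com/', // example target
  nextPageSelector: 'li.next a',
  fields: [
    { name: 'quote', selector: 'span.text', value: 'text' },
    { name: 'author', selector: 'small.author', value: 'text' },
  ],
};

async function scrapeAll({ startUrl, nextPageSelector, fields }) {
  const results = [];
  let url = startUrl;

  while (url) {
    const $ = cheerio.load(await (await fetch(url)).text());

    // Extract each configured field, then zip the columns into row objects.
    const columns = fields.map(f =>
      $(f.selector).map((_, el) =>
        f.value === 'text' ? $(el).text().trim() : $(el).attr(f.value)
      ).get()
    );
    columns[0].forEach((_, row) => {
      results.push(Object.fromEntries(fields.map((f, i) => [f.name, columns[i][row]])));
    });

    // Follow the pagination link; stop when no link is found.
    const next = $(nextPageSelector).attr('href');
    url = next ? new URL(next, url).href : null;
  }
  return results;
}

scrapeAll(config).then(r => console.log(`${r.length} items scraped`)).catch(console.error);
```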

By Viktor Klepikovskyi
185

Detect AWS Orphaned Resources & Send Cost Reports to Slack, Email, and Sheets

How it works
This workflow automatically scans AWS accounts for orphaned resources (unattached EBS volumes, snapshots older than 90 days, unassociated Elastic IPs) that waste money. It calculates the cost impact, validates compliance tags, and sends multi-channel alerts via Slack, email, and Google Sheets audit logs.

Key Features:
- 🔍 Multi-region scanning with parallel execution
- 💰 Monthly/annual cost calculation with risk scoring
- 📊 Professional HTML reports with charts and tables
- 🏷️ Tag compliance validation (SOC2/ISO27001/HIPAA)
- ✅ Conditional alerting (only alerts when resources are found)
- 📈 Google Sheets audit trail for trend analysis

What gets detected:
- Unattached EBS volumes ($0.10/GB/month waste)
- Snapshots older than 90 days ($0.05/GB/month)
- Unassociated Elastic IPs ($3.60/month each)
Typical savings: $50-10K/month depending on account size. (A sketch of the volume scan follows the setup steps below.)

Set up steps

Prerequisites
AWS Configuration:
- Create an IAM user n8n-resource-scanner with these permissions: ec2:DescribeVolumes, ec2:DescribeSnapshots, ec2:DescribeAddresses, ec2:DescribeInstances, lambda:InvokeFunction
- Deploy the Lambda function aws-orphaned-resource-scanner (Node.js 18+)
- Add EC2 read-only permissions to the Lambda execution role
- Generate an AWS Access Key + Secret Key
Lambda Function Code: see the sticky notes in the workflow for the complete implementation using @aws-sdk/client-ec2

Credentials Required:
- AWS IAM (Access Key + Secret)
- Slack (OAuth2 or Webhook)
- Gmail (OAuth2)
- Google Sheets (OAuth2)

Configuration
- Initialize Config node: update these settings:
  - awsRegions: your AWS regions (default: us-east-1)
  - emailRecipients: FinOps team emails
  - slackChannel: alert channel (e.g., cloud-ops)
  - requiredTags: compliance tags to validate
  - snapshotAgeDays: age threshold (default: 90)
- Set Region Variables: choose the regions to scan
- Lambda Function: deploy the function with the provided code (see the workflow sticky notes)
- Google Sheet: create a spreadsheet with headers: Scan Date | Region | Resource Type | Resource ID | Monthly Cost | Compliance | etc.
- Credentials: connect all four credential types in n8n
- Schedule: enable the "Weekly Scan Trigger" (default: Mondays 8 AM UTC)

Testing
- Click "Execute Workflow" to run a manual test
- Verify the Lambda invokes successfully
- Check that the Slack alert appears
- Confirm the email with the HTML report is received
- Validate that Google Sheets logging works

Customization Options
- Multi-region: add regions in "Initialize Config"
- Alert thresholds: modify cost/age thresholds
- Additional resource types: extend the Lambda function
- Custom tags: update the required tags list
- Schedule frequency: adjust the cron trigger

Use Cases
- FinOps Teams: automated cloud waste detection and cost reporting
- Cloud Operations: weekly compliance and governance audits
- DevOps: resource cleanup automation and alerting
- Security/Compliance: tag validation for SOC2/ISO27001/HIPAA
- Executive Reporting: monthly cost optimization metrics

Resources
- AWS IAM Best Practices
- Lambda Function Code
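
A minimal sketch of the EBS-volume part of such a Lambda scan, using @aws-sdk/client-ec2 (Node.js 18+). The pricing constant mirrors the estimate above; the complete function in the workflow's sticky notes also covers snapshots and Elastic IPs, and pagination (NextToken) is omitted here for brevity:

```javascript
// Sketch: find unattached EBS volumes in one region and estimate monthly waste.
const { EC2Client, DescribeVolumesCommand } = require('@aws-sdk/client-ec2');

const GB_PRICE_PER_MONTH = 0.10; // rough gp2/gp3 estimate from the figures above

async function findOrphanedVolumes(region) {
  const ec2 = new EC2Client({ region });
  const { Volumes = [] } = await ec2.send(new DescribeVolumesCommand({
    Filters: [{ Name: 'status', Values: ['available'] }], // "available" = unattached
  }));

  return Volumes.map(v => ({
    resourceType: 'EBS Volume',
    resourceId: v.VolumeId,
    sizeGb: v.Size,
    monthlyCost: +(v.Size * GB_PRICE_PER_MONTH).toFixed(2),
    tags: Object.fromEntries((v.Tags ?? []).map(t => [t.Key, t.Value])), // for compliance checks
  }));
}

findOrphanedVolumes('us-east-1')
  .then(r => console.log(JSON.stringify(r, null, 2)))
  .catch(console.error);
```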

By Chad M. Crowell
94