Store data received from webhook in JSON
Store the data received from the CocktailDB API in JSON
Build an AI marketing team with OpenAI O3 and GPT-4.1-mini for automated content creation
Overview

This multi-agent n8n automation simulates a high-functioning marketing team. A strategic CMO agent receives your chat-based input, decides which specialist is best suited for the task, and delegates accordingly. Each specialist (copywriter, SEO expert, brand strategist, etc.) operates independently using fast, cost-effective GPT-4.1-mini models, resulting in parallel task execution and full-funnel marketing output with minimal human input.

---

How It Works

1. A chat message trigger listens for input (e.g. "Write a full email funnel for our SaaS launch").
2. The CMO Agent (powered by OpenAI O3) reads the message and determines intent, strategy, and needed outputs.
3. It dynamically delegates tasks to the correct AI agent (a simplified routing sketch appears at the end of this description): Copywriter Agent, Facebook Ads Specialist, SEO Content Writer, Email Marketer, Social Media Manager, Brand Voice Specialist.
4. Each agent uses a dedicated GPT-4.1-mini model to produce results instantly.
5. Final content is returned to the user or passed along for integration with your CMS, ad platforms, or CRM.

---

Tools Used

- n8n: orchestrates the entire agent communication and routing logic
- OpenAI O3: advanced strategic reasoning (CMO Agent)
- OpenAI GPT-4.1-mini: fast and cost-efficient for specialist agents
- LangChain nodes: for multi-agent reasoning and tool-based execution

---

Quick Start

1. Import workflow: load the provided .json into your n8n instance.
2. Set credentials: add your OpenAI API key under "OpenAI Account".
3. Deploy webhook: use the "When Chat Message Received" trigger.
4. Test it with a question like: "Generate a 7-day onboarding email sequence for a weight loss app".
5. Watch the agents collaborate!

---

Meet Your AI Marketing Team

| Agent | Purpose | Model | Output |
|-------|---------|-------|--------|
| CMO Agent | Strategy, delegation, and task routing | O3 | Central brain |
| Copywriter Agent | Website copy, CTAs, product descriptions | GPT-4.1-mini | Fast, human-like copy |
| Facebook Ads Copywriter | Ad headlines, angles, A/B tests | GPT-4.1-mini | Platform-specific ad copy |
| SEO Writer | Blog posts, keyword-rich content | GPT-4.1-mini | Long-form content |
| Email Specialist | Sequences, newsletters, welcome flows | GPT-4.1-mini | Funnel-ready emails |
| Social Media Manager | Content calendars, posts, hashtags | GPT-4.1-mini | Cross-platform content |
| Brand Voice Specialist | Tone consistency, style guides | GPT-4.1-mini | On-brand text |

---

Use Cases

- Product launches: strategy → landing page → emails → social posts
- Lead nurture funnels: segmented email campaigns with consistent tone
- Content sprints: generate 30+ blog posts and social posts in a day
- Ad variations: create 20 ad angles in 30 seconds
- Brand guidelines: enforce consistent messaging across departments

---

Cost Optimization

- Use O3 sparingly, only for strategic tasks.
- All specialist agents use GPT-4.1-mini for low-latency, high-efficiency generation.
- Run agents in parallel to reduce wait times.
- Add caching for repeat requests.

---

Customization Tips

- Edit the tool prompts to match your brand's style and niche.
- Connect outputs to Google Sheets, Notion, Slack, or email tools.
- Integrate with Zapier, Make.com, or your CRM for full automation.

---

Connect With Me

Website: nofluff.online | YouTube: @YaronBeen | LinkedIn: Yaron Been

---

Tags

n8n, OpenAI, MarketingAI, CMOagent, Automation, GPT4, LangChain, NoCode, MarketingTeam, AIWorkflow, EmailMarketing, SEO, Copywriting, SocialMedia, DigitalMarketing, BrandVoice, AItools, MultiAgentSystem, ContentCreation, MarketingStrategy, ContentOps
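For orientation, here is the delegation step reduced to a plain keyword router. This is a deliberately naive sketch, not the template's actual O3 agent (which reasons over the message via LangChain tools); the `chatInput` field name and the keyword lists are assumptions for illustration.

```javascript
// n8n Code node (mode: Run Once for All Items): naive keyword router that
// stands in for the O3 CMO Agent's delegation step.
const specialists = [
  { name: 'Email Marketer', keywords: ['email', 'newsletter', 'sequence', 'funnel'] },
  { name: 'SEO Content Writer', keywords: ['blog', 'seo', 'keyword', 'article'] },
  { name: 'Facebook Ads Specialist', keywords: ['ads', 'facebook', 'campaign'] },
  { name: 'Social Media Manager', keywords: ['social', 'post', 'hashtag', 'calendar'] },
  { name: 'Brand Voice Specialist', keywords: ['tone', 'voice', 'style guide', 'brand'] },
];

return $input.all().map((item) => {
  const text = (item.json.chatInput || '').toLowerCase();
  // Pick the specialist whose keywords match most often; default to the Copywriter.
  let best = { name: 'Copywriter Agent', score: 0 };
  for (const s of specialists) {
    const score = s.keywords.filter((k) => text.includes(k)).length;
    if (score > best.score) best = { name: s.name, score };
  }
  return { json: { ...item.json, delegateTo: best.name } };
});
```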
Download and compress folder from S3 to ZIP file
This workflow downloads all files from a specific folder in an S3 bucket and compresses them into a single ZIP file, so you can download it via n8n or do further processing. Fill in your credentials and settings in the nodes marked with "*". It may serve well as a blueprint or as a manual download for S3 folders. Since I found it rather tricky to compress all binary files into one ZIP file, I figured it might be an interesting template. Hint: this is the expression that collects every binary key so they can be compressed dynamically (used in the "Compress" node; a short sketch of the surrounding setup follows below): {{ Object.keys($binary).join(',') }} Enjoy the workflow! ❤️ https://let-the-work-flow.com | Workflow Automation & Development
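Context for that expression: the Compress node can only zip binaries that live on a single item, so the downloaded S3 files must first be merged onto one item. A minimal Code-node sketch of that merge step, assuming the re-keyed property names (file_0_data, file_1_data, …) are purely illustrative:

```javascript
// n8n Code node (mode: Run Once for All Items): merge the binary data of all
// downloaded S3 items onto a single item so one Compress node can zip them.
const merged = { json: {}, binary: {} };

$input.all().forEach((item, i) => {
  for (const key of Object.keys(item.binary || {})) {
    // Re-key each binary property so nothing is overwritten.
    merged.binary[`file_${i}_${key}`] = item.binary[key];
  }
});

return [merged];
// In the Compress node, the input binary field can then be set dynamically
// with the expression from the hint above: {{ Object.keys($binary).join(',') }}
```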
Sync new Shopify products to Odoo products
Seamlessly sync newly created Shopify products into Odoo with this smart, no-code n8n workflow. It automatically checks for duplicates and creates new products only when needed, and it is fully compatible with a standard Odoo setup.

Key Features:
- Trigger-based automation: starts instantly when a new product is created in Shopify.
- Duplicate check: searches for an existing product in Odoo using Shopify's product SKU as the Internal Reference (default code).
- Smart condition logic: if a matching product is found, the workflow stops, preventing duplicates; if not found, a new product is created in Odoo (see the sketch after this description).
- Auto-creation of the Odoo product using Shopify data: product name, description, price, and Internal Reference (SKU).
- Works with both Odoo Community and Enterprise editions.
- Fully customizable and extendable in n8n; no coding and no custom modules required.

Requirements:
- n8n (self-hosted or cloud)
- Shopify API access
- Odoo instance with standard API access

A perfect solution for businesses that regularly add new products to Shopify and want them mirrored in Odoo, automatically and without manual effort.

How It Works:
1. Trigger: the workflow starts when a new product is created in Shopify.
2. Check existing products: it searches Odoo for any product that already has the same Internal Reference or default code (taken from the Shopify product data).
3. Condition: if a matching product is found, the workflow stops (no duplicates are created); if no matching product is found, it proceeds.
4. Create product in Odoo: a new product is created using the Shopify product name, description, price, and Internal Reference (default code).
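Here is the duplicate-check-then-create logic condensed into one Code node, as a simplified stand-in for the template's Odoo search + IF nodes. The `odooMatches` field (results of an upstream Odoo "Get Many" filtered on the SKU) and the Shopify field mapping are assumptions; the Odoo field names follow its standard product model, where the Internal Reference is `default_code`.

```javascript
// n8n Code node (mode: Run Once for All Items): skip products that already
// exist in Odoo, and map the rest into fields for the Odoo "Create" node.
return $input.all().flatMap((item) => {
  const matches = item.json.odooMatches || [];
  if (matches.length > 0) return []; // Internal Reference already in Odoo: stop.

  return [{ json: {
    name: item.json.title,                 // Shopify product name
    description_sale: item.json.body_html, // Shopify description
    list_price: item.json.price,           // Shopify price
    default_code: item.json.sku,           // Internal Reference (SKU)
  } }];
});
```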
Automate hotel price comparison with multi-platform scraping and email reporting
This is a production-ready, end-to-end workflow that automatically compares hotel prices across multiple booking platforms and delivers beautiful email reports to users. Unlike basic building blocks, this workflow is a complete solution ready to deploy.

---

What Makes This Production-Ready

Complete end-to-end automation:
- Input: natural language queries via webhook
- Processing: multi-platform scraping and comparison
- Output: professional email reports plus analytics
- Feedback: real-time webhook responses

Advanced features:
- Natural language processing for flexible queries
- Parallel scraping from multiple platforms
- Analytics tracking with Google Sheets integration
- Beautiful HTML email reports
- Error handling and graceful degradation
- Webhook responses for real-time feedback

Business value:
- For travel agencies: an instant price-comparison service for clients
- For hotels: competitive pricing intelligence
- For travelers: save time and money with automated research

---

Setup Instructions

Step 1: Import Workflow
1. Copy the workflow JSON from the artifact.
2. In n8n, go to Workflows → Import from File/URL.
3. Paste the JSON and click Import.

Step 2: Configure Credentials

A. SMTP Email (required)
- Settings → Credentials → Add Credential → SMTP
- Host: smtp.gmail.com (for Gmail)
- Port: 587
- User: your-email@gmail.com
- Password: your app password (not your regular password!)

Gmail setup:
1. Enable 2FA on your Google Account.
2. Generate an App Password: https://myaccount.google.com/apppasswords
3. Use the generated password in n8n.

B. Google Sheets (optional, for analytics)
- Settings → Credentials → Add Credential → Google Sheets OAuth2
- Follow the OAuth flow to connect your Google account.

Sheet setup:
1. Create a new Google Sheet.
2. Name the first sheet "Analytics".
3. Add headers: timestamp, query, hotel, city, checkIn, checkOut, bestPrice, platform, totalResults, userEmail.
4. Copy the Sheet ID from the URL and paste it into the "Save to Google Sheets" node.

Step 3: Set Up Scraping Service

You need to create a scraping API that the workflow calls. Here are your options:

Option A: Use Your Existing Python Script

Create a simple Flask API wrapper:

```python
# api_wrapper.py
import json
import subprocess

from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route('/scrape/<platform>', methods=['POST'])
def scrape(platform):
    data = request.json
    query = f"{data['checkIn']} to {data['checkOut']}, {data['hotel']}, {data['city']}"
    try:
        result = subprocess.run(
            ['python3', 'pricescrap2.py', query, platform],
            capture_output=True, text=True, timeout=30
        )
        # Parse your script's output; this assumes it prints JSON price data.
        output = json.loads(result.stdout)
        return jsonify({
            'price': output['price'],
            'currency': 'USD',
            'roomType': 'Standard Room',
            'url': output['url'],
            'availability': True
        })
    except Exception as e:
        return jsonify({'error': str(e)}), 500

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=5000)
```

Deploy:

```bash
pip install flask
python api_wrapper.py
```

Update the n8n HTTP Request nodes:
- URL: http://your-server-ip:5000/scrape/booking
- URL: http://your-server-ip:5000/scrape/agoda
- URL: http://your-server-ip:5000/scrape/expedia

Option B: Use Third-Party Scraping Services

Recommended services:
- ScraperAPI (scraperapi.com): $49/month for 100k requests
- Bright Data (brightdata.com): pay as you go
- Apify (apify.com): has pre-built hotel scrapers

Example with ScraperAPI, in an HTTP Request node:
- URL: http://api.scraperapi.com
- Query parameters:
  - apikey: YOUR_API_KEY
  - url: https://booking.com/search?hotel={{$json.hotelName}}...
Option C: Use the n8n SSH Node (Like Your Original)

Keep your SSH approach but improve it:
1. Replace the HTTP Request nodes with SSH nodes.
2. Point them at your server with the Python script.
3. Ensure error handling and timeouts.

SSH node configuration:

```
Host: your-server-ip
Command: python3 /path/to/pricescrap2.py "{{$json.hotelName}}" "{{$json.city}}" "{{$json.checkInISO}}" "{{$json.checkOutISO}}" "booking"
```

Step 4: Activate Webhook
1. Click on the "Webhook - Receive Request" node.
2. Click "Listen for Test Event".
3. Copy the webhook URL (e.g., https://your-n8n.com/webhook/hotel-price-check).
4. Test with this curl command:

```bash
curl -X POST https://your-n8n.com/webhook/hotel-price-check \
  -H "Content-Type: application/json" \
  -d '{
    "message": "I want to check Marriott Hotel in Singapore from 15th March to 18th March",
    "email": "user@example.com",
    "name": "John Doe"
  }'
```

Step 5: Activate Workflow
1. Toggle the workflow to Active.
2. The webhook is now live and ready to receive requests.

---

Usage Examples

Example 1: Basic query

```json
{
  "message": "Hilton Hotel in Dubai from 20th December to 23rd December",
  "email": "traveler@email.com",
  "name": "Sarah"
}
```

Example 2: Flexible format

```json
{
  "message": "I need prices for Taj Hotel, Mumbai. Check-in: 5th January, Check-out: 8th January",
  "email": "customer@email.com"
}
```

Example 3: Short format

```json
{
  "message": "Hyatt Singapore March 10 to March 13",
  "email": "user@email.com"
}
```

---

Customization Options

Add more booking platforms:
1. Duplicate an existing "Scrape" node.
2. Update the platform parameter.
3. Connect it to "Aggregate & Compare".
4. Update the aggregation logic to include the new platform.

Change the email template by editing the "Format Email Report" node's JavaScript:
- Modify the HTML structure
- Change colors (currently a purple gradient)
- Add your company logo
- Include terms and conditions

Add SMS notifications using Twilio:
1. Add a new node: Twilio → Send SMS.
2. Connect it after "Aggregate & Compare".
3. Format: "Best deal: ${hotel} at ${platform} for ${price}".

Add Slack integration:
1. Add a Slack node after "Aggregate & Compare".
2. Send to a travel-deals channel.
3. Include quick booking links.

Implement caching with Redis or your own cache layer:

```javascript
// Before scraping, check the cache. $cache stands for whatever cache client
// you wire in (e.g. a Redis client); the query fields come from the parsed input.
const { hotelName, city, checkIn, checkOut } = $json;
const cacheKey = `${hotelName}-${city}-${checkIn}-${checkOut}`;
const cached = await $cache.get(cacheKey);
if (cached && Date.now() - cached.timestamp < 3600000) {
  return cached.data; // Use the 1-hour cache
}
```

---

Analytics & Monitoring

Google Sheets Dashboard

The workflow automatically logs to Google Sheets.
Create a dashboard from this data.

Metrics to track:
- Total searches per day/week
- Most searched hotels
- Most searched cities
- Average price ranges
- Platform with the best prices (frequency)
- User engagement (repeat users)

Example sheet formulas:

```
// Total searches today
=COUNTIF(A:A, TODAY())

// Most popular hotel
=INDEX(C:C, MODE(MATCH(C:C, C:C, 0)))

// Average best price
=AVERAGE(G:G)
```

Set Up Alerts

Add a node after "Aggregate & Compare":

```javascript
// Alert if prices are unusually high; these fields come from the
// aggregation step's output.
const { bestDeal, avgPrice, hotelName } = $json;
if (bestDeal.price > avgPrice * 1.5) {
  // Send an alert to the admin
  return [{ json: { alert: true, message: `High prices detected for ${hotelName}` } }];
}
```

---

Error Handling

The workflow includes comprehensive error handling:
- Missing information: if the user doesn't provide hotel/city/dates, it responds with a helpful prompt.
- Scraping failures: if all platforms fail, it sends a "No results" email with suggestions.
- Partial results: if only some platforms work, it shows the available results and notes the errors.
- Email delivery issues: uses continueOnFail: true to prevent workflow crashes.

---

Security Best Practices

Rate limiting to prevent abuse:

```javascript
// In the Parse & Validate node
const userEmail = $json.email;
const recentSearches = await $cache.get(`searches:${userEmail}`);
if (recentSearches && recentSearches.length > 10) {
  return [{ json: {
    status: 'rate_limited',
    response: 'Too many requests. Please try again in 1 hour.'
  } }];
}
```

Input validation: already implemented; validates hotel names, cities, and dates.

Email verification before first use:

```javascript
// Send a verification code. $sendEmail stands for your own helper around
// the workflow's email-sending node or service.
const userEmail = $json.email;
const code = Math.random().toString(36).substring(7);
await $sendEmail({
  to: userEmail,
  subject: 'Verify your email',
  body: `Your code: ${code}`
});
```

API key protection: never expose scraping API keys in responses or logs.

---

Deployment Options

Option 1: n8n Cloud (easiest)
1. Sign up at n8n.cloud.
2. Import the workflow.
3. Configure credentials.
4. Activate.

Pros: no maintenance, automatic updates. Cons: monthly cost.

Option 2: Self-Hosted (most control)

```bash
# Using Docker
docker run -it --rm \
  --name n8n \
  -p 5678:5678 \
  -v ~/.n8n:/home/node/.n8n \
  n8nio/n8n

# Using npm
npm install -g n8n
n8n start
```

Pros: free, full control. Cons: you manage updates.

Option 3: Cloud Platforms
- Railway.app (recommended for beginners)
- DigitalOcean App Platform
- AWS ECS
- Google Cloud Run

---

Scaling Recommendations

For < 100 searches/day:
- The current setup is perfect.
- Use n8n Cloud Starter or a small VPS.

For 100-1000 searches/day:
- Add Redis caching (1-hour cache).
- Use a queue system for scraping.
- Upgrade to n8n Cloud Pro.

For 1000+ searches/day:
- Implement a job queue (Bull/Redis).
- Use a dedicated scraping service.
- Load-balance multiple n8n instances.
- Consider a microservices architecture.

---

Troubleshooting

Issue: webhook not responding. Solution:
- Check that the workflow is Active.
- Verify the webhook URL is correct.
- Check the n8n logs: Settings → Log Streaming.

Issue: no prices returned. Solution:
- Test the scraping endpoints individually.
- Check whether the hotel name matches exactly.
- Verify the dates are in the future.
- Try different date ranges.

Issue: emails not sending. Solution:
- Verify the SMTP credentials.
- Check the "less secure apps" setting (Gmail).
- Use an App Password instead of your regular password.
- Check the spam folder.

Issue: slow response times. Solution:
- Enable parallel scraping (already configured).
- Add timeout limits (30 seconds recommended).
- Implement caching.
- Use a faster scraping service.
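The "Parse & Validate" step referenced throughout boils down to extracting hotel, city, and dates from free-form text. A minimal sketch of such a parser as an n8n Code node; the regexes and output field names are illustrative assumptions, not the template's exact logic:

```javascript
// n8n Code node (mode: Run Once for All Items): naive extraction of hotel,
// city, and dates from the webhook message.
return $input.all().map((item) => {
  const msg = item.json.message || '';
  // "… Marriott Hotel in Singapore from 15th March to 18th March" style input.
  const hotelCity = msg.match(/check\s+(.+?)\s+in\s+([A-Za-z ]+?)(?=\s+from|[,.]|$)/i)
    || msg.match(/(.+?)\s+in\s+([A-Za-z ]+?)(?=\s+from|[,.]|$)/i);
  const dates = msg.match(/from\s+(.+?)\s+to\s+(.+?)(?=[,.]|$)/i);

  if (!hotelCity || !dates) {
    // Matches the workflow's "missing information" behavior.
    return { json: { status: 'missing_info',
      response: 'Please include a hotel, a city, and check-in/check-out dates.' } };
  }

  return { json: {
    status: 'ok',
    hotelName: hotelCity[1].trim(),
    city: hotelCity[2].trim(),
    checkIn: dates[1].trim(),
    checkOut: dates[2].trim(),
    email: item.json.email,
  } };
});
```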
Track Shopify orders in Google Sheets and send Discord notifications
This workflow tracks new Shopify orders in real time and logs them to a Google Sheet, while also sending a structured order summary to a Discord channel. Perfect for keeping your team and records updated without checking your Shopify admin manually.

Features:
- Trigger: listens to the orders/create event via the Shopify Trigger node.
- Authentication: uses a Shopify access token, generated via a custom/private Shopify app.
- Google Sheets logging: automatically appends order details to a sheet with the following columns: Order Number, Customer Email, Customer Name, City, Country, Order Total, Currency, Subtotal, Tax, Financial Status, Payment Gateway, Order Date, Line Item Titles, Line Item Prices, Order Link.
- Discord alerts: sends a clean, formatted summary to your Discord server.
- Line item extraction: breaks item titles and prices down into a readable format using code (a sketch of this node follows below).
- Multi-currency compatible: displays the currency type dynamically (not hardcoded).

---

Nodes Used:
1. Shopify Trigger (access token)
2. Code: extract line_item_titles and line_item_prices
3. Google Sheets: append row
4. Code (JavaScript): format the Discord message
5. Discord: send message

---

Sticky Notes:
- Use your own Google Sheet link and Discord webhook.
- You can duplicate and adapt this for orders/updated or refunds/create events.
- No hardcoded API keys; credentials are managed via the UI.

---

Sample Outputs

Google Sheet entry:

| Order Number | Customer Email | Customer Name | City | Country | Order Total | Currency | Subtotal | Tax | Financial Status | Payment Gateway | Order Date | Line Item Titles | Line Item Prices | Order Link |
|--------------|------------------|----------------|-----------|----------|--------------|----------|----------|--------|-------------------|------------------|------------------------------|------------------|------------------|------------|
| 1003 | abc123@gmail.com | test name | test city | Pakistan | 2522.77 | PKR | 2174.8 | 347.97 | paid | bogus | 2025-07-31T13:45:35-04:00 | Selling Plans Ski Wax, The Complete Snowboard, The Complete Snowboard, The Collection Snowboard: Liquid | 24.95, 699.95, 699.95, 749.95 | View Order |

Discord message preview: (screenshot omitted)

---

> Tested with Shopify's "Bogus" gateway; it works without real card info in a development store.
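The line-item extraction Code node might look like the sketch below. The `line_items` array with `title` and `price` fields is part of Shopify's standard order payload; the admin order-link format and the `your-store` handle are placeholders to adjust:

```javascript
// n8n Code node (mode: Run Once for All Items): flatten Shopify line items
// into comma-separated strings for the Google Sheets row and Discord message.
return $input.all().map((item) => {
  const order = item.json; // payload from the Shopify Trigger (orders/create)
  const lineItems = order.line_items || [];

  return { json: {
    ...order,
    line_item_titles: lineItems.map((li) => li.title).join(', '),
    line_item_prices: lineItems.map((li) => li.price).join(', '),
    order_link: `https://admin.shopify.com/store/your-store/orders/${order.id}`,
  } };
});
```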
Detect cannibalized keywords and competing pages with Google Search Console
Find Cannibalized Pages (Google Search Console)

This n8n template helps you detect page cannibalization in Google Search Console (GSC): situations where multiple pages on your site rank for the same query and more than one page gets clicks. Use it to spot competing URLs, consolidate content, improve internal linking, and protect your CTR and rankings.

---

Good to know

- Data source: Google Search Console Search Analytics (dimensions: query, page).
- Scope: defaults to the last 12 months and up to 10,000 rows per run (adjustable).
- Logic: keeps only queries with more than one page and where the second page has clicks > 0, giving higher confidence of true cannibalization.
- Privacy: the template ships with a placeholder property (sc-domain:example.com) and a neutral credential name; replace both after import.
- Cost: the n8n nodes used here are free. GSC usage is also free (subject to Google limits).

---

How it works

1. Manual Start: run the workflow on demand.
2. Google Search Console: fetch the last 12 months of query-page rows.
3. Summarize: group by query, building two arrays:
   - appended_page[]: all pages seen for that query
   - appended_clicks[]: clicks for each page-query row (aligned with appended_page)
4. Filter: pass only queries where:
   - count_query > 1 (more than one page involved), and
   - appended_clicks[1] > 0 (the second page also received clicks)
5. Output: a list of cannibalized queries with the competing pages and their click counts.

Example output:

```json
{
  "query": "best running shoes",
  "appended_page": [
    "https://example.com/blog/best-running-shoes",
    "https://example.com/guide/running-shoes-2025"
  ],
  "appended_clicks": [124, 37],
  "count_query": 3
}
```

How to use

1. Import the JSON into n8n.
2. Open the Google Search Console node and:
   - Connect your Google Search Console OAuth2 credential.
   - Replace siteUrl with your property (sc-domain:your-domain.com).
3. Press Execute Workflow on Manual Start.
4. Review the output, focusing on queries where the second page has meaningful clicks.

Tip: If your site is large, start with a shorter date range (e.g., 90 days) or raise rowLimit.

---

Requirements

- Access to the target property in Google Search Console.
- One Google Search Console OAuth2 credential in n8n.

---

Customising this workflow

- More robust detection: in the Summarize node, change the clicks aggregation from append to sum, then filter for "at least 2 pages with sum_clicks > 0" to avoid any dependency on row order.
- Scoring and sorting: add a Code/Function node to sort competing pages by clicks or impressions and compute click share per page (a sketch follows below).
- Deeper analysis: include impressions and position in the GSC node and extend the summary to prioritize fixes (e.g., high impressions plus split clicks).
- Reporting: send results to Google Sheets or export a CSV; build a dashboard of top cannibalized queries.
- Thresholds: expose minimum click thresholds as workflow variables (e.g., second-page clicks ≥ 3) to reduce noise.

---

Troubleshooting

- Empty results: widen the date range, increase rowLimit, or temporarily relax the filter (remove the second-page click condition to validate data flow).
- No property data: ensure you used the sc-domain: vs. https:// property format correctly and that your user has GSC access.
- Credential issues: reconnect the OAuth2 credential and reauthorize if needed.
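A minimal sketch of the scoring-and-sorting Code node mentioned above, assuming items shaped like the example output (the `rankedPages` output field is an illustrative name):

```javascript
// n8n Code node (mode: Run Once for All Items): rank competing pages per
// query and compute each page's share of the query's total clicks.
return $input.all().map((item) => {
  const { appended_page = [], appended_clicks = [] } = item.json;

  // Pair each page with its clicks, then sort descending by clicks.
  const pages = appended_page
    .map((page, i) => ({ page, clicks: appended_clicks[i] || 0 }))
    .sort((a, b) => b.clicks - a.clicks);

  const totalClicks = pages.reduce((sum, p) => sum + p.clicks, 0) || 1;
  pages.forEach((p) => { p.clickShare = +(p.clicks / totalClicks).toFixed(2); });

  return { json: { ...item.json, rankedPages: pages } };
});
```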
Add AI-generated headings & conclusions to WordPress posts with GPT-4.1
This workflow enriches your WordPress articles by automatically adding an AI-generated heading and a short concluding paragraph. It ensures each post ends with valuable, engaging content to improve user satisfaction, branding, and SEO.

How It Works
1. Fetches published articles from your WordPress site via the REST API.
2. Cleans and formats the article text for processing.
3. Sends the content to OpenAI with a structured prompt.
4. The AI generates a new heading plus a 3-line conclusion tailored to the article.
5. Appends the generated text to the original content (a minimal sketch of this step follows below).
6. Updates the article back in WordPress automatically.

Requirements
- n8n version 1.49.0 or later (recommended).
- Active OpenAI API key.
- WordPress REST API enabled.
- WordPress API credentials (username + application password).

Setup Instructions
1. Import this workflow into n8n.
2. Go to Credentials and configure:
   - OpenAI API (API key)
   - WordPress API (username + application password)
3. Replace https://example.com with your site's URL.
4. Run manually or schedule it to enhance content automatically.

Categories: AI & Machine Learning, WordPress, Content Marketing, SEO

Tags: ai, openai, wordpress, seo, content enhancement, automation, n8n
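The append step (step 5 above) could look like this as an n8n Code node. The `ai_heading` and `ai_conclusion` field names are assumptions; map them from your OpenAI node's output, and note that the post body is assumed to have been stored in `content` by the earlier cleaning step:

```javascript
// n8n Code node (mode: Run Once for All Items): append the AI-generated
// heading and conclusion to the post body before the WordPress update call.
return $input.all().map((item) => {
  const post = item.json;
  const addition = `\n<h2>${post.ai_heading}</h2>\n<p>${post.ai_conclusion}</p>`;

  return { json: {
    id: post.id, // post ID for the WordPress node's Update operation
    content: `${post.content}${addition}`,
  } };
});
```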
Track website traffic data with SEMrush API and Google Sheets
Website Traffic Monitoring with SEMrush API and Google Sheets Integration

Leverage the SEMrush Website Traffic Checker API to automatically fetch detailed website traffic insights and log them into Google Sheets for real-time monitoring and reporting. This no-code n8n workflow simplifies traffic analysis for marketers, analysts, and website owners.

---

Node-by-Node Workflow Breakdown

On Form Submission
- Trigger: the workflow starts when a user submits a website URL via a form. This serves as the input for further processing.
- Use case: tracking multiple websites and monitoring their performance over time.

Website Traffic Checker
- API request: the workflow makes a POST request to the SEMrush Website Traffic Checker API via RapidAPI, using the submitted website URL.
- API data: the API returns detailed traffic insights, including visits, bounce rate, page views, sessions, traffic sources, and more.

Reformat
- Parsing: the raw API response is parsed to extract the relevant data under trafficSummary.
- Data structure: the workflow produces a clean dataset of traffic data, making it easy to store in Google Sheets.

Google Sheets
- Logging: the traffic data is appended as a new row in your Google Sheet.
- Sheet setup: the data is organized and updated in a structured format, allowing you to track website performance over time.

---

Use Cases

- SEO & digital marketing agencies: automate client website audits by pulling live traffic data into reports.
- Website owners & bloggers: monitor traffic growth and analyze content performance automatically.
- Data analysts & reporting teams: feed traffic data into dashboards and integrate it with other KPIs for deeper analysis.
- Competitor tracking: regularly log competitor site metrics for comparative benchmarking.

---

Key Benefits

- Automated traffic monitoring: run reports on demand or on a schedule.
- Real-time Google Sheets logging: centralize and structure traffic data for easy sharing and visualization.
- Zero code required: powered by n8n's visual builder; set up workflows without writing a single line of code.
- Scalable and flexible: extend the workflow to include alerts, additional API integrations, or other automated tasks.

---

How to Get Your SEMrush API Key via RapidAPI

1. Visit the API listing: SEMrush Website Traffic Checker API.
2. Sign in to RapidAPI or sign up for a free account.
3. Subscribe to the API: choose the appropriate pricing plan and click Subscribe.
4. Access your API key: go to the Endpoints tab; your key is located under the X-RapidAPI-Key header.
5. Secure and use the key: add it to the request headers in your workflow, and never expose it publicly.

---

Step-by-Step Setup Instructions

Creating the Form to Capture the URL
1. In n8n, create a new workflow and add a Webhook trigger node to capture website URLs.
2. Configure the webhook to accept URL submissions from your form.
3. Add a form to your website or app that triggers the webhook when a URL is submitted.

Configure the SEMrush API Request Node
1. Add an HTTP Request node after the webhook.
2. Set the method to POST and the URL to the SEMrush API endpoint.
3. Add the necessary headers:
   - X-RapidAPI-Host: semrush-website-traffic-checker.p.rapidapi.com
   - X-RapidAPI-Key: [Your API Key]
4. Pass the captured website URL from the webhook as a parameter in the request body.

Reformat the API Response
Add a Set node to parse and structure the API response.
Extract only the necessary data, such as:
- trafficSummary.visits
- trafficSummary.bounceRate
- trafficSummary.pageViews
- trafficSummary.sessions

Format the response so it is clean and suitable for Google Sheets (a minimal Code-node alternative is sketched after this section).

Store Data in Google Sheets
1. Add the Google Sheets node to your workflow.
2. Authenticate with your Google account.
3. Select the spreadsheet and worksheet where you want to store the traffic data.
4. Configure the node to append new rows with the extracted traffic data.

Google Sheets columns setup:
- A: Website URL
- B: Visits
- C: Bounce Rate
- D: Page Views
- E: Sessions
- F: Date/Time (optional; you can use a timestamp)

Test and Deploy
1. Run a test submission through your form to ensure the workflow works as expected.
2. Check the Google Sheets document to verify that the data is being logged correctly.
3. Set up scheduling or additional workflows as needed (e.g., periodic updates).

---

Customizing the Template

You can modify the workflow to suit your specific needs:
- Add more data points: customize the SEMrush API request to fetch additional metrics (e.g., traffic sources, keywords).
- Create separate sheets: if you're tracking multiple websites, create a different sheet for each website or group websites by category.
- Add alerts: set up email or Slack notifications when specific traffic conditions (like sudden drops) are met.
- Visualize data: integrate Google Sheets with Google Data Studio or other tools for more advanced visualizations.

---

Start Automating in Minutes

Build your automated website traffic dashboard with n8n today; no coding required. Save time, improve accuracy, and supercharge your traffic insights workflow!
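For reference, the Reformat step above can also be done with a Code node instead of a Set node. A minimal sketch; the `trafficSummary` path follows the description above, and the exact response shape may differ depending on your RapidAPI plan:

```javascript
// n8n Code node (mode: Run Once for All Items): flatten the API response
// into one row per site, matching the Google Sheets columns A-F.
return $input.all().map((item) => {
  const summary = item.json.trafficSummary || {};
  return { json: {
    websiteUrl: item.json.websiteUrl,     // captured by the form/webhook
    visits: summary.visits,
    bounceRate: summary.bounceRate,
    pageViews: summary.pageViews,
    sessions: summary.sessions,
    checkedAt: new Date().toISOString(),  // optional timestamp column
  } };
});
```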
AI-powered n8n release notes summary notifications via Gmail with GPT-5-Mini
Who's it for

This workflow is perfect for n8n users and teams who want to stay up to date with the latest n8n releases without manually checking GitHub. Get AI-powered summaries of new features and bug fixes delivered straight to your inbox.

What it does

This workflow automatically monitors the n8n GitHub releases page and sends you smart email notifications when new updates are published. It fetches release notes, filters them based on your schedule (daily, weekly, etc.), and uses OpenAI to generate concise summaries highlighting the most important bug fixes and features. The summaries are then formatted into a clean HTML email and sent via Gmail.

How to set up

1. Configure the Schedule Trigger: set how often you want to check for updates (daily, weekly, etc.).
2. Add OpenAI credentials: connect your OpenAI API key or use a different LLM.
3. Add Gmail credentials: connect your Google account.
4. Set the recipient email: update the "To" email address in the Gmail node.
5. Activate the workflow and you're done!

Requirements

- OpenAI API account (or an alternative LLM)
- Gmail account with n8n credentials configured

How to customize

- Adjust the Schedule Trigger to match your preferred notification frequency.
- The filtering logic automatically adapts to your schedule (24 hours for daily, 7 days for weekly, etc.); a sketch of this filter follows below.
- Modify the AI prompt to focus on different aspects of the release notes.
- Customize the HTML email template to match your preferences.
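The schedule-adaptive filter could look like the sketch below. The `published_at` field is part of GitHub's releases API response; the hard-coded `windowHours` is an assumption to replace with a value derived from your Schedule Trigger cadence (24 for daily, 168 for weekly):

```javascript
// n8n Code node (mode: Run Once for All Items): keep only releases
// published within the notification window.
const windowHours = 24; // adjust to match the Schedule Trigger cadence
const cutoff = Date.now() - windowHours * 60 * 60 * 1000;

return $input.all().filter((item) => {
  // GitHub's releases API exposes published_at as an ISO 8601 timestamp.
  return new Date(item.json.published_at).getTime() >= cutoff;
});
```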