Analyze Meta ads creatives with Google Vision & Video Intelligence APIs
This workflow transforms your Meta Ads creatives into a rich dataset of actionable insights. It's designed for data-driven marketers, performance agencies, and analysts who want to move beyond basic metrics and understand the specific visual and textual elements that drive ad performance. By automatically analyzing every video and image with Google's powerful AI (the Video Intelligence and Vision APIs), it systematically deconstructs your creatives into labeled data, ready for correlation with campaign results.

**Use Case**

You know some ads perform better than others, but do you know why? Is it the presence of a person, a specific object, the on-screen text, or the spoken words in a video? Answering these questions manually is nearly impossible at scale. This workflow automates the deep analysis process, allowing you to:

- **Automate Creative Analysis:** Stop guessing and start making data-backed decisions about your creative strategy.
- **Uncover Hidden Performance Drivers:** Identify which objects, themes, text, or spoken phrases correlate with higher engagement and conversions.
- **Build a Structured Creative Database:** Create a detailed, searchable log of every element within your ads for long-term analysis and trend-spotting.
- **Save Countless Hours:** Eliminate the tedious manual process of watching, tagging, and logging creative assets.

**How it Works**

The workflow is triggered on a schedule and follows a clear, structured path:

- **Configuration & Ad Ingestion:** The workflow begins on a schedule (e.g., weekly on Monday at 10 AM). It starts by fetching all active ads from a specific Meta Ads campaign, which you define in the Set Campaign ID node.
- **Intelligent Branching (Video vs. Image):** An IF node inspects each creative to determine its type. Video creatives are routed to the Google Video Intelligence API pipeline; image creatives are routed to the Google Vision API pipeline.
- **The Video Analysis Pipeline:** For each video, the workflow gets a direct source URL, downloads the file, and converts it to a Base64 string. It then starts an asynchronous analysis job in the Google Video Intelligence API, requesting LABEL_DETECTION, SPEECH_TRANSCRIPTION, and TEXT_DETECTION. A loop with a wait timer periodically checks the job status until the analysis is complete. Finally, a Code node parses the complex JSON response, structuring the annotations (such as detected objects with timestamps or full speech transcripts) into clean rows.
- **The Image Analysis Pipeline:** For each image, the file is downloaded, converted to Base64, and sent to the Google Vision API, requesting a wide range of features including label, text, logo, and object detection. A Code node parses the response and formats the annotations into a standardized structure.
- **Data Logging & Robust Error Handling:** All successfully analyzed data from both pipelines is appended to a primary Google Sheet. The workflow is built to be resilient: if an error occurs (e.g., a video fails to be processed by the API, or an image URL is missing), a detailed error report is logged to a separate errors sheet in your spreadsheet, ensuring no data is lost and problems are easy to track.

---

**Setup Instructions**

To use this template, you need to configure a few key nodes.

1. **Credentials:**
   - Connect your Meta Ads account.
   - Connect your Google account. This account needs access to Google Sheets and must have the Cloud Vision API and Cloud Video Intelligence API enabled in your GCP project.
2. **The Set Campaign ID node:** This is the primary configuration step. Open this Set node and replace the placeholder value with the ID of the Meta Ads campaign you want to analyze.
3. **Google Sheets nodes:** You need to configure two Google Sheets nodes:
   - **Add Segments data:** Select your spreadsheet and the specific sheet where you want to save the successful analysis results.
Ensure your sheet has the following headers: campaign_id, ad_id, creative_id, video_id, file_name, image_url, source, annotation_type, label_or_text, category, full_transcript, confidence, start_times, end_times, language_code, processed_at_utc.
   - **Add errors:** Select your spreadsheet and the sheet you want to use for logging errors (e.g., a sheet named "errors"). Ensure this sheet has headers such as: error_type, error_message, campaign_id, ad_id, creative_id, file_name, processed_at_utc.
4. **Activate the Workflow:** Set your desired frequency in the Run Weekly on Monday at 10 AM (Schedule Trigger) node, then save and activate the workflow.

---

**Further Ideas & Customization**

This workflow provides the "what" inside your creatives. The next step is to connect it to performance.

- **Build a Performance Analysis Workflow:** Create a second workflow that reads this Google Sheet, fetches performance data (spend, clicks, conversions) for each ad_id from the Meta Ads API, and merges the two datasets. This lets you see which labels correlate with the best performance.
- **Create Dashboards:** Use the structured data in your Google Sheet as a source for a Looker Studio or Tableau dashboard to visualize creative trends.
- **Incorporate Generative AI:** Add a final step that sends the combined performance and annotation data to an LLM to automatically generate qualitative summaries and recommendations for each creative.
- **Add Notifications:** Use the Slack or Email nodes to send a summary after each run, reporting how many creatives were analyzed and whether any errors occurred.
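As a concrete illustration of the video pipeline described above, the body of the `videos:annotate` request the HTTP Request node sends can be sketched as follows. This is a minimal sketch: the feature names come from the Video Intelligence v1 API, while the `buildAnnotateRequest` helper, the inline-content approach, and the `en-US` language code are illustrative assumptions to adapt to your setup.

```javascript
// Sketch of a videos:annotate request body for the Video Intelligence API.
// `base64Video` stands in for the Base64 string produced earlier in the flow.
function buildAnnotateRequest(base64Video) {
  return {
    inputContent: base64Video, // inline video bytes, Base64-encoded
    features: ["LABEL_DETECTION", "SPEECH_TRANSCRIPTION", "TEXT_DETECTION"],
    videoContext: {
      speechTranscriptionConfig: {
        languageCode: "en-US", // assumption: adjust to your creatives' language
        enableAutomaticPunctuation: true,
      },
    },
  };
}
```

The API responds with an operation name, which is what the wait-and-check loop polls until the annotation results are ready.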
Automated security alert analysis with Sophos, Gemini AI, and VirusTotal
**How It Works**

This workflow automates the analysis of security alerts from Sophos Central, turning raw events into actionable intelligence. It uses the official Sophos SIEM integration tool to fetch data, enriches it with VirusTotal, and leverages Google Gemini to provide a real-time threat summary and mitigation plan via Telegram.

**Prerequisite (Important):** This workflow is triggered by a webhook that receives data from an external Python script. You must first set up the Sophos-Central-SIEM-Integration script from the official Sophos GitHub. This script will fetch data and forward it to your n8n webhook URL. Tool source code: Sophos/Sophos-Central-SIEM-Integration

**The n8n Workflow Steps**

1. **Webhook:** Receives enriched event and alert data from the external Python script.
2. **IF (Filter):** Immediately filters the incoming data so that only events with a high or critical severity are processed, reducing noise from low-priority alerts.
3. **Code (Prepare Indicator):** Inspects the Sophos event data to extract the primary threat indicator, prioritizing indicators in the following order: file hash (SHA256), URL/domain, then source IP.
4. **HTTP Request (VirusTotal):** The extracted indicator is sent to the VirusTotal API to get a detailed reputation report, including how many security vendors flagged it as malicious.
5. **Code (Prompt for Gemini):** The raw JSON output from VirusTotal is processed into a clean, human-readable summary and a detailed list of flagging vendors.
6. **AI Agent (Google Gemini):** All collected data (the original Sophos log, the full alert details, and the formatted VirusTotal reputation) is compiled into a detailed prompt for Gemini. The AI acts as a virtual SOC analyst to create a concise incident summary, determine the risk level, and provide a list of concrete, actionable mitigation steps.
7. **Telegram:** The complete analysis and mitigation plan from Gemini is formatted into a clean, easy-to-read message and sent to your specified Telegram chat.
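The priority logic of the Prepare Indicator step can be sketched as a small Code-node function. This is a minimal sketch: the field names `sha256`, `url`, and `source_ip` are illustrative assumptions, not the exact Sophos event schema, so map them to the fields your events actually carry.

```javascript
// Sketch of the "Prepare Indicator" priority logic:
// file hash (SHA256) first, then URL/domain, then source IP.
function extractIndicator(event) {
  if (event.sha256) return { type: "file_hash", value: event.sha256 };
  if (event.url) {
    // Reduce a full URL to its host for a VirusTotal domain lookup.
    return { type: "domain", value: new URL(event.url).hostname };
  }
  if (event.source_ip) return { type: "ip", value: event.source_ip };
  return null; // nothing usable to enrich
}
```

The returned `type` can then drive which VirusTotal endpoint (files, domains, or IP addresses) the HTTP Request node calls.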
**Setup Instructions**

1. Configure the external Python script to forward events to this workflow's Production URL.
2. In n8n, create credentials for Google Gemini, VirusTotal, and Telegram.
3. Assign the newly created credentials to the corresponding nodes in the workflow.
Send Automated Patient Appointment Reminders via Email & SMS with Multi-Database Support
🏥 **PATIENT APPOINTMENT REMINDER SYSTEM (MULTI-DATABASE SUPPORT)**

**PURPOSE:** Automatically send professional appointment reminders via email and SMS to reduce no-shows and improve patient experience.

**FEATURES:**
✅ Dual reminders (3 days + 1 day before)
✅ Email + SMS notifications
✅ Phone number auto-formatting
✅ Timezone handling
✅ Professional HTML email templates
✅ Multi-database support:
  • PostgreSQL (Enterprise)
  • Google Sheets (Easy)
  • Airtable (Recommended)
✅ Flexible webhook support

**BUSINESS IMPACT:**
📈 Reduce no-shows by 30-50%
💰 Increase revenue from better attendance
⏰ Save staff time on manual reminders
📱 Professional patient communication

**SETUP:** Activate ONLY the database nodes your client needs!
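As an illustration of the phone number auto-formatting feature, here is a Code-node-style sketch that normalizes raw patient numbers to E.164 before handing them to the SMS node. The `toE164` helper and the +1 default country code are assumptions for this sketch, not the template's exact code; change the default for your region.

```javascript
// Sketch: normalize a raw phone number to E.164 (+<country><number>).
// Assumption: 10-digit numbers without a leading "+" get the default
// country code prepended (here "1" for North America).
function toE164(raw, defaultCountryCode = "1") {
  const digits = String(raw).replace(/\D/g, ""); // keep digits only
  if (String(raw).trim().startsWith("+")) return "+" + digits;
  if (digits.length === 10) return "+" + defaultCountryCode + digits;
  return "+" + digits; // assume the country code is already included
}
```

For example, `toE164("(555) 123-4567")` yields `+15551234567`, which most SMS gateways accept directly.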
Generate daily stock buy/sell signals using technical indicators and Google Sheets
📊 **Description**

This automation calculates commonly used technical indicators for selected stocks and presents the results in a simple, structured dashboard. It removes the need for manual chart analysis by automatically fetching price data, calculating indicators, and generating clear Buy, Sell, or Neutral signals. The workflow is designed to run daily and provides a consistent technical snapshot for each tracked stock. It is suitable for traders and analysts who want a repeatable and transparent way to monitor technical conditions without relying on manual tools.

⚙️ **What This Template Does**

- Runs automatically on a daily schedule
- Processes a predefined list of stock symbols
- Fetches recent daily price data from a market data API
- Calculates RSI, Moving Averages, and MACD
- Applies rule-based logic to generate Buy, Sell, or Neutral signals
- Stores indicator values and signals in Google Sheets

✅ **Key Benefits**

- Eliminates manual technical analysis
- Uses standard, widely accepted indicators
- Produces clear and easy-to-interpret signals
- Keeps all results in a single dashboard
- Easy to customize and extend

🧩 **Features**

- Daily scheduled execution
- Historical price data integration
- RSI (14-period) calculation
- Moving Averages (SMA 20 and SMA 50)
- MACD (12, 26, 9) calculation
- Rule-based Buy / Sell / Neutral classification
- Google Sheets dashboard output
- Built-in data validation checks

🔐 **Requirements**

To use this workflow, you will need:

- A market data API key (Alpha Vantage or similar)
- A Google Sheets account for storing results
- Google Sheets credentials configured in n8n
- An active n8n instance (cloud or self-hosted)

🎯 **Target Audience**

- Stock traders and investors
- Technical analysts
- Finance and research teams
- Automation builders working with market data

🛠 **Customization Options**

- Update the stock list to track different symbols
- Adjust indicator periods or thresholds
- Modify Buy / Sell signal rules
- Change the schedule frequency
- Extend the dashboard with additional indicators
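The SMA and RSI calculations listed above can be sketched in a few lines of Code-node JavaScript. This is a simplified sketch, not the template's exact code: it uses simple averages for the RSI gains/losses (Wilder's original RSI uses smoothed averages) and assumes `closes` is an array of daily closing prices ordered oldest to newest.

```javascript
// Simple moving average over the last `period` closes.
function sma(closes, period) {
  const window = closes.slice(-period);
  return window.reduce((a, b) => a + b, 0) / period;
}

// 14-period RSI using simple (unsmoothed) average gains and losses.
function rsi(closes, period = 14) {
  let gains = 0, losses = 0;
  for (let i = closes.length - period; i < closes.length; i++) {
    const change = closes[i] - closes[i - 1];
    if (change > 0) gains += change; else losses -= change;
  }
  if (losses === 0) return 100; // all gains: maximum RSI
  const rs = (gains / period) / (losses / period);
  return 100 - 100 / (1 + rs);
}
```

A rule such as "Buy when RSI < 30 and SMA 20 > SMA 50" can then be layered on top of these values to produce the Buy / Sell / Neutral signal.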
Automate Zoom 🎦 user onboarding with OAuth token management and data tables
This workflow automates the creation of new Zoom users through the Zoom API by first ensuring a valid OAuth access token is available. It is designed to handle the fact that Zoom access tokens are short-lived (1 hour) by using a longer-lived refresh token (90 days) stored in an n8n Data Table. It includes two main phases:

**Token Generation & Management**

The workflow initially requests a Zoom access token using the OAuth 2.0 "authorization code" grant. The resulting access token (valid for 1 hour) and refresh token (valid for 90 days) are stored in an n8n Data Table. When executed again, the workflow checks for the most recent token, refreshes it using the refresh token, and updates the Data Table automatically.

**User Creation in Zoom**

Once a valid token is retrieved, the workflow collects the user's first name, last name, and email (set in the "Data" node). It then generates a secure random password for the new user. Using the Zoom API, it sends a POST request to create the new user, automatically triggering an invitation email from Zoom.

---

**Key Features**

✅ **Full Automation of Zoom Authentication:** Eliminates manual token handling by automatically refreshing and updating OAuth credentials.
✅ **Centralized Token Storage:** Securely stores access and refresh tokens in an n8n Data Table, simplifying reuse across workflows.
✅ **Error Prevention:** Ensures that expired tokens are replaced before API requests, avoiding failed Zoom operations.
✅ **Automatic User Provisioning:** Creates Zoom users automatically with prefilled credentials and triggers Zoom's built-in invitation process.
✅ **Scalability:** Can be easily extended to handle bulk user creation, role assignments, or integration with other systems (e.g., HR, CRM).
✅ **Transparency & Modularity:** Each node is clearly labeled with Sticky Notes explaining every step, making maintenance and handover simple.
---

**How it works**

1. **Trigger and Data Retrieval:** The workflow starts manually. It first retrieves user data (first name, last name, email) from the "Data" node. In parallel, it fetches all stored token records from a Data Table.
2. **Token Management:** The retrieved tokens are sorted and limited to get only the most recent one. This latest token (which contains the refresh_token) is then used in an HTTP Request to Zoom's OAuth endpoint to generate a fresh, valid access_token.
3. **User Creation:** The new access_token and refresh_token are saved back to the Data Table for future use. The workflow then generates a random password for the new user, merges this password with the initial user data, and finally sends a POST request to the Zoom API to create the new user. If the creation is successful, Zoom automatically sends an invitation email to the new user.

---

**Set up steps**

1. **Prepare the Data Table:** Create a new Data Table in your n8n project and add two columns to it: accessToken and refreshToken.
2. **Configure the Zoom OAuth app:** Create a standard OAuth app in the Zoom Marketplace (not a Server-to-Server app). Note your Zoom account_id. Encode your Zoom app's client_id and client_secret in Base64 format (as client_id:client_secret). In both the "Get new token" and "Zoom First Access Token" nodes, replace the "XXX" in the Authorization header with this Base64-encoded string.
3. **Generate Initial Tokens (First Run Only):** Manually execute the "Zoom First Access Token" node once. This node uses an authorization code to fetch the first-ever access and refresh tokens and saves them to your Data Table. The main workflow will use these stored tokens from this point forward.
4. **Configure User Data:** In the "Data" node, set the default values for the new Zoom user by replacing the "XXX" placeholders for first name, last name, and email.

After these setup steps, the main workflow (triggered via "When clicking 'Execute workflow'") can be run whenever you need to create a new Zoom user.
It will automatically refresh the token and use the provided user data to create the account.

---

**Need help customizing?** Contact me for consulting and support, or add me on LinkedIn.
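The token-refresh call configured in the setup above can be sketched as follows. The `buildRefreshRequest` helper is an illustrative assumption for this sketch, but the `https://zoom.us/oauth/token` endpoint, the Basic Authorization header built from `client_id:client_secret`, and the `grant_type=refresh_token` parameter follow Zoom's standard OAuth flow.

```javascript
// Sketch of the refresh request the "Get new token" HTTP Request node makes.
// The Authorization header is the Base64 encoding of "client_id:client_secret";
// the arguments below are placeholders for your real credentials.
function buildRefreshRequest(clientId, clientSecret, refreshToken) {
  const basic = Buffer.from(`${clientId}:${clientSecret}`).toString("base64");
  return {
    method: "POST",
    url: "https://zoom.us/oauth/token",
    headers: {
      Authorization: `Basic ${basic}`,
      "Content-Type": "application/x-www-form-urlencoded",
    },
    body: `grant_type=refresh_token&refresh_token=${encodeURIComponent(refreshToken)}`,
  };
}
```

The JSON response contains a fresh access_token and a new refresh_token, both of which the workflow writes back to the Data Table.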
Categorize support emails with AI and create tasks in Dart
Automatically turn incoming support emails into categorized, prioritized tasks in Dart, complete with AI-generated summaries, tags, and sender context.

**What It Does**

This workflow captures emails from Gmail, uses an AI model to classify them into one of seven categories (e.g., Bug Report, Billing, Feature Request), and creates a structured task in Dart. Each task includes:

- **Title:** The email subject
- **Tag:** Based on the detected category
- **Priority:** Set by the AI based on content analysis
- **Description:** Includes confidence level, rationale, summary, and the cleaned full email body
- **Comment:** Automatically adds the sender's name and email for easy reference

The workflow also parses and cleans the raw HTML email content, ensuring all data is readable and workflow-ready.

**Who It's For**

This template is built for support and operations teams using Dart who want to streamline how incoming emails are sorted and turned into actionable tasks. It's ideal for organizations managing multiple types of requests and updates from clients, partners, or systems.

**How to Set Up**

1. Import the workflow into n8n
2. Connect your Gmail and Dart accounts
3. Replace the dummy Dartboard ID with your actual board ID
4. Choose your preferred AI model (results may vary depending on model quality)
5. If your target email address is in a Google Group, use the "Sender" filter in the Gmail trigger

**Requirements**

- n8n account
- Connected Gmail and Dart accounts

**How to Customize the Workflow**

- Modify the category list to match your team's taxonomy
- Adjust the AI classification prompt to fine-tune tagging and prioritization
- Choose your preferred AI model
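The HTML-cleaning step mentioned above can be sketched as a small Code-node function. This is a minimal sketch under simplified assumptions (the real node may handle entities and encodings more thoroughly): it drops script/style blocks and tags, then collapses whitespace so the classifier sees plain text.

```javascript
// Sketch: reduce a raw HTML email body to plain text for AI classification.
function cleanEmailHtml(html) {
  return html
    .replace(/<(script|style)[\s\S]*?<\/\1>/gi, " ") // drop script/style blocks
    .replace(/<[^>]+>/g, " ")                        // drop remaining tags
    .replace(/&nbsp;/gi, " ")                        // common HTML entity
    .replace(/\s+/g, " ")                            // collapse whitespace
    .trim();
}
```

The cleaned text also makes the task description far more readable than the raw markup would be.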
Auto-track Amazon prices with Google Gemini & send alerts to Telegram
**AI-Powered Amazon Price Tracker to Telegram**

**Overview**

Automate your deal hunting with this intelligent Amazon price tracker. This workflow uses the power of AI to monitor any Amazon product page at regular intervals. When the price drops to or below your desired target, it instantly sends a notification to your Telegram chat. Say goodbye to manual price checking and never miss a sale, a lightning deal, or a Black Friday bargain again.

Unlike traditional scrapers that break when a website's layout changes, this workflow uses a Large Language Model (Google Gemini) to understand the page content, making it significantly more robust and reliable.

🚀 **Key Features**

- **AI-Powered Data Extraction:** Leverages Google Gemini to intelligently read the product page and extract the name and price, making it resilient to Amazon's frequent layout updates.
- **Scheduled Automation:** Set it up once with a schedule (e.g., every hour) and let it run in the background.
- **Instant Telegram Alerts:** Get real-time notifications directly in Telegram the moment a price drops to your target.
- **Centralized & Easy Configuration:** A single Set node holds all your settings (product URL, target price, and Telegram Chat ID) for quick and easy updates.
- **Built-in Error Handling:** Checks whether data was extracted correctly and sends an error alert if it fails, so you're always in the loop.
- **Cost-Efficient Processing:** Includes a pre-processing step to clean and simplify the page's HTML, reducing the amount of data sent to the AI and lowering potential token costs.

⚙️ **How It Works**

The workflow follows a clear, linear process from scheduling to notification.

1. **Initiation and Configuration:** A Schedule node triggers the workflow automatically at your chosen interval (e.g., every hour). A Set node named Config: Product & Alert acts as your control panel; here, you define the Amazon product URL, your target price, and your Telegram Chat ID.
2. **Fetch and Clean Product Page:** An HTTP Request node fetches the full HTML content of the Amazon product URL. A Code node then cleans this raw HTML, stripping out unnecessary scripts, styles, and tags and leaving only the core text content. This crucial step makes the data easier for the AI to analyze accurately and efficiently.
3. **AI-Powered Data Extraction:** An AI Agent node sends the cleaned text to the Google Gemini model, using a specific prompt that asks the AI to identify and extract the product's name (productName) and its current price (priceValue) as a number. A Structured Output Parser ensures the AI returns the data in a clean, predictable JSON format (e.g., {"productName": "...", "priceValue": 49.99}), which is essential for the next steps.
4. **Validate and Compare Price:** An IF node first validates the AI's output, checking whether a valid price was successfully extracted. If not, it routes the workflow to send an error message. If the data is valid, a second IF node compares the extracted priceValue with your priceTarget from the configuration node.
5. **Prepare and Send Telegram Notification:** If the current price is less than or equal to your target price, the workflow proceeds down the "true" path. A Set node constructs a formatted, user-friendly success message including the product name, the new low price, and a direct link to the product page. Finally, a Telegram node sends this prepared message to your specified Chat ID. If an error occurred at any stage, a corresponding error message is sent instead.

🛠️ **Setup Steps & Credentials**

To get this workflow running, you'll need to configure the following:

1. **Google Gemini:** You need a Google AI (Gemini) API key. Create a Google Gemini (PaLM) API credential in n8n and assign it to the Google Gemini Chat Model node.
2. **Telegram:** You need a Telegram Bot Token; get one by talking to @BotFather on Telegram. You also need your personal Chat ID, which you can get from a bot like @userinfobot.
   Create a Telegram credential in n8n with your Bot Token, and assign it to both the Send Success and Send Error Telegram nodes.
3. **Workflow Configuration:** Open the Config: Product & Alert node and replace the placeholder values:
   - telegramChatId: Paste your Telegram Chat ID.
   - amazonUrl: Paste the full URL of the Amazon product you want to track.
   - priceTarget: Set your desired price (e.g., 49.99).

Once configured, save the workflow and activate it using the toggle in the top-right corner.

💡 **Customization & Learning**

This workflow is a powerful template that you can easily adapt and expand:

- **Track Multiple Products:** Modify the workflow to read product URLs and target prices from a Google Sheet or Airtable base to monitor many items at once.
- **Add More Notification Channels:** Duplicate the Telegram node and add Discord, Slack, or Email nodes to receive alerts on multiple platforms.
- **Store Price History:** Connect a Google Sheets, Airtable, or database node after the AI extraction step to log the product's price over time, creating a historical price chart.
- **Switch AI Models:** Easily swap the Google Gemini node for an OpenAI or Anthropic model by adjusting the prompt and credentials.
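The validate-and-compare logic from the How It Works section can be sketched as one function. This is a sketch: the `evaluate` helper and its return shape are illustrative assumptions, while the field names `productName`, `priceValue`, and `priceTarget` match those in the description.

```javascript
// Sketch of the two IF nodes: first validate the AI output, then compare
// the extracted price against the configured target.
function evaluate(aiOutput, priceTarget) {
  const price = Number(aiOutput && aiOutput.priceValue);
  if (!aiOutput || !aiOutput.productName || !Number.isFinite(price) || price <= 0) {
    return { status: "error" }; // routes to the error Telegram message
  }
  return price <= priceTarget
    ? { status: "alert", price }    // at or below target: send the deal alert
    : { status: "no_alert", price }; // above target: do nothing this run
}
```

For example, with a target of 50, an extracted priceValue of 49.99 takes the alert path, while a missing or non-numeric price takes the error path.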