API schema extractor
This workflow automates the process of discovering and extracting APIs from various services, followed by generating custom schemas. It works in three distinct stages: research, extraction, and schema generation, with each stage tracking progress in a Google Sheet. Jim Le deserves major kudos for helping to build this sophisticated three-stage workflow, which cleverly automates API documentation processing using a combination of web scraping, vector search, and LLM technologies.

How it works

Stage 1 - Research:
- Fetches pending services from a Google Sheet
- Uses Google search to find API documentation
- Employs Apify for web scraping to filter relevant pages
- Stores webpage contents and metadata in Qdrant (vector database)
- Updates progress status in the Google Sheet (pending, ok, or error)

Stage 2 - Extraction:
- Processes services that completed research successfully
- Queries the vector store to identify products and offerings
- Runs further queries for relevant API documentation
- Uses Gemini (LLM) to extract API operations
- Records extracted operations in the Google Sheet
- Updates progress status (pending, ok, or error)

Stage 3 - Generation:
- Takes services with successful extraction
- Retrieves all API operations from the database
- Combines and groups operations into a custom schema
- Uploads the final schema to Google Drive
- Updates the final status in the sheet with the file location

Ideal for:
- Development teams needing to catalog multiple APIs
- API documentation initiatives
- Creating standardized API schema collections
- Automating API discovery and documentation

Accounts required:
- Google account (for Sheets and Drive access)
- Apify account (for web scraping)
- Qdrant database
- Gemini API access

Set up instructions:
1. Prepare your Google Sheets document with the services information. Here's an example of a Google Sheet; you can copy it and change or remove the values under the columns. Also, make sure to update the Google Sheets nodes with the correct Google Sheet ID.
2. Configure Google Sheets OAuth2 credentials, the required third-party services (Apify, Qdrant), and Gemini.
3. Ensure proper permissions for Google Drive access.
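As a rough illustration of the Stage 2 lookup, here is a minimal Node.js sketch of querying Qdrant's REST search endpoint for a service's documentation chunks. The Qdrant URL, the "api_docs" collection name, and the "service" payload field are illustrative assumptions, not the template's actual node configuration:

```javascript
// Minimal sketch (Node.js 18+): search Qdrant for doc chunks scraped in Stage 1.
const QDRANT_URL = "http://localhost:6333"; // assumption: local Qdrant instance

async function findApiDocChunks(queryEmbedding, service) {
  const res = await fetch(`${QDRANT_URL}/collections/api_docs/points/search`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      vector: queryEmbedding, // embedding of a query like "REST API endpoints"
      filter: {
        // restrict hits to pages scraped for this service in Stage 1
        must: [{ key: "service", match: { value: service } }],
      },
      limit: 10,
      with_payload: true,
    }),
  });
  const { result } = await res.json();
  return result.map((hit) => hit.payload); // page text + metadata for Gemini
}
```

In the template itself this is handled by n8n's vector store nodes; the sketch only shows the shape of the underlying call.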
Website scam risk detector with GPT-4o and SerpAPI
What It Does

This intelligent workflow simplifies the complex task of determining whether a website is legitimate or potentially a scam. By simply submitting a URL through a form, the system initiates a multi-agent evaluation process. Four dedicated AI agents, each powered by GPT-4o and connected to SerpAPI, analyze different dimensions of the website: domain and technical details, search engine signals, product and pricing patterns, and on-site content analysis. Their findings are then passed to a fifth AI agent, the Analyzer, powered by GPT-4o mini, which consolidates the data, scores the site on a scale of 1-10 for scam likelihood, and presents the findings in a clear, structured format for the user.

Who It's For

This workflow is ideal for anyone who needs to quickly and reliably assess the trustworthiness of a website. Whether you're a consumer double-checking a store before making a purchase, a small business owner validating supplier sites, a cybersecurity analyst conducting threat assessments, or a developer building fraud detection into your platform, this tool offers fast, AI-powered insights without the need for manual research or technical expertise. It's designed for both individuals and teams who value accurate, scalable scam detection.

How It Works

The process begins with a simple form submission where the user enters the URL of the website they want to investigate. Once submitted, the workflow activates four specialized AI agents, each powered by GPT-4o and connected to SerpAPI, to independently analyze the site from different angles:

- Agent 1 examines domain age, SSL certificates, and TLD trustworthiness.
- Agent 2 reviews search engine results, forum mentions, and public scam reports.
- Agent 3 analyzes product pricing patterns and brand authenticity.
- Agent 4 assesses on-site content quality, grammar, legitimacy of claims, and presence of business info.

Each agent returns its findings, which are then aggregated and passed to a fifth AI agent, the Analyzer. This final agent, powered by GPT-4o mini, evaluates all the input, assigns a scam likelihood score from 1 to 10, and compiles a neatly formatted summary with organized insights and a disclaimer for context.

Set Up

1. Obtain an OpenAI API key from platform.openai.com/api-keys.
2. Connect this OpenAI API key to the OpenAI Chat Model for all of the agents (the Analyzer plus the Domain & Technical Details, Search Engine Signals, Product & Pricing Patterns, and Content Analysis tools agents).
3. Fund your OpenAI account. GPT-4o costs roughly $0.01 per run of the workflow.
4. Create a SerpAPI account at https://serpapi.com/users/sign_up and obtain a SerpAPI key.
5. Use this key to connect the SerpAPI tool for each of the tools agents (Domain & Technical Details, Search Engine Signals, Product & Pricing Patterns, and Content Analysis).

Tip: SerpAPI allows 100 free searches each month, and this workflow uses roughly 5-15 SerpAPI searches per run. If you would like to use the workflow more often than that each month, create multiple SerpAPI accounts with an API key for each. When you exhaust the 100 free searches on one account, switch the workflow over to another account's key.

Disclaimer

This tool is designed to assist in evaluating the potential risk of websites using AI-generated insights. The scam likelihood score and analysis provided are based on publicly available information and should not be considered a definitive or authoritative assessment. This tool does not guarantee the accuracy, safety, or legitimacy of any website. Users should perform their own due diligence and use independent judgment before engaging with any site. n8n, OpenAI, their affiliates, and the creators of this workflow are not responsible for any loss, damages, or consequences arising from the use of this tool or the actions taken based on its results.
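To make the aggregation step concrete, here is a minimal Node.js sketch of what the Analyzer stage boils down to: concatenating the four agents' reports and asking GPT-4o mini for a 1-10 score. The prompt wording and the shape of `reports` are illustrative assumptions, not the template's exact node settings:

```javascript
// Sketch of the final Analyzer step: merge four agent reports, score 1-10.
async function analyze(reports /* array of 4 agent summary strings */) {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "gpt-4o-mini",
      messages: [
        {
          role: "system",
          content:
            "You consolidate scam-analysis reports. Return a 1-10 scam " +
            "likelihood score with structured reasoning and a disclaimer.",
        },
        { role: "user", content: reports.join("\n\n---\n\n") },
      ],
    }),
  });
  const data = await res.json();
  return data.choices[0].message.content; // formatted summary for the user
}
```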
Suspicious login detection
This n8n workflow is designed for security monitoring and incident response when suspicious login events are detected. It can be initiated either manually from within the n8n UI for testing or automatically triggered by a webhook when a new login event occurs. The workflow first extracts relevant data from the incoming webhook payload, including the IP address, user agent, timestamp, URL, and user ID. It then splits into three parallel processing paths.

In the first path, it queries GreyNoise's Community API to retrieve information about the investigated IP address. Depending on the classification and trust level received from GreyNoise, the alert is given a High, Medium, or Low priority. This priority is assigned based on GreyNoise's best-practices documentation on how to apply their data to analysis. Once a priority is assigned, a message is sent to a Slack channel to notify users about the alert.

The second path involves fetching geolocation data about the IP address using IP-API's Geolocation API and merging it with data from the UserParser node. This data is then combined with the data obtained from GreyNoise.

In the third path, the UserParser node queries the Userparser IP address and user agent lookup API to obtain information about the user's IP and user agent. This data is merged with the IP-API data and GreyNoise data.

The workflow then checks whether the IP address is considered an unknown threat by examining both the noise and riot fields from GreyNoise. If it is, the workflow retrieves the last 10 login records for the same user from a Postgres database. If there are any discrepancies in the login information, indicating a new location or device/browser, the user is informed via email.

Potential issues when setting up this workflow include ensuring that credentials are correctly entered for the GreyNoise and UserParser nodes, and addressing any discrepancies in the data sources that could lead to false positives or negatives in threat detection. Additionally, hardcoded API keys should be replaced with credentials for security and flexibility. Thorough testing and validation with sample data are crucial to ensure the workflow performs as expected and aligns with security incident response procedures.
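For a concrete picture of the first path, here is a hedged Node.js sketch of the GreyNoise Community API lookup and one possible priority mapping. The endpoint and the noise/riot/classification response fields are the real Community API; the exact High/Medium/Low thresholds below are illustrative, since the workflow derives them from GreyNoise's best-practices guidance:

```javascript
// Sketch: look up an IP in GreyNoise's Community API and map to a priority.
async function prioritize(ip) {
  const res = await fetch(`https://api.greynoise.io/v3/community/${ip}`, {
    headers: { key: process.env.GREYNOISE_API_KEY },
  });
  const gn = await res.json(); // { noise, riot, classification, name, ... }

  // Illustrative mapping, not GreyNoise's official rules:
  if (gn.classification === "malicious") return { gn, priority: "High" };
  if (gn.noise || gn.riot) return { gn, priority: "Low" }; // known scanner / common service
  return { gn, priority: "Medium" }; // unknown to GreyNoise: worth a closer look
}
```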
Join data from Postgres and MySQL
Query data from two different databases, then handle and unify the results in a single return.
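As a minimal sketch of the same idea outside n8n (the workflow itself would use the Postgres and MySQL nodes plus a Merge node), the following Node.js snippet queries both databases and unifies the rows on a shared key. The table and column names ("users", "orders", "user_id") and the env-var connection strings are placeholders for your own schema:

```javascript
const { Client } = require("pg");        // npm install pg
const mysql = require("mysql2/promise"); // npm install mysql2

async function joinAcrossDatabases() {
  const pg = new Client({ connectionString: process.env.PG_URL });
  await pg.connect();
  const my = await mysql.createConnection(process.env.MYSQL_URL);

  // One query per database
  const users = (await pg.query("SELECT id, name FROM users")).rows;
  const [orders] = await my.query("SELECT user_id, total FROM orders");

  // Unify: attach each user's orders in a single merged structure
  const merged = users.map((u) => ({
    ...u,
    orders: orders.filter((o) => o.user_id === u.id),
  }));

  await pg.end();
  await my.end();
  return merged; // one combined result set from two databases
}
```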
Voice-to-email response system with Telegram, OpenAI Whisper & Gmail
This workflow gives you the ability to reply to a long email with a voice note rather than having to type everything out. ChatGPT will format your audio response and create an email draft for you.

How it works

When a new email arrives in your inbox, the workflow checks whether it needs a response, and if it does, it sends a message to you on Telegram via a VoiceEmailer bot. When you reply to that message with an audio message, the second part of this workflow is triggered. It checks that the message is in the right format, transcribes the audio, and creates a draft response that shows up in the same email thread.

Set up steps

1. Add your credentials for Gmail and OpenAI.
2. Create a Telegram bot following the instructions here.
3. Connect your Telegram credentials so the workflow will use your bot.
4. Turn on the workflow and message the bot from your Telegram.
5. Find the Chat ID in the Executions tab of your workflow and enter it as a variable.
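The transcription step boils down to one Whisper API call. Here is a hedged Node.js sketch (Node 18+, global fetch/FormData); the file path and .ogg naming reflect Telegram voice notes but are otherwise illustrative:

```javascript
// Sketch: send a downloaded Telegram voice note to OpenAI's Whisper endpoint.
const fs = require("node:fs");

async function transcribeVoiceNote(path) {
  const form = new FormData();
  form.append("model", "whisper-1");
  form.append("file", new Blob([fs.readFileSync(path)]), "voice.ogg");

  const res = await fetch("https://api.openai.com/v1/audio/transcriptions", {
    method: "POST",
    headers: { Authorization: `Bearer ${process.env.OPENAI_API_KEY}` },
    body: form,
  });
  const { text } = await res.json();
  return text; // raw transcript, which the workflow then reformats into a draft
}
```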
Clone viral TikTok & Instagram reels with Apify and Gemini 2.5 Pro
Reverse engineer short-form videos from Instagram and TikTok using Gemini AI

Who's it for

Content creators, AI video enthusiasts, and digital marketers who want to analyze successful short-form videos and understand their production techniques. Perfect for anyone looking to reverse-engineer viral content or create detailed prompts for AI video generation tools like Google Veo or Sora.

How it works

This automation takes any Instagram Reel or TikTok URL and performs a forensic analysis of the video content. The workflow downloads the video, converts it to base64, and uses Google's Gemini 2.5 Pro vision API to generate an extremely detailed "Generative Manifest": a comprehensive prompt that could be used to recreate the video with AI tools.

The analysis includes:
- Visual medium identification (film stock, camera sensor, lens characteristics)
- Color grading and lighting breakdown
- Shot-by-shot deconstruction with precise timing
- Camera movement and framing details
- Subject description and action choreography
- Environmental and atmospheric details

How to set up

1. Configure API credentials: add your Apify API key for video scraping and set up Google Gemini API authentication.
2. Set up Slack integration (optional): configure Slack OAuth for result sharing and update the channel ID where results should be posted.
3. Access the form: the workflow creates a web form where you can input video URLs. The form accepts both Instagram Reel and TikTok URLs.

Requirements

- Apify account with API access for video scraping
- Google Cloud account with Gemini API enabled
- Slack workspace (optional, for sharing results)
- Videos must be publicly accessible (no private accounts)

How to customize the workflow

- Modify the analysis prompt: edit the "setbaseprompt" node to adjust the depth and focus of the video analysis.
- Add different platforms: extend the switch node to handle other video platforms.
- Integrate with other tools: replace Slack with email, Discord, or other notification systems.
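For reference, the encode-and-analyze core can be sketched in a few lines of Node.js. The model name follows the template's Gemini 2.5 Pro choice; note that inline base64 uploads are size-limited (roughly 20 MB per request), so longer clips would need Gemini's Files API instead:

```javascript
// Sketch: base64-encode a downloaded video and send it inline to Gemini.
const fs = require("node:fs");

async function analyzeVideo(path, prompt) {
  const video = fs.readFileSync(path).toString("base64");
  const url =
    "https://generativelanguage.googleapis.com/v1beta/models/" +
    `gemini-2.5-pro:generateContent?key=${process.env.GEMINI_API_KEY}`;

  const res = await fetch(url, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      contents: [{
        parts: [
          { inline_data: { mime_type: "video/mp4", data: video } },
          { text: prompt }, // the "Generative Manifest" instructions
        ],
      }],
    }),
  });
  const data = await res.json();
  return data.candidates[0].content.parts[0].text;
}
```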
Verify emails & enrich new form leads and save them to HubSpot
Use case

When collecting leads via a form you typically face a few problems:
- You often end up with a bunch of leads who don't have a valid email address
- You want to know as much about the new lead as possible, but also want to keep the form short
- After forms are submitted you have to walk through the submissions and decide which ones to add to your CRM

This workflow helps you fix all of those problems.

What this workflow does

The workflow checks every new form submission and verifies the email using Hunter.io. If the email is valid, it then tries to enrich the person using Clearbit and saves the new lead into your HubSpot CRM.

Setup

1. Add your Hunter, Clearbit, and HubSpot credentials
2. Click the Test Workflow button, enter your email, and check your HubSpot
3. Activate the workflow and use the form trigger production URL to collect your leads in a smart way

How to adjust it to your needs

- Change the form to the one you need for your use case (e.g. Typeform, Google Forms, SurveyMonkey, etc.)
- Add criteria before an account is added to your CRM, for example company size or industry. You can find some inspiration in our other template: Reach out via Email to new form submissions that meet a certain criteria
- Add more data sources to enrich the new lead with
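The verification gate is a single Hunter call. Here is a minimal Node.js sketch, assuming you only let addresses with status "valid" through; the workflow's IF node may be configured differently (e.g. also accepting "accept_all"):

```javascript
// Sketch: verify an email with Hunter.io before enrichment and CRM insert.
async function isValidEmail(email) {
  const url =
    "https://api.hunter.io/v2/email-verifier" +
    `?email=${encodeURIComponent(email)}&api_key=${process.env.HUNTER_API_KEY}`;
  const { data } = await (await fetch(url)).json();
  // Other possible statuses: invalid, accept_all, webmail, disposable, unknown
  return data.status === "valid";
}
```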
Monitor a file for changes and send an alert
This flow monitors a file for changes to its content. If the file changes, an alert is sent out and you receive it as a push notification, SMS, or voice call via SIGNL4.

Use cases:
- Log-file monitoring
- Monitoring of production data
- Integration with third-party systems via a file interface
- Etc.

Sample file "alert-data.json":

```json
{
  "Body": "Alert in building A2.",
  "Done": false,
  "eventId": "2518088743722201372_4ee5617b-2731-4d38-8e16-e4148b8fb8a0"
}
```

- Body: the alert text to be sent.
- Done: if false, this is a new alert. If true, this indicates the alert has been closed.
- eventId: the last SIGNL4 event ID, written by SIGNL4.

This flow can easily be adapted for database monitoring as well.
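Outside n8n, the same pattern looks roughly like the following Node.js sketch: poll the file, and when its content hash changes, forward the alert JSON to your SIGNL4 webhook. The team secret and the 10-second interval are placeholders:

```javascript
// Sketch: poll a file and push changed content to a SIGNL4 webhook.
const fs = require("node:fs");
const crypto = require("node:crypto");

const FILE = "alert-data.json";
let lastHash = "";

setInterval(async () => {
  const content = fs.readFileSync(FILE, "utf8");
  const hash = crypto.createHash("sha256").update(content).digest("hex");
  if (hash === lastHash) return; // content unchanged, nothing to do
  lastHash = hash;

  const alert = JSON.parse(content);
  if (alert.Done) return; // closed alerts are not re-sent in this sketch

  // Replace "your-team-secret" with your SIGNL4 team secret
  await fetch("https://connect.signl4.com/webhook/your-team-secret", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(alert), // SIGNL4 uses "Body" as the alert text
  });
}, 10_000); // check every 10 seconds
```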
When specific event created in Google Calendar, duplicate & rename Google file
Who is this template for?

This template is for everyone who has to take notes during a call:
- Talent Acquisition Managers / Talent Acquisition Specialists / Recruiters
- HR professionals
- Sales teams, customer success teams
- Product teams / User Experience Designers / anyone conducting user research interviews

Use case

This workflow allows specific events created on Google Calendar (or any other meeting scheduling tool like Calendly) to trigger the duplication and renaming of a specific template document.

Example: for each new screening call scheduled in your calendar, you want to create a draft of your screening interview template for the role, titled "{Name of the candidate} | {Date of the interview}" and located in a specific folder in your Google Drive. This workflow could then be extended to copy the link to the file into a Notion database shared with the team (see the "To go further" section).

This workflow ensures that if you're jumping from call to call, you're already set up to take notes, and every document is tidied up and sorted in a structured way!

How it works

1. The workflow starts when a new event is created in Google Calendar.
2. The Filter node then selects a specific type of event, depending on a chosen pattern (title includes a specific term, organizer is X, attendees include Y, etc.).
3. The workflow then searches for a specific folder in your Google Drive, where the file you want to duplicate is located.
4. The workflow then searches for the specific file you want to duplicate.
5. The last step duplicates and renames the file with variables from your Google Calendar event.

Set up

- Set up credentials for Google Calendar and Google Drive. You'll need a Google Workspace account.
- Set up the Filter node with the pattern you want to look for to retrieve specific events in your calendar.
- Set up the Google Drive folder you want to search in.
- Set up the Google file you want to duplicate.
- Set up variables in the last step to rename your duplicated file however you want, or add a description.

To go further

Here are a few ideas to enhance this workflow depending on your specific needs:
- Instead of a filter, separate your flow depending on your use case (e.g. you want to fetch different templates depending on the type of call).
- Switch Google Calendar for another trigger (Calendly, Hubspot...).
- 10 minutes before the event, send the duplicated Google file to the meeting organizer through Slack.
- The day after the event, if the event hasn't been cancelled, add the link to the Google file to your ATS, Hubspot, etc.
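The duplicate-and-rename step maps to a single Drive v3 files.copy call. A hedged Node.js sketch, where templateId, folderId, and the name pattern are placeholders for your own setup and the OAuth token must have Drive scope:

```javascript
// Sketch: copy a template file in Google Drive and rename it in one call.
async function copyTemplate(accessToken, templateId, folderId, candidate, date) {
  const res = await fetch(
    `https://www.googleapis.com/drive/v3/files/${templateId}/copy`,
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${accessToken}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        name: `${candidate} | ${date}`, // e.g. "Jane Doe | 2024-06-03"
        parents: [folderId],            // destination folder for the copy
      }),
    }
  );
  return res.json(); // metadata of the newly created copy
}
```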
Automated local business lead generation with Google Maps & GPT-4
AI-Powered Local Lead Generation Workflow with n8n

This workflow solves one of the biggest pain points for freelancers, agencies, and SaaS founders: finding accurate local business leads at scale without manual copy-pasting or unreliable scraping tools. Traditional lead generation is time-consuming and prone to errors. This template automates the entire process so you can focus on outreach, not data gathering.

---

What the Workflow Does

- Accepts a business type (e.g., plumbers) and city (e.g., Los Angeles) as input
- Uses AI to generate hyperlocal search terms for full neighborhood coverage
- Scrapes Google Maps results to extract business details and websites
- Filters out junk, Google-owned links, and duplicates
- Scrapes homepage HTML for each business and extracts valid emails using Regex (see the sketch after this section)
- Outputs a clean, deduplicated lead list with business names, websites, and emails

---

Everything Runs Inside n8n

With:
- OpenAI for AI-driven query generation
- Zyte API for reliable scraping
- JavaScript functions for email extraction
- Built-in filtering and batching for clean results

---

Who is This For?

- Marketing agencies doing local outreach
- Freelancers offering SEO, design, or lead gen services
- SaaS founders targeting SMBs
- Sales teams scaling outbound campaigns

---

Requirements

- n8n account (Cloud or self-hosted)
- OpenAI API key (stored in n8n credentials)
- Zyte API key (stored securely)
- Basic familiarity with Google Sheets if you want to export results

---

How to Set Up

1. Import the workflow JSON into n8n
2. Go to Credentials in n8n and add OpenAI and Zyte API keys
3. Replace placeholder credential references in the HTTP Request nodes
4. Set your search parameters (business type and city) in the designated Set node
5. Test the workflow with a single search to confirm the scraping and email extraction steps
6. Configure batching if you plan to scrape multiple neighborhoods
7. Add an output step (Google Sheets, Airtable, or CRM) to store your leads

---

How to Customize

- Update the OpenAI prompt for different search formats (e.g., service + zip code)
- Adjust the Regex pattern in the JavaScript node for additional email validation rules
- Add extra filtering logic for niche-specific keywords
- Integrate with Instantly, HubSpot, or any email-sending tool for full automation
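The email-extraction step referenced above can be sketched as a plain JavaScript function of the kind you would paste into an n8n Code node. The regex and the junk filters are illustrative, not the template's exact node code; tune them to your niche:

```javascript
// Sketch: pull email addresses out of raw homepage HTML, then drop obvious
// false positives (asset filenames that look like emails, tracking domains).
function extractEmails(html) {
  const pattern = /[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}/g;
  const found = html.match(pattern) || [];

  return [...new Set(found)].filter(
    (email) =>
      !/\.(png|jpg|jpeg|gif|svg|webp)$/i.test(email) && // image names, not emails
      !/@(sentry|example)\./i.test(email)               // common junk domains
  );
}
```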
Daily calendar summary notifications via Telegram from Google Calendar
Context

This workflow automatically sends you a daily message on Telegram summarizing all your meetings and events for the day, straight from your Google Calendar.

Who is it for?

Perfect for anyone who:
- Uses Google Calendar to manage their schedule
- Wants Telegram reminders for daily events
- Loves automation and productivity tools

Requirements

- Telegram
- Google account
- Google Calendar

Steps

Use the sticky notes in the n8n canvas to:
1. Add your Telegram and Google credentials
2. Execute and test the workflow
3. Check that you receive your daily summary on Telegram

Tutorial video: watch the YouTube tutorial video.

How does it work?

- The trigger runs every day at 7 AM.
- Your Google Calendar is checked.
- If there are events or meetings, a number > 0 is returned. Otherwise, it's 0.
- A text message is generated with a summary of all your events, including all relevant details (see the sketch below).
- If no events are found, a "no event" message is sent.

About me

I'm Yassin, a Project & Product Manager scaling tech products with data-driven project management. Feel free to connect with me on LinkedIn.
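The summary-building step can be sketched as an n8n Code node like the one below. The field names (start.dateTime, summary) follow the Google Calendar node's event output; the message wording is illustrative:

```javascript
// Sketch of an n8n Code node: build the Telegram summary from calendar items.
const events = $input.all().map((item) => item.json);

let text;
if (events.length === 0) {
  text = "No events today. Enjoy your free day!";
} else {
  const lines = events.map((e) => {
    const start = e.start?.dateTime || e.start?.date; // timed vs all-day events
    return `- ${start}: ${e.summary}`;
  });
  text = `You have ${events.length} event(s) today:\n${lines.join("\n")}`;
}

return [{ json: { text } }]; // "text" feeds the Telegram "Send Message" node
```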
Evaluation metric example: Check if tool was called
AI evaluation in n8n

This is a template for n8n's evaluation feature. Evaluation is a technique for gaining confidence that your AI workflow performs reliably, by running a test dataset containing different inputs through the workflow. By calculating a metric (score) for each input, you can see where the workflow is performing well and where it isn't.

How it works

This template shows how to calculate a workflow evaluation metric: whether a specific tool was called by an agent.

- We use an evaluation trigger to read in our dataset. It is wired up in parallel with the regular trigger so that the workflow can be started from either one.
- We make sure that the agent outputs the list of tools that it used.
- We then check whether the expected tool (from the dataset) is in that list (sketched below).
- Finally we pass this information back to n8n as a metric.
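A minimal sketch of that check as an n8n Code node: compare the expected tool from the dataset row against the tools the agent reported using, and emit a 0/1 value to feed the metric step. The node name and field names (expected_tool, intermediateSteps) are assumptions that depend on your own dataset and agent settings:

```javascript
// Sketch of an n8n Code node computing the "tool was called" metric.
// Assumes the evaluation trigger node is named "When fetching a dataset row"
// and the agent runs with "Return Intermediate Steps" enabled.
const expected = $('When fetching a dataset row').first().json.expected_tool;
const steps = $input.first().json.intermediateSteps || [];

const toolsUsed = steps.map((s) => s.action.tool);
const toolCalled = toolsUsed.includes(expected) ? 1 : 0;

return [{ json: { toolCalled } }]; // passed back to n8n as the metric value
```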