Verify emails & enrich new form leads and save them to HubSpot
## Use case

When collecting leads via a form you typically face a few problems:

- You often end up with a bunch of leads who don't have a valid email address
- You want to know as much about the new lead as possible, but also want to keep the form short
- After forms are submitted you have to walk through the submissions and decide which ones to add to your CRM

This workflow helps you fix all of those problems.

## What this workflow does

The workflow checks every new form submission and verifies the email using Hunter.io. If the email is valid, it then tries to enrich the person using Clearbit and saves the new lead into your HubSpot CRM. (A minimal sketch of the email-verification call appears at the end of this description.)

## Setup

1. Add your Hunter, Clearbit and HubSpot credentials
2. Click the Test Workflow button, enter your email and check your HubSpot
3. Activate the workflow and use the form trigger production URL to collect your leads in a smart way

## How to adjust it to your needs

- Change the form to the one your use case needs (e.g. Typeform, Google Forms, SurveyMonkey etc.)
- Add criteria before an account is added to your CRM, for example company size or industry. You can find some inspiration in our other template "Reach out via Email to new form submissions that meet a certain criteria"
- Add more data sources to save the new lead in
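If you want to try the verification step outside n8n first, here is a hedged sketch of the Hunter.io call the workflow performs. It assumes Hunter's documented v2 email-verifier endpoint and an API key in a `HUNTER_API_KEY` environment variable; adapt it to however your instance stores credentials.

```javascript
// Minimal sketch of a Hunter.io email verification call (Node 18+, global fetch).
// Assumes the v2/email-verifier endpoint and an API key in HUNTER_API_KEY.
const apiKey = process.env.HUNTER_API_KEY;

async function verifyEmail(email) {
  const url = `https://api.hunter.io/v2/email-verifier?email=${encodeURIComponent(email)}&api_key=${apiKey}`;
  const res = await fetch(url);
  if (!res.ok) throw new Error(`Hunter returned ${res.status}`);
  const { data } = await res.json();
  // Hunter reports a status such as 'valid', 'invalid', or 'accept_all';
  // treat only 'valid' as a lead worth keeping, or tune to your risk tolerance.
  return data.status === 'valid';
}

verifyEmail('jane.doe@example.com').then(ok => console.log(ok ? 'valid lead' : 'skip'));
```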
Monitor a file for changes and send an alert
This flow monitors a file for changes to its content. If the content changes, an alert is sent out and you receive it as a push notification, SMS, or voice call via SIGNL4.

Use cases:

- Log-file monitoring
- Monitoring of production data
- Integration with third-party systems via a file interface
- Etc.

Sample file "alert-data.json":

```json
{
  "Body": "Alert in building A2.",
  "Done": false,
  "eventId": "2518088743722201372_4ee5617b-2731-4d38-8e16-e4148b8fb8a0"
}
```

- Body: The alert text to be sent.
- Done: If false, this is a new alert. If true, this indicates the alert has been closed.
- eventId: The last SIGNL4 event ID, written by SIGNL4.

This flow can easily be adapted for database monitoring as well.
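As a minimal sketch of the change-detection idea (outside n8n): hash the file content on each run and compare it with the hash remembered from the previous run. The file path and state file below are hypothetical placeholders.

```javascript
// Minimal change-detection sketch: hash the watched file and compare with
// the hash stored from the previous run. Paths are hypothetical placeholders.
const fs = require('fs');
const crypto = require('crypto');

const watched = 'alert-data.json';   // file to monitor
const stateFile = '.last-hash';      // where the previous hash is remembered

const current = crypto.createHash('sha256')
  .update(fs.readFileSync(watched))
  .digest('hex');

const previous = fs.existsSync(stateFile) ? fs.readFileSync(stateFile, 'utf8') : '';

if (current !== previous) {
  fs.writeFileSync(stateFile, current);
  console.log('Content changed - trigger the SIGNL4 alert here.');
} else {
  console.log('No change detected.');
}
```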
Turn any PDF into a clean Google Doc with Mistral OCR
Upload a PDF and instantly get a neatly formatted Google Doc with all the readable text: no manual copy-paste, no messy line breaks.

## What this workflow does

- Accepts PDF uploads via a public form
- Sends the file to Mistral Cloud for high-accuracy OCR
- Detects and merges page images with their extracted text
- Cleans headers, footers, broken lines, and noise
- Creates a new Google Doc in your chosen Drive folder
- Writes the polished markdown text into the document

## What you need

- Mistral Cloud API key with OCR access
- Google Docs & Drive credentials connected in n8n
- Drive folder ID for new documents
- A PDF file to process (up to 100 MB)

## Setup

1. Import the workflow into n8n and activate credentials.
2. In "Trigger • Form Submission", copy the webhook URL and share it or embed it.
3. In "Create • Google Doc", replace the default folder ID with yours.
4. Fill in your Mistral API key under the Mistral Cloud API credentials.
5. Save and activate the workflow.
6. Visit the form, upload a PDF, name your future doc, and submit.
7. Open Drive to view your newly generated, clean Google Doc.

## Example use cases

- Convert annual reports into editable text for analysis.
- Extract readable content from scan-only invoices for bookkeeping.
- Turn magazine PDFs into draft blog posts.
- Digitize lecture handouts for quick search and annotation.
- Convert image-heavy landing pages / advertorials into editable text for AI to analyze structure and content.
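The "cleans headers, footers, broken lines" step above can be approximated with a small Code-node style snippet. This is a hedged sketch of one way to do it (drop lines that repeat on nearly every page, then re-join hard-wrapped sentences); the template's actual cleanup logic may differ.

```javascript
// Hedged sketch of OCR text cleanup: drop lines that repeat across most
// pages (likely headers/footers) and merge hard-wrapped lines.
function cleanOcrMarkdown(pages) {
  // Count how often each exact line appears across pages.
  const counts = {};
  for (const page of pages) {
    for (const line of page.split('\n')) {
      const key = line.trim();
      if (key) counts[key] = (counts[key] || 0) + 1;
    }
  }
  const repeated = new Set(
    Object.keys(counts).filter(k => counts[k] >= pages.length * 0.8)
  );

  return pages
    .map(page =>
      page
        .split('\n')
        .filter(line => !repeated.has(line.trim()))   // strip headers/footers
        .join('\n')
        // Join a line with the next when it ends mid-sentence (no punctuation).
        .replace(/([a-z,;])\n(?=[a-z])/g, '$1 ')
    )
    .join('\n\n');
}

console.log(cleanOcrMarkdown(['ACME Report\nRevenue grew\nsharply.', 'ACME Report\nCosts fell.']));
// -> "Revenue grew sharply.\n\nCosts fell."
```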
SEO competitor analysis & data logging with Semrush API and Google Sheets
# Competitor Analysis & SEO Data Logging Workflow Using the Semrush Competitor Analysis API

## Description

This workflow automates SEO competitor analysis using the Semrush Competitor Analysis API and logs the data into Google Sheets for structured reporting. It captures the domain overview, organic competitors, organic pages, and keyword-level insights, then appends them to different sheets for easy tracking.

---

## Node-by-Node Explanation

1. On form submission – Captures the website URL entered by the user.
2. Competitor Analysis – Sends the website to the Semrush Competitor Analysis API via an HTTP POST request.
3. Re format output – Extracts and formats the domain overview data.
4. Domain overview – Saves organic keywords and traffic into Google Sheets.
5. Reformat – Extracts the organic competitors list.
6. Organic Competitor – Logs competitor domains, relevance, and traffic into Google Sheets.
7. Reformat 2 – Extracts organic pages data.
8. Organic Pages – Stores page-level data such as traffic and keyword counts.
9. Reformat2 – Extracts organic keyword details.
10. organic keywords – Logs keyword data like CPC, volume, and difficulty into Google Sheets.

---

## Benefits

✅ Automated competitor tracking – No manual API calls; everything is logged in Google Sheets.
✅ Centralized SEO reporting – Data stored in structured sheets for quick access.
✅ Time-saving – Streamlines research by combining multiple reports in one workflow.
✅ Accurate insights – Direct data from the Semrush API ensures reliability.

---

## Use Cases

📊 SEO Research – Track domain performance and competitor strategies.
🔍 Competitor Monitoring – Identify competitor domains, keywords, and traffic.
📝 Content Strategy – Find top-performing organic pages and replicate content ideas.
💰 Keyword Planning – Use CPC and difficulty data to prioritize profitable keywords.
📈 Client Reporting – Generate ready-to-use SEO competitor analysis reports in Google Sheets.
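For orientation, here is a hedged sketch of what a "Reformat" step does: parse a semicolon-separated Semrush-style report into row objects ready for a Google Sheets append. The column names shown are examples; the actual fields depend on the report type you request from the API.

```javascript
// Hedged sketch of a "Reformat" step: parse a semicolon-separated Semrush
// response into row objects ready for a Google Sheets append node.
function parseSemrushReport(raw) {
  const [header, ...rows] = raw.trim().split('\n');
  const columns = header.split(';');
  return rows.map(line => {
    const values = line.split(';');
    return Object.fromEntries(columns.map((col, i) => [col, values[i]]));
  });
}

const sample = 'Domain;Organic Keywords;Organic Traffic\nexample.com;1200;34000';
console.log(parseSemrushReport(sample));
// -> [ { Domain: 'example.com', 'Organic Keywords': '1200', 'Organic Traffic': '34000' } ]
```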
Web crawler: Convert websites to AI-ready markdown in Google Sheets
Transform any website into a structured knowledge repository with this intelligent crawler that extracts hyperlinks from the homepage, intelligently filters images and content pages, and aggregates full Markdown-formatted content, perfect for fueling AI agents or building comprehensive company dossiers without manual effort.

## 📋 What This Template Does

This advanced workflow acts as a lightweight web crawler: it scrapes the homepage to discover all internal links (mimicking a sitemap extraction), deduplicates and validates them, separates image assets from textual pages, then fetches and converts non-image page content to clean Markdown. Results are seamlessly appended to Google Sheets for easy analysis, export, or integration into vector databases.

- Automatically discovers and processes subpage links from the homepage
- Filters out duplicates and non-HTTP links for efficient crawling
- Converts scraped content to Markdown for AI-ready formatting
- Categorizes and stores images, links, and full content in a single sheet row per site

## 🔧 Prerequisites

- Google account with Sheets access for data storage
- n8n instance (cloud or self-hosted)
- Basic understanding of URLs and web links

## 🔑 Required Credentials

Google Sheets OAuth2 API Setup:

1. Go to console.cloud.google.com → APIs & Services → Credentials
2. Click "Create Credentials" → Select "OAuth client ID" → Choose "Web application"
3. Add authorized redirect URIs: https://your-n8n-instance.com/rest/oauth2-credential/callback (replace with your n8n URL)
4. Download the client ID and secret, then add to n8n as the "Google Sheets OAuth2 API" credential type
5. During setup, grant access to Google Sheets scopes (e.g., spreadsheets) and test the connection by listing a sheet

## ⚙️ Configuration Steps

1. Import the workflow JSON into your n8n instance
2. In the "Set Website" node, update the website_url value to your target site (e.g., https://example.com)
3. Assign your Google Sheets credential to the three "Add ... to Sheet" nodes
4. Update the documentId and sheetName in those nodes to your target spreadsheet ID and sheet name/ID
5. Ensure your sheet has the columns: "Website", "Links", "Scraped Content", "Images"
6. Activate the workflow and trigger it manually to test scraping

## 🎯 Use Cases

- Knowledge base creation: Crawl a company's site to aggregate all content into Sheets, then export to Notion or a vector DB for internal wikis
- AI agent training: Extract structured Markdown from industry sites to fine-tune LLMs on domain-specific data like legal docs or tech blogs
- Competitor intelligence: Build dossiers by crawling rival websites, separating assets and text for SEO audits or market analysis
- Content archiving: Preserve dynamic sites (e.g., news portals) as static knowledge dumps for compliance or historical research

## ⚠️ Troubleshooting

- No links extracted: Verify the homepage has `<a>` tags; test with a simple site like example.com and check the HTTP response in executions
- Sheet update fails: Confirm column names match exactly (case-sensitive) and the credential has edit permissions; try a new blank sheet
- Content truncated: Google Sheets limits cells to ~50k characters; adjust the `.slice(0, 50000)` in "Add Scraped Content to Sheet" or split into multiple rows
- Rate limiting errors: Add a "Wait" node after "Scrape Links" with a 1-2 s delay if the site blocks rapid requests
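The link-discovery and filtering stages described above boil down to logic like the following hedged sketch: pull href values out of the homepage HTML, resolve them against the base URL, drop non-HTTP links and duplicates, and split image assets from content pages.

```javascript
// Hedged sketch of the link-discovery step: extract hrefs, resolve against
// the base URL, deduplicate, keep http(s) only, and split out image assets.
function extractLinks(html, baseUrl) {
  const hrefs = [...html.matchAll(/<a[^>]+href=["']([^"'#]+)["']/gi)].map(m => m[1]);
  const links = new Set();
  const images = new Set();
  for (const href of hrefs) {
    let url;
    try { url = new URL(href, baseUrl).toString(); } catch { continue; }
    if (!url.startsWith('http')) continue;           // drops mailto:, javascript:, etc.
    if (/\.(png|jpe?g|gif|svg|webp)$/i.test(url)) images.add(url);
    else links.add(url);
  }
  return { links: [...links], images: [...images] };
}

const html = '<a href="/about">About</a><a href="/logo.png">Logo</a><a href="/about">dup</a>';
console.log(extractLinks(html, 'https://example.com'));
// -> { links: ['https://example.com/about'], images: ['https://example.com/logo.png'] }
```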
Track and analyze Forex news trading results with MyFxBook and Google Sheets
# Forex News Trading Result Data

This n8n template simulates the results of taking every trade signaled by the previous workflow: https://n8n.io/workflows/8340-automated-forex-news-alert-system-with-forex-factory-and-telegram/

## Use Cases

- Analyze which news events have a higher probability of generating profit.
- Decide on Take Profit and Stop Loss levels for specific currency pairs or news events.

## Currency Pairs

EURUSD, GBPUSD, AUDUSD, NZDUSD, USDJPY, USDCHF, USDCAD, XAUUSD

## Limitations

- We use High, Low, and Live Prices from MyFxBook. These values may differ depending on the broker.
- Spread widening and slippage will vary across brokers.
- Price gaps may occur over weekends.
- Profit/Loss also depends on when the trade is closed.

## How It Works

1. Each day, the workflow checks whether the Sheets have empty High and Low Price values.
2. It queries MyFxBook for updated High and Low Prices for Date+1 (the next trading day after opening a buy/sell position) using the HTTP Request node.
3. The response from the HTTP Request is parsed. If new High and Low data are available, the workflow retrieves the date.
4. It scrapes the new High and Low Price data and calculates the multiplier for the respective currency pair. Example: USDJPY prices have 3 decimals, so the multiplier is 1000.
5. The Sheets are updated with High, Low, Points Up, and Points Down.
6. The workflow checks whether the buy/sell position results in a profit or a loss. Example: if we have a EURUSD Buy position and Points Up > 0, there is a possibility of profit. Points Up can be negative if the new High Price remains below the Buy position's entry price. If we have a Buy position and Points Up ≤ 0, there is a possibility of a loss. (See the sketch at the end of this description.)

## How to Use

1. Enter all required credentials.
2. Create or download a Google Sheets file (example): https://docs.google.com/spreadsheets/d/1OhrbUQEclGegk5pRWWKz5nrnMbTZGT0lxK9aJqqId4/edit?usp=drivelink
3. Run the workflow.

## Requirements

- Enable the Google Drive API in Google Cloud Console.
- Provide Google Sheets credentials.

## Need Help?

Join the Discord or ask in the Forum! Thank you!
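As referenced in step 6 above, here is a hedged illustration of the Points Up / Points Down arithmetic. The function names and the multiplier table are assumptions for illustration, not the template's exact code; only the USDJPY example (3 decimals, multiplier 1000) comes from the description.

```javascript
// Hedged sketch of the Points Up / Points Down calculation. The multiplier
// converts a price difference into points: pairs quoted with 3 decimals
// (e.g. USDJPY) use 1000; 5-decimal pairs (e.g. EURUSD) would use 100000.
const multipliers = { USDJPY: 1000, EURUSD: 100000, GBPUSD: 100000 };

function evaluateBuy(pair, entry, high, low) {
  const m = multipliers[pair];
  const pointsUp = Math.round((high - entry) * m);   // negative if high stayed below entry
  const pointsDown = Math.round((entry - low) * m);
  return {
    pointsUp,
    pointsDown,
    outcome: pointsUp > 0 ? 'possible profit' : 'possible loss',
  };
}

// EURUSD buy at 1.08500; next day high 1.08620, low 1.08410.
console.log(evaluateBuy('EURUSD', 1.085, 1.0862, 1.0841));
// -> { pointsUp: 120, pointsDown: 90, outcome: 'possible profit' }
```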
Automated competitor deal monitoring with AI segmentation & personalized email marketing
## How it works

Transform your business with intelligent deal monitoring and automated customer engagement! This AI-powered coupon aggregator continuously tracks competitor deals and creates personalized marketing campaigns that convert.

### Key Steps

1. 24/7 Deal Monitoring - Automatically scans competitor websites daily for the best deals and offers
2. Smart Customer Segmentation - Uses AI to intelligently categorize and target your customer base
3. Personalized Offer Generation - Creates tailored coupon campaigns based on customer behavior and preferences
4. Automated Email Marketing - Sends targeted email campaigns with personalized deals to the right customers
5. Performance Analytics - Tracks campaign performance and provides detailed insights and reports
6. Daily Management Reports - Delivers comprehensive analytics to the management team every morning

## Set up steps

Setup time: 10-15 minutes

1. Configure competitor monitoring - Add target websites and deal sources you want to track
2. Set up customer database - Connect your customer data source for intelligent segmentation
3. Configure email integration - Connect your email service provider for automated campaigns
4. Customize deal criteria - Define what types of deals and offers to prioritize
5. Set up analytics tracking - Configure Google Sheets or a database for performance monitoring
6. Test automation flow - Run a test cycle to ensure all integrations work smoothly

Never miss a profitable deal opportunity - let AI handle the monitoring and targeting while you focus on growth!
Star Wars language translation API for AI agents - 6 languages
Need help? Want access to this workflow plus many more paid workflows and live Q&A sessions with a top verified n8n creator? Join the community.

Complete MCP server exposing 6 Starwars Translations API operations to AI agents.

## ⚡ Quick Setup

1. Import this workflow into your n8n instance
2. Add Starwars Translations API credentials
3. Activate the workflow to start your MCP server
4. Copy the webhook URL from the MCP trigger node
5. Connect AI agents using the MCP URL

## 🔧 How it Works

This workflow converts the Starwars Translations API into an MCP-compatible interface for AI agents.

• MCP Trigger: Serves as your server endpoint for AI agent requests
• HTTP Request Nodes: Handle API calls to https://api.funtranslations.com
• AI Expressions: Automatically populate parameters via $fromAI() placeholders
• Native Integration: Returns responses directly to the AI agent

## 📋 Available Operations (6 total)

Translate (6 endpoints):

• GET /translate/cheunh: Translate to Cheunh
• GET /translate/gungan: Translate to Gungan
• GET /translate/huttese: Translate to Huttese
• GET /translate/mandalorian: Translate to Mandalorian
• GET /translate/sith: Translate to Sith
• GET /translate/yoda: Translate to Yoda

## 🤖 AI Integration

Parameter Handling: AI agents automatically provide values for:

• Path parameters and identifiers
• Query parameters and filters
• Request body data
• Headers and authentication

Response Format: Native Starwars Translations API responses with the full data structure

Error Handling: Built-in n8n HTTP request error management

## 💡 Usage Examples

Connect this MCP server to any AI agent or workflow:

• Claude Desktop: Add the MCP server URL to its configuration
• Cursor: Add the MCP server SSE URL to its configuration
• Custom AI Apps: Use the MCP URL as a tool endpoint
• API Integration: Make direct HTTP calls to the MCP endpoints

## ✨ Benefits

• Zero Setup: No parameter mapping or configuration needed
• AI-Ready: Built-in $fromAI() expressions for all parameters
• Production Ready: Native n8n HTTP request handling and logging
• Extensible: Easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
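For reference, this hedged sketch shows roughly how a $fromAI() placeholder feeds one of the endpoints. The query-parameter name (text), the $fromAI signature (name, description, type), and the JSON response shape all follow Fun Translations' and n8n's published conventions, but verify against your versions before relying on them.

```javascript
// Hedged sketch: an HTTP Request node setting where the AI agent supplies
// the `text` value at call time via n8n's $fromAI() expression.
const nodeParameters = {
  method: 'GET',
  url: 'https://api.funtranslations.com/translate/yoda',
  qs: { text: "={{ $fromAI('text', 'Sentence to translate into Yoda-speak', 'string') }}" },
};

// Equivalent raw call for testing outside n8n (Node 18+; the public
// endpoint is rate-limited without an API key):
fetch('https://api.funtranslations.com/translate/yoda?text=' + encodeURIComponent('The force is strong'))
  .then(res => res.json())
  .then(body => console.log(body.contents.translated));
```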
Daily overdue tasks report from ClickUp to Gmail for team accountability
## Description

Stay on top of missed deadlines with this automated workflow that generates daily HTML reports of overdue tasks from ClickUp. The automation runs daily, fetching all tasks with past due dates, organizing them by assignee, and delivering professional reports to keep teams accountable. Perfect for sales and business development managers who need real-time visibility on overdue ClickUp tasks to drive accountability and timely completion.

## Features

- 🚨 Complete Overdue Task List - All tasks with past due dates, status, and sprint points
- 👤 Assignee Breakdown - Organized by team member for quick follow-ups
- 🏷 Priority Tagging - Visual urgency assessment with priority indicators
- 📊 Summary Statistics - Total overdue tasks per user and team-wide metrics
- 📅 Daily Automation - Ensures no overdue task slips through the cracks
- 📧 Professional Reports - Polished HTML emails delivered via Gmail

## Setup Instructions

1. Prerequisites: Ensure you have n8n (v1.0+), ClickUp workspace access, and a Gmail account with an app-specific password
2. ClickUp API: Generate an API token in ClickUp Settings > Apps and add it to n8n credentials
3. Gmail Configuration: Set up Gmail credentials using an app-specific password (enable 2FA first)
4. Import Workflow: Load the JSON template and update the ClickUp workspace/list IDs
5. Configure Recipients: Update the email recipient list in the Gmail node
6. Test Run: Execute manually to verify data retrieval and email delivery (a sketch of the grouping step follows below)
7. Schedule: Set a daily schedule (recommended: 8 AM on weekdays)

The polished HTML report is sent automatically via Gmail, keeping managers, team leads, and stakeholders informed without manual task tracking.

Keywords: n8n ClickUp automation, overdue task report, overdue assignments, ClickUp to Gmail workflow, missed deadline tracker, sales task monitoring, overdue sprint points report.
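Here is a hedged sketch of the "organize by assignee" step. The field names mirror ClickUp's task shape (an assignees array and due_date as a millisecond-epoch string), but check them against your API version; the helper itself is illustrative, not the template's exact code.

```javascript
// Hedged sketch of grouping overdue ClickUp tasks by assignee.
function groupOverdueByAssignee(tasks) {
  const now = Date.now();
  const byAssignee = {};
  for (const task of tasks) {
    if (!task.due_date || Number(task.due_date) >= now) continue; // keep overdue only
    const assignees = task.assignees?.length ? task.assignees : [{ username: 'Unassigned' }];
    for (const a of assignees) {
      (byAssignee[a.username] ||= []).push({
        name: task.name,
        status: task.status?.status,
        daysOverdue: Math.floor((now - Number(task.due_date)) / 86400000),
      });
    }
  }
  return byAssignee;
}

const sample = [{
  name: 'Send proposal',
  due_date: String(Date.now() - 2 * 86400000),
  status: { status: 'in progress' },
  assignees: [{ username: 'alice' }],
}];
console.log(groupOverdueByAssignee(sample));
// -> { alice: [ { name: 'Send proposal', status: 'in progress', daysOverdue: 2 } ] }
```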
Automate property inspections and reporting with OpenAI, Google Sheets and Slack
## Who’s it for

Property management companies, building managers, and inspection teams who want to automate recurring property inspections, improve issue tracking, and streamline reporting.

## How it works / What it does

This n8n workflow schedules periodic property inspections using a Cron trigger. AI generates customized inspection checklists for each property, which are sent to the assigned inspectors. Inspectors submit photos and notes via a connected form or mobile app. AI analyzes these submissions to flag issues by priority (high, medium, low). High-priority issues are routed to managers via Slack/email, while routine notes are logged for reporting. The workflow also generates weekly or monthly summary reports and can optionally notify tenants of resolved issues.

## How to set up

1. Configure the Cron trigger with your desired inspection frequency.
2. Connect Google Sheets or your CRM to fetch property and tenant data.
3. Set up the OpenAI node with your API key and checklist-generation prompts.
4. Configure email/SMS notifications for inspectors.
5. Connect a form or mobile app via Webhook to collect inspection data.
6. Set up Slack/email notifications for managers.
7. Log all inspection results, photos, and flagged issues into Google Sheets.
8. Configure summary report email recipients.

## Requirements

- n8n account with Google Sheets, Email, Slack, Webhook, and OpenAI nodes.
- Property and tenant data stored in Google Sheets or a CRM.
- OpenAI API credentials for AI checklist generation and note analysis.

## How to customize the workflow

- Adjust the Cron frequency to match your inspection schedule.
- Customize the AI prompts for property-specific checklist items.
- Add or remove branches for issue severity (high/medium/low).
- Include additional notification channels if needed (Teams, SMS, etc.).

## Workflow Use Case

Automates property inspections for property management teams: no inspections are missed, AI-generated checklists standardize the process, and potential issues are flagged and routed efficiently. Saves time, improves compliance, and increases tenant satisfaction.

Created by QuarterSmart | Hyrum Hurst
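The AI note-analysis step can be pictured with the hedged sketch below: classify an inspector's note as high/medium/low via the OpenAI chat completions API. The prompt and model choice are illustrative assumptions, not the template's exact configuration.

```javascript
// Hedged sketch of severity classification for an inspection note using the
// OpenAI chat completions API (Node 18+, API key in OPENAI_API_KEY).
async function classifySeverity(note) {
  const res = await fetch('https://api.openai.com/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${process.env.OPENAI_API_KEY}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      model: 'gpt-4o-mini',
      messages: [
        { role: 'system', content: 'Classify the inspection note as exactly one of: high, medium, low.' },
        { role: 'user', content: note },
      ],
    }),
  });
  const body = await res.json();
  return body.choices[0].message.content.trim().toLowerCase();
}

classifySeverity('Water leaking from ceiling in unit 4B').then(console.log); // likely "high"
```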
Track software vulnerability patents with ScrapeGraphAI, Matrix, and Intercom
# Software Vulnerability Patent Tracker

⚠️ COMMUNITY TEMPLATE DISCLAIMER: This is a community-contributed template that uses ScrapeGraphAI (a community node). Please ensure you have the ScrapeGraphAI community node installed in your n8n instance before using this template.

This workflow automatically tracks newly published patent filings that mention software-security vulnerabilities, buffer-overflow mitigation techniques, and related technology keywords. Every week it aggregates fresh patent data from USPTO and international patent databases, filters it by relevance, and delivers a concise JSON digest (and an optional Intercom notification) to R&D teams and patent attorneys.

## Pre-conditions / Requirements

### Prerequisites

- n8n instance (self-hosted or n8n cloud, v1.7.0+)
- ScrapeGraphAI community node installed
- Basic understanding of patent search syntax (for customizing keyword sets)
- Optional: Intercom account for in-app alerts

### Required Credentials

| Credential | Purpose |
|------------|---------|
| ScrapeGraphAI API Key | Enables ScrapeGraphAI nodes to fetch and parse patent-office webpages |
| Intercom Access Token (optional) | Sends weekly digests directly to an Intercom workspace |

### Additional Setup Requirements

| Setting | Recommended Value | Notes |
|---------|-------------------|-------|
| Cron schedule | `0 9 * * 1` | Triggers every Monday at 09:00 server time |
| Patent keyword matrix | See example CSV below | List of comma-separated keywords per tech focus |

Example keyword matrix (upload as keywords.csv or paste into the "Matrix" node):

```csv
topic,keywords
Buffer Overflow,"buffer overflow, stack smashing, stack buffer"
Memory Safety,"memory safety, safe memory allocation, pointer sanitization"
Code Injection,"SQL injection, command injection, injection prevention"
```

## How it works

Every week the workflow aggregates fresh patent data, filters it by relevance to your keyword matrix, and delivers the digest.

Key Steps:

1. Schedule Trigger: Fires weekly based on the configured cron expression.
2. Matrix (Keyword Loader): Loads the CSV-based technology keyword matrix into memory.
3. Code (Build Search Queries): Dynamically assembles patent-search URLs for each keyword group.
4. ScrapeGraphAI (Fetch Results): Scrapes USPTO, EPO, and WIPO result pages and parses titles, abstracts, publication numbers, and dates.
5. If (Relevance Filter): Removes patents older than 1 year or without vulnerability-related terms in the abstract. (A sketch of this filter appears at the end of this section.)
6. Set (Normalize JSON): Formats the remaining records into a uniform JSON schema.
7. Intercom (Notify Team): Sends a summarized digest to your chosen Intercom workspace. (Skip or disable this node if you prefer to consume the raw JSON output instead.)
8. Sticky Notes: Contain inline documentation and customization tips for future editors.

## Set up steps

Setup time: 10-15 minutes

1. Install Community Node: Navigate to "Settings → Community Nodes", search for ScrapeGraphAI, and click "Install".
2. Create Credentials: Go to "Credentials" → "New Credential" → select ScrapeGraphAI API → paste your API key. (Optional) Add an Intercom credential with a valid access token.
3. Import the Workflow: Click "Import" → "Workflow JSON" and paste the template JSON, or drag-and-drop the .json file.
4. Configure Schedule: Open the Schedule Trigger node and adjust the cron expression if a different frequency is required.
5. Upload / Edit Keyword Matrix: Open the Matrix node and paste your custom CSV, or modify the existing topics & keywords.
6. Review Search Logic: In the Code (Build Search Queries) node, review the base URLs and adjust the patent databases as needed.
7. Define Notification Channel: If using Intercom, select your Intercom credential in the Intercom node and choose the target channel.
8. Execute & Activate: Click "Execute Workflow" for a trial run and verify the output. If satisfied, switch the workflow to "Active".

## Node Descriptions

Core Workflow Nodes:

- Schedule Trigger – Initiates the workflow on a weekly cron schedule.
- Matrix – Holds the CSV keyword table and makes each row available as an item.
- Code (Build Search Queries) – Generates search URLs and attaches metadata for later nodes.
- ScrapeGraphAI – Scrapes patent listings and extracts structured fields (title, abstract, publication date, link).
- If (Relevance Filter) – Applies date and keyword relevance filters.
- Set (Normalize JSON) – Maps scraped fields into a clean JSON schema for downstream use.
- Intercom – Sends formatted patent summaries to an Intercom inbox or channel.
- Sticky Notes – Provide inline documentation and edit-history markers.

Data Flow: Schedule Trigger → Matrix → Code → ScrapeGraphAI → If → Set → Intercom

## Customization Examples

Change the data source to Google Patents:

```javascript
// In the Code node: point each item's search URL at Google Patents.
const base = 'https://patents.google.com/?q=';
items.forEach(item => {
  item.json.searchUrl = `${base}${encodeURIComponent(item.json.keywords)}&oq=${encodeURIComponent(item.json.keywords)}`;
});
return items;
```

Send the digest via Slack instead of Intercom:

```javascript
// Replace the Intercom node with a Slack node; this Code-node snippet
// builds the Slack message text from the filtered patent items.
const text =
  `🚀 New Vulnerability-related Patents (${items.length})\n` +
  items.map(i => `• <${i.json.link}|${i.json.title}>`).join('\n');
return [{ json: { text } }];
```

## Data Output Format

The workflow outputs structured JSON data:

```json
{
  "topic": "Memory Safety",
  "keywords": "memory safety, safe memory allocation, pointer sanitization",
  "title": "Memory protection for compiled binary code",
  "publicationNumber": "US20240123456A1",
  "publicationDate": "2024-03-21",
  "abstract": "Techniques for enforcing memory safety in compiled software...",
  "link": "https://patents.google.com/patent/US20240123456A1/en",
  "source": "USPTO"
}
```

## Troubleshooting

Common Issues:

- Empty Result Set – Ensure that the keywords are specific but not overly narrow; test queries manually on USPTO.
- ScrapeGraphAI Timeouts – Increase the timeout parameter in the ScrapeGraphAI node or reduce concurrent requests.

Performance Tips:

- Limit the keyword matrix to fewer than 50 rows to keep weekly runs under 2 minutes.
- Schedule the workflow during off-peak hours to reduce load on patent-office servers.

Pro Tips:

- Combine this workflow with a vector database (e.g., Pinecone) to create a semantic patent knowledge base.
- Add a "Merge" node to correlate new patents with existing vulnerability CVE entries.
- Use a second ScrapeGraphAI node to crawl citation trees and identify emerging technology clusters.
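As referenced in the Key Steps above, here is a hedged sketch of the relevance-filter logic (the 1-year window plus an abstract keyword check). The actual If node expresses this with n8n's built-in conditions rather than code.

```javascript
// Hedged sketch of the relevance filter: keep patents published within the
// last year whose abstract mentions at least one vulnerability keyword.
function isRelevant(patent, keywords) {
  const oneYearAgo = new Date();
  oneYearAgo.setFullYear(oneYearAgo.getFullYear() - 1);

  const recentEnough = new Date(patent.publicationDate) >= oneYearAgo;
  const abstractLower = (patent.abstract || '').toLowerCase();
  const mentionsKeyword = keywords.some(k => abstractLower.includes(k.toLowerCase()));

  return recentEnough && mentionsKeyword;
}

const patent = {
  publicationDate: '2024-03-21',
  abstract: 'Techniques for enforcing memory safety in compiled software...',
};
console.log(isRelevant(patent, ['memory safety', 'pointer sanitization']));
// -> true when run within a year of the publication date
```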