13 templates found

Website scraper for LLMs with Airtop

Recursive Web Scraping Use Case

Automating web scraping with recursive depth is ideal for collecting content across multiple linked pages—perfect for content aggregation, lead generation, or research projects.

What This Automation Does

This automation reads a list of URLs from a Google Sheet, scrapes each page, stores the content in a document, and adds newly discovered links back to the sheet. It continues this process for a specified number of iterations based on the defined scraping depth.

Input Parameters
- Seed URL: The starting URL for the scraping process. Example: https://example.com/
- Links must contain: Restricts followed links to those that contain this string. Example: https://example.com/
- Depth: The number of iterations (layers of links) to scrape beyond the initial set. Example: 3

How It Works
1. Reads the Seed URL from the Google Sheet.
2. Scrapes each page and saves its content to the specified document.
3. Extracts new links from each page that match the "Links must contain" string and appends them to the Google Sheet.
4. Repeats steps 2–3 the number of times specified by Depth - 1.

Setup Requirements
- Airtop API Key — free to generate.
- Credentials set up for Google Docs (requires creating a project on Google Console). Read how to.
- Credentials set up for Google Sheets.

Next Steps
- Add filtering rules: filter which links to follow based on domain, path, or content type.
- Combine with a scheduler: run this automation on a schedule to continuously explore newly discovered pages.
- Export structured data: extend the process to store extracted data in a CSV or database for analysis.

Read more about website scraping for LLMs.
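The depth-limited crawl described above can be sketched in a few lines of JavaScript. This is a minimal illustration, not Airtop's actual API: `fetchPage` is a hypothetical stand-in for the scraping step, and the in-memory `seen` set plays the role of the Google Sheet.

```javascript
// Minimal sketch of the recursive scrape: each iteration scrapes the current
// URL list and queues newly discovered links that contain the required string.
// fetchPage(url) -> { content, links } is a hypothetical stub for the Airtop step.
function crawl(seedUrl, mustContain, depth, fetchPage) {
  let queue = [seedUrl];
  const seen = new Set(queue);   // stand-in for the Google Sheet's URL column
  const pages = [];
  for (let i = 0; i < depth; i++) {
    const next = [];
    for (const url of queue) {
      const { content, links } = fetchPage(url);
      pages.push({ url, content });               // "store the content in a document"
      for (const link of links) {
        if (link.includes(mustContain) && !seen.has(link)) {
          seen.add(link);                         // "adds newly discovered links back"
          next.push(link);
        }
      }
    }
    queue = next;
  }
  return pages;
}
```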

By Airtop
8832

Generate 360 social media reports with AI - Bright Data MCP

Social Media Intelligence Workflow with Bright Data and OpenAI: get a 360° social media presence report for a person.

Who it's for
Business development professionals, recruiters, sales teams, and market researchers who need comprehensive social media intelligence on individuals for lead qualification, due diligence, partnership evaluation, or candidate assessment.

How it works
1. Enter the target person's details through the web form (name, company, location).
2. The AI Discovery Agent searches across the selected platforms using name variations.
3. A profile validator verifies discovered profiles with confidence scoring.
4. Platform-specific agents analyze each profile using Bright Data MCP tools.
5. GPT-4 synthesizes all data into a comprehensive intelligence report.
6. The report is automatically generated as a formatted Google Doc with a direct link.

Requirements
- Bright Data MCP account with PRO access (get your Bright Data API key here)
- OpenAI API key (or alternative LLM provider)
- Google Drive OAuth connection for report delivery
- n8n self-hosted instance or cloud account

How to set up
1. Update Bright Data credentials: find the "Bright Data MCP" node (look for the red warning note), replace YOURBRIGHTDATATOKENHERE with your actual token, and update UNLOCKERCODEHERE with your unlocker code.
2. Update Google Drive settings: find the "Create Empty Google Doc" node and select the target folder there.
3. Configure your LLM credentials (OpenAI or an alternative).
4. Test with your own name using the "Basic" search depth.
5. Watch the YouTube tutorial.

How to customize the workflow
- Add platforms: extend the Switch node with new cases and create corresponding prompt builders.
- Modify analysis depth: edit the platform-specific prompt builders to focus on different metrics.
- Change report format: update the final LLM Chain prompt to adjust the report structure.
- Add notifications: insert Slack or email nodes after report generation.
- Adjust confidence thresholds: modify the validators to change profile verification requirements.
- Alternative outputs: replace Google Docs with PDF, Excel, or a webhook to your CRM.
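The profile-validation step with confidence scoring could look roughly like the following sketch. The field names, weights, and threshold are assumptions for illustration, not the template's actual logic:

```javascript
// Hypothetical confidence scoring: compare a discovered profile against the
// form inputs and keep it only if the score clears a threshold.
function scoreProfile(target, profile) {
  let score = 0;
  if (profile.name && profile.name.toLowerCase().includes(target.name.toLowerCase())) score += 50;
  if (target.company && profile.bio &&
      profile.bio.toLowerCase().includes(target.company.toLowerCase())) score += 30;
  if (target.location && profile.location === target.location) score += 20;
  return score;   // 0–100
}

function validateProfiles(target, profiles, threshold = 60) {
  // Drop low-confidence matches before the platform-specific agents run.
  return profiles.filter((p) => scoreProfile(target, p) >= threshold);
}
```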

By Romuald Członkowski
7040

AI call summary to HubSpot + follow-up task

This n8n template turns raw call transcripts into clean HubSpot call logs and a single, actionable follow-up task—automatically. Paste a transcript and the contact's email; the workflow finds the contact, summarizes the conversation in 120–160 words, proposes the next best action, and (optionally) updates missing contact fields. Perfect for reps and founders who want accurate CRM hygiene without the manual busywork.

---

How it works
1. A form trigger collects two inputs: the contact email and a plain-text call transcript.
2. The workflow looks up the HubSpot contact by email to pull known properties.
3. An AI agent reads the transcript (plus known fields) to:
   - extract participants, role, problem/opportunity, requirements, blockers, timeline, and metrics;
   - write a 120–160 word recap a teammate can skim;
   - generate one concrete follow-up task (title + body);
   - suggest updates for missing contact properties (city, country, job title, job function).
4. The recap is logged to HubSpot as a completed Call engagement.
5. The follow-up is created in HubSpot as a Task with a subject and body.
6. (Optional) The contact record is updated using AI-suggested values if the transcript clearly mentions them.

---

How to use
1. Connect HubSpot (OAuth2) on all HubSpot nodes.
2. Connect OpenAI on the AI nodes.
3. Open "Form: Capture Transcript" and submit the email + transcript.
4. (Optional) In "AI: Summarize Call & Draft Task", tweak the prompt rules (word count, date normalization).
5. (Optional) In "Update Contact from Transcript", review the mapped fields before enabling in production.
6. Activate the workflow and paste transcripts after each call.

---

Requirements
- HubSpot (OAuth2) for contact search, call logging, and tasks
- OpenAI for summarization and task drafting

---

Notes & customization ideas
- Swap the form for a Google Drive or S3 watcher to ingest saved transcripts.
- Add a speech-to-text step if you store audio recordings.
- Extend "Update Contact" to include additional fields (timezone, department, seniority).
- Post the summary to Slack or email the AE for quick handoffs.
- Gate updates with a confidence check, or route low-confidence changes for manual approval.
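The "update only missing contact fields" rule can be sketched as follows. The property names and object shapes are assumptions for illustration, not HubSpot's actual schema:

```javascript
// Apply AI-suggested contact updates only where the current value is empty,
// so existing CRM data is never overwritten (hypothetical field names).
function mergeContactUpdates(contact, suggested) {
  const allowed = ["city", "country", "jobtitle", "job_function"];
  const updates = {};
  for (const field of allowed) {
    const current = contact[field];
    if ((current === undefined || current === null || current === "") && suggested[field]) {
      updates[field] = suggested[field];   // fill the gap, never replace
    }
  }
  return updates;   // pass only these fields to the HubSpot update node
}
```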

By Miha
3669

Automate TikTok video transcription with RapidAPI and Google Sheets

TikTok Transcript Generator

Overview
This automated workflow extracts transcripts from TikTok videos by reading video URLs from a Google Sheet, calling the TikTok Transcript Generator API, cleaning the subtitle data, and updating the sheet with transcripts. It handles batches, errors, and rate limits for a seamless transcription process.

Key Features
- Batch processing: reads and processes multiple TikTok video URLs from Google Sheets.
- Automatic transcript generation: uses the TikTok Transcript Generator API on RapidAPI.
- Clean subtitle output: removes timestamps and headers for clear transcripts.
- Error handling: marks videos with no available transcript.
- Rate limiting: implements wait times to avoid API throttling on RapidAPI.
- Seamless Google Sheets integration: updates the same sheet with transcript results and statuses.

API Used
TikTok Transcript Generator API

Google Sheet Columns

| Column Name | Description |
|---|---|
| Video Url | URL of the TikTok video to transcribe |
| Transcript | Generated transcript text (updated by the workflow) |
| Generated Date | Date the transcript was generated (YYYY-MM-DD) |

Workflow Nodes

| Node Name | Type | Purpose |
|---|---|---|
| When clicking 'Execute workflow' | Manual Trigger | Manually starts the entire transcription workflow. |
| Google Sheets2 | Google Sheets (Read) | Reads TikTok video URLs and transcript data from Google Sheets. |
| Loop Over Items | Split In Batches | Processes rows in smaller batches to control execution speed. |
| If | Conditional Check | Filters videos needing transcription (URL present, transcript empty). |
| HTTP Request | HTTP Request | Calls the TikTok Transcript Generator API on RapidAPI to fetch transcripts. |
| If1 | Conditional Check | Checks for valid API responses (handles 404 errors). |
| Code | Code (JavaScript) | Cleans and formats raw subtitle text by removing timestamps. |
| Google Sheets | Google Sheets (Update) | Updates the sheet with cleaned transcripts and generation dates. |
| Google Sheets1 | Google Sheets (Update) | Updates the sheet with a "No transcription available" message on error. |
| Wait | Wait | Adds a delay between batches to avoid API rate limits on RapidAPI. |

Challenges Resolved
- Manual transcription effort: eliminates the need to manually transcribe TikTok videos, saving time and reducing errors.
- API rate limits: introduces batching and wait periods to stay within API usage limits on RapidAPI.
- Incomplete or missing data: skips videos already transcribed and logs an appropriate message when no transcript exists.
- Data formatting issues: cleans raw subtitle data to produce readable, timestamp-free transcripts.
- Data synchronization: writes transcripts back to the same Google Sheet row, keeping data consistent and easy to access.

Use Cases
- Content creators wanting to transcribe TikTok videos automatically.
- Social media analysts extracting text data for research.
- Automation enthusiasts integrating transcript generation into workflows.

How to Use
1. Prepare a Google Sheet with the columns Video Url, Transcript, and Generated Date.
2. Connect your Google Sheets account in the workflow.
3. Enter your RapidAPI key for the TikTok Transcript Generator API.
4. Execute the workflow to generate transcripts.
5. View transcripts and generated dates directly in your Google Sheet.

---

Try this workflow to automate your TikTok video transcriptions efficiently! Create your free n8n account and set up the workflow in just a few minutes using the link below:
👉 Start Automating with n8n
Save time, stay consistent, and grow your social media presence effortlessly!
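The subtitle-cleaning Code node might look something like this sketch. It assumes WebVTT-style input (header line plus `-->` cue timings); the API's real subtitle format may differ:

```javascript
// Strip the WEBVTT header, cue-timing lines, and blank lines from raw
// subtitles, leaving a plain one-line transcript (assumed input format).
function cleanSubtitles(raw) {
  const timing = /\d{2}:\d{2}:\d{2}[.,]\d{3}\s*-->\s*\d{2}:\d{2}:\d{2}[.,]\d{3}/;
  return raw
    .split("\n")
    .filter((line) =>
      line.trim() !== "" &&
      !line.startsWith("WEBVTT") &&
      !timing.test(line)
    )
    .join(" ")
    .trim();
}
```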

By Evoort Solutions
1473

Amplify social media presence with O3 and GPT-4 multi-agent team

Amplify Social Media Presence with O3 and GPT-4 Multi-Agent Team

🌍 Overview
This n8n workflow acts like a virtual social media department. A Social Media Director Agent coordinates multiple specialized AI agents (Instagram, Twitter/X, Facebook, TikTok, YouTube, and Analytics). Each agent creates or analyzes content for its platform, powered by OpenAI models. The result? A fully automated, cross-platform social media strategy—from content creation to performance tracking.

---

🟢 Section 1: Trigger & Director Setup
🔗 Nodes:
- When chat message received (Trigger) → starts the workflow whenever you send a request (e.g., "Plan a TikTok campaign for my product launch").
- Social Media Director Agent (connected to the OpenAI O3 model) → acts as the strategist.
- Think Tool → helps the Director "reason" before delegating.

💡 Beginner takeaway: This section makes your workflow interactive. You send a request → the Director decides the best approach → then it assigns tasks.
📈 Advantage: Instead of manually planning content per platform, you send one command and the AI Director handles the strategy.

---

🟦 Section 2: Specialized Social Media Agents
🔗 Nodes (each paired with GPT-4.1-mini):
- 📸 Instagram Content Creator → visual storytelling, Reels, hashtags
- 🐦 Twitter/X Strategist → viral tweets, trends, engagement
- 👥 Facebook Community Manager → audience growth, ads, group engagement
- 🎵 TikTok Video Creator → short-form video, viral trends
- 🎬 YouTube Content Planner → long-form strategy, SEO, thumbnails
- 📊 Analytics Specialist → performance insights, cross-platform reporting

💡 Beginner takeaway: Each platform has its own AI expert. They receive the Director's strategy and produce tailored content for their platform.
📈 Advantage: Instead of one-size-fits-all posts, you get optimized content per platform—increasing reach and engagement.

---

🟣 Section 3: Models & Connections
🔗 Nodes: OpenAI Chat Models (O3 + multiple GPT-4.1-mini models). Each model is connected to its respective agent.

💡 Beginner takeaway: Think of these as the "brains" behind each specialist. The Director uses O3 for advanced reasoning, while the specialists use GPT-4.1-mini (cheaper, faster) for content generation.
📈 Advantage: This keeps costs low while maintaining quality output.

---

📊 Final Overview Table

| Section | Nodes | Purpose | Benefit |
|---|---|---|---|
| 🟢 Trigger & Director | Chat Trigger, Director, Think Tool | Capture requests & plan strategy | One command → full social plan |
| 🟦 Specialists | Instagram, Twitter, Facebook, TikTok, YouTube, Analytics | Platform-specific content | Optimized posts per platform |
| 🟣 Models | O3 + GPT-4.1-mini | Provide reasoning & content generation | High-quality, cost-efficient |

---

🚀 Why This Workflow is Powerful
- Multi-platform coverage: all major platforms handled in one flow
- Human-like strategy: the Director agent makes real marketing decisions
- Scalable & fast: generate a full campaign in minutes
- Cost-effective: O3 only for strategy, GPT-4.1-mini for bulk content
- Beginner-friendly: just type your request → get full campaign output

---
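The Director-to-specialists fan-out can be pictured with a small sketch. The specialist functions and their outputs here are illustrative stand-ins for the per-platform LLM agents, not the template's actual prompts:

```javascript
// Illustrative delegation: the director takes one brief and fans it out to
// whichever platform specialists exist (each function stands in for an agent).
const specialists = {
  instagram: (brief) => `Reel script + hashtags for: ${brief}`,
  twitter:   (brief) => `Thread outline for: ${brief}`,
  youtube:   (brief) => `Long-form video plan for: ${brief}`,
};

function directCampaign(brief, platforms) {
  return platforms
    .filter((p) => p in specialists)   // skip platforms with no specialist agent
    .map((p) => ({ platform: p, content: specialists[p](brief) }));
}
```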

By Yaron Been
1346

Save Qualys reports to TheHive

Automate Report Generation with n8n & Qualys

Introducing the Save Qualys Reports to TheHive workflow—a robust solution designed to automate the retrieval and storage of Qualys reports in TheHive. This workflow fetches reports from Qualys, filters out already processed reports, and creates cases in TheHive for the new ones. It runs every hour to ensure continuous monitoring and up-to-date vulnerability management, making it ideal for Security Operations Centers (SOCs).

How It Works
1. Set Global Variables: initializes global variables such as base_url and newtimestamp so the workflow runs with the correct configuration and an up-to-date timestamp. Be sure to change the global variables to match your environment.
2. Fetch Reports from Qualys: sends a GET request to the Qualys API to retrieve finished reports, ensuring timely updates and consistent data retrieval.
3. Convert XML to JSON: converts the XML response to JSON for easier data manipulation and integration into TheHive.
4. Filter Reports: checks report creation timestamps to skip already processed reports and avoid duplicates.
5. Process Each Report: loops through the list of new reports so each is handled individually, which prevents bulk-processing issues and improves reliability.
6. Create Case in TheHive: generates a new case for each report, serving as a container for the report data.
7. Download and Attach Report: downloads the report from Qualys and attaches it to the respective case in TheHive, so all data is properly archived and easily accessible for review.

Get Started
- Ensure your Qualys and TheHive integrations are properly set up.
- Customize the workflow to fit your specific vulnerability management needs.

Need Help? Join the discussion on our Forum or check out resources on Discord!

Deploy this workflow to streamline your vulnerability management process, improve response times, and enhance the efficiency of your security operations.
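The "filter out already processed reports" step (and the newtimestamp watermark it maintains) can be sketched like this. The `creationDatetime` field name is an assumption, not necessarily the Qualys response's actual key:

```javascript
// Keep only reports created after the last processed timestamp, and advance
// the watermark (newtimestamp) so the next hourly run skips them.
function filterNewReports(reports, lastTimestamp) {
  const fresh = reports.filter((r) => Date.parse(r.creationDatetime) > lastTimestamp);
  const newTimestamp = fresh.reduce(
    (max, r) => Math.max(max, Date.parse(r.creationDatetime)),
    lastTimestamp
  );
  return { fresh, newTimestamp };
}
```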

By Angel Menendez
1150

Automated local event monitor with Bright Data MCP and OpenAI analysis

This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automatically monitors local event platforms (Eventbrite, Meetup, Facebook Events) and aggregates upcoming events that match your criteria. Never miss a networking or sponsorship opportunity again.

Overview
A scheduled trigger scrapes multiple event sites via Bright Data, filtering by location, date range, and keywords. OpenAI classifies each event (conference, meetup, workshop) and extracts key details such as venue, organizers, and ticket price. Updates are posted to Slack and archived in Airtable for quick lookup.

Tools Used
- n8n – core automation engine
- Bright Data – reliable multi-site scraping
- OpenAI – NLP-based event categorization
- Slack – delivers daily event digests
- Airtable – stores enriched event records

How to Install
1. Import the workflow: add the .json file to n8n.
2. Configure Bright Data: provide your account credentials.
3. Set up OpenAI: insert your API key.
4. Connect Slack & Airtable: authorize both services.
5. Customize filters: edit the initial Set node to adjust city, radius, and keywords.

Use Cases
- Community managers: curate a calendar of relevant events.
- Sales teams: identify trade shows and meetups for prospecting.
- Event planners: track competing events when choosing dates.
- Marketers: spot speaking or sponsorship opportunities.

Connect with Me
- Website: https://www.nofluff.online
- YouTube: https://www.youtube.com/@YaronBeen/videos
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Get Bright Data: https://get.brightdata.com/1tndi4600b25 (using this link supports my free workflows with a small commission)

Tags: n8n automation, event monitoring, Bright Data, OpenAI, Slack alerts, n8n workflow, no-code, Meetup, Eventbrite
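The location/date/keyword filter applied to scraped events could be sketched as follows. The event field names (`city`, `date`, `title`, `description`) are assumptions about the scraped payload:

```javascript
// Keep only scraped events matching the configured city, date range, and at
// least one keyword (hypothetical field names for the Bright Data payload).
function filterEvents(events, { city, from, to, keywords }) {
  return events.filter((e) => {
    const date = Date.parse(e.date);
    const text = `${e.title} ${e.description}`.toLowerCase();
    return (
      e.city === city &&
      date >= Date.parse(from) &&
      date <= Date.parse(to) &&
      keywords.some((k) => text.includes(k.toLowerCase()))
    );
  });
}
```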

By Yaron Been
467

Convert task ideas to implementation plans with GPT-4o, Slack & Google Sheets

🚀 Turn your random ideas into concrete automation specs

This workflow acts as your interactive "n8n Consultant." Simply write down a rough automation idea in Google Tasks (e.g., "Send weather updates to Telegram"), and the AI will research, design, and send a detailed n8n implementation plan to your Slack.

✨ Why is this workflow special?
Unlike simple notification workflows, this one features a Human-in-the-Loop review process. You don't just get a message; you get control.
- Regenerate: Not satisfied with the AI's plan? Click a button in Slack to have the AI rewrite it instantly.
- Archive: Happy with the plan? Click "Approve" to automatically save the detailed specs to Google Sheets and mark the task as complete.

How it works
1. Fetch: The workflow periodically checks a specific Google Tasks list for new ideas.
2. AI Design: The AI (OpenAI) analyzes your idea and generates a structured plan, including node configuration and potential pitfalls.
3. Human Review: It sends the plan to Slack with interactive "Approve" and "Regenerate" buttons, then waits for your input.
   - If Regenerate: the AI re-analyzes the idea and creates a new variation.
   - If Approve: the workflow proceeds to the next step.
4. Archive: The approved plan (title, nodes, challenges) is saved to a Google Sheet for future development.
5. Close: The original Google Task is updated with a "Processed" flag.

How to set up
1. Google Tasks: Create a new list named "n8n Ideas".
2. Google Sheets: Create a new sheet with the following headers in the first row (A to H): Date Added, Idea Title, Status, Recommended Nodes, Key Challenges, Improvement Ideas, Alternatives, Source Task ID.
3. Credentials: Configure credentials for Google Tasks, Google Sheets, OpenAI, and Slack.
4. Configure nodes:
   - [Step 1] Fetch New Ideas: select your task list.
   - [Step 4] Slack — Review & Approve: select your target channel.
   - [Action] Archive to Sheets: select your spreadsheet and sheet.
   - [Close] Mark Task Done: select your task list again.

Requirements
- Google Tasks account
- Google Sheets account
- OpenAI API key
- Slack account
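The approve/regenerate loop can be condensed into a sketch. In the real workflow, `generatePlan` is the OpenAI step and `getDecision` is the Slack wait-for-button step; both are stand-ins here, and the round cap is an assumption:

```javascript
// Human-in-the-loop review: regenerate the plan until "approve" is clicked
// (or a round cap is hit), then signal the archive step.
function reviewLoop(idea, generatePlan, getDecision, maxRounds = 3) {
  for (let round = 0; round < maxRounds; round++) {
    const plan = generatePlan(idea, round);        // AI drafts (or redrafts) the spec
    if (getDecision(plan) === "approve") {
      return { status: "approved", plan };         // → archive to Sheets, mark task done
    }
    // "regenerate" pressed: loop and create a new variation
  }
  return { status: "unresolved", plan: null };
}
```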

By Shun Nakayama
297

Create an intelligent FAQ Telegram bot with Google Gemini and Supabase

Overview
This template creates a smart FAQ bot on Telegram, powered by Google Gemini for intelligent answers and Supabase to store user data. The workflow can distinguish between new and existing users.

How It Works
1. Trigger: The workflow starts when a user sends any message to the Telegram bot.
2. Check User: It looks up the user's chat_id in a Supabase telegram_users table.
3. Route (New vs. Existing):
   - New user (true path): if the user is not found, the workflow saves their chat_id to Supabase and sends a one-time welcome message.
   - Existing user (false path): if the user exists, the workflow proceeds to the AI step.
4. Generate Answer: It loads a predefined FAQ context, combines it with the user's question, and sends this to Google Gemini via the AI Agent node.
5. Send Response: The AI-generated answer is sent back to the user on Telegram.

Setup Instructions
1. Telegram: Connect your Telegram credentials to the Telegram Trigger and both Send a text message nodes.
2. Supabase: Connect your Supabase credentials to the Find user in DB and Create a row nodes. You MUST have a table named telegram_users, and it MUST have a column named chat_id (type: text or varchar).
3. Google Gemini: Connect your Google Gemini (PaLM API) credentials to the Google Gemini Chat Model node.
4. (REQUIRED) Customization: Open the Set FAQ Context node and replace its contents with questions (Q) and answers (A) appropriate for your bot. Change the text in the Send a text message (Welcome Message) node as you want.
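The new-versus-existing routing reduces to a few lines. This sketch uses an in-memory set in place of the Supabase `telegram_users` table, and a stub in place of the Gemini call:

```javascript
// Route by chat_id: unknown users get registered plus a one-time welcome;
// known users get an AI answer (db is a Set standing in for Supabase,
// answerFaq stands in for the Gemini AI Agent node).
function handleMessage(chatId, question, db, answerFaq) {
  if (!db.has(chatId)) {
    db.add(chatId);                     // "Create a row" in telegram_users
    return "Welcome! Ask me anything from our FAQ.";
  }
  return answerFaq(question);           // existing user → FAQ context + Gemini
}
```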

By Mohammad Jibril
286

Qualify & route leads with GPT-4o, Clearbit, HubSpot CRM & Slack alerts

How it works
• Webhook receives lead form submissions from your website
• AI Agent (GPT-4o) analyzes lead quality using an intelligent scoring framework
• Clearbit enriches company data automatically (employee count, industry, revenue)
• Qualification score (0–100) determines routing: high-quality leads → HubSpot CRM + Slack alert; low-quality leads → Airtable for manual review
• Structured output parser ensures reliable JSON formatting every time

Set up steps
• Time to set up: 15–20 minutes
• Import the Clearbit sub-workflow first (separate workflow file included)
• Create 7 custom properties in HubSpot (qualification_score, buying_intent, urgency_level, budget_indicator, ai_summary, pain_points, recommended_action)
• Create an Airtable base with 14 columns for low-quality lead tracking
• Get Slack channel IDs for sales alerts and review requests
• Add credentials: OpenAI (GPT-4o), Clearbit API, HubSpot OAuth2, Slack OAuth2, Airtable token
• Replace the placeholder IDs in the Slack and Airtable nodes
• Configure the Clearbit Enrichment Tool node with your sub-workflow ID

What you'll need
• OpenAI API – model access for AI qualification
• Clearbit API – free tier available for company enrichment
• HubSpot – free CRM account works
• Slack – standard workspace
• Airtable – free plan works
• Website form – to send webhook data

Who this is for
Sales teams and agencies that want to automatically qualify inbound leads before they hit the CRM. Perfect for B2B companies with high lead volume that need intelligent routing.
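The score-based routing decision is simple to sketch. The threshold of 70 and the field name are assumptions for illustration, not values from the template:

```javascript
// Route a scored lead: above the threshold → HubSpot CRM plus a Slack alert;
// below it → Airtable for manual review (threshold is a hypothetical value).
function routeLead(lead, threshold = 70) {
  return lead.qualification_score >= threshold
    ? { destination: "hubspot", alert: "slack" }
    : { destination: "airtable", alert: null };
}
```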

By Greypillar
160

EPA Clean Water Act data access & compliance monitoring API integration

⚠️ ADVANCED USE ONLY - U.S. EPA Enforcement and Compliance History Online (ECHO) - Clean Water Act (CWA) REST Services MCP Server (36 operations)

🚨 This workflow is for advanced users only! Need help? Want access to this workflow plus many more paid workflows and live Q&A sessions with a top verified n8n creator? Join the community.

This MCP server contains 36 operations, significantly more than the recommended maximum number of tools for most AI clients.

🔍 Recommended Alternative
For basic use cases, seek a simplified MCP server that uses the official n8n tool implementation for the ECHO CWA REST Services if available, or an MCP server with only common operations, as it will be more efficient and easier to manage.

🛠️ Advanced Usage Requirements
BEFORE adding this MCP server to your client:
- Disable or delete unused nodes: review the sections and disable/delete those you don't need.
AFTER adding the MCP server to your client:
1. Selective tool enabling: instead of enabling all tools (the default), manually select only the specific tools you need for that workflow's MCP client.
2. Monitor performance: too many tools can slow down AI responses.

💡 Pro Tips
- Keep a maximum of 40 enabled tools: most AI clients perform better with fewer tools.
- Group related operations and enable only one group at a time.
- Use the overview note to understand what each operation group does.
- Ping me on Discord if your business needs this implemented professionally.

⚡ Quick Setup
1. Import this workflow into your n8n instance.
2. Add your U.S. EPA ECHO CWA REST Services credentials.
3. Activate the workflow to start your MCP server.
4. Copy the webhook URL from the MCP trigger node.
5. Connect AI agents using the MCP URL.

🔧 How it Works
This workflow converts the U.S. EPA ECHO CWA REST Services API into an MCP-compatible interface for AI agents.
• MCP Trigger: serves as your server endpoint for AI agent requests
• HTTP Request Nodes: handle API calls to https://echodata.epa.gov/echo
• AI Expressions: automatically populate parameters via $fromAI() placeholders
• Native Integration: returns responses directly to the AI agent

📋 Available Operations (36 total)
Each group exposes two endpoints on the same path: a GET "submit" variant and a POST service variant.
• CwaRestServices.Get_Download (GET/POST /cwarestservices.get_download): Clean Water Act (CWA) Download Data Service
• CwaRestServices.Get_Facilities (GET/POST /cwarestservices.get_facilities): Clean Water Act (CWA) Facility Search Service
• CwaRestServices.GetFacilityInfo (GET/POST /cwarestservices.getfacilityinfo): Clean Water Act (CWA) Facility Enhanced Search Service
• CwaRestServices.Get_Geojson (GET/POST /cwarestservices.get_geojson): Clean Water Act (CWA) GeoJSON Service
• CwaRestServices.GetInfoClusters (GET/POST /cwarestservices.getinfoclusters): Clean Water Act (CWA) Info Clusters Service
• CwaRestServices.Get_Map (GET/POST /cwarestservices.get_map): Clean Water Act (CWA) Map Service
• CwaRestServices.Get_Qid (GET/POST /cwarestservices.get_qid): Clean Water Act (CWA) Paginated Results Service
• CwaRestServices.Metadata (GET/POST /cwarestservices.metadata): Clean Water Act (CWA) Metadata Service
• RestLookups.BpTribes (GET/POST /restlookups.bptribes): ECHO BP Tribes Lookup Service
• RestLookups.CwaParameters (GET/POST /restlookups.cwaparameters): ECHO CWA Parameter Lookup Service
• RestLookups.CwaPollutants (GET/POST /restlookups.cwapollutants): ECHO CWA Pollutants Lookup Service
• RestLookups.FederalAgencies (GET/POST /restlookups.federalagencies): ECHO Federal Agency Lookup Service
• RestLookups.IcisInspection_Types (GET/POST /restlookups.icisinspection_types): ECHO ICIS NPDES Inspection Types Lookup Service
• RestLookups.IcisLaw_Sections (GET/POST /restlookups.icislaw_sections): ECHO ICIS NPDES Law Sections Lookup Service
• RestLookups.NaicsCodes (GET/POST /restlookups.naicscodes): ECHO NAICS Codes Lookup Service
• RestLookups.NpdesParameters (GET/POST /restlookups.npdesparameters): ECHO NPDES Parameters Lookup Service
• RestLookups.WbdCode_Lu (GET/POST /restlookups.wbdcode_lu): ECHO WBD Code Lookup Service
• RestLookups.WbdName_Lu (GET/POST /restlookups.wbdname_lu): ECHO WBD Name Lookup Service

🤖 AI Integration
Parameter Handling: AI agents automatically provide values for:
• path parameters and identifiers
• query parameters and filters
• request body data
• headers and authentication
Response Format: native ECHO CWA REST Services API responses with full data structure
Error Handling: built-in n8n HTTP request error management

💡 Usage Examples
Connect this MCP server to any AI agent or workflow:
• Claude Desktop: add the MCP server URL to its configuration
• Cursor: add the MCP server SSE URL to its configuration
• Custom AI apps: use the MCP URL as a tool endpoint
• API integration: direct HTTP calls to the MCP endpoints

✨ Benefits
• Zero setup: no parameter mapping or configuration needed
• AI-ready: built-in $fromAI() expressions for all parameters
• Production-ready: native n8n HTTP request handling and logging
• Extensible: easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
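Under the hood, each HTTP Request node essentially assembles a URL from the ECHO base URL, an operation path, and AI-supplied parameters. The sketch below shows that assembly only; the query parameter names used in the test are placeholders, not verified ECHO parameters:

```javascript
// Assemble an ECHO request URL from an operation path and a parameter map,
// as an HTTP Request node would after $fromAI() fills in the values.
function buildEchoUrl(operation, params) {
  const base = "https://echodata.epa.gov/echo";
  const query = new URLSearchParams(params).toString();
  return query ? `${base}/${operation}?${query}` : `${base}/${operation}`;
}
```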

By David Ashby
117

Automatically update Amazon transaction memos in YNAB with AI & Gmail

Who it's for
------------
This workflow is ideal for YNAB users who frequently shop on Amazon and want their transaction memos to automatically show itemized purchase details. It's especially helpful for people who import bank transactions into YNAB and want to keep purchase records tidy without manual entry.

How it works
------------
The workflow triggers on a set schedule, via a webhook, or manually. It retrieves all unapproved transactions from your YNAB budget, filters for Amazon purchases with empty memo fields, and processes each transaction individually. Using Gmail, it searches for matching Amazon emails (within ±5 days of the transaction date) and sends the email data to an AI agent powered by OpenAI. The AI extracts product names and prices, generating a concise memo line (up to 499 characters). If no valid purchase info is found, a fallback message is added instead. A 5-second delay prevents API rate limiting.

How to set up
-------------
1. Connect your YNAB account with valid API credentials.
2. Connect Gmail with OAuth2 authentication.
3. Add your OpenAI (or other LLM) API credentials.
4. Configure the schedule trigger, or use the manual/webhook start.
5. Run the workflow and monitor execution logs in n8n.

Requirements
------------
- YNAB API credentials
- Gmail OAuth2 connection
- OpenAI API key (or another compatible AI model)

How to customize
----------------
You can change the AI model (e.g., Gemini or Claude) or add HTML-to-Markdown conversion to lower token costs. Adjust the wait node delay to fit your API rate limits, or modify the email date range for greater accuracy.

Security note: Never store or share API keys or personal email data directly in the workflow. Use credential nodes to manage sensitive information securely.
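Two details from the description above can be sketched concretely: the ±5-day email matching window and the 499-character memo cap. The item shape and fallback text are assumptions for illustration:

```javascript
// Match an Amazon email to a transaction within ±5 days, and build a memo
// line capped at YNAB's limit (item shape and fallback text are assumed).
const DAY_MS = 24 * 60 * 60 * 1000;

function withinWindow(txDate, emailDate, days = 5) {
  return Math.abs(Date.parse(txDate) - Date.parse(emailDate)) <= days * DAY_MS;
}

function buildMemo(items, fallback = "No matching Amazon purchase found") {
  if (!items.length) return fallback;                       // fallback message
  const memo = items.map((i) => `${i.name} $${i.price}`).join("; ");
  return memo.length > 499 ? memo.slice(0, 499) : memo;     // 499-char memo cap
}
```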

By Angel Menendez
111