Create, add an attachment, and send a draft using Microsoft Outlook
This workflow creates a draft, adds an attachment to it, and sends it using the Microsoft Outlook node.

- **Microsoft Outlook node**: Creates a draft message with HTML content. You can set the content type to either Text or HTML, and you can also add recipients to the draft in this node.
- **HTTP Request node**: Fetches the n8n logo from a URL and returns the binary data. If you want to fetch files from your machine, another email, or a database instead, replace this node with the relevant one.
- **Microsoft Outlook1 node**: Adds the attachment received from the previous node to the draft message created earlier.
- **Microsoft Outlook2 node**: Sends the draft message to a recipient. Since no recipient was set in the first Microsoft Outlook node, the recipient is added here. You can also enter multiple recipients.
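Under the hood, the three Outlook nodes correspond to three Microsoft Graph requests (create draft, add attachment, send draft). A minimal JavaScript sketch of the payloads involved; the helper names and sample values are illustrative, not part of the workflow:

```javascript
// Build the request body for creating a draft (what the first Outlook node sends).
function buildDraftPayload(subject, html, recipients) {
  return {
    subject,
    body: { contentType: "HTML", content: html }, // or "Text"
    toRecipients: recipients.map((address) => ({ emailAddress: { address } })),
  };
}

// Build a file attachment payload; Microsoft Graph expects base64 contentBytes.
function buildAttachmentPayload(fileName, binaryData) {
  return {
    "@odata.type": "#microsoft.graph.fileAttachment",
    name: fileName,
    contentBytes: Buffer.from(binaryData).toString("base64"),
  };
}

// The three nodes map to three Graph requests:
//   POST /me/messages                  -> create the draft
//   POST /me/messages/{id}/attachments -> attach the binary from the HTTP Request node
//   POST /me/messages/{id}/send        -> send the draft
const draft = buildDraftPayload("Monthly report", "<p>Hello!</p>", ["jane@example.com"]);
const attachment = buildAttachmentPayload("n8n-logo.png", "binary-image-data");
```

The n8n nodes handle authentication and these calls for you; the sketch only shows the shape of the data moving between them.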
Job scraping using LinkedIn, Indeed, Bright Data, and Google Sheets
## LinkedIn & Indeed Job Scraper with Bright Data & Google Sheets Export

### Overview
This n8n workflow automates scraping job listings from LinkedIn and Indeed simultaneously, combining the results, and exporting the data to Google Sheets for comprehensive job market analysis. It integrates with Bright Data for professional web scraping and Google Sheets for data storage, and provides intelligent status monitoring with retry mechanisms.

### Workflow Components

**Trigger Input Form**
- Type: Form Trigger
- Purpose: Initiates the workflow with user-defined job search criteria
- Input Fields: City (required), Job Title (required), Country (required), Job Type (optional dropdown: Full-Time, Part-Time, Remote, WFH, Contract, Internship, Freelance)
- Function: Captures user requirements to start the dual-platform job scraping process

**Format Input for APIs**
- Type: Code Node (JavaScript)
- Purpose: Prepares and formats user input for both the LinkedIn and Indeed APIs
- Processing: Standardizes location and job title formats, creates API-specific input structures, generates custom output field configurations
- Function: Ensures compatibility with both Bright Data datasets

**Start Indeed Scraping**
- Type: HTTP Request (POST)
- Purpose: Initiates Indeed job scraping via Bright Data
- Endpoint: https://api.brightdata.com/datasets/v3/trigger
- Parameters: Dataset ID `gd_lpfll7v5hcqtkxl6l`, include errors: true, type: discover_new, discover by: keyword, limit per input: 2
- Custom Output Fields: job_id, company_name, job_title, description_text, location, salary_formatted, company_rating, apply_link, url, date_posted, benefits

**Start LinkedIn Scraping**
- Type: HTTP Request (POST)
- Purpose: Initiates LinkedIn job scraping via Bright Data (parallel execution)
- Endpoint: https://api.brightdata.com/datasets/v3/trigger
- Parameters: Dataset ID `gd_l4dx9j9sscpvs7no2`, include errors: true, type: discover_new, discover by: keyword, limit per input: 2
- Custom Output Fields: job_posting_id, job_title, company_name, job_location, job_summary, job_employment_type, job_base_pay_range, apply_link, url, job_posted_date, company_logo

**Check Indeed Status / Check LinkedIn Status**
- Type: HTTP Request (GET)
- Purpose: Monitors each platform's scraping job progress
- Endpoint: https://api.brightdata.com/datasets/v3/progress/{snapshot_id}
- Function: Checks whether the dataset scraping is complete

**Wait Nodes (60 seconds each)**
- Type: Wait Node
- Purpose: Implements the polling mechanism
- Duration: 1 minute
- Function: Pauses the workflow before rechecking scraping status to prevent API overload

**Verify Indeed Completion / Verify LinkedIn Completion**
- Type: IF Condition
- Purpose: Evaluates scraping completion status
- Condition: status === "ready"
- Logic: True proceeds to data validation; False loops back to the status check with a wait

**Validate Indeed Data / Validate LinkedIn Data**
- Type: IF Condition
- Purpose: Ensures the platform returned job records
- Condition: records !== 0
- Logic: True proceeds to fetch the data; False skips data retrieval

**Fetch Indeed Data / Fetch LinkedIn Data**
- Type: HTTP Request (GET)
- Purpose: Retrieves the final job listings
- Endpoint: https://api.brightdata.com/datasets/v3/snapshot/{snapshot_id}
- Format: JSON
- Function: Downloads the completed job data

**Merge Results**
- Type: Merge Node
- Purpose: Combines the Indeed and LinkedIn job results
- Mode: Merge all inputs
- Function: Creates a unified dataset from both platforms

**Save to Google Sheet**
- Type: Google Sheets Node
- Purpose: Exports the combined job data for analysis
- Operation: Append rows
- Target: "Compare" sheet in the specified Google Sheet document
- Data Mapping: Job Title, Company Name, Location, Job Detail (description), Apply Link, Salary, Job Type, Discovery Input

### Workflow Flow

The two platform branches run in parallel:

Input Form → Format APIs → [Indeed Trigger] + [LinkedIn Trigger] → Check Status → Wait 60s (loop until ready) → Verify Ready → Validate Data → Fetch Indeed / Fetch LinkedIn → Merge Results → Save to Google Sheet

### Configuration Requirements

API Keys & Credentials:
- Bright Data API Key: Required for both LinkedIn and Indeed scraping
- Google Sheets OAuth2: For data storage and export access
- n8n Form Webhook: For user input collection

Setup Parameters:
- Google Sheet ID: Target spreadsheet identifier
- Sheet Name: "Compare" tab for job data export
- Form Webhook ID: User input form identifier
- Dataset IDs: Indeed `gd_lpfll7v5hcqtkxl6l`, LinkedIn `gd_l4dx9j9sscpvs7no2`

### Key Features

- **Dual Platform Scraping**: Simultaneous LinkedIn and Indeed job searches, parallel processing for faster results, comprehensive job market coverage, platform-specific field extraction
- **Intelligent Status Monitoring**: Real-time scraping progress tracking, automatic retry mechanisms with 60-second intervals, data validation before processing, error handling and timeout management
- **Smart Data Processing**: Unified data format from both platforms, intelligent field mapping and standardization, duplicate detection and removal, rich metadata extraction
- **Google Sheets Integration**: Automatic data export and storage, organized comparison format, historical job search tracking, easy sharing and collaboration
- **Form-Based Interface**: User-friendly job search form, flexible job type filtering, multi-country support, real-time workflow triggering

### Use Cases

- **Personal Job Search**: Comprehensive multi-platform job hunting, automated daily job searches, organized opportunity comparison, application tracking and management
- **Recruitment Services**: Client job search automation, market availability assessment, competitive salary analysis, bulk candidate sourcing
- **Market Research**: Job market trend analysis, salary benchmarking studies, skills demand assessment, geographic opportunity mapping
- **HR Analytics**: Competitor hiring intelligence, role requirement analysis, compensation benchmarking, talent market insights

### Technical Notes

- Polling Interval: 60-second status checks for both platforms
- Result Limiting: Maximum 2 jobs per input per platform
- Data Format: JSON with structured field mapping
- Error Handling: Comprehensive error tracking in all API requests
- Retry Logic: Automatic status rechecking until completion
- Country Support: Adaptable domain selection (indeed.com, fr.indeed.com)
- Form Validation: Required fields with optional job type filtering
- Merge Strategy: Combines all results from both platforms
- Export Format: Standardized Google Sheets columns for easy analysis

### Sample Data Output

| Field | Description | Example |
|-------|-------------|---------|
| Job Title | Position title | "Senior Software Engineer" |
| Company Name | Hiring organization | "Tech Solutions Inc." |
| Location | Job location | "San Francisco, CA" |
| Job Detail | Full description | "We are seeking a senior developer..." |
| Apply Link | Direct application URL | "https://company.com/careers/123" |
| Salary | Compensation info | "$120,000 - $150,000" |
| Job Type | Employment details | "Full-time, Remote" |

### Setup Instructions

1. Import Workflow: Copy the JSON configuration into n8n
2. Configure Bright Data: Add API credentials for both datasets
3. Setup Google Sheets: Create the target spreadsheet and configure OAuth
4. Update References: Replace placeholder IDs with your actual values
5. Test Workflow: Submit a test form and verify the data export
6. Activate: Enable the workflow and share the form URL with users

---

For any questions or support, please contact: info@incrementors.com or fill out this form: https://www.incrementors.com/contact-us/
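The IF conditions in the job-scraper workflow above (status === "ready" for the Verify nodes, records !== 0 for the Validate nodes) drive the polling loop. A small JavaScript sketch of that decision logic; the step names returned here are illustrative labels, not n8n node names:

```javascript
// Given a Bright Data progress response, decide what the workflow does next.
function nextStep(progress) {
  if (progress.status !== "ready") return "wait-60s-and-recheck"; // loop back via the Wait node
  if (!progress.records) return "skip-fetch";                     // scrape finished but returned no records
  return "fetch-snapshot";                                        // download the completed dataset
}

// Example progress payloads at different stages of a scrape:
console.log(nextStep({ status: "running" }));              // still scraping
console.log(nextStep({ status: "ready", records: 0 }));    // done, nothing found
console.log(nextStep({ status: "ready", records: 4 }));    // done, fetch results
```

The same three-way decision runs independently for the Indeed and LinkedIn branches before their results are merged.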
Track expenses automatically with Telegram Bot using GPT-4o, OCR and voice recognition
## Personal Expense Tracker Bot

AI-powered Telegram bot for effortless expense tracking. Send receipts, voice messages, or text, and the bot automatically extracts and categorizes your expenses.

### Key Features

- **Receipt & Invoice OCR** - Send photos of receipts or PDF invoices; AI extracts the expense data automatically
- **Voice Messages** - Speak your expenses naturally; the audio is transcribed and processed
- **Natural Language** - Just type "spent 50 on groceries" or any text format
- **Multilingual** - Processes documents in any language (EN, DE, PT, etc.)
- **Smart Statistics** - Get monthly totals, category breakdowns, and multi-month comparisons
- **Private & Secure** - Single-user authorization; only you can access your data
- **Zero Confirmation** - Expenses are added instantly, no annoying "confirm?" prompts

### How It Works

1. Send expense data via Telegram: a photo of a receipt, a PDF invoice, a voice message, or a text message.
2. AI processes it automatically: extracts the amount, date, and vendor, categorizes the expense, and stores it in an organized format.
3. Query your expenses: "Show my expenses for November", "How much did I spend on groceries?", "Compare last 3 months".

### Expense Categories

Groceries, Transportation, Housing, Utilities, Healthcare, Entertainment, Dining Out, Clothing, Education, Subscriptions, Personal Care, Gifts, Travel, Sports, Other

### Setup Requirements

**Telegram Bot**
Create a Telegram bot via @BotFather and get your API token. Configure credentials for these nodes: Input, WelcomeMessage, GetAudioFile, GetAttachedFile, GetAttachedPhoto, ReplyText, NotAuthorizedMessage, DeleteProcessing.

**OpenRouter API**
Get an API key from OpenRouter for AI processing. Configure credentials for: Gpt4o (main processing), Sonnet45 (expense assistant).

**Ainoflow API**
Get an API key from Ainoflow for storage and OCR. Configure Bearer credentials for: GetConfig, SaveConfig, ExtractFileText, ExtractImageText, TranscribeRecording, JsonStorageMcp (MCP tool).

### Workflow Architecture

| Section | Description |
|---------|-------------|
| Message Trigger | Receives all Telegram messages |
| Bot Privacy | Locks the bot to the first user, rejects unauthorized access |
| Chat Message / Audio | Routes text and voice messages to AI |
| Document / Photo | Extracts text from files via OCR and forwards it to AI |
| Root Agent | Routes messages to the Expense Assistant, validates responses |
| Expense Assistant | Core logic: stores expenses, calculates statistics |
| Result / Reply | Sends the formatted response back to Telegram |
| Cleanup / Reset | Manual trigger to delete all data (use with caution) |

### Usage Examples

Adding expenses:

- [Send receipt photo] → Added: 45.50 EUR - Groceries (Lidl)
- "Bought coffee for five euros" → Added: 5.00 EUR - Dining Out (coffee)
- "50 uber" → Added: 50.00 EUR - Transportation (uber)

Querying expenses:

- "Show my expenses" → November 2025: 1,250.50 EUR (23 expenses). Top: Groceries 450€, Transportation 280€, Dining 220€
- "How much on entertainment this month?" → Entertainment: 85.00 EUR (3 expenses)
- "Compare October and November" → Oct: 980€ | Nov: 1,250€ (+27%)

### Data Storage

Expenses are stored in JSON format organized by month (YYYY-MM):

```json
{
  "id": "uuid",
  "amount": 45.50,
  "currency": "EUR",
  "category": "Groceries",
  "description": "Store name",
  "date": "2025-11-10T14:30:00Z",
  "created_at": "2025-11-10T14:35:22Z"
}
```

### Important Notes

- First user locks the bot - run /start to claim ownership
- Default currency is EUR - AI auto-detects other currencies
- Cleanup deletes ALL data - use the manual trigger with caution
- No confirmation for adding - only delete operations ask for confirmation

### Customization

- Change the default currency in the agent prompts
- Add or modify expense categories in ExpenseAssistant
- Extend the Root Agent with additional assistants
- Adjust AI models (swap GPT-4o/Sonnet as needed)

### Related Resources

Create Telegram Bot, OpenRouter Credentials, Ainoflow Platform

### Need Customization?

Want to adapt this template for your specific needs? Custom integrations, additional features, or enterprise deployment? Contact us at Ainova Systems - we build AI automation solutions for businesses.

---

Tags: telegram, expense-tracker, ai-agent, ocr, voice-to-text, openrouter, mcp-tools, personal-finance
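The monthly totals and category breakdowns the expense bot reports amount to a simple aggregation over the stored JSON records. An illustrative JavaScript sketch of that computation; the real workflow performs it through the AI agent and the JsonStorageMcp tool, so function and field groupings here are assumptions beyond the record schema shown in the bot's data-storage section:

```javascript
// Summarize expenses for one month (month given as "YYYY-MM", matching storage keys).
function monthlySummary(expenses, month) {
  const inMonth = expenses.filter((e) => e.date.startsWith(month));
  const total = inMonth.reduce((sum, e) => sum + e.amount, 0);
  const byCategory = {};
  for (const e of inMonth) {
    byCategory[e.category] = (byCategory[e.category] ?? 0) + e.amount;
  }
  return { month, count: inMonth.length, total, byCategory };
}

// Sample records in the stored format:
const demo = [
  { amount: 45.5, category: "Groceries", date: "2025-11-10T14:30:00Z" },
  { amount: 5.0, category: "Dining Out", date: "2025-11-12T09:00:00Z" },
  { amount: 30.0, category: "Groceries", date: "2025-10-01T10:00:00Z" },
];
const nov = monthlySummary(demo, "2025-11");
```

Multi-month comparisons are just this summary computed for two keys and diffed.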
Create an intelligent FAQ Telegram bot with Google Gemini and Supabase
## Overview

This template creates a smart FAQ bot on Telegram, powered by Google Gemini for intelligent answers and Supabase to store user data. The workflow can distinguish between new and existing users.

## How It Works

1. **Trigger**: The workflow starts when a user sends any message to the Telegram bot.
2. **Check User**: It looks up the user's chat_id in a Supabase telegram_users table.
3. **Route (New vs. Existing)**:
   - New User (True path): If the user is not found, the workflow saves their chat_id to Supabase and sends a one-time welcome message.
   - Existing User (False path): If the user exists, the workflow proceeds to the AI step.
4. **Generate Answer**: It loads a predefined FAQ context and combines it with the user's question. This is sent to Google Gemini via the AI Agent node.
5. **Send Response**: The AI-generated answer is sent back to the user on Telegram.

## Setup Instructions

1. **Telegram**: Connect your Telegram credentials to the Telegram Trigger and both Send a text message nodes.
2. **Supabase**: Connect your Supabase credentials to the Find user in DB and Create a row nodes. You MUST have a table named telegram_users, and it MUST have a column named chat_id (type: text or varchar).
3. **Google Gemini**: Connect your Google Gemini (PaLM API) credentials to the Google Gemini Chat Model node.
4. **(REQUIRED) Customization**: Open the Set FAQ Context node and replace its contents with Questions (Q) and Answers (A) appropriate for your bot. Change the text in the Send a text message (Welcome Message) node as you want.
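The new-vs-existing routing above boils down to one check against the rows the Find user in DB node returns from telegram_users. A hedged JavaScript sketch; the function and branch labels are illustrative, only the chat_id column is taken from the template's requirements:

```javascript
// Decide which branch a Telegram update takes, given the Supabase lookup result.
function routeUser(rows, chatId) {
  // chat_id is stored as text/varchar, so compare as strings.
  const exists = rows.some((r) => r.chat_id === String(chatId));
  return exists ? "generate-answer" : "save-user-and-welcome";
}

console.log(routeUser([], 42));                    // unknown user: save + welcome
console.log(routeUser([{ chat_id: "42" }], 42));   // known user: straight to the AI step
```

Storing chat_id as text is why the sketch normalizes the numeric Telegram ID with String() before comparing.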
Automate faceless shorts with OpenAI, RunwayML & ElevenLabs: script to social media
## Faceless YouTube Generator - Automate Faceless YouTube Shorts with AI: Generate Scripts, Videos, Audio, and Captions in Minutes!

Tired of grinding out YouTube content? This n8n workflow turns AI into your personal video factory, creating engaging, faceless shorts on autopilot. Perfect for creators, marketers, or side-hustlers looking to monetize without showing your face. From script writing to final upload, it's all handled: AI-generated scripts (optimized for 20-second hooks), stunning image-to-video animations via RunwayML, immersive audio with ElevenLabs, and pro captions. Scale your channel effortlessly and watch the views (and revenue) roll in!

Author: LeeWei

---

## How It Works

- **Trigger & Script Magic**: Starts via a Google Sheets webhook; AI crafts a punchy 213-223 character script (via Anthropic) for ~20-second shorts, ensuring viral potential.
- **Visuals on Autopilot**: Splits scenes, generates high-quality images with OpenAI, then animates them into videos using RunwayML's Gen3a Turbo.
- **Audio Layering**: Adds soothing background sounds and professional voice-overs (Adam voice) via ElevenLabs for that polished, immersive feel.
- **Final Polish & Upload**: Merges everything in Creatomate, slaps on captions with Replicate, uploads to Google Drive/YouTube, and updates your sheet, ready to post!

Setup takes 15-20 minutes with API keys. Plug-and-play: import the JSON into n8n and you're set; edit only the API tokens and credentials as noted.

## Steps to Connect

1. **Google Sheets & Drive Setup**: Use your existing Google OAuth2 credentials in the "Google Sheets" and "Google Drive" nodes. Clone the demo sheet (ID: 1Co9tTJelYyefLLxzkoOhUBU3XuQIspBpDudtZLynW2M) and update the ID in "Update Video Status" if needed. Tweak the folder IDs in the "Upload to Drive" nodes for custom storage.
2. **Anthropic API Key**: Head to Anthropic and grab your API key, then paste it into the "Anthropic Chat Model" node credentials.
3. **OpenAI API Key**: Visit OpenAI for your API key and add it to the "OpenAI Chat Model" node credentials.
4. **RunwayML API Key**: Sign up at RunwayML and get your token. Replace "YOUR_API_TOKEN" in the "Authorization" headers of the "Generate Videos" and "Get Generated Videos" HTTP nodes.
5. **ElevenLabs API Key**: Go to ElevenLabs and generate your key. Swap in "YOUR_API_TOKEN" in the "xi-api-key" headers of the "Generate Audio (Background sound)" and "Text to Speech (Adam Voice)" nodes.
6. **Creatomate API Key**: Register at Creatomate for your token and update "YOUR_API_TOKEN" in the "Authorization" header of the "Render Video with Creatomate" HTTP node.
7. **Replicate API Key**: Get your token from Replicate and insert "YOUR_API_TOKEN" in the "Authorization" header of the "Add Captions - replicate" HTTP node.
8. **(Optional) YouTube Upload**: For direct uploads, add your Google YouTube OAuth credentials to the "Upload Video" node.
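The 213-223 character window for the script is the one hard numeric constraint in this pipeline (it keeps the voice-over near 20 seconds). If you tweak the Anthropic prompt, a small JavaScript check like the following can guard the output; this validator is an illustrative sketch, not a node from the workflow:

```javascript
// Verify an AI-generated script fits the 213-223 character window used
// by this workflow for ~20-second shorts.
function checkScript(script) {
  const length = script.trim().length;
  return { length, ok: length >= 213 && length <= 223 };
}

console.log(checkScript("a".repeat(218))); // within the window
console.log(checkScript("too short"));     // fails the check
```

Dropping this into an n8n Code node after the Anthropic step would let you re-prompt when the script falls outside the window.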
AI-powered outreach & follow-up automation (GPT-4o + Gmail + Google Sheets)
## Description

Automate your AI-powered outreach and follow-up pipeline end-to-end with GPT-4o, Gmail, and Google Sheets. This workflow personalizes emails for each lead, manages follow-ups automatically, tracks client replies, and updates CRM records in real time, all from a single Google Sheet. Ideal for sales and growth teams looking to convert leads faster without manual effort.

## What This Template Does

1. Starts manually when you click "Execute workflow."
2. Fetches all leads from the Google Sheet (sampleleads50).
3. Validates email format and filters only active (unbooked) leads.
4. Uses Azure OpenAI GPT-4o to generate short, personalized outreach emails in HTML.
5. Cleans and parses the AI output (subject + HTML body).
6. Sends the first outreach email via Gmail and stores its thread ID.
7. Waits 24 hours, then checks for a client reply in the Gmail thread.
8. If a positive reply is found, marks the lead as BOOKED and updates it in Sheets.
9. If there is no reply, triggers a polite follow-up email, waits another 24 hours, and checks the thread a second time.
10. If a second reply is found, marks the lead as BOOKED and logs the client message.
11. If there is still no response, updates the status to Declined in Google Sheets.
12. Logs invalid or incomplete leads to a separate sheet for data cleanup.

## Key Benefits

- Eliminates manual outreach and follow-up effort.
- Produces personalized, context-aware AI emails for every lead.
- Auto-tracks replies and updates CRM status with zero input.
- Prevents duplicate or repeated contact with booked clients.
- Keeps the lead database synchronized and audit-ready.

## Features

- Google Sheets integration for dynamic lead retrieval and updates.
- Regex-based email validation for clean data pipelines.
- Azure OpenAI GPT-4o for contextual email writing.
- Two-stage Gmail automation (initial + follow-up).
- JavaScript parsing for AI output and Gmail thread analysis.
- Automated 24-hour wait and recheck logic.
- Conditional branches for Booked / Declined / Invalid outcomes.
- End-to-end CRM synchronization without manual review.

## Requirements

- Google Sheets OAuth2 credentials with read/write access.
- Azure OpenAI API key for GPT-4o model access.
- Gmail OAuth2 credentials with send, read, and modify permissions.

## Environment Variables

- GOOGLE_SHEET_LEADS_ID
- GOOGLE_SHEET_OUTREACH_TAB_ID
- AZURE_OPENAI_API_KEY
- GMAIL_OAUTH_CLIENT_ID
- GMAIL_OAUTH_SECRET

## Target Audience

- Sales and Business Development teams automating outreach.
- Marketing and Growth teams running re-engagement campaigns.
- Automation and RevOps teams integrating AI lead workflows.
- Freelancers and agencies managing large prospect lists.
- Operations teams maintaining CRM cleanliness and tracking.

## Step-by-Step Setup Instructions

1. Connect your Google Sheets, Azure OpenAI, and Gmail credentials.
2. Set your Google Sheet ID and tab name (outreach automation).
3. Update the GPT-4o system prompt to match your tone and signature.
4. Verify the column headers (Company Name, Email, Booking Status, etc.).
5. Test the email validation branch with sample data.
6. Run once manually to confirm Gmail thread creation and reply detection.
7. Confirm successful CRM updates in Google Sheets.
8. Activate for continuous lead outreach and follow-up automation.
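The regex-based email validation that filters leads in step 3 can be sketched in a few lines of JavaScript; the exact pattern inside the workflow's Code node may differ, and the Email column name is taken from the header list above:

```javascript
// A permissive sanity-check pattern: something@something.tld with no whitespace.
const EMAIL_RE = /^[^\s@]+@[^\s@]+\.[^\s@]+$/;

// A lead passes validation only if its Email cell holds a plausible address.
function isValidLead(lead) {
  return typeof lead.Email === "string" && EMAIL_RE.test(lead.Email.trim());
}

console.log(isValidLead({ Email: "jane@acme.com" }));  // valid
console.log(isValidLead({ Email: "not-an-email" }));   // rejected
console.log(isValidLead({}));                          // missing Email column, rejected
```

Leads that fail this check are the ones routed to the separate cleanup sheet in step 12.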
EPA Clean Water Act data access & compliance monitoring API integration
## ADVANCED USE ONLY - U.S. EPA Enforcement and Compliance History Online (ECHO) - Clean Water Act (CWA) Rest Services MCP Server (36 operations)

This workflow is for advanced users only!

Need help? Want access to this workflow plus many more paid workflows and live Q&A sessions with a top verified n8n creator? Join the community.

This MCP server contains 36 operations, which is significantly more than the recommended maximum number of tools for most AI clients.

### Recommended Alternative for Basic Use Cases

Seek a simplified MCP server that uses the official n8n tool implementation for the U.S. EPA ECHO Clean Water Act (CWA) Rest Services if available, or an MCP server with only common operations, as it will be more efficient and easier to manage.

### Advanced Usage Requirements

BEFORE adding this MCP server to your client:
- Disable or delete unused nodes - review the sections and disable/delete those you don't need

AFTER adding the MCP server to your client:
1. Selective tool enabling - instead of enabling all tools (the default), manually select only the specific tools you need for that workflow's MCP client
2. Monitor performance - too many tools can slow down AI responses

### Pro Tips

- Keep a maximum of 40 enabled tools - most AI clients perform better with fewer tools
- Group related operations and only enable one group at a time
- Use the overview note to understand what each operation group does
- Ping me on Discord if your business needs this implemented professionally

### Quick Setup

1. Import this workflow into your n8n instance
2. Add U.S. EPA ECHO Clean Water Act (CWA) Rest Services credentials
3. Activate the workflow to start your MCP server
4. Copy the webhook URL from the MCP trigger node
5. Connect AI agents using the MCP URL

### How It Works

This workflow converts the U.S. EPA Enforcement and Compliance History Online (ECHO) - Clean Water Act (CWA) Rest Services API into an MCP-compatible interface for AI agents.

- MCP Trigger: Serves as your server endpoint for AI agent requests
- HTTP Request Nodes: Handle API calls to https://echodata.epa.gov/echo
- AI Expressions: Automatically populate parameters via $fromAI() placeholders
- Native Integration: Returns responses directly to the AI agent

### Available Operations (36 total)

**CwaRestServices.Get_Download (2 endpoints)**
- GET /cwa_rest_services.get_download: Submit CWA Download Data
- POST /cwa_rest_services.get_download: Clean Water Act (CWA) Download Data Service

**CwaRestServices.Get_Facilities (2 endpoints)**
- GET /cwa_rest_services.get_facilities: Submit CWA Facility Search
- POST /cwa_rest_services.get_facilities: Clean Water Act (CWA) Facility Search Service

**CwaRestServices.Get_Facility_Info (2 endpoints)**
- GET /cwa_rest_services.get_facility_info: Submit CWA Facility Details
- POST /cwa_rest_services.get_facility_info: Clean Water Act (CWA) Facility Enhanced Search Service

**CwaRestServices.Get_Geojson (2 endpoints)**
- GET /cwa_rest_services.get_geojson: Submit CWA GeoJSON Data
- POST /cwa_rest_services.get_geojson: Clean Water Act (CWA) GeoJSON Service

**CwaRestServices.Get_Info_Clusters (2 endpoints)**
- GET /cwa_rest_services.get_info_clusters: Submit CWA Info Clusters
- POST /cwa_rest_services.get_info_clusters: Clean Water Act (CWA) Info Clusters Service

**CwaRestServices.Get_Map (2 endpoints)**
- GET /cwa_rest_services.get_map: Submit CWA Map Data
- POST /cwa_rest_services.get_map: Clean Water Act (CWA) Map Service

**CwaRestServices.Get_Qid (2 endpoints)**
- GET /cwa_rest_services.get_qid: Submit CWA Paginated Results
- POST /cwa_rest_services.get_qid: Clean Water Act (CWA) Paginated Results Service

**CwaRestServices.Metadata (2 endpoints)**
- GET /cwa_rest_services.metadata: Submit CWA Metadata
- POST /cwa_rest_services.metadata: Clean Water Act (CWA) Metadata Service

**RestLookups.Bp_Tribes (2 endpoints)**
- GET /rest_lookups.bp_tribes: Submit BP Tribes Data
- POST /rest_lookups.bp_tribes: ECHO BP Tribes Lookup Service

**RestLookups.Cwa_Parameters (2 endpoints)**
- GET /rest_lookups.cwa_parameters: Submit CWA Parameters
- POST /rest_lookups.cwa_parameters: ECHO CWA Parameter Lookup Service

**RestLookups.Cwa_Pollutants (2 endpoints)**
- GET /rest_lookups.cwa_pollutants: Submit CWA Pollutants
- POST /rest_lookups.cwa_pollutants: ECHO CWA Pollutants Lookup Service

**RestLookups.Federal_Agencies (2 endpoints)**
- GET /rest_lookups.federal_agencies: Submit Federal Agencies
- POST /rest_lookups.federal_agencies: ECHO Federal Agency Lookup Service

**RestLookups.Icis_Inspection_Types (2 endpoints)**
- GET /rest_lookups.icis_inspection_types: Submit ICIS Inspection Types
- POST /rest_lookups.icis_inspection_types: ECHO ICIS NPDES Inspection Types Lookup Service

**RestLookups.Icis_Law_Sections (2 endpoints)**
- GET /rest_lookups.icis_law_sections: Submit ICIS Law Sections
- POST /rest_lookups.icis_law_sections: ECHO ICIS NPDES Law Sections Lookup Service

**RestLookups.Naics_Codes (2 endpoints)**
- GET /rest_lookups.naics_codes: Submit NAICS Codes
- POST /rest_lookups.naics_codes: ECHO NAICS Codes Lookup Service

**RestLookups.Npdes_Parameters (2 endpoints)**
- GET /rest_lookups.npdes_parameters: Submit NPDES Parameters
- POST /rest_lookups.npdes_parameters: ECHO NPDES Parameters Lookup Service

**RestLookups.Wbd_Code_Lu (2 endpoints)**
- GET /rest_lookups.wbd_code_lu: Submit WBD Codes
- POST /rest_lookups.wbd_code_lu: ECHO WBD Code Lookup Service

**RestLookups.Wbd_Name_Lu (2 endpoints)**
- GET /rest_lookups.wbd_name_lu: Submit WBD Names
- POST /rest_lookups.wbd_name_lu: ECHO WBD Name Lookup Service

### AI Integration

Parameter Handling: AI agents automatically provide values for:
- Path parameters and identifiers
- Query parameters and filters
- Request body data
- Headers and authentication

Response Format: Native U.S. EPA ECHO Clean Water Act (CWA) Rest Services API responses with full data structure

Error Handling: Built-in n8n HTTP request error management

### Usage Examples

Connect this MCP server to any AI agent or workflow:
- Claude Desktop: Add the MCP server URL to the configuration
- Cursor: Add the MCP server SSE URL to the configuration
- Custom AI Apps: Use the MCP URL as a tool endpoint
- API Integration: Direct HTTP calls to MCP endpoints

### Benefits

- Zero Setup: No parameter mapping or configuration needed
- AI-Ready: Built-in $fromAI() expressions for all parameters
- Production Ready: Native n8n HTTP request handling and logging
- Extensible: Easily modify or add custom logic

> Free for community use! Ready to deploy in under 2 minutes.
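As an illustration of what the HTTP Request nodes assemble from the $fromAI() values, here is a hedged JavaScript sketch of a facility-search URL. The service path and the p_st/output parameter names follow the public ECHO API documentation and should be verified against it; inside n8n the parameter values come from $fromAI() placeholders rather than plain arguments:

```javascript
// Assemble a GET URL for the CWA facility search from a parameter object.
function buildFacilitySearchUrl(base, params) {
  const qs = new URLSearchParams(params).toString();
  return `${base}/cwa_rest_services.get_facilities?${qs}`;
}

// Example: search Texas facilities and ask for JSON output.
const url = buildFacilitySearchUrl("https://echodata.epa.gov/echo", {
  p_st: "TX",      // state filter (assumed per ECHO docs)
  output: "JSON",  // response format (assumed per ECHO docs)
});
console.log(url);
```

The MCP layer's job is exactly this mapping: the AI agent supplies the parameter object, and the HTTP Request node turns it into the query string.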