15 templates found

RAG Starter Template using Simple Vector Stores, Form trigger and OpenAI

This template quickly shows how to use RAG in n8n.

Who is this for?
This template is for everyone who wants to start giving knowledge to their Agents through RAG.

Requirements
Have a PDF with custom knowledge that you want to provide to your agent.

Setup
No setup required. Just hit Execute Workflow, upload your knowledge document and then start chatting.

How to customize this to your needs
- Add custom instructions to your Agent by changing the prompts in it.
- Add a different way to load knowledge into your vector store, e.g. by looking at some Google Drive files or loading knowledge from a table.
- Exchange the Simple Vector Store nodes with your own vector store tools ready for production.
- Add a more sophisticated way to rank files found in the vector store.

For more information, read our docs on RAG in n8n.
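As a rough illustration of the "load knowledge into your vector store" customization above, the sketch below shows how an n8n Code node might pre-chunk a document's text before handing it to the vector store insert step. This is not part of the template: the `text` field name, chunk size, and overlap are all assumptions.

```javascript
// Minimal sketch (not the template's code): split extracted document text into
// overlapping chunks inside an n8n Code node, one output item per chunk.
const chunkSize = 1000; // characters per chunk (assumption)
const overlap = 200;    // characters shared between neighbouring chunks (assumption)

const chunks = [];
for (const item of $input.all()) {
  const text = item.json.text ?? '';          // assumes the PDF text arrives in `text`
  for (let start = 0; start < text.length; start += chunkSize - overlap) {
    chunks.push({ json: { chunk: text.slice(start, start + chunkSize) } });
  }
}
return chunks;
```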

By n8n Team
83500

Pattern for multiple triggers combined to continue workflow

Overview
This template describes a possible approach to handle a pseudo-callback/trigger from an independent, external process (initiated from a workflow) and combine the received input with the workflow execution that is already in progress. This requires the external system to pass through some context information (resumeUrl), but allows the "primary" workflow execution to continue with BOTH its own (previous-node) context AND the input received in the "secondary" trigger/process.

Primary Workflow Trigger/Execution
The workflow path from the primary trigger initiates some external, independent process and provides "context" which includes the value of $execution.resumeUrl. This execution then reaches a Wait node configured with Resume - On Webhook Call and stops until a call to resumeUrl is received.

External, Independent Process
The external, independent process could be anything, such as a Telegram conversation or a web service, as long as:
- it results in a single execution of the Secondary Workflow Trigger, and
- it can pass through the value of resumeUrl associated with the Primary Workflow Execution.

Secondary Workflow Trigger/Execution
The secondary workflow execution can start with any kind of trigger as long as its input can include the resumeUrl. To combine / rejoin the primary workflow execution, this execution passes along whatever it receives from its trigger input to the resume-webhook endpoint on the Wait node (see the sketch after the notes below).

Notes
- IMPORTANT: The workflow IDs in the Set nodes marked "Update Me" have embedded references to the workflow IDs in the original system. They will need to be CHANGED to make this demo work.
- Note: The Resume Other Workflow Execution node in the template uses the $env.WEBHOOK_URL configuration to convert to an internal "localhost" call in a Docker environment. This can be done differently.
- ALERT: This pattern is NOT suitable for a workflow that handles multiple items, because the first workflow execution will only be waiting for one callback.
- The second workflow (not the second trigger in the first workflow) is just there to demonstrate how the Independent, External Process needs to work.
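For concreteness, here is a minimal sketch of the call the external process (or secondary workflow) has to make: a plain Node.js snippet that POSTs its own payload to the resumeUrl it was given. The payload shape and URL are illustrative; the template itself uses an HTTP Request node rather than a script.

```javascript
// Minimal sketch (Node.js 18+): resume the waiting primary execution by POSTing
// the secondary trigger's data to the resumeUrl it received as context.
async function resumePrimaryExecution(resumeUrl, payload) {
  const response = await fetch(resumeUrl, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(payload),   // whatever the secondary trigger received
  });
  if (!response.ok) {
    throw new Error(`Resume call failed: ${response.status}`);
  }
  // The Wait node in the primary workflow now continues with this payload as input.
  return response.status;
}

// Example usage with a made-up URL and payload:
// await resumePrimaryExecution('https://n8n.example.com/webhook-waiting/1234', { answer: 'approved' });
```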

By Hubschrauber
1546

Get daily SMS updates about weather

Get daily SMS updates to tell you if you should wear a sweater

By tanaypant
1331

Namesilo bulk domain availability checker

Introduction
The Namesilo Bulk Domain Availability workflow is a powerful automation solution designed to check the registration status of multiple domains simultaneously using the Namesilo API. It efficiently processes large lists of domains by splitting them into manageable batches, adhering to API rate limits, and compiling the results into a convenient Excel spreadsheet. It eliminates the tedious process of manually checking domains one by one, saving significant time for domain investors, web developers, and digital marketers.

The workflow is particularly valuable during brainstorming sessions for new projects, when conducting domain portfolio audits, or when preparing domain acquisition strategies. By automating the domain availability check, users can quickly identify available domains for registration without the hassle of navigating through multiple web interfaces.

Who is this for?
This workflow is ideal for:
- Domain investors and flippers who need to check multiple domains quickly
- Web developers and agencies evaluating domain options for client projects
- Digital marketers researching domain availability for campaigns
- Business owners exploring domain options for new ventures
- IT professionals managing domain portfolios

Users should have basic familiarity with n8n workflow concepts and a Namesilo account to obtain an API key. No coding knowledge is required, though an understanding of domain name systems would be beneficial.

What problem is this workflow solving?
Checking domain availability one by one is a time-consuming and tedious process, especially when dealing with dozens or hundreds of potential domains. This workflow solves several key challenges:
- Manual Inefficiency: Eliminates the need to individually search for each domain through registrar websites.
- Rate Limiting: Handles API rate limits automatically with built-in waiting periods.
- Data Organization: Compiles availability results into a structured Excel file rather than scattered notes or multiple browser tabs.
- Bulk Processing: Processes up to 200 domains per batch, with the ability to handle unlimited domains across multiple batches.
- Time Management: Frees up valuable time that would otherwise be spent on repetitive manual checks.

What this workflow does
Overview: The workflow takes a list of domains, processes them in batches of up to 200 domains per request (to comply with API limitations), checks their availability using the Namesilo API, and compiles the results into an Excel spreadsheet showing which domains are available for registration and which are already taken.

Process
1. Input Setup: The workflow begins with a manual trigger and uses the "Set Data" node to collect the list of domains to check and your Namesilo API key.
2. Domain Processing: The "Convert & Split Domains" node transforms the input list into batches of up to 200 domains to comply with API limitations (see the sketch after this listing).
3. Batch Processing: The workflow loops through each batch of domains.
4. API Integration: For each batch, the "Namesilo Requests" node sends a request to the Namesilo API to check domain availability.
5. Data Parsing: The "Parse Data" node processes the API response, extracting information about which domains are available and which are taken.
6. Rate Limit Management: A 5-minute wait period is enforced between batches to respect Namesilo's API rate limits.
7. Data Compilation: The "Merge Results" node combines all the availability data.
8. Output Generation: Finally, the "Convert to Excel" node creates an Excel file with two columns: Domain and Availability (showing "Available" or "Unavailable" for each domain).

Setup
1. Import the workflow: Download the workflow JSON file and import it into your n8n instance.
2. Get a Namesilo API key: Create a free account at Namesilo and obtain your API key from https://www.namesilo.com/account/api-manager
3. Configure the workflow: Open the "Set Data" node, enter your Namesilo API key in the "Namesilo API Key" field, and enter your list of domains (one per line) in the "Domains" field.
4. Save and activate: Save the workflow and run it using the manual trigger.

How to customize this workflow to your needs
- Modify domain input format: Adjust the code in the "Convert & Split Domains" node if your domain list comes in a different format.
- Change batch size: Modify the batch size (currently set to 200) in the "Convert & Split Domains" node to accommodate different API limitations.
- Adjust wait time: If you have a premium API account with different rate limits, modify the wait time in the "Wait" node.
- Enhance output format: Customize the "Convert to Excel" node to add additional columns or formatting to the output file.
- Add domain filtering: Add a node before the API request to filter domains based on specific criteria (length, keywords, TLDs).
- Integrate with other services: Connect this workflow to domain registrars to automatically register available domains that meet your criteria.
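The listing does not include the node's code, but a "Convert & Split Domains" step of this kind typically looks something like the sketch below. The input field name and the comma-joined output format are assumptions based on the description above.

```javascript
// Sketch only: turn a newline-separated domain list into batches of up to 200
// domains, one output item per batch, ready for a Namesilo availability request.
const batchSize = 200;                                  // per-request limit from the description
const raw = $input.first().json.Domains ?? '';          // field set in the "Set Data" node (assumption)

const domains = raw
  .split('\n')
  .map(d => d.trim().toLowerCase())
  .filter(Boolean);

const batches = [];
for (let i = 0; i < domains.length; i += batchSize) {
  batches.push({ json: { domains: domains.slice(i, i + batchSize).join(',') } });
}
return batches;
```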

By n8n custom workflows
1203

Complete LAMP stack (Linux, Apache, MySQL, PHP) automated server setup

This automated n8n workflow enables the rapid setup of a complete LAMP (Linux, Apache, MySQL, PHP) stack on a Linux server, executing the entire process in approximately 10 seconds. It configures the server, installs necessary components, and sets up a development user for seamless operation.

Fundamental Aspects
- Start - Initiates the workflow
- Set Parameters - Configures server parameters
- System Preparation - Prepares the system for LAMP installation
- Update System - Updates the system and installs essential packages
- Install Apache - Sets up the Apache web server
- Install MySQL - Installs MySQL and phpMyAdmin
- Install PHP & Extensions - Installs PHP with required extensions
- Install Development Tools - Adds development utilities
- Create Development User - Creates a dedicated user for development
- Final Setup & Configuration - Finalizes configurations
- Setup Completion - Provides a summary of the setup

Setup Instructions
1. Import the workflow into n8n
2. Configure parameters in the Set Parameters node
3. Run the workflow
4. Verify the LAMP stack setup on the server

Required Resources
- Linux server with SSH access
- Root-level administrative privileges

Features
- Install Database Server - Deploys MySQL with phpMyAdmin
- Configure Web Server - Sets up Apache for web hosting
- Install PHP - Includes PHP with essential extensions
- Create Development User - Establishes a user for development tasks

Parameters to Configure
- server_host: Your Linux server IP address
- server_user: SSH username (typically 'root')
- server_password: SSH password
- php_extensions: List of PHP extensions to install
- dev_tools: List of development tools to install
- username: Development username
- user_password: Password for the development user

Workflow Actions
- Install: Deploys the LAMP stack, configures Apache, MySQL, and PHP
- Create User: Sets up a development user with appropriate permissions
- Configure: Finalizes server settings and tool installations

The workflow automatically manages:
- Ubuntu/Debian package installation
- Service startup and configuration
- Web server and database setup
- User and permission management
- Development tool integration

Update the parameters in the "Set Parameters" node with your server specifics and run the workflow!
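Purely as an illustration of how the parameters above might feed the SSH steps (this is not the template's actual code), a Code node could assemble the install command along these lines; the package names, user-creation command, and overall sequence are assumptions.

```javascript
// Illustration only: build the shell command an SSH node might run, using the
// parameter names from the "Set Parameters" node described above.
const p = $input.first().json;
const phpPackages = (p.php_extensions ?? ['mysql', 'curl', 'mbstring'])
  .map(ext => `php-${ext}`)
  .join(' ');

const command = [
  'apt-get update -y',
  'DEBIAN_FRONTEND=noninteractive apt-get install -y apache2 mysql-server',
  `apt-get install -y php libapache2-mod-php ${phpPackages}`,
  `useradd -m ${p.username} && echo '${p.username}:${p.user_password}' | chpasswd`,
  'systemctl enable --now apache2 mysql',
].join(' && ');

return [{ json: { command } }];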

By Oneclick AI Squad
549

Build a support ticket analytics dashboard with ScrapeGraphAI, Google Sheets & Slack alerts

Customer Support Analysis Dashboard with AI and Automated Insights

🎯 Target Audience
- Customer support managers and team leads
- Customer success teams monitoring satisfaction
- Product managers analyzing user feedback
- Business analysts measuring support metrics
- Operations managers optimizing support processes
- Quality assurance teams monitoring support quality
- Customer experience (CX) professionals

πŸš€ Problem Statement
Manual analysis of customer support tickets and feedback is time-consuming and often misses critical patterns or emerging issues. This template solves the challenge of automatically collecting, analyzing, and visualizing customer support data to identify trends, improve response times, and enhance overall customer satisfaction.

πŸ”§ How it Works
This workflow automatically monitors customer support channels using AI-powered analysis, processes tickets and feedback, and provides actionable insights for improving customer support operations.

Key Components
- Scheduled Trigger - Runs the workflow at specified intervals to maintain real-time monitoring
- AI-Powered Ticket Analysis - Uses advanced NLP to categorize, prioritize, and analyze support tickets
- Multi-Channel Integration - Monitors email, chat, help desk systems, and social media
- Automated Insights - Generates reports on trends, response times, and satisfaction scores
- Dashboard Integration - Stores all data in Google Sheets for comprehensive analysis and reporting

πŸ“Š Google Sheets Column Specifications
The template creates the following columns in your Google Sheets:

| Column | Data Type | Description | Example |
|--------|-----------|-------------|---------|
| timestamp | DateTime | When the ticket was processed | "2024-01-15T10:30:00Z" |
| ticket_id | String | Unique ticket identifier | "SUP-2024-001234" |
| customer_email | String | Customer contact information | "john@example.com" |
| subject | String | Ticket subject line | "Login issues with new app" |
| description | String | Full ticket description | "I can't log into the mobile app..." |
| category | String | AI-categorized ticket type | "Technical Issue" |
| priority | String | Calculated priority level | "High" |
| sentiment_score | Number | Customer sentiment (-1 to 1) | -0.3 |
| urgency_indicator | String | Urgency classification | "Immediate" |
| response_time | Number | Time to first response (hours) | 2.5 |
| resolution_time | Number | Time to resolution (hours) | 8.0 |
| satisfaction_score | Number | Customer satisfaction rating | 4.2 |
| agent_assigned | String | Support agent name | "Sarah Johnson" |
| status | String | Current ticket status | "Resolved" |

πŸ› οΈ Setup Instructions
Estimated setup time: 20-25 minutes

Prerequisites
- n8n instance with community nodes enabled
- ScrapeGraphAI API account and credentials
- Google Sheets account with API access
- Help desk system API access (Zendesk, Freshdesk, etc.)
- Email service integration (optional)

Step-by-Step Configuration
1. Install Community Nodes

```bash
# Install required community nodes
npm install n8n-nodes-scrapegraphai
npm install n8n-nodes-slack
```

2. Configure ScrapeGraphAI Credentials
   - Navigate to Credentials in your n8n instance
   - Add new ScrapeGraphAI API credentials
   - Enter your API key from the ScrapeGraphAI dashboard
   - Test the connection to ensure it's working
3. Set up Google Sheets Connection
   - Add Google Sheets OAuth2 credentials
   - Grant necessary permissions for spreadsheet access
   - Create a new spreadsheet for customer support analysis
   - Configure the sheet name (default: "Support Analysis")
4. Configure Support System Integration
   - Update the websiteUrl parameters in ScrapeGraphAI nodes
   - Add URLs for your help desk system or support portal
   - Customize the user prompt to extract specific ticket data
   - Set up categories and priority thresholds
5. Set up Notification Channels
   - Configure Slack webhook or API credentials for alerts
   - Set up email service credentials for critical issues
   - Define alert thresholds for different priority levels
   - Test notification delivery
6. Configure Schedule Trigger
   - Set analysis frequency (hourly, daily, etc.)
   - Choose appropriate time zones for your business hours
   - Consider support system rate limits
7. Test and Validate
   - Run the workflow manually to verify all connections
   - Check Google Sheets for proper data formatting
   - Test ticket analysis with sample data

πŸ”„ Workflow Customization Options
Modify Analysis Targets
- Add or remove support channels (email, chat, social media)
- Change ticket categories and priority criteria
- Adjust analysis frequency based on ticket volume

Extend Analysis Capabilities
- Add more sophisticated sentiment analysis
- Implement customer churn prediction models
- Include agent performance analytics
- Add automated response suggestions

Customize Alert System
- Set different thresholds for different ticket types
- Create tiered alert systems (info, warning, critical)
- Add SLA breach notifications
- Include trend analysis alerts

Output Customization
- Add data visualization and reporting features
- Implement support trend charts and graphs
- Create executive dashboards with key metrics
- Add customer satisfaction trend analysis

πŸ“ˆ Use Cases
- Support Ticket Management: Automatically categorize and prioritize tickets
- Response Time Optimization: Identify bottlenecks in support processes
- Customer Satisfaction Monitoring: Track and improve satisfaction scores
- Agent Performance Analysis: Monitor and improve agent productivity
- Product Issue Detection: Identify recurring problems and feature requests
- SLA Compliance: Ensure support teams meet service level agreements

🚨 Important Notes
- Respect support system API rate limits and terms of service
- Implement appropriate delays between requests to avoid rate limiting
- Regularly review and update your analysis parameters
- Monitor API usage to manage costs effectively
- Keep your credentials secure and rotate them regularly
- Consider data privacy and GDPR compliance for customer data

πŸ”§ Troubleshooting
Common Issues:
- ScrapeGraphAI connection errors: Verify API key and account status
- Google Sheets permission errors: Check OAuth2 scope and permissions
- Ticket parsing errors: Review the Code node's JavaScript logic
- Rate limiting: Adjust analysis frequency and implement delays
- Alert delivery failures: Check notification service credentials

Support Resources:
- ScrapeGraphAI documentation and API reference
- n8n community forums for workflow assistance
- Google Sheets API documentation for advanced configurations
- Help desk system API documentation
- Customer support analytics best practices
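As an example of the kind of rule behind the "priority" column described in the table above, here is a hedged sketch that derives it from sentiment_score and urgency_indicator. The thresholds are invented for illustration and are not the template's actual logic.

```javascript
// Illustrative priority rule (thresholds are assumptions, not the template's):
// map sentiment and urgency onto the "priority" column used in the sheet.
const results = [];
for (const item of $input.all()) {
  const t = item.json;
  let priority = 'Low';
  if (t.urgency_indicator === 'Immediate' || t.sentiment_score <= -0.5) {
    priority = 'High';
  } else if (t.sentiment_score <= 0) {
    priority = 'Medium';
  }
  results.push({ json: { ...t, priority, timestamp: new Date().toISOString() } });
}
return results;
```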

By vinci-king-01
390

Extract & analyze Amazon reviews with Apify, Gemini AI & save to Google Sheets

Template Description

πŸ“ Template Title
Analyze Amazon product reviews with Gemini and save to Google Sheets

πŸ“„ Description
This workflow automates the process of analyzing customer feedback on Amazon products. Instead of manually reading through hundreds of reviews, this template scrapes reviews (specifically targeting negative feedback), uses Google Gemini (AI) to analyze the root causes of dissatisfaction, and generates specific improvement suggestions. The results are automatically logged into a Google Sheet for easy tracking, and a Slack notification is sent to keep your team updated. This tool is essential for understanding "Voice of Customer" data efficiently without manual data entry.

🧍 Who is this for
- Product Managers looking for product improvement ideas.
- E-commerce Sellers (Amazon FBA, D2C) monitoring brand reputation.
- Market Researchers analyzing competitor weaknesses.
- Customer Support Teams identifying recurring issues.

βš™οΈ How it works
1. Data Collection: The workflow triggers the Apify actor (junglee/amazon-reviews-scraper) to fetch reviews from a specified Amazon product URL. It is currently configured to filter for 1- and 2-star reviews to focus on complaints.
2. AI Analysis: It loops through each review and sends the content to Google Gemini. The AI determines a sentiment score (1-5), categorizes the issue (Quality, Design, Shipping, etc.), summarizes the complaint, and proposes a concrete improvement plan.
3. Formatting: A Code node parses the AI's response to ensure it is in a clean JSON format (see the sketch after this listing).
4. Storage: The structured data is appended as a new row in a Google Sheet.
5. Notification: A Slack message is sent to your specified channel to confirm the batch analysis is complete.

πŸ› οΈ Requirements
- n8n (Self-hosted or Cloud)
- Apify Account: You need to rent the junglee/amazon-reviews-scraper actor.
- Google Cloud Account: For accessing the Gemini (PaLM) API and Google Sheets API.
- Slack Account: For receiving notifications.

πŸš€ How to set up
1. Apify Config: Enter your Apify API token in the credentials. In the "Run an Actor" node, update the startUrls to the Amazon product page you want to analyze.
2. Google Sheets: Create a new Google Sheet with the following header columns: sentiment_score, category, summary, improvement. Copy the Spreadsheet ID into the Google Sheets node.
3. AI Prompt: The "Message a model" node contains the prompt. It is currently set to output results in Japanese. If you need English output, simply translate the prompt text inside this node.
4. Slack: Select the channel where you want to receive notifications in the Slack node.
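The "Formatting" step above is a Code node; a minimal sketch of what such a parser might look like follows. The field holding the model reply and the fallback values are assumptions, not the template's actual code.

```javascript
// Sketch of the formatting step: extract a clean JSON object from the model's
// text reply (which may be wrapped in extra prose or markdown fences).
const results = [];
for (const item of $input.all()) {
  const raw = String(item.json.text ?? item.json.content ?? ''); // where the reply lands (assumption)
  const match = raw.match(/\{[\s\S]*\}/);                        // grab the first {...} block
  let parsed = {};
  try {
    parsed = match ? JSON.parse(match[0]) : {};
  } catch (e) {
    // keep the defaults below if the model returned malformed JSON
  }
  results.push({
    json: {
      sentiment_score: parsed.sentiment_score ?? null,
      category: parsed.category ?? 'Uncategorized',
      summary: parsed.summary ?? '',
      improvement: parsed.improvement ?? '',
    },
  });
}
return results;
```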

ε°ζž—εΉΈδΈ€By ε°ζž—εΉΈδΈ€
356

Check email via AI agent with Mailcheck Tool MCP Server

Complete MCP server exposing all Mailcheck Tool operations to AI agents. Zero configuration needed - 1 operation pre-built.

⚑ Quick Setup
Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.
1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

πŸ”§ How it Works
β€’ MCP Trigger: Serves as your server endpoint for AI agent requests
β€’ Tool Nodes: Pre-configured for every Mailcheck Tool operation
β€’ AI Expressions: Automatically populate parameters via $fromAI() placeholders
β€’ Native Integration: Uses the official n8n Mailcheck Tool node with full error handling

πŸ“‹ Available Operations (1 total)
Every possible Mailcheck Tool operation is included:

πŸ”§ Email (1 operation)
β€’ Check an email

πŸ€– AI Integration
Parameter Handling: AI agents automatically provide values for:
β€’ Resource IDs and identifiers
β€’ Search queries and filters
β€’ Content and data payloads
β€’ Configuration options

Response Format: Native Mailcheck Tool API responses with full data structure
Error Handling: Built-in n8n error management and retry logic

πŸ’‘ Usage Examples
Connect this MCP server to any AI agent or workflow:
β€’ Claude Desktop: Add the MCP server URL to its configuration
β€’ Custom AI Apps: Use the MCP URL as a tool endpoint
β€’ Other n8n Workflows: Call MCP tools from any workflow
β€’ API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits
β€’ Complete Coverage: Every Mailcheck Tool operation available
β€’ Zero Setup: No parameter mapping or configuration needed
β€’ AI-Ready: Built-in $fromAI() expressions for all parameters
β€’ Production Ready: Native n8n error handling and logging
β€’ Extensible: Easily modify or add custom logic

> πŸ†“ Free for community use! Ready to deploy in under 2 minutes.

By David Ashby
327

Create, delete, and organize AWS S3 buckets & files directly from your email

This n8n workflow automates AWS S3 bucket and file operations (create, delete, upload, download, copy, list) by parsing simple email commands and sending back success or error confirmations.

Good to Know
- The workflow processes email requests via a Start Workflow (GET Request) node.
- Data extraction from emails identifies S3 operation commands.
- Error handling is included for invalid or missing email data.
- Responses are sent via email for each action performed.

How It Works
1. Start Workflow (GET Request) - Captures incoming email requests.
2. Extract Data from Email - Parses email content to extract S3 operation commands (a sketch of this step follows the listing).
3. Check Task Type - Validates the type of task (e.g., create bucket, delete file).
4. Create a Bucket - Creates a new S3 bucket.
5. Delete a Bucket - Deletes an existing S3 bucket.
6. Copy a File - Copies a file within S3.
7. Delete a File - Deletes a file from S3.
8. Download a File - Downloads a file from S3.
9. Upload a File - Uploads a file to S3.
10. Get Many Files - Lists multiple files in a bucket.
11. Check Success or Fail - Determines the outcome of the operation.
12. Send Success Email - Sends a success confirmation email.
13. Send Failed Email - Sends a failure notification email.

How to Use
1. Import the workflow into n8n.
2. Configure the Start Workflow (GET Request) node to receive email commands.
3. Test the workflow with sample email commands (e.g., "create bucket: my-bucket", "upload file: document.pdf").
4. Monitor email responses and adjust command parsing if needed.

Example Email for Testing
List files from the bucket json-test in Mumbai region.

Requirements
- AWS S3 credentials configured in n8n.
- Email service integration (e.g., SMTP settings).
- n8n environment with workflow execution permissions.

Customizing This Workflow
- Adjust the Extract Data from Email node to support additional command formats.
- Modify the Send Success Email or Send Failed Email nodes to customize messages.
- Update the S3 nodes to include additional bucket or file attributes.
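The exact command grammar is not shown in the listing, so the sketch below only illustrates the general shape of the "Extract Data from Email" step, built around the two sample commands quoted above; the regex, field names, and task labels are assumptions.

```javascript
// Assumed shape of the "Extract Data from Email" step: recognise commands like
// "create bucket: my-bucket" or "upload file: document.pdf" in the email body.
const results = [];
for (const item of $input.all()) {
  const body = String(item.json.text ?? '').toLowerCase();        // email body field (assumption)
  const match = body.match(/(create|delete)\s+bucket:\s*(\S+)|(upload|download|delete|copy)\s+file:\s*(\S+)/);
  results.push({
    json: {
      taskType: match ? (match[1] ? `${match[1]}_bucket` : `${match[3]}_file`) : 'unknown',
      target: match ? (match[2] ?? match[4]) : null,
    },
  });
}
return results;
```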

By Oneclick AI Squad
295

πŸ› οΈ NASA tool MCP server πŸ’ͺ all 15 operations

πŸ› οΈ NASA Tool MCP Server Complete MCP server exposing all NASA Tool operations to AI agents. Zero configuration needed - all 15 operations pre-built. ⚑ Quick Setup Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator.. All 100% free? Join the community Import this workflow into your n8n instance Activate the workflow to start your MCP server Copy the webhook URL from the MCP trigger node Connect AI agents using the MCP URL πŸ”§ How it Works β€’ MCP Trigger: Serves as your server endpoint for AI agent requests β€’ Tool Nodes: Pre-configured for every NASA Tool operation β€’ AI Expressions: Automatically populate parameters via $fromAI() placeholders β€’ Native Integration: Uses official n8n NASA Tool tool with full error handling πŸ“‹ Available Operations (15 total) Every possible NASA Tool operation is included: πŸ”§ Asteroidneobrowse (1 operations) β€’ Get many asteroid neos πŸ”§ Asteroidneofeed (1 operations) β€’ Get an asteroid neo feed πŸ”§ Asteroidneolookup (1 operations) β€’ Get an asteroid neo lookup πŸ”§ Astronomypictureoftheday (1 operations) β€’ Get the astronomy picture of the day πŸ”§ Donkicoronalmassejection (1 operations) β€’ Get a DONKI coronal mass ejection πŸ”§ Donkihighspeedstream (1 operations) β€’ Get a DONKI high speed stream πŸ”§ Donkiinterplanetaryshock (1 operations) β€’ Get a DONKI interplanetary shock πŸ”§ Donkimagnetopausecrossing (1 operations) β€’ Get a DONKI magnetopause crossing πŸ”§ Donkinotifications (1 operations) β€’ Get a DONKI notifications πŸ”§ Donkiradiationbeltenhancement (1 operations) β€’ Get a DONKI radiation belt enhancement πŸ”§ Donkisolarenergeticparticle (1 operations) β€’ Get a DONKI solar energetic particle πŸ”§ Donkisolarflare (1 operations) β€’ Get a DONKI solar flare πŸ”§ Donkiwsaenlilsimulation (1 operations) β€’ Get a DONKI wsa enlil simulation πŸ”§ Earthassets (1 operations) β€’ Get Earth assets πŸ”§ Earthimagery (1 operations) β€’ Get Earth imagery πŸ€– AI Integration Parameter Handling: AI agents automatically provide values for: β€’ Resource IDs and identifiers β€’ Search queries and filters β€’ Content and data payloads β€’ Configuration options Response Format: Native NASA Tool API responses with full data structure Error Handling: Built-in n8n error management and retry logic πŸ’‘ Usage Examples Connect this MCP server to any AI agent or workflow: β€’ Claude Desktop: Add MCP server URL to configuration β€’ Custom AI Apps: Use MCP URL as tool endpoint β€’ Other n8n Workflows: Call MCP tools from any workflow β€’ API Integration: Direct HTTP calls to MCP endpoints ✨ Benefits β€’ Complete Coverage: Every NASA Tool operation available β€’ Zero Setup: No parameter mapping or configuration needed β€’ AI-Ready: Built-in $fromAI() expressions for all parameters β€’ Production Ready: Native n8n error handling and logging β€’ Extensible: Easily modify or add custom logic > πŸ†“ Free for community use! Ready to deploy in under 2 minutes.

By David Ashby
175

πŸ› οΈ Sentry.io tool MCP server πŸ’ͺ all 25 operations

Need help? Want access to this workflow plus many more paid workflows and live Q&A sessions with a top verified n8n creator? Join the community.

Complete MCP server exposing all Sentry.io Tool operations to AI agents. Zero configuration needed - all 25 operations pre-built.

⚑ Quick Setup
1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

πŸ”§ How it Works
β€’ MCP Trigger: Serves as your server endpoint for AI agent requests
β€’ Tool Nodes: Pre-configured for every Sentry.io Tool operation
β€’ AI Expressions: Automatically populate parameters via $fromAI() placeholders
β€’ Native Integration: Uses the official n8n Sentry.io Tool node with full error handling

πŸ“‹ Available Operations (25 total)
Every possible Sentry.io Tool operation is included:

πŸ“… Event (2 operations)
β€’ Get an event
β€’ Get many events

πŸ› Issue (4 operations)
β€’ Delete an issue
β€’ Get an issue
β€’ Get many issues
β€’ Update an issue

🏒 Organization (4 operations)
β€’ Create an organization
β€’ Get an organization
β€’ Get many organizations
β€’ Update an organization

πŸ”§ Project (5 operations)
β€’ Create a project
β€’ Delete a project
β€’ Get a project
β€’ Get many projects
β€’ Update a project

πŸš€ Release (5 operations)
β€’ Create a release
β€’ Delete a release
β€’ Get a release by version ID
β€’ Get many releases
β€’ Update a release

πŸ”§ Team (5 operations)
β€’ Create a team
β€’ Delete a team
β€’ Get a team
β€’ Get many teams
β€’ Update a team

πŸ€– AI Integration
Parameter Handling: AI agents automatically provide values for:
β€’ Resource IDs and identifiers
β€’ Search queries and filters
β€’ Content and data payloads
β€’ Configuration options

Response Format: Native Sentry.io Tool API responses with full data structure
Error Handling: Built-in n8n error management and retry logic

πŸ’‘ Usage Examples
Connect this MCP server to any AI agent or workflow:
β€’ Claude Desktop: Add the MCP server URL to its configuration
β€’ Custom AI Apps: Use the MCP URL as a tool endpoint
β€’ Other n8n Workflows: Call MCP tools from any workflow
β€’ API Integration: Direct HTTP calls to MCP endpoints

✨ Benefits
β€’ Complete Coverage: Every Sentry.io Tool operation available
β€’ Zero Setup: No parameter mapping or configuration needed
β€’ AI-Ready: Built-in $fromAI() expressions for all parameters
β€’ Production Ready: Native n8n error handling and logging
β€’ Extensible: Easily modify or add custom logic

> πŸ†“ Free for community use! Ready to deploy in under 2 minutes.

By David Ashby
129

Automate school trip consent forms with email verification, PDF generation & Google Drive

Verified Parent Consent Form Automation for School Trips

🎯 Description
This workflow automates the entire parent consent process for school field trips, replacing manual paper forms with a secure, verified, and legally compliant digital system.

When a parent submits consent data via POST request (from Postman or any form), the workflow:
1. Receives parent & trip details through a Webhook trigger.
2. Verifies the parent's email using the VerifiEmail API to prevent fake or disposable entries.
3. Generates a unique Consent ID and timestamps for tracking and legal validation.
4. Creates a professional HTML-based digital consent form, including child details, trip information, and a parent signature section.
5. Converts the HTML document to a PDF using the HTMLCSSToPDF API (ready for printing or archiving).
6. Uploads the PDF to Google Drive automatically, organizing it under a designated folder (e.g., /School_Consents/2025/November).
7. Sends an automated Gmail notification to the respective class teacher, including all verified details and the Drive reference.
8. Responds instantly to the original POST request with a success confirmation and all metadata (Consent ID, verification status, storage location, timestamp).
9. Handles invalid emails gracefully, returning a 400 error response for unauthorized or unverified submissions.
10. Provides complete traceability, digital audit, and tamper-proof documentation for school compliance.

Use Case: Ideal for schools, institutions, or organizations that need paperless consent workflows with email verification, cloud storage, and automated staff alerts, ensuring authenticity, security, and compliance.

🧩 Features
- Email verification (VerifiEmail)
- Auto-generated unique Consent ID
- HTML-to-PDF conversion (HTMLCSSToPDF)
- Google Drive cloud storage integration
- Automated Gmail teacher notification
- API-friendly POST-based trigger
- Real-time error handling & response
- Legally formatted consent slip

πŸ§ͺ Test Input Example (Postman)
Use this JSON in Postman when testing the webhook:

```json
{
  "parent_name": "Ritu Sharma",
  "parent_email": "ritu.sharma@gmail.com",
  "child_name": "Aarav Sharma",
  "child_class": "Grade 5-A",
  "trip_name": "Science Museum Visit",
  "trip_date": "2025-11-10",
  "teacher_email": "teacher@school.edu"
}
```

βœ… Expected Output
Success Response (200):

```json
{
  "status": "success",
  "message": "Parent consent form verified and stored successfully",
  "data": {
    "consent_id": "CONSENT-1699123456789",
    "child_name": "Aarav Sharma",
    "trip_name": "Science Museum Visit",
    "email_verified": true,
    "stored_at": "Google Drive",
    "teacher_notified": true
  }
}
```

Error Response (400):

```json
{
  "status": "error",
  "message": "Invalid email address provided. Please use a valid email.",
  "reason": "Email verification failed"
}
```

🌐 Workflow Tags for n8n Creators
Education Automation, EmailVerification, PDFGeneration, GoogleDrive, Webhook, SchoolAdmin, PaperlessWorkflow, Compliance
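Judging from the sample success response above ("CONSENT-" followed by a millisecond timestamp), the Consent ID step is probably little more than the sketch below; the listing does not include the actual node code, so treat this as an inferred illustration.

```javascript
// Sketch inferred from the sample response: attach a unique Consent ID and a
// submission timestamp to the webhook payload before building the HTML form.
const data = $input.first().json;

return [{
  json: {
    ...data,
    consent_id: `CONSENT-${Date.now()}`,      // e.g. CONSENT-1699123456789
    submitted_at: new Date().toISOString(),   // timestamp for tracking / legal validation
  },
}];
```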

By Jitesh Dugar
126