Learn secure webhook APIs with authentication and Supabase integration

Wayne Simpson
2380 views
2/3/2026

This template is a practical introduction to n8n Webhooks with built-in examples for all major HTTP methods and authentication types. It is designed as a learning resource to help you understand how webhooks work in n8n, how to connect them to a data store, and how to secure them properly.

What's included:

  • Webhook nodes for GET, POST, PUT, PATCH, DELETE, and HEAD
  • Demonstrations of Basic Auth, Header Auth, and JWT Auth (a request sketch follows this list)
  • Supabase integration for creating, retrieving, updating, and deleting rows
  • Example response handling with Respond to Webhook nodes
  • Sticky notes explaining each method, response type, and security option
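
The authentication options above can be exercised from any HTTP client. The sketch below uses Python's requests library (plus PyJWT for the JWT case); the base URL, paths, header name, credentials, and signing secret are all placeholders, so substitute the values configured in your own Webhook nodes and credentials.

```python
# Minimal sketch of calling the template's webhook endpoints from outside n8n.
# All URLs, paths, header names, and secrets below are placeholders.
import requests
import jwt  # PyJWT; assumes the n8n JWT credential uses an HS256 shared secret

BASE = "https://your-n8n-instance.example.com/webhook"  # placeholder base URL

# Basic Auth protected endpoint (GET)
r = requests.get(f"{BASE}/basic-auth-demo", auth=("demo-user", "demo-password"))
print("basic:", r.status_code, r.text)

# Header Auth protected endpoint (POST) -- the header name must match the Header Auth credential
r = requests.post(
    f"{BASE}/header-auth-demo",
    headers={"X-API-KEY": "demo-header-secret"},
    json={"name": "Ada", "email": "ada@example.com"},
)
print("header:", r.status_code, r.text)

# JWT Auth protected endpoint (PATCH)
token = jwt.encode({"sub": "demo-client"}, "demo-jwt-secret", algorithm="HS256")
r = requests.patch(
    f"{BASE}/jwt-auth-demo",
    headers={"Authorization": f"Bearer {token}"},
    json={"id": 1, "email": "ada@new-domain.com"},
)
print("jwt:", r.status_code, r.text)
```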

Use this template to:

  • Learn how to configure and test webhooks in n8n
  • Explore different authentication strategies
  • Connect webhooks to a simple Supabase table
  • Understand best practices for securing webhook endpoints

This workflow is intended as an educational starting point. It shows you how to receive requests, map data, and return responses securely. For production use, adapt the structure, apply your own security policies, and extend the logic as needed.

Check out the YouTube video here: https://www.youtube.com/watch?v=o6F36xsiuBk

Secure Webhook API with Authentication and Supabase Integration

This n8n workflow demonstrates how to build a secure webhook API endpoint with authentication and integrate it with a Supabase database. It provides a foundational example for handling incoming webhook data, processing it, and storing it securely.

What it does

This workflow simplifies the process of receiving data via a webhook and persisting it in a Supabase table. Specifically, it:

  1. Listens for incoming webhook requests: It acts as an API endpoint, waiting for external systems to send data.
  2. Processes incoming data: The Edit Fields (Set) node is a placeholder for any data transformation or validation logic you might want to apply to the incoming webhook payload (an illustrative sketch of this step follows the list).
  3. Integrates with Supabase: It's set up to interact with a Supabase database, allowing you to insert, update, or query data based on the webhook payload.
  4. Responds to the webhook: After processing, it sends a response back to the system that initiated the webhook, confirming receipt or providing feedback.
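
Step 2 is easiest to picture as a small mapping function applied to each incoming item before it reaches Supabase. The snippet below is an illustrative plain-Python stand-in for the Edit Fields (Set) node, not n8n code, and the field names are assumptions chosen for the example.

```python
# Illustrative stand-in for the Edit Fields (Set) step: keep only the fields the
# Supabase table expects, normalize them, and reject obviously bad input.
# Field names (name, email, source, received_at) are hypothetical.
from datetime import datetime, timezone


def map_payload(body: dict) -> dict:
    if "@" not in body.get("email", ""):
        raise ValueError("payload must contain a valid 'email' field")
    return {
        "name": body.get("name", "unknown"),
        "email": body["email"].lower(),
        "source": "webhook",
        "received_at": datetime.now(timezone.utc).isoformat(),
    }


print(map_payload({"name": "Ada", "email": "Ada@Example.com"}))
```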

Prerequisites/Requirements

To use this workflow, you will need:

  • n8n Instance: A running n8n instance (self-hosted or cloud).
  • Supabase Account: An active Supabase project with a database table configured to receive your webhook data.
  • Supabase Credentials: Your Supabase Project URL and an API key (the anon public key, or the service role key for server-side workflows that need to bypass Row Level Security) to configure the Supabase node. A quick connectivity check is sketched after this list.
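
It can save debugging time to confirm the project URL, API key, and target table work before n8n is involved. Below is a minimal check using the supabase-py client; the table name webhook_demo and its columns are assumptions, so use the table you actually created for this template.

```python
# Sanity check that the Supabase URL, API key, and table are usable.
# Project URL, key, table name, and columns are placeholders.
from supabase import create_client

SUPABASE_URL = "https://your-project.supabase.co"
SUPABASE_KEY = "your-anon-or-service-role-key"

client = create_client(SUPABASE_URL, SUPABASE_KEY)

# Insert one row, then read a few back.
client.table("webhook_demo").insert({"name": "Ada", "email": "ada@example.com"}).execute()
rows = client.table("webhook_demo").select("*").limit(5).execute()
print(rows.data)
```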

Setup/Usage

  1. Import the workflow: Download the provided JSON and import it into your n8n instance.
  2. Configure the Webhook Trigger:
    • Activate the Webhook node.
    • Copy the generated webhook URL. This is the endpoint you will send your data to.
    • (Optional but recommended for security): Configure authentication on the Webhook node if your external system supports it; this template demonstrates Basic Auth, Header Auth, and JWT Auth.
  3. Configure Supabase Credentials:
    • Click on the Supabase node.
    • Click "Create New Credential" if you haven't already.
    • Enter your Supabase Project URL and API Key (Anon Public or Service Role).
    • Select the desired operation (e.g., Insert, Update) and specify the table name and data to be sent from the webhook payload.
  4. Configure Data Transformation (Optional):
    • Modify the Edit Fields (Set) node to transform or validate the incoming webhook data before it's sent to Supabase. You can add, remove, or modify fields here.
  5. Configure Webhook Response:
    • The Respond to Webhook node is configured to send a simple 200 OK response by default. You can customize the response body and status code based on your needs.
  6. Activate the workflow: Once configured, activate the workflow to make your webhook endpoint live.

Now, any data sent to your n8n webhook URL will trigger this workflow, process the data, and interact with your Supabase database.
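
A quick smoke test is to post one payload to the live URL and then confirm the row arrived. The sketch below assumes the Header Auth variant and the hypothetical webhook_demo table from the earlier examples; replace the URL, header name, secret, table, and column names with your own.

```python
# Smoke test: send one payload to the live webhook, then read the newest row back from Supabase.
# URL, header name, secret, table, and column names are placeholders.
import requests
from supabase import create_client

resp = requests.post(
    "https://your-n8n-instance.example.com/webhook/your-path",
    headers={"X-API-KEY": "demo-header-secret"},
    json={"name": "Ada", "email": "ada@example.com"},
    timeout=10,
)
print(resp.status_code, resp.text)  # expect the Respond to Webhook node's default 200 OK

client = create_client("https://your-project.supabase.co", "your-anon-or-service-role-key")
latest = client.table("webhook_demo").select("*").order("received_at", desc=True).limit(1).execute()
print(latest.data)
```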

Related Templates

  • Auto-create TikTok videos with VEED.io AI avatars, ElevenLabs & GPT-4 (by Dr. Firas)
  • Two-way property repair management system with Google Sheets & Drive (by Matt@VeraisonLabs)
  • Automate Dutch Public Procurement Data Collection with TenderNed (by Wessel Bulte)