
Store data received from webhook in JSON

Harshil Agrawal
4279 views
2/3/2026

Store the data received from the CocktailDB API in JSON

Store Data Received from Webhook in JSON

This n8n workflow demonstrates a simple yet effective way to capture data from an incoming webhook, process it, and then store it as a JSON file. It's ideal for scenarios where you need to log or persist data received from external services for later analysis or use.

What it does

This workflow automates the following steps (a minimal sketch of the node layout follows the list):

  1. Triggers Manually: The workflow is initiated manually (or can be adapted to listen for a webhook).
  2. Makes an HTTP Request: It performs an HTTP GET request to a specified URL (e.g., https://n8n.io). This node acts as a placeholder for receiving or generating data that you would typically get from a webhook.
  3. Converts to Binary Data: The response from the HTTP request (which is usually JSON or text) is converted into binary data. This is a crucial step for preparing the data to be written as a file.
  4. Writes Binary File: The binary data is then written to a file. By default, it will save the content as a .bin file, but you can configure it to save as .json or any other format by specifying the file name.
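For orientation, here is a minimal, abbreviated sketch of how the four nodes are wired together in the workflow's exported JSON. Node positions, typeVersion fields, and most default parameters are omitted, and exact node type names and parameters can vary between n8n versions, so treat this as an illustration rather than something to import verbatim:

```json
{
  "nodes": [
    { "name": "Start", "type": "n8n-nodes-base.start", "parameters": {} },
    { "name": "HTTP Request", "type": "n8n-nodes-base.httpRequest", "parameters": { "url": "https://n8n.io" } },
    { "name": "Convert to/from binary data", "type": "n8n-nodes-base.moveBinaryData", "parameters": { "mode": "jsonToBinary" } },
    { "name": "Write Binary File", "type": "n8n-nodes-base.writeBinaryFile", "parameters": { "fileName": "webhook_data.json" } }
  ],
  "connections": {
    "Start": { "main": [[ { "node": "HTTP Request", "type": "main", "index": 0 } ]] },
    "HTTP Request": { "main": [[ { "node": "Convert to/from binary data", "type": "main", "index": 0 } ]] },
    "Convert to/from binary data": { "main": [[ { "node": "Write Binary File", "type": "main", "index": 0 } ]] }
  }
}
```

The "Convert to/from binary data" node corresponds to n8n's Move Binary Data node; its jsonToBinary mode is what turns the JSON response into a binary property that the Write Binary File node can persist.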

Prerequisites/Requirements

  • n8n Instance: An active n8n instance (cloud or self-hosted) to run this workflow.
  • HTTP Endpoint (Optional): If you modify the "HTTP Request" node to fetch data from a specific API or webhook, you'll need access to that endpoint.

Setup/Usage

  1. Import the Workflow:
    • Copy the JSON content provided.
    • In your n8n instance, click "New" in the workflows section.
    • Click the "Import from JSON" button and paste the copied JSON.
    • Click "Import".
  2. Configure Nodes:
    • Start Node: This node is currently set to "Manual" execution. If you intend to use this workflow with an actual webhook, replace it with a "Webhook" trigger node and configure its URL (see the configuration sketch after this list).
    • HTTP Request Node:
      • By default, it fetches https://n8n.io. You will likely want to change this.
      • If you're using this to store data from a previous webhook, you would remove this node and connect the "Webhook" trigger node directly to the "Convert to/from binary data" node. The webhook's incoming data would then be processed.
      • Ensure the "Response Format" is set appropriately (e.g., "JSON" if the API returns JSON).
    • Convert to/from binary data Node: No specific configuration is typically needed here unless you want to change how the data is handled before writing.
    • Write Binary File Node:
      • File Name: By default, it might generate a generic name. To save as a JSON file, you should set the "File Name" to something like {{ $json.fileName || 'webhook_data' }}.json or a static name like my_data.json.
      • File Path: Configure the directory where you want the file to be saved on the n8n host system.
      • File Content: Ensure this is set to use the binary data from the previous node.
  3. Activate the Workflow: Once configured, click the "Active" toggle in the top right corner to enable the workflow.
  4. Execute:
    • If using the "Start" node, click "Execute Workflow" to run it manually.
    • If you've replaced the "Start" node with a "Webhook" node, trigger the webhook from your external service.
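As a rough sketch of the webhook variant described in step 2, the fragment below replaces the Start and HTTP Request nodes with a Webhook trigger and feeds its output straight into the conversion and write steps. The webhook path, the /data directory, and the fileName expression are placeholder choices for illustration, not part of the original template:

```json
{
  "nodes": [
    { "name": "Webhook", "type": "n8n-nodes-base.webhook",
      "parameters": { "httpMethod": "POST", "path": "store-json" } },
    { "name": "Convert to/from binary data", "type": "n8n-nodes-base.moveBinaryData",
      "parameters": { "mode": "jsonToBinary" } },
    { "name": "Write Binary File", "type": "n8n-nodes-base.writeBinaryFile",
      "parameters": { "fileName": "=/data/{{ $json.fileName || 'webhook_data' }}.json" } }
  ],
  "connections": {
    "Webhook": { "main": [[ { "node": "Convert to/from binary data", "type": "main", "index": 0 } ]] },
    "Convert to/from binary data": { "main": [[ { "node": "Write Binary File", "type": "main", "index": 0 } ]] }
  }
}
```

Once the workflow is active, sending a POST request with a JSON body to the webhook's production URL should produce a file such as /data/webhook_data.json (or one named after an incoming fileName field); the leading "=" marks the fileName value as an n8n expression.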

This workflow provides a solid foundation for capturing and storing data. Remember to adjust the "HTTP Request" or "Start" node based on how your data originates.

Related Templates

AI-powered code review with linting, red-marked corrections in Google Sheets & Slack

Advanced Code Review Automation (AI + Lint + Slack)

Who's it for
For software engineers, QA teams, and tech leads who want to automate intelligent code reviews with both AI-driven suggestions and rule-based linting — all managed in Google Sheets with instant Slack summaries.

How it works
This workflow performs a two-layer review system:
  1. Lint Check: Runs a lightweight static analysis to find common issues (e.g., use of var, console.log, unbalanced braces).
  2. AI Review: Sends valid code to Gemini AI, which provides human-like review feedback with severity classification (Critical, Major, Minor) and visual highlights (red/orange tags).
  3. Formatter: Combines lint and AI results, calculating an overall score (0–10).
  4. Aggregator: Summarizes results for quick comparison.
  5. Google Sheets Writer: Appends results to your review log.
  6. Slack Notification: Posts a concise summary (e.g., number of issues and average score) to your team's channel.

How to set up
  1. Connect Google Sheets and Slack credentials in n8n.
  2. Replace placeholders (<YOUR_SPREADSHEET_ID>, <YOUR_SHEET_GID_OR_NAME>, <YOUR_SLACK_CHANNEL_ID>).
  3. Adjust the AI review prompt or lint rules as needed.
  4. Activate the workflow — reviews will start automatically whenever new code is added to the sheet.

Requirements
  • Google Sheets and Slack integrations enabled
  • A configured AI node (Gemini, OpenAI, or compatible)
  • Proper permissions to write to your target Google Sheet

How to customize
  • Add more linting rules (naming conventions, spacing, forbidden APIs)
  • Extend the AI prompt for project-specific guidelines
  • Customize the Slack message formatting
  • Export analytics to a dashboard (e.g., Notion or Data Studio)

Why it's valuable
This workflow brings realistic, team-oriented AI-assisted code review to n8n — combining the speed of automated linting with the nuance of human-style feedback. It saves time, improves code quality, and keeps your team's review history transparent and centralized.

By higashiyama
90

Generate Weather-Based Date Itineraries with Google Places, OpenRouter AI, and Slack

🧩 What this template does
This workflow builds a 120-minute local date course around your starting point by querying Google Places for nearby spots, selecting the top candidates, fetching real-time weather data, letting an AI generate a matching emoji, and drafting a friendly itinerary summary with an LLM in both English and Japanese. It then posts the full bilingual plan with a walking route link and weather emoji to Slack.

👥 Who it's for
Makers and teams who want a plug-and-play bilingual local itinerary generator with weather awareness — no custom code required.

⚙️ How it works
  1. Trigger – Manual (or schedule/webhook).
  2. Discovery – Google Places nearby search within a configurable radius.
  3. Selection – Rank by rating and pick the top 3.
  4. Weather – Fetch current weather (via OpenWeatherMap).
  5. Emoji – Use an AI model to match the weather with an emoji 🌤️.
  6. Planning – An LLM writes the itinerary in Markdown (JP + EN).
  7. Route – Compose a Google Maps walking route URL.
  8. Share – Post the bilingual itinerary, route link, and weather emoji to Slack.

🧰 Requirements
  • n8n (Cloud or self-hosted)
  • Google Maps Platform (Places API)
  • OpenWeatherMap API key
  • Slack Bot (chat:write)
  • LLM provider (e.g., OpenRouter or DeepL for translation)

🚀 Setup (quick)
  1. Open Set → Fields: Config and fill in coords/radius/time limit.
  2. Connect credentials for Google, OpenWeatherMap, Slack, and your LLM.
  3. Test the workflow and confirm the bilingual plan + weather emoji appear in Slack.

🛠 Customize
  • Adjust ranking filters (type, min rating).
  • Modify translation settings (target language or tone).
  • Change output layout (side-by-side vs separated).
  • Tune emoji logic or travel mode.
  • Add error handling, retries, or logging for production use.

By noda
52

AI-powered document search with Oracle and ONNX embeddings for recruiting

How it works
  1. Create a user for doing hybrid search.
  2. Clear existing data, if present.
  3. Add documents into the table.
  4. Create a hybrid index.
  5. Run a semantic search on the Documents table for "prioritize teamwork and leadership experience".
  6. Run a hybrid search on the Documents table for the text input in the Chat interface.

Setup Steps
  1. Download the ONNX model all_MiniLM_L12_v2_augmented.zip.
  2. Extract the ZIP file on the database server into a directory, for example /opt/oracle/onnx. After extraction, the folder contents should look like:

```bash
bash-4.4$ pwd
/opt/oracle/onnx
bash-4.4$ ls
all_MiniLM_L12_v2.onnx
```

  3. Connect as SYSDBA and create the DBA user:

```sql
-- Create DBA user
CREATE USER app_admin IDENTIFIED BY "StrongPassword123"
  DEFAULT TABLESPACE users
  TEMPORARY TABLESPACE temp
  QUOTA UNLIMITED ON users;

-- Grant privileges
GRANT DBA TO app_admin;
GRANT CREATE TABLESPACE, ALTER TABLESPACE, DROP TABLESPACE TO app_admin;
```

  4. Create n8n Oracle DB credentials:
    • hybridsearchuser → for hybrid search operations
    • dbadocuser → for DBA setup (user and tablespace creation)
  5. Run the workflow:
    • Click the manual trigger: it displays pure semantic search results.
    • Enter search text in the Chat interface: it displays results for vector and keyword search.

Note
The workflow currently creates the hybrid search user (docuser) with the password visible in plain text inside the n8n Execute SQL node. For better security, consider performing the user creation manually outside n8n. Oracle 23ai or 26ai Database has to be used.

Reference: Hybrid Search End-to-End Example

By sudarshan
211