Clean and log IoT sensor data to InfluxDB (Webhook | Function | HTTP)
💡 IoT Sensor Data Cleaner + InfluxDB Logger (n8n | Webhook | Function | InfluxDB)
This workflow accepts raw sensor data from IoT devices via webhook, applies basic cleaning and transformation logic, and writes the cleaned data to an InfluxDB instance for time-series tracking. Perfect for renewable energy sites, smart farms and environmental monitoring setups using dashboards like Grafana or Chronograf.
⚡ Quick Implementation Steps
- Import the workflow JSON into your n8n instance.
- Edit the Set Config node to include your InfluxDB credentials and measurement name.
- Use the webhook URL (`/webhook/sensor-data`) in your IoT device or form to send sensor data.
- Start monitoring your data directly in InfluxDB!
🎯 Who's It For
- IoT developers and integrators.
- Renewable energy and environmental monitoring teams.
- Data engineers working with time-series data.
- Smart agriculture and utility automation platforms.
📋 Requirements
| Tool | Purpose |
|------|---------|
| n8n Instance | For automation |
| InfluxDB (v1 or v2) | To store time-series sensor data |
| IoT Device or Platform | To POST sensor data |
| Function Node | To filter and transform data |
🔧 What It Does
- Accepts JSON-formatted sensor data via HTTP POST.
- Validates the data (removes invalid or noisy readings).
- Applies transformation (rounding, timestamp formatting).
- Pushes the cleaned data to InfluxDB for real-time visualization.
🧩 Workflow Components
- Webhook Node: Exposes an HTTP endpoint to receive sensor data.
- Function Node: Filters out-of-range values, formats timestamp, rounds data.
- Set Node: Stores configurable values like InfluxDB host, user/pass, and measurement name.
- InfluxDB Node: Writes valid records into the specified database bucket.
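The cleaning rules themselves live in the Function node and will vary by deployment; the snippet below is only a minimal sketch of what that node's code might look like, assuming the payload fields used in the example payload later in this guide (`temperature`, `humidity`, `voltage`, `timestamp`) and illustrative validity ranges you would tune for your own sensors.

```javascript
// Hypothetical n8n Function node body: drop out-of-range readings, round values,
// and normalize the timestamp before handing the items to the InfluxDB node.
const cleaned = [];

for (const item of items) {
  const d = item.json;

  // Illustrative validity ranges -- adjust for your sensors.
  if (d.temperature < -40 || d.temperature > 125) continue;
  if (d.humidity < 0 || d.humidity > 100) continue;
  if (d.voltage < 0) continue;

  cleaned.push({
    json: {
      temperature: Math.round(d.temperature * 10) / 10, // one decimal place
      humidity: Math.round(d.humidity * 10) / 10,
      voltage: Math.round(d.voltage * 10) / 10,
      // Fall back to "now" when the device sends no timestamp.
      timestamp: d.timestamp
        ? new Date(d.timestamp).toISOString()
        : new Date().toISOString(),
    },
  });
}

return cleaned;
```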
🔧 How To Set Up - Step-by-Step
- Import Workflow:
  - Upload the provided `.json` file into your n8n workspace.
- Edit Configuration Node:
  - Update the InfluxDB connection info in the `Set Config` node: `influxDbHost`, `influxDbDatabase`, `influxDbUsername`, `influxDbPassword`.
  - `measurement`: what you want to name the data set (e.g., `sensor_readings`).
- Send Data to Webhook:
  - Webhook URL: `https://your-n8n/webhook/sensor-data`
  - Example payload: `{ "temperature": 78.3, "humidity": 44.2, "voltage": 395.7, "timestamp": "2024-06-01T12:00:00Z" }` (a test sender sketch follows this list).
- View in InfluxDB:
  - Log in to your InfluxDB/Grafana dashboard and query the new measurement.
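To test the endpoint without a physical device, you can replay the example payload from any HTTP client. The sketch below uses Node.js (18+, run as an ES module); the host name is a placeholder for your own n8n URL, and the workflow must be active.

```javascript
// Hypothetical test sender -- replace the URL with your own n8n webhook URL.
const payload = {
  temperature: 78.3,
  humidity: 44.2,
  voltage: 395.7,
  timestamp: "2024-06-01T12:00:00Z",
};

const res = await fetch("https://your-n8n/webhook/sensor-data", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify(payload),
});

console.log(res.status, await res.text());
```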
✨ How To Customize
| Customization | Method |
|---------------|--------|
| Add more fields (e.g., wind_speed) | Update the Function & InfluxDB nodes |
| Add field/unit conversion | Use math in the Function node |
| Send email alerts on anomalies | Add an IF → Email branch after the Function node |
| Store in parallel in Google Sheets | Add a Google Sheets node for hybrid logging |
➕ Add-ons (Advanced)
| Add-on | Description |
|--------|-------------|
| 📊 Grafana Integration | Real-time charts using InfluxDB |
| 📧 Email on Faulty Data | Notify if voltage < 0 or temperature too high |
| 🧠 AI Filtering | Add OpenAI or TensorFlow for anomaly detection |
| 🔄 Dual Logging | Save data to both InfluxDB and BigQuery/Sheets |
📌 Use Case Examples
- Remote solar inverter sends temperature and voltage via webhook.
- Environmental sensor hub logs humidity and air quality data every minute.
- Smart greenhouse logs climate control sensor metrics.
- Edge IoT devices periodically report health and diagnostics remotely.
🧯 Troubleshooting Guide
| Issue | Cause | Solution |
|-------|-------|----------|
| No data logged in InfluxDB | Invalid credentials or DB name | Recheck InfluxDB values in config |
| Webhook not triggered | Wrong method or endpoint | Confirm it is a POST to /webhook/sensor-data |
| Data gets filtered | Readings outside valid range | Check logic in Function node |
| Data not appearing in dashboard | Influx write format error | Inspect InfluxDB log and field names |
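When data seems to vanish, it helps to query InfluxDB directly and rule the dashboard out. The check below is a minimal sketch assuming InfluxDB 1.x, placeholder host and database names, and the `sensor_readings` measurement; for InfluxDB 2.x, use the `/api/v2/query` endpoint with a Flux query and token auth instead.

```javascript
// Hypothetical check: fetch the most recent points via the InfluxDB 1.x query API.
const q = encodeURIComponent('SELECT * FROM "sensor_readings" ORDER BY time DESC LIMIT 5');
const res = await fetch(
  `http://<your-influxdb-host>:8086/query?db=<your-database-name>&q=${q}`
);
console.log(JSON.stringify(await res.json(), null, 2));
```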
📞 Need Assistance?
Need help integrating this workflow into your energy monitoring system or need InfluxDB dashboards built for you?
👉 Contact WeblineIndia | Experts in workflow automation and time-series analytics.
Clean and Log IoT Sensor Data to InfluxDB
This n8n workflow provides a robust solution for receiving, cleaning, and logging IoT sensor data to an InfluxDB database. It's designed to be triggered by incoming webhook requests, making it highly flexible for various IoT device integrations.
What it does
This workflow automates the following steps:
- Receives Sensor Data: Listens for incoming HTTP POST requests at a defined webhook URL. This acts as the entry point for your IoT sensor data.
- Cleans and Transforms Data: Processes the incoming raw sensor data using a custom JavaScript function. This step is crucial for standardizing data formats, extracting relevant fields, and performing any necessary data type conversions or calculations.
- Prepares Data for InfluxDB: Formats the cleaned data into the specific line protocol required by InfluxDB. This includes defining measurements, tags, and fields.
- Logs Data to InfluxDB: Sends the prepared data as an HTTP POST request to your InfluxDB instance, ensuring your sensor readings are persistently stored.
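For instance, a payload of `{ "temperature": 78.3, "humidity": 44.2, "voltage": 395.7, "timestamp": "2024-06-01T12:00:00Z" }` could become a single line-protocol record like the one below; the `sensor_readings` measurement and `device_id` tag are illustrative, and the timestamp is nanoseconds since the Unix epoch.

```
sensor_readings,device_id=dev-01 temperature=78.3,humidity=44.2,voltage=395.7 1717243200000000000
```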
Prerequisites/Requirements
To use this workflow, you will need:
- n8n Instance: A running n8n instance to host and execute the workflow.
- InfluxDB Instance: An accessible InfluxDB instance (version 1.x or 2.x, configured for HTTP API access).
- InfluxDB Configuration:
  - URL: The base URL of your InfluxDB instance (e.g., `http://localhost:8086`).
  - Database/Bucket: The name of the database (InfluxDB 1.x) or bucket (InfluxDB 2.x) where you want to store the data.
  - Authentication: Depending on your InfluxDB setup, you might need an API token (InfluxDB 2.x) or a username/password (InfluxDB 1.x). The current workflow uses basic authentication.
- IoT Devices/Sensors: Devices capable of sending HTTP POST requests with sensor data to the n8n webhook URL.
Setup/Usage
- Import the Workflow:
  - Download the provided JSON file.
  - In your n8n instance, click on "Workflows" in the left sidebar.
  - Click "New" and then "Import from JSON".
  - Paste the JSON content or upload the file.
- Configure the Webhook Trigger:
  - The "Webhook" node is already configured to listen for POST requests.
  - After activating the workflow, n8n will provide a unique URL for this webhook. Copy this URL.
  - Configure your IoT devices or data sources to send HTTP POST requests to this URL, with the sensor data in the request body (JSON format is recommended).
- Customize the Function Node:
  - The "Function" node (named "Function") contains JavaScript code for data cleaning and transformation.
  - Crucially, you will need to modify this code to match the specific structure of the data sent by your IoT devices and to define how you want to transform it into the InfluxDB line protocol.
  - The workflow ships with only a basic structure, so adapt the code to extract fields such as `temperature`, `humidity`, `device_id`, etc. from your incoming webhook data and format them into InfluxDB line protocol (a sketch of such a Function node body follows these steps).
  - Example of InfluxDB line protocol: `measurement,tag_key=tag_value field_key=field_value timestamp`
- Configure the HTTP Request Node (InfluxDB):
  - The "HTTP Request" node (named "HTTP Request") is responsible for sending data to InfluxDB.
  - URL: Update the URL to your InfluxDB instance's write endpoint.
    - InfluxDB 1.x: `http://<your-influxdb-host>:8086/write?db=<your-database-name>`
    - InfluxDB 2.x: `http://<your-influxdb-host>:8086/api/v2/write?org=<your-org-id>&bucket=<your-bucket-name>&precision=ns` (adjust precision as needed)
  - Authentication:
    - If using InfluxDB 1.x with username/password, configure "Basic Auth" credentials.
    - If using InfluxDB 2.x, set the `Authorization` header to `Token <your-influxdb-token>`.
  - Body: The body of the request is populated automatically by the output of the "Function" node. Ensure the "Function" node outputs data in the InfluxDB line protocol format (a standalone write example appears at the end of this section).
- Activate the Workflow:
  - Once all configurations are complete, activate the workflow by toggling the "Active" switch in the top right corner of the n8n editor.
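As a starting point for the Function node customization above, the sketch below shows one way to turn the incoming webhook body into line protocol for the HTTP Request node. The `sensor_readings` measurement, the `device_id` tag, and the field names are assumptions to adapt to your own payloads.

```javascript
// Hypothetical n8n Function node body: build one InfluxDB line-protocol record per item.
return items.map((item) => {
  const d = item.json.body ?? item.json; // webhook payloads may arrive nested under "body"

  const measurement = "sensor_readings"; // assumed measurement name
  const tags = `device_id=${d.device_id ?? "unknown"}`; // assumed tag
  const fields = [
    `temperature=${Number(d.temperature)}`,
    `humidity=${Number(d.humidity)}`,
    `voltage=${Number(d.voltage)}`,
  ].join(",");

  // Nanosecond timestamp (matches precision=ns in the write URL); fall back to "now".
  const ms = d.timestamp ? new Date(d.timestamp).getTime() : Date.now();
  const ts = BigInt(ms) * 1000000n;

  return { json: { line: `${measurement},${tags} ${fields} ${ts}` } };
});
```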
Now, any data sent to the webhook URL will be processed and logged to your InfluxDB instance.
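If you want to verify the write path independently of n8n, you can push a single record straight to InfluxDB. The sketch below targets the InfluxDB 2.x write API with placeholder host, org, bucket, and token values (for 1.x, use the `/write?db=...` URL with basic auth instead); InfluxDB answers a successful write with HTTP 204.

```javascript
// Hypothetical smoke test for the InfluxDB 2.x write endpoint -- replace the placeholders.
const url =
  "http://<your-influxdb-host>:8086/api/v2/write?org=<your-org-id>&bucket=<your-bucket-name>&precision=ns";

const line =
  "sensor_readings,device_id=dev-01 temperature=78.3,humidity=44.2,voltage=395.7 1717243200000000000";

const res = await fetch(url, {
  method: "POST",
  headers: {
    Authorization: "Token <your-influxdb-token>",
    "Content-Type": "text/plain; charset=utf-8",
  },
  body: line,
});

// InfluxDB returns 204 No Content on a successful write.
console.log(res.status, res.status === 204 ? "write accepted" : await res.text());
```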