Send weather alerts to your mobile phone with OpenWeatherMap and SIGNL4
Get weather alerts on your mobile phone via push, SMS, or voice call. This flow fetches weather information every morning and sends an alert to your SIGNL4 on-call team. For example, you can send out weather alerts in case of freezing temperatures, snow, rain, hail storms, hot weather, etc. The flow also supports automatic alert resolution: if the temperature goes back up, for example, the alert is closed automatically in the app.

Use cases:

- Dispatch snow removal teams
- Inform car dealers to protect the cars parked outside in case of hail storms
- Set sails if there are high winds
- And much more ...

The flow can easily be adapted to other weather warnings, like rain, hail storms, etc. A sketch of the alert-decision logic follows below.
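For orientation, here is a minimal sketch of how the alert decision could look in an n8n Code node. It assumes the OpenWeatherMap current-weather response (with units=metric, where the temperature lives in main.temp); the threshold, message, and external ID are illustrative, not the template's exact configuration.

```javascript
// Minimal sketch: decide whether to raise or resolve the weather alert.
// Assumes the OpenWeatherMap "current weather" JSON with units=metric.
const weather = $input.first().json;
const FREEZING_THRESHOLD_C = 0; // adjust for snow, heat, wind, etc.

const temp = weather.main.temp;
const condition = weather.weather?.[0]?.main ?? 'unknown';

return [{
  json: {
    raiseAlert: temp <= FREEZING_THRESHOLD_C, // routes to the "alert" or "resolve" branch
    message: `Weather warning: ${temp} °C, ${condition}`,
    externalId: 'daily-weather-alert', // a stable ID lets SIGNL4 close the alert automatically
  },
}];
```

An IF node can then branch on raiseAlert, sending a new SIGNL4 alert when it is true and resolving the open alert when it is false.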
Write all Linear tickets to Google Sheets
Use case

Track all Linear tickets in Google Sheets. Useful if you want to run some custom analysis but don't want to pay for Linear's Plus features (Linear Insights), or need analysis those features don't cover.

Setup

1. Add your Linear API header key.
2. Add your Google Sheets credentials.
3. Update which teams to fetch tickets from in the GraphQL nodes (a sample query is sketched below).
4. Update which Google Sheets page to write all the tickets to. You only need to add one column, id, in the sheet; the Google Sheets node in automatic mapping mode will handle adding the rest of the columns.
5. Set any custom data on each ticket.
6. Activate the workflow 🚀

How to adjust this template

Set any custom fields you want to get out of this; that is quick to do in n8n.
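As a starting point, here is a minimal example of the kind of query the GraphQL nodes can send to Linear's API. YOUR_TEAM_ID is a placeholder, and the selected fields are just a baseline you can extend with whatever your analysis needs.

```graphql
# Minimal Linear query sketch: fetch a page of issues for one team.
# "YOUR_TEAM_ID" is a placeholder; add fields as needed.
query {
  team(id: "YOUR_TEAM_ID") {
    issues(first: 50) {
      nodes {
        id
        identifier
        title
        createdAt
        state { name }
      }
      pageInfo {
        hasNextPage
        endCursor
      }
    }
  }
}
```

The pageInfo block supports cursor-based pagination, so the workflow can loop with an `after: endCursor` argument until hasNextPage is false.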
Automated brand mentions tracker with GPT-4o, Google Sheets, and email
This workflow enables you to automate the daily monitoring of how an AI model (like ChatGPT) responds to specific queries relevant to your market. It identifies mentions of your brand and predefined competitors, logs detailed interactions in Google Sheets, and delivers a comprehensive email report.

Main Use Cases

- Monitor how your brand is mentioned by AI in response to relevant user queries.
- Track mentions of key competitors to understand AI's comparative positioning.
- Gain insights into AI's current knowledge and portrayal of your brand and market landscape.
- Automate daily intelligence gathering on AI-driven brand perception.

How it works

The workflow operates as a scheduled process, organized into these stages:

1. Configuration & Scheduling: Triggers daily (or can be run manually). Key variables are defined within the workflow: your brand name (e.g., "YourBrandName"), a list of queries to ask the AI, and a list of competitor names to track in responses.
2. AI Querying: For each predefined query, the workflow sends a request to the OpenAI ChatGPT API (via an HTTP Request node).
3. Response Analysis: Each AI response is processed by a Code node (sketched below) to:
   - Check if your brand name is mentioned (case-insensitive).
   - Identify if any of the listed competitors are mentioned (case-insensitive).
   - Extract the core AI response content (limited to 500 characters for brevity in logs/reports).
4. Data Logging to Google Sheets: Detailed results for each query, including timestamp, date, the query itself, query index, your brand name, the AI's response, whether your brand was mentioned, and any errors, are appended to a specified Google Sheet.
5. Email Report Generation: A comprehensive HTML email report is compiled, summarizing:
   - Total queries processed, number of times your brand was mentioned, total competitor mentions, and any errors encountered.
   - A summary of competitor mentions, listing each competitor and how many times they were mentioned.
   - A detailed table listing each query, whether your brand was mentioned, and which competitors (if any) were mentioned in the AI's response.
6. Automated Reporting: The generated HTML email report is sent to specified recipients, providing a daily snapshot of AI interactions.

Summary Flow: Schedule/Workflow Trigger → Initialize Brand, Queries, Competitors (in Code node) → For each Query: Query ChatGPT API → Process AI Response (Check for Brand & Competitor Mentions) → Log Results to Google Sheets → Generate Consolidated HTML Email Report → Send Email Notification

Benefits:

- Fully automated daily monitoring of AI responses concerning your brand and competitors.
- Objective insights into how AI models are representing your brand in user interactions.
- Actionable competitive intelligence by tracking competitor mentions.
- Centralized logging in Google Sheets for historical analysis and trend spotting.
- Easily customizable with your specific brand, queries, competitor list, and reporting recipients.
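A minimal sketch of the response-analysis Code node follows. The brand and competitor names are illustrative (in the template they come from the configuration step), and the code assumes the standard OpenAI chat completion response shape.

```javascript
// Minimal sketch of the response-analysis step. Assumes an OpenAI chat
// completion response; brand/competitor names here are placeholders.
const brand = 'YourBrandName';
const competitors = ['CompetitorA', 'CompetitorB'];

const raw = $input.first().json.choices?.[0]?.message?.content ?? '';
const answer = raw.slice(0, 500); // cap at 500 characters for logs/reports
const lower = answer.toLowerCase();

return [{
  json: {
    response: answer,
    brandMentioned: lower.includes(brand.toLowerCase()), // case-insensitive check
    competitorsMentioned: competitors.filter(c => lower.includes(c.toLowerCase())),
    timestamp: new Date().toISOString(),
  },
}];
```

The resulting fields map directly onto the Google Sheets columns and the summary counts in the email report.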
Extract Amazon product data to Sheets with Olostep API
Olostep Amazon Products Scraper

This n8n template automates Amazon product scraping using the Olostep API. Simply enter a search query, and the workflow scrapes multiple Amazon search pages to extract product titles and URLs. Results are cleaned, normalized, and saved into a Google Sheet or Data Table.

Who’s it for

- E-commerce analysts researching competitors and pricing
- Product sourcing teams
- Dropshippers and Amazon sellers
- Automation builders who want quick product lists without manual scraping
- Growth hackers collecting product data at scale

How it works / What it does

1. Form Trigger: The user enters a search query (e.g., “wireless bluetooth headphones”), which is used to build the Amazon search URL.
2. Pagination Setup: A list of page numbers (1–10) is generated automatically; each number loads the corresponding Amazon search results page.
3. Scrape Amazon with Olostep: For each page, Olostep scrapes the Amazon search results. Olostep’s LLM extraction returns title (product title) and url (product link).
4. Parse & Split Results: The JSON output is decoded and turned into individual product items.
5. URL Normalization: If a product URL is relative, it is automatically converted into a full Amazon URL (sketched below).
6. Conditional Check (IF node): Ensures only valid product URLs are stored, which helps avoid scraping Amazon navigation links or invalid items.
7. Insert into Sheet / Data Table: Each valid product is saved with its title and url.
8. Automatic Looping & Rate Management: A wait step ensures API rate limits are respected while scraping multiple pages.

This workflow gives you a complete, reliable Amazon scraper with no browser automation and no manual copy/paste; everything runs through the Olostep API and n8n.

How to set up

1. Import this template into your n8n account.
2. Add your Olostep API key.
3. Connect your Google Sheets or Data Table.
4. Deploy the form and start scraping with any Amazon search phrase.

Requirements

- Olostep API key
- Google Sheets or Data Table
- n8n cloud or self-hosted instance

How to customize the workflow

- Add more product fields (price, rating, number of reviews, seller name, etc.).
- Extend the pagination range (1–20 or more pages).
- Add filtering logic (e.g., ignore sponsored results).
- Send scraped results to Notion, Airtable, or a CRM.
- Trigger via a Telegram bot instead of a form.

---

👉 This workflow is perfect for e-commerce research, competitive analysis, or building Amazon product datasets with minimal effort.
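Here is a minimal sketch of the URL-normalization step as an n8n Code node, assuming each parsed product item is shaped like { title, url } and that relative URLs look like "/dp/B0..."; the item shape is an assumption based on the extraction fields above.

```javascript
// Minimal normalization sketch: turn relative Amazon links into full URLs.
// Assumes items shaped like { title, url } from the parse/split step.
return $input.all().map(item => {
  const { title, url } = item.json;
  const fullUrl = url && !url.startsWith('http')
    ? `https://www.amazon.com${url}` // prepend the Amazon origin to relative paths
    : url;
  return { json: { title, url: fullUrl } };
});
```

A downstream IF node can then keep only items whose url contains a product path before writing to the sheet.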
Download watermark-free TikTok videos to Google Drive with automated sheets logging
📥 TikTok to MP4 Converter with Google Drive & Sheets

Convert TikTok videos to MP4 or MP3 (without watermark), upload them to Google Drive, and log conversion attempts into Google Sheets automatically, powered by the TikTok Download Audio Video API.

---

📝 Description

This n8n automation accepts a TikTok video URL via a form, sends it to the TikTok Download Audio Video API, downloads the watermark-free MP4, uploads it to Google Drive, and logs the result (success/failure) into Google Sheets.

---

🧩 Node-by-Node Overview

| # | Node | Functionality |
|---|------|---------------|
| 1 | 🟢 Form Trigger | Displays a form for user input of the TikTok video URL. |
| 2 | 🌐 TikTok RapidAPI Request | Calls the TikTok Downloader API to get the MP4 link (sketched below). |
| 3 | 🔍 If Condition | Checks if the API response status is "success". |
| 4 | ⬇️ MP4 Downloader | Downloads the video file using the returned "no watermark" MP4 URL. |
| 5 | ☁️ Upload to Google Drive | Uploads the video file to the Google Drive root folder. |
| 6 | 🔑 Set Google Drive Permission | Makes the file publicly shareable via link. |
| 7 | 📄 Google Sheets (Success) | Logs the TikTok URL + public Drive link into a Google Sheet. |
| 8 | ⏱️ Wait Node | Delays to prevent rapid write operations on error. |
| 9 | 📑 Google Sheets (Failure) | Logs failed attempts with Drive_URL = N/A. |

---

✅ Use Cases

- 📲 Social media managers downloading user-generated content
- 🧠 Educators saving TikTok content for offline lessons
- 💼 Agencies automating short-form video curation
- 🤖 Workflow automation demonstrations with n8n

---

🎯 Key Benefits

✔️ MP4 without watermark via the TikTok Download Audio Video API
✔️ Automated Google Drive upload & shareable links
✔️ Centralized logging in Google Sheets
✔️ Error handling and retry-safe structure
✔️ Fully customizable and extendable within n8n

---

💡 Ideal for anyone looking to automate TikTok video archiving with full control over file storage and access.

🔐 How to Get Your API Key for the TikTok Download Audio Video API

1. Go to 👉 TikTok Download Audio Video API - RapidAPI.
2. Click "Subscribe to Test" (you may need to sign up or log in).
3. Choose a pricing plan (there’s a free tier for testing).
4. After subscribing, click on the "Endpoints" tab.
5. Your API key will be visible in the "x-rapidapi-key" header.

🔑 Copy and paste this key into the httpRequest node in your workflow.

---

Create your free n8n account and set up the workflow in just a few minutes using the link below:
👉 Start Automating with n8n
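For illustration only, here is roughly what the RapidAPI call does, expressed as an n8n Code node. The endpoint URL, host, and input field name are placeholders, not the real values; copy the actual endpoint and host from the API's "Endpoints" tab on RapidAPI and configure them in the httpRequest node.

```javascript
// Illustrative sketch of the RapidAPI request. The host, path, and the
// "tiktokUrl" form field name are placeholders (assumptions), not the
// API's real values.
const tiktokUrl = $input.first().json.tiktokUrl;

const response = await this.helpers.httpRequest({
  method: 'GET',
  url: 'https://<rapidapi-host>/download', // placeholder: copy from the Endpoints tab
  qs: { url: tiktokUrl },
  headers: {
    'x-rapidapi-key': 'YOUR_RAPIDAPI_KEY',
    'x-rapidapi-host': '<rapidapi-host>', // placeholder host
  },
});

return [{ json: response }];
```

In the template itself this is a standard HTTP Request node; the sketch just shows where the key and host headers go.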
Clean and log IoT sensor data to InfluxDB (Webhook | Function | HTTP)
🌡 IoT Sensor Data Cleaner + InfluxDB Logger (n8n | Webhook | Function | InfluxDB)

This workflow accepts raw sensor data from IoT devices via webhook, applies basic cleaning and transformation logic, and writes the cleaned data to an InfluxDB instance for time-series tracking. Perfect for renewable energy sites, smart farms, and environmental monitoring setups using dashboards like Grafana or Chronograf.

⚡ Quick Implementation Steps

1. Import the workflow JSON into your n8n instance.
2. Edit the Set Config node to include your InfluxDB credentials and measurement name.
3. Use the webhook URL (/webhook/sensor-data) in your IoT device or form to send sensor data.
4. Start monitoring your data directly in InfluxDB!

🎯 Who’s It For

- IoT developers and integrators.
- Renewable energy and environmental monitoring teams.
- Data engineers working with time-series data.
- Smart agriculture and utility automation platforms.

🛠 Requirements

| Tool | Purpose |
|------|---------|
| n8n Instance | For automation |
| InfluxDB (v1 or v2) | To store time-series sensor data |
| IoT Device or Platform | To POST sensor data |
| Function Node | To filter and transform data |

🧠 What It Does

- Accepts JSON-formatted sensor data via HTTP POST.
- Validates the data (removes invalid or noisy readings).
- Applies transformations (rounding, timestamp formatting).
- Pushes the cleaned data to InfluxDB for real-time visualization.

🧩 Workflow Components

- Webhook Node: Exposes an HTTP endpoint to receive sensor data.
- Function Node: Filters out-of-range values, formats the timestamp, and rounds data (sketched below).
- Set Node: Stores configurable values like the InfluxDB host, user/pass, and measurement name.
- InfluxDB Node: Writes valid records into the specified database bucket.

🔧 How To Set Up – Step-by-Step

1. Import Workflow: Upload the provided .json file into your n8n workspace.
2. Edit Configuration Node: Update the InfluxDB connection info in the Set Config node: influxDbHost, influxDbDatabase, influxDbUsername, influxDbPassword, and measurement (what you want to name the data set, e.g., sensor_readings).
3. Send Data to Webhook: Webhook URL: https://your-n8n/webhook/sensor-data. Example payload:

```json
{
  "temperature": 78.3,
  "humidity": 44.2,
  "voltage": 395.7,
  "timestamp": "2024-06-01T12:00:00Z"
}
```

4. View in InfluxDB: Log in to your InfluxDB/Grafana dashboard and query the new measurement.

✨ How To Customize

| Customization | Method |
|---------------|--------|
| Add more fields (e.g., wind_speed) | Update the Function & InfluxDB nodes |
| Add field/unit conversion | Use math in the Function node |
| Send email alerts on anomalies | Add an IF → Email branch after the Function node |
| Store in parallel in Google Sheets | Add a Google Sheets node for hybrid logging |

➕ Add‑ons (Advanced)

| Add-on | Description |
|--------|-------------|
| 📊 Grafana Integration | Real-time charts using InfluxDB |
| 📧 Email on Faulty Data | Notify if voltage < 0 or temperature too high |
| 🧠 AI Filtering | Add OpenAI or TensorFlow for anomaly detection |
| 🗃 Dual Logging | Save data to both InfluxDB and BigQuery/Sheets |

📈 Use Case Examples

- A remote solar inverter sends temperature and voltage via webhook.
- An environmental sensor hub logs humidity and air quality data every minute.
- A smart greenhouse logs climate-control sensor metrics.
- Edge IoT devices periodically report health and diagnostics remotely.
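The cleaning pass could look like the following sketch, written as an n8n Code node and assuming the payload shape shown in the setup steps above; the valid ranges are illustrative defaults that you should tune to your sensors.

```javascript
// Minimal cleaning sketch: drop out-of-range readings, round values,
// normalize the timestamp. Ranges below are illustrative assumptions.
const readings = $input.all().map(i => i.json.body ?? i.json); // webhook POST body

return readings
  .filter(r =>
    typeof r.temperature === 'number' && r.temperature > -40 && r.temperature < 150 &&
    typeof r.humidity === 'number' && r.humidity >= 0 && r.humidity <= 100
  )
  .map(r => ({
    json: {
      temperature: Math.round(r.temperature * 10) / 10, // round to one decimal place
      humidity: Math.round(r.humidity * 10) / 10,
      voltage: r.voltage,
      timestamp: new Date(r.timestamp ?? Date.now()).toISOString(),
    },
  }));
```

Readings that fail the filter simply produce no output item, which is why out-of-range data never reaches InfluxDB (see the troubleshooting table below).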
🧯 Troubleshooting Guide

| Issue | Cause | Solution |
|-------|-------|----------|
| No data logged in InfluxDB | Invalid credentials or DB name | Recheck the InfluxDB values in the config |
| Webhook not triggered | Wrong method or endpoint | Confirm it is a POST to /webhook/sensor-data |
| Data gets filtered | Readings outside the valid range | Check the logic in the Function node |
| Data not appearing in dashboard | Influx write format error | Inspect the InfluxDB log and field names |

📞 Need Assistance?

Need help integrating this workflow into your energy monitoring system, or need InfluxDB dashboards built for you?
👉 Contact WeblineIndia | Experts in workflow automation and time-series analytics.
Schedule supplier follow-ups from Airtable POs to Google Calendar with AI, Slack & Gmail
📊 Description

Ensure suppliers never miss a follow-up by automating overdue purchase order tracking and scheduling. 📦⏰ This workflow checks Airtable every weekday morning for open POs older than seven days without scheduled follow-ups, generates Google Calendar events, updates Airtable with the follow-up link, and sends notifications to your team via Slack and Gmail. It centralizes supplier management and eliminates manual reminders, helping operations teams stay on top of aging purchase orders and vendor commitments. 📅📣

🔁 What This Template Does

1️⃣ Runs on a weekday schedule (default: 10 AM) to scan Airtable for overdue open POs. 📆
2️⃣ Filters POs that are missing follow-up links and are older than 7 days (sketched below). 🔍
3️⃣ Processes each overdue PO one by one. 🔄
4️⃣ Creates a Google Calendar event for each supplier follow-up. 📅
5️⃣ Saves the event link back into Airtable and updates the follow-up status to “Pending.” 📝
6️⃣ Sends initial and final Slack notifications with PO details and scheduling links. 💬
7️⃣ Sends a Gmail confirmation email to the assigned supplier or internal team. ✉️
8️⃣ Ensures all stakeholders have quick access to follow-up links and event details. 🔗

⭐ Key Benefits

✅ Automates supplier follow-up scheduling with zero manual effort
✅ Prevents overdue purchase orders from being forgotten
✅ Consolidates PO aging logic, event creation, and notifications
✅ Keeps suppliers and internal teams aligned with one workflow
✅ Ensures follow-ups are consistently logged and traceable in Airtable
✅ Improves accountability for purchasing and operations teams

🧩 Features

- Weekday schedule trigger (cron-based)
- Airtable “Purchase Orders” table integration
- Google Calendar event creation with tracking links
- Airtable record update with follow-up status
- Slack notifications (initial + final)
- Gmail email confirmations
- Batch processing for multiple overdue POs
- Automated vendor accountability loop

🔐 Requirements

- Airtable Personal Access Token
- Google Calendar OAuth2 credentials
- Slack API token
- Gmail OAuth2 credentials
- Airtable table containing: PO ID, Supplier Info, Status, PO Date, Follow-up Link

🎯 Target Audience

- Procurement & purchasing teams
- Operations managers handling vendor communication
- Supply chain coordinators tracking overdue POs
- Teams using Airtable for purchase order management
- Businesses that need consistent supplier follow-ups
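As a sketch, the overdue-PO filter could look like the following n8n Code node. The field names (Status, PO Date, Follow-up Link) are assumptions based on the Airtable table described above; adjust them to your base.

```javascript
// Illustrative overdue-PO filter: open status, no follow-up link, and a
// PO date more than 7 days old. Field names are assumptions.
const cutoff = Date.now() - 7 * 24 * 60 * 60 * 1000;

return $input.all().filter(item => {
  const po = item.json.fields ?? item.json; // handle Airtable's nested "fields"
  return (
    po.Status === 'Open' &&
    !po['Follow-up Link'] &&
    new Date(po['PO Date']).getTime() < cutoff
  );
});
```

The surviving records then flow into the batch loop that creates the calendar events and notifications.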
Automated economic calendar PDF reports to Telegram via RapidAPI
Stay ahead of the markets with this fully automated n8n workflow that delivers AI-generated economic calendar PDF updates directly to your Telegram or Discord. Powered by the Economic Events Calendar API via RapidAPI, this workflow is perfect for traders, investors, and financial community managers who want timely notifications about high- and medium-impact global economic events, with no coding required.

Key Features

- Automated PDF Generation: Receive the schedule of upcoming medium- and high-impact economic calendar events (up to 7 days in the future) straight to Telegram/Discord.
- Flexible Scheduling: The default update interval is every 7 days, adjustable to any custom frequency.
- Customizable Date Range: Easily set the window for economic event coverage, e.g., today to 7 days ahead (a date-window sketch follows below).
- No Coding Needed: Simple setup. Just plug in your API keys and Telegram credentials.

Who Is This For?

- Retail and professional traders
- Financial influencers and Telegram group admins
- Community managers seeking hands-off, real-time macroeconomic alerts

Why Choose This Workflow?

- Save Time: No more manual economic calendar checks. Get the upcoming week's high- and medium-impact news events delivered to your Telegram in a neatly formatted PDF.
- Stay Informed: Instantly know about key market-moving events.
- Easy Integration: Works out of the box with n8n and Telegram.

Get started today and never miss a critical economic event again!
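A small sketch of how the date window could be computed in an n8n Code node before the API call: today through 7 days ahead, formatted as YYYY-MM-DD. The output parameter names (from/to) are assumptions; the RapidAPI endpoint may expect different ones.

```javascript
// Sketch: build the date window for the calendar request.
// "from"/"to" parameter names are assumptions, not the API's real keys.
const DAYS_AHEAD = 7;
const fmt = d => d.toISOString().slice(0, 10); // YYYY-MM-DD

const from = new Date();
const to = new Date(from.getTime() + DAYS_AHEAD * 24 * 60 * 60 * 1000);

return [{ json: { from: fmt(from), to: fmt(to) } }];
```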
Brand DNA generator using JotForm, Google Search, Gemini AI & Notion
Automated Brand DNA Generator Using JotForm, Google Search, AI Extraction & Notion

The Brand DNA Generator workflow automatically scans and analyzes online content to build a company’s Brand DNA profile. It starts with input from a form, then crawls the company’s website and Google search results to gather relevant information. Using AI-powered extraction, the system identifies insights such as value propositions, ideal customer profiles (ICP), pain points, proof points, brand tone, and more. All results are neatly formatted and automatically saved to a Notion database as a structured Brand DNA report, eliminating the need for manual research.

🛠️ Key Features

- Automated data capture: collects company data directly from form submissions and Google search results.
- AI-powered insight extraction: uses LLMs to extract and summarize brand-related information from website content.
- Fetches clean text from multiple web pages using HTTP requests and a content extractor.
- Merges extracted data from multiple sources into a single Brand DNA JSON structure (a hypothetical shape is sketched below).
- Automatically creates a new page in Notion with formatted sections (headings, paragraphs, and bullet points).
- Handles parsing failures and processes multiple pages efficiently in batches.

🔧 Requirements

- JotForm API Key, to capture company data from form submissions.
- SerpAPI Key, to perform automated Google searches.
- OpenRouter / LLM API, for AI-based language understanding and information extraction.
- Notion Integration Token & Database ID, to save the final Brand DNA report to Notion.

🧩 Setup Instructions

1. Connect your JotForm account and select the form containing the fields Company Name and Company Website.
2. Add your SerpAPI key.
3. Configure the AI model using OpenRouter or another LLM provider.
4. Enter your Notion credentials and specify the databaseId in the Create a Database Page node.
5. Customize the prompt in the Information Extractor node to modify the tone or structure of the AI analysis (optional).
6. Activate the workflow, then submit data through the JotForm to test automatic generation and Notion integration.

💡 Final Output

A complete Brand DNA Report containing:

- Company Description
- Ideal Customer Profile
- Pain Points
- Value Proposition
- Proof Points
- Brand Tone
- Suggested Keywords

All generated automatically from the company’s online presence and stored in Notion with no manual input required.
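To make the merge step concrete, here is a hypothetical shape for the consolidated Brand DNA object before it is written to Notion, expressed as an n8n Code node. The key names mirror the report sections above but are illustrative, not the template's exact schema.

```javascript
// Hypothetical merged Brand DNA structure (illustrative keys and values),
// as it might look after combining the per-page extraction results.
return [{
  json: {
    companyDescription: 'One-paragraph summary of what the company does',
    idealCustomerProfile: 'Who the product serves best',
    painPoints: ['Problem the product solves', 'Another customer pain'],
    valueProposition: 'Why customers choose this company',
    proofPoints: ['Case study', 'Notable customers or metrics'],
    brandTone: 'e.g., confident, friendly, technical',
    suggestedKeywords: ['keyword-1', 'keyword-2'],
  },
}];
```

Each top-level key then maps to a heading-plus-content section when the Notion page is created.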