
Automatic monitoring of multiple URLs with downtime alerts

Oriol Seguí
2/3/2026

This n8n workflow allows you to automatically monitor the status of multiple URLs in a simple and efficient way. You just need to enter the URLs you want to scan and run the workflow (either manually or scheduled). For each URL, an availability check is performed. The results are logged in a Google Sheet, clearly distinguishing between successful checks and failures (downtime). If any URL fails, the system filters these errors and automatically sends an email alert notifying you of the detected outages.

The workflow includes help messages in both English and Spanish, integrates with Google Sheets and Gmail, and is suitable for both one-off tasks and scheduled monitoring.

Who is it for?

  • Webmasters
  • SEO & Marketing Teams
  • SysAdmins
  • Anyone needing automated website uptime monitoring

How does it work?

Enter the URLs to scan in the “URLs” field, then trigger the workflow manually or schedule it to run automatically. For each URL, the workflow checks whether it is online or down and logs the status (success or error) in a Google Sheet. Finally, it filters out the failed URLs and sends an email alert notifying you of the detected outages.

Automatic Monitoring of Multiple URLs with Downtime Alerts

This n8n workflow provides a robust solution for continuously monitoring the availability of multiple URLs and sending immediate email alerts when a website experiences downtime. It simplifies the process of tracking website health, ensuring you are promptly notified of any issues.

What it does

This workflow automates the following steps:

  1. Triggers on a Schedule: The workflow can be configured to run at regular intervals (e.g., every 5 minutes, hourly) to check the status of your monitored URLs.
  2. Reads URLs from Google Sheets: It fetches a list of URLs to monitor from a specified Google Sheet. This allows for easy management and scaling of the monitored websites without modifying the workflow itself.
  3. Iterates Through URLs: For each URL retrieved from the Google Sheet, the workflow performs a separate HTTP request.
  4. Performs HTTP Request: It sends an HTTP GET request to each URL to check its availability.
  5. Checks Response Status: The workflow evaluates the HTTP response status code. If the status code indicates an error (e.g., 4xx or 5xx), it flags the URL as down (a sketch of this logic follows the list).
  6. Prepares Alert Data: For any URLs identified as down, it gathers relevant information such as the URL, the error status, and a timestamp.
  7. Consolidates Alerts: All downtime alerts from a single execution are grouped together.
  8. Sends Email Notification: If any URLs are found to be down, a consolidated email alert is sent via Gmail, detailing which URLs are affected and their respective error statuses.
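
For reference, here is a minimal sketch of the status-check and filtering logic (steps 5–6) as it might look in an n8n Code node. It assumes each incoming item carries url and statusCode fields and that the HTTP Request node is configured to continue on error rather than throw; adapt the field names to match your own nodes.

```javascript
// n8n Code node ("Run Once for All Items" mode).
// Assumed input shape per item: { url, statusCode } -- adjust to your HTTP Request node.
const downUrls = [];

for (const item of $input.all()) {
  const statusCode = item.json.statusCode ?? 0;

  // Treat anything outside the 2xx-3xx range (or a missing response) as downtime.
  if (statusCode < 200 || statusCode >= 400) {
    downUrls.push({
      json: {
        url: item.json.url,
        status: statusCode || 'no response',
        checkedAt: new Date().toISOString(),
      },
    });
  }
}

// Only the failing URLs continue downstream, matching the "filter errors" step.
return downUrls;
```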

Prerequisites/Requirements

To use this workflow, you will need:

  • n8n Instance: A running instance of n8n.
  • Google Sheets Account: To store the list of URLs you want to monitor.
    • Google Sheets Credential: An n8n credential configured for Google Sheets (OAuth2 recommended).
    • Google Sheet with URLs: A Google Sheet containing a column with the URLs to be monitored (an example layout follows this list).
  • Gmail Account: To send email notifications.
    • Gmail Credential: An n8n credential configured for Gmail (OAuth2 recommended).
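
For reference, the simplest possible sheet is a single header row followed by one URL per row; the column name and URLs below are only examples:

```
URL
https://example.com
https://another-site.org
```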

Setup/Usage

  1. Import the Workflow:
    • Download the provided JSON file for this workflow.
    • In your n8n instance, go to "Workflows" and click "New".
    • Click the "Import from JSON" button and paste the workflow JSON or upload the file.
  2. Configure Credentials:
    • Locate the "Google Sheets" node and configure your Google Sheets credential.
    • Locate the "Gmail" node and configure your Gmail credential.
  3. Prepare Google Sheet:
    • Create a Google Sheet with a column named URL (or adjust the Google Sheets node to match your column name).
    • Populate this column with the full URLs you wish to monitor (e.g., https://example.com, https://another-site.org).
  4. Configure Google Sheets Node:
    • In the "Google Sheets" node, specify the "Spreadsheet ID" and "Sheet Name" where your URLs are located.
  5. Configure Schedule Trigger:
    • Adjust the "Schedule Trigger" node to your desired monitoring frequency (e.g., every 5 minutes, every hour); example cron expressions follow this list.
  6. Activate the Workflow:
    • Save the workflow.
    • Toggle the workflow to "Active" to start monitoring.
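
If you use the cron option in the Schedule Trigger node instead of the simple interval fields, standard cron expressions apply. These two are typical choices for uptime checks:

```
*/5 * * * *   (every 5 minutes)
0 * * * *     (every hour, on the hour)
```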

Once activated, the workflow will automatically check your specified URLs at the set intervals and send email alerts if any downtime is detected.
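
If you want to shape the alert email yourself, the consolidation step (steps 7–8 above) could be expressed in a Code node placed just before the Gmail node. This is only a sketch: it assumes the failing items carry the url, status, and checkedAt fields from the check sketch earlier, and the subject/body output fields are placeholders to map into the Gmail node.

```javascript
// n8n Code node ("Run Once for All Items" mode).
// Builds one consolidated alert from all failing URLs in this execution.
const failures = $input.all();

const lines = failures.map(
  (item) => `- ${item.json.url} (status: ${item.json.status}, checked at: ${item.json.checkedAt})`
);

return [
  {
    json: {
      subject: `Downtime alert: ${failures.length} URL(s) unreachable`,
      body: `The following URLs failed their availability check:\n\n${lines.join('\n')}`,
    },
  },
];
```

In the Gmail node, the Subject and Message fields can then reference {{ $json.subject }} and {{ $json.body }}.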
