Automated Multi-Platform Game Deals Tracker with Deku Deals & Gmail Alerts
How it works
This advanced workflow keeps you informed about the hottest game deals across multiple platforms (Nintendo Switch, PlayStation, Xbox, PC, etc.), aggregated by Deku Deals. No more manual checking for price drops – it automatically:
- Scans Deku Deals daily for popular game deals.
- Extracts key information such as game title, platforms, current price, original price, discount, and direct links.
- Tracks previously seen deals in a local database, so you are only notified about genuinely new price drops or newly added games.
- Delivers a clear, concise notification to your email (or preferred service) summarizing the best new deals.
Stay ahead of the sales, save money, and never miss a great game deal again!
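The "only notify about genuinely new deals" step above can be sketched outside n8n as a plain function. Inside the workflow this logic would live in a Function/Code node with the seen set persisted via workflow data; the function and field names below are illustrative, not the template's actual code:

```javascript
// Minimal sketch of the deduplication logic, assuming each deal has
// `title` and `price` fields. Keying on title + price means a further
// price drop on an already-seen game still counts as a new deal.
function filterNewDeals(deals, seenKeys) {
  const fresh = [];
  for (const deal of deals) {
    const key = `${deal.title}|${deal.price}`;
    if (!seenKeys.has(key)) {
      seenKeys.add(key);
      fresh.push(deal);
    }
  }
  return fresh;
}

const seen = new Set(['Hollow Knight|7.49']);
const deals = [
  { title: 'Hollow Knight', price: '7.49' }, // already notified
  { title: 'Celeste', price: '4.99' },       // new deal
];
const fresh = filterNewDeals(deals, seen);
console.log(fresh.map(d => d.title)); // → [ 'Celeste' ]
```

In the real workflow the contents of `seenKeys` would be loaded from and written back to the local database between runs.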
Set up steps
Setting up this workflow involves web scraping, local database management, and some familiarity with web element selectors. It typically takes around 20-35 minutes. You'll need to:
- Authenticate your preferred notification service (e.g., Gmail, Telegram).
- Use your browser's developer tools to find CSS selectors (detailed instructions are provided within the workflow).

No external database setup is required, as the workflow uses n8n's built-in SQLite database.
All detailed setup instructions and specific configuration guidance are provided within the workflow itself using sticky notes.
Automated Multi-Platform Game Deals Tracker with Deku Deals & Gmail Alerts
This n8n workflow automates the process of finding new game deals from Deku Deals, filtering them based on specific criteria, and sending email notifications for relevant deals. It's designed to keep you informed about the best game prices across multiple platforms without manual searching.
What it does
This workflow simplifies tracking game deals by:
- Scheduling Regular Checks: It runs on a predefined schedule (e.g., daily, hourly) to fetch the latest game deals.
- Fetching Game Deals: It makes an HTTP request to a specified Deku Deals URL (likely a filtered search result page) to retrieve game deal data.
- Extracting Deal Information: It parses the HTML response from Deku Deals to extract key information about each game deal, such as game title, price, platform, and discount.
- Filtering Deals: It includes a "Function" node, which suggests custom logic to filter deals based on user-defined criteria (e.g., specific platforms, minimum discount, price range).
- Conditional Alerting: An "If" node checks if any deals meet the defined criteria after filtering.
- Sending Email Alerts: If relevant deals are found, it sends an email notification via Gmail with the details of the deals.
- Deduplicating/Aggregating (optional): The "Item Lists" node is not explicitly configured in the provided JSON, but it is typically used for deduplication, sorting, or aggregation. Here it could ensure that only unique deals are processed, or format the deal list before emailing.
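As a sketch of what the Function node's filtering step might contain (the field names, platform list, and discount threshold below are assumptions, not the template's actual code):

```javascript
// Illustrative filter: keep deals on chosen platforms with a large enough
// discount. In an n8n Function/Code node, `items` is the incoming items
// array and each item's data sits under `json`.
function filterDeals(items, { platforms, minDiscountPct }) {
  return items.filter(({ json }) => {
    // Discounts may be scraped as strings like "-80%"; normalize to a number.
    const pct = Math.abs(parseFloat(String(json.discount).replace('%', '')));
    return platforms.includes(json.platform) && pct >= minDiscountPct;
  });
}

const items = [
  { json: { title: 'Game A', platform: 'Switch', discount: '-80%' } },
  { json: { title: 'Game B', platform: 'PC', discount: '-10%' } },
];
const kept = filterDeals(items, { platforms: ['Switch', 'PC'], minDiscountPct: 50 });
console.log(kept.map(d => d.json.title)); // → [ 'Game A' ]
```

The If node downstream then only needs to check whether the filtered array is non-empty.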
Prerequisites/Requirements
To use this workflow, you will need:
- n8n Instance: A running instance of n8n.
- Deku Deals URL: A pre-filtered Deku Deals URL (e.g., for specific platforms, minimum discount, or game types) that the HTTP Request node will target.
- Gmail Account: A Gmail account configured as a credential in n8n to send email alerts.
Setup/Usage
- Import the Workflow: Import the provided JSON into your n8n instance.
- Configure Credentials:
- Gmail: Set up your Gmail credentials in n8n. This typically involves OAuth2 authentication with Google.
- Configure Nodes:
- Cron (Node 7): Adjust the schedule to your preference (e.g., daily at a specific time).
- HTTP Request (Node 19): Update the URL to your desired Deku Deals search page.
- HTML Extract (Node 114): This node will need to be configured with the correct CSS selectors to extract the game title, price, discount, and other relevant information from the Deku Deals HTML. You may need to inspect the Deku Deals website to find the appropriate selectors.
- Function (Node 14): Customize the JavaScript code within this node to define your specific filtering logic for game deals. For example, you might want to filter by platform, price, or discount percentage.
- If (Node 20): Configure the conditions in this node to check if the filtered deals meet your criteria for sending an alert. For instance, check if the output from the "Function" node contains any items.
- Gmail (Node 356): Configure the recipient email address, subject line, and the body of the email to include the extracted deal information.
- Item Lists (Node 516): If you intend to deduplicate or sort the deals, configure this node accordingly.
- Activate the Workflow: Once all configurations are complete, activate the workflow. It will start running according to your defined schedule.
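Prices extracted by the HTML Extract node arrive as strings, so the Function node needs to normalize them before filtering. A hedged sketch of that parsing (the exact price format on the page is an assumption):

```javascript
// Turn scraped price strings (e.g. "$59.99", "€14,99") into numbers and
// compute the discount percentage the If node can test against.
function parsePrice(text) {
  // Strip currency symbols, keep digits and separators, normalize comma decimals.
  const cleaned = text.replace(/[^0-9.,]/g, '').replace(',', '.');
  return parseFloat(cleaned);
}

function discountPct(original, current) {
  return Math.round(100 * (1 - current / original));
}

const originalPrice = parsePrice('$59.99'); // 59.99
const currentPrice = parsePrice('$19.99');  // 19.99
console.log(discountPct(originalPrice, currentPrice)); // → 67
```

If Deku Deals changes its price markup, only the selectors in the HTML Extract node and this normalization step should need updating.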
Related Templates
Track competitor SEO keywords with Decodo + GPT-4.1-mini + Google Sheets
This workflow automates competitor keyword research using an OpenAI LLM and Decodo for intelligent web scraping.

Who this is for
- SEO specialists, content strategists, and growth marketers who want to automate keyword research and competitive intelligence.
- Marketing analysts managing multiple clients or websites who need consistent SEO tracking without manual data pulls.
- Agencies or automation engineers using Google Sheets as an SEO data dashboard for keyword monitoring and reporting.

What problem this workflow solves
Tracking competitor keywords manually is slow and inconsistent, and most SEO tools provide limited API access or lack contextual keyword analysis. This workflow solves that by:
- Automatically scraping any competitor's webpage with Decodo.
- Using OpenAI GPT-4.1-mini to interpret keyword intent, density, and semantic focus.
- Storing structured keyword insights directly in Google Sheets for ongoing tracking and trend analysis.

What this workflow does
- Trigger: manually start the workflow or schedule it to run periodically.
- Input Setup: define the website URL and target country (e.g., https://dev.to, france).
- Data Scraping (Decodo): fetch competitor web content and metadata.
- Keyword Analysis (OpenAI GPT-4.1-mini): extract primary and secondary keywords, identify focus topics and semantic entities, generate a keyword density summary and SEO strength score, and recommend optimization and internal linking opportunities.
- Data Structuring: clean and convert the GPT output into JSON.
- Data Storage (Google Sheets): append structured keyword data to a Google Sheet for long-term tracking.

Setup
Prerequisites: an n8n account with workflow editor access, Decodo API credentials (if you are new to Decodo, sign up at visit.decodo.com), an OpenAI API key, a Google Sheets account connected via OAuth2, and the Decodo community node installed.
- Create a Google Sheet with columns such as primarykeywords, seostrengthscore, and keyworddensity_summary, and share it with your n8n Google account.
- Connect credentials: the Decodo API (register, log in, and obtain the Basic Authentication token via the Decodo dashboard), the OpenAI API, and Google Sheets OAuth2.
- Configure input fields: edit the "Set Input Fields" node to set your target site and region.
- Run the workflow: click Execute Workflow in n8n and view the structured results in your connected Google Sheet.

How to customize this workflow
- Track multiple competitors: use a Google Sheet or CSV list of URLs and loop through them with the Split In Batches node.
- Add language detection: add a Gemini or GPT node before keyword analysis to detect content language and adjust prompts.
- Enhance the SEO report: expand the GPT prompt to include backlink insights, metadata optimization, or readability checks.
- Integrate visualization: connect your Google Sheet to Looker Studio for SEO performance dashboards.
- Schedule auto-runs: use the Cron node to run weekly or monthly for competitor keyword refreshes.

Summary
This workflow combines Decodo for intelligent web scraping, OpenAI GPT-4.1-mini for keyword and SEO analysis, and Google Sheets for live tracking and reporting. It is a complete AI-powered SEO intelligence pipeline for teams that want actionable insights on keyword gaps, optimization opportunities, and content focus trends, without relying on expensive SEO SaaS tools.
Automated YouTube video uploads with 12h interval scheduling in JST
This workflow automates batch uploading of multiple videos to YouTube, spacing each upload 12 hours apart in Japan Standard Time (UTC+9) and automatically adding them to a playlist.

⚙️ Workflow Logic
- Manual Trigger — starts the workflow manually.
- List Video Files — uses a shell command to find all .mp4 files under the specified directory (/opt/downloads/单词卡/A1-A2).
- Sort and Generate Items — sorts videos by the day number (dayXX) extracted from filenames and assigns a sequential order value.
- Calculate Publish Schedule (+12h Interval) — computes the next rounded JST hour plus a configurable buffer (default 30 min), staggers each video's scheduled time by order × 12 hours, and converts JST back to UTC for YouTube's publishAt field.
- Split in Batches (1 per video) — iterates over each video item.
- Read Video File — loads the corresponding video from disk.
- Upload to YouTube (Scheduled) — uploads the video privately with the computed publishAtUtc.
- Add to Playlist — adds the newly uploaded video to the target playlist.

🕒 Highlights
- Timezone-safe: pure UTC ↔ JST conversion avoids double-offset errors.
- Sequential scheduling: each upload is 12 hours apart to prevent clustering.
- Customizable: change SPANHOURS, BUFFERMIN, or directory paths easily.
- Retry-ready: each upload and playlist step has retry logic to handle transient errors.

💡 Typical Use Cases
- Multi-part educational video series (e.g., A1–A2 English learning).
- Regular content release cadence without manual scheduling.
- Automated YouTube publishing pipelines for pre-produced content.

Author: Zane · Category: Automation / YouTube / Scheduler · Timezone: JST (UTC+09:00)
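The scheduling step in this template can be sketched as plain date math. The constants below mirror the template's configurable values (SPANHOURS, BUFFERMIN); the exact variable names in the template may differ, and JST's fixed +9h offset (no DST) is what makes the simple shift safe:

```javascript
// Compute UTC publish times spaced 12h apart, anchored to the next
// rounded JST hour plus a buffer. The Date's UTC fields are used as a
// "JST wall clock" frame by shifting the timestamp by +9h, then back.
const JST_OFFSET_MS = 9 * 60 * 60 * 1000;
const SPAN_HOURS = 12;  // gap between uploads
const BUFFER_MIN = 30;  // lead time before the first slot

function publishAtUtc(nowUtc, order) {
  const jst = new Date(nowUtc.getTime() + JST_OFFSET_MS);
  jst.setUTCMinutes(jst.getUTCMinutes() + BUFFER_MIN);
  // Round up to the next full JST hour if not already on one.
  if (jst.getUTCMinutes() || jst.getUTCSeconds() || jst.getUTCMilliseconds()) {
    jst.setUTCHours(jst.getUTCHours() + 1, 0, 0, 0);
  }
  jst.setUTCHours(jst.getUTCHours() + order * SPAN_HOURS);
  // Convert the JST wall-clock time back to a real UTC instant.
  return new Date(jst.getTime() - JST_OFFSET_MS);
}

// 03:10 UTC = 12:10 JST → +30 min buffer → 12:40 → rounded up → 13:00 JST → 04:00 UTC.
const first = publishAtUtc(new Date('2024-05-01T03:10:00Z'), 0);
console.log(first.toISOString()); // → 2024-05-01T04:00:00.000Z
```

Doing the rounding in the shifted frame and only then converting back is what avoids the double-offset errors the template's highlights mention.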
Create personalized email outreach with AI, Telegram bot & website scraping
Demo Personalized Email: this n8n workflow is built for AI and automation agencies to promote their workflows through an interactive demo that prospects can try themselves. The featured system is a deeply personalized email demo.

🔄 How It Works
- Prospect Interaction — a prospect starts the demo via Telegram; the Telegram bot (created with BotFather) connects directly to your n8n instance.
- Demo Guidance — the RAG agent and instructor guide the user step by step through the demo, with instructions and responses generated dynamically from user input.
- Workflow Execution — when the user triggers an action (e.g., testing the email demo), n8n runs the workflow, collecting website data using Crawl4AI or standard HTTP requests.
- Email Demo — the system personalizes and sends a demo email through SparkPost, showing the automation's capability.
- Logging and Control — each user interaction is logged in your database by name and id, and the workflow checks limits to prevent misuse or spam.
- Error Handling — if a low-CPU scraping method fails, the workflow automatically escalates to a higher-CPU method.

⚙️ Requirements
- n8n — automation platform to run the workflow.
- Docker — required to run Crawl4AI.
- Crawl4AI — for intelligent website crawling.
- A Telegram account — to create your Telegram bot via BotFather.
- A SparkPost account — to send personalized demo emails.
- A database (e.g., PostgreSQL, MySQL, or SQLite) — to store log data such as user name and ID.

🚀 Features
- Telegram interface using the BotFather API.
- Instructor and RAG agent to guide prospects through the demo.
- Flow generation limits per user ID to prevent abuse.
- Low-cost yet powerful web scraping, escalating from low- to high-CPU flows if earlier ones fail.

💡 Development Ideas
- Replace the RAG logic with your own query-answering and guidance method.
- Remove the flow limit if you're confident the demo can't be misused.
- Swap the personalized email demo with any other workflow you want to showcase.

🧠 Technical Notes
- Telegram bot created with BotFather.
- Website crawl process: extract sub-links via /sitemap.xml, sitemap_index.xml, or standard HTTP requests; fall back to Crawl4AI if normal requests fail; fetch sub-link content via HTTPS, with Crawl4AI as backup.
- SparkPost is used for sending demo emails.

⚙️ Setup Instructions
- Create a Telegram bot: use BotFather on Telegram to create your bot and get the API token that connects your n8n workflow to Telegram.
- Create a log data table: in your database, create a table with at least a name column (the user's name or Telegram username) and an id column (the user's unique identifier).
- Install Crawl4AI with Docker: follow the installation guide in the official repository (https://github.com/unclecode/crawl4ai). Crawl4AI handles website crawling and content extraction in the workflow.

📦 Notes
This setup is optimized for low cost, easy scalability, and real-time interaction with prospects. You can customize each component — Telegram bot behavior, RAG logic, scraping strategy, and email workflow — to fit your agency's demo needs. You can try the live demo here: @emaildemobot
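The per-user limit check described in this template's logging step can be sketched as follows. The cap value and the in-memory log shape are assumptions for illustration; the actual template would count rows from the database log table (name, id):

```javascript
// Illustrative per-user demo limit: count prior runs for this Telegram id
// and refuse once the cap is reached.
const MAX_RUNS_PER_USER = 3; // assumed cap; the template's limit may differ

function allowRun(log, userId) {
  const runs = log.filter(entry => entry.id === userId).length;
  return runs < MAX_RUNS_PER_USER;
}

const log = [
  { name: 'alice', id: 1 },
  { name: 'alice', id: 1 },
  { name: 'bob', id: 2 },
];
console.log(allowRun(log, 1)); // → true (2 of 3 assumed runs used)
console.log(allowRun(log, 2)); // → true
```

In production this would be a single SQL COUNT query keyed on the user's id rather than an in-memory filter.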