
Automated product price tracking with ScrapeGraphAI, Slack alerts and Jira tickets

vinci-king-01

Product Price Monitor with Slack and Jira


⚠️ COMMUNITY TEMPLATE DISCLAIMER: This is a community-contributed template that uses ScrapeGraphAI (a community node). Please ensure you have the ScrapeGraphAI community node installed in your n8n instance before using this template.

This workflow automatically scrapes multiple e-commerce sites, analyses weekly seasonal price trends, and notifies your team in Slack while opening Jira tickets for items that require price adjustments. It helps retailers plan inventory and pricing by surfacing actionable insights every week.

Pre-conditions/Requirements

Prerequisites

  • n8n instance (self-hosted or n8n cloud)
  • ScrapeGraphAI community node installed
  • Slack workspace & channel for notifications
  • Jira Software project (cloud or server)
  • Basic JavaScript knowledge for optional custom code edits

Required Credentials

  • ScrapeGraphAI API Key – Enables web scraping
  • Slack OAuth Access Token – Required by the Slack node
  • Jira Credentials – Email & API token (cloud) or username & password (server)
  • (Optional) Proxy credentials – If target websites block direct scraping

Specific Setup Requirements

| Resource | Purpose | Example |
|----------|---------|---------|
| Product URL list | Seed URLs to monitor | https://example.com/products-winter-sale |
| Slack Channel | Receives trend alerts | #pricing-alerts |
| Jira Project Key | Tickets are created here | ECOM |

How it works


Key Steps:

  • Webhook Trigger: Kicks off the workflow via a weekly schedule or manual call.
  • Set Product URLs: Prepares the list of product pages to analyse.
  • SplitInBatches: Processes URLs in manageable batches to avoid rate limits.
  • ScrapeGraphAI: Extracts current prices, stock, and seasonality hints from each URL.
  • Code (Trend Logic): Compares scraped prices against historical averages (a minimal sketch follows this list).
  • If (Threshold Check): Determines if price deviations exceed ±10%.
  • Slack Node: Sends a formatted message to the pricing channel for each deviation.
  • Jira Node: Creates/updates a ticket linking to the product for further action.
  • Merge: Collects all batch results for summary reporting.
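For orientation, here is a minimal sketch of what the Code (Trend Logic) step could look like, assuming each incoming item carries the scraped price plus a historicalAverage field; the field names and the hard-coded threshold are illustrative, not part of the template:

// Code (Trend Logic) – sketch for an n8n Code node ("Run Once for All Items").
// Assumes each item has json.product, json.url, json.price, and json.historicalAverage.
const threshold = 0.10; // flag deviations greater than ±10%

return items.map((item) => {
  const { product, url, price, historicalAverage } = item.json;
  const change = (price - historicalAverage) / historicalAverage;

  return {
    json: {
      product,
      url,
      oldPrice: historicalAverage,
      newPrice: price,
      change: Math.round(change * 10000) / 100, // percentage, two decimals
      exceedsThreshold: Math.abs(change) > threshold,
      scrapedAt: new Date().toISOString(),
    },
  };
});

The exceedsThreshold flag can then drive the If (Threshold Check) node directly.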

Set up steps

Setup Time: 15-20 minutes

  1. Install Community Nodes: In n8n, go to Settings → Community Nodes, search for “ScrapeGraphAI”, and install.
  2. Add Credentials:
    a. Slack → Credentials → New, paste your Bot/User OAuth token.
    b. Jira → Credentials → New, enter your domain, email/username, and API token/password.
    c. ScrapeGraphAI → Credentials → New, paste your API key.
  3. Import Workflow: Upload or paste the JSON template into n8n.
  4. Edit the “Set Product URLs” Node: Replace placeholder URLs with your real product pages (see the example item list after these steps).
  5. Configure Schedule: Replace the Webhook Trigger with a Cron node (e.g., every Monday at 09:00) or keep the webhook for manual runs.
  6. Map Jira Fields: In the Jira node, ensure Project Key, Issue Type (e.g., Task), and Summary fields match your instance.
  7. Test Run: Execute the workflow. Confirm Slack message appears and a Jira issue is created.
  8. Activate: Toggle the workflow to Active so it runs automatically.
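For reference, the “Set Product URLs” node is expected to emit one item per product page; a possible shape (the URLs below are placeholders) is:

[
  { "url": "https://example.com/products-winter-sale/jacket" },
  { "url": "https://example.com/products-winter-sale/boots" }
]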

Node Descriptions

Core Workflow Nodes:

  • Webhook – Default trigger, can be swapped with Cron for weekly automation.
  • Set (Product URLs) – Stores an array of product links for scraping.
  • SplitInBatches – Limits each ScrapeGraphAI call to five URLs to reduce load.
  • ScrapeGraphAI – Crawls and parses HTML, returning JSON with title, price, availability.
  • Code (Trend Logic) – Calculates percentage change vs. historical data (stored externally or hard-coded for demo).
  • If (Threshold Check) – Routes items above/below the set variance.
  • Slack – Posts a rich-format message containing product title, old vs. new price, and link.
  • Jira – Creates or updates a ticket with priority set to Medium and assigns it to the Pricing team lead (an example field payload follows this list).
  • Merge – Recombines batch streams for optional reporting or storage.
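As a rough illustration only, the fields a created ticket might carry could look like the following; the values are examples and should be mapped to your own Jira project and screen configuration:

{
  "project": "ECOM",
  "issueType": "Task",
  "summary": "Price deviation: Winter Jacket (-23.08%)",
  "description": "Old price $129.99, new price $99.99 – https://example.com/winter-jacket",
  "priority": "Medium",
  "assignee": "pricing-team-lead"
}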

Data Flow:

  1. Webhook → Set (Product URLs) → SplitInBatches → ScrapeGraphAI → Code (Trend Logic) → If → Slack / Jira → Merge

Customization Examples

Change Price Deviation Threshold

// Code (Trend Logic) node
const threshold = 0.05; // 5% instead of default 10%

Alter Slack Message Template

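This object is intended for a Code node or an expression field on the Slack node; "item" is assumed to hold the current product's name, old, new, diff, and url values.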
{
  "text": `*${item.name}* price changed from *$${item.old}* to *$${item.new}* (${item.diff}%).`,
  "attachments": [
    {
      "title": "Product Link",
      "title_link": item.url,
      "color": "#4E79A7"
    }
  ]
}

Data Output Format

The workflow outputs structured JSON data:

{
  "product": "Winter Jacket",
  "url": "https://example.com/winter-jacket",
  "oldPrice": 129.99,
  "newPrice": 99.99,
  "change": -23.06,
  "scrapedAt": "2023-11-04T09:00:00Z",
  "status": "Below Threshold",
  "slackMsgId": "A1B2C3",
  "jiraIssueKey": "ECOM-101"
}

Troubleshooting

Common Issues

  1. ScrapeGraphAI returns empty data – Verify your selectors; many sites use dynamic rendering and require a headless browser flag.
  2. Slack message not delivered – Check that the OAuth token scopes include chat:write; also confirm channel ID.
  3. Jira ticket creation fails – Field mapping mismatch; ensure Issue Type is valid and required custom fields are supplied.

Performance Tips

  • Batch fewer URLs (e.g., 3 instead of 5) to reduce timeout risk.
  • Cache historical prices in an external DB (Postgres, Airtable) instead of reading large CSVs in the Code node.

Pro Tips:

  • Rotate proxies/IPs within ScrapeGraphAI to bypass aggressive e-commerce anti-bot measures.
  • Add a Notion or Sheets node after Merge for historical logging.
  • Use the Error Trigger workflow in n8n to alert when ScrapeGraphAI fails more than X times per run.

Automated Product Price Tracking with ScrapeGraphAI, Slack Alerts, and Jira Tickets

This n8n workflow automates tracking product prices, identifying price drops, and notifying the relevant teams through Slack and Jira. It relies on an external scraping tool (ScrapeGraphAI, implied by the template name though not explicitly defined in the workflow JSON) to fetch product data, processes that data to detect price changes, and then dispatches alerts and creates tickets based on predefined conditions.

What it does

  1. Receives Product Data: The workflow is triggered by an external system (likely ScrapeGraphAI) via a Webhook, which sends product information, including prices (a sample payload is sketched after this list).
  2. Processes Each Product: It iterates through each product received from the webhook.
  3. Applies Custom Logic: A "Code" node is used to apply custom JavaScript logic, likely to compare current prices with historical data (not explicitly shown in JSON but implied by the "price tracking" context) or to format the data for subsequent steps.
  4. Filters for Price Drops: An "If" node checks for specific conditions, presumably if a product's price has dropped below a certain threshold or changed significantly.
  5. Notifies on Slack (Price Drop): If a price drop is detected (TRUE branch of the "If" node), a message is posted to a designated Slack channel to alert the team.
  6. Creates Jira Ticket (Price Drop): Simultaneously, a Jira Software ticket is created for the product with the price drop, allowing for further investigation or action by the product or sales team.
  7. Merges Workflow Paths: The workflow paths for both price drop and no-price-drop scenarios are merged back together.
  8. Sets Final Output: A "Set" node is used to format or clean up the data before the workflow concludes, potentially for logging or further processing.
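A hypothetical payload, shown only to illustrate the kind of fields the downstream Code and If nodes would read; the real shape depends entirely on what your scraper sends:

{
  "products": [
    {
      "name": "Winter Jacket",
      "url": "https://example.com/winter-jacket",
      "currentPrice": 99.99,
      "previousPrice": 129.99
    }
  ]
}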

Prerequisites/Requirements

  • ScrapeGraphAI (or similar web scraping tool): An external tool configured to scrape product prices and send data to the n8n webhook.
  • n8n Instance: A running n8n instance to host this workflow.
  • Slack Account: A Slack workspace and a channel where price drop alerts will be posted. Requires an n8n Slack credential configured.
  • Jira Software Account: A Jira instance where new tickets will be created. Requires an n8n Jira Software credential configured.

Setup/Usage

  1. Import the Workflow:
    • Copy the provided JSON content.
    • In your n8n instance, go to "Workflows" and click "New".
    • Click the three dots in the top right corner and select "Import from JSON".
    • Paste the JSON content and click "Import".
  2. Configure Credentials:
    • Slack: Locate the "Slack" node. Click on the "Credential" field and either select an existing Slack API credential or create a new one, providing your Slack Bot Token.
    • Jira Software: Locate the "Jira Software" node. Click on the "Credential" field and either select an existing Jira API credential or create a new one, providing your Jira instance URL, username, and API token.
  3. Configure Webhook:
    • Locate the "Webhook" node.
    • Copy the "Webhook URL" that n8n generates. This URL will be used by your external scraping tool (e.g., ScrapeGraphAI) to send product data to this workflow.
  4. Customize Nodes:
    • Code Node: Review and customize the JavaScript code within the "Code" node (Node834) to define your specific price comparison logic and data manipulation needs.
    • If Node: Adjust the conditions in the "If" node (Node20) to precisely define what constitutes a "price drop" or significant price change for your products.
    • Slack Node: Customize the message content in the "Slack" node (Node40) to include relevant product details and a clear call to action. Specify the target Slack channel.
    • Jira Software Node: Configure the "Jira Software" node (Node77) to specify the project, issue type, summary, description, and any other fields for the tickets created.
  5. Activate the Workflow: Once configured, activate the workflow in n8n.
  6. Integrate with Scraper: Configure your ScrapeGraphAI (or other scraping tool) to send the scraped product data to the n8n Webhook URL. Ensure the data format sent by the scraper matches what the n8n workflow expects in its initial "Code" node; a minimal example request is sketched below.
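For example, a scraper-side script might POST its results to the webhook roughly like this; the URL and payload shape are placeholders, not values defined by the template:

// Example request from the scraper side (Node 18+ or browser fetch).
const payload = {
  products: [
    { name: "Winter Jacket", url: "https://example.com/winter-jacket", currentPrice: 99.99, previousPrice: 129.99 }
  ]
};

// Placeholder URL – use the webhook URL that n8n generates for this workflow.
fetch("https://your-n8n-host/webhook/price-monitor", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify(payload)
})
  .then((res) => console.log("Webhook responded with status", res.status))
  .catch((err) => console.error("Webhook call failed:", err));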
