
Monitor supplier financial health with ScrapeGraphAI & multi-channel risk alerts

By vinci-king-01

Smart Supplier Health Monitor with ScrapeGraphAI Risk Detection and Multi-Channel Alerts

🎯 Target Audience

  • Procurement managers and directors
  • Supply chain risk analysts
  • CFOs and financial controllers
  • Vendor management teams
  • Enterprise risk managers
  • Operations managers
  • Contract administrators
  • Business continuity planners

🚀 Problem Statement

Manual supplier monitoring is reactive and time-consuming, often missing early warning signs of financial distress that could disrupt your supply chain. This template solves the challenge of proactive supplier health surveillance by automatically monitoring financial indicators, news sentiment, and market conditions to predict supplier risks before they impact your business operations.

🔧 How it Works

This workflow automatically monitors your critical suppliers' financial health using AI-powered web scraping, analyzes multiple risk factors, identifies alternative suppliers when needed, and sends intelligent alerts through multiple channels to ensure your procurement team can act quickly on emerging risks.

Key Components

  1. Weekly Health Check Scheduler - Automated trigger based on supplier criticality levels
  2. Supplier Database Loader - Dynamic supplier portfolio management with risk-based monitoring frequency
  3. ScrapeGraphAI Website Analyzer - AI-powered extraction of financial health indicators from company websites
  4. Financial News Scraper - Intelligent monitoring of financial news and sentiment analysis
  5. Advanced Risk Scorer - Industry-adjusted risk calculation with failure probability modeling
  6. Alternative Supplier Finder - Automated identification and ranking of backup suppliers
  7. Multi-Channel Alert System - Email, Slack, and API notifications with escalation rules

📊 Risk Analysis Specifications

The template performs comprehensive financial health analysis with the following parameters:

| Risk Factor | Weight | Score Impact | Description |
|-------------|--------|--------------|-------------|
| Financial Issues | 40% | +0-24 points | Revenue decline, debt levels, cash flow problems |
| Operational Risks | 30% | +0-18 points | Management changes, restructuring, capacity issues |
| Market Risks | 20% | +0-12 points | Industry disruption, regulatory changes, competition |
| Reputational Risks | 10% | +0-6 points | Negative news, legal issues, public sentiment |

Industry Risk Multipliers:

  • Technology: 1.1x (Higher volatility)
  • Manufacturing: 1.0x (Baseline)
  • Energy: 1.2x (Regulatory risks)
  • Financial: 1.3x (Market sensitivity)
  • Logistics: 0.9x (Generally stable)

Risk Levels & Actions:

  • Critical Risk: Score ≥ 75 (CEO/CFO escalation, immediate transition planning)
  • High Risk: Score ≥ 55 (Procurement director escalation, backup activation)
  • Medium Risk: Score ≥ 35 (Manager review, increased monitoring)
  • Low Risk: Score < 35 (Standard monitoring)
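
To make the scoring concrete, here is a minimal n8n Code-node sketch that combines the factor caps, industry multipliers, and thresholds above. The input shape (`supplier.signals` as normalized 0-1 severity values) is an illustrative assumption, not the template's actual schema:

```javascript
// Minimal scoring sketch using the documented caps, multipliers, and
// thresholds. The supplier.signals shape (0..1 severities) is assumed.
const INDUSTRY_MULTIPLIERS = {
  technology: 1.1, manufacturing: 1.0, energy: 1.2,
  financial: 1.3, logistics: 0.9,
};
const FACTOR_CAPS = { financial: 24, operational: 18, market: 12, reputational: 6 };

function scoreSupplier(supplier) {
  let raw = 0;
  for (const [factor, cap] of Object.entries(FACTOR_CAPS)) {
    const severity = Math.min(Math.max(supplier.signals?.[factor] ?? 0, 0), 1);
    raw += severity * cap; // e.g. severe financial distress adds up to 24 points
  }
  const score = Math.round(raw * (INDUSTRY_MULTIPLIERS[supplier.industry] ?? 1.0));
  const level = score >= 75 ? 'critical' : score >= 55 ? 'high'
              : score >= 35 ? 'medium' : 'low';
  return { score, level };
}

// In an n8n Code node ("Run Once for All Items"), score every supplier:
return $input.all().map((item) => ({
  json: { ...item.json, risk: scoreSupplier(item.json) },
}));
```

Note that with these caps the raw maximum is 60 points, so a supplier can only cross the Critical threshold (≥ 75) in an industry with a multiplier above 1.25x (here, only Financial at 1.3x) — worth keeping in mind when you tune thresholds in step 5 of the setup.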

🏢 Supplier Management Features

| Feature | Critical Suppliers | High Priority | Medium Priority |
|---------|-------------------|---------------|-----------------|
| Monitoring Frequency | Weekly | Bi-weekly | Monthly |
| Risk Threshold | 35+ points | 40+ points | 50+ points |
| Alert Recipients | C-Level + Directors | Directors + Managers | Managers only |
| Alternative Suppliers | 3+ pre-qualified | 2+ identified | 1+ researched |
| Transition Timeline | 24-48 hours | 1-2 weeks | 1-3 months |
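
Given the monitoring frequencies in the table above, the scheduler stage can filter the supplier list down to those due for a check on the current run. A minimal sketch (the `lastChecked` field is an assumed bookkeeping column):

```javascript
// Map criticality to a re-check interval per the table above.
const DUE_AFTER_DAYS = { critical: 7, high: 14, medium: 30 };
const now = Date.now();

// Keep only suppliers whose last check is older than their interval.
return $input.all().filter((item) => {
  const s = item.json;
  const intervalMs = (DUE_AFTER_DAYS[s.criticality] ?? 30) * 24 * 60 * 60 * 1000;
  const last = s.lastChecked ? new Date(s.lastChecked).getTime() : 0;
  return now - last >= intervalMs;
});
```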

🛠️ Setup Instructions

Estimated setup time: 25-30 minutes

Prerequisites

  • n8n instance with community nodes enabled
  • ScrapeGraphAI API account and credentials
  • Gmail account for email alerts (or alternative email service)
  • Slack workspace with webhook or bot token
  • Supplier database or CRM system API access
  • Basic understanding of procurement processes

Step-by-Step Configuration

1. Configure ScrapeGraphAI Credentials

  • Sign up for ScrapeGraphAI API account
  • Navigate to Credentials in your n8n instance
  • Add new ScrapeGraphAI API credentials with your API key
  • Test the connection to ensure proper functionality

2. Set up Email Integration

  • Add Gmail OAuth2 credentials in n8n
  • Configure sender email and authentication
  • Test email delivery with sample message
  • Set up email templates for different risk levels

3. Configure Slack Integration

  • Create Slack webhook URL or bot token
  • Add Slack credentials to n8n
  • Configure target channels for different alert types
  • Customize Slack message formatting and buttons
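
As a starting point for the message formatting, here is a hedged sketch of a Slack incoming-webhook payload for a risk alert. The webhook URL is a placeholder, the supplier/risk fields assume the scoring sketch above, and `fetch` assumes an n8n runtime on Node 18+ (otherwise use an HTTP Request node):

```javascript
// Sample inputs — in the workflow these arrive from upstream nodes.
const supplier = { name: 'Acme Components Ltd' };
const risk = { level: 'high', score: 62 };

const payload = {
  text: `Supplier risk alert: ${supplier.name} (${risk.level.toUpperCase()}, score ${risk.score})`,
  blocks: [{
    type: 'section',
    text: {
      type: 'mrkdwn',
      text: `*${supplier.name}* moved to *${risk.level}* risk (score ${risk.score}).`,
    },
  }],
};

// Placeholder webhook URL — use the one Slack generates for your channel.
await fetch('https://hooks.slack.com/services/T000/B000/XXXX', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify(payload),
});
```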

4. Load Supplier Database

  • Update the "Supplier Database Loader" node with your supplier data
  • Configure supplier categories, contract values, and criticality levels
  • Set monitoring frequencies based on supplier importance
  • Add supplier website URLs and contact information
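
The loader typically emits one n8n item per supplier. An illustrative record shape (field names, URLs, and values are assumptions — align them with what your downstream nodes expect):

```javascript
// Illustrative supplier records — names, URLs, and values are placeholders.
const suppliers = [
  {
    name: 'Acme Components Ltd',
    website: 'https://example.com',
    industry: 'manufacturing',   // keys into the industry multipliers above
    criticality: 'critical',     // critical | high | medium
    contractValueUsd: 2500000,
    contact: { name: 'Jane Doe', email: 'jane@example.com' },
  },
];

// Emit one item per supplier so downstream nodes process them individually.
return suppliers.map((s) => ({ json: s }));
```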

5. Customize Risk Parameters

  • Adjust industry risk multipliers for your business context
  • Modify risk scoring thresholds based on risk tolerance
  • Configure economic factor adjustments
  • Set failure probability calculation parameters
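
The template's exact failure-probability formula is not spelled out here; one plausible parameterization to start from is a logistic curve over the risk score, centered at the high-risk threshold:

```javascript
// Assumed model, not the template's exact formula: a logistic curve
// centered at the high-risk threshold (55), with tunable steepness.
function failureProbability(score, midpoint = 55, steepness = 0.08) {
  return 1 / (1 + Math.exp(-steepness * (score - midpoint)));
}

// failureProbability(75) ≈ 0.83, failureProbability(35) ≈ 0.17
```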

6. Configure Alternative Supplier Database

  • Populate the alternative supplier database in the "Alternative Supplier Finder" node
  • Add supplier ratings, capacities, and specialties
  • Configure geographic coverage and certification requirements
  • Set suitability scoring parameters
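
For the suitability scoring, a simple weighted blend over rating, capacity, specialty, and geography is a reasonable baseline. The weights and field names below are assumptions to tune against your qualification criteria:

```javascript
// Assumed candidate/requirement shapes and weights — tune to your criteria.
const requirement = { category: 'precision-machining', region: 'EU', monthlyVolume: 10000 };
const candidates = [
  { name: 'Backup GmbH', rating: 4.5, monthlyCapacity: 12000,
    specialties: ['precision-machining'], regions: ['EU'] },
];

function suitability(c) {
  const rating = (c.rating ?? 0) / 5;                                      // 0..1
  const capacity = Math.min((c.monthlyCapacity ?? 0) / requirement.monthlyVolume, 1);
  const specialty = c.specialties?.includes(requirement.category) ? 1 : 0;
  const geo = c.regions?.includes(requirement.region) ? 1 : 0;
  return 0.35 * rating + 0.3 * capacity + 0.2 * specialty + 0.15 * geo;
}

const ranked = candidates
  .map((c) => ({ ...c, suitability: suitability(c) }))
  .sort((a, b) => b.suitability - a.suitability);
```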

7. Set up Procurement System Integration

  • Configure the procurement system webhook endpoint
  • Add API authentication credentials
  • Test webhook payload delivery
  • Set up automated data synchronization
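
A hedged sketch of the webhook delivery — the endpoint, auth scheme, and payload schema are placeholders to match against your procurement system's API:

```javascript
// Hypothetical payload and endpoint — replace with your system's schema.
const payload = {
  event: 'supplier_risk_update',
  supplier: 'Acme Components Ltd',
  riskScore: 62,
  riskLevel: 'high',
  assessedAt: new Date().toISOString(),
};

const res = await fetch('https://procurement.example.com/api/webhooks/supplier-risk', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Authorization: 'Bearer <YOUR_API_TOKEN>', // placeholder credential
  },
  body: JSON.stringify(payload),
});
if (!res.ok) throw new Error(`Webhook delivery failed: HTTP ${res.status}`);
```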

8. Test and Validate

  • Run test scenarios with sample supplier data
  • Verify ScrapeGraphAI extraction accuracy
  • Check risk scoring calculations and thresholds
  • Confirm all alert channels are working properly
  • Test alternative supplier recommendations

🔄 Workflow Customization Options

Modify Risk Analysis

  • Add custom risk indicators specific to your industry
  • Implement sector-specific economic adjustments
  • Configure contract-specific risk factors
  • Add ESG (Environmental, Social, Governance) scoring

Extend Data Sources

  • Integrate credit rating agency APIs (Dun & Bradstreet, Experian)
  • Add financial database connections (Bloomberg, Reuters)
  • Include social media sentiment analysis
  • Connect to government regulatory databases

Enhance Alternative Supplier Management

  • Add automated supplier qualification workflows
  • Implement dynamic pricing comparison
  • Create supplier performance scorecards
  • Add geographic risk assessment

Advanced Analytics

  • Implement predictive failure modeling
  • Add supplier portfolio optimization
  • Create supply chain risk heatmaps
  • Generate automated compliance reports

📈 Use Cases

  • Supply Chain Risk Management: Proactive monitoring of supplier financial stability
  • Procurement Optimization: Data-driven supplier selection and management
  • Business Continuity Planning: Automated backup supplier identification
  • Financial Risk Assessment: Early warning system for supplier defaults
  • Contract Management: Risk-based contract renewal and negotiation
  • Vendor Diversification: Strategic supplier portfolio management

🚨 Important Notes

  • Respect ScrapeGraphAI API rate limits and terms of service
  • Implement appropriate delays between supplier assessments
  • Keep all API credentials secure and rotate them regularly
  • Monitor API usage to manage costs effectively
  • Ensure compliance with data privacy regulations (GDPR, CCPA)
  • Regularly update supplier databases and contact information
  • Review and adjust risk parameters based on market conditions
  • Maintain confidentiality of supplier financial information

🔧 Troubleshooting

Common Issues:

  • ScrapeGraphAI extraction errors: Check API key validity and rate limits
  • Email delivery failures: Verify Gmail credentials and permissions
  • Slack notification failures: Check webhook URL and channel permissions
  • False positive alerts: Adjust risk scoring thresholds and industry multipliers
  • Missing supplier data: Verify website URLs and accessibility
  • Alternative supplier errors: Check supplier database completeness

Monitoring Best Practices:

  • Set up workflow execution monitoring and error alerts
  • Regularly review and update supplier information
  • Monitor API usage and costs across all integrations
  • Validate risk scoring accuracy with historical data
  • Test disaster recovery and backup procedures

Support Resources:

  • ScrapeGraphAI documentation and API reference
  • n8n community forums for workflow assistance
  • Procurement best practices and industry standards
  • Financial risk assessment methodologies
  • Supply chain management resources and tools

Monitor Supplier Financial Health with ScrapeGraphAI & Multi-Channel Risk Alerts

This n8n workflow automates the process of monitoring supplier financial health by leveraging an external AI web scraping tool (ScrapeGraphAI) and sending multi-channel risk alerts based on the scraped data. It simplifies the task of regularly checking supplier information and notifying relevant stakeholders when potential risks are identified.

What it does

This workflow performs the following key steps:

  1. Triggers on a Schedule: The workflow starts automatically at predefined intervals (e.g., daily, weekly).
  2. Initiates Web Scraping: It makes an HTTP request to an external API, likely a ScrapeGraphAI endpoint, to scrape financial health data for a specified supplier.
  3. Processes Scraped Data: A Code node processes and transforms the raw data returned by the web scraping API, extracting the relevant financial health indicators (a minimal sketch follows this list).
  4. Evaluates Risk: An If node checks the processed data against predefined conditions to determine if any financial risk is detected for the supplier.
  5. Sends Multi-Channel Alerts (Conditional):
    • If a risk is detected, the workflow sends an email notification via Gmail.
    • If no risk is detected (the If node's "false" branch, which the provided JSON does not explicitly wire), the workflow simply ends; you could extend it with a different action, such as logging the successful check.
  6. Prepares Email Content: An "Edit Fields (Set)" node prepares the data for the Gmail node, likely formatting the alert message.
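
Here is a minimal sketch of that Code-node processing and the If-node condition it feeds. The response shape from the scraping API (`result.financials.*`, `result.news.*`) is an assumption; adjust the field paths to your endpoint's actual output:

```javascript
// Code node: normalize the scraper's response into flat indicators.
// The response shape (result.financials.*, result.news.*) is assumed.
return $input.all().map((item) => {
  const r = item.json.result ?? {};
  return {
    json: {
      supplier: r.companyName ?? 'unknown',
      revenueTrend: r.financials?.revenueTrend ?? 'unknown', // e.g. 'declining'
      sentiment: r.news?.sentimentScore ?? 0,                // assumed -1..1
      riskDetected:
        r.financials?.revenueTrend === 'declining' ||
        (r.news?.sentimentScore ?? 0) < -0.4,
    },
  };
});
```

The If node can then simply test whether `{{ $json.riskDetected }}` is true, keeping the risk logic in one place.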

Prerequisites/Requirements

To use this workflow, you will need:

  • n8n Instance: A running n8n instance.
  • ScrapeGraphAI Endpoint: Access to a ScrapeGraphAI (or similar web scraping) API endpoint that can provide supplier financial health data. You will need the API URL and any necessary authentication.
  • Gmail Account: A configured Gmail credential in n8n to send email alerts.
  • Basic JavaScript Knowledge (Optional): For customizing the "Code" node logic.

Setup/Usage

  1. Import the Workflow: Import the provided JSON into your n8n instance.
  2. Configure Credentials:
    • Gmail: Set up your Gmail OAuth2 or API key credentials in n8n.
  3. Configure Nodes:
    • Schedule Trigger (Node 839): Adjust the schedule to your desired monitoring frequency (e.g., daily, weekly).
    • HTTP Request (Node 19):
      • Update the URL to your ScrapeGraphAI endpoint.
      • Configure any necessary Headers or Body parameters for your API request (e.g., supplier name, API key).
    • Code (Node 834): Review and modify the JavaScript code to correctly parse the response from your ScrapeGraphAI API and extract the financial health indicators you want to monitor.
    • If (Node 20): Define the conditions for detecting financial risk based on the data processed by the "Code" node.
    • Edit Fields (Set) (Node 38): Customize the fields to prepare the email content, including the subject and body of the alert.
    • Gmail (Node 356):
      • Specify the To email address(es) for the alerts.
      • Map the Subject and Body fields from the "Edit Fields (Set)" node.
  4. Activate the Workflow: Once configured, activate the workflow to start monitoring.
