Build a cost estimation chatbot with Mistral AI, OCR & Supabase
AI Cost Estimation Chatbot (Conversational Dual-Agent + OCR Workflow)
Overview
This workflow introduces a conversational AI Cost Estimation Chatbot with built-in OCR document analysis and interactive form guidance.
It helps users and teams handle pricing, measurement, and product configuration for multiple categories such as fabrics and tiles — whether data comes from an uploaded invoice, a stored RFQ, or live user input.
The system blends Mistral AI’s reasoning with n8n’s native tools — OCR Extract, Calculator, Supabase, and Gmail — to deliver clear, step-by-step cost calculations.
It automatically retrieves or parses OCR data, confirms details conversationally, performs unit conversions, and returns accurate estimates in real time.
Escalation and recordkeeping are handled via Gmail and Supabase.
Chatbot Flow
Trigger: Chat message (from n8n Chat UI) or Webhook (from a live site).
Model: Mistral Cloud Chat Model (mistral-medium-latest)
Memory: Simple Memory (Buffer Window, 15-message history)
Tools:
- OCR Extract: Reads and converts invoices, receipts, and RFQs into structured data.
- Supabase: Stores and retrieves OCR data for re-use in future calculations (see the sketch after this list).
- Calculator: Performs all material, area, and cost computations.
- Gmail: Escalates customer queries or sends quote summaries.
- Agent: AI cost estimation agent (coordinates the tools above).
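For orientation, the snippet below sketches the store-and-retrieve pattern the Supabase tool covers, written with the supabase-js client. The ocr_documents table and its columns are illustrative assumptions; inside the workflow, the n8n Supabase node performs the equivalent operations.

```javascript
// Minimal sketch using @supabase/supabase-js. The "ocr_documents" table and its
// columns are hypothetical; the workflow's n8n Supabase node does the equivalent.
import { createClient } from '@supabase/supabase-js';

const supabase = createClient(process.env.SUPABASE_URL, process.env.SUPABASE_ANON_KEY);

// Store parsed OCR output so later conversations can reuse it
async function saveOcrDocument(sessionId, parsed) {
  const { data, error } = await supabase
    .from('ocr_documents') // hypothetical table name
    .insert({ session_id: sessionId, parsed_json: parsed })
    .select()
    .single();
  if (error) throw error;
  return data;
}

// Retrieve the most recent OCR record for a chat session
async function getLatestOcrDocument(sessionId) {
  const { data, error } = await supabase
    .from('ocr_documents')
    .select('*')
    .eq('session_id', sessionId)
    .order('created_at', { ascending: false })
    .limit(1);
  if (error) throw error;
  return data[0] ?? null; // null when no OCR data exists yet
}
```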
Workflow Behavior:
- Retrieves or parses OCR data, confirms and completes missing details interactively.
- Guides users step-by-step through product setup (Fabric or Tile).
- Calculates costs transparently using MATERIAL_COSTS and PROCESSING_COSTS.
- Handles GSM ↔ sqm, area, and weight conversions automatically (a worked sketch follows this list).
- Escalates support or order confirmations via Gmail when requested.
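To make the calculation step concrete, here is a minimal sketch of the arithmetic the agent delegates to the Calculator tool. The price tables and figures are illustrative placeholders, not the actual MATERIAL_COSTS and PROCESSING_COSTS values.

```javascript
// Illustrative sketch only: prices and keys are placeholders, not the real
// MATERIAL_COSTS / PROCESSING_COSTS tables used by the agent.
const MATERIAL_COSTS = { cotton: 4.5, linen: 6.0 };      // price per kg
const PROCESSING_COSTS = { dyeing: 0.8, printing: 1.2 }; // price per sqm

// GSM (g/sqm) -> total weight in kg for a given area
function fabricWeightKg(gsm, areaSqm) {
  return (gsm * areaSqm) / 1000;
}

// Example: 200 gsm cotton, 2.5 m wide roll, 40 m long, dyed
const areaSqm = 2.5 * 40;                                  // 100 sqm
const weightKg = fabricWeightKg(200, areaSqm);             // 20 kg
const materialCost = weightKg * MATERIAL_COSTS.cotton;     // 20 * 4.5 = 90
const processingCost = areaSqm * PROCESSING_COSTS.dyeing;  // 100 * 0.8 = 80
console.log({ areaSqm, weightKg, total: materialCost + processingCost }); // total: 170
```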
Integrations Used
| Service | Purpose |
|---------|---------|
| n8n Chat | User-facing chatbot interface |
| OCR Extract | Processes uploaded documents or receipts |
| Supabase | Stores and retrieves OCR / quote data |
| Mistral AI | Chat model and reasoning engine |
| Calculator | Handles all numeric and cost calculations |
| Gmail | Sends escalations or quote summaries |
Agent System Prompt Summary
> “You are an AI cost estimation assistant for a brand.
> Retrieve or parse OCR data from Supabase, confirm details with the user, and calculate costs transparently.
> Use the Calculator for all numeric logic based on MATERIAL_COSTS and PROCESSING_COSTS.
> Handle GSM-to-sqm and other conversions automatically.
> If support or follow-up is needed, send a message through Gmail.
> Always guide the user conversationally, confirm assumptions, and explain every step clearly.”
Key Features
- Chat interface input (n8n Chat UI or webhook)
- Conversational guidance even when OCR data doesn't exist
- OCR + Supabase integration for document reuse
- Interactive cost estimator for fabrics and tiles
- Transparent calculations and unit conversions
- Gmail integration for escalation or order confirmation
- Modular design for scaling to other product types
Summary
A powerful AI + OCR conversational cost estimation assistant that retrieves or parses order data, guides users through setup, and calculates costs transparently.
It combines intelligence (Mistral), precision (Calculator), and automation (OCR + Supabase + Gmail) to create a complete, human-like quotation system — perfect for brands, manufacturers, and B2B platforms.
We can help you set it up for free — from connecting credentials to deploying it live.
Contact: shilpa.raju@digitalbiz.tech
Website: https://www.digitalbiz.tech
LinkedIn: https://www.linkedin.com/company/digital-biz-tech/
You can also DM us on LinkedIn for any help.
n8n Cost Estimation Chatbot with Mistral AI, OCR, and Supabase
This n8n workflow provides a ready-made cost estimation chatbot. It combines a chat trigger, an AI agent with memory and tools, and a Supabase database to deliver dynamic, context-aware responses.
What it does
This workflow automates the following steps:
- Listens for Chat Messages: Initiates the workflow upon receiving a new chat message.
- Manages Chat Memory: Utilizes a simple memory buffer to maintain context throughout the conversation, allowing the AI to remember previous interactions.
- Processes with AI Agent: Routes the chat message to an AI Agent configured with a Mistral Cloud Chat Model.
- Provides Calculation Capabilities: Equips the AI Agent with a Calculator tool, enabling it to perform mathematical operations as needed for cost estimations.
- Interacts with Supabase: Connects to a Supabase database, likely for retrieving or storing cost-related data that the AI agent can use.
- Executes Custom Logic: Includes a Code node for executing custom JavaScript logic, potentially for data manipulation, formatting, or more complex conditional processing (see the sketch after this list).
- Makes HTTP Requests: Allows the AI agent to make HTTP requests, which could be used for integrating with OCR services (as hinted by the directory name) or other external APIs to fetch data relevant to cost estimation.
- Conditional Routing: Employs an 'If' node to introduce conditional logic, directing the flow based on certain criteria or AI agent outputs.
- Merges Data: Uses a 'Merge' node to combine data streams from different branches of the workflow, ensuring all relevant information is consolidated.
- Splits Output: Includes a 'Split Out' node, likely for processing or distributing the AI agent's responses or other data in a structured manner.
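To illustrate the Code node's role, here is a minimal sketch of the kind of normalization it might apply to the AI agent's output before the 'If' node routes it. The field names (sessionId, estimate, currency) are assumptions for illustration, not the exact logic shipped in the workflow.

```javascript
// n8n Code node sketch ("Run Once for All Items"). Field names such as
// "estimate" and "currency" are assumptions about the upstream AI output.
return items.map((item) => {
  const raw = item.json;
  const estimate = Number(raw.estimate) || 0;
  return {
    json: {
      sessionId: raw.sessionId ?? null,
      estimate: Math.round(estimate * 100) / 100, // round to 2 decimals
      currency: raw.currency ?? 'USD',
      needsEscalation: estimate === 0,            // flag for the 'If' node downstream
    },
  };
});
```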
Prerequisites/Requirements
To use this workflow, you will need:
- n8n Instance: A running n8n instance.
- Mistral AI Account: An API key for the Mistral Cloud Chat Model.
- Supabase Account: Access to a Supabase database, including necessary API keys or connection details.
- (Optional) OCR Service: If the "OCR" hint from the directory name is implemented, an account and API key for an OCR service (e.g., Mindee, Google Cloud Vision, etc.) would be required for the HTTP Request node.
Setup/Usage
- Import the Workflow: Download the provided JSON and import it into your n8n instance.
- Configure Credentials:
- Set up your Mistral AI credentials with your API key in the 'Mistral Cloud Chat Model' node.
- Configure your Supabase credentials in the 'Supabase' node.
- If using an OCR service, configure the 'HTTP Request' node with the necessary API endpoint and authentication (see the sketch after these steps).
- Activate the Workflow: Ensure the workflow is active to start listening for chat messages.
- Interact with the Chatbot: Send messages to the configured chat trigger to interact with your cost estimation chatbot.
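As one example of what the 'HTTP Request' node could call, the sketch below sends a document-text-detection request to Google Cloud Vision. It assumes Vision is the OCR provider you picked and simplifies key handling; adapt the endpoint and payload to whichever service you use.

```javascript
// Sketch of an OCR call the 'HTTP Request' node could replicate
// (assumes Google Cloud Vision; other providers use different endpoints).
async function extractText(base64Image, apiKey) {
  const res = await fetch(
    `https://vision.googleapis.com/v1/images:annotate?key=${apiKey}`,
    {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify({
        requests: [
          {
            image: { content: base64Image }, // base64-encoded page image
            features: [{ type: 'DOCUMENT_TEXT_DETECTION' }],
          },
        ],
      }),
    }
  );
  const data = await res.json();
  return data.responses?.[0]?.fullTextAnnotation?.text ?? '';
}
```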
Related Templates
- Two-way property repair management system with Google Sheets & Drive
- Automate invoice processing with OCR, GPT-4 & Salesforce opportunity creation
- AI-powered code review with linting, red-marked corrections in Google Sheets & Slack