Parse & track orders from email with Gemini/GPT & Notion database sync
Automated Email Order Tracking System with AI Classification and Notion Sync
Overview
⚠️ Self-Hosted Solution Required
This workflow requires a self-hosted n8n instance with active integrations for Gmail, Google Gemini AI, OpenAI, and Notion. API credentials and database IDs must be configured before use.
Description
This intelligent automation system monitors your Gmail inbox for order-related emails, extracts key order information using AI, and automatically syncs the data to a Notion database for centralized order tracking. Perfect for individuals managing multiple e-commerce accounts or small businesses tracking customer orders across various platforms (Amazon, Noon, Namshi, etc.).
What This Workflow Does
- Email Monitoring: Continuously monitors Gmail inbox for new incoming emails
- Smart Classification: Uses AI to identify order-related emails (confirmations, shipping notifications, delivery updates)
- Intelligent Extraction: Parses email content to extract order details (order number, items, prices, status, delivery info)
- Database Synchronization: Automatically creates or updates Notion database records with order information
- Status Tracking: Monitors order progression through stages (Ordered → Shipped → Out for Delivery → Delivered)
Key Features
- Multi-vendor support: Works with any e-commerce platform (Amazon, Noon, Carrefour, Namshi, etc.)
- Duplicate prevention: Searches existing records before creating new entries
- Smart updates: Only modifies records when order status actually changes
- Status validation: Detects backward status changes (potential returns/reshipments)
- Graceful error handling: Handles missing data and optional fields intelligently
- Timestamped history: Maintains audit trail of all status changes
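The backward-status detection described above can be sketched in plain JavaScript. This is an illustrative model of the idea, not the template's actual agent logic — the status names come from the schema below:

```javascript
// Rank statuses so the sync logic can tell forward progress
// from a backward change (a possible return or reshipment).
const STATUS_ORDER = ['Ordered', 'Shipped', 'Out for Delivery', 'Delivered'];

function compareStatus(current, incoming) {
  const a = STATUS_ORDER.indexOf(current);
  const b = STATUS_ORDER.indexOf(incoming);
  if (a === -1 || b === -1) return 'unknown';  // unrecognized status value
  if (b > a) return 'advance';                 // normal progression -> update record
  if (b < a) return 'backward';                // possible return/reshipment -> flag it
  return 'unchanged';                          // same status -> no update needed
}
```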
Technologies Used
- Gmail Trigger: Email monitoring
- JavaScript Code: Email content classification with pattern matching
- Google Gemini AI / OpenAI: Natural language processing for order extraction
- Structured Output Parser: JSON formatting and validation
- Notion API: Database search, create, and update operations
Prerequisites
Before setting up this workflow, ensure you have:
- Self-hosted n8n instance (version 1.0.0 or higher)
- Gmail account with IMAP access enabled
- Google Gemini API key OR OpenAI API key
- Notion workspace with:
- Integration access configured
- Database created with the required schema (see below)
- Integration token/API key
Notion Database Schema
Create a Notion database with the following properties:
Required Properties
| Property Name | Type | Description |
|--------------|------|-------------|
| Name of the Item | Title | Product/item name |
| Order Number | Text | Unique order identifier |
| Quantity | Number | Number of items |
| Expected Date | Date or Text | Expected delivery date |
| Order Status | Select | Options: Ordered, Shipped, Out for Delivery, Delivered |
Optional Properties (Recommended)
| Property Name | Type | Description |
|--------------|------|-------------|
| Vendor | Select | E-commerce platform (Amazon, Noon, etc.) |
| Customer Name | Rich Text | Order recipient name |
| Price | Number or Rich Text | Item price |
| Order Total | Number | Total order amount |
| Currency | Select | Currency code (AED, USD, SAR, etc.) |
| Delivery Location | Rich Text | Delivery city/address |
| Notes | Rich Text | Status change history |
| Created Date | Created Time | Auto-populated by Notion |
| Last Updated | Last Edited Time | Auto-populated by Notion |
Setup Instructions
Step 1: Import the Workflow
- Copy the workflow JSON from this template
- In your n8n instance, go to Workflows → Add Workflow → Import from File/URL
- Paste the JSON and click Import
Step 2: Configure Gmail Trigger
- Click on the Gmail Trigger node
- Click Create New Credential
- Follow the OAuth authentication flow to connect your Gmail account
- Configure trigger settings:
  - Trigger On: Message Received
  - Filters: (Optional) Add label filters to monitor specific folders
Step 3: Configure AI Model (Choose One)
Option A: Google Gemini AI
- Click on the Google Gemini AI Model node
- Click Create New Credential
- Enter your Gemini API key (obtain from Google AI Studio)
- Select model: gemini-1.5-pro or gemini-1.5-flash
Option B: OpenAI
- Click on the OpenAI Chat Model node
- Click Create New Credential
- Enter your OpenAI API key (obtain from OpenAI Platform)
- Select model: gpt-4o or gpt-4-turbo
Step 4: Update Email Classification Node
- Click on the Check Email Type node (JavaScript code)
- Review the classification patterns (pre-configured for common e-commerce emails)
- (Optional) Add custom keywords specific to your vendors
Step 5: Configure Notion Integration
5.1: Create Notion Integration
- Go to Notion Integrations
- Click New Integration
- Name it (e.g., "n8n Order Tracker")
- Select your workspace
- Copy the Internal Integration Token
5.2: Share Database with Integration
- Open your Notion order database
- Click Share → Invite
- Search for your integration name and select it
- Grant Edit permissions
5.3: Get Database ID
- Open your Notion database in browser
- Copy the database ID from the URL:
https://notion.so/workspace/DATABASE_ID?v=... (the DATABASE_ID segment, before ?v=)
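If you prefer to extract the ID programmatically (for example in a Code node), a hypothetical helper could look like this — Notion database IDs are 32 hexadecimal characters:

```javascript
// Pull the 32-character database ID out of a Notion database URL.
function extractDatabaseId(url) {
  const match = url.match(/notion\.so\/(?:[^/]+\/)?([0-9a-f]{32})/i);
  return match ? match[1] : null;
}
```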
5.4: Configure Notion Nodes
- Click on the Search a database in Notion node
- Click Create New Credential
- Paste your Integration Token
- In the node parameters:
  - Database ID: Paste your database ID
  - Filter: Set to search by the Order Number property
- Repeat the credential setup for the Create a database page in Notion and Update a database page in Notion nodes
Step 6: Update Agent Prompts
- Click on the Email Classification and Extraction Agent node
- Review the system prompt (pre-configured for common order emails)
- Update the {{$now}} variable if using a different timezone
- (Optional) Customize extraction rules for specific vendors
- Click on the Order Database Sync Agent node
- Replace {{notion_database_id}} with your actual database ID in the prompt
- Review the status handling logic
Step 7: Test the Workflow
- Click Execute Workflow to run it once manually
- Send yourself a test order confirmation email
- Monitor the execution:
- Check if email was classified correctly
- Verify extraction output in the AI agent node
- Confirm Notion database was updated
- Review your Notion database for the new/updated record
Step 8: Activate for Production
- Click Active toggle in the top-right corner
- The workflow will now run automatically for new emails
- Monitor executions in the Executions tab
Workflow Node Descriptions
Email Trigger
Monitors Gmail inbox for new incoming emails and triggers the workflow when a message is received.
Check Email Type
JavaScript code node that analyzes email content using pattern matching to identify order-related emails based on keywords, order numbers, and shipping terminology.
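A simplified version of this kind of pattern matching is sketched below; the keywords and the order-number regex are illustrative stand-ins, not the template's actual lists:

```javascript
// Classify an email as order-related by keyword and order-number patterns.
const ORDER_KEYWORDS = [
  'order confirmation', 'your order', 'has shipped',
  'out for delivery', 'delivered', 'tracking number',
];
const ORDER_NUMBER_PATTERN = /\b(?:order|ref)[\s#:]*([A-Z0-9-]{6,})\b/i;

function looksLikeOrderEmail(subject, body) {
  const text = `${subject}\n${body}`.toLowerCase();
  const keywordHit = ORDER_KEYWORDS.some((k) => text.includes(k));
  const numberHit = ORDER_NUMBER_PATTERN.test(`${subject}\n${body}`);
  return keywordHit || numberHit;
}
```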
Email Router (IF Node)
Routes emails based on classification results:
- TRUE branch: Order-related emails proceed to extraction
- FALSE branch: Non-order emails are filtered out (no action)
Email Classification and Extraction Agent
AI-powered parser using Google Gemini or OpenAI to extract structured order information:
- Order number, items, prices, quantities
- Order status (Ordered/Shipped/Out for Delivery/Delivered)
- Customer name, delivery location, expected dates
- Vendor identification
Structured Output Parser
Validates and formats AI extraction output into clean JSON for downstream processing.
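For orientation, here is a hypothetical example of the shape the parser could enforce; the actual field names depend on the output schema you configure in the node:

```javascript
// Example of a parsed order object (field names are illustrative).
const example = {
  order_number: 'AMZ-40312345',   // unique order identifier
  vendor: 'Amazon',
  items: [{ name: 'USB-C Cable', quantity: 2, price: 35.0 }],
  currency: 'AED',
  order_status: 'Shipped',        // one of the four tracked statuses
  expected_date: '2024-06-01',
  customer_name: 'Jane Doe',
  delivery_location: 'Dubai',
};

// A minimal downstream check before writing to Notion:
function hasRequiredFields(order) {
  return ['order_number', 'items', 'order_status'].every((key) => key in order);
}
```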
Search a database in Notion
Queries the Notion database by order number to check if a record already exists, preventing duplicates.
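The filter this node sends corresponds to Notion's database query API; a sketch of the payload is below, assuming Order Number is stored as a text (rich_text) property — adjust the property name to match your schema exactly:

```javascript
// Build the filter body for Notion's POST /v1/databases/{id}/query endpoint.
function buildSearchFilter(orderNumber) {
  return {
    filter: {
      property: 'Order Number',          // must match your Notion property name
      rich_text: { equals: orderNumber },
    },
  };
}
```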
Order Database Sync Agent
Intelligent database manager that decides whether to create new records or update existing ones based on search results and status comparison.
Create a database page in Notion
Adds new order records to Notion when no existing record is found.
Update a database page in Notion
Modifies existing records when order status changes, appending timestamped notes for audit history.
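The timestamped audit trail could be maintained with a helper like this (an illustrative sketch, not the template's actual prompt logic):

```javascript
// Append a timestamped line to the record's Notes field whenever the
// status changes, preserving earlier history.
function appendStatusNote(existingNotes, oldStatus, newStatus, now = new Date()) {
  const line = `[${now.toISOString()}] ${oldStatus} -> ${newStatus}`;
  return existingNotes ? `${existingNotes}\n${line}` : line;
}
```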
No Action Taken
Terminates workflow branch for non-order emails with no further processing.
Customization Options
Add More Vendors
Edit the Check Email Type node to add vendor-specific keywords:
// Vendor names or sender domains used to strengthen classification
const customVendors = [
'your-vendor-name',
'vendor-domain.com'
];
Modify Status Values
Update the Email Classification and Extraction Agent prompt to add custom status values or change status progression logic.
Add Email Notifications
Insert a Send Email node after database sync to receive notifications for status changes.
Filter by Labels
Configure Gmail Trigger to monitor only specific labels (e.g., "Orders", "Shopping").
Multi-Database Support
Duplicate the Notion sync section to route different vendors to separate databases.
Troubleshooting
Email not being classified as order
- Check the Check Email Type node output
- Add vendor-specific keywords to the classification patterns
- Review email content for order indicators
AI extraction returning empty data
- Verify AI model credentials are valid
- Check if email content is being passed correctly
- Review the extraction prompt for compatibility with email format
Notion database not updating
- Confirm integration has edit permissions on the database
- Verify database ID is correct in all Notion nodes
- Check that property names in the workflow match your Notion schema exactly
Duplicate records being created
- Ensure the Search a database in Notion node is filtering by Order Number
- Verify the search results are being evaluated correctly in the sync agent
Status not updating
- Check if the Order Database Sync Agent is comparing current vs new status
- Review the status comparison logic in the agent prompt
Performance Considerations
- Email Volume: This workflow processes each email individually. For high-volume inboxes, consider adding filters or label-based routing.
- AI Costs: Each email classification uses AI tokens. Monitor your API usage and costs.
- Rate Limits: Notion API has rate limits (3 requests/second). The workflow handles this gracefully with built-in error handling.
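If you extend the workflow with a Code node that issues many Notion calls, one way to stay under the limit is to batch requests and pause between batches — a sketch under that assumption:

```javascript
// Split work into batches of `size` so at most ~3 Notion requests
// are issued per second (pause ~1s between batches).
function chunk(items, size) {
  const batches = [];
  for (let i = 0; i < items.length; i += size) {
    batches.push(items.slice(i, i + size));
  }
  return batches;
}

const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

// Usage idea (inside an async Code node):
//   for (const batch of chunk(pendingRequests, 3)) {
//     await Promise.all(batch.map((fn) => fn()));
//     await sleep(1000);
//   }
```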
Privacy & Security
- All email content is processed through AI APIs (Google/OpenAI) - review their privacy policies
- Notion data is stored in your workspace with your configured permissions
- No data is stored or logged outside your n8n instance, AI provider, and Notion workspace
- Consider using self-hosted AI models for sensitive order information
Support & Contributions
Found a bug or have a suggestion? Please open an issue or contribute improvements to this template!
License
This template is provided as-is under the MIT License. Feel free to modify and distribute as needed.
Credits
Created for the n8n community to streamline e-commerce order tracking across multiple platforms.