Automated construction project alerts with email notifications and data APIs
This n8n workflow monitors and alerts you about new construction projects in specified areas, helping you track competing builders and identify business opportunities. The system automatically searches multiple data sources and sends detailed email reports with upcoming projects.
Good to know
- Email parsing accuracy depends on the consistency of request formats - use the provided template for best results.
- The workflow includes fallback mock data for demonstration when external APIs are unavailable.
- Government data sources may have rate limits - the workflow includes error handling for failed or throttled requests.
- Results are filtered to show only upcoming/recent projects (within 3 months).
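The three-month window mentioned above can be sketched as a simple date filter (a hedged example; the actual node logic lives in the workflow JSON, and the `startDate` field name is an assumption about the combined project records):

```javascript
// Keep projects whose start date falls within 3 months (past or future) of "now".
// `startDate` is an assumed field name on the combined project records.
const filterUpcomingProjects = (projects, now = new Date()) => {
  const lower = new Date(now);
  lower.setMonth(lower.getMonth() - 3);
  const upper = new Date(now);
  upper.setMonth(upper.getMonth() + 3);
  return projects.filter((p) => {
    const start = new Date(p.startDate);
    return !isNaN(start) && start >= lower && start <= upper;
  });
};
```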
How it works
- Email Trigger - Detects new email requests with "Construction Alert Request" in the subject line
- Check Email Subject - Validates that the email contains the correct trigger phrase
- Extract Location Info - Parses the email body to extract area, city, state, and zip code information
- Search Government Data - Queries government databases for public construction projects and permits
- Search Construction Sites - Searches construction industry databases for private projects
- Process Construction Data - Combines and filters results from both sources, removing duplicates
- Wait For Data - Briefly pauses the workflow so both searches can complete before their results are combined
- Check If Projects Found - Determines whether to send a results report or no-results notification
- Generate Email Report - Creates a professional HTML email with project details and summaries
- Send Alert Email - Delivers the construction project report to the requester
- Send No Results Email - Notifies when no projects are found in the specified area
The workflow also includes a Schedule Trigger that can run automatically on weekdays at 9 AM for regular monitoring.
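The combine-and-dedupe step in the Process Construction Data node amounts to something like the sketch below (the `title` and `location` field names are assumptions about the normalized records):

```javascript
// Merge government and industry results, dropping duplicates by title + location.
const mergeResults = (govProjects, industryProjects) => {
  const seen = new Set();
  return [...govProjects, ...industryProjects].filter((p) => {
    const key = `${p.title}|${p.location}`.toLowerCase();
    if (seen.has(key)) return false;
    seen.add(key);
    return true;
  });
};
```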
Email Format Examples
Input Email Format
To: alerts@yourcompany.com
Subject: Construction Alert Request
Area: Downtown Chicago
City: Chicago
State: IL
Zip: 60601
Additional notes: Looking for commercial projects over $1M
Alternative format:
To: alerts@yourcompany.com
Subject: Construction Alert Request
Please search for construction projects in Miami, FL 33101
Focus on residential and mixed-use developments.
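For the free-form alternative above, the location can be pulled out with a pattern along these lines (a sketch; the regex is an assumption and will not cover every phrasing):

```javascript
// Match "City, ST 12345" style fragments such as "Miami, FL 33101".
// Multi-word city names like "New York" are handled by the optional group.
const parseFreeFormLocation = (text) => {
  const m = text.match(/([A-Z][a-zA-Z]+(?: [A-Z][a-zA-Z]+)*),\s*([A-Z]{2})\b\s*(\d{5})?/);
  if (!m) return null;
  return { city: m[1], state: m[2], zipcode: m[3] ?? '' };
};
```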
Output Email Example
Subject: 🏗️ Construction Alert: 8 Projects Found in Downtown Chicago
🏗️ Construction Project Alert Report
Search Area: Downtown Chicago
Report Generated: August 4, 2024, 2:30 PM
📊 Summary
Total Projects Found: 8
Search Query: Downtown Chicago IL construction permits
📋 Upcoming Construction Projects
1. New Commercial Complex - Downtown Chicago
📍 Location: Downtown Chicago | 📅 Start Date: March 2024 | 🏢 Type: Mixed Development
Description: Mixed-use commercial and residential development
Source: Local Planning Department
2. Office Building Construction - Chicago
📍 Location: Chicago, IL | 📅 Start Date: April 2024 | 🏢 Type: Commercial
Description: 5-story office building with retail space
Source: Building Permits
[Additional projects...]
💡 Next Steps
• Review each project for potential competition
• Contact project owners for partnership opportunities
• Monitor progress and timeline changes
• Update your competitive analysis
How to use
Setup Instructions
- Import the workflow into your n8n instance
- Configure Email Credentials:
- Set up IMAP credentials for receiving emails
- Set up SMTP credentials for sending alerts
- Test the workflow with a sample email
- Set up scheduling (optional) for automated daily checks
Sending Alert Requests
- Send an email to your configured address
- Use "Construction Alert Request" in the subject line
- Include location details in the email body
- Receive detailed project reports within minutes
Requirements
- n8n instance (cloud or self-hosted)
- Email account with IMAP/SMTP access
- Internet connection for API calls to construction databases
- Valid email addresses for sending and receiving alerts
API Integration Code Examples
Government Data API Integration
// Example API call to a public government data API (the USA.gov endpoint is illustrative)
const searchGovernmentProjects = async (location) => {
// fetch() has no `params` option - query parameters belong in the URL
const query = new URLSearchParams({
keyword: 'construction permit',
location_name: location,
size: '20'
});
const response = await fetch(`https://api.usa.gov/jobs/search.json?${query}`, {
method: 'GET',
headers: {
'Accept': 'application/json'
}
});
if (!response.ok) {
throw new Error(`Government API request failed: ${response.status}`);
}
return await response.json();
};
Construction Industry API Integration
// Example API call to a construction industry database (illustrative endpoint)
const searchConstructionProjects = async (area) => {
// As above, query parameters must be encoded into the URL
const query = new URLSearchParams({
q: `${area} construction projects`,
type: 'projects',
limit: '15'
});
const response = await fetch(`https://www.construction.com/api/search?${query}`, {
method: 'GET',
headers: {
'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36',
'Accept': 'application/json'
}
});
if (!response.ok) {
throw new Error(`Construction API request failed: ${response.status}`);
}
return await response.json();
};
Email Processing Function
// Extract location details from a structured email body
const extractLocationInfo = (emailBody) => {
const lines = emailBody.split('\n');
let area = '', city = '', state = '', zipcode = '';
for (const line of lines) {
// Take everything after the first colon; fall back to '' if missing
const value = line.split(':').slice(1).join(':').trim();
const lower = line.trim().toLowerCase();
if (lower.startsWith('area:')) area = value;
if (lower.startsWith('city:')) city = value;
if (lower.startsWith('state:')) state = value;
if (lower.startsWith('zip:')) zipcode = value;
}
return { area, city, state, zipcode };
};
Customizing this workflow
Adding New Data Sources
- Add HTTP Request nodes for additional APIs
- Update the Process Construction Data node to handle new data formats
- Modify the search parameters based on API requirements
Enhanced Email Parsing
// Custom email parsing for different formats
const parseEmailContent = (emailBody) => {
// Regex patterns for formats the labeled parser above does not cover
const patterns = {
address: /(\d+\s+[\w\s]+,\s*[\w\s]+,\s*[A-Z]{2}\s*\d{5})/,
coordinates: /(-?\d+\.\d+),\s*(-?\d+\.\d+)/,
zipcode: /\b\d{5}(-\d{4})?\b/
};
// Try every pattern and keep whichever ones match
const matches = {};
for (const [name, pattern] of Object.entries(patterns)) {
const found = emailBody.match(pattern);
if (found) matches[name] = found[0];
}
return matches;
};
Custom Alert Conditions
- Modify the Check If Projects Found node to filter by:
- Project value/budget
- Project type (residential, commercial, etc.)
- Distance from your location
- Timeline criteria
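A value and type filter for that node could look like this sketch (the `estimatedValue` and `type` field names are assumptions about the project records):

```javascript
// Keep projects above a minimum value and, optionally, of selected types.
const filterByCriteria = (projects, { minValue = 0, types = null } = {}) =>
  projects.filter(
    (p) =>
      (p.estimatedValue ?? 0) >= minValue &&
      (!types || types.includes(p.type))
  );
```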
Advanced Scheduling
// Set up multiple schedule triggers for different areas
const scheduleConfigs = [
{ area: "Downtown", cron: "0 9 * * 1-5" }, // Weekdays 9 AM
{ area: "Suburbs", cron: "0 14 * * 1,3,5" }, // Mon, Wed, Fri 2 PM
{ area: "Industrial", cron: "0 8 * * 1" } // Monday 8 AM
];
Integration with CRM Systems
Add HTTP Request nodes to automatically create leads in your CRM when high-value projects are found:
// Example CRM integration
const createCRMLead = async (project) => {
await fetch('https://your-crm.com/api/leads', {
method: 'POST',
headers: {
'Authorization': 'Bearer YOUR_TOKEN',
'Content-Type': 'application/json'
},
body: JSON.stringify({
name: project.title,
location: project.location,
value: project.estimatedValue,
source: 'Construction Alert System'
})
});
};
Troubleshooting
- No emails received: Check IMAP credentials and email filters
- Empty results: Verify API endpoints and add fallback data sources
- Failed email delivery: Confirm SMTP settings and recipient addresses
- API rate limits: Implement delays between requests and error handling
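For the rate-limit point above, one option is a retry wrapper that waits between attempts (a sketch; the retry count and delay are arbitrary choices, and `fetchFn` is injectable so any fetch-compatible HTTP layer can be substituted):

```javascript
// Retry a request when the server answers 429 (rate limited), waiting longer
// before each subsequent attempt.
const fetchWithRetry = async (url, options = {}, { retries = 3, delayMs = 500, fetchFn = fetch } = {}) => {
  for (let attempt = 1; attempt <= retries; attempt++) {
    const response = await fetchFn(url, options);
    if (response.status !== 429) return response;
    await new Promise((resolve) => setTimeout(resolve, delayMs * attempt));
  }
  throw new Error(`Still rate limited after ${retries} attempts: ${url}`);
};
```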
Automated Construction Project Alerts with Email Notifications and Data APIs
This n8n workflow automates the process of checking for new emails, potentially interacting with an external API, and sending email notifications based on certain conditions. It's designed to provide a flexible framework for triggering actions based on incoming email content and integrating with other services.
What it does
This workflow simplifies and automates the following steps:
- Monitors for New Emails: It continuously checks an IMAP email account for new incoming emails.
- Schedules Regular Checks: A separate branch of the workflow is set up to run on a schedule, which could be used for periodic API calls or other time-based tasks.
- Conditional Logic: An "If" node evaluates conditions on incoming data (e.g., email content or API responses) to determine the next steps.
- Makes HTTP Requests: The workflow can send HTTP requests to external APIs, allowing it to fetch or send data to other services.
- Sends Email Notifications: It sends out email notifications, triggered by conditions met in the "If" node or by the result of an API interaction.
- Introduces Delays: A "Wait" node is included, which can be used to pause the workflow for a specified duration, useful for rate limiting API calls or waiting for external processes.
- Custom Code Execution: A "Code" node is present, enabling custom JavaScript logic to be executed within the workflow for advanced data manipulation or decision-making.
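As a rough illustration, the subject check that gates this workflow amounts to something like the following (the exact condition lives in the workflow's "If" node configuration):

```javascript
// Case-insensitive check for the trigger phrase in the email subject.
const isAlertRequest = (email) =>
  (email.subject || '').toLowerCase().includes('construction alert request');
```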
Prerequisites/Requirements
To use this workflow, you will need:
- IMAP Email Account Credentials: For the "Email Trigger (IMAP)" node to monitor incoming emails.
- SMTP Email Account Credentials: For the "Send Email" node to send out notifications.
- API Endpoints and Credentials (Optional): If you intend to use the "HTTP Request" node to interact with external services, you'll need the relevant API URLs, keys, or authentication details.
- n8n Instance: A running instance of n8n to import and execute the workflow.
Setup/Usage
- Import the Workflow: Download the provided JSON and import it into your n8n instance.
- Configure Credentials:
- Email Trigger (IMAP): Set up an IMAP credential with the details of the email account you want to monitor.
- Send Email: Set up an SMTP credential for the email account you want to use to send notifications.
- Customize Nodes:
- Email Trigger (IMAP): Configure the IMAP node to filter for specific emails (e.g., by sender, subject, or keywords) if needed.
- If: Adjust the conditions in the "If" node based on the logic you want to implement (e.g., checking for specific keywords in the email body, or values from an API response).
- HTTP Request: If you are making API calls, configure the URL, method, headers, and body of the request according to the API documentation.
- Send Email: Customize the recipient, subject, and body of the email notifications. You can use expressions to include data from previous nodes.
- Schedule Trigger: Configure the schedule for how often you want the time-based branch of the workflow to run.
- Wait: Adjust the delay duration as needed.
- Code: Modify the JavaScript code within the "Code" node to perform any custom data processing or logic required for your specific use case.
- Activate the Workflow: Once configured, activate the workflow to start monitoring emails and executing the defined automation.