# DataMinex

Smart Connection Analysis from Open Data, Globally at Scale

Templates by DataMinex
## Real estate property search with SQL database and email delivery
Transform property searches into personalized experiences! This powerful automation delivers dream-home matches straight to clients' inboxes with professional CSV reports, all from a simple web form.

### What this workflow does

Create a complete real estate search experience that works 24/7:

- **Smart Web Form** - Beautiful property search form captures client preferences
- **Dynamic SQL Builder** - Intelligently creates optimized queries from user input
- **Lightning Database Search** - Scans 1000+ properties in milliseconds
- **Professional CSV Export** - Excel-ready reports with complete property details
- **Automated Email Delivery** - Personalized emails with property previews and attachments

### Perfect for

- **Real Estate Agents** - Generate leads and impress clients with instant service
- **Property Managers** - Automate tenant matching and recommendations
- **Brokerages** - Provide 24/7 self-service property discovery
- **Developers** - Showcase available properties with professional automation

### Why this workflow is a game-changer

> "From property search to professional report delivery in under 30 seconds!"
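The "Professional CSV Export" step above can be approximated in an n8n Function/Code node. A minimal sketch, assuming the column names from the table schema in the setup section below; `rows` stands in for the database result set, and the exact node code in the template may differ:

```javascript
// Illustrative sketch of the CSV-export step (not the template's exact code).
// `rows` stands in for the rows returned by the SQL search.
function toCsv(rows) {
  const headers = ['price', 'bed', 'bath', 'city', 'state', 'zip_code', 'house_size'];
  const escape = (value) => {
    const s = value === null || value === undefined ? '' : String(value);
    // Quote fields containing commas, quotes, or newlines (RFC 4180 style).
    return /[",\n]/.test(s) ? `"${s.replace(/"/g, '""')}"` : s;
  };
  const lines = [headers.join(',')];
  for (const row of rows) {
    lines.push(headers.map((h) => escape(row[h])).join(','));
  }
  return lines.join('\n');
}

// Example: two matching properties become an Excel-ready CSV string.
const csv = toCsv([
  { price: 325000, bed: 3, bath: 2, city: 'Austin', state: 'TX', zip_code: 78701, house_size: 1850 },
  { price: 410000, bed: 4, bath: 2.5, city: 'Denver, CO area', state: 'CO', zip_code: 80202, house_size: 2200 },
]);
```

The resulting string can be attached to the Gmail node as a `.csv` binary for delivery.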
- **Instant Results**: Zero wait time for property matches
- **Professional Output**: Beautiful emails that showcase your expertise
- **Mobile Optimized**: Works flawlessly on all devices
- **Smart Filtering**: Only searches criteria clients actually specify
- **Infinitely Scalable**: Handles unlimited searches simultaneously

### Real Estate Data Source

Built on authentic US market data from the GitHub repository:

- 1000+ real properties across all US states
- Actual market prices from legitimate listings
- Complete property details (bedrooms, bathrooms, square footage, lot size)
- Verified locations with accurate cities, states, and ZIP codes
- Broker information for authentic real estate context

### Quick Setup Guide

**Prerequisites checklist**

- [ ] SQL Server database (MySQL/PostgreSQL also supported)
- [ ] Gmail account for automated emails
- [ ] n8n instance (cloud or self-hosted)
- [ ] 20 minutes setup time

**Step 1: Import real estate data**

1. Download the CSV file (1000+ properties included)
2. Create the SQL Server table with this exact schema:

```sql
CREATE TABLE [REALTOR].[dbo].[realtorusaprice] (
  brokered_by BIGINT,
  status NVARCHAR(50),
  price DECIMAL(12,2),
  bed INT,
  bath DECIMAL(3,1),
  acre_lot DECIMAL(10,8),
  street BIGINT,
  city NVARCHAR(100),
  state NVARCHAR(50),
  zip_code INT,
  house_size INT,
  prevsolddate NVARCHAR(50)
);
```

3. Import your CSV data into this table

**Step 2: Configure the database connection**

1. Set up Microsoft SQL Server credentials in n8n
2. Test the connection to ensure everything works
3. The workflow is pre-configured for the table structure above

**Step 3: Gmail setup**

1. Visit the Google Cloud Console
2. Create a new project (or use an existing one)
3. Enable the Gmail API in the API Library
4. Create OAuth2 credentials (Web Application)
5. Add your n8n callback URL to the authorized redirects
6. Configure Gmail OAuth2 credentials in n8n
7. Authorize your Google account

**Step 4: Launch your property search portal**

1. Import this workflow template (the form is pre-configured)
2. Copy your webhook URL from the Property Search Form node
3. Test with a sample property search
4. Check email delivery with the CSV attachment
5. Go live and start impressing clients!

### Customization Playground

**Personalize your brand**

```javascript
// Customize email subjects in the Gmail node
"Exclusive Properties Curated Just for You - ${results.length} Perfect Matches!"
"Your Dream Home Portfolio - Handpicked by Our Experts"
"Hot Market Alert - ${results.length} Premium Properties Inside!"
```

**Advanced enhancements**

- HTML email templates: create stunning visual emails with property images
- Analytics dashboard: track popular searches and user engagement
- Smart alerts: set up automated price-drop notifications
- Mobile integration: connect to React Native or Flutter apps
- AI descriptions: add ChatGPT for compelling property descriptions

**Multi-database flexibility**

```javascript
// Easy database switching
// MySQL: replace the Microsoft SQL node with a MySQL node
// PostgreSQL: swap in a PostgreSQL node
// MongoDB: use a MongoDB node with JSON queries
// Even CSV files: use CSV-reading nodes for smaller datasets
```

### Advanced Features & Extensions

**Pro tips for power users**

- Bulk processing: handle multiple searches simultaneously
- Smart caching: store popular searches for lightning-fast results
- Lead scoring: track which properties generate the most interest
- Follow-up automation: schedule nurturing email sequences

**Integration possibilities**

- CRM connection: auto-add qualified leads to your CRM
- Calendar integration: add property-viewing scheduling
- Price monitoring: track market trends and price changes
- Social media: auto-share featured properties to social platforms
- Chat integration: connect to WhatsApp or SMS for instant alerts

### Expand Your Real Estate Automation

**Related workflow ideas**

- AI property valuation: add machine learning for price predictions
- Market analysis reports: generate comprehensive market insights
- SMS property alerts: instant text notifications for hot properties
- Commercial property search: adapt for office and retail spaces
- Investment ROI calculator: add financial analysis for investors
- Neighborhood analytics: include school ratings and demographics

**Technical extensions**

- Image processing: auto-resize and optimize property photos
- Map integration: add interactive property location maps
- Progressive web app: create a mobile app experience
- Push notifications: real-time alerts for saved searches

### Get Started Now

1. Import this workflow template
2. Configure your database and Gmail
3. Customize branding and messaging
4. Launch your professional property search portal
5. Watch client satisfaction soar!
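As a closing illustration, the "Smart Filtering" behavior described earlier (building a query only from the criteria a client actually fills in) might look roughly like this inside the Dynamic SQL Builder node. The form field names here are assumptions, not the template's exact names, and the real node may build its query differently:

```javascript
// Hypothetical sketch of the dynamic query builder; form field names are
// illustrative. Only criteria the client filled in become WHERE conditions,
// and values stay parameterized to avoid SQL injection.
function buildQuery(form) {
  const conditions = [];
  const params = {};
  if (form.city) { conditions.push('city = @city'); params.city = form.city; }
  if (form.state) { conditions.push('state = @state'); params.state = form.state; }
  if (form.minBeds) { conditions.push('bed >= @minBeds'); params.minBeds = Number(form.minBeds); }
  if (form.maxPrice) { conditions.push('price <= @maxPrice'); params.maxPrice = Number(form.maxPrice); }
  const where = conditions.length ? ` WHERE ${conditions.join(' AND ')}` : '';
  return {
    sql: `SELECT * FROM [REALTOR].[dbo].[realtorusaprice]${where}`,
    params,
  };
}

// A client who only specifies city and a price ceiling gets exactly two filters.
const { sql, params } = buildQuery({ city: 'Miami', maxPrice: '500000' });
```

An empty form produces an unfiltered `SELECT`, so the workflow still returns results even when a client specifies nothing.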
## Flight data visualization with Chart.js, QuickChart API & Telegram Bot
**Real-Time Flight Data Analytics Bot with Dynamic Chart Generation via Telegram**

### Template Overview

This advanced n8n workflow creates an intelligent Telegram bot that transforms raw CSV flight data into stunning, interactive visualizations. Users can generate professional charts on demand through a conversational interface, making data analytics accessible to anyone via messaging.

**Key innovation:** Combines real-time data processing, the Chart.js visualization engine, and Telegram's messaging platform to deliver instant business intelligence insights.

### What This Template Does

Transform your flight booking data into actionable insights with four powerful visualization types:

- **Bar charts**: Top 10 busiest airlines by flight volume
- **Pie charts**: Flight duration distribution (short/medium/long-haul)
- **Doughnut charts**: Price range segmentation with average pricing
- **Line charts**: Price trend analysis across flight durations

Each chart includes auto-generated insights, percentages, and key business metrics delivered instantly to users' phones.
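The percentage insights that accompany each chart can be sketched as a single-pass bucketing step. The 3-hour and 6-hour cut-offs below are assumptions for illustration; the template's actual thresholds may differ:

```javascript
// Illustrative duration bucketing; the 3h/6h thresholds are assumed, not
// taken from the template itself.
function durationInsights(flights) {
  const buckets = { 'Short-haul': 0, 'Medium-haul': 0, 'Long-haul': 0 };
  for (const f of flights) {
    const hours = Number(f.duration);
    if (hours < 3) buckets['Short-haul'] += 1;
    else if (hours < 6) buckets['Medium-haul'] += 1;
    else buckets['Long-haul'] += 1;
  }
  const total = flights.length || 1;
  // Turn raw counts into the percentage strings shown alongside the charts.
  return Object.entries(buckets).map(
    ([label, count]) => `${label}: ${count} (${((count / total) * 100).toFixed(1)}%)`
  );
}

// Four sample flights: one short, one medium, two long-haul.
const lines = durationInsights([
  { duration: '2.17' }, { duration: '5.83' }, { duration: '12.25' }, { duration: '10.5' },
]);
```

The same one-pass pattern extends naturally to the price-tier buckets used by the doughnut chart.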
### Technical Architecture

**Core components**

- **Telegram Webhook Trigger**: Captures user interactions and button clicks
- **Smart Routing Engine**: Conditional logic for command detection and chart selection
- **CSV Data Pipeline**: File reading → parsing → JSON transformation
- **Chart Generation Engine**: JavaScript-powered data processing with Chart.js
- **Image Rendering Service**: QuickChart API for high-quality PNG generation
- **Response Delivery**: Binary image transmission back to Telegram

**Data flow**

User Input → Command Detection → CSV Processing → Data Aggregation → Chart Configuration → Image Generation → Telegram Delivery

### Setup Requirements

**Prerequisites**

- n8n instance (self-hosted or cloud)
- Telegram bot token from @BotFather
- CSV dataset with flight information
- Internet connectivity for the QuickChart API

**Dataset source**

This template uses the Airlines Flights Data dataset by Rohit Grewal, available on GitHub.

**Required data schema**

Your CSV file should contain these columns:

```csv
airline,flight,sourcecity,departuretime,arrivaltime,duration,price,class,destinationcity,stops
```

**File structure**

```
/data/
└── flights.csv (download from the GitHub dataset above)
```

### Configuration Steps

**Telegram bot setup**

1. Create a new bot via @BotFather on Telegram
2. Copy your bot token
3. Configure the Telegram Trigger node with your token
4. Set the webhook URL in your n8n instance

**Data preparation**

1. Download the dataset from Airlines Flights Data
2. Upload the CSV file to /data/flights.csv in your n8n instance
3. Ensure UTF-8 encoding
4. Verify the column headers match the dataset schema
5. Test file accessibility from n8n

**Workflow activation**

1. Import the workflow JSON
2. Configure all Telegram nodes with your bot token
3. Test the /start command
4. Activate the workflow

### Technical Implementation Details

**Chart generation process**

Bar chart logic:

```javascript
// Aggregate airline counts
const airlineCounts = {};
flights.forEach(flight => {
  const airline = flight.airline || 'Unknown';
  airlineCounts[airline] = (airlineCounts[airline] || 0) + 1;
});

// Generate the Chart.js configuration
const chartConfig = {
  type: 'bar',
  data: { labels, datasets },
  options: { responsive: true, plugins: {...} }
};
```

**Dynamic color schemes**

- Bar charts: Professional blue gradient palette
- Pie charts: Duration-based color coding (light → dark blue)
- Doughnut charts: Price-tier specific colors (green → purple)
- Line charts: Trend-focused red gradient with smooth curves

**Performance optimizations**

- Efficient data processing: single-pass aggregations with O(n) complexity
- Smart caching: QuickChart handles image caching automatically
- Minimal memory usage: stream processing for large datasets
- Error handling: graceful fallbacks for missing data fields

**Advanced features**

Auto-generated insights:

- Statistical calculations (percentages, averages, totals)
- Trend analysis and pattern detection
- Business intelligence summaries
- Contextual recommendations

User experience enhancements:

- Reply keyboards for easy navigation
- Visual progress indicators
- Error recovery mechanisms
- Mobile-optimized chart dimensions (800x600 px)

### Use Cases & Business Applications

**Airlines & travel companies**

- Fleet analysis: monitor airline performance and market share
- Pricing strategy: analyze competitor pricing across routes
- Operational insights: track duration patterns and efficiency

**Data analytics teams**

- Self-service BI: enable non-technical users to generate reports
- Mobile dashboards: access insights anywhere via Telegram
- Rapid prototyping: quick data exploration without complex tools

**Business intelligence**

- Executive reporting: instant charts for presentations
- Market research: compare industry trends and benchmarks
- Performance monitoring: track KPIs in real time

### Customization Options

**Adding new chart types**

1. Create a new Switch condition
2. Add the corresponding data processing node
3. Configure the Chart.js options
4. Update the user interface menu

**Data source extensions**

- Replace CSV with database connections
- Add real-time API integrations
- Implement data refresh mechanisms
- Support multiple file formats

**Visual customizations**

```javascript
// Custom color palette
backgroundColor: ['your-colors'],

// Advanced styling
borderRadius: 8,
borderSkipped: false,

// Animation effects
animation: { duration: 2000, easing: 'easeInOutQuart' }
```

### Security & Best Practices

**Data protection**

- Validate the CSV input format
- Sanitize user inputs
- Implement rate limiting
- Secure file access permissions

**Error handling**

- Graceful degradation for API failures
- User-friendly error messages
- Automatic retry mechanisms
- Comprehensive logging

### Expected Outputs

**Sample generated insights**

- "Vistara leads with 350+ flights, capturing 23.4% market share"
- "Long-haul flights dominate at 61.1% of total bookings"
- "Budget category (₹0-10K) represents 47.5% of all bookings"
- "Average prices peak at ₹14K for 6-8 hour duration flights"

**Performance metrics**

- Response time: <3 seconds for chart generation
- Image quality: 800x600 px high-resolution PNG
- Data capacity: handles 10K+ records efficiently
- Concurrent users: scales with n8n instance capacity

### Getting Started

1. Download the workflow JSON
2. Import it into your n8n instance
3. Configure the Telegram bot credentials
4. Upload your flight data CSV
5. Test with the /start command
6. Deploy and share with your team

### Pro Tips

- Data quality: clean data produces better insights
- Mobile first: charts are optimized for mobile viewing
- Batch processing: handles large datasets efficiently
- Extensible design: easy to add new visualization types

---

Ready to transform your data into actionable insights? Import this template and start generating professional charts in minutes!
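For reference, the QuickChart rendering step works by passing the Chart.js configuration as a URL-encoded query parameter; `c`, `width`, `height`, and `format` are QuickChart's documented query parameters. A minimal sketch of how the workflow might build the image URL for the HTTP Request node:

```javascript
// Build a QuickChart render URL from a Chart.js configuration object.
// 800x600 matches the mobile-optimized dimensions mentioned above.
function quickChartUrl(chartConfig, width = 800, height = 600) {
  const c = encodeURIComponent(JSON.stringify(chartConfig));
  return `https://quickchart.io/chart?c=${c}&width=${width}&height=${height}&format=png`;
}

const url = quickChartUrl({
  type: 'bar',
  data: { labels: ['Vistara', 'Air India'], datasets: [{ label: 'Flights', data: [350, 310] }] },
});
```

An n8n HTTP Request node can fetch this URL as binary data and hand the PNG straight to the Telegram node for delivery.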
## Dynamic search interface with Elasticsearch and automated report generation
### What this workflow does

This template creates a comprehensive data search and reporting system that lets users query large datasets through an intuitive web form interface. The system performs real-time searches against Elasticsearch, processes the results, and automatically generates structured reports in multiple formats for data analysis and business intelligence.

**Key features:**

- Interactive web form for dynamic data querying
- Real-time Elasticsearch data retrieval with complex filtering
- Auto-generated reports (text & CSV formats) with custom formatting
- Automatic file storage system for data persistence
- Configurable search parameters (amounts, time ranges, entity filters)
- Scalable architecture for handling large datasets

### Setup requirements

**Prerequisites**

- Elasticsearch cluster running on https://localhost:9220
- Transaction dataset indexed in the bank_transactions index (sample dataset: Bank Transaction Dataset)
- File system access to the /tmp/ directory for report storage
- HTTP Basic Authentication credentials for Elasticsearch

**Required Elasticsearch index structure**

This template uses the Bank Transaction Dataset from GitHub:
https://github.com/dataminexcode/n8n-workflow/blob/main/Dynamic%20Search%20Interface%20with%20Elasticsearch%20and%20Automated%20Report%20Generation/data

You can use the provided Python script ("Python script for importing data") to import the CSV file into Elasticsearch.

Your bank_transactions index should contain documents with these fields:

```json
{
  "transactionid": "TXN123456789",
  "customerid": "CUST000001",
  "amount": 5000,
  "merchantcategory": "grocerynet",
  "timestamp": "2025-08-10T15:30:00Z"
}
```

**Dataset info:** This dataset contains realistic financial transaction data, well suited to testing search algorithms and report generation, with over 1 million transaction records covering various transaction patterns and data types.
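Against the document fields above, the request body assembled by the "Build Search Query" node might look roughly like this. This is a sketch under assumptions: the clause shapes and the sort order are illustrative, and the exact node code in the template may differ:

```javascript
// Sketch of an Elasticsearch request body matching the index structure above.
// Field names follow the sample document schema (transactionid, customerid, ...).
function buildSearchBody({ minAmount, hoursBack, customerId }) {
  const must = [{ range: { amount: { gte: minAmount } } }];
  if (hoursBack) {
    // "now-24h" style date math keeps the time filter relative to query time.
    must.push({ range: { timestamp: { gte: `now-${hoursBack}h` } } });
  }
  if (customerId) {
    must.push({ term: { customerid: customerId } });
  }
  // size: 100 mirrors the result limit mentioned in the configuration section.
  return { query: { bool: { must } }, size: 100, sort: [{ amount: 'desc' }] };
}

// Example: transactions over $10,000 in the last 24 hours, any customer.
const body = buildSearchBody({ minAmount: 10000, hoursBack: 24 });
```

The body is sent to `POST /bank_transactions/_search` via the HTTP request with Basic Auth credentials.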
**Credentials setup**

1. Create HTTP Basic Auth credentials in n8n
2. Configure them with your Elasticsearch username/password
3. Assign them to the "Search Elasticsearch" node

### Configuration

**Form customization**

- Webhook path: update the webhook ID if needed
- Form fields: modify amounts, time ranges, or add new filters
- Validation: adjust required fields based on your needs

**Elasticsearch configuration**

- URL: change localhost:9220 to your ES cluster endpoint
- Index name: update bank_transactions to your index name
- Query logic: modify the search criteria in the "Build Search Query" node
- Result limit: adjust the size: 100 parameter for more/fewer results

**File storage**

- Directory: change /tmp/ to your preferred storage location
- Filename pattern: modify the fraudreportYYYY-MM-DD.{ext} format
- Permissions: ensure n8n has write access to the target directory

**Report formatting**

- CSV headers: customize column names in the Format Report node
- Text layout: modify the report template for your organization
- Data fields: add/remove transaction fields as needed

### How to use

**For administrators:**

1. Import this workflow template
2. Configure the Elasticsearch credentials
3. Activate the workflow
4. Share the webhook URL with data analysts

**For data analysts:**

1. Access the search interface via the webhook URL
2. Set parameters: minimum amount, time range, entity filter
3. Choose a format: text report or CSV export
4. Submit the form to generate an instant data report
5. Review the results in the generated file

**Sample use cases:**

- Data analysis: search for transactions > $10,000 in the last 24 hours
- Entity investigation: filter all activity for a specific customer ID
- Pattern analysis: quick analysis of transaction activity patterns
- Business reporting: generate CSV exports for business intelligence
- Dataset testing: ideal for testing with the transaction dataset

### Sample output

**Text report format:**

```
DATA ANALYSIS REPORT
====================
Search Criteria:
  Minimum Amount: $10000
  Time Range: Last 24 Hours
  Customer: All
Results: 3 transactions found

TRANSACTIONS:
=============
Transaction ID: TXN_123456789
Customer: CUST_000001
Amount: $15000
Merchant: grocery_net
Time: 2025-08-10T15:30:00Z
```

**CSV export format:**

```csv
TransactionID,CustomerID,Amount,Merchant_Category,Timestamp
"TXN123456789","CUST000001",15000,"grocery_net","2025-08-10T15:30:00Z"
```

### Customization ideas

**Enhanced analytics features:**

- Add data validation and quality checks
- Implement statistical analysis (averages, trends, patterns)
- Include data visualization charts and graphs
- Generate summary metrics and KPIs

**Advanced search capabilities:**

- Multi-field search with complex boolean logic
- Fuzzy search and text matching algorithms
- Date range filtering with custom periods
- Aggregation queries for data grouping

**Integration options:**

- Email notifications: alert teams of significant data findings
- Slack integration: post analytics results to team channels
- Dashboard updates: push metrics to business intelligence systems
- API endpoints: expose search functionality as a REST API

**Report enhancements:**

- PDF generation: create formatted PDF analytics reports
- Data visualization: add charts, graphs, and trending analysis
- Executive summaries: include key metrics and business insights
- Export formats: support for Excel, JSON, and other data formats

### Tags

elasticsearch, data-search, reporting, analytics, automation, business-intelligence, data-processing, csv-export

### Use cases

- Business intelligence: organizations analyzing transaction patterns and trends
- E-commerce analytics: detecting payment patterns and analyzing customer behavior
- Data science: real-time data exploration and pattern recognition systems
- Operations teams: automated reporting and data monitoring workflows
- Research & development: testing search algorithms and data processing techniques
- Training & education: learning Elasticsearch integration with realistic datasets
- Financial technology: transaction data analysis and business reporting systems

### Important notes

**Security considerations:**

- Never expose Elasticsearch credentials in logs or form data
- Implement proper access controls for the webhook URL
- Consider encryption for sensitive data processing
- Regularly audit generated reports and access logs

**Performance tips:**

- Index optimization improves search response times
- Consider pagination for large result sets
- Monitor Elasticsearch cluster performance under load
- Archive old reports to manage disk usage

**Data management:**

- Ensure data retention policies align with business requirements
- Implement audit trails for all search operations
- Consider data privacy requirements when processing datasets
- Document all configuration changes for maintenance

---

This template provides a production-ready data search and reporting system that can be easily customized for various data analysis needs. The modular design allows for incremental enhancements while maintaining core search and reporting functionality.
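As a closing illustration, the plain-text report layout shown in the sample output can be produced by a small formatting function. This is a sketch, assuming the result fields follow the sample document schema; the template's Format Report node may differ in detail:

```javascript
// Sketch of the Format Report step for the plain-text output shown above.
function formatTextReport(criteria, hits) {
  const lines = [
    'DATA ANALYSIS REPORT',
    '====================',
    'Search Criteria:',
    `  Minimum Amount: $${criteria.minAmount}`,
    `  Time Range: ${criteria.timeRange}`,
    `  Customer: ${criteria.customerId || 'All'}`,
    `Results: ${hits.length} transactions found`,
    '',
    'TRANSACTIONS:',
    '=============',
  ];
  for (const h of hits) {
    lines.push(`Transaction ID: ${h.transactionid}`);
    lines.push(`Customer: ${h.customerid}`);
    lines.push(`Amount: $${h.amount}`);
    lines.push(`Merchant: ${h.merchantcategory}`);
    lines.push(`Time: ${h.timestamp}`);
    lines.push('');
  }
  return lines.join('\n');
}

// One matching transaction rendered into the report layout.
const report = formatTextReport(
  { minAmount: 10000, timeRange: 'Last 24 Hours' },
  [{ transactionid: 'TXN_123456789', customerid: 'CUST_000001', amount: 15000,
     merchantcategory: 'grocery_net', timestamp: '2025-08-10T15:30:00Z' }],
);
```

The returned string is what gets written to the /tmp/ report file; the CSV branch would iterate the same hits with a comma-joined row per transaction instead.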