Oneclick AI Squad
The AI Squad Initiative is a pioneering effort to build, automate, and scale AI-powered workflows using n8n.io. Our mission is to help individuals and businesses integrate AI agents seamlessly into their daily operations, from automating tasks and enhancing productivity to creating innovative, intelligent solutions. We design modular, reusable AI workflow templates that empower creators, developers, and teams to supercharge their automation with minimal effort and maximum impact.
Templates by Oneclick AI Squad
Scrape LinkedIn profiles & save to Google Sheets with Apify
This n8n workflow automates the process of scraping LinkedIn profiles using the Apify platform and organizing the extracted data into Google Sheets for easy analysis and follow-up.

Use Cases
- Lead Generation: Extract contact information and professional details from LinkedIn profiles
- Recruitment: Gather candidate information for talent acquisition
- Market Research: Analyze professional networks and industry connections
- Sales Prospecting: Build targeted prospect lists with detailed professional information

How It Works

Workflow Initialization & Input
- Webhook Start Scraper: Triggers the entire scraping workflow
- Read LinkedIn URLs: Retrieves LinkedIn profile URLs from Google Sheets
- Schedule Scraper Trigger: Sets up automated scheduling for regular scraping

Data Processing & Extraction
- Data Formatting: Prepares and structures the LinkedIn URLs for processing
- Fetch Profile Data: Makes HTTP requests to the Apify API with the profile URLs (see the sketch below)
- Run Scraper Actor: Executes the Apify LinkedIn scraper actor
- Get Scraped Results: Retrieves the extracted profile data from Apify

Data Storage & Completion
- Save to Google Sheets: Stores the scraped profile data in an organized spreadsheet format
- Update Progress Tracker: Updates workflow status and progress tracking
- Process Complete Wait: Ensures all operations finish before the final steps
- Send Success Notification: Alerts users when scraping is successfully completed

Requirements

Apify
- Active Apify account with sufficient credits
- API token for authentication
- Access to the LinkedIn Profile Scraper actor

Google Sheets
- Google account with Sheets access
- Properly formatted input sheet with LinkedIn URLs
- Credentials configured in n8n

n8n Setup
- HTTP Request node credentials for Apify
- Google Sheets node credentials
- Webhook endpoint configured

How to Use

Step 1: Prepare Your Data
- Create a Google Sheet with LinkedIn profile URLs
- Ensure the sheet has a column named 'linkedin_url'
- Add any additional columns for metadata (name, company, etc.)

Step 2: Configure Credentials
- Set up Apify API credentials in n8n
- Configure Google Sheets authentication
- Update the webhook endpoint URL

Step 3: Customize Settings
- Adjust scraping parameters in the Apify node
- Modify the data fields to extract based on your needs
- Set up notification preferences

Step 4: Execute the Workflow
- Trigger via webhook or manual execution
- Monitor progress through the workflow
- Check Google Sheets for the scraped data
- Review completion notifications

Good to Know
- Rate Limits: LinkedIn scraping is subject to rate limits. The workflow includes delays to respect these limits.
- Data Quality: Results depend on profile visibility and LinkedIn's anti-scraping measures.
- Costs: Apify charges based on compute units used. Monitor your usage to control costs.
- Compliance: Ensure your scraping activities comply with LinkedIn's Terms of Service and applicable laws.
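For reference, the Apify calls behind the Fetch Profile Data, Run Scraper Actor, and Get Scraped Results steps can be exercised outside n8n with a short script. This is a minimal sketch, assuming Apify's run-sync-get-dataset-items endpoint, a placeholder actor ID, and an assumed input field name; check all three against the actual LinkedIn Profile Scraper actor you use.

```typescript
// Minimal sketch: run an Apify actor for a batch of LinkedIn URLs and collect
// the scraped items. The actor ID and input field name are placeholders.
const APIFY_TOKEN = process.env.APIFY_TOKEN ?? "";
const ACTOR_ID = "your-actor-id"; // hypothetical actor ID
const PROFILE_URLS = ["https://www.linkedin.com/in/example-profile/"];

async function scrapeProfiles(urls: string[]): Promise<unknown[]> {
  // run-sync-get-dataset-items starts the actor, waits for it to finish,
  // and returns the dataset items in a single call (suited to small batches).
  const endpoint =
    `https://api.apify.com/v2/acts/${ACTOR_ID}/run-sync-get-dataset-items?token=${APIFY_TOKEN}`;

  const res = await fetch(endpoint, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ profileUrls: urls }), // input field name is an assumption
  });
  if (!res.ok) throw new Error(`Apify request failed: ${res.status}`);
  return (await res.json()) as unknown[];
}

scrapeProfiles(PROFILE_URLS)
  .then((items) => console.log(`Scraped ${items.length} profiles`, items))
  .catch((err) => console.error(err));
```

In the template itself, the equivalent requests are made by the HTTP Request nodes; the sketch is only meant to help validate credentials and actor input before wiring everything together.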
Customizing This Workflow

Enhanced Data Processing
- Add data enrichment steps to append additional information
- Implement duplicate detection and merge logic
- Create data validation rules for quality control

Advanced Notifications
- Set up Slack or email alerts for different scenarios
- Create detailed reports with scraping statistics
- Implement error recovery mechanisms

Integration Options
- Connect to CRM systems for automatic lead creation
- Integrate with marketing automation platforms
- Export data to analytics tools for further analysis

Troubleshooting Common Issues
- Apify Actor Failures: Check API limits and actor status
- Google Sheets Errors: Verify permissions and sheet structure
- Rate Limiting: Implement longer delays between requests
- Data Quality Issues: Review scraping parameters and target profiles

Best Practices
- Test with small batches before scaling up
- Monitor Apify credit usage regularly
- Keep backup copies of your data
- Regularly validate the accuracy of scraped information
Automate restaurant marketing & booking with Excel, VAPI voice agent & calendar
This n8n template demonstrates how to create a comprehensive marketing automation and booking system that combines Excel-based lead management with voice-powered customer interactions. The system uses VAPI for voice communication and Excel/Google Sheets for data management, making it ideal for restaurants seeking to automate marketing campaigns and streamline booking processes through intelligent voice AI technology.

Good to know
- Voice processing requires an active VAPI subscription with per-minute billing
- Excel operations are handled in real time with immediate data synchronization
- The system can handle multiple simultaneous voice calls and lead processing
- All customer data is stored securely in Excel with proper formatting and validation
- Marketing campaigns can be scheduled and automated based on lead data

How it works

Lead Management & Marketing Automation Workflow
- New Lead Trigger: Excel triggers capture new leads when customers are added to the lead management spreadsheet
- Lead Preparation: The system processes and formats lead data, extracting relevant details (name, phone, preferences, booking history); see the sketch below
- Campaign Loop: An automated loop processes multiple leads for batch marketing campaigns
- Voice Marketing Call: VAPI initiates personalized voice calls to leads with tailored restaurant offers and booking invitations
- Response Tracking: All call results and lead responses are logged back to Excel for campaign analysis

Booking & Order Processing Workflow
- Voice Response Capture: A VAPI webhook triggers when customers respond to marketing calls or make direct booking requests
- Response Storage: Customer responses and booking preferences are immediately saved to Excel sheets
- Information Extraction: The system processes natural-language responses to extract booking details (party size, preferred times, special requests)
- Calendar Integration: Booking information is automatically scheduled in restaurant management systems
- Confirmation Loop: Automated follow-up voice messages confirm bookings and provide additional restaurant information

Excel Sheet Structure

Lead Management Sheet

| Column | Description |
|--------|-------------|
| lead_id | Unique identifier for each lead |
| customer_name | Customer's full name |
| phone_number | Primary contact number |
| email | Customer email address |
| last_visit_date | Date of last restaurant visit |
| preferred_cuisine | Customer's food preferences |
| party_size_typical | Usual number of guests |
| preferred_time_slot | Preferred dining times |
| marketing_consent | Permission for marketing calls |
| lead_source | How the customer was acquired |
| lead_status | Current status (new, contacted, converted, inactive) |
| last_contact_date | Date of last marketing contact |
| notes | Additional customer information |
| created_at | Lead creation timestamp |

Booking Responses Sheet

| Column | Description |
|--------|-------------|
| response_id | Unique response identifier |
| customer_name | Customer's name from the call |
| phone_number | Contact number used for the call |
| booking_requested | Whether the customer wants to book |
| party_size | Number of guests requested |
| preferred_date | Requested booking date |
| preferred_time | Requested time slot |
| special_requests | Dietary restrictions or special occasions |
| call_duration | Length of the VAPI call |
| call_outcome | Result of the marketing call |
| follow_up_needed | Whether additional contact is required |
| booking_confirmed | Final booking confirmation status |
| created_at | Response timestamp |

Campaign Tracking Sheet

| Column | Description |
|--------|-------------|
| campaign_id | Unique campaign identifier |
| campaign_name | Descriptive campaign title |
| target_audience | Lead segments targeted |
| total_leads | Number of leads contacted |
| successful_calls | Calls that connected |
| bookings_generated | Number of bookings from the campaign |
| conversion_rate | Percentage of leads converted |
| campaign_cost | Total VAPI usage cost |
| roi | Return on investment |
| start_date | Campaign launch date |
| end_date | Campaign completion date |
| status | Campaign status (active, completed, paused) |

How to use
- Setup: Import the workflow into your n8n instance and configure VAPI credentials
- Excel Configuration: Set up Excel/Google Sheets with the required sheet structure provided above
- Lead Import: Populate the Lead Management sheet with customer data from various sources
- Campaign Setup: Configure marketing message templates in the VAPI nodes to match your restaurant's branding
- Testing: Test voice commands such as "I'd like to book a table for tonight" or "What are your specials?"
- Automation: Enable triggers to automatically process new leads and schedule marketing campaigns
- Monitoring: Track campaign performance through the Campaign Tracking sheet and adjust strategies accordingly

The system can handle multiple concurrent voice calls and scales with your restaurant's marketing needs.

Requirements
- VAPI account for voice processing and natural language understanding
- Excel/Google Sheets for storing lead, booking, and campaign data
- n8n instance with Excel/Sheets and VAPI integrations enabled
- Valid phone numbers for lead contact and compliance with local calling regulations

Customising this workflow
- Multi-location Support: Adapt the voice AI automation for restaurant chains with location-specific offers
- Seasonal Campaigns: Try popular use cases such as holiday promotions, special-event marketing, or loyalty program outreach
- Integration Options: Extend the workflow with CRM integration, SMS follow-ups, and social media campaign coordination
- Advanced Analytics: Add nodes for detailed campaign performance analysis and customer segmentation
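A minimal sketch of the Lead Preparation step referenced above, assuming the Lead Management Sheet columns from the table; the consent filter, the 14-day contact window, and the variables object handed to the voice agent are illustrative assumptions rather than the template's exact logic.

```typescript
// Sketch of "Lead Preparation": keep only leads that consented to marketing
// and were not contacted recently, then build the variables a voice agent
// would use for a personalised call.
interface Lead {
  lead_id: string;
  customer_name: string;
  phone_number: string;
  preferred_cuisine: string;
  preferred_time_slot: string;
  marketing_consent: boolean;
  last_contact_date: string; // ISO date, e.g. "2025-08-01"
}

const MIN_DAYS_BETWEEN_CONTACTS = 14; // assumed campaign policy

function prepareCampaignLeads(leads: Lead[], today = new Date()) {
  return leads
    .filter((l) => l.marketing_consent)
    .filter((l) => {
      const days =
        (today.getTime() - new Date(l.last_contact_date).getTime()) / 86_400_000;
      // Leads with no valid last_contact_date are treated as never contacted.
      return Number.isNaN(days) || days >= MIN_DAYS_BETWEEN_CONTACTS;
    })
    .map((l) => ({
      leadId: l.lead_id,
      phoneNumber: l.phone_number,
      // Variables interpolated into the voice agent's opening message.
      variables: {
        name: l.customer_name,
        cuisine: l.preferred_cuisine,
        timeSlot: l.preferred_time_slot,
      },
    }));
}

// Example: prepareCampaignLeads(rowsFromSheet) produces the batch that the
// Campaign Loop iterates over before placing the VAPI calls.
```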
Automated restaurant call handling & table booking system with VAPI and PostgreSQL
This workflow acts as a virtual receptionist for the restaurant, handling incoming calls via VAPI without human intervention. It collects user details (name, booking time, number of people) for table bookings, checks availability in a PostgreSQL database using n8n, books the table if one is available, and sends a confirmation. It also provides restaurant details to callers, mimicking a human receptionist.

Key Insights
- VAPI must be configured to accurately capture user input for bookings and inquiries.
- The PostgreSQL database requires a table to manage restaurant bookings and availability.

Workflow Process
1. Initiate the workflow with a VAPI call to collect user details (name, time, number of people).
2. Use n8n to query the PostgreSQL database for table availability (see the sketch below).
3. If a table is available, book it using n8n and update the PostgreSQL database.
4. Send a booking confirmation and restaurant service details back to VAPI via n8n.
5. Store and update restaurant table data in the PostgreSQL database using n8n.

Usage Guide
- Import the workflow into n8n and configure VAPI and PostgreSQL credentials.
- Test with a sample VAPI call to ensure proper data collection and booking confirmation.

Prerequisites
- VAPI API credentials for call handling
- PostgreSQL database with booking and availability tables

Customization Options
Modify the VAPI input fields to capture additional user details, or adjust the PostgreSQL query for specific availability criteria.
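A minimal sketch of the availability check and booking insert, assuming a hypothetical `bookings` table with `customer_name`, `booking_time`, and `party_size` columns and a fixed per-slot capacity; the template's actual schema and capacity rules may differ.

```typescript
import { Pool } from "pg"; // npm install pg

const pool = new Pool({ connectionString: process.env.DATABASE_URL });
const TABLES_PER_SLOT = 10; // assumed restaurant capacity per time slot

async function bookTable(name: string, time: Date, partySize: number) {
  const client = await pool.connect();
  try {
    await client.query("BEGIN");
    // Count existing bookings for the requested slot.
    const { rows } = await client.query(
      "SELECT COUNT(*)::int AS taken FROM bookings WHERE booking_time = $1",
      [time],
    );
    if (rows[0].taken >= TABLES_PER_SLOT) {
      await client.query("ROLLBACK");
      return { booked: false, message: "No tables available at that time." };
    }
    // Slot is free: record the booking.
    await client.query(
      "INSERT INTO bookings (customer_name, booking_time, party_size) VALUES ($1, $2, $3)",
      [name, time, partySize],
    );
    await client.query("COMMIT");
    return { booked: true, message: `Table booked for ${name}.` };
  } catch (err) {
    await client.query("ROLLBACK");
    throw err;
  } finally {
    client.release();
  }
}
```

The `message` field is what would be spoken back to the caller through VAPI after the n8n database step completes.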
Auto-respond to Instagram, Facebook & WhatsApp with Llama 3.2
This automated n8n workflow enables AI-powered responses across multiple social media platforms, including Instagram DMs, Facebook messages, and WhatsApp chats, using Meta's APIs. The system provides intelligent customer support, lead generation, and smart engagement at scale through AI-driven conversation management and automated response routing.

Good to Know
- Supports multi-platform messaging across Instagram, Facebook, and WhatsApp
- Uses the AI Travel Agent and Ollama Chat Model for intelligent response generation
- Includes platform memory for maintaining conversation context and history
- Automatic message processing and routing based on platform and content type
- Real-time webhook integration for instant message detection and response

How It Works
- WhatsApp Trigger: Monitors incoming WhatsApp messages and initiates the automated response workflow
- Instagram Webhook: Captures Instagram DM notifications and processes them for AI analysis
- Facebook Webhook: Detects Facebook Messenger interactions and routes them through the system
- Message Processor: Analyzes incoming messages from all platforms and prepares them for AI processing
- AI Travel Agent: Processes messages using an intelligent AI model to generate contextually appropriate responses
- Ollama Chat Model: Provides advanced language processing for complex conversation scenarios (see the sketch below)
- Platform Memory: Maintains conversation history and context across multiple interactions for personalized responses
- Response Router: Determines the optimal response strategy and routes messages to the appropriate sending mechanism
- Instagram Sender: Delivers AI-generated responses back to Instagram DM conversations
- Facebook Sender: Sends automated replies through the Facebook Messenger API
- Send Message (WhatsApp): Delivers personalized responses to WhatsApp chat conversations

How to Use
1. Import the workflow into n8n
2. Configure Meta's Instagram Graph API, Facebook Messenger API, and WhatsApp Business Cloud API
3. Set up an approved Meta Developer App with the required permissions
4. Configure webhook endpoints for real-time message detection
5. Set up the Ollama Chat Model for AI response generation
6. Test with sample messages across all three platforms
7. Monitor response accuracy and adjust AI parameters as needed

Requirements
- Access to Meta's Instagram Graph API, Facebook Messenger API, and WhatsApp Business Cloud API
- Approved Meta Developer App
- Webhook setup and persistent token management for real-time messaging
- Ollama Chat Model integration
- AI Travel Agent configuration

Customizing This Workflow
- Modify AI prompts for different business contexts (customer service, sales, support)
- Adjust response routing logic based on message content or user behavior
- Configure platform-specific message templates and formatting
- Set up custom memory storage for enhanced conversation tracking
- Integrate additional AI models for specialized response scenarios
- Add message filtering and content moderation capabilities
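A minimal sketch of the response-generation step, assuming a local Ollama instance on its default port with a `llama3.2` model pulled; the system prompt and the way history is assembled here are illustrative, not the template's exact configuration.

```typescript
// Sketch: send the incoming message plus recent conversation history to a
// local Ollama instance and return the generated reply.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

async function generateReply(history: ChatMessage[], incoming: string): Promise<string> {
  const res = await fetch("http://localhost:11434/api/chat", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "llama3.2",
      stream: false, // ask Ollama for one complete JSON response
      messages: [
        {
          role: "system",
          content: "You are a helpful support agent. Keep replies short and friendly.",
        },
        ...history, // platform memory for this conversation
        { role: "user", content: incoming },
      ],
    }),
  });
  if (!res.ok) throw new Error(`Ollama request failed: ${res.status}`);
  const data = await res.json();
  return data.message.content; // Ollama returns { message: { role, content }, ... }
}

// The returned text is what the Response Router hands to the Instagram,
// Facebook, or WhatsApp sender node.
```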
Real-time stock monitor with smart alerts for Indian & US markets
Monitor Indian (NSE/BSE) and US stock markets with intelligent price alerts, cooldown periods, and multi-channel notifications (Email + Telegram). The workflow automatically tracks price movements and sends alerts when stocks cross predefined upper/lower limits. Perfect for day traders, investors, and portfolio managers who need instant notifications for price breakouts and breakdowns.

How It Works
- Market Hours Trigger: Runs every 2 minutes during market hours
- Read Stock Watchlist: Fetches your stock list from Google Sheets
- Parse Watchlist Data: Processes stock symbols and alert parameters
- Fetch Live Stock Price: Gets real-time prices from the Twelve Data API
- Smart Alert Logic: Intelligent price checking with cooldown periods (see the sketch below)
- Check Alert Conditions: Validates whether alerts should be triggered
- Send Email Alert: Sends detailed email notifications
- Send Telegram Alert: Instant mobile notifications
- Update Alert History: Records alert timestamps in Google Sheets
- Alert Status Check: Monitors workflow success/failure
- Success/Error Notifications: Admin notifications for monitoring

Key Features
- Smart Cooldown: Prevents alert spam
- Multi-Market: Supports Indian & US stocks
- Dual Alerts: Email + Telegram notifications
- Auto-Update: Tracks last alert times
- Error Handling: Built-in failure notifications

Setup Requirements

Google Sheets Setup: Create a Google Sheet with these columns (in exact order):
- A: symbol (e.g., TCS, AAPL, RELIANCE.BSE)
- B: upper_limit (e.g., 4000)
- C: lower_limit (e.g., 3600)
- D: direction (both/above/below)
- E: cooldown_minutes (e.g., 15)
- F: last_alert_price (auto-updated)
- G: last_alert_time (auto-updated)

API Keys & IDs to Replace:
- YOUR_GOOGLE_SHEET_ID_HERE: Replace with your Google Sheet ID
- YOUR_TWELVE_DATA_API_KEY: Get a free API key from twelvedata.com
- YOUR_TELEGRAM_CHAT_ID: Your Telegram chat ID (optional)
- your-email@gmail.com: Your sender email
- alert-recipient@gmail.com: Alert recipient email

Stock Symbol Format:
- US Stocks: Use simple symbols like AAPL, TSLA, MSFT
- Indian Stocks: Use the .BSE or .NSE suffix, like TCS.NSE, RELIANCE.BSE

Credentials Setup in n8n:
- Google Sheets: Service Account credentials
- Email: SMTP credentials
- Telegram: Bot token (optional)

Example Google Sheet Data:

| symbol | upper_limit | lower_limit | direction | cooldown_minutes |
|--------|-------------|-------------|-----------|------------------|
| TCS.NSE | 4000 | 3600 | both | 15 |
| AAPL | 180 | 160 | both | 10 |
| RELIANCE.BSE | 2800 | 2600 | above | 20 |

Output Example:
Alert: TCS crossed the upper limit. Current Price: ₹4100, Upper Limit: ₹4000.
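A minimal sketch of the Smart Alert Logic step using the watchlist columns defined above; the exact comparison operators and cooldown bookkeeping in the template may differ slightly.

```typescript
// Sketch: decide whether a price crossing should fire an alert, honouring the
// per-symbol cooldown recorded in the sheet.
interface WatchlistRow {
  symbol: string;
  upper_limit: number;
  lower_limit: number;
  direction: "both" | "above" | "below";
  cooldown_minutes: number;
  last_alert_time?: string; // ISO timestamp, auto-updated by the workflow
}

function checkAlert(row: WatchlistRow, price: number, now = new Date()): string | null {
  // Respect the cooldown: skip if the last alert is too recent.
  if (row.last_alert_time) {
    const minutesSince =
      (now.getTime() - new Date(row.last_alert_time).getTime()) / 60_000;
    if (minutesSince < row.cooldown_minutes) return null;
  }

  const crossedUpper = price >= row.upper_limit;
  const crossedLower = price <= row.lower_limit;

  if (crossedUpper && row.direction !== "below") {
    return `Alert: ${row.symbol} crossed the upper limit. Current Price: ${price}, Upper Limit: ${row.upper_limit}.`;
  }
  if (crossedLower && row.direction !== "above") {
    return `Alert: ${row.symbol} crossed the lower limit. Current Price: ${price}, Lower Limit: ${row.lower_limit}.`;
  }
  return null; // no alert condition met
}

// Example: checkAlert({ symbol: "TCS.NSE", upper_limit: 4000, lower_limit: 3600,
//   direction: "both", cooldown_minutes: 15 }, 4100) -> the upper-limit alert text
```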
Pharmacy inventory alerts for low stock & expiring medicine with Google Sheets
This n8n workflow monitors pharmacy inventory stored in a Google Sheet, checks daily for low-stock or near-expiry medicines, and sends alerts to the pharmacist via email, ensuring timely restocking and preventing waste.

Why Use It
This workflow automates inventory management for pharmacies, reducing the risk of stockouts or expired medicines, saving time, minimizing losses, and supporting compliance with safety standards through proactive alerts.

How to Import It
1. Download the Workflow JSON: Obtain the workflow file from the n8n template or create it based on this document.
2. Import into n8n: In your n8n instance, go to "Workflows," click the three dots, select "Import from File," and upload the JSON.
3. Configure Credentials: Set up Google Sheets, email (e.g., SMTP), and optional SMS (e.g., Twilio) credentials in n8n.
4. Run the Workflow: Activate the scheduled trigger and test with a sample Google Sheet.

System Architecture
- Daily Stock Check (9 AM): Automated trigger to monitor inventory levels
- Fetch Stock Data: Retrieves current medicine data from Google Sheets
- Wait For All Data: Ensures complete data retrieval before processing
- Check Expiry Date and Low Stock: Analyzes inventory for alerts (see the sketch below)
- Update Google Sheet: Records alert status and timestamps
- Send Email Alert: Notifies the pharmacist of low stock and expiry issues

Google Sheet File Structure
Sheet Name: PharmacyInventory
Range: A1:E20 (or adjust based on needs)

| medicine_name | stock_quantity | expiry_date | alert_status | last_checked |
|---------------|----------------|-------------|--------------|--------------|
| Paracetamol | 15 | 2025-09-15 | Notified | 2025-08-08 |
| Aspirin | 5 | 2025-08-20 | Pending | 2025-08-07 |
| Ibuprofen | 20 | 2026-01-10 | - | 2025-08-08 |

Columns:
- medicine_name: Name of the medicine.
- stock_quantity: Current stock level (e.g., number of units).
- expiry_date: Expiry date of the medicine (YYYY-MM-DD).
- alert_status: Status of the alert (Pending, Notified, or "-" for no alert).
- last_checked: Date of the last inventory check.

Customization Ideas
- Adjust Thresholds: Change the low-stock threshold (e.g., from 10 to 5) or the expiry window (e.g., from 30 to 15 days).
- Add SMS Alerts: Integrate Twilio or another SMS service for additional notifications.
- Incorporate Barcode Scanning: Add a node to import inventory updates via barcode scanners.
- Dashboard Integration: Connect to a dashboard (e.g., Google Data Studio) for real-time inventory tracking.
- Automated Restock Orders: Add logic to generate purchase orders for low-stock items.

Requirements to Run This Workflow
- Google Sheets Account: For storing and managing inventory data.
- Email Service: Gmail, SMTP, or similar for email alerts.
- n8n Instance: With Google Sheets and email connectors configured.
- Cron Service: For scheduling the daily trigger.
- Internet Connection: To access Google Sheets and email APIs.
- Optional SMS Service: Twilio or similar for SMS alerts (requires additional credentials).

Want a tailored workflow for your business? Our experts can craft it quickly. Contact our team.
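A minimal sketch of the Check Expiry Date and Low Stock step, assuming the sheet columns above and the default thresholds (10 units, 30 days) mentioned in the customization notes.

```typescript
// Sketch: flag medicines that are running low or close to expiry.
interface MedicineRow {
  medicine_name: string;
  stock_quantity: number;
  expiry_date: string; // YYYY-MM-DD
}

const LOW_STOCK_THRESHOLD = 10; // units
const EXPIRY_WINDOW_DAYS = 30;  // alert this many days before expiry

function findAlerts(rows: MedicineRow[], today = new Date()): string[] {
  return rows.flatMap((row) => {
    const alerts: string[] = [];
    if (row.stock_quantity < LOW_STOCK_THRESHOLD) {
      alerts.push(`${row.medicine_name}: low stock (${row.stock_quantity} units left)`);
    }
    const daysToExpiry =
      (new Date(row.expiry_date).getTime() - today.getTime()) / 86_400_000;
    if (daysToExpiry <= EXPIRY_WINDOW_DAYS) {
      alerts.push(`${row.medicine_name}: expires on ${row.expiry_date}`);
    }
    return alerts;
  });
}

// findAlerts(sheetRows).join("\n") would form the body of the pharmacist email,
// while the matching rows get their alert_status and last_checked columns updated.
```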
Automated meeting recording & AI summaries with Google Calendar, Vexa & Llama 3.2
Transform your meetings into actionable insights automatically! This workflow captures meeting audio, transcribes conversations, generates AI summaries, and emails the results to participants, all without manual intervention.

What's the Goal?
- Auto-record meetings when they start and stop when they end
- Transcribe audio to text using the Vexa Bot integration
- Generate intelligent summaries with AI-powered analysis
- Email summaries to meeting participants automatically
- Eliminate manual note-taking and post-meeting admin work
- Never miss important discussions or action items again

Why Does It Matter?
- Save 90% of Post-Meeting Time: No more manual transcription or summary writing
- Never Lose Key Information: Automatic capture ensures nothing falls through the cracks
- Improve Team Productivity: Focus on discussions, not note-taking
- Perfect Meeting Records: Searchable transcripts and summaries for future reference
- Instant Distribution: Summaries reach all participants immediately after meetings

How It Works

Step 1: Meeting Detection & Recording
- Start Meeting Trigger: Detects when a meeting begins via the Google Meet webhook
- Launch Vexa Bot: Automatically joins the meeting and starts recording
- End Meeting Trigger: Detects the meeting end and stops recording

Step 2: Audio Processing & Transcription
- Stop Vexa Bot: Ends the recording and retrieves the audio file
- Fetch Meeting Audio: Downloads the recorded audio from Vexa Bot
- Transcribe Audio: Converts speech to text using AI transcription

Step 3: AI Summary Generation
- Prepare Transcript: Formats the transcribed text for AI processing
- Generate Summary: The AI model creates a concise meeting summary covering key discussion points, decisions made, action items assigned, and next steps identified (see the sketch below)

Step 4: Distribution
- Send Email: Automatically emails the summary to all meeting participants

Setup Requirements

Google Meet Integration:
- Configure the Google Meet webhook and API credentials
- Set up meeting detection triggers
- Test with a sample meeting

Vexa Bot Configuration:
- Add Vexa Bot API credentials for recording
- Configure audio file retrieval settings
- Set recording quality and format preferences

AI Model Setup:
- Configure an AI transcription service (e.g., OpenAI Whisper, Google Speech-to-Text)
- Set up AI summary generation with custom prompts
- Define summary format and length preferences

Email Configuration:
- Set up SMTP credentials for email distribution
- Create email templates for meeting summaries
- Configure participant list extraction from meeting metadata

Import Instructions
1. Get the Workflow JSON: Copy the workflow JSON code
2. Open the n8n Editor: Navigate to your n8n dashboard
3. Import the Workflow: Click the menu (⋯) → "Import from Clipboard" → paste the JSON → Import
4. Configure Credentials: Add API keys for Google Meet, Vexa Bot, AI services, and SMTP
5. Test the Workflow: Run a test meeting to verify end-to-end functionality

Your meetings will now automatically transform into actionable summaries delivered to your inbox!
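A minimal sketch of how the Prepare Transcript and Generate Summary steps can hand data to the AI model: the prompt mirrors the summary structure described above, while the wording and the length limit are illustrative assumptions.

```typescript
// Sketch: build the prompt the summarization model receives. The model call
// itself is left to the AI node configured in the workflow.
function buildSummaryPrompt(meetingTitle: string, transcript: string): string {
  return [
    `Summarize the following meeting transcript for "${meetingTitle}".`,
    "Structure the summary under these headings:",
    "- Key discussion points",
    "- Decisions made",
    "- Action items assigned (with owners if mentioned)",
    "- Next steps identified",
    "Keep it under 300 words and write for attendees who missed the meeting.",
    "",
    "Transcript:",
    transcript,
  ].join("\n");
}

// Example: pass buildSummaryPrompt(title, transcriptText) to the summary node,
// then email the returned text to the participants extracted from the meeting
// metadata.
```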
AI-powered email to Jira ticket creation with Llama 3.2
This AI-powered workflow reads incoming emails, interprets the request using an LLM, and creates structured Jira issues with subtasks.

Key Insights
- New emails are polled every 5 minutes; ensure Gmail/IMAP is properly configured.
- AI analysis requires a reliable LLM model (e.g., a Chat Model or AI Tool).

Workflow Process
1. Trigger the workflow with the "Check for New Emails" Gmail Trigger node.
2. Fetch the full email content with the "Fetch Full Email Content" get-message node.
3. Analyze the email content with the "Analyze Email & Extract Tasks" node using AI.
4. Parse the AI-generated JSON output into tasks with the "Parse JSON Output from AI" node (see the sketch below).
5. Create the main Jira issue with the "Jira - Create Main Issue" create-issue node.
6. Split subtasks from the JSON and create them with the "Split Subtasks JSON Items" and "Create Subtasks" create-issue nodes.

Usage Guide
- Import the workflow into n8n and configure Gmail and Jira credentials.
- Test with a sample email to ensure ticket creation and subtask assignment.

Prerequisites
- Gmail/IMAP credentials for email polling
- Jira API credentials with issue-creation permissions

Customization Options
Adjust the "Analyze Email & Extract Tasks" node to refine AI task extraction, or modify the polling frequency in the trigger node.
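A minimal sketch of the Parse JSON Output from AI step, assuming the LLM is prompted to return a main issue plus subtasks as JSON; the field names are assumptions and should match whatever shape your prompt requests.

```typescript
// Sketch: parse the model's text output defensively before the Jira
// create-issue nodes run.
interface ExtractedTasks {
  summary: string;
  description: string;
  subtasks: { summary: string; description?: string }[];
}

function parseAiTasks(aiText: string): ExtractedTasks {
  // Strip code fences the model sometimes wraps around its JSON output.
  const cleaned = aiText.replace(/`{3}(?:json)?/g, "").trim();
  const parsed = JSON.parse(cleaned);
  return {
    summary: String(parsed.summary ?? "Untitled request"),
    description: String(parsed.description ?? ""),
    subtasks: Array.isArray(parsed.subtasks) ? parsed.subtasks : [],
  };
}

// The main issue is created from summary/description; each entry in
// `subtasks` is then split into its own Jira create-issue call.
```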
Conversational travel booker: Automate flight & hotel reservations with GPT-3.5
This guide walks you through setting up an AI-driven workflow to automate flight and hotel reservation processes using a conversational travel booking system. The workflow accepts booking requests, processes them via APIs, and sends confirmations, enabling a seamless travel booking experience.

What's the Goal?
- Automatically accept and process booking requests for flights and hotels via HTTP POST.
- Use AI to understand natural-language requests and route them to the appropriate data processors.
- Search for flights and hotels using external APIs and process booking confirmations.
- Send confirmation emails and return structured booking data to users.
- Enable an automated system for efficient travel reservations.

By the end, you'll have a self-running system that handles travel bookings effortlessly.

Why Does It Matter?
Manual booking processes are time-consuming and prone to errors. This workflow offers:
- Zero Human Error: AI ensures accurate request parsing and booking processing.
- Time-Saving Automation: Automates the entire booking lifecycle, boosting efficiency.
- Seamless Confirmation: Sends automated emails and responses without manual intervention.
- Enhanced User Experience: Provides a conversational interface for bookings.

Think of it as your reliable travel booking assistant that keeps the process smooth and efficient.

How It Works
Here's the step-by-step flow of the automation:

Step 1: Trigger the Workflow
Webhook Trigger: Accepts incoming booking requests via HTTP POST, initiating the workflow.

Step 2: Parse the Request
AI Request Parser: Uses AI to understand natural-language booking requests (e.g., flight or hotel) and extracts the relevant details.

Step 3: Route Booking Type
Booking Type Router: Determines whether the request is for a flight or a hotel and routes it to the respective data processor (see the sketch below).

Step 4: Process Flight Data
Flight Data Processor: Handles flight-specific data and prepares it for the search API.

Step 5: Search Flight API
Flight Search API: Searches for available flights based on the parameters (e.g., via https://api.aviationstack.com) and returns results.

Step 6: Process Hotel Data
Hotel Data Processor: Handles hotel-specific data and prepares it for the search API.

Step 7: Search Hotel API
Hotel Search API: Searches for available hotels based on the parameters (e.g., via https://api.booking.com) and returns results.

Step 8: Process Flight Booking
Flight Booking Processor: Processes flight bookings and generates confirmation details.

Step 9: Process Hotel Booking
Hotel Booking Processor: Processes hotel bookings and generates confirmation details.

Step 10: Generate Confirmation Message
Confirmation Message Generator: Creates structured confirmation messages for the user.

Step 11: Send Confirmation Email
Send Confirmation Email: Sends the booking confirmation to the user via email.

Step 12: Send Response
Send Response: Returns structured booking data to the user, completing the workflow.

How to Use the Workflow?
Importing the workflow into n8n is a straightforward process. Follow these steps to import the Conversational Travel Booker workflow:
1. Download the Workflow: Obtain the workflow file (e.g., a JSON export from n8n).
2. Open n8n: Log in to your n8n instance.
3. Import the Workflow: Navigate to the workflows section, click "Import," and upload the workflow file.
4. Configure Nodes: Adjust settings (e.g., API keys, webhook URLs) as needed.
5. Execute the Workflow: Test and activate the workflow to start processing bookings.

Requirements
- n8n account and instance setup.
- Access to flight and hotel search APIs (e.g., Aviationstack, Booking.com).
- Email service integration for sending confirmations.
- Webhook URL for receiving booking requests.

Customizing This Workflow
- Modify the AI Request Parser to handle additional languages or booking types.
- Update the API endpoints in the Flight Search API and Hotel Search API nodes to match your preferred providers.
- Adjust the Send Confirmation Email node to include custom email templates or additional recipients.
- Schedule the Webhook Trigger to align with your business hours or demand peaks.
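A minimal sketch of the Booking Type Router logic, assuming the AI Request Parser returns fields such as `booking_type`, `origin`, `destination`, `check_in`, and `check_out`; the actual parsed shape depends on your parser prompt.

```typescript
// Sketch: inspect the parsed AI output and route the request to the flight or
// hotel branch of the workflow.
type BookingType = "flight" | "hotel" | "unknown";

interface ParsedRequest {
  booking_type?: string;
  origin?: string;
  destination?: string;
  check_in?: string;
  check_out?: string;
}

function routeBooking(req: ParsedRequest): BookingType {
  const t = (req.booking_type ?? "").toLowerCase();
  if (t.includes("flight")) return "flight";
  if (t.includes("hotel")) return "hotel";
  // Fall back to whichever fields the parser actually extracted.
  if (req.origin && req.destination) return "flight";
  if (req.check_in && req.check_out) return "hotel";
  return "unknown"; // ask the user a clarifying question
}

// routeBooking({ booking_type: "hotel", check_in: "2025-09-01", check_out: "2025-09-03" })
// returns "hotel", sending the item down the Hotel Data Processor branch.
```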
AI-driven inventory management with OpenAI forecasting & ERP integration
This n8n workflow automates the monitoring of warehouse inventory and sales velocity to predict demand, generate purchase orders automatically, send them to suppliers, and record all transactions in ERP and database systems. It uses AI-driven forecasting to ensure timely restocking while maintaining operational efficiency and minimizing stockouts or overstocking.

Key Features
- Automated Scheduling: Periodically checks inventory and sales data at defined intervals.
- Real-Time Data Fetching: Retrieves live warehouse stock levels and sales trends.
- AI Demand Forecasting: Uses OpenAI GPT to predict future demand based on sales velocity and stock trends.
- Auto-Purchase Orders: Automatically generates and sends purchase orders to suppliers.
- ERP Integration: Logs completed purchase orders into ERP systems like SAP, Oracle, or NetSuite.
- Database Logging: Saves purchase order details and forecast confidence data into SQL databases (PostgreSQL/MySQL).
- Email Notifications: Notifies relevant teams upon successful order creation and logging.
- Modular Configuration: Each node includes configuration notes and credential setup instructions.

Workflow Process
1. Schedule Trigger: Runs every 6 hours to monitor stock and sales data. The interval can be adjusted for higher or lower frequency checks.
2. Fetch Current Inventory Data: Retrieves live inventory levels from the warehouse API endpoint. Requires API credentials and optional GET/POST method setup.
3. Fetch Sales Velocity: Pulls recent sales data for forecasting analysis, used later for AI-based trend prediction.
4. Merge Inventory & Sales Data: Combines inventory and sales datasets into a unified JSON structure and prepares the data for AI model input.
5. AI Demand Forecasting: Sends the merged data to OpenAI GPT for demand prediction. Returns a demand score, reorder need, and confidence level (see the sketch below).
6. Parse AI Response: Extracts and structures the forecast results and combines the AI data with the original inventory dataset.
7. Filter: Reorder Needed: Identifies items flagged for reorder based on the AI output and passes only reorder-required products to the next steps.
8. Create Purchase Order: Automatically creates a PO document with item details, quantity, and supplier information. Calculates the total cost and applies forecast-based reorder logic.
9. Send PO to Supplier: Sends the generated purchase order to supplier API endpoints, with response validation for order success or failure.
10. Log to ERP System: Records confirmed purchase orders into ERP platforms (SAP, Oracle, NetSuite), including timestamps and forecast metrics.
11. Save to Database: Stores all PO data, supplier responses, and AI forecast metrics in PostgreSQL/MySQL tables for long-term audit and analytics.
12. Send Notification Email: Sends summary emails upon PO creation and logging, including the PO ID, supplier, cost, and demand reasoning.

Setup Instructions
- Schedule Trigger: Adjust to your preferred interval (e.g., every 6 hours or once daily).
- API Configuration: Provide credentials in the Inventory, Sales, and Supplier nodes. Use Authorization headers or API keys as required by your systems.
- AI Node (OpenAI): Add your OpenAI API key in the credentials section. Modify the prompt if you wish to include additional forecasting parameters.
- ERP Integration: Replace the placeholder URLs with your ERP system endpoints. Match fields such as purchase order number, date, and cost.
- Database Connection: Configure credentials for PostgreSQL/MySQL in the Save to Database node. Ensure the tables (e.g., purchase_orders) are created as per the schema provided in the sticky notes.
- Email Notifications: Set up SMTP credentials (e.g., Gmail, Outlook, or a custom mail server) and add recipients under the workflow notification settings.

Industries That Benefit
This automation is highly beneficial for:
- Retail & E-commerce: Predicts product demand and auto-orders from suppliers.
- Manufacturing: Ensures raw materials are restocked based on production cycles.
- Pharmaceuticals: Maintains optimum inventory for high-demand medicines.
- FMCG & Supply Chain: Balances fast-moving goods availability with minimal overstocking.
- Automotive & Electronics: Prevents delays due to missing components.

Prerequisites
- API access to inventory, sales, supplier, and ERP systems.
- Valid OpenAI API key for demand forecasting.
- SQL database (PostgreSQL/MySQL) for record storage.
- SMTP or mail server credentials for email notifications.
- n8n environment with the required nodes installed (HTTP, AI, Filter, Email, Database).

Modification Options
- Change the forecast logic or thresholds for different industries.
- Integrate Slack/Teams for live notifications.
- Add an approval workflow before sending POs.
- Extend the AI prompt for seasonality or promotional trends.
- Add dashboard visualization using Grafana or Google Sheets.

Explore More AI Workflows: Get in touch with us to build industry-grade n8n automations with predictive intelligence.
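A minimal sketch of the AI Demand Forecasting and Parse AI Response steps, assuming the prompt asks for a JSON array with `demand_score`, `reorder_needed`, and `confidence` fields; keep the field names in sync with the prompt you actually use.

```typescript
// Sketch: build the forecasting prompt from merged inventory/sales data and
// parse the model's reply into the rows used by the reorder filter.
interface ItemSnapshot {
  sku: string;
  stock_on_hand: number;
  weekly_sales: number;
}

function buildForecastPrompt(items: ItemSnapshot[]): string {
  return [
    "You are an inventory planner. For each item below, return JSON of the form",
    '[{"sku": "...", "demand_score": 0-100, "reorder_needed": true/false, "confidence": 0-1}].',
    "Base your judgement on stock on hand versus weekly sales velocity.",
    JSON.stringify(items, null, 2),
  ].join("\n");
}

interface ForecastRow {
  sku: string;
  demand_score: number;
  reorder_needed: boolean;
  confidence: number;
}

function parseForecast(aiText: string): ForecastRow[] {
  // Remove any code fences around the JSON before parsing.
  const cleaned = aiText.replace(/`{3}(?:json)?/g, "").trim();
  const rows = JSON.parse(cleaned) as ForecastRow[];
  // Only reorder-flagged items continue to the purchase-order step.
  return rows.filter((r) => r.reorder_needed);
}
```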
Automate restaurant sales & inventory forecasting with Gemini AI & Google Sheets
This automated n8n workflow performs weekly forecasting of restaurant sales and raw-material requirements using historical data from Google Sheets and AI predictions powered by Google Gemini. The forecast is then emailed to stakeholders for efficient planning and waste reduction.

What is Google Gemini AI?
Google Gemini is an advanced AI model that analyzes historical sales data, seasonal patterns, and market trends to generate accurate forecasts for restaurant sales and inventory requirements, helping optimize purchasing decisions and reduce waste.

Good to Know
- Google Gemini AI forecasting accuracy improves over time with more historical data
- Weekly forecasting provides better strategic planning compared to daily predictions
- Google Sheets access must be properly authorized to avoid data sync issues
- Email notifications ensure timely review of weekly forecasts by stakeholders
- The system analyzes trends and predicts upcoming needs for efficient planning and waste reduction

How It Works
- Trigger Weekly Forecast: Automatically starts the workflow every week at a scheduled time
- Load Historical Sales Data: Pulls weekly sales and material usage data from Google Sheets
- Format Input for AI Agent: Transforms raw data into a structured format suitable for the AI agent (see the sketch below)
- Generate Forecast with AI: Uses Gemini AI to analyze trends and predict upcoming needs
- Interpret AI Forecast Output: Parses the AI's response into a readable, usable JSON format
- Log Forecast to Google Sheets: Stores the new forecast data back into a Google Sheet
- Email Forecast Summary: Sends a summary of the forecast via Gmail for stakeholder review

Data Sources
The workflow uses Google Sheets as the primary data source:

Historical Sales Data Sheet contains weekly sales and inventory data with these columns:
- Week/Date (date)
- Menu Item (text)
- Sales Quantity (number)
- Revenue (currency)
- Raw Material Used (number)
- Inventory Level (number)
- Category (text)

Forecast Output Sheet contains AI-generated predictions with these columns:
- Forecast Week (date)
- Menu Item (text)
- Predicted Sales (number)
- Recommended Inventory (number)
- Material Requirements (number)
- Confidence Level (percentage)
- Notes (text)

How to Use
1. Import the workflow into n8n
2. Configure Google Sheets API access and authorize the application
3. Set up Gmail credentials for forecast report delivery
4. Create the required Google Sheets with the specified column structures
5. Configure Google Gemini AI API credentials
6. Test with sample historical sales data to verify predictions and email delivery
7. Adjust forecasting parameters based on your restaurant's specific needs
8. Monitor and refine the system based on actual vs. predicted results

Requirements
- Google Sheets API access
- Gmail API credentials
- Google Gemini AI API credentials
- Historical sales and inventory data for initial training

Customizing This Workflow
Modify the Generate Forecast with AI node to focus on specific menu categories, seasonal adjustments, or local market conditions. Adjust the email summary format to match your restaurant's reporting preferences, and add additional data sources such as supplier information, weather data, or a special-events calendar for more accurate predictions.
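A minimal sketch of the Format Input for AI Agent step, grouping historical sheet rows by menu item before they are serialized into the Gemini prompt; the field names mirror the Historical Sales Data Sheet columns, and the output shape is an assumption.

```typescript
// Sketch: group weekly history per menu item so the prompt stays compact.
interface SalesRow {
  week: string;            // Week/Date column
  menu_item: string;
  sales_quantity: number;
  raw_material_used: number;
  inventory_level: number;
}

function formatForForecast(rows: SalesRow[]) {
  const byItem = new Map<string, SalesRow[]>();
  for (const row of rows) {
    const list = byItem.get(row.menu_item) ?? [];
    list.push(row);
    byItem.set(row.menu_item, list);
  }
  // One compact history block per menu item, sorted by the week string
  // (works when the column uses ISO-style dates).
  return [...byItem.entries()].map(([item, history]) => ({
    menu_item: item,
    history: history
      .sort((a, b) => a.week.localeCompare(b.week))
      .map((h) => ({
        week: h.week,
        sales: h.sales_quantity,
        material_used: h.raw_material_used,
        inventory: h.inventory_level,
      })),
  }));
}

// JSON.stringify(formatForForecast(sheetRows)) becomes the data section of the
// Gemini prompt that asks for next week's predicted sales and inventory needs.
```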
Daily tech news digest from Google News summarized with Llama AI and email delivery
This n8n workflow automatically fetches top technology news from Google News, summarizes it using AI, and sends a daily email with key updates. Users get a concise overview of important tech developments every morning.

Good to know
- Focuses specifically on technology news.
- Summarizes multiple sources into one concise email.
- Ensures consistent and easy-to-read formatting.
- Handles updates from different websites reliably.

How it works
1. Schedule Daily Tech News: Runs automatically every morning at 8 AM.
2. Fetch Google Tech News: Retrieves the latest tech news from Google News.
3. Extract Tech News Articles: Parses the HTML to extract headlines, sources, and timestamps.
4. Format Tech News Data: Prepares structured data ready for AI analysis.
5. Check If News Found: If no news is found, sends an error alert email; otherwise, continues to AI summarization.
6. AI Tech News Analyzer: Uses an AI model to summarize the articles and highlight key trends.
7. Send Tech News Email: Sends a formatted daily email with the summarized tech news (see the sketch below).
8. Send Error Alert (optional): Sends an alert email if no news can be found.

Email Example

Subject: 🌐 Daily Tech News Summary - August 14, 2025

📌 Top Technology Headlines Today:
- AI-powered tools are revolutionizing cloud computing. (Source: TechCrunch)
- Startup funding in India sees record growth. (Source: Economic Times)
- New smartphone launches include innovative camera features. (Source: The Verge)
- Cybersecurity threats increase amid remote work trends. (Source: Wired)

🗓️ Summary Date: August 14, 2025

How to use

Setup Instructions:
1. Import the workflow into your n8n instance.
2. Configure SMTP credentials for sending emails.
3. Set the schedule to run daily at your preferred time.
4. Test the workflow to ensure news is fetched and the email is sent correctly.

Requirements:
- n8n instance (cloud or self-hosted).
- Email account with SMTP access.
- Reliable internet connection.
- Access to Google News.

Troubleshooting:
- No news found: Check your internet connection and Google News accessibility.
- Email not sent: Verify SMTP credentials.
- AI summarization errors: Check model credentials and API usage.
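A minimal sketch of the final formatting step that turns summarized headlines into the email shown in the example; the `Headline` shape is an assumption about what the AI summarization step returns.

```typescript
// Sketch: build the digest email subject and body from summarized headlines.
interface Headline {
  summary: string;
  source: string;
}

function buildDigestEmail(headlines: Headline[], date: Date) {
  const dateLabel = date.toLocaleDateString("en-US", {
    year: "numeric",
    month: "long",
    day: "numeric",
  });
  const lines = [
    "📌 Top Technology Headlines Today:",
    ...headlines.map((h) => `- ${h.summary} (Source: ${h.source})`),
    "",
    `🗓️ Summary Date: ${dateLabel}`,
  ];
  return {
    subject: `🌐 Daily Tech News Summary - ${dateLabel}`,
    body: lines.join("\n"),
  };
}

// Example: buildDigestEmail([{ summary: "AI-powered tools are revolutionizing
// cloud computing.", source: "TechCrunch" }], new Date()) feeds the email node.
```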