12 templates found

Personal shopper chatbot for WooCommerce with RAG using Google Drive and OpenAI

This workflow combines OpenAI, Retrieval-Augmented Generation (RAG), and WooCommerce to create an intelligent personal shopping assistant. It handles two scenarios:

- Product Search: extracts user intent (keywords, price ranges, SKUs) and fetches matching products from WooCommerce.
- General Inquiries: answers store-related questions (e.g., opening hours, policies) using RAG and documents stored in Google Drive.

How It Works

Chat Interaction & Intent Detection
- Chat Trigger: starts when a user sends a message ("When chat message received").
- Information Extractor: uses OpenAI to analyze the message and determine whether the user is searching for a product or asking a general question. It extracts search (true/false), plus keyword, priceRange, SKU, and category when the message is product-related. Example:

  {
    "search": true,
    "keyword": "red handbags",
    "priceRange": { "min": 50, "max": 100 },
    "SKU": "BAG123",
    "category": "women's accessories"
  }

Product Search (WooCommerce Integration)
- AI Agent: if search: true, routes the request to the personal_shopper tool.
- WooCommerce Node: queries the WooCommerce store using the extracted parameters (keyword, priceRange, SKU), filters products in stock (stockStatus: "instock"), and returns matching products (e.g., "red handbags under €100").

General Inquiries (RAG System)
- RAG Tool: if search: false, uses the Qdrant Vector Store to retrieve store information from documents.
- Google Drive Integration: documents (e.g., store policies, FAQs) are stored in Google Drive, then downloaded, split into chunks, and embedded into Qdrant for semantic search.
- OpenAI Chat Model: generates answers based on the retrieved documents (e.g., "Our store opens at 9 AM").

Set Up Steps

Configure the RAG System
- Google Drive Setup: upload your store documents, then update the Google Drive2 node with your folder ID.
- Qdrant Vector Database: clean the collection (update the Qdrant Vector Store node with your URL) and use Embeddings OpenAI to convert documents into vectors.

Configure OpenAI & WooCommerce
- OpenAI Credentials: add your API key to all OpenAI nodes (OpenAI Chat Model, Embeddings OpenAI, etc.).
- WooCommerce Integration: connect your WooCommerce store (credentials in the personal_shopper node) and ensure product data is synced and accessible.

Customize the AI Agent
- Intent Detection: modify the Information Extractor's system prompt to align with your store's terminology.
- RAG Responses: update the tool description to reflect your store's documents.

Notes
This template is ideal for e-commerce businesses needing a hybrid assistant for product discovery and customer support. Need help customizing? Contact me for consulting and support or add me on LinkedIn.
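As a sketch of the routing step, here is how the Information Extractor's output (shaped like the JSON example above) could be turned into a tool decision and a product query. The WooCommerce parameter names below are illustrative assumptions, not the node's exact field names.

```python
# Hypothetical routing/mapping logic for the extracted intent.
# WooCommerce parameter names below are assumptions for illustration.

def route_intent(intent: dict) -> str:
    """Decide which tool the AI Agent should call."""
    return "personal_shopper" if intent.get("search") else "rag"

def build_product_query(intent: dict) -> dict:
    """Map extracted fields onto a WooCommerce product search."""
    params = {"stock_status": "instock"}  # only in-stock products
    if intent.get("keyword"):
        params["search"] = intent["keyword"]
    price = intent.get("priceRange") or {}
    if "min" in price:
        params["min_price"] = price["min"]
    if "max" in price:
        params["max_price"] = price["max"]
    if intent.get("SKU"):
        params["sku"] = intent["SKU"]
    return params

intent = {
    "search": True,
    "keyword": "red handbags",
    "priceRange": {"min": 50, "max": 100},
    "SKU": "BAG123",
    "category": "women's accessories",
}
print(route_intent(intent))        # personal_shopper
print(build_product_query(intent))
```

When search is false, the same decision sends the message to the Qdrant-backed RAG tool instead.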

By Davide
13378

Flux dev image generation (Fal.ai) to Google Drive

This workflow automates AI-based image generation using the Fal.ai Flux API. Define custom prompts and image parameters, then effortlessly generate, monitor, and save the output directly to Google Drive. Streamline your creative automation with ease and precision.

Who is this for?
This template is for content creators, developers, automation experts, and creative professionals looking to integrate AI-based image generation into their workflows. It's ideal for generating custom visuals with the Fal.ai Flux API and automating storage in Google Drive.

What problem is this workflow solving?
Manually generating AI-based images, checking their status, and saving the results can be tedious. This workflow automates the entire process — from requesting image generation, monitoring its progress, and downloading the result, to saving it directly to a Google Drive folder.

What this workflow does
• Sets custom image parameters: define the prompt, resolution, guidance scale, and steps for AI image generation.
• Sends a request to Fal.ai: initiates the image generation process using the Fal.ai Flux API.
• Monitors image status: checks for completion and waits if needed.
• Downloads the generated image: fetches the completed image once ready.
• Saves to Google Drive: automatically uploads the generated image to a specified Google Drive folder.

Setup
Prerequisites:
• Fal.ai API Key: obtain it from the Fal.ai platform and set it as the Authorization header in HTTP Header Auth credentials.
• Google Drive OAuth Credentials: connect your Google Drive account in n8n.
Configuration:
• Update the "Edit Fields" node with your desired image parameters:
  • Prompt: describe the image (e.g., "Thai young woman net idol 25 yrs old, walking on the street").
  • Width/Height: define the image resolution (default: 1024x768).
  • Steps: number of inference steps (e.g., 30).
  • Guidance Scale: controls how closely the image adheres to the prompt (e.g., 3.5).
• Set your Google Drive folder ID in the "Google Drive" node to save the image.
Run the Workflow:
• Trigger the workflow manually to generate the image.
• The workflow waits, checks status, and saves the final output seamlessly.

Customization
• Modify image parameters: adjust the prompt, resolution, steps, and guidance scale in the "Edit Fields" node.
• Change storage location: update the Google Drive node with a different folder ID.
• Add notifications: integrate an email or messaging node to alert you when the image is ready.
• Additional outputs: expand the workflow to send the generated image to Slack, Dropbox, or other platforms.

This workflow streamlines AI-based image generation and storage, offering flexibility and customization for creative automation.
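The request/poll/save cycle above can be sketched as plain HTTP calls. The endpoint path, header format, and payload field names here are assumptions based on the workflow description, not verified Fal.ai API details.

```python
import json
import urllib.request

FAL_ENDPOINT = "https://queue.fal.run/fal-ai/flux/dev"  # assumed endpoint

def build_payload(prompt, width=1024, height=768, steps=30, guidance_scale=3.5):
    """Mirror the 'Edit Fields' node: prompt plus image parameters."""
    return {
        "prompt": prompt,
        "image_size": {"width": width, "height": height},
        "num_inference_steps": steps,
        "guidance_scale": guidance_scale,
    }

def submit_job(api_key, payload):
    """POST the generation request; the API replies with a job to poll."""
    req = urllib.request.Request(
        FAL_ENDPOINT,
        data=json.dumps(payload).encode(),
        headers={"Authorization": f"Key {api_key}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:  # network call, not run here
        return json.load(resp)

payload = build_payload("Thai young woman net idol 25 yrs old, walking on the street")
print(payload["image_size"])  # {'width': 1024, 'height': 768}
```

In the workflow itself, the Wait and status-check nodes repeat until the job reports completion, and only then is the image downloaded and uploaded to Drive.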

By Sira Ekabut
9538

Forward filtered Gmail notifications to Telegram chat

This workflow automatically forwards incoming Gmail emails to a Telegram chat only if the email subject contains specific keywords (like "Urgent" or "Server Down"). The workflow extracts key details such as the sender, subject, and message body, and sends them as a formatted message to a specified Telegram chat. This is useful for real-time notifications, security alerts, or monitoring important emails directly from Telegram while filtering out unnecessary emails.

Prerequisites
Before setting up the workflow, ensure the following:
- The Gmail API is enabled.
- A bot has been created via @BotFather and you have its API key.
- You have the Telegram Chat ID (for personal or group messages).
- OAuth2 is set up for Gmail, and the Bot Token is used for Telegram.

Customisation Options
- Modify the subject keywords in the IF node to change the filtering criteria.
- Customize how the email details appear in Telegram (bold subject, italic body, etc.).
- Extend the workflow to include email attachments in Telegram.

Steps

Step 1: Gmail Trigger Node (On Message Received)
1. Select "Gmail Trigger" and add it to the workflow.
2. Authenticate with your Google account.
3. Set Trigger Event to "Message Received".
4. (Optional) Add filters for specific senders, labels, or subjects.
5. Click "Execute Node" to test the connection, then click "Save".

Step 2: IF Node (Conditional Filtering)
1. Add an "IF" node after the Gmail Trigger.
2. Configure the condition to check whether the email subject contains specific keywords (e.g., "Urgent", "Server Down", "Alert").
3. If the condition is true, proceed to the next step; if false, stop or route the email elsewhere (optional).

Step 3: Telegram Node (Send Message Action)
1. Click "Add Node" and search for Telegram.
2. Select "Send Message" as the action.
3. Authenticate using your Telegram Bot Token.
4. Set the Chat ID (personal or group chat).
5. Format the message using the email details from the trigger node and set it as the message text.

Step 4: Connect & Test the Workflow
1. Link Gmail Trigger → IF node → Telegram Send Message.
2. Save and execute the workflow manually.
3. Send a test email to your Gmail account.
4. Verify that the email details appear in your Telegram chat.

About the Creator, WeblineIndia
This workflow was created by the agentic business process automation developers at WeblineIndia. We build automation and AI-driven tools that make life easier for your team. If you're looking to hire dedicated developers who can customize workflows around your business, we're just a click away.
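The IF-node filter and Telegram message formatting reduce to a few lines of logic. The keyword list and the bold/italic formatting come from the description above; the exact field names are placeholders.

```python
# Keyword filter and message formatting, mirroring the IF and Telegram nodes.
KEYWORDS = ("urgent", "server down", "alert")

def should_forward(subject: str) -> bool:
    """IF node: true only when the subject contains a watched keyword."""
    s = subject.lower()
    return any(k in s for k in KEYWORDS)

def format_alert(sender: str, subject: str, body: str) -> str:
    """Telegram message: bold subject, italic body (HTML parse mode)."""
    return f"<b>{subject}</b>\nFrom: {sender}\n<i>{body}</i>"

print(should_forward("Urgent: disk full"))   # True
print(should_forward("Weekly newsletter"))   # False
```

Sending the result is then a single Telegram `sendMessage` call with `parse_mode` set to HTML.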

By WeblineIndia
2821

CallForge - 03 - Gong transcript processor and Salesforce enricher

CallForge - AI Gong Transcript PreProcessor

Transform your Gong.io call transcripts into structured, enriched, and AI-ready data for better sales insights and analytics.

Who is This For?
This workflow is designed for:
✅ Sales teams looking to automate call transcript formatting.
✅ Revenue operations (RevOps) professionals optimizing AI-driven insights.
✅ Businesses using Gong.io that need structured, enriched call transcripts for better decision-making.

What Problem Does This Workflow Solve?
Manually processing raw Gong call transcripts is inefficient and often lacks essential context for AI-driven insights. With CallForge, you can:
✔ Extract and format Gong call transcripts for structured AI processing.
✔ Enhance metadata using sales data from Salesforce.
✔ Classify speakers as internal (sales team) or external (customers).
✔ Identify external companies by filtering out free email domains (e.g., Gmail, Yahoo).
✔ Enrich customer profiles using PeopleDataLabs to identify company details and locations.
✔ Prepare transcripts for AI models by structuring conversations and removing unnecessary noise.

What This Workflow Does

Retrieves Gong Call Data
- Calls the Gong API to extract call metadata, speaker interactions, and collaboration details.
- Fetches call transcripts for AI processing.

Processes and Cleans Transcripts
- Converts call transcripts into structured, speaker-based dialogues.
- Labels each speaker as either Internal (sales team) or External (customer).

Extracts Company Information
- Retrieves Salesforce data to match customers with existing sales opportunities.
- Filters out free email domains to determine the customer's actual company domain.
- Calls the PeopleDataLabs API to retrieve additional company data and location details.

Merges and Enriches Data
- Combines Gong metadata, Salesforce customer details, and enrichment insights.
- Ensures all necessary data is available for AI-driven sales insights.

Final Formatting for AI Processing
- Merges all call transcript data into a single structured format for AI analysis.
- Extracts the final cleaned, enriched dataset for further AI-powered insights.

How to Set Up This Workflow

Connect Your APIs
🔹 Gong API Access – set up your Gong API credentials in n8n.
🔹 Salesforce Setup – ensure API access if you want customer enrichment.
🔹 PeopleDataLabs API – required to retrieve company and location details based on email domains.
🔹 Webhook Integration – modify the webhook call to push enriched call data to an internal system.

The CallForge series:
- CallForge - 01 - Filter Gong Calls Synced to Salesforce by Opportunity Stage
- CallForge - 02 - Prep Gong Calls with Sheets & Notion for AI Summarization
- CallForge - 03 - Gong Transcript Processor and Salesforce Enricher
- CallForge - 04 - AI Workflow for Gong.io Sales Calls
- CallForge - 05 - Gong.io Call Analysis with Azure AI & CRM Sync
- CallForge - 06 - Automate Sales Insights with Gong.io, Notion & AI
- CallForge - 07 - AI Marketing Data Processing with Gong & Notion
- CallForge - 08 - AI Product Insights from Sales Calls with Notion

How to Customize This Workflow
💡 Modify Data Sources – connect different CRMs (e.g., HubSpot, Zoho) instead of Salesforce.
💡 Expand AI Analysis – add another AI model (e.g., OpenAI GPT, Claude) for advanced conversation insights.
💡 Change Speaker Classification Rules – adjust the internal vs. external speaker logic to match your team's structure.
💡 Filter Specific Customers – modify the free email filtering logic to better fit your company's needs.

Why Use CallForge?
🚀 Automate Gong call transcript processing to save time.
📊 Improve AI accuracy with enriched, structured data.
🛠 Enhance sales strategy by extracting actionable insights from calls.

Start optimizing your Gong transcript analysis today!
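Two of the enrichment steps (classifying speakers and filtering free email domains) reduce to small pieces of logic like the following. The internal domain and the free-domain list are placeholders, not values from the template.

```python
# Placeholder logic for speaker classification and company-domain extraction.
FREE_DOMAINS = {"gmail.com", "yahoo.com", "hotmail.com", "outlook.com"}
INTERNAL_DOMAIN = "yourcompany.com"  # placeholder for your sales team's domain

def email_domain(email: str) -> str:
    return email.rsplit("@", 1)[-1].lower()

def classify_speaker(email: str) -> str:
    """Internal (sales team) vs. External (customer) speaker."""
    return "Internal" if email_domain(email) == INTERNAL_DOMAIN else "External"

def company_domain(email: str):
    """Skip free providers so PeopleDataLabs sees a real company domain."""
    domain = email_domain(email)
    return None if domain in FREE_DOMAINS else domain

print(classify_speaker("rep@yourcompany.com"))  # Internal
print(company_domain("buyer@gmail.com"))        # None
print(company_domain("buyer@acme.io"))          # acme.io
```

A `None` company domain means the workflow falls back to Salesforce matching rather than querying PeopleDataLabs with an unusable free-mail domain.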

By Angel Menendez
2029

Extract specific website data with form input, Gemini 2.5 Flash and Gmail

What this workflow does
This workflow creates an automated web scraper that accepts form submissions, extracts specific data from any website using AI, and emails the results back to you.

Step by step:
1. Web Scraper Form Submission provides a web form interface where users submit a URL and specify what data to extract.
2. Get HTML from Source URL fetches the complete HTML content from the provided website.
3. HTML Extractor processes the raw HTML and extracts the body content for analysis.
4. Data Extractor LLM Chain uses Google Gemini AI to intelligently analyze the content and extract only the specific data requested by the user.
5. Structured Output Parser formats the AI response into a clean, standardized JSON structure.
6. Gmail Send Result delivers the extraction results via email, including the source URL, the extraction request details, and the clean extracted results.

How to set up
1. Connect your Google Gemini API to the Google Gemini Chat Model node for AI-powered data extraction.
2. Connect your Gmail account to the Gmail node for sending result emails.
3. Update the recipient email in the Gmail node.
4. Customize the extraction prompt in the Data Extractor LLM Chain node based on your specific requirements.

How to customize this workflow to your needs
- Switch AI models: replace Google Gemini with OpenAI, Claude, or other LLM providers in the Chat Model node based on your accuracy requirements and budget.
- Change result delivery: replace Gmail with Google Sheets for data storage, Outlook for corporate email, Slack for team notifications, or webhook integrations for custom applications.
- Customize extraction prompts: modify the LLM prompt in the Data Extractor Chain to handle specific data types, extraction formats, or industry-specific terminology for your use case.

Need help customizing? Contact me for consulting and support: 📧 billychartanto@gmail.com
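The HTML Extractor step (pulling the body out of raw HTML before handing it to the LLM) can be approximated with a few lines. This regex approach is a simplified stand-in for whatever the actual node does.

```python
import re

def extract_body(html: str) -> str:
    """Return the <body> contents, falling back to the full document."""
    match = re.search(r"<body[^>]*>(.*?)</body>", html, re.S | re.I)
    return match.group(1).strip() if match else html

page = "<html><head><title>x</title></head><body><h1>Price: $9</h1></body></html>"
print(extract_body(page))  # <h1>Price: $9</h1>
```

Trimming head, scripts, and navigation chrome before the LLM call keeps the prompt short and focused on extractable content.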

By Billy Christi
1880

Post unassigned Zendesk tickets to Slack

> This has been updated to support the Query feature added to the Zendesk node in 0.144.0

This workflow posts all New and Open tickets without an assigned agent to a Slack channel on a schedule. The Function node is used in this example to merge multiple inputs into one output message, which is then used as the Slack message.

The output in Slack will be similar to the message below, where "TICKET_ID" is a link to the ticket.

> Unassigned Tickets
> TICKET_ID [STATUS] - TICKET_SUBJECT

Usage
- Update the Cron schedule; the default is 16:30 daily.
- Update the credentials in the Zendesk nodes.
- Update the credentials and channel in the Slack node.
- Grab a coffee and enjoy!

Zendesk Query
In the Zendesk node we use the query assignee:none status<pending, which returns all New and Open tickets with no assignee, allowing us to remove the extra nodes.
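The Function node's merge step (many ticket items in, one Slack message out) can be sketched like this. The field names and the Zendesk base URL are placeholders.

```python
# Merge ticket items into one Slack message, as the Function node does.
def build_slack_message(tickets, base="https://yourcompany.zendesk.com"):
    """One message: header line plus one linked line per unassigned ticket."""
    lines = ["*Unassigned Tickets*"]
    for t in tickets:
        lines.append(f"<{base}/agent/tickets/{t['id']}|{t['id']}> "
                     f"[{t['status'].upper()}] - {t['subject']}")
    return "\n".join(lines)

tickets = [
    {"id": 101, "status": "new", "subject": "Cannot log in"},
    {"id": 102, "status": "open", "subject": "Billing question"},
]
print(build_slack_message(tickets))
```

The `<url|label>` syntax is Slack's link markup, which renders each ticket ID as a clickable link.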

By Jonathan Bennetts
963

Generate educational social media carousels with GPT-4.1, Templated.io & Google Drive

🎯 Description
Automatically generates, designs, stores, and logs complete Instagram carousel posts. It transforms a simple text prompt into a full post with copy, visuals, rendered images, Google Drive storage, and a record in Google Sheets.

⚙️ Use case / What it does
This workflow enables creators, educators, or community managers to instantly produce polished, on-brand carousel assets for social media. It integrates OpenAI GPT-4.1, Pixabay, Templated.io, Google Drive, and Google Sheets into one continuous content-production chain.

💡 How it works
1️⃣ Form Trigger – collects the user prompt via a simple web form.
2️⃣ OpenAI GPT-4.1 – generates structured carousel JSON: titles, subtitles, topic, description, and visual keywords.
3️⃣ Code (Format content) – parses the JSON output for downstream use.
4️⃣ Google Drive (Create Folder) – creates a subfolder for the new carousel inside "RRSS".
5️⃣ HTTP Request (Pixabay) – searches for a relevant image using GPT's visual suggestion.
6️⃣ Code (Get first result) – extracts the top Pixabay result and image URL.
7️⃣ Templated.io – fills the design template layers (titles/subtitles/topic/image).
8️⃣ HTTP Request (Download renders) – downloads the rendered PNGs from Templated.io.
9️⃣ Google Drive (Upload) – uploads the rendered images into the created folder.
🔟 Google Sheets (Save in DB) – logs metadata (title, topic, folder link, description, timestamp, status).

🔗 Connectors used
- OpenAI GPT-4.1 (via n8n LangChain node)
- Templated.io API (design rendering)
- Pixabay API (stock image search)
- Google Drive (storage + folder management)
- Google Sheets (database / logging)
- Form Trigger (input collection)

🧱 Input / Output
Input: user-submitted "Prompt" (text) via form.
Output: generated carousel images stored in Google Drive, plus a spreadsheet row in Google Sheets containing title, topic, description, Drive URL, and status.

⚠️ Requirements / Setup
Valid credentials for:
- OpenAI API (GPT-4.1 access)
- Templated.io API key
- Pixabay API key
- Google Drive + Google Sheets OAuth connections
Also required:
- An existing Google Drive folder ID for RRSS storage
- A spreadsheet with matching column headers (Created At, Title, Topic, Folder URL, Description, Status)
- A published form URL for user prompts

🌍 Example applications / extensions
- Educational themes (mental health, fitness, sustainability).
- Extend to auto-publish to Instagram Business via the Meta API.
- Add Notion logging or automated email notifications.
- Integrate scheduling (Cron node) to batch-generate weekly carousels.
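Step 6's "Get first result" Code node reduces to picking the top hit from Pixabay's search response; `hits` and `largeImageURL` are field names Pixabay's image search API actually returns.

```python
# Pick the first Pixabay hit, as the 'Get first result' Code node does.
def first_image_url(pixabay_response: dict):
    """Return the first hit's large image URL, or None when nothing matched."""
    hits = pixabay_response.get("hits") or []
    return hits[0].get("largeImageURL") if hits else None

sample = {"totalHits": 2, "hits": [
    {"id": 1, "largeImageURL": "https://pixabay.com/img/a.jpg"},
    {"id": 2, "largeImageURL": "https://pixabay.com/img/b.jpg"},
]}
print(first_image_url(sample))        # https://pixabay.com/img/a.jpg
print(first_image_url({"hits": []}))  # None
```

Handling the empty-`hits` case explicitly keeps the workflow from failing when GPT's visual keyword matches no stock image.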

By Bastian Diaz
408

AI chatbot call center: Taxi service (Production-ready, part 3)

Workflow Name: 🛎️ Taxi Service
Template created in n8n v1.90.2
Skill Level: High
Categories: n8n, Chatbot

Stacks
- Execute Sub-workflow Trigger node
- Chat Trigger node
- Redis node
- Postgres node
- AI Agent node
- If node, Switch node, Code node, Edit Fields (Set)

Prerequisites
- Execute Sub-workflow Trigger: Taxi Service Workflow (or your own node)
- Sub-workflow: Taxi Service Provider (or your own node)
- Sub-workflow: Demo Call Back (or your own node)

Production Features
- Scaling design for n8n Queue mode in a production environment
- Service data from an external database with a caching mechanism
- Optional long-term memory design
- Route distance lookup using the Google Maps API
- Optional multi-language
- Wait output example
- Error management

What this workflow does
This is an n8n Taxi Service workflow demo. It is the core node for the taxi service: it receives messages from the Call Center workflow, handles Q&A with the caller, and passes the request to each Taxi Service Provider workflow to process the estimation.

How it works
1. The Flow Trigger node waits for a message from the Call Center or another sub-workflow.
2. When a message is received, it first checks for a matching service in the PostgreSQL database. If there is no service, or the service is inactive, it outputs an error.
3. Next, it always resets the Session Data in the cache, with channel_no set to taxi.
4. It then deletes the previous Route Data in the cache.
5. An AI Agent processes the fare estimation question to create the Route Data.
6. The Google Maps Route API is used to calculate the distance.
7. This repeats until the route data is created; the request is then passed to all Taxi Service Providers for an estimation.

Set up instructions
1. Pull and set up the required SQL from our GitHub repository.
2. Create your Redis credentials (refer to the n8n integration documentation for more information). Select your credentials in Service Cache, Save Service Cache, Reset Session, Delete Route Data, Route Data, Update User Session, and Create Route Data.
3. Create your Postgres credentials (refer to the n8n integration documentation for more information). Select your credentials in Load Service Data, Postgres Chat Memory, Load User Memory, and Save User Memory.
4. Modify the AI Agent prompt to fit your needs.
5. Set your Google Maps API key in Find Route Distance.

How to adjust it to your needs
- By default, this template uses the sys_service table for provider information; you can change it for your own design.
- You can use any AI model for the AI Agent node.
- Learn how we use the prompt for the Load/Save User Memory on demand. Included is our prompt for the taxi service. It is a flexible design that uses data from the Service node to customize the prompt, so you can duplicate this workflow as another service.
- Create different Taxi Providers to process the request and feed back the estimate.
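Once the Google Maps call has returned a route distance, a provider's estimate is typically distance-based. The base fare and per-kilometer rate below are made-up numbers for illustration only, not values from this template.

```python
# Toy fare model for a Taxi Service Provider sub-workflow.
# base_fare and per_km are invented placeholder values.
def estimate_fare(distance_meters: int, base_fare=3.0, per_km=1.8) -> float:
    """Base fare plus a per-kilometer rate, rounded to cents."""
    return round(base_fare + per_km * distance_meters / 1000, 2)

def build_route_request(origin: str, destination: str) -> dict:
    """Assumed request shape for a driving-route distance lookup."""
    return {
        "origin": {"address": origin},
        "destination": {"address": destination},
        "travelMode": "DRIVE",
    }

print(estimate_fare(5000))  # 12.0
```

Each provider sub-workflow can apply its own rates to the same shared route data, which is why the route is cached once and fanned out to all providers.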

By ChatPayLabs
398

Google Play review intelligence with Bright Data & Telegram alerts

Google Play Review Intelligence with Bright Data & Telegram Alerts

Overview
This n8n workflow automates the process of scraping Google Play Store reviews, analyzing app performance, and sending alerts for low-rated applications. It integrates with Bright Data for web scraping, Google Sheets for data storage, and Telegram for notifications.

Workflow Components

✅ Trigger Input Form
- Type: Form Trigger
- Purpose: initiates the workflow with user input
- Input fields: URL (Google Play Store app URL), number of reviews to fetch
- Function: captures user requirements to start the scraping process

🚀 Start Scraping Request
- Type: HTTP Request (POST)
- Purpose: sends the scraping request to the Bright Data API
- Endpoint: https://api.brightdata.com/datasets/v3/trigger
- Parameters: dataset ID gd_m6zagkt024uwvvwuyu, include errors: true, limit multiple results: 5
- Custom output fields: url, review_id, reviewer_name, review_date, review_rating, review, app_url, app_title, app_developer, app_images, app_rating, app_number_of_reviews, app_what_new, app_content_rating, app_country, num_of_reviews

🔄 Check Scrape Status
- Type: HTTP Request (GET)
- Purpose: monitors the progress of the scraping job
- Endpoint: https://api.brightdata.com/datasets/v3/progress/{snapshot_id}
- Function: checks whether the dataset scraping is complete

⏱️ Wait for Response 45 sec
- Type: Wait node
- Purpose: implements the polling mechanism
- Duration: 45 seconds
- Function: pauses the workflow before checking the status again

🧩 Verify Completion
- Type: IF condition
- Purpose: evaluates the scraping completion status
- Condition: status === "ready"
- Logic: true → proceeds to fetch data; false → loops back to the status check

📥 Fetch Scraped Data
- Type: HTTP Request (GET)
- Purpose: retrieves the final scraped data
- Endpoint: https://api.brightdata.com/datasets/v3/snapshot/{snapshot_id}
- Format: JSON
- Function: downloads the completed review and app data

📊 Save to Google Sheet
- Type: Google Sheets node
- Purpose: stores scraped data for analysis
- Operation: append rows
- Target: specified Google Sheet document
- Data mapping: URL, Review ID, Reviewer Name, Review Date, Review Rating, Review Text, App Rating, App Number of Reviews, App What's New, App Country

⚠️ Check Low Ratings
- Type: IF condition
- Purpose: identifies poor-performing apps
- Condition: review_rating < 4
- Logic: true → triggers the alert notification; false → no action taken

📣 Send Alert to Telegram
- Type: Telegram node
- Purpose: sends performance alerts
- Message format:
  ⚠️ Low App Performance Alert
  📱 App: {app_title}
  🧑‍💻 Developer: {app_developer}
  ⭐ Rating: {app_rating}
  📝 Reviews: {app_number_of_reviews}
  🔗 View on Play Store

Workflow Flow
Input Form → Start Scraping → Check Status → Wait 45s → Verify Completion (loops back to Check Status until ready) → Fetch Data → Save to Sheet & Check Ratings → Send Telegram Alert

Configuration Requirements

API Keys & Credentials
- Bright Data API key: required for web scraping
- Google Sheets OAuth2: for data storage access
- Telegram Bot token: for alert notifications

Setup Parameters
- Google Sheet ID: target spreadsheet identifier
- Telegram Chat ID: destination for alerts
- n8n Instance ID: workflow instance identifier

Key Features

Data Collection
- Comprehensive app metadata extraction
- Review content and rating analysis
- Developer and country information
- App store performance metrics

Quality Monitoring
- Automated low-rating detection
- Real-time performance alerts
- Continuous data archiving

Integration Capabilities
- Bright Data web scraping service
- Google Sheets data persistence
- Telegram instant notifications
- Polling-based status monitoring

Use Cases

App Performance Monitoring
- Track rating trends over time
- Identify user sentiment patterns
- Monitor competitor performance

Quality Assurance
- Early warning for rating drops
- Customer feedback analysis
- Market reputation management

Business Intelligence
- Review sentiment analysis
- Performance benchmarking
- Strategic decision support

Technical Notes
- Polling interval: 45-second status checks
- Rating threshold: alerts triggered for ratings < 4
- Data format: JSON with structured field mapping
- Error handling: includes error tracking in dataset requests
- Result limiting: maximum of 5 multiple results per request

For any questions or support, please contact info@incrementors.com or fill out this form: https://www.incrementors.com/contact-us/
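The Verify Completion and Check Low Ratings branches are both simple predicates, sketched here using the status value and rating threshold stated in the description.

```python
# Predicates behind the two IF nodes described above.
def next_step(progress: dict) -> str:
    """Verify Completion: fetch data when ready, otherwise wait and re-poll."""
    return "fetch_data" if progress.get("status") == "ready" else "wait_45s"

def needs_alert(review: dict) -> bool:
    """Check Low Ratings: alert when review_rating < 4."""
    return review.get("review_rating", 5) < 4

print(next_step({"status": "running"}))   # wait_45s
print(next_step({"status": "ready"}))     # fetch_data
print(needs_alert({"review_rating": 2}))  # True
```

The 45-second wait between polls keeps the loop from hammering the Bright Data progress endpoint while a snapshot is still building.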

By Incrementors
326

Monitor lead response time SLA breaches with Google Sheets & Telegram alerts

Description
Never miss a lead again with this SLA Breach Alert automation powered by n8n! This workflow continuously monitors your Google Sheets for un-replied leads and automatically triggers instant Telegram alerts, ensuring your team takes immediate action. By running frequent SLA checks, enriching alerts with direct Google Sheet links, and sending real-time notifications, this automation helps prevent unattended leads, reduce response delays, and boost customer engagement.

What This Template Does
📅 Runs every 5 minutes to monitor SLA breaches
📋 Fetches lead data (status, contact, timestamps) from Google Sheets
🕒 Identifies leads marked "Un-replied" beyond the 15-minute SLA
🔗 Enriches alerts with direct Google Sheet row links for quick action
📲 Sends Telegram alerts with lead details for immediate response

Step-by-Step Setup

1. Prepare your Google Sheet
Create a sheet with at least the following columns: Lead Name, Email, Phone, Status (values: Replied, Un-replied), Timestamp (time of last update/reply).

2. Set up Google Sheets in n8n
Connect your Google account in n8n and point the workflow to your sheet (remove any hardcoded document IDs before sharing).

3. Configure the SLA check
Use the IF node to filter leads where Status = Un-replied and the time since the timestamp is greater than 15 minutes.

4. Enrich alerts with links
Add a Code node to generate direct row links to the sheet.

5. Set up the Telegram bot
Create a Telegram bot via @BotFather, add the bot to your team chat, and store the botToken securely (remove the chatId before sharing templates).

6. Send alerts
Configure the Telegram node in n8n to send lead details plus a direct Google Sheet link.

Customization Guidance
- Adjust the SLA window (e.g., 30 minutes or 1 hour) by modifying the IF node condition.
- Add more fields from Google Sheets (e.g., Company, Owner) to enrich the alert.
- Replace Telegram with Slack or Email if your team prefers a different channel.
- Extend the workflow to auto-assign leads in your CRM once alerted.

Perfect For
- Sales teams that need to respond to leads within strict SLAs
- Support teams ensuring no customer request is ignored
- Businesses aiming to keep lead response times sharp and consistent
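The IF-node condition and the row-link enrichment reduce to the following. The sheet URL format assumes the default first tab (gid=0); adjust for other tabs.

```python
from datetime import datetime, timedelta, timezone

SLA_WINDOW = timedelta(minutes=15)

def is_breach(status: str, last_update: datetime, now: datetime) -> bool:
    """IF node: un-replied and older than the SLA window."""
    return status == "Un-replied" and (now - last_update) > SLA_WINDOW

def row_link(sheet_id: str, row: int) -> str:
    """Code node: deep link straight to the lead's row (assumes gid=0 tab)."""
    return (f"https://docs.google.com/spreadsheets/d/{sheet_id}"
            f"/edit#gid=0&range=A{row}")

now = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
print(is_breach("Un-replied", now - timedelta(minutes=20), now))  # True
print(is_breach("Replied", now - timedelta(minutes=20), now))     # False
```

Widening `SLA_WINDOW` to 30 or 60 minutes is the code equivalent of the IF-node adjustment mentioned under customization.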

By Rahul Joshi
166

Track meal nutrition from meal photos with LINE, Google Gemini and Google Sheets

AI Meal Nutrition Tracker with LINE and Google Sheets

Who's it for
This workflow is designed for health-conscious individuals, fitness enthusiasts, and anyone who wants to track their daily food intake without manual calorie counting. It is best suited for users who want a simple, AI-powered meal logging system that analyzes food photos one at a time and provides instant nutritional feedback via LINE.

What it does
This workflow processes a single meal photo sent via LINE, analyzes it using Google Gemini AI to identify foods and estimate nutritional content, and stores the data in Google Sheets for tracking. The workflow focuses on simplicity and encouragement: it receives a meal image, performs AI-based food recognition, estimates calories and macronutrients, calculates a health score, provides personalized advice, and replies with a detailed nutritional breakdown on LINE.

How it works
1. A single meal photo is sent to the LINE bot.
2. The workflow is triggered via a LINE webhook.
3. The image file is downloaded and sent to Google Gemini AI for food analysis.
4. The AI identifies foods and estimates nutritional values (calories, protein, carbs, fat, fiber).
5. A health score (1-10) is calculated with personalized improvement tips.
6. The data is appended to Google Sheets for meal history tracking.
7. The image is uploaded to Google Drive for reference.
8. A formatted nutritional report with advice is sent back as a LINE reply.
This workflow is intentionally designed to handle one image per execution.

Requirements
To use this workflow, you will need:
- A LINE Messaging API account
- A Google Gemini API key
- A Google account with access to Google Sheets and Google Drive
- A Google Sheets document with the following column names: Date, Time, Meal Type, Food Items, Calories, Protein (g), Carbs (g), Fat (g), Fiber (g), Health Score, Advice, Image URL

Important limitations
- This workflow does not support multiple images sent in a single message.
- Sending images in quick succession may trigger multiple executions and lead to unexpected results.
- Only the first image in an event payload is processed.
- Nutritional values are AI estimates based on visual analysis and typical serving sizes. Accuracy depends on image quality, lighting, and food visibility.
- This tool should not replace professional dietary advice.
These limitations are intentional to keep the workflow simple and easy to understand.

How to set up
1. Create a LINE Messaging API channel and obtain a Channel Access Token.
2. Generate a Google Gemini API key.
3. Update the Config node with your LINE token, Google Sheets ID, Google Drive folder ID, and daily calorie goal.
4. Configure credentials for LINE, Google Gemini, Google Sheets, and Google Drive.
5. Register the n8n webhook URL in your LINE channel settings.
6. Activate the workflow in n8n and test it with a single meal photo.

How to customize
- Modify the AI prompt in the "Analyze Meal with AI" node to support different languages or dietary frameworks (keto, vegan, etc.).
- Adjust the daily calorie goal in the Config node to match individual needs.
- Add additional nutritional fields such as sodium, sugar, or vitamins.
- Replace Google Sheets with a fitness app API or database.
- Integrate with other services to send daily/weekly nutrition summaries.

Note: This workflow was tested using real meal photos sent individually via the LINE Messaging API. Nutritional estimates are approximations and may vary from actual values. For accurate dietary tracking, consult a registered dietitian.
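The append-to-sheet step maps the AI's analysis onto the column list above. A minimal sketch, assuming the AI output uses keys matching the column names exactly.

```python
# Map an AI analysis dict onto one ordered Google Sheets row.
COLUMNS = ["Date", "Time", "Meal Type", "Food Items", "Calories",
           "Protein (g)", "Carbs (g)", "Fat (g)", "Fiber (g)",
           "Health Score", "Advice", "Image URL"]

def to_sheet_row(analysis: dict) -> list:
    """One ordered row per meal; missing fields become empty cells."""
    return [analysis.get(col, "") for col in COLUMNS]

row = to_sheet_row({"Date": "2024-05-01", "Calories": 520, "Health Score": 7})
print(row[0], row[4], row[9])  # 2024-05-01 520 7
```

Keeping the column order in one place makes it easy to add fields like sodium or sugar later, as the customization section suggests.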

By Oka Hironobu
89

Automate job matching with Gemini AI, Decodo scraping & resume analysis to Telegram

AI Job Matcher with Decodo, Gemini AI & Resume Analysis

Sign up for Decodo — get better pricing here

Who's it for
This workflow is built for job seekers, recruiters, founders, automation builders, and data engineers who want to automate job discovery and intelligently match job listings against resumes using AI. It's ideal for anyone building job boards, candidate matching systems, hiring pipelines, or personal job alert automations using n8n.

What this workflow does
This workflow automatically scrapes job listings from SimplyHired using Decodo residential proxies, extracts structured job data with a Gemini AI agent, downloads resumes from Google Drive, extracts and summarizes resume content, and surfaces the most relevant job opportunities. The workflow stores structured results in a database and sends real-time notifications via Telegram, creating a scalable and low-maintenance AI-powered job matching pipeline.

How it works
1. A schedule trigger starts the workflow automatically.
2. Decodo fetches job search result pages from SimplyHired.
3. Job card HTML is extracted from the page.
4. A Gemini AI agent converts raw HTML into structured job data.
5. Resume PDFs are downloaded from Google Drive.
6. Resume text is extracted from the PDF files.
7. A Gemini AI agent summarizes key resume highlights.
8. Job and resume data are stored in a database.
9. Matching job alerts are sent via Telegram.

How to set up
1. Add your Decodo API credentials.
2. Add your Google Gemini API key.
3. Connect Google Drive for resume access.
4. Configure your Telegram bot.
5. Set up your database (Google Sheets by default).
6. Update the job search URL with your keywords and location.

Requirements
- Self-hosted n8n instance
- Decodo account (community node)
- Google Gemini API access
- Google Drive access
- Telegram Bot token
- Google Sheets or another database

> Note: This template uses a community node (Decodo) and is intended for self-hosted n8n only.

How to customize the workflow
- Replace SimplyHired with another job board or aggregator
- Add job–resume matching or scoring logic
- Extend the resume summary with custom fields
- Swap Google Sheets for PostgreSQL, Supabase, or Airtable
- Route notifications to Slack, Email, or Webhooks
- Add pagination or multi-resume processing
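One way to add the job-resume scoring logic mentioned under customization is a simple keyword-overlap score. This is an illustrative baseline of my own, not part of the template.

```python
# Baseline keyword-overlap scoring between a resume and a job listing.
def match_score(resume_keywords: list, job_text: str):
    """Fraction of resume keywords found in the job listing text."""
    text = job_text.lower()
    hits = [k for k in resume_keywords if k.lower() in text]
    return len(hits) / len(resume_keywords), hits

score, hits = match_score(
    ["python", "n8n", "PostgreSQL"],
    "Automation engineer: Python and n8n experience required.",
)
print(round(score, 2), hits)  # 0.67 ['python', 'n8n']
```

A threshold on this score (or a Gemini-generated relevance rating in its place) can then gate which listings trigger a Telegram alert.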

By Rully Saputra
65