
Breakdown documents into study notes using templating MistralAI and Qdrant

This n8n workflow takes in a document (such as a research paper, a marketing or sales deck, or company filings) and breaks it down into 3 templates: a study guide, a briefing doc and a timeline. These templates are designed to help a student, associate or clerk quickly summarise, learn and understand the contents to be more productive.

- Study guide: a short quiz of questions and answers generated by the AI agent from the contents of the document.
- Briefing doc: key information and insights are extracted by the AI into a digestible form.
- Timeline: key events, durations and people are identified and listed by the AI in a simple-to-understand timeline.

How it works
- A local file trigger watches a local network directory for new documents.
- New documents are imported into the workflow; their contents are extracted and vectorised into a Qdrant vector store to build a mini-knowledgebase.
- The document then passes through a series of template-generating prompts where the AI performs "research" on the knowledgebase to generate the template contents.
- The generated study guide, briefing and timeline documents are exported to a designated folder for the user.

Requirements
- Self-hosted version of n8n.
- Qdrant instance for the knowledgebase.
- Mistral.ai account for embeddings and the AI model.

Customising your workflow
Try adding your own templates or adjusting the existing ones to suit your unique use case; anything is quite possible and limited only by your imagination! Want to go fully local? A version of this workflow which uses Ollama instead is available here: https://drive.google.com/file/d/1VV5R2nW-IhVcFP_k8uEks4LsLRZrHSNG/view?usp=sharing
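The "extracted and vectorised" step typically involves splitting the document text into overlapping chunks before embedding. A minimal sketch of that chunking (the chunk size and overlap values are illustrative assumptions, not the template's actual settings):

```javascript
// Minimal sketch (not the template's exact Code node): split a document's
// text into overlapping chunks before embedding into Qdrant.
// chunkSize/overlap values are illustrative assumptions.
function chunkText(text, chunkSize = 500, overlap = 50) {
  const chunks = [];
  let start = 0;
  while (start < text.length) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break;
    start += chunkSize - overlap; // step forward, keeping `overlap` chars of context
  }
  return chunks;
}
```

Each chunk would then be embedded via the Mistral embeddings node and upserted into the Qdrant collection.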

By Jimleuk
28146

WhatsApp expense tracker with PostgreSQL database & AI-powered reports

Track Personal Finances with WhatsApp and AI Assistant

Transform your WhatsApp into a powerful personal finance command center. This AI-powered workflow converts natural language messages into structured financial data, automates record-keeping, and delivers instant insights, all within your favorite messaging app.

Who is this for?
- Personal finance enthusiasts who want effortless expense tracking
- Small business owners managing personal and business expenses
- Freelancers tracking income and expenses across projects
- Anyone who prefers messaging over complex finance apps
- Users seeking privacy with self-hosted financial data

What problem is this workflow solving?
Traditional expense tracking requires switching between apps, manual data entry, and complex spreadsheets; most people abandon these systems within weeks. This workflow removes the friction by:
- Eliminating app-switching: everything happens in WhatsApp
- Converting natural language to structured data automatically
- Providing instant confirmations and reports
- Requiring zero learning curve or behavior change

What this workflow does

Smart transaction processing
Send a natural message like "Spent 300 on groceries at Walmart" and the AI automatically extracts:
- Date: today's date (or a specified date)
- Category: Groceries
- Type: Expense/Income/Debt
- Amount: 300
- Person/Company: Walmart

Intelligent message classification
The workflow automatically routes messages to three processing branches:
- Branch 0: Reports and analytics ("show March expenses")
- Branch 1: Transaction logging ("spent 50 on coffee")
- Branch 2: General financial chat ("how can I save money?")

Advanced reporting
Generate instant reports by messaging:
- "today's report": daily income/expense summary
- "March vs April report": monthly comparisons with percentages
- "show groceries spending": category-specific analysis
- Automatic daily summaries at your preferred time

Database integration
All transactions are stored in PostgreSQL with a proper schema:

```sql
CREATE TABLE financial_transactions (
  date DATE NOT NULL,
  category TEXT NOT NULL,
  type TEXT NOT NULL,
  amount NUMERIC(12,2) NOT NULL,
  person TEXT
);
```

Setup

Prerequisites
- n8n instance (self-hosted or n8n.cloud)
- WhatsApp Business Cloud API credentials
- PostgreSQL database (version 12+)
- OpenRouter API key for AI processing

Quick setup steps
1. Import the workflow template into your n8n instance
2. Configure credentials: WhatsApp Business Cloud API (App Token + Phone Number ID), PostgreSQL connection details, OpenRouter API key
3. Create the database table using the provided SQL schema
4. Test the connection by sending a sample message
5. Customize the scheduled report timing (default: 8 AM daily)

Verification checklist
- [ ] WhatsApp webhook receives messages
- [ ] AI correctly parses transaction messages
- [ ] Database insertions work properly
- [ ] Confirmation messages are sent back
- [ ] Reports generate with accurate data

How to customize this workflow to your needs

AI model configuration
- Default: OpenRouter with GPT-3.5-turbo for cost efficiency
- Upgrade: switch to GPT-4 or Claude for better accuracy
- Local: replace with self-hosted Ollama for complete privacy

Database options
- PostgreSQL: recommended for production use
- Google Sheets: alternative for simpler setups (nodes included)
- MySQL/SQLite: easily adaptable with minor SQL modifications

Message classification
Customize the classification system:
- 0: Reports (modify SQL queries for different analytics)
- 1: Transactions (adjust parsing rules for your language/currency)
- 2: Chat (customize AI responses for financial advice)

Reporting customization
- Scheduled reports: change the timing, format, and recipients
- Custom periods: add quarterly, yearly, or custom date ranges
- Categories: modify auto-categorization rules for your spending patterns
- Currency: update formatting for your local currency

Advanced features
- Multi-user support: add user identification for family/team use
- Receipt photos: extend the workflow to process image receipts via OCR
- Budgets: add budget tracking and overspend alerts
- Integrations: connect to banks via Plaid or other financial APIs

Complete package included
When you download this template, you get everything needed for immediate implementation:

Ready-to-use n8n workflow
- Fully configured nodes with descriptive names explaining each step
- Color-coded sticky notes throughout the workflow explaining what each branch does (Reports/Transactions/Chat), how the AI classification works, database connection requirements, and error handling and troubleshooting tips

Comprehensive documentation bundle
- Quick start guide: get running in under 10 minutes
- Detailed setup guide: complete configuration walkthrough with screenshots
- Branch explanation guide: deep dive into each processing branch:
  - Branch 0: Reports & analytics (SQL queries and formatting)
  - Branch 1: Transaction processing (AI parsing and database insertion)
  - Branch 2: Financial chat (AI responses and conversation handling)

Built-in workflow documentation
- Sticky notes at every major step explaining the logic
- Node descriptions that clarify what each component does
- Visual flow indicators showing message routing paths
- Dependency callouts highlighting required credentials and connections

Technical implementation details
- Database schema with complete SQL commands
- API configuration examples for all external services
- Troubleshooting checklist for common setup issues
- Performance optimization recommendations

Bonus resources
- Example message templates to test each workflow branch
- Sample data for testing reports and analytics
- Customization recipes for common modifications
- Integration patterns for extending functionality

Example usage

Log expenses:
- "Spent 1200 on rent this month"
- "Paid 45 for gas at Shell"
- "Coffee 5.50 at Starbucks"

Log income:
- "Received 5000 salary from Company ABC"
- "Freelance payment 800 from Client XYZ"

Generate reports:
- "today's summary"
- "show this week's expenses"
- "compare March vs April spending"
- "how much on food this month?"

Expected responses:
✅ Logged: expense | Rent | ₹1,200.00 | Landlord
✅ Logged: income | Salary | ₹5,000.00 | Company ABC
📊 Today's Summary: Income: ₹0.00 | Expenses: ₹1,245.50 | Savings: -₹1,245.50
📈 March vs April: Expenses: ₹15,000 vs ₹12,500 (-16.7%) | Top categories: Rent, Food, Transport
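The three-branch routing described above is handled by an LLM in the template; as a rough illustration of the same idea, a naive keyword-based classifier could look like this (the keywords are assumptions for the sketch, not the template's prompt):

```javascript
// Illustrative sketch only: the template uses an LLM for classification,
// but the three-branch routing can be approximated with keywords.
// Branch 0 = reports, 1 = transactions, 2 = general chat.
function classifyMessage(text) {
  const t = text.toLowerCase();
  if (/\b(report|summary|show|compare)\b/.test(t)) return 0; // analytics requests
  if (/\b(spent|paid|received|payment)\b|\d/.test(t)) return 1; // looks like a transaction
  return 2; // everything else goes to chat
}
```

In the real workflow the LLM also extracts the structured fields (date, category, amount) in the same pass, which keywords alone cannot do.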

By Roshan Ramani
2868

Classify event photos from attendees with Gemma AI, Google Drive & Sheets

There's a clear need for an easier way to manage attendee photos from live events, as current processes for collecting, sharing, and categorizing them are inefficient. n8n can help solve this challenge by providing the data input interface via its forms and orchestrating AI-powered classification of images using AI nodes. However, in some cases (say you run regular events or ones with high attendee counts) the volume of photos may result in unsustainably high inference fees under token-based billing, which could make the project unviable. To work around this, Featherless.ai is a subscription-based AI/LLM inference service that provides unlimited tokens instead. This means AI costs are essentially capped, offering greater control and confidence over AI project budgets.

Check out the final result here: https://docs.google.com/spreadsheets/d/1TpXQyhUq6tB8MLJ3maeWwswjut9wERZ8pSk_3kKhc58/edit?usp=sharing

How it works
- A form trigger is used to share a form interface with guests so they can upload photos from their devices.
- In one branch, the photos are optimised in size before being sent to a vision-capable LLM to classify and categorise them against a set list of tags. The model inference service is provided by Featherless and takes advantage of their unlimited-token subscription plan.
- In another branch, the photos are copied into Google Drive for later reference.
- Once both branches are complete, the classification results and Google Drive link are appended to a Google Sheets table, allowing for quick sorting and filtering of all photos.

How to use
Use this workflow to gain an incredible productivity boost for social media work. When all photos are organised and filter-ready, editors spend a fraction of the time getting community posts ready and delivered. Sharing the completed Google Sheet with attendees helps them share memories within their own social circles.
Requirements
- Featherless.ai account for open-source multimodal LLMs and unlimited token usage.
- Google Drive for file storage.
- Google Sheets for organising photos into categories.

Customising this workflow
Feel free to refine the form with custom styles to match your branding. Swap out the Google services with equivalents to match your own environment, e.g. SharePoint and Excel.
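Classifying "against a set list of tags" usually means constraining whatever the LLM returns to the allowed tag set. A hedged sketch of that post-processing step (the tag names and output format are illustrative assumptions, not the template's actual list):

```javascript
// Hypothetical sketch: constrain an LLM's comma-separated classification
// output to a fixed tag list, discarding anything outside the allowed set.
// The tag names here are illustrative, not from the template.
const ALLOWED_TAGS = ["keynote", "networking", "food", "group-photo", "venue"];

function sanitizeTags(modelOutput) {
  return modelOutput
    .split(",")
    .map(t => t.trim().toLowerCase())
    .filter(t => ALLOWED_TAGS.includes(t)); // drop hallucinated tags
}
```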

By Jimleuk
1175

Organize Gmail attachments in Google Drive folders based on sender’s email

📩🤖 This workflow automatically processes emails received in Gmail, extracts their attachments, and organizes them into specific folders in Google Drive based on the sender's email address.

Note: the workflow avoids duplicates by checking for folder existence before creation.

Benefits
- ✅ Automated organization: no need to manually sort or download email attachments.
- 📁 Sender-based categorization: files are stored in clearly labeled folders per sender, improving traceability and reducing clutter.
- ⏱ Time-saving: reduces repetitive administrative tasks by automating the workflow end-to-end.
- 🔁 Modular and scalable: can be easily extended or reused with other services (e.g., Dropbox, S3) or integrated into larger document workflows.
- 🔐 Secure cloud storage: attachments are safely backed up in Google Drive, minimizing the risk of data loss from email.

How it works
- Trigger: the workflow can be triggered manually ("When clicking 'Execute workflow'") or automatically (via a Gmail Trigger polling for emails every minute).
- Email processing: fetches emails with attachments from Gmail within a date range (default: July 6–9, 2025). For each email, an IF node checks whether it contains attachments.
- Folder management: searches Google Drive for a folder named after the sender's email address (under the parent folder "Email Attachments") and creates the folder if it doesn't exist.
- Attachment handling: splits out binary attachments, extracts filenames, and uploads each file to the sender's dedicated folder in Google Drive.
- Sub-workflow execution: uses Execute Workflow to modularize the upload process (reusable in other workflows).

Set up steps
- Google services: connect the Gmail and Google Drive nodes to your accounts via OAuth2. Ensure the parent folder "Email Attachments" (ID: 1EitwWVd5rKZTlvOreB4R-6xxxxxx) exists in Google Drive.
- Adjust the date range: modify receivedAfter/receivedBefore in the Get emails node to target specific emails.
- Test: run manually to verify folder creation and attachment uploads.
- Activate automation: enable the Gmail Trigger for real-time processing (currently active: false).

Need help customizing? Contact me for consulting and support, or add me on LinkedIn.
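The duplicate-avoidance logic (search for the sender's folder, create only if missing) can be sketched as follows. This is illustrative only: in the workflow it is done with Drive Search and Create Folder nodes, and `existingFolders` here stands in for the Drive search result.

```javascript
// Sketch of the duplicate-avoidance step (illustrative, not the actual
// Drive node config): look up a folder by sender before creating it.
// `existingFolders` stands in for the result of a Drive folder search.
function getOrCreateFolder(existingFolders, sender) {
  const found = existingFolders.find(f => f.name === sender);
  if (found) return { folder: found, created: false };
  const folder = { name: sender };
  existingFolders.push(folder); // simulate the Create Folder node
  return { folder, created: true };
}
```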

By Davide
1099

Automated WhatsApp group weekly team reports with Gemini AI summarization

This n8n template automatically summarizes your WhatsApp group activity from the past week and generates a team report.

Why use this?
Remote teams rely on chat for communication, but important discussions, decisions, and ideas get buried in message threads and forgotten by Monday. This workflow ensures nothing falls through the cracks.

How it works
- Runs every Monday at 6am to collect the previous week's group messages
- Groups conversations by participant and analyzes message threads
- AI summarizes each member's activity into a personal report
- Combines all individual reports into one comprehensive team overview
- Posts the final report back to your WhatsApp group to kick off the new week

Setup requirements
- WhatsApp via whapAround.pro (no Meta API needed)
- Gemini AI (or an alternative LLM of your choice)

Best practices
- Use one workflow per WhatsApp group for focused results
- Filter for specific team members if needed
- Customize the report tone to match your team culture
- Adjust the schedule if weekly reports don't suit your team's pace

Customization ideas
- Send reports via email instead of posting to busy groups
- Include project metrics alongside message summaries
- Connect to knowledge bases or ticket systems for additional context

Perfect for project managers who want to keep distributed teams aligned and ensure important conversations don't get lost in the chat noise.
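The "groups conversations by participant" step can be sketched as a simple reduction over the week's messages. The message shape (`sender`, `text`) is an assumption for illustration, not the actual whapAround.pro payload:

```javascript
// Illustrative sketch of the "group conversations by participant" step.
// Message shape is an assumption, not the actual whapAround.pro payload.
function groupByParticipant(messages) {
  const grouped = {};
  for (const msg of messages) {
    (grouped[msg.sender] ??= []).push(msg.text); // one bucket per sender
  }
  return grouped;
}
```

Each bucket would then be summarized individually by the LLM before the per-member reports are merged into the team overview.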

By Jamot
1007

Convert PDF articles to audio podcasts with Google TTS & Cloudflare R2

Workflow name: Convert PDF Articles to Podcast
Author: Devjothi Dutta
Category: Productivity, Content Creation, Automation
Complexity: Medium
Setup time: 45-60 minutes

📖 Description
Transform any PDF article, research paper, or document into a high-quality audio podcast automatically. This workflow extracts text from PDFs, converts it to natural-sounding speech using Google Cloud Text-to-Speech, stores the audio files in cloud storage, and generates an RSS feed compatible with all major podcast apps (Apple Podcasts, Spotify, Pocket Casts, etc.). Perfect for consuming long-form content while commuting, exercising, or multitasking. Turn your reading list into a personal podcast feed.

👥 Who's it for
- Professionals: convert industry reports and whitepapers to audio; listen to research papers during commutes; stay updated with long-form articles hands-free
- Students: turn textbooks and study materials into audio; create audio versions of lecture notes; study while exercising or commuting
- Content creators: repurpose written content into audio format; create podcast episodes from blog posts; reach audio-focused audiences
- Busy readers: convert saved articles to a personal podcast; listen to newsletters and essays on the go; build a private audio library

✨ Key features
- 📄 PDF text extraction: automatically extracts text from any PDF file
- 🎙️ Natural voice synthesis: high-quality WaveNet voices from Google Cloud TTS
- ☁️ Cloud storage: files hosted on Cloudflare R2 (S3-compatible) with public URLs
- 📻 RSS feed generation: full iTunes-compatible podcast feed with metadata
- 📧 Email notifications: instant alerts when new episodes are ready
- 🎨 Custom branding: configurable podcast name, artwork, and descriptions
- ⚙️ Modular configuration: easy-to-update centralized config node
- 🔄 Automated workflow: set it and forget it, a fully automated pipeline

🛠️ Requirements
Required services:
- n8n (self-hosted or cloud): workflow automation platform
- Google Cloud Platform: Text-to-Speech API access (free tier: 1 million characters/month with WaveNet voices; paid: $16 per 1 million characters)
- Cloudflare R2: object storage for audio files and the RSS feed (free tier: 10GB storage, unlimited egress)
- Email service: SMTP or an email service for notifications

Required community nodes:
- Cloudflare R2 Storage (n8n-nodes-cloudflare-r2-storage). Install via Settings → Community Nodes → Install and search for n8n-nodes-cloudflare-r2-storage. Important: install this BEFORE importing the workflow.

Optional:
- Custom domain for podcast feed URLs
- Podcast artwork (3000x3000px recommended)

📦 What's included
- Complete n8n workflow JSON (ready to import)
- Comprehensive setup guide
- Architecture documentation
- Configuration templates
- Credentials setup instructions
- Testing and validation checklist
- RSS feed customization guide
- Troubleshooting documentation

🚀 Quick start
1. Install the community node (required): go to Settings → Community Nodes → Install, search for n8n-nodes-cloudflare-r2-storage, click Install and wait for completion
2. Import the workflow into your n8n instance
3. Configure credentials: Google Cloud TTS API key, Cloudflare R2 credentials (Access Key ID + Secret), SMTP email credentials
4. Update the Workflow Config node with your settings: R2 bucket name and public URL, podcast name and description, artwork URL, email recipient
5. Test with a sample PDF to verify the setup
6. Add the RSS feed URL to your podcast app

📊 Workflow stats
- Nodes: 10
- Complexity: Medium
- Execution time: ~2-5 minutes per PDF (depends on length)
- Monthly cost: $0-20 (depending on usage and free tiers)
- Maintenance: minimal (set and forget)

🎨 Customization options
- Change the TTS voice (20+ English voices available)
- Adjust speech speed and pitch
- Customize RSS feed metadata
- Add custom intro/outro audio
- Configure file retention policies
- Set up webhook triggers for remote submission

🔧 How it works
1. User uploads a PDF to n8n
2. Text is extracted from the PDF
3. Text is sent to the Google TTS API and an audio file (.mp3) is generated
4. Files are uploaded to R2 storage: the original PDF and the generated MP3 audio
5. The RSS feed is generated/updated with the episode title (from the PDF filename), audio URL, duration and file size, publication date, and a rich HTML description
6. The RSS feed is uploaded to R2
7. An email notification is sent with the episode details

💡 Pro tips
- Voice selection: test different WaveNet voices to find your preferred style
- Batch processing: process multiple PDFs by running the workflow multiple times
- Quality vs cost: WaveNet voices sound better but cost more than Standard voices
- Storage management: set up R2 lifecycle rules to auto-delete old episodes
- Custom domains: use Cloudflare for custom podcast feed URLs

🔗 Related workflows
- PDF to Email Summary
- Document Translation to Audio
- Blog RSS to Podcast
- Multi-language Audio Generation

📧 Support & feedback
For questions, issues, or feature requests:
- GitHub: PDF-to-Podcast---N8N repository
- n8n Community Forum: tag @devdutta
- Email: devjothi@gmail.com

📄 License
MIT License: free to use, modify, and distribute

⭐ If you find this workflow useful, please share your feedback and star the workflow!
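The RSS-feed-update step assembles one `<item>` entry per episode. A minimal sketch of that assembly (the field names and structure are assumptions for illustration, not the template's exact feed):

```javascript
// Minimal sketch (assumed structure, not the template's exact config) of
// building one podcast-style RSS <item> for a generated episode.
function buildRssItem(ep) {
  return [
    "<item>",
    `  <title>${ep.title}</title>`,
    `  <enclosure url="${ep.audioUrl}" length="${ep.bytes}" type="audio/mpeg"/>`,
    `  <pubDate>${ep.pubDate}</pubDate>`,
    "</item>",
  ].join("\n");
}
```

A real feed would also carry channel-level metadata (podcast name, artwork, description) from the Workflow Config node and the iTunes namespace tags podcast apps expect.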

By Dev Dutta
1002

Monitor solar energy production & send alerts with Gmail, Google Sheets, and Slack

Solar Energy Production Monitoring Alert Workflow

This workflow automatically monitors solar energy production every 2 hours by fetching data from the Energidataservice API. If the energy output falls below a predefined threshold, it instantly notifies users via email. Otherwise, it logs the data into a Google Sheet and posts a daily summary to Slack.

Who's it for
- Renewable energy teams monitoring solar output
- Facility managers and power plant supervisors
- ESG compliance officers tracking sustainability metrics
- Developers or analysts automating solar energy reporting

How it works
- Trigger: the workflow starts every 2 hours using a Schedule Trigger.
- Data fetch: an HTTP Request node fetches solar energy production data from the Energidataservice API.
- Processing: a Code node selects the entries whose production is below the minimum threshold.
- Decision making: an If node checks whether any low-production entries are present.
- Alerts: if low production is detected, an email is sent via the Gmail node.
- Logging: if all entries are valid, they are logged into a Google Sheet.
- Slack summary: a Slack node posts the summary sheet data for end-of-day visibility.

How to set up
- Schedule Trigger: configure it to run every 2 hours.
- HTTP Request node: method GET, URL https://api.energidataservice.dk/dataset/YourDatasetHere, plus any headers and params required by the API.
- Code node: define logic to select entries where solarenergyproduction < required_threshold.
- If node: use items.length > 0 to check for low-production entries.
- Gmail node: authenticate with Gmail credentials; customize the recipient and message template.
- Google Sheets node: connect to a spreadsheet and map the appropriate columns.
- Slack node: use Slack OAuth2 credentials; specify the channel and message content.

Requirements
- n8n Cloud or a self-hosted instance
- Access to the Energidataservice API
- Gmail account (with n8n OAuth2 integration)
- Google Sheets account & sheet ID
- Slack workspace and app with appropriate permissions
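The Code node's filter can be sketched directly from the description above (the field name `solarenergyproduction` is from the setup notes; the threshold value is an illustrative assumption):

```javascript
// Sketch of the Code node logic described above: keep only entries whose
// production falls below the threshold, so the If node can alert when
// any remain (items.length > 0). Threshold value is an example only.
const REQUIRED_THRESHOLD = 100; // example value, tune for your plant

function findLowProduction(entries, threshold = REQUIRED_THRESHOLD) {
  return entries.filter(e => e.solarenergyproduction < threshold);
}
```

If this returns an empty array, the If node routes execution to the Google Sheets logging branch instead of the Gmail alert.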
How to customize
- Change frequency: adjust the Schedule Trigger interval (e.g., every hour or 4x per day).
- Threshold tuning: modify the value in the Code node to change the minimum acceptable solar production.
- Alert routing: update the Gmail recipients or replace Gmail with Microsoft Outlook/SendGrid.
- Sheet format: add or remove columns in the Google Sheet based on extra metrics (e.g., wind or nuclear data).
- Slack posting: customize Slack messages using Markdown for improved readability.

Add-ons
- Telegram node: send alerts to a Telegram group instead of email.
- Discord webhook: push updates to a Discord channel.
- n8n Webhook Trigger: extend the workflow to receive external production update notifications.
- Make (formerly Integromat) or Zapier: for multi-platform integration with CRMs or ticketing tools.

Use case examples
- Utility companies: automatically detect and act on solar underperformance to maintain grid stability.
- Solar farm operators: log clean production data for auditing and compliance reports.
- Sustainability teams: track daily performance and anomalies without manual checks.
- Home solar system owners: get notified if solar generation drops below expectations.
Common troubleshooting

| Issue | Possible cause | Solution |
| --- | --- | --- |
| HTTP Request fails | API key missing or URL is incorrect | Check the API endpoint, parameters, and authentication headers |
| Gmail not sending alerts | Missing or invalid Gmail credentials | Re-authenticate Gmail OAuth2 in n8n credentials |
| No data getting logged in Google Sheet | Incorrect mapping or sheet permissions | Ensure the sheet exists, columns match, and credentials are correct |
| Slack node fails | Invalid token or missing channel ID | Reconnect Slack credentials and check permissions |
| Code node returns empty | Filter logic may be too strict | Validate the data format and relax the threshold condition |

Need help?
Need help setting this up or customizing it for your own solar or energy monitoring use case? We can:
✅ Set it up on your n8n Cloud or self-hosted instance
✅ Customize it for your own API or data source
✅ Modify alerts to suit your internal tools (Teams, Discord, SMS, etc.)
👉 Just reach out to our n8n automation team at WeblineIndia; we'll be happy to help.

By WeblineIndia
517

Automated abandoned cart recovery emails for Shopify stores

Overview
This n8n template automatically sends personalized recovery emails to customers who abandon their shopping carts. Recover 15-25% of lost sales with intelligent, well-timed follow-up emails that include the exact cart contents.

Use cases: online stores, e-commerce brands, subscription services, digital product sales, and any business with cart abandonment.

How it works
- A Shopify webhook triggers when a customer creates a cart and begins the checkout process.
- The workflow waits one hour, then checks whether the cart was converted to a completed order.
- If the cart remains abandoned, it extracts the customer email and cart details.
- A personalized email is generated with the cart contents and a direct checkout link.
- The customer receives a gentle reminder and can complete the purchase with one click.

Set up instructions
- The cart creation webhook is used, but you can adjust the wait time or add multiple follow-up sequences.
- Configure your email service provider in the Send Email node.
- Customize the email template and messaging to match your brand tone.

Requirements
- Shopify store with cart abandonment tracking
- Email service provider configured in n8n
- Customer email collection enabled in the checkout process

Customising this workflow
- Create multi-email sequences with different timing (1hr, 24hr, 72hr)
- Add discount codes for additional purchase incentive
- Segment messaging by cart value or customer type
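The wait-then-check step boils down to a single predicate. A hedged sketch under assumed data shapes (Shopify's actual cart and order payloads differ; `token`/`cartToken` here are illustrative):

```javascript
// Sketch of the post-wait abandonment check (assumed data shapes, not
// Shopify's actual payloads): a cart counts as abandoned if no completed
// order references its token after the one-hour wait.
function isAbandoned(cart, completedOrders) {
  return !completedOrders.some(o => o.cartToken === cart.token);
}
```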

By David Olusola
376

AI chatbot for Max Messenger with voice recognition (GigaChat +SaluteSpeech)

Name: AI Chatbot for Max Messenger with Voice Recognition (GigaChat + Sber)

How it works
This workflow powers an intelligent, conversational AI bot for Max messenger that can understand and respond to both text and voice messages. The bot uses GigaChat AI with built-in memory, allowing it to remember the conversation history for each unique user and answer follow-up questions. Voice messages are transcribed using Sber SmartSpeech. It's a complete solution for creating an engaging, automated assistant within your Max bot, using Russian AI services.

Step by step
- Max Trigger: the workflow starts when the Max Trigger node receives a new message sent to your Max bot.
- Access control: the Check User node verifies the sender's user ID against an allowed list. This prevents unauthorized users from accessing your bot.
- Access denied response: if the user is not authorized, the Access Denied node sends a polite rejection message.
- Message type routing: the Text/Attachment (Switch) node checks whether the message contains plain text or has attachments (voice, photo, file).
- Attachment processing: if an attachment is detected, the Download Attachment (HTTP Request) node retrieves it, and the Attachment Router (Switch) node determines its type (voice, photo, or file).
- Voice transcription: for voice messages, the workflow gets a Sber access token via Get Access Token (HTTP Request), merges it with the audio file, and sends it to Get Response (HTTP Request), which uses the Sber SmartSpeech API to transcribe the audio to text.
- Input unification: the Voice to Prompt node converts transcribed text into a prompt, while Text to Prompt does the same for plain text messages. Both paths merge at the Combine node.
- AI agent processing: the unified prompt is passed to the AI Agent, powered by the GigaChat Model and using Simple Memory to retain the last 10 messages per user (using the Max user_id as the session key).
- Response delivery: the AI-generated response is sent back to the user via the Send Message node.

Set up steps
Estimated set up time: 15 minutes
1. Get Max bot credentials: visit https://business.max.ru/ to create a bot and obtain API credentials. Add these credentials to the Max Trigger, Send Message, and Access Denied nodes.
2. Add GigaChat credentials: register for GigaChat API access and add your credentials to the GigaChat Model node.
3. Add Sber credentials: obtain Sber SmartSpeech API credentials and add them to the Get Access Token and Get Response nodes (HTTP Header Auth).
4. Configure access control: open the Check User node and change the user_id value (currently 50488534) to your own Max user ID. This ensures only you can use the bot during testing.
5. Customize the bot personality: open the AI Agent node and edit the system message to change the bot's name and behavior, and to add your own contact information or links.
6. Test the bot: activate the workflow and send a text or voice message to your Max bot to verify that it responds correctly.

Notes
This workflow is specifically designed for Russian-speaking users and uses Russian AI services (GigaChat and Sber SmartSpeech) as alternatives to OpenAI. Make sure you have valid API access to both services before setting up this workflow.
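The Check User access-control step is essentially an allowlist lookup. A sketch (the user ID 50488534 comes from the description above; the incoming message shape is an assumption, not Max's actual payload):

```javascript
// Sketch of the Check User access-control step: compare the incoming
// sender ID against an allowlist. The message shape is an assumption,
// not the actual Max messenger payload.
const ALLOWED_USER_IDS = [50488534]; // example ID from the template notes

function isAuthorized(message) {
  return ALLOWED_USER_IDS.includes(message.sender.user_id);
}
```

Unauthorized senders would then be routed to the Access Denied node rather than the AI Agent.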

By Konstantin
326

Track brand reputation with Gemini AI & Google News sentiment analysis to email

Create a powerful brand/company monitoring system that fetches news headlines, performs AI-powered sentiment analysis, and delivers witty, easy-to-read reports via email. This workflow turns brand mentions into a lively "personality analysis", making your reports not only insightful but also fun to read. Perfect for teams that want to stay informed and entertained.

How it works
- Data collection: a Google Sheets table captures the brand name and recipient email, which triggers the workflow.
- News aggregation: the RSS Read node fetches recent news headlines from Google News based on the specified brand or company keyword.
- Content processing: news headlines are aggregated and formatted for AI analysis.
- AI analysis: the Gemini 2.5 Flash model plays the role of a brand analyst, writing reports as if the brand were a character in a story. It highlights strengths, quirks, and challenges in a witty, narrative-driven style, while still providing sentiment scores and action points.
- Report generation: JavaScript code structures the AI response into well-formatted HTML paragraphs for a smooth email reading experience.
- Automated delivery: Gmail integration sends the analysis report directly to the specified email address.

How to use
1. Create a Google Sheets document with sheet name "page1", cell A1 named "keyword" and cell B1 named "email". The system reads the keyword & email data when a new row is entered.
2. Paste the URL of your Google Sheets document into the first trigger node and select "row added" as the trigger.
3. Enter your credentials to connect a Gemini PaLM API account in Google's "message a model" node.
4. Enter your credentials to connect a Gmail account in the "send a message" node.
5. The workflow runs automatically when a new row is detected. Recipients receive comprehensive sentiment analysis reports within minutes!

Requirements
- Google Sheets URL
- Google Gemini API credentials for AI analysis
- Gmail API credentials for email delivery
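The report-generation step ("JavaScript code structures the AI response into well-formatted HTML paragraphs") can be sketched as follows; this is an illustrative version, not the template's exact Code node:

```javascript
// Sketch of the report-generation step described above: wrap the model's
// plain-text response in HTML <p> tags for email delivery. Details are
// assumed, not the template's exact Code node.
function toHtmlParagraphs(text) {
  return text
    .split(/\n{2,}/)          // paragraphs are separated by blank lines
    .map(p => p.trim())
    .filter(p => p.length > 0) // drop empty fragments
    .map(p => `<p>${p}</p>`)
    .join("\n");
}
```

A production version would also escape HTML-special characters in the model output before wrapping it.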

By ermanatalay
319

Monitor security logs for failed login attempts with Slack alerts

How it works: the 5-node anomaly detection flow
This workflow efficiently processes logs to detect anomalies.
- Scheduled Check (Cron node): the primary trigger. It schedules the workflow to run at a defined interval (e.g., every 15 minutes), ensuring logs are routinely scanned for suspicious activity.
- Fetch Logs (HTTP Request node): retrieves logs from an external source. It sends a request to your log API endpoint to get a batch of the most recent logs.
- Count Failed Logins (Code node): the core of the detection logic. The JavaScript code filters the logs for a specific event ("login_failure"), counts the total, and identifies the unique IPs involved. This information is then passed to the next node.
- Failed Logins > Threshold? (If node): the final filter. It checks whether the number of failed logins exceeds a threshold you set (e.g., more than 5 attempts). If it does, the workflow is routed to the notification node; if not, the workflow ends safely.
- Send Anomaly Alert (Slack node): sends an alert to your team if an anomaly is detected. The Slack message includes a summary of the anomaly, such as the number of failed attempts and the IPs involved, enabling a swift response.

How to set up
Implementing this log anomaly detector in your n8n instance is quick and straightforward.
- Prepare your credentials & API:
  - Log API: make sure you have an API endpoint or another way to get logs from your system (e.g., a server, CMS, or application). The logs should be in JSON format, and you'll need any necessary API keys or tokens.
  - Slack credential: set up a Slack credential in n8n and get the channel ID of your security alert channel (e.g., security-alerts).
- Import the workflow JSON: create a new workflow in n8n, choose "Import from JSON", and paste the workflow JSON.
Configure the Nodes: Scheduled Check (Cron): Set the schedule according to your preference (e.g., every 15 minutes). Fetch Logs (HTTP Request): Update the URL and header/authentication to match your specific log API endpoint. Count Failed Logins (Code): Verify that the JavaScript code matches your log's JSON format. You may need to adjust log.event === 'login_failure' if your log events use a different name. Failed Logins > Threshold? (If): Adjust the threshold value (e.g., 5) based on your risk tolerance. Send Anomaly Alert (Slack): Select your Slack credential and enter the correct Channel ID. Test and Activate: Manual Test: Run the workflow manually to confirm it fetches logs and processes them correctly. You can temporarily lower the threshold to 0 to ensure the alert is triggered. Verify Output: Check your Slack channel to confirm that alerts are formatted and sent correctly. Activate: Once you're confident in its function, activate the workflow. n8n will now automatically monitor your logs on the schedule you set.
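The Count Failed Logins logic can be sketched as follows. This is a minimal sketch, assuming log entries shaped like `{ event, ip }`; adjust the field names to match your log API's JSON, as noted in the setup steps.

```javascript
// Hypothetical sketch of the "Count Failed Logins" Code node.
// Filters for "login_failure" events, counts them, and collects unique IPs.
function countFailedLogins(logs) {
  const failures = logs.filter((log) => log.event === 'login_failure');
  const uniqueIps = [...new Set(failures.map((log) => log.ip))];
  return { failedCount: failures.length, uniqueIps };
}

// Example batch of logs, as might be returned by the HTTP Request node.
const logs = [
  { event: 'login_failure', ip: '10.0.0.1' },
  { event: 'login_success', ip: '10.0.0.2' },
  { event: 'login_failure', ip: '10.0.0.1' },
  { event: 'login_failure', ip: '10.0.0.3' },
];
const result = countFailedLogins(logs);
// result.failedCount === 3; result.uniqueIps → ['10.0.0.1', '10.0.0.3']
```

The If node downstream would then compare `failedCount` against your chosen threshold (e.g., 5) to decide whether to trigger the Slack alert.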

By Marth
289

Auto-extract & approve invoices with OpenAI, Jotform - fraud detection

Transform accounts payable from a manual bottleneck into an intelligent, automated system that reads invoices, detects fraud, and processes payments automatically, saving 20+ hours per week while preventing costly fraudulent payments.

🎯 What This Workflow Does
Automates the complete invoice-to-payment cycle with advanced AI:
- 📧 Check Invoices from Jotform: Monitors Jotform for invoice submissions
- 🤖 AI-Powered OCR: Extracts all data from PDFs and images (vendor, amounts, line items, dates, tax)
- 🚨 Fraud Detection Engine: Analyzes 15+ fraud patterns: duplicates, anomalies, suspicious vendors, document quality
- 🚦 Intelligent Routing: Auto-routes based on AI risk assessment:
  - Critical Fraud (Risk 80-100): Block → Slack alert → CFO investigation
  - Manager Review (>$5K or Medium Risk): Approval workflow with full analysis
  - Auto-Approve (<$5K + Low Risk): Instant → QuickBooks → Vendor notification
- 📊 Complete Audit Trail: Every decision logged to Google Sheets with AI reasoning

✨ Key Features
Advanced AI Capabilities
- Vision-Based OCR: Reads any invoice format: PDF, scanned images, smartphone photos
- 99% Extraction Accuracy: Vendor details, line items, amounts, dates, tax calculations, payment terms
- Multi-Dimensional Fraud Detection:
  - Duplicate invoice identification (same number, similar amounts)
  - Amount anomalies (round numbers, threshold gaming, unusually high)
  - Vendor verification (new vendors, mismatched domains, missing tax IDs)
  - Document quality scoring (OCR confidence, missing fields, calculation errors)
  - Timing anomalies (future dates, expired invoices, weekend submissions)
  - Pattern-based detection (frequent small amounts, vague descriptions, no PO references)

Intelligent Processing
- Risk-Based Scoring: 0-100 risk score with detailed reasoning
- Vendor Trust Ratings: Build vendor reputation over time
- Category Classification: Auto-categorizes (software, consulting, office supplies, utilities, etc.)
- Amount Thresholds: Configurable auto-approve limits
- Human-in-the-Loop: Critical decisions escalated appropriately
- Fast-Track Low Risk: Process safe invoices in under 60 seconds

Security & Compliance
- Fraud Prevention: Catch fraudulent invoices before payment
- Duplicate Detection: Prevent double payments automatically
- Complete Audit Trail: Every decision logged with timestamp and reasoning
- Role-Based Approvals: Route to the correct approver based on amount and risk
- Document Verification: Quality checks on every invoice

💼 Perfect For
- Finance Teams: Processing 50-500 invoices per week
- CFOs: Need fraud prevention and spending visibility
- Controllers: Want automated AP with audit compliance
- Growing Companies: Scaling without adding AP headcount
- Multi-Location Businesses: Centralized invoice processing across offices
- Fraud-Conscious Organizations: Healthcare, legal, financial services, government contractors

💰 ROI & Business Impact
Time Savings
- 90% reduction in manual data entry time
- 20-25 hours saved per week on invoice processing
- Same-day turnaround on all legitimate invoices
- Zero data entry errors with AI extraction
- No more lost invoices: complete tracking

Fraud Prevention
- 100% duplicate detection before payment
- Catches suspicious patterns automatically
- Prevents invoice splitting (gaming approval thresholds)
- Identifies fake vendors before payment
- Average savings: $50K-$200K annually in prevented fraud losses

Process Improvements
- 24-hour vendor response times (vs 7-10 days manual)
- 95%+ payment accuracy with AI validation
- Better cash flow management via due date tracking
- Vendor satisfaction from transparent, fast processing
- Audit-ready with a complete decision trail

🔧 Required Integrations
Core Services
- Jotform: Invoice submissions (create your form for free on Jotform using this link)
- OpenAI API: GPT-4o-mini for OCR & fraud detection (~$0.03/invoice)
- Google Sheets: Invoice database and analytics (free)
- Accounting System: QuickBooks, Xero, NetSuite, or Sage (via API)

Optional Add-Ons
- Slack: Real-time fraud alerts and approval requests
- Bill.com: Payment processing automation
- Linear/Asana: Task creation for manual reviews
- Expensify/Ramp: Expense management integration

🚀 Quick Setup Guide
Step 1: Import Template
- Copy the template JSON
- In n8n: Workflows → Import from File → Paste JSON
- The template imports with all nodes and sticky notes

Step 2: Configure Email Monitoring
- Connect a Gmail or Outlook account
- Update the filter: invoices@yourcompany.com (or your AP email)
- Test: Send yourself a sample invoice

Step 3: Add OpenAI API
- Get an API key: https://platform.openai.com/api-keys
- Add it to both AI nodes (OCR + Fraud Detection)
- Cost: ~$0.03 per invoice processed

Step 4: Connect Accounting System
- Get API credentials from QuickBooks/Xero/NetSuite
- Configure the HTTP Request node with your endpoint
- Map invoice fields to your GL codes

Step 5: Set Up Approval Workflows
- Update the email addresses (finance-manager@yourcompany.com)
- Configure the Slack webhook (optional)
- Set approval thresholds ($5K default, customize as needed)

Step 6: Create Google Sheet Database
- Create a spreadsheet with columns:
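The intelligent routing described in this template can be sketched as a small decision function. This is an illustrative sketch, not the template's actual node logic: it assumes the fraud-detection step outputs a `riskScore` (0-100) and an `amount`, and the medium-risk cutoff of 40 is an assumption, since the template only names the $5K and 80+ thresholds explicitly.

```javascript
// Hypothetical sketch of the risk-based routing step.
// Thresholds mirror the defaults described in the template
// ($5K auto-approve limit, risk 80+ = critical); risk 40 for
// "medium" is an illustrative assumption.
function routeInvoice(invoice) {
  if (invoice.riskScore >= 80) {
    return 'block_and_alert';  // Critical fraud: Slack alert + CFO investigation
  }
  if (invoice.amount > 5000 || invoice.riskScore >= 40) {
    return 'manager_review';   // High amount or medium risk: approval workflow
  }
  return 'auto_approve';       // Low risk, small amount: straight to accounting
}

const decision = routeInvoice({ riskScore: 12, amount: 320 });
// → 'auto_approve'
```

In the workflow itself this branching would typically live in a Switch or If node rather than code, but the sketch shows how the three routes relate to the risk and amount thresholds.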

By Jitesh Dugar
180