AI research agents to automate PDF analysis with Mistral’s best-in-class OCR
Overview
Mistral OCR is a cutting-edge document understanding API that improves how businesses extract and process information from complex documents. With top benchmark scores for accuracy and comprehension, Mistral OCR handles multi-column text, charts, diagrams, and multiple languages. This workflow uses Mistral's document understanding OCR API to automatically turn dense PDFs (such as financial reports) into either deep research reports or concise newsletters. Demo video: https://youtu.be/Z9Lym6AZdVM

Key Features
- Superior Document Understanding: Processes complex documents with high-fidelity rendering
- Multi-Format Support: Handles PDFs containing text, images, charts, and diagrams
- Multilingual Capabilities: Accurately processes documents in various languages
- Seamless API Integration: Easy implementation through a cloud-based API
- Customizable Research Depth: Generate comprehensive 8-page reports or concise 1,750-word newsletters

How It Works
1. Document Upload: Submit your PDF through an n8n form interface.
2. Output Format Selection: Choose between a comprehensive deep research report (3,500 words) or a concise newsletter (1,750 words).
3. Custom Instructions: Tailor the analysis by adding specific focus areas (e.g., quantitative data, growth catalysts).
4. AI Processing: The document undergoes multi-stage AI analysis: OCR and text extraction using Mistral AI, then content structuring and summarization using GPT models.

Agents
- Research Leader: Plans and conducts initial research, creating a table of contents.
- Project Planner: Breaks down the table of contents into manageable sections.
- Research Assistants: Multiple agents that conduct in-depth research on assigned sections.
- Editor: Compiles and refines the final article, ensuring coherence and proper citations.

Setup
1. API Key Acquisition: Obtain an API key from OpenRouter.ai and an API key from Mistral.ai.
2. n8n Configuration: In your n8n instance, navigate to the credentials section and create new credentials for OpenRouter and Mistral, entering the respective API keys.
3. Form Configuration: Customize the input form fields if needed (e.g., adding company-specific options).
4. Output Customization: Adjust the word count parameters in the Project Planner node to change output length.
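For readers who want to see what the OCR step boils down to outside n8n, here is a minimal TypeScript sketch of the extraction call. It assumes Mistral's /v1/ocr endpoint, the mistral-ocr-latest model name, and a per-page markdown response; check the current Mistral API docs before relying on any of these details.

```typescript
// Minimal sketch of the OCR extraction step, assuming Mistral's /v1/ocr
// endpoint and the "mistral-ocr-latest" model name -- verify both against
// the current Mistral documentation before use.
interface OcrPage {
  index: number;
  markdown: string; // page content returned as markdown
}

async function extractPdfText(documentUrl: string, apiKey: string): Promise<string> {
  const res = await fetch("https://api.mistral.ai/v1/ocr", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify({
      model: "mistral-ocr-latest",
      document: { type: "document_url", document_url: documentUrl },
    }),
  });
  if (!res.ok) throw new Error(`OCR request failed: ${res.status}`);
  const data = (await res.json()) as { pages: OcrPage[] };
  // Join the per-page markdown into one document for the research agents.
  return data.pages.map((p) => p.markdown).join("\n\n");
}
```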
Prevent concurrent workflow runs using Redis
What does this template do?
This workflow sets a small "lock" value in Redis so that only one copy of a long job can run at the same time. If another trigger fires while the job is still busy, the workflow sees the lock, stops early, and throws a clear error. This protects your data and keeps you from hitting rate limits. Because the workflow also stores simple progress flags ("working", "loading", "finishing"), you can poll the current status and show live progress for very long jobs.

Use Case
Great when the same workflow can be called many times in parallel (for example by webhooks, cron jobs, or nested Execute Workflow calls) and you need an "only run once at a time" guarantee without building a full queue system.

What the Workflow Does
- ⚡ Starts through an Execute Workflow Trigger called by another workflow
- 🔄 A Switch sends the run to the Get, Set, or Unset actions
- 💾 Redis reads or writes a key named process_status_<key> with a time-to-live (default 600 s)
- 🚦 If nodes check the key and decide to continue or stop
- ⏱️ Wait nodes stand in for the slow part of your job (replace these with your real work)
- 📈 Updates the key with human-readable progress values that another workflow can fetch with action = get
- 🏁 When done, the lock is removed so the next run can start

Apps & Services Used
- Redis
- Core n8n nodes (Switch, If, Set, Wait, Stop and Error)

Pre-requisites
- A Redis server that n8n can reach
- Redis credentials stored in n8n
- A second workflow that calls this one and sends: action set to get, set, or unset; key set to a unique name for the job; an optional timeout in seconds

Customization Tips
- Increase or decrease the TTL in the Set Timeout node to match how long your job usually runs
- Add or rename status values ("working", "loading", "finishing", and so on) to show finer progress
- Replace Stop and Error with a Slack or email alert, or push the extra trigger into a queue if you prefer waiting instead of failing
- Use different Redis keys if you need separate locks for different tasks
- Build a small "status endpoint" workflow that calls this one with action = get to display real-time progress to users

Additional Use Cases
- 🛑 Telegram callback spam filter: If a Telegram bot sends many identical callbacks in a burst, call this workflow first to place a lock. Only the first callback will proceed; the rest will exit cleanly until the lock clears. This keeps your bot from flooding downstream APIs.
- 🧩 External API rate-limit protection: Run heavy API syncs one after the other so parallel calls do not break vendor rate limits.
- 🔔 Maintenance window lock: Block scheduled maintenance tasks from overlapping, making sure each window finishes before the next starts.
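The core trick the template relies on can be reduced to a single atomic Redis command. Below is a minimal sketch, assuming ioredis and the same process_status_<key> naming and 600 s TTL as the workflow; in n8n the equivalent logic lives in the Redis, If, and Stop and Error nodes rather than in code.

```typescript
import Redis from "ioredis";

// Sketch of the locking idea outside n8n. The key naming and TTL mirror the
// template's defaults; adjust both to your own jobs.
const redis = new Redis();

async function tryAcquireLock(key: string, ttlSeconds = 600): Promise<boolean> {
  // SET ... EX ... NX only succeeds if the key does not exist yet, so exactly
  // one concurrent caller wins; everyone else gets null back.
  const result = await redis.set(`process_status_${key}`, "working", "EX", ttlSeconds, "NX");
  return result === "OK";
}

async function releaseLock(key: string): Promise<void> {
  await redis.del(`process_status_${key}`);
}

async function runExclusively(key: string, job: () => Promise<void>, ttlSeconds = 600) {
  if (!(await tryAcquireLock(key, ttlSeconds))) {
    throw new Error(`Job "${key}" is already running`); // mirrors the Stop and Error node
  }
  try {
    await redis.set(`process_status_${key}`, "loading", "EX", ttlSeconds); // progress flag
    await job();
  } finally {
    await releaseLock(key); // 🏁 the next run can start
  }
}
```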
Create a WhatsApp chatbot with GPT-4o, Whisper transcription and Redis buffer
👥 Who's it for
This workflow is perfect for businesses or individuals who want to automate WhatsApp conversations 💬 with an intelligent AI chatbot that can handle text, voice notes 🎵, and images 🖼️. No advanced coding required!

🤖 What it does
It automatically receives WhatsApp messages through WasenderAPI, intelligently buffers consecutive messages to avoid fragmented responses, processes multimedia content (transcribing audio and analyzing images with AI), and responds naturally using GPT-4o mini with conversation memory. All while protecting your WhatsApp account from being banned.

⚙️ How it works
- 📱 Webhook Trigger – Receives new messages from WasenderAPI
- 🗃️ Redis Buffer System – Groups consecutive messages intelligently (7-second window)
- 🔀 Content Classifier – Routes messages by type (text, audio, or image)
- 🎵 Audio Processing – Decrypts and transcribes voice notes using OpenAI Whisper
- 🖼️ Image Analysis – Decrypts and analyzes images with GPT-4o vision
- 🧠 AI Agent (GPT-4o mini) – Generates intelligent responses with 10-message memory
- ⏱️ Anti-Ban Wait – 6-second delay to simulate human typing
- 📤 Message Sender – Delivers the response back to the WhatsApp user

📋 Requirements
- WasenderAPI account with a connected WhatsApp number: https://wasenderapi.com/
- Redis database (free tier works fine)
- OpenAI API key with access to GPT-4o mini and Whisper
- n8n's AI Agent, LangChain, and Redis nodes

🛠️ How to set up
1. Create your WasenderAPI account and connect a WhatsApp number
2. Set up a free Redis database and get connection credentials
3. Configure the OpenAI API key in n8n credentials
4. Replace the WasenderAPI Bearer token in the "Get the audio", "Get the photo", and "Send Message to User" nodes
5. Change the Manual Trigger to a Webhook and configure it in WasenderAPI
6. Customize the AI Agent prompt to match your business needs
7. Adjust wait times if needed (default: 6 seconds for responses, 7 seconds for the buffer)
8. Save and activate the workflow ✅

🎨 How to customize
- Modify the AI Agent prompt to change the bot's personality and instructions
- Adjust the buffer wait time (7 seconds) for faster or slower message grouping
- Change the response delay (6 seconds) based on your use case; 30 seconds is recommended
- Add more content types (documents, videos) by extending the Switch Type node
- Configure the conversation memory window (default: 10 messages)
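To make the Redis Buffer System concrete, here is a hedged sketch of the grouping idea in TypeScript with ioredis. The buffer:<chatId> key name and the way the last writer "wins" are illustrative; the template implements the same 7-second behaviour with Redis and Wait nodes.

```typescript
import Redis from "ioredis";

// Sketch of the message-buffering idea: collect consecutive messages for one
// chat, wait out the window, and let only the most recent message trigger a
// single combined reply. Key names here are illustrative, not the template's.
const redis = new Redis();
const BUFFER_WINDOW_MS = 7_000;

const sleep = (ms: number) => new Promise((r) => setTimeout(r, ms));

// Returns the full grouped text if this call should answer, or null if a
// newer message arrived during the window and will answer instead.
async function bufferMessage(chatId: string, text: string): Promise<string | null> {
  const key = `buffer:${chatId}`;
  await redis.rpush(key, text);
  await redis.expire(key, 60); // safety TTL so stale buffers disappear
  await sleep(BUFFER_WINDOW_MS);

  const messages = await redis.lrange(key, 0, -1);
  if (messages[messages.length - 1] !== text) {
    return null; // a newer message owns the buffer; exit quietly
  }
  await redis.del(key);
  return messages.join("\n"); // one combined prompt instead of fragments
}
```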
Automate job posting creation with Forms, Dropbox, and Foxit PDF generation
This n8n template demonstrates how to tie form data to a new PDF. The idea is to automate the creation of a professional-looking job posting. A typical use case is an organization that needs to automate the creation of job postings.

How it Works
1. The trigger is a form that asks for job position, salary, office location, and responsibilities.
2. When the form is submitted, it kicks off the workflow's next steps.
3. A Word document is downloaded from a Dropbox folder. This Word document is used as the template for the posting.
4. The Word document is converted to base64.
5. A call to Foxit's Document Generation endpoint includes the encoded Word document along with the form information.
6. The resulting PDF is downloaded and converted from base64 into binary. At this point the PDF simply exists in the workflow, but it could be emailed, sent to another workflow, etc.

Requirements
- A Dropbox account. The workflow's first step points to a Word template. See our doc gen APIs for information on how to craft the Word doc, but the easiest way is to copy text like so:

  Job Position
  We are pleased to announce the opening of a new job, {{ jobPosition }}. This job pays ${{ salary }} per year and is in our {{ office }} location. The details of this job are: {{ responsibilities }}

- A Foxit developer account (https://developer-api.foxit.com)

Next Steps
As mentioned above, you could do anything with the resulting PDF when done.
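As a rough illustration of steps 4–6, the sketch below base64-encodes the Word template and posts it with the form values to a document-generation endpoint, then decodes the returned PDF. The endpoint URL, auth header, and request/response field names are placeholders, not Foxit's confirmed API; consult the Foxit developer docs for the real contract.

```typescript
// Illustrative only: the endpoint path, auth scheme, and JSON field names
// below are assumptions standing in for Foxit's actual doc gen API.
const DOC_GEN_ENDPOINT = "https://developer-api.foxit.com/document-generation"; // hypothetical
const API_KEY = process.env.FOXIT_API_KEY ?? "";

interface JobForm {
  jobPosition: string;
  salary: number;
  office: string;
  responsibilities: string;
}

async function generateJobPostingPdf(wordTemplate: Buffer, form: JobForm): Promise<Buffer> {
  const res = await fetch(DOC_GEN_ENDPOINT, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${API_KEY}`, // auth scheme assumed
    },
    body: JSON.stringify({
      documentValues: form,                          // fills {{ jobPosition }}, {{ salary }}, ...
      base64FileString: wordTemplate.toString("base64"),
    }),
  });
  if (!res.ok) throw new Error(`Document generation failed: ${res.status}`);
  const { base64FileString } = (await res.json()) as { base64FileString: string };
  return Buffer.from(base64FileString, "base64"); // binary PDF, ready to email or store
}
```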
Transform RSS feeds into blog articles with GPT-4, human review & Google Docs
⚠️ Self-Hosted Only: This workflow uses the gotoHuman community node and requires a self-hosted n8n instance.

Who's It For
Content teams, bloggers, news websites, and marketing agencies who want to automate content creation from RSS feeds while maintaining editorial quality control. Perfect for anyone who needs to transform news articles into detailed blog posts at scale.

What It Does
This workflow automatically converts RSS feed articles into comprehensive, SEO-optimized blog posts using AI. It fetches articles from your RSS source, generates detailed content with GPT-4, sends drafts for human review via gotoHuman, and publishes approved articles to Google Docs with automatic Slack notifications to your team.

How It Works
1. Schedule Trigger runs every 6 hours to check for new RSS articles
2. RSS Read node fetches the latest articles from your feed
3. Format RSS Data extracts key information (title, keywords, description)
4. Generate Article with AI creates a structured blog post using OpenAI GPT-4
5. Structure Article Data formats the content with metadata
6. Request Human Review sends the article for approval via gotoHuman
7. Check Approval Status routes the workflow based on the review decision
8. Create Google Doc and Add Article Content publish approved articles
9. Send Slack Notification alerts your team with article details

Requirements
- OpenAI API key with GPT-4 access
- Google account for Google Docs integration
- gotoHuman account for the human-in-the-loop approval workflow
- Slack workspace for team notifications
- RSS feed URL from your preferred source

How to Set Up
1. Configure RSS Feed: In the "RSS Read" node, replace the example URL with your RSS feed source
2. Connect OpenAI: Add your OpenAI API credentials to the "OpenAI Chat Model" node
3. Set Up Google Docs: Connect your Google account and optionally specify a folder ID for organized storage
4. Configure gotoHuman: Add your gotoHuman credentials and create a review template for article approval
5. Connect Slack: Authenticate with Slack and select the channel for notifications
6. Customize Content: Modify the AI prompt in "Generate Article with AI" to match your brand voice and article structure
7. Adjust Schedule: Change the trigger frequency in "Schedule Trigger" based on your content needs

How to Customize
- Article Style: Edit the AI prompt to change tone, length, or structure
- Keywords & SEO: Modify the "Format RSS Data" node to adjust keyword extraction logic
- Publishing Destination: Change from Google Docs to other platforms (WordPress, Notion, etc.)
- Approval Workflow: Customize the gotoHuman template to include specific review criteria
- Notification Format: Adjust the Slack message template to include additional metadata
- Processing Volume: Modify the Code node to process multiple RSS articles instead of just one
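If you want to tweak the keyword extraction or the number of articles processed per run, the sketch below shows what a "Format RSS Data" step could look like. It is illustrative only: the field names follow typical RSS Read node output, and the template's actual Code node may differ.

```typescript
// Illustrative version of the "Format RSS Data" step. Field names (title,
// contentSnippet, link) follow common RSS output and may need adjusting.
interface RssItem {
  title: string;
  contentSnippet?: string;
  link?: string;
  isoDate?: string;
}

interface ArticleBrief {
  title: string;
  description: string;
  keywords: string[];
  sourceUrl?: string;
}

function formatRssData(items: RssItem[], maxArticles = 1): ArticleBrief[] {
  // Raise maxArticles to process several feed entries per run instead of one.
  return items.slice(0, maxArticles).map((item) => {
    const description = (item.contentSnippet ?? "").trim();
    // Naive keyword extraction: distinct longer words taken from the title.
    const keywords = Array.from(
      new Set(item.title.toLowerCase().split(/\W+/).filter((w) => w.length > 4))
    ).slice(0, 8);
    return { title: item.title, description, keywords, sourceUrl: item.link };
  });
}
```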
Customer support ticket system for SMEs with Google Sheets and auto-emails
How it Works
This workflow automates customer support for SMEs in five simple steps:
1. Capture requests via a Webhook connected to a contact form.
2. Extract the message to make processing easier.
3. Check categories (e.g., refund-related requests) using an IF node.
4. Save all tickets to a Google Sheet for tracking.
5. Send an acknowledgment email back to the customer automatically.

This setup ensures all customer inquiries are logged, categorized, and acknowledged without manual effort.

Setup Steps
1. Webhook: Add a Webhook node with the path customer-support. Configure your contact form or system to send name, email, and message to this webhook.
2. Extract Message (Set Node): Add a Set node. Map the incoming message field to make it available to other nodes.
3. Check Category (IF Node): Insert an IF node. Example: check if the message contains the word "refund". This allows you to route refund-related requests differently if needed.
4. Save Ticket (Google Sheets): Connect to Google Sheets with OAuth2 credentials. Operation: Append. Range: Tickets!A:C. Map the fields Name, Email, and Message.
5. Send Acknowledgement (Email Send): Configure the Email Send node with your SMTP credentials. To: ={{$json.email}}. Subject: Support Ticket Received. Body: personalize with {{$json.name}} and include the {{$json.message}}.

👉 With this workflow, SMEs can handle incoming support tickets more efficiently, maintain a simple ticket log, and improve customer satisfaction through instant acknowledgment.
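For testing, here is a hedged example of the payload the webhook expects and the kind of refund check the IF node performs. The webhook URL shape and the exact matching expression are assumptions based on the description above, not pulled from the template itself.

```typescript
// Example payload for the customer-support webhook, plus the categorization
// the IF node is described as doing ("message contains 'refund'").
interface SupportRequest {
  name: string;
  email: string;
  message: string;
}

function categorize(req: SupportRequest): "refund" | "general" {
  return /refund/i.test(req.message) ? "refund" : "general";
}

// Posting a test ticket (URL shape assumed from the path "customer-support"):
async function sendTestTicket(baseUrl: string) {
  const ticket: SupportRequest = {
    name: "Ada Lovelace",
    email: "ada@example.com",
    message: "I would like a refund for my last order.",
  };
  await fetch(`${baseUrl}/webhook/customer-support`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(ticket),
  });
  console.log("Category:", categorize(ticket)); // -> "refund"
}
```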
Send monthly Toggl time tracking summary via Resend email
Description
This workflow fetches Toggl Track summary data for the previous month, aggregates hours per client and project, and emails a clean HTML report via Resend.

How it works
1. Compute the previous month's period.
2. Fetch the Toggl summary (grouping=clients, sub_grouping=projects).
3. Fetch clients and projects for proper names.
4. Merge and aggregate seconds into hours.
5. Generate the HTML report.
6. Send the report via the Resend API.

Requirements
- Toggl free account (login, password, TOGGL_WORKSPACE_ID)
- Resend.com free account (RESEND_API_KEY)

Customization
- Change the trigger time in the Schedule Trigger node.
- Modify the period calculation for weekly or quarterly reports in the Get Toggl Summary node.
- Include archived projects by querying with active=false&archived=true and merging the results.

Documentation
Toggl docs, Resend.com docs

Author
Krystian Syryca - krsweb.pl
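Step 1 is plain date arithmetic. A standalone TypeScript equivalent is shown below, assuming Toggl's reports API takes start_date and end_date as YYYY-MM-DD strings; the template presumably does the same inside a Code or Set node.

```typescript
// Derive the previous calendar month's date range in YYYY-MM-DD form.
function previousMonthRange(now: Date = new Date()): { start_date: string; end_date: string } {
  // First day of the current month, then step back one month for the start,
  // and take day 0 of the current month for the last day of the previous one.
  const firstOfThisMonth = new Date(now.getFullYear(), now.getMonth(), 1);
  const start = new Date(firstOfThisMonth.getFullYear(), firstOfThisMonth.getMonth() - 1, 1);
  const end = new Date(firstOfThisMonth.getFullYear(), firstOfThisMonth.getMonth(), 0);

  const fmt = (d: Date) =>
    `${d.getFullYear()}-${String(d.getMonth() + 1).padStart(2, "0")}-${String(d.getDate()).padStart(2, "0")}`;
  return { start_date: fmt(start), end_date: fmt(end) };
}

// Example: running this on 2024-03-15 yields
// { start_date: "2024-02-01", end_date: "2024-02-29" }.
```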
Sync QuickBooks chart of accounts to Google BigQuery
Keep a historical, structured copy of your QuickBooks Chart of Accounts in BigQuery. This n8n workflow runs weekly, syncing new or updated accounts for better reporting and long-term tracking.

Who Is This For?
- Data Analysts & BI Developers: Build a robust financial model and analyze changes over time.
- Financial Analysts & Accountants: Track structural changes in your Chart of Accounts historically.
- Business Owners: Maintain a permanent archive of your financial structure for future reference.

What the Workflow Does
1. Extract: Every Monday, fetch accounts created or updated in the past 7 days from QuickBooks.
2. Transform: Clean the API response, manage currencies, create stable IDs, and format the data.
3. Format: Convert the cleaned data into an SQL insert-ready structure.
4. Load: Insert or update account records in BigQuery.

Setup Steps
1. Prepare BigQuery: Create a table (e.g., quickbooks.accounts) with columns matching the final SQL insert step.
2. Add Credentials: Connect QuickBooks Online and BigQuery credentials in n8n.
3. Configure the HTTP Node: Open "1. Get Updated Accounts from QuickBooks" and replace the Company ID {COMPANY_ID} with your real Company ID. Press Ctrl + Alt + ? in QuickBooks to find it.
4. Configure the BigQuery Node: Open "4. Load Accounts to BigQuery", select the correct project, and make sure your dataset and table name are correctly referenced in the SQL.
5. Activate: Save and activate the workflow. It will now run every week.

Requirements
- QuickBooks Online account
- QuickBooks Company ID
- Google Cloud project with BigQuery and a matching table

Customization Options
- Change Sync Frequency: Adjust the schedule node to run daily, weekly, etc.
- Initial Backfill: Temporarily change the API query to select * from Account for a full pull.
- Add Fields: Modify "2. Structure Account Data" to include or transform fields as needed.
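The Transform and Format steps can be pictured as the function below: it maps QuickBooks Account records onto rows for a quickbooks.accounts table and derives a stable ID by hashing the company ID, account ID, and last-update time. The QuickBooks field names follow the public Account schema; the exact columns and ID scheme in the template may differ.

```typescript
import { createHash } from "node:crypto";

// Sketch of the Transform/Format steps; column names are illustrative.
interface QboAccount {
  Id: string;
  Name: string;
  AccountType: string;
  CurrencyRef?: { value: string };
  MetaData?: { LastUpdatedTime?: string };
}

interface AccountRow {
  account_key: string;   // stable ID: company + account id + update time
  account_id: string;
  name: string;
  account_type: string;
  currency: string;
  last_updated: string;
  synced_at: string;
}

function toBigQueryRows(accounts: QboAccount[], companyId: string): AccountRow[] {
  const syncedAt = new Date().toISOString();
  return accounts.map((a) => {
    const lastUpdated = a.MetaData?.LastUpdatedTime ?? "";
    const accountKey = createHash("sha256")
      .update(`${companyId}:${a.Id}:${lastUpdated}`)
      .digest("hex");
    return {
      account_key: accountKey,
      account_id: a.Id,
      name: a.Name,
      account_type: a.AccountType,
      currency: a.CurrencyRef?.value ?? "USD", // default currency is an assumption
      last_updated: lastUpdated,
      synced_at: syncedAt,
    };
  });
}
```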
Multi-platform revenue reconciliation across Stripe, PayPal & bank with tax archive
How It Works
This workflow automates monthly revenue reconciliation across Stripe, PayPal, and bank statements by standardizing data formats, detecting discrepancies, and producing audit-ready reports. It concurrently retrieves revenue data from multiple sources, normalizes the datasets into consistent structures, consolidates records, and reconciles transactions against bank statements with intelligent mismatch detection. The system aggregates monthly totals, generates detailed audit reports with clearly flagged discrepancies, archives finalized outputs to Google Drive, and notifies tax agents. Designed for accounting firms, finance teams, and businesses, it enables automated revenue verification, multi-channel reconciliation, discrepancy identification, and compliance audit documentation without manual record matching or error-prone spreadsheet workflows.

Setup Steps
1. Configure Stripe and PayPal credentials.
2. Set up normalization rules for date, currency, and transaction ID mappings.
3. Connect Google Drive for report archiving and Gmail for agent notifications.
4. Define mismatch thresholds and reconciliation tolerance parameters.

Prerequisites
Stripe, PayPal, and bank statement accounts

Use Cases
Accounting firms automating client revenue verification; multi-channel e-commerce businesses

Customization
Add additional payment sources (Square, Shopify); adjust normalization rules for regional formats

Benefits
Eliminates manual reconciliation and detects discrepancies automatically
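Conceptually, the normalization and mismatch detection come down to the sketch below: every source is mapped onto one record shape, then payment records are matched against bank entries within a tolerance. The field names and the 0.01 tolerance are illustrative, not the template's actual settings.

```typescript
// Sketch of the normalization + matching idea behind the reconciliation step.
interface NormalizedTxn {
  source: "stripe" | "paypal" | "bank";
  externalId: string;
  date: string;      // ISO YYYY-MM-DD
  amount: number;    // major currency units
  currency: string;  // ISO 4217, e.g. "USD"
}

interface Mismatch {
  txn: NormalizedTxn;
  reason: "missing_in_bank" | "amount_mismatch";
  bankAmount?: number;
}

function reconcile(
  payments: NormalizedTxn[],
  bank: NormalizedTxn[],
  tolerance = 0.01
): Mismatch[] {
  // Index bank entries by external reference for O(1) lookups.
  const bankById = new Map(bank.map((b) => [b.externalId, b]));
  const mismatches: Mismatch[] = [];

  for (const txn of payments) {
    const match = bankById.get(txn.externalId);
    if (!match) {
      mismatches.push({ txn, reason: "missing_in_bank" });
    } else if (Math.abs(match.amount - txn.amount) > tolerance) {
      mismatches.push({ txn, reason: "amount_mismatch", bankAmount: match.amount });
    }
  }
  return mismatches; // feed these into the flagged-discrepancy section of the audit report
}
```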