
Record transactions & generate budget reports with Gemini AI, Telegram & Firefly III

By PTS

Who this is for

Anybody using Firefly III, especially home/self-hosted users, who wants to add some automation to their transaction tracking, either alongside the official Data Importer or because they can't or don't want to use it

How it works - posting transactions

  1. User sends a transaction screenshot/image or statement to a Telegram bot
  2. Gemini analyzes it based on the user's requirements (asset account IDs & categories)
  3. The parsed transaction details are mapped into a POST request for the Firefly III API
  4. The transaction(s) are posted to Firefly via its API using an OAuth2 credential (see the request sketch below)
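
In practice, steps 3 and 4 amount to a single call against Firefly III's transaction endpoint. The sketch below shows the shape of that request for a simple withdrawal; the environment variable names are placeholders, and in the workflow the OAuth2 credential on the HTTP Request node supplies the token for you.

```typescript
// Sketch of the POST the HTTP Request node makes to Firefly III. Field names follow
// the Firefly III /api/v1/transactions endpoint; the values and environment variable
// names are placeholders, and in n8n the OAuth2 credential injects the token.
interface FireflyTransactionSplit {
  type: 'withdrawal' | 'deposit' | 'transfer';
  date: string;            // ISO date, e.g. "2026-02-03"
  amount: string;          // Firefly III expects amounts as strings
  description: string;
  source_id: string;       // asset account ID listed in the AI node's system message
  category_name?: string;  // one of the categories allowed in the system message
}

const body = {
  error_if_duplicate_hash: true,
  transactions: [
    {
      type: 'withdrawal',
      date: '2026-02-03',
      amount: '12.50',
      description: 'Supermarket groceries',
      source_id: '1',
      category_name: 'Groceries',
    } satisfies FireflyTransactionSplit,
  ],
};

await fetch(`${process.env.FIREFLY_BASE_URL}/api/v1/transactions`, {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    Accept: 'application/json',
    // Added automatically by the Firefly OAuth2 credential when this runs in n8n.
    Authorization: `Bearer ${process.env.FIREFLY_ACCESS_TOKEN}`,
  },
  body: JSON.stringify(body),
});
```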

How it works - requesting budget reports

  1. User sends the word 'Report' via Telegram
  2. A GET request is sent to the Firefly III API for all budgets between the beginning of the month and the request date, including the remaining amount for each (see the sketch after this list)
  3. This is converted to a CSV file
  4. The CSV is sent to the user via Telegram
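
A minimal sketch of that report branch, assuming the /api/v1/budgets endpoint with start and end query parameters; the exact attribute names for spent and budgeted amounts can differ between Firefly III versions, so treat the field access below as illustrative.

```typescript
// Fetch this month's budgets from Firefly III and flatten them into CSV rows.
const base = process.env.FIREFLY_BASE_URL;
const today = new Date();
const firstOfMonth = new Date(today.getFullYear(), today.getMonth(), 1);
const iso = (d: Date) => d.toISOString().slice(0, 10);

const res = await fetch(
  `${base}/api/v1/budgets?start=${iso(firstOfMonth)}&end=${iso(today)}`,
  {
    headers: {
      Accept: 'application/json',
      // Supplied by the OAuth2 credential when this runs inside n8n.
      Authorization: `Bearer ${process.env.FIREFLY_ACCESS_TOKEN}`,
    },
  },
);
const { data } = await res.json();

// One CSV line per budget: name, spent so far, remaining against the budgeted amount.
// Attribute names (spent, auto_budget_amount) are assumptions; check your API response.
const lines = data.map((budget: any) => {
  const spent = Math.abs(parseFloat(budget.attributes.spent?.[0]?.sum ?? '0'));
  const budgeted = parseFloat(budget.attributes.auto_budget_amount ?? '0');
  return `${budget.attributes.name},${spent.toFixed(2)},${(budgeted - spent).toFixed(2)}`;
});
const csv = ['budget,spent,remaining', ...lines].join('\n');
```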

Prerequisites

  1. Telegram, and knowledge of how to set up a bot (search for BotFather in Telegram)
  2. An existing instance of Firefly III with admin access for creating OAuth2 credentials

How to set it up - Credentials

  1. Open Telegram, and search for BotFather
  2. Create a new bot by following the instructions
  3. Save the API key provided
  4. In n8n, create a new Telegram credential using the info for the new bot
  5. Create an OAuth client in Firefly, using the redirect URL found in n8n's OAuth2 API credential creator
  6. In n8n, fill in the OAuth2 API credential form with the Authorization Code grant type, using the client ID and secret created in Firefly (example values are sketched below)
  7. Create a Gemini credential following the instructions in n8n
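
For orientation, the OAuth2 credential in step 6 usually ends up looking like the hypothetical values below. Firefly III exposes the standard /oauth/authorize and /oauth/token endpoints; the hostname, client ID and secret are placeholders for your own instance.

```typescript
// Hypothetical values for the n8n "OAuth2 API" credential pointing at a Firefly III
// instance. Only the hostname, client ID and secret change per installation.
const fireflyOAuth2Credential = {
  grantType: 'authorizationCode',
  authorizationUrl: 'https://firefly.example.com/oauth/authorize',
  accessTokenUrl: 'https://firefly.example.com/oauth/token',
  clientId: '<client ID shown by Firefly III>',
  clientSecret: '<client secret shown by Firefly III>',
  scope: '',              // Firefly III does not require a scope here
  authentication: 'body', // adjust to 'header' if your instance rejects body credentials
};
```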

How to set it up - the workflow

  1. Set the credential in each Telegram node
  2. Set the Firefly credential in each HTTP Request node
  3. Set the correct base URL for your Firefly instance in each HTTP Request node
  4. Set the desired Gemini credential and model in each AI node
  5. Set the correct asset account IDs (as defined in Firefly) and preferred categories in the AI node's system message (a hypothetical example follows this list)
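
The system message in step 5 is free-form, but it usually boils down to a mapping of Firefly III asset account IDs to readable names plus an allow-list of categories. A hypothetical fragment, assuming two asset accounts:

```typescript
// Hypothetical fragment of the AI node's system message. The account IDs and category
// names must match what actually exists in your Firefly III instance.
const systemMessage = `
You extract transactions from the screenshot or statement the user sends.
Asset accounts (use the Firefly III account ID as source_id):
  1 = Main checking account
  2 = Credit card
Allowed categories: Groceries, Rent, Utilities, Transport, Eating out, Other
For each transaction, return type, date, amount, description, source_id and category_name.
`;
```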

Customization options

The user can specify all types of asset and expense accounts, as well as a specific list of categories and descriptions for Gemini to use. Gemini can also be swapped out for any other AI/LLM.

Additionally, anyone can build on this by reviewing the Firefly III API documentation to automate almost any other part of Firefly.

n8n Workflow: Telegram-driven Transaction Recording and Budget Reporting with Gemini AI & Firefly III

This n8n workflow lets users record financial transactions via Telegram and generate budget reports using Google Gemini AI, integrating with Firefly III for financial management. It simplifies personal finance tracking by turning natural-language messages into structured financial data and readable reports.

What it does

  1. Listens for Telegram Messages: The workflow is triggered by incoming messages to a configured Telegram bot.
  2. Parses the Telegram Command: It checks whether the message starts with /record or /report (see the routing sketch after this list).
  3. Records Transactions (via /record command):
    • Extracts transaction details (amount, description, category, date) from the Telegram message using Google Gemini AI.
    • Formats the extracted data.
    • Sends a request to Firefly III to create a new transaction.
    • Confirms the transaction recording back to the user on Telegram.
  4. Generates Budget Reports (via /report command):
    • Retrieves transaction data from Firefly III.
    • Summarizes the financial data and generates a budget report using Google Gemini AI.
    • Converts the report into a CSV file.
    • Sends the budget report as a CSV file back to the user on Telegram.
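
The command check in step 2 is typically a Switch or IF node in n8n; expressed as code, the branching looks roughly like the sketch below. Only the /record and /report prefixes come from the description above; everything else is illustrative.

```typescript
// Illustrative routing for the incoming Telegram message. In the workflow this is a
// Switch/IF-style check rather than a Code node.
type Route = 'record' | 'report' | 'unknown';

function routeMessage(text: string): Route {
  const trimmed = text.trim().toLowerCase();
  if (trimmed.startsWith('/record')) return 'record';
  if (trimmed.startsWith('/report')) return 'report';
  return 'unknown';
}

// Example: routeMessage('/record 50.25 for groceries at supermarket on 2023-10-26') === 'record'
```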

Prerequisites/Requirements

  • n8n Instance: A running n8n instance.
  • Telegram Bot: A Telegram bot token and chat ID.
  • Google Gemini API Key: For AI-powered transaction parsing and report generation.
  • Firefly III Instance: A self-hosted or cloud Firefly III instance with an API token.

Setup/Usage

  1. Import the workflow: Download the JSON provided and import it into your n8n instance.
  2. Configure Credentials:
    • Telegram Trigger & Node: Set up your Telegram Bot API credentials (Bot Token) and specify the Chat ID for communication.
    • Google Gemini Chat Model & Google Gemini Node: Configure your Google Gemini API key.
    • HTTP Request Nodes: Update the HTTP Request nodes (especially those interacting with Firefly III) with your Firefly III instance URL and API token.
  3. Activate the workflow: Once all credentials are set, activate the workflow.
  4. Interact via Telegram:
    • To record a transaction, send a message to your Telegram bot like: /record 50.25 for groceries at supermarket on 2023-10-26
    • To generate a budget report, send a message like: /report last month or /report for October 2023

Related Templates

Two-way property repair management system with Google Sheets & Drive

This workflow automates the repair request process between tenants and building managers, keeping all updates organized in a single spreadsheet. It is composed of two coordinated workflows, as two separate triggers are required — one for new repair submissions and another for repair updates. A unique Unit ID that corresponds to an individual unit is attributed to each request, and timestamps are used to match repair updates with specific requests.

General use cases include:
  • Property managers who manage multiple buildings or units.
  • Building owners looking to centralize tenant repair communication.
  • Automation builders who want to learn multi-trigger workflow design in n8n.

⚙️ How It Works

Workflow 1 – New Repair Requests
Behind the scenes: a tenant fills out a Google Form ("Repair Request Form"), which automatically adds a new row to a linked Google Sheet.
Steps:
  1. Trigger: Google Sheets rowAdded – runs when a new form entry appears.
  2. Extract & Format: collects all relevant form data (address, unit, urgency, contacts).
  3. Generate Unit ID: creates a standardized identifier (e.g., BUILDING-UNIT) for tracking.
  4. Email Notification: sends the building manager a formatted email summarizing the repair details and including a link to a Repair Update Form (which activates Workflow 2).

Workflow 2 – Repair Updates
Behind the scenes: triggered when the building manager submits a follow-up form ("Repair Update Form").
Steps:
  1. Lookup by UUID: uses the Unit ID from Workflow 1 to find the existing row in the Google Sheet.
  2. Conditional Logic: if photos are uploaded, saves each image to a Google Drive folder, renames files consistently, and adds URLs to the sheet; if no photos, skips the upload step and processes textual updates only.
  3. Merge & Update: combines new data with existing repair info in the same spreadsheet row — enabling a full repair history in one place.

🧩 Requirements
  • Google account (for Forms, Sheets, and Drive)
  • Gmail/email node connected for sending notifications
  • n8n credentials configured for Google API access

⚡ Setup Instructions (see more detail in the workflow)
  1. Import both workflows into n8n, then copy one into a second workflow.
  2. Change the manual trigger in workflow 2 to an n8n Form node.
  3. Connect Google credentials to all nodes.
  4. Update spreadsheet and folder IDs in the corresponding nodes.
  5. Customize email text, sender name, and form links for your organization.
  6. Test each workflow with a sample repair request and a repair update submission.

🛠️ Customization Ideas
  • Add Slack or Telegram notifications for urgent repairs.
  • Auto-create folders per building or unit for photo uploads.
  • Generate monthly repair summaries using Google Sheets triggers.
  • Add an AI node to create summaries and extract relevant repair data from repair requests that include long submissions.

By Matt@VeraisonLabs

Automate invoice processing with OCR, GPT-4 & Salesforce opportunity creation

PDF Invoice Extractor (AI). End-to-end pipeline: Watch Drive ➜ Download PDF ➜ OCR text ➜ AI normalize to JSON ➜ Upsert Buyer (Account) ➜ Create Opportunity ➜ Map Products ➜ Create OLI via Composite API ➜ Archive to OneDrive.

Node by node (what it does & key setup)

  1. Google Drive Trigger
     Purpose: Fire when a new file appears in a specific Google Drive folder.
     Key settings: Event: fileCreated; Folder ID: your Google Drive folder ID; Polling: everyMinute; Creds: googleDriveOAuth2Api.
     Output: Metadata { id, name, ... } for the new file.

  2. Download File From Google
     Purpose: Get the file binary for processing and archiving.
     Key settings: Operation: download; File ID: ={{ $json.id }}; Creds: googleDriveOAuth2Api.
     Output: Binary (default key: data) and original metadata.

  3. Extract from File
     Purpose: Extract text from the PDF (OCR as needed) for AI parsing.
     Key settings: Operation: pdf; OCR: enable for scanned PDFs (in options).
     Output: JSON with OCR text at {{ $json.text }}.

  4. Message a model (AI JSON Extractor)
     Purpose: Convert OCR text into a strict, normalized JSON array (invoice schema).
     Key settings: Node: @n8n/n8n-nodes-langchain.openAi; Model: gpt-4.1 (or gpt-4.1-mini); Message role: system (the strict prompt; references {{ $json.text }}); jsonOutput: true; Creds: openAiApi.
     Output (per item): $.message.content → the parsed JSON (ensure it's an array).

  5. Create or update an account (Salesforce)
     Purpose: Upsert the Buyer as an Account using an external ID.
     Key settings: Resource: account; Operation: upsert; External Id Field: taxid_c; External Id Value: ={{ $json.message.content.buyer.tax_id }}; Name: ={{ $json.message.content.buyer.name }}; Creds: salesforceOAuth2Api.
     Output: Account record (captures Id) for the downstream Opportunity.

  6. Create an opportunity (Salesforce)
     Purpose: Create an Opportunity linked to the Buyer (Account).
     Key settings: Resource: opportunity; Name: ={{ $('Message a model').item.json.message.content.invoice.code }}; Close Date: ={{ $('Message a model').item.json.message.content.invoice.issue_date }}; Stage: Closed Won; Amount: ={{ $('Message a model').item.json.message.content.summary.grand_total }}; AccountId: ={{ $json.id }} (from the Upsert Account output); Creds: salesforceOAuth2Api.
     Output: Opportunity Id for OLI creation.

  7. Build SOQL (Code / JS)
     Purpose: Collect unique product codes from the AI JSON and build a SOQL query for PricebookEntry by Pricebook2Id (sketched after this template description).
     Key settings: pricebook2Id (hardcoded in the script), e.g., 01sxxxxxxxxxxxxxxx; source lines: $('Message a model').first().json.message.content.products.
     Output: { soql, codes }

  8. Query PricebookEntries (Salesforce)
     Purpose: Fetch PricebookEntry.Id for each Product2.ProductCode.
     Key settings: Resource: search; Query: ={{ $json.soql }}; Creds: salesforceOAuth2Api.
     Output: Items with Id and Product2.ProductCode (used for mapping).

  9. Code in JavaScript (Build OLI payloads)
     Purpose: Join lines with the PBE results and the Opportunity Id to build OpportunityLineItem payloads.
     Inputs: OpportunityId: ={{ $('Create an opportunity').first().json.id }}; Lines: ={{ $('Message a model').first().json.message.content.products }}; PBE rows: from the previous node's items.
     Output: { body: { allOrNone: false, records: [{ OpportunityLineItem... }] } }
     Notes: Converts discount_total to per-unit if needed (currently commented out for standard pricing); throws on a missing PBE mapping or empty lines.

  10. Create Opportunity Line Items (HTTP Request)
      Purpose: Bulk create OLIs via the Salesforce Composite API.
      Key settings: Method: POST; URL: https://<your-instance>.my.salesforce.com/services/data/v65.0/composite/sobjects; Auth: salesforceOAuth2Api (predefined credential); Body (JSON): ={{ $json.body }}.
      Output: Composite API results (per-record statuses).

  11. Update File to One Drive
      Purpose: Archive the original PDF in OneDrive.
      Key settings: Operation: upload; File Name: ={{ $json.name }}; Parent Folder ID: your OneDrive folder ID; Binary Data: true (from the Download node); Creds: microsoftOneDriveOAuth2Api.
      Output: Uploaded file metadata.

Data flow (wiring)
  Google Drive Trigger → Download File From Google
  Download File From Google → Extract from File → Update File to One Drive
  Extract from File → Message a model
  Message a model → Create or update an account
  Create or update an account → Create an opportunity
  Create an opportunity → Build SOQL
  Build SOQL → Query PricebookEntries
  Query PricebookEntries → Code in JavaScript
  Code in JavaScript → Create Opportunity Line Items

Quick setup checklist
  🔐 Credentials: Connect Google Drive, OneDrive, Salesforce, OpenAI.
  📂 IDs: Drive Folder ID (watch), OneDrive Parent Folder ID (archive), Salesforce Pricebook2Id (in the JS SOQL builder).
  🧠 AI Prompt: Use the strict system prompt; jsonOutput = true.
  🧾 Field mappings: Buyer tax ID/name → Account upsert fields; invoice code/date/amount → Opportunity fields; product name must equal your Product2.ProductCode in SF.
  ✅ Test: Drop a sample PDF and verify that the AI returns array JSON only, the Account/Opportunity are created, OLI records are created, and the PDF is archived to OneDrive.

Notes & best practices
  • If PDFs are scans, enable OCR in Extract from File.
  • If the AI returns non-JSON, keep "Return only a JSON array" as the last line of the prompt and keep jsonOutput enabled.
  • Consider adding validation on parsing.warnings to gate Salesforce writes.
  • For discounts/taxes in OLI: standard OLI fields don't support per-line discount amounts directly; model them in UnitPrice or custom fields.
  • Replace the Composite API URL with your org's domain, or use the Salesforce node's Bulk Upsert for simplicity.
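
For the Build SOQL Code node described in step 7 above, the logic amounts to deduplicating product codes and interpolating them into a PricebookEntry query. A rough sketch, with the product field name assumed:

```typescript
// Rough sketch of the "Build SOQL" Code node. n8n wraps Code node source in a function,
// which is why the bare return works; $() is n8n's helper for reading another node's
// output. The product field name (code) is an assumption based on the description above.
declare function $(nodeName: string): { first(): { json: any } };

const products: Array<{ code: string }> =
  $('Message a model').first().json.message.content.products;

const pricebook2Id = '01sxxxxxxxxxxxxxxx'; // the hardcoded Pricebook2Id from the template
const codes = [...new Set(products.map((p) => p.code))];

const soql =
  `SELECT Id, Product2.ProductCode FROM PricebookEntry ` +
  `WHERE Pricebook2Id = '${pricebook2Id}' ` +
  `AND Product2.ProductCode IN (${codes.map((c) => `'${c}'`).join(', ')})`;

return [{ json: { soql, codes } }];
```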

By Le Nguyen

AI-powered candidate nurturing with scheduled WhatsApp & Gmail follow-ups

What This Workflow Does
This workflow automates the candidate nurturing process, solving the common problem of candidates losing interest or "ghosting" after an application. It keeps them engaged and informed by sending a personalized, multi-channel (WhatsApp & Gmail) sequence of follow-up messages over their first week. The automation triggers when a new candidate is added to your ATS (e.g., via a Recrutei webhook). It then uses AI to generate a custom 3-part message (for Day 1, Day 3, and Day 7) tailored to the candidate's age and the specific job they applied for, ensuring a professional and empathetic experience that strengthens your employer brand.

How it Works
  1. Trigger: A Webhook node captures the new candidate data from your Applicant Tracking System (ATS) or form.
  2. Data Preparation: Two Code nodes clean the incoming data. The first (Separating information) extracts key fields and formats the phone number. The second (Extract age) calculates the candidate's age from their birthday to be used by the AI.
  3. AI Content Generation: The workflow sends the candidate's details (name, age, job title) to an AI model (AI Recruitment Assistant). The AI has a detailed system prompt to generate three distinct messages for Day 1 (Thank You), Day 3 (Friendly Reminder), and Day 7 (Final Reinforcement), adapting its tone based on the candidate's age.
  4. Split Messages: A Code node (Separating messages per days) receives the single text block from the AI and splits it into three separate variables (day1, day3, day7); see the sketch after this description.
  5. Day 1 Send: The workflow immediately sends the day1 message via both Gmail and WhatsApp (configured for the Evolution API).
  6. Day 3 Send: A Wait node pauses the workflow for 2 days, after which it sends the day3 message.
  7. Day 7 Send: Another Wait node pauses for 4 more days, then sends the final day7 message, completing the 7-day nurturing sequence.

Setup Instructions
This workflow is plug-and-play once you configure the following 5 steps:
  1. Webhook Node: Copy the Test URL from the Webhook node and configure it in your ATS (e.g., Recrutei) or form builder to trigger whenever a new candidate is added. Run one test submission to make the data structure visible to n8n.
  2. AI Credentials: In the AI Recruitment Assistant node, select or create your OpenAI API credential.
  3. MCP Credential (optional): If you use a Recrutei MCP, paste your endpoint URL into the MCP Recrutei node.
  4. Gmail Credentials: In all three Message Gmail nodes (Day 1, 3, 7), select or create your Gmail (OAuth2) credential. Optionally, in the same nodes, go to Options and change the Sender Name from your_company to your actual company name.
  5. WhatsApp (Evolution API): This template is pre-configured for the Evolution API. In all three Message WhatsApp nodes (Day 1, 3, 7), replace {server-url} and {instance} in the URL with your Evolution API details, and in the "Header Parameters" section replace yourapikey with your actual Evolution API key.
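
As a rough illustration of the Split Messages step, the Code node could split the AI's single block on day markers like this; the exact markers depend on the format the system prompt requests, so treat them as assumptions.

```typescript
// Illustrative version of the "Separating messages per days" Code node: split the AI's
// single text block into day1/day3/day7. The "Day 1:"/"Day 3:"/"Day 7:" markers are an
// assumption; match them to whatever format the AI Recruitment Assistant prompt asks for.
function splitMessages(block: string): { day1: string; day3: string; day7: string } {
  const section = (label: string, nextLabel?: string): string => {
    const start = block.indexOf(label);
    if (start === -1) return '';
    const end = nextLabel ? block.indexOf(nextLabel, start) : -1;
    return block.slice(start + label.length, end === -1 ? undefined : end).trim();
  };
  return {
    day1: section('Day 1:', 'Day 3:'),
    day3: section('Day 3:', 'Day 7:'),
    day7: section('Day 7:'),
  };
}
```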

By Recrutei Automações