
Scrape Google Maps leads (email, phone, website) using Apify + GPT + Airtable

Baptiste Fort
2/3/2026

Who is it for?

This workflow is for marketers, sales teams, and local businesses who want to quickly collect leads (business name, phone, website, and email) from Google Maps and store them in Airtable.

You can use it for real estate agents, restaurants, therapists, or any local niche.

If you need a professional automation agency to build advanced lead generation systems like this, check out Vision IA's n8n automation services.


How it works

  1. Scrape Google Maps with Apify Google Maps Extractor.
  2. Clean and structure the data (name, address, phone, website).
  3. Visit each website and retrieve the raw HTML.
  4. Use GPT to extract the most relevant email from the site content.
  5. Save everything to Airtable for easy filtering and future outreach.

It works for any location or keyword – just adapt the input in Apify.


Requirements

Before running this workflow, you’ll need:

  • Apify account (to use the Google Maps Extractor)
  • OpenAI API key (for GPT email extraction)
  • Airtable account & base with the following fields:
    • Business Name
    • Address
    • Website
    • Phone Number
    • Email
    • Google Maps URL

Airtable Structure

Your Airtable base should contain these columns:

| Title | Street | Website | Phone Number | Email | URL |
|---|---|---|---|---|---|
| Paris Real Estate Agency | 10 Rue de Rivoli, Paris | https://agency.fr | +33 1 23 45 67 | contact@agency.fr | maps.google.com/... |
| Example Business 2 | 25 Avenue de l’Opéra | https://example.fr | +33 1 98 76 54 | info@example.fr | maps.google.com/... |
| Example Business 3 | 8 Boulevard Haussmann | https://demo.fr | +33 1 11 22 33 | contact@demo.fr | maps.google.com/... |


Error Handling

  • Missing websites: If a business has no website, the flow skips the scraping step (see the sketch after this list).
  • No email found: GPT returns Null if no email is detected.
  • API rate limits: Add a Wait node between requests to avoid Apify/OpenAI throttling.
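
For the first point, a minimal sketch of that guard: an IF node placed after the loop whose condition checks that the website field is not empty, so items without a site bypass the HTTP Request and GPT branch. The keys below are illustrative only (the exact IF node parameters depend on your n8n version); the expression is the same one used later in the flow.

```json
{
  "conditions": [
    {
      "leftValue": "={{ $json.website }}",
      "operator": { "type": "string", "operation": "notEmpty" }
    }
  ]
}
```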

Now let’s take a detailed look at how to set up this automation, using real estate agencies in Paris as an example.

Step 1 – Launch the Google Maps Scraper

Start with a When clicking 'Execute workflow' trigger to launch the flow manually.

Then, add an HTTP Request node with the method set to POST.

👉 Head over to Apify: Google Maps Extractor

On the page: https://apify.com/compass/google-maps-extractor

Enter your business keyword (e.g., real estate agency, hairdresser, restaurant)

Set the location you want to target (e.g., Paris, France)

Choose how many results to fetch (e.g., 50)

Optionally, use filters (only places with a website, by category, etc.)

⚠️ No matter your industry, this works — just adapt the keyword and location.

Once everything is filled in:

Click Run to test.

Then, go to the top right → click on API.

Select the API endpoints tab.

Choose Run Actor synchronously and get dataset items.

Copy the URL and paste it into your HTTP Request (in the URL field).

Then enable:

✅ Body Content Type → JSON
✅ Specify Body → Using JSON

Go back to Apify, click on the JSON tab, copy the entire code, and paste it into the JSON body field of your HTTP Request.
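
To sanity-check the node, the request generally takes the shape below. The URL is Apify's standard "Run Actor synchronously and get dataset items" endpoint with your token as a query parameter; the body fields are only illustrative (assumed names for this actor), so always paste the exact JSON copied from the actor's JSON tab.

```json
{
  "method": "POST",
  "url": "https://api.apify.com/v2/acts/compass~google-maps-extractor/run-sync-get-dataset-items?token=<YOUR_APIFY_TOKEN>",
  "body": {
    "searchStringsArray": ["real estate agency"],
    "locationQuery": "Paris, France",
    "maxCrawledPlacesPerSearch": 50
  }
}
```

In the HTTP Request node, the url value goes in the URL field and the body object goes in the JSON body field.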

At this point, if you run your workflow, you should see a structured output similar to this:

title, subTitle, price, categoryName, address, neighborhood, street, city, postalCode, …
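
To make that concrete, a single returned item looks roughly like this (values are invented, and only the fields this workflow actually uses are shown):

```json
{
  "title": "Paris Real Estate Agency",
  "categoryName": "Real estate agency",
  "address": "10 Rue de Rivoli, 75001 Paris",
  "street": "10 Rue de Rivoli",
  "city": "Paris",
  "postalCode": "75001",
  "phone": "+33 1 23 45 67 89",
  "website": "https://agency.fr",
  "url": "https://maps.google.com/..."
}
```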

Step 2 – Clean and structure the data

Once the raw data is fetched from Apify, we clean it up using the Edit Fields node.

In this step, we manually select and rename the fields we want to keep:

Title → {{ $json.title }}

Address → {{ $json.address }}

Website → {{ $json.website }}

Phone → {{ $json.phone }}

URL → {{ $json.url }}

This node lets us keep only the essentials in a clean format, ready for the next steps. On the right: a clear and usable table, easy to work with.
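
Concretely, each item coming out of this node is reduced to the five renamed fields (values invented for illustration):

```json
{
  "Title": "Paris Real Estate Agency",
  "Address": "10 Rue de Rivoli, 75001 Paris",
  "Website": "https://agency.fr",
  "Phone": "+33 1 23 45 67 89",
  "URL": "https://maps.google.com/..."
}
```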

Step 3 – Loop Over Items

Now that our data is clean (see step 2), we’ll go through it item by item to handle each contact individually.

The Loop Over Items node does exactly that:

it takes each row from the table (each contact pulled from Apify) and runs the next steps on them, one by one.

👉 Just set a Batch Size of 20 (or more, depending on your needs).

Nothing tricky here, but this step is essential to keep the flow dynamic and scalable.
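
For reference, this is roughly how the node appears in an exported workflow; the batch size is the only parameter you really need to touch (sketch only, other default options omitted):

```json
{
  "name": "Loop Over Items",
  "type": "n8n-nodes-base.splitInBatches",
  "parameters": { "batchSize": 20 }
}
```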

Step 4 – Edit Field (again)

After looping through each contact one by one (thanks to Loop Over Items), we're refining the data a bit more.

This time, we only want to keep the website.

We use the Edit Fields node again, in Manual Mapping mode, with just:

Website → {{ $json.website }}

The result on the right? A clean list with only the URLs extracted from Google Maps.

🔧 This simple step helps isolate the websites so we can scrape them one by one in the next part of the flow.

Step 5 – Scrape Each Website with an HTTP Request

Let’s continue the flow: in the previous step, we isolated the websites into a clean list. Now, we’re going to send a request to each URL to fetch the content of the site.

➡️ To do this, we add an HTTP Request node, using the GET method, and set the URL as:

{{ $json.website }} (this value comes from the previous Edit Fields node)

This node will simply “visit” each website automatically and return the raw HTML code (as shown on the right).

📄 That’s the material we’ll use in the next step to extract email addresses (and any other useful info).

We won't read this HTML ourselves; the next step scans it automatically to pick out the patterns that matter to us (email addresses).

This is a technical but crucial step: it’s how we turn a URL into real, usable data.
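
The raw HTML typically lands in a single text field called data on each item, which is exactly what the GPT prompt reads in the next step via {{ $json.data }}. Roughly (content truncated and invented):

```json
{
  "data": "<!DOCTYPE html><html><head>...</head><body>... contact@agency.fr ...</body></html>"
}
```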

Step 6 – Extract the Email with GPT

Now that we've retrieved all the raw HTML from the websites using the HTTP Request node, it's time to analyze it.

💡 Goal: detect the most relevant email address on each site (ideally the main contact or owner).

👉 To do that, we’ll use an OpenAI node (Message a Model). Here’s how to configure it:

⚙️ Key Parameters:

Model: GPT-4.1-mini (or any GPT-4-class model available)

Operation: Message a Model

Resource: Text

Simplify Output: ON

Prompt (message you provide):

Look at this website content and extract only the email I can use to contact this business. In your output, provide only the email and nothing else. Ideally, this email should belong to the business owner, so if you have 2 or more options, pick the most authoritative one. If you don't find any email, output 'Null'.

Example output:

name@examplewebsite.com

{{ $json.data }}
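
With Simplify Output enabled, the part of the node's result the rest of the flow cares about is message.content, which the Airtable step later reads as {{ $json.message.content }}. A successful item looks roughly like this (email invented):

```json
{
  "message": {
    "content": "contact@agency.fr"
  }
}
```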

Step 7 – Save the Data in Airtable

Once we’ve collected everything — the business name, address, phone number, website…

and most importantly the email extracted via ChatGPT — we need to store all of this somewhere clean and organized.

👉 The best place in this workflow is Airtable.

📦 Why Airtable? Because it allows you to:

Easily view and sort the leads you've scraped

Filter, tag, or enrich them later

And most importantly… reuse them in future automations

⚙️ What we're doing here: we add an Airtable → Create Record node to insert each lead into our database.

Inside this node, we manually map each field with the data collected in the previous steps:

| Airtable Field | Description | Value from n8n |
|---|---|---|
| Title | Business name | {{ $('Edit Fields').item.json.Title }} |
| Street | Full address | {{ $('Edit Fields').item.json.Address }} |
| Website | Website URL | {{ $('Edit Fields').item.json.Website }} |
| Phone Number | Business phone number | {{ $('Edit Fields').item.json.Phone }} |
| Email | Email found by ChatGPT | {{ $json.message.content }} |
| URL | Google Maps listing link | {{ $('Edit Fields').item.json.URL }} |

🧠 Reminder: we’re keeping only clean, usable data — ready to be exported, analyzed, or used in cold outreach campaigns (email, CRM, enrichment, etc.).

➡️ And the best part? You can rerun this workflow automatically every week or month to keep collecting fresh leads 🔁.
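
To do that, swap the manual trigger for a Schedule Trigger node. Below is only a rough sketch of what it can look like in an exported workflow (parameter keys vary between n8n versions, so configure the interval through the UI rather than pasting this):

```json
{
  "name": "Run every week",
  "type": "n8n-nodes-base.scheduleTrigger",
  "parameters": {
    "rule": {
      "interval": [
        { "field": "weeks", "weeksInterval": 1 }
      ]
    }
  }
}
```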

Need Help Building an Automated Lead Generation System?

This workflow is a solid foundation for scraping Google Maps and extracting contact emails automatically. If you want to go further with AI-powered lead qualification, multi-channel outreach, and automatic follow-ups, our agency builds custom lead generation systems that run 24/7.

👉 Explore our lead generation automation services: Vision IA – Automated Lead Generation Agency

We help B2B companies and agencies scale their prospecting without hiring more people—everything from data collection to booking qualified meetings happens on autopilot.

Questions about this workflow or other automation solutions? Visit Vision IA or reach out for a free consultation.

n8n Workflow: Apify Google Maps Scraper with GPT and Airtable Integration

This n8n workflow automates the process of scraping business data from Google Maps using Apify, enriching it with OpenAI's GPT, and then storing the structured information in Airtable. It's designed to help you gather leads with comprehensive details like email, phone, and website directly from Google Maps searches.

What it does

This workflow streamlines the lead generation process by:

  1. Triggering Manually: Initiates the workflow upon a manual execution, allowing you to control when the scraping process begins.
  2. Fetching Data from Airtable: Retrieves search queries or other relevant configuration from a specified Airtable base. This allows for dynamic and configurable scraping tasks.
  3. Preparing Data for Apify: Transforms and sets the necessary fields (e.g., search queries) required by the Apify Google Maps Scraper.
  4. Looping Over Items: Processes each item (e.g., each search query) from Airtable individually to ensure all tasks are handled.
  5. Calling Apify Google Maps Scraper: Sends a request to the Apify Google Maps Scraper API to initiate the data extraction for the specified queries.
  6. Enriching Data with OpenAI (GPT): Uses OpenAI to process the scraped data, typically to extract specific information such as contact emails, categorize businesses, or generate summaries.
  7. Updating Airtable: Takes the enriched data and updates or creates new records in your Airtable base, populating fields like email, phone, and website.

Prerequisites/Requirements

To use this workflow, you will need:

  • n8n Instance: A running n8n instance to import and execute the workflow.
  • Airtable Account: An Airtable account with a base and table configured to store your Google Maps leads. You'll need an API key and the Base ID/Table Name.
  • Apify Account: An Apify account with access to the Google Maps Scraper actor. You'll need an Apify API token.
  • OpenAI Account: An OpenAI API key for the GPT model if you intend to use it for data enrichment.

Setup/Usage

  1. Import the Workflow:
    • Copy the provided JSON code.
    • In your n8n instance, go to "Workflows" and click "New".
    • Click the "Import from JSON" button and paste the JSON code.
  2. Configure Credentials:
    • Airtable Node:
      • Click on the "Airtable" node.
      • Add your Airtable API key as a credential.
      • Specify your Airtable Base ID and Table Name where the search queries are located and where the results will be stored.
    • HTTP Request Node (for Apify):
      • Click on the "HTTP Request" node.
      • Configure the URL and headers to interact with the Apify Google Maps Scraper API. This typically involves setting the Apify API endpoint and including your Apify API token in the headers.
    • OpenAI Node:
      • Click on the "OpenAI" node.
      • Add your OpenAI API key as a credential.
      • Configure the specific model and prompt you wish to use for data enrichment.
  3. Configure "Edit Fields (Set)" Node:
    • Adjust the "Edit Fields" node to map the data from Airtable to the expected input format for the Apify HTTP Request, and to map the scraped and enriched data back to Airtable.
  4. Configure "Loop Over Items (Split in Batches)" Node:
    • Ensure this node is correctly configured to process the list of items (e.g., search queries) coming from Airtable.
  5. Activate the Workflow:
    • Once all credentials and configurations are set, save the workflow.
    • Click the "Execute Workflow" button on the "Manual Trigger" node to run it manually, or set up a schedule if you prefer automated runs.
