
Templates by Madame AI Team | Kai

Automated B2B lead generation: Google Maps to Sheets with BrowserAct & Telegram

Automated B2B Lead Generation from Google Maps to Google Sheets using BrowserAct

This n8n template automates local lead generation by scraping Google Maps for businesses, saving them to Google Sheets, and notifying you in real time via Telegram. The workflow is perfect for sales teams, marketing agencies, and local B2B services looking to build targeted lead lists automatically.

---

Self-Hosted Only

This workflow uses a community contribution and is designed and tested for self-hosted n8n instances only.

---

How it works

- The workflow is triggered manually. You can set the Location, BussinesCategory, and number of leads (ExtractedData) in the first BrowserAct node.
- A BrowserAct node ("Run a workflow task") initiates the scraping job on Google Maps using your specified criteria.
- A second BrowserAct node ("Get details of a workflow task") pauses the workflow and waits for the scraping task to be 100% complete.
- A Code node takes the raw JSON string output from the scraper and parses it, splitting the data into individual items, one per business (a sketch of this node follows at the end of this entry).
- A Google Sheets node appends or updates each lead in your spreadsheet, matching on the "Name" column to prevent duplicate entries.
- Finally, a Telegram node sends a message with the new lead's details to your specified chat for instant notification.

---

Requirements

- BrowserAct API account for web scraping
- BrowserAct "Google Maps Local Lead Finder" Template
- BrowserAct n8n Community Node -> (n8n Nodes BrowserAct)
- Google Sheets credentials for saving leads
- Telegram credentials for sending notifications

Need Help?

- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates
- How to Use the BrowserAct n8n Community Node

---

Workflow Guidance and Showcase

AUTOMATE Local Lead Generation: Google Maps to Sheets & Telegram with n8n
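For reference, here is a minimal sketch of what the parsing Code node could look like, assuming the "Get details of a workflow task" node returns the scraped results as a JSON-encoded string in a field named `output` (the field name is an assumption — check your node's actual output):

```js
// n8n Code node, "Run Once for All Items" mode — minimal sketch.
// Assumes the previous node exposes the scraped businesses as a JSON string
// in `output`; adjust the field name to match your BrowserAct task output.
const raw = $input.first().json.output;
const businesses = JSON.parse(raw); // e.g. [{ "Name": "...", "Address": "...", ... }, ...]

// Return one n8n item per business so Google Sheets appends/updates row by row.
return businesses.map((b) => ({ json: b }));
```

Splitting the payload into individual items is what lets the Google Sheets node match on "Name" per row downstream.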


Synchronizing WooCommerce inventory and creating products with Google Gemini AI and BrowserAct

Synchronize WooCommerce Inventory & Create Products with Gemini AI & BrowserAct

This sophisticated n8n template automates WooCommerce inventory management by scraping supplier data, updating existing products, and intelligently creating new ones with AI-formatted descriptions. The workflow is essential for e-commerce operators, dropshippers, and inventory managers who need to keep product pricing and stock levels synchronized with multiple third-party suppliers, minimizing overselling and maximizing profit.

---

Self-Hosted Only

This workflow uses a community contribution and is designed and tested for self-hosted n8n instances only.

---

How it works

- The workflow is typically run by a Schedule Trigger (a Manual Trigger is also provided) to check stock automatically.
- It reads a list of suppliers and their inventory page URLs from a central Google Sheet.
- The workflow loops through each supplier:
  - A BrowserAct node scrapes the current stock and price data from the supplier's inventory page.
  - A Code node parses this bulk data into individual product items (a sketch follows at the end of this entry).
- It then loops through each individual product found and checks WooCommerce to see whether the product already exists, based on its name.
  - If the product exists: the workflow updates the existing product's price and stock quantity.
  - If the product does not exist: an If node checks whether the missing product's category matches a predefined type (optional filtering). If it passes the filter, a second BrowserAct workflow scrapes detailed product attributes from a dedicated product page (e.g., DigiKey), an AI Agent (Gemini) transforms these attributes into a specific, styled HTML table for the product description, and the product is created in WooCommerce with all scraped details and the AI-generated description.
- Error Handling: Multiple Slack nodes are configured to alert your team immediately if any scraping task fails or if the product update/creation process encounters an issue.

Note: This workflow does not support image uploads for new products. To enable that functionality, you must modify both the n8n and BrowserAct workflows.

---

Requirements

- BrowserAct API account for web scraping
- BrowserAct n8n Community Node -> (n8n Nodes BrowserAct)
- BrowserAct templates named "WooCommerce Inventory & Stock Synchronization" and "WooCommerce Product Data Reconciliation"
- Google Sheets credentials for the supplier list
- WooCommerce credentials for product management
- Google Gemini account for the AI Agent
- Slack credentials for error alerts

---

Need Help?

- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates
- How to Use the BrowserAct n8n Community Node

---

Workflow Guidance and Showcase

STOP Overselling! Auto-Sync WooCommerce Inventory from ANY Supplier
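A minimal sketch of the supplier-parsing Code node, assuming the scrape arrives as a JSON string in a field named `output` with `name`, `price`, and `stock` keys (all field names are assumptions; match them to your BrowserAct template):

```js
// n8n Code node, "Run Once for All Items" mode — sketch only.
// Splits the supplier's bulk scrape into one item per product and normalizes
// price/stock so the WooCommerce update step receives clean numbers.
const rows = JSON.parse($input.first().json.output);

return rows.map((r) => ({
  json: {
    name: r.name,
    // Strip currency symbols / thousands separators before parsing.
    price: parseFloat(String(r.price).replace(/[^0-9.]/g, '')),
    stock_quantity: parseInt(r.stock, 10) || 0,
  },
}));
```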


Scrape & import shoe products to Shopify with BrowserAct (with variants & images)

Scrape & Import Products to Shopify from Any Site (with Variants & Images) — Optimized for Shoes

This advanced n8n template automates e-commerce operations by scraping product data (including variants and images) from any URL and creating fully detailed products in your Shopify store. The workflow is essential for dropshippers, e-commerce store owners, and anyone looking to quickly import product catalogs from specific websites into their Shopify store.

---

Self-Hosted Only

This workflow uses a community contribution and is designed and tested for self-hosted n8n instances only.

---

How it works

- The workflow reads a list of product page URLs from a Google Sheet. Your sheet, with its columns for Product Name and Product Link, acts as the database for the workflow.
- The Loop Over Items node processes products one URL at a time.
- Two BrowserAct nodes run sequentially to scrape all product details, including the name, price, description, sizes, and image links.
- A custom Code node transforms the raw scraped data (where fields like sizes might arrive as a single string) into a structured JSON format with clean lists for sizes and images (a sketch follows at the end of this entry).
- The Shopify node creates the base product entry using the main details.
- The workflow then uses a series of nodes (Set Option and Add Option via HTTP Request) to dynamically add product options (e.g., "Shoe Size") to the new product.
- HTTP Request nodes perform two crucial bulk tasks: creating a unique variant for each available size (including a custom SKU) and uploading all associated product images from their external URLs to the product.
- A final Slack notification confirms the batch has been processed.

---

Requirements

- BrowserAct API account for web scraping
- BrowserAct "Bulk Product Scraping From (URLs) and uploading to Shopify (Optimized for shoe - NIKE -> Shopify)" Template
- BrowserAct n8n Community Node -> (n8n Nodes BrowserAct)
- Google Sheets credentials for the input list
- Shopify credentials (API Access Token) to create and update products, variants, and images
- Slack credentials (optional) for notifications

---

Need Help?

- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates
- How to Use the BrowserAct n8n Community Node

---

Workflow Guidance and Showcase

Automate Shoe Scraping to Shopify Using n8n, BrowserAct & Google Sheets
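Here is a minimal sketch of the normalization Code node, assuming sizes arrive as a comma-separated string and image URLs as a comma- or newline-separated string (both assumptions — inspect your actual scraper output and adjust the field names):

```js
// n8n Code node, "Run Once for All Items" mode — sketch only.
// Turns the raw scrape into a structured product with clean arrays for
// sizes and images, ready for the Shopify create/variant/image steps.
const p = $input.first().json;

return [{
  json: {
    title: p.Name,
    price: p.price,
    description: p.description,
    sizes: String(p.sizes || '')
      .split(',')
      .map((s) => s.trim())
      .filter(Boolean),
    images: String(p.images || '')
      .split(/[\n,]+/)
      .map((u) => u.trim())
      .filter((u) => u.startsWith('http')),
  },
}];
```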


Filter real-time news with Gemini AI and BrowserAct for Telegram channels

Automate AI News Filtering with Keywords to Telegram

This n8n template helps you stay up to date by automatically filtering news and sending relevant articles to Telegram using an AI Agent. The workflow is perfect for content marketers, journalists, or researchers who need to find specific articles without manually sifting through countless news feeds.

---

Steps to Take

- Create BrowserAct Workflow: Set up the News Content Marketing Automation template in your BrowserAct account.
- Add BrowserAct Token: Connect your BrowserAct account credentials to the HTTP Request inside the Run node.
- Update Workflow ID: Change the workflow_id value in the HTTP Request inside the Run node to match the one from your BrowserAct workflow.
- Connect Gemini: Add your Google Gemini credentials to the AI Agent node.
- Configure Telegram: Connect your Telegram account and add your Channel ID to the Send a News Photo To Telegram node.

---

How it works

- The workflow is triggered automatically on a schedule, but you can also run it manually.
- It uses an HTTP Request node to start a web scraping task via the BrowserAct API to collect the latest news.
- A series of If and Wait nodes monitor the scraping job until the full data is ready.
- An AI Agent node, powered by Google Gemini, processes the headlines and filters the news based on a list of keywords you define.
- A Code node then formats the AI's output into a clean, readable format (a sketch follows at the end of this entry).
- Finally, the filtered news articles are sent as rich media messages to Telegram, including the headline, a picture, and a link.

---

Requirements

- BrowserAct API account
- BrowserAct "News Content Marketing Automation" Template
- Gemini account
- Telegram credentials

---

Need Help?

- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates
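A minimal sketch of the formatting Code node, assuming the AI Agent is prompted to return a JSON array with `headline`, `image`, and `link` keys in its `output` field (all assumptions — align with your own prompt):

```js
// n8n Code node, "Run Once for All Items" mode — sketch only.
// Fans the AI Agent's JSON answer out into one item per article,
// pre-shaped for the Telegram photo message node downstream.
const articles = JSON.parse($input.first().json.output);

return articles.map((a) => ({
  json: {
    photo: a.image,                      // picture URL for the rich media message
    caption: `${a.headline}\n${a.link}`, // headline plus source link
  },
}));
```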


Scrape physician profiles with BrowserAct into Google Sheets and notify Slack

Scrape Physician Profiles with BrowserAct to Google Sheets

This workflow automates the process of building a targeted database of healthcare providers by scraping physician details for a specific location and syncing them to your records. It leverages BrowserAct to extract data from healthcare directories and keeps your database clean by preventing duplicate entries.

Target Audience

Medical recruiters, pharmaceutical sales representatives, lead generation specialists, and healthcare data analysts.

How it works

- Define Location: The workflow starts by setting the target Location and State in a Set node.
- Scrape Data: A BrowserAct node executes a task (using the "Physician Profile Enricher" template) to search a healthcare directory (e.g., Healow) for doctors matching the criteria.
- Parse JSON: A Code node takes the raw string output from the scraper and parses it into individual JSON objects (a sketch follows at the end of this entry).
- Update Database: A Google Sheets node appends new records or updates existing ones based on the physician's name, preventing duplicates.
- Notify Team: A Slack node sends a message to a specific channel to confirm the batch job has finished successfully.

How to set up

- Configure Credentials: Connect your BrowserAct, Google Sheets, and Slack accounts in n8n.
- Prepare BrowserAct: Ensure the Physician Profile Enricher template is saved in your BrowserAct account.
- Set up the Google Sheet: Create a new Google Sheet with the required headers (listed below).
- Select Spreadsheet: Open the Google Sheets node and select your newly created file and sheet.
- Set Variables: Open the Define Location node and input your target Location (City) and State.
- Configure Notification: Open the Slack node and select the channel where you want to receive alerts.

Google Sheet Headers

To use this workflow, create a Google Sheet with the following headers: Name, Specialty, Address.

Requirements

- BrowserAct account with the Physician Profile Enricher template
- Google Sheets account
- Slack account

How to customize the workflow

- Change the Data Source: Modify the BrowserAct template to scrape a different directory (e.g., Zocdoc or WebMD) and update the Google Sheet columns accordingly.
- Switch Notifications: Replace the Slack node with a Microsoft Teams, Discord, or Email node to suit your team's communication preferences.
- Enrich Data: Add an AI Agent node after the Code node to format addresses or research the specific clinics listed.

Need Help?

- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates

---

Workflow Guidance and Showcase Video

Automate Medical Lead Gen: Scrape Healow to Google Sheets & Slack
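A minimal sketch of the Parse JSON Code node, assuming the scraper's result string lives in an `output` field and each record already uses keys matching the sheet headers (field names are assumptions):

```js
// n8n Code node, "Run Once for All Items" mode — sketch only.
// Parses the BrowserAct result string and emits one item per physician,
// keyed exactly like the Google Sheet headers (Name, Specialty, Address).
const doctors = JSON.parse($input.first().json.output);

return doctors.map((d) => ({
  json: {
    Name: d.Name,
    Specialty: d.Specialty,
    Address: d.Address,
  },
}));
```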


Product review analysis with BrowserAct & Gemini-powered recommendations

Product Review Analysis with BrowserAct & Gemini-Powered Recommendations

This n8n template demonstrates how to perform product review sentiment analysis and generate improvement recommendations using an AI Agent. The workflow is perfect for e-commerce store owners, product managers, or marketing teams who want to automate the process of collecting feedback and turning it into actionable insights.

---

How it works

- The workflow is triggered manually.
- An HTTP Request node initiates a web scraping task with the BrowserAct API to collect product reviews.
- A series of If and Wait nodes check the status of the scraping task. If the task is not yet complete, the workflow pauses and retries until it receives the full dataset (a sketch of this polling check follows at the end of this entry).
- An AI Agent node, powered by Google Gemini, then processes the scraped review summaries. It analyzes the sentiment of each review and generates actionable improvement recommendations.
- Finally, the workflow sends these detailed recommendations via a Telegram message and an email to the relevant stakeholders.

---

Requirements

- BrowserAct API account for web scraping
- BrowserAct "Product Review Sentiment Analysis" Template
- Gemini account for the AI Agent
- Telegram and SMTP credentials for sending messages

---

Need Help?

- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates

---

Workflow Guidance and Showcase

How to INSTANTLY Get Product Improvement Ideas from Amazon Reviews | BrowserAct + n8n + Gemini
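To make the If/Wait loop concrete, here is a hedged sketch of the completion check expressed as a Code node. The actual template uses If node conditions rather than code, and the status field name and values are assumptions about the BrowserAct task-status response:

```js
// n8n Code node, "Run Once for All Items" mode — illustration only.
// Mirrors the decision the If node makes: route to the AI Agent when the
// scraping task is finished, otherwise loop back through a Wait node and re-check.
const task = $input.first().json;
const status = task.data?.status ?? task.status; // assumed location of the status field

return [{
  json: {
    isDone: status === 'completed', // assumed "finished" value
    status,
  },
}];
```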


Monitor Shopify stores for new products with BrowserAct and Slack alerts

Automated E-commerce Store Monitoring for New Products Using BrowserAct

This n8n template is an advanced competitive intelligence tool that automatically monitors competitor e-commerce/Shopify stores and alerts you the moment they launch a new product. The workflow is essential for e-commerce store owners, product strategists, and marketing teams who need real-time insight into what their competitors are selling.

---

Self-Hosted Only

This workflow uses a community contribution and is designed and tested for self-hosted n8n instances only.

---

How it works

- The workflow runs on a Schedule Trigger to check for new products automatically (e.g., daily).
- A Google Sheets node fetches your master list of competitor store links from a central sheet.
- The workflow loops through each competitor one by one.
- For each competitor, a Google Sheets node first creates a dedicated tracking sheet (if one doesn't exist) to store their product list history.
- A BrowserAct node then scrapes the competitor's current product list from their live website.
- The scraped data is saved to the competitor's dedicated tracking sheet.
- The workflow then fetches the newly scraped list and the previously stored list of products.
- A custom Code node (labeled "Compare Datas") performs a difference check to reliably detect whether any new products have been added (a sketch follows at the end of this entry).
- If a new product is detected, an If node triggers an immediate Slack alert to your team, providing real-time competitive insight.

---

Requirements

- BrowserAct API account for web scraping
- BrowserAct "Competitors Shopify Website New Product Monitor" Template
- BrowserAct n8n Community Node -> (n8n Nodes BrowserAct)
- Google Sheets credentials for storing and managing data
- Slack credentials for sending alerts

---

Need Help?

- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates
- How to Use the BrowserAct n8n Community Node

---

Workflow Guidance and Showcase

Automatically Track Competitor Products | n8n & Google Sheets Template
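A hedged sketch of the "Compare Datas" idea — the node names `Previous Product List` and `Current Product List` and the column name `Product Name` are assumptions; point them at whatever your Google Sheets nodes are actually called:

```js
// n8n Code node, "Run Once for All Items" mode — sketch only.
// Difference check: anything in the freshly scraped list that is not in the
// previously stored list is treated as a newly launched product.
const previousNames = new Set(
  $('Previous Product List').all().map((i) => i.json['Product Name'])
);

const newProducts = $('Current Product List')
  .all()
  .filter((i) => !previousNames.has(i.json['Product Name']));

// Emit the new products (for the Slack alert) or a single "nothing new" marker.
return newProducts.length > 0
  ? newProducts
  : [{ json: { newProductFound: false } }];
```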


Indeed job matching to Telegram with BrowserAct & Gemini

Analyze Job Market Data with AI to Find Matching Jobs

This n8n template helps you stay on top of the job market by matching scraped job offers against your resume using an AI Agent. The workflow is perfect for job seekers, recruiters, or market analysts who need to find specific job opportunities without manually sifting through countless listings.

---

Steps to Take

- Create BrowserAct Workflow: Set up the Job Market Intelligence template in your BrowserAct account.
- Add BrowserAct Token: Connect your BrowserAct account credentials to the HTTP Request node.
- Update Workflow ID: Change the workflow_id value in the HTTP Request node to match the one from your BrowserAct workflow.
- Connect Gemini: Add your Google Gemini credentials and paste your resume into the prompt in the AI Agent node.
- Configure Telegram: Connect your Telegram account and add your Channel ID to the Send a text message node.

---

How it works

- The workflow is triggered manually by clicking "Execute workflow," but you can easily set it to run on a schedule.
- It uses an HTTP Request node to start a web scraping task via the BrowserAct API to collect the latest job offers.
- A series of If and Wait nodes monitor the scraping job, ensuring the full data is ready before proceeding.
- An AI Agent node, powered by Google Gemini, processes the job listings and filters them to find the best matches for your resume.
- A Code node then transforms the AI's output into a clean, readable format (a sketch follows at the end of this entry).
- Finally, the filtered job offers are sent directly to you via Telegram.

---

Requirements

- BrowserAct API account
- BrowserAct "Job Market Intelligence" Template
- Gemini account
- Telegram credentials

---

Need Help?

- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates

---

Workflow Guidance and Showcase

Never Manually Search for a Job Again (AI Automation Tutorial)
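A minimal sketch of the formatting Code node, assuming the AI Agent is prompted to return a JSON array of matches with `title`, `company`, `link`, and `reason` keys in its `output` field (all assumptions — align with your own prompt):

```js
// n8n Code node, "Run Once for All Items" mode — sketch only.
// Converts the AI Agent's JSON answer into one Telegram-ready message per match.
const jobs = JSON.parse($input.first().json.output);

return jobs.map((j) => ({
  json: {
    text: `${j.title} — ${j.company}\n${j.reason}\n${j.link}`,
  },
}));
```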


Aggregate Twitter/X content to Telegram channel with BrowserAct & Google Gemini

Automate Social Media Content Aggregation to a Telegram Channel

This n8n template automatically aggregates and analyzes key updates from your social media platform's home page, delivering them as curated posts to a Telegram channel. The workflow is perfect for digital marketers, brand managers, data analysts, or anyone too busy to monitor real-time trends and competitor activity manually.

---

How it works

- The workflow is triggered automatically on a schedule to aggregate the latest social media posts.
- A series of If and Wait nodes monitor the data processing job until the full data is ready.
- An AI Agent, powered by Google Gemini, refines the content by summarizing posts and removing duplicates.
- An If node checks whether a post includes an image to decide whether a photo or a text message should be sent (a sketch of this check follows at the end of this entry).
- Finally, the curated posts are sent to your Telegram channel as rich media messages.

---

How to use

- Set up BrowserAct Template: In your BrowserAct account, set up the "Twitter/X Content Aggregation" template.
- Set up Credentials: Add your credentials for BrowserAct in the Run node, Google Gemini in the Agent node, and Telegram in the Send node.
- Add Workflow ID: Change the workflow_id value in the HTTP Request inside the Run node to match the one from your BrowserAct workflow.
- Activate Workflow: To enable the automated schedule, simply activate the workflow.

---

Requirements

- BrowserAct API account
- BrowserAct "Twitter/X Content Aggregation" Template
- Gemini account
- Telegram credentials

---

Customizing this workflow

This workflow provides a powerful foundation for social media monitoring. You could:

- Replace the Telegram node with an email or Slack node to send notifications to a different platform.
- Add more detailed prompts to the AI Agent for more specific analysis or summarization.
- Customize the BrowserAct workflow to fit your exact needs.

---

Need Help?

- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates

---

Workflow Guidance and Showcase

Automate Your Social Media: Get All X/Twitter Updates Directly in Telegram!
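A hedged sketch of the image check. The real template uses an If node; this shows the equivalent decision as a Code node, and the `image` field name is an assumption about the AI Agent's output:

```js
// n8n Code node, "Run Once for All Items" mode — illustration only.
// Flags whether each curated post carries a usable image URL so the workflow
// can route it to "send photo" versus "send text message".
const posts = $input.all().map((i) => i.json);

return posts.map((p) => ({
  json: {
    ...p,
    hasImage: typeof p.image === 'string' && p.image.startsWith('http'),
  },
}));
```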


Find & qualify funded leads with BrowserAct & Gemini

Find & Qualify Funded Leads with BrowserAct & Gemini

This n8n template helps you find new investment leads by automatically scraping articles for funding announcements and analyzing them with an AI Agent. The workflow is ideal for venture capitalists, sales teams, or market researchers who need to automatically track and compile lists of recently funded companies.

---

Self-Hosted Only

This workflow uses a community contribution and is designed and tested for self-hosted n8n instances only.

---

How it works

- The workflow is triggered manually but can be connected to a Cron node to run on a schedule.
- A Google Sheets node loads a list of keywords (e.g., "Series A," "Series B") and geographic locations to search for.
- The workflow loops through each keyword, initiating BrowserAct web scraping tasks to collect relevant articles.
- A second set of BrowserAct nodes monitors the scraping jobs, waiting for them to complete before proceeding.
- Once all articles are collected, they are merged and fed into an AI Agent node, powered by Google Gemini.
- The AI Agent processes the articles to identify companies that recently received funding, extracting the Company Name, the Field of Investment, and the source URL.
- A Code node transforms the AI's JSON output into a clean, itemized format (a sketch follows at the end of this entry).
- An If node filters out any entries where no company was found, ensuring data quality.
- The qualified leads are automatically added or updated in a Google Sheet, matching on "Company" to prevent duplicates.
- Finally, a Slack message is sent to a channel to notify your team that the lead list has been updated.

---

Requirements

- BrowserAct API account for web scraping
- BrowserAct n8n Community Node -> (n8n Nodes BrowserAct)
- BrowserAct "Funding Announcement to Lead List (TechCrunch)" Template (or a similar scraping workflow)
- Gemini account for the AI Agent
- Google Sheets credentials for input and output
- Slack credentials for sending notifications

---

Need Help?

- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates
- How to Use the BrowserAct n8n Community Node

---

Workflow Guidance and Showcase

How to Automatically Find Leads from Funding News (n8n Workflow Tutorial)
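A minimal sketch of the itemizing Code node, assuming the AI Agent returns a JSON array with `company`, `field`, and `source_url` keys in its `output` field (all assumptions; the downstream If node then drops rows without a company):

```js
// n8n Code node, "Run Once for All Items" mode — sketch only.
// Splits the AI Agent's JSON answer into one lead per item, keyed to match
// the Google Sheet columns used for the "Company" de-duplication.
const leads = JSON.parse($input.first().json.output);

return leads.map((l) => ({
  json: {
    Company: l.company || '',
    Field: l.field || '',
    Source: l.source_url || '',
  },
}));
```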


Real-time MAP enforcement & price violation alerts using Google Sheets, BrowserAct & Slack

Real-Time MAP Enforcement & Price Violation Alerts using BrowserAct & Slack

This n8n template automates MAP (Minimum Advertised Price) enforcement by monitoring reseller websites and alerting you instantly to price violations and stock issues. The workflow is essential for brand owners, manufacturers, and compliance teams who need to proactively monitor their distribution channels and enforce pricing policies.

---

How it works

- The workflow runs on a Schedule Trigger (e.g., hourly) to continuously monitor product prices.
- A Google Sheets node fetches your list of resellers, product URLs, and the official MAP price (AP_Price).
- The Loop Over Items node ensures that each reseller's product is checked individually.
- A pair of BrowserAct nodes navigate to the reseller's product page and reliably scrape the current live price.
- A series of If nodes check for violations (a sketch follows at the end of this entry):
  - The first check (If1) looks for "NoData", signaling that the product is out of stock, and sends a specific Slack alert.
  - The second check (If) compares the scraped price to your MAP price, triggering a detailed Slack alert if a MAP violation is found.
- The workflow loops back to check the next reseller on the list.

---

Requirements

- BrowserAct API account for web scraping
- BrowserAct "MAP (Minimum Advertised Price) Violation Alerts" Template
- BrowserAct n8n Community Node -> (n8n Nodes BrowserAct)
- Google Sheets credentials for your price list
- Slack credentials for sending alerts

---

Need Help?

- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates
- How to Use the BrowserAct n8n Community Node

---

Workflow Guidance and Showcase

I Built a Bot to Catch MAP Violators (n8n + BrowserAct Workflow)
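A hedged sketch of the two violation checks, expressed as a single Code node for readability (the real template uses If nodes). The scraped-price field name `price` is an assumption; `AP_Price` follows the Google Sheet column named in the description:

```js
// n8n Code node, "Run Once for All Items" mode — illustration only.
// Classifies each reseller check as out-of-stock, a MAP violation, or compliant.
const row = $input.first().json;
const scraped = row.price;                 // value returned by BrowserAct (assumed field)
const mapPrice = parseFloat(row.AP_Price); // official MAP from the sheet

if (scraped === 'NoData') {
  return [{ json: { alert: 'out_of_stock', mapPrice } }];
}

const livePrice = parseFloat(String(scraped).replace(/[^0-9.]/g, ''));

return [{
  json: livePrice < mapPrice
    ? { alert: 'map_violation', livePrice, mapPrice }
    : { alert: 'none', livePrice, mapPrice },
}];
```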


Automate YP.com directory scraping to Google Sheets using BrowserAct

Automate Directory Scraping to Google Sheets using BrowserAct

This n8n template helps you generate local business leads by automatically scraping online directories and saving the results directly to a spreadsheet. The workflow is perfect for sales teams, marketing agencies, or anyone looking to build a list of local business leads from online directories like YP.com.

---

Self-Hosted Only

This workflow uses a community contribution and is designed and tested for self-hosted n8n instances only.

---

How it works

- The workflow is triggered manually. You can set the businesscategory and citylocation inputs in the "Run a workflow task" node.
- A BrowserAct node initiates the web scraping task on your BrowserAct account using the specified template.
- A second BrowserAct node ("Get details of a workflow task") waits for the scraping job to finish before allowing the workflow to proceed.
- A Code node takes the raw output from the scraper (a single JSON string) and parses it, splitting the data into individual items, one per business (a sketch follows at the end of this entry).
- Finally, a Google Sheets node appends or updates each business as a row in your spreadsheet, matching on "Company Name" to prevent duplicates.

---

Requirements

- BrowserAct API account for web scraping
- BrowserAct "Online Directory Lead Scraper (YP.com)" Template
- BrowserAct n8n Community Node -> (n8n Nodes BrowserAct)
- Google Sheets credentials for saving the leads

---

Need Help?

- How to Find Your BrowserAct API Key & Workflow ID
- How to Connect n8n to BrowserAct
- How to Use & Customize BrowserAct Templates
- How to Use the BrowserAct n8n Community Node

---

Workflow Guidance and Showcase

STOP Manual Leads! Automate Lead Gen with BrowserAct & n8n
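A minimal sketch of the parsing Code node, assuming the scraper's single JSON string arrives in an `output` field (field name is an assumption) and that each record carries a "Company Name" key matching the de-duplication column:

```js
// n8n Code node, "Run Once for All Items" mode — sketch only.
// Parses the single JSON string and emits one item per business. Some scrapers
// double-encode their payload, so a second parse pass is attempted defensively.
let parsed = JSON.parse($input.first().json.output);
if (typeof parsed === 'string') {
  parsed = JSON.parse(parsed); // handle doubly-encoded payloads
}

return parsed.map((business) => ({ json: business }));
```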
