
Templates by Ian Kerins

Scrape & track dentist leads from Google Maps with ScrapeOps, Sheets & notifications

Overview
This n8n template automates the generation of local business leads by scraping Google Maps. It goes beyond basic search results by visiting individual business pages to extract detailed contact information, reviews, and attributes (such as LGBTQ+ friendly status). It includes built-in deduplication against a Google Sheet to ensure you only receive alerts for new leads.

Who is this for?
- Marketing Agencies: finding local clients for services.
- Sales Teams: building lists of prospects in specific cities.
- Recruiters: finding businesses in specific niches.
- Researchers: gathering data on local business landscapes.

What problems it solves
- Manual Data Entry: eliminates the need to copy-paste business details from Maps.
- Duplicate Leads: automatically checks against your database to prevent duplicate entries.
- Incomplete Data: performs a "deep scrape" to get data often missing from list views (websites, full phone numbers, reviews).
- Delayed Action: sends instant alerts via Gmail and Slack so you can act on new leads immediately.

How it works
1. Input: takes a city name via a form (or can be scheduled).
2. Search: uses ScrapeOps to search Google Maps for your target keyword (e.g., "Dentist") — see the first sketch after this description.
3. Deep Extraction: visits each business profile to scrape phone numbers, websites, ratings, and reviews.
4. Validation: compares found businesses with your existing Google Sheet database — see the second sketch after this description.
5. Action: saves new leads to the sheet and notifies you via Gmail and Slack.

Set up steps (~10-15 minutes)
1. ScrapeOps Account: register for a free API key at ScrapeOps.
2. Google Sheet: create a new Google Sheet and add these headers: businessName, phone, website, rating, totalReviews, address, city, category, mapUrl, status, checkedAt, lgbtqFriendly, review1, review2, review3. Or duplicate this Template Sheet.
3. Configure Nodes:
   - Set Google Maps Configuration: set your keyword (e.g., "dentist", "plumber").
   - Google Sheets nodes: connect your account and select the sheet you created.
   - Gmail & Slack nodes: update with your email address and Slack channel.

Pre-conditions
- An n8n instance (Cloud or self-hosted).
- A ScrapeOps API key (free tier available).
- A Google Cloud Console project with the Gmail and Sheets APIs enabled (for credentials).

Disclaimer
This template uses ScrapeOps as a community node. You are responsible for complying with Google's Terms of Use, robots directives, and applicable laws in your jurisdiction. Scraping targets may change at any time; adjust render/scroll/wait settings and parsers as needed. Use responsibly for legitimate business purposes.

Key Features
The ScrapeOps n8n node provides access to three main APIs:

Proxy API
Access to ScrapeOps' proxy aggregator for reliable web scraping:
- Smart proxy rotation across multiple providers
- JavaScript rendering support
- Anti-bot bypass capabilities
- Geo-targeting options
- Mobile and residential proxy support
Full documentation: n8n Proxy API Aggregator

Parser API
Extract structured data from popular websites without maintaining your own parsers:
- Supported sites: Amazon, eBay, Walmart, Indeed, Redfin
- Page types: product pages, search results, reviews, categories
- Returns clean, structured JSON data
Full documentation: n8n Parser API

Data API
Direct access to structured data endpoints:
- Amazon Product API: get product details by ASIN or URL
- Amazon Search API: search products and get structured results
Full documentation: n8n Data APIs
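To make the Search step concrete: outside n8n, the fetch performed by the ScrapeOps node can be approximated with a plain HTTP call to the ScrapeOps Proxy API. This is a minimal sketch, assuming the proxy.scrapeops.io endpoint and the render_js/country parameters described in the ScrapeOps docs; the API key and target URL are placeholders.

```python
# Minimal sketch of a ScrapeOps Proxy API request (placeholders for key and URL).
# Check the ScrapeOps Proxy API docs for the parameters your plan supports.
import requests

SCRAPEOPS_API_KEY = "YOUR_API_KEY"  # from scrapeops.io
target_url = "https://www.google.com/maps/search/dentist+in+Boston"  # example search

response = requests.get(
    "https://proxy.scrapeops.io/v1/",
    params={
        "api_key": SCRAPEOPS_API_KEY,
        "url": target_url,
        "render_js": "true",   # Google Maps needs JavaScript rendering
        "country": "us",       # optional geo-targeting
    },
    timeout=120,
)
response.raise_for_status()
html = response.text  # raw page HTML, ready for the extraction step
```

Inside the template this request is handled by the ScrapeOps node itself, so you normally only need to supply your API key there.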

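The Validation step reduces to a set-membership check against rows already stored in the sheet. Below is a hypothetical sketch that keys on the mapUrl column from the headers above; the actual workflow may match on a different column.

```python
# Keep only businesses whose mapUrl is not already in the Google Sheet.
# existing_rows would come from a Google Sheets read; field names follow
# the sheet headers listed above.
def filter_new_leads(scraped, existing_rows):
    seen = {row["mapUrl"] for row in existing_rows if row.get("mapUrl")}
    return [biz for biz in scraped if biz["mapUrl"] not in seen]

# Example:
existing_rows = [{"businessName": "Bright Smiles", "mapUrl": "https://maps.google.com/?cid=123"}]
scraped = [
    {"businessName": "Bright Smiles", "mapUrl": "https://maps.google.com/?cid=123"},
    {"businessName": "City Dental", "mapUrl": "https://maps.google.com/?cid=456"},
]
print(filter_new_leads(scraped, existing_rows))  # -> only "City Dental" survives
```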

Track GitHub trending repositories with ScrapeOps & Google Sheets

Overview
This n8n template tracks GitHub Trending repositories (daily/weekly/monthly), parses the trending page into structured data (rank, repo name, stars, language, etc.), and stores the results in Google Sheets with automatic deduping. It is designed for teams who want a simple "trending feed" for engineering research, developer tooling discovery, and weekly reporting.

Who is this for?
- Developers, PMs, DevRel, and tooling teams who want a lightweight trend radar
- Anyone building a curated list of fast-rising open source projects
- Teams who want Sheets-based tracking without manual copy/paste

What problems it solves
- Automatically collects GitHub Trending data on a schedule
- Prevents duplicate rows using a stable dedupe_key
- Updates existing rows when values change (rank/stars/score)

How it works
1. A schedule triggers the workflow.
2. Inputs define the trending window (daily, weekly, or monthly) and optional languages.
3. ScrapeOps fetches the GitHub Trending HTML reliably.
4. The workflow parses repositories and ranks from the HTML (see the parsing sketch after this description).
5. Cleaned rows are written to Google Sheets using Append or Update Row, matching on dedupe_key (see the upsert sketch after this description).

Setup steps (~5-10 minutes)
1. ScrapeOps
   - Register and get an API key: https://scrapeops.io/app/register/n8n
   - Read the n8n overview: https://scrapeops.io/docs/n8n/overview/
   - (Optional) Learn about ScrapeOps Proxy API features: https://scrapeops.io/docs/n8n/proxy-api/
2. Google Sheets
   - Duplicate this sheet, or create a Sheet and add a trending_raw tab.
   - Add the columns used by the workflow (e.g. captured_at, since, source_url, rank_on_page, full_name, repo_url, stars_total, forks_total, stars_in_period, score, dedupe_key).
   - In the Google Sheets node, choose Append or Update Row and set "Column to match on" to dedupe_key.
3. Customize
   - Change since to daily/weekly/monthly in the Inputs node.
   - Add languages via languages_csv (example: any,python,go,rust).
   - Adjust the delay if needed.

Pre-conditions
- ScrapeOps account and API key configured in n8n
- Google Sheets credentials connected in n8n
- A Sheet tab named trending_raw with matching columns

Disclaimer
This template uses ScrapeOps as a community node. You are responsible for complying with GitHub's Terms of Service, robots directives, and applicable laws in your jurisdiction. Scraping targets can change at any time; you may need to update wait times and parsing logic accordingly. Use responsibly for legitimate business purposes.
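To illustrate the parsing and dedupe_key ideas, here is a rough standalone sketch using requests and BeautifulSoup. The CSS selectors reflect GitHub's current trending markup and can break without notice, and the dedupe_key construction (full name plus trending window) is one plausible choice rather than the exact key the workflow uses; in the template, the fetch goes through ScrapeOps rather than a direct request.

```python
# Fetch and parse the GitHub Trending page into rows with a stable dedupe key.
# Selectors (article.Box-row, h2 a) may change; treat them as a starting point.
import requests
from bs4 import BeautifulSoup

since = "daily"  # or "weekly" / "monthly"; add "/python" etc. to the path for a language
resp = requests.get(f"https://github.com/trending?since={since}", timeout=30)
resp.raise_for_status()

soup = BeautifulSoup(resp.text, "html.parser")
rows = []
for rank, card in enumerate(soup.select("article.Box-row"), start=1):
    link = card.select_one("h2 a")
    full_name = link["href"].strip("/")            # e.g. "owner/repo"
    rows.append({
        "rank_on_page": rank,
        "full_name": full_name,
        "repo_url": f"https://github.com{link['href']}",
        "since": since,
        "dedupe_key": f"{full_name}|{since}",       # one plausible stable key
    })

print(rows[:3])
```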

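For readers who want to see what "Append or Update Row matching on dedupe_key" amounts to outside n8n, here is a hedged sketch using the gspread library with a service-account credential; the spreadsheet name and updated columns are illustrative, not taken from the template.

```python
# Append-or-update ("upsert") into the trending_raw tab, keyed on dedupe_key.
# Assumes a gspread service-account credential (service_account.json).
import gspread

HEADER = ["captured_at", "since", "source_url", "rank_on_page", "full_name",
          "repo_url", "stars_total", "forks_total", "stars_in_period",
          "score", "dedupe_key"]

gc = gspread.service_account()                        # reads the default service_account.json
ws = gc.open("GitHub Trending").worksheet("trending_raw")

def upsert(row: dict) -> None:
    """Append the row, or update rank/stars in place if its dedupe_key already exists."""
    records = ws.get_all_records()                    # list of dicts keyed by the header row
    for i, rec in enumerate(records):
        if rec.get("dedupe_key") == row["dedupe_key"]:
            sheet_row = i + 2                         # +2: header row plus 1-based indexing
            ws.update_cell(sheet_row, HEADER.index("rank_on_page") + 1, row["rank_on_page"])
            ws.update_cell(sheet_row, HEADER.index("stars_total") + 1, row["stars_total"])
            return
    ws.append_row([row.get(col, "") for col in HEADER])
```

Within the workflow, the n8n Google Sheets node performs this upsert for you once "Column to match on" is set to dedupe_key.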