Find LinkedIn Professionals with Google Search and Airtable
Who is this for?
This workflow is perfect for sales professionals, recruiters, business development teams, and marketers who need to build targeted prospect lists from LinkedIn. Whether you're looking for specific job titles, industry professionals, or experts in particular locations, this template automates the tedious process of manual LinkedIn searching.
What problem is this workflow solving?
Finding qualified prospects on LinkedIn manually is time-consuming and inefficient. Traditional methods involve:
- Manually searching LinkedIn with limited search capabilities
- Copy-pasting profile information one by one
- Struggling with LinkedIn's search limitations and restrictions
- Difficulty organizing and tracking prospect data
- No systematic way to avoid duplicate contacts
This workflow solves these challenges by leveraging Google's powerful search capabilities to find LinkedIn profiles at scale, automatically extracting key information, and organizing everything in a structured database.
What this workflow does
The workflow performs intelligent LinkedIn prospect discovery through these key steps:
- Keyword-Based Search: Uses Google Custom Search API to find LinkedIn profiles matching your specific criteria (job titles, industries, locations)
- Smart Data Extraction: Automatically parses profile titles, descriptions, URLs, and search snippets from Google results
- Structured Storage: Saves all prospect data to Airtable with proper field mapping and automatic deduplication
- Pagination Handling: Automatically processes multiple pages of search results to maximize prospect discovery
- Rate Limiting: Includes built-in delays to respect API limits and ensure reliable operation
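Under the hood, each search is a call to the Custom Search JSON API. A minimal sketch of how such a request URL might be constructed (the `buildSearchUrl` helper is illustrative, not a node from this template, and `API_KEY`/`CX_ID` are placeholders for your own credentials):

```javascript
// Sketch: build a Custom Search API URL that targets LinkedIn profiles.
function buildSearchUrl(keywords, { apiKey, cx, start = 1 }) {
  const query = `site:linkedin.com/in ${keywords}`;
  const params = new URLSearchParams({
    key: apiKey,
    cx,
    q: query,
    start: String(start), // 1-based index of the first result (pagination)
  });
  return `https://www.googleapis.com/customsearch/v1?${params}`;
}

const url = buildSearchUrl('Marketing Manager London', {
  apiKey: 'API_KEY',
  cx: 'CX_ID',
  start: 11, // second page of 10 results
});
console.log(url);
```

The `site:linkedin.com/in` operator is what restricts the results to public LinkedIn profile pages.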
Key features:
- Deduplication: Prevents storing duplicate LinkedIn profiles
- Batch Processing: Handles large prospect lists efficiently
- Customizable Search: Easily modify keywords to target different professional segments
- Clean Data Output: Structured data ready for outreach campaigns
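The deduplication can be as simple as keeping a set of already-seen profile URLs; a minimal sketch of that idea (the `dedupeByUrl` and `normalizeUrl` helpers are illustrative, not the exact code inside the template's nodes):

```javascript
// Strip query string, hash, and trailing slash so URL variants match.
function normalizeUrl(url) {
  return url.split(/[?#]/)[0].replace(/\/+$/, '').toLowerCase();
}

// Sketch: drop results whose linkedin_url has already been seen.
function dedupeByUrl(results) {
  const seen = new Set();
  return results.filter((r) => {
    const key = normalizeUrl(r.linkedin_url);
    if (seen.has(key)) return false;
    seen.add(key);
    return true;
  });
}

const deduped = dedupeByUrl([
  { linkedin_url: 'https://linkedin.com/in/jane-doe' },
  { linkedin_url: 'https://linkedin.com/in/jane-doe/' },
  { linkedin_url: 'https://linkedin.com/in/john-smith' },
]);
console.log(deduped.length); // 2
```

Normalizing before comparing matters because the same profile often appears with and without a trailing slash across search pages.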
Setup
Prerequisites
You'll need accounts with the following services:
- Google Cloud Console (for Custom Search API)
- Airtable (free tier works)
- n8n (cloud or self-hosted)
Step 1: Google Custom Search Setup
- Go to Google Cloud Console
- Create a new project or select an existing one
- Enable the Custom Search API
- Create credentials (API Key)
- Set up a Custom Search Engine at Google CSE
- Configure it to search the entire web
- Copy your Search Engine ID (the cx parameter)
Bonus: YouTube set-up guide
Step 2: Airtable Base Setup
Create a new Airtable base with a table named "LinkedIn Prospects" containing these fields:
- Title (Single line text) - LinkedIn profile headline
- linkedin_url (URL) - Direct link to LinkedIn profile
- Search (Single line text) - Original search terms used
- Description (Long text) - Profile description/summary
- Snippet (Long text) - Google search result snippet
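Each Google result item can then be mapped onto those five fields. A sketch of that mapping (field names match the table above; `toAirtableRecord` is a hypothetical helper, and the `og:description` lookup is an assumption about how Google surfaces the profile summary in `pagemap`):

```javascript
// Sketch: map one Custom Search result item onto the Airtable fields above.
function toAirtableRecord(item, searchTerms) {
  const description =
    item.pagemap?.metatags?.[0]?.['og:description'] ?? item.snippet;
  return {
    Title: item.title,        // LinkedIn profile headline
    linkedin_url: item.link,  // direct link to the profile
    Search: searchTerms,      // original search terms used
    Description: description, // profile description/summary
    Snippet: item.snippet,    // Google search result snippet
  };
}

const record = toAirtableRecord(
  {
    title: 'Jane Doe - Marketing Manager - Acme | LinkedIn',
    link: 'https://www.linkedin.com/in/jane-doe',
    snippet: '10+ years of B2B marketing experience',
  },
  'Marketing Manager London',
);
console.log(record.linkedin_url);
```

Keeping the original `Search` terms on every record makes it easy to segment prospects by campaign later.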
Step 3: n8n Credentials Configuration
Set up these credentials in n8n:
Google Custom Search API:
- Type: HTTP Query Auth
- Name: Google Query Auth
- Query Parameter Name: key
- Value: Your Google API key
Airtable:
- Type: Airtable Personal Access Token
- Token: Your Airtable personal access token
- Configure the base and table IDs in the Airtable node
Step 4: Workflow Configuration
- Import this workflow template
- Update the "⚙️ CUSTOMIZE YOUR SEARCH KEYWORDS HERE" node with your target keywords
- Configure the Airtable node with your base and table information
- Test the workflow with a small keyword set first
How to customize this workflow to your needs
Targeting Different Industries
Modify the search keywords in the yellow configuration node:
// For technology professionals
"Software Engineer React"
"Product Manager SaaS"
"Data Scientist Machine Learning"
// For sales professionals
"Account Executive Enterprise"
"Sales Director B2B"
"Business Development Manager"
// For marketing professionals
"Digital Marketing Manager"
"Content Marketing Specialist"
"Growth Marketing Lead"
Geographic Targeting
Add location keywords to narrow your search:
"Marketing Manager London"
"Sales Director New York"
"Software Engineer Berlin"
Company Size Targeting
Include company type indicators:
"CFO Startup"
"VP Engineering Fortune 500"
"Marketing Director SMB"
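If you target several roles across several locations or company types, the keyword list can be generated rather than typed by hand; a small, purely illustrative sketch:

```javascript
// Sketch: expand role x location combinations into search keywords.
function buildKeywords(roles, locations) {
  return roles.flatMap((role) => locations.map((loc) => `${role} ${loc}`));
}

const keywords = buildKeywords(
  ['Marketing Manager', 'Sales Director'],
  ['London', 'New York'],
);
console.log(keywords);
// ["Marketing Manager London", "Marketing Manager New York",
//  "Sales Director London", "Sales Director New York"]
```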
Adjusting Search Volume
Modify the Maxresults parameter in the "Configure Search Settings" node:
- Set to 10 for quick tests
- Set to 50-100 for comprehensive searches
- Maximum recommended: 100 per search to respect API limits
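Because the Custom Search API returns at most 10 results per request, a Maxresults of 100 translates into up to 10 paginated calls. A sketch of the `start` offsets involved (assuming the standard 10-per-page limit):

```javascript
// Sketch: compute the `start` offsets needed to collect maxResults items,
// given the Custom Search API's 10-results-per-request page size.
function pageOffsets(maxResults, pageSize = 10) {
  const pages = Math.ceil(maxResults / pageSize);
  return Array.from({ length: pages }, (_, i) => i * pageSize + 1);
}

console.log(pageOffsets(100)); // [1, 11, 21, ..., 91] -> 10 API calls
console.log(pageOffsets(10));  // [1] -> a single call for quick tests
```

This is also why the built-in delays matter: a single keyword at Maxresults 100 already consumes 10 requests from your daily quota.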
Industry-Specific Customization
For Recruiters:
- Target specific job titles and seniority levels
- Add skills-based keywords ("Python Developer", "React Specialist")
- Include experience indicators ("Senior", "Lead", "Principal")
For Sales Teams:
- Focus on decision-maker titles ("Director", "VP", "C-Level")
- Target specific company sizes or industries
- Include location-based searches for territory management
For Marketers:
- Search for industry influencers and thought leaders
- Target specific professional communities
- Look for content creators and industry experts
Advanced Filtering
Add conditional logic after the search results to filter prospects based on:
- Profile description keywords
- Title patterns
- Company information (when available in snippets)
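Such a filter can be a simple keyword check over the title and snippet text; a minimal sketch (`filterProspects` is illustrative, not part of the template):

```javascript
// Sketch: keep only prospects whose title or snippet mentions a keyword.
function filterProspects(prospects, requiredKeywords) {
  return prospects.filter((p) => {
    const text = `${p.Title} ${p.Snippet}`.toLowerCase();
    return requiredKeywords.some((kw) => text.includes(kw.toLowerCase()));
  });
}

const kept = filterProspects(
  [
    { Title: 'VP Engineering at Acme', Snippet: 'Scaling SaaS teams' },
    { Title: 'Junior Analyst', Snippet: 'Recent graduate' },
  ],
  ['VP', 'Director', 'Head of'],
);
console.log(kept.length); // 1
```

In n8n this logic would typically live in an IF or Code node placed after the search-results parsing step.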
Integration Extensions
Connect additional tools to enhance your prospect research:
- Email finder tools (Hunter.io, Apollo) for contact discovery
- CRM integration (HubSpot, Salesforce) for automatic lead creation
- Enrichment services (Clearbit, ZoomInfo) for additional prospect data
- Slack/Teams notifications for real-time prospect alerts
Data Quality Improvements
Enhance the workflow with additional processing:
- Duplicate detection across multiple search terms
- Profile validation to ensure active LinkedIn profiles
- Keyword scoring to rank prospect relevance
- Export formatting for specific CRM requirements
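Keyword scoring, for instance, can be a simple weighted count over the profile text; a minimal sketch (the weights here are arbitrary examples, not values from the template):

```javascript
// Sketch: score a prospect by how many weighted keywords its text contains.
function scoreProspect(prospect, weights) {
  const text = `${prospect.Title} ${prospect.Description}`.toLowerCase();
  return Object.entries(weights).reduce(
    (score, [kw, w]) => (text.includes(kw.toLowerCase()) ? score + w : score),
    0,
  );
}

const score = scoreProspect(
  { Title: 'Senior Growth Marketing Lead', Description: 'B2B SaaS, demand gen' },
  { senior: 2, marketing: 1, saas: 2, intern: -3 },
);
console.log(score); // 5
```

Sorting prospects by such a score lets outreach teams work the most relevant profiles first.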
This template provides a solid foundation that can be adapted for virtually any B2B prospect research need, making it an essential tool for modern sales and marketing teams.
Find LinkedIn Professionals with Google Search and Airtable
This n8n workflow automates the process of finding LinkedIn professionals using Google Search and storing the results in Airtable. It's designed to help with lead generation, recruitment, or market research by systematically searching for specific professional profiles and organizing the findings.
What it does
This workflow performs the following steps:
- Triggers Manually: The workflow is initiated manually, allowing you to control when the search process begins.
- Fetches Data from Airtable: It connects to an Airtable base to retrieve a list of search queries or criteria for finding professionals.
- Loops Through Search Queries: For each item fetched from Airtable, the workflow processes it individually.
- Constructs Google Search URL: It dynamically builds a Google search URL to specifically target LinkedIn profiles based on the criteria from Airtable.
- Executes Google Search: It makes an HTTP request to Google to perform the constructed search.
- Parses Search Results: It processes the raw HTML response from Google to extract relevant LinkedIn profile URLs and other information.
- Filters and Cleans Data: It refines the extracted data, potentially removing duplicates or irrelevant entries.
- Updates Airtable: For each found LinkedIn profile, it updates the corresponding record in Airtable with the search results, such as the profile URL, name, or other extracted details.
- Introduces Delays: It includes a Wait node to pause between search requests, helping to prevent rate limiting or being flagged by Google.
- Handles Logic with Switch: A Switch node is present, suggesting conditional logic is applied based on the search results or data-processing steps, though the specific conditions are not detailed in the provided JSON.
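The parsing step above boils down to pulling linkedin.com/in/... URLs out of the raw HTML. A hedged sketch of that extraction (a simplified regex, not the exact Code node from the workflow):

```javascript
// Sketch: extract unique LinkedIn profile URLs from a raw HTML string.
function extractProfileUrls(html) {
  // Matches www/regional subdomains (www., uk., de., ...) and the profile slug.
  const re = /https?:\/\/[a-z]{2,3}\.linkedin\.com\/in\/[A-Za-z0-9_-]+/g;
  return [...new Set(html.match(re) ?? [])];
}

const urls = extractProfileUrls(
  '<a href="https://www.linkedin.com/in/jane-doe">Jane</a>' +
  '<a href="https://www.linkedin.com/in/jane-doe">again</a>' +
  '<a href="https://uk.linkedin.com/in/john-smith">John</a>',
);
console.log(urls.length); // 2
```

Deduplicating at this stage keeps the later Airtable update step from writing the same profile twice.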
Prerequisites/Requirements
To use this workflow, you will need:
- n8n Account: A running instance of n8n.
- Airtable Account: An Airtable base with a table configured to store your search queries and the results. You will need an API key and the Base ID/Table Name.
- Google Search Access: The workflow performs HTTP requests to Google. While no specific API key for Google Search is explicitly mentioned, be aware of Google's scraping policies and potential IP blocking if requests are too frequent or aggressive. The Wait node helps mitigate this.
Setup/Usage
- Import the workflow: Download the JSON provided and import it into your n8n instance.
- Configure Airtable Credentials:
- Locate the "Airtable" node.
- Add your Airtable API Key as a credential.
- Specify the Base ID and Table Name where your search queries are stored and where results will be written.
- Configure the "HTTP Request" node: Ensure it's correctly set up to make requests to Google. The current setup implies it's configured to search for LinkedIn profiles.
- Review "Code" and "Edit Fields" nodes: These nodes contain the logic for constructing search queries, parsing results, and formatting data. You might need to adjust them based on the exact structure of your Airtable data and the desired output.
- Activate the workflow: Once configured, activate the workflow.
- Execute Manually: Click "Execute Workflow" on the "Manual Trigger" node to start the process.
This workflow provides a robust foundation for automating LinkedIn professional searches and can be further customized to fit specific data extraction and storage needs.