explorium


Explorium empowers businesses to build high-performance GTM agents with specialized data infrastructure. Our seamless API integrations and high-quality data drive faster agent development and better results. With years of experience and robust data sets, we deliver context-aware solutions, helping AI agents achieve human-level support. Explorium is the essential data partner for teams building agent-driven technologies.

Total Views: 4,700
Templates: 7

Templates by explorium

Enrich company firmographic data in Google Sheets with Explorium MCP

Google Sheets Company Enrichment with Explorium MCP Template

Download the workflow file google_sheets_enrichment.json and import it into a new n8n workflow.

Overview

This n8n workflow template enables automatic enrichment of company information in your Google Sheets. When you add a new company or update existing company details (name or website), the workflow automatically fetches additional business intelligence data using Explorium MCP and updates your sheet with:
- Business ID
- NAICS industry code
- Number of employees (range)
- Annual revenue (range)

Key Features
- Automatic Triggering: Monitors your Google Sheet for new rows or updates to the company name/website fields
- Smart Processing: Only processes new or modified rows, not the entire sheet
- Data Validation: Ensures both company name and website are present before processing
- Error Handling: Processes each row individually so one failure does not affect the others
- Powered by AI: Uses Claude Sonnet 4 with Explorium MCP for intelligent data enrichment

Prerequisites

Before setting up this workflow, ensure you have:
- An n8n instance (self-hosted or cloud)
- A Google account with access to Google Sheets
- An Anthropic API key for Claude
- An Explorium MCP API key

Installation & Setup

Step 1: Import the Workflow
- Create a new workflow.
- Download the workflow JSON from above.
- In your n8n instance, go to Workflows → Add Workflow → Import from File.
- Select the JSON file and click Import.

Step 2: Create the Google Sheet
Create a new Google Sheet (or make a copy of this template). Your Google Sheet must have the following columns (exact names):
- name - Company name
- website - Company website URL
- business_id - Will be populated by the workflow
- naics - Will be populated by the workflow
- number_of_employees_range - Will be populated by the workflow
- yearly_revenue_range - Will be populated by the workflow

Step 3: Configure Google Sheets Credentials
You'll need to set up two Google credentials.
Google Sheets Trigger credentials:
- Click the Google Sheets Trigger node.
- Under Credentials, click Create New.
- On n8n Cloud, click the "Sign in with Google" button and grant permissions to read and monitor your Google Sheets.
- On a self-hosted n8n instance, follow the OAuth2 authentication process and fill in the Client ID and Client Secret fields.
Google Sheets update credentials:
- Click the Update Company Row node.
- Under Credentials, select the same credentials as above or create new ones.
- Ensure the permissions include write access to your sheets.

Step 4: Configure Anthropic Credentials
- Click the Anthropic Chat Model node.
- Under Credentials, click Create New.
- Enter your Anthropic API key and save the credentials.

Step 5: Configure Explorium MCP Credentials
- Click the MCP Client node.
- Under Credentials, click Create New (Header Auth).
- Fill the Name field with api_key and the Value field with your Explorium API key.
- Save the credentials.

Step 6: Link Your Google Sheet
In the Google Sheets Trigger node:
- Select your Google Sheet from the dropdown.
- Select the worksheet (usually "Sheet1").
In the Update Company Row node:
- Select the same Google Sheet and worksheet.
- Ensure the matching column is set to row_number.

Step 7: Activate the Workflow
Click the Active toggle in the top right to activate the workflow. The workflow will now monitor your sheet every minute for changes.

How It Works

Workflow process flow:
1. Google Sheets Trigger: Polls your sheet every minute for new rows or changes to the name/website fields
2. Filter Valid Rows: Validates that both company name and website are present
3. Loop Over Items: Processes each company individually
4. AI Agent: Uses Explorium MCP to find the company's business ID and retrieve firmographic data (revenue, employees, NAICS code)
5. Format Output: Structures the data for Google Sheets
6. Update Company Row: Writes the enriched data back to the original row

Trigger behavior:
- First activation: May process all existing rows to establish a baseline
- Ongoing operation: Only processes new rows or rows where the name/website fields change
- Polling frequency: Checks for changes every minute

Usage

Adding new companies:
1. Add a new row to your Google Sheet.
2. Fill in the name and website columns.
3. Within one minute, the workflow will detect the new row, enrich the company data, and update the remaining columns.

Updating existing companies:
- Modify the name or website field of an existing row.
- The workflow will re-process that row with the updated information, and all enrichment data will be refreshed.

Monitoring executions:
In n8n, go to Executions to see workflow runs. Each execution shows which rows were processed, the success/failure status, and detailed logs for troubleshooting.

Troubleshooting

Common issues:

All rows are processed instead of just new/updated ones:
- Ensure the workflow is activated, not just run manually; manual test runs process all rows.
- The first activation may process all rows once.

No data is returned for a company:
- Verify the company name and website are correct.
- Check whether the company exists in Explorium's database; some smaller or newer companies may not have data available.

Workflow isn't triggering:
- Confirm the workflow is activated (the Active toggle is ON).
- Check that changes were made to the name or website columns.
- Verify the Google Sheets credentials have the proper permissions.

Authentication errors:
- Re-authenticate the Google Sheets credentials.
- Verify the Anthropic API key is valid and has credits.
- Check that the Explorium API key is correct and active.

Error Handling

The workflow processes each row individually, so if one company fails to enrich, the other rows are still processed and the failed row retains its original data. Check the execution logs for specific error details.

Best Practices
- Data quality: Ensure company names and websites are accurate for best results.
- Website format: Include full URLs (https://example.com) rather than just domain names.
- Batch processing: The workflow handles multiple updates efficiently, so you can add several companies at once.
- Regular monitoring: Periodically check the execution logs to ensure smooth operation.

API Limits & Considerations
- Google Sheets API: Subject to Google's API quotas
- Anthropic API: Each enrichment consumes Claude Sonnet 4 tokens
- Explorium MCP: Rate limits may apply based on your subscription

Support
- n8n platform: Consult the n8n documentation or community
- Google Sheets integration: Check n8n's Google Sheets node documentation
- Explorium MCP: Contact Explorium support for API-related issues
- Anthropic/Claude: Refer to Anthropic's documentation for API issues

Example Use Cases
- Sales prospecting: Automatically enrich lead lists with company size and revenue data
- Market research: Build comprehensive databases of companies in specific industries
- Competitive analysis: Track and monitor competitor information
- Investment research: Gather firmographic data for potential investment targets
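The "Filter Valid Rows" step described above can be sketched as a small Code-node-style check. This is a minimal illustration, not the template's actual node expression; the sample rows are hypothetical.

```javascript
// Hypothetical sketch of the "Filter Valid Rows" step: a row proceeds to
// enrichment only when both `name` and `website` are non-empty strings.
const isValidRow = (row) =>
  typeof row.name === 'string' && row.name.trim() !== '' &&
  typeof row.website === 'string' && row.website.trim() !== '';

// Sample rows (hypothetical) as they might arrive from the Sheets trigger.
const rows = [
  { name: 'Acme Corp', website: 'https://acme.example.com' },
  { name: '', website: 'https://missing-name.example.com' },
  { name: 'NoSite Inc', website: '' },
];

const valid = rows.filter(isValidRow);
console.log(valid.length); // only rows with both fields survive
```

Rows failing the check are simply skipped, which matches the workflow's behavior of leaving incomplete rows untouched until both fields are filled in.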

By explorium
1,314 views

Search business prospects with natural language using Claude AI and Explorium MCP

Explorium Prospects Search Chatbot Template

Download the workflow file mcp_to_prospects_to_csv.json and import it into a new n8n workflow.

Overview

This n8n workflow creates a chatbot that understands natural language requests for finding business prospects and automatically:
1. Interprets your query using AI (Claude 3.7 Sonnet)
2. Converts it to proper Explorium API filters
3. Validates the API request structure
4. Fetches prospect data from Explorium
5. Exports results as a downloadable CSV file

Perfect for sales teams, recruiters, and business development professionals who need to quickly find and export targeted prospect lists without learning complex API syntax.

Key Features
- Natural Language Interface: Simply describe who you're looking for in plain English
- Smart Query Translation: AI converts your request to valid API parameters
- Built-in Validation: Ensures API calls meet Explorium's requirements
- Error Recovery: Automatically retries with corrections if validation fails
- Pagination Support: Handles large result sets automatically
- CSV Export: Clean, formatted output ready for CRM import
- Conversation Memory: Maintains context for follow-up queries

Example Queries

The chatbot understands queries like:
- "Find marketing directors at SaaS companies in New York with 50-200 employees"
- "Get me CTOs from fintech startups in California"
- "Show me sales managers at healthcare companies with revenue over $10M"
- "Find engineers at Microsoft with 3-5 years experience"
- "Get customer service leads from e-commerce companies in Europe"

Prerequisites

Before setting up this workflow, ensure you have:
- An n8n instance with the chat interface enabled
- An Anthropic API key for Claude
- Explorium API credentials (Bearer token)
- A basic understanding of n8n chat workflows

Supported Filters

The chatbot can search using these criteria.

Company filters:
- Size: 1-10, 11-50, 51-200, 201-500, 501-1000, 1001-5000, 5001-10000, 10001+ employees
- Revenue: Ranges from $0-500K up to $10T+
- Age: 0-3, 3-6, 6-10, 10-20, 20+ years
- Location: Countries, regions, cities
- Industry: Google categories, NAICS codes, LinkedIn categories
- Name: Specific company names

Prospect filters:
- Job Level: CXO, VP, Director, Manager, Senior, Entry, etc.
- Department: Sales, Marketing, Engineering, Finance, HR, etc.
- Experience: Total months and current role duration
- Location: Country and region codes
- Contact Info: Filter by email/phone availability

Installation & Setup

Step 1: Import the Workflow
- Copy the workflow JSON from the template.
- In n8n: Workflows → Add Workflow → Import from File.
- Paste the JSON and click Import.

Step 2: Configure Anthropic Credentials
- Click the Anthropic Chat Model1 node.
- Under Credentials, click Create New and add your Anthropic API key (name it, e.g., "Anthropic API").
- Save the credentials.

Step 3: Configure Explorium Credentials
You'll need to set up Explorium credentials in two places.
For the MCP Client:
- Click the MCP Client node.
- Under Credentials, create a new Header Auth credential.
- Add your authentication header (usually Authorization: Bearer YOUR_TOKEN).
- Save the credentials.
For API calls:
- Click the Prospects API Call node.
- Use the same Header Auth credentials created above.
- Verify the API endpoint is correct.

Step 4: Activate the Workflow
Save the workflow and click the Active toggle to enable it. The chat interface will now be available.

Step 5: Access the Chat Interface
Click the When chat message received node, copy the webhook URL, and open it in your browser to start chatting.

How It Works

Workflow architecture:
1. Chat Trigger: Receives natural language queries from users
2. Memory Buffer: Maintains conversation context
3. AI Agent: Interprets queries and generates API parameters
4. Validation: Checks the API structure against Explorium requirements
5. API Call: Fetches prospect data with pagination
6. Data Processing: Formats results for CSV export
7. File Conversion: Creates a downloadable CSV file

Processing flow:

User Query → AI Interpretation → Validation → API Call → CSV Export
                   ↑                  │
                   └── Error Correction Loop ←┘

Validation rules. The workflow validates that:
- Filter keys are allowed by the Explorium API
- Values match expected formats (e.g., valid country codes)
- Range filters have proper gte/lte values
- There are no duplicate values in arrays
- The required structure is maintained

Usage Guide

Basic conversation flow:
1. Start with your query: "Find me VPs of Sales at software companies in the US"
2. The bot processes and responds: it generates API filters, validates the structure, fetches data, and returns a CSV download link.
3. Refine if needed: "Can you also include directors and filter for companies with 100+ employees?"

Query tips:
- Be specific: Include job titles, departments, and company details
- Use standard terms: "CTO" instead of "Chief Technology Officer"
- Specify locations: Use country names or standard codes
- Include size/revenue: Helps narrow results effectively

Advanced queries can combine multiple criteria: "Find engineering managers and senior engineers at B2B SaaS companies in New York and California with 50-500 employees and revenue over $5M who have been in their role for at least 1 year"

Output Format

The CSV file includes:
- Prospect ID
- Name (first, last, full)
- Location (country, region, city)
- LinkedIn profile
- Experience summary
- Skills and interests
- Company details
- Job information
- Business ID

Troubleshooting

"Validation failed" errors:
- Check that your query uses supported filter values.
- Ensure location names are spelled correctly.
- Verify company sizes/revenues match the allowed ranges.

No results returned:
- Broaden your search criteria.
- Check whether the company exists in Explorium's database.
- Verify the filter combinations aren't too restrictive.

Chat not responding:
- Ensure the workflow is activated.
- Check that all credentials are properly configured.
- Verify the webhook URL is accessible.

Large result sets timing out:
- Try adding more specific filters.
- Limit results by location or company size.
- Use the size parameter (max 10,000).

Error messages. The bot provides clear feedback:
- Invalid filters: Shows which filters aren't supported
- Value errors: Lists the correct options for each field
- API failures: Explains connection or authentication issues

Performance Optimization

Best practices:
- Start broad, then narrow: Begin with basic criteria and add filters
- Use business IDs when targeting specific companies
- Limit by contact info: Add has_email: true for actionable leads
- Batch by location: Process regions separately for large searches

API limits:
- Maximum 10,000 results per search
- Pagination handles up to 100 records per page
- Rate limits apply based on your Explorium subscription

Customization Options

Modify AI behavior by editing the AI Agent system message to change the response format, add custom filters, adjust interpretation logic, or include additional instructions.

Extend functionality by adding nodes to send results via email, import directly to a CRM, schedule recurring searches, or create custom reports.

Integration ideas: connect to Slack for team queries, add to CRM workflows, create lead scoring systems, or build automated outreach campaigns.

Security Considerations
- API credentials are stored securely in n8n.
- Chat sessions are isolated.
- No prospect data is stored permanently; CSV files are generated on demand.

Support Resources
- n8n platform: Check the n8n documentation
- Explorium API: Contact Explorium support
- Anthropic/Claude: Refer to the Anthropic docs
- Workflow logic: Review the node configurations
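To make the query-translation and validation steps concrete, here is a sketch of a request the AI Agent might emit for "VPs of Sales at US software companies with 51-200 employees", followed by a minimal filter-key check in the spirit of the workflow's validation step. The filter names follow the supported-filter list above, but the exact enum spellings and payload shape are assumptions, not Explorium's confirmed schema.

```javascript
// Hypothetical request body the AI Agent might generate (shape is an assumption).
const requestBody = {
  mode: 'full',
  size: 10000,
  page_size: 100,
  page: 1,
  filters: {
    job_level: { values: ['vp'] },
    job_department: { values: ['sales'] },
    company_size: { values: ['51-200'] },
    country_code: { values: ['us'] },
    has_email: { value: true },
  },
};

// Minimal validator in the spirit of the workflow's validation rules:
// reject any filter key that is not on the allowed list.
const ALLOWED = new Set([
  'job_level', 'job_department', 'company_size',
  'country_code', 'has_email', 'company_revenue', 'company_age',
]);
const validationErrors = Object.keys(requestBody.filters)
  .filter((key) => !ALLOWED.has(key))
  .map((key) => `Unsupported filter: ${key}`);
console.log(validationErrors.length === 0 ? 'valid' : validationErrors);
```

When validation fails, the error strings are what gets fed back to the AI in the error-correction loop so it can regenerate a compliant request.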

By explorium
1,178 views

Automate HubSpot to Salesforce lead creation with Explorium AI enrichment

Automatically enrich prospect data from HubSpot using Explorium and create leads in Salesforce

This n8n workflow streamlines the process of enriching prospect information by automatically pulling data from HubSpot, processing it through Explorium's AI-powered tools, and creating new leads in Salesforce with enhanced prospect details.

Credentials Required

To use this workflow, set up the following credentials in your n8n environment:
- HubSpot: App Token (or OAuth2 for broader compatibility). Used for triggering on new contacts and fetching contact data.
- Explorium API: Generic Header Auth. Header: Authorization, Value: Bearer YOUR_API_KEY.
- Salesforce: OAuth2 or Username/Password. Used for creating new lead records.

Go to Settings → Credentials, create these three credentials, and assign them in the respective nodes before running the workflow.

Workflow Overview

Node 1: HubSpot Trigger
Listens for real-time events from the connected HubSpot account. Once triggered, the node passes metadata about the event to the next step in the flow.

Node 2: HubSpot
Fetches contact details from HubSpot after the trigger event.
- Credential: Connected using a HubSpot App Token
- Resource: Contact
- Operation: Get Contact
- Return All: Disabled
This node retrieves the full contact details needed for further processing and enrichment.

Node 3: Match prospect
Sends each contact's data to Explorium's AI-powered prospect matching API in real time.
- Method: POST
- Endpoint: https://api.explorium.ai/v1/prospects/match
- Authentication: Generic Header Auth (using a configured credential)
- Headers: Content-Type: application/json
The request body is dynamically built from contact data, typically including full_name, company_name, email, phone_number, and linkedin. These fields are matched against Explorium's intelligence graph to return enriched or validated profiles. The response includes total_matches, matched_prospects, and a prospect_id. Each response is used downstream to enrich, validate, or create lead information.

Node 4: Filter
Filters the output from the Match prospect step so that only valid, matched results continue in the flow. Only records that contain at least one matched prospect with a non-null prospect_id are passed forward.
- Status: Currently deactivated (as shown by the "Deactivate" label)

Node 5: Extract Prospect IDs from Matched Results
Extracts all valid prospect_id values from previously matched prospects and compiles them into a flat array. It loops over all matched items, extracts each prospect_id from the matched_prospects array, and returns a single object with an array of all prospect_ids.

Node 6: Explorium Enrich Contacts Information
Performs bulk enrichment of contacts by querying Explorium with a list of matched prospect_ids.
- Method: POST
- Endpoint: https://api.explorium.ai/v1/prospects/contacts_information/bulk_enrich
- Authentication: Header Auth (using saved credentials)
- Headers: Content-Type: application/json, Accept: application/json
Returns enriched contact information, such as:
- emails: professional/personal email addresses
- phone_numbers: mobile and work numbers
- professions_email, professional_email_status, mobile_phone

Node 7: Explorium Enrich Profiles
This additional enrichment node provides supplementary contact data enhancement, running in parallel with the primary enrichment process.

Node 8: Merge
Combines the multiple data streams from the parallel enrichment processes into a single output, letting you consolidate data from different Explorium enrichment endpoints. The "combine" setting means it merges the incoming data streams rather than overwriting them.

Node 9: Code - flatten
This custom code node processes and transforms the merged enrichment data before creating the Salesforce lead. It can be used to:
- Flatten nested data structures
- Format data according to Salesforce field requirements
- Apply business logic or data validation
- Map Explorium fields to Salesforce lead properties
- Handle data type conversions

Node 10: Salesforce
Creates new leads in Salesforce using the enriched data returned by Explorium.
- Credential: Salesforce OAuth2 or Username/Password
- Resource: Lead
- Operation: Create Lead
The node creates new lead records with enriched information including contact details, company information, and professional data obtained through the Explorium enrichment process.

Workflow Flow Summary
1. Trigger: HubSpot webhook fires on new/updated contacts
2. Fetch: Retrieve contact details from HubSpot
3. Match: Find prospect matches using Explorium
4. Filter: Keep only successfully matched prospects (currently deactivated)
5. Extract: Compile prospect IDs for bulk enrichment
6. Enrich: Parallel enrichment of contact information through multiple Explorium endpoints
7. Merge: Combine enrichment results
8. Transform: Flatten and prepare data for Salesforce (Code node)
9. Create: Create new lead records in Salesforce

This workflow ensures comprehensive data enrichment while maintaining data quality, providing a seamless integration between HubSpot prospect data and Salesforce lead creation. The parallel enrichment structure maximizes data collection efficiency before creating high-quality leads in your CRM system.
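The "Code - flatten" node's role can be sketched as follows. This is a hypothetical mapping: the actual Salesforce field assignments and the exact shape of the merged Explorium response in the template may differ.

```javascript
// Hypothetical sketch of the "Code - flatten" node: pick the first available
// email/phone from the merged enrichment data and map it to flat Salesforce
// lead fields. All field names here are illustrative assumptions.
function flattenForSalesforce(enriched) {
  const email = enriched.data?.emails?.[0]?.address ?? null;
  const phone = enriched.data?.phone_numbers?.[0]?.number ?? null;
  return {
    FirstName: enriched.data?.first_name ?? '',
    LastName: enriched.data?.last_name ?? 'Unknown',  // Salesforce requires LastName
    Company: enriched.data?.company_name ?? 'Unknown',
    Email: email,
    Phone: phone,
  };
}

const lead = flattenForSalesforce({
  data: {
    first_name: 'Dana',
    last_name: 'Lee',
    company_name: 'Acme Corp',
    emails: [{ address: 'dana@acme.example.com' }],
    phone_numbers: [],
  },
});
console.log(lead.Email); // → "dana@acme.example.com"
```

Defaulting missing values (rather than dropping the record) keeps the Create Lead step from failing on required Salesforce fields when enrichment is partial.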

By explorium
741 views

Generate personalized sales leads with Claude AI & Explorium for Gmail outreach

Outbound Agent - AI-Powered Lead Generation with Natural Language Prospecting

This n8n workflow transforms natural language queries into targeted B2B prospecting campaigns by combining Explorium's data intelligence with AI-powered research and personalized email generation. Simply describe your ideal customer profile in plain English, and the workflow automatically finds prospects, enriches their data, researches them, and creates personalized email drafts.

Credentials Required

To use this workflow, set up the following credentials in your n8n environment:
- Anthropic API: API Key. Used for AI Agent query interpretation, email research, and email writing. Get your API key at the Anthropic Console.
- Explorium API: Generic Header Auth. Header: Authorization, Value: Bearer YOUR_API_KEY. Used for prospect matching, contact enrichment, professional profiles, and MCP research. Get your API key at the Explorium Dashboard.
- Explorium MCP: HTTP Header Auth. Used for real-time company and prospect intelligence research. Connect to https://mcp.explorium.ai/mcp.
- Gmail: OAuth2. Used for creating email drafts. Alternative options: Outlook, Mailchimp, SendGrid, Lemlist.

Go to Settings → Credentials, create these credentials, and assign them in the respective nodes before running the workflow.

---

Workflow Overview

Node 1: When chat message received
Creates an interactive chat interface where users describe their prospecting criteria in natural language.
- Type: Chat Trigger
- Purpose: Accept natural language queries like "Get 5 marketing leaders at fintech startups who joined in the past year and have valid contact information"
Example prompts:
- "Find SaaS executives in New York with 50-200 employees"
- "Get marketing directors at healthcare companies"
- "Show me VPs at fintech startups with recent funding"

Node 2: Chat or Refinement
This code node manages the conversation flow, handling both initial user queries and validation error feedback.
- Function: Routes either the original chat input or validation error messages to the AI Agent
- Dynamic Input: Combines the chatInput and errorInput fields
- Purpose: Creates a feedback loop for validation error correction

Node 3: AI Agent
The core intelligence node that interprets natural language and generates structured API calls.
Functionality:
- Interprets user intent from natural language queries
- Maps concepts to Explorium API filters (job levels, departments, company size, revenue, location, etc.)
- Generates valid JSON requests with precise filter criteria
- Handles off-topic queries with helpful guidance
- Connected to the MCP Client for real-time filter specifications
AI components:
- Anthropic Chat Model: Claude Sonnet 4 for query interpretation
- Simple Memory: Maintains conversation context (100-message window)
- Output Parser: Structured JSON output with schema validation
- MCP Client: Connected to https://mcp.explorium.ai/mcp for Explorium specifications
System instructions:
- Expert in converting natural language to Explorium API filters
- Can revise previous responses based on validation errors
- Strict adherence to allowed filter values and formats
- Default settings: mode: "full", size: 10000, page_size: 100, has_email: true

Node 4: API Call Validation
This code node validates the AI-generated API request against Explorium's filter specifications.
Validation checks:
- Filter key validity (only allowed filters from the approved list)
- Value format correctness (enums, ranges, country codes)
- No duplicate values in arrays
- Proper range structure for experience fields (total_experience_months, current_role_months)
- Required field presence
Allowed filters:
- country_code, region_country_code, company_country_code, company_region_country_code
- company_size, company_revenue, company_age, number_of_locations
- google_category, naics_category, linkedin_category, company_name
- city_region_country, website_keywords
- has_email, has_phone_number
- job_level, job_department, job_title
- business_id, total_experience_months, current_role_months
Output:
- isValid: Boolean validation status
- validationErrors: Array of specific error messages

Node 5: Is API Call Valid?
Conditional routing node that determines the next step based on validation results.
- If valid: Proceed to Explorium API: Fetch Prospects
- If invalid: Route to the Validation Prompter for correction

Node 6: Validation Prompter
Generates detailed error feedback for the AI Agent when validation fails. This creates a self-correcting loop where the AI learns from validation errors and regenerates compliant requests by routing back to Node 2 (Chat or Refinement).

Node 7: Explorium API: Fetch Prospects
Makes the validated API call to Explorium's prospect database.
- Method: POST
- Endpoint: /v1/prospects/fetch
- Authentication: Header Auth (Bearer token)
- Input: JSON with filters, mode, size, page_size, page
- Returns: Array of matched prospects with prospect IDs based on the filter criteria

Node 8: Pull Prospect IDs
Extracts prospect IDs from the fetch response for bulk enrichment.
- Input: Full fetch response with prospect data
- Output: Array of prospect_id values formatted for the enrichment API

Node 9: Explorium API: Contact Enrichment
Single enrichment node that enhances prospect data with both contact and profile information.
- Method: POST
- Endpoint: /v1/prospects/enrich
- Enrichment types: contacts, profiles
- Authentication: Header Auth (Bearer token)
- Input: Array of prospect IDs from Node 8
Returns:
- Contacts: Professional emails (current, verified), phone numbers (mobile, work), email validation status, all available email addresses
- Profiles: Full professional history, current role details, company information, skills and expertise, education background, experience timeline, job titles and seniority levels

Node 10: Clean Output Data
Transforms and structures the enriched data for downstream processing.

Node 11: Loop Over Items
Iterates through each prospect to generate individualized research and emails.
- Batch size: 1 (processes prospects one at a time)
- Purpose: Enable personalized research and email generation for each prospect
- Loop control: Runs until all prospects are complete

Node 12: Research Email
AI-powered research agent that investigates each prospect using Explorium MCP.
Input data: prospect name, job title, company name, company website, LinkedIn URL, job department, skills
Research focus:
- Company automation tool usage (n8n, Zapier, Make, HubSpot, Salesforce)
- Data enrichment practices
- Tech stack and infrastructure (Snowflake, Segment, etc.)
- Recent company activity and initiatives
- Pain points related to B2B data (outdated CRM data, manual enrichment, static workflows)
- Public content (speaking engagements, blog posts, thought leadership)
AI components:
- Anthropic Chat Model1: Claude Sonnet 4 for research
- Simple Memory1: Maintains research context
- Explorium MCP1: Connected to https://mcp.explorium.ai/mcp for real-time intelligence
Output: Structured JSON with research findings, including automation tools, pain points, and personalization notes.

Node 13: Email Writer
Generates personalized cold email drafts based on the research findings.
Input data: contact info from Loop Over Items, current experience and skills, research findings from the Research Email agent, company data (name, website)
AI components:
- Anthropic Chat Model3: Claude Sonnet 4 for email writing
- Structured Output Parser: Enforces a JSON schema with email, subject, and message fields
Output schema:
- email: Selected prospect email address (professional preferred)
- subject: Compelling, personalized subject line
- message: HTML-formatted email body

Node 14: Create a draft (Gmail)
Creates email drafts in Gmail for review before sending.
- Resource: Draft
- Subject: From the Email Writer output
- Message: HTML-formatted email body
- Send To: Selected prospect email address
- Authentication: Gmail OAuth2
After creation, the workflow loops back to Node 11 (Loop Over Items) to process the next prospect.
Alternative output options:
- Outlook: Create drafts in Microsoft Outlook
- Mailchimp: Add to an email campaign
- SendGrid: Queue for sending
- Lemlist: Add to a cold email sequence

---

Workflow Flow Summary
1. Input: User describes target prospects in natural language via the chat interface
2. Interpret: AI Agent converts the query to structured Explorium API filters using MCP
3. Validate: API call validation ensures filter compliance
4. Refine: If invalid, an error feedback loop helps the AI correct the request
5. Fetch: Retrieve matching prospect IDs from the Explorium database
6. Enrich: Parallel bulk enrichment of contact details and professional profiles
7. Clean: Transform and structure the enriched data
8. Loop: Process each prospect individually
9. Research: AI agent uses Explorium MCP to gather company and prospect intelligence
10. Write: Generate a personalized email based on the research
11. Draft: Create reviewable email drafts in your preferred platform

This workflow eliminates manual prospecting work by combining natural language processing, intelligent data enrichment, automated research, and personalized email generation, taking you from "I need marketing leaders at fintech companies" to personalized, research-backed email drafts in minutes.

---

Customization Options

Flexible triggers. The chat interface can be replaced with:
- Scheduled runs for recurring prospecting
- Webhook triggers from CRM updates
- Manual execution for ad-hoc campaigns

Scalable enrichment. Adjust enrichment depth by:
- Adding more Explorium API endpoints (technographics, funding, news)
- Configuring prospect batch sizes
- Customizing the data cleaning logic

Output destinations. Route emails to your preferred platform:
- Email platforms: Gmail, Outlook, SendGrid, Mailchimp
- Sales tools: Lemlist, Outreach, SalesLoft
- CRM integration: Salesforce, HubSpot (create leads with research)
- Collaboration: Slack notifications, Google Docs reports

AI model flexibility. Swap AI providers based on your needs:
- Default: Anthropic Claude (Sonnet 4)
- Alternatives: OpenAI GPT-4, Google Gemini

---

Setup Notes
- Domain filtering: The workflow prioritizes professional emails; customize the email selection logic in the Clean Output Data node.
- MCP configuration: Explorium MCP requires Header Auth setup; ensure the credentials are properly configured.
- Rate limits: Adjust the Loop Over Items batch size if you hit API rate limits.
- Memory context: Simple Memory maintains conversation history; increase the window length for longer sessions.
- Validation: The AI self-corrects through validation loops; monitor early runs to ensure filter accuracy.

This workflow represents a complete AI-powered sales development representative (SDR) that handles prospecting, research, and personalized outreach with minimal human intervention.
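The range-structure check described for the API Call Validation node (proper gte/lte values for the experience fields) can be sketched like this. It is a minimal illustration of the rule, not the template's actual validation code.

```javascript
// Hypothetical sketch of validating a gte/lte range filter, as applied to
// fields like total_experience_months and current_role_months.
function validateRange(name, range) {
  if (typeof range !== 'object' || range === null) {
    return [`${name}: expected an object with gte/lte`];
  }
  const errors = [];
  const { gte, lte } = range;
  if (gte === undefined && lte === undefined) errors.push(`${name}: gte or lte required`);
  if (gte !== undefined && typeof gte !== 'number') errors.push(`${name}: gte must be a number`);
  if (lte !== undefined && typeof lte !== 'number') errors.push(`${name}: lte must be a number`);
  if (typeof gte === 'number' && typeof lte === 'number' && gte > lte) {
    errors.push(`${name}: gte must not exceed lte`);
  }
  return errors;
}

console.log(validateRange('current_role_months', { gte: 12 }));            // no errors
console.log(validateRange('total_experience_months', { gte: 60, lte: 12 })); // inverted range
```

Any error strings produced here would feed the Validation Prompter, which passes them back to the AI Agent so it can regenerate a compliant request.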

By explorium
185 views

Business intelligence assistant for Slack using Explorium MCP & Claude AI

Explorium Agent for Slack

An AI-powered Slack bot for business intelligence queries using the Explorium API through MCP.

Prerequisites
- Slack workspace with admin access
- Anthropic API key (you can substitute another LLM chat model)
- Explorium API key

Create the Slack App

1. Create the app
- Go to api.slack.com/apps
- Click Create New App → From scratch
- Give it a name (e.g., "Explorium Agent") and select your workspace

2. Bot permissions (OAuth & Permissions)
Add these Bot Token Scopes: app_mentions:read, channels:history, channels:read, chat:write, emoji:read, groups:history, groups:read, im:history, im:read, mpim:history, mpim:read, reactions:read, users:read

3. Enable events
- Event Subscriptions → Enable
- Add the Request URL (from the n8n Slack Trigger node)
- Subscribe to these bot events: app_mention, message.channels, message.groups, message.im, message.mpim, reaction_added

4. Install the app
- Install App → Install to Workspace
- Copy the Bot User OAuth Token (xoxb-...)

Configure n8n

Import this JSON template, then set up the nodes:
- Slack Trigger node: Add a Slack credential with the Bot Token, copy the webhook URL, and paste it into the Slack Event Subscriptions Request URL.
- Anthropic Chat Model node: Add an Anthropic API credential. Model: claude-haiku-4-5-20251001 (you can replace it with another chat model).
- MCP Client node: Endpoint: https://mcp.explorium.ai/mcp. Header Auth: add your Explorium API key.

Usage Examples
- @ExploriumAgent find tech companies in SF with 50-200 employees
- @ExploriumAgent show Microsoft's technology stack
- @ExploriumAgent get CMO contacts at healthcare companies
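Since the bot subscribes to app_mention events, the query text arrives with the bot's mention token embedded. A small preprocessing step, which could live in a Code node between the Slack Trigger and the agent, strips it out. The bot user ID and event shape below are illustrative assumptions.

```javascript
// Hypothetical sketch: extract the user's query from a Slack app_mention event.
// Slack renders mentions as "<@USERID> query text"; the bot ID here is made up.
const BOT_USER_ID = 'U0EXPLORIUM';

function extractQuery(event) {
  return event.text
    .replace(new RegExp(`<@${BOT_USER_ID}>`, 'g'), '')
    .trim();
}

const query = extractQuery({
  type: 'app_mention',
  text: '<@U0EXPLORIUM> find tech companies in SF with 50-200 employees',
});
console.log(query); // → "find tech companies in SF with 50-200 employees"
```

Stripping the mention before the query reaches the chat model keeps the agent's input clean and avoids the model echoing Slack markup back into its replies.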

By explorium · 126 views

Qualify leads with Salesforce, Explorium data & Claude AI analysis of API usage

## Inbound Agent - AI-Powered Lead Qualification with Product Usage Intelligence

This n8n workflow automatically qualifies and scores inbound leads by combining their product usage patterns with deep company intelligence. The workflow pulls new leads from your CRM, analyzes which API endpoints they've been testing, enriches them with firmographic data, and generates comprehensive qualification reports with personalized talking points, giving your sales team everything they need to prioritize and convert high-quality leads.

Demo: Template Demo

### Credentials Required

To use this workflow, set up the following credentials in your n8n environment:

**Salesforce**
- Type: OAuth2 or Username/Password
- Used for: pulling lead reports and creating follow-up tasks
- Alternative CRM options: HubSpot, Zoho, Pipedrive
- Get credentials at Salesforce Setup

**Databricks (or Analytics Platform)**
- Type: HTTP Request with Bearer Token
- Header: Authorization
- Value: Bearer YOUR_DATABRICKS_TOKEN
- Used for: querying product usage and API endpoint data
- Alternative options: Datadog, Mixpanel, Amplitude, custom data warehouse

**Explorium API**
- Type: Generic Header Auth
- Header: Authorization
- Value: Bearer YOUR_API_KEY
- Used for: business matching and firmographic enrichment
- Get your API key at the Explorium Dashboard

**Explorium MCP**
- Type: HTTP Header Auth
- Used for: real-time company intelligence and supplemental research
- Connect to: https://mcp.explorium.ai/mcp

**Anthropic API**
- Type: API Key
- Used for: AI-powered lead qualification and analysis
- Get your API key at the Anthropic Console

Go to Settings → Credentials, create these credentials, and assign them in the respective nodes before running the workflow.

---

### Workflow Overview

**Node 1: When clicking 'Execute workflow'**
Manual trigger that initiates the lead qualification process.
- Type: Manual Trigger
- Purpose: on-demand execution for testing or manual runs
- Alternative trigger options: Schedule Trigger (run automatically hourly, daily, or weekly), Webhook (trigger on CRM updates or new lead events), CRM Trigger (real-time activation when leads are created)

**Node 2: GET SF Report**
Pulls lead data from a pre-configured Salesforce report.
- Method: GET
- Endpoint: Salesforce Analytics Reports API
- Authentication: Salesforce OAuth2
- Returns: raw Salesforce report data, including lead contact information, company names, lead source and status, created dates, and custom fields
- CRM alternatives: this node can be replaced with HubSpot, Zoho, or any CRM's reporting API

**Node 3: Extract Records**
Parses the Salesforce report structure and extracts individual lead records.
- Extraction logic: navigates the report's factMap['T!T'].rows structure and maps data cells to named fields

**Node 4: Extract Tenant Names**
Prepares tenant identifiers for usage data queries.
- Purpose: formats tenant names as SQL-compatible strings for the Databricks query
- Output: a comma-separated, quoted list: 'tenant1', 'tenant2', 'tenant3'

**Node 5: Query Databricks**
Queries your analytics platform to retrieve API usage data for each lead.
- Method: POST
- Endpoint: /api/2.0/sql/statements
- Authentication: Bearer token in headers
- Warehouse ID: your Databricks cluster ID
- Platform alternatives: Datadog (query logs via the Logs API), Mixpanel (Event Segmentation API), Amplitude (Behavioral Cohorts API), or a custom warehouse (PostgreSQL, Snowflake, BigQuery queries)

**Node 6: Split Out**
Splits the Databricks result array into individual items for processing.
- Field: result.data_array
- Purpose: transform a single response with multiple rows into separate items

**Node 7: Rename Keys**
Normalizes column names from the database query to readable field names.
- Mapping: 0 → TenantNames, 1 → endpoints, 2 → endpointsNum

**Node 8: Extract Business Names**
Prepares company names for Explorium enrichment.
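The tenant-name formatting in Node 4 and the request body Node 5 sends to Databricks' SQL Statement Execution endpoint can be sketched as follows. This is an illustration, not the template's actual code: the SQL columns and table name are placeholders, while `statement`, `warehouse_id`, and `wait_timeout` follow the Databricks statements API:

```javascript
// Node 4 sketch: quote tenant names for a SQL IN (...) clause,
// producing 'tenant1', 'tenant2', 'tenant3'
function toSqlInList(tenants) {
  return tenants
    .filter(Boolean)
    .map((t) => `'${String(t).replace(/'/g, "''")}'`) // escape embedded quotes
    .join(', ');
}

// Node 5 sketch: body for POST /api/2.0/sql/statements
// (table and column names below are illustrative placeholders)
function buildStatementBody(tenants, warehouseId) {
  return {
    statement:
      'SELECT tenant_name, endpoints, endpoint_count ' +
      `FROM usage_events WHERE tenant_name IN (${toSqlInList(tenants)}) ` +
      'ORDER BY tenant_name',
    warehouse_id: warehouseId,
    wait_timeout: '30s',
  };
}
```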
**Node 9: Loop Over Items**
Iterates through each company for individual enrichment.

**Node 10: Explorium API: Match Businesses**
Matches company names to Explorium's business entity database.
- Method: POST
- Endpoint: /v1/businesses/match
- Authentication: Header Auth (Bearer token)
- Returns: business_id (unique Explorium identifier), matched_businesses (array of potential matches), match confidence scores

**Node 11: Explorium API: Firmographics**
Enriches matched businesses with comprehensive company data.
- Method: POST
- Endpoint: /v1/businesses/firmographics/bulk_enrich
- Authentication: Header Auth (Bearer token)
- Returns: company name, website, description; industry categories (NAICS, SIC, LinkedIn); size (employee count range, revenue range); location (headquarters address, city, region, country); company age and founding information; social profiles (LinkedIn, Twitter); logo and branding assets

**Node 12: Merge**
Combines API usage data with firmographic enrichment data.

**Node 13: Organize Data as Items**
Structures merged data into clean, standardized lead objects.
- Maps API usage by tenant name
- Maps enrichment data by company name
- Combines with the original lead information
- Creates a complete lead profile for analysis

**Node 14: Loop Over Items1**
Iterates through each qualified lead for AI analysis.
- Batch size: 1 (analyzes leads individually)
- Purpose: generate personalized qualification reports

**Node 15: Get many accounts1**
Fetches the associated Salesforce account for context.
- Resource: Account
- Operation: Get All
- Filter: match by company name
- Limit: 1 record
- Purpose: link lead qualification back to the Salesforce account for task creation

**Node 16: AI Agent**
Analyzes each lead to generate comprehensive qualification reports.
- Input data: lead contact information, API usage patterns (which endpoints were tested), firmographic data (company profile), lead source and status
- Analysis process: evaluates lead quality based on usage, company fit, and signals; identifies which Explorium APIs the lead explored; assesses company size, industry, and potential value; detects quality signals (legitimate company email, active usage) and red flags; determines the optimal sales approach and timing
- Connected to Explorium MCP for supplemental company research if needed
- Output: a structured qualification report with:
  - Lead score: High Priority, Medium Priority, Low Priority, or Nurture
  - Quick summary: executive overview of lead potential
  - API usage analysis: endpoints used, usage insights, potential use case
  - Company profile: overview, fit assessment, potential value
  - Quality signals: positive indicators and concerns
  - Recommended actions: next steps, timing, and approach
  - Talking points: personalized conversation starters based on actual API usage

**Node 18: Clean Outputs**
Formats the AI qualification report for Salesforce task creation.

**Node 19: Update Salesforce Records**
Creates follow-up tasks in Salesforce with qualification intelligence.
- Resource: Task
- Operation: Create
- Authentication: Salesforce OAuth2
- Alternative output options: HubSpot (create tasks or update deal stages), Outreach/SalesLoft (add to sequences with custom messaging), Slack (send qualification reports to sales channels), email (send reports to account owners), Google Sheets (log qualified leads for tracking)

---

### Workflow Flow Summary

1. Trigger: manual execution or scheduled run
2. Pull leads: fetch new/updated leads from the Salesforce report
3. Extract: parse lead records and tenant identifiers
4. Query usage: retrieve API endpoint usage data from the analytics platform
5. Prepare: format data for enrichment
6. Match: identify companies in the Explorium database
7. Enrich: pull comprehensive firmographic data
8. Merge: combine usage patterns with company intelligence
9. Organize: structure complete lead profiles
10. Analyze: AI evaluates each lead with quality scoring
11. Format: structure qualification reports for the CRM
12. Create tasks: automatically populate Salesforce with actionable intelligence

This workflow eliminates manual lead research and qualification, automatically analyzing product engagement patterns alongside company fit to help sales teams prioritize and personalize their outreach to the highest-value inbound leads.
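Earlier in the pipeline, Nodes 6-7 turn the positional rows in Databricks' `result.data_array` into named items for the downstream merge. A minimal sketch of what those two steps accomplish together (the function name is illustrative; the field mapping 0 → TenantNames, 1 → endpoints, 2 → endpointsNum is the one described above):

```javascript
// Flatten result.data_array into one item per row, then rename the
// positional columns to the field names used downstream.
function splitAndRename(databricksResult) {
  const rows =
    (databricksResult.result && databricksResult.result.data_array) || [];
  return rows.map((row) => ({
    TenantNames: row[0],
    endpoints: row[1],
    endpointsNum: row[2],
  }));
}
```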
---

### Customization Options

**Flexible Triggers.** Replace the manual trigger with:
- Schedule: run hourly/daily to continuously qualify new leads
- Webhook: real-time qualification when leads are created
- CRM Trigger: activate on specific lead status changes

**Analytics Platform Integration.** The Databricks query can be adapted for:
- Datadog: query application logs and events
- Mixpanel: analyze user behavior and feature adoption
- Amplitude: track product engagement metrics
- Custom databases: PostgreSQL, MySQL, Snowflake, BigQuery

**CRM Flexibility.** Works with multiple CRMs:
- Salesforce: full integration (pull reports, create tasks)
- HubSpot: contact properties and deal updates
- Zoho: lead enrichment and task creation
- Pipedrive: deal qualification and activity creation

**Enrichment Depth.** Add more Explorium endpoints:
- Technographics: tech stack and product usage
- News & Events: recent company announcements
- Funding Data: investment rounds and financial events
- Hiring Signals: job postings and growth indicators

**Output Destinations.** Route qualification reports to:
- CRM updates: Salesforce, HubSpot (update lead scores/fields)
- Task creation: any CRM task/activity system
- Team notifications: Slack, Microsoft Teams, email
- Sales tools: Outreach, SalesLoft sequences
- Reporting: Google Sheets, Data Studio dashboards

**AI Model Options.** Swap AI providers:
- Default: Anthropic Claude (Sonnet 4)
- Alternatives: OpenAI GPT-4, Google Gemini

---

### Setup Notes

- Salesforce report configuration: create a report with the required fields (name, email, company, tenant ID) and use its API endpoint
- Tenant identification: ensure your product usage data includes identifiers that link to CRM leads
- Usage data query: customize the SQL query to match your database schema and table structure
- MCP configuration: Explorium MCP requires Header Auth; configure credentials properly
- Lead scoring logic: adjust the AI system prompts to match your ideal customer profile and qualification criteria
- Task assignment: configure Salesforce task assignment rules or add logic to route tasks to specific sales reps

This workflow acts as an intelligent lead qualification system that combines behavioral signals (what they're testing) with firmographic fit (who they are) to give sales teams actionable intelligence for every inbound lead.

By explorium · 109 views

Automate sales meeting prep with Claude AI & Explorium Intelligence

## Research Agent - Automated Sales Meeting Intelligence

This n8n workflow automatically prepares comprehensive sales research briefs every morning for your upcoming meetings by analyzing both the companies you're meeting with and the individual attendees. The workflow connects to your calendar, identifies external meetings, enriches companies and contacts with deep intelligence from Explorium, and delivers personalized research reports, giving your sales team everything they need for informed, confident conversations.

Demo: Template Demo

### Credentials Required

To use this workflow, set up the following credentials in your n8n environment:

**Google Calendar (or Outlook)**
- Type: OAuth2
- Used for: reading daily meeting schedules and identifying external attendees
- Alternative: Microsoft Outlook Calendar
- Get credentials at the Google Cloud Console

**Explorium API**
- Type: Generic Header Auth
- Header: Authorization
- Value: Bearer YOUR_API_KEY
- Used for: business/prospect matching, firmographic enrichment, professional profiles, LinkedIn posts, website changes, competitive intelligence
- Get your API key at the Explorium Dashboard

**Explorium MCP**
- Type: HTTP Header Auth
- Used for: real-time company intelligence and supplemental research for AI agents
- Connect to: https://mcp.explorium.ai/mcp

**Anthropic API**
- Type: API Key
- Used for: AI-powered company and attendee research analysis
- Get your API key at the Anthropic Console

**Slack (or preferred output)**
- Type: OAuth2
- Used for: delivering research briefs
- Alternative options: Google Docs, email, Microsoft Teams, CRM updates

Go to Settings → Credentials, create these credentials, and assign them in the respective nodes before running the workflow.

---

### Workflow Overview

**Node 1: Schedule Trigger**
Automatically runs the workflow on a recurring schedule.
- Type: Schedule Trigger
- Default: every morning before business hours
- Customizable: set to any interval (hourly, daily, weekly) or specific times
- Alternative trigger options: Manual Trigger (on-demand execution), Webhook (triggered by calendar events or CRM updates)

**Node 2: Get many events**
Retrieves meetings from your connected calendar.
- Calendar source: Google Calendar (or Outlook)
- Authentication: OAuth2
- Time range: current day + 18 hours (configurable via timeMax)
- Returns: all calendar events with attendee information, meeting titles, times, and descriptions

**Node 3: Filter for External Meetings**
Identifies meetings with external participants and filters out internal-only meetings.
- Filtering logic: extracts attendee email domains; excludes your company domain (e.g., 'explorium.ai'); excludes calendar system addresses (e.g., 'resource.calendar.google.com'); only passes events with at least one external attendee
- Important setup note: replace 'explorium.ai' in the Code node with your company domain so internal meetings are filtered correctly
- Output: events with external participants only, plus external_attendees (array of external contact emails), company_domains (unique list of external company domains per meeting), and external_attendee_count (number of external participants)

---

### Company Research Pipeline

**Node 4: Loop Over Items**
Iterates through each meeting with external attendees for company research.

**Node 5: Extract External Company Domains**
Creates a deduplicated list of all external company domains from the current meeting.

**Node 6: Explorium API: Match Business**
Matches company domains to Explorium's business entity database.
- Method: POST
- Endpoint: /v1/businesses/match
- Authentication: Header Auth (Bearer token)
- Returns: business_id (unique Explorium identifier), matched_businesses (array of matches with confidence scores), company name and basic info

**Node 7: If**
Validates that a business match was found before proceeding to enrichment.
- Condition: business_id is not empty
- If true: proceed to the parallel enrichment nodes
- If false: skip to the next company in the loop

**Nodes 8-9: Parallel Company Enrichment**

Node 8: Explorium API: Business Enrich
- Endpoints: /v1/businesses/firmographics/enrich, /v1/businesses/technographics/enrich
- Enrichment types: firmographics, technographics
- Returns: company name, description, website, industry, employees, revenue, headquarters location, ticker symbol, LinkedIn profile, logo, full tech stack, nested tech stack by category, BI & analytics tools, sales tools, marketing tools

Node 9: Explorium API: Fetch Business Events
- Endpoint: /v1/businesses/events/fetch
- Event types: new funding rounds, new investments, mergers & acquisitions, new products, new partnerships
- Date range: September 1, 2025 - November 4, 2025
- Returns: recent business milestones and financial events

**Node 10: Merge**
Combines the enrichment responses and events data into a single data object.

**Node 11: Cleans Merge Data Output**
Transforms merged enrichment data into a structured format for AI analysis.

**Node 12: Company Research Agent**
AI agent (Claude Sonnet 4) that analyzes company data to generate actionable sales intelligence.
- Input: structured company profile with all enrichment data
- Analysis focus: company overview and business context; recent website changes and strategic shifts; tech stack and product focus areas; potential pain points and challenges; how Explorium's capabilities align with their needs; timely conversation starters based on recent activity
- Connected to Explorium MCP: can pull additional real-time intelligence if needed to create a more detailed analysis

**Node 13: Create Company Research Output**
Formats the AI analysis into a readable, shareable research brief.

---

### Attendee Research Pipeline

**Node 14: Create List of All External Attendees**
Compiles all unique external attendee emails across all meetings.

**Node 15: Loop Over Items2**
Iterates through each external attendee for individual enrichment.
**Node 16: Extract External Company Domains1**
Extracts the company domain from each attendee's email.

**Node 17: Explorium API: Match Business1**
Matches the attendee's company domain to get a business_id for prospect matching.
- Method: POST
- Endpoint: /v1/businesses/match
- Purpose: link the attendee to their company

**Node 18: Explorium API: Match Prospect**
Matches the attendee's email to Explorium's professional profile database.
- Method: POST
- Endpoint: /v1/prospects/match
- Authentication: Header Auth (Bearer token)
- Returns: prospect_id (unique professional profile identifier)

**Node 19: If1**
Validates that a prospect match was found.
- Condition: prospect_id is not empty
- If true: proceed to prospect enrichment
- If false: skip to the next attendee

**Node 20: Explorium API: Prospect Enrich**
Enriches the matched prospect using multiple Explorium endpoints.
- Enrichment types: contacts, profiles, linkedin_posts
- Endpoints: /v1/prospects/contacts/enrich, /v1/prospects/profiles/enrich, /v1/prospects/linkedin_posts/enrich
- Returns:
  - Contacts: professional email, email status, all emails, mobile phone, all phone numbers
  - Profiles: full professional history, current role, skills, education, company information, experience timeline, job titles and seniority
  - LinkedIn posts: recent LinkedIn activity, post content, engagement metrics, professional interests and thought leadership

**Node 21: Cleans Enrichment Outputs**
Structures prospect data for AI analysis.

**Node 22: Attendee Research Agent**
AI agent (Claude Sonnet 4) that analyzes prospect data to generate personalized conversation intelligence.
- Input: structured professional profile with activity data
- Analysis focus: career background and progression; current role and responsibilities; recent LinkedIn activity themes and interests; potential pain points in their role; relevant Explorium capabilities for their needs; personal connection points (education, interests, previous companies); opening conversation starters
- Connected to Explorium MCP: can gather additional company or market context if needed

**Node 23: Create Attendee Research Output**
Formats the attendee analysis into a readable brief with clear sections.

**Node 24: Merge2**
Combines company research output with attendee information for final assembly.

**Node 25: Loop Over Items1**
Manages the final loop that combines company and attendee research for output.

**Node 26: Send a message (Slack)**
Delivers the combined research briefs to a specified Slack channel or user.
- Alternative output options: Google Docs (create a formatted document per meeting), email (send to the meeting organizer or sales rep), Microsoft Teams (post to channels or DMs), CRM (update opportunity/account records with research), PDF (generate downloadable research reports)

---

### Workflow Flow Summary

1. Schedule: the workflow runs automatically every morning
2. Fetch calendar: pull today's meetings from Google Calendar/Outlook
3. Filter: identify meetings with external attendees only
4. Extract companies: get unique company domains from external attendees
5. Extract attendees: compile a list of all external contacts

Company research path:
- Match companies: identify businesses in the Explorium database
- Enrich (parallel): pull firmographics, website changes, competitive landscape, events, and challenges
- Merge & clean: combine and structure company data
- AI analysis: generate a company research brief with insights and talking points
- Format: create readable company research output

Attendee research path:
- Match prospects: link attendees to professional profiles
- Enrich (parallel): pull profiles, job changes, and LinkedIn activity
- Merge & clean: combine and structure prospect data
- AI analysis: generate attendee research with background and approach
- Format: create readable attendee research output

Delivery:
- Combine: merge company and attendee research for each meeting
- Send: deliver complete research briefs to Slack or your preferred platform

This workflow eliminates manual pre-meeting research by automatically preparing comprehensive intelligence on both companies and individuals, giving sales teams the context and confidence they need for every conversation.

---

### Customization Options

**Calendar Integration.** Works with multiple calendar platforms:
- Google Calendar: full OAuth2 integration
- Microsoft Outlook: Calendar API support
- CalDAV: generic calendar protocol support

**Trigger Flexibility.** Adjust when research runs:
- Morning routine: default daily at 7 AM
- On-demand: manual trigger for specific meetings
- Continuous: hourly checks for new meetings

**Enrichment Depth.** Add or remove enrichment endpoints:
- Company: technographics, funding history, news mentions, hiring signals
- Prospects: contact information, social profiles, company changes
- Customizable: select only the data you need to optimize speed and costs

**Research Scope.** Configure what gets researched:
- All external meetings: default behavior
- Filtered by keywords: only meetings with specific titles
- By attendee count: only meetings with X+ external attendees
- By calendar: specific calendars only

**Output Destinations.** Deliver research to your preferred platform:
- Messaging: Slack, Microsoft Teams, Discord
- Documents: Google Docs, Notion, Confluence
- Email: Gmail, Outlook, custom SMTP
- CRM: Salesforce, HubSpot (update account notes)
- Project management: Asana, Monday.com, ClickUp

**AI Model Options.** Swap AI providers based on your needs:
- Default: Anthropic Claude (Sonnet 4)
- Alternatives: OpenAI GPT-4, Google Gemini

---

### Setup Notes

- Domain configuration: replace 'explorium.ai' in the Filter for External Meetings Code node with your company domain
- Calendar connection: ensure OAuth2 credentials have calendar read permissions
- Explorium credentials: both the API key and MCP credentials must be configured
- Output timing: the Schedule Trigger should run with enough lead time before the first meetings
- Rate limits: adjust loop batch sizes if you hit API rate limits during enrichment
- Slack configuration: select a destination channel or user for research delivery
- Data privacy: research is based on publicly available professional information and company data

This workflow acts as your automated sales researcher, preparing detailed intelligence reports every morning so your team walks into every meeting informed, prepared, and ready to have meaningful conversations that drive business forward.
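The domain-configuration setup note above refers to the Filter for External Meetings Code node. A hedged sketch of that filtering logic, using the exclusion rules the template describes (the attendee shape follows Google Calendar's event format; function names are illustrative, and `INTERNAL_DOMAIN` must be replaced with your own company domain):

```javascript
// Illustrative version of the "Filter for External Meetings" logic:
// keep only attendees whose email domain is neither your company
// domain nor a Google Calendar resource address.
const INTERNAL_DOMAIN = 'explorium.ai'; // replace with your domain

function externalAttendees(event) {
  return (event.attendees || [])
    .map((a) => (a.email || '').toLowerCase())
    .filter((email) => {
      const domain = email.split('@')[1];
      return (
        Boolean(domain) &&
        domain !== INTERNAL_DOMAIN &&
        !domain.endsWith('resource.calendar.google.com')
      );
    });
}

// Only events with at least one external attendee pass the filter
function isExternalMeeting(event) {
  return externalAttendees(event).length > 0;
}
```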

By explorium · 81 views