Bhuvanesh R

I partner with businesses to design AI voice agents and automation systems. Let's connect on LinkedIn to discuss your AI integration and automation needs.

Total Views: 813
Templates: 2

Templates by Bhuvanesh R

Generate personalized cold emails with Anthropic, GPT-4 & Google Sheets

Your Cold Email is Now Researched. This pipeline finds specific bottlenecks on prospect websites and instantly crafts an irresistible pitch.

---

🎯 Problem Statement

Traditional high-volume cold email outreach is stuck on generic personalization (e.g., "Love your website!"). Sales teams, especially those selling high-value AI Receptionists, struggle to efficiently find the one Unique Operational Hook (like manual scheduling dependency or high call volume) needed to make the pitch relevant. This forces reliance on expensive, slow manual research, leading to low reply rates and inefficient spending on bulk outreach tools.

---

✨ Solution

This workflow deploys a resilient Dual-AI Personalization Pipeline that runs on a batch basis. It uses the Filter (Qualified Leads) node as a cost-saving Quality Gate to prevent processing bad leads. It then executes a Targeted Deep Dive on the surviving leads, using GPT-4 for analytical insight extraction and Claude Sonnet for coherent, human-like copy generation. The entire process outputs campaign-ready data directly to Google Sheets and sends a QA draft via Gmail.

---

⚙️ How It Works (Multi-Step Execution)

1. Ingestion and Cost Control (The Quality Gate)

- Trigger and Ingestion: The workflow starts via a Manual Trigger, pulling leads directly from Get All Leads (Google Sheets).
- Cost Filtering: The Filter (Qualified Leads) node removes leads that lack a working email or website URL.
- Execution Isolation: The Loop Over Leads node initiates individual processing. The Capture Lead Data (Set) node immediately captures and locks down the original lead context for stability throughout the loop.
- Hybrid Scraping: The Scrape Site (HTTP Request) and Extract Text & Links (HTML) nodes execute the Hybrid Scraping strategy, capturing website text and external links in one pass.
- Data Shaping & Status: The Filter Social & Status (Code) node is the control center. It filters links, bundles the context, and, critically, assigns a status of 'Success' or 'Scrape Fail' (a sketch of this logic follows this section).
- Cost Control Branch: The If node checks this status. Items marked 'Scrape Fail' bypass all AI steps (saving 100% of their AI token cost) and jump directly to Log Final Result. Successful items proceed to the AI core.

2. Dual-AI Coherence & Dispatch (The Executive Output)

- Analytical Synthesis: The Summarize Website (OpenAI) node uses GPT-4 to synthesize the full context and extract the Unique Operational Hook (e.g., manual booking overhead).
- Coherent Copy Generation: The Generate Subject & Body (Anthropic) node uses the Claude Sonnet model to generate the subject and the multi-line body, guaranteeing coherence by creating both simultaneously in a single JSON output.
- Final Parsing: The Parse AI Output (Code) node reliably strips markdown wrappers and extracts the clean subject and body strings.
- Final Delivery: The data is logged via Log Final Result (Google Sheets), and the completed email is saved via Create a draft (Gmail) for final Quality Assurance before sending.
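For concreteness, here is a minimal sketch of the kind of logic the Filter Social & Status node can run. The field names (websiteText, links) and the length threshold are illustrative assumptions, not values taken from the template:

```javascript
// Sketch of the Filter Social & Status logic (n8n Code node,
// assuming "Run Once for Each Item" mode). Field names are assumptions.
const lead = $input.item.json;

// Keep only social-profile links out of everything the HTML node extracted.
const socialDomains = ['linkedin.com', 'twitter.com', 'x.com', 'facebook.com', 'instagram.com'];
const socialLinks = (lead.links || []).filter((url) =>
  socialDomains.some((domain) => url.includes(domain))
);

// Treat a scrape as successful only if it returned a usable amount of text.
const text = (lead.websiteText || '').trim();
const status = text.length > 200 ? 'Success' : 'Scrape Fail';

// Bundle the stabilized lead context, scraped text, and status flag
// so the downstream If node can branch on a single field.
return {
  json: {
    email: lead.email,
    website_url: lead.website_url,
    websiteText: text.slice(0, 8000), // cap prompt size to keep token cost bounded
    socialLinks,
    status,
  },
};
```

The If node then needs only one condition, status equals 'Success', to separate the free logging path from the paid AI path.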
---

🛠️ Setup Steps

Before running the workflow, ensure these credentials and data structures are correctly configured.

Credentials

- Anthropic: Configure credentials for the language model (Claude Sonnet).
- OpenAI: Configure credentials for the analytical model (GPT-4/GPT-4o).
- Google Services: Set up OAuth2 credentials for Google Sheets (input/output) and Gmail (draft QA and completion alert).

Configuration

- Google Sheet Setup: Your input sheet must include the columns email, website_url, and an empty Icebreaker column for initial filtering.
- HTTP URL: Verify that the Scrape Site node's URL parameter pulls the website URL from the stabilized data structure: ={{ $json.website_url }}.
- AI Prompts: Ensure the Anthropic prompt contains your current Irresistible Sales Offer and the required nested JSON output structure (a parsing sketch that assumes this structure appears after the benefits list).

---

✅ Benefits

- Coherence Guarantee: A single Anthropic node generates both the subject and body, guaranteeing the message is perfectly aligned and hits the same unique insight.
- Maximum Cost Control: The If node prevents spending tokens on bad or broken websites, making the campaign highly budget-efficient.
- Deep Personalization: Combines website text and social media links, creating an icebreaker that implies thorough, manual research.
- High Reliability: Uses robust Code nodes for data structuring and parsing, so the workflow runs consistently under real-world conditions without crashing.
- Zero-Risk QA: The final Gmail (Create a draft) step ensures human review of the generated copy before any cold emails are sent.
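To make the parsing step concrete, here is a hedged sketch of how the Parse AI Output node might strip markdown wrappers and extract the two strings. Both the path to the raw model text and the {"email": {"subject", "body"}} shape are assumptions about this particular build:

```javascript
// Sketch of the Parse AI Output logic (n8n Code node, "Run Once for Each Item").
// Where the raw model text lives depends on the Anthropic node version; the
// paths below and the {"email": {"subject": "...", "body": "..."}} shape are assumptions.
const raw = $input.item.json.content?.[0]?.text ?? $input.item.json.text ?? '';

// Strip a ```json ... ``` wrapper if the model added one.
const cleaned = raw.trim().replace(/^```(?:json)?\s*/i, '').replace(/```$/, '').trim();

let subject = '';
let body = '';
try {
  const parsed = JSON.parse(cleaned);
  subject = parsed.email?.subject ?? parsed.subject ?? '';
  body = parsed.email?.body ?? parsed.body ?? '';
} catch (err) {
  // Leave both empty so the lead is still logged instead of crashing the loop.
}

return { json: { ...$input.item.json, subject, body } };
```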

By Bhuvanesh R
517 views

Customer pain analysis & AI briefing with Anthropic, Reddit, X, and SerpAPI

The competitive edge, delivered. This Customer Intelligence Engine simultaneously analyzes the web, Reddit, and X/Twitter to generate a professional, actionable executive briefing.

---

🎯 Problem Statement

Traditional market research for Customer Intelligence (CI) is manual, slow, and often relies on surface-level social media scraping or expensive external reports. Service companies, such as HVAC providers, struggle to efficiently synthesize vast volumes of online feedback (Reddit discussions, real-time tweets, web articles) to accurately diagnose systemic service gaps (e.g., scheduling friction, poor automated systems). This inefficiency leads to delayed strategic responses and missed opportunities to invest in high-impact solutions like AI voice agents.

---

✨ Solution

This workflow deploys a sophisticated Multisource Intelligence Pipeline that runs on a scheduled or ad-hoc basis. It uses parallel processing to ingest data from three distinct source types (SERP API, Reddit, and X/Twitter), employs a zero-cost Hybrid Categorization method to semantically identify operational bottlenecks, and uses the Anthropic LLM to synthesize the findings into a clear, executive-ready strategic brief. The data is logged for historical analysis while the brief is dispatched for immediate action.

---

⚙️ How It Works (Multi-Step Execution)

1. Ingestion and Parallel Processing (The Data Fabric)

- Trigger: The workflow is initiated either ad hoc via an n8n Form Trigger or on a schedule (Time Trigger).
- Parallel Ingestion: The workflow immediately splits into three parallel branches to fetch data simultaneously:
  - SERP API: Captures authoritative content and industry commentary (Strategic Context).
  - Reddit (Looping Structure): Fetches posts from multiple subreddits via an Aggregate node workaround to get authentic user experiences (Qualitative Signal).
  - X/Twitter (HTTP Request): Bypasses standard rate limits to capture real-time social complaints (Sentiment Signal).

2. Analysis and Fusion (The Intelligence Layer)

- Cleanup and Labeling (Function nodes): Each branch uses dedicated Function nodes to filter noise (e.g., low-score posts) and normalize the data by adding a source tag (e.g., 'Reddit').
- Merge: A Merge node (Append mode) fuses all three parallel streams into a single, unified dataset.
- Hybrid Categorization (Function node): A single Function node applies the Hybrid Categorization logic. This cost-free step semantically assigns a pain-point category (e.g., 'Call Hold/Availability') and a sentiment score to every item, transforming raw text into labeled metrics (a sketch of this logic follows this section).

3. Dispatch and Reporting (The Executive Output)

- Aggregation and Split (Function node): The final Function node calculates the total counts, deduplicates the final results, and generates the comprehensive summaryString (a sketch appears after the benefits list).
- Data Logging: The aggregated counts and metrics are appended to Google Sheets for historical logging.
- LLM Input Retrieval (Function node): A final Function node retrieves the summary data using the $items() helper (the serial-route workaround).
- AI Briefing: The Message a model (Anthropic) node receives the summaryString and uses a strict HTML system prompt to synthesize the strategic brief, identifying the top pain points and suggesting AI features.
- Delivery: The Gmail node sends the final, professional HTML brief to the executive team.
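A minimal sketch of what the Hybrid Categorization Function node can look like is shown below. The keyword lists, the category names other than 'Call Hold/Availability', and the field names are illustrative assumptions:

```javascript
// Sketch of the Hybrid Categorization logic (n8n Function node).
// Keyword lists and field names are illustrative assumptions.
const categories = {
  'Call Hold/Availability': ['on hold', 'no answer', 'voicemail', 'never picked up'],
  'Scheduling Friction': ['reschedule', 'no-show', 'booking', 'appointment'],
  'Automated System Complaints': ['robot', 'ivr', 'press 1', 'automated menu'],
};
const negativeWords = ['terrible', 'worst', 'frustrated', 'never', 'waste'];

for (const item of items) {
  const text = (item.json.text || '').toLowerCase();

  // Assign the first category whose keyword list matches; default to 'Other'.
  item.json.painpoint = Object.entries(categories).find(([, keywords]) =>
    keywords.some((kw) => text.includes(kw))
  )?.[0] ?? 'Other';

  // Crude sentiment score: count negative keyword hits (more hits = angrier).
  const hits = negativeWords.filter((w) => text.includes(w)).length;
  item.json.sentimentscore = -hits;
}

return items;
```

Because this is plain keyword matching in a Function node rather than a per-item LLM call, the categorization step costs nothing regardless of how many items the three branches return.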
---

🛠️ Setup Steps

Credentials

- Anthropic: Configure credentials for the language model (Claude) used in the Message a model node.
- SERP API, Reddit, and X/Twitter: Configure API keys/credentials for the data ingestion nodes.
- Google Services: Set up OAuth2 credentials for Google Sheets (for logging data) and Gmail (for email dispatch).

Configuration

- Form Configuration: If using the Form Trigger, ensure the Target Keywords and Target Subreddits are mapped correctly to the ingestion nodes.
- Data Integrity: Due to the serial route, ensure the Function (Get LLM Summary) node correctly retrieves the LLMSUMMARYHOLDER field from the preceding node's output memory (see the final sketch below).

---

✅ Benefits

- Proactive CI & Strategy: Shifts market research from manual, reactive browsing to a proactive, scheduled data diagnostic.
- Cost Efficiency: Uses a zero-cost Hybrid Categorization method (Function node) for intent analysis, avoiding expensive per-item LLM token costs.
- Actionable Output: Delivers a fully synthesized, HTML-formatted executive brief, ready for immediate presentation and strategic sales positioning.
- High Reliability: Employs parallel ingestion, API workarounds, and serial routing to ensure the complex workflow runs consistently and without failure.
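As referenced in the Aggregation and Split step above, here is a hedged sketch of how that node could build the summaryString. The dedupe key and the output format are assumptions; the painpoint field follows the categorization sketch:

```javascript
// Sketch of the Aggregation and Split logic (n8n Function node).
// Field names follow the categorization step above; the dedupe key
// and summary format are illustrative assumptions.
const seen = new Set();
const counts = {};

for (const item of items) {
  const key = (item.json.text || '').slice(0, 120); // crude dedupe on leading text
  if (seen.has(key)) continue;
  seen.add(key);
  const cat = item.json.painpoint || 'Other';
  counts[cat] = (counts[cat] || 0) + 1;
}

// One human-readable line per category, e.g. "Call Hold/Availability: 14 mentions".
const summaryString = Object.entries(counts)
  .sort((a, b) => b[1] - a[1])
  .map(([cat, n]) => `${cat}: ${n} mentions`)
  .join('\n');

return [{ json: { LLMSUMMARYHOLDER: summaryString, totalItems: seen.size } }];
```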

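And for the Data Integrity item, a sketch of the serial-route retrieval itself; the upstream node name 'Aggregate & Split' is an assumption, while $items() and LLMSUMMARYHOLDER come from the description above:

```javascript
// Sketch of the Function (Get LLM Summary) node on the serial route.
// $items('Node Name') reads a named node's output even though the Google
// Sheets logging node sits between that node and this one.
// The node name 'Aggregate & Split' is an assumption for illustration.
const upstream = $items('Aggregate & Split');
const summaryString = upstream[0].json.LLMSUMMARYHOLDER;

// Hand the Anthropic node a single clean item containing only the brief input.
return [{ json: { summaryString } }];
```

The $items() helper is what makes the serial route safe: the summary survives the intermediate logging step without having to be re-mapped through it.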
By Bhuvanesh R
296 views