Alexandra Spalato
I'm an AI automation consultant with over a decade of experience in web development. I help businesses streamline their marketing and operations by building custom n8n workflows that save time and boost results. Ready to see what automation can do for your business? Use my link to book your initial consultation.
Templates by Alexandra Spalato
LinkedIn lead generation: Auto DM system with comment triggers using Unipile & NocoDB
Short Description
This LinkedIn automation workflow monitors post comments for specific trigger words and automatically sends direct messages with lead magnets to engaged users. The system checks connection status, handles non-connected users with connection requests, and prevents duplicate outreach by tracking all interactions in a database.

Key Features
- Comment Monitoring: Scans LinkedIn post comments for customizable trigger words
- Connection Status Check: Determines whether users are 1st-degree connections
- Automated DMs: Sends personalized messages with lead magnet links to connected users
- Connection Requests: Asks non-connected users to connect via comment replies
- Duplicate Prevention: Tracks interactions in NocoDB to avoid repeat messages
- Message Rotation: Uses different comment reply variations for authenticity
- Batch Processing: Handles multiple comments with built-in delays

Who This Workflow Is For
- Content creators looking to convert post engagement into leads
- Coaches and consultants sharing valuable LinkedIn content
- Anyone wanting to automate lead capture from LinkedIn posts

How It Works
1. Setup: Configure the post ID, trigger word, and lead magnet link via a form
2. Comment Extraction: Retrieves all comments from the specified post using Unipile
3. Trigger Detection: Filters comments containing the specified trigger word
4. Connection Check: Determines whether commenters are 1st-degree connections
5. Smart Routing: Connected users receive DMs; others get connection requests (see the routing sketch below)
6. Database Logging: Records all interactions to prevent duplicates

Setup Requirements
Required Credentials
- Unipile API Key: For LinkedIn API access
- NocoDB API Token: For database tracking

Database Structure
Table: leads
- linkedin_id: LinkedIn user ID
- name: User's full name
- headline: LinkedIn headline
- url: Profile URL
- date: Interaction date
- posts_id: Post reference
- connection_status: Network distance
- dm_status: Interaction type (sent / connection request)

Customization Options
- Message Templates: Modify the DM and connection request messages
- Trigger Words: Change the words that activate the workflow
- Timing: Adjust delays between messages (8-12 seconds by default)
- Reply Variations: Add more comment reply options for authenticity

Installation Instructions
1. Import the workflow into your n8n instance
2. Set up the NocoDB database with the required table structure
3. Configure Unipile and NocoDB credentials
4. Set environment variables for the Unipile root URL and LinkedIn account ID
5. Test with a sample post before full use
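As a rough illustration of the routing step, here is a minimal TypeScript sketch of how a comment could be matched against the trigger word and routed by connection degree. The field names (`text`, `authorId`, `networkDistance`), the trigger word, and the reply variations are assumptions for illustration, not the exact names used by the Unipile response or by this workflow's nodes.

```typescript
interface Comment {
  authorId: string;           // assumed field names, for illustration only
  authorName: string;
  text: string;
  networkDistance: 1 | 2 | 3; // 1 = 1st-degree connection
}

const TRIGGER_WORD = "guide";                                // configured via the setup form
const LEAD_MAGNET_LINK = "https://example.com/lead-magnet";  // placeholder link

// Rotating between a few reply variations keeps comment replies from feeling robotic.
const REPLY_VARIATIONS = [
  "Just sent you a connection request so I can DM you the link!",
  "Sending you an invite now - accept it and I'll share the link.",
];

type Action =
  | { kind: "dm"; to: string; message: string }
  | { kind: "connection_request"; to: string; reply: string };

function routeComment(comment: Comment, index: number): Action | null {
  // Trigger detection: only act on comments containing the trigger word.
  if (!comment.text.toLowerCase().includes(TRIGGER_WORD)) return null;

  if (comment.networkDistance === 1) {
    // 1st-degree connections receive the lead magnet directly by DM.
    return {
      kind: "dm",
      to: comment.authorId,
      message: `Hi ${comment.authorName}, thanks for commenting! Here is the guide: ${LEAD_MAGNET_LINK}`,
    };
  }

  // Everyone else gets a connection request plus a rotated comment reply.
  return {
    kind: "connection_request",
    to: comment.authorId,
    reply: REPLY_VARIATIONS[index % REPLY_VARIATIONS.length],
  };
}
```

In the actual workflow this logic is spread across n8n Filter, IF, and HTTP Request nodes rather than a single function.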
Find B2B decision maker emails & build lead database with Serper.dev & AnyMailFinder
Companies Email Finder & Lead Generation Automation

Short Description
Automatically find company domains, extract decision maker emails (CEO, Sales, Marketing), validate email quality, and build a comprehensive prospect database using AI-powered search and professional email-finding APIs.

Detailed Description
This lead generation workflow transforms a simple list of company names into a complete prospect database with verified decision maker contacts. The system automatically discovers official company websites, finds key decision makers' email addresses, validates email quality, and organizes everything in a structured database ready for outreach. Perfect for sales teams, marketing agencies, business developers, and anyone who needs to build high-quality prospect lists efficiently and cost-effectively.

Key Features
- Intelligent Domain Discovery: Uses Serper.dev and AI to find official company websites from search results
- Multi-Role Email Finding: Automatically extracts emails for CEOs and C-level executives, sales decision makers, and marketing decision makers
- Email Quality Validation: Classifies emails as "valid" or "risky" for better deliverability
- Smart Fallback System: Searches for additional company emails when decision makers aren't found
- Duplicate Prevention: Removes duplicate contacts automatically
- Batch Processing: Handles large company lists efficiently with intelligent batching
- Database Integration: Stores all data in NocoDB with proper organization and status tracking
- Rate Limiting: Includes delays and error handling to respect API limits

Who This Workflow Is For
- Sales Teams: Building targeted prospect lists for outbound campaigns
- Marketing Agencies: Creating lead databases for client campaigns
- Business Development: Finding decision makers for partnership opportunities
- Recruiters: Locating hiring managers and HR contacts
- Entrepreneurs: Building contact lists for product launches or fundraising
- Lead Generation Services: Automating prospect research for clients

Problems This Workflow Solves
- Manual Research Time: Eliminates hours of manual company and contact research
- Incomplete Contact Data: Ensures you have decision makers, not just generic emails
- Email Deliverability Issues: Validates email quality before outreach campaigns
- Data Organization: Maintains clean, structured prospect databases
- Scaling Bottlenecks: Processes hundreds of companies automatically
- High Lead Generation Costs: Reduces dependency on expensive lead generation services

Setup Requirements
Required API Credentials
- Serper.dev API Key: For company domain search and discovery
- OpenAI API Key: For intelligent domain extraction from search results
- AnyMailFinder API Key: For decision maker email discovery and validation
- NocoDB API Token: For database storage and management

Database Structure
Companies Table:
- Id (Number): Unique company identifier
- company_name (Text): Company name to search
- location (Text): Company location for better search results
- url (URL): Discovered company website
- domain (Text): Extracted domain name
- status (Select): Processing status tracking
- emails (Text): All discovered company emails
- companyemailsstatus (Text): Email validation status

Contacts Table:
- companies_id (Number): Link to parent company
- name (Text): Contact full name
- position (Text): Job title/role
- email (Email): Contact email address
- email_status (Text): Email validation status
- linkedin_url (URL): LinkedIn profile (when available)
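For readers who prefer to see the schema as types, here is a minimal TypeScript sketch of the two tables described above. It mirrors the field names listed in the description; the exact NocoDB column options (for example the values of the status Select field) are assumptions.

```typescript
// Sketch of the NocoDB schema described above; union values are illustrative only.
interface Company {
  Id: number;                      // unique company identifier
  company_name: string;            // company name to search
  location: string;                // used to sharpen the domain search
  url?: string;                    // discovered company website
  domain?: string;                 // extracted domain name
  status?: "pending" | "domain_found" | "emails_found" | "no_emails"; // assumed options
  emails?: string;                 // fallback: all discovered company emails
  companyemailsstatus?: string;    // validation status of the bulk company emails
}

interface Contact {
  companies_id: number;            // link to parent company
  name: string;
  position: string;                // e.g. "CEO", "Head of Sales", "Marketing Director"
  email: string;
  email_status: "valid" | "risky"; // as classified during email validation
  linkedin_url?: string;           // when available
}
```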
System Requirements
- Active n8n instance (self-hosted or cloud)
- NocoDB database instance
- Active API subscriptions for Serper.dev, OpenAI, and AnyMailFinder

How It Works
Phase 1: Domain Discovery
1. Company Processing: Retrieves companies from the database in batches
2. Domain Search: Uses Serper.dev to search for official company websites
3. AI Domain Extraction: OpenAI analyzes search results to identify official domains
4. Database Updates: Stores discovered domains and URLs

Phase 2: Decision Maker Discovery
1. Multi-Role Search: Finds emails for CEO, Sales, and Marketing decision makers using AnyMailFinder
2. Email Validation: Validates email deliverability and flags risky addresses
3. Contact Creation: Stores validated contacts with full details
4. Status Tracking: Updates company status based on email discovery success

Phase 3: Company Email Backup
1. Gap Analysis: Identifies companies with no valid decision maker emails
2. Bulk Email Search: Finds up to 20 additional company emails using AnyMailFinder
3. Final Updates: Stores all discovered emails for comprehensive coverage

Customization Options
Search Parameters
- Modify search queries for better domain discovery using Serper.dev
- Adjust location-based search parameters
- Customize AI prompts for domain extraction

Decision Maker Roles
- Add new decision maker categories (HR, Finance, Operations, etc.)
- Modify existing role search parameters in AnyMailFinder
- Customize email validation criteria

Data Processing
- Adjust batch sizes for different processing speeds
- Modify rate limiting delays
- Customize error handling and retry logic

Database Schema
- Add custom fields for industry, company size, etc.
- Integrate with different database systems
- Customize data validation rules

API Costs and Credits
- AnyMailFinder: 2 credits per valid email found, 1 credit per bulk company search
- Serper.dev: ~$1 per 1,000 searches
- OpenAI: Minimal costs for domain extraction prompts
- Estimated Cost: about $0.03 per company processed (depending on email discovery success)

Benefits
- Save 20+ Hours Weekly: Automate prospect research that takes hours manually
- Higher Quality Leads: Get decision makers, not generic contact@ emails
- Better Deliverability: Email validation reduces bounce rates
- Scalable Process: Handle thousands of companies automatically
- Cost Effective: Much cheaper than traditional lead generation services
- Complete Database: Build comprehensive prospect databases with all contact details

Use Cases
- Outbound Sales Campaigns: Build targeted prospect lists for cold outreach
- Partnership Development: Find decision makers at potential partner companies
- Market Research: Understand company structures and contact hierarchies
- Recruitment: Locate hiring managers and HR contacts
- Investor Relations: Find contacts at potential investor companies
- Vendor Outreach: Identify procurement and operations contacts

Installation Instructions
1. Import the workflow JSON into your n8n instance
2. Set up the NocoDB database with the required table structures
3. Configure all API credentials in the credential manager (including Serper.dev and AnyMailFinder)
4. Update the NocoDB connection settings with your database details
5. Test with a small batch of companies before full deployment
6. Monitor API usage and adjust batch sizes as needed

Best Practices
- Start with high-quality company names and locations
- Monitor AnyMailFinder credit usage to manage costs
- Use Serper.dev efficiently with targeted search queries
- Regularly clean and validate your prospect database
- Respect email deliverability best practices
- Follow GDPR and data privacy regulations
- Use rate limiting to avoid API restrictions

Error Handling
- Built-in retry mechanisms for API failures
- Continues processing even if individual companies fail (a minimal batching and retry sketch follows below)
- Optionally, create an n8n Error Workflow to be notified when a run fails
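The exact retry and rate-limiting behavior lives inside the workflow's nodes, but conceptually it looks like this minimal TypeScript sketch. The batch size, delay, and retry count are illustrative values, not the workflow's actual settings.

```typescript
const BATCH_SIZE = 10;   // illustrative; tune to your API limits
const DELAY_MS = 2000;   // pause between batches to respect rate limits
const MAX_RETRIES = 3;

const sleep = (ms: number) => new Promise((resolve) => setTimeout(resolve, ms));

// Retry a failing API call a few times with a simple backoff.
async function withRetries<T>(fn: () => Promise<T>): Promise<T> {
  let lastError: unknown;
  for (let attempt = 1; attempt <= MAX_RETRIES; attempt++) {
    try {
      return await fn();
    } catch (err) {
      lastError = err;
      await sleep(DELAY_MS * attempt);
    }
  }
  throw lastError;
}

// Process companies in batches; a failure on one company is logged and skipped
// so the rest of the run keeps going.
async function processCompanies(
  companies: { Id: number; company_name: string }[],
  processOne: (c: { Id: number; company_name: string }) => Promise<void>,
): Promise<void> {
  for (let i = 0; i < companies.length; i += BATCH_SIZE) {
    const batch = companies.slice(i, i + BATCH_SIZE);
    for (const company of batch) {
      try {
        await withRetries(() => processOne(company));
      } catch (err) {
        console.error(`Skipping company ${company.Id} after retries:`, err);
      }
    }
    await sleep(DELAY_MS); // rate limiting between batches
  }
}
```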
Repurpose YouTube videos to multiple content types with OpenRouter AI and Airtable
YouTube Content Repurposing Automation

Who's it for
This workflow is for content creators, marketers, agencies, coaches, and businesses who want to maximize their YouTube content ROI by automatically generating multiple content assets from single videos. It's especially useful for professionals who want to:
- Repurpose YouTube videos into blogs, social posts, newsletters, and tutorials without manual effort
- Scale their content production across multiple channels and platforms
- Create consistent, high-quality content derivatives while saving time and resources
- Build automated content systems that generate multiple revenue streams
- Maintain an active presence across social media, email, and blog platforms simultaneously

What problem is this workflow solving
Content creators face significant challenges when trying to maximize their video content:
- Time-intensive manual repurposing: Converting one YouTube video into multiple content formats traditionally requires hours of manual writing, editing, and formatting across different platforms.
- Inconsistent content quality: Manual repurposing often leads to varying quality levels and missed opportunities to optimize content for specific platforms.
- High costs for content services: Hiring ghostwriters or content agencies to repurpose videos can cost thousands of dollars monthly.
- Scaling bottlenecks: Manual processes prevent creators from efficiently scaling their content across multiple channels and formats.
This workflow solves these problems by automatically extracting YouTube video transcripts, using AI to generate multiple high-quality content formats (tutorials, blog posts, social media content, newsletters), and organizing everything in Airtable for easy management and distribution.

How it works
Automated Video Processing
Starts with a manual trigger and retrieves YouTube URLs from your Airtable configuration, processing only videos marked as "selected" while filtering out those marked for deletion.

Intelligent Transcript Extraction
Uses the Scrape Creators API to extract video transcripts, automatically cleaning and formatting the text for optimal AI processing and content generation (see the request sketch below).

Multi-Format Content Generation
Leverages OpenRouter models, so you can easily test different AI models and choose the one that delivers the best results for your needs:
- Step-by-step tutorials with code snippets and technical details
- YouTube scripts with hooks, titles, and conclusions
- Blog posts optimized for lead generation
- Structured summaries with key takeaways
- LinkedIn posts with engagement triggers
- Newsletter content for email marketing
- Twitter/X posts for social media

Smart Content Filtering
Processes only the content types you've selected in Airtable, ensuring efficient resource usage and faster execution times.

Automated Content Organization
Matches and combines all generated content pieces by URL, then updates your Airtable with complete, ready-to-use content assets organized by type and source video.
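As an illustration of the transcript step, here is a minimal TypeScript sketch of a request to the Scrape Creators endpoint configured later in the setup section. The query parameter name (`url`), the `x-api-key` header, and the response field names are assumptions; check the Scrape Creators documentation for the exact request and response format.

```typescript
// Hypothetical request shape; verify parameter, header, and response field names
// against the Scrape Creators docs before relying on this.
async function fetchTranscript(videoUrl: string, apiKey: string): Promise<string> {
  const endpoint = new URL("https://api.scrapecreators.com/v1/youtube/video/transcript");
  endpoint.searchParams.set("url", videoUrl);        // assumed parameter name

  const res = await fetch(endpoint, {
    headers: { "x-api-key": apiKey },                // assumed auth header
  });
  if (!res.ok) throw new Error(`Transcript request failed: ${res.status}`);

  const data = await res.json();
  // The workflow then cleans and joins the transcript into one block of text
  // before passing it to the AI nodes; the field name below is assumed.
  return (data.transcript ?? []).map((seg: { text: string }) => seg.text).join(" ");
}
```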
How to set up
Required credentials
- OpenRouter API key
- Airtable Personal Access Token
- Scrape Creators API Key - For YouTube transcript extraction and processing

Airtable base setup
Create an Airtable base with one main table:

Videos Table:
- title (Single line text): Video title for reference
- url (URL): YouTube video URL to process
- Status (Single select): Options: "selected", "delete", "processed"
- output (Multiple select): Content types to generate - options: summary, tutorial, blog-post, linkedin, newsletter, tweeter, youtube
- summary (Long text): Generated video summary
- tutorial (Long text): Generated step-by-step tutorial
- keytakeaways (Long text): Extracted key insights
- blog_post (Long text): Generated blog post content
- linkedin (Long text): LinkedIn post content
- newsletter (Long text): Email newsletter content
- tweeter (Long text): Twitter/X post content
- youtube_titles (Long text): YouTube video title suggestions
- youtube_hook (Long text): Video opening hooks
- youtube_steps (Long text): Video step breakdowns
- youtube_conclusion (Long text): Video ending/CTAs

API Configuration
Scrape Creators Setup:
1. Sign up for the Scrape Creators API
2. Obtain your API key from the dashboard
3. Configure the HTTP Request node with your credentials
4. Set the endpoint to: https://api.scrapecreators.com/v1/youtube/video/transcript

OpenRouter Setup:
1. Create an OpenRouter account and generate an API key

Workflow Configuration
1. Import the workflow JSON into your n8n instance
2. Update all credential references with your API keys
3. Configure the Airtable nodes with your base and table IDs
4. Test the workflow with a single video URL first

Requirements
- n8n instance (self-hosted or cloud)
- Active API subscriptions for OpenRouter (or the LLM of your choice), Airtable, and Scrape Creators
- YouTube video URLs - Must be publicly accessible videos with available transcripts
- Airtable account - Free tier sufficient for most use cases

How to customize the workflow
Modify content generation prompts
Edit the LLM Chain nodes to customize content style and format:
- Tutorial node: Adjust technical depth and formatting preferences
- Blog post node: Modify tone, length, and CTA strategies
- LinkedIn node: Customize engagement hooks and professional tone
- Newsletter node: Tailor subject lines and email marketing approach

Adjust AI model selection
Update the OpenRouter Chat Model node to use different models (a request sketch follows below).

Add new content formats
Create additional LLM Chain nodes for new content types:
- Instagram captions
- TikTok scripts
- Podcast descriptions
- Course outlines
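Because OpenRouter exposes an OpenAI-compatible chat completions endpoint, swapping models is mostly a matter of changing the model slug. The sketch below is a minimal example assuming an illustrative model slug and prompt; the workflow itself does this through the OpenRouter Chat Model node rather than raw HTTP calls.

```typescript
// Minimal sketch of an OpenRouter chat completion call (model slug is an example).
async function generateBlogPost(transcript: string, apiKey: string): Promise<string> {
  const res = await fetch("https://openrouter.ai/api/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${apiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "anthropic/claude-3.5-sonnet", // example slug; swap it to test other models
      messages: [
        { role: "system", content: "You turn video transcripts into blog posts optimized for lead generation." },
        { role: "user", content: `Write a blog post based on this transcript:\n\n${transcript}` },
      ],
    }),
  });
  if (!res.ok) throw new Error(`OpenRouter request failed: ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}
```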
Find engagement opportunities from Skool communities using Apify & GPT-4.1
Who's it for
This workflow is for community builders, marketers, consultants, coaches, and thought leaders who want to grow their presence in Skool communities through strategic, value-driven engagement. It's especially useful for professionals who want to:
- Build authority in their niche by providing helpful insights
- Scale their community engagement without spending hours manually browsing posts
- Identify high-value conversation opportunities that align with their expertise
- Maintain an authentic, helpful presence across multiple Skool communities

What problem is this workflow solving
Many professionals struggle to consistently engage meaningfully in online communities due to:
- Time constraints: Manually browsing multiple communities daily is time-consuming
- Missed opportunities: Important discussions happen when you're not online
- Inconsistent engagement: Sporadic participation reduces visibility and relationship building
- Generic responses: Quick replies often lack the depth needed to showcase expertise
This workflow solves these problems by automatically monitoring your target Skool communities, using AI to identify posts where your expertise could add genuine value, generating thoughtful contextual comment suggestions, and organizing opportunities for efficient manual review and engagement.

How it works
Scheduled Community Monitoring
Runs daily at 7 PM to scan your configured Skool communities for new posts and discussions from the last 24 hours.

Intelligent Configuration Management
- Pulls settings from Airtable, including target communities, your domain expertise, and preferred tools
- Supports several configurations
- Filters for active configurations only
- Processes multiple community URLs efficiently

Comprehensive Data Extraction
Uses the Apify Skool Scraper to collect:
- Post content and metadata
- Comments over 50 characters (quality filter)
- Direct links for easy access

AI-Powered Opportunity Analysis
Leverages OpenAI GPT-4.1 to:
- Analyze each post for engagement opportunities based on your expertise
- Identify specific trigger sentences that indicate a need you can address
- Generate contextual, helpful comment suggestions
- Maintain an authentic tone without being promotional

Smart Filtering and Organization
- Only surfaces genuine opportunities where you can add value
- Stores results in Airtable with detailed reasoning
- Provides suggested comments ready for review and posting
- Tracks engagement history to avoid duplicate responses

Quality Control and Review
All opportunities are saved to Airtable where you can:
- Review AI reasoning and suggested responses
- Edit comments before posting
- Track which opportunities you've acted on
- Monitor success patterns over time

How to set up
Required credentials
- OpenAI API key - For GPT-4.1 powered opportunity analysis
- Airtable Personal Access Token - For configuration and results storage
- Apify API token - For Skool community scraping

Airtable base setup
Create an Airtable base with two tables:

Config Table (config):
- Name (Single line text): Your configuration name
- Skool URLs (Long text): Comma-separated list of Skool community URLs
- cookies (Long text): Your Skool session cookies for authenticated access
- Domain of Activity (Single line text): Your area of expertise (e.g., "AI automation", "Digital marketing")
- Tools Used (Single line text): Your preferred tools to recommend (e.g., "n8n", "Zapier")
- active (Checkbox): Whether this configuration is currently active

Results Table (Table 1):
- title (Single line text): Post title/author
- url (URL): Direct link to the post
- reason (Long text): AI's reasoning for the opportunity
- trigger (Long text): Specific sentence that triggered the opportunity
- suggested answer (Long text): AI-generated comment suggestion
- config (Link to another record): Reference to the config used
- date (Date): When the opportunity was found
- Select (Single select): Status tracking (not commented/commented)

Skool cookies setup
To access private Skool communities, you'll need to:
1. Install Cookie Editor: Go to the Chrome Web Store and install the "Cookie Editor" extension
2. Login to Skool: Navigate to any Skool community you want to monitor and log in
3. Open Cookie Editor: Click the Cookie Editor extension icon in your browser toolbar
4. Export cookies: Click the "Export" button in the extension and copy the exported text
5. Add to Airtable: Paste the cookie string into the cookies field in your Airtable config

Trigger configuration
- Ensure the Schedule Trigger is set to your preferred monitoring time
- Default is 7 PM daily, but adjust based on your target communities' peak activity

Requirements
- Self-hosted n8n or n8n Cloud account
- Active Skool community memberships - You must be a legitimate member of communities you want to monitor
- OpenAI API credits
- Apify subscription - For reliable Skool data scraping (free tier available)
- Airtable account - Free tier sufficient for most use cases

How to customize the workflow
Modify AI analysis criteria
Edit the EvaluateOpportunities And Generate Comments node to:
- Adjust the opportunity detection sensitivity
- Modify the comment tone and style
- Add industry-specific keywords or phrases

Change monitoring frequency
Update the Schedule Trigger to:
- Multiple times per day for highly active communities
- Weekly for slower-moving professional groups
- Custom intervals based on community activity patterns

Customize data collection
Modify the Apify scraper settings to:
- Adjust the time window (currently 24 hours)
- Change comment length filters (currently >50 characters)
- Include/exclude media content
- Modify the number of comments per post
(A minimal filtering sketch appears at the end of this section.)

Add additional filters
Insert filter nodes to:
- Skip posts from specific users
- Focus on posts with minimum engagement levels
- Exclude certain post types or keywords
- Prioritize posts from influential community members

Enhance output options
Add nodes after Record Results to:
- Send Slack/Discord notifications for high-priority opportunities
- Create calendar events for engagement tasks
- Export daily summaries to Google Sheets
- Integrate with CRM systems for lead tracking

Example outputs
Opportunity analysis result
```json
{
  "opportunity": true,
  "reason": "The user is struggling with manual social media management tasks that could be automated using n8n workflows.",
  "trigger_sentence": "I'm spending 3+ hours daily just scheduling posts and responding to comments across all my social accounts.",
  "suggested_comment": "That sounds exhausting! Have you considered setting up automation workflows? Tools like n8n can handle the scheduling and even help with response suggestions, potentially saving you 80% of that time. The initial setup takes a day but pays dividends long-term."
}
```
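A minimal sketch of the kind of call that produces output in this shape, assuming the OpenAI Chat Completions API with JSON output; the actual prompt lives in the EvaluateOpportunities And Generate Comments node and is more detailed than this simplified version.

```typescript
interface Opportunity {
  opportunity: boolean;
  reason: string;
  trigger_sentence: string;
  suggested_comment: string;
}

// Simplified illustration of the analysis call; the real prompt includes the
// full post plus its qualifying comments and the engagement guidelines.
async function evaluatePost(
  post: string,
  domain: string,   // e.g. "AI automation", from the Config table
  tools: string,    // e.g. "n8n", from the Config table
  apiKey: string,
): Promise<Opportunity> {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: { Authorization: `Bearer ${apiKey}`, "Content-Type": "application/json" },
    body: JSON.stringify({
      model: "gpt-4.1",
      response_format: { type: "json_object" },
      messages: [
        {
          role: "system",
          content:
            `You are an expert in ${domain} who recommends ${tools} only when genuinely helpful. ` +
            `Return JSON with keys: opportunity, reason, trigger_sentence, suggested_comment.`,
        },
        { role: "user", content: post },
      ],
    }),
  });
  if (!res.ok) throw new Error(`OpenAI request failed: ${res.status}`);
  const data = await res.json();
  return JSON.parse(data.choices[0].message.content) as Opportunity;
}
```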
Airtable record example
- Title: "Sarah Johnson - Social Media Burnout"
- URL: https://www.skool.com/community/post/123456
- Reason: "User expressing pain point with manual social media management - perfect fit for automation solutions"
- Trigger: "I'm spending 3+ hours daily just scheduling posts..."
- Suggested Answer: "That sounds exhausting! Have you considered setting up automation workflows?..."
- Config: [Your Config Name]
- Date: 2024-12-09 19:00:00
- Status: "not commented"

Best practices
Authentic engagement
- Always review and personalize AI suggestions before posting
- Focus on being genuinely helpful rather than promotional
- Share experiences and ask follow-up questions
- Engage in the subsequent conversation when people respond

Community guidelines
- Respect each community's rules and culture
- Avoid over-promotion of your tools or services
- Build relationships before introducing solutions
- Contribute value consistently, not just when selling

Optimization tips
- Monitor which types of opportunities convert best
- A/B test different comment styles and approaches
- Track engagement metrics on your actual comments
- Adjust AI prompts based on community feedback
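For reference, the quality filters mentioned under "Customize data collection" boil down to something like this minimal TypeScript sketch. The field names are assumptions about the scraper output, not the Apify actor's exact schema.

```typescript
interface SkoolPost {
  title: string;
  createdAt: string;                  // assumed ISO timestamp field
  comments: { text: string }[];
}

const TIME_WINDOW_MS = 24 * 60 * 60 * 1000; // last 24 hours
const MIN_COMMENT_LENGTH = 50;              // quality filter from the description

function filterPosts(posts: SkoolPost[], now = Date.now()): SkoolPost[] {
  return posts
    // keep only posts from the monitoring window
    .filter((post) => now - new Date(post.createdAt).getTime() <= TIME_WINDOW_MS)
    // drop short, low-signal comments before anything is sent to the AI step
    .map((post) => ({
      ...post,
      comments: post.comments.filter((c) => c.text.length > MIN_COMMENT_LENGTH),
    }));
}
```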
Scrape Google Maps leads and find emails with Apify and Anymailfinder
Short Description
This workflow automates lead generation by scraping business data from Google Maps using Apify, enriching it with verified email addresses via Anymailfinder, and storing the results in a NocoDB database. It's designed to prevent duplicates by checking against existing records before saving new leads.

Key Features
- Automated Scraping: Kicks off a Google Maps search based on your query, city, and country.
- Email Enrichment: For businesses with a website, it automatically finds professional email addresses.
- Data Cleaning: Cleans website URLs to extract the root domain, ignoring social media links.
- Duplicate Prevention: Checks against existing entries in NocoDB using the Google placeId to avoid adding the same lead twice.
- Structured Storage: Saves enriched lead data into a structured NocoDB database.
- Batch Processing: Efficiently handles and loops through all scraped results.

Who This Workflow Is For
- Sales Teams looking for a source of local business leads.
- Marketing Agencies building outreach campaigns for local clients.
- Business Developers prospecting for new partnerships.
- Freelancers seeking clients in specific geographical areas.

How It Works
1. Trigger: The workflow starts when you submit the initial form with a business type (e.g., "plumber"), a city, a country code, and the number of results you want.
2. Scrape Google Maps: It sends the query to Apify to scrape Google Maps for matching businesses.
3. Process Leads: The workflow loops through each result one by one.
4. Clean Data: It extracts the main website domain from the URL provided by Google Maps (see the domain-cleaning sketch below).
5. Check for Duplicates: It queries your NocoDB database to see if the business (placeId) has already been saved. If so, it skips to the next lead.
6. Find Emails: If a valid website domain exists, it uses Anymailfinder to find associated email addresses.
7. Store Lead: The final data, including the business name, address, phone, website, and any found emails, is saved as a new row in your NocoDB table.

Setup Requirements
Required Credentials
- Apify API Key: To use the Google Maps scraping actor.
- Anymailfinder API Key: For email lookup.
- NocoDB API Token: To connect to your database for storing and checking leads.

Database Structure
You need to create a table in your NocoDB instance with the following columns. The names should match exactly.

Table: leads (or your preferred name)
- title (SingleLineText)
- website (Url)
- phone (PhoneNumber)
- email (Email)
- email_validation (SingleLineText)
- address (LongText)
- neighborhood (SingleLineText)
- rating (Number)
- categories (LongText)
- city (SingleLineText)
- country (SingleLineText)
- postal code (SingleLineText)
- domain (Url)
- placeId (SingleLineText) - Important for duplicate checking
- date (Date)

Customization Options
- Change Trigger: Replace the manual Form Trigger with a Schedule Trigger to run searches automatically, or an HTTP Request node to start it from another application.
- Modify Scraper Parameters: In the "Scrape Google Maps" node, you can adjust the Apify actor's JSON input to change language, include reviews, or customize other advanced settings.
- Use a Different Database: Replace the NocoDB nodes with nodes for Google Sheets, Baserow, Airtable, or any SQL database to store your leads.
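The domain-cleaning step referenced in "How It Works" roughly corresponds to this minimal TypeScript sketch; the list of ignored social domains and the root-domain heuristic are illustrative rather than the workflow's exact logic.

```typescript
// Social/profile sites that should not be treated as a company's own domain.
const IGNORED_HOSTS = ["facebook.com", "instagram.com", "linkedin.com", "x.com", "twitter.com"];

// Returns the root domain of a business website, or null if the URL is
// missing, invalid, or points to a social media profile.
function cleanDomain(websiteUrl?: string): string | null {
  if (!websiteUrl) return null;
  try {
    const host = new URL(websiteUrl).hostname.replace(/^www\./, "");
    if (IGNORED_HOSTS.some((social) => host === social || host.endsWith(`.${social}`))) {
      return null;
    }
    // Keep only the last two labels as a simple root-domain heuristic
    // (good enough for .com; multi-part TLDs like .co.uk would need more care).
    return host.split(".").slice(-2).join(".");
  } catch {
    return null;
  }
}

// cleanDomain("https://www.acme-plumbing.com/contact") -> "acme-plumbing.com"
// cleanDomain("https://www.facebook.com/acmeplumbing")  -> null
```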
Installation Instructions
1. Import the workflow into your n8n instance.
2. Create the required table structure in your NocoDB instance as detailed above.
3. Configure the credentials for Apify, Anymailfinder, and NocoDB in the respective nodes.
4. In the two NocoDB nodes ("Get all the recorded placeIds" and "Create a row"), select your project and table from the dropdown menus.
5. Activate the workflow. You can now run it by filling out the form in the n8n UI.
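Conceptually, the duplicate check from step 5 of "How It Works" behaves like this minimal sketch: the recorded placeIds are loaded once, then each scraped result is skipped if its placeId is already known. Field names mirror the table above; the NocoDB query itself is handled by the "Get all the recorded placeIds" node in the workflow.

```typescript
interface ScrapedPlace {
  placeId: string;
  title: string;
  website?: string;
}

// Build a fast lookup set from the placeIds already stored in NocoDB.
function buildKnownIds(existingRows: { placeId: string }[]): Set<string> {
  return new Set(existingRows.map((row) => row.placeId));
}

// Keep only leads that have not been saved before.
function filterNewLeads(results: ScrapedPlace[], known: Set<string>): ScrapedPlace[] {
  return results.filter((place) => !known.has(place.placeId));
}
```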