AI Personal Assistant
Email Personal Assistant - Comprehensive Communication Manager

This automation flow proactively monitors email, calendar, and Slack communications, analyzes priorities across all channels, and generates a comprehensive daily briefing with actionable tasks for executive productivity management.

How It Works (Step-by-Step)

Automated Daily Trigger: runs automatically on weekdays
- Scheduled execution every weekday at 8:00 AM
- Manual trigger available for on-demand analysis
- Comprehensive daily communication audit

Email Assistant Agent: analyzes inbox priorities and context
- Scans unread emails across "To Respond" and "FYI" labels
- Checks email history to determine relationship context
- Identifies company-related opportunities and partnerships
- Categorizes emails by urgency (High, Medium, Low)
- Cross-references with sent emails for follow-up context

Follow-Up Assistant Agent: monitors meeting follow-up requirements
- Reviews the last 3 days of calendar meetings
- Fetches Fireflies transcripts for recorded sessions
- Identifies meetings without post-meeting communication
- Flags meetings requiring action items or follow-ups
- Checks sent emails and Slack for completed follow-ups

Slack Assistant Agent: tracks Slack communication priorities
- Monitors direct messages and @mentions
- Identifies unreplied Slack conversations
- Cross-references with email and calendar context
- Prioritizes responses based on sender importance
- Checks for threaded conversations requiring attention

Master Orchestrator Agent: synthesizes all communication data
- Combines reports from all three assistant agents
- Cross-references with the existing Google Sheets to-do list
- Prioritizes tasks by urgency and business impact
- Identifies correlations between different communication channels
- Creates a comprehensive daily action plan

Task Management Integration: automated tracking and delivery
- Appends new tasks to the Google Sheets to-do tracker (see the sketch below)
- Sends a personalized daily briefing via Slack DM
- Maintains conversation memory for context continuity
- Tracks outstanding vs. completed items

Tools Used
- n8n: workflow orchestration and scheduling
- Claude Sonnet 4 & Opus 4: multi-agent AI analysis
- Gmail API: email monitoring and history checking
- Google Calendar: meeting tracking and scheduling
- Slack API: message monitoring and user management
- Fireflies API: meeting transcript analysis
- Google Sheets: task tracking and persistence

Key Features
- Multi-channel communication monitoring (Email, Calendar, Slack)
- AI-powered priority assessment and context analysis
- Cross-platform relationship tracking and history
- Automated daily briefing generation and delivery
- Persistent task tracking with Google Sheets integration
- Meeting follow-up verification and flagging
- Conversation memory for continuity across sessions

Ideal Use Cases
- C-level executives managing multiple communication channels
- Sales leaders tracking prospect interactions and follow-ups
- Business development professionals managing partnerships
- Busy professionals needing communication prioritization
- Teams requiring systematic follow-up management
- Anyone wanting automated daily productivity briefings
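For illustration, here is a minimal sketch of what a row in the Google Sheets to-do tracker could look like. The column names and values below are assumptions for clarity, not taken from the template:

```typescript
// Hypothetical shape of a task row the Master Orchestrator appends to the
// Google Sheets to-do tracker; adjust column names to your own sheet.
interface BriefingTask {
  date: string;                            // ISO date of the briefing run
  source: "email" | "calendar" | "slack";  // channel the task originated from
  priority: "High" | "Medium" | "Low";     // urgency assigned by the agents
  summary: string;                         // one-line description of the action
  status: "outstanding" | "completed";     // tracked across runs for continuity
}

const exampleTask: BriefingTask = {
  date: "2024-05-13",
  source: "email",
  priority: "High",
  summary: "Reply to partnership proposal from Acme Corp", // illustrative
  status: "outstanding",
};
```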
Simple expense tracker with n8n chat, AI agent and Google Sheets
Use Case

Adding expenses via a simple chat message is very convenient, and this workflow does exactly that using AI-powered n8n magic! Send a chat message such as "car wash; 59.3 usd; 25 jan 2024" and get a response:

Your expense saved, here is the output of save sub-workflow: {"cost":59.3,"descr":"car wash","date":"2024-01-25","msg":"car wash; 59.3 usd; 25 jan 2024"}

The LLM smartly parses your message into structured JSON (a sketch of the expected fields follows below) and saves the expense as a new row in the Google Sheet!

Installation

1. Set up Google Sheets: clone this Sheet (File -> Make a copy): https://docs.google.com/spreadsheets/d/1D0r3tun7LF7Ypb21CmbTKEtn76WE-kaHvBCM5NdgiPU/edit?gid=0#gid=0
2. Select the cloned sheet in the "Save expense into Google Sheets" node.
3. Fix the sub-workflow dropdown: open the "Parse msg and save to Sheets" node (an n8n sub-workflow executor tool) and make sure the SAME workflow is chosen in the dropdown. This allows n8n to locate and call the "Workflow Input Trigger" properly when needed.
4. Activate the workflow so the chat works properly.
5. Send a message to the chat, e.g. "car wash; 59.3 usd; 25 jan 2024". You should get the response shown above, and a new row should be inserted in Google Sheets!
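The field names in the LLM's structured output match the example response shown above; the small validation helper below is a hypothetical addition you might run before appending the row, not part of the template:

```typescript
// Structured output the LLM is expected to produce (field names taken from
// the example response above).
interface ParsedExpense {
  cost: number;   // numeric amount, e.g. 59.3
  descr: string;  // short description, e.g. "car wash"
  date: string;   // ISO date, e.g. "2024-01-25"
  msg: string;    // the original chat message, kept for traceability
}

// Hypothetical guard: reject malformed LLM output before writing to the sheet.
function validateExpense(raw: unknown): ParsedExpense {
  const e = raw as ParsedExpense;
  if (typeof e.cost !== "number" || Number.isNaN(e.cost)) {
    throw new Error("cost must be a number");
  }
  if (!/^\d{4}-\d{2}-\d{2}$/.test(e.date)) {
    throw new Error("date must be YYYY-MM-DD");
  }
  return e;
}
```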
Export n8n Cloud execution data to CSV
Overview

This template helps n8n Cloud plan users export all executions to a CSV for easy data analysis, e.g. to identify which workflows generate the most executions or could be optimized.

How this workflow works
- Click "Test Workflow" to manually execute the workflow
- Open the "Convert to CSV" node to access the binary data of the CSV file
- Download the CSV file

Nodes included:
- n8n node
- Convert to File
- No Operation, do nothing (replace with another node as needed)

Set up steps
- Import the workflow to your workspace
- Add your n8n API credential

Benefits of Exporting n8n Cloud Executions to CSV

Exporting n8n Cloud executions to CSV offers significant advantages for workflow management and data analysis. Here are three key benefits:

Enhanced Data Analysis
- Comprehensive Insights: execution data allows in-depth analysis of workflow performance, helping identify bottlenecks and optimize processes.
- Custom Reporting: CSV files can be imported into data analysis tools (e.g., Excel, Google Sheets, or BI software) to create custom reports and visualizations tailored to specific business needs.

Improved Workflow Monitoring
- Historical Data Review: historical execution data lets users track workflow changes and their impacts over time, facilitating better decision-making.
- Error Tracking and Debugging: reviewing execution logs helps users quickly identify and address errors or failures, ensuring smoother and more reliable workflow operations.

Regulatory Compliance and Auditing
- Audit Trails: a record of all executions provides a clear audit trail, essential for regulatory compliance and internal audits.
- Data Retention: exported data ensures execution records are preserved according to organizational data retention policies, safeguarding against data loss.

By leveraging CSV exports, users can gain valuable insights, streamline workflow management, and ensure robust data handling, ultimately driving better performance and efficiency in their n8n Cloud operations.
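Under the hood, the n8n node talks to your instance's public REST API. For reference, a minimal sketch of fetching the same execution list directly, assuming a standard n8n Cloud URL and an API key in an environment variable:

```typescript
// Page through the n8n public API's executions endpoint (GET /api/v1/executions).
// Instance URL is a placeholder; the key comes from your n8n API credential.
const BASE = "https://your-instance.app.n8n.cloud/api/v1";
const KEY = process.env.N8N_API_KEY!;

async function fetchAllExecutions() {
  const rows: unknown[] = [];
  let cursor: string | undefined;
  do {
    const url = new URL(`${BASE}/executions`);
    url.searchParams.set("limit", "100");
    if (cursor) url.searchParams.set("cursor", cursor);
    const res = await fetch(url, { headers: { "X-N8N-API-KEY": KEY } });
    if (!res.ok) throw new Error(`n8n API returned ${res.status}`);
    const body = await res.json();
    rows.push(...body.data);        // each entry describes one execution
    cursor = body.nextCursor ?? undefined; // undefined when the last page is reached
  } while (cursor);
  return rows;
}
```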
Stripe payment order sync - auto-retrieve customer & product purchased
Overview

This automation template streamlines your payment processing by automatically triggering on a successful Stripe payment. The workflow retrieves the complete payment session and filters the information down to the customer name, customer email, and the purchased product details. It is perfect for quickly integrating Stripe transactions into your inventory management, CRM, or notification systems.

Step-by-Step Setup Instructions

1. Stripe Account Configuration: ensure you have an active Stripe account and connect your Stripe credentials.
2. Retrieve Product and Customer Data: use Stripe's API within the automation to fetch the purchased product details, plus customer information such as email and full name.
3. Integration and Response: map the retrieved data to your desired format, then trigger subsequent nodes or actions such as sending a confirmation email, updating a CRM system, or logging the transaction.

Pre-Conditions and Requirements
- Stripe Account: a valid Stripe account with access to API keys and webhook configurations.
- API Keys: have your Stripe secret and publishable keys ready.

Customization Guidance
- Data Mapping: customize the filtering node to match your specific data schema or to include additional data fields if needed.
- Additional Actions: integrate further nodes to handle post-payment actions like sending SMS notifications, updating order statuses, or generating invoices.

Enjoy seamless integration and enhanced order management with this automation template!
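For reference, a minimal sketch of the underlying Stripe call using the official Node SDK: retrieve the checkout session with line items expanded, then keep only the fields this template surfaces. The helper name is illustrative:

```typescript
import Stripe from "stripe";

const stripe = new Stripe(process.env.STRIPE_SECRET_KEY!);

// On a checkout.session.completed event, fetch the full session and reduce it
// to customer name, email, and the purchased products.
async function summarizeSession(sessionId: string) {
  const session = await stripe.checkout.sessions.retrieve(sessionId, {
    expand: ["line_items"],
  });
  return {
    customerName: session.customer_details?.name,
    customerEmail: session.customer_details?.email,
    products: session.line_items?.data.map((li) => ({
      description: li.description,
      quantity: li.quantity,
      amountTotal: li.amount_total, // in the smallest currency unit, e.g. cents
    })),
  };
}
```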
Automated resume job matching engine with Bright Data MCP & OpenAI 4o mini
Notice

Community nodes can only be installed on self-hosted instances of n8n.

Who this is for

The Automated Resume Job Matching Engine is an intelligent workflow designed for career platforms, HR tech startups, recruiting firms, and AI developers who want to streamline job-resume matching using real-time data from LinkedIn and job boards. This workflow is tailored for:
- HR Tech Founders building next-gen recruiting products
- Recruiters & Talent Sourcers seeking automated candidate-job fit evaluation
- Job Boards & Portals enriching user experience with AI-driven job recommendations
- Career Coaches & Resume Writers offering personalized job fit analysis
- AI Developers automating large-scale matching tasks using LinkedIn and job data

What problem is this workflow solving?

Manually matching a resume to a job description is time-consuming, biased, and inefficient. In addition, accessing live job postings and candidate profiles requires overcoming web scraping limitations. This workflow provides:
- Automated LinkedIn profile and job post data extraction using Bright Data MCP infrastructure
- Semantic matching between job requirements and the candidate resume using OpenAI 4o mini
- Pagination handling for high-volume job data
- End-to-end automation from scraping to delivery via webhook, with the matched-job response persisted to disk

What this workflow does

Bright Data MCP for Job Data Extraction
- Uses Bright Data MCP clients to extract multiple job listings (supports pagination)
- Pulls job data from LinkedIn with the pre-defined filtering criteria

OpenAI 4o mini LLM Matching Engine
- Extracts paginated job data and textual job descriptions from the scraped pages using the Bright Data MCP scrape_as_html tool
- The AI Job Matching node compares the job description against the candidate resume to generate match scores with insights

Data Delivery
- Sends the final match report to a webhook notification endpoint
- Persists the AI-matched job response to disk

Pre-conditions
- Knowledge of the Model Context Protocol (MCP) is essential; please read this blog post: model-context-protocol
- A Bright Data account with the setup described in the Setup section below
- A Google Gemini API key (visit Google AI Studio)
- The Bright Data MCP Server @brightdata/mcp installed
- The n8n-nodes-mcp community node installed

Setup
1. Set up n8n locally with MCP servers by navigating to n8n-nodes-mcp.
2. Install the Bright Data MCP Server @brightdata/mcp on your local machine.
3. Sign up at Bright Data.
4. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions.
5. Create a Web Unlocker proxy zone called mcp_unlocker on the Bright Data control panel.
6. In n8n, configure the OpenAI account credentials.
7. In n8n, configure the credentials to connect the MCP Client (STDIO) account with the Bright Data MCP Server. Make sure to copy the Bright Data API token into the Environments textbox as API_TOKEN=<your-token>.
8. Update the Set input fields for the candidate resume, keywords, and other filtering criteria.
9. Update the Webhook HTTP Request node with the webhook endpoint of your choice.
10. Update the file name and path to persist on disk.
How to customize this workflow to your needs

Target Different Job Boards
- Set the input fields to sites like Indeed, ZipRecruiter, or Monster

Customize Matching Criteria
- Adjust the prompt inside the AI Job Match node
- Include scoring metrics like skills match %, experience relevance, or cultural fit (see the report sketch below)

Automate Scheduling
- Use a Cron node to periodically check for new jobs matching a profile
- Set triggers based on webhook or input form submissions

Output Customization
- Add Markdown/PDF formatting for report summaries
- Extend with Google Sheets export for internal analytics

Enhance Data Security
- Mask personal info before sending to external endpoints
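As a reference for the customization points above, here is a sketch of what the match report POSTed to the webhook endpoint might look like. The field names and the 0-100 scoring scale are assumptions for illustration, not taken from the workflow:

```typescript
// Hypothetical match-report shape produced by the AI Job Matching node.
interface JobMatchReport {
  jobTitle: string;
  jobUrl: string;
  matchScore: number;      // e.g. 0-100 overall fit score
  matchedSkills: string[]; // skills present in both resume and job post
  gaps: string[];          // requirements the resume does not cover
  summary: string;         // short LLM-written rationale for the score
}

// Equivalent of the Webhook HTTP Request node: deliver the report as JSON.
async function deliverReport(webhookUrl: string, report: JobMatchReport) {
  const res = await fetch(webhookUrl, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(report),
  });
  if (!res.ok) throw new Error(`Webhook delivery failed: ${res.status}`);
}
```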
Send a file from S3 to AWS Textract
This workflow shows how to download an image file from S3 and pass it on to Textract for text extraction. The workflow uses two nodes:
- AWS S3: downloads a receipt file from S3
- AWS Textract: connects to Amazon's Textract service to extract text from the receipt file
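For comparison, roughly the same operation expressed with the AWS SDK for JavaScript v3. Note that Textract can read the object straight from S3, so no separate download step is needed at the SDK level; bucket and key are placeholders:

```typescript
import {
  TextractClient,
  DetectDocumentTextCommand,
} from "@aws-sdk/client-textract";

const textract = new TextractClient({ region: "us-east-1" });

// Run synchronous text detection on a single-page receipt stored in S3 and
// return the detected lines as one string.
async function extractReceiptText(bucket: string, key: string) {
  const out = await textract.send(
    new DetectDocumentTextCommand({
      Document: { S3Object: { Bucket: bucket, Name: key } },
    })
  );
  // LINE blocks carry the detected text, one entry per printed line.
  return (out.Blocks ?? [])
    .filter((b) => b.BlockType === "LINE")
    .map((b) => b.Text ?? "")
    .join("\n");
}
```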
Find high-intent sales leads by scraping Glassdoor with Bright Data & GPT
Scrape Glassdoor with Bright Data

Designed for sales teams, recruiters, and marketers aiming to automate job discovery and prospecting. This workflow scrapes Glassdoor job listings using Bright Data and automatically generates targeted pitches using AI, streamlining lead identification and outreach.

---

How It Works

This automation leverages n8n, Bright Data, Google Sheets, and OpenAI:

1. Trigger: starts with a custom form input (Location, Keyword, Country).
2. Bright Data Job Scrape: triggers a Bright Data dataset snapshot via HTTP Request, polls snapshot progress using a Wait node to ensure data readiness, and retrieves the full job listings dataset once ready (a code-level sketch of this loop appears at the end of this section).
3. Google Sheets Integration: writes detailed job data (company, role, location, overview, metrics) into a Google Sheet, using a pre-built template for organized data storage.
4. Automated Pitch Generation (AI): splits listings into actionable parts (company name, title, and description), sends the data to OpenAI (via LangChain) to generate relevant pitches or icebreakers, and saves the generated content back into the same sheet for easy access.

---

Requirements

Ensure you have the following:

Google Sheets
- Google account
- Template Sheet with columns for job details and AI-generated pitches

Bright Data
- Active account with Dataset API access
- API key and dataset ID

OpenAI
- Valid OpenAI API key for GPT models

n8n Environment
- Nodes: HTTP Request, Wait, If, Google Sheets, Split Out, LangChain (OpenAI)
- Credentials: Google Sheets OAuth2, Bright Data API credentials, OpenAI API key

---

Setup Instructions

Step 1: Prepare Google Sheets
- Copy the provided Google Sheets template
- Do not change headers

Step 2: Import & Configure Workflow in n8n
- Import the workflow JSON file
- Set the Google Sheets node: link to your copied sheet and confirm the correct tab name

Step 3: Configure Bright Data
- Replace <YOUR_BRIGHT_DATA_API_KEY> with your real key
- Set your dataset ID in all HTTP Request nodes

Step 4: Configure OpenAI (LangChain)
- Connect your OpenAI API key to the LangChain node
- Customize the prompt to match your tone and outreach style

Step 5: Testing & Scheduling
- Test via the manual form trigger
- Schedule runs or leave the form enabled for on-demand use

---

Tips & Best Practices
- Use specific keywords and locations for better results
- Adjust polling intervals based on dataset size
- Refine AI prompts regularly to improve pitch quality
- Clean unused columns from your sheet to boost performance

---

Support & Feedback

For help or customization:
- Email: Yaron@nofluff.online
- YouTube: @YaronBeen
- LinkedIn: linkedin.com/in/yaronbeen
- Bright Data Docs: docs.brightdata.com/introduction
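The trigger-poll-retrieve loop from step 2, sketched at the code level. The endpoint paths follow Bright Data's dataset API docs at the time of writing; verify them against the current documentation before relying on this:

```typescript
// Trigger a dataset snapshot, poll until it is ready, then download it.
// This mirrors what the HTTP Request, Wait, and If nodes do in the workflow.
const API = "https://api.brightdata.com/datasets/v3";
const headers = { Authorization: `Bearer ${process.env.BRIGHT_DATA_API_KEY}` };

async function scrapeGlassdoor(datasetId: string, input: object[]) {
  // 1. Trigger a snapshot for the requested location/keyword/country.
  const trigger = await fetch(`${API}/trigger?dataset_id=${datasetId}`, {
    method: "POST",
    headers: { ...headers, "Content-Type": "application/json" },
    body: JSON.stringify(input),
  }).then((r) => r.json());

  // 2. Poll progress until the snapshot leaves the "running" state.
  let status = "running";
  while (status === "running") {
    await new Promise((r) => setTimeout(r, 30_000)); // adjust to dataset size
    const progress = await fetch(`${API}/progress/${trigger.snapshot_id}`, {
      headers,
    }).then((r) => r.json());
    status = progress.status;
  }

  // 3. Retrieve the finished dataset as JSON.
  return fetch(`${API}/snapshot/${trigger.snapshot_id}?format=json`, {
    headers,
  }).then((r) => r.json());
}
```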
Archive empty pages in Notion database
This workflow archives empty pages in your Notion databases. Add your n8n integration to the Notion databases that you want to process. To configure this workflow, set the Notion credentials in the four Notion nodes and, if needed, change the time in the Cron node; the default is to run at 2am every day.
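At the API level, archiving a page is a single pages.update call. A minimal sketch with the official Notion SDK, assuming "empty" means a page with no child blocks (the workflow's exact emptiness check may differ):

```typescript
import { Client } from "@notionhq/client";

const notion = new Client({ auth: process.env.NOTION_API_KEY });

// Archive a page if it has no content blocks.
async function archiveIfEmpty(pageId: string) {
  const children = await notion.blocks.children.list({ block_id: pageId });
  if (children.results.length === 0) {
    // In the Notion API, archiving is an update with archived: true.
    await notion.pages.update({ page_id: pageId, archived: true });
  }
}
```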
Send an email using AWS SES
Companion workflow for AWS SES node docs
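For reference, a minimal SDK-level equivalent of the node's send-email operation, using the AWS SDK for JavaScript v3; the addresses and region are placeholders:

```typescript
import { SESClient, SendEmailCommand } from "@aws-sdk/client-ses";

const ses = new SESClient({ region: "us-east-1" });

// Send a simple plain-text email; the Source address must be verified in SES.
await ses.send(
  new SendEmailCommand({
    Source: "sender@example.com",
    Destination: { ToAddresses: ["recipient@example.com"] },
    Message: {
      Subject: { Data: "Hello from n8n" },
      Body: { Text: { Data: "This email was sent via AWS SES." } },
    },
  })
);
```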
Get the last five SpaceX launches from the spacex.land API using GraphQL
Companion workflow for GraphQL node docs
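A sketch of the query the GraphQL node sends, here issued via plain fetch. The spacex.land schema exposed past launches through a launchesPast field; that API has since been deprecated, so treat this as illustrative only:

```typescript
// Fetch the last five past launches from the spacex.land GraphQL endpoint.
const query = `
  query LastFiveLaunches {
    launchesPast(limit: 5) {
      mission_name
      launch_date_local
      rocket { rocket_name }
    }
  }
`;

const res = await fetch("https://api.spacex.land/graphql/", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({ query }),
});
const { data } = await res.json();
console.log(data.launchesPast);
```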
Automate outbound voice calls from Go High Level opportunities with Vapi
This workflow triggers when a new opportunity is created in Go High Level (GHL), fetches the associated contact details, and initiates an outbound call using Vapi. The call is made by a Vapi assistant configured with the appropriate credentials.

---

Requirements

Go High Level (GHL)
- A Go High Level account
- GHL developer private app and credentials enabled in n8n
- Webhook URL from n8n added to your GHL private app

Vapi
- A Vapi account with credit
- A connected phone number to make calls
- An assistant created and ready to make calls
- Your Vapi API key

Useful Links
- GHL Docs
- Vapi Docs
- n8n GHL Credentials Setup

---

Workflow Breakdown

1. Trigger: GHL Opportunity Created
- Triggered by a Webhook (POST) from Go High Level when a new opportunity is created.
- The webhook URL must be enabled in your GHL private app.

2. Get a GHL Contact
- Retrieves contact details from the GHL CRM using the contact ID from the opportunity.
- Includes information such as phone number, name, and custom fields.

3. Wait 5 Minutes
- Introduces a short delay before making the call to avoid immediate outreach.
- Helps ensure data is synced and gives the system time for any follow-up automation.

4. Set Vapi Fields (Manual Step)
Set the required fields for the Vapi API call:
- vapiPhoneNumberId: the ID of the number making the call
- vapiAssistantId: the assistant who will handle the call
- vapiApi: your secure Vapi API key

5. Start Outbound Vapi Call
- Sends a POST request to https://api.vapi.ai/call
- The payload includes the contact's phone number, the selected Vapi assistant, and the Vapi phone number ID to start the call (see the sketch below)

---

Summary

This n8n automation connects your CRM (Go High Level) with voice automation (Vapi) to immediately respond to new opportunities. Once a lead is created, they will receive a personalized voice call from a Vapi AI assistant.

---

Need Help?

Feel free to contact us at 1 Node. Get instant access to a library of free resources we created.
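A sketch of the request the final node sends. The field names follow the payload description above and Vapi's call API; confirm them against the current Vapi docs:

```typescript
// Start an outbound call: which number dials out, which assistant speaks,
// and which customer number to reach.
async function startOutboundCall(
  vapiApiKey: string,
  vapiAssistantId: string,
  vapiPhoneNumberId: string,
  customerNumber: string // E.164 format, e.g. "+15551234567"
) {
  const res = await fetch("https://api.vapi.ai/call", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${vapiApiKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      assistantId: vapiAssistantId,
      phoneNumberId: vapiPhoneNumberId,
      customer: { number: customerNumber },
    }),
  });
  if (!res.ok) throw new Error(`Vapi call failed: ${res.status}`);
  return res.json();
}
```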
Create authentic UGC video ads with GPT-4o, ElevenLabs & WaveSpeed lip-sync
This n8n template demonstrates how to create authentic-looking User Generated Content (UGC) advertisements using AI image generation, voice synthesis, and lip-sync technology. The workflow transforms product images into realistic customer testimonial videos that mimic genuine user reviews and social media content.

Use cases are many: generate authentic UGC-style ads for social media campaigns, create customer testimonial videos without hiring influencers, produce localized UGC content for different markets, automate TikTok/Instagram-style product reviews, or scale UGC ad production for e-commerce brands!

Good to know
- The workflow creates UGC-style content that appears genuine and authentic
- Uses multiple AI services: OpenAI GPT-4o for analysis, ElevenLabs for voice synthesis, and WaveSpeed AI for image generation and lip-sync
- Voice synthesis costs vary by ElevenLabs plan (typically $0.18-$0.30 per 1K characters)
- WaveSpeed AI pricing: ~$0.039 per image generation, with additional costs for lip-sync processing
- Processing time: ~3-5 minutes per complete UGC video
- Optimized for Malaysian-English content but easily adaptable for global markets

How it works
1. Product Input: the Telegram bot receives product images to create UGC ads for
2. AI Analysis: GPT-4o analyzes the product to understand brand, colors, and target demographics
3. UGC Content Creation: AI generates authentic-sounding testimonial scripts and detailed prompts for realistic customer scenarios
4. Character Generation: WaveSpeed AI creates believable customer avatars that look like real users reviewing products
5. Voice Synthesis: ElevenLabs generates natural, conversational audio using gender-appropriate voice models (see the sketch after the customization list below)
6. UGC Video Production: WaveSpeed AI combines generated characters with audio to create TikTok/Instagram-style review videos
7. Content Delivery: final UGC videos are delivered via Telegram, ready for social media posting

The workflow produces UGC-style content that maintains authenticity while showcasing products in realistic, relatable scenarios that resonate with target audiences.

How to use
1. Setup Credentials: configure OpenAI API, ElevenLabs API, WaveSpeed AI API, Cloudinary, and Telegram Bot credentials
2. Deploy Workflow: import the template and activate the workflow
3. Send Product Images: use the Telegram bot to send product images you want to create UGC ads for
4. Automatic UGC Generation: the workflow automatically creates authentic-looking customer testimonial videos
5. Receive UGC Content: get both testimonial images and final UGC videos ready for social media campaigns

Pro tip: the workflow automatically detects product demographics and creates appropriate customer personas. For best UGC results, use clear product images that show the item in use.

Requirements
- OpenAI API account for GPT-4o product analysis and UGC script generation
- ElevenLabs API account for authentic voice synthesis (requires voice cloning credits)
- WaveSpeed AI API account for realistic character generation and lip-sync processing
- Cloudinary account for UGC content storage and hosting
- Telegram Bot setup for content input and delivery
- n8n instance (cloud or self-hosted)

Customizing this workflow
- Platform-Specific UGC: modify prompts to create UGC content optimized for TikTok, Instagram Reels, YouTube Shorts, or Facebook Stories.
- Brand Voice: adjust testimonial scripts and character personas to match your brand's target audience and tone.
- Regional Adaptation: customize language, cultural references, and character demographics for different markets and demographics.
- UGC Style Variations: create different UGC formats, such as unboxing videos, before/after comparisons, day-in-the-life content, or product demonstrations.
- Influencer Personas: develop specific customer personas (age groups, lifestyles, interests) to create targeted UGC content for different audience segments.
- Content Scaling: set up batch processing to generate multiple UGC variations for A/B testing different approaches and styles.
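To make the voice-synthesis step from "How it works" concrete, here is a minimal sketch of calling ElevenLabs' text-to-speech endpoint directly. The voice ID is a placeholder chosen per persona, and the model choice is an assumption:

```typescript
// Turn a generated testimonial script into audio via ElevenLabs TTS.
async function synthesizeTestimonial(script: string, voiceId: string) {
  const res = await fetch(
    `https://api.elevenlabs.io/v1/text-to-speech/${voiceId}`,
    {
      method: "POST",
      headers: {
        "xi-api-key": process.env.ELEVENLABS_API_KEY!,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        text: script,
        // Assumed model: multilingual model suits localized/accented English.
        model_id: "eleven_multilingual_v2",
      }),
    }
  );
  if (!res.ok) throw new Error(`ElevenLabs TTS failed: ${res.status}`);
  return Buffer.from(await res.arrayBuffer()); // MP3 audio by default
}
```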