🤖 Automate CV screening with AI candidate analysis
**How it works**

This workflow automates your initial hiring pipeline by creating an AI-powered CV scanner. It collects job applications through a web form, uses AI to analyze the candidate's CV against your job description, and neatly organizes the results in a Google Sheet.

Here's the step-by-step process:

1. **The Application Form:** A Form Trigger provides a public web form for candidates to submit their name, email, and CV (as a PDF).
2. **Initial Logging:** As soon as an application is submitted, the candidate's name and email are added to a Google Sheet. This ensures every applicant is logged, even if a later step fails.
3. **CV Text Extraction:** The workflow uses Mistral's OCR model to accurately extract all the text from the uploaded CV PDF.
4. **AI Analysis:** The extracted text is sent to Google Gemini. A detailed prompt instructs the AI to act as a hiring assistant, scoring the CV against the specific requirements of your job role and providing a detailed explanation for its score.
5. **Structured Output:** A JSON Output Parser ensures the AI's analysis is returned in a clean, structured format, making the data reliable.
6. **Final Record:** The AI-generated qualification score and explanation are added to the candidate's row in the Google Sheet, giving you a complete, analyzed list of applicants.

**Set up steps**

Setup time: ~15 minutes. You'll need API keys for Mistral and Google AI, and to connect your Google account.

1. **Get your Mistral API key:** Visit the Mistral Platform at console.mistral.ai/api-keys, then create and copy your API key. In the workflow, open the Extract CV Text node, click the Credential dropdown, and select + Create New Credential. Paste your key into the API Key field and Save.
2. **Get your Google AI API key:** Visit Google AI Studio at aistudio.google.com/app/apikey, click "Create API key in new project", and copy the key. In the workflow, open the Gemini 2.5 Flash Lite node, click the Credential dropdown, and select + Create New Credential. Paste your key into the API Key field and Save.
3. **Connect your Google account:** Select the Create 'CVs' Spreadsheet node, click the Credential dropdown, and select + Create New Credential to connect your Google account. Repeat this for the Log Candidate Submission and Add CV Analysis nodes, selecting the credential you just created.
4. **Create your spreadsheet:** Click the "play" icon on the Start Here node to run it. This creates a new Google Sheet named "CVs" in your Google Drive with the correct columns.
5. **Customize the job role:** Open the AI Qualification node. In the Text parameter, find the job_requirements section and replace the example job description with your own. Be as detailed as possible for the best results.
6. **Start screening:** Activate the workflow using the toggle at the top right. Open the Application Form node and click the "Open Form URL" button. Fill out the form with a test application and upload a sample CV, then check your Google Sheet to see the AI's analysis appear within moments.
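The Structured Output step is the part most worth guarding in your own builds: even with a JSON Output Parser, it helps to validate the model's reply before writing it to the sheet. The sketch below shows one way to do that in an n8n Code node; the field names (`qualification_score`, `explanation`) and the 0–100 range are illustrative assumptions, not the template's exact schema.

```javascript
// Hypothetical guard for the AI analysis step: parse the model's JSON reply,
// validate the score, and fall back to a safe default on any malformed output
// so the Google Sheet row is always written.
function parseAnalysis(raw) {
  const fallback = { qualification_score: 0, explanation: 'Could not parse AI output' };
  try {
    const data = JSON.parse(raw);
    const score = Number(data.qualification_score);
    // Reject missing, non-numeric, or out-of-range scores (assumed 0-100 scale).
    if (!Number.isFinite(score) || score < 0 || score > 100) return fallback;
    return { qualification_score: score, explanation: String(data.explanation || '') };
  } catch (err) {
    return fallback;
  }
}
```

A guard like this keeps a single bad completion from breaking the "Add CV Analysis" step for that applicant.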
Generate dynamic contents for EMAILS or HTML pages
==Disclaimer: This template contains a community node and therefore only works for n8n self-hosted users==

This is Miquel from Aprende n8n and Automate with n8n. We have created a new community node, Document Generator, that generates dynamic content from templates. With this node you can compose complex content without any Set or Function Item nodes, for example:

- Send one email with a list of items in the body (e.g., one email with the latest entries of an RSS feed).
- Send one email per item (e.g., one invoice per email).

Emails are just one example. You can create complex dynamic content to:

- Send messages to Telegram/Slack.
- Create WordPress entries.
- Create HTML pages for your website.
- Create tickets.
- And more! The sky is the limit ;)

To use this workflow, install the community node n8n-nodes-document-generator from Settings > Community nodes: type "n8n-nodes-document-generator", check "I understand the risks...", and click "Install". Then copy and paste this workflow into your n8n instance.

This workflow uses the Customer Datastore node to generate sample input items. You can render one template with all items (enable "Render All Items with One Template") or one template per input item. Visit the official NPM page to see more samples.

Learning n8n by yourself is nice, but a bit tricky :) We offer n8n training video courses at Aprende n8n. If you need custom trainings, let us know. Additionally, you can contact us at Automate with n8n if you need any of the following services:

- Custom installations.
- Custom nodes.
- Monitoring and alarms.
- Delegated 12/5 or 24/7 workflow issue resolution.
- Automated backups of your workflows.
- HTTP integrations for unsupported APIs.
- Complex workflows.

I hope you enjoy this new node and this workflow. Automate your life! Automate it with n8n!
Build a self-hosted URL shortener with a dashboard
This workflow creates a self-hosted URL shortener. It consists of three sub-workflows:

- **Short URL creation:** extracts the provided long URL, generates an ID, and saves the record in the database. It returns a short link as a result.
- **Redirection:** extracts the ID value, validates that a corresponding record exists in the database, and returns a redirection page after updating the visit (click) count.
- **Dashboard:** calculates simple statistics about the saved records and displays them on a dashboard.

Read more about this use case and how to set up the workflow in the blog post How to build a low-code, self-hosted URL shortener in 3 steps.

**Prerequisites**

- A local proxy set up that redirects the n8n.ly domain to your n8n instance
- An Airtable account and credentials
- Basic knowledge of JavaScript, HTML, and CSS

**Nodes**

- Webhook nodes trigger the sub-workflows on calls to a specified link.
- IF nodes route the workflows based on specified query parameters.
- Set nodes set the required values returned by the previous nodes (id, longUrl, and shortUrl).
- Airtable nodes retrieve records from or append records to the database.
- The Function node calculates the link-click statistics shown on the dashboard and builds the dashboard's markup.
- The Crypto node generates a SHA256 hash.
Automated meeting recording & AI summaries with Google Calendar, Vexa & Llama 3.2
Transform your meetings into actionable insights automatically! This workflow captures meeting audio, transcribes conversations, generates AI summaries, and emails the results to participants, all without manual intervention.

**What's the Goal?**

- Auto-record meetings when they start and stop when they end
- Transcribe audio to text using the Vexa Bot integration
- Generate intelligent summaries with AI-powered analysis
- Email summaries to meeting participants automatically
- Eliminate manual note-taking and post-meeting admin work
- Never miss important discussions or action items again

**Why Does It Matter?**

- Save 90% of post-meeting time: no more manual transcription or summary writing
- Never lose key information: automatic capture ensures nothing falls through the cracks
- Improve team productivity: focus on discussions, not note-taking
- Perfect meeting records: searchable transcripts and summaries for future reference
- Instant distribution: summaries reach all participants immediately after meetings

**How It Works**

Step 1: Meeting Detection & Recording
- Start Meeting Trigger: detects when a meeting begins via a Google Meet webhook
- Launch Vexa Bot: automatically joins the meeting and starts recording
- End Meeting Trigger: detects the meeting end and stops recording

Step 2: Audio Processing & Transcription
- Stop Vexa Bot: ends the recording and retrieves the audio file
- Fetch Meeting Audio: downloads the recorded audio from Vexa Bot
- Transcribe Audio: converts speech to text using AI transcription

Step 3: AI Summary Generation
- Prepare Transcript: formats the transcribed text for AI processing
- Generate Summary: the AI model creates a concise meeting summary with key discussion points, decisions made, action items assigned, and next steps identified

Step 4: Distribution
- Send Email: automatically emails the summary to all meeting participants

**Setup Requirements**

Google Meet Integration:
- Configure the Google Meet webhook and API credentials
- Set up meeting detection triggers
- Test with a sample meeting

Vexa Bot Configuration:
- Add Vexa Bot API credentials for recording
- Configure audio file retrieval settings
- Set recording quality and format preferences

AI Model Setup:
- Configure the AI transcription service (e.g., OpenAI Whisper, Google Speech-to-Text)
- Set up AI summary generation with custom prompts
- Define summary format and length preferences

Email Configuration:
- Set up SMTP credentials for email distribution
- Create email templates for meeting summaries
- Configure participant-list extraction from meeting metadata

**Import Instructions**

1. Get the workflow JSON: copy the workflow JSON code
2. Open the n8n editor: navigate to your n8n dashboard
3. Import the workflow: click the menu (⋯) → "Import from Clipboard" → paste the JSON → Import
4. Configure credentials: add API keys for Google Meet, Vexa Bot, AI services, and SMTP
5. Test the workflow: run a test meeting to verify end-to-end functionality

Your meetings will now automatically transform into actionable summaries delivered to your inbox!
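The "Prepare Transcript" step above is essentially a formatting pass before the text reaches the summary model. A minimal sketch, assuming transcript segments arrive as objects with `speaker` and `text` fields (this shape is a guess at Vexa's output, not its documented format):

```javascript
// Hypothetical transcript-preparation step: drop empty segments and flatten
// the rest into speaker-labelled lines for the summary prompt.
function formatTranscript(segments) {
  return segments
    .filter((s) => s.text && s.text.trim())
    .map((s) => `${s.speaker || 'Unknown'}: ${s.text.trim()}`)
    .join('\n');
}
```

Labelling speakers this way lets the summary model attribute decisions and action items to the right participant.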
Automate Glassdoor job search with Bright Data scraping & Google Sheets storage
🔍 Glassdoor Job Finder: Bright Data Scraping + Keyword-Based Automation

A comprehensive n8n automation that scrapes Glassdoor job listings using Bright Data's web scraping service based on user-defined keyword, location, and country parameters, then automatically stores the results in Google Sheets.

**📋 Overview**

This workflow provides an automated job search solution that extracts job listings from Glassdoor using form-based inputs and stores organized results in Google Sheets. Perfect for recruiters, job seekers, market research, and competitive analysis.

Workflow description: Automates Glassdoor job searches using Bright Data's web scraping capabilities. Users submit keywords, location, and country via a form trigger. The workflow scrapes job listings; extracts company details, ratings, and locations; then automatically stores organized results in Google Sheets for easy analysis and tracking.

**✨ Key Features**

- 🎯 Form-based input: simple web form for job type, location, and country
- 🔍 Glassdoor integration: uses Bright Data's Glassdoor dataset for accurate job data
- 📊 Smart data processing: automatically extracts key job information
- 📈 Google Sheets storage: organized data storage with automatic updates
- 🔄 Status monitoring: built-in progress tracking and retry logic
- ⚡ Fast & reliable: professional scraping with error handling
- 🎯 Keyword flexibility: search any job type with location filters
- 📝 Structured output: clean, organized job listing data

**🎯 What This Workflow Does**

Input:
- Job Keywords: job title or role (e.g., "Software Engineer", "Marketing Manager")
- Location: city or region for the job search
- Country: target country for job listings

Processing: Form Submission → Data Scraping via Bright Data → Status Monitoring → Data Extraction → Data Processing → Sheet Update

Output data points:

| Field | Description | Example |
|-------|-------------|---------|
| Job Title | Position title from listing | Senior Software Engineer |
| Company Name | Employer name | Google Inc. |
| Location | Job location | San Francisco, CA |
| Rating | Company rating score | 4.5 |
| Job Link | Direct URL to listing | https://glassdoor.com/job/... |

**🚀 Setup Instructions**

Prerequisites:
- n8n instance (self-hosted or cloud)
- Google account with Sheets access
- Bright Data account with Glassdoor scraping dataset access
- 5–10 minutes for setup

Step 1: Import the workflow
- Copy the JSON workflow code from the provided file
- In n8n: Workflows → + Add workflow → Import from JSON
- Paste the JSON and click Import

Step 2: Configure Bright Data
- Set up Bright Data credentials in n8n
- Ensure access to dataset: gd_lpfbbndm1xnopbrcr0
- Update API tokens in the "Scrape Job Data", "Check Delivery Status of Snap ID", and "Getting Job Lists" nodes

Step 3: Configure the Google Sheets integration
- Create a new Google Sheet (e.g., "Glassdoor Job Tracker")
- Set up Google Sheets OAuth2 credentials in n8n
- Prepare columns: A: Job Title, B: Company Name, C: Location, D: Rating, E: Job Link

Step 4: Update workflow settings
- Update the "Update Job List" node with your Sheet ID and credentials
- Test the form trigger and webhook URL

Step 5: Test & activate
- Submit test data (e.g., "Software Engineer" in "New York")
- Activate the workflow
- Verify the Google Sheet updates and field extraction

**📖 Usage Guide**

Submitting job searches:
- Navigate to your workflow's webhook URL
- Fill in the search job type, location, and country
- Submit the form

Reading the results:
- Real-time job listing data
- Company ratings and reviews
- Direct job posting links
- Location-specific results
- Processing timestamps

**🔧 Customization Options**

More Data Points: Add job descriptions, salary, company size, etc.
Search Parameters: Add filters for salary, experience, remote work
Data Processing: Add validation, deduplication, formatting

**🚨 Troubleshooting**

- Bright Data connection failed: check API credentials and dataset access
- No job data extracted: validate search terms and location format
- Google Sheets permission denied: re-authenticate and check sharing
- Form submission failed: check the webhook URL and form config
- Workflow execution failed: check logs, add retry logic

Advanced troubleshooting:
- Check execution logs in n8n
- Test individual nodes
- Verify data formats
- Monitor rate limits
- Add error handling

**📊 Use Cases & Examples**

- Recruitment pipeline: track job postings, build a talent database
- Market research: analyze job trends and hiring patterns
- Career development: monitor opportunities and salary trends
- Competitive intelligence: track competitor hiring activity

**⚙️ Advanced Configuration**

- Batch processing: accept multiple keywords, loop logic, delays
- Search history: track trends, compare results over time
- External tools: integrate with CRMs, Slack, databases, BI tools

**📈 Performance & Limits**

- Single search: 2–5 minutes
- Data accuracy: 95%+
- Success rate: 90%+
- Concurrent searches: 1–3 (depends on plan)
- Daily capacity: 50–200 searches
- Memory: ~50 MB per execution
- API calls: 3 Bright Data + 1 Google Sheets per search

**🤝 Support & Community**

- n8n Community Forum: community.n8n.io
- Documentation: docs.n8n.io
- Bright Data support: via your dashboard
- GitHub issues: report bugs and request features
- Contributing: share improvements, report issues, create variations, document best practices

Need help? Check the full documentation or visit the n8n Community for support and workflow examples.
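The "Data Processing" step, which maps raw Bright Data records onto the five sheet columns, can be sketched as below. The input field names (`job_title`, `company_name`, `job_location`, `rating`, `url`) are assumptions about the dataset's output shape, not its documented schema; adjust them to match the actual snapshot payload.

```javascript
// Hypothetical mapping from a raw scraped job record to the Google Sheets
// columns described in Step 3 (missing fields become empty cells).
function toSheetRow(job) {
  return {
    'Job Title': job.job_title || '',
    'Company Name': job.company_name || '',
    'Location': job.job_location || '',
    'Rating': job.rating != null ? Number(job.rating) : '',
    'Job Link': job.url || '',
  };
}
```

Normalizing the rating to a number here keeps the sheet sortable by company score.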
Automatically archive Gmail emails from Inbox
**Use Case**

Automatically archive emails in your Gmail inbox from the last day, unless they have been starred. I've been using this with my personal and work emails to stick to an Inbox Zero strategy without a lot of clicking or swiping.

**Setup**

Add your Gmail creds.

**How to adjust this template**

Set your own schedule for when to run this. Otherwise, it should be good to go. 🤞🏽
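The selection rule ("last day, unless starred") maps directly onto standard Gmail search operators, which the Gmail node accepts in its search field. The template's exact filter may differ; a query of this shape expresses the same rule:

```javascript
// Build a Gmail search query matching the template's rule: messages still in
// the inbox, received within the last `days` days, and not starred.
// `in:inbox`, `-is:starred`, and `newer_than:Nd` are standard Gmail operators.
function buildArchiveQuery(days = 1) {
  return `in:inbox -is:starred newer_than:${days}d`;
}
```

Changing the `days` argument is the easiest way to widen the archiving window if you run the schedule less often than daily.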
Better Oauth2.0 workflow for Pipedrive CRM with Supabase
This workflow provides an OAuth 2.0 token refresh process with better control and visibility. Developers can use it as an alternative to n8n's built-in OAuth flow. This template uses the Pipedrive API, but it works with any app that uses the authorization_code grant for token access. It solves the problem of manually refreshing an OAuth 2.0 token when it expires, or when n8n's native OAuth stops working.

**What you need to replicate this**

1. A database with a pre-existing table for storing authentication tokens and associated information. This example uses Supabase, but you can also use a self-hosted MySQL. Here's a quick video on setting up the Supabase table.
2. A client app for the application you want to access via the API.
3. After duplicating the template:
   - Add credentials to your database and connect the DB nodes in all 3 workflows.
   - Enable/publish the first workflow, "1. Generate and Save Pipedrive tokens to Database."
   - Open your client app and follow the Pipedrive instructions to authenticate.
   - Click Install and test. This saves your initial refresh token and access token to the database.

Please watch the YouTube video for a detailed demonstration of the workflow.

**How it operates**

- Workflow 1 captures the authorization_code, generates the access_token, refreshes the token, and saves the tokens to the database.
- Workflow 2 is your primary workflow that fetches or posts data to/from your application. Note the logic that adds an if condition when an invalid-token error occurs; it triggers the third workflow to refresh the token.
- Workflow 3 handles the token refresh. Remember to send the unique ID to the webhook so it can fetch the right tokens from your table.

Detailed demonstration of the workflow: https://youtu.be/6nXi_yverss
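The core of Workflow 3 is a simple decision plus a standard request body. A minimal sketch, assuming a token table with `access_token`, `refresh_token`, and `expires_at` (epoch milliseconds) columns; those names mirror the kind of table described above but are not the template's exact schema:

```javascript
// Decide whether the stored access token must be refreshed. Refreshing a
// minute early (skewMs) avoids requests in flight hitting an expired token.
function needsRefresh(tokenRow, now = Date.now(), skewMs = 60_000) {
  return !tokenRow.access_token || now >= tokenRow.expires_at - skewMs;
}

// Standard OAuth 2.0 refresh_token grant body (RFC 6749, section 6), which an
// HTTP Request node would POST to the provider's token endpoint.
function buildRefreshBody(tokenRow, clientId, clientSecret) {
  return {
    grant_type: 'refresh_token',
    refresh_token: tokenRow.refresh_token,
    client_id: clientId,
    client_secret: clientSecret,
  };
}
```

Storing `expires_at` alongside the tokens is what lets Workflow 2 refresh proactively instead of only reacting to invalid-token errors.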
Generate YouTube SEO content & thumbnail from video scripts with GPT-4o & Runway
Who’s it for This template is ideal for YouTube video creators who spend a lot of time manually generating SEO assets like descriptions, tags, titles, keywords, and thumbnails. If you're looking to automate your YouTube SEO workflow, this is the perfect solution for you. How it works / What it does Connect a Google Sheet to n8n and pull in the Hindi script (or any language). Use OpenAI to generate SEO content: Video description Tags Keywords Titles Thumbnail titles etc. Use the generated description as input to create a thumbnail image using an image generation API. Store all outputs in the same Google Sheet in separate columns. Optionally, use tools like VidIQ or TubeBuddy to test the SEO strength of generated titles, tags, and keywords. 💡 Note: This example uses Runway’s image generation API, but you can plug in any other image-generation service of your choice. Requirements A Google Sheet with clearly named columns Hindi, English, or other language scripts in the sheet OpenAI API key Runway API key (or any other image generation API) How to set up You can set up this workflow in 15 minutes by following the pre-defined steps. Replace the manual Google Sheet trigger with a scheduled trigger for daily or timed automation. You may also swap Google Sheets with any database or data source of your choice. No Google Sheets API required. Requires minimal JavaScript or Python knowledge for advanced customizations.
Summarize PDFs with Groq AI and convert to audio using Qwen TTS
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

**How it works**

1. User uploads a PDF: the workflow accepts a PDF via webhook.
2. Extract text: n8n extracts the text content from the PDF.
3. Summarize with AI: the extracted text is passed to an AI model hosted on Groq (via its OpenAI-compatible API) for summarization.
4. Generate audio: the summary text is sent to a TTS (text-to-speech) API (Qwen-TTS-Demo); you can use other free alternatives.
5. Serve result: the workflow outputs both the summary and the audio file URL (a WAV link) that you can attach to your audio player. This lets users read or listen to the summary instantly.

**How to use / Requirements**

- Import the workflow: copy/paste the workflow JSON into your n8n instance.
- Set up the input trigger: if you want users to upload directly, use a webhook or any other trigger.
- Configure the AI node: add your own API key for Groq / OpenAI.
- Configure the TTS node: add credentials for your chosen TTS service.
- Run the workflow: upload a PDF and get back the summary and audio file URL.

n8n smart PDF summarizer & voice generator

Please reach out to me at Laiba Zubair if you need further assistance with your n8n workflows and automations!
Receive updates when a form submission occurs in your Webflow website
No description available.
Discover & generate leads from social engagement using Trigify, Google Sheets, and Slack
An intelligent automation workflow that monitors thought leader activity via social listening, tracks high-value prospects who engage with industry content, and systematically builds a qualified lead database through social intelligence gathering.

**Overview**

This workflow transforms passive social listening into proactive lead generation by identifying prospects who demonstrate genuine interest in industry topics through their engagement with thought leader content. It creates a continuous pipeline of warm prospects with enriched data for personalized outreach.

---

🔄 **Workflow Process**

1. Social intelligence webhook (real-time engagement monitoring)
   - Integrated with the Trigify.io social listening platform
   - Monitors thought leader posts and their engagers
   - Processes LinkedIn engagement activities in real time
   - Captures detailed prospect and company enrichment data, including enriched contact information (email, phone, LinkedIn URLs)
2. Data processing & extraction (structured data organization)
   - Post data extraction: isolates LinkedIn post URLs, content, and posting dates
   - Prospect data extraction: captures first/last names, job titles, LinkedIn profiles, and locations
   - Company data extraction: gathers company names, domains, sizes, industries, and LinkedIn pages
   - Prepares data for duplicate detection and storage
3. Duplicate detection system (data quality maintenance)
   - Queries the existing Google Sheets database by post URL
   - Identifies previously tracked thought leader content and filters out duplicate posts
   - Only processes genuinely new thought leader activity, maintaining clean, unique post tracking records
4. New content validation gate (quality control checkpoint)
   - Validates that post URLs are not empty (indicating new content)
   - Prevents processing of duplicate or invalid data
   - Ensures only fresh thought leader content triggers downstream actions, preserving database integrity and notification relevance
5. Thought leader post tracking (systematic content monitoring)
   - Appends new thought leader posts to the "Social Warming" Google Sheet
   - Records post URLs, content text, and publication dates
   - Creates a searchable database of industry thought leadership content, enabling trend analysis and content performance tracking
6. Real-time Slack notifications (immediate team alerts)
   - Sends formatted alerts to the comment-strategy channel with post content, publication date, and direct links
   - Provides action buttons (View Post, Engage Now, Save for Later)
   - Enables rapid response and team coordination on engagement opportunities
7. ICP qualification filter (smart prospect identification)
   - Filters engagers by job title keywords (currently: "marketing")
   - Customizable ICP criteria focus effort on high-value prospects matching ideal customer profiles
   - Prevents database pollution with irrelevant contacts
8. Qualified lead database (systematic prospect capture)
   - Appends qualified engagers to the "Engagers" Google Sheet
   - Records comprehensive prospect and company data, including contact enrichment (emails, phone numbers)
   - Creates an actionable lead database for sales outreach, with detailed company intelligence for personalization

---

🛠️ **Technology Stack**

- n8n: workflow orchestration and webhook management
- Trigify.io: social listening and engagement monitoring platform
- Google Sheets: lead database and content tracking system
- Slack API: real-time team notifications and collaboration
- Data enrichment: automated contact and company information gathering

---

✨ **Key Features**

- Real-time thought leader content monitoring
- Automated prospect discovery through social engagement
- ICP-based lead qualification and filtering
- Duplicate content detection and prevention
- Comprehensive prospect and company data enrichment
- Integrated, CRM-ready lead database creation
- Team collaboration through Slack notifications
- Customizable qualification criteria for targeted lead generation

---

🎯 **Ideal Use Cases**

Perfect for sales and marketing teams seeking warm prospects:

- B2B sales teams seeking warm prospects through social engagement
- Marketing professionals building targeted lead databases
- Business development teams identifying engaged prospects
- Account-based marketing campaigns requiring social intelligence
- Sales professionals needing conversation starters with warm leads
- Companies wanting to identify prospects already engaged with industry content
- Teams requiring systematic lead qualification through social activity
- Organizations seeking to leverage thought leadership for lead generation

---

📈 **Business Impact**

Transform social listening into strategic lead generation:

- Warm lead generation: identifies prospects already engaged with industry content
- Social selling intelligence: provides conversation starters through engagement history
- ICP qualification: focuses effort on prospects matching ideal customer profiles
- Relationship building: enables outreach based on genuinely demonstrated interest
- Market intelligence: tracks industry engagement patterns and trending content
- Sales efficiency: prioritizes prospects who show active industry engagement
- Personalization data: provides context for highly personalized outreach campaigns

---

💡 **Strategic Advantage**

This workflow creates a fundamental shift from cold outreach to warm, contextual conversations. By identifying prospects who have already demonstrated interest in industry topics through their engagement behavior, sales teams can approach leads with genuine relevance and shared context.
The system delivers:

- Continuous pipeline: an automated flow of warm prospects showing industry engagement
- Social context: rich background data for meaningful, personalized conversations
- Quality focus: ICP-filtered prospects matching ideal customer profiles
- Engagement history: conversation starters based on actual prospect interests
- Competitive advantage: proactive lead identification before competitors

Rather than interrupting prospects with cold messages, this workflow enables sales teams to join conversations prospects are already having, dramatically increasing response rates and relationship-building success.
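The duplicate-detection gate described in the workflow process boils down to a set lookup against the URLs already stored in the sheet. A minimal sketch, assuming a `post_url` field on both the incoming webhook items and the existing sheet rows (the template's actual column name may differ):

```javascript
// Hypothetical duplicate-detection step: only pass posts whose URL is non-empty
// and not already present in the "Social Warming" sheet.
function filterNewPosts(incoming, existingRows, urlKey = 'post_url') {
  const seen = new Set(existingRows.map((row) => row[urlKey]));
  return incoming.filter((post) => post[urlKey] && !seen.has(post[urlKey]));
}
```

Building a `Set` once keeps each lookup O(1), which matters as the tracked-posts sheet grows.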
Weekly Google Search Console SEO pulse: Organic & brand vs non-brand performance
This workflow generates a weekly performance summary from Google Search Console, focused on brand-level SEO metrics and week-over-week trends. It provides a structured view of how each brand segment is performing, with clean formatting for quick insights. Key Features Sends a weekly email with a table showing clicks, impressions, CTR, and position — along with % change vs. the previous week. Highlights both brand and non-brand clicks separately. Color-coded % changes make it easy to spot wins (green) and losses (red) at a glance. It’s designed to give SEO teams a consistent overview of performance by brand, helping to track directional shifts and support deeper analysis when needed. How it works Runs weekly (e.g. every Monday) to compare “Last Week” vs. “2 Weeks Ago” from GSC data. Includes both brand + non-brand click breakdown. Calculates raw values and week-over-week % change for clicks, impressions, CTR, and position. Outputs a clean, formatted table with labeled rows and color-coded changes. Sends the table as part of a scheduled email (can also be adapted for Slack or other channels). Setup steps Requires connected Google Search Console data (per brand segment). Email delivery is included by default (customizable to other platforms). Update brand segmentation logic to match your tracking needs (e.g. domain, label, or custom filters). Typical setup time: ~5-10 minutes with structured input data.
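The week-over-week comparison described above reduces to a percentage delta per metric, colored green for improvement and red for decline. One subtlety worth encoding: for average position, a *decrease* is an improvement. A minimal sketch of that calculation (the template's exact rounding and color scheme are assumptions):

```javascript
// Compute the week-over-week change for one metric, with a lowerIsBetter flag
// so average-position drops are colored green rather than red.
function weekOverWeek(current, previous, lowerIsBetter = false) {
  if (!previous) return { pct: null, color: 'gray' }; // avoid divide-by-zero
  const pct = Math.round(((current - previous) / previous) * 1000) / 10;
  const improved = lowerIsBetter ? pct < 0 : pct > 0;
  return { pct, color: pct === 0 ? 'gray' : improved ? 'green' : 'red' };
}
```

Running this per metric (clicks, impressions, CTR, position) per brand segment yields exactly the color-coded cells the email table displays.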