Adrian Bent
Hey, I'm Adrian. I got into automation to help me better understand and build systems. If you have any questions about my n8n templates/builds, feel free to ask, and I'll be happy to respond via email at terrflix45@gmail.com. Thanks for your time!
Templates by Adrian Bent
Indeed job scraper with AI filtering & company research using Apify and Tavily
This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow scrapes job listings on Indeed via Apify, automatically fetches the resulting dataset, extracts information about each listing, filters jobs by relevance, finds a decision maker at the company, and updates a database (Google Sheets) with that information for outreach. All you need to do is run the Apify actor; the database will then update with the processed data.

Benefits:
- Complete Job Search Automation - A webhook monitors the Apify actor, which sends an HTTP request and starts the process
- AI-Powered Filter - Uses ChatGPT to analyze content/context, identify company goals, and filter based on the job description
- Smart Duplicate Prevention - Automatically tracks processed job listings in a database to avoid redundancy
- Multi-Platform Intelligence - Combines Indeed scraping with web research via Tavily to enrich each listing
- Niche Focus - Processes content from multiple niches (six currently, hardcoded), but can be changed to fit other niches (just edit the prompt in the "job filter" node)

How It Works:

Indeed Job Discovery:
- Search Indeed and apply filters for relevant job listings, then copy the results URL into Apify
- Apify's Indeed job scraper scrapes the job listings from the URL of interest
- The actor automatically scrapes the information, stores it in a dataset, and triggers the n8n webhook

Data Processing:
- Loops over up to 500 items (can be changed) with a batch size of 55 items (can be changed) to avoid running into API timeouts
- Multiple filters ensure all scraped fields meet the required metrics (a website must exist and the number of employees must be under 250)
- Duplicate job listings are removed from the incoming batch before processing

Job Analysis & Filtering:
- An additional filter removes any job listing from the incoming batch if it already exists in the Google Sheets database
- All new job listings are then passed to ChatGPT, which uses the job post/description to determine whether the listing is relevant to us
- Each relevant job gets a new field, "verdict", which is either true or false, and only the jobs where the verdict is true are kept

Enrich & Update Database:
- Uses Tavily to search for a decision maker (it doesn't always find one) and populates a row in Google Sheets with information about the job listing, the company, and a decision maker at that company
- Waits 1 minute and 30 seconds to avoid Google Sheets and ChatGPT API timeouts, then loops back to the next batch to start filtering again until all job listings are processed

Required Google Sheets Database Setup:

Before running this workflow, create a Google Sheets database with these exact column headers:
- jobUrl - Unique identifier for job listings
- title - Position title
- descriptionText - Description of the job listing
- hiringDemand/isHighVolumeHiring - Are they hiring at high volume?
- hiringDemand/isUrgentHire - Are they hiring with high urgency?
- isRemote - Is this job remote?
- jobType/0 - Job type: in person, remote, part-time, etc.
- companyCeo/name - CEO name collected from Tavily's search
- icebreaker - Column for holding custom icebreakers for each job listing (not completed in this workflow; I will upload another that does this, called "Personalized IJSFE")
- scrapedCeo - CEO name collected from the Apify scraper
- email - Email listed on the job listing
- companyName - Name of the company that posted the job
- companyDescription - Description of the company that posted the job
- companyLinks/corporateWebsite - Website of the company that posted the job
- companyNumEmployees - Number of employees the company lists
- location/country - Location where the job takes place
- salary/salaryText - Salary on the job listing

Setup Instructions:
- Create a new Google Sheet with these column headers in the first row
- Name the sheet whatever you please
- Connect your Google Sheets OAuth credentials in n8n
- Update the document ID in the workflow nodes

The merge logic relies on the jobUrl column (the unique identifier) to prevent duplicate processing, so this structure is essential for the workflow to function correctly. Feel free to reach out for additional help or clarification at my Gmail, terflix45@gmail.com, and I'll get back to you as soon as I can.

Set Up Steps:

Configure Apify Integration:
- Sign up for an Apify account and obtain an API key
- Get the Indeed job scraper actor and use Apify's integration to send an HTTP request to your n8n webhook (if the test URL doesn't work, use the production URL)
- Use the Apify node with Resource: Dataset, Operation: Get items, and your API key as the credentials

Set Up AI Services:
- Add OpenAI API credentials for job filtering
- Add Tavily API credentials for company research
- Set up appropriate rate limiting for cost control

Database Configuration:
- Create the Google Sheets database with the provided column structure
- Connect Google Sheets OAuth credentials
- Configure the merge logic for duplicate detection

Content Filtering Setup:
- Customize the AI prompts for your specific niche, requirements, or interests
- Adjust the filtering criteria to fit your needs
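The duplicate-prevention step described above can be sketched as an n8n Code node. This is a minimal illustration, not the workflow's actual node: it assumes each incoming Apify item carries a `json.jobUrl` field and that the rows already stored in Google Sheets were fetched earlier in the workflow (the node name "Get Sheet Rows" is hypothetical).

```javascript
// Sketch of the duplicate-prevention filter, assuming each incoming Apify
// item has a json.jobUrl field and existingRows holds the Google Sheets rows.
function removeDuplicates(incomingItems, existingRows) {
  // Build a set of jobUrls already present in the database
  const seen = new Set(existingRows.map(row => row.jobUrl));
  // Keep only listings whose jobUrl has not been processed before
  return incomingItems.filter(item => !seen.has(item.json.jobUrl));
}

// Inside an n8n Code node this might be called roughly as:
// return removeDuplicates($input.all(), $('Get Sheet Rows').all().map(i => i.json));
```

Using jobUrl as the key matches the column description above, since it is the listing's unique identifier.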
YouTube video content analyzer & summarizer with Gemini AI
This workflow takes two inputs: a YouTube video URL (required) and a description of what information to extract from the video. If the description ("what you want") field is left empty, the default prompt will generate a detailed summary and description of the video's contents. However, you can ask for something more specific using this field. ++ Don't forget to make the workflow Active and use the production URL from the form node.

Benefits:
- Instant Summary Generation - Convert hours of watching YouTube videos into familiar, structured paragraphs and sentences in less than a minute
- Live Integration - Generate a summary or extract information from a YouTube video whenever, wherever
- Virtually Complete Automation - All you need to do is add the video URL and describe what you want to know from the video
- Presentation - You can ask for a specific structure or tone to better help you understand or study the contents of the video

How It Works:

Smart Form Interface:
- A simple n8n form captures the video URL and a description of what's to be extracted
- Designed for rapid, repeated completion anywhere and anytime

Description Check:
- Uses JavaScript to determine whether the description was filled in or left empty
- If the description field was left empty, the default prompt is: "Please be as descriptive as possible about the contents being spoken of in this video after giving a detailed summary."
- If the description field is filled, that input is used to describe what information to extract from the video

HTTP Request:
- We're using the Gemini API, specifically the video understanding endpoint
- We make a POST HTTP request passing the video URL and the description of what information to extract

Setup Instructions:

HTTP Request Setup:
- Sign up for a Google Cloud account, join the Developer Program, and get your Gemini API key
- Get the curl command for the Gemini video understanding API

The video understanding relies on the inputs from the form, code, and HTTP request nodes, so correct mapping is essential for the workflow to function correctly. Feel free to reach out for additional help or clarification at my Gmail, terflix45@gmail.com, and I'll get back to you as soon as I can.

Setup Steps:

Code Node Setup:
The code node is used as a filter to ensure a description prompt is always passed on. Use the JavaScript code below for that effect (replace 'What do you want?' with the exact label of your form's description field):

```javascript
// Fall back to the default prompt when the form's description field is empty
for (const item of $input.all()) {
  if ((item.json['What do you want?'] || "").trim() === "") {
    item.json['What do you want?'] =
      "Please be as descriptive as possible about the contents being spoken of in this video after giving a detailed summary.";
  }
}
return $input.all();
```

HTTP Request:
- To use Gemini video understanding, you'll need your Gemini API key
- Go to https://ai.google.dev/gemini-api/docs/video-understanding (YouTube section). This link will take you directly to the snippet. Select the REST programming language, copy the curl command, then paste it into the HTTP Request node
- Replace "Please summarize the video in 3 sentences." with the code node's output, which should be either the default description or the one entered by the user (the second output field variable)
- Replace "https://www.youtube.com/watch?v=9hE5-98ZeCg" with the n8n form node's first output field, which should be the YouTube video URL variable
- Replace $GEMINI_API_KEY with your API key

Redirect:
- Use an n8n form node with page type "Final Ending" to redirect the user to the initial n8n form for another analysis, or to a preferred destination
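The request the HTTP Request node ends up sending can be sketched as below. This is a hedged sketch based on the Gemini video-understanding REST docs, not the workflow's exact node configuration: the model name and endpoint path are assumptions, so use whatever the curl snippet you copied actually contains.

```javascript
// Sketch of the Gemini video-understanding request body, assuming the
// generateContent REST endpoint; the model name "gemini-2.0-flash" is an
// assumption taken from the public docs, not from this workflow.
function buildGeminiRequest(videoUrl, prompt, apiKey) {
  return {
    url: "https://generativelanguage.googleapis.com/v1beta/models/gemini-2.0-flash:generateContent",
    headers: { "Content-Type": "application/json", "x-goog-api-key": apiKey },
    body: {
      contents: [{
        parts: [
          { text: prompt },                      // the code node's output (description)
          { file_data: { file_uri: videoUrl } }  // the form's YouTube URL
        ]
      }]
    }
  };
}
```

The two `parts` entries correspond to the two replacements described above: the text prompt from the code node and the `file_uri` from the form node.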
Generate job application icebreakers with GPT-4 and Indeed data from Google Sheets
Part two of the Indeed Job Scraper, Filter, and Enrichment workflow. This workflow takes the information about the scraped and filtered Indeed job listings (collected via Apify and stored in Google Sheets) and generates a customized, five-line email icebreaker that implies the rest of the email is personalized. Personalized IJSFE (Indeed Job Scraper For Enrichment). ++ I am an engineering student, so I love my acronyms.

Benefits:
- Instant Icebreaker Generation - Convert hours of research, copywriting, and personalization into seconds, automatically
- Live Integration - Generate and send personalized icebreakers whenever, wherever
- Virtually Complete Automation - From research into the company and job description to a personalized response, this workflow does it in a click
- Professional Presentation - Because the chatbot has context about the company and the job listing, it generates an icebreaker that makes the reader think real research was done

How It Works:

Google Sheets Search:
- The Google Sheets node fetches all rows where the icebreaker column is empty
- Each returned item is a row containing information about the company and the job listing

AI Personalization:
- Uses sophisticated GPT-4 prompting
- Converts the information about a job posting and company into a customized, five-line personalized email icebreaker
- Applies a consistent, casual tone automatically to seem more human-written

Database Update:
- Updates all rows fetched in the search, writing only the icebreaker column with the new personalized icebreaker
- Each item is returned as a row containing information about the company, the job listing, and the icebreaker

Required Template Setup:

Google Sheets Template: Create a Google Sheet with these columns:
- jobUrl - Unique identifier for job listings
- title - Position title
- descriptionText - Description of the job listing
- hiringDemand/isHighVolumeHiring - Are they hiring at high volume?
- hiringDemand/isUrgentHire - Are they hiring with high urgency?
- isRemote - Is this job remote?
- jobType/0 - Job type: in person, remote, part-time, etc.
- companyCeo/name - CEO name collected from Tavily's search
- icebreaker - AI-generated icebreakers for each job listing
- scrapedCeo - CEO name collected from the Apify scraper
- email - Email listed on the job listing
- companyName - Name of the company that posted the job
- companyDescription - Description of the company that posted the job
- companyLinks/corporateWebsite - Website of the company that posted the job
- companyNumEmployees - Number of employees the company lists
- location/country - Location where the job takes place
- salary/salaryText - Salary on the job listing

Setup Instructions:

Google Sheets Search & Update Setup:
- Create a new Google Sheet with these column headers in the first row
- Name the sheet whatever you please
- Connect your Google Sheets OAuth credentials in n8n
- Update the document ID in the workflow nodes

The search logic in the first Google Sheets node relies on the jobUrl column (the unique identifier) for icebreaker generation, so this structure is essential for the workflow to function correctly. Feel free to reach out for additional help or clarification at my Gmail, terflix45@gmail.com, and I'll get back to you as soon as I can.

AI Icebreaker Generation Setup:
- Configure the OpenAI API for sophisticated proposal writing
- Implement example-based training with input/output pairs for more specific output
- Set up JSON formatting for structure (personally, I think JSON is easier to handle as an output)

Setup Steps:

Search & Fetch Rows Setup:
- Create a Google Sheets database with the provided column structure
- Connect Google Sheets OAuth credentials
- Configure the filter on the get-rows node to include only rows with an empty icebreaker column

Set Up AI Personalization:
- Add OpenAI API credentials for personalized icebreaker generation
- Customize the AI prompts for your specific niche, requirements, or interests

Update Google Sheets Setup:
- Remember to map all items to their respective columns based on the row number
- All fields in the update-sheets node should have the same values as in the first sheets node, and the icebreaker field takes the ChatGPT output as its value
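The update step above can be sketched as follows. This is an illustrative sketch, not the workflow's actual node: it assumes the model's JSON-formatted output has been parsed into objects like `{ icebreaker: "..." }`, one per fetched row, and that rows keep the sheet's column names as field names.

```javascript
// Sketch: merge the model's parsed JSON outputs back into the fetched rows
// so that only the icebreaker column changes; all other fields pass through.
function attachIcebreakers(rows, gptOutputs) {
  return rows.map((row, i) => {
    // gptOutputs[i] is assumed to be parsed JSON like { icebreaker: "..." }
    return { ...row, icebreaker: gptOutputs[i].icebreaker };
  });
}
```

Passing every original field through unchanged mirrors the instruction that all fields in the update node keep the same values as the first sheets node, with only the icebreaker field taking the ChatGPT output.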