Gerald Denor
Gerald Akhidenor is an AI Automation Lead Creative with 5 years of experience in AI automation consultancy. He specializes in n8n and API integrations, driving innovative automation solutions. As founder of DominixAI, Gerald combines creative vision and technical expertise to streamline processes and boost efficiency through cutting-edge AI technologies. Website: https://dominixai.com JobMonkey: https://trafficabc.com/jobmonkey-application-generator/
Categories
Templates by Gerald Denor
AI-powered multi-social media post automation: Google Trends & Perplexity AI
Overview This comprehensive n8n workflow automatically transforms trending Google search queries into engaging LinkedIn posts using AI. The system runs autonomously, discovering viral topics, researching content, and publishing professionally formatted posts to grow your social media presence. Workflow Description Automate your entire social media content pipeline - from trend discovery to publication. This workflow monitors Google Trends, selects high-potential topics, creates human-like content using advanced AI, and publishes across multiple social platforms with built-in tracking. Key Features Automated Trend Discovery: Pulls trending topics from Google Trends API with customizable filters Intelligent Topic Selection: AI chooses the most relevant trending topic for your niche Multi-AI Content Generation: Combines Perplexity for research and OpenAI for content curation Human-Like Writing: Advanced prompts eliminate AI detection markers LinkedIn Optimization: Proper formatting with Unicode characters, emojis, and engagement hooks Multi-Platform Support: Ready for LinkedIn, Twitter/X, and Facebook posting Automated Scheduling: Configurable posting times (default: 6 AM & 6 PM daily) Performance Tracking: Automatic logging to Google Sheets with timestamps and metrics Error Handling: Built-in delays and retry mechanisms for API stability Technical Implementation Workflow Architecture Schedule Trigger: Automated execution at specified intervals Google Trends API: Fetches trending search queries with geographical filtering Data Processing: JavaScript code node filters high-volume keywords (30+ search volume) Topic Selection: OpenAI GPT-3.5 evaluates and selects optimal trending topic Content Research: Perplexity AI researches selected topic for current information Content Generation: Advanced prompt engineering creates LinkedIn-optimized posts Content Distribution: Multi-platform posting with platform-specific formatting Analytics Tracking: Google Sheets integration for performance monitoring Node Breakdown Schedule Trigger: Configurable timing for automated execution HTTP Request (Google Trends): SerpAPI integration for trend data Set Node: Structures trending data for processing Code Node: JavaScript filtering for high-volume keywords OpenAI Node: Intelligent topic selection based on relevance and trend strength HTTP Request (Perplexity): Advanced AI research with anti-detection prompts Wait Node: Rate limiting and API respect Split Out: Prepares content for multi-platform distribution LinkedIn Node: Authenticated posting with community management Google Sheets Node: Automated tracking and analytics Social Media Nodes: Twitter/X, LinkedIn and Facebook ready for activation Use Cases Content Creators: Maintain consistent posting schedules with trending content Marketing Agencies: Scale content creation across multiple client accounts Business Development: Build thought leadership with timely industry insights Personal Branding: Establish authority by commenting on trending topics SEO Professionals: Create content around high-search-volume keywords Configuration Requirements API Integrations SerpAPI: Google Trends data access Perplexity AI: Advanced content research capabilities OpenAI: Content curation and topic selection LinkedIn Community Management API: Professional posting access Google Sheets API: Analytics and tracking Authentication Setup LinkedIn OAuth2 community management credentials Google Sheets OAuth2 integration HTTP header authentication for AI services Customization Options 
Industry Targeting: Modify prompts for specific business verticals Posting Schedule: Adjust timing based on audience activity Content Tone: Customize voice and style through prompt engineering Platform Selection: Enable/disable specific social media channels Trend Filtering: Adjust search volume thresholds and geographic targeting Content Length: Modify character limits for different platforms Advanced Features Anti-AI Detection: Sophisticated prompts create human-like content Rate Limit Management: Built-in delays prevent API throttling Error Recovery: Robust error handling with retry mechanisms Content Deduplication: Prevents posting duplicate content Engagement Optimization: LinkedIn-specific formatting for maximum reach Performance Metrics Time Savings: Eliminates 10+ hours of weekly content creation Consistency: Maintains regular posting schedule without manual intervention Relevance: Content always based on current trending topics Engagement: Optimized formatting increases social media interaction Scalability: Single workflow manages multiple platform posting Installation Notes Import JSON workflow file into n8n instance Configure all required API credentials Set up Google Sheets tracking document Test workflow execution with manual trigger Enable schedule trigger for automated operation Best Practices Monitor API usage to stay within rate limits Regularly update prompts based on content performance Review and adjust trending topic filters for your niche Maintain backup of workflow configuration Test content output before enabling automation Support & Updates Comprehensive setup documentation included Configuration troubleshooting guide provided Regular workflow updates for API changes Community support through n8n forums Tags social-media content-automation linkedin ai-generation google-trends perplexity openai marketing trend-analysis content-creation Compatibility n8n Version: 1.0+ Node Requirements: Standard n8n installation External Dependencies: API access to listed services Hosting: Compatible with cloud and self-hosted n8n instances
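For readers who want to see what the JavaScript filtering step looks like in practice, here is a minimal sketch of an n8n Code node that keeps only keywords at or above the 30+ search-volume threshold described above. The field names (related_queries, extracted_value) are assumptions based on a typical SerpAPI Google Trends response and should be adjusted to match the data your HTTP Request node actually returns.
```javascript
// Sketch of the "Code" node filtering step (n8n Code node, "Run Once for All Items").
// Field names are assumptions — adjust them to your SerpAPI response structure.
const MIN_VOLUME = 30; // the 30+ search-volume threshold mentioned above

const trending = [];
for (const item of $input.all()) {
  const queries = item.json.related_queries ?? [];
  for (const q of queries) {
    const volume = parseInt(q.extracted_value ?? q.value ?? 0, 10);
    if (volume >= MIN_VOLUME) {
      trending.push({ json: { keyword: q.query, volume } });
    }
  }
}

return trending;
```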
AI-powered automated job search & application
AI-powered automated job search & application
Unleash the power of AI to automate your job search, tailor your applications, and boost your chances of landing your dream job! This comprehensive workflow handles everything from finding relevant job postings to generating personalized resumes and cover letters. Use cases are many: Automate your entire job application process: Spend less time searching and more time preparing for interviews. Tailor your resume and cover letter for every application: Maximize your ATS compatibility and stand out to recruiters. Efficiently track your applications: Keep all your job search activities organized in one place. Discover new job opportunities: Leverage the Adzuna API to find relevant listings.
---
Good to know: Free Adzuna API: This workflow utilizes the free Adzuna API, making job search capabilities accessible without initial cost. OpenRouter Chat Model Costs: AI model usage (for resume rewriting and cover letter generation) will incur costs based on the OpenRouter pricing model. Please check OpenRouter's official website for updated pricing information. Model Availability: The AI models used may have geo-restrictions. If you encounter a "model not found" error, it might not be available in your country or region.
---
How it works: Webhook Trigger: The workflow is initiated via a webhook, allowing you to trigger it manually or integrate it with other systems (e.g., a form submission with your desired job title and resume). Resume Extraction: Your uploaded resume (e.g., PDF) is automatically extracted into a readable text format. Job Search (Adzuna API): Using the provided job title, the workflow queries the Adzuna API to fetch relevant job postings. Job Filtering: Duplicate job listings are filtered out to ensure you receive unique opportunities. Job Info Extraction: Key details like job description, company name, and job URL are extracted from each posting. Skills Extraction (AI): An AI model (OpenRouter) analyzes the job description to identify the top skills and qualifications required. Resume Match Scoring (AI): Your resume is compared against the extracted job skills by an AI model, generating a compatibility score (1-5). Conditional Resume & Cover Letter Generation: If the resume match score is satisfactory (≥ 3): Tailored Resume Generation (AI): An AI model rewrites your resume, specifically highlighting the skills and experience most relevant to the target job, in an ATS-friendly and human-readable JSON/HTML format. Personalized Cover Letter Generation (AI): A custom cover letter is drafted by AI, uniquely tailored to the job description and your newly optimized resume, generated as well-formatted HTML. Google Sheets Integration: The generated cover letter, tailored resume, job URL, and application status are automatically updated in your designated Google Sheet for easy tracking. Gmail Notification: A personalized email containing the generated cover letter, tailored resume, and a direct link to the job posting on Adzuna is sent to your specified email address. Webhook Response: A final text response is sent back via the webhook, summarizing the sent application materials.
---
How to use: Manual Trigger: The workflow is set up with a manual trigger (Webhook) for initial testing and demonstration. You can easily replace this with an n8n form, a scheduled trigger, or integrate it into your existing tools. Input: Provide your desired job search keyword and your resume (e.g., as a PDF) to the webhook.
Review & Apply: Review the AI-generated cover letter and tailored resume sent to your email, then proceed to apply for the job using the provided Adzuna link. --- Requirements: n8n Instance: A running n8n instance (self-hosted or cloud). Adzuna API Key: A free Adzuna API key (easily obtainable from their developer portal). OpenRouter Account: For AI model access (costs apply based on usage). Google Sheets Account: To store and track your job applications. Gmail Account: To send automated application emails. --- Customizing this workflow: This workflow is highly customizable. You can: Integrate with other job boards (e.g., LinkedIn, Indeed) using their APIs. Add more sophisticated AI models or custom prompts for even finer control over resume and cover letter generation. Connect to other services for CRM, calendar management, or applicant tracking. Implement different filtering criteria for job postings. Expand the data stored in your Google Sheet (e.g., interview dates, feedback). Start automating your job search today and streamline your path to career success!
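For orientation, this is roughly what the Adzuna job-search request behind the "Job Search (Adzuna API)" step looks like when written out in Node.js. The endpoint shape follows Adzuna's public API; the country code ("gb") and the result count shown here are assumptions you should adapt to your own search.
```javascript
// Minimal sketch (Node.js 18+) of the Adzuna search call made via an HTTP Request node.
// APP_ID/APP_KEY come from developer.adzuna.com; country code and page size are assumptions.
const APP_ID = 'YOUR_ADZUNA_APP_ID';
const APP_KEY = 'YOUR_ADZUNA_APP_KEY';

async function searchJobs(keyword) {
  const url = new URL('https://api.adzuna.com/v1/api/jobs/gb/search/1');
  url.searchParams.set('app_id', APP_ID);
  url.searchParams.set('app_key', APP_KEY);
  url.searchParams.set('what', keyword);
  url.searchParams.set('results_per_page', '20');

  const res = await fetch(url);
  if (!res.ok) throw new Error(`Adzuna request failed: ${res.status}`);
  const data = await res.json();
  return data.results; // each result carries the job title, company, description and a link to the listing
}

// searchJobs('Software Engineer').then((jobs) => console.log(jobs.length, 'jobs found'));
```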
Automate job search & applications with 5 job boards & AI resume generator
Automate Your Job Search: Find Job Listings on LinkedIn, Indeed, Glassdoor, Upwork & Adzuna! Stop wasting time manually searching for jobs! This powerful job search automation workflow for n8n is your secret weapon to finding and preparing for your next career move, effortlessly. Tired of visiting multiple job boards every day, running the same searches, and drowning in a sea of irrelevant listings? Our Job Search Automation workflow streamlines the entire process for you. It automatically scours top job boards, filters and deduplicates the results, and even helps you craft a tailored resume and cover letter. This workflow will automatically pull job listings from: LinkedIn: The world's largest professional network. Indeed: One of the most popular job search engines. Glassdoor: A leading platform for company reviews and job listings. Upwork: The go-to marketplace for freelance opportunities. Adzuna: A comprehensive job search engine with a massive database of listings. Key Features: Multi-Board Job Sourcing: Never miss an opportunity! This workflow simultaneously searches LinkedIn, Indeed, Glassdoor, Upwork, and Adzuna for your desired job title. Intelligent Filtering & Deduplication: Say goodbye to repetitive listings. The workflow cleans up the results, so you only see unique, relevant job postings. AI-Powered Resume & Cover Letter Generation: Once a promising job is found, the workflow uses AI to analyze the job description, score your resume's match, and then rewrites your resume and generates a tailored cover letter to highlight the right skills and experience. Automated Email Delivery: Receive the tailored resume and cover letter directly in your inbox, ready for you to review and send. Google Sheets Integration: Keep track of every job you're considering. The workflow automatically logs the job details, your tailored documents, and the application status in a Google Sheet. Fully Customizable: Easily adapt the workflow to your specific needs. Change the job titles, locations, and even add or remove job boards. Why You Need This Workflow: Save Dozens of Hours: Automate the most time-consuming parts of your job search. Discover More Opportunities: Cast a wider net by searching multiple platforms at once. Apply Faster: Get a head start with AI-generated, tailored application materials. Stay Organized: Effortlessly track your applications in a centralized location. Take control of your job search and land your dream job faster. Get the Job Search Automation workflow today! ============ How to configure your new n8n workflow. Prerequisites: An active n8n instance. Accounts for the services you want to use (Apify, Google, etc.). Configuration Steps: Apify Account & Scrapers: This workflow uses Apify to scrape job listings from Indeed, LinkedIn, Upwork, and Glassdoor. Create an Apify Account: If you don't have one, sign up at https://apify.com. A free plan is available. Get Your Apify API Key: In your Apify account, go to Settings > Integrations to find your API token. Add Apify Credentials to n8n: In your n8n workflow, navigate to the Apify: Run Indeed Scraper node. In the "Credentials" section, click to add new credentials. Give your credentials a name (e.g., "My Apify Account") and paste your API token. 
Apply Credentials to All Apify Nodes: You will need to select your newly created Apify credentials for the following nodes: Apify: Run Indeed Scraper Apify: Get Indeed Results Apify: Run LinkedIn Scraper Apify: Get LinkedIn Results Apify: Run Upwork Scraper Apify: Get Upwork Results Apify: Run Glassdoor Scraper Apify: Get Glassdoor Results
Adzuna API (Optional but Recommended): The workflow uses Adzuna for an additional source of job listings. You will need to sign up for a free developer account to get an App ID and App Key. Sign Up: Go to https://developer.adzuna.com/ and register. Get Your Credentials: Once registered, you will find your app_id and app_key in your dashboard. Update the "Get Jobs from Adzuna" Node: Click on this node to open its parameters. In the URL field, insert your app_id and app_key.
OpenRouter for AI Models: This workflow uses OpenRouter to access various AI models for tasks like resume scoring and writing. Create an OpenRouter Account: Sign up at https://openrouter.ai/. Get Your API Key: Find your key in your account settings. Add OpenRouter Credentials to n8n: Go to the OpenRouter Chat Model node. Add your OpenRouter API key as new credentials.
Google Sheets and Gmail Integration: Create a Google Sheet: Create a new Google Sheet to track your job applications. The workflow is pre-configured with the following columns, but you can customize them: job_title, job_description, job_url, company, email, status, tailored_resume, cover_letter. Add Google Credentials to n8n: You will need to authenticate your Google account in n8n to allow access to Sheets and Gmail. In the "Upate sheets" node, go to the "Credentials" section and follow the prompts to connect your Google account using OAuth2. Do the same for the Gmail node. It's recommended to use the same credentials for both. Configure the "Upate sheets" Node: Select your Google Sheets credentials. In the "Document ID" field, enter the ID of the Google Sheet you created. You can find this in the URL of your sheet (it's the long string of characters between /d/ and /edit). In the "Sheet Name" field, select the correct sheet from the dropdown list.
Setting up the Initial Request (Webhook): This workflow is triggered by a webhook. This means you can start it by sending a POST request from another application or using a tool like Postman. Find the Webhook URL: In the Webhook node, you will see a "Test URL" and a "Production URL". Use the production URL for live use. Required Data: The webhook expects a JSON body with the following fields: jobSearchKeyword: The job title you want to search for (e.g., "Software Engineer"). email: Your email address where the results will be sent. You will also need to upload your resume as a file in the request.
Running the Workflow: Upload Your Resume: The workflow is designed to be initiated with your resume. When you trigger the webhook, you need to include the resume file. Activate the Workflow: Once all credentials are in place, save and activate your workflow. Trigger the Workflow: Send a POST request to the production webhook URL with the required JSON data and your resume file. You can use Postman to do this or replace the webhook with a form. You are now all set! The workflow will begin searching for jobs and preparing your application materials.
Need Help? Reach me: https://www.linkedin.com/in/gerald-akhidenor-1ab1a45/ Work with me: https://dominixai.com/ My website: https://jobmonkey.dev My email: denorgerald@gmail.com
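If you prefer code over Postman, the snippet below sketches how the production webhook can be triggered from Node.js 18+. The jobSearchKeyword and email fields match the required data listed above; the webhook path and the name of the file field ("resume") are placeholders/assumptions, so match them to your own Webhook node configuration.
```javascript
// Sketch for Node.js 18+ (built-in fetch, FormData, Blob). Save as an .mjs file to allow top-level await.
// WEBHOOK_URL and the "resume" field name are placeholders — use your own production webhook settings.
import { readFile } from 'node:fs/promises';

const WEBHOOK_URL = 'https://YOUR_N8N_URL/webhook/your-production-path';

const form = new FormData();
form.append('jobSearchKeyword', 'Software Engineer');
form.append('email', 'you@example.com');
form.append(
  'resume',
  new Blob([await readFile('./resume.pdf')], { type: 'application/pdf' }),
  'resume.pdf'
);

const res = await fetch(WEBHOOK_URL, { method: 'POST', body: form });
console.log(await res.text()); // the workflow responds with a summary of the generated materials
```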
Automate job discovery & AI proposals across Upwork, Freelancer, Guru & PPH with OpenRouter
Upwork/Freelancer/Guru/PPH Job Automation + AI Proposal Generator Overview This comprehensive n8n workflow automates freelance job discovery and application processes across four major platforms: Upwork.com, Freelancer.com, PeoplePerHour.com, and Guru.com. The system monitors RSS feeds, extracts job details, generates personalized AI proposals, and tracks everything in Google Sheets with email notifications. Supported Platforms Complete Market Coverage Upwork.com World's largest freelance marketplace High-value projects and enterprise clients Competitive bidding environment requiring fast responses Freelancer.com Global platform with diverse project types International client base across all industries Contest and fixed-price project opportunities PeoplePerHour.com UK-focused professional services platform Hourly and project-based work Business and creative services emphasis Guru.com Premium freelance marketplace for skilled professionals Work room collaboration features Recurring client relationship focus Use Cases Primary Use Cases Multi-Platform Lead Generation Monitor Upwork, Freelancer, PeoplePerHour, and Guru simultaneously Receive instant notifications for relevant opportunities across all platforms Never miss time-sensitive job postings from any major marketplace Cross-Platform Proposal Automation Generate platform-specific, personalized proposals using AI Adapt proposal style to each platform's requirements and culture Maintain consistent quality across all applications and platforms Comprehensive Opportunity Tracking Automatically log jobs from all four platforms in Google Sheets Track application status and platform performance Competitive Market Advantage Respond to jobs within minutes across multiple platforms Maintain 24/7 monitoring without manual intervention Access broader market opportunities while competitors focus on single platforms Industry Applications Digital Marketing Agencies Monitor automation projects across Upwork and Freelancer Track competitor applications on multiple platforms Access European clients through PeoplePerHour and international through Guru Software Development Teams Find API integration projects on Upwork and technical contests on Freelancer Monitor PeoplePerHour for UK-based development work Track long-term client relationships on Guru Virtual Assistant Services Discover automation opportunities across all four platforms Monitor recurring service requests and business support roles Scale operations by accessing different market segments Consulting Practices Track business automation requests on premium platforms like Guru Monitor transformation projects across international markets Access diverse client bases from startup (Freelancer) to enterprise (Upwork) Technical Features Technical Features Multi-Platform RSS Monitoring Configurable polling intervals (default: 5 minutes) Processes Vollna API feeds for Upwork, Freelancer, PeoplePerHour, and Guru Handles various platform-specific job feed formats automatically Intelligent Data Extraction Parses job titles to extract budget information across different platform formats Supports multiple pricing formats (fixed, hourly, ranges) from all platforms Cleans and structures job data with platform-specific considerations Advanced URL Processing Decodes nested URLs from feed redirects Automatically identifies source platforms (Upwork, Freelancer, PeoplePerHour, Guru) Validates and formats final job URLs for direct platform access AI Integration with Platform Awareness Uses OpenRouter for AI model access 
with platform-specific contexts Implements advanced prompt engineering for different platform cultures Generates contextual, personalized proposals adapted to each platform Multi-Platform Support Gmail integration for notifications Google Sheets for data storage HTML email formatting Automated database updates Configuration Options Filtering System ASCII text filtering for English content Custom budget range filtering Platform-specific filtering rules Keyword-based job matching AI Customization Customizable proposal templates Variable experience levels Industry-specific positioning Personal branding integration Notification Preferences HTML-formatted email alerts Mobile-friendly email templates Batch processing options Error handling and retry logic Benefits Operational Efficiency Time Savings Eliminates manual job checking (save 2-3 hours daily) Automates proposal writing process Reduces administrative overhead Response Speed 5-minute job discovery cycle Instant proposal generation Competitive response timing Quality Consistency Standardized proposal quality Professional email formatting Consistent brand presentation Business Intelligence Market Analysis Budget trend tracking Platform performance comparison Job volume analytics Competitive landscape insights Performance Tracking Application success rates Response time analysis Platform-specific metrics ROI measurement capabilities Scalability Features Multi-User Support Team-based implementations Role-specific customizations Shared resource management Integration Ready CRM system compatibility Project management tool integration Calendar synchronization options API extension capabilities Requirements Technical Prerequisites n8n Setup n8n Cloud or self-hosted instance Webhook capabilities enabled Cron trigger support External Services Gmail account with API access Google Sheets API credentials OpenRouter API key Vollna account for RSS feeds Optional Enhancements Custom domain for webhooks SSL certificates for secure connections Database storage for advanced analytics Installation & Setup Need Help? Reach me: https://www.linkedin.com/in/gerald-akhidenor-1ab1a45/ Work with me: https://dominixai.com/ My website: https://jobmonkey.dev My email: denorgerald@gmail.com
---
Beginner Setup Guide Workflow: Upwork Job Application Automation with Vollna API This workflow automates the process of: Reading new jobs from a Vollna RSS feed, Extracting job title, budget, and source, Using AI to generate a tailored proposal, Sending the proposal to your email, and Saving job details into a Google Sheet.
---
Import the Workflow Open your n8n dashboard. Click Import in the top-right. Upload the file: Upwork Job Application Automation with Vollna API Updated.json.
---
Set Up Required Credentials Before the workflow can run, connect your accounts. Gmail Credentials Go to Credentials in n8n. Create a new credential: Gmail OAuth2 API. Follow the login prompt and grant access. Save it, then select it in the Send a Message node. Google Sheets Credentials Create a credential for Google Sheets OAuth2 API. Connect it to your Google account. Save it, then select it in the Update Database node. OpenRouter / AI API Key Sign up at openrouter.ai. Copy your API key. In n8n, create a credential for OpenRouter API. Select it in the OpenRouter Chat Model node.
---
Configure the RSS Feed Open the RSS Feed Trigger node. Replace https://www.vollna.com/rss/insertyourlinkhere with your personal Vollna RSS feed URL.
You can get this from your Vollna account (filtering for Upwork jobs).
---
Understand the Loop This workflow uses Loop Over Items to process multiple jobs one by one. Loop branch (Output 1) → Processes each job (Extract, Decode, AI Proposal, Save to DB). Done branch (Output 2) → Would normally run once all jobs are processed. (Here, it's not used; instead, the loop cycles back until no jobs are left.) So the workflow ensures every job from RSS is handled.
---
Email Setup The Send a Message node will send proposals to your email. Default recipient: denorgerald@gmail.com. Change this in the Send a Message → Send To field to your own email.
---
Google Sheets Setup Get Your Google Sheets Copy The Update Database node writes job data to your Google Sheet. It's already linked to this sheet: Upwork Jobs Automation → upwork_jobs Replace with your own Google Sheet ID if you want. Copy your sheet's URL, e.g.: https://docs.google.com/spreadsheets/d/<YOUR_SHEET_ID>/edit#gid=0 Paste <YOUR_SHEET_ID> into the Document ID. Update Sheet Name if needed (default is upwork_jobs).
---
Test the Workflow Click Execute Workflow. The RSS Trigger will fetch jobs. You should see each job processed step by step: Title & Budget extracted URL decoded Proposal generated by AI Email sent Row added to Google Sheet
---
Activate Automation When satisfied, toggle the workflow to Active. It will check the RSS feed every 5 minutes (you can adjust this in the RSS Trigger). Your Upwork Job Application Automation is now ready to help you discover and apply to more opportunities automatically. Remember to monitor the system regularly and adjust settings based on your results. Happy freelancing!
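To make the "Extract" and "Decode" steps of the loop more concrete, here is a simplified Code-node sketch of how a budget can be pulled from a feed item's title and how a nested job URL can be decoded from a redirect link. The regex and the url query-parameter name are assumptions; feed formats vary, so adjust them to the items you actually receive.
```javascript
// n8n Code node sketch ("Run Once for All Items"). Regex and the `url` redirect parameter are assumptions.
const results = [];
for (const item of $input.all()) {
  const title = item.json.title || '';

  // Pull a budget out of titles like "Build an n8n automation - Budget: $500" or "($30-$60/hr)".
  const budgetMatch = title.match(/\$\s?\d[\d,]*(?:\s?-\s?\$?\d[\d,]*)?(?:\s?\/\s?hr)?/i);

  // Feed links often redirect; the real job URL may be nested in a query parameter.
  let jobUrl = item.json.link || '';
  try {
    const nested = new URL(jobUrl).searchParams.get('url');
    if (nested) jobUrl = decodeURIComponent(nested);
  } catch (e) {
    // keep jobUrl unchanged if it is not a valid absolute URL
  }

  results.push({
    json: { title, budget: budgetMatch ? budgetMatch[0] : 'Not specified', jobUrl },
  });
}
return results;
```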
AI-powered mock job interview with voice, assessment & Gmail reporting
AI Mock Interview System - Complete n8n Template Overview This n8n workflow template creates a comprehensive AI-powered mock interview system that conducts voice-based interviews, provides real-time transcription, and generates detailed performance assessments. The system uses OpenAI's GPT-4o Realtime API for natural conversations and automated scoring across five professional criteria. What This Template Does Core Functionality Voice-Enabled Interviews: Real-time AI conversations using OpenAI's Realtime API Resume-Aware Questioning: Tailored questions based on uploaded resume content Automatic Timing: 15-minute sessions with automatic conclusion Live Transcription: Real-time conversation display during interviews Comprehensive Assessment: 5-criteria scoring system with STAR method evaluation Automated Reporting: HTML report generation and email delivery Workflow Components Interview Setup Form: Collects job role, email, and resume information Interview Engine: Manages real-time AI conversation flow Assessment Generator: Analyzes performance and creates detailed reports Email Delivery: Sends professional assessment reports automatically Prerequisites Required Services and Accounts n8n Instance: Cloud or self-hosted version OpenAI API: Account with GPT-4o Realtime API access OpenRouter Account: For cost-effective assessment analysis (free tier available) Gmail Account: For automated email delivery Google Cloud Console: For Gmail API credentials Estimated API Costs OpenAI API: ~$0.15-0.75 per 15-minute interview OpenRouter: ~$0.001-0.01 per assessment report Total operational cost: Under $1 per session Step-by-Step Setup Instructions Import the Workflow Download the workflow JSON file In your n8n instance, click "Import from file" Select the downloaded file and import Verify all nodes are properly connected Configure OpenAI API Integration Get Your API Key Visit platform.openai.com/api-keys Create a new secret key named "Mock Interview System" Copy the key (format: sk-proj-...) Ensure billing is enabled on your OpenAI account Configure in Workflow Method 1: Direct Configuration Locate the HTTP Request node for OpenAI In Headers section, find Authorization parameter Replace placeholder with: Bearer YOUR_API_KEY Method 2: Using n8n Credentials (Recommended) Go to Settings → Credentials in n8n Add new OpenAI credential Enter your API key and save as "OpenAI Mock Interview" Reference this credential in the HTTP Request node Set Up OpenRouter for Assessments Sign up at openrouter.ai Generate an API key from the dashboard In the workflow, find the OpenRouter Chat Model node Add your OpenRouter credentials Verify model is set to deepseek/deepseek-r1:free for cost efficiency Configure Gmail Integration Enable Gmail API Go to Google Cloud Console Create a new project or select existing Enable the Gmail API for your project Create OAuth 2.0 credentials Add authorized redirect URIs (n8n will provide these) Configure in n8n Navigate to Settings → Credentials Add new Gmail OAuth2 credential Enter Client ID and Client Secret from Google Cloud Complete OAuth authorization flow Test the connection Update Email Node Find the "Send interview assessment report" node Select your Gmail credentials Customize the email template as needed Test email delivery functionality Update Webhook URLs The template contains placeholder URLs that must be updated for your instance.
Find Your n8n Base URL n8n Cloud: https://[your-subdomain].app.n8n.cloud Self-hosted: Your custom domain Update HTML Forms Interview Setup Form Node: Find: action="https://n8n.dominixai.com/webhook/start-interview" Replace with: action="https://YOUR_N8N_URL/webhook/start-interview" Interview Interface Node: Find: https://n8n.dominixai.com/webhook/generate-report Replace with: https://YOUR_N8N_URL/webhook/generate-report Get Webhook URLs Click each Webhook trigger node Copy the Production URL Use these URLs in your HTML form actions Testing the System Component Testing API Connection Test: Execute the OpenAI HTTP Request node to verify connectivity Email Test: Send a test assessment report to your email Assessment Generation: Test the OpenRouter node with sample transcript data Full System Test Activate the workflow Navigate to the interview setup webhook URL Fill the form with test data: Job Role: "Software Developer" Your email address Sample resume content Complete the interview process Verify assessment email delivery Customization Guide Interview Duration Modify the timer in the Interview Interface HTML:
```javascript
const interviewDuration = 15; // Change to desired minutes
```
Assessment Criteria Edit the prompt in the "Interview Assessor" node to: Modify scoring weights Add industry-specific criteria Customize feedback categories Question Customization Update the conversation prompt to: Add role-specific questions Include company culture queries Incorporate technical assessments Branding and Styling Update CSS styling in HTML nodes Add company logos and colors Customize email templates Modify form layouts and designs Advanced Customizations Add multiple interview rounds Implement difficulty progression Include video recording capabilities Add candidate scoring comparison Workflow Architecture Node Structure Webhook Triggers: Handle form submissions and interview completion HTTP Request Nodes: Interface with OpenAI Realtime API Code Nodes: Process resume data and generate interview questions HTML Nodes: Serve interview forms and interfaces OpenRouter Node: Generate performance assessments Gmail Node: Deliver assessment reports Data Flow User submits setup form → Resume processing Interview initialization → OpenAI session creation Real-time conversation → Transcript generation Interview completion → Assessment analysis Report generation → Email delivery Security and Privacy Data Handling No permanent storage of personal information Real-time processing with automatic cleanup GDPR-compliant data handling practices Secure API credential management Security Best Practices Use n8n credential system for API keys Enable HTTPS for all webhook endpoints Implement rate limiting on public endpoints Regular security updates and monitoring Troubleshooting Common Issues OpenAI API Errors Verify API key format and permissions Check billing status on OpenAI account Ensure Realtime API access is enabled Email Delivery Problems Confirm Gmail OAuth setup Check spam folders for test emails Verify Gmail API quotas and limits Webhook Connection Issues Ensure workflow is activated Verify URL formatting (no trailing slashes) Test webhook endpoints individually Interview Interface Problems Check browser microphone permissions Test on different browsers Verify JavaScript console for errors Debug Steps Enable workflow execution logging Test individual nodes in isolation Check API response status codes Verify credential configurations Monitor workflow execution logs Performance Optimization API Efficiency Implement
request caching where appropriate Set up retry logic for failed API calls Monitor API usage and costs Configure timeout settings Scalability Considerations Set up load balancing for high traffic Implement queue management for concurrent interviews Monitor system resources and performance Plan for API rate limit management Use Cases and Applications Educational Institutions Student career preparation Mock interview practice sessions Interview skill development programs Career counseling support Corporate Training Employee interview training Hiring manager preparation Internal promotion assessments Skills evaluation programs Career Coaching Individual coaching sessions Group interview workshops Resume and interview alignment Confidence building exercises HR and Recruitment Candidate pre-screening Interview process standardization Hiring bias reduction Recruitment efficiency improvement Conclusion This AI Mock Interview System template provides a complete solution for automated interview practice and assessment. With proper setup and customization, it can serve various educational, corporate, and professional development needs while maintaining cost efficiency and user privacy. The modular design allows for extensive customization while the automated assessment system provides consistent, objective feedback to help candidates improve their interview performance. If you would rather avoid setup hassles, you can check it out here: Turn Interview Anxiety Into Interview Success
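As an illustration of the retry logic for failed API calls recommended under Performance Optimization, the helper below shows one way to wrap an HTTP call with exponential backoff. The attempt count, delays, and the example endpoint are assumptions to tune; n8n's HTTP Request node also offers built-in retry settings if you prefer a no-code approach.
```javascript
// Generic retry helper with exponential backoff — attempt counts and delays are assumptions to tune.
async function callWithRetry(fn, attempts = 3, baseDelayMs = 2000) {
  for (let i = 0; i < attempts; i++) {
    try {
      return await fn();
    } catch (err) {
      if (i === attempts - 1) throw err;          // out of retries, surface the error
      const wait = baseDelayMs * Math.pow(2, i);  // 2s, 4s, 8s, ...
      await new Promise((resolve) => setTimeout(resolve, wait));
    }
  }
}

// Example usage with a placeholder endpoint:
// const report = await callWithRetry(() =>
//   fetch('https://YOUR_N8N_URL/webhook/generate-report', { method: 'POST', body: transcript })
//     .then((r) => { if (!r.ok) throw new Error(`HTTP ${r.status}`); return r.json(); })
// );
```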
Analyze Reddit content and comments for sentiment with DeepSeek AI
Reddit Sentiment Analysis with AI-Powered Insights Automatically analyze Reddit posts and comments to extract sentiment, emotional tone, and actionable community insights using AI. This powerful n8n workflow combines Reddit's API with advanced AI sentiment analysis to help community managers, researchers, and businesses understand public opinion and engagement patterns on Reddit. Get structured insights including sentiment scores, toxicity levels, trending concerns, and moderation recommendations. Features Comprehensive Sentiment Analysis: Categorizes content as Positive, Negative, or Neutral with confidence scores Emotional Intelligence: Detects emotional tones like excitement, frustration, concern, or sarcasm Content Categorization: Identifies discussion types (questions, complaints, praise, debates) Toxicity Detection: Flags potentially harmful content with severity levels Community Insights: Analyzes engagement quality and trending concerns Actionable Intelligence: Provides moderation recommendations and response urgency levels Batch Processing: Efficiently processes multiple posts and their comments Structured JSON Output: Returns organized data ready for further analysis or integration How It Works The workflow follows a two-stage process: Data Collection: Fetches recent posts from specified subreddits along with their comments AI Analysis: Processes content through DeepSeek AI to generate comprehensive sentiment and contextual insights Use Cases Community Management: Monitor sentiment trends and identify posts requiring moderator attention Brand Monitoring: Track public opinion about your products or services on Reddit Market Research: Understand customer sentiment and concerns in relevant communities Content Strategy: Identify what type of content resonates positively with your audience Crisis Management: Quickly detect and respond to negative sentiment spikes Required Credentials Before setting up this workflow, you'll need to obtain the following credentials: Reddit OAuth2 API Go to Reddit App Preferences Click "Create App" or "Create Another App" Choose "web app" as the app type Fill in the required fields: Name: Your app name Description: Brief description of your app Redirect URI: http://localhost:8080/oauth/callback (or your n8n instance URL + /oauth/callback) Note down your Client ID and Client Secret OpenRouter API Visit OpenRouter Sign up for an account Navigate to your API Keys section Generate a new API key Copy the API key for use in n8n Step-by-Step Setup Instructions Step 1: Import the Workflow Copy the workflow JSON from this template In your n8n instance, click the "+" button to create a new workflow Select "Import from URL" or "Import from Clipboard" Paste the workflow JSON and click "Import" Step 2: Configure Reddit Credentials Click on any Reddit node (e.g., "Get many posts") In the credentials dropdown, click "Create New" Select "Reddit OAuth2 API" Enter your Reddit app credentials: Client ID: Your Reddit app client ID Client Secret: Your Reddit app client secret Auth URI: https://www.reddit.com/api/v1/authorize Access Token URI: https://www.reddit.com/api/v1/access_token Click "Connect my account" and authorize the app Save the credentials Step 3: Configure OpenRouter Credentials Click on the "OpenRouter Chat Model1" node In the credentials dropdown, click "Create New" Select "OpenRouter API" Enter your OpenRouter API key Save the credentials Step 4: Test the Webhook Click on the "Webhook" node Copy the webhook URL (it will look like: 
https://your-n8n-instance.com/webhook/reddit-sentiment) Test the webhook using a tool like Postman or curl with this sample payload:
```json
{ "subreddit": "technology", "query": "AI", "limit": 5 }
```
Step 5: Customize the Analysis Modify the Structured Data Generator prompt: Edit the prompt in the "Structured Data Generator" node to adjust the analysis criteria or output format Change the AI model: In the "OpenRouter Chat Model1" node, you can switch to different models like anthropic/claude-3-haiku or openai/gpt-4 based on your preferences and budget Adjust post limits: Modify the limit parameter in the "Get many posts" and "Get many comments in a post" nodes to control how much data you process Usage Instructions Making API Calls Send a POST request to your webhook URL with the following parameters: Required Parameters: subreddit: The subreddit name (without r/) limit: Number of posts to analyze (recommended: 5-15) Optional Parameters: query: Search term to filter posts (optional) Example Request:
```bash
curl -X POST https://your-n8n-instance.com/webhook/reddit-sentiment \
  -H "Content-Type: application/json" \
  -d '{ "subreddit": "CustomerService", "limit": 10 }'
```
Understanding the Output The workflow returns a JSON array with detailed analysis for each post:
```json
[
  {
    "sentiment_analysis": {
      "overall_sentiment": { "category": "Negative", "confidence_score": 8 },
      "emotional_tone": ["frustrated", "concerned"],
      "intensity_level": "High"
    },
    "content_categorization": {
      "discussion_type": "Complaint",
      "key_themes": ["billing issues", "customer support"],
      "toxicity_level": { "level": "Low", "indicators": "No offensive language detected" }
    },
    "contextual_insights": {
      "community_engagement_quality": "Constructive",
      "potential_issues_flagged": ["service disruption"],
      "trending_concerns": ["response time", "resolution process"]
    },
    "actionable_intelligence": {
      "moderator_attention_needed": { "required": "Yes", "reason": "Customer complaint requiring company response" },
      "response_urgency": "High",
      "suggested_follow_up_actions": [
        "Escalate to customer service team",
        "Monitor for similar complaints"
      ]
    }
  }
]
```
Workflow Nodes Explanation Data Collection Nodes Webhook: Receives API requests with subreddit and analysis parameters Get many posts: Fetches recent posts from the specified subreddit Split Out: Processes individual posts for analysis Get many comments in a post: Retrieves comments for each post Processing Nodes Loop Over Items: Manages batch processing of multiple posts Sentiment Analyzer: Primary AI analysis node that processes content Structured Data Generator: Formats AI output into structured JSON Code: Parses and cleans the AI response Respond to Webhook: Returns the final analysis results Customization Options Adjusting Analysis Depth Modify the limit parameters to analyze more or fewer posts/comments Update the AI prompts to focus on specific aspects (e.g., product mentions, competitor analysis) Adding Data Storage Connect database nodes to store analysis results for historical tracking Add email notifications for high-priority findings Integrating with Other Tools Connect to Slack/Discord for real-time alerts Link to Google Sheets for easy data visualization Integrate with CRM systems for customer feedback tracking Tips for Best Results Choose Relevant Subreddits: Focus on communities where your target audience is active Monitor Regularly: Set up scheduled executions to track sentiment trends over time Customize Prompts: Tailor the AI prompts to your specific industry or use case Respect Rate Limits:
Reddit API has rate limits, so avoid excessive requests Review AI Output: Periodically check the AI analysis accuracy and adjust prompts as needed Troubleshooting Common Issues "Reddit API Authentication Failed" Verify your Reddit app credentials are correct Ensure your redirect URI matches your n8n instance Check that your Reddit app is set as "web app" type "OpenRouter API Error" Confirm your API key is valid and has sufficient credits Check that the selected model is available Verify your account has access to the chosen model "Webhook Not Responding" Ensure the workflow is activated Check that the webhook URL is correct Verify the request payload format matches the expected structure "AI Analysis Returns Errors" Review the prompt formatting in the Structured Data Generator Check if the selected AI model supports the required features Ensure the input data is not empty or malformed Performance Considerations Rate Limits: Reddit allows 60 requests per minute for OAuth applications AI Costs: Monitor your OpenRouter usage to manage costs Processing Time: Larger batches will take longer to process Memory Usage: Consider n8n instance resources when processing large datasets Contributing This workflow can be extended and improved. Consider adding: Support for multiple subreddits in a single request Historical sentiment tracking and trend analysis Integration with visualization tools Custom classification models for industry-specific analysis --- Ready to start analyzing Reddit sentiment? Import this workflow and start gaining valuable insights into online community discussions!
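For reference, the "Code: Parses and cleans the AI response" step described above can be as simple as the sketch below. It assumes the model's reply arrives in a text (or output) field and may be wrapped in markdown code fences; adjust the field name to whatever your Structured Data Generator actually emits.
```javascript
// n8n Code node sketch: clean and parse the model output before responding to the webhook.
const results = [];
for (const item of $input.all()) {
  let raw = item.json.text ?? item.json.output ?? '';

  // Strip markdown code fences that some models wrap around their answer.
  raw = raw.replace(/`{3}json/gi, '').replace(/`{3}/g, '').trim();

  try {
    results.push({ json: JSON.parse(raw) });
  } catch (e) {
    // Surface malformed responses instead of failing the whole batch.
    results.push({ json: { error: 'Could not parse AI response', raw } });
  }
}
return results;
```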
Transform YouTube videos into LinkedIn posts with SearchAPI & OpenAI
YouTube to LinkedIn Poster - n8n Workflow Template
Turn any YouTube video into a high-performing LinkedIn post, automatically. This AI-powered n8n workflow takes a YouTube video ID, fetches the transcript using SearchAPI.io, and transforms it into a professional, engaging LinkedIn post using OpenAI (via OpenRouter). Customize the writing style, automate your content repurposing, and scale your thought leadership.
---
What It Does
- Accepts a YouTube video ID + preferred writing profile
- Retrieves transcript via SearchAPI.io
- Uses LLM (OpenRouter / GPT-compatible) to generate a polished LinkedIn post
- Supports writing style customization (e.g., educational, inspirational, storytelling)
- Handles fallback if no transcript is found
---
What's Included
- Webhook-based trigger (compatible with any frontend)
- YouTube transcript fetcher using SearchAPI.io
- Conditional logic to handle errors
- OpenAI content generation node with injected personality prompt
- Clean text response via webhook
---
Requirements
- n8n (self-hosted or cloud)
- API key for SearchAPI.io
- OpenRouter API key (free or paid)
- A frontend form (e.g. WordPress + fetch(), Fillout, Postman, etc.)
---
Installation Guide
Import the Workflow: Go to your n8n dashboard. Click Import and upload the JSON file. Configure SearchAPI: Sign up at SearchAPI.io. Add your API key inside the HTTP Request node labeled Get YouTube Transcript. Set Up OpenRouter: Go to Credentials → Add a new OpenRouter API credential. Paste your API key from OpenRouter.ai. Test with Postman or UI: Send a POST to the webhook URL with JSON:
```json
{ "video_id": "T1nX2yDeSzM", "llm_profile": "educational tone" }
```
---
Customizing
- Change llm_profile to match different tones (e.g., "inspirational", "founder voice", "storyteller")
- Integrate output directly into LinkedIn via a social media scheduler API
- Edit the prompt in the OpenAI node for different content types (Twitter threads, blog intros)
- Add rate limiting or credit logic using WordPress + myCred or n8n queue control
---
Use Cases
- Content repurposing agency automating short-form content from videos
- Personal brand managers scaling 1 → many posts from long-form video
- Micro-SaaS founders turning webinars, tutorials, and walkthroughs into professional posts
- YouTube creators expanding audience reach on LinkedIn
---
How I Used It in My MicroSaaS
I used this exact workflow as the backend for a lead magnet SaaS tool that converts YouTube videos into LinkedIn posts. With a simple UI and webhook, users paste a video link, choose a tone, and instantly receive a high-quality post they can copy and share. It increased lead generation and engagement while costing nothing in backend dev. Check it out here: Youtube -> LinkedIn Post The best part? I only used: n8n + Webhook, SearchAPI.io, OpenRouter API, and a WordPress front-end with credit gating.
---
Questions?
DM me on Twitter or reach out via email for setup help or white-label licensing: https://www.linkedin.com/in/gerald-akhidenor-1ab1a45/ or denorgerald@gmail.com
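To show how a frontend can call this workflow (the "Test with Postman or UI" step), here is a minimal fetch() sketch of the kind a WordPress page might use. The webhook path is a placeholder; video_id and llm_profile match the sample payload above.
```javascript
// Minimal front-end sketch of the fetch() call a WordPress page (or any form) might make.
// The webhook path is a placeholder — copy the real production URL from your Webhook node.
async function generatePost(videoId, profile) {
  const res = await fetch('https://YOUR_N8N_URL/webhook/youtube-to-linkedin', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ video_id: videoId, llm_profile: profile }),
  });
  if (!res.ok) throw new Error(`Workflow call failed: ${res.status}`);
  return res.text(); // the workflow replies with the finished LinkedIn post as plain text
}

// generatePost('T1nX2yDeSzM', 'educational tone').then(console.log);
```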
Cost-free email follow-up sequence with Google Sheets and Gmail
Simple But Powerful Email Automation for Everyone: No AI, No API Costs! This n8n workflow template, "Email Outreach Automation," is designed to help you set up an automated email outreach system using tools you might already be familiar with: Google Sheets and Google Docs. At its core, this workflow is a sequence of steps that n8n follows to send personalized emails to a list of contacts. The best part? It's built to be incredibly cost-effective, meaning you won't need to pay for any external API services or AI subscriptions to make it work. This makes it an ideal starting point for anyone new to automation or looking to manage their budget. How Does This Workflow Work? Think of this n8n workflow as a digital assistant that handles your email outreach from start to finish. Here's a simplified breakdown of what happens behind the scenes: Your Contact List (Google Sheet): The workflow starts by reading your contact information (like names, emails, and company details) from a Google Sheet. This is where you'll keep all the people you want to email. Email Content (Google Docs): Instead of writing each email individually, you'll create email templates in Google Docs. The workflow can then pull the content from these documents and personalize it for each recipient. This means you can use placeholders like {{firstname}} or {{company}} in your Google Doc, and the workflow will automatically replace them with the correct information for each person. Smart Email Sequencing: The workflow is designed to send a series of emails to each contact. It keeps track of which email was sent last to each person in your Google Sheet. This ensures that your contacts receive the emails in the correct order and don't get the same email twice. Sending Emails (Gmail or SMTP): Once the workflow has the contact's details and the personalized email content, it uses your Gmail account (or any other SMTP service you configure) to send the email. It even includes a built-in delay to avoid hitting email sending limits. Updating Your Records: After sending an email, the workflow updates your Google Sheet to record which email was sent and when. This helps you keep track of your outreach efforts and ensures that contacts who have completed the email sequence are not emailed again. Key Features for Beginners: No Coding Required: While n8n involves visual programming, you don't need to write complex code to use this workflow. It's pre-built and ready for you to configure with your Google Sheet and Google Doc IDs. Free to Use (No Hidden Costs): This is a major advantage! Unlike many other email automation tools that charge per email or require expensive API keys, this workflow leverages free services. You can run your campaigns without worrying about unexpected bills. Easy Configuration: The workflow includes a "Settings" node where you can easily input your Google Sheet and Google Doc IDs, as well as your email subjects. This centralizes all your important settings in one place. Built-in Safeguards: Features like the "Filter passes only unprocessed contacts" and "Has contact completed email sequence?" nodes ensure that you only email the right people at the right time, preventing accidental re-sends. Scalable: As you get more comfortable, you can easily expand this workflow to handle more contacts or more complex email sequences. It's a great foundation for learning more about automation. 
What You'll Need to Get Started: An n8n instance (you can run it locally, use a cloud provider, or n8n.cloud) A Google account (for Google Sheets and Google Docs) A Gmail account (or SMTP credentials for sending emails) This workflow is your entry point into efficient and affordable email automation. It's designed to be straightforward for beginners while offering powerful capabilities. Get ready to automate your outreach and see the results! Setup Guide: Sample Google Doc Sample Google Sheet Need Help? Reach me: https://www.linkedin.com/in/gerald-akhidenor-1ab1a45/ Work with me: https://dominixai.com/ My website: https://jobmonkey.dev My email: denorgerald@gmail.com This guide will walk you through the process of setting up and running the Email Outreach Automation. This powerful workflow allows you to automate personalized email campaigns using n8n, Google Sheets, and Google Docs, all without incurring any API or AI subscription costs. Follow these steps to get your automated outreach up and running. Prerequisites Before you begin, ensure you have the following: An n8n Instance: This can be a local installation, a self-hosted server, or an n8n.cloud account. If you don't have one, you can find installation instructions on the official n8n website [1]. A Google Account: You will need this to create and manage your Google Sheets (for contacts) and Google Docs (for email templates). A Gmail Account (or SMTP Credentials): This workflow uses Gmail by default for sending emails. If you prefer to use another email service, you will need its SMTP credentials. Step 1: Import the Workflow into n8n Download the Workflow: Ensure you have the EmailOutreachAutomation.json file downloaded to your computer. Open n8n: Log in to your n8n instance. Import: In the n8n dashboard, click on the "Workflows" tab in the left sidebar. Then, click the "New" button (or the + icon) and select "Import from JSON". Upload the File: A dialog box will appear. Click "Browse" or drag and drop the EmailOutreachAutomation.json file into the designated area. Click "Import". Save the Workflow: Once imported, the workflow will open in the editor. Click the "Save" button (usually located in the top right corner) to save your new workflow. Step 2: Prepare Your Google Sheet for Contacts This workflow uses a Google Sheet as your contact database. You need to create a new Google Sheet and populate it with your contact information. The workflow expects specific column headers to function correctly. Create a New Google Sheet: Go to Google Sheets (sheets.google.com) and create a new blank spreadsheet. Name Your Sheet: Give your spreadsheet a descriptive name (e.g., "JobMonkey Email List"). Add Required Columns: In the first row of your sheet, add the following column headers exactly as they appear below (case-sensitive): firstname lastname email company lastemailsent (This column will be updated by the workflow to track the last email sent to each contact. Leave it blank initially.) lastemaildate (This column will be updated by the workflow with the date the last email was sent. Leave it blank initially.) processed (This column will be updated by the workflow to mark contacts as processed. Leave it blank initially.) Populate with Data: Fill in your contact data under the respective columns. Ensure the email column contains valid email addresses. Get Your Google Sheet ID: The Google Sheet ID is part of the URL when you open your spreadsheet. It's the long string of characters between /d/ and /edit. 
Copy this ID; you will need it in Step 4. Step 3: Prepare Your Google Docs Email Templates This workflow sends a sequence of up to 9 emails. You will create a separate Google Doc for each email in your sequence. The workflow uses placeholders in these documents to personalize the emails. Create New Google Docs: Go to Google Docs (docs.google.com) and create a new blank document for each email you plan to send (e.g., "Email 1 Template", "Email 2 Template", etc.). The workflow is configured for up to 9 emails, but you can use fewer. Write Your Email Content: Write the content for each email in its respective Google Doc. You can use the following variables (placeholders) which the workflow will automatically replace with data from your Google Sheet: {{firstname}} {{lastname}} {{company}} {{email}} Get Your Google Doc IDs: For each Google Doc, copy its ID from the URL. It's the long string of characters between /d/ and /edit. You will need these IDs in Step 4. Step 4: Configure the 'Settings' Node in n8n This is the most crucial step for customizing the workflow. The 'Settings' node holds all the essential IDs and subjects for your campaign. Open the Workflow: In n8n, open the Email Outreach JobMonkey copy workflow you imported. Locate the 'Settings' Node: Find the node named "Settings" (it's usually near the top, connected to the 'Schedule Trigger'). Double-click it to open its configuration. Update 'googlesheetid': Find the assignment named googlesheetid. Replace the existing value with the Google Sheet ID you copied in Step 2. Update 'googledocidX' and 'emailsubjectX': For each email in your sequence (e.g., googledocid1, googledocid2, etc.), replace the value with the corresponding Google Doc ID you copied in Step 3. Similarly, for each emailsubject_X, update the value with the subject line you want for that specific email. Close the 'Settings' Node: Click "Done" or close the node configuration. Step 5: Configure Email Sending Credentials (Gmail or SMTP) The workflow uses a 'Send a message' node (Gmail node) to send emails. You need to provide your Gmail credentials or configure an SMTP service. Option A: Using Gmail (Recommended for Simplicity) Locate the 'Send a message' Node: Find the node named "Send a message" (it's a Gmail node) in the workflow. Double-click it. Add Gmail Account: Under the 'Credentials' section, click "Add new" next to gmailOAuth2. Connect to Google: A new window will open, prompting you to connect your Google account. Follow the on-screen instructions to grant n8n access to send emails on your behalf. Select the Gmail account you wish to use for sending. Save Credentials: Once connected, save the credentials. Close the Node: Click "Done" or close the node configuration. Option B: Using an SMTP Service (Advanced) If you prefer to use a different email service via SMTP: Replace the 'Send a message' Node: Delete the existing "Send a message" (Gmail) node. Add a New SMTP Node: Search for and add an "SMTP" node to your workflow. Connect it in place of the deleted Gmail node. Configure SMTP Credentials: Double-click the new SMTP node. You will need to provide your SMTP host, port, username, and password. Consult your email provider's documentation for these details. 
Map Fields: Ensure the To, Subject, and Message fields in the SMTP node are correctly mapped to the data coming from the previous nodes, e.g., ={{ $('Loop Over Items').item.json.email }} for the recipient email, ={{ $('Settings').item.json['emailsubject_' + $('Determine Email Number').item.json.emailNumber] }} for the subject, and ={{ $json.html }} for the message body.
Close the Node: Click "Done" or close the node configuration.

Step 6: Configure the Schedule Trigger (Optional but Recommended)
The 'Schedule Trigger' node determines when your workflow runs automatically. By default, it's set to run at 6 AM, 12 PM, and 6 PM.
Locate the 'Schedule Trigger' Node: Find the node named "Schedule Trigger" (the very first node in the workflow) and double-click it.
Adjust Schedule: Modify the schedule to fit your needs. For example, to run daily at 9 AM, set the hour to 9.
Close the Node: Click "Done" or close the node configuration.

Step 7: Activate and Test Your Workflow
Once all configurations are complete, it's time to activate and test your workflow.
Activate the Workflow: In the top right corner of the n8n editor, toggle the "Active" switch to ON.
Test Manually (Optional but Recommended): To perform a test run without waiting for the schedule, click the "Execute Workflow" button (usually a play icon) in the top right corner. This runs the workflow once.
Monitor Execution: After execution, check the 'Executions' tab in n8n to see the status of your workflow runs. You can also check your Google Sheet to confirm the lastemailsent and processed columns were updated, and check your email outbox to confirm emails were sent.

Important Considerations
Email Sending Limits: Be aware of your email provider's sending limits (e.g., Gmail has a daily sending limit). The workflow includes a 'Wait' node to help space out emails; adjust its amount parameter if you run into issues.
Google API Quotas: While this workflow avoids paid API subscriptions, Google services do have free usage quotas. For very high volumes you might eventually hit these, but for typical outreach this should not be an issue.
Error Handling: The workflow has basic error handling. If you encounter issues, check the 'Executions' tab in n8n for error messages, which can help you diagnose problems.
Customization: Feel free to explore and customize other nodes as you become more familiar with n8n. For example, you can adjust the 'Limits the number of emails per run' node to control how many emails are sent in each execution.

By following these steps, you will have a fully functional, automated email outreach system that is both powerful and cost-effective. Happy automating!
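To make the personalization from Step 3 concrete: all the workflow has to do is swap each {{placeholder}} for the matching Google Sheet column. The sketch below shows one way an n8n Code node could perform that substitution; it is an illustration under assumed field names (templateText for the fetched Doc body), not the template's actual code, though the html output field matches the expression referenced in Step 5.

```javascript
// Illustrative sketch, not the template's actual code: replace {{placeholders}}
// in the fetched Google Docs body with values from the current contact row.
// Assumes an upstream node supplies `templateText` plus the contact columns
// (firstname, lastname, company, email) on each item.
return $input.all().map(item => {
  const contact = item.json;
  let body = contact.templateText || '';

  // Each supported placeholder maps to a Google Sheet column of the same name.
  for (const field of ['firstname', 'lastname', 'company', 'email']) {
    body = body.replaceAll(`{{${field}}}`, contact[field] ?? '');
  }

  // `html` is the field the sending node can read via ={{ $json.html }}.
  return { json: { ...contact, html: body } };
});
```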
Calorie tracker & meal logger with Telegram, Gemini AI and data tables
This can be your alternative to the Cal AI app.

Overview: A comprehensive n8n workflow demonstrating advanced AI agent orchestration, stateful conversation management, and multi-modal input processing for nutrition tracking applications.

Technical Architecture: This workflow showcases production-ready patterns for building complex conversational AI systems in n8n.

Multi-Agent System Design
Router Agent: Analyzes user intent and routes to specialized sub-agents
Registration Agent: Handles new user onboarding with guided data collection
Meal Logging Agent: Processes text, image, and voice inputs for nutrition analysis
Update Agent: Manages multi-turn conversations for meal corrections
Report Agent: Generates daily nutrition summaries
Profile Agent: Handles user preference updates

Stateful Conversation Management
The workflow implements a state machine using n8n Data Tables. It tracks conversation context across messages, enables multi-step workflows (e.g., "get meal ID → ask for new description → update"), prevents context loss in long conversations, and auto-resets state after task completion.

Multi-Modal Input Processing
Text: Direct AI analysis using Google Gemini
Images: Gemini Vision API with specialized nutrition analysis prompts
Voice: Audio-to-text transcription followed by analysis
Unified output format regardless of input type

Advanced Prompt Engineering
Specialized system prompts for each agent role, structured output formatting for reliable data extraction, regional cuisine knowledge (African/West African foods), and error handling and clarification requests built into the prompts.

Data Persistence Layer
Uses n8n Data Tables for user profiles (calories, protein targets, preferences), meal logs (descriptions, macros, timestamps), and conversation state (current step, context variables).

Telegram Integration
MarkdownV2 formatting with auto-chunking for long messages, handling of all Telegram message types (text, photo, voice), and "typing" indicators for better UX.

Learning Outcomes: By studying this workflow, you'll understand how to build agent hierarchies with specialized roles, state management patterns for complex conversations, multi-modal AI input processing techniques, production data persistence strategies in n8n, and scalable webhook-based bot architecture.

Technical Requirements: n8n version 1.0+ (uses the Data Tables feature), Google Gemini API access, a Telegram Bot Token, and a basic understanding of AI agents, webhook triggers, and data tables.

Difficulty Level: Intermediate to Advanced

Use Cases Beyond Nutrition: The patterns here apply to any domain requiring multi-step user onboarding, data collection through conversation, record updates via natural language, report generation, or profile management.

Components: 50+ nodes demonstrating n8n AI features, 6 specialized AI agents, 3 data tables with structured schemas, custom JavaScript for MIME type handling and markdown formatting, and integration with Google Gemini (text + vision + audio models).

What Makes This Workflow Unique: Most n8n AI examples show simple Q&A bots. This demonstrates enterprise-grade conversation management with real data persistence, making it suitable for actual production deployment.

SETUP GUIDE
Need Help? Reach me: https://www.linkedin.com/in/gerald-akhidenor-1ab1a45/ Work with me: https://dominixai.com/ My website: https://jobmonkey.dev My email: denorgerald@gmail.com

Cal AI Nutrition Bot - Complete Setup Guide
Who This Guide Is For: This workflow is designed for beginner to intermediate no-code builders who want to launch a nutrition tracking bot.
You should be comfortable creating accounts on platforms (n8n, Google Cloud, Telegram), following step-by-step instructions, copy-pasting API keys, and importing basic JSON. No programming experience is required; if you can use Zapier or Make.com, you can set this up.

Prerequisites (30 minutes to gather)
n8n Account. Option A (Easiest): Sign up for n8n Cloud at n8n.io (14-day free trial, then $20/month). Option B (Free): Self-host n8n on DigitalOcean, Railway, or your own server.
Google Gemini API Key. Go to ai.google.dev, click "Get API Key", create a new project (free tier: 15 requests/minute), and copy your API key.
Telegram Bot. Open Telegram and search for @BotFather, send the /newbot command, follow the prompts to create your bot, and copy the bot token (it looks like 1234567890:ABCdefGHIjklMNOpqrsTUVwxyz).

Step 1: Import the Workflow (5 minutes)
Log into your n8n instance, click "New Workflow" in the top left, click the three-dot menu → Import from File, and upload the My_workflow.json file you downloaded. The workflow will appear with all nodes connected.

Step 2: Create Data Tables (10 minutes)
The workflow uses three data tables to store information. Create them exactly as shown:
Table 1: Cal AI Profiles. Navigate to the n8n Data Tables section, click Create Data Table, name it Cal AI Profiles, and add the columns: User_ID (Text), Name (Text), Email (Text), Calories_target (Number), Protein_target (Number), Country (Text).
Table 2: Cal AI Meals. Create a new table named Cal AI Meals with the columns: Meal_ID (Text), User_ID (Text), Date (Text), Meal_description (Text), Calories (Number), Proteins (Number), Carbs (Text), Fats (Text).
Table 3: Cal AI StateManagement. Create a new table named Cal AI StateManagement with the columns: User_ID (Text), State (Text), Last_Interaction (DateTime), Context_MealID (Text).

Step 3: Configure Credentials (15 minutes)
A. Add Google Gemini Credential: In any Google Gemini node, click "Create New Credential", paste your Gemini API key, and click Save. This credential will auto-apply to all Gemini nodes.
B. Add Telegram Credential: Open any Telegram node, click "Create New Credential", paste your bot token, and click Save.
C. Link Data Tables: Open the "Is User Registered?" node, click the Data Table dropdown, and select "Cal AI Profiles". Repeat for all nodes with data table connections (n8n will show which ones need updating).

Step 4: Activate Telegram Webhook (5 minutes)
Open the "Telegram Trigger" node and click "Execute Node" to register the webhook; you'll see "Webhook registered successfully". Click "Listen for Test Event" to verify, then send a message to your bot on Telegram. You should see the message appear in n8n.

Step 5: Test Each Function (20 minutes)
Test 1: User Registration. Send any message to your bot; it should ask for your name. Follow the prompts to complete registration, then check the "Cal AI Profiles" data table to confirm your data was saved.
Test 2: Meal Logging (Text). Send: "I had 2 eggs and toast". The bot should respond with a nutrition breakdown.
Test 3: Meal Logging (Image). Take a photo of food and send it to the bot; it should analyze and log the meal.
Test 4: Meal Logging (Voice). Record a voice message ("I had chicken and rice") and send it to your bot; it should transcribe and analyze it.
Test 5: Update Meal. Send: "update meal" and follow the prompts to provide a meal ID and a new description.
Test 6: Get Report. Send: "show my report"; the bot should ask for a date and generate a summary.

Step 6: Customize (Optional)
Change Bot Personality: Edit the systemMessage in each Agent node to adjust tone, add emojis, or change the coaching style.
Add Your Branding: Update bot responses to mention your app name and customize the welcome messages.
Modify Cuisine Focus: Update the nutrition analysis prompts to focus on your target cuisine (current default: African/West African/Nigerian foods).
Adjust Nutrition Calculations: Edit the Gemini Vision prompt in the "Analyze image" node to modify portion size assumptions or macro ratios.

Troubleshooting
Bot doesn't respond: Check that the webhook is activated in the Telegram Trigger node, verify the Telegram credential is correct, and check the execution logs for errors.
"Data Table not found" errors: Ensure all three tables are created with the exact names and relink the tables in each node.
Gemini API errors: Verify the API key is active, check that you haven't exceeded free tier limits (15/min), and try regenerating the API key.
Meals not saving: Check that the "Append Meal Data" node is connected and verify User_ID matches between tables.

Going Live
Once testing is complete: click the "Active" toggle in the top right, share your bot link (t.me/YourBotUsername), and monitor the Executions tab for any errors.

Target Users & Support
This workflow is perfect for fitness coaches wanting client nutrition tracking without monthly SaaS fees, health tech entrepreneurs building an MVP before hiring developers, no-code builders learning advanced n8n AI patterns, bootcamp graduates needing portfolio projects with real-world complexity, and small app studios prototyping before committing to custom development.
You'll succeed with this if you can follow detailed instructions, are comfortable with web-based tools, want to learn by doing (the workflow itself is educational), and need a production-ready solution quickly.
You might struggle if you've never used n8n before (consider taking their free course first), you're not comfortable with API concepts (keys, webhooks), or you need 100% hand-holding (this assumes basic technical literacy).
Support: Reach me: https://www.linkedin.com/in/gerald-akhidenor-1ab1a45/ Review the Google Gemini documentation for AI customization and the Telegram Bot API docs for advanced features.

Next Steps After Setup
Invite Beta Users - Start with friends/clients to gather feedback.
Monitor Usage - Watch which features get used most.
Customize Prompts - Refine AI responses based on real conversations.
Add Features - Use this as a foundation for recipe suggestions, meal planning, etc.
Scale Infrastructure - Move to dedicated hosting as your user base grows.

Estimated total setup time: 1-2 hours for complete beginners, 30 minutes for experienced n8n users.
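For readers who want to understand the state machine described in the overview before customizing it, the routing logic reduces to something like the Code-node sketch below. The state names and field names here are illustrative assumptions; in the actual workflow the current state row is fetched from the "Cal AI StateManagement" data table by a dedicated node before any routing happens.

```javascript
// Illustrative sketch of the conversation state machine, not the shipped code.
// Assumes a previous node has looked up the user's row in the state table and
// placed `state` plus the incoming Telegram text on each item.
return $input.all().map(item => {
  const { state = 'IDLE', messageText = '' } = item.json;
  let next;
  switch (state) {
    case 'AWAITING_MEAL_ID':
      // First turn of the update flow: the reply should contain a meal ID.
      next = { route: 'ask_new_description', contextMealId: messageText.trim() };
      break;
    case 'AWAITING_NEW_DESCRIPTION':
      // Second turn: apply the correction, then clear the stored state.
      next = { route: 'update_meal', newDescription: messageText, resetState: true };
      break;
    default:
      // No multi-turn flow in progress: let the router agent classify intent.
      next = { route: 'router_agent' };
  }
  return { json: { ...item.json, ...next } };
});
```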
Generate & explain Excel/Google Sheets formulas with DeepSeek AI
Description
This AI-powered n8n workflow helps users generate or explain complex Excel, Google Sheets, R, and math formulas from a simple text prompt. Whether you're a beginner trying to understand VLOOKUP or an advanced user automating formula creation, this tool handles it with intelligent, clean, and easy-to-read responses. The workflow acts as your personal formula assistant, powered by AI through OpenRouter (LangChain integration), and responds instantly via webhook when a user asks a question.

Use cases are many:
Students & learners using spreadsheets for assignments.
Finance, sales, HR, and marketing pros who regularly create or audit complex formulas.
Consultants & freelancers offering spreadsheet automation services.
No-code builders embedding this into their own AI assistants.
Micro-SaaS founders building AI bots or chatbot features around Excel/Sheets/R.
Educators wanting to build interactive formula explainers for courses.

How I used it to build a microSaaS
Check it out here: AI Formula Bot. I connected this n8n workflow to a Fillout form on my micro-site and added Stripe + myCred credits on WordPress. Users submit formula-related queries, and the bot responds in seconds. It's monetized with credit-based limits, a free tier, and upsells for pro plans. This setup allowed me to launch a niche AI tool in days, without building a full backend. It's scalable, automatable, and integrates with email, WhatsApp, or even voice AI for escalations. This workflow was the core engine behind it all.

Good to know
You can deploy this as a microSaaS with a front-end UI (like Fillout, Tally, or a custom form) connected to this webhook. It's model-agnostic: swap in your preferred OpenRouter-compatible model. Responses are formatted for readability, with clean breakdowns and examples. The system intelligently detects whether the user wants to generate or explain a formula using NLP logic inside the workflow. Designed with modular nodes for easy customization and expansion.

How it works
User input hits the Webhook via POST (e.g., from a form). A logic node classifies the intent: generate or explain. Depending on intent, it routes the query to one of two AI agent nodes with tailored system prompts. The AI returns a natural language explanation or a code-block formula. A custom Code node sanitizes the output. The formatted result is sent back via the webhook response.

How to use
Import the workflow JSON into n8n. Set up your OpenRouter credentials (create them in the Credentials tab). Deploy the webhook URL in a frontend form (e.g., Fillout, Webflow form, etc.). Prompt the bot by asking, for example: "Explain how =IF(A1>100,"High","Low") works." or "Write a formula to calculate compound interest." You'll receive an AI response with either a generated formula or an explanation with bullet points and examples.

Requirements
n8n (self-hosted or Cloud). OpenRouter API key (free or paid tier). A front-end form tool (optional but recommended: Fillout, Tally, or any frontend that can POST).

Customizing this workflow
Swap the AI model for your preferred one on OpenRouter (e.g., Mixtral, Claude, GPT-3.5). Edit the system prompts inside the agent nodes to control tone or response style. Extend the logic detection to classify more intents such as "optimize" or "debug" formulas. Chain outputs into Google Sheets for logging or CRM tracking. Add logging and lead capture for SaaS-style metrics. Use AI content moderation before responses if deploying for public use.

Other use cases
Build an AI spreadsheet tutor bot for your online course.
Offer Formula-as-a-Service for agencies or clients. Integrate with Slack or WhatsApp via n8n for real-time formula help. Add to internal knowledge bots for teams in operations, finance, or HR.
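For anyone extending the intent detection mentioned under "Customizing this workflow", the classification step can be approximated with a small Code node like the sketch below. This is an illustrative stand-in under assumed field names (the webhook body arriving as body.query), not the exact logic shipped in the template.

```javascript
// Rough sketch of generate-vs-explain intent detection, not the template's
// exact NLP logic. Assumes the webhook receives a JSON body like { query: "..." }.
return $input.all().map(item => {
  const query = (item.json.body?.query || '').toLowerCase();

  // Queries that reference an existing formula, or ask how/what something does,
  // are treated as explanation requests; everything else as generation requests.
  const wantsExplanation =
    query.includes('explain') ||
    query.includes('what does') ||
    query.includes('how does') ||
    /=\s*[a-z]+\(/.test(query); // the query already contains a formula like =if(...)

  return { json: { ...item.json, intent: wantsExplanation ? 'explain' : 'generate' } };
});
```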
Transform YouTube transcripts to content assets with AI and Google Sheets
Transform YouTube Transcripts to Content Assets with AI, Youtube-transcript.io and Google Sheets For content creators, marketers, and video strategists who want to transform YouTube competitors' video transcripts into production-ready content assets using AI-powered automation. Overview This template demonstrates two approaches to YouTube transcript extraction and AI content generation: automated Google Sheets monitoring for batch processing, and direct webhook API calls for on-demand extraction. The workflow includes intelligent parsing, metadata extraction, and AI-powered content transformation. Key Features Dual Workflow Architecture Google Sheets integration for automated monitoring and batch processing Webhook endpoint for real-time API access and instant responses Both paths share core extraction logic while serving different use cases Smart YouTube Parsing Extracts video IDs from multiple URL formats (youtube.com, youtu.be, embed) Regex-based parsing handles edge cases reliably Graceful error handling for invalid URLs Rich Metadata Extraction Video title, channel name, and publish date Duration and category information Transcript text with full timestamp data Fallback handling for videos without transcripts AI Content Generation Transforms raw transcripts into structured content assets Generates Veo3-optimized scripts (8-second scene format) Creates SEO-optimized titles, thumbnail concepts, tags, descriptions Includes virality optimization suggestions (hooks, CTAs) Uses OpenRouter for flexible LLM selection Structured Output Parsing Custom code nodes parse AI responses into consistent formats Organized by content type (Script, Titles, Thumbnails, Tags, etc.) Ready for direct insertion into Google Sheets or API responses Technical Implementation Nodes Used Google Sheets (Trigger & Operations) Webhook (Trigger & Response) HTTP Request (YouTube Transcript API) Code (JavaScript for parsing) LangChain LLM Chain (Content generation) OpenRouter Chat Model External Dependencies YouTube Transcript API (youtube-transcript.io or similar) Google Sheets API credentials OpenRouter API access Use Cases Content Repurposing: Transform successful videos into new content formats Competitive Analysis: Batch analyze competitor video strategies Research & Documentation: Build searchable transcript databases Accessibility: Generate captions and text alternatives SEO Analysis: Extract and analyze video content at scale Content Strategy: Develop data-driven video concepts Setup Requirements YouTube Transcript API credentials Google Sheets with specified column structure (for automated workflow) OpenRouter API key for AI content generation n8n instance (cloud or self-hosted) Customization Options The workflow is designed for easy extension: Swap AI providers by changing the LangChain model node Modify prompt engineering in the Chain LLM nodes Add additional parsing logic in Code nodes Connect to databases, CMS, or other storage solutions Integrate text analysis, summarization, or translation steps Add notification systems for workflow completion Need Help? LinkedIn: Gerald Akhidenor Work with me: dominixai.com My websites: jobmonkey.dev and mediacraftai.com Email: denorgerald@gmail.com
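As an illustration of the "Smart YouTube Parsing" feature described above, video-ID extraction for the common URL shapes (watch, youtu.be, embed) can be handled by a single regex in a Code node. The sketch below is representative rather than the exact expression used in the template, and the videoUrl field name is an assumption.

```javascript
// Representative sketch of YouTube video-ID extraction covering the URL
// formats mentioned above (watch, youtu.be, embed). Not the template's exact code.
return $input.all().map(item => {
  const url = item.json.videoUrl || ''; // field name is an assumption

  // Capture the 11-character video ID after watch?v=, youtu.be/ or /embed/.
  const match = url.match(
    /(?:youtube\.com\/(?:watch\?(?:.*&)?v=|embed\/)|youtu\.be\/)([A-Za-z0-9_-]{11})/
  );

  return {
    json: {
      ...item.json,
      videoId: match ? match[1] : null, // null signals an invalid or unsupported URL
    },
  };
});
```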
Generate & edit images with Gemini AI: storage & email delivery pipeline
AI Image Generator with Nano Banana: Automated Creation and Delivery For content creators, agencies, and SaaS developers who need automated AI image generation and editing with professional delivery workflows. Overview This template demonstrates intelligent workflow routing based on user input type, leveraging Google's Gemini 2.5 Flash image model for both generative and editing tasks. The workflow includes complete file handling, cloud storage integration, and professional email notification with shareable download links. Architecture & Key Features Intelligent Input Detection Conditional logic automatically detects presence of uploaded image files Routes requests to appropriate processing branch (text-only vs. image+text) Single webhook endpoint serves both use cases without manual intervention Dual Processing Modes Text-to-Image Generation (Prompt Only): User submits natural language description and the system generates new images from scratch using Gemini 2.5 Flash Image-to-Image Editing (Prompt + Image): User uploads reference image with text instructions for AI-powered editing while maintaining image context Complete File Management Pipeline Extracts and validates uploaded image data Converts AI-generated base64 responses to binary files Uploads to organized Google Drive folder structure Generates public sharing links with reader permissions Implements proper file naming conventions with timestamps Professional Email Delivery System Responsive HTML email template with gradient styling Dynamic content injection (user names, download links) 7-day expiration warnings for urgency Built-in social sharing CTAs (Twitter, LinkedIn, WhatsApp) Referral program messaging for viral growth Mobile-optimized design Production-Ready Features API key authentication on webhook endpoint Proper error handling throughout pipeline Structured response format for webhook consumers Success confirmations before file operations Technical Implementation Nodes Used Webhook (with file upload support) If (conditional routing) Extract from File (base64 conversion) Code (image formatting for AI) HTTP Request (OpenRouter API calls) Edit Fields (data transformation) Convert to File (binary operations) Google Drive (upload & share operations) Email Send (HTML delivery) Respond to Webhook External Dependencies OpenRouter API (Gemini 2.5 Flash image model) Google Drive API credentials SMTP server or email service Workflow Logic Flow Input Reception: Webhook receives form data with optional image file Route Detection: Conditional checks for presence of binary image data Processing Branch A (With Image): Extract base64 from uploaded file, format as data URL for AI model, send prompt + image to Gemini Processing Branch B (Text Only): Send text prompt directly to Gemini and generate image from description Universal Pipeline (Both Routes): Parse AI response, extract base64 image, convert to binary file format, upload to Google Drive with dynamic naming, generate shareable public link, send formatted email with download link, return success response to webhook caller Use Cases SaaS Image Tools: Backend API for image generation/editing services Content Automation: Automated image creation for marketing campaigns Client Delivery Systems: White-label image processing for agencies E-commerce: Product image variations and mockups Social Media Tools: Automated visual content generation Design Services: Rapid prototyping and concept visualization Customization Options AI Model Flexibility Swap Gemini for other vision-capable models 
Adjust model parameters for different output styles Implement multiple model options with user selection Storage Alternatives Replace Google Drive with S3, Dropbox, or other cloud storage Implement local storage for self-hosted deployments Add CDN integration for faster delivery Email Template Fully customizable HTML design Add brand colors and logos Modify social sharing options Customize referral program messaging Business Logic Add usage tracking and analytics Implement credit/subscription systems Add watermarking for free tier users Queue management for high-volume processing Security Considerations Webhook protected with API key authentication File type validation on uploads Proper permissions on Google Drive shares Email validation before sending Setup Requirements OpenRouter account with Gemini 2.5 Flash access Google Cloud project with Drive API enabled Email sending credentials (SMTP or service) n8n instance with webhook support Public domain for webhook endpoint (production use) Need Help? LinkedIn: Gerald Akhidenor Work with me: dominixai.com My websites: jobmonkey.dev and mediacraftai.com Email: denorgerald@gmail.com
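To make the "format as data URL for AI model" step in Processing Branch A concrete, here is a hedged sketch of what that Code node could look like. It assumes the preceding Extract from File node has written the upload as a base64 string into a data field with its MIME type alongside; these field names are assumptions, and the node shipped in the template may differ.

```javascript
// Hedged sketch of the "format as data URL" Code node in the image-editing branch.
// Assumes the preceding Extract from File node produced a base64 string in `data`
// and its MIME type in `mimeType` (field names are assumptions).
return $input.all().map(item => {
  const { data, mimeType = 'image/png' } = item.json;

  // Vision-capable models accept inline images as data URLs:
  // "data:<mime>;base64,<payload>".
  const imageDataUrl = data ? `data:${mimeType};base64,${data}` : null;

  return { json: { ...item.json, imageDataUrl } };
});
```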