Automatic Jest test generation for GitHub PRs with dual AI review

Varritech
2/3/2026

Workflow: Automatic Unit Test Creator from GitHub

πŸ—οΈ Architecture Overview

This workflow listens for GitHub pull-request events, analyzes changed React/TypeScript files, auto-generates Jest tests via AI, has them reviewed by a second AI pass, and posts suggestions back as PR comments:

  1. GitHub Webhook → PR opened or updated
  2. Fetch & Diff → Retrieve the raw diff of changed files
  3. Filter & Split → Isolate .tsx files & their diffs
  4. Fetch File Contents → Provide full file context for tests
  5. Test Maker Agent → Generate Jest tests for the diff hunks
  6. Code Reviewer Agent → Refine tests for style & edge cases
  7. Post PR Comment → Send suggested tests back to GitHub

📦 Node-by-Node Breakdown

```mermaid
flowchart LR
  A["Webhook: /github/pr-events"] --> B["GitHub: Get PR"]
  B --> C["Function: Parse diff_url + owner/repo"]
  C --> D["HTTP Request: GET diff_url"]
  D --> E["Function: Split on 'diff --git'"]
  E --> F["Filter: /\.tsx$/"]
  F --> G["GitHub: Get File Contents"]
  G --> H["Test Maker Agent"]
  H --> I["Code Reviewer Agent"]
  I --> J["Function: Build Comment Payload"]
  J --> K["HTTP Request: POST to PR Comments"]
```

Webhook: GitHub PR Events

  • Type: HTTP Webhook (/webhook/github/pr-events)
  • Subscribed Events: pull_request.opened, pull_request.synchronize

GitHub: Get PR

  • Node: GitHub
  • Action: "Get Pull Request"
  • Inputs: owner, repo, pull_number

Function: Parse diff_url + owner/repo

  • Extracts:
    • diff_url (e.g. …/pulls/123.diff)
    • owner, repo, merge_commit_sha

HTTP Request: GET diff_url

  • Fetches unified-diff text for the PR.

Function: Split on "diff --git"

  • Splits the diff into file-specific segments.

Filter: /\.tsx$/

  • Keeps only segments where the file path ends with .tsx.

GitHub: Get File Contents

  • For each .tsx file, fetches the latest blob via GitHub API.

Test Maker Agent

  • Prompt:
    • "Generate Jest unit tests covering only the behaviors changed in these diff hunks."
  • Output: Raw Jest test code.

Code Reviewer Agent

  • Reads file + generated tests
  • Prompt:
    • "Review and improve these tests for readability, edge-cases, and naming conventions."
  • Output: Polished test suite.

Function: Build Comment Payload

  • Wraps tests in TypeScript fences:
// generated tests…
  • Constructs JSON:
{ "body": "<tests>" }

HTTP Request: POST to PR Comments

  • URL: https://api.github.com/repos/{owner}/{repo}/issues/{pull_number}/comments
  • Body: Contains the suggested tests.

πŸ” Design Rationale & Best Practices

Focused Diff Analysis

  • Targets only .tsx files to cover UI logic.

Two-Stage AI

  • Separate "generate" + "review" steps mimic TDD + code review.

Stateless Functions

  • Pure JS for parsing & transformation, easy to test.

Non-Blocking PR Comments

  • Asynchronous suggestionsβ€”developers aren't blocked.

Scoped Permissions

  • GitHub token limited to reading PRs & posting comments.

Automatic Jest Test Generation for GitHub PRs with Dual AI Review

This n8n workflow automates the generation of Jest tests for GitHub Pull Requests (PRs) and incorporates a dual AI review process. It listens for new PRs, extracts relevant code, uses an AI agent to generate test cases, and then reviews these tests with a second AI model before potentially submitting them back to the PR.

What it does

  1. Triggers on GitHub Events: The workflow is activated by specific events on a configured GitHub repository.
  2. Retrieves PR Details: It fetches information about the triggered Pull Request, including its content and changes.
  3. Extracts Code for Testing: A custom Code node processes the PR content to isolate the code that needs Jest tests.
  4. Generates Jest Tests (AI Agent 1): An AI Agent, powered by an OpenAI Chat Model and a Structured Output Parser, takes the extracted code and generates a set of Jest test cases (a sketch of a possible parser schema follows this list).
  5. Reviews Generated Tests (AI Agent 2): The generated tests are then passed to a second AI Agent (also using an OpenAI Chat Model and Structured Output Parser) for an independent review, ensuring quality and correctness.
  6. Merges AI Outputs: The outputs from both AI agents are merged for further processing.
  7. Performs HTTP Request (Placeholder): A placeholder HTTP Request node suggests a potential next step, such as posting the generated and reviewed tests back to the GitHub PR as a comment or a new file.
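
The actual schema lives in the workflow's Structured Output Parser nodes; the shape below is purely an assumption about what the parser might enforce, with invented field names:

```js
// Hypothetical JSON Schema for the Structured Output Parser; the real
// workflow may use different fields.
const testOutputSchema = {
  type: "object",
  properties: {
    fileName: { type: "string" },   // e.g. "SubmitButton.test.tsx"
    testCode: { type: "string" },   // the full Jest test suite
    coveredBehaviors: {             // behaviors the tests assert
      type: "array",
      items: { type: "string" },
    },
  },
  required: ["fileName", "testCode"],
};
```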

Prerequisites/Requirements

  • GitHub Account: Configured with an n8n GitHub credential to listen for PR events and interact with repositories.
  • OpenAI API Key: Required for the OpenAI Chat Models used by both AI Agents. This should be configured as an n8n credential.
  • n8n Instance: A running n8n instance to host and execute the workflow.

Setup/Usage

  1. Import the Workflow: Import the provided JSON into your n8n instance.
  2. Configure GitHub Trigger:
    • Select the "GitHub" node.
    • Choose your GitHub credential or create a new one.
    • Configure the repository and events you want to monitor (e.g., pull_request events like opened, synchronize).
  3. Configure OpenAI Credentials:
    • Select the "OpenAI Chat Model" nodes within both "AI Agent" nodes.
    • Choose your OpenAI API Key credential or create a new one.
  4. Review Code Nodes: The "Code" node contains custom JavaScript logic to extract code from the PR. Review and adjust this logic if your repository structure or PR content requires different parsing.
  5. Review AI Agent Prompts: The "AI Agent" nodes will have specific prompts for generating and reviewing tests. Customize these prompts to align with your desired test quality, style, and specific Jest requirements.
  6. Configure Final Action: The "HTTP Request" node is currently a placeholder. Configure it to perform your desired action, such as:
    • Posting a comment to the GitHub PR with the generated tests.
    • Creating a new file in the PR branch with the tests.
    • Sending notifications to a communication channel (e.g., Slack, Microsoft Teams).
  7. Activate the Workflow: Once configured, activate the workflow to start listening for GitHub PR events.
