Deploy code to GitHub with natural language via Slack & Claude 3.5
GitHub Deployer Agent
Overview
The Github Deployer Agent is an intelligent automation tool that integrates with Slack to streamline code deployment workflows. Powered by Anthropic's Claude 3.5 and Tavily for web search, it enables seamless, context-aware file pushes to a GitHub repository with minimal user input.
Capabilities
- Accepts natural language via Slack
- Automatically pushes code to a default GitHub repository
- Uses Claude 3.5 for code generation and decision-making
- Leverages Tavily for real-time web search to enhance context
- Supports folder structure hints to ensure clean and organized repositories
Required Connections
To operate correctly, the following integrations must be in place:
- Slack API Token with permission to read messages and post responses
- GitHub Personal Access Token with repo write permissions
- Tavily API Key for external search functionality
- Claude 3.5 API Access via Anthropic
Detailed configuration instructions are provided in the workflow.
Example Input
From Slack, you can send messages like:
"Generate a basic README.md for my Python project and store it in the root directory."
Customising This Workflow
You can tailor the workflow by:
- Modifying default folder paths or repository settings
- Integrating a Jira node to use issue keys as default folder names
- Adding a Slack file upload option
Deploy Code to GitHub with Natural Language via Slack (Claude 3.5)
This n8n workflow empowers you to deploy code to GitHub using natural language commands sent through Slack. By leveraging the power of an AI Agent (specifically, an Anthropic Claude 3.5 model) and an HTTP Request tool, you can instruct the workflow to create or update files in a GitHub repository simply by typing your request in a Slack channel.
What it does
This workflow automates the process of interacting with GitHub based on Slack messages:
- Listens for Slack Messages: It triggers when a new message is posted in a configured Slack channel.
- Prepares Message Content: It extracts and cleans the relevant message text from the Slack event.
- Processes with AI Agent: It feeds the cleaned Slack message to an AI Agent (Anthropic Chat Model) configured with a GitHub HTTP Request tool. The AI Agent interprets the natural language request and determines the necessary GitHub API calls (e.g., to create or update a file).
- Executes GitHub Actions: The AI Agent, through the HTTP Request tool, makes the actual API calls to GitHub to perform the requested action (e.g., creating a new file with specified content in a given repository); a sketch of such a call follows this list.
- Confirms in Slack: Confirmation is implicit; there is no explicit Slack "reply" node, so the AI Agent's response would typically be returned to the caller or logged.
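For orientation, here is a minimal sketch of the kind of call the agent formulates in the "Executes GitHub Actions" step. It is not part of the workflow itself: the `putFile` helper and the `GITHUB_TOKEN` environment variable are assumptions for illustration. GitHub's Contents API (`PUT /repos/{owner}/{repo}/contents/{path}`) expects the file body base64 encoded and, when updating an existing file, the file's current `sha`.

```typescript
// Sketch of the GitHub Contents API call the agent assembles from a Slack
// request. Hypothetical helper; assumes a PAT in the GITHUB_TOKEN env var.
async function putFile(
  owner: string,
  repo: string,
  path: string,
  content: string,
  message: string,
  branch = "main",
  sha?: string, // required only when updating an existing file
): Promise<void> {
  const res = await fetch(
    `https://api.github.com/repos/${owner}/${repo}/contents/${path}`,
    {
      method: "PUT",
      headers: {
        Authorization: `Bearer ${process.env.GITHUB_TOKEN}`,
        Accept: "application/vnd.github+json",
      },
      body: JSON.stringify({
        message, // commit message
        content: Buffer.from(content, "utf8").toString("base64"), // API requires base64
        branch,
        ...(sha ? { sha } : {}),
      }),
    },
  );
  if (!res.ok) throw new Error(`GitHub API error: ${res.status}`);
}
```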
Prerequisites/Requirements
To use this workflow, you will need:
- n8n Instance: A running n8n instance.
- Slack Account: A Slack workspace and a bot user configured to listen for messages in a specific channel.
- Slack Credentials: An n8n credential for your Slack bot (OAuth scopes: `channels:read`, `chat:write`, `commands`, `im:read`, `groups:read`, `mpim:read`).
- Anthropic API Key: An API key for the Anthropic Claude 3.5 model.
- GitHub Account: A GitHub account with a repository where you want to deploy code.
- GitHub Personal Access Token: A GitHub Personal Access Token with appropriate permissions (e.g., the `repo` scope for creating/updating files).
- Langchain Integration: Ensure the `@n8n/n8n-nodes-langchain` package is installed and enabled in your n8n instance.
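Before wiring the token into n8n, it can help to sanity-check it directly against the GitHub API. A minimal check, assuming the token is exported as `GITHUB_TOKEN` (the `X-OAuth-Scopes` response header is only populated for classic tokens):

```typescript
// Quick sanity check of the GitHub PAT (assumes GITHUB_TOKEN is exported).
const res = await fetch("https://api.github.com/user", {
  headers: { Authorization: `Bearer ${process.env.GITHUB_TOKEN}` },
});
// 200 means the token authenticates; classic PATs also report their scopes.
console.log(res.status, res.headers.get("x-oauth-scopes"));
```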
Setup/Usage
- Import the Workflow: Import the provided JSON into your n8n instance.
- Configure Slack Trigger:
- Select your Slack credential.
- Choose the "Message" trigger event.
- Specify the Slack channel where you will send your natural language commands.
- Configure Anthropic Chat Model:
- Select your Anthropic API credential.
- Ensure the model is set to a Claude 3.5 model (e.g., `claude-3-5-sonnet-20240620`).
- Configure HTTP Request Tool:
- This node acts as a tool for the AI Agent. You will need to configure it to interact with the GitHub API.
- Set up a credential for GitHub (e.g., using a Personal Access Token).
- The AI Agent will dynamically generate the HTTP requests, but the base configuration (like authentication) should be set here.
- Crucially: The AI Agent needs to be instructed on how to use this tool. This is typically done via the "System Message" or "Instructions" in the AI Agent node, detailing the tool's capabilities (e.g., "You have a tool called 'HTTP Request Tool' that can make API calls. Use it to interact with GitHub to create or update files. The base URL for GitHub API file operations is `https://api.github.com/repos/{owner}/{repo}/contents/{path}`.").
- Configure AI Agent:
- Select the Anthropic Chat Model as the language model.
- Add the "HTTP Request Tool" to the list of available tools.
- Craft a detailed "System Message": This is critical. Explain to the AI Agent its role, the tools it has, and how to use them to interact with GitHub. For example:
- "You are a GitHub deployment assistant. Your task is to create or update files in a GitHub repository based on user requests."
- "You have access to an 'HTTP Request Tool' which can make API calls. Use it to interact with the GitHub API."
- "To create or update a file, you will need the repository owner, repository name, file path, file content (base64 encoded), and a commit message."
- "Example GitHub API endpoint for file content:
PUT /repos/{owner}/{repo}/contents/{path}." - "Ensure the content is base64 encoded before sending."
- Activate the Workflow: Once configured, activate the workflow.
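As a quick illustration of the base64 requirement mentioned in the system message above, this is the round-trip the agent has to perform on file content (plain Node.js `Buffer`; the sample string is just an example):

```typescript
// The Contents API accepts file bodies only as base64, so encode before the PUT.
const raw = "Hello, n8n!";
const encoded = Buffer.from(raw, "utf8").toString("base64"); // "SGVsbG8sIG44biE="
const decoded = Buffer.from(encoded, "base64").toString("utf8");
console.log(encoded, decoded === raw); // SGVsbG8sIG44biE= true
```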
Now, you can send messages to your configured Slack channel like:
- "Create a file named
hello.txtin mymy-reporepository under themainbranch with the content 'Hello, n8n!'" - "Update the
README.mdfile inmy-org/project-xwith a new line 'Deployed via n8n and Claude 3.5' and commit message 'Update README with deployment info'."
The AI Agent will interpret these requests, formulate the correct GitHub API call using the HTTP Request tool, and execute it.
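Under the same assumptions as the earlier `putFile` sketch, the first Slack message above would translate to roughly the following call (the owner name `me` is a placeholder; an update like the second example would first need a GET on the file to obtain its current `sha`):

```typescript
// Rough translation of the first Slack example into a Contents API call.
await putFile("me", "my-repo", "hello.txt", "Hello, n8n!", "Create hello.txt", "main");
```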