
Jira MCP server

By Tamer

What it does

This n8n workflow creates an AI-powered Jira management system that lets you use Claude or other AI assistants to create, update, and manage Jira tickets through natural-language requests. The workflow exposes key Jira functions as AI tools, enabling you to interact with your Jira instance through conversational commands.

How it works

The workflow sets up an MCP (Model Context Protocol) server, allowing compatible AI assistants to use a suite of Jira management tools. The AI assistant can perform various Jira operations, including:

  • Creating new tickets with customized fields
  • Adding comments to existing tickets
  • Retrieving available status transitions for tickets
  • Attaching files to tickets
  • Changing ticket statuses
  • Retrieving detailed information about tickets
  • Getting available projects and issue types

When you make a request to your AI assistant, it determines which Jira operation to perform and executes it through this workflow.
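
For orientation, this is roughly what the MCP server advertises to a connected client in the protocol's `tools/list` response. The exact tool names and input schemas depend on how the Jira tool nodes are configured in your copy of the workflow, so treat the entries below as illustrative assumptions rather than the template's literal output:

```jsonc
{
  "jsonrpc": "2.0",
  "id": 2,
  "result": {
    "tools": [
      {
        // tool name is an assumption; it mirrors the "create ticket" operation above
        "name": "create_jira_issue",
        "description": "Create a new Jira issue with project, issue type, summary and description",
        "inputSchema": {
          "type": "object",
          "properties": {
            "projectKey": { "type": "string" },
            "issueType": { "type": "string" },
            "summary": { "type": "string" },
            "description": { "type": "string" }
          },
          "required": ["projectKey", "issueType", "summary"]
        }
      },
      {
        // second illustrative entry; the real workflow exposes one tool per Jira operation listed above
        "name": "add_jira_comment",
        "description": "Add a comment to an existing Jira issue by key",
        "inputSchema": {
          "type": "object",
          "properties": {
            "issueKey": { "type": "string" },
            "comment": { "type": "string" }
          },
          "required": ["issueKey", "comment"]
        }
      }
    ]
  }
}
```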

Setup Steps

Prerequisites:

  • Active Jira Cloud account with admin access
  • n8n instance with the LangChain and MCP nodes installed
  • Claude Desktop App or another compatible AI assistant

Jira Credentials Setup:

  • Configure your Jira Cloud API credentials in n8n (a credential sketch follows this list)
  • Ensure your Jira account has sufficient permissions for all operations
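
The n8n Jira Software Cloud credential is typically built from your Atlassian site URL, your account email, and an API token generated at id.atlassian.com/manage-profile/security/api-tokens. A minimal sketch of the values involved, with placeholder data (exact field labels may differ slightly by n8n version; enter these in the credential dialog, not in the workflow JSON):

```jsonc
{
  // placeholder values only
  "domain": "https://your-company.atlassian.net",  // your Jira Cloud site
  "email": "you@example.com",                      // Atlassian account email
  "apiToken": "<YOUR_ATLASSIAN_API_TOKEN>"         // generated in Atlassian account security settings
}
```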

Workflow Configuration:

  1. Import this template into your n8n instance
  2. Set up the MCP Trigger node with your desired path (currently "test_mcp")
  3. Verify that all Jira tool nodes are correctly connected to the MCP Server node
  4. Activate the workflow

Using Claude as an MCP Client:

  1. Open your Claude Desktop App
  2. Navigate to Settings > Developer Settings
  3. Enable "Connect to local MCP servers"
  4. Add a new connection with the URL path to your n8n MCP server (e.g., "http://localhost:5678/webhook/test_mcp"); a config sketch follows this list
  5. Start a new conversation and ask Claude to perform Jira tasks
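
If your Claude version configures MCP servers through claude_desktop_config.json rather than the settings UI, a common pattern for reaching a remote HTTP/SSE endpoint such as an n8n webhook is the mcp-remote bridge. The sketch below is an assumption to verify against your Claude Desktop version and the current mcp-remote documentation, not an official recipe from this template:

```jsonc
// claude_desktop_config.json (location varies by OS, e.g. ~/Library/Application Support/Claude/ on macOS)
{
  "mcpServers": {
    "n8n-jira": {
      // mcp-remote bridges Claude's local stdio transport to a remote MCP endpoint;
      // package name and flags are assumptions to check against current docs
      "command": "npx",
      "args": ["-y", "mcp-remote", "http://localhost:5678/webhook/test_mcp"]
    }
  }
}
```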

Example Usage with Claude

Once you've set up the connection between Claude and your MCP server, you can use natural language to manage your Jira tickets. Here are some examples:

  • Creating a ticket: "Claude, please create a new Jira ticket in the Web Development project with the Bug issue type. The summary should be 'Homepage loading slowly' and the description should be 'Users are experiencing delays of 5+ seconds when loading the homepage on mobile devices.'" (A sketch of the underlying API payload follows this list.)
  • Adding a comment: "Claude, please add a comment to Jira ticket WEB-123 saying 'This issue has been reproduced on multiple devices and browsers. Priority should be increased.'"
  • Checking status: "Claude, can you get the details of ticket WEB-123 and tell me its current status?"
  • Changing status: "Claude, please move ticket WEB-123 to 'In Progress' status."
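
Behind the scenes, the n8n Jira node turns the first request above into a Jira Cloud REST call along these lines. This is shown only to clarify what the natural-language request resolves to; the node assembles the payload for you, the "WEB" project key is an assumption, and Jira Cloud API v3 expects the description in Atlassian Document Format (v2 accepts a plain string):

```jsonc
// POST https://your-company.atlassian.net/rest/api/3/issue
{
  "fields": {
    "project": { "key": "WEB" },          // assumed key for the "Web Development" project
    "issuetype": { "name": "Bug" },
    "summary": "Homepage loading slowly",
    "description": {                      // Atlassian Document Format, abbreviated
      "type": "doc",
      "version": 1,
      "content": [
        {
          "type": "paragraph",
          "content": [
            { "type": "text", "text": "Users are experiencing delays of 5+ seconds when loading the homepage on mobile devices." }
          ]
        }
      ]
    }
  }
}
```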

This workflow creates a seamless bridge between your AI assistant and Jira, making project management more efficient through natural language interactions.

Jira MCP Server

This n8n workflow acts as a Model Context Protocol (MCP) server trigger, designed to initiate AI-driven processes. While the directory name suggests a connection to Jira, the current workflow JSON solely defines an MCP server trigger without any explicit Jira integration or subsequent actions.

What it does

  1. Listens for MCP Requests: The workflow starts by listening for incoming requests to an MCP server. This trigger is designed to receive context and initiate AI model interactions.
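
MCP clients speak JSON-RPC 2.0 to the trigger's endpoint. As a rough illustration of what "listening for MCP requests" means in practice, a client invoking an exposed tool sends a message shaped like the sketch below; the tool name and arguments are hypothetical, since the bare trigger defines no tools until you attach nodes to it:

```jsonc
{
  "jsonrpc": "2.0",
  "id": 7,
  "method": "tools/call",
  "params": {
    "name": "create_jira_issue",     // hypothetical tool name; none exist on the bare trigger
    "arguments": {
      "projectKey": "WEB",
      "issueType": "Bug",
      "summary": "Homepage loading slowly"
    }
  }
}
```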

Prerequisites/Requirements

  • n8n Instance: An active n8n instance to host and run the workflow.
  • Model Context Protocol (MCP): A working understanding of MCP and a client capable of sending requests to an MCP server.

Setup/Usage

  1. Import the Workflow: Import the provided JSON into your n8n instance.
  2. Activate the Workflow: Enable the workflow to start listening for incoming MCP requests.
  3. Integrate with MCP Client: Configure your MCP client or AI application to send requests to the webhook URL provided by this n8n workflow's MCP Server Trigger.

Note: As per the current JSON, this workflow only sets up the MCP server trigger. To perform any actions related to Jira or other services, you would need to extend this workflow by adding subsequent nodes after the "MCP Server Trigger" to process the incoming data and interact with other APIs.
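
If you do extend it, the shape you are aiming for is a tool-capable node wired into the trigger's tool input. The heavily abridged sketch below shows what the extended workflow JSON might look like; the node type strings and connection layout are assumptions that may vary by n8n version, so in practice it is easier to connect the nodes in the editor than to hand-edit JSON:

```jsonc
{
  "nodes": [
    {
      "name": "MCP Server Trigger",
      "type": "@n8n/n8n-nodes-langchain.mcpTrigger",  // assumed type string; verify in your n8n version
      "parameters": { "path": "test_mcp" },
      "position": [0, 0]
    },
    {
      "name": "Create Jira Issue",
      "type": "n8n-nodes-base.jiraTool",              // assumed "tool" variant of the Jira node
      "parameters": { "operation": "create" },
      "position": [250, 0]
    }
  ],
  "connections": {
    "Create Jira Issue": {
      // tool nodes attach to the trigger via an "ai_tool" connection, not the usual "main" one
      "ai_tool": [[{ "node": "MCP Server Trigger", "type": "ai_tool", "index": 0 }]]
    }
  }
}
```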

Related Templates

AI-powered code review with linting, red-marked corrections in Google Sheets & Slack

Advanced Code Review Automation (AI + Lint + Slack)

Who's it for
For software engineers, QA teams, and tech leads who want to automate intelligent code reviews with both AI-driven suggestions and rule-based linting, all managed in Google Sheets with instant Slack summaries.

How it works
This workflow performs a two-layer review:
  • Lint Check: Runs a lightweight static analysis to find common issues (e.g., use of var, console.log, unbalanced braces).
  • AI Review: Sends valid code to Gemini AI, which provides human-like review feedback with severity classification (Critical, Major, Minor) and visual highlights (red/orange tags).
  • Formatter: Combines lint and AI results, calculating an overall score (0-10).
  • Aggregator: Summarizes results for quick comparison.
  • Google Sheets Writer: Appends results to your review log.
  • Slack Notification: Posts a concise summary (e.g., number of issues and average score) to your team's channel.

How to set up
  1. Connect Google Sheets and Slack credentials in n8n.
  2. Replace placeholders (<YOUR_SPREADSHEET_ID>, <YOUR_SHEET_GID_OR_NAME>, <YOUR_SLACK_CHANNEL_ID>).
  3. Adjust the AI review prompt or lint rules as needed.
  4. Activate the workflow; reviews will start automatically whenever new code is added to the sheet.

Requirements
  • Google Sheets and Slack integrations enabled
  • A configured AI node (Gemini, OpenAI, or compatible)
  • Proper permissions to write to your target Google Sheet

How to customize
  • Add more linting rules (naming conventions, spacing, forbidden APIs)
  • Extend the AI prompt for project-specific guidelines
  • Customize the Slack message formatting
  • Export analytics to a dashboard (e.g., Notion or Data Studio)

Why it's valuable
This workflow brings realistic, team-oriented AI-assisted code review to n8n, combining the speed of automated linting with the nuance of human-style feedback. It saves time, improves code quality, and keeps your team's review history transparent and centralized.

By higashiyama

Automate Reddit brand monitoring & responses with GPT-4o-mini, Sheets & Slack

How it Works
This workflow automates intelligent Reddit marketing by monitoring brand mentions, analyzing sentiment with AI, and engaging authentically with communities. Every 24 hours, the system searches Reddit for posts containing your configured brand keywords across all subreddits, finding up to 50 of the newest mentions to analyze.

Each discovered post is sent to OpenAI's GPT-4o-mini model for comprehensive analysis. The AI evaluates sentiment (positive/neutral/negative), assigns an engagement score (0-100), determines relevance to your brand, and generates contextual, helpful responses that add genuine value to the conversation. It also classifies the response type (educational/supportive/promotional) and provides reasoning for whether engagement is appropriate.

The workflow intelligently filters posts using a multi-criteria system: only posts that are relevant to your brand, score above 60 in engagement quality, and warrant a response type other than "pass" proceed to engagement. This prevents spam and ensures every interaction is meaningful.

Selected posts are processed one at a time through a loop to respect Reddit's rate limits. For each worthy post, the AI-generated comment is posted, and complete interaction data is logged to Google Sheets, including timestamp, post details, sentiment, engagement scores, and success status. This creates a permanent audit trail and analytics database.

At the end of each run, the workflow aggregates all data into a comprehensive daily summary report with total posts analyzed, comments posted, engagement rate, sentiment breakdown, and the top 5 engagement opportunities ranked by score. This report is automatically sent to Slack with formatted metrics, giving your team instant visibility into your Reddit marketing performance.

Who is this for?
  • Brand managers and marketing teams needing automated social listening and engagement on Reddit
  • Community managers responsible for authentic brand presence across multiple subreddits
  • Startup founders and growth marketers who want to scale Reddit marketing without hiring a team
  • PR and reputation teams monitoring brand sentiment and responding to discussions in real-time
  • Product marketers seeking organic engagement opportunities in product-related communities
  • Any business that wants to build an authentic Reddit presence while avoiding spammy marketing tactics

Setup Steps
Setup time: Approx. 30-40 minutes (credential configuration, keyword setup, Google Sheets creation, Slack integration)

Requirements:
  • Reddit account with OAuth2 application credentials (create at reddit.com/prefs/apps)
  • OpenAI API key with GPT-4o-mini access
  • Google account with a new Google Sheet for tracking interactions
  • Slack workspace with posting permissions to a marketing/monitoring channel
  • Brand keywords and subreddit strategy prepared

  1. Create Reddit OAuth Application: Visit reddit.com/prefs/apps, create a "script" type app, and obtain your client ID and secret
  2. Configure Reddit Credentials in n8n: Add Reddit OAuth2 credentials with your app credentials and authorize access
  3. Set up OpenAI API: Obtain an API key from platform.openai.com and configure it in n8n OpenAI credentials
  4. Create Google Sheet: Set up a new sheet with columns: timestamp, postId, postTitle, subreddit, postUrl, sentiment, engagementScore, responseType, commentPosted, reasoning
  5. Configure these nodes:
     • Brand Keywords Config: Edit the JavaScript code to include your brand name, product names, and relevant industry keywords
     • Search Brand Mentions: Adjust the limit (default 50) and sort preference based on your needs
     • AI Post Analysis: Customize the prompt to match your brand voice and engagement guidelines
     • Filter Engagement-Worthy: Adjust the engagementScore threshold (default 60) based on your quality standards
     • Loop Through Posts: Configure max iterations and batch size for rate limit compliance
     • Log to Google Sheets: Replace YOUR_SHEET_ID with your actual Google Sheets document ID
     • Send Slack Report: Replace YOUR_CHANNEL_ID with your Slack channel ID
  6. Test the workflow: Run manually first to verify all connections work and adjust AI prompts
  7. Activate for daily runs: Once tested, activate the Schedule Trigger to run automatically every 24 hours

Node Descriptions
  • Daily Marketing Check: Schedule trigger runs the workflow automatically every 24 hours
  • Brand Keywords Config: JavaScript Code node defining the brand keywords to monitor
  • Search Brand Mentions: Reddit node searches all subreddits for brand keyword mentions
  • AI Post Analysis: OpenAI analyzes sentiment and relevance and generates contextual, helpful comment responses
  • Filter Engagement-Worthy: Conditional node filters only high-quality, relevant posts worth engaging
  • Loop Through Posts: Split In Batches processes each post individually, respecting rate limits
  • Post Helpful Comment: Reddit node posts the AI-generated comment to worthy Reddit discussions
  • Log to Google Sheets: Appends all interaction data to the spreadsheet for permanent tracking
  • Generate Daily Summary: JavaScript aggregates metrics and sentiment breakdown into a comprehensive daily report
  • Send Slack Report: Posts the formatted daily summary with metrics to the team Slack channel

By Daniel Shashko

Generate Weather-Based Date Itineraries with Google Places, OpenRouter AI, and Slack

🧩 What this template does
This workflow builds a 120-minute local date course around your starting point by querying Google Places for nearby spots, selecting the top candidates, fetching real-time weather data, letting an AI generate a matching emoji, and drafting a friendly itinerary summary with an LLM in both English and Japanese. It then posts the full bilingual plan with a walking route link and weather emoji to Slack.

👥 Who it's for
Makers and teams who want a plug-and-play bilingual local itinerary generator with weather awareness, no custom code required.

⚙️ How it works
  1. Trigger: Manual (or schedule/webhook).
  2. Discovery: Google Places nearby search within a configurable radius.
  3. Selection: Rank by rating and pick the top 3.
  4. Weather: Fetch current weather (via OpenWeatherMap).
  5. Emoji: Use an AI model to match the weather with an emoji 🌤️.
  6. Planning: An LLM writes the itinerary in Markdown (JP + EN).
  7. Route: Compose a Google Maps walking route URL.
  8. Share: Post the bilingual itinerary, route link, and weather emoji to Slack.

🧰 Requirements
  • n8n (Cloud or self-hosted)
  • Google Maps Platform (Places API)
  • OpenWeatherMap API key
  • Slack Bot (chat:write)
  • LLM provider (e.g., OpenRouter or DeepL for translation)

🚀 Setup (quick)
  1. Open Set → Fields: Config and fill in coords/radius/time limit.
  2. Connect credentials for Google, OpenWeatherMap, Slack, and your LLM.
  3. Test the workflow and confirm the bilingual plan + weather emoji appear in Slack.

🛠 Customize
  • Adjust ranking filters (type, min rating).
  • Modify translation settings (target language or tone).
  • Change output layout (side-by-side vs separated).
  • Tune emoji logic or travel mode.
  • Add error handling, retries, or logging for production use.

By noda