
Automated GLPI ticket deadline alerts via Microsoft Teams

Luis Hernandez
1251 views
2/3/2026

Overview

This n8n workflow provides an automated notification system that monitors tickets in GLPI (Gestionnaire Libre de Parc Informatique) and sends proactive alerts through Microsoft Teams when tickets are approaching their expiration dates.

Key Features

🕘 Automated Scheduling

  • Daily execution scheduled at 9:00 AM
  • Continuous monitoring without manual intervention
  • Customizable scheduling configuration

🎯 Intelligent Deadline Detection

  • Automatic identification of tickets expiring within the next 2 days
  • Configurable date-based filtering criteria
  • Efficient processing of multiple simultaneous tickets
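The 2-day window logic above can be sketched as a small function of the kind used in n8n Code nodes. This is a hedged sketch, not the template's actual code: the `time_to_resolve` field name and the `"YYYY-MM-DD HH:MM:SS"` date format are assumptions based on GLPI's ticket schema.

```javascript
// Sketch of the deadline filter. Field names such as `time_to_resolve`
// are assumptions about the GLPI ticket payload, not the workflow's exact schema.
function daysUntil(dueString, now) {
  // GLPI-style timestamps look like "YYYY-MM-DD HH:MM:SS";
  // swap the space for "T" so Date can parse them reliably.
  const due = new Date(dueString.replace(' ', 'T'));
  return (due - now) / 86400000; // 86 400 000 ms per day
}

// Keep tickets that are due within the alert window (default: 2 days).
function ticketsNearingDeadline(tickets, now, windowDays = 2) {
  return tickets.filter((t) => {
    const d = daysUntil(t.time_to_resolve, now);
    return d >= 0 && d <= windowDays;
  });
}
```

Overdue tickets (negative remaining time) are excluded here; widen the lower bound if you also want alerts for already-expired tickets.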

👥 Targeted Notifications

  • Personalized alerts sent to specific technicians via Microsoft Teams
  • Automatic assignment based on ticket assignee
  • Structured messages with key ticket information

🔧 Complete GLPI Integration

  • Secure connection through GLPI REST API
  • Authentication with application tokens
  • Automatic session management (initiation and closure)

Technical Functionalities

Data Processing

  • Extraction: Automatic queries to GLPI database
  • Filtering: Ticket separation by assigned technician
  • Transformation: Data formatting for readable notifications

Conditional Flow

  • Automatic evaluation of responsible technician
  • Intelligent notification routing
  • Handling of cases without specific assignment

Session Management

  • Automatic session initiation with GLPI
  • Secure session token maintenance
  • Controlled session closure upon completion
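The session lifecycle above maps onto the GLPI REST API's `initSession` and `killSession` endpoints. The sketch below assumes Node 18+ (global `fetch`); the base URL and tokens are placeholders, and error handling is omitted for brevity.

```javascript
// Minimal GLPI session lifecycle. Endpoint paths follow the GLPI REST API
// (/apirest.php/initSession, /apirest.php/killSession); tokens and the
// base URL are placeholders you must supply.
function sessionHeaders(appToken, sessionToken) {
  const headers = { 'App-Token': appToken, 'Content-Type': 'application/json' };
  if (sessionToken) headers['Session-Token'] = sessionToken;
  return headers;
}

async function withGlpiSession(baseUrl, appToken, userToken, fn) {
  // 1. initSession: exchange the user token for a session token.
  const init = await fetch(`${baseUrl}/apirest.php/initSession`, {
    headers: { ...sessionHeaders(appToken), Authorization: `user_token ${userToken}` },
  });
  const { session_token } = await init.json();
  try {
    // 2. Run the actual queries (e.g., search Tickets) with the session token.
    return await fn(sessionHeaders(appToken, session_token));
  } finally {
    // 3. killSession: always close the session, even if the queries failed.
    await fetch(`${baseUrl}/apirest.php/killSession`, {
      headers: sessionHeaders(appToken, session_token),
    });
  }
}
```

Wrapping the queries in `try/finally` is what gives you the "controlled session closure upon completion" the workflow promises.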

Ticket Information Included

Each alert contains:

  • Ticket Title: Clear problem description
  • Ticket ID: Unique identifier for tracking
  • Time Remaining: Days/hours until expiration
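Assembling those three fields into a notification string might look like the following; the field names (`id`, `title`, `daysLeft`) are illustrative, not the workflow's exact schema.

```javascript
// Turn one ticket into a readable alert line for Teams.
// Field names here are illustrative placeholders.
function formatAlert(ticket) {
  return [
    `⚠️ Ticket #${ticket.id}: ${ticket.title}`,
    `Time remaining: ${ticket.daysLeft} day(s)`,
  ].join('\n');
}
```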

System Requirements

Infrastructure

  • GLPI server with REST API enabled
  • Running n8n instance
  • Microsoft Teams account with API permissions

Required Credentials

  • GLPI user with application administrator privileges
  • Valid GLPI application token
  • OAuth2 credentials for Microsoft Teams

GLPI User ID Identification

To complete the workflow configuration, you need the IDs of the technician users so that notifications are routed to the right people.

User IDs are visible in GLPI's user management.

Path: Administration > Users > [Select User]

When you open the desired user, the ID appears directly in the browser URL (e.g., id=7 for Support Technician 1, id=8 for Support Technician 2).
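If you prefer to script the lookup, the ID can be read straight from that URL. A tiny sketch, assuming the standard GLPI user-form URL shape:

```javascript
// Extract the `id` query parameter from a GLPI user-form URL.
function userIdFromUrl(href) {
  return new URL(href).searchParams.get('id');
}
```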

Configuration

  • Environment Variables:

    ```json
    {
      "glpi_url": "https://your-glpi-server.com",
      "app_token": "your-application-token-here"
    }
    ```

Available Customization

  • Alert Period: Modifiable from 2 days to any desired range
  • Execution Schedule: Configurable according to operational needs
  • Recipients: Adaptable to specific team structure

Operational Benefits

For Support Teams

  • Reduction of expired tickets
  • Improved response times
  • Proactive workload management

For Organizations

  • Higher SLA compliance
  • Increased customer satisfaction
  • Optimized technical support resources

Ideal Use Cases

  • IT Service Centers: Incident and request management
  • Technical Support Teams: Critical case tracking
  • Organizations with Strict SLAs: Service agreement compliance
  • IT Departments: Internal ticket monitoring

Scalability

This workflow is designed to:

  • Handle high ticket volumes
  • Adapt to teams of different sizes
  • Integrate with multiple communication channels
  • Expand with additional functionalities

Installation and Deployment

  • Import the JSON workflow into n8n
  • Configure GLPI and Microsoft Teams credentials
  • Update configuration variables
  • Activate the scheduled trigger
  • Perform functionality tests

This workflow represents a robust and scalable solution for proactive ticket management in enterprise environments, significantly improving operational efficiency and service commitment compliance.

GLPI Ticket Deadline Alerts via Microsoft Teams

This n8n workflow automates the process of fetching GLPI tickets, identifying those with upcoming deadlines, and sending alert messages to a designated Microsoft Teams channel. It helps ensure that critical GLPI tickets are not missed and are addressed in a timely manner.

What it does

  1. Schedules Execution: The workflow runs on a predefined schedule (e.g., daily, hourly) to check for new or updated tickets.
  2. Fetches GLPI Tickets: It makes an HTTP request to the GLPI API to retrieve a list of tickets.
  3. Filters for Relevant Tickets: It processes the fetched tickets and likely applies a filter to identify tickets that are nearing their deadline or require attention. (Specific filtering logic is not detailed in the provided JSON but is implied by the "If" node).
  4. Prepares Alert Messages: For each relevant ticket, it formats the necessary information into a structured message suitable for Microsoft Teams.
  5. Sends Microsoft Teams Alerts: It sends these formatted alert messages to a specified Microsoft Teams channel, notifying the relevant team or individuals about the impending deadlines.
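The last two steps can be combined into a single payload builder. The sketch below assumes you post through a Teams incoming webhook (which accepts a plain `{"text": ...}` JSON body) rather than the n8n Microsoft Teams node, which builds the message for you.

```javascript
// Build a simple incoming-webhook payload for Microsoft Teams.
// This `{"text": ...}` shape applies to plain incoming webhooks;
// the n8n Teams node handles formatting itself.
function teamsPayload(tickets) {
  const lines = tickets.map((t) => `• #${t.id} ${t.title} (due ${t.due})`);
  return { text: `Tickets nearing their deadline:\n${lines.join('\n')}` };
}
```

POSTing that object as JSON to the webhook URL produces one consolidated alert per run instead of one message per ticket.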

Prerequisites/Requirements

  • n8n Instance: A running n8n instance.
  • GLPI Instance: Access to a GLPI instance with API enabled.
  • GLPI API Token/Credentials: Credentials (e.g., API token, username/password) to authenticate with the GLPI API.
  • Microsoft Teams Account: A Microsoft Teams account with a channel configured to receive messages.
  • Microsoft Teams Webhook/Credentials: A Microsoft Teams webhook URL or n8n Microsoft Teams credentials configured to post messages to the desired channel.

Setup/Usage

  1. Import the Workflow:
    • Copy the provided JSON workflow.
    • In your n8n instance, click "New" in the workflows section.
    • Click the three dots next to the workflow name and select "Import from JSON".
    • Paste the JSON and click "Import".
  2. Configure Credentials:
    • HTTP Request (GLPI): Configure the "HTTP Request" node (Node ID: 19) with your GLPI API endpoint and authentication details. This typically involves setting the URL, headers (e.g., App-Token, Session-Token, Authorization), and potentially query parameters for filtering tickets.
    • Microsoft Teams: Configure the "Microsoft Teams" node (Node ID: 368) with your Microsoft Teams credentials. This usually involves setting up an OAuth2 or Webhook credential that points to your desired Teams channel.
  3. Customize Logic (If Node):
    • Review the "If" node (Node ID: 20) to understand and adjust the conditions for filtering GLPI tickets. This is where you'll define what constitutes an "upcoming deadline" or a "relevant ticket" based on your GLPI data structure (e.g., due_date, status).
  4. Customize Message (Edit Fields Node):
    • Adjust the "Edit Fields" (Set) node (Node ID: 38) to format the data from GLPI tickets into the desired message structure for Microsoft Teams. You can include ticket ID, title, due date, assigned user, etc.
  5. Set Schedule:
    • Configure the "Schedule Trigger" node (Node ID: 839) to define how often the workflow should run (e.g., every day at 9 AM, every hour).
  6. Activate the Workflow:
    • Once configured, activate the workflow by toggling the "Active" switch in the top right corner of the n8n editor.

The workflow will now automatically run according to your schedule, fetch GLPI tickets, identify urgent ones, and send alerts to Microsoft Teams.
