AI Linux system administrator for managing a Linux VPS instance
This n8n workflow template is designed for developers, system administrators, and IT professionals who manage Linux VPS environments. It uses an AI chatbot powered by an OpenAI model to interpret chat messages and execute SSH commands on a Linux VPS. The workflow triggers when a chat message is received; the message is processed by the AI SysAdmin ReAct Agent, which executes the corresponding SSH commands securely.

How It Works
- Chat Trigger: The workflow starts when a chat message is received via a supported platform (such as Slack or Telegram).
- AI Processing: The message is passed to the AI SysAdmin ReAct Agent, which uses an embedded OpenAI model to interpret the request and map it to a corresponding SSH action (a hypothetical example appears below).
- Command Execution: The interpreted command is executed securely on the target Linux VPS over SSH, with login credentials managed through the workflow's credential store.

Setup Instructions
1. Import the Workflow: Download and import the workflow into your n8n instance.
2. Configure Chat Integration: Set up the chat trigger node by connecting it to your preferred chat platform and configuring the trigger conditions.
3. Set SSH Credentials: Securely enter your SSH credentials in the designated SSH credentials node.
4. Deploy and Test: Deploy the workflow and run tests to confirm that commands execute correctly and securely on your VPS.

This template streamlines how system administrators interact with their servers, offering a secure and efficient way to handle routine tasks through simple chat commands. With AI handling the interpretation, you can improve operational efficiency, reduce response times, and manage your Linux environments directly from your chat tool.
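For illustration, here is a hypothetical request-to-command mapping. The field names are made up for this example and are not the agent's actual schema:

```json
{
  "chat_message": "How much disk space is left on the server?",
  "interpreted_ssh_command": "df -h",
  "result": "the command's output is sent back to the chat as the agent's reply"
}
```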
Realtime Notion-Todoist 2-way sync with Redis
Purpose
This solution enables you to manage all your Notion and Todoist tasks from different workspaces, as well as your calendar events, in a single place. All tasks can be managed in Todoist, and Fantastical can additionally be used to manage scheduled tasks and events together.

Demo & Explanation
Watch the demo video: https://youtu.be/k66j6ZspjCg

How it works
- The realtime sync consists of two workflows, each triggered by a registered webhook from either Notion or Todoist.
- To avoid overwrites from late-arriving webhook calls, the current task is retrieved fresh from both sides on every trigger.
- Redis is used to prevent endless loops, since an update in one system would otherwise trigger another webhook call. Using the ID of the task, the trigger is locked for 15 seconds (see the Python sketch at the end of this description).
- Depending on the detected changes, the other side is updated accordingly. Notion is generally treated as the main source.
- An "Obsolete" status guarantees that tasks never get deleted entirely by accident.
- The Todoist ID is stored in the Notion task, so the two stay linked.
- An additional full-sync workflow runs daily to fix inconsistencies, if any occurred, since webhooks cannot be trusted entirely.
- Since Todoist requires a more complex setup, a tiny workflow helps with activating the webhook. Another tiny workflow helps generate a global config, which all workflows use for mapping purposes.

Mapping (Notion >> Todoist)
- Name: Task Name
- Priority: Priority (1: do first, 2: urgent, 3: important, 4: unset)
- Due: Date
- Status: Section (Done: completed, Obsolete: deleted)
- <page_link>: Description (read-only)
- Todoist ID: <task_id>

Current limitations
- Changes to the same task cannot be made simultaneously in both systems within a 15-20 second time frame.
- Subtasks are not yet linked automatically to their parent.
- Recurring tasks are not supported yet.
- Task names do not support URLs yet.

Prerequisites
Notion
- A database must already exist (get a basic template here) with the following properties (case matters!):
  - Text: "Name"
  - Status: "Status", containing at least the options "Backlog", "In progress", "Done", "Obsolete"
  - Select: "Priority", containing the options "do first", "urgent", "important"
  - Date: "Due"
  - Checkbox: "Focus"
  - Text: "Todoist ID"
Todoist
- A project must already exist with the same sections as the Status options defined in Notion (except Done and Obsolete).
Redis
- Create a free Redis Cloud instance or self-host.

Setup
Watch the setup video: https://youtu.be/73jhyU0t4c4
The setup involves quite a few steps, although many of them can be automated for internal business purposes. Follow the video or the steps below:
1. Set up credentials for Notion (access token), Todoist (access token), and Redis. You can also create empty credentials and populate them later during setup.
2. Clone this workflow by clicking the "Use workflow" button and then choosing your n8n instance; otherwise you would need to map the credentials of many nodes.
3. Follow the instructions described within the bundle of sticky notes on the top left of the workflow.

How to use
- You can apply changes (create, update, delete) to tasks in both Notion and Todoist, which then get synced over within a couple of seconds (handled by the differential realtime sync).
- The daily full sync resolves possible discrepancies in Todoist and sends a summary via email if anything needed to be updated. If that summary contains an unintended change, you can jump to the task directly from the email to fix it manually.
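The 15-second Redis lock is the piece that breaks webhook echo loops. Below is a minimal Python sketch of the pattern, assuming the redis-py client; the workflow itself implements this with n8n Redis nodes, so the key name and value here are illustrative:

```python
import redis

r = redis.Redis(host="localhost", port=6379, decode_responses=True)

def acquire_sync_lock(task_id: str, ttl_seconds: int = 15) -> bool:
    """Return True if the lock was acquired, False if the task is already locked."""
    # SET key value NX EX <ttl> succeeds only if the key does not exist yet,
    # so the echo webhook fired by our own update finds the lock and is skipped.
    return bool(r.set(f"sync-lock:{task_id}", "1", nx=True, ex=ttl_seconds))

# On each incoming webhook call:
if acquire_sync_lock("task-12345"):
    print("apply the change to the other system")
else:
    print("echo of our own update detected: skip to break the loop")
```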
Load JSON articles from an FTP server into a Qdrant vector database (embedding pipeline)
🧠 This workflow is designed for one purpose only: to bulk-upload structured JSON articles from an FTP server into a Qdrant vector database for use in LLM-powered semantic search, RAG systems, or AI assistants. The JSON files are pre-cleaned and contain metadata and rich text chunks, ready for vectorization.

This workflow handles:
- Downloading from FTP
- Parsing & splitting
- Embedding with OpenAI embeddings (a sketch of this step appears at the end of this description)
- Storing in Qdrant for future querying

JSON structure format for blog articles:

```json
{
  "id": "article_001",
  "title": "reseguider",
  "language": "sv",
  "tags": ["london", "resa", "info"],
  "source": "alltomlondon.se",
  "url": "https://...",
  "embedded_at": "2025-04-08T15:27:00Z",
  "chunks": [
    {
      "chunkid": "article001_01",
      "section_title": "Introduktion",
      "text": "Välkommen till London..."
    },
    ...
  ]
}
```

🧰 Benefits
✅ Automated Vector Loading: handles FTP → JSON → Qdrant in a hands-free pipeline.
✅ Clean Embedding Input: supports pre-validated chunks with metadata: titles, tags, language, and article ID.
✅ AI-Ready Format: perfect for Retrieval-Augmented Generation (RAG), semantic search, or assistant memory.
✅ Flexible Architecture: modular and swappable; FTP can be replaced with GDrive/Notion/S3, and embeddings can switch to local models like Ollama.
✅ Community Friendly: this template helps others adopt best practices for vector DB feeding and LLM integration.
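For orientation, here is a minimal Python sketch of the embed-and-upsert step. It is an assumption-laden illustration, not the template's actual nodes: the collection name, embedding model, and point ID scheme are placeholders, and it assumes the Qdrant collection already exists with a matching vector size (1536 for text-embedding-3-small):

```python
import uuid

from openai import OpenAI
from qdrant_client import QdrantClient
from qdrant_client.models import PointStruct

openai_client = OpenAI()  # reads OPENAI_API_KEY from the environment
qdrant = QdrantClient(url="http://localhost:6333")

def upsert_article(article: dict, collection: str = "articles") -> None:
    texts = [chunk["text"] for chunk in article["chunks"]]
    # One embedding per pre-cleaned chunk.
    response = openai_client.embeddings.create(
        model="text-embedding-3-small", input=texts
    )
    points = [
        PointStruct(
            # Derive a stable UUID from the chunk ID so re-runs overwrite rather than duplicate.
            id=str(uuid.uuid5(uuid.NAMESPACE_URL, chunk["chunkid"])),
            vector=item.embedding,
            payload={
                "article_id": article["id"],
                "title": article["title"],
                "language": article["language"],
                "tags": article["tags"],
                "section_title": chunk["section_title"],
                "text": chunk["text"],
            },
        )
        for chunk, item in zip(article["chunks"], response.data)
    ]
    qdrant.upsert(collection_name=collection, points=points)
```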
Send a Discord message when a certain Onfleet event happens
Summary
Onfleet is a last-mile delivery platform that provides end-to-end route planning, dispatch, communication, and analytics, handling the heavy lifting so you can focus on your customers. This workflow template listens for an Onfleet event and sends a Discord message in response. You can easily connect it to your Discord servers and users.

Configuration
- Update the Onfleet trigger node with your own Onfleet credentials. To register for an Onfleet API key, visit https://onfleet.com/signup to get started.
- You can easily change which Onfleet event to listen for. Learn more about Onfleet webhooks from Onfleet Support.
- Update the Discord node with your Discord server's webhook URL, and add your own expressions to the Text field (see the example below).
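As an example of the last point, here is a hypothetical value for the Text field using n8n expression syntax. The payload paths are placeholders, since the exact fields depend on which Onfleet event you subscribe to:

```
Onfleet event "{{ $json.triggerName }}" fired for task {{ $json.data.task.shortId }}
```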
Fetch scriptures dynamically from the GetBible API
Overview
The GetBible Query Workflow is a modular and self-contained workflow designed to retrieve scriptures dynamically based on structured input. It serves as an intermediary layer that extracts references, queries the GetBible API, and returns scriptures in a standardized JSON format. This workflow is fully prepared for integration: simply call it from another workflow with the required JSON input, and it will return the requested scripture data.

---

Who Is This For?
This workflow is ideal for developers, Bible study apps, research tools, and dynamic scripture-based projects that need seamless access to scriptural content without direct API interaction.

✅ Use Cases:
- Bible Study Apps → Embed scripture retrieval functionality.
- Research & Theology Tools → Fetch structured verse data.
- Dynamic Content Generation → Integrate real-time scripture references.
- Sermon Preparation → Automate scripture lookups.

---

How It Works
1. Trigger Workflow → This workflow is designed to be called from another workflow with a structured JSON input.
2. Receive Input → Accepts a JSON object containing references, translation, and API version.
3. Extract References → Parses single verses, comma-separated lists, and ranged passages (illustrated at the end of this description).
4. Query API → Sends structured requests to the GetBible API.
5. Format Response → Returns structured JSON output, maintaining API response consistency.

---

JSON Input Structure
- References → Should include the book name, chapter, and verse(s).
- Multiple Verses → Separated by commas (e.g., John 3:16,18).
- Verse Ranges → Defined with a dash (e.g., John 3:16-18).
- Translation → Choose from the supported translations.
- API Version → Currently supports v2.

Example JSON Input:

```json
{
  "references": [
    "1 John 3:16",
    "Jn 3:16",
    "James 3:16",
    "Rom 3:16"
  ],
  "translation": "kjv",
  "version": "v2"
}
```

---

Example API Response:

```json
{
  "result": {
    "kjv623": {
      "translation": "King James Version",
      "abbreviation": "kjv",
      "book_name": "1 John",
      "chapter": 3,
      "ref": ["1 John 3:16"],
      "verses": [
        {
          "chapter": 3,
          "verse": 16,
          "name": "1 John 3:16",
          "text": "Hereby perceive we the love of God, because he laid down his life for us: and we ought to lay down our lives for the brethren."
        }
      ]
    }
  }
}
```

💡 Fully structured and formatted response, ready for seamless integration.

---

Integration and Usage
The GetBible Query Workflow is designed for immediate use. Simply call it from another workflow, pass the appropriate JSON object as input, and it will return the requested scripture passages.
✔️ No additional configuration is required.
✔️ Designed for fast, reliable, and structured scripture retrieval.
✔️ Fully compatible with GetBible API responses.

---

Why Use This Workflow?
✔️ Fast & Reliable → Direct API integration for efficient queries.
✔️ Flexible Queries → Supports single, multi-verse, and ranged requests.
✔️ Agent-Compatible → Easily integrates into automated workflows.
✔️ No Code Needed → Just configure the JSON input and run the workflow.

---

Next Steps
🔗 API Support
📖 API Documentation
💬 Need help? Join the community for support! 🚀
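To make the Extract References step concrete, here is an illustrative Python sketch of how a comma list or dash range can be expanded into individual verses; the workflow's actual parsing lives in its nodes and may differ:

```python
import re

def expand_reference(ref: str) -> list[str]:
    """Expand "John 3:16,18" to ["John 3:16", "John 3:18"]
    and "John 3:16-18" to ["John 3:16", "John 3:17", "John 3:18"]."""
    match = re.match(r"^(.*\d+):(.+)$", ref.strip())
    if not match:
        return [ref]  # whole-chapter or unparsable reference: pass through as-is
    prefix, verses = match.groups()
    expanded = []
    for part in verses.split(","):
        if "-" in part:
            start, end = (int(v) for v in part.split("-"))
            expanded.extend(f"{prefix}:{v}" for v in range(start, end + 1))
        else:
            expanded.append(f"{prefix}:{int(part)}")
    return expanded

print(expand_reference("1 John 3:16-18"))  # ['1 John 3:16', '1 John 3:17', '1 John 3:18']
```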
Discover trending topics with Bright Data MCP, GPT analysis & Google Sheets integration
This workflow automatically identifies trending topics and hashtags across social media platforms to keep you informed of current trends and viral content. It saves you time by eliminating the need to manually research trending topics and provides data-driven insights for content strategy and social media planning.

Overview
This workflow automatically scrapes trending-hashtag platforms and social media sites to extract currently trending topics, hashtags, and viral content themes. It uses Bright Data to access trend data sources without restrictions and AI to intelligently analyze trending content and provide actionable insights for content creators and marketers.

Tools Used
- n8n: The automation platform that orchestrates the workflow
- Bright Data: For scraping trend platforms and social media without being blocked
- OpenAI: AI agent for intelligent trend analysis and content insights
- Google Sheets: For storing trending topics data and analysis results

How to Install
1. Import the Workflow: Download the .json file and import it into your n8n instance
2. Configure Bright Data: Add your Bright Data credentials to the MCP Client node
3. Set Up OpenAI: Configure your OpenAI API credentials
4. Configure Google Sheets: Connect your Google Sheets account and set up your trending topics tracking spreadsheet
5. Customize: Define target trend platforms and topics of interest

Use Cases
- Content Marketing: Discover trending topics for timely and relevant content creation
- Social Media Strategy: Plan posts around viral hashtags and trending themes
- Brand Monitoring: Track whether your brand or industry topics are trending
- Influencer Marketing: Identify trending content opportunities for collaborations

Connect with Me
- Website: https://www.nofluff.online
- YouTube: https://www.youtube.com/@YaronBeen/videos
- LinkedIn: https://www.linkedin.com/in/yaronbeen/
- Get Bright Data: https://get.brightdata.com/1tndi4600b25 (Using this link supports my free workflows with a small commission)

Tags: n8n, automation, trendingtopics, hashtags, brightdata, webscraping, contentmarketing, n8nworkflow, workflow, nocode, socialmediatrends, trendanalysis, viralcontent, contentresearch, socialmediamonitoring, trendtracking, contentdiscovery, hashtagresearch, socialmediamarketing, contentautomation, trendmonitoring, socialmediainsights, contentplanning, trendalerts, viralmarketing, socialtrends, contentoptimization, trendingcontent, socialmediadata, contentintelligence
Securely call Google Cloud Run APIs with service account auth (main-workflow)
Who it’s for?
- Anyone who wants a simple, secure way to call a Google Cloud Run endpoint from n8n, without exposing it publicly.
- People who want a cheap/free-tier way to run custom API logic without hosting n8n or spinning up servers. Example: you’ve got scraping code that needs specific system/Python libs; build it into a Dockerfile on Cloud Run, then call it as a secure endpoint from n8n.

How it works
This is a conjunctive workflow: the main workflow calls Service Auth (a sub-workflow) to get a Google ID token, merges that auth with your context, then calls your Cloud Run URL with Authorization: Bearer <id_token>. It works well for single calls and for looping over items. (A standalone Python sketch of the same pattern appears at the end of this description.)

How to set up
General instructions are below; see my detailed guide for more info: Build a Secure Google Cloud Run API, Then Call It from n8n (Free Tier)
1. Create a Cloud Run service and enable Require authentication (Cloud IAM).
2. Create a Google Service Account and grant it Cloud Run Invoker on that service.
3. In n8n, import the workflows and update the Vars node (serviceurl, optional servicepath).
4. Create a JWT (PEM) credential from your service account key, then run.
Make sure to read the sticky notes in the workflows; they contain helpful pointers and optional configurations.

Requirements
- Cloud Run service URL (auth required)
- Google Service Account with Cloud Run Invoker
- Private key JSON fields downloaded from the Service Account (needed to generate the JWT credential)

How to customize
Change the HTTP method, path, or body in Cloud Run Request, or drop the Service Auth sub-workflow into other workflows to reuse the same auth pattern.

More details
Full write-up (minimal + modular flows), screenshots, and more: Build a Secure Google Cloud Run API, Then Call It from n8n (Free Tier), by Marco Cassar
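For comparison outside n8n, the same auth pattern looks like this in Python with the google-auth library. The service URL is a placeholder, and the request method and body are up to your Cloud Run service:

```python
import requests
from google.auth.transport.requests import Request
from google.oauth2 import service_account

SERVICE_URL = "https://my-service-abc123-uc.a.run.app"  # placeholder Cloud Run URL

# Mint a Google ID token with the service URL as its audience.
credentials = service_account.IDTokenCredentials.from_service_account_file(
    "service-account.json", target_audience=SERVICE_URL
)
credentials.refresh(Request())  # signs a JWT and exchanges it for the ID token

# Call the authenticated Cloud Run endpoint with Authorization: Bearer <id_token>.
response = requests.get(
    SERVICE_URL, headers={"Authorization": f"Bearer {credentials.token}"}
)
print(response.status_code, response.text)
```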