
Proxmox AI agent with n8n and generative AI integration

This template automates IT operations on a Proxmox Virtual Environment (VE) using an AI-powered conversational agent built with n8n. By integrating Proxmox APIs and generative AI models (e.g., Google Gemini), the workflow converts natural language commands into API calls, enabling seamless management of your Proxmox nodes, VMs, and clusters.

Buy My Book: Mastering n8n on Amazon | Full Courses & Tutorials: http://lms.syncbricks.com | Watch Video on YouTube

How It Works
- Trigger Mechanism: the workflow can be triggered through multiple channels, such as chat (Telegram, email, or n8n's built-in chat). Interact with the AI agent conversationally.
- AI-Powered Parsing: a connected AI model (Google Gemini, or another compatible model such as OpenAI or Claude) processes your natural language input to determine the required Proxmox API operation.
- API Call Generation: the AI parses the input and generates structured JSON output (an example follows this description), which includes:
  - response_type: the HTTP method (GET, POST, PUT, DELETE)
  - url: the Proxmox API endpoint to execute
  - details: any required payload parameters for the API call
- Proxmox API Execution: the structured output is used to make HTTP requests to the Proxmox VE API. The workflow supports operations such as retrieving cluster or node information; creating, deleting, starting, or stopping VMs; migrating VMs between nodes; and updating or resizing VM configurations.
- Response Formatting: the workflow formats API responses into a user-friendly summary, for example success messages ("VM started successfully") or error messages listing missing parameters.
- Extensibility: you can enhance the workflow with additional triggers, external services, or AI models. It supports Telegram/Slack integration for real-time notifications, backup and restore workflows, and cloud monitoring extensions.

---

Key Features
- Multi-Channel Input: use chat, email, or custom triggers to communicate with the AI agent.
- Low-Code Automation: easily customize the workflow to suit your Proxmox environment.
- Generative AI Integration: supports advanced AI models for precise command interpretation.
- Proxmox API Compatibility: fully adheres to Proxmox API specifications for secure and reliable operations.
- Error Handling: detects and reports missing or invalid parameters in your requests.

---

Example Use Cases
- Create a Virtual Machine. Input: "Create a VM with 4 cores, 8GB RAM, and 50GB disk on psb1." Action: sends a POST request to Proxmox to create the VM with the specified configuration.
- Start a VM. Input: "Start VM 105 on node psb2." Action: executes a POST request to start the specified VM.
- Retrieve Node Details. Input: "Show the memory usage of psb3." Action: sends a GET request and returns the node's resource utilization.
- Migrate a VM. Input: "Migrate VM 202 from psb1 to psb3." Action: executes a POST request to move the VM, with optional online migration.

---

Pre-Requisites
- Proxmox API Configuration: enable the Proxmox API and generate API keys in the Proxmox Data Center. Use the Authorization header with the format PVEAPIToken=<user>@<realm>!<token-id>=<token-value>
- n8n Setup: add Proxmox API credentials in n8n using Header Auth, and connect a generative AI model (e.g., Google Gemini) via the relevant credential type.
- Access the Workflow: import this template into your n8n instance and replace the placeholder credentials with your Proxmox and AI service details.

---

Additional Notes
This template is designed for Proxmox 7.x and above. For advanced features like backups, VM snapshots, and detailed node monitoring, you can extend this workflow. Always test in a non-production Proxmox environment before deploying to live systems.

Start with n8n | Learn n8n with Amjid | Get n8n Book | What is Proxmox
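For illustration, here is a hypothetical example of the structured JSON the AI model is prompted to produce for the input "Start VM 105 on node psb2". The field names follow the description above; the exact endpoint path follows standard Proxmox VE API conventions and is an assumption, not taken from the template itself:

```javascript
// Hypothetical structured output from the AI parsing step for
// "Start VM 105 on node psb2". The workflow's HTTP Request node
// would use these fields to build the actual Proxmox API call.
const aiOutput = {
  response_type: 'POST',                               // HTTP method
  url: '/api2/json/nodes/psb2/qemu/105/status/start',  // assumed Proxmox VE endpoint
  details: {},                                         // no extra payload needed for a simple start
};
```

The HTTP Request node then sends this with an Authorization header of the form PVEAPIToken=<user>@<realm>!<token-id>=<token-value>, as described in the prerequisites.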

By Amjid Ali
72116

Retrieve a Monday.com row and all data in a single node

This workflow is a building block designed to be called from other workflows via an Execute Workflow node. When called from another workflow with a JSON input containing a "pulse" field holding the ID of the monday.com item to pull, this workflow will return:
- The item's name and ID
- All column data, indexable by the column name
- All column data, indexable by the column's ID string
- All board relation columns, with their data and column values
- All subitems, with their data and column values

Prerequisites
- A monday.com account and credential
- A workflow that needs to get detailed data from a monday.com row
- The pulse ID of the monday.com row to retrieve data from

Setup
- Import the workflow.
- Configure all monday nodes with your credentials and save the workflow.
- Copy the workflow ID from its URL.
- In a different workflow, add an Edit Fields node to output the field "pulse" with the monday item you want to retrieve (see the sketch after this description).
- Feed the Edit Fields node with your pulse into an Execute Workflow node, and paste the workflow ID from above into it.

This "pulse" field tells the workflow which pulse to retrieve, and it can be populated by an expression in your workflow. There is an example of the Edit Fields and Execute Workflow nodes in the template.
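As a minimal sketch, the Edit Fields step that feeds the Execute Workflow node only needs to produce a single "pulse" field. Expressed as an equivalent n8n Code node (the item ID below is a made-up placeholder):

```javascript
// Equivalent of the Edit Fields node described above: output a single
// "pulse" field containing the monday.com item ID to retrieve.
// Replace the placeholder ID with your own item's pulse ID, or populate
// it from an expression in your calling workflow.
return [{ json: { pulse: 1234567890 } }];
```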

By Joey D’Anna
3962

Generate complete SEO strategy reports with SerpAPI data and GPT-4 Agent team

Overview
This workflow deploys a fully autonomous "AI SEO Agency" inside your n8n instance. Unlike simple chatbots, this is a hierarchical agent swarm: a "Director" agent acts as the project manager, using live market data to orchestrate a team of six specialized AI workers. It takes a single user prompt and turns it into a comprehensive, professional-grade SEO strategy report.

Key Features
- Hierarchical Architecture: a "Manager" agent delegates tasks to "Worker" agents.
- Live Market Research: uses SerpApi to fetch real-time Google Search results and competitor data.
- Conversational Memory: remembers business details across the chat session.
- 6 Expert Specialists: dedicated agents for Keyword Research, Technical SEO, Link Building, Analytics, Local SEO, and Content Writing.

How it works
1. Analysis: the SEO Director Agent receives your request and consults its memory.
2. Live Research: the Director uses the SerpApi tool to perform live Google searches on the target website and niche to gather context.
3. Delegation: based on the research, the Director dynamically assigns tasks to the relevant Specialist Agents (e.g., calling the "Technical Specialist" for site-speed issues or the "Keyword Specialist" for content ideas).
4. Synthesis: the Director compiles the outputs from all specialists into one cohesive, actionable strategy report.

Set up steps (estimated time: 5 minutes)
1. OpenAI keys: add your OpenAI API key to all 7 "OpenAI Chat Model" nodes (1 for the Director, 6 for the Specialists).
2. SerpApi key: open the "SEO Director Agent" node. Under "Tools" > "SerpApi," add your SerpApi key (a free tier is available). This enables the live Google Search capability.
3. Run: toggle the workflow to "Active" and start chatting!

About the Creator
Built by Pixril. We specialize in building advanced, production-ready AI agents for n8n. Find more professional workflows in our shop: https://pixril.etsy.com

By Pixril
2037

Automatically collect & process Google News articles to Google Sheets

Overview
This workflow automatically collects the latest articles from Google News RSS feeds, cleans and deduplicates them, and stores them neatly in a Google Sheet. It runs on a set schedule (every Monday at 09:00 by default) and helps you build a fresh pool of content ideas for newsletters, blogs, or social media.

---

What you can do with it
🔎 Research faster: pull in fresh articles from multiple RSS sources without manual searching.
🧼 Clean & normalize: extract the real article URL (instead of Google redirects) and keep only the title, summary, and date.
🗑 No duplicates: filter out empty or repeated entries before they ever reach your sheet.
📊 Central storage: append all new, unique links into a Google Sheet for review or further automation.

---

How it works
1. Trigger: Cron starts the flow every Monday at 09:00 (you can change the schedule).
2. RSS Read: fetches articles from multiple Google News queries (e.g., "AI", "AI Automation").
3. Merge: combines all feed results into one list.
4. Set (Clean URL): extracts the real URL, title, summary, and publication date (see the sketch after this description).
5. Filter: ensures only items with a valid title and URL continue.
6. Unique by URL: removes duplicate articles across feeds.
7. Google Sheets Append: saves new links into your chosen sheet for review and later use.

---

Setup Instructions
1. Import the workflow into your n8n instance.
2. Update RSS feeds: replace the example Google News RSS URLs (AI, AI Automation) with your own queries. Format: https://news.google.com/rss/search?q=YOUR_QUERY&hl=de&gl=DE&ceid=DE:de
3. Connect Google Sheets: add your Google Sheets credentials, then select the documentId (the spreadsheet) and sheetName (the tab) in the Append new Links node. Recommended columns: date, title, url, summary.
4. Adjust schedule: in the Trigger: Montag 09:00 node, change the cron expression to daily or multiple times per day if you want.
5. Run a test: execute once manually and check your sheet for the first rows.

---

Tips & Extensions
✅ Add more RSS Read nodes for additional sources (blogs, media outlets, niche topics).
✅ Chain this workflow with an AI node (OpenAI/GPT) to automatically generate post ideas from the collected articles.
✅ Notify yourself in Slack/Telegram when new articles are added.
✅ Use a status column (Draft, Approved, Posted) to manage a simple content pipeline directly from the sheet.

---

👉 With this template you'll never run out of content ideas: everything flows into one place, ready to inspire your next posts, newsletters, or campaigns.
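A minimal sketch of step 4 as an n8n Code node, assuming the RSS Read node outputs the usual link, title, contentSnippet, and isoDate fields, and that Google News wraps the real article URL in a url= query parameter (both are assumptions; the template's actual Set node may work differently):

```javascript
// Hypothetical "Clean URL" step: keep only date, title, summary, and the
// real article URL, falling back to the redirect link when no url= param exists.
return items.map((item) => {
  const raw = item.json.link || '';
  let url = raw;
  try {
    const parsed = new URL(raw);
    url = parsed.searchParams.get('url') || raw; // unwrap the Google redirect if present
  } catch (e) {
    // keep the raw value if it is not a parseable URL
  }
  return {
    json: {
      date: item.json.isoDate,
      title: item.json.title,
      summary: item.json.contentSnippet,
      url,
    },
  };
});
```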

By SuS
1412

Send message on Mattermost when your n8n instance starts

This workflow sends you a message on Mattermost when your n8n instance starts.
- n8n Trigger node: triggers the workflow whenever the instance starts.
- Mattermost node: sends a message on Mattermost notifying you that n8n has started.

By Harshil Agrawal
1385

Scrape TikTok trends & generate AI videos with Apify, Fal AI & Google Suite

Automated TikTok Repurposing & Video Generation Workflow

Who's it for
This workflow is designed for content creators, social media managers, and marketers, specifically those in the career, recruitment, or "job change" (転職/就職) niches. It is ideal for anyone looking to automate the process of finding trending short-form content concepts and converting them into fresh AI-generated videos.

How it works / What it does
This workflow automates the pipeline from content research to video creation:
1. Scrape Data: triggers an Apify actor (clockworks/tiktok-scraper) to search and scrape TikTok videos related to "Job Change" (転職) and "Employment" (就職).
2. Store Raw Data: saves the scraped TikTok metadata (text, stats, author info) into a Google Sheet.
3. AI Analysis & Prompting: an AI Agent (via OpenRouter) analyzes the scraped video content and creates a detailed prompt for a new video (concept, visual cues, aspect ratio).
4. Log Prompts: the generated prompt is saved to a separate tab in the Google Sheet.
5. Video Generation: the prompt is sent to Fal AI (Veo3 model) to generate a new 8-second, vertical (9:16) video with audio.
6. Wait & Retrieve: the workflow waits for the generation to complete, then retrieves the video file.
7. Cloud Storage: finally, it uploads the generated video file to a specific Google Drive folder.

How to set up
Credentials: configure the following credentials in n8n:
- Apify API (currently passed via URL query params in the workflow; switching to Header Auth is recommended)
- Google Sheets OAuth2: connect your Google account
- OpenRouter API: for the AI Agent
- Fal AI (Header Auth): for the video generation API
- Google Drive OAuth2: for uploading the final video
Google Sheets: create a spreadsheet, note the documentId, and update the Google Sheets nodes. Ensure the necessary sheet names exist (e.g., "シート1" (Sheet1) for raw data, "生成済み" ("generated") for prompts) and that the columns are mapped.
Google Drive: create a destination folder and update the Upload file node with the correct folderId.
Apify: update the token in the HTTP Request and HTTP Request1 URLs with your own Apify API token.

Requirements
- n8n version 1.x or higher (the workflow uses version 4.3 nodes)
- Apify account with access to clockworks/tiktok-scraper and sufficient credits
- Fal.ai account with credits for the fal-ai/veo3 model
- OpenRouter account with credits for the selected LLM
- Google Workspace access to Drive and Sheets

How to customize the workflow
- Change the niche: update the searchQueries JSON body in the first HTTP Request node (e.g., change "転職" to "Cooking" or "Fitness"); see the sketch after this description.
- Adjust AI logic: modify the AI Agent system prompt to change the style, tone, or structure of the video prompts it generates.
- Video settings: in the Fal Submit node, adjust bodyParameters to change the duration (e.g., 5s), aspect ratio (e.g., 16:9), or disable audio.
- Scale: increase the amount in the Limit node to process more than one video per execution.
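As a rough sketch, the body of that first HTTP Request node might look like the following. The exact input schema of clockworks/tiktok-scraper should be checked against the actor's documentation; the resultsPerPage field here is an assumption:

```javascript
// Hypothetical JSON body for starting the Apify actor run.
// Swap the searchQueries values to change the niche, as described above.
const actorInput = {
  searchQueries: ['転職', '就職'], // e.g. replace with ['Cooking'] or ['Fitness']
  resultsPerPage: 10,              // assumed field: number of videos per query
};
```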

By furuidoreandoro
659

Automatic Jest test generation for GitHub PRs with dual AI review

Workflow: Automatic Unit Test Creator from GitHub

🏗️ Architecture Overview
This workflow listens for GitHub pull-request events, analyzes changed React/TypeScript files, auto-generates Jest tests via AI, has them reviewed by a second AI pass, and posts suggestions back as PR comments:
1. GitHub Webhook → PR opened or updated
2. Fetch & Diff → retrieve raw diff of changed files
3. Filter & Split → isolate .tsx files & their diffs
4. Fetch File Contents → provide full context for tests
5. Test Maker Agent → generate Jest tests for diff hunks
6. Code Reviewer Agent → refine tests for style & edge cases
7. Post PR Comment → send suggested tests back to GitHub

📦 Node-by-Node Breakdown

```mermaid
flowchart LR
  A["Webhook: /github/pr-events"] --> B["GitHub: Get PR"]
  B --> C["Function: Parse diff_url + owner/repo"]
  C --> D["HTTP Request: GET diff_url"]
  D --> E["Function: Split on 'diff --git'"]
  E --> F["Filter: /\.tsx$/"]
  F --> G["GitHub: Get File Contents"]
  G --> H["Test Maker Agent"]
  H --> I["Code Reviewer Agent"]
  I --> J["Function: Build Comment Payload"]
  J --> K["HTTP Request: POST to PR Comments"]
```

1. Webhook: GitHub PR Events
   - Type: HTTP Webhook (/webhook/github/pr-events)
   - Subscribed events: pull_request.opened, pull_request.synchronize
2. GitHub: Get PR
   - Node: GitHub, action "Get Pull Request"
   - Inputs: owner, repo, pull_number
3. Function: Parse diff_url + owner/repo
   - Extracts diff_url (e.g., …/pulls/123.diff), owner, repo, merge_commit_sha
4. HTTP Request: GET diff_url
   - Fetches the unified-diff text for the PR.
5. Function: Split on "diff --git"
   - Splits the diff into file-specific segments (a sketch follows this breakdown).
6. Filter: /\.tsx$/
   - Keeps only segments where the file path ends with .tsx.
7. GitHub: Get File Contents
   - For each .tsx file, fetches the latest blob via the GitHub API.
8. Test Maker Agent
   - Prompt: "Generate Jest unit tests covering only the behaviors changed in these diff hunks."
   - Output: raw Jest test code.
9. Code Reviewer Agent
   - Reads the file + generated tests.
   - Prompt: "Review and improve these tests for readability, edge-cases, and naming conventions."
   - Output: polished test suite.
10. Function: Build Comment Payload
    - Wraps the tests in a TypeScript fence:
      ```ts
      // generated tests…
      ```
    - Constructs the JSON payload:
      ```json
      { "body": "<tests>" }
      ```
11. HTTP Request: POST to PR Comments
    - URL: https://api.github.com/repos/{owner}/{repo}/issues/{pull_number}/comments
    - Body: contains the suggested tests.

🔍 Design Rationale & Best Practices
- Focused Diff Analysis: targets only .tsx files to cover UI logic.
- Two-Stage AI: separate "generate" + "review" steps mimic TDD + code review.
- Stateless Functions: pure JS for parsing & transformation, easy to test.
- Non-Blocking PR Comments: asynchronous suggestions, so developers aren't blocked.
- Scoped Permissions: GitHub token limited to reading PRs & posting comments.
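As an illustration of the split step, a Function/Code node along these lines would produce one n8n item per changed file. This is a sketch, assuming the previous node stored the raw diff text in a data field:

```javascript
// Hypothetical "Split on 'diff --git'" Function node: turn one unified diff
// into one n8n item per changed file, restoring the marker on each segment.
const diff = $json.data || '';
return diff
  .split(/^diff --git /m)                 // one chunk per file
  .filter((chunk) => chunk.trim().length) // drop the empty leading chunk
  .map((chunk) => ({ json: { segment: 'diff --git ' + chunk } }));
```

The downstream Filter node can then test each segment's file path against /\.tsx$/.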

By Varritech
599

Download Facebook videos to Google Drive with automated logging in sheets

🚀 Facebook to MP4 Video Downloader: Fully Customizable Automated Workflow

Easily convert Facebook videos into downloadable MP4 files using the Facebook Video Downloader API. This n8n workflow automates fetching videos, downloading them, uploading them to Google Drive, and logging results in Google Sheets. You can modify and extend this flow to suit your own needs (e.g., add email notifications, change the storage location, or use another API).

---

📝 Node-by-Node Explanation
1. On form submission → triggers when a user submits a Facebook video URL via the form. (You can customize this form to include email or multiple URLs.)
2. Facebook RapidAPI Request → sends a POST request to the Facebook Video Downloader API to fetch downloadable MP4 links. (Easily replace or update API parameters as needed.)
3. If Node → checks the API response for errors before proceeding. (You can add more conditions to handle custom error scenarios.)
4. MP4 Downloader → downloads the Facebook video file from the received media URL. (You can change download settings, add quality filters, or store multiple resolutions.)
5. Upload to Google Drive → uploads the downloaded MP4 file to a Google Drive folder. (Easily switch to Dropbox, S3, or any other storage service.)
6. Google Drive Set Permission → sets the uploaded file to be publicly shareable. (You can make it private or share it only with specific users.)
7. Google Sheets → logs successful conversions with the original URL and shareable MP4 link. (Customizable for additional fields like video title, size, or download time.)
8. Wait Node → delays before logging failed conversions to avoid rapid writes. (You can adjust the wait duration or add retry attempts.)
9. Google Sheets Append Row → records failed conversion attempts with N/A as the Drive URL. (You can add notification alerts for failed downloads.)

---

✅ Use Cases
- Automate Facebook video downloads for social media teams
- Instantly generate shareable MP4 links for clients or marketing campaigns
- Maintain a centralized log of downloaded videos for reporting
- Customizable flow for different video qualities, formats, or storage needs

🚀 Benefits
- Fast and reliable Facebook video downloading with the Facebook Video Downloader API
- Flexible and fully customizable: adapt nodes, storage, and notifications as required
- Automatic error handling and logging in Google Sheets
- Cloud-based storage with secure and shareable Google Drive links
- Seamless integration with n8n for scalable automation

---

🔑 Result: manual Facebook video downloads become fully automated, customizable, and scalable, with Google Drive uploads and detailed logging via Google Sheets.

By Sk developer
427

Ingredient price trend analysis & buying recommendations with PostgreSQL, API & Slack

This automated n8n workflow monitors ingredient price changes from external APIs or manual sources, analyzes historical trends, and provides smart buying recommendations. The system tracks price fluctuations in a PostgreSQL database, generates actionable insights, and sends alerts via email and Slack to help restaurants optimize their purchasing decisions.

What is Price Trend Analysis?
Price trend analysis uses historical price data to identify patterns and predict optimal buying opportunities. The system analyzes price movements over time and generates recommendations on when to buy ingredients based on current trends and historical patterns.

Good to Know
- Price data accuracy depends on the reliability of external API sources.
- Historical data improves recommendation accuracy over time (a minimum of 30 days is recommended).
- PostgreSQL provides robust data storage and supports complex trend analysis.
- Real-time alerts help capture optimal buying opportunities.
- The dashboard provides visual insights into price trends and recommendations.

How It Works
1. Daily Price Check: triggers the workflow daily to monitor price changes.
2. Fetch API Prices: retrieves the latest prices from an external ingredient pricing API.
3. Setup Database: ensures database tables are ready before inserting new data.
4. Store Price Data: saves current prices to the PostgreSQL database for tracking.
5. Calculate Trends: analyzes historical prices to detect patterns and price movements.
6. Generate Recommendations: suggests actions based on price trends (buy/wait/stock up); see the sketch below.
7. Store Recommendations: saves recommendations for future reporting.
8. Get Dashboard Data: gathers the data needed for dashboard generation.
9. Generate Dashboard HTML: builds an HTML dashboard to visualize insights.
10. Send Email Report: emails the dashboard report to stakeholders.
11. Send Slack Alert: sends key alerts or recommendations to Slack channels.

Database Structure
The workflow uses PostgreSQL with two main tables.

price_history: historical price tracking, with columns:
- id (primary key)
- ingredient (VARCHAR(100)): name of the ingredient
- price (DECIMAL(10,2)): current price value
- unit (VARCHAR(50)): unit of measurement (kg, lbs, etc.)
- supplier (VARCHAR(100)): source supplier name
- timestamp (TIMESTAMP): when the price was recorded
- created_at (TIMESTAMP): record creation time

buying_recommendations: AI-generated buying suggestions, with columns:
- id (primary key)
- ingredient (VARCHAR(100)): ingredient name
- current_price (DECIMAL(10,2)): latest price
- price_change_percent (DECIMAL(5,2)): percentage change from the previous price
- trend (VARCHAR(20)): price trend direction (INCREASING/DECREASING/STABLE)
- recommendation (VARCHAR(50)): buying action (BUY_NOW/WAIT/STOCK_UP)
- urgency (VARCHAR(20)): urgency level (HIGH/MEDIUM/LOW)
- reason (TEXT): explanation for the recommendation
- generated_at (TIMESTAMP): when the recommendation was created

Price Trend Analysis
The system analyzes historical price data over the last 30 days to calculate percentage changes, identify trends (INCREASING/DECREASING/STABLE), and generate actionable buying recommendations based on price patterns and movement history.
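A minimal sketch of the recommendation logic described above; the ±2% thresholds and the mapping from trend to action are assumptions for illustration, not the template's exact rules:

```javascript
// Hypothetical trend classification for the Generate Recommendations step.
// Compares the current price to the previous one and maps the movement to
// the trend/recommendation/urgency values stored in buying_recommendations.
function recommend(previousPrice, currentPrice) {
  const changePct = ((currentPrice - previousPrice) / previousPrice) * 100;

  let trend = 'STABLE';
  if (changePct > 2) trend = 'INCREASING';        // assumed threshold
  else if (changePct < -2) trend = 'DECREASING';  // assumed threshold

  let recommendation = 'WAIT';
  let urgency = 'MEDIUM';
  if (trend === 'DECREASING') { recommendation = 'BUY_NOW'; urgency = 'HIGH'; }
  else if (trend === 'STABLE') { recommendation = 'STOCK_UP'; urgency = 'LOW'; }

  return { price_change_percent: Number(changePct.toFixed(2)), trend, recommendation, urgency };
}

// Example: a 5% price drop suggests buying now.
console.log(recommend(10.0, 9.5));
// { price_change_percent: -5, trend: 'DECREASING', recommendation: 'BUY_NOW', urgency: 'HIGH' }
```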
How to Use
1. Import the workflow into n8n.
2. Configure PostgreSQL database connection credentials.
3. Set up external ingredient pricing API access.
4. Configure email credentials for dashboard reports.
5. Set up Slack webhook or bot credentials for alerts.
6. Run the Setup Database node to create the required tables and indexes.
7. Test with sample ingredient data to verify price tracking and recommendations.
8. Adjust trend analysis parameters based on your purchasing patterns.
9. Monitor recommendations and refine thresholds based on actual buying decisions.

Requirements
- PostgreSQL database access
- External ingredient pricing API credentials
- Email service credentials (Gmail, SMTP, etc.)
- Slack webhook URL or bot credentials
- Historical price data for initial trend analysis

Customizing This Workflow
Modify the Calculate Trends node to adjust the analysis period (currently 30 days) or add seasonal adjustments. Customize the recommendation logic to match your restaurant's buying patterns, budget constraints, or supplier agreements. Add additional data sources like weather forecasts or market reports for more sophisticated predictions.

By Oneclick AI Squad
326

Scheduled daily affirmations via email and Telegram using Cron

Description
Wake up gently. This elegant workflow runs every morning at 7 AM, picks one uplifting affirmation from a curated list, and delivers it to your inbox (with optional Telegram). Zero code, zero secrets: just drop in your SMTP and Telegram credentials, edit the affirmations, and activate. Perfect for creators, homemakers, and entrepreneurs who crave intention and beauty before the day begins.

How it works (high-level steps)
1. Cron wakes the flow daily at 7 AM.
2. Set (Configuration) stores your email, Telegram chat ID, and affirmations.
3. A Code node randomly selects one affirmation (see the sketch after this description).
4. The Email node sends the message via SMTP.
5. An IF node decides whether to forward it to Telegram as well.

Set-up time: 2-3 minutes
- 30 s: add SMTP credential
- 30 s: add Telegram Bot credential (optional)
- 1 min: edit affirmations & email addresses
- 30 s: activate

Detailed instructions
All deep-dive steps live inside the yellow and white sticky notes on the canvas; no extra docs needed.

Requirements
- SMTP account (SendGrid, Gmail, etc.)
- Telegram Bot account (optional)

Customisation tips
- Change the Cron time or frequency
- Swap the affirmation list for quotes, verses, or mantras
- Add a Notion logger branch for journaling
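The random selection in step 3 is small enough to show in full. A sketch, assuming the upstream Configuration node stores the list in an affirmations array field:

```javascript
// Hypothetical Code node: pick one affirmation at random from the list
// stored by the upstream Set (Configuration) node.
const affirmations = $json.affirmations || [];
const affirmation = affirmations[Math.floor(Math.random() * affirmations.length)];
return [{ json: { affirmation } }];
```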

By Shelly-Ann Davy
128

Qualify and Enrich Leads with Octave's Contextual Insights and Slack Alerts

This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Qualify and enrich inbound leads with contextual insights and Slack alerts

Who is this for?
Sales teams, account executives, and RevOps professionals who need more than just basic lead scoring. Built for teams that want deep contextual insights about qualified prospects to enable truly relevant conversations from the first touchpoint.

What problem does this solve?
Most qualification stops at "good fit" or "bad fit", but that leaves sales teams flying blind when it comes to actually engaging the prospect. You know they're qualified, but what are their specific pain points? What value propositions resonate? Which reference customers should you mention? This workflow uses Octave's context engine to not only qualify leads but enrich them with actionable insights that turn cold outreach into warm, contextualized conversations.

What this workflow does

Inbound Lead Processing:
- Receives lead information via webhook (firstName, companyName, companyDomain, profileURL, jobTitle)
- Processes leads from website forms, demo requests, content downloads, or trial signups
- Validates and structures lead data for intelligent qualification and enrichment

Contextualized Lead Qualification:
- Leverages Octave's context engine to score leads against your specific ICP
- Analyzes company fit, role relevance, and timing indicators
- Generates qualification scores (1-10) with a detailed rationale
- Filters out low-scoring leads (configurable threshold; default > 5)

Deep Lead Enrichment:
- Uses Octave's enrichment engine to generate contextual insights about qualified leads
- Identifies primary responsibilities, pain points, and relevant value propositions
- Suggests appropriate reference customers and use cases to mention
- Provides sales teams with conversation starters grounded in your business context

Enhanced Sales Alerts:
- Sends enriched Slack alerts with the qualification score plus actionable insights
- Includes suggested talking points, pain points, and reference customers
- Enables sales teams to have contextualized conversations from first contact

Setup

Required Credentials:
- Octave API key and workspace access
- Slack OAuth credentials with channel access
- Access to your lead source system (website forms, CRM, etc.)
Step-by-Step Configuration:
1. Set up the Octave Qualification Agent: add your Octave API credentials in n8n, replace your-octave-qualification-agent-id with your actual qualification agent ID, and configure the agent with your ICP criteria and business context.
2. Set up the Octave Enrichment Agent: replace your-octave-enrichment-agent-id with your actual enrichment agent ID, configure enrichment outputs based on the insights most valuable to your sales process, and test enrichment quality with sample leads from your target market.
3. Configure Slack Integration: add your Slack OAuth credentials to n8n, replace your-slack-channel-id with the channel for enriched lead alerts, and customize the Slack message template with the enrichment fields most useful for your sales team.
4. Set up the Lead Source: replace your-webhook-path-here with a unique, secure path, configure your website forms, CRM, or lead source to send data to the webhook, and ensure consistent data formatting across lead sources.
5. Customize the Qualification Filter: adjust the Filter node threshold (default: score > 5) based on your lead volume and qualification standards, and test with sample leads to calibrate scoring.

Required Webhook Payload Format:

```json
{
  "body": {
    "firstName": "Sarah",
    "lastName": "Johnson",
    "companyName": "ScaleUp Technologies",
    "companyDomain": "scaleuptech.com",
    "profileURL": "https://linkedin.com/in/sarahjohnson",
    "jobTitle": "VP of Engineering"
  }
}
```

How to customize

Qualification Criteria: customize scoring in your Octave qualification agent:
- Product Level: define "good fit" and "bad fit" questions that determine whether someone needs your core offering
- Persona Level: set criteria for specific buyer personas and their unique qualification factors
- Segment Level: configure qualification logic for different market segments or use cases
- Multi-Level Qualification: qualify against Product + Persona, Product + Segment, or all three levels combined

Enrichment Insights: configure your Octave enrichment agent to surface the most valuable insights:
- Primary Responsibilities: what this person actually does day-to-day
- Pain Points: specific challenges they face that your solution addresses
- Value Propositions: which benefits resonate most with their role and situation
- Reference Customers: similar companies/roles that have succeeded with your solution
- Conversation Starters: contextual talking points for outreach

Slack Alert Format: customize the enrichment data included in alerts:
- Add or remove enrichment fields based on sales team preferences
- Modify the message formatting for better readability
- Include additional webhook data if needed

Scoring Threshold: adjust the Filter node to match your qualification standards.
Integration Channels: replace Slack with email, CRM updates, or other notification systems.

Use Cases
- High-value enterprise lead qualification and research automation
- Demo request enrichment for contextual sales conversations
- Event lead processing with immediate actionable insights
- Website visitor qualification and conversation preparation
- Trial signup enrichment for targeted sales outreach
- Content download lead scoring with context-aware follow-up preparation

By Nalin
68

Manage your Shopify store via AI assistant with OpenAI and MCP server

Who it's for
This n8n workflow is designed for Shopify store owners and e-commerce managers who want to automate their store operations through an intelligent AI assistant. The workflow creates a conversational interface that can manage products, process orders, and provide store analytics through natural language commands.

Features
- Intelligent AI Assistant: powered by OpenAI with specialized system prompts for e-commerce operations
- Shopify Integration: complete MCP (Model Context Protocol) server implementation for seamless Shopify operations
- Product Management: create, update, search, retrieve, and delete products automatically
- Order Processing: create, update, retrieve, and manage orders, including fulfillment status
- Context-Aware Automation: uses conversation history and Shopify data to minimize user input requirements

Requirements
- Shopify Access Token: for accessing Shopify store data and operations
- OpenAI API credentials: for powering the AI assistant
- Notification service credentials: Discord Bot API, Telegram Bot API, Rapiwa API (for WhatsApp), Gmail OAuth2

Configure Credentials
- Shopify Access Token: configure with the proper permissions for your store
- OpenAI API: set up with appropriate model access (gpt-4.1-mini or similar)
- Notification services: configure each service with the proper API keys and target IDs

Important Notes
- MCP Server Setup: the workflow includes a Shopify MCP Server that must be properly configured for production use.
- Tool Selection: the MCP Client includes specific Shopify tools that can be enabled or disabled based on requirements.
- System Prompt: the AI assistant is configured with specialized e-commerce guidelines that can be customized.
- Confirmation Requirements: irreversible actions like deletions require confirmation.
- Rate Limiting: the workflow includes appropriate delays to prevent API rate limiting.
- Notification Content: all notifications include a standard success message that can be customized.

Production Deployment for MCP Server
To deploy this workflow in production, the included Shopify MCP Server must be properly configured first (see the MCP Server Setup note above).

Support & Help
- WhatsApp: Chat on WhatsApp
- Discord: SpaGreen Community
- Facebook Group: SpaGreen Support
- Website: https://spagreen.net
- Developer Portfolio: Codecanyon SpaGreen

By SpaGreen Creative
33