15 templates found

Read a file from disk

Companion workflow for Read Binary File node docs

By amudhan
9219

🤖 Instagram MCP AI agent – read, reply & manage comments with GPT-4o

🤖 Instagram AI Agent with MCP Server – Built for Smart Engagement and Automation

Hi! I’m Amanda 🥰 I build intelligent automations with n8n and Make. This powerful workflow was designed to help teams automatically handle Instagram interactions with AI. Using the Meta Graph API, LangChain, MCP Server, and GPT-4o, it allows your AI agent to search for posts, read captions, fetch comments, and even reply to or message followers, all through structured tools.

🔧 What the workflow does
- Searches for recent media using your Instagram ID and access token
- Reads and extracts captions or media URLs
- Fetches comments and specific replies from each post
- Replies to comments automatically with GPT-generated responses
- Sends direct messages to followers who commented
- Maps user input and session to keep memory context via LangChain
- Communicates via Server-Sent Events (SSE) using your MCP Server URL

🧰 Nodes & Tech Used
- LangChain Agent + Chat Model with GPT-4o
- Memory Buffer for session memory
- toolHttpRequest to search media, fetch comments, and send replies
- MCP Trigger and MCP Tool (custom SSE connection)
- Set node for input and variable assignment
- Webhook and JSON for the Instagram API structure

⚙️ Setup Instructions
1. Create your Instagram App in the Meta Developer Portal.
2. Add your Instagram ID and Access Token in the Set node.
3. Update the MCP Server Tool URL in the MCP Instagram node, using your n8n server URL (e.g. https://yourdomain.com/mcp/server/instagram/sse).
4. Trigger the workflow using the included LangChain Chat Trigger.
5. Interact via text to ask the agent to: “Get latest posts”, “Reply to comment X with this message”, “Send DM to this user about...”.

👥 Who this is for
- Social media teams managing multiple comments
- Brands automating engagement with followers
- Agencies creating smart, autonomous digital assistants
- Developers building conversational Instagram bots

✅ Requirements
- Meta Graph API access
- Instagram Business account
- n8n instance (Cloud or self-hosted)
- MCP Server configured (SSE endpoint enabled)
- OpenAI API key (for GPT-4o + LangChain)

🌐 Want to use this workflow?
❤️ Buy workflows: https://iloveflows.com
☁️ Try n8n Cloud: https://n8n.partnerlinks.io/amanda
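The media-search step above boils down to one Graph API call plus caption extraction. Here is a minimal Python sketch of what the toolHttpRequest node does; the API version (v19.0) and field list are assumptions, so adjust them to your Meta app configuration:

```python
import json
from urllib.parse import urlencode

GRAPH_BASE = "https://graph.facebook.com/v19.0"  # assumed Graph API version

def build_media_url(ig_user_id: str, access_token: str, limit: int = 10) -> str:
    """Build the Graph API URL the workflow calls to list recent media."""
    params = urlencode({
        "fields": "id,caption,media_url,timestamp",
        "limit": limit,
        "access_token": access_token,
    })
    return f"{GRAPH_BASE}/{ig_user_id}/media?{params}"

def extract_captions(response_body: str) -> list[str]:
    """Pull captions out of a media response, skipping caption-less posts."""
    data = json.loads(response_body)
    return [item["caption"] for item in data.get("data", []) if "caption" in item]
```

In the workflow, the agent feeds the extracted captions to GPT-4o to draft replies; here they would simply be returned as a list.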

By Amanda Benks
5511

CallForge - 06 - Automate sales insights with Gong.io, Notion & AI

CallForge - AI-Powered Sales Call Data Processor

Automate sales call analysis and store structured insights in Notion with AI-powered intelligence.

Who Is This For?
This workflow is ideal for:
✅ Sales teams looking to automate call insight processing.
✅ Sales operations managers managing AI-driven call analysis.
✅ Revenue teams using Gong, Fireflies.ai, Otter.ai, or similar transcription tools.
It streamlines sales call intelligence, ensuring that insights such as competitor mentions, objections, and customer pain points are efficiently categorized and stored in Notion for easy access.

🔍 What Problem Does This Workflow Solve?
Manually reviewing and documenting sales call takeaways is time-consuming and error-prone. With CallForge, you can:
✔ Identify competitors mentioned in sales calls.
✔ Capture objections and customer pain points for follow-up.
✔ Track sales call outcomes and categorize insights automatically.
✔ Store structured sales intelligence in Notion for future reference.
✔ Improve sales strategy with AI-driven, automated call analysis.

📌 Key Features & Workflow Steps

🎙️ AI-Powered Call Data Processing
This workflow processes AI-generated sales call insights and structures them in Notion databases:
- Triggers automatically when AI call analysis data is received.
- Extracts competitor mentions from the call transcript and logs them in Notion.
- Identifies and categorizes sales objections for better follow-ups.
- Processes integration mentions, capturing tools or platforms referenced in the call.
- Extracts customer use cases, categorizing pain points and feature requests.
- Aggregates all extracted insights and updates the relevant Notion databases.

📊 Notion Database Integration
- Competitors → Logs mentioned competitors for sales intelligence.
- Objections → Tracks and categorizes common objections from prospects.
- Integrations → Captures third-party tools & platforms discussed in calls.
- Use Cases → Stores customer challenges & product feature requests.

🛠 How to Set Up This Workflow
1. Prepare your AI call analysis data. Ensure AI-generated sales call data is passed into the workflow; it is compatible with Gong, Fireflies.ai, Otter.ai, and other AI transcription tools.
2. Connect your Notion databases:
🔹 Competitors (tracks competing products)
🔹 Objections (logs customer objections & concerns)
🔹 Integrations (captures mentioned platforms & tools)
🔹 Use Cases (categorizes customer pain points & feature requests)
3. Configure the n8n API integrations: connect your Notion API key in n8n under “Notion API Credentials,” set up webhook triggers to receive data from your AI transcription tool, and test the workflow using a sample AI-generated call transcript.

The CallForge series:
- CallForge - 01 - Filter Gong Calls Synced to Salesforce by Opportunity Stage
- CallForge - 02 - Prep Gong Calls with Sheets & Notion for AI Summarization
- CallForge - 03 - Gong Transcript Processor and Salesforce Enricher
- CallForge - 04 - AI Workflow for Gong.io Sales Calls
- CallForge - 05 - Gong.io Call Analysis with Azure AI & CRM Sync
- CallForge - 06 - Automate Sales Insights with Gong.io, Notion & AI
- CallForge - 07 - AI Marketing Data Processing with Gong & Notion
- CallForge - 08 - AI Product Insights from Sales Calls with Notion

🔧 How to Customize This Workflow
💡 Modify the Notion data structure – adjust fields to match your company’s CRM setup.
💡 Enhance AI data processing – align fields with different AI transcription providers.
💡 Expand with CRM integration – sync insights with HubSpot, Salesforce, or Pipedrive.
💡 Add notifications – send alerts via Slack, email, or webhook when key competitor mentions or objections are detected.

⚙️ Key Nodes Used in This Workflow
🔹 If Nodes – Check whether AI-generated data includes competitors, integrations, objections, or use cases.
🔹 Notion Nodes – Create or update entries in Notion databases.
🔹 Split Out & Aggregate Nodes – Process multiple insights and consolidate AI outputs.
🔹 Wait Nodes – Ensure smooth sequencing of API calls and database updates.
🔹 HTTP Request Node – Sends AI-extracted insights to Notion for structured storage.

🚀 Why Use This Workflow?
✔ Eliminates manual data entry and speeds up sales intelligence processing.
✔ Ensures structured and categorized sales insights for decision-making.
✔ Improves team collaboration with AI-powered competitor tracking & objection logging.
✔ Seamlessly integrates with Notion to centralize and manage sales call insights.
✔ Scales for teams using n8n Cloud or self-hosted deployments.

This workflow empowers sales teams with automated AI insights, streamlining sales strategy and follow-ups with minimal effort. 🚀
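The If / Split Out / Aggregate pattern described above can be sketched in a few lines of Python: take the AI output, check which insight types are present, and emit one record per insight tagged with its target Notion database. The key names (competitors, objections, ...) and database names are placeholders; map them to your own AI output schema and Notion setup:

```python
from typing import Any

# Maps each insight type in the AI output to its target Notion database.
# Both sides are hypothetical names; align them with your own setup.
INSIGHT_DATABASES = {
    "competitors": "Competitors",
    "objections": "Objections",
    "integrations": "Integrations",
    "use_cases": "Use Cases",
}

def route_insights(ai_output: dict[str, Any]) -> list[dict[str, str]]:
    """Flatten AI call analysis into one record per insight, tagged with
    the Notion database it belongs in (the If + Split Out node logic)."""
    records = []
    for key, database in INSIGHT_DATABASES.items():
        for item in ai_output.get(key) or []:  # If node: skip missing/empty types
            records.append({"database": database, "value": item})
    return records
```

Each record would then be sent to the matching Notion database by the Notion or HTTP Request node.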

By Angel Menendez
1920

Monitor Amazon product prices with Bright Data and Google Sheets

Amazon Price Monitoring Workflow

This workflow enables you to monitor the prices of Amazon product listings directly from a Google Sheet, using data provided by Bright Data’s Amazon Scraper API. It automates the retrieval of price data for specified products and is ideal for market research, competitor analysis, or personal price tracking.

✅ Requirements
Before using this template, ensure you have the following:
- A Bright Data account and access to the Amazon Scraper API.
- An active API key from Bright Data.
- A Google Sheet set up with the required columns.
- An n8n account (self-hosted or cloud version).

⚙️ Setup
1. Create a Google Sheet with the following columns:
- Product URL
- ZIP Code (used for regional price variations)
- ASIN (Amazon Standard Identification Number)
2. Extract the ASIN automatically using the following formula in the ASIN column (replace A2 with the appropriate cell reference):
=REGEXEXTRACT(A2, "/(?:dp|gp/product|product)/([A-Z0-9]{10})")
3. Obtain an API key: sign in to your Bright Data account, go to the API section to generate an API key, and create a Bearer Authentication credential using this key in your automation tool.
4. Configure the workflow:
- Use a node (e.g., “Google Sheets”) to read data from your sheet.
- Use an HTTP Request node to send a query to Bright Data’s Amazon API with the ASIN and ZIP code.
- Parse the returned JSON response to extract the product price and other relevant data.
- Optionally write the output (e.g., current price, timestamp) back into the sheet or another data store.

Workflow Functionality
The workflow is triggered periodically (or manually) and reads product details from your Google Sheet. For each row, it extracts the Product URL and ZIP code and sends a request to the Bright Data API. The API returns product price information, which is then logged or updated back into the sheet, matched by ASIN. You can also match rows by product URL instead, but ensure that the URL has no appended parameters; if it does, refer to the input field from the Bright Data snapshot result.

💡 Use Cases
- E-commerce sellers monitoring competitors’ prices.
- Consumers tracking price drops on wishlist items.
- Market researchers collecting pricing data across ZIP codes.
- Affiliate marketers ensuring accurate product pricing on their platforms.

🛠️ Customization
- Add columns for additional product data such as rating, seller, or stock availability.
- Schedule the workflow to run hourly, daily, or weekly depending on your needs.
- Implement email or Slack alerts for significant price changes.
- Filter by product category or brand to narrow your tracking focus.

By Cyril Nicko Gaspar
1571

📍 Daily nearby garage sales alerts via Telegram

Get a personalized list of garage sales happening today, based on your current location, directly in Telegram each morning! This n8n workflow integrates Home Assistant and Brocabrac.fr to:
- Automatically detect your location every day
- Scrape and parse garage sale listings from Brocabrac
- Filter for high-quality and nearby events
- Send a neatly formatted message to your Telegram account
Perfect for treasure hunters and second-hand enthusiasts who want to stay in the loop with zero effort!

By Thibaud
1043

Create a new user in Intercom

No description available.

By tanaypant
807

Auto-ticket maker: Convert Slack conversations into structured project tickets

Workflow: Auto-Ticket Maker

⚡ About the Creators
This workflow was created by Varritech Technologies, an innovative agency that leverages AI to engineer, design, and deliver software development projects 500% faster than traditional agencies. Based in New York City, we specialize in custom software development, web applications, and digital transformation solutions. If you need assistance implementing this workflow or have questions about content management solutions, please reach out to our team.

🏗️ Architecture Overview
This workflow transforms your Slack conversations into complete project tickets, effectively replacing the need for a dedicated PM for task creation:
- Slack Webhook → Captures team conversation
- Code Transformation → Parses the Slack message structure
- AI PM Agent → Analyzes requirements and creates complete tickets
- Memory Buffer → Maintains conversation context
- Slack Output → Returns formatted tickets to your channel
Say goodbye to endless PM meetings just to create tickets! Simply describe what you need in Slack, and our AI PM handles the rest, breaking down complex projects into structured epics and tasks with all the necessary details.

📦 Node-by-Node Breakdown

flowchart LR
A[Webhook: Slack Trigger] --> B[Code: Parse Message]
B --> C[AI PM Agent]
C --> D[Slack: Post Tickets]
E[Memory Buffer] --> C
F[OpenAI Model] --> C

Webhook: Slack Trigger
- Type: HTTP Webhook (POST /slack-ticket-maker)
- Purpose: Captures messages from your designated Slack channel.

Code Transformation
- Function: Parses the complex Slack payload structure
- Extracts: User ID, channel, message text, timestamp, thread information

AI PM Agent
- Inputs: Parsed Slack message
- Process: Evaluates project complexity, requests a project name if needed, asks clarifying questions (up to 2 rounds), breaks the project down into epics and tasks, and formats each ticket with a comprehensive structure.
- Ticket structure: Title, Description, Objectives/Goals, Definition of Done, Requirements/Acceptance Criteria, Implementation Details, Risks & Challenges, Testing & Validation, Timeline & Milestones, Related Notes & References, Open Questions

Memory Buffer
- Type: Window Buffer Memory
- Purpose: Maintains context across the conversation

Slack Output
- Posts fully formatted tickets back to your channel
- Uses markdown for clean, structured presentation

🔍 Design Rationale & Best Practices
- Replace your PM’s ticket creation time: let your PM focus on strategy while AI handles the documentation. Cut ticket creation time by 90%.
- Standardized quality: every ticket follows best practices with a consistent structure, detail level, and formatting.
- No training required: describe your needs conversationally; the AI adapts to your communication style.
- Seamless integration: works within your existing Slack workflow, with no new tools to learn.

By Varritech
700

OAuth2 settings finder with OpenRouter chat model and Llama 3.3

Find OAuth URIs with AI Llama

Overview
The AI agent identifies:
- Authorization URI
- Token URI
- Audience

Methodology
Confidence scoring is used to assess the trustworthiness of extracted data:
- Score range: 0 < x ≤ 1
- Score granularity: 0.01 increments
Model details: leverages the Wayfarer Large 70b Llama 3.3 model.

How it works
This template is designed to assist users in obtaining OAuth2 settings using AI-powered insights. It is ideal for developers, IT professionals, or anyone working with APIs that require OAuth2 authentication. By leveraging the AI agent, users can simplify the process of extracting and validating key details such as the authorization URL, token URL, and audience.

Set up instructions
Configuration nodes:
- Structured Output node: parses the AI model’s output using a predefined JSON schema. This ensures the data is structured for downstream processing.
- Code node: if the AI model’s output does not match the required format, use the Code node to re-arrange and transform the data. Example code snippets are provided below for common scenarios.

AI model prompt
The prompt for the AI model includes:
- A detailed structure and the objectives of the query.
- Flexibility for the model to improvise when accurate results cannot be determined.

Confidence scoring
The AI model assigns a confidence score (0 < x ≤ 1) to indicate the reliability of the extracted data. Scores are provided in increments of 0.01 for granularity.

Adaptability
Customize this template:
- Update the AI model prompt with details specific to your API or OAuth2 setup.
- Adjust the JSON schema in the Structured Output node to match the data format.
- Modify the Code node logic to suit your application’s requirements.
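The validation the Structured Output and Code nodes perform can be sketched as a small checker: required fields must be present, and the confidence score must satisfy the template's rules (0 < x ≤ 1, in 0.01 increments). The key names below (authorization_uri, token_uri, audience, confidence) are assumptions; match them to the JSON schema you define in the Structured Output node:

```python
def validate_oauth_result(result: dict) -> list[str]:
    """Check an AI-extracted OAuth2 result against the template's rules.
    Returns a list of error strings (empty list means the result is valid)."""
    errors = []
    for field in ("authorization_uri", "token_uri", "audience"):  # assumed key names
        if not result.get(field):
            errors.append(f"missing {field}")
    score = result.get("confidence")
    if not isinstance(score, (int, float)) or not 0 < score <= 1:
        errors.append("confidence must satisfy 0 < x <= 1")
    elif abs(score * 100 - round(score * 100)) > 1e-9:
        # tolerance absorbs float noise like 0.87 * 100 == 87.00000000000001
        errors.append("confidence must use 0.01 increments")
    return errors
```

A Code node running a check like this can route malformed AI output back for re-arrangement instead of passing it downstream.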

By Hendriekus
523

Real-time OKX spot market data with GPT-4o & Telegram

Instantly access live OKX Spot Market data directly in Telegram! This workflow integrates the OKX REST v5 API with Telegram and optional GPT-4.1-mini formatting, delivering real-time insights such as latest prices, order book depth, candlesticks, trades, and mark prices, all in clean, structured reports.

🔎 How It Works
1. A Telegram Trigger node listens for incoming user commands.
2. The User Authentication node validates the Telegram ID so only authorized users can interact.
3. The workflow creates a Session ID from chat.id to manage session memory.
4. The OKX AI Agent orchestrates data retrieval via HTTP requests to OKX endpoints:
- Latest Price: /api/v5/market/ticker?instId=BTC-USDT
- 24h Stats: /api/v5/market/ticker?instId=BTC-USDT
- Order Book Depth: /api/v5/market/books?instId=BTC-USDT&sz=50
- Best Bid/Ask: book ticker snapshot
- Candlesticks / Klines: /api/v5/market/candles?instId=BTC-USDT&bar=15m
- Average / Mark Price: /api/v5/market/mark-price?instType=SPOT&instId=BTC-USDT
- Recent Trades: /api/v5/market/trades?instId=BTC-USDT&limit=100
5. Utility tools refine the data:
- Calculator → spreads, % change, normalized volumes.
- Think → reshapes raw JSON into clean text.
- Simple Memory → stores sessionId, symbol, and state for multi-turn interactions.
6. A message splitter ensures Telegram output stays under 4000 characters.
7. Final results are sent to Telegram in a structured, human-readable format.

✅ What You Can Do with This Agent
- Get the latest price and 24h stats for any Spot instrument.
- Retrieve order book depth with configurable size (up to 400 levels).
- View best bid/ask snapshots instantly.
- Fetch candlestick OHLCV data across intervals (1m → 1M).
- Monitor recent trades (up to 100).
- Check the mark price as a fair average reference.
- Receive clean, Telegram-ready reports (auto-split if too long).

🛠️ Setup Steps
1. Create a Telegram bot: use @BotFather to generate a bot token.
2. Configure in n8n: import OKX AI Agent v1.02.json, replace the placeholder in the User Authentication node with your Telegram ID, add your Telegram API credentials (bot token), add your OpenAI API key for GPT-4.1-mini, and optionally add your OKX API key.
3. Deploy and test: activate the workflow in n8n, send a query like BTC-USDT to your bot, and instantly get structured OKX Spot data back in Telegram.

📺 Setup Video Tutorial
Watch the full setup guide on YouTube: https://www.youtube.com/watch?v=TAA_BFuwml0

⚡ Unlock real-time OKX Spot Market insights directly in Telegram, with no private API keys required!

🧾 Licensing & Attribution
© 2025 Treasurium Capital Limited Company. Architecture, prompts, and trade report structure are IP-protected. No unauthorized rebranding permitted.
🔗 For support: Don Jayamaha – LinkedIn
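The message splitter mentioned in step 6 is worth a closer look, since Telegram rejects over-length messages. A minimal sketch of such a splitter, assuming a 4000-character budget and preferring to break at newlines so report lines stay intact:

```python
TELEGRAM_LIMIT = 4000  # per-message character budget used by the splitter

def split_message(text: str, limit: int = TELEGRAM_LIMIT) -> list[str]:
    """Split a long report into chunks under the limit, breaking at the
    last newline before the limit when possible (hard cut otherwise)."""
    chunks = []
    while len(text) > limit:
        cut = text.rfind("\n", 0, limit)
        if cut <= 0:  # no newline to break on; hard cut at the limit
            cut = limit
        chunks.append(text[:cut])
        text = text[cut:].lstrip("\n")
    if text:
        chunks.append(text)
    return chunks
```

Each chunk is then sent as its own Telegram message, so long order-book or candlestick reports arrive in readable pieces.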

By Don Jayamaha Jr
424

Automate sales follow-ups with GPT-4o-mini, HubSpot, Slack, Teams & Telegram

How it works
This workflow automatically generates personalized follow-up messages for leads or customers after key interactions (e.g., demos, sales calls). It enriches contact details from HubSpot (or optionally Monday.com), uses AI to draft a professional follow-up email, and distributes it across multiple communication channels (Slack, Telegram, Teams) as reminders for the sales team.

Step-by-step
1. Trigger & input
- Schedule Trigger – runs automatically at a defined interval (e.g., daily).
- Set Sample Data – captures the contact’s name, email, and context from the last interaction (e.g., “had a product demo yesterday and showed strong interest”).
2. Contact enrichment
- HubSpot Contact Lookup – searches HubSpot CRM by email to confirm or enrich contact details.
- Monday.com Contact Fetch (optional) – can pull additional CRM details if enabled.
3. AI message generation
- AI Language Model (OpenAI) – provides the underlying engine for message creation.
- Generate Follow-Up Message – drafts a short, professional, and friendly follow-up email that references the previous interaction context, suggests clear next steps (call, resources, etc.), and ends with a standardized signature block for consistency.
4. Multi-channel communication
- Slack Reminder – posts the generated message as a reminder in the sales team’s Slack channel.
- Telegram Reminder – sends the follow-up draft to a Telegram chat.
- Teams Reminder – shares the same message in a Microsoft Teams channel.

Benefits
- Personalized outreach at scale – AI ensures each follow-up feels tailored and professional.
- Context-aware messaging – pulls in CRM details and past interactions for relevance.
- Cross-platform delivery – distributes reminders via Slack, Teams, and Telegram so no follow-up is missed.
- Time-saving for sales teams – eliminates manual drafting of repetitive follow-up emails.
- Consistent branding – ensures every message includes a unified signature block.
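The Generate Follow-Up Message step essentially assembles an instruction for the model from the contact's name, the interaction context, and a fixed signature block. A rough sketch of that prompt assembly (the exact wording and the signature are placeholders, not the template's actual prompt):

```python
SIGNATURE = "Best regards,\nThe Sales Team"  # placeholder signature block

def build_followup_prompt(name: str, context: str) -> str:
    """Assemble the model instruction for the Generate Follow-Up Message step."""
    return (
        "Write a short, professional, friendly follow-up email.\n"
        f"Recipient: {name}\n"
        f"Last interaction: {context}\n"
        "Reference the interaction, suggest a clear next step (call or resources), "
        f"and end with exactly this signature:\n{SIGNATURE}"
    )
```

Pinning the signature in the prompt is what gives every generated email the consistent branding the workflow promises.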

By Avkash Kakdiya
389

Auto-categorize blog posts with OpenAI GPT-4, GitHub, and Google Sheets for Astro/Next.js

Automatically Assign Categories and Tags to Blog Posts with AI

This workflow streamlines your content organization process by automatically analyzing new blog posts in your GitHub repository and assigning appropriate categories and tags using OpenAI. It compares new posts against existing entries in a Google Sheet, updates the metadata for each new article, and records the suggested tags and categories for review, all in one automated pipeline.

Who’s It For
- Content creators and editors managing a static website (e.g., Astro or Next.js) who want AI-driven tagging.
- SEO specialists seeking consistent metadata and topic organization.
- Developers or teams managing a Markdown-based blog stored in GitHub who want to speed up post curation.

How It Works
1. Form Trigger – starts the process manually with a form that initiates article analysis.
2. Get Data from Google Sheets – retrieves existing post records to prevent duplicate analysis.
3. Compare GitHub and Google Sheets – lists all .md or .mdx blog posts from the GitHub repository (piotr-sikora.com/src/content/blog/pl/) and identifies new posts not yet analyzed.
4. Check New Repo Files – uses a Code node to filter only unprocessed files for AI tagging.
5. Switch Node – if there are no new posts, the workflow stops and shows a confirmation message; if new posts exist, it continues to the next step.
6. Get Post Content from GitHub – downloads the content of each new article.
7. AI Agent (LangChain + OpenAI GPT-4.1-mini) – reads each post’s frontmatter (--- section) and body, suggests new categories and tags based on the article’s topic, and returns a JSON object with the proposed updates (Structured Output Parser).
8. Append to Google Sheets – logs the results, including the file name, existing tags and categories, and proposed tags and categories (AI suggestions).
9. Completion Message – displays a success message confirming the categorization process has finished.

Requirements
- GitHub account with repository access to your website content.
- Google Sheets connection for storing metadata suggestions.
- OpenAI account (credential stored in openAiApi).

How to Set Up
1. Connect your GitHub, Google Sheets, and OpenAI credentials in n8n.
2. Update the GitHub repository path to match your project (e.g., src/content/blog/en/).
3. In Google Sheets, create the columns: FileName, Categories, Proposed Categories, Tags, Proposed Tags.
4. Adjust the AI model or prompt text if you want different tagging behavior.
5. Run the workflow manually using the Form Trigger node.

How to Customize
- Swap OpenAI GPT-4.1-mini for another LLM (e.g., Claude or Gemini) via the LangChain node.
- Modify the prompt in the AI Agent to adapt the categorization style or tone.
- Add a GitHub commit node if you want AI-updated metadata written back to files automatically.
- Use the Schedule Trigger node to automate this process daily.

Important Notes
- All API keys and credentials are securely stored; there are no hardcoded keys.
- The workflow includes multiple sticky notes explaining the repository setup, file retrieval and AI tagging, and the Google Sheet data structure.
- It uses a LangChain memory buffer to improve contextual consistency across multiple analyses.

Summary
This workflow automates metadata management for blogs or documentation sites by combining GitHub content, AI categorization, and Google Sheets tracking. With it, you can easily maintain consistent tags and categories across dozens of articles, boosting SEO, readability, and editorial efficiency without manual tagging.
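Step 7 above hinges on splitting each post into its frontmatter (the --- section) and body before the AI suggests tags. A minimal sketch of that split, handling only simple key: value frontmatter lines (nested YAML would need a real parser):

```python
def split_frontmatter(markdown: str) -> tuple[dict[str, str], str]:
    """Split a blog post into its frontmatter dict and body text.
    Handles simple `key: value` lines only; nested YAML is out of scope."""
    meta: dict[str, str] = {}
    if not markdown.startswith("---"):
        return meta, markdown  # no frontmatter section at all
    _, frontmatter, body = markdown.split("---", 2)
    for line in frontmatter.strip().splitlines():
        if ":" in line:
            key, _, value = line.partition(":")
            meta[key.strip()] = value.strip()
    return meta, body.lstrip("\n")
```

The AI Agent would receive both parts: the existing meta for comparison and the body for topic analysis.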

By Piotr Sikora
208

Convert POML to AI-Ready Prompts & Chat Messages with Zero Dependencies

POML → Prompt/Messages (No-Deps)

What this does
Turns POML markup into either a single Markdown prompt or chat-style messages[], using a zero-dependency n8n Code node. It supports variable substitution (via context), basic components (headings, lists, code, images, tables, line breaks), and optional schema-driven validation using componentSpec + attributeSpec.

Credits
Created by Real Simple Solutions as an n8n-template-friendly POML compiler (no dependencies) aiming for full POML feature parity. View more of our templates here.

Who’s it for
Teams who author prompts in POML and want a template-safe way to turn them into either a single Markdown prompt or chat-style messages, without installing external modules. Works on n8n Cloud and self-hosted.

What it does
This workflow converts POML into:
- prompt (Markdown) for single-shot models, or
- messages[] (system|user|assistant) for chat APIs when speakerMode is true.
It supports variable substitution via a context object ({{dot.path}}), lists, headings, code blocks, images (incl. base64 → data: URL), tables from JSON (records/columns), and basic message components.

How it works
1. Set (Specs & Context): provide componentSpec (allowed attrs per tag), attributeSpec (typing/coercion), and an optional context.
2. Code (POML → Prompt/Messages): a zero-dependency compiler parses the POML and emits prompt or messages[].
> Add a yellow Sticky Note that includes this description and any setup links. Use additional neutral sticky notes to explain each step.

How to set up
1. Import the template.
2. Open the first Set node and paste your componentSpec, attributeSpec, and context (examples included).
3. In the Code node, choose:
- speakerMode: true to get messages[], or false for a single prompt.
- listStyle: dash | star | plus | decimal | latin.
4. Run → inspect prompt/messages in the output.

Requirements
No credentials or community nodes. Works without external libraries (template-compliant).

How to customize
- Add message tags (<system-msg>, <user-msg>, <ai-msg>) in your POML when using speakerMode: true.
- Extend componentSpec/attributeSpec to validate or coerce additional tags/attributes.
- Preformat arrays in context (e.g., bulleted, csv) for display, or add a small Set node to build them on the fly.
- Rename nodes and keep all user-editable fields grouped in the first Set node.

Security & best practices
- Never hardcode API keys in nodes.
- Remove any personal IDs before publishing.
- Keep your Sticky Note(s) up to date and instructional.
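The {{dot.path}} variable substitution described above is the simplest part of such a compiler to illustrate. A Python sketch of the idea (the actual template's Code node is JavaScript; this only mirrors the behavior, including leaving unresolved placeholders intact):

```python
import re
from typing import Any

# Matches {{ dot.path }} placeholders, tolerating surrounding whitespace.
PLACEHOLDER = re.compile(r"\{\{\s*([\w.]+)\s*\}\}")

def substitute(template: str, context: dict[str, Any]) -> str:
    """Resolve {{dot.path}} placeholders against a nested context dict.
    Unknown paths are left as-is rather than raising or emitting blanks."""
    def resolve(match: re.Match) -> str:
        value: Any = context
        for part in match.group(1).split("."):
            if not isinstance(value, dict) or part not in value:
                return match.group(0)  # leave unresolved placeholder intact
            value = value[part]
        return str(value)
    return PLACEHOLDER.sub(resolve, template)
```

Leaving unknown paths untouched makes missing context entries visible in the output instead of silently disappearing.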

By RealSimple Solutions
149