
Telegram bot with Supabase memory and OpenAI assistant integration

Mark Shcherbakov
2/3/2026

Video Guide

I prepared a detailed guide that shows the whole process of building this AI bot, from the simplest version to the most complex one in the template.


Who is this for?

This workflow is ideal for developers, chatbot enthusiasts, and businesses looking to build a dynamic Telegram bot with memory capabilities. The bot leverages OpenAI's assistant to interact with users and stores user data in Supabase for personalized conversations.

What problem does this workflow solve?

Many simple chatbots lack context awareness and user memory. This workflow solves that by integrating Supabase to keep track of user sessions (via telegram_id and openai_thread_id), allowing the bot to maintain continuity and context in conversations, leading to a more human-like and engaging experience.

What this workflow does

This Telegram bot template connects with OpenAI to answer user queries while storing and retrieving user information from a Supabase database. The memory component ensures that the bot can reference past interactions, making it suitable for use cases such as customer support, virtual assistants, or any application where context retention is crucial.

  1. Receive New Message: The bot listens for incoming messages from users in Telegram.
  2. Check User in Database: The workflow checks whether the user already exists in the Supabase database, using the telegram_id.
  3. Create New User (if necessary): If the user does not exist, a new record is created in Supabase with the telegram_id and a unique openai_thread_id.
  4. Start or Continue Conversation with OpenAI: Based on the user's context, the bot either creates a new thread or continues an existing one using the stored openai_thread_id (steps 2–4 are sketched in code after this list).
  5. Merge Data: User-specific data and conversation context are merged.
  6. Send and Receive Messages: The message is sent to OpenAI, and the response is received and processed.
  7. Reply to User: The bot sends OpenAI's response back to the user in Telegram.
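For orientation, here is a minimal sketch of steps 2–4 outside n8n, assuming the telegram_users table from the Setup section below, the @supabase/supabase-js and openai Node packages, and SUPABASE_URL, SUPABASE_KEY, and OPENAI_API_KEY environment variables. The function name getOrCreateThreadId is illustrative only; in the workflow itself the same logic lives in the Supabase and OpenAI nodes.

import { createClient } from "@supabase/supabase-js";
import OpenAI from "openai";

const supabase = createClient(process.env.SUPABASE_URL!, process.env.SUPABASE_KEY!);
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

// Look up the Telegram user; create a row (and a new OpenAI thread) if none exists yet.
async function getOrCreateThreadId(telegramId: number): Promise<string> {
  const { data, error } = await supabase
    .from("telegram_users")
    .select("openai_thread_id")
    .eq("telegram_id", telegramId)
    .maybeSingle();
  if (error) throw error;

  // Known user: continue the stored conversation thread.
  if (data?.openai_thread_id) return data.openai_thread_id;

  // New user: create an OpenAI thread and remember it alongside the telegram_id.
  const thread = await openai.beta.threads.create();
  const { error: insertError } = await supabase
    .from("telegram_users")
    .insert({ telegram_id: telegramId, openai_thread_id: thread.id });
  if (insertError) throw insertError;

  return thread.id;
}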

Setup

  1. Create a Telegram bot using BotFather and obtain the bot token.
  2. Set up Supabase:
    1. Create a new project and generate a SUPABASE_URL and SUPABASE_KEY.
    2. Create a new table named telegram_users with the following SQL query:
create table
  public.telegram_users (
    id uuid not null default gen_random_uuid (),
    date_created timestamp with time zone not null default (now() at time zone 'utc'::text),
    telegram_id bigint null,
    openai_thread_id text null,
    constraint telegram_users_pkey primary key (id)
  ) tablespace pg_default;
  3. OpenAI Setup:
    1. Create an OpenAI assistant and obtain the OPENAI_API_KEY.
    2. Customize your assistant’s personality or use cases according to your requirements.
  4. Environment Configuration in n8n:
    1. Configure the Telegram, Supabase, and OpenAI nodes with the appropriate credentials.
    2. Set up triggers for receiving messages and handling conversation logic.
    3. Set the OpenAI assistant ID in the "OPENAI - Run assistant" node (a minimal sketch of the assistant call follows this list).
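The sketch below shows roughly what the "OPENAI - Run assistant" step does, using the openai Node SDK's beta Assistants helpers: post the user's message to the stored thread, run the assistant, and read back the newest reply. The OPENAI_ASSISTANT_ID environment variable and the askAssistant name are assumptions for illustration, not part of the template.

import OpenAI from "openai";

const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

// Add the user's Telegram message to the thread, run the assistant, and return the reply text.
async function askAssistant(threadId: string, userText: string): Promise<string> {
  await openai.beta.threads.messages.create(threadId, { role: "user", content: userText });

  // createAndPoll waits until the run reaches a terminal state (completed, failed, ...).
  const run = await openai.beta.threads.runs.createAndPoll(threadId, {
    assistant_id: process.env.OPENAI_ASSISTANT_ID!,
  });
  if (run.status !== "completed") throw new Error(`Assistant run ended with status ${run.status}`);

  // Messages are returned newest first, so the first text part is the assistant's answer.
  const messages = await openai.beta.threads.messages.list(threadId, { limit: 1 });
  const part = messages.data[0]?.content.find((c) => c.type === "text");
  return part && part.type === "text" ? part.text.value : "";
}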

Telegram Bot with Supabase Memory and OpenAI Assistant Integration

This n8n workflow creates a sophisticated Telegram bot that leverages Supabase for memory management and integrates with OpenAI's Assistant API for intelligent conversational responses. It provides a robust framework for building interactive and context-aware Telegram bots.

What it does

This workflow automates the following steps:

  1. Listens for Telegram Messages: It acts as a webhook, waiting for incoming messages from a configured Telegram bot.
  2. Initial Message Handling: When a message is received, it processes the message content and user information.
  3. Conditional Logic (Placeholder): The If node marks a point where conditional logic could be introduced (e.g., to differentiate between commands, specific keywords, or user states). It currently has no configured conditions or connections to other nodes, leaving it as an area to expand to suit your bot's requirements.
  4. Data Merging (Placeholder): The Merge node is included but not connected, indicating a potential future need to combine data from different branches or sources before further processing.
  5. Supabase Interaction (Placeholder): A Supabase node is available, suggesting the intention to use Supabase for data storage, such as user session data, conversation history, or custom bot settings. However, it is not currently connected to any part of the active flow.
  6. HTTP Request (Placeholder): An HTTP Request node is included but not connected. This would typically be used to interact with external APIs, such as the OpenAI Assistant API, to send user messages and receive AI-generated responses.
  7. Responds via Telegram: Finally, it sends a message back to the user through the Telegram bot (a rough sketch of these last two calls follows the list).
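To make the last two steps concrete, here is a rough sketch of the calls those nodes would end up making: the HTTP Request node starting an assistant run against the Assistants REST API (note the OpenAI-Beta header), and the Telegram node replying via the Bot API's sendMessage method. The TELEGRAM_BOT_TOKEN variable and the function names are assumptions for illustration.

// Rough equivalent of the (disconnected) HTTP Request node: start an assistant run on a thread.
async function startRun(threadId: string, assistantId: string): Promise<unknown> {
  const res = await fetch(`https://api.openai.com/v1/threads/${threadId}/runs`, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.OPENAI_API_KEY}`,
      "Content-Type": "application/json",
      "OpenAI-Beta": "assistants=v2", // the Assistants REST API requires this beta header
    },
    body: JSON.stringify({ assistant_id: assistantId }),
  });
  return res.json();
}

// Rough equivalent of the final Telegram node: send the assistant's reply back to the chat.
async function replyToUser(chatId: number, text: string): Promise<void> {
  await fetch(`https://api.telegram.org/bot${process.env.TELEGRAM_BOT_TOKEN}/sendMessage`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ chat_id: chatId, text }),
  });
}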

Prerequisites/Requirements

To use this workflow, you will need:

  • n8n Instance: A running instance of n8n.
  • Telegram Bot Token: A Telegram bot created via BotFather and its associated API token.
  • Supabase Account: An active Supabase project for database interactions (though not actively connected in the provided JSON, its presence suggests its intended use).
  • OpenAI API Key: An OpenAI API key to interact with the OpenAI Assistant API (implied by the template, and configured in the HTTP Request node).

Setup/Usage

  1. Import the workflow: Download the provided JSON and import it into your n8n instance.
  2. Configure Telegram Trigger:
    • Set up a Telegram credential with your bot's API token.
    • Activate the "Telegram Trigger" node. n8n generates a webhook URL and registers it with Telegram when the workflow is activated; if you ever need to set the webhook manually, use the Bot API's setWebhook method rather than BotFather (see the sketch after this list).
  3. Configure Telegram Node:
    • Set up a Telegram credential with your bot's API token.
    • Ensure the "Telegram" node is configured to send messages to the correct chat ID.
  4. Configure Supabase (Optional, for full functionality):
    • Set up a Supabase credential.
    • Connect the Supabase node into the workflow and configure it to store/retrieve data as needed (e.g., user messages, conversation history).
  5. Configure OpenAI Assistant (Optional, for full functionality):
    • Connect the HTTP Request node into the workflow.
    • Configure it to make requests to the OpenAI Assistant API, including your OpenAI API key and the necessary payload for sending user messages and retrieving assistant responses.
  6. Activate the workflow: Once all credentials and nodes are configured, activate the workflow to start your Telegram bot.
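If you do need to register the webhook by hand rather than letting the Telegram Trigger do it on activation, a minimal sketch using the Bot API's setWebhook method (with an assumed TELEGRAM_BOT_TOKEN variable) looks like this:

// Manual webhook registration via the Telegram Bot API (normally handled by n8n on activation).
async function registerWebhook(webhookUrl: string): Promise<void> {
  const res = await fetch(
    `https://api.telegram.org/bot${process.env.TELEGRAM_BOT_TOKEN}/setWebhook`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ url: webhookUrl }),
    }
  );
  const body = await res.json();
  if (!body.ok) throw new Error(`setWebhook failed: ${body.description}`);
}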

Note: The If, Merge, Supabase, and HTTP Request nodes are currently disconnected placeholders. To achieve the full functionality of a Telegram bot with Supabase memory and OpenAI Assistant integration, you will need to connect and configure these nodes according to your specific requirements. The If and Merge nodes offer flexibility for adding complex logic and data manipulation.
