
n8n Automated Twitter Reply Bot Workflow

For the latest version, check: dziura.online/automation

The latest documentation can be found here

You must have the Apify community node installed before pasting the JSON into your workflow.

Overview

This n8n workflow creates an intelligent Twitter/X reply bot that automatically scrapes tweets based on keywords or communities, analyzes them using AI, generates contextually appropriate replies, and posts them while avoiding duplicates. The bot operates on a schedule with intelligent timing and retry mechanisms.

Key Features

  • Automated tweet scraping from Twitter/X using Apify actors

  • AI-powered reply generation using LLM (Large Language Model)

  • Duplicate prevention via MongoDB storage

  • Smart scheduling with timezone awareness and natural posting patterns

  • Retry mechanism with failure tracking

  • Telegram notifications for status updates

  • Manual trigger option via Telegram command

Required Credentials & Setup

1. Telegram Bot

  • Create a bot via @BotFather on Telegram

  • Get your Telegram chat ID to receive status messages

  • Credential needed: Telegram account (Bot token)

2. MongoDB Database

  • Set up a MongoDB database to store replied tweets and prevent duplicates

  • Create a collection (default name: collection_name)

  • Credential needed: MongoDB account (Connection string)

  • Tutorial: MongoDB Connection Guide

3. Apify Account

  • Sign up at apify.com — the workflow runs Apify actors to scrape tweets

  • Install the Apify community node in your n8n instance (see the note at the top)

  • Credential needed: Apify account (API token)

4. OpenRouter (LLM Provider)

  • Sign up at OpenRouter.ai

  • Used for AI-powered tweet analysis and reply generation

  • Model used: x-ai/grok-3 (configurable)

  • Credential needed: OpenRouter account (API key)

5. Twitter/X API

  • Set up a developer account at developer.x.com

  • Note: Free tier limited to ~17 posts per day

  • Credential needed: X account (OAuth2 credentials)

Workflow Components

Trigger Nodes

1. Schedule Trigger

  • Purpose: Runs automatically every 20 minutes

  • Smart timing: Only active between 7 AM and 11:59 PM (configurable timezone)

  • Randomization: Built-in probability gate (~28% execution chance per run) to mimic natural posting patterns — see the sketch below
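
A minimal Code-node sketch of this gate, assuming the ~28% probability and the Europe/Kyiv timezone mentioned in the configuration guide below (variable names are illustrative):

```javascript
// Illustrative n8n Code node: gate each scheduled run by a local time window
// and a ~28% probability to mimic natural posting patterns.
const tz = 'Europe/Kyiv';
const hour = Number(new Intl.DateTimeFormat('en-GB', {
  timeZone: tz, hour: '2-digit', hourCycle: 'h23',
}).format(new Date()));

const withinWindow = hour >= 7 && hour <= 23;  // 7 AM - 11:59 PM
const lucky = Math.random() < 0.28;            // ~28% chance per 20-minute run

// An If node downstream can stop the run when shouldRun is false.
return [{ json: { shouldRun: withinWindow && lucky } }];
```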

2. Manual Trigger

  • Purpose: Manual execution for testing

3. Telegram Trigger

  • Purpose: Manual execution via /reply command in Telegram

  • Usage: Send /reply to your bot to trigger the workflow manually

Data Processing Flow

1. MongoDB Query (Find documents)

  • Purpose: Retrieves previously replied tweet IDs to avoid duplicates

  • Collection: collection_name (configure to match your setup)

  • Projection: Only fetches the tweet_id field for efficiency (example below)
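
For reference, a MongoDB projection document that fetches only the tweet ID might look like this (a sketch; the option's name in the node UI may differ):

```json
{ "tweet_id": 1, "_id": 0 }
```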

2. Data Aggregation (Aggregate1)

  • Purpose: Consolidates tweet IDs into a single array for filtering

3. Keyword/Community Selection (Keyword/Community List)

  • Purpose: Defines search terms and communities

  • Configuration: Edit the JSON to include your keywords and Twitter community IDs

Format:

```json
{
  "keyword_community_list": [
    "SaaS",
    "Entrepreneur",
    "1488663855127535616"
  ],
  "failure": 0
}
```

The third entry is a community ID (a 19-digit number); plain strings are treated as keywords.

4. Random Selection (Randomized community, keyword)

  • Purpose: Randomly selects one item from the list to ensure variety
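
A one-line Code-node sketch of the selection, assuming the node runs once per item and reads the list defined above:

```javascript
// Pick one entry at random from the configured keyword/community list.
const list = $json.keyword_community_list;
const picked = list[Math.floor(Math.random() * list.length)];
return [{ json: { picked, failure: $json.failure } }];
```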

5. Routing Logic (If4)

  • Purpose: Determines whether to use Community search or Keyword search

  • Logic: Uses a regex to detect 19-digit community IDs vs. keywords, as sketched below
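
The check itself can be a simple regex test (a sketch; `picked` is the hypothetical field from the selection sketch above):

```javascript
// A 19-digit numeric string is treated as a community ID;
// anything else is treated as a search keyword.
const isCommunityId = /^\d{19}$/.test($json.picked);
return [{ json: { ...$json, isCommunityId } }];
```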

Tweet Scraping (Apify Actors)

Community Search Actor

  • Actor: api-ninja/x-twitter-community-search-post-scraper

  • Purpose: Scrapes tweets from specific Twitter communities

Configuration:

```json
{
  "communityIds": ["COMMUNITY_ID"],
  "numberOfTweets": 40
}
```

Search Actor

  • Actor: api-ninja/x-twitter-advanced-search

  • Purpose: Scrapes tweets based on keywords

Configuration:

```json
{
  "contentLanguage": "en",
  "engagementMinLikes": 10,
  "engagementMinReplies": 5,
  "numberOfTweets": 20,
  "query": "KEYWORD",
  "timeWithinTime": "2d",
  "tweetTypes": ["original"],
  "usersBlueVerifiedOnly": true
}
```

Filtering System (Community filter)

The workflow applies multiple filters to ensure high-quality replies (a sketch of equivalent logic follows the list):

  • Text length: >60 characters (substantial content)

  • Follower count: >100 followers (audience reach)

  • Engagement: >10 likes, >3 replies (proven engagement)

  • Language: English only

  • Views: >100 views (visibility)

  • Duplicate check: Not previously replied to

  • Recency: Within 2 days (configurable in actor settings)
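
A minimal Code-node sketch of equivalent filter logic. The tweet field names (text, likes, replies, lang, views, author.followers, tweet_id, created_at) are assumptions and must be matched to your actor's actual output:

```javascript
// Illustrative filter pass over scraped tweets (run once for all items).
// The aggregated ID list comes from the Aggregate1 node; the field name
// tweet_id is an assumption.
const repliedIds = $('Aggregate1').first().json.tweet_id ?? [];
const twoDaysMs = 2 * 24 * 60 * 60 * 1000;

return items.filter(({ json: t }) =>
  (t.text ?? '').length > 60 &&                    // substantial content
  (t.author?.followers ?? 0) > 100 &&              // audience reach
  t.likes > 10 && t.replies > 3 &&                 // proven engagement
  t.lang === 'en' &&                               // English only
  t.views > 100 &&                                 // visibility
  !repliedIds.includes(t.tweet_id) &&              // duplicate check
  Date.now() - new Date(t.created_at).getTime() < twoDaysMs  // recency
);
```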

AI-Powered Reply Generation

LLM Chain (Basic LLM Chain)

  • Purpose: Analyzes filtered tweets and generates contextually appropriate replies

  • Model: Grok-3 via OpenRouter (configurable)

  • Features:

    • Engagement potential scoring

    • User authority analysis

    • Timing optimization

    • Multiple reply styles (witty, informative, supportive, etc.)

    • <100 character limit for optimal engagement

Output Parser (Structured Output Parser)

  • Purpose: Ensures consistent JSON output format

Schema:

```json
{
  "selected_tweet_id": "tweet_id_here",
  "screen_name": "author_screen_name",
  "reply": "generated_reply_here"
}
```

Posting & Notification System

Twitter Posting (Create Tweet)

  • Purpose: Posts the generated reply as a Twitter response

  • Error handling: Catches API limitations and rate limits

Status Notifications

  • Success: Notifies via Telegram with tweet link and reply text

  • Failure: Notifies about API limitations or errors

  • Format: HTML-formatted messages with clickable links
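
A success notification might look like this (a sketch; the exact wording is configurable in the Telegram node, which supports HTML parse mode):

```html
✅ <b>Reply posted</b>
<a href="https://x.com/SCREEN_NAME/status/TWEET_ID">View tweet</a>
Reply: "generated_reply_here"
```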

Database Storage (Insert documents)

  • Purpose: Saves successful replies to prevent future duplicates

  • Fields stored: tweet_id, screen_name, reply, tweet_url, timestamp
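
A stored document might look like this (a sketch based on the fields listed above; the ID and URL are placeholders):

```json
{
  "tweet_id": "1234567890123456789",
  "screen_name": "author_screen_name",
  "reply": "generated_reply_here",
  "tweet_url": "https://x.com/author_screen_name/status/1234567890123456789",
  "timestamp": "2026-02-03T12:00:00Z"
}
```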

Retry Mechanism

The workflow includes intelligent retry logic:

Failure Counter (If5, Increment Failure Counter1)

  • Logic: If no suitable tweets found, increment failure counter

  • Retry limit: Maximum 3 retries with different random keywords

  • Wait time: 3-second delay between retries (a sketch of the counter follows)
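
A sketch of the increment step (the failure field follows the keyword-list format shown earlier):

```javascript
// Bump the failure counter; If5 loops back with a new random keyword
// while the counter stays within the retry limit.
const failure = ($json.failure ?? 0) + 1;
return [{ json: { ...$json, failure } }];
```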

Final Failure Notification

  • Trigger: After 4 failed attempts

  • Action: Sends Telegram notification about unsuccessful search

  • Recovery: Manual retry available via /reply command

Configuration Guide

Essential Settings to Modify

  1. MongoDB Collection Name: Update collection_name in MongoDB nodes

  2. Telegram Chat ID: Replace 11111111111 with your actual chat ID

  3. Keywords/Communities: Edit the list in Keyword/Community List node

  4. Timezone: Update timezone in Code node (currently set to Europe/Kyiv)

  5. Actor Selection: Enable only one actor (Community OR Search) based on your needs

Filter Customization

Adjust filters in Community filter node based on your requirements:

  • Minimum engagement thresholds

  • Text length requirements

  • Time windows

  • Language preferences

LLM Customization

Modify the AI prompt in Basic LLM Chain to:

  • Change reply style and tone

  • Adjust engagement criteria

  • Modify scoring algorithms

  • Set different character limits

Usage Tips

  1. Start small: Begin with a few high-quality keywords/communities

  2. Monitor performance: Use Telegram notifications to track success rates

  3. Adjust filters: Fine-tune based on the quality of generated replies

  4. Respect limits: Twitter's free tier allows ~17 posts/day

  5. Test manually: Use /reply command for testing before scheduling

Troubleshooting

Common Issues

  1. No tweets found: Adjust filter criteria or check keywords

  2. API rate limits: Reduce posting frequency or upgrade Twitter API plan

  3. MongoDB connection: Verify connection string and collection name

  4. Apify quota: Monitor Apify usage limits

  5. LLM failures: Check OpenRouter credits and model availability

Best Practices

  • Monitor your bot's replies for quality and appropriateness

  • Regularly update keywords to stay relevant

  • Keep an eye on engagement metrics

  • Adjust timing based on your audience's activity patterns

  • Maintain a balanced posting frequency to avoid appearing spammy


This workflow provides a comprehensive solution for automated, intelligent Twitter engagement while maintaining quality and avoiding spam-like behavior.

n8n Automated Telegram Reply Bot Workflow

This n8n workflow automates responses to incoming Telegram messages. It acts as a bot that can receive messages, process them using an AI language model, and then send a structured reply back to the user. It also includes logic for storing conversation history in MongoDB and handling potential errors.

What it does

This workflow performs the following key steps:

  1. Listens for Telegram Messages: It is triggered whenever a new message is received by a configured Telegram bot.
  2. Stores Conversation History: The incoming message is immediately logged into a MongoDB database to maintain a record of interactions.
  3. Processes with AI: The message is then passed to a "Basic LLM Chain" (Language Model Chain) which uses an "OpenRouter Chat Model" to generate a response.
  4. Parses AI Output: A "Structured Output Parser" processes the AI's response, likely extracting specific data or formatting it for the reply.
  5. Filters for Valid Responses: An "If" node checks if the AI-generated response meets certain criteria.
  6. Prepares and Sends Telegram Reply:
    • If the response is valid, an "Edit Fields (Set)" node prepares the message content.
    • The workflow then sends the AI-generated reply back to the user via Telegram.
    • The sent message is also logged in MongoDB.
  7. Handles Invalid Responses: If the AI response is not valid, the workflow performs a "No Operation" (does nothing) for that specific branch.
  8. Manual Trigger for Testing: Includes a manual trigger for easy testing and debugging of the workflow.
  9. Scheduled Trigger (Placeholder): Contains a scheduled trigger, which is currently not connected but could be used for periodic tasks or maintenance.
  10. Sub-workflow Execution (Placeholder): Includes nodes for executing and triggering sub-workflows, suggesting modularity, though not actively used in the main flow.
  11. Code Node (Placeholder): A "Code" node is present, allowing for custom JavaScript logic if needed.
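
As a concrete illustration of step 2, a logged conversation document might look like this (field names are assumptions; the workflow's actual schema may differ):

```json
{
  "chat_id": 123456789,
  "direction": "incoming",
  "text": "Hello, bot!",
  "timestamp": "2026-02-03T12:00:00Z"
}
```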

Prerequisites/Requirements

To use this workflow, you will need:

  • n8n Instance: A running instance of n8n.
  • Telegram Bot: A Telegram bot token and the bot configured to receive messages.
  • MongoDB Account/Instance: Access to a MongoDB database for storing conversation history.
  • OpenRouter Account: An OpenRouter API key for the AI chat model.
  • LangChain Nodes: Ensure the @n8n/n8n-nodes-langchain package is installed in your n8n instance.

Setup/Usage

  1. Import the Workflow: Import the provided JSON into your n8n instance.
  2. Configure Credentials:
    • Set up your Telegram Bot credential with your bot token.
    • Configure your MongoDB credential with your database connection details.
    • Set up your OpenRouter Chat Model credential with your OpenRouter API key.
  3. Activate the Workflow: Enable the workflow to start listening for incoming Telegram messages.
  4. Test: Send a message to your Telegram bot to test the automated reply functionality. You can also use the "Manual Trigger" node to simulate an incoming message for testing purposes.
  5. Review MongoDB: Check your MongoDB database to ensure conversation history is being logged correctly.
