
X (Twitter) brand sentiment analysis with Gemini AI & Slack alerts

Saeculum Solutions

This workflow is the AI analysis and alerting engine for a complete social media monitoring system. It's designed to work with data scraped from X (formerly Twitter) using a tool like the Apify Tweet Scraper, which logs the data into a Google Sheet. The workflow then automatically analyzes new tweets with Google Gemini and sends tailored alerts to Slack.

How it works

This workflow automates the analysis and reporting part of your social media monitoring:

  • Tweet Hunting: It finds tweets matching the query entered in the Set node and passes the data to the Google Sheet.
  • Fetches New Tweets: It gets all new rows from your Google Sheet that haven't been processed yet (it looks for "Notmarked" in the 'action taken' column).
  • Prepares for AI: It combines the data from all new tweets into a single, clean prompt for the AI to analyze.
  • AI Analysis with Gemini: It sends the compiled data to Google Gemini, asking for a full summary report and a separate, machine-readable JSON list of any urgent items.
  • Splits the Response: The workflow separates the AI's text summary from the JSON data used for urgent alerts (see the sketch after this list).
  • Sends Notifications:
    • The high-level summary is sent to a general Slack channel (e.g., #brand-alerts).
    • Each urgent item is sent as a separate, detailed alert to a high-priority Slack channel (e.g., #urgent).
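
The delimiter and field names below are assumptions rather than part of the original workflow: a minimal Code-node sketch of the "Splits the Response" step, assuming the Gemini prompt asks for the urgent items as a raw JSON array after a marker line such as ---URGENT_JSON---, could look like this:

```javascript
// n8n Code node ("Run Once for All Items"): split Gemini's reply into
// a text summary and a JSON array of urgent items.
// Assumptions: the upstream LLM node exposes its reply as `json.text`, and
// the prompt asks Gemini to append the urgent items after a marker line.
const reply = $input.first().json.text ?? '';

const MARKER = '---URGENT_JSON---';
const markerIndex = reply.indexOf(MARKER);

// Everything before the marker is treated as the human-readable summary report.
const summary = (markerIndex === -1 ? reply : reply.slice(0, markerIndex)).trim();

let urgentItems = [];
if (markerIndex !== -1) {
  try {
    urgentItems = JSON.parse(reply.slice(markerIndex + MARKER.length).trim());
  } catch (error) {
    // If Gemini returns malformed JSON, still send the summary and skip urgent alerts.
    urgentItems = [];
  }
}

return [
  {
    json: {
      summary,              // posted to the general channel (e.g. #brand-alerts)
      urgent: urgentItems,  // fanned out item by item to the urgent channel
      urgentCount: urgentItems.length,
    },
  },
];
```

Downstream, the summary Slack node can reference the summary field, while the urgent branch splits the urgent array and posts each item separately.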

Set up steps

It should take about 5-10 minutes to get this workflow running.

  1. Prerequisite - Data Source: Ensure you have a Google Sheet being populated with tweet data. For full automation, set up a new Google Sheet with the same structure for storing the scraped tweets and run the Tweet Scraper on a schedule.
  2. Configure Credentials: Make sure you have credentials set up in your n8n instance for Google Sheets, Google Gemini (PaLM) API, and Slack.
  3. Google Sheets Node ("Get row(s) in sheet"):
    • Select your Google Sheet containing the tweet data.
    • Choose the specific sheet name from the dropdown.
    • Ensure your sheet has a column named action taken so the filter works correctly (a Code-node version of this filter is sketched after this list).
  4. Google Gemini Chat Model Node: Select your Google Gemini credential from the dropdown.
  5. Slack Nodes ("Send a message" & "Send a message1"):
    • In the first Slack node, choose the channel for the summary report.
    • In the second Slack node, choose the channel for urgent alerts.
  6. Save and Activate: Once configured, save your workflow and turn it on!
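
If you'd rather apply the "Notmarked" filter in a Code node instead of (or in addition to) the Google Sheets node's built-in filter, a minimal sketch could look like the following; the column name and marker value mirror the ones described above, so adjust them if your sheet differs:

```javascript
// n8n Code node ("Run Once for All Items"): keep only rows that have not
// been processed yet, i.e. whose "action taken" column still says "Notmarked".
const unprocessed = $input.all().filter(
  (item) => String(item.json['action taken'] ?? '').trim() === 'Notmarked'
);

// Nothing new to analyze, so end the run quietly.
if (unprocessed.length === 0) {
  return [];
}

return unprocessed;
```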

n8n Workflow: X (Twitter) Brand Sentiment Analysis with Gemini AI and Slack Alerts

This n8n workflow automates sentiment analysis for tweets that mention a specific brand, using Google Gemini AI, and sends Slack alerts for tweets with negative sentiment. It's designed to help you quickly identify and respond to critical feedback or mentions of your brand on X (Twitter).

What it does

This workflow simplifies brand monitoring and sentiment analysis by:

  1. Fetching Brand Mentions: Periodically retrieves recent tweets mentioning a specified brand from a Google Sheet.
  2. Processing Tweets in Batches: Processes the fetched tweets in manageable batches to optimize API calls.
  3. Analyzing Sentiment with Google Gemini: Uses the Google Gemini AI model to determine the sentiment (positive, negative, or neutral) of each tweet.
  4. Filtering Negative Sentiment: Identifies tweets that have been classified with a negative sentiment.
  5. Preparing Slack Alerts: Formats the negative-sentiment tweets into a clear, actionable message for Slack (see the sketch after this list).
  6. Sending Slack Alerts: Posts the formatted alerts to a designated Slack channel, ensuring your team is immediately aware of critical brand mentions.
  7. Rate Limiting: Includes a wait step to manage API call rates, preventing rate limit issues with external services.
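
As a rough illustration of steps 4 and 5, the sketch below filters for negative sentiment and builds the Slack message text in a single Code node. The field names (sentiment, Tweet Text, slackText) are assumptions based on the description above and may not match the workflow's actual node output:

```javascript
// n8n Code node ("Run Once for All Items"): keep only negative tweets and
// prepare the alert text that the Slack node will post.
// Adjust `sentiment` and `Tweet Text` to match your LLM output and sheet columns.
const negatives = $input.all().filter(
  (item) => String(item.json.sentiment ?? '').toLowerCase() === 'negative'
);

return negatives.map((item) => ({
  json: {
    slackText: [
      ':rotating_light: *Negative brand mention detected*',
      `> ${item.json['Tweet Text']}`,
      `Sentiment: ${item.json.sentiment}`,
    ].join('\n'),
  },
}));
```

The Slack node can then use the slackText field as its message body.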

Prerequisites/Requirements

To use this workflow, you will need:

  • n8n Instance: A running n8n instance.
  • Google Sheets Credential: An n8n credential configured for Google Sheets to read tweet data.
  • Google Gemini AI Credential: An n8n credential for Google Gemini (or a compatible LLM) to perform sentiment analysis.
  • Slack Credential: An n8n credential configured for Slack to send alerts.
  • Google Sheet: A Google Sheet containing the tweets you wish to analyze. The workflow expects a column named Tweet Text containing the tweet content.
  • X (Twitter) Data: This workflow assumes you have a mechanism (not included in this JSON) to populate the Google Sheet with X (Twitter) brand mentions.

Setup/Usage

  1. Import the Workflow:

    • Download the provided JSON file.
    • In your n8n instance, go to "Workflows" and click "New".
    • Click the three dots menu (...) and select "Import from JSON".
    • Paste the JSON content or upload the file.
  2. Configure Credentials:

    • Google Sheets: Update the "Google Sheets" node with your Google Sheets credential. Ensure the spreadsheet ID and sheet name are correctly configured to point to your tweet data.
    • Google Gemini Chat Model: Update the "Google Gemini Chat Model" node with your Google Gemini credential.
    • Slack: Update the "Slack" node with your Slack credential and specify the target channel for alerts.
  3. Customize AI Prompt (Optional):

    • The "Basic LLM Chain" node contains the prompt for the Google Gemini AI. You might want to adjust this prompt to fine-tune the sentiment analysis for your specific brand or industry.
  4. Activate the Workflow:

    • Once all credentials and configurations are set, activate the workflow.
    • The "Schedule Trigger" node is configured to run every hour, but you can adjust this frequency as needed.
  5. Monitor Slack:

    • Negative sentiment tweets will now be automatically posted to your configured Slack channel.
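
If you do customize the prompt (step 3), a starting point might look like the sketch below, written here as a Code node that builds one prompt per tweet. The brand name and the Tweet Text column are placeholders, and the Basic LLM Chain would then read the result via an expression such as {{ $json.prompt }}:

```javascript
// n8n Code node ("Run Once for All Items"): build a sentiment-analysis prompt
// for each tweet. "ACME Corp" and the "Tweet Text" column are placeholders.
return $input.all().map((item) => ({
  json: {
    prompt: `You are a brand-monitoring assistant for ACME Corp.
Classify the sentiment of the tweet below as exactly one word:
positive, negative, or neutral. Respond with only that word, in lowercase.

Tweet: ${item.json['Tweet Text']}`,
  },
}));
```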
