Retweet cleanup with scheduling for X/Twitter
Who’s it for
Social media managers, creators, and brand accounts that rely on retweets for reach but want an automated, hands-off cleanup after campaigns to keep profiles tidy and on-brand.
What it does / How it works
On a schedule, the workflow resolves your handle to a user ID, fetches recent tweets, keeps only the retweets, and safely unretweets them using batching and delays to respect rate limits. A dedicated CONFIG (Set Fields) node centralizes variables (e.g., target_username, max_results, batch_delay_minutes) so you can adjust behavior without touching the logic.
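A minimal sketch of the values the CONFIG (Set Fields) node carries (the names come from the list above; the defaults are illustrative):

```json
{
  "target_username": "your_handle",
  "max_results": 100,
  "batch_delay_minutes": 2
}
```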
API endpoints used
- GET /2/users/by/username/{username} – resolve handle → user ID
- GET /2/users/{id}/tweets?tweet.fields=created_at,referenced_tweets – fetch recent tweets (identify retweets via referenced_tweets.type === "retweeted")
- DELETE /2/users/{id}/retweets/{tweet_id} – unretweet
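For orientation, here is a minimal standalone sketch of the same three-call flow in JavaScript (Node 18+, outside n8n). It assumes an OAuth 2.0 user-context token with tweet.read, users.read, and tweet.write scopes in a hypothetical X_USER_TOKEN environment variable; in the workflow itself these calls are made by HTTP Request nodes using n8n credentials. The DELETE call here passes the original tweet's ID, following the API's source_tweet_id parameter.

```javascript
const API = "https://api.twitter.com/2";
const TOKEN = process.env.X_USER_TOKEN; // hypothetical env var; n8n uses its own credentials

async function x(method, path) {
  const res = await fetch(`${API}${path}`, {
    method,
    headers: { Authorization: `Bearer ${TOKEN}` },
  });
  if (!res.ok) throw new Error(`${method} ${path} -> ${res.status}`);
  return res.json();
}

async function cleanupRetweets(username) {
  // 1. Resolve handle -> user ID
  const { data: user } = await x("GET", `/users/by/username/${username}`);

  // 2. Fetch recent tweets, requesting referenced_tweets so retweets can be identified
  const { data: tweets = [] } = await x(
    "GET",
    `/users/${user.id}/tweets?max_results=100&tweet.fields=created_at,referenced_tweets`
  );

  // 3. Keep only retweets and unretweet them one by one
  for (const t of tweets) {
    const ref = (t.referenced_tweets || []).find((r) => r.type === "retweeted");
    if (ref) await x("DELETE", `/users/${user.id}/retweets/${ref.id}`);
  }
}
```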
Example response payloads
GET /2/users/by/username/{username}
{
"data": { "id": "2244994945", "name": "Twitter Dev", "username": "TwitterDev" }
}
GET /2/users/{id}/tweets (truncated)
{
"data": [
{
"id": "1760000000000000000",
"text": "RT @someone: …",
"referenced_tweets": [{ "type": "retweeted", "id": "1759999999999999999" }],
"created_at": "2025-01-15T09:10:11.000Z"
}
],
"meta": { "result_count": 20 }
}
DELETE /2/users/{id}/retweets/{tweet_id}
{ "data": { "retweeted": false } }
Use cases
- Brand hygiene: Auto-unretweet promos after 48–72h.
- Campaign cadence: Remove event retweets once the event ends.
- Feed freshness: Clear low-priority retweets on a rolling basis.
How to set up
- Open CONFIG (Set Fields) and replace placeholders: target_username = "your_handle", max_results = 100 (per fetch), batch_delay_minutes = 2 (throttle between batches).
- Connect X/Twitter credentials in n8n (no keys hard-coded in HTTP nodes).
- Run once with small values, verify logs, then enable the schedule.
> Optional enhancements: add a dead-letter path (Error Trigger → Set → Sheets/Email/Slack) and a notification node (e.g., Slack) for execution feedback.
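For the notification node, a possible Slack message body built from the Error Trigger's output looks like this; the field names reflect the Error Trigger's typical payload and may differ across n8n versions:

```text
⚠️ Retweet cleanup failed
Workflow:  {{ $json.workflow.name }}
Execution: {{ $json.execution.url }}
Error:     {{ $json.execution.error.message }}
```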
n8n Retweet Cleanup and Scheduling Workflow
This n8n workflow provides a robust solution for managing retweets on X (formerly Twitter). It's designed to automatically identify and delete specific retweets based on predefined criteria, with built-in error handling and notifications.
What it does
This workflow automates the following steps:
- Schedules Execution: The workflow is triggered on a recurring schedule (e.g., daily, hourly) to initiate the retweet cleanup process.
- Fetches Retweets: It makes an HTTP request to an external API endpoint to retrieve a list of retweets that need to be processed.
- Filters Retweets: It evaluates each fetched retweet against a condition. In this specific workflow, it checks if the retweet_id is not null (see the sketch after this list).
- Prepares Data for Deletion: For retweets that meet the filtering criteria, it transforms the data to isolate the retweet_id for the deletion process.
- Processes Retweets in Batches: It iterates through the filtered retweets, processing them in batches to manage API rate limits or system load.
- Deletes Retweets: For each valid retweet ID, it uses the X (formerly Twitter) node to delete the retweet.
- Introduces Delay: A short delay is introduced between retweet deletions to prevent hitting API rate limits.
- Handles Errors: If any part of the main workflow execution fails, an "Error Trigger" node is activated.
- Notifies on Error: Upon an error, it sends a notification to a specified Slack channel, providing details about the workflow execution failure.
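A minimal sketch of that filtering step as a Code node, assuming incoming items carry a retweet_id field as described above:

```javascript
// Keep only items that actually have a retweet_id to delete
return $input.all().filter((item) => item.json.retweet_id != null);
```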
Prerequisites/Requirements
- n8n Instance: A running n8n instance to import and execute the workflow.
- X (formerly Twitter) Credentials: An X (formerly Twitter) API key and secret configured as an n8n credential to allow the workflow to interact with your X account.
- Slack Credentials: A Slack API token configured as an n8n credential to enable sending error notifications.
- External API Endpoint: An external API endpoint that provides a list of retweets to be processed. The workflow expects this API to return data containing a retweet_id field (an example shape follows this list).
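An illustrative example of the shape the workflow expects from that endpoint; only retweet_id is required by the filter, and the other fields are hypothetical:

```json
[
  { "retweet_id": "1760000000000000000", "created_at": "2025-01-15T09:10:11.000Z" },
  { "retweet_id": null, "note": "items without a retweet_id are filtered out" }
]
```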
Setup/Usage
- Import the Workflow:
- Download the workflow JSON.
- In your n8n instance, click "New" in the workflows list.
- Click the "Import from JSON" button and paste the workflow JSON.
- Configure Credentials:
- Locate the "X" node and configure your X (formerly Twitter) API credentials.
- Locate the "Slack" node and configure your Slack API credentials.
- Configure HTTP Request:
- Edit the "HTTP Request" node to point to your specific external API endpoint that returns retweet data.
- Configure Schedule:
- Adjust the "Schedule Trigger" node to your desired execution frequency (e.g., every day at a specific time, every hour); cron examples follow this list.
- Activate the Workflow:
- Save and activate the workflow.
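If you set the Schedule Trigger to a custom cron expression, typical values look like this (illustrative):

```text
0 3 * * *     → every day at 03:00
0 */6 * * *   → every 6 hours
```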
The workflow will now automatically run according to your schedule, fetch retweets, filter them, and delete those that match your criteria, notifying you on Slack if any issues arise.
Related Templates
- AI-powered code review with linting, red-marked corrections in Google Sheets & Slack
- Generate Weather-Based Date Itineraries with Google Places, OpenRouter AI, and Slack
- AI-powered document search with Oracle and ONNX embeddings for recruiting