
Create & track LinkedIn posts with Google Sheets, GPT-5.1, Unsplash, and Sona

Automate LinkedIn content creation by managing ideas in Google Sheets, generating professional AI-written posts, intelligently selecting relevant Unsplash images, sending drafts for email approval, and publishing directly to LinkedIn.

How it works

Step 1: Scheduled Sheet Check
- Workflow runs daily at midnight (customizable to hourly/weekly)
- Fetches posts from the Google Sheet marked with Status = "Ready"
- Processes one post per run and updates its status to "In Progress"

Step 2: AI Content Generation
- GPT-5.1 creates an engaging LinkedIn post based on your inputs
- Generates content with proper hashtags, formatting, and tone
- Follows your specified content type (tip, story, announcement, etc.)

Step 3: Quality Validation
- Automatically checks the character limit (3,000 max)
- Validates the minimum hashtag requirement (3+)
- Loops back to regenerate if quality checks fail

Step 4: Email Approval Workflow
- Formats the post as an HTML email with professional styling
- Sends a preview to your Gmail for review
- Waits for your approval response before proceeding
- Nothing posts without explicit confirmation

Step 5: Smart Image Handling
- If you provided an image URL: downloads it from Google Drive, Dropbox, or a direct link
- If no URL is provided: fetches 10 images from Unsplash and uses AI to select the best one
- If "Include Image?" is "No": posts text-only content
- Automatically converts share links to downloadable formats

Step 6: LinkedIn Publishing & Tracking
- Posts approved content directly to your LinkedIn profile
- Uses the appropriate API endpoint depending on whether an image is included
- Updates the Google Sheet status to "Posted" for successful posts
- Marks "Rejected" posts in the sheet for review and editing

What you'll get
- Batch content planning: queue multiple posts in advance via Google Sheets
- Consistent posting schedule: automated daily publishing without manual work
- Professional AI content: GPT-5.1 generates engaging, platform-optimized posts
- Full approval control: review every post before it goes live
- Flexible image options: your own images, AI-selected Unsplash images, or text-only
- Quality assurance: built-in checks prevent poorly formatted posts
- Status tracking: monitor what's ready, in progress, rejected, or posted
- Smart link conversion: automatically handles Google Drive and Dropbox share links

Requirements

Accounts & credentials:
- OpenAI API key (requires a paid plan for GPT-5.1)
- Gmail account (for the approval workflow)
- Google account (for Sheets integration)
- LinkedIn account (for publishing)
- Unsplash API key (for fetching images)

Google Sheet setup - create a sheet with these columns:
- Topic/Subject (required) - main idea for the post
- Content Type (required) - e.g., "Tip", "Story", "Announcement"
- Tone (required) - e.g., "Professional", "Casual", "Inspirational"
- Target Audience (optional) - who you're writing for
- Additional Notes (optional) - specific points to include
- Image link for your post (optional) - URL to your image
- Include Image? (required) - "Yes" or "No"
- Status (required) - "Ready" to trigger the workflow

Setup steps
1. Import workflow - click "Use workflow" to add it to your n8n instance
2. Connect credentials:
   - Google Sheets: authenticate and select your sheet from the dropdown
   - OpenAI: add your API key in both AI nodes
   - Gmail: authenticate and update the recipient email in the approval node
   - LinkedIn: authenticate (your profile auto-populates)
3. Create your content sheet - add the required columns and fill it with post ideas
4. Test the workflow:
   - Add one test row with Status = "Ready"
   - Run the workflow manually
   - Check your email for the approval request
   - Verify the post appears on LinkedIn
5. Configure the schedule - the default is daily at midnight; adjust the Schedule Trigger node for a different frequency
6. Start batching - add multiple ideas to your sheet and let the automation handle the rest

Tips for best results
- Be specific in Topic/Subject: "5 ways to improve team productivity" beats "productivity tips"
- Mix content types and tones to keep your feed engaging
- Use Additional Notes for data points, statistics, or specific examples; you can also include links for the AI to draw on
- Start with text-only posts to validate content quality before adding images
- Review rejected posts carefully and refine your inputs
- Batch 10-20 ideas at once for weeks of automated content
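The Step 3 quality checks (character limit and minimum hashtag count) can be sketched as an n8n-style Code node. This is a minimal illustration, not the template's actual implementation; the `postText` field name is an assumption.

```javascript
// Sketch of the quality-validation step: enforce the 3,000-character
// LinkedIn limit and require at least 3 hashtags before approval.
// Field name (postText) is hypothetical, not the template's real schema.
function validatePost(postText) {
  const MAX_CHARS = 3000;   // LinkedIn post limit used by the workflow
  const MIN_HASHTAGS = 3;   // minimum hashtag requirement
  const hashtags = postText.match(/#\w+/g) || [];
  const withinLimit = postText.length <= MAX_CHARS;
  const enoughHashtags = hashtags.length >= MIN_HASHTAGS;
  return {
    withinLimit,
    enoughHashtags,
    passed: withinLimit && enoughHashtags, // false => loop back and regenerate
  };
}
```

If `passed` is false, the workflow would route back to the generation node, mirroring the "loops back to regenerate" behavior described above.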

By Sona Labs

N8N automated Twitter reply bot workflow

N8N Automated Twitter Reply Bot Workflow
========================================

For the latest version, check: dziura.online/automation
The latest documentation can be found here.
You must have the Apify community node installed before pasting the JSON into your workflow.

Overview
--------
This n8n workflow creates an intelligent Twitter/X reply bot that automatically scrapes tweets based on keywords or communities, analyzes them using AI, generates contextually appropriate replies, and posts them while avoiding duplicates. The bot operates on a schedule with intelligent timing and retry mechanisms.

Key Features
------------
- Automated tweet scraping from Twitter/X using Apify actors
- AI-powered reply generation using an LLM (Large Language Model)
- Duplicate prevention via MongoDB storage
- Smart scheduling with timezone awareness and natural posting patterns
- Retry mechanism with failure tracking
- Telegram notifications for status updates
- Manual trigger option via Telegram command

Required Credentials & Setup
----------------------------
1. Telegram Bot
   - Create a bot via @BotFather on Telegram
   - Get your Telegram chat ID to receive status messages
   - Credential needed: Telegram account (bot token)
2. MongoDB Database
   - Set up a MongoDB database to store replied tweets and prevent duplicates
   - Create a collection (default name: collection_name)
   - Credential needed: MongoDB account (connection string)
   - Tutorial: MongoDB Connection Guide
3. Apify Account
   - Sign up at Apify.com
   - Primary actors used:
     - Search Actor: api-ninja/x-twitter-advanced-search - for keyword-based tweet scraping (ID: 0oVSlMlAX47R2EyoP)
     - Community Actor: api-ninja/x-twitter-community-search-post-scraper - for community-based tweet scraping (ID: upbwCMnBATzmzcaNu)
   - Credential needed: Apify account (API token)
4. OpenRouter (LLM Provider)
   - Sign up at OpenRouter.ai
   - Used for AI-powered tweet analysis and reply generation
   - Model used: x-ai/grok-3 (configurable)
   - Credential needed: OpenRouter account (API key)
5. Twitter/X API
   - Set up a developer account at developer.x.com
   - Note: the free tier is limited to ~17 posts per day
   - Credential needed: X account (OAuth2 credentials)

Workflow Components
-------------------
Trigger Nodes
1. Schedule Trigger
   - Purpose: runs automatically every 20 minutes
   - Smart timing: only active between 7 AM and 11:59 PM (configurable timezone)
   - Randomization: built-in probability control (~28% execution chance) to mimic natural posting patterns
2. Manual Trigger
   - Purpose: manual execution for testing
3. Telegram Trigger
   - Purpose: manual execution via the /reply command in Telegram
   - Usage: send /reply to your bot to trigger the workflow manually

Data Processing Flow
1. MongoDB Query (Find documents)
   - Purpose: retrieves previously replied tweet IDs to avoid duplicates
   - Collection: collection_name (configure to match your setup)
   - Projection: fetches only the tweet_id field for efficiency
2. Data Aggregation (Aggregate1)
   - Purpose: consolidates tweet IDs into a single array for filtering
3. Keyword/Community Selection (Keyword/Community List)
   - Purpose: defines search terms and communities
   - Configuration: edit the JSON to include your keywords and Twitter community IDs
   - Format:
     {
       "keyword_community_list": [
         "SaaS",
         "Entrepreneur",
         "1488663855127535616"  // Community ID (19-digit number)
       ],
       "failure": 0
     }
4. Random Selection (Randomized community, keyword)
   - Purpose: randomly selects one item from the list to ensure variety
5. Routing Logic (If4)
   - Purpose: determines whether to use Community search or Keyword search
   - Logic: uses a regex to distinguish 19-digit community IDs from keywords

Tweet Scraping (Apify Actors)
Community Search Actor
- Actor: api-ninja/x-twitter-community-search-post-scraper
- Purpose: scrapes tweets from specific Twitter communities
- Configuration:
  {
    "communityIds": ["COMMUNITY_ID"],
    "numberOfTweets": 40
  }
Search Actor
- Actor: api-ninja/x-twitter-advanced-search
- Purpose: scrapes tweets based on keywords
- Configuration:
  {
    "contentLanguage": "en",
    "engagementMinLikes": 10,
    "engagementMinReplies": 5,
    "numberOfTweets": 20,
    "query": "KEYWORD",
    "timeWithinTime": "2d",
    "tweetTypes": ["original"],
    "usersBlueVerifiedOnly": true
  }

Filtering System (Community filter)
The workflow applies multiple filters to ensure high-quality replies:
- Text length: >60 characters (substantial content)
- Follower count: >100 followers (audience reach)
- Engagement: >10 likes, >3 replies (proven engagement)
- Language: English only
- Views: >100 views (visibility)
- Duplicate check: not previously replied to
- Recency: within 2 days (configurable in actor settings)

AI-Powered Reply Generation
LLM Chain (Basic LLM Chain)
- Purpose: analyzes filtered tweets and generates contextually appropriate replies
- Model: Grok-3 via OpenRouter (configurable)
- Features:
  - Engagement potential scoring
  - User authority analysis
  - Timing optimization
  - Multiple reply styles (witty, informative, supportive, etc.)
  - <100 character limit for optimal engagement
Output Parser (Structured Output Parser)
- Purpose: ensures a consistent JSON output format
- Schema:
  {
    "selected_tweet_id": "tweet_id_here",
    "screen_name": "author_screen_name",
    "reply": "generated_reply_here"
  }

Posting & Notification System
Twitter Posting (Create Tweet)
- Purpose: posts the generated reply as a Twitter response
- Error handling: catches API limitations and rate limits
Status Notifications
- Success: notifies via Telegram with the tweet link and reply text
- Failure: notifies about API limitations or errors
- Format: HTML-formatted messages with clickable links
Database Storage (Insert documents)
- Purpose: saves successful replies to prevent future duplicates
- Fields stored: tweet_id, screen_name, reply, tweet_url, timestamp

Retry Mechanism
The workflow includes intelligent retry logic:
Failure Counter (If5, Increment Failure Counter1)
- Logic: if no suitable tweets are found, increment the failure counter
- Retry limit: maximum 3 retries with different random keywords
- Wait time: 3-second delay between retries
Final Failure Notification
- Trigger: after 4 failed attempts (the initial run plus 3 retries)
- Action: sends a Telegram notification about the unsuccessful search
- Recovery: manual retry available via the /reply command

Configuration Guide
-------------------
Essential Settings to Modify
- MongoDB Collection Name: update collection_name in the MongoDB nodes
- Telegram Chat ID: replace 11111111111 with your actual chat ID
- Keywords/Communities: edit the list in the Keyword/Community List node
- Timezone: update the timezone in the Code node (currently set to Europe/Kyiv)
- Actor Selection: enable only one actor (Community OR Search) based on your needs
Filter Customization
Adjust filters in the Community filter node based on your requirements:
- Minimum engagement thresholds
- Text length requirements
- Time windows
- Language preferences
LLM Customization
Modify the AI prompt in Basic LLM Chain to:
- Change the reply style and tone
- Adjust engagement criteria
- Modify scoring algorithms
- Set different character limits

Usage Tips
----------
- Start small: begin with a few high-quality keywords/communities
- Monitor performance: use Telegram notifications to track success rates
- Adjust filters: fine-tune based on the quality of generated replies
- Respect limits: Twitter's free tier allows ~17 posts/day
- Test manually: use the /reply command for testing before scheduling

Troubleshooting
---------------
Common Issues
- No tweets found: adjust filter criteria or check keywords
- API rate limits: reduce posting frequency or upgrade your Twitter API plan
- MongoDB connection: verify the connection string and collection name
- Apify quota: monitor Apify usage limits
- LLM failures: check OpenRouter credits and model availability
Best Practices
- Monitor your bot's replies for quality and appropriateness
- Regularly update keywords to stay relevant
- Keep an eye on engagement metrics
- Adjust timing based on your audience's activity patterns
- Maintain a balanced posting frequency to avoid appearing spammy

Documentation Links
-------------------
- Full Documentation: Google Doc Guide
- Latest Version: dziura.online/automation
- MongoDB Setup Tutorial: YouTube Guide

This workflow provides a comprehensive solution for automated, intelligent Twitter engagement while maintaining quality and avoiding spam-like behavior.
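The random-selection and routing steps (Randomized community, keyword → If4) can be sketched as follows. This is an illustrative reconstruction based on the description above; the list contents and function names are hypothetical, and the only detail taken from the workflow is that 19-digit numeric strings are treated as community IDs.

```javascript
// Sketch of the selection + routing logic: pick one entry at random from
// the keyword/community list, then route by whether it matches a 19-digit
// community ID. Names (pickAndRoute) are illustrative, not the node's code.
const list = ["SaaS", "Entrepreneur", "1488663855127535616"];

function pickAndRoute(entries) {
  const choice = entries[Math.floor(Math.random() * entries.length)];
  // Exactly 19 digits => Twitter community ID => Community Search Actor;
  // anything else => keyword => Advanced Search Actor.
  const route = /^\d{19}$/.test(choice) ? "community" : "keyword";
  return { choice, route };
}
```

Downstream, the "community" branch would feed `choice` into `communityIds`, while the "keyword" branch would feed it into the search actor's `query` field.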

By Max

Automate lead generation with Apollo, AI parsing, and timed email follow-ups

Good to know:
- The workflow runs every hour with a randomized delay of 5-20 minutes to help distribute load.
- It records the exact date and time a lead is emailed so you can track outreach.
- Follow-ups are automatically scheduled two days after the initial email.

How it works:
- After Apify completes, the JSON data is retrieved and inserted into the proper JSON node (only the JSON is removed; nothing else).
- The agent then runs on its own, parsing the data and pushing it to Google Sheets.
- When a lead is emailed, the system tags it with the date and time for tracking.
- Two days later, the workflow automatically triggers a follow-up, again on an hourly schedule with the same randomized delay.

How to use:
- Start by connecting your Apify account to retrieve data.
- Place the returned JSON into the designated JSON node.
- Configure the Google Sheet where the data will be stored.
- Adjust the time-delay window or follow-up period if needed.
- Insert your email credentials and the message.

Requirements:
- Apify account with active leads/data
- Google Sheet for storing and managing parsed lead information
- n8n credentials configured for your accounts
- Email credentials

Customising this workflow: You can easily extend this template to include other CRMs, different time delays, or additional notification steps. For example, push new leads to Slack, send SMS notifications, or trigger downstream analytics dashboards automatically.
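The timing behavior described above (a randomized 5-20 minute delay per hourly run, and follow-ups due two days after the recorded send time) can be sketched as below. The field name `emailedAt` is an assumption about the sheet's columns, not the template's actual schema.

```javascript
// Sketch of the timing logic: a lead qualifies for a follow-up once two
// days have passed since the recorded send time, and each hourly run is
// offset by a random 5-20 minute delay. Names are illustrative.
const FOLLOW_UP_MS = 2 * 24 * 60 * 60 * 1000; // two days in milliseconds

function dueForFollowUp(emailedAt, now = new Date()) {
  // emailedAt: the date/time stamp recorded when the lead was emailed
  return now - new Date(emailedAt) >= FOLLOW_UP_MS;
}

function randomDelayMs() {
  // Random delay between 5 and 20 minutes, in milliseconds
  const minutes = 5 + Math.random() * 15;
  return Math.round(minutes * 60 * 1000);
}
```

Recording the exact send timestamp (rather than just a "sent" flag) is what makes the two-day follow-up window checkable on every hourly run.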

By Deniz

Personalize client meeting prep with GPT-4, Google Calendar, Notion & Places API to Slack

Who is this for
This template is perfect for sales professionals, account managers, and business development teams who want to make memorable impressions on their clients. It automates the tedious task of researching gift shops and preparation spots before important meetings.

What it does
This workflow automatically prepares personalized recommendations for client visits by monitoring your Google Calendar, enriching data from Notion, and using AI to select the perfect options.

How it works
1. Trigger: activates when a calendar event containing keywords like "visit," "meeting," "client," or "dinner" is created or updated
2. Extract: parses the company name from the event title
3. Enrich: fetches customer preferences from your Notion database
4. Search: the Google Places API finds nearby gift shops and quiet cafes
5. Analyze: GPT-4 recommends the best options based on customer preferences
6. Notify: sends a personalized message to Slack with the recommendations

Example Slack Output
Here's what the final notification looks like:

🎁 Recommended Gift Shop
Patisserie Sadaharu AOKI (★4.6)
3-5-2 Marunouchi, Chiyoda-ku
💡 Reason: The customer loves French desserts, so this patisserie's macarons would be perfect!

☕ Pre-Meeting Cafe
Starbucks Reserve Roastery (★4.5)
5 min walk from the meeting location

Set up steps
Setup time: approximately 15 minutes
1. Google Calendar: connect your Google Calendar account and select your calendar
2. Notion Database: create a customer database with "Company Name" (title) and "Preferences" (text) fields
3. Google Places API: get an API key from the Google Cloud Console and add it to the Configuration node
4. OpenAI: connect your OpenAI account for AI-powered recommendations
5. Slack: connect your Slack workspace and update the channel ID in the final node

Requirements
- Google Calendar account
- Notion account with a customer database
- Google Places API key (requires a Google Cloud account)
- OpenAI API key
- Slack workspace with bot permissions

How to customize
- Search radius: adjust the searchRadius parameter in the Configuration node (default: 1000 meters)
- Event keywords: modify the Filter node conditions to match your calendar naming conventions
- Notification channel: change the Slack channel ID to your preferred channel
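The trigger's keyword filter can be sketched as a simple title check. The keyword list mirrors the ones named in the description; case-insensitive substring matching is an assumption about how the Filter node is configured, not a confirmed detail.

```javascript
// Sketch of the calendar-event filter: an event qualifies when its title
// contains any of the meeting-related keywords. Matching style is assumed.
const KEYWORDS = ["visit", "meeting", "client", "dinner"];

function isClientEvent(title) {
  const lower = title.toLowerCase();
  return KEYWORDS.some((kw) => lower.includes(kw));
}
```

Extending `KEYWORDS` is the equivalent of the "Event keywords" customization described above: add whatever terms match your own calendar naming conventions.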

By 長谷 真宏