Generate and schedule AI discussion posts for Reddit with GPT-4 and Airtable
How the sequence works:
- A "Schedule Trigger" node activates the automation at a defined schedule.
- An "Airtable" node will search for previously posted questions in your question database.
- Airtable Base Template: here
- An "Aggregate" node will take all the questions from Airtable and compress them to a single output.
- ChatGPT, or a model of your choice, will generate a discussion question based on the options in the system prompt.
- The discussion question will be posted to the subreddit of your choice by the "Reddit" node.
- You can choose between a text, image, or link post!
- The recently-posted discussion question will then be uploaded to your Airtable base using the "Airtable" node. This will be used to prevent ChatGPT from creating duplicate questions.
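For illustration, here is a minimal Code-node sketch (not part of the template itself) of how the aggregated previous questions can be folded into the system prompt so the model avoids repeats; the aggregated field name "Question" is an assumption and should match your Airtable column.

```javascript
// Hypothetical Code-node sketch: turn the aggregated Airtable output into a
// "do not repeat" block for the system prompt.
// Assumes the Aggregate node outputs an array field named "Question".
const previous = $input.first().json.Question || [];

const systemPrompt = [
  'Write one open-ended discussion question for the subreddit.',
  'Do not repeat or closely paraphrase any of these previously posted questions:',
  ...previous.map(q => `- ${q}`),
].join('\n');

return [{ json: { systemPrompt } }];
```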
Setup Steps
The setup process will take about 5 minutes. An Airtable base template is also pre-made for you here: https://airtable.com/app6wzQqegKIJOiOg/shrzy7L9yv8BFRQdY
- Set the recurrence in the "Schedule Trigger" node.
- Create an Airtable account if you don't already have one.
- Generate an Airtable personal access token from the developer hub in your Airtable account.
- Configure the "Get Previous Discussions" Airtable node.
- Configure the options in brackets in the "Generate New Discussion" node.
- Set the desired subreddit to post to and the post type (text, image, or link) in the "Post Discussion" node.
- Configure the "Create Archived Discussion" node to map to the Airtable base (required) and specific subreddit (optional).
Generate and Schedule AI Discussion Posts for Reddit with GPT-4 and Airtable
This n8n workflow automates the creation and scheduling of discussion posts for Reddit. It leverages the power of AI (GPT-4 via OpenAI) to generate engaging content based on topics stored in Airtable, and then aggregates these posts for potential scheduling or immediate publication to Reddit.
What it does
This workflow streamlines the process of generating and publishing Reddit content by:
- Triggering on a Schedule: The workflow starts at predefined intervals, allowing for regular content generation.
- Fetching Topics from Airtable: It connects to an Airtable base to retrieve discussion topics or prompts.
- Generating AI Content with OpenAI: For each topic, it uses the OpenAI Chat Model (GPT-4) to generate a discussion post.
- Aggregating Generated Posts: All generated posts are collected into a single output, making them ready for further processing (e.g., review, scheduling, or direct posting).
- Preparing for Reddit (Placeholder): The Reddit node is included as a placeholder, indicating the final intended action of posting the generated content to Reddit. It's currently disconnected, suggesting a manual review step or a future expansion to connect it.
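For reference, the aggregated output is a single item whose field holds every generated post as an array; a rough illustration, with the field name assumed:

```javascript
// Illustrative only: the single item the Aggregate node passes downstream.
// The field name ("post_content") is an assumption; check the node's
// "Fields to Aggregate" setting in your copy of the workflow.
const aggregatedItem = {
  json: {
    post_content: [
      'Generated discussion post #1 ...',
      'Generated discussion post #2 ...',
    ],
  },
};
```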
Prerequisites/Requirements
To use this workflow, you will need:
- n8n Instance: A running n8n instance.
- Airtable Account: An Airtable account with a base containing the discussion topics. You'll need to configure the Airtable node with your API Key and Base ID.
- OpenAI Account: An OpenAI account with API access to GPT-4. You'll need an OpenAI API Key for the "OpenAI Chat Model" node.
- Reddit Account (Optional for full automation): If you plan to fully automate posting, you'll need a Reddit account and credentials configured in n8n.
Setup/Usage
- Import the Workflow: Download the provided JSON and import it into your n8n instance.
- Configure Credentials:
- Airtable: Set up your Airtable credential with your API Key. In the "Airtable" node, select your credential and specify the "Base ID" and "Table Name" where your discussion topics are stored.
- OpenAI: Set up your OpenAI credential with your API Key. The "OpenAI Chat Model" node will use this credential.
- Reddit (Optional): If you intend to connect and use the Reddit node, set up your Reddit OAuth2 credential.
- Customize the Schedule: In the "Schedule Trigger" node, adjust the cron expression or interval to your desired frequency for generating posts.
- Customize the AI Prompt: In the "Basic LLM Chain" and "OpenAI Chat Model" nodes, review and modify the prompt to guide the AI in generating the desired type of discussion posts.
- Review and Connect Reddit Node:
- The "Reddit" node is currently disconnected. After generating posts, you might want to manually review them.
- To enable automatic posting, connect the "Aggregate" node to the "Reddit" node.
- Configure the "Reddit" node with the desired subreddit, title, and content fields using expressions from the previous nodes (e.g.,
{{ $json.post_title }}and{{ $json.post_content }}).
- Activate the Workflow: Once configured, activate the workflow to start generating and potentially scheduling Reddit posts automatically.
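If the post_title and post_content fields referenced above are not already produced by your LLM node, one option is a small Code node between "Aggregate" and "Reddit" that derives them from the generated text. This is only a sketch, and the input field name ("text") is an assumption:

```javascript
// Hypothetical Code node: derive the post_title / post_content fields that the
// Reddit node maps via the expressions above. Adjust the input field name to
// match your LLM node's output.
const generated = ($input.first().json.text || '').trim();
const [firstLine, ...rest] = generated.split('\n');

return [{
  json: {
    post_title: firstLine.slice(0, 300),          // Reddit enforces a 300-character title limit
    post_content: rest.join('\n').trim() || generated,
  },
}];
```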
Related Templates
Automated YouTube video uploads with 12h interval scheduling in JST
This workflow automates a batch upload of multiple videos to YouTube, spacing each upload 12 hours apart in Japan Standard Time (UTC+9) and automatically adding them to a playlist.
⚙️ Workflow Logic
- Manual Trigger: Starts the workflow manually.
- List Video Files: Uses a shell command to find all .mp4 files under the specified directory (/opt/downloads/单词卡/A1-A2).
- Sort and Generate Items: Sorts videos by day number (dayXX) extracted from filenames and assigns a sequential order value.
- Calculate Publish Schedule (+12h Interval): Computes the next rounded JST hour plus a configurable buffer (default 30 min), staggers each video's scheduled time by order × 12 hours, and converts JST back to UTC for YouTube's publishAt field.
- Split in Batches (1 per video): Iterates over each video item.
- Read Video File: Loads the corresponding video from disk.
- Upload to YouTube (Scheduled): Uploads the video privately with the computed publishAtUtc.
- Add to Playlist: Adds the newly uploaded video to the target playlist.
🕒 Highlights
- Timezone-safe: Pure UTC ↔ JST conversion avoids double-offset errors.
- Sequential scheduling: Ensures each upload is 12 hours apart to prevent clustering.
- Customizable: Change SPAN_HOURS, BUFFER_MIN, or directory paths easily.
- Retry-ready: Each upload and playlist step has retry logic to handle transient errors.
💡 Typical Use Cases
- Multi-part educational video series (e.g., A1–A2 English learning).
- Regular content release cadence without manual scheduling.
- Automated YouTube publishing pipelines for pre-produced content.
Author: Zane | Category: Automation / YouTube / Scheduler | Timezone: JST (UTC+09:00)
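For clarity, a rough JavaScript sketch of the scheduling math described above (the constant names follow the description, not the actual workflow code, and the rounding details are an assumption):

```javascript
// Sketch of the 12-hour JST staggering logic.
const SPAN_HOURS = 12;                        // gap between consecutive uploads
const BUFFER_MIN = 30;                        // lead time before the first slot
const JST_OFFSET_MS = 9 * 60 * 60 * 1000;     // UTC+9, no daylight saving time

function publishAtUtc(order, now = new Date()) {
  // Shift to a "JST clock", add the buffer, then round up to the next full hour.
  const jst = new Date(now.getTime() + JST_OFFSET_MS + BUFFER_MIN * 60 * 1000);
  jst.setUTCMinutes(0, 0, 0);
  jst.setUTCHours(jst.getUTCHours() + 1);

  // Stagger by order * 12 hours, then convert back to UTC for YouTube's publishAt.
  const slotMs = jst.getTime() + order * SPAN_HOURS * 60 * 60 * 1000;
  return new Date(slotMs - JST_OFFSET_MS).toISOString();
}

// Example: publishAtUtc(0) is the first slot; publishAtUtc(2) lands 24 hours later.
```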
Monitor bank transactions with multi-channel alerts for accounting teams
Enhance financial oversight with this automated n8n workflow. Triggered every 5 minutes, it fetches real-time bank transactions via an API, enriches and transforms the data, and applies smart logic to detect critical, high, and medium priority alerts based on error conditions, amounts, or risk scores. It sends multi-channel notifications via email and Slack, logs all data to Google Sheets, and generates summary statistics for comprehensive tracking. 💰🚨
Key Features
- Real-time monitoring every 5 minutes for instant alerts.
- Smart prioritization (Critical, High, Medium) based on risk and errors.
- Multi-channel notifications via email and Slack.
- Detailed logging and summary reports in Google Sheets.
How It Works
- Schedule Trigger: Runs every 5 minutes.
- Fetch Transactions: HTTP request retrieves real-time transaction data.
- API Error?: If the error condition is met, sends an error alert.
- Enrich & Transform Data: Advanced risk calculation enhances the data.
- Critical Alert?: If the condition (50% or risk > 8) is met, raises an alert.
- High Priority?: If the condition (5% or risk > 7) is met, raises an alert.
- Medium Priority?: If the condition is met, raises an alert.
- Log Priority to Sheet: Appends critical, high, or medium priority data to Google Sheets.
- Send Critical Email: HTML email alert for critical transactions.
- Send High Priority Email: Email to the finance team.
- Send High Priority Slack: Slack notification to the finance team.
- Send Medium Priority Email: Email to the finance team.
- Merge All Alerts: Combines all alerts for comprehensive tracking.
- Generate Summary Stats: Code block for analytics.
- Log Summary to Sheet: Stores summary statistics.
Setup Instructions
- Import the workflow into n8n and configure the bank API credentials in "Fetch Transactions."
- Set up Google Sheets OAuth2 and replace the sheet ID in the logging nodes.
- Configure the Gmail API key and Slack bot token for alerts.
- Test the workflow with sample transaction data exceeding the risk or amount thresholds.
- Adjust the priority conditions (e.g., 50%, 5%, risk > 8) based on your risk policy.
Prerequisites
- Bank API access with real-time transaction data (e.g., https://api.bank.com)
- Google Sheets OAuth2 credentials
- Gmail API key for email alerts
- Slack bot token (with chat:write permission)
- Structured transaction data format
Google Sheet Structure: create a sheet with the columns Transaction ID, Amount, Date, Risk Score, Priority (Critical/High/Medium), Alert Sent, Summary Stats, and Updated At.
Modification Options
- Adjust the "Schedule Trigger" interval (e.g., every 10 minutes).
- Modify the "Critical Alert?" and "High Priority?" conditions for custom thresholds.
- Customize email and Slack templates with branded messaging.
- Integrate with fraud detection tools for enhanced risk analysis.
- Enhance "Generate Summary Stats" with additional metrics (e.g., average risk).
Discover more workflows – Get in touch with us
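A rough sketch of the priority branching described above; the field names are assumptions, and the Medium condition is a placeholder since the description leaves it unspecified:

```javascript
// Sketch of the alert-priority logic. Thresholds follow the description;
// adjust field names to your transaction schema.
function classifyTransaction(tx) {
  if (tx.apiError) return 'API_ERROR';                                // "API Error?" branch
  if (tx.changePercent > 50 || tx.riskScore > 8) return 'CRITICAL';   // "Critical Alert?" branch
  if (tx.changePercent > 5 || tx.riskScore > 7) return 'HIGH';        // "High Priority?" branch
  if (tx.riskScore > 5) return 'MEDIUM';                              // placeholder condition
  return 'NONE';
}
```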
Detect holiday conflicts & suggest meeting reschedules with Google Calendar and Slack
Who's it for
Remote and distributed teams that schedule across time zones and want to avoid meetings landing on public holidays: PMs, CS/AM teams, and ops leads who own cross-regional calendars.
What it does / How it works
The workflow checks next week's Google Calendar events, compares event dates against public holidays for selected country codes, and produces a single Slack digest with any conflicts plus suggested alternative dates. Core steps: Workflow Configuration (Set) → Fetch Public Holidays (via a public holiday API such as Calendarific/Nager.Date) → Get Next Week Calendar Events (Google Calendar) → Detect Holiday Conflicts (compare dates) → Generate Reschedule Suggestions (find the nearest business day that isn't a holiday or weekend) → Format Slack Digest → Post Slack Digest.
How to set up
- Open Workflow Configuration (Set) and edit: countryCodes, calendarId, slackChannel, nextWeekStart, nextWeekEnd.
- Connect your own Google Calendar and Slack credentials in n8n (no hardcoded keys).
- (Optional) Adjust the trigger to run daily or only on Mondays.
Requirements
- n8n (Cloud or self-hosted)
- Google Calendar read access to the target calendar
- Slack app with permission to post to the chosen channel
- A public-holiday API (no secrets needed for Nager.Date; Calendarific requires an API key)
How to customize the workflow
- Time window: Change nextWeekStart/End to scan a different period.
- Holiday sources: Add or swap APIs; merge multiple regions.
- Suggestion logic: Tweak the look-ahead window or rules (e.g., skip Fridays).
- Output: Post per-calendar messages, DM owners, or create tentative reschedule events automatically.
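A rough sketch of the reschedule-suggestion step described above, assuming the holiday API results have been collapsed into a set of YYYY-MM-DD strings:

```javascript
// Sketch of "Generate Reschedule Suggestions": walk forward from the
// conflicting date to the nearest day that is neither a weekend nor a holiday.
// `holidays` is assumed to be a Set of 'YYYY-MM-DD' strings.
function suggestReschedule(conflictDate, holidays, maxLookAheadDays = 14) {
  const d = new Date(conflictDate);
  for (let i = 1; i <= maxLookAheadDays; i++) {
    d.setUTCDate(d.getUTCDate() + 1);
    const iso = d.toISOString().slice(0, 10);
    const day = d.getUTCDay();                 // 0 = Sunday, 6 = Saturday
    if (day !== 0 && day !== 6 && !holidays.has(iso)) return iso;
  }
  return null;                                 // nothing found within the look-ahead window
}
```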