Creating an AI Slack bot with Google Gemini
This is an example of how to build a Slack bot in a few easy steps. Before you can start, you need to do a few things:

- Create a copy of this workflow
- Create a Slack bot
- Create a slash command in Slack and paste the webhook URL into the slash command

Note: Make sure to configure this webhook with an https:// URL; don't use the default http://localhost:5678, as that will not be recognized by your Slack webhook.

Once the data has been sent to your webhook, the next step is passing it through an AI Agent, which processes the data based on the queries we pass to it. To give the bot some sort of memory, be sure to set the Slack token on the memory node; this way it can refer to other chats from the history. The final message is relayed back to Slack as a new message. Since we cannot wait longer than 3000 ms for a Slack response, we create a new message that references the input we passed. You can extend this with tools or data sources to tailor it to your company.

Usage: To use the Slack bot, go to Slack, invoke the slash command you set up (e.g. /Bob), and send your desired message. This sends the message to your endpoint and returns the processed result as a reply. If you would like help setting this up, feel free to reach out to zacharia@effibotics.com
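The 3000 ms constraint above is why the workflow acks immediately and replies later as a separate message. A minimal sketch of that two-step pattern in plain JavaScript; the function names and message shapes are illustrative, not the workflow's exact nodes:

```javascript
// Immediate ack returned from the webhook (must arrive within 3 s of the
// slash command). "ephemeral" means only the invoking user sees it.
function slashCommandAck(userText) {
  return {
    response_type: 'ephemeral',
    text: `Working on: "${userText}" — I'll reply here shortly.`,
  };
}

// Follow-up message posted once the AI Agent has finished, quoting the
// original input so the reply has context in the channel.
function followUpMessage(channel, userText, aiAnswer) {
  return {
    channel,
    text: `> ${userText}\n${aiAnswer}`,
  };
}
```

In the workflow, the Webhook node plays the role of `slashCommandAck` and the final Slack node posts the `followUpMessage` equivalent.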
Save email attachments to Nextcloud
This workflow will take all emails you put into a certain folder, upload any attachments to Nextcloud, and mark the emails as read (configurable). Attachments will be saved with automatically generated filenames: 2021-01-01_From-Sender-Name_Filename-of-attachment.pdf

Instructions:

- Allow lodash to be used in n8n (or rewrite the code...) by setting the environment variable NODE_FUNCTION_ALLOW_EXTERNAL=lodash
- Import the workflow
- Set credentials for the Email & Nextcloud nodes
- Configure it to use the correct folder / custom filters
- Activate

Custom filter examples:

- Only unread emails: Custom Email Config = ["UNSEEN"]
- Filter emails by 'to' address: Custom Email Config = [["TO", "example+invoices@posteo.de"]]
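The filename scheme above (date, sender, original attachment name) can be sketched as follows. The real workflow builds these inside an n8n Function node with lodash; this standalone version, with a hypothetical `attachmentFilename` helper, just illustrates the idea:

```javascript
// Assemble <date>_<sender>_<original-filename>, replacing characters that
// are awkward in filenames with hyphens. Illustrative only.
function attachmentFilename(dateIso, senderName, originalName) {
  const clean = (s) => s.replace(/[^a-zA-Z0-9.+@-]+/g, '-');
  return `${dateIso}_${clean(senderName)}_${clean(originalName)}`;
}
```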
Convert an XML file to JSON via webhook call
Who this template is for
This template is for everyone who needs to work with XML data a lot and wants to convert it to JSON instead.

Use case
Many products still use XML as their main data format. Unfortunately, not every piece of software supports XML, as many have switched to more modern formats such as JSON. This workflow is designed to handle the conversion of XML data to JSON format via a webhook call, with error handling and Slack notifications integrated into the process.

How this workflow works

- Triggering the workflow: the workflow initiates upon receiving an HTTP POST request at the webhook endpoint specified in the "POST" node. The endpoint, designated as <WEBHOOK_URL>, can be accessed externally by sending a POST request to that URL.
- Data routing and processing: upon receiving the POST request, the Switch node routes the workflow's path based on the content type of the incoming data or any encountered errors. The Extract From File and Edit Fields (Set) nodes manage XML input processing, adapting their actions to the data's content type.
- XML to JSON conversion: the XML data extracted from the input is passed through the "XML" node, which performs the conversion into JSON format.
- Response handling: if the XML-to-JSON conversion is successful, a success response is sent back with a status of "OK" and the converted JSON data. If any errors occur during the conversion, an error response is sent back with a status of "error" and an error message.
- Error handling: in case of an error during processing, the workflow sends a notification to a Slack channel designated for error reporting.

Set up steps

- Set up your own <WEBHOOK_URL> in the Webhook node. While building or testing the workflow, use a test webhook URL; when your workflow is ready, switch to the production webhook URL.
- Set credentials for Slack.
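The success and error branches described above can be modeled in a few lines. The actual XML parsing happens inside n8n's XML node; here a stand-in `convertFn` takes its place, so this only sketches the response envelopes (the `status`/`data`/`message` field names follow the description above):

```javascript
// Route the conversion result into the "OK" or "error" response shape.
function handleConversion(convertFn, xml) {
  try {
    return { status: 'OK', data: convertFn(xml) };
  } catch (err) {
    return { status: 'error', message: String(err.message || err) };
  }
}
```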
Facebook Messenger Bot with GPT-4 for Text, Image & Voice Processing
How it Works
This workflow lets you build a Messenger AI Agent capable of understanding text, images, and voice notes, and replying intelligently in real time. It starts by receiving messages from a Facebook Page via a Webhook, detects the message type (text, image, or audio), and routes it through the right branch. Each input is then prepared as a prompt and sent to an AI Agent that can respond using text generation, perform quick calculations, or fetch information from Wikipedia. Finally, the answer is formatted and sent back to Messenger via the Graph API, creating a smooth, fully automated chat experience.

Set Up Steps

- Connect credentials: add your OpenAI API key and Facebook Page Access Token in n8n credentials.
- Plug in the webhook: copy the Messenger webhook URL from your workflow and paste it into your Facebook Page Developer settings (Webhook → Messages → Subscribe).
- Customize the agent: edit the System Message of the AI Agent to define tone, temperature, and purpose (e.g. "customer support", "math assistant").
- Enable memory & tools: turn on Simple Memory to keep conversation context and activate tools like Calculator or Wikipedia.
- Test & deploy: switch to production mode, then test text, image, and voice messages directly from Messenger.

Benefits

- 💬 Multi-modal Understanding — Handles text, images, and audio messages seamlessly.
- ⚙️ Full Automation — End-to-end workflow from Messenger to AI and back.
- 🧠 Smart Replies — Uses OpenAI + Wikipedia + Calculator for context-aware answers.
- 🚀 No-Code Setup — Build your first Messenger AI in less than 30 minutes.
- 🔗 Extensible — Easily connect more tools or APIs like Airtable, Google Sheets, or Notion.
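The type detection that feeds the routing step above can be sketched like this. The field names follow the Messenger Platform webhook payload (`message.text`, `message.attachments[].type`), but treat this as an illustration rather than the workflow's exact Switch-node expressions:

```javascript
// Classify an incoming Messenger event so it can be routed to the
// text, image, or audio branch.
function detectMessageType(event) {
  const msg = event.message || {};
  if (msg.text) return 'text';
  const att = (msg.attachments || [])[0];
  if (att && att.type === 'image') return 'image';
  if (att && att.type === 'audio') return 'audio';
  return 'unsupported';
}
```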
Daily AI news digest from Hacker News with GPT-5 summaries and email delivery
Automate your daily AI news briefing: fetch AI-tagged stories from Hacker News, filter for the last 24 hours, scrape and summarize with GPT, then deliver a clean HTML email digest—no manual curation needed.

What It Does
Runs on a schedule to fetch up to 1000 Hacker News stories tagged "AI", filters for today's posts, loops through each to scrape content from the source URLs, generates concise AI summaries via OpenAI GPT, aggregates them into a styled HTML newsletter, and sends it via email.

Setup Requirements

Credentials needed:
- OpenAI API Key: get it from platform.openai.com/api-keys and add it as an "OpenAI" credential in n8n
- SMTP Server: configure email credentials (Gmail, Zoho, etc.) in n8n's SMTP settings

Configuration steps:
1. Import the workflow JSON into n8n
2. Add the OpenAI credential to the "GPT 5 pro" node
3. Add the SMTP credential to the "Send email" node
4. Update the fromEmail and toEmail fields in the "Send email" node
5. Set the schedule in the "start" trigger node (default: daily)
6. Activate the workflow

Key Features

- Smart Filtering: fetches 1000 stories, filters the last 24 hours using date expressions
- AI Summarization: GPT generates a heading plus a 2-sentence summary with links
- Reliable Scraping: HTTP requests with markdown conversion for clean LLM input
- Batch Processing: loops through items and processes them sequentially
- Responsive Design: mobile-friendly HTML email template with inline CSS
- Aggregation: combines all summaries into a single digest

Customization Options

- Change keywords: modify the "AI" filter in the "Get many items" node
- Adjust the timeframe: edit the date filter in the "Today" node
- Tweak summaries: customize the GPT prompt in the "News Summary Agent" node
- Email styling: update the HTML/CSS in the "Send email" node
- Schedule: change the frequency in the "start" trigger

Use Cases

- Personal daily AI briefings for researchers and developers
- Team knowledge sharing via automated newsletters
- Content curation for blogs or social media
- Trend monitoring for marketers

Troubleshooting

- No stories returned: check HN API limits and verify the keyword filter
- Scraping failures: some sites block bots—a proxy is noted in the workflow but may need updates
- Email not sending: verify SMTP credentials and test the connection
- Poor summaries: adjust the GPT prompt or switch the model

Execution time: 2-10 minutes depending on story count
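The "last 24 hours" filter at the heart of this workflow can be expressed in a few lines. The real workflow uses an n8n date expression; this plain-JavaScript sketch assumes stories carry a `created_at_i` Unix timestamp, as the Algolia-backed Hacker News API returns:

```javascript
// Keep only stories published within the last 24 hours of `nowMs`.
function lastDay(stories, nowMs = Date.now()) {
  const cutoff = nowMs - 24 * 60 * 60 * 1000;
  return stories.filter((s) => s.created_at_i * 1000 >= cutoff);
}
```

Widening the digest's timeframe (the "Today" node above) amounts to changing the `24` in the cutoff.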
Archive Spotify's Discover Weekly playlist
This workflow will archive your Spotify Discover Weekly playlist to an archive playlist named "Discover Weekly Archive", which you must create yourself. If you want to change the name of the archive playlist, you can edit value2 in the "Find Archive Playlist" node. It is configured to run at 8am on Mondays, a conservative value in case you forgot to set your GENERIC_TIMEZONE environment variable (see the docs here). Special thanks to erin2722 for creating the Spotify node and harshil1712 for help with the workflow logic.

To use this workflow, you'll need to:

- Create, then select, your credentials in each Spotify node
- Create the archive playlist yourself

Optionally, you may choose to:

- Edit the archive playlist name in the "Find Archive Playlist" node
- Adjust the Cron node to an earlier time if you know GENERIC_TIMEZONE is set
- Set up an error workflow like this one to be notified if anything goes wrong
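The lookup the "Find Archive Playlist" node performs is essentially a name match over your playlists. A minimal sketch, assuming a list of playlist objects with a `name` field:

```javascript
// Return the playlist whose name matches the configured archive name
// (the value2 setting mentioned above), or null if it doesn't exist yet.
function findArchivePlaylist(playlists, name = 'Discover Weekly Archive') {
  return playlists.find((p) => p.name === name) || null;
}
```

This is also why the archive playlist must be created by hand first: if the name match fails, there is nothing to add tracks to.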
Create a PayPal batch payout
No description available.
Generate short-form clips from YouTube videos with GPT-4o, Grok & Airtable
This n8n template demonstrates how to automate YouTube content repurposing using AI. Upload a video to Google Drive and automatically generate transcriptions, A/B-testable titles, AI thumbnails, short-form clips with captions, and YouTube descriptions with chapter timestamps.

Use cases include: content creators who publish 1-2 long-form videos per week and need to extract 5-10 short-form clips, YouTube agencies managing multiple channels, and automation consultants building content systems for clients.

Good to know

- Processing time is approximately 10-15 minutes per video depending on length
- Cost per video is roughly $1.00 (transcription $0.65, AI generation $0.35)
- YouTube captions take 10-60 minutes to generate after upload; the workflow includes automatic polling to check when captions are ready
- Manual steps are still required: video clipping (using the provided timestamps), social media posting, and YouTube A/B test setup

How it works

1. When a video is uploaded to Google Drive, the workflow automatically triggers and creates an Airtable record
2. The video URL is sent to AssemblyAI (via Apify) for transcription with H:MM:SS.mmm timestamps
3. GPT-4o-mini analyzes the transcript and generates 3 title variations optimized for A/B testing
4. When you click "Generate thumbnail" in Airtable, your prompt is optimized and sent to Kie.ai's Nano Banana Pro model with 2 reference images for consistent branding
5. After uploading to YouTube, the workflow polls YouTube's API every 5 minutes to check if auto-generated captions are ready
6. Once captions are available, click "Generate clips" and Grok 4.1 Fast analyzes the transcript to identify 3-8 elite clips (45+ seconds each) with proper start/end boundaries and action-oriented captions
7. GPT-4o-mini generates a YouTube description with chapter timestamps based on the transcript
8. All outputs are saved to Airtable: titles, thumbnail, clip timestamps with captions, and description

How to use

1. Duplicate the provided Airtable base template and connect it to your n8n instance
2. Create a Google Drive folder for uploading edited videos
3. After activating the workflow, copy the webhook URLs and paste them into Airtable button formulas and automations
4. Upload your edited video to the designated Google Drive folder to trigger the system
5. The workflow automatically generates titles and begins transcription
6. Add your thumbnail prompt and 2 reference images to Airtable, then click "Generate thumbnail"
7. Upload the video to YouTube as unlisted, paste the video ID into Airtable, and check the box to trigger clip generation
8. Use the provided timestamps to manually clip videos in your editor
9. Copy titles, thumbnail, clips, and description from Airtable to publish across platforms

Requirements

- Airtable account (Pro plan recommended for automations)
- Google Drive for video upload monitoring
- Apify account for video transcription via the AssemblyAI actor
- OpenAI API key for title and description generation (GPT-4o-mini)
- OpenRouter API key for clip identification (Grok 4.1 Fast)
- Kie.ai account for AI thumbnail generation (Nano Banana Pro model)
- YouTube Data API credentials for caption polling

Customising this workflow

- Tailor the system prompts to your content niche by asking Claude to adjust them without changing the core structure
- Modify the clip identification criteria (length, caption style, number of clips) in the Grok prompt
- Adjust the thumbnail generation style by updating the image prompt optimizer
- Add custom fields to Airtable for tracking performance metrics or additional metadata
- Integrate with additional platforms like TikTok or Instagram APIs for automated posting
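The H:MM:SS.mmm timestamp format mentioned in the transcription step can be produced from millisecond offsets (which is how AssemblyAI reports word timings). A small illustrative helper:

```javascript
// Convert a millisecond offset into the H:MM:SS.mmm form used for
// clip boundaries and chapter timestamps.
function toTimestamp(ms) {
  const h = Math.floor(ms / 3600000);
  const m = Math.floor((ms % 3600000) / 60000);
  const s = Math.floor((ms % 60000) / 1000);
  const milli = ms % 1000;
  const pad = (n, w) => String(n).padStart(w, '0');
  return `${h}:${pad(m, 2)}:${pad(s, 2)}.${pad(milli, 3)}`;
}
```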
Create an Onfleet task when a file in Google Drive is updated
Summary
Onfleet is last-mile delivery software that provides end-to-end route planning, dispatch, communication, and analytics to handle the heavy lifting while you focus on your customers. This workflow template listens for a Google Drive update event and creates an Onfleet delivery task. You can easily change which Onfleet entity to interact with.

Configurations

- Connect to Google Drive with your own Google credentials
- Specify the Poll Times and File URL or ID to your own preference; the poll time determines the frequency of the check, while the file URL/ID specifies which file to monitor
- Update the Onfleet node with your own Onfleet credentials. To register for an Onfleet API key, please visit https://onfleet.com/signup to get started
Get list of completed sale orders with Unleashed Software
No description available.
Create, update and get records in Quick Base
No description available.
Auto-send PDF invoices with Stripe payment triggers and Gmail
💰 Auto-Send PDF Invoice When Stripe Payment is Received

This workflow automatically generates a PDF invoice every time a successful payment is received in Stripe, then emails the invoice to the customer via Gmail. Perfect for freelancers, SaaS businesses, and service providers who want to automate billing without manual effort.

---

⚙️ How It Works

1. Stripe Payment Webhook: listens for successful payment events (payment_intent.succeeded) and triggers the workflow whenever a new payment is made.
2. Normalize Payment Data: a Code node extracts and formats details such as the payment ID, amount & currency, customer name & email, payment date, and description, and generates a unique invoice number.
3. Generate Invoice HTML: a Code node builds a professional invoice template in HTML; data is dynamically inserted (amount, customer info, invoice number) and the output is prepared for PDF generation.
4. Send Invoice Email: the Gmail node sends an email to the customer with the invoice attached as a PDF file and a confirmation message containing the payment details.

---

🛠️ Setup Steps

1. Stripe Webhook: in your Stripe Dashboard, navigate to Developers → Webhooks, add a new endpoint with the Webhook URL from the n8n Webhook node, and select the event payment_intent.succeeded.
2. Gmail Setup: in n8n, connect your Gmail OAuth2 credentials. Emails will be sent directly from your Gmail account.
3. Customize Invoice: open the Generate Invoice HTML node, replace "Your Company Name" with your actual business name, and adjust the invoice branding, colors, and layout as needed.

---

📧 Example Email Sent

Subject: Invoice INV-123456789 - Payment Confirmation

Body:
Dear John Doe,
Thank you for your payment! Please find your invoice attached.
Payment Details:
Amount: USD 99.00
Payment ID: pi_3JXXXXXXXX
Date: 2025-08-29
Best regards,
Your Company Name
(Attached: invoice_INV-123456789.pdf)

---

⚡ With this workflow, every Stripe payment automatically creates and delivers a polished PDF invoice — no manual work required.
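The Normalize Payment Data step can be sketched as below. Field paths follow Stripe's payment_intent.succeeded event payload (`amount_received` is in the smallest currency unit); the INV-&lt;timestamp&gt; invoice-number scheme is illustrative, not necessarily the workflow's exact logic:

```javascript
// Flatten a Stripe payment_intent.succeeded event into the fields the
// invoice template needs. Illustrative sketch only.
function normalizePayment(event, now = Date.now()) {
  const pi = event.data.object; // the PaymentIntent
  return {
    paymentId: pi.id,
    amount: (pi.amount_received / 100).toFixed(2), // e.g. 9900 -> "99.00"
    currency: (pi.currency || 'usd').toUpperCase(),
    customerEmail: pi.receipt_email || '',
    description: pi.description || '',
    invoiceNumber: `INV-${now}`, // hypothetical uniqueness scheme
    date: new Date(now).toISOString().slice(0, 10),
  };
}
```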