
Send a ChatGPT email reply and save responses to Google Sheets

This workflow sends an OpenAI GPT reply when an email is received from specific email recipients. It then saves the initial email and the GPT response to an automatically generated Google spreadsheet. Subsequent GPT responses will be added to the same spreadsheet. Additionally, when feedback is given for any of the GPT responses, it is recorded to the spreadsheet, which can then be used later to fine-tune the GPT model.

Prerequisites
- OpenAI credentials
- Google credentials

How it works
This workflow is essentially a two-in-one workflow. It triggers off two different nodes and has very different functionality for each trigger.

The flow triggered from the On email received node is as follows:
1. Triggers off on the On email received node.
2. Extracts the email body from the email.
3. Generates a response from the email body using the OpenAI node.
4. Replies to the email sender using the Send reply to recipient node. A feedback link is also included in the email body, which will trigger the On feedback given node. This is used to fine-tune the GPT model.
5. Saves the email body and OpenAI response to a Google Sheet. If a sheet does not exist, it will be created.

The flow triggered from the On feedback given node is as follows:
1. Triggers off when a feedback link is clicked in the emailed GPT response.
2. The feedback, either positive or negative, for that specific GPT response is then recorded to the Google Sheet.
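For reference, a minimal Python sketch of the core step outside n8n: generating a reply from an email body with the OpenAI API and appending both to a sheet. The spreadsheet name and credentials file are placeholders, and the sheet is assumed to already exist (the workflow itself creates one when missing).

```python
from openai import OpenAI
import gspread

client = OpenAI()  # reads OPENAI_API_KEY from the environment
gc = gspread.service_account(filename="service_account.json")  # placeholder credentials file

def reply_and_log(email_body: str, sheet_name: str = "GPT Email Replies") -> str:
    # Generate a reply to the incoming email body
    completion = client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=[
            {"role": "system", "content": "You write polite, concise email replies."},
            {"role": "user", "content": email_body},
        ],
    )
    reply = completion.choices[0].message.content

    # Append the original email and the generated reply as one row
    worksheet = gc.open(sheet_name).sheet1
    worksheet.append_row([email_body, reply])
    return reply
```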

By n8n Team
18784

Upsert huge documents in a vector store with Supabase and Notion

Purpose
This workflow adds the capability to build a RAG on living data. In this case Notion is used as a knowledge base. Whenever a page is updated, the embeddings get upserted in a Supabase Vector Store. It can also be fairly easily adapted to PGVector, Pinecone, or Qdrant by using a custom HTTP request for the latter two.

Demo
https://youtu.be/ELAxebGmspY

How it works
- A trigger checks every minute for changes in the Notion Database. The manual polling approach improves accuracy and prevents changes from being lost between cached polling intervals.
- Afterwards every updated page is processed sequentially.
- The Vector Database is searched using the Notion Page ID stored in the metadata of each embedding. If old entries exist, they are deleted.
- All blocks of the Notion Database Page are retrieved and combined into a single string.
- The content is split into chunks if necessary and embedded. Metadata, including the Notion Page ID, is added during storage for future reference.
- A simple Question and Answer Chain enables users to ask questions about the embedded content through the integrated chat function.

Prerequisites
- To set up a new Vector Store in Supabase, follow this guide.
- Prepare a simple Database in Notion with each Database Page containing at least a title and some content in the blocks section. You can of course also connect this to an existing Database of your choice.

Setup
1. Select your credentials in the nodes which require them.
2. If you are on an n8n cloud plan, switch to the native Notion Trigger by activating it and deactivating the Schedule Trigger along with its subsequent Notion node.
3. Choose your Notion Database in the first node related to Notion.
4. Adjust the chunk size and overlap in the Token Splitter to your preference.
5. Activate the workflow.

How to use
Populate your Notion Database with useful information and use the chat mode of this workflow to ask questions about it. Updates to a Notion Page should quickly reflect in future conversations.
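A rough Python sketch of the delete-then-insert upsert pattern described above. The table name, column names, and metadata key follow the common Supabase vector-store layout and are assumptions; URL and key are placeholders.

```python
from openai import OpenAI
from supabase import create_client

openai_client = OpenAI()
supabase = create_client("https://YOUR-PROJECT.supabase.co", "YOUR_SERVICE_ROLE_KEY")  # placeholders

def upsert_notion_page(page_id: str, page_text: str, chunk_size: int = 1000) -> None:
    # 1. Delete any existing embeddings for this Notion page, matched via metadata
    supabase.table("documents").delete().eq("metadata->>notion_page_id", page_id).execute()

    # 2. Naive character chunking; the workflow uses a Token Splitter with configurable overlap
    chunks = [page_text[i:i + chunk_size] for i in range(0, len(page_text), chunk_size)]

    # 3. Embed each chunk and insert it, keeping the Notion Page ID in the metadata
    for chunk in chunks:
        embedding = openai_client.embeddings.create(
            model="text-embedding-3-small", input=chunk
        ).data[0].embedding
        supabase.table("documents").insert({
            "content": chunk,
            "embedding": embedding,
            "metadata": {"notion_page_id": page_id},
        }).execute()
```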

By Mario
13963

Create a pizza ordering chatbot with GPT-3.5 - Menu, orders & status tracking

Pizza Ordering Chatbot with OpenAI - Menu, Orders & Status Tracking

Introduction
This workflow template is designed to automate order processing for a pizza store using OpenAI and n8n. The chatbot acts as a virtual assistant to handle customer inquiries related to menu details, order placement, and order status tracking.

Features
The chatbot provides an interactive experience for customers by performing the following functions:
- Menu Inquiry: When a customer asks about the menu, the chatbot responds with a list of available pizzas, prices, and additional options.
- Order Placement: If a customer places an order, the chatbot confirms order details, provides a summary, informs the customer that the order is being processed, and expresses gratitude.
- Order Status Tracking: If a customer asks about their order status, the chatbot retrieves details such as order date, pizza type, and quantity, providing real-time updates.

Prerequisites
Before setting up the workflow, ensure you have the following:
- OpenAI account (Sign up here)
- OpenAI API key to interact with GPT-3.5
- n8n instance running locally or on a server (Installation Guide)

Configuration Steps

Step 1: Set Up OpenAI API Credentials
1. Log in to OpenAI's website.
2. Navigate to API Keys under your account settings.
3. Click Create API Key and copy the key for later use.

Step 2: Configure OpenAI Node in n8n
1. Open n8n and create a new workflow.
2. Click Add Node and search for OpenAI.
3. Select OpenAI from the list.
4. In the OpenAI node settings, click "Create New" under the Credentials section.
5. Enter a name for the credentials (e.g., "PizzaBot OpenAI Key").
6. Paste your API Key into the field.
7. Click Save.

Step 3: Set Up the Chatbot Logic
1. Connect the AI Agent Builder node to the OpenAI node and HTTP Request node.
2. Configure the OpenAI node with the following settings:
   - Model: gpt-3.5-turbo
   - Prompt: Provide dynamic text based on customer inquiries (e.g., "List available pizzas," "Place an order for Margherita pizza," "Check my order status").
   - Temperature: Adjust based on desired creativity (recommended: 0.7).
   - Max Tokens: Limit response length (recommended: 150).
3. Add multiple HTTP Request nodes:
   - Get Products: Fetch stored menu data and return details.
   - Order Product: Capture order details, generate an order ID, and confirm with the customer.
   - Get Order: Retrieve order details based on the order ID and display progress.

Step 4: Testing and Deployment
1. Click Execute Workflow to test the chatbot.
2. Open the Chat Message node, then copy the chat URL to access the chatbot in your browser.
3. Interact with the chatbot by asking different queries (e.g., "What pizzas do you have?" or "I want to order a Pepperoni pizza").
4. Verify responses and adjust prompts or configurations as needed.
5. Deploy the workflow and integrate it with a messaging platform (e.g., Telegram, WhatsApp, or a website chatbot).

Conclusion
This n8n workflow enables a fully functional pizza ordering chatbot using OpenAI's GPT-3.5. Customers can view menus, place orders, and track their order status efficiently. You can further customize the chatbot by refining prompts, adding new features, or integrating with external databases for order management. πŸš€ Happy automating!
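For illustration, a minimal Python sketch of the chat completion call configured in Step 3; the system prompt wording is illustrative rather than taken from the template.

```python
from openai import OpenAI

client = OpenAI()  # uses the PizzaBot OpenAI key from the environment

def pizza_bot_reply(customer_message: str) -> str:
    completion = client.chat.completions.create(
        model="gpt-3.5-turbo",
        temperature=0.7,   # recommended creativity setting from Step 3
        max_tokens=150,    # recommended response-length limit from Step 3
        messages=[
            {"role": "system",
             "content": "You are a pizza-ordering assistant. You can list the menu, "
                        "take orders, and report order status."},
            {"role": "user", "content": customer_message},
        ],
    )
    return completion.choices[0].message.content

print(pizza_bot_reply("What pizzas do you have?"))
```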

By Irfan Handoko
7544

Create voice assistant interface with OpenAI GPT-4o-mini and text-to-speech

Voice Assistant Interface with n8n and OpenAI
This workflow creates a voice-activated AI assistant interface that runs directly in your browser. Users can click on a glowing orb to speak with the AI, which responds with voice using OpenAI's text-to-speech capabilities.

Who is it for?
This template is perfect for:
- Developers looking to add voice interfaces to their applications
- Customer service teams wanting to create voice-enabled support systems
- Content creators building interactive voice experiences
- Anyone interested in creating their own "Alexa-like" assistant

How it works
The workflow consists of two main parts:
- Frontend Interface: A beautiful animated orb that users click to activate voice recording
- Backend Processing: Receives the audio transcription, processes it through an AI agent with memory, and returns voice responses

The system uses:
- Web Speech API for voice recognition (browser-based)
- OpenAI GPT-4o-mini for intelligent responses
- OpenAI Text-to-Speech for voice synthesis
- Session memory to maintain conversation context

Setup requirements
- n8n instance (self-hosted or cloud)
- OpenAI API key with access to the GPT-4o-mini model and the Text-to-Speech API
- Modern web browser with Web Speech API support (Chrome, Edge, Safari)

How to set up
1. Import the workflow into your n8n instance
2. Add your OpenAI credentials to both OpenAI nodes
3. Copy the webhook URL from the "Audio Processing Endpoint" node
4. Edit the "Voice Assistant UI" node and replace YOURWEBHOOKURL_HERE with your webhook URL
5. Access the "Voice Interface Endpoint" webhook URL in your browser
6. Click the orb and start talking!

How to customize the workflow
- Change the AI personality: Edit the system message in the "Process User Query" node
- Modify the visual style: Customize the CSS in the "Voice Assistant UI" node
- Add more capabilities: Connect additional tools to the AI Agent
- Change the voice: Select a different voice in the "Generate Voice Response" node
- Adjust memory: Modify the context window length in the "Conversation Memory" node

Demo
Watch the template in action: https://youtu.be/0bMdJcRMnZY
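A small Python sketch of the backend step: answering a transcription with GPT-4o-mini and synthesizing the reply with OpenAI's text-to-speech REST endpoint. The voice, TTS model, and output filename are illustrative choices, not necessarily what the template configures.

```python
import os
import requests
from openai import OpenAI

client = OpenAI()
API_KEY = os.environ["OPENAI_API_KEY"]

def voice_reply(transcription: str, out_path: str = "reply.mp3") -> str:
    # 1. Answer the transcribed question with GPT-4o-mini
    answer = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": transcription}],
    ).choices[0].message.content

    # 2. Turn the answer into speech with the text-to-speech endpoint
    audio = requests.post(
        "https://api.openai.com/v1/audio/speech",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"model": "tts-1", "voice": "alloy", "input": answer},
        timeout=60,
    )
    audio.raise_for_status()
    with open(out_path, "wb") as f:
        f.write(audio.content)
    return answer
```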

By Anderson Adelino
4123

Find top keywords for YouTube and Google and store them in NocoDB

Template Description
WDF Top Keywords: This workflow is designed to streamline keyword research by automating the process of generating, filtering, and analyzing Google and YouTube keyword data. Ensure compliance with local regulations and API terms of service when using this workflow.

πŸ“Œ Purpose
The WDF Top Keywords workflow automates collecting, processing, and managing keyword data for both Google and YouTube platforms. By leveraging multiple data sources and APIs, it ensures an efficient and scalable approach to identifying high-impact keywords for SEO, content creation, and marketing campaigns.

Key Features
- Automates the generation of keyword suggestions using autocomplete APIs.
- Integrates with NocoDB to store and manage keyword data.
- Filters keywords based on monthly search volume and cost-per-click (CPC).
- Supports bulk import of keyword data into structured databases.
- Outputs both Google and YouTube keyword insights, enabling informed decision-making.

🎯 Target Audience
This workflow is ideal for:
- Digital marketers aiming to optimize ad campaigns with data-driven insights.
- SEO specialists looking to identify high-potential keywords efficiently.
- Content creators seeking trending and relevant topics for their platforms.
- Agencies managing keyword research for multiple clients.

βš™οΈ How It Works
1. Trigger: The workflow runs on demand or at scheduled intervals.
2. Keyword Generation: Retrieves base keywords from NocoDB and generates autocomplete suggestions for Google and YouTube.
3. Data Processing: Filters and formats keyword data based on specific criteria (e.g., search volume, CPC) and consolidates results for efficient storage and analysis.
4. Storage and Output: Saves data into structured NocoDB tables for tracking and reuse, and bulk imports monthly search volume statistics for detailed analysis.

πŸ› οΈ Key APIs and Tools Used
- NocoDB: Stores and organizes base and processed keyword data.
- DataForSEO API: Provides search volume and keyword performance metrics.
- Google Autocomplete API: Suggests relevant Google search terms.
- YouTube Autocomplete API: Suggests trending YouTube keywords.
- Social Flood Docker Instance: Serves as the local integration hub.

Setup Instructions
Required tools:
- NocoDB
- n8n
- DataForSEO account
- Social Flood Docker instance

Create the following NocoDB tables:
- Base Keyword Search
- Second Order Google Keywords
- Second Order YouTube Keywords
- Search Volume

This template empowers users to handle complex keyword research tasks effortlessly, saving time and providing actionable insights. Share this template to enhance your workflow efficiency!
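As an illustration of the autocomplete step, the sketch below queries Google's publicly reachable (but unofficial and undocumented) suggest endpoint directly; the template itself routes these lookups through a Social Flood instance, and the response shape here is an assumption.

```python
import requests

def autocomplete(query: str, youtube: bool = False) -> list[str]:
    """Fetch autocomplete suggestions for a seed keyword."""
    params = {"client": "firefox", "q": query}
    if youtube:
        params["ds"] = "yt"  # restrict suggestions to YouTube search
    resp = requests.get(
        "https://suggestqueries.google.com/complete/search",
        params=params,
        timeout=10,
    )
    resp.raise_for_status()
    # Assumed response shape: [original_query, [suggestion1, suggestion2, ...], ...]
    return resp.json()[1]

print(autocomplete("keyword research"))
print(autocomplete("keyword research", youtube=True))
```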

By Shannon Atkinson
3434

Categorize and label existing Gmail emails automatically with GPT-4o mini

πŸ“¨ Categorize and Label Existing Gmail Emails Automatically with GPT-4o mini

πŸ‘₯ Who's it for
This workflow is perfect for individuals or teams who want to sort and label existing emails in their Gmail inbox πŸ—ƒοΈ using AI. Ideal for cleaning up unlabeled emails in bulk β€” no coding required!
For sorting incoming email messages in your Gmail inbox, please use this free workflow: Categorize and Label Incoming Gmail Emails Automatically with GPT-4o mini

πŸ€– What it does
It manually processes a selected number of existing Gmail emails, skips those that already have labels, sends the content to an AI Agent powered by GPT-4o mini 🧠, and applies a relevant Gmail label based on the email content. All labels must already exist in Gmail.

βš™οΈ How it works
- ▢️ Manual Trigger – The workflow starts manually when you click "Execute Workflow".
- πŸ“₯ Gmail Get Many Messages – Pulls a batch of existing inbox emails (default: 50).
- 🚫 Filter – Skips emails that already have one or more labels.
- 🧠 AI Agent (GPT-4o mini) – Analyzes the content and assigns a category.
- 🧾 Structured Output Parser – Converts the AI output into structured JSON.
- πŸ”€ Switch Node – Routes each email to the right label based on the AI result.
- 🏷️ Gmail Nodes – Apply the correct Gmail label to the email.

πŸ“‹ Requirements
- Gmail account connected to n8n
- Gmail labels must be manually created in your inbox beforehand
- Labels must exactly match the category names defined in the AI prompt
- OpenAI credentials with GPT-4o mini access
- n8n's AI Agent & Structured Output Parser nodes

πŸ› οΈ How to set up
1. In your Gmail account, create all the labels you want to use for categorizing emails
2. Open the workflow and adjust the email fetch limit in the Gmail node (e.g., 50, 100)
3. Confirm that the Filter skips emails that already have labels
4. Define your categories in the AI Agent prompt β€” these must match the Gmail labels exactly
5. In the Switch Node, create a condition for each label/category
6. Ensure each Gmail Label Node applies the correct existing label
7. Save the workflow and run it manually whenever you want to organize your inbox βœ…

🎨 How to customize the workflow
- Add or remove categories in the AI prompt & Switch Node
- Adjust the batch size of emails to process more or fewer per run
- Fine-tune the AI prompt to suit your inbox type (e.g., work, personal, client support)
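Outside n8n, the classify-then-label step might look roughly like the Python sketch below. The category list and Gmail label IDs are placeholders, and an already-authorized Gmail API credential object is assumed.

```python
import json
from openai import OpenAI
from googleapiclient.discovery import build

client = OpenAI()
CATEGORIES = ["Work", "Personal", "Newsletters", "Receipts"]           # placeholder categories
LABEL_IDS = {"Work": "Label_1", "Personal": "Label_2",                 # placeholder Gmail label IDs
             "Newsletters": "Label_3", "Receipts": "Label_4"}

def categorize(email_text: str) -> str:
    # Ask GPT-4o mini for a single category as structured JSON
    completion = client.chat.completions.create(
        model="gpt-4o-mini",
        response_format={"type": "json_object"},
        messages=[
            {"role": "system",
             "content": f"Classify the email into one of {CATEGORIES}. "
                        'Reply as JSON: {"category": "<name>"}'},
            {"role": "user", "content": email_text},
        ],
    )
    return json.loads(completion.choices[0].message.content)["category"]

def apply_label(creds, message_id: str, email_text: str) -> None:
    # Apply the matching, pre-existing Gmail label to the message
    category = categorize(email_text)
    gmail = build("gmail", "v1", credentials=creds)
    gmail.users().messages().modify(
        userId="me", id=message_id,
        body={"addLabelIds": [LABEL_IDS[category]]},
    ).execute()
```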

By Arlin Perez
2599

Send daily recipe emails automatically

Not sure what to eat tonight? Have recipes emailed to you daily based on your criteria.

To run this workflow, you will need to have:
- A Recipe Search API key from Edamam
- An active email account with configured credentials

To set up your credentials:
- Set your Edamam AppID and AppKey in the Search Criteria node
- Select (or create) your email credentials in the Send Recipes node (and set up the to: and from: email addresses while you are at it)

To customize the recipes that you receive, open up the Search Criteria node and modify one or more of the following:
- RecipeCount - the number of recipes you would like to receive
- IngredientCount - the maximum number of ingredients you would like each recipe to have
- CaloriesMin - the minimum number of calories the recipe will have
- CaloriesMax - the maximum number of calories the recipe will have
- TimeMin - the minimum amount of time (in minutes) the recipe will take to prepare
- TimeMax - the maximum amount of time (in minutes) the recipe will take to prepare
- Diet - select one of the following options:
  - balanced - Protein/Fat/Carb values in 15/35/50 ratio
  - high-fiber - More than 5g fiber per serving
  - high-protein - More than 50% of total calories from proteins
  - low-carb - Less than 20% of total calories from carbs
  - low-fat - Less than 15% of total calories from fat
  - low-sodium - Less than 140mg Na per serving
  - random - selects a different random diet each day
- Health - select one of the following options:
  - alcohol-free - No alcohol used or contained
  - immuno-supportive - Recipes which fit a science-based approach to eating to strengthen the immune system
  - celery-free - Does not contain celery or derivatives
  - crustacean-free - Does not contain crustaceans (shrimp, lobster etc.) or derivatives
  - dairy-free - No dairy; no lactose
  - egg-free - No eggs or products containing eggs
  - fish-free - No fish or fish derivatives
  - fodmap-free - Does not contain FODMAP foods
  - gluten-free - No ingredients containing gluten
  - keto-friendly - Maximum 7 grams of net carbs per serving
  - kidney-friendly - Per serving: phosphorus less than 250 mg, potassium less than 500 mg, and sodium less than 500 mg
  - kosher - Contains only ingredients allowed by the kosher diet; however, it does not guarantee kosher preparation of the ingredients themselves
  - low-potassium - Less than 150mg per serving
  - lupine-free - Does not contain lupine or derivatives
  - mustard-free - Does not contain mustard or derivatives
  - low-fat-abs - Less than 3g of fat per serving
  - no-oil-added - No oil added except to what is contained in the basic ingredients
  - low-sugar - No simple sugars: glucose, dextrose, galactose, fructose, sucrose, lactose, maltose
  - paleo - Excludes what are perceived to be agricultural products: grains, legumes, dairy products, potatoes, refined salt, refined sugar, and processed oils
  - peanut-free - No peanuts or products containing peanuts
  - pescatarian - Does not contain meat or meat-based products, can contain dairy and fish
  - pork-free - Does not contain pork or derivatives
  - red-meat-free - Does not contain beef, lamb, pork, duck, goose, game, horse, and other types of red meat or products containing red meat
  - sesame-free - Does not contain sesame seed or derivatives
  - shellfish-free - No shellfish or shellfish derivatives
  - soy-free - No soy or products containing soy
  - sugar-conscious - Less than 4g of sugar per serving
  - tree-nut-free - No tree nuts or products containing tree nuts
  - vegan - No meat, poultry, fish, dairy, eggs or honey
  - vegetarian - No meat, poultry, or fish
  - wheat-free - No wheat, can have gluten though
  - random - selects a different random health option each day
- SearchItem - the general term that you are looking for, e.g. chicken
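A rough Python sketch of the underlying Edamam recipe search request that these criteria map onto. The endpoint and parameter names follow Edamam's public Recipe Search API as I understand it, the app ID and key are placeholders, and the response shape is an assumption.

```python
import requests

APP_ID = "YOUR_EDAMAM_APP_ID"    # placeholder
APP_KEY = "YOUR_EDAMAM_APP_KEY"  # placeholder

def search_recipes(query: str, diet: str = "balanced", health: str = "alcohol-free",
                   calories: str = "300-800", time: str = "1-60", count: int = 3):
    resp = requests.get(
        "https://api.edamam.com/api/recipes/v2",
        params={
            "type": "public",
            "q": query,            # SearchItem
            "app_id": APP_ID,
            "app_key": APP_KEY,
            "diet": diet,
            "health": health,
            "calories": calories,  # "min-max" range, as in CaloriesMin/CaloriesMax
            "time": time,          # "min-max" range in minutes, as in TimeMin/TimeMax
        },
        timeout=30,
    )
    resp.raise_for_status()
    hits = resp.json().get("hits", [])[:count]   # count plays the role of RecipeCount
    return [(h["recipe"]["label"], h["recipe"]["url"]) for h in hits]

for label, url in search_recipes("chicken"):
    print(label, url)
```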

By jason
2156

Automated daily briefing with Todoist, Google Calendar & GPT-4o via Gmail

Put your productivity on autopilot with this workflow.

How it works
This workflow generates a beautifully formatted daily briefing email every morning at 6:00 AM by combining your Todoist tasks and Google Calendar events, summarizing them using GPT-4o, and sending them as a clean HTML email. It includes:
- Auto-fetching today's tasks and events
- Formatting them for context
- Generating a motivational summary with GPT-4o
- Converting the output into styled HTML
- Emailing it to you daily

Set up steps
1. Connect your Google Calendar and Todoist accounts
2. Set your project ID in the Todoist node
3. Customize the OpenAI prompt or email template if needed
4. Enable the Schedule Trigger to automate daily runs

All configuration logic and summaries are explained in sticky notes inside the workflow. No external tools required. Just plug, personalize, and automate your day!
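As a rough illustration of the fetch-and-summarize step in plain Python: pull today's tasks from Todoist's REST v2 API and condense them with GPT-4o. The environment variable name and the prompt wording are placeholders; the calendar-event half is omitted for brevity.

```python
import os
import requests
from openai import OpenAI

TODOIST_TOKEN = os.environ["TODOIST_API_TOKEN"]  # placeholder environment variable
client = OpenAI()

def daily_briefing() -> str:
    # Fetch tasks due today from the Todoist REST API
    tasks = requests.get(
        "https://api.todoist.com/rest/v2/tasks",
        headers={"Authorization": f"Bearer {TODOIST_TOKEN}"},
        params={"filter": "today"},
        timeout=30,
    ).json()
    task_lines = "\n".join(f"- {t['content']}" for t in tasks) or "- no tasks due today"

    # Summarize into a short motivational HTML briefing with GPT-4o
    summary = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system",
             "content": "Write a short, upbeat daily briefing in HTML from the task list."},
            {"role": "user", "content": task_lines},
        ],
    ).choices[0].message.content
    return summary
```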

By AK Pasnoor
1892

Generate social media content from video transcripts with Gemini AI & Airtable

🎬 Social Media Content Generator Workflow

Overview
Automated social media content creation from video transcripts.

🎯 Trigger: Airtable Webhook
- Action: Receives webhook from Airtable automation
- Data: RecordId and action type (e.g., "post-ig")
- Purpose: Starts the content generation pipeline

πŸ“Š Step 1: Fetch Record
- Node: Airtable (Get Record)
- Action: Retrieves full record data using RecordId
- Data: Name, transcript, and other fields

πŸ“ Step 2: Create Google Drive Folder
- Node: Google Drive (Create Folder)
- Action: Creates blue folder in /tutorials directory
- Name: Uses record Name field
- Updates: Stores folder ID back to Airtable

πŸ€– Step 3: AI Content Analysis
- Node: AI Agent with Google Gemini 2.5 Flash
- Input: Video transcript from Airtable
- Structured Output: JSON with all social formats: YouTube title & description, YouTube thumbnail text, Twitter thread (array), LinkedIn post, Instagram caption, TikTok caption, YouTube Shorts caption, relevant tags

πŸ’Ύ Step 4: Save Transcript File
- Node: Google Drive (Create from Text)
- Action: Saves transcript as text file
- Location: Inside the created folder
- Name: Uses record Name field

πŸ“‹ Step 5: Update Airtable Results
- Node: Airtable (Update Record)
- Data: All AI-generated social media content
- Special: Twitter thread array joined with newlines

🎯 Result: Complete social media content suite ready for multi-platform publishing, organized in Google Drive with all data stored in Airtable.
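A minimal Python sketch of the AI analysis step (Step 3), asking Gemini for the structured JSON described above via the google-generativeai client. The model name follows the template; the JSON field names and prompt are assumptions.

```python
import os
import json
import google.generativeai as genai

genai.configure(api_key=os.environ["GEMINI_API_KEY"])  # placeholder environment variable
model = genai.GenerativeModel("gemini-2.5-flash")

def generate_social_content(transcript: str) -> dict:
    prompt = (
        "From the video transcript below, return JSON with these keys: "
        "youtube_title, youtube_description, youtube_thumbnail_text, "
        "twitter_thread (array of tweets), linkedin_post, instagram_caption, "
        "tiktok_caption, shorts_caption, tags (array).\n\n" + transcript
    )
    response = model.generate_content(
        prompt,
        generation_config={"response_mime_type": "application/json"},
    )
    content = json.loads(response.text)
    # As in Step 5, the Twitter thread array is joined with newlines before storage
    content["twitter_thread"] = "\n".join(content["twitter_thread"])
    return content
```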

By Kurt Bijl
1825

Create a WhatsApp customer support bot with OpenAI, calendar & email integration

Who's it for
This template is for businesses, customer support teams, and professionals who want to deliver AI-powered WhatsApp assistance. It helps automate conversations, schedule meetings, answer FAQs, and send follow-up emails β€” all from WhatsApp.

How it works
1. A customer sends a WhatsApp message, which is captured by the Twilio Trigger.
2. The incoming text is formatted and passed to the AI Support Agent.
3. Based on the request, the agent can:
   - Manage Google Calendar events (create, list, delete).
   - Answer questions from your knowledge base (Supabase + embeddings).
   - Draft and send emails via Gmail.
4. Reply directly on WhatsApp with the appropriate response.

How to set up
1. Connect your Twilio account with WhatsApp enabled.
2. Add your OpenAI API key.
3. Connect Google Calendar.
4. Connect Gmail.
5. Configure Supabase for knowledge base storage.

Requirements
- Twilio account (with WhatsApp number)
- OpenAI API key
- Google Calendar
- Gmail account
- Supabase project

How to customize
- Update the Set Fields node with your Twilio number, API keys, and Gmail details.
- Add custom documents to Supabase for domain-specific FAQs.
- Adjust AI prompts for different roles (e.g., booking bot, HR assistant, customer support).
- Extend by adding more tools (CRM, Slack, Notion, etc.) as needed.
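A compact Python sketch of the receive-answer-reply loop using the Twilio REST client. The sender and body values would come from the Twilio webhook payload; the WhatsApp number and the model choice are placeholders, and the calendar, email, and knowledge-base tools are left out.

```python
import os
from openai import OpenAI
from twilio.rest import Client

openai_client = OpenAI()
twilio_client = Client(os.environ["TWILIO_ACCOUNT_SID"], os.environ["TWILIO_AUTH_TOKEN"])
WHATSAPP_FROM = "whatsapp:+14155238886"  # placeholder: your Twilio WhatsApp-enabled number

def handle_incoming(sender: str, body: str) -> None:
    """Answer an inbound WhatsApp message (sender and body come from the Twilio webhook)."""
    answer = openai_client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice; the template's agent may differ
        messages=[
            {"role": "system", "content": "You are a helpful customer support agent."},
            {"role": "user", "content": body},
        ],
    ).choices[0].message.content

    # Send the reply back to the customer on WhatsApp
    twilio_client.messages.create(from_=WHATSAPP_FROM, to=sender, body=answer)
```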

By Nabin Bhandari
1630

Automated replies to X threads with Airtop browser automation

Use Case
Automatically responding to X (formerly Twitter) posts can help you engage with potential customers at scale, saving time while maintaining a personal touch.

What This Automation Does
This automation replies to specified X posts using the following input parameters:
- airtop_profile: The name of your Airtop Profile connected to X.
- thread_url: The URL of the X post to reply to. Example
- reply_text: The message you want to post as a reply.

How It Works
1. Creates a browser session using Airtop.
2. Navigates to the specified X post.
3. Types and submits the reply text.

Setup Requirements
- Airtop API Key β€” free to generate.
- An Airtop Profile connected to X (requires one-time login).

Next Steps
- Combine with X Monitoring: Use this with the X monitoring automation to create a fully automated engagement pipeline.
- Extend to Other Platforms: Adapt the automation for use on LinkedIn, Reddit, or any web community.

Read more about this Airtop Automation.

By Airtop
1146

Create a deal in Pipedrive

No description available.

By tanaypant
981