🎬 YouTube Shorts automation tool 🚀
Automate the creation of high-performing YouTube Shorts in minutes!

Who is this for? 🎯
- Content Creators: Generate engaging short videos effortlessly.
- Marketing Agencies: Produce client-ready content quickly.
- Business Owners: Promote products/services through viral short-form content.

What problem does this solve? 🛠️
Creating short-form video content is time-consuming, complex, and skill-intensive. This workflow automates video creation, eliminating the need for video editing expertise while ensuring SEO optimization, high-quality visuals, and professional voiceovers.

How it works 🌟
1. Enter your video idea into the chat interface.
2. AI generates a script optimized for engagement and SEO.
3. A voiceover is created with realistic AI narration.
4. Relevant visuals are selected to match the script.
5. The video is assembled and delivered via a shareable link.

Setup ⚙️ (5-10 min)
1. Connect the required APIs (most have free tiers).
2. Follow the guided setup (video tutorial included).
3. Start generating professional YouTube Shorts instantly!

Required APIs 🔗
- Content Generation: OpenAI, ElevenLabs (script & voiceover)
- Media Processing: Cloudinary, Replicate (images & storage)
- Integration Tools: 0codekit, Creatomate (video assembly)

Customization 🎨
- Adjust script styles & voiceover preferences.
- Modify visuals to match your brand.
- Optimize video length and format.

🚀 Start automating your YouTube Shorts today and grow your audience effortlessly!
3D Product Video Generator from 2D Image for E-Commerce Stores
✅ What problem does this workflow solve?
Shopify and e-commerce store owners often struggle to create engaging 3D videos from static product images. This workflow automates that entire process—from image upload to video delivery—so store owners can get professional-looking 3D videos without any manual editing or follow-up.

---

⚙️ What does this workflow do?
- Accepts a 2D product image and name via a public n8n form.
- Generates a unique slug and folder in Google Drive for the product.
- Uploads the original image to Google Drive and logs data in a spreadsheet.
- Removes the background from the image using the remove.bg API.
- Uploads the cleaned image to Google Drive and updates the spreadsheet.
- Creates a 3D product video from the cleaned image via the Fal.ai API.
- Periodically checks the video creation status.
- Once completed, downloads the video, uploads it to Google Drive, and logs the link.
- Notifies the store owner via email with the video download link.

---

🔧 Setup

🟢 Google Services
- Google Drive: Create and connect a folder where all product assets will be stored.
- Google Spreadsheet: A spreadsheet to log the product name, original image link, cleaned image link, and final video URL.
- Gmail: Connect Gmail to send the final notification email to the store owner.

🔑 API Keys Required
- Remove.bg: Get an API key from remove.bg.
- Fal.ai: Sign up at fal.ai and obtain your API key to use the image-to-video generation service.

---

🧠 How it Works

📝 1. Product Form Submission
A store owner submits the product name and 2D image via a public n8n form.

🗂 2. Organize in Google Drive
A unique slug is generated for the product (a minimal slug-generation sketch follows this description). A new folder is created inside Google Drive using that slug, and the original image is uploaded into the folder.

📊 3. Record in a Spreadsheet
The product name and original image URL are stored in a Google Sheet.

🧹 4. Background Removal
The uploaded image is processed through the remove.bg API to eliminate noisy or cluttered backgrounds. The cleaned image is uploaded back into the product's Drive folder, and its link is updated in the spreadsheet.

🎥 5. Create 3D Video (via Fal.ai)
The cleaned image is passed to the Fal.ai video generation API. The workflow periodically checks the status until the video is ready.

☁️ 6. Store Final Video
Once the video is ready, the file is downloaded, uploaded into the same Google Drive folder, and its link is saved in the spreadsheet next to the respective product entry.

📧 7. Notify the Store Owner
An automated email is sent to the store owner with the video link, letting them know it's ready for use—no waiting, no manual follow-up needed.

---

👤 Who can use it?
This workflow is ideal for:
- 🛍 Shopify Sellers
- 🧺 E-commerce Store Owners
- 📸 Product Photographers
- 🎬 Marketing Teams
- 🤖 Automation Enthusiasts

If you want to automate 3D product video creation using AI—this is the no-code workflow you've been waiting for!
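The folder naming in step 2 hinges on every product getting a unique slug. Below is a minimal sketch, assuming the product name arrives as a plain string from the n8n form; the function name and the random-suffix scheme are illustrative assumptions, not the template's exact Code node.

```typescript
// Minimal slug-generation sketch (TypeScript). The helper name and the
// random suffix are assumptions for illustration only.
function slugify(productName: string): string {
  const base = productName
    .toLowerCase()
    .normalize("NFKD")                 // split accented characters
    .replace(/[\u0300-\u036f]/g, "")   // drop diacritics
    .replace(/[^a-z0-9]+/g, "-")       // collapse non-alphanumerics into hyphens
    .replace(/^-+|-+$/g, "");          // trim leading/trailing hyphens
  // Append a short random suffix so two products with the same name
  // still get distinct Google Drive folders.
  const suffix = Math.random().toString(36).slice(2, 8);
  return `${base}-${suffix}`;
}

// Example: slugify("Café Chair (Oak)") -> "cafe-chair-oak-x7k2q9" (suffix varies)
```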
Build your own PostgreSQL MCP server
This n8n template demonstrates how to build a simple PostgreSQL MCP server to manage your PostgreSQL database, such as HR, payroll, sales, inventory, and more!

This MCP example is based on an official MCP reference implementation, which can be found here: https://github.com/modelcontextprotocol/servers/tree/main/src/postgres

How it works
- An MCP server trigger is used and connected to 5 tools: 2 PostgreSQL and 3 custom workflow.
- The 2 PostgreSQL tools are simple read-only queries, so the PostgreSQL tool can be used directly.
- The 3 custom workflow tools are used for select, insert, and update queries, as these are operations that require a bit more discretion. Whilst it may be easier to let the agent run raw SQL queries, it is a little safer to only allow parameters instead. The custom workflow tool lets us define this restricted schema for the tool input, which we then use to construct the SQL statement ourselves (a sketch follows this description).
- All 3 custom workflow tools trigger the same "Execute workflow" trigger in this very template, which has a switch to route the operation to the correct handler.
- Finally, the standard PostgreSQL node handles the select, insert, and update operations, and the responses are sent back to the MCP client.

How to use
This PostgreSQL MCP server allows any compatible MCP client to manage a PostgreSQL database by supporting select, insert, and update operations. You will need to have a database available before you can use this server.

Connect your MCP client by following the n8n guidelines here: https://docs.n8n.io/integrations/builtin/core-nodes/n8n-nodes-langchain.mcptrigger/integrating-with-claude-desktop

Try the following queries in your MCP client:
- "Please help me check if Alex has an entry in the users table. If not, please help me create a record for her."
- "What was the top selling product in the last week?"
- "How many high priority support tickets are still open this morning?"

Requirements
- PostgreSQL for the database. This can be an external database such as Supabase or one you host internally.
- An MCP client or agent, such as Claude Desktop: https://claude.ai/download

Customising this workflow
If the scope of schemas or tables is too open, try restricting it so the MCP server serves a specific business purpose, e.g. confine querying and editing to HR-only tables before providing access to people in that department.

Remember to set the MCP server to require credentials before going to production and sharing this MCP server with others!
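To illustrate the "parameters only, no raw SQL" idea, here is a minimal sketch of how an insert handler could build its statement from a restricted tool input. The table allow-list, the input shape, and the helper name are assumptions for illustration, not the template's exact nodes.

```typescript
// Sketch: build a parameterised INSERT from restricted tool input (TypeScript).
// The allow-list and the ToolInput shape are assumptions.
type ToolInput = {
  table: string;                              // e.g. "users"
  values: Record<string, string | number>;    // column -> value
};

const ALLOWED_TABLES = new Set(["users", "orders", "tickets"]);

function buildInsert(input: ToolInput): { text: string; params: unknown[] } {
  if (!ALLOWED_TABLES.has(input.table)) {
    throw new Error(`Table not allowed: ${input.table}`);
  }
  const columns = Object.keys(input.values);
  for (const col of columns) {
    // Reject column names that could smuggle in SQL.
    if (!/^[a-z_][a-z0-9_]*$/.test(col)) throw new Error(`Column not allowed: ${col}`);
  }
  // Placeholders ($1, $2, ...) keep user-supplied values out of the SQL text.
  const placeholders = columns.map((_, i) => `$${i + 1}`);
  const text = `INSERT INTO ${input.table} (${columns.join(", ")}) VALUES (${placeholders.join(", ")})`;
  return { text, params: Object.values(input.values) };
}

// Example: buildInsert({ table: "users", values: { name: "Alex", role: "analyst" } })
// -> { text: "INSERT INTO users (name, role) VALUES ($1, $2)", params: ["Alex", "analyst"] }
```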
🤖 AI restaurant assistant for WhatsApp, Instagram & Messenger
Hi, I'm Amanda! 💌

This workflow was created with so much love, care, and attention… especially for you, who runs a restaurant, a cozy little burger place, or a delivery business full of heart. 🥰 I know how busy your days can be, so I made this sweet AI assistant to help you take care of your customers on WhatsApp, Instagram, Messenger (or Evolution API). It sends your beautiful menu, checks ZIP codes, creates payment links, and even notifies the kitchen when the order is ready. All gentle, all automatic, all with love. 💛

---

💡 What this workflow does
- Replies to customers via the WhatsApp API, Instagram Direct, Messenger, and Evolution API
- Checks ZIP codes to see if delivery is available using Google Maps
- Sends your menu as images, because food should look as good as it tastes 🍕
- Collects item selections and offers lovely upsells like drinks or extras
- Creates payment links with the Asaas API
- Confirms when the payment is complete and sends the order to the kitchen
- Stores all messages and session data safely in Supabase
- Uses OpenAI GPT-4o to talk naturally and kindly with your customers

---

⚙️ How to set it up (I'll guide you with care 🧸)
1. Connect your webhook from WhatsApp, Instagram, Messenger, or Evolution API
2. Create a Supabase table called n8nworkflowfollowup. You can use this ready-made template here: 👉 Supabase Sheet Template
3. Add your API keys (OpenAI, Supabase, Google Maps, and Asaas) securely in n8n
4. Customize the AI prompt with your brand's voice and sweet style 💫
5. Set your delivery radius (default is 10 km, but you can change it; a minimal radius-check sketch follows this description)
6. Upload your menu images (from Google Drive, your website, or any link)

That's it! Your assistant is now ready to serve with kindness and automation 💕

---

🍯 Works with:
✅ n8n Cloud and self-hosted n8n
🔐 All API credentials are safely stored using n8n's secure credential manager

---

Want something customized just for you? Chat with me, I'd love to help 💻💛
Chat via WhatsApp (+55 17 99155-7874)
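For setup step 5, the radius check boils down to comparing the customer's distance from the restaurant with the configured limit. A minimal sketch, assuming the coordinates have already been resolved from the Google Maps ZIP code lookup; the function names and the 10 km default are illustrative, not the template's code.

```typescript
// Sketch of a delivery-radius check (TypeScript). Coordinates are assumed to
// come from an earlier geocoding step; names and defaults are illustrative.
type Point = { lat: number; lng: number };

function distanceKm(a: Point, b: Point): number {
  const toRad = (deg: number) => (deg * Math.PI) / 180;
  const R = 6371; // mean Earth radius in km
  const dLat = toRad(b.lat - a.lat);
  const dLng = toRad(b.lng - a.lng);
  const h =
    Math.sin(dLat / 2) ** 2 +
    Math.cos(toRad(a.lat)) * Math.cos(toRad(b.lat)) * Math.sin(dLng / 2) ** 2;
  return 2 * R * Math.asin(Math.sqrt(h)); // haversine formula
}

function canDeliver(restaurant: Point, customer: Point, radiusKm = 10): boolean {
  return distanceKm(restaurant, customer) <= radiusKm;
}
```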
Store Notion's Pages as Vector Documents into Supabase with OpenAI
*Workflow updated on 17/06/2024: Added a 'Summarize' node to avoid creating a row for each Notion content block in the Supabase table.*

This workflow assumes you have a Supabase project with a table that has a vector column. If you don't have it, follow the instructions here: Supabase Langchain Guide

Workflow Description
This workflow automates the process of storing Notion pages as vector documents in a Supabase database with a vector column. The steps are as follows:

1. Notion Page Added Trigger: Monitors a specified Notion database for newly added pages. You can create a specific Notion database where you copy the pages you want to store in Supabase. (Node: Page Added in Notion Database)
2. Retrieve Page Content: Fetches all block content from the newly added Notion page. (Node: Get Blocks Content)
3. Filter Non-Text Content: Excludes blocks of type "image" and "video" to focus on textual content. (Node: Filter - Exclude Media Content)
4. Summarize Content: Concatenates the Notion blocks' content to create a single text for embedding. (Node: Summarize - Concatenate Notion's blocks content)
5. Store in Supabase: Stores the processed documents and their embeddings into a Supabase table with a vector column. (Node: Store Documents in Supabase)
6. Generate Embeddings: Utilizes OpenAI's API to generate embeddings for the textual content. (Node: Generate Text Embeddings)
7. Create Metadata and Load Content: Loads the block content and creates associated metadata, such as page ID and block ID. (Node: Load Block Content & Create Metadata)
8. Split Content into Chunks: Divides the text into smaller chunks for easier processing and embedding generation, as shown in the sketch below. (Node: Token Splitter)
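The chunking in step 8 can be pictured with the rough sketch below. It splits on characters rather than tokens and the size/overlap values are assumptions; the template's Token Splitter node works on tokens, but the idea is the same.

```typescript
// Rough character-based chunking sketch (TypeScript). Chunk size, overlap,
// and the character-based split are assumptions for illustration.
function splitIntoChunks(text: string, chunkSize = 1000, overlap = 100): string[] {
  const chunks: string[] = [];
  let start = 0;
  while (start < text.length) {
    const end = Math.min(start + chunkSize, text.length);
    chunks.push(text.slice(start, end));
    if (end === text.length) break;
    start = end - overlap; // keep some overlap so context is not cut mid-thought
  }
  return chunks;
}

// Example: a 2,500-character page with the defaults yields 3 chunks,
// each sharing 100 characters with its neighbour.
```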
Avoid rate limiting by batching HTTP requests
This workflow demonstrates the use of the Split In Batches node and the Wait node to avoid API rate limits.

- Customer Datastore node: The workflow fetches data from the Customer Datastore node. Based on your use case, replace it with a relevant node.
- Split In Batches node: This node splits the incoming items into batches. Based on the API limit, you can configure the Batch Size.
- HTTP Request node: This node makes API calls to a placeholder URL. If the Split In Batches node returns 5 items, the HTTP Request node will make 5 different API calls.
- Wait node: This node pauses the workflow for the time you specify. On resume, the Split In Batches node is executed again and the next batch is processed.
- Replace Me (NoOp node): This node is optional. If you want to continue your workflow and process the items, replace this node with the corresponding node(s).

The sketch below shows the same batch-then-wait pattern in plain code.
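A minimal sketch of the pattern the workflow implements, assuming a placeholder endpoint; the batch size and delay are example values, not anything configured in the template.

```typescript
// Batch-then-wait sketch (TypeScript). Endpoint, batch size, and delay
// are placeholders chosen for illustration.
async function processInBatches<T>(items: T[], batchSize = 5, waitMs = 1000): Promise<void> {
  for (let i = 0; i < items.length; i += batchSize) {
    const batch = items.slice(i, i + batchSize);
    // One request per item in the current batch, mirroring the HTTP Request node.
    await Promise.all(
      batch.map((item) =>
        fetch("https://example.com/api", {
          method: "POST",
          headers: { "Content-Type": "application/json" },
          body: JSON.stringify(item),
        }),
      ),
    );
    // Pause before the next batch, mirroring the Wait node.
    if (i + batchSize < items.length) {
      await new Promise((resolve) => setTimeout(resolve, waitMs));
    }
  }
}
```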
CSV to JSON converter with error handling and Slack notifications
Who this template is for
This template is for developers or teams who need to convert CSV data into JSON format through an API endpoint, with support for both file uploads and raw CSV text input.

Use case
Converting CSV files or raw CSV text data into JSON format via a webhook endpoint, with error handling and notifications. This is particularly useful when you need to transform CSV data into JSON as part of a larger automation or integration process.

How this workflow works
1. Receives POST requests through a webhook endpoint at /tool/csv-to-json
2. Uses a Switch node to handle different input types:
   - File uploads (binary data)
   - Plain text CSV data
   - JSON format data
3. Processes the CSV data:
   - For files: Uses the Extract From File node
   - For raw text: Converts the text to CSV using a custom Code node that handles both comma and semicolon delimiters (a sketch of this parsing step follows the setup steps)
4. Aggregates the processed data and returns:
   - Success response (200): Converted JSON data
   - Error response (500): Error message with details
5. In case of errors, sends notifications to a Slack error channel with execution details and a link to debug

Set up steps
1. Configure the webhook endpoint by deploying the workflow
2. Set up Slack integration for error notifications:
   - Update the Slack channel ID (currently set to "C0832GBAEN4")
   - Configure OAuth2 authentication for Slack
3. Test the endpoint using CURL for file uploads:

```bash
curl -X POST "https://yoururl.com/webhook-test/tool/csv-to-json" \
  -H "Content-Type: text/csv" \
  --data-binary @path/to/your/file.csv
```

   Or send raw CSV data with the text/plain content type.
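In the spirit of the custom Code node, here is a sketch of a delimiter-aware CSV-to-JSON conversion. It auto-detects comma vs. semicolon from the header row and deliberately skips quoted-field handling to stay short; it is an illustration under those assumptions, not the template's exact code.

```typescript
// Delimiter-aware CSV-to-JSON sketch (TypeScript). Quoted fields are not
// handled, an assumption made to keep the example short.
function csvToJson(raw: string): Record<string, string>[] {
  const lines = raw.trim().split(/\r?\n/);
  if (lines.length < 2) return [];
  // Pick whichever delimiter appears more often in the header row.
  const delimiter =
    (lines[0].match(/;/g)?.length ?? 0) > (lines[0].match(/,/g)?.length ?? 0) ? ";" : ",";
  const headers = lines[0].split(delimiter).map((h) => h.trim());
  return lines.slice(1).map((line) => {
    const cells = line.split(delimiter);
    return Object.fromEntries(headers.map((h, i) => [h, (cells[i] ?? "").trim()]));
  });
}

// Example: csvToJson("name;age\nAda;36") -> [{ name: "Ada", age: "36" }]
```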
2-way sync Notion and Google Calendar
This workflow syncs multiple Notion databases to your Google Calendar. And it works both ways.

What events are supported?
Everything except recurring events. All-day events, multiple-day events, start and end dates… these are all supported. You set them in Notion and they stay in sync with Google. And vice versa.

Why doesn't it support recurring events?
Notion doesn't support recurring events yet. So when you create a recurring event in Google, it will only consider the first date, ignoring future occurrences of the event.

Can I connect more than one Notion database?
Yes. You can have many Notion databases synced to one Google Calendar account. You can see how to do it in the workflow instructions. It is recommended that you create more calendars in your account, so that you can link each calendar to a different database in Notion. But that's a choice.

What happens if I delete an event or page?
- Notion page deleted → Deletes the event in Google
- Notion date property cleared → Deletes the event in Google
- Google event deleted → Clears the date property in Notion, but keeps the page, so you don't lose your work.

A small sketch of these deletion rules follows this description.

Does it update the events?
Yes. When you update the event in Google or in Notion, it syncs both ways.

How can I know what Notion item was linked to an event?
Either by the name or by clicking the hyperlink in the event description that says: 👉 View in Notion.

When I create a new event in Google, does it add an item to Notion?
Yes. When you create an event inside one of your calendars, the item is synced to the corresponding Notion database.

Does it sync event descriptions?
No. The event description will always be "View in Notion". Even if you change it in Google Calendar, it will be overwritten when you make a change to the Notion page.

🎉 When you buy this template you receive step-by-step instructions on how to set it up.

Check out my other templates 👉 https://n8n.io/creators/solomon/
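The deletion behaviour above can be summarised as a small decision function. This is only a sketch of the rules as described; the state shape and function name are assumptions, not the template's internals.

```typescript
// Sketch of the deletion rules (TypeScript). The SyncState shape and the
// returned action strings are illustrative assumptions.
type SyncState = {
  notionPageDeleted: boolean;
  notionDateCleared: boolean;
  googleEventDeleted: boolean;
};

function deletionAction(state: SyncState): string {
  if (state.notionPageDeleted || state.notionDateCleared) {
    return "delete the Google Calendar event";
  }
  if (state.googleEventDeleted) {
    // The Notion page is intentionally kept so no work is lost.
    return "clear the Notion date property, keep the page";
  }
  return "no deletion needed";
}
```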
Collect LinkedIn profiles with AI processing using SerpAPI, OpenAI, and NocoDB
What problem does this solve?
It fetches LinkedIn profiles for a multitude of purposes based on a keyword and location via Google search, and stores them in an Excel file for download and in a NocoDB database. It tries to avoid costly services and should be n8n beginner friendly. It uses serpapi.com to avoid being blocked by Google Search and to make the data easier to process.

What does it do?
- Based on criteria input, it searches LinkedIn profiles
- It discards unnecessary data and turns the follower count into a real number
- The output is provided as an Excel table for download and in a NocoDB database

How does it do it?
- Based on criteria input, it uses serpapi.com to conduct a Google search of the respective LinkedIn profiles
- With openai.com, the name of the respective company is added
- With openai.com, the follower number, e.g. "300+", is turned into a real number: 300 (a small parsing sketch follows this description)
- All unnecessary metadata is discarded
- As an output, an Excel file is created
- The output is stored in a nocodb.com table

Step-by-step instructions
1. Import the workflow: Copy the workflow JSON from the "Template Code" section below and import it into n8n via "Import from File" or "Import from URL".
2. Set up a free account at serpapi.com and get API credentials to enable good Google search results.
3. Set up an API account at openai.com and get an API key.
4. Set up a nocodb.com account (or self-host) and get the API credentials.
5. Create the credentials for serpapi.com, openai.com, and nocodb.com in n8n.
6. Set up a table in NocoDB with the fields indicated in the note above the NocoDB node.
7. Follow the instructions as detailed in the notes above individual nodes.
8. When the workflow is finished, open the Excel node and click download if you need the Excel file.
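As a point of comparison for the follower-count step: in the template this conversion is done by OpenAI, but the same idea can be expressed as a small parsing function. The regex approach below is an alternative shown purely for illustration.

```typescript
// Sketch: turn a follower string such as "300+" or "1.2K followers" into a
// number (TypeScript). An illustrative alternative to the template's OpenAI step.
function parseFollowerCount(raw: string): number | null {
  const cleaned = raw.toLowerCase().replace(/followers?/g, "").trim();
  const match = cleaned.match(/([\d.,]+)\s*(k|m)?/);
  if (!match) return null;
  const base = parseFloat(match[1].replace(",", "."));
  if (Number.isNaN(base)) return null;
  const factor = match[2] === "k" ? 1_000 : match[2] === "m" ? 1_000_000 : 1;
  return Math.round(base * factor);
}

// Examples: parseFollowerCount("300+") -> 300
//           parseFollowerCount("1.2K followers") -> 1200
```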
Elastic alert notification via Microsoft Graph API
Who is this template for?
This template is for teams and administrators who use n8n to monitor Elastic alerts and want to receive automated email notifications when an alert is triggered. It leverages the Microsoft Graph API to send emails and provides an efficient way to notify users about alerts directly in their inbox.

How it works
The template connects to the Elastic API to retrieve alert data. When a new alert is detected, the workflow processes the alert content and sends an email notification via the Microsoft Graph API. The email includes alert details such as the alert name, timestamp, severity, and a summary of the message, allowing for quick action or review. A sketch of the email request is shown after the setup steps.

Setup steps
1. Set up OAuth2 credentials in n8n for the Microsoft Graph API with the Mail.Send permission.
2. Configure your Elastic API endpoint in the HTTP Request node to retrieve alerts.
3. Modify the email recipients in the template to specify who will receive the alert notifications.
4. Customize the email format, if necessary, to include additional alert details or adjust the message.
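For orientation, a sketch of roughly what the Graph sendMail call looks like when built from an alert. The alert fields and recipient address are placeholders, and this is an approximation of the request the HTTP Request node would send, not the template's exact node configuration.

```typescript
// Sketch of a Graph API sendMail request built from an Elastic alert (TypeScript).
// Alert fields and the recipient address are placeholder assumptions.
type ElasticAlert = { name: string; timestamp: string; severity: string; message: string };

async function sendAlertMail(alert: ElasticAlert, accessToken: string): Promise<void> {
  const payload = {
    message: {
      subject: `[${alert.severity}] Elastic alert: ${alert.name}`,
      body: {
        contentType: "Text",
        content: `Alert: ${alert.name}\nTime: ${alert.timestamp}\nSeverity: ${alert.severity}\n\n${alert.message}`,
      },
      toRecipients: [{ emailAddress: { address: "ops-team@example.com" } }],
    },
    saveToSentItems: true,
  };
  const res = await fetch("https://graph.microsoft.com/v1.0/me/sendMail", {
    method: "POST",
    headers: { Authorization: `Bearer ${accessToken}`, "Content-Type": "application/json" },
    body: JSON.stringify(payload),
  });
  if (!res.ok) throw new Error(`sendMail failed: ${res.status}`);
}
```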
Summarize news articles & auto-post to social media with GPT-4 and GetLate
Different Articles Summarizer & Social Media Auto-Poster

This n8n template demonstrates how to extract full-text articles from different news websites, summarize them with AI, and automatically generate content for social networks (Twitter, Instagram, Threads, LinkedIn, YouTube). You can use it for any news topic. Example: posting summaries of breaking news articles.

Possible use cases
- Automate press article summarization with GPT.
- Create social media posts optimized for young audiences.
- Publish content simultaneously across multiple platforms with the Late API.

How it works
1. The workflow starts manually or with a trigger.
2. URLs of news articles are defined in the Edit Fields node.
3. Each URL is processed separately via Split Out.
4. HTTP Request fetches the article HTML.
5. A custom Code node extracts clean text (title, content, main image); a rough extraction sketch follows this description.
6. OpenAI summarizes each article factually.
7. Aggregate combines the results.
8. Another OpenAI node (Message a model) creates structured JSON summaries for young readers.
9. A final OpenAI node (Message a model1) generates short social media posts (hook, summary, CTA, hashtags).
10. Images are extracted via HTML1 and uploaded to Google Drive.
11. Posts (text + image) are sent to the Late API for multi-platform scheduling (Twitter, Instagram, Threads, LinkedIn, YouTube).

Requirements
- OpenAI API key connected to n8n.
- Google Drive account (for storing article images).
- Late API credentials with platform account IDs.
- A valid list of article URLs.
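The extraction in step 5 can be pictured with the sketch below, which pulls a title, body text, and main image out of article HTML with regular expressions. Real articles vary a great deal, so treat this as an illustration of the idea under simplifying assumptions, not the template's exact Code node.

```typescript
// Rough article-extraction sketch (TypeScript). Regex-based parsing is an
// assumption made for brevity; real pages often need a proper HTML parser.
function extractArticle(html: string): { title: string; content: string; image: string } {
  const title =
    html.match(/<meta property="og:title" content="([^"]*)"/i)?.[1] ??
    html.match(/<title>([^<]*)<\/title>/i)?.[1] ??
    "";
  const image = html.match(/<meta property="og:image" content="([^"]*)"/i)?.[1] ?? "";
  // Keep only paragraph text and strip the remaining tags.
  const paragraphs = [...html.matchAll(/<p[^>]*>(.*?)<\/p>/gis)].map((m) =>
    m[1].replace(/<[^>]+>/g, "").trim(),
  );
  return { title, image, content: paragraphs.filter(Boolean).join("\n\n") };
}
```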
Auto file organizer for Google Drive: sort PDFs, images & documents by type
Description:
This ready-to-deploy n8n automation template smartly detects and classifies files uploaded to a specified Google Drive folder based on MIME type. It automatically moves each file into its correct destination folder: Documents, PDFs, or Images — ensuring a clean and organized Drive, effortlessly.

Perfect for remote teams, admins, educators, legal pros, and automation-focused operations, this workflow eliminates manual sorting and saves hours of repetitive work.

What This Template Does (Step-by-Step)
1. ⚙️ Manual Trigger: Launch the workflow on demand using the "Execute Workflow" trigger.
2. 📁 Search Files in Source Folder (Google Drive): Lists all files inside your chosen folder (e.g., "Uploads").
3. 🔁 Loop Over Files (SplitInBatches): Iterates through each file one by one to ensure reliability.
4. 📥 Download File (Google Drive): Retrieves the file metadata and MIME type required for filtering.
5. 🧠 Smart File Type Detection via If Nodes (a routing sketch follows this description):
   - application/json → Move to Documents folder
   - application/pdf → Move to PDFs folder
   - image/jpeg → Move to Images folder
   (Easily customizable to support additional types like PNG, DOCX, etc.)
6. 📂 Move Files to Designated Folders: Uses the Google Drive API to relocate each file to its proper location.
7. 🔁 Loop Returns for Next File: After each move, the loop picks the next file in the queue.

Key Features
- ⚙️ Google Drive API v3 integration
- 🔐 OAuth2 for secure access
- 📄 MIME-type–based routing logic
- 🔁 Batch-safe with looping logic
- ✅ File properties are preserved
- 🔄 Auto-removal from the source folder after sorting

Required Integration
- Google Drive (OAuth2)

Use Cases
- Auto-organize client uploads
- Separate scanned PDFs, images, or forms
- Route invoices, receipts, or contracts into folders
- Automatically sort uploaded assignments or resources
- Maintain structured cloud storage without manual intervention

Why Use This Template?
- ✅ No-code deployment
- ✅ Saves hours of manual work
- ✅ Works across teams, departments, or shared Drives
- ✅ Easy to expand with more file types or routing rules
- ✅ Keeps your Drive clean, fast, and organized
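The If-node routing in step 5 amounts to a lookup from MIME type to destination folder. A minimal sketch, assuming placeholder folder IDs; the mapping mirrors the rules listed above and can be extended with more types (PNG, DOCX, ...).

```typescript
// MIME-type routing sketch (TypeScript). Folder IDs are placeholder assumptions.
const FOLDER_BY_MIME: Record<string, string> = {
  "application/json": "DOCUMENTS_FOLDER_ID",
  "application/pdf": "PDFS_FOLDER_ID",
  "image/jpeg": "IMAGES_FOLDER_ID",
};

function targetFolder(mimeType: string): string | null {
  // Returns the destination folder ID, or null for types with no routing rule.
  return FOLDER_BY_MIME[mimeType] ?? null;
}

// Example: targetFolder("application/pdf") -> "PDFS_FOLDER_ID"
```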