
Templates by Immanuel

Extract trends, auto-generate social content with AI, Reddit, Google & post

Extract Trends and Auto-Generate Social Media Content with OpenAI, Reddit, and Google Trends: Approve and Post to Instagram, TikTok, and More

---

## Description

### What Problem Does This Solve? 🛠️
This workflow automates trend extraction and social media content creation for businesses and marketers. It eliminates manual trend research and content generation by fetching trends, scoring them with AI, and posting tailored content to multiple platforms. Target audience: social media managers, digital marketers, and businesses aiming to streamline their content strategies.

### What Does It Do? 🌟
- Fetches trending topics from Reddit, X, and Google Trends
- Scores trends for relevance using OpenAI
- Generates content for Twitter/X, LinkedIn, Instagram, and Facebook
- Posts to supported platforms
- Stores results in Google Sheets for tracking

### Key Features 📋
- Real-time trend fetching from Reddit and Google Trends
- AI-driven trend scoring and content generation (OpenAI)
- Automated posting to Twitter/X, LinkedIn, Instagram, and Facebook
- Persistent storage in Google Sheets

---

## Setup Instructions

### Prerequisites ⚙️
- **n8n Instance**: Self-hosted or cloud n8n instance.
- **API Credentials**:
  - Reddit API: Client ID and secret from Reddit.
  - SerpApi (Google Trends): API key from SerpApi, stored in n8n credentials.
  - OpenAI API: API key with GPT model access.
  - Twitter/X API: OAuth 1.0a credentials with write permissions.
  - LinkedIn API: OAuth 2.0 credentials with the `w_organization_social` scope.
  - Instagram/Facebook API: Meta Developer app with posting permissions.
  - Google Sheets API: Credentials from Google Cloud Console.

### Installation Steps 📦
1. **Import the Workflow**: Copy the workflow JSON from the "Template Code" section below and import it into n8n via "Import from File" or "Import from URL".
2. **Configure Credentials**: Add API credentials in n8n's Credentials section for Reddit, SerpApi, OpenAI, Twitter/X, LinkedIn, Instagram/Facebook, and Google Sheets, then assign them to the respective nodes. For example, in the Fetch Google Trends node (HTTP Request), use n8n credentials for SerpApi instead of hardcoding the API key: store the key in n8n credentials as `SerpApiKey` and reference it in the node's query parameter as `api_key={{ $credentials.SerpApiKey }}`.
3. **Set Up Google Sheets** with the following columns (exact column names are case-sensitive): Timestamp | Trend | Score | BrandVoice | AudienceMood
4. **Customize Nodes**:
   - OpenAI nodes (Trend Relevance Scoring, Generate Social Media Content): Update the model (e.g., gpt-4o) and prompt as needed.
   - HTTP Request nodes (Post to Twitter/X, Post to LinkedIn, etc.): Verify URLs, authentication, and payloads.
   - Brand voice/audience mood: Adjust Prepare Trend Scoring Input for your desired `brand_voice` (e.g., "casual") and `audience_mood` (e.g., "curious"); see the sketch after these steps.
5. **Test the Workflow**: Run the first segment (Fetch Reddit Trends through Store Selected Trends) to score and store trends, then run the second segment (Retrieve Latest Trends to the end) to generate and post content. Check Google Sheets for posting statuses.
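As a minimal sketch of what the Prepare Trend Scoring Input step might look like in an n8n Code node: the `brand_voice` and `audience_mood` fields come from the setup above, while the incoming item shape and prompt wording are assumptions for illustration, not the template's actual code.

```javascript
// Hypothetical n8n Code node: assemble the input for Trend Relevance Scoring.
// brand_voice and audience_mood match the setup above; the incoming item
// shape ({ trend: string }) is an assumption for illustration.
const brandVoice = 'casual';
const audienceMood = 'curious';

const trends = $input.all().map(item => item.json.trend).filter(Boolean);

const prompt = [
  `Brand voice: ${brandVoice}. Audience mood: ${audienceMood}.`,
  'Score each trend from 0 to 100 for relevance to our brand.',
  'Return JSON: [{ "trend": "...", "score": 0 }].',
  `Trends: ${JSON.stringify(trends)}`,
].join('\n');

return [{ json: { prompt, brand_voice: brandVoice, audience_mood: audienceMood } }];
```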
---

## How It Works

### High-Level Steps 🔍
1. **Fetch Trends**: Pulls trends from Reddit, X, and Google Trends.
2. **Score Trends**: Uses OpenAI to score trends for relevance.
3. **Generate Content**: Creates platform-specific social media content.
4. **Post Content**: Posts to LinkedIn, Facebook, or X.

Detailed descriptions are available in the sticky notes within the workflow screenshot above.

---

### Node Names and Actions

**Trend Extraction and Scoring**
- **Daily Trigger Idea**: Triggers the workflow daily.
- **Set Default Inputs**: Sets default brand_voice and inputs.
- **Fetch Reddit Trends**: Fetches Reddit posts.
- **Extract Reddit Trends**: Extracts trends from Reddit.
- **Fetch Google Trends**: Fetches Google Trends via SerpApi.
- **Extract Google Trends2**: Processes Google Trends data.
- **Fetch Twitter Mentions**: Fetches Twitter mentions.
- **Translate Tweets to English**: Translates tweets.
- **Fix Tweet Translation Output**: Fixes the translation format.
- **Detect Audience Mood**: Detects audience mood.
- **Fix Audience Mood Output**: Fixes the mood output format.
- **Analyze News Sentiment**: Analyzes news sentiment.
- **Combine Data (Merge)**: Merges all data sources.
- **Merge Items into Single Item**: Combines data into one item.
- **Combine Trends and UGC**: Combines trends with UGC.
- **Prepare Trend Scoring Input**: Prepares data for scoring.
- **Trend Relevance Scoring**: Scores trends with OpenAI.
- **Parse Trend Scores**: Parses the scoring output.
- **Store Selected Trends**: Stores trends in Google Sheets.

**Content Generation and Posting**
- **Retrieve Latest Trends**: Retrieves trends from Google Sheets.
- **Parse Retrieved Trends**: Parses the retrieved trends.
- **Select Top Trends**: Selects the top trend.
- **Generate Social Media Content**: Generates platform-specific content.
- **Parse Social Media Content**: Parses the generated content (see the sketch after this list).
- **Generate Images**: Generates images for posts (if applicable).
- **Handle Approvals/Rejection before Posting**
- **Post to Instagram**: Posts to Instagram.
- **Post to Facebook**: Posts to Facebook.
- **Post to LinkedIn**: Posts to LinkedIn.

---

## Customization Tips
- **Add Trend Sources 📡**: Include more sources (e.g., Instagram trends) by adding nodes to Combine Data (Merge).
- **Change Content Tone ✍️**: Update the Generate Social Media Content prompt for a different tone (e.g., "humorous").
- **Adjust Schedule ⏰**: Modify Daily Trigger Idea to run hourly or weekly.
- **Automate TikTok/YouTube 🎥**: Add video generation (e.g., FFmpeg) to post to TikTok and YouTube Shorts.
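A minimal sketch of how the Parse Social Media Content step might split the model's output into per-platform items in an n8n Code node; the JSON shape returned by the model is an assumption based on how the prompt would typically be written, not the template's confirmed format.

```javascript
// Hypothetical n8n Code node: parse the Generate Social Media Content output.
// Assumes the model was prompted to return JSON keyed by platform name.
const raw = $input.first().json.message?.content ?? '{}';

let posts;
try {
  posts = JSON.parse(raw);
} catch (e) {
  throw new Error(`Model did not return valid JSON: ${e.message}`);
}

// Emit one item per platform so each Post-to-* node can pick up its own text.
return ['twitter', 'linkedin', 'instagram', 'facebook']
  .filter(platform => posts[platform])
  .map(platform => ({ json: { platform, text: posts[platform] } }));
```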

By Immanuel

Interactive knowledge base chat with Supabase RAG using AI 📚💬

Google Drive File Ingestion to Supabase for Knowledge Base 📂💾

## Overview 🌟
This n8n workflow automates the process of ingesting files from Google Drive into a Supabase database, preparing them for a knowledge base system. It supports text-based files (PDF, DOCX, TXT, etc.) and tabular data (XLSX, CSV, Google Sheets), extracting content, generating embeddings, and storing data in structured tables. This is a foundational workflow for building a company knowledge base that can be queried via a chat interface (e.g., using a RAG workflow). 🚀

## Problem Solved 🎯
Manually managing a knowledge base with files from Google Drive is time-consuming and error-prone. This workflow solves that by:
- Automatically ingesting files from Google Drive as they are created or updated.
- Extracting content from various file types (text and tabular).
- Generating embeddings for text-based files to enable vector search.
- Storing data in Supabase for efficient retrieval.
- Handling duplicates and errors to ensure data consistency.

**Target Audience:**
- **Knowledge Managers**: Build a centralized knowledge base from company files.
- **Data Teams**: Automate the ingestion of spreadsheets and documents.
- **Developers**: Integrate with other workflows (e.g., RAG for querying the knowledge base).

## Workflow Description 🔍
This workflow listens for new or updated files in Google Drive, processes them based on their type, and stores the extracted data in Supabase tables for later retrieval. Here's how it works:
1. **File Detection**: Triggers when a file is created or updated in Google Drive.
2. **File Processing**: Loops through each file, extracts metadata, and validates the file type.
3. **Duplicate Check**: Ensures the file hasn't been processed before.
4. **Content Extraction**: For text-based files, downloads the file, extracts text, splits it into chunks, generates embeddings, and stores the chunks in Supabase. For tabular files, extracts data from spreadsheets and stores it as rows in Supabase.
5. **Metadata Storage**: Stores file metadata and basic info in Supabase tables.
6. **Error Handling**: Logs errors for unsupported formats or duplicates.

## Nodes Breakdown 🛠️

**Detect New File 🔔**
- Type: Google Drive Trigger
- Purpose: Triggers the workflow when a new file is created in Google Drive.
- Configuration: Credential: Google Drive OAuth2; Event: File Created
- Customization: Specify a folder to monitor specific directories.

**Detect Updated File 🔔**
- Type: Google Drive Trigger
- Purpose: Triggers the workflow when a file is updated in Google Drive.
- Configuration: Credential: Google Drive OAuth2; Event: File Updated
- Customization: Currently disconnected; reconnect it if updates need to be processed.

**Process Each File 🔄**
- Type: Loop Over Items
- Purpose: Processes each file individually from the Google Drive trigger.
- Configuration: Input: `{{ $json.files }}`
- Customization: Adjust the batch size if processing multiple files at once.

**Extract File Metadata 🆔**
- Type: Set
- Purpose: Extracts metadata such as file_id, file_name, mime_type, and webview_link.
- Configuration: Fields: `file_id: {{ $json.id }}`, `file_name: {{ $json.name }}`, `mime_type: {{ $json.mimeType }}`, `webview_link: {{ $json.webViewLink }}`
- Customization: Add more metadata fields if needed (e.g., size, createdTime).

**Check File Type ✅**
- Type: IF
- Purpose: Validates the file type by checking the MIME type.
- Configuration: Condition: `mime_type` contains supported types (e.g., application/pdf, application/vnd.openxmlformats-officedocument.spreadsheetml.sheet).
- Customization: Add more supported MIME types as needed (see the sketch below).
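If the supported-type list grows, the same check can live in a Code node instead of an IF node; a minimal sketch, assuming the MIME types named above plus common text formats:

```javascript
// Hypothetical Code-node equivalent of the Check File Type IF node.
// The supported-type list is an assumption based on the examples above.
const SUPPORTED_MIME_TYPES = [
  'application/pdf',
  'application/vnd.openxmlformats-officedocument.wordprocessingml.document', // DOCX
  'text/plain',
  'application/vnd.openxmlformats-officedocument.spreadsheetml.sheet', // XLSX
  'text/csv',
];

return $input.all().map(item => ({
  json: {
    ...item.json,
    is_supported: SUPPORTED_MIME_TYPES.includes(item.json.mime_type),
  },
}));
```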
**Find Duplicates 🔍**
- Type: Supabase
- Purpose: Checks if the file has already been processed by querying knowledge_base.
- Configuration: Operation: Select; Table: knowledge_base; Filter: `file_id = {{ $node['Extract File Metadata'].json.file_id }}`
- Customization: Add additional duplicate checks (e.g., by file name).

**Handle Duplicates 🔄**
- Type: IF
- Purpose: Routes the workflow based on whether a duplicate is found.
- Configuration: Condition: `{{ $node['Find Duplicates'].json.length > 0 }}`
- Customization: Add notifications for duplicates if desired.

**Remove Old Text Data 🗑️**
- Type: Supabase
- Purpose: Deletes old text data from documents if the file is a duplicate.
- Configuration: Operation: Delete; Table: documents; Filter: `metadata->>'file_id' = {{ $node['Extract File Metadata'].json.file_id }}`
- Customization: Add logging before deletion.

**Remove Old Data 🗑️**
- Type: Supabase
- Purpose: Deletes old tabular data from document_rows if the file is a duplicate.
- Configuration: Operation: Delete; Table: document_rows; Filter: `dataset_id = {{ $node['Extract File Metadata'].json.file_id }}`
- Customization: Add logging before deletion.

**Route by File Type 🔀**
- Type: Switch
- Purpose: Routes the workflow based on the file's MIME type (text-based or tabular).
- Configuration: Rules based on `mime_type` (e.g., application/pdf for text, application/vnd.openxmlformats-officedocument.spreadsheetml.sheet for tabular).
- Customization: Add more routes for additional file types.

**Download File Content 📥**
- Type: Google Drive
- Purpose: Downloads the file content for text-based files.
- Configuration: Credential: Google Drive OAuth2; File ID: `{{ $node['Extract File Metadata'].json.file_id }}`
- Customization: Add error handling for download failures.

**Extract PDF Text 📜**
- Type: Extract from File (PDF)
- Purpose: Extracts text from PDF files.
- Configuration: File Content: `{{ $node['Download File Content'].binary.data }}`
- Customization: Adjust extraction settings for better accuracy.

**Extract DOCX Text 📜**
- Type: Extract from File (DOCX)
- Purpose: Extracts text from DOCX files.
- Configuration: File Content: `{{ $node['Download File Content'].binary.data }}`
- Customization: Add support for other text formats (e.g., TXT, RTF).

**Extract XLSX Data 📊**
- Type: Extract from File (XLSX)
- Purpose: Extracts tabular data from XLSX files.
- Configuration: File ID: `{{ $node['Extract File Metadata'].json.file_id }}`
- Customization: Add support for CSV or Google Sheets.

**Split Text into Chunks ✂️**
- Type: Text Splitter
- Purpose: Splits extracted text into manageable chunks for embedding (see the sketch below).
- Configuration: Chunk Size: 1000; Chunk Overlap: 200
- Customization: Adjust the chunk size and overlap based on document length.
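To make the chunk-size and overlap settings concrete, here is a stand-alone sketch of the overlapping-window splitting the Text Splitter node performs; the node handles this internally, so this is illustrative only.

```javascript
// Illustrative only: overlapping-window chunking as configured above
// (chunk size 1000, overlap 200). The Text Splitter node does the real
// work; this just shows what those two parameters mean.
function splitIntoChunks(text, chunkSize = 1000, overlap = 200) {
  const chunks = [];
  const step = chunkSize - overlap;
  for (let start = 0; start < text.length; start += step) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break;
  }
  return chunks;
}

// Each 1000-character chunk shares its last 200 characters with the next,
// so sentences cut at a boundary still appear intact in one chunk.
```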
**Generate Text Embeddings 🌐**
- Type: OpenAI
- Purpose: Generates embeddings for text chunks using OpenAI.
- Configuration: Credential: OpenAI API key; Operation: Embedding; Model: text-embedding-ada-002
- Customization: Switch to a different embedding model if needed.

**Store Text in Supabase 💾**
- Type: Supabase Vector Store
- Purpose: Stores text chunks and embeddings in the documents table.
- Configuration: Credential: Supabase credentials; Operation: Insert Documents; Table Name: documents
- Customization: Add metadata fields to store additional context.

**Store Tabular Data 💾**
- Type: Supabase
- Purpose: Stores tabular data in the document_rows table.
- Configuration: Operation: Insert; Table: document_rows; Columns: dataset_id, row_data
- Customization: Add validation for the tabular data structure.

**Store File Metadata 📋**
- Type: Supabase
- Purpose: Stores file metadata in the document_metadata table.
- Configuration: Operation: Insert; Table: document_metadata; Columns: file_id, file_name, file_type, file_url
- Customization: Add more metadata fields as needed.

**Record in Knowledge Base 📚**
- Type: Supabase
- Purpose: Stores basic file info in the knowledge_base table.
- Configuration: Operation: Insert; Table: knowledge_base; Columns: file_id, file_name, file_type, file_url, upload_date
- Customization: Add indexes for faster lookups.

**Log File Errors ⚠️**
- Type: Supabase
- Purpose: Logs errors for unsupported file types.
- Configuration: Operation: Insert; Table: error_log; Columns: error_type, error_message
- Customization: Add notifications for errors.

**Log Duplicate Errors ⚠️**
- Type: Supabase
- Purpose: Logs errors for duplicate files.
- Configuration: Operation: Insert; Table: error_log; Columns: error_type, error_message
- Customization: Add notifications for duplicates.

---

Interactive Knowledge Base Chat with Supabase RAG using GPT-4o-mini 📚💬

## Introduction 🌟
This n8n workflow creates an interactive chat interface that allows users to query a company knowledge base using Retrieval-Augmented Generation (RAG). It retrieves relevant information from text documents and tabular data stored in Supabase, then generates natural language responses using OpenAI's GPT-4o-mini model. Designed for teams managing internal knowledge, this workflow enables users to ask questions like "What's the remote work policy?" or "Show me the latest budget data" and receive accurate, context-aware responses in a conversational format. 🚀

## Problem Statement 🎯
Managing a company knowledge base can be a daunting task: employees often struggle to find specific information buried in documents or spreadsheets, leading to wasted time and inefficiencies. Traditional search methods may not understand natural language queries or provide contextually relevant results. This workflow solves these issues by:
- Offering a chat-based interface for natural language queries, making it easy for users to ask questions in their own words.
- Leveraging RAG to retrieve relevant text and tabular data from Supabase, ensuring responses are accurate and context-aware.
- Supporting diverse file types, including text-based files (e.g., PDF, DOCX) and tabular data (e.g., XLSX, CSV), for comprehensive knowledge access.
- Maintaining conversation history to provide context during interactions, improving the user experience.

## Target Audience 👥
This workflow is ideal for:
- **HR Teams**: Quickly access company policies, employee handbooks, or benefits documents.
- **Finance Teams**: Retrieve budget data, expense reports, or financial summaries from spreadsheets.
- **Knowledge Managers**: Build a centralized assistant for internal documentation, streamlining information access.
- **Developers**: Extend the workflow with additional tools or integrations for custom use cases.

## Workflow Description 🔍
This workflow consists of a chat interface powered by n8n's Chat Trigger node, an AI Agent node for RAG, and several tools to retrieve data from Supabase. Here's how it works step by step:
1. **User Initiates a Chat**: The user interacts with a chat interface, sending queries like "Summarize our remote work policy" or "Show budget data for Q1 2025."
2. **Query Processing with RAG**: The AI Agent processes the query using RAG, retrieving relevant data from Supabase tables and generating a response with OpenAI's GPT-4o-mini model.
3. **Data Retrieval and Response Generation**: The workflow uses multiple tools to fetch data: it retrieves text chunks from the documents table using vector search, fetches tabular data from the document_rows table based on file IDs, extracts full document text or lists available files as needed, and generates a natural language response combining the retrieved data.
4. **Conversation History Management**: Stores the conversation history in Supabase to maintain context for follow-up questions.
5. **Response Delivery**: Formats and sends the response back to the chat interface for the user to view.
## Nodes Breakdown 🛠️

**Start Chat Interface 💬**
- Type: Chat Trigger
- Purpose: Provides the interactive chat interface for users to input queries and receive responses.
- Configuration: Chat Title: Company Knowledge Base Assistant; Chat Subtitle: Ask me anything about company documents!; Welcome Message: Hello! I'm your Company Knowledge Base Assistant. How can I help you today?; Suggestions: "What is the company policy on remote work?", "Show me the latest budget data.", "List all policy documents."; Output Chat Session ID: true; Output User Message: true
- Customization: Update the title and welcome message to align with your company branding (e.g., HR Knowledge Assistant). Add more suggestions relevant to your use case (e.g., "What are the company benefits?").

**Process Query with RAG 🧠**
- Type: AI Agent
- Purpose: Orchestrates the RAG process by retrieving relevant data using tools and generating responses with OpenAI's GPT-4o-mini.
- Configuration: Credential: OpenAI API key; Model: gpt-4o-mini; System Prompt: "You are a helpful assistant for a company knowledge base. Use the provided tools to retrieve relevant information from documents and tabular data. If the query involves tabular data, format it clearly in your response. If no relevant data is found, respond with 'I couldn't find any relevant information. Can you provide more details?'"; Input Field: `{{ $node['Start Chat Interface'].json.message }}`
- Customization: Switch to a different model (e.g., gpt-3.5-turbo) to adjust cost or performance. Modify the system prompt to change the tone (e.g., more formal for HR use cases).

**Retrieve Text Chunks 📄**
- Type: Supabase Vector Store (Tool)
- Purpose: Retrieves relevant text chunks from the documents table using vector search.
- Configuration: Credential: Supabase credentials; Operation Mode: Retrieve Documents (As Tool for AI Agent); Table Name: documents; Embedding Field: embedding; Content Field: content_text; Metadata Field: metadata; Embedding Model: OpenAI text-embedding-ada-002; Top K: 10
- Customization: Adjust Top K to retrieve more or fewer results (e.g., 5 for faster responses). Ensure the match_documents function (see prerequisites) is defined in Supabase.

**Fetch Tabular Data 📊**
- Type: Supabase (Tool, Execute Query)
- Purpose: Retrieves tabular data from the document_rows table based on a file ID.
- Configuration: Credential: Supabase credentials; Operation: Execute Query; Query: `SELECT row_data FROM document_rows WHERE dataset_id = $1 LIMIT 10`; Tool Description: Run a SQL query - use this to query the document_rows table once you know the file ID you are querying. dataset_id is the file_id, and filtering always uses row_data, a jsonb field containing all the keys from the file schema given in the document_metadata table.
- Customization: Modify the query to filter specific columns or add conditions (e.g., `WHERE dataset_id = $1 AND row_data->>'year' = '2025'`). Increase the LIMIT for larger datasets.
**Extract Full Document Text 📜**
- Type: Supabase (Tool, Execute Query)
- Purpose: Fetches the full text of a document by concatenating all text chunks for a given file_id.
- Configuration: Credential: Supabase credentials; Operation: Execute Query; Query: `SELECT string_agg(content_text, ' ') AS document_text FROM documents WHERE metadata->>'file_id' = $1 GROUP BY metadata->>'file_id'`; Tool Description: Given a file ID, fetch the text from the documents table.
- Customization: Add filters to the query if needed (e.g., limit to specific metadata fields).

**List Available Files 📋**
- Type: Supabase (Tool, Select)
- Purpose: Lists all files in the knowledge base from the document_metadata table.
- Configuration: Credential: Supabase credentials; Operation: Select; Schema: public; Table: document_metadata; Tool Description: Use this tool to fetch all documents, including the table schema if the file is CSV, Excel, or XLSX.
- Customization: Add filters to list specific file types (e.g., `WHERE file_type = 'application/pdf'`). Modify the selected columns to include additional metadata (e.g., file_size).

**Manage Chat History 💾**
- Type: Postgres Chat Memory (Tool)
- Purpose: Stores and retrieves conversation history to maintain context.
- Configuration: Credential: Supabase credentials (Postgres-compatible); Table Name: n8n_chat_history; Session ID Field: session_id; Session ID Value: `{{ $node['Start Chat Interface'].json.sessionId }}`; Message Field: message; Sender Field: sender; Timestamp Field: timestamp; Context Window Length: 5
- Customization: Increase the context window length for longer conversations (e.g., 10 messages). Add indexes on session_id and timestamp in Supabase for better performance.

**Format and Send Response 📤**
- Type: Set
- Purpose: Formats the AI Agent's response and sends it back to the chat interface.
- Configuration: Fields: `response: {{ $node['Process Query with RAG'].json.output }}`
- Customization: Add additional formatting to the response if needed (e.g., prepend a timestamp or apply markdown formatting).

## Setup Instructions 🛠️

### Prerequisites 📋
- **n8n Setup**: Ensure you're using n8n version 1.0 or higher, and enable the AI features in n8n settings.
- **Supabase**: Create a Supabase project and set up the following tables:
  - documents: id (uuid), content_text (text), embedding (vector(1536)), metadata (jsonb)
  - document_rows: id (uuid), dataset_id (varchar), row_data (jsonb)
  - document_metadata: file_id (varchar), file_name (varchar), file_type (varchar), file_url (text)
  - knowledge_base: id (serial), file_id (varchar), file_name (varchar), file_type (varchar), file_url (text), upload_date (timestamp)
  - n8n_chat_history: id (serial), session_id (varchar), message (text), sender (varchar), timestamp (timestamp)
- Add the match_documents function to Supabase to enable vector search:

```sql
CREATE OR REPLACE FUNCTION match_documents (
  query_embedding vector(1536),
  match_count int DEFAULT 5,
  filter jsonb DEFAULT '{}'
) RETURNS TABLE (
  id uuid,
  content_text text,
  metadata jsonb,
  similarity float
) LANGUAGE plpgsql AS $$
BEGIN
  RETURN QUERY
  SELECT
    documents.id,
    documents.content_text,
    documents.metadata,
    1 - (documents.embedding <=> query_embedding) AS similarity
  FROM documents
  WHERE documents.metadata @> filter
  ORDER BY similarity DESC
  LIMIT match_count;
END;
$$;
```
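A quick way to sanity-check the function after creating it, assuming the documents table already contains ingested rows: reuse an existing row's embedding as the query, which should return that row with similarity 1 at the top. This smoke test is an assumption-labeled addition, not part of the template.

```sql
-- Hypothetical smoke test for match_documents: query with an embedding
-- that already exists in the table; '{}' matches all metadata.
SELECT id, similarity, left(content_text, 80) AS preview
FROM match_documents(
  (SELECT embedding FROM documents LIMIT 1),  -- query embedding
  5,                                          -- match_count
  '{}'::jsonb                                 -- metadata filter (matches all)
);
```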

By Immanuel

Raw materials inventory management with Google Sheets, Supabase and approvals

Automated Raw Materials Inventory Management with Google Sheets, Supabase, and Gmail using n8n Webhooks

## Description

### What Problem Does This Solve? 🛠️
This workflow automates raw materials inventory management for businesses, eliminating manual stock updates, delayed material issue approvals, and missed low stock alerts. It ensures real-time stock tracking, streamlined approvals, and timely notifications. Target audience: small to medium-sized businesses, inventory managers, and n8n users familiar with Google Sheets, Supabase, and Gmail integrations.

### What Does It Do? 🌟
- Receives raw material data and issue requests via form submissions.
- Updates stock levels in Google Sheets and Supabase.
- Manages approvals for material issue requests with email notifications.
- Detects low stock levels and sends alerts via Gmail.
- Maintains data consistency across Google Sheets and Supabase.

### Key Features
- Real-time stock updates from form submissions.
- Automated approval process for material issuance.
- Low stock detection with Gmail notifications.
- Dual storage in Google Sheets and Supabase for redundancy.
- Error handling for robust data validation.

## Setup Instructions

### Prerequisites
- **n8n Instance**: Self-hosted or cloud n8n instance.
- **API Credentials**:
  - Google Sheets API: Credentials from Google Cloud Console with the Sheets scope, stored in n8n credentials.
  - Supabase API: API key and URL from your Supabase project, stored in n8n credentials (do not hardcode them in nodes).
  - Gmail API: Credentials from Google Cloud Console with the Gmail scope.
- **Forms**: A form (e.g., Google Form) to submit raw material receipts and issue requests, configured to send data to the n8n webhooks.

### Installation Steps
1. **Import the Workflow**: Copy the workflow JSON from the "Template Code" section (to be provided) and import it into n8n via "Import from File" or "Import from URL".
2. **Configure Credentials**: Add API credentials in n8n's Credentials section for Google Sheets, Supabase, and Gmail, then assign them to the respective nodes. For example: in the Append Raw Materials node, use Google Sheets credentials (`{{ $credentials.GoogleSheets }}`); in the Current Stock Update node, use Supabase credentials (`{{ $credentials.Supabase }}`); in the Send Low Stock Email Alert node, use Gmail credentials.
3. **Set Up Nodes**:
   - Webhook nodes (Receive Raw Materials Webhook, Receive Material Issue Webhook): Configure the webhook URLs and link them to your form submissions (see the sketch after these steps).
   - Approval email (Send Approval Request): Customize the HTML email template if needed.
   - Low stock alerts (Send Low Stock Email Alert, Send Low Stock Email After Issue): Configure the recipient email addresses.
4. **Test the Workflow**: Submit a test form for a raw material receipt and verify the stock updates in Google Sheets/Supabase. Then submit a material issue request, approve/reject it, and confirm the stock updates and notifications.
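As a minimal sketch of what the Standardize Raw Material Data and Calculate Total Price steps might do in a single n8n Code node after the webhook fires; the incoming form field names are assumptions for illustration and must match your actual form.

```javascript
// Hypothetical n8n Code node after Receive Raw Materials Webhook.
// The raw form field names ('Product ID', 'Quantity Received', 'Unit Price')
// are assumptions; adjust them to match your form.
const body = $input.first().json.body ?? $input.first().json;

const quantityReceived = Number(body['Quantity Received']);
const unitPrice = Number(body['Unit Price']);

if (!Number.isFinite(quantityReceived) || quantityReceived <= 0) {
  throw new Error(`Invalid Quantity Received: ${body['Quantity Received']}`);
}

return [{
  json: {
    product_id: body['Product ID'],
    quantity_received: quantityReceived,
    unit_price: unitPrice,
    total_price: quantityReceived * unitPrice, // the Calculate Total Price step
    received_at: new Date().toISOString(),
  },
}];
```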
## How It Works

### High-Level Steps
1. **Receive Raw Materials**: Processes form submissions for raw material receipts.
2. **Update Stock**: Updates stock levels in Google Sheets and Supabase.
3. **Handle Issue Requests**: Processes material issue requests via forms.
4. **Manage Approvals**: Sends approval requests and processes decisions.
5. **Monitor Stock Levels**: Detects low stock and sends Gmail alerts.

### Detailed Descriptions
Detailed node descriptions are available in the sticky notes within the workflow screenshot (to be provided). Below is a summary of key actions.

### Node Names and Actions

**Raw Materials Receiving and Stock Update**
- **Receive Raw Materials Webhook**: Receives raw material data from a form submission.
- **Standardize Raw Material Data**: Maps form data into a consistent format.
- **Calculate Total Price**: Computes Total Price (Quantity Received * Unit Price).
- **Append Raw Materials**: Records the receipt in Google Sheets.
- **Check Quantity Received Validity**: Ensures Quantity Received is valid.
- **Lookup Existing Stock**: Retrieves the current stock for the Product ID.
- **Check If Product Exists**: Branches based on Product ID existence.
- **Calculate Updated Current Stock**: Adds Quantity Received to stock (True branch).
- **Update Current Stock**: Updates stock in Google Sheets (True branch).
- **Retrieve Updated Stock for Check**: Retrieves the updated stock for the low stock check.
- **Detect Low Stock Level**: Flags if stock is below the minimum.
- **Trigger Low Stock Alert**: Triggers an email if stock is low.
- **Send Low Stock Email Alert**: Sends the low stock alert via Gmail.
- **Add New Product to Stock**: Adds a new product to stock (False branch).
- **Current Stock Update**: Updates the Supabase Current Stock table.
- **New Row Current Stock**: Inserts the new product into Supabase.
- **Search Current Stock**: Retrieves Supabase stock records.
- **New Record Raw**: Inserts the raw material record into Supabase.
- **Format Response**: Removes duplicates from the Supabase response.
- **Combine Stock Update Branches**: Merges the branches for existing/new products.

**Material Issue Request and Approval**
- **Receive Material Issue Webhook**: Receives an issue request from a form submission.
- **Standardize Data**: Normalizes the request data and adds an Approval Link (see the sketch after this list).
- **Validate Issue Request Data**: Ensures Quantity Requested is valid.
- **Verify Requested Quantity**: Validates the Product ID and Submission ID.
- **Append Material Request**: Records the request in Google Sheets.
- **Check Available Stock for Issue**: Retrieves the current stock for the request.
- **Prepare Approval**: Checks stock sufficiency for the request.
- **Send Approval Request**: Emails the approver with Approve/Reject options.
- **Receive Approval Response**: Captures the approver's decision via webhook.
- **Format Approval Response**: Processes the approval data with the Approval Date.
- **Verify Approval Data**: Validates the approval response.
- **Retrieve Issue Request Details**: Retrieves the original request from Google Sheets.
- **Process Approval Decision**: Branches based on the approval action.
- **Get Stock for Issue Update**: Retrieves stock before the update (Approved).
- **Deduct Issued Stock**: Reduces stock by the Approved Quantity (Approved).
- **Update Stock After Issue**: Updates stock in Google Sheets (Approved).
- **Retrieve Stock After Issue**: Retrieves the updated stock for the low stock check.
- **Detect Low Stock After Issue**: Flags low stock after issuance.
- **Trigger Low Stock Alert After Issue**: Triggers an email if stock is low.
- **Send Low Stock Email After Issue**: Sends the low stock alert via Gmail.
- **Update Issue Request Status**: Updates the request status (Approved/Rejected).
- **Combine Stock Lookup Results**: Merges the stock lookup branches.
- **Create Record Issue**: Inserts the issue request into Supabase.
- **Search Stock by Product ID**: Retrieves Supabase stock records.
- **Issues Table Update**: Updates the Supabase Materials Issued table.
- **Update Current Stock**: Updates Supabase stock after issuance.
- **Combine Issue Lookup Branches**: Merges the issue lookup branches.
- **Search Issue by Submission ID**: Retrieves Supabase issue records.
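A minimal sketch of how the Standardize Data step might build the Approve/Reject links that the approval email embeds; the webhook path and query parameter names are assumptions and must match your Receive Approval Response webhook node.

```javascript
// Hypothetical n8n Code node: build approval links for Send Approval Request.
// The webhook path ('approval-response') and parameter names are assumptions.
const baseUrl = 'https://your-n8n-instance.example.com/webhook/approval-response';
const submissionId = $input.first().json.submission_id;

const approveLink = `${baseUrl}?submission_id=${encodeURIComponent(submissionId)}&action=approve`;
const rejectLink = `${baseUrl}?submission_id=${encodeURIComponent(submissionId)}&action=reject`;

return [{
  json: {
    ...$input.first().json,
    approve_link: approveLink,
    reject_link: rejectLink,
  },
}];
```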
## Customization Tips
- **Expand Storage Options**: Add nodes to store data in other databases (e.g., Airtable) alongside Google Sheets and Supabase.
- **Modify Approval Email**: Update the Send Approval Request node to customize the HTML email template (e.g., adjust the styling or add branding).
- **Alternative Notifications**: Add nodes to send low stock alerts via other platforms (e.g., Slack or Telegram).
- **Adjust Low Stock Threshold**: Modify the Detect Low Stock Level node to change the Minimum Stock Level (default: 50); see the sketch below.
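For reference, the low-stock condition amounts to a simple comparison; a sketch of how the Detect Low Stock Level check might look in a Code node, with the field names assumed for illustration:

```javascript
// Hypothetical Code-node version of Detect Low Stock Level.
// Field names (current_stock, minimum_stock_level) are assumptions;
// 50 is the default threshold mentioned above.
const DEFAULT_MINIMUM_STOCK_LEVEL = 50;

return $input.all().map(item => {
  const minimum = item.json.minimum_stock_level ?? DEFAULT_MINIMUM_STOCK_LEVEL;
  return {
    json: {
      ...item.json,
      low_stock: Number(item.json.current_stock) < minimum,
    },
  };
});
```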

By Immanuel

Create & approve POV videos with AI, ElevenLabs & multi-posting (TikTok/IG/YT)

POV Video Creator: Automating TikTok-Style Instagram Video Creation, Approval, and Multi-Platform Posting Using AI, ElevenLabs, Google Sheets, and Social Media APIs

## Description

### What Problem Does This Solve? 🎥
This workflow automates the creation, rendering, approval, and posting of TikTok-style POV (point of view) videos to Instagram, with cross-posting to Facebook and YouTube. It eliminates manual video production, approval delays, and inconsistent posting schedules, ensuring high-quality content creation and distribution. Target audience: social media managers, content creators, small to medium-sized businesses, and n8n users familiar with AI tools, Google Sheets, and social media APIs.

### What Does It Do? 🌟
- Generates daily POV video ideas using OpenAI
- Creates images, videos, and audio with PIAPI.ai and ElevenLabs
- Renders final videos with Creatomate
- Manages approvals via email and Google Sheets
- Posts approved videos to Instagram, Facebook, and YouTube
- Tracks progress in a Google Sheet for transparency

### Key Features
- AI-driven idea generation and script creation
- Automated media production with image, video, and audio synthesis
- Email-based approval system for quality control
- Cross-platform posting to Instagram, Facebook, and YouTube
- Real-time tracking in Google Sheets and Google Drive
- Error handling for rendering and posting failures

## Setup Instructions

### Prerequisites
- **n8n Instance**: Self-hosted or cloud n8n instance
- **API Credentials**:
  - OpenAI API: API key for idea generation, stored in n8n credentials
  - PIAPI.ai API: API key for image and video generation, stored in n8n credentials
  - ElevenLabs API: API key for audio generation, stored in n8n credentials
  - Creatomate API: API key for video rendering, stored in n8n credentials
  - Google Sheets/Drive API: OAuth2 credentials from Google Cloud Console with Sheets and Drive scopes
  - Gmail API: OAuth2 credentials from Google Cloud Console with the Gmail scope
  - Instagram Graph API: User access token with the `instagram_content_publish` permission from a Facebook App
  - Facebook Graph API: Access token from the same Facebook App
  - YouTube API: OAuth2 credentials for YouTube uploads
- **Google Sheet**: A sheet named "POV Videos" with a tab "Instagram" and columns: Timestamp, ID, Subject, Topic, Caption, POVStatus, Prompt, PublishStatus, Link, Final Video, Approval, row_number
- **Creatomate Template**: A pre-configured template with video, audio, and text elements (see the sketch after this list)
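A minimal sketch of how the Render with Creatomate HTTP Request node might call the render API, written as a plain fetch call (Node 18+). The endpoint and body shape follow Creatomate's public REST API as best I recall; verify against their docs. The element names are placeholders that must match the elements in your Creatomate template.

```javascript
// Hypothetical Creatomate render request. Element names ('video-1',
// 'audio-1', 'text-1') are placeholders for your template's elements.
const videoUrl = 'https://example.com/clip.mp4';   // 5-second clip from PIAPI.ai
const audioUrl = 'https://example.com/audio.mp3';  // 20-second clip from ElevenLabs
const caption = 'POV: your caption here';

const response = await fetch('https://api.creatomate.com/v1/renders', {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${process.env.CREATOMATE_API_KEY}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    template_id: 'your-template-id',
    modifications: {
      'video-1': videoUrl,
      'audio-1': audioUrl,
      'text-1': caption,
    },
  }),
});

// Creatomate responds with an array of render objects (as I understand it);
// poll each render's status until it succeeds, then download the result.
const [render] = await response.json();
console.log(render.id, render.status);
```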
### Installation Steps
1. **Import the Workflow**: Copy the workflow JSON from the "Template Code" section (to be provided) and import it into n8n via "Import from File" or "Import from URL".
2. **Configure Credentials**: Add API credentials in n8n's Credentials section for OpenAI, PIAPI.ai, ElevenLabs, Creatomate, Google Sheets/Drive, Gmail, Instagram Graph, Facebook Graph, and YouTube, then assign them to the respective nodes. For example: in "Text-to-Image", use PIAPI.ai credentials (`{{ $credentials.PIAPI }}`); in "Render with Creatomate", use Creatomate credentials (`{{ $credentials.Creatomate }}`); in "Send Approval Request", use Gmail credentials.
3. **Set Up Nodes**:
   - Schedule Trigger: Configure it to run daily.
   - Approval email (Send Approval Request): Customize the HTML email template with approval/rejection links.
   - Posting nodes (Instagram Container, Facebook Posts, Post YouTube): Configure them with your Instagram Business Account ID, Facebook Page ID, and YouTube channel details.
4. **Configure Google Sheet and Drive**: Create the "POV Videos" Google Sheet with the "Instagram" tab and the specified columns, and share the sheet with your Google Sheets credential email. Create "Audio" and "Video" folders in Google Drive, noting their IDs.
5. **Test the Workflow**: Run it manually to verify idea generation, media creation, and posting. Check the email notifications, Google Sheet updates, and social media posts.
6. **Schedule the Workflow**: Enable "Schedule Trigger" and "Schedule Trigger1" for daily runs, and enable "Get Latest Approved Video" to poll at 7 PM daily.

## How It Works

### High-Level Steps
1. **Generate Video Ideas**: Creates daily POV video concepts with OpenAI.
2. **Create Media**: Produces images, videos, and audio using AI tools.
3. **Render Video**: Combines the media into a final video with Creatomate.
4. **Manage Approvals**: Sends approval emails and processes decisions.
5. **Post to Platforms**: Publishes approved videos to Instagram, Facebook, and YouTube.

### Detailed Descriptions
Detailed node descriptions are available in the sticky notes within the workflow (to be provided). Below is a summary of key actions.

### Node Names and Actions

**Video Idea Generation and Script Creation**
- **Schedule Trigger**: Initiates the daily workflow.
- **Get Title**: Fetches pending video ideas from the Google Sheet.
- **Generate Topics**: Uses OpenAI to create a new video idea.
- **Format Row**: Structures the idea into a Google Sheet row.
- **Insert new Prompt, Caption and Title/Topic**: Adds the idea to the Google Sheet.
- **Generate Ideas**: Produces 3 POV sequences.
- **Generate Script**: Expands a sequence into a detailed script.
- **Set Topics**: Stores the script for media creation.

**Media Creation**
- **Text-to-Image**: Generates an image with PIAPI.ai.
- **Get Image**: Retrieves the generated image.
- **Generate Video Prompt**: Creates a video prompt from the image.
- **Generate Video**: Produces a 5-second video with PIAPI.ai.
- **Access Videos**: Retrieves the video URL.
- **Store Video**: Updates the Google Sheet with the video URL.
- **Generate Sound Prompt**: Creates an audio prompt.
- **Text-to-Sound**: Generates a 20-second audio clip with ElevenLabs.
- **Store Sound**: Uploads the audio to Google Drive.
- **Allow Access**: Sets the audio file permissions.

**Video Rendering**
- **Merge**: Combines the script, video, and audio data.
- **List Elements**: Formats the data for Creatomate.
- **Render with Creatomate**: Renders the final video.
- **Check Video Status**: Routes based on render success/failure.

**Storage and Notification**
- **Google Drive**: Uploads the rendered video.
- **New Render Video Alert**: Sends a success email.
- **Failed Render**: Sends a failure email.
- **Render Video Link**: Updates the Google Sheet with the final video URL.

**Approval Process**
- **Approval Email**: Sends the approval request email.
- **Handle Approval/Rejection1**: Processes approval/rejection via webhook.
- **Video Update1**: Updates the Google Sheet with the approval status.

**Social Media Posting**
- **Get Latest Approved Video**: Polls for approved videos.
- **Check Approval**: Routes based on the approval status.
- **Instagram Container**: Creates the Instagram media container (see the sketch after this list).
- **Post to Instagram**: Publishes to Instagram.
- **Facebook Posts**: Posts to Facebook.
- **Download Video**: Downloads the video for YouTube.
- **Post YouTube**: Uploads to YouTube.
- **Mark Rejected**: Updates the status for rejected videos.
- **Update Google Sheet**: Updates the publish status.
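A minimal sketch of the Instagram Container plus Post to Instagram pair using Meta's two-step Graph API publish flow (create a media container, then publish it). The API version and the REELS media type are assumptions about this template's configuration; IDs and tokens are placeholders.

```javascript
// Hypothetical two-step Instagram publish via Meta's Graph API.
const IG_USER_ID = 'your-instagram-business-account-id';
const ACCESS_TOKEN = process.env.IG_ACCESS_TOKEN;
const videoUrl = 'https://example.com/final-video.mp4';
const caption = 'POV: your caption here';

// Step 1 (Instagram Container): create a media container for the video.
const containerRes = await fetch(
  `https://graph.facebook.com/v19.0/${IG_USER_ID}/media` +
  `?media_type=REELS&video_url=${encodeURIComponent(videoUrl)}` +
  `&caption=${encodeURIComponent(caption)}&access_token=${ACCESS_TOKEN}`,
  { method: 'POST' },
);
const { id: creationId } = await containerRes.json();

// Step 2 (Post to Instagram): publish the container once processing finishes.
const publishRes = await fetch(
  `https://graph.facebook.com/v19.0/${IG_USER_ID}/media_publish` +
  `?creation_id=${creationId}&access_token=${ACCESS_TOKEN}`,
  { method: 'POST' },
);
console.log(await publishRes.json()); // { id: '<media-id>' } on success
```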
## Customization Tips
- **Expand Platforms**: Add nodes to post to other platforms.
- **Modify Approval Email**: Update the Send Approval Request node to customize the HTML template.
- **Alternative Notifications**: Add nodes for Slack or Telegram alerts.
- **Adjust Video Duration**: Modify the Generate Video node to change the duration (default: 5 seconds).

By Immanuel

Analyze Telegram messages with OpenAI and send notifications via Gmail & Telegram

AI-powered Telegram message analysis with multi-tool notifications (Gmail, Telegram)

This workflow triggers on Telegram updates, analyzes messages with an AI Agent using MCP tools, and sends notifications via Gmail and Telegram.

## Detailed Description

### Who is this for?
This template is for teams, businesses, or individuals using Telegram for communication who need automated, AI-driven insights and notifications. It's ideal for customer support teams, project managers, or tech enthusiasts wanting to process Telegram messages intelligently and receive alerts via Gmail and Telegram.

### What problem is this workflow solving? Use case
This workflow solves the challenge of manually monitoring Telegram messages by automating message analysis and notifications. For example, a support team can use it to analyze customer queries on Telegram with AI tools (OpenAI, Airbnb, Brave, FireCrawl) and get notified via Gmail and Telegram for quick responses.

### What this workflow does
1. Triggers on a Telegram update (e.g., a new message) using the Listen for Telegram Updates node.
2. Processes the message with the Analyze Message with AI node, an AI Agent using MCP tools such as OpenAI Chat, Airbnb search, Brave search, and FireCrawl.
3. Sends notifications via the Send Gmail Notification and Send Telegram Alert nodes, including AI-generated insights.

### Setup
**Prerequisites:**
- Telegram bot token for the trigger and notification nodes.
- Gmail API credentials for sending emails.
- API keys for OpenAI, Airbnb, Brave, and FireCrawl (used in the AI Agent).

**Steps:**
1. Configure the Listen for Telegram Updates node with your Telegram bot token.
2. Set up the Analyze Message with AI node with your OpenAI API key and the other tool credentials.
3. Configure the Send Gmail Notification node with your Gmail credentials.
4. Set up the Send Telegram Alert node with your Telegram bot token.
5. Test by sending a Telegram message to trigger the workflow.

Setup takes ~15-30 minutes. Detailed instructions are in the sticky notes within the workflow.

### How to customize this workflow to your needs
- Add more AI tools (e.g., sentiment analysis) in the Analyze Message with AI node.
- Modify the notification message in the Send Gmail Notification and Send Telegram Alert nodes to include specific AI outputs (see the sketch below).
- Add nodes for other channels like Slack or SMS after the AI Agent.

### Disclaimer
This workflow uses Community nodes (e.g., Airbnb, Brave, FireCrawl), which are available only in self-hosted n8n instances. Ensure your n8n setup supports Community nodes before using this template.
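A minimal sketch of a Code node that could sit between the AI Agent and the two notification nodes, shaping the agent's output into one shared message; the field names are assumptions based on typical AI Agent and Telegram trigger payloads, not the template's confirmed structure.

```javascript
// Hypothetical n8n Code node between Analyze Message with AI and the
// notification nodes. Field names (output, from, text) are assumptions.
const analysis = $input.first().json.output ?? 'No analysis produced.';
const original = $('Listen for Telegram Updates').first().json.message ?? {};

const summary = [
  `New Telegram message from ${original.from?.username ?? 'unknown user'}:`,
  `"${original.text ?? ''}"`,
  '',
  'AI analysis:',
  analysis,
].join('\n');

// Both Send Gmail Notification and Send Telegram Alert can then reference
// {{ $json.notification_text }} from this item.
return [{ json: { notification_text: summary } }];
```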

By Immanuel