
Templates by WeblineIndia

Summarize the new documents from Google Drive and save summary in Google Sheet

This workflow is created by AI developers at WeblineIndia. It streamlines the process of managing content by automatically identifying and fetching the most recently added Google Doc file from your Google Drive. It extracts the content of the document for processing and leverages an AI model to generate a concise and meaningful summary of the extracted text. The summarized content is then stored in a designated Google Sheet, alongside relevant details like the document name and the date it was added, providing an organized and easily accessible reference for future use. This automation simplifies document handling, enhances productivity, and ensures seamless data management.

Steps
1. Fetch the Most Recent Document from Google Drive
   Action: Use the Google Drive node.
   Details: List files, filter by date to fetch the most recently added .doc file, and retrieve its file ID and metadata.
2. Extract Content from the Document
   Action: Use the Google Docs node.
   Details: Set the operation to "Get Content", pass the file ID, and extract the document's text content.
3. Summarize the Document Using an AI Model
   Action: Use an AI model node (e.g., OpenAI, ChatGPT).
   Details: Provide the extracted text to the AI model, use a prompt to generate a summary, and capture the result.
4. Store the Summarized Content in Google Sheets
   Action: Use the Google Sheets node.
   Details: Append a new row to the target sheet with details such as the original document name, summary, and date added (a sketch of this mapping step appears at the end of this entry).

About WeblineIndia
WeblineIndia specializes in delivering innovative and custom AI solutions to simplify and automate business processes. If you need any help, please reach out to us.
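To make the final step concrete, here is a minimal, hedged sketch of an intermediate Function node that assembles the row before the Google Sheets append. The field names (documentName, summary, createdTime) are assumptions and should be adapted to the actual output of your Google Drive and AI nodes.

```javascript
// Minimal sketch of a Function node placed between the AI node and the Google Sheets
// node. The field names (documentName, summary, createdTime) are assumptions and
// must be adapted to the actual output of your Google Drive and AI nodes.
return items.map(item => ({
  json: {
    documentName: item.json.documentName || "",
    summary: (item.json.summary || "").trim(),
    dateAdded: item.json.createdTime || new Date().toISOString(),
  },
}));
```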

By WeblineIndia

Automatic event creation in Google Calendar from Google Sheets data

This workflow streamlines the process of creating events in Google Calendar using event data stored in a Google Sheet. The process begins by retrieving the latest event entry from Google Sheets, ensuring only the most recent event details are processed. Once fetched, a Function node formats the event date to align with Google Calendar's required format, ensuring consistency and preventing date-related errors. After formatting, the structured event details are sent to Google Calendar, where an event is created with essential information such as the event title (summary), description, date, and location. Additionally, the workflow allows customization by setting the event's status as either "Busy" or "Available", helping attendees manage their schedules. A background color can also be assigned for better visibility and categorization. By automating this process, you eliminate the need for manual event creation, ensuring seamless synchronization between Google Sheets and Google Calendar. This improves efficiency, accuracy, and productivity, making event management effortless.

Prerequisites
Before setting up this workflow, ensure the following:
- You have an active Google account connected to Google Sheets and Google Calendar.
- The Google Sheets API and Google Calendar API are enabled in the Google Cloud Console.
- n8n has the necessary OAuth2 authentication configured for both Google Sheets and Google Calendar.
- Your Google Sheet has columns for event details (event name, description, location, date, etc.), for example:

|Event Name|Event Description|Event Start Date|Location|
|-|-|-|-|
|Birthday|Celebration|27-Mar-1989|City|
|Anniversary|Celebration|10-Jun-2015|City|

Customization Options
- Modify the Google Sheets trigger to track updates in specific columns.
- Adjust the data formatting function to support different date/time formats, time zone settings, custom event colors, and attendee invitations.

Steps
Step 1: Add the Google Sheets Trigger Node
1. Click "Add Node" and search for Google Sheets.
2. Select "Google Sheets Trigger" and add it to the workflow.
3. Authenticate using your Google account (select an existing account if already authenticated).
4. Select the Spreadsheet and Sheet Name to monitor.
5. Set the Trigger Event to "Row Added".
6. Click "Execute Node" to test the connection.
7. Click "Save".

Step 2: Process Data with a Function Node
1. Click "Add Node" and search for Function.
2. Add the Function node and connect it to the Google Sheets Trigger node.
3. In the function editor, write a script to extract and format the data (a sketch appears at the end of this entry).
4. Ensure the required fields (title, location, date) are properly structured.
5. Click "Execute Node" to verify the formatted output.
6. Click "Save".

Step 3: Add the Google Calendar Node
1. Click "Add Node" and search for Google Calendar.
2. Select the "Create Event" operation.
3. Authenticate with Google Calendar.
4. Map the required fields: title, description, location, start time.
5. Optional: set the event status and event colors.
6. Click "Execute Node" to test event creation.
7. Click "Save".

Step 4: Final Steps
1. Connect all nodes in sequence (Google Sheets Trigger → Function Node → Google Calendar Node).
2. Test the workflow by adding a sample row in Google Sheets.
3. Verify that the event is created in Google Calendar with the correct title, description, date, and location.

About WeblineIndia
This workflow was built by the AI development team at WeblineIndia. We help businesses automate processes, reduce repetitive work, and scale faster. Need something custom? You can hire AI developers to build workflows tailored to your needs.
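For reference, here is a minimal sketch of what the Step 2 Function node could contain, assuming the sheet columns shown in the table above and a 27-Mar-1989 style date format; the 9:00 UTC start time and one-hour duration are arbitrary placeholders.

```javascript
// Minimal sketch of the Step 2 Function node, assuming the sheet columns are named
// "Event Name", "Event Description", "Event Start Date" and "Location", and that
// dates look like "27-Mar-1989". The 9:00 UTC start time and the one-hour duration
// are arbitrary assumptions.
const MONTHS = { Jan: 0, Feb: 1, Mar: 2, Apr: 3, May: 4, Jun: 5,
                 Jul: 6, Aug: 7, Sep: 8, Oct: 9, Nov: 10, Dec: 11 };

return items.map(item => {
  const row = item.json;
  const [day, mon, year] = row["Event Start Date"].split("-");
  const start = new Date(Date.UTC(Number(year), MONTHS[mon], Number(day), 9, 0));
  const end = new Date(start.getTime() + 60 * 60 * 1000); // one-hour event

  return {
    json: {
      summary: row["Event Name"],
      description: row["Event Description"],
      location: row["Location"],
      start: start.toISOString(), // RFC 3339 timestamp for Google Calendar
      end: end.toISOString(),
    },
  };
});
```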

By WeblineIndia

Add new incoming emails to a Google Sheets spreadsheet as a new row

This n8n workflow automates the process of capturing and storing incoming email details in a structured spreadsheet format, such as Google Sheets or Excel. Whenever a new email is received, the workflow extracts key details, including the sender's email, subject, email body, and optional attachments, and logs them as a new row in the spreadsheet. You can customise this workflow to extract additional details, filter emails based on specific criteria, or send notifications when new entries are added.

Pre-conditions & Requirements
Before setting up this workflow, ensure that:
- You have access to the email provider (e.g., Gmail, Outlook, or IMAP-supported email services).
- The Gmail node is enabled in n8n.
- You have authenticated n8n with Google OAuth2 to access your inbox.
- The Gmail API is enabled in the Google Cloud Console.
- You have an existing Google Sheet where data will be stored.
- The Google Sheets API is enabled.
- You have authenticated n8n with your Google account.

Steps
Step 1: Add the Gmail Trigger Node
1. Click "Add Node" and search for "Gmail".
2. Select "Gmail Trigger" and click to add it.
3. Under Authentication, click "Create New" and authenticate with your Google account. (If you have already connected your Google account, simply select it.)
4. In the Trigger Event field, select "Message Received".
5. Under Filters, you can optionally specify a Label/Mailbox (to listen to emails from a specific folder) or a From Address (to receive emails only from specific senders).
6. Click "Execute Node" to test the connection.
7. Click "Save".
What this does: this node listens for new incoming emails in your Gmail inbox.

Step 2: Store Email Data in Google Sheets
1. Click "Add Node" and search for "Google Sheets" (or Microsoft Excel, if applicable).
2. Under Authentication, connect your Google account.
3. Select the target Spreadsheet and Sheet Name where the data will be stored.
4. Set the Operation to "Append Row".
5. Map the extracted email data to the correct columns (a sketch of this mapping appears at the end of this entry).
6. Click "Execute Node" to test and verify data storage.
7. Click "Save".
What this does: this node automatically adds a new row for each incoming email, ensuring a structured and searchable email log.

Final Step
Connect both nodes and execute the workflow.

Who's behind this?
WeblineIndia's AI development team. We've delivered 3500+ software projects across 25+ countries since 1999. From no-code automations to complex AI systems, our AI team builds tools that drive results. Looking to hire AI developers? Start with us.
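If you prefer to shape the data before it reaches Google Sheets, a hedged sketch of an optional Function node might look like this; the Gmail field names are assumptions that depend on the trigger's Simplify setting and should be verified against your actual output.

```javascript
// Hedged sketch of an optional Function node between the Gmail Trigger and Google
// Sheets nodes. Gmail Trigger field names (from, subject, text/snippet) depend on
// the node's "Simplify" setting, so verify them against your actual output.
return items.map(item => {
  const mail = item.json;
  return {
    json: {
      receivedAt: new Date().toISOString(),
      sender: mail.from || "",
      subject: mail.subject || "",
      body: mail.text || mail.snippet || "",
    },
  };
});
```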

By WeblineIndia

Post new Google Calendar events to Telegram

This n8n workflow automatically sends a Telegram message whenever a new event is added to Google Calendar. It extracts key event details such as the event name, description, event creator, start date, end date, and location, and forwards them to a specified Telegram chat. This ensures you stay updated on all newly scheduled events directly in Telegram.

Prerequisites
Before setting up the workflow, ensure the following:
- Google account with Google Calendar access: the Google Calendar API must be enabled.
- Telegram bot: create a bot using BotFather on Telegram.
- Telegram Chat ID: retrieve the Chat ID (personal chat or group).
- Use OAuth2 for Google Calendar and a Bot Token for Telegram.

Steps
Step 1: Google Calendar Trigger Node (Event Created)
1. Click "Add Node" and search for Google Calendar.
2. Select "Google Calendar Trigger" and add it to the workflow.
3. Authenticate with your Google account.
4. Select "Event Created" as the trigger type.
5. Choose the specific calendar to monitor.
6. Click "Execute Node" to test the connection.
7. Click "Save".

Step 2: Telegram Node (Send Message Action)
1. Click "Add Node" and search for Telegram.
2. Select "Send Message" as the action.
3. Authenticate using your Telegram Bot Token.
4. Set the Chat ID (personal or group chat).
5. Format the message using details from the Google Calendar Trigger and set it as the message text (a sketch of this formatting appears at the end of this entry).
6. Click "Execute Node" to test.
7. Click "Save".

Step 3: Connect & Test the Workflow
1. Link Google Calendar Trigger → Telegram Send Message.
2. Execute the workflow manually.
3. Create a test event in Google Calendar.
4. Check Telegram to see if the event details appear.

n8n Workflow Created by WeblineIndia
This workflow is built by the AI development team at WeblineIndia. We help businesses automate processes, reduce repetitive work, and scale faster. Need something custom? You can hire AI developers to build workflows tailored to your needs.
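As an example of the message formatting in Step 2, here is a hedged sketch of a Function node (the same logic can live in the Telegram node's text expression); the field paths follow the Google Calendar event schema and should be verified against your trigger output.

```javascript
// Sketch of a Function node (or the Telegram node's text expression) that formats
// the event details. The field paths follow the Google Calendar event schema
// (summary, description, creator.email, start, end, location); adjust them if your
// trigger output differs.
return items.map(item => {
  const ev = item.json;
  const text = [
    `New event: ${ev.summary || "(no title)"}`,
    `Description: ${ev.description || "none"}`,
    `Created by: ${(ev.creator && ev.creator.email) || "unknown"}`,
    `Starts: ${(ev.start && (ev.start.dateTime || ev.start.date)) || ""}`,
    `Ends: ${(ev.end && (ev.end.dateTime || ev.end.date)) || ""}`,
    `Location: ${ev.location || "none"}`,
  ].join("\n");
  return { json: { text } };
});
```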

By WeblineIndia

Daily birthday reminders from Google Contacts to Slack

Ensure you never miss a birthday with this automated workflow designed by WeblineIndia. It retrieves your Google Contacts, identifies birthdays happening today, and sends personalized reminders directly to a designated Slack channel. This daily automation keeps your team informed and makes birthday celebrations effortless.

Steps
1. Set Daily Schedule (Cron Node)
   Configure a Cron node to trigger the workflow daily at a specific time (e.g., 8 AM). This ensures the workflow runs consistently every day to check for birthdays.
2. Retrieve Contacts (Google Contacts - Get Contact Node)
   Use the Google Contacts (Get Contact) node to fetch your contact list. Ensure your contacts have birthday details stored for accurate filtering.
3. Filter Birthdays (IF Node)
   Add an IF node to compare the current date with each contact's birthday. Only contacts whose birthdays match today's date move on to the next step (a sketch of this check appears at the end of this entry).
4. Send Birthday Notifications to Slack (Slack - Send Message Node)
   Use the Slack node to send a personalized birthday message to your chosen Slack channel (e.g., general or birthdays). Customize the message to include the contact's name, e.g., "🎉 Today is John Doe's birthday! Let's celebrate!" Configure the node to target a specific Slack channel for seamless notifications.
5. Activate Workflow
   Save and activate the workflow. From now on, it will automatically check for birthdays daily and send timely reminders to your Slack team.

Outcome
This hassle-free automation keeps your team engaged and ensures no birthday goes unnoticed. Celebrate your contacts' special days effortlessly and maintain meaningful connections.

About WeblineIndia
This workflow showcases our commitment to delivering innovative automation solutions that enhance productivity and foster better relationships. Let us help you build the AI automation tools that make a difference.
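Here is a hedged sketch of the birthday check, written as a Function node rather than an IF node; the birthdays structure is assumed to follow the Google People API format and should be confirmed against the Google Contacts node output.

```javascript
// Hedged sketch of the birthday check as a Function node instead of an IF node.
// The birthdays structure follows the Google People API ({ date: { month, day } });
// confirm it against the real output of the Google Contacts node.
const today = new Date();

return items.filter(item => {
  const birthdays = item.json.birthdays || [];
  return birthdays.some(b =>
    b.date &&
    b.date.month === today.getMonth() + 1 &&
    b.date.day === today.getDate()
  );
});
```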

By WeblineIndia

Forward filtered Gmail notifications to Telegram chat

This workflow automatically forwards incoming Gmail emails to a Telegram chat only if the email subject contains specific keywords (like "Urgent" or "Server Down"). The workflow extracts key details such as the sender, subject, and message body, and sends them as a formatted message to a specified Telegram chat. This is useful for real-time notifications, security alerts, or monitoring important emails directly from Telegram while filtering out unnecessary emails.

Prerequisites
Before setting up the workflow, ensure the following:
- The Gmail API is enabled.
- A Telegram bot created using @BotFather, along with its API key.
- The Telegram Chat ID (for personal messages or group messages).
- OAuth2 set up for Gmail and the Bot Token for Telegram.

Customisation Options
- Modify the subject keywords in the IF node to change the filtering criteria.
- Customize how the email details appear in Telegram (bold subject, italic body, etc.).
- Extend the workflow to include email attachments in Telegram.

Steps
Step 1: Gmail Trigger Node (On Message Received)
1. Select "Gmail Trigger" and add it to the workflow.
2. Authenticate with your Google account.
3. Set the Trigger Event to "Message Received".
4. (Optional) Add filters for specific senders, labels, or subjects.
5. Click "Execute Node" to test the connection.
6. Click "Save".

Step 2: IF Node (Conditional Filtering)
1. Add an IF node after the Gmail Trigger.
2. Configure the condition to check whether the email subject contains specific keywords (e.g., "Urgent", "Server Down", "Alert"); a sketch of an equivalent filter appears at the end of this entry.
3. If the condition is true, proceed to the next step. If false, you can stop the workflow or route it elsewhere (optional).

Step 3: Telegram Node (Send Message Action)
1. Click "Add Node" and search for Telegram.
2. Select "Send Message" as the action.
3. Authenticate using your Telegram Bot Token.
4. Set the Chat ID (personal or group chat).
5. Format the message using the email details received from the Gmail trigger node and set it as the message text.

Step 4: Connect & Test the Workflow
1. Link Gmail Trigger → IF node → Telegram Send Message.
2. Save and execute the workflow manually.
3. Send a test email to your Gmail account.
4. Verify that the email details appear in your Telegram chat.

About the Creator, WeblineIndia
This workflow is created by the Agentic business process automation developers at WeblineIndia. We build automation and AI-driven tools that make life easier for your team. If you're looking to hire dedicated developers who can customize workflows around your business, we're just a click away.
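As an alternative to the IF node in Step 2, the keyword filter could be expressed as a small Function node like the sketch below; the keyword list and the subject field name are assumptions to adapt.

```javascript
// Alternative to the IF node: a small Function node that keeps only emails whose
// subject contains one of the keywords. The keyword list and the "subject" field
// name are assumptions to adapt to your needs.
const KEYWORDS = ["urgent", "server down", "alert"];

return items.filter(item => {
  const subject = (item.json.subject || "").toLowerCase();
  return KEYWORDS.some(keyword => subject.includes(keyword));
});
```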

By WeblineIndia

Store form submission in Airtable

This workflow, developed by our AI developers at WeblineIndia, is designed to automate the process of capturing form submissions and storing them in Airtable. By leveraging automation, it eliminates manual data entry, ensuring a smooth and efficient way to handle form data. The purpose of this workflow is to streamline data management, helping businesses save time, reduce errors, and maintain an organized, structured database for easy access and future use.

Steps
1. Trigger on Form Submission (Form Node)
   What it does: activates the workflow whenever a form is submitted.
   How to set it up: use the Form Submission Trigger node to detect new form submissions. This ensures the workflow starts automatically when a user fills out the form.
2. Store Data in Airtable (Airtable Node)
   What it does: transfers the form data into an Airtable base.
   How to set it up: use the Airtable node to map form fields to the corresponding columns in your Airtable table, storing the data accurately (a sketch of this mapping appears after these steps).
3. Finalize and Activate
   What it does: completes the setup to automate data storage upon form submission.
   How to set it up: save and activate the workflow. Once active, it will automatically record all new form submissions in Airtable.
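If you want to rename fields before they reach Airtable, a hypothetical Function node between the trigger and the Airtable node might look like this; every field and column name here is a placeholder.

```javascript
// Hypothetical Function node between the Form Trigger and the Airtable node that
// renames form fields to Airtable column names. Every name here is a placeholder.
return items.map(item => {
  const form = item.json;
  return {
    json: {
      "Name": form.name,
      "Email": form.email,
      "Message": form.message,
      "Submitted At": new Date().toISOString(),
    },
  };
});
```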

By WeblineIndia

YouTube transcription & translation to Google Docs with Gemini AI

YouTube Transcription, Summarization & Translation to Google Docs

This workflow automates the end-to-end process of converting YouTube videos into structured, multilingual written content. It transcribes the video's speech, optionally summarizes it, translates it into the chosen language and stores the result in a well-formatted Google Doc, ready for review, sharing or publication.

Who's It For
- Content creators and bloggers repurposing video content.
- Educators and researchers converting lectures into readable notes.
- Marketing teams localizing video material for international audiences.
- Students summarizing and translating study material.
- YouTube viewers who want written notes or blog-ready formats.

How It Works
1. A Webhook triggers the flow with inputs: youtubeurl, language and enablesummary.
2. A Code node formats these inputs into videoId, originalUrl, language and enable_summary.
3. An HTTP Request node sends the video to the Supadata API for full transcription.
4. Another Code node combines all transcript segments into one body of text (a sketch of this step appears after the use-case examples below).
5. The Basic LLM Chain node uses the Google Gemini Chat Model to summarize and translate the transcript if requested.
6. A Google Docs node creates a new document with a title based on videoId and language.
7. A final Google Docs node appends the processed summary and translation to the created document.

How to Set Up
1. Webhook input: send a POST request with three fields: youtubeurl, language, enablesummary.
2. Configure the Supadata API: add the HTTP URL and Authorization header for transcription.
3. Set up the Gemini Chat Model: use the Google Vertex AI/Gemini integration in the Basic LLM Chain node.
4. Create Google Docs credentials: connect your Google account using OAuth2.
5. Document naming logic: you may adjust document titles using expressions (e.g., {{ videoId }} - {{ language }}).

Requirements
- Supadata API key (or any video-to-text API).
- Google account with Google Docs access.
- Google Gemini access via n8n's LLM integration.
- n8n Cloud or self-hosted instance.
- Basic understanding of webhook setup (or a form frontend).

How to Customize
- Change the LLM model: swap Gemini with GPT-4 or Claude in the LLM Chain node.
- Summarization toggle: use the enable_summary flag to control verbosity.
- Document layout: customize headings, font styles and content sections in Google Docs.
- Multiple languages: extend the workflow to translate into multiple languages and generate one document per language.
- Sharing options: add Gmail or Slack nodes to notify users once the document is generated.

Add-ons
- Notion Export: send the document summary directly into Notion using the Notion node.
- Slack Notification: notify your team with a link to the Google Doc using the Slack node.
- Google Sheets Logging: log video URLs, timestamps, and the language used for auditing.
- n8n Forms Integration: allow users to submit video URLs and language via a hosted n8n form.

Use Case Examples
- Repurposing Videos into Blogs: automatically convert YouTube podcasts into multilingual blog posts.
- Educational Notes: extract and translate lecture content into shareable study documents.
- International Marketing Teams: summarize and localize product explainer videos for different countries.
- Transcription Library: create a searchable database of translated transcripts from niche educational YouTube channels.
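For step 4 of How It Works, here is a hedged sketch of the Code node that combines transcript segments (written in Function-node style); the Supadata response shape is an assumption and should be adjusted to the real payload.

```javascript
// Sketch of the step that merges transcript segments into one block of text.
// The Supadata response shape is an assumption here (an array of segments with a
// "text" property); adjust it to the real payload returned by the API.
const response = items[0].json;
const segments = response.content || response.segments || [];

const transcript = segments
  .map(segment => (segment.text || "").trim())
  .filter(Boolean)
  .join(" ");

return [{ json: { transcript } }];
```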
Common Troubleshooting

| Issue | Possible Cause | Solution |
| - | - | - |
| Webhook not triggering | Incorrect webhook URL or POST format | Double-check the payload and content type (application/json) |
| Transcription API fails | Invalid video ID or API key | Validate the YouTube URL and Supadata API access |
| Empty translation/summarization | Transcript was empty or prompt was weak | Ensure the video contains spoken content and refine the prompt |
| Google Doc not created | OAuth2 credentials not authorized properly | Reconnect the Google Docs credentials in n8n |
| Gemini LLM Chain fails | Model misconfigured or request malformed | Verify your model selection and payload structure |

Need Help?
Need help getting this set up or customizing it for your workflow?
✅ We can help you:
- Set up transcription and translation APIs
- Modify the summarization prompt
- Customize document layouts or automate sharing
👉 Contact WeblineIndia's automation experts!

By WeblineIndia

Automate video uploads to thumbnails with FFmpeg and Google Drive

Automate Video Upload → Auto-Thumbnail → Google Drive

This workflow accepts a video via HTTP upload, verifies it's a valid video, extracts a thumbnail frame at the 5-second mark using FFmpeg (auto-installing a static build if missing), uploads the image to a specified Google Drive folder and returns a structured JSON response containing the new file's details.

Who's it for
- Marketing / social teams who need ready-to-publish thumbnails from raw uploads.
- Developers who want an API-first thumbnail microservice without standing up extra infrastructure.
- Agencies / creators standardizing assets in a shared Drive.

How it works
1. Accept Video Upload (Webhook): receives multipart/form-data with the file in field media at /mediaUpload. The response is deferred until the final node.
2. Validate Upload is Video (IF): checks that {{$binary.media.mimeType}} contains video/. Non-video payloads can be rejected with HTTP 400.
3. Persist Upload to /tmp (Write Binary File): writes the uploaded file to /tmp/<originalFilename or input.mp4> for stable processing.
4. Extract Thumbnail with FFmpeg (Execute Command): uses the system ffmpeg if available; otherwise downloads a static binary to /tmp/ffmpeg. Runs:
   ffmpeg -y -ss 5 -i <video> -frames:v 1 -q:v 2 /tmp/thumbnail.jpg
5. Load Thumbnail from Disk (Read Binary File): reads /tmp/thumbnail.jpg into the item's binary as thumbnail.
6. Upload Thumbnail to Drive (Google Drive): uploads to your target folder. The file name is <original>-thumb.jpg.
7. Return API Response (Respond to Webhook): sends JSON to the client including the Drive file id, name, links, size, and checksums (if available).

How to set up
1. Import the workflow JSON into n8n.
2. Google Drive: create (or choose) a destination folder and copy its Folder ID. Add Google Drive OAuth2 credentials in n8n and select them in the Drive node. Set the folder in the Drive "Upload" node.
3. Webhook: the endpoint is POST /webhook/mediaUpload. Test:

```bash
curl -X POST https://YOUR-N8N-URL/webhook/mediaUpload \
  -F "media=@/path/to/video.mp4"
```

4. FFmpeg: nothing to install manually; the Execute Command node auto-installs a static ffmpeg if it's not present. (Optional) If running n8n in Docker and you want permanence, use an image that includes ffmpeg.
5. Response body: the Respond node returns JSON with file metadata. You can customize the fields as needed.
6. (Optional) Non-video branch: on the IF node's false output, add a Respond node with HTTP 400 and a helpful message.

Requirements
- n8n instance with the Execute Command node enabled (self-hosted/container/VM).
- Outbound network access (to fetch a static FFmpeg if not installed).
- Google Drive OAuth2 credential with permission to the destination folder.
- Adequate temp space in /tmp for the uploaded video and generated thumbnail.

How to customize
- Timestamp: change -ss 5 to another second, or parameterize it via query/body (e.g., timestamp=15).
- Multiple thumbnails: duplicate the FFmpeg + Read steps with -ss 5, -ss 15, -ss 30, and suffix the names -thumb-5.jpg, etc.
- File naming: include the upload time or Drive file ID, e.g. {{ base + '-' + $now + '-thumb.jpg' }}.
- Public sharing: add a Drive → Permission: Create node (Role: reader, Type: anyone) and return webViewLink.
- Output target: replace the Drive node with S3 Upload or Zoho WorkDrive (HTTP Request) if needed.
- Validation: enforce a max file size/MIME whitelist in a small Function node before writing to disk (a sketch appears at the end of this entry).
- Logging: append a row to Google Sheets/Notion with sourceFile, thumbId, size, duration, status.

Add-ons
- Slack / Teams notification with the uploaded thumbnail link.
- Image optimization (e.g., convert to WebP or resize variants).
- Retry & alerts via an error trigger workflow.
- Audit log to a database (e.g., Postgres) for observability.

Use Case Examples
- CMS ingestion: editors upload videos; the workflow returns a thumbnail URL to store alongside the article.
- Social scheduling: upload longform video to generate a quick hero image for a post.
- Client portals: clients drop raw footage; you keep thumbnails uniform in one Drive folder.

Common troubleshooting

| Issue | Possible Cause | Solution |
| - | - | - |
| ffmpeg: not found | System lacks ffmpeg and the static build couldn't download | Ensure outbound HTTPS is allowed; keep the auto-installer lines intact; or use a Docker image that includes ffmpeg. |
| Webhook returns 400 "not a video" | Wrong field name or non-video MIME | Send the file in the media field; ensure it's video/*. |
| Drive upload fails (403 / insufficient permissions) | OAuth scope or account lacks folder access | Reconnect the Drive credential; verify the destination Folder ID and sharing/ownership. |
| Response missing webViewLink / webContentLink | Drive node not returning link fields | Enable link fields in the Drive node or build URLs from the returned id. |
| 413 Payload Too Large at reverse proxy | Proxy limits on upload size | Increase body size limits in your proxy (e.g., Nginx client_max_body_size). |
| Disk full / ENOSPC | Large uploads filling /tmp | Increase temp storage; keep the cleanup step; consider size caps and early rejection. |
| Corrupt thumbnail or black frame | Timestamp lands on a black frame | Change -ss, or place -ss before -i vs. after; try different seconds (e.g., 1–3s). |
| Slow extraction | Large or remote files; cold FFmpeg download | Warm the container; host near the upload source; keep the static ffmpeg cached in the image. |
| Duplicate outputs | Repeat requests with the same video/name | Add a de-dup check (query Drive for an existing <base>-thumb.jpg before upload). |

Need Help?
Want this wired to S3 or Zoho WorkDrive, or to generate multiple timestamps and public links out of the box? We're happy to help.
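Here is a hedged sketch of the optional validation step mentioned under "How to customize"; the binary property name media matches the webhook field, while the size limit is an arbitrary example.

```javascript
// Hedged sketch of the optional validation step: reject uploads that are not videos
// or exceed a size cap before anything is written to /tmp. The binary property name
// "media" matches the webhook field; the 200 MB limit is an arbitrary example.
// fileSize may be missing or non-numeric depending on the n8n version, in which
// case that check is effectively skipped.
const MAX_BYTES = 200 * 1024 * 1024;

return items.map(item => {
  const media = item.binary && item.binary.media;
  if (!media) {
    throw new Error("No file received in the 'media' field");
  }
  if (!(media.mimeType || "").startsWith("video/")) {
    throw new Error(`Unsupported MIME type: ${media.mimeType}`);
  }
  if (Number(media.fileSize) > MAX_BYTES) {
    throw new Error(`File too large (limit ${MAX_BYTES} bytes)`);
  }
  return item;
});
```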

By WeblineIndia

Get daily weather and save it in Airtable

This smart automation workflow, created by the AI development team at WeblineIndia, handles the daily collection and storage of weather data. Using the OpenWeatherMap API and Airtable, it gathers vital weather details such as temperature, humidity, and wind speed. The automation ensures daily updates, creating a dependable historical record of weather patterns for future reference and analysis.

Steps
1. Set Schedule Trigger
   Configure a Cron node to trigger the workflow daily, for example at 7 AM.
2. Fetch Weather Data (HTTP Request)
   Use the HTTP Request node to retrieve weather data from the OpenWeatherMap API. Include your API key and query parameters (e.g., q=London, units=metric) to specify the city and desired units.
3. Parse Weather Data
   Use a JSON Parse node to extract key weather details, such as temperature, humidity, and wind speed, from the API response (a sketch of this step appears at the end of this entry).
4. Store Data in Airtable
   Use the Airtable node to insert the parsed data into the designated Airtable table. Ensure proper mapping of fields like temperature, humidity, and wind speed.
5. Save and Execute
   Save the workflow and activate it to ensure weather data is fetched and stored automatically every day.

Outcome
This robust solution, developed by WeblineIndia, reliably collects and archives daily weather data, providing businesses and individuals with an accessible record of weather trends for analysis and decision-making.

About WeblineIndia
We specialize in creating custom automation solutions and innovative software workflows to help businesses streamline operations and achieve efficiency. This weather data fetcher is just one example of our expertise in delivering value through technology.
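A minimal sketch of the parsing step (Step 3) as a Function node; the field paths follow OpenWeatherMap's current-weather response format.

```javascript
// Sketch of the "Parse Weather Data" step as a Function node. The field paths
// (name, main.temp, main.humidity, wind.speed) follow OpenWeatherMap's
// current-weather response format.
return items.map(item => {
  const weather = item.json;
  return {
    json: {
      date: new Date().toISOString().slice(0, 10),
      city: weather.name,
      temperature: weather.main && weather.main.temp,
      humidity: weather.main && weather.main.humidity,
      windSpeed: weather.wind && weather.wind.speed,
    },
  };
});
```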

By WeblineIndia

Sync new files from Google Drive with Airtable

This workflow automatically fetches newly uploaded files from a specific folder in Google Drive, shares them via email with specified recipients, and logs the file details (name, ID, created time, modified time) into Airtable for easy tracking. It streamlines file sharing and management while keeping track of important metadata in a central place.

Step-by-Step Instructions
1. Google Drive Node (Fetch New File)
   Action: fetches newly uploaded files from the specific folder you've chosen in your Google Drive.
   Configuration: set the folder ID in the Google Drive node where the files are uploaded, and use the "New File in Folder" trigger to automatically detect new files added to the folder.
2. Send Email Node (Share File via Email)
   Action: after detecting the new file, shares the file via email with the recipient you specify.
   Configuration: set the recipient's email address, include the file URL from the Google Drive node in the email body for easy access to the file, and add the file name to the email subject or body to notify the recipient about the new file.
3. Airtable Node (Store File Metadata)
   Action: stores the file's metadata, such as name, ID, creation time, modification time, and the email address it was sent to, in your Airtable database.
   Configuration: set up Airtable with a table, map the output from the Google Drive node to store the file metadata, and use the email address from the email node for tracking (a sketch of this mapping appears at the end of this entry).

About WeblineIndia
WeblineIndia specializes in delivering innovative and custom AI solutions to simplify and automate business processes. If you need any help, please reach out to us.
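A hypothetical sketch of the Airtable mapping from Step 3 as a Function node; the field and column names are placeholders to replace with your own.

```javascript
// Hypothetical Function node that shapes the Google Drive trigger output into the
// Airtable record described above. Field and column names on both sides are
// assumptions; the recipient address would normally come from your email node setup.
return items.map(item => {
  const file = item.json;
  return {
    json: {
      "File Name": file.name,
      "File ID": file.id,
      "Created Time": file.createdTime,
      "Modified Time": file.modifiedTime,
      "Sent To": "recipient@example.com", // placeholder recipient
    },
  };
});
```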

By WeblineIndia

Summarize YouTube videos with GPT-4o-mini and Apify transcripts

This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

This workflow automates summarizing YouTube videos by accepting a YouTube URL via a form, fetching the video transcript using Apify, and then generating a concise summary with OpenAI GPT.

Setup Instructions
Prerequisites:
- Apify account with access to the YouTube Transcript actor.
- OpenAI API key (for the GPT-4o-mini model).
- n8n instance with the Apify and OpenAI credentials configured.

Configuration Steps
1. Apify setup: configure Apify API credentials in the Apify node and ensure the YouTube Transcript actor ID (1s7eXiaukVuOr4Ueg) is correct.
2. OpenAI setup: add your OpenAI API key in the OpenAI Chat Model node and confirm the model selection is set to gpt-4o-mini.

Customization
- Modify the form field to accept additional inputs if needed.
- Adjust the Apify actor input JSON in the Payload node for extra metadata extraction (a sketch appears at the end of this entry).
- Customize the summarization options to tweak summary length or style.
- Change the OpenAI prompt or model parameters in the OpenAI Chat Model node for different output quality or tone.

Steps
1. On Form Submission (Form Trigger node): collect the YouTube video URL from the user via a web form.
2. Prepare Payload (Set node): format the YouTube URL and options into the JSON payload for the Apify input.
3. Fetch Transcript (Apify node): run the YouTube Transcript actor to retrieve video captions and metadata.
4. Extract Captions: isolate the captions field from the Apify response for processing.
5. Summarize Transcript: generate a concise summary of the video captions.
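If you replace the Set node in step 2 with a Code node, the payload preparation might look like the sketch below; the form field name and the actor input properties are assumptions to verify against the actor's documentation.

```javascript
// Hedged sketch of the "Prepare Payload" step written as a Code/Function node
// instead of the Set node. The form field name and the actor's input property names
// are assumptions; check the YouTube Transcript actor's documentation for the exact
// schema it expects.
return items.map(item => {
  const videoUrl = item.json["YouTube URL"]; // placeholder form field name
  return {
    json: {
      videoUrl, // assumed actor input property
    },
  };
});
```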

By WeblineIndia