Generate meeting minutes from videos with Whisper, Ollama LLM and Notion
Automated Meeting Minutes from Video Recordings
This workflow automatically transforms video recordings of meetings into structured, professional meeting minutes in Notion. It uses local AI models (Whisper for transcription and Ollama for summarization) to ensure privacy and cost efficiency, while uploading the original video to Google Drive for safekeeping. Ideal for creative teams, production reviews, or any scenario where visual context is as important as the spoken word.
🚀 How It Works
- Wait & Detect: The workflow monitors a local folder. When a new `.mkv` video file is added, it waits until the file has finished copying.
- Prepare Audio: The video is converted into a `.wav` audio file optimized for transcription (under 25 MB with high clarity).
- Transcribe Locally: The local Whisper model generates a timestamped text transcript.
- Generate Smart Minutes: The transcript is sent to a local Ollama LLM, which produces structured, summarized meeting notes.
- Store & Share: The original video is uploaded to Google Drive, a new page is created in Notion with the notes and a link to the video, and a completion notification is sent via Discord.
⏱️ Setup Steps
- Estimated Time: 10–15 minutes (for technically experienced users).
- Prerequisites:
  - Install Python, FFmpeg, and the required packages (`openai-whisper`, `ffmpeg-python`).
  - Run Ollama locally with a compatible model (e.g., `gpt-oss:20b`, `llama3`, `mistral`).
  - Configure n8n credentials for Google Drive, Notion, and Discord.
- Workflow Configuration:
  - Update the file paths for the helper scripts (`wait-for-file.ps1`, `create_wav.py`, `transcribe_return.py`) in the respective "Execute Command" nodes.
  - Change the input folder path (`G:\OBS\videos`) in the "File" node to your own recording directory.
  - Replace the Google Drive folder ID and Notion database/page ID in their respective nodes.
> 💡 Note: Detailed instructions for each step, including error handling and variable setup, are documented in the Sticky Notes within the workflow itself.
📜 Helper Scripts Documentation
wait-for-file.ps1
A PowerShell script that checks if a file is still being written to (i.e., locked by another process). It returns 0 if the file is free and 1 if it is still locked.
Usage:
.\wait-for-file.ps1 -FilePath "C:\path\to\your\file.mkv"
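The script itself is PowerShell; for reference, a minimal cross-platform Python sketch of the same lock-check idea is shown below (an illustration, not part of the workflow — the heuristic is to try opening the file for writing and treat a failure as "still locked"):

```python
# Illustrative Python equivalent of wait-for-file.ps1 (not the shipped script):
# exit code 0 when the file can be opened for read/write, 1 while another
# process (e.g. the recorder) still holds it open for writing.
import sys

def is_locked(path: str) -> bool:
    try:
        with open(path, "rb+"):
            return False
    except OSError:
        # On Windows this typically raises PermissionError while OBS is still writing.
        return True

if __name__ == "__main__":
    sys.exit(1 if is_locked(sys.argv[1]) else 0)
```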
create_wav.py
A Python script that converts a video file into a .wav audio file. It automatically calculates the necessary audio bitrate to keep the output file under 25 MB, a common requirement for many transcription services.
Usage:
python create_wav.py "C:\path\to\your\file.mkv"
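The shipped script isn't reproduced here; the sketch below shows one way the size budgeting could be done with `ffmpeg-python` (an assumed implementation — in this version the sample rate, rather than a bitrate, is derived from the video's duration so that 16-bit mono PCM stays under the 25 MB target):

```python
# Minimal sketch of the create_wav.py idea (assumed implementation, not the shipped script).
import sys
import ffmpeg  # pip install ffmpeg-python

MAX_BYTES = 25 * 1024 * 1024   # ~25 MB output budget
BYTES_PER_SAMPLE = 2           # 16-bit mono PCM

def video_to_wav(video_path: str) -> str:
    # Probe the container for its duration in seconds.
    duration = float(ffmpeg.probe(video_path)["format"]["duration"])
    # Highest sample rate that still fits the budget, capped at 16 kHz
    # (Whisper resamples everything to 16 kHz internally anyway). Very long
    # recordings may need a compressed format instead of WAV to stay under budget.
    sample_rate = min(16000, int(MAX_BYTES / (duration * BYTES_PER_SAMPLE)))
    out_path = video_path.rsplit(".", 1)[0] + ".wav"
    (
        ffmpeg.input(video_path)
        .output(out_path, vn=None, ac=1, ar=sample_rate, acodec="pcm_s16le")
        .overwrite_output()
        .run(quiet=True)
    )
    return out_path

if __name__ == "__main__":
    print(video_to_wav(sys.argv[1]))
```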
transcribe_return.py
A Python script that uses a local Whisper model to transcribe an audio file. It can auto-detect the language or use a language code specified in the filename (e.g., meeting.en.mkv for English, meeting.es.mkv for Spanish). The transcript, with timestamps, is printed directly to stdout and captured by the n8n workflow.
Usage:
# Auto-detect language
python transcribe_return.py "C:\path\to\your\file.mkv"
# Force language via filename
python transcribe_return.py "C:\path\to\your\file.es.mkv"
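A condensed sketch of that behaviour using the `openai-whisper` Python API is shown below (illustrative — the model size and the timestamp formatting are assumptions, not necessarily what the shipped script uses):

```python
# Condensed sketch of transcribe_return.py's behaviour (assumed, not the shipped script).
import re
import sys
import whisper  # pip install openai-whisper

def transcribe(path: str) -> None:
    # "meeting.en.mkv" -> "en"; no language code in the filename -> auto-detect.
    match = re.search(r"\.([a-z]{2})\.[^.]+$", path)
    language = match.group(1) if match else None

    model = whisper.load_model("base")                 # model size is a guess; adjust as needed
    result = model.transcribe(path, language=language)

    # Print timestamped segments to stdout so n8n's Execute Command node can capture them.
    for seg in result["segments"]:
        print(f"[{seg['start']:7.2f}s -> {seg['end']:7.2f}s] {seg['text'].strip()}")

if __name__ == "__main__":
    transcribe(sys.argv[1])
```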
Generate Meeting Minutes from Videos with Whisper, Ollama LLM, and Notion
This n8n workflow automates the process of generating meeting minutes from video files: it transcribes them locally with the Whisper ASR model, summarizes the transcription with an Ollama-powered LLM, and stores the structured minutes in Notion.
It simplifies the task of documenting video meetings, making it easy to extract key information and store it in a searchable format.
What it does
- Monitors for New Video Files: Listens for new video files (e.g., `.mp4`, `.mov`, `.webm`) in a specified local directory.
- Uploads to Google Drive: If a new video file is detected, it uploads the file to a designated folder in Google Drive.
- Transcribes Video with Whisper:
  - Executes a shell command to run the local Whisper ASR model on the video file and generate a transcription.
  - The transcription is saved to a local `.txt` file.
- Summarizes Transcription with Ollama LLM:
  - Reads the generated transcription from the local file.
  - Uses an Ollama-powered Large Language Model (LLM) to summarize the meeting transcription.
  - The summary is then processed to extract key information like "Title" and "Summary".
- Stores Minutes in Notion: Creates a new page in a specified Notion database, populating it with the generated title and summary.
- Deletes Local Files (Optional): If the Notion entry is successfully created, it deletes the local transcription file to clean up the workspace.
- Notifies on Discord (Optional): Sends a success notification to a Discord channel, including a link to the Notion page.
Prerequisites/Requirements
- n8n Instance: A running n8n instance with access to the local file system.
- Ollama: Ollama installed and running on the same machine as n8n, with a suitable LLM (e.g., `llama2`, `mistral`) pulled and available: `ollama pull llama2` (or your preferred LLM).
- Whisper: A local Whisper installation (the `openai-whisper` package) for transcription; note that Whisper is not distributed as an Ollama model. A quick dependency sanity-check sketch follows this list.
- Google Drive Account: A Google Drive account and an n8n Google Drive credential configured.
- Notion Account: A Notion account with a pre-configured database for meeting minutes and an n8n Notion credential configured.
- Discord Account (Optional): A Discord server and an n8n Discord credential configured for notifications.
- Local Directory: A local directory on the n8n host where video files will be dropped and transcriptions will be temporarily stored.
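Before importing the workflow, a quick local check can confirm the moving parts are in place. The snippet below is an optional, illustrative helper (not part of the template) that verifies FFmpeg, Ollama, and `openai-whisper` are reachable from the n8n host:

```python
# Optional sanity check (illustrative; not part of the workflow): verify the local
# dependencies this setup relies on before activating the workflow in n8n.
import shutil
import subprocess

for tool in ("ffmpeg", "ollama"):
    print(f"{tool}: {'found' if shutil.which(tool) else 'MISSING'}")

try:
    import whisper  # provided by the openai-whisper package
    print("openai-whisper: importable")
except ImportError:
    print("openai-whisper: MISSING")

if shutil.which("ollama"):
    # List locally pulled models so you can confirm your summarization model is available.
    print(subprocess.run(["ollama", "list"], capture_output=True, text=True).stdout)
```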
Setup/Usage
- Import the Workflow: Import the provided JSON into your n8n instance.
- Configure Credentials:
- Google Drive: Set up your Google Drive OAuth2 credential.
- Notion: Set up your Notion API credential. Ensure the Notion integration has access to your meeting minutes database.
- Discord (Optional): Set up your Discord Webhook or Bot credential.
- Configure Nodes:
- Local File Trigger (Node 533):
  - Specify the `Directory` to monitor for new video files.
  - Set `File Pattern` to match your video file types (e.g., `*.mp4, *.mov, *.webm`).
- Google Drive (Node 58):
  - Select your Google Drive credential.
  - Specify the `Folder ID` where video files should be uploaded.
- Execute Command (Node 13 - Transcribe with Whisper):
  - Update the `Command` to run Whisper against the input video file and write the transcription to the desired output path. Note that Whisper is not an Ollama model; this step uses the locally installed `openai-whisper` CLI.
  - Example command: `whisper "{{ $json.path }}" --output_format txt --output_dir "/path/to/transcriptions"` (this writes `{{ $json.fileNameWithoutExtension }}.txt` into that directory).
  - Important: Replace `/path/to/transcriptions/` with an actual directory on your n8n host.
- Read/Write Files from Disk (Node 1233 - Read Transcription):
  - Ensure the `File Path` points to the transcription file generated in the previous step (e.g., `/path/to/transcriptions/{{ $json.fileNameWithoutExtension }}.txt`).
- Ollama Model (Node 1159):
  - Select the `Model` you want to use for summarization (e.g., `llama2`, `mistral`).
- Basic LLM Chain (Node 1123):
  - Review and adjust the `Prompt` to guide the LLM in summarizing the transcription and extracting the "Title" and "Summary" (an example prompt sketch follows this setup list).
- Notion (Node 487):
  - Select your Notion credential.
  - Set the `Database ID` for your meeting minutes database.
  - Map the `Title` and `Summary` properties to the output of the LLM chain.
- Execute Command (Node 13 - Delete Local File):
  - Ensure the `Command` correctly deletes the local transcription file (e.g., `rm "/path/to/transcriptions/{{ $json.fileNameWithoutExtension }}.txt"`).
- Discord (Node 60 - Send Discord Message):
  - Select your Discord credential.
  - Customize the `Message` content for the success notification.
- Activate the Workflow: Enable the workflow to start monitoring for new video files.
- Test: Drop a video file into the configured local directory to test the end-to-end process.
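For the Basic LLM Chain prompt mentioned above, the sketch below shows one reasonable prompt template (illustrative wording only — adapt it to your meetings, and wire the transcript in however your chain receives it):

```python
# Illustrative prompt for the Basic LLM Chain node; the wording and the transcript
# placeholder are assumptions -- adapt both to your own workflow and field names.
MINUTES_PROMPT = """You are a meeting minute-taker.
From the transcript below, return exactly two labelled sections:

Title: a concise, descriptive meeting title on a single line.
Summary: structured minutes covering key discussion points, decisions made,
action items with their owners, and open questions.

Transcript:
{transcript}
"""

def build_prompt(transcript: str) -> str:
    # Substitute the transcript text captured from the Whisper step.
    return MINUTES_PROMPT.format(transcript=transcript)
```

Keeping the two sections clearly labelled makes it straightforward to map them to the Notion `Title` and `Summary` properties in the following node.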