
Download Instagram & Facebook Videos/Reels with Telegram Bot

Instagram & Facebook Video/Reels Downloader Bot (Telegram bot)

Once set up, simply send any Instagram Reel or Facebook video link to your Telegram bot, and it will automatically:
- Check if the shared link is valid.
- Detect whether it's an Instagram or Facebook link.
- Fetch the video using API and scraping methods.
- Download the video directly from the source.
- Send the downloaded video (or a message if the link is invalid) right back to your Telegram chat, instantly!

---

How It Works (Node Flow)
- Telegram Trigger: Listens for new messages (video/reel links) from users.
- Regex Node: Extracts and validates the Instagram/Facebook link format.
- Conditional Node (If): Determines whether the link is for Facebook or Instagram.
- Link Validation Node: Ensures the provided link is valid and reachable.
- Instagram Node: Fetches video metadata via API, decodes and downloads the Reel, then sends the downloaded video and a confirmation message via Telegram.
- Facebook Node: Uses scraping/API to get the video source, generates the downloadable link, then downloads and sends the Facebook video back to Telegram.
- Error Handling Node: Sends a custom error message if the link is invalid.

---

Features
✅ Works with both Instagram and Facebook links
✅ Automatically detects the platform and processes accordingly
✅ Delivers the downloaded video directly to your Telegram chat
✅ Handles invalid or broken links gracefully
✅ Clean and modular structure, easy to extend or customize

---

Use Case
Perfect for social media managers, content creators, and automation enthusiasts who want a simple Telegram bot to fetch and download Reels or videos without using third-party apps or websites.
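The platform detection performed by the Regex and If nodes can be sketched as follows. These patterns are illustrative assumptions, not the template's actual expressions (which the listing does not show):

```python
import re

# Loose, hypothetical patterns for Reel/video links on each platform.
INSTAGRAM_RE = re.compile(r"https?://(www\.)?instagram\.com/(reel|reels|p)/[\w-]+")
FACEBOOK_RE = re.compile(r"https?://(www\.|m\.)?(facebook\.com|fb\.watch)/\S+")

def detect_platform(url: str):
    """Return 'instagram', 'facebook', or None for unsupported links."""
    if INSTAGRAM_RE.match(url):
        return "instagram"
    if FACEBOOK_RE.match(url):
        return "facebook"
    return None
```

A `None` result corresponds to the Error Handling Node's branch, which replies with the custom invalid-link message.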

By Joy Sutradhar
3351

AI-powered reservation reminder calls for restaurant with Twilio & Grok-4

🤖📞 This workflow automates calling customers to remind them of their booking reservations, using AI-generated messages and a Twilio phone number. It can easily be adapted for other venues.

---

Key Benefits
- Time-Saving Automation: Eliminates the need for manual calls by staff, saving hours per week.
- Human-like AI Messages: Uses a custom language model to generate polite, natural phone messages tailored to each customer.
- Multi-Channel Integration: Google Sheets for reservation tracking, Twilio for automated calling, OpenRouter (or other LLMs) for generating speech content.
- Error Reduction: Ensures all customers receive reminders exactly on the reservation day, minimizing no-shows.
- Scalable: Easily adapts to growing reservation lists and more complex message logic. Suitable for restaurants, hairdressers, offices, and any other business.

---

How It Works
1. Trigger: The workflow can be triggered manually (via "When clicking 'Execute workflow'") or automatically at 11 AM daily (via the Schedule Trigger).
2. Data Fetch: Retrieves today's reservations from a Google Sheet, filtering rows where DATE = today and CALLED is empty.
3. AI-Generated Call Script: For each reservation, the Secretary Agent (powered by OpenRouter's Grok-4) generates a phone script using the guest's name, time, and party size.
4. Twilio Call: The script is sent to Twilio, which calls the guest's phone number (from the sheet) and reads the message aloud using text-to-speech.
5. Update & Loop: Marks the reservation as called (CALLED = "x") in the sheet and waits 2 minutes between calls to avoid rate limits.

---

Set Up Steps
1. Twilio Configuration: Sign up for Twilio, buy a phone number, and:
   - Enable text-to-speech (set language to Italian).
   - Configure geo permissions for the target country.
   - Add credentials to the Twilio node (sender number in the From field).
2. Google Sheets Setup: Clone the Google Sheet template and ensure:
   - Phone numbers include the international prefix (without "+").
   - Columns: DATE, TIME, NAME, N. PEOPLE, PHONE, CALLED.
3. OpenRouter API: Connect the OpenRouter Chat Model node to your account (using Grok-4 or another model).
4. Deploy: Activate the workflow and test it with a manual execution. Note: the workflow ships inactive (active: false); enable it after setup.

---

Need help customizing? Contact me for consulting and support or add me on LinkedIn.
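The Data Fetch step's filter (DATE equals today, CALLED empty) can be sketched as a small function. The DD/MM/YYYY date format is an assumption; the sheet template's actual format is not shown in the listing:

```python
from datetime import date

def due_reservations(rows, today=None):
    """Mimic the workflow's filter: keep rows whose DATE matches today
    and whose CALLED cell is empty. Rows are dicts keyed by the sheet's
    column names (DATE, TIME, NAME, N. PEOPLE, PHONE, CALLED)."""
    today = today or date.today().strftime("%d/%m/%Y")
    return [r for r in rows if r["DATE"] == today and not r.get("CALLED")]
```

Each returned row then flows to the Secretary Agent and the Twilio node; after a successful call, the workflow writes "x" back into CALLED so the row is skipped on the next run.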

By Davide
1770

Automate your UTM campaign tracking: Shopify, n8n to Baserow

Campaign tracking is pivotal: it enables marketers to evaluate the efficacy of various strategies and channels. UTM parameters are particularly essential, as they provide granular details about the source, medium, and campaign effectiveness. However, when this data is not automatically integrated into a centralized system, collating and analyzing it manually becomes tedious and error-prone.

Retrieving UTM data from Shopify and storing it in Baserow enables you to do more with this data. For example, you could build a campaign database in Baserow and automatically add campaign revenue to it using this workflow template.

This template will help you:
- Automatically retrieve UTM parameters from Shopify orders using the Shopify Admin API
- Process marketing data through n8n
- Store this data in Baserow, providing you with a dynamic, responsive base for campaign tracking and decision-making

This template demonstrates the following concepts in n8n:
- use the Schedule Trigger node
- use the GraphQL node to call the Shopify Admin API
- split larger incoming datasets into n8n items with the Split node
- transform the data structure with the Set node
- control flow with the If node
- store data in Baserow with the Baserow node

How to get started?
- Create a custom app in Shopify to get the credentials needed to connect n8n to Shopify; this is needed for the Shopify Trigger.
- Create Shopify Access Token API credentials in n8n for the Shopify Trigger node.
- Create Header Auth credentials: use X-Shopify-Access-Token as the name and the Access Token from the Shopify app you created as the value. The Header Auth is necessary for the GraphQL nodes.
- You will need a running Baserow instance for this. You can also sign up for a free account at https://baserow.io/

Please make sure to read the notes in the template. For a detailed explanation, please check the corresponding video: https://youtu.be/VBeN-3129RM
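The request the GraphQL node sends might look like the sketch below. The query uses Shopify's customerJourneySummary field to reach UTM attribution; the exact field names and the API version should be verified against the Admin API docs, and the shop name and token here are placeholders:

```python
import json

def build_shopify_utm_request(shop, token, first=50):
    """Assemble url, headers, and body for a Shopify Admin GraphQL call
    that pulls recent orders with their UTM attribution. A sketch only:
    check field names against the Admin API version you actually run."""
    query = """
    query RecentOrders($first: Int!) {
      orders(first: $first, sortKey: CREATED_AT, reverse: true) {
        edges { node {
          name
          totalPriceSet { shopMoney { amount currencyCode } }
          customerJourneySummary {
            lastVisit { utmParameters { source medium campaign } }
          }
        } }
      }
    }"""
    return {
        "url": f"https://{shop}.myshopify.com/admin/api/2024-01/graphql.json",
        "headers": {"X-Shopify-Access-Token": token,
                    "Content-Type": "application/json"},
        "body": json.dumps({"query": query, "variables": {"first": first}}),
    }
```

In the template, the same header lives in the Header Auth credential rather than being built by hand; the Split and Set nodes then flatten the `edges` array into one n8n item per order before the Baserow node stores it.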

By Sascha
1449

Add a note to Pipedrive's contact once PR is added on GitHub

This workflow automatically adds a note with the PR from GitHub to the Pipedrive contact if their GitHub email matches a Person in Pipedrive.

Prerequisites
- Pipedrive account and Pipedrive credentials
- GitHub account and GitHub credentials

How it works
1. The GitHub Trigger node activates the workflow when a GitHub user adds a PR.
2. The HTTP Request node gets the user's data and sends it further.
3. The Pipedrive node searches Pipedrive for the same email the GitHub user has.
4. The If node checks whether a person with that email exists in Pipedrive.
5. If such a person exists in Pipedrive, the Pipedrive node creates a note within the person's profile.
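The matching logic of steps 3 and 4 reduces to a case-insensitive email lookup, sketched below. The `emails` field shape is an assumption standing in for the items a Pipedrive person search returns:

```python
def find_person_by_email(persons, email):
    """Return the first person whose email list contains the GitHub
    user's email (case-insensitive), else None. None corresponds to
    the If node's 'no match' branch, where no note is created."""
    email = email.lower()
    for person in persons:
        if any(e.lower() == email for e in person.get("emails", [])):
            return person
    return None
```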

By n8n Team
1051

Transcribe & summarize Telegram voice notes with OpenAI and DeepSeek Chat to Google Docs

This workflow automates transcribing voice notes from Telegram and then summarizing them, saving both the transcript and the summary to Google Drive. Here's a step-by-step breakdown of what the workflow does:
1. Triggered by Telegram Message: The workflow starts when a new message is received on Telegram.
2. Get Audio File from Telegram: It retrieves the audio file from the Telegram message.
3. Transcribe Audio with OpenAI: The audio file is sent to OpenAI for transcription.
4. Save Transcript to Google Drive: The generated transcript is saved as a new Google Doc in a designated "N8N Transcribes" folder within Google Drive. The document is named after the original audio file.
5. Summarize Transcript with AI: The transcript is sent to an AI agent (which uses DeepSeek Chat Model 1 as its language model) to generate a summary. The prompt instructs the AI to create a plain-text summary suitable for a report to a supervisor, expanding on the title and avoiding rich-text formatting.
6. Save Summary to Google Drive: Finally, the generated summary is saved as a new Google Doc in an "N8N Summaries" folder in Google Drive, also named after the original audio file.
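Two small pieces of the flow can be sketched without the API calls: deriving the Google Doc name from the audio file, and assembling the summarization prompt. The prompt wording below is a hypothetical rendering; the template's exact prompt is not shown in the listing:

```python
from pathlib import Path

def doc_name(audio_filename: str) -> str:
    """Both Google Docs are named after the original audio file;
    here that just means stripping the audio extension."""
    return Path(audio_filename).stem

def summary_prompt(title: str, transcript: str) -> str:
    """Hypothetical version of the prompt described above: plain text,
    supervisor-report tone, expand on the title, no rich formatting."""
    return (
        f"Write a plain text summary of the transcript below, suitable "
        f"for a report to a supervisor. Expand on the title '{title}' "
        f"and avoid rich text formatting.\n\n{transcript}"
    )
```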

By Wild Nomad
361

Record payout from Stripe in Wave Accounting

This workflow helps small business owners using Wave Apps easily access the Wave Accounting API from n8n. In this example, the workflow is triggered by a new payout from Stripe. It then logs the transaction as a journal entry in Wave Accounting, helping you automate your accounting without needing to pay for expensive subscriptions!

What this workflow template does
This workflow triggers when a payout is paid in Stripe and sends a GraphQL request to the Wave Accounting API, recording the payout as a journal entry automatically. The benefits of this workflow are instantaneously keeping your books up to date and ensuring accuracy.

How to setup
1. Set up your Stripe credential in n8n using the native Stripe nodes.
2. Follow this guide to get your Wave Apps authorization token.
3. Set up the node with the correct header auth -> {"Authorization": "Bearer TOKEN-HERE"}
4. Get your account IDs from Wave.
5. The payload uses GraphQL, so make sure to update it according to what you are trying to achieve; the current example creates a journal entry.

Tips
- Getting Wave account IDs: It is easiest to use network logs to obtain the right account IDs from Wave, especially if you have created custom accounts (default Wave account IDs can be obtained by making that API call). Go to Wave and make a new dummy transaction using the accounts you want to use in n8n, keeping the network logs open while you do this. Then search the network logs for the name of the account you are trying to use; you should see account IDs in the log data.
- Sales tax: This example uses sales tax baked into the total Stripe payout amount (5%). You can find the sales tax account IDs the same way you found the Wave account IDs, using network logs.
- Use AI to help you with your API call: Ask Claude or ChatGPT to draft your ideal payload.
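Since the 5% sales tax is baked into the Stripe payout total, the journal entry has to back it out. The arithmetic is just dividing by 1.05; the cent-rounding convention below is an assumption:

```python
def split_payout(total_cents: int, tax_rate: float = 0.05):
    """Back out the pre-tax net and the tax portion from a Stripe
    payout total that already includes sales tax. Amounts are in
    whole cents, as Stripe reports them."""
    net = round(total_cents / (1 + tax_rate))
    return net, total_cents - net
```

The two amounts then become the journal-entry lines posted to your sales account and your sales-tax account respectively.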

By Jacob @ vwork Digital
339

Save Mastodon bookmarks to Raindrop automatically

> 🛠️ Note: This workflow uses a custom Mastodon API request. Ensure your server supports bookmark access and that your access token has the right permissions. OAuth or token-based credentials must be configured.

🧑‍💼 Who is this for?
This workflow is ideal for digital researchers, social media users, and knowledge workers who want to automatically archive Mastodon bookmarks into their Raindrop.io collection for future reference and tagging.

🔧 What problem is this solving?
Mastodon users often bookmark posts they want to read or save for later, but there's no native integration to archive them outside the app. This workflow solves that by syncing bookmarked posts from Mastodon to Raindrop, making them more accessible, organized, and searchable long-term.

⚙️ What this workflow does
1. Triggers on a schedule (or manually).
2. Tracks the latest fetched min_id using workflow static data to avoid duplicates.
3. Sends an HTTP GET request to the Mastodon bookmarks API, using bearer-token authentication.
4. Validates and processes the bookmarks if new entries exist.
5. Parses pagination metadata (e.g., min_id) from the response headers.
6. Splits the response array to handle individual bookmarks.
7. Filters out entries with missing data.
8. Saves each post to Raindrop.io using its title and URL (the card URL is used if one exists).
9. Updates min_id to remember where it left off.

🚀 Setup
1. Create a Mastodon access token with access to bookmarks.
2. Add a credential in n8n of type HTTP Bearer Auth with your token.
3. Create and connect a Raindrop OAuth2 credential.
4. Replace {VOTRE SERVEUR MASTODON} (French for "your Mastodon server") with your Mastodon server's base URL.
5. (Optional) Adjust the scheduling interval under the "Schedule Trigger" node.
6. Make sure the Raindrop collection ID is correct, or leave it as the default (-1), which is the index of the Unsorted collection.

🧪 How to customize this workflow
- To save to a specific Raindrop collection, change the collectionId in both Raindrop nodes.
- You can extend the Code node to pull additional metadata such as author, hashtags, or content excerpts.
- Add an Email or Slack node after Raindrop to notify you of saved bookmarks.
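Mastodon advertises pagination in the HTTP Link response header, where the newest page is the rel="prev" link carrying min_id. The header parsing described in step 5 can be sketched like this (the exact parsing in the template's Code node may differ):

```python
import re

def parse_min_id(link_header: str):
    """Extract min_id from the rel="prev" entry of a Mastodon Link
    header, or return None when there is no newer page. This is the
    value the workflow persists in static data between runs."""
    for url, rel in re.findall(r'<([^>]+)>;\s*rel="(\w+)"', link_header):
        if rel == "prev":
            m = re.search(r"[?&]min_id=(\d+)", url)
            if m:
                return m.group(1)
    return None
```

On the next scheduled run, passing the stored value as `?min_id=...` fetches only bookmarks created since the last sync, which is how duplicates are avoided.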

By Aymeric Besset
195

Automated NASA patent lead generation & scoring with OpenAI, Google, and Notion

Who is this for
This workflow is designed for Innovation Managers, Tech Transfer Offices, and Business Development Representatives looking to find commercial partners for new technologies.

What it does
This template automates the process of scouting startups that might be a good fit for NASA patents.
1. Search: It fetches patents from the NASA Tech Transfer API based on a keyword you define.
2. Find: It searches Google to identify startups operating in related fields.
3. Enrich: It crawls each identified startup's website to extract context about their business.
4. Analyze: Using OpenAI, it scores the "fit" between the patent and the startup and drafts a personalized outreach email.
5. Save: High-scoring leads are enriched with LinkedIn company pages and saved directly to a Notion database.

How to set up
1. Configuration: In the Configuration node, set the keyword variable to the technology topic you want to search for (e.g., "robotics").
2. NASA API: Get a free API key from api.nasa.gov and enter it in the NASA Patents API node parameters.
3. Apify: Connect your Apify account credential. You will need credits to run the google-search-scraper and website-content-crawler actors.
4. OpenAI: Connect your OpenAI credential.
5. Notion: Create a database with the following properties and connect it in the Create Notion Lead node: Company (Text), Website (URL), LinkedIn (URL), Email (Email), Score (Number), Draft Email (Text), NASA Tech (Text).

Requirements
- NASA API Key: Free to obtain.
- Apify Account: Requires the google-search-scraper and website-content-crawler actors.
- OpenAI API Key: For analysis and text generation.
- Notion Account: To store the leads.
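The "Save" step's gate between the Analyze and Notion stages can be sketched as a simple threshold filter. The 1-to-10 scale and the cutoff of 7 are assumptions; the template's actual scoring rubric is not shown in the listing:

```python
def high_scoring_leads(leads, threshold=7):
    """Keep only startups whose AI-assigned fit score clears the
    threshold, best matches first. Each lead mirrors the Notion
    database row: company, website, score, draft email, etc."""
    return sorted(
        (lead for lead in leads if lead.get("score", 0) >= threshold),
        key=lambda lead: lead["score"],
        reverse=True,
    )
```

Leads that survive this filter are the ones enriched with a LinkedIn page and written into the Notion database's Score and Draft Email properties.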

By Takumi Oku
65