
Copy viral reels with Gemini AI

Video guide
I prepared a detailed guide that shows the whole process of building an AI tool to analyze Instagram Reels using n8n: https://youtu.be/SQPPM0KLsrM

Who is this for?
This workflow is ideal for social media analysts, digital marketers, and content creators who want to leverage data-driven insights from their Instagram Reels. It's particularly useful for those looking to automate the analysis of video performance to inform strategy and content creation.

What problem does this workflow solve?
Analyzing video performance on Instagram can be tedious and time-consuming, requiring multiple steps and manual data extraction. This workflow automates fetching, analyzing, and recording insights from Instagram Reels, making it simple to track engagement metrics without manual intervention.

What this workflow does
This workflow integrates several services to analyze Instagram Reels, allowing users to:
- Automatically fetch recent Reels from specified creators.
- Analyze the most-watched videos for insights (see the sketch below).
- Store and manage data in Airtable for easy access and reporting.

Initial trigger: The process begins with a manual trigger that can later be modified for scheduled automation.
Data retrieval: It connects to Airtable to fetch a list of creators and their respective Instagram Reels.
Video analysis: It handles fetching, downloading, and uploading videos for analysis with an external service, simplifying performance tracking through a structured query process.
Record management: It saves relevant metrics and insights into Airtable, so users can access and organize their video analytics effectively.

Setup
1. Create accounts: Set up Airtable, Edify, n8n, and Gemini accounts.
2. Prepare triggers and modules: Replace the credentials in each node accordingly.
3. Configure data flow: Ensure the modules fetch and analyze the correct data fields as outlined in the guide.
4. Test the workflow: Run the scenario manually to confirm that data is fetched and analyzed correctly.
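As a rough illustration of the "most-watched" selection step, here is a minimal Python sketch that ranks fetched reel records by play count before handing the top ones to the analysis service. The field names (play_count, video_url) are assumptions; match them to whatever your scraper and Airtable base actually return.

```python
# Minimal sketch of the "pick the most-watched reels" step.
# Field names (play_count, video_url) are assumptions -- adapt them
# to the fields your scraper/Airtable base actually returns.

def top_reels(reels: list[dict], n: int = 5) -> list[dict]:
    """Return the n most-watched reels, highest play count first."""
    return sorted(reels, key=lambda r: r.get("play_count", 0), reverse=True)[:n]

reels = [
    {"video_url": "https://instagram.com/reel/a", "play_count": 120_000},
    {"video_url": "https://instagram.com/reel/b", "play_count": 45_000},
    {"video_url": "https://instagram.com/reel/c", "play_count": 310_000},
]

for reel in top_reels(reels, n=2):
    print(reel["video_url"], reel["play_count"])
```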

By Mark Shcherbakov
13065

Automated video creation using Google Veo3 and n8n workflow

Who is this for?
Content creators, social media managers, digital marketers, and businesses looking to automate video production without expensive equipment or technical expertise.

What problem is this workflow solving?
Traditional video creation requires cameras, editing software, voice-recording equipment, and hours of post-production work. This workflow removes those barriers by automatically generating professional videos with audio from nothing but text prompts.

What this workflow does
This automated workflow takes video ideas from Google Sheets, generates optimized prompts using AI, creates videos through Google's Veo 3 model via Fal AI, monitors the generation progress (see the polling sketch below), and saves the final video URLs back to your spreadsheet for easy access and management.

Setup
1. Sign up for a Fal AI account and obtain an API key.
2. Create a Google Sheet with video-idea and status columns.
3. Configure n8n with the required credentials (Google Sheets, Fal AI API).
4. Import the workflow template.
5. Set up authentication for all connected services.
6. Test with a sample video idea.

How to customize this workflow to your needs
Modify the AI prompts to match your brand voice, adjust video styles and camera movements, change the polling interval for video-generation status, customize the Google Sheet column mappings, and add extra processing steps such as thumbnail generation or social media posting.
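Since video generation is asynchronous, the workflow repeatedly checks the job status until it completes. The Python sketch below shows such a polling loop; the fal.ai queue endpoint path and the response fields are assumptions based on fal.ai's documented queue pattern, so verify the exact values for the model you submit to.

```python
import time
import requests

# Hypothetical polling loop for an async generation job on fal.ai's queue.
# The exact endpoint path and response fields are assumptions -- check the
# fal.ai docs for the model you actually use.
FAL_KEY = "YOUR_FAL_API_KEY"
STATUS_URL = "https://queue.fal.run/fal-ai/veo3/requests/{request_id}/status"

def wait_for_video(request_id: str, interval_s: int = 15, timeout_s: int = 900) -> dict:
    headers = {"Authorization": f"Key {FAL_KEY}"}
    deadline = time.time() + timeout_s
    while time.time() < deadline:
        resp = requests.get(STATUS_URL.format(request_id=request_id), headers=headers)
        resp.raise_for_status()
        status = resp.json()
        if status.get("status") == "COMPLETED":
            return status  # on success, contains the result/video URL
        time.sleep(interval_s)  # mirrors the workflow's polling interval
    raise TimeoutError(f"Video {request_id} not ready after {timeout_s}s")
```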

By Lakshit Ukani
11315

Create Teams notifications for new tickets in ConnectWise with Redis

This workflow makes an HTTPS request to ConnectWise Manage through their REST API. It pulls all tickets in the "New" status (or whichever status you like) and notifies your dispatch team or personnel through Microsoft Teams whenever a new ticket comes in. A sketch of the underlying API call follows below.

Video explanation: https://youtu.be/yaSVCybSWbM
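For reference, the ticket pull behind this workflow looks roughly like the Python sketch below. The base path, conditions syntax, and clientId header follow ConnectWise Manage's REST conventions, but treat the exact values as assumptions and confirm them against your own instance.

```python
import requests
from requests.auth import HTTPBasicAuth

# Rough sketch of the ticket pull behind this workflow. The site URL,
# company ID, keys, and clientId are placeholders -- fill in your own.
SITE = "https://na.myconnectwise.net"
url = f"{SITE}/v4_6_release/apis/3.0/service/tickets"

resp = requests.get(
    url,
    params={"conditions": 'status/name="New"'},   # or whichever status you like
    headers={"clientId": "YOUR_CLIENTID"},
    auth=HTTPBasicAuth("company+publicKey", "privateKey"),
)
resp.raise_for_status()
for ticket in resp.json():
    print(ticket["id"], ticket["summary"])
```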

By Gavin
4794

XML to SQL database import

This is an example workflow that imports an XML file into an SQL database. The ReadBinaryFiles node loads the XML file from the server, then a Code node extracts the file content from the binary buffer. Afterwards, an XML node converts the XML string into a JSON structure, and finally the MySQL node inserts the data records into the SQL table. In the upper part of the workflow there is another, disabled MySQL node that creates a new table with all the required columns, based on the sample SQL database: https://www.mysqltutorial.org/mysql-sample-database.aspx
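Outside n8n, the same pipeline can be sketched in a few lines of Python: parse the XML, convert it to records, and insert them into MySQL. The tag, table, and column names below are placeholders; align them with the sample database schema you create.

```python
import xml.etree.ElementTree as ET
import mysql.connector  # pip install mysql-connector-python

# Standalone sketch of the same pipeline: read XML, convert to records,
# insert into MySQL. Tag/column names are placeholders for your schema.
tree = ET.parse("data.xml")
rows = [
    (item.findtext("id"), item.findtext("name"))
    for item in tree.getroot().iter("record")
]

conn = mysql.connector.connect(host="localhost", user="user",
                               password="secret", database="classicmodels")
cur = conn.cursor()
cur.executemany("INSERT INTO records (id, name) VALUES (%s, %s)", rows)
conn.commit()
conn.close()
```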

By n8n Team
2800

Query MySQL database with natural language using GPT AI

This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

How it works
- Using the chat node, ask a question about information stored in your MySQL database.
- The AI Agent converts your question to a SQL query.
- The AI Agent executes the SQL query and returns the result.
- The AI Agent remembers the previous 5 questions.

How to set up
1. Add your OpenAI API key in the "OpenAI Chat Model" node.
2. Add your MySQL credentials in the "SQL DB - List Tables and Schema" and "Execute a SQL Query in MySQL" nodes.
3. Update the database name in the "SQL DB - List Tables and Schema" node: replace "yourqueryname" under the Query field with your actual database name (see the sketch below).
4. After the steps above are completed, use the "When chat message received" node to ask a question about your data in plain English.
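Under the hood, the schema-listing step amounts to an information_schema query whose result is handed to the model as context so it can write valid SQL. A minimal Python sketch, assuming placeholder credentials and the "yourqueryname" database name from the setup steps:

```python
import mysql.connector  # pip install mysql-connector-python

# Sketch of what a "list tables and schema" step boils down to: the result
# is passed to the AI agent as context. Credentials and the database name
# are placeholders.
conn = mysql.connector.connect(host="localhost", user="user",
                               password="secret", database="yourqueryname")
cur = conn.cursor()
cur.execute(
    """SELECT table_name, column_name, data_type
       FROM information_schema.columns
       WHERE table_schema = %s
       ORDER BY table_name, ordinal_position""",
    ("yourqueryname",),
)
for table, column, dtype in cur.fetchall():
    print(f"{table}.{column}: {dtype}")
conn.close()
```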

By Moe Ahad
2639

Create new ClickUp tasks from Slack commands

This workflow makes it easy to create new tasks in ClickUp from ordinary Slack messages using a simple slash command. For example, the command

/newTask Set task to update new contacts on CRM and assign them to the sales team

creates a new ClickUp task with that text as its title and description.

For most teams, getting tasks from Slack into ClickUp means manually re-entering them. What if we could do it with a simple slash command?

Step 1
Create an endpoint URL for your Slack command by creating an events API app at https://api.slack.com/apps/

Step 2
Define the endpoint for your URL: create a new webhook endpoint in n8n using the POST method and paste the endpoint URL into your events API. This sends all slash commands associated with the slash to the desired endpoint.

Step 3
Log in to the Slack API (https://api.slack.com/) and create an application; this is what runs all automation and commands from Slack. Once your app is ready, navigate to Slash Commands and create a new command, specifying the command itself, the webhook URL, and a description of what the slash command does. Once this is saved, you can test by sending a demo task to your endpoint.

Once you have confirmed the slash command works with the webhook, create a ClickUp API credential that can be used to create new tasks in ClickUp. The workflow creates a new task, with start dates, in ClickUp that can be assigned to the respective team members. A rough sketch of both ends is below.

More details about the setup can be found in the document below. Happy productivity!
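To make the two ends concrete, here is a minimal Python sketch: Slack POSTs slash commands as form data (the free text after the command arrives in the "text" field), and ClickUp's documented v2 API creates the task. The list ID and token are placeholders, and mapping the whole message to both title and description is an assumption about how this template handles the text.

```python
import requests

# Slack POSTs slash commands as form data; the free text after the
# command arrives in the "text" field of the payload.
def handle_slash_command(form: dict) -> None:
    task_text = form.get("text", "").strip()  # e.g. "Set task to update new contacts..."
    create_clickup_task(name=task_text, description=task_text)

# Minimal ClickUp v2 task creation. LIST_ID and the API token are
# placeholders -- generate a personal token in your ClickUp settings.
CLICKUP_TOKEN = "pk_your_token"
LIST_ID = "123456789"

def create_clickup_task(name: str, description: str) -> dict:
    resp = requests.post(
        f"https://api.clickup.com/api/v2/list/{LIST_ID}/task",
        headers={"Authorization": CLICKUP_TOKEN},
        json={"name": name, "description": description},
    )
    resp.raise_for_status()
    return resp.json()
```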

By Zacharia Kimotho
2411

Automated YouTube subscription notifications with RSS and email

Who is this template for?
Are you in the bad habit of always checking your feed to see if there are new videos? This workflow helps you drop that habit by delivering an email notification for each new video posted by the channels you are subscribed to. No need to check your feed again: no email = no new video.

Example email

How it works
Every hour (by default), we:
1. Fetch all your YouTube subscriptions from the YouTube Data v3 API.
2. Get the latest videos of each channel through RSS (we don't use YouTube's API for this step, as it would put us over the daily quota); see the sketch below.
3. Send you a simple yet beautiful email for each new video published since the last run of the workflow. To go to the video, simply click on the thumbnail.

Caveats
Because of the way this workflow is implemented, if your n8n instance stops, you will not get emails for the videos you missed while it was down. The situation could be improved if n8n offered an easy way to access the last successful execution's timestamp.

Set up instructions
Complete the "Set up credentials" step when you first open the workflow; you'll need YouTube OAuth2 API and SMTP credentials. In the "Send an email for each new video" step, set the address the email will be sent from (one your SMTP credentials allow sending from) and the address it will be sent to (they can be the same).

Optional steps
- In the Schedule Trigger step, you can change the check frequency (default: every hour).
- If there are channels you do not want notifications from, add their channel IDs to the list in the "Filter out channels" step. To get a channel's ID, go to its main page, click on the description, then "Share channel", and finally "Copy channel ID".
- By default, shorts are excluded; if you want them, simply remove the "Filter out shorts" step from the workflow.

Template was created in n8n v1.84.0.
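The per-channel feed comes from YouTube's public RSS endpoint, which costs no API quota. Here is a small Python sketch of the "new since last run" filter; the fixed one-hour cutoff mirrors the default schedule and is exactly where the "missed videos while the instance was down" caveat comes from.

```python
from datetime import datetime, timedelta, timezone
import feedparser  # pip install feedparser

# YouTube exposes a public RSS feed per channel -- no API quota needed.
FEED_URL = "https://www.youtube.com/feeds/videos.xml?channel_id={channel_id}"

def new_videos(channel_id: str, since_hours: float = 1.0) -> list[dict]:
    """Return entries published since the last run (default: one hour ago)."""
    cutoff = datetime.now(timezone.utc) - timedelta(hours=since_hours)
    feed = feedparser.parse(FEED_URL.format(channel_id=channel_id))
    fresh = []
    for entry in feed.entries:
        published = datetime(*entry.published_parsed[:6], tzinfo=timezone.utc)
        if published > cutoff:
            fresh.append({"title": entry.title, "link": entry.link})
    return fresh

for video in new_videos("UC_x5XG1OV2P6uZZ5FSM9Ttw"):  # example channel ID
    print(video["title"], video["link"])
```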

By Sweenu
1965

🎓 How to transform unstructured email data into a structured format with an AI agent

This workflow automates the process of extracting structured, usable information from unstructured email messages across multiple platforms. It connects directly to Gmail, Outlook, and IMAP accounts, retrieves incoming emails, and sends their content to an AI-powered parsing agent built on OpenAI GPT models. The AI agent analyzes each email, identifies the relevant details, and returns a clean JSON structure containing the key fields:
- From – sender's email address
- To – recipient's email address
- Subject – email subject line
- Summary – short AI-generated summary of the email body

The extracted information is then automatically inserted into an n8n Data Table, creating a structured database of email metadata and summaries ready for indexing, reporting, or integration with other tools.

---

Key Benefits
✅ Full Automation: Eliminates manual reading and data entry from incoming emails.
✅ Multi-Source Integration: Handles data from different email providers seamlessly.
✅ AI-Driven Accuracy: Uses advanced language models to interpret complex or unformatted content.
✅ Structured Storage: Creates a standardized, query-ready dataset from previously unstructured text.
✅ Time Efficiency: Processes emails in real time, improving productivity and response speed.
✅ Scalability: Easily extendable to handle additional sources or extract more data fields.

---

How it works
This workflow transforms unstructured email data into a structured, queryable format through a series of connected steps:
1. Email triggering: The workflow is initiated by one of three email triggers (Gmail, Microsoft Outlook, or a generic IMAP account), which constantly monitor for new incoming emails.
2. AI-powered parsing and structuring: When a new email is detected, its raw, unstructured content is passed to a central "Parsing Agent". This agent uses a specified OpenAI language model to intelligently analyze the email text.
3. Data extraction and standardization: Following a predefined system prompt, the AI agent extracts key information from the email, such as the sender, recipient, subject, and a generated summary. It then forces the output into a strict JSON structure using a "Structured Output Parser" node, ensuring data consistency (see the sketch below).
4. Data storage: Finally, the clean, structured data (the from, to, subject, and summarize fields) is inserted as a new row into a specified n8n Data Table, creating a searchable and reportable database of email information.

---

Set up steps
To implement this workflow, follow these configuration steps:
1. Prepare the Data Table: Create a new Data Table within n8n, with string-typed columns named From, To, Subject, and Summary.
2. Configure email credentials: Set up the credential connections for the email services you wish to use (Gmail OAuth2, Microsoft Outlook OAuth2, and/or IMAP). Ensure the accounts have the necessary permissions to read emails.
3. Configure AI model credentials: Set up the OpenAI API credential with a valid API key. The model the workflow uses can be changed in the respective nodes if needed.
4. Connect the nodes: The workflow canvas is already correctly wired. Visually confirm that the email triggers are connected to the "Parsing Agent", which is connected to the "Insert row" (Data Table) node. Also ensure the "OpenAI Chat Model" and "Structured Output Parser" are connected to the "Parsing Agent" as its AI model and output parser, respectively.
5. Activate the workflow: Save the workflow and toggle the "Active" switch to ON. The triggers will begin polling for new emails according to their schedule (e.g., every minute), and the automation will start processing incoming messages.

---

Need help customizing? Contact me for consulting and support, or add me on LinkedIn.
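For orientation, the extraction step is roughly equivalent to the Python sketch below, which asks an OpenAI model for a strict JSON object with the four fields. The exact prompt and the model name are assumptions; in the workflow itself, the Parsing Agent and Structured Output Parser enforce the same shape.

```python
import json
from openai import OpenAI  # pip install openai

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def parse_email(raw_email: str) -> dict:
    """Return {"From", "To", "Subject", "Summary"} extracted from a raw email."""
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # assumption -- use whatever model the workflow sets
        response_format={"type": "json_object"},
        messages=[
            {"role": "system", "content":
             "Extract From, To, Subject, and a short Summary from the email. "
             "Reply with a JSON object using exactly those four keys."},
            {"role": "user", "content": raw_email},
        ],
    )
    return json.loads(resp.choices[0].message.content)
```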

By Davide
1616

Generate personalized weather reports with OpenWeatherMap, Python and GPT-4.1-mini

How it works
1. The user enters the name of a city for which the most current weather information will be gathered.
2. Custom Python code processes the weather data and generates a custom email about the weather (see the sketch below).
3. An AI agent further customizes the email and adds a related joke about the weather.
4. The recipient gets the custom email for the city.

Set up instructions
1. Enter the city to get the weather data for.
2. Add the OpenWeather API and replace <yourAPIkey> with your actual API key.
3. Add your OpenAI API key in the OpenAI Chat Model node.
4. Add your Gmail credentials and specify a recipient for the custom email.
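The data-gathering step maps onto OpenWeatherMap's current-weather endpoint. A compact Python sketch is below; the email wording is purely illustrative, since in the real workflow the AI agent rewrites the text and adds the joke.

```python
import requests

# Fetch current weather for a city and draft a plain email body.
# Replace <yourAPIkey> with your actual OpenWeatherMap key.
API_KEY = "<yourAPIkey>"

def weather_email(city: str) -> str:
    resp = requests.get(
        "https://api.openweathermap.org/data/2.5/weather",
        params={"q": city, "appid": API_KEY, "units": "metric"},
    )
    resp.raise_for_status()
    data = resp.json()
    desc = data["weather"][0]["description"]
    temp = data["main"]["temp"]
    return (f"Hello! Current weather in {city}: {desc}, {temp:.1f} degrees C. "
            "Have a great day!")

print(weather_email("Berlin"))
```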

By Moe Ahad
1491

Automate email discovery for companies with Anymail Finder, Google Sheets & Telegram alerts

This automation retrieves company information from a Google Sheet, uses the Anymail Finder API to discover email addresses associated with each company, then writes the results (including the email status) back into the same Google Sheet and sends an alert on Telegram.

---

Key Advantages
✅ Automated Email Discovery: No need for manual lookups—emails are found via the Anymail Finder API in bulk.
🔁 Seamless Google Sheets Integration: Works directly with Google Sheets for input and output, allowing easy data management.
🧠 Smart Filtering: Automatically classifies emails as valid, risky, or not found for quality control.
⚙️ Reusable & Scalable: Can be run anytime with a manual trigger, or expanded to handle thousands of records with minimal setup.
📊 Real-Time Updates: Results are immediately reflected in your spreadsheet, streamlining lead generation and outreach workflows.
💸 Cost-Efficient: Uses a free Anymail Finder trial or API key for testing and validation before scaling up.

---

How it Works
This automated workflow finds email addresses for a list of companies using the Anymail Finder API and updates a Google Sheets document with the results.
1. Trigger and data retrieval: The workflow starts manually. It first connects to a specified Google Sheet and retrieves the company leads that are marked for processing (those whose "PROCESSING" column is empty).
2. Batch processing and API call: The list of leads is split into batches (typically one item at a time) to be processed individually. For each company, the workflow sends the "Company Name" and "Website" to the Anymail Finder API to search for a relevant email address.
3. Result classification: The API's response, which includes the found email and its status (e.g., valid, risky), is passed to a Switch node that routes the data down different paths based on the email status (see the sketch below).
4. Sheet update: For a valid or risky email, the workflow updates the original Google Sheet row, marking the "PROCESSING" column with an "x" and writing the found email address into the "EMAIL" column. When no email is found, it also marks "PROCESSING" with an "x" but leaves the "EMAIL" column empty.
5. Loop completion: After each item, the workflow loops back to process the next lead in the batch until all companies have been handled.

---

Set up Steps
To use this workflow, complete the following configuration steps:
1. Duplicate the template sheet: Clone the provided Google Sheets template to your own Google Drive. This sheet contains the columns the workflow needs ("COMPANY NAME", "WEBSITE", "EMAIL", "PROCESSING").
2. Get an API key: Sign up for a free trial at Anymail Finder to obtain your personal API key.
3. Configure credentials in n8n:
- Google Sheets: In both the "Get Leads" and update nodes, set up the Google Sheets OAuth2 credential to grant n8n access to your copied spreadsheet.
- Anymail Finder: In the "Email finder" HTTP Request node, create a new credential of type "HTTP Header Auth" named "Anymail Finder". In the "Name" field enter Authorization; in the "Value" field, paste your Anymail Finder API key.
4. Update the Sheet ID in the nodes: Update all Google Sheets nodes ("Get Leads", "Email found", "Email not found") with the Document ID of your cloned Google Sheet. The Sheet ID can be found in your sheet's URL: https://docs.google.com/spreadsheets/d/[YOURSHEETID_HERE]/edit....
5. Execute: Once configured, add your list of companies and their websites to the sheet and run the workflow using the "Manual Trigger" node.

---

Need help customizing? Contact me for consulting and support, or add me on LinkedIn.
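The Switch-node routing reduces to a small piece of logic like the Python sketch below. The status values (valid, risky, not found) come from the description above, while the exact Anymail Finder response field names are assumptions; check the API docs before relying on them.

```python
# Sketch of the Switch-node logic: decide what to write back to the sheet
# based on the email status returned by Anymail Finder. The response shape
# is an assumption -- check the API docs for the exact field names.
def route_result(api_response: dict) -> dict:
    status = api_response.get("email_status")   # e.g. "valid", "risky", "not_found"
    email = api_response.get("email", "")
    if status in ("valid", "risky"):
        return {"PROCESSING": "x", "EMAIL": email}
    # No usable email found: mark the row processed, leave EMAIL empty.
    return {"PROCESSING": "x", "EMAIL": ""}

print(route_result({"email_status": "valid", "email": "ceo@example.com"}))
print(route_result({"email_status": "not_found"}))
```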

By Davide
1446

Get real-time NFT Marketplace insights with OpenSea Marketplace Agent Tool

Track NFT listings, offers, orders, and trait-based pricing in real time! This workflow integrates the OpenSea API, AI-powered analytics (GPT-4o-mini), and n8n automation to provide instant insights into NFT trading activity. Ideal for NFT traders, collectors, and investors looking to monitor the market and identify profitable opportunities.

How It Works
1. A user submits a query about NFT listings, offers, or order history.
2. The OpenSea Marketplace Agent determines the correct API tool:
- Retrieve active NFT listings for a collection.
- Fetch valid offers for individual NFTs or entire collections.
- Identify the cheapest NFT listings by collection or token ID.
- Track the highest offer made for a single NFT.
- Access detailed order history for a transaction.
3. The OpenSea API (requires an API key) is queried to fetch real-time data.
4. The AI engine processes and structures the response, making it easy to interpret.
5. The NFT marketplace insights are delivered via Telegram or Slack, or stored in a database.

What You Can Do with This Agent
🔹 Find the Best NFT Listings → Retrieve the cheapest available listings in any collection.
🔹 Track Offers on NFTs → See all active offers, including the highest bids.
🔹 Analyze Collection-Wide Market Data → Compare listings, offers, and sales activity.
🔹 Retrieve Order Details → Search by order hash to check buyer, seller, and transaction status.
🔹 Fetch NFT Trait-Based Offers → Identify rare traits that receive premium bids.
🔹 Monitor Multi-Chain Listings → Works across Ethereum, Polygon (Matic), Arbitrum, Optimism, and more.

Example Queries You Can Use
✅ "Show me the 10 cheapest listings for Bored Ape Yacht Club."
✅ "Find the highest bid for CryptoPunk 1234."
✅ "Track all open offers for Azuki NFTs."
✅ "Retrieve details for this OpenSea order: 0x123abc... on Ethereum."
✅ "List all NFTs for sale in the 'CloneX' collection."

Available API Tools & Endpoints
1️⃣ Get All Listings by Collection → /api/v2/listings/collection/{collection_slug}/all (fetches active listings for a collection)
2️⃣ Get All Offers by Collection → /api/v2/offers/collection/{collection_slug}/all (retrieves all offers for a collection)
3️⃣ Get Best Listing by NFT → /api/v2/listings/collection/{collection_slug}/nfts/{identifier}/best (finds the lowest-priced listing for an NFT)
4️⃣ Get Best Listings by Collection → /api/v2/listings/collection/{collection_slug}/best (fetches the cheapest listings per collection)
5️⃣ Get Best Offer by NFT → /api/v2/offers/collection/{collection_slug}/nfts/{identifier}/best (retrieves the highest offer for an NFT)
6️⃣ Get Collection Offers → /api/v2/offers/collection/{collection_slug} (shows collection-wide offers)
7️⃣ Get Item Offers → /api/v2/orders/{chain}/{protocol}/offers (fetches active item-specific offers)
8️⃣ Get Listings by Chain & Protocol → /api/v2/orders/{chain}/{protocol}/listings (retrieves active listings across blockchains)
9️⃣ Get Order Details by Hash → /api/v2/orders/chain/{chain}/protocol/{protocol_address}/{order_hash} (checks order status using an order hash)
🔟 Get Trait-Based Offers → /api/v2/offers/collection/{collection_slug}/traits (fetches offers for specific NFT traits)

Set Up Steps
1. Get an OpenSea API key: Sign up at the OpenSea API and request an API key.
2. Configure API credentials in n8n: Add your OpenSea API key under HTTP Header Authentication.
3. Connect the workflow to Telegram, Slack, or a database (optional): Use n8n integrations to send alerts to Telegram or Slack, or save results to Google Sheets, Notion, etc.
4. Deploy and test: Send a query (e.g., "Get the best listing for BAYC 5678") and receive instant insights! A sketch of one of the underlying API calls is below.

Stay ahead of the NFT market—gain powerful insights with AI-powered OpenSea analytics!
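As a concrete example of tool 4️⃣ above, the Python sketch below fetches the cheapest active listings for a collection. The endpoint path comes from the list above and the X-API-KEY header is OpenSea's documented auth scheme, but the top-level response key shown is an assumption to verify against the docs.

```python
import requests

# Fetch the cheapest active listings for a collection (tool 4 above).
OPENSEA_KEY = "YOUR_OPENSEA_API_KEY"

def best_listings(collection_slug: str, limit: int = 10) -> list[dict]:
    resp = requests.get(
        f"https://api.opensea.io/api/v2/listings/collection/{collection_slug}/best",
        headers={"X-API-KEY": OPENSEA_KEY},
        params={"limit": limit},
    )
    resp.raise_for_status()
    # "listings" is the assumed top-level key -- verify against the docs.
    return resp.json().get("listings", [])

for listing in best_listings("boredapeyachtclub", limit=10):
    print(listing)
```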

By Don Jayamaha Jr
1184

Sitemap page extractor: Discover, clean, and save website URLs to Google Sheets

Description
Automatically extracts all page URLs from website sitemaps, filters out unwanted sitemap links, and saves clean URLs to Google Sheets for SEO analysis and reporting.

How It Works
This workflow automates the process of discovering and extracting all page URLs from a website's sitemap structure. Here's how it works, step by step:
Step 1: URL input. The workflow starts when you submit a website URL through a simple form interface.
Step 2: Sitemap discovery. The system automatically generates and tests multiple possible sitemap URLs, including /sitemap.xml, /sitemap_index.xml, /robots.txt, and other common variations.
Step 3: Valid sitemap identification. It sends HTTP requests to each potential sitemap URL and filters out empty or invalid responses, keeping only accessible sitemaps.
Step 4: Nested sitemap processing. For sitemap index files, the workflow extracts all nested sitemap URLs and processes each one individually to ensure complete coverage.
Step 5: Page URL extraction. From each valid sitemap, it parses the XML content and extracts all individual page URLs using both XML <loc> tags and HTML links (see the sketch below).
Step 6: URL filtering. The system removes any URLs containing "sitemap" so that only actual content pages (such as product, service, or blog pages) are retained.
Step 7: Google Sheets integration. Finally, all clean page URLs are automatically saved to a Google Sheets document, with duplicate prevention, for easy analysis and reporting.

Setup Steps
Estimated setup time: 10-15 minutes
1. Import the workflow: Import the provided JSON file into your n8n instance.
2. Configure the Google Sheets integration:
- Set up Google Sheets OAuth2 credentials in n8n.
- Create a new Google Sheet or use an existing one.
- Update the "Save Page URLs to Sheet" node with your Google Sheet URL.
- Ensure your sheet has a tab named "Your sheet tab name" with a column header "Column name".
3. Test the workflow:
- Activate the workflow in n8n.
- Use the form trigger URL to submit a test website URL.
- Verify that URLs are being extracted and saved to your Google Sheet.
4. Customize (optional):
- Modify the sitemap URL patterns in the "Build sitemap URLs" node if needed.
- Adjust the filtering criteria in the "Exclude the Sitemap URLs" node.
- Update the Google Sheets column mapping as required.

Important Notes
- Ensure your Google Sheets credentials have proper read/write permissions.
- The workflow handles both XML sitemaps and robots.txt sitemap references.
- Duplicate URLs are automatically prevented when saving to Google Sheets.
- The workflow continues processing even if some sitemap URLs are inaccessible.

Need Help?
For technical support or questions about this workflow: ✉️ info@incrementors.com or fill out this form: Contact Us
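The discovery-and-extraction core of this workflow fits in a short Python sketch: probe the common sitemap locations, pull every <loc> URL (recursing into nested sitemaps), and keep only content pages. The candidate list is a subset of what the workflow tests (robots.txt parsing is omitted for brevity).

```python
import xml.etree.ElementTree as ET
import requests

# Core of the sitemap extractor: probe common sitemap locations, collect
# every <loc> URL (including nested sitemaps), keep only content pages.
CANDIDATES = ["/sitemap.xml", "/sitemap_index.xml", "/sitemap-index.xml"]
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

def extract_page_urls(site: str) -> list[str]:
    pages: set[str] = set()
    seen: set[str] = set()
    queue = [site.rstrip("/") + path for path in CANDIDATES]
    while queue:
        url = queue.pop()
        if url in seen:
            continue
        seen.add(url)
        try:
            resp = requests.get(url, timeout=10)
            resp.raise_for_status()
            root = ET.fromstring(resp.content)
        except Exception:
            continue  # skip inaccessible or non-XML candidates
        for loc in root.iter(f"{NS}loc"):
            target = (loc.text or "").strip()
            if "sitemap" in target:
                queue.append(target)   # nested sitemap -- recurse into it
            elif target:
                pages.add(target)      # actual content page
    return sorted(pages)

print(extract_page_urls("https://example.com"))
```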

By Incrementors
974