12 templates found

Automate AI news videos to social media with GPT-4o & HeyGen and Postiz

🤖 Automated AI News Video Creation and Social Media Publishing Workflow

🎯 Overview: This workflow fully automates the creation and social media distribution of AI-generated news videos. It fetches news, crafts captions, generates avatar videos via HeyGen, stores them, and publishes them across Instagram, Facebook, and YouTube via Postiz.

---

🔄 WORKFLOW PROCESS:
News Fetching: Reads the latest news from an RSS feed.
AI Captioning: Generates concise, engaging captions using an AI agent (GPT-4o-mini).
Video Generation: Creates an AI avatar video using HeyGen with the generated caption.
Video Storage: Downloads the video and uploads it to Google Drive for archival.
Data Logging: Records all news and video metadata into Google Sheets.
Postiz Upload: Uploads the video to Postiz's internal storage for publishing.
Social Publishing: Fetches Postiz integrations and routes the video to Instagram, Facebook, and YouTube after platform-specific content cleaning.

---

⚙️ KEY TECHNOLOGIES:
RSS Feeds: News source.
LangChain (n8n nodes): AI Agent and Chat OpenAI for caption generation.
HeyGen API: AI avatar video creation.
Google Drive: Video file storage.
Google Sheets: Data logging and tracking.
Postiz API: Unified social media publishing platform.

---

⚠️ CRITICAL CONFIGURATIONS:
API Keys: Ensure HeyGen and Postiz API keys are correctly set in credentials and the 'Setup Heygen Parameters' node.
HeyGen IDs: Verify avatar_id and voice_id in 'Setup Heygen Parameters'.
Postiz URL: Confirm https://postiz.yourdomain.com is your correct Postiz instance URL across all HTTP Request nodes.
Credentials: All Google, OpenAI, and Postiz credentials must be properly linked.

---

📈 BENEFITS:
Automated content creation and distribution, saving significant time.
Consistent branding and messaging across multiple platforms.
Centralized logging for tracking and performance analysis.
Scalable solution for high-volume content demands.

---
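As a rough sketch, the "Video Generation" step boils down to POSTing a JSON body like the one built below. Field names follow HeyGen's v2 video-generate endpoint as commonly documented; treat them as assumptions and verify against the HeyGen API reference and the template's 'Setup Heygen Parameters' node before relying on them.

```python
# Hypothetical sketch of the HeyGen request body; field names are assumptions.
def build_heygen_payload(caption: str, avatar_id: str, voice_id: str) -> dict:
    """Build the JSON body for an avatar video driven by a text caption."""
    return {
        "video_inputs": [
            {
                "character": {"type": "avatar", "avatar_id": avatar_id},
                "voice": {"type": "text", "input_text": caption, "voice_id": voice_id},
            }
        ],
        # Vertical format suits Reels/Shorts; adjust to your template.
        "dimension": {"width": 1080, "height": 1920},
    }

payload = build_heygen_payload("Today's top AI headline...", "my-avatar-id", "my-voice-id")
print(payload["video_inputs"][0]["voice"]["input_text"])
```

In the workflow this body would be sent by an HTTP Request node authenticated with your HeyGen API key.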

By David Olusola
9845

Get live crypto market data with AI-powered CoinMarketCap agent

Access real-time cryptocurrency prices, market rankings, metadata, and global stats—powered by GPT-4o and CoinMarketCap! This modular AI-powered agent is part of a broader CoinMarketCap multi-agent system designed for crypto analysts, traders, and developers. It uses the CoinMarketCap API and intelligently routes queries to the correct tool using AI. This agent can be used standalone or triggered by a supervisor AI agent for multi-agent orchestration.

---

Supported API Tools (6 Total)
This agent intelligently selects from the following tools to answer your crypto-related questions:

🔍 Tool Summary
Crypto Map – Look up CoinMarketCap IDs and active coins
Crypto Info – Get metadata, whitepapers, and social links
Crypto Listings – Ranked coins by market cap
CoinMarketCap Price – Live prices, volume, and supply
Global Metrics – Total market cap, BTC dominance
Price Conversion – Convert between crypto and fiat

---

What You Can Do with This Agent
🔹 Get live prices and volume for tokens (e.g., BTC, ETH, SOL)
🔹 Convert crypto → fiat or fiat → crypto instantly
🔹 Retrieve whitepapers, logos, and website links for any token
🔹 Analyze total market cap, BTC dominance, and circulating supply
🔹 Discover new tokens and track their CoinMarketCap IDs
🔹 View the top 100 coins ranked by market cap or volume

---

Example Queries
✅ "What is the CoinMarketCap ID for PEPE?"
✅ "Show me the top 10 cryptocurrencies by market cap."
✅ "Convert 5 ETH to USD."
✅ "What’s the 24h volume for ADA?"
✅ "Get the global market cap and BTC dominance."

---

AI Architecture
AI Brain: GPT-4o-mini
Memory: Session buffer with sessionId
Agent Type: Subworkflow AI tool
Connected APIs: 6 CoinMarketCap endpoints
Trigger Mode: Executes when called by a supervisor (via message and sessionId inputs)

---

Setup Instructions
Get a CoinMarketCap API Key – Register here: https://coinmarketcap.com/api/
Configure Credentials in n8n – Use HTTP Header Auth with your API key for each connected endpoint
Connect This Agent to a Supervisor Workflow (Optional) – Trigger this agent using Execute Workflow with inputs message and sessionId
Test Prompts – Try asking: “Convert 1000 DOGE to BTC” or “Top 5 coins in EUR”

---

Included Sticky Notes
Crypto Agent Guide – Agent overview, node map, and endpoint details
Usage Instructions – Step-by-step usage and sample prompts
Error Handling & Licensing – Troubleshooting and IP rights

---

✅ Final Notes
This agent is part of the CoinMarketCap AI Analyst System, which includes multiple specialized agents for cryptocurrencies, exchanges, community data, and DEX insights. Visit my Creator profile to find the full suite of tools.

---

Get smarter about crypto—analyze the market in real time with AI and CoinMarketCap.
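To make the "HTTP Header Auth" setup concrete, here is a sketch of how the Price Conversion tool's request could be assembled. The endpoint path and `X-CMC_PRO_API_KEY` header follow CoinMarketCap's public API documentation; the key value is a placeholder.

```python
# Sketch of a CoinMarketCap price-conversion request (URL + headers only).
from urllib.parse import urlencode

CMC_BASE = "https://pro-api.coinmarketcap.com/v2/tools/price-conversion"

def build_conversion_request(amount: float, symbol: str, convert: str, api_key: str):
    """Return (url, headers) for a crypto -> fiat conversion query."""
    query = urlencode({"amount": amount, "symbol": symbol, "convert": convert})
    headers = {"X-CMC_PRO_API_KEY": api_key, "Accept": "application/json"}
    return f"{CMC_BASE}?{query}", headers

url, headers = build_conversion_request(5, "ETH", "USD", "YOUR_API_KEY")
print(url)
```

In n8n the same header would live in an HTTP Header Auth credential shared by all six endpoint tools.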

By Don Jayamaha Jr
8169

Extract Facebook group posts with Airtop

Extract Facebook Group Posts with Airtop

Use Case
Extracting content from Facebook Groups allows community managers, marketers, and researchers to gather insights, monitor discussions, and collect engagement metrics efficiently. This automation streamlines the process of retrieving non-sponsored post data from group feeds.

What This Automation Does
This automation extracts key post details from a Facebook Group feed using the following input parameters:
Facebook Group URL: The URL of the Facebook Group feed you want to scrape.
Airtop Profile: The name of your Airtop Profile authenticated to Facebook.

It returns up to 5 non-sponsored posts with the following attributes for each:
Post text
Post URL
Page/profile URL
Timestamp
Number of likes
Number of shares
Number of comments
Page or profile details
Post thumbnail

How It Works
Form Trigger: Collects the Facebook Group URL and Airtop Profile via a form.
Browser Automation: Initiates a new browser session using Airtop, navigates to the provided Facebook Group feed, and uses an AI prompt to extract post data, including interaction metrics and profile information.
Structured Output: The results are returned in a defined JSON schema, ready for downstream use.

Setup Requirements
Airtop API Key — free to generate.
An Airtop Profile logged into Facebook.

Next Steps
Integrate With Analytics Tools: Feed the output into dashboards or analytics platforms to monitor community engagement.
Automate Alerts: Trigger notifications for posts matching certain criteria (e.g., high engagement, keywords).
Combine With Comment Automation: Extend this to reply to posts or engage with users using other Airtop automations.

Read more about how to extract posts from Facebook groups
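A "defined JSON schema" for the structured output could look like the sketch below. The property names here are illustrative placeholders mirroring the attribute list above, not the exact names used in the template's extraction prompt.

```python
# Hypothetical JSON Schema for the extracted posts (property names assumed).
import json

POST_SCHEMA = {
    "type": "object",
    "properties": {
        "posts": {
            "type": "array",
            "maxItems": 5,  # up to 5 non-sponsored posts
            "items": {
                "type": "object",
                "properties": {
                    "post_text": {"type": "string"},
                    "post_url": {"type": "string"},
                    "profile_url": {"type": "string"},
                    "timestamp": {"type": "string"},
                    "likes": {"type": "integer"},
                    "shares": {"type": "integer"},
                    "comments": {"type": "integer"},
                    "thumbnail_url": {"type": "string"},
                },
            },
        }
    },
}

print(json.dumps(POST_SCHEMA)[:40])
```

Downstream nodes (Sheets, dashboards, alerting) can then rely on these fields being present and typed.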

By Airtop
2006

WhatsApp product catalog bot with PostgreSQL database

Who is this for?
This workflow is designed for businesses or developers who want to integrate product information into a WhatsApp bot and allow users to retrieve details about products from a database.

What problem is this workflow solving?
This workflow automates the process of managing and retrieving product information via WhatsApp, allowing businesses to easily share product details with customers without manual interaction.

What this workflow does
Basic version:
It adds product data to a Postgres database.
It enables a WhatsApp bot to retrieve a list of products.
Users can select a product to receive detailed information about it.

Additional version:
All features from the basic version.
Get a list of product categories.
Get a list of products in a category.
Add a product to the cart.
Go to the cart or select more products.
Remove unnecessary items from the cart or clear the entire cart.
When all the desired items are in the cart, click Buy; the bot will send you a payment link.

Setup
Create Tables in Postgres DB
Modify the SQL script to replace "n8n" with your schema name.
Run the provided SQL script in your database (available in the workflow).
Add Credentials
Add WhatsApp credentials (OAuth, Account).
Add Postgres credentials to connect the bot to your database.

How to customize this workflow to your needs
Update the database schema or table structure if you need additional product information.
Modify the bot interaction to suit your specific product listing and display preferences.
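To illustrate the kind of schema the setup step creates, here is a minimal sketch using sqlite3 (the workflow itself targets Postgres; table and column names are assumptions — use the SQL script shipped with the workflow for the real structure).

```python
# Minimal catalog schema sketch; names are illustrative, not the template's.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE categories (
    id INTEGER PRIMARY KEY,
    name TEXT NOT NULL
);
CREATE TABLE products (
    id INTEGER PRIMARY KEY,
    category_id INTEGER REFERENCES categories(id),
    name TEXT NOT NULL,
    description TEXT,
    price NUMERIC NOT NULL
);
""")
conn.execute("INSERT INTO categories (id, name) VALUES (1, 'Drinks')")
conn.execute(
    "INSERT INTO products (category_id, name, description, price) "
    "VALUES (1, 'Espresso', 'Double shot', 2.50)"
)

# The bot's "list products in a category" step reduces to a query like:
rows = conn.execute(
    "SELECT name, price FROM products WHERE category_id = ?", (1,)
).fetchall()
print(rows)
```

The cart features of the additional version would add a cart table keyed by the WhatsApp user's ID on top of this.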

By Andrew
1977

Dynamically run SuiteQL queries in NetSuite via HTTP webhook in n8n

Dynamically Run SuiteQL Queries in NetSuite via HTTP Webhook in n8n

> Important: This template uses a NetSuite community node, so it only works on self-hosted n8n. Cloud-based n8n instances currently do not support community nodes.

Summary
This workflow template allows you to dynamically run SuiteQL queries in NetSuite by sending an HTTP request to an n8n Webhook node. Once triggered, the workflow uses token-based authentication to execute your SuiteQL query and returns the results as JSON. This makes it easy to integrate real-time NetSuite data into dashboards, reporting tools, or other applications.

Who Is This For?
Developers & Integrators: Easily embed NetSuite data retrieval into custom apps or internal tools.
Enterprises & Consultants: Integrate dynamic reporting or data enrichment from NetSuite without manual exports.
System Administrators: Automate routine queries and reduce manual intervention.

Use Cases & Benefits
Dynamic Data Access: Send any SuiteQL query on demand instead of hardcoding queries or manually running reports.
Seamless Integration: Quickly pull NetSuite data into front-end systems (like Excel dashboards, custom web apps, or internal tools) by calling the webhook endpoint.
Simplified Reporting: Automate data extraction and formatting, reducing the need for manual exports and improving efficiency.

How It Works
Trigger: An HTTP request to the webhook node initiates the workflow.
Input Processing: The workflow reads the SuiteQL query from the incoming request parameter (suiteql).
Query Execution: The NetSuite node uses your token-based authentication credentials to run the SuiteQL query.
Response: Results are returned as JSON in the HTTP response, ready for further processing or immediate consumption.

Prerequisites & Setup
NetSuite Community Node: This workflow requires the NetSuite community node. Make sure your self-hosted n8n instance supports community nodes.
NetSuite Token-Based Authentication: Enable TBA in NetSuite, then obtain the required consumer key, consumer secret, token ID, and token secret.
n8n Webhook: Copy the auto-generated webhook URL (e.g. http://<your-n8n-domain>/webhook/unique-id) from the Webhook node.

Usage
Send an HTTP GET or POST request to the webhook with your SuiteQL query. For example:

```sh
curl "http://<your-n8n-domain>/webhook/unique-id?suiteql=SELECT%20*%20FROM%20account%20LIMIT%2010"
```

The workflow will execute the query and return JSON data.

Customization
Change the Query: Simply adjust the suiteql parameter in your HTTP request to run different SuiteQL statements.
Data Transformation: Insert nodes (e.g., Function, Set, or Format) to modify or reformat the data before returning it.
Extend Integration: Chain additional nodes to push the retrieved data to other services (Google Sheets, Slack, custom dashboards, etc.).

Additional Notes
Remember that this template is only compatible with self-hosted n8n because it uses a community node: [netsuite community node](https://www.npmjs.com/package/n8n-nodes-netsuite)
If you have questions, suggestions, or need support, contact us at support@dataants.org.
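The curl call above can equally be assembled programmatically; the key detail is URL-encoding the SuiteQL statement before it goes into the `suiteql` query parameter. A small sketch (the host and path are the placeholders from the template):

```python
# Build the webhook URL with a URL-encoded SuiteQL statement.
from urllib.parse import quote

def build_webhook_url(base: str, suiteql: str) -> str:
    """Percent-encode the query so spaces and symbols survive the URL."""
    return f"{base}?suiteql={quote(suiteql)}"

url = build_webhook_url(
    "http://<your-n8n-domain>/webhook/unique-id",
    "SELECT * FROM account LIMIT 10",
)
print(url)
```

Passing the raw, unencoded statement is the most common mistake here — spaces would truncate the parameter.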

By DataAnts
1523

Send a message on Mattermost when an order is created in WooCommerce

This workflow allows you to send a message on Mattermost when an order is created in WooCommerce.

By Harshil Agrawal
1226

Daily Google Ads performance to Notion and Google Sheets

Description
This workflow automates the daily reporting of Google Ads campaign performance. It pulls click and conversion data from the Google Ads API, merges both datasets, and stores the results in Notion databases and Google Sheets. It includes a campaign-level log and a daily performance summary. The workflow is triggered automatically every day at 08:00 AM, helping marketing teams maintain a consistent and centralized reporting system without manual effort.

---

How It Works
Scheduled Trigger at 08:00 AM: The workflow begins with a Schedule Trigger node that runs once per day at 08:00.
Set Yesterday’s Date: The Set node defines a variable for the target date (yesterday), which is used in the API queries.
Query Google Ads API – Clicks & Cost: The first HTTP request pulls campaign-level metrics: campaign.id, campaign.name, metrics.clicks, metrics.impressions, metrics.cost_micros.
Query Google Ads API – Conversions: The second HTTP request pulls conversion-related data: metrics.conversions, segments.conversion_action_name.
Split and Merge: Both responses are split into individual campaign rows and merged using campaign.id and segments.date.
Store Campaign-Level Data: Stored in the Notion database "Google Ads Campaign Tracker" and appended to the Google Sheets tab "Campaign Daily Report".
Generate Daily Summary: A Code node calculates daily totals across all campaigns (total impressions, clicks, conversions, cost, and unique conversion types). The summary is stored in the Notion database "Google Ads Daily Summary" and the Google Sheets tab "Summary Report".

---

Setup Steps
Schedule the Workflow
The workflow is triggered using a Schedule Trigger node.
Set the schedule to run every day at 08:00 AM.
Connect it to the Set Yesterday Date node.

Google Ads API Access
Create a Google Ads developer account and obtain a developer token.
Set up OAuth2 credentials with the Google Ads scope.
In n8n, configure the Google Ads OAuth2 API credential.
Ensure HTTP request headers include: developer-token, login-customer-id, Content-Type: application/json.

Notion Database Setup
Create two databases in Notion:
Google Ads Campaign Tracker – fields: Campaign Name, Campaign ID, Impressions, Clicks, Cost, Conversion Type, Conversions, Date.
Google Ads Daily Summary – fields: Date, Total Impressions, Total Clicks, Total Conversions, Total Cost, Conversion Types.
Share both databases with your Notion integration.

Google Sheets Setup
Create a spreadsheet with two tabs:
Campaign Daily Report → for campaign-level rows.
Summary Report → for daily aggregated metrics.
Match all column headers to the workflow fields.
Connect your Google account to n8n using Google Sheets OAuth2.

---

Output Summary
Notion Databases:
Google Ads Campaign Tracker: stores individual campaign metrics.
Google Ads Daily Summary: stores daily totals and conversion types.
Google Sheets Tabs:
Campaign Daily Report: per-campaign data.
Summary Report: aggregated daily performance.
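The "Split and Merge" plus "Generate Daily Summary" steps amount to a join on (campaign id, date) followed by a reduce. A sketch of that logic (row field names here are simplified placeholders; the workflow's Code node operates on the raw API response shape). Note `cost_micros` is Google Ads' micro-unit convention: 1 currency unit = 1,000,000 micros.

```python
# Join click/cost rows with conversion rows, then total everything up.
def merge_and_summarize(click_rows, conv_rows):
    conv_by_key = {(r["campaign_id"], r["date"]): r for r in conv_rows}
    merged = []
    totals = {"impressions": 0, "clicks": 0, "conversions": 0.0, "cost": 0.0}
    for row in click_rows:
        conv = conv_by_key.get((row["campaign_id"], row["date"]), {})
        item = {
            **row,
            "conversions": conv.get("conversions", 0.0),
            "conversion_action": conv.get("conversion_action"),
            "cost": row["cost_micros"] / 1_000_000,  # micros -> currency units
        }
        merged.append(item)
        totals["impressions"] += row["impressions"]
        totals["clicks"] += row["clicks"]
        totals["conversions"] += item["conversions"]
        totals["cost"] += item["cost"]
    return merged, totals

clicks = [{"campaign_id": "1", "date": "2024-05-01", "impressions": 1000,
           "clicks": 50, "cost_micros": 12_500_000}]
convs = [{"campaign_id": "1", "date": "2024-05-01", "conversions": 3.0,
          "conversion_action": "Sign-up"}]
merged, totals = merge_and_summarize(clicks, convs)
print(totals)
```

The merged rows map to the "Campaign Daily Report" tab and the totals dict to the "Summary Report" tab.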

By Aziz dev
1184

Gemini-powered Facebook comment & DM assistant with Notion

What Problem Does It Solve?
Customers often ask product questions or prices in comments. Businesses waste time replying manually, leading to delays. Some comments only need a short thank-you reply, while others need a detailed private response. This workflow solves these problems by:
Replying with a friendly public comment.
Sending a private message with details when needed.
Handling compliments, complaints, and unclear comments in a consistent way.

How to Configure It
Facebook Setup
Connect your Facebook Page credentials in n8n.
Add the webhook URL from this workflow to your Facebook App/Webhook settings.
AI Setup
Add your Google Gemini API key (or swap for OpenAI/Claude).
The included prompt is generic — you can edit it to match your brand tone.
Optional Logging
If you want to track processed messages, connect a Notion database or another CRM.

How It Works
Webhook catches new Facebook comments.
AI Agent analyzes the comment and categorizes it (question, compliment, complaint, unclear, spam).
Replying:
For questions/requests → public reply + private message with full details.
For compliments → short thank-you reply.
For complaints → apology reply + private message for clarification.
For unclear comments → ask politely if they need help.
For spam/offensive → ignored (no reply).
Replies and messages are sent instantly via the Facebook Graph API.

Customization Ideas
Change the AI prompt to match your brand voice.
Add forwarding to Slack/Email if a human should review certain replies.
Log conversations in Notion, Google Sheets, or a CRM for reporting.
Expand to Instagram or WhatsApp with small adjustments.

If you need any help: Get In Touch
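The routing rules above can be sketched as a simple dispatch on the AI agent's category. The reply texts below are placeholders — in the workflow the Gemini prompt generates the actual wording.

```python
# Category -> (public reply, private message); None means "do not send".
def route_comment(category: str):
    if category == "question":
        return ("Thanks for asking! Check your inbox 📬",
                "Here are the full details you asked about...")
    if category == "compliment":
        return ("Thank you so much! 🙏", None)
    if category == "complaint":
        return ("We're sorry to hear that — we've sent you a message.",
                "Could you tell us more so we can make it right?")
    if category == "unclear":
        return ("Happy to help — could you tell us what you need?", None)
    return (None, None)  # spam/offensive: ignored

print(route_comment("compliment"))
```

Each non-None value then maps to a Graph API call: a comment reply for the public text, a private reply for the message.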

By Abdullah Alshiekh
1109

Daily cash flow reports with Google Sheets, Slack & Email for finance teams

Simplify financial oversight with this automated n8n workflow. Triggered daily, it fetches cash flow and expense data from a Google Sheet, analyzes inflows and outflows, validates records, and generates a comprehensive daily report. The workflow sends multi-channel notifications via email and Slack, ensuring finance professionals stay updated with real-time financial insights. 💸📧

Key Features
Daily automation keeps cash flow tracking current.
Analyzes inflows and outflows for actionable insights.
Multi-channel alerts enhance team visibility.
Logs maintain a detailed record in Google Sheets.

Workflow Process
The Every Day node triggers a daily check at a set time.
Get Cash Flow Data retrieves financial data from a Google Sheet.
Analyze Inflows & Outflows processes the data to identify trends and totals.
Validate Records ensures all entries are complete and accurate.
If records are valid, the workflow branches to:
Sends Email Daily Report to finance team members.
Send Slack Alert to notify the team instantly.
Logs to Sheet appends the summary data to a Google Sheet for tracking.

Setup Instructions
Import the workflow into n8n and configure Google Sheets OAuth2 for data access.
Set the daily trigger time (e.g., 9:00 AM IST) in the "Every Day" node.
Test the workflow by adding sample cash flow data and verifying reports.
Adjust analysis parameters as needed for specific financial metrics.

Prerequisites
Google Sheets OAuth2 credentials
Gmail API key for email reports
Slack bot token (with chat:write permission)
Structured financial data in a Google Sheet

Google Sheet Structure
Create a sheet with columns: Date, Cash Inflow, Cash Outflow, Category, Notes, Updated At.

Modification Options
Customize the "Analyze Inflows & Outflows" node to include custom financial ratios.
Adjust the "Validate Records" filter to flag anomalies or missing data.
Modify email and Slack templates with branded formatting.
Integrate with accounting tools (e.g., Xero) for live data feeds.
Set different trigger times to align with your financial review schedule.

Discover more workflows – Get in touch with us
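A sketch of what the "Validate Records" step might check, using the sheet columns listed above — rows with missing required fields or non-numeric amounts get flagged instead of silently flowing into the report. The exact rules in the template's node may differ; this only illustrates the pattern.

```python
# Split sheet rows into valid and invalid based on simple completeness rules.
REQUIRED = ("Date", "Cash Inflow", "Cash Outflow", "Category")

def validate_records(rows):
    valid, invalid = [], []
    for row in rows:
        missing = [f for f in REQUIRED if row.get(f) in (None, "")]
        numeric_ok = all(
            isinstance(row.get(f), (int, float))
            for f in ("Cash Inflow", "Cash Outflow")
        )
        (valid if not missing and numeric_ok else invalid).append(row)
    return valid, invalid

rows = [
    {"Date": "2024-05-01", "Cash Inflow": 5000, "Cash Outflow": 1200, "Category": "Sales"},
    {"Date": "2024-05-01", "Cash Inflow": "", "Cash Outflow": 300, "Category": "Ops"},
]
valid, invalid = validate_records(rows)
print(len(valid), len(invalid))
```

In the workflow, only the valid branch proceeds to the email and Slack notifications; flagged rows could feed a separate alert.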

By Oneclick AI Squad
619

LinkedIn scraping, structuring, and messaging using PhantomBuster and GPT-4

Description
This template automates the end-to-end process of extracting professional data from LinkedIn and converting it into a usable format—ideal for recruiters, SDRs, marketers, and growth teams. With a few simple configurations, you’ll be able to trigger the flow, scrape profiles, and use AI to extract name, headline, company, role, industry, and more—without writing a single line of code.

Key Features
🔗 Launch a PhantomBuster agent using a profile URL
⏳ Wait 45 seconds for PhantomBuster to complete scraping
📥 Fetch and parse scraped data (download URL to JSON)
🤖 Use GPT-4 (OpenAI/Azure) to extract structured information
📄 Output fields: Name, Headline, Company, Job Title, Industry, Experience, etc.
🧰 Optional: Personalize messages using extracted data
📊 Send structured output to Google Sheets, Airtable, or a CRM

Setup Instructions
PhantomBuster Configuration
Sign up for PhantomBuster.
Use the LinkedIn Profile Scraper Phantom.
Obtain your API key and agent ID.
Provide a valid LinkedIn session cookie (from browser dev tools).
OpenAI or Azure Setup
Add your GPT-4 or GPT-4o API credentials (from either OpenAI or Azure OpenAI).
Google Sheet (Optional)
Add a sheet with LinkedIn profile URLs to process in batch mode.
Environment Cleanup
This version uses the n8n credentials manager; all hardcoded tokens and API keys are removed for security compliance.

Customization Tips
Adjust the wait duration depending on PhantomBuster execution time.
Replace or extend the AI parsing prompt to include more fields (e.g., education, location, skills).
Add additional automations: Slack notifications, CRM sync, or enrichment tools like Clearbit or Hunter.io.

Perfect For
🚀 Growth hackers and SDRs automating lead generation
🧠 Recruiters scraping profiles for outreach
📊 Marketing teams enriching data for campaigns
🛠️ SaaS builders building LinkedIn intelligence tools
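The optional personalization step amounts to dropping the GPT-extracted fields into an outreach template. A sketch, with a made-up profile and template (the field names mirror the output list above but are assumptions about the template's exact keys):

```python
# Fill an outreach template from extracted profile fields.
def personalize(profile: dict, template: str) -> str:
    return template.format(**profile)

profile = {
    "name": "Jane Doe",
    "headline": "Data Engineer at Acme",
    "company": "Acme",
    "job_title": "Data Engineer",
    "industry": "Software",
}
message = personalize(
    profile,
    "Hi {name}, I noticed your work as {job_title} at {company} — "
    "we're building tools for the {industry} space and would love to chat.",
)
print(message)
```

In n8n the same substitution is typically done with expressions inside a Set or message node rather than code.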

By Rahul Joshi
277

Create Branded Social Media Images with Bannerbear (Sync/Async modes)

Automatically create branded social media graphics, certificates, thumbnails, or marketing visuals using Bannerbear's template-based image generation API. Bannerbear's API is asynchronous by default: this workflow shows you how to use both asynchronous (webhook-based) and synchronous modes depending on your needs.

What it does
This workflow connects to Bannerbear's API to generate custom images based on your pre-designed templates. You can modify text, colors, and other elements programmatically. By default, Bannerbear works asynchronously: you submit a request, receive an immediate 202 Accepted response, and get the final image via webhook or polling. This workflow demonstrates both the standard asynchronous approach and an alternative synchronous method where you wait for the image to be generated before proceeding.

How it works
Set parameters – Configure your Bannerbear API key, template ID, and content (title, subtitle).
Choose mode – Select synchronous (wait for immediate response) or asynchronous (standard webhook delivery).
Generate image – The workflow calls Bannerbear's API with your modifications.
Receive result – Get the image URL, dimensions, and metadata in PNG or JPG format.

Async mode (recommended): The workflow receives a pending status immediately, then a webhook delivers the completed image when ready.
Sync mode: The workflow waits for the image generation to complete before proceeding.

Setup requirements
A Bannerbear account (free tier available)
A Bannerbear template created in your dashboard
Your API key and template ID from Bannerbear
For async mode: the ability to receive webhooks (production n8n instance)

How to set up
Get Bannerbear credentials:
Sign up at bannerbear.com.
Create a project and design a template.
Copy your API key from Settings > API Key.
Copy your template ID from the API Console.
Configure the workflow:
Open the "SetParameters" node.
Replace the API key and template ID with yours.
Customize the title and subtitle text.
Set call_mode to "sync" or "async".
For async mode (recommended):
Activate the "Webhook_OnImageCreated" node.
Copy the production webhook URL.
Add it to Bannerbear via Settings > Webhooks > Create a Webhook.
Set the event type to "image_created".

Customize the workflow
Modify the template parameters to match your Bannerbear template fields.
Add additional modification objects for more dynamic elements (colors, backgrounds, images).
Connect to databases, CRMs, or other tools to pull content automatically.
Chain multiple image generations for batch processing.
Store generated images in Google Drive, S3, or your preferred storage.
Use async mode for high-volume generation without blocking your workflow.
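A sketch of the request the "Generate image" step sends. The host split (a dedicated host for synchronous generation) and the `modifications` array shape follow Bannerbear's API documentation as I understand it — verify both against the current docs before use.

```python
# Build the Bannerbear image request; endpoint choice depends on call_mode.
ASYNC_HOST = "https://api.bannerbear.com"
SYNC_HOST = "https://sync.api.bannerbear.com"  # assumed sync-mode host

def build_bannerbear_request(template_id, title, subtitle, call_mode="async"):
    host = SYNC_HOST if call_mode == "sync" else ASYNC_HOST
    body = {
        "template": template_id,
        "modifications": [
            # "name" must match the layer names in your Bannerbear template.
            {"name": "title", "text": title},
            {"name": "subtitle", "text": subtitle},
        ],
    }
    return f"{host}/v2/images", body

url, body = build_bannerbear_request("tpl_123", "Big Launch", "Out now", "sync")
print(url)
```

In async mode the same body goes to the standard host, the response comes back `pending`, and the `image_created` webhook delivers the finished image URL.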

By Elodie Tasia
152

CYBERPULSE AI redOps: phishing simulation with redirect tracking

Description
Simulate cloaked phishing links that redirect through a controlled proxy. This module tracks whether secure email gateways (SEGs) or sandboxes trigger the redirect before users do, and logs access, response, and timestamps in Google Sheets.

Who’s It For
Red teams simulating real-world phishing redirects
Security teams testing gateway/sandbox behavior
Awareness teams tracking click-throughs

How It Works
Loads the target list from Google Sheets.
Generates dynamic redirect links per target.
Emails the links using Gmail or SMTP.
Simulates access via webhook or internal call.
Logs metadata and redirect access to Sheets.

Requirements
Google Sheet requirements
Sheet name: Redirect_Logs
Required columns: name, team, email, module, status, payload, response, timestamp
Google Sheets credentials
Email service (Gmail, SMTP, or custom node)
Optional: a real endpoint for link redirection (e.g., a Vercel Function or Cloudflare Worker)

Setup Instructions
Clone or copy the provided Google Sheet template (linked below).
Set up the webhook trigger in the Redirect Proxy node.
Use a URL shortener node (optional) to obfuscate redirect links.
Connect the Google Sheets node and map fields: timestamp, IP, user-agent, original URL.
Configure redirection logic using IF and Set nodes.
Run a test redirect to validate Google Sheet logging.

File Templates
RedOpsRedirectCloakLog_Template.xlsx

| email | name | team | payload | response | status | module | timestamp |
| --- | --- | --- | --- | --- | --- | --- | --- |
| test@org.com | John Doe | IT | redirect.link/... | Redirect triggered | Simulated | RedirectCloak | 2025-07-27T12:00:00Z |

Customization
Redirect Logic: Modify the HTTP Response or Set node to redirect to real servers or simulation targets.
Tracking Format: Adjust the structure of the logged data — include fields like user-agent, referrer, campaign ID, etc.
Redirection Endpoint: Host the redirection logic on a public API gateway (e.g., AWS API Gateway, Vercel Edge Function) if deploying outside of n8n.
Obfuscation: Integrate a URL shortener (like Bitly) or a custom domain to hide the true destination during simulations.

Ethics Note: This module is intended for internal simulations only and does not contain malicious payloads. Always use with authorization and red team awareness protocols.

🔗 Part of the CYBERPULSE AI RedOps Suite
🌐 https://cyberpulsesolutions.com
📧 info@cyberpulsesolutions.com
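The "dynamic redirect links per target" step can be sketched as deriving a stable, non-guessable token per email, so later webhook hits map back to a Redirect_Logs row. The proxy domain and salt below are placeholders, and this is only an illustration of the pattern, not the template's actual logic.

```python
# Derive a deterministic per-target token for the redirect proxy URL.
import hashlib

PROXY_BASE = "https://redirect.example.com/r"  # placeholder proxy domain
CAMPAIGN_SALT = "redops-2025"  # rotate per campaign so tokens aren't reusable

def make_redirect_link(email: str) -> str:
    token = hashlib.sha256(f"{CAMPAIGN_SALT}:{email}".encode()).hexdigest()[:16]
    return f"{PROXY_BASE}/{token}"

link = make_redirect_link("test@org.com")
print(link)
```

Because the token is deterministic for a given salt and email, the webhook handler can recompute it per target (or look it up in the sheet) to attribute each hit without embedding the email in the URL.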

By Adnan Tariq
80