
Conversational Meta ads reporting & management with GPT-5

Currently a work in progress. This n8n template creates an intelligent AI assistant that responds to chat messages, providing conversational access to your Meta Ads data. Powered by an OpenAI GPT-5 model and equipped with memory to maintain context, the agent interacts with your Meta Ads accounts via the Facebook Graph API. Users can ask it to:

- List all connected ad accounts.
- Retrieve detailed information for a specific ad account, including active campaigns, ad sets, and individual ads.
- Fetch performance insights (e.g., spend, impressions, conversions, CPC, CPM, CTR, ROAS) for a given account and time range.

Ideal for marketers, advertisers, or anyone who needs quick, conversational access to their Meta Ads performance data and campaign structure without logging into Ads Manager directly. Requires OpenAI and Facebook Graph API credentials.
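As a rough illustration of what the agent's Graph API tool calls look like, an account-level insights request can be sketched as follows (the API version, account ID, and field list are assumptions, not taken from the template):

```javascript
// Sketch: build a Meta Graph API insights URL for an ad account.
// Account ID, token, and chosen fields are placeholders.
const buildInsightsUrl = (accountId, accessToken, since, until) => {
  const params = new URLSearchParams({
    fields: "spend,impressions,cpc,cpm,ctr",
    time_range: JSON.stringify({ since, until }),
    access_token: accessToken,
  });
  return `https://graph.facebook.com/v19.0/act_${accountId}/insights?${params}`;
};

// Usage: fetch(buildInsightsUrl("1234567890", token, "2024-01-01", "2024-01-31"))
```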

By Konrad Roziewski
3730

Find top keywords for YouTube and Google and store them in NocoDB

Template Description, WDF Top Keywords: this workflow is designed to streamline keyword research by automating the process of generating, filtering, and analyzing Google and YouTube keyword data. Ensure compliance with local regulations and API terms of service when using this workflow.

---

📌 Purpose

The WDF Top Keywords workflow automates collecting, processing, and managing keyword data for both Google and YouTube platforms. Leveraging multiple data sources and APIs ensures an efficient and scalable approach to identifying high-impact keywords for SEO, content creation, and marketing campaigns.

Key Features
- Automates the generation of keyword suggestions using autocomplete APIs.
- Integrates with NocoDB to store and manage keyword data.
- Filters keywords based on monthly search volume and cost-per-click (CPC).
- Supports bulk import of keyword data into structured databases.
- Outputs both Google and YouTube keyword insights, enabling informed decision-making.

---

🎯 Target Audience

This workflow is ideal for:
- Digital marketers aiming to optimize ad campaigns with data-driven insights.
- SEO specialists looking to identify high-potential keywords efficiently.
- Content creators seeking trending and relevant topics for their platforms.
- Agencies managing keyword research for multiple clients.

---

⚙️ How It Works

- Trigger: the workflow runs on demand or at scheduled intervals.
- Keyword Generation: retrieves base keywords from NocoDB and generates autocomplete suggestions for Google and YouTube.
- Data Processing: filters and formats keyword data based on specific criteria (e.g., search volume, CPC) and consolidates results for efficient storage and analysis.
- Storage and Output: saves data into structured NocoDB tables for tracking and reuse, and bulk-imports monthly search volume statistics for detailed analysis.

---

🛠️ Key APIs and Tools Used

- NocoDB: stores and organizes base and processed keyword data.
- DataForSEO API: provides search volume and keyword performance metrics.
- Google Autocomplete API: suggests relevant Google search terms.
- YouTube Autocomplete API: suggests trending YouTube keywords.
- Social Flood Docker instance: serves as the local integration hub.

---

Setup Instructions

Required tools: NocoDB, n8n, a DataForSEO account, and a Social Flood Docker instance.

Create the following NocoDB tables: Base Keyword Search, Second Order Google Keywords, Second Order YouTube Keywords, Search Volume.

This template empowers users to handle complex keyword research tasks effortlessly, saving time and providing actionable insights. Share this template to enhance your workflow efficiency!
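As an illustration of the autocomplete step, a suggestion request might be built like this. The endpoint shown is the widely used public suggest endpoint; the template itself routes these calls through a Social Flood instance, so treat this as a sketch:

```javascript
// Sketch: build an autocomplete-suggestion URL for Google or YouTube.
// suggestqueries.google.com is an unofficial public endpoint; ds=yt
// switches the suggestion source to YouTube.
const autocompleteUrl = (keyword, platform = 'google') => {
  const params = new URLSearchParams({ client: 'firefox', q: keyword });
  if (platform === 'youtube') params.set('ds', 'yt');
  return `https://suggestqueries.google.com/complete/search?${params}`;
};
```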

By Shannon Atkinson
3434

Send daily recipe emails automatically

Not sure what to eat tonight? Have recipes emailed to you daily based on your criteria.

To run this workflow, you will need:
- A Recipe Search API key from Edamam
- An active email account with configured credentials

To set up your credentials:
- Set your Edamam AppID and AppKey in the Search Criteria node
- Select (or create) your email credentials in the Send Recipes node (and set up the to: and from: email addresses while you are at it)

To customize the recipes that you receive, open the Search Criteria node and modify one or more of the following:
- RecipeCount - the number of recipes you would like to receive
- IngredientCount - the maximum number of ingredients you would like each recipe to have
- CaloriesMin - the minimum number of calories the recipe will have
- CaloriesMax - the maximum number of calories the recipe will have
- TimeMin - the minimum amount of time (in minutes) the recipe will take to prepare
- TimeMax - the maximum amount of time (in minutes) the recipe will take to prepare
- Diet - select one of the following options:
  - balanced - Protein/Fat/Carb values in a 15/35/50 ratio
  - high-fiber - More than 5g fiber per serving
  - high-protein - More than 50% of total calories from proteins
  - low-carb - Less than 20% of total calories from carbs
  - low-fat - Less than 15% of total calories from fat
  - low-sodium - Less than 140mg Na per serving
  - random - selects a different random diet each day
- Health - select one of the following options:
  - alcohol-free - No alcohol used or contained
  - immuno-supportive - Recipes which fit a science-based approach to eating to strengthen the immune system
  - celery-free - Does not contain celery or derivatives
  - crustacean-free - Does not contain crustaceans (shrimp, lobster, etc.) or derivatives
  - dairy-free - No dairy; no lactose
  - egg-free - No eggs or products containing eggs
  - fish-free - No fish or fish derivatives
  - fodmap-free - Does not contain FODMAP foods
  - gluten-free - No ingredients containing gluten
  - keto-friendly - Maximum 7 grams of net carbs per serving
  - kidney-friendly - Per serving: phosphorus less than 250 mg AND potassium less than 500 mg AND sodium less than 500 mg
  - kosher - Contains only ingredients allowed by the kosher diet. However, it does not guarantee kosher preparation of the ingredients themselves
  - low-potassium - Less than 150mg per serving
  - lupine-free - Does not contain lupine or derivatives
  - mustard-free - Does not contain mustard or derivatives
  - low-fat-abs - Less than 3g of fat per serving
  - no-oil-added - No oil added except what is contained in the basic ingredients
  - low-sugar - No simple sugars: glucose, dextrose, galactose, fructose, sucrose, lactose, maltose
  - paleo - Excludes what are perceived to be agricultural products: grains, legumes, dairy products, potatoes, refined salt, refined sugar, and processed oils
  - peanut-free - No peanuts or products containing peanuts
  - pescatarian - Does not contain meat or meat-based products; can contain dairy and fish
  - pork-free - Does not contain pork or derivatives
  - red-meat-free - Does not contain beef, lamb, pork, duck, goose, game, horse, or other types of red meat or products containing red meat
  - sesame-free - Does not contain sesame seed or derivatives
  - shellfish-free - No shellfish or shellfish derivatives
  - soy-free - No soy or products containing soy
  - sugar-conscious - Less than 4g of sugar per serving
  - tree-nut-free - No tree nuts or products containing tree nuts
  - vegan - No meat, poultry, fish, dairy, eggs, or honey
  - vegetarian - No meat, poultry, or fish
  - wheat-free - No wheat; can have gluten though
  - random - selects a different random health option each day
- SearchItem - the general term that you are looking for, e.g. chicken
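For reference, the Search Criteria settings map onto an Edamam Recipe Search API v2 request roughly like this (parameter names follow Edamam's public documentation; the exact request the node builds may differ):

```javascript
// Sketch: translate the Search Criteria fields into an Edamam v2 query.
// app_id/app_key are placeholders for your Edamam credentials.
const buildRecipeQuery = (opts) => {
  const params = new URLSearchParams({
    type: 'public',
    q: opts.searchItem,
    app_id: opts.appId,
    app_key: opts.appKey,
    calories: `${opts.caloriesMin}-${opts.caloriesMax}`,
    time: `${opts.timeMin}-${opts.timeMax}`,
  });
  if (opts.diet) params.set('diet', opts.diet);     // e.g. 'balanced'
  if (opts.health) params.set('health', opts.health); // e.g. 'vegan'
  return `https://api.edamam.com/api/recipes/v2?${params}`;
};
```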

By jason
2156

CallForge - 06 - Automate sales insights with Gong.io, Notion & AI

--- CallForge - AI-Powered Sales Call Data Processor

Automate sales call analysis and store structured insights in Notion with AI-powered intelligence.

Who is This For?

This workflow is ideal for:
✅ Sales teams looking to automate call insight processing.
✅ Sales operations managers managing AI-driven call analysis.
✅ Revenue teams using Gong, Fireflies.ai, Otter.ai, or similar transcription tools.

It streamlines sales call intelligence, ensuring that insights such as competitor mentions, objections, and customer pain points are efficiently categorized and stored in Notion for easy access.

---

🔍 What Problem Does This Workflow Solve?

Manually reviewing and documenting sales call takeaways is time-consuming and error-prone. With CallForge, you can:
✔ Identify competitors mentioned in sales calls.
✔ Capture objections and customer pain points for follow-up.
✔ Track sales call outcomes and categorize insights automatically.
✔ Store structured sales intelligence in Notion for future reference.
✔ Improve sales strategy with AI-driven, automated call analysis.

---

📌 Key Features & Workflow Steps

🎙️ AI-Powered Call Data Processing

This workflow processes AI-generated sales call insights and structures them in Notion databases:
- Triggers automatically when AI call analysis data is received.
- Extracts competitor mentions from the call transcript and logs them in Notion.
- Identifies and categorizes sales objections for better follow-ups.
- Processes integration mentions, capturing tools or platforms referenced in the call.
- Extracts customer use cases, categorizing pain points and feature requests.
- Aggregates all extracted insights and updates the relevant Notion databases.

📊 Notion Database Integration

- Competitors → logs mentioned competitors for sales intelligence.
- Objections → tracks and categorizes common objections from prospects.
- Integrations → captures third-party tools & platforms discussed in calls.
- Use Cases → stores customer challenges & product feature requests.

---

🛠 How to Set Up This Workflow

Prepare your AI call analysis data: ensure AI-generated sales call data is passed into the workflow. Compatible with Gong, Fireflies.ai, Otter.ai, and other AI transcription tools.

Connect your Notion databases:
🔹 Competitors (tracks competing products)
🔹 Objections (logs customer objections & concerns)
🔹 Integrations (captures mentioned platforms & tools)
🔹 Use Cases (categorizes customer pain points & feature requests)

Configure n8n API integrations: connect your Notion API key in n8n under "Notion API Credentials", set up webhook triggers to receive data from your AI transcription tool, and test the workflow using a sample AI-generated call transcript.

The CallForge series:
- CallForge - 01 - Filter Gong Calls Synced to Salesforce by Opportunity Stage
- CallForge - 02 - Prep Gong Calls with Sheets & Notion for AI Summarization
- CallForge - 03 - Gong Transcript Processor and Salesforce Enricher
- CallForge - 04 - AI Workflow for Gong.io Sales Calls
- CallForge - 05 - Gong.io Call Analysis with Azure AI & CRM Sync
- CallForge - 06 - Automate Sales Insights with Gong.io, Notion & AI
- CallForge - 07 - AI Marketing Data Processing with Gong & Notion
- CallForge - 08 - AI Product Insights from Sales Calls with Notion

---

🔧 How to Customize This Workflow

💡 Modify the Notion data structure - adjust fields to match your company's CRM setup.
💡 Enhance AI data processing - align fields with different AI transcription providers.
💡 Expand with CRM integration - sync insights with HubSpot, Salesforce, or Pipedrive.
💡 Add notifications - send alerts via Slack, email, or webhook when key competitor mentions or objections are detected.

---

⚙️ Key Nodes Used in This Workflow

🔹 If nodes - check whether the AI-generated data includes competitors, integrations, objections, or use cases.
🔹 Notion nodes - create or update entries in Notion databases.
🔹 Split Out & Aggregate nodes - process multiple insights and consolidate AI outputs.
🔹 Wait nodes - ensure smooth sequencing of API calls and database updates.
🔹 HTTP Request node - sends AI-extracted insights to Notion for structured storage.

---

🚀 Why Use This Workflow?

✔ Eliminates manual data entry and speeds up sales intelligence processing.
✔ Ensures structured, categorized sales insights for decision-making.
✔ Improves team collaboration with AI-powered competitor tracking & objection logging.
✔ Seamlessly integrates with Notion to centralize and manage sales call insights.
✔ Scalable for teams using n8n Cloud or self-hosted deployments.

This workflow empowers sales teams with automated AI insights, streamlining sales strategy and follow-ups with minimal effort. 🚀
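As a sketch of what the Notion-bound HTTP request might carry, a competitor entry could be created with a payload like the following (database ID and property names are illustrative; your Notion schema will differ):

```javascript
// Sketch: payload for logging a competitor mention as a Notion page.
// Property names ("Name", "Call ID") are hypothetical examples.
const competitorPayload = (databaseId, name, callId) => ({
  parent: { database_id: databaseId },
  properties: {
    Name: { title: [{ text: { content: name } }] },
    'Call ID': { rich_text: [{ text: { content: callId } }] },
  },
});

// Sent as: POST https://api.notion.com/v1/pages
// Headers: Authorization: Bearer <token>, Notion-Version: 2022-06-28
```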

By Angel Menendez
1920

Monitor Amazon product prices with Bright Data and Google Sheets

Amazon Price Monitoring Workflow

This workflow enables you to monitor the prices of Amazon product listings directly from a Google Sheet, using data provided by Bright Data's Amazon Scraper API. It automates the retrieval of price data for specified products and is ideal for market research, competitor analysis, or personal price tracking.

✅ Requirements

Before using this template, ensure you have the following:
- A Bright Data account with access to the Amazon Scraper API.
- An active API key from Bright Data.
- A Google Sheet set up with the required columns.
- An n8n account (self-hosted or cloud version).

---

⚙️ Setup

1. Create a Google Sheet with the following columns: Product URL, ZIP Code (used for regional price variations), and ASIN (Amazon Standard Identification Number).
2. Extract the ASIN automatically using the following formula in the ASIN column (replace A2 with the appropriate cell reference):

=REGEXEXTRACT(A2, "/(?:dp|gp/product|product)/([A-Z0-9]{10})")

3. Obtain an API key: sign in to your Bright Data account and go to the API section to generate an API key. Create a Bearer Authentication credential using this key in your automation tool.
4. Configure the workflow:
- Use a node (e.g., Google Sheets) to read data from your sheet.
- Use an HTTP Request node to send a query to Bright Data's Amazon API with the ASIN and ZIP code.
- Parse the returned JSON response to extract the product price and other relevant data.
- Optionally write the output (e.g., current price, timestamp) back into the sheet or another data store.

---

Workflow Functionality

The workflow is triggered periodically (or manually) and reads product details from your Google Sheet. For each row, it extracts the Product URL and ZIP code and sends a request to the Bright Data API. The API returns product price information, which is then logged or updated back into the sheet, matched by ASIN. You can also match rows by product URL instead, but ensure that the URL has no appended parameters; if it does, refer to the input field from the Bright Data snapshot result.

---

💡 Use Cases

- E-commerce sellers monitoring competitors' prices.
- Consumers tracking price drops on wishlist items.
- Market researchers collecting pricing data across ZIP codes.
- Affiliate marketers ensuring accurate product pricing on their platforms.

---

🛠️ Customization

- Add columns for additional product data such as rating, seller, or stock availability.
- Schedule the workflow to run hourly, daily, or weekly depending on your needs.
- Implement email or Slack alerts for significant price changes.
- Filter by product category or brand to narrow your tracking focus.
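The Sheets ASIN formula above can also be expressed in JavaScript, for example inside an n8n Code node:

```javascript
// Same pattern as the REGEXEXTRACT formula: pull the 10-character ASIN
// out of /dp/, /gp/product/, or /product/ URLs.
const extractAsin = (url) => {
  const match = url.match(/\/(?:dp|gp\/product|product)\/([A-Z0-9]{10})/);
  return match ? match[1] : null;
};
```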

By Cyril Nicko Gaspar
1571

Generate GLPI support performance reports with SLA tracking & email delivery

Overview

This comprehensive n8n workflow automates the generation and distribution of detailed monthly technical support reports from GLPI (IT Service Management platform). The workflow intelligently calculates SLA compliance, analyzes technician performance, and delivers professionally formatted HTML reports via email.

✨ Key Features

Intelligent SLA calculation:
- Business hours tracking: automatically calculates resolution time considering only working hours (excludes weekends and lunch breaks).
- Configurable schedule: customizable work hours (default: 8 AM-12 PM, 1 PM-6 PM).
- Dynamic SLA monitoring: real-time compliance tracking with configurable thresholds (default: 24 hours).
- Visual indicators: color-coded alerts for critical SLA breaches and high-volume warnings.

Comprehensive reporting:
- General summary: total cases, open, in-progress, resolved, and closed tickets.
- Performance metrics: total and average resolution hours in both decimal and formatted (hours/minutes) display.
- Technician breakdown: individual performance analysis per technician, including case distribution and SLA compliance.
- Smart alerts: automatic warnings for high case volumes (>100 in progress) and critical SLA levels (<50%).

Professional email delivery:
- Responsive HTML design: mobile-optimized email templates with elegant styling.
- Dynamic content: conditional formatting based on performance metrics.
- Automatic scheduling: monthly execution on the 6th day to ensure accurate SLA measurement.

💼 Business Benefits

Time savings:
- Eliminates manual work: saves 2-4 hours per month previously spent compiling reports manually.
- Automated data collection: no more exporting CSVs or copying data between systems.
- One-click setup: configure once and receive reports automatically every month.

Improved decision making:
- Real-time insights: identify bottlenecks and performance issues immediately.
- Technician accountability: clear visibility into individual and team performance.
- SLA compliance tracking: proactively manage service level agreements before they become critical.

Enhanced communication:
- Stakeholder ready: professional reports suitable for management presentations.
- Consistent format: standardized metrics ensure month-over-month comparability.
- Instant distribution: automatic email delivery to relevant stakeholders.

🔧 Technical Specifications

Requirements:
- n8n instance (self-hosted or cloud)
- GLPI server with API access enabled
- Gmail account (or any SMTP-compatible email service)
- GLPI API credentials (App-Token and user credentials)

Configuration points:
- Variables node: server URL, API tokens, entity name, work hours, SLA limits.
- Schedule Trigger: monthly execution timing (default: 6th of each month).
- Email recipient: target email address for report delivery.
- Date range logic: automatic previous-month calculation.

Data processing:
- Retrieves up to 999 tickets per execution (configurable).
- Filters by entity and date range.
- Excludes weekends and non-business hours from calculations.
- Groups data by technician for detailed analysis.

📋 Setup Instructions

Prerequisites:
- GLPI configuration: enable the API and configure the Tickets panel with the required fields (ID, Title, Status, Opening Date, Closing Date, Resolution Date, Priority, Requester, Assigned To).
- API credentials: create Basic Auth credentials in n8n for GLPI API access.
- Email authentication: set up Gmail OAuth2 or SMTP credentials in n8n.

Implementation steps:
1. Import the workflow JSON into your n8n instance.
2. Configure the Variables node with your GLPI server details and business hours.
3. Set up GLPI API credentials in the HTTP Request nodes.
4. Configure email credentials in the Gmail node.
5. Update the recipient email address.
6. Test the workflow manually before enabling the schedule.
7. Activate the workflow for automatic monthly execution.

🎯 Use Cases

- IT support teams: track helpdesk performance and SLA compliance.
- Service managers: monitor team productivity and identify training needs.
- Executive reporting: provide high-level summaries to stakeholders.
- Resource planning: identify workload distribution and capacity issues.
- Compliance auditing: maintain historical records of SLA performance.

📈 ROI Impact

- Time savings: eliminates 24-48 hours of manual reporting annually.
- Error reduction: eliminates human calculation errors in SLA tracking.
- Faster response: early alerts enable proactive issue resolution.
- Better visibility: data-driven insights improve team management.
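A business-hours calculation of the kind described above can be sketched as follows. This is an illustrative implementation using the default 8 AM-12 PM / 1 PM-6 PM schedule; the workflow's own logic may differ:

```javascript
// Sketch: count resolution hours inside working hours only.
// Work blocks: 08:00-12:00 and 13:00-18:00, Monday-Friday.
const WORK_BLOCKS = [[8, 12], [13, 18]];

const isWorkday = (d) => d.getDay() !== 0 && d.getDay() !== 6;

// Step through the interval one hour at a time, counting only hours
// that fall on a workday and inside a work block (lunch 12-13 excluded).
const businessHours = (start, end) => {
  let hours = 0;
  const cur = new Date(start);
  while (cur < end) {
    const h = cur.getHours();
    const inBlock = WORK_BLOCKS.some(([a, b]) => h >= a && h < b);
    if (isWorkday(cur) && inBlock) hours += 1;
    cur.setHours(cur.getHours() + 1);
  }
  return hours;
};
```

A ticket opened Friday evening and resolved Monday morning thus accrues only the working hours at each end, not the weekend in between.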

By Luis Hernandez
584

Real-time OKX spot market data with GPT-4o & Telegram

Instantly access live OKX Spot Market data directly in Telegram! This workflow integrates the OKX REST v5 API with Telegram and optional GPT-4.1-mini formatting, delivering real-time insights such as latest prices, order book depth, candlesticks, trades, and mark prices, all in clean, structured reports.

---

🔎 How It Works

- A Telegram Trigger node listens for incoming user commands.
- The User Authentication node validates the Telegram ID so only authorized users are allowed.
- The workflow creates a session ID from chat.id to manage session memory.
- The OKX AI Agent orchestrates data retrieval via HTTP requests to OKX endpoints:
  - Latest price (/api/v5/market/ticker?instId=BTC-USDT)
  - 24h stats (/api/v5/market/ticker?instId=BTC-USDT)
  - Order book depth (/api/v5/market/books?instId=BTC-USDT&sz=50)
  - Best bid/ask (book ticker snapshot)
  - Candlesticks / klines (/api/v5/market/candles?instId=BTC-USDT&bar=15m)
  - Average / mark price (/api/v5/market/mark-price?instType=SPOT&instId=BTC-USDT)
  - Recent trades (/api/v5/market/trades?instId=BTC-USDT&limit=100)
- Utility tools refine the data:
  - Calculator → spreads, % change, normalized volumes.
  - Think → reshapes raw JSON into clean text.
  - Simple Memory → stores sessionId, symbol, and state for multi-turn interactions.
- A message splitter ensures Telegram output stays under 4000 characters.
- Final results are sent to Telegram in a structured, human-readable format.

---

✅ What You Can Do with This Agent

- Get the latest price and 24h stats for any Spot instrument.
- Retrieve order book depth with configurable size (up to 400 levels).
- View best bid/ask snapshots instantly.
- Fetch candlestick OHLCV data across intervals (1m → 1M).
- Monitor recent trades (up to 100).
- Check the mark price as a fair average reference.
- Receive clean, Telegram-ready reports (auto-split if too long).

---

🛠️ Setup Steps

1. Create a Telegram bot: use @BotFather to generate a bot token.
2. Configure in n8n: import OKX AI Agent v1.02.json, replace the placeholder in the User Authentication node with your Telegram ID, add Telegram API credentials (bot token), add your OpenAI API key for GPT-4.1-mini, and optionally add your OKX API key.
3. Deploy and test: activate the workflow in n8n, send a query like BTC-USDT to your bot, and instantly get structured OKX Spot data back in Telegram.

---

📺 Setup Video Tutorial

Watch the full setup guide on YouTube: https://www.youtube.com/watch?v=TAA_BFuwml0

---

⚡ Unlock real-time OKX Spot Market insights directly in Telegram, no private API keys required!

---

🧾 Licensing & Attribution

© 2025 Treasurium Capital Limited Company. Architecture, prompts, and trade report structure are IP-protected. No unauthorized rebranding permitted.

🔗 For support: Don Jayamaha on LinkedIn
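The under-4000-character splitting mentioned above can be sketched like this (an illustration; Telegram's hard limit is 4096 characters per message, and the workflow's own splitter may differ):

```javascript
// Sketch: split a long report into Telegram-safe chunks, preferring
// to break on newlines so the report stays readable.
const splitMessage = (text, limit = 4000) => {
  const parts = [];
  let rest = text;
  while (rest.length > limit) {
    let cut = rest.lastIndexOf('\n', limit);
    if (cut <= 0) cut = limit; // no newline in range: hard split
    parts.push(rest.slice(0, cut));
    rest = rest.slice(cut).replace(/^\n/, '');
  }
  parts.push(rest);
  return parts;
};
```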

By Don Jayamaha Jr
424

Automated construction project alerts with email notifications and data APIs

This n8n workflow monitors and alerts you about new construction projects in specified areas, helping you track competing builders and identify business opportunities. The system automatically searches multiple data sources and sends detailed email reports with upcoming projects.

Good to know

- Email parsing accuracy depends on the consistency of request formats; use the provided template for best results.
- The workflow includes fallback mock data for demonstration when external APIs are unavailable.
- Government data sources may have rate limits; the workflow includes proper error handling.
- Results are filtered to show only upcoming/recent projects (within 3 months).

How it works

1. Email Trigger - detects new email requests with "Construction Alert Request" in the subject line.
2. Check Email Subject - validates that the email contains the correct trigger phrase.
3. Extract Location Info - parses the email body to extract area, city, state, and zip code information.
4. Search Government Data - queries government databases for public construction projects and permits.
5. Search Construction Sites - searches construction industry databases for private projects.
6. Process Construction Data - combines and filters results from both sources, removing duplicates.
7. Wait For Data - waits until the combined, filtered results are ready.
8. Check If Projects Found - determines whether to send a results report or a no-results notification.
9. Generate Email Report - creates a professional HTML email with project details and summaries.
10. Send Alert Email - delivers the construction project report to the requester.
11. Send No Results Email - notifies the requester when no projects are found in the specified area.

The workflow also includes a Schedule Trigger that can run automatically on weekdays at 9 AM for regular monitoring.
Email Format Examples

Input email format:

To: alerts@yourcompany.com
Subject: Construction Alert Request

Area: Downtown Chicago
City: Chicago
State: IL
Zip: 60601
Additional notes: Looking for commercial projects over $1M

Alternative format:

To: alerts@yourcompany.com
Subject: Construction Alert Request

Please search for construction projects in Miami, FL 33101
Focus on residential and mixed-use developments.

Output email example:

Subject: 🏗️ Construction Alert: 8 Projects Found in Downtown Chicago

🏗️ Construction Project Alert Report
Search Area: Downtown Chicago
Report Generated: August 4, 2024, 2:30 PM

📊 Summary
Total Projects Found: 8
Search Query: Downtown Chicago IL construction permits

🔍 Upcoming Construction Projects

New Commercial Complex - Downtown Chicago
📍 Location: Downtown Chicago | 📅 Start Date: March 2024 | 🏢 Type: Mixed Development
Description: Mixed-use commercial and residential development
Source: Local Planning Department

Office Building Construction - Chicago
📍 Location: Chicago, IL | 📅 Start Date: April 2024 | 🏢 Type: Commercial
Description: 5-story office building with retail space
Source: Building Permits

[Additional projects...]
💡 Next Steps
• Review each project for potential competition
• Contact project owners for partnership opportunities
• Monitor progress and timeline changes
• Update your competitive analysis

How to use

Setup instructions:
1. Import the workflow into your n8n instance.
2. Configure email credentials: set up IMAP credentials for receiving emails and SMTP credentials for sending alerts.
3. Test the workflow with a sample email.
4. Set up scheduling (optional) for automated daily checks.

Sending alert requests:
- Send an email to your configured address.
- Use "Construction Alert Request" in the subject line.
- Include location details in the email body.
- Receive detailed project reports within minutes.

Requirements
- n8n instance (cloud or self-hosted)
- Email account with IMAP/SMTP access
- Internet connection for API calls to construction databases
- Valid email addresses for sending and receiving alerts

API Integration Code Examples

Government data API integration:

```javascript
// Example API call to the USA.gov jobs API; fetch() has no "params"
// option, so the query string is built with URLSearchParams.
const searchGovernmentProjects = async (location) => {
  const params = new URLSearchParams({
    keyword: 'construction permit',
    location_name: location,
    size: 20,
  });
  const response = await fetch(`https://api.usa.gov/jobs/search.json?${params}`, {
    method: 'GET',
    headers: { 'Accept': 'application/json' },
  });
  return await response.json();
};
```

Construction industry API integration:

```javascript
// Example API call to construction databases
const searchConstructionProjects = async (area) => {
  const params = new URLSearchParams({
    q: `${area} construction projects`,
    type: 'projects',
    limit: 15,
  });
  const response = await fetch(`https://www.construction.com/api/search?${params}`, {
    method: 'GET',
    headers: {
      'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36',
      'Accept': 'application/json',
    },
  });
  return await response.json();
};
```

Email processing function:

```javascript
// Extract location fields from the email body
const extractLocationInfo = (emailBody) => {
  const lines = emailBody.split('\n');
  let area = '', city = '', state = '', zipcode = '';
  for (const line of lines) {
    if (line.toLowerCase().includes('area:')) { area = line.split(':')[1]?.trim(); }
    if (line.toLowerCase().includes('city:')) { city = line.split(':')[1]?.trim(); }
    if (line.toLowerCase().includes('state:')) { state = line.split(':')[1]?.trim(); }
    if (line.toLowerCase().includes('zip:')) { zipcode = line.split(':')[1]?.trim(); }
  }
  return { area, city, state, zipcode };
};
```

Customizing this workflow

Adding new data sources:
1. Add HTTP Request nodes for additional APIs.
2. Update the Process Construction Data node to handle new data formats.
3. Modify the search parameters based on API requirements.

Enhanced email parsing:

```javascript
// Custom email parsing for different formats
const parseEmailContent = (emailBody) => {
  // Regex patterns for different email formats
  const patterns = {
    address: /(\d+\s+[\w\s]+,\s[\w\s]+,\s[A-Z]{2}\s*\d{5})/,
    coordinates: /(\d+\.\d+),\s*(-?\d+\.\d+)/,
    zipcode: /\b\d{5}(-\d{4})?\b/,
  };
  // Extract using multiple patterns
  // Implementation details...
};
```

Custom alert conditions: modify the Check If Projects Found node to filter by:
- Project value/budget
- Project type (residential, commercial, etc.)
- Distance from your location
- Timeline criteria

Advanced scheduling:

```javascript
// Set up multiple schedule triggers for different areas
// (standard five-field cron expressions: minute hour day month weekday)
const scheduleConfigs = [
  { area: 'Downtown', cron: '0 9 * * 1-5' },   // Weekdays 9 AM
  { area: 'Suburbs', cron: '0 14 * * 1,3,5' }, // Mon, Wed, Fri 2 PM
  { area: 'Industrial', cron: '0 8 * * 1' },   // Monday 8 AM
];
```

Integration with CRM systems: add HTTP Request nodes to automatically create leads in your CRM when high-value projects are found:

```javascript
// Example CRM integration
const createCRMLead = async (project) => {
  await fetch('https://your-crm.com/api/leads', {
    method: 'POST',
    headers: {
      'Authorization': 'Bearer YOUR_TOKEN',
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      name: project.title,
      location: project.location,
      value: project.estimatedValue,
      source: 'Construction Alert System',
    }),
  });
};
```

Troubleshooting
- No emails received: check IMAP credentials and email filters.
- Empty results: verify API endpoints and add fallback data sources.
- Failed email delivery: confirm SMTP settings and recipient addresses.
- API rate limits: implement delays between requests and add error handling.

By Oneclick AI Squad
416

Automate sales follow-ups with GPT-4o-mini, HubSpot, Slack, Teams & Telegram

How it works

This workflow automatically generates personalized follow-up messages for leads or customers after key interactions (e.g., demos, sales calls). It enriches contact details from HubSpot (or optionally Monday.com), uses AI to draft a professional follow-up email, and distributes it across multiple communication channels (Slack, Telegram, Teams) as reminders for the sales team.

Step-by-step

Trigger & input:
- Schedule Trigger - runs automatically at a defined interval (e.g., daily).
- Set Sample Data - captures the contact's name, email, and context from the last interaction (e.g., "had a product demo yesterday and showed strong interest").

Contact enrichment:
- HubSpot Contact Lookup - searches HubSpot CRM by email to confirm or enrich contact details.
- Monday.com Contact Fetch (optional) - can pull additional CRM details if enabled.

AI message generation:
- AI Language Model (OpenAI) - provides the underlying engine for message creation.
- Generate Follow-Up Message - drafts a short, professional, and friendly follow-up email that references the previous interaction context, suggests clear next steps (call, resources, etc.), and ends with a standardized signature block for consistency.

Multi-channel communication:
- Slack Reminder - posts the generated message as a reminder in the sales team's Slack channel.
- Telegram Reminder - sends the follow-up draft to a Telegram chat.
- Teams Reminder - shares the same message in a Microsoft Teams channel.

Benefits

- Personalized outreach at scale - AI ensures each follow-up feels tailored and professional.
- Context-aware messaging - pulls in CRM details and past interactions for relevance.
- Cross-platform delivery - distributes reminders via Slack, Teams, and Telegram so no follow-up is missed.
- Time-saving for sales teams - eliminates manual drafting of repetitive follow-up emails.
- Consistent branding - ensures every message includes a unified signature block.
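As a sketch of the AI message generation step, the prompt sent to the model might be assembled like this (field names and wording are illustrative, not taken from the template):

```javascript
// Hypothetical sketch: build the chat messages for the follow-up draft.
// contact.name / contact.email / contact.context are assumed field names.
const buildFollowUpPrompt = (contact) => ([
  {
    role: 'system',
    content: 'You are a sales assistant. Write a short, friendly, professional ' +
      'follow-up email. Reference the previous interaction, suggest a clear ' +
      'next step, and end with the standard signature block.',
  },
  {
    role: 'user',
    content: `Contact: ${contact.name} <${contact.email}>\n` +
      `Last interaction: ${contact.context}`,
  },
]);
```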

By Avkash Kakdiya
389

Auto-categorize blog posts with OpenAI GPT-4, GitHub, and Google Sheets for Astro/Next.js

Automatically Assign Categories and Tags to Blog Posts with AI This workflow streamlines your content organization process by automatically analyzing new blog posts in your GitHub repository and assigning appropriate categories and tags using OpenAI. It compares new posts against existing entries in a Google Sheet, updates the metadata for each new article, and records the suggested tags and categories for review โ€” all in one automated pipeline. --- Whoโ€™s It For Content creators and editors managing a static website (e.g., Astro or Next.js) who want AI-driven tagging. SEO specialists seeking consistent metadata and topic organization. Developers or teams managing a Markdown-based blog stored in GitHub who want to speed up post curation. --- How It Works Form Trigger โ€“ Starts the process manually with a form that initiates article analysis. Get Data from Google Sheets โ€“ Retrieves existing post records to prevent duplicate analysis. Compare GitHub and Google Sheets โ€“ Lists all .md or .mdx blog posts from the GitHub repository (piotr-sikora.com/src/content/blog/pl/) and identifies new posts not yet analyzed. Check New Repo Files โ€“ Uses a code node to filter only unprocessed files for AI tagging. Switch Node โ€“ If there are no new posts, the workflow stops and shows a confirmation message. If new posts exist, it continues to the next step. Get Post Content from GitHub โ€“ Downloads the content of each new article. AI Agent (LangChain + OpenAI GPT-4.1-mini) โ€“ Reads each postโ€™s frontmatter (--- section) and body. Suggests new categories and tags based on the articleโ€™s topic. Returns a JSON object with proposed updates (Structured Output Parser) Append to Google Sheets โ€“ Logs results, including: File name Existing tags and categories Proposed tags and categories (AI suggestions) Completion Message โ€“ Displays a success message confirming the categorization process has finished. 
---

Requirements

- GitHub account with repository access to your website content.
- Google Sheets connection for storing metadata suggestions.
- OpenAI account (credential stored in openAiApi).

---

How to Set Up

1. Connect your GitHub, Google Sheets, and OpenAI credentials in n8n.
2. Update the GitHub repository path to match your project (e.g., src/content/blog/en/).
3. In Google Sheets, create columns: FileName, Categories, Proposed Categories, Tags, Proposed Tags.
4. Adjust the AI model or prompt text if you want different tagging behavior.
5. Run the workflow manually using the Form Trigger node.

---

How to Customize

- Swap OpenAI GPT-4.1-mini for another LLM (e.g., Claude or Gemini) via the LangChain node.
- Modify the prompt in the AI Agent to adapt the categorization style or tone.
- Add a GitHub commit node if you want AI-updated metadata written back to the files automatically.
- Use a Schedule Trigger node to automate the process daily.

---

Important Notes

- All API keys and credentials are securely stored; no hardcoded keys.
- The workflow includes multiple sticky notes explaining the repository setup, file retrieval and AI tagging, and the Google Sheet data structure.
- It uses a LangChain memory buffer to improve contextual consistency across multiple analyses.

---

Summary

This workflow automates metadata management for blogs or documentation sites by combining GitHub content, AI categorization, and Google Sheets tracking. With it, you can easily maintain consistent tags and categories across dozens of articles, boosting SEO, readability, and editorial efficiency without manual tagging.

By Piotr Sikora
208

Convert POML to AI-Ready Prompts & Chat Messages with Zero Dependencies

POML → Prompt/Messages (No-Deps)

What this does

Turns POML markup into either a single Markdown prompt or chat-style messages[], using a zero-dependency n8n Code node. It supports variable substitution (via context), basic components (headings, lists, code, images, tables, line breaks), and optional schema-driven validation using componentSpec + attributeSpec.

Credits

Created by Real Simple Solutions as an n8n-template-friendly POML compiler (no dependencies) aiming for full POML feature parity. View more of our templates here.

Who's it for

Teams who author prompts in POML and want a template-safe way to turn them into either a single Markdown prompt or chat-style messages, without installing external modules. Works on n8n Cloud and self-hosted.

What it does

This workflow converts POML into:

- prompt (Markdown) for single-shot models, or
- messages[] (system|user|assistant) for chat APIs when speakerMode is true.

It supports variable substitution via a context object ({{dot.path}}), lists, headings, code blocks, images (incl. base64 → data: URL), tables from JSON (records/columns), and basic message components.

How it works

1. Set (Specs & Context): provides componentSpec (allowed attrs per tag), attributeSpec (typing/coercion), and optional context.
2. Code (POML → Prompt/Messages): a zero-dependency compiler parses the POML and emits prompt or messages[].

A yellow sticky note in the workflow carries this description and setup links, and additional neutral sticky notes explain each step.

How to set up

1. Import the template.
2. Open the first Set node and paste your componentSpec, attributeSpec, and context (examples included).
3. In the Code node, choose:
   - speakerMode: true to get messages[], or false for a single prompt.
   - listStyle: dash | star | plus | decimal | latin.
4. Run, then inspect prompt/messages in the output.

Requirements

No credentials or community nodes. Works without external libraries (template-compliant).
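The {{dot.path}} substitution described above can be sketched in a few lines. This is a simplified stand-in, assuming a plain `context` object; the template's actual Code-node compiler also handles components, tables, and spec-driven validation:

```javascript
// Minimal sketch of {{dot.path}} variable substitution against a
// context object. Unknown paths are left untouched rather than
// replaced with "undefined".
function substitute(template, context) {
  return template.replace(/\{\{([\w.]+)\}\}/g, (match, path) => {
    const value = path
      .split('.')
      .reduce((obj, key) => (obj == null ? undefined : obj[key]), context);
    return value === undefined ? match : String(value);
  });
}

const context = { user: { name: 'Ada' }, topic: 'POML' };
console.log(substitute('Hello {{user.name}}, ask me about {{topic}}.', context));
// → Hello Ada, ask me about POML.
```

Leaving unresolved placeholders intact (instead of silently dropping them) makes missing context keys easy to spot in the node's output.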
How to customize

- Add message tags (<system-msg>, <user-msg>, <ai-msg>) in your POML when using speakerMode: true.
- Extend componentSpec/attributeSpec to validate or coerce additional tags/attributes.
- Preformat arrays in context (e.g., bulleted, csv) for display, or add a small Set node to build them on the fly.
- Rename nodes and keep all user-editable fields grouped in the first Set node.

Security & best practices

- Never hardcode API keys in nodes.
- Remove any personal IDs before publishing.
- Keep your sticky note(s) up to date and instructional.

By RealSimple Solutions
149

Track crypto prices, new listings & transactions with CoinGecko & Google Sheets

โš™๏ธ How It Works This workflow is a comprehensive crypto automation system that combines three critical functions for traders and investors into one powerful tool: ๐Ÿ“Š Price Monitor A Cron trigger runs on a schedule (e.g., every minute). A HTTP Request node checks the cryptocurrency's price. An If node compares the price against a defined threshold. If the condition is met, a Telegram node sends an alert. ๐Ÿ“ฐ New Listing Notifier An RSS Feed Trigger monitors exchange announcements. When a new listing is published, a Telegram node sends a real-time notification. ๐Ÿงพ Automated Crypto Transaction Logger A second Cron trigger runs daily (or as scheduled). A HTTP Request node fetches trade history from your exchangeโ€™s API. A Code node formats the transaction data. A Google Sheets node logs it in your spreadsheet. --- ๐Ÿ› ๏ธ How to Set Up 1๏ธโƒฃ Configure Credentials You will need credentials for: Telegram: To send alerts and notifications. Google Sheets: To log transaction history. Exchange API (e.g., Binance): To fetch your trade history. 2๏ธโƒฃ Customize the Price Monitor Node 2: HTTP Request (Check BTC Price) Change the url to monitor a different crypto (e.g., Ethereum). Node 3: If (Price > $50k) Adjust rightValue to set your target price threshold. Node 4: Telegram (Send Alert) Replace [YOUR TELEGRAM CHAT ID] with your actual ID. 3๏ธโƒฃ Customize the Listing Notifier Node 1: RSS Feed (New Listing Trigger) Replace feedUrl with your preferred exchangeโ€™s RSS feed. Node 2: Telegram (Listing Notif) Replace [YOUR TELEGRAM CHAT ID] with your actual ID. 4๏ธโƒฃ Customize the Transaction Logger Node 2: HTTP Request (Get Binance Trades) Set the url to your exchangeโ€™s trade history endpoint. Configure headerParameters for Authorization with your API key. Node 4: Google Sheets (Log Transactions) Replace [YOUR SPREADSHEET ID] and [YOUR SHEET NAME] accordingly. 
5๏ธโƒฃ Final Activation Once credentials and parameters are configured: โœ… Save the workflow ๐Ÿ” Activate it! --- Ready to give this a visual punch with icons or a mini preview for the n8n template gallery? I can help dress it up in seconds!

By Marth
95