
Real-time crypto news & sentiment analysis via Telegram with GPT-4o

Stay on top of the latest crypto news and market sentiment instantly, all inside Telegram! This workflow aggregates articles from the top crypto news sources, filters them for your topic of interest, and summarizes key news and market sentiment using GPT-4o. Ideal for crypto traders, investors, analysts, and market watchers who need fast, intelligent news briefings.

> 💬 Just type a coin name (e.g., "Bitcoin", "Solana", "DeFi") into your Telegram AI Agent and get a smart news digest.

---

How It Works

1. Telegram Bot Trigger – The user sends a keyword or question (e.g., "Ethereum") to the Telegram AI Agent.
2. Keyword Extraction (AI-Powered) – An AI agent identifies the main topic for better targeting.
3. News Aggregation – Pulls articles from 9 major crypto news RSS feeds: Cointelegraph, Bitcoin Magazine, CoinDesk, Bitcoinist, NewsBTC, CryptoPotato, 99Bitcoins, CryptoBriefing, and Crypto.news.
4. Filtering – Finds and merges articles relevant to the user's keyword.
5. AI Summarization – GPT-4o generates a 3-part summary: a news summary, a market sentiment analysis, and a list of article links.
6. Telegram Response – Sends a structured, easy-to-read digest back to the user.

---

🔍 What You Can Do with This Workflow

🔹 Summarize breaking news for any crypto project or keyword
🔹 Monitor real-time market sentiment on Bitcoin, DeFi, NFTs, and more
🔹 Stay ahead of FUD, bullish trends, and major news events
🔹 Quickly brief yourself or your team via Telegram
🔹 Use it as a foundation for more advanced crypto alert bots

---

✅ Example User Inputs

✅ "Bitcoin" → Latest Bitcoin news and sentiment summary
✅ "Solana" → Updates on Solana projects, price movements, and community trends
✅ "NFT" → Aggregated news about NFT markets and launches
✅ "Layer 2" → Insights on Optimism, Arbitrum, and other L2s

---

🛠️ Setup Instructions

1. Create a Telegram Bot – Use @BotFather and obtain the bot token.
2. Configure Telegram Credentials in n8n – Add your bot token under Telegram API credentials.
3. Configure the OpenAI API – Add your OpenAI credentials for GPT-4o access.
4. Update the Telegram Send Node – Replace the placeholder chatId with your real Telegram user or group chat ID.
5. Deploy and Test – Start chatting with your bot, e.g., "Ethereum" or "DeFi".

---

📌 Workflow Highlights

- 9 major crypto news sources combined
- Smart keyword matching with AI query parsing
- Summarized insights in a human-readable format
- Reference links included for deeper reading
- Instant delivery via Telegram

---

🚀 Get ahead of the crypto market: automate your news and sentiment monitoring with AI inside Telegram!
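The filtering step above boils down to a case-insensitive keyword match across the aggregated feed items. A minimal Python sketch of that logic (the workflow itself implements this with n8n nodes; the field names `title` and `summary` are illustrative assumptions):

```python
def filter_articles(articles, keyword):
    """Keep articles whose title or summary mentions the keyword (case-insensitive)."""
    kw = keyword.lower()
    return [a for a in articles
            if kw in a.get("title", "").lower() or kw in a.get("summary", "").lower()]

articles = [
    {"title": "Bitcoin breaks $100k", "summary": "Markets rally."},
    {"title": "Solana upgrade ships", "summary": "Validators update."},
]
print(filter_articles(articles, "bitcoin"))  # keeps only the Bitcoin article
```

The matched articles are then merged and handed to the GPT-4o summarization step.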

By Don Jayamaha Jr
13520

Get live crypto market data with AI-powered CoinMarketCap agent

Access real-time cryptocurrency prices, market rankings, metadata, and global stats, powered by GPT-4o and CoinMarketCap! This modular AI-powered agent is part of a broader CoinMarketCap multi-agent system designed for crypto analysts, traders, and developers. It uses the CoinMarketCap API and intelligently routes queries to the correct tool using AI. The agent can be used standalone or triggered by a supervisor AI agent for multi-agent orchestration.

---

Supported API Tools (6 Total)

This agent intelligently selects from the following tools to answer your crypto-related questions:

🔍 Tool Summary

- Crypto Map – Look up CoinMarketCap IDs and active coins
- Crypto Info – Get metadata, whitepapers, and social links
- Crypto Listings – Ranked coins by market cap
- CoinMarketCap Price – Live prices, volume, and supply
- Global Metrics – Total market cap, BTC dominance
- Price Conversion – Convert between crypto and fiat

---

What You Can Do with This Agent

🔹 Get live prices and volume for tokens (e.g., BTC, ETH, SOL)
🔹 Convert crypto → fiat or fiat → crypto instantly
🔹 Retrieve whitepapers, logos, and website links for any token
🔹 Analyze total market cap, BTC dominance, and circulating supply
🔹 Discover new tokens and track their CoinMarketCap IDs
🔹 View the top 100 coins ranked by market cap or volume

---

Example Queries

✅ "What is the CoinMarketCap ID for PEPE?"
✅ "Show me the top 10 cryptocurrencies by market cap."
✅ "Convert 5 ETH to USD."
✅ "What's the 24h volume for ADA?"
✅ "Get the global market cap and BTC dominance."

---

AI Architecture

- AI Brain: GPT-4o-mini
- Memory: Session buffer keyed by sessionId
- Agent Type: Subworkflow AI tool
- Connected APIs: 6 CoinMarketCap endpoints
- Trigger Mode: Executes when called by a supervisor (via message and sessionId inputs)

---

Setup Instructions

1. Get a CoinMarketCap API Key – Register here: https://coinmarketcap.com/api/
2. Configure Credentials in n8n – Use HTTP Header Auth with your API key for each connected endpoint.
3. Connect This Agent to a Supervisor Workflow (Optional) – Trigger this agent using Execute Workflow with the inputs message and sessionId.
4. Test Prompts – Try asking: "Convert 1000 DOGE to BTC" or "Top 5 coins in EUR".

---

Included Sticky Notes

- Crypto Agent Guide – Agent overview, node map, and endpoint details
- Usage Instructions – Step-by-step usage and sample prompts
- Error Handling & Licensing – Troubleshooting and IP rights

---

✅ Final Notes

This agent is part of the CoinMarketCap AI Analyst System, which includes multiple specialized agents for cryptocurrencies, exchanges, community data, and DEX insights. Visit my Creator profile to find the full suite of tools.

---

Get smarter about crypto: analyze the market in real time with AI and CoinMarketCap.
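Every connected endpoint uses the same HTTP Header Auth scheme. As a sketch, here is how a price-conversion request could be assembled in Python; the header name and endpoint path follow CoinMarketCap's public API documentation, and no network call is made:

```python
API_BASE = "https://pro-api.coinmarketcap.com/v1"

def build_conversion_request(api_key, amount, symbol, convert):
    """Build the URL, auth headers, and query params for a price-conversion call."""
    return {
        "url": f"{API_BASE}/tools/price-conversion",
        "headers": {"X-CMC_PRO_API_KEY": api_key, "Accept": "application/json"},
        "params": {"amount": amount, "symbol": symbol, "convert": convert},
    }

req = build_conversion_request("YOUR_KEY", 5, "ETH", "USD")
print(req["params"])  # {'amount': 5, 'symbol': 'ETH', 'convert': 'USD'}
```

In n8n this maps onto an HTTP Request node with Header Auth credentials; the AI agent fills in the amount and symbols from the user's query.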

By Don Jayamaha Jr
8169

Medical Q&A chatbot for urology using RAG with Pinecone and GPT-4o

Medical Q&A Chatbot for Urology using RAG with Pinecone and GPT-4o

This template provides an AI-powered Q&A assistant for the urology domain using Retrieval-Augmented Generation (RAG). It uses Pinecone for vector search and GPT-4o for conversational responses.

🧠 Use Case

This chatbot is designed for clinics or medical pages that want to automate question answering for urology-related conditions. It uses a vector store of domain knowledge to return verified responses.

🔧 Requirements

✅ OpenAI API key (GPT-4o or GPT-4o-mini)
✅ Pinecone account with an active index
✅ Verified urology documents embedded into Pinecone

⚙️ Setup Instructions

1. Create a Pinecone vector index and connect it using the Pinecone credentials node.
2. Upload urology-related documents to embed using the Create Embeddings for Urology Docs node.
3. Customize the chatbot system message to reflect your medical specialty.
4. Deploy this chatbot on your website or link it with Telegram via the chat trigger node.

🛠️ Components

- chatTrigger: Listens for user messages and starts the workflow.
- Medical AI Agent: GPT-based agent guided by domain-specific instructions.
- RAG Tool Vector Store: Fetches relevant documents from Pinecone using vector search.
- Memory Buffer: Maintains conversation context.
- Create Embeddings for Urology Docs: Encodes documents into vector format.

📝 Customization

You can adapt the knowledge base to any other medical domain by:
- Updating the documents stored in Pinecone.
- Modifying the system prompt in the AI Agent node.

📣 CTA

This chatbot is ideal for clinics, medical consultants, or educational websites wanting a reliable AI assistant in urology.
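Under the hood, the RAG retrieval step ranks stored document vectors by similarity to the query embedding. Pinecone does this server-side; the sketch below only illustrates the cosine-similarity ranking idea, using made-up 3-dimensional vectors in place of real embeddings:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def top_k(query_vec, docs, k=2):
    """Return the texts of the k documents most similar to the query vector."""
    ranked = sorted(docs, key=lambda d: cosine(query_vec, d["vec"]), reverse=True)
    return [d["text"] for d in ranked[:k]]

docs = [
    {"text": "kidney stones overview", "vec": [0.9, 0.1, 0.0]},
    {"text": "UTI treatment", "vec": [0.1, 0.9, 0.0]},
    {"text": "prostate screening", "vec": [0.8, 0.2, 0.1]},
]
print(top_k([1.0, 0.0, 0.0], docs, k=2))
```

The retrieved passages are injected into the GPT-4o prompt so answers stay grounded in the verified urology documents.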

By HoangSP
3963

Send DingTalk message on new Azure DevOps pull request

This template automates sending a DingTalk message on new Azure DevOps Pull Request Created events. It uses a MySQL database to store mappings between Azure users and DingTalk users, so the right users get notified.

Set up instructions

1. Define your own path value for the ReceiveTfsPullRequestCreatedMessage Webhook node, then copy the webhook URL and create an Azure DevOps Service Hook that calls the webhook on the Pull Request Created event.
2. To configure the LoadDingTalkAccountMap node, create a MySQL table as below:

|Name|Type|Length|Key|
|-|-|-|-|
|TfsAccount|varchar|255||
|UserName|varchar|255||
|DingTalkMobile|varchar|255||

3. You can customize the DingTalk message content by editing the BuildDingTalkWebHookData node.
4. Set the URL of the SendDingTalkMessageViaWebHook HTTP Request node to your DingTalk group chat robot webhook URL.
5. Send a test or production message from Azure DevOps to verify the setup.
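The mapping table above is what lets the workflow translate a PR author's Azure DevOps account into a DingTalk mobile number for notifications. A minimal Python sketch of that lookup (in the workflow this is a MySQL query; the sample accounts are illustrative):

```python
# Rows as they would come back from the LoadDingTalkAccountMap query.
account_map = [
    {"TfsAccount": "alice@corp.com", "UserName": "Alice", "DingTalkMobile": "13800000001"},
    {"TfsAccount": "bob@corp.com", "UserName": "Bob", "DingTalkMobile": "13800000002"},
]

def dingtalk_mobile(tfs_account, rows):
    """Find the DingTalk mobile for an Azure DevOps account, or None if unmapped."""
    for row in rows:
        if row["TfsAccount"].lower() == tfs_account.lower():
            return row["DingTalkMobile"]
    return None

print(dingtalk_mobile("Alice@corp.com", account_map))  # 13800000001
```

Unmapped accounts return None, which the message-building step can use to skip the @-mention rather than fail.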

By PretenderX
2465

Bitrix24 task form widget application workflow with webhook integration

Use Case

Extend Bitrix24 tasks with custom widgets that display relevant task information and enable seamless interaction through a custom tab interface.

What This Workflow Does

- Processes incoming webhook requests from Bitrix24 task interfaces
- Handles authentication and secure token validation
- Manages application installation and placement registration
- Displays task data in a custom formatted view
- Stores and retrieves configuration settings persistently
- Provides user-friendly HTML interfaces for task information

Setup Instructions

1. Configure Bitrix24 webhook endpoints for the task widget
2. Set up authentication credentials in your Bitrix24 account
3. Install the application and register the task view tab placement
4. Customize the task data display format as needed
5. Deploy and test the application functionality within Bitrix24 tasks
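The token-validation step in such a webhook handler typically amounts to a constant-time comparison of the token Bitrix24 sends with the one stored at installation. A hedged sketch (the payload field name `application_token` is an assumption; adapt it to your actual webhook payload):

```python
import hmac

STORED_TOKEN = "s3cret-installation-token"  # saved when the app was installed

def is_valid_request(payload):
    """Accept the webhook only if its token matches the stored one (constant-time)."""
    sent = payload.get("auth", {}).get("application_token", "")
    return hmac.compare_digest(sent, STORED_TOKEN)

print(is_valid_request({"auth": {"application_token": "s3cret-installation-token"}}))  # True
print(is_valid_request({"auth": {"application_token": "wrong"}}))                      # False
```

Using hmac.compare_digest instead of == avoids leaking token length or prefix information through timing differences.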

By Ferenc Erb
2054

Render custom text over images

This workflow is triggered every Friday at 6 PM by a Cron node. It pulls in data about a random cocktail via the HTTP Request node and sends the data to a Bannerbear node to create an image based on a template. The image is then shared in a specified Rocket.Chat channel.
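The Bannerbear step essentially maps the cocktail data onto per-layer text overrides for a template. A sketch of assembling that payload (the layer names "title" and "subtitle" are assumptions that must match your own template's layers, and the field names follow the cocktail-API response this template appears to use; check both against your setup):

```python
def build_bannerbear_payload(template_uid, cocktail):
    """Map cocktail API data onto Bannerbear-style template layer modifications."""
    return {
        "template": template_uid,
        "modifications": [
            {"name": "title", "text": cocktail["strDrink"]},
            {"name": "subtitle", "text": cocktail["strInstructions"]},
        ],
    }

payload = build_bannerbear_payload(
    "tpl_123", {"strDrink": "Margarita", "strInstructions": "Shake with ice."})
print(payload["modifications"][0]["text"])  # Margarita
```

The Bannerbear node handles the authenticated API call itself; this only shows the shape of the data it receives.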

By tanaypant
1922

Track YouTube trends by country and language with RapidAPI & Google Sheets

📈 YouTube Trend Finder Workflow using n8n & RapidAPI

Description: Easily discover trending YouTube videos by country and language using this automated n8n workflow. The flow leverages the YouTube Trend Finder API and logs insights to Google Sheets, making it ideal for content creators, marketers, and researchers.

---

🔗 Node-by-Node Explanation

| Node Name | Type | Description |
|---|---|---|
| 1. On form submission | Form Trigger | Captures user input for country and language through a web form. |
| 2. Trend Finder API Request | HTTP Request | Sends a request to the YouTube Trend Finder API with the form data. |
| 3. Re format output | Code | Extracts and reshapes API response data such as title, link, and tags. |
| 4. Google Sheets | Google Sheets | Appends the trending video data to a structured spreadsheet. |

---

🎯 Use Cases

🔍 Content Research: Find top-trending videos in any region or language for idea inspiration.
📈 Marketing Intelligence: Track video trends to tailor your video marketing strategy.
📰 Trend Monitoring: Journalists and analysts can quickly surface viral video topics.

---

✅ Benefits of this Workflow

- No Coding Required: Easy-to-use form interface for non-technical users.
- Real-Time Trends: Instantly access trending YouTube content with the YouTube Trend Finder API.
- Automated Logging: Stores data directly in Google Sheets for future analysis or sharing.
- Customizable: Easily modify to accept more inputs, such as video category or max results, or to add filters.

---

Create your free n8n account and set up the workflow in just a few minutes using the link below:
👉 Start Automating with n8n

Save time, stay consistent, and track YouTube trends effortlessly!
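The "Re format output" Code node flattens the API response into rows Google Sheets can append. n8n Code nodes run JavaScript, but the reshaping logic is the same in any language; a hedged Python sketch (the response field names are illustrative assumptions, since the exact shape depends on the RapidAPI endpoint):

```python
def reshape(response):
    """Flatten a trend-API response into spreadsheet-ready rows."""
    rows = []
    for video in response.get("videos", []):
        rows.append({
            "title": video.get("title", ""),
            "link": video.get("link", ""),
            "tags": ", ".join(video.get("tags", [])),
        })
    return rows

sample = {"videos": [{"title": "Demo", "link": "https://youtu.be/x", "tags": ["a", "b"]}]}
print(reshape(sample))  # [{'title': 'Demo', 'link': 'https://youtu.be/x', 'tags': 'a, b'}]
```

Joining the tag list into a single string keeps each row one-dimensional, which is what the Google Sheets append operation expects.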

By Evoort Solutions
852

AWS Azure GCP multi-cloud cost monitoring & alerts for budget control

This automated n8n workflow tracks hourly cloud spending across AWS, Azure, and GCP. It detects cost spikes or budget overruns in real time, tags affected resources, and sends alerts via email, WhatsApp, or Slack, ensuring proactive cost management and preventing budget breaches.

---

Good to Know

- The AWS, Azure, and GCP APIs must have read access to billing data.
- Use secure credentials for API keys or service accounts.
- The workflow runs every hour for near real-time cost tracking.
- Alerts can be sent to multiple channels (Email, WhatsApp, Slack).
- Tags are applied automatically to affected resources for easy tracking.

---

How It Works

1. Hourly Cron Trigger – Starts the workflow every hour to fetch updated billing data.
2. AWS Billing Fetch – Retrieves the latest cost and usage data via the AWS Cost Explorer API.
3. Azure Billing Fetch – Retrieves subscription cost data from the Azure Cost Management API.
4. GCP Billing Fetch – Retrieves project-level spend data using the GCP Cloud Billing API.
5. Data Parser – Combines and cleans data from all three clouds into a unified format.
6. Cost Spike Detector – Identifies unusual spending patterns or budget overruns.
7. Owner Identifier – Matches resources to their respective owners or teams.
8. Auto-Tag Resource – Tags the affected resource for quick identification and follow-up.
9. Alert Sender – Sends notifications through Email, WhatsApp, and Slack with detailed cost reports.

---

How to Use

1. Import the workflow into n8n.
2. Configure credentials for the AWS, Azure, and GCP billing APIs.
3. Set your budget threshold in the Cost Spike Detector node.
4. Test the workflow to ensure all APIs fetch data correctly.
5. Adjust the Cron Trigger for your preferred monitoring frequency.
6. Monitor alert logs to track and manage cost spikes.

---

Requirements

- AWS access key and secret key with Cost Explorer read permissions.
- Azure client ID, tenant ID, and client secret with the Cost Management Reader role.
- GCP service account JSON key with the Billing Account Viewer role.

---

Customizing This Workflow

- Change the trigger frequency in the Cron node (e.g., every 15 minutes for faster alerts).
- Modify alert channels to include additional messaging platforms.
- Adjust cost spike detection thresholds to suit your organization's budget rules.
- Extend the Data Parser to generate more detailed cost breakdowns.

---

Want a tailored workflow for your business? Our experts can craft it quickly. Contact our team.
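The Cost Spike Detector comes down to comparing each resource's latest hourly spend against a fixed budget and a recent baseline. A minimal sketch of one reasonable policy: flag anything over budget, or anything more than 50% above its trailing average (both rules and all field names are illustrative assumptions to tune in the node itself):

```python
def detect_spikes(entries, budget, jump_ratio=1.5):
    """Flag entries over budget, or over jump_ratio times their trailing average."""
    flagged = []
    for e in entries:
        history = e.get("history", [])
        baseline = sum(history) / len(history) if history else 0.0
        if e["cost"] > budget or (baseline and e["cost"] > jump_ratio * baseline):
            flagged.append(e["resource"])
    return flagged

entries = [
    {"resource": "aws:ec2-prod", "cost": 120.0, "history": [40.0, 45.0, 50.0]},  # sudden jump
    {"resource": "gcp:bq-analytics", "cost": 30.0, "history": [28.0, 31.0]},     # normal
]
print(detect_spikes(entries, budget=500.0))  # ['aws:ec2-prod']
```

The flagged resource names then flow to the Owner Identifier and Auto-Tag steps before alerts go out.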

By Oneclick AI Squad
812

Email parser for RAG agent powered by Gmail and Mem0

This workflow contains community nodes that are only compatible with the self-hosted version of n8n. Alternatively, you can delete the community node and use the HTTP node instead.

Most email agent templates are fundamentally broken. They're stateless: they have no long-term memory. An agent that can't remember past conversations is just a glorified auto-responder, not an intelligent system. This workflow is Part 1 of building a truly agentic system: creating the brain. Before you can have an agent that replies intelligently, you need a knowledge base for it to draw from. This system uses a sophisticated parser to automatically read, analyze, and structure every incoming email. It then logs that intelligence into a persistent, long-term memory powered by mem0.

The Problem This Solves

Your inbox is a goldmine of client data, but it's unstructured, and manually monitoring it is a full-time job. This constant, reactive work prevents you from scaling. This workflow solves that "system problem" by creating an "always-on" engine that automatically processes, analyzes, and structures every incoming email, turning raw communication into a single source of truth for growth.

---

How It Works

This is an autonomous, multi-stage intelligence engine. It runs in the background, turning every new email into a valuable data asset.

1. Real-Time Ingest & Prep: The system is kicked off by the Gmail Trigger, which constantly watches your inbox. The moment a new email arrives, the workflow fires. The email is immediately passed to the Set Target Email node, which strips it down to the essentials: the sender's address, the subject, and the core text of the message (I prefer using the plain text or HTML-as-text for reliability). While this step is optional, it's good practice for keeping the data clean and orderly for the AI.
2. AI Analysis (The Brain): The prepared text is fed to the core of the system: the AI Agent. This agent, powered by the LLM of your choice (e.g., GPT-4), reads and understands the email's content. It's not just reading; it's performing analysis to extract the core message, determine the sentiment (Positive, Negative, Neutral), identify potential red flags, and pull out key topics and keywords. The agent uses Window Buffer Memory to recall the last 10 messages within the same conversation thread, giving it the context to provide a much smarter analysis.
3. Quality Control (The Parser): We don't trust the AI's first draft blindly. The analysis is sent to an Auto-fixing Output Parser. If the initial output isn't in a perfect JSON format, a second Parsing LLM (e.g., Mistral) automatically corrects it. This is our "twist" that guarantees your data is always perfectly structured and reliable.
4. Create a Permanent Client Record: This is the most critical step. The clean, structured data is sent to mem0, where the analysis is logged against the sender's email address. This moves beyond just tracking conversations; it builds a complete, historical intelligence file on every person you communicate with, creating an invaluable, long-term asset.

Optional Use: To back-fill historical data, disable the Gmail Trigger and temporarily connect a Gmail "Get Many" node to the Set Target Email node to process your backlog in batches.

---

Setup Requirements

To deploy this system, you'll need the following:
- An active n8n instance.
- Gmail API credentials.
- An API key for your primary LLM (e.g., OpenAI).
- An API key for your parsing LLM (e.g., Mistral AI).
- An account with mem0.ai for the memory layer.
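The auto-fixing parser pattern in the Quality Control step is easy to sketch: try to parse the model's output as JSON, and invoke the repair step only when parsing fails. In n8n this is built into the Auto-fixing Output Parser node; the Python below mimics the control flow, with a trivial fence-stripping function standing in for the second LLM:

```python
import json

def naive_repair(text):
    """Stand-in for the parsing LLM: strip markdown fences around JSON."""
    return text.strip().removeprefix("```json").removesuffix("```").strip()

def parse_with_fallback(raw):
    """Return parsed JSON, invoking the repair step only on failure."""
    try:
        return json.loads(raw)
    except json.JSONDecodeError:
        return json.loads(naive_repair(raw))

good = '{"sentiment": "Positive", "topics": ["invoice"]}'
fenced = '```json\n{"sentiment": "Negative", "topics": ["refund"]}\n```'
print(parse_with_fallback(good)["sentiment"])    # Positive
print(parse_with_fallback(fenced)["sentiment"])  # Negative
```

The key design point is that well-formed outputs skip the repair entirely, so the second LLM only costs you tokens on the malformed minority. (removeprefix/removesuffix need Python 3.9+.)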

By Stephan Koning
648

Monitor commercial real estate opportunities from LoopNet with ScrapeGraphAI & Telegram

How it works

This workflow automatically scrapes commercial real estate listings from LoopNet and sends opportunity alerts to Telegram while logging data to Google Sheets.

Key Steps

1. Scheduled Trigger – Runs every 24 hours to collect fresh CRE market data.
2. AI-Powered Scraping – Uses ScrapeGraphAI to extract property information from LoopNet.
3. Market Analysis – Analyzes listings for opportunities and generates market insights.
4. Smart Notifications – Sends Telegram alerts only when investment opportunities are found.
5. Data Logging – Stores daily market metrics in Google Sheets for trend analysis.

Set up steps

Setup time: 10-15 minutes

1. Configure ScrapeGraphAI credentials – Add your ScrapeGraphAI API key for web scraping.
2. Set up the Telegram connection – Connect your Telegram bot and specify the target channel.
3. Configure Google Sheets – Set up the Google Sheets integration for data logging.
4. Customize the LoopNet URL – Update the URL to target specific CRE markets or property types.
5. Adjust the schedule – Modify the trigger timing based on your market monitoring needs.

Keep detailed configuration notes in sticky notes inside your workflow.
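"Smart Notifications" implies a gate between the market-analysis step and Telegram: a listing only triggers a message if it clears some opportunity criteria. A sketch of one such predicate, using price per square foot against a market threshold (the threshold and field names are illustrative assumptions, not the template's actual rules):

```python
def is_opportunity(listing, max_price_per_sqft=200.0):
    """Treat a listing as an opportunity if it's priced below the market threshold."""
    sqft = listing.get("sqft", 0)
    if not sqft:
        return False  # can't compute price/sqft without square footage
    return listing["price"] / sqft < max_price_per_sqft

listings = [
    {"address": "100 Main St", "price": 1_500_000, "sqft": 10_000},  # $150/sqft
    {"address": "200 Oak Ave", "price": 3_000_000, "sqft": 12_000},  # $250/sqft
]
alerts = [l["address"] for l in listings if is_opportunity(l)]
print(alerts)  # ['100 Main St']
```

Gating alerts this way keeps the Telegram channel quiet on days when the scrape finds nothing actionable, while every listing still lands in Google Sheets for trend analysis.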

By vinci-king-01
577

Analyze lost HubSpot deals and generate revival strategies with OpenAI

How it works

This workflow runs on a daily schedule to analyze all Closed–Lost deals from your CRM and uncover the true reason behind each loss. It uses AI to classify the primary loss category, generate a confidence-backed explanation, and then create a realistic re-engagement strategy for every deal. All insights are consolidated into leadership-ready email and Slack summaries, and every analyzed deal and revival plan is logged for long-term tracking and audits.

Step-by-step

Trigger and fetch lost deals

1. Schedule Trigger – Runs the workflow automatically at a defined time.
2. Get many deals – Fetches all deal records from the CRM.
3. If – Filters only deals marked as Closed–Lost.
4. Edit Fields – Standardizes key deal attributes like amount, industry, owner, and loss reason.

Analyze loss reasons and generate revival strategies

5. Brief Explanation Creator – Uses AI to identify the primary loss category with a confidence score.
6. Code in JavaScript – Parses and normalizes the AI loss-analysis output.
7. Merge – Combines deal data with loss insights.
8. Feedback Creator – Generates a practical re-engagement strategy for each lost deal.
9. Code in JavaScript7 – Parses and safeguards revival strategy outputs.
10. Merge4 – Merges deal details, loss analysis, and revival strategy into one final dataset.

Report, notify, and store results

11. Code in JavaScript11 – Builds a consolidated HTML summary email.
12. Send a message4 – Sends the summary to stakeholders via email.
13. Code in JavaScript12 – Creates a structured Slack summary.
14. Send a message1 – Delivers insights to a Slack channel.
15. Code in JavaScript10 – Reconstructs the final data with delivery status.
16. Append or update row in sheet – Logs all results into Google Sheets for audit and tracking.

Why use this?

- Turns lost deals into actionable learning instead of static CRM records
- Gives sales teams clear, realistic re-engagement plans without manual analysis
- Provides leadership with concise, decision-ready summaries
- Creates a historical database of loss reasons and revival outcomes
- Improves pipeline recovery while enforcing consistent sales intelligence
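The "parse and normalize" Code nodes exist to guard against the model returning unexpected category labels or malformed confidence values. A hedged Python sketch of that normalization (the category list and defaults are illustrative; the template's own Code nodes are JavaScript):

```python
ALLOWED_CATEGORIES = {"pricing", "competitor", "timing", "feature_gap", "no_budget"}

def normalize_loss_analysis(ai_output):
    """Coerce AI output into a safe, predictable shape for downstream merging."""
    category = str(ai_output.get("category", "")).strip().lower().replace(" ", "_")
    if category not in ALLOWED_CATEGORIES:
        category = "other"
    try:
        # Clamp confidence into [0, 1]; tolerate strings like "0.87".
        confidence = max(0.0, min(1.0, float(ai_output.get("confidence", 0))))
    except (TypeError, ValueError):
        confidence = 0.0
    return {"category": category, "confidence": confidence,
            "explanation": ai_output.get("explanation", "")}

print(normalize_loss_analysis({"category": "Feature Gap", "confidence": "0.87"}))
```

Because every record leaving this step has the same keys and value ranges, the later Merge and Google Sheets nodes never have to handle surprises from the LLM.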

By Avkash Kakdiya
109

Analyze event feedback with sentiment analysis, Google Sheets, Slack & Email

This n8n workflow automates the collection and analysis of real-time attendee feedback and engagement data during sessions or live polls. It generates actionable insights for organizers, streamlining the process of gathering, processing, and delivering feedback to enhance event management and the attendee experience.

Key Features

- Collects session feedback and live poll responses in real time.
- Analyzes sentiment and extracts key trends for actionable insights.
- Delivers summarized reports and recommendations to organizers via multiple channels.
- Supports seamless integration with external tools for data logging and communication.

Workflow Process

1. The Webhook Trigger node captures incoming feedback or poll data from attendees, initiating the workflow.
2. The Extract Feedback Data node processes the raw input to organize and prepare it for analysis.
3. The Analyze Sentiment node uses AI to evaluate feedback sentiment and identify key themes or trends.
4. The Aggregate Feedback node compiles the analyzed data into a cohesive summary.
5. The Calculate Insights node generates actionable insights and recommendations based on the aggregated data.
6. The Check Urgency node assesses the priority of the feedback for timely responses or actions.
7. The Log to Google Sheets node records the feedback and insights for future reference.
8. The Webhook Response node sends real-time updates or acknowledgments back to the source.
9. The Post to Slack node delivers summary messages to organizers via Slack.
10. The Email Report to Organizers node sends detailed reports to organizers via email.

Setup Instructions

1. Import the workflow into n8n and configure the Webhook Trigger with your event platform's API credentials.
2. Set up the AI service for sentiment analysis and insight generation with a suitable language model.
3. Configure the Google Sheets integration for logging feedback data.
4. Set up Slack and email credentials for notifications and reports.
5. Test the workflow by sending sample feedback or poll responses to ensure proper data flow and analysis.
6. Monitor the output and adjust AI parameters or node settings as needed for optimal performance.

Prerequisites

- Webhook integration with the event platform or polling system.
- AI/LLM service for sentiment analysis and insight generation.
- Google Sheets account for data logging.
- Slack workspace and email service for notifications and reports.
- Access to real-time attendee data from the event platform.

Modification Options

- Modify the Extract Feedback Data node to include specific data fields or custom parsing rules.
- Adjust the Analyze Sentiment node to focus on particular sentiment metrics or keywords.
- Customize the Calculate Insights node to prioritize certain types of recommendations.
- Add additional notification channels (e.g., Microsoft Teams) to the Post to Slack or Email Report nodes.
- Configure the Check Urgency node to include custom urgency criteria based on event needs.
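The Aggregate Feedback and Check Urgency steps can be sketched together: tally sentiment labels across a batch, then flag the batch as urgent when negative feedback crosses a share threshold. A minimal Python illustration (the 40% threshold is an illustrative assumption to adjust per event):

```python
from collections import Counter

def aggregate(feedback):
    """Tally sentiment labels across a batch of analyzed feedback items."""
    return Counter(item["sentiment"] for item in feedback)

def is_urgent(counts, negative_share=0.4):
    """Urgent when negative feedback exceeds the given share of all responses."""
    total = sum(counts.values())
    return total > 0 and counts.get("negative", 0) / total > negative_share

batch = [{"sentiment": "positive"}, {"sentiment": "negative"},
         {"sentiment": "negative"}, {"sentiment": "neutral"}]
counts = aggregate(batch)
print(dict(counts))       # {'positive': 1, 'negative': 2, 'neutral': 1}
print(is_urgent(counts))  # True (2/4 = 50% negative)
```

The urgency flag is what lets the Slack notification fire immediately for problem sessions while routine summaries wait for the scheduled email report.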

By Oneclick AI Squad
88