
Host your own JWT authentication system with Data Tables and token management

**Description**

A production-ready authentication workflow implementing secure user registration, login, token verification, and refresh token mechanisms. Perfect for adding authentication to any application without needing a separate auth service.

**What it does**

This template provides a complete authentication backend using n8n workflows and Data Tables:
- User Registration: creates accounts with secure password hashing (SHA-512 + unique salts)
- Login System: generates access tokens (15 min) and refresh tokens (7 days) using JWT
- Token Verification: validates access tokens for protected endpoints
- Token Refresh: issues new access tokens without requiring re-login
- Security Features: HMAC-SHA256 signatures, hashed refresh tokens in the database, protection against rainbow table attacks

**Why use this template**

- No external services: everything runs in n8n, with no Auth0, Firebase, or third-party dependencies
- Production-ready security: industry-standard JWT implementation with proper token lifecycle management
- Easy integration: simple REST API endpoints that work with any frontend framework
- Fully customizable: adjust token lifespans, add custom user fields, implement your own business logic
- Well-documented: extensive inline notes explain every security decision and implementation detail

**How to set up**

Prerequisites:
- n8n instance (cloud or self-hosted)
- n8n Data Tables feature enabled

Setup steps:
1. Create Data Tables:
   - `users` table: `id`, `email`, `username`, `password_hash`, `refresh_token`
   - `refresh_tokens` table: `id`, `user_id`, `token_hash`, `expires_at`
2. Generate secret keys. Run this command to generate a random secret, and generate two different secrets for `ACCESS_SECRET` and `REFRESH_SECRET`:

   ```
   node -e "console.log(require('crypto').randomBytes(32).toString('hex'))"
   ```
3. Configure secrets: update the three "SET ACCESS AND REFRESH SECRET" nodes with your generated keys, or migrate to n8n Variables for better security (instructions in workflow notes).
4. Connect Data Tables: open each Data Table node and select your created tables from the dropdown.
5. Activate the workflow: save and activate it, then note your webhook URLs.

**API Endpoints**

- Register: `POST /webhook/register-user` with request body `{ "email": "user@example.com", "username": "username", "password": "password123" }`
- Login: `POST /webhook/login` with request body `{ "email": "user@example.com", "password": "password123" }`. Returns `{ "accessToken": "...", "refreshToken": "...", "user": {...} }`
- Verify token: `POST /webhook/verify-token` with request body `{ "access_token": "your_access_token" }`
- Refresh: `POST /webhook/refresh` with request body `{ "refresh_token": "your_refresh_token" }`

**Frontend integration example (Vue.js/React)**

Login flow:
```js
const response = await fetch('https://your-n8n.app/webhook/login', {
  method: 'POST',
  headers: { 'Content-Type': 'application/json' },
  body: JSON.stringify({ email, password })
});
const { accessToken, refreshToken } = await response.json();
localStorage.setItem('accessToken', accessToken);
```

Make authenticated requests:
```js
const data = await fetch('https://your-api.com/protected', {
  headers: { 'Authorization': `Bearer ${accessToken}` }
});
```

**Key features**

- Secure password storage: never stores plain-text passwords; uses SHA-512 with unique salts
- Two-token system: short-lived access tokens (security) + long-lived refresh tokens (convenience)
- Database token revocation: refresh tokens can be revoked for logout-all-devices functionality
- Duplicate prevention: checks username and email availability before account creation
- Error handling: generic error messages prevent information leakage
- Extensive documentation: 30+ sticky notes explain every security decision

**Use cases**

- SaaS applications needing user authentication
- Mobile app backends
- Internal tools requiring access control
- MVP/prototype authentication without third-party costs
- Learning JWT and auth system architecture

**Customization**

- Token lifespan: modify expiration times in the "Create JWT Payload" nodes
- User fields: add custom fields to registration and the user profile
- Password rules: update validation in the "Validate Registration Request" node
- Token rotation: implement refresh token rotation for enhanced security (notes included)

**Security notes**

⚠️ Important:
- Change the default secret keys before production use
- Use HTTPS for all webhook endpoints
- Store secrets in n8n Variables (not hardcoded)
- Regularly rotate secret keys in production
- Consider rate limiting for login endpoints

**Support & documentation**

The workflow includes comprehensive documentation: a complete authentication flow overview, security explanations for every decision, a troubleshooting guide, setup instructions, and an FAQ section covering common issues.

Perfect for developers who want full control over their authentication system without the complexity of managing separate auth infrastructure. Get started with n8n now!

Tags: authentication, jwt, login, security, user-management, tokens, password-hashing, api, backend
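To picture what the HMAC-SHA256 token handling described above involves, here is a minimal Node.js sketch. It is an illustration, not the workflow's exact code: the payload fields and the secret's environment variable are assumptions, and it assumes Node 16+ for `base64url` support.

```js
// Minimal JWT sign/verify sketch with HMAC-SHA256 (illustrative, not the template's code)
const crypto = require('crypto');

const b64u = (s) => Buffer.from(s).toString('base64url');

function signJwt(payload, secret) {
  const head = b64u(JSON.stringify({ alg: 'HS256', typ: 'JWT' }));
  const body = b64u(JSON.stringify(payload));
  const sig = crypto.createHmac('sha256', secret).update(`${head}.${body}`).digest('base64url');
  return `${head}.${body}.${sig}`;
}

function verifyJwt(token, secret) {
  const [head, body, sig] = token.split('.');
  const expected = crypto.createHmac('sha256', secret).update(`${head}.${body}`).digest('base64url');
  // timingSafeEqual guards the signature comparison against timing attacks
  if (sig.length !== expected.length ||
      !crypto.timingSafeEqual(Buffer.from(sig), Buffer.from(expected))) return null;
  const payload = JSON.parse(Buffer.from(body, 'base64url').toString());
  if (payload.exp && payload.exp < Date.now() / 1000) return null; // token expired
  return payload;
}

// Example: issue a 15-minute access token, then verify it
// (secret from the environment, as the security notes recommend; payload fields assumed)
const secret = process.env.ACCESS_SECRET || 'dev-secret';
const token = signJwt({ sub: 'user-id', exp: Math.floor(Date.now() / 1000) + 15 * 60 }, secret);
console.log(verifyJwt(token, secret));
```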

By Luka Zivkovic
3421

Vector database as a big data analysis tool for AI agents [2/2 KNN]

Workflows from the webinar "Build production-ready AI Agents with Qdrant and n8n". This series of workflows shows how to build big data analysis tools for production-ready AI agents with the help of vector databases. The pipelines are adaptable to any image dataset, and hence to many production use cases:

1. Uploading (image) datasets to Qdrant
2. Setting up meta-variables for anomaly detection in Qdrant
3. Anomaly detection tool
4. KNN classifier tool

For anomaly detection: the first pipeline uploads an image dataset to Qdrant. The second sets up the cluster (class) centres and cluster (class) threshold scores needed for anomaly detection. The third is the anomaly detection tool itself, which takes any image as input and uses all the preparatory work done with Qdrant to decide whether it is an anomaly relative to the uploaded dataset.

For KNN (k nearest neighbours) classification: the first pipeline uploads an image dataset to Qdrant. This workflow is the KNN classifier tool, which takes any image as input and classifies it against the dataset uploaded to Qdrant.

To recreate both: upload the crops and lands datasets from Kaggle to your own Google Cloud Storage bucket, and re-create the APIs/connections to Qdrant Cloud (a Free Tier cluster works), the Voyage AI API, and Google Cloud Storage.

[This workflow] KNN classification tool: this tool takes any image URL and returns the class of the object in the image, based on the dataset (lands) uploaded to Qdrant. An image URL is received via the Execute Workflow Trigger and sent to the Voyage AI Multimodal Embeddings API to fetch its embedding. The image's embedding vector is then used to query Qdrant, returning a set of X similar images with pre-labelled classes. Majority voting is performed over the classes of the neighbouring images. A loop resolves scenarios where the majority vote is tied by increasing the number of neighbours to retrieve. When the loop finally resolves, the identified class is returned to the calling workflow (see the sketch below).
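To make the tie-breaking loop concrete, here is a minimal JavaScript sketch of majority voting over neighbour classes. The `queryQdrant(embedding, k)` helper is hypothetical; the actual template implements this across n8n nodes and a loop, and the starting/maximum k values are illustrative.

```js
// Majority voting with tie resolution by widening the neighbourhood (illustrative sketch)
async function classify(embedding, queryQdrant, k = 5, maxK = 25) {
  while (k <= maxK) {
    const neighbours = await queryQdrant(embedding, k); // hypothetical helper
    const votes = {};
    for (const n of neighbours) votes[n.class] = (votes[n.class] || 0) + 1;

    const sorted = Object.entries(votes).sort((a, b) => b[1] - a[1]);
    // No tie: the top class wins. Tie: retry with more neighbours.
    if (sorted.length === 1 || sorted[0][1] > sorted[1][1]) return sorted[0][0];
    k += 2; // widen the neighbourhood and vote again
  }
  return null; // unresolved after maxK neighbours
}
```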

By Jenny
1685

Create LinkedIn posts with AI agents using MCP server

Complete MCP server exposing all LinkedIn Tool operations to AI agents. Zero configuration needed: the single available operation comes pre-built.

⚡ Quick Setup

Need help? Want access to more workflows and even live Q&A sessions with a top verified n8n creator, all 100% free? Join the community.

1. Import this workflow into your n8n instance
2. Activate the workflow to start your MCP server
3. Copy the webhook URL from the MCP trigger node
4. Connect AI agents using the MCP URL

🔧 How it Works
• MCP Trigger: serves as your server endpoint for AI agent requests
• Tool Nodes: pre-configured for every LinkedIn Tool operation
• AI Expressions: automatically populate parameters via $fromAI() placeholders
• Native Integration: uses the official n8n LinkedIn Tool node with full error handling

📋 Available Operations (1 total)
Every possible LinkedIn Tool operation is included:
🔧 Post (1 operation)
• Create a post

🤖 AI Integration
Parameter handling: AI agents automatically provide values for:
• Resource IDs and identifiers
• Search queries and filters
• Content and data payloads
• Configuration options
Response format: native LinkedIn Tool API responses with the full data structure
Error handling: built-in n8n error management and retry logic

💡 Usage Examples
Connect this MCP server to any AI agent or workflow:
• Claude Desktop: add the MCP server URL to its configuration
• Custom AI apps: use the MCP URL as a tool endpoint
• Other n8n workflows: call MCP tools from any workflow
• API integration: direct HTTP calls to MCP endpoints

✨ Benefits
• Complete coverage: every LinkedIn Tool operation available
• Zero setup: no parameter mapping or configuration needed
• AI-ready: built-in $fromAI() expressions for all parameters
• Production-ready: native n8n error handling and logging
• Extensible: easily modify or add custom logic

> 🆓 Free for community use! Ready to deploy in under 2 minutes.
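For context on the $fromAI() placeholders mentioned above: in n8n tool nodes, a parameter can be left for the connected agent to fill with an expression along these lines. This is a sketch; the key name, description, and type argument here are illustrative, so check the n8n docs for the exact signature.

```js
// Illustrative parameter value for the "Create a post" tool node:
// the connected agent supplies `postText` at call time, guided by the description.
{{ $fromAI('postText', 'The body text of the LinkedIn post to publish', 'string') }}
```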

By David Ashby
1634

Daily weather reports with OpenWeatherMap and Telegram Bot

Get automated weather updates delivered directly to your Telegram chat at scheduled intervals. This workflow fetches current weather data from OpenWeatherMap and sends formatted weather reports via a Telegram bot.

Use Cases
- Daily morning weather briefings
- Regular weather monitoring for outdoor activities
- Automated weather alerts for specific locations
- Personal weather assistant for travel planning

Prerequisites
Before setting up this workflow, ensure you have:
- An OpenWeatherMap API account (free tier available)
- A Telegram bot token
- Your Telegram chat ID
- An n8n instance (cloud or self-hosted)

Setup Instructions

Step 1: Create an OpenWeatherMap account
1. Go to OpenWeatherMap and sign up for a free account
2. Navigate to the API keys section in your account
3. Copy your API key (you'll need this for the workflow configuration)

Step 2: Create a Telegram bot
1. Open Telegram and search for @BotFather
2. Start a chat and use the /newbot command
3. Follow the prompts to create your bot and get the bot token
4. Save the bot token securely

Step 3: Get your Telegram chat ID
1. Start a conversation with your newly created bot
2. Send any message to the bot
3. Visit https://api.telegram.org/bot<YourBOTToken>/getUpdates in your browser
4. Look for your chat ID in the response (it will be a number like 123456789)

Step 4: Configure the workflow
Import this workflow into your n8n instance, then configure each node with your credentials:
- Schedule Trigger node: set your preferred schedule (default: daily at 8:00 AM) using cron expression format (e.g., `0 8 * * *` for 8 AM daily)
- Get Weather node: add your OpenWeatherMap credentials and update the cityName parameter to your desired location, in the format "CityName,CountryCode" (e.g., "London,UK")
- Send a text message node: add your Telegram bot credentials (bot token) and replace XXXXXXX in the chatId field with your actual chat ID

Customization Options

Location settings: in the "Get Weather" node, modify the cityName parameter to change the location. You can specify:
- City name only: "Paris"
- City with country: "Paris,FR"
- City with state and country: "Miami,FL,US"

Schedule frequency: in the "Schedule Trigger" node, adjust the cron expression:
- Every 6 hours: `0 */6 * * *`
- Twice daily (8 AM & 6 PM): `0 8,18 * * *`
- Weekly on Mondays at 9 AM: `0 9 * * 1`

Message format: in the "Format Weather" node, you can customize the message template by modifying the message variable in the function code (see the sketch after this description). The current format includes:
- Current temperature with "feels like" temperature
- Min/max temperatures for the day
- Weather description and precipitation
- Wind speed and direction
- Cloud coverage percentage
- Sunrise and sunset times

Language support: in the "Get Weather" node, change the language parameter to get weather descriptions in different languages: English "en", Spanish "es", French "fr", German "de", Polish "pl".

Troubleshooting

Weather data not updating:
- Verify your OpenWeatherMap API key is valid and active
- Check if you've exceeded your API rate limits
- Ensure the city name format is correct

Messages not being sent:
- Confirm your Telegram bot token is correct
- Verify the chat ID is accurate (it should be a number, not a username)
- Make sure you've started a conversation with your bot

Workflow not triggering:
- Check that the workflow is activated (the toggle switch should be ON)
- Verify the cron expression syntax is correct
- Ensure your n8n instance is running continuously

Testing the workflow:
1. Use the "Test workflow" button to run it manually
2. Check each node's output for errors
3. Verify the final message format in Telegram

Node Descriptions
- Schedule Trigger: automatically starts the workflow based on a cron schedule, running at specified intervals to fetch fresh weather data
- Get Weather: connects to the OpenWeatherMap API to retrieve current weather conditions for the specified location
- Format Weather: processes the raw weather data and creates a user-friendly message with emojis and organized information
- Send a text message: delivers the formatted weather report to your Telegram chat using the configured bot

Additional Features
You can extend this workflow by:
- Adding weather alerts for specific conditions (temperature thresholds, rain warnings)
- Including weather forecasts for multiple days
- Sending reports to multiple chat recipients
- Adding location-based emoji selection
- Integrating with other notification channels (email, Slack, Discord)

Security Notes
- Keep your API keys and bot tokens secure
- Don't share your chat ID publicly
- Consider using n8n's credential system for storing sensitive information
- Regularly rotate your API keys for better security

Special thanks to Arkadiusz, the only person who supports me in the n8n mission to make automation great again.
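For orientation, the "Format Weather" function code might look roughly like this minimal sketch, assuming the OpenWeatherMap current-weather JSON shape. Field names follow the public OpenWeatherMap API; the template's actual message template and emoji choices may differ.

```js
// Sketch of an n8n Code node building the Telegram message (illustrative)
const w = $input.first().json; // OpenWeatherMap current-weather response

const toTime = (unix) =>
  new Date(unix * 1000).toLocaleTimeString('en-GB', { hour: '2-digit', minute: '2-digit' });

const message = [
  `🌤 Weather for ${w.name}`,
  `🌡 ${w.main.temp}°C (feels like ${w.main.feels_like}°C)`,
  `↕️ Min ${w.main.temp_min}°C / Max ${w.main.temp_max}°C`,
  `📋 ${w.weather[0].description}`,
  `💨 Wind ${w.wind.speed} m/s at ${w.wind.deg}°`,
  `☁️ Cloud cover ${w.clouds.all}%`,
  `🌅 Sunrise ${toTime(w.sys.sunrise)}  🌇 Sunset ${toTime(w.sys.sunset)}`,
].join('\n');

return [{ json: { message } }];
```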

By Dariusz Koryto
1171

Auto file organizer for Google Drive: sort PDFs, images & documents by type

Description: This ready-to-deploy n8n automation template smartly detects and classifies files uploaded to a specified Google Drive folder based on MIME type. It automatically moves each file into its correct destination folder (Documents, PDFs, or Images), ensuring a clean and organized Drive, effortlessly. Perfect for remote teams, admins, educators, legal pros, and automation-focused operations, this workflow eliminates manual sorting and saves hours of repetitive work.

What This Template Does (Step-by-Step)
⚙️ Manual Trigger: launch the workflow on demand using the "Execute Workflow" trigger.
📁 Search Files in Source Folder (Google Drive): lists all files inside your chosen folder (e.g., "Uploads").
🔁 Loop Over Files (SplitInBatches): iterates through each file one by one to ensure reliability.
📥 Download File (Google Drive): retrieves the file metadata and MIME type required for filtering.
🧠 Smart File Type Detection via If nodes (a sketch of this routing logic follows below):
- application/json → move to Documents folder
- application/pdf → move to PDFs folder
- image/jpeg → move to Images folder
(Easily customizable to support additional types like PNG, DOCX, etc.)
📂 Move Files to Designated Folders: uses the Google Drive API to relocate each file to its proper location.
🔁 Loop Returns for Next File: after each move, the loop picks the next file in the queue.

Key Features
⚙️ Google Drive API v3 integration
🔐 OAuth2 for secure access
📄 MIME-type-based routing logic
🔁 Batch-safe with looping logic
✅ File properties are preserved
🔄 Auto-removal from source after sorting

Required Integration
- Google Drive (OAuth2)

Use Cases
- Auto-organize client uploads
- Separate scanned PDFs, images, or forms
- Route invoices, receipts, or contracts into folders
- Automatically sort uploaded assignments or resources
- Maintain structured cloud storage without manual intervention

Why Use This Template?
✅ No-code deployment
✅ Saves hours of manual work
✅ Works across teams, departments, or shared Drives
✅ Easy to expand with more file types or routing rules
✅ Keeps your Drive clean, fast, and organized
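As referenced in the step list above, here is a minimal sketch of the MIME-type routing decision, written as a single n8n Code node for compactness; the template itself uses a chain of If nodes, and the folder IDs are placeholders you would replace with your own.

```js
// Map MIME types to destination Google Drive folder IDs (placeholders, illustrative)
const FOLDERS = {
  'application/json': 'DOCUMENTS_FOLDER_ID', // per the template's routing table
  'application/pdf': 'PDFS_FOLDER_ID',
  'image/jpeg': 'IMAGES_FOLDER_ID',
  // extend here, e.g. 'image/png': 'IMAGES_FOLDER_ID'
};

// Tag each file item with its target folder; a later Drive node performs the move
return $input.all().map((item) => {
  const mime = item.json.mimeType;
  return { json: { ...item.json, targetFolderId: FOLDERS[mime] ?? null } };
});
```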

By Rahul Joshi
988

Update Shopify order tags when an Onfleet event happens

Summary
Onfleet is last-mile delivery software that provides end-to-end route planning, dispatch, communication, and analytics to handle the heavy lifting so you can focus on your customers. This workflow template automatically updates the tags on a Shopify order when an Onfleet event occurs.

Configurations
- Update the Onfleet trigger node with your own Onfleet credentials. To register for an Onfleet API key, visit https://onfleet.com/signup to get started.
- You can easily change which Onfleet event to listen to. Learn more about Onfleet webhooks with Onfleet Support.
- Update the Shopify node with your Shopify credentials and add your own tags to the Shopify order.

By James Li
876

Spotify sync liked songs to playlist

Short and simple: this workflow syncs (adds and deletes) your Liked Songs to a custom playlist that can be shared.

Setup:
1. Create an app on the Spotify Developer Dashboard.
2. Create Spotify credentials: just click on one of the Spotify nodes in the workflow, click "create new credentials", and follow the guide.
3. Create the Spotify playlist that you want to sync to.
4. Copy the exact name of your playlist, go into the "Edit set Vars" node, and replace the value "CHANGE MEEEE" with your playlist name.
5. Set your Spotify credentials on every Spotify node. (These should be marked with yellow and red notes.)
6. Do you use Gotify?
   - No: delete the Gotify nodes (all the way at the right end of the workflow).
   - Yes: customize the Gotify nodes to your needs.

By Dustin
817

Sync new Shopify products to Odoo products

Seamlessly sync newly created Shopify products into Odoo with this smart, no-code n8n workflow. It automatically checks for duplicates and creates new products only when needed, and is fully compatible with a standard Odoo setup.

🚀 Key Features:
- Trigger-based automation: starts instantly when a new product is created in Shopify
- Duplicate check: searches for an existing product in Odoo using Shopify's product SKU as the Internal Reference (Default Code)
- Smart condition logic:
  - If a matching product is found, the workflow stops, preventing duplicates
  - If not found, a new product is created in Odoo
- Auto-creation of the Odoo product using Shopify data: product name, description, price, Internal Reference (SKU)
- Works with both Odoo Community and Enterprise editions
- Fully customizable and extendable in n8n
- No coding, no custom modules required

🛠 Requirements:
- n8n (self-hosted or cloud)
- Shopify API access
- Odoo instance with standard API access

💡 A perfect solution for businesses that regularly add new products to Shopify and want them mirrored in Odoo, automatically and without manual effort.

How the workflow integrates a Shopify product with an Odoo product (a sketch follows below):
1. Trigger: the workflow starts when a new product is created in Shopify.
2. Check existing products: it searches Odoo to check whether any product already exists with the same Internal Reference / Default Code (taken from the Shopify product data).
3. Condition:
   - If a matching product is found, the workflow stops (no duplicates are created).
   - If no matching product is found, it proceeds to create a new product.
4. Create product in Odoo: a new product is created in Odoo using the following fields from Shopify: product name, description, price, Internal Reference (Default Code).
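A minimal sketch of the duplicate-check-then-create logic described above, assuming hypothetical `odooSearch`/`odooCreate` helpers (the template uses Odoo nodes rather than code). Field names follow Shopify's product JSON and Odoo's `product.template` model, where `default_code` is the Internal Reference.

```js
// Duplicate check by SKU, then create (illustrative sketch, not the template's code)
async function syncProduct(shopifyProduct, odooSearch, odooCreate) {
  const sku = shopifyProduct.variants[0].sku; // Shopify stores SKUs per variant
  const existing = await odooSearch('product.template', [['default_code', '=', sku]]);
  if (existing.length > 0) return null; // duplicate found: stop, per the workflow's condition

  return odooCreate('product.template', {
    name: shopifyProduct.title,
    description: shopifyProduct.body_html,
    list_price: Number(shopifyProduct.variants[0].price),
    default_code: sku,
  });
}
```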

By Evozard
713

Create multi-sheet Excel workbooks by merging datasets with Google Drive & Sheets

Create multi-sheet Excel workbooks in n8n to automate reporting using Google Drive + Google Sheets.

Build an automated Excel file with multiple tabs directly in n8n. Two Code nodes generate datasets, each is converted into its own Excel worksheet, then combined into a single .xlsx and (optionally) appended to a Google Sheet for sharing, eliminating manual copy-paste and speeding up reporting.

Who's it for
- Teams that publish recurring reports as Excel files with multiple tabs
- Ops/Marketing/Data folks who want a no-code/low-code way to package JSON into Excel
- n8n beginners learning the Code → Convert to File → Merge pattern

How it works
1. Manual Trigger starts the run.
2. Code nodes emit JSON rows for each table (e.g., People, Locations); a sketch of one such node follows below.
3. Convert to File nodes turn each JSON list into an Excel binary, assigning Sheet1/Sheet2 (or your names).
4. Merge combines both binaries into a single Excel workbook with multiple tabs.
5. Google Sheets (optional) appends the JSON rows to a live spreadsheet for collaboration.

Setup (only 2 connections)

1️⃣ Connect Google Sheets (OAuth2)
- In n8n → Credentials → New → Google Sheets (OAuth2)
- Sign in with your Google account and grant access
- Copy the example sheet referenced in the Google Sheets node (open the node and duplicate the linked sheet), or select your own
- In the workflow's Google Sheets node, select your Spreadsheet and Worksheet
- Example sheet: https://docs.google.com/spreadsheets/d/1G6FSm3VdMZt6VubM6g8j0mFw59iEw9npJE0upxj3Y6k/edit?gid=1978181834

2️⃣ Connect Google Drive (OAuth2)
- In n8n → Credentials → New → Google Drive (OAuth2)
- Sign in with the Google account that will store your Excel outputs and allow access
- In your Drive-related nodes (if used), point to the folder where you want the .xlsx saved or retrieved

Customize the workflow
- Replace the sample arrays in the Code nodes with your data (APIs, DBs, CSVs, etc.)
- Rename sheetName in each Convert to File node to match your desired tab names
- Keep the Merge node in Combine All mode to produce a single workbook
- In Google Sheets, switch to Manual mapping for strict column order (optional)

Best practices (per template guidelines)
- Rename nodes to clear, action-oriented names (e.g., "Build People Sheet", "Build Locations Sheet")
- Add a yellow sticky note at the top with this description so users see setup in-workflow
- Do not hardcode credentials inside HTTP nodes; always use n8n Credentials
- Remove personal IDs/links before publishing

Sticky Note (copy-paste)
> Multi-Tab Excel Builder (Google Drive + Google Sheets)
> This workflow generates two datasets (Code → JSON), converts each to an Excel sheet, merges them into a single workbook with multiple tabs, and optionally appends rows to Google Sheets.
>
> Setup (2 connections):
> 1) Google Sheets (OAuth2): Create credentials → duplicate/select your target spreadsheet → set Spreadsheet + Worksheet in the node.
> 2) Google Drive (OAuth2): Create credentials → choose the folder for storing/retrieving the .xlsx.
>
> Customize: Edit the Code nodes' arrays, rename tab names in Convert to File, and adjust the Sheets node mapping as needed.

Troubleshooting
- Missing columns / wrong order: use Manual mapping in the Google Sheets node
- Binary not found: ensure each Convert to File node's binaryPropertyName matches what Merge expects
- Permissions errors: re-authorize Google credentials; confirm you have edit access to the target Sheet/Drive folder

📬 Contact
Need help customizing this (e.g., filtering by campaign, sending reports by email, or formatting your PDF)?
📧 rbreen@ynteractive.com
🔗 https://www.linkedin.com/in/robert-breen-29429625/
🌐 https://ynteractive.com
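As mentioned in step 2 of "How it works", each dataset starts life in a Code node. Here is a minimal sketch of one such node (e.g., "Build People Sheet") with illustrative sample rows; the row objects' keys become that tab's column headers when the Convert to File node writes the worksheet.

```js
// Sketch of a dataset-producing Code node (sample data is illustrative)
const people = [
  { name: 'Ada Lovelace', role: 'Analyst', city: 'London' },
  { name: 'Grace Hopper', role: 'Engineer', city: 'New York' },
];

// n8n Code nodes return an array of items, each wrapping one row in `json`
return people.map((row) => ({ json: row }));
```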

By Robert Breen
539

Build comprehensive entity profiles with GPT-4, Wikipedia & vector DB for content

This n8n template demonstrates how to build an intelligent entity research system that automatically discovers, researches, and creates comprehensive profiles for business entities, concepts, and terms.

Use cases are many: try automating glossary creation for technical documentation, building standardized definition databases for compliance teams, researching industry terminology for content creation, or developing training materials with consistent entity explanations!

Good to know
- Each entity research run typically costs $0.08-$0.34, depending on the complexity and sources required.
- The workflow includes smart duplicate detection to minimize unnecessary API calls.
- The workflow requires multiple AI services and a vector database, so setup time may be longer than for simpler templates.
- Entity definitions are stored locally in your Qdrant database and can be reused across multiple projects.

How it works
- The workflow checks your existing knowledge base first to avoid duplicate research on entities you've already processed (see the sketch below).
- If the entity is new, an AI research agent intelligently combines your vector database, Wikipedia, and live web research to gather comprehensive information.
- The system creates structured entity profiles with definitions, categories, examples, common misconceptions, and related entities, perfect for business documentation.
- AI-powered validation ensures all entity profiles are complete, accurate, and suitable for business use before storage.
- Each researched entity is stored in your Qdrant vector database, creating a growing knowledge base that improves research efficiency over time.
- The workflow includes multiple stages of duplicate prevention to avoid unnecessary processing and API costs.

How to use
- The manual trigger node is used as an example; feel free to replace it with other triggers such as form submissions, content management systems, or automated content pipelines.
- You can research multiple related entities in sequence, and the system will automatically identify connections and relationships between them.
- Provide topic and audience context to get tailored explanations suited to your specific business needs.

Requirements
- OpenAI API account for o4-mini (entity research and validation)
- Qdrant vector database instance (local or cloud)
- Ollama with the nomic-embed-text model for embeddings
- The "Automate Web Research with GPT-4, Claude & Apify for Content Analysis and Insights" workflow (for live web research capabilities)
- Anthropic API account for Claude Sonnet 4 (used by the web research workflow)
- Apify account for web scraping (used by the web research workflow)

Customizing this workflow
Entity research automation can be adapted for many specialized domains. Try focusing on specific industries like legal terminology (targeting official legal sources), medical concepts (emphasizing clinical accuracy), or financial terms (prioritizing regulatory definitions). You can also customize the validation criteria to match your organization's specific quality standards.
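A minimal sketch of the knowledge-base duplicate check referenced above, assuming hypothetical `embed()` and `qdrantSearch()` helpers; the template implements this with Ollama embeddings and Qdrant nodes, and the similarity threshold and collection name shown here are illustrative.

```js
// Skip research if a near-identical entity profile is already stored (sketch)
async function isAlreadyResearched(entityName, embed, qdrantSearch) {
  const vector = await embed(entityName);               // e.g., nomic-embed-text vector
  const hits = await qdrantSearch('entities', vector, { limit: 1 });
  // Treat a very close stored profile as a duplicate and reuse it instead
  return hits.length > 0 && hits[0].score >= 0.9;       // threshold is illustrative
}
```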

By Peter Zendzian
448

Stock fundamental analysis & AI-powered reports with Mistral and AlphaVantage

Fundamental Analysis, Stock Analysis, and AI Integration in the Fundamental Analysis Tool

---

Overview of the Tool

The Fundamental Analysis Tool is an automated workflow designed to evaluate a stock's fundamentals using financial data and AI-driven insights. Built on the n8n automation platform, it:
- Collects financial data for a user-specified stock from AlphaVantage.
- Processes and structures this data for analysis.
- Analyzes the data using the Mistral AI model to provide expert-level insights.
- Generates a visually appealing HTML report with charts and delivers it via email.

The tool is triggered by a form where users input a stock symbol (e.g., "NVDA" for NVIDIA) and their email address. From there, it follows a three-stage process: data retrieval, data processing, and AI analysis with report generation.

---

Fundamental Analysis: The Foundation

Fundamental analysis involves evaluating a company's intrinsic value by examining its financial health, competitive position, and market environment. This tool performs fundamental analysis as follows.

Data Retrieval
Six types of data are retrieved via HTTP requests (a sketch of one such request follows below):
- Overview: general company details (e.g., sector, industry, market cap)
- Income Statement: revenue, net income, and profitability metrics
- Balance Sheet: assets, liabilities, and equity
- Cash Flow: operating, investing, and financing cash flows
- Earnings Calendar: upcoming earnings events
- Earnings: historical earnings data (annual and quarterly)

Key Metrics Analyzed
The tool structures this data into 8 categories critical to fundamental analysis, as defined in the "Code1" node:
1. Economic Moats & Competitive Advantage: assesses sustainable advantages (e.g., R&D spending, gross profit)
2. Financial Health & Profitability: examines ROE, debt levels, and dividend yield
3. Valuation & Market Sentiment: evaluates P/E ratio, PEG ratio, and book value
4. Management & Capital Allocation: reviews market cap justification and cash allocation (e.g., R&D, buybacks)
5. Industry & Risk Exposure: analyzes revenue cyclicality and geopolitical risks
6. Key Metrics to Probe: investigates net income trends and gross margins
7. Red Flags: identifies risks like inventory issues or stock dilution
8. Final Checklist: summarizes pricing power and risk/reward potential

These categories cover the core pillars of fundamental analysis, ensuring a holistic evaluation of the stock's intrinsic value and risks.

---

Stock Analysis: Tailored Insights

The tool performs stock-specific analysis by focusing on the user-provided stock symbol. Here's how it tailors the process.

Input and Customization
- Form submission: users enter a stock symbol (e.g., "NVDA") and email via the "On Form Submission" node.
- Dynamic data fetching: the "Set Variables" node passes the stock symbol to the API calls, ensuring the analysis is specific to the chosen stock.

Processing for Relevance
- Data filtering: the workflow limits historical data to the last 5 years (via the "Limit" node), focusing on recent trends.
- Merging and cleaning: the "Merge" and "Code2" nodes combine and refine the data, removing irrelevant fields (e.g., quarterly reports) and aggregating annual reports for consistency.

Output
The final report is titled with the stock's name (e.g., "Fundamental Analysis - NVIDIA"), ensuring the analysis is clearly tied to the user's chosen stock. This stock-specific approach makes the tool practical for investors analyzing individual companies rather than broad market trends.

---

AI Integration: Expert-Level Insights

The integration of AI (via the Mistral model or others) is what sets this tool apart, automating complex analysis and report generation. Here's how AI is woven into the workflow.

Data Preparation for AI
The "Code1" node organizes the raw data into a JSON schema aligned with the eight fundamental analysis categories. This structured data is fed into the AI for analysis.

AI Analysis
- Node: "Basic LLM Chain" uses the Mistral AI model.
- Prompt: the AI is instructed to act as an "expert financial advisor with 50 years of experience" and answer specific questions for each category, such as:
  - Economic Moats: "What sustainable competitive advantages protect the company's margins?"
  - Financial Health: "Is ROE driven by leverage or true profitability?"
  - Red Flags: "Are supply chain issues a concern?"
- Output: the AI generates a JSON response with detailed insights, e.g.:

```json
{
  "Economic Moats & Competitive Advantage": "NVIDIA's leadership in GPU technology and strong R&D investment...",
  "Financial Health & Profitability": "ROE of 25% is exceptional, driven by profitability rather than leverage...",
  ...
}
```

- Validation: an "Auto-fixing Output Parser" ensures the output adheres to the expected JSON schema, retrying if necessary.

Report Enhancement
- HTML generation: the "HTML" node creates an initial report with placeholders for the AI's insights and Google Charts for visualizations (e.g., ROE trends, revenue growth).
- AI-driven refinement: the "Basic LLM Chain1" node uses Mistral again to enhance the HTML, adding:
  - Styled tables (e.g., financial ratios)
  - Charts (e.g., bar charts for valuation, line charts for revenue)
  - Visual indicators (e.g., ✅ for positive trends, ⚠️ for risks)
  - Mobile-responsive design with modern fonts (Inter or Roboto)

This dual AI approach, one model pass for analysis and one for presentation, ensures the output is both insightful and user-friendly.

---

Strengths and Limitations

Strengths
- Comprehensive: covers all key aspects of fundamental analysis
- AI-powered: automates expert-level insights and report design
- User-friendly: delivers an interactive, visual report via email

Limitations
- Data dependency: relies on public data, so data quality and timeliness matter
- AI constraints: insights depend on the AI's capabilities; it may miss nuanced human judgment
- Disclaimer: the tool notes it is not investment advice, so users should consult advisors

---
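To make the data-retrieval stage concrete, here is a minimal sketch of one of the six AlphaVantage requests (the company Overview). The endpoint and parameters follow the public AlphaVantage API; the environment variable is an assumption, and the template issues these calls from HTTP Request nodes rather than code.

```js
// Fetch the AlphaVantage company Overview for the user's symbol (illustrative sketch;
// assumes an async context such as modern Node ESM). The other five data types use
// similar calls with function=INCOME_STATEMENT, BALANCE_SHEET, CASH_FLOW, etc.
const symbol = 'NVDA';                                   // from the form submission
const apiKey = process.env.ALPHAVANTAGE_API_KEY;         // assumed env var

const url = `https://www.alphavantage.co/query?function=OVERVIEW&symbol=${symbol}&apikey=${apiKey}`;
const overview = await fetch(url).then((r) => r.json());

console.log(overview.Sector, overview.Industry, overview.MarketCapitalization);
```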

By Sebastian/OptiLever
410

Daily Competitor Tweet Summarizer with X API, GPT-5-Nano, and Gmail Delivery

Automated Daily Competitor Tweet Summarizer with X API, GPT-5-Nano, and Gmail

Stay on top of your competition with this powerful n8n workflow that automatically fetches and summarizes your competitors' latest tweets every day. Using the official X (formerly Twitter) API and OpenAI's GPT-5-Nano model, this template extracts insights from public tweets and sends concise summaries directly to your Gmail inbox.

Ideal for marketing teams, product managers, PR professionals, and competitive intelligence analysts, this solution turns noisy social feeds into clear, actionable summaries, automated and customized to your needs.

---

Features
- Daily automation: fetches competitor tweets every 24 hours using the X API
- AI summarization: uses GPT-5-Nano to highlight key insights and themes
- Smart filtering: cleans and filters tweets for relevance before summarizing
- Email delivery: sends summaries to Gmail (or your team's inbox)
- Fully customizable: modify schedules, accounts, and integrations as needed

---

Setup Instructions
1. Get API keys:
   - X API (Bearer Token) from developer.x.com
   - OpenAI API key from platform.openai.com
   - Gmail OAuth2 credentials (via Google Cloud Console)
2. Configure in n8n:
   - Import the workflow
   - Add credentials under the "Credentials" tab
   - Set target X usernames and the schedule
3. Customize delivery (optional):
   - Set the email subject and recipients
   - Add additional integrations (e.g., Slack, Notion, Sheets)

---

How It Works
1. Trigger: a daily cron node initiates the workflow.
2. Fetch user ID: the workflow uses the X API to retrieve the user ID for the provided username. This step is necessary because the tweet retrieval endpoint requires a user ID, not a username. (A sketch of these two calls follows below.)
3. Fetch tweets: using the retrieved user ID, the workflow queries the X API for recent tweets from the selected account.
4. Clean data: filters out replies, retweets, and any irrelevant content so only meaningful tweets are summarized.
5. Summarize: GPT-5-Nano processes the cleaned tweet content and generates a concise, insightful summary.
6. Send email: the Gmail node sends the final summary to your inbox or chosen recipient.

---

Use Cases
- Track competitor announcements and marketing messages
- Automate daily social media briefs for leadership
- Monitor trends in your industry effortlessly
- Keep your team aligned with market developments

---

Requirements
- Valid X API credentials (Bearer token)
- OpenAI API key
- Gmail OAuth2 credentials
- Access to n8n (cloud or self-hosted)

---

Delivery Options
While Gmail is the default, you can easily extend the workflow to integrate with:
- Slack
- Notion
- Google Sheets
- Webhooks
- Any supported n8n integration

---

Automate your competitive intelligence process and stay informed, without lifting a finger.
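A minimal sketch of the two X API calls from steps 2-3 of "How It Works", using the documented v2 endpoints; the handle and bearer-token variable are placeholders, and it assumes an async context (e.g., modern Node ESM).

```js
// Resolve a username to a user ID, then fetch recent tweets (illustrative sketch)
const headers = { Authorization: `Bearer ${process.env.X_BEARER_TOKEN}` }; // assumed env var

const user = await fetch(
  'https://api.x.com/2/users/by/username/competitor_handle', // placeholder handle
  { headers }
).then((r) => r.json());

// Exclude replies and retweets up front, mirroring the workflow's clean-up step
const tweets = await fetch(
  `https://api.x.com/2/users/${user.data.id}/tweets?max_results=50&exclude=replies,retweets`,
  { headers }
).then((r) => r.json());

console.log((tweets.data ?? []).map((t) => t.text));
```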

By ScoutNow
389