
Website content scraper & SEO keyword extractor with GPT-5-mini and Airtable

This workflow scrapes website content, cleans the HTML, extracts structured information with GPT-5-mini, and stores the results along with SEO keywords in Airtable. Ideal for building keyword lists and organizing web content for SEO research.

---

Setup Instructions

Prerequisites

- n8n Community or Cloud instance
- Airtable account with a base and table ready
- OpenAI API key with access to GPT-5-mini

---

Airtable Structure

Ensure your Airtable table has the following fields:

| Field Name   | Type    | Notes                           |
| ------------ | ------- | ------------------------------- |
| Website Name | String  | Name or URL of the website      |
| Data         | String  | Cleaned website text            |
| Keyword      | String  | Extracted SEO keyword list      |
| Status       | Options | Values: Todo, In progress, Done |

---

Node Setup

✅ Form Trigger: Collects the website URL from the user.
✅ HTTP Request: Fetches the website content.
✅ HTML Cleaner (Code Node): Strips out styles, tags, and whitespace to get clean text (a sketch follows below).
✅ Topic Extractor (AI Agent + GPT-5-mini): Extracts topic-wise information from the cleaned website content.
✅ Text Cleaner (Code Node): Removes unwanted formatting symbols from the model output.
✅ Keyword Extractor (AI Agent + GPT-5-mini): Generates a list of 90 important SEO keywords.
✅ Airtable Upsert: Stores the cleaned data, keywords, and status in Airtable.

---

Key Features

✅ Automatic website content scraping
✅ Clean HTML and extract plain text
✅ Use GPT-5-mini for topic-wise information extraction
✅ Generate 90-keyword SEO lists
✅ Store and manage data in Airtable

---

Use Cases

- SEO keyword research
- Competitor website content analysis
- Structured website data collection

---

Additional Workflow Recommendations

✅ Rename nodes for clarity:

| Current Name | Suggested Name                 |
| ------------ | ------------------------------ |
| Website Name | Website URL Input Form         |
| HTTP Request | Fetch Website Content          |
| Code         | HTML to Plain Text Cleaner     |
| Split Out1   | Clean Text Splitter            |
| AI Agent1    | Topic Extractor (GPT-5-mini)   |
| Code1        | Text Cleanup Formatter         |
| Split Out2   | Final Text Splitter            |
| AI Agent     | Keyword Extractor (GPT-5-mini) |
| Airtable     | Airtable Data Upsert           |
| Wait1        | Delay Before Merge             |
| Merge        | Combine Data for Airtable      |

---
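
The cleaner's source isn't included in the listing; as a minimal sketch of what the HTML Cleaner Code node could look like (the `data` and `cleanText` field names are assumptions, not taken from the template itself):

```javascript
// Minimal HTML-to-plain-text cleaner for an n8n Code node. Assumes the
// HTTP Request node placed the raw page HTML in each item's `data` field.
return $input.all().map((item) => {
  const html = item.json.data || '';

  const text = html
    // drop <script> and <style> blocks together with their contents
    .replace(/<script[\s\S]*?<\/script>/gi, '')
    .replace(/<style[\s\S]*?<\/style>/gi, '')
    // strip all remaining tags
    .replace(/<[^>]+>/g, ' ')
    // decode a couple of common HTML entities
    .replace(/&nbsp;/g, ' ')
    .replace(/&amp;/g, '&')
    // collapse the runs of whitespace left behind
    .replace(/\s+/g, ' ')
    .trim();

  return { json: { cleanText: text } };
});
```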

By Abhishek Patoliya

Automated law firm lead management & scheduling with AI, Jotform & Calendar

YouTube explanation: https://youtu.be/KgmNiV7SwkU

This n8n workflow automates the initial intake and scheduling for a law firm. It is split into two main parts:

1. New Inquiry Handling: Kicks off when a potential client fills out a JotForm, saves their data, and sends them an initial welcome message on WhatsApp.
2. Appointment Scheduling: Activates when the client replies on WhatsApp, allowing an AI agent to chat with them to schedule a consultation.

Here's a detailed breakdown of the prerequisites and each node.

Prerequisites

Before building this workflow, you'll need accounts and some setup for each of the following services:

JotForm
- JotForm Account: You need an active JotForm account.
- A Published Form: Create a form with the exact fields used in the workflow: Full Name, Email Address, Phone Number, I am a..., Legal Service of Interest, Brief Message, and How Did You Hear About Us?.
- API Credentials: Generate API keys from your JotForm account settings to connect it with n8n.

Google
- Google Account: To use Google Sheets and Google Calendar.
- Google Sheet: Create a new sheet named "Law Client Enquiries". The first row must have these exact headers: Full Name, Email Address, Phone Number, client type, Legal Service of Interest, Brief Message, How Did You Hear About Us?.
- Google Calendar: An active calendar to manage appointments.
- Google Cloud Project:
  - Service Account Credentials (for Sheets): In the Google Cloud Console, create a service account, generate JSON key credentials, and enable the Google Sheets API. You must then share your Google Sheet with the service account's email address (e.g., automation-bot@your-project.iam.gserviceaccount.com).
  - OAuth Credentials (for Calendar): Create OAuth 2.0 Client ID credentials to allow n8n to access your calendar on your behalf. You'll need to enable the Google Calendar API.
  - Gemini API Key: Enable the Vertex AI API in your Google Cloud project and generate an API key to use the Google Gemini models.

WhatsApp
- Meta Business Account: Required to use the WhatsApp Business Platform.
- WhatsApp Business Platform Account: Set up a business account and connect a phone number to it. This is different from the regular WhatsApp or WhatsApp Business app.
- API Credentials: Get the necessary access tokens and IDs from your Meta for Developers dashboard to connect your business number to n8n.

PostgreSQL Database
- A running PostgreSQL instance: This can be hosted anywhere (e.g., AWS, DigitalOcean, Supabase). The AI agent needs it to store and retrieve conversation history.
- Database Credentials: You'll need the host, port, user, password, and database name to connect n8n to it.

Node-by-Node Explanation

The workflow is divided into two distinct logical flows.

Flow 1: New Client Intake from JotForm

This part triggers when a new client submits your form.

JotForm Trigger
- What it does: This is the starting point. It automatically runs the workflow whenever a new submission is received for the specified JotForm (Form ID: 252801824783057).
- Prerequisites: A JotForm account and a created form.

Append or update row in sheet (Google Sheets)
- What it does: Takes the data from the JotForm submission and adds it to your "Law Client Enquiries" Google Sheet.
- How it works: It uses the appendOrUpdate operation. It tries to find a row where the "Email Address" column matches the email from the form. If it finds a match, it updates that row; otherwise, it appends a new row at the bottom.
- Prerequisites: A Google Sheet with the correct headers, shared with your service account.

AI Agent
- What it does: Crafts the initial welcome message to be sent to the client.
- How it works: It uses a detailed prompt that defines a persona ("Alex," a legal intake assistant) and instructs the AI to generate a professional WhatsApp message. It dynamically inserts the client's name and service of interest from the Google Sheet data into the prompt.
- Connected Node: Powered by the Google Gemini Chat Model.

Send message (WhatsApp)
- What it does: Sends the message generated by the AI Agent to the client.
- How it works: It takes the client's phone number from the data (Phone Number column) and the AI-generated text (output from the AI Agent node) to send the message via the WhatsApp Business API.
- Prerequisites: A configured WhatsApp Business Platform account.

---

Flow 2: AI-Powered Scheduling via WhatsApp

This part triggers when the client replies to the initial message.

WhatsApp Trigger
- What it does: Listens for incoming messages on your business's WhatsApp number. When a client replies, it starts this part of the workflow.
- Prerequisites: A configured WhatsApp Business Platform account.

If node
- What it does: Acts as a simple filter. It checks whether the incoming message text is empty. If it is (e.g., a status update), the workflow stops. If it contains text, it proceeds to the AI agent (see the sketch after this breakdown).

AI Agent1
- What it does: This is the main conversational brain for scheduling. It handles the back-and-forth chat with the client.
- How it works: Its prompt is highly detailed, instructing it to act as "Alex" and follow a strict procedure for scheduling. It has access to several "tools" to perform actions.
- Connected Nodes:
  - Google Gemini Chat Model1: The language model that does the thinking.
  - Postgres Chat Memory: Remembers the conversation history with a specific user (keyed by their WhatsApp ID), so the user doesn't have to repeat themselves.
  - Tools: Know about the user enquiry, GET MANY EVENTS..., and Create an event.

AI Agent Tools (What the AI can do)
- Know about the user enquiry (Google Sheets Tool): When the AI needs to know who it's talking to, it uses this tool. It takes the user's phone number and looks up their original enquiry details in the "Law Client Enquiries" sheet.
- GET MANY EVENTS... (Google Calendar Tool): When a client suggests a date, the AI uses this tool to check your Google Calendar for any existing events on that day to see if you're free.
- Create an event (Google Calendar Tool): Once a time is agreed upon, the AI uses this tool to create the event in your Google Calendar, adding the client as an attendee.

Send message1 (WhatsApp)
- What it does: Sends the AI's response back to the client. This could be a confirmation that the meeting is booked, a question asking for their email, or a suggestion for a different time if the requested slot is busy.
- How it works: It sends the output text from AI Agent1 to the client's WhatsApp ID, continuing the conversation.
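
The template's If node handles the empty-message check natively; purely for reference, a hypothetical Code-node equivalent might look like the sketch below. The `messages[0].text.body` path follows Meta's standard webhook payload shape and is an assumption about what the trigger emits.

```javascript
// Hypothetical Code-node equivalent of the "If node" filter: pass an
// item through only when the inbound WhatsApp event actually carries
// message text (delivery/status updates carry none).
const out = [];

for (const item of $input.all()) {
  const text = item.json?.messages?.[0]?.text?.body; // assumed payload path
  if (typeof text === 'string' && text.trim() !== '') {
    out.push(item); // real user message -> continue to AI Agent1
  }
}

return out; // an empty array ends this branch of the workflow
```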

By iamvaar

Posting from WordPress to Medium

Usage

This workflow gets all the posts from your WordPress site and sorts them into a clear format before publishing them to Medium.

Step 1. Set up the HTTP Request node and set the URL of the source. This will be the URL of the blog you want to use; we'll use https://mailsafi.com/blog here.
Step 2. Extract the URLs of all the blogs on the page. This gets all the blog titles and their URLs. It's an easy way to sort out which blogs to share and which not to share.
Step 3. Split the entries for easy sorting or a cleaner view.
Step 4. Set up a new HTTP Request node with all the blog URLs from the previous steps.
Step 5. Extract the contents of each blog post.
Step 6. Add the Medium node and set the contents that you want to be shared.

Execute your workflow and you are good to go. (A sketch of the URL-extraction step follows below.)
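
As a rough sketch of Step 2, a Code node could pull post titles and URLs out of the fetched index page with a regex over anchor tags; the `data` field name and the `/blog/` filter are assumptions about this particular site:

```javascript
// Hypothetical sketch of Step 2: extract blog-post links from the
// fetched index HTML. Assumes the HTTP Request node returned the page
// HTML in the item's `data` field.
const html = $input.first().json.data || '';
const posts = [];

// match <a href="...">title</a> pairs; crude, but enough for a sketch
const linkRe = /<a[^>]+href="([^"]+)"[^>]*>([^<]+)<\/a>/gi;
let m;
while ((m = linkRe.exec(html)) !== null) {
  const [, url, title] = m;
  // keep only links that look like blog posts on the same site
  if (url.includes('/blog/')) {
    posts.push({ json: { url, title: title.trim() } });
  }
}

return posts;
```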

By Zacharia Kimotho

Crawl websites & answer questions with GPT-5 nano and Google Sheets

Web Consultation & Crawling Chatbot with Google Sheets Memory

Who is this workflow for?

This workflow is designed for SEO analysts, content creators, marketing agencies, and developers who need to index a website and then interact with its content as if it were a chatbot.

⚠ Note: if the site contains many pages, AI token consumption can generate high costs, especially during the initial crawling and analysis phase.

---

Initial Mode (first use with a URL)

When the user enters a URL for the first time:

1. URL validation using AI (gpt-5-nano).
2. Automatic sitemap discovery via robots.txt (see the sketch below).
3. Relevant sitemap selection (pages, posts, categories, or tags) using GPT-4o according to configured options. (Includes an "OPTIONS" node to precisely choose which types of URLs to process.)
4. Crawling of all selected pages:
   - Downloads the HTML of each page.
   - Converts HTML to Markdown.
   - AI analysis to extract: detected language, heading hierarchy (H1, H2, etc.), internal and external links, and a content summary.
5. Structured storage in Google Sheets:
   - Lang
   - H1 and hierarchy
   - External URLs
   - Internal URLs
   - Summary
   - Content
   - Data schema (flag to enable agent mode)

When finished, the sheet is marked with Data schema = true, signaling that the site is indexed.

---

Agent Mode (subsequent queries)

If the URL has already been indexed (Data schema = true), the chat becomes a LangChain Agent that:

- Reads the database in Google Sheets.
- Can perform real-time HTTP requests if it needs updated information.
- Responds as if it were the website, using stored and live data.

This allows the user to ask questions such as:

- "What's on the contact page?"
- "How many external links are there on the homepage?"
- "Give me all the H1 headings from the services pages"
- "What CTA would you suggest for my page?"
- "How would you expand X content?"

---

Use cases

- Build a chatbot that answers questions about a website's content.
- Index and analyze full websites for future queries.
- SEO tool to list headings, links, and content summaries.
- Assistant for quick exploration of a site's structure.
- Generate improvement recommendations and content strategies from site data.
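
The sitemap-discovery step boils down to reading the `Sitemap:` directives out of robots.txt. A minimal Code-node sketch, assuming a prior HTTP Request node fetched `<site>/robots.txt` into a `data` field (field names are assumptions):

```javascript
// Collect every "Sitemap:" directive declared in the fetched robots.txt.
const robotsTxt = $input.first().json.data || '';

const sitemaps = robotsTxt
  .split('\n')
  .filter((line) => /^sitemap:/i.test(line.trim()))
  .map((line) => line.trim().slice('sitemap:'.length).trim());

// one item per discovered sitemap URL, ready for the selection step
return sitemaps.map((url) => ({ json: { sitemapUrl: url } }));
```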

By Oriol Seguí

Daily weather reports with OpenWeatherMap and Telegram Bot

Get automated weather updates delivered directly to your Telegram chat at scheduled intervals. This workflow fetches current weather data from OpenWeatherMap and sends formatted weather reports via a Telegram bot.

Use Cases
- Daily morning weather briefings
- Regular weather monitoring for outdoor activities
- Automated weather alerts for specific locations
- Personal weather assistant for travel planning

Prerequisites

Before setting up this workflow, ensure you have:
- An OpenWeatherMap API account (free tier available)
- A Telegram bot token
- Your Telegram chat ID
- An n8n instance (cloud or self-hosted)

Setup Instructions

Step 1: Create an OpenWeatherMap Account
1. Go to OpenWeatherMap and sign up for a free account
2. Navigate to the API keys section in your account
3. Copy your API key (you'll need this for the workflow configuration)

Step 2: Create a Telegram Bot
1. Open Telegram and search for @BotFather
2. Start a chat and use the /newbot command
3. Follow the prompts to create your bot and get the bot token
4. Save the bot token securely

Step 3: Get Your Telegram Chat ID
1. Start a conversation with your newly created bot
2. Send any message to the bot
3. Visit https://api.telegram.org/bot<YourBOTToken>/getUpdates in your browser
4. Look for your chat ID in the response (it will be a number like 123456789)

Step 4: Configure the Workflow
1. Import this workflow into your n8n instance
2. Configure each node with your credentials:

Schedule Trigger Node
- Set your preferred schedule (default: daily at 8:00 AM)
- Use cron expression format (e.g., 0 8 * * * for 8 AM daily)

Get Weather Node
- Add your OpenWeatherMap credentials
- Update the cityName parameter to your desired location
- Format: "CityName,CountryCode" (e.g., "London,UK")

Send a text message Node
- Add your Telegram bot credentials (bot token)
- Replace XXXXXXX in the chatId field with your actual chat ID

Customization Options

Location Settings
In the "Get Weather" node, modify the cityName parameter to change the location. You can specify:
- City name only: "Paris"
- City with country: "Paris,FR"
- City with state and country: "Miami,FL,US"

Schedule Frequency
In the "Schedule Trigger" node, adjust the cron expression:
- Every 6 hours: 0 */6 * * *
- Twice daily (8 AM & 6 PM): 0 8,18 * * *
- Weekly on Mondays at 9 AM: 0 9 * * 1

Message Format
In the "Format Weather" node, you can customize the message template by modifying the message variable in the function code (a sketch follows below). The current format includes:
- Current temperature with "feels like" temperature
- Min/max temperatures for the day
- Weather description and precipitation
- Wind speed and direction
- Cloud coverage percentage
- Sunrise and sunset times

Language Support
In the "Get Weather" node, change the language parameter to get weather descriptions in different languages:
- English: "en"
- Spanish: "es"
- French: "fr"
- German: "de"
- Polish: "pl"

Troubleshooting

Common Issues

Weather data not updating:
- Verify your OpenWeatherMap API key is valid and active
- Check if you've exceeded your API rate limits
- Ensure the city name format is correct

Messages not being sent:
- Confirm your Telegram bot token is correct
- Verify the chat ID is accurate (it should be a number, not a username)
- Make sure you've started a conversation with your bot

Workflow not triggering:
- Check if the workflow is activated (the toggle switch should be ON)
- Verify the cron expression syntax is correct
- Ensure your n8n instance is running continuously

Testing the Workflow
- Use the "Test workflow" button to run manually
- Check each node's output for errors
- Verify the final message format in Telegram

Node Descriptions

Schedule Trigger
Automatically starts the workflow based on a cron schedule. Runs at specified intervals to fetch fresh weather data.

Get Weather
Connects to the OpenWeatherMap API to retrieve current weather conditions for the specified location.

Format Weather
Processes the raw weather data and creates a user-friendly message with emojis and organized information.

Send a text message
Delivers the formatted weather report to your Telegram chat using the configured bot.

Additional Features

You can extend this workflow by:
- Adding weather alerts for specific conditions (temperature thresholds, rain warnings)
- Including weather forecasts for multiple days
- Sending reports to multiple chat recipients
- Adding location-based emoji selection
- Integrating with other notification channels (email, Slack, Discord)

Security Notes
- Keep your API keys and bot tokens secure
- Don't share your chat ID publicly
- Consider using n8n's credential system for storing sensitive information
- Regularly rotate your API keys for better security

Special thanks to Arkadiusz, the only person who supports me in the n8n mission to make automation great again.
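
The template's own function code isn't reproduced in this listing; as a minimal sketch of what the "Format Weather" Code node could look like, assuming the standard OpenWeatherMap current-weather payload arrives from the previous node:

```javascript
// Build a Telegram-ready message from OpenWeatherMap's current-weather
// JSON (main, weather, wind, clouds, sys are standard response fields).
const w = $input.first().json;

const toTime = (unix) =>
  new Date(unix * 1000).toLocaleTimeString('en-GB', { hour: '2-digit', minute: '2-digit' });

const message =
  `🌤 Weather for ${w.name}\n` +
  `🌡 ${w.main.temp}°C (feels like ${w.main.feels_like}°C)\n` +
  `📉 Min ${w.main.temp_min}°C / 📈 Max ${w.main.temp_max}°C\n` +
  `☁️ ${w.weather[0].description}, clouds ${w.clouds.all}%\n` +
  `💨 Wind ${w.wind.speed} m/s at ${w.wind.deg}°\n` +
  `🌅 Sunrise ${toTime(w.sys.sunrise)} / 🌇 Sunset ${toTime(w.sys.sunset)}`;

return [{ json: { message } }];
```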

By Dariusz Koryto

Web research assistant: automated search & scraping with Gemini AI and spreadsheet reports

⚠️ IMPORTANT: This template requires self-hosted n8n due to the use of community nodes (MCP tools). It will not work on n8n Cloud. Make sure you have access to a self-hosted n8n instance before using this template.

Overview

This workflow lets a Google Gemini-powered AI Agent orchestrate multi-source web intelligence using MCP (Model Context Protocol) tools such as Firecrawl, Brave Search, and Apify. Users interact with the agent in natural language; the agent then leverages external data-collection tools, processes the results, and automatically organizes them into structured spreadsheets. With built-in memory, flexible tool execution, and conversational capabilities, this workflow acts as a multi-agent research assistant, capable of retrieving, synthesizing, and delivering actionable insights in real time.

How the system works

AI Agent + MCP Pipeline

1. User Interaction: A chat message is received and forwarded to the AI Agent.
2. AI Orchestration: The agent, powered by Google Gemini, decides which MCP tools to invoke based on the query:
   - Firecrawl-MCP: Recursive web crawling and content extraction.
   - Brave-MCP: Real-time web search with structured results.
   - Apify-MCP: Automation of web scraping tasks with scalable execution.
3. Memory Management: A memory module stores context across conversations, ensuring multi-turn reasoning and task continuity.
4. Spreadsheet Automation: Results are structured in a new, automatically created Google Spreadsheet, enriched with formatting and additional metadata.
5. Data Processing: The workflow generates the spreadsheet content, updates the sheet, and improves results via HTTP requests and field edits.
6. Delivery of Results: Users receive a structured and contextualized dataset ready for review, analysis, or integration into other systems.

Configuration instructions

Estimated setup time: 45 minutes

Prerequisites
- Self-hosted n8n instance (v0.200.0 or higher recommended)
- Google Gemini API key
- MCP-compatible nodes (Firecrawl, Brave, Apify) configured
- Google Sheets credentials for spreadsheet automation

Detailed configuration steps

Step 1: Configuring the AI Agent
- AI Agent node: Select Google Gemini as the LLM model
- Configure your Google Gemini API key in the n8n credentials
- Set the system prompt to guide the agent's behavior
- Connect the Simple Memory node to enable context tracking

Step 2: Integrating MCP Tools

Firecrawl-MCP configuration:
- Install the @n8n/n8n-nodes-firecrawl-mcp package
- Configure your Firecrawl API key
- Set crawling parameters (depth, CSS selectors)

Brave-MCP configuration:
- Install the @n8n/n8n-nodes-brave-mcp package
- Add your Brave Search API key
- Configure search filters (region, language, SafeSearch)

Apify-MCP configuration:
- Install the @n8n/n8n-nodes-apify-mcp package
- Configure your Apify credentials
- Select the appropriate actors for your use cases

Step 3: Spreadsheet automation

"Create Spreadsheet" node:
- Configure Google Sheets authentication (OAuth2 or Service Account)
- Set the file name with dynamic timestamps
- Specify the destination folder in Google Drive

"Generate Spreadsheet Content" node (see the sketch below):
- Transform the agent's outputs into tabular format
- Define the columns: URL, Title, Description, Source, Timestamp
- Configure data formatting (dates, links, metadata)

"Update Spreadsheet" node:
- Insert the data into the created sheet
- Apply automatic formatting (headers, colors, column widths)
- Add summary formulas if necessary

Step 4: Post-processing and delivery

"Data Enrichment Request" node (formerly "HTTP Request1"):
- Configure optional API calls to enrich the data
- Add additional metadata (geolocation, sentiment, categorization)
- Manage errors and timeouts

"Edit Fields" node:
- Refine the final dataset (metadata, tags, filters)
- Clean and normalize the data
- Prepare the final response for the user

Structure of generated Google Sheets

Default columns

| Column      | Description                           | Type      |
| ----------- | ------------------------------------- | --------- |
| URL         | Data source URL                       | Hyperlink |
| Title       | Page/resource title                   | Text      |
| Description | Description or content excerpt        | Long text |
| Source      | MCP tool used (Brave/Firecrawl/Apify) | Text      |
| Timestamp   | Date/time of collection               | Date/Time |
| Metadata    | Additional data (JSON)                | Text      |

Automatic formatting
- Headings: Bold font, colored background
- URLs: Formatted as clickable links
- Dates: Standardized ISO 8601 format
- Columns: Width automatically adjusted to content

Use cases

Business and enterprise
- Competitive analysis combining search, crawling, and structured scraping
- Market trend research with multi-source aggregation
- Automated reporting pipelines for business intelligence

Research and academia
- Literature discovery across multiple sources
- Data collection for research projects
- Automated bibliographic extraction from online sources

Engineering and development
- Discovery of APIs and documentation
- Aggregation of product information from multiple platforms
- Scalable structured scraping for datasets

Personal productivity
- Automated creation of newsletters or knowledge hubs
- Personal research assistant compiling spreadsheets from various online data

Key features

Multi-source intelligence
- Firecrawl for deep crawling
- Brave for real-time search
- Apify for structured web scraping

AI-driven orchestration
- Google Gemini for reasoning and tool selection
- Memory for multi-turn interactions
- Context-based adaptive workflows

Structured data output
- Automatic spreadsheet creation
- Data enrichment and formatting
- Ready-to-use datasets for reporting

Performance and scalability
- Handles multiple simultaneous tool calls
- Scalable web data extraction
- Real-time aggregation from multiple MCPs

Security and privacy
- Secure authentication based on API keys
- Data managed in Google Sheets / n8n
- Configurable retention and deletion policies

Technical architecture

Workflow: User query → AI agent (Gemini) → MCP tools (Firecrawl / Brave / Apify) → Aggregated results → Spreadsheet creation → Data processing → Results delivery

Supported data types
- Text and metadata from crawled web pages
- Search results from Brave queries
- Structured data from Apify scrapers
- Tabular reports via Google Sheets

Integration options

Chat interfaces
- Web widget for conversational queries
- Slack/Teams chatbot integration
- REST API access points

Data sources
- Websites (via Firecrawl/Apify)
- Search engines (via Brave)
- APIs (via HTTP Request enrichment)

Performance specifications
- Query response: < 5 seconds (search tasks)
- Crawl capacity: Thousands of pages per run
- Spreadsheet automation: Real-time creation and updates
- Accuracy: > 90% when using combined sources

Advanced configuration options

Customization
- Set custom prompts for the AI Agent
- Adjust the spreadsheet schema for reporting needs
- Configure retries for failed tool runs

Analytics and monitoring
- Track tool usage and costs
- Monitor crawl and search success rates
- Log queries and outputs for auditing

Troubleshooting and support
- Timeouts: Manually re-run failed MCP executions
- Data gaps: Validate Firecrawl/Apify selectors
- Spreadsheet errors: Check Google Sheets API quotas
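
As a rough sketch of the "Generate Spreadsheet Content" step, a Code node could map tool results onto the default columns; the `results` field and its item shape are assumptions, not taken from the template:

```javascript
// Hypothetical mapping of agent/tool results onto the sheet's default
// columns (URL, Title, Description, Source, Timestamp, Metadata).
const results = $input.first().json.results || [];

return results.map((r) => ({
  json: {
    URL: r.url,
    Title: r.title,
    Description: (r.description || '').slice(0, 500), // keep cells readable
    Source: r.source, // "Brave" | "Firecrawl" | "Apify"
    Timestamp: new Date().toISOString(), // ISO 8601, matching the formatting rules
    Metadata: JSON.stringify(r.metadata ?? {}),
  },
}));
```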

By franck fambou

Automatically add travel time blockers to calendar using Google Directions API

Automatically add travel time blockers before appointments

This workflow automatically adds travel time blockers to your calendar, so you never arrive late to an appointment again.

How it works

1. Trigger: The workflow is initiated daily at 7 AM by a Schedule Trigger.
2. AI Agent: An AI Agent node orchestrates the main logic.
3. Fetch events: It uses the getcalendarevents tool to retrieve all events scheduled for the current day.
4. Identify events with location: It then filters these events to identify those that have a specified location.
5. Check for existing travel time blockers: For each event with a location, it checks whether a travel time blocker already exists. Events that do not have such a blocker are marked for processing.
6. Calculate travel time: Using the Google Directions API, it determines how long it takes to get to the location of the event (see the sketch below). The starting location defaults to your home address, unless there is a previous event within 2 hours before the event, in which case it uses the location of that previous event.
7. Create travel time blocker: Finally, it uses the createcalendarevent tool to create the travel time blocker with a duration equal to the calculated travel time plus a 10-minute buffer.

Set up steps

1. Set variables:
   - Home address
   - Blocker name
   - Mode of transportation
2. Connect your LLM provider
3. Connect your Google Calendar
4. Connect your Google Directions API
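
A minimal sketch of the travel-time lookup: the endpoint and response shape below are Google's documented Directions API, while the input field names, the env-var name, and the use of the Code node's `this.helpers.httpRequest` helper are assumptions.

```javascript
// Query the Google Directions API for the travel duration between two
// addresses and add the 10-minute buffer described above.
const { origin, destination, mode } = $input.first().json;

const url =
  'https://maps.googleapis.com/maps/api/directions/json' +
  `?origin=${encodeURIComponent(origin)}` +
  `&destination=${encodeURIComponent(destination)}` +
  `&mode=${mode || 'driving'}` +
  `&key=${$env.GOOGLE_MAPS_API_KEY}`; // assumed environment variable

const res = await this.helpers.httpRequest({ url, json: true });

// duration of the first leg of the first route, in seconds
const seconds = res.routes[0].legs[0].duration.value;
const bufferMinutes = 10;

return [{ json: { travelMinutes: Math.ceil(seconds / 60) + bufferMinutes } }];
```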

By Kevin Armbruster

Stock fundamental analysis & AI-powered reports with Mistral and AlphaVantage

Fundamental Analysis, Stock Analysis, and AI Integration in the Fundamental Analysis Tool

---

Overview of the Tool

The Fundamental Analysis Tool is an automated workflow designed to evaluate a stock's fundamentals using financial data and AI-driven insights. Built on the n8n automation platform, it:

1. Collects financial data for a user-specified stock from AlphaVantage.
2. Processes and structures this data for analysis.
3. Analyzes the data using the Mistral AI model to provide expert-level insights.
4. Generates a visually appealing HTML report with charts and delivers it via email.

The tool is triggered by a form where users input a stock symbol (e.g., "NVDA" for NVIDIA) and their email address. From there, it follows a three-stage process: data retrieval, data processing, and AI analysis with report generation.

---

Fundamental Analysis: The Foundation

Fundamental analysis involves evaluating a company's intrinsic value by examining its financial health, competitive position, and market environment. This tool performs fundamental analysis as follows:

Data Retrieval

Six types of data are retrieved via HTTP requests (a request sketch follows below):
- Overview: General company details (e.g., sector, industry, market cap).
- Income Statement: Revenue, net income, and profitability metrics.
- Balance Sheet: Assets, liabilities, and equity.
- Cash Flow: Operating, investing, and financing cash flows.
- Earnings Calendar: Upcoming earnings events.
- Earnings: Historical earnings data (annual and quarterly).

Key Metrics Analyzed

The tool structures this data into 8 categories critical to fundamental analysis, as defined in the "Code1" node:

1. Economic Moats & Competitive Advantage: Assesses sustainable advantages (e.g., R&D spending, gross profit).
2. Financial Health & Profitability: Examines ROE, debt levels, and dividend yield.
3. Valuation & Market Sentiment: Evaluates P/E ratio, PEG ratio, and book value.
4. Management & Capital Allocation: Reviews market cap justification and cash allocation (e.g., R&D, buybacks).
5. Industry & Risk Exposure: Analyzes revenue cyclicality and geopolitical risks.
6. Key Metrics to Probe: Investigates net income trends and gross margins.
7. Red Flags: Identifies risks like inventory issues or stock dilution.
8. Final Checklist: Summarizes pricing power and risk/reward potential.

These categories cover the core pillars of fundamental analysis, ensuring a holistic evaluation of the stock's intrinsic value and risks.

---

Stock Analysis: Tailored Insights

The tool performs stock-specific analysis by focusing on the user-provided stock symbol. Here's how it tailors the process:

Input and Customization
- Form Submission: Users enter a stock symbol (e.g., "NVDA") and email via the "On Form Submission" node.
- Dynamic Data Fetching: The "Set Variables" node passes the stock symbol to the API calls, ensuring the analysis is specific to the chosen stock.

Processing for Relevance
- Data Filtering: The workflow limits historical data to the last 5 years (via the "Limit" node), focusing on recent trends.
- Merging and Cleaning: The "Merge" and "Code2" nodes combine and refine the data, removing irrelevant fields (e.g., quarterly reports) and aggregating annual reports for consistency.

Output

The final report is titled with the stock's name (e.g., "Fundamental Analysis - NVIDIA"), ensuring the analysis is clearly tied to the user's chosen stock. This stock-specific approach makes the tool practical for investors analyzing individual companies rather than broad market trends.

---

AI Integration: Expert-Level Insights

The integration of AI (via the Mistral model or others) is what sets this tool apart, automating complex analysis and report generation. Here's how AI is woven into the workflow:

Data Preparation for AI
- Structuring: The "Code1" node organizes the raw data into a JSON schema aligned with the eight fundamental analysis categories. This structured data is fed into the AI for analysis.

AI Analysis
- Node: "Basic LLM Chain" uses the Mistral AI model.
- Prompt: The AI is instructed to act as an "expert financial advisor with 50 years of experience" and answer specific questions for each category, such as:
  - Economic Moats: "What sustainable competitive advantages protect the company's margins?"
  - Financial Health: "Is ROE driven by leverage or true profitability?"
  - Red Flags: "Are supply chain issues a concern?"
- Output: The AI generates a JSON response with detailed insights, e.g.:

```json
{
  "Economic Moats & Competitive Advantage": "NVIDIA's leadership in GPU technology and strong R&D investment...",
  "Financial Health & Profitability": "ROE of 25% is exceptional, driven by profitability rather than leverage...",
  ...
}
```

- Validation: An "Auto-fixing Output Parser" ensures the output adheres to the expected JSON schema, retrying if necessary.

Report Enhancement
- HTML Generation: The "HTML" node creates an initial report with placeholders for the AI's insights and Google Charts for visualizations (e.g., ROE trends, revenue growth).
- AI-Driven Refinement: The "Basic LLM Chain1" node uses Mistral again to enhance the HTML, adding:
  - Styled tables (e.g., financial ratios).
  - Charts (e.g., bar charts for valuation, line charts for revenue).
  - Visual indicators (e.g., ✅ for positive trends, ⚠️ for risks).
  - Mobile-responsive design with modern fonts (Inter or Roboto).

This dual AI approach, one pass for analysis and one for presentation, ensures the output is both insightful and user-friendly.

---

Strengths and Limitations

Strengths
- Comprehensive: Covers all key aspects of fundamental analysis.
- AI-Powered: Automates expert-level insights and report design.
- User-Friendly: Delivers an interactive, visual report via email.

Limitations
- Data Dependency: Relies on public data, so data quality and timeliness matter.
- AI Constraints: Insights depend on the AI's capabilities; it may miss nuanced human judgment.
- Disclaimer: The tool notes it's not investment advice, so users should consult advisors.

---
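
A minimal sketch of the data-retrieval stage, assuming the Code node's `this.helpers.httpRequest` helper is available. The endpoint and function names are AlphaVantage's documented API; the `symbol` field and env-var name are assumptions.

```javascript
// One AlphaVantage call per data type for the submitted stock symbol.
// (EARNINGS_CALENDAR returns CSV rather than JSON, so it is omitted
// here and would need separate handling.)
const symbol = $input.first().json.symbol; // e.g. "NVDA"
const functions = ['OVERVIEW', 'INCOME_STATEMENT', 'BALANCE_SHEET', 'CASH_FLOW', 'EARNINGS'];

const out = [];
for (const fn of functions) {
  const url =
    `https://www.alphavantage.co/query?function=${fn}` +
    `&symbol=${encodeURIComponent(symbol)}` +
    `&apikey=${$env.ALPHAVANTAGE_API_KEY}`; // assumed environment variable
  const data = await this.helpers.httpRequest({ url, json: true });
  out.push({ json: { type: fn, data } });
}

return out; // one item per data type, ready for the "Code1" structuring step
```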

By Sebastian/OptiLever

GitHub workflow version control dashboard with commit history and rollbacks

This n8n template provides enterprise-level version control for your workflows using GitHub integration. Stop losing hours to broken workflows and manual exports: get proper commit history, visual diffs, and one-click rollbacks. This is the first n8n template that provides real version control with commit-level granularity. Perfect for power users and tech teams managing multiple complex workflows.

Note: the animation demonstrates an early version and may differ from the latest one.

How it works
- Automated sync: Workflows are automatically synced to GitHub on your preferred schedule (a commit sketch follows below)
- Smart categorization: The dashboard shows which workflows are synced, n8n-only, or GitHub-only
- Complete commit history: View every change with timestamps, authors, and commit messages
- Flexible import: Import workflows from GitHub as new workflows or replace existing ones
- Individual workflow control: Sync specific workflows with custom commit messages

How to use
1. Import the workflow template into your n8n instance
2. Generate your n8n API key from your instance settings
3. Configure your GitHub credentials and repository settings
4. Set up the webhook endpoint for the dashboard interface
5. Access the dashboard via the generated webhook URL
6. Configure the automatic sync schedule or use manual sync options

Requirements
- n8n instance (cloud or self-hosted) with API access
- GitHub account and repository
- Basic understanding of Git workflows

Customizing this workflow
- DIY, or available as part of consulting services
- Professional customization for specific deployment workflows and integrations
- Custom sync schedules and commit message formats can be configured during setup
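
For a sense of what one sync step involves: committing a workflow's JSON boils down to a call against GitHub's documented contents endpoint (PUT /repos/{owner}/{repo}/contents/{path}). In the sketch below, the repository name, path scheme, input fields, and env-var are all assumptions, not taken from the template.

```javascript
// Hypothetical single-workflow sync: write the workflow JSON to the
// repo, creating or updating the file. `sha` is required by GitHub
// when updating an existing file.
const { workflowName, workflowJson, existingSha } = $input.first().json;
const path = `workflows/${encodeURIComponent(workflowName)}.json`;

const body = {
  message: `chore: sync workflow "${workflowName}"`,
  content: Buffer.from(JSON.stringify(workflowJson, null, 2)).toString('base64'),
};
if (existingSha) body.sha = existingSha;

const res = await this.helpers.httpRequest({
  method: 'PUT',
  url: `https://api.github.com/repos/your-org/n8n-backups/contents/${path}`, // assumed repo
  headers: {
    Authorization: `Bearer ${$env.GITHUB_TOKEN}`, // assumed environment variable
    Accept: 'application/vnd.github+json',
  },
  body,
  json: true,
});

return [{ json: { commitSha: res.commit.sha } }];
```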

By Eduard