
Automated stock trading with AI: integrating Alpaca and Google Sheets

Description
Transform your investment strategy with a fully automated, AI-driven trading bot. This workflow bridges the gap between AI-powered market insights and real-world trading by executing buy and sell orders directly through the Alpaca paper trading API. Designed to work in tandem with the Automated Stock Sentiment Analysis workflow, this solution takes the top-performing stocks based on daily news sentiment and automatically rebalances your portfolio. It's perfect for algorithmic traders, data-driven investors, and n8n enthusiasts who want to see their AI analysis translate into tangible actions, all while maintaining a comprehensive log of every transaction in Google Sheets.

Key Features & Benefits
- Automated Trading Execution: Automatically places buy and sell orders on the Alpaca paper trading platform without manual intervention.
- Sentiment-Driven Decisions: Leverages the output from the sentiment analysis workflow to make informed decisions, selling positions with waning sentiment and buying into those with high positive sentiment.
- Dynamic Portfolio Rebalancing: Intelligently calculates which positions to close and how to allocate the resulting funds into new, high-potential stocks.
- Paper Trading Ready: Safely test and refine your trading strategies in a risk-free environment using Alpaca's paper trading API.
- Daily Performance Tracking: Automatically logs your account equity and daily percentage change to a Google Sheet, giving you a clear view of your portfolio's performance.
- Detailed Trade Logging: Every buy and sell order is meticulously recorded in a Google Sheet for easy review and historical analysis.
- Scheduled and Autonomous: The entire process runs on a daily schedule, making it a "set and forget" solution for systematic trading.

How It Works
This workflow executes a sophisticated, automated trading strategy in a few key stages:
1. Daily Kick-off & Snapshot: The workflow triggers on a daily schedule, first fetching your current Alpaca account balance and logging it to a Google Sheet to track daily performance.
2. Strategy Formulation: It then reads the daily sentiment scores produced by the accompanying "Stock Sentiment Analysis" workflow. A Code node filters these results to identify the four stocks with the highest positive sentiment.
3. The Decision Engine: The core of the workflow is a custom Code node that acts as the trading brain (a sketch of this node follows the node list below). It:
   - Retrieves your currently open positions from Alpaca.
   - Compares your holdings against the day's top four sentiment stocks.
   - Generates a "sell list" of positions you hold that are no longer in the top four.
   - Generates a "buy list" of top-sentiment stocks that you don't yet own.
   - Calculates the total cash value from the "sell list" and determines the exact notional value to invest in each stock on the "buy list."
4. Trade Execution: The workflow first iterates through the "sell list" and executes a DELETE request to Alpaca for each, closing the positions. A Wait node pauses the workflow for two minutes to ensure the sell orders are filled and the account balance is updated. It then iterates through the "buy list," executing POST requests to Alpaca to purchase the new assets with the calculated funds.
5. Record Keeping: All executed orders (both buys and sells) are merged and logged in a dedicated Google Sheet, giving you a permanent and detailed transaction history.
Nodes Used
Schedule Trigger, HTTP Request (Alpaca API), Google Sheets, Code (JavaScript), Split Out, Wait, Merge

This workflow is the perfect next step for anyone looking to take their AI analysis to the next level. Take the emotion out of your trading and let this bot systematically execute your data-driven strategy.
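To make the decision engine concrete, here is a minimal, hypothetical sketch of that Code node. The Alpaca position fields (`symbol`, `market_value`) are real Alpaca API fields, but the upstream node names, the sentiment item shape, and the even-split allocation are assumptions for illustration, not the template's exact code.

```javascript
// Sketch of the "decision engine" Code node (node names and input shapes are assumptions).
// Inputs: positions from Alpaca GET /v2/positions, topStocks = top-4 sentiment symbols.
const positions = $('Get Positions').all().map(i => i.json);        // e.g. [{ symbol, market_value, ... }]
const topStocks = $('Top Sentiment').all().map(i => i.json.symbol); // e.g. ['AAPL', 'MSFT', ...]

const held = positions.map(p => p.symbol);

// Sell every holding that is no longer in the top four.
const sellList = positions.filter(p => !topStocks.includes(p.symbol));

// Buy every top-four stock not already owned.
const buyList = topStocks.filter(s => !held.includes(s));

// Split the cash freed by the sells evenly across the buys (notional order size).
const freedCash = sellList.reduce((sum, p) => sum + Number(p.market_value), 0);
const notionalPerBuy = buyList.length ? (freedCash / buyList.length).toFixed(2) : '0';

return [{
  json: {
    sell: sellList.map(p => p.symbol),
    buy: buyList.map(s => ({ symbol: s, notional: notionalPerBuy })),
  },
}];
```

Downstream, the sell branch would close each position (Alpaca exposes DELETE /v2/positions/{symbol}) and the buy branch would place notional orders via POST /v2/orders, matching the Trade Execution stage described above.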

By Raz Hadas
22531

From Google Drive to Instagram, TikTok & YouTube with AI descriptions & Airtable tracking

Description
This automation template is designed for content creators, social media managers, and influencers who want to streamline their video publishing workflow. It automatically detects new videos uploaded to a specific Google Drive folder, generates AI-powered descriptions based on the video's audio content, and simultaneously publishes them across Instagram, TikTok, and YouTube while tracking everything in Airtable.

Note: This workflow uses the upload-post.com API (free trial, no credit card required) for multi-platform video distribution and requires API tokens for each service. The AI-generated descriptions are created using OpenAI's transcription and chat models to analyze video audio content.

Who Is This For?
- Content Creators & Influencers: Automatically publish your videos across all major social platforms without manual work.
- Social Media Managers: Maintain consistent posting schedules across multiple platforms with AI-generated, platform-optimized descriptions.
- Marketing Teams: Scale video content distribution with automated workflows that include tracking and status monitoring.
- Video Producers: Focus on creating content while the system handles the tedious task of multi-platform publishing and description generation.

What Problem Does This Workflow Solve?
Publishing the same video content across Instagram, TikTok, and YouTube is time-consuming and repetitive: you need to manually upload each video, write unique descriptions, and track publication status. This workflow addresses these challenges by:
- Automated Video Distribution: Detects new videos in Google Drive and automatically uploads them to all three platforms simultaneously.
- AI-Powered Content Generation: Uses OpenAI to transcribe video audio and generate engaging, platform-appropriate descriptions automatically.
- Centralized Tracking: Maintains detailed records in Airtable including upload status, URLs, and metadata for each platform.
- Error Monitoring: Provides real-time error notifications via Telegram so you're always aware of any issues.

How It Works
1. Video Upload Detection: The workflow monitors a specific Google Drive folder for new video uploads using automated triggers.
2. Content Analysis: Downloads the video, extracts the audio, and uses OpenAI to transcribe it and generate compelling descriptions.
3. Airtable Integration: Creates and updates records to track video metadata, descriptions, and publication status.
4. Multi-Platform Publishing: Simultaneously uploads the video to Instagram, TikTok, and YouTube using the upload-post.com API.
5. Status Tracking: Updates Airtable records with publication status and platform-specific URLs for each successful upload.
Setup
1. Google Drive Configuration: Set up the Google Drive trigger to monitor your specific folder and configure OAuth2 credentials for Google Drive access.
2. OpenAI Integration: Add your OpenAI API key to enable audio transcription and description generation.
3. Airtable Setup: Create an Airtable base with fields for Video Name, Description, Platform Status, URLs, and Upload Date. Add your Airtable API token and configure the base/table IDs in the "Set Variables" node (a sketch of how upload results can map onto these fields follows the requirements below).
4. Upload-Post.com Account: Create an account at upload-post.com to get your API token, configure the token in the HTTP Request nodes for each platform, and set your user ID in the variables section.
5. Platform Accounts: Ensure your Instagram, TikTok, and YouTube accounts are connected to upload-post.com.
6. Error Notifications (optional): Configure Telegram bot credentials for error notifications.

Requirements
- Accounts: Google Drive, OpenAI, Airtable, upload-post.com, Telegram (optional)
- API Keys & Credentials: Google Drive OAuth2, OpenAI API Key, Airtable API Token, upload-post.com API Token
- Platform Setup: Instagram, TikTok, and YouTube accounts connected to upload-post.com

Transform your video publishing workflow from hours of manual work to a fully automated system that handles everything from content analysis to multi-platform distribution and tracking.
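As a rough illustration of the status-tracking step, the sketch below shows how a Code node might shape the Airtable record update. The field names come from the Setup section above; the shape of the per-platform upload results and the referenced node names are assumptions, not the upload-post.com schema.

```javascript
// Hypothetical sketch: map platform upload results onto the Airtable fields named in Setup.
// The input shape (platform/success/url) and node names are illustrative assumptions.
const uploads = $input.all().map(i => i.json); // e.g. [{ platform: 'youtube', success: true, url: '...' }, ...]

const urls = uploads.filter(u => u.success).map(u => `${u.platform}: ${u.url}`).join('\n');
const status = uploads.every(u => u.success) ? 'Published' : 'Partial / Failed';

return [{
  json: {
    fields: {
      'Video Name': $('Set Variables').first().json.videoName,       // assumed variable name
      'Description': $('Generate Description').first().json.text,    // assumed node name
      'Platform Status': status,
      'URLs': urls,
      'Upload Date': new Date().toISOString(),
    },
  },
}];
```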

By Juan Carlos Cavero Gracia
14590

Monitor competitors' websites for changes with OpenAI and Firecrawl

Who is this template for?
This workflow template is designed for people who want alerts when specific changes are made to any web page. Leveraging agentic AI, it analyzes the page every day and autonomously decides whether to send you an e-mail notification.

Example use cases
- Track price changes on [competitor's website]. Notify me when the price drops below €50.
- Monitor new blog posts on [industry leader's website] and summarize key insights.
- Check [competitor's job page] for new job postings related to software development.
- Watch for new product launches on [e-commerce site] and send me a summary.
- Detect any changes in the terms and conditions of [specific website].
- Track customer reviews for [specific product] on [review site] and extract key themes.

How it works
When you click 'Test workflow' in the editor, a new browser tab opens where you can fill in the details of your espionage assignment. Be as concise as possible when instructing the AI: keep instructions specific and to the point (see the examples above).

After submission, the flow starts by extracting the relevant website URL and an optimized prompt. OpenAI's structured outputs feature is used, followed by a Code node that parses the results for further use (a sketch of this parsing step follows the set up steps). From there, the endless loop of daily checks begins:
1. Initial scrape
2. 1 day delay
3. Second scrape
4. AI agent decides whether or not to notify you
5. Back to step 1

You can cancel an espionage assignment at any time in the executions tab.

Set up steps
1. Insert your OpenAI API key in the structured outputs node (the second one).
2. Create a Firecrawl account and connect your Firecrawl API key in both 'Scrape page' nodes.
3. Connect your OpenAI account in the AI agent's model node.
4. Connect your Gmail account in the AI agent's Gmail tool node.
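The parsing Code node could look roughly like the sketch below. The response path mirrors the OpenAI chat completions format, and the schema field names (`website_url`, `optimized_prompt`) are assumptions for illustration; adjust them to whatever schema you define in the structured outputs node.

```javascript
// Sketch of the Code node that parses the structured-output response.
// Field names and the exact response path are assumptions; check your node's output.
const raw = $input.first().json;

// With structured outputs the message content is a JSON string matching your schema.
const content = raw.choices?.[0]?.message?.content ?? raw.message?.content ?? '{}';
const parsed = typeof content === 'string' ? JSON.parse(content) : content;

return [{
  json: {
    url: parsed.website_url,
    prompt: parsed.optimized_prompt,
  },
}];
```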

By Sleak
12444

Parse PDF with LlamaParse and save to Airtable

Video Guide
I prepared a comprehensive guide detailing how to automate the parsing of invoices using n8n and LlamaParse, seamlessly capturing and storing vital billing information. YouTube link: https://youtu.be/E4I0nru-fa8

Who is this for?
This workflow is ideal for finance teams, accountants, and business operations managers who need to streamline invoice processing. It is particularly helpful for organizations seeking to reduce manual entry errors and improve efficiency in managing billing information.

What problem does this workflow solve?
Manually processing invoices can be time-consuming and error-prone. This automation eliminates the need for manual data entry by capturing invoice details directly from uploaded documents and storing structured data efficiently. This enhances productivity and accuracy across financial operations.

What this workflow does
The workflow leverages n8n and LlamaParse to automatically detect new invoices in a designated Google Drive folder, parse essential billing details, and store the extracted data in a structured format. The key functionalities include:
- Real-time detection of new invoices via Google Drive triggers.
- Automated HTTP requests to initiate parsing through LlamaCloud.
- Structured storage of invoice details and line items in a database for future reference.

- Google Drive Integration: Monitors a specific folder in Google Drive for new invoice uploads.
- Parsing with LlamaParse: Automatically sends invoices for parsing and processes results through webhooks.
- Data Storage in Airtable: Creates records for invoices and their associated line items, allowing for detailed tracking.

Setup (n8n workflow)
1. Google Drive Trigger: Set up a trigger to detect new files in a specified folder dedicated to invoices.
2. File Upload to LlamaParse: Create an HTTP request that sends the invoice file to LlamaParse for parsing, including relevant header settings and the webhook URL.
3. Webhook Processing: Establish a Webhook node to handle parsed results from LlamaParse, extracting the needed invoice details.
4. Invoice Record Creation: Create initial records for invoices in your database using the parsed details received from the webhook.
5. Line Item Processing: Transform string data into structured line item arrays and create individual records for each item linked to the main invoice (see the sketch following this description).
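The line item step can be pictured as a small Code node like the one below. This is a hedged sketch: the webhook payload shape and the field names (`line_items`, `id`, and so on) are assumptions for illustration, not the actual LlamaParse schema.

```javascript
// Hypothetical sketch of the "Line Item Processing" Code node.
// Assumes the webhook delivered parsed invoice data with a `line_items` array
// (field names are illustrative; check them against your parsed output).
const invoice = $input.first().json;

const lineItems = Array.isArray(invoice.line_items)
  ? invoice.line_items
  : JSON.parse(invoice.line_items || '[]'); // handle the case where items arrive as a string

// Emit one item per line so a downstream Airtable node creates one record each,
// linked back to the parent invoice record.
return lineItems.map(item => ({
  json: {
    invoice_id: invoice.id,
    description: item.description,
    quantity: item.quantity,
    unit_price: item.unit_price,
    amount: item.amount,
  },
}));
```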

By Mark Shcherbakov
11523

Enrich FAQ sections on your website pages at scale with AI

This n8n workflow template lets you easily generate comprehensive FAQ (Frequently Asked Questions) content for multiple services (or any items or pages you need to add FAQs to). Simply provide the Google Sheets document containing the items to scrape, and the workflow automatically creates detailed, AI-enhanced FAQ documents.

How it works
- The workflow reads data from a Google Sheets document containing information about different services and categories (again, in your case, whatever objects you need).
- For each service and category, it generates a set of standard questions and answers covering setup, permissions, integrations, use cases, and pricing benefits.
- An AI model (OpenAI's GPT) is used to enhance or complete some of the answers, making the content more comprehensive and natural-sounding.
- The workflow formats the Q&A pairs, combining AI-generated content with predefined answers where applicable (a sketch of this step follows this description).
- It creates a text file (JSON) for each service or category, containing the formatted Q&A pairs.
- The generated files are saved to specific folders in Google Drive, organized by the type of integration (native, credential-only, non-native) or category.
- After processing each service or category, it updates the status in the original Google Sheets document to mark it as completed.

Ideal for
- Marketing teams: Rapidly create comprehensive FAQ documents for multiple products or services.
- Customer support: Generate consistent and detailed answers for common customer queries.
- Product managers: Easily maintain up-to-date documentation as products evolve.
- Content creators: Streamline the process of creating informative content about various offerings.

Accounts required
- Google account (for Google Sheets and Google Drive)
- OpenAI API account (for AI-enhanced content generation)
- n8n.io account (for workflow execution)

Set up instructions
1. Set up the required credentials for Google Sheets, Google Drive, and OpenAI when you first open the workflow.
2. Prepare your Google Sheets document with the service/category information. Here's an example of a Google Sheet.
3. Fill the "Define Sheets" node with your sheets.
4. Adjust the folder IDs in the "Prepare Job" node to match your Google Drive structure.
5. Configure the OpenAI model settings in the "OpenAI Chat Model" node if needed.
6. Test the workflow with a small subset of data before running it on your entire dataset.
7. Adjust the questions asked in the "Create your Q&A templates" section.
8. After testing, activate your workflow for automated FAQ generation.

🙏 Big, big kudos to Jim Le for his ideas, input and support when building this workflow. Your approach to AI workflows is always super helpful!
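To make the formatting step tangible, here is a hedged sketch of how a Code node might merge predefined answers with AI-generated ones and produce the JSON written to Google Drive. The node names, field names, and question wording are assumptions for illustration, not the template's exact configuration.

```javascript
// Hypothetical sketch of the Q&A formatting step.
// `service` is one row from the Google Sheet; `aiAnswers` is the AI node output (names assumed).
const service = $input.first().json;
const aiAnswers = $('OpenAI Answers').first().json;

const questions = [
  { q: `How do I set up ${service.name}?`, a: aiAnswers.setup },
  { q: `What permissions does ${service.name} need?`, a: aiAnswers.permissions },
  { q: `What can I integrate ${service.name} with?`, a: service.integrations },  // predefined answer
  { q: `What are typical use cases for ${service.name}?`, a: aiAnswers.useCases },
  { q: `How does ${service.name} pricing benefit me?`, a: service.pricingNotes }, // predefined answer
];

return [{
  json: {
    fileName: `${service.name}-faq.json`,
    content: JSON.stringify({ service: service.name, faq: questions }, null, 2),
  },
}];
```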

By Polina Medvedieva
10063

Scale deal flow with a Pitch Deck AI vision, chatbot and QDrant vector store

Are you a popular tech startup accelerator (named after a particular higher-order function) overwhelmed with thousands of pitch decks on a daily basis? Wish you could filter through them quickly using AI, but the decks are unparseable through conventional means? Then you're in luck!

This n8n template uses multimodal LLMs to parse and extract valuable data from even the most overly designed pitch decks in quick fashion. Not only that, it also creates the foundations of a RAG chatbot at the end so you or your colleagues can drill down into the details if needed. With this template, you'll scale your capacity to find interesting companies you'd otherwise miss!

Requires n8n v1.62.1+

How It Works
- Airtable is used as the pitch deck database and PDF decks are downloaded from it.
- An AI vision model is used to transcribe each page of the pitch deck into markdown.
- An Information Extractor is used to generate a report from the transcribed markdown and update the required information back into the pitch deck database.
- The transcribed markdown is also uploaded to a vector store to build an AI chatbot which can be used to ask questions about the pitch deck (a sketch of this preparation step follows this description).

Check out the sample Airtable here: https://airtable.com/appCkqc2jc3MoVqDO/shrS21vGqlnqzzNUc

How To Use
- This template depends on the availability of the Airtable: make a duplicate of the Airtable (link) and its columns before running the workflow.
- When a new pitch deck is received, enter the company name into the Name column and upload the PDF into the File column. Leave all other columns blank.
- If you have the Airtable trigger active, the execution should start immediately once the file is uploaded. Otherwise, click the manual test trigger to start the workflow.
- When manually triggered, all "new" pitch decks will be handled by the workflow as separate executions.

Requirements
- OpenAI for LLM
- Airtable for database and interface
- Qdrant for vector store

Customising This Workflow
Extend this starter template by adding more AI agents to validate claims made in the pitch deck, e.g. LinkedIn profiles, page visits, reviews, etc.
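As a rough illustration of how the transcribed markdown might be prepared for the Qdrant vector store, the sketch below emits one document per transcribed page with metadata so chatbot answers can be traced back to a company and page. The column and node names are assumptions; the template's actual nodes may differ.

```javascript
// Hedged sketch: prepare transcribed pitch-deck pages for the vector store.
const company = $('Airtable Record').first().json.Name; // assumed column name
const pages = $input.all().map(i => i.json);             // one item per transcribed page

return pages.map((page, idx) => ({
  json: {
    pageContent: page.markdown,                // markdown produced by the vision model
    metadata: { company, page: idx + 1 },      // lets the chatbot cite company and page
  },
}));
```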

By Jimleuk
7779

Automate Instagram & Facebook posting with Meta Graph API & System User Tokens

This template automates posting to Instagram Business accounts and Facebook Pages using the Meta Graph API. It supports both short-lived and long-lived tokens, with a secure approach using System User tokens for reliable, ongoing automation. It includes detailed guidance for authentication, token refresh logic, and API use.

Features
- 📸 Publish to Instagram via /media + /media_publish (the call sequence is sketched below)
- 📘 Post to Facebook Pages via /photos
- 🔐 Long-lived token support via a Meta Business System User
- ♻️ Token refresh support using staticData in n8n
- 🧠 In-line sticky note instructions

Use Cases
- Schedule and publish branded social media content
- Automate marketing flows with CRM + social sync
- Empower internal teams or clients to post without manual steps

Tags: Instagram, Facebook, Meta Graph API, Social Media, Token Refresh, Long-Lived Token, Marketing Automation, System User
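The two-step Instagram publish the template relies on (create a media container, then publish it) can be sketched as a single Code node for illustration. The template itself uses HTTP Request nodes; here the Graph API version, the payload fields, and where the token is stored in staticData are assumptions, though the /media and /media_publish sequence itself is the standard Graph API flow.

```javascript
// Hedged sketch of the Instagram publish sequence (not the template's exact nodes).
const igUserId = $json.igUserId;                                         // Instagram Business account ID (assumed field)
const token = $getWorkflowStaticData('global').longLivedToken;           // long-lived token kept in staticData (assumed key)

// Step 1: create a media container.
const container = await this.helpers.httpRequest({
  method: 'POST',
  url: `https://graph.facebook.com/v19.0/${igUserId}/media`,
  qs: { image_url: $json.imageUrl, caption: $json.caption, access_token: token },
  json: true,
});

// Step 2: publish the container.
const published = await this.helpers.httpRequest({
  method: 'POST',
  url: `https://graph.facebook.com/v19.0/${igUserId}/media_publish`,
  qs: { creation_id: container.id, access_token: token },
  json: true,
});

return [{ json: published }];
```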

By Brian
6448

📢 Multi-platform video publisher – YouTube, Instagram & TikTok

Hi! I'm Amanda ❤️ I build intelligent automation flows with n8n and Make. This one is for all content creators, marketing teams, and agencies who want to publish once and post everywhere. With this workflow, you can upload a single video to YouTube, Instagram Reels, and TikTok — simultaneously and automatically.

---

✅ What the workflow does
- Downloads a video from a provided URL
- Uploads the video to your YouTube channel with title and description
- Publishes it as a Reel on Instagram via the Meta Graph API
- Sends the same video to TikTok using their official API
- Supports credential input via a Set node (tokens, titles, descriptions)

---

⚙️ Nodes & Tech Used
- HTTP Request – Download video and handle uploads to Instagram & TikTok
- YouTube node – Official n8n integration for video upload
- Set node – For handling user inputs (tokens, titles, video URLs)
- Switch, Wait, Merge – Logic to control publishing status
- Manual or webhook start available

---

🛠️ Setup Instructions
1. Open the workflow in your n8n (Cloud or self-hosted) instance.
2. Edit the Set node called Credentials and fill in the Instagram token, the TikTok token, the YouTube title, description, and video URL, and the Instagram account ID.
3. Connect your YouTube OAuth credentials in the YouTube node.
4. Optionally, trigger via webhook to automate from other apps (Typebot, CRM, Drive).
5. Hit "Execute Workflow" or schedule via cron.

---

👥 Who this is for
- Content creators who want to post everywhere at once
- Agencies managing video distribution across platforms
- Social media managers and freelancers
- Anyone wanting a one-click multi-platform publishing workflow

---

🌐 Explore more workflows
❤️ Buy workflows: https://iloveflows.com
☁️ Try n8n Cloud: https://n8n.partnerlinks.io/amanda

By Amanda Benks
4899

Sync data between multiple Google Spreadsheets

- Triggers the workflow every two minutes
- Reads data from a Google Spreadsheet (in the example, the "Data" sheet, columns A to G)
- Writes the data unchanged to two different spreadsheets with the same sheet name and columns (expressions are optional)

By Jan Oberhauser
3727

Discover business leads with Gemini, Brave Search and web scraping

This workflow contains community nodes that are only compatible with the self-hosted version of n8n.

Uncover new business leads with this AI-powered Prospect Discovery Agent! This n8n workflow acts as a specialized intelligent assistant that, given a business type and location, uses multiple search strategies to identify a list of potential prospect companies and their websites.

Stop manually trawling through search results! This agent automates the initial phase of lead generation by:
- Understanding your target business profile (type, location, keywords).
- Strategically using web search tools (Brave Search, Google Gemini Search) to find relevant businesses.
- Performing quick validations to confirm relevance.
- Returning a clean, structured JSON list of prospect names and their website URLs.

How it Works
The workflow is built around an AI agent powered by Google Gemini. This agent is equipped with tools like:
- Brave Web Search: For broad initial sourcing of potential business candidates.
- Google Gemini Search: For advanced, context-aware discovery and finding businesses mentioned in various online sources.
- Brave Local Search (selective): For quick verification of local presence or finding website URLs for identified names.
- Jina AI Web Page Scraper (very selective): For extremely rapid relevance checks on uncertain websites by scanning page content for keywords.

The agent's system prompt guides it to use these tools efficiently to build a list of prospects without getting bogged down in deep research on any single one at this discovery stage.

Use Cases
- Lead Generation: Automatically generate lists of potential clients based on industry and location.
- Market Research: Identify key players or types of businesses in a specific geographical area.
- Sales Development: Provide SDRs with initial lists of companies to research further.
- Called as a Sub-Workflow: Designed to be easily integrated as a "tool" into more complex orchestrating AI agents (e.g., a BNI Pitch Planner that first needs to identify who to target).

Setup
1. Import the workflow.
2. Configure credentials. You'll need n8n credentials for:
   - Google Gemini (for the chat model and the Gemini Search/Vertex AI Search tool).
   - Brave Search (e.g., via Smithery MCP, or adapt if you have direct API access).
   - Jina AI (for the web scraper).
   Assign these to the respective nodes.
3. Review the system prompt. The prospectdiscoveryagent node contains a detailed system prompt. You can fine-tune this to adjust its search strategies or the strictness of its matching.

Inputs
This workflow is triggered by an "Execute Workflow Trigger" node (prospectdiscoveryworkflow). It expects the following inputs:
- business_type (string): e.g., "artisan bakery"
- location_query (string): e.g., "Portland, Oregon"
- desirednumprospects (number): e.g., 5
- additional_keywords (string, optional): e.g., "organic, gluten-free"

To Use (as a Sub-Workflow/Tool)
This workflow is typically called by another n8n workflow (e.g., using a "Tool Workflow" node from the Langchain nodes). The calling workflow provides the inputs listed above. The "Prospect Discovery" workflow then executes, and its final node (the prospectdiscoveryagent) outputs a JSON array of found prospects, like:

[
  { "business_name": "Rose Petal Bakery", "website_url": "https://rosepetalbakerypdx.com" },
  { "business_name": "The Daily Bread Artisans", "website_url": "https://dailybreadpdx.com" }
]

If no prospects are found, it returns an empty array [].
This template provides a powerful and focused tool for automating the initial stages of prospect identification.

By Jez
3495

Build a multichannel customer support AI assistant with Chatwoot & OpenRouter

Multichannel AI Assistant Demo for Chatwoot
This simple n8n template demonstrates a Chatwoot integration that can:
- Receive new messages via a webhook.
- Retrieve conversation history.
- Process the message history into a format suitable for an LLM.
- Demonstrate an AI assistant processing a user's query.
- Send the AI assistant's response back to Chatwoot.

Use Case
If you have multiple communication channels with your clients (e.g., Telegram, Instagram, WhatsApp, Facebook) integrated with Chatwoot, you can use this template as a starting point to build more sophisticated and tailored AI solutions that cover all channels at once.

How it works
1. A webhook receives the message created event from Chatwoot.
2. The webhook data is then filtered to keep only the necessary information for a cleaner workflow.
3. The workflow checks if the message is "incoming." This is crucial to prevent the assistant from replying to its own messages and creating endless loops.
4. The conversation history is retrieved from Chatwoot via an API call using the HTTP Request node. This allows the assistant's interaction to be more natural and continuous without needing to store conversation history locally (a sketch of the history-formatting step follows this description).
5. A simple AI assistant processes the conversation history and generates a response to the user based on its built-in knowledge base (see the prompt in the assistant node).
6. The final HTTP Request node sends the AI-generated response back to the appropriate Chatwoot conversation.

How to Use
1. In Chatwoot, go to Settings → Integrations → Webhooks and add your n8n webhook URL. Be sure to select the message created event.
2. In the HTTP Request nodes, replace the placeholder values https://yourchatwooturl.com and apiaccesstoken. You can find these values on your Chatwoot super admin page.
3. The LLM node is configured to use OpenRouter. Add your OpenRouter credentials, or replace the node with your preferred LLM provider.

Requirements
- An API key for OpenRouter or credentials for your preferred LLM provider.
- A Chatwoot account with at least one integrated channel and super admin access to obtain the apiaccesstoken.

Need Help Building Something More?
Contact me on:
- Telegram: @ninesfork
- LinkedIn: George Zargaryan

Happy Hacking! 🚀
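The step that turns Chatwoot history into LLM-ready messages could be sketched as the Code node below. Chatwoot message objects carry a `message_type` distinguishing incoming from outgoing messages; still, treat the exact field names and the `payload` wrapper as assumptions and verify them against the response of your own history request.

```javascript
// Hedged sketch: convert Chatwoot conversation history into chat messages for the LLM.
// Assumes messages arrive under `payload` with `message_type` (0 = incoming, 1 = outgoing).
const messages = $input.first().json.payload ?? [];

const history = messages
  .filter(m => m.content)                        // skip empty/system entries
  .map(m => ({
    role: m.message_type === 0 ? 'user' : 'assistant',
    content: m.content,
  }));

return [{ json: { history } }];
```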

By George Zargaryan
3239

Comprehensive SEO Audit with GPT-4 Specialists using Analytics, Search Console & PageSpeed

🤖 Automated SEO Audit with a Team of AI Specialists
This workflow performs a comprehensive, automated monthly SEO and performance audit for any website. It uses a "team" of specialized AI agents to analyze data from multiple sources, aggregates their findings, and generates a final strategic report. Every month, it automatically fetches data from Google Analytics, Google Search Console, and Google PageSpeed Insights, and also performs a live crawl of the target website's homepage.

Key Features
- Fully Automated: Runs on a schedule to deliver monthly reports without manual intervention.
- Multi-Source Analysis: Gathers data from four key marketing sources for a 360° view.
- AI Agent Team: Uses a sophisticated multi-agent system where each AI specializes in one area (Analytics, Performance, Technical SEO).
- Master Analyst: A final AI agent synthesizes all specialist reports into a single, actionable strategic plan.
- Automated Storage: All individual and final reports are automatically saved to a designated Google Sheet.

---

⚙️ Setup Instructions
To use this template, you must configure your credentials and set your target website.
1. Set Your Target Domain (crucial!): Find the Set Target Website node at the beginning of the workflow. In the "Value" field, replace https://www.your-website.com with the URL of the website you want to audit. This updates the URL across the entire workflow automatically.
2. Configure the Schedule Trigger: Click on the Schedule Trigger node to set when you want the monthly report to run.
3. Connect Your Google Credentials:
   - Google Analytics: Select your credential in the Get a report node.
   - Google Search Console: Select your credential in the Search Console (HTTP Request) node.
   - Google Sheets: Select your credential in all Google Sheets nodes.
4. Google PageSpeed API Key: Go to the "Credentials" tab in n8n and create a new "Generic Credential" of type "API Key - Query Param". Name it Google API Key; the "Parameter Name" must be key. Paste your PageSpeed API key into the "API Key" field. Then go back to the PageSpeed Insight node, select "API Key - Query Param" for Authentication, and choose your new credential.
5. Connect OpenAI Credentials: This template uses multiple OpenAI Chat Model nodes. Configure each one with your OpenAI API key.
6. Set Your Google Sheet: In each of the Google Sheets nodes, replace the hardcoded "Document ID" with the ID of your own Google Sheet where you want to store the reports.

---

🔬 Workflow Explained
- Phase 1: Data Collection: The Schedule Trigger starts the workflow. Four parallel branches collect data from Google Analytics, PageSpeed Insights, Search Console, and a direct website crawl.
- Phase 2: Data Processing & Specialist Analysis: Each data source is processed by a dedicated Code node to format the data. The formatted data is then sent to a specialized AI agent (ANALYTICS SPECIALIST, PERFORMANCE SPECIALIST, etc.) for in-depth analysis (a hedged sketch of one such formatting node follows this description).
- Phase 3: Report Aggregation: A Merge node waits for all four specialist reports to be completed. A DATA AGGREGATOR node then combines them into a single, comprehensive package.
- Phase 4: Master Synthesis & Storage: The final MASTER ANALYST agent receives the aggregated data and produces a high-level strategic summary with actionable recommendations. This final report is then saved to Google Sheets.
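As an illustration of the Phase 2 formatting step, here is a hedged sketch of a Code node that pulls the headline numbers out of a PageSpeed Insights response before handing them to the performance specialist. The `lighthouseResult` paths follow the public PageSpeed Insights v5 response, but treat the audit keys as assumptions and verify them against your own payload; the template's actual formatting node may differ.

```javascript
// Hedged sketch of a "format PageSpeed data" Code node (Phase 2).
// Keeps only headline metrics so the PERFORMANCE SPECIALIST gets a compact, token-friendly summary.
const psi = $input.first().json;
const lh = psi.lighthouseResult ?? {};

const summary = {
  url: psi.id,
  performanceScore: Math.round((lh.categories?.performance?.score ?? 0) * 100),
  // Audit keys below are the usual Lighthouse ones; verify them in your payload.
  firstContentfulPaint: lh.audits?.['first-contentful-paint']?.displayValue,
  largestContentfulPaint: lh.audits?.['largest-contentful-paint']?.displayValue,
  cumulativeLayoutShift: lh.audits?.['cumulative-layout-shift']?.displayValue,
  totalBlockingTime: lh.audits?.['total-blocking-time']?.displayValue,
};

return [{ json: summary }];
```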

By Jimmy Gay
2917