Split test different agent prompts with Supabase and OpenAI
Split Test Agent Prompts with Supabase and OpenAI

Use Case
Oftentimes it is useful to test different settings for a large language model in production against various metrics. Split testing is a good method for doing this.

What it Does
This workflow randomly assigns chat sessions to one of two prompts: the baseline and the alternative. The agent uses the same prompt for all interactions in that chat session.

How it Works
- When a message arrives, a table that records each session ID and which prompt it was assigned is checked to see whether the chat session already exists.
- If it does not, the session ID is added to the table and a prompt is randomly assigned.
- These values are then used to generate a response.

Setup
- Create a table in Supabase called splittestsessions. It needs the following columns: sessionid (text) and showalternative (bool).
- Add your Supabase, OpenAI, and PostgreSQL credentials.
- Modify the Define Path Values node to set the baseline and alternative prompt values.
- Activate the workflow and test by sending messages through n8n's built-in chat.
- Experiment with different chat sessions to see both prompts in action.

Next Steps
- Modify the workflow to test different LLM settings, such as temperature.
- Add a method to measure the efficacy of the two prompts.
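For orientation, here is a minimal sketch of the assignment logic this workflow implements, written as a standalone script rather than as n8n nodes. It assumes the @supabase/supabase-js client and the table and column names from the Setup step above; the prompt strings and environment variable names are placeholders, not values from the workflow itself.

```typescript
// Minimal sketch of per-session prompt assignment, assuming the table and
// columns from the Setup section and the @supabase/supabase-js client.
// BASELINE_PROMPT / ALTERNATIVE_PROMPT are illustrative placeholders.
import { createClient } from "@supabase/supabase-js";

const supabase = createClient(process.env.SUPABASE_URL!, process.env.SUPABASE_KEY!);

const BASELINE_PROMPT = "You are a concise assistant.";     // placeholder
const ALTERNATIVE_PROMPT = "You are a friendly assistant."; // placeholder

async function promptForSession(sessionId: string): Promise<string> {
  // Look up an existing assignment for this chat session.
  const { data: existing } = await supabase
    .from("splittestsessions")
    .select("showalternative")
    .eq("sessionid", sessionId)
    .maybeSingle();

  if (existing) {
    return existing.showalternative ? ALTERNATIVE_PROMPT : BASELINE_PROMPT;
  }

  // New session: flip a coin and persist the assignment so every later
  // message in this session reuses the same prompt.
  const showAlternative = Math.random() < 0.5;
  await supabase
    .from("splittestsessions")
    .insert({ sessionid: sessionId, showalternative: showAlternative });

  return showAlternative ? ALTERNATIVE_PROMPT : BASELINE_PROMPT;
}
```

Persisting the coin flip is what keeps the test consistent: every message in a session is answered with the same prompt, so any metric you later compute can be attributed to one variant.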
One-way sync Stripe invoice PDFs to an S3 bucket
This automation syncs your invoice PDFs from Stripe to an (AWS) S3 bucket each month, in a folder of your choice, using the following subPath: yourFolder/invoiceYear/invoiceMonth/fileName

Fill in your credentials and settings in the nodes marked with "*". You can adjust this workflow to your needs. You can also override the year and month in the ENV node for manual syncs.

It syncs every invoice PDF whose created date is greater than the provided year and month; the day is automatically set to the first day of the desired month.

Enjoy the workflow! ❤️
https://let-the-work-flow.com - Workflow Automation & Development
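To make the date handling and key layout concrete, here is a small sketch of the cutoff timestamp and S3 key the workflow's path convention implies. It is an illustration only; the folder name and the year/month values are placeholders, and in the actual workflow they come from the ENV node.

```typescript
// Sketch of the date cutoff and S3 key layout described above.
// folder, year, and month are placeholders; the workflow reads them from the ENV node.
const folder = "yourFolder"; // folder of your choice
const year = 2024;           // override for manual syncs
const month = 7;             // 1-12, override for manual syncs

// Stripe filters invoices by a Unix timestamp; the day is pinned to the
// first of the desired month, matching the behaviour described above.
const createdAfter = Math.floor(Date.UTC(year, month - 1, 1) / 1000);

// subPath used when uploading each PDF: yourFolder/invoiceYear/invoiceMonth/fileName
function s3Key(fileName: string): string {
  return `${folder}/${year}/${month}/${fileName}`;
}

console.log(createdAfter, s3Key("invoice_0001.pdf"));
```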
Multi-platform source discovery with SerpAPI, DuckDuckGo, GitHub, Reddit & Bluesky
Source Discovery - Automatically Search More Up-to-Date Information Sources

🎬 Overview
Version: 1.0
This workflow utilizes various nodes to discover and analyze potential sources of information from platforms like Google, Reddit, GitHub, Bluesky, and others. It is designed to streamline the process of finding relevant sources based on specified search themes.

✨ Features
- Automated source discovery from multiple platforms.
- Filtering of existing and undesired sources.
- Error handling for API requests.
- User-friendly configuration options.

👤 Who is this for?
This workflow is ideal for researchers, content marketers, journalists, and anyone looking to efficiently gather and analyze information from various online sources.

💡 What problem does this solve?
This workflow addresses the challenge of manually searching for relevant information sources, saving time and effort while ensuring that users have access to the most pertinent content. Ideal use-cases include:
- Resource Compilation for Academic and Educational Purposes
- Journalism and Research
- Content Marketing
- Competitor Analysis

🔍 What this workflow does
The workflow gathers data from selected platforms through search terms. It filters out known and undesired sources, analyzes the content, and provides insights into potential sources relevant to the user's needs.

🔄 Workflow Steps
1. Search Queries
   - Fetches sources using SerpAPI search, DuckDuckGo, and Bluesky.
   - Utilizes GitHub repositories to find relevant links.
   - Leverages RSS feeds from subreddits to identify potential sources.
2. Filtering Step
   - Removes existing and undesired sources from the results (a rough sketch of this step appears at the end of this entry).
3. Source Selection
   - Analyzes the content of the identified sources for relevance.

📌 Expected Input / Configuration
The workflow is primarily configured via the Configure Workflow Args (Manual) node or the Global Variables custom node.
- Search themes: keywords or phrases relevant to the desired content.
- Lists of known sources and undesired sources for filtering.

📦 Expected Output
A curated list of potential sources relevant to the specified search themes, along with insights into their content.

📌 Example

⚙️ n8n Setup Used
- n8n version: 1.105.3
- n8n-nodes-serpapi: 0.1.6
- n8n-nodes-globals: 1.1.0
- n8n-nodes-bluesky-enhanced: 1.6.0
- n8n-nodes-duckduckgo-search: 30.0.4
- LLM model: mistral-small-latest (API)
- Platform: Podman 4.3.1 on Linux
- Date: 2025-08-06

⚡ Requirements to Use / Setup
- Self-hosted or cloud n8n instance.
- Install the following custom nodes: SerpAPI, Bluesky, and DuckDuckGo Search.
  - n8n-nodes-serpapi
  - n8n-nodes-duckduckgo-search
  - n8n-nodes-bluesky-enhanced
- Install the Global Variables node for enhanced configuration: n8n-nodes-globals (or use an Edit Fields (Set) node instead).
- Provide valid credentials to nodes for your preferred LLM model, SerpAPI, and Bluesky. GitHub credentials are recommended.

⚠️ Notes, Assumptions & Warnings
- Ensure compliance with the terms of service of any platforms accessed or discovered in this workflow, particularly concerning data usage and attribution.
- Monitor API usage to avoid hitting rate limits.
- The workflow may encounter errors such as 403 responses; in such cases, it will continue by ignoring the affected substep.
- Duplicate removal is applied, but occasional overlaps might still appear depending on the sources.
- This workflow assumes familiarity with n8n, APIs, and search engines.
- Using AI agents (Mistral or substitute LLMs) requires access to their API services and keys.
- This is not a curator of news. It is designed to find websites that are relevant and useful to your searches.
If you are looking for a relevant news selector, please check this workflow.

ℹ️ About Us
This workflow was developed by the Hybroht team. Our goal is to create tools that harness the possibilities of technology and more. We aim to continuously improve and expand functionalities based on community feedback and evolving use cases. For questions, reach out via contact@hybroht.com.

---

⚖️ Warranty & Legal Notice
This free workflow is provided "as-is" without any warranties of any kind, either express or implied, including but not limited to the implied warranties of merchantability, fitness for a particular purpose, or non-infringement. By using this workflow, you acknowledge that you do so at your own risk. We shall not be held responsible for any damages, losses, or liabilities arising from the use or inability to use this workflow, including but not limited to any direct, indirect, incidental, or consequential damages. It is your responsibility to ensure that your use of this workflow complies with all applicable laws and regulations.

---
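As referenced in the Workflow Steps above, here is a rough sketch of the filtering step: drop results whose hostname is already known or explicitly undesired, then deduplicate by hostname. This is a standalone illustration, not the workflow's actual node code; the input shape, variable names, and example domains are assumptions.

```typescript
// Illustrative filtering step: remove known and undesired sources, then
// deduplicate by hostname. Input shape and list contents are assumptions.
interface SearchResult {
  title: string;
  url: string;
}

const knownSources = ["example.com"];           // sources you already track
const undesiredSources = ["spam-site.example"]; // sources to exclude

function filterSources(results: SearchResult[]): SearchResult[] {
  const seen = new Set<string>();
  const filtered: SearchResult[] = [];

  for (const result of results) {
    let host: string;
    try {
      host = new URL(result.url).hostname.replace(/^www\./, "");
    } catch {
      continue; // skip malformed URLs
    }

    if (knownSources.includes(host)) continue;     // already known
    if (undesiredSources.includes(host)) continue; // explicitly excluded
    if (seen.has(host)) continue;                  // duplicate removal

    seen.add(host);
    filtered.push(result);
  }
  return filtered;
}
```

As noted above, deduplication is by hostname only, so occasional overlaps (e.g. the same site under different subdomains) can still appear.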
Forward email & LinkedIn message notifications from Reply.io to Telegram
How this works:
This workflow contains a webhook that receives updates from Reply.io when one of your connections sends you a message via either email or LinkedIn. The workflow also includes utility nodes to create the necessary webhook subscriptions.

Setup steps:
1. Configure the body of the utility nodes to create subscriptions for the correct URL (retrieve it from the "Webhook" node in the workflow; use the production URL).
2. Obtain your Reply.io API key. Documentation: https://apidocs.reply.io/
3. Create a Telegram bot.
4. Set your Reply.io API key in the HTTP nodes using Header Authentication.
5. Enter your Telegram credentials in the Telegram node.
6. Activate the workflow.
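The subscription creation performed by the utility nodes boils down to a header-authenticated HTTP POST. The sketch below shows the general shape only: the endpoint path, request body fields, event names, and the header name are placeholders to verify against the Reply.io API documentation, and the webhook URL stands in for your workflow's production URL.

```typescript
// Rough sketch of what the utility nodes do: register your n8n production
// webhook URL with Reply.io using header authentication. The endpoint path,
// payload fields, event names, and header name are assumptions; confirm them
// against https://apidocs.reply.io/ before use.
const REPLY_API_KEY = process.env.REPLY_API_KEY!;                // your Reply.io API key
const WEBHOOK_URL = "https://your-n8n.example/webhook/reply-io"; // production URL from the Webhook node

async function createSubscription(event: string): Promise<void> {
  const response = await fetch("https://api.reply.io/v1/webhooks", { // placeholder endpoint
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "X-Api-Key": REPLY_API_KEY, // header authentication, as configured in the HTTP nodes
    },
    body: JSON.stringify({ event, url: WEBHOOK_URL }), // placeholder body shape
  });
  if (!response.ok) {
    throw new Error(`Subscription failed: ${response.status}`);
  }
}

// One subscription per message type you want forwarded to Telegram.
await createSubscription("email_reply");    // placeholder event name
await createSubscription("linkedin_reply"); // placeholder event name
```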