48 templates found

Scrape any web page into structured JSON data with ScrapeNinja and AI

Disclaimer: This template only works on self-hosted n8n for now, as it uses a community node.

Use Case
Web scrapers often break due to web page layout changes. This workflow mitigates that problem by auto-generating web scraping data extractor code via an LLM.

How It Works
This workflow leverages the ScrapeNinja n8n community node to:
- scrape the webpage HTML,
- feed it into an LLM (Google Gemini) and ask it to write a JS extractor function,
- execute the generated JS extractor against the scraped HTML to extract useful data from the webpage (the code is safely executed in a sandbox).

Installation
To install the ScrapeNinja n8n node in your self-hosted instance, go to Settings -> Community nodes, enter "n8n-nodes-scrapeninja", and install. Make sure you are using at least v0.3.0.

See this in action: https://www.linkedin.com/feed/update/urn:li:activity:7289659870935490560/
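
The LLM-generated extractor is just a plain JavaScript function that receives HTML and returns structured JSON. A minimal sketch of what such a generated function might look like, assuming the cheerio package is available in the sandbox; the function name and selectors are illustrative, not output from the actual workflow:

```javascript
// Sketch of an LLM-generated extractor: parse product cards out of raw HTML.
// Selectors and field names are hypothetical.
const cheerio = require('cheerio');

function extract(html) {
  const $ = cheerio.load(html);
  const items = [];
  $('.product-card').each((_, el) => {
    items.push({
      title: $(el).find('.title').text().trim(),
      price: parseFloat($(el).find('.price').text().replace(/[^\d.]/g, '')),
      url: $(el).find('a').attr('href'),
    });
  });
  return { items };
}

// Example usage against scraped HTML:
// const data = extract(scrapedHtml);
// console.log(JSON.stringify(data, null, 2));
```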

By Anthony
58699

Chat with GitHub API documentation: RAG-powered chatbot with Pinecone & OpenAI

This workflow demonstrates a Retrieval Augmented Generation (RAG) chatbot that lets you chat with the GitHub API specification (documentation) using natural language. Built with n8n, OpenAI's LLMs and the Pinecone vector database, it provides accurate and context-aware responses to your questions about how to use the GitHub API. You could adapt this to any OpenAPI specification for any public or private API, thus creating a documentation chatbot that anyone in your company can use.

How it works:
- Data Ingestion: The workflow fetches the complete GitHub API OpenAPI 3 specification directly from the GitHub repository.
- Chunking and Embeddings: It splits the large API spec into smaller, manageable chunks. OpenAI's embedding models then generate vector embeddings for each chunk, capturing their semantic meaning.
- Vector Database Storage: These embeddings, along with the corresponding text chunks, are stored in a Pinecone vector database.
- Chat Interface and Query Processing: The workflow provides a simple chat interface. When you ask a question, it generates an embedding for your query using the same OpenAI model.
- Semantic Search and Retrieval: Pinecone is queried to find the most relevant text chunks from the API spec based on the query embedding.
- Response Generation: The retrieved chunks and your original question are fed to OpenAI's gpt-4o-mini LLM, which generates a concise, informative, and contextually relevant answer, including code snippets when applicable.

Set up steps:
- Create accounts: You'll need accounts with OpenAI and Pinecone.
- API keys: Obtain API keys for both services.
- Configure credentials: In your n8n environment, configure credentials for OpenAI and Pinecone using your API keys.
- Import the workflow: Import this workflow into your n8n instance.
- Pinecone Index: Ensure you have a Pinecone index named "n8n-demo" or adjust the workflow accordingly. The workflow is set up to work with this index out of the box.

Setup Time: Approximately 15-20 minutes.

Why use this workflow?
- Learn RAG in Action: This is a practical, hands-on example of how to build a RAG-powered chatbot.
- Adaptable Template: Easily modify this workflow to create chatbots for other APIs or knowledge bases.
- n8n Made Easy: See how n8n simplifies complex integrations between data sources, vector databases, and LLMs.
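
The retrieval step boils down to embedding the question and querying Pinecone for the nearest chunks. A minimal sketch outside n8n, assuming the official openai and @pinecone-database/pinecone Node.js clients; the embedding model and the metadata field name are assumptions, only the "n8n-demo" index name and gpt-4o-mini come from the template:

```javascript
// Embed a question, pull the top matching spec chunks from Pinecone,
// and hand them to gpt-4o-mini as context.
const OpenAI = require('openai');
const { Pinecone } = require('@pinecone-database/pinecone');

const openai = new OpenAI(); // reads OPENAI_API_KEY
const pinecone = new Pinecone({ apiKey: process.env.PINECONE_API_KEY });
const index = pinecone.index('n8n-demo');

async function ask(question) {
  // 1. Embed the query with the same model used at ingestion time (assumed here).
  const emb = await openai.embeddings.create({
    model: 'text-embedding-3-small',
    input: question,
  });

  // 2. Semantic search: fetch the most relevant chunks.
  const results = await index.query({
    vector: emb.data[0].embedding,
    topK: 5,
    includeMetadata: true,
  });
  const context = results.matches.map(m => m.metadata.text).join('\n---\n'); // 'text' field is an assumption

  // 3. Generate a grounded answer.
  const chat = await openai.chat.completions.create({
    model: 'gpt-4o-mini',
    messages: [
      { role: 'system', content: 'Answer using only the provided GitHub API spec excerpts.' },
      { role: 'user', content: `Context:\n${context}\n\nQuestion: ${question}` },
    ],
  });
  return chat.choices[0].message.content;
}
```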

By Mihai Farcas
29649

Automate Your RFP Process with OpenAI Assistants

This n8n workflow demonstrates how to automate the often time-consuming form-filling tasks in the early stages of the tendering process: the Request for Proposal document, or "RFP". It does this by using a company's knowledgebase to generate question-and-answer pairs with Large Language Models.

How it works
- A buyer's RFP is submitted to the workflow as a digital document that can be parsed.
- Our first AI agent scans and extracts all questions from the document into list form.
- The supplier sets up an OpenAI assistant beforehand, loaded with company brand, marketing and technical documents.
- The workflow loops through each of the buyer's questions and poses them to the OpenAI assistant.
- The assistant's answers are captured until all questions are satisfied, then exported into a new document for review.
- A sales team member can then use this document to respond quickly to the RFP before their competitors.

Example Webhook Request
curl --location 'https://<n8nwebhookurl>' \
  --form 'id="RFP001"' \
  --form 'title="BlueChip Travel and StarBus Web Services"' \
  --form 'reply_to="jim@example.com"' \
  --form 'data=@"k9pnbALxX/RFP Questionnaire.pdf"'

Requirements
An OpenAI account to use AI services.

Customising the workflow
OpenAI Assistants is only one approach to hosting a company knowledgebase for AI to use. Exploring different solutions, such as building your own RAG-powered database, can sometimes yield better results in terms of control over how the data is managed, and cost.
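
Posing each extracted question to the assistant follows the standard OpenAI Assistants API pattern: create a thread, add the question as a message, run the assistant, and wait for the run to finish. A minimal sketch using the openai Node.js SDK; the assistant ID is a placeholder, and in the template this is handled by n8n's OpenAI nodes rather than code:

```javascript
const OpenAI = require('openai');
const openai = new OpenAI(); // reads OPENAI_API_KEY

// Ask one RFP question against a preconfigured assistant and return its answer.
async function askAssistant(assistantId, question) {
  const thread = await openai.beta.threads.create();
  await openai.beta.threads.messages.create(thread.id, {
    role: 'user',
    content: question,
  });

  // createAndPoll blocks until the run reaches a terminal state.
  const run = await openai.beta.threads.runs.createAndPoll(thread.id, {
    assistant_id: assistantId,
  });
  if (run.status !== 'completed') throw new Error(`Run ended with status ${run.status}`);

  const messages = await openai.beta.threads.messages.list(thread.id);
  return messages.data[0].content[0].text.value; // newest message first
}

// Example: loop over the questions extracted from the RFP document.
// for (const q of questions) answers.push(await askAssistant('asst_xxx', q));
```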

By Jimleuk
10881

Ultimate content generator for WordPress

Overview
This workflow automates the end-to-end process of creating, optimizing, and publishing content on WordPress. It integrates AI-powered tools, Airtable, and WordPress plugins to generate high-quality, on-brand posts effortlessly. Perfect for content creators, marketers, and business owners looking to save time and scale their content strategy.

---

Features
- Content Creation:
  - AI-Powered Content: Generates SEO-friendly blog posts with structured headings, relevant keywords, and meta descriptions.
  - Custom Prompts: Tailor the AI-generated content to match your brand's tone and voice.
- SEO Optimization:
  - RankMath Plugin Integration: Updates RankMath SEO with focus keywords and meta descriptions, ensuring your content is search-engine optimized.
- Content Management:
  - Airtable Integration: Organizes content ideas, drafts, and publishing schedules in one place. Easily scalable for teams or solo creators.
- Visuals:
  - Branded Featured Images: Automatically generates on-brand images for every post.
- Publishing:
  - Effortless Formatting: Adapts content to fit your WordPress theme and schedules it for publication.

---

Workflow Steps
1. Trigger: Initiated manually or on a schedule.
2. Content Management: Retrieves and organizes ideas from Airtable.
3. Content Generation: Generates AI-driven blog content tailored to your audience.
4. SEO Optimization: Automatically updates RankMath with SEO details.
5. Featured Image Creation: Produces on-brand images for the post.
6. Publishing: Formats and schedules the post on WordPress.

---

Prerequisites
- API Keys: OpenAI, Airtable, WordPress REST API, RankMath SEO Plugin
- Custom Code: Add a small update to your WordPress theme's functions.php file to enable seamless automation.

---

Customization
- Replace Airtable with another content management system if preferred.
- Adjust AI prompts to reflect different tones, styles, or industries.
- Add integrations for additional plugins, analytics, or storage services.

---

Usage
1. Import the workflow into your n8n instance.
2. Configure API credentials for WordPress, Airtable, OpenAI, and RankMath.
3. Update your functions.php file with the provided code snippet.
4. Customize prompts and Airtable structure for your content needs.
5. Trigger the workflow manually or set it on a schedule.

---

Notes
- Experiment with Airtable views or add filters for more granular control over your content pipeline.
- Extend the workflow to include social media posting or analytics tracking.
- For questions, refer to the n8n documentation or reach out to the creator.

---

Tools Used
Airtable, OpenAI GPT, WordPress REST API, RankMath SEO Plugin

Feel free to adapt and extend this workflow to meet your specific needs! 🎉
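
The publishing step ultimately comes down to a call against the WordPress REST API. A minimal sketch of creating a scheduled post via /wp-json/wp/v2/posts with an application password; this illustrates the publishing step only, it is neither the template's actual node configuration nor the functions.php snippet it references, and the site URL, credentials, and field values are placeholders:

```javascript
// Create a scheduled post on WordPress via the core REST API.
async function publishPost({ title, content, excerpt }) {
  const auth = Buffer.from('wp-user:application-password').toString('base64');

  const res = await fetch('https://example.com/wp-json/wp/v2/posts', {
    method: 'POST',
    headers: {
      'Content-Type': 'application/json',
      Authorization: `Basic ${auth}`,
    },
    body: JSON.stringify({
      title,
      content,          // HTML body produced by the LLM
      excerpt,          // often reused as a meta description by themes/plugins
      status: 'future', // schedule instead of publishing immediately
      date: '2025-01-01T08:00:00',
    }),
  });

  if (!res.ok) throw new Error(`WordPress API error: ${res.status}`);
  return res.json(); // contains the new post ID and link
}
```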

By Alex Kim
5614

Competitor price monitoring with web scraping, Google Sheets & Telegram

How it works
Download the provided Google Sheet template, upload it to your own Google Sheets account, and replace the sheet referenced in the Google Sheets node.
- Scheduled trigger: Runs once a day at 8 AM (server time).
- Fetch product list: Reads your "master" sheet (product_url + last known price) from Google Sheets.
- Loop with delay: Iterates over each row (product) one at a time, inserting a short pause (20 s) between HTTP requests to avoid blocking.
- Scrape current price: Loads each product_url and extracts the current price via a simple CSS selector.
- Compare & normalize: Compares the newly scraped price against the last_price from your sheet, calculates the percentage change, and tags items where price_changed == true.
- On price change:
  - Send alert: Formats a Telegram message ("Price Drop" or "Price Hike") and pushes it to your configured chat.
  - Log history: Appends a new row to a separate "price_tracking" tab with timestamp, old price, new price, and % change.
- Update master sheet: After a 1 min pause, writes the updated current_price back to your "master" sheet so future runs use it as the new baseline.

Set up steps
Google Sheets credentials (~5 min)
- Create a Google Sheets OAuth credential in n8n.
- Copy your sheet's ID and ensure you have two tabs:
  - product_data (columns: product_url, price)
  - price_tracking (columns: timestamp, product_url, last_price, current_price, price_diff_pct, price_changed)
- Paste the sheet ID into both Google Sheets nodes ("Read" and "Append/Update").

Telegram credentials (~5 min)
- Create a Telegram bot token via BotFather.
- Copy your chat_id (for your target group or personal chat).
- Add those credentials to n8n and drop them into the "Telegram" node.

Workflow parameters (~5 min)
- Verify the schedule in the Schedule Trigger node is set to 08:00 (or adjust to your preferred run time).
- In the Loop Over Items node, confirm "Batch Size" is 1 (to process one URL at a time).
- Adjust the "Delay to avoid Request Blocking" node if your site requires a longer pause (default is 20 s).
- In the "Parse Data From The HTML Page" node, double-check the CSS selector matches how prices appear on your target site.

Once credentials are in place and your sheet tabs match the expected column names, the flow should be ready to activate. Total setup time is under 15 minutes; detailed notes are embedded as sticky comments throughout the workflow to help you tweak selectors, change timeouts, or adjust sheet names without digging into code.
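
The compare & normalize step is simple string cleanup plus a percentage calculation. A minimal sketch of that logic, roughly as it might appear in an n8n Code node; field names match the sheet columns above, and the CSS-selector extraction itself is handled by the HTML parsing node:

```javascript
// Strip currency symbols/whitespace from a scraped price string and flag rows
// whose price moved relative to the sheet baseline.
function normalizePrice(raw) {
  return parseFloat(String(raw).replace(/[^\d.]/g, ''));
}

function comparePrices(row) {
  const lastPrice = normalizePrice(row.last_price);
  const currentPrice = normalizePrice(row.scraped_price);
  const priceDiffPct = ((currentPrice - lastPrice) / lastPrice) * 100;

  return {
    product_url: row.product_url,
    last_price: lastPrice,
    current_price: currentPrice,
    price_diff_pct: Number(priceDiffPct.toFixed(2)),
    price_changed: currentPrice !== lastPrice,
    direction: currentPrice > lastPrice ? 'Price Hike' : 'Price Drop',
  };
}

// Example:
// comparePrices({ product_url: 'https://shop.example/x', last_price: '49.99', scraped_price: '$39.99' })
// -> { ..., price_diff_pct: -20, price_changed: true, direction: 'Price Drop' }
```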

By Tony Paul
5362

Create AI news avatar videos with Dumpling AI, GPT-4o and HeyGen

🧾 What this workflow does
This workflow automatically generates avatar-style videos from the latest AI-related news using Dumpling AI and HeyGen. It runs every hour, scrapes trending articles, turns them into 30–60 second spoken scripts with GPT-4o, and produces short avatar videos with HeyGen. Finally, it logs the final video URL in a Google Sheet.

---

👤 Who is this for
- Newsletters and creators who want to automate AI trend updates
- Content marketers generating short-form video content
- Product teams experimenting with AI-generated summaries
- Automation enthusiasts combining LLMs + video + trending data

---

⚙️ How to set up

🔐 Requirements
- Dumpling AI API key stored securely as an HTTP Header credential
- HeyGen API key added as an HTTP Header credential
- OpenAI API key for GPT-4o (can use GPT-4o-mini if preferred)
- Google Sheets account with one column: Video link

🛠 Step-by-step setup
1. Google Sheet Setup: Create a Google Sheet with a single column named Video link.
2. Update Credentials: Use n8n's credential manager to add tokens for Dumpling AI, HeyGen, OpenAI, and Google Sheets.
3. Optional Customizations:
   - In the "Dumpling AI: Search AI News" node, you can change "query": "AI Agent" to other trending keywords (e.g., "Generative AI", "Autonomous Agents", etc.)
   - Update the avatar_id and voice_id in the HeyGen request to match your preferred look/sound.

---

🧠 How it works
1. The Schedule Trigger runs hourly.
2. Dumpling AI searches for fresh news related to "AI Agent."
3. The top 4 news links are scraped for full content.
4. Articles are merged and fed into GPT-4o via a LangChain Agent to produce a casual, conversational video script.
5. HeyGen creates a video using the script, avatar, and voice.
6. The workflow waits until the video rendering is complete.
7. Once done, the final video link is logged into Google Sheets.

---

🧪 Customization Ideas
- Change the interval (e.g., every 6 hours, daily)
- Swap the avatar/voice in HeyGen to fit your brand
- Expand to post the video directly to social media
- Add image backgrounds or B-roll overlays using Creatomate

---

This is a fast, automated pipeline to create explainer-style AI news updates using real-time data and generative video tools.
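
Waiting for the render is a poll-until-complete loop against HeyGen's status endpoint. A rough sketch of that step; the exact endpoint path and response fields are assumptions based on HeyGen's public docs, and in the template this is handled by Wait and HTTP Request nodes rather than code:

```javascript
// Poll HeyGen until a submitted video finishes rendering, then return its URL.
// Endpoint path and response shape are assumptions; verify against current HeyGen docs.
async function waitForHeygenVideo(videoId, apiKey) {
  for (let attempt = 0; attempt < 30; attempt++) {
    const res = await fetch(
      `https://api.heygen.com/v1/video_status.get?video_id=${videoId}`,
      { headers: { 'X-Api-Key': apiKey } }
    );
    const body = await res.json();
    const status = body.data?.status;

    if (status === 'completed') return body.data.video_url;
    if (status === 'failed') throw new Error('HeyGen rendering failed');

    await new Promise(r => setTimeout(r, 30_000)); // wait 30 s before re-checking
  }
  throw new Error('Timed out waiting for HeyGen video');
}
```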

By Yang
4163

Personal assistant bot with multi-agent system using Telegram & Google Gemini

How It Works
1. The Telegram Trigger receives incoming messages (text, voice, photo, document).
2. A Switch routes by message type to the appropriate processor:
   - Text → forwarded as-is.
   - Voice → downloaded and sent to "Transcribe a recording".
   - Photo → downloaded, converted to base64, then sent to "Analyze image".
   - Document → routed to the document handler.
3. Merge collects the processed input and passes a unified prompt to the Manager Agent.
4. The Manager Agent (LM: Google Gemini) orchestrates specialized agents/tools:
   - memory_base (Airtable) → saving & retrieving personal/company memory
   - todoandtask_manager (Todoist / Google Sheets) → tasks
   - email_agent (Gmail) → composing/sending emails
   - calendar_agent (Google Calendar) → scheduling
   - research_agent (SerpAPI / Wikipedia / Wolfram) → web research
   - project_management (Google Sheets) → project updates
5. The Manager Agent updates memory windows and sends the final reply back to Telegram.

---

Setup Steps
1. Create and configure a Telegram bot; set the bot token/webhook in the Telegram Trigger and Telegram nodes. Update chatId placeholders.
2. Add Google Gemini (PaLM) credentials in the Gemini model nodes.
3. Configure the Airtable knowledge base: set the base ID & table IDs used by the memory_base nodes.
4. Connect Google APIs: add Sheets, Calendar, and Gmail credentials and set document/sheet IDs.
5. Configure Todoist, SerpAPI, WolframAlpha credentials and any other tool API keys.
6. Verify Window Buffer Memory sessionKey values (match user sessions).
7. Check schedule triggers (cron expressions) and adjust times/timezone.
8. Run quick tests: send text, voice, and image messages, and confirm replies and memory writes.

---

Estimated Setup Time
- 30–60 minutes → if credentials & IDs are ready.
- 2–4 hours → full setup (API keys, spreadsheets, Airtable, detailed permissions).
- 4–8 hours → complex deployment (team permissions, multiple calendars, advanced tool tuning, production testing).
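
The Switch step simply inspects which field Telegram populated on the incoming message. A minimal sketch of equivalent routing logic in plain JavaScript, purely illustrative since the template uses an n8n Switch node rather than code:

```javascript
// Decide which processing branch an incoming Telegram message belongs to,
// based on which payload field Telegram populated.
function routeTelegramMessage(message) {
  if (message.voice) return { branch: 'voice', fileId: message.voice.file_id };
  if (message.photo) {
    // Telegram sends multiple resolutions; the last entry is the largest.
    const largest = message.photo[message.photo.length - 1];
    return { branch: 'photo', fileId: largest.file_id };
  }
  if (message.document) return { branch: 'document', fileId: message.document.file_id };
  if (message.text) return { branch: 'text', text: message.text };
  return { branch: 'unsupported' };
}

// Example:
// routeTelegramMessage({ text: 'Add "buy milk" to my todo list' })
// -> { branch: 'text', text: 'Add "buy milk" to my todo list' }
```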

By Akil A
2316

Upload bulk records from CSV - Airtable Interfaces

This workflow is a supporting automation for a common Airtable situation that, as of this writing, has no direct solution despite great demand. Interfaces are your secret weapon for managing a variety of tasks – from sales funnels and task tracking to creating dynamic dashboards. But here's a common situation: how do you efficiently bulk upload records (like contacts, leads, or clients) from an interface with just a click? Once set up, you'll be able to upload CSV files directly to your tables from Interfaces with ease.

Workflow Key Points:
- Bulk Upload Functionality: Say goodbye to the limitations of standard Airtable interfaces. Now you can upload multiple leads or contacts simultaneously, making your work swift and efficient.
- Customizable Fields: Tailor the base to meet your specific data needs. This ensures seamless integration with your existing systems and simplifies data management.

Perfect for teams in e-commerce, CRM, or any sector where managing a high volume of leads or contacts is key. Our Airtable base is designed to eliminate the tediousness of importing contacts. It makes large-scale data management straightforward, saving you precious time and hassle.

Get ready to streamline your operations and boost your productivity! 🚀💡
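
Under the hood, bulk-creating records means batching parsed CSV rows into Airtable's REST API, which accepts at most 10 records per request. A minimal sketch of that step; the base ID, table name, and field names are placeholders, and the template itself does this with n8n nodes rather than custom code:

```javascript
// Push parsed CSV rows into an Airtable table in batches of 10,
// the per-request limit of the Airtable records API.
async function bulkCreateRecords(rows, { baseId, table, token }) {
  for (let i = 0; i < rows.length; i += 10) {
    const batch = rows.slice(i, i + 10).map(row => ({
      fields: { Name: row.name, Email: row.email, Company: row.company },
    }));

    const res = await fetch(`https://api.airtable.com/v0/${baseId}/${encodeURIComponent(table)}`, {
      method: 'POST',
      headers: {
        Authorization: `Bearer ${token}`,
        'Content-Type': 'application/json',
      },
      body: JSON.stringify({ records: batch }),
    });
    if (!res.ok) throw new Error(`Airtable error ${res.status}: ${await res.text()}`);
  }
}

// Example:
// await bulkCreateRecords(parsedCsvRows, { baseId: 'appXXXX', table: 'Contacts', token: process.env.AIRTABLE_TOKEN });
```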

By Sidetool
2075

Transform images with AI prompts using FLUX Kontext, Google Sheets and Drive

This workflow automates the generation of AI-enhanced, contextualized images using FLUX Kontext, based on prompts stored in a Google Sheet. The generated images are then saved to Google Drive, and their URLs are written back to the spreadsheet for easy access.

---

Example prompt: "The girl is lying on the bed and sleeping" (the input and result images are shown on the template page).

---

Perfect for E-commerce and Social Media
This workflow is especially useful for e-commerce businesses:
- Generate product images with dynamic backgrounds based on the use case or season.
- Create contextual marketing visuals for ads, newsletters, or product pages.
- Scale visual content creation without the need for manual design work.

---

How It Works
1. Trigger: The workflow can be started manually (via "Test workflow") or scheduled at regular intervals (e.g., every 5 minutes) using the "Schedule Trigger" node.
2. Data Fetch: The "Get new image" node retrieves a row from a Google Sheet where the "RESULT" column is empty. It extracts the prompt, image URL, output format, and aspect ratio for processing.
3. Image Generation: The "Create Image" node sends a request to the FLUX Kontext API (fal.run) with the provided parameters to generate a new AI-contextualized image.
4. Status Check: The workflow waits 60 seconds ("Wait 60 sec." node) before checking the status of the image generation request via the "Get status" node. If the status is "COMPLETED," it proceeds; otherwise, it loops back to wait.
5. Result Handling: Once completed, the "Get Image Url" node fetches the generated image URL, which is then downloaded ("Get Image File"), uploaded to Google Drive ("Upload Image"), and the Google Sheet is updated with the result ("Update result").

---

Set Up Steps
To configure this workflow, follow these steps:
1. Google Sheet Setup: Create a Google Sheet with columns for PROMPT, IMAGE URL, ASPECT RATIO, OUTPUT FORMAT, and RESULT (leave this empty). Link the sheet in the "Get new image" and "Update result" nodes.
2. API Key Configuration: Sign up at fal.ai to obtain an API key. In the "Create Image" node, set the Header Auth with:
   - Name: Authorization
   - Value: Key YOUR_API_KEY
3. Google Drive Setup: Specify the target folder ID in the "Upload Image" node where generated images will be saved.
4. Schedule Trigger (Optional): Adjust the "Schedule Trigger" node to run the workflow at desired intervals (e.g., every 5 minutes).
5. Test Execution: Run the workflow manually via the "Test workflow" node to verify all steps function correctly.

Once configured, the workflow will automatically process pending prompts, generate images, and update the Google Sheet with results.

---

Need help customizing? Contact me for consulting and support or add me on LinkedIn.
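
The submit-then-poll pattern the template uses maps onto fal.ai's queue API. A rough sketch of that flow; the model path and queue endpoint layout are assumptions drawn from fal.ai's documentation, and only the "Authorization: Key ..." scheme and the "COMPLETED" status come from the template itself:

```javascript
// Submit a FLUX Kontext request to fal.ai's queue and poll until it completes.
// Endpoint layout and model path are assumptions; verify against current fal.ai docs.
const FAL_KEY = process.env.FAL_KEY;
const MODEL = 'fal-ai/flux-pro/kontext'; // hypothetical model path
const headers = { Authorization: `Key ${FAL_KEY}`, 'Content-Type': 'application/json' };

async function generateImage({ prompt, image_url, aspect_ratio, output_format }) {
  // 1. Queue the generation request.
  const submit = await fetch(`https://queue.fal.run/${MODEL}`, {
    method: 'POST',
    headers,
    body: JSON.stringify({ prompt, image_url, aspect_ratio, output_format }),
  });
  const { request_id } = await submit.json();

  // 2. Poll the status endpoint until the job reports COMPLETED
  //    (the template waits 60 s between checks).
  while (true) {
    await new Promise(r => setTimeout(r, 60_000));
    const statusRes = await fetch(`https://queue.fal.run/${MODEL}/requests/${request_id}/status`, { headers });
    const { status } = await statusRes.json();
    if (status === 'COMPLETED') break;
  }

  // 3. Fetch the final result, which contains the generated image URL.
  const result = await fetch(`https://queue.fal.run/${MODEL}/requests/${request_id}`, { headers });
  return (await result.json()).images?.[0]?.url;
}
```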

By Davide
2067

Turn YouTube transcripts into newsletter drafts using Dumpling AI + GPT-4o

🧾 What this workflow does
This workflow turns YouTube video links into ready-to-edit newsletter drafts using Dumpling AI and GPT-4o. It reads new video URLs from a Google Sheet, extracts their transcripts, summarizes them into email-friendly content, and logs the finished draft back into the same sheet. An email notification is also sent to alert the user once each draft is created.

---

👤 Who is this for
- Newsletter writers or marketers repurposing video content
- YouTube creators building email follow-ups from videos
- Agencies or VAs batching social → email content
- Automation users streamlining content workflows

---

⚙️ How to set up

✅ Requirements
- Google Sheet with the following columns:
  - link — YouTube video URL
  - blog post — for saving the generated newsletter draft
- Active accounts for:
  - Dumpling AI (API for YouTube transcripts)
  - OpenAI GPT-4 or GPT-4o
  - Google Sheets
  - Gmail (OAuth2 credential)

---

🔧 Setup steps
1. Connect all credentials using n8n's Credential Manager:
   - Google Sheets (OAuth2)
   - Dumpling AI (via HTTP Header Auth)
   - OpenAI
   - Gmail
2. Update the sheet ID and tab name in both Google Sheets nodes.
3. Customize the GPT-4o prompt (optional):
   - Located in the "GPT-4o: Write Newsletter Draft from Transcript" node
   - You can edit tone, structure, and audience targeting in the system message
4. Verify the email recipient in the Gmail node and update if needed.

---

🧠 How it works
1. The workflow is triggered manually or on a schedule.
2. It pulls YouTube links without drafts from the sheet.
3. Each video's transcript is fetched using Dumpling AI.
4. GPT-4o summarizes the transcript into a clean, friendly newsletter format.
5. The draft is written back to the same row in Google Sheets.
6. An email is sent to notify the user that the draft is ready.

---

🛠️ Customization ideas
- Send finished drafts to Notion or Airtable instead of Sheets
- Generate social media posts from the same transcript
- Add automatic review steps using GPT scoring or editing
- Trigger this on new form submissions or YouTube uploads instead

---

This is a fast, AI-powered way to turn long-form video content into clean, polished newsletters, ready to share or schedule with minimal editing.
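
The summarization step is a single chat-completion call that turns a raw transcript into newsletter copy. A minimal sketch using the openai Node.js SDK; the system prompt below is an illustrative stand-in for the one inside the "GPT-4o: Write Newsletter Draft from Transcript" node:

```javascript
const OpenAI = require('openai');
const openai = new OpenAI(); // reads OPENAI_API_KEY

// Turn a YouTube transcript into an email-friendly newsletter draft.
async function draftNewsletter(transcript) {
  const completion = await openai.chat.completions.create({
    model: 'gpt-4o',
    messages: [
      {
        role: 'system',
        content:
          'You write concise, friendly email newsletters. Summarize the video transcript ' +
          'into a short intro, 3-5 key takeaways, and a one-line call to action.',
      },
      { role: 'user', content: transcript },
    ],
  });
  return completion.choices[0].message.content;
}

// Example: const draft = await draftNewsletter(transcriptFromDumplingAI);
```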

By Yang
2049

Daily GitHub release notification by email

Automates daily notifications of the latest releases from a GitHub repository. This template is ideal for developers and project managers looking to stay up to date with software updates.

How it Works:
- Daily Trigger: The workflow initiates daily using the Schedule Trigger node.
- Fetch Repository Data: The HTTP Request node retrieves the latest release details from the specified GitHub repository.
- Check if new: The IF node checks whether the release was published in the last 24 hours.
- Split Content: The Split Out node processes the JSON response to extract and structure relevant data.
- Convert Markdown: The Markdown node converts release notes from Markdown format to HTML, making them ready to use in emails.
- Send a notification by email.

Key Features:
- Simple to customize by modifying the GitHub URL.
- Automatically processes and formats release notes for better readability.
- Modular design, allowing integration with other workflows like Gmail or Slack notifications.

Setup Steps:
- Modify Repository URL: Update the Sticky Note node with the URL of the repository you want to monitor.
- Modify SMTP details: Update the Send Email node with your SMTP details.
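
Fetching the latest release and performing the 24-hour check map onto one call to GitHub's public REST API. A minimal sketch of that logic; the owner/repo pair is a placeholder, and in the template the request lives in an HTTP Request node and the comparison in an IF node:

```javascript
// Fetch the latest release of a repository and report whether it was
// published within the last 24 hours.
async function checkLatestRelease(owner, repo) {
  const res = await fetch(`https://api.github.com/repos/${owner}/${repo}/releases/latest`, {
    headers: { Accept: 'application/vnd.github+json' },
  });
  if (!res.ok) throw new Error(`GitHub API error: ${res.status}`);
  const release = await res.json();

  const publishedAt = new Date(release.published_at);
  const isNew = Date.now() - publishedAt.getTime() < 24 * 60 * 60 * 1000;

  return {
    isNew,
    tag: release.tag_name,
    name: release.name,
    notesMarkdown: release.body, // converted to HTML by the Markdown node
    url: release.html_url,
  };
}

// Example: const info = await checkLatestRelease('n8n-io', 'n8n');
```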

By Dionysus
1980

Spotify: discover weekly archive

This updated workflow will automatically archive your Spotify Discover Weekly tracks to another manually created playlist, without the nuisance of duplicate tracks. It utilizes the latest versions of n8n's Schedule Trigger, Spotify, Switch, Merge, and IF nodes. Special thanks to trey for their original version of the workflow, as well as ihortom for their help with navigating the Switch node's outputs.

To use this workflow, you'll need to:
- Create a playlist for use as the archive playlist within your Spotify account
- Create and select your Spotify credentials within each Spotify node in the workflow

See the workflow README for additional information and optional setup steps.
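
The duplicate check amounts to comparing the Discover Weekly track URIs against what is already in the archive playlist and adding only the difference. A minimal sketch of that set-difference step; the track objects follow the shape Spotify's playlist API returns, and the surrounding API calls are handled by the Spotify nodes:

```javascript
// Given the items of the Discover Weekly playlist and the archive playlist,
// return only the track URIs that are not yet archived.
function tracksToArchive(discoverWeeklyItems, archiveItems) {
  const archived = new Set(archiveItems.map(item => item.track.uri));
  return discoverWeeklyItems
    .map(item => item.track.uri)
    .filter(uri => !archived.has(uri));
}

// Example with Spotify-style playlist items:
// tracksToArchive(
//   [{ track: { uri: 'spotify:track:AAA' } }, { track: { uri: 'spotify:track:BBB' } }],
//   [{ track: { uri: 'spotify:track:BBB' } }]
// ) -> ['spotify:track:AAA']
```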

By JYLN
1979