Analyze any website with OpenAI and get an on-page SEO audit
Instantly Find & Fix What’s Holding Your Page Back

You’ve put in the work. Your content is strong. Your design is polished. But…
❌ Your page isn’t ranking where it should.
❌ Your competitors are outranking you, even with weaker content.
❌ You have no idea what’s wrong, or how to fix it.

The truth? SEO isn’t just about keywords. Your technical setup, content structure, and on-page elements must work together seamlessly. And if anything is off, Google won’t rank your page.

Who Is This For?
SaaS Founders & Startups – Get higher rankings & organic traffic that converts.
Marketing Teams & Agencies – Audit & optimize pages in seconds.
E-commerce & Content Sites – Improve rankings for product pages, blogs, and landing pages.

How It Works
Paste your URL. Get an instant audit plus a list of recommendations. Implement the changes and watch your rankings climb. The workflow scrapes the URL you input, fetches the HTML source code of the landing page, and sends it to an OpenAI AI Agent. The agent performs a deep analysis, audits the technical and content SEO of the page, and provides 10 recommendations to improve your SEO.

Setup Guide
You will need OpenAI credentials with an API key to run the workflow. The workflow uses the OpenAI o1 model to deliver the best results; it costs between $0.20 and $0.30 per run. You can adjust the prompt to your needs in the AI Agent parameters. Once the audit has completed, the workflow sends an email (don’t forget to add your email address here). Below is an example of what you can expect.
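To make the "scrape → analyze" step concrete, here is a minimal sketch of the kind of on-page signals the agent receives from the fetched HTML. This is illustrative only: in the actual workflow an HTTP Request node fetches the page and an OpenAI node performs the audit; the function name and regex-based extraction are assumptions, not n8n APIs.

```javascript
// Illustrative helper: pull a few on-page SEO signals out of raw HTML.
// In the real workflow this role is played by the scrape + AI Agent nodes.
function extractOnPageSignals(html) {
  const pick = (re) => {
    const m = html.match(re);
    return m ? m[1].trim() : null;
  };
  return {
    title: pick(/<title[^>]*>([\s\S]*?)<\/title>/i),
    metaDescription: pick(/<meta[^>]+name=["']description["'][^>]+content=["']([^"']*)["']/i),
    h1Count: (html.match(/<h1[\s>]/gi) || []).length, // pages should usually have exactly one <h1>
  };
}
```

A Code node like this could run before the AI call to give the agent structured facts alongside the raw HTML.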
Search & summarize web data with Perplexity, Gemini AI & Bright Data to webhooks
Who this is for
This workflow is designed for professionals and teams who need real-time, structured insights from Perplexity search results without manual effort.

What problem is this workflow solving?
This n8n workflow automates Perplexity search result extraction, cleanup, summarization, and AI-enhanced formatting for downstream use, such as sending the results to a webhook or another system.

What this workflow does
Automates Perplexity search via Bright Data: uses Bright Data’s proxy-based SERP API to run a Google Search query programmatically, making the process repeatable and scriptable with different search terms and regions/zones.
Cleans and extracts useful content: the Readable Data Extractor uses LLM-based cleaning to remove HTML/CSS/JS from the response and extract pure text, converting messy, unstructured web content into a structured, machine-readable format.
Summarizes search results: through the Gemini Flash + Summarization Chain, it generates a concise summary of the search results, ideal for users who don’t have time to read full pages of results.
Formats data using an AI Agent: the AI Agent acts like a virtual assistant that understands the search results, formats them in a readable, JSON-compatible form, and prepares them for webhook delivery.
Delivers results to a webhook: sends the final summary plus the structured search results to a webhook (your app, a Slack bot, Google Sheets, or a CRM).

Setup
Sign up at Bright Data. Navigate to Proxies & Scraping and create a new Web Unlocker zone by selecting Web Unlocker API under Scraping Solutions. In n8n, configure the Header Auth account under Credentials (Generic Auth Type: Header Authentication); set the Value field to Bearer XXXXXXXXXXXXXX, replacing XXXXXXXXXXXXXX with your Web Unlocker token. In n8n, configure the Google Gemini (PaLM) API account with your Google Gemini API key (or access through Vertex AI or a proxy).
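The Header Auth setup above boils down to sending the zone and token with each request. Here is a hedged sketch of the request shape the HTTP Request node builds; the endpoint URL, zone name, and field names are placeholders/assumptions, so verify them against your own Bright Data dashboard.

```javascript
// Sketch of the Bright Data Web Unlocker request sent by the HTTP Request node.
// Endpoint, zone, and body fields are assumptions; check your Bright Data account.
function buildSearchRequest(query, zone, token) {
  return {
    method: 'POST',
    url: 'https://api.brightdata.com/request', // assumed endpoint
    headers: {
      Authorization: `Bearer ${token}`, // the Header Auth credential value from Setup
      'Content-Type': 'application/json',
    },
    body: {
      zone, // your Web Unlocker zone name
      url: `https://www.perplexity.ai/search?q=${encodeURIComponent(query)}`,
      format: 'raw',
    },
  };
}
```

Keeping the token in an n8n credential (rather than hardcoding it as below) is the recommended pattern.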
Update the Perplexity Search Request node with the prompt for the search you wish to perform. Update the Webhook HTTP Request node with the webhook endpoint of your choice.

How to customize this workflow to your needs
Change the Perplexity search input. Default: it searches a fixed query or dataset. Customize: accept input from a Google Sheet, Airtable, or a form; auto-trigger searches based on keywords or schedules.
Customize the summarization style (LLM output). Default: general summary using Google Gemini or OpenAI. Customize: add a tone (formal, casual, technical, executive summary, etc.); focus on specific sections (pricing, competitors, FAQs, etc.); translate the summaries into multiple languages; add bullet points, pros/cons, or insight tags.
Choose where the results go. Options: email, Slack, Notion, Airtable, Google Docs, or a dashboard; auto-create content drafts for WordPress or newsletters; feed into CRM notes or attach to Salesforce leads.
Send financial metrics monthly to Mattermost
No description available.
Auto-send welcome messages to Vtiger CRM leads via Evolution API WhatsApp
One of My Best! Send WhatsApp Greetings to New Vtiger Leads Automatically 💪😍

⚠️ This Workflow Requires Community Nodes and a Self-Hosted n8n Instance
> This template uses two custom community nodes:
> * n8n-nodes-vtiger-crm
> * n8n-nodes-evolution-api
> You must be running self-hosted n8n with Community Nodes enabled.

🔧 How to Install Community Nodes
Go to Settings → Community Nodes, click Install Node, and add:
* n8n-nodes-vtiger-crm
* n8n-nodes-evolution-api
Restart n8n if prompted.

---

💬 Auto-Send WhatsApp Welcome Messages to New Leads in Vtiger CRM

Overview
This workflow sends a personalized WhatsApp welcome message to newly created leads in Vtiger CRM, using Evolution API, and updates the CRM record to ensure the message isn’t sent again. It’s ideal for teams that want to greet new leads instantly, reduce manual effort, and automate the first touchpoint of the sales process.

---

🔄 What This Workflow Does
⏱ Runs every minute via a schedule trigger
📥 Fetches the latest uncontacted lead from Vtiger (cf_1090 != 1)
💬 Sends a personalized WhatsApp message using Evolution API
✅ Marks the lead as “messaged” by updating a custom field

---

📸 Visual Preview
🧩 Workflow Canvas
> Full layout of the automation flow in n8n
💬 Evolution API Server
> Example of the greeting message the lead receives:
Hi Ahmed Saadawi 😊, We have received your interest in our services and we will contact you soon. Have a nice day 🙏💐

---

🛠️ Setup Instructions
Vtiger CRM Setup
Add a custom field (e.g. cf_1090) to track whether a message was already sent.
Ensure lead records contain: firstname, lastname, phone.
Connect your Vtiger CRM API credentials.
Evolution API Setup
Install or connect to your Evolution API instance.
Configure: instanceName, remoteJid (from the Vtiger lead phone), and the message template (edit as needed).
Add your Evolution API credentials.
Customize Message
Edit the message in the Evolution node to match your brand’s tone.

---

👥 Who Is This For?
Sales teams needing instant CRM-to-WhatsApp follow-ups Companies automating first contact with leads Vtiger CRM users looking for WhatsApp engagement tools --- 🔐 Credentials Required ✅ Vtiger CRM API ✅ Evolution API (self-hosted or SaaS) --- 🏷 Tags vtiger, whatsapp automation, evolution api, crm follow-up, sales automation, welcome message, crm whatsapp integration, lead onboarding, no-code automation, n8n template, self-hosted n8n, vtiger crm automation, community nodes, whatsapp message workflow
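The personalization step described in the setup can be sketched as a small Code-node helper that builds the greeting text and the recipient JID from a Vtiger lead record. The field names (firstname, lastname, phone) match the setup notes above; the `<number>@s.whatsapp.net` JID format is Evolution API’s usual convention, but verify it against your own instance.

```javascript
// Sketch: build the WhatsApp greeting + remoteJid from a Vtiger lead.
// Field names come from the setup instructions; the jid format is an assumption.
function buildGreeting(lead) {
  const digits = String(lead.phone).replace(/\D/g, ''); // keep only digits for the jid
  return {
    remoteJid: `${digits}@s.whatsapp.net`,
    text: `Hi ${lead.firstname} ${lead.lastname} 😊,\nWe have received your interest in our services and we will contact you soon.\nHave a nice day 🙏💐`,
  };
}
```

The returned object maps directly onto the Evolution API node’s message parameters.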
Database alerts with Notion and SIGNL4
Objective
In industry and production, machine data is sometimes available in databases. That might be sensor data like temperature or pressure, or just binary information. This sample flow reads machine data and sends an alert to your SIGNL4 team when the machine is down. When the machine is up again, the alert in SIGNL4 is closed automatically.

Setup
We simulate the machine data using a Notion table. When we un-check the Up box, we simulate a machine-down event. At set intervals, n8n checks the database for down items. If such an item is found, an alert is sent using SIGNL4 and the item in Notion is updated (so it is not read again). Status updates from SIGNL4 (acknowledgement, close, annotation, escalation, etc.) are received via webhook, and we update the Notion item accordingly. This is how the alert looks in the SIGNL4 app. The flow can easily be adapted to other database monitoring scenarios.
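The “check for down items, but don’t alert twice” logic can be sketched as a simple filter over the rows read from the Notion table. The property names (Up, Alerted) mirror the sample setup described above and are assumptions about your own table schema.

```javascript
// Sketch: from rows read out of the Notion table, keep only machines that are
// down (Up un-checked) and not yet alerted. Property names are assumptions.
function findNewDownItems(rows) {
  return rows.filter((r) => r.Up === false && r.Alerted !== true);
}
```

After an alert is sent, the workflow would set `Alerted` back in Notion so the row is excluded from the next interval’s check.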
Monitor and download changed files from Google Drive automatically
Description
This workflow automates the download of new or updated files from a Google Drive folder, processing only files changed since the last run using a timestamp control file.

How It Works
Triggered on a schedule. Checks for an n8nlastrun.txt file in your Google Drive to read when the workflow last ran; if the file is missing, it defaults to processing changes from the last 24 hours. Searches for new or modified files in your specified folder. Downloads the new/changed files. Replaces the timestamp file with the current time for future runs.

Setup Steps
Set up your Google Drive credentials in n8n. Find the Folder ID of the Google Drive folder you wish to monitor. Edit all Google Drive nodes: select your credentials and paste the Folder ID. Adjust the schedule trigger if needed. Activate the workflow.

Features
No duplicate file processing (idempotent). Handles missing timestamp files. Clear logical sticky notes in the editor. Modular, extendable design.

Prerequisites
Google Drive API credentials connected to n8n. Target Google Drive folder accessible by the credentials.
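The timestamp-control logic described above can be sketched as a single function: use the contents of n8nlastrun.txt if the file was found and parses as a date, otherwise fall back to 24 hours before “now”. The function name and the ISO-string return format (suitable for a Drive `modifiedTime > X` search filter) are illustrative assumptions.

```javascript
// Sketch: resolve the "changed since" cutoff for the Drive search.
// lastRunFileContents is the text of n8nlastrun.txt, or null if missing.
function resolveSince(lastRunFileContents, now = new Date()) {
  const parsed = lastRunFileContents ? new Date(lastRunFileContents.trim()) : null;
  if (parsed && !isNaN(parsed.getTime())) return parsed.toISOString();
  // Missing or unparseable file: default to the last 24 hours.
  return new Date(now.getTime() - 24 * 60 * 60 * 1000).toISOString();
}
```

At the end of a successful run, the workflow writes `new Date().toISOString()` back into the file, which is what makes re-runs idempotent.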
Reddit Sentiment Analysis for Apple WWDC25 with Gemini AI and Google Sheets
This workflow automates sentiment analysis of Reddit posts related to Apple's WWDC25 event. It extracts data, categorizes posts, analyzes the sentiment of comments, and updates a Google Sheet with the results.

Prerequisites
Bright Data account: you need a Bright Data account to scrape Reddit data; ensure you have the correct permissions to use their API. https://brightdata.com/
Google Sheets API credentials: enable the Google Sheets API in your Google Cloud project and create credentials (OAuth 2.0 Client IDs).
Google Gemini API credentials: you need a Gemini API key to run the sentiment analysis; ensure you have the correct permissions to use the API. https://ai.google.dev/ You can use any other model of your choice.

Setup
Import the workflow: import the provided JSON workflow into your n8n instance.
Configure Bright Data credentials: in the 'scrap reddit' and 'get status' nodes, find the Authorization field under Header Parameters and replace Bearer 1234 with your Bright Data API key. Apply this to every node that uses your Bright Data API key.
Set up the Google Sheets API credentials: in the 'Append Sentiments' node, connect your Google Sheets account through OAuth 2.0 credentials.
Configure the Google Gemini credentials: in the 'Sentiment Analysis per comment' node, connect your Google AI account through the API credentials.
Configure additional parameters: in the 'scrap reddit' node, modify the JSON body to adjust the search term, date, or sort method. In the 'Wait' node, alter the 'Amount' to adjust the polling interval for scraping status (15 seconds by default). In the 'Text Classifier' node, customize the categories and descriptions to suit your sentiment analysis needs; review categories such as 'WWDC events' to ensure relevancy. In the 'Sentiment Analysis per comment' node, modify the system prompt template to improve context.
Customization options
Bright Data API parameters to adjust scraping behavior. Wait node duration to optimize polling. Text Classifier categories and descriptions. Sentiment Analysis system prompt.

Use case examples
Brand monitoring: track public sentiment towards Apple during and after the WWDC25 event. Product feedback analysis: gather insights into user reactions to new product announcements. Competitive analysis: compare sentiment towards Apple's announcements versus competitors'. Event impact assessment: measure the overall impact of the WWDC25 event on various aspects of Apple's business.

Target audiences
Marketing professionals in the tech industry, brand managers, product managers, market research analysts, social media managers.

Troubleshooting
Workflow fails to start: check that all necessary credentials (Bright Data and Google Sheets API) are correctly configured and that the Bright Data API key is valid.
Data scraping fails: verify the Bright Data API key, ensure the dataset ID is correct, and inspect the Bright Data dashboard for any issues with the scraping job.
Sentiment analysis is inaccurate: refine the categories and descriptions in the 'Text Classifier' node, and check that you have the correct Google Gemini API key, as the original is a placeholder.
Google Sheets are not updating: ensure the Google Sheets API credentials have permission to write to the specified spreadsheet and sheet, and check API usage limits.
Workflow does not produce the correct output: inspect the data connections by clicking on them to see which data is being produced, and check all formulas for errors.

Happy productivity!
Remote job updates pipeline with RemoteOK, Airtable, and Telegram
🚀 Remote Job Automation Workflow Automatically fetch, clean, and broadcast the latest remote job listings — powered by RemoteOK, Airtable, and Telegram. 🔧 Key Features Seamless Data Fetching: Pulls the latest job listings from the RemoteOK API using an HTTP Request node. Smart Data Processing (via Code Node): Filters out irrelevant metadata Cleans and sanitizes job descriptions (e.g., HTML tags, special characters) Handles malformed or encoded text gracefully Extracts and formats salary ranges for clarity Airtable Integration (Upsert): Stores each job as a unique record using job ID Avoids duplication through conditional upserts using Airtable's Personal Access Token Telegram Bot Broadcasting: Automatically formats and sends job posts to a Telegram group or channel Keeps your community or team updated in real-time 📦 Tech Stack RemoteOK API – source of curated remote job listings Airtable – lightweight, accessible job database Telegram Bot API – for real-time job notifications n8n – workflow automation engine to tie everything together 🔁 Workflow Overview Fetch Jobs from RemoteOK API Clean & Normalize job descriptions and metadata Extract Salary ranges and standardize them Upsert to Airtable (avoiding duplicates) Format Post for visual clarity Send to Telegram via bot integration 🧠 Perfect For Remote job boards or aggregators Recruitment agencies/startups Developers building personal job feeds Communities or channels sharing curated remote opportunities Automating newsletters or job digests ✅ Benefits Near real-time updates Minimal maintenance Full control and extensibility with n8n
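The Code node’s cleanup pass described above (strip HTML, decode entities, format salaries) can be sketched as follows. The field names (id, position, company, description, salary_min, salary_max) follow RemoteOK’s public API shape but should be treated as assumptions and checked against a live response.

```javascript
// Sketch of the Smart Data Processing step for one RemoteOK job object.
function cleanJob(job) {
  const stripHtml = (s) => String(s || '')
    .replace(/<[^>]*>/g, ' ')   // drop HTML tags
    .replace(/&amp;/g, '&')     // decode a common entity
    .replace(/&nbsp;/g, ' ')
    .replace(/\s+/g, ' ')       // collapse whitespace left behind
    .trim();
  const salary = (job.salary_min && job.salary_max)
    ? `$${job.salary_min}-$${job.salary_max}`
    : 'Not listed';
  return {
    id: job.id,                 // used as the Airtable upsert key
    position: job.position,
    company: job.company,
    description: stripHtml(job.description),
    salary,
  };
}
```

Using `id` as the upsert key in Airtable is what prevents the duplicate records mentioned above.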
Analyze trending cryptocurrencies from CoinMarketCap with GPT-4o-mini for Discord
What It Does
Stop chasing the market; let the market come to you. This Done-For-You AI Crypto Bot is a fully configured n8n workflow that scrapes CoinMarketCap for trending cryptocurrencies, analyzes them with a cutting-edge OpenAI model (GPT-4o-mini), and delivers concise, actionable insights directly to your Discord channel. Forget tedious manual research and complex setups; this system is ready for instant deployment, giving you and your community an unfair advantage by providing daily, automated crypto trend intelligence without lifting a finger. It’s the ultimate shortcut to staying ahead in the fast-paced crypto world with a pre-built crypto automation solution.

---

⚙️ Key Features
⏰ Automated Daily Crypto Updates: pre-scheduled to run multiple times a day, ensuring you never miss a trending coin.
🧠 AI-Powered Market Analysis: leverages GPT-4o-mini to distill complex data into digestible, insightful summaries.
💬 Seamless Discord Integration: delivers beautifully formatted, Markdown-compatible messages directly to your chosen channel.
⚡ Zero-Setup n8n Workflow: simply import the JSON, plug in your API keys, and go live within minutes.
📈 Actionable Insights: provides ticker, price, market cap, volume, and direct links for quick research and trading decisions.

---

😩 Pain Points Solved
Tired of missing crucial crypto market trends and potential opportunities? Wasting countless hours on manual research and data aggregation from multiple sources? Struggling to provide timely, concise, and professional crypto insights to your community or personal trading strategy? Frustrated by the complexity and time investment of setting up custom AI and automation workflows from scratch? Need a reliable, hands-off solution to stay informed and competitive in the volatile cryptocurrency landscape?
--- 📦 What’s Included Fully configured n8n workflow JSON file (ready to import) Pre-optimized AI prompt for expert crypto analysis Step-by-step setup guide for API keys and Discord integration Lifetime access to workflow updates --- 🚀 Call to Action Get your AI Crypto Bot live today. Automate insights, dominate trends. --- 🏷️ Optimized Tags done for you crypto bot, n8n workflow, ai crypto analysis, discord bot, trending coins, coinmarketcap automation, crypto insights, market intelligence, ready made solution, pre built automation, digital product, crypto trading tool, passive income bot
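The Discord delivery step described above can be sketched as a small formatting function: one trending coin in, one Markdown message out, capped at Discord’s 2000-character message limit. The coin field names here are assumptions about what the CoinMarketCap scrape produces, not a documented schema.

```javascript
// Sketch: format one trending coin as a Discord Markdown message.
// Field names (name, ticker, price, marketCap, volume, slug) are assumptions.
function formatCoinMessage(coin) {
  const msg = [
    `**${coin.name} (${coin.ticker})**`,
    `Price: $${coin.price}`,
    `Market cap: ${coin.marketCap} | 24h volume: ${coin.volume}`,
    `<https://coinmarketcap.com/currencies/${coin.slug}/>`, // <> suppresses Discord link embeds
  ].join('\n');
  return msg.slice(0, 2000); // Discord rejects messages over 2000 characters
}
```

In the workflow, the GPT-4o-mini summary would be appended to (or replace parts of) this template before sending.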
🎓 Learn the Evaluation tool. Tutorial for beginners with Gemini and Google Sheets
This workflow is a beginner-friendly tutorial demonstrating how to use the Evaluation tool to automatically score the AI’s output against a known correct answer (“ground truth”) stored in a Google Sheet.

---

Advantages
✅ Beginner-friendly – Provides a simple and clear structure to understand AI evaluation.
✅ Flexible input sources – Works with both Google Sheets datasets and manual test entries.
✅ Integrated with Google Gemini – Leverages a powerful AI model for text-based tasks.
✅ Tool usage – Demonstrates how an AI agent can call external tools (e.g., calculator) for accurate answers.
✅ Automated evaluation – Outputs are automatically compared against ground truth data for factual correctness.
✅ Scalable testing – Can handle multiple dataset rows, making it useful for structured AI model evaluation.
✅ Result tracking – Saves both answers and correctness scores back to Google Sheets for easy monitoring.

---

How it Works
The workflow operates in two distinct modes, determined by the trigger:

Manual Test Mode: Triggered by "When clicking 'Execute workflow'". It sends a fixed question ("How much is 8 * 3?") to the AI agent and returns the answer to the user. This mode is for quick, ad-hoc testing.

Evaluation Mode: Triggered by "When fetching a dataset row". This mode reads rows of data from a linked Google Sheet. Each row contains an input (a question) and an expected_output (the correct answer). It processes each row as follows:
The input question is sent to the AI Agent node.
The AI Agent, powered by a Google Gemini model and equipped with a Calculator tool, processes the question and generates an answer (output).
The workflow then checks whether it is in evaluation mode. Instead of just returning the answer, it passes the AI's actual_output and the sheet's expected_output to another Evaluation node.
This node uses a second Google Gemini model as a "judge" to evaluate the factual correctness of the AI's answer compared to the expected one, generating a Correctness score on a scale from 1 to 5. Finally, both the AI's actual_output and the automated correctness score are written back to a new column in the same row of the Google Sheet.

---

Set up Steps
To use this workflow, you need to complete the following setup steps:

Credentials Configuration:
Set up the Google Sheets OAuth2 API credentials (named "Google Sheets account"). This allows n8n to read from and write to your Google Sheet.
Set up the Google Gemini (PaLM) API credentials (named "Google Gemini(PaLM) (Eure)"). This provides the AI language model capabilities for both the agent and the evaluator.

Prepare Your Google Sheet:
The workflow is pre-configured to use a specific Google Sheet. You must clone the provided template sheet (the URL is in the Sticky Note) to your own Google Drive.
In your cloned sheet, ensure you have at least two columns: one for the input/question (e.g., input) and one for the expected correct answer (e.g., expected_output). You may need to update the node parameters that reference $json.input and $json.expected_output to match your column names exactly.

Update Document IDs:
After cloning the sheet, get its new Document ID from its URL and update the documentId field in all three Evaluation nodes ("When fetching a dataset row", "Set output Evaluation", and "Set correctness") to point to your new sheet instead of the original template.

Activate the Workflow:
Once the credentials and sheet are configured, toggle the workflow to Active. You can then trigger a manual test run or set the "When fetching a dataset row" node to poll your sheet automatically to evaluate all rows.

---

Need help customizing? Contact me for consulting and support, or add me on LinkedIn.
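To make the scoring idea concrete, here is a minimal sketch of a pre-check that a Code node could run before the LLM judge: a trivial normalized exact match earns the top score of 5 without spending a model call, and everything else is deferred to the Gemini judge for a 1-to-5 rating. The normalization rules and function name are assumptions for illustration; the workflow itself relies on the LLM judge.

```javascript
// Sketch: cheap correctness pre-check before calling the LLM judge.
// Returns 5 on a normalized exact match, or null to defer to the judge.
function quickCorrectness(actualOutput, expectedOutput) {
  const norm = (s) => String(s).toLowerCase().replace(/[^a-z0-9.]/g, '');
  if (norm(actualOutput) === norm(expectedOutput)) return 5;
  return null; // not obviously identical: let the Gemini judge score 1-5
}
```

This kind of shortcut is useful when many dataset rows have short numeric answers like "24".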
Generate custom text content with IBM Granite 3.3 8B instruct AI
Generate Custom Text Content with IBM Granite 3.3 8B Instruct AI

This workflow connects to Replicate’s API and uses the ibm-granite/granite-3.3-8b-instruct model to generate text.

---

✅ 🔵 SECTION 1: Trigger & Setup

⚙️ Nodes
1️⃣ On clicking 'execute'
What it does: Starts the workflow manually when you hit Execute.
Why it’s useful: Perfect for testing text generation on demand.
2️⃣ Set API Key
What it does: Stores your Replicate API key securely.
Why it’s useful: You don’t hardcode credentials into HTTP nodes; just set the key once here.
Beginner tip: Replace YOUR_REPLICATE_API_KEY with your actual API key.

---

💡 Beginner Benefit
✅ No coding needed to handle authentication.
✅ You can reuse the same setup for other Replicate models.

---

✅ 🤖 SECTION 2: Model Request & Polling

⚙️ Nodes
3️⃣ Create Prediction (HTTP Request)
What it does: Sends a POST request to Replicate’s API to start a text generation job. Parameters include: temperature, max_tokens, top_k, top_p.
Why it’s useful: Controls how creative or focused the AI text output will be.
4️⃣ Extract Prediction ID (Code)
What it does: Pulls the prediction ID and builds a URL for checking status.
Why it’s useful: Replicate jobs run asynchronously, so you need the ID to track progress.
5️⃣ Wait
What it does: Pauses for 2 seconds before checking the prediction again.
Why it’s useful: Prevents spamming the API with too many requests.
6️⃣ Check Prediction Status (HTTP Request)
What it does: Polls the Replicate API for the current status (e.g., starting, processing, succeeded).
Why it’s useful: Lets you loop until the AI finishes generating text.
7️⃣ Check If Complete (IF Condition)
What it does: If the status is succeeded, it goes to “Process Result.” Otherwise, it loops back to Wait and retries.
Why it’s useful: Creates an automated polling loop without writing complex code.

---

💡 Beginner Benefit
✅ No need to manually refresh or check job status.
✅ Workflow keeps retrying until text is ready.
✅ Smart looping built in with Wait + If Condition.

---

✅ 🟢 SECTION 3: Process & Output

⚙️ Nodes
8️⃣ Process Result (Code)
What it does: Collects the final AI output, status, metrics, and timestamps. Adds info like:
✅ output → Generated text
✅ model → ibm-granite/granite-3.3-8b-instruct
✅ metrics → Performance data
Why it’s useful: Gives you a neat, structured JSON result that’s easy to send to Sheets, Notion, or any app.

---

💡 Beginner Benefit
✅ Ready-to-use text output.
✅ Easy integration with any database or CRM.
✅ Transparent metrics (when it started, when it finished, etc.).

---

✅✅✅ ✨ FULL FLOW OVERVIEW

| Section | What happens |
| --- | --- |
| ⚡ Trigger & Setup | Start workflow + set Replicate API key. |
| 🤖 Model Request & Polling | Send request → get Prediction ID → loop until job completes. |
| 🟢 Process & Output | Extract clean AI-generated text + metadata for storage or further workflows. |

---

📌 How You Benefit Overall
✅ No coding needed; just configure your API key.
✅ Reliable polling; the workflow waits until results are ready.
✅ Flexible; you can extend the output to Google Sheets, Slack, Notion, or email.
✅ Beginner-friendly; clean separation of input, process, and output.

---

✨ With this workflow, you’ve turned Replicate’s IBM Granite LLM into a no-code text generator running entirely inside n8n! ✨

---
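The polling loop from Section 2 can be sketched in a few lines: extract the prediction ID, build the status URL, and decide whether to loop back to the Wait node. The prediction response shape (id, status) and terminal-state names follow Replicate’s REST API; the helper names themselves are illustrative, not the actual node code.

```javascript
// Sketch of the "Extract Prediction ID" + "Check If Complete" logic.
// Builds the status-polling URL from the create-prediction response.
function predictionStatusUrl(createResponse) {
  return `https://api.replicate.com/v1/predictions/${createResponse.id}`;
}

// succeeded/failed/canceled are terminal; anything else loops back to Wait.
function shouldKeepPolling(statusResponse) {
  return !['succeeded', 'failed', 'canceled'].includes(statusResponse.status);
}
```

Note that the IF node should also treat failed/canceled as terminal (as above), otherwise a failed prediction would poll forever.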
Auto-extract & distribute video clips to multiple social platforms with Klap AI
---

Overview of the Workflow
The automation process consists of four main steps:
Get Longform: Retrieve the long-form video data (e.g., from Google Sheets).
Analyze Longform: Use Klap to analyze the video and generate short clips.
Produce Shorts: Export the generated clips.
Publish Shorts: Update the status in Google Sheets and publish the clips to social media platforms.
Each step is handled by specific nodes in n8n, a no-code automation tool, making the entire process accessible even if you’re not tech-savvy. The workflow is visually represented in the provided n8n screenshot, with nodes connected to show the flow of data and actions.

---

Step 1: Get Longform
Purpose: Start the automation and retrieve the long-form video data.
Tips: Test the node with a sample row to ensure it retrieves the correct data. Use a consistent sheet structure to avoid errors in future runs.
Why It Matters: This step ensures the automation starts automatically and pulls the correct video for processing, saving you from manual intervention.

---

Step 2: Analyze Longform
Purpose: Use Klap to analyze the long-form video and generate short clips.
Tips: Pin the Get Shorts Details node: right-click and pin it to retain data for testing across sessions. Test with a sample video: run the workflow with a short video to verify Klap’s output.
Why It Matters: Klap’s AI identifies key moments and generates clips, saving hours of manual editing. The wait and status nodes ensure the workflow progresses only when ready.

---

Step 3: Produce Shorts
Purpose: Export the generated clips for publishing.
Tips: Preview the clips after export to ensure quality. Adjust wait times based on the export duration observed during testing.
Why It Matters: This step finalizes the clips, making them ready for publishing, with wait nodes preventing premature progression.

---

Step 4: Publish Shorts
Purpose: Update the video’s status in Google Sheets and publish the clips to social media.
Tips: Add an If node: before updating, check if the status is already "done" to skip processed videos. Organize clips: use Google Sheets columns (e.g., "TikTok," "YouTube") to assign clips to platforms.
Why It Matters: This step automates publishing across multiple platforms and keeps your workflow organized by updating statuses.

---

Additional Tips for Efficiency
No-Code Simplicity: n8n’s drag-and-drop interface requires no coding; adjust nodes visually to suit your needs.
Handle Processing Times: Use wait nodes to manage delays in the analysis and export steps.
Monetization Ideas: Offer this automation as a service to businesses or creators. Submit clips to platforms like Whop for earnings based on views.
Testing: Run the workflow with a sample video, pinning nodes to retain data for debugging.

---

Benefits of AI Automation
Time Savings: Automate clipping and publishing, freeing you for creative tasks.
Scalability: Produce 100+ shorts from one video, boosting reach.
Consistency: Maintain a regular posting schedule effortlessly.
Cost-Effective: Reduce reliance on manual editing or expensive tools.

---

This workflow leverages n8n and Klap to streamline short-form content creation, making it ideal for content creators looking to maximize their long-form videos. If you need further clarification or help with specific nodes, let me know!
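The suggested If-node check (skip rows whose status is already "done") can be sketched as a one-line filter. The `status` column name is an assumption about your own Google Sheet layout.

```javascript
// Sketch: keep only sheet rows that still need publishing.
// The "status" column name is an assumption about your sheet schema.
function rowsToPublish(rows) {
  return rows.filter((r) => String(r.status || '').toLowerCase() !== 'done');
}
```

After a clip is published, the workflow writes "done" back to the row, so the next scheduled run skips it.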