Summarize SERPBear data with AI (via Openrouter) and save it to Baserow
Who's this for?
- If you own a website and need to analyze your keyword rankings
- If you need to create a keyword report on your rankings
- If you want to grow your keyword positions
SerpBear is an open-source SEO tool focused on keyword rank tracking and analytics.
Click here to watch the YouTube tutorial.
Example AI output
**Key Observations about Ranking Performance:**
- The top-performing keyword is “Openrouter N8N” with a current position of 7 and an improving trend.
- Two keywords, “Best Docker Synology” and “Bitwarden Synology”, are not ranking in the top 100 and have a stable trend.
- Four keywords, “Obsidian Second Brain”, “AI Generated Reference Letter”, “Actual Budget Synology”, and “N8N Workflow Generator”, are not ranking well and have a declining trend.
**Keywords showing the most improvement:**
- “Openrouter N8N” has an improving trend and a relatively high ranking of 7.
**Keywords needing attention:**
- “Obsidian Second Brain” has a declining trend and a low ranking of 69.
- “AI Generated Reference Letter” has a declining trend and a low ranking of 84.
- “Actual Budget Synology”, “N8N Workflow Generator”, “Best Docker Synology”, and “Bitwarden Synology” are not ranking in the top 100.
Use case
Instead of hiring an SEO expert, I run this report weekly. It checks the keyword rankings of the past week and gives me recommendations on what to improve.
How it works
The workflow gathers SerpBear keyword analytics for the past 7 days, passes the data to OpenRouter.ai for AI analysis, and finally saves the results to Baserow.
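To make the flow concrete, here is a minimal sketch of what the prompt-building Code node could look like. The SerpBear field names (`keyword`, `position`, `trend`) are assumptions for illustration, not the template's actual fields; adapt them to whatever your SerpBear API response returns.

```javascript
// n8n Code node (JavaScript): minimal sketch of the prompt-building step. Field names are assumed.
const keywords = $input.all().map(item => ({
  keyword: item.json.keyword,   // assumed field from the SerpBear response
  position: item.json.position, // assumed field
  trend: item.json.trend,       // assumed field, e.g. improving / stable / declining
}));

// Turn each keyword into one line of the prompt
const lines = keywords.map(k =>
  `- "${k.keyword}": position ${k.position ?? '>100'}, trend ${k.trend}`
);

const prompt =
  `Here are my keyword rankings for the past 7 days:\n${lines.join('\n')}\n` +
  `Summarize ranking performance, highlight keywords showing the most improvement, ` +
  `and list keywords needing attention.`;

return [{ json: { prompt } }];
```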
How to use this
- Input your SerpBear credentials
- Enter your domain name
- Input your OpenRouter.ai credentials
- Input your Baserow credentials
- Create a Baserow database with these columns: Date, Note, Blog
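For reference, the row the workflow writes maps onto those three columns. The sketch below shows the shape of a Baserow create-row call with a placeholder token and table ID; in the template itself the Baserow node handles this call, so treat this only as an illustration of the data mapping.

```javascript
// Minimal sketch of the row written to Baserow (token and table ID are placeholders)
const BASEROW_TOKEN = 'YOUR_BASEROW_TOKEN';
const TABLE_ID = 12345;
const aiSummary = 'AI summary text from the previous step'; // placeholder

await fetch(`https://api.baserow.io/api/database/rows/table/${TABLE_ID}/?user_field_names=true`, {
  method: 'POST',
  headers: {
    Authorization: `Token ${BASEROW_TOKEN}`,
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    Date: new Date().toISOString().slice(0, 10), // Date column
    Note: aiSummary,                             // Note column: the AI-generated summary
    Blog: 'https://yourdomain.com',              // Blog column: your site identifier
  }),
});
```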
Created by Rumjahn
n8n Workflow: Summarize Data with OpenRouter AI and Save to Baserow
This n8n workflow demonstrates how to leverage AI for data summarization using OpenRouter and then store the results in Baserow. It provides a flexible setup to process data, generate summaries, and maintain a structured record of the AI-generated content.
What it does
This workflow automates the following steps:
- Trigger: The workflow can be initiated either manually or on a predefined schedule.
- Fetch Data (Placeholder): An HTTP Request node is included, typically used to fetch data from an external API or source that needs summarization. Note: The current JSON does not specify the URL or method for this HTTP Request, so it acts as a placeholder for integration with your data source.
- Prepare Data for AI: A Code node is used to transform or prepare the data fetched from the previous step into a format suitable for the AI model. This might involve extracting specific fields, formatting prompts, or combining information.
- AI Summarization (OpenRouter): The prepared data is sent to an AI model via OpenRouter. The workflow does not use a dedicated OpenRouter node; instead, an HTTP Request node is configured to call the OpenRouter API. The AI processes the input and generates a summary.
- Save to Baserow: The generated AI summary, along with any relevant metadata, is then stored as a new record in a Baserow database.
Prerequisites/Requirements
To use this workflow, you will need:
- n8n Instance: A running n8n instance.
- Baserow Account: Access to a Baserow instance and an API key for authentication. You'll also need a specific database and table set up in Baserow to store the summarized data.
- OpenRouter API Key: An API key for OpenRouter to access their AI models. The HTTP Request node will need to be configured with the appropriate OpenRouter endpoint and authentication.
- Data Source: An API endpoint or data source from which you want to fetch data for summarization (to be configured in the HTTP Request node).
Setup/Usage
- Import the Workflow:
- Copy the provided JSON code.
- In your n8n instance, go to "Workflows" and click "New".
- Click the three dots menu (...) and select "Import from JSON".
- Paste the JSON code and click "Import".
- Configure Credentials:
- Baserow: You will need to set up a Baserow credential. In n8n, go to "Credentials", click "New Credential", search for "Baserow", and provide your API Key.
- OpenRouter (via HTTP Request): The HTTP Request node needs the OpenRouter API endpoint, the POST method, an Authorization header carrying your API key as a Bearer token, and a request body containing your prompt and the data to summarize. A generic Header Auth (or "API Key") credential in n8n works for this; a minimal request sketch follows.
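As a rough guide, the request sent to OpenRouter's OpenAI-compatible chat completions endpoint looks like the sketch below. The API key, model slug, and prompt are placeholders, not values from the template.

```javascript
// Minimal sketch of the OpenRouter call made by the HTTP Request node (key, model, and prompt are placeholders)
const OPENROUTER_API_KEY = 'YOUR_OPENROUTER_KEY';
const prompt = 'Summarize these keyword rankings: ...'; // built by the Code node in the real workflow

const response = await fetch('https://openrouter.ai/api/v1/chat/completions', {
  method: 'POST',
  headers: {
    Authorization: `Bearer ${OPENROUTER_API_KEY}`, // standard Bearer auth
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    model: 'meta-llama/llama-3.1-8b-instruct', // assumed model; any OpenRouter model slug works
    messages: [
      { role: 'system', content: 'You are an SEO analyst.' },
      { role: 'user', content: prompt },
    ],
  }),
});

const data = await response.json();
const aiSummary = data.choices[0].message.content; // OpenAI-compatible response shape
```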
- Configure Nodes:
- HTTP Request (Data Source): Update this node to connect to your specific data source. Provide the URL, method (GET/POST), headers, and any authentication required to fetch the data you want to summarize.
- Code: Modify the JavaScript code within this node to correctly extract and format the data from the previous HTTP Request node for your AI prompt. This is where you'll construct the prompt sent to OpenRouter.
- Baserow:
- Select your Baserow credential.
- Specify the Database ID and Table ID where you want to save the summarized data.
- Map the fields from the AI's output (from the Code node) to the columns in your Baserow table.
- Activate the Workflow:
- Once all configurations are complete, activate the workflow by toggling the "Active" switch in the top right corner of the workflow editor.
- Execution:
- Manual Trigger: Click "Execute Workflow" to run it manually.
- Schedule Trigger: Configure the "Schedule Trigger" node to run the workflow at your desired intervals (e.g., daily, hourly).
This workflow provides a robust framework for integrating AI summarization into your data management processes with Baserow.
Related Templates
Two-way property repair management system with Google Sheets & Drive
This workflow automates the repair request process between tenants and building managers, keeping all updates organized in a single spreadsheet. It is composed of two coordinated workflows, as two separate triggers are required: one for new repair submissions and another for repair updates. A unique Unit ID that corresponds to an individual unit is attributed to each request, and timestamps are used to coordinate repair updates with specific requests.

General use cases include:
- Property managers who manage multiple buildings or units.
- Building owners looking to centralize tenant repair communication.
- Automation builders who want to learn multi-trigger workflow design in n8n.

⚙️ How It Works

Workflow 1 – New Repair Requests
Behind the scenes: a tenant fills out a Google Form ("Repair Request Form"), which automatically adds a new row to a linked Google Sheet.
Steps:
- Trigger: Google Sheets rowAdded – runs when a new form entry appears.
- Extract & Format: Collects all relevant form data (address, unit, urgency, contacts).
- Generate Unit ID: Creates a standardized identifier (e.g., BUILDING-UNIT) for tracking (see the sketch after this section).
- Email Notification: Sends the building manager a formatted email summarizing the repair details and including a link to a Repair Update Form (which activates Workflow 2).

Workflow 2 – Repair Updates
Behind the scenes: triggered when the building manager submits a follow-up form ("Repair Update Form").
Steps:
- Lookup by UUID: Uses the Unit ID from Workflow 1 to find the existing row in the Google Sheet.
- Conditional Logic: If photos are uploaded, saves each image to a Google Drive folder, renames files consistently, and adds URLs to the sheet. If no photos, skips the upload step and processes textual updates only.
- Merge & Update: Combines new data with existing repair info in the same spreadsheet row, enabling a full repair history in one place.

🧩 Requirements
- Google account (for Forms, Sheets, and Drive)
- Gmail/email node connected for sending notifications
- n8n credentials configured for Google API access

⚡ Setup Instructions (see more detail in workflow)
- Import both workflows into n8n, then copy one into a second workflow.
- Change the manual trigger in Workflow 2 to an n8n Form node.
- Connect Google credentials to all nodes.
- Update spreadsheet and folder IDs in the corresponding nodes.
- Customize email text, sender name, and form links for your organization.
- Test each workflow with a sample repair request and a repair update submission.

🛠️ Customization Ideas
- Add Slack or Telegram notifications for urgent repairs.
- Auto-create folders per building or unit for photo uploads.
- Generate monthly repair summaries using Google Sheets triggers.
- Add an AI node to create summaries and extract relevant data from repair requests that include long submissions.
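As a small illustration of the Generate Unit ID step described above, a Code node could build the identifier along these lines; the incoming field names are assumptions, not the template's actual fields.

```javascript
// Minimal sketch: build a standardized BUILDING-UNIT identifier from form fields (field names assumed)
const building = ($json.building_address || '').trim().toUpperCase().replace(/\s+/g, '-');
const unit = ($json.unit_number || '').toString().trim().toUpperCase();
const unitId = `${building}-${unit}`; // e.g. "12-MAIN-ST-4B"

return [{ json: { ...$json, unit_id: unitId } }];
```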
Send WooCommerce cross-sell offers to customers via WhatsApp using Rapiwa API
Who Is This For?
This n8n workflow enables automated cross-selling by identifying each WooCommerce customer's most frequently purchased product, finding a related product to recommend, and sending a personalized WhatsApp message using the Rapiwa API. It also verifies whether the customer's number is WhatsApp-enabled before sending, and logs both successful and unsuccessful attempts to Google Sheets for tracking.

What This Workflow Does
- Retrieves all paying customers from your WooCommerce store
- Identifies each customer's most purchased product
- Finds the latest product in the same category as their most purchased item
- Cleans and verifies customer phone numbers for WhatsApp compatibility
- Sends personalized WhatsApp messages with product recommendations
- Logs all activities to Google Sheets for tracking and analysis
- Handles both verified and unverified numbers appropriately

Key Features
- Customer Segmentation: Automatically identifies paying customers from your WooCommerce store
- Product Analysis: Determines each customer's most purchased product
- Smart Recommendations: Finds the latest products in the same category as customer favorites
- WhatsApp Integration: Uses the Rapiwa API for message delivery
- Phone Number Validation: Verifies WhatsApp numbers before sending messages
- Dual Logging System: Tracks both successful and failed message attempts in Google Sheets
- Rate Limiting: Uses batching and wait nodes to prevent API overload
- Personalized Messaging: Includes customer name and product details in messages

Requirements
- WooCommerce store with API access
- Rapiwa account with API access for WhatsApp verification and messaging
- Google account with Sheets access
- Customer phone numbers in WooCommerce (stored in the billing.phone field)

How to Use — Step-by-Step Setup

Credentials Setup
- WooCommerce API: Configure WooCommerce API credentials in n8n (e.g., "WooCommerce (get customer)" and "WooCommerce (get customer data)")
- Rapiwa Bearer Auth: Create an HTTP Bearer credential with your Rapiwa API token
- Google Sheets OAuth2: Set up OAuth2 credentials for Google Sheets access

Configure Google Sheets
- Ensure your sheet has the required columns as specified in the Google Sheet Required Columns section

Verify Code Nodes
- Code (get paying_customer): Filters customers to include only those who have made purchases
- Get most buy product id & Clear Number: Identifies the most purchased product and cleans phone numbers

Configure HTTP Request Nodes
- Get customer data: Verify the WooCommerce API endpoint for retrieving customer orders
- Get specific product data: Verify the WooCommerce API endpoint for product details
- Get specific product recommend latest product: Verify the WooCommerce API endpoint for finding the latest products by category
- Check valid WhatsApp number Using Rapiwa: Verify the Rapiwa endpoint for WhatsApp number validation
- Rapiwa Sender: Verify the Rapiwa endpoint for sending messages

Google Sheet Required Columns
You'll need two Google Sheets (or two tabs in one spreadsheet), formatted like this ➤ sample. The workflow logs each send attempt with the following columns. Both sheets must have these headers (match exactly):

| name | number | email | address1 | price | suk | title | product link | validity | staus |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| Abdul Mannan | 8801322827799 | contact@spagreen.net | mirpur dohs | 850 | | Sharp Most Demanding Hoodie x Nike | https://yourshopdomain/p-img-nike | verified | sent |
| Abdul Mannan | 8801322827799 | contact@spagreen.net | mirpur dohs | 850 | | Sharp Most Demanding Hoodie x Nike | https://yourshopdomain/p-img-nike | unverified | not sent |
| Abdul Mannan | 8801322827799 | contact@spagreen.net | mirpur dohs | 850 | | Sharp Most Demanding Hoodie x Nike | https://yourshopdomain/p-img-nike | verified | sent |

Important Notes
- Phone Number Format: The workflow cleans phone numbers by removing all non-digit characters. Ensure your WooCommerce phone numbers are in a compatible format (a sketch follows this section).
- API Rate Limits: Rapiwa and WooCommerce APIs have rate limits. Adjust batch sizes and wait times accordingly.
- Data Privacy: Ensure compliance with data protection regulations when sending marketing messages.
- Error Handling: The workflow logs unverified numbers but doesn't have extensive error handling. Consider adding error notifications for failed API calls.
- Product Availability: The workflow recommends the latest product in a category but doesn't check whether it is in stock. Consider adding stock status verification.
- Testing: Always test with a small batch before running the workflow on your entire customer list.

Useful Links
- Dashboard: https://app.rapiwa.com
- Official Website: https://rapiwa.com
- Documentation: https://docs.rapiwa.com

Support & Help
- WhatsApp: Chat on WhatsApp
- Discord: SpaGreen Community
- Facebook Group: SpaGreen Support
- Website: https://spagreen.net
- Developer Portfolio: Codecanyon SpaGreen
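For the phone-number cleaning mentioned under Important Notes, the logic boils down to stripping non-digit characters before the Rapiwa validation call. A minimal sketch, with an assumed field path, follows.

```javascript
// Minimal sketch: clean a WooCommerce billing phone number for WhatsApp lookup (field path assumed)
const raw = $json.billing?.phone ?? '';   // e.g. "+880 1322-827799"
const cleaned = raw.replace(/\D/g, '');   // keep digits only -> "8801322827799"

return [{ json: { ...$json, cleaned_number: cleaned } }];
```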
Track SDK documentation drift with GitHub, Notion, Google Sheets, and Slack
📊 Description
Automatically track SDK releases from GitHub, compare documentation freshness in Notion, and send Slack alerts when docs lag behind. This workflow ensures documentation stays in sync with releases, improves visibility, and reduces version drift across teams. 🚀📚💬

What This Template Does
- Step 1: Listens to GitHub repository events to detect new SDK releases. 🧩
- Step 2: Fetches release metadata including version, tag, and publish date. 📦
- Step 3: Logs release data into Google Sheets for record-keeping and analysis. 📊
- Step 4: Retrieves FAQ or documentation data from Notion. 📚
- Step 5: Merges GitHub and Notion data to calculate documentation drift. 🔍
- Step 6: Flags SDKs whose documentation is over 30 days out of date. ⚠️
- Step 7: Sends detailed Slack alerts to notify responsible teams. 🔔

Key Benefits
✅ Keeps SDK documentation aligned with product releases
✅ Prevents outdated information from reaching users
✅ Provides centralized release tracking in Google Sheets
✅ Sends real-time Slack alerts for overdue updates
✅ Strengthens DevRel and developer experience operations

Features
- GitHub release trigger for real-time monitoring
- Google Sheets logging for tracking and auditing
- Notion database integration for documentation comparison
- Automated drift calculation (days since last update)
- Slack notifications for overdue documentation

Requirements
- GitHub OAuth2 credentials
- Notion API credentials
- Google Sheets OAuth2 credentials
- Slack Bot token with chat:write permissions

Target Audience
- Developer Relations (DevRel) and SDK engineering teams
- Product documentation and technical writing teams
- Project managers tracking SDK and doc release parity

Step-by-Step Setup Instructions
- Connect your GitHub account and select your SDK repository.
- Replace YOURGOOGLESHEETID and YOURSHEET_GID with your tracking spreadsheet.
- Add your Notion FAQ database ID.
- Configure your Slack channel ID for alerts.
- Run once manually to validate setup, then enable automation.
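For the drift calculation in Steps 5 and 6, the logic is a simple date difference against a 30-day threshold. Below is a minimal Code node sketch with assumed field names.

```javascript
// Minimal sketch: flag docs that lag the latest SDK release by more than 30 days (field names assumed)
const releaseDate = new Date($json.release_published_at); // from the GitHub release
const docsUpdated = new Date($json.notion_last_edited);   // from the Notion page

const driftDays = Math.floor((releaseDate - docsUpdated) / (1000 * 60 * 60 * 24));
const needsUpdate = driftDays > 30; // threshold described by the template

return [{ json: { ...$json, driftDays, needsUpdate } }];
```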