Export Google Search Console Data to Airtable Automatically
If you’ve ever downloaded CSV files from Google Search Console, opened them in Excel, cleaned the weird formatting, and pasted them into a sheet just to get a simple report… this workflow is made for you.
Who Is This Workflow For?
This automation is perfect for:
- SEO freelancers and consultants → who want to track client performance without wasting time on manual exports.
- Marketing teams → who need fresh daily/weekly reports to check what keywords and pages are performing.
- Website owners → who just want a clean way to see how their site is doing without logging into Google Search Console every day.
Basically, if you care about SEO but don't want to babysit CSV files, this workflow is your new best friend.
If you need a professional n8n agency to build advanced data automation workflows like this, check out Vision IA's n8n automation services.
What Does It Do?
Here’s the big picture:
- It runs on a schedule (every day, or whenever you want).
- It fetches data directly from the Google Search Console API.
- It pulls 3 types of reports:
- By Query (keywords people used).
- By Page (URLs that ranked).
- By Date (daily performance).
- It splits and cleans the data so it’s human-friendly.
- It saves everything into Airtable, organized in three tables.
End result: every time you open Airtable, you have a neat SEO database with clicks, impressions, CTR, and average position — no manual work required.
Prerequisites
You’ll need a few things to get started:
- Access to Google Search Console.
- A Google Cloud project with the Search Console API enabled.
- An Airtable account to store the data.
- An automation tool that can connect APIs (like the one we’re using here).
That’s it!
Step 1: Schedule the Workflow
The very first node in the workflow is the Schedule Trigger.
- Why? → So you don’t have to press “Run” every day.
- What it does → It starts the whole workflow at fixed times.
In the JSON, you can configure things like:
- Run every day at a specific hour (e.g., 8 AM).
- Or run every X hours/minutes if you want more frequent updates.
This is the alarm clock of your automation ⏰.
Step 2: Set Your Domain and Time Range
Next, we define the site and the time window for the report.
In the JSON, there’s a Set node with two important parameters:
- domain → your website (example: https://www.vvv.fr/).
- days → how many days back you want the data (default: 30).
👉 Changing these two values updates the whole workflow. Super handy if you want 7-day reports instead of 30.
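As a quick sanity check, here is how the startDate/endDate pair falls out of the days value — a minimal Python sketch (the function name is mine, not part of the workflow):

```python
from datetime import date, timedelta

def report_window(days=30, today=None):
    """Derive the startDate/endDate pair the workflow computes from `days`."""
    today = today or date.today()
    start = today - timedelta(days=days)
    return {"startDate": start.isoformat(), "endDate": today.isoformat()}
```

With days = 7 you get a rolling one-week window instead of the default 30 days.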
Step 3: Fetch Data from Google Search Console
This is where the workflow talks to the API.
There are 3 HTTP Request nodes:
1. Get Query Report
   - Pulls data grouped by search queries (keywords).
   - Parameters in the JSON:
     - startDate = today - 30 days
     - endDate = today
     - dimensions = "query"
     - rowLimit = 25000 (maximum rows the API can return)
2. Get Page Report
   - Same idea, but grouped by page URLs.
   - Parameters:
     - dimensions = "page"
     - Same dates and row limit.
3. Get Date Report
   - Groups performance by date.
   - Parameters:
     - dimensions = "date"
   - You get a day-by-day performance view.
Each request returns rows like this:
```json
{
  "keys": ["example keyword"],
  "clicks": 42,
  "impressions": 1000,
  "ctr": 0.042,
  "position": 8.5
}
```
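Outside n8n, these three calls hit the Search Console searchAnalytics.query endpoint. A sketch of how one request could be assembled (the helper name is mine; the endpoint and body shape follow the public API):

```python
import json
from urllib.parse import quote

def build_gsc_request(domain, start_date, end_date, dimension, row_limit=25000):
    """Assemble the URL and JSON body for one searchAnalytics.query call."""
    site = quote(domain, safe="")  # the site URL must be URL-encoded in the path
    url = f"https://www.googleapis.com/webmasters/v3/sites/{site}/searchAnalytics/query"
    body = {
        "startDate": start_date,
        "endDate": end_date,
        "dimensions": [dimension],  # "query", "page", or "date"
        "rowLimit": row_limit,      # 25000 is the API maximum per request
    }
    return url, json.dumps(body)
```

POST that body with an OAuth 2.0 bearer token from your Google Cloud project to get the rows shown above.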
Step 4: Split the Data
The API sends results in a big array (rows). That’s not very usable directly.
So we add a Split Out node for each report.
What it does: breaks the array into single items → 1 item per keyword, per page, or per date.
This way, each line can be saved individually into Airtable.
👉 Think of it like opening a bag of candy and laying each one neatly on the table 🍬.
Step 5: Clean and Rename Fields
After splitting, we use Edit Fields nodes to make the data human-friendly.
For example:
- In the Query report → rename keys[0] into Keyword.
- In the Page report → rename keys[0] into page.
- In the Date report → rename keys[0] into date.
This is also where we keep only the useful fields:
- Keyword / page / date
- clicks
- impressions
- ctr
- position
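Steps 4 and 5 together amount to one simple transformation: split the rows array, lift keys[0] into a named field, and keep the four metrics. A Python equivalent (the function name is an assumption):

```python
def clean_rows(api_response, key_name):
    """Split the `rows` array and flatten each row into an Airtable-ready record."""
    cleaned = []
    for row in api_response.get("rows", []):
        cleaned.append({
            key_name: row["keys"][0],  # "Keyword", "page", or "date"
            "clicks": row["clicks"],
            "impressions": row["impressions"],
            "ctr": row["ctr"],
            "position": row["position"],
        })
    return cleaned
```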
Step 6: Save Everything into Airtable
Finally, the polished data is sent into Airtable.
In the JSON, there are 3 Airtable nodes:
- Queries table → stores all the keywords.
- Pages table → stores all the URLs.
- Dates table → stores day-by-day metrics.
Each node is set to:
- Operation = Create → adds a new record.
- Base = Search Console Reports.
- Table = Queries, Pages, or Dates.
Field Mapping
For Queries:
- Keyword → {{ $json.Keyword }}
- clicks → {{ $json.clicks }}
- impressions → {{ $json.impressions }}
- ctr → {{ $json.ctr }}
- position → {{ $json.position }}
👉 Same logic for Pages and Dates, just replace Keyword with page or date.
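For reference, the same Create operation against Airtable's REST API boils down to one POST per record. A sketch under stated assumptions (the base ID placeholder and helper name are mine; the endpoint shape follows Airtable's public Web API):

```python
import json

AIRTABLE_BASE_ID = "appXXXXXXXXXXXXXX"  # hypothetical placeholder; use your own base ID

def airtable_create_request(table, fields):
    """Build the endpoint and JSON body for one Airtable Create-record call."""
    url = f"https://api.airtable.com/v0/{AIRTABLE_BASE_ID}/{table}"
    body = {"records": [{"fields": fields}]}
    return url, json.dumps(body)
```

Send it with your Airtable token in an `Authorization: Bearer` header; the n8n Airtable node does exactly this behind the scenes.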
Expected Output
Every time this workflow runs:
- Queries table fills with fresh keyword performance data.
- Pages table shows how your URLs performed.
- Dates table tracks the evolution day by day.
In Airtable, you now have a complete SEO database with no manual exports.
Why This Is Awesome
- 🚫 No more messy CSV exports.
- 📈 Data is always up-to-date.
- 🎛 You can build Airtable dashboards, filters, and interfaces.
- ⚙️ Easy to adapt → just change domain or days to customize.
And the best part? You can spend the time you saved on actual SEO improvements instead of spreadsheet gymnastics 💃.
Need Help Automating Your Data Workflows?
This n8n workflow is perfect for automating SEO reporting and data collection. If you want to go further with document automation, file processing, and data synchronization across your tools, our agency specializes in building custom automation systems.
👉 Explore our document automation services: Vision IA – Document Automation Agency
We help businesses automate their data workflows—from collecting reports to organizing files and syncing information across CRMs, spreadsheets, and databases—all running automatically.
Questions about this workflow or other automation solutions? Visit Vision IA or reach out for a free consultation.
Export Google Search Console Data to Airtable Automatically
This n8n workflow automates the process of extracting data from Google Search Console and populating it into an Airtable base. It's designed to help you regularly monitor your website's search performance, track keywords, and analyze trends without manual data entry.
What it does
This workflow performs the following steps:
- Triggers on Schedule: The workflow starts at predefined intervals (e.g., daily, weekly) using a Schedule Trigger.
- Fetches Google Search Console Data: It makes an HTTP Request to the Google Search Console API to retrieve performance data for your specified website.
- Splits Out Data: The retrieved data, which is likely an array of objects, is then split into individual items for easier processing.
- Edits Fields: A "Set" node (labeled "Edit Fields") is used to transform or rename the data fields to match the column names in your Airtable base. This ensures data consistency.
- Adds/Updates Airtable Records: Finally, the processed data is sent to Airtable, where it either creates new records or updates existing ones, depending on your Airtable configuration and the data structure.
Prerequisites/Requirements
To use this workflow, you will need:
- n8n Instance: A running n8n instance.
- Google Search Console Account: Access to the Google Search Console API for your website. This typically involves setting up a Google Cloud Project and enabling the Search Console API.
- Airtable Account: An Airtable account with a base and a table configured to receive the Google Search Console data.
- n8n Credentials:
- Google Search Console API Credential: Configured in n8n to authenticate with the Google Search Console API.
- Airtable API Key Credential: Configured in n8n to authenticate with your Airtable account.
Setup/Usage
- Import the Workflow: Download the provided JSON and import it into your n8n instance.
- Configure Credentials:
- Set up your Google Search Console API credential. You'll likely need to provide an API key or OAuth 2.0 credentials from your Google Cloud Project.
- Set up your Airtable API Key credential.
- Configure HTTP Request Node (ID: 19):
- Update the URL to point to the correct Google Search Console API endpoint.
- Adjust the request body and parameters to specify the website, date range, and metrics you want to retrieve (e.g., queries, pages, countries, devices, clicks, impressions, CTR, position).
- Ensure the Google Search Console API credential is selected.
- Configure Split Out Node (ID: 1239):
- Verify that the "Field to Split Out" is correctly pointing to the array containing your search console data in the previous node's output.
- Configure Edit Fields Node (ID: 38):
- Map the fields from the Google Search Console output to the desired field names for your Airtable table. For example, items.clicks might be mapped to Clicks in Airtable.
- Configure Airtable Node (ID: 2):
- Select your Airtable credential.
- Specify the Base ID and Table Name where you want to store the data.
- Set the operation to "Create" or "Update" as needed. If updating, specify the "Key Field" (e.g., Date+Query+Page) to identify existing records.
- Map the fields from the "Edit Fields" node to the corresponding columns in your Airtable table.
- Configure Schedule Trigger Node (ID: 839):
- Set your desired schedule for the workflow to run (e.g., every day, once a week).
- Activate the Workflow: Once all configurations are complete, activate the workflow to start automatically exporting your Google Search Console data to Airtable.
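If you use the Update operation, the composite "Key Field" mentioned in the Airtable step (e.g., Date+Query+Page) has to be built from the record's fields before writing. A minimal sketch (the separator and helper name are assumptions, not part of the workflow):

```python
def record_key(fields):
    """Join Date, Query, and Page into one deduplication key (assumed scheme)."""
    return "|".join(str(fields.get(k, "")) for k in ("Date", "Query", "Page"))
```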