
Baserow campaign database to Shopify with image upload & dynamic template update

By Sascha

Automating your marketing campaign management process can streamline your workflow and save you valuable time. With the combination of Baserow and n8n, you can efficiently handle your campaign data and seamlessly publish content to your Shopify store.

In this workflow template, I demonstrate how to leverage Baserow as a centralized platform for organizing your marketing campaign assets, including copy and images. By utilizing n8n, we automate the process of fetching images and campaign descriptions from Baserow and uploading them directly to your Shopify store.

With this automated solution, you can expedite the publishing process, ensuring that your campaigns are launched swiftly across your sales channels. Additionally, this workflow serves as a foundational step towards further automation in campaign management, allowing you to dynamically generate and upload content to your Shopify store with ease.

This template will help you:

  1. Use n8n to get images for marketing campaigns from Baserow and upload them to your Shopify media library
  2. Dynamically inject data from Baserow into a template file
  3. Upload a template file to your Shopify theme

This template demonstrates the following n8n concepts:

  • Use the Webhook node
  • Use the IF node to control the execution flow of the workflow
  • Do time calculations using expressions and JavaScript (see the sketch after this list)
  • Use the GraphQL node to upload images to your Shopify media files
  • Create a dynamic template file for your Shopify theme
  • Use the HTTP Request node to upload your template file to your Shopify store
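
As an illustration of the time-calculation concept, here is a minimal sketch of the date math an n8n Code node could do. The "Start Date" column is a hypothetical Baserow field, assumed to hold an ISO 8601 date; rename it to match your own table.

```javascript
// A minimal sketch of the date math an n8n Code node might perform.
// "Start Date" is a hypothetical Baserow column holding an ISO 8601 date.
const row = $input.first().json;
const startDate = new Date(row['Start Date']);    // e.g. "2026-02-03T09:00:00Z"
const now = new Date();

// Days until (positive) or since (negative) the campaign start.
const msPerDay = 24 * 60 * 60 * 1000;
const daysUntilStart = Math.round((startDate.getTime() - now.getTime()) / msPerDay);

// A boolean the IF node can branch on later in the workflow.
const campaignIsLive = startDate.getTime() <= now.getTime();

return [{ json: { ...row, daysUntilStart, campaignIsLive } }];
```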

How to get started?

  1. Create a custom app in Shopify to obtain the credentials needed to connect n8n to Shopify. This is required for the Shopify Trigger.
  2. Create Shopify Access Token API credentials in n8n for the Shopify Trigger node.
  3. Create Header Auth credentials: use X-Shopify-Access-Token as the name and the access token from the Shopify app you created as the value. The Header Auth credential is required for the GraphQL nodes (see the sketch after this list for how the header is used).
  4. You will need a running Baserow instance for this; alternatively, you can sign up for a free account at https://baserow.io/
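
If you want to confirm the Header Auth credential works before wiring up the workflow, you can call the same GraphQL endpoint the workflow talks to. This is only a sanity-check sketch; the shop name and API version are placeholders you need to replace.

```javascript
// Sanity-check the access token outside n8n (Node.js 18+).
// Replace "your-shop" and the API version with your own values.
const endpoint = 'https://your-shop.myshopify.com/admin/api/2024-01/graphql.json';

const response = await fetch(endpoint, {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
    // The same name/value pair as the Header Auth credential in n8n:
    'X-Shopify-Access-Token': process.env.SHOPIFY_ACCESS_TOKEN,
  },
  body: JSON.stringify({ query: '{ shop { name } }' }),
});

console.log(await response.json()); // expect your shop name, not an auth error
```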

Please make sure to read the notes in the template.

For a detailed explanation please check the corresponding video: https://youtu.be/Ky-dYlljGiY

Baserow Campaign Database to Shopify with Image Upload and Dynamic Template Update

This n8n workflow automates the process of creating or updating Shopify products based on data from a Baserow campaign database, including image uploads and dynamic template adjustments.

Description

This workflow acts as a bridge between your Baserow campaign management and your Shopify store. It listens for updates or new entries in a Baserow database, processes the product information, dynamically uploads images to Shopify if needed, and then either creates a new product or updates an existing one on your Shopify store, even allowing for dynamic template assignment.

What it does

  1. Triggers on Baserow Webhook: The workflow starts when a webhook is received, likely from a Baserow database indicating a new or updated campaign entry.
  2. Processes Baserow Data: It receives the data from Baserow, which should contain product details, image URLs, and potentially Shopify template information.
  3. Prepares Product Data: An "Edit Fields (Set)" node likely transforms and formats the incoming Baserow data into the structure expected by Shopify, extracting relevant fields like product title, description, price, tags, and image URLs.
  4. Checks for Existing Product (HTTP Request): A generic HTTP Request node is used to query Shopify to determine if a product with a matching identifier (e.g., SKU or handle) already exists.
  5. Conditional Logic (If): An "If" node evaluates the result of the Shopify product check.
    • If Product Exists: It proceeds down a path to update the existing Shopify product.
    • If Product Does Not Exist: It proceeds down a path to create a new Shopify product.
  6. Uploads Images to Shopify (HTTP Request): For new or updated products, another HTTP Request node handles uploading images to Shopify, likely fetching them from the URLs provided in Baserow (see the image-upload sketch after this list).
  7. Creates/Updates Shopify Product (GraphQL): A GraphQL node is used to interact with the Shopify Admin API (see the mutation sketch after this list).
    • Create Product: If the product doesn't exist, this node creates a new product with all the processed details and uploaded images.
    • Update Product: If the product exists, this node updates its details, including images and potentially the assigned template.
  8. Handles Template Updates (HTTP Request): A final HTTP Request node might be used to dynamically update the Shopify product template based on data from Baserow, if applicable.
  9. No Operation: A "No Operation, do nothing" node is present, likely as a placeholder or to terminate a branch of the workflow cleanly.
  10. Sticky Note: A sticky note is included, likely for documentation or internal notes within the workflow.
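
For step 6, Shopify's Admin GraphQL API offers a fileCreate mutation that accepts a publicly reachable image URL, so the image hosted by Baserow can be passed through without downloading it first. The snippet below is only a sketch of what that upload call could look like; the Baserow column names are assumptions, and whether your configuration uses this exact mutation may differ.

```javascript
// Sketch: prepare a fileCreate call for the image-upload step.
// The query/variables pair can go into a GraphQL node or an HTTP Request node
// posting to https://your-shop.myshopify.com/admin/api/<version>/graphql.json.
const query = `
  mutation uploadCampaignImage($files: [FileCreateInput!]!) {
    fileCreate(files: $files) {
      files { id alt }
      userErrors { field message }
    }
  }`;

const row = $input.first().json;                  // data coming from Baserow
const variables = {
  files: [
    {
      originalSource: row['Campaign Image URL'],  // hypothetical Baserow column
      contentType: 'IMAGE',
      alt: row['Campaign Name'],                  // hypothetical Baserow column
    },
  ],
};

return [{ json: { query, variables } }];
```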
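
Steps 4, 5, and 7 amount to one lookup followed by one of two mutations. The GraphQL below is a compressed outline of that logic, not the template's exact queries; which input fields you send, and whether you look products up by handle, SKU, or something else, depends on your data and Shopify API version.

```javascript
// Step 4: look up an existing product, here by handle as one possible identifier.
const lookupQuery = `
  query findProduct($handle: String!) {
    productByHandle(handle: $handle) { id title }
  }`;

// Step 7, "create" branch: productCreate with the fields prepared from Baserow.
const createMutation = `
  mutation createProduct($input: ProductInput!) {
    productCreate(input: $input) {
      product { id handle }
      userErrors { field message }
    }
  }`;

// Step 7, "update" branch: same shape, plus the existing product id in the input.
const updateMutation = `
  mutation updateProduct($input: ProductInput!) {
    productUpdate(input: $input) {
      product { id handle }
      userErrors { field message }
    }
  }`;
```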

Prerequisites/Requirements

  • n8n Instance: A running n8n instance.
  • Baserow Account: Access to a Baserow database with campaign/product data.
  • Shopify Account: An active Shopify store with API access enabled.
  • Shopify Admin API Credentials: API Key and Access Token for your Shopify store.
  • Webhook URL: A webhook configured in Baserow to send data to this n8n workflow's Webhook trigger.
  • Image Hosting: Images referenced in Baserow should be publicly accessible via URL for n8n to fetch and upload to Shopify.

Setup/Usage

  1. Import the Workflow: Import the provided JSON into your n8n instance.
  2. Configure Credentials:
    • Shopify API: Set up your Shopify API credentials in n8n. These will be used by the HTTP Request and GraphQL nodes.
    • Baserow Webhook: The "Webhook" trigger node will generate a unique URL. Copy this URL.
  3. Configure Baserow Webhook: In your Baserow database, set up a webhook that triggers on row creation or update and sends the relevant product data to the n8n Webhook URL you copied.
  4. Configure Nodes:
    • Edit Fields (Set): Adjust the expressions in this node to correctly map your Baserow field names to the desired Shopify product properties (e.g., title, body_html, variants[0].price, images[0].src, template_suffix); see the mapping sketch after this list.
    • HTTP Request (Check Existing Product): Update the Shopify API endpoint and query parameters to correctly search for existing products based on a unique identifier from your Baserow data.
    • HTTP Request (Upload Images): Ensure the image upload logic (method, headers, body) is correctly configured for the Shopify API.
    • GraphQL (Create/Update Product): Modify the GraphQL mutations to match your specific Shopify product creation and update requirements, including mapping all necessary fields from the Baserow data.
    • HTTP Request (Update Template): If you are dynamically updating templates, ensure this node's API call is correctly configured (a theme-asset upload sketch follows this list).
  5. Activate the Workflow: Once configured, activate the workflow in n8n.
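
As a reference for the mapping step, here is roughly what the same transformation would look like in a Code node; every Baserow column name on the right is an assumption you will need to replace with your own.

```javascript
// Sketch of the Baserow → Shopify field mapping done by the Set node.
// Every row['...'] key is a hypothetical Baserow column name.
const row = $input.first().json;

return [{
  json: {
    title: row['Campaign Name'],
    body_html: row['Campaign Copy'],
    price: row['Price'],
    image_src: row['Campaign Image URL'],
    template_suffix: row['Template Suffix'] || 'campaign',  // selects the theme template
    handle: row['Handle'],   // used to check whether the product already exists
  },
}];
```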
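
For the template update, the HTTP Request node can write the rendered template file into the theme via Shopify's REST asset endpoint. Treat the following as a sketch under assumptions: the theme ID, API version, and asset key are placeholders, and Shopify has been restricting the asset endpoint in newer API versions, so confirm it is still available to your custom app before relying on it.

```javascript
// Sketch: upload a rendered template file as a theme asset (placeholders throughout).
const shop = 'your-shop.myshopify.com';
const themeId = 'YOUR_THEME_ID';
const apiVersion = '2024-01';

// In the workflow this string would be built in a previous node by injecting
// Baserow data (campaign copy, image references) into a base template.
const templateBody = '<h1>{{ product.title }}</h1>\n<p>Campaign copy goes here</p>';

const res = await fetch(
  `https://${shop}/admin/api/${apiVersion}/themes/${themeId}/assets.json`,
  {
    method: 'PUT',
    headers: {
      'Content-Type': 'application/json',
      'X-Shopify-Access-Token': process.env.SHOPIFY_ACCESS_TOKEN,
    },
    body: JSON.stringify({
      asset: {
        key: 'templates/product.campaign.liquid', // should match the template_suffix
        value: templateBody,
      },
    }),
  },
);

console.log(await res.json());
```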

Now, whenever a new product campaign entry is added or updated in your Baserow database, this workflow will automatically handle its creation or update on your Shopify store.
