Retrieve a Monday.com row and all data in a single node
This workflow is a building block designed to be called from other workflows via an Execute Workflow node. When called with a JSON input containing a "pulse" field holding the ID of the monday.com item to retrieve, this workflow will return:
- The item's name and ID
- All column data, indexable by the column name
- All column data, indexable by the column's ID string
- All board relation columns, with their data and column values
- All subitems, with their data and column values
For example, a call might return a structure like the sketch below (illustrative only; the field names here are assumptions, and the real column names, IDs, and values depend on your board):
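```json
{
  "id": "1234567890",
  "name": "Example item",
  "column_values": {
    "Status": { "id": "status", "text": "Done" },
    "status": { "id": "status", "text": "Done" }
  },
  "board_relations": [],
  "subitems": [
    { "id": "9876543210", "name": "Example subitem", "column_values": {} }
  ]
}
```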
++Prerequisites++
- A monday.com account and credential
- A workflow that needs to get detailed data from a monday.com row
- The pulse ID of the monday.com row to retrieve data from
++Setup++
- Import the workflow
- Configure all monday nodes with your credentials and save the workflow
- Copy the workflow ID from its URL
- In a different workflow, add an Edit Fields node that outputs a "pulse" field containing the ID of the monday.com item you want to retrieve
- Connect that Edit Fields node to an Execute Workflow node, and paste the workflow ID from above into it
- This "pulse" field tells the sub-workflow which pulse to retrieve; it can be populated by an expression in your workflow (see the sketch below)
- There is an example of the Edit Fields and Execute Workflow nodes in the template
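For reference, a minimal sketch of the input this sub-workflow expects; the ID is a placeholder:

```json
{
  "pulse": 1234567890
}
```

In the calling workflow, the "pulse" value can also be set with an expression such as {{ $json.id }} to pass along an item ID from an earlier node.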
Retrieve Monday.com Row and All Data in a Single Node
This n8n workflow demonstrates how to retrieve a specific row from Monday.com and then extract all its associated data, including sub-items and column values, using a single Monday.com node call. It leverages sub-workflows and data manipulation to simplify complex data retrieval.
What it does
- Triggers the Sub-workflow: The workflow starts when it's called by another n8n workflow (acting as a sub-workflow).
- Retrieves Monday.com Item: It uses the Monday.com node to fetch a specific item (row) by its ID. It's configured to retrieve all available data, including sub-items and column values.
- Processes Column Values: A "Code" node then processes the column_values array from the Monday.com item. It transforms this array into a more accessible object format, where each key is the column ID and the value is the raw value from Monday.com (see the sketch after this list).
- Extracts Sub-items (if any): If the Monday.com item has sub-items, the workflow extracts these into a separate array.
- Merges Data: The column_values object and the sub_items array are merged back into the main item data, making all related information available in a unified structure.
- Splits Out Sub-items: If sub-items were found, the workflow then splits them out as individual items, allowing for further processing of each sub-item.
- Aggregates Data: Finally, if sub-items were processed, the workflow aggregates the original item and its sub-items back into a single output, providing a comprehensive view of the Monday.com row and all its nested data.
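As an illustrative sketch only (not the template's exact code), the column-value transformation described above could be written in an n8n Code node like this; the column_values and title field names follow Monday.com's usual API shape, so verify them against your own node output:

```javascript
// Illustrative sketch of the "Processes Column Values" step – assumes an
// n8n Code node set to "Run Once for Each Item" and Monday.com's usual
// column_values shape ({ id, title, text, value }); verify against your data.
const item = $input.item.json;

const byColumn = {};
for (const col of item.column_values ?? []) {
  byColumn[col.id] = col;                   // indexable by the column's ID string
  if (col.title) byColumn[col.title] = col; // and by its human-readable name
}

return {
  json: {
    ...item,
    column_values: byColumn,       // array replaced with a lookup object
    subitems: item.subitems ?? [], // keep sub-items (if any) alongside
  },
};
```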
Prerequisites/Requirements
- n8n Instance: A running n8n instance.
- Monday.com Account: An active Monday.com account with an API key configured as an n8n credential.
- Monday.com Board/Item: A Monday.com board with at least one item (row) to test the retrieval.
Setup/Usage
- Import the Workflow:
- Download the provided JSON file.
- In your n8n instance, go to "Workflows" and click "New".
- Click the "Import from JSON" button and paste the workflow JSON or upload the file.
- Configure Monday.com Credentials:
- Locate the "Monday.com" node (ID: 309).
- Click on the "Credential" field and select your existing Monday.com API Key credential. If you don't have one, click "Create New" and follow the prompts to add your Monday.com API Key.
- Configure the Main Workflow (Calling Workflow):
- This workflow is designed to be executed by another workflow using the "Execute Sub-workflow" node.
- In your main workflow, add an "Execute Sub-workflow" node.
- Configure it to call this workflow. You will need to pass the boardId and itemId of the Monday.com row you wish to retrieve as input data to this sub-workflow. For example, you can pass them as JSON: {"boardId": "YOUR_BOARD_ID", "itemId": "YOUR_ITEM_ID"} (see the example after this list).
- Activate the Workflow:
- Once configured, activate the workflow by toggling the "Active" switch in the top right corner of the n8n editor.
- Test the Workflow:
- Execute your main workflow that calls this sub-workflow.
- Observe the output of this workflow to see the retrieved Monday.com item data, including its column values and sub-items.
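A minimal sketch of the input data the calling workflow passes to the Execute Sub-workflow node, with placeholder IDs:

```json
{
  "boardId": "YOUR_BOARD_ID",
  "itemId": "YOUR_ITEM_ID"
}
```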
This workflow provides a robust and efficient way to pull comprehensive data for a Monday.com item, making it ideal for scenarios where you need to process all associated information in a single operation.