Google Sheets duplication & enrichment automation
How it Works
This workflow reads sheet details from a source Google Spreadsheet, creates a new spreadsheet, replicates the sheet structure, then reads the data from each source sheet and writes it into the corresponding sheet in the new spreadsheet. The process repeats for every sheet, providing an automated way to duplicate and transform structured data.
🎯 Use Case
- Automate duplication and data enrichment for multi-sheet Google Spreadsheets
- Replicate templates across new documents with consistent formatting
- Data team workflows requiring repetitive structured Google Sheets setup
Setup Instructions
1. Required Google Sheets
- You must have a source spreadsheet with multiple sheets.
- The destination spreadsheet will be created automatically.
2. API Credentials
- Google Sheets OAuth2: used to read the source spreadsheet and write to the destination.
- HTTP Request Auth: required only if external API headers are needed.
3. Configure Fields in Write Sheet
- Ensure you define appropriate columns and mapping for the destination sheet.
🔄 Workflow Logic
- Manual Trigger: Starts the flow on user demand.
- Create New Spreadsheet: Generates a blank spreadsheet.
- HTTP Request: Retrieves all sheet names from the source spreadsheet.
- JavaScript Code: Extracts titles and metadata from the HTTP response.
- Loop Over Sheets: Iterates through each sheet retrieved.
- Delete Default Sheet: Removes the placeholder 'Sheet1'.
- Create Sheets: Replicates each original sheet in the new document.
- Read Spreadsheet1: Pulls data from the matching original sheet.
- Write Sheet: Appends the data to the newly created sheets.
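The "JavaScript Code" step above can be sketched as a small transformation. This is a minimal sketch, assuming the HTTP Request node returns the standard Google Sheets API `spreadsheets.get` response; `extractSheets` is a hypothetical helper name.

```javascript
// Minimal sketch of the Code step: turn the spreadsheets.get
// metadata response into one entry per sheet. The response shape
// (sheets[].properties) follows the Google Sheets API; adapt as needed.
function extractSheets(response) {
  return (response.sheets || []).map((sheet) => ({
    title: sheet.properties.title,
    sheetId: sheet.properties.sheetId,
  }));
}

// Abridged example of a metadata response:
const metadata = {
  sheets: [
    { properties: { sheetId: 0, title: "Orders" } },
    { properties: { sheetId: 123, title: "Customers" } },
  ],
};

console.log(extractSheets(metadata)); // one { title, sheetId } object per sheet
```

Inside n8n the same logic would read from `$input` and return `{ json: ... }` items, but the mapping is identical.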
🧩 Node Descriptions
| Node Name | Description |
|-----------|-------------|
| Manual Trigger | Starts the workflow manually by the user for testing. |
| Create New Spreadsheet | Creates a new Google Spreadsheet for output. |
| HTTP Request | Fetches metadata from the source spreadsheet, including sheet names. |
| Code | Processes sheet metadata into a list for iteration. |
| Loop Over Items | Loops over each sheet to replicate and populate. |
| Google Sheets2 | Deletes the default 'Sheet1' from the new spreadsheet. |
| Create Sheets | Creates a new sheet matching each source sheet. |
| Read Spreadsheet1 | Reads data from the source sheet. |
| Write sheet | Writes the data into the corresponding new sheet. |
🛠️ Customization Tips
- Adjust the Google Sheet title to be dynamic or user-input driven
- Add filtering logic before writing data
- Append custom audit columns like 'Timestamp' or 'Processed By'
- Enable logging or Slack alerts after each sheet is created
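As a concrete example of the audit-column tip, the sketch below adds the two suggested fields in a Code node before the Write Sheet step. It assumes rows arrive as plain objects; `addAuditColumns` is a hypothetical helper, not part of the template.

```javascript
// Sketch: add audit columns to each row before writing.
// Column names ("Timestamp", "Processed By") mirror the tip above.
function addAuditColumns(rows, processedBy) {
  const stamp = new Date().toISOString();
  return rows.map((row) => ({
    ...row,
    Timestamp: stamp,
    "Processed By": processedBy,
  }));
}

const enriched = addAuditColumns([{ name: "Alice" }], "duplication-workflow");
console.log(enriched[0]["Processed By"]); // "duplication-workflow"
```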
📁 Required Files
| File Name | Purpose |
|-----------|---------|
| My_workflow_4.json | Main workflow JSON file for sheet duplication and enrichment |
🧪 Testing Tips
- Test with a spreadsheet containing 2–3 simple sheets
- Validate whether all sheets are duplicated
- Check if columns and data structure remain intact
- Watch for authentication issues in Google Sheets nodes
🏷️ Suggested Tags & Categories
#GoogleSheets #Automation #DataEnrichment #Workflow #Spreadsheet
n8n Workflow: Google Sheets Duplication and Enrichment Automation
This n8n workflow provides a robust solution for automating the duplication of data within Google Sheets, with the flexibility to enrich or transform the data during the process. It's designed for scenarios where you need to copy rows from one sheet to another, potentially modifying them based on specific business logic.
What it does
This workflow is currently a foundational template. When fully configured, it would typically perform the following steps:
- Manual Trigger: The workflow is initiated manually by clicking 'Execute workflow' in n8n.
- Google Sheets Read: It would read data from a specified Google Sheet. This could be an entire sheet, a range, or filtered rows.
- Loop Over Items: The workflow would then process the retrieved data in batches, allowing for efficient handling of large datasets.
- Code Execution: Within the loop, a Code node is available to apply custom JavaScript logic. This is where you would define rules for data transformation, enrichment (e.g., calling external APIs to add more information), or conditional processing before duplication.
- HTTP Request: An HTTP Request node is included, suggesting the capability to interact with external APIs or services. This could be used for data enrichment, validation, or posting results to another system.
- Google Sheets Write: Finally, the processed or enriched data would be written back to a Google Sheet (either the same one or a different one, depending on configuration), effectively duplicating and enriching the original data.
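For illustration, the Code node's transformation could look like the sketch below. The field names (`email`, `firstName`, `lastName`) and the `enrichRow` helper are hypothetical placeholders; replace them with your own business logic.

```javascript
// Sketch of per-row enrichment logic for the Code node.
// All field names here are placeholders, not the template's schema.
function enrichRow(row) {
  return {
    ...row,
    email: (row.email || "").trim().toLowerCase(),
    fullName: `${row.firstName || ""} ${row.lastName || ""}`.trim(),
  };
}

const rows = [{ firstName: "Ada", lastName: "Lovelace", email: " ADA@EXAMPLE.COM " }];
console.log(rows.map(enrichRow)[0].email); // "ada@example.com"
```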
Prerequisites/Requirements
- n8n Instance: A running n8n instance (cloud or self-hosted).
- Google Sheets Account: Access to a Google Sheets account and the specific spreadsheets you intend to work with.
- Google Sheets Credentials: Configured Google Sheets OAuth2 or API Key credentials within n8n.
- API Keys/Credentials (Optional): If the HTTP Request node is configured to interact with external services, relevant API keys or authentication details for those services will be required.
Setup/Usage
- Import the Workflow:
- Copy the provided JSON code.
- In your n8n instance, go to "Workflows" and click "New".
- Click the three dots menu (...) in the top right, then select "Import from JSON".
- Paste the JSON code and click "Import".
- Configure Credentials:
- Locate the "Google Sheets" node.
- Click on the "Credential" field and select your existing Google Sheets credential or create a new one. Follow the n8n documentation for setting up Google Sheets credentials if needed.
- If the "HTTP Request" node is to be used, configure its authentication method (e.g., API Key, OAuth2) based on the external service it will connect to.
- Customize Nodes:
- Google Sheets (Read): Configure this node to specify the spreadsheet and sheet name you want to read data from. You might also add filters or ranges.
- Code: Open the "Code" node and write your custom JavaScript logic to transform or enrich the data as needed. This is crucial for defining how the data is modified before duplication.
- HTTP Request: If you plan to enrich data from an external API, configure the URL, method, headers, and body of this node.
- Google Sheets (Write): Add another "Google Sheets" node after the "Code" or "HTTP Request" node. Configure it to write the processed data to your desired destination sheet. You might use the "Append Row" or "Update Row" operation.
- Test the Workflow:
- Click "Execute Workflow" on the "When clicking 'Execute workflow'" node to run the workflow manually and verify its functionality.
- Check the output of each node to ensure data is being processed as expected.
- Activate the Workflow:
- Once tested and configured, activate the workflow by toggling the "Active" switch in the top right corner of the workflow editor.
Related Templates
Automate Reddit brand monitoring & responses with GPT-4o-mini, Sheets & Slack
How it Works
This workflow automates intelligent Reddit marketing by monitoring brand mentions, analyzing sentiment with AI, and engaging authentically with communities.

Every 24 hours, the system searches Reddit for posts containing your configured brand keywords across all subreddits, finding up to 50 of the newest mentions to analyze. Each discovered post is sent to OpenAI's GPT-4o-mini model for comprehensive analysis. The AI evaluates sentiment (positive/neutral/negative), assigns an engagement score (0-100), determines relevance to your brand, and generates contextual, helpful responses that add genuine value to the conversation. It also classifies the response type (educational/supportive/promotional) and provides reasoning for whether engagement is appropriate.

The workflow intelligently filters posts using a multi-criteria system: only posts that are relevant to your brand, score above 60 in engagement quality, and warrant a response type other than "pass" proceed to engagement. This prevents spam and ensures every interaction is meaningful.

Selected posts are processed one at a time through a loop to respect Reddit's rate limits. For each worthy post, the AI-generated comment is posted, and complete interaction data is logged to Google Sheets, including timestamp, post details, sentiment, engagement scores, and success status. This creates a permanent audit trail and analytics database.

At the end of each run, the workflow aggregates all data into a comprehensive daily summary report with total posts analyzed, comments posted, engagement rate, sentiment breakdown, and the top 5 engagement opportunities ranked by score. This report is automatically sent to Slack with formatted metrics, giving your team instant visibility into your Reddit marketing performance.

Who is this for?
- Brand managers and marketing teams needing automated social listening and engagement on Reddit
- Community managers responsible for authentic brand presence across multiple subreddits
- Startup founders and growth marketers who want to scale Reddit marketing without hiring a team
- PR and reputation teams monitoring brand sentiment and responding to discussions in real time
- Product marketers seeking organic engagement opportunities in product-related communities
- Any business that wants to build an authentic Reddit presence while avoiding spammy marketing tactics

Setup Steps
Setup time: approx. 30-40 minutes (credential configuration, keyword setup, Google Sheets creation, Slack integration)

Requirements:
- Reddit account with OAuth2 application credentials (create at reddit.com/prefs/apps)
- OpenAI API key with GPT-4o-mini access
- Google account with a new Google Sheet for tracking interactions
- Slack workspace with posting permissions to a marketing/monitoring channel
- Brand keywords and subreddit strategy prepared

1. Create a Reddit OAuth application: visit reddit.com/prefs/apps, create a "script" type app, and obtain your client ID and secret.
2. Configure Reddit credentials in n8n: add Reddit OAuth2 credentials with your app credentials and authorize access.
3. Set up the OpenAI API: obtain an API key from platform.openai.com and configure it in n8n OpenAI credentials.
4. Create the Google Sheet: set up a new sheet with columns: timestamp, postId, postTitle, subreddit, postUrl, sentiment, engagementScore, responseType, commentPosted, reasoning.
5. Configure these nodes:
   - Brand Keywords Config: edit the JavaScript code to include your brand name, product names, and relevant industry keywords.
   - Search Brand Mentions: adjust the limit (default 50) and sort preference based on your needs.
   - AI Post Analysis: customize the prompt to match your brand voice and engagement guidelines.
   - Filter Engagement-Worthy: adjust the engagementScore threshold (default 60) based on your quality standards.
   - Loop Through Posts: configure max iterations and batch size for rate-limit compliance.
   - Log to Google Sheets: replace YOURSHEETID with your actual Google Sheets document ID.
   - Send Slack Report: replace YOURCHANNELID with your Slack channel ID.
6. Test the workflow: run it manually first to verify all connections work and adjust the AI prompts.
7. Activate for daily runs: once tested, activate the Schedule Trigger to run automatically every 24 hours.

Node Descriptions
- Daily Marketing Check: schedule trigger runs the workflow automatically every 24 hours
- Brand Keywords Config: JavaScript Code node defining the brand keywords to monitor on Reddit
- Search Brand Mentions: Reddit node searches all subreddits for brand keyword mentions
- AI Post Analysis: OpenAI analyzes sentiment and relevance and generates contextual, helpful comment responses
- Filter Engagement-Worthy: conditional node keeps only high-quality, relevant posts worth engaging
- Loop Through Posts: split-in-batches node processes each post individually, respecting rate limits
- Post Helpful Comment: Reddit node posts the AI-generated comment to worthy discussions
- Log to Google Sheets: appends all interaction data to the spreadsheet for permanent tracking
- Generate Daily Summary: JavaScript aggregates metrics and sentiment breakdown into a comprehensive daily report
- Send Slack Report: posts the formatted daily summary with metrics to the team Slack channel
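As a sketch of what the Brand Keywords Config Code node might contain (the keyword list and the `buildSearchQuery` helper are illustrative, not the template's actual code):

```javascript
// Sketch: turn a keyword list into a Reddit search query string.
// Quoting each term and joining with OR matches any of them.
function buildSearchQuery(keywords) {
  return keywords.map((k) => `"${k}"`).join(" OR ");
}

const keywords = ["AcmeCRM", "Acme CRM", "acmecrm.io"]; // placeholder brand terms
console.log(buildSearchQuery(keywords)); // "AcmeCRM" OR "Acme CRM" OR "acmecrm.io"
```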
Generate Funny AI Videos with Sora 2 and Auto-Publish to TikTok
This automation creates a fully integrated pipeline to generate AI-powered videos, store them, and publish them on TikTok, all automatically. It connects OpenAI Sora 2 (via Fal.ai) and Postiz (for TikTok publishing) to streamline content creation.

Key Benefits
- ✅ Full Automation: from text prompt to TikTok upload, everything happens automatically with no manual intervention once set up.
- ✅ Centralized Control: Google Sheets acts as a simple dashboard to manage prompts, durations, and generated results.
- ✅ AI-Powered Creativity: uses Sora 2 for realistic video generation and GPT-4o-mini for optimized titles.
- ✅ Social Media Integration: seamlessly posts videos to TikTok via Postiz, ready for your audience.
- ✅ Scalable & Customizable: easily extended to other platforms such as YouTube, Instagram, or LinkedIn.
- ✅ Time-Saving: eliminates repetitive steps like manual video uploads and caption writing.

How it works
This workflow automates the end-to-end process of generating AI videos and publishing them to TikTok. It is triggered either manually or on a recurring schedule.
- Trigger & Data Fetch: the workflow starts by checking a specified form for new entries. It looks for rows where a video has been requested (the "PROMPT" field is filled) but not yet generated (the "VIDEO" column is empty).
- AI Video Generation: for each new prompt found, the workflow sends a request to the Fal.ai Sora 2 model to generate a video. It then enters a polling loop, repeatedly checking the status of the generation request every 60 seconds until the video is "COMPLETED".
- Post-Processing & Upload: once the video is ready, the workflow performs several actions in parallel:
  - Fetch Video & Store: retrieves the final video URL and downloads the video file.
  - Generate Title: uses the OpenAI GPT-4o-mini model to analyze the original prompt and generate an optimized, engaging title for the video.
  - Publish to TikTok: uploads the video file to Postiz, a social media scheduling tool, which then automatically publishes it to a connected TikTok channel, using the AI-generated title as the post's caption.

Set up steps
To make this workflow functional, complete the following configuration steps:
1. Prepare the input form: create a form with at least "PROMPT", "DURATION", and "VIDEO" fields.
2. Configure Fal.ai for video generation: create an account at Fal.ai and obtain your API key. In both the "Create Video" and "Get status" HTTP Request nodes, set up the "Header Auth" credential: set the Name to Authorization and the Value to Key YOURAPIKEY.
3. Set up TikTok publishing via Postiz: create an account on Postiz and connect your TikTok account to get a Channel ID. Obtain your Postiz API key. In the "Upload Video to Postiz" and "TikTok" (Postiz) nodes, configure the API credentials. In the "TikTok" node, replace the placeholder "XXX" in the integrationId field with your actual TikTok Channel ID from Postiz.
4. (Optional) Configure AI title generation: the "Generate title" node uses OpenAI. Ensure you have valid OpenAI API credentials configured in n8n for this node to work.

Need help customizing? Contact me for consulting and support or add me on LinkedIn.
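The 60-second polling loop described above can be sketched in plain JavaScript. In the workflow it is built from Wait and If nodes; here `checkStatus` is a stand-in for the "Get status" HTTP request.

```javascript
// Sketch of the status-polling loop. checkStatus is a stand-in for
// the "Get status" HTTP Request node; intervalMs defaults to 60s.
async function pollUntilComplete(checkStatus, intervalMs = 60_000) {
  for (;;) {
    const result = await checkStatus();
    if (result.status === "COMPLETED") return result.videoUrl;
    await new Promise((resolve) => setTimeout(resolve, intervalMs));
  }
}
```

When testing locally, pass a short `intervalMs` so the loop resolves quickly.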
Generate WordPress blog posts with GPT-4O and Pixabay featured images via form
This workflow automates the creation of a draft article for a blog.

Use Cases
- Rapidly generate blog content from simple prompts.
- Ensure content consistency and speed up time-to-publish.
- Automatically source and attach relevant featured images.
- Save your digital marketing team significant time.

Prerequisites/Requirements
- An OpenAI API key (for GPT-4o).
- A Pixabay API key (for image sourcing).
- A WordPress site URL and API credentials (username/password or application password).

Customization Options
- Adjust the AI prompt in the AI Content Generation node to change the content tone and style.
- Modify the search query in the Pixabay Query HTTP node to influence the featured image selection.
- Change the reviewer email address in the final Send Review Notification node.
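To illustrate the Pixabay customization point, here is a sketch of how the Pixabay Query HTTP node's URL might be built. The parameter names follow the public Pixabay API; `pixabayUrl` and the example values are illustrative, not the template's actual configuration.

```javascript
// Sketch: build the Pixabay search URL used for featured images.
// key, q, image_type, and per_page are documented Pixabay parameters.
function pixabayUrl(apiKey, query) {
  const params = new URLSearchParams({
    key: apiKey,
    q: query,
    image_type: "photo",
    per_page: "3",
  });
  return `https://pixabay.com/api/?${params}`;
}

console.log(pixabayUrl("YOUR_API_KEY", "coffee laptop desk"));
```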