Archive Spotify's Discover Weekly playlist
This workflow archives your Spotify Discover Weekly playlist to an archive playlist named "Discover Weekly Archive", which you must create yourself. To change the name of the archive playlist, edit value2 in the "Find Archive Playlist" node.
It is configured to run at 8am on Mondays, a conservative value in case you forgot to set your GENERIC_TIMEZONE environment variable (see the docs here).
Special thanks to erin2722 for creating the Spotify node and harshil1712 for help with the workflow logic.
To use this workflow, you'll need to:
- Create then select your credentials in each Spotify node
- Create the archive playlist yourself
Optionally, you may choose to:
- Edit the archive playlist name in the "Find Archive Playlist" node
- Adjust the Cron node to an earlier time if you know GENERIC_TIMEZONE is set
- Set up an error workflow like this one to be notified if anything goes wrong
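For self-hosted installs, GENERIC_TIMEZONE is usually set when the n8n process or container starts, so the Cron node fires in your local time rather than UTC. A minimal sketch, assuming Docker and the Europe/Berlin timezone (substitute your own):

```shell
# Start n8n with the timezone the Cron node should use.
# GENERIC_TIMEZONE is n8n's environment variable for schedule triggers;
# the timezone value here is only an example.
docker run -it --rm \
  -e GENERIC_TIMEZONE="Europe/Berlin" \
  -p 5678:5678 \
  docker.n8n.io/n8nio/n8n
```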
Archive Spotify's Discover Weekly Playlist
This n8n workflow automates archiving your Spotify Discover Weekly playlist, so you never lose the personalized recommendations that Spotify replaces each week.
What it does
This workflow is designed to be a starting point for a more complex automation. As provided, it performs the following steps:
- Triggers on a Schedule: The workflow starts at a predefined interval (e.g., once a week).
- Interacts with Spotify: A Spotify node performs the Spotify-side actions, such as fetching playlist data.
- Conditional Logic: It incorporates an "If" node, suggesting that subsequent actions will depend on a specific condition being met (e.g., if new songs are found, or if the playlist has been updated).
- Merges Data: A "Merge" node is present, which typically combines data streams from different branches of a workflow.
Note: While the workflow JSON provided sets up the structure for these actions, the specific operations for the Spotify node (e.g., "get playlist," "add tracks") and the conditions for the "If" node are not fully configured in this base definition. This workflow serves as a robust template to build upon.
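As one example of the conditional logic the "If" node could implement, the check below decides whether Discover Weekly contains any tracks not yet in the archive. This is a sketch in the style of an n8n Code node; the function name and track IDs are illustrative, not taken from the workflow JSON:

```javascript
// Given the track IDs currently in Discover Weekly and the IDs already
// archived, return only the IDs that still need to be copied over.
function findNewTracks(discoverIds, archivedIds) {
  const archived = new Set(archivedIds);
  return discoverIds.filter((id) => !archived.has(id));
}

const fresh = findNewTracks(["a1", "b2", "c3"], ["b2"]);
// fresh → ["a1", "c3"]: the tracks that would flow down the "True" branch
```

An empty result would send execution down the "False" branch and skip the archiving steps for that run.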
Prerequisites/Requirements
- n8n Instance: A running n8n instance to host the workflow.
- Spotify Account: A Spotify account is required to interact with the Spotify API.
- Spotify Credentials: You will need to set up Spotify OAuth2 credentials in your n8n instance.
Setup/Usage
- Import the Workflow: Import the provided JSON into your n8n instance.
- Configure Spotify Credentials:
- Locate the "Spotify" node.
- Click on "Credential" and select an existing Spotify OAuth2 credential or create a new one. Follow the n8n documentation for setting up Spotify OAuth2 credentials.
- Configure the Schedule Trigger:
- Locate the "Schedule Trigger" node.
- Adjust the schedule to your preference (e.g., weekly, daily) to determine how often the workflow runs.
- Flesh out Spotify Actions:
- In the "Spotify" node, configure the specific operation you want to perform (e.g., "Get a Playlist," "Add Tracks to a Playlist").
- Specify the Discover Weekly playlist ID (you can usually find this in the Spotify URL when viewing the playlist in a browser).
- Define Conditional Logic:
- In the "If" node, define the condition(s) that should determine the flow's path. For archiving Discover Weekly, this might involve checking if the playlist has new tracks since the last run, or if a specific date has passed.
- Add Archiving Logic:
- Connect the "True" branch of the "If" node to further nodes that will handle the archiving. This could involve:
- Creating a new playlist (using another Spotify node).
- Adding the tracks from Discover Weekly to the new archive playlist.
- Sending a notification to Slack, Discord, or email (using respective n8n nodes) to confirm the archive.
- Activate the Workflow: Once configured, activate the workflow in n8n.
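The setup above mentions finding the Discover Weekly playlist ID in the Spotify URL. A minimal sketch of that extraction in JavaScript (the function name and example ID are illustrative):

```javascript
// Pull the playlist ID out of a Spotify share URL of the form
// https://open.spotify.com/playlist/<id>?si=...
function playlistIdFromUrl(url) {
  const match = url.match(/playlist\/([A-Za-z0-9]+)/);
  return match ? match[1] : null;
}

playlistIdFromUrl("https://open.spotify.com/playlist/37i9dQZF1E4abc?si=xyz");
// → "37i9dQZF1E4abc"
```

In n8n, a snippet like this would live in a Code node, though you can also paste the ID directly into the Spotify node once you have copied it from the browser address bar.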
Related Templates
- Generate Weather-Based Date Itineraries with Google Places, OpenRouter AI, and Slack
- SEO blog content automation with GPT-4o-mini and human approval in Google Docs
- Generate automated social media reports with GPT-4o, Twitter, Facebook & Notion