Publish a Post to a Publication on Medium
This n8n workflow demonstrates a basic setup for publishing content to Medium. It serves as a starting point for more complex automation involving Medium posts, such as cross-posting from other platforms or scheduling content.
What it does
This workflow contains the following steps:
- Starts the workflow: The Start node initiates the workflow execution. It can be triggered manually or, in a more advanced setup, replaced with a scheduled trigger.
- Publishes to Medium: The Medium node interacts with the Medium API. In its current state, it is ready to be configured to publish a post to a specific publication or user, depending on the chosen operation and the data provided.
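Under the hood, the Medium node wraps Medium's REST API. The sketch below shows roughly what a "Create Post" request to a publication looks like, based on Medium's documented endpoint and fields; the publication ID and token here are placeholders, and this is an illustration rather than the node's actual implementation:

```python
import json

def build_medium_post_request(publication_id: str, token: str,
                              title: str, content: str, tags=None):
    """Build the HTTP request a 'Create Post' call roughly corresponds to.
    Endpoint and payload fields follow Medium's public API docs; the
    publication ID and token are placeholders, not real values."""
    url = f"https://api.medium.com/v1/publications/{publication_id}/posts"
    headers = {
        "Authorization": f"Bearer {token}",
        "Content-Type": "application/json",
    }
    payload = {
        "title": title,
        "contentFormat": "markdown",   # or "html"
        "content": content,
        "tags": tags or [],
        "publishStatus": "draft",      # switch to "public" when ready
    }
    return url, headers, json.dumps(payload)

url, headers, body = build_medium_post_request(
    "abc123", "TOKEN", "Hello", "# Hi from n8n", ["automation"])
```

Posting as a "draft" first is a safe default while testing; flip publishStatus to "public" once the output looks right.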
Prerequisites/Requirements
- n8n Instance: You need a running n8n instance to import and execute this workflow.
- Medium Account: A Medium account is required.
- Medium API Key (Integration Token): You will need to create an integration token in your Medium settings to authenticate the Medium node with your account.
Setup/Usage
- Import the Workflow:
- Copy the workflow JSON provided.
- In your n8n instance, go to "Workflows" and click "New".
- Click the "Import from JSON" button and paste the copied JSON.
- Configure Medium Credentials:
- Click on the Medium node.
- Under "Credentials", click "Create New".
- Select "Medium API" as the credential type.
- Enter your Medium Integration Token (API Key) into the "Access Token" field.
- Give your credential a descriptive name (e.g., "My Medium Account") and click "Save".
- Configure the Medium Node:
- With the Medium node selected, choose the desired operation (e.g., "Create Post").
- Provide the necessary details for your post, such as "Title", "Content", "Tags", and, importantly, the "Publication ID" if you intend to publish to a specific publication.
- You can manually enter test data or connect previous nodes to dynamically generate content.
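For reference, the Medium node's settings end up in the workflow JSON roughly along these lines. This is a hedged illustration only; the exact parameter names and node type string depend on your n8n version:

```json
{
  "name": "Medium",
  "type": "n8n-nodes-base.medium",
  "parameters": {
    "resource": "post",
    "operation": "create",
    "publication": "YOUR_PUBLICATION_ID",
    "title": "My first automated post",
    "contentFormat": "markdown",
    "content": "# Hello from n8n"
  }
}
```

Any of these fields can also be an expression referencing data from previous nodes.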
- Test the Workflow:
- Click the "Execute Workflow" button to run a test.
- Review the output of the Medium node to ensure the post was published successfully.
- Activate the Workflow:
- Once configured and tested, activate the workflow by toggling the "Active" switch in the top right corner of the editor.
- For a real-world scenario, you would likely replace the Start node with a trigger such as an HTTP Request node, a Cron node, or a trigger from another application (e.g., a new article in an RSS feed).
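As an illustration of such a cross-posting setup, a Function node placed between an RSS trigger and the Medium node could map each feed item to a post payload. This is a sketch with assumed item fields (title, link, summary), not part of the template itself:

```python
def rss_item_to_post(item: dict) -> dict:
    """Map a minimal RSS item (title/link/summary) to a Medium-style
    post payload. The item field names are illustrative assumptions."""
    return {
        "title": item["title"],
        "contentFormat": "html",
        "content": f"{item['summary']}<p>Originally published at "
                   f"<a href=\"{item['link']}\">{item['link']}</a></p>",
        "publishStatus": "draft",
        "canonicalUrl": item["link"],  # point search engines at the original
    }

post = rss_item_to_post({
    "title": "New release",
    "link": "https://example.com/post",
    "summary": "<p>Highlights of the release.</p>",
})
```

Setting canonicalUrl when cross-posting tells Medium (and search engines) where the original article lives.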
Related Templates
Automate bank statement and invoice reconciliation with GPT and Google Sheets
Workflow steps:
- Manual Trigger: the workflow starts manually to initiate the reconciliation process on demand.
- Fetch Invoices & Bank Statements: retrieves invoice data and bank statement data from Google Sheets for comparison.
- Merge Data: combines both datasets into a single structured dataset for processing.
- Format Payload for AI: a Function node prepares and structures the merged data into a clean JSON payload for AI analysis.
- AI Reconciliation: an AI Agent analyzes the invoice and bank statement data to identify matches, discrepancies, and reconciled entries.
- Parse AI Output: parses the AI response into a structured format suitable for adding back to Google Sheets.
- Update Sheets: adds the reconciled data and reconciliation results into the target Google Sheet for recordkeeping.

Prerequisites:
- OpenAI API Credentials: required for the AI Reconciliation node to process and match transactions. Add your OpenAI API key in n8n under Credentials > OpenAI.
- Google Sheets Credentials: needed to read invoice and bank statement data and to write reconciled results. Add credentials in n8n under Credentials > Google Sheets.
- Google Sheets Setup: the connected spreadsheet must contain the following tabs: Invoices (invoice data), Bank_Statement (bank transaction data), and Reconciled_Data (the AI-processed reconciliation output).
- Tab Structure & Required Headers:
  - Invoices sheet columns: Invoice_ID, Invoice_Date, Customer_Name, Amount, Status
  - Bank_Statement sheet columns: Transaction_ID, Transaction_Date, Description, Debit/Credit, Amount
  - Reconciled_Data sheet columns: Invoice_ID, Transaction_ID, Matched_Status, Remarks, Confidence_Score
- n8n Environment Setup: ensure all nodes are connected correctly and the workflow has permission to access the required sheets. Test each fetch and write operation before running the full workflow.
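The "Format Payload for AI" step amounts to merging the two sheets into one JSON document for the model. A minimal Python sketch using the column names from the tab structure above (the instruction text is illustrative, not the template's actual prompt):

```python
import json

def build_reconciliation_payload(invoices, bank_rows):
    """Combine invoice rows and bank-statement rows (lists of dicts
    keyed by the sheet headers) into one JSON payload for the AI step."""
    return json.dumps({
        "invoices": invoices,
        "bank_statement": bank_rows,
        "instruction": "Match each invoice to a bank transaction; "
                       "report Matched_Status, Remarks, Confidence_Score.",
    }, indent=2)

payload = build_reconciliation_payload(
    [{"Invoice_ID": "INV-1", "Amount": 120.0, "Status": "Sent"}],
    [{"Transaction_ID": "TXN-9", "Amount": 120.0, "Description": "INV-1"}],
)
```

Keeping the two datasets under separate keys (rather than interleaving rows) makes the AI's matching task, and the parsing of its answer, more predictable.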
Upload large files to Dropbox with chunking & web UI progress tracking
Dropbox Large File Upload System

How It Works

This workflow enables uploading large files (300MB+) to Dropbox through a web interface with real-time progress tracking. It bypasses Dropbox's 150MB single-request limit by breaking files into 8MB chunks and uploading them sequentially using Dropbox's upload session API.

Upload Flow:
- User accesses page: visits /webhook/upload-page and sees an HTML form with a file picker and a folder path input.
- Selects file: chooses a file and clicks "Upload to Dropbox".
- JavaScript initiates session: calls /webhook/start-session; Dropbox creates an upload session and returns a sessionId.
- Chunk upload loop: JavaScript splits the file into 8MB chunks and, for each chunk, calls /webhook/append-chunk with the sessionId, offset, and chunk binary data; Dropbox appends the chunk to the session and the progress bar updates (e.g., 25%, 50%, 75%).
- Finalize upload: after all chunks are uploaded, calls /webhook/finish-session with the final offset and target path.
- File committed: Dropbox commits all chunks into the complete file at the specified path (e.g., /Uploads/video.mp4).

Why chunking? The Dropbox API has a 150MB limit for single upload requests. The upload session API (upload_session/start, upload_session/append_v2, upload_session/finish) allows unlimited file sizes via chunking.
Technical Architecture:
- Four webhook endpoints handle the different stages (serve UI, start, append, finish).
- All chunk data is sent as multipart/form-data with binary blobs.
- The Dropbox API requires cursor metadata (session_id, offset) in the Dropbox-API-Arg header.
- autorename: true prevents file overwrites.

Setup Steps (time estimate: ~20-25 minutes the first time)
- Create a Dropbox app: go to the Dropbox App Console, click "Create app", choose the "Scoped access" API, select the "Full Dropbox" access type, and name your app (e.g., "n8n File Uploader"). Under the Permissions tab, enable files.content.write, then copy the App Key and App Secret.
- Configure n8n OAuth2 credentials: in n8n, create a new "Dropbox OAuth2 API" credential, paste the App Key and App Secret, set the OAuth Redirect URL to your n8n instance (e.g., https://your-n8n.com/rest/oauth2-credential/callback), and complete the OAuth flow to get an access token.
- Connect credentials to HTTP nodes: add your Dropbox OAuth2 credential to these three nodes: "Dropbox Start Session", "Dropbox Append Chunk", and "Dropbox Finish Session".
- Activate the workflow: click the "Active" toggle to generate production webhook URLs.
- Customize the default folder (optional): in the "Respond with HTML" node, find the line <input type="text" id="dropboxFolder" value="/Uploads/" ... and change /Uploads/ to your preferred default path.
- Get the upload page URL: copy the production webhook URL from the "Serve Upload Page" node (e.g., https://your-n8n.com/webhook/upload-page).
- Test upload: visit the URL, select a small file first (~50MB), choose a folder path, and click Upload.

Important Notes

File Size Limits:
- Standard Dropbox API: 150MB max per request.
- This workflow: unlimited (tested with 300MB+ files).
- Chunk size: 8MB (configurable via the CHUNK_SIZE variable in the HTML JavaScript).

Upload Behavior:
- Files with the same name are auto-renamed (e.g., video.mp4 becomes video (1).mp4) due to autorename: true.
- The upload is synchronous; the browser must stay open until it completes.
- If the upload fails mid-process, partial chunks remain in the Dropbox session (they expire after 24 hours).

Security Considerations:
- Webhook URLs are public; anyone with the URL can upload to your Dropbox.
- Add authentication if needed (e.g., HTTP Basic Auth on the webhook nodes).
- Consider rate limiting for production use.

Dropbox API Quotas:
- Free accounts: 2GB storage, 150GB bandwidth/day.
- Plus accounts: 2TB storage, unlimited bandwidth.
- Upload sessions expire after 4 hours of inactivity.

Progress Tracking:
- A real-time progress bar shows the percentage (0-100%).
- Status messages: "Starting upload...", "Upload complete!", "Upload failed: [error]".
- The final response includes the file path, size, and Dropbox file ID.

Troubleshooting:
- If chunks fail: check that the Dropbox OAuth token hasn't expired (refresh if needed).
- If the session is not found: ensure the sessionId is passed correctly between steps.
- If finish fails: verify the target path exists and the app has write permissions.
- If the page doesn't load: activate the workflow first to generate the webhook URLs.

Performance:
- 8MB chunks means ~38 requests for a 300MB file.
- Upload speed depends on your internet connection and Dropbox API rate limits.
- Typical: 2-5 minutes for a 300MB file on a good connection.

Pro tip: test with a small file (10-20MB) first to verify credentials and flow, then try larger files. Monitor the n8n execution list to see each webhook call and troubleshoot any failures. For production, consider adding error handling and retry logic in the JavaScript.
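The chunking arithmetic behind the upload loop can be sketched as follows. This mirrors the JavaScript CHUNK_SIZE logic in Python for clarity; it is illustrative, not part of the workflow:

```python
CHUNK_SIZE = 8 * 1024 * 1024  # 8MB, mirroring CHUNK_SIZE in the page's JavaScript

def plan_chunks(file_size: int, chunk_size: int = CHUNK_SIZE):
    """Return (offset, length) pairs for sequential upload-session appends.
    Each append must carry the current offset so the server can detect
    gaps or duplicate chunks."""
    chunks = []
    offset = 0
    while offset < file_size:
        chunks.append((offset, min(chunk_size, file_size - offset)))
        offset += chunk_size
    return chunks

chunks = plan_chunks(300 * 1024 * 1024)  # plan for a 300MB file
```

For a 300MB file this yields 38 appends: 37 full 8MB chunks plus one final 4MB chunk, which is where the "~38 requests" figure above comes from.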
Filter the feedback from Typeform and store in Google Sheets
This workflow allows you to filter positive and negative feedback received from a Typeform form and insert the data into Google Sheets.
- Typeform Trigger node: starts the workflow when a new form is submitted via Typeform.
- Set node: extracts the information submitted in the Typeform.
- IF node: filters positive and negative reviews (i.e., ratings above or below 3 out of 5).
- Google Sheets node: stores the positive and negative reviews and ratings in two different sheets, one for each case.
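The IF node's branching amounts to a simple threshold check, sketched below. Where a rating of exactly 3 lands depends on how the IF node's comparison is configured; this sketch treats it as negative:

```python
def classify_rating(rating: int, threshold: int = 3) -> str:
    """Route a 1-5 rating the way the IF node does: strictly above
    the threshold is positive, at or below is negative (an assumption;
    the template may place 3 on either side)."""
    return "positive" if rating > threshold else "negative"
```

Each branch of the IF node then feeds its own Google Sheets node, writing to the positive or negative sheet respectively.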