Bookmark URLs in your browser and save them to Notion
Remember when you were deep in research and wanted to quickly bookmark a page and save it, only to find that every option was premium? Worry not; n8n has you covered.
You can now build a simple bookmarking app right in your browser using short scripts called bookmarklets.
A bookmarklet is a bookmark stored in a web browser that contains JavaScript commands that add new features to the browser.
To create one, we need to add a short script to the bookmarks bar of our browser, as shown below. A simple hack is to open a new tab and click the star that appears on the right side of the address bar.
Now that we have our bookmark, it's time for the fun part.
Right-click on the bookmark we just created and select the edit option. This lets you set the name you want for your bookmark and its destination URL. The URL used here will be the script that "captures" the page we want to bookmark.
The code below has been tested to work for this example:
```javascript
javascript:(() => {
  // Capture the URL of the page currently open in the browser
  var currentUrl = window.location.href;
  // Replace $yourN8nInstanceUrl with your actual n8n instance URL
  var webhookUrl = 'https://$yourN8nInstanceUrl/webhook/1rxsxc04b027-39d2-491a-a9c6-194289fe400c';
  // POST the URL to the n8n webhook as JSON
  var xhr = new XMLHttpRequest();
  xhr.open('POST', webhookUrl, true);
  xhr.setRequestHeader('Content-Type', 'application/json');
  var data = JSON.stringify({ url: currentUrl });
  xhr.send(data);
})();
```
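If you also want to capture the page title (the Notion mapping later in this guide references a `title` property), a fetch-based variant of the bookmarklet can send both fields. This is a sketch, not part of the original example; the `buildBookmarkPayload` helper is introduced here purely for illustration.

```javascript
// Illustrative helper (not part of the original bookmarklet): builds the
// JSON body the bookmarklet would send, including the page title.
function buildBookmarkPayload(url, title) {
  return JSON.stringify({ url: url, title: title });
}

// A fetch-based bookmarklet variant (browser-only, shown as a comment):
// javascript:(() => {
//   fetch('https://$yourN8nInstanceUrl/webhook/1rxsxc04b027-39d2-491a-a9c6-194289fe400c', {
//     method: 'POST',
//     headers: { 'Content-Type': 'application/json' },
//     body: JSON.stringify({ url: window.location.href, title: document.title }),
//   });
// })();

console.log(buildBookmarkPayload('https://example.com', 'Example Page'));
```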
Your bookmark should look something like this.
Now that we have this set up, we head over to n8n to receive the data sent by this script.
Create a new Webhook node that receives the POST request, as in the workflow, and replace $yourN8nInstanceUrl with your actual n8n instance URL.
This workflow can then be configured to send the data to a Notion database. Make sure your Notion integration has the required permissions on the database before executing the workflow; otherwise, the URLs will not be saved.
Bookmark URLs in Your Browser and Save Them to Notion
This n8n workflow provides a simple way to capture URLs and save them directly to a Notion database. It acts as a backend for a browser extension or a custom tool that can send HTTP requests.
What it does
- Listens for incoming webhooks: The workflow starts by exposing a webhook URL. When an HTTP POST request is sent to this URL, it triggers the workflow.
- Saves to Notion: The data received from the webhook (presumably a URL and possibly other information) is then used to create a new item in a specified Notion database.
Prerequisites/Requirements
- n8n Instance: A running n8n instance to host this workflow.
- Notion Account: An active Notion account.
- Notion Integration: A Notion integration with access to the database where you want to save the bookmarks.
- Browser Extension/Tool: A browser extension or custom script capable of sending HTTP POST requests to a specified URL.
Setup/Usage
- Import the workflow: Import the provided JSON into your n8n instance.
- Configure Notion Credentials:
- Click on the "Notion" node.
- Under "Credentials", select an existing Notion API credential or create a new one.
- To create a new credential, you'll need an "Internal Integration Token" from Notion. Follow the n8n documentation for Notion credentials for detailed steps on how to create one and grant it access to your Notion database.
- Configure Notion Database:
- In the "Notion" node, select the "Database" operation.
- Choose "Create" for the action.
- Select the specific Notion database where you want to store your bookmarks.
- Map the incoming data from the "Webhook" node to the properties of your Notion database (e.g., {{ $json.url }} to a URL property, {{ $json.title }} to a title property).
- Activate the Webhook:
- Click on the "Webhook" node.
- Copy the "Webhook URL". This is the URL you will use in your browser extension or custom tool.
- Ensure the workflow is activated (toggle the "Active" switch in the top right of the n8n editor).
- Integrate with your Browser/Tool:
- Configure your browser extension or custom tool to send an HTTP POST request to the copied Webhook URL whenever you want to bookmark a page.
- The request body should contain the URL and any other relevant information you want to save to Notion (e.g., {"url": "https://example.com", "title": "Example Page"}).
- The "Sticky Note" node serves as a comment for the workflow and does not require configuration.
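Before wiring up a browser extension, you can verify the webhook end-to-end from Node.js by simulating the POST the bookmarklet would send. This is only a sketch: the webhook URL is a placeholder to substitute with your own, and it assumes Node 18+ (which ships a global `fetch`).

```javascript
// Sketch: simulate the POST a bookmarklet or extension would send.
// WEBHOOK_URL is a placeholder — substitute your own n8n webhook URL.
const WEBHOOK_URL = 'https://$yourN8nInstanceUrl/webhook/1rxsxc04b027-39d2-491a-a9c6-194289fe400c';

const payload = { url: 'https://example.com', title: 'Example Page' };
const body = JSON.stringify(payload);

// Uncomment to actually send (Node 18+ has a global fetch):
// fetch(WEBHOOK_URL, {
//   method: 'POST',
//   headers: { 'Content-Type': 'application/json' },
//   body,
// }).then((res) => console.log('webhook responded with', res.status));

console.log(body);
```

If the workflow is active and the mapping is correct, a new item should appear in your Notion database shortly after sending.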
Related Templates
Automate ETL error monitoring with AI classification, Sheets logging & Jira alerts
ETL Monitoring & Alert Automation: Jira & Slack Integration

This workflow automatically processes ETL errors, extracts important details, generates a preview, creates a log URL, classifies the issue using AI and saves the processed data into Google Sheets. If the issue is important or needs attention, it also creates a Jira ticket automatically. The workflow reduces manual debugging effort, improves visibility and ensures high-severity issues are escalated instantly without human intervention.

Quick Start – Implementation Steps
- Connect your webhook or ETL platform to trigger the workflow.
- Add your OpenAI, Google Sheets and Jira credentials.
- Enable the workflow.
- Send a sample error to verify Sheets logging and Jira ticket creation.
- Deploy and let the workflow monitor ETL pipelines automatically.

What It Does
This workflow handles ETL errors end-to-end by:
- Extracting key information from ETL error logs.
- Creating a short preview for quick understanding.
- Generating a URL to open the full context log.
- Asking AI to identify root cause and severity.
- Parsing the AI output into clean fields.
- Saving the processed error to Google Sheets.
- Creating a Jira ticket for medium/high-severity issues.
This creates a complete automated system for error tracking, analysis and escalation.

Who's It For
- DevOps & engineering teams monitoring data pipelines.
- ETL developers who want automated error reporting.
- QA teams verifying daily pipeline jobs.
- Companies using Jira for issue tracking.
- Teams needing visibility into ETL failures without manual log inspection.

Requirements to Use This Workflow
- n8n account or self-hosted instance.
- ETL platform capable of sending error payloads (via webhook).
- OpenAI API Key.
- Google Sheets credentials.
- Jira Cloud API credentials.
- Optional: log storage URL (S3, Supabase, server logs).

How It Works & Setup Steps
- Get ETL Error (Webhook Trigger): Receives the ETL error payload and starts the workflow.
- Prepare ETL Logs (Code Node): Extracts important fields, makes a clean version of the error, and generates a direct link to open the full ETL log.
- AI Severity Classification (OpenAI / AI Agent): AI analyzes the issue, identifies the cause and assigns severity.
- Parse AI Output (Code Node): Formats AI results into clean fields: severity, cause, summary, recommended action.
- Prepare Data for Logging (Set / Edit Fields): Combines all extracted info into one final structured record.
- Save ETL Logs (Google Sheets Node): Logs each processed ETL error in a spreadsheet for tracking.
- Create Jira Ticket (Jira Node): Automatically creates a Jira issue when severity is Medium, High or Critical.
- ETL Failure Alert (Slack Node): Sends a Slack message to notify the team about the issue.
- ETL Failure Notify (Gmail Node): Sends an email with full error details to the team.

How to Customize Nodes
- ETL Log Extractor: Add or remove fields based on your ETL log structure.
- AI Classification: Modify the OpenAI prompt for custom severity levels or deep-dive analysis.
- Google Sheets Logging: Adjust columns for environment, job name or log ID.
- Jira Fields: Customize issue type, labels, priority and assignees.

Add-Ons (Extend the Workflow)
- Send Slack or Teams alerts for high-severity issues
- Store full logs in cloud storage (S3, Supabase, GCS)
- Add daily/weekly error summary reports
- Connect monitoring tools like Datadog or Grafana
- Trigger automated remediation workflows

Use Case Examples
- Logging all ETL failures to Google Sheets
- Auto-creating Jira tickets with AI-driven severity
- Summarizing large logs with AI for quick analysis
- Centralized monitoring of multiple ETL pipelines
- Reducing manual debugging effort across teams

Troubleshooting Guide

| Issue | Possible Cause | Solution |
|-------|----------------|----------|
| Sheets not updating | Wrong Sheet ID or missing permission | Reconnect and reselect the sheet |
| Jira ticket fails | Missing required fields or invalid project key | Update Jira mapping |
| AI output empty | Invalid OpenAI key or exceeded usage | Check API key or usage limits |
| Severity always "low" | Prompt too broad | Adjust AI prompt with stronger rules |
| Log preview empty | Incorrect error field mapping | Verify the structure of the ETL error JSON |

Need Help?
For assistance setting up this workflow, customizing nodes or adding additional features, feel free to contact our n8n developers at WeblineIndia. We can help configure, scale or build similar automation workflows tailored to your ETL and business requirements.
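To make the "Parse AI Output" step above concrete: the template's actual Code node isn't included here, so the following is only a hypothetical sketch of how such a node might turn a plain-text AI reply into clean fields (severity, cause, summary). The labelled-line response format and the field names are assumptions for illustration.

```javascript
// Hypothetical sketch of a "Parse AI Output" step: extract labelled
// lines like "Severity: High" and "Cause: ..." from an AI text reply.
// The label format is an assumption, not the template's actual contract.
function parseAiOutput(text) {
  const grab = (label) => {
    const m = text.match(new RegExp('^' + label + ':\\s*(.+)$', 'mi'));
    return m ? m[1].trim() : '';
  };
  return {
    severity: grab('Severity'),
    cause: grab('Cause'),
    summary: grab('Summary'),
  };
}

const sample = 'Severity: High\nCause: Database connection timeout\nSummary: Nightly load failed';
console.log(parseAiOutput(sample));
```

Inside n8n, a Code node would apply a function like this to each incoming item before passing the structured fields on to the Sheets and Jira nodes.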
Monitor Furusato Nozei market trends with Google News, AI analysis and Slack reporting
Analyze Furusato Nozei trends from Google News to Slack

This workflow acts as a specialized market analyst for Japan's "Furusato Nozei" (Hometown Tax) system. It automates the process of monitoring related news, validating keyword popularity via search trends, and delivering a concise, strategic report to Slack. By combining RSS feeds, AI agents, and real-time search data, this template helps marketers and municipal researchers stay on top of the highly competitive Hometown Tax market without manual searching.

👥 Who is this for?
- Municipal Government Planners: To track trending return gifts and competitor strategies.
- E-commerce Marketers: To identify high-demand keywords for Furusato Nozei portals.
- Content Creators: To find trending topics for blogs or social media regarding tax deductions.
- Market Researchers: To monitor the seasonality and shifting interests in the Hometown Tax sector.

⚙️ How it works
- News Ingestion: The workflow triggers on a schedule and fetches the latest "Furusato Nozei" articles from Google News via RSS.
- AI Analysis & Extraction: An AI Agent (using OpenRouter) summarizes the news cluster and identifies the most viable search keyword (e.g., "Scallops," "Travel Vouchers," or specific municipalities).
- Data Validation: The workflow queries the Google Trends API (via SerpApi) to retrieve search volume history for the extracted keyword in Japan.
- Strategic Reporting: A second AI Agent analyzes the search trend data alongside the keyword to generate a market insight report.
- Delivery: The final report is formatted and sent directly to a Slack channel.

🛠️ Requirements
To use this workflow, you will need:
- n8n (Version 1.0 or later recommended).
- OpenRouter API Key (or you can swap the model nodes for OpenAI/Anthropic).
- SerpApi Key (required to fetch Google Trends data programmatically).
- Slack Account (with permissions to post to a channel).

🚀 How to set up
- Configure Credentials: Add your OpenRouter API key to the Chat Model nodes. Add your SerpApi key to the Google Trends API node. Connect your Slack account in the Send a message node.
- Check the RSS Feed: The RSS Read node is pre-configured for "Furusato Nozei" (ふるさと納税). You can leave this as is.
- Regional Settings: The workflow is pre-set for Japan (jp / ja). If you need to change this, check the Workflow Configuration and Google Trends API nodes.
- Schedule: Enable the Schedule Trigger node to run at your preferred time (default is 9:00 AM JST).

🎨 How to customize
- Change the Topic: While this is optimized for Furusato Nozei, you can change the RSS feed URL to track other Japanese market trends (e.g., NISA, Inbound Tourism).
- Swap AI Models: The template uses OpenRouter, but you can easily replace the "Chat Model" nodes with OpenAI (GPT-4) or Anthropic (Claude) depending on your preference.
- Adjust AI Prompts: The AI prompts are currently in Japanese to match the content. You can modify the system instructions in the AI Agent nodes if you prefer English reports.
AI-powered content factory: RSS to blog, Instagram & TikTok with Slack approval
This workflow automates the daily content creation process by monitoring trends, generating drafts for multiple platforms using AI, and requiring human approval before saving. It acts as an autonomous "AI Content Factory" that turns raw news into polished content for SEO Blogs, Instagram, and TikTok/Reels.

How it works
- Trend Monitoring: Fetches the latest trend data via RSS (e.g., Google News or Google Trends).
- AI Filtering: An AI Agent acts as an "Editor-in-Chief," selecting only the most viral-worthy topics relevant to your niche.
- Multi-Format Generation: Three specialized AI Agents (using gpt-4o-mini for cost efficiency) run in parallel to generate: an SEO-optimized Blog post structure, an Instagram Carousel plan (5 slides), and a Short Video Script (TikTok/Reels).
- Human-in-the-Loop: Sends a formatted message with interactive buttons to Slack. The workflow waits for your decision.
- Final Storage: If approved, the content is automatically appended to Google Sheets.

Who is this for
- Social Media Managers & Content Creators
- Marketing Agencies managing multiple accounts
- Anyone wanting to automate "research to draft" without losing quality control.

Requirements
- n8n: Version 1.19.0+ (requires AI Agent nodes).
- OpenAI: API Key (works great with low-cost gpt-4o-mini).
- Slack: A workspace to receive notifications.
- Google Sheets: To store the approved content.

How to set up
- Configure Credentials: Set up your OpenAI, Slack, and Google Sheets credentials.
- Slack App: Create a Slack App, enable "Interactivity," and set the Request URL to your n8n Production Webhook URL. Add the chat:write scope and install it to your workspace.
- Google Sheet: Create a sheet with columns for Blog, Instagram, and Script (row 1 as headers).
- RSS Feed: Change the RSS node URL to your preferred topic source.