Claude Code · AI · Obsidian · Automation · Productivity

Daily AI News Digest with Claude Code CLI and Obsidian — Zero Dependencies

How I built a daily news research agent with a 6-line bash script, Claude Code headless mode, and macOS launchd. It searches 11 topics and writes digests directly to Obsidian.

Published April 12, 2026 · 7 min read

TL;DR

A 6-line bash script that runs Claude Code CLI in headless mode every morning at 9:00 AM. It searches the web for news across 11 configurable topics, filters out noise, and writes a formatted markdown digest directly into an Obsidian vault synced via iCloud. Zero dependencies. ~100 lines of config total.

The Problem

As a developer, staying current across multiple technologies is a daily tax. RSS feeds are noisy, Twitter is a time sink, newsletters arrive when you're deep in flow. I needed something that does the research for me and presents the results where I already work — my Obsidian vault.

The typical solution is to build a scraping pipeline: a scheduler, a crawler, an NLP pipeline, a database, a notification service. That's weeks of work for something that might break when a site changes its HTML. I wanted something I could build in an afternoon.

Architecture

The entire system is 4 files and zero dependencies. Here's how it works end to end:

macOS launchd (9:00 AM daily)
  │
  └── digest.sh
        │
        └── claude -p "$(cat prompt.md)" --max-turns 20 --allowedTools Read,WebSearch,WebFetch,Write
              │
              ├── Reads topics.yaml (11 configurable topics)
              ├── WebSearch → finds news for each topic (last 24-48h)
              ├── WebFetch → reads full articles
              ├── Filters noise: old tutorials, promos, AI spam
              └── Write → saves digest to Obsidian Vault
                    │
                    └── ~/Obsidian Vault/digests/2026-04-12.md
                          │
                          └── iCloud sync → available on all devices

The Code (All of It)

The project is intentionally minimal. Every line earns its place.

Entry Point: digest.sh

The entire application is a 6-line bash script:

digest.sh
#!/bin/bash
DIGEST_DIR="$HOME/Developer/news-digest"

claude -p "$(cat "$DIGEST_DIR/prompt.md")" \
  --max-turns 20 \
  --allowedTools Read,WebSearch,WebFetch,Write

The key flags: -p runs Claude in headless mode (no interactive terminal), --max-turns 20 gives the agent enough room to research all topics, and --allowedTools restricts the agent to only reading files, searching the web, and writing the output.
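If the script ever grows, a lightly hardened variant is easy to sketch. The following is my own variation, not part of the project: it adds a lockfile so an overlapping run (say, a manual test colliding with the scheduled one) doesn't start a second agent. The claude invocation itself is unchanged:

```shell
#!/bin/bash
# A lightly hardened sketch of digest.sh (a variation, not the
# original): a lockfile keeps overlapping runs from colliding.
DIGEST_DIR="${DIGEST_DIR:-$HOME/Developer/news-digest}"
LOCKFILE="${LOCKFILE:-/tmp/news-digest.lock}"

run_digest() {
  # mkdir is atomic, so it doubles as a zero-dependency lock.
  if ! mkdir "$LOCKFILE" 2>/dev/null; then
    echo "previous digest still running; skipping" >&2
    return 1
  fi

  claude -p "$(cat "$DIGEST_DIR/prompt.md")" \
    --max-turns 20 \
    --allowedTools Read,WebSearch,WebFetch,Write
  local status=$?

  rmdir "$LOCKFILE"
  return $status
}
```

The script would then end with a plain `run_digest` call; everything else stays the same as the original six lines.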

The Brain: prompt.md

This is where the intelligence lives. The prompt turns Claude into a news research agent with specific instructions on what to find, how to filter, and where to save:

prompt.md
# News Digest Agent

You are a news research agent. Your job is to find today's most important
and interesting news for a senior frontend developer.

## Instructions

1. Read the topics file at ~/Developer/news-digest/topics.yaml
2. For EACH topic, search the web for news from the last 24-48 hours
3. Filter: only include genuinely new and noteworthy items
4. Write the digest as a markdown file to Obsidian Vault digests/YYYY-MM-DD.md
5. IMPORTANT: Use the Write tool to save the file. Do NOT output to stdout.

## Rules

- Language: Ukrainian for summaries, English for titles and technical terms
- If there's no real news for a topic — SKIP IT ENTIRELY
- Prioritize: releases > breaking changes > security > new patterns > discussions
- Max 5 items per topic, sorted by importance
- Include direct links to sources
- Skip promotional content, generic tutorials, and AI-generated spam

Configuration: topics.yaml

Topics are fully configurable — add a new topic and it's included in tomorrow's digest. Each topic has a name and optional context that guides the AI's search:

topics.yaml
topics:
  - name: better-auth
    context: "auth library for TypeScript. New releases, breaking changes"

  - name: Next.js
    context: "GitHub issues, releases, App Router, Turbopack, performance"

  - name: SolidJS
    context: "SolidStart, releases, ecosystem, comparison with React"

  - name: Tailwind CSS
    context: "v4 updates, new utilities, plugins"

  - name: Claude AI
    context: "Anthropic announcements, Claude Code, new models, MCP, API"

  - name: GPT AI
    context: "OpenAI announcements, new models, ChatGPT features"

  - name: React
    context: "React 19+, Server Components, new patterns, ecosystem"

  - name: Apple
    context: "Hardware, software, WWDC, developer tools, Apple Intelligence"

  - name: Notable People
    context: >
      Latest tweets from: Elon Musk, Dario Amodei, Sam Altman,
      Jensen Huang, Andrej Karpathy, Simon Willison, Swyx...

  - name: AI Global
    context: "Major AI news, new models, regulations, open-source AI"
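With no YAML parser in the stack, even validation can stay dependency-free. A grep one-liner (my sketch, assuming the two-space `- name:` indentation shown above) is enough to sanity-check the file before pointing the agent at it:

```shell
# Count topic entries in topics.yaml with plain grep; assumes the
# "  - name:" indentation used in the file above.
count_topics() {
  grep -c '^  - name:' "$1"
}
```

`count_topics topics.yaml` prints the number of configured topics; a sudden drop to 0 after an edit means the indentation broke.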

Scheduling with launchd

On macOS, launchd is the native way to schedule recurring tasks (like cron on Linux). The plist file defines when and how the script runs:

~/Library/LaunchAgents/com.news-digest.plist
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN"
  "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>Label</key>
    <string>com.news-digest</string>

    <key>ProgramArguments</key>
    <array>
        <string>/bin/bash</string>
        <string>/Users/you/Developer/news-digest/digest.sh</string>
    </array>

    <key>StartCalendarInterval</key>
    <dict>
        <key>Hour</key>
        <integer>9</integer>
        <key>Minute</key>
        <integer>0</integer>
    </dict>

    <key>StandardOutPath</key>
    <string>/Users/you/Developer/news-digest/logs/stdout.log</string>
    <key>StandardErrorPath</key>
    <string>/Users/you/Developer/news-digest/logs/stderr.log</string>

    <key>EnvironmentVariables</key>
    <dict>
        <key>PATH</key>
        <string>/usr/local/bin:/opt/homebrew/bin:/usr/bin:/bin</string>
        <key>HOME</key>
        <string>/Users/you</string>
    </dict>
</dict>
</plist>

Install it with launchctl load ~/Library/LaunchAgents/com.news-digest.plist. The script then runs every day at 9:00 AM while you're logged in. If the Mac is asleep at nine, launchd fires the missed job when it wakes; jobs missed while the machine was shut down are skipped.
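One variation worth knowing: StartCalendarInterval also accepts an array of dictionaries, so a weekday-only schedule (my sketch, per launchd.plist(5), where Weekday 1 is Monday) needs no script changes:

```xml
<key>StartCalendarInterval</key>
<array>
    <dict>
        <key>Weekday</key><integer>1</integer>
        <key>Hour</key><integer>9</integer>
        <key>Minute</key><integer>0</integer>
    </dict>
    <!-- ...repeat for Weekday 2 through 5 (Tuesday-Friday) -->
</array>
```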

What the Output Looks Like

Every morning, a new markdown file appears in the Obsidian vault with structured, prioritized news:

digests/2026-04-12.md
---
date: 2026-04-12
---

# News Digest — 2026-04-12

## Next.js

### Next.js 16.1 Released with Improved Turbopack Caching
The new Next.js 16.1 release includes improved caching for Turbopack,
reducing cold-start time by ~40%.
[Link](https://nextjs.org/blog/next-16-1)

### Critical Memory Leak Fix in App Router
Fixed a memory leak in the App Router triggered by frequent switching
between dynamic routes.
[GitHub Issue](https://github.com/vercel/next.js/issues/...)

## Claude AI

### Claude Code 1.5 — MCP Server Auto-Discovery
Anthropic shipped a Claude Code update with automatic discovery
of MCP servers in a project.
[Blog](https://www.anthropic.com/news/...)
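A quick way to confirm the morning run actually landed, from any terminal (a sketch; the vault path is the one assumed throughout this post):

```shell
# Check that today's digest file exists in the vault. date +%F
# prints YYYY-MM-DD, matching the agent's naming scheme.
digest_exists_today() {
  local vault="${1:-$HOME/Obsidian Vault}"
  [ -f "$vault/digests/$(date +%F).md" ]
}
```

For example, `digest_exists_today && echo "digest is in"` makes a handy one-liner for a shell profile.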

By the Numbers

Metric                    Value
Total files in project    4 (digest.sh, prompt.md, topics.yaml, .gitignore)
Lines of code             ~100
External dependencies     0
Setup time                ~10 minutes
Daily execution time      2-5 minutes
Cost per run              ~$0.10-0.30 (Claude API usage)

Key Design Decisions

  • Claude Code CLI over API — no need to manage API keys, HTTP clients, or response parsing. The CLI handles authentication, tool execution, and error recovery out of the box
  • Obsidian over email — digests are searchable, linkable, and permanent. They live alongside my notes rather than drowning in an inbox
  • launchd over cron — launchd is the macOS-native scheduler. It handles missed runs, logging, and environment variables cleanly
  • YAML for topics — adding a new topic is a 2-line change. No code modifications needed
  • Skip empty topics — if there's no real news, the section is omitted entirely. No filler, no stretched content

How to Build Your Own

You can have this running in 10 minutes:

  1. Install Claude Code CLI and authenticate
  2. Clone the repo: git clone https://github.com/oleksiimazurenko/news-digest
  3. Edit topics.yaml with your interests and prompt.md with your output path
  4. Edit the plist file with your username and paths, then launchctl load it
  5. Wait for 9:00 AM — or test manually with bash digest.sh

Conclusions

The most interesting thing about this project is what's not in it. No database, no API server, no Docker container, no npm packages, no Python virtualenv, no HTML parser, no NLP pipeline. Just a prompt, a topic list, and a 6-line script that delegates all the hard work to an AI agent.

This is what building with AI agents looks like in practice: you define the what and the where, and the agent handles the how. The total development time was about 2 hours, including iterating on the prompt to get the filtering quality right.
