This content pipeline was the first thing I built with Marc (my agent). One podcast episode turns into 6 content pieces. All automated. I record a conversation with a founder. By morning, 3 video clips are edited with subtitles, 3 X posts are written and scheduled, and a newsletter draft is ready for review. I touch nothing after hitting the stop button.

Here's the exact pipeline, Step 0 through Step 7, with every agent, every cron job, and every tool explained.

Step 0: Guest Research and Outreach

Before I even record, Mona Lisa (my guest research agent) has already done the prospecting.

She runs on a cron job every Monday and Thursday morning, finds promising founders, researches each one, and drafts a personalized DM for every prospect into Notion.

When I wake up, I open Notion, scan the prospects, tweak the DMs if needed, and send. The research that used to take me 3-4 hours per week now takes 15 minutes of review.

Conversion rate: About 40% of prospects respond positively. The personalization makes the difference. Generic "I'd love to have you on my podcast" messages get ignored. "I saw your tweet about hitting $50K MRR with zero paid ads, that's exactly the kind of story my audience needs to hear" gets responses.

Step 1: Record the Episode (The Only Manual Step)

This is the ONLY thing I do. I sit down, have a conversation with a founder, and hit stop.

That's it. That's my entire job in this pipeline.

The recording goes to Riverside.fm, which auto-exports the video file. Everything after this is hands-free.

Step 2: Auto-Detection and Transcription

Jimmy (my YouTube agent) monitors my podcast RSS feed with a cron job every Tuesday and Friday at 1AM Bali time.

The second a new episode drops, the automation chain starts:

  1. Detection: Jimmy detects the new episode via RSS feed change
  2. Download: Pulls the audio file from the feed
  3. Transcription: Runs OpenAI Whisper for a full word-level transcription with precise timestamps. Every word is mapped to its exact position in the audio.
  4. Storage: Transcript file (with timestamps) saved to the workspace and pushed to Notion

The word-level timestamps are critical. They're what allow the next steps to extract precise clips without manual timecoding. A sentence-level transcript wouldn't cut it.
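As a sketch of what word-level timestamps make possible: given a Whisper-style word list, you can resolve a quoted moment to exact cut points. The transcript shape below mirrors Whisper's `word_timestamps` output; the `find_clip_span` helper is illustrative, not the pipeline's actual code.

```python
# Given word-level timing, map a chosen quote to exact clip boundaries.
# With only sentence-level timestamps, this lookup wouldn't be possible.

def find_clip_span(words, quote):
    """Return (start, end) seconds for the first occurrence of `quote`."""
    target = quote.lower().split()
    tokens = [w["word"].strip().lower().strip(".,!?") for w in words]
    for i in range(len(tokens) - len(target) + 1):
        if tokens[i:i + len(target)] == target:
            return words[i]["start"], words[i + len(target) - 1]["end"]
    return None  # quote not found in transcript

words = [
    {"word": "pricing",      "start": 61.2, "end": 61.6},
    {"word": "is",           "start": 61.6, "end": 61.8},
    {"word": "all",          "start": 61.8, "end": 62.0},
    {"word": "about",        "start": 62.0, "end": 62.3},
    {"word": "positioning.", "start": 62.3, "end": 62.9},
]

print(find_clip_span(words, "all about positioning"))  # (61.8, 62.9)
```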

Step 3: Clip Selection and Copywriting

Claude (my copy editor agent) picks up the transcript at 2AM. His job is twofold: find the best moments and write posts for each one.

Clip Selection

He scans the full transcript and identifies the 3 best clip-worthy moments. Not random quotes, but moments strong enough to stand alone as short clips.

Copywriting

For each clip, Claude writes an X post in my voice: he drafts, scores his own draft, and rewrites until it's worth posting.

Subtitle Notes

Claude also generates Subtitle Notes for each clip: proper nouns, brand names, technical terms. So the subtitles don't say "Proton emails" when the guest said "ProtonMail." This prevents embarrassing errors in the final video.

Everything gets pushed to Notion with the exact timestamps, post copy, and subtitle notes. By 2:30AM, the creative work is done.

Step 4: Automated Video Editing

Adrien (video editor agent) picks up Claude's timestamps at 3:20AM and extracts the clips from the full episode using ffmpeg.

Then he runs the full editing pipeline. Every clip goes through these stages:

  1. Extraction: ffmpeg cuts the clip from the full episode at the exact timestamps Claude specified
  2. Whisper re-transcription: Runs Whisper again on just the clip for word-level timing (more precise than the full-episode transcript)
  3. Smart cuts: A Python script removes filler words ("um," "uh," "you know," "like") and silences longer than 0.5 seconds. This tightens the clip by 15-25% without losing meaning.
  4. Subtitle generation: Another Python script generates SRT files with max 4 words per line, proper sentence boundary handling, and proper nouns verified against Claude's Subtitle Notes
  5. Subtitle burn-in: ffmpeg burns the subtitles directly into the video. Clean white text, no background box, positioned at the lower third.
  6. Final encode: h264 video, AAC audio, 16:9 aspect ratio, under 140 seconds, optimized for X's video player
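Stage 3's smart cuts can be sketched in a few lines: drop filler words, then merge the surviving words into keep-spans, splitting wherever the silence exceeds the threshold. The filler list and 0.5s threshold come from the description above; the function and span format are my own sketch, not the pipeline's actual script.

```python
FILLERS = {"um", "uh", "like"}  # multi-word fillers like "you know"
                                # would need phrase matching; omitted here
MAX_GAP = 0.5  # seconds of silence allowed between kept words

def keep_segments(words):
    """Merge non-filler words into (start, end) spans, splitting
    wherever the gap between kept words exceeds MAX_GAP."""
    spans = []
    for w in words:
        if w["word"].strip().lower().strip(".,") in FILLERS:
            continue  # filler word: cut it out entirely
        if spans and w["start"] - spans[-1][1] <= MAX_GAP:
            spans[-1][1] = w["end"]               # extend current span
        else:
            spans.append([w["start"], w["end"]])  # silence too long: new span
    return [tuple(s) for s in spans]

words = [
    {"word": "So",      "start": 0.0, "end": 0.2},
    {"word": "um",      "start": 0.3, "end": 0.5},  # filler, dropped
    {"word": "pricing", "start": 0.6, "end": 1.0},
    {"word": "matters", "start": 2.1, "end": 2.5},  # 1.1s gap -> new span
]
print(keep_segments(words))  # [(0.0, 1.0), (2.1, 2.5)]
```

Each resulting span can then be handed to ffmpeg as a trim segment, which is how the clip tightens without losing meaning.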

All of this runs through a custom bash script (process_clip.sh) plus Python scripts for smart cuts and SRT generation. No Premiere. No CapCut. No manual editing whatsoever.
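The SRT generation step can be sketched from the spec above (max 4 words per cue, word-level timing). Sentence-boundary handling and the proper-noun check against Claude's Subtitle Notes are omitted; this is a minimal illustration, not the pipeline's actual script.

```python
def srt_timestamp(t):
    """Format seconds as an SRT timestamp (HH:MM:SS,mmm)."""
    ms = int(round(t * 1000))
    h, ms = divmod(ms, 3600_000)
    m, ms = divmod(ms, 60_000)
    s, ms = divmod(ms, 1000)
    return f"{h:02}:{m:02}:{s:02},{ms:03}"

def words_to_srt(words, max_words=4):
    """Group word-timed tokens into numbered SRT cues of at most max_words."""
    cues = []
    for i in range(0, len(words), max_words):
        group = words[i:i + max_words]
        text = " ".join(w["word"] for w in group)
        cues.append(
            f"{len(cues) + 1}\n"
            f"{srt_timestamp(group[0]['start'])} --> {srt_timestamp(group[-1]['end'])}\n"
            f"{text}\n"
        )
    return "\n".join(cues)

words = [{"word": w, "start": 0.2 * i, "end": 0.2 * i + 0.18}
         for i, w in enumerate("this is how the subtitles get chunked".split())]
print(words_to_srt(words))
```

The resulting .srt file is what ffmpeg burns into the video in the next stage.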

Quality Assurance

Bob (QA agent) reviews every clip at 4AM and scores it against a quality checklist.

Score below 10/10? Sent back to Adrien with specific notes on what to fix. This loop runs until every clip passes.

Step 5: Scheduling and Publishing

Dan (X growth agent) picks up the approved clips at 5AM. For each clip, he:

  1. Uploads the video to Typefully using their 3-step media upload API (initialize, upload chunks, finalize)
  2. Creates the draft with Claude's post copy attached to the uploaded video
  3. Schedules the posts at optimal times for different time zones

3 posts, 3 clips, every time a new episode drops. I'm asleep for all of it.
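The initialize/upload-chunks/finalize pattern can be sketched generically. The step names and the injected `send` transport below are placeholders, not Typefully's real endpoints; the point is the three-phase chunked handoff.

```python
def upload_media(send, data, chunk_size=4):
    """Generic initialize -> upload chunks -> finalize flow.
    `send(step, payload)` is an injected transport (in production,
    an HTTP client); step names here are illustrative only."""
    media_id = send("initialize", {"size": len(data)})
    for offset in range(0, len(data), chunk_size):
        send("chunk", {"id": media_id,
                       "offset": offset,
                       "bytes": data[offset:offset + chunk_size]})
    return send("finalize", {"id": media_id})

# Fake in-memory transport to show the flow without a network.
store = {}
def fake_send(step, payload):
    if step == "initialize":
        store["buf"] = bytearray(payload["size"])
        return "media-1"
    if step == "chunk":
        o = payload["offset"]
        store["buf"][o:o + len(payload["bytes"])] = payload["bytes"]
        return None
    return bytes(store["buf"])  # finalize: assembled file

assert upload_media(fake_send, b"clip-bytes") == b"clip-bytes"
```

Injecting the transport keeps the upload logic testable without hitting the real API.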

Performance: Video clips consistently outperform text-only posts by 3-5x on impressions. The combination of a compelling hook (Claude's copy) + visual content (Adrien's editing) + optimal timing (Dan's scheduling) compounds over time.

Step 6: Newsletter Generation

Tyler (newsletter agent) takes the same transcript and works in parallel with the video pipeline.

His process:

  1. Extract: Pulls the top 3-5 insights, frameworks, and actionable takeaways from the episode
  2. Structure: Formats them as a newsletter teaser. Not the full content. Bullet points of what they'll learn. The goal is to drive clicks to the YouTube episode.
  3. Review: Claude reviews the draft for tone, accuracy, and structure before anything goes out
  4. Push: The approved draft gets pushed to Notion (Newsletter Drafts folder) and prepared for Beehiiv

The newsletter is a teaser, not a summary. It makes people want to watch the episode. "Here are 5 things Rob Hoffman learned building 3 SaaS companies to $50K+ MRR. #3 changed how I think about pricing." Then a link to the episode.

Step 7: Analytics Feedback Loop

Loop (analytics agent) scrapes engagement data daily, tracking how every post and clip performs.

Everything lands in my HQ dashboard on Notion. Weekly report generated automatically.

The feedback loop matters because it informs future episodes. If clips about pricing strategy consistently outperform clips about fundraising, I know what my audience wants. When I sit down to record again, I already know which questions to ask.
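The kind of roll-up that drives this decision can be sketched in a few lines; the data shape and topic labels below are hypothetical.

```python
from collections import defaultdict

def avg_impressions_by_topic(posts):
    """Rank clip topics by mean impressions, best first."""
    totals = defaultdict(lambda: [0, 0])  # topic -> [sum, count]
    for p in posts:
        totals[p["topic"]][0] += p["impressions"]
        totals[p["topic"]][1] += 1
    return sorted(((t, s / n) for t, (s, n) in totals.items()),
                  key=lambda x: -x[1])

posts = [
    {"topic": "pricing",     "impressions": 12000},
    {"topic": "fundraising", "impressions": 3000},
    {"topic": "pricing",     "impressions": 8000},
]
print(avg_impressions_by_topic(posts))
# [('pricing', 10000.0), ('fundraising', 3000.0)]
```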

Before vs. After: Manual vs. Automated Content Creation

Here's what this pipeline replaced.

| Task | Before (Manual) | After (Automated) |
|---|---|---|
| Guest research | 3-4 hours/week | 15 min review |
| Transcription | 1 hour (upload to Descript, review, export) | 0 min (automatic) |
| Find best clips | 2 hours (watch full episode, take notes) | 0 min (Claude handles it) |
| Write X posts | 1 hour per post (3 posts = 3 hours) | 0 min (Claude writes, scores, rewrites) |
| Edit video clips | 3-4 hours in Premiere/CapCut | 0 min (ffmpeg + Python scripts) |
| Add subtitles | 30 min per clip (1.5 hours total) | 0 min (automated SRT generation + burn-in) |
| Schedule posts | 30 min (upload, write, schedule) | 0 min (Typefully API) |
| Newsletter | 2 hours (write, format, review) | 15 min review |
| Total per episode | 13-16 hours | 30 min review |

That's 13-16 hours saved per episode. Two episodes per week = 26-32 hours/week. That's a full-time employee's worth of work. Handled by agents running on a $500 Mac Mini.

How the Agents Coordinate (Workflow Overview)

The coordination is time-based using cron jobs. Each agent has a scheduled window and knows what to expect from the previous agent.

Here's the timeline for a single episode (all times Bali/WITA):

  1. 1:00 AM - Jimmy detects new episode, downloads audio, runs Whisper transcription
  2. 2:00 AM - Claude scans transcript, selects 3 clips, writes X posts, generates subtitle notes
  3. 3:20 AM - Adrien extracts clips, runs smart cuts, generates subtitles, burns in, encodes
  4. 4:00 AM - Bob reviews all clips. Pass/fail. Failed clips loop back to Adrien.
  5. 5:00 AM - Dan uploads approved clips to Typefully, creates drafts, schedules posts
  6. 5:30 AM - Tyler generates newsletter teaser from transcript, Claude reviews
  7. 7:00 AM - Morning brief arrives in my Telegram with a summary of everything that happened overnight

Each agent works in its own workspace. They communicate through shared files in Notion and the workspace file system. If one agent fails, the next one in the chain gets a notification and the pipeline pauses at that step.
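The file-based handoff can be sketched like this, assuming each agent checks for its predecessor's output file before starting; the filename and workspace layout are illustrative, not the actual setup.

```python
from pathlib import Path
import tempfile

def ready_to_run(expected_input, workspace="."):
    """Handoff check: an agent only starts if the previous agent's
    output file exists; otherwise the pipeline pauses at this step."""
    path = Path(workspace) / expected_input
    if not path.exists():
        print(f"PAUSE: missing handoff file {path}, notifying operator")
        return False
    return True

# Example: Adrien waits for Claude's clip timestamps.
ws = tempfile.mkdtemp()
print(ready_to_run("clip_timestamps.json", ws))   # False: pipeline pauses
Path(ws, "clip_timestamps.json").write_text("{}")
print(ready_to_run("clip_timestamps.json", ws))   # True: Adrien can start
```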

Bob is the quality gate. Nothing goes public without his 10/10 approval.

The Full Tech Stack

| Tool | Purpose | Cost |
|---|---|---|
| OpenClaw | Agent orchestration, cron jobs, tool execution | Free (open source) |
| Claude Opus (Anthropic) | AI model for all agents | ~$100-200/month API |
| OpenAI Whisper | Audio transcription with word-level timestamps | Minimal (included in API costs) |
| ffmpeg | Video extraction, subtitle burn-in, encoding | Free (open source) |
| Python scripts | Smart cuts, SRT generation, subtitle formatting | Free (custom code) |
| Typefully API | X scheduling with native video upload | ~$12/month |
| Notion API | Content database, tracking, handoffs between agents | Free tier works |
| Beehiiv | Newsletter delivery | Free up to 2,500 subs |
| Mac Mini M4 | Hardware running all agents 24/7 | $500-700 one-time |

Total monthly cost: About $150-225 for everything. Compare that to a video editor ($2,000+/month), social media manager ($1,500+/month), and newsletter writer ($1,000+/month). The ROI is absurd.

How to Build Your Own Content Pipeline

You don't need 13 agents to start. Here's the minimum viable pipeline.

Start Simple: 3 Agents

  1. Main agent (Marc): Handles transcription and coordinates the other two
  2. Copy agent: Writes social posts from transcripts
  3. Scheduling agent: Posts to your platforms via API

This alone saves 5-8 hours per piece of content. You can add video editing, newsletter, and analytics agents later as you get comfortable.

Prerequisites

Learn From Real Setups

Inside OpenClaw Lab, I share the exact SOUL.md files, cron schedules, SOP documents, and Python scripts that power this pipeline. 270+ founders are building similar systems for their own businesses. We do weekly live sessions where I walk through updates to the pipeline and answer questions.

The community includes founders running content pipelines for YouTube channels, podcasts, newsletters, SaaS blogs, and agency clients. If you can create one piece of content, OpenClaw can turn it into ten.

For the full content pipeline with more case studies, read our AI agent content creation guide. New to OpenClaw? Start with our installation guide or read what OpenClaw is first.

Frequently Asked Questions

What is the OpenClaw content machine pipeline?

The OpenClaw content machine pipeline is a 7-step automated workflow that takes raw content (podcast episodes, ideas, research) and produces finished outputs across multiple platforms. It handles research, writing, editing, formatting, scheduling, and publishing.

How do I automate content creation with OpenClaw?

Set up specialized agents for each step: research, writing, editing, and publishing. Use cron jobs to trigger the pipeline on a schedule. Each agent passes its output to the next one, creating an end-to-end content production system.

Can OpenClaw write and publish blog posts automatically?

Yes, OpenClaw can research topics, write blog posts, optimize them for SEO, and publish them to your website. The content machine pipeline automates the entire process from idea to published article with minimal human intervention.

How many pieces of content can the pipeline produce per week?

A well-configured content pipeline can produce 10 to 20+ pieces of content per week across different formats. This includes blog posts, social media threads, newsletter editions, and repurposed content from existing material.

What tools does the OpenClaw content pipeline use?

The pipeline uses web search for research, the file system for drafts, APIs for publishing (Typefully, Beehiiv, WordPress), and browser automation for platforms without APIs. Each tool is accessed through OpenClaw skills and shell commands.

Want the exact playbooks, skill files, and workflows? Join 270+ founders inside OpenClaw Lab.

Join OpenClaw Lab →