
Weekly Business Review System

Replace 2-4 hours of manual weekly assembly with a 15-minute automated review

  • 15m weekly generation (down from 2-4 hours manual)
  • 12 requirements shipping (10 more on the roadmap)
  • 10pg executive report (with week-over-week analysis)

The Problem with Manual WBRs

Most weekly reviews are assembled from memory, not data. The tooling gap leads to drift, missed signals, and repeated mistakes.

Without a System

  • 2-4 hours logging into dashboards, chasing teammates, copy-pasting metrics
  • Context reconstructed from memory on Friday night
  • Manual transcription errors in every report
  • No week-over-week comparison unless you do it by hand
  • Decisions based on whatever you remember, not what happened

With the WBR System

  • 15 minutes reviewing AI-generated output
  • Context captured daily in 2-minute structured notes
  • Automated metrics collection across your platforms
  • Week-over-week comparison with deltas and trend indicators
  • Immutable snapshots for historical record

What the Toolchain Does

A single CLI entrypoint routes to over a dozen capabilities. No GUI. No SaaS dependency. Shell scripts, Python, and markdown.

Daily Notes

Structured templates with 13 sections: highlights, lowlights, risks, issues, challenges, observations, metrics, and more. 2-3 minutes per day. Friday aggregation pulls the whole week together automatically.
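The Friday aggregation step can be sketched as a small parser that merges each day's sectioned bullets into one weekly rollup. A minimal sketch; the section names and note format below are assumptions for illustration, not the tool's actual schema:

```python
from collections import defaultdict

SECTIONS = ["Highlights", "Lowlights", "Risks", "Issues",
            "Challenges", "Observations", "Metrics"]

def aggregate_week(note_texts):
    """Merge daily notes (markdown strings) into one dict of
    section -> list of bullet lines, preserving day order."""
    week = defaultdict(list)
    for text in note_texts:
        current = None
        for line in text.splitlines():
            if line.startswith("## "):          # section heading
                current = line[3:].strip()
            elif line.startswith("- ") and current in SECTIONS:
                week[current].append(line[2:].strip())
    return dict(week)

monday = "## Highlights\n- Shipped masking engine\n## Risks\n- API quota nearing limit\n"
tuesday = "## Highlights\n- Two new subscribers\n"
print(aggregate_week([monday, tuesday]))
```

Matching bullets to a known section list means stray prose in a note is ignored rather than misfiled.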

Metrics Pipeline

Automated collection across your platforms: analytics, social, content, Git repos, and more. Configurable per customer. JSON schema validation, markdown rendering, and formatted handoff for WBR generation.
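The schema-validation step can be illustrated with a minimal shape check over the collected JSON. The metric fields (`name`, `value`, `source`, `week`) are illustrative assumptions, not the pipeline's real schema:

```python
import json

# hypothetical minimal shape for one collected metric
METRIC_SCHEMA = {"name": str, "value": (int, float),
                 "source": str, "week": str}

def validate_metrics(payload_json):
    """Validate a JSON array of metrics against the expected
    shape; return (ok, list_of_errors)."""
    errors = []
    for i, metric in enumerate(json.loads(payload_json)):
        for key, typ in METRIC_SCHEMA.items():
            if key not in metric:
                errors.append(f"metric {i}: missing '{key}'")
            elif not isinstance(metric[key], typ):
                errors.append(f"metric {i}: '{key}' has wrong type")
    return (not errors, errors)

good = json.dumps([{"name": "newsletter_subs", "value": 412,
                    "source": "analytics", "week": "2025-W06"}])
print(validate_metrics(good))
```

Validation failing loudly before rendering is what keeps transcription-style errors at zero downstream.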

PDF Generation

Converts WBR markdown to PDF and DOCX. 2x2 executive summary on page 1 with status indicators for every metric. Skips regeneration when output is current.
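Skipping regeneration when output is current is a make-style freshness check. A minimal sketch, assuming a plain mtime comparison (the actual tool may key on checksums instead):

```python
import os, tempfile

def needs_regeneration(src_md, out_pdf):
    """Regenerate only when the PDF is missing or older than
    the markdown source."""
    if not os.path.exists(out_pdf):
        return True
    return os.path.getmtime(src_md) > os.path.getmtime(out_pdf)

# demo with explicit mtimes so the check is deterministic
with tempfile.TemporaryDirectory() as d:
    src, pdf = os.path.join(d, "wbr.md"), os.path.join(d, "wbr.pdf")
    open(src, "w").write("# WBR")
    print(needs_regeneration(src, pdf))   # True: no PDF yet
    open(pdf, "w").write("%PDF")
    os.utime(src, (100, 100)); os.utime(pdf, (200, 200))
    print(needs_regeneration(src, pdf))   # False: PDF is newer
```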

Data Masking

One flag redacts person names, company names, and sensitive references for sharing. Allowlists, CamelCase detection, and markdown-aware parsing preserve formatting while masking content.
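The CamelCase-detection-plus-allowlist idea can be shown with a regex heuristic. This is a simplified sketch, not the tool's masking engine, and the allowlist entries are made up:

```python
import re

ALLOWLIST = {"GitHub", "JavaScript", "LinkedIn"}  # terms never masked

def mask(text, allowlist=ALLOWLIST):
    """Redact CamelCase tokens (a common shape for person and
    company names in notes) unless they are allowlisted."""
    def repl(match):
        word = match.group(0)
        return word if word in allowlist else "[REDACTED]"
    # two or more humps: e.g. AcmeCorp, JaneDoe
    return re.sub(r"\b(?:[A-Z][a-z]+){2,}\b", repl, text)

print(mask("Met JaneDoe from AcmeCorp about the GitHub integration."))
```

The replacement-function form of `re.sub` is what lets the allowlist veto a match without a second pass.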

Immutable Snapshots

Every weekly WBR gets frozen with SHA256 checksums, read-only permissions, and revision tagging. The complete record: WBR markdown, metrics handoff, and validated JSON.
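Freezing a snapshot comes down to hashing the file and dropping write permissions. A minimal sketch of those two steps (revision tagging omitted):

```python
import hashlib, os, stat, tempfile

def freeze(path):
    """Checksum a file and make it read-only (snapshot-style)."""
    digest = hashlib.sha256(open(path, "rb").read()).hexdigest()
    os.chmod(path, stat.S_IRUSR | stat.S_IRGRP | stat.S_IROTH)  # r--r--r--
    return digest

with tempfile.TemporaryDirectory() as d:
    p = os.path.join(d, "wbr-2025-W06.md")
    open(p, "w").write("# WBR 2025-W06\n")
    print(freeze(p))
```

Storing the digest alongside the file means any later tampering is detectable by re-hashing.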

Snapshot Comparison

Side-by-side delta across four dimensions: metadata, metrics (with percent change), content (lines added/removed), and sources. See what changed in the full picture between any two weeks.
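The metrics dimension of that delta can be sketched as a dict comparison with percent change; the metric names below are illustrative:

```python
def metric_deltas(prev, curr):
    """Percent change for metrics present in both snapshots;
    new or dropped metrics are reported separately."""
    both = prev.keys() & curr.keys()
    deltas = {k: round((curr[k] - prev[k]) / prev[k] * 100, 1)
              for k in both if prev[k]}  # skip divide-by-zero
    return {"deltas_pct": deltas,
            "added": sorted(curr.keys() - prev.keys()),
            "removed": sorted(prev.keys() - curr.keys())}

prev = {"subscribers": 400, "page_views": 12000}
curr = {"subscribers": 412, "page_views": 11500, "followers": 980}
print(metric_deltas(prev, curr))
```

Reporting added and removed metrics separately surfaces source changes, not just value changes.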

Multi-Customer Profiles

Named profiles for different businesses or clients. Each profile gets its own configuration, masking rules, metrics targets, and output directory. Switch contexts with a single command.
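A profile could be as simple as a named block in a config file. A hypothetical sketch with invented field names (`output_dir`, `mask`, `targets`); the real tool's profile format may differ:

```python
import json

# hypothetical profile file loaded from disk in the real tool
PROFILES = json.loads("""
{
  "acme":   {"output_dir": "out/acme",   "mask": true,
             "targets": {"subscribers": 500}},
  "zenith": {"output_dir": "out/zenith", "mask": false,
             "targets": {"subscribers": 1200}}
}
""")

def load_profile(name):
    """Return one named profile's config; fail loudly on typos."""
    try:
        return PROFILES[name]
    except KeyError:
        raise SystemExit(f"unknown profile '{name}'; "
                         f"choices: {sorted(PROFILES)}")

print(load_profile("acme")["output_dir"])
```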

The Full Mechanism

A WBR is not just a tool. It is the complete system: challenge, outcomes, tools, adoption, inspection, inputs, and iteration.

Business Challenge

Running a business without systematic weekly reflection leads to drift: missed signals, repeated mistakes, decisions based on memory instead of data.

Desired Outcomes

WBR generation in under 15 minutes. Elimination of manual transcription errors. Week-over-week trend visibility without manual lookback. Context captured daily, not reconstructed Friday night.

Adoption

A single CLI entrypoint with sensible defaults, or a pure-prompt workflow for non-terminal users. Daily notes take 2 minutes. The Friday workflow is one command to start, two prompts to complete, one command to finish.

Inspection

Snapshot comparison surfaces what changed week-over-week: metrics, content, RICO items (risks, issues, challenges, and observations), and data quality. The system surfaces the deltas; you interpret them and decide what to act on.

Inputs

2 minutes per day capturing daily notes. Platform credentials for automated metrics collection. A WBR template defining metric targets and monthly reach ramp.

Iteration

Metric catalog, pluggable ingestion, time comparisons, anomaly detection. Each phase tightens the feedback loop. The system keeps getting better because you use it every week.

RICO Tracking

Risks, Issues, Challenges, and Observations with owners and expected completion dates. P0 and P1 items surface on page 1. Lower-priority items live in the appendix. This is the section that catches the signals you would otherwise miss.
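A RICO item is essentially a small record with a priority that decides its placement. A minimal sketch of that split, with hypothetical field names:

```python
from dataclasses import dataclass

@dataclass
class RicoItem:
    kind: str       # Risk | Issue | Challenge | Observation
    summary: str
    owner: str
    ecd: str        # expected completion date
    priority: int   # 0 = P0 (most urgent)

def split_for_report(items):
    """P0/P1 go to page 1; everything else to the appendix."""
    page1 = [i for i in items if i.priority <= 1]
    appendix = [i for i in items if i.priority > 1]
    return page1, appendix

items = [
    RicoItem("Risk", "API quota nearing limit", "pd", "2025-02-21", 0),
    RicoItem("Observation", "Tuesday posts outperform", "pd", "2025-03-07", 2),
]
page1, appendix = split_for_report(items)
print([i.summary for i in page1])
```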

Measured Outcomes

Results from running this system every week for months

  • 90% time reduction (2-4 hours to 15 minutes)
  • 0 manual entry errors (automated collection)
  • 1 collection prompt (all platforms at once)
  • WoW trend visibility (automatic comparison)

What the Output Looks Like

A 10-page executive report with a 2x2 summary on page 1 and eight detailed appendices

Page 1: Executive Summary

Four quadrants covering everything that matters:

  • Business Update: Projects, content performance, business development, next steps
  • RICO: P0/P1 risks, issues, challenges, and observations with owners and ECDs
  • Key Metrics: Input metrics with targets and status indicators
  • Highlights/Lowlights: What went well and what didn't

Appendices

Eight sections of operational detail:

  • Platform metrics detail across all sources
  • Content engagement metrics
  • Revenue and monetization
  • Publication calendar
  • Action items (new and carryover)
  • Git commit activity by repo and type
  • Additional insights and SMART goals
  • Reach trajectory with monthly targets

See a Real Example: Download a complete masked WBR (PDF) from an actual weekly review. Real data, real structure, real RICO items. Names and companies redacted.

Built with AI-Native Development

22 requirements in EARS format, each with acceptance criteria, traceability, and a dependency graph. Built with Claude Code, Codex, and Kiro.

The Development Pattern

  • Write the requirement with acceptance criteria using SHALL/SHALL NOT language
  • Point an AI coding agent at the spec and let it generate the implementation
  • Run the tests, iterate on failures, refine
  • Snapshot the result as part of the weekly WBR workflow

EARS-format specs with unambiguous acceptance criteria produced an implementation that passed 11 of 12 criteria on the first generation for the masking engine.

What's on the Roadmap

  • Metric Catalog: Central definition for every tracked metric with owners, targets, and guardrails
  • Pluggable Ingestion: Drop-in plugins with native API access for each data source
  • Time Comparisons: Automatic WoW, MoM, MTD, QTD, and YoY calculations
  • Anomaly Detection: Automatic flagging when metrics breach guardrail bounds

About the Founder

Paul Duvall has built software solutions and systems for 30+ years, with deep expertise in cloud and DevSecOps for more than 15 years. He led large-scale engineering and security programs at AWS and co-founded Stelligent, the first company focused exclusively on Continuous Delivery/DevOps on AWS. He has been building applications with AI-assisted development tools since early 2023.

The WBR system described on this page is informed by his experience as a Director at AWS, where Weekly Business Reviews are a core operational mechanism. He built this toolchain for his own business and uses it every Friday. The architecture, the metric design, and the adoption plan are the same things he helps customers implement.

Build Your Own WBR System

Whether you want help designing the architecture, defining the metrics, or building the adoption plan for your business, get in touch.