2NDLAW: epistemic governance for LLMs

Strategic Discipline

The Board-Level View: Risk, Capital, and Strategic Lock-In

This category covers the "why" and "what" of AI adoption: alignment with organizational goals, ethical governance, and long-term planning. The perspectives here address the critical decisions that precede and guide the technical work.

Propagation, Propaganda, and the Price of Truth

(On why reach dominates accuracy in attention-driven systems.)

Information optimized for spread outcompetes information optimized for accuracy. AI accelerates this asymmetry by collapsing the cost of fluent narrative production toward zero while the cost of verification stays fixed. Truth doesn't win by default; it survives only where systems force claims to answer to something other than attention.

Propagation · Narrative control · Virality · Epistemic drift

AI Landscape: Truth, Narrative Control, and the Cost of Refusal

(On the institutional pressures acting on truth at scale.)

Why truth fails by default under machine-scaled narrative systems unless structural cost is imposed at runtime.

Narrative control · Truth at scale · Institutional pressure · Runtime cost

The Epistemological Preemption: Why the Fight Over State AI Laws is a Battle for Historical Truth

(On regulatory preemption, narrative control, and the defense of epistemological integrity.)

How the federal strategy to block state AI laws enables training data manipulation and narrative control, and why model snapshots are essential for preserving truth against categorical deletion of knowledge.

Regulatory preemption · Training data · Model snapshots · Epistemological integrity

The Real AGI Cost Isn't the Model. It's the Decade You Lose.

(On strategic misallocation, opportunity cost, and the capital trap of TORE.)

AGI's real cost isn't the training bill—it's organizing a company around the wrong premise. When firms commit to TORE (Train Once, Reason Everywhere), they risk losing a decade to capital misallocation, overbuilt infrastructure, and strategic lock-in. The Windows Phone era is the closest preview we have of how an AGI bet can fail: not as a single bad project, but as a lost decade.

Strategic misallocation · Opportunity cost · Capital trap · TORE

The AGI Delta Problem: Why Every Year of Progress Shrinks the Case for AGI

(On time, delta decay, and the moving target of AGI value.)

The AGI dream assumes a future discontinuity—a great leap that renders all current systems obsolete. But that leap only looks dramatic if you imagine the present frozen in amber. The world doesn't freeze. Every year of incremental progress shrinks the delta AGI claims to deliver, making the supposed transformation less transformative with each passing quarter.

Delta decay · Moving targets · Time erosion · Strategic stasis