Applied Workflow

Extracting Methodology From Execution Data Through CEM

How I turned an unnamed personal methodology into a published, DOI-stamped body of work in 12 days — using the same mechanisms that built the software portfolio.

• 21 working threads across the formalization period
• 5 parallel threads operating simultaneously by Day 7

The Problem

I had been executing at a high level for months. Ten production systems. Over half a million lines of code. Clear patterns in how I worked, how I decided, how I recovered from problems. But none of it was written down. None of it was transferable. If I got hit by a bus, everything I had learned about how to operate at this velocity would vanish. The gap between doing and documenting is where intellectual property dies unrecognized — and I was standing in the middle of that gap.

The obvious answer was to stop building and start writing. But that tradeoff is exactly why most practitioners never formalize anything. The execution that generates methodology is rewarding, measurable, and immediately productive. The documentation that makes it transferable is slow, ambiguous, and produces no revenue. Every day I spent writing papers was a day I was not shipping product. So the methodology stayed in my head, invisible to everyone but me.

It got worse. Even if I committed to the documentation, the evidence was scattered across months of execution — git repositories, conversation logs, financial records, server analytics. Extracting methodology from that mess required forensic analysis of my own work. And traditional formalization assumes a multi-year cycle: practice for years, reflect for months, write for more months, submit for peer review, wait. I did not have years. In competitive fields, the person who publishes first gets the attribution. Speed of formalization is speed of protection.

What Knowledge Formalization Actually Is

Knowledge Formalization is the application of CEM mechanisms to the problem of converting tacit methodology — the patterns, decisions, and heuristics embedded in execution — into published, structured, defensible intellectual property. It treats documentation as a build project, subject to the same mechanisms that govern software production. The product is not code but a body of knowledge: definitions, evidence mapping, theoretical grounding, and publishable documents.

What it provides:

  • A repeatable process for converting execution data into structured methodology at execution velocity
  • Timestamped, DOI-stamped prior art that establishes intellectual property protection

What it does not provide:

  • A substitute for having real execution data — formalization without evidence is just storytelling
  • Automatic validation of claims — formalizing a methodology is not the same as proving it works

The key insight is that formalization does not require a separate skillset or a fundamentally different workflow. When the documentation process runs on the same mechanisms as the execution process — Foundation compounding, Pendulum decisions, Regroup checkpoints — it operates at the same velocity. I did not slow down to document. I ran the documentation through the same system that builds software.

The Formalization Cycle

The process moved through six stages, each feeding the next through Foundation compounding.

Recognition came first. I examined my own execution data and started naming what I saw. Recurring patterns that had been intuitive became observable. What I had been doing unconsciously got a vocabulary. Three initial patterns on Day 1 became the seed of everything that followed.

Structuring organized those patterns into architecture. Which patterns were foundational? Which were mechanisms? Which were emergent applications? Relationships were mapped, hierarchy was established, and the skeleton of the methodology emerged.

Grounding connected each pattern to existing academic literature. This was where AI as a research partner earned its keep. I described how I worked; AI identified parallels: Wright's learning curve, Coase's transaction cost economics, Ericsson's deliberate practice. Connections that would have taken weeks of library research surfaced in minutes.

Evidence Mapping went forensic. Git histories, financial records, quality metrics, operator decision logs — all of it was analyzed to validate each pattern. Where evidence did not support a claim, the claim was revised or killed. No exceptions.
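
To make that concrete, here is a minimal sketch of one forensic pass: counting commits per day in a local git repository, the raw signal behind the output-density table below. The repository path is a placeholder, and the real analysis also covered financial records, quality metrics, and decision logs, which this sketch does not touch.

    # Minimal sketch: per-day commit counts from a local repository.
    # The repo path is a placeholder, not one of the actual portfolio repos.
    import subprocess
    from collections import Counter

    def commits_per_day(repo_path):
        # --pretty=%ad with --date=short emits one YYYY-MM-DD line per commit
        dates = subprocess.run(
            ["git", "-C", repo_path, "log", "--pretty=%ad", "--date=short"],
            capture_output=True, text=True, check=True,
        ).stdout.splitlines()
        return Counter(dates)

    for day, count in sorted(commits_per_day("./example-repo").items()):
        print(day, count)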

Publication shipped documents with timestamps and DOIs. Prior art was established. The methodology transitioned from tacit personal knowledge to published intellectual property — protected at 80% quality rather than waiting for 100%.

Iteration refined everything. Terminology standardized. Architecture consolidated. Components that seemed distinct merged. Components that seemed singular split. Each pass produced a more precise representation.
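
Read as a pipeline, each stage consumes the artifact the previous stage produced, which is what lets Foundation compounding carry across the cycle. The sketch below is only an illustration: the stage names come from the text above, but the artifact labels are my shorthand, not part of the methodology.

    # Illustrative only: the six stages as a chain of artifact hand-offs.
    STAGES = [
        ("Recognition",      "execution data",        "named patterns"),
        ("Structuring",      "named patterns",        "architecture"),
        ("Grounding",        "architecture",          "literature mapping"),
        ("Evidence Mapping", "literature mapping",    "validated claims"),
        ("Publication",      "validated claims",      "timestamped documents"),
        ("Iteration",        "timestamped documents", "refined methodology"),
    ]

    for stage, consumes, produces in STAGES:
        print(f"{stage}: {consumes} -> {produces}")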

What the Data Shows

The formalization trajectory from January 30 to February 10, 2026, demonstrates CEM mechanisms operating on knowledge work at the same velocity observed in the software portfolio.

Output density acceleration:

Day    | Threads | Output
Day 1  | 1       | 1 document: methodology named, three patterns identified
Day 3  | 6       | Architecture locked, definitions written, 6 research docs, 15 publications
Day 4  | 3       | 16 revised papers, 53,400-word manuscript compiled
Day 7  | 5       | 24-paper structure, theory document, glossary, literature mapping

Day 1 produced a one-sheet. Day 4 compiled a book. That acceleration follows the same experience curve observed in the software builds — Foundation depth driving cycle compression regardless of domain.
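
For reference, the experience curve referred to here is Wright's: effort per unit falls by a fixed percentage every time cumulative output doubles. A short sketch of the relationship, using an illustrative 80% learning rate rather than a value fitted to this data:

    # Wright's learning curve: cost of the x-th unit is C(x) = C(1) * x**b,
    # where b = log2(r) and r is the learning rate. r = 0.8 is illustrative,
    # not fitted to the formalization data.
    import math

    def unit_cost(x, first_unit_cost=1.0, learning_rate=0.8):
        return first_unit_cost * x ** math.log2(learning_rate)

    # Each doubling of cumulative output cuts per-unit effort by 20%:
    for x in (1, 2, 4, 8):
        print(x, round(unit_cost(x), 3))   # 1.0, 0.8, 0.64, 0.512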

Pendulum kills during formalization:

Component Killed                        | Reason
"Micro-Loop" (entire mechanism)         | Determined to be an AI-generated artifact, not a genuine CEM mechanism
"Patterns" (entire mechanism)           | Absorbed into Foundation; not standalone
"Feedback Loop" (standalone mechanism)  | Feedback is how Foundation works, not separate
Vision as a single concept              | Split into Vision (the why) and Target (the what)

The Micro-Loop elimination is the strongest signal. It had academic literature mapped, hypothesis groups written, testable predictions generated — significant investment. When analysis revealed it was an artifact of the AI-assisted process rather than a genuine feature of my methodology, I killed it in a single session and updated every document that referenced it. That is the Pendulum operating on knowledge work: advance what is real, kill what is not, regardless of sunk cost.

Three major Regroup events restructured the entire accumulated body of work — a document audit on Day 4, an architecture restructure on Day 6, and a full Version 1 alignment on Day 12 that eliminated non-genuine components, audited every claim, and locked terminology across all documents.

How to Apply It

1. Treat Documentation as a Build Project
Stop thinking of formalization as something separate from execution. Apply the same mechanisms you use to build product: Foundation for accumulated assets, Pendulum for scope decisions, Regroup for quality checkpoints. Documentation that runs on your execution system moves at your execution speed.

2. Start With Recognition, Not Structure
Do not begin by designing a framework. Begin by examining your own execution data and naming what you see. The patterns are already there — in your git histories, your decision logs, your financial records. Recognition precedes structure. Name first, organize second.

3. Use AI as Research Partner, Not Author
AI amplifies pattern recognition, provides structural scaffolding, and enforces cross-document consistency. But you make every kill/keep decision. You provide the tacit knowledge. AI provides research velocity. The division is operator as authority, AI as amplifier. Never reverse it.

4. Publish at 80% and Iterate
Waiting for perfection delays protection with zero benefit. Ship documents at functional quality with timestamps and DOIs. Establish prior art. Then iterate — add rigor, standardize terminology, deepen evidence mapping. The 80% version accomplishes the primary goal of intellectual property protection. Subsequent versions add polish.
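
One concrete way to run step 4: deposit the document on Zenodo, which assigns a DOI at publish time. The sketch below follows Zenodo's public REST API; the token, filename, and metadata are placeholders, and this is an assumption about tooling, not the exact pipeline used for the CEM papers.

    # Hedged sketch: mint a DOI for a draft via Zenodo's REST API.
    # Token, filename, and metadata are placeholders.
    import requests

    API = "https://zenodo.org/api/deposit/depositions"
    TOKEN = {"access_token": "YOUR_TOKEN"}  # placeholder

    # 1. Create an empty deposition
    dep = requests.post(API, params=TOKEN, json={}).json()

    # 2. Upload the document into the deposition's file bucket
    with open("draft.pdf", "rb") as fp:
        requests.put(f"{dep['links']['bucket']}/draft.pdf", params=TOKEN, data=fp)

    # 3. Minimal metadata: the 80% version, enough to establish prior art
    meta = {"metadata": {
        "title": "Working title",
        "upload_type": "publication",
        "publication_type": "workingpaper",
        "description": "Draft published for prior-art timestamping.",
        "creators": [{"name": "Lastname, F."}],
    }}
    requests.put(f"{API}/{dep['id']}", params=TOKEN, json=meta)

    # 4. Publishing is the step that assigns the DOI
    requests.post(f"{API}/{dep['id']}/actions/publish", params=TOKEN)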

References

  1. Wright, T.P. (1936). "Factors Affecting the Cost of Airplanes." Journal of the Aeronautical Sciences, 3(4), 122–128.
  2. Coase, R.H. (1937). "The Nature of the Firm." Economica, 4(16), 386–405.
  3. Ericsson, K.A., Krampe, R.T., & Tesch-Römer, C. (1993). "The Role of Deliberate Practice in the Acquisition of Expert Performance." Psychological Review, 100(3), 363–406.
  4. Polanyi, M. (1966). The Tacit Dimension. University of Chicago Press.
  5. Keating, M.G. (2026). "Foundation." Stealth Labz CEM Papers.
  6. Keating, M.G. (2026). "Pendulum." Stealth Labz CEM Papers.
  7. Keating, M.G. (2026). "Sweeps." Stealth Labz CEM Papers.
  8. Keating, M.G. (2026). "Regroup." Stealth Labz CEM Papers.
  9. Keating, M.G. (2026). "Vision." Stealth Labz CEM Papers.
  10. Keating, M.G. (2026). "Target." Stealth Labz CEM Papers.
  11. Keating, M.G. (2026). "Micro-Triage." Stealth Labz CEM Papers.