
194,954 Lines of Code: What a Solo AI-Built Enterprise Platform Looks Like

Building with AI

Key Takeaways
  • Conventional models (COCOMO II, IEEE productivity benchmarks) price a 194,954-line codebase at $2.4M to $3.9M with a multi-person team; PRJ-01 was built for roughly $20,000 in 74 active days by one operator with AI tooling.
  • Output compounded across development phases, accelerating 13.4x from 4.6 commits per day in Phase 1 to 61.5 per day at peak, as each phase's reusable infrastructure reduced friction for the next.
  • Quality held under acceleration: rework fell from 45.2% to 27.0% across phases, and only 8 of 1,394 commits (0.6%) were reverted.
  • For at least one class of platform -- data-intensive, integration-heavy, multi-tenant SaaS -- the threshold at which project scale requires a team has moved.

The Setup

There is a widely held assumption in software engineering that codebase size correlates with team size. Large systems require large teams. The Stack Overflow Developer Survey consistently reports that the majority of professional developers work in teams of 5 to 20 people, and that projects exceeding 100,000 lines of code are almost exclusively maintained by organizations with dedicated engineering departments.

The COCOMO II model -- the standard estimation framework for software project effort -- would estimate that a 194,954-line codebase requires a team of 4 to 8 developers working for 12 to 18 months, roughly 48 to 144 person-months of effort, for a total of $780,000 to $1,560,000 at mid-market US rates. IEEE data on developer productivity places the industry average at 325 to 750 lines of finished code per developer per month for complex systems, which would put a 194,954-line platform at roughly 260 to 600 person-months -- or a 5-person team working for 4 to 10 years.
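For readers who want to trace that arithmetic, the short sketch below reproduces both ranges using only the figures quoted above; no inputs beyond those cited are assumed.

```python
# Back-of-envelope reproduction of the estimation ranges cited above.
# Every input comes from figures quoted in this section.

LOC = 194_954  # PRJ-01 custom code, lines

# Team-size framing: 4-8 developers for 12-18 months
print(f"4 devs x 12 mo = {4 * 12} PM; 8 devs x 18 mo = {8 * 18} PM")

# IEEE-cited productivity band for complex systems (LOC per developer-month)
slow, fast = 325, 750
pm_high = LOC / slow   # slowest case -> most effort (~600 person-months)
pm_low = LOC / fast    # fastest case -> least effort (~260 person-months)
print(f"Effort: {pm_low:.0f} to {pm_high:.0f} person-months")

# Calendar time if a 5-person team absorbs that effort
team = 5
print(f"{team}-person team: {pm_low / team / 12:.1f} to "
      f"{pm_high / team / 12:.1f} years")
```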

These models were calibrated for a world where one developer writes code, another reviews it, a third tests it, a fourth deploys it, and a project manager coordinates all four. That world still exists. But there is now operational evidence of what happens when AI-assisted tooling compresses those roles into a single operator.

PRJ-01 is a Customer Data Platform (CDP) built on Laravel 10. It has 194,954 lines of custom code, 135 database tables, 104 controllers, 59 service files, 64 console commands, and 20 external integrations spanning 12 inbound and 8 outbound systems. It was built in 74 active development days by one operator working with AI tools, with a single contractor (CON-01) contributing 10.7% of commits (149 of 1,394) focused on dashboard work. To put the codebase in context: SQLite has approximately 155,800 lines of code, WordPress has approximately 160,636, and Quake III Arena has approximately 229,000. PRJ-01 sits between WordPress and Quake III Arena by line count.
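The article takes the line count as given. For readers curious how such a figure is typically measured, here is a minimal sketch of one common approach -- counting non-blank lines in source files while excluding dependency directories, roughly what tools like cloc report. The file extensions and excluded directories below are assumptions for a generic Laravel project, not PRJ-01's actual methodology.

```python
# Minimal non-blank-line counter, a rough stand-in for tools like cloc.
# Extensions and exclusions are illustrative assumptions.
from pathlib import Path

SOURCE_EXTENSIONS = {".php", ".js", ".scss"}           # assumed mix for a Laravel app
EXCLUDED_DIRS = {"vendor", "node_modules", "storage"}  # dependencies, not custom code

def count_source_lines(root: str) -> int:
    """Count non-blank lines in source files under root."""
    total = 0
    for path in Path(root).rglob("*"):
        if (path.is_file()
                and path.suffix in SOURCE_EXTENSIONS
                and not any(part in EXCLUDED_DIRS for part in path.parts)):
            with open(path, encoding="utf-8", errors="ignore") as f:
                total += sum(1 for line in f if line.strip())
    return total

if __name__ == "__main__":
    print(count_source_lines("."))
```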

What the Data Shows

The Stack Overflow 2024 Developer Survey reports that 63% of developers use AI tools in their workflow, up from 44% the prior year. But usage statistics do not capture output magnitude. The question is not whether developers use AI, but whether AI-assisted development changes the scale of what a single person can build.

The COCOMO II model provides the clearest external benchmark. For a codebase of 194,954 lines with the complexity characteristics of PRJ-01 -- multi-tenant architecture, 20 external integrations, identity resolution, e-commerce transaction processing, subscription billing -- COCOMO II estimates a development cost between $2,404,350 and $3,900,000. That estimate assumes conventional team structures and productivity rates.
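For reference, the COCOMO II Post-Architecture model computes effort as PM = A × Size^E × EAF, where E = B + 0.01 × ΣSF. The sketch below uses the published COCOMO II.2000 calibration constants; the scale-factor sum and effort-adjustment factor are illustrative placeholders, not values asserted for PRJ-01, and they dominate the result.

```python
# COCOMO II Post-Architecture effort model (COCOMO II.2000 constants).
# scale_factor_sum and eaf below are illustrative assumptions, not
# values asserted for PRJ-01.

A, B = 2.94, 0.91   # published effort-equation calibration constants
C, D = 3.67, 0.28   # published schedule-equation constants

def cocomo_ii(ksloc: float, scale_factor_sum: float, eaf: float):
    """Return (person-months of effort, calendar months of schedule)."""
    E = B + 0.01 * scale_factor_sum               # size exponent
    effort = A * ksloc ** E * eaf                 # person-months
    schedule = C * effort ** (D + 0.2 * (E - B))  # calendar months
    return effort, schedule

# 194,954 LOC is ~195 KSLOC; 18.97 is the nominal scale-factor sum and
# 1.0 a nominal effort-multiplier product -- both assumptions here.
pm, months = cocomo_ii(194.954, scale_factor_sum=18.97, eaf=1.0)
print(f"{pm:.0f} person-months over {months:.0f} calendar months")
```

Which inputs produced the $2.4M-$3.9M range cited above is not stated; nominal settings yield a substantially larger figure, which underscores how sensitive the model is to its parameters.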

The actual build cost: $16,800 in contractor sweep costs plus $3,184 in AI tooling, totaling approximately $20,000. The actual timeline: 74 active development days over a 115-day calendar span (October 8, 2025 to January 31, 2026). The actual team: one operator (Michael George Keating, 86.8% of commits) plus one contractor (CON-01, 10.7%) plus AI tooling (2.5% -- a git configuration artifact of the operator's own commits through Claude Code, not a separate contributor).

The output rate data tells the progression story. Phase 1 (October 8-31) averaged 4.6 commits per day. Phase 2 (November 1-27) averaged 6.4 commits per day. Phase 3 (December 21-31) hit 24.1 commits per day. Phase 4 (January 1-6) peaked at 61.5 commits per day, with a single-day maximum of 89 commits on January 1, 2026. That is a 13.4x output multiplier from the first phase to the peak phase. Against the Sieber & Partners benchmark of 2 commits per day for the median developer, the peak phase represents a 30x multiple on industry median output.
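Those multipliers follow directly from the phase averages; a quick re-derivation:

```python
# Phase-by-phase commit velocity as reported above (commits per day).
phases = {
    "Phase 1 (Oct 8-31)":   4.6,
    "Phase 2 (Nov 1-27)":   6.4,
    "Phase 3 (Dec 21-31)": 24.1,
    "Phase 4 (Jan 1-6)":   61.5,
}

baseline = phases["Phase 1 (Oct 8-31)"]
industry_median = 2.0  # Sieber & Partners median, commits per day

for name, rate in phases.items():
    print(f"{name}: {rate:5.1f}/day  "
          f"({rate / baseline:4.1f}x Phase 1, "
          f"{rate / industry_median:4.1f}x industry median)")
```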

The production data confirms this is not a demo. PRJ-01 has processed 616,543 leads, resolved 958,937 contact points through identity resolution, tracked 530,077 lead activities, captured 503,412 lead events, and processed 75,125 transactions -- all against live production data from the operator's own business operations. The database contains 135 tables with 3,224,987 rows across 2,013 MB of production data.

Rework data provides the quality signal. The overall rework rate was 31.3%, trending from 45.2% during the first production deployment phase down to 27.0% by the final phase -- a 40% reduction in rework as patterns solidified. Of the total rework, 18.3% was product bugs, 10.1% was cosmetic iteration, 1.5% was integration friction, and 1.4% was git and infrastructure learning. Total reverts: 8 out of 1,394 commits, or 0.6%. For context, industry bug-fix time typically consumes 20-50% of development effort (Rollbar, Stripe, Coralogix), and industry defect density runs 15-50 defects per thousand lines of code (McConnell, Code Complete). The 12.1% product defect rate across the broader 10-project portfolio sits below the low end of that industry range.
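Those headline figures can be re-derived from the reported breakdown, as a consistency check:

```python
# Re-deriving the quality figures from the numbers reported above.
rework_categories = {
    "product bugs":                 18.3,  # percent of all commits
    "cosmetic iteration":           10.1,
    "integration friction":          1.5,
    "git/infrastructure learning":   1.4,
}
print(f"Total rework: {sum(rework_categories.values()):.1f}%")   # 31.3%

total_commits, reverts = 1_394, 8
print(f"Revert rate: {reverts / total_commits:.1%}")             # 0.6%

first_phase, final_phase = 45.2, 27.0  # rework rate by phase, percent
print(f"Rework reduction: {(first_phase - final_phase) / first_phase:.0%}")  # 40%
```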

How It Works

The 194,954-line codebase was not written in a single sprint. It accumulated across seven distinct development phases, each building on the infrastructure and patterns established by the previous phase.

The build started with a scaffolded foundation. During Phase 1 (October), the contractor (CON-01) handled dashboard construction while the operator built the core platform architecture -- database schema, authentication, role-based access control, API structure. This phase averaged 4.6 commits per day and established the patterns that every subsequent phase would reuse. The external contractor's contribution was concentrated in this early period; by Phase 3 (December onward), the operator handled all development.

The acceleration from 4.6 to 61.5 commits per day was not a function of working longer hours. It was a function of reduced friction. Each phase left behind tested, reusable components -- service files, Blade templates, SCSS design system variables, console commands, middleware patterns. By the peak phase, building a new feature meant composing from existing, proven pieces rather than writing from scratch. The system compounded: infrastructure built in October made November faster, which made December faster, which made January's peak sprint possible.

AI tooling played a specific role: it replaced the need for specialist contractors in architecture, QA, and DevOps roles. The operator used Cursor and Claude as AI coding assistants to handle scaffolding, pattern generation, and problem resolution. This is not AI writing the software autonomously. The operator designed every system, made every architectural decision, and resolved every production issue. AI accelerated the execution of those decisions. As the project's own documentation states: "A nail gun doesn't design the house."

What This Means for Technical Leaders and Engineering Managers

The 194,954-line, single-operator codebase challenges a foundational assumption in engineering management: that project scale determines team size. The COCOMO II model would price this build at $2,404,350 to $3,900,000 with a multi-person team over 12 to 24 months. It was built for approximately $20,000 in 74 active days by one operator with AI assistance.

This does not mean teams are obsolete. It means the threshold at which you need a team has moved. A solo operator with AI-assisted tooling and a compounding infrastructure approach can now build at a scale that previously required a funded engineering department. PRJ-01 has 135 database tables, 20 external integrations, 4 customer types, 9 subscription plans, 5 revenue streams, and has processed 616,543 leads in production. That is not an MVP. That is enterprise-grade infrastructure built at solo-operator cost.

The implication for build-vs-hire decisions is concrete: before staffing a 5-person team at $960,000 to $1,440,000 per year, evaluate whether the same output is achievable with one senior operator, AI tooling at $105 per month, and a methodology that compounds infrastructure across projects. The data from PRJ-01 says that for at least one class of platform -- data-intensive, integration-heavy, multi-tenant SaaS -- the answer is yes.


Related: C1_S12 (Contractor Cost Collapse), C1_S13 (5-Day Production Builds), C1_S15 (Custom Software vs SaaS)

References

  1. Stack Overflow (2024). "Developer Survey." 63% of developers use AI tools in their workflow.
  2. University of Southern California (2024). "COCOMO II Software Cost Estimation Model." Center for Systems and Software Engineering. Development cost estimation of $2.4M-$3.9M for equivalent codebase.
  3. IEEE (2024). Software productivity data. Lines of code per developer per month benchmarks (325-750 LOC/month for complex systems).
  4. McConnell, S. (2004). Code Complete, 2nd ed. Microsoft Press. Defect density benchmarks (15-50 defects per KLOC).
  5. Sieber & Partners (2024). "Commit Velocity Analysis." 3.5 million commits across 47,000 developers.
  6. Rollbar, Stripe & Coralogix (2024). Industry defect rate benchmarks (20-50% of developer time on bug fixing).
  7. FullStack (2025), Keyhole Software (2026), Qubit Labs (2026). Software development price guides and cost benchmarks.
  8. Keating, M.G. (2026). "Case Study: The PRJ-01 Product Story." Stealth Labz.