
Code Is Nearly Free. Block Just Cut 4,000 Jobs.

March 1, 2026 · 8 min read · 1,563 words
AI · Future of Work · Software Engineering · Layoffs · Video Summary
Theo Browne discussing the end of software engineering as we know it
Image: Screenshot from YouTube.

Key insights

  • Block cut from 10,000+ to under 6,000 employees despite growing revenue, signaling that large engineering teams may become a liability rather than an asset
  • Writing code went from the most expensive to the cheapest part of shipping software, eliminating the bottleneck that justified large teams
  • The new bottleneck is code review, testing, and QA β€” skills most developers have historically undervalued
Source: YouTube
Published February 28, 2026
Theo - t3.gg
Hosts: Theo Browne

This article is a summary of Software engineering is dead now. Watch the video →



In Brief

Block, the company behind Square and Cash App, just laid off over 4,000 employees despite being profitable and growing. Theo Browne argues this is not a cost-cutting move but a signal that the entire structure of software companies is changing. AI tools have made writing code nearly free. The bottleneck has moved downstream to review, testing, and shipping, and companies with large engineering teams are finding that more people now means slower output, not faster.

  • 4,000+ Block employees laid off
  • 2 weeks for Theo to rebuild Frame.io (part-time)
  • 52,581 lines of code in T3 Code (2-3 weeks)

Block: profitable, growing, and cutting half its staff

Jack Dorsey, Block's CEO and the original founder of Twitter, announced that Block would reduce from over 10,000 employees to just under 6,000 (12:54). That is over 4,000 people.

The unusual part: Block is not in financial trouble. Gross profit continues to grow, customer numbers are increasing, and profitability is improving (16:04). Dorsey's explanation is that AI tools, combined with smaller and flatter teams, are changing what it means to build and run a company.

Dorsey described two options: cut gradually over months or years, or act now. He chose the latter. His argument is that repeated small rounds of cuts destroy morale without forcing the organizational changes needed (16:29). Going from 10 people to 8 does not change your process. Going from 10 to 2 forces you to rethink everything (33:06).

Browne notes that the severance package is strong: 20 weeks of salary plus one extra week per year of service, equity vested through end of May, six months of healthcare, company devices, and $5,000 for transition support (13:06).


Proof of concept: rebuilding a $1.3 billion product in two weeks

To show how much has changed, Browne describes building Lawn, an alternative to Frame.io (a video review platform that Adobe acquired in 2021 for $1.3 billion) (6:47).

He says he built a working alternative in two weeks, part-time, while running his company and YouTube channel. He claims to have written zero lines of code by hand (5:27). He structured the APIs (Application Programming Interfaces, the rules for how different parts of the software talk to each other) and application logic by describing how they should work to an AI agent, reviewed its proposals, and let it write the code. He then open-sourced the result.

Browne acknowledges that his engineering background helped. He understood the constraints and requirements of video products. But he argues that even without that background, the AI model would have provided at least half of the key decisions (8:40).

The broader point: if one person can rebuild a billion-dollar product part-time in two weeks, the cost advantage of a large engineering team has collapsed.


The pipeline shift: where the bottleneck used to be

Browne breaks down the software shipping process into steps (22:19):

  1. User reports a problem
  2. Describe the problem clearly
  3. Identify a solution
  4. Scope and assign the work
  5. Write the code
  6. Review the code
  7. Test the code
  8. Plan and do the release

Historically, step 5 was the expensive one. It required highly skilled, well-paid engineers. Everything above it was about filtering ideas so only the best ones reached the engineering team. The pipeline looked like a funnel: 500 user problems became 100 described tickets, 50 had identified solutions, 15 got scoped, and maybe 5 actually got coded (22:48).
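The funnel can be sketched numerically. The stage counts below are the ones Browne quotes; the stage labels are paraphrases, and the conversion rates are derived from his numbers rather than measured.

```python
# Browne's pre-AI shipping funnel, using the stage counts quoted in the talk.
# Each stage filtered ideas so only a few reached the expensive coding step.
stages = [
    ("user problems reported", 500),
    ("described as tickets", 100),
    ("solutions identified", 50),
    ("scoped and assigned", 15),
    ("actually coded", 5),
]

def funnel_rates(stages):
    """Conversion rate from each stage to the next."""
    return [
        (name, count / prev)
        for (name, count), (_, prev) in zip(stages[1:], stages[:-1])
    ]

for name, rate in funnel_rates(stages):
    print(f"{name}: {rate:.0%} of the previous stage survive")
```

Only 1% of reported problems made it to code. Flattening the funnel at the coding step, as Browne argues AI does, removes the one stage that justified all the filtering above it.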

Now, Browne argues, that funnel has flattened at the coding step. A developer can paste a screenshot of a user complaint into an AI agent and skip straight from "user problem" to "code written" (23:31). The step that used to justify hiring more engineers is no longer the constraint.


Why more engineers now means slower shipping

This is what Browne considers most dangerous for large companies. With AI generating code quickly, the bottleneck has moved to code review, testing, and release approval.

More engineers means more PRs (pull requests, proposed code changes submitted for review before merging). More PRs means more coordination, more approvals, and more chances for teams to block each other. Browne describes his own team ending up with hundreds of PRs sitting unshipped because review could not keep up (28:37).
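The hundreds-of-unshipped-PRs situation follows from simple arithmetic: whenever PRs open faster than reviewers can clear them, the backlog grows without bound. A toy model, with all rates chosen hypothetically rather than taken from the video:

```python
# Toy model of a review bottleneck. All rates here are hypothetical:
# if PRs open faster than they are reviewed, the backlog grows linearly.
def backlog_after(days: int, opened_per_day: int, reviewed_per_day: int) -> int:
    backlog = 0
    for _ in range(days):
        backlog += opened_per_day
        backlog -= min(backlog, reviewed_per_day)  # can't review more than exist
    return backlog

# 30 engineers each opening one PR a day, review capacity of 20 a day:
print(backlog_after(days=30, opened_per_day=30, reviewed_per_day=20))  # → 300
```

Adding engineers raises `opened_per_day` but not `reviewed_per_day`, so past the review capacity every extra engineer makes the queue grow faster, which is exactly the inversion Browne describes.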

He compares large teams to aircraft carriers: they carry a lot, but if they need to change direction, it takes days to turn. Small teams turn instantly (29:18).

As a concrete example, Browne describes trying to get a new AI model added to GitHub's Copilot CLI (command-line interface, a terminal-based coding tool). Despite getting help from senior Microsoft employees, the answer was that adding it was "not an easy thing" because of internal processes (34:01). Browne argues this is a one-line code change blocked by organizational complexity, not technical difficulty.

His observation about his own work points in the same direction: his three-person dev team is competing with OpenAI's Codex desktop app team, which he estimates at 20+ people (20:24).


What's valuable now

If writing code is nearly free, what skills matter? Browne lists several (30:43):

  • Turning user problems into clear plans. Translating a vague complaint into a specific, actionable description.
  • Making code ready to ship. Code review, quality assurance (QA, verifying software works correctly), and manual testing.
  • Building rollback systems. Reliable ways to undo changes when something breaks in production.
  • Writing thorough tests. Automated checks that verify code behaves as expected.
  • Knowing your users. Browne claims many developers cannot name a single customer of the product they work on. If an AI agent knows more about your customers than you do, your position is vulnerable (39:10).
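To make the "thorough tests" point concrete, here is a minimal hypothetical example of the kind of automated check Browne means: a small pure function plus assertions that pin down its behavior, including the edge cases an AI-generated patch might silently break. The function and its rules are invented for illustration, not taken from the video.

```python
# Hypothetical example of a thorough automated check: the assertions
# document the contract, including edge cases, so a regression fails loudly.
def apply_discount(price_cents: int, percent: int) -> int:
    """Return the discounted price, rounded down to whole cents."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return price_cents * (100 - percent) // 100

assert apply_discount(1000, 10) == 900    # ordinary case
assert apply_discount(999, 50) == 499     # rounds down, never up
assert apply_discount(1000, 0) == 1000    # zero discount is a no-op
assert apply_discount(1000, 100) == 0     # full discount
```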

Browne also points out a morale challenge: writing code was one of the more enjoyable parts of the job. Now developers spend more time on review and QA, which most of them dislike (31:06).

His closing advice to developers: talk to users more, get involved in the release process, learn the systems around shipping, and try to automate your own work. Through that automation, you discover everything around the code that actually matters (38:18).


How to interpret these claims

Browne is a tech content creator and startup founder with a clear perspective and audience. His arguments are compelling but deserve scrutiny.

The examples favor greenfield work

Lawn and T3 Code are greenfield projects (built from scratch with no legacy constraints) in technology Browne knows deeply. Rebuilding something in a domain you understand is very different from maintaining a sprawling legacy system you inherited. The $1.3 billion comparison with Frame.io also includes years of partnerships, integrations, and customer relationships that code alone cannot replace. Browne acknowledges this for Block's products (11:52) but still uses the comparison.

"Code is free" applies unevenly

AI coding tools work best on common patterns: CRUD applications (apps that create, read, update, and delete data), standard web interfaces, and well-documented frameworks. Systems with unusual constraints, strict regulatory requirements, or complex distributed infrastructure are harder to replicate. The claim is strongest for the kind of software most companies build, and weakest for the kind that is hardest to replace.

The Mythical Man-Month was always true

The argument that more engineers slow things down predates AI by decades. Fred Brooks described it in 1975. AI tools have sharpened this dynamic, but organizations that already managed coordination well may see less disruption than Browne suggests. The deeper question is not just about code output but about how well an organization makes decisions.
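Brooks's point is often illustrated with pairwise communication channels: a team of n people has n(n - 1)/2 possible channels, so coordination cost grows quadratically while output grows at best linearly. This formula is a standard gloss on the book, not something from the video.

```python
# Pairwise communication channels in a team of n people: n * (n - 1) / 2.
# Standard illustration of Brooks's argument that coordination cost
# grows quadratically while output grows at best linearly with headcount.
def channels(n: int) -> int:
    return n * (n - 1) // 2

for team in (2, 3, 10, 50):
    print(team, channels(team))  # 2 → 1, 3 → 3, 10 → 45, 50 → 1225
```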

Small-team survivorship

Browne's three-person team ships fast, but many small teams fail. The narrative highlights the winners. A fuller picture would include how many AI-accelerated projects stall because the builder hit an edge case the model could not handle, ran out of context, or shipped bugs that a larger QA team would have caught.


Glossary

  • AI agent: An AI system that takes multi-step actions on its own, like writing code across multiple files based on a description of what needs to change.
  • API: Application Programming Interface. The rules and structure for how different parts of software communicate with each other.
  • Block: Financial technology company founded by Jack Dorsey. Operates Square, Cash App, Afterpay, and Tidal.
  • CLI: Command-Line Interface. A text-based tool used in a terminal window, as opposed to a graphical app with buttons and menus.
  • CRUD app: An application whose main job is to Create, Read, Update, and Delete data. Most business software fits this pattern.
  • Frame.io: Video review platform acquired by Adobe for $1.3 billion in 2021. Teams use it to share and comment on video edits.
  • Greenfield project: A software project built from scratch, with no existing code or legacy constraints to work around.
  • Mythical Man-Month: A 1975 book by Fred Brooks arguing that adding more people to a late software project makes it even later, because coordination costs grow faster than output.
  • PR (Pull Request): A proposed code change submitted for review before it gets merged into the main codebase. The standard workflow in teams using Git.
  • QA (Quality Assurance): The process of testing and verifying that software works correctly before it reaches users.
  • Vibe coding: Building software by describing what you want to an AI agent and letting it write the code, without writing any code yourself.

Sources and resources