Steve's prompt: "If code is law (Lessig reference) and it's a given that AI will soon code complex systems effortlessly, what does that make AI? Provide some background on Lessig's book and the implications of a world where the extraordinary inertia of complex software systems can be rewritten from a prompt."
Four Regulators
In 1999, Lawrence Lessig published Code and Other Laws of Cyberspace. The book made one argument that changed how an entire generation of technologists, lawyers, and policy thinkers understood the internet: code is law.
Not metaphorically. Functionally. Lessig identified four forces that regulate human behavior: laws (what governments enact), norms (what society expects), markets (what economics incentivizes), and architecture (what the built environment permits). A speed bump is architecture. It regulates your driving speed without a cop, a fine, or a social norm. It just makes going fast uncomfortable. You slow down. The architecture decided.
Lessig's insight was that software works the same way. The code that runs a platform determines what you can and cannot do on it more effectively than any law. You can't send a tweet longer than 280 characters. Not because it's illegal. Not because it's socially frowned upon. Because the code won't let you. The architecture decides. The architecture is the law.
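The character limit is a one-line statute. Here is a minimal sketch, with hypothetical names (this is not Twitter's actual code), of how such a rule regulates by simply refusing:

```python
MAX_TWEET_LENGTH = 280  # the rule lives here, not in any statute book

def post_tweet(text: str) -> bool:
    """Attempt to publish a post; the architecture decides."""
    if len(text) > MAX_TWEET_LENGTH:
        # No fine, no cop, no social sanction: the code just won't do it.
        return False
    # ... publishing logic would go here ...
    return True
```

There is no appeal process inside `post_tweet`. The rule is enforced the instant the function runs, which is exactly Lessig's point about architecture as regulation.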
He published this in 1999. The internet had about 280 million users. Today it has 5.5 billion. The observation aged like prophecy.
The Inertia Was the Feature
Here's the thing Lessig didn't have to worry about in 1999: the code was hard to change.
Writing software was expensive. Modifying complex systems was more expensive. Rewriting them from scratch was so expensive that most organizations chose to maintain aging infrastructure indefinitely rather than attempt it. The banking system still runs on COBOL. Air traffic control systems use code written before some of their operators were born. The IRS processes tax returns on systems designed in the 1960s. These aren't bugs. They're features of a regulatory environment where the cost of change provided stability.
If code is law, the difficulty of writing and modifying code was the procedural friction that slowed down legislative change. Like the filibuster, or the constitutional amendment process, or the three readings a bill requires before it becomes statute. The friction was the point. Changing the rules that govern behavior should be hard. It should require deliberation, resources, expertise, time. If changing the law were as easy as typing a sentence, the law would change every five minutes and nobody would know the rules.
Amazon recently used AI to upgrade legacy Java applications. A team of five people upgraded a thousand production apps from Java 8 to Java 17 in two days, averaging ten minutes per application. Morgan Stanley built an internal AI tool that translated 9 million lines of legacy code to modern languages, saving an estimated 280,000 developer hours. That's the friction disappearing. That's the filibuster being abolished by a chatbot.
41 Percent
According to JetBrains' 2025 data, 41% of all code written globally is now AI-generated. At Google, the CEO reported that over 30% of new code is written by AI, up from 25% six months earlier. GitHub Copilot generates 46% of code for its 20 million users. Java developers using Copilot accept AI-generated code 61% of the time.
Read those numbers through Lessig's framework. If code is law, then 41% of new law is being written by AI. Not drafted by AI for human review, the way a staffer drafts a bill for a senator. Written by AI and accepted by a developer who may or may not fully understand what it does. The acceptance rate tells you how much review happens: for Copilot users overall, it runs 27 to 30 percent, meaning roughly 70 percent gets rejected. But the 30 percent that ships becomes the architecture. Becomes the regulation. Becomes the thing that decides what 5.5 billion people can and cannot do online.
Lessig warned that the people who write the code become the de facto regulators. He wrote: "We can build, or architect, or code cyberspace to protect values that we believe are fundamental, or we can build, or architect, or code cyberspace to allow those values to disappear." His fear was that corporations would write the code and their values (profit, engagement, data extraction) would become the invisible laws governing digital life.
He was right about that. He just didn't anticipate that the corporations would, within 25 years, hand the pen to a machine.
Who Is the Legislature Now?
Follow the logic to its conclusion. If code is law, and AI writes the code, then AI is the legislature.
Not in the dramatic, Skynet way. In the boring, procedural, nobody-noticed-it-happening way. A developer prompts an AI to build an authentication system. The AI decides how passwords are stored, what data is logged, how sessions expire, which behaviors are permitted and which are blocked. Those decisions are governance. They determine who can access what, under what conditions, with what consequences. The AI made those choices based on patterns in its training data, not based on democratic deliberation, constitutional principles, or stakeholder input.
The developer might review the code. Might. Bruce Schneier, writing in Lawfare last year, noted that AI is already drafting actual legislation for the U.S. House, Senate, and legislatures worldwide. He argued this is inevitable because the growing complexity of policy demands it. If AI is already writing the laws that govern lawmaking, and also writing the code that governs digital life, we are approaching a world where the regulatory architecture is authored by systems that don't understand what regulation is for.
We wrote about this from a different angle. The wrong hallucination isn't the wrong fact. It's the wrong framework. AI doesn't just generate incorrect data points; it generates entire cognitive maps of how things should work. When those maps are embedded in code, they don't just misinform. They regulate.
The Constitutional Convention That Nobody Called
Constitutional law is deliberately hard to change. The U.S. Constitution has been amended 27 times in 235 years. That's by design. Foundational rules should resist casual modification. The difficulty is the legitimacy.
Software architecture has functioned the same way, accidentally. Not because anyone designed it that way, but because the cost of rewriting complex systems imposed a natural amendment process. You wanted to change how the banking system processes transactions? Budget tens of millions of dollars, hire hundreds of engineers, spend years. That expense forced deliberation. Forced stakeholder input. Forced testing. The cost was a proxy for democratic process.
AI eliminates the cost. When you can rewrite a complex system from a prompt, you get constitutional conventions on demand. Every Tuesday. Without public comment, without judicial review, without the governed even knowing the constitution changed. The system behaves differently this morning than it did last night. The rules shifted. Nobody announced it. The developer approved a pull request generated by Copilot at 2 AM and the architecture of your digital life was amended.
Lessig himself has been updating his thinking. In a 2024 TEDx talk, he argued that AI could "hack democracy" even without superintelligence, simply by operating instrumentally within institutions that weren't designed to resist it. He's not wrong. But the hacking isn't happening in the dramatic way people imagine. It's happening in the version control logs of every software company on Earth.
The Speed Problem
Law moves slowly on purpose. Bills are introduced, debated, amended, voted on, signed, challenged, interpreted. The process takes months or years. This drives people crazy. It also prevents whiplash. Citizens can predict the legal environment they'll operate in tomorrow because it probably won't be different from today.
Code already moved faster than law. Software updates ship weekly. Terms of service change without notice. Algorithmic feeds are tweaked constantly. Lessig identified this as a problem in 1999: the regulatory architecture was changing faster than the democratic process could oversee it.
AI didn't just speed up the process. It changed the order of magnitude. A developer with Copilot can generate, review, and deploy a system change in hours. An autonomous coding agent can do it in minutes. The agents are here. They can modify code, submit pull requests, and in some configurations, deploy changes without human approval. The regulatory architecture of digital life can now be rewritten faster than you can read this paragraph.
Stuart Russell and colleagues published a paper in 2024 called "When code isn't law," arguing that traditional regulation is inadequate for AI systems precisely because they change too fast. The title is a riff on Lessig, and the argument is that we need new regulatory frameworks because the old model (code as stable architecture) no longer holds. Code used to be like a building. Now it's like weather.
This Blog Is the Argument
This entire website was coded by AI. The blog posts are AI-generated. The campaign to make a made-up word go viral was designed by AI. The CSS, the HTML, the build scripts, the deployment pipeline. All of it. One person with a prompt and an API key.
If code is law, then the rules governing this small corner of the internet were written by a machine at a human's request. The architecture of how you're experiencing this content right now (what you can click, what you see first, how the page loads, which posts are marked with a star) was determined by code that an AI generated and a human approved. Sometimes the human understood the code. Sometimes he just saw that it worked.
This is noosphere pollution in its architectural form. The contamination isn't just in the content. It's in the infrastructure. The plumbing. The rules. The invisible decisions that shape behavior before anyone makes a conscious choice. Lessig saw this coming for corporate code. He didn't see it coming from a stochastic parrot that can generate a regulatory framework in the time it takes you to type a question.
AI is the legislature now. It just doesn't know it's in session.
Sources
- Lessig, Lawrence. Code: And Other Laws of Cyberspace, Version 2.0 (2006, Basic Books). Original edition 1999.
- Lessig, Lawrence. "How AI Could Hack Democracy." TEDxBerlin, February 2024.
- Schneier, Bruce. "AI Will Write Complex Laws." Lawfare, January 16, 2025.
- Judge, Brian, Mark Nitzberg, and Stuart Russell. "When code isn't law: rethinking regulation for artificial intelligence." Policy and Society, 2024.
- "How AI could write our laws." MIT Technology Review, March 14, 2023.
- JetBrains Developer Ecosystem Survey. 41% of code globally is AI-generated (2025 data, 256 billion lines analyzed).
- Google CEO earnings call: 30%+ of new code at Google is AI-generated (2025).
- Amazon Q Developer agent for code transformation. Five engineers upgraded 1,000 production Java apps in two days.
- Morgan Stanley's DevGen.AI: 9 million lines of legacy code translated, saving 280,000 developer hours.