Paul Ford, "Can an A.I. Company Ever Be Good?" The New York Times, Apr. 26, 2026.
Artificial intelligence can be wondrous, but the technology underneath is more than a little monstrous. It eats up all the words in the world, from blogs to books, often without permission. It burns whole forests’ worth of energy, digesting that raw material into its models, and gulps billions of gallons of water to cool down. These are the same qualities we perceive in Godzilla, but distributed. Is it any wonder that the Japanese word “kaiju,” or strange beast, has “AI” smack in the middle?
Mere greed didn’t get us here. In fact, ethics did. The big A.I. labs’ starry-eyed founders believed that the only way to stop the looming threat of a superintelligence that might kill us all was to create an aligned A.I. that would remain fond of humans. A friendly Godzilla could stop bad Godzillas before they got to Tokyo Bay.
Lessons of recent history:
Over three decades of watching the tech industry, as big companies grew from tiny teams to global powers, I’ve observed the same pattern: Ethics don’t scale up. Tech companies like to start with a mission. Google wanted to connect the world’s information; Microsoft wanted to put a computer on every desktop; Twitter wanted to give all people a platform to publish their thoughts. These are good ideas — the stuff of TED Talks. But users show up with their own beliefs and ideas, by the millions. As a tech founder, you end up putting enormous work into making users behave (and stopping them from breaking the law). Lawsuits pour in, saying you did wrong, some because you’re a convenient target.
All the while, money keeps gushing in. You start out transparent, sharing your journey, but then, ahead of an initial public offering, you must honor the S.E.C.-mandated quiet period and rein in promotional communications. After that, the transparency never quite returns. The market demands a rising stock price. Your company still makes a lot of software, but a huge amount of time goes to tax strategy and compliance.
At that scale, people start to blur together, and human users can become aggregate pools of statistics and growth vectors that go up and down — a mulch into which you plant your products.
The entire culture of American technology is built around two terms: disruption and, of course, scale. But ethics are constraints on disruption and scale. Truly ethics-bound organizations — the U.S. justice system, the American Medical Association, the Catholic priesthood — have hard scaling limits. Their rules run deep, and their requirements to serve are so onerous that only a few people can do the job. Punishments for transgressors include losing their licenses, being defrocked and being disbarred. Software industry people might have good degrees and are often good people, but they are making it up as they go along. They take no oath, are inconsistently certified and can only be fired, not exiled from the trade.
What to do? Regulate:
But regulation is absolutely in the interests of both America and the big A.I. companies themselves. Let me add two more terms people should know: “Google zero” and “model collapse.” Google zero (coined by Nilay Patel, the editor in chief of The Verge) is when Google stops sending traffic to websites and just provides an A.I. answer instead. When that happens, websites get less traffic, sell fewer ads and make less money. As a result, they may not be able to produce as much content. Model collapse is related: It’s when the A.I. models run out of knowledge to digest. What then? Do they excrete their own prose to redigest? Do they just give up?
Silicon Valley types like to say that data is the new oil. I think that’s right in two ways: Data is valuable, but it’s also a commodity, and these new A.I. tools are infrastructure. We regulate the electric grid, so why not these?
In this new world, there are so many new things to regulate: Deepfakes, A.I. liability, copyright rules, model bias concerns and ecological costs top the list. And we will also need to protect the digital commons and incentivize people to write and do things online.
There’s more at the link.