Got some good news today of the financial variety (“I’ll be able to pay my bills” as opposed to “I’m $%^#$( rich!”, but I’ll take it), so instead of complaining, let’s try to be a little more positive today.
Just kidding. Let’s talk about Artificial Intelligence and the absolute clusterfuckery of it all!
Let’s get the obvious out of the way: Traditionally, the purpose of automation in a capitalist system is to eliminate the need for workers. Doing this not only reduces the number of people you have to pay, but it also reduces the earning power of the people you can’t get rid of since there are more people competing for fewer jobs. That’s it. Anyone who tells you otherwise is trying to sell you something.
This goes back a long way. An Inquiry into the Nature and Causes of the Wealth of Nations was published in 1776 and served as a religious novel for certain members of my family. All of three years later, Ned Ludd supposedly-but-probably-not-really smashed two stocking frames, an event that became a cause célèbre for people worried about how automation would affect a society built around employment (and Mr. Wolf).
Like water circling a drain, the impact of automation started slowly and has inevitably picked up pace as the void beckons. In 1964, The Twilight Zone episode “The Brain Center at Whipple’s” turned its flashlight to the writing on the wall. Spoiler alert: Not only did the workers lose their jobs; the managers did as well. If you want to feel old, watch any movie set in an office in the 1970s (9 to 5 is always worth a watch) and pay attention to how different it is: The typing pool alone has more employees than Twitter does now. If an office had 40 typists, it now has none, or, if there is still one, there are 40 people vying for that job and you ain’t gonna be able to live on what it pays.
Why all the talk about automation? I thought we were talking about AI. AI is, in one sense, part of that same trend, but it’s the point where the draining water gets more vertical than horizontal. We’re going to be bleeding jobs like we’ve never seen before. We’re not prepared for it. A lot of shit is going to break.
And finally, we get to the point.
The drive behind AI, and the money behind it, isn’t about simply eliminating labor. That’s a long-term play and, in today’s world, there is no long-term. This is about making fistfuls of cash in the very short term. Play along with me here: What are the laws governing liability when AIs make mistakes? How do copyright laws treat AI-generated work product? How do the laws handle AI-generated work that makes a facsimile of copyrighted or trademarked work? What laws cover AI at all?
If you think we’re not prepared for the joblessness we’re about to face, that’s small potatoes compared to how our legal system is about to be abused and rewritten. In some ways, the AI boom is similar to what Uber and Airbnb did: Get into a space before any regulation exists and become so entrenched that it becomes a fait accompli that your business model is legal. Uber and Airbnb in particular aren’t companies in the traditional sense (and they do everything they can to fight that classification). Their role is to break existing systems more than to make any money.
Sure, getting rid of workers is a great thing from a business standpoint, but imagine the mischief one can wreak with procedurally generated art, music, text, code, or almost anything else you can imagine when there are no regulations in place. When an AI or algorithm running an armed robot kills someone, where is the liability? What if the AI or algorithm is procedurally generated by another AI or algorithm?
AIs are very good at diagnosing medical patients. That should be a boon for everyone, but it almost certainly won’t be. It’ll be used to cut payroll, of course, but more importantly, it’ll be implemented in a way designed to save costs and avoid liability, because that’s where the money is. Why on Earth would private hospitals owned by equity firms do anything other than what makes the money and protects them from lawsuits?
This is a gold rush into an unregulated space and it’s going to happen and it’s going to suck.
You may have noticed that I haven’t offered any solutions or discussed the value of human innovation vs. machine-generated stuff. That’s because I think it’s probably too late for solutions. Short of a Butlerian Jihad, we’re too far along to even think about getting ahead of this and setting up guardrails. America’s weird and self-defeating deference to the private sector has ensured this, but I don’t know that it would have been any better in a better society. As to the second point, I don’t honestly believe that human innovation, our creativity, whatever you want to call it, is superior to what machine learning will eventually accomplish. We all love the story of John Henry, but John Henry died and, five years later, he wouldn’t have won anyway.
My only advice is: Brace yourself. Vote for policies that will at least try to rein in the worst excesses, and for those that will support people who lose the ability to work because there’s no need for them anymore. That’s what we’re facing and pretending otherwise isn’t going to help.