Defending Daniel and Korean Drama
A few weeks ago, I read this post at Futurism, and I found it so unfair that it's bothered me ever since.
It’s 2026. AI is everywhere, and frankly, humans have had it too good for far too long. For the world’s corporations — the movers and shakers of the global economy, as it’s currently organized — it’s well past time to leave us flesh bags behind.
That, at least, seems to be the contention of Daniel Miessler, an outspoken cybersecurity engineer and AI booster. In a rambling post on his personal blog, Miessler takes the position that human workers are already obsolete, so the best thing we can do is accept it and fall in line with the AI revolution.
This post’s characterization doesn’t sound like the Daniel I know at all. He’s affable and industrious. If I were in SF, or he in NYC, I’d try to catch up with him over dinner. I know where he grew up. I know him to be thoughtful and fair-minded. But these last few years have been strange; maybe Daniel has amended his views.
To my relief, Daniel’s post showed no scary signs of some toxic form of “red pilling.” The only conclusion left, then, is that the Futurism article straw-mans Daniel’s post into the post Futurism wanted to write against, rather than the one that’s actually there.
Perhaps even worse, in playing for emotional reaction, the piece distracts from the all-too-real relationship between management and labor that Daniel is describing to help folks be ready for the future.
I read these posts a week or so after seeing No Other Choice, the latest Korean income inequality drama (à la Parasite) from Park Chan-wook (Oldboy). The movie did a wonderful job demonstrating Daniel’s point — seagull-flapping of the Futurism article aside.
The Claim
It’s 2026. AI is everywhere, and frankly, humans have had it too good for far too long. For the world’s corporations — the movers and shakers of the global economy, as it’s currently organized — it’s well past time to leave us flesh bags behind.
That, at least, seems to be the contention of Daniel Miessler, an outspoken cybersecurity engineer and AI booster. In a rambling post on his personal blog, Miessler takes the position that human workers are already obsolete, so the best thing we can do is accept it and fall in line with the AI revolution.
The rhetoric is obfuscatory. The verb “seems” in the second paragraph is doing an awful lot of lifting. With “seems,” no one’s ever wrong; it’s just a matter of interpretation. It’s a weasel word.
The tone, meanwhile, is an unnecessary rehash of 2004 blog-snark. The author throws in a dig (“rambling”) between two paragraphs that make the same claim twice. Glass houses.
So, tonally and formally it’s slop. On top of that, and my apologies to Daniel here, who the fuck cares what Daniel Miessler thinks? Does he go on talk shows or have the ear of the president? Are industries trading on his advice? He’s certainly smarter than the average bear, but why is Futurism making his claim a thing? Isn’t this piece tantamount to the XKCD wisdom?
The de-weasel-worded, economized claim of the first two paragraphs is this:
Daniel, someone you’ve probably never heard of said that human workers are already obsolete, so the best thing we can do is accept it and fall in line with the AI revolution.
But the post says no such thing. It talks about the tendencies of business and how businesses are thinking about staffing in light of autonomous AI agents. Elsewhere, Daniel has said he shares this analysis so that rank-and-file folks aren’t surprised when the economic model that paid and fed them changes to their detriment.
Here is the pull quote that seems to have particularly irritated the Futurism author (before they turned to Juan Sebastian Carbonell, whose take was probably the post they actually wanted to write):
The ideal number of human employees inside of any company is zero. That is the number that they’re trying to get to.
The author seems (see what I did there, though I’ll at least attempt to represent the argument in good faith) incensed that Daniel articulated one of the basic facts of capitalism: employees are a cost center, and cost reduction enables profit maximization, the core fiduciary duty of a publicly traded company.
The bulk of Daniel’s post is about what he calls Human 3.0: a post-corporate substrate where people broadcast their full capabilities, work on their own terms, and get compensated for being themselves rather than for fitting inside an org chart. The Futurism piece approvingly quotes labor sociologist Juan Sebastian Carbonell (the real battle is “over whose interests the new technologies will serve”), apparently missing that this is precisely what Daniel’s Human 3.0 section is trying to steer.
I kept thinking about all of this while watching the Korean film No Other Choice. Man-su is a veteran paper worker, genuinely good at his job, laid off when his company gets bought out. He assures his family he’ll be back in papermaking within three months. Thirteen months later, he’s still out of work. The family is selling off the dogs and considering selling the house, and his wife is working as a dental assistant for a guy who’s clearly more interested in panty-removal than plaque-removal. Man-su can’t even afford to fix a toothache.
Man-su identifies the one open position at the local paper company. He identifies his two competitors. He then murders them. It’s a black comedy.
The film is darkly funny in the way Korean cinema often is: the violence is sudden and unglamorous, the domestic scenes are warm and specific, the horror creeps in sideways. But what wrecked me was the turn. When his wife finds out about his homicidal career advancement plan, she gets onboard. Hey, get rich or die trying, no?
The family keeps the house, the dogs come home, his autistic daughter’s drawings turn out to be musical compositions. Everything is, improbably, fine.
The last shot is Man-su at work, alone, in a gleaming modern paper mill run almost entirely by machines – a “lights-out” factory, where human presence is sparse enough that the company saves electricity by illuminating only where there’s motion. They needn’t heat in winter (buy Man-su a coat); the machines won’t care. They needn’t cool in summer (let Man-su wear shorts); the machines won’t care.
Both Park and Miessler are trying to get us to imagine the very strange future ahead. Straw-manning the argument to farm clicks helps no one get ready for it.