It is easy to recognize the uber-clumsy efforts of artificial intelligence that sifted through nearly a dozen reviews of “War Made Invisible,” lifting bits and pieces while weirdly substituting supposed synonyms to steer clear of plagiarism lawsuits.
By Norman Solomon
At first, I admit, I was a bit flattered to learn that online entrepreneurs are selling study guides for my new book. I thought of CliffsNotes from long ago, helping fellow students who were short on time or interest to grasp the basics of notable works. Curiosity quickly won. I pulled out my credit card, paid $9.99 plus tax for one of the offerings, and awaited its arrival in the mail.

The thin booklet got off to a reasonable enough start, explaining with its first sentence, “The U.S. media coverage that makes it easier to sell wars to the public, as well as the often-hidden cost of civilian casualties from errant U.S. attacks, are all harshly criticized by journalist Solomon.” That wasn’t a bad sum-up of my book.
But the study guide’s second sentence was not nearly as good: “He guarantees that when Russia designated Ukrainian communities during the new attack, the U.S. media was everyone available and jumping into action with compassionate, piercing revealing.” Rereading that sentence a few times didn’t improve it, and I began to worry.
To the extent that meaning could be grasped, the next pages seemed to include some praise: My book “constructs a convincing case that an excessive number of mysteries are being kept from people in general.” What’s more, “the creator presents a sharp and provocative outline of the outcomes of the media’s horrifying disappointments in spreading the word.”
But the study guide also included mild criticism amid the odd wording: “Solomon might have offered a fairly more profound examination of why American newscasting neglects to satisfy its beliefs in covering war and the justifications for why political pioneers could feel a sense of urgency to deal with misdirection while tending to people in general.”
The computer-programmed assaults on the English language escalated. And so, the “war on terror” became the “battle on dread.” A key source of meticulous research that I cited in my book, the Costs of War project at Brown University, became “the Expenses of War project at Earthy Colored College.”
At one point, my book’s actual title — “War Made Invisible” — shifted to “War Caused Imperceptible.” But the laughable malapropisms provided by artificial intelligence became more serious matters when I saw several dozen words forming badly mangled phrases — all attributed to me — inside quotation marks. I could imagine bleary-eyed students cramming on the night before a test or a term-paper deadline, reading the ostensible quotes and thinking that the author of my book must be an idiot.
Likewise, any would-be scholars seeking to glean the gist of the book’s themes in exchange for their $9.99 purchase will surely come away mystified at best after reading sentences like: “It’s totally unsuitable for writers to toe the conflict line for a really long time, and afterward, at last report, essentially, it tends to be informed years past the point of no return.”
I’m not among the authors who claim to never read reviews of their books. In fact, I remember them. So, I could recognize the uber-clumsy efforts of artificial intelligence that sifted through nearly a dozen reviews of “War Made Invisible,” lifting bits and pieces while weirdly substituting supposed synonyms to steer clear of plagiarism lawsuits.
So, let’s hear it for digital “free enterprise.” Or maybe that’s “unshackled business.” Nice AI work if you can get it.
Which brings us to a vastly more substantive matter. Artificial so-called intelligence is hardly immune to a dynamic that computer experts long ago dubbed “GIGO” — garbage in, garbage out. With AI, no matter how sophisticated it might seem, the consequences in war are apt to be horrific. Six decades after Martin Luther King Jr. warned of “guided missiles and misguided men,” the missiles are even more terrible, the people ordering launches are no less misguided, and the mentalities bent on war are eager to twist AI technology for their own lethal purposes.
A couple of weeks ago, the Department of Defense announced “the establishment of a generative artificial intelligence task force, an initiative that reflects the DoD’s commitment to harnessing the power of artificial intelligence in a responsible and strategic manner.”
If they were still alive, the 4.5 million people who have died as direct and indirect results of U.S. wars since 9/11 might doubt how “responsible” the Defense Department’s manner has been.
Let’s hope that the people running the Pentagon’s task force for artificial intelligence didn’t graduate from Earthy Colored College.
Norman Solomon is the national director of RootsAction.org and executive director of the Institute for Public Accuracy. He is the author of a dozen books including War Made Easy. His latest book, War Made Invisible: How America Hides the Human Toll of Its Military Machine, was published in summer 2023 by The New Press.