For those who still believe a democracy needs traditional journalism, this is a harrowing time, to say the least. Local newspapers, for a century the foundation of real reporting, continue laying off the reporters doing the scratch-and-claw work of covering communities. At the same time, more media resources than ever are somehow being plowed into media coverage of the media — an unselfconsciously narcissistic “never forget, we’re the real story” phenomenon most recently glorified in the documentary Page One, a retch-inducing heroization of the media desk at the New York Times. Meanwhile, a new push is on to distort what journalism actually is, from editors and reporters being paid by and/or investing in the industries they cover, to journalism schools changing their missions to include corporate marketing. The very definition of “journalist” is being reimagined by those aiming to enrich themselves. And, of course, all this is happening as the relatively few genuine journalists left in America are periodically lambasted for the horrific crime of actually reporting real news and questioning power.
But for all of these trends, none is more disturbing than recent moves to challenge the basic assumption that journalism is even necessary anymore. In an economy that fetishizes synthetic derivatives rather than tangible products, and in a political cauldron that periodically manufactures notions of “post-partisan,” “post-racial” and “post-industrial” utopias, the ascendant notion in the media industry is that news organizations and American democracy can survive and thrive in a “post-journalism” era — one that wholly removes journalism from the news media.