Amid the bacchanal of disruption, let us pause to honor the disrupted. The streets of American cities are haunted by the ghosts of bookstores and record stores, which have been destroyed by the greatest thugs in the history of the culture industry. Writers hover between a decent poverty and an indecent one; they are expected to render the fruits of their labors for little and even for nothing, and all the miracles of electronic dissemination somehow do not suffice for compensation, either of the fiscal or the spiritual kind. Everybody talks frantically about media, a second-order subject if ever there was one, as content disappears into “content.” What does the understanding of media contribute to the understanding of life? Journalistic institutions slowly transform themselves into silent sweatshops in which words cannot wait for thoughts, and first responses are promoted into best responses, and patience is a professional liability. As the frequency of expression grows, the force of expression diminishes: Digital expectations of alacrity and terseness confer the highest prestige upon the twittering cacophony of one-liners and promotional announcements. It was always the case that all things must pass, but this is ridiculous.
Meanwhile the discussion of culture is being steadily absorbed into the discussion of business. There are “metrics” for phenomena that cannot be metrically measured. Numerical values are assigned to things that cannot be captured by numbers. Economic concepts go rampaging through noneconomic realms: Economists are our experts on happiness! Where wisdom once was, quantification will now be. Quantification is the most overwhelming influence upon the contemporary American understanding of, well, everything. It is enabled by the idolatry of data, which has itself been enabled by the almost unimaginable data-generating capabilities of the new technology. The distinction between knowledge and information is a thing of the past, and there is no greater disgrace than to be a thing of the past. Beyond its impact upon culture, the new technology penetrates even deeper levels of identity and experience, to cognition and to consciousness. Such transformations embolden certain high priests in the church of tech to espouse the doctrine of “transhumanism” and to suggest, without any recollection of the bankruptcy of utopia, without any consideration of the cost to human dignity, that our computational ability will carry us magnificently beyond our humanity and “allow us to transcend these limitations of our biological bodies and brains. . . . There will be no distinction, post-Singularity, between human and machine.” (The author of that updated mechanistic nonsense is a director of engineering at Google.)
And even as technologism, which is not the same as technology, asserts itself over more and more precincts of human life, so too does scientism, which is not the same as science. The notion that the nonmaterial dimensions of life must be explained in terms of the material dimensions, and that nonscientific understandings must be translated into scientific understandings if they are to qualify as knowledge, is increasingly popular inside and outside the university, where the humanities are disparaged as soft and impractical and insufficiently new. The contrary insistence that the glories of art and thought are not evolutionary adaptations, or that the mind is not the brain, or that love is not just biology’s bait for sex, now amounts to a kind of heresy. So, too, does the view that the strongest defense of the humanities lies not in the appeal to their utility — that literature majors may find good jobs, that theaters may economically revitalize neighborhoods — but rather in the appeal to their defiantly nonutilitarian character, so that individuals can know more than how things work, and develop their powers of discernment and judgment, their competence in matters of truth and goodness and beauty, to equip themselves adequately for the choices and the crucibles of private and public life.
This gloomy inventory of certain tendencies in contemporary American culture — it is not the whole story, but it is an alarmingly large part of the story — is offered for the purpose of proposing an accurate name for our moment. We are not becoming transhumanists, obviously. We are too singular for the Singularity. But are we becoming posthumanists?
No culture is philosophically monolithic, or promotes a single conception of the human. A culture is an internecine contest between alternative conceptions of the human. Which culture is free of contradictions between first principles? This is no less true of religious cultures than of secular ones, of closed societies than of open ones. Popular culture may be as soaked in ideas as high culture: A worldview can be found in a song. Wherever mortal beings are thoughtful about their mortality, and finite beings ponder their finitude, at whatever level of intellectual articulation, there is philosophy. Philosophy is ubiquitous and inalienable; even the discourse about the end of philosophy is philosophy. A culture may be regarded as the sum of all the philosophies, all the reflective approaches to living, that are manifestly or latently expressed in a society. It is a gorgeous anarchy, even if it contains illusions and errors. There are worse things than being wrong.
Within a culture, however, some views may come to prevail over others, for intellectual or social reasons. The war between the worldviews has winners and losers, though none of the worldviews are ever erased and there is honor also in loss. In American culture right now, as I say, the worldview that is ascendant may be described as posthumanism. We have been here before, and not too long ago, but for different reasons. The posthumanism of the 1970s and 1980s was more insular, an academic affair of “theory,” an insurgency of professors; our posthumanism is a way of life, a social fate. An important book, a brilliant book, an exasperating book has just been written about the origins of that previous posthumanist moment. In “The Age of the Crisis of Man: Thought and Fiction in America, 1933-1973,” the gifted essayist Mark Greif, who reveals himself to be also a skillful historian of ideas, charts the history of the 20th-century reckonings with the definition of “man.” Strangely, he seems to regret the entire enterprise. Here is his conclusion: “Anytime your inquiries lead you to say, ‘At this moment we must ask and decide who we fundamentally are, our solution and salvation must lie in a new picture of ourselves and humanity, this is our profound responsibility and a new opportunity’ — just stop.” Greif seems not to realize that his own book is a lasting monument to precisely such inquiry, and to its grandeur. “Answer, rather, the practical matters,” he counsels, in accordance with the current pragmatist orthodoxy. “Find the immediate actions necessary to achieve an aim.” But before an aim is achieved, should it not be justified? And the activity of justification may require a “picture of ourselves.” Don’t just stop. Think harder. Get it right. (Why are liberals so afraid of their own philosophy?)
Greif’s book is a prehistory of our predicament, of our own “crisis of man.” (The “man” is archaic, the “crisis” is not.) It recognizes that the intellectual history of modernity may be written in part as the epic tale of a series of rebellions against humanism. Humanism has been savaged by theists and atheists, conservatives and progressives, fascists and socialists, scientists and philosophers, though it has also been propounded by the same diversity of thinkers. Who has not felt superior to humanism? It is the cheapest target of all: Humanism is sentimental, flabby, bourgeois, hypocritical, complacent, middlebrow, liberal, sanctimonious, constricting and often an alibi for power. The abusers of humanism, of course, are guilty of none of those sins. From Heidegger to Althusser, they come as emancipators. I think we should emancipate ourselves from their emancipations.
But what is humanism? For a start, humanism is not the antithesis of religion, as Pope Francis is exquisitely demonstrating. The most common understanding of humanism is that it denotes a pedagogy and a worldview. The pedagogy consists in the traditional Western curriculum of literary and philosophical classics, beginning in Greek and Roman antiquity and — after an unfortunate banishment of medieval culture from any pertinence to our own — erupting in the rediscovery of that antiquity in Europe in the early modern centuries, and in the ideals of personal cultivation by means of textual study and aesthetic experience that it bequeathed, or that were developed under its inspiration, in the “enlightened” 18th and 19th centuries, and eventually culminated in programs of education in the humanities in modern universities. The worldview takes many forms: a philosophical claim about the centrality of humankind to the universe, and about the irreducibility of the human difference to any aspect of our animality; a methodological claim about the most illuminating way to explain history and human affairs, and about the essential inability of the natural sciences to offer a satisfactory explanation; a moral claim about the priority, and the universal nature, of certain values, not least tolerance and compassion. It is all a little inchoate — human, humane, humanities, humanism, humanitarianism; but there is nothing shameful or demeaning about any of it.
And posthumanism? It elects to understand the world in terms of impersonal forces and structures, and to deny the importance, and even the legitimacy, of human agency. It certainly does not mean, as Greif correctly notes about antihumanism, a “hatred of the human.” There have been humane posthumanists and there have been inhumane humanists. But the inhumanity of humanists may be refuted on the basis of their own worldview, whereas the condemnation of cruelty toward “man the machine,” to borrow the old but enduring notion of an 18th-century French materialist, requires the importation of another framework of judgment. The same is true about universalism, which every critic of humanism has arraigned for its failure to live up to the promise of a perfect inclusiveness. It is a melancholy fact of history that there has never been a universalism that did not exclude. Yet the same is plainly the case about every particularism, which is nothing but a doctrine of exclusion; and the correction of particularism, the extension of its concept and its care, cannot be accomplished in its own name. It requires an idea from outside, an idea external to itself, a universalistic idea, a humanistic idea. Asking universalism to keep faith with its own principles is a perennial activity of moral life. Asking particularism to keep faith with its own principles is asking for trouble.
Aside from issues of life and death, there is no more urgent task for American intellectuals and writers than to think critically about the salience, even the tyranny, of technology in individual and collective life. All revolutions exaggerate, and the digital revolution is no different. We are still in the middle of the great transformation, but it is not too early to begin to expose the exaggerations, and to sort out the continuities from the discontinuities. The burden of proof falls on the revolutionaries, and their success in the marketplace is not sufficient proof. Presumptions of obsolescence, which are often nothing more than the marketing techniques of corporate behemoths, need to be scrupulously examined. By now we are familiar enough with the magnitude of the changes in all the spheres of our existence to move beyond the futuristic rhapsodies that characterize much of the literature on the subject. We can no longer roll over and celebrate and shop. Every phone in every pocket contains a “picture of ourselves,” and we must ascertain what that picture is and whether we should wish to resist it. Here is a humanist proposition for the age of Google: The processing of information is not the highest aim to which the human spirit can aspire, and neither is competitiveness in a global economy. The character of our society cannot be determined by engineers.
“Our very mastery seems to escape our mastery,” Michel Serres has anxiously remarked. “How can we dominate our domination; how can we master our own mastery?” Every technology is used before it is completely understood. There is always a lag between an innovation and the apprehension of its consequences. We are living in that lag, and it is a right time to keep our heads and reflect. We have much to gain and much to lose. In the media, for example, the general inebriation about the multiplicity of platforms has distracted many people from the scruple that questions of quality on the new platforms should be no different from questions of quality on the old platforms. Otherwise a quantitative expansion will result in a qualitative contraction. The new devices do not in themselves authorize a revision of the standards of evidence and argument and style that we championed in the old devices. (What a voluptuous device paper is!) Such revisions may be made on other grounds — out of commercial ambition, for example; but there is nothing innovative about pandering for the sake of a profit. The decision to prefer the requirements of commerce to the requirements of culture cannot be exonerated by the thrills of the digital revolution.
And therein lies a consoling irony of our situation. The machines may be more neutral about their uses than the propagandists and the advertisers want us to believe. We can leave aside the ideology of digitality and its aggressions, and regard the devices as simply new means for old ends. Tradition “travels” in many ways. It has already flourished in many technologies — but only when its flourishing has been the objective. I will give an example from the humanities. The day is approaching when the dream of the democratization of knowledge — Borges’s fantasy of “the total library” — will be realized. Soon all the collections in all the libraries and all the archives in the world will be available to everyone with a screen. Who would not welcome such a vast enfranchisement? But universal accessibility is not the end of the story, it is the beginning. The humanistic methods that were practiced before digitalization will be even more urgent after digitalization, because we will need help in navigating the unprecedented welter. Searches for keywords will not provide contexts for keywords. Patterns that are revealed by searches will not identify their own causes and reasons. The new order will not relieve us of the old burdens, and the old pleasures, of erudition and interpretation.
Is all this — is humanism — sentimental? But sentimentality is not always a counterfeit emotion. Sometimes sentiment is warranted by reality. The persistence of humanism through the centuries, in the face of formidable intellectual and social obstacles, has been owed to the truth of its representations of our complexly beating hearts, and to the guidance that it has offered, in its variegated and conflicting versions, for a soulful and sensitive existence. There is nothing soft about the quest for a significant life. And a complacent humanist is a humanist who has not read his books closely, since they teach disquiet and difficulty. In a society rife with theories and practices that flatten and shrink and chill the human subject, the humanist is the dissenter. Never mind the platforms. Our solemn responsibility is for the substance.
Leon Wieseltier is a contributing editor at The Atlantic and the author of “Kaddish.”