Bullshit. So he compares an annual list with a century of incredible ideas? That in itself is a straw man… Also, perhaps it's just a phase we're in – either societally, as we have only just entered a new age of information and have to regroup, OR as individuals, including Gabler… perhaps at an older stage we can settle down, tune out and produce our own 'ideas' to add to the fishies above everyone's heads.
And what about Chris Hedges? For just one example.
To me, Gabler reads like he had a thesis and wrote around it. Let's all make a list on this wall of those we consider worthy contemporary thinkers, shall we? I'll start with Alain de Botton, Slavoj Zizek…
WRITTEN BY: NEAL GABLER
August 13, 2011
THE July/August issue of The Atlantic trumpets the “14 Biggest Ideas of the Year.” Take a deep breath. The ideas include “The Players Own the Game” (No. 12), “Wall Street: Same as it Ever Was” (No. 6), “Nothing Stays Secret” (No. 2), and the very biggest idea of the year, “The Rise of the Middle Class — Just Not Ours,” which refers to growing economies in Brazil, Russia, India and China.
Now exhale. It may strike you that none of these ideas seems particularly breathtaking. In fact, none of them are ideas. They are more on the order of observations. But one can't really fault The Atlantic for mistaking commonplaces for intellectual vision. Ideas just aren't what they used to be. Once upon a time, they could ignite fires of debate, stimulate other thoughts, incite revolutions and fundamentally change the ways we look at and think about the world.
They could penetrate the general culture and make celebrities out of thinkers — notably Albert Einstein, but also Reinhold Niebuhr, Daniel Bell, Betty Friedan, Carl Sagan and Stephen Jay Gould, to name a few. The ideas themselves could even be made famous: for instance, for “the end of ideology,” “the medium is the message,” “the feminine mystique,” “the Big Bang theory,” “the end of history.” A big idea could capture the cover of Time — “Is God Dead?” — and intellectuals like Norman Mailer, William F. Buckley Jr. and Gore Vidal would even occasionally be invited to the couches of late-night talk shows. How long ago that was.
If our ideas seem smaller nowadays, it’s not because we are dumber than our forebears but because we just don’t care as much about ideas as they did. In effect, we are living in an increasingly post-idea world — a world in which big, thought-provoking ideas that can’t instantly be monetized are of so little intrinsic value that fewer people are generating them and fewer outlets are disseminating them, the Internet notwithstanding. Bold ideas are almost passé.
It is no secret, especially here in America, that we live in a post-Enlightenment age in which rationality, science, evidence, logical argument and debate have lost the battle in many sectors, and perhaps even in society generally, to superstition, faith, opinion and orthodoxy. While we continue to make giant technological advances, we may be the first generation to have turned back the epochal clock — to have gone backward intellectually from advanced modes of thinking into old modes of belief. But post-Enlightenment and post-idea, while related, are not exactly the same.
Post-Enlightenment refers to a style of thinking that no longer deploys the techniques of rational thought. Post-idea refers to thinking that is no longer done, regardless of the style.
The post-idea world has been a long time coming, and many factors have contributed to it. There is the retreat in universities from the real world, and an encouragement of and reward for the narrowest specialization rather than for daring — for tending potted plants rather than planting forests.
There is the eclipse of the public intellectual in the general media by the pundit who substitutes outrageousness for thoughtfulness, and the concomitant decline of the essay in general-interest magazines. And there is the rise of an increasingly visual culture, especially among the young — a form in which ideas are more difficult to express.
But these factors, which began decades ago, were more likely harbingers of an approaching post-idea world than the chief causes of it. The real cause may be information itself. It may seem counterintuitive that at a time when we know more than we have ever known, we think about it less.
We live in the much vaunted Age of Information. Courtesy of the Internet, we seem to have immediate access to anything that anyone could ever want to know. We are certainly the most informed generation in history, at least quantitatively. There are trillions upon trillions of bytes out there in the ether — so much to gather and to think about.
And that’s just the point. In the past, we collected information not simply to know things. That was only the beginning. We also collected information to convert it into something larger than facts and ultimately more useful — into ideas that made sense of the information. We sought not just to apprehend the world but to truly comprehend it, which is the primary function of ideas. Great ideas explain the world and one another to us.
Marx pointed out the relationship between the means of production and our social and political systems. Freud taught us to explore our minds as a way of understanding our emotions and behaviors. Einstein rewrote physics. More recently, McLuhan theorized about the nature of modern communication and its effect on modern life. These ideas enabled us to get our minds around our existence and attempt to answer the big, daunting questions of our lives.
But if information was once grist for ideas, over the last decade it has become competition for them. We are like the farmer who has too much wheat to make flour. We are inundated with so much information that we wouldn’t have time to process it even if we wanted to, and most of us don’t want to.
The collection itself is exhausting: what each of our friends is doing at that particular moment and then the next moment and the next one; who Jennifer Aniston is dating right now; which video is going viral on YouTube this hour; what Princess Letizia or Kate Middleton is wearing that day. In effect, we are living within the nimbus of an informational Gresham’s law in which trivial information pushes out significant information, but it is also an ideational Gresham’s law in which information, trivial or not, pushes out ideas.
We prefer knowing to thinking because knowing has more immediate value. It keeps us in the loop, keeps us connected to our friends and our cohort. Ideas are too airy, too impractical, too much work for too little reward. Few talk ideas. Everyone talks information, usually personal information. Where are you going? What are you doing? Whom are you seeing? These are today’s big questions.
It is certainly no accident that the post-idea world has sprung up alongside the social networking world. Even though there are sites and blogs dedicated to ideas, the most popular sites on the Web — Twitter, Facebook, Myspace, Flickr, etc. — are basically information exchanges, designed to feed the insatiable information hunger, though this is hardly the kind of information that generates ideas. It is largely useless except insofar as it makes the possessor of the information feel, well, informed. Of course, one could argue that these sites are no different from what conversation was for previous generations, and that conversation seldom generated big ideas either, and one would be right.
BUT the analogy isn’t perfect. For one thing, social networking sites are the primary form of communication among young people, and they are supplanting print, which is where ideas have typically gestated. For another, social networking sites engender habits of mind that are inimical to the kind of deliberate discourse that gives rise to ideas. Instead of theories, hypotheses and grand arguments, we get instant 140-character tweets about eating a sandwich or watching a TV show. While social networking may enlarge one’s circle and even introduce one to strangers, this is not the same thing as enlarging one’s intellectual universe. Indeed, the gab of social networking tends to shrink one’s universe to oneself and one’s friends, while thoughts organized in words, whether online or on the page, enlarge one’s focus.
To paraphrase the famous dictum, often attributed to Yogi Berra, that you can’t think and hit at the same time, you can’t think and tweet at the same time either, not because it is impossible to multitask but because tweeting, which is largely a burst of either brief, unsupported opinions or brief descriptions of your own prosaic activities, is a form of distraction or anti-thinking.
The implications of a society that no longer thinks big are enormous. Ideas aren’t just intellectual playthings. They have practical effects.
An artist friend of mine recently lamented that he felt the art world was adrift because there were no longer great critics like Harold Rosenberg and Clement Greenberg to provide theories of art that could fructify the art and energize it. Another friend made a similar argument about politics. While the parties debate how much to cut the budget, he wondered where were the John Rawlses and Robert Nozicks who could elevate our politics.
One could certainly make the same argument about economics, where John Maynard Keynes remains the center of debate nearly 80 years after propounding his theory of government pump priming. This isn’t to say that the successors of Rosenberg, Rawls and Keynes don’t exist, only that if they do, they are not likely to get traction in a culture that has so little use for ideas, especially big, exciting, dangerous ones, and that’s true whether the ideas come from academics or others who are not part of elite organizations and who challenge the conventional wisdom. All thinkers are victims of information glut, and the ideas of today’s thinkers are also victims of that glut.
But it is especially true of big thinkers in the social sciences like the cognitive psychologist Steven Pinker, who has theorized on everything from the source of language to the role of genetics in human nature, or the biologist Richard Dawkins, who has had big and controversial ideas on everything from selfishness to God, or the psychologist Jonathan Haidt, who has been analyzing different moral systems and drawing fascinating conclusions about the relationship of morality to political beliefs. But because they are scientists and empiricists rather than generalists in the humanities, the place from which ideas were customarily popularized, they suffer a double whammy: not only the whammy against ideas generally but the whammy against science, which is typically regarded in the media as mystifying at best, incomprehensible at worst. A generation ago, these men would have made their way into popular magazines and onto television screens. Now they are crowded out by informational effluvium.
No doubt there will be those who say that the big ideas have migrated to the marketplace, but there is a vast difference between profit-making inventions and intellectually challenging thoughts. Entrepreneurs have plenty of ideas, and some, like Steven P. Jobs of Apple, have come up with some brilliant ideas in the “inventional” sense of the word.
Still, while these ideas may change the way we live, they rarely transform the way we think. They are material, not ideational. It is thinkers who are in short supply, and the situation probably isn’t going to change anytime soon.
We have become information narcissists, so uninterested in anything outside ourselves and our friendship circles or in any tidbit we cannot share with those friends that if a Marx or a Nietzsche were suddenly to appear, blasting his ideas, no one would pay the slightest attention, certainly not the general media, which have learned to service our narcissism.
What the future portends is more and more information — Everests of it. There won’t be anything we won’t know. But there will be no one thinking about it.
Think about that.
Neal Gabler is a senior fellow at the Annenberg Norman Lear Center at the University of Southern California and the author of “Walt Disney: The Triumph of the American Imagination.”