I am as light as a feather, I am as happy as an angel, I am as merry as a schoolboy. I am as giddy as a drunken man. A merry Christmas to everybody!

The "Great Man" theory of history is usually attributed to the Scottish philosopher Thomas Carlyle, who wrote that "the history of the world is but the biography of great men." He believed that it is the few, the powerful and the famous who shape our collective destiny as a species. That theory took a serious beating this year.I simply cannot wait to find out how that theory took a serious beating this year, a year (see the NSA scandal, the Military Commissions Act, the continued, intransigent direction of the war) in which I would have found it more likely that my own father turn into a giant sprig of parsley than someone declare it an exception to the Great Men of History thesis.
To be sure, there are individuals we could blame for the many painful and disturbing things that happened in 2006. The conflict in Iraq only got bloodier and more entrenched. A vicious skirmish erupted between Israel and Lebanon. A war dragged on in Sudan. A tin-pot dictator in North Korea got the Bomb, and the President of Iran wants to go nuclear too. Meanwhile nobody fixed global warming, and Sony didn't make enough PlayStation3s.
But look at 2006 through a different lens [read: complete cretinism] and you'll see another story, one that isn't about conflict or great men. It's a story about community and collaboration on a scale never seen before. It's about the cosmic compendium of knowledge Wikipedia and the million-channel people's network YouTube and the online metropolis MySpace. It's about the many wresting power from the few and helping one another for nothing and how that will not only change the world, but also change the way the world changes [this clause is meaningless].

You'll note the first instance of the language of "democratic revolution" in that "many wresting power from the few" remark. Jesus Christ...
The tool that makes this possible is the World Wide Web. Not the Web that Tim Berners-Lee hacked together (15 years ago, according to Wikipedia) as a way for scientists to share research. It's not even the overhyped dotcom Web of the late 1990s. The new Web is a very different thing. It's a tool for bringing together the small contributions of millions of people and making them matter. Silicon Valley consultants call it Web 2.0, as if it were a new version of some old software. But it's really a revolution.

World-changing (and changing-world-changing?) revolutions that have to be explained to you really are revolutions, we swear.
And we are so ready for it. We're ready to balance our diet of predigested news with raw feeds from Baghdad and Boston and Beijing. You can learn more about how Americans live just by looking at the backgrounds of YouTube videos—those rumpled bedrooms and toy-strewn basement rec rooms—than you could from 1,000 hours of network television.

Interestingly, both YouTube and network television are considered better ways to "learn more about how Americans live" than, say, meeting and interacting with fellow Americans. It seems that you're going to be learning from a screen no matter what, but at least now it's an exciting new screen!
And we didn't just watch, we also worked. Like crazy. We made Facebook profiles and Second Life avatars and reviewed books at Amazon and recorded podcasts. We blogged about our candidates losing and wrote songs about getting dumped. We camcordered bombing runs and built open-source software.

Homeless man: I haven't found work in a year. Can you spare some change?
America loves its solitary geniuses—its Einsteins, its Edisons, its Jobses—but those lonely dreamers may have to learn to play with others. Car companies are running open design contests. Reuters is carrying blog postings alongside its regular news feed. Microsoft is working overtime to fend off user-created Linux. We're looking at an explosion of productivity and innovation, and it's just getting started, as millions of minds that would otherwise have drowned in obscurity get backhauled into the global intellectual economy.

I've spent ten minutes trying to think of something to say about this paragraph, which means writer Lev Grossman must have spent two, although he might have received help from Thomas Friedman on that last sentence.
Who are these people? Seriously, who actually sits down after a long day at work and says, I'm not going to watch Lost tonight. I'm going to turn on my computer and make a movie starring my pet iguana? I'm going to mash up 50 Cent's vocals with Queen's instrumentals? I'm going to blog about my state of mind or the state of the nation or the steak-frites at the new bistro down the street? Who has that time and that energy and that passion?

At this point I'm beginning to question this whole exercise. Is this a joke year? Or have we really elevated the mundane to the level of earth-shaking? Both? Even if they've one-upped their critics by being both, does their hip irony make them any less stupid? Or have they caught me in a bind: I'm blogging my state of mind, therefore entangling myself in their whole twisted system? Can we even think ourselves outside of Web 2.0 anymore? Or has it become the very ground of and sole possibility for critique? The situation screams out for Baudrillard. Or, like, finishing this shitty article.
The answer is, you do. And for seizing the reins of the global media, for founding and framing the new digital democracy, for working for nothing and beating the pros at their own game, TIME's Person of the Year for 2006 is you.

I doubt the Bolsheviks ever came up with a more congratulatory speech to the Russian proletariat after they seized the means of production. And all we had to do was forsake television in order to videotape our pet, all the while ignoring the daily atrocities of 21st century life. Dude, YouTube is fucking sweet. Plus, I totally TiVo'd Lost, anyway.
Sure, it's a mistake to romanticize all this any more than is strictly necessary. Web 2.0 harnesses the stupidity of crowds as well as their wisdom. Some of the comments on YouTube make you weep for the future of humanity just for the spelling alone, never mind the obscenity and the naked hatred.

The ontology of the crowd is an amazing thing. First, it was (rightly, or at least accurately) depicted as an angry group of people, most likely threatening to the powers that be. An entire literature of tropes has sprung up in this respect concerning the "ignorance of the multitude." Now thousands of solitary Americans, furiously pecking away at their keyboards in their middle-class homes, can still be conceptualized as a "crowd," and are still considered in need of a good, anti-ochlocratic harangue now and then. Will we still be giving lectures about the stupidity of the crowd when people live their entire lives in front of a screen? When they never leave their homes? When stupidity is firmly entrenched and crowds are inconceivable? Is this the greatest achievement of Web 2.0, a success which other totalitarianisms could only dream of: groupthink without the group?
But that's what makes all this interesting. Web 2.0 is a massive social experiment, and like any experiment worth trying, it could fail. There's no road map for how an organism that's not a bacterium lives and works together on this planet in numbers in excess of 6 billion. But 2006 gave us some ideas. This is an opportunity to build a new kind of international understanding, not politician to politician, great man to great man, but citizen to citizen, person to person. It's a chance for people to look at a computer screen and really, genuinely wonder who's out there looking back at them. Go on. Tell us you're not just a little bit curious.

Take the third-to-last sentence. Now subtract everything after the word "screen": "It's a chance for people to look at a computer screen." Yup. See, we have discussed Web 2.0 before (complete with sweet Fishstix/Kushakov exchange). The internet is an opportunity, no doubt about it. But unless it is your means of livelihood, it is simply a mere means. If you meet your spouse on the internet, or organize a meeting on the internet, or stay in touch with your friends on the internet, there's nothing disrespectable about that. But the end is your spouse, the meeting, and your friends. Hypostasizing the mere act of looking at a computer screen as the most heroic thing one can do amounts to nothing; it is a weird hagiography of everyday life, a bit like congratulating Americans for buying automobiles at the turn of the century (certainly a "revolution" in itself) or moving to the suburbs in the '50s. Yet Time wants to tout the internet as something more than consumption, although I can guarantee that probably 99% of all Web 2.0 "usage" is consumptive. We are making the solitudinous internet the alibi (literally "elsewhere") of politics, art, and friendship--the interactive spheres of human existence--at a time when the world despises us for the very narcissism Time extols.
And in our haste to discuss everything and do nothing, we leave our most despicable traits intact. If President Bush had been more tech-savvy, and perhaps more politically astute, then rather than encouraging us after the September 11 attacks to go out and "buy," he would have told us to blog about it.
Consider a curricular example. Decades ago a thinker who'd witnessed oppression firsthand embarked upon a multibook investigation into the operations of society and power. Mingling philosophical analysis and historical observation, he produced an interpretation of modern life that traced its origins to the Enlightenment and came down to a fundamental opposition: the diverse energies of individuals versus the regulatory acts of the state and its rationalizing experts. Those latter were social scientists, a caste of 18th- and 19th-century theorists whose extension of scientific method to social relations, the thinker concluded, produced some of the great catastrophes of modern times.
This is just plain wrong. The whole point of Foucault's analysis is that the "state and its rationalizing experts" are not the only source of authority in society. Indeed, what's shocking about Foucault's thought is that former strongholds of independent thought and critical resistance (revolutionaries, the sexual liberation movement, et al.) contain their own normativizing power and oppression. And not even oppression, but merely subject formation--that's the other crucial point: whereas Hayek was concerned with the enforcement of power by the state, Foucault showed how power flowed freely, unquestioningly, often secretly between people. That's why Foucault and other post-whatever-you-want-to-call-it thinkers are unappealing to traditional radicals: they leave no vantage point from which one can make an enlightened critique of society. Bad news for Marxists, not conservatives like Hayek.

Here's the rub: I don't mean Michel Foucault. The description fits him, but it also fits someone less hallowed in academe today: Friedrich A. von Hayek, the economist and social philosopher. Before and after World War II, Hayek battled the cardinal policy sin of the time, central planning and the socialist regimes that embraced it. He remains a key figure in conservative thought, an authority on free enterprise, individual liberty, and centralized power.
To take a specific example, for the United States to join the International Criminal Court would be neither an isolationist policy nor a hegemonic one, but rather a liberal policy in which we submit to an egalitarian framework of rules and cooperate with others in the effort to enforce those rules.

This sort of thinking I take to be in line with those who say that the Iraq War has been an "aberration," and that America has strayed from the path of diplomacy and multilateralism which has earned us respect in the past and can do so again in the future. Yglesias suggests as much:
Truman did not seek to simply implement American domination. Rather, he constructed an alternative vision of a liberal community of nations featuring complex forms of cooperation between states within the framework of liberal institutions like NATO and the EU. The collapse of the Soviet Union creates, in essence, a fork in the road. The United States can either seek to fill the void with unipolar hegemony, or else it can seek to expand the scope of the miniature liberal order created during the Cold War.

To which Kagan responds, quite directly, that the idea of an "aberration" is an illusion: "Since the cold war, America has launched more military interventions than all other great powers combined. The interventions in Somalia, Haiti, Bosnia and Kosovo were wars of choice, waged for moral and humanitarian ends, not strategic or economic necessity, just as realist critics protested at the time." Attempts to refashion the era before the Iraq War as one of peaceful internationalism are flawed and dishonest: "There is a yearning, even among the self-proclaimed realists, for a return to an imagined past innocence, to the mythical 'traditional approach', to a virtuous time that never existed, not even at the glorious birth of the republic." The real constant has been that America is "an ambitious, ideological, revolutionary nation with a belief in its own world-transforming powers and a historical record of enough success to sustain that belief." In other words: Don't try to talk away Iraq, because wars of choice are in our blood.
The problem isn't Grameen's size or its borrowers, but its philosophy: Yunus is firmly anti-profit. "Maybe banks can make a profit from [loaning money to the poor]. ... But this is what loan sharks do," Yunus said after his Nobel win was announced in October. "We have enough enterprises generating money for profit. I would rather think that the rich can set up social enterprises." Yunus even objects to the term "microfinance," preferring the profit-neutral "microcredit."

Besides one later reference to the low interest rates Grameen charges (gasp!), Curry offers not a single statistic about Grameen, nor another quotation from Yunus. Instead, we get lots of statistics about microfinance (stick it to 'em, Andrew!) banks in Serbia and Bolivia, which by virtue of their being mentioned alongside "numbers" are somehow supposed to be more appealing to tough-minded, manly, realist men -- not those Gandhian types who think non-violence, or non-profit-driven banking, or whatever, is a real strategy.
"Robb was especially interested in sending more U.S. forces, according to one participant, and the panel considered proposals to deploy 100,000 to 200,000 additional troops. Ultimately, though, the panel discovered that there might be only 20,000 available, prompting vigorous discussion that led members to conclude that a substantial surge was unworkable."Frederick Kagan has a long article in the Weekly Standard that tries to work with this low number. Basically, all he can recommend is sending inexperienced reserve troops and extending others' tours of duty. In other words, throwing undertrained, scared, and weary soldiers into a fight for a Grand and Noble Idea, when all experience shows us that this will not work. I guess you've got to admire Kagan's "patriotism," though... At best this tack is blind zeal, even delusion, at worst clinical insanity. In short, a world of anti-reality.