David Graeber

Anthropology and the rise of the professional-managerial class

2014


Abstract

Many of the internal changes within anthropology as a discipline—particularly the “postmodern turn” of the 1980s—can only be understood in the context of broader changes in the class composition of the societies in which university departments exist, and, in particular, the role of the university in the reproduction of a professional-managerial class that has come to displace any working-class elements in what pass for mainstream “left” political parties. Reflexivity, and what I call “vulgar Foucauldianism,” while dressed up as activism, seem instead to represent above all the consciousness of this class. In its place, the essay proposes a politics combining support for social movements and a prefigurative politics in the academic sphere.


What follows are some preliminary reflections on the larger political-economic context of the contemporary university, and the place of anthropology within it. I want to focus particularly on the role of what Barbara and John Ehrenreich (1979) first dubbed the “professional-managerial class.”[1] As a child of working-class parents, there’s a lot I could say about the reproduction of class structures in academia, but I have decided, for reasons that will eventually become apparent, not to dwell on my own experiences. Instead I will be making some much broader points about the issues anthropologists talk about, and, even more, the ones they don’t; about the way our practice can reproduce structures of inequality even as we claim to challenge them; and, finally, I will present at least the glimmer of a way out.


Here is my basic premise (and I recognize it is in some ways quite different from the Ehrenreichs’).

The period following the 1970s, I would argue, witnessed not only the financialization of capital, but also a concomitant process whereby the professional-managerial class not only changed form, but also, as it did so, gradually came to displace any remaining working-class institutions as the main constituency of what were considered “left” political parties. In the end, parties like Clinton’s “New Democrats” or Blair’s “New Labour” became the parties of increasingly corporatized public and private bureaucrats. (And indeed, in this period, public and private bureaucracies became increasingly difficult to distinguish.) This isn’t really the place to make the socioeconomic argument in detail, but essentially I am taking my inspiration from scholars such as Lazonick (1988, 2009, 2012) or Duménil and Lévy (2004, 2011), who have argued that what really marks the beginning of what we call the age of “financialization”—or, for that matter, neoliberalism—is a kind of shift of class alliances within the structure of large corporations. To put it very simply, ever since the rise of corporate capitalism in the United States and Germany at the end of the nineteenth century, the “technostructure” of those corporations—as J. K. Galbraith (1967) famously called it—including their own internal bureaucracies, had been mainly oriented inward. Even top executives in the perfume business or electronics companies were largely interested in producing more and better perfumes or electronics. At the same time, lifetime job security made it easier for all employees to identify with the company. As a result, both workers and management tended to see financiers and financial interests as outsiders, even interlopers.[2] In the 1970s and 1980s, all this began to change, and the upper echelons essentially shifted their allegiances and realigned with the financial classes. The notorious boom in mergers and acquisitions, asset-stripping, and the like, so widely remarked on at the time; the abandonment of former guarantees of lifetime employment; the use of stock options to pay executives, and increasingly even skilled workers—all were manifestations of this shift of allegiances. But in fact it ran deeper. During this period, financial elites and corporate bureaucrats essentially merged: the two classes began to intermarry; their careers tended to move back and forth between the different sectors; they came to speak the same language, share the same tastes, and see the world in identical terms. This gradually had profound cultural effects, at first in North Atlantic countries, and then among wealthy countries everywhere. Here I will just single out two. First, it seems to me that the profound bureaucratization of almost every aspect of social life that has marked the neoliberal era (Fisher 2009; Hibou 2012; Graeber forthcoming)—a bureaucracy in which it is increasingly difficult even to distinguish public and private elements—really traces back to this period. Second, the political dominance of this new financial-bureaucratic class was cemented by bringing on board large sectors of the middle classes (the professionals and managers again): essentially, by encouraging them to see the world from the perspective of investors. (Think, here, of how in the 1980s, at the very time almost all American newspapers were getting rid of their labor reporters—such things used to exist!—TV news reports began running crawls of the latest stock quotes along the bottom of the screen, as if this were clearly information of general interest to everyone.)

All this is an extremely abbreviated version of a much more elaborate argument, but I am outlining it only to set the stage. What’s important here is that these developments created a peculiar dilemma for the academy. On the one hand, campuses that in the 1960s had been the focus of actual social movements, even revolutionary movements, were largely depoliticized. On the other hand, much of the language and sensibilities of such movements were maintained, even as this period saw the consolidation of the university as the place for the reproduction of a class that, in its upper echelons at least, had become no longer a mere auxiliary to power but something very close to a branch of the ruling class in its own right, and that even in its middle ranks largely identified with it.

To understand how all this happened, I think one needs to understand something else about the phenomenon of “financialization.” Here I am referring not so much to the composition of this new financial bureaucratic class, or to its political alliances, but to the actual basis of its wealth. “Financialization” is often represented—even by its harshest critics—as being characterized by a kind of “casino capitalism”; as a matter of conjuring value out of nothing; as an abstract game detached from what is often called the “real economy” or (if one prefers Marxist terms) from actual class relations. But this is not the case at all. Really, financialization—to put it in somewhat crude Marxist terms—marks a shift of the center of gravity of surplus extraction from wages and commerce to various forms of rent taking: that is, to direct extraction, through semifeudal relations in which financial interests work closely with state power (“policy,” as it’s euphemistically termed) to create conditions of mass indebtedness. In the United States, which pioneered this version of capitalism, universities actually play a critical role—second only to the real estate sector—through the intentional fostering of mass student loan debt, which the government then plays a very punitive role in collecting. As a result, between mortgage fees, student loan payments, credit card and banking fees and penalties, a typical family in the United States today may well end up having a third or even half of its income—exact numbers are hard to come by—directly appropriated by the FIRE sector (Finance, Insurance, Real Estate). The language we use to discuss all this is profoundly deceptive. We have not seen a “deregulation” of the financial sector. The sector is intensely regulated. It’s just that most of the legislation that regulates the banks is written by the banks themselves, through an institutionalized system of legalized bribery referred to as “lobbying” and “campaign finance.” One result of this is that the role of the state in making corporate profits possible has undergone a fundamental change. It no longer merely preserves the infrastructure and property relations that make indirect extraction through the wage possible; the coercive mechanisms of the state—the legal system, the threat of courts, bailiffs, prisons, and police—play a direct role in the extraction itself. While it is still rare for people to actually be locked up for debt in America, the apparatus is at play in every aspect of the process. The fact that larger and larger percentages of the population are themselves being drafted into what is broadly characterized as “guard labor”—security guards, supervisors, debt collectors, those involved in the surveillance or oversight of other laborers—is an intrinsic part of this process (Jayadev and Bowles 2006; for the argument in general, see Graeber 2013).

All of this, too, has had profound effects in turn on how most people imagine their own class positions. One might say: even as the middle classes are in one sense being pulled upward to identify with the perspectives of the financial sector, the actual operations of financialization are pulling them down in such a way as to make it increasingly difficult for many to see themselves as middle class at all.

The neoliberal age was initiated, in the 1980s, by an attack on the political place of labor—the breaking of the miners’ strike in the United Kingdom, the PATCO strike in the United States, the rail strike in Japan—followed by an eventual purging of any working-class influence over any mainstream political party. This was accompanied by an idea that mass home ownership, access to consumer credit, 401(k)s, and the like, would allow the bulk of the population to identify themselves no longer as working class but as middle class. But there is, I think, a catch here. “Middle classness” is not really an economic category at all; it was always more social and political. What being middle class means, first and foremost, is a feeling that the fundamental social institutions that surround one—whether police, schools, social service offices, or financial institutions—ultimately exist for your benefit. That the rules exist for people like yourself, and if you play by them correctly, you should be able to reasonably predict the results. This is what allows middle-class people to plot careers, even for their children, to feel they can project themselves forward in time, with the assumption that the rules will always remain the same, that there is a social ground under their feet. (This is obviously much less true either for the upper classes, who see themselves as existing in history, which is always changing, or the poor, who rarely have much control over their life situation.) An easy rule of thumb is: if you see a policeman on the street at night and feel more safe, rather than less safe, chances are you’re middle class. This would anyway explain why most people in, say, Pakistan or Nigeria do not feel middle class, and most people in the United States, traditionally, have done. But one paradoxical result of financialization is that this is reversing. One recent survey revealed that for the first time, most Americans no longer consider themselves middle class. It’s not hard to see why. If you find yourself facing eviction from your house, owing to an illegal robo-signing foreclosure that the government refuses to prosecute—even though armed government agents are, nonetheless, willing to arrive to actually expel you from your family home—then it doesn’t really matter what your income level is. You will not feel particularly middle class. And millions of people are now finding themselves in this or in analogous situations.

It’s increasingly members of the professional-managerial classes themselves—who typically inhabit the top fifth of the income scale—who most consistently identify themselves as middle class, and see themselves as embodying middle-class values and sensibilities. These are people for whom the rules, both tacit and explicit, are basically everything. They are also the traditional enemies of the working classes. As radical theorists like Michael Albert were already pointing out in the 1970s, this is the key flaw of traditional socialism: actual members of the working classes have no immediate hatred for capitalists because they never meet them; in most circumstances, the immediate face of oppression comes in the form of managers, supervisors, bureaucrats, and educated professionals of one sort or another—that is, precisely the people to whom a state socialist regime would give more power, rather than less (Albert and Hahnel 1979; Albert 2003). The decisive victory of capitalism in the 1980s and 1990s, ironically, has had precisely the same effect. It has led to both a continual inflation of what are often purely make-work managerial and administrative positions—“bullshit jobs”—and an endless bureaucratization of daily life, driven, in large part, by the Internet. This in turn has allowed a change in dominant conceptions of the very meaning of words like “democracy.” The obsession with form over content, with rules and procedures, has led to a conception of democracy itself as a system of rules, a constitutional system, rather than as a historical movement toward popular self-rule and self-organization driven by social movements, or even, increasingly, as an expression of popular will.

The politics of Blair and Clinton were the inevitable outcome of such developments: a “pragmatic left” embrace of both the market and bureaucracy simultaneously, in a way that could not possibly make sense to anyone who was not fully incorporated into the sensibilities of those newly corporatized professional-managerial classes, and which was veritably designed to completely alienate any remaining working-class constituents. At the same time, the end of an older Keynesian class compromise has meant that access to those institutions by working-class organizations or individuals has been virtually denied, with the result that actual members of the working class (or, in America and Europe at least, the white working class) have become increasingly prone to identify, out of sheer rejection of the values of the professionals and administrators, with the populist right.

Obviously, this is a class whose sensibilities are largely produced by universities, which have, in turn, themselves been transformed by the rise of this class, coming to be seen as essentially training grounds for professionals and managers of various sorts rather than as autonomous institutions in their own right. This is actually worth emphasizing. Universities are—or, better said, until recently have been—among the few institutions that have survived more or less intact from the High Middle Ages. As a result, universities still reflected an essentially medieval conception of self-organization and self-governance; this was an institution managed by scholars for the pursuit of scholarship, of forms of knowledge that were seen as valuable in their own right. This did not fundamentally change at the beginning of the nineteenth century, when university systems entered into an often somewhat uneasy alliance with centralizing states, providing training for the civil service in exchange for keeping the basic principle of autonomy intact. Obviously this autonomy was compromised in endless ways in practice. But it existed as an ideal. And it was important. It made a difference both in legitimizing the basic idea of a domain of autonomous production driven by values other than those of the market, and in any number of very practical ways as well: for instance, universities were, traditionally, spaces in cities not directly under the jurisdiction of the police.

In this sense what’s happened to universities since the 1970s—very unevenly, but pretty much everywhere—has represented a fundamental break of a kind we have not seen in eight hundred years. As Gayatri Spivak remarked in a talk she gave to Occupy Wall Street, even twenty years ago, when people spoke of “the university,” in the abstract, they were referring to the faculty; nowadays, when they speak of “the university,” they are referring to the administration. Universities are no longer corporations in the medieval sense; they are corporations in the capitalist sense, bureaucratic institutions organized around the pursuit of profit, even though the “profit” in question is, nowadays, slightly more broadly conceived. They are most certainly not institutions dedicated to the pursuit of knowledge and understanding as a value in itself. In that sense, I really think it can be said that the university, in the original conception of the term, is dead.[3]

But the death of the university has also been accompanied by a curious double movement. Scholars are expected to spend less and less of their time on scholarship, and more and more on various forms of administration—even as their administrative autonomy is itself stripped away. Here too we find a kind of nightmare fusion of the worst elements of state bureaucracy and market logic. But at the same time, just about everyone involved in some form of autonomous cultural production—which has traditionally operated at least somewhat outside the logic of capitalism—is expected to become part of this system: not just independent intellectuals, who effectively no longer exist, but painters, sculptors, poets, even investigative journalists. Finally, the marketization has been accompanied by the introduction of policies of overt violence against dissent, as—here again, the United States has led the way in recent years, but its model has been broadly imitated—police armed with weaponized torture devices like Tasers and pepper spray, and even SWAT teams trained in militaristic counterterrorism tactics, are deployed at the slightest sign of opposition.

We might well ask ourselves how academics have come to accept such things as simple, inevitable realities. Just a few months ago,[4] a revival of the student movement against the marketization of higher education in London was greeted by immediate and violent repression; I personally witnessed young scholars having teeth knocked out, being kicked and beaten, blood splattered on the streets in front of Senate House, all in response to a nonviolent sit-in—and this following such outrageous measures as the elected president of the University of London Student Union being banned from political activities on campus for failure to ask police permission for an on-campus march—all without a majority of “radical” lecturers so much as knowing it happened, let alone raising any significant protest. One could see a similar indifference in the response of the liberal classes in the United States to the violent suppression of the Occupy movement in late 2011. There is a reason for this indifference. According to the prevailing ethos of proceduralism, it’s almost impossible for any legally authorized act, even if it does involve knocking out the teeth of peaceful protestors, to be considered violent, and equally difficult for any extralegal procedure, even if it is conducted in such a way that it could not possibly harm anyone, to be considered anything else. As a result, the militarization of our societies comes to infiltrate the sensibilities even of those who consider themselves Gandhians—in fact, one might even say, for the Gandhians most of all.

I don’t want to sound too pessimistic. Considering the level of repression and indifference, the very existence of the student movement is inspiring. And such movements are in a better position to shape the future direction of events than we might imagine. After all, there is every reason to believe that neoliberalism, as an economic model, is in terminal crisis. The fact that one of the first political moves of the UK elite, in the wake of the economic collapse of 2008, and the concomitant delegitimation of market orthodoxies, was a full-scale attack on the autonomy of the university system, and an attempt to submit it even more thoroughly to market logic, shows that the political class, at least, is well aware of where potential ideological threats might come from. It will take some time, no doubt, because neoliberals have placed such an extraordinary emphasis on winning the ideological game—arguably, at the expense of undermining capitalism’s own long-term economic viability[5]—but it’s clear that the degree to which the academy does remain the guardian of pretty much any possible alternative conception of social value gives it a unique potential role in developing whatever comes next.

These last words are quite intentional. It seems to me that this final, financialized stage of capitalism is a terminal one. The ideological game is the only one that capitalism has really won. The system seems to have run out of steam and to be rapidly approaching a dead end by almost any measure: whether in growth, sustainability, technological development, or political imagination—and this even apart from the possibility of imminent ecological catastrophe. Now you might argue that the phrase “capitalism” itself is deceptive—as many anthropologists who would otherwise be seen as procapitalist, such as the cultural economists whom Chris Gregory has described (this volume), are wont to do, or many working in the Marxist tradition who prefer to speak of the dominance of capital within a world economy organized around multiple competing systems of value. There is a lot to be said for the latter position. But it would be unwise, at this historical juncture, to deploy such arguments as a tacit case for the eternity of the existing system. For me at least it is less a question of whether capitalism—at least, in any historically recognizable form—is going to be here fifty years from now, and more one of whether the next thing will be even worse. This seems a disastrous time to place a taboo on even thinking about what might be better.


Anthropology and the failure of the 1980s critique

For the first two decades of neoliberalism, the term was almost never used in the academy; instead, the new dispensation was discussed almost exclusively as the advent of a giddy new age of “postmodernism”—just one that, in retrospect, almost precisely reproduced the language and spirit of the neoliberal “globalization” being presented in the media at the time. Almost all the emerging theoretical foci of the time—identity, creative consumption, flows and scapes, and so on—turned out to encode a kind of neoliberal cosmology in miniature.[6] Even more, poststructural theory—particularly in the form of what might be termed the “vulgar Foucauldianism” that came to dominate so many ostensibly oppositional academic disciplines at the time—came to enshrine the particular class experiences of the professional-managerial class as universal truths: that is, a world of networks and networking, where games of power create social reality itself, where all truth-claims are merely stratagems, and where mechanisms of physical coercion are made to seem irrelevant (even as they became ever more omnipresent) because all the real action is assumed to take place within techniques of self-discipline, forms of performance, and an endless variety of dispersed and decentered flows of influence. As a description of academic life, or for that matter professional life in general, this is often spot on. But it is not what life is like for most people on earth, and never has been. Indeed, the very fact that it was being posed not as a type of class experience but as a universal truth (in fact the only universal truth, since all others are denied) demonstrates just how wrong-headed the tendency, at this time, to dismiss older forms of ideology really was.[7]

Now, how does anthropology fit into all of this? Well, in the 1980s, it did at first appear to be moving in the opposite direction to most disciplines, where “postmodernism” hovered somewhere between toothless mock radicalism and, at worst, a kind of pretentious and aggressively depoliticizing fin-de-siècle despair. In US anthropology, where the term really took off, “postmodernism” seemed anything but depoliticizing. Exponents of the reflexive moment proposed to dissect and challenge the political implications of ethnographic practice on every level, not even ruling out the possibility of rejecting the entire enterprise of anthropology as irredeemably compromised by its history as handmaiden to colonialism.

The postmodern challenge transformed anthropology—most of all in teaching, where all introductory courses and histories of the discipline necessarily begin with a kind of ritual condemnation of anthropological theory and practice from the Victorian era through to at least the 1950s, and often well beyond. It came with all the trappings of radicalism. The very existence of the discipline was called into question. Yet the critique was never quite as radical as it seemed. First of all, one of its main practical effects was to blunt the political potential of anthropology—as the bearer of any kind of archive of social possibilities—by providing anyone outside the discipline, daunted by the kaleidoscopic multiplicity of possible arrangements of political, economic, or domestic life that it had documented, with a handy two- or three-line series of catchphrases allowing them to dismiss all forms of anthropological knowledge as inherently illegitimate. This was no doubt highly convenient for those who did not wish to consider themselves Eurocentric, but also did not wish to trouble themselves with learning much of anything about non-European perspectives on the world; but it had devastating effects on the ability of anthropologists to take part in a planetary conversation on human possibilities at precisely the moment, one might argue, that we were needed most.

Secondly, the critique of forms of power directed itself overwhelmingly at colonialism and its legacy, and much less—if at all—at economic structures of domination, corporate and financial power, bureaucracy, or structures of state coercion that were not directly related to it. Remember, the 1980s and 1990s were the period when regimes of IMF-imposed structural adjustment were being put in place across the Global South—something one would not be aware of at all from reading much of the “radical” anthropology being written at the time. But, one might object, surely retelling the history of the discipline as one of colonial entanglement has at least made anthropologists more cognizant of such dangers than the relatively complacent anthropologists of the 1960s? Actually, I think exactly the opposite has been the case. When outright colonial ventures were revived with the invasions and occupations of Afghanistan and Iraq in 2001 and 2003, and anthropologists were recruited to take part, the American Anthropological Association for most of a decade could not even bring itself to come up to the level of principle it had demonstrated in the 1960s and make a clear statement opposing anthropological collaboration with the military or CIA (Zehfuss 2012).[8]

As a result, I feel it’s fair to say that, judged even by its own terms, the postmodern moment proved an utter, spectacular political failure. It’s almost as if the ultimate effect of the ritual denunciation of anthropology as an intrinsically racist, colonialist enterprise was to convince its practitioners that it couldn’t possibly be anything else. In fact I’d go further. On a deeper level I think these acts of self-condemnation can be seen as a subtle kind of taking possession; after all, treating a body of accumulated knowledge as fundamentally tainted, as your dirty little secret, is still treating it as your dirty little secret. Combined with a rejection of “high theory” as somehow itself intrinsically imperialistic, it becomes the perfect gesture for a discipline closing in on itself, one in which all factions come to be interested primarily in grabbing hold of a small piece of intellectual territory (usually defined geographically) as the basis for developing increasingly administratively oriented professional careers.[9] At the same time, the reflexive impulse, taken in this context, can only become a profoundly bourgeois form of literary self-constitution, one at the very least continuous with the hyperprofessionalization of the discipline that began to take place at the time.

Here let me draw for a moment on some of the critical work on reflexivity, particularly from the feminist tradition exemplified by Beverley Skeggs (2002), who herself draws on Marilyn Strathern’s (1987, 1991) observations about reflexivity as a form of performance. I think Skeggs hits the nail pretty much on the head here. The confessional mode of moral self-narration has, as we all know, a very long Christian history, but in more recent centuries, she observes, the whole tradition of telling stories about oneself might be said to have bifurcated along class lines.

As social historians have long noted, people of working-class background rarely write autobiographies. Self-creation through literary technique is very much a game played by elites. This seems always to have been so. But over the last two hundred years, the working classes have increasingly found themselves subject to forms of “coerced self-narration,” in which they were obliged to tell stories of their own sins, suffering, criminality, redemption, and reform, all so as to establish themselves in the eyes of the administrative classes as members of the “deserving poor.” Elites get to tell stories about themselves that are ultimately both manifestations of, and reflections on, their own power; everyone else is forced to tell stories about their misery and perseverance. For an anthropologist, it’s hard to contemplate this history without immediately calling to mind the difference between (a) the kind of performance of reflexivity that accompanied the hyperprofessionalization of the academy in the 1980s and 1990s, and (b) the simultaneous emergence of subgenres devoted to the study of both popular “resistance” and “social suffering”—which began slightly later, but largely overlap in time. It’s an almost uncanny parallel. Joel Robbins (2013) has recently argued that the “suffering subject” has come to replace the savage as the primary object of anthropology—perhaps a tall claim, but one in which there is surely some truth—and makes the very cogent point (equally true, I think, of most of the resistance literature) that what’s specifically eclipsed in most such accounts is any sense of what those we are asked to empathize with feel is ultimately important or valuable in life.

Still, this paragraph rather struck me:

As readers of Biehl, Daniel, and other anthropologists of suffering, we come to realize the shared humanity that links us to others who suffer. We also realize how profoundly human beings can fail one another, and sometimes we gain insight into ways we might be complicit in this failure. It is clearly a hope of suffering slot anthropology that these lessons might become a motive for change.... This kind of anthropology surely has important work to do in addressing the great cultural problems of our age. (Robbins 2013: 456)

Robbins is clearly bending over backward to be generous here; this is not meant as a critique of his comments at all (I agree with most of them), but still, on reading it, I couldn’t help but ask, “Yes, but ‘important work’ for whom? Who’s actually reading these books? Who is the ‘we’ here?” Granted, there are some rare works in this genre—Nancy Scheper-Hughes’ Death without weeping (1992) is an example of a book that actually has been read by non-anthropologists—that have won some kind of broader readership, but for the most part, we are talking about a literature that is assigned to students as part of their professional training to enter mostly academic or administrative careers. (And this makes sense. After all, people who actually live lives marked by dramatic suffering and resistance rarely need to be reminded that suffering and resistance are real, or what they are like.) One can only ask: If this is “a motive for change,” what sort of change are we talking about, and whom are we expecting to bring it about?

On the other hand, self-creation through acts of writing—not to mention a proclivity for meditation on the minutiae of one’s own power and privilege—is typical of the kind of class milieu from within which professional-managerial elites emerge. And this is hardly less true if that reflection takes the typically American puritanical form, in which members of said elite compete with one another for moral superiority based on claims of greater cognizance of their own compromised nature. As Skeggs (2002) emphasizes, the real issue here is one of practice:[10] whether one is actually doing reflexivity, by constantly reexamining the power dynamics implicit in one’s research process as part of that process itself, in active engagement with and with a sense of accountability to those with whom we work, or simply being reflexive, which is perfectly congruent with the kind of performance of self required by the hyperprofessionalization of the discipline. And as Marilyn Strathern periodically reminds us (1991 etc.), there’s a real continuity here between this demand for constant individual self-examination, and the “audit culture” of constant collective self-assessment that began to be put in place in universities in the 1980s and 1990s as well.

I’d go further. I would argue that the reflexive moment operated above all as a point of transition between dominant forms of academic authority. At the risk of being slightly cartoonish, let me evoke a sketch of two different paradigms of academic authority. On the one hand, we have the patriarchal professor, a figure dominant for most of the twentieth century. A figure of absolute self-assurance, whether pedantic or playful, he is on a day-to-day level at least largely oblivious to the forms of privilege and exploitation that make his life possible, and as a result entirely at peace with himself owing to the existence of an institutional structure that guarantees him near-perfect life security. This is a caricature but, still, anyone who has spent much time in academia has encountered someone who fits the description, and there are still a handful, if rapidly decreasing in number, alive and in positions of authority even today. Nevertheless, such characters are no longer being produced. After all, this is precisely the figure whose privilege was so dramatically challenged in the campus turmoil of the 1960s and 1970s. In the neoliberal university, this challenge, combined with the dramatic marketization of academic life that began in the 1980s, has ultimately produced a very different sort of figure of authority. Let us imagine him too as a white male, since white males are, still, most likely to win the academic game—but one who, in the place of the self-assurance of the old patriarchal professor, combines a kind of constant nervous self-examination of his own privilege with a determination to nonetheless deploy all advantages—including that very privilege—in any way he can to prevail in an increasingly precarious academic environment; an environment demanding near-continual acts of reinvention and self-marketing. It’s easy to see how, in the specific case of anthropology, long a preferred refuge for the impractical and eccentric, the reflexive moment played a critical role in creating the soil from which such monstrous figures could emerge.[11]


Prefigurative anthropology

One could well argue that the emergence of this figure was inevitable given other changes taking place in the academy at the time: whether demographic changes in the origin of students, changes in funding, job opportunities outside the academy, the casualization of academic labor, or, for that matter, changes in the organizational structure of academic presses, which ensure that even if anything like the works of Boas, Malinowski, or Evans-Pritchard were written today, they would never find a publisher—except, perhaps, outside the academy.

I don’t want to dwell on any of this here. But I do think it’s important to at least point this out, because we tend to write as if theory is concocted in a kind of autonomous bubble. In fact, almost all the dominant theoretical trends within anthropology can only be understood in terms of the very context they themselves tend to efface. I have already given the example of what I’ve called vulgar Foucauldianism, which simultaneously elevated the subjective experience of professional-managerial work arrangements into a universal principle of human sociality, and denied the central importance of either capitalism or the threat of direct physical violence, at exactly the moment the threat of direct physical violence was becoming central to the operation of capitalism. But the same effacement can be observed even in those approaches that most loudly claimed to be doing the opposite. Proponents of actor-network theory, for instance, insist that they were “doing the work” of unearthing the connections that were simply presumed by theorists of “the social.” But in reality what ANT mainly does is translate politics—and not just politics, but academic politics—into the very constitutive principle of reality. Or consider the “ontological turn,” which makes it impossible to even talk about what philosophers used to call ontology, in the same way that an older relativist anthropology made it impossible to even talk about what philosophers used to call anthropology.

How does one break out? If the problem is that talking exclusively to certain types of people within certain institutional contexts will inevitably reproduce the sensibilities and habits of thought typical of such people and contexts, in abstract and hypostasized forms, the obvious thing is to talk to someone else in a different context—and not in the form of contained moments of fieldwork whose significance can only be appreciated elsewhere. My own approach has been to engage with social movements. This has often been misunderstood. I often find myself criticized as an “angry” man insisting that other anthropologists get out of their ivory towers and all become activists, and as failing to recognize the value of knowledge and understanding as ends in themselves. In fact, I believe exactly the opposite is the case. Granted, I do think it might behoove those academics who base much of their intellectual persona on identification with social movements to at least take some notice when students are having their teeth knocked out a few blocks away for participating in them, but, actually, the last thing I’d want would be for everyone with an academic job to declare themselves an activist.

Probably the most important thing I’ve learned from radical social movements, particularly those that have emerged from the engagement of anarchism and other antiauthoritarian traditions with radical feminism, is the notion of prefiguration. This is a very old idea—you already see it around 1900 in the Industrial Workers of the World’s call to “build a new society in the shell of the old”—but it has taken on a renewed power with the collapse of classical vanguardism: the widespread rejection of the idea of the stoic, humorless revolutionary whose purity can be judged by the degree to which they sacrifice all personal indulgences in the name of an absolute dedication to the cause, seen as a rational, calculated pursuit of power. There has been a general recognition that such a figure will never be able to produce a social order anyone would actually want to live in. Rather, prefigurative politics means making one’s means as far as possible identical with one’s ends, creating social relations and decision-making processes that at least approximate those that might exist in the kind of society we’d like to bring about. It is, as I’ve put it elsewhere, the defiant insistence on acting as if one is already free (Graeber 2004, 2013). Increasingly, this kind of defiant utopianism—and an attendant refusal to operate through those institutional structures dominated by professional-managerial elites and their proceduralist ethic—has become the ground principle of democratic social movements, whether in Tunisia or Egypt, Greece, Spain, Occupy Wall Street, the Idle No More movement in Canada, or more recent outbreaks in Turkey, Bosnia, or Brazil. In fact, it’s everywhere. This is important, because it marks a real transformation in the idea of what a democratic movement would even mean.

What would it mean to apply this prefigurative principle to academic practice? Obviously it would not mean subordinating our passion for knowledge and understanding to the imperatives of activist strategy. It would challenge the very idea that there is, ultimately, any division here. It’s significant that just about every student occupation during the movement of 2010 began with a declaration that education is not an economic good, but a value in itself. But neither is it just a political good. A prefigurative approach, it seems to me, would most of all mean abandoning the nervous defensiveness of the hyperprofessionalized academic entrepreneur, and admitting to ourselves that what drew us to this line of work was mainly a sense of fun; that playing with ideas is a form of pleasure in itself; and that the deal we are tacitly being offered in the process of professionalization—that we must make a ritual sacrifice of everything that most gave us joy about the prospect of undertaking an intellectual life in order to have a chance of achieving even a modicum of life security—is itself violent and unnecessary. In retrospect, it’s hard not to see something deeply appealing about the easy self-confidence of that old patriarchal professor—and this, I note, coming from someone of nonelite class background who never had any chance of becoming that person under any circumstances. After all, in the final analysis, the problem with entitlement and privilege is not that some people have them; it’s that other people don’t. As any anthropologist who has had direct experience of an even moderately egalitarian society can attest, these are not, generally speaking, societies where everyone behaves as we expect a worker or a peasant to behave, but ones where everyone acts like an aristocrat. Call this, if you like, the utopian moment in intellectual practice. Whatever one chooses to call it, it seems to me that any genuinely effective transformative practice would have to embrace that sense of confidence and pleasure, in a form that would lead to a world where it would be available to absolutely everyone.

[1] Michael Albert and Robin Hahnel (1979), in the same volume, suggest the alternate term “coordinator class,” and Albert has used the term in his subsequent work, but otherwise it’s the Ehrenreichs’ usage that really caught on. Since I will be focusing on universities, I was tempted to call them the “professional-administrative classes,” after professors and administrators, but decided it would probably be better not to invent still another usage. At any rate, my own adaptation of the term is idiosyncratic: I am not using it in precisely the way any of these authors uses it, but rather as a kind of stand-in term for a process of class composition marked first by a convergence of the sensibilities of the lower echelons of the corporate bureaucracies and the upper echelons of the professional classes that was only beginning to get fully under way around the time the book appeared. I will be describing the process, and its larger implications, in a forthcoming book on bureaucracy.

[2] This is of course the ideal of corporatism: that the interests of workers and management were ultimately the same, and converged around what Keynes (1936: 345) once called “the euthanasia of the rentier.” One shouldn’t romanticize it. It might have been the condition of possibility for the postwar welfare state, but its most extreme historical manifestation was fascism.

[3] See Ginsberg (2011) for an incisive, if ultimately conservative, take on the phenomenon; on professionalization in general, the best critique will always be Schmidt (2001).

[4] The talk on which this paper is based was delivered in January 2014; the events occurred in October 2013.

[5] Again, as I’ve argued at great length elsewhere: see Graeber (2013) for a relatively concise version of the argument.

[6] For a more elaborate version of this argument, see the Current Anthropology special issue “The new keywords” with articles by Leve, Gershon, Rockefeller, and myself: Current Anthropology 52 (4) (August 2011).

[7] Fred Pfeil (1990) was one of the first to make a case that postmodernism, including what I’ve called “vulgar Foucauldianism,” is in fact the class sensibility of the professional-managerial classes in this sense; but, in a move that a quarter-century later seems almost charmingly naive, he argued that this put that class in a position to launch a universalizing challenge against capitalist hegemony. We can now see how well that worked out.

[8] They finally did so in 2007, six years after the initial US invasion of Afghanistan. In contrast, the American Psychological Association banned members from working with the military or intelligence almost immediately.

[9] It is a matter of no little irony that “radicalism,” reframed as lowering one’s ambitions, thus often has the result of reducing anthropology to something much like area studies, which is historically exactly the sort of anthropology encouraged by governing institutions like the State Department or the intelligence establishment.

[10] All this might seem a bit of a caricature, and of course in many ways it is. But we must remember, too, that intellectual history tends to wash away the more ridiculous excesses of any historical period, since such excesses rarely see print. The actual writings of Marcus, Clifford, Tyler, and the rest were never nearly as crude as their common academic appropriations at the time. (To take a striking example, the notorious question “How can I know the Other?,” so endlessly bruited about in seminars and student lounge discussions, never to my knowledge appeared in print in any anthropological essay of the time at all.) But of course as ethnographers we know that such buzzwords and catchphrases could not be more important. As with vulgar Foucauldianism, they created an intellectual environment in which the hyperprofessionalization of the discipline could not only raise few hackles, but even seem a political advance over what had come before.

[11] Obviously it happened quite differently in other disciplines, though usually with similar ultimate effects.


Retrieved on 3rd September 2020 from https://www.journals.uchicago.edu/doi/10.14318/hau4.3.007