John Zerzan
Vagaries of Negation
Data on the Decomposition of Society
It wasn’t only radical intellectuals who found themselves unprepared for the end of the 60s. Change was simply no longer in the air, and it fell to this intelligentsia, in the 70s increasingly part of the universities they had once attacked, to explain “the 60s,” its swirling promise and its demise. Most of the professoriat who had come of age in the struggles before the “Me Decade” Ice Age found no new framework for understanding or reassessing their defeat circa 1970.
Herbert Marcuse’s One-Dimensional Man, which appeared just before the upheavals, provided a rather pessimistic picture of consumption-oriented citizens caught in the chains of “repressive tolerance.” With the movements of blacks and other minorities, hippies, anti-war students, and women, he rejoiced and became for a time more sanguine about the prospects for the future. But by the second half of the 70s he had become as grim as the rest of the radical intelligentsia; in his final book (1978) Marcuse embraced art as the last refuge of resistance.
Some realized the inadequacy of the last Frankfurt School theorist but offered nothing in his place to explain why events of the 60s had failed to deepen into more of a challenge to the dominant culture. However, Paul Piccone, editor of the quarterly journal Telos since it began in 1970, has tried to provide a fuller, if very depressing, schema to account for the failure of the 60s revolts and what he sees as a triumph of modern authority that pre-dates those revolts and rendered them abortive.
In 1977 and ’78 Piccone unveiled his “artificial negativity” thesis,[1] the most far-reaching and coherent model for understanding contemporary social reality since at least the 60s. Re-periodizing recent phases of capitalist development, it locates the decisive impact of consumerization in the early 60s as a watershed between incomplete and completed repression.[2] Judging Marcuse’s “one-dimensionality” approach obsolete, Piccone has offered a persuasive picture of a consumer-cultural hegemony grown so complete as to remove from its subjects a combative intelligence essential to now-extinct struggles. Internal opposition is necessary in order to equip the system with vital control mechanisms; with the too-victorious stamping out of the undomesticated, monopoly capitalism now must somehow relax its repressive force so as to help engender a renewed negative presence.
It seems very plausible that domination today needs just such “artificial negativity” for its future,[3] but where Piccone sees a docile, cretinized subject, produced as the over-success of integration, I see evidence of dis-integration, a subjectivity that, far from happy and conformist, cries out in anguish as it begins to withdraw from the reproduction of the social order.[4] The negative is in fact strongly present, if not in a form useful to power. Data and commentary on the social fabric of the 80s may suggest a clarification and re-interpretation of the Piccone thesis.
One might have expected the alleged arrival of standardized, homogenized consumer consciousness, with its “erosion of the last vestiges of individuality,”[5] to also mean the evaporation of psychic turmoil. Precisely the opposite is the case. Psychological immiserization is increasing on all fronts, fundamental testimony that the individual continues to register his incompatibility with the distortion and impoverishment of life as offered by late capitalism.
With the decline of the traditional two-parent family—which is occurring even faster in the 1980s than in the late 70s[6]—less emotional mediation is afforded against the onslaught of everyday life. Even the apparently successful are far from immune, as indicated in such articles as “Life of a Yuppie Takes a Psychic Toll” and “Madness Stalks the Corporate Ladder.”[7]
In fact, levels of emotional illness are growing, as reported by the National Institute of Mental Health and the supermarket tabloids alike,[8] as people find themselves unable to adjust to the triumphant culture. Newly prominent maladies, such as the Epstein-Barr virus, a kind of psychological devastation,[9] are complemented by new increases in others, like eating disorders.[10] A federal study released in 1984 found that one in five Americans had received some type of mental health treatment, compared to one in eight in 1960.[11] Not surprising is the fifteen percent jump in the NIMH research budget for 1987.
Suicide among the young has tripled in the past twenty-five years, following one hundred years of suicide stability going back to the mid-nineteenth-century data studied by Durkheim. Among fifteen- to nineteen-year-olds it is now the second leading cause of death and occasioned formation of a cabinet-level Task Force on Youth Suicide in 1985. Late in 1986 it was reported that after years of decline, suicide rates among the elderly are also rising.[12]
Stress, thought by some to be perhaps only a buzz-word of the late 1970s and early 80s, has never commanded so much attention. The literature is burgeoning as stress-wrought damage grows.[13] The Morbidity and Mortality Weekly Report released October 2, 1986 by the National Centers for Disease Control declared that mental stress caused by unsatisfactory working conditions has become America’s biggest occupational disease, six months after a news magazine had concluded that “the American workplace is being swamped with claims ranging from job burnout, or mental fatigue from tedium and stress, to chronic and severe anxiety, manic depression, nervous breakdown and schizophrenia.”[14] It has also been recently claimed, by Dr. Thomas Robertson, that the stress of getting up in the morning is the reason for the very high incidence of strokes and heart attacks occurring between eight and nine a.m.[15]
The unreality of our work-and-shop existence is also viscerally felt, it would seem, by the very young. A 1986 Cornell University Medical College study of randomly selected six- to twelve-year-olds in New York City found that 12% of them manifested suicidal tendencies, including overt manifestations,[16] while a 1985 offering discussed widespread child arson.[17]
In 1985 the American Medical Association revealed that “total outpatient drug exposure” increased 28% from 1971 to 1982.[18] This by way of background to 1986, the year of the cocaine epidemic and non-stop attention to the problem, with special attention to drugs at work and testing for drug use; several federal institutions came out for universal employee drug tests in March, 1986, for example.[19]
Turning more directly to work, it is clear that the “productivity crisis” is another hot topic of the 1970s that has proven its durability. If Marxist periodicals like Science and Society and Dollars and Sense denied its existence in the 70s, falling back in the 80s to assert that at least the mental state of workers is no factor in the productivity decline, those with a sincere spirit of inquiry into the matter of faltering output per hour worked have had to be more forthright about this crisis, which definitely has not gone away.
“Something important has happened to productivity. I don’t know what it is ... but it is very bad,”[20] judged E. Denison in the late 70s. Baumol and McLennan concluded, more recently, that “this country’s productivity growth performance in recent years is extremely disquieting.”[21] After lackluster growth in 1984, it fell to -0.2% for 1985[22] and has made a poor showing since.
Amid recent studies of a declining “work ethic,”[23] reactions range from outrage, blaming “irrationalities on the level of the individual,”[24] to sympathy, taking cognizance of the prevailing “national malaise and personal pain.”[25] And one of the most stunning aspects is that the productivity crisis has not been affected at all by massive recent outlays, organizational and technological. Wickham Skinner summed up the industrial situation thus: “American manufacturers’ near-heroic efforts to regain a competitive edge through productivity improvements have been disappointing.
Worse, the results of these efforts have been paradoxical. The harder these companies pursue productivity, the more elusive it becomes.”[26] Also in mid-1986 came the parallel shocking news that the hundreds of billions spent on computerizing the office have not raised white-collar productivity a whit.[27] At the same time, performance in the service sector is being questioned,[28] there is great resistance to the neo-Taylorist monitoring of work by computers,[29] and layoffs signal to some new declines in company loyalty, morale and productivity.[30]
Meanwhile, since its effective beginnings in the early 80s,[31] participative management “has spread at an extraordinary rate”[32] with the prospect of even greater growth in worker involvement, quality-of-worklife programs, and other forms of job democratization.[33] More and more it is becoming clear that “workers themselves must be the real source of discipline,”[34] that authority has no choice but to cede more initiative to those who are becoming more demonstrably averse to contemporary work. At the same time, there is already evidence that after initial temporary reprieves, power-sharing schemes are not improving productivity or job satisfaction.[35]
Two other significant work tendencies, in passing, are the increase in part-time employment,[36] and the refusal of the young, though often unemployed, to accept work or to last long at it.[37] More evidence of disinvestment in the dominant values.
Rousseau argued that republics could outdo monarchies by turning the spectators into the spectacle.[38] Today’s political spectacle is failing because people are shunning their appointed role. “Americans are no longer merely criticizing their political system,” asserted historian James MacGregor Burns in 1984, “they are deserting it.”[39] Turnout for the 1986 election fit, if exaggeratedly, the general tendency since 1960: it was the lowest since 1942 despite the most massive and costly voter registration drive ever mounted in a non-presidential year. Among those still participating in recent years, by the way, the trend has been toward an unaffiliated status, not a swing toward the right.[40]
The young Sartre averred that there was nothing he and his compatriots had been told that wasn’t a lie. Illiteracy in America is vast and increasing, prompting Jonathan Kozol to estimate that sixty million are “substantially excluded from the democratic process” by it.[41] There is a deep, visceral turn-off indicated here, deeper than that of non-voting, one which refuses and reverses one of civilization’s cardinal agencies and promises fundamental problems for a social order increasingly reliant on self-activation. The Army found that 10% of its conscripts were functionally illiterate in 1975; in the 1981 (volunteer-based) Army the figure was 31% and climbing.[42] At work, new computer-mediated environments require both literacy and initiative, as both qualities evaporate.[43] A related development is the rising high school dropout rate, with rates of forty and fifty percent now being reported from central city schools.[44]
Another basic connection with this culture also seems to be loosening: that of a sense of history, a perspectival interest in the past. Commentators of every stripe have bemoaned a great indifference emerging in this area,[45] the tendency to live exclusively in the present. Ultimately, however, is this “de-memorization” so threatening? Are the horrors of the present not a sufficient reference point on which to base the project of emancipation—in fact, are they not the only basis? As Baudrillard reminds us, “Each man is totally there at each moment. Society is also totally there at each moment.”[46] Adorno closed his Minima Moralia with the counsel to thought that it must reveal this “indigent and distorted” world as it will one day appear from the vantage point of liberated existence—and to achieve such a perspective “entirely from felt contact”[47] with the world’s aspects; this proviso seems to imply both the definitive weight of the present and the promise that the subject is capable of measuring that present against surviving instincts and sensibilities. This brief survey tries to suggest that the individual does survive and tries to turn away from official living, maintaining particularity and otherness in fundamental ways, in the face of the demands of complicity.
It has become commonplace to reject or ignore Habermas’ early 1970s hypothesis that “late-capitalist societies are endangered by a collapse of legitimation.”[48] But the farther we get from the 60s the more obvious it is that a full range of de-legitimizing potentialities has been growing since that time. What Robert Wuthnow characterized as an unprecedented “fundamental uncertainty about the institutions of capitalism”[49] does not even take into account the real depth of “uncertainty” present when emotional survival itself is at issue.
Probably no single datum could provide better ammunition for the “artificial negativity” view of a totally passive, cretinized populace than the more than seven hours of television consumed per capita daily. But can there be much dispute that most of those so irradiated are consciously narcotizing themselves? Drugs of all kinds are clearly necessary simply to get through the day, and an aura of irony has never been so strong regarding television. Further, one could point, as many did, to the Happy Days generation of young men as they faced the institution of pre-draft registration in the early 80s. With all those thousands of television programs behind them, could there be any doubt that they would all docilely register? Their massive non-compliance staggered virtually everyone.
Television commercials also deserve comment. Ten years ago, it was “Harley Davidson—the freedom machine!” and “Mustang II, Boredom Zero”; today—along with much more attention to pain and dyspepsia relief and alcohol and drug treatment centers—Mastercard invites us to “Master all the possibilities,” Merrill Lynch sings “To know no boundaries,” and eroticism becomes far more pervasive in the promotion of a great variety of commodities. Banks, life insurance companies and other conservative components begin to sound like the motorcycle, whiskey and fast car purveyors of the 70s. The widely noted collapse of the commitment to deferred gratification[50] is not without grave danger to the present society, as more and more is offered—in terms of what can only be seen as less and less. Consumerized society provides less a guarantee of power’s stability than a bill of reckoning that grows ever larger by its noticeable failure to satisfy.
Meanwhile, polls reflect the public belief that ability and hard work count for almost nothing in “getting ahead”; state lotteries and other forms of gambling emerge as the national pastime; virtually universal employee theft promotes the use of millions of lie-detector and psychological “integrity” tests—not to mention drug testing; new studies show the widespread use of unemployment benefits to subsidize leisure rather than work search; shoplifting and tax evasion figures set new highs each year, as do the U.S. prison population numbers; an avalanche of articles touts the desperate need for moral education; the Army, reduced to a New Age “Be all that you can be” appeal, contends with drug, AWOL, illiteracy problems, and a new investigation points to “Army-wide” pilfering of all types of equipment—this list and its documentation could be greatly extended; I’ll spare the reader.
What stands out is that “narcissistic” withdrawal on this scale means that values dangerous to the dominant order are corroding its very foundation. As Baudrillard put it, “Everywhere the masses are encouraged to speak, they are urged to live socially, politically, organizationally ... the only genuine problem today is the silence of the masses.”[51]
Modern domination is democratic; it must have participation if it is to have legitimacy, if it is, ultimately, to function at all. This is precisely what is being withdrawn, as the return on investing in domination registers on the organism as zero or less. This “passivity” is of no instrumental use to the world we must continue to endure; an artificial negativity may well be required. But this in no way means that a real negativity, growing more visible, does not exist. Nor, it must be added, is it inevitable that a totally alternative consciousness will emerge from the crucible of intensifying alienation.
[1] Paul Piccone, “The Changing Function of Critical Theory,” New German Critique 12 (Fall, 1977) and “The Crisis of One-Dimensionality,” Telos 35 (Spring, 1978).
[2] This may be seen as paralleling Jacques Camatte’s categories of the formal and real domination of capital, left rather indeterminate in The Wandering of Humanity (Detroit, 1973).
[3] Sun Oil, Bristol-Myers, and American Express recently commissioned an Oxford study on the future of American capitalism; predicated on the fact that the gap between the haves and the have-nots is widening—e.g. “Is the Middle Class Doomed?” New York Times Magazine, September 7, 1986 and “Is the Middle Class Shrinking?” Time, November 3, 1986—an explosion is predicted as personal anxiety converts to social and political tension over downward mobility: America in Perspective, Oxford Analytica (New York, 1986). There is a kind of crude analog here to the “artificial negativity” thesis, as American capitalism in its decline is seen as captive to outmoded ideologies and unable to connect with the realities of the coming crisis.
[4] Earlier contributions to what some have termed the “breakdown” thesis by the author: Breakdown: Data on the Decomposition of Society (Milwaukie, OR, 1976); “The Promise of the 80s,” Fifth Estate (June 1980); “The 80s So Far,” Fifth Estate (Fall 1983); “Present Day Banalities,” Fifth Estate (Winter-Spring 1986). Available in Elements of Refusal, Left Bank Books, (Seattle, 1987).
[5] Paul Piccone, “Narcissism after the Fall: What’s on the Bottom of the Pool?” (Symposium on Narcissism) Telos 44 (Summer 1980), p. 114.
[6] Two-parent families declined by 751,000 from 1980 to 1985, more than twice the decrease in any five-year period since 1970, according to the Census Bureau (figures released November 4, 1986).
[7] “Life of a Yuppie Takes a Psychic Toll,” U.S. News and World Report, April 29, 1985; Douglas La Bier, “Madness Stalks the Corporate Ladder,” Fortune, September 1, 1986.
[8] A survey of the Journal of the American Medical Association and Archives of General Psychiatry seems to indicate an upsurge of interest in depression in the literature, while the check-stand weeklies seem to feature stress, depression and loneliness in the mid-80s.
[9] “Malaise of the 80s,” Newsweek, October 27, 1986.
[10] Joel D. Killen, et al, “Self-Induced Vomiting and Laxative and Diuretic Use among Teenagers,” Journal of the AMA, March 21, 1986. This study of tenth-graders revealed a higher incidence of bulimia (binge-purge syndrome) than was previously thought—13% among the 1,728 under scrutiny.
[11] Michael Waldholz, “Use of Psychotherapy Surges, and Employers Blanch at the Costs: the Anxiety of Modern Life,” Wall Street Journal, October 20, 1986.
[12] CBS Evening News, November 12, 1986. Too recent for further documentation, but see “Suicide by the Elderly Up,” Jet, September 1, 1986.
[13] A tiny, representative sampling: Gary Evans, ed., Environmental Stress (New York, 1982); “Stress!” (cover story, complete with contorted, screaming face) Time, June 6, 1983; Diane McDermott, “Professional Burnout and Control,” Journal of Human Stress, Summer 1984; T.F. Riggar, Stress Burnout: An Annotated Bibliography (Carbondale, Illinois, 1985); Naomi Breslau and Glenn C. Davis, “Chronic Stress and Major Depression,” Archives of General Psychiatry, April 1986.
[14] Muriel Dobbin, “Is the Daily Grind Wearing You Down?” U.S. News and World Report, March 24, 1986. In Oregon, where I’m writing this article, 42% of all Workers’ Compensation claims filed by all employees in 1985 were based on “mental stress.” Alan K. Ota, “Claims for Stress Increasing,” The Oregonian, October 24, 1986.
[15] Associated Press report of paper presented by Dr. Thomas Robertson, annual meeting of the American College of Cardiology, March 11, 1986.
[16] Donald Ian Macdonald, “Can a 6-year-old Be Suicidal?” Journal of the AMA, April 18, 1986.
[17] Wayne S. Wooden, “Why Are Middle-Class Children Setting their Worlds on Fire?” Psychology Today, January 1985.
[18] Carlene Baum, et al, “Drug Use and Expenditures in 1982,” Journal of the AMA, January 18, 1985.
[19] For example: “Panel Proposes Drug Screening in Work Place,” Associated Press, March 3, 1986; “Drugs on the Job” (cover story), Time, March 16, 1986; Irving R. Kaufman, “The Battle Over Drug Testing,” New York Times Magazine, October 19, 1986; Michael Waldholz, “Drug Testing in the Workplace: Whose Rights Take Precedence?” Wall Street Journal, November 11, 1986.
[20] Quoted in Marion T. Bentley and Gary B. Hansen, “Productivity Improvement: The Search for a National Commitment,” Daniel J. Srokan, ed., Quality of Work Life (Reading, Massachusetts, 1983), p. 91.
[21] William J. Baumol and Kenneth McLennan, “U.S. Productivity Performance and Its Implications,” Baumol and McLennan, eds., Productivity Growth and U.S. Competitiveness (New York, 1985), p. 31.
[22] David T. Cook, “Why U.S. Workers Built Fewer Widgets per Hour Last Year,” Christian Science Monitor, February 2, 1986.
[23] For example, the Aspen Institute’s late 1983 Work and Human Values report.
[24] “On the Manageability of Large Human Systems,” editorial, Human Systems Management, Spring, 1985, p. 3.
[25] Perry Pascarella, The New Achievers: Creating a Modern Work Ethic (New York, 1984), p. x.
[26] Wickham Skinner, “The Productivity Paradox,” Harvard Business Review, July-August, 1986, p. 55.
[27] William Bowen, “The Puny Payoff from Office Computers,” Fortune, May 26, 1986.
[28] Jeffrey A. Trachtenberg, “Shake, Rattle, and Clonk,” Forbes, July 14, 1986.
[29] See William A. Serrin, “Computers Divide A.T. & T. and Its Workers,” New York Times, November 18, 1983; Beth Brophy, “New Technology, High Anxiety,” which discusses “guerilla warfare in the ranks,” U.S. News and World Report.
[30] “The End of Corporate Loyalty” (cover story), Business Week, August 4, 1986.
[31] See John Zerzan, “Anti-Work and the Struggle for Control,” Telos 50 (Winter 1981–82).
[32] Henry P. Sims and James W. Dean, Jr., “Beyond Quality Circles: Self- Managing Teams,” Personnel, January, 1985, p. 25. Also Peter R. Richardson, “Courting Greater Employee Involvement through Participative Management,” Sloan Management Review, Winter 1985.
[33] Irving H. Siegel and Edgar Weinberg, Labor-Management Cooperation: the American Experience (Kalamazoo, 1982). “Such collaborative activity will continue to expand and flourish ...” p. vii; Susan Albers Mohrman and Gerald E. Ledford, Jr., “The Design and Use of Effective Employee Participation Groups: Implications for Human Resource Management,” Human Resource Management, Winter, 1985.
[34] David N. Campbell, et al, “Discipline without Punishment—At Last,” Harvard Business Review, July/August 1985, p. 162.
[35] Sar A. Levitan and Diane Werneke, “Worker Participation and Productivity Change,” Monthly Labor Review, September 1984; Anat Rafaeli, “Quality Circles and Employee Attitudes,” Personnel Psychology, Autumn 1985. Also Robert Howard, Brave New Workplace (New York, 1986).
[36] Thomas J. Nardone, “Part-time Workers: Who Are They?” Monthly Labor Review, February 1986; “Measuring the Rise in Part-time Employment,” Business Week, August 18, 1986.
[37] Sylvia Nasar, “Jobs Go Begging at the Bottom,” Fortune, March 17, 1986; Albert Rees, “An Essay on Youth Joblessness,” Journal of Economic Literature, June 1986; Harry Bacas, “Where Are the Teenagers?” Nation’s Business, August 1986.
[38] J.J. Rousseau, Lettre à M. d’Alembert sur les spectacles (Geneva, 1948), p. 168.
[39] James MacGregor Burns, The Power to Lead (New York, 1984), p. 11. “People are staying home as a conscious act of withdrawal” (also p. 11).
[40] John A. Fleishman, “Trends in Self-Identified Ideology from 1972 to 1982: No Support for the Salience Hypothesis,” American Journal of Political Science, Vol. 30, No. 3 (August 1986); Thomas Ferguson and Joel Rogers, “The Myth of America’s Turn to the Right,” The Atlantic Monthly, 1986.
[41] Jonathan Kozol, Illiterate America (Garden City, N.Y., 1985), p. 23. Also, Ezra Bowen, “Losing the War of Letters,” Time, May 5, 1986, and “The Age of the Illiterate,” The Economist, September 27, 1986.
[42] David Harmon, “Functional Illiteracy: Keeping Up in America,” Current, September 1986, p. 8.
[43] Shoshana Zuboff, “Automate/Informate: the Two Faces of Intelligent Technology,” Organizational Dynamics, Autumn 1985; Amal Kumar Naj, “The Human Factor,” Wall Street Journal, November 10, 1986; Irwin Ross, “Corporations Take Aim at Illiteracy,” Fortune, September 29, 1986.
[44] Gary G. Wehlage and Robert A. Rutter, “Dropping Out: How Much Do Schools Contribute to the Problem?” Teachers College Record (special issue on school dropouts), Spring 1986; Robert Marquand, “High Dropout Rate Contradicts Official Report of School Progress,” Christian Science Monitor, February 28, 1986.
[45] William Bennett, “Lost Generation: Why America’s Children Are Strangers in Their Own Land,” Policy Review, Summer 1985; Diane Ravitch, “Decline and Fall of Teaching History,” New York Times Magazine, November 17, 1985; Christian Lenhardt, “Anamnestic Solidarity,” Telos 25 (Fall 1975).
[46] Jean Baudrillard, The Mirror of Production (St. Louis, 1975), p. 166.
[47] Theodor W. Adorno, Minima Moralia (New York, 1974), p. 247.
[48] For example, Jürgen Habermas, “What Does a Crisis Mean Today? Legitimation Problems in Late Capitalism,” Social Research, Winter 1973.
[49] Robert Wuthnow, “Moral Crisis in American Capitalism,” Harvard Business Review, March-April 1982, p. 77.
[50] Michael Rose, Reworking the Work Ethic (London, 1985), p. 104.
[51] Jean Baudrillard, In the Shadow of the Silent Majority ... or the End of the Social and Other Essays (New York, 1983), p. 23. However, Baudrillard explicitly eschews any negative, liberatory potential for the “mass,” which he sees as voracious, irrational, and dumb, simply a black hole which may swallow the system but not thereby provide deliverance. True to post-structural obeisance to an eternal, frozen reality, for Baudrillard the individual is extinct and negativity a meaningless term.