Chapter 3

Why Do Those in Bullshit Jobs Regularly Report Themselves Unhappy? (On Spiritual Violence, Part 1)


Workplaces are fascist. They’re cults designed to eat your life; bosses hoard your minutes jealously like dragons hoard gold.

—Nouri

In this chapter, I’d like to start exploring some of the moral and psychological effects of being trapped inside a bullshit job.

In particular, I want to ask the obvious question: Why is this even a problem? Or to phrase it more precisely: Why does having a pointless job so regularly cause people to be miserable? On the face of it, it’s not obvious that it should. After all, we’re talking about people who are effectively being paid—often very good money—to do nothing. One might imagine that those being paid to do nothing would consider themselves fortunate, especially when they are more or less left to themselves. But while every now and then I did hear testimonies from those who said they couldn’t believe their luck in landing such a position, the remarkable thing is how very few of them there were.[66] Many, in fact, seemed perplexed by their own reaction, unable to understand why their situation left them feeling so worthless or depressed. Indeed, the fact that there was no clear explanation for their feelings—no story they could tell themselves about the nature of their situation and what was wrong about it—often contributed to their misery. At least a galley slave knows that he’s oppressed. An office worker forced to sit for seven and a half hours a day pretending to type into a screen for $18 an hour, or a junior member of a consultancy team forced to give the exact same seminar on innovation and creativity week in and week out for $50,000 a year, is just confused.

In an earlier book about debt, I wrote about the phenomenon of “moral confusion.” I took as my example the fact that throughout human history, most people seem to have agreed both that paying back one’s debts was the essence of morality and that moneylenders were evil. While the rise of bullshit jobs is a comparatively recent phenomenon, I think it creates a similar moral embarrassment. On the one hand, everyone is encouraged to assume that human beings will always tend to seek their best advantage, that is, to find themselves a situation where they can get the most benefit for the least expenditure of time and effort, and for the most part, we do assume this—especially if we are talking about such matters in the abstract. (“We can’t just give poor people handouts! Then they won’t have any incentive to look for work!”) On the other hand, our own experience, and those of the people we are closest to, tends to contradict these assumptions at many points. People almost never act and react to situations in quite the way our theories of human nature would predict. The only reasonable conclusion is that, at least in certain key essentials, these theories about human nature are wrong.

In this chapter, I don’t just want to ask why people are so unhappy doing what seems to them meaningless make-work, but to think more deeply about what that unhappiness can tell us about what people are and what they are basically about.

about one young man apparently handed a sinecure who nonetheless found himself unable to handle the situation

I will begin with a story. The following is the tale of a young man named Eric, whose first experience of the world of work was of a job that proved absolutely, even comically, pointless.

Eric: I’ve had many, many awful jobs, but the one that was undoubtedly pure, liquid bullshit was my first “professional job” postgraduation, a dozen years ago. I was the first in my family to attend university, and due to a profound naïveté about the purpose of higher education, I somehow expected that it would open up vistas of hitherto-unforeseen opportunity.

Instead, it offered graduate training schemes at PricewaterhouseCoopers, KPMG, etc. I preferred to sit on the dole for six months using my graduate library privileges to read French and Russian novels before the dole forced me to attend an interview which, sadly, led to a job.

That job involved working for a large design firm as its “Interface Administrator.” The Interface was a content management system—an intranet with a graphical user interface, basically—designed to enable this company’s work to be shared across its seven offices around the UK.

Eric soon discovered that he was hired only because of a communication problem in the organization. In other words, he was a duct taper: the entire computer system was necessary only because the partners were unable to pick up the phone and coordinate with one another:

Eric: The firm was a partnership, with each office managed by one partner. All of them seem to have attended one of three private schools and the same design school (the Royal College of Art). Being unbelievably competitive fortysomething public schoolboys, they often tried to outcompete one another to win bids, and on more than one occasion, two different offices had found themselves arriving at the same client’s office to pitch work and having to hastily combine their bids in the parking lot of some dismal business park. The Interface was designed to make the company supercollaborative, across all of its offices, to ensure that this (and other myriad fuckups) didn’t happen again, and my job was to help develop it, run it, and sell it to the staff.

The problem was, it soon became apparent that Eric wasn’t even really a duct taper. He was a box ticker: one partner had insisted on the project, and, rather than argue with him, the others pretended to agree. Then they did everything in their power to make sure it didn’t work.

Eric: I should have realized that this was one partner’s idea that no one else actually wanted to implement. Why else would they be paying a twenty-one-year-old history graduate with no IT experience to do this? They’d bought the cheapest software they could find, from a bunch of absolute crooks, so it was buggy, prone to crashing, and looked like a Windows 3.1 screen saver. The entire workforce was paranoid that it was designed to monitor their productivity, record their keystrokes, or flag that they were torrenting porn on the company internet, and so they wanted nothing to do with it. As I had absolutely no background in coding or software development, there was very little I could do to improve the thing, so I was basically tasked with selling and managing a badly functioning, unwanted turd. After a few months, I realized that there was very little for me to do at all most days, aside from answer a few queries from confused designers wanting to know how to upload a file, or search for someone’s email on the address book.

The utter pointlessness of his situation soon led to subtle—and then, increasingly unsubtle—acts of rebellion:

Eric: I started arriving late and leaving early. I extended the company policy of “a pint on Friday lunchtime” into “pints every lunchtime.” I read novels at my desk. I went out for lunchtime walks that lasted three hours. I almost perfected my French reading ability, sitting with my shoes off with a copy of Le Monde and a Petit Robert. I tried to quit, and my boss offered me a £2,600 raise, which I reluctantly accepted. They needed me precisely because I didn’t have the skills to implement something that they didn’t want to implement, and they were willing to pay to keep me. (Perhaps one could paraphrase Marx’s Economic and Philosophical Manuscripts of 1844 here: to forestall their fears of alienation from their own labor, they had to sacrifice me up to a greater alienation from potential human growth.)

As time went on, Eric became more and more flagrant in his defiance, hoping he could find something he could do that might actually cause him to be fired. He started showing up to work drunk and taking paid “business trips” for nonexistent meetings:

Eric: A colleague from the Edinburgh office, to whom I had poured out my woes when drunk at the annual general meeting, started to arrange phony meetings with me, once on a golf course near Gleneagles, me hacking at the turf in borrowed golf shoes two sizes too large. After getting away with that, I started arranging fictional meetings with people in the London office. The firm would put me up in a nicotine-coated room in the St. Athans in Bloomsbury, and I would meet old London friends for some good old-fashioned all-day drinking in Soho pubs, which often turned into all-night drinking in Shoreditch. More than once, I returned to my office the following Monday in last Wednesday’s work shirt. I’d long since stopped shaving, and by this point, my hair looked like it was robbed from a Zeppelin roadie. I tried on two more occasions to quit, but both times my boss offered me more cash. By the end, I was being paid a stupid sum for a job that, at most, involved me answering the phone twice a day. I eventually broke down on the platform of Bristol Temple Meads train station one late summer’s afternoon. I’d always fancied seeing Bristol, and so I decided to “visit” the Bristol office to look at “user take-up.” I actually spent three days taking MDMA at an anarcho-syndicalist house party in St. Pauls, and the dissociative comedown made me realize how profoundly upsetting it was to live in a state of utter purposelessness.

After heroic efforts, Eric did finally manage to get himself replaced:

Eric: Eventually, responding to pressure, my boss hired a junior fresh out of a computer science degree to see if some improvements could be made to our graphical user interface. On this kid’s first day at work, I wrote him a list of what needed to be done—and then immediately wrote my resignation letter, which I posted under my boss’s door when he took his next vacation, surrendering my last paycheck over the telephone in lieu of the statutory notice period. I flew that same week to Morocco to do very little in the coastal town of Essaouira. When I came back, I spent the next six months living in a squat, growing my own vegetables on three acres of land. I read your Strike! piece when it first came out. It might have been a revelation for some that capitalism creates unnecessary jobs in order for the wheels to merely keep on turning, but it wasn’t to me.

The remarkable thing about this story is that many would consider Eric’s a dream job. He was being paid good money to do nothing. He was also almost completely unsupervised. He was given respect and every opportunity to game the system. Yet despite all that, it gradually destroyed him.

Why?

To a large degree, I think, this is really a story about social class. Eric was a young man from a working-class background—a child of factory workers, no less—fresh out of college and full of expectations, suddenly confronted with a jolting introduction to the “real world.” Reality, in this instance, consisted of the fact that while middle-aged executives (a) can be counted on to simply assume that any twentysomething white male will be at least something of a computer whiz (even if, as in this case, he had no computer training of any kind), and (b) might even grant someone like Eric a cushy situation if it suited their momentary purposes, (c) they basically saw him as something of a joke. Which his job almost literally was. His presence in the company was very close to a practical joke some designers were playing on one another.

Even more, what drove Eric crazy was the fact there was simply no way he could construe his job as serving any sort of purpose. He couldn’t even tell himself he was doing it to feed his family; he didn’t have one yet. Coming from a background where most people took pride in making, maintaining, and fixing things, or anyway felt that was the sort of thing people should take pride in, he had assumed that going to university and moving into the professional world would mean doing the same sorts of thing on a grander, even more meaningful, scale. Instead, he ended up getting hired precisely for what he wasn’t able to do. He tried to just resign. They kept offering him more money. He tried to get himself fired. They wouldn’t fire him. He tried to rub their faces in it, to make himself a parody of what they seemed to think he was. It didn’t make the slightest bit of difference.

To get a sense of what was really happening here, let us imagine a second history major—we can refer to him as anti-Eric—a young man of a professional background but placed in exactly the same situation. How might anti-Eric have behaved differently? Well, likely as not, he would have played along with the charade. Instead of using phony business trips to practice forms of self-annihilation, anti-Eric would have used them to accumulate social capital, connections that would eventually allow him to move on to better things. He would have treated the job as a stepping-stone, and this very project of professional advancement would have given him a sense of purpose. But such attitudes and dispositions don’t come naturally. Children from professional backgrounds are taught to think like that from an early age. Eric, who had not been trained to act and think this way, couldn’t bring himself to do it. As a result, he ended up, for a time, at least, in a squat growing tomatoes.[67]

concerning the experience of falseness and purposelessness at the core of bullshit jobs, and the importance now felt of conveying the experience of falseness and purposelessness to youth

In a deeper way, Eric’s story brings together almost everything that those with bullshit jobs say is distressing about their situation. It’s not just the purposelessness—though certainly, it’s that. It’s also the falseness. I’ve already mentioned the indignation telemarketers feel when they are forced to try to trick or pressure people into doing something they think is against their best interests. This is a complicated feeling. We don’t even really have a name for it. When we think of scams, after all, we think of grifters, confidence artists; they are easy to see as romantic figures, rebels living by their wits, as well as admirable because they have achieved a certain form of mastery. This is why they make acceptable heroes in Hollywood movies. A confidence artist could easily take delight in what she’s doing. But being forced to scam someone is altogether different. In such circumstances, it’s hard not to feel you’re ultimately in the same situation as the person you’re scamming: you’re both being pressured and manipulated by your employer, only in your case, with the added indignity that you’re also betraying the trust of someone whose side you should be on.

One might imagine the feelings sparked by most bullshit jobs would be very different. After all, if the employee is scamming anyone, it’s his employer, and he’s doing it with his employer’s full consent. But somehow, this is precisely what many report to be so disturbing about the situation. You don’t even have the satisfaction of knowing you’re putting something over on someone. You’re not even living your own lie. Most of the time, you’re not even quite living somebody else’s lie, either. Your job is more like a boss’s unzippered fly that everyone can see but also knows better than to mention.

If anything, this appears to compound the sense of purposelessness.

Perhaps anti-Eric would, indeed, have found a way to turn around that purposelessness and seen himself as in on the joke; perhaps if he were a real go-getter, he’d have used his administrative skills to effectively take over the office; but even children of the rich and powerful often find this difficult to pull off. The following testimony gives a sense of the moral confusion they can often feel:

Rufus: I got the job because my dad was a Vice President at the company. I was charged with handling complaints. Given that it was (in name) a biomedical company, all returned product was considered a biohazard. So I was able to spend a lot of time in a room all by myself, with no supervision and essentially no work to do. The bulk of my memory of the job involves either playing Minesweeper or listening to podcasts.

I did spend hours poring over spreadsheets, tracking changes on Word documents, etc., but I guarantee you that I contributed nothing to this company. I spent every minute at the office wearing headphones. I paid only the smallest attention possible to the people around me and the “work” I was assigned.

I hated every minute working there. In fact, more days than not, I went home early from work, took two- or three-hour lunch breaks, spent hours “in the bathroom” (wandering around), and nobody ever said a word. I was compensated for every minute.

Thinking back on it, it was kind of a dream job.

Retrospectively, Rufus understands that he got a ridiculously sweet deal—he seems rather baffled, actually, why he hated the job so much at the time. But surely he couldn’t have been entirely unaware of how his coworkers must have seen him: boss’s kid getting paid to goof off; feels he’s too good to talk to them; supervisors clearly informed “hands off.” It could hardly have evoked warm feelings.

Still, this story raises another question: If Rufus’s father didn’t actually expect his son to do the job, why did he insist he take it in the first place? He could presumably just as easily have given his son an allowance, or, alternatively, assigned him a job that needed doing, coached him on his duties, and taken some minimal effort to make sure those tasks were actually carried out. Instead, he seems to have felt it was more important for Rufus to be able to say he had a job than to actually acquire work experience.[68]

That’s puzzling. It’s all the more puzzling because the father’s attitude appears to be extremely common. It wasn’t always so. There was once a time when most students in college whose parents could afford it, or who qualified for scholarships or assistance, received a stipend. It was considered a good thing that there might be a few years in a young man’s or woman’s life where money was not the primary motivation; where he or she could thus be free to pursue other forms of value: say, philosophy, poetry, athletics, sexual experimentation, altered states of consciousness, politics, or the history of Western art. Nowadays it is considered important they should work. However, it is not considered important they should work at anything useful. In fact, like Rufus they’re barely expected to work at all, just to show up and pretend to do so. A number of students wrote just to complain to me about this phenomenon. Here Patrick reflects on his job as a casual retail assistant in a student union convenience store:

Patrick: I didn’t actually need the job (I was getting by financially without it), but after some pressure from my family, I applied for it out of some warped sense of obligation to get experience in work to prepare me for whatever lay ahead beyond university. In reality, the job just took away time and energy from other activities I had been doing, like campaigning and activism, or reading for pleasure, which I think made me resent it even more.

The job was pretty standard for a student union convenience store and involved serving people on the till (could have easily been done by a machine) with the explicitly stated requirement, in my performance review after my trial period, that I “should be more positive and happy when serving customers.” So not only did they want me to do work that could have been performed by a machine just as effectively, they wanted me to pretend that I was enjoying that state of affairs.

It was just about bearable if my shift was during lunchtime, when it got really busy, so time went by relatively quickly. Being on shift on a Sunday afternoon when nobody frequented the SU was just appalling. They had this thing about us not being able to just do nothing, even if the shop was empty. So we couldn’t just sit at the till and read a magazine. Instead, the manager made up utterly meaningless work for us to do, like going round the whole shop and checking that things were in date (even though we knew for a fact they were because of the turnover rate) or rearranging products on shelves in even more pristine order than they already were.

The very, very worst thing about the job was that it gave you so much time to think, because the work was so lacking in any intellectual demand. So I just thought so much about how bullshit my job was, how it could be done by a machine, how much I couldn’t wait for full communism, and just endlessly theorized the alternatives to a system where millions of human beings have to do that kind of work for their whole lives in order to survive. I couldn’t stop thinking about how miserable it made me.

This is what happens, of course, when you first open the entire world of social and political possibility to a young mind by sending it to college and then tell it to stop thinking and tidy up already tidy shelves. Parents now feel it is important that young minds should have this experience. But what, precisely, was Patrick supposed to be learning through this exercise?

Here’s another example:

Brendan: I’m at a small college in Massachusetts training to be a high school history teacher. Recently I started work at the dining commons.

A coworker told me on my first day: “Half of this job is making things look clean, and the other half is looking busy.”

For the first couple of months, they had me “monitor” the back room. I would clean the buffet slider, restock the desserts, and wipe down tables when people left. It’s not a big room, so usually I could do all my tasks in five minutes out of every thirty. I ended up being able to get a lot of reading for my coursework done.

However, sometimes one of the less understanding supervisors would be working. In that case, I would have to keep the corner of my eye open at all times in order to make sure they would always see me acting busy. I have no idea why the job description couldn’t just acknowledge that I wouldn’t have much to do—if I didn’t have to spend so much time and energy looking busy, I could get my reading and the table cleaning done quicker and more efficiently.

But of course, efficiency is not the point. In fact, if we are simply talking about teaching students about efficient work habits, the best thing would be to leave them to their studies. Schoolwork is, after all, real work in every sense except that you don’t get paid for it (though if you’re receiving a scholarship or an allowance, you actually are getting paid for it). In fact, like almost all the other activities Patrick or Brendan might have been engaged in had they not been obliged to take on “real world” jobs, their classwork is actually more real than the largely make-work projects they ended up being forced to do. Schoolwork has real content. One must attend classes, do the readings, write exercises or papers, and be judged on the results. But in practical terms, this appears to be exactly what makes schoolwork appear inadequate to those authorities—parents, teachers, governments, administrators—who have all come to feel that they must also teach students about the real world. It’s too results-oriented. You can study any way you want to, so long as you pass the test. A successful student has to learn self-discipline, but this is not the same as learning how to operate under orders. Of course, the same is true of most of the other projects and activities students might otherwise be engaged in: whether rehearsing for plays, playing in a band, political activism, or baking cookies or growing pot to sell to fellow students. All of which might be appropriate training for a society of self-employed adults, or even one made up primarily of the largely autonomous professionals (doctors, lawyers, architects, and so forth) that universities were once designed to produce. It might even be appropriate to train young people for the democratically organized collectives that were the subject of Patrick’s reveries about full communism. But as Brendan points out, it is very much not preparation for work in today’s increasingly bullshitized workplace:

Brendan: A lot of these student work jobs have us doing some sort of bullshit task like scanning IDs, or monitoring empty rooms, or cleaning already-clean tables. Everyone is cool with it, because we get money while we study, but otherwise there’s absolutely no reason not to just give students the money and automate or eliminate the work.

I’m not altogether familiar with how the whole thing works, but a lot of this work is funded by the Feds and tied to our student loans. It’s part of a whole federal system designed to assign students a lot of debt—thereby promising to coerce them into labor in the future, as student debts are so hard to get rid of—accompanied by a bullshit education program designed to train and prepare us for our future bullshit jobs.

Brendan has a point, and I’ll be returning to his analysis in a later chapter. Here, though, I want to focus on what students forced into these make-work jobs actually learn from them—lessons that they do not learn from more traditional student occupations and pursuits such as studying for tests, planning parties, and so on. Even judging by Brendan’s and Patrick’s accounts (and I could easily reference many others), I think we can conclude that from these jobs, students learn at least five things:

  1. how to operate under others’ direct supervision;

  2. how to pretend to work even when nothing needs to be done;

  3. that one is not paid money to do things, however useful or important, that one actually enjoys;

  4. that one is paid money to do things that are in no way useful or important and that one does not enjoy; and

  5. that at least in jobs requiring interaction with the public, even when one is being paid to carry out tasks one does not enjoy, one also has to pretend to be enjoying it.

This is what Brendan meant when he said that make-work student employment was a way of “training and preparing” students for their future bullshit jobs. He was studying to be a high school history teacher—a meaningful job, certainly, but, as with almost all teaching positions in the United States, one where the proportion of hours spent teaching in class or preparing lessons has declined, while the total number of hours dedicated to administrative tasks has increased dramatically. This is what Brendan is suggesting: that it’s no coincidence that the more jobs requiring college degrees become suffused in bullshit, the more pressure is put on college students to learn about the real world by dedicating less of their time to self-organized goal-directed activity and more of it to tasks that will prepare them for the more mindless aspects of their future careers.

why many of our fundamental assumptions on human motivation appear to be incorrect

I do not think there is any thrill that can go through the human heart like that felt by the inventor as he sees some creation of the brain unfolding to success… such emotions make a man forget food, sleep, friends, love, everything.

—Nikola Tesla

If the argument of the previous section is correct, one could perhaps conclude that Eric’s problem was just that he hadn’t been sufficiently prepared for the pointlessness of the modern workplace. He had passed through the old education system—some traces of it are left—designed to prepare students to actually do things. This led to false expectations and an initial shock of disillusionment that he could not overcome.

Perhaps. But I don’t think that’s the full story. There is something much deeper going on here. Eric might have been unusually ill-prepared to endure the meaninglessness of his first job, but just about everyone does see such meaninglessness as something to be endured—despite the fact that we are all trained, in one way or another, to assume that human beings should be perfectly delighted to find themselves in his situation of being paid good money not to work.

Let us return to our initial problem. We may begin by asking why we assume that someone being paid to do nothing should consider himself fortunate. What is the basis of that theory of human nature from which this follows? The obvious place to look is at economic theory, which has turned this kind of thought into a science. According to classical economic theory, homo oeconomicus, or “economic man”—that is, the model human being that lies behind every prediction made by the discipline—is assumed to be motivated above all by a calculus of costs and benefits. All the mathematical equations by which economists bedazzle their clients, or the public, are founded on one simple assumption: that everyone, left to his own devices, will choose the course of action that provides the most of what he wants for the least expenditure of resources and effort. It is the simplicity of the formula that makes the equations possible: if one were to admit that humans have complicated motivations, there would be too many factors to take into account, it would be impossible to properly weight them, and predictions could not be made. Therefore, an economist will say that while of course everyone is aware that human beings are not really selfish, calculating machines, assuming that they are makes it possible to explain a very large proportion of what humans do, and this proportion—and only this—is the subject matter of economic science.

This is a reasonable statement as far as it goes. The problem is that there are many domains of human life where the assumption clearly doesn’t hold—and some of them are precisely in the domain of what we like to call the economy. If “minimax” (minimize cost, maximize benefit) assumptions were correct, people like Eric would be delighted with their situation. He was receiving a lot of money for virtually zero expenditure of resources and energy—basically bus fare, plus the amount of calories it took to walk around the office and answer a couple of calls. Yet he was miserable. All the other factors (class, expectations, personality, and so on) don’t determine whether someone in that situation will be unhappy, since it would appear that just about anyone in that situation would be; they only really affect how unhappy that person will be.

Much of our public discourse about work starts from the assumption that the economists’ model is correct. People have to be compelled to work; if the poor are to be given relief so they don’t actually starve, it has to be delivered in the most humiliating and onerous ways possible, because otherwise they would become dependent and have no incentive to find proper jobs.[69] The underlying assumption is that if humans are offered the option to be parasites, of course they’ll take it.

In fact, almost every bit of available evidence indicates that this is not the case. Human beings certainly tend to rankle over what they consider excessive or degrading work; few may be inclined to work at the pace or intensity that “scientific managers” have, since the 1920s, decided they should; people also have a particular aversion to being humiliated. But leave them to their own devices, and they almost invariably rankle even more at the prospect of having nothing useful to do.

There is endless empirical evidence to back this up. To choose a couple of particularly colorful examples: working-class people who win the lottery and find themselves multimillionaires rarely quit their jobs (and if they do, usually they soon say they regret it).[70] Even in those prisons where inmates are provided free food and shelter and are not actually required to work, denying them the right to press shirts in the prison laundry, clean latrines in the prison gym, or package computers for Microsoft in the prison workshop is used as a form of punishment—and this is true even where the work doesn’t pay or where prisoners have access to other income.[71] Here we are dealing with people who can be assumed to be among the least altruistic society has produced, yet they find sitting around all day watching television a far worse fate than even the harshest and least rewarding forms of labor.

The redeeming aspect of prison work, as Dostoyevsky noted, is that at least it is seen to be useful—even if it is not useful to the prisoner himself.

Actually, one of the few positive side effects of a prison system is that, simply by providing us with information about what happens, and how humans behave, under extreme situations of deprivation, it allows us to learn basic truths about what it means to be human. To take another example: we now know that placing prisoners in solitary confinement for more than six months at a stretch inevitably results in physically observable forms of brain damage. Human beings are not just social animals; they are so intrinsically social that if they are cut off from relations with other humans, they begin to decay physically.

I suspect the work experiment can be seen in similar terms. Humans may or may not be cut out for regular nine-to-five labor discipline—it seems to me that there is considerable evidence that they aren’t—but even hardened criminals generally find the prospect of just sitting around doing nothing even worse.

Why should this be the case? And just how deeply rooted are such dispositions in human psychology? There is reason to believe the answer is: very deep indeed.


As early as 1901, the German psychologist Karl Groos discovered that infants express extraordinary happiness when they first figure out they can cause predictable effects in the world, pretty much regardless of what that effect is or whether it could be construed as having any benefit to them. Let’s say they discover that they can move a pencil by randomly moving their arms. Then they realize they can achieve the same effect by moving in the same pattern again. Expressions of utter joy ensue. Groos coined the phrase “the pleasure at being the cause,” suggesting that it is the basis for play, which he saw as the exercise of powers simply for the sake of exercising them.

This discovery has powerful implications for understanding human motivation more generally. Before Groos, most Western political philosophers—and after them, economists and social scientists—had been inclined either to assume that humans seek power simply because of an inherent desire for conquest and domination, or else for a purely practical desire to guarantee access to the sources of physical gratification, safety, or reproductive success. Groos’s findings—which have since been confirmed by a century of experimental evidence—suggested maybe there was something much simpler behind what Nietzsche called the “will to power.” Children come to understand that they exist, that they are discrete entities separate from the world around them, largely by coming to understand that “they” are the thing which just caused something to happen—the proof of which is the fact that they can make it happen again.[72] Crucially, too, this realization is, from the very beginning, marked with a species of delight that remains the fundamental background of all subsequent human experience.[73] It is hard perhaps to think of our sense of self as grounded in action because when we are truly engrossed in doing something—especially something we know how to do very well, from running a race to solving a complicated logical problem—we tend to forget that we exist. But even as we dissolve into what we do, the foundational “pleasure at being the cause” remains, as it were, the unstated ground of our being.

Groos himself was primarily interested in asking why humans play games, and why they become so passionate and excited over the outcome even when they know it makes no difference who wins or loses outside the confines of the game itself. He saw the creation of imaginary worlds as simply an extension of his core principle. This might be so. But what we’re concerned with here, unfortunately, is less with the implications for healthy development and more with what happens when something goes terribly wrong. In fact, experiments have also shown that if one first allows a child to discover and experience the delight in being able to cause a certain effect, and then suddenly denies it to them, the results are dramatic: first rage, refusal to engage, and then a kind of catatonic folding in on oneself and withdrawing from the world entirely. Psychiatrist and psychoanalyst Francis Broucek called this the “trauma of failed influence” and suspected that such traumatic experiences might lie behind many mental health issues later in life.[74]

If this is so, then it begins to give us a sense of why being trapped in a job where one is treated as if one were usefully employed, and has to play along with the pretense that one is usefully employed, but at the same time, is keenly aware one is not usefully employed, would have devastating effects. It’s not just an assault on the person’s sense of self-importance but also a direct attack on the very foundations of the sense that one even is a self. A human being unable to have a meaningful impact on the world ceases to exist.

a brief excursus on the history of make-work and particularly of the concept of buying other people’s time

Boss: How come you’re not working?

Worker: There’s nothing to do.

Boss: Well, you’re supposed to pretend like you’re working.

Worker: Hey, I got a better idea. Why don’t you pretend like I’m working? You get paid more than me.

—Bill Hicks comedy routine

Groos’s theory of “the pleasure at being the cause” led him to devise a theory of play as make-believe: humans invent games and diversions, he proposed, for the exact same reason the infant takes delight in his ability to move a pencil. We wish to exercise our powers as an end in themselves. The fact that the situation is made up doesn’t detract from this; in fact, it adds another level of contrivance. This, Groos suggested—and here he was falling back on the ideas of Romantic German philosopher Friedrich Schiller—is really all that freedom is. (Schiller argued that the desire to create art is simply a manifestation of the urge to play as the exercise of freedom for its own sake as well.[75]) Freedom is our ability to make things up just for the sake of being able to do so.

Yet at the same time, it is precisely the make-believe aspect of their work that student workers like Patrick and Brendan find the most infuriating—indeed, that just about anyone who’s ever had a wage-labor job that was closely supervised invariably finds the most maddening aspect of her job. Working serves a purpose, or is meant to do so. Being forced to pretend to work just for the sake of working is an indignity, since the demand is perceived—rightly—as the pure exercise of power for its own sake. If make-believe play is the purest expression of human freedom, make-believe work imposed by others is the purest expression of lack of freedom. It’s not entirely surprising, then, that the first historical evidence we have for the notion that certain categories of people really ought to be working at all times, even if there’s nothing to do, and that work needs to be made up to fill their time, even if there’s nothing that really needs doing, refers to people who are not free: prisoners and slaves, two categories that historically have largely overlapped.[76]


It would be fascinating, though probably impossible, to write a history of make-work—to explore when and in what circumstances “idleness” first came to be seen as a problem, or even a sin. I’m not aware that anyone has actually tried to do this.[77] But all evidence we have indicates that the modern form of make-work that Patrick and Brendan are complaining about is historically new. This is in part because most people who have ever existed have assumed that normal human work patterns take the form of periodic intense bursts of energy, followed by relaxation, followed by slowly picking up again toward another intense bout. This is what farming is like, for instance: all-hands-on-deck mobilization around planting and harvest, but otherwise, whole seasons taken up largely by minding and mending things, minor projects, and puttering around. But even daily tasks, or projects such as building a house or preparing for a feast, tend to take roughly this form. In other words, the traditional student’s pattern of lackadaisical study leading up to intense cramming before exams and then slacking off again—I like to refer to it as “punctuated hysteria”—is typical of how human beings have always tended to go about necessary tasks if no one forces them to act otherwise.[78] Some students may engage in cartoonishly exaggerated versions of this pattern.[79] But good students figure out how to get the pace roughly right. Not only is it what humans will do if left to their own devices, but there is no reason to believe that forcing them to act otherwise is likely to cause greater efficiency or productivity. Often it will have precisely the opposite effect.

Obviously, some tasks are more dramatic and therefore lend themselves better to alternating intense, frenetic bursts of activity and relative torpor. This has always been true. Hunting animals is more demanding than gathering vegetables, even if the latter is done in sporadic bursts; building houses better lends itself to heroic efforts than cleaning them. As these examples imply, in most human societies, men tend to try, and usually manage, to monopolize the most exciting, dramatic kinds of work—they’ll set the fires that burn down the forest where they plant their fields, for example, and, if they can, relegate to women the more monotonous and time-consuming tasks, such as weeding. One might say that men will always take for themselves the kind of jobs one can tell stories about afterwards, and try to assign women the kind you tell stories during.[80] The more patriarchal the society, the more power men have over women, the more this will tend to be the case. The same pattern tends to reproduce itself whenever one group clearly is in a position of power over another, with very few exceptions. Feudal lords, insofar as they worked at all, were fighters[81]—their lives tended to alternate between dramatic feats of arms and near-total idleness and torpor. Peasants and servants obviously were expected to work more steadily. But even so, their work schedule was nothing remotely as regular or disciplined as the current nine-to-five—the typical medieval serf, male or female, probably worked from dawn to dusk for twenty to thirty days out of any year, but just a few hours a day otherwise, and on feast days, not at all. And feast days were not infrequent.

The main reason work could remain so irregular was that it was largely unsupervised. This is true not only of medieval feudalism but also of most labor arrangements anywhere until relatively recent times. It was true even if those labor arrangements were strikingly unequal. If those on the bottom produced what was required of them, those on top didn’t really feel they should have to be bothered knowing what that entailed. We see this again quite clearly in gender relations. The more patriarchal a society, the more segregated men’s and women’s quarters will also tend to be; as a result, the less men tend to know about women’s work, and certainly, the less able men would be to perform women’s work if the women were to disappear. (Women, in contrast, usually are well aware of what men’s work entails and are often able to get on quite well were the men for some reason to vanish—this is why in so many past societies, large percentages of the male population could take off for long periods for war or trade without causing any significant disruption.) Insofar as women in patriarchal societies were supervised, they were supervised by other women. Now, this did often involve a notion that women, unlike men, should keep themselves busy all the time. “Idle fingers knit sweaters for the devil,” my great-grandmother used to warn her daughter back in Poland. But this kind of traditional moralizing is actually quite different from the modern “If you have time to lean, you have time to clean,” because its underlying message is not that you should be working but that you shouldn’t be doing anything else. Essentially, my great-grandmother was saying that anything a teenage girl in a Polish shtetl might be getting up to when she wasn’t knitting was likely to cause trouble. Similarly, one can find occasional warnings by nineteenth-century plantation owners in the American South or the Caribbean that it’s better to keep slaves busy even at made-up tasks than to allow them to idle about in the off-season; the reason given always being that if slaves were left with time on their hands, they were likely to start plotting to flee or revolt.

The modern morality of “You’re on my time; I’m not paying you to lounge around” is very different. It is the indignity of a man who feels he’s being robbed. A worker’s time is not his own; it belongs to the person who bought it. Insofar as an employee is not working, she is stealing something for which the employer paid good money (or, anyway, has promised to pay good money for at the end of the week). By this moral logic, it’s not that idleness is dangerous. Idleness is theft.

This is important to underline because the idea that one person’s time can belong to someone else is actually quite peculiar. Most human societies that have ever existed would never have conceived of such a thing. As the great classicist Moses Finley pointed out: if an ancient Greek or Roman saw a potter, he could imagine buying his pots. He could also imagine buying the potter—slavery was a familiar institution in the ancient world. But he would have simply been baffled by the notion that he might buy the potter’s time. As Finley observes, any such notion would have to involve two conceptual leaps which even the most sophisticated Roman legal theorists found difficult: first, to think of the potter’s capacity to work, his “labor-power,” as a thing that was distinct from the potter himself, and second, to devise some way to pour that capacity out, as it were, into uniform temporal containers—hours, days, work shifts—that could then be purchased, using cash.[82] To the average Athenian or Roman, such ideas would have likely seemed weird, exotic, even mystical. How could you buy time? Time is an abstraction![83] The closest he would have likely been able to come would be the idea of renting the potter as a slave for a certain limited time period—a day, for instance—during which time the potter would, like any slave, be obliged to do whatever his master ordered. But for this very reason, he would probably find it impossible to locate a potter willing to enter into such an arrangement. To be a slave, to be forced to surrender one’s free will and become the mere instrument of another, even temporarily, was considered the most degrading thing that could possibly befall a human being.[84]

As a result, the overwhelming majority of examples of wage labor that we do encounter in the ancient world are of people who are already slaves: a slave potter might indeed arrange with his master to work in a ceramics factory, sending half the wages to his master and keeping the rest for himself.[85] Slaves might occasionally do free contract work as well—say, working as porters at the docks. Free men and women would not. And this remained true until fairly recently: wage labor, when it did occur in the Middle Ages, was typical of commercial port cities such as Venice, or Malacca, or Zanzibar, where it was carried out almost entirely by unfree labor.[86]

So how did we get to the situation we see today, where it’s considered perfectly natural for free citizens of democratic countries to rent themselves out in this way, or for a boss to become indignant if employees are not working every moment of “his” time?

First of all, it had to involve a change in the common conception of what time actually was. Human beings have long been acquainted with the notion of absolute, or sidereal, time by observing the heavens, where celestial events happen with exact and predictable regularity. But the skies are typically treated as the domain of perfection. Priests or monks might organize their lives around celestial time, but life on earth was typically assumed to be messier. Below the heavens, there is no absolute yardstick to apply. To give an obvious example: if there are twelve hours from dawn to dusk, there’s little point saying a place is three hours’ walk away when you don’t know the season when someone is traveling, since winter hours will be half the length of summer ones. When I lived in Madagascar, I found that rural people—who had little use for clocks—still often described distance the old-fashioned way and said that to walk to another village would take two cookings of a pot of rice. In medieval Europe, people spoke similarly of something as taking “three paternosters,” or two boilings of an egg. This sort of thing is extremely common. In places without clocks, time is measured by actions rather than action being measured by time. There is a classic statement on the subject by the anthropologist Edward Evan Evans-Pritchard; he’s speaking of the Nuer, a pastoral people of East Africa:

[T]he Nuer have no expression equivalent to “time” in our language, and they cannot, therefore, as we can, speak of time as though it were something actual, which passes, can be wasted, can be saved, and so forth. I do not think that they ever experience the same feeling of fighting against time or having to coordinate activities with an abstract passage of time, because their points of reference are mainly the activities themselves, which are generally of a leisurely character. Events follow a logical order, but they are not controlled by an abstract system, there being no autonomous points of reference to which activities have to conform with precision. Nuer are fortunate.[87]

Time is not a grid against which work can be measured, because the work is the measure itself.

The English historian E. P. Thompson, who wrote a magnificent 1967 essay on the origins of the modern time sense called “Time, Work Discipline, and Industrial Capitalism,”[88] pointed out that what happened were simultaneous moral and technological changes, each propelling the other. By the fourteenth century, most European towns had created clock towers—usually funded and encouraged by the local merchant guild. It was these same merchants who developed the habit of placing human skulls on their desks as memento mori, to remind themselves that they should make good use of their time because each chime of the clock brought them one hour closer to death.[89] The dissemination of domestic clocks and then pocket watches took much longer, coinciding largely with the advent of the industrial revolution beginning in the late 1700s, but once it did happen, it allowed for similar attitudes to diffuse among the middle classes more generally. Sidereal time, the absolute time of the heavens, had to come to earth and began to regulate even the most intimate daily affairs. But time was simultaneously a fixed grid, and a possession. Everyone was encouraged to see time as did the medieval merchant: as a finite property to be carefully budgeted and disposed of, much like money. What’s more, the new technologies also allowed any person’s fixed time on earth to be chopped up into uniform units that could be bought and sold for money.

Once time was money, it became possible to speak of “spending time,” rather than just “passing” it—also of wasting time, killing time, saving time, losing time, racing against time, and so forth. Puritan, Methodist, and evangelical preachers soon began instructing their flocks about the “husbandry of time,” proposing that the careful budgeting of time was the essence of morality. Factories began employing time clocks; workers came to be expected to punch the clock upon entering and leaving; charity schools designed to teach the poor discipline and punctuality gave way to public school systems where students of all social classes were made to get up and march from room to room each hour at the sound of a bell, an arrangement self-consciously designed to train children for future lives of paid factory labor.[90]

Modern work discipline and capitalist techniques of supervision have their own peculiar histories, too, as forms of total control first developed on merchant ships and slave plantations in the colonies were imposed on the working poor back home.[91] But the new conception of time was what made it possible. What I want to underline here is that this was both a technological and a moral change. It is usually laid at the feet of Puritanism, and Puritanism certainly had something to do with it; but one could argue equally compellingly that the more dramatic forms of Calvinist asceticism were just overblown versions of a new time sense that was, in one way or another, reshaping the sensibilities of the middle classes across the Christian world. As a result, over the course of the eighteenth and nineteenth centuries, starting in England, the old episodic style of working came increasingly to be viewed as a social problem. The middle classes came to see the poor as poor largely because they lacked time discipline; they spent their time recklessly, just as they gambled away their money.

Meanwhile, workers rebelling against oppressive conditions began adopting the same language. Many early factories didn’t allow workers to bring their own timepieces, since the owner regularly played fast and loose with the factory clock. Before long, however, workers were arguing with employers about hourly rates, demanding fixed-hour contracts, overtime, time and a half, the twelve-hour day, and then the eight-hour day. But the very act of demanding “free time,” however understandable under the circumstances, had the effect of subtly reinforcing the idea that when a worker was “on the clock,” his time truly did belong to the person who had bought it—a concept that would have seemed perverse and outrageous to their great-grandparents, as, indeed, to most people who have ever lived.

concerning the clash between the morality of time and natural work rhythms, and the resentment it creates

It’s impossible to understand the spiritual violence of modern work without understanding this history, which leads regularly to a direct clash between the morality of the employer and the common sense of the employee. No matter how much workers may have been conditioned in time discipline by primary schooling, they will see the demand to work continually at a steady pace for eight hours a day regardless of what there is to do as defying all common sense—and the pretend make-work they are instructed to perform as absolutely infuriating.[92]

I well remember my very first job, as a dishwasher in a seaside Italian restaurant. I was one of three teenage boys hired at the start of the summer season, and the first time there was a mad rush, we naturally made a game of it, determined to prove that we were the very best and most heroic dishwashers of all time, pulling together into a machine of lightning efficiency, producing a vast and sparkling pile of dishes in record time. We then kicked back, proud of what we’d accomplished, pausing perhaps to smoke a cigarette or scarf ourselves a scampi—until, of course, the boss showed up to ask us what the hell we were doing just lounging around.

“I don’t care if there are no more dishes coming in right now, you’re on my time! You can goof around on your own time. Get back to work!”

“So what are we supposed to do?”

“Get some steel wool. You can scour the baseboards.”

“But we already scoured the baseboards.”

“Then get busy scouring the baseboards again!”

Of course, we learned our lesson: if you’re on the clock, do not be too efficient. You will not be rewarded, not even by a gruff nod of acknowledgment (which is all we were really expecting). Instead, you’ll be punished with meaningless busywork. And being forced to pretend to work, we discovered, was the most absolute indignity—because it was impossible to pretend it was anything but what it was: pure degradation, a sheer exercise of the boss’s power for its own sake. It didn’t matter that we were only pretending to scrub the baseboard. Every moment spent pretending to scour the baseboard felt like some schoolyard bully gloating at us over our shoulders—except, of course, this time, the bully had the full force of law and custom on his side.

So the next time a big rush came, we made sure to take our sweet time.


It’s easy to see why employees might characterize such make-work tasks as bullshit, and many of the testimonies I received enlarged on the resentment this produced. Here is an example of what might be called “traditional make-work,” from Mitch, a former ranch hand in Wyoming. Ranch work, he wrote, is hard but rewarding, and if you are lucky enough to work for an easygoing employer, it tends to alternate cheerfully between intense bursts of effort and just sort of hanging around. Mitch was not so lucky. His boss, “a very old and well-respected member of the community, of some regional standing in the Mormon church,” insisted as a matter of principle that whenever there was nothing to do, free hands had to spend their time “picking rocks.”

Mitch: He would drop us off in some random field, where we were told to pick up all the rocks and put them in a pile. The idea, we were told, was to clear the land so that tractor implements wouldn’t catch on them.

I called BS on that right off. Those fields had been plowed many times before I ever saw them, plus the frost heaves of the severe winters there would just raise more rocks to the surface over time. But it kept the paid hands “busy” and taught us proper work ethic (meaning obedience, a very high principle as taught in Mormonism), blah, blah.

Riiiight. A hundred-square-foot area of dirt would have hundreds of rocks the size of a fist or bigger.

I remember once spending several hours in a field, by myself, picking rocks, and I honestly tried to do my best at it (God knows why), though I could see how futile it was. It was backbreaking. When the old boss came back to pick me up to do something else, he looked disapprovingly at my pile and declared that I hadn’t really done very much work. As if being told to do menial labor for menial labor’s sake wasn’t degrading enough, it was made more so by my being told that my hours of hard work, performed entirely by hand with no wheelbarrow or any other tool whatsoever, simply wasn’t good enough. Gee, thanks. What’s more, no one ever came to haul off the rocks I had collected. From that day, they sat in that field exactly where I had piled them, and I wouldn’t be surprised if they were still there to this day.

I hated that old man every day until the day he died.

Mitch’s story highlights the religious element: the idea that dutiful submission even to meaningless work under another’s authority is a form of moral self-discipline that makes you a better person. This, of course, is a modern variant of Puritanism. For now, though, I mainly want to emphasize how this element just adds an even more exasperating layer to the perverse morality whereby idleness is a theft of someone else’s time. Despite the humiliation, Mitch could not help but try to treat even the most pointless task as a challenge to be overcome, at the same time feeling a visceral rage at having no choice but to play a game of make-believe he had not invented, and which was arranged in such a way that he could never possibly win.

Almost as soul destroying as being forced to work for no purpose is being forced to do nothing at all. In a way it’s even worse, for the same reason that any prison inmate would prefer spending a year working on a chain gang breaking rocks to a year staring at the wall in solitary.

Occasionally the very rich hire their fellow human beings to pose as statues on their lawns during parties.[93] Some “real” jobs come very close to this: one does not need to stand quite as still, but one does have to do it for much longer periods of time:

Clarence: I worked as a museum guard for a major global security company in a museum where one exhibition room was left unused more or less permanently. My job was to guard that empty room, ensuring no museum guests touched the… well, nothing in the room, and ensuring nobody set any fires. To keep my mind sharp and attention undivided, I was forbidden any form of mental stimulation, like books, phones, etc.

Since nobody was ever there, in practice I sat still and twiddled my thumbs for seven and a half hours, waiting for the fire alarm to sound. If it did, I was to calmly stand up and walk out. That was it.

In a situation like that—I can attest to this because I have been in roughly analogous situations—it’s very hard not to stand there calculating “Just how much longer would it likely take me to notice a fire if I were sitting here reading a novel or playing solitaire? Two seconds? Three seconds? That is assuming I wouldn’t actually notice it quicker because my mind would not, as it is now, be so pulped and liquefied by boredom that it had effectively ceased to operate. But even assuming that it was three seconds, just how many seconds of my life have been effectively taken from me to eliminate that hypothetical three-second gap? Let’s work it out (I have a lot of time on my hands anyway): 27,000 seconds a work shift; 135,000 seconds a week; 3,375,000 seconds every six months.” Hardly surprising that those assigned such utterly empty labor rarely last a year unless someone upstairs takes pity and gives them something else to do.

Clarence lasted six months (roughly twenty million seconds) and then took a job at half the pay that afforded at least a modicum of mental stimulation.


These are obviously extreme examples. But the morality of “You’re on my time” has become so naturalized that most of us have learned to see the world from the point of view of the restaurant owner—to the extent that even members of the public are encouraged to see themselves as bosses and to feel indignant if public servants (say, transit workers) seem to be working in a casual or dilatory fashion, let alone just lounging around. Wendy, who sent me a long history of her most pointless jobs, reflected that many of them seem to come about because employers can’t accept the idea that they’re really paying someone to be on call in case they’re needed:

Wendy: Example one: as a receptionist for a small trade magazine, I was often given tasks to perform while I was waiting for the phone to ring. Fair enough—but the tasks were almost uniformly BS. One I will remember for the rest of my life: one of the ad sales people came to my desk and dumped thousands of paper clips on my desk and asked me to sort them by color. I thought she was joking, but she wasn’t. I did it, only to observe that she then used them interchangeably without the slightest attention to the color of the clip.

Example two: my grandmother, who lived independently in an apartment in New York City into her early nineties. She did need help, though, so we hired a very nice woman to live with her and keep an eye out. Basically, she was there in case my grandmother fell or needed help, and to help her do shopping and laundry, but if all went well, there was basically nothing for her to do. This drove my grandmother crazy. “She’s just sitting there!” she would complain. We would explain that was the point.

To help my grandmother save face, we asked the woman if she would mind straightening out cabinets when she wasn’t otherwise occupied. She said no problem. But the apartment was small, the closets and cabinets were quickly put in order, and there was nothing to do again. Again, my grandmother was going crazy that she was just sitting there. Ultimately, the woman quit. When she did, my mother said to her, “Why? My mother looks great!” To which the woman responded famously, “Sure, she looks great. I’ve lost fifteen pounds, and my hair is falling out. I can’t take her anymore.” The job wasn’t BS, but the need to construct a cover by way of creating so much BS busywork was deeply demeaning to her. I think this is a common problem for people working for the elderly. (It comes up with babysitting, too, but in a very different way.)[94]

Not just for them, either. Once you recognize the logic, it becomes easy to see that whole jobs, careers, and even industries can come to conform to this logic—a logic that not so very long ago would have been universally considered utterly bizarre. It has also spread across the world. Ramadan Al Sokarry, for example, is a young Egyptian engineer working for a public enterprise in Cairo:

Ramadan: I graduated from the Electronics and Communications Department in one of the best engineering schools in my country, where I had studied a complicated major, and where all the students had high expectations of careers tied to research and the development of new technologies.

Well, at least that’s what our studies made us think. But it wasn’t the case. After graduation, the only job I could find was as a control and HVAC [heating, ventilating, and air-conditioning] engineer in a corporatized government company—only to discover immediately that I hadn’t been hired as an engineer at all but really as some kind of a technical bureaucrat. All we do here is paperwork, filling out checklists and forms, and no one actually cares about anything but whether the paperwork is filed properly.

The position is described officially as follows: “heading a team of engineers and technicians to carry out all the preventive maintenance, emergency maintenance operations, and building new systems of control engineering to achieve maximum efficiency.” In reality, it means I make a brief daily check on system efficiency, then file the daily paperwork and maintenance reports.

To state the matter bluntly: the company really just needed a team of engineers to come in every morning, check whether the air conditioners were working, and then hang around in case something broke. Of course, management couldn’t admit that. Ramadan and the other members of his team could have just as easily been sitting around playing cards all day, or—who knows?—even working on some of those inventions they’d been dreaming about in college, so long as they were ready to leap into action if a convector malfunctioned. Instead, the firm invented an endless array of forms, drills, and box-ticking rituals calculated to keep them busy eight hours a day. Fortunately, the company didn’t have anyone on staff who cared enough to check whether they were actually complying. Ramadan gradually figured out which of the exercises actually needed to be carried out and which nobody would notice if he ignored, and he used the time he freed up to indulge a growing interest in film and literature.

Still, the process left him feeling hollow:

Ramadan: In my experience, this was psychologically exhausting and it left me depressed, having to go every workday to a job that I considered pointless. Gradually I started losing interest in my work, and started watching films and reading novels to fill the empty shifts. I now even leave my workplace for hours almost each shift without anyone noticing.

Once again, the end result, however exasperating, doesn’t seem all that impossibly bad. Especially once Ramadan had figured out how to game the system. Why couldn’t he see it, then, as stealing back time that he’d sold to the corporation? Why did the pretense and lack of purpose grind him down?

It would seem we are back at the same question with which we started. But at this point, we are much better equipped to find the answer. If the most hateful aspect of any closely supervised wage-labor job is having to pretend to work to appease a jealous boss, jobs such as Ramadan’s (and Eric’s) are essentially organized around the same principle. They might be infinitely more pleasant than my experience of having to spend minutes that felt like hours applying steel wool to perfectly clean baseboards. Such jobs are likely to be not waged but salaried. There may not even be an actual boss breathing down one’s neck—in fact, usually there isn’t. But ultimately, the need to play a game of make-believe not of one’s own making, a game that exists only as a form of power imposed on you, is inherently demoralizing.

So the situation was not, in the final analysis, all that fundamentally different from when my fellow dishwashers and I had to pretend to clean the baseboards. It is like taking the very worst aspect of most wage-labor jobs and substituting it for the occupation that was otherwise supposed to give meaning to your existence. It’s no wonder the soul cries out. It is a direct assault on everything that makes us human.
