
The Quantified Worker

Ifeoma Ajunwa

The quantified worker is awakened by an electronic device she wears on her wrist. First it is a gentle vibration; then the sensation gradually increases in intensity. The device tracks information on her sleeping habits — when she went to sleep, how long she slept, even whether it was fitful or peaceful sleep. Once she is out of bed, the device counts her steps. In the mirror she brushes her teeth vigorously — she hopes that counts for exercise. Because the quantified worker is part of a workplace wellness program, all the information from her electronic device is dispatched to the program and, in return, her employer pays her health insurance premium. To continue with the program, the quantified worker must also exercise for 30 minutes every day; she earns redeemable points each time she goes to a gym because she has a card with an embedded chip that keeps electronic tabs on her gym visits. At the beginning of the program, the quantified worker submitted to genetic testing. The result: a genetic profile for the quantified worker outlining her propensity for certain diseases and extrapolating what diet she must follow to maintain optimal health. As part of the wellness program, the quantified worker was encouraged to download a health application on her phone. On this app, she is expected to track all her prescription medications and also her menstrual cycle. There is also an option for tracking ovulation cycles.

For the quantified worker, it is not only her body that is quantified, it is also her mind. To apply for the job, the quantified worker had to run the gauntlet of automated hiring platforms. She answered questions like “How often do you smile per day?” and “Do you find yourself feeling sad for no reason?” Then the quantified worker sat for a video interview. In the comfort of her own home, she sat alone in a room, eyes rigidly fixed straight ahead on her laptop camera. She made sure never to show too much of the whites of her eyes, a sure sign of aggression. Her friends have told her she tends to speak with her hands, elucidating her points with fluid hand flourishes, so she keeps her hands gripping a pencil on the desk in front of her — like an anchor or deadweight. It is true that this makes her feel like she is attempting to speak with a muzzle on, but no matter, she will appear confident to the machine evaluating her.

Once the quantified worker is hired, mechanical managers become her immediate supervisors. Whether she is hired to an office job where she wears a lapel pin that tracks her movements around the office and might record snippets of conversation, or whether she works a factory job requiring physical exertion where she must wear a safety exoskeleton that can detect whether she is indeed lifting with her knees, her mechanical managers silently and perpetually record her every move. If she works in an office with sensitive information, she might wear a badge with an RFID chip allowing her entrance to certain rooms and excluding her from others — for extra convenience, an RFID chip might be inserted under her skin, thereby taking the term “embedded manager” to new heights. If she works in an office, there are cameras everywhere except the bathroom — a code to unlock the bathroom door already documents who entered and for how long. As she sits and types at her computer, her keystrokes are logged. The websites she visits on her computer are logged. And for the hypervigilant employer, there is no need to stand over her shoulder to peer at her computer screen — a program will take a screenshot of her computer screen at whatever interval is required. Is she disgruntled perhaps from all the surveillance? Is she planning to leave the company? A program will track her visits and behavior on LinkedIn and will alert her human manager if she proves to be a significant flight risk. Her social media accounts also are surveilled. Any criticism of her employer is flagged: such action may affect her advancement in the company or even lead to dismissal.

If this sounds to you like a dystopian future, you are half correct. This is not the future. It is the current plight of workers. Most workers are now caught up in what scholars like Shoshana Zuboff have identified as “surveillance capitalism,” their every move tracked and monitored in service of profit-making. When William Whyte published The Organization Man in the fall of 1956, it carried a warning that large American corporations were systemically eroding the individuality of their workers. Furthermore, Whyte warned, this suppression of individuality would be detrimental, not just to the individual, but also to the corporation itself, because of the concomitant loss of creativity and innovation. In addition to Whyte’s warning, other books such as Windows on the Workplace (1995) and The Electronic Sweatshop (1988) have raised questions about the deleterious effects of work technology on worker rights. With The Quantified Worker, I argue that advances in AI technology have ushered in a new era in the workplace — the era of the quantified worker. This new era brings with it new legal challenges, both for the organization and for the worker.

The Quantified Worker carries a warning — it rings the alarm bell that American workers are increasingly quantified in a manner and to a degree hitherto unknown in history. The quantification of workers is not new; it is as old as the valuation of Roman slaves or the counting of bushels of cotton picked by African slaves in the Americas. What sets this new era of worker quantification apart is that the quantification is now aided by technological advances grouped under the catch-all term of artificial intelligence. These new technologies perform automated decision-making with machine learning algorithms, often ignoring the gestalt of the worker in favor of numbers on a screen. The zeitgeist of this new era of quantification is that it is simultaneously nebulous and impactful: the intangibles of human behavior and genetic predilection are concretized as numbers to manage risk and profit for the firm.

There is a future of work in which we all become digital serfs. We toil away in crowded open-floor offices with no privacy or in cavernous warehouses. Workplaces, optimized for the greatest level of surveillance, closely resemble the panoptical ideal, in which worker surveillance is omnipresent and indefatigable. In this future, the surveillants are not human; rather, they are mechanical managers. They take screenshots of our computer screens every ten seconds; they use productivity applications to log each keystroke and parse each email message; and they require wristbands, fitness trackers, or chips embedded under the skin. They are the exoskeletons monitoring the worker’s every move. In this future of work, we are all quantified by our data. As workers, we become synonymous with our data doubles. We lose all of our personhood and voice, and our digital doubles speak for us. All of our interactions, within the workplace and without, become data that is used to quantify our personalities, quantify our work ethic, quantify our trustworthiness, and so on. We are at the mercy of mechanical managers. From the beginning, with the automated interview and automated onboarding, to the automated productivity checks and the automated dismissal, we are judged by algorithmic systems. We have no right to explanations and no venue to contest automated decisions. It is a bleak future.

But a different, better future of work is possible, one in which we, as workers, are empowered by our data, not ruled by it. Worker personhood and human dignity are retained as paramount and separate from the representations of digital data. Rather than quantifying workers, automated decision-making systems exist to enable them to reach their highest potential. It will not be enough to merely declare a stance against worker quantification. Achieving true worker autonomy necessitates concrete changes in our legal regime. It requires a reconsideration of our ethical values concerning workers. One might start with a Rawlsian approach to the adoption of new technologies in the workplace. This approach prompts the first-order question of “Who loses?”, a question that encourages us to ponder who is disadvantaged by the AI technologies of work. Only by focusing on the least well off can we be attuned to the oppressive potential of these technologies. The liberty of all depends on this consideration. As Dr. Martin Luther King Jr., a great champion of human rights, once noted, “The society that performs miracles with machinery has the capacity to make some miracles for men — if it values men as highly as it values machines.”

The Declaration of Independence presaged a break from colonialism, but it was the Constitution and, furthermore, the Bill of Rights that solidified democracy as the new order in the United States of America. Similarly, emerging AI technologies serve as harbingers of an upcoming socioeconomic upheaval, one that threatens to tilt the balance of power further toward employers, thus threatening our democracy. Awareness of this shifting landscape dictates decisive preventative action. First, we must envision a future in which workers’ labor and employment rights are respected, workers’ dignity and autonomy are valued, and technology is deployed to enhance the lives of workers rather than to control them. To achieve this requires concrete changes to employment law doctrine, such as a rethinking of the traditional deference accorded to employers and the adoption of the proposed discrimination per se cause of action, allowing workers to more easily bring suit when confronted with automated systems that have a disparate impact on minority workers. It requires a worker’s bill of rights that clearly delineates limits for worker surveillance and the use of the data derived from workers. It requires mandated audits of automated hiring systems, as well as design initiatives for those systems that have inclusion as their primary objective. It requires a reconsideration of the privacy and discrimination risks that arise from workplace wellness programs, wearable technology, and the laissez-faire approach to trading worker health data.

On the side of workers, one could envision a coalescence around worker unions that deploy worker data for the greater good of union members. Real-time worker data is wielded as worker power for better working conditions, as a bargaining chip for better pay, and as a safeguard for other work benefits. One could also foresee employers compensating workers for their data when such data is used for new work innovations or deployed to build AI technologies that render those workers redundant. There is also a possible future in which the idea of a universal basic income that ensures human welfare is not subsumed by corporate greed.

As it stands, we are set on a course towards a disastrous future of work where the worker is quantified in all aspects. The worker loses all individuality, all trace of personhood. Rather, she becomes a set of binaries, fit or unfit, to be plugged into the computerized workplace. She is merely a cog in the machine — fungible and disposable. Yet a different future of work remains possible. We need only course-correct by adopting a more worker-protective legal regime.

— from The Quantified Worker (Cambridge University Press 2023)