Controlling Technologies by Disciplining the Technologists
Approximately eighty years after the Wealth of Nations was published, detective Allan Pinkerton had an idea for a commercial business. In the mid-nineteenth century, there was a growing need for businesses to exercise stronger and more continuous control over their employees. Employers sought to ensure compliance during and after working hours, inside and outside work locations. Pinkerton’s consultations with various businesses, including numerous Midwestern railroad companies, convinced him that a broader market of employers would pay for such services.
Pinkerton and his attorney formed the North-Western Police Agency, later known as the Pinkerton National Detective Agency, which devised what was then a novel form of worker surveillance. The agency provided a wide range of services to its clients by spying on workers suspected of threatening the clients’ interests, including infiltrating and sabotaging labor organizing. Predictably, workers protested such violations of personhood. With the Anti-Pinkerton Act of 1893, Congress limited the federal government’s ability to deploy such services. The use of worker surveillance by private employers, however, remained unregulated.
A decade later, Henry Ford began applying the same methods to monitor and control employees. He hired private investigators to track his employees even outside of work, to better understand their lives and uncover intimate issues that could affect their performance at work. Decades later, in 1986, the New York Times would report that “the irony was that in trying to make over his workers in terms of ‘Americanization’ and ‘Fordliness,’ Ford created a form of Big Brotherism that was closer to the totalitarian model.” As seventy percent of his workforce was foreign born, “Ford’s Sociological Department was commissioned to intrude on the private lives of his employees. Inspectors were sent to their homes to question them about their marital life and their finances to see if they were worthy enough to work for Ford.” Probes also evaluated diets, drinking habits, household population and proficiency in American culture. Innocently unaware of the technological surveillance storm to come, the Times then observed, “It seems amazing that people would tolerate such interrogation, but their jobs depended on it.”
In 1987, the United States Office of Technology Assessment, tasked with providing Congress with “new and effective means for securing competent, unbiased information concerning the physical, biological, economic, social, and political effects” of technology, published a report titled The Electronic Supervisor: New Technology, New Tensions. It discussed the implications of computer-aided collection and analysis of workplace data for management practices, labor relations, privacy and stress. Technological surveillance allowed employers to characterize their workers’ activities through various measurements: keystrokes, computer time and call accounting. Eavesdropping was conducted to track productivity. Cards, beepers, TV cameras, genetic screening, pregnancy testing, polygraphs and brainwave mental-state tests were utilized to measure behavioral and personal characteristics for various management and business decisions.
The report detailed the purposes of such monitoring, such as work planning, increasing productivity, improved performance evaluation and investigation of workplace incidents. In addition, “increasing management control, discouraging union organizing activities, identifying dissidents, etc.” was also a stated goal. Indeed, as the report noted, because the US had the lowest levels of unionization in the developed world, its employees could not counteract what they deemed “unfair or abusive monitoring,” which would further impede the workers’ ability to form free associations. Expanding on the dearth of labor organization that perpetuated these issues, the report noted that “less than 20 percent of the office work-force is unionized, and even where unions are involved, their effectiveness has been limited because technology choice and productivity measurement are often considered ‘management rights’ under the contract.” By 2019, the unionization rate had dropped to 10.3% in the U.S., with public sector workers at 33.6% and private sector workers at 6.2%. In addition, employees who reported that the constant monitoring induced “stress” did not have any specific recourse. “Although some legal doctrines may be implicitly aimed at vindicating a person’s claim to bodily or mental integrity, autonomy, or dignity, the law recognizes no ‘right of dignity’ or ‘right of autonomy’ as such.”
Early attempts by employers to develop a systematic control regime for worker compliance and obedience were not inefficient for lack of trying. Rather, these methods were limited by the technological capabilities of the time. Executing round-the-clock surveillance with the simpler means of the day was a very uneconomical proposition. Thankfully, under the guiding hand of private capital interests, the technical enterprise focused its ingenuity and creativity on important and meaningful problems of the day — developing the means of surveillance of its own creators. As a 2015 survey by the American Management Association (AMA) showed, 66% of US companies monitor internet connections, 45% log keystrokes, and 43% monitor email, reviewed both automatically and manually. Furthermore, employers also track workers’ locations through employer-provided equipment that features GPS. As one AMA employee noted, “privacy in today’s workplace is largely illusory.”
These methods of monitoring and quantifying human labor are now quite conventional. The technological enterprise does not cease to invent new methods of addressing employers’ primordial desire for sublime employee control and cooperation. Paradoxically, these advancements bring not only more probing and efficient means of surveillance, but also seemingly more benign ones. Shedding the antagonistic touch of older methods, today’s workers are expected to enthusiastically participate in the process. For the sake of workplace wellness and productivity, modern surveillance of the worker is deployed as a mutual process, a congenial good that originates from ethical intentions, not power.
Such participatory surveillance is deemed necessary for innovation and progress, migrating away from the original necessities of discipline and control. As Julie Cohen of Georgetown University Law Center notes, “The rhetoric of participation and innovation that characterize [this] participatory turn work to position surveillance as an activity exempted from legal and social control.” For instance, behind Facebook’s colorful and affable façade operates a “secret police,” one tasked with scouring employees’ digital lives for purposes of information and attitude management. This amounts to a “ruthless code of secrecy” that seeks to control information about working conditions and misconduct that employees would otherwise want to freely express, as the Guardian noted.
Humanyze is a tech startup that has developed employee badges based on technology developed at MIT. The badge has two microphones for real-time voice analysis, and sensors that track employees’ movements and presence in the office. The company provides individualized analysis for self-improvement so that users can replicate patterns that result in greater productivity. As the CEO put it, “It’s exactly like a Fitbit for your career.” Managers are given the ability to track and improve “employee engagement, team productivity and organizational adaptability.”
As with all such applications, aggregated, anonymized data is invoked to dismiss the obvious privacy implications. However, as a study on worker surveillance noted, “the unspoken caveat is that there is no legal barrier to the employer’s acquisition of the raw data, which could be used for any purposes the employer wishes.” Other services such as BetterWorks and Kronos are also part of the large workforce management industry that is increasingly using data analytics to ultimately generate maximally profitable behaviors from their employees. Variables such as movements, speech and other daily activities begin to indirectly appear in calculations of the bottom line.
In addition, wellness applications allow employers to collect and utilize intimate information about employees’ lives, such as which drugs are prescribed to them (for instance, birth control prescriptions), how they shop, and even voting habits. Cohen, referring to this burgeoning technical indulgence as “the surveillance-innovation complex,” notes that “commercial surveillance environments use techniques of gamification to motivate user participation.” These apps portray surveillance in an “unambiguously progressive light,” merely seeking to drive innovation, productivity and indeed happiness through exciting technical flourishes. Advanced capabilities of data collection and analysis have spawned “workplace science,” a discipline that combines the fields of big data, machine learning and human resources, “trying to build better workers,” as the New York Times put it.
The increasing use of contract and freelance labor in the workforce has only accelerated the use of employee surveillance. For instance, Google has more temporary and contract workers than full-time employees. These modes of work are often remote, increasing the physical separation between the employer and employee. Telecommuting is on the rise, heightening the employer’s need for discipline and control. Satiating this need are companies that provide employers with the ability to remotely track their workers’ desktops, keystrokes, and minutes spent on various tasks and applications.
For instance, Amazon employs task forces that monitor online activities of the company’s independent contractors, or ‘Amazon Flex’ delivery drivers. These task forces, dubbed Social Listening Teams, scrape online activity, including closed Facebook groups, to monitor and track driver sentiment. Some insights are merely utilized for improved engineering of the drivers’ work. Such innocuous insights can include “[delivery contractor] happily posted about his good experience of the program during his first ever route,” or “[delivery contractor] posted negatively about Amazon GPS/Maps.” However, other labels for information include “warehouse employees [are] complaining about the poor working condition,” “strikes/protests: [delivery contractors] planning for any strike or protest against Amazon,” and “[delivery contractor] approached by researchers…for their project/thesis.” Amazon also hires “intelligence analysts” who track sensitive matters “including labor organizing threats against the company.”
Furthermore, Amazon warehouses track worker productivity by measuring scanned boxes per hour, and time spent off tasks for activities such as breaks and restrooms. Dropping below a desired threshold generates an automated firing without the involvement of a supervisor. In addition, while not yet realized, the company has patented a wristband for workers that emits ultrasonic pulses or radio signals to locate the workers’ hands relative to a target bin. To ensure the most efficient path to the target, the wristband would nudge the workers in the right direction using haptic feedback. Henry Ford’s wildest imaginations could not have accommodated such possibilities.
Studies show that employee surveillance and tracking increase worker alienation, lower job satisfaction and heighten workplace stress. As anthropologists Sally Applin and Michael Fischer concluded in a study called Watching Me, Watching You, “a poorly configured paradigm has created a culture where […] people more often alter their behavior to suit machines and work with them, rather than the other way around,” which degrades agency. The already decaying independence of thought and action under managerial control is simply accelerated by machine management, which encourages repetition and the gaming of metrics. As economist Charles Goodhart put it, “when a measure becomes a target, it ceases to be a good measure.” This “servile dependency upon their superiors” and the performance of “a few simple operations” results in the worker becoming “as stupid and ignorant as it is possible for a human creature to become,” as Adam Smith had observed. In some cases, sterilizing mental faculties is the least repulsive of consequences, as the ruthless treatment of workers’ bodies in Amazon warehouses would suggest.
Such mutations of our technical capabilities are not deterministic. These are the effects of a technological enterprise driven by private capital that at once, professes the value of scientific reason and humanitarian objectives, yet only deploys resources to pursue the “vile maxim of the masters of mankind.” As usual, Rocker presciently observed in 1937, “The machine, which was to have made work easier for men, has made it harder and has gradually changed its inventor himself into a machine who must adjust himself to every motion of the steel gears and levers. And just as they calculate the capacity of the marvelous mechanism to the tiniest fraction, they also calculate the muscle and nerve force of the living producers by definite scientific methods and will not realize that thereby they rob his soul and most deeply defile his humanity. We have come more and more under the dominance of mechanics and sacrificed living humanity to the dead rhythm of the machine without most of us even being conscious of the monstrosity of the procedure. Hence we frequently deal with such matters with indifference and in cold blood as if we handled dead things and not the destinies of men.”
Handling Dead Things
Rare earth elements (REE) are a group of elements that are not as rare as the name would suggest. Identified during the 18th and 19th centuries, most of these elements were classified as “earths” at the time, then defined as materials that could not be changed by heat. Compared to other “earths” such as magnesia, these elements were indeed rare. However, measured by their occurrence in the Earth’s crust, some of the elements are more abundant than copper, lead, gold, silver and platinum. Nevertheless, “rare” is quite appropriate, as “concentrated and economically minable deposits of REE’s are unusual,” as the United States Geological Survey (USGS) notes.
REEs have a wide range of technical applications. Scandium is used in aerospace components and alloys. Yttrium is used in lasers, microwave filters, TV and monitor displays. Lanthanum is used in hybrid-car batteries and digital camera lenses, including cell-phone cameras. Cerium, the most abundant REE, is used for glass-lens production, oil refining and polishing touchscreens on smartphones and tablets. Neodymium is used to make strong magnets for wind turbines, hard disk drives and automobile components such as power steering, electric vehicle motors and audio speakers. Promethium is used in portable x-ray machines. Europium is used in optical electronics. Erbium is used in fiber optics and nuclear-reactor control rods. Lutetium is used for chemical processing and LED lighting, while terbium is used in solid-state electronics and sonar systems.
In short, the technological enterprise heavily relies on these elements. Production processes that yield REEs are fundamental to the tech industry’s supply chain. Referencing studies on REEs by the National Research Council, U.S. Department of Energy, European Commission, and the American Physical Society Panel on Public Affairs and Materials Research Society, the USGS noted in 2017, “In recent years, expert panels convened by research institutes and Government agencies highlighted specific REEs as raw materials critical to evolving technologies, such as clean-energy applications, electronics, and high-tech military components.” Further noting that the production of these fundamental ingredients rests on precarious foundations, the USGS observed, “These reports also suggest that a high potential exists for disruptions in the supply of these REEs.”
The global supply of REEs is highly concentrated. On average, China has provided approximately 90% of the world’s REE supply since the late 1990s. About half of it comes from Baotou, a city of 2.5 million in Inner Mongolia — an autonomous region of northern China. In 2019, China registered documented mine production of 132,000 tons with reserves of 44 million tons. During the same period, the U.S. registered mine production of 26,000 tons with reserves of 1.4 million tons, a 44% increase over 2018 domestic production levels. Globally, REE production increased by 11% in 2019 compared to the previous year, totaling 210,000 tons. Hence, “China continued to dominate the global supply of rare earths,” the USGS noted.
In December 2017, President Trump issued an executive order titled A Federal Strategy to Ensure Secure and Reliable Supplies of Critical Minerals. It directed the U.S. Department of Commerce to assess ways of reducing the nation’s reliance on REEs, or “critical minerals,” evaluate alternatives and recycling, explore increasing REE production with partners and allies, and fast-track processes related to REE deposit exploration, development leases, and mining workforce development. In its response report, the Department of Commerce noted that “critical minerals are needed for many products used by Americans in everyday life, such as cell phones, computers, automobiles, and airplanes. These minerals are also used to make many other products important to the American economy and defense, including advanced electronics; manufacturing equipment; electricity generation, storage, and transmission systems; transportation systems; defense systems and other military supplies; cutting-edge medical devices; and other critical infrastructure systems.” Noting that the US is majority import-reliant on 31 of the 35 critical minerals (as defined by the Department of the Interior) and fully import-reliant on 14 of them, the Department of Commerce emphasized, “The assured supply of these critical minerals, and the resiliency of their supply chains, are essential to the United States’ economic security and national defense.”
Globally, the explosion of consumer electronics has led to a dramatic increase in the demand for REEs. Various components in everyday devices such as smartphones, tablets, desktop computers and other products such as electric vehicles, increasingly treated as consumer electronics, begin their journey in remote mines and processing plants. Glossy devices and vehicles displayed in ostentatious storefronts obscure the long path from the mines to the perfectly manicured product packaging and display. Not evident are REE extraction processes, and the inextricable environmental and health considerations of REE mining, such as water and sediment contamination by carcinogenic waste products like sulphates and ammonia. Processing one ton of REE generates 2,000 tons of toxic waste. Baotou’s REE industry alone produces 10 million tons of wastewater per year. Furthermore, as the USGS noted, “There are several potential links between REEs and greenhouse gas emissions. For instance, one of the main ore minerals, bastnaesite, contains carbonate, which is liberated during ore processing.” However, due to its low grades, “this carbon flux is not likely to be significant relative to other sources associated with mining.”
The dumping ground for the toxic wastewater is called a tailings pond, a large lake where all mining wastewater accumulates as industrial refuse. Baotou’s tailings pond is only twenty minutes by car from the city center. As one BBC journalist described it during a tour, “It’s a truly alien environment, dystopian and horrifying. The thought that it is man-made depressed and terrified me, as did the realization that this was the byproduct not just of the consumer electronics in my pocket, but also green technologies like wind turbines and electric cars that we get so smugly excited about in the West.”
A marquee example of such smug excitement is Apple’s Worldwide Developer Conference. In stark contrast to a “truly alien environment, dystopian and horrifying,” the conference, like other tech conferences, is ornamented with dazzling displays and lights. Captivating animations and rousing sounds aid excited presenters showcasing their latest wares. Bar graphs show the company’s dominance in the market, pie charts depict advancements compared to the previous year. Polished graphics ensure attention spans remain short. Perhaps, though, it isn’t in contrast to that dystopian and horrifying alien environment at all. While hailed as a sacred tech event that parades out Apple’s new devices and upgrades annually, the show is as much a testament to the new as it is a monument to the rapid cycles of obsolescence.
The principle of the show is hardly profound. Global users and developers must become enraptured by the company’s new offerings if it is to grow its influence. Yesterday’s breakthrough must be advertised as tomorrow’s inferior, undesirable technology if sales and profits are to continue to rise for one of the world’s biggest corporations. As advertiser J. George Frederick, credited with defining the idea of planned obsolescence, declared in 1928, “We must induce people…to buy a greater variety of goods on the same principle that they now buy automobiles, radios and clothes, namely: buying goods not to wear out, but to trade in or discard after a short time…the progressive obsolescence principle…means buying for up-to-dateness, efficiency, and style, buying for…the sense of modernness rather than simply for the last ounce of use.”
While the state agonizes over a reliable supply of critical minerals for its “economic security,” U.S corporations ensure that this very precondition for economic security is unrealizable. Bonded to resource and environmental insecurity, the relentless march for more product and faster disposal by the tech industry so thoroughly reliant on REEs tramples on elementary principles of conservation. The hurricane of hysterical consumption, ostensibly blowing refreshing winds of ‘innovation’ and ‘modernity,’ violently pummels already strained global resources. A wide range of techniques ensures that products are discarded much before their “last ounce of use,” and under the expectation of rapid cycles of consumption, products are not designed to last for the long term in the first place. Flashy tech conferences and sexy commercials are simple theatrics that only begin the process. The baton is then passed to after-sales monopolization of repair, a critical tool in business’s arsenal.
Up Next: Engineering Illusions Part III: Private Enterprise and Technology - III