Engineering Illusions Part II: State and Technology
An Insider’s Take on the Tech Industry
Surveillance for Freedom
In August 2018, the National Defense Authorization Act for Fiscal Year 2019 established the National Security Commission on Artificial Intelligence (NSCAI). Its primary directive is “to consider the methods and means necessary to advance the development of artificial intelligence, machine learning, and associated technologies to comprehensively address the national security and defense needs of the United States.” Its commissioners were appointed by the Secretaries of Defense and Commerce, and by members of Congress. NSCAI is chaired by former Google CEO Eric Schmidt, who also chairs the U.S. Department of Defense’s Defense Innovation Advisory Board. The commission includes officials from Oracle, In-Q-Tel (the CIA’s venture capital arm), Microsoft, Amazon, Google, Stanford Research Institute (SRI), and others.
In its first report to Congress in 2019, the commission laid out four working groups that would focus, respectively, on how the U.S. Government can, “through policy reforms, incentives, or appropriations, help accelerate academic research and commercial innovation of AI,” “adopt AI applications at speed and scale to protect U.S. national security,” “develop incentives to build a world-class, AI-ready national security workforce,” and “enhance U.S. competitiveness, leverage alliances, and establish norms that advance U.S. values and interests.”
After a drafting process that included briefings with the National Security Council, the White House Office of Science and Technology Policy, and the Defense Department, the Commission released a recommendation report in March 2020. The report declared that “trends in AI, including revelations about the power of AI for surveillance and its implications for weapons systems like swarming drones, indicate the United States is rapidly entering a new security environment.” It emphasized that “the United States is in a strategic competition with AI at the center and that the future of our national security and economy are at stake.”
Expressing approval of the early actions taken by the U.S. to assert its dominance, the report applauded public officials who supported investments to protect the United States’ advantages in AI. It appreciated that national security leaders continued “to identify AI as a priority technology for improving business practices and defending the nation.” It supported executive and legislative actions to double non-defense AI research funding in two years, and recognized the Justice Department’s aggressive pursuit of “foreign threats to U.S. intellectual property.” The report commended leading companies and research universities for seeing the “urgency of reconceiving their responsibilities and consider how their work impacts the health of our democracy and future of our security.”
Stressing the importance of strong state commitments in the field of AI, the report highlighted “concerning developments” such as an “AI-empowered facial recognition program trained on publicly available data that appears to put the privacy of Americans at greater risk than is generally understood,” referencing a recent report about surveillance company Clearview AI and its dragnet. Where the esteemed, knowledgeable commissioners who hail from Google, Amazon, etc. think the “generally understood” risk to privacy comes from would surely be a critical insight for the general public. Concern was also expressed over other news reporting that “demonstrated how effectively large sets of location data from cell phones can be combined with other available data to track the movements of individuals.” The commissioners can only hope to decipher the strange machinations of this technology and its mysterious source.
NSCAI urges the U.S. government to significantly increase spending to bolster AI at academic centers and national laboratories, and to execute public-private projects which would, among other measures: allocate appropriations for U.S. corporations; deploy a cross-department committee for emerging technologies that includes National Intelligence and Defense Department officers; expand AI-enabling microelectronics and 5G cellular connectivity programs, including the design and deployment of the hardware and infrastructure to empower AI research and data collection; improve AI cooperation among allies and partners such as NATO to “advance U.S military concept and capability to include AI war-gaming, experimentation and pilot projects;” and finally, the most perfunctory piece of any engineering curriculum — develop the principles of ethical and responsible AI through coursework and standards.
NSCAI summarizes its guiding philosophy by noting that the “purpose of the Commission’s threat-oriented line of effort is to focus on AI and associated technology threats to the United States from foreign state and non-state actors.”
Through a Freedom of Information Act (FOIA) request, the Electronic Privacy Information Center obtained a presentation made by NSCAI in May 2019. The presentation provided a competitive analysis of China and its technology economy with a focus on AI. It made the case that China is poised to leapfrog the U.S. in various technology verticals such as smart cities and telemedicine, enabled in part by its access to a large user population and massive data sets, and by the absence of legacy systems in China, which allows the country to quickly build modernized, advanced digital services. The presentation noted that the rapid growth of China’s Big Three — Baidu, Alibaba and Tencent — has been enabled by the state’s embrace of public-private partnerships in mass surveillance and data collection, resulting in a competitive edge in AI.
The slides extol China’s “explicit government support and involvement e.g facial recognition,” which leads to faster adoption. They propound that “surveillance is one of the first-and-best customers for AI” and that “mass surveillance is a killer application for deep learning.” Discussing smart cities, the presentation observes that “having streets carpeted with cameras is good infrastructure for smart cities,” and that “close collaboration with the government allows Alibaba to gather information like car and foot traffic data based on surveillance cameras.” “Government data mixed with Alibaba’s own data and expertise in computing is a potent combination,” the presentation affirmed.
Summarizing the thesis of the presentation, a slide observed, “Government investment and contracts allow AI projects to justify the initial fixed cost of development. Once they are at scale, the marginal economics of software make propagating to other use cases much more economically practical.” In other words, the U.S. Government must take the lead in creating the modern AI-based technology economy worldwide to protect U.S. corporate profits, while claiming the technical edge to secure state power. Otherwise, it risks China and other countries setting international norms and standards for this wave, leaving un-American technology in a favorable position to dominate this economy. Nothing less than the “future of our national security and economy are at stake.” NSCAI’s October 2020 recommendation report summarized simply, “The United States is in an AI-charged technology competition fusing national economic competitiveness, great power rivalry, and a fierce contest between authoritarianism and democracy.”
In his 2019 autobiography Permanent Record, NSA whistleblower Edward Snowden wrote of his early research on China’s electronic surveillance systems, recalling, “To read the technical details of China’s surveillance of private communications — to read a complete and accurate accounting of the mechanisms and machinery required for the constant collection, storage, and analysis of the billions of daily telephone and Internet communications of over a billion people — was utterly mind-boggling. At first I was so impressed by the system’s sheer achievement and audacity that I almost forgot to be appalled by its totalitarian controls.” Describing disturbing details which he would learn as he continued his research, he noted, “There was simply no way for America to have so much information about what the Chinese were doing without having done some of the very same things itself, and I had the sneaking sense while I was looking through all this China material that I was looking at a mirror and seeing a reflection of America. What China was doing publicly to its own citizens, America might be — could be — doing secretly to the world.”
Condensing what is required to restore privacy and protect civil liberties, the whistleblower noted, “It becomes ever clearer to me that the American legal resistance to mass surveillance was just the beta phase of what has to be an international opposition movement, fully implemented across both governments and [the] private sector,” a private sector “for whom the federal government [is] less the ultimate authority than the ultimate client.”
A few months before the U.S. Court of Appeals for the Ninth Circuit ruled in late 2020 that the NSA’s warrantless dragnet was illegal, NSCAI Chair Eric Schmidt had taken his case for state funding of the next technological wave of AI to the public. Schmidt, who holds $5.3 billion in shares of Google’s parent, Alphabet, wrote in a New York Times op-ed titled “I Used to Run Google. Silicon Valley Could Lose to China,” that “Americans should be wary of living in a world shaped by China’s view of the relationship between technology and authoritarian governance.” Gallantly championing freedom in the face of dastardly, uncontrollable technologies with perplexing origins, he declared, “Free societies must prove the resilience of liberal democracy in the face of technological changes that threaten it.”
“Think Different.” — Apple, Inc.
NSCAI’s recommendations for the AI industry are in keeping with a long tradition of the U.S. Government taking the lead in crafting entire technology industries for strategic, economic and political purposes. The state has always played the foundational role in producing scientific breakthroughs and funding applied research that yields innovative technologies. The process of pulling scientific understanding from nature’s void, developing techniques for basic research based on the theories, and converting this research into applied technologies is a long, challenging and risky endeavor that often takes decades to materialize. Such a process is simply not conducive to private capital’s requirement for de-risked, short-term profit.
In contrast, the Silicon Valley mythos promotes the idea of an unencumbered oasis of creativity, an entrepreneurial wild west where scholarly cowboys build technologies, and astute financiers erect businesses under the auspices and wisdom of the free market. There is no room for any state activity in this pristine oasis, the myth goes, and any attempts to interfere simply hinder the genius at work. The facts, apparently a valued commodity in rational Silicon Valley, betray such a conception. To say that the state interferes with Silicon Valley would be to say the Earth interferes with the Moon’s orbit.
Simply put, the U.S. Government created Silicon Valley. During the Cold War era, the U.S. government ran an unprecedented program of funding scientific research and technological development. Universities, labs, and private contractors were all major recipients of funding that laid the foundation for our modern high-tech economy. Specifically, military expenditures provided the economic impetus to sustain twenty- and thirty-year-long development timelines that culminated in commercially viable technologies. This mode of development is not just a historical artifact. It continues today, as also illustrated by NSCAI’s recommendations. In its October 2020 recommendations report, NSCAI made it explicit. Referencing the alliance between government, academia and industry, it urged, “To support the level of AI research, development and application that will underpin future U.S. technological leadership, the government must take action to strengthen the alliance by exploring new mechanisms to support research and enable partnerships with industry.” Furthermore, “To move as fast as U.S. competitors and maintain the defense advantage, [the Department of Defense] must have a means to support promising AI projects beyond early-stage research and development even when planned program funding is not yet in place.”
One of the premier technology arms of the state is DARPA (Defense Advanced Research Projects Agency), the research division of the Department of Defense. Over the decades, it has spearheaded the development of technologies like microelectronics, satellites, radars, drones, military weapon systems, military vehicles, supercomputing, the internet, and self-driving cars, among many others. It is also a major source of funding for top computer science programs at universities like Stanford. NASA has also been a formative entity in the construction of the modern technological state, with innovations in rocketry, satellite systems and aerospace components that built the private U.S. space industry.
Although we are immersed in examples of outputs from public investments, perhaps there is no better illustration of how fundamental the state has been to the technological enterprise than the singular product that ostensibly exemplifies Silicon Valley’s modern triumph: the Apple iPhone. Apple’s January 2007 launch press release proudly declared, “Apple today introduced iPhone, combining three products — a revolutionary mobile phone, a widescreen iPod with touch controls, and a breakthrough Internet communications device with desktop-class email, web browsing, searching and maps — into one small and lightweight handheld device. iPhone introduces an entirely new user interface based on a large multi-touch display and pioneering new software, letting users control iPhone with just their fingers. iPhone also ushers in an era of software power and sophistication never before seen in a mobile device, which completely redefines what users can do on their mobile phones.”
In The Entrepreneurial State, University College London professor of economics Mariana Mazzucato provides a comprehensive deconstruction of the iPhone, and the state origins of the technologies required to make the product possible. She notes that “nearly every state-of-the-art technology found in the iPod, iPhone and iPad is an often overlooked and ignored achievement of the research efforts and funding support of the government and military.” Beginning the analysis of Apple’s journey even before the iOS product line, she notes that there were three main sources of state support; namely, “Direct equity investment during the early stages of venture creation and growth,” “access to technologies that resulted from major government research programs, military initiatives, public procurement contracts, or that were developed by public research institutions, all backed by state or federal dollars,” and the “creation of tax, trade or technology policies that supported US companies such as Apple that allowed them to sustain their innovation efforts during times when national and/or global challenges hindered US companies from staying ahead, or caused them to fall behind in the race for capturing world markets.”
These are the essential ingredients of the famous Apple Way. The Apple Way is not particularly about building products that would somehow elevate their user to the stature of originals like Martin Luther King, Jr., Albert Einstein and Bob Dylan, as its Think Different marketing campaign claimed. Rather, it is about understanding that embedded in government-funded technologies are not merely advanced capabilities, but tremendous value that can be privatized. In this spirit, Apple’s product line is teeming with state-funded technologies, including CPUs, dynamic memory, micro hard-drive storage, LCD screens, lithium-polymer and lithium-ion batteries, digital signal processing techniques like Fast Fourier transforms (FFTs), HTTP and HTML, click-wheel navigation, multi-touch screens, and a voice-enabled, AI-based user interface. Furthermore, there are exogenous systems that support the device’s features, such as the internet, GPS and cellular networking systems, that were all developed in the state sector.
CPU: Integrated circuits (ICs) are dense arrays of transistors that are fundamental to CPUs. The early development of integrated circuits at AT&T Bell Labs, Fairchild Semiconductor and Intel was enabled by large procurement programs run by the U.S. Air Force and NASA. By providing a guaranteed market for nascent and experimental IC products, these defense contracts not only developed the embryonic domestic microprocessor industry but also boosted adjacent markets for secondary and tertiary microelectronic components. Large, sustained orders from the U.S. Air Force for the Minuteman II missile program critically funded the research and engineering work for the development of core technologies in the modern CPU. Furthermore, NASA’s Apollo program accelerated these purchasing programs, as the advancing requirements of space applications pushed the technology further. Mazzucato observed, “Each of the government agencies helped to drive down the costs of integrated circuits significantly within a matter of years.”
It must be noted that AT&T Bell Labs was indeed a corporate-funded research laboratory. With its origins in the late 19th century, the lab was responsible for breakthroughs such as the transistor. However, AT&T was only able to sustain long and expensive technical projects, which often fail and do not produce a positive rate of return, because it had a government-guaranteed monopoly on the telephone market — a form of state intervention. The lab withered after AT&T was forced to operate without its monopoly status.
Furthermore, in response to being outpaced by Japan in both memory product design and manufacturing in the 1980s, the Department of Defense (DoD) created the Strategic Computing Initiative (SCI). Echoing NSCAI’s contemporary global outlook, the DoD considered it a matter of national security to ensure that the U.S. controlled its supply of the microelectronic components that were critical for military capabilities. In addition, recognizing the commercial and military advantages that would materialize by dominating the semiconductor industry, the federal government formed a partnership of U.S. manufacturers and universities called the Semiconductor Manufacturing Technology (SEMATECH) consortium. Promoting domestic semiconductor manufacturing technology and capability was a means of boosting U.S. economic security and power internationally.
Hard Disk Drives: In 2007, European scientists Albert Fert and Peter Grünberg were awarded the Nobel Prize in Physics for their independent discovery of giant magnetoresistance (GMR) in 1988. GMR is a quantum mechanical effect observed in multi-layer structures. The main application of GMR has been in magnetic field sensors, used to read data in hard disk drives, biosensors and other devices. During the prize ceremony, Börje Johansson, a member of the Royal Swedish Academy of Sciences, remarked, “You would not have an iPod without this effect.” Indeed, “The MP3 and iPod industry would not have existed without this discovery.” Nor the iPhone. The discovery was the result of state-funded research in Germany and France.
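(As a brief technical aside — this is the standard textbook characterization of the effect, not drawn from the Nobel materials or Mazzucato: the strength of GMR is conventionally quantified by comparing the multilayer’s electrical resistance when adjacent magnetic layers are magnetized antiparallel versus parallel.)

```latex
% Conventional magnetoresistance ratio for a GMR multilayer:
% R_AP = resistance with adjacent layers magnetized antiparallel,
% R_P  = resistance with layers magnetized parallel.
\[
  \frac{\Delta R}{R} \;=\; \frac{R_{\mathrm{AP}} - R_{\mathrm{P}}}{R_{\mathrm{P}}}
\]
% A hard-drive read head exploits this: the tiny stray field of a
% recorded bit flips the sensor's free layer between the two
% configurations, producing a large, easily detected resistance change.
```

Because this resistance swing is so large even for weak fields, bits could be shrunk dramatically, which is what unlocked the storage-density gains of the 1990s mentioned below.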
Prior to the discovery, Peter Grünberg’s lab was affiliated with the largest US Department of Energy (DoE) lab in the Midwest, Argonne National Laboratory in Illinois. His lab would receive vital research support from the DoE. The breakthrough discovery enabled research that resulted in greater storage capacity in the 1990s, facilitating more applications. This enabled companies such as IBM and Seagate to capitalize on the newly commercially viable technology.
In his 2009 study From Lab to iPod: A Story of Discovery and Commercialization in the Post–Cold War Era, technology historian W. Patrick McCray details the origins of DARPA in the wake of Sputnik and its original mandate of maintaining an innovation pipeline to produce superior war technologies. During peacetime, the agency would focus on converting those prior investments and innovations into tools for economic advantage. After the end of the Cold War, the Department of Defense initiated the Technology Reinvestment Program (TRP) in 1992 to pivot to a new geopolitical and technology development stance. As McCray describes, “TRP’s purpose was to build stronger links between the commercial and military sectors and help the United States reap a greater share of the anticipated ‘peace dividend.’” Directing $800 million to many programs that sought to advance such military-commercial technologies, the TRP counted among its investments the development of commercial applications based on GMR. This program made significant contributions to the advancement of applied physics and engineering in the field of memory devices.
Capacitive sensing and multi-touch screens: The full touch display was perhaps the most distinctive feature of the iPhone when it was launched. These capacitive touch screens were also born of state military research. E. A. Johnson published his first study on the technology in the 1960s, while working at the Royal Radar Establishment (RRE), a British government military research agency. A collection of research programs at the European Organization for Nuclear Research (CERN), Oak Ridge National Laboratory (established in Tennessee in 1943) and the University of Kentucky became the cornerstone of advanced capacitive sensing hardware development, and of the modern touch-based devices we see today.
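(For readers curious how a capacitive screen actually registers a finger, here is the textbook parallel-plate sketch — my own illustration, not taken from Johnson’s papers: each sensing node on the screen behaves as a small capacitor, and a conductive fingertip perturbs its capacitance.)

```latex
% Parallel-plate approximation for one sensing node:
% \varepsilon_0 = permittivity of free space,
% \varepsilon_r = relative permittivity of the dielectric,
% A = electrode overlap area, d = electrode separation.
\[
  C \;=\; \varepsilon_0 \varepsilon_r \frac{A}{d}
\]
% A nearby finger couples the node to ground, shifting the measured
% capacitance by a small \Delta C. The controller repeatedly scans the
% row/column electrode grid and reports a touch wherever \Delta C
% exceeds a calibrated threshold, which is what makes tracking
% several fingers at once (multi-touch) possible.
```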
Furthermore, the development of multi-touch and gesture control was conducted at the University of Delaware by Professor John Elias and his student Wayne Westerman. Focused on neuromorphic systems, the work was part of the National Science Foundation (NSF) and Central Intelligence Agency Post-Doctoral Fellowship program. This work was commercialized when Elias and Westerman started FingerWorks, a company that developed touch-based input devices. FingerWorks was acquired by Apple in 2005, before the 2007 iPhone launch. As Mazzucato summarized, “Westerman and Elias, with funding from government agencies, produced a technology that has revolutionized the multi-billion-dollar mobile electronic devices industry. Apple’s highly comprehensive intellectual property portfolio had benefitted, once again, from technology that was originally underwritten by the state.”
GPS: Developed by the Department of Defense, the Global Positioning System (GPS) was initiated in the early 1970s for military use only. The technology was developed as an electronic positioning system that improved the military’s ability to command and control its asset deployments worldwide. Following the implementation of civilian applications on the system in the 1990s, non-military use of GPS quickly outpaced even military use. The U.S. government owns, maintains and upgrades the system. Mazzucato observed, “This technology, as well as the infrastructure of the system, would have been impossible without the government taking the initiative and making the necessary financial commitment for such a highly complex system.” One could argue that the iPhone is useful even without GPS, but there would be no Google Maps directions, no Yelp restaurant finder, and no Uber without this system.
AI assistant: The Cognitive Assistant that Learns and Organizes (CALO) project was funded by DARPA to develop a virtual assistant for military personnel. The five-year contract was launched in 2003, tasking the Stanford Research Institute (SRI) with coordinating researchers from over twenty U.S. universities. Upon the iPhone’s launch in 2007, SRI recognized the commercial potential of the virtual assistant and spun off the technology through a venture-backed startup called SIRI. SIRI was acquired by Apple in 2010 for more than $200 million. The startup’s co-founder noted that the four keys to launching a successful business are an ambitious idea, the right team, other people’s money, and killer execution. Quite likely, “other people’s money” did not refer to the years of public financing for technical developments that were later privatized. Rather, per the cherished, retold Silicon Valley fables, it was likely a nod to the intrepid venture capitalists and their partners who ostensibly risk it all to bless society with various marvels of modern science.
Internet, HTTP/HTML: Under looming threats of a nuclear attack during the Cold War, the U.S. government was strategizing ways to maintain continuity of operations in the aftermath of such an attack. The U.S. Air Force’s Research and Development program, or RAND (later the RAND Corporation), began working on approaches to overcome the inherent risks of centralized network switching systems that would be vulnerable to outage. RAND researcher Paul Baran devised a solution that networked a collection of distributed stations, so that the system could remain resilient even if a section of the network went offline. DARPA’s in-house projects led to the development of various technologies required to construct such a network.
DARPA had approached IBM and AT&T to build this network but was turned down because both companies considered such a network a threat to their business. Instead, DARPA partnered with the state-funded British Post Office to network stations from coast to coast. As Mazzucato observed, “From the 1970s through the 1990s, DARPA funded the necessary communication protocol (TCP/IP), operating system (UNIX) and email programs needed for the communication system, while the NSF initiated the development of the first high-speed digital networks in the US.”
British scientist Tim Berners-Lee’s contributions to the development of the internet are well known. In the late 1980s, he developed uniform resource locators (URLs), HTML, and HTTP. The latter was successfully implemented on a network of computers at CERN by Berners-Lee and computer scientist Robert Cailliau. These formative components of the internet were all devised with state funding under state-sanctioned technology development programs. Without such an enterprise, the modern Silicon Valley internet economy simply would not exist, to say nothing of the iPhone.
Display Technologies: “The story of the LCD shares great similarities with the hard disk drive, microprocessor and memory chip (among other major technologies) that emerged during the Cold War era: it is rooted in the US military’s need to strengthen its technological capabilities as a matter of national security,” noted Mazzucato. Impelled again by Japan’s technical proficiency in designing and manufacturing flat panel displays, the Department of Defense assembled an industry consortium to strengthen the military’s position to procure displays without security or strategic concerns.
Research carried out at the Westinghouse laboratory led to a breakthrough in LCD technology. Operating under Peter Brody and funded by the U.S. Army, the lab developed the thin-film transistor (TFT) in the 1970s. However, Westinghouse management shut down the lab, leaving the research work without a benefactor. Brody approached companies such as Xerox, 3M, IBM, DEC, Compaq and, indeed, Apple for financial support. The companies did not consider it a prudent investment against low-cost, quality Japanese alternatives. In 1988, DARPA stepped in, and Brody established Magnascreen to further develop the TFT-LCD.
With state funding from the National Institute of Standards and Technology’s (NIST) Advanced Technology Program (ATP), the Advanced Display Manufacturers of America Research Consortium (ADMARC) was established by the major display companies in the US to retain manufacturing domestically. The state bolstered this effort in the 1990s by implementing protectionist tariffs and executing purchase contracts through a range of military and non-military state agencies that supported various U.S startups.
Lithium-ion battery: With funding from the NSF and DoE, John Goodenough developed lithium-ion battery technology at the University of Texas at Austin. This work was commercialized and pushed into mass manufacturing by Sony, producing another example of a “US-invented but Japanese-perfected and manufactured-in-volume technology.” Domestically, the lack of battery technology to service the power needs of advancing electronic devices again compelled the state to intervene. The government assisted various battery companies in developing local manufacturing. This work not only enabled consumer electronics applications but would serve as the foundation for electric vehicle battery advancements as well.
These technical breakthroughs represent a very small fraction of the state’s role in the scientific and technical enterprise, which also includes infrastructure programs like the Interstate Highway System and biotechnology advancements in drugs, vaccines and other developments. Indeed, even with these few electronic and communication technologies, entire industries have been built and markets have been developed for commerce. The state’s role certainly does not end with simply producing the technical breakthroughs and providing subsidies; the very same markets are routinely protected for the benefit of private companies, which must be shielded from much-extolled free-market discipline. Not that the state needs to be lectured on it, but as NSCAI sought to remind it in the context of AI, the “future of our national security and economy are at stake.”
Mazzucato summarized, “The federal government has actively fought on behalf of companies like Apple to allow it secure access to the global consumer market, and it is a crucial partner in establishing and maintaining global competitive advantage for these companies.” Market doctrine may elevate trans-national corporations to supremacy on the global stage, but returning to the guardianship and benefaction of the state is a reflex action for private capital to navigate the world. State action includes ensuring unfettered access to foreign markets, as the U.S Government has admirably provided in service of corporations including Apple. For instance, when Apple faced challenges entering the Japanese market, “the company called on the US government for assistance, arguing that it was the government’s obligation to assist the company in opening the Japanese market to US products by appealing to the Japanese government.” Incommensurate with these public boons and lavish gifts, the state also extends to private capital a wide range of tax deferrals, resulting in many technology companies not paying any federal taxes annually. As a 2019 study by the Institute on Taxation and Economic Policy showed, sixty profitable Fortune 500 companies avoided all federal income taxes in 2018, including technology companies such as Amazon, IBM, Activision Blizzard, Salesforce and Netflix.
As Mazzucato explained in a 2013 Harvard Business Review piece, the U.S. Government’s active role in building the technological enterprise “stands in stark contrast to the steps that Apple, Google, and other technology companies take to avoid paying taxes.” For instance, by basing a subsidiary in Reno, Nevada, Apple can recognize U.S. sales through a state that does not levy corporate taxes, yielding $2.5 billion in skipped taxes. Furthermore, a “convoluted tax structure” called the Double Irish With a Dutch Sandwich allows corporations like Google and Apple to skip taxes on overseas earnings. Then-Google chairman Eric Schmidt said he was “very proud” of this arrangement, adding, “It’s called capitalism.”
It should be noted that, as of 2013 and for the first time in the post-WWII era, the federal government no longer funds a majority of the basic research carried out in the United States. It still contributes the largest slice of the pie, however, with the remainder made up of corporate, university and private foundation funding. The federal share, which topped 70% throughout the 1960s and 1970s, stood at 61% as recently as 2004 before falling to 44% in 2013. U.S. businesses contributed 31%, largely driven by a surge in pharmaceutical spending. U.S. corporations dedicate their spending to profitable applications; namely, the conversion of basic research to meet a commercial objective.
These shifts over the decades should be immaterial, if we are to adhere to market doctrine which requires that the intrepid, risk-taking investors should be rewarded. If the public, through the state, risks prodigious spending over the course of decades to produce scientific and technical breakthroughs that form the basis of our high-tech economy, surely that risk should be compensated and returns made available to the public, perhaps used for reinvestment into other programs? It’s a novel concept; I hear “it’s called capitalism.”
Next: Engineering Illusions Part II: State and Technology — II
Follow along on Twitter @ap_prose and Medium at Tech Insider for the next installment of Engineering Illusions!