
Three Lessons from the Holocaust for Young Technologists

by Spencer Doyle, Leah Kaplan, and Emma Pan, 2023 Design & Technology Fellows

We are three researchers in various fields of technology who had the privilege of participating in a two-week-long fellowship in Europe on the topic of professional ethics. This FASPE fellowship exposed us to the history of the Holocaust from the perspective of the perpetrators (especially the professional class of doctors, lawyers, and scientists) and asked us to reflect on ethics in our fields today.

From large language models to quantum computers, genome editing, autonomous vehicles, and virtual reality, we live in a time of many and diverse innovations. While these hold potential for positive social outcomes, with complexity comes unpredictability. The unintended consequences of our innovations may leave us wishing we had never invented them in the first place.

History is filled with lessons for those willing to listen. By reckoning, at these historical sites, with the role of scientists and engineers in enabling the ultimate tragedy of our time, we heard more than just lessons: a heartrending wail echoed through the camps, towns, and ruins. Although we will be reflecting on those cries for a lifetime, we feel a responsibility to share a condensed version of our collective experience in the form of in-progress lessons for modern technologists.

Lesson 1: Don’t let the title “technologist” fool you—our jobs are just as social as they are technical.

As we design new hardware and software, we also design new ways for people to interact with each other, both digitally and physically. Even the professional norms we develop in the course of R&D propagate, through social connections in the workplace, to future researchers and projects.

When studying the Holocaust, the inescapable connection between the technological and the social is visible in essentially every case study. One, however, stands out: the company responsible for enabling the large-scale burning of bodies at concentration camps.

As cremation gained popularity in the early twentieth century, Topf and Sons depicted the development of crematoria as a means of bringing “dignity to death.”1 When tasked by the Nazi regime in 1939 with providing ovens for its camps, the company went above and beyond. It offered redesigned crematoria capable of much more efficient operation, going so far as to provide unsolicited advice on how to improve the venting of the gas chambers to speed up the killing process.

This was the corporate culture of Topf and Sons, emphasizing innovation and technological perfection above all else. In a 1938 letter from the Topf Brothers to their employees (a year before the company would begin testing and installing crematoria in concentration camps), they highlighted this operating principle: “this corporation always puts invention, creativity and proficiency before capital.”2 Indeed, this sentiment is clear from their collaboration with the SS: such contracts never accounted for more than 3% of the company’s income.3

The engineers were in it for the opportunity to innovate, taking on what philosopher Zygmunt Bauman describes as a technical, rather than a moral, responsibility.4

Bringing our moment back into focus: how different are we as engineers and technologists today?

Pessimistically, the so-called Silicon Valley model of innovation5 has encouraged an operating principle well summarized by one of its most successful proponents and beneficiaries, Mark Zuckerberg: “Move fast and break things.” Such a motto mirrors the rationale of companies like Topf and Sons in their choice to help realize the Nazi regime’s “Final Solution”: the genocide of Europe’s Jewish people.

Optimistically, we can learn from our recent history and reflect on further industrial and digital developments. With this in mind, we can see that moral and technological responsibilities are not interchangeable. If you find yourself working insistently on technical problems without pausing over social or ethical considerations, reflect on why that might be. Does it benefit your employer? Is it easier for you?

By understanding where this separation of responsibilities comes from, we can better modify our practices and institutions to move towards more socially cognizant innovation.

Lesson 2: Scale deliberately and iteratively to minimize harm.

Technologists often assert that a key contribution of their profession is improving scale and efficiency. The subtle implication is, of course, that scale and efficiency are inherently positive goals. At minimum, these do seem intimately tied to examples of modern technological achievement, such as skyscrapers, global communications networks, and highly automated assembly lines.

Recently, many have lauded large language models for their broad applicability, which promises widespread growth and more efficient task completion. In a world that seems to strive for bigger and faster everything, scale and efficiency have become key measures of performance.

Yet while we celebrate these achievements, we often overlook technology’s capacity to enable harm at an equally large scale.

Technology did not create Nazi prejudice. But it did allow for atrocities at scales hitherto unfathomable.

On January 20, 1942, in a Berlin suburb, fifteen Nazi Party and government officials discussed how to handle the approximately 11,000,000 Jews of Europe. This cold exercise in calculation is now referred to as the Wannsee Conference.6 Officials in attendance raised concerns about the logistical difficulties of “evacuating” (a euphemism for murder) such a large number of people. The Nazis had a problem. Technology promised a “solution” in the form of gas chambers. While the Nazis were already committing mass murder before the Wannsee Conference, the subsequent scale of killing was made possible in large part by new, fiendish technologies.

What, then, is inherently good about scale and efficiency? And what determines whether they amplify benefit or harm?

These questions apply not only to how we think about technology design but also to how we reflect on our own individual roles as technologists. Raised on rhetoric about engineers saving the world, many of us set out to create large-scale change through our work. Indeed, we may even find ourselves motivated by one of the National Academy of Engineering’s 14 Grand Challenges (the promise of personal and professional grandeur embedded even in their name). From global pandemics to worsening climate change, we all feel a sense of urgency to create change—and fast!

Yet chasing such an impact often means ceding control of the shape our labor takes. The technologies we develop eventually leave the lab (or, more often now, an open-concept office space) and permeate society, entwining themselves in global problems and existing power structures. The larger the scale, the less we may be able to adjust and the more harm may come. We have our entire careers to work toward positive change. We should consider starting out by focusing on smaller-scale effects or slowing down to create change iteratively—and ideally collaboratively.

Lesson 3: Examine whose voices are left out of the design process and find ways to engage with them.

What we know about our impact as technologists depends on who we care about enough to talk to. For example, people who are not “users” of a product are often left out of user research studies, even if they are affected by the product. What’s more, the diversity of those included in user-research studies can vary greatly based on how much time the study is given, who is contacted, and who can afford to participate. Unintended consequences arise when designers fail to consider the perspectives of people who are not “target users.” These consequences often disproportionately harm minority communities.

While studying the Holocaust, we were struck by the importance of knowing the impact of one’s work. During this period, gas chambers were kept in remote locations, largely hidden from society. In these chambers, a cyanide-based pesticide named Zyklon B enabled the Nazis to murder with speed and at scale. At the beginning of WWII, Degesch, the German pest-control company behind Zyklon B, sold the pesticide to concentration camps to prevent the spread of infections and disease. These chemicals eventually became a means of mass extermination. Carl Wurster, the chairman of the Degesch board of directors, was acquitted of all charges at the Nuremberg trials. The website of BASF, a corporate successor to IG Farben, which part-owned Degesch, states that “the records still preserved and witness accounts give no indication that Carl Wurster knew of the misuse of pesticides for industrial mass extermination […A]s the war progressed, more and more people were housed in camps so it was to be expected that the demand would rise for pediculicides and other special pesticides.”7

If Wurster truly did not know that his company’s most profitable product was being used for mass murder, would he have halted production of Zyklon B had he been aware? Would the engineers who designed Zyklon B have chosen not to develop a stronger pesticide had they known its intended use? Regardless of whether they knew the true and horrifying impact of their product, both the leaders and the engineers of Degesch had the power to slow, or even halt, the production of Zyklon B. Such resistance could have saved countless lives.

By investing time and resources into learning about how technology can be used and who can be affected by it, we can acquire the knowledge needed to prevent harmful outcomes. By exploring the experiences of people who are directly and indirectly affected by an innovation, we can better understand the breadth of its impact. By listening to those who are historically excluded, we can predict and prevent unintended consequences.

It can take time and resources to collect a variety of experiences, but designing for minority and excluded populations can improve outcomes for everyone. Curb cuts, for instance, were originally created to make sidewalks accessible to people who use mobility devices like wheelchairs. They now make it easier for everyone to move onto sidewalks, especially when toting suitcases, strollers, and other items with wheels.

We can improve the design process and prevent harm if we keep underrepresented peoples informed, interview them, and consult with them. Just talking to people, however, is not enough. Underrepresented opinions and experiences need to be taken seriously and translated into more inclusive products. By hiring people with relevant experiences, we can bring important perspectives to the decision-making table.

The above lessons are not silver bullets, nor are they an exhaustive list of what an employee or organization must do to be a “good” innovator. They are, nevertheless, a good place to start.

We as technologists get to decide what we want our priorities to be. In an era of rapid innovation, we must choose to prioritize the impact of our work and the world we are a part of rather than innovation itself. Our future depends on it.


Spencer Doyle was a 2023 FASPE Design and Technology Fellow. He is a PhD candidate in physics at Harvard.

Leah Kaplan was a 2023 FASPE Design and Technology Fellow. She is a PhD candidate in systems engineering at George Washington University.

Emma Pan was a 2023 FASPE Design and Technology Fellow. She is a software engineer, currently working at Microsoft on Seeing AI, an assistive app for people with visual impairments.


Notes

  1. Karen Bartlett, “Inside the Company Behind the Nazi Concentration Camp Ovens,” TIME, Aug. 21, 2018.
  2. Annika Van Baar and Wim Huisman, “The Oven Builders of the Holocaust: A Case Study of Corporate Complicity in International Crimes,” The British Journal of Criminology 52, no. 6 (2012): 1041.
  3. Bartlett, “Inside the Company.”
  4. Zygmunt Bauman, Modernity and the Holocaust (Ithaca, NY: Cornell University Press, 1989), 98.
  5. Sebastian M. Pfotenhauer and Sheila Jasanoff, “Traveling imaginaries: The ‘practice turn’ in innovation policy and the global circulation of innovation models” in The Routledge Handbook of the Political Economy of Science, eds. David Tyfield, Rebecca Lave, Samuel Randalls, Charles Thorpe (London: Routledge, 2017), 416–428.
  6. United States Holocaust Memorial Museum, “Wannsee Conference and the ‘Final Solution’,” Holocaust Encyclopedia, accessed June 2023. https://encyclopedia.ushmm.org/content/en/article/wannsee-conference-and-the-final-solution
  7. “Chemical Warfare Agents and Zyklon B,” BASF, accessed July 8, 2023, https://www.basf.com/global/en/who-we-are/history/chronology/1925-1944/1939-1945/kampfstoffe-und-zyklon-b.html.