“We have entered a brave new world. It’s not only privacy that has become a concern as data gathering and analysis proliferate. Because algorithms—those little bits of machine code that increasingly mediate our behavior via our phones and the Internet—aren’t simply analyzing the data that we generate with our every move. They are also being used to actively make decisions that affect our lives” (The Ethical Algorithm, Michael Kearns and Aaron Roth, 2019)

In this section, we explore the history of sorting algorithms. Focusing predominantly on Mergesort and Quicksort, we will examine the historical impact of these algorithms and how they continue to shape how we think about programming today.

Early Origins of Sorting:

From efficiency challenges in computing to decision-making in our daily lives, sorting is a fundamental process that has shaped computer programming since its inception. The origins of modern sorting algorithms trace back to radix sort, developed in 1887 by the German-American statistician, inventor, and businessman Herman Hollerith.

Herman Hollerith (1860-1929) (Wikipedia).

What is Radix Sort?

Deriving from the Latin word “radix” meaning root, a radix is the mathematical term for the base of a number. In other words, a radix represents how many digits are used in a number system. So for the decimal number system, the radix is 10, whereas for binary, the radix is 2. Radix Sort is a non-comparison sorting algorithm that organizes numbers by looking at their digits one at a time, starting from the smallest place value (like ones, tens, hundreds). It groups numbers based on each digit and sorts them step by step until the whole list is in order. Instead of comparing numbers directly, it sorts them by placing them into buckets. This makes it fast for sorting large numbers, especially when they have a fixed number of digits (Joshi, 2017).
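
To make this concrete, the sketch below implements the least-significant-digit variant described above in Python. It is a minimal illustration, assuming non-negative base-10 integers; the bucket count and digit extraction would change for other radices.

```python
def radix_sort(nums):
    """Sort non-negative integers one digit at a time, least significant first."""
    if not nums:
        return nums
    place = 1                       # Current place value: ones, tens, hundreds, ...
    while max(nums) // place > 0:
        buckets = [[] for _ in range(10)]           # One bucket per digit 0-9.
        for n in nums:
            buckets[(n // place) % 10].append(n)    # Group by the current digit.
        nums = [n for bucket in buckets for n in bucket]  # Recollect in bucket order.
        place *= 10
    return nums

print(radix_sort([170, 45, 75, 90, 802, 24, 2, 66]))
# [2, 24, 45, 66, 75, 90, 170, 802]
```

Because each pass keeps tied numbers in their existing order (the grouping is stable), later digit passes never undo the work of earlier ones.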

Hollerith originally created radix sort to handle census data in the late 19th century, developing a punched card system that could efficiently process and analyze enormous volumes of records. Inspired by railroad tickets, his approach used mechanical sorting machines called tabulating machines that relied on the radix sort principle, grouping and sorting punched cards based on digit positions. This method laid the foundation for computational sorting and automated data processing (Gries, 2018).

Tabulating Machine (Wikipedia).

Radix sort became more widely known in computer science as an efficient sorting algorithm for large datasets, particularly in applications where numbers have a fixed length. Unlike comparison-based sorting algorithms such as quicksort or mergesort, radix sort runs in linear time under certain conditions—O(d(n + k)) for n keys of d digits in base k—making it useful for tasks like processing financial records, digital signal processing, and large-scale data organization.

Mergesort:

Mergesort, one of the most well-known and foundational sorting algorithms, was developed by John von Neumann in 1945. A Hungarian-American mathematician, physicist, and computer scientist, von Neumann made groundbreaking contributions to numerous fields, including early computing and algorithmic design.

John von Neumann (1903-1957) (Wikipedia).

Von Neumann devised Mergesort as part of his work on computational methods, particularly in the context of electronic computing and the increasing need for efficient data processing. The emergence of large-scale computing was heavily influenced by the war effort, and sorting algorithms became critical for handling vast amounts of wartime data, such as logistical coordination, codebreaking, and scientific computations (Janhangeer, 2023).

During World War II, computational efficiency was a strategic priority. The U.S. military relied on large-scale computations for tasks such as ballistic trajectory calculations, cryptographic analysis, and logistical coordination. To meet these demands, the Electronic Numerical Integrator and Computer (ENIAC), one of the first general-purpose electronic computers, was developed under the direction of the United States Army Ballistic Research Laboratory. ENIAC was primarily designed to compute artillery firing tables, which were crucial for improving the accuracy of military weapons (National Museum of the United States Army).

ENIAC (1945) (Wikipedia).

Mergesort’s divide-and-conquer approach aligned well with these computational needs, enabling efficient sorting of large datasets in predictable, sequential passes over the data—a critical advantage given the limited memory and sequential storage of the time. By optimizing sorting processes, algorithms like Mergesort played an indirect yet essential role in accelerating military computations, further reinforcing the wartime-driven evolution of computer science (Janhangeer, 2023).

Von Neumann’s formulation of Mergesort also influenced the broader field of algorithmic design, reinforcing the importance of recursion and the divide-and-conquer paradigm in computer science. This methodology would later become a cornerstone of algorithmic analysis, influencing numerous other sorting and searching techniques.
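
As a concrete illustration of that divide-and-conquer paradigm, here is a minimal Mergesort sketch in Python—a modern rendering of the idea, not von Neumann’s original 1945 formulation:

```python
def merge_sort(items):
    """Recursively split the list in half, then merge the sorted halves."""
    if len(items) <= 1:                # Base case: one element is already sorted.
        return items
    mid = len(items) // 2
    left = merge_sort(items[:mid])     # Divide: sort each half independently...
    right = merge_sort(items[mid:])
    return merge(left, right)          # ...then conquer by merging the results.

def merge(left, right):
    """Merge two sorted lists into one sorted list."""
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])            # At most one of these still has items.
    merged.extend(right[j:])
    return merged

print(merge_sort([38, 27, 43, 3, 9, 82, 10]))
# [3, 9, 10, 27, 38, 43, 82]
```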

Quicksort:

Quicksort, one of the most efficient and widely used sorting algorithms, was developed by Tony Hoare in 1960. At the time, Hoare was a visiting student at Moscow State University, working on a machine translation project for the UK’s National Physical Laboratory—specifically software for translating Russian into English. Needing a fast and efficient way to sort the words of a sentence before looking them up in a Russian-English dictionary, he devised Quicksort as a novel approach to sorting large datasets.

Tony Hoare (1934-present) (Wikipedia).

Unlike earlier sorting algorithms such as Mergesort, which were well-suited for external sorting (where data is too large to fit in memory at once), Quicksort was particularly effective for in-memory sorting. It applied the divide-and-conquer strategy in a new way, recursively partitioning the dataset around a chosen pivot, yielding a highly efficient sorting method with O(n log n) average-case complexity (though O(n²) in the worst case).
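
The Python sketch below shows the partitioning idea. For readability it builds new lists on each call rather than using Hoare’s original in-place partition scheme, so it is an illustration of the principle, not a faithful reproduction of his 1960 algorithm.

```python
def quick_sort(items):
    """Partition around a pivot, then recursively sort each side."""
    if len(items) <= 1:                     # Base case: nothing left to partition.
        return items
    pivot = items[len(items) // 2]          # Pivot choice affects speed, not correctness.
    left = [x for x in items if x < pivot]
    middle = [x for x in items if x == pivot]
    right = [x for x in items if x > pivot]
    return quick_sort(left) + middle + quick_sort(right)

print(quick_sort([33, 10, 55, 26, 4, 18]))
# [4, 10, 18, 26, 33, 55]
```

A consistently poor pivot (for example, always the smallest element) degrades this to O(n²), which is why practical implementations choose pivots carefully.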

Although Quicksort was developed independently of military projects, its invention came at a time when computing technology was rapidly expanding due to Cold War-era investments in scientific research, artificial intelligence, and high-performance computing. As computing resources became more widely available, Quicksort’s speed and adaptability made it a foundational algorithm in areas such as compiler design, database management, and systems programming.

Contemporary Uses

Sorting algorithms are deeply embedded in everyday life, shaping the way information is processed, organized, and accessed across various domains. From digital applications to real-world logistics, sorting enhances efficiency, accuracy, and decision-making (AlgorithmExamples).

Modern Applications

  • Search Engines & Digital Platforms
    Sorting algorithms help Google, YouTube, and Amazon rank search results, recommendations, and advertisements, influencing how users interact with digital content.

  • E-Commerce & Online Shopping
    Online retailers use sorting to rank products by relevance, price, or popularity, streamlining the shopping experience for consumers.

  • Healthcare & Medical Diagnosis
    Medical systems rely on sorting to prioritize patient records, lab results, and diagnostic data, aiding in faster and more accurate decision-making.

  • Education & Examination Systems
    Universities and schools use sorting algorithms to rank student applications, arrange exam scores, and allocate courses, ensuring a structured admission and grading process.

  • Logistics & Supply Chain Management
    Sorting optimizes inventory organization, warehouse distribution, and delivery scheduling, helping businesses improve operational efficiency.

  • Financial Services & Banking
    Banks and financial institutions use sorting to organize transaction data, detect fraudulent activity, and process credit scores, enhancing security and decision-making.

Sorting algorithms streamline complex systems, but their widespread use also raises ethical concerns. As reliance on sorting algorithms grows, ensuring fairness, transparency, and responsible implementation becomes increasingly important.

Ethical Considerations

Sorting algorithms like Mergesort and Quicksort now shape modern digital experiences, influencing search rankings, financial decisions, and social interactions. As algorithmic sorting expands into critical domains, ethical concerns become increasingly significant. The study “The ethics of algorithms: Mapping the debate” (Mittelstadt et al., 2016) explores six key ethical concerns of modern-day algorithms: inconclusive evidence, inscrutable evidence, misguided evidence, unfair outcomes, transformative effects, and traceability.

The Study’s Six Key Ethical Concerns (Wikipedia).

Inconclusive Evidence:

Sorting-based ranking systems often rely on probabilistic models, making decisions without complete information. In hiring and credit scoring, algorithms may reinforce historical biases rather than reflect merit, leading to self-perpetuating inequalities.

Inscrutable Evidence:

Sorting mechanisms in search engines and recommendation systems often function as “black boxes”, making it difficult to understand how rankings are determined. Lack of transparency limits user agency and trust, reinforcing filter bubbles and information asymmetries.

Misguided Evidence:

Algorithmic ranking systems inherently reflect biases in the data used to build them. Search engine rankings, financial risk assessments, and recommendation systems often reinforce societal inequalities by amplifying existing trends rather than promoting fairness.

Unfair Outcomes:

Sorting-based decision-making can lead to discriminatory pricing, biased loan approvals, and digital redlining. Users from marginalized communities may receive higher prices, lower credit scores, or restricted access based on proxies like zip codes.

Transformative Effects:

Sorting algorithms do not just organize information—they actively shape social perceptions. Newsfeeds, content recommendations, and financial forecasting models restructure digital experiences, often prioritizing engagement over accuracy.

Traceability:

As sorting algorithms become more autonomous, assigning responsibility for their outcomes becomes increasingly complex. In fields like high-frequency trading (HFT) and automated content moderation, unintended consequences can emerge without clear human oversight.

Conclusion:

Sorting algorithms are no longer neutral tools—they embed values, biases, and societal structures. Ethical algorithm design requires:

  • Transparency: Ensuring explainability in ranking mechanisms.
  • Fairness: Implementing bias-detection and fairness-aware sorting.
  • Accountability: Establishing regulatory frameworks to mitigate harm.

As sorting continues to shape digital experiences, balancing efficiency with ethical responsibility remains a crucial challenge for developers, researchers, and policymakers.

Discussion Questions:

  1. Mergesort was developed in 1945, during a time when computational advancements were increasingly applied to military and intelligence operations. Sorting algorithms like Mergesort helped improve the efficiency of war-related computations, including ballistic trajectory calculations, cryptographic analysis, and logistical planning. What are the ethical implications of using sorting algorithms in warfare? Should scientists and engineers be held accountable for how their algorithms are used?
  2. While Quicksort and Mergesort are primarily designed for efficiency, their application in real-world systems can raise ethical concerns. Select one of the six key ethical issues (inconclusive evidence, inscrutable evidence, misguided evidence, unfair outcomes, transformative effects, or traceability). How might this ethical concern manifest in a system that relies on Quicksort or Mergesort—such as credit scoring, hiring algorithms, or social media ranking? Discuss the potential harms, biases, or unintended consequences that could emerge. What steps could be taken to improve the ethical transparency and fairness of sorting-based decisions?