What is a number?

Carlo C.
8 min read · May 13, 2024


Image by the author, created with ideogram.ai

The concept of number is so ingrained in our everyday experience that we often take its complexity and depth for granted. Yet, since the dawn of philosophical thought, the nature of number has fascinated mathematicians, philosophers, and scientists. In this article we will explore some fundamental stages in the history of the concept of number, from ancient philosophy to modern computational theories, passing through etymology and the different numbering systems. We will discover how number is not only a practical tool for counting and measuring, but also an abstract concept that opens the door to far-reaching metaphysical and mathematical speculation. We will see how the debate on the nature of number has influenced the development of scientific thought and how it remains at the center of important epistemological and philosophical questions. Finally, we will reflect on how an understanding of number can enrich the way we interpret reality and think about the relationship between mind and world.

1. Number in Ancient Philosophy

In ancient Greek philosophy, the concept of number took on a central and foundational role, particularly in the Pythagorean school. For the Pythagoreans, numbers were not mere instruments of calculation but genuine ontological principles, the ultimate essences of reality. They saw in number the archē, the original and constitutive principle of the universe. This conception was based on the observation of the mathematical regularities and proportions present in nature, from music to the celestial orbits. The Pythagoreans attributed symbolic and mystical meanings to numbers, associating them with concepts such as harmony, justice, and perfection.

Another great Greek philosopher who dealt with the nature of numbers was Plato. In his metaphysical system, numbers occupy an intermediate place between the sensible world and the intelligible world of ideas. For Plato, numbers are ideal entities, eternal and immutable patterns that transcend empirical reality. They represent the essence of things, their rational and unchanging structure. Knowledge of numbers and mathematics is, for Plato, preparatory to the elevation of the soul towards the contemplation of the supreme ideas, in particular the idea of the Good.

A different perspective is that of Aristotle, who criticizes the Pythagorean and Platonic doctrines of number. For the Stagirite, numbers do not have an autonomous and separate reality but are always tied to concrete, sensible entities. Aristotle gives number a more scientific and naturalistic meaning, treating it as the result of abstraction from the quantitative properties of objects. Numbers are universals that exist only in the mind, not substances in their own right. This view would profoundly influence medieval and modern mathematics and philosophy.

The ancient debate about the nature of numbers laid the foundation for the later development of the philosophy of mathematics and science. The Pythagorean and Platonic conceptions have inspired idealist and rationalist currents, while the Aristotelian approach has favoured more empiricist and nominalist views. Even today, questions such as the existence of mathematical entities, the relationship between mathematics and reality, and the logical foundation of arithmetic are at the center of heated epistemological and ontological debates. Understanding the ancient roots of these problems can help us better frame the current challenges of the philosophy of mathematics.

2. The etymology and original meaning of the word “number” (arithmos)

Exploring the etymology of a word can often shed light on its deeper meaning and historical evolution. In the case of the term “number”, the linguistic root takes us back to the Latin “numerus” and the ancient Greek “ἀριθμός” (arithmós). The Latin “numerus” is commonly traced to the Proto-Indo-European root *nem-, meaning “to assign” or “to distribute”, while “arithmós” is generally connected to a root meaning “to count” or “to reckon”. In both cases, the concept of number is linked from the beginning to the idea of distributing and counting, of subdividing a quantity into discrete parts.

In particular, the Greek word “arithmós” originally had a narrower meaning than the modern concept of number. It indicated a “measured plurality”, a defined quantity composed of distinct units. For the Greeks, number was not an abstract and purely mathematical entity, but was always linked to concrete and tangible objects. Not surprisingly, the word “arithmós” was often used in the plural to indicate a set of countable things, such as “arithmoí” (numbers) of sheep or coins.

In addition, the original meaning of “arithmós” excluded concepts that we now consider an integral part of the notion of number, such as fractions, irrational numbers, and zero. For the Greeks, numbers were essentially positive integers, and the idea of fractional or incommensurable quantities was viewed with suspicion and difficulty. Only gradually, with the development of Hellenistic and then Arabic mathematics, did the concept of number extend to include these cases.

This semantic evolution of the term “number” also reflects a progressive process of abstraction and formalization of mathematical thought. From numbers as concrete entities tied to sensible experience, we have gradually moved to a more general and symbolic conception, in which numbers are seen as purely ideal and formal objects that can be manipulated according to logical and algebraic rules. However, the original link between number and empirical reality, and with the act of counting and measuring, has never been lost; indeed, it still constitutes the intuitive foundation of our understanding of arithmetic.

3. Number and computation

In the twentieth century, with the development of computer science and computational theory, the concept of number took on a new meaning and relevance. In this context, numbers are no longer just abstract mathematical objects, but become operational tools for encoding, processing, and transmitting information. Mathematics, and in particular number theory, provides the conceptual foundations for defining what it means to compute and for analyzing the properties of algorithms and formal languages.

At the heart of computation is the idea that numbers, and more generally mathematical symbols, can be manipulated in a mechanical and formal way, following precise and unambiguous rules. Algebra teaches us that it is possible to perform operations on numbers and variables in a purely syntactic way, without having to refer to their concrete meaning. This principle underpins the operation of computers, which process sequences of bits (binary numbers) by applying algorithms defined in formal programming languages.
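To make this idea concrete, here is a minimal sketch in Python (my own illustration, not drawn from any particular system) of addition carried out as a purely syntactic game on binary digits: only formal rules on bits (XOR and carry) are applied, without ever invoking the ordinary meaning of “+”. It assumes nonnegative integers.

```python
def add_bits(a: int, b: int) -> int:
    """Add two nonnegative integers by manipulating their binary digits only."""
    while b != 0:
        carry = (a & b) << 1   # positions where both bits are 1 produce a carry
        a = a ^ b              # digit-wise sum, ignoring the carry
        b = carry              # repeat until no carry remains
    return a

print(bin(13), bin(9))    # 0b1101 0b1001
print(add_bits(13, 9))    # 22, the same result as 13 + 9
```

The procedure never “knows” what thirteen or nine are; it only rewrites strings of bits according to fixed rules, which is essentially what a processor does on a much larger scale.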

The theory of computation studies the fundamental properties of these languages and algorithms, such as decidability, computability, and complexity. It asks which problems can be solved by a computer, what the intrinsic limits of computation are, and how to measure the efficiency and scalability of algorithms. All of these questions are deeply mathematical in nature and crucially involve the concept of number and its algebraic and combinatorial properties.

But the link between number and computation is not limited to theoretical aspects. Many practical applications of computer science, from cryptography to data compression, from computer graphics to artificial intelligence, make extensive use of mathematical tools and results from number theory. For example, encryption algorithms often rely on computationally hard problems involving prime numbers, discrete logarithms, or elliptic curves. Mathematics thus permeates every level of computational science, from theory to practice, and number constitutes, in a sense, its basic alphabet and grammar.
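As a purely illustrative sketch of this connection, the following Python fragment mimics, in miniature, the idea behind discrete-logarithm cryptography (a Diffie-Hellman-style exchange). The prime and the secret exponents are toy values chosen for readability and are far too small to be secure; real systems use primes hundreds of digits long.

```python
p, g = 23, 5                       # small public prime and generator (toy values)

alice_secret, bob_secret = 6, 15   # private exponents, never exchanged
A = pow(g, alice_secret, p)        # Alice publishes g^a mod p
B = pow(g, bob_secret, p)          # Bob publishes g^b mod p

# Each side combines the other's public value with its own secret exponent:
shared_alice = pow(B, alice_secret, p)
shared_bob = pow(A, bob_secret, p)
assert shared_alice == shared_bob  # both obtain g^(a*b) mod p
print(shared_alice)                # 2

# Recovering a secret exponent from A or B alone is the discrete-logarithm
# problem, believed to be computationally hard for large primes.
```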

4. Numbering systems and analogies

A fascinating aspect of the history of number is the variety of numbering systems developed by different cultures and civilizations. Each numbering system represents a particular way of writing and manipulating numbers, with its own syntactic rules and expressive potential. Some systems, such as the Roman one, are additive and non-positional, while others, such as the Indo-Arabic system we use today, are positional and rely on the place-value principle, in which a digit's value depends on where it appears. The sketch below illustrates the contrast.
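Here is a small Python sketch, added purely for illustration, that writes a number additively in Roman notation and reads a digit string with the place-value rule:

```python
ROMAN = [(1000, "M"), (900, "CM"), (500, "D"), (400, "CD"),
         (100, "C"), (90, "XC"), (50, "L"), (40, "XL"),
         (10, "X"), (9, "IX"), (5, "V"), (4, "IV"), (1, "I")]

def to_roman(n: int) -> str:
    """Additive notation: stack symbols of fixed value until n is exhausted."""
    out = []
    for value, symbol in ROMAN:
        while n >= value:
            out.append(symbol)
            n -= value
    return "".join(out)

def positional_value(digits: str, base: int = 10) -> int:
    """Place-value rule: each step to the left multiplies the accumulated value by the base."""
    value = 0
    for d in digits:
        value = value * base + int(d, base)
    return value

print(to_roman(1994))            # MCMXCIV (with the usual subtractive pairs CM, XC, IV)
print(positional_value("1994"))  # 1994: the two 9s are worth 900 and 90, depending on position
```

In the Roman string the symbols essentially stack up by value wherever they stand, while in the positional string an identical digit changes its weight with its place, which is precisely what makes arithmetic with large numbers so much easier.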

Comparing different numbering systems can help us better understand the features and benefits of the positional decimal system, which has gradually established itself as a universal standard. But it can also stimulate interesting reflections on the nature of number and its relationship with language and thought. In particular, the analogy between the numbering system and the alphabet has often been used to highlight the symbolic and combinatorial character of arithmetic.

Just as the letters of the alphabet are the building blocks of meaningful words and sentences, numerical digits are the basic building blocks for representing and working with numbers. The rules for composing and transforming digits can be seen as a kind of grammar or syntax of the mathematical language. This analogy suggests that learning to master numbers and calculation is somewhat like learning a new language, with its own vocabulary, structure, and internal logic.
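A quick illustration of this “same alphabet, different languages” idea: the very same string of digits can be read in several bases, and its value changes with the grammar used to read it.

```python
# The same digit string interpreted under different positional "grammars" (bases)
for base in (2, 8, 10, 16):
    print(base, int("101", base))   # 5, 65, 101, 257
```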

While natural language has a very wide field of use, numbering has a much narrower scope. On the other hand, natural language carries ambiguity (polysemy), while a number has only one meaning: its quantity. Moreover, mathematics (and numbering) is a tool of science, but it is not a science in itself.

In fact, a number by itself, when not associated with real entities, can only be valid internally, that is, consistent with the rules of mathematics.

Other interesting analogies are those between numbers and physical quantities, such as lengths, areas, and volumes, or between numbers and geometric points on a line or plane. These analogies highlight the deep link between arithmetic and geometry, and the role of numbers in describing and modeling space and its properties. However, each analogy has its limits and must be used with caution, to avoid flattening the specificities of the concept of number or conflating different levels of discourse. Numbers remain sui generis mathematical entities, with their own irreducible complexity and abstractness.

5. Conclusion

In this article, we have explored some fundamental stages in the history and meaning of the concept of number, from ancient philosophy to modern computational theories. We have seen how the question of number has been at the center of profound metaphysical, epistemological, and mathematical reflections, and how it has influenced the development of Western scientific and philosophical thought. We also analyzed the etymology of the term “number” and its original connection to the idea of discrete and measurable quantity, and compared different numbering systems and some illuminating analogies for understanding the nature of numbers.

What emerges from this path is the complexity and richness of the concept of number, which resists any single, reductive definition. Numbers are at once abstract entities and concrete tools, ideal objects and operational symbols, models of reality and constructs of the mind. They somehow embody the mystery of the relationship between thought and world, between logic and reality, between theory and practice. To fully understand the nature of numbers therefore means to question the very foundations of human knowledge and action.

But the importance of numbers is not limited to the philosophical or scientific sphere. Numbers pervade our daily lives, our social relationships, our economic and political choices. They are indispensable tools for quantifying, ordering, predicting, and deciding. In a world increasingly dominated by data and algorithms, a solid understanding of mathematics and numerical reasoning becomes a crucial skill for fully exercising one's citizenship and freedom.

We hope this article has piqued your curiosity and your desire to delve further into this fascinating topic. The history of number is, after all, the very history of human thought, of its effort to understand and master reality through language and calculation. Continuing to explore this history therefore means continuing to explore ourselves, our potential and our limits as rational and creative beings. And who knows, perhaps this exploration still holds new surprises and discoveries for us, new ways of seeing and thinking about the world through the lens of numbers. Starting from the first tool available to human beings: themselves.



Written by Carlo C.

Data scientist, avidly exploring ancient philosophy as a hobby to enhance my understanding of the world and human knowledge.
