How Information is Measured and Why God is the Source of Every Doubt
- Deodato Salafia
- Jul 28, 2024
- 4 min read
Updated: Mar 11

We live in an increasingly digital world, where we not only create and consume information but also manipulate it formally. Knowing that my mom will prepare meatballs on Sunday is a fact, while holding a smartphone with a message containing this information provides a formal representation of it.
We are witnessing the era of generative AI, where formal systems generate content and information that we consider to be of the highest quality. Until now, computers provided precise answers to precise questions; today, they create content in response to generic queries. Soon, they will likely generate the questions themselves and provide answers driven by our latent, rather than explicit, needs.

What is Information?
Information is the reduction of uncertainty. Without uncertainty—what we might also call "doubt" or "desire for knowledge"—there is no information. Does she love me or not? I have a doubt, and information is the answer: yes or no. If I am uninterested in love and have no curiosity, then discussing information is meaningless. However, if I am generally interested in knowing who loves whom in the world, whether someone loves me, or if anyone has viewed my LinkedIn profile, then I have doubts, making it appropriate to talk about information.
If my doubt concerns something complex, there are many possibilities, and I need a lot of information to resolve it. If my doubt is simple, such as "does she love me or not?", only a minimal amount of information is needed—one bit.
A bit is the smallest unit of information and is useful for resolving very simple doubts that can be reduced to a yes or no answer, 1 or 0. For example:
Has someone viewed my LinkedIn profile today? Yes or No.
Does God exist? Yes or No.
If my doubt were about which number will be drawn first in the Naples lottery tomorrow, there are 90 possible outcomes. To communicate this information, we need an efficient encoding strategy.
Instead of turning a light on and off 28 times (if the number 28 were drawn), a more efficient approach would use binary encoding. Since 2^7 = 128 covers all 90 outcomes, just 7 flashes of light, each representing either 0 or 1, could transmit any number from 1 to 90. This efficiency is fundamental to the formalization of information.
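The flashing-light scheme can be sketched in a few lines of Python (the function name `encode_flashes` is my own, chosen for illustration):

```python
import math

def encode_flashes(number: int, outcomes: int = 90) -> str:
    """Encode a lottery number as a fixed-length string of 0/1 'flashes'."""
    # Bits needed: the base-2 logarithm of the number of outcomes, rounded up.
    bits = math.ceil(math.log2(outcomes))
    return format(number, f"0{bits}b")

print(encode_flashes(28))  # "0011100" — seven flashes instead of twenty-eight
```

Seven symbols suffice for any of the 90 numbers, which is exactly the efficiency gain the binary encoding buys us.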
In short, information resolves doubt through an optimized formalism. Humans naturally seek efficiency, which is why information is often encoded in binary (0s and 1s)—it is easy to understand and represent physically (on or off).
Your Doubts and Questions Have Weight
We can measure information using a model first introduced by Ralph Hartley and later refined by Claude Shannon. Hartley’s method states that the number of bits required to resolve a doubt is the logarithm (base 2) of the total possible outcomes, rounded up. This means that:
If I wonder which of 4 identical quadruplets will graduate first, assuming equal probabilities, I need 2 bits to resolve the doubt.
If I know that two of the four will never graduate, then I need only 1 bit.
If none of them are even enrolled, then I need zero bits—because there is no uncertainty.
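Hartley's rule from the three cases above is a one-liner; a minimal sketch (the helper name `hartley_bits` is mine):

```python
import math

def hartley_bits(outcomes: int) -> int:
    """Bits needed to resolve a doubt with equally likely outcomes."""
    if outcomes <= 1:
        return 0  # no uncertainty means no information is needed
    # Base-2 logarithm of the number of outcomes, rounded up.
    return math.ceil(math.log2(outcomes))

print(hartley_bits(4))  # 2 bits: four possible graduates
print(hartley_bits(2))  # 1 bit: only two could graduate
print(hartley_bits(1))  # 0 bits: no uncertainty at all
```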
Shannon improved this method by considering probability distribution. If some outcomes are much more likely than others, then less information is needed. For example:
If 14 of 16 siblings are men, but the 2 women are each 100 times more likely to graduate first, then according to Shannon the uncertainty is only about 1.5 bits, instead of the 4 bits required if all had an equal chance.
Shannon’s method provides a theoretical limit to information weight, but in practice, binary representation (bits) might be inefficient for complex data. Still, we use bits because they are easy to understand and physically implement (on/off systems).
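Shannon's measure is the entropy H = -Σ p·log2(p). A sketch applying it to the graduation example above (the weights encode "each woman is 100 times more likely than each man"):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# 14 men (weight 1 each) and 2 women (weight 100 each), as in the example.
weights = [1] * 14 + [100] * 2
total = sum(weights)
probs = [w / total for w in weights]

print(round(shannon_entropy(probs), 2))  # ~1.53 bits, versus log2(16) = 4 for a uniform chance
```

The skewed distribution collapses four bits of Hartley uncertainty down to about one and a half, which is Shannon's point: the more predictable the outcome, the less information its resolution carries.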

Before Man, Doubt Was Created
We have established that without doubt, there is no information. But having doubt is, in itself, information. At the beginning of this article, we discussed the informational flow as a possible representation of life. Then, we explored how information is measured. However, we have yet to address the origin of the first doubt, the doubt that necessitates the formalization of information.
We are not debating whether God exists, as that would be a simplistic discussion. Instead, we pose a formal question: what created the first doubt?
René Descartes famously wrote, "I think, therefore I am." But perhaps a better formulation would be, "I doubt, therefore I am." Some philosophers, mystics, and even ordinary individuals believe that we live in a simulated world—a concept popularized by the movie The Matrix. In such a world, it would be clear who created the informational flow.
In the physical world, there is no doubt, and therefore no information. Information exists only as a response to doubt, and doubt itself is information.
By studying information science—ignoring chemistry or physics—we arrive at the formal necessity of God, the originator of the first doubt. Humanity's story is, in essence, an informational story. If miracles exist, they would be akin to software patches fixing system bugs.