
Is infinity really infinite if we can encode it in a finite number of bits?

Philosophy: Asked by boringbeing on November 5, 2021

I’m asking this question because some programming languages provide an object representing "infinity", which behaves like the mathematical infinity (e.g. multiplying it by 0 is indeterminate, and dividing a finite number by it yields 0), and of course this object is encoded in a finite number of bits.

I’m guessing that the concept of infinity encoded by a set of fixed rules does not equal infinity itself, but I would be interested in understanding this difference a little better, since superficially it almost seems like a contradiction.

4 Answers

The phrasing of your question suggests that the programming entities you are referring to are 'floating-point numbers'. If that is the case, be advised that the use of the word 'infinity' here is a misnomer. A floating-point 'infinity' value does NOT represent an actually infinite value; instead it represents any value, possibly produced by some calculation, which is too large to represent in that number-representation scheme. This is done for the convenience of people who do numerical analysis in various contexts (you would have to study numerical analysis to appreciate this fully).
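
For concreteness, here is a minimal sketch of that behaviour in Python, whose math.inf is the IEEE 754 double-precision infinity; the printed results follow standard IEEE 754 arithmetic:

```python
import math

inf = math.inf        # IEEE 754 "infinity": a fixed 64-bit sentinel value

print(1e308 * 10)     # inf  -- a too-large result collapses to the sentinel
print(inf * 0)        # nan  -- indeterminate, as the question observes
print(1.0 / inf)      # 0.0  -- a finite number divided by "infinity"
print(inf > 1e308)    # True -- it compares greater than every finite float
```

The sentinel obeys a handful of fixed rules; nothing about it is infinite except the role those rules assign to it.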

Answered by PMar on November 5, 2021

Imagine there is a hole. The hole has a specific location. Its width and breadth are known exactly. But it happens to be infinitely deep. Clearly there are some aspects of this hole we can describe usefully and succinctly. But we aren't able to plumb its depths. Anything dropped into it will never reach the bottom.

Similarly, we can describe the square root of two using just two well-understood, commonly used mathematical symbols. But we can’t ever reach the end of its decimal expansion. There are a lot of meaningful things we can do with the square root of two. But not that.
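
A sketch of that asymmetry in Python, assuming nothing beyond the standard library: any finite prefix of the expansion is computable with exact integer arithmetic, yet no amount of computing exhausts it.

```python
from math import isqrt

def sqrt2_prefix(n):
    """First n decimal digits of sqrt(2), via exact integer arithmetic."""
    return str(isqrt(2 * 10 ** (2 * (n - 1))))

print(sqrt2_prefix(30))   # 141421356237309504880168872420
```

Any particular n is reachable; the expansion as a whole is not.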

Answered by Chris Sunami supports Monica on November 5, 2021

Yes. An ideal computer (like a Turing machine) trying to count up to the symbol infinity would not terminate, but would run infinitely long, whereas it would terminate for any finite number. (A non-ideal computer would halt earlier with an error once it became unable to represent the next integer.)
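
A sketch of this in Python, using its unbounded integers to stand in for the ideal machine’s unbounded tape (the commented-out call is the one that never returns):

```python
def count_to(limit):
    """Count 0, 1, 2, ... and halt once the limit is reached."""
    i = 0
    while i < limit:
        i += 1
    return i

count_to(10 ** 6)          # halts: any finite bound is eventually passed
# count_to(float("inf"))   # never halts: every int compares less than inf
```

A real machine with fixed-width integers would instead fail somewhere along the way, which is the 'halt earlier with an error' case above.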

Answered by tkruse on November 5, 2021

The last century or two of mathematics made a lot of progress by making clearer what we mean when we say anything that contains "infinite" or "infinity". One can in fact always rephrase such statements in other terms. (This typically involves saying certain things can be done with arbitrarily large finite quantities, or even arbitrarily small ones; it's complicated, but well-worn.) In doing so, we explain what we mean in a finite amount of information.
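
A standard instance of such a rephrasing, written out in LaTeX: the claim that a sequence "tends to infinity", once unpacked, mentions no infinite object at all and only quantifies over arbitrarily large finite bounds.

```latex
\lim_{n \to \infty} a_n = \infty
\quad\Longleftrightarrow\quad
\forall M \in \mathbb{R} \;\; \exists N \in \mathbb{N} \;\;
\forall n \ge N : \; a_n > M
```

Everything on the right-hand side is a statement about finite numbers.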

In general, descriptions, definitions etc. don't inherit the properties of what they talk about, because descriptions are a very specific kind of thing. To take a less mysterious example, descriptions of blue aren't blue; and descriptions of descriptions of blue aren't descriptions of blue either.

Even though there are infinitely many integers, any integer can be specified with finitely many bits of information: first specify how many binary digits it has (and how many digits that count has, and so on, until the count gets down to 1), then state what the digits are. In general a real number takes infinitely many digits to specify (or is unspecifiable, if you prefer to define "specifiable" as "specifiable in a finite number of digits"), but you can show that two real numbers differ by citing just finitely many digits: compare the two expansions until a difference comes up.
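
A sketch of that last claim in Python, reusing the integer-square-root trick from the earlier example (the comparison value 1.4142135623 is an arbitrary choice for illustration):

```python
from math import isqrt

def sqrt2_digit(k):
    """k-th decimal digit of sqrt(2), counting the leading 1 as digit 0."""
    return isqrt(2 * 10 ** (2 * k)) % 10

# sqrt(2) vs. the rational 1.4142135623: cite digits only until they differ.
approx = "14142135623000000000"
for k, d in enumerate(approx):
    if sqrt2_digit(k) != int(d):
        print(f"first difference at digit {k}: {sqrt2_digit(k)} vs {d}")
        break
```

Finitely many digits separate any two distinct reals, even though infinitely many would be needed to pin either one down.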

Answered by J.G. on November 5, 2021
