Take the natural numbers. Each large natural number requires some minimum amount of information to be uniquely specified, but this amount need not equal the size of the number itself. That is, some numbers are "compressible" because they exhibit regularity: a number can be defined recursively, by its prime factorization, or by its distance (where "distance" can be generalized to orders of magnitude or even deeper recursion) from another easily identified integer.

Given a formal mathematical language, is it possible to determine the amount of information in a number? Let I(N) be the minimum number of bits needed to uniquely identify a natural number N. For small numbers this seems trivial, but can this function be generalized to all natural numbers?
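As a toy illustration of the idea (not a computation of I(N) itself), one can fix a tiny, made-up description language and ask whether it describes a given N in fewer bits than plain binary. The sketch below uses expressions of the hypothetical form a**b + c and charges each component its binary length plus a small constant for the operators; every choice here (the form, the cost model, the search bounds) is an assumption for illustration.

```python
def naive_bits(n):
    """Bits needed to write n in plain binary."""
    return max(1, n.bit_length())

def power_form_bits(n):
    """Cheapest description of n as a**b + c in a toy description
    language: cost = bits(a) + bits(b) + bits(|c|) + 4 overhead bits.
    This is an illustrative stand-in, not true Kolmogorov complexity."""
    best = None
    for b in range(2, 128):
        a = round(n ** (1.0 / b))  # candidate base near the b-th root
        for aa in (a - 1, a, a + 1):
            if aa < 2:
                continue
            c = n - aa ** b
            cost = naive_bits(aa) + naive_bits(b) + naive_bits(abs(c)) + 4
            if best is None or cost < best:
                best = cost
    return best

def description_bits(n):
    """Upper bound on the information in n under this toy language."""
    return min(naive_bits(n), power_form_bits(n))
```

For example, 2**64 + 1 takes 65 bits in plain binary, but the expression 2**64 + 1 costs only about 14 bits under this cost model, while a "random-looking" number of the same size gains nothing; this is the compressibility asymmetry the question is pointing at, relative to one arbitrary language.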