A New Concept of Information Based on Heisenberg's Uncertainty Principle and Its Experimental Verification

This study is the first to use Heisenberg's energy-time uncertainty principle to define information quantitatively from a measuring perspective: the smallest error in any measurement is a bit of information, i.e., 1 (bit) = 2ΔE·Δt/ℏ. If the input energy equals the Landauer bound, the time needed to write a bit of information is 1.75 × 10⁻¹⁴ s. Newton's cradle was used to experimentally verify the information-energy-mass equivalences deduced from this concept. It was observed that the energy input during the creation of a bit of (binary) information is stored in the information carrier in the form of the doubled momentum, or the doubled "momentum mass" (mass in motion), in both classical position-based and modern orientation-based information storage. Furthermore, the experiments verified our new definition of information in the sense that the higher the energy input, the shorter the time needed to write a bit of information. Our study may help in understanding the fundamental concept of information and the deep physics behind it.


I. INTRODUCTION
In information theory, information can be interpreted as the resolution of uncertainty in the sense that it answers the question of what an entity is [1][2]. Uncertainty is inversely proportional to the probability of occurrence of an event: more uncertain events require more information to resolve their uncertainty [2]. A bit of information is "that which reduces uncertainty by half" [3].
In quantum mechanics, Heisenberg's uncertainty principle states that the more precisely the position of a Brownian particle is determined, the less precisely its momentum can be predicted from the initial conditions, and vice versa [4]. This prescription differs from classical Newtonian physics, which holds all variables of a particle to be measurable to arbitrary accuracy given sufficiently precise equipment. Heisenberg's principle also applies to energy and time, in that one cannot measure the energy of a particle precisely in a finite amount of time.
In this study, we used Heisenberg's energy-time uncertainty principle to define a bit of information quantitatively from a measuring perspective as follows:

1 (bit) = 2ΔE·Δt/ℏ,    (1)

where ΔE is the standard deviation in the energy, Δt is the standard deviation in the time, and ℏ = 1.05 × 10⁻³⁴ J·s is the reduced Planck constant (the Dirac constant). Obviously, ΔE·Δt/ℏ is dimensionless, which is consistent with information being a unitless quantity. In this view, the smallest error in measuring the energy and time of a Brownian particle is a bit of information. This new concept mathematically defines the energy-time cost of an information manipulation: the higher the energy input, the shorter the time needed to write a bit of information, and vice versa.
We defined information to be interchangeable with energy over time, where the smallest error (the fundamental limit to the accuracy) in measuring a pair of prescribed physical attributes of a particle is a bit of information, as shown in Fig.1.
The principle presented in Eq.1 states that the higher the energy input, the shorter the time needed to write a bit of information, and vice versa.
The energy term in Eq.1 can be expressed by the Landauer bound ΔE = kB·T·ln 2, where kB is the Boltzmann constant and T is the temperature [5]. This bound has been verified experimentally with a particle trapped in a double-well potential [6].
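As a quick numerical check (a sketch using standard constant values, not code from the original study), the Landauer bound at room temperature works out to roughly 3 × 10⁻²¹ J:

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # room temperature, K

# Landauer bound: minimum energy dissipated when erasing one bit
E_landauer = k_B * T * math.log(2)
print(f"Landauer bound at {T:.0f} K: {E_landauer:.3e} J")
# On the order of 3e-21 J, as quoted in the text
```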

II. POSITION-BASED INFORMATION STORAGE
The Landauer bound can be illustrated by an information carrier (an electron or another particle) moving back and forth between impenetrable barriers at the two ends of a one-dimensional nanotube [Fig.2(a)]. Assuming collisions are nearly or totally elastic, the carrier loses no kinetic energy in a collision and thus has the same speed after the collision as before it. According to the impulse-momentum theorem [7], the average force F between two collisions is defined such that

F·Δt = Δp = m0·v0 − (−m0·v0) = 2·m0·v0.    (2)

For a carrier confined to a length x, the time between two successive collisions with the same wall is Δt = 2x/v0, so the average force on that wall is

F = 2·m0·v0/Δt = m0·v0²/x.    (3)

For ideal gases (molecules, ions, etc.), macroscopic phenomena, such as temperature, can be explained in terms of the classical mechanics of microscopic particles [8][9][10]. According to the equipartition theorem, each classical degree of freedom of a freely moving particle has an average kinetic energy

(1/2)·m0·v0² = (1/2)·kB·T,    (4)

where m0 is the mass of the information carrier, v0 is the velocity of the information carrier, and kB is Boltzmann's constant [11]. In choosing an information carrier, one should bear in mind that, at a fixed temperature, the speed of a particle increases as its mass decreases.
Combining Eq.3 and Eq.4, the average force on the piston at position x is F = kB·T/x. Therefore, the work an information carrier does on a frictionless piston during information erasure [Fig.2(c)] is

W = ∫_{L/2}^{L} (kB·T/x) dx = kB·T·ln 2.    (5)

This result is the Landauer bound, which has a value of approximately 3 × 10⁻²¹ J at room temperature (300 K) [5][6].
From Eq.1, we can now use the Landauer bound to calculate the time needed to write a bit of information as follows:

Δt = ℏ/(2ΔE) = (1.05 × 10⁻³⁴ J·s)/(2 × 3 × 10⁻²¹ J) ≈ 1.75 × 10⁻¹⁴ s.    (6)

This calculation agrees reasonably with the picosecond timescale demonstrated by ultrafast magnetization reversal [12].

FIGURE 2. Position-based information storage. In (a), a free state (without any written information) exists, in which the information carrier simply reciprocates inside a one-dimensional tube without remaining in either half of the tube, where v0 and −v0 denote the velocities before and after a collision, respectively. In (b), a bit of information is created by adding a partition to confine the information carrier in the desired half of the tube, in harmony with the observation that "a bit of information reduces uncertainty by half" [3]. In (c), the partition becomes a frictionless piston, and the information carrier can push it to do useful work in exchange for the loss of the written information (that is, which half of the tube is occupied).

Theoretically, a room-temperature computer could be operated at a rate of 57 Tera-bits/s according to Eq.6. In fact, a modern computer uses millions of times as much energy as the Landauer bound [13], so the theoretical time needed to write a bit of information could be much shorter than the value given by Eq.6, as shown in Fig.3.
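The write-time figure of Eq.6 and the 57 Tbit/s rate can be reproduced numerically (a sketch with standard constant values; the rounded bound ΔE ≈ 3 × 10⁻²¹ J gives the quoted 1.75 × 10⁻¹⁴ s, while the exact kB·T·ln 2 gives a slightly longer time):

```python
import math

hbar = 1.0546e-34   # reduced Planck constant, J*s
k_B = 1.380649e-23  # Boltzmann constant, J/K
T = 300.0           # K

dE_rounded = 3e-21               # rounded Landauer bound, as used in Eq.6
dE_exact = k_B * T * math.log(2)

# Eq.1 rearranged: 1 bit = 2*dE*dt/hbar  =>  dt = hbar/(2*dE)
dt = hbar / (2 * dE_rounded)
rate = 1 / dt  # bits per second

print(f"write time: {dt:.3e} s")              # ~1.76e-14 s
print(f"write rate: {rate/1e12:.0f} Tbit/s")  # ~57 Tbit/s
print(f"with exact bound: {hbar/(2*dE_exact):.3e} s")
```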
If the write time is as short as the Planck time (5.39 × 10⁻⁴⁴ s), the shortest validly measurable time length [14], the corresponding energy needed to write a bit of information is

ΔE = ℏ/(2·tP) = (1.05 × 10⁻³⁴ J·s)/(2 × 5.39 × 10⁻⁴⁴ s) ≈ 9.7 × 10⁸ J,    (7)

which is 29 orders of magnitude larger than the Landauer bound.
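The Planck-time figure follows from the same rearrangement of Eq.1 (again a numerical sketch with standard constant values):

```python
import math

hbar = 1.0546e-34      # reduced Planck constant, J*s
t_planck = 5.39e-44    # Planck time, s
E_landauer = 3e-21     # rounded Landauer bound, J

# Eq.1 rearranged for energy at a fixed write time
E_write = hbar / (2 * t_planck)
print(f"energy to write a bit in one Planck time: {E_write:.2e} J")

orders = math.log10(E_write / E_landauer)
print(f"~{int(orders)} orders of magnitude above the Landauer bound")
```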
As shown in Fig.2, a compressed ideal gas (as a bit of written information) can be approximated by an elastic spring with a (constant) elasticity coefficient K. Under the application of a force F, the contraction of the spring is described by Hooke's law: F = K × contraction [15]. For a spring with a free length L, we can use Eq.3 to estimate the elasticity coefficient K at x = L/2 as follows:

K = F/(L − x) = [m0·v0²/(L/2)]/(L/2) = 4·m0·v0²/L².    (8)

The elastic potential energy stored in this spring is

E(x) = (1/2)·K·(L − x)².    (9)

When x = L, E = 0, which corresponds to the free state of the spring; when x = L/2, E = (1/2)·m0·v0², which indicates that the elastic potential energy reaches a maximum and originates from the kinetic energy of the information carrier. The calculated potential energy ((1/2)·m0·v0²) stored in this imaginary spring is slightly different from the Landauer bound (kB·T·ln 2 = m0·v0²·ln 2) because the elastic coefficient is actually not constant for a (contracted) ideal gas.
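The spring analogy can be checked numerically: under the constant-K assumption, the energy stored at x = L/2 equals (1/2)·m0·v0² regardless of the tube length, a factor 1/(2·ln 2) ≈ 0.72 below the Landauer bound (a sketch; the carrier mass, speed, and tube length below are illustrative values, not from the experiments):

```python
import math

# Illustrative values (not from the text): carrier mass, speed, tube length
m0 = 4.81e-26   # kg
v0 = 508.0      # m/s
L = 1e-7        # m

K = 4 * m0 * v0**2 / L**2          # elasticity coefficient estimated at x = L/2
E_spring = 0.5 * K * (L / 2)**2    # elastic potential energy at x = L/2

E_half_kinetic = 0.5 * m0 * v0**2            # expected maximum: (1/2) m0 v0^2
E_landauer = m0 * v0**2 * math.log(2)        # Landauer bound via m0 v0^2 = kB T

ratio = E_spring / E_landauer
print(E_spring, E_half_kinetic)    # equal, independent of the choice of L
print(f"spring estimate / Landauer bound = {ratio:.3f}")  # 1/(2 ln 2) ~ 0.721
```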
To investigate the mass effect of the information carrier, the Landauer bound in Eq.5 can be rewritten in terms of the mass as follows:

W = kB·T·ln 2 = m0·v0²·ln 2.    (10)

As the translational motion of a particle has three degrees of freedom, the average translational kinetic energy of a freely moving particle in a system with temperature T is

(1/2)·m0·⟨v²⟩ = (3/2)·kB·T.    (11)

Considering that the average molar mass of dry air is 28.97 g/mol, i.e., m0 = 28.97 × 10⁻³/(6.022 × 10²³) = 4.81 × 10⁻²⁶ kg, we obtain a root-mean-square velocity of about 508 m/s at 300 K. This velocity is larger than the speed of sound (340 m/s), but much smaller than the speed of light. For this reason, Einstein's energy-mass formula E = mc² [16] should not be used to convert energy to mass in the Landauer bound [17]; relativistic effects do not need to be considered in this classical thermodynamic system. The two aforementioned equivalences form an information-energy-mass triangle (Fig.4), which shows that information is ultimately physical in terms of requiring energy to create/manipulate it as well as a carrier (of mass m0) to carry it.
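The air-molecule numbers above can be reproduced directly from Eq.11 (a sketch using standard constant values):

```python
import math

k_B = 1.380649e-23    # Boltzmann constant, J/K
N_A = 6.022e23        # Avogadro constant, 1/mol
T = 300.0             # K

M_air = 28.97e-3                     # average molar mass of dry air, kg/mol
m0 = M_air / N_A                     # mass of one "air particle"
v_rms = math.sqrt(3 * k_B * T / m0)  # from (1/2) m0 <v^2> = (3/2) kB T

print(f"m0 = {m0:.2e} kg, v_rms = {v_rms:.0f} m/s")
# Faster than sound (~340 m/s), far below the speed of light
```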

III. EXPERIMENTS
Newton's cradle [18] was used to experimentally verify the aforementioned information-energy-mass equivalences. As shown in Fig.5, a typical Newton's cradle consists of identical metal balls suspended in a metal frame such that the balls just touch each other at rest. A ball at one end (equivalent to the frictionless piston in Fig.2) is lifted and released, thereby striking a stationary ball (equivalent to the information carrier in Fig.2), through which a force is transmitted that pushes the ball at the other end upward. Our Newton's cradle consists of identical stainless steel balls (each weighing 150 g) suspended in a metal frame (18 × 18 × 12 cm) with a marble base; the swing strings are made of high-impact nylon for thin yet extremely durable oscillatory suspension. As a collision between two balls is nearly elastic, a ball loses no kinetic energy in a collision and its speed after the collision is nearly the same as that before the collision, which means that the Newton's cradle balls behave like ideal gas particles.

Fig.6 shows an experiment conducted on the free state without any written information, in which a one-dimensional tube with a trapped information ball is simulated [Fig.2(a)]. The "tube" is 6 cm long, and a heavy hammer is used at one end to trap a ball as the information carrier. This information ball simply reciprocates inside the simulated tube, which reproduces the behaviour of a trapped reciprocating ideal gas particle in the Landauer bound calculation [5]. The kinetic energy of the information ball can be measured by the swing height of a test ball resulting from a collision between the two balls that brings the information ball to a complete stop, whereby all its kinetic energy is fully converted into the potential energy of the test ball, i.e., m0·g·h = (1/2)·m0·v0², where g is the gravitational acceleration.
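The energy bookkeeping behind this measurement is simple: equating m0·g·h to (1/2)·m0·v0² lets the ball speed be read off from the swing height, with the mass cancelling (a sketch; the speed and height below are illustrative assumed values, not measured ones):

```python
import math

g = 9.81     # gravitational acceleration, m/s^2
v0 = 0.50    # assumed ball speed after the hammer hit, m/s (illustrative)

# m0 g h = (1/2) m0 v0^2  =>  h = v0^2 / (2 g); the mass m0 cancels
h = v0**2 / (2 * g)
print(f"swing height for v0 = {v0} m/s: {h*100:.2f} cm")

# Inverse direction: recover the speed from a measured swing height
h_measured = 0.0127  # m (illustrative)
v_recovered = math.sqrt(2 * g * h_measured)
print(f"recovered speed: {v_recovered:.3f} m/s")
```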
Fig.7 shows an experiment conducted on information creation/erasure. The information ball is confined in the desired half of the tube, in harmony with the observation that "a bit of information reduces uncertainty by half" [3]. As shown in Fig.7(c)-(d), the information ball exerts useful work in pushing the test ball (analogous to the piston in Fig.2) to release irreversible heat to the environment as a result of losing that written information (that is, which side of the partition is occupied). This so-called "information engine" is illustrated in Fig.8, where the information erasure is an irreversible manipulation of the created information that increases the entropy. Another way of interpreting the Landauer bound is that, if the information is "burnt", the "Maxwell's demon" that "created" information in Fig.2(b), or the observer who created information in Fig.7(a)-(b), loses the ability to extract work from the system.

FIGURE 7. Information creation/erasure. In (a)-(b), the same hammer hit as in Fig.6 is applied in order to give the information ball the same velocity v0, which implies that the temperature T remains unchanged since (1/2)·m0·v0² = (1/2)·kB·T. That is, the momentum of the information ball is doubled within the same time period as that of the free state. An important conclusion of this study is that the energy input during the creation of a bit of (binary) information is converted into the doubled momentum, or the doubled momentum mass, of the information ball. In (c)-(d), the information ball exerts useful work in pushing the test ball (analogous to the piston in Fig.2) in exchange for the loss of the written information.

FIGURE 8. An information "burning" engine. The information erasure is an irreversible manipulation of the created information, i.e., the "Maxwell's demon" or the observer that "created" the information loses the ability to extract work from the system after the information is "burnt".

The comparison of Fig.6 and Fig.7 indicates that the swing height of the test ball in (d) after the operation is nearly the same (within the acceptable error tolerance), which implies that the kinetic energy of the information ball remains the same across the two cases. The travel time of the information ball in one reciprocation cycle is halved during the information creation, which implies that the impulse of the information ball doubles for the same time period as that of the free state. This result follows because the impulse is time-specific (the time interval must be specified to determine the corresponding impulse value) according to the definition of the impulse, Δp = F·Δt. That is, the energy input during the creation of a bit of (binary) information is converted into the doubled momentum, or the doubled "momentum mass" (mass in motion) m = 2·m0, of the information ball, because Δp = F·Δt = 2·m0·v0 = m·v0 and the velocity v0 is a constant. Fig.9 and Fig.10 repeat the experiments of Fig.6 and Fig.7 with a doubled velocity of the information carrier. It is validated again that the momentum of the information ball is doubled within the same time period as that of the free state. Furthermore, this verifies our new definition of information (1 (bit) = 2ΔE·Δt/ℏ) in the sense that the higher the energy input, the shorter the time needed to write a bit of information.
The doubled momentum, or the doubled "momentum mass", during the creation of a bit of (binary) information in Fig.7 and Fig.10 can also be explained by a doubled number of collisions within the same time period. As shown in Eq.2, each collision doubles the momentum in such a way that Δp = m0·v0 − (−m0·v0) = 2·m0·v0 by reversing the direction of the moving information carrier after a collision.
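The doubled collision count can be sketched directly: halving the travel distance halves the round-trip time, so twice as many wall collisions (each transferring 2·m0·v0) occur in the same window (the ball speed below is an illustrative assumed value):

```python
m0 = 0.150   # ball mass, kg (150 g, as in the experiment)
v0 = 0.50    # ball speed, m/s (illustrative assumption)

def impulse_on_wall(length, duration):
    """Total impulse delivered to one end wall over `duration` seconds.

    Each collision reverses the ball: dp = m0*v0 - (-m0*v0) = 2*m0*v0,
    and collisions with a given wall repeat once per round trip 2*length/v0.
    """
    collisions = duration * v0 / (2 * length)
    return collisions * 2 * m0 * v0

L = 0.06                                 # full tube, 6 cm (free state)
free = impulse_on_wall(L, 1.0)
confined = impulse_on_wall(L / 2, 1.0)   # information written: half the tube
print(free, confined, confined / free)   # the ratio is exactly 2
```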

IV. ORIENTATION-BASED INFORMATION STORAGE
As elaborated above, the position of a Brownian particle is used to store information in a classical thermodynamic system (Fig.2). A spin can be used in a modern computing paradigm to replace a charge for information storage, allowing for faster, lower-energy operations [19]. For example, an electron has a charge and a spin that are inseparable. Fig.11 shows the Stern-Gerlach experiment demonstrating the deflection of silver atoms with nonzero magnetic moments by a magnetic field gradient [20]. The screen reveals discrete points of accumulation, rather than a continuous distribution, resulting from the quantized spin. Historically, this experiment is a seminal benchmark of quantum physics, providing evidence for the reality of angular-momentum quantization in all atomic-scale systems [20]. As shown in Fig.11, the energy of flipping a spin in a magnetic field B can be expressed as

ΔE = g·μB·B,    (12)

where μB is the Bohr magneton and the value of the electron spin g-factor is roughly equal to 2.
In this "nonclassical" information system, the magnetic field B (as an environmental parameter) is analogous to the temperature T (as another environmental parameter) in the Landauer bound, which determines a new energy bound.
The magnetic interaction between two spin-1/2 valence electrons across a separation of 2.18-2.76 μm was measured [21]. According to our calculation, one electron applies a magnetic field B = 8.82 × 10⁻¹⁴ T to the other electron (across a separation of 2.76 μm), which is much smaller than the B ≈ 0.1 T, dB/dz = 1 × 10³ T/m used in the Stern-Gerlach experiment [20]. Accordingly, the energy to flip a spin via the spin-spin interaction in the presence of this magnetic field is ΔE = 2·μB·B = 1.64 × 10⁻³⁶ J. This energy bound is 15 orders of magnitude lower than the Landauer bound (3 × 10⁻²¹ J) [19]. The energy used to retain the defined spin state must still be higher than the Landauer bound to keep the electron at one side of the potential well. At the readout or erasure stage, there is no need to move the electron from one side of the potential well to the other side, which differs from the energy based on an electron's position. In either case, we cannot separate the (internal, intrinsic) spin and charge of an electron [19].
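Both flip energies quoted above follow from ΔE = g·μB·B (a numerical sketch with standard constants; the field values are those quoted in the text):

```python
import math

mu_B = 9.274e-24   # Bohr magneton, J/T
g = 2.0            # electron spin g-factor (approximately)

def flip_energy(B):
    """Energy to flip a spin in a magnetic field B (tesla)."""
    return g * mu_B * B

B_stern_gerlach = 0.1     # T, field scale in the Stern-Gerlach setup
B_spin_spin = 8.82e-14    # T, field of one electron on the other (from the text)

print(f"Stern-Gerlach flip energy: {flip_energy(B_stern_gerlach):.2e} J")
print(f"spin-spin flip energy:     {flip_energy(B_spin_spin):.2e} J")

# Compare with the (rounded) Landauer bound
E_landauer = 3e-21
orders = math.log10(E_landauer / flip_energy(B_spin_spin))
print(f"~{int(orders)} orders of magnitude below the Landauer bound")
```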
An information-energy-mass triangle for spin information storage is shown in Fig.12. The mass of an information carrier remains an important and necessary apex of this triangle, since information must still be carried by a physical carrier. The energy input during the information creation is still converted into the doubled momentum mass of the information carrier.

V. CONCLUSIONS & DISCUSSIONS
Information can be interpreted as the resolution of uncertainty and thereby expressed by Heisenberg's energy-time uncertainty principle: 1 (bit) = 2ΔE·Δt/ℏ. Historically, the constant h was found by Planck in his radiation law (Fig.13) for a black body: ΔE = h·ν, in which ΔE is the discrete amount of the vibrational energy of each oscillator (atoms in a state of oscillation) and ν is the frequency of the oscillation [22]. An information carrier reciprocating inside a tube (Fig.2) can be viewed as a Planck oscillator (Fig.13) with an oscillation frequency ν. Therefore, our new definition of information based on Heisenberg's uncertainty principle is also supported by the Planck constant itself. As summarized in Fig.14, data storage can be categorized into two types: (classical) position-based and (modern) orientation-based. As illustrated in Table 1, a specific information carrier is needed for each type of data storage depending on the chosen physical feature.

TABLE 1. A specific information carrier is needed for each type of data storage. Since the velocity is much smaller than the speed of light, relativistic effects do not need to be considered unless particles travelling (fully/nearly) at the speed of light (such as photons and neutrinos) are used.

Info carrier | Mass m0 | Velocity of Brownian motion v0 at R.T.
Regardless of whether a bit of information is position-based or orientation-based, the energy barrier (the Landauer bound) of the bistable potential well needs to be overcome to create/write the bit from scratch. For position-based data storage, a rewriting, (destructive) readout or erasure operation still needs to move the information carrier between the two stable states. For orientation-based data storage, a readout or erasure operation of spin information does not need to move the information carrier (electron) from one side of the potential well to the other side. Although the energy of flipping a spin is much lower than the energy based on an electron's position, we cannot separate the (internal, intrinsic) spin from its carrier, and the mass m0 of an information carrier still needs to be considered in our energy calculation in a classical thermodynamic way, as illustrated in Fig.14.
Newton's cradle was used to experimentally verify the deduced information-energy-mass equivalences. These experiments vividly demonstrate that the energy input during the creation of a bit of (binary) information (to halve the reciprocating motion distance of the information carrier) is stored in the information carrier in the form of the doubled momentum, or the doubled "momentum mass" (mass in motion). During the information erasure, the stored energy was found to release irreversible heat to the environment as a result of losing that written information. Furthermore, the experiments verified our new definition of information (1 (bit) = 2ΔE·Δt/ℏ) in the sense that the higher the energy input, the shorter the time needed to write a bit of information.

FIGURE 14. Regardless of whether a bit of information is position-based
or orientation-based, the barrier (the Landauer bound) of the bistable potential well needs to be overcome to create/write the bit from scratch. For position-based data storage, a rewriting, (destructive) readout or erasure operation still needs to move the information carrier between the two stable states. For orientation-based data storage, a readout or erasure operation of spin information does not need to move the information carrier (electron) from one side of the potential well to the other side. Although the energy of flipping a spin is much lower than the energy based on an electron's position, we cannot separate the (internal, intrinsic) spin from its carrier, and the mass of an information carrier still needs to be considered in the corresponding energy calculation in a classical thermodynamic way (see inset).
The aforementioned conclusions on the information-energy-mass equivalences may help in understanding the fundamental concept of information and the deep physics behind it. They may also arouse considerable interest in reversible computing, in which no information is erased, so that no irreversible heat is released to the environment. According to Koomey's law [25], the growth in computational energy efficiency will come to a halt at the Landauer bound by around 2050.