Study estimates the energy costs of information processing in biological systems

Communication between cells and other molecular components underpins countless biological processes that support the behavior, physiology, and survival of living organisms. These molecular constituents are known to communicate with one another through a variety of physical mechanisms, including diffusion, electrical depolarization, and the exchange of mechanical waves.

A new study by Yale University researchers set out to determine the metabolic cost of this information transmission between cells and molecular components. Their work, published in Physical Review Letters, introduces a new tool that could be used to study and better understand cellular networks.

Benjamin B. Machta and his collaborators had been considering this idea in one form or another for some time, Machta told Phys.org.

"About ten years ago, my Ph.D. adviser Jim Sethna and I originally talked about concepts that would eventually become this project, but for a variety of reasons that work never quite took off. Sam and I got to speak about this when we were considering ways to make sense of the energy costs that biology must incur in order to compute—a topic that runs through a lot of his doctoral work—and perhaps in a broader sense, to make sure that its components are cohesive and under control. He worked out how to perform these calculations."

The latest study by Machta and his colleague Samuel J. Bryant draws inspiration from earlier work published in the late 1990s, particularly by Simon Laughlin and collaborators. That group had set out to measure experimentally how much energy neurons expend while transmitting information.

According to Machta, Laughlin and colleagues found that, depending on the circumstances, neurons spend far more energy to transmit a bit of information than the "fundamental" limit of ~kBT per bit known as the Landauer bound, the minimum energy required to erase one bit of information.
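
For orientation, the Landauer bound itself is straightforward to evaluate. The short Python sketch below computes it at an assumed physiological temperature of roughly 310 K; the numbers are illustrative and are not taken from the study.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K
T = 310.0           # assumed physiological temperature, K

# Landauer bound: minimum energy needed to erase one bit, E = kB * T * ln(2)
landauer_joules = K_B * T * math.log(2)

print(f"Landauer bound at {T:.0f} K: {landauer_joules:.2e} J per bit")
print(f"equivalently ~{math.log(2):.2f} kBT per bit")
```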

"We were curious whether this was an instance of biology simply being wasteful, or whether there were additional costs that had to be paid; in particular, the Landauer bound does not account for geometry or other details. Although it is not the main subject here, applying the Landauer bound is subtle in its own right, because the cost is only paid when information is deleted; in principle, one could compute reversibly, never delete anything, and pay no computational cost at all."

Machta and Bryant's latest work also sought to determine whether optimizing these energy costs might explain why molecular systems rely on different physical mechanisms to communicate under different conditions. Neurons, for example, usually communicate via electrical impulses, but other kinds of messages are carried by chemical diffusion.

"Our goal was to determine the optimal regime for each of these (and other) in terms of energy cost per bit," stated Machta. "All of our computations take into account data that is sent via a physical channel, from a physical information sender (such as an ion channel that is used for'sending' signals) to a physical receiver (such as a voltage detector in the membrane that may also be an ion channel). The information rate across a Gaussian channel is calculated using a textbook method at its core, with a few novel twists."

First, the calculations by Machta and his colleagues always concern a physical channel, in which electrical charges or currents of real particles move according to the physics of a cell. Second, the team always assumes that the channel is corrupted by thermal noise from the cellular environment.

"The 'fluctuation dissipation theorem,' which relates the spectrum of thermal fluctuations to the near equilibrium response functions, allows us to calculate the spectrum of this noise," Machta said.

The team's calculations also stand out for being based on very simple models. This allowed the researchers to place conservative lower bounds on the energy needed to drive physical currents and power a channel in a biological system.

Machta said, "We generally compute costs that take the form of a geometric prefactor multiplying kBT/bit, since the signal has to overcome thermal noise.

"The size of the transmitter and receiver may be seen as this geometric element; a larger sender can spread a dissipative current across a greater region, hence reducing costs per bit. Furthermore, a bigger receiver enables greater averaging across temperature variations, allowing the same information to be carried by a weaker overall signal."

"For electrical signaling, for example, we obtain a cost per bit that grows like r²/(σIσO) kBT/bit, where σI and σO are the sizes of the sender and receiver and r is the distance between them. Importantly, for ion channels that are a few nanometers wide but transmit information over microns, this cost can be several orders of magnitude larger than the ~kBT/bit that simpler, more fundamental considerations suggest as a lower bound."
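
Plugging rough numbers into this scaling shows how large the geometric prefactor can become. The values below are assumptions chosen to match the scales in the quote (nanometer-wide channels, micron distances), not figures from the paper:

```python
# Back-of-the-envelope check of the quoted scaling, in units of kBT per bit.
# All numbers are illustrative assumptions, not values from the paper.
r = 1e-6         # sender-receiver separation: ~1 micron
sigma_i = 3e-9   # sender size, ion-channel scale: ~3 nm
sigma_o = 3e-9   # receiver size: ~3 nm

cost_kbt_per_bit = r**2 / (sigma_i * sigma_o)
print(f"cost per bit ~ {cost_kbt_per_bit:.1e} kBT")  # ~1.1e5 kBT per bit
```

At these scales the estimate lands around 10^5 kBT per bit, several orders of magnitude above the Landauer scale, consistent with the quote above.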

Overall, the calculations carried out by Machta and his colleagues confirm that transmitting information between cells and molecular components carries a substantial energy cost. These estimates could ultimately help explain the high information processing costs measured in experimental studies.

"Our explanation relies on the geometry of neurons and ion channels, among other details, making it less 'fundamental' than the Landauer bound," Machta stated. But if biology is affected by these specifics, it's possible that, rather than just being inefficient, neurons—for instance—are efficient and facing actual energy or information constraints. While these calculations are by no means sufficient to conclude that a certain system is efficient at this point, they do indicate that transmitting data into space may need very high energy expenditures."

This latest work by Machta and his colleagues could inform exciting new biological research in the future. The researchers also included a "phase diagram" in their study, which maps out the conditions under which particular communication mechanisms (such as chemical diffusion or electrical signaling) are energetically optimal.

In time, this diagram could help clarify the principles behind different cell signaling strategies. For example, it might explain why E. coli bacteria use diffusion to transmit information about their chemical environment, and why neurons communicate via chemical diffusion at synapses but use electrical signals to send information over hundreds of microns from their dendrites to the cell body.
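
To convey the spirit of such a phase diagram, the toy sketch below compares two cost-per-bit curves as a function of distance and reports which mechanism is cheaper in each regime. The electrical scaling follows the form quoted earlier; the diffusive cost function and all parameter values are hypothetical placeholders, not results from the study:

```python
# Toy "phase diagram" over sender-receiver distance r. The electrical cost
# uses the r^2/(sigma_i*sigma_o) scaling quoted earlier; the diffusive cost
# is a hypothetical placeholder, NOT the paper's result, included only to
# show how optimal regimes can be compared mechanism by mechanism.

def electrical_cost(r, sigma=3e-9):
    return r**2 / (sigma * sigma)        # kBT per bit (quoted scaling)

def diffusive_cost(r, c0=1e5, ell=1e-6):
    return c0 * (r / ell)**3             # hypothetical placeholder scaling

for r in (1e-8, 1e-7, 1e-6, 1e-5):
    winner = "electrical" if electrical_cost(r) <= diffusive_cost(r) else "diffusive"
    print(f"r = {r:.0e} m -> cheaper mechanism (toy model): {winner}")
```

With these placeholder parameters, diffusion wins at short range and electrical signaling wins at long range, qualitatively matching the synapse-versus-dendrite picture described above.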

Machta added, "One thing we are working on right now is trying to apply this framework toward understanding the energetics of a concrete signal transduction system."

"In actual systems, there are usually information processing networks; implementing our constraint necessitates a knowledge of the information flow in these networks. Our previous work only examined the abstract cost of conveying information between two isolated components. Application of our computations to particular geometries (such as a'spherical' neuron or an axon that resembles a tube, both significantly different from the endless plain we employed here) presents further technological challenges in pursuit of this aim.

Journal information: Physical Review Letters