Thursday, 28 October 2010

Fine-grained exploration

Many, if not all, complex systems in biology have a fine-grained architecture, in that they consist of large numbers of relatively simple elements that work together in a highly parallel fashion.

Several possible advantages are conferred by this type of architecture, including robustness, efficiency, and evolvability. One additional major advantage is that a fine-grained parallel system is able to carry out what Douglas Hofstadter has called a "parallel terraced scan". This refers to a simultaneous exploration of many possibilities and pathways, in which the resources given to each exploration at a given time depend on the perceived success of that exploration at that time. The search is parallel in that many different possibilities are explored simultaneously, but is "terraced" in that not all possibilities are explored at the same speeds or to the same depth. Information is used as it is gained to continually reassess what is important to explore.
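
To make the idea concrete, here is a minimal sketch in Python (my own toy illustration, not Hofstadter's actual Copycat architecture): each candidate receives a share of a fixed per-round exploration budget in proportion to its currently estimated promise, and the estimates are revised as results come in, so promising paths are explored deeper while weak ones still get occasional attention.

```python
def parallel_terraced_scan(candidates, evaluate, rounds=20, budget=100):
    """Explore many candidates at once, giving each a share of the per-round
    budget proportional to its currently estimated promise."""
    candidates = list(candidates)
    promise = {c: 1.0 for c in candidates}   # start with equal promise
    depth = {c: 0 for c in candidates}       # how far each candidate has been explored
    for _ in range(rounds):
        total = sum(promise.values())
        for c in candidates:
            # Resources in proportion to perceived success so far; every candidate
            # keeps at least one step, so nothing is abandoned outright.
            steps = max(1, int(budget * promise[c] / total))
            depth[c] += steps
            # Re-assess promise using the information gained this round.
            promise[c] = 0.5 * promise[c] + 0.5 * evaluate(c, depth[c])
    return depth, promise

# Toy usage: candidates are integers, and "promise" is closeness to a hidden target.
target = 42
closeness = lambda c, depth: 1.0 / (1.0 + abs(c - target))
depth, promise = parallel_terraced_scan(range(30, 55), closeness, rounds=10)
print(max(promise, key=promise.get), depth[42], depth[30])  # 42 is explored far deeper than 30
```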

In cellular metabolism such fine-grained explorations are carried out by metabolic pathways, each focused on carrying out a particular task. A pathway can be speeded up or slowed down via feedback from its own results or from other pathways. The feedback itself is in the form of time-varying concentrations of molecules, so the relative speeds of different pathways can continually adapt to the moment-to-moment needs of the cell.
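
A toy illustration of such feedback (a sketch assuming a simple product-inhibition rule, not a model of any real pathway): the pathway's rate falls as its product accumulates and recovers as the product is consumed, so its speed continually tracks the cell's demand.

```python
def simulate_pathway(steps=15, demand=0.8):
    """Toy negative feedback: the pathway slows as its product accumulates
    and speeds up again as the product is consumed by the rest of the cell."""
    product = 0.0
    for t in range(steps):
        rate = 1.0 / (1.0 + product)      # more product -> slower pathway
        product += rate                   # this step's output joins the pool
        product -= min(product, demand)   # consumption by other processes
        print(f"t={t:2d}  rate={rate:.2f}  product={product:.2f}")

simulate_pathway()  # the rate settles near the cell's demand for the product
```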

Melanie Mitchell - Complexity: a guided tour.

Friday, 22 October 2010

Cheating yeast help group.

New results show that yeast populations grow better when a few individuals cheat the system

From: http://www.the-scientist.com

Yeast colonies with mooches, thieves and cheats actually grow faster and larger than colonies without these freeloading individuals, according to a study published 15th September in PLoS Biology, challenging the widely held belief that cheaters bring only bad news to cooperating populations.


[Image: Researchers found that when some yeast cheat their neighbors out of glucose, the entire population grows faster. Credit: Eric Miller, Max Planck Institute of Evolutionary Biology]

"This is a most surprising result," said Laurence Hurst of the University of Bath in the UK, who coauthored the study. "The theory of cooperation was one of the best worked theories in all of evolution. Everyone assumed that it had to be the case that the world is better off when everyone cooperates."

The results may explain why yeast populations tolerate the presence of cheaters, added Michael Travisano, a biologist at the University of Minnesota, who was not involved in the research -- "because a mixed strategy is to everyone's benefit."

Most yeast secrete invertase, which hydrolyzes sucrose into fructose and glucose, their preferred food. However, some yeast are known to cheat the system. Cheater yeast don't secrete invertase and therefore don't contribute to the glucose production, yet they still eat the glucose that is generated by the rest of the population.

According to the theory of cooperation, which states that organisms are better off when everyone cooperates, yeast populations should be best off when all the yeast produce invertase. This would maximize the availability of glucose, which should enable more yeast growth. But when Hurst and his colleagues grew yeast populations with both producers and non-producers of invertase, this is not what they saw. Instead, the yeast grew the fastest and saw the highest population numbers when a proportion of the population was cheating.

One reason populations with cheaters grew better has to do with the yeast's inability to efficiently use abundant resources, Hurst said. "If you can hop down to your local McDonald's for a Big Mac, and it's very easy and very cheap, then you don't mind if you eat half of it and throw the rest away," Hurst said. "If you were starving in Africa, you wouldn't even imagine doing that." With the cheater yeast using up some of the available glucose, the cooperators are able to use the remaining resources much more efficiently, he said, allowing the population to grow larger and more quickly.
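
The intuition can be captured in a toy calculation (my own sketch, not the model used by MacLean et al.): if the per-cell benefit of glucose saturates while invertase production carries a fixed cost, the population's growth rate peaks at an intermediate fraction of cooperators rather than at 100%.

```python
import numpy as np

# Toy model (an illustration, not the model in MacLean et al. 2010):
# the benefit of glucose saturates, while making invertase carries a fixed cost,
# so glucose produced by "extra" cooperators is partly wasted.
MU_MAX, K, GLUCOSE_PER_PRODUCER, COST = 1.0, 0.2, 1.0, 0.5

def population_growth(cooperator_fraction):
    glucose = GLUCOSE_PER_PRODUCER * cooperator_fraction    # more producers -> more glucose
    per_cell_growth = MU_MAX * glucose / (K + glucose)      # saturating (Monod-like) benefit
    return per_cell_growth - COST * cooperator_fraction     # only cooperators pay the cost

fractions = np.linspace(0.0, 1.0, 101)
best = fractions[np.argmax([population_growth(f) for f in fractions])]
print(f"Fastest growth at ~{best:.0%} cooperators, i.e. with some cheats present")
```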

The team also modeled the experimental results in an effort to see whether their findings were specific to yeast growing on a petri dish, or whether they might apply to other organisms as well. The results showed that their experimental outcomes could be generalized -- cheats would benefit a population whenever certain criteria were met. These results imply that cooperation isn't always the most beneficial path for a population, Hurst said. Instead, the benefits of cooperation depend on the characteristics of the population itself. Under certain conditions, some amount of cheating is likely beneficial.

But the story is not a simple one, said Jeff Gore, a biophysicist at the Massachusetts Institute of Technology, who did not participate in the research. For example, "if the cheaters and cooperators are growing at different rates, the ratio of cooperators to cheaters won't be stable," Gore said. Thus, the population may be changing, and "you still have to ask what [it] is going to evolve to," and not just look at where it is now.

R.C. MacLean, et al., "A mixture of 'cheats' and 'cooperators' can enable maximal group benefit," PLoS Biol, 8(9): e1000486, 2010.

Saturday, 16 October 2010

Origins of organismal complexity

The vast majority of biologists engaged in evolutionary studies interpret virtually every aspect of biodiversity in adaptive terms. This narrow view of evolution has become untenable in light of recent observations from genomic sequencing and population-genetic theory. Numerous aspects of genomic architecture, gene structure, and developmental pathways are difficult to explain without invoking the nonadaptive forces of genetic drift and mutation. In addition, emergent biological features such as complexity, modularity, and evolvability, all of which are current targets of considerable speculation, may be nothing more than indirect by-products of processes operating at lower levels of organization. These issues are examined in the context of the view that the origins of many aspects of biological diversity, from gene-structural embellishments to novelties at the phenotypic level, have roots in nonadaptive processes, with the population-genetic environment imposing strong directionality on the paths that are open to evolutionary exploitation.

(...)

Although the basic theoretical foundation for understanding the mechanisms of evolution, the field of population genetics, has long been in place, the central significance of this framework is still occasionally questioned, as exemplified in this quote from Carroll, “Since the Modern Synthesis, most expositions of the evolutionary process have focused on microevolutionary mechanisms. Millions of biology students have been taught the view (from population genetics) that ‘evolution is change in gene frequencies.’ Isn't that an inspiring theme? This view forces the explanation toward mathematics and abstract descriptions of genes, and away from butterflies and zebras…. The evolution of form is the main drama of life's story, both as found in the fossil record and in the diversity of living species. So, let's teach that story. Instead of ‘change in gene frequencies,’ let's try ‘evolution of form is change in development’.” Even ignoring the fact that most species are unicellular and differentiated mainly by metabolic features, this statement illustrates two fundamental misunderstandings. Evolutionary biology is not a story-telling exercise, and the goal of population genetics is not to be inspiring, but to be explanatory. The roots of this contention are fourfold.

First, evolution is a population-genetic process governed by four fundamental forces. Darwin articulated one of those forces, the process of natural selection, for which an elaborate theory in terms of genotype frequencies now exists. The remaining three evolutionary forces are nonadaptive in the sense that they are not a function of the fitness properties of individuals: mutation is the ultimate source of variation on which natural selection acts, recombination assorts variation within and among chromosomes, and genetic drift ensures that gene frequencies will deviate a bit from generation to generation independent of other forces. Given the century of work devoted to the study of evolution, it is reasonable to conclude that these four broad classes encompass all of the fundamental forces of evolution.

From Michael Lynch, PNAS, 15 May 2007, vol. 104, suppl. 1, 8597-8604.
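
For readers who prefer code to prose, here is a minimal one-locus Wright-Fisher sketch (my own illustration, not from Lynch's paper) showing selection, mutation and drift acting together on an allele frequency; recombination would need a second locus and is left out for brevity.

```python
import random

def wright_fisher(pop_size=200, generations=100, s=0.02, mu=1e-3, p0=0.05):
    """One-locus Wright-Fisher sketch: selection (s), mutation (mu) and genetic
    drift (finite pop_size) all act on the frequency p of the favoured allele."""
    p = p0
    for _ in range(generations):
        p = p * (1 + s) / (p * (1 + s) + (1 - p))                         # selection
        p = p * (1 - mu) + (1 - p) * mu                                   # symmetric mutation
        p = sum(random.random() < p for _ in range(pop_size)) / pop_size  # drift
    return p

# Identical parameters, different outcomes: the spread is genetic drift at work.
print([round(wright_fisher(), 2) for _ in range(5)])
```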

Friday, 15 October 2010

Evolution of genetic networks

Posted by Thomas in http://blog-msb.embo.org/blog/e/evolution_1/


A few days ago, an exciting review by Michael Lynch was published in Nature Reviews Genetics (The evolution of genetic networks by non-adaptive processes, Lynch 2007a), a close follow-up to another review published in PNAS a few months earlier (The frailty of adaptive hypotheses for the origins of organismal complexity, Lynch 2007b). Michael Lynch has also written a book on the topic: The Origins of Genome Architecture (read a review).

The architecture of biological networks is often hypothesized to have been "shaped" by adaptive evolution to confer global properties such as redundancy, robustness, modularity, complexity and evolvability. Lynch has some robust comments (others have some too, see Jonathan Eisen's "adaptationomics awards") on the “vast majority of biologists engaged in evolutionary studies [who] interpret virtually every aspect of biodiversity in adaptive terms” (Lynch 2007b). In contrast to what he perceives as a widespread belief, Lynch states clearly:

It is an open question as to whether pathway complexity is a necessary prerequisite for the evolution of complex phenotypes, or whether the genome architectures of multicellular species are simply more conducive to the passive emergence of network connections. (Lynch 2007a)

Beyond its somewhat controversial tone, Lynch's central lesson is the need to adopt a population genetics viewpoint (“nothing in evolution makes sense except in light of population genetics”), and he reminds us that, besides natural selection, three additional non-adaptive processes drive the evolution of living organisms: genetic drift, mutation and recombination. By analyzing the interplay between the relative rates of loss and gain of regulatory sites (which depend both on mutation rate and on mutational target size, such as the amount of non-coding DNA), population size and recombination frequency, he demonstrates that purely non-adaptive forces can, in principle, determine the level of connectivity of regulatory networks--for example, the predominance of highly connected network motifs over linear pathways--without invoking any inherent advantage of the respective architectures for biological functions related, for example, to development or metabolism. It thus appears that, depending on the population genetics parameters, network structure can be profoundly "shaped" by the mere physical processes of mutation and recombination. At the very least, Lynch proposes that such models should be treated as the "null hypothesis" when claiming that selection is behind a given aspect of organismal complexity.
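
Here is a highly simplified origin-fixation sketch, loosely inspired by Lynch's argument but with made-up parameters (it is not his model): redundant regulatory sites are gained and lost by mutation, each extra site carries a tiny cost, and whether selection "sees" that cost depends on population size. Small populations end up carrying more sites than large ones, with no adaptive advantage invoked anywhere.

```python
import math, random

def fixation_prob(s, n):
    """Diffusion approximation for a new mutation with selection coefficient s."""
    if abs(s) < 1e-12:
        return 1.0 / n
    return (1 - math.exp(-2 * s)) / (1 - math.exp(-2 * n * s))

def regulatory_sites(n, gain=1e-2, loss=1e-2, cost=1e-4, steps=100_000):
    """Redundant regulatory sites are gained and lost by mutation; each extra
    site carries a tiny cost. Whether selection 'sees' that cost depends on n."""
    sites = 1
    for _ in range(steps):
        if random.random() < n * gain * fixation_prob(-cost, n):
            sites += 1   # a new, slightly costly site drifts to fixation
        if sites > 1 and random.random() < n * loss * fixation_prob(+cost, n):
            sites -= 1   # losing a redundant site is slightly favoured
    return sites

for n in (1_000, 100_000):
    print(f"population size {n:>7}: sites = {regulatory_sites(n)}")
```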

In his review of Lynch's book, Massimo Pigliucci draws our attention to the fact that "the genome is only part of the story, arguably the simplest part to figure out", and that one of the greatest current challenges is to explain how phenotypes evolve. Lynch also recognizes that his models are simplified and do not, for example, consider the kinetic or dynamical properties of biological networks. But here is a naive question: would it be possible to design an experimental strategy to test directly, in the lab, the evolution of simple (synthetic?) genetic circuits and observe trends in connectivity under non-selective conditions, or are the timescales involved simply too long?

Monday, 11 October 2010

Modularity and evolution

It may not be easy for one species to change into another. Imagine fish evolving to become dry land species. It is easy to see how problems arise as fins become legs and lungs are needed. Somehow, the evolutionary process got there. How? Almost certainly, the features of modularity and redundancy were critical. It is possible to slot pre-existing gene-protein networks into new control networks without upsetting them too much. That is modularity.

Suppose mutations occur in some mechanisms, and eventually they are selected for functions quite different from those they originally supported. How, then, to maintain the function that they originally served? What were back-up mechanisms now become primary. That is the value of redundancy. This is the basic explanation for how nature can modify its "aircraft design" while still ensuring that the aircraft continues to fly.

Denis Noble - The music of life: Biology beyond genes.

Fascinating correlations or elegant theories?

From: http://blog-msb.embo.org/blog/ Posted by Thomas on July 10, 2008

Chris Anderson, Editor-in-Chief of Wired, wrote a provocative piece a few weeks ago, "The End of Theory: The Data Deluge Makes the Scientific Method Obsolete", arguing that in our Google-driven, data-rich era ("The Petabyte Age") the good old "approach to science —hypothesize, model, test — is becoming obsolete", giving way to a purely correlative vision of the world. There is a good dose of provocation in the essay, and it was quite successful in spurring a flurry of skeptical reactions in the blogosphere, FriendFeed-land and lately in Edge's Reality Club.

I know that it is a bit late to write a post on this, but the debate reminds me of the bottom-up vs top-down dialectic in (systems) biology. The tradition in molecular biology has been to focus on molecular mechanisms–a series of molecular events–that explain given biological functions. With detailed knowledge of the properties of an increasing number of components, bottom-up mechanistic descriptions–or models–can be constructed, which account for the experimental observations.

Of course, the purpose of models, at least of insightful ones, is more than merely providing mechanistic descriptions. As William Bialek writes, "Given a progressively more complete microscopic description of proteins and their interactions, how do we understand the emergence of function?" (Aguera y Arcas et al, 2003). There is therefore a subtle subsequent transition from description to insight, from model to theory, from detailed and specific to simple and general (watch Murray Gell-Mann's TED Talk on "Beauty and truth in physics").

Theories are elegant.

On the other hand, high-throughput technologies (microarrays, proteomics, metabolomics, ultra-high-throughput sequencing, etc.) are indeed profoundly changing molecular biology and flooding the field with experimental data like never before. Currently, only part of this data can be explained within the context of mechanistic models. Still, and this is probably Chris Anderson's main point, it turns out that if the data is rich enough, one can exploit it by looking at it globally, from the 'top', to reveal statistical patterns and correlations. Even if there are no mechanistic explanations (yet) for these correlations, they may reveal new worlds and novel structures, and expose relationships between processes that were previously considered unlinked.

Correlations are fascinating.
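
As a minimal illustration of this "top-down" pass (a sketch on synthetic data, not any particular published pipeline): compute all pairwise correlations in a gene-expression matrix and pull out the strongest ones, with no mechanistic model attached.

```python
import numpy as np

# A "top-down" pass with no mechanism attached: all pairwise correlations in a
# (here synthetic) gene-expression matrix, strongest first.
rng = np.random.default_rng(0)
n_genes, n_samples = 50, 20
expr = rng.normal(size=(n_genes, n_samples))
expr[1] = 0.9 * expr[0] + rng.normal(scale=0.3, size=n_samples)   # plant one real link

corr = np.corrcoef(expr)                      # gene-by-gene correlation matrix
i, j = np.triu_indices(n_genes, k=1)          # each pair of genes counted once
top = np.argsort(-np.abs(corr[i, j]))[:5]     # the five strongest correlations
for a, b in zip(i[top], j[top]):
    print(f"gene {a} ~ gene {b}: r = {corr[a, b]:+.2f}")
```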

Correlations resulting from data-driven analysis may well in turn stimulate new mechanistic investigations and, hopefully, new understanding. On Edge, Sean Carroll summarizes it all: "Sometimes it will be hard, or impossible, to discover simple models explaining huge collections of messy data taken from noisy, nonlinear phenomena. But it doesn't mean we shouldn't try. Hypotheses aren't simply useful tools in some potentially-outmoded vision of science; they are the whole point. Theory is understanding, and understanding our world is what science is all about."

BUT what is true for fundamental science is not necessarily a rule for more applied fields, where the priority may be less on understanding than on acting. In particular, in medically related fields, top-down data-driven correlative approaches represent a pragmatic way to obtain predictive models without waiting for the still elusive fully mechanistic models that would encompass the entire complexity of human physiology (Nicholson, 2006).

As often happens in science, as in other human activities, different but complementary views are championed by people with different temperaments: there are those who like to build an edifice piece by piece and those who want to explore new territories. I think–I hope–that progress in systems biology on both fronts, top-down and bottom-up, demonstrates that there is no need to turn this complementarity into an opposition.

Genetics and human nature

It's sometimes said that the genes determine the limits up to which, but not beyond which, a person's development may advance. This only confuses the issue. There is no way to predict all the phenotypes that a given genotype might yield in every one of the infinity of possible environments. Environments are infinitely diversified, and in the future there will exist environments that do not exist now. (...) Heredity cannot be called the "dice of destiny". Variations in body build, in physiology, and in mental traits are in part genetically conditioned, but this does not make education and social improvements any less well founded. What genetic conditioning does mean is that there is no single human nature, only human natures with different requirements for optimal growth and self-realization. The evidence of genetic conditioning of human traits, especially mental traits, must be examined with the greatest care. Theodosius Dobzhansky - Mankind evolving (1962).

Tuesday, 5 October 2010

Cis-regulatory evolution

Animal genomes are littered with conserved non-coding elements (CNEs), most of which represent evolutionarily constrained cis-regulatory sequences. However, it is often not clear why these sequences are so exceptionally conserved, since anecdotal examples have shown that orthologous CNEs can have divergent functions in vivo. In an article recently published in Molecular Biology and Evolution, Ritter et al. (2010) compare the functional activities of 41 pairs of orthologous CNEs from humans and zebrafish. Interestingly, sequence similarity turned out to be a poor predictor of which CNEs had conserved function. In contrast, the authors found that measuring transcription factor binding site change, instead of simple sequence divergence, improved their ability to predict functional conservation. While the set of tested CNEs remains relatively small, these results are encouraging because they suggest that as scientists move from phenomenological measures of CNE evolution to models based explicitly on binding site evolution, the patterns of cis-regulatory evolution observed within animal genomes should become far less mysterious.
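
A toy contrast between the two kinds of measure (made-up motifs and sequences, not the actual Ritter et al. pipeline): raw percent identity on one hand, and the set of predicted binding sites shared by two orthologous elements on the other.

```python
import re

# Made-up motifs and sequences, for illustration only: two elements can diverge
# in sequence yet keep (or lose) the same complement of predicted binding sites.
MOTIFS = {"SOX": "AACAAT", "HOX": "TAAT", "ETS": "GGAA"}

def percent_identity(a, b):
    """Naive column-by-column identity over an (assumed pre-aligned) pair."""
    return sum(x == y for x, y in zip(a, b)) / min(len(a), len(b))

def binding_sites(seq):
    """Which of the toy motifs occur anywhere in the sequence."""
    return {name for name, site in MOTIFS.items() if re.search(site, seq)}

human_cne     = "GGAAACAATCCTAATGGCT"
zebrafish_cne = "GGTAACAATGATAATCGCA"
print(f"sequence identity: {percent_identity(human_cne, zebrafish_cne):.0%}")
print(f"shared binding sites: {sorted(binding_sites(human_cne) & binding_sites(zebrafish_cne))}")
```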

From: http://blog-msb.embo.org/blog/