It's an interesting bit of syllogism there, I think. Here's more:

gizmodo.com wrote:
We probably all vaguely assume that computers will overthrow us someday, which may be why it's so unsettling to learn that computer code is evolving much like genetic code. By comparing bacterial genomes to Linux, researchers have found "survival of the fittest" acting in computer programming.
Sergei Maslov of Brookhaven National Laboratory and Stony Brook University grad student Tin Yau Pang looked at how different components in genomes and computer code survive. They noted that in both examples of complex systems, prevalent constituent parts become widespread by being so integral that they can't be removed. And they do this by contributing to reproduction, either directly or through expansions that make reproduction possible.
It makes sense that the more a gene or a specific program is used, the more future developments will depend on it as a given, but the surprising part is the similarity in frequency of use between important genes and computer programs. Maslov and Pang looked at 500 bacterial species and 2 million individual computers. They found that the frequency of certain genetic code being used in life-sustaining bacterial processes was extremely close to the frequency of installation of 200,000 Linux packages. Maslov explains:
"We found that we can determine the number of crucial components – those without which other components couldn't function – by a simple calculation that holds true both in biological systems and computer systems . . . Bacteria are the ultimate BitTorrents of biology."

You find the number of key parts by taking the square root of the dependent components. But Maslov points out that this only holds for open source code, where evolution happens "naturally." Okay, definitely thinking our universe is a computer simulation now.
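(Quick arithmetic on that square-root rule - mine, not a figure from the study: the square root of the 200,000 packages they examined is about 447, so only a few hundred of them would be the truly can't-function-without kind.)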
This to me is interesting for several reasons - one, I've been reading up on the various possibilities for artificial life (or human-created, or at least human-inspired), and seeing that code can adapt and change organically much as life can is an interesting comparison.

phys.org wrote:
The term "survival of the fittest" refers to natural selection in biological systems, but Darwin's theory may apply more broadly than that. New research from the U.S. Department of Energy's Brookhaven National Laboratory shows that this evolutionary theory also applies to technological systems.
Computational biologist Sergei Maslov of Brookhaven National Laboratory worked with graduate student Tin Yau Pang from Stony Brook University to compare the frequency with which components "survive" in two complex systems: bacterial genomes and operating systems on Linux computers. Their work is published in the Proceedings of the National Academy of Sciences.
Maslov and Pang set out to determine not only why some specialized genes or computer programs are very common while others are fairly rare, but also to see how many components in any system are so important that they can't be eliminated. "If a bacterial genome doesn't have a particular gene, it will be dead on arrival," Maslov said. "How many of those genes are there? The same goes for large software systems. They have multiple components that work together, and the systems require just the right components working together to thrive."
Using data from the massive sequencing of bacterial genomes, now a part of the DOE Systems Biology Knowledgebase (KBase), Maslov and Pang examined the frequency of usage of crucial bits of genetic code in the metabolic processes of 500 bacterial species and found a surprising similarity with the frequency of installation of 200,000 Linux packages on more than 2 million individual computers. Linux is an open source software collaboration that allows designers to modify source code to create programs for public use.
The most frequently used components in both the biological and computer systems are those that allow for the most descendants. That is, the more a component is relied upon by others, the more likely it is to be required for full functionality of a system.
It may seem logical, but the surprising part of this finding is how universal it is. "It is almost expected that the frequency of usage of any component is correlated with how many other components depend on it," said Maslov. "But we found that we can determine the number of crucial components – those without which other components couldn't function – by a simple calculation that holds true both in biological systems and computer systems."
For both the bacteria and the computing systems, take the square root of the number of interdependent components and you can find the number of key components that are so important that not a single other piece can get by without them.
Maslov's finding applies equally to these complex networks because they are both examples of open access systems with components that are independently installed. "Bacteria are the ultimate BitTorrents of biology," he said, referring to a popular file-sharing protocol. "They have this enormous common pool of genes that they are freely sharing with each other. Bacterial systems can easily add or remove genes from their genomes through what's called horizontal gene transfer, a kind of file sharing between bacteria," Maslov said.
The same goes for Linux operating systems, which allow free installation of components built and shared by a multitude of designers independently of one another. The theory wouldn't hold true for, say, a Windows operating system, which only runs proprietary programs.
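Just to make the quoted mechanics concrete for myself, here's a toy sketch in Python (entirely my own construction - the component names and the graph are made up, and the paper's real analysis is surely more involved than the articles' summary):

# Toy dependency graph: each component lists what it directly depends on.
deps = {
    "libc":   [],
    "ssl":    ["libc"],
    "zlib":   ["libc"],
    "http":   ["ssl", "zlib"],
    "editor": ["libc"],
    "webapp": ["http"],
    "backup": ["zlib"],
}

def transitive_deps(comp):
    """Everything comp needs, directly or indirectly."""
    seen, stack = set(), list(deps[comp])
    while stack:
        d = stack.pop()
        if d not in seen:
            seen.add(d)
            stack.extend(deps[d])
    return seen

# Count how many components rely on each one. The most-relied-upon
# components are the ones the system can't function without -- the
# software analogue of essential genes.
reliers = {c: sum(c in transitive_deps(o) for o in deps) for c in deps}
print(sorted(reliers.items(), key=lambda kv: -kv[1]))
# libc comes out on top: all six other components ultimately need it.

Nothing fancy, but it shows the shape of the claim: the count of things that depend on a component is what makes it unremovable, and by the articles' rule only about sqrt(N) of N interdependent components end up in that category.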
Secondly, using technology as a lens for understanding life is a helpful perspective, and one that might reveal greater insights as time progresses.
---
It also presents an interesting idea, in my mind. In most scifi, the worry about artificial life is that it'll far more rapidly outstrip the progress and development of any organically-evolved species. However, these two articles make me suspect otherwise - rapid code adaptation and evolution could be viewed as the DNA of synthetic life at a very basic level: while it can adapt and change quickly on the small scale, it's the climb up the scales of size and complexity that becomes the problem. It would be a real trip, though, to see a planet where robots of various sorts fill the niches of an artificial food chain and ecology, having arrived there through adaptation over time.
Such a petri dish of a world would, I think, give rise to some very interesting sapients.
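For fun, a throwaway Python sketch of that petri dish (my own toy, not anything from the paper - genomes swap "code" from a common pool, BitTorrent-style, but a few core genes are too load-bearing to ever drop):

import random

random.seed(42)

POOL = [f"g{i:02d}" for i in range(30)]   # the shared gene pool
CORE = set(POOL[:3])                      # genes everything depends on

# Each genome starts with the core plus a random handful of optional genes.
genomes = [CORE | set(random.sample(POOL[3:], 8)) for _ in range(50)]

for _ in range(2000):
    receiver, donor = random.choice(genomes), random.choice(genomes)
    receiver.add(random.choice(sorted(donor)))    # horizontal transfer
    losable = sorted(receiver - CORE)
    if losable:
        receiver.discard(random.choice(losable))  # core genes can't be lost

# Gene frequency across the population: core genes sit at 100% (by
# construction, since nothing can drop them), optional genes drift to
# a spread of lower frequencies.
freq = {g: sum(g in gen for gen in genomes) / len(genomes) for g in POOL}
for g, f in sorted(freq.items(), key=lambda kv: -kv[1])[:5]:
    print(g, f)

Nothing deep, but it reproduces the flavor of the result: the genes nothing can live without end up everywhere, and everything else settles at its own frequency.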