The readers may take a look at the New York Times article written by Carl Zimmer - "Is Most of Our DNA Garbage?". It features the work of T. Ryan Gregory, whose name should be familiar to those following our evolutionary biology section (e.g., check Evolution of Genome Size Scientific and Religious Analysis). Dr. Gregory's research consists of measuring the genome sizes of various organisms and looking for evolutionary patterns among those sizes. Here is a brief summary of what he found after decades of work.
1. Based on data collected from ~5000 animal genomes so far, genome sizes vary about 7000-fold. That is an enormous range.
2. Animals with the largest genomes include lungfish (~80-120 Gbp), salamanders (of a similar order), sharks (~10-20 Gbp), grasshoppers, flatworms and crustaceans.
3. The only phenotypic correlates of genome size found so far are larger cells and longer cell-division times in organisms with large genomes.
4. Ecological correlation: an emerging trend from animal (and more specifically, crustacean) genome size studies is a positive relationship between genome size and latitude.
5. Correlation with intron size: intron size and genome size are known to be positively correlated between species of Drosophila (Moriyama et al. 1998), within the class of mammals (Ogata et al. 1996), and across eukaryotes in general (Vinogradov 1999).
6. No relationship between genome size and animal complexity has ever been found. Researchers have been looking into this for over four decades.
The entirety of the collected evidence points to only one explanation for large genome size - 'junk DNA'. That means if organisms A and B are evolutionarily close and have similar levels of complexity (as defined by the number of different cell types), and the genome of organism B is much larger than that of organism A, then a large part of organism B's genome consists of nonfunctional DNA. As a good example, fugu and zebrafish are evolutionarily related, but their genome sizes differ greatly - fugu (390 Mb), zebrafish (1.7 Gb). Therefore, the genome of zebrafish likely has ~1.3 Gb more junk DNA than fugu. In fact, humans are evolutionarily not that distant from either of those two fish species, which suggests that the human genome has even more junk DNA. That is the core of Gregory's argument, and nobody has refuted it so far except Francis Collins and the ENCODE clowns funded by him.
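The arithmetic behind that inference is trivial, but a short sketch makes the reasoning explicit (the genome sizes are the approximate figures quoted above):

```python
# Back-of-the-envelope estimate of putative junk DNA from a genome-size
# comparison. Sizes are the approximate figures from the text, in megabases.
fugu_mb = 390        # fugu genome, ~390 Mb
zebrafish_mb = 1700  # zebrafish genome, ~1.7 Gb

# If the two fish have similar complexity, the size difference is a rough
# estimate of the extra nonfunctional DNA carried by the larger genome.
excess_mb = zebrafish_mb - fugu_mb
print(f"Extra putative junk DNA in zebrafish: ~{excess_mb / 1000:.1f} Gb")
```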
In January, Francis Collins, the director of the National Institutes of Health, made a comment that revealed just how far the consensus has moved. At a health care conference in San Francisco, an audience member asked him about junk DNA. "We don't use that term anymore," Collins replied. "It was pretty much a case of hubris to imagine that we could dispense with any part of the genome, as if we knew enough to say it wasn't functional." Most of the DNA that scientists once thought was just taking up space in the genome, Collins said, "turns out to be doing stuff."
Sadly, neither he nor the clowns he backs have provided any solid evidence of this 'doing stuff'. They back away from hyped-up claims when challenged.
John Rinn is one such hype-star featured in the NY Times article. Between 2002 and 2007, my co-authors and I wrote a number of papers showing that noncoding regions of the genome were differentially expressed in several organisms under different conditions; tiling arrays were our mode of measurement in those experiments. However, the main criticism was that those expressed regions were likely transcriptional noise, and one needed to demonstrate actual function to claim they were functional. I did that in 2006 in yeast for one noncoding RNA, and Sid Altman, who received the Nobel Prize for discovering catalytic noncoding RNA, confirmed our work in a later paper. Still, our result showed the function of only one novel RNA, and we could say nothing about the remaining expressed regions.
Neither did John Rinn, our ex-collaborator. In 2009, he published a couple of papers showing the functionality of only one noncoding RNA in the human genome. That was not much of an achievement beyond ours, but he has an amazing talent for hyping things and saying what NHGRI likes to hear. His one functional RNA seemed to have 'proved' the functionality of the entire human genome, and now it is time to search for diseases there as well!
Interestingly, this 'scientific consensus' has had its effect on the corresponding Wikipedia page as well. The page is now full of contrasting scientific and pseudo-scientific claims, and reading it feels like standing in the middle of a war zone. For example, check this paragraph written by a scientifically trained person -
The term “junk DNA” became popular in the 1960s. It was formalized in 1972 by Susumu Ohno, who noted that the mutational load from deleterious mutations placed an upper limit on the number of functional loci that could be expected given a typical mutation rate. Ohno predicted that mammal genomes could not have more than 30,000 loci under selection before the “cost” from the mutational load would cause an inescapable decline in fitness, and eventually extinction. This prediction remains robust, with the human genome containing approximately 20,000 genes.
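Ohno's load argument in the paragraph above can be made concrete with a standard population-genetics approximation: the equilibrium mutational load is L = 1 - e^(-U), where U is the total deleterious mutation rate per genome per generation. The per-locus rate used below (10^-5) is an illustrative assumption, not a figure taken from Ohno's paper:

```python
import math

def mutational_load(n_loci, u_per_locus=1e-5):
    """Equilibrium mutational load L = 1 - exp(-U), with U = n_loci * u.

    u_per_locus is an assumed per-locus deleterious mutation rate,
    chosen here only for illustration.
    """
    total_rate = n_loci * u_per_locus
    return 1 - math.exp(-total_rate)

# Around Ohno's ~30,000-locus limit the load is still modest...
print(f"30,000 loci:  load = {mutational_load(30_000):.2f}")   # ~0.26
# ...but with ten times as many functional loci, nearly every offspring
# carries new deleterious mutations and fitness collapses.
print(f"300,000 loci: load = {mutational_load(300_000):.2f}")  # ~0.95
```

Under these assumed numbers, the jump in load from a tolerable fraction to near-certainty is why a genome cannot keep an unlimited number of loci under selection, and why Ohno's limit sits comfortably next to the ~20,000 protein-coding genes later found in the human genome.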
You learn that a scientist named Ohno came up with the description 'junk DNA' and made a prediction that remains robust. Yet the introduction of the same page says -
Initially, a large proportion of noncoding DNA had no known biological function and was therefore sometimes referred to as “junk DNA”, particularly in the lay press. However, it has been known for decades that many noncoding sequences are functional.
That block makes absolutely no sense. Ohno did not write for the 'lay press', and his claim that a large proportion of noncoding DNA is junk is not invalidated by many noncoding sequences being functional. In fact, tRNA was discovered in the 1960s, and Francis Crick, who hypothesized its existence, is also credited with coming up with the concept of junk DNA, as mentioned in the NY Times piece -
Faced with this paradox, Crick and other scientists developed a new vision of the genome during the 1970s. Instead of being overwhelmingly packed with coding DNA, the genome was made up mostly of noncoding DNA. And, what's more, most of that noncoding DNA was junk - that is, pieces of DNA that do nothing for us.
Therefore, the nonsensical block in Wikipedia was clearly inserted by an ENCODE pseudo-scientist. There is no better evidence of that than the section it leads to -
The Encyclopedia of DNA Elements (ENCODE) project suggested in September 2012 that over 80% of DNA in the human genome “serves some purpose, biochemically speaking”.
Morals of the story -
(i) The emperor has no clothes, but you can afford to say so only if you are living in Canada like T. Ryan Gregory,
(ii) Be careful about trusting ‘science’ discussed in wikipedia.