Tomasetti, Vogelstein and co-authors recently published another informative paper on somatic mutations and cancer, but it got lost amid the cacophony surrounding their Science paper. This paper provides a lot more detail on their mathematical model.
Cancer arises through the sequential accumulation of mutations in oncogenes and tumor suppressor genes. However, how many such mutations are required for a normal human cell to progress to an advanced cancer? The best estimates for this number have been provided by mathematical models based on the relation between age and incidence. For example, the classic studies of Nordling [Nordling CO (1953) Br J Cancer 7(1):68–72] and Armitage and Doll [Armitage P, Doll R (1954) Br J Cancer 8(1):1–12] suggest that six or seven sequential mutations are required. Here, we describe a different approach to derive this estimate that combines conventional epidemiologic studies with genome-wide sequencing data: incidence data for different groups of patients with the same cancer type were compared with respect to their somatic mutation rates. In two well-documented cancer types (lung and colon adenocarcinomas), we find that only three sequential mutations are required to develop cancer. This conclusion deepens our understanding of the process of carcinogenesis and has important implications for the design of future cancer genome-sequencing efforts.
Speaking of criticism about the Science paper, the biggest part of the nonsense comes from Yaniv Erlich, a poorly trained technician at Broad Institute, who keeps harping on their use of log-log distribution. Folks, the log-log distribution has been used in the cancer literature on somatic mutations since the 1950s, such as in these two classic papers – A New Theory on the Cancer-inducing Mechanism – C. O. Nordling and The Age Distribution of Cancer and a Multi-stage Theory of Carcinogenesis – P. Armitage and R. Doll. Everyone knows that it represents a y=x^a type of mathematical relationship, including Nordling, who mentioned it in his paper, and Tomasetti et al., who reproduced the math of Armitage and Doll in their PNAS paper.
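To make the y=x^a point concrete, here is a minimal sketch (a toy, with an arbitrary scale constant and an age range of our own choosing, not the actual incidence data) of why a straight line on a log-log plot encodes the number of mutational stages in the Armitage-Doll model: if k sequential mutations are required, incidence grows roughly as age^(k-1), so the slope of log(incidence) against log(age) estimates k-1.

```python
import math

# Armitage-Doll: if k sequential mutations are needed, incidence rises
# roughly as age^(k-1), so log(incidence) vs log(age) is a straight line
# whose slope estimates k-1.  A toy check with k = 7 (six or seven hits):
k = 7
ages = range(40, 80)
incidence = [1e-12 * t ** (k - 1) for t in ages]  # arbitrary scale constant

# ordinary least-squares slope on the log-log points
xs = [math.log(t) for t in ages]
ys = [math.log(i) for i in incidence]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)

print(round(slope, 3))  # recovers k - 1 = 6
```

This is exactly the fit Nordling and Armitage-Doll performed on real incidence data; their observed slope of about six is what suggested six or seven sequential mutations.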
We believe Yaniv Erlich should stick to showing his ignorance on read coherence rather than diversifying into many different areas.
Readers may recall our post about Rayan Chikhi, Guillaume Rizk, Dominique Lavenier and their collaborators converting their efficient programs like Minia into an entire library with useful modules. Now others are building on top of GATB and LoRDEC is one success story. You can access the paper at this link (h/t: E. Rivals).
Motivation: PacBio single molecule real-time sequencing is a third-generation sequencing technique producing long reads, with comparatively lower throughput and higher error rate. Errors include numerous indels and complicate downstream analysis like mapping or de novo assembly. A hybrid strategy that takes advantage of the high accuracy of second-generation short reads has been proposed for correcting long reads. Mapping of short reads on long reads provides sufficient coverage to eliminate up to 99% of errors, however, at the expense of prohibitive running times and considerable amounts of disk and memory space.
Results: We present LoRDEC, a hybrid error correction method that builds a succinct de Bruijn graph representing the short reads, and seeks a corrective sequence for each erroneous region in the long reads by traversing chosen paths in the graph. In comparison, LoRDEC is at least six times faster and requires at least 93% less memory or disk space than available tools, while achieving comparable accuracy.
Availability and implementation: LoRDEC is written in C++, tested on Linux platforms and freely available at http://atgc.lirmm.fr/lordec.
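The core idea (index the accurate short reads in a de Bruijn graph, then replace each erroneous region of a long read with a sequence spelled by a graph path between its solid flanking k-mers) can be caricatured in a few lines of Python. This is only a toy illustration of the strategy, with a tiny k, breadth-first search and invented example reads; it is not LoRDEC's actual algorithm, which uses a succinct GATB graph and bounded alignment of candidate paths.

```python
# Toy sketch of the hybrid-correction idea behind LoRDEC (not its actual
# algorithm or API): index short reads in a de Bruijn graph, then replace an
# erroneous stretch of a long read with a graph path that connects the
# accurate flanking k-mers.
from collections import defaultdict

K = 5

def kmers(seq, k=K):
    return (seq[i:i + k] for i in range(len(seq) - k + 1))

def build_graph(short_reads, k=K):
    graph = defaultdict(set)
    for read in short_reads:
        for km in kmers(read, k):
            graph[km[:-1]].add(km[1:])  # edge via (k-1)-mer overlap
    return graph

def bridge(graph, left, right, max_len=30):
    # breadth-first search for a sequence spelled by a graph path that
    # starts with the solid k-mer `left` and ends with the solid k-mer `right`
    frontier = [left]
    while frontier:
        nxt = []
        for path in frontier:
            if path != left and path.endswith(right):
                return path
            if len(path) >= max_len:
                continue
            for succ in graph[path[-(K - 1):]]:
                nxt.append(path + succ[-1])
        frontier = nxt
    return None

short_reads = ["ACGTACGGATCCA", "CGGATCCATTG"]
graph = build_graph(short_reads)
# a long read with an error-riddled stretch between the solid anchors
# ACGTA ... TCCAT: the bridge spells the corrected middle
corrected = bridge(graph, "ACGTA", "TCCAT")
print(corrected)  # ACGTACGGATCCAT
```

The real method's engineering wins come from the succinct graph representation (hence the 93% memory saving over mapping-based correctors) and from bounding how far the path search may wander, which is what keeps the traversal tractable on genome-scale data.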
Readers may also find the following posts relevant.
Very Efficient Hybrid Assembler for PacBio Data
Cerulean: A Hybrid Assembly using High Throughput Short and Long Reads
This paper was posted on arXiv in early 2014, but we forgot to mention it here. Ruibang Luo (one of the authors) mentioned that the paper is now accepted.
Background: Short-read aligners have recently gained a lot of speed by exploiting the massive parallelism of GPUs. An emerging alternative to the GPU is the Intel MIC; supercomputers like Tianhe-2, currently at the top of the TOP500 list, are built with 48,000 MIC boards to offer ~55 PFLOPS. The CPU-like architecture of MIC allows CPU-based software to be parallelized easily; however, the performance is often inferior to GPU counterparts, as an MIC board contains only ~60 cores (while a GPU board typically has over a thousand cores). Results: To better utilize MIC-enabled computers for NGS data analysis, we developed a new short-read aligner, MICA, that is optimized in view of MIC's limitations and the extra parallelism inside each MIC core. Experiments on aligning 150bp paired-end reads show that MICA using one MIC board is 4.9 times faster than BWA-MEM (using 6 cores of a top-end CPU), and slightly faster than SOAP3-dp (using a GPU). Furthermore, MICA's simplicity allows very efficient scale-up when multiple MIC boards are used in a node (3 cards give a 14.1-fold speedup over BWA-MEM). Summary: MICA can be readily used by MIC-enabled supercomputers for production purposes. We have tested MICA on Tianhe-2 with 90 WGS samples (17.47 Tera-bases), which can be aligned in an hour using fewer than 400 nodes. MICA has impressive performance even though the current MIC is at its initial stage of development (the next generation of MIC has been announced for release in late 2014).
Readers may remember our commentary about the positivity lady.
Tragedy of the Day: PNAS Got Duped by Positivity Lady !!
In 2005, she wrote a paper to link human happiness with nonlinear dynamics and came up with a precise ratio of 2.9013 (critical positivity ratio) to improve life !!
The critical positivity ratio (also known as the Losada ratio or the Losada line) is a largely discredited concept in positive psychology positing an exact ratio of positive to negative emotions which distinguishes “flourishing” people from “languishing” people. The ratio was proposed by Marcial Losada and psychologist Barbara Fredrickson, who identified a ratio of positive to negative affect of exactly 2.9013 as separating flourishing from languishing individuals in a 2005 paper in American Psychologist. The concept of a critical positivity ratio was widely embraced by both academic psychologists and the lay public; Fredrickson and Losada’s paper was cited nearly 1,000 times, and Fredrickson wrote a popular book expounding the concept of “the 3-to-1 ratio that will change your life”. Fredrickson wrote: “Just as zero degrees Celsius is a special number in thermodynamics, the 3-to-1 positivity ratio may well be a magic number in human psychology.”
In 2013, the critical positivity ratio aroused the skepticism of Nick Brown, a graduate student in applied positive psychology, who felt that the paper’s mathematical claims underlying the critical positivity ratio were fundamentally flawed. Brown collaborated with physicist Alan Sokal and psychologist Harris Friedman on a re-analysis of the paper’s data. They found that Fredrickson and Losada’s paper contained “numerous fundamental conceptual and mathematical errors”, as did Losada’s earlier work on positive psychology, which completely invalidated their claims. Losada declined to respond to the criticism, indicating that he was too busy running his consulting business. Fredrickson wrote a response in which she conceded that the mathematical aspects of the critical positivity ratio were “questionable” and that she had “neither the expertise nor the insight” to defend them, but she maintained that the empirical evidence was solid. Brown and colleagues, whose response was published the next year, maintain that there is no evidence for the critical positivity ratio whatsoever.
That paper got retracted eight years later, when a group of scientists, including the well-known physicist Alan Sokal, called out her BS. By then, the positivity lady had moved on to her new venture involving genomics and, believe it or not, bioinformatics to show the effect of positivity on life. Our earlier blog post linked above was about another nonsensical claim of hers that used gene expression analysis to show the impact of purpose in life. Last year, we published a paper in PNAS to again show that her claims were all meaningless.
A critical reanalysis of the relationship between genomics and well-being
This article critically reanalyzes the work of Fredrickson et al. [Fredrickson BL, et al. (2013) Proc Natl Acad Sci USA 110(33):13684–13689], which claimed to show that distinct dimensions of psychological well-being are differentially correlated with levels of expression of a selection of genes associated with distinct forms of immune response. We show that not only is Fredrickson et al.’s article conceptually deficient, but more crucially, that their statistical analyses are fatally flawed, to the point that their claimed results are in fact essentially meaningless. We believe that our findings may have implications for the reevaluation of other published genomics research based on comparable statistical analyses and that a variant of our methodology might be useful for such a reevaluation.
Today we are shocked to learn that NIH is now funding her ‘proven-by-genetics’ method to cure human beings !! This grant appears to have been approved after the publication of our paper. Therefore, NIH is completely ignoring all criticism to allow this disgusting junk science to proceed.
Here is part of the abstract of the grant application –
An innovative upward spiral theory of lifestyle change positions warm and empathic emotional states as key pathways to unlocking the body’s inherent plasticity to reverse entrenched biological risk factors. The PI’s team has identified an affective intervention – the ancient practice of loving-kindness meditation (LKM) – that produces salubrious biological effects in healthy midlife adults. The innovation of the present study lies in testing this affective intervention in a sample of midlife adults on poor health trajectories by virtue of having low childhood SES plus present-day pathogenic behavioral tendencies (i.e., impulsivity and mistrust). A dual-blind placebo-controlled randomized controlled trial (RCT) is designed to provide proof of principle that early-established biological risk factors are mutable, not permanent.
If you are doing real science and cannot get NIH grant, please contact the grant manager Nielsen Lisbeth at National Institute of Aging to stop this nonsense. You can find her email address at this link.
NIH director Francis Collins wrote a new article in JAMA to sell the latest boondoggle of ‘precision medicine’.
Exceptional Opportunities in Medical Science – A View From the National Institutes of Health
Quite ironically, he made a strong case for his resignation from NIH and focusing on more productive activities (such as reading informative JAMA articles he failed to read).
Collins wrote the following text in the introduction –
As the world’s largest source of biomedical research funding, the US National Institutes of Health (NIH) has been advancing understanding of health and disease for more than a century. Scientific and technological breakthroughs that have arisen from NIH-supported research account for many of the gains that the United States has seen in health and longevity.
For example, an infant born today in the United States can look forward to an average lifespan of about 79 years—nearly 3 decades longer than one born in 1900.
The sense you get from the above text is that NIH helped greatly in improving life expectancy of Americans. The rest of the article suggests that NIH should be funded more to continue doing its great work.
But how is NIH doing under Collins? Let us see how much US life expectancy at birth has improved since Collins joined NIH in 1993 to lead the human genome project. The answer comes from a figure in another important JAMA article Collins failed to read (“The Anatomy of Health Care in the United States”). We cannot access the original article, but the relevant figure is available from The Incidental Economist blog.
As you can see, the gap between the life expectancy of Americans and other OECD residents has been increasing steadily ever since Collins started to lead HGP/NHGRI/NIH. The USA is falling behind.
Since NIH claimed credit for the gains in life expectancy since 1900, by the same logic this is the clearest admission of the failure of NIH under Francis Collins for over two decades.
Readers may also enjoy a good article written by Mike Eisen calling out the bullshit of Francis Collins –
NIH Director Francis Collins’ ridiculous “We would have had an Ebola vaccine if the NIH were fully funded”
The guy who turned Libya into a prosperous democracy (not), punished the evil Wall Street bankers (not), made the Middle East safe (not), gave free health insurance to everyone (not) and forced Afghan jihadis to hide back in caves (not) is now on to his new venture – ‘personalized medicine’. Thanks to the PR team, this also has a new name – ‘precision medicine’.
Tonight, I’m launching a new Precision Medicine Initiative to bring us closer to curing diseases like cancer and diabetes — and to give all of us access to the personalized information we need to keep ourselves and our families healthier.
Here is the touted success story –
I want the country that eliminated polio and mapped the human genome to lead a new era of medicine — one that delivers the right treatment at the right time. In some patients with cystic fibrosis, this approach has reversed a disease once thought unstoppable.
How good is that success story? Vox.com has some numbers.
All these treatments are still incredibly costly
There are big barriers here between the dream of personalized medicine and the reality. For the most part, the science of genetics just isn’t refined enough to help most patients, and developing targeted therapies is hugely expensive and time-consuming.
For example, Bill Gardner at the Incidental Economist ran some numbers on the promising cystic fibrosis therapy: “It took 24 years and tens of millions of dollars to get from the discovery of the CFTR [the particular genetic mutation that causes CF in some people] to the FDA approval of a drug. Moreover, this drug was designed for a mutation found in only a small fraction of the population of an already rare disease.”
Gardner noted that the drug costs about $300,000 per year, not only because of the manpower and years of research behind it, but because the market for the drug is small: “Precisely because the treatments are targeted at phenomena at the level of specific harmful mutations, they are not just personalized but practically bespoke, and correspondingly pricey.”
But do facts matter any more, when a country stands far apart from others?
[Figure from Calamities of Nature website and Vox.com].
An interesting use of Arduino –
Readers will enjoy a thoughtful blog post from professor Ken Weiss in response to – “Let’s Discuss – Is it Time to Shut Down NHGRI?”. With his permission, we are reblogging it entirely below, but please feel free to comment in his blog.
As an aside, while discussing this topic privately with other scientists, I noticed an attitude that there is no harm in NHGRI wasting money as long as it also funds good projects (i.e. their projects). In their (implied) view, the money comes from taxpayers, who are born suckers anyway. I do not agree with this notion, and can argue that NHGRI’s funding of poor quality science actually damages good science. I will share one such example in the following blog post.
Reblogged from The Mermaid’s Tale blog.
The NIH-based National Human Genome Research Institute (NHGRI) has for a long time been funding the Big Data kind of science that is growing like mushrooms on the funding landscape. Even if overall funding is constrained, and even if this also applies to the NHGRI (I don’t happen to know), the sequestration of funds in too-big-to-stop projects is clear. Even Francis Collins and some NIH efforts to reinvigorate individual-investigator R01 awards don’t really seem to have stopped the grab for Big Data funds.
That’s quite natural. If your career, status, or lab depends on how much money you bring into your institution, or how many papers you publish, or how many post-docs you have in your stable, or your salary and space depend on that, you will have to respond in ways that generate those score-counting coups. You’ll naturally exaggerate the importance of your findings, run quickly to the public news media, and do whatever other manipulations you can to further your career. If you have a big lab and the prestige and local or even broader influence that goes with that, you won’t give that up easily so that others, your juniors or even competitors can have smaller projects instead. In our culture, who could blame you?
But some bloggers, Tweeters, and Commenters have been asking if there is a solution to this kind of fund sequestration, largely reserved (even if informally) for the big usually private universities. The arguments have ranged from asking if the NHGRI should be shut down (e.g., here) to just groping for suggestions. Since many of these questions have been addressed to me, I thought I would chime in briefly.
First, a bit of history or perspective, as informally seen over the years from my own perspective (that is, not documented or intended to be precise, but a broad view as I saw things):
The NHGRI was located administratively where it was for reasons I don’t know. Several federal institutes were supporting scientific research. NIH was about health, and health ‘sells’, and understandably a lot of funding is committed to health research. It was natural to think that genome sequences and sciences would have major health implications, if the theory that genes are the fundamental causal elements of life was in fact true. Initially James Watson, co-discoverer of DNA’s structure, and perhaps others advocated the effort. He was succeeded by Francis Collins, who is a physician and clever politician.
However, there was competition for the genome ‘territory’, at least with the Atomic Energy Commission. I don’t know if NSF was ever in the ‘race’ to fund genomic research, but one driving force at the time was the fear of mutations that atomic radiation (therapeutic, from wars, diagnostic tests, and weapons fallout) generated. There was also a race with the private sector, notably Celera as a commercial competitor that would privatize the genome sequence. Dr Collins prominently, successfully, and fortunately defended the idea of open and free public access. The effort was seen as important for many reasons, including commercial ones, and there were international claimants in Japan, the UK, and perhaps elsewhere, that wanted to be in on the act. So the politics were rife as well as the science, understandably.
It is possible that only with the health-related promises was enough funding going to be available, although nuclear fears about mutations and the Cold War probably contributed, along with the usual less savory self-interest, to the AEC’s interests.
Once a basic human genome sequence was available, there was no slowing the train. Technology, including public and private innovation, promised much quicker sequencing in the future, which quickly became available even to ordinary labs (like mine, at the time!). And once the Genome Institute (and other places such as the Sanger Centre in Britain and centers in Japan, China, and elsewhere) were established, they weren’t going to close down! So other sequences entered the picture: microbes, other species, and so on.
It became a fad and an internecine competition within NIH. I know from personal experiences at the time that program managers felt the need to do ‘genomics’ so they would be in on the act and keep their budgets. They had to contribute funds, in some way I don’t recall, to the NHGRI’s projects or in other ways keep their portfolios by having genomics as part of this. -Omics sprung up like weeds, and new fields such as nutrigenomics, cancer genomics, microbiomics and many more began to pull in funding, and institutes (and the investigators across the country) hopped aboard. Imitation, especially when funds and current fashion are involved, is not at all a surprise, and efficiency or relative payoff in results took the inevitable back seat: promises rather than deliveries naturally triumphed.
In many ways this has led to the current era of exhaustively enumerative Big Data: a return to 17th-century induction. This has to do not just with competition for resources, but with a changed belief system also spurred by computing power: just sample everything and pattern will emerge!
Over the decades the biomedical (and to some lesser extent biological) university establishment grew on the back of the external funding which was so generous for so long. But it has led to a dependency. Along with exponential growth in the number of competitors, hierarchies of elite research groups developed–another natural human tendency. We all know the career limitations that are resulting from this. And competition has meant that deans and chairs expect investigators always to be funded, in part because there aren’t internal funds to keep labs running in the absence of grants. It’s been a vicious self-reinforcing circle over the past 50 years.
As hierarchies built, private donors were convinced (conned?) into believing that their largesse would lead to the elimination of target diseases (‘target’ often meaning those in the rich donors’ families). Big Data today is the grandchild of the major projects, like the Manhattan Project in WWII, that showed that some kinds of science could be done on a large scale. Many, many projects during past decades showed something else: Fund a big project, and you can’t pull the plug on it! It becomes too entrenched politically.
The precedents were not lost on investigators! Plead for bigger, longer studies, with very large investments, and you have a safe bet for decades, perhaps your whole career. Once started, cost-benefit analysis has a hard time paring back, much less stopping such projects. There are many examples, and I won’t single any of them out. But after some early splash, by and large they have got to diminishing returns but not got to any real sense of termination: too big to kill.
This is to some extent the same story with the NHGRI. The NIH has got too enamored of Big Data to keep the NHGRI as limited or focused as perhaps it should have been (or should be). In a sense it became an openly anti-focused-research sugar daddy (Dr Collins said, perhaps officially, that NHGRI didn’t fund ‘hypothesis-based research’) based on pure inductionism and reductionism, so it did not have to have well-posed questions. It basically bragged about not being focused.
This could be a change in the nature of science, driven by technology, that is obsolescing the nature of science that was set in motion in the Enlightenment era, by the likes of Galileo, Newton, Bacon, Descartes and others. We’ll see. But the socioeconomic, political sides of things are part of the process, and that may not be a good thing.
Will focused, hypothesis-based research make a comeback? Not if Big Data yields great results, but decades of it, no matter how fancy, have not shown the major payoff that has been promised. Indeed, historians of science often write that the rationale, that if you collect enough data its patterns (that is, a theory) will emerge, has rarely been realized. Selective retrospective examples don’t carry the weight often given them.
There is also our cultural love affair with science. We know very clearly that many things we might do at very low cost would yield health benefits far exceeding even the rosy promises of the genomic lobby. Most are lifestyle changes. For example, even geneticists would (privately, at least) acknowledge that if every ‘diabetes’ gene variant were fixed, only a small fraction of diabetes cases would be eliminated. The recent claim that much of cancer is due just to bad mutational luck has raised lots of objections–in large part because Big Data researchers’ business would be curtailed. Everyone knows these things.
What would it take to kill the Big Data era, given the huge array of commercial, technological, and professional commitments we have built, if it doesn’t actually pay off on its promises? Is focused science a nostalgic illusion? No matter what, we have a major vested interest on a huge scale in the NHGRI and other similar institutes elsewhere, and grantees in medical schools are a privileged, very well-heeled lot, regardless of whether their research is yielding what it promises.
Or, put another way, where are the areas in which Big Data of the genomic sort might actually pay, and where is this just funding-related institutional and cultural momentum? How would we decide?
So what to do? It won’t happen, but in my view the NHGRI does not, and never did, belong properly in NIH. It should have been in NSF, where basic science is done. Only when clearly relevant to disease should genomics be funded for that purpose (and by NIH, not NSF). It should be focused on soluble problems in that context.
NIH funds the greedy maw of medical schools. The faculty don’t work for the university, but for NIH. Their idea of ‘teaching’ often means giving 5-10 lectures a year that mainly consist of self-promoting reports about their labs, perhaps the talks they’ve just given at some meeting somewhere. Salaries are much higher than at non-medical universities – but in my view grants simply should not pay faculty salaries. Universities should. If research is part of your job’s requirements, it’s their job to pay you. Grants should cover research staff, supplies and so on.
Much of this could happen (in principle) if the NHGRI were transferred to NSF and had to fund on an NSF-level budget policy. Smaller amounts, to more people, on focused basic research. The same total budget would go a lot farther, and if it were restricted to non-medical-school investigators there would be the additional payoff that most of them actually teach, so that they disseminate the knowledge to large numbers of students who can then go out into the private sector and apply what they’ve learned. That’s an old-fashioned, perhaps nostalgic(?) view of what being a ‘professor’ should mean.
Major pare-backs of grant size and duration could be quite salubrious for science, making it more focused and in that sense accountable. The employment problem for scientists could also be ameliorated. Of course, in a transition phase, universities would have to learn how to actually pay their employees.
Of course, it won’t happen, even if it would work, because it’s so against the current power structure of science. And although Dr Collins has threatened to fund more small RO1 grants it isn’t clear how or whether that will really happen. That’s because there doesn’t seem to be any real will to change among enough people with the leverage to make it happen, and the newcomers who would benefit are, like all such grass-roots elements, not unified enough.
These are just some thoughts, or assertions, or day-dreams about the evolution of science in the developed world over the last 50 years or so. Clearly there is widespread discontent, and clearly there is large funding going on with proportionately few results. Major results in biomedical areas can’t be expected overnight. But we might expect research to have more accountability.
Adam Eyre-Walker is an accomplished evolutionary biologist, a breed of scientists out of favor in these days of NHGRI-dominated ENCODE big-data ‘science’. He has not received any grant for about ten years and essentially self-funds his science.
However, stating that is not enough for the money-hungry open-access publisher PLoS, whose CEO earns over $500K. The journal now wants him to send bank statements from his personal accounts to prove that he is really poor.
Oh well. He should feel grateful that he is mistreated by a journal that considers increasing diversity among academic editors and advisory board members as the most important problem in the world. It could have been far worse (such as getting his work published somewhere else).
Public Library of Science (PLoS) fee waivers
Over the last few years I’ve published quite a few papers in PLoS journals. In almost all cases I have been lucky enough to receive a complete or partial fee waiver because I’ve had almost no grant support for the last ten years. I have been very grateful for this support. Unfortunately times have changed. Previously you simply had to assert that you did not have the funds to pay the publication charge and you were granted a waiver. Now you have to prove that you don’t have the funds, and this includes proving that you cannot personally pay the publication fee – this requires you to submit bank statements and explain where your money is being spent. I was quite shocked when I found this out recently (and if you don’t believe me, see the attached email exchange below).

Besides the obvious invasion of privacy, I have to ask myself whether the paper, which I want to publish, is worth the £650 ($1000) that I would need to get it into print (assuming of course that it gets accepted)? Why am I publishing it anyway? Is it to advance science or simply the careers of me and my co-authors? If it is the latter then it seems reasonable for me to pay, but if it is an attempt to advance science and knowledge then I’m not so sure I should be paying. And yes, I could pay for this paper to be published, but I couldn’t pay for every paper I write; this would amount to several thousand pounds/dollars a year, which would comprise a serious chunk of my salary. This policy, of expecting an author to pay, seems contrary to the original PLoS One policy of publishing all science and could distort the science that is being published.
Email exchange (I have removed the identity of my correspondee and replaced their name with PLOS):
Dear Dr. Eyre-Walker,
Thank you for submitting your manuscript to PLOS and applying for Publication Fee Assistance.
You indicated that you will be sending supporting financial documentation with your application. We have not yet received these supporting documents. Examples include grant award with budget information, signed letterhead from your institutional official with funding and budget details, expired grant award information, personal financial statements or other financial documentation that supports your financial hardship explanations.
Please send them as soon as possible so we can append them to your application.
Should we receive insufficient evidence to verify demonstrated financial hardship, we will not be able to approve your PFA application. We will keep your application open until 1/2. Please send them as soon as possible to [email protected]
I’m not sure what form this supporting information should take, given that I have no grants. The last time I was funded by a grant was 2009. Since then I have had two students funded by UK research councils and two Marie Curie fellows funded by the European Community, but I receive no funds outside those associated with these projects – the paper I have submitted to PLoS One was written with two undergraduate students. The only evidence I can offer of this lack of grants is the lack of grant acknowledgements in papers I have published in the last few years. A full list of my publications including PDFs can be found at http://www.lifesci.susx.ac.uk/home/Adam_Eyre-Walker/Website/Publications.html and my current CV, with a list of my past funding, is attached.
I fully appreciate that PLoS One needs money to function, and I am also very grateful for the complete and partial fee waivers that PLoS has given me in the past, but I simply have no funds.
Dear Dr. Eyre-Walker,
Thank you for submitting your manuscript to PLOS and applying for Publication Fee Assistance. We have received your signed self-attestation, but need additional supporting documentation to validate your financial hardship. Our former system of granting waivers has been replaced by a formal application process and we appreciate your assistance in demonstrating your financial hardship.
Please provide supporting financial documents (e.g. grant and remaining grant balances, bank statements) to help us determine your financial assistance. Send us your information by 1/2. Without financial documents, we can only extend you a 20% fee waiver. Please advise.
Should we receive insufficient evidence to verify demonstrated financial hardship, we will not be able to approve the PFA assistance amount requested. We appreciate your assistance in demonstrating your financial need.
I cannot provide a grant statement for a grant that ended over 5 years ago; the university changed their accounting system and terminated grants were not transferred. Furthermore, neither this grant, nor any other grant I have ever had, would provide support for this project, which is unrelated to any of my former research (except for one paper published by PLoS Biology, which was incidentally granted a full waiver).
You suggest that I should provide details of my personal financial position, and from this I presume that if I am judged to be personally wealthy enough, then I will be required to pay the publication fee. Is this the case?
Thank you for your message. Yes, we do require that you send financial documentation of your hardship. The Publication Fee Assistance program launched in April 2014 and is a new financial assistance program that replaces our old system of granting waivers with a formal application. We appreciate your help in demonstrating your financial situation so that we can ensure the assistance is granted to authors with a demonstrated need. Without financial documents, we can only extend you a 20% fee waiver. Please advise.
Me (I have merged two emails here):
Since my case seems to depend upon my own personal finances I was wondering if you could tell me how these are judged. How small does my bank balance have to be before I am judged to be personally unable to pay?
I was also wondering whether these policies regarding fee waivers are being applied universally across all PLoS journals, or whether each journal has its own policies.
We apply the same criteria for the PFA application across all journals. There is no minimum, however the purpose is for you to help us understand your financial hardship. Others have shown documentation on what they are making payments towards which puts a strain on their personal budget. Either is acceptable to help us demonstrate your financial need in greater detail.
Professor Ken Weiss once proposed that the money allocated for NHGRI, the human genome research division of NIH, should be given to NSF and be spent on basic research on genomes. I think that is a very good idea, because there is nothing unusual in the human genome that is not found in other genomes. Some parts of the human genome (information processing blocks) are conserved all the way to bacterial genomes. Some other parts came in after the origin of eukaryotes. Many developmental genes are conserved in almost all multicellular eukaryotes and definitely in vertebrates. Ciliary genes are conserved all the way to Chlamydomonas.
Moreover, spending money to study only the human genome has the bad effect of producing worthless ‘discoveries’ that are not consistent with evolutionary principles. In the ENCODE fiasco, a large group of researchers got a large amount of money from NHGRI to show that the human genome was unusual, and they ended up ‘proving’ 80% functionality of the human genome!!
On the immortality of television sets: “function” in the human genome according to the evolution-free gospel of ENCODE
A recent slew of ENCyclopedia Of DNA Elements (ENCODE) Consortium publications, specifically the article signed by all Consortium members, put forward the idea that more than 80% of the human genome is functional. This claim flies in the face of current estimates according to which the fraction of the genome that is evolutionarily conserved through purifying selection is less than 10%. Thus, according to the ENCODE Consortium, a biological function can be maintained indefinitely without selection, which implies that at least 80 – 10 = 70% of the genome is perfectly invulnerable to deleterious mutations, either because no mutation can ever occur in these “functional” regions or because no mutation in these regions can ever be deleterious. This absurd conclusion was reached through various means, chiefly by employing the seldom used “causal role” definition of biological function and then applying it inconsistently to different biochemical properties, by committing a logical fallacy known as “affirming the consequent,” by failing to appreciate the crucial difference between “junk DNA” and “garbage DNA,” by using analytical methods that yield biased errors and inflate estimates of functionality, by favoring statistical sensitivity over specificity, and by emphasizing statistical significance rather than the magnitude of the effect. Here, we detail the many logical and methodological transgressions involved in assigning functionality to almost every nucleotide in the human genome. The ENCODE results were predicted by one of its authors to necessitate the rewriting of textbooks. We agree, many textbooks dealing with marketing, mass-media hype, and public relations may well have to be rewritten.
Not that NHGRI learned anything from that exercise, because it appears to find new ways to waste money. Here they are on to a ‘new’ problem solved decades ago. From the recent NIH press release –
NIH grants aim to decipher the language of gene regulation
The National Institutes of Health has awarded grants of more than $28 million aimed at deciphering the language of how and when genes are turned on and off. These awards emanate from the recently launched Genomics of Gene Regulation (GGR) program of the National Human Genome Research Institute (NHGRI), part of NIH.
The GGR program aims to develop new ways for understanding how the genes and switches in the genome fit together as networks.
“There is a growing realization that the ways genes are regulated to work together can be important for understanding disease,” said Mike Pazin, Ph.D., a program director in the Functional Analysis Program in NHGRI’s Division of Genome Sciences.
You gotta be kidding!!
Yes, there is a network involved, but it is a different kind of network. Who is Mike Pazin?
From the NIH website –
Dr. Pazin joined the National Human Genome Research Institute’s (NHGRI’s) Extramural Research Program in 2011. Mike is part of the NHGRI team overseeing the ENCODE project, generating an Encyclopedia of DNA Elements from the human genome, and the modENCODE project, identifying functional elements in the fly and worm genomes. He manages a portfolio of grants in functional genomics. He is also a member of the NHGRI Data Access Committee.
He is an ENCODE guy.
…and who does he give money to? Surprise, surprise – ENCODE leader Mike Snyder, who was the main author of their discredited paper, gets the biggest award to ‘decipher the language of gene regulation’.
Stanford University, Stanford, California, $7.1 million
Principal Investigator: Michael Snyder, Ph.D.
Dr. Snyder and his team will study the development of one type of skin cell (keratinocyte) as it develops from an early stage skin cell into a mature cell. To do this, they will examine the network of genes and pathways that control this developmental change. The results may ultimately have implications for better understanding skin biology and hundreds of skin disorders.
At this point, shutting down NHGRI and transferring funds to NSF will be the only way to end this endless waste of taxpayers’ money.
Also check –
Eric Green, Funding, Authorship, Integrity, and an Answer to @phylogenomics
Eric Green is the Director of the National Human Genome Research Institute at NIH.
The National Human Genome Research Institute and other institutes at NIH provide the bulk of the hundreds of millions of dollars for the ENCODE Project and other Big Science obscenities, such as modENCODE.
Eric Green is an author on many ENCODE papers, including the 2012 idiotic article in Nature, and of many of the press releases that proclaimed that junk DNA does not exist.
Dan Graur has publicly described Eric Green and his ilk as “badly trained technicians.”
Dan Graur’s previous main source of research money was NIH.
It isn’t anymore.
NIH Director Francis Collins’ ridiculous “We would have had an Ebola vaccine if the NIH were fully funded” meme
But what really bothers me the most about this is that, rather than trying to exploit the current hysteria about Ebola by offering a quid-pro-quo “Give me more money and I’ll deliver an Ebola vaccine”, Collins should be out there pointing out that the reason we’re even in a position to develop an Ebola vaccine is because of our long-standing investment in basic research, and that the real threat we face is not Ebola, but the fact that, by having slashed the NIH budget and made it increasingly difficult to have a stable career in science, we’re making it less and less likely that we’ll be equipped to handle all of the challenges to public health that we’re going to face in the future.
Don’t get me wrong. I get what Collins is trying to do. I just think it’s a huge mistake. Every time I see testimony from NIH officials to Congress, they are engaged in this kind of pandering – talking about how concerned they are about [insert pet disease of person asking question] and how, if only they could get more money, we’d be able to make amazing progress. But guess what? It hasn’t worked. The NIH budget is still being slashed. It’s time for the people who run the biomedical research enterprise in this country to make basic research the center of their pitch for funding. Collins had a huge opportunity to do that here, but he blew it.