A large number of NIH-funded parasites waste taxpayers’ money with the excuse that they are working toward improving the health of Americans. Francis Collins, the head of NIH, uses every opportunity to tell everyone how NIH-funded research helps improve the life expectancy of Americans (a flat-out lie). Yet, when research by Deaton and Case uncovered that the life expectancy of Americans of prime age (45-54) was falling, primarily due to rising suicides, Collins and his minions went completely silent.

We have not come across a single major science journal or blog giving enough space to this health epidemic spreading rapidly. And the word epidemic is appropriate given the rising global trend. For example,

New Zealand: Suicide toll reaches highest rate since records kept

Canada: What’s Behind The Surge In Suicide Attempts Among Canada’s Indigenous Population

More than 100 of Attawapiskat’s 2,000 members have tried to end their lives in the last seven months, including 11 youth on Saturday alone.

The following thoughtful essay from ‘The Automatic Earth’ blog shines some light on what is going on in New Zealand.


Finance + Stress = Suicide

_Nelson Lebo:_ Our already horrendous suicide rate hit a new record high last year. The news of New Zealand’s suicide rate did not surprise me when I heard it on the radio earlier this week. Anyone who pays attention to global trends could see this coming. Psychotherapists say we need a wide-ranging review into the mental health system before there are more preventable deaths, reported Newstalk ZB.

At lighter moments I joke that the best thing about living in New Zealand is that you can see worldwide trends that are heading this way, but the worst part is that no-one believes you. This is not a lighter moment. Suicide is a serious issue and one that is growing dramatically among my peer group: white middle-aged men.

The first people to notice the emerging pattern in the United States were Princeton economists Angus Deaton and Anne Case. The New York Times reported on 2nd November, 2015 that the researchers had uncovered a surprising shift in life expectancy among middle-aged white Americans - what traditionally would have been considered the most privileged demographic group on the planet.

The researchers analyzed mountains of data from the Centers for Disease Control and Prevention as well as other sources. As reported by the Times, _they concluded that rising annual death rates among this group are being driven not by the big killers like heart disease and diabetes but by an epidemic of suicides and afflictions stemming from substance abuse: alcoholic liver disease and overdoses of heroin and prescription opioids. The mortality rate for whites 45 to 54 years old with no more than a high school education increased by 134 deaths per 100,000 people from 1999 to 2014._

The most amazing thing about this discovery is that the Princeton researchers stumbled across these findings while looking into other issues of health and disability. But as we hear so often, everything is connected. A month before releasing this finding Dr. Deaton was awarded the Nobel Prize in Economics based on a long career researching wealth and income inequality, health and well-being, and consumption patterns.

The Royal Swedish Academy of Sciences credited Dr. Deaton for contributing significantly to policy planning that has the potential to reduce rather than aggravate wealth inequality. In other words, to make good decisions policy writers need good research based on good data. Too often this is not the case. To design economic policy that promotes welfare and reduces poverty, we must first understand individual consumption choices. More than anyone else, Angus Deaton has enhanced this understanding.

Days before hearing the news about New Zealand’s rising suicide rate I learned of another major finding from demographic researchers in the United States. For the first time in history the life expectancy of white American women had decreased, due primarily to drug overdose, suicide and alcoholism. This point is worth repeating as it marks a watershed moment for white American women. After seeing life expectancies continually extend throughout the history of the nation, the trend has not only slowed but reversed. Data show the slip is only one month, but the fact that it’s a decrease instead of another increase should be taken as a significant milestone.

Please note that the following sentence is not meant in the least to make light of the situation, but is simply stating a fact. The demographic groups that are experiencing the highest rates of drug overdose, suicide and alcoholism are also the most likely to be supporters of Donald Trump in his campaign for the U.S. Presidency. It does not take a Nobel Laureate to observe a high level of distress among white middle-class Americans. Trump simply taps into that angst.

As reported by CBS News, The fabulously rich candidate becomes the hero of working-class people by identifying with their economic distress. That formula worked for Franklin D. Roosevelt in the 1930s. Today, Donald Trump’s campaign benefits from a similar populist appeal to beleaguered, white, blue-collar voters - his key constituency.

I don’t blame most Americans for being angry. That the very architects of the global financial crisis have only become richer and more powerful since they crashed the world economy in 2008 is unforgivable. The gap between rich and poor continues to widen and the chasm has now engulfed white middle-aged workers. As the Pope consistently tells us, wealth and income inequality is the greatest threat to humanity alongside climate change.

Instead of going down the Trump track for the rest of this piece, I’d rather wrap it up by bringing the issue back to Aotearoa (New Zealand) and my small provincial city of Whanganui. To provide some background for international readers, the NZ economy relies significantly on dairy exports and many dairy farmers hold large debts. Dairy prices are known for their volatility, and recently the payouts have dropped below break-even points for many farmers.

Earlier this month Primary Industries Minister Nathan Guy announced that the government would invest $175,000 to study innovative, low cost, high performing farming systems already in place in New Zealand. Stuff.co.nz reported, _The government is set to pick the brains of New Zealand’s top dairy farmers in an effort to help those struggling with the low dairy payout._

That is great news, but the government’s investment in researching the best of the best farmers is a pittance when compared with what is spent addressing issues of depression and suicide prevention among Kiwi farmers. Isn’t this a case of putting the cart ahead of the horse, or treating symptoms instead of causes?

Research shows that financial stress contributes significantly to the increasing suicide rates here and abroad. We know that innovative farmers who use low-input/high-performance systems are more profitable than their conventional farming brethren. Would it then be a stretch to conclude that rates of depression and suicide are much lower among these innovative and profitable farmers? At the same time, research shows that wealth and income inequality in our more urban centres contribute to anti-social behaviours such as crime, domestic abuse and illegal drug usage.

Angus Deaton, the Nobel-winning economist, would argue that in order for policy planners to address these issues effectively they must understand the underlying causes and resultant costs. Thankfully, we do see glimmers of that from central government instead of the usual neoliberal claptrap. Credit must be given to Finance Minister Bill English for his actuarial approach to some social issues rather than the inaccurate dogmatic position often adopted by the right.

But closer to home for me, such enlightened policy planning has yet to reach our city by the awa (river). To start off, the Council’s rates structure is stunningly regressive, clearly taking significantly higher proportions of household wealth from low-income families than from high-income families. If we believe the research in this field (e.g., The Spirit Level), wouldn’t we expect the widening gap between rich and poor to result in even more anti-social behaviour in our city, which already suffers from reputation problems nationwide?

Secondly, the council’s vision documents and long-term plan are nearly devoid of intelligent strategies to address the underlying issues of anti-social behaviour, depression, poor health, and domestic problems that afflict our community. The Council pours mountains of money into an art gallery and arts events while providing token services and events for low-income families.

Will it take our own Trump or Sanders running for office to stimulate a populist revolt against regressive policies that potentially do more harm than good to our community? What will it take for us to finally get it? I first wrote about these issues in our city’s newspaper, the Chronicle, two and a half years ago but, apparently, no one believed me. Welcome to provincial New Zealand!

‘Ancient’ Bene Israel Jews and late-arrived Baghdadi Jews in India started the Bollywood movie industry. Many famous early Indian actresses also came from these communities. This is not common knowledge in India, because those actresses took Muslim (Firoza Begum) or Hindu (Sulochana, Pramila) screen names.


Baghdadi Jews like David Sassoon also played a big role in establishing Bombay as a major trading center.


A new population genetics study looks at the historic roots of the older (and more ‘mysterious’) of those two groups. Bene Israel Jews were at times considered one of the ‘lost tribes’.

The Genetics of Bene Israel from India Reveals Both Substantial Jewish and Indian Ancestry

The Bene Israel Jewish community from West India is a unique population whose history before the 18th century remains largely unknown. Bene Israel members consider themselves as descendants of Jews, yet the identity of Jewish ancestors and their arrival time to India are unknown, with speculations on arrival time varying between the 8th century BCE and the 6th century CE. Here, we characterize the genetic history of Bene Israel by collecting and genotyping 18 Bene Israel individuals. Combining with 486 individuals from 41 other Jewish, Indian and Pakistani populations, and additional individuals from worldwide populations, we conducted comprehensive genome-wide analyses based on FST, principal component analysis, ADMIXTURE, identity-by-descent sharing, admixture linkage disequilibrium decay, haplotype sharing and allele sharing autocorrelation decay, as well as contrasted patterns between the X chromosome and the autosomes. The genetics of Bene Israel individuals resemble local Indian populations, while at the same time constituting a clearly separated and unique population in India. They are unique among Indian and Pakistani populations we analyzed in sharing considerable genetic ancestry with other Jewish populations. Putting together the results from all analyses point to Bene Israel being an admixed population with both Jewish and Indian ancestry, with the genetic contribution of each of these ancestral populations being substantial. The admixture took place in the last millennium, about 19-33 generations ago. It involved Middle-Eastern Jews and was sex-biased, with more male Jewish and local female contribution. It was followed by a population bottleneck and high endogamy, which can lead to increased prevalence of recessive diseases in this population. This study provides an example of how genetic analysis advances our knowledge of human history in cases where other disciplines lack the relevant data to do so.
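The abstract names several standard population-genetic analyses. To make one of them concrete, here is a minimal sketch of Hudson’s FST estimator for a single biallelic SNP; the allele counts below are made-up illustrative numbers, not data from the study, and the paper may well have used a different FST formulation.

```python
def hudson_fst(n1, c1, n2, c2):
    """Hudson's FST estimator for one biallelic SNP.

    n1, n2: number of sampled alleles in populations 1 and 2
    c1, c2: counts of the reference allele in each population
    """
    p1, p2 = c1 / n1, c2 / n2
    # Numerator: squared allele-frequency difference, corrected for
    # finite sample size in each population
    num = (p1 - p2) ** 2 - p1 * (1 - p1) / (n1 - 1) - p2 * (1 - p2) / (n2 - 1)
    # Denominator: between-population heterozygosity
    den = p1 * (1 - p2) + p2 * (1 - p1)
    return num / den

# Hypothetical counts: 60/100 reference alleles in pop 1, 30/100 in pop 2
print(round(hudson_fst(100, 60, 100, 30), 3))  # 0.158
```

In genome-wide studies this per-SNP quantity is typically averaged (as a ratio of sums) over many sites before comparing populations.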

Prior to the 1960s, most computers were analog computers.

World War II era gun directors, gun data computers, and bomb sights used mechanical analog computers. Mechanical analog computers were very important in gun fire control in World War II, the Korean War and well past the Vietnam War; they were made in significant numbers.

The FERMIAC was an analog computer invented by physicist Enrico Fermi in 1947 to aid in his studies of neutron transport.[12] Project Cyclone was an analog computer developed by Reeves in 1950 for the analysis and design of dynamic systems.[13] Project Typhoon was an analog computer developed by RCA in 1952. It consisted of over 4000 electron tubes and used 100 dials and 6000 plug-in connectors to program.[14] The MONIAC Computer was a hydraulic model of a national economy first unveiled in 1949.


In those years, many universities and companies were building analog computers, although there were rare exceptions.

Computer Engineering Associates was spun out of Caltech in 1950 to provide commercial services using the “Direct Analogy Electric Analog Computer” (“the largest and most impressive general-purpose analyzer facility for the solution of field problems”) developed there by Gilbert D. McCann, Charles H. Wilts, and Bart Locanthi.[15][16]

Educational analog computers illustrated the principles of analog calculation. The Heathkit EC-1, a $199 educational analog computer, was made by the Heath Company, USA c. 1960.[17] It was programmed using patch cords that connected nine operational amplifiers and other components.[18] General Electric also marketed an “educational” analog computer kit of a simple design in the early 1960s consisting of a two transistor tone generator and three potentiometers wired such that the frequency of the oscillator was nulled when the potentiometer dials were positioned by hand to satisfy an equation. The relative resistance of the potentiometer was then equivalent to the formula of the equation being solved. Multiplication or division could be performed depending on which dials were considered inputs and which was the output. Accuracy and resolution were limited and a simple slide rule was more accurate; however, the unit did demonstrate the basic principle.
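The nulling principle behind the GE kit can be mimicked digitally: turn a “dial” x until the error signal a·x − b is nulled, at which point x equals b/a. This is only a toy analogy (a bisection search, not a model of the actual oscillator circuit), with made-up numbers.

```python
def solve_by_nulling(a, b, lo=0.0, hi=10.0, tol=1e-6):
    """Find x with a*x = b by bisection on the 'error signal' a*x - b,
    mimicking turning a dial until an analog null is reached."""
    while hi - lo > tol:
        x = (lo + hi) / 2
        if a * x - b < 0:   # error still negative: turn the dial up
            lo = x
        else:               # overshoot: turn the dial down
            hi = x
    return (lo + hi) / 2

# Division performed as nulling: 7.5 / 2.5
print(round(solve_by_nulling(2.5, 7.5), 3))  # 3.0
```

As in the kit, which quantity counts as input and which as output is just a matter of which variables you hold fixed.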

Analog computers are completely gone from the scene today, and the transition happened in several phases. In the 1930s, Alan Turing developed the mathematical principles of an abstract digital machine to show that it could do arbitrary calculations. A number of digital machines were built in the 40s and 50s, but the big transition could start only after the invention of silicon integrated circuits based on complementary-symmetry metal-oxide-semiconductor (CMOS) or MOSFET switches. This is the predominant technology today.

However, despite such prevalence of digital computers everywhere, the analog world exists and it exists right beneath the digital circuitry. Think about it this way. If the natural world (of silicon materials) is analog and the computer is digital, the transition from analog to digital must happen somewhere, right? That transition is carefully hidden inside the MOSFET.

Between 1997 and 2000, I worked at a semiconductor company, and the task of our group was to hide the analog world from those who did not want to face it. The task gets increasingly difficult with every round of miniaturization, because all elements of the circuit, including n- and p-transistors, gate capacitors and even the electrical wires connecting them, start to remind the world that the analog world exists.



Switching to the other side of the living world, Darwin’s model of evolution was entirely analog. Darwin was not aware of Mendel’s discovery of ‘genes’, which introduced discreteness into evolution.

Biologists did not find out about Mendel’s experiments until around 1900, and then a huge conflict ensued between those who sided with Darwin’s continuous evolution and those who sided with Mendel’s discrete evolution.

Mendel’s results were quickly replicated, and genetic linkage quickly worked out. Biologists flocked to the theory; even though it was not yet applicable to many phenomena, it sought to give a genotypic understanding of heredity which they felt was lacking in previous studies of heredity which focused on phenotypic approaches. Most prominent of these previous approaches was the biometric school of Karl Pearson and W. F. R. Weldon, which was based heavily on statistical studies of phenotype variation. The strongest opposition to this school came from William Bateson, who perhaps did the most in the early days of publicising the benefits of Mendel’s theory (the word “genetics”, and much of the discipline’s other terminology, originated with Bateson). This debate between the biometricians and the Mendelians was extremely vigorous in the first two decades of the twentieth century, with the biometricians claiming statistical and mathematical rigor,[37] whereas the Mendelians claimed a better understanding of biology.[38][39] (Modern genetics shows that Mendelian heredity is in fact an inherently biological process, though not all genes of Mendel’s experiments are yet understood.)[40][41]


Fisher’s mathematics (1920s and 30s) brought those two camps together, and then a series of discoveries (1950s and 60s), starting with Watson and Crick’s solution of the DNA structure, further confirmed the existence of a digital information layer in every living organism. Since then, research on the living world has moved together with technological advances in the computing world. Advances in digital technology allowed better experimentation on the living world, and those experiments focused more on genetics and genomics. As a result, we learned a lot about how the information layer of living organisms evolved over time, and we plan to learn more through the completion of the 10,000 insect genome project, the 1,000 fish transcriptome project and various other comparative sequencing projects.
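Fisher’s reconciliation of the two camps can be demonstrated in a few lines: sum enough discrete Mendelian loci and the phenotype looks continuous. This toy simulation (made-up parameters: 100 additive biallelic loci, allele frequency 0.5) counts “1” alleles per individual; each locus is discrete, yet the trait distribution is approximately normal.

```python
import random

def polygenic_trait(n_loci=100, allele_freq=0.5):
    """Phenotype = number of '1' alleles across n_loci diploid loci.
    Each locus is discrete (Mendelian), but the sum varies smoothly."""
    return sum(random.random() < allele_freq for _ in range(2 * n_loci))

random.seed(0)
traits = [polygenic_trait() for _ in range(10_000)]
mean = sum(traits) / len(traits)
print(round(mean))  # close to 100 (= 2 * n_loci * allele_freq)
```

The histogram of `traits` would look like the smooth bell curves the biometricians measured, even though every underlying locus obeys Mendel.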

However, when you compare the two narratives, do you notice something missing from the second one? In the case of the semiconductor industry, a dedicated group hides the existence of the analog world from the vast majority of practitioners, but who does so for the living world? Moreover, where is the analog connection of the information layer of life hidden?

As you can guess from the title, the answer most likely lies in ‘translation’ and ‘tRNAs’. How so will be the topic of another commentary.

In 2013, Dr. Elhaik complained about his home page at Johns Hopkins University mysteriously disappearing from Google searches right after his first Jewish genomics paper started to gain attention. We reproduced his complaint here, and then his page came back on top again after a few days.

Google Censorship: Scariest Thing for Academic Freedom

Readers found those accusations rather hysterical and attributed the disappearance (and reappearance) to something as mundane as a ‘search algorithm update’. It was a time before the Snowden releases, when the crooks running Google and Facebook were seen as various reincarnations of Buddha.

It is increasingly becoming clear that those ‘search algorithms’ and ‘trending news’ may be more than something generated by a fully automated cluster of servers near the Oregon-Washington border processing large amounts of clicks and links. In 2014, RT posted an article on “Censorship war: Website unmasks links Google is blocking from search results”, but the evidence of censorship was still indirect. More direct evidence regarding Facebook came yesterday, when Gizmodo published -

Want to Know What Facebook Really Thinks of Journalists? Here’s What Happened When It Hired Some

According to former team members interviewed by Gizmodo, this small group has the power to choose what stories make it onto the trending bar and, more importantly, what news sites each topic links out to. “We choose what’s trending,” said one. “There was no real standard for measuring what qualified as news and what didn’t. It was up to the news curator to decide.”


They were also told to select articles from a list of preferred media outlets that included sites like the New York Times, Time, Variety, and other traditional outlets. They would regularly avoid sites like World Star Hip Hop, The Blaze, and Breitbart, but were never explicitly told to suppress those outlets. They were also discouraged from mentioning Twitter by name in headlines and summaries, and instead asked to refer to social media in a broader context.

News curators also have the power to deactivate (or blacklist) a trending topic, a power that those we spoke to exercised on a daily basis. A topic was often blacklisted if it didn’t have at least three traditional news sources covering it, but otherwise the protocol was murky, meaning a curator could ostensibly blacklist a topic without a particularly good reason for doing so. (Those we interviewed said they didn’t see any signs that blacklisting was being abused or used inappropriately.)

CNET summarized the article in one line that says it all -

Facebook’s trending news feed may be a sham

The world’s largest social network calls out the top trending stories on its site, which is visited by more than a billion people every day. But the list may be manipulated by Facebook’s employees, who are allegedly deciding what’s “trending” based on their political views.

Zerohedge reports -

Facebook Workers Admit They “Routinely” Suppressed Conservative News

Therefore, we reopen our old question - was Google really censoring (i.e. manually filtering out) Elhaik’s Khazar research in its search results? The answer to this question is important, because many researchers now rely on searches at Google Scholar, whose results are now seen as sacrosanct, just as Google searches were in 2013.

We presume answers will not be coming anytime soon given that Google has bigger issues to resolve :)

Google planned to help Syrian rebels bring down Assad regime, leaked Hillary Clinton emails claim

Clinton email reveals: Google sought overthrow of Syria’s Assad


Those worshiping the other false God, named Altmetric, may take a look at Elhaik’s -

On the fallacies of Altmetric OR how I fell asleep in Sheffield and woke up in Louisiana

Caught red handed

I recalled that one of my Twitter friends had 124 followers who retweeted about my study. As you can see below, this is nearly TWICE what Altmetric reports that I currently have. Sloppy Altmetric didn’t bother to fix that number, and only some of those 124 tweets were counted towards the 73 count.


Previous segment - Organoids and the Coming Medical Revolution (i)

In 2006, Yamanaka surprised the whole world by announcing that he could turn adult mouse fibroblasts into pluripotent stem cells by applying four transcription factors - Sox2, Oct4, Klf4 and c-Myc. Those pluripotent stem cells can then be converted to any other cell lineage of the body.

To fully appreciate the significance of Yamanaka’s discovery, you need to also consider the political climate of 2006. The human genome assembly was completed around 2000-2001, and researchers were looking for ways to use the genome to improve human health. Stem cell research was recognized as a potentially beneficial area, but work on human embryonic stem cells also got associated with killing babies. So, federal funding stopped for research using human embryonic stem cells. To add to the complications, research institutes had to build separate facilities to isolate federally funded research from human embryonic stem cell research supported by other funding.

2006 being an election year, Bush made a big deal of vetoing a stem cell bill to hide his Iraq war failure.


Stem Cell Bill Gets Bush’s First Veto

By Charles Babington

Washington Post Staff Writer

Thursday, July 20, 2006

President Bush issued the first veto of his five-year-old administration yesterday, rejecting Congress’s bid to lift funding restrictions on human embryonic stem cell research and underscoring his party’s split on an emotional issue in this fall’s elections.

At a White House ceremony where he was joined by children produced from what he called “adopted” frozen embryos, Bush said taxpayers should not support research on surplus embryos at fertility clinics, even if they offer possible medical breakthroughs and are slated for disposal.

The vetoed bill “would support the taking of innocent human life in the hope of finding medical benefits for others,” the president said, as babies cooed and cried behind him. “It crosses a moral boundary that our decent society needs to respect.” Each child on the stage, he said, “began his or her life as a frozen embryo that was created for in vitro fertilization but remained unused after the fertility treatments were complete. . . . These boys and girls are not spare parts.”

Within hours of Bush’s announcement, the House, as expected, fell short in a bid to override the veto, extinguishing the issue as a legislative matter this year but not as a political matter. Democrats said voters will penalize GOP candidates for the demise of a popular measure, and predicted the issue could trigger the defeat of Bush allies such as Sen. James M. Talent, who faces a tough reelection battle in Missouri.

Yamanaka’s discovery came as a big sigh of relief, because turning one’s skin cells or other cells back into (induced) pluripotent stem cells by applying transcription factors bypassed those ethical questions. The action was equivalent to using a time machine on a developed cell and turning the clock back. Books of moral dilemmas do not have a chapter on negative-time actions.

Readers will enjoy the following interview of Yamanaka, where he talked about his amazing discovery.

What inspired him to choose those four transcription factors? More on that topic in the following post, but here is a hint (h/t: Developmental Biology)


Among all biomolecules within the cell, tRNAs got the least respect. Their supposed importance ended right after the ‘adaptors’ corresponding to entries in the genetic code table were identified (mid-1960s). Since then, the attention shifted to more complex RNAs like the rRNAs.

Interestingly, tRNAs are making a major comeback, and that too in the most unexpected places. Several groups linked them with neurodegenerative disorders. Is that due to too much money looking for causes of complex diseases, or are tRNAs indeed more remarkable than previously thought?

Here is a paper from 2013 that made a major splash by linking tRNAs with diseases, but there are a few additional ones.

CLP1 links tRNA metabolism to progressive motor-neuron loss

CLP1 was the first mammalian RNA kinase to be identified. However, determining its in vivo function has been elusive. Here we generated kinase-dead Clp1 (Clp1K/K) mice that show a progressive loss of spinal motor neurons associated with axonal degeneration in the peripheral nerves and denervation of neuromuscular junctions, resulting in impaired motor function, muscle weakness, paralysis and fatal respiratory failure. Transgenic rescue experiments show that CLP1 functions in motor neurons. Mechanistically, loss of CLP1 activity results in accumulation of a novel set of small RNA fragments, derived from aberrant processing of tyrosine pre-transfer RNA. These tRNA fragments sensitize cells to oxidative-stress-induced p53 (also known as TRP53) activation and p53-dependent cell death. Genetic inactivation of p53 rescues Clp1K/K mice from the motor neuron loss, muscle denervation and respiratory failure. Our experiments uncover a mechanistic link between tRNA processing, formation of a new RNA species and progressive loss of lower motor neurons regulated by p53.

Among various biotechnology inventions of the last few years with the potential to revolutionize medicine, nothing excites us more than the growing of three-dimensional human organoids on Matrigel. Therefore, we plan to devote a number of posts to this topic to keep our readers aware of the practices, potentials and challenges.

If you have not heard of organoids at all, the following few news articles will help you get started. In future posts, we plan to cover the primary literature in depth, but let us start with the popular media.

Scientists grow functional kidney organoid from stem cells

There are many diseases that attack specific organs, landing patients on a transplant list. Unfortunately, our bodies have markers that identify an organ as self, which makes it difficult to find an organ match. Many individuals die waiting for an organ transplant because a match can’t be found.

Research on stem cells, a type of cell that is able to transform into nearly any cell type, has raised hopes of treating organ failure. Researchers envision using these cells to grow fully functional organs.

A functional organ is similar to a machine. Organs contain many interacting parts that must be positioned in a specific configuration to work properly. Getting all the right cell types in the appropriate locations is a real challenge. Recently, a team of scientists has met that challenge by using stem cells to grow a tissue, termed an organoid, that resembles a developing kidney.

Scientist: Most complete human brain model to date is a brain changer

Scientists at The Ohio State University have developed a nearly complete human brain in a dish that equals the brain maturity of a 5-week-old fetus.

The brain organoid, engineered from adult human skin cells, is the most complete human brain model yet developed, said Rene Anand, professor of biological chemistry and pharmacology at Ohio State.

The lab-grown brain, about the size of a pencil eraser, has an identifiable structure and contains 99 percent of the genes present in the human fetal brain. Such a system will enable ethical and more rapid and accurate testing of experimental drugs before the clinical trial stage and advance studies of genetic and environmental causes of central nervous system disorders.

An Inhibitor reduces the effects of Zika virus in Brain Organoid

An investigation of Zika has shown that it modifies the function of the molecule TLR3 and causes cell ‘suicide’ in the brain. The experiment, conducted on lab-grown brains, seeks to reduce the aggressiveness of the infection, which results in fetal microcephaly, using an inhibitor. Early results indicate that cells infected by the virus decreased by 16% in five days.


What is an organoid?

Organoids are tiny organs grown in the lab in a petri dish.


How are they grown?

Usually the researcher starts with pluripotent stem cells and applies a series of growth factors and signaling molecules over several days. The experiment is done in a special material (e.g. Matrigel) that allows the growth to take place in three dimensions. Over time, the stem cells divide and develop into specific organ tissue depending on the chosen factors.

For example, here is the ‘recipe’ to grow human lungs, based on the 2015 paper - “In vitro generation of human pluripotent stem cell derived lung organoids”. Let us quickly explain what is going on and then you can read the actual methods.

There are three steps explained in highly simplified form.

(i) The researchers start with pluripotent stem cells (PSC) and apply Activin A for 3 consecutive days. This process turns PSCs into definitive endoderm.

(ii) The researchers apply Noggin and SB431542 for another 4 days to turn definitive endoderm into anterior foregut.

(iii) The researchers apply Noggin, SB431542 and FGF4 for another 4 days to turn anterior foregut into lung organoid. This part of the experiment is done in Matrigel so that the tissue can maintain its three-dimensional pattern.
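The three-step recipe above can be written down as a small data structure. This is only a bookkeeping sketch of the simplified summary (step names, factors and durations are copied from the three steps above), not lab software; the `describe` helper is hypothetical.

```python
# Each step: (target tissue, factors applied, days) - from the simplified summary
LUNG_ORGANOID_PROTOCOL = [
    ("definitive endoderm", ["Activin A"], 3),
    ("anterior foregut", ["Noggin", "SB431542"], 4),
    ("lung organoid", ["Noggin", "SB431542", "FGF4"], 4),  # done in Matrigel
]

def describe(protocol, start="pluripotent stem cells"):
    """Turn the step list into a day-by-day differentiation timeline."""
    state, day, lines = start, 0, []
    for target, factors, days in protocol:
        day += days
        lines.append(f"day {day}: {state} -> {target} via {' + '.join(factors)}")
        state = target
    return lines

for line in describe(LUNG_ORGANOID_PROTOCOL):
    print(line)
```

Swapping in a different step list gives the recipe for a different organ, which is exactly the point: the procedure differs mainly in which factors are applied and for how long.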

Differentiation of PSCs into definitive endoderm

Differentiation into definitive endoderm was carried out as previously described (D’Amour et al., 2005; Spence et al., 2011). Briefly, a 4-day Activin A (R&D Systems, Minneapolis, MN) differentiation protocol was used. Cells were treated with Activin A (100 ng/ml) for 3 consecutive days in RPMI 1640 media (Life Technologies, Grand Island, NY) with increasing concentrations of 0%, 0.2% and 2% HyClone defined fetal bovine serum (dFBS, Thermo Scientific, West Palm Beach, FL).

Differentiation of definitive endoderm into anterior foregut

After differentiation into definitive endoderm, foregut endoderm was differentiated, essentially as described (Green et al., 2011). Briefly, cells were incubated in foregut media: Advanced DMEM/F12 plus N-2 and B27 supplement, 10 mM Hepes, 1× L-Glutamine (200 mM), 1× Penicillin-streptomycin (5000 U/ml, all from Life Technologies) with 200 ng/ml Noggin (NOG, R&D Systems) and 10 μM SB431542 (SB, Stemgent, Cambridge, MA) for 4 days. For long term maintenance, cultures were maintained in basal foregut media without NOG and SB, or in the presence of growth factors including 50–500 ng/ml FGF2 (R&D Systems), 10 μM Sant-2 (Stemgent), 10 μM SU5402 (SU, Stemgent), 100 ng/ml SHH (R&D Systems), and SAG (Enzo Life Sciences, Farmingdale, NY) for 8 days.

Directed differentiation into anterior foregut spheroids and lung organoids

After differentiation into definitive endoderm, cells were incubated in foregut media with NOG, SB, 500 ng/ml FGF4 (R&D Systems), and 2 μM CHIR99021 (Chiron, Stemgent) for 4–6 days. After 4 days with treatment of growth factors, three-dimensional floating spheroids were present in the culture. Three-dimensional spheroids were transferred into Matrigel to support 3D growth as previously described (McCracken et al., 2011). Briefly, spheroids were embedded in a droplet of Matrigel (BD Bioscience #356237) in one well of a 24 well plate, and incubated at room temperature for 10 min. After the Matrigel solidified, foregut media with 1% Fetal bovine serum (FBS, CAT#: 16000044, Life Technologies) or other growth factors and small molecules were overlaid and replaced every 4 days. Organoids were transferred into new Matrigel droplets every 10–15 days.

The procedure is similar for other organs, except for the series of growth factors selected. If the steps appear too mysterious, do not worry. We will cover the scientific details in a future post.
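The three-step lung recipe summarized above can be written down as plain data. This is a hypothetical encoding for illustration only: the factor names and day counts come from the simplified summary, while the field names and the `describe()` helper are our own invention.

```python
# Hypothetical encoding of the simplified three-step lung-organoid recipe;
# factor names and day counts follow the summary above, everything else
# (field names, helper) is illustrative.
LUNG_ORGANOID_PROTOCOL = [
    {"from": "pluripotent stem cells", "to": "definitive endoderm",
     "factors": ["Activin A"], "days": 3},
    {"from": "definitive endoderm", "to": "anterior foregut",
     "factors": ["Noggin", "SB431542"], "days": 4},
    {"from": "anterior foregut", "to": "lung organoid",
     "factors": ["Noggin", "SB431542", "FGF4"], "days": 4,
     "matrix": "Matrigel"},  # grown in Matrigel to keep the 3D pattern
]

def describe(protocol):
    """Render each differentiation step as a one-line summary."""
    return ["{} -> {}: {} for {} days".format(
                s["from"], s["to"], " + ".join(s["factors"]), s["days"])
            for s in protocol]
```

Swapping in a different list of steps and factors gives the recipe for a different organ, which is exactly the point made above.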


What are the medical potentials of the organoid technology?

One can take tissue from any person, turn it back into induced pluripotent stem cells (iPSCs) and then grow that person’s organs from the iPSCs. That means it is possible to check whether a drug will work, or have a bad effect, on one person’s brain or kidney before applying the drug to the person himself. That marks the arrival of truly personalized medicine, and without ever involving the person!



To understand how it all started, you need to go back to stem cell pioneer Shinya Yamanaka.

Shinya Yamanaka discovered more than 40 years later (after John Gurdon’s nuclear-transfer experiments), in 2006, how intact mature cells in mice could be reprogrammed to become immature stem cells. Surprisingly, by introducing only a few genes, he could reprogram mature cells to become pluripotent stem cells, i.e. immature cells that are able to develop into all types of cells in the body.

Continue reading here.

…to stop Americans from flooding their effective and way cheaper healthcare system.



A few weeks back, we wrote about Dr. Eran Elhaik’s new and interesting work on tracing the roots of Ashkenazi Jews in Northeastern Turkey.

New Paper by Elhaik Shows that Ashkenazi Jews Came from Northeastern Turkey

Elhaik relied on his previously published population genetics algorithm that could trace the ancestral origin of someone to within a village based on genomic data. That work was done in collaboration with National Geographic.

Nevertheless, Elhaik’s latest paper did not attract as much attention as it deserved, until the Forward magazine decided to trash it using the opinions of a bunch of ‘scholars’ with no population genetics background.

Don’t Buy the Junk Science That Says Yiddish Originated in Turkey

The core of their criticism is rather hilarious, because they elevated the peripheral part of Elhaik’s paper to ‘central evidence’, while downgrading his DNA evidence to a ‘belief system’! In fact, their ‘scholars’’ ignorance of genetics is downright scary, because they repeatedly argued that the results could change with the inclusion of Sephardic Jews. How can evidence derived from one person’s genomic data change with the inclusion of a completely different person?

Of the many dubious pieces of evidence presented to support Elhaik’s unorthodox theory is the fact that four villages in ancient Turkey once had names that sounded something like Ashkenaz. Later in the film Elhaik explains, without a trace of humor, that Yiddish-speakers talk like Yoda and he even mentions Yoda’s use of unusual terms like Jedi and wookies to demonstrate Yiddish’s linguistic properties. Dr. Elhaik, using a tool he designed called Geographic Population Structure, which he believes can pinpoint the origins of an ethnic group according to its DNA, published an article in the journal Genome Biology and Evolution in which he elucidates his theory.


Elhaik wrote a detailed criticism of the Forward article in his blog, which we post below.

Responding to the criticism for Das et al. (2016)

In Das et al. (2016), we applied the Geographic Population Structure (GPS) algorithm to the genomes of Yiddish and non-Yiddish speaking Ashkenazic Jews (and other Jewish and non-Jewish populations) to study the origin of their genomes. Since genetics, geography, and linguistics are well correlated, we surmised that the origin of the DNA would point to the origin of the Yiddish language. Surprisingly, GPS traced 93% of the samples to northeastern Turkey, where we found four villages whose names may be derived from the word Ashkenaz. Given the proximity of this region to Slavic lands, combined with other historical and linguistic evidence, our findings support Prof. Wexler’s Slavic hypothesis rather than the dominant Rhineland hypothesis, which proposes a Germanic origin for Yiddish.


The study was published two weeks ago and has been picked up by the media. It has received nearly 100% positive coverage in over 100 media outlets and numerous blogs. Expectedly, we have also received a bit of criticism; some of it will be addressed here and some of it will be ignored because it is merely ad hominem, not science.

Genetic criticism

None has been received; however, some people have voiced their concerns about the implications of our results for their potential relatedness to the ancient Judaeans. I have commented at length on this issue to the Israeli Globes (Hebrew). Briefly, our study did not focus on the origin of Jews or even all Ashkenazic Jews, but rather the origin of Yiddish, using a third of the Ashkenazic Jewish community for which genomic data were available. Testing whether one is related to the ancient Judaeans, Jesus, Moses, or Muhammad requires actually sequencing the genomes of these people and meticulously comparing them with modern-day genomes, looking for shared biomarkers. As opposed to the latter three, we actually have plenty of ancient Judaean skeletons that no one has ever sequenced (and probably never will). Why the DNA of the ancient Judaeans has not been sequenced to settle the question of relatedness once and for all is a question that should be directed to Israeli archeologists. It is most unfortunate that members of the general public have been misled to believe (no doubt after paying a lot of money to DTC companies) that they are related to ancient figures without any shred of evidence. However, this is not something our study aims to prove or disprove.

Methodological criticism

In “Scholars Blast New Study Tracing Ashkenazi Jews to Khazars of Ancient Turkey”, Mr. Liphshiz cites two academics who criticized our study, or more precisely, criticized the press release: most blasting indeed. First is Prof. Sergio DellaPergola, a demographer who proposed that including Sephardic Jews would have changed our findings.

serious research would have factored in the glaring genetic similarity between Sephardim [sic] and Ashkenazim [sic], which mean Polish Jews are more genetically similar to Iraqi Jews than to a non-Jewish Pole.

First, let us start with some basic biology. Each person has unique DNA: studying the DNA of non-Ashkenazic Jews would not change the DNA of Ashkenazic Jews, nor the predicted origin of their DNA (i.e., ancient Ashkenaz in northeastern Turkey). GPS is an unbiased algorithm; that is, including or excluding other samples does not change the results for the test samples. GPS also cannot relocate the villages bearing the name of Ashkenaz to Germany.
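To make the “unbiased” point concrete, here is a toy sketch of how a biogeographical predictor of this kind scores one sample at a time against a fixed reference panel. This is not the published GPS algorithm: the distance metric, the inverse-distance weighting, and the data layout are all illustrative assumptions.

```python
import math

def gps_predict(sample, refs, k=3):
    """Toy sketch of the GPS idea: find the k reference populations whose
    admixture vectors are closest to the sample's, then return a
    distance-weighted average of their coordinates.

    refs maps a population name to (admixture_vector, (lat, lon)).
    """
    # Score every reference against this one sample; other test samples
    # play no role at all, which is why adding or removing them cannot
    # change the prediction for this sample.
    scored = sorted((math.dist(sample, vec), lat, lon)
                    for vec, (lat, lon) in refs.values())[:k]
    weights = [1.0 / (d + 1e-9) for d, _, _ in scored]
    total = sum(weights)
    lat = sum(w * la for w, (_, la, _) in zip(weights, scored)) / total
    lon = sum(w * lo for w, (_, _, lo) in zip(weights, scored)) / total
    return lat, lon
```

Because each test sample is matched independently against the reference panel, adding or dropping other test samples leaves its prediction untouched.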

Second, Iraqi and Iranian Jews are extremely similar, and the latter were indeed included in the study. The genetic similarity between Ashkenazic Jews and Iranian Jews was explained by their shared Iranian-Turkic past.

**Had Prof. DellaPergola bothered to read our study rather than rely on the above figure, which was produced for the press and for simplicity included only Ashkenazic Jews, he would have found that we have analyzed Sephardic Jews** (the yellow and pink triangles below in Figure 4 from Das et al. 2016 correspond to Iranian and Mountain Jews, considered “Sephardic Jews”).


Third, to date, only three biogeographical analyses have been carried out for Ashkenazic Jews. The first was done by Elhaik (2013), who mapped Ashkenazic Jews to western Turkey, ~100 km away from ancient Ashkenaz, and included only Ashkenazic Eastern European Jews. The second was done by Behar et al. (2013), who included both Ashkenazic and Sephardic Jews and mapped Ashkenazic Jews to eastern Turkey, ~600 km away from ancient Ashkenaz. Using a large dataset that includes mostly Ashkenazic Jews and some Sephardic Jews, and a more accurate algorithm, the third study by Das et al. (2016) discovered ancient Ashkenaz. In summary, all three studies pointed to Turkey. Interestingly, Behar et al. (2013) interpreted their results in favor of a Middle Eastern (Israelite) origin, although this is unsupported by their data. The inclusion of Sephardic samples did not change the Turkish geolocation for the latter two studies.

Unlike Prof. DellaPergola, historian Prof. Shaul Stampfer went to an even greater length, enlightening us with his well-supported criticism of our study:

It is basically nonsense

Behind the scenes, Prof. Stampfer is harassing people involved in the preparation of my papers, alleging that the population description was omitted from Elhaik (2013). Unfortunately, this is another case of an academic who may consider reading papers a luxury. A brief glance at the methods section and, yes, most unfortunately, the supplementary materials, would have revealed a very detailed description of all populations and individuals studied. Unless capable of reading and understanding genomic studies, Prof. Stampfer may be advised to focus on the controversy over Shechita between Hasidim and Mitnagdim, a field which earned him the respect of his peers.

Linguistic criticism

The most interesting criticism, in my opinion, is the one that focuses on the linguistic aspects of Yiddish. It is worth clarifying that we did not carry out a linguistic but rather a genomic study that yielded biogeographical predictions, which were interpreted in favour of the Slavic theory over the Rhineland theory.

Mr. Dovid Katz said to The Jewish Chronicle that:

There is not a single word or sound in Yiddish that comes from Iranian or Turkish, and older Western Yiddish thrived before there was a single Slavic-derived word in the language.

The following response was provided by Prof. Paul Wexler, co-author of Das et al. (2016):

Unfortunately, I am unable to reply in the most appropriate manner, which would require publishing my collection of about 800 pages (single-spaced) of Iranian, Turkic, Tocharian, Mongolian and Chinese influences in Yiddish, both in its western and eastern variants. Mr. Katz’s claim that there are no Oriental elements in Yiddish, and that even Slavic elements do not enter Western Yiddish until centuries after the alleged rise of the Yiddish language in the German lands, is totally false. Mr. Katz’s remarks are more of an emotional tirade than a scholarly statement. Apparently, Mr. Katz did not read (or did read but failed to understand) my recent articles on the subject of Iranian and Turkic elements, or my many books and articles dating from the 1990s showing the nature of the “Slavicity” of Yiddish. Indeed, to the best of my knowledge, Katz has no knowledge of Slavic, Iranian, Turkic, etc., so it would be nearly impossible for him to evaluate my writings and examples (which I hope soon to publish in book form). Mr. Katz also does not appear to know about the pre- and post-World War II writings of Max Weinreich, the doyen of the field of modern Yiddish linguistics (1894-1969), whom I had the distinct pleasure of meeting personally. Weinreich wrote that German Yiddish from its earliest stages (10th c.) was in contact with Sorbian, and possibly also with Polabian, two of the many Slavic languages which reached what is now modern Germany (both west and east) in the 7th century. Today, Polabian is extinct (since the late 18th c.) but Sorbian survives in two variants in eastern Germany. To claim that there are no Iranian elements in Yiddish is very puzzling.

Surely, Mr. Katz knows that the Babylonian Talmud was edited in its final form by the 6th century, in a territory that was part and parcel of the Iranian Empire. All Talmudic scholars could tell him of the immense Iranian philosophical, religious, legal and linguistic imprint on the Talmud. Many of the Iranianisms in Yiddish passed via Judeo-Aramaic (the language of the Talmud) into written Hebrew and from there into the later Jewish languages of varying stocks. My claim goes further: namely, that Jews were intimately acquainted with Iranian dialects and could not avoid, or did not ever wish to avoid, importing hundreds and maybe even thousands of covert and overt Iranianisms into Yiddish, etc.

The existence of Iranianisms in Yiddish was known also to scholars of Iranian and Slavic. The very distinguished general and Slavic linguist, Roman Jakobson, wrote in the 1960s that Yiddish paskudnik, paskudnjak ‘scoundrel’, though it bore a very close similarity to the coterritorial Slavic languages, was ultimately of Iranian origin (it also appears in the Talmud). Under Ukrainian influence, the Yiddish Iranianism adopted a non-Jewish Slavic form, except that -njak in the second variant, though theoretically possible in Ukrainian, is unattested. The Yiddish word has also been discussed in the Iranian linguistic literature. Is this “off-the-wall linguistics”, to quote Katz?

Finally, Mr. Katz seems to think that the term “Ashkenaz” has always been associated with the German lands, the alleged homeland for him of the North European Jews. I would advise Mr. Katz to read the writings of Sa’adya Gaon, the 10th century scholar from Baghdad who understood that Ashkenaz meant “Slavic” (he is not the only writer of that age who thought so). I would advise Mr. Katz to read the Caucasian ethnographic literature of the 1920s where the term Ashkenaz was still being used by local peoples, both Jewish and non-Jewish. I would advise Mr. Katz to have a look at the writings of Wilhelm Bacher, the eminent Iranianist, writing (around the turn of the 20th century) about a 14th-century Uzbek Jew who composed the first extant Hebrew-Persian dictionary and who described himself (a native of Urgench) as “Ashkenazic”.

I am very grieved by Mr. Katz’s ignorance, since I have known him since the 1980s. In those days he wrote several ground-breaking articles on Yiddish which earned him the respect and admiration of all Yiddishists. Unfortunately, he did not live up to his promise.

The Das et al. study was published in GBE.

Over the weekend, Pulitzer Prize winning writer Sid Mukherjee published an article ‘Same but Different - How epigenetics can blur the line between nature and nurture’ in the New Yorker magazine.


On October 6, 1942, my mother was born twice in Delhi. Bulu, her identical twin, came first, placid and beautiful. My mother, Tulu, emerged several minutes later, squirming and squalling. The midwife must have known enough about infants to recognize that the beautiful are often the damned: the quiet twin, on the edge of listlessness, was severely undernourished and had to be swaddled in blankets and revived.

The article received widespread criticism for turning black into white, which happens to be a common practice in the epigenetics field. For a detailed summary of criticisms by various scientists, readers may check the following two blog posts by evolutionary biologist Jerry Coyne. A couple of examples from the many are included below.

The New Yorker screws up big time with science: researchers criticize the Mukherjee piece on epigenetics

Researchers criticize the Mukherjee piece on epigenetics: Part 2

Wally Gilbert, Nobel Laureate, biochemist and molecular biologist, Harvard University (retired).

The New Yorker article is so wildly wrong that it defies rational analysis. Too much of the epigenetic discussion is wishful thinking seeking Lamarckian effects, and ignoring the role of sequence specific regulatory proteins and genes. (as well as sequence specific RNA molecules).

Sidney Altman, Sterling Professor of Molecular, Cellular, and Developmental Biology and Chemistry at Yale University, Nobel Laureate:

I am not aware that there is such a thing as an epigenetic code. It is unfortunate to inflict this article, without proper scientific review, on the audience of The New Yorker.


Readers may note that the real criticism should be directed at NHGRI, with its funding of unscientific ‘big science’ projects on epigenetics and epigenomic markers, and at the unscrupulous scientists (or rather, technicians) like Eric Lander, Manolis Kellis, Eric Green and several others who benefit from this largess. In fact, many criticisms of Sid Mukherjee’s article are similar to what we have been writing over the years regarding various epigenome papers and projects.

In “The Conspiracy of Epigenome?”, we wrote -

We would have had to scratch our heads less if Lander had used ‘conspiracy of epigenome’ instead. Nothing managed to derail this expensive boondoggle over the last four years, including powerful critics of the scientific principle behind it, a fraud allegation against the leader, the public humiliation of its sister project ENCODE, NIH cost-cutting and protests by scientists, and so on. What appears even more puzzling is that despite all those prior events, the leaders of the epigenome project ended up making the same mistakes as ENCODE. Don’t these clowns learn anything?


In “NHGRI’s Epigenetics Investments Starting to Pay Off”, we highlighted several other epigenetics-related bogus ‘discovery’ press releases from NIH-funded research.


In “Epigenetics in Stem Cells - Is It a Significant Paradigm Shift in Biology?”, we wrote -

Nobody gets angrier about references to epigenetics than the famous biologist Mark Ptashne. The reason is simple: ‘epigenetics’ is not as harmless a term as ‘paradigm shift’. In addition to being a bad word, it also introduces bad scientific concepts, as explained by Dr. Ptashne.

Faddish Stuff: Epigenetics and the Inheritance of Acquired Characteristics

Ptashne’s battle against epigenetics is not new. A few years back, he, Oliver Hobert and Eric Davidson rightly questioned the wisdom of backing large epigenome projects.


In “Epigenetics Explained, in the Context of Descendants of Holocaust and Slavery Victims”, we explained -

The word epigenetics was introduced by biologist C. H. Waddington in the 1940s to explain how cells maintained their states during development.

For example, the muscle cells, once differentiated, continue to divide into muscle cells and the same is true for other kinds of cells, even though they all start from one universal mother cell and carry the same chromosomes after division. How does a cell get programmed into being a muscle cell or neuronal cell? Waddington speculated that there must be some epigenetic effect, because all cells continue to have the same chromosomes and therefore the same set of genes.

Please note that in 1942 DNA had not yet been identified as the genetic material, and Waddington was trying to explain things in abstract terms. In the following 60+ years, Waddington’s question was fully answered by geneticists and developmental biologists. It was found that cells remember their states (muscle or neuron or kidney, etc.) through the actions of a set of proteins called transcription factors, which act on other genes through binding sites on the chromosomes. Both Professor Ptashne’s and Professor Davidson’s labs did extensive research to elucidate the mechanisms, and Waddington’s epigenetics was explained through their work.
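The cell-memory mechanism described above can be illustrated with a minimal toy model, in the spirit of Ptashne’s positive-feedback explanation: a transcription factor that cooperatively activates its own gene has two stable expression levels, and the cell stays in whichever one it was pushed into. The equation and parameter values here are illustrative, not taken from any specific paper.

```python
def simulate(x0, steps=20000, dt=0.01, a=2.0, K=1.0, n=4, d=1.0):
    """Euler-integrate dx/dt = a*x^n / (K^n + x^n) - d*x, a minimal model
    of a transcription factor that cooperatively activates its own gene
    (production term) while being degraded/diluted (decay term)."""
    x = x0
    for _ in range(steps):
        x += dt * (a * x**n / (K**n + x**n) - d * x)
    return x
```

Starting below the threshold, the level decays to the ‘off’ state; starting above it, the level settles near the ‘on’ state (about 1.84 for these parameters). The circuit thus remembers which state it was put in, with no change to the DNA sequence, which is exactly how transcription-factor feedback answers Waddington’s question.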


Eric Davidson’s book

One major criticism of Mukherjee’s article concerns his claim that Yamanaka’s stem cell work was proof of ‘epigenetic regulation’, when it was the exact opposite. Readers interested in learning about the real science behind how genomes and Yamanaka’s work connect should start with Eric Davidson’s book, previously discussed in our blog -

A Must Read Book, If You Like to Understand Genomes


Lastly, maybe it is time for Sid Mukherjee to pick up a real cause that would save other writers like him from future embarrassment.

Let’s Discuss - Is It Time to Shut Down NHGRI?

(From one of our readers)


Postdoctoral Scholar: Forest Genomics Database and Software Developer


University of Connecticut, Storrs, CT, USA




DURATION: Full-time


The Plant Computational Genomics laboratory at the University of Connecticut (Storrs, CT) has an opening for a database and software developer as part of a collaborative project involving researchers from all over the world. This person will take a lead role in the continued development of the TreeGenes database and the CartograTree application (treegenesdb.org and cartogratree.org). These resources serve vast amounts of genomic, phenotypic, and environmental data to the user community. In addition, we are committed to providing greater collaboration and data sharing among partner databases through Galaxy modules and new analysis/visualization platforms.


The successful candidate will work as part of a small interdisciplinary team of bioinformaticians and data curators. Tasks will include maintenance of the existing database, developing connections to collaborating databases, improving the user experience (interface development) and the implementation of novel analysis and visualization tools. The Postdoctoral Scholar will be provided with training opportunities and will attend conferences to present/train users on these new tools. In addition, publications on specific software products will result.


The qualified applicant will have a PhD degree in Bioinformatics, Information Systems, Computer Science, or a related field. Biology/bioinformatics experience is essential, as is previous experience with web/database development. The applicant should have strong knowledge of Java (Struts/JSP) and JavaScript (jQuery). Experience with Linux/Unix, scripting languages (Python), and relational databases (PostgreSQL) is also desired.


Please send the following THREE documents: cover letter, research statement, and CV to jill.wegrzyn@uconn.edu. Applications will be accepted until June 10th or until the position is filled.


Readers may note the previous opening from the same research group posted here in January.

from biorxiv

We present a generalization of the Positional Burrows-Wheeler Transform (PBWT) to genome graphs, which we call the gPBWT. A genome graph is a collapsed representation of a set of genomes described as a graph. In a genome graph, a haplotype corresponds to a restricted form of walk. The gPBWT is a compressible representation of a set of these graph-encoded haplotypes that allows for efficient subhaplotype match queries. We give efficient algorithms for gPBWT construction and query operations. We describe our implementation, showing the compression and search of 1000 Genomes data. As a demonstration, we use the gPBWT to quickly count the number of haplotypes consistent with random walks in a genome graph, and with the paths taken by mapped reads; results suggest that haplotype consistency information can be practically incorporated into graph-based read mappers.
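For readers who want the flavor of the original PBWT that this preprint generalizes to graphs, here is a minimal sketch of Durbin’s positional sorting step on a plain haplotype matrix. It is a simplification for illustration: the real PBWT also maintains divergence arrays and compressed representations, which are omitted here.

```python
def pbwt_orders(haplotypes):
    """Positional orderings at the heart of Durbin's PBWT.

    haplotypes: list of equal-length 0/1 lists (one per haplotype).
    Returns, for each site k, the haplotype indices sorted by their
    reversed prefixes ending at k. Haplotypes sharing a long match
    ending at k become adjacent in that ordering, which is what makes
    subhaplotype match queries cheap.
    """
    order = list(range(len(haplotypes)))
    orders = []
    for k in range(len(haplotypes[0])):
        zeros, ones = [], []
        for i in order:          # stable counting sort on column k
            (zeros if haplotypes[i][k] == 0 else ones).append(i)
        order = zeros + ones
        orders.append(order)
    return orders
```

The gPBWT of the preprint keeps this same prefix-sorting idea but replaces “column k of a matrix” with “edge of a genome graph”, so that haplotypes encoded as walks can still be sorted and matched.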

One year back at this time, all of Obama’s proxy war bets were going well. Ukraine’s rebel bloc was in bad shape, Russia’s economy was in free fall, the Eurozone was booming, ISIS was ready to strike at the heart of Damascus, and the Saudis confidently marched into Yemen expecting to control the place within a month. Even the Greeks were excited by the election of Varoufakis as finance minister.

Fast forward to today, and you see -

(i) Greece folded with Varoufakis being kicked out by EU bankers and the savings of ordinary people being taken away,

(ii) European economy is in the gutter and on top of that there is a major refugee crisis,

(iii) Syrian civil war is over with ISIS almost gone,

(iv) Erdogan exposed as the ISIS mastermind,

(v) Major terror attacks in Paris and Brussels with possibly more to come elsewhere in Europe,

(vi) Yemen, one of the poorest countries, humiliated the Saudis,

(vii) Last but not least, Russia re-established itself as a major power.

Those reading our blog should have anticipated most of those outcomes, if not all. Here are three pointers to help you navigate the next decades. A brief explanation follows at the bottom of the post.



1. China May Buy 19.5% Stake in Rosneft

Sale of major Rosneft stake to CNPC signals strengthening ties between China and Russia

China National Petroleum Corp. Deputy Chief Executive Wang Zhongcai said the company is interested in purchasing a stake in Rosneft, Russia’s state-owned oil company. The sale would strengthen the ties between the two countries, which have increasingly been working together on energy projects.

Russia needs the cash

Russia is looking to sell a 19.5% stake in Rosneft, the world’s largest listed crude producer, as it looks for ways to bridge its fiscal gap amid low oil prices. The Russian government has said the stake is worth $10 billion, reports The Wall Street Journal.


2. Obama Is Siding with Saudi Arabia over 9/11 Victims

The seven-decade-old bilateral relationship between the United States and Saudi Arabia is in rough waters. On seemingly every major issue of importance in the Middle East, Washington and Riyadh are either locking horns on strategy or arguing with one another over how a particular goal should be met.

More often than not, the Obama administration and the al-Saud ruling family share the very same objectives in the region. Both, for instance, want Bashar al-Assad’s rule to end in Syria; both wish that Yemen’s internationally recognized government will eventually have the strength to reinstate itself in the capital; and U.S. and Saudi leaders are increasingly of one mind on the barbarity that the Islamic State represents.


3. The foreign policy of Donald Trump, the next US president -



Analysis - geopolitics as we expect it to develop

1. The beginning of multipolar world:

The humiliation of Obama and the neocons is one of the fastest since Napoleon’s invasion of Russia in 1812. Therefore, this event will define world events for many decades. Just as France disappeared from the global scene after the 1810s, leaving the British Empire to reign unopposed for a century, the USA will slowly move out of the global scene. Donald Trump’s speech is pretty much the beginning of defining that course.

2. Russia/China/Iran triple alliance:

Increasing alliance between Russia and China (and soon Iran) will define the center of gravity of economic affairs in the world.

3. Rotation from coast to land:

Right now, you see various coastal cities in Europe, Asia and the Americas as prosperous, while the inner regions struggle. That was not how the world appeared three centuries back, when Afghanistan was a prosperous country and Persia (Iran) was another. Back then, people were afraid to live in coastal cities, to avoid pirates and other sea-based hazards. Then sea-based commerce took over, eventually leading to the prosperity of coastal cities.

We are now beginning to see a reversal to land-based trade activities starting with the ‘new silk road’ efforts by China, and this reversal will last many centuries.

4. Fall of Saudi Arabia and Turkey:

The most immediate impact of the rapid and unexpected defeat of the neocons will be the fall of their Middle Eastern proxies - Saudi Arabia and Turkey. The process is well underway.

Hundreds of demonstrators stormed the heavily-secured Green Zone in Baghdad

Here is how much importance the NY Times gives to an event of enormous global significance (h/t: nakedcapitalism blog) -



Seymour Hersh Says Hillary Approved Sending Libya’s Sarin To Syrian Rebels

The great investigative journalist Seymour Hersh, in two previous articles in the London Review of Books (“Whose Sarin?” and “The Red Line and the Rat Line”), has reported that the Obama Administration falsely blamed the government of Syria’s Bashar al-Assad for the sarin gas attack that Obama was trying to use as an excuse to invade Syria; and Hersh pointed to a report from British intelligence saying that the sarin that was used didn’t come from Assad’s stockpiles. Hersh also said that a secret agreement in 2012 was reached between the Obama Administration and the leaders of Turkey, Saudi Arabia, and Qatar, to set up a sarin gas attack and blame it on Assad so that the US could invade and overthrow Assad.

“By the terms of the agreement, funding came from Turkey, as well as Saudi Arabia and Qatar; the CIA, with the support of MI6, was responsible for getting arms from Gaddafi’s arsenals into Syria.”

Hersh didn’t say whether these ‘arms’ included the precursor chemicals for making sarin which were stockpiled in Libya, but there have been multiple independent reports that Libya’s Gaddafi possessed such stockpiles, and also that the US Consulate in Benghazi, Libya was operating a “rat line” for Gaddafi’s captured weapons into Syria through Turkey. So, Hersh isn’t the only reporter who has been covering this. Indeed, the investigative journalist Christoph Lehmann headlined on 7 October 2013, “Top US and Saudi Officials responsible for Chemical Weapons in Syria” and reported, on the basis of very different sources than Hersh used, that:

“Evidence leads directly to the White House, the Chairman of the Joint Chiefs of Staff Martin Dempsey, CIA Director John Brennan, Saudi Intelligence Chief Prince Bandar, and Saudi Arabia’s Interior Ministry.”

Remember that Russia’s stopping of that invasion in 2013 was the beginning of the end for the Obama team. So, those following the events closely sensed a major change in wind direction at that time.

Once all the dust clears, Iran will take control of the Middle East. Therefore, Trump’s plan to ‘keep Iran under control’ will arrive too late.



Our earlier article on Illumina’s surprise announcement received a number of informative comments. We would like to discuss them here, but first it is important to explain the term ‘peak sequencing’ properly. That explanation requires us to discuss the relevance of using Moore’s law for sequencing.

Poser syndrome is quite common among humans. During 1999-2000, every Silly Con Valley startup wanted to be ‘the next Yahoo’, and nobody wanted to imitate Steve Jobs. Steve Jobs was considered a failure (despite his success with Pixar), whereas Yahoo was the future of the internet. Fifteen years later, everyone wants to be ‘the next Steve Jobs’ and no Yahoo fan is left.


However, the commonly used ‘Moore’s law in sequencing’ chart is the mother of all poser syndrome cases.

First, the chart is misleading, because it is not adjusted for the quality of sequencing. A transistor, in contrast, is a switch, be it in 5-micron technology or 14-nanometer technology. The semiconductor industry puts together all the pieces to make sure the fundamental definition of a MOSFET switch is not broken.

Second, sequencing cost is a meaningless measure within the big world of biomedical research and development, whereas transistor density is at the core of the business proposition of many companies. The bottom line of hardly any company improves with a Moore’s-law-like decrease in sequencing cost.
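To put rough numbers on the comparison the chart makes, here is a back-of-the-envelope calculation. The inputs are illustrative round figures in the spirit of the NHGRI cost-per-genome data (about $100M per genome in 2001, about $1,000 by 2015), not exact values:

```python
import math

def halving_time(cost_start, cost_end, years):
    """Average number of years per cost halving implied by a drop
    from cost_start to cost_end over the given span."""
    return years / math.log2(cost_start / cost_end)

# Illustrative round numbers: ~$100M per genome in 2001, ~$1,000 by 2015.
sequencing = halving_time(100e6, 1e3, 14)   # roughly 0.84 years per halving
moore_cost = 100e6 / 2 ** (14 / 2.0)        # cost had it merely tracked
                                            # Moore's law (halving ~2 years)
```

Sequencing cost halved roughly every ten months, whereas a pure Moore’s-law trajectory would still leave a genome costing several hundred thousand dollars in 2015. The steepness of that curve is what the chart advertises; the argument above is that the raw cost number is neither quality-adjusted nor central to anyone’s bottom line.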

That brings us to ‘peak sequencing’. The importance of sequencing, and especially genome sequencing, was brought to the fore by propaganda leading up to the human genome project. That created a pent-up demand for sequencing of other genomes, because many scientists saw a ticket to high-profile papers in catching an interesting organism and sequencing its genome. There was no reason to expect the genome of an ant to be as social as an ant, or the genome of a scarlet macaw to be as colorful as a scarlet macaw, but pictures of interesting animals do make good journal covers. On top of that, NHGRI, the dysfunctional government agency that sponsored ENCODE, dreamed up various make-believe uses of sequencing.

The entire enterprise failed, because the breadth of imagination of the central planners could not keep up with the falling cost of sequencing and other emergent technologies. The business case did not develop as anticipated, whereas the top journals made increasing demands on the genome projects - asking for two genomes, ten genomes, one thousand genomes and so on - before allocating them precious space. Scientists, being rational actors in this situation, saw better opportunities elsewhere and moved on.

In our humble opinion, the peak of that shift came around late 2014 to early 2015. Maybe this humorous post on the parody site ‘The Science Web’ marked the peak.

90% of researchers sequencing things because they can't think of anything else to do

Cambridge, MA. 90% of genomics researchers are sequencing things because they exist in a complete intellectual vacuum and can't think of anything else to do, a recent survey suggests.

The survey asked the simple question: ‘If you are currently involved in a genome sequencing project, can you identify any real reasons why?’. 92% of the 10,000 scientists surveyed ticked ‘no’. In the comments box beneath the question, respondents had written ‘if it moves, sequence it’, ‘I don't understand the question’ and ‘you could sequence an ant's scrotum and Genome Research would publish it’.

Recent papers in genomics include those describing the genome of the cucumber, the genome of the centipede and the genome of the ferret. Some have suggested that scientists are simply following the alphabet, but Arthur MacDaniel of the Wide Institute denies this: ‘Following the alphabet would imply that genomicists are following some kind of logic or rationale; they're not. Their motto is sequence first, think later, but too often they only complete half of those tasks. These guys are frothing at the mouth for DNA. In fact, they'd sequence the froth and submit it to Nature if they could,’ Arthur continued.

So, our ‘peak sequencing’ call does not mean researchers will not continue to do more sequencing in the future, but this ‘sequence if you can't think of anything else to do’ madness has ended.


Going back to specific comments -


Interestingly enough, PacBio had a strong quarter, particularly in Europe. I'd say it's not so much the market slowing but that Illumina has some decent competition now.

In our understanding, those using PacBio are not trying to get genome papers. Many of them realize that they already got their fifteen minutes of fame by publishing genome papers in visible journals, but cannot do much research with a genome split into countless contigs and scaffolds. Therefore, we do not expect a lot of big genome papers from those efforts, but the technology will continue to make inroads among serious scientists.

In my eyes, Illumina unfortunately does not have any competition at the moment (perhaps the BGI-seq? but this exists only in China). PacBio has its niche, but it is a small niche.

Illumina is, however, manipulating the market perhaps a bit too much with their X-series sequencers (e.g. selling the same costly reagents to companies/huge institutes which buy 5 or 10 X-series sequencers at a third of the price that regular customers are paying).

Why would smaller regular customers (which previously certainly made up the majority of instrument purchases) invest in new $100,000 machines if the monopolist is making arbitrary reagent pricing decisions and makes up arbitrary limitations on what people are allowed to sequence or not (and then changes these rules annually)? Due to its monopoly, Illumina is in a position to arbitrarily manipulate the market, and it is doing so; thus purchasing a new pricey instrument becomes too risky for small institutes, and Illumina's manipulations are backfiring.

Illumina’s success since 2008 is due to the confluence of two factors -

(i) pent-up demand for mindless sequencing due to NIH/NHGRI propaganda since the human genome project, and top journals giving space to the publication of genomes just because the organisms are interesting.

(ii) the Obama stimulus grant (American Recovery and Reinvestment Act of 2009) and various similar post-2008-crash spending programs all over the world. Illumina happened to be in the right place at the right time.

With tightening budgets and a lack of free money, scientists are getting smarter, as highlighted by your comment (“Why would smaller regular customers invest into new $100,000 machines..”). Given that scientists are in the business of publishing the most interesting papers with the least effort, sequencing may not be the avenue they choose.


The utterly dishonest NIH head Francis Collins participated in a reddit AMA three days ago. One reader asked -

Hi Dr. Collins. Due to the rising number of retracted papers and errata in the biomedical sciences and the lack of reproducibility of many key study results, it seems like the scientific community needs better quality control than our current peer review system provides. Does the NIH have any initiatives being developed to promote more reproducible science and better communication of science?

What Collins did not say:

One effective way to stop retractions is to promote the senior author to be NIH director. Unfortunately, there are not too many slots left.


For reference (NY Times 1996) -

Falsified Data Found in Gene Studies

Dr. Francis S. Collins, the head of the Government’s project to map all human genes, said yesterday that he was retracting five research papers on leukemia in leading scientific journals because a junior colleague had fabricated data.

The flawed papers involved laboratory research on the role of a defective gene in producing acute leukemia. The research did not involve patients or treatment of the disease, nor was it directly related to the gene-mapping project.

Upon learning of the problem in mid-August, Dr. Collins said in an interview, he ‘‘thought it was an isolated instance whereby a trainee in my laboratory manipulated the data.’’ But two weeks later, after examining the colleague’s laboratory notebooks and testing material in the freezer, he said, ‘‘the significance and the scope of the fabrication in this circumstance, of which I had not the slightest idea, began to be very apparent.’’

Collins essentially blamed the ‘junior author’ even for the two-author papers and took no responsibility at all.

Yesterday, I attended an interesting talk by Ian Derrington, who is currently working as a post-doctoral researcher in the lab of Professor Jens Gundlach at the University of Washington (UW). For those who do not know, the Gundlach lab was the first to identify and use MspA as an efficient pore molecule for nanopore sequencing, and they have published several key papers related to the development of the technology over the years. In my understanding, Illumina’s licensing of MspA-related patents from UW is the basis of its IP lawsuit against Oxford Nanopore.

I was planning to write about Ian’s talk, but found an excellent video from three years back that saves me the trouble. The talk in the YouTube video was delivered by Jens Gundlach himself, so I am including a somewhat unrelated video of Ian and leaving our readers to do the hard part of combining the two in their imagination :)

The above video of Gundlach is three years old, but the lab has been very active in exploring new frontiers using their technology. Ian’s talk included plenty of material about their recent (and very interesting) research. Those looking for more information may start from their latest paper, where they used the nanopore as a single-molecule ‘tweezer’!

MspA nanopore as a single-molecule tool: From sequencing to SPRNT

Single-molecule picometer resolution nanopore tweezers (SPRNT) is a new tool for analyzing the motion of nucleic acids through molecular motors. With SPRNT, individual enzymatic motions along DNA as small as 40 pm can be resolved on sub-millisecond time scales. Additionally, SPRNT reveals an enzyme's exact location with respect to a DNA strand's nucleotide sequence, enabling identification of sequence-specific behaviors. SPRNT is enabled by a mutant version of the biological nanopore formed by Mycobacterium smegmatis porin A (MspA). SPRNT is strongly rooted in nanopore sequencing and therefore requires a solid understanding of basic principles of nanopore sequencing. Furthermore, SPRNT shares tools developed for nanopore sequencing and extends them to analysis of single-molecule kinetics. As such, this review begins with a brief history of our work developing the nanopore MspA for nanopore sequencing. We then describe the underlying principles of SPRNT, how it works in detail, and propose some potential future uses. We close with a comparison of SPRNT to other techniques and we present the methods that will enable others to use SPRNT.

How did life originate on earth? What chemical properties of living objects make them different from the non-living objects? Where did the genome come from in the first place? How was life before the emergence of the genetic code? Why does the genetic code have that specific form? What biochemical advantages do the proteins get by having methionine as their first amino acid? How did the metabolic pathway evolve to its current form? How did bacteria, archaea and eukaryotes evolve into distinct kingdoms? How did multicellularity evolve? How did our ability to hear and smell evolve? Where did the adaptive immune system come from?

There are hundreds of such unanswered questions in biology and biochemistry. In fact, if one opens an introductory biology or biochemistry book and places ‘why’ in front of every statement, one may be surprised by the lack of available answers for most. Many ‘accepted’ answers to evolutionary questions are no more sophisticated than Kipling’s Just So Stories after the technical mumbo-jumbo is cleaned away. Even the molecular answers are often derived from just one or two organisms (the bacterium E. coli and the eukaryote S. cerevisiae), and may not survive in the same form when a third kingdom (archaea) is added to the mix.


The availability of fully sequenced genomes may provide clues to solve many of those problems, and therefore scientists with computational skills looking for interesting puzzles will find a gold mine (or many) in those unexplored questions. That realization is what got me attracted to this new line of research. All you have to do is get the ~50,000 bacterial genomes, ~6,000 archaeal genomes and >500 (unicellular) eukaryotic genomes sequenced so far, and then ask whether the accepted answer to a question is valid or not. In fact, I can make the first part easy for those who are interested, because I am keeping an updated collection of those genomes (plus annotations) for my own work.
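For those wondering how such a collection might be kept organized, here is a minimal sketch. The directory layout and the `.fna` file naming are my assumptions for illustration, not a description of my actual setup; adapt them to however you store your genomes.

```python
# Minimal sketch of indexing a local genome collection by domain, assuming
# a layout like root/<domain>/<assembly>.fna (an illustrative convention).

import tempfile
from collections import Counter
from pathlib import Path

def count_genomes(root):
    """Count genome FASTA files per domain, assuming root/<domain>/*.fna."""
    counts = Counter()
    for domain_dir in Path(root).iterdir():
        if domain_dir.is_dir():
            counts[domain_dir.name] = sum(1 for _ in domain_dir.glob("*.fna"))
    return counts

# Self-contained demo with a throwaway layout:
with tempfile.TemporaryDirectory() as root:
    for domain, n in [("bacteria", 3), ("archaea", 2), ("eukaryota", 1)]:
        d = Path(root) / domain
        d.mkdir()
        for i in range(n):
            (d / f"genome_{i}.fna").touch()
    print(count_genomes(root))
```

With a real collection, the same one-directory-per-domain convention makes it trivial to iterate over, say, only the archaeal genomes when testing a hypothesis.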


With that general introduction, let me write more about the current paper. The abstract is presented below, and instead of copying the technical details from the paper, I will discuss some of the conceptual thinking that went into choosing this particular topic.

RNase P, a ribozyme-based ribonucleoprotein (RNP) complex that catalyzes tRNA 5′-maturation, is ubiquitous in all domains of life, but the evolution of its protein components (RNase P proteins, RPPs) is not well understood. Archaeal RPPs may provide clues on how the complex evolved from an ancient ribozyme to an RNP with multiple archaeal and eukaryotic (homologous) RPPs, which are unrelated to the single bacterial RPP. Here, we analyzed the sequence and structure of archaeal RPPs from over 600 available genomes. All five RPPs are found in eight archaeal phyla, suggesting that these RPPs arose early in archaeal evolutionary history. The putative ancestral genomic loci of archaeal RPPs include genes encoding several members of ribosome, exosome, and proteasome complexes, which may indicate coevolution/coordinate regulation of RNase P with other core cellular machineries. Despite being ancient, RPPs generally lack sequence conservation compared to other universal proteins. By analyzing the relative frequency of residues at every position in the context of the high-resolution structures of each of the RPPs (either alone or as functional binary complexes), we suggest residues for mutational analysis that may help uncover structure-function relationships in RPPs.


Connecting genetics with physical/chemical laws

There is a physical and chemical underlayer in all living organisms, on top of which the genetic layer stays active. That is self-evident, given that living organisms have to satisfy all known laws of physics and chemistry, or else have to have additional laws. The discovery of the latter would be news by itself.

Sadly, very few researchers with expertise in bioinformatics are looking into the connection between genomic information and the physical/chemical system. I came to this realization after working on the electric eel genome project, where we tried to figure out the genetic and evolutionary basis of electric eels giving massive electric shocks. In fact, those non-traditional organisms are so little explored that researchers can make unusual discoveries without the help of millions of dollars in grants. For example, check this single-author paper with simple experiments that led to a profound discovery - “The shocking predatory strike of the electric eel”.

Just like electric eels can help in connecting electricity and genetics, RNAs like tRNAs and RNase P can connect the knowledge of chemistry with genetics and provide an evolutionary basis. That is the primary reason for my interest in RNase P.


Why RNase P and tRNA?

In my very personal opinion, tRNA and RNase P are the molecules that connect the information content of the genome and the chemical underlayer of the cells. Moreover, recent discoveries suggest that at least one of these molecules (tRNA) is much less understood than previously thought. For example, the paper “The structural basis for specific decoding of AUA by isoleucine tRNA on the ribosome” by Rebecca M. Voorhees (with well-respected senior authors Uttam RajBhandary and V. Ramakrishnan) concludes -

The role of the agmatidine and lysidine modifications in isoleucine decoding is a notable example of the essential function of modifications in tRNA and, more generally, in RNA biology. It is increasingly evident that although the genetic code was first elucidated over 40 years ago, we are just beginning to appreciate the sheer complexity of ensuring accuracy in protein synthesis on the ribosome.

RNase P, whose function is intricately linked to tRNA, is even less studied with the aid of so many sequenced genomes than tRNA is. RNase P is a ribonucleoprotein complex whose RNA component is present as a ribozyme in bacteria, archaea and eukaryotes. Given that the enzyme performs the mundane task of cutting the 5’ end of precursor tRNA, why did it not get replaced by a protein early in the course of evolution? On the other hand, if the RNA has undiscovered major roles to play, why is it present in exactly one copy in all bacterial and archaeal genomes, and two (diverged) copies in eukaryotic genomes? Questions like these got me curious about tRNA and RNase P.


Why archaea?

Imagine you are an 18th-century naturalist looking for previously unknown large animals, and you are aware of four disconnected masses of land (Eurasia, Africa, the Americas and Australia), with the first three well explored. Where will you look for novelty? In the context of life on earth, archaea represent the unexplored continent.


The paper

Based on available genome sequences and structures, we looked into the evolution of the protein components of RNase P in archaea. The evolutionary origin of the protein component of the complex is quite puzzling. Its RNA component likely emerged before the origin of the genetic code, or all the way back in the ‘RNA world’, but the protein component appeared in an unusual manner. It has only one member in bacteria, five in archaea and nine or more in eukaryotes. The single bacterial protein has no match to anything in archaea or eukaryotes, although all protein components in archaea have orthologs in eukaryotes.

In the paper, we searched the entire collection of sequenced archaeal genomes for RNase P proteins, checked their genomic neighborhoods, analyzed phylogeny and compared the conserved amino acid locations with protein structures. One unexpected observation was that all five proteins were present in all phyla of archaea. We expected them to be added progressively to various phyla, and thought we could rank archaeal evolution based on the timing of those additions. That did not happen, opening up a hole in the general understanding of evolution. Did the entire complex in archaea appear all together very early during archaeal evolution, while the bacterial side got an entirely different protein? Proteins related to DNA replication and a few other major cellular tasks show similar behavior. That means there was a period after the separation of bacteria and archaea, but before the separation of the major archaeal phyla, when many important components of archaea appeared. How long was that period, and how did molecular evolution take place in that era?
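The core of that presence/absence check can be sketched in a few lines of Python. The phylum names and detection sets below are invented toy data, not the paper's results, and the RPP names are the commonly used labels rather than an exact transcription of our tables.

```python
# Toy sketch of the presence/absence analysis: given, for each archaeal
# phylum, the set of RNase P proteins detected in at least one genome,
# find the proteins present in every phylum. Data here are invented.

RPPS = ["RPP21", "RPP29", "RPP30", "POP5", "L7Ae"]  # commonly used labels

def universal_proteins(detections, proteins):
    """Return the proteins detected in every phylum of `detections`."""
    return [p for p in proteins
            if all(p in found for found in detections.values())]

detections = {
    "Euryarchaeota":  {"RPP21", "RPP29", "RPP30", "POP5", "L7Ae"},
    "Crenarchaeota":  {"RPP21", "RPP29", "RPP30", "POP5", "L7Ae"},
    "Thaumarchaeota": {"RPP21", "RPP29", "RPP30", "POP5", "L7Ae"},
}

print(universal_proteins(detections, RPPS))
# If every phylum has every protein, no progressive-addition ranking of
# the phyla is possible - which mirrors the surprise described above.
```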

Some of the evolutionary questions related to RNase P can be addressed through a better understanding of the functionality of the complex, and that is where our sequence-structure comparison will come in handy. We did all the computational analysis necessary to help an experimentalist initiate mutagenesis at the highly conserved locations of the proteins and understand how the amino acids work together with the ribozyme RNA to cut the tRNA.
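As a rough illustration of how conserved positions can be flagged from an alignment, here is a generic Shannon-entropy sketch; this is a standard textbook approach, not necessarily the exact scoring we used in the paper, and the toy alignment is invented.

```python
# Generic per-column conservation scoring for a protein alignment.
import math
from collections import Counter

def column_conservation(alignment):
    """Per-column conservation (1 - normalized Shannon entropy), gaps ignored.
    `alignment` is a list of equal-length aligned protein sequences."""
    scores = []
    for i in range(len(alignment[0])):
        residues = [seq[i] for seq in alignment if seq[i] != "-"]
        total = len(residues)
        entropy = -sum((n / total) * math.log2(n / total)
                       for n in Counter(residues).values())
        scores.append(1.0 - entropy / math.log2(20))  # 20 amino acids
    return scores

# Toy alignment: columns 0 and 1 are invariant, column 2 is variable.
scores = column_conservation(["MKL", "MKV", "MKI"])
conserved = [i for i, s in enumerate(scores) if s > 0.9]
print(conserved)  # the invariant columns are candidate mutagenesis sites
```

Positions scoring near 1.0 across many genomes are the natural first targets for the kind of mutational analysis described above.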





The most enjoyable aspect of this project was my collaborators (Professor Venkat Gopalan and his student Stella Lai, and Professor Charles Daniels - all from Ohio State University). Each of us brought unique expertise to the project. Dr. Gopalan’s lab specializes in RNase P, whereas Dr. Daniels is an expert on archaea and especially halobacteria. If there is one big take-home message I took from this project, it is that working in small groups asking focused questions is always a lot more exciting than ‘big science’.

The NY Times reports that the suicide rate in the USA is rising very fast.

Suicide in the United States has surged to the highest levels in nearly 30 years, a federal data analysis has found, with increases in every age group except older adults. The rise was particularly steep for women. It was also substantial among middle-aged Americans, sending a signal of deep anguish from a group whose suicide rates had been stable or falling since the 1950s.

The suicide rate for middle-aged women, ages 45 to 64, jumped by 63 percent over the period of the study, while it rose by 43 percent for men in that age range, the sharpest increase for males of any age. The overall suicide rate rose by 24 percent from 1999 to 2014, according to the National Center for Health Statistics, which released the study on Friday.

The increases were so widespread that they lifted the nation's suicide rate to 13 per 100,000 people, the highest since 1986. The rate rose by 2 percent a year starting in 2006, double the annual rise in the earlier period of the study. In all, 42,773 people died from suicide in 2014, compared with 29,199 in 1999.
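As a quick sanity check, the quoted numbers hang together: deaths rose about 46% while the rate rose only 24%, and the gap is roughly what US population growth over those 15 years would produce (the leftover mismatch with census figures comes from the rounding in the quoted 24%).

```python
# Quick consistency check on the NYT numbers quoted above.
deaths_1999, deaths_2014 = 29_199, 42_773
rate_rise = 0.24                                   # quoted: rate rose 24%

death_rise = deaths_2014 / deaths_1999 - 1         # rise in death count
implied_pop_growth = (1 + death_rise) / (1 + rate_rise) - 1

print(f"deaths rose {death_rise:.1%}; "
      f"implied population growth {implied_pop_growth:.1%}")
# ~46% vs ~18%: the implied population growth is in the ballpark of the
# actual US growth (~14%) given the rounding in the quoted rate figure.
```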

The article continues with several paragraphs of speculation from ‘experts’ on why the suicide rate is so high, and summarizes those views in one sentence.

The question of what has driven the increases is unresolved, leaving experts to muse on the reasons.

Our resident non-expert has a single-word explanation - debt.

Interestingly, the NY Times article also mentions -

N.I.H. funding for suicide prevention projects had been relatively flat, rising to $25 million in 2016 from $22 million in 2012

…with the suggestion that funding should be increased for ‘moaaaar studies’.

Our resident non-expert has different ideas.

Francis Collins Admits NIH Under Him Has Been Failing

Let's Discuss: Is it Time to Shut Down NHGRI?