Our #AGBT16 Forecast – Oxford Nanopore Will Go Out of Business by 2017

Investor warning: The following post is for entertainment purposes only and should not be considered financial advice of any sort. Please consult your favorite government-certified investment adviser or central banker to decide where to invest your life savings.
———————————————————-

Our regular readers know that our track record for making bold forecasts is not at all good. Please allow us to keep trying so that we can improve our skills.

Oxford Nanopore appears to be on a roll. This morning, Nick Loman presented his work on Ebola detection at the AGBT conference. The work was recently published in Nature.


Elsewhere, Omics!Omics! blog wrote a few weeks back –

When it comes to Nanopore, am I too GAGA?

SCIENTIFIC DREAM INCARNATE

When I first rotated in George Church’s lab in the summer of 1992, I naturally met each person and inquired as to their project. A post-doc named Rich told me a crazy idea of George’s that one might be able to sequence DNA using patch-clamp techniques as the DNA passed through a pore in a membrane. Later, Rich had a patch clamp setup — it was on a vibration-isolating table enclosed in a Faraday cage. It was at least the size of a desk. Even with all that, Rich would say he could tell who was walking down the blind hallway adjacent to it, based on the cadence from the footsteps creating noise on his instrument.

Fast forward 22 years, and we have our MinION at Starbase, ready to go. No air table, no Faraday cage — and I could stuff a shirt pocket with half a dozen of them. We loaded it with a library which my colleague created, and let it run. Then I looked at the data. Much of it was junk, and the yields that MinKNOW promised of long reads just weren’t there — but what was there was a single read which could be recognizably aligned to the entire 48.5 kilobases of lambda phage. Kaboom! That’s one hell of an amazing read.

We also wrote about “Two Potentially Important Developments on Nanopore“, and one of those developments was considered a paradigm shift by Mick Watson (Did you notice the paradigm shift in DNA sequencing last week?).

Despite all those positives, things are not looking good for Oxford Nanopore, IMHO. Let us present the assessment behind the bold call in the title.

——————————————————-

1. Amazing technology

Nanopore fans often highlight the amazing technology that the company has built. Historically, companies with mind-boggling technologies do go down from time to time; the supersonic Concorde is the best example of that. Amazing technology alone does not cut it, unless a number of other factors are supportive, as listed below.

———————————————————

2. Investment and valuation

[Chart: Illumina share price, with an arrow marking Oxford Nanopore’s July 2015 funding announcement]

Oxford Nanopore raised a large amount of money in mid-July 2015, and the arrow on the above chart shows the exact date when the public announcement was made. The comparison with Illumina is significant, because investors usually make valuation decisions by benchmarking against important public companies. Illumina, being the 500-pound gorilla of the sequencing space, is likely that benchmark.

Fast forward to today, and Illumina’s valuation has been cut by nearly half. That means high-flying unicorns (i.e. profitless private companies with >$1B valuations) can be expected to lose even more of their valuations.

However, not being able to raise more money is not a business-killer by itself, unless the large investments already raised are mismanaged.

———————————————————

3. Mismanagement

Readers may have noticed a tweet related to ‘PromethION’ in yesterday’s blog post from AGBT.

[Tweet screenshot]

That appears rather innocuous, because companies sometimes have delays in delivering products. However, once you go through the comments at Glassdoor, a rather ominous picture develops. One person says the CTO ‘announces products that don’t exist and before we know if they’re possible (usually they’re not and so we waste months and millions)’. Is ‘PromethION’ one such product?

“Management don’t listen to engineers”

I have been working at Oxford Nanopore Technologies full-time
Pros
You can work on interesting projects if you’re in the right team, opportunity to move between teams, friendly staff, opportunity to go on training courses, good private healthcare package, very commutable location.
Cons
The company is badly mismanaged – engineers are routinely ignored and as a consequence people take to glassdoor to vent. The CEO thinks that the organisation is very communicative and therefore makes zero effort to listen to technical staff (CEO just makes a bad joke about P45s every few months to nervous laughter). The attack on the CTO by the previous commenter isn’t completely justified, he’s not as scientifically illiterate as made out. However he does need to go – he announces products that don’t exist and before we know if they’re possible (usually they’re not and so we waste months and millions), and shuts people off mid-sentence if they disagree with him. He also posts bizarre things on the internet. The main effect of this is that engineers feel frustrated and unable to make ourselves heard.
Advice to Management
Make a real effort to improve communication, with everyone, don’t make them feel like they will be demoted/fired if they disagree with you. Rein in CTO.

Interesting tech but political

Pros
Good pay and benefits (private healthcare). Overtime is not required, though sometimes appreciated. Can in general be reimbursed for holidays not taken or carry them over to the next year. Basic technology is really interesting, and important (significant scientific and social impact). Regular bonuses and employees receive share options.
Cons
I didn’t like the management style very much, seemed quite political at times. I guess having raised so much money there was a huge pressure to hire, and large teams were formed without them really having work to do. Management seems quite isolated from the reality of what’s happening.
Advice to Management
Restructure, smaller more focused groups. Address issues with political individuals.

“This place is doomed”

I have been working at Oxford Nanopore Technologies full-time (More than 5 years)
Pros
The vision isn’t bad. The people in the trenches are brilliant and wonderful. The tech is cool.
Cons
The management, the management, the management, the management. The endless politics, the backstabbing, the lack of recognition.
Advice to Management
Stop keeping people in the dark. Stop playing games with people’s lives. Try to have a sense of humor.

———————————————————

4. Cost

One question that keeps coming back is how much it costs to do the ‘cool USB sequencing’ after various factors are included. For example, Jason Chin from PacBio asked on Twitter about the total cost of Loman’s Ebola sequencing.

[Tweet screenshot]

USB sticks used with computers are popular not only because they are portable, but also because they are cheap. Is the same true for nanopore ‘sticks’?

This point about cost is intricately related to points 2 and 3, because (i) a company with large investments and a high valuation needs to raise prices to pay back the investment, and (ii) if the company is not managed in a lean manner, there is less room for cutting costs.

———————————————————

5. Not so amazing technology yet

Our point 1 was about ‘amazing technology’ as seen from the point of view of the company’s fans. Their tweets made us re-evaluate various papers at bioRxiv, and we are not so enamored on this point.

For example, in our previous blog posts, readers mentioned space as a potentially big application for the technology. They may take a look at the following paper from NASA, who did not manage to make good use of the sequencer. After a full evaluation, they wrote –

Nanopore Sequencing in Microgravity

The ability to perform remote, in situ sequencing and diagnostics has been a long-sought goal for point-of-care medicine and portable DNA/RNA measurements. This technological advancement extends to missions beyond Earth as well, both for crew health and astrobiology applications. However, most commercially available sequencing technologies are ill-suited for space flight for a variety of reasons, including excessive volume and mass, and insufficient ruggedization for spaceflight. Portable and lightweight nanopore-based sequencers, which analyze nucleic acids electrochemically, are inherently much better suited to spaceflight, and could potentially be incorporated into future missions with only minimal modification. As a first step toward evaluating the performance of nanopore sequencers in a microgravity environment, we tested the Oxford Nanopore Technologies MinION™ in a parabolic flight simulator to examine the effect of reduced gravity on DNA sequencing. The instrument successfully generated three reads, averaging 2,371 bases. However, the median current was shifted across all reads and the error profiles changed compared with operation of the sequencer on the ground, indicating that distinct computational methods may be needed for such data. We evaluated existing methods and propose two new methods; the first new method is based on a wave-fingerprint method similar to that of the Shazam model for matching periodicity information in music, and the second is based on entropy signal mapping. These tools provide a unique opportunity for nucleic acid sequencing in reduced gravity environments. Finally, we discuss the lessons learned from the parabolic flight as they would apply to performing DNA sequencing with the MinION™ aboard the International Space Station.

The CTO of the company challenged their assertion in the comments section, and his point is scientifically valid. The larger point is that NASA did not find the technology easy to use, and it will possibly take several iterations before they pay for it. At this point, they remain unconvinced customers.

I am highly sceptical that gravity changes the operation of the chemistry which works on a very small scale and is driven by ion flow and diffusion – and there isn’t enough data here to test that hypothesis fully yet anyway. There are lots of other things that might indirectly affect operation of the entire system, the simplest being that you had bad chips – but even so, given the amount of data, at this stage i’d be sceptical that the differences you have seen do not have simpler causes or are just statistically insignificant.

An interesting alternative might be instead, for example, take a MinION system that is operating normally from the start so providing a baseline, drop it from a tall building in a box, and continue to record the data. As I understand, the vomit comet is simply freefall in an enclosed space removing decelerating wind resistance. This drop experiment, whilst short lived, would at least provide an internal control for the setup, chip quality, buffers, operator etc and any changes in state from baseline ‘normal’ caused by the freefall itself could be seen against that. My bet, just a bet, is that there wouldn’t be any changes to the data, in fact – Im thinking this is a helpful test we could do. Not least as we don’t have a Vomit Comet.

A valiant effort nonetheless, and my suggestion would be overcome any operational obstacles to generating many more reads on a normal run caused by doing the set up in low G. Use of VolTRAX instead of tip and Gilsons (and humans) would remove the operator/liquid handling difficulties and any variance coming from it, again isolating the core system.

Even if there was a systematic effect on the operation of the sensing/chemistry itself, provided the overall S:N is preserved and we still have step wise movement of the DNA under a constant applied current – it is probably learnable and amenable to the same decoding methods used now, new methods would not be needed.

—————————————————–

After considering all the pluses and minuses, our assessment is that this company managed to raise money in a very positive environment and will not be able to pull through a global financial rout. Moreover, the hostility (based on Glassdoor) between management and rank-and-file employees will lead to the demise or fire sale of the company in such an environment.

Senior management, in particular the CTO, have placed company in a situation where we are not profitable and our jobs are at risk. While employees worry about their employment prospects some senior management park Ferraris and Porches in the car park at the building entrance for everyone to see.
* CTO surrounds himself with yes men, demoting those who disagree with him, He is also arrogant to the point which many find extremely off-putting and has very limited understanding of core scientific concepts. There is not one technical person in the company with a positive word to say about him.
* It is unclear how long we can survive unprofitable. Management share none of the finances with staff.
* Highly-political environment.

Future of Science is Being Decided in Aleppo, Syria

The ScienceAlert website posted an article about the Sci-Hub site.

Researcher illegally shares millions of science papers free online to spread knowledge

Here is the ‘he says, she says’ part from the article.

sci-hub is bad

The big publishers are pissed off. Last year, a New York court delivered an injunction against Sci-Hub, making its domain unavailable (something Elbakyan dodged by switching to a new location), and the site is also being sued by Elsevier for “irreparable harm” – a case that experts are predicting will win Elsevier around $750 to $150,000 for each pirated article. Even at the lowest estimations, that would quickly add up to millions in damages.

sci-hub is good

She also explains that the academic publishing situation is different to the music or film industry, where pirating is ripping off creators. “All papers on their website are written by researchers, and researchers do not receive money from what Elsevier collects. That is very different from the music or movie industry, where creators receive money from each copy sold,” she said.

Irrespective of those emotional points, the article makes clear that the site has no worries about being shut down, as long as it is deemed legal in Russia.

Astute readers may realize that we are seeing a reversal of the events of 400 years back, when the British empire rose on the back of pirates. Naval trading with Asia was the ‘cutting edge industry’ of that time. If you check the list of pirates during the ‘golden age of piracy’ and before, you will find that most came from England and Scotland.

England and Scotland practiced privateering both separately and together after they united to create the Kingdom of Great Britain in 1707. It was a way to gain for themselves some of the wealth the Spanish and Portuguese were taking from the New World before beginning their own trans-Atlantic settlement, and a way to assert naval power before a strong Royal Navy emerged.

Sir Andrew Barton, Lord High Admiral of Scotland, followed the example of his father, who had been issued with letters of marque by James III of Scotland to prey upon English and Portuguese shipping in 1485; the letters in due course were reissued to the son. Barton was killed following an encounter with the English in 1511.

Spain, militarily the big dog in those days, tried to stop piracy, but lost its naval ships as a result.

Sir Francis Drake, who had close contact with the sovereign, was responsible for some damage to Spanish shipping, as well as attacks on Spanish settlements in the Americas in the 16th century. He participated in the successful English defence against the Spanish Armada in 1588, though he was also partly responsible for the failure of the English Armada against Spain in 1589.

Sir George Clifford, 3rd Earl of Cumberland was a successful privateer against Spanish shipping in the Caribbean. He is also famous for his short-lived 1598 capture of Fort San Felipe del Morro, the citadel protecting San Juan, Puerto Rico. He arrived in Puerto Rico on June 15, 1598, but by November of that year Clifford and his men had fled the island due to fierce civilian resistance. He gained sufficient prestige from his naval exploits to be named the official Champion of Queen Elizabeth I. Clifford became extremely wealthy through his buccaneering, but lost most of his money gambling on horse races.

That brings us to today, when it is not possible to stop ‘journal piracy’ without defeating Putin at war. Nature may have recognized the connection best, given how rabidly anti-Russian its editorial pages have gone over the last few years (check – “Russian Government is Beheading Scientists in Red Square” – Nature Reports).

That brings us to the war front of today, where NATO countries are fighting against Russia/Iran/Syria through their ISIS proxy funded by USA/Turkey/Saudi Arabia/Qatar. Based on the latest news, the home team is losing badly and is asking for a time-out.

Maybe it is time for the journals to draft those scientists who are staunch believers in ‘high impact factor’ glam journals, and send them to the war front :).

The Virus Spreading in Europe

Max Delbrück, together with Luria and Hershey, received the Nobel Prize in 1969 “for their discoveries concerning the replication mechanism and the genetic structure of viruses”.

Delbrück had another connection with a virus that is currently spreading fast in Europe. His grandfather’s brother founded Deutsche Bank ~150 years back. Europe’s largest bank is now in trouble and is in desperate need of a bail-out/bail-in. Moreover, when the largest European bank goes down, that event has the potential to take down several other banks. In the worst case, the entire global and interconnected banking system could experience a ‘blue screen of death’ on some sunny day. No wonder central bankers are panicking again.

Deutsche Bank Spikes Most In 5 Years (Just Like Lehman Did)


Of course, the failure of its largest bank is just what Europe needs at this moment.

Live Streaming of PacBio Session at #AGBT16

You will enjoy a number of very good talks on the Sequel system.

Also, if you are new to long reads, please check out our tutorial by joining here. We have done incredible work to simplify your life. You get a cloud account where all the software is installed, so you can learn in parallel while reading the instructions.


The Most Important Part of Flatley’s #AGBT16 Talk

[Screenshot]

[Screenshot]

Think about it for a moment. When Collins made dreamy forecasts about the benefits of the human genome project in 1999 (check “A hypothetical case in 2010“), sequencing cost was the biggest hindrance to his forecast, not the science. Now that sequencing cost has fallen by many orders of magnitude beyond what Collins anticipated, we find out that the ‘science’ was never there.

You notice the same lack of scientific discoveries in another talk of the morning, the one from Anne Wojcicki of 23andMe. Several tweets were made about finding an association between genome sequence and responding to spam emails (geez!), but no huge breakthroughs were reported.

The above failure does not mean all hope is lost, however. In the coming week, we will post an interview with a researcher who pursued the right science and made an incredible contribution that will very likely lead to curing diseases. Small hint – he found our post “A Must Read Book, If You Like to Understand Genomes“ helpful in his work. In contrast, the failure of NHGRI is simply due to Collins and NHGRI following an incorrect model of how genomes work, despite warnings from Ken Weiss and others like him.

Notes from Technology Day of #AGBT16

The AGBT16 conference is currently taking place in Florida. Those not present at the conference can learn a lot from this link.

Highlights and gossip –

1. Word cloud (it is all about sequencing)

[Word cloud image]

2. Long reads

PacBio will host a session on long reads today. We will cover the main points in a later blog post.

Also, our tutorial on the analysis of long-read sequences will be available to members today. Please join here if you work on long-read data.

3. Jay Flatley’s talk

The OmicsOmics blog posted the Illumina CEO’s talk in Storify format here. The section “Apple Lisa & Bead Express: Examples of new tech not working out” is interesting. He also talked about Grail (Illumina’s liquid biopsy venture) and the Firefly prep module.

Flatley: 9 sample system, 3.5h lib prep unattended. Electrode pads in disposable cart (is this Adv Liq Logic tech?)

JF: digital fluidic library prep, 8 samples in parallel, reagents already there, library in 3 hrs unattended

Flatley: Second cartridge does clustering, PE steps, 3.5h to 13h run-times. Driven off wireless iPad #AGBT16

Flatley: 2 modules library prep feeding multiple sequencers. 1G/run, 4M reads, $100/sample, $30K for both modules

JF: 1G per run, 4M reads, $100 per sample, $30k for machine (library prep and sequencer)

4. Nanopore gossip

[Tweet screenshot]

Flatley is not happy to hear that.

[Tweet screenshot]

UC Berkeley is Running out of Money. Here is What They Plan to Do

Despite having a huge technology boom in its backyard, UC Berkeley’s finances are diving deep into the red.

The Chronicle of Higher Education reports in “Acknowledging ‘New Normal,’ Berkeley Announces Broad Plan to Shore Up Finances” –

Citing a “new normal” of diminished state funding, the University of California at Berkeley is launching a “strategic-planning process” aimed at shoring up its finances. In a message to the campus on Wednesday morning, the university’s chancellor, Nicholas B. Dirks, laid out several broad areas for change the university will consider.

Specifics about the “strategic-planning process” are scarce. What is clear, however, is the university’s harsh financial outlook. According to a financial summary provided by a university spokesman, Berkeley faces a $150-million deficit for the current fiscal year, representing 6 percent of its operating budget.

Here are the expected plans from the above article. We also provide a plain-English interpretation of the MBA-speak –

Proposal 1. Raising revenue through the use of the university’s “‘brand,’ land, and other assets.”

In plain English: (i) raise tuition on suckers, (ii) take out loans against the buildings, (iii) lease some of the UC land to private companies. Among those, the first option is possibly out, given that –

UC Berkeley and Oakland high school students protest tuition hikes, other higher education issues


Students from both UC Berkeley and Oakland Technical High School organized two demonstrations on campus Wednesday — one to protest proposed tuition hikes and the other to advocate admitting more Oakland public school students to UC Berkeley.

The UC Berkeley students’ demonstration — called a “funeral for public education” — was a part of the 96 Hours of Action, a series of protests throughout UC campuses against a tuition hike policy passed in November. The protesters carried a cardboard coffin representing the death of California’s public education system. They began on Sproul Plaza, walked to California Hall and then marched back to Telegraph Avenue and Bancroft Way, blocking traffic for a short period of time.

———————————

Proposal 2. The “realignment” of some of the university’s academic parts — by bolstering some units and combining others to “capture intellectual synergies.”

“Realignment” is a code word for “downsizing”, which itself is a code word for getting rid of employees. Given that the UCs cannot fire tenured professors yet, this step will lead to combining departments and cutting everything else. This is gonna be fun.

———————————

Proposal 3. Better supporting teaching and research “while also redesigning many of our work processes in order to achieve greater efficiency.”

This is MBA-speak with no additional action beyond Proposal 2. Maybe they will replace classroom lectures with videos from Coursera and TAs with robots from Google :). “Efficiency” always means replacing humans with “smart” machines.

————————————————————————-

We should point out that none of those is going to work. What is UC Berkeley’s real problem? It is the Federal Reserve cutting interest rates to 0% and bleeding all kinds of savings funds. Let us explain.

Suppose an organization needs to hire a tenured professor. That means the organization needs to make sure it can provide a steady salary for that person for many years.

Think about it this way. Suppose you run a company and want to hire an employee on a six-month contract at $10,000/month. That means your company needs to secure $60K of cash flow to make sure the employee’s contract is not violated. Historically, that cash flow came either from continuing business or from interest on savings made by the business. For charitable institutions, the second option historically played a big role.

For example, if the company had $1M in the bank earning 6%, it could use the earned interest to pay the employee’s salary without worrying about business prospects. Many organizations like universities and charitable non-profits had substantial savings, and they used the interest to pay for employees in perpetuity. Even the Nobel Foundation uses the same model to pay the prize in perpetuity. This kind of ‘paying perpetually from interest’ calculation was routine everywhere. For example, I remember that when our family made a small donation for a college award several years back, the head of the college did the same interest-rate calculation to explain how large the donation needed to be to fund the award amount.

A zero percent interest rate throws a monkey wrench into the entire model and especially damages long-term hiring. The US 10-year Treasury bond yields 1.7% at present. Therefore, to permanently pay someone a salary of $200K/year from interest alone, an organization needs to keep roughly $12M in 10-year Treasury bonds, whereas the requirement was only about a third of that a few years back, when yields were closer to 5% (see the worked arithmetic below). So UC Berkeley has to either dramatically increase its savings fund with each drop in interest rates, or cut down on its ambitions of hiring tenured professors. Most universities are going for the second option.
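For readers who want to check the numbers themselves, here is a minimal sketch of the perpetuity arithmetic. The salary and the two interest rates are the illustrative figures used above, not data about any actual endowment.

```python
# Minimal sketch of the 'pay a salary from interest in perpetuity' arithmetic.
# The salary and the two rates are illustrative assumptions from the text above.

def principal_needed(annual_salary: float, interest_rate: float) -> float:
    """Principal whose yearly interest alone covers the salary, forever."""
    return annual_salary / interest_rate

salary = 200_000  # hypothetical tenured-professor salary, $/year

for rate in (0.05, 0.017):  # ~5% (a few years back) vs. ~1.7% (10-year Treasury, 2016)
    print(f"At {rate:.1%}, the endowment must hold about ${principal_needed(salary, rate):,.0f}")

# Output:
# At 5.0%, the endowment must hold about $4,000,000
# At 1.7%, the endowment must hold about $11,764,706
```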

Are we glad that these universities are suffering? Absolutely, given that this dumb economic model of lowering interest rates to support banks came from none other than academics at various top US universities.


Finally, if you think a 0% interest rate is the end of this nonsense, you may be glad to know that academics are already talking about deep negative interest rates (check – JPM’s Striking Forecast: ECB Could Cut Rates To -4.5%; BOJ To -3.45%; Fed To -1.3%). God bless UC Berkeley when that happens.

The above money problems are another reason UC Berkeley is going for an all-or-nothing CRISPR-Cas9 court battle with Harvard/MIT/Broad. Readers may check “CRISPR Fight and the Corrosive Role of Bayh-Dole Act in Damaging Basic Science“.

Combinatorial Scoring of Phylogenetic Networks

Several eons ago, we reviewed “Algorithms for Constructing of Phylogenetic Network (Reticulate Evolution)”. Those who are interested in the latest and greatest may check this new arXiv paper.

Construction of phylogenetic trees and networks for extant species from their characters represents one of the key problems in phylogenomics. While solution to this problem is not always uniquely defined and there exist multiple methods for tree/network construction, it becomes important to measure how well constructed networks capture the given character relationship across the species.
In the current study, we propose a novel method for measuring the specificity of a given phylogenetic network in terms of the total number of distributions of character states at the leaves that the network may impose. While for binary phylogenetic trees, this number has an exact formula and depends only on the number of leaves and character states but not on the tree topology, the situation is much more complicated for non-binary trees or networks. Nevertheless, we develop an algorithm for combinatorial enumeration of such distributions, which is applicable for arbitrary trees and networks under some reasonable assumptions.

RECKONER: Read Error Corrector Based on KMC

Our readers are already familiar with the work of Sebastian Deorowicz and colleagues through their innovative KMC 2 k-mer-counting program. In this new paper posted at arXiv, they used KMC 2 to build an error-correction program.

Motivation: Next-generation sequencing tools have enabled producing of huge amount of genomic information at low cost. Unfortunately, presence of sequencing errors in such data affects quality of downstream analyzes. Accuracy of them can be improved by performing error correction. Because of huge amount of such data correction algorithms have to: be fast, memory-frugal, and provide high accuracy of error detection and elimination for variously-sized organisms.

Results: We introduce a new algorithm for genomic data correction, capable of processing eucaryotic 300 Mbp-genome-size, high error-rated data using less than 4 GB of RAM in less than 40 minutes on 16-core CPU. The algorithm allows to correct sequencing data at better or comparable level than competitors. This was achieved by using very robust KMC 2 k-mer counter, new method of erroneous regions correction based on both k-mer counts and FASTQ quality indicators as well as careful optimization. Availability: Program is freely available at this http URL Contact: [email protected]
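To give a flavor of what k-mer-count-based correction means, here is a deliberately tiny sketch of the general k-mer-spectrum idea: k-mers seen many times across the reads are trusted as ‘solid’, and bases whose substitution increases the number of solid k-mers covering the read are treated as likely errors. This is not RECKONER’s actual algorithm (which relies on KMC 2, FASTQ qualities, and much more careful region-based heuristics); the k-mer size, threshold, and toy reads below are assumptions for illustration only.

```python
# Toy sketch of k-mer-spectrum error correction (not the RECKONER algorithm).
from collections import Counter

K = 5        # toy k-mer size (real tools use k around 20-30)
SOLID = 2    # k-mers seen at least this many times are considered trusted

def kmers(seq, k=K):
    return [seq[i:i + k] for i in range(len(seq) - k + 1)]

def solid_cover(read, counts):
    """Number of k-mers in the read that are 'solid' (seen often enough)."""
    return sum(counts[km] >= SOLID for km in kmers(read))

def correct_read(read, counts):
    """Greedy single-base correction: at each position, keep the base that
    maximizes the number of solid k-mers covering the read."""
    bases = list(read)
    for i in range(len(bases)):
        best_base = bases[i]
        best_score = solid_cover("".join(bases), counts)
        for b in "ACGT":
            if b == bases[i]:
                continue
            trial = bases[:i] + [b] + bases[i + 1:]
            score = solid_cover("".join(trial), counts)
            if score > best_score:
                best_base, best_score = b, score
        bases[i] = best_base
    return "".join(bases)

# Toy example: five clean copies of a sequence plus one read with a single error.
reads = ["ACGTACGTACGT"] * 5 + ["ACGTACCTACGT"]
counts = Counter(km for r in reads for km in kmers(r))
print(correct_read("ACGTACCTACGT", counts))  # prints the corrected "ACGTACGTACGT"
```

The real tools replace the in-memory Counter with a disk-based counter such as KMC 2 and restrict correction attempts to positions flagged by low quality scores, which is what keeps memory and runtime within the limits quoted in the abstract.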

Another memory-efficient approach was published in LoRDEC, which used Rayan Chikhi et al.’s GATB library.

Zika-Oxitec Connection Can be Explained without Resorting to ‘Conspiracies’

Zerohedge reported –


Zika Outbreak Epicenter In Same Area Genetically-Modified Mosquitoes Released In 2015

When examining a rapidly expanding potential pandemic, it’s necessary to leave no stone unturned so possible solutions, as well as future prevention, will be as effective as possible. In that vein, there was another significant development in 2015.

Oxitec first unveiled its large-scale, genetically-modified mosquito farm in Brazil in July 2012, with the goal of reducing “the incidence of dengue fever,” as The Disease Daily reported. Dengue fever is spread by the same Aedes mosquitoes which spread the Zika virus — and though they “cannot fly more than 400 meters,” WHO stated, “it may inadvertently be transported by humans from one place to another.” By July 2015, shortly after the GM mosquitoes were first released into the wild in Juazeiro, Brazil, Oxitec proudly announced they had “successfully controlled the Aedes aegypti mosquito that spreads dengue fever, chikungunya and zika virus, by reducing the target population by more than 90%.”

Though that might sound like an astounding success — and, arguably, it was — there is an alarming possibility to consider.

Nature, as one Redditor keenly pointed out, finds a way — and the effort to control dengue, zika, and other viruses, appears to have backfired dramatically.

——————————————————————

A poorly trained technical writer at Discover magazine argued –

No, GM Mosquitoes Didn’t Start The Zika Outbreak

A new ridiculous rumor is spreading around the internets. According to conspiracy theorists, the recent outbreak of Zika can be blamed on the British biotech company Oxitec, which some are saying even intentionally caused the disease as a form of ethnic cleansing or population control. The articles all cite a lone Redditor who proposed the connection on January 25th to the Conspiracy subreddit. “There are no biological free lunches,” says one commenter on the idea. “Releasing genetically altered species into the environment could have disastrous consequences” another added. “Maybe that’s what some entities want to happen…?”

For some reason, it’s been one of those months where random nonsense suddenly hits mainstream. Here are the facts: there’s no evidence whatsoever to support this conspiracy theory, or any of the other bizarre, anti-science claims that have popped up in the past few weeks. So let’s stop all of this right here, right now: The Earth is round, not flat (and it’s definitely not hollow). Last year was the hottest year on record, and climate change is really happening (so please just stop, Mr. Cruz). And FFS, genetically modified mosquitoes didn’t start the Zika outbreak.

——————————————————————

Let us try it differently.

Observation: a correlation between the spread of the Zika virus and Oxitec’s actions has been observed.

Correlation does not mean causation (check – “Do not Go for a Swim after Having an Ice Cream”). However, the correlation is still a valid observation that needs to be explained.

The problem with Discover Magazine’s argument is that, instead of thinking through the various ways the correlation could be explained and ruling them out, it presented its own model of why there could not be any connection between Oxitec and Zika, and then boldly claimed that this is the only possible model.

How about another possibility? Let us say there are two subspecies of the Aedes aegypti mosquito – Aedes aegypti “dengue” and Aedes aegypti “zika” – spreading dengue and Zika respectively. Current scientific knowledge assigns both diseases to a single ‘Aedes aegypti’, but that may be a limitation of current knowledge. It is possible that Oxitec targeted the Aedes aegypti “dengue” subspecies and wiped it out. As a result, the other mosquito fills up its space. Would Zika cases explode in such a scenario? A toy simulation of this ‘competitive release’ idea is sketched below.
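Purely as an illustration of the reasoning, not as epidemiological evidence, here is a toy competitive-release simulation. The two ‘subspecies’, the growth rate, the carrying capacity, and the suppression strength are all made-up assumptions; the only point it makes is that suppressing one competitor can let the other expand into the vacated niche.

```python
# Toy model of 'competitive release' -- an illustration only, not evidence.
# Two hypothetical mosquito subpopulations share one niche; every parameter
# below (growth rate r, carrying capacity K, suppression strength) is made up.

def step(d, z, r=0.5, K=1.0, suppression=0.0):
    """One generation of logistic growth with a shared carrying capacity K.
    d = hypothetical 'dengue' subspecies, z = hypothetical 'zika' subspecies."""
    total = d + z
    d_next = d + r * d * (1 - total / K) - suppression * d
    z_next = z + r * z * (1 - total / K)
    return max(d_next, 0.0), max(z_next, 0.0)

d, z = 0.8, 0.1  # start: the 'dengue' subspecies dominates the niche
for gen in range(60):
    # from generation 20 on, the GM release suppresses only the 'dengue' subspecies
    d, z = step(d, z, suppression=0.9 if gen >= 20 else 0.0)
    if gen % 10 == 0:
        print(f"gen {gen:2d}: dengue-type {d:.2f}, zika-type {z:.2f}")

# In this toy run the 'zika' subpopulation expands into the vacated niche, so
# anything proportional to its abundance (e.g. Zika transmission) rises. That
# is all the hypothetical scenario above requires; it says nothing about
# whether this is what actually happened in Brazil.
```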

—————————————————–
