Monday, December 26, 2016

What's really up with Roche and PacBio?

You're probably aware that Roche recently severed its 2013 partnership with PacBio and decided to go it alone without access to long read SMRT technology. Neil Gunn, Roche's head of sequencing R&D, explains that Roche will now 'focus more intently on its internal R&D efforts to "drive our long term strategy, which is to be a leader in clinical diagnostic sequencing."' This wouldn't have anything to do with the 2014 acquisition of Genia, would it?

Keith Robison at Omics! Omics! has already speculated about various reasons for Roche's breakup, but I think the developments around Genia are the primary reason for this parting of ways. As Robison explains:
The obvious culprit would be Genia, the nanopore sequencing technology which Roche purchased. ... Genia has twice reported this year technical success, first with their nucleotide chemistry and then their protein engineering.

But good publications do not a commercial instrument make, and there has been no signal that a launch is imminent.  Perhaps we will hear a lot more at JP Morgan or AGBT, but Genia has burned a lot of credibility in the past with premature announcements.
This may be true, but consumers in the genomics space have been pretty forgiving:
Many have complained that ONT announced vaporware back in 2012, but they've been delivering devices to customers since mid-2014, and as of now there are a lot of working ONT devices in the field.

But back to Roche/Genia. It's tempting to speculate that Roche is building a platform to compete with Oxford Nanopore's technology, and on the surface that's exactly what it looks like. However, the technologies, and the way sequence is read from them, are different: ONT's nanopores detect electrical signals generated by DNA k-mers traveling through the pore, whereas Genia's technology reads DNA by having a polymerase send engineered tags through its pore.

How will Oxford Nanopore's MinION compare with Roche's Genia?

At this point, they look like competing technologies in the nanopore sequencing space, but without more information on the Genia it's hard to make a call. However, since the Genia relies on proprietary modified bases called NanoTags read by a nanopore, I'm going to speculate and assume that the Genia will have a higher consumable cost as a result; potentially a big drawback for the Roche platform. Nevertheless, here's how they compare across two key value drivers.

Read accuracy is a big deal. At the moment, it looks like Genia might have a leg up thanks to its NanoTag read system. I suspect that it's easier to call bases directly from engineered NanoTags than it is to infer bases from the electrical signals generated by DNA passing through a pore, as is the case with ONT. If Roche can make up the additional cost of NanoTag reagents by offering higher read accuracy, the Genia might be worth it, particularly if the flow cell cost is kept below ONT's.

Cost. There are two major categories of cost sequencing teams are concerned with: operating costs and capital costs. The nice innovation that ONT has brought to the genomics field is the idea of zero (or nearly zero) upfront capital cost: you buy a flow cell for <$1000 and start sequencing. The capital cost of a Genia is an unknown, but if it's anywhere near that of PacBio or Illumina equipment it'll be an obstacle to adoption in the field. Unfortunately, without a target purchase price (and other assumptions about the Genia's longevity) it's impossible to come up with a cost model and an estimated $/gigabase.
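To make that concrete, here's a minimal back-of-the-envelope sketch of how such a $/gigabase model would be assembled once Roche publishes real numbers. Every figure below (instrument price, runs per instrument, consumable cost, yield per run) is an assumption for illustration only, not vendor guidance.

def cost_per_gigabase(instrument_cost, instrument_lifetime_runs,
                      consumables_per_run, gigabases_per_run):
    """Amortize capital cost over the instrument's useful life, add per-run
    consumable cost, and normalize by sequencing yield."""
    capital_per_run = instrument_cost / instrument_lifetime_runs
    return (capital_per_run + consumables_per_run) / gigabases_per_run

# ONT-style scenario: no instrument, ~$900 flow cell, ~5 Gb per run (all assumed).
print(cost_per_gigabase(0, 1, 900, 5))          # 180.0 $/Gb

# Hypothetical Genia scenario: $50k box over 500 runs, $400 consumables, 5 Gb/run.
print(cost_per_gigabase(50_000, 500, 400, 5))   # 100.0 $/Gb

The takeaway isn't the numbers themselves; it's that the capital term only stops mattering if the instrument survives enough runs, which is exactly the longevity assumption we don't have for the Genia.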

Some guidance from Roche is needed here, both in terms of cost and accuracy. I haven't even touched on read length as a dimension of performance, but given ONT's extreme read lengths (up to 200kb, more commonly ~50kb), Genia needs to easily hit >30-50kb to even be considered competitive.

Saturday, October 22, 2016

Why you should be wary of companies hiding everything for 'intellectual property reasons'

Think of Theranos. I remember reading this article when it came out, thinking 'what can they possibly be protecting?':
The technical details about Theranos' seemingly revolutionary tests are hard to come by, and the company is known for its secrecy about its founder's invention.

There's one fundamental question, one that in some ways is unanswerable without revealing information that Theranos wants to keep confidential: How, exactly, does what Holmes invented work? ... [The company] hasn't published peer-reviewed studies comparing its tests to traditional ones, and the company hasn't allowed independent experts to publicly assess its labs, citing the need to protect its intellectual property.
IP comes in many flavours: patents, designs, trade secrets, and business processes, to name a few. Not all intellectual property can be protected like a patent, but some of it can, and that means that not all intellectual property needs to stay secret.

When a company like Theranos doesn't even let independent experts in to check out the technology and say 'Yes, I've seen what they do and it works', which they could conceivably do under an NDA, it raises a red flag that something isn't right. Most medical device/diagnostic companies have at least something that can go through the patent prosecution process, which allows experts to understand how the new widget works and whether it will eventually provide value in a business model.

Sometimes the best strategy is to show everyone that at least part of your firm is not just smoke and mirrors.

Monday, July 11, 2016

Scientists' perennial problem: The Brexit Effect

I've spent the last week or so thinking about the effect a Brexit would have on U.K. science, as have many other folks on the 'net. Having lived through the Canadian War on Science (2006-2015), I see lots of parallels between policies and strategies that try to cut out basic research and what scientists seem to be freaking out over with regard to Brexit (which, by the way, isn't a done deal after the key leaders bailed on the movement).

In my experience, three things related to changing science policy hit a nerve with researchers: funding cuts, restricted mobility of people, and general political instability. None of these are unique to the kind-of-hypothetical Brexit Effect; they're just brought to top of mind by it.

Funding cuts. There's no doubt that money and resources are the lifeblood of a research group - any group, for that matter - and whenever there's less money to go around it's an uncomfortable time, by definition.

There's a wonderful post at the London School of Economics' BrexitVote blog that goes into extensive detail about how UK science funding may change, specifically how whatever gains are won by leaving the EU would be overshadowed by the general loss of economic activity if the UK were to go through with the plan:
Importantly, even the more optimistic assessments of the UK’s economic performance following a Brexit ... model an immediate loss in GDP for the transition years following a Brexit. The size of that loss is substantially larger than the current net contribution of the UK to the EU budget. ...

Therefore the attempt to financially gain in the short term via a Brexit is akin to killing the goose that lays the golden egg. It is a sure-fire short term loss, wiping any free money for Research & Innovation investment until at least a decade down the line – according to the most optimistic scenarios. This strongly counters any claim that voting to leave the EU provides immediate funds for a shot in the arm of national science. The extra money simply will not be there for science as the UK economy is hit by huge transition costs. 
Other parts of the post go on to explain how Switzerland manages to get along with science just fine without being an EU member.

Now you might point out that the above scenario describes what would happen to overall research funding (similar to a US-style Sequestration effect), and ask: what about cuts to specific programs or areas of research? This, sadly, is sometimes needed. Just as in any other business, as the needs of the 'market' move on, science needs to adapt to ask new questions, use new technologies, and re-train people.

This process of creative destruction has been accepted by mainstream business people and has spawned many bestsellers (including the classics by Eric Topol and Rotman Professor Sarah Kaplan). It really should be one of those things that's accepted as a given by grant-funded researchers, not fought against.

Next post, Mobility of People.

Thursday, April 28, 2016

Is your business idea unscalable?

Jon Westenberg:
There’s a lot of pressure put on young entrepreneurs. It’s the pressure to only build scalable startups, rather than focusing their efforts on any other type of business. You can see this almost everywhere. If you’re not building high growth software or platforms, you’re seen as wasting your time.

It’s snobbish. And it’s dangerous. It relies on the idea that there are businesses that are inherently better than others. And they’re only the businesses that have room to become $100,000,000 companies.
Lots of business ideas aren't scalable, but they can still make a ton of money for the founders.

On Transgenic Zika-Proof Mosquitoes

Reinaldo Jose Lopes:
Even if [Oxitec's] transgenic mosquitoes can be proven to reduce dengue or Zika infections, it is possible that natural selection could reduce their effectiveness. Females could develop a preference for wild-type A. aegypti males — stopping the company's currently furthest-developed lineage of GM insects (called OX513A) from spreading in the wild.
At best this means a lot of money for Oxitec and little value for the payors. At worst, Zika virus adapts and the mosquitoes become useless.

Someone once said that "Life... finds a way".

Tuesday, April 26, 2016

Really, PubMed?: "Diverse biological effects of electromagnetic-treated water"

Really? Oh yes, if it's published it must be true:
The effects of water treated with an electromagnetic field (EMF) were investigated on two biological systems, humans and plants. Purified de-ionised water was treated by (1) boiling, (2) exposure to microwave radiation, and (3) low frequency electromagnetic oscillation molecular resonance effect technology (MRET), before being used to prepare media for culturing human peripheral blood mononuclear cells (PBMC) from three healthy females. Our results indicated that PBMC culture in MRET-activated medium showed significantly less oxidative metabolism when compared to media prepared from other types of water. As for the effects on soybean, our results indicated that both MRET- and microwave-treated water greatly enhanced the length of the root. These results suggested that electromagnetic-treated water can have diverse biological effects on both animal and plant cells. Since these effects are related to the ‘Memory of Water’ hypothesis, which has been suggested as an explanation of the action of high homeopathic dilutions, our findings warrant a further investigation on the mechanisms of various types of physically conditioned water on specific cellular activities.
Issues with this paper:
  1. No controls.
  2. Apparently no ethics review for using healthy human volunteers.
  3. A paywall. An Elsevier paywall.
  4. Because of the paywall, there's no way to know what the n for the soy plants is.
  5. Homeopathy.
  6. The "Memory of Water" hypothesis is based on two PubMed citations. Hey, at least it's more than one.

Gene editing in human embryos gains traction

Nature has a short update on where human gene editing research is going:
China's lead

Fan’s team began its experiments in early 2014 and originally submitted the paper to Cell Stem Cell, Fan says. By the time the manuscript ended up on the desk of David Albertini, editor-in-chief of the Journal of Assisted Reproduction and Genetics, a different Guangzhou-based team had become the first to report human-embryo-editing experiments. That paper, which tried to correct a mutation that causes a blood disease, fed into a firestorm over the ethics of modifying human reproductive cells (or ‘germline’ modification). Some researchers called for a moratorium even on proof-of-principle research in non-viable embryos. ...

Fan’s paper should help to reassure international observers about the legitimacy of human-embryo-editing research in China, says Robin Lovell-Badge, a developmental biologist at the Crick. More such embryo-editing papers are likely to be published, he adds. “I know that there are papers floating around in review,” he says.“I’d much rather everything was out in the open.” 
The public issue, in my mind, is that many of those opposing human embryo editing see the next logical step as a full-blown program to produce genetically engineered humans. I'm very skeptical that the science is going to go that far, that fast. To start, though CRISPR-Cas9 gene editing is pretty specific, it's known to have off-target effects, and those off-target regions depend on the site being edited.

Until all the consequences of editing a specific site (including unintended targets) are determined to be 'safe', human CRISPR experiments in embryos should remain very basic. First things first.

Friday, April 15, 2016

Our SiMSenSeq: Simple, Multiplexed, Sensitive, DNA Sequencing paper is out

The SiMSenSeq PCR method that I've been working on for about two years has just been published in Nucleic Acids Research.  Here's the abstract:
Detection of cell-free DNA in liquid biopsies offers great potential for use in non-invasive prenatal testing and as a cancer biomarker. Fetal and tumor DNA fractions however can be extremely low in these samples and ultra-sensitive methods are required for their detection. Here, we report an extremely simple and fast method for introduction of barcodes into DNA libraries made from 5 ng of DNA. Barcoded adapter primers are designed with an oligonucleotide hairpin structure to protect the molecular barcodes during the first rounds of polymerase chain reaction (PCR) and prevent them from participating in mis-priming events. Our approach enables high-level multiplexing and next-generation sequencing library construction with flexible library content. We show that uniform libraries of 1-, 5-, 13- and 31-plex can be generated. Utilizing the barcodes to generate consensus reads for each original DNA molecule reduces background sequencing noise and allows detection of variant alleles below 0.1% frequency in clonal cell line DNA and in cell-free plasma DNA. Thus, our approach bridges the gap between the highly sensitive but specific capabilities of digital PCR, which only allows a limited number of variants to be analyzed, with the broad target capability of next-generation sequencing which traditionally lacks the sensitivity to detect rare variants.
I'm currently packaging up the informatics pipeline used to analyze SiMSenSeq data, which will be up on GitHub pretty soon.
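In the meantime, here's a minimal sketch of the barcode-consensus idea that underlies the error suppression: group reads by their molecular barcode and take a per-position majority vote. It's illustrative only (the function name, family-size cutoff, and toy reads are all mine), not the actual pipeline that will go up on GitHub.

from collections import Counter, defaultdict

def consensus(reads, min_family_size=3):
    """Return one consensus sequence per barcode family with enough reads."""
    families = defaultdict(list)
    for barcode, seq in reads:               # reads: iterable of (barcode, sequence)
        families[barcode].append(seq)

    consensuses = {}
    for barcode, seqs in families.items():
        if len(seqs) < min_family_size:      # small families are unreliable; drop them
            continue
        # Majority vote at each position (assumes equal-length, aligned reads).
        consensuses[barcode] = "".join(
            Counter(col).most_common(1)[0][0] for col in zip(*seqs)
        )
    return consensuses

reads = [("AACGT", "ACGTACGT"), ("AACGT", "ACGTACGT"),
         ("AACGT", "ACCTACGT"),              # PCR/sequencing error removed by the vote
         ("GGTCA", "ACGTACGA"), ("GGTCA", "ACGTACGA"), ("GGTCA", "ACGTACGA")]
print(consensus(reads))
# {'AACGT': 'ACGTACGT', 'GGTCA': 'ACGTACGA'}

Because a true variant appears in every read of its family while polymerase and sequencer errors scatter across families, collapsing to consensus reads is what pushes the detection limit below 0.1%.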

Sunday, April 10, 2016

Working 24/7 is the cognitive equivalent of coming to work drunk

In an NPR interview about her new book, The Sleep Revolution, Arianna Huffington offered up this perfect quote on the importance of sleep and its relationship to the workplace:
We hear employees being congratulated for working 24/7, which now we know is the cognitive equivalent of coming to work drunk. But it's changing. We are now in this amazing transition period where more and more companies are beginning to realize that living like that and working like that has actually terrible consequences — not just on the health and productivity of their employees but also on their bottom line. 

Tuesday, April 5, 2016

Why AbbVie still has traction despite looming Humira patent expiry

Arthur Jeannerot, on Seeking Alpha:
In 2015, Humira (Adalimumab) represented 61% of AbbVie's total revenues, which could be seen as problematic since the composition of matter patent covering Humira expires in December 2016 in the U.S., and in October 2018 in the European Union. However, Humira is covered by more than 50 other patents on formulation, method of treatment, manufacturing and more. Those other patents are due to expire between 2022 and 2034, which should make it more difficult for competitors to come up with biosimilar versions of Humira.
Either way, AbbVie is going to start experiencing competition from other companies that are more than capable of producing Humira biosimilars - the technology to produce therapeutic antibodies is becoming more and more commonplace, and even large academic groups are jumping on the bandwagon. This means that, as far as technical complexity goes, making a biosimilar is within the capability of a talented PhD student.

What I think the market is over-estimating is the ability of competing biosimilars to carve out Humira's market share, probably on the assumption that customers will be able to substitute one antibody for another as easily as one proprietary molecule for a generic in the small molecule drug space.

Unfortunately, it's not that easy. Antibodies can have a ton of idiosyncratic activities; they're bigger and less well defined than small molecule drugs. In addition, there's going to be a great deal of brand-name inertia with Humira, as consumers stick with what works until a biosimilar proves that it's as good as the original - which will take some time. This likely means that AbbVie can ride out the storm and plan a strategy to protect this drug for a little while longer.

Move over cronut burgers; the Burgerizza has arrived

In Toronto, The Ex is known for the outrageous food created for its once-a-year run. Leave it to a baseball stadium to top cronut burgers, poutine balls and Krispy Kreme burgers with The Burgerizza:


Yuck. No thanks.

(Photo credit: National Post)

Friday, April 1, 2016

Is your work scalable?

Douglas Rushkoff posted this on LinkedIn:
Most of the technologies we're currently developing replace or obsolesce far more employment opportunities than they create. Those that don’t—technologies that require ongoing human maintenance or participation in order to work—are not supported by venture capital for precisely this reason. They are considered unscalable because they demand more paid human employees as the business grows.
Sometimes you need to stop and ask yourself: Is what I'm working on today scalable, or is it limited by some finite constraint, like highly skilled people or the number of hours in a day?

Tuesday, March 29, 2016

The Ontario health minister on overpaid physicians

From the National Post:
Health Minister Eric Hoskins complained Wednesday about the billing practices of some Ontario doctors, who he said were taking hundreds of millions of dollars away from home care and other services. “Unpredictable and frankly out of control billing by some doctors is a problem that creates huge income for some doctors, but it leaves less for family doctors,” said Hoskins.  

“It leaves less for our salaried doctors in community health centres, it squeezes our ability to invest more money in home care and community care, and it robs of us of the capacity to responsibly plan our health care spending each year.”
I never thought he'd have the guts to finally say that.

BC company offers free cancer tumour screening to 1,500 Canadians

From the CBC:
Free cancer tumour genetic screening? That's the unusual offer from Contextual Genomics, a private company in Vancouver.

Starting this month, the first 1,500 Canadians who get their oncologists to send in their tumour samples will have their cancer tested using the company's trademarked Find-It Cancer Hotspot Panel at no charge. After that, the test will be offered for sale for less than $1,000.

"You could call it marketing, but it's making this test available to people who haven't had access to it before," said Contextual Genomics CEO Chris Wagner. The idea is that there might be a drug out there that can target the particular cancer mutation. But that's if a drug exists, and if it's approved for use, and if the oncologist knows what to do.
This is progress. Why attempt to make an issue of a private company providing a new test? Most of Canadian health care is delivered by private companies anyways.

That said, Contextual Genomics is making an excellent strategic decision in offering its test for free. I'm actually surprised that they're limiting it to 1,500 people; the marginal cost of running these tests can be pretty low, especially for panels targeting highly specific mutations. I'm not so sure about the full cost of delivery, though.

After this phase of their National Access Program, my guess is that they'll eventually offer some kind of compassionate access program for those that can't pay ~$1,000 out of pocket.

Wednesday, March 16, 2016

How bioRxiv will make journals pay for publishing research papers

Preprint servers are the future of scientific publishing.

There's a detailed article on biological research and the controversies surrounding the bioRxiv preprint server up at The National Post:
After several dozen biologists vowed to rally around preprints at an “ASAPbio’’ meeting last month, [bioRxiv] has had a small surge, and not just from scientists whose august stature protects them from risk. On Twitter, preprint insurgents are celebrating one another’s postings and jockeying for revolutionary credibility.

For most of the history of organized scientific research, the limitations of technology made print journals the chief means of disseminating scientific results. But some #ASAPbio advocates argue that since the rise of the Internet, biologists have been abdicating their duty to the public — which pays for most academic research — by not sharing results as quickly and openly as possible.

Unlike physicists, for whom preprints became a default method of communicating discoveries in the 1990s, biomedical researchers typically wait more than six months to disseminate their work while they submit it — on an exclusive basis — to the most prestigious journal they think might accept it for publication. If, as is often the case, it is rejected, they try another journal. As a result, it can sometimes take years to publish a paper ... and because science is in many ways a relay, with one scientist building on the published work of another, the communication delays almost certainly slow scientific progress.
Yes, paper reviews are slow and definitely consume resources, but they are usually helpful. Later on in the article, there's a bit of fearmongering that preprints will be 'detrimental to science' and that the world will end, etc., but that's coming from editors who are incentivized to keep scientists paying for the review process.

What I'd really like to see in the near future is a model where scientists post their work to bioRxiv to stake claim to something new in their field, and where it's incumbent on journals to bid for the right to accept good papers into the review process. Why shouldn't labs be compensated after the fact for doing high quality research, in addition to being funded in advance as in the current system?

Several factors work in favor of this model:

  1. Scientists will still preprint research that's of good quality in an effort to receive journal bids. In fact, they'll still have an incentive to produce work that's as high quality as possible.
  2. Journals will still be in a position to offer a proper peer-review for preprints, and in some (most?) cases this means that the version of the paper published by the journal will be of better quality than the pre-print.
  3. Journals won't need staff to deal with inquiries from every researcher thinking that their paper is good enough to review. Instead, scouts will contact labs since they know what's out there.
  4. Excessive resources won't be spent on polishing little papers so that they're 'good enough' for submission to a journal. If someone has one or two interesting figures they can still publish it, get a DOI, and get on with their career.
  5. The corollary of #4 is that the need for a 'Journal of Negative Results' is eliminated. Just preprint the damn results and get a bit of credit for it.

I don't think preprints are the threat that some folks are imagining; on the contrary, they should improve the system at a time when the model of 20th century publishing seems to be broken or at best dysfunctional.

So if you have any other ideas, reach out on @CheckmateSci.

Monday, March 14, 2016

How to name your biotech startup properly

From STAT:
Alnylam Pharmaceuticals, named after a star in Orion’s belt. “It marks the strength of our vision, and gives our effort a clear association with something that’s up in the sky,” said Chief Executive John Maraganore.

However, the Cambridge company did what many in the industry do: they tweaked the spelling, from Alnilam (the star) to Alnylam, to help it stand out to investors, in Google searches, and in trademark filings.

The word Alnylam derives from Arabic — it means “string of pearls” — and in this, too, the company’s name is illustrative of a larger industry trend: Names drawn from words in Latin, Greek, or other foreign languages.

Avak Kahvejian, who helps build life science startups at Flagship Ventures, said “one of the tricks” he uses is to run English words associated with his companies’ technologies through an online translator. That’s how he named his most recent startup, Cambridge-based Rubius Therapeutics, which launched in December to engineer red blood cell-based drugs. (Rubius means “red” in Latin.)
The article also touches on a company working in the CD47 receptor space that decided to go a much less romantic naming route: Forty Seven Inc.

Thursday, March 10, 2016

Those fighting over Editas are in it for the long haul

When I first started following the promising CRISPR/Cas9 technology, after the chance of shRNA/RNAi technology becoming biotech's Holy Grail seemed to fizzle, I thought that having two or more research groups reporting CRISPR technology wasn't surprising; it was just another example of the well-known multiple discovery effect operating in science.

However, the CRISPR story sounds much more like a tale of politics and competition (money necessarily follows) instead of that of independent people fighting over first rights to the same bright idea.

Via TechCrunch:
Two of Editas' founders, UC Berkeley’s Jennifer Doudna and the Broad Institute’s (BI) Feng Zhang, are credited with pioneering CRISPR/Cas9, a gene-editing technology that has radically advanced the biotech industry. Editas uses this technology to develop therapies to treat humans at a genetic level.

Those with a genetically induced cancer would be able to receive treatment to snip out parts of the faulty gene sequence using this technology, for example.

Though Doudna is listed as one of the founders of the company, she left Editas two years ago to create the competing Caribou Biosciences in Berkeley, California. However, BI filed for the CRISPR patents for Zhang and was originally awarded the rights to them.
It'll be interesting to see how this unravels.

Thursday, February 4, 2016

Amgen publishes failures to reproduce research: So what are they really saying?

From Nature:
Right now, the main way that the scientific community spreads the word about irreproducible research is through innuendo, which is inefficient and unfair to the original researchers, says Ricardo Dolmetsch, global head of neuroscience at Novartis’s Institutes for Biomedical Research in Cambridge, Massachusetts. “Anything we can do to improve the ratio of signal to noise in the literature is very welcome,” he says.
Totally agree. Finding the right people to tell you that a particular R&D approach is irreproducible because of factors beyond your control is a difficult, time-consuming, and potentially expensive process. So why aren't there more negative results published?
Partly because negative results don't sell too well to a community of people interested in positive results. But I think a strong disincentive to publish these 'contra-papers' is a very personal one:
Academic researchers are unlikely to risk alienating their peers by publishing disconfirming results, predicts Elizabeth Iorns, head of Science Exchange in Palo Alto, California. 
So why is Amgen now taking this role?
The idea emerged from discussions at a meeting focused on improving scientific integrity, hosted by the US National Academy of Sciences in 2015. Sasha Kamb, who leads research discovery at Amgen, said that his company's scientists have in many instances tried and failed to reproduce academic studies, but that it takes too much time and effort to publish these accounts through conventional peer-review procedures.
I bet somewhere, somehow, someone has run the cost-benefit analysis between spending 'many instances' of resources chasing scientific geese and the cost of drawing metaphorical borders around impractical research findings (I won't say 'bad', because they may actually be useful in a non-industrial setting).

The balance of the analysis was probably strong enough to make Amgen send the following message, in a very nice way: "In the future, if you don't think we can reproduce your research, don't waste our time with it".

Monday, February 1, 2016

Headwinds ahead for Canadian Startup Visa Program

From VisaReporter.com:
Under the program, investors who seek to start a fresh business get Canadian PR. It aims to lure innovative businesspeople and connect them with Canada's private-sector businesses via government-approved angel investor groups, business incubators, or venture capital funds, which would act as facilitators for establishing the startup business in Canada.
I just finished reading Why Mexicans Don't Drink Molson, by Andrea Mandel-Campbell, which left me with the impression that this program is also going to need to pair these new businesspeople with Canadian businesses that have the right mindset.

Sunday, January 31, 2016

Top 10 Healthtech advances to watch in 2016

Writing for Healthcare IT News, Jessica Davis describes the following 10 advances that may define 2016. The original article goes into each in some detail:

  1. Mobile stroke units,
  2. Medical device cybersecurity, 
  3. Wireless wearable sensors,
  4. Miniature leadless pacemakers, 
  5. Blue-violet LED light fixtures,
  6. New high-cost cardiovascular drugs,
  7. Changing landscape of robotic surgery,
  8. Spectral computed tomography,
  9. Injected bioabsorbable hydrogels, 
  10. Warm donor organ perfusion systems.

What, no genomics?!  :-)

Friday, January 29, 2016

Internet connected inhalers: Technology to watch

From Reuters:
Novartis wants every puff of its emphysema drug Onbrez to go into the cloud.
The Swiss drugmaker has teamed up with U.S. technology firm Qualcomm to develop an internet-connected inhaler that can send information about how often it is used to remote computer servers known as the cloud.

This kind of new medical technology is designed to allow patients to keep track of their drug usage on their smartphones or tablets and for their doctors to instantly access the data over the web to monitor their condition.

It also creates a host of "Big Data" opportunities for the companies involved - with huge amounts of information about a medical condition and the efficacy of a drug or device being wirelessly transmitted to a database from potentially thousands, even millions, of patients.
This technology has amazing potential. If you have an idea regarding how much this device would cost, send me a message on Twitter.

Presumably a pricier inhaler wouldn't be disposable like the current plastic devices; it would be a reusable device that simply accepts replacement Onbrez cartridges as new prescriptions are filled.
In this case, the inhaler's cost becomes less relevant, as it's amortized over the life of the patient's disease (long) rather than the life of a single prescription fill (short).
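A quick back-of-the-envelope sketch makes the point; the device price, disease duration, and fill length below are assumptions I've made up, not Novartis or Qualcomm figures.

device_cost = 150                  # hypothetical one-time cost of the connected inhaler ($)
disease_duration_days = 10 * 365   # emphysema is chronic; assume 10 years of use
prescription_days = 30             # a single prescription fill

per_day_over_disease = device_cost / disease_duration_days
per_day_over_one_fill = device_cost / prescription_days

print(f"{per_day_over_disease:.2f} $/day")   # ~0.04 $/day amortized over the disease
print(f"{per_day_over_one_fill:.2f} $/day")  # 5.00 $/day if charged against one fill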

Since it's internet connected, presumably it would be easy to add features like a reminder at the next dose (i.e. sound or LED), automatic prescription refills, etc.

All in all, very nice!

Tuesday, January 19, 2016

There may be a link between immunosuppressants and cancer

The Toronto Star reports that some transplant recipients have a three times higher risk of dying from cancer, relative to the general population:
The increased cancer mortality may be due to the immunosuppressant drugs that allow the patient not to reject the organ. The suppressed immune system may not be able to fight the cancer from developing and may allow the malignancy to be more aggressive, says [Nancy] Baxter, a senior scientist at ICES. Transplant recipients are often on the drugs for life. 

Once cancer develops, a transplant recipient may receive less aggressive treatment because of other existing health problems and the fear of possible transplant rejection, according to the study.
A follow-up commentary to the study notes that while the role of wholesale cancer screening for the general population is being re-thought, this study argues that screening solid-organ transplant recipients might be worthwhile.

Whether these people are high enough risk to justify the cost of providing screening services is a question left for health economists to answer.

The original study can be found here.

Thursday, January 7, 2016

Personalized Medicine has Two Sides

Andre Picard, in the National Post, writes:
The cost of sequencing is falling rapidly. It cost in excess of $3-billion (U.S.) to decode the first human genome, but now the $1,000 test is imminent, putting the technology within the reach of many. Practically, this means we are moving to an era in which medical treatments, and drugs in particular, are tailored to individuals based on their genetic makeup.

These advances, however, bring with them a host of ethical and economic challenges – in part, whether the new technologies and the benefits that flow from them, will be available equitably, to those most in need and not just those who can afford them.
One of the big potential benefits of personalized medicine is not giving expensive treatments to those who can afford them if the treatments won't work because of their (or their tumor's) genetics. For example, a negative result on a $250 ALK mutation test rules out a $90,000/year run of Xalkori, an ALK inhibitor.
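A rough expected-cost sketch shows why: using the figures above plus an assumed ALK-positive rate of roughly 4% (a commonly cited ballpark for non-small-cell lung cancer, treated here purely as an assumption), testing first avoids most of the futile drug spend.

test_cost = 250               # ALK test ($)
drug_cost_per_year = 90_000   # Xalkori (crizotinib) annual cost ($)
alk_positive_rate = 0.04      # assumed prevalence among tested patients

treat_all = drug_cost_per_year                                   # give the drug to everyone
test_then_treat = test_cost + alk_positive_rate * drug_cost_per_year

print(treat_all)        # 90000
print(test_then_treat)  # 3850.0 expected per patient tested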

Wednesday, January 6, 2016

Maternal kissing study is clearly a hoax

There's an entertaining hoax article in the Journal of Evaluation in Clinical Practice, “Maternal Kisses Are Not Effective in Alleviating Minor Childhood Injuries (Boo-Boos): A Randomized, Controlled, and Blinded Study,” authored by the Study of Maternal and Child Kissing (SMACK) Working Group.

Their conclusions clearly find that maternal kissing has no effect on making children feel better, suggesting that the practice is probably a waste of everyone's time.
Maternal kissing of boo-boos confers no benefit on children with minor traumatic injuries compared to both no intervention and sham kissing. In fact, children in the maternal kissing group were significantly more distressed at 5 minutes than were children in the no intervention group. The practice of maternal kissing of boo-boos is not supported by the evidence and we recommend a moratorium on the practice.
Not only is kissing boo-boos not beneficial, it could actually be harmful! So what should mothers do instead?
Some would likely argue that, given that maternal kisses did not clearly harm children, the practice is innocuous. ... [Since] maternal resources are very limited, and time spent on delivering ineffective kisses to boo-boos means that maternal attention is not devoted to other activities that have clearly been shown to be beneficial to toddlers. ... Most importantly, reliance on ineffective therapies may delay or prevent the delivery of proven and appropriate medical care, such as Bac-Be-Gone® antibacterial ointment and Steri-Aids® self-adhesive bandages.
If you weren't convinced that the paper is a joke, you have to read the fine print. Always read the conflicts or funding sections of papers.
Acknowledgement
Funding provided by The Initiative to Improve Childhood Health.1
Footnote
1 A fully owned subsidiary of Proctor and Johnson, Inc., manufacturers of Bac-Be-Gone ointment and Steri-Aids self-adhesive bandages.
Right. Not only is there a clear conflict of interest, the conflict is due to funding from a company that's obviously completely fake!

Well played!

Monday, January 4, 2016

10 Things Scientists Should Do to Start 2016

  1. Don't start doing research right away. Think.
  2. Figure out which 2015 projects are past their prime and finish/publish/kill them. Quickly.
  3. Identify projects worth developing/growing in 2016 and make a plan to do so.
  4. Think about the best way your work creates value. Are you working towards getting grants, making intellectual property, or within some business model? Focus on whatever fits.
  5. Spend some time reconnecting with co-workers that you've lost touch with, because you were so busy in 2015.
  6. Find a few good review articles of interest to you and read them.  It will help with #3.
  7. Recycle that pile of papers you printed pre-2015 and never got around to reading. Admit it, this pile exists.
  8. Identify a few good bloggers/tweeters and follow them. They often turn up hard to notice papers or science news.
  9. Develop a plan to improve your communication and/or presentation skills. Focus on the style used in good writing, TED talks, etc. If you think you're already good here, there's always room for improvement.
  10. Think about what things were a poor use of your time in 2015 and find a way to stop doing them. Usually this means delegation or outsourcing.

Why academia has a data sharing problem

Martin Bobrow, Chair of the Wellcome Trust's advisory group on data access, submitted an enlightening summary of data sharing problems in Nature, where he asked:
Most research-funding agencies, and most scientists, now agree that research data should be shared — provided that those who donate their data and samples are protected. This approach is strongly advocated by organizations such as the Global Alliance for Genomics and Health. But data sharing will work well only when it is streamlined, efficient and fair. How can more scientists be encouraged and helped to make their data available, without adding an undue administrative burden?
I think the burden he's addressing is actually split into at least two parts:

1. The burden of actually sharing data. This is what usually comes to mind when people think of data sharing being difficult, and it involves hammering out infrastructure and data formats to enable sharing.

2. The burden created by actually making data available. Being the 'owner' of data brings both the opportunity for first crack at investigating that data and the responsibility to share it. There's a real cost that sharing imposes, both in serving the people who want access to the data and in storing it (though both costs are continually falling).

Thinking realistically, there's an actual disincentive to share academically generated data. Sharing data essentially gives potential competitors 'your data' at no cost, which may vaporize whatever competitive scientific advantage you may have gained.

Further on in the article, Bobrow offers this explanation:
It is reasonable for scientists to impose certain conditions or restrictions on the use of their hard-earned data sets, but these should be proportionate and kept to a minimum. Justifiable conditions can range from requiring secondary users to acknowledge the source of the data in publications, to stipulating a fair embargo time on the use of new data releases. Whatever the conditions imposed, they need to be presented clearly to data users.

Criteria used to judge academic careers still focus heavily on individual publication records and provide little incentive for wider data sharing. Scientists who let others use their data deserve reward too.
So yes, the issue with academic data sharing is incentive.

People who put together well-designed data sets should be rewarded for their expertise and talents in doing so. Good data isn't as simple as sending a box of samples to a [insert your favourite high-throughput technology] production center; it requires knowing what constitutes 'normal' samples and sound experimental design, not to mention actually handling the logistics of obtaining the right samples in the first place.

Why wouldn't someone deserve credit for that?

Sunday, January 3, 2016

The Bias against Downgrading: Why too many PhDs graduate

Last month, my friend and fellow blogger David Kent posted a few ideas about restructuring PhD programs at The Black Hole, highlighting three main points:
  1. Collect and provide data on PhD outcomes,
  2. Modernize the PhD degree, and
  3. Cut the number of PhDs.
Though I tend to agree with David's assessments of points #1 and #2 in his article, I'm going to take the opposite position on cutting the number of PhD positions. Here's what he said about #3:
I often struggle with this one (and maybe I’m part of the problem for this reason), but to me it seems that as long as we have big unanswered questions in medicine, biotechnology, etc., we need people to educate themselves in the life sciences. Should they all become academics? God no. Should they all move into life sciences related industry positions – again, no. But should they acquire skills and knowledge to critically assess these areas – absolutely. ...

Cutting PhD numbers by making stiffer entrance requirements is a reasonable thought, but as pointed out in the article, these requirements will be difficult to establish. I shudder at the thought of having medical school style requirements for PhDs since this will almost certainly serve to cut off those who cannot “work the system” in the same way as others in more fortunate positions. 
I'd argue that the current system already favours people who can work the system (and social access is a problem recognized in the original Nature article The Black Hole's post is based on). Entrance requirements to PhD programs are already competitive and tend to put a decent weighting on marks, requiring a B+ average at a minimum. However, good marks accrue to undergraduates who can spend more time studying, which coincidentally includes those spending less time working to pay tuition bills. Insert your favourite socioeconomic argument here.

Another way that graduate school candidates can look like prime candidates is by working summers in laboratories, partly to gain "lab experience" and partly to get decent reference letters. I don't mean to make research labs seem unique in this arena, as experience and references are valued in almost any industry I know of. Again, not working for pay in those precious undergraduate summers helps to make research time available.

That said, working the admissions system isn't what I'm concerned with.

Both articles look at the flow of PhD entrants into programs and don't examine how to reduce the number of PhDs exiting training programs. Here's what I think is the major reason that, once students are in a PhD program, they're committed:

Reclassifying from a PhD to an MSc program isn't seen often enough.

At least, I'm personally only aware of two people that did so; one to accept a job and another to enter medical school. But I digress.

I think one main reason reclassifying this way (i.e. PhD-to-MSc) is discouraged is that every PhD student goes through some period of self-doubt where Impostor Syndrome runs wild, so with the best of intentions people try to help the poor PhD student through this difficult time. Grad school isn't a game of Texas Hold'em (okay, sometimes it is), and I don't think anyone wants people to fold their hand on a degree because of a period of stress.

On the other hand, there are several possible behavioural and organizational reasons why the "Too many PhDs" problem exists:
  1. Students don't want to be perceived as failures. I've sometimes heard that failing PhD students can be "encouraged" to graduate with a Masters (I've never seen this myself), so PhD students are reluctant to fold their cards and take the MSc. Similarly, some people may have the perception that by not reclassifying to a PhD, an MSc student is admitting that they can't cut it.
  2. Scientists don't want to be perceived as unable to train PhDs. Here's a more powerful incentive to keep that PhD student around for 6, 7, or 8 years. It may really be that finishing the PhD thesis isn't motivating the student anymore and it's best that they capitalize on their experience with an MSc, but the question would remain: "Why couldn't Professor So-and-so help that student finish their PhD?" There might also be a culture of only graduating PhDs in that department, faculty or university.
  3. There's an incentive to have PhD students over MSc students. Disregarding differences in individual talents, more complex projects are performed by people with more experience (presumably within the same lab). Therefore lab heads don't want to let employees (ahem) students leave after two years of experience; they'd rather have productive people around for a few more years.
  4. Finally: No one wants to 'downgrade' from a PhD to an MSc. This problem is caused by academic jargon. The two degrees are for different people, different purposes, and different career paths. You could probably find many talented people for whom reclassifying up to a PhD was, in retrospect, a 'downgrade' in terms of their true career potential. Eliminate 'downgrading' from the lexicon, now.
If you have more ideas about what influences the too many PhDs problem, tweet them to @CheckmateSci and if they're good, I'll update this post (with credit, of course).

A Potential Solution: Replace 'Masters' and 'Doctoral' student classifications and call everyone a graduate student.


In doing so, the hierarchy between the two streams is eliminated and the whole problem with 'upgrading' and 'downgrading' becomes moot.

I would like to see a system where the only point of differentiation between graduate students is when they petition for graduation. For instance, such a system could be as simple as an achievement checklist at an exit interview:
  • Completed your posters? Check.  
  • Done your comprehensive exam? Check. 
  • Do you fulfil all the requirements for a Masters?  Check.
  • Published some number/quality of papers needed for a PhD? No.
In this case, why wouldn't you award this person a Masters and let them reach for whatever they had on their mind? No 'downgrading' required. They're not a failure; they just decided to follow another career opportunity and didn't feel working in academia for a few more years was going to be worth it.

They gained scientific knowledge and (hopefully) contributed to some important research direction. They learned to think about problems in a particular way, to follow the scientific method, and now they're going to apply it to something other than basic science.

And besides, I think that's what many research funders expect in return for their dollars.