Monday, December 8, 2014

Big Changes: Getting an MBA while Postdocing

It's been a long while since I last posted here, but there's a reason for it: I've been extremely preoccupied with what's become a huge change in my life: pursuing an MBA while doing a postdoc.  I've already begun courses at the University of Toronto's Rotman School of Management, and I have to say, so far so good.

First of all, I'd like to say that most people have been extremely, unwaveringly supportive of my decision to pursue this goal.  First and foremost there's obviously my spouse and family, but also my current PI, Lincoln Stein, as well as a number of individuals and colleagues at the OICR.  I don't think I'd have been able to consider making the time for it in most other academic research environments. 

It's difficult, but it's not madness, as suggested a few years ago at New Scientist in the context of a combined PhD/MBA.  Personally, I think it would have been easier to balance workloads in a combined PhD/MBA program, but then again, I don't think I would have had the appreciation for what business school offers without the work experience I gained as a postdoc.

As I go through this long experience (and it is long -- 32 months -- as I'm taking the Morning MBA program at Rotman), I'll probably write several posts that touch on some or all of the following key points.  If you're considering combining business school with research (at the post-PhD level), you might think about challenges that fall under any of the following points:

  1. Time management.  This, by far, is the biggest challenge for me.  After putting in on average 45-50 hours a week into postdoctoral research, allocating the ~10-20 extra hours (it varies) for MBA classes, assignments, reading, and team work is a challenge.  And on top of that there's family life to keep sane.  The first things that went were hobbies (reduced to watching videos on Fine Woodworking), recreation (reduced to reading), and hanging out with friends (sorry, guys).  All of those gave way to sleep, which is kind of important. 
  2. New, complementary subjects to study.  From what I can see, the biggest benefit of an MBA for science types is that the topics are mostly new.  I'm not aware of any research programs that encourage students to take courses on accounting, economics, or negotiation, but these kinds of courses at Rotman were immensely interesting to me because they're complementary to almost everything I had studied to date.  Studying something out of your element enhances the breadth of your skillset.
  3. The price.  Business school isn't cheap.  I don't think people go to business school unless they really want to build their career in that direction.  It's already expensive for most people, but it's especially pricey for people coming out of academic jobs, with MBA tuition running about 3-4 times the annual salary of a PhD student and about twice the salary of a postdoc.  
  4. Your classmates.  This is one of the most interesting points of my experience.  There are a handful of 'scientists' (with MScs and PhDs) at Rotman, but many of the people I study with are in financial services, engineering, marketing, and mining.  You learn a lot about how things are done outside of research from these people, and most of the time things are done differently.
Those are the four big areas that I've been thinking about, so if you're a science geek and are thinking of going the MBA route, by all means reach out to me and I'll try to put together a post to address your questions.

Wednesday, October 15, 2014

Going With Your Gut vs

Greg Satell, at Forbes, argues that a mix of computer driven predictions and human intuition will drive the future of marketing:
Yet as powerful as they have become, computers are not all powerful, they perform much better when guided by humans.  For example, in a freestyle chess tournament combining both humans and machines, the winner was not a chess master or a supercomputer, but two amateurs running three simple programs in parallel.
And that gives us a clue to where marketing is going.
With algorithm-driven decision systems like IBM's Watson starting to guide medical decisions, I don't think it'll be long before research questions are computer-guided as well.  People will still need to wade through potential ideas, but arguments that research and development are a purely human-driven enterprise don't seem likely to hang around much longer.

What's wrong with using technology to help you fulfill a job?

Thursday, September 18, 2014

Academic Conference Networking Tips

There's a nice article on networking at scientific conferences over at Cheeky Scientist. The best point of advice, which I unfortunately learned the hard way:
1. Skip the scientific talks.
You love science. I get it. Science is why we all went to graduate school. But you shouldn’t go to a conference to learn the science. Not if you want to get an industry job. ... Everything in the talk is either published or in an abstract in the conference booklet. Plus, you can always seek out the conference speakers (or their posters) later.
Point taken.  If you're watching presentations, you're not meeting anyone new.  Conferences are not about taking supercharged doses of PowerPoint slides over three days; conferences are about conferring with people.

As I found out through experience, my best contacts were always made when I walked out of talks that didn't interest me or were just plain boring and tried to find people I wanted to talk to.  If you happen to run into someone walking out of the same talk, you at least have something in common to start a conversation with.

Skipping conference talks brings me to a digression about how departments dole out travel funds for students.  Some places require students to return to headquarters and give a 'conference presentation', usually intended to inform people back home of interesting news from the conference.

If this applies to you, try to balance your news-gathering efforts with networking efforts.  You're not obligated to attend every single talk, and if you come back and bring people up to speed with 'what was hot' at the conference, you've probably done your job.

Back After A Long Hiatus

It's been a long while since I last posted, and there's a good reason why.  I'm currently putting together a post to describe the additional project that I've taken up, which required a lot of time away from blogging to tie up loose ends and prepare, not to mention take a decent vacation beforehand.

So in short, I expect to be contributing posts more regularly going forward.   The easiest ways to follow for new content are still @CheckmateSci and via RSS.


Tuesday, July 8, 2014

Five Tips on Doing Business in Silicon Valley. Actually, Five Tips on Doing Business Anywhere.

The folks at MaRS just released this little video highlighting five tips for doing business in Silicon Valley.  The advice is applicable anywhere.
1. To succeed, first understand the area’s history.
Whenever you're working with people outside of your area, be it geographical or outside your area of expertise, you need to be able to relate to where they came from.  How do their values differ from yours?  What is important to them?  Is there something about that location or field that attracts a certain type of person, or encourages a particular kind of behaviour (think entrepreneurship, research excellence, etc.)?
2. Spend your time there legally and intelligently.
Plan ahead to get the biggest return on your time investment.
3. Be open to collaboration.
Share ideas with your potential partners.  Help them develop their ideas and they will help do the same for yours. 

I've written before about how operating in stealth mode stifles research projects and exposes scientists to several traps.  Collaboration takes effort, but can pay off in spades when you find good partners, especially in high risk, pre-commercial (i.e. basic) research.
4. Steer clear of the myths about Silicon Valley.
Not sure I fully agree with this one.

Myths exist about every place and every institution.  However, there are bad myths and good myths. 

Bad ones will usually serve to drive you to inaction.  They're the ones about cutthroat competition, backstabbing, politics, and favoritism.

Good myths, on the contrary, will encourage you to make connections and build on your ideas.  The good myths may turn out to be false, but at least they've led you to break that inertia of doing nothing.
5. Recognize that San Francisco is not Silicon Valley.
Aron Solomon's point is that they may be a 45-minute car ride apart, but they are not the same kind of place.  The same is true of the many organizations that may exist in a technology cluster, even if they're within a 45-minute walk of each other. 

Universities are different from research institutes, and independent research institutes are different from those associated with hospitals. 

A Big Cash Prize is a Great Motivator

A business plan competition for 'young' (under 36) scientists by Oxford Biotech Roundtable and GSK figures out how to motivate scientists to come out to bat:
Our fundamental challenge was to generate enthusiasm for a biotech business plan competition and get people excited about entrepreneurship in a sector and region not known for its risk-taking culture. But we also knew that the caliber of researchers and students we sought to engage would need an attractive value proposition to incentivize them to invest their time and energy. In this respect, the grand prize (£100,000 or about $180,000) provided an attractive reason for entrants to engage with the competition rather than pursue more established career trajectories.
The rest of the article includes many other bits of useful knowledge, like the main obstacles young researchers face when considering entrepreneurship (think weak networks and poor mentors), but the importance of setting the value of a prize, grant, or fellowship is clear: if you want quality applicants, the chance of winning must be worthwhile.

Thursday, July 3, 2014

More Data Doesn't Mean More Interesting Data

David Beer, at Adaptive Computing, writes:
One of the keys to winning at Big Data will be ignoring the noise. As the amount of data increases exponentially, the amount of interesting data doesn’t.
He describes the problem of predicting what online video a user is going to watch next, and how an analysis can quickly run the number of predictions up into thousands of possible 'next steps' to evaluate.
These are then compared with all of the other empirical data from all other customers to determine the likelihood that you might also want to watch the sequel, other work by the director, other work from the stars in the movie, things from the same genre, etc. As I perform these calculations, how much data should be ignored? How many people aren’t using the multiple user profiles and therefore don’t represent what one person’s interests might be? How many data points aren’t related to other data points and therefore shouldn’t be evaluated as a valid permutation the same as another point?
These points are probably where an experienced scientist can provide the biggest value to data problems of this scale.  This kind of person has at least several years of work experience in a hypothesis-driven research environment and is able to solve problems using incomplete data.  They probably have a PhD to go with that quantitative experience.

The first point, working in a hypothesis-driven environment, means that person should be able to devise a strategy to prove or disprove a hypothesis (I hypothesize that this customer will watch video Y after video X), and figure out how to do that efficiently without getting stuck in the weeds, or the irrelevant data Beer describes.  Unfortunately, it does take some skill to interview a person before you can determine that they can actually do this, especially if there are differences in background between you and the interviewee.

The second point, being able to use incomplete data, is something that seems to come from experience.  Most people trained in research fields start off trying to collect the most data possible, and don't make a decision until 'more data is collected'.  It's easy to get stuck in a data collection rut, but eventually most people realize that it's actually OK to come to a conclusion before seeing the whole picture.

Collecting a lot of extra data costs time and resources, and puts a demand on your attention span until that elusive point of having 'enough data' is reached.  Sometimes that data is worth it, but many times it's not.  It just sits there because no one has time to do anything with it, so the data remains idle and risks becoming stale.  Unless it's actually your job to do so, be careful of making data for the sake of making data.
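To make Beer's 'ignore the noise' idea concrete, here's a toy sketch in Python (the data model is entirely hypothetical: a pooled list of observed video-to-video transitions across users) that discards the long tail of rarely seen transitions before ranking candidates:

```python
from collections import Counter

def next_video_candidates(watch_pairs, current, min_support=5, top_k=3):
    # watch_pairs: list of (video_a, video_b) transitions pooled across users.
    # Count how often each video followed the current one...
    counts = Counter(b for a, b in watch_pairs if a == current)
    # ...ignore the noise: transitions seen fewer than min_support times...
    supported = {v: n for v, n in counts.items() if n >= min_support}
    # ...and rank what's left, keeping only the top few candidates.
    return [v for v, _ in Counter(supported).most_common(top_k)]
```

The point of the sketch is the middle line: most of the thousands of possible 'next steps' never accumulate enough evidence to be worth evaluating, and a simple support threshold throws them away before any expensive comparison happens.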
ASIDE: One of the neatest things I find about the customer analytics field (as compared with genomics or computational biology) is that the data is basically generated by the study population itself, essentially for free.

Tuesday, July 1, 2014

Snowflakes Visualize Wind Turbine Effects on Airflow

Oh yeah, by the way: 'Here we use snowflakes from a winter snowstorm as flow tracers to obtain velocity fields downwind of a 2.5-MW wind turbine', say the authors of a really neat article at Nature Communications.

Checkmate Scientist is Closing Comments

Regular readers (there are about 200) may be sad to find out that I'll be closing comments going forward.

I've become much busier over the last six months (as you may have noticed by the decrease in posts) and I've unfortunately been moderating an increasing number of comments that are clearly from spammers.  I'd rather spend time reading and writing than deleting spam, and I think you'll agree.

As always, you can send comments via Twitter to @pmkrzyzanowski and I'll do my best to respond.

Thursday, June 26, 2014

Ties in Science

First to market.

First to the finish line.

First to know.

First to file.

First to climb Mount Everest.

First country to land someone on the moon.

Human achievement is defined by one group out-competing another.  When the release of multiple research papers is coordinated, it may look like a tie but in reality one of two things has happened: Journal editors synchronized the release of papers to create a bigger impact, or two research groups shared enough information to synchronize their submissions.

If it's driven by editors, one of the groups is still first to submit.  In comparison, if it's driven by the groups, they've acted as one larger collective that's first to publish over all their competitors.

A long time ago, Andrew Carnegie quipped that "The first man gets the oyster, the second man gets the shell". 

There are no ties.

Wednesday, June 25, 2014

Hey TTC, we need Tax Credits, not Low-Income Transit Fares

Tess Kalinowski, at The Toronto Star, writes that the Toronto Transit Commission is considering implementing special fares for low-income riders:
The issue of income-based fares has been raised at the TTC and other city departments individually. Now, however, a report before Toronto’s executive committee July 2 recommends that staff from social development, the TTC, public health, planning and others develop joint guidelines for affordable fares. The policy would come back to council in early 2015.
The article later points out that six dollars per day in transit fares is a lot of money for lower-income people, a group that includes not just the unemployed but students as well.

Instead of rolling out a special Low-Income Metropass (I'm not holding my breath for the rollout of Presto just yet) and creating yet another class of fares, the TTC should work with the Ontario government to provide refundable tax credits on a single class of transit passes.

That's right, get rid of Post-Secondary and Senior Metropasses.  One class of TTC Metropass would probably simplify the TTC's operations to a small extent.

The best feature of this scheme is that since most post-secondary students are low income, and arguably some seniors are also low income, everyone the existing policies are intended to cover is still covered.  It would basically work like the Federal Transit Credit, except that if you earn less than $20k per year, you get 50% of your transit costs refunded, as an example.
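The scheme above is simple enough to sketch in a few lines of Python. All of the numbers are illustrative: the 50% refundable rate below $20k comes from the example above, and the 15% base rate is a hypothetical stand-in for the regular Federal Transit Credit rate:

```python
def transit_refund(annual_income, transit_spend):
    # Illustrative policy only: a 50% refundable credit on transit spending
    # for incomes below $20,000, and a hypothetical 15% base rate otherwise.
    rate = 0.50 if annual_income < 20_000 else 0.15
    return rate * transit_spend

# A student earning $15k who spends $1,440 a year on Metropasses
# would get $720 back at tax time; a $60k earner would get $216.
```

Because the means test runs through the income tax system, the fare gate never needs to know who is low income, which is exactly the privacy point below.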

As an aside, note that I said 'some seniors' as it's not really fair for me to be subsidizing people with pensions that exceed my income.  The same with the rare students getting by on dividends from inherited stock market investments.  Hey, it's not just me saying so: “It’s not necessarily fair to ask other customers to pay more [to subsidize low income fares],” says TTC chief customer officer Chris Upfold.

Running this scheme through the income tax system keeps everyone's income information more or less private.  It's hard enough for people to live on a low-income, and giving them a special card to identify them as such isn't really the hand up that they need.

Thursday, June 5, 2014

Learning to Start Businesses, in the Ivory Tower?

Over at Entrepreneur, Isaiah Hankel wrote what is, overall, just another article criticizing academic culture.  But if you read it with an open mind, you'll find a paragraph that's probably the most optimistic description I've ever read of what a research community can pull off, especially since Hankel is describing the academic community:
The ivory tower shouldn’t be perceived as a safe haven or a place for professionals to bide their time when the economy goes south. Rather it should be considered the best place to learn how to start and run a business [emphasis mine]. Academia’s sole purpose should be developing people not to just be professors, doctors and lawyers but ones who innovate and invent products and services.
Agreed, 100%.

Thursday, May 15, 2014

Yes, Science Can Save Government Money

Mark Strauss, at io9, writes this about the abolished U.S. Office of Technology Assessment:
Last week, Rep. Rush Holt (D-NJ) tried to reopen the agency with minimal funding.

He failed.

The legislation—a proposed amendment to the Legislative Branch Appropriations Act that would have provided $2.5 million for the Office of Technology Assessment (OTA) —was defeated in the House by a 248-164 vote, with 217 Republicans opposing and 155 Democrats supporting.
It's amazing that a government can't spare $2.5 million for objective science advice.  The NIH alone spends a thousand times that amount each year.  In the absence of a strong pro-science lobby group, similar to what Franklin's List is trying to become, an OTA is a much-needed government service.

Maybe the OTA was disbanded because it didn't offer good value for money.  If only the benefits of something as intangible as science could be accounted for:
OTA had always saved taxpayers far more money than it cost. An OTA study on Agent Orange, for instance, helped save the government $10 million. Another report recommended changes in computer systems at the Social Security Administration that saved more than $350 million.
OK, maybe they can.  All governments should take notice.

The rationale for dissolving an office like the OTA was even more perplexing when you consider how Newt Gingrich suggested he could compensate for the lack of one:
"Gingrich's view was always, 'I'll set up one-on-one interactions between members of Congress and key members of the scientific community,'" recalls Bob Palmer, former Democratic staff director of the House Committee on Science. "Which I thought was completely bizarre. I mean, who comes up with these people, and who decides they're experts, and what member of Congress really wants to do that?"
That's a lot of running around chasing busy members of the scientific community.  I'd say that would cost more than $2.5 million worth of politicians' time alone, never mind the additional burden and cost of all the meetings stemming from that.  If you don't pay for an office like the OTA, the costs are just spread around like a giant game of hide the umbrella. Either way, taxpayers pay.

Similar efforts to create a Parliamentary Science Officer in Canada have been set in motion by Kennedy Stewart, the MP for Burnaby-Douglas in BC.  Bill C-558 is the one to keep an eye on.

Monday, May 5, 2014

Paper Pusher, PhD

GEN reports on a recent report from the National Science Board (NSB) in the United States:
"Regulation and oversight of research are needed to ensure accountability, transparency, and safety," said Arthur Bienenstock, chair of the NSB task force that examined the issue. "But excessive and ineffective requirements take scientists away from the bench unnecessarily and divert taxpayer dollars from research to superfluous grant administration."
The rest of GEN's article can be found here and here's the original NSB report.  Here's how big the problem is, as summarized by the NSB:
A 2005 Federal Demonstration Partnership (FDP) survey of investigators found that PIs of federally sponsored research projects spend, on average, 42 percent of their time on associated administrative tasks. Seven years later, and despite collective Federal reform efforts, a 2012 FDP survey found the average remained at 42 percent.
That's right: 42%.  If you're thinking of going the research route, on average you'll put in two hours of administration for every three hours of actual research.  This depends on what you actually consider 'administration', but I'd say 2 out of 5 hours is a good estimate: you might spend less if you have a small lab or are a postdoc, more if you run a big lab or have a fragmented mix of funding greasing the wheels.

Scientists' reactions are mixed.  On one extreme, you'll find scientists who claim that anything non-research related is a waste of their time. I don't take this position; reporting and auditing are necessary to ensure that the buyer of research is getting a good return on their research dollars.  There are many questions that reporting documents should answer.

Some are obvious: Is the project making effective use of funding, and is time and money being spent on things that contribute to the research goals?  Are funds being spent efficiently, or can the researcher be helped to spend better?

Some are more subtle: Are the right students and postdocs being hired to do the work, and are they actually being trained to do something the funder values?  Is the project fulfilling the goals set out by the sponsor, or has the scope drifted?

But regardless of the purpose, the problem with 'excessive' paperwork isn't that it's utterly pointless; it's that it detracts from the value scientists are best able to produce.  Scientists, in general, want to discover things, build things, create things, and write about all that experience.  Administrative tasks don't generally fall under what scientists consider cool, and if those tasks truly enter 'excessive' territory, you take away the cool part of the job.

Tuesday, April 22, 2014

Where are the STAP cells?

Signals Blog just posted a short commentary I wrote on STAP cells and why I think they're too good to be true:
Esophageal cancer is the end point on a spectrum of diseases. At the beginning, chronic acid reflux exposes the esophagus to the low pH levels of stomach acid, which is a risk factor for Barrett’s Esophagus. Barrett’s, in turn, is a risk factor for developing this type of cancer, which may take several decades to appear.

My original thoughts when STAP cells were first reported were that there would be a clear link between acidic conditions and stemness. ... But there’s no overwhelming evidence to suggest that that happens.
Acid reflux seems like the ideal natural experiment to prove that STAP cells exist, yet the esophagus doesn't turn into a mess of iPS cells every time someone gets heartburn.

Read on at Signals.

Monday, April 14, 2014

Called It: Artificial Blood from iPS Cells

Genetic Engineering & Biotechnology News reports that the Scottish National Blood Transfusion Service is looking into the safety of stem cell derived blood:
A team of scientists led by SNBTS director Marc Turner, M.D., Ph.D. is heading the project, which reflects the combined efforts of the BloodPharma consortium, which has benefited from a Strategic Award, provided by the Wellcome Trust, in the amount of £5 million (approximately $8.4 million). 

The research funded by the award involves multiplying and converting pluripotent stem cells into red blood cells for use in humans, with the aim of making the process scalable for manufacture on a commercial scale.
It's a study to test transfusions using small amounts of blood (5 mL), but nevertheless it will be real stem cell derived blood.

I called it back in 2010.

Friday, April 11, 2014

Answer the Why of Your Work

All too often, scientists (and other research minded people) are drawn into a never ending spiral of questions.  Answers lead to questions which lead to answers, leading someone to inevitably describe the next line of inquiry and cap off their thoughts with "We need to do experiments to answer these questions."

The problem that usually arises is that no one objects.

Why not? It's easier to let someone go ahead and do their work than it is to stop and think about other things that can be done.

But assuming they've already decided that the questions are worthy of work, it should be easy for them to articulate why those questions need to be answered and why now is a good time to answer them.  Is it because there's a key conundrum in your field of specialization?  Will the answer tell us something useful about a disease, a key point about how cells work, or a physical process?  On an extremely practical level, will your answer contribute to a publishable paper or to getting a grant?

Or, most commonly, will your answer tell you that Gene X amongst 20,000 genes goes up or down because you poked a particular cell the right way?  That, too, may be important but you need to state why.

The reality is that not all questions need to be answered, at least not immediately.  Unanswered questions can simmer for a little while longer.

Monday, April 7, 2014

Big Data Sets, Multiple Hypothesis Testing, and Choices

Jason McDermott, at The Mad Scientist's Confectioners Club writes:
Here’s where the problem of a false dichotomy occurs. Many researchers who analyze large amounts of data believe that utilizing a hypothesis-based approach mitigates the effect of multiple hypothesis testing on their results. That is, they believe that they can focus their investigation of the data to a subset constrained by a model/hypothesis and thus reduce the effect that multiple hypothesis testing has on their analysis. Instead of looking at 10,000 proteins in a study they now look at only the 25 proteins that are thought to be present in a particular pathway of interest (where the pathway here represent the model based on existing knowledge). ... All well and good EXCEPT for the fact that the actual chance of detecting something by random chance HASN’T changed.
The article in its entirety is a good read, especially in describing the use of big data sets as a balance between hypothesis-driven projects and discovery-driven ones.  The former can loosely be described as "research" in its classical sense, while the latter is sometimes derided as "a fishing expedition".  Both approaches can be useful, as long as you're honest with yourself and know what you're dealing with.

But the quote above isn't exactly accurate.  In the hypothetical 10,000 protein experiment, the chance that any one protein appears significant by chance is the same whether you're looking at a subset of 25 or the full 10,000.  Given that constant per-test probability, the chance of finding something significant is much greater across the whole set of 10,000 than across 25.  That 10,000 protein data set is where multiple testing correction is drastically needed.  You still need correction with 25, but simple methods are usually adequate.  Picking the right way to correct your results is tricky: I've seen large experiments designed as fishing expeditions fail to detect known, real effects in the data set with statistical significance, even after multiple testing correction was done.
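A quick simulation makes the arithmetic concrete.  This is a minimal sketch under stated assumptions: every hypothesis is null (so p-values are uniform and every 'hit' is a false positive), and Bonferroni stands in as the simplest possible correction:

```python
import numpy as np

rng = np.random.default_rng(42)

def count_significant(n_tests, alpha=0.05):
    # All hypotheses are null, so p-values are uniform on [0, 1]
    # and anything below the threshold is a false positive.
    p = rng.uniform(size=n_tests)
    raw = int((p < alpha).sum())             # uncorrected hits
    bonf = int((p < alpha / n_tests).sum())  # Bonferroni-corrected hits
    return raw, bonf

raw_big, bonf_big = count_significant(10_000)  # the whole protein set
raw_small, bonf_small = count_significant(25)  # the 25-protein 'pathway' subset
```

With nothing real in the data, roughly 5% of the 10,000 tests (around 500) clear p < 0.05 uncorrected, while the 25-protein subset yields only a handful; the correction drives both toward zero.  The per-test chance never changed between the two runs, which is exactly McDermott's point.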

So if you know what you're looking for and have a specific question in mind, you can make multiple hypothesis testing work for you.  You won't have your big data set dilute away all your interesting observations.

Having something very specific to act on also means you're less likely to be fooled by chance and drawn down a path that's "significant".  You're free to restrict your observations to a more specific set of data and choose to look at any set of measurements based on the question at hand, and not the other way around. 

Of course, the decision to ask a specific question should be made before seeing the data in its entirety, not after the fact when something "looks good", but that's a whole other issue.

Wednesday, March 12, 2014

Important Advice from Peter Gluckman, NZ Chief Science Advisor

Peter Gluckman offers ten points of advice at Nature for those interested in advising government regarding science policy.  Amongst the ten, this one is probably most critical:
Distinguish science for policy from policy for science. Science advising is distinct from the role of administering the system of public funding for science. There is potential for perceived conflict of interest and consequent loss of influence if the science adviser has both roles. There is a risk that the adviser comes to be perceived as a lobbyist for resources [Emphasis mine].

Tuesday, February 25, 2014

There's Something to Learn from Hero Worship

Kevin Hughes, writing for The Guardian and arguing against the idea of worshiping big name people in academia, centers his position around the following statements:
The truth of the matter is that heroic academics are just regular academics with two uninspiring credentials: good connections and a healthy dose of luck. A hard work ethic and an agile mind – which is to say a normative talent set at the graduate level – sets almost no one apart.
This is a sweeping assumption that dismisses the value of identifying people who are unusually successful (here in academics, but in principle in any industry).  Separating the unusually successful from the merely excellent is actually hard work, and a 'big name' is supposed to help people sift out those worthy of emulation. 

There are heroic individuals who got lucky, were in the right place at the right time, or are simply egocentric, but if you can figure out who is heroic for what reason, you can find the people who can teach you something useful.  In the comments, 'fluffybunnywabbits' describes this kind of viewpoint very well:
I also think it's telling that in the anecdote recounted here the 'worshippers' are at a more advanced stage than the author (PhDs to his masters). I think the further through you get, the more you can track your own improvement, and that makes you realize how valuable experience is. I'm a couple of years into (what I hope will become) an academic career, and revising my doctoral thesis for publication. Re-reading stuff I wrote just a year or two ago makes me cringe, and makes me realize how much I've developed. If I think my ideas now are worth much more than my ideas were two years ago, why wouldn't I respect someone with ten times that extra experience?

Tongue in Cheek Look at the $1000 Genome

This post at AllSeq has a list of seven important things to consider if you want to deliver $1000 genomes using Illumina's HiSeq X platform, most of which can be realistically met (for a megaproject), except for these two points:
  • You don’t need to pay for the building you’re in and you can work in the dark. The budget doesn’t include overhead.
  • You don’t really want to analyze or store the data. The $1000 might get you a basic alignment, but nothing else.
Overhead aside, data analysis has become, and will probably remain for the foreseeable future, the more expensive part of doing genomics.
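As a rough illustration of why, here's a toy per-genome budget.  None of these figures come from the AllSeq post or Illumina's pricing; the storage footprint, retention period, and analyst costs below are all assumptions made up for the sketch:

```python
# Toy per-genome budget: every number below is an illustrative assumption,
# not a figure from the AllSeq post.
reagent_cost = 1000.0        # the HiSeq X reagent target
storage_gb = 120             # assumed footprint of a ~30x genome (FASTQ + BAM)
cost_per_gb_year = 0.30      # assumed archival storage price, $/GB/year
years_retained = 3           # assumed retention period
analyst_hours = 4            # assumed hands-on analysis time per genome
analyst_rate = 60.0          # assumed fully loaded analyst cost, $/hour

storage = storage_gb * cost_per_gb_year * years_retained
analysis = analyst_hours * analyst_rate
total = reagent_cost + storage + analysis
print(f"storage ${storage:.0f} + analysis ${analysis:.0f} "
      f"-> ${total:.0f} per genome, not ${reagent_cost:.0f}")
```

Even with these modest assumptions, storage and analysis add roughly a third on top of the reagent cost, before any overhead.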

Friday, February 21, 2014

OICR: Perhaps One of the Best Canadian Employers Ever

I haven't posted anything in the past several weeks as I've been incredibly busy with a whole series of projects, and that's despite taking the time to go to AGBT 2014 and see the GenapSys presentation (Wow.  Just wow.).

Today, everyone at OICR received the following email, which must be either the most awesome or most Canadian thing I've ever heard of from anyone, anywhere I've worked before. 


Thursday, February 6, 2014

Three Different Ways of Reading a Scientific Article

Nature News reports:
In 2012, US scientists and social scientists estimated that they read, on average, 22 scholarly articles per month (or 264 per year). That is, statistically, not different from what they reported in an identical survey last conducted in 2005. It is the first time since the reading-habit questionnaire began in 1977 that manuscript consumption has not increased.
And further on:
Aside from the levelling out of article readings, the latest survey of 800 scholars, which is due to appear in the journal Learned Publishing, also finds that the time taken per article seems to have bottomed out at just over half an hour.
Anecdotally, I'd have to say this study hits the trend bang-on.  22 articles per month at half an hour each is actually a pretty low commitment, if you consider how articles are being read by many people.
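The survey arithmetic is quick to check; a minimal sketch of the implied time commitment:

```python
# Implied reading time from the survey figures: 22 articles per month
# at just over half an hour each.
articles_per_month = 22
hours_per_article = 0.5

monthly_hours = articles_per_month * hours_per_article
yearly_hours = monthly_hours * 12
print(monthly_hours)   # 11.0 hours per month
print(yearly_hours)    # 132.0 hours per year
```

About 11 hours a month, or roughly 132 hours a year: a real but modest commitment next to a typical research week.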

I doubt many of the people 'reading' over 22 articles a month actually have the time to fully absorb every little bit of information within them - most people just don't have that luxury of time (or perhaps that level of reading comprehension).  Some of the other thoughts mentioned at Nature capture this very well:
When articles were only available in print, it was implicitly assumed by communication analysts that researchers always read manuscripts in their entirety, as if a ‘scholarly article’ was an object to be consumed as a whole. That may never have been true, he says: most of the time, scholars were likely scanning for particular snippets of information.
Below are a few approaches and reasons why someone would want to read a scientific article.  This list isn't exhaustive by any means:
  1. To understand a new idea.  This is real learning, and learning takes effort.  This is also where you really have to study the article in depth to avoid missing details that don't seem relevant at first glance.  If you're out of your usual area of expertise, you need to understand why the final product is scientifically important, what the assumptions or facts in the report are, and how and why the experiments were done (at a technical level).  You might also have to re-read the article to really 'get it'.  Time allotted: Up to several hours.
  2. To stay up to date in your field.  Here, you're really just skimming the results and references while still reading the paper.  You don't have to study the technical aspects of the report because you're already familiar with them.  Were the experiments actually risky enough to show something daring?  Is the result worth citing in the future, or does the paper refer you to other new papers?  Time allotted: About 30 minutes.
  3. To replicate or adapt a published experiment.  You're only interested in the one figure that shows the data you'd like, or think you'd like, to show in your own work.  The end result of the paper doesn't matter to you, but the methods, software, and reagents actually used do.  Just look up the information you need and file the paper away for a rainy day.  Time allotted: 10 minutes.
There are many other ways of approaching a paper.  If you have another way, send in your comments or add them below.

Monday, February 3, 2014

Getting Prestigious Awards Gets You Noticed

Especially if you're a scientist. 

Bioscience Technology covered a recent paper in Management Science which points out that life sciences investigators see a 12 percent increase in citation rates, on average, after becoming Howard Hughes Medical Institute investigators.  Being associated with HHMI is considered prestigious by most accounts.

Among the main things the authors observed: Big gains in citation rates post-award were seen for people working in new areas of research, publishing in lower impact journals, and for younger researchers.  However, the effect of a new prize isn't very significant if people were publishing work in big name journals already. 

Since citations are usually given freely, the study does seem to support the idea of prizes as a mark of quality: a signal that reading work from that particular person is more likely to be worthwhile.  In general, awards build up a personal brand similar to that of a big-name journal.  Hot journals generally contain quality work, so work from someone who's been recognized with an award should also be interesting (though it's not hard to find lukewarm papers in hot journals, and reading work from HHMI investigators is no guarantee that it will be hot).

See the original paper here.  Unfortunately, it's paywalled unless you're at an institution that gives you access.

Monday, January 27, 2014

Five Short Facts About Fat Cell Biology

Cell recently posted a huge review of fat and adipose biology written by Evan Rosen and Bruce Spiegelman.

There are currently three types of fat known: white, brown, and beige.  If you're interested in where fat tissue comes from and how it behaves, skip ahead to the section titled "The Developmental Origins of Adipose Tissue: A Bloody Mess", a title that works both ways: the tissue is actually bloody, and it's bloody confusing to understand how all the genes involved relate to each other.  There you'll find a handful of good factoids:
  • The total number of fat cells humans carry as adults is set by adolescence.
  • Humans turn over about 8% of their fat cells per year.
  • Mice turn over about 0.6% of their fat cells every day.
  • Fat cells can be derived from stem cells that can also create blood cells.
  • Brown fat cells are derived from stem cells that actually reside in muscles, not fatty tissue.  A single gene controls the switch between the two.
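Those turnover figures are easier to compare on the same time scale.  A small sketch, assuming each day an independent 0.6% of the mouse fat-cell pool is replaced:

```python
# Convert the quoted turnover rates to a common yearly scale.
human_yearly = 0.08    # humans: ~8% of fat cells replaced per year
mouse_daily = 0.006    # mice: ~0.6% replaced per day

# Fraction of the mouse pool replaced at least once over a year,
# assuming independent daily turnover.
mouse_yearly = 1 - (1 - mouse_daily) ** 365
print(f"mouse: ~{mouse_yearly:.0%} of fat cells replaced per year")   # ~89%
print(f"ratio vs human: ~{mouse_yearly / human_yearly:.0f}x")         # ~11x
```

Under that simple assumption, mouse fat tissue turns over an order of magnitude faster than human fat tissue.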
Besides giving you the 50,000 ft view of fat biology, another key take-home message of this review is that planting basic stakes in the ground to frame research questions is often the catalyst for a lot of research down the road.  This may be obvious, but it's worth repeating when good examples arise.

In this review, this is best shown in the first figure: work in the fat field didn't really take off until several fat-related cytokines were identified in the mid-1990s.  Though interest in the field grew slowly, it took milestones like the discovery of leptin and adiponectin for both the raw and relative numbers of papers (blue and red, respectively) to shoot up from a baseline that had spanned more than two decades.

Monday, January 20, 2014

Franklin's List: Helping Scientists Become Politicians

GEN has an excellent and timely article on an emerging political group, Franklin's List, that's helping scientists get involved in politics in the United States by directly helping them become politicians.  Though the group is new, it already acknowledges several obstacles that need attention.  The largest ones are human issues and have little to do with a need for money:
One key roadblock those recruits, and Franklin’s List, will need to surmount is cultural: Until lately, investigators and other STEM professionals have balked at going into politics. [Shane] Trimmer (Franklin's List Executive Director) says that’s starting to change following years of flat or reduced spending on NIH. ... “They’re seeing how the decisions made in Congress by politicians are directly affecting their ability to do research. Now they’re seeing that if they do not get more involved, then these things will just keep on happening,” he added.
It's almost as if that imaginary world of scientists cloistered in their labs ignoring reality is real, and represents a major liability to the research enterprise.  You just can't get tenure and skive off from the rest of the world to do research until your retirement at the age of 79.

The whole idea of scientists forming a lobby group reminds me of a conversation I had with another trainee long ago at a Stem Cell Network conference.  He was a postdoc and I was a PhD student, and he took the position that scientists, being paid to manage government funds, couldn't use those same funds to lobby the government for more money.

I argued that that wasn't true; once grant money was paid to people (researchers, technicians, students, etc.), they could do whatever they wanted.  That included spending it on professional bodies that, like those for teachers and physicians, spend a lot of time and energy on negotiating better terms for their members.  Why scientists aren't very good at doing this puzzles me to this day.

But Franklin's List seems like it can partly fill this need for a scientific lobby group, at least in the United States.  Interestingly, it looks like it'll focus on gathering scientists at local levels to grow candidates for higher political office.  Kind of like running farm teams.
“The STEM candidates we’ll be searching for who have been in the lab or in academic circles, their idea was always to be in academia as a biologist or a physicist. They don’t have the network that somebody might have who has been a businessperson or an attorney in the community and might always have, in the back of their mind, thought about politics as an option,” Trimmer said. “It will be much easier for them to work their way up and to build that grassroots support.”
The GEN article is worth the few minutes to read, and it definitely portrays Franklin's List as a movement to watch.

Tuesday, January 14, 2014

Stunning Protein Animations by Nanobotmodels Studio

Yuriy Svidinenko, head of Nanobotmodels, is running a crowdfunding campaign to produce more jaw-dropping animations like this one of nanoparticles delivering drugs to cancer cells:

He's proposing to use the crowdfunding proceeds to produce an animated video about cancer biology and proteins involved in the process, and his IndieGoGo pitch video can be seen below.

You have to wait to see the cool renderings of human IgG at 1:00, what appears to be a protein encapsulated in a lipid nanoparticle at 1:12, and a translucent cell (a neuron?) starting at 1:38.

Apparently rendering costs are a significant fraction of making these videos (~40%), which he estimates at about $65-85 per second.  The campaign runs until February 25th, 2014.

Best of luck Yuriy!

Friday, January 10, 2014

Science Transforms War, Transforming Science

At Nature, David Kaiser, an MIT Professor and Head of the Program in Science, Technology, and Society, writes about how the Second World War's need for physicists to run huge research programs transformed the model of science:
Until the war, most scientific research in the United States had been supported by private foundations, local industries and undergraduate tuition fees. After the war, scientists experienced a continuity — even an expansion — of the wartime funding model. Almost all support for basic, unclassified research (as well as for mission-oriented defense projects) came from the federal government.
While the main point here is that government became the major funder of research, the point that's more important to remember is Kaiser's description of research, pre-WW2, as being paid for by (and probably driven by) foundations, industry, and tuition. 

But in the context of changing government funding, these are the same sources of money that seem to be becoming more and more important today.  Could it be that the model of running science for the last 60 to 70 years has been 'abnormal'?

Part way through, Kaiser throws in another interesting historical quip:
Veterans of the intense, multidisciplinary wartime projects came to speak of a new type of scientist. They touted the war-forged 'radar philosophy' and the quintessential 'Los Alamos man': a pragmatist who could collaborate with everyone (emphasis is mine) from ballistics experts to metallurgists, and who had a gut feeling for the relevant phenomena without getting lost in philosophical niceties.
Learning to work in collaborations and to do collaborative science is probably one of the more important and useful skills to pick up during a PhD, and the idea of a pragmatic 'serial collaborator' who manages to find common ground with people in other disciplines also seems to have originated in the post-war period.

Thursday, January 9, 2014

Frozen Human Brains, Stem Cells, and Ice Cream

Signals just posted a short summary I wrote on this paper, where a team at Columbia University and the New York Stem Cell Foundation managed to create iPS cells from human brain tissue that was frozen for 11 years. 

The paper itself is actually a neat example that human cells are pretty resilient, as the team specifically used tissue samples that weren't protected against freezing with glycerol or DMSO, common additives to prevent ice crystals from forming and damaging the cells.

And since stem cells were created from patients with four different neurological diseases, it also means that other kinds of poorly stored samples, not protected with an antifreeze, might be used to make model cells as well. 

Since Signals has the summary of the paper, I'll digress on the topic of antifreezes here.  You might also know that antifreezes aren't just useful for storing biological samples in freezers; many organisms protect themselves from freezing with antifreeze compounds or antifreeze proteins in their own bodies.  

Several antifreeze protein structures.  From RCSB.

Besides keeping organisms alive, antifreeze proteins have a variety of useful applications for humans (who obviously don't make any of their own), the best one I know of being the use of fish antifreeze protein as an additive to ice cream.

I first heard of this in a talk by Peter Davies, a scientist at Queen's University, who described how antifreeze proteins were identified in mealworms. He describes some of that work in this short interview at NPR, where he also adds that the proteins were quickly rebranded by companies using the proteins in food:
DAVIES: Unilever, which is a big company in Europe, who make frozen foods like ice cream for example, they have for some time now been putting the antifreeze proteins into especially low-fat ice cream. Now they don't call them antifreeze proteins because the public would, the consumers would be perhaps nervous about the idea of antifreeze being in food. So they actually call them ice structuring proteins.
By whatever name you call them, the proteins are yet another example of something very useful that came out of purely academic research.

Tuesday, January 7, 2014

Neurocrine Biosciences: Developing a Drug With a Huge Market

This morning, Neurocrine Biosciences shot up 79% on news that a study of their small molecule NBI-98854 to treat tardive dyskinesia (a disorder characterized by involuntary facial movements) is progressing well.  The drug targets and inhibits VMAT2, a solute carrier family transporter that loads monoamines, including dopamine, norepinephrine, serotonin, and histamine, into synaptic vesicles at nerve terminals.

Though tardive dyskinesia (TD) at first sounds similar to the symptoms of Parkinson's Disease, apparently they're very different, according to the Tardive Dyskinesia Center, which explains why the company isn't angling NBI-98854 as a possible drug for use in Parkinson's.  

Nevertheless, I don't think Parkinson's would have been the major market for Neurocrine's drug.  What makes NBI-98854 interesting is that long-term use of a variety of other drugs for common conditions can lead to TD; take metoclopramide (Reglan), used to treat nausea and vomiting, or especially haloperidol (Haldol), one of many antipsychotic drugs.  Here's a laundry list of about twenty drugs that can lead to TD (from the Dystonia Foundation).  

So while it still may be early to say Neurocrine is in the clear, it seems that with NBI-98854 the company is positioning itself to own a companion drug that'll be used in a wide variety of markets.  A very nice strategy indeed.