About this Author
College chemistry, 1983
The 2002 Model
After 10 years of blogging. . .
Derek Lowe, an Arkansan by birth, got his BA from Hendrix College and his PhD in organic chemistry from Duke before spending time in Germany on a Humboldt Fellowship for his post-doc. He's worked for several major pharmaceutical companies since 1989 on drug discovery projects against schizophrenia, Alzheimer's, diabetes, osteoporosis and other diseases.
To contact Derek email him directly: firstname.lastname@example.org
In the Pipeline:
Don't miss Derek Lowe's excellent commentary on drug discovery and the pharma industry in general at In the Pipeline
September 22, 2014
I'm listening to Jean-Louis Reymond of Bern talking about the GDB data set, the massive enumerated set of possible molecules. That's the set of chemically feasible molecules at or below a certain heavy atom count - the first iteration was GDB11 (blogged about here), and it's since been extended to GDB13, which has nearly one billion compounds with up to 13 C, N, O, S and Cl atoms. (Note, as always, that vast heaps of poly-small-ring compounds, especially concatenations of 3-membered rings, are pre-filtered out of these sets, because otherwise they would overwhelm them completely). They're working now on GDB17, which is a truly huge mound of data.
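For a feel of what these enumerations key on, here's a minimal sketch (mine, not Reymond's code) of the heavy-atom counting that defines the GDB cutoffs. It's a deliberately naive SMILES token count over the GDB-13 element set, ignoring brackets, charges, and ring-closure digits, so treat it as an illustration rather than a real cheminformatics parser:

```python
import re

def heavy_atom_count(smiles: str) -> int:
    """Naive heavy-atom count for a SMILES string (illustration only).

    Counts C, N, O, S, and Cl tokens, aromatic lowercase included;
    skips digits, bond symbols, and parentheses. 'Cl' is matched
    before single letters so chlorine isn't double-counted.
    """
    return len(re.findall(r"Cl|[CNOS]|[cnos]", smiles))

# Caffeine (C8H10N4O2) has 14 heavy atoms:
print(heavy_atom_count("Cn1cnc2c1c(=O)n(C)c(=O)n2C"))  # -> 14
```

A real enumeration tool would of course parse the full SMILES grammar and apply valence and ring-strain filters on top of the raw atom count.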
I was particularly taken with the image shown (from this paper), an artificial set of compounds (up to heavy-atom counts of 500) from several main classes of real molecules. It's a 3-D principal components analysis plot, which tunes things up to emphasize the differences, of course, and that's what chemical space looks like from this angle. There go the proteins and nucleic acids, off into their own zones, and similarly the linear alkanes and diamond-like lattices, beaming off in separate directions. In the middle are drug-like compounds - and don't imagine for a minute that any substantial number of those have actually been prepared, either. This is where we live, all of us organic chemists.
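For anyone curious about the mechanics behind a plot like that, here's a hedged sketch of a three-component PCA projection, using a random stand-in for a compounds-by-descriptors matrix (every number here is made up):

```python
import numpy as np

# Toy stand-in for a descriptor matrix: rows are hypothetical
# compounds, columns are hypothetical molecular descriptors.
rng = np.random.default_rng(42)
X = rng.normal(size=(200, 12))

# Center the data, then take the top three principal components
# via SVD: the same operation behind a 3-D PCA plot.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
scores = Xc @ Vt[:3].T                  # (200, 3) coordinates to plot
explained = (S**2 / (S**2).sum())[:3]   # variance captured by each PC
print(scores.shape)  # -> (200, 3)
```

A real analysis would use actual molecular descriptors or fingerprints, naturally; the point is just that three SVD components give you the 3-D coordinates that end up on the plot.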
Category: In Silico
Erland Stevens at Davidson is going to be running an online med-chem course on EdX, the MOOC platform founded by Harvard and MIT. It starts in October, runs for 8 weeks, can be audited for free, and covers these topics:
(1) The drug approval process (early drugs, clinical trials, IP factors)
(2) Enzymes and receptors (inhibition, Ki, types of ligands, Kd)
(3) Pharmacokinetics (Vd, CL, compartment models)
(4) Metabolism (phase I and II, genetic factors, prodrugs)
(5) Molecular diversity (drug space, combi chem, libraries)
(6) Lead discovery (screening, filtering hits)
(7) Lead optimization (FG replacements, isosteres, peptidomimetics)
(8) Important drug classes (selected examples)
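As a taste of the pharmacokinetics material in topic (3), the textbook one-compartment relationship ties the volume of distribution and clearance together into an elimination half-life. A minimal sketch, with made-up numbers:

```python
import math

def half_life_hours(vd_litres: float, cl_litres_per_hour: float) -> float:
    """One-compartment elimination half-life: t1/2 = ln(2) * Vd / CL."""
    return math.log(2) * vd_litres / cl_litres_per_hour

# Hypothetical drug: Vd = 40 L, CL = 4 L/h
print(round(half_life_hours(40.0, 4.0), 2))  # -> 6.93
```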
So if you know someone who would like to have a better understanding of the basics of med-chem and has been looking for an opportunity, this might be the answer. Stevens taught this one in the spring on the same platform, and had 14,000 people sign up at the beginning.
Update: from the comments, there's another med-chem course starting on Coursera shortly, from UCSD: https://www.coursera.org/course/drugdiscovery.
Category: Pharma 101
Well, I didn't see this one coming: Merck KGaA (Merck-Darmstadt) is buying Sigma-Aldrich. So that's Merck/Millipore/Sigma-Aldrich, which will be a big life sciences company indeed. It's a $17 billion deal, which goes off at a 37% premium to SIAL's close on Friday. Doesn't look much like any news of this offer leaked out, either, since the company's stock went down on Friday, and I don't see anything particularly weird in the options activity, either.
Merck KGaA has had a rough time of it recently trying to be a biotech company - perhaps they've decided that there's more stability in services?
Category: Business and Markets
I'm at the FBLD conference in Basel today through Wednesday, and I'll be blogging some of the things I hear there. This morning we had an overview from Harren Jhoti of Astex, looking back over the company's history (they were recently bought by Otsuka for about $900 million).
One of the most striking slides was not from the scientific side - it was a plot of the company's financial position over the years. They struck several deals with larger companies along the way (GSK, Novartis, and J&J among others), and you could see that these, in many cases, were very timely indeed. In fact, Jhoti said, the company was more than once in the position of running out of money within a few months.
I think that many small companies could show a similar slide - well, the ones that survive could. The others would show a graph, if there were anyone left to show it, where that line came down and crossed the X-axis, and that was that. You hear a lot about these little tech companies, especially social media and the like, that just take off and soar ever higher, but that doesn't happen much in biopharma. You can go for years dodging disaster until something works, which can make it a tough sell to investors, since you can also go for years and have nothing that works at all.
Category: Business and Markets
September 19, 2014
See what you think of Peter Thiel's characterization of the drug industry in this piece for Technology Review. Thiel's a very intelligent guy, and his larger points about technology stalling out make uncomfortable reading, in the best sense. (The famous quote is "We wanted flying cars; instead we got 140 characters"). But take a look at this (emphasis added):
You have to think of companies like Microsoft or Oracle or Hewlett-Packard as fundamentally bets against technology. They keep throwing off profits as long as nothing changes. Microsoft was a technology company in the ’80s and ’90s; in this decade you invest because you’re betting on the world not changing. Pharma companies are bets against innovation because they’re mostly just figuring out ways to extend the lifetime of patents and block small companies. All these companies that start as technological companies become antitechnological in character. Whether the world changes or not might vary from company to company, but if it turns out that these antitechnology companies are going to be good investments, that’s quite bad for our society.
I'd be interested in hearing him revise and extend those remarks, as they say in Washington. My initial reaction was to sit down and write an angry refutation, but I'm having second thoughts. The point about larger companies becoming more cautious is certainly true, and I've complained here about drug companies turning to M&A and share buybacks instead of putting that money back into research. I'd say, though, that the big drug companies aren't so much anti-technology as they are indifferent to it (or as indifferent as they can afford to be).
Even that still sounds harsh - what I mean is that they'd much rather maximize what they have, as opposed to coming up with something else. Line extensions and patent strategies are the most obvious forms of this. Buying someone else's innovations comes next, because it still avoids the pain and uncertainty of coming up with your own. There's no big drug company that does only these things, but they all do them to some degree. Share buybacks are probably the most galling form of this, because that's money that could, in theory, be applied directly to R&D, but is instead being used to prop up the share price.
But Thiel mentions elsewhere in his interview that we could, for example, be finding cures for Alzheimer's, and we're not. Eli Lilly, though, is coming close to betting the company on the disease, taking one huge swing after another at it. Thiel's larger point stands, about how more of the money that's going into making newer, splashier ways to exchange cat pictures and one-liners over the mobile phone networks could perhaps be applied better (to Alzheimer's and other things). But it's not as if the industry hasn't been beating away at these problems itself.
I worry that the Andy Grove fallacy might be making an appearance again, given Thiel's background (PayPal, Facebook, LinkedIn). That link has a lot more on that idea, but briefly, it's the tendency for some people from the computing/IT end of the tech world to ask what the problem is with biomedical research, because it doesn't improve like computing hardware does. It's a good day to reference the "No True Scotsman" fallacy, too: sometimes people seem to identify "technology" with computing, and if something doesn't double in speed and halve in cost every time you turn around, well, that's not "real" technology. At the very least, it's not living up to its potential, and there must be something wrong with it.
I also worry that Thiel adduces the Manhattan project, the interstate highway system, and the Apollo program as examples of the sort of thing he'd like to see more of. Not that I have anything against any of those - it's just that they're all engineering projects, rather than discovery ones. The interstate system, especially: we know how to build roads, so build bigger ones. The big leap there was the idea that we needed large, standardized ones across the whole country, with limited entrances and exits. (And that was born out of Eisenhower's experiences driving across the country as the road network formed, and seeing Germany's autobahns during the war).
But you can say similar things about Apollo: we know that rockets can exist, so build bigger ones that can take people to the moon and back. There were a huge number of challenges along the way, in concept, design, and execution, but the problem was fundamentally different than, say, curing Alzheimer's. We don't even know that Alzheimer's can be cured - we're just assuming that it can. I really tend to think it can be cured, myself, but since we don't even know what causes it, that's a bit of a leap of faith. We're still making fundamental "who knew?" type discoveries in biochemistry and molecular biology, of the sort that would totally derail most big engineering projects. The Manhattan project is the closest analog of the three mentioned, I'd say, because atomic physics was such a new field (and Oppenheimer had to make some massive changes in direction along the way because of that). But I've long felt that the Manhattan project is a poor model, since it's difficult to reproduce its "Throw unlimited amounts of money and talent at the problem" mode, not to mention the fight-for-the-survival-of-your-civilization aspect.
But all that said, I do have to congratulate Peter Thiel on putting his money down on his ideas, through his investment fund. One of the things I'm happiest about in today's economy, actually, is the way that some of the internet billionaires are spending their money. Overall, I'd say that many of them agree with Thiel that we haven't discovered a lot of things that we could have, and they're trying to jump-start that. Good luck to them, and to us.
Category: Business and Markets | General Scientific News | Who Discovers and Why
I wanted to mention that there's an interesting symposium on "Irreproducibility in Target Validation" taking place at Novartis (Cambridge, MA) next month, October 23. This is a topic that many an industrial biopharma researcher can relate to, and as academic centers get into more drug research, they're joining the rueful party, too. There are a number of good speakers from both academia and industry on the schedule, so if you're in the area, it's worth a look. More information here.
Category: Academia (vs. Industry) | The Scientific Literature
The GSK/China bribery case has come to some sort of ending. The company has been fined the equivalent of nearly $500 million, and Mark Reilly, the former head of that part of their operations in China, has been sentenced to prison for "two to four years". No further details seem to be available about the sentence. There are so few particulars, in fact, that although several other people are reported to have also been sentenced, we don't know who they are or what prison terms they've received. China's press (and government, same difference) are working on the usual need-to-know basis, and they've decided that no one else needs to know.
Update: the latest report is that Reilly has been given a suspended sentence and has been deported back to the UK. Certainly beats time in the Chinese prison system.
Category: Business and Markets | The Dark Side
Big high-impact journals have more retractions, it seems. I can see how that would be, because there are several forces at work. People want to publish their splashy, cutting-edge results in the big-name journals, and a higher percentage of those papers are wrong to start with, as opposed to more incremental ones. And the big papers in the big journals get more scrutiny, so they're more likely to be picked apart when there's something wrong with them. I would have been surprised if this correlation had come out any other way, actually.
Category: The Scientific Literature
September 18, 2014
Thinking of good seminars and bad ones reminds me of a story, which I'm surprised that I haven't told here, because it's a favorite memory of mine from grad school. Like everyone else, I've attended some pretty deadly talks over the years - some of them had decent subject matter, but were presented murderously, while others had such grim content that they would not have been redeemed by substituting the best speaker available. Combine those two, and you have a section of the Venn diagram that makes you wonder what you've done with your life (or with a previous one) to be sitting through the thing.
I remember coming back upstairs after one of those. Like most grad students, I didn't have the nerve to just bail on a speaker if they turned out to be horrible (heck, I sometimes don't have the nerve now). So I'd sat through a real forced march, or forced stagger, through a bunch of uninteresting stuff delivered at dictation speed in a nasal monotone. At this remove, I couldn't tell you what it was about even for a large reward; all I remember was the pointlessness.
So I was back in front of my hood when my labmate at the time appeared in the doorway. "That was the WORST seminar I have EVER heard in my LIFE!" he proclaimed, and I could only agree with him, which I did with a strange expression on my face. "Why are you grinning like that?" he asked. "Because the seminar speaker just walked behind you when you said that", I told him (truthfully). "No!" he said in horror, and looked off to his right down the hall. "Oh my God! Oh, well. He's heard it before." And maybe he had. Nominations for your own worst seminar experience are welcome in the comments, if you haven't blocked them out of your mind by now.
By the way, I checked to see if I'd told this story on the site by a Google trick, which is useful for searching the site as a whole. Just start your query with site:pipeline.corante.com, and you'll search only within that domain. It works quite well, but to be sure, I went and checked a text backup of the site (I make one from time to time, via an "Export" command). In case you're wondering, the whole site (posts and comments), rendered in 10-point Courier with standard margins on letter-sized paper, now comes to over 28,000 pages. Dang. I did a search for "worst seminar" and didn't find the phrase, but at this point you'd have to do a search for "Swann" to find the text of "Remembrance of Things Past" in there.
Category: Blog Housekeeping | Graduate School
I wanted to let people know that next week I'll be attending the FBLD (Fragment-Based Ligand Design) meeting in Basel, Switzerland. I'm looking forward to it - there are a number of good talks on the agenda, and it's always nice to attend a specialized conference where you're interested in the great majority of what's on offer. (Sitting through bad or irrelevant talks, as I've mentioned, becomes harder and harder for me every year). I hope to do some blogging from the conference itself as interesting topics come up.
If there are folks in Basel who'd like to meet up while I'm in town, it looks like I'll be free on Monday and Wednesday evening, so drop me an e-mail and maybe we can find a place to meet.
Category: Blog Housekeeping
September 17, 2014
This makes for a very disturbing read. The author details his participation in a clinical trial for an asthma therapy being developed by Amgen at a clinic in Newport Beach (CA). He doesn't say what the drug was, but my guess is that it's brodalumab, an anti-IL17 antibody which has been in trials for asthma and psoriasis.
What he recounts is very disturbing. Here's a sample:
Moment of Truth #2 came during one of the many whispering sessions they gave me. The lead technician had a disturbing habit of frequently pulling me into a corner or another room and whispering things like “We’re just going to say that you take this medication.” I had to fill out numerous questionnaires, and she would often stand over me and whisper which answer I should mark. At last, one day after a battery of breathing tests, questionnaires, and vital-sign checks, it was required that the doctor (listed as the principal investigator on this study) verify all this, personally examine me, and sign off on it. Amgen was very clear on that point. “But he’s not here today,” she whispered, “so we’re just going to mark this off and send it through. We’ve already done everything he was going to do anyway.” By now I knew this contractor was willfully and knowingly giving Amgen invalid data, and I resolved to stick with it only long enough to see what more I could learn. I’d already decided I would not complete the trial and contribute bad data to a medical clinical trial.
It gets worse from there. The comments to Brian Dunning's post are already starting to fill up with the expected "Yeah, that's what Big Pharma does" stuff. So I'd like to help provide an antidote to that: Hey Amgen! Hey FDA! Check out this Newport Beach trial center! Dig into these allegations, and do something about them. And tell everyone what you've found!
Update: some readers are asking how anyone can be sure that this description is real. We can't, although it's certainly a detailed description, and attached to the name of someone with a good bit of internet traffic and associated notoriety. Mind you, some of that associated notoriety is a conviction for wire fraud. But in a "cui bono" sense, I don't see any reason for someone to make up the details in this account.
And how accurate it is should be an immediate concern for Amgen. In this business, we not only have to be death on clinical trial fraud, but we also have to be seen to be death on clinical trial fraud, so that (1) other people won't get the impression that it's a good idea, and (2) the general public won't get the impression that we're a bunch of crooks. So one way or another, these allegations have to be looked at, pronto.
Second update: see the comments section. Adam Feuerstein reports that Amgen has told him that they're already investigating this, which is just what the company should be doing. Glad to see them moving this quickly!
Category: Clinical Trials | The Dark Side
So, Ebola. A terrible virus, with a high mortality rate. And although that percentage is raised by the poor public health infrastructure in the areas where it's endemic, it's still very bad news indeed, a major medical challenge anywhere it might show up. The current outbreak in West Africa is by far the largest yet, which makes prediction of its course difficult. One can hope that the virus will mutate to a less ferocious form, as often happens with host/pathogen systems, but (1) that could take a long time, and (2) there's no guarantee that it would help much. After all, polio and smallpox had been infecting humans for a long time, and were still awful enough. There's also a small (but non-zero) chance that a mutation could go the other way, and make the virus easier to spread by airborne routes, to pick one worrisome possibility, in which case we're all going to long for the good old days when Ebola was only as bad as it is now.
With those cheerful thoughts in mind, what does the biopharma industry have to offer against the disease? As I mentioned regarding avian influenza several years ago, we can probably cross off most small-molecule therapies. Not very many antiviral drugs have a broad enough mechanism to go after Ebola as well as their designed targets, and there are no small molecules on the market that have been aimed at Ebola itself. As with many infectious diseases, immune-system therapies are a better bet - antibodies and the like for acute treatment, and vaccines as the best hope for prevention. Famously, there is a potential Ebola antibody in development (more on this in another post, as we get some more details on the work itself), but there is (equally famously) no such thing as an Ebola vaccine yet, although there's a GSK/NIH venture that not many people had heard about until this latest outbreak.
You can realize all this, as the Independent newspaper did in England, and write a big story about how Big Pharma has been callous and negligent. But once you scroll down that one, you'll find in the comments section that the main source for the article is quite unhappy with it:
This article claims that I, Adrian Hill, have “launched a devastating attack on Big Pharm, accusing drugs giants including GlaxoSmithKline (GSK), Sanofi, Merck and Pfizer of failing to manufacture a vaccine, not because it was impossible, but because there was no business case.” I did no such thing.
I simply explained to Mr Cooper what is widely known in the biomedical research and development community: that vaccine development is extremely expensive, usually takes a very long time and the market is dominated by some very large pharma companies. Because outbreaks of diseases such as Ebola are rare and unpredictable and, until now, have afflicted small numbers of people in very poor countries, it is widely understood that there is no business case for a private company to invest tens or hundreds of millions of dollars in vaccine development for such diseases. It may be the Independent’s view that this is a telling indictment of the global pharmaceutical industry, but it is not mine and I am unhappy that the first paragraph of this report wrongly attributes that view to me.
Adrian Hill is completely correct. Until this latest outbreak, there have been far, far larger public health problems in Africa than Ebola. Not even the deepest-pocketed NGO or most open-hearted charity would have been able to make a good case for putting large amounts of money behind an Ebola vaccine effort - in fact, such a project would almost certainly have been a criminal misuse of funds. There are just too many other things that have caused (and are causing) death and disease in these areas - until this year, the number of people killed by Ebola was trivial compared to the number being killed by a host of other factors. In fact, the number of people being infected now by Ebola is pretty damned small compared to the other public health problems in the poorest parts of West Africa - just to stick with the infectious diseases alone, you have malaria, yellow fever, schistosomiasis, dengue, typhoid, hepatitis A, meningococcus, and just plain diarrhea (a killer for young children). Ebola doesn't deserve the current level of attention just because of the number of people it's killed - the number of people in Liberia who've died before their time over the last ten or twenty years is horrific, and it wasn't Ebola that killed them. It deserves attention because we don't know how bad this outbreak is going to get, and because it's already (and understandably) causing huge amounts of fear and disruption, given its high mortality rate. If cases continue to show up inside the larger cities, things could get out of control pretty quickly.
The NIH (specifically, the NIAID) has funded a number of Ebola research programs in the US, and I would guess that many of these have been directed, at least partly, to possible bioterror threats. It's also worthwhile to figure out how to develop a vaccine against a class of viruses that no one's tackled before, since there are bound to be some lessons learned that can be applied again. Okairos, a small Swiss-based company spun out of Merck, has been working on a vaccine along with the NIH as well, and they were recently acquired by GSK. (This one is the focus of a great deal of suddenly accelerated work).
But no, the big companies have not spent time working on an Ebola vaccine until now. And as Hill says, this should not come as a surprise. By the WHO's count, from 1976 through 2012 there had been a total of 1590 human fatalities from Ebola, with whole years going by without a single case. There is not enough research money in the world to work on everything at this level, or at least, not without taking away from everything else.
Category: Infectious Diseases
September 16, 2014
Today brings news of a deal between Eli Lilly and AstraZeneca to help develop AZ's beta-secretase inhibitor, AZD3293 (actually an Astex compound, developed through fragment-based methods). AZ has been getting out of CNS indications for some time now, so they really did need a partner here, and Lilly lost their own beta-secretase compound last year. So this move doesn't come as too much of a shock, but it does reaffirm Lilly's bet-the-ranch approach to Alzheimer's.
This compound was used by AZ in their defense against being taken over by Pfizer, but (as that link in the first paragraph shows), not everyone was buying their estimated chances of success (9%). Since the overall chances for success in Alzheimer's, historically, have ranged between zero and 1%, depending on what you call a success, I can see their point. But beta-secretase deserves to have another good shot taken at it, and we'll see what happens. It'll take years, though, before we find out - Alzheimer's trials are painfully slow, like the disease itself.
Update: I've had mail asking what I mean by AZ "getting out of CNS indications", when they still have a CNS research area. That's true, but it's a lot different than it used to be. The company got rid of most of its own infrastructure, and is doing more of a virtual/collaborative approach. So no, in one sense they haven't exited the field at all. But a lot of its former CNS people (and indeed, whole research sites) certainly exited AstraZeneca.
Category: Alzheimer's Disease | Business and Markets | Drug Development
Here's a look from Technology Review at the resurgent fortunes of Alnylam and RNA interference (which I blogged about here).
But now Alnylam is testing a drug to treat familial amyloid polyneuropathy (FAP) in advanced human trials. It’s the last hurdle before the company will seek regulatory approval to put the drug on the market. Although it’s too early to tell how well the drug will alleviate symptoms, it’s doing what the researchers hoped it would: it can decrease the production of the protein that causes FAP by more than 80 percent.
This could be just the beginning for RNAi. Alnylam has more than 11 drugs, including ones for hemophilia, hepatitis B, and even high cholesterol, in its development pipeline, and has three in human trials —progress that led the pharmaceutical company Sanofi to make a $700 million investment in the company last winter. Last month, the pharmaceutical giant Roche, an early Alnylam supporter that had given up on RNAi, reversed its opinion of the technology as well, announcing a $450 million deal to acquire the RNAi startup Santaris. All told, there are about 15 RNAi-based drugs in clinical trials from several research groups and companies.
“The world went from believing RNAi would change everything to thinking it wouldn’t work, to now thinking it will,” says Robert Langer, a professor at MIT, and one of Alnylam’s advisors.
Those Phase III results will be great to see - that's the real test of a technology like this one. A lot of less daring ideas have fallen over when exposed to that much of a reality check. If RNAi really has turned the corner, though, I think it could well be just the beginning of a change coming over the pharmaceutical industry. Biology might be riding over the hill, after an extended period of hearing hoofbeats and seeing distant clouds of dust.
There was a boom in this sort of thinking during the 1980s, in the early days of Genentech and Biogen (and others long gone, like Cetus). Proteins were going to conquer the world, with interferon often mentioned as the first example of what was sure to be a horde of new drugs. Then in the early 1990s there was a craze for antisense, which was going to remake the whole industry. Antibodies, though, were surely a big part of the advance scouting party - many people are still surprised when they see how many of the highest-grossing drugs are antibodies, even though they're often for smaller indications.
And the hype around RNA therapies did reach a pretty high level a few years ago, but this (as Langer's quote above says) was followed by a nasty pullback. If it really is heading for the big time, then we should all be ready for some other techniques to follow. Just as RNAi built on the knowledge gained during the struggle to realize antisense, you'd have to think that Moderna's mRNA therapy ideas have learned from the RNAi people, and that the attempts to do CRISPR-style gene editing in humans have the whole biologic therapy field to help them out. Science does indeed march on, and we might possibly be getting the hang of some of these things.
And as I warned in that last link, that means we're in for some good old creative destruction in this industry if that happens. Some small-molecule ideas are going to go right out the window, and following them (through a much larger window) could be the whole rare-disease business model that so many companies are following these days. Many of those rare diseases are just the sorts of things that could be attacked more usefully at their root cause via genomic-based therapies, so if those actually start to work, well. . .
This shouldn't be news to anyone who's following the field closely, but these things move slowly enough that they have a way of creeping up on you unawares. Come back in 25 years, and the therapeutic landscape might be a rather different-looking place.
Category: Biological News | Business and Markets | Clinical Trials | Drug Development
September 15, 2014
It's time for a hang-heads-in-shame moment. This is another off the Twitter feed, and the only place to see the figure in its native state is to go to the Chemical Reviews table of contents and scroll down to the article titled "Aqueous Rechargeable Li and Na Ion Batteries". A perfectly reasonable topic, but take a look at the graphical abstract figure. Oh, dear.
Category: The Scientific Literature
Last year I mentioned a paper that described the well-known drug tramadol as a natural product, isolated from a species of tree in Cameroon. Rather high concentrations were found in the root bark, and the evidence looked solid that the compound was indeed being made biochemically.
Well, thanks to chem-blogger Quintus (and a mention on Twitter by See Arr Oh), I've learned that this story has taken a very surprising turn. This new paper in Angew. Chem. investigates the situation more closely. And you can indeed extract tramadol from the stated species - there's no doubt about it. You can extract three of its major metabolites, too - its three major mammalian metabolites. That's because, as it turns out, tramadol is given extensively to cattle (!) in the region, so much of it that the parent drug and its metabolites have soaked into the soil enough for the African peach/pincushion tree to have taken it up into its roots. I didn't see that one coming.
The farmers apparently take the drug themselves, at pretty high dosages, saying that it allows them to work without getting tired. Who decided it would be a good thing to feed to the cows, no one knows, but the farmers feel that it benefits them, too. So in that specific region in the north of Cameroon, tramadol contamination in the farming areas has built up to the point that you can extract the stuff from tree roots. Good grief. In southern Cameroon, the concentrations are orders of magnitude lower, and neither the farmers nor the cattle have adopted the tramadol-soaked lifestyle. Natural products chemistry is getting trickier all the time.
Category: Analytical Chemistry | Chemical News | Natural Products
September 12, 2014
Well, it was not a dull evening around the In the Pipeline headquarters last night. I submitted a link to Reddit for my post yesterday about Retrophin and Thiola, and that blew up onto that site's front page. The Corante server melted under the impact, which isn't too surprising, since it's struggling at the best of times. (A site move really is coming, and no, I can't wait, either, at this point.)
But then, to my great surprise, Martin Shkreli (CEO of Retrophin) showed up in the Reddit thread, doing an impromptu AMA (Ask Me Anything), which I have to say takes quite a bit of aplomb (or perhaps foolhardiness - I don't think too many other CEOs of any publicly traded corporations would have done it). But not too long after that, the entire thread vanished off the front page, and off of r/News, the subreddit where I'd submitted it.
Then I got a message from one of the moderators of r/News, saying that I'd been banned from it, and going on to say that I would likely be banned from the site as a whole. After having been on Reddit for seven years, that took me by surprise. As best I can figure, the thread itself was reported to r/Spam by someone, and the automated system took over from there. Over the years, I've submitted links to my blog posts, and Reddit, or some parts of it, anyway, has been notoriously touchy about that. The last time I submitted such a link, though, was back in February (and before that, August of 2013), so I'm not exactly a human spam-bot. We'll see what happens. Update: I was banned for some hours, but I've been reinstated.
But back to Retrophin, Thiola, and Martin Shkreli. The entire Reddit thread can still be read here, via a direct link, although it can't be found in r/News any more. If you look for a user named "martinshkreli", you can see where he gets into the fray (I'm "dblowe" on the site, or perhaps I was?). You'll note that he gives out his cell phone, office phone, and e-mail, which again is not your usual CEO move - you have to give him that, although it does seem a bit problematic from a regulatory/compliance angle. So what arguments does he make for the Thiola price increase?
From what I can see, they boil down to this: patients themselves aren't going to be paying this increased price - insurance companies are. And Retrophin is actually going to be working on new formulations for the drug, which no one has done previously. He seems to have implied that the previous company (Mission Pharmacal) was reluctant to raise the price and take the public outcry, and stated (correctly) that Mission was having trouble keeping the drug in supply. He claims that the current price is still "pretty low", and that he does not expect any pushback from the eventual payers. There was also quite a bit about the company's dedication to patients, their work on other rare diseases, and so on.
He and I didn't cross paths much in the thread. I tried asking a few direct questions, but they weren't picked up on, so my take on Shkreli's answers will show up here. He's correct that the drug's availability was erratic, and he may well be correct that its price was too low for a company to deal with it properly. But if so, that does make you wonder what Mission Pharmacal was up to, and how they were sourcing the material.
He's also correct that Retrophin is planning to work on new formulations of the drug. But when you look at the company's investor presentation about Thiola, all that comes under a slide marked "Distribution and Intellectual Property". The plan seems to be that they'll introduce 250mg and 500mg dosages, at which time they'll discontinue the current 100mg formulation. Later on, they'll try to introduce a time-release formulation, at which time they'll discontinue the 250mg and 500mg forms. You can argue that this is helping patients, but you can also argue that it's making it as difficult as possible for anyone else to show bioequivalence and enter the market as well, assuming that anyone wants to.
And as I mentioned yesterday, the company does seem to care about someone else entering the market. My questions to Shkreli about the "closed distribution" model mentioned on the company's slides went unanswered, but the only interpretation I can give them is that Retrophin plans to use the FDA's risk management system to deny any competitors access to their formulations, in order to try to keep themselves as the sole supplier of Thiola in perpetuity. Patents at least expire: regulatory rent-seeking is forever.
Also left out of Shkreli's comments on Reddit are the issues on the company's slide titled "Pharmacoeconomics", where it says (vis-a-vis the other drug for cystinuria, penicillamine):
• Current pricing of Thiola® - $4,000 PPPY
• Penicillamine pricing - $80,000-$140,000
• Thiola could support a significant price increase
Personally, I think that's the main reason for Retrophin's interest. You'll note that the price hike takes Thiola's cost right up to the penicillamine region (the price of that one is another story all its own). But to a first approximation, that's business. I've defended some drug company pricing decisions on this site before (although not all of them), so what's different this time?
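The figures quoted in these posts make the arithmetic easy to check. A minimal sketch, using only the numbers given here (the $1.50 and $30 per-pill prices, and the $4,000 PPPY from Retrophin's own slide - the exact pill counts per patient aren't specified, so this just scales the annual figure by the per-pill ratio):

```python
# Back-of-the-envelope check of the Thiola price hike, using only the
# figures quoted in the posts. Per-patient pill counts vary, so the
# annual cost is scaled by the per-pill price ratio rather than rebuilt
# from a dosing schedule.
old_per_pill = 1.50    # $ per pill under Mission Pharmacal
new_per_pill = 30.00   # $ per pill under Retrophin ("over $30")
old_pppy = 4_000       # $ per patient per year, from Retrophin's slide

ratio = new_per_pill / old_per_pill
implied_new_pppy = old_pppy * ratio

print(f"Price multiple: {ratio:.0f}x")                  # 20x
print(f"Implied new cost: ${implied_new_pppy:,.0f} PPPY")  # $80,000 PPPY
```

The implied ~$80,000 PPPY lands right at the bottom of the penicillamine range on that same slide, which is consistent with the point made above about where the price hike takes Thiola.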
I've been thinking hard about that one, and here's what I have so far. I think that pricing power of this sort is a powerful weapon. That's the reason for the patent system - you get a monopoly on selling your invention, but it's for a fixed period only, and in return you disclose what your invention is so that others can learn from it. And I think that this sort of pricing power should be a reward for actually producing an invention. That's the incentive for going through all the work and expense, the (time-limited) pot of gold at the end of the rainbow. I have a much lower opinion of seeing someone ram through a big price increase just because, hey, they can. Thiola has nothing to do with the patent system - it's off patent. What this situation looks like to me is regulatory rent-seeking. Celgene seems to be doing that too, with thalidomide (as mentioned yesterday), which is why they're being taken to court. Retrophin is betting that Thiola just isn't a big enough deal for anyone to go to that trouble, once they tell them to buzz off by using Celgene's strategy.
Businesses can, though, charge what they think the market will bear, and Retrophin's contribution to cystinuria therapy so far is to have realized that the market will bear a lot more than people had realized. But in an actual market, it would be easier for someone else to come in and compete on price. What Retrophin is planning is to use regulatory loopholes to keep anyone else from doing so, with no time limit until someone at the FDA does something about it. Cloaking this in a lot of noble-sounding talk about being the company that really cares about cystinuria patients is a bit stomach-turning. In my opinion.
Category: Business and Markets | Drug Prices | Why Everyone Loves Us
September 11, 2014
There's a drug called Thiola (tiopronin) that most people have never heard of. It's on my list of "smaller than aspirin" drugs, and I'd never heard of it until I put that one together. But thanks to a little company called Retrophin, we all get to hear about it now.
It's used to treat cystinuria, a rare disease that causes painful kidney complications, namely unusual kidney stones of pure cystine. And until recently, tiopronin (as a small, nearly forgotten drug for an orphan indication) was rather cheap. It was sold by a small company in Texas, Mission Pharmacal, until Retrophin bought the marketing rights earlier this year (a move complicated by the company's CEO, investor Martin Shkreli, who may have let the news of the deal leak on his Twitter account).
That link mentions part of Shkreli's business plan as "acquiring the rights to obsolete remedies Shkreli says can be put to new and lucrative purposes", and by gosh, that's certainly accurate. Retrophin is increasing the price of Thiola from $1.50 per pill to over $30 per pill. Because they can - they stated when they bought the drug that their first move would be to raise the price. New dosages and formulations are also mentioned, but the first thing is to jack the price up as high as it can be jacked. Note that patients take several pills per day. Shkreli is probably chortling at those Mission Pharmacal hicks who didn't realize what a gold mine they were sitting on.
Now, there have been somewhat similar cases in recent years. Colchicine's price went straight up, and (infamously) so did the progesterone formulation marketed as Makena. But in both those cases, the small companies involved took the compound back through the FDA, under an agency-approved program to get marketing exclusivity. I've argued here (see those last two links) that this idea has backfired several times, and that the benefit from the clinical re-evaluation and re-approval of these drugs has not been worth their vastly increased cost. I think that drug companies should be able to set the price of their drugs, because they have a lot of failures to make up for, but this FDA loophole gives people a chance to do minimal development at minimal risk and be handed a license to print money in return.
But this isn't even one of those cases. It's worse. Retrophin hasn't done any new trials, and they haven't had to. They've just bought someone else's old drug that they believed could be sold for twenty times its price, and have put that plan right into action. No development costs, no risks whatsoever - just slap a new sticker on it and put your hands over your ears. This is exactly the sort of thing that makes people go into fist-clenching rages about the drug industry, and with damn good reason. This one enrages me, and I do drug research for a living.
So thank you, Martin Shkreli. You've accelerated the progress of the giant hammer that's coming down on all of us over drug pricing, and helped drag the reputation of the pharmaceutical industry even further into the swamp. But what the hell do you care, right? You're going to be raking in the cash. The only thing I can say about Shkreli and Retrophin is that they make the rest of the industry look good in comparison. Some comparison.
Update: There are some interesting IP aspects to this situation. As pointed out in the comments section, this compound has no exclusivity left and is off patent. So what's to stop someone else from filing an ANDA, showing bioequivalence, and competing on price (since there seems to be an awful lot of room in there)?
Simon Lackner on Twitter sent me to this presentation from Retrophin on their purchase of the Thiola license. In it, you can see that their plan for this: "Similar to Chenodal, Retrophin will move Thiola into closed distribution". Chenodal was the company's previous brainstorm of this sort, when they bought Manchester Pharmaceuticals, details of which can be seen on this presentation. What they say on that one is "Closed distribution system does not allow for generics to access product for bioequivalence study. ANDA filings are impossible unless generic company illegally penetrates specialty distributor. Recent Celgene v. Lannett case establishes precedent." So let's go back and take a look at Celgene v. Lannett.
That was a long-running dispute between the two companies over Lannett's desire to market a generic equivalent of Celgene's thalidomide. Lannett brought suit, accusing Celgene of using the drug's Risk Evaluation and Mitigation Strategy (REMS) improperly to deny potential competitors access to their product (which is needed to do a head-to-head comparison for an ANDA filing). As you can imagine, the REMS for thalidomide is pretty extensive and detailed! But there was no court decision in the case. The companies reached an out-of-court settlement before it went to trial in 2012, although I have to say that that Retrophin slide makes it sound like there's some sort of legal precedent that was set. There wasn't. The limits of REMS restrictions to deny access to a given drug are still an open question.
In late 2012, Actelion and Apotex went at it over the same issue, this time over access to Tracleer (bosentan). The Federal Trade Commission filed an amicus brief, warning that companies could be abusing the REMS process to keep out competition. That case was also dismissed, though, after the two companies reached an out-of-court settlement of their own, removing another chance for a legal opinion on the subject.
But the issue is very much alive. Earlier this year, Mylan went after Celgene, also over thalidomide (and its follow-up, lenalidomide). Their complaint:
Celgene, a branded drug manufacturer, has used REMS as a pretext to prevent Mylan from acquiring the necessary samples to conduct bioequivalence studies, even after the FDA determined that Mylan’s safety protocols were acceptable to conduct those studies. In furtherance of its scheme to monopolize and restrain trade, Celgene implemented certain distribution restrictions that significantly limit drug product availability.
And this is the plan that Retrophin has in mind - they say so quite clearly in those two presentations linked above. What their presentations don't go into is that this strategy has been under constant legal attack. They also don't go into another issue: the use of REMS at all. Thalidomide, of course, is under all kinds of restrictions and has plenty of hideous risks to manage. Bosentan's not exactly powdered drink mix, either - patients require monthly liver function tests (risk of hepatotoxicity) and monitoring of their hematocrit (risk of anemia). But what about Thiola/tiopronin? It's not under any risk management restrictions that I can see. Its side effects seem to be mainly diarrhea and nausea, which does not put it into the "This drug is so dangerous that we can't let any generic company get ahold of our pills" category. So how is Retrophin going to make this maneuver work?
Update: more on this issue here.
Category: Business and Markets | Drug Prices | Why Everyone Loves Us
September 10, 2014
Bizarre news from Evotec - see what you make of this press release:
Evotec AG was informed that US company Hyperion Therapeutics, Inc. ("Hyperion") is terminating the development of DiaPep277(R) for newly diagnosed Type 1 diabetes.
In a press release published by Hyperion on 08 September 2014 at market opening in the US, the company states that it has uncovered evidence that certain employees of Andromeda Biotech, Ltd. ("Andromeda"), which Hyperion acquired in June 2014, engaged in serious misconduct, involved with the trial data of DiaPep277. Hyperion announced that it will complete the DIA-AID 2 Phase 3 trial, but will terminate further development in DiaPep277.
Here's the Hyperion press release, and it details a terrible mess:
The company has uncovered evidence that certain employees of Andromeda Biotech, Ltd., which Hyperion acquired in June 2014, engaged in serious misconduct, including collusion with a third-party biostatistics firm in Israel to improperly receive un-blinded DIA-AID 1 trial data and to use such data in order to manipulate the analyses to obtain a favorable result. Additional evidence indicates that the biostatistics firm and certain Andromeda employees continued the improper practice of sharing and examining un-blinded data from the ongoing DIA-AID 2 trial. All of these acts were concealed from Hyperion and others.
The Company has suspended the Andromeda employees known to be involved, is notifying relevant regulatory authorities, and continues to investigate in order to explore its legal options. Hyperion employees were not involved in any of the improper conduct.
What a nightmare. All biomedical data are vulnerable to outright fraud, and it gives a person the shivers just thinking about it. I can only imagine the reactions of Hyperion's management when they heard about this, and Evotec's when Hyperion told them about it. What, exactly, the Andromeda people (and the third-party biostatistics people) thought they were getting out of this is an interesting question, too - did they hope to profit if the company announced positive results? That's my best guess, but I'm not sleazy enough (I hope) to think these things through properly.
Category: Business and Markets | Clinical Trials | The Dark Side
I'd seen various solventless reactions between solid-phase components over the years, but never tried one until now. And I have to say, I'm surprised and impressed. I can't quite say which literature reference I'm following, unfortunately, because it might conceivably give someone a lead on what I'm making at the moment, but it's a reference that I found as a new technique for an old reaction. Doing it in solution gives you a mess, but just grinding up the two solid reactants and the reagent, in a mortar and pestle, gives you a very clean conversion. The stuff turns into a sort of ugly clay inside the mortar, but looks are deceiving. I feel like an alchemist. Consider me a convert to the solventless lifestyle - I'll try this again on some other reaction classes when I get the chance. Anyone else ever ground up some solids and made a new product?
Category: Life in the Drug Labs
Retraction Watch has a rare look behind the peer review curtain in the (now notorious) case of the STAP stem cell controversy. This was the publication that claimed that stem-like cells could be produced by simple acid treatment, and this work has since been shown to be fraudulent. Damaged reputations, bitter accusations, and one suicide have been the result so far, and there are still bent hubcaps wobbling around on the asphalt.
The work was published in Nature, but it had been rejected from Science and Cell before finding a home there. That's not unusual in itself - a lot of groundbreaking work has had a surprisingly difficult time getting published. But the kinds of referee reports this got were detailed, well-argued, and strongly critical, which makes you wonder what Nature's reviewers said, and how the work got published in the form it did, with most (all?) of the troublesome stuff left in.
Retraction Watch has obtained the complete text of the referee comments from the Science submission process and published them. Here are some highlights from just the first reviewer:
. . .This is such an extraordinary claim that a very high level of proof is required to sustain it and I do not think this level has been reached. I suspect that the results are artifacts derived from the following processes: (1) the tendency of cells with GFP reporters to go green as they are dying. (2) the ease of cross contamination of cell lines kept in the same lab. . .
. . .The DNA analysis of the chimeric mice is the only piece of data that does not fit with the contamination theory. But the DNA fragments in the chimeras don’t look the same as those in the lymphocytes. This assay is not properly explained. If it is just an agarose gel then the small bands could be anything. Moreover this figure has been reconstructed. It is normal practice to insert thin white lines between lanes taken from different gels (lanes 3 and 6 are spliced in). Also I find the leading edge of the GL band suspiciously sharp in #2-#5. . .
This report and the other two go on to raise a long list of detailed, well-informed criticisms about the experimental design of the work and the amount of information provided. Solutions and reagents are not described in enough detail, images of the cells don't quite show what they're supposed to be showing, and numerous useful controls and normalizations are missing outright. The referees in this case were clearly very familiar with stem cell protocols and behavior, and they did exactly what they were supposed to do with a paper whose claims were as extraordinary as these were.
Had any of this stuff been real, meeting the objections of the reviewers would have been possible, and would have significantly improved the resulting paper. This process, in fact, handed the authors a list of exactly the sorts of objections that the scientific community would raise once the paper did get published. And while rejections of this sort are not fun, that's just what they're supposed to provide. Your work needs to be strong enough to stand up to them.
Congratulations to the Science and Cell editorial teams (and reviewers) for not letting this get past them. I would guess that publication of these reports will occasion some very painful scenes over at Nature, though - we'll see if they have any comment.
Category: The Dark Side | The Scientific Literature
September 9, 2014
Google's Calico venture, the company's out-there move into anti-aging therapy, has made the news by signing a deal with AbbVie (the company most of us will probably go on thinking of as Abbott). That moves them into the real world for sure, from the perspective of the rest of the drug industry, so it's worth taking another look at them. (It's also worth noting that Craig Venter is moving into this area, too, with a company called Human Longevity. Maybe as the tech zillionaires age we'll see a fair amount of this sort of thing).
On one level, I applaud Google's move. There's a lot of important work to be done in the general field of aging, and there are a lot of signs that human lifespan can be hacked, for want of a better word. The first thought some people have when they think of longer lifespan is that it could be an economic disaster. After all, a huge percentage of our healthcare money is already spent in the last years of life as it is - what if we make that period longer still? But it's not just sheer lifespan - aging is the motor behind a lot of diseases, making them more likely to crop up and more severe when they do. The dream (which may be an unattainable one) is for longer human lifespans, in good health, without the years of painful decline that so many people experience. Even if we can't quite manage that, an improvement over the current state of things would be welcome. If people stay productive longer, and spend fewer resources on disabling conditions as they age, we can come out ahead on the deal rather than wondering how we could possibly afford it.
Google and AbbVie are both putting $250 million into starting a research site somewhere in the Bay Area (and given the state of biotech out there, compared to a few years ago, it'll be a welcome addition). If things go well, each of them has also signed up to contribute as much as $500 million more to the joint venture, but we'll see if that ever materializes. What, though, are they going to be doing out there?
Details are still scarce, but FierceBiotechIT says that "a picture of an IT-enabled, omics-focused operation has emerged from media reports and early hiring at the startup". That sounds pretty believable, given Google's liking for (and ability to handle) huge piles of data. It also sounds like something that Larry Page and Sergey Brin would be into, given their past investments. But that still doesn't tell us much: any serious work in this area could be described in that fashion. We'll have to use up a bit more of our current lifespans before things get any clearer.
So I mentioned above that on one level I like this - what, you might be asking, is the other level on which I don't? My worry is what I like to call the Andy Grove Fallacy. I applied that term to Grove's "If we can improve microprocessors so much, what's holding you biotech people back?" line of argument. It's also a big part of the (in)famous "Can a Biologist Fix a Radio" article (PDF), which I find useful and infuriating in about equal proportions. The Andy Grove Fallacy is the confusion between man-made technology (like processor chips and radios) and biological systems. They're both complex, multifunctional, miniaturized, and made up of thousands and thousands of components, true. But the differences are more important than the similarities.
For one thing, human-designed objects are one hell of a lot easier for humans to figure out. With human-designed tech, we were around for all the early stages, and got to watch as we made all of it gradually more and more complicated. We know it inside out, because we discovered it and developed it, every bit. Living cells, well, not so much. The whole system is plunked right down in front of us, so the only thing we can do is reverse-engineer, and we most definitely don't have all the tools we need to do a good job of that. We don't even know what some of those tools might be yet. Totally unexpected things keep turning up as we look closer, and not just details that we somehow missed - I'm talking about huge important regulatory systems (like all the microRNA pathways) that we never even realized existed. No one's going to find anything like that in an Intel chip, of that we can be sure.
And that's because of the other big difference between human technology and biochemistry: evolution. We talk about human designs "evolving", but that's a very loose usage of the word. Real biological evolution is another thing entirely. It's not human, not at all, and it takes some time to get your head around that. Evolution doesn't do things the way that we would. It has no regard for our sensibilities whatsoever. It's a blind idiot tinkerer, with no shame and no sense of the bizarre, and it only asks two questions, over and over: "Did you live? Did you reproduce? Well, OK then." Living systems are full of all kinds of weird, tangled, hacked-together stuff, layer upon layer of it, doing things that we don't understand and can't do ourselves. There is no manual, no spec sheet, no diagram - unless we write it.
So people coming in from the world of things that humans built are in for a shock when they find out how little is known about biology. That's the shock that led to that Radio article, I think, and the sooner someone experiences it, the better. When Google's Larry Page is quoted saying things like this, though, I wonder if it's hit him yet:
One of the things I thought was amazing is that if you solve cancer, you’d add about three years to people’s average life expectancy. We think of solving cancer as this huge thing that’ll totally change the world. But when you really take a step back and look at it, yeah, there are many, many tragic cases of cancer, and it’s very, very sad, but in the aggregate, it’s not as big an advance as you might think.
The problem is, cancer - unrestrained cellular growth - is intimately tied up with aging. Part of that is statistical. If you live long enough, you will surely come down with some form of cancer, whether it's nasty enough to kill you or benign enough for you to die of something else. But another connection is deeper, because the sorts of processes that keep cells tied down so that they don't take off and try to conquer the world are exactly the ones, in many cases, that we're going to have to tinker with to extend our lifespans. There are a lot of tripwires out there, and many of them we don't even know about yet. I'd certainly assume that Larry Page's understanding of all this is deeper than gets conveyed in a magazine article, but he (and the other Google folks) will need to watch themselves as they go on. Hubris often gets rewarded in Silicon Valley - after all, it's made by humans, marketed to humans, and is rewarded by human investors. But in the biomedical field, hubris can sometimes attract lightning bolts like you would not believe.
Category: Aging and Lifespan | Business and Markets
September 8, 2014
Just how reactive are chemical functional groups in vivo? That question has been approached by several groups in chemical biology, notably the Cravatt group at Scripps. One particular paper from them that I've always come back to is this one, where they profiled several small electrophiles across living cells to see what they might pick up. (I blogged about a more recent effort in this vein here as well).
Now there's a paper out in J. Med. Chem. that takes a similar approach. The authors, from the Klein group at Heidelberg, took six different electrophiles, attached them to a selection of nonreactive aliphatic and aromatic head groups, and profiled the resulting 72 compounds across a range of different proteins. There are some that are similar to what's been profiled in the Cravatt papers and others (alpha-chloroketones), but others I haven't seen run through this sort of experiment at all.
And what they found confirms the earlier work: these things, even fairly hot-looking ones, are not all that reactive against proteins. Acrylamides of all sorts were found to be quite clean, with no inhibition across the enzymes tested, and no particular reaction with GSH in a separate assay. Dimethylsulfonium salts didn't do much, either (although a couple of them were unstable to the assay conditions). Chloroacetamides showed the most reactivity against GSH, but still looked clean across the enzyme panel. 2-bromodihydroisoxazoles showed a bit of reactivity, especially against MurE (a member of the panel), but no covalent binding could be confirmed by MS (must be reversible). Cyanoacetamides showed no reactivity at all, and neither did acyl imidazoles.
Now, there are electrophiles that are hot enough to cause trouble. You shouldn't expect clean behavior from an acid chloride or something, but the limits are well above where most of us think they are. If some of these compounds (like the imidazolides) had been profiled across an entire proteome, then perhaps something would have turned up at a low level (as Cravatt and Weerapana saw in that link in the first paragraph). But these things will vary compound by compound - some of them will find a place where they can sit long enough for a reaction to happen, and some of them won't. Here's what the authors conclude:
An unexpected but significant consequence of the present study is the relatively low inhibitory potential of the reactive compounds against the analyzed enzymes. Even in cytotoxicity assays and when we looked for inhibitor enzyme adduct formation we did not find any elevated cytotoxicity or unspecific modification of proteins. Particularly in the case of chloroacetylamides/-anilides and dimethylsulfonium salts, which we consider to be among the most reactive in this series, this is a promising result. From these results the following consequences for moderately reactive groups in medicinal chemistry can be drawn. Promiscuous reactivity and off-target effects of electrophiles with moderate reactivity may often be overestimated. It also does not appear justified to generally exclude “reactive” moieties from compound libraries for screening purposes, since the nonspecific reactivity may turn out to be much inferior than anticipated.
There are a lot of potentially useful compounds that most of us have never thought of looking at, because of our own fears. We should go there.
Category: Chemical Biology | Chemical News | Drug Assays
September 5, 2014
Here's what sounds like a good idea from VC firm Index Ventures, from the latest issue of BioCentury (same one I referenced the other day). Like many others in the biopharma venture capital world, they're trying to run the "killer experiment" as soon as possible, to see which ideas for new companies look solid. Unlike the others, though, they're planning a web site where they will detail the successes - and the failures. Here's an example:
Founded in 2013, (Purple Pharmaceuticals) was started to identify small molecule inhibitors of proprotein convertase subtilisin/kexin type 9. Two mAbs against PCSK9 are already in Phase III testing to treat hypercholesterolemia with regulatory submissions expected this year: evolocumab from Amgen Inc. and alirocumab from partners Regeneron Pharmaceuticals Inc. and Sanofi.
Grainger said the antibodies have limitations, as they require high doses to suppress PCSK9 activity and once-weekly or once-monthly infusions. Thus a pill that could match the PCSK9 inhibition of the biologics could be “the holy grail” of lowering LDL cholesterol.
Purple began by trying to identify small molecules that were highly selective for PCSK9 over other proprotein convertases because, as Grainger noted, “PCSK9 is a member of a large family of enzymes that do some pretty critical things.”
The killer experiment, he said, “was to ask could we make a small molecule that was selective over these other proprotein convertases, and could we demonstrate that it would lower LDL cholesterol?”
After a year, Purple had identified some hits selective for PCSK9, but a conversation with researchers at the Genentech Inc. unit of Roche led to the realization that the virtual company would need to run a second experiment.
“We learned from that interaction with Genentech that they had also run a PCSK9 screening program the same way we had,” Grainger said. “They discovered that their hits, while preventing PCSK9 from cleaving an external substrate, did not prevent PCSK9 from cleaving itself.”
Purple learned that in vivo, PCSK9 is auto-activated by cleaving itself — meaning the only important interaction to inhibit is PCSK9 auto-activation, not interactions with external substrates.
Purple’s second experiment showed none of its small molecule hits that inhibited PCSK9 interaction with an external substrate also inhibited auto-activation.
“Therefore we were able to kill a project which had spent at that stage only about £300,000 over a year, only to discover at the critical moment that it didn’t have the profile that we wanted,” Grainger said. “We were able to terminate that without having created any infrastructure, without having spent a painful amount of money prosecuting the project.”
That story illustrates a number of points about drug discovery. First off, congratulations to those involved for being able to definitively test a hypothesis; that's the engine at the heart of all scientific research. And as they say, it was good to be able to do that without having spent too much money and time, because both of those have a way of getting a bit out of control as complication after complication gets uncovered. Investors start getting jumpy when you keep coming back to them saying "Well, you know, it turns out that. . . ", but you know, it often turns out that way.
The next thing this story shows is that when you see an obvious gap in the landscape, there may well be a good reason for it. PCSK9 antibodies are widely thought to be potential blockbusters; a huge battle is shaping up in that area. So why no small molecules, eh? That's the question that launched Purple, it seems, and it's a valid one. But it turns out to have a valid answer, one that others in the field had already discovered. I suspect that the people behind this effort were, at the same time they were characterizing their lead molecules, also beating all the bushes for the sort of information that they obtained from Genentech. Somebody must have tried small-molecule PCSK9 inhibitors, you'd think, so what happened? Were those projects abandoned for good reasons, or was there still some opportunity there that a new company could claim for itself?
There may well be more to this story, though, than the Index Ventures people are saying. Update: there is - see the end of this post. The autocatalytic cleavage of PCSK9 was already well-known - pretty much everything in that protease family works that way. (The difference is that with PCSK9, the prodomain part of the protein stays on longer - details of its cleavage were worked out in 2006). And in this 2008 paper from the Journal of Lipid Research, we find this:
Several approaches for inhibiting PCSK9 function are theoretically feasible. Because autocatalytic cleavage is required for the maturation of PCSK9, a small-molecule inhibitor of autocatalysis might be useful, provided that it was specific for PCSK9 processing and did not lead to a toxic accumulation of misfolded PCSK9. Small molecules that block the PCSK9-LDL receptor interactions would likely be efficacious, although designing inhibitors of protein-protein interactions is a tall order. Antisense approaches pioneered by Isis Pharmaceuticals (Carlsbad, CA) are well suited for liver targets, and studies in mice suggest that this approach is efficacious for PCSK9. Finally, there is considerable interest in developing antibody therapeutics to inhibit PCSK9-LDL receptor interactions.
Even more to the point is the paper that that JLR piece is commenting on. That one demonstrates, through studies of mutated PCSK9 proteins, that its catalytic activity does not seem to be necessary at all for its effects on LDL receptors (a result that had already been suggested in cell assays). Taken together, you'd come away with the strong impression that inhibiting PCSK9's catalytic activity, other than stopping it from turning itself into its active form, had a low probability of doing anything to cholesterol levels. And you'd come away with that impression in 2008, at the latest.
So Purple's idea was a longer shot than it appeared on the surface, not that the real information was exactly buried deep in the literature. They shouldn't have needed someone at Genentech to tell them that PCSK9's autocatalysis was the real target - I've never worked in the area at all, and I found this out in fifteen minutes on PubMed while riding in to work. They must have had more reason to think that an assay for PCSK9's exogenous activity would be worth running - either that, or this story has gotten garbled along the way.
But this example aside, I applaud the idea of making these early-stage calls public. And I agree with the Index Ventures folks that this should actually help academics and others unused to drug discovery to see what needs to be done to actually launch an idea out into the world. I look forward to seeing the web site - and perhaps to hearing a bit more about what really happened at Purple.
Update: David Grainger of Index Ventures has more in the comments, and says that there is indeed more to the story. He points out that mutations of PCSK9 were found that inhibited its autocatalytic activity (such as this one), and that work had appeared that suggested that molecules that inhibited only the autocatalytic activity could be useful. This is what Purple was seeking - the BioCentury piece makes things sound a bit different (see above), but the problem seems to have been that molecules that inhibited PCSK9's activity against other substrates turned out not to inhibit its activity against itself. If I'm interpreting this right, then, Genentech's contribution was to point out that the autocatalytic activity couldn't be modeled by looking at another substrate.
Category: Business and Markets | Cardiovascular Disease | Drug Assays
September 4, 2014
A reader sends along this paper, on some small molecules targeting the C2 domain of coagulation factor VIII. It illustrates some points that have come up around here over the years, that's for sure. The target is not a particularly easy one: a hit would have to block the interaction of that protein domain with a membrane surface. There is something of a binding pocket down in that region, though, and there were some hits reported from a screen back in 2004. Overall, it looks like a lot of targets that show up, especially these days - you're trying to affect protein conformation by going after a not-necessarily-made-for-small-molecules cavity. Possible, but not something that's going to light up a screening deck, either.
And many of the things that do show up are going to be false positives of one sort or another. That's always the tricky part of doing low-hit-rate screening. The odds are excellent that any given "hit" will turn out not to be real, since the odds are against having any hits at all. This is especially a red flag when you screen something like this and you get a surprisingly decent hit rate. You should suspect fluorescence interference, aggregation, impurities, any of the other myriad ways that things can be troublesome rather than assume that gosh, this target is easier than we thought.
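That intuition is just Bayes' rule, and the numbers are worth running once. Here's a back-of-the-envelope sketch in Python; every input (the prior rate of true hits, the assay's sensitivity, and its false-positive rate) is an invented illustration, not data from any real screen:

```python
# Bayes' rule for screening: P(real binder | assay reads positive).
# All numbers below are illustrative assumptions, not real assay statistics.

def hit_ppv(prior, sensitivity, false_positive_rate):
    """Probability that an assay 'hit' is a genuine binder."""
    true_pos = prior * sensitivity
    false_pos = (1 - prior) * false_positive_rate
    return true_pos / (true_pos + false_pos)

# A tough target: 1 in 10,000 library members is a real hit, the assay
# catches 80% of them, and 0.5% of the inactives read out as hits anyway
# (fluorescence interference, aggregation, impurities, and so on).
ppv = hit_ppv(prior=1e-4, sensitivity=0.8, false_positive_rate=0.005)
print(f"{ppv:.1%}")  # prints 1.6% with these numbers
```

Even with a fairly well-behaved assay, a rare enough true-hit rate drives the fraction of genuine hits down into the low single digits. That's the arithmetic behind treating a surprisingly generous hit rate as a warning sign rather than a cause for celebration.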
It's often a chemist who's in charge of dumping these buckets of cold water (if you have the help of the people who set up the assay, so much the better). Traditionally, it's one of the biology project champions who gets enthusiastic about the great list of compounds, but if you have someone who's been burned by false positives a few times, then so much the better, too. It's not fun to knock down all these "hits" and "leads", but someone's got to do it, otherwise everyone's time will be wasted to an even more painful extent.
And you should be especially worried when your screen turns up compounds like some of the ones in this paper. Yep, it's our old friends the rhodanines, everybody's cheap date of the screening deck. These compounds have come up around here many times, because they keep on showing up in the flippin' literature. In this case, the authors did some virtual screening over the ChemBridge collection and then moved on to assays against the protein itself, eventually finding a number of active compounds in the micromolar range. The compounds look a lot like the ones from 2004, since those were used as the template for screening, and that was a pretty ugly rhodanine-infested set, too.
Indeed, most of the compounds they found are pretty unattractive - the aforementioned rhodanines, lots of nitroaromatics, some other heterocycles that also hit more often than one would like. I would feel better about these sorts of papers if the authors acknowledged somewhere that some of their structures are frequent hitters and might be problematic, but you don't often see that: a hit is a hit, and everything's equally valid, apparently. I would also feel better if there were something in the experimental section about how all the compounds were checked by LC/MS and NMR, but you don't often see that, either, and I don't see it here. Implicitly trusting the label is not a good policy. Even if the particular compounds are the right ones in this case, not checking them shows a lack of experience (and perhaps too trusting a nature where organic chemistry is concerned).
But let's cross our fingers and assume that these are indeed the right compounds. What does it mean when your screening provides you with a bunch of structures like this? The first thing you can say is that your target is indeed a low-probability one for small molecules to bind to - if most everything you get is a promiscuous-looking ugly, then the suspicion is that only the most obliging compounds in a typical screening collection will bother looking at your binding site at all. And that means that if you want something better, you're really going to have to dig for it (and dig through a mound of false positives and still more frequent hitters to find it).
Why would you want to do that? Aren't these tool compounds, useful to find out more about the biology and behavior of the target? Well, that's the problem. If your compounds are rhodanines, or from other such badly-behaved classes, then they are almost completely unsuitable as tool compounds. You especially don't want to trust anything they're telling you in a cellular (or worse, whole-animal) assay, because there is just no telling what else they're binding to. Any readout from such an assay has to be viewed with great suspicion, and what kind of a tool is that?
Well then, aren't these starting points for further optimization? It's tempting to think so, and you can give it a try. But likely as not, the objectionable features are the ones that you can't get rid of very easily. If you could ditch those without paying too much of a penalty, you would have presumably found more appealing molecules in your original screen and skipped this stage altogether. You might be better off running a different sort of screen and trying for something outside of these classes, rather than trying to synthesize a silk purse out of said sow's ear. If you do start from such a structure, prepare for a lot of work.
As mentioned, the problem with a lot of papers that advance such structures is that they don't seem to be aware of these issues at all. If they are, they certainly don't bring them up (which is arguably even worse). Then someone else comes along, who hasn't had a chance to learn any of this yet, either, and reads the paper without coming out any smarter. They may, in fact, have been made slightly less competent by reading it, because now they think that there are these good hits for Target Z, for one thing, and that the structures shown in the paper must be OK, because here they are in this paper, with no mention of any potential problems.
The problem is, there are a lot of interesting targets out there that tend to yield just these sorts of hits. My own opinion is that you can then say that yes, this target can (possibly) bind a small molecule, if those hits are in fact real, but just barely. If you don't even pick up any frequent hitters, you're in an even tougher bind, but if all you pick up are frequent hitters, it doesn't mean that things are that much easier.
Category: Academia (vs. Industry) | Drug Assays | The Scientific Literature
September 3, 2014
A reader sends along this query, and since I've never worked around monoclonal antibodies, I thought I'd ask the crowd: how much of a read on safety do you get with a mAb in Phase I? How much Phase I work would be necessary to feel safe going on to Phase II, from a tox/safety standpoint? Any thoughts are welcome. I suspect the answer is going to depend greatly on what said antibody is being raised to target.
Category: Drug Development | Toxicology
I always enjoy BioCentury's "Back to School" issue this time of year, and this time they're being more outspoken than usual. (That link is free access). The topic is pricing:
Last year, (we) argued biopharma companies can no longer assume the market will support premium pricing, even for drugs that deliver meaningful and measurable improvements over the standard of care.
This year, BioCentury’s 22nd Back to School essay goes on to argue that the last bastion of free pricing is crumbling, and biotech and pharma had better start experimenting with new pricing models based on value for money while they still have the chance.
The wake-up call was the launch of Sovaldi (sofosbuvir) from Gilead Sciences Inc.
Payers, reimbursement authorities and health technology assessment agencies almost universally — with the exception of Germany — acknowledge the drug is a breakthrough for patients with HCV.
At $84,000, the drug is clearly cost effective for a subset of HCV patients who would otherwise progress to expensive sequelae such as liver transplant. But its broad indication includes a majority of patients whose disease won’t progress to the point of costly interventions. And doing the math makes it obvious that treating even a fraction of eligible patients would be a staggering sum for payers to absorb.
What Gilead has done - thanks, guys - is to accelerate a number of trends that were already looking like trouble.
With Sovaldi as the stimulus, government officials, payers, reimbursement authorities and patient groups are fighting back against high drug prices with renewed vigor. For these stakeholders, biopharma’s arguments that drug developers must be compensated for the cost and risk of creating medical breakthroughs don’t hold water.
The easiest response of payers and consumers to industry’s argument is: not my problem.
Far worse, biopharma’s historical arguments about the cost and risk of drug development are giving ammunition to academics, legislators, health technology assessment bodies and payers to argue that the costs of developing and manufacturing drugs plus a “reasonable” margin should be the basis for price.
Industry needs to wrest the discussion away from a cost-plus system that would essentially turn biopharmaceutical companies into utilities, cutting off the lifeblood of innovation.
We seem to be too busy testing the limits of what insurance will pay for to worry about that right now, unfortunately. As the essay goes on to show, companies (Gilead and Alexion, for starters) are getting requests from regulators and legislators to provide an exact breakdown of just what it cost to develop their latest drugs. Demonstrate to us, in other words, that your pricing is justified. The next step beyond that is for these authorities to disagree with the numbers and their interpretation, and to suggest - and then enforce - their own. And I'm pretty sure that the industry would rather avoid that.
Cost-plus pricing places no value on the benefits provided by medicines and eliminates the incentives for biopharma industry innovation and for risk-taking in poorly understood diseases where many failures are likely.
The right question is not how much does R&D cost, but how to measure the benefit to the patient, payer and society; how to value that benefit over time; and how to distribute the risk should the expected benefit not be realized.
The answers are not obvious, and many approaches will need to be tested. Undoubtedly, in many settings the current systems for data collection, coding and reimbursement are not adequate to the task.
But that is no excuse for inaction. The current system of drug pricing and reimbursement is unsustainable and will be fixed — with or without the industry’s participation.
BioCentury suggests several things that should be looked at. First of all would be pricing per course of treatment, rather than per unit dose. That brings the spotlight more onto what the drug is supposed to be accomplishing - and if that also spotlights some of them that aren't accomplishing as much as they're supposed to, well, so be it. The industry should also consider risk-sharing arrangements, taking on more of the downside if a drug doesn't work as well as anticipated, with the opportunity to pick up more gains if it exceeds expectations. Another idea would be pricing models where the payments are spread out over the time that the drug benefits a patient, rather than all of it being up front. In general, we need to make the connection between new drugs and their benefits easier to see, which in turn makes their pricing easier to justify.
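To make the spread-out, risk-sharing idea concrete, here's a toy payment model in Python. Every figure and the installment-forgiveness rule are invented purely for illustration; real outcomes-based contracts are far more involved than this:

```python
# Toy outcomes-based payment schedule: the payer owes an annual
# installment only while the drug is still delivering benefit.
# All figures and the forgiveness rule are invented for illustration.

def payment_schedule(total_price, years, benefit_by_year):
    """Split total_price over `years`, forgiving the installment in any
    year where the treatment is no longer working for this patient."""
    installment = total_price / years
    return [installment if benefit_by_year[y] else 0.0
            for y in range(years)]

# A hypothetical $100,000 therapy paid over 5 years; benefit is lost
# after year 3, so the last two installments are forgiven.
sched = payment_schedule(100_000, 5, [True, True, True, False, False])
print(sched)       # [20000.0, 20000.0, 20000.0, 0.0, 0.0]
print(sum(sched))  # 60000.0 - the payer pays only for benefit received
```

The point of a structure like this is that it prices the outcome rather than the pill, which is exactly the shift from unit-dose pricing that the essay is arguing for.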
And if some of those prices turn out to be too high, well, that's our problem. We have to be ready to accept it when we have drugs that don't work as well for some conditions as we want. Only if we can do that can we turn around and charge the higher prices for the ones that are truly effective. I've made the argument many times that companies, not just drug companies, should be able to charge what they want to for their goods. And in the abstract, that's true. But in the world we live in, politics will intrude, big-time, if the drug industry tries to always extract the maximum revenue for everything, every time.
The downside for biopharma companies would be lower prices for drugs that provide incremental or modest benefits. But that reality is coming one way or another. The upside is a better shot at continued premium prices for real breakthroughs — although probably not as high as historical premiums — plus the potential for preferred formulary placement and earlier market access for many drugs.
But it's a tragedy-of-the-commons situation, because even though some of these ideas for different pricing, and even the calls for restraint, may well be in the industry's best interests, individual companies look at each other and say "You first". But as the old political saying has it, if you're not at the table, then you're on the menu.
Category: Drug Prices
September 2, 2014
I wanted to let readers know of a fun new book that's out this week. Randall Munroe, of webcomic XKCD fame, has written What If?: Serious Scientific Answers to Absurd Hypothetical Questions. There are a lot of truly odd ones in there, and he takes them on as best he can. I'm glad to say that I'm quoted in the chapter on "What would happen if you made a periodic table of cube-shaped bricks, where each brick was made of the corresponding element?" (That should give you an idea of the sorts of questions that come in to him; it makes my mail look fairly sane by comparison). And no, you wouldn't want to do that one - consider astatine and francium, for starters.
Category: Book Recommendations
There is some good news from the clinic today. Novartis reported data on LCZ696, a combination therapy for congestive heart failure, and the results have really grabbed a lot of attention. (The trial had been stopped early back in March, so the news was expected to be good). This is a combo of the angiotensin II antagonist valsartan and a neprilysin (neutral endopeptidase) inhibitor, AHU-377.
Compared to enalapril, the standard ACE inhibitor therapy for CHF, the Novartis combo lowered the risk of cardiovascular death by 20% and the risk of hospitalization by 21%, while having at least as good a safety profile as the generic ACE drug. Those are powerful arguments for the company to make, both to physicians and to insurance payers, so the future of the therapy, barring any sudden misfortunes, looks assured. There's not a lot that you can do for people with congestive heart failure as it is, and this looks like a real advance.
As Matthew Herper mentions, though, this isn't the first time that a similar combination has been tried in CHF. A few years ago, Bristol-Myers Squibb had a major failure with a single drug that inhibited both the ACE and neprilysin enzyme pathways, Vanlev (omapatrilat). That compound had a persistent problem with angioedema, as detailed here, and that led to its eventual rejection by the FDA on risk/benefit grounds, after a great deal of expensive Phase III work. Back in 2002, in the early days of this blog, I predicted that no ACE/endopeptidase combination would ever see the light of day again, which shows you how much I know about it. But I wasn't alone, that's for sure. It's very interesting and surprising that LCZ696 has worked out as well as it has, and it's well worth wondering what made the difference. Balance between the two pathways? Having a receptor antagonist on the angiotensin end rather than an enzyme inhibitor? Whatever it was, it seems to have done the trick.
The only question I have about the new combo is how it would compare to an ACE/diuretic combination, which (from what I know) is also a standard course of therapy for CHF patients. On the other hand, you'd expect that a diuretic might also be added to LCZ696 treatment - it was shown that it could be combined with omapatrilat, since they're all different mechanisms.
And one other point - I always make this one in these kinds of situations. I'm willing to bet that critics of the drug industry, who like to go on about "me-too" drugs and lazy industrial research efforts, would have had LCZ696 on the list of eye-rolling follow-up drugs (that is, if they'd been paying attention at all). I mean, the angiotensin pathway is thoroughly covered by existing drugs, and neprilysin/NEP has been targeted before, too (both by omapatrilat and by Pfizer's so-called "female Viagra", UK-414,495). But there's an awful lot we don't know about human medicine, folks.
Update: here's a deep look at the IP and patent situation around the combo.
Update 2: and here's a detailed exchange about the way the trial was conducted and the drug's possible impact.
Category: "Me Too" Drugs | Cardiovascular Disease | Clinical Trials
Exelixis is a company with a very interesting history, but that's in the sense of "much rather read about it than experience it", like the interesting parts in a history book. At one point they had a really outsized pipeline of kinase inhibitors, to the point where it could be hard to keep track of everything, but these projects have largely blown up over the last few years. Big collaboration deals have been wound down, compounds have been returned to them, and so on.
Most recently, the company has been developing cabozantinib for prostate cancer. Along the way (2011) they had a dispute with the FDA about clinical trial design - the company had a much speedier surrogate endpoint in mind, but the agency wasn't having it. At this point, there are enough options in that area to make overall survival the real endpoint that matters, and the FDA told them to go out and get that data instead of messing around with surrogates. So the company plowed ahead, and yesterday announced Phase III results. They weren't good. The compound showed some effects in progression-free survival (PFS), but seems to have no benefit in the longer-running overall survival (OS) measurement. And that one's the key.
There's no way to put a good spin on it, either. The same press release that announced the results also announced that the company was going to have to "initiate a significant workforce reduction" in order to make it through the two other ongoing cabozantinib trials (for renal cell carcinoma and advanced hepatocellular carcinoma). Exelixis has had some pretty brutal workforce reductions over the years already, so this would appear to be cutting down as far as things can be cut (from 330 employees down to 70). And those two remaining indications are tough ones, too - if the compound shows efficacy, it'll be very good news, but those are not the first battlefields you'd choose to fight on. The prostate results don't offer much room for optimism, but on the other hand, the compound has orphan drug status for medullary thyroid cancer, for which it has shown real benefit in a disease that otherwise has no real treatment at all.
So Exelixis will try to stay alive long enough to get through these last trials, and if nothing comes up there, I'd have to think that this will be it for them. You wouldn't have predicted this back in about 2002, but you can't predict anything important in this industry to start with.
Category: Cancer | Clinical Trials
August 29, 2014
I'm going to be taking an extra day of vacation before the kids start back to school, so I'm adding to the Labor Day weekend today. Blogging will resume on Tuesday, unless something gigantic happens before then. If I can come up with something appropriate, maybe I'll put up a recipe!
Category: Blog Housekeeping
August 28, 2014
Here's a short video history of the FDA, courtesy of BioCentury TV. The early days, especially Harvey Wiley and the "Poison Squad", are truly wild and alarming by today's standards. But then, the products that were on the market back then were pretty alarming, too. . .
Category: Drug Industry History | Regulatory Affairs
A reader has sent along the question: "Have any repurposed drugs actually been approved for their new indication?" And initially, I thought, confidently but rather blankly, "Well, certainly, there's. . . and. . .hmm", but then the biggest example hit me: thalidomide. It was, infamously, a sedative and remedy for morning sickness in its original tragic incarnation, but came back into use first for leprosy and then for multiple myeloma. The discovery of its efficacy in leprosy, specifically erythema nodosum leprosum, was a complete and total accident, it should be noted - the story is told in the book Dark Remedy. A physician gave a suffering leprosy patient the only sedative in the hospital's pharmacy that hadn't been tried, and it had a dramatic and unexpected effect on their condition.
That's an example of a total repurposing - a drug that had actually been approved and abandoned (and how) coming back to treat something else. At the other end of the spectrum, you have the normal sort of market expansion that many drugs undergo: kinase inhibitor Insolunib is approved for Cancer X, then later on for Cancer Y, then for Cancer Z. (As a side note, I would almost feel like working for free for a company that would actually propose "insolunib" as a generic name. My mortgage banker might not see things the same way, though). At any rate, that sort of thing doesn't really count as repurposing, in my book - you're using the same effect that the compound was developed for and finding closely related uses for it. When most people think of repurposing, they're thinking about cases where the drug's mechanism is the same, but turns out to be useful for something that no one realized, or those times where the drug has another mechanism that no one appreciated during its first approval.
Eflornithine, an ornithine decarboxylase inhibitor, is a good example - it was originally developed as a possible anticancer agent, but never came close to being submitted for approval. It turned out to be very effective for trypanosomiasis (sleeping sickness). Later, it was approved for slowing the growth of unwanted facial hair. This led, by the way, to an unfortunate and embarrassing period where the compound was available as a cream to improve appearance in several first-world countries, but not as a tablet to save lives in Africa. Aventis, as they were at the time, partnered with the WHO to produce the compound again and donated it to the agency and to Doctors Without Borders. (I should note that, with a molecular weight of 182, eflornithine just barely missed my no-larger-than-aspirin cutoff for the smallest drugs on the market).
Drugs that affect the immune system (cyclosporine, the interferons, anti-TNF antibodies etc.) are in their own category for repurposing, I'd say. They've had particularly broad therapeutic profiles, since that's such a nexus for infectious disease, cancer, inflammation and wound healing, and (naturally) autoimmune diseases of all sorts. Orencia (abatacept) is an example of this. It's approved for rheumatoid arthritis, but has been studied in several other conditions, and there's a report that it's extremely effective against a common kidney condition, focal segmental glomerulosclerosis. Drugs that affect the central or peripheral nervous system also have Swiss-army-knife aspects, since that's another powerful fuse box in a living system. The number of indications that a beta-blocker like propranolol has seen is enough evidence on its own!
C&E News did a drug repurposing story a couple of years ago, and included a table of examples. Some others can be found in this Nature Reviews Drug Discovery paper from 2004. I'm not aware of any new repurposing/repositioning approvals since then, but there's an awful lot of preclinical and clinical activity going on.
Category: Clinical Trials | Drug Development | Drug Industry History | Regulatory Affairs
August 27, 2014
Here is the updated version of the "smallest drugs" collection that I did the other day. Here are the criteria I used: the molecular weight cutoff was set, arbitrarily, at aspirin's 180. I excluded the inhaled anaesthetics, only allowing things that are oils or solids in their form of use. As a small-molecule organic chemist, I only allowed organic compounds - lithium and so on are for another category. And the hardest one was "Must be in current use across several countries". That's another arbitrary cutoff, but it excludes pemoline (176), for example, which has basically been removed from the market. It also gets rid of a lot of historical things like aminorex. That's not to say that there aren't some old drugs on the remaining list, but they're still in there pitching (even sulfanilamide, interestingly). I'm sure I've still missed a few.
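The criteria above amount to a simple filter, which can be sketched in Python. The candidate records below are a hand-typed toy list (molecular weights from memory, and the in-current-use flags are placeholders), not a curated drug database:

```python
# Hypothetical re-creation of the "smallest drugs" filter described above.
# The candidate records are illustrative, not a curated database.

ASPIRIN_MW = 180.16  # the arbitrary molecular-weight ceiling

candidates = [
    # (name, MW, is_organic, is_inhaled_anaesthetic, in_current_use)
    ("metformin",     129.16, True,  False, True),
    ("sulfanilamide", 172.20, True,  False, True),
    ("pemoline",      176.17, True,  False, False),  # largely withdrawn
    ("lithium",         6.94, False, False, True),   # not an organic compound
    ("eflornithine",  182.17, True,  False, True),   # just over the cutoff
]

smallest_drugs = [
    name for (name, mw, organic, inhaled, current) in candidates
    if mw <= ASPIRIN_MW and organic and not inhaled and current
]
print(smallest_drugs)  # ['metformin', 'sulfanilamide']
```

Each of the four conditions in the comprehension corresponds to one of the stated criteria, which is why pemoline, lithium, and eflornithine all drop out of this toy list.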
What can be learned from this exercise? Well, take a look at those structures. There sure are a lot of carboxylic acids and phenols, and a lot more sulfur than we're used to seeing. And pretty much everything is polar, very polar, which makes sense: if you're down in this fragment-sized space, you've got to be making some strong interactions with biological targets. These are fragments that are also drugs, so fragment-based drug discovery people may find this interesting as the bedrock layer of the whole field.
Some of these are pretty specialized and obscure - you're only going to see pralidoxime if you have the misfortune to be exposed to nerve gas, for example. But there are some huge, huge sellers on the list, too, drugs that have changed their whole therapeutic areas and are still in constant use. Metformin alone is a constant rebuke to a lot of our med-chem prejudices: who among us, had we never heard of it, would not have crossed it off our lists of screening hits? So give these small things a chance, and keep an open mind. They're real, and they can really be drugs.
Category: Chemical News | Drug Industry History
What scientific journals can you not be bothered to keep up with? I know, sometimes it's tempting to answer "all of them", but a well-informed chemist really should watch what comes out in the better ones. But how about the not-so-better ones? The "Life's too short" ones? Reading journals by RSS gives a person some perspective on signal-to-noise.
One problem is that Elsevier's RSS feeds are sort of perpetually hosed. Are they working now? I haven't checked in a while, because I finally gave up on them. And that means that I don't regularly look at Tetrahedron Letters or Bioorganic and Medicinal Chemistry Letters, even though (once in a while) something interesting turns up there. I look at ACS Medicinal Chemistry Letters more often, just because it has a working RSS feed (and I should note that I've rotated off their editorial board, by the way). Overall, though, I can't say that I miss either of those Elsevier journals, because you have to scroll through an awful lot of. . .stuff. . .to see something worth noting.
The same goes, I'm afraid, for Chemical Communications, and that makes me wonder if it's possible to keep up with the Letters/Communications style journals usefully at all. There are just so many papers pouring through them, and since Chem Comm takes them in from every sort of chemistry there is, vast numbers of them are of little interest to any particular reader. Their mini-review articles are perhaps an attempt to counteract this problem, and the journal also seems to have a slant towards "hot" topics. It's still in my RSS feed, but I look at the numbers of papers that pile up in it, and wonder if I should just delete and get it over with.
Organic Letters, on the other hand, I seem to be able to stay on top of, perhaps because it's focused down to at least organic chemistry (as opposed to Chem Comm). And I find a higher percentage of papers worth looking at than I do in Tet Lett (do others feel the same way?). As for the other short-communications organic chemistry journals, I don't have them in the feed. Synthesis, Syn Comm, Synlett - writing this prompts me to go in and add them, but we'll see over the next couple of months if I regret it.
What it comes down to is that there's room for only a certain number of titles that can be followed as the papers publish. (The rest of them turn up in literature searches, responses to directed queries). And there are only a certain number of titles that are worth following in real time. So to get back to the question at the start of the post, which well-known journals do you find to be not worth the trouble?
Category: The Scientific Literature
August 26, 2014
There have been several analyses that have suggested that phenotypic drug discovery was unusually effective in delivering "first in class" drugs. Now comes a reworking of that question, and these authors (Jörg Eder, Richard Sedrani, and Christian Wiesmann of Novartis) find plenty of room to question that conclusion.
What they've done is to deliberately focus on the first-in-class drug approvals from 1999 to 2013, and take a detailed look at their origins. There have been 113 such drugs, and they find that 78 of them (45 small molecules and 33 biologics) come from target-based approaches, and 35 from "systems-based" approaches. They further divide the latter into "chemocentric" discovery, based around known pharmacophores, and so on, versus pure from-the-ground-up phenotypic screening, and the 35 systems compounds then split out 27 to 8.
As you might expect, a lot of these conclusions depend on what you classify as "phenotypic". The earlier paper stopped at the target-based/not target-based distinction, but this one is more strict: phenotypic screening is the evaluation of a large number of compounds (likely a random assortment) against a biological system, where you look for a desired phenotype without knowing what the target might be. And that's why this paper comes up with the term "chemocentric drug discovery", to encompass isolation of natural products, modification of known active structures, and so on.
Such conclusions also depend on knowing what approach was used in the original screening, and as everyone who's written about these things admits, this isn't always public information. The many readers of this site who've seen a drug project go from start to finish will appreciate how hard it is to find an accurate retelling of any given effort. Stuff gets left out, forgotten, is un- (or over-)appreciated, swept under the rug, etc. (And besides, an absolutely faithful retelling, with every single wrong turn left in, would be pretty difficult to sit through, wouldn't it?) At any rate, by the time a drug reaches FDA approval, many of the people who were present at the project's birth have probably scattered to other organizations entirely, have retired or been retired against their will, and so on.
But against all these obstacles, the authors seem to have done as thorough a job as anyone could possibly do. So looking further at their numbers, here are some more detailed breakdowns. Of those 45 first-in-class small molecules, 21 were from screening (18 of those high-throughput screening, 1 fragment-based, 1 in silico, and 1 low-throughput/directed screening). 18 came from chemocentric approaches, and 6 from modeling off of a known compound.
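Since there are a lot of numbers flying around here, it's worth checking that the subcategories actually add up. A quick tally (figures as quoted above from the paper; the category names are my own shorthand):

```python
# Tally of the first-in-class breakdown quoted above, as a sanity check
# that the subcategories sum to the reported totals.
total_first_in_class = 113

target_based = {"small_molecules": 45, "biologics": 33}  # 78 total
systems_based = 35

small_molecule_origins = {
    "screening": 21,
    "chemocentric": 18,
    "modeled_on_known_compound": 6,
}
screening_detail = {"hts": 18, "fragment": 1, "in_silico": 1, "directed": 1}

assert sum(target_based.values()) + systems_based == total_first_in_class
assert sum(small_molecule_origins.values()) == target_based["small_molecules"]
assert sum(screening_detail.values()) == small_molecule_origins["screening"]

share_target_based = sum(target_based.values()) / total_first_in_class
print(f"{share_target_based:.0%}")  # 69%
```

That last figure, 78 out of 113, is the roughly 69% target-based share that the rest of the discussion turns on.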
Of the 35 systems-based drugs, those 8 that were "pure phenotypic" feature one antibody (alemtuzumab) which was raised without knowledge of its target, and seven small molecules: sirolimus, fingolimod, eribulin, daptomycin, artemether–lumefantrine, bedaquiline and trametinib. The first three of those are natural products, or derived from natural products. Outside of fingolimod, all of them are anti-infectives or antiproliferatives, which I'd bet reflects the comparative ease of running pure phenotypic assays with those readouts.
Here are the authors on the discrepancies between their paper and the earlier one:
At first glance, the results of our analysis appear to significantly deviate from the numbers previously published for first-in-class drugs, which reported that of the 75 first-in-class drugs discovered between 1999 and 2008, 28 (37%) were discovered through phenotypic screening, 17 (23%) through target-based approaches, 25 (33%) were biologics and five (7%) came from other approaches. This discrepancy occurs for two reasons. First, we consider biologics to be target-based drugs, as there is little philosophical distinction in the hypothesis driven approach to drug discovery for small-molecule drugs versus biologics. Second, the past 5 years of our analysis time frame have seen a significant increase in the approval of first-in-class drugs, most of which were discovered in a target-based fashion.
Fair enough, and it may well be that many of us have been too optimistic about the evidence for the straight phenotypic approach. But the figure we don't have (and aren't going to get) is the overall success rate for both techniques. The number of target-based and phenotypic-based screening efforts that have been quietly abandoned - that's what we'd need to have to know which one has the better delivery percentage. If 78/113 drugs, 69% of the first-in-class approvals from the last 15 years, have come from target-based approaches, how does that compare with the total number of first-in-class drug projects? My own suspicion is that target-based drug discovery has accounted for more than 70% of the industry's efforts over that span, which would mean that systems-based approaches have been relatively over-performing. But there's no way to know this for sure, and I may just be coming up with something that I want to hear.
That might especially be true when you consider that there are many therapeutic areas where phenotypic screening is basically impossible (Alzheimer's, anyone?) But there's a flip side to that argument: it means that there's no special phenotypic sauce that you can spread around, either. The fact that so many of those pure-phenotypic drugs are in areas with such clear cellular readouts is suggestive. Even if phenotypic screening were to have some statistical advantage, you can't just go around telling people to be "more phenotypic" and expect increased success, especially outside anti-infectives or antiproliferatives.
The authors have another interesting point to make. As part of their analysis of these 113 first-in-class drugs, they've tried to see what the timeline is from the first efforts in the area to an approved drug. That's not easy, and there are some arbitrary decisions to be made. One example they give is anti-angiogenesis. The first report of tumors being able to stimulate blood vessel growth was in 1945. The presence of soluble tumor-derived growth factors was confirmed in 1968. VEGF, the outstanding example of these, was purified in 1983, and was cloned in 1989. So when did the starting pistol fire for drug discovery in this area? The authors choose 1983, which seems reasonable, but it's a judgment call.
So with all that in mind, they find that the average lead time (from discovery to drug) for a target-based project is 20 years, and for a systems-based drug it's been 25 years. They suggest that since target-based drug discovery has only been around since the late 1980s or so, that its impact is only recently beginning to show up in the figures, and that it's in much better shape than some would suppose.
The data also suggest that target-based drug discovery might have helped reduce the median time for drug discovery and development. Closer examination of the differences in median times between systems-based approaches and target-based approaches revealed that the 5-year median difference in overall approval time is largely due to statistically significant differences in the period from patent publication to FDA approval, where target-based approaches (taking 8 years) took only half the time as systems-based approaches (taking 16 years). . .
The pharmaceutical industry has often been criticized for not being sufficiently innovative. We think that our analysis indicates otherwise and perhaps even suggests that the best is yet to come as, owing to the length of time between project initiation and launch, new technologies such as high-throughput screening and the sequencing of the human genome may only be starting to have a major impact on drug approvals. . .
Now that's an optimistic point of view, I have to say. The genome certainly still has plenty of time to deliver, but you probably won't find too many other people saying in 2014 that HTS is only now starting to have an impact on drug approvals. My own take on this is that they're covering too wide a band of technologies with such statements, lumping together things that have come in at different times during this period and which would be expected to have differently-timed impacts on the rate of drug discovery. On the other hand, I would like this glass-half-full view to be correct, since it implies that things should be steadily improving in the business, and we could use it.
But the authors take pains to show, in the last part of their paper, that they're not putting down phenotypic drug discovery. In fact, they're calling for it to be strengthened as its own discipline, and not (as they put it) just as a falling back to the older "chemocentric" methods of the 1980s and before:
Perhaps we are in a phase today similar to the one in the mid-1980s, when systems-based chemocentric drug discovery was largely replaced by target-based approaches. This allowed the field to greatly expand beyond the relatively limited number of scaffolds that had been studied for decades and to gain access to many more pharmacologically active compound classes, providing a boost to innovation. Now, with an increased chemical space, the time might be right to further broaden the target space and open up new avenues. This could well be achieved by investing in phenotypic screening using the compound libraries that have been established in the context of target-based approaches. We therefore consider phenotypic screening not as a neoclassical approach that reverts to a supposedly more successful systems-based method of the past, but instead as a logical evolution of the current target-based activities in drug discovery. Moreover, phenotypic screening is not just dependent on the use of many tools that have been established for target-based approaches; it also requires further technological advancements.
That seems to me to be right on target: we probably are in a period just like the mid-to-late 1980s. In that case, though, a promising new technology was taking over because it seemed to offer so much more. Today, it's more driven by disillusionment with the current methods - but that means, even more, that we have to dig in and come up with some new ones and make them work.
Category: Drug Assays | Drug Development | Drug Industry History