About this Author
[Photo: College chemistry, 1983]

[Photo: Derek Lowe, the 2002 model]

[Photo: After 10 years of blogging. . .]

Derek Lowe, an Arkansan by birth, got his BA from Hendrix College and his PhD in organic chemistry from Duke before spending time in Germany as a Humboldt Fellow during his post-doc. He's worked for several major pharmaceutical companies since 1989 on drug discovery projects against schizophrenia, Alzheimer's, diabetes, osteoporosis and other diseases. To contact Derek, email him directly (derekb.lowe@gmail.com) or find him on Twitter: Dereklowe

Chemistry and Drug Data:
Drugbank
Emolecules
ChemSpider
Chempedia Lab
Synthetic Pages
Organic Chemistry Portal
PubChem
Not Voodoo
DailyMed
Druglib
Clinicaltrials.gov

Chemistry and Pharma Blogs:
Org Prep Daily
The Haystack
Kilomentor
A New Merck, Reviewed
Liberal Arts Chemistry
Electron Pusher
All Things Metathesis
C&E News Blogs
Chemiotics II
Chemical Space
Noel O'Blog
In Vivo Blog
Terra Sigilatta
BBSRC/Douglas Kell
ChemBark
Realizations in Biostatistics
Chemjobber
Pharmalot
ChemSpider Blog
Pharmagossip
Med-Chemist
Organic Chem - Education & Industry
Pharma Strategy Blog
No Name No Slogan
Practical Fragments
SimBioSys
The Curious Wavefunction
Natural Product Man
Fragment Literature
Chemistry World Blog
Synthetic Nature
Chemistry Blog
Synthesizing Ideas
Business|Bytes|Genes|Molecules
Eye on FDA
Chemical Forums
Depth-First
Symyx Blog
Sceptical Chymist
Lamentations on Chemistry
Computational Organic Chemistry
Mining Drugs
Henry Rzepa


Science Blogs and News:
Bad Science
The Loom
Uncertain Principles
Fierce Biotech
Blogs for Industry
Omics! Omics!
Young Female Scientist
Notional Slurry
Nobel Intent
SciTech Daily
Science Blog
FuturePundit
Aetiology
Gene Expression (I)
Gene Expression (II)
Sciencebase
Pharyngula
Adventures in Ethics and Science
Transterrestrial Musings
Slashdot Science
Cosmic Variance
Biology News Net


Medical Blogs
DB's Medical Rants
Science-Based Medicine
GruntDoc
Respectful Insolence
Diabetes Mine


Economics and Business
Marginal Revolution
The Volokh Conspiracy
Knowledge Problem


Politics / Current Events
Virginia Postrel
Instapundit
Belmont Club
Mickey Kaus


Belles Lettres
Uncouth Reflections
Arts and Letters Daily

In the Pipeline

March 5, 2015

Chemistry Incident in Manchester

Posted by Derek

There was some sort of incident at the University of Manchester yesterday, one that led to an evacuation of the chemistry building and all sorts of haz-mat people being called in. Press reports had it that this was a peroxide-acetone mixture that "crystallized", but you never want to take these things at face value. One can, if one is utterly luckless or foolhardy, produce crystals of triacetone triperoxide with such a mixture, and that would certainly be a great reason to vacate the area. You'd think, though, that no one in a chemistry department would do such a thing. Does anyone from Manchester have any more solid information?

Update: here's more. It was not the chemistry department, but someone in an engineering building (the Pariser building), apparently working on some sort of sustainable plastics research. He did indeed combine acetone and peroxide, and left it sitting for some time, at which point he noticed that it had produced a crystalline precipitate. A quick look at the literature for that mixture would have saved a good deal of trouble, or so it seems.

Comments (34) + TrackBacks (0) | Category: Chemical News

Diversity and Similarity Scoring: Does One Size Ever Fit All?

Posted by Derek

We spend a lot of time talking about compound similarities in this business. All those schemes for virtual screening, to find new activities for old compounds, to predict side effects and general toxicity, and many others besides rely on some sort of measurement of how similar various compounds (and collections of compounds) are to each other.

But how do you determine similarities? Some might answer "By comparing Tanimoto coefficients, of course", but that's an example of a little knowledge being a dangerous thing. Tanimoto distance calculations are a meat grinder, which will grind whatever you shove into the hopper. How, then, are you characterizing the compounds themselves? That's where things get tricky. There are all sorts of "fingerprint" descriptors for molecules, and different ones will give you different measures of similarity (or of chemical diversity, depending on which end of the tube you're looking through).

And the problem is, all of them have a tendency to look funny. Many medicinal chemists have experienced this, looking over a list of compounds ranked by similarities. You come across one with a high score, but it doesn't look that similar to you, because the algorithm liked what look to you like unimportant details. The next list has two compounds that are supposed to be quite different, but they're a methyl ester versus a t-butyl ester, or something of the sort, and just how different is that when the rest of the molecule is the same? And these are just two-dimensional comparisons. If you want to talk similarity in conformational space, and our drug targets generally want to talk that way, then you're in for an even bigger universe of choices and tradeoffs. (Here's a good recent overview from J. Med. Chem.)
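To make that concrete, here's a minimal sketch using the open-source RDKit toolkit (my choice of tool and of molecules, purely for illustration) of how the fingerprint you pick changes the Tanimoto answer for just such a methyl-versus-t-butyl ester pair:

from rdkit import Chem
from rdkit.Chem import AllChem, MACCSkeys
from rdkit import DataStructs

# Hypothetical pair: a methyl ester and its t-butyl analog
methyl_ester = Chem.MolFromSmiles("CC(=O)OC")
tbutyl_ester = Chem.MolFromSmiles("CC(=O)OC(C)(C)C")

# Same two molecules, two different descriptor schemes, two different answers
fingerprints = [
    ("Morgan (radius 2)", lambda m: AllChem.GetMorganFingerprintAsBitVect(m, 2, nBits=2048)),
    ("MACCS keys", MACCSkeys.GenMACCSKeys),
]
for name, fp in fingerprints:
    sim = DataStructs.TanimotoSimilarity(fp(methyl_ester), fp(tbutyl_ester))
    print(f"{name}: Tanimoto = {sim:.2f}")

Neither number is "right"; each descriptor just encodes a different opinion about which molecular features matter.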

The Matsy algorithm is supposed to generate results that look a little less alien, but I haven't used it myself. I'd be interested in hearing what people have found to be the most useful in their own hands for such measurements. Is this always a case-by-case thing, or are there methods that have enough of a Swiss-army-knife character to them to stick with? Any favorites out there?

Comments (3) + TrackBacks (0) | Category: In Silico

Twenty-One Billion Dollars. Really.

Posted by Derek

Ibrutinib seems to be worth even more than everyone thought! As of this morning, AbbVie has won what was apparently a lively bidding war for Pharmacyclics, paying $21 billion for the company. Now that is a lot of money, and I'm not sure that I can make those numbers add up, but presumably there are people at AbbVie who are paid more than I am to figure such things out. (Adam Feuerstein sounds a bit stunned, too). Many news organizations had stories yesterday about how J&J was about to buy them, but if you're willing to pay enough money, you can horn in on any deal you like.

I would assume that the company is going to aggressively move the compound into clinical trials for as many plausible indications/combinations as possible - that's the main way I can see this working out. But as I said last week, this is what keeps people investing in biopharma: back in 2009, you could have had Pharmacyclics for $1 per share or less. Last night's offer was $261.25. David Shaywitz has a good history of ibrutinib here, and it's quite a story. Personally, I think that a lot of Pharmacyclics people are probably just relieved that they're not going to have to attend any more sessions to learn how to become geniuses.

Comments (36) + TrackBacks (0) | Category: Business and Markets | Cancer | Drug Development

March 4, 2015

Neural Networks for Drug Discovery: A Work in Progress

Posted by Derek

There have been many attempts over the years to bring together large amounts of biological and drug activity data, winnow them computationally, and come up with insights that would not be obvious to human eyes. It's a natural thing to imagine - there are so many drug targets in the body, doing so many things, and there's an awful lot of information out there about thousands and thousands of compounds. There's no way that a human observer could pick up on all the things that are going on; you need tireless software to sift through the piles.

The success record of this sort of thing has been mixed, though. Early attempts can now be set aside as underpowered, but what to make of current attempts at "virtual clinical trials" and the like? (We're probably still underpowered for that sort of thing). Less ambitiously, people have tried to mine for new targets and new drug activities by rooting through the Big Data. But this sort of thing is not without controversy: many of us, chemists and biologists alike, don't have the mathematical background to say if the methods being used are appropriate, or what their weaknesses and blind spots might be.

A new paper has gotten me thinking about all this again. It's a collaboration between several researchers at Stanford and Google (press release here) on machine learning for drug discovery. What that means is that they're trying to improve virtual screening techniques, using a very Google-centric approach that might be summed up as "MOAR DATA!" (That phrase does not appear in the paper, sadly).

In collaboration with the Pande Lab at Stanford University, we’ve released a paper titled "Massively Multitask Networks for Drug Discovery", investigating how data from a variety of sources can be used to improve the accuracy of determining which chemical compounds would be effective drug treatments for a variety of diseases. In particular, we carefully quantified how the amount and diversity of screening data from a variety of diseases with very different biological processes can be used to improve the virtual drug screening predictions.

Using our large-scale neural network training system, we trained at a scale 18x larger than previous work with a total of 37.8M data points across more than 200 distinct biological processes. Because of our large scale, we were able to carefully probe the sensitivity of these models to a variety of changes in model structure and input data. In the paper, we examine not just the performance of the model but why it performs well and what we can expect for similar models in the future. The data in the paper represents more than 50M total CPU hours.

I end up with several trains of thought about this kind of thing. On track one, I appreciate that if virtual screening is going to work well, it needs to draw from the largest data sets possible, since there are so many factors at work. But on track two, I wonder how good the numbers going into this hopper really are, since I (like anyone else in the business) have seen some pretty garbagey screening numbers, both in person and in the literature. Piling more noise into the computations cannot improve them, even if your hardware is capable of dealing with landfills of the stuff. (The authors do note that they didn't do any preprocessing of the data sets to remove potential artifacts. The data come from four main sources (see the paper, which is open access, for more), and only one of these has probably been curated to that level.) And that brings us to track three: my innate (and doubtless somewhat unfair) suspicions go up when I see a lot of talk about just how Incredibly Large the data sets are, and how Wildly Intense all the computations were.

Not to be too subtle about it, asking for a virtual screen against some target is like asking for a ditch to be dug from Point A to Point B. Can you dig the ditch, or not? Does it get to where it's supposed to go, and do what a ditch is supposed to do? If so, then to a good approximation, I don't care how many trained badgers you herded in for the job, or (alternatively) about the horsepower and fuel requirements of the earth-moving equipment you rented. If someone spends a lot of time telling me about these things (those engines! those badgers!) then I wonder if they're trying to distract me from what really matters to me, which is the final product.
[Figure: improvement of multitask over single-task models, broken out by target class]
Well, I'm willing to accept that that's not a completely fair criticism, but it's something that always crosses my mind, and I may not be alone in this. Let's take a look at the ditch - uh, the virtual screening - and see how well it came out.

In this work, we investigate several aspects of the multitask learning paradigm as applied to virtual screening. We gather a large collection of datasets containing nearly 40 million experimental measurements for over 200 targets. We demonstrate that multitask networks trained on this collection achieve significant improvements over baseline machine learning methods. We show that adding more tasks and more data yields better performance. This effect diminishes as more data and tasks are added, but does not appear to plateau within our collection. Interestingly, we find that the total amount of data and the total number of tasks both have significant roles in this improvement. Furthermore, the features extracted by the multitask networks demonstrate some transferability to tasks not contained in the training set. Finally, we find that the presence of shared active compounds is moderately correlated with multitask improvement, but the biological class of the target is not.

As the paper notes, this is similar to Merck's Kaggle challenge of a couple of years back (and I just noticed this morning that they cite that blog post, and its comments, as an example of the skepticism that it attracted from some parts of the med-chem community). In this case, the object isn't (yet) to deliver up a bunch of virtual screening hits, so much as it is to see what the most appropriate architecture for such a search might be.
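For anyone who hasn't looked at one of these, the "multitask" idea is a shared trunk of network layers feeding a separate output head for each assay. Here's a minimal sketch of that shape in PyTorch - my own illustration, emphatically not the paper's code, which ran on Google's internal training system:

import torch
import torch.nn as nn

class MultitaskNet(nn.Module):
    """Shared trunk, one output head per assay/target task.
    A sketch of the general architecture, not the paper's implementation."""
    def __init__(self, n_features=1024, n_hidden=512, n_tasks=200):
        super().__init__()
        self.trunk = nn.Sequential(
            nn.Linear(n_features, n_hidden), nn.ReLU(),
            nn.Linear(n_hidden, n_hidden), nn.ReLU(),
        )
        # One active/inactive logit per task; the tasks share everything upstream
        self.heads = nn.ModuleList([nn.Linear(n_hidden, 1) for _ in range(n_tasks)])

    def forward(self, x):
        h = self.trunk(x)
        return torch.cat([head(h) for head in self.heads], dim=1)  # (batch, n_tasks)

# Toy usage: 8 hypothetical compound fingerprints in, one logit per task out
model = MultitaskNet()
logits = model(torch.randn(8, 1024))  # shape (8, 200)

The point of the shared trunk is that sparse data for any one assay can still shape the common representation, which is presumably where the "more tasks help" effect comes from.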

One of the biggest problems with these papers (as this one explicitly states) is that the criteria used to evaluate the performance of these systems are not standardized. So it's basically impossible to compare one analysis with another, because they're scoring by different systems. But that graphic gives some idea of how things worked on different target classes. The Y axis is the difference between using multitask models (as in this paper) and single-task neural network models, and it shows that in most cases, most of the time, multitask modeling was better. But I note that almost every class has some cases where this doesn't hold, and that (for reasons unknown) the GPCR targets seem to show the least improvement.

But what I don't know is how well these virtual screening techniques compared to the actual screening data. The comparisons in the paper are all multi-task versus single-task (which, to be fair, is the whole focus of the work), but I'd be interested in an absolute-scale measurement. That shows up, though, in Table B2 in the appendix, where they use Jain and Nicholls' "enrichment" calculation. Assuming that I'm reading these correctly, which may or may not be warranted, the predictions look to be anywhere from about 5% to about 25% better than random, depending on what false-positive rate you're looking at, with occasional hops up to the 40% better range. Looking at the enrichment figures, though, I don't see this model performing much better than the Random Forest method, which has already been applied to med-chem work and activity prediction many times. Am I missing something in that comparison? Or does this all have quite a ways to go yet?
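A quick note on the metric for anyone unfamiliar with it: ROC enrichment, in the Jain/Nicholls sense, is the fraction of true actives recovered at a given false-positive rate, divided by that rate. Here's a rough sketch of my own - consult the original paper for the exact definition:

import numpy as np

def roc_enrichment(scores, labels, fpr=0.05):
    """TPR at the given FPR, divided by the FPR; ~1.0 means no better than random.
    A rough illustration of the metric, not the authors' code."""
    order = np.argsort(scores)[::-1]            # rank compounds, best score first
    labels = np.asarray(labels)[order]
    n_inactive = int((labels == 0).sum())
    cutoff = int(np.ceil(fpr * n_inactive))     # decoys allowed at this FPR
    fp_counts = np.cumsum(labels == 0)          # running count of decoys accepted
    idx = int(np.searchsorted(fp_counts, cutoff))
    tpr = labels[: idx + 1].sum() / (labels == 1).sum()
    return tpr / fpr

# Sanity check with random scores: enrichment should hover around 1.0
rng = np.random.default_rng(0)
print(roc_enrichment(rng.random(1000), rng.integers(0, 2, 1000)))

On that scale 1.0 is pure chance, so "25% better than random" corresponds to an enrichment of about 1.25.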

Comments (21) + TrackBacks (0) | Category: In Silico

Our Friend the Sulfur Atom

Posted by Derek

Here's a review on a topic that I'll bet not too many medicinal chemists have thought about in detail: noncovalent interactions with sulfur atoms. Sulfur's a weird element - small enough to fit unobtrusively into organic structures, but just big enough to show some orbital effects that you don't get one row above in the periodic table.

This paper, from a well-known group of authors at BMS and Amgen, highlights the way that a bivalent sulfur's sigma* orbitals can interact with the lone pairs on oxygen and nitrogen atoms, giving you a conformational effect similar to a hydrogen bond. This sort of thing is probably encountered more in a negative sense than a positive one - people try to get away from a sulfur and find that nothing else quite does the trick. (As an aside, that's always been one of my big problems with the dynamic combinatorial chemistry schemes that use disulfide formation and exchange as their engine. If you want to turn your hit into a more druglike entity, you're faced with replacing a disulfide, and there's really nothing else quite like a disulfide, either). Definitely worth a look.

Comments (11) + TrackBacks (0) | Category: Chemical News

The SEC Tells Companies to Speak Up

Posted by Derek

The SEC is specifically warning drug companies to be more forthcoming about their dealings with the FDA, and it's definitely got a point. Too many companies try to act as if important, material information from the FDA is under some sort of confidential shield, when it really should be disclosed as soon as possible. I should clarify that: we're talking about important, material bad news. Good news gets trumpeted out within milliseconds. As is only proper.

If the SEC is serious about this - and I'm hoping that they are - then we should look for some well-publicized actions by the agency to signal that this isn't just talk. It'll be worth keeping an eye on the regulatory landscape over the next few months to see if this really happens. . .

Comments (2) + TrackBacks (0) | Category: Regulatory Affairs

March 3, 2015

Doing the Right Thing

Posted by Derek

This is well outside my field of chemistry (a paper on iridium-catalyzed silane couplings), but I wanted to highlight it anyway. A grad student working on the project realized that one of the key products had been mischaracterized (not hard to do, with these compounds), and that this invalidated a good part of the published paper. The student went to their professor (Rory Waterman at Vermont) who promptly retracted the paper with a full explanation of what happened. That is exactly how it's supposed to work, and all the shenanigans that go on in the literature are enough to make you forget that. Thanks to Prof. Waterman, and I hope he goes on to give his student a glowing recommendation.

Comments (13) + TrackBacks (0) | Category: The Scientific Literature

Put Away The Lecture Bottle

Posted by Derek

For fluorination fans, here's a new way to get trifluoromethyl groups in. Trifluoromethyl iodide is a useful reagent, or it would be if it weren't a gas. That makes it annoying to measure out and work with, especially on a small scale. But Tobias Ritter's lab has a new way to deal with the stuff: turns out that the reagent forms a 1:1 complex with tetramethylguanidine or DMSO, and the resulting liquids are shelf-stable. The paper shows that one or the other of these can substitute in many of the reactions of the neat reagent, which should make them a lot more convenient. And you can now get both the trifluoromethyl and pentafluoroethyl variations commercially, so I'll be ordering some today. I think that Quintus is a customer as well. Happy fluorinating!

Comments (4) + TrackBacks (0) | Category: Chemical News

Dinitrophenol: A Possible Comeback

Posted by Derek

I've mentioned metabolic uncoupling as a possible diabetes therapy. The idea is that your mitochondria will plow through large amounts of lipids under these conditions, and there's plenty of evidence that knocking down free fatty acids and tissue lipid stores would be of great benefit for Type II patients. The problem is that this therapy has a well-deserved reputation for having a low therapeutic index.

2,4-dinitrophenol is a pretty unlikely-looking drug, but it most certainly has metabolic effects. It was on the market for a while (many decades ago) as a weight-loss therapy, and no one can say that it won't make you lose weight. The danger is that you lose it all the way down to your dry bone mass, though, because it doesn't take much extra DNP to give you dangerous amounts of overheating and perhaps even a critical shortage of ATP, which frivolous organs like your heart and brain seem to have become dependent on. Various foolhardy types (extreme bodybuilders, etc.) have experimented with it since then, but it's just too dangerous to recommend to anyone. Even short of death, there were other unpleasant side effects.

But there have been reports from time to time that the compound might still have legitimate uses, and a recent one from Gerry Shulman's group at Yale is getting a lot of attention. Shulman is one of the world's experts on diabetes and metabolism, and his lab has been working on DNP for some years now. The latest version is a time-release form of the drug, one that delivers a Cmax up to 100-fold lower than that of the standard human dose, if DNP can be said to have one.

This formulation does a dramatic job of reversing diabetes symptoms in rodent models, and fatty liver disease as well. Shulman is working on taking this toward human clinical trials, and the animal results make a good case. If this were any other drug showing these effects, people would be moving it forward as fast as possible - it's just the history of DNP that's going to make things more difficult. But we have the example of thalidomide - if that can find a therapeutic niche, anything can. The next key step will be rodent and dog tox studies, and if DNP can clear those, then I would see no reason not to take it on into Phase I and beyond. Who'd have thought?

Comments (23) + TrackBacks (0) | Category: Diabetes and Obesity

March 2, 2015

Chlorine Azide For Everyone

Posted by Derek

One of my "Things I Won't Work With" compounds may have moved into a zone where I'd actually use it. This new paper in JOC describes in situ preparation of small amounts of chlorine azide, which can then react with alkene to give useful beta-chloro azide products. This way, you in dilute solution, with slow release - the only possible conditions I'd consider if working with the stuff.

The paper points out, most accurately, that the halogen azides "have been regarded as challenging compounds to work with". And even with this latest variation, there were challenges, and things that you should definitely not do:

The ClN3 thus produced is a gas and can be isolated by transfer out of the reaction flask with a gentle stream of nitrogen, bubbled into an organic solvent placed in a receiving flask. While we performed this procedure a number of times, it proved to be occasionally explosive, and we strongly discourage its practice. Instead, we developed a safe procedure to generate ClN3 in small quantities in situ and in the presence of an alkene with which it can immediately react. . .Even though these conditions diminish the hazards of chlorine azide substantially, it is still necessary to use common sense and care.

I'll report if I try this out. I'm still going to have to have a good reason, of course, but this at least takes things into the realm of possibility!

Comments (15) + TrackBacks (0) | Category: Chemical News

Of Proteasome Inhibitors and PAINs

Posted by Derek

Amgen is out with some new data that might well justify their purchase of Onyx a year and a half ago. A big driver for that deal was the proteasome inhibitor Kyprolis (carfilzomib), and the company just reported results in a head-to-head trial in multiple myeloma versus the Takeda/Millennium competition, Velcade.

Kyprolis comes out looking pretty good:

In the "ENDEAVOR" study announced Sunday night, patients with multiple myeloma still progressing despite one to three prior therapies were randomized to receive Kyprolis plus a steroid or Velcade plus the same steroid. Following an interim analysis, patients in the Kyprolis arm had a 47% reduction in the risk of disease worsening or death compared to Velcade. At the median, Kyprolis patients went 18.7 months before their disease progressed compared to 9.4 months for Velcade patients. The benefit favoring Kyprolis was statistically significant.

The study is continuing to see what the overall survival benefit might be, but I'm sure that Amgen is hopeful that those numbers will translate into something robust. Velcade itself will be off patent in a couple of years, so Amgen is going to need data to make the case that people should get their drug rather than the generic competition.

It's worth taking a look at the structures of those two drugs again. Velcade is, of course, a boronic acid, the first to get approved as a human therapy. "Boron is for morons" went the joke for many years, as boron-containing enzyme inhibitors ran into trouble on their way to the clinic. But that particular moronic compound brought in over two billion dollars in sales last year, which a lot of us smart people who've avoided boron have never been able to ring up with anything we've made. And Kyprolis is rather funny-looking, too. It came out of some natural products work in Craig Crews' lab at Yale, and it's a modified tetrapeptide with an epoxide hanging off it. This is another structure that would get the fisheye from a lot of people, for both those reasons, but there it is, out there in the clinic working well.

Now, I take the point that targeting the proteasome is not exactly like coming up with a new diabetes drug. You're going to be treating some very sick patients, many of whom are (otherwise) going to die quickly. The sorts of structures that a project is willing to look at do need to be calibrated a bit for these things. But a lot of us - including me, a few years ago - would have calibrated these two drug structures right off the side of the page, and that clearly would have been a bad decision. Yet another reminder for us to loosen up a bit.

"Right", I can hear some readers saying. "Here's one of those guys who's death on PAINs and makes fun of people's screening compounds, telling us to loosen up on funny structures". A fair point, but here's where I get off saying this. The big rap on boronic acids and peptidic drugs is that they have poor PK, and that's something that can be checked out. You'll note that morpholine hanging off the end of carfilzomib, and I suspect that's on there for just those PK reasons. Epoxides, for their part, are actually a lot less reactive and nonspecific than their reputation has it.

PAINs, though, are not looked at with distaste because of their pharmacokinetics or how they've tended to perform in the clinic. They're trouble because they tend to give false screening results and/or hit in way too many assays to be good candidates for further development. So when they show up in a paper as great new screening hits, and the authors show no sign of realizing their problematic nature in just the sorts of assays that they've been running, then yes, it's a problem. Anyone developing a boronic acid, a tetrapeptide, or an epoxide should know that they have a lot of PK and tox assays waiting for them, and that no one will believe in these compounds until they start passing them. For PAINs, this disbelief kicks in very early, as it should. The very first step in the whole process, activity in the screening assay, may well be bogus.

And in the same way that there are PK and tox assays to sort out odd-looking structures that have been known to have trouble in these areas, there are screening-level assays that need to be run for suspicious-looking hits. They need to be shown not to be redox cyclers, not to hit all sorts of other targets, not to decompose to reactive species under the assay conditions, not to soak up thiol nucleophiles nonspecifically, not to hit because of fluorescent interference, and so on, and so on. Peptides and boronic acids have a reputation for not clearing the later hurdles in drug development, so if you're working on them, be prepared and let the data guide you. They'll work, or they won't, and you'll get a clear answer. False positives are not a big problem in most PK assays.

But PAINs are the compounds that tend to not even clear the first hurdles, while looking like they have. Let the data guide you there, too, but that means getting the correct data with those follow-up assays after your initial screen, not declaring them wonderful new leads and pushing on regardless.

Comments (6) + TrackBacks (0) | Category: Cancer | Clinical Trials | Drug Assays | Drug Development

Gene Editing Therapies Thunder Along

Posted by Derek

Here's a one-stop way to get caught up with all the therapeutic gene editing technologies out there, courtesy of Nature Medicine. Huge amounts of money are flowing into this area, because CRISPR/Cas9 looks so solid, compared to the various other things that have been tried in the past. But zinc finger nucleases and TALENs are also getting their shots.

The challenge is often going to be getting the population of new, modified cells to take off. Blood disorders are a natural fit, since you can pull bone marrow, modify it, kill off the original, and replace it (as is done with the CAR-T immunotherapies for cancer, another area where dumptrucks of money are pulling up and unloading). But for other cell populations, it's going to be a bit trickier, and my guess is that the gene editing may (perversely) turn out to be the easy part. We don't always have a good handle on stem-cell precursors (their behavior or their localization), and clearing out the original defective cell population will (in some cases) be an issue. (In others, you can probably wait for them to die off and cycle in your improved versions).

And gene editing comes in several varieties - disruption of a defective gene alone, replacement of one with a working form, both of those at once, or just splicing in a working form in addition to what's already there (which in some cases would probably be enough to correct things). There's a lot of experimentation to be done, and a lot of it is going to end up being done in the clinic, with human patients. It's an open question how predictive animal models will be for some of these things - you'd think that would work, but we haven't modified too many living human genomes yet. I would count on all sorts of twists and turns along the way - dramatic successes, dramatic failures, and some that looked like one of those but were actually the other. Deep pockets, patience, and no small amount of courage (both medical and financial) will be needed.

Comments (18) + TrackBacks (0) | Category: Biological News

February 27, 2015

PAINs And Good Old Med Chem

Posted by Derek

The mainstreaming of the PAINs concept continues, with editorials from Jonathan Baell in ACS Med. Chem. Letters and Dan Erlanson in J. Med. Chem.. Both are definitely worth a read.

Baell emphasizes that real hits tend to have real SAR around them. You can take pieces off the structures, and activity falls away; you can move things around and it changes. Eventually, you can make more potent compounds. I will say that I've seen some oddball programs where things resolutely refused to make sense, and the projects were advanced by a sort of human-wave attack. But even in those cases, the compounds were, in fact, advanced, even if it wasn't always sensible how they got there. PAINs compounds, on the other hand, just seem to sort of sit there. You try this, you try that, you try the other damn thing, and you never seem to get much better than the original hit, even though the activity might wander around some with structural variations.

Their hallmarks typically comprise the following: little or no medicinal chemistry optimization, unconvincing structure–activity relationships (SAR), relative lack of improvement in biological activity to meaningful levels that often hover around the micromolar mark, and molecular modeling described as though it is an experimental observation of relevant binding sites. Also, the literature is frequently ignored as an important SAR source of evidence that similar compounds appear to be hitting different targets and could be promiscuous.

Baell also makes a good point about the size of many screening efforts, particularly academic ones. It's a worthy cause to screen against a tough target, but you should be prepared to go big or go home:

Screening too few compounds, perhaps even as small a number as several thousand, is a contributory problem. An understandable attraction to academic researcher with a tight budget is the relative affordability and that such an exercise may not even require access to robotics but just the services of an on-site research assistant. However, screening too small a library of unbiased commercially available compounds may not return a progressable hit. In contrast, it will certainly produce artifacts that are then more likely to attract unwarranted further attention. . .

. . .I believe that even screens of 30,000 compounds are suboptimal, especially for difficult targets. For highly druggable targets, some progressable hits will plausibly be unearthed, but why not screen 150,000–250,000 compounds to find better hits that could potentially save years of medicinal chemistry optimization?

Well, money, for one thing, but he's right that most of the expense of doing a 250K screen will have already been encountered while doing a 30K one. And if your assay has false-positive problems, running it across a large collection will merely bury you in more junk, which is all the more reason to take care before you start screening at all. The point stands, though: if you want to go after a protein-protein interaction or a transcription factor (or some such), a 10,000 compound screen is completely inadequate. Update: as has been pointed out in the comments, sheer numbers alone are not necessarily enough. It is possible to have large piles of horrible compounds, and making them larger is not the way to go. And there is no way to run a "focused library" screen to improve your chances (no matter what the vendors might tell you), because no one has any good idea of what to focus on for such targets. Going on to cell-based and phenotypic screens can actually exacerbate the problem, because it opens up new ways to get fooled (membrane-perturbing amphipathic compounds, etc.).

Erlanson's piece is a commentary on the recent paper discussing thiol-scavenging compounds as promiscuous screening hits. He makes the point that what looks like SAR may just be SIR (a structure-interference relationship), since there are no classes of PAINs where every member hits the assay. You have to run counterscreens, control experiments, and mechanistic tests, particularly if you're looking at a compound that might be reactive (like a thiol scavenger). And even experienced chemists might not realize that they're looking at one, so run those experiments anyway, for peace of mind and to avoid wasting even more time down the line. Reactive compounds can indeed be great drugs (look at ibrutinib, or heck, look at aspirin), but they can also spit out confusingly convincing false positive data in a screen and scuttle your chances of finding anything real.

Another point that Erlanson's article emphasizes is that computational screens of structure, while useful, are always going to be inadequate. It's hard to write such filters so that they catch everything that they're supposed to catch, for one thing, but the bigger problem is that there are many more problematic structures than are contained in any set of filters at all. There is, alas, no substitute for doing more work and taking more care, and this annoying truth has been with us for a long, long time.

Baell makes a similar point, that structural filters are necessary but nowhere near sufficient. And I particularly liked this part:

We need to recognize the astonishingly limiting bottleneck created by the pace with which hit discovery through HTS so overwhelms the subsequent and necessary medicinal chemistry optimization. Impatient biologists may find it hard to understand that HTS heralds just the start of a multiyear journey of medicinal chemistry optimization. In this context, the medicinal chemists among us could do a better job explaining to biologists that drugs are not discovered through screening or design, but principally through medicinal chemistry.

Preach it! And let this message not only be heard in the biology labs, but yea, even unto the upper reaches of management. None of our spiffy equipment, none of our advanced techniques will do the job by themselves. Someone has to make compounds, and then make some more, and there's no way around it. A significant part of the history of the last 25 years of the drug industry consists of attempts to evade this part of the process, but they haven't worked, and they're not going to.

Comments (28) + TrackBacks (0) | Category: Chemical News | Drug Assays

AZ Spins Off Anti-Infectives

Posted by Derek

Word came yesterday that AstraZeneca is spinning off its anti-infectives division into a separate subsidiary company. This does not appear to have been their first choice - they've been shopping these assets around for actual cash, but had no takers (at least, on their terms). Here's what they're telling the employees:

. . .We are creating a stand-alone subsidiary company, focused exclusively on the research and development of our early-stage antibiotic pipeline, including the novel gyrase inhibitor AZD0914, which is currently in Phase II for the treatment of gonorrhoea. AstraZeneca will invest $40 million in the new company. We anticipate that the new company will be led by and include staff from AstraZeneca’s Innovative Medicines Unit. . .

. . .As already communicated to our employees, this new way of conducting small molecule antibiotic research and early-stage development will impact approximately 95 employees based in Waltham, MA. It is anticipated that some of the researchers impacted by the changes will take up roles in the new company, or in other parts of AstraZeneca. We are fully committed to supporting our people through the transition.

FierceBiotech has more, agreeing that this seems to have been something of a last resort for the company. But it probably could have been worse, too - better a "subsidiary company" than no company at all, if you're working there, I should think. We'll see how this goes in the coming months. My guess is that the gyrase inhibitor's fortunes will determine a lot of the story.

Comments (15) + TrackBacks (0) | Category: Business and Markets | Infectious Diseases

February 26, 2015

Ibrutinib's Rise

Posted by Derek

Pharmacyclics and their reactive kinase inhibitor Imbruvica (ibrutinib) have come up a few times around here in the past. (And someone who was involved in its earlier development at Celera shows up here in the comments from time to time). Here's a piece at Bloomberg that spells out just what a massive return buying that compound has turned out to deliver. Pharmacyclics picked it up in 2006 for a few million dollars, and it's expected to have sales of up to $4 billion a year. This is why people keep investing in small pharma and biotech: because neither we, nor they, nor anyone else can predict when this sort of thing will happen next.

Comments (15) + TrackBacks (0) | Category: Cancer | Drug Development

Double-Blinded Peer Review

Posted by Derek

Nature has decided to add an option for double-blind peer review - papers would be sent to the referees without author names or institutional affiliation on them. I think this is a worthwhile idea, but I agree with many of the points in this post over at Retraction Watch by David Vaux in Melbourne.

A big potential problem is that the double-blind system is optional. It's reasonable to assume that papers from Big Names at Big Labs won't bother, because they have more to lose by being covered up. So the double-blinded papers might end up disproportionately from smaller groups who are trying to even the playing field, and if that happens, it risks becoming a negative signal all its own. It might be better if Nature were to take the plunge and blind everything. And what about the editors? They're the ones deciding at the very beginning about whether to send a paper out for review at all, and at a journal like Nature that's a big step in itself. Should the papers be blinded even before they get to that stage? Why not?

It's true that in some cases a reviewer will be able to guess where a given paper came from, or at least to narrow it down. There's no way around that, but I still think that double-blind peer review is a worthwhile idea. Will any other big name journal follow Nature's lead, or go even further?

Comments (24) + TrackBacks (0) | Category: The Scientific Literature

The Latest Fragment List

Posted by Derek

Practical Fragments has an updated list of all the drugs that have made it into the clinic from fragment-based drug design. There are more than thirty of them, and there are probably more that aren't on the list yet. So while fragment-based methods aren't a magic wand - those are on back-order, still - they're certainly a legitimate part of the toolkit, and this list is solid evidence for anyone who needs it. (I still get the question sometimes when I talk about fragments: "So what's it done for anyone so far?")

Comments (6) + TrackBacks (0) | Category: Drug Development

February 25, 2015

All That Cash

Posted by Derek

[Chart: cash held on big biopharma balance sheets, compared with money raised by startups]
From the LifeSciVC blog, here's a revealing comparison of the amount of money going into startups in this business. This is why I tend to get worked up about the number of stock buyback plans among the big pharma players. There is a striking amount of cash sloshing around on the big companies' balance sheets, compared to the amounts being raised to start new companies. Why so much?

In recent years, building up cash reserves has been a trend across all industries, though, so we might be seeing the pharma end of a big secular tendency.

It's also true that pharma companies have always tended to carry larger cash balances than companies in many other sectors. There are a lot of acquisitions in this business, both of individual drug or platform assets and of whole companies, and there's a constant threat of legal action, too. But the big companies have plenty of access to capital, too, which makes that a somewhat less compelling argument. Across different companies and different sectors, investors seem to value each dollar of cash a company is holding across a rather wide range, and the larger drug companies would seem to be toward the lower end of it. On the other hand, it's been argued that knowledge-based businesses (biopharma, IT, etc.) are in a different situation than many other industries, since it's difficult to directly hedge the risks involved in R&D. To make things worse, those risks aren't really correlated with the company's cash flow or other financial indicators: good news and bad news both can come out of a clear sky.

So it's not a surprise that pharma companies should have relatively hefty amounts of cash on hand. But this hefty? Last year saw even more reserves piled up, making many speculate that we're going to be seeing a lot of dealmaking this year. At some point, you'd figure that shareholders will ask for something to be done with all that money (although you'd have expected that to happen before now with Apple, to pick one whopping example).

But that focus on M&A might be part of the answer to the question. I'm riding, as I write this, on a railroad whose deficiencies in maintenance and upkeep have been made painfully, abundantly clear by the run of bad weather we've had in the Boston area. Over that same recent era, though, the same MBTA has opened up a new commuter rail line to the south of the city, tried to expand one of its surface lines into a whole new area (the Green Line to Somerville), and has been expanding tracks and stations on its outer rail lines (including a big project at the one I use). This despite massive amounts of debt on its own balance sheet. My take on this is that expansion is (and has been) more popular than upkeep, and easier to sell, both to customers and to politicians. Building out a new line is far more visible than fixing the switches and tracks on the old ones, even though the latter activity might be a better use of the money.

And M&A might have the same thing going for it in pharma, compared to either putting more money into a company's own R&D or seeding new companies. An acquisition has an immediate effect on the balance sheet, with (for the most part) clearly visible assets changing hands. We bought company X for their drug Y. That's much easier to explain to the investors than saying that we plowed a bunch of money into things that might or might not bear fruit ten or twelve years from now, even though that's a pretty good description of the whole business of drug research.

So that takes us back to the first paragraph: why don't big pharma companies take a bit of their excess cash and use it to grow the whole sector? Because they, and by extension, their investors, don't see that as a worthwhile thing to do. We can argue about that, and I'd argue that an amount of cash that the big companies would hardly even miss could make a huge difference, but it's not that the companies involved have never let it cross their minds. While there are some big company venture funds, they're never going to be as big as the balance sheets might suggest they could be. That money is for signaling, not for investing.

Comments (18) + TrackBacks (0) | Category: Business and Markets

February 24, 2015

An Antibiotic Discovery Prize

Posted by Derek

Ezekiel Emanuel of the University of Pennsylvania has a proposal in the New York Times for a prize in antibiotic discovery:

Let’s use prize money. What if the United States government — maybe in cooperation with the European Union and Japan — offered a $2 billion prize to the first five companies or academic centers that develop and get regulatory approval for a new class of antibiotics? As the XPrize — a foundation that runs competitions to spur innovations for difficult problems that often aren’t being addressed — and others have demonstrated, prizes for lofty goals can catalyze the creation of hundreds of unexpected research teams with novel approaches to old challenges. The prestige, bragging rights and renewed sense of mission created by such a prize would alone make an investment in research worthwhile.

I think that's a good idea, and I'd submit that this is about the minimum amount needed. (It's certainly a lot more realistic than this proposal). Regulatory approval is certainly the appropriate endpoint. If someone wants to put more money into it, I'd tighten up the requirements a bit to say a new mechanism of action against gram-negative organisms, since hitting the gram-positive ones is (somewhat) easier and (somewhat) less critical. Knowing that such a payout is waiting would make a good case for a number of small companies to try a lot of unusual things, and unusual things are just what's needed in this area. Let's see if anyone expresses serious interest. . .

Comments (38) + TrackBacks (0) | Category: Infectious Diseases

Cutbacks at Merck Serono

Posted by Derek

Merck/Serono (Merck KGaA of Darmstadt) is apparently cutting back in Billerica, their US research site. According to FierceBiotech, though, they're not saying by how much:

"a limited number of discovery positions have been impacted in our research organization. We are working to re-assign as many of those positions as possible in the organization as we drive all of our research efforts forward."

. . .the company declined to say just how many staffers are involved in the reorganization, though a spokesman did say the cutbacks are focused in their Billerica facility near Boston.

The company has been having its problems over the last few years, as that link details. Billerica and Darmstadt are the main R&D sites, after they closed down the Serono site in Switzerland.

Comments (7) + TrackBacks (0) | Category: Business and Markets

February 23, 2015

Is FEP Ready For the World?

Posted by Derek

Here's a paper that basically throws down the computational gauntlet. A large group of authors from Schrödinger, Nimbus, Columbia, Yale, and UC-Irvine say that their implementation of free energy perturbation (FEP) calculations really does predict active compounds at a significantly higher rate than other computational methods do - or than straight med-chem intuition and synthesis.

Here, we report an FEP protocol that enables highly accurate affinity predictions across a broad range of ligands and target classes (over 200 ligands and 10 targets). The ligand perturbations include a wide range of chemical modifications that are typically seen in medicinal chemistry efforts, with modifications of up to 10 heavy atoms routinely included. Critically, we have applied the method in eight prospective discovery projects to date, with the results from two of those projects disclosed in this work. The high level of accuracy obtained in the prospective studies demonstrates the ability of this approach to drive decisions in lead optimization.

They say that these improvements are due to a better force field, better sampling algorithms, increased computing power, and automated work flow to get through things in an organized fashion. The paper shows some results against BACE, CDK2, JNK1, MCL1, p38, PTP1b, and thrombin, which seems like a reasonably diverse real-world set of targets. Checking the predicted binding energies versus experiment, most of them are within 1 kcal/mol, and only about 5% are 2 kcal/mol or worse. (To put these into med-chem terms, the rule of thumb is that a 10x difference in Ki represents 1.36 kcal/mol). These calculations should, in theory, be capturing the lot: hydrogen bonding, hydrophobic interaction, displacement of bound waters, pi-pi interactions, what have you. The two prospective projects mentioned are IRAK4 and TYK2. In both of these, the average error between theory and experiment was about 1 kcal/mol.
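For reference, that 1.36 kcal/mol rule of thumb falls straight out of the free-energy relation, evaluated at room temperature:

\Delta G = -RT \ln K_i \quad\Rightarrow\quad \Delta\Delta G_{10\times} = RT \ln 10 \approx (1.987 \times 10^{-3}\ \mathrm{kcal\,mol^{-1}\,K^{-1}})(298\ \mathrm{K})(2.303) \approx 1.36\ \mathrm{kcal/mol}

So an average error of 1 kcal/mol means getting Ki right to within roughly a factor of five.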

But this is not yet the Rise of the Machines:

The preceding notwithstanding, a highly accurate and robust FEP methodology is not, in any way, a replacement for a creative and technically strong medicinal chemistry team; it is necessary to generate the ideas for optimization of the lead compound that are synthetically tractable and have acceptable values for a wide range of druglike properties (e.g., solubility, membrane permeability, metabolism, etc.). Rather, the computational approach described here can be viewed as a tool to enable medicinal chemists to pursue modifications and new synthetic directions that would have been considered too risky without computational validation or to eliminate compounds that would be unlikely to meet the desired target affinity. This is particularly significant when considering whether to make an otherwise highly attractive molecule that may be synthetically challenging. If such a molecule is predicted to achieve the project potency targets by reliable FEP calculations, this substantially reduces the risk of taking on such synthetic challenges.

There's no reason, a priori, why this shouldn't work; it's all down to limits in how well the algorithms at the heart of the process deal with the energies involved, and how much computing force can be thrown at the problem. To that point, these calculations were done by running on graphics processing units (GPUs), which really do have a lot more oomph for the buck (although it's still not as trivial as just plugging in some graphics processor cards and standing back). GPUs are getting more capable all the time themselves, and are a mass-market item, which bodes well for their application in drug design. Have we reached the tipping point here? Or is this another in a very long series of false dawns? I look forward to seeing how this plays out.

Comments (56) + TrackBacks (0) | Category: In Silico

Para-Chloro Was Good Enough For Them, So It's Good Enough For Me

Posted by Derek

How many of the molecular pieces that we use in medicinal chemistry are historical accidents? I've wondered this from time to time. There's no doubt that drug structures are partly driven by ease of synthesis/commercial availability (these two go hand in hand), and these in turn are influenced by which reactions and feedstocks were exploited earlier. The Grignard reaction came well before palladium coupling methods, but there's no reason that it had to, just to pick one example.
[Image: a para-chlorophenyl substituent]
This new paper in J. Med. Chem. is what has me thinking about this again. The authors, from AstraZeneca, show that their in-house chemists tend to think of para-aromatic substituents more often than the other regioisomers, and that this preference is mirrored in the commercially available reagents (and indeed, in marketed drugs). The paper looks into sources of this bias - cost, the 1972 Topliss tree paper, and so on, but no single factor appears to be at work. What does seem to be going on is a self-reinforcing bias - there are more p-aromatics in the screening deck, so more of them hit. And there are more commercially available compounds with the structure, so more of them get made in turn.

We believe that ultimately the present day bias is now likely due to unjustified personal preferences and overused at the expense of meta and ortho regioisomers as well as other potentially diverse bioisosteres. This last point is an important conclusion. The bias for p-ClPh has propagated throughout the years and influenced design and synthesis plans. A simple extension into disubstituted aromatics revealed that chemists favor the similarly substituted compounds (e.g., diCl, diF, diMeO), with many of these having at least one element in the para position. This analysis also illustrated that many disubstituted compounds are underrepresented in the public domain, highlighting an opportunity for screening collection differentiation.

I suspect that there are many more such biases, based on availability of different heterocycles, lack of stereoselective methods in some areas, etc. Our screening collections (and our building block catalogs) are the work of human beings, making conscious and unconscious choices, not some random slice of chemical space.
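If you want to see how your own deck stacks up, a tally along these lines takes a few minutes with RDKit SMARTS patterns. This is a sketch of my own devising, not the AZ authors' analysis, and the input list is a hypothetical stand-in for a real compound file:

from rdkit import Chem

# Regioisomer patterns for dichlorinated benzenes
patterns = {
    "ortho-Cl2": Chem.MolFromSmarts("Clc1ccccc1Cl"),
    "meta-Cl2":  Chem.MolFromSmarts("Clc1cccc(Cl)c1"),
    "para-Cl2":  Chem.MolFromSmarts("Clc1ccc(Cl)cc1"),
}

# Toy stand-in; point this at your own SMILES collection
smiles = ["Clc1ccc(Cl)cc1CC(=O)N", "Clc1cccc(Cl)c1O", "Clc1ccc(Cl)cc1"]
mols = [m for m in (Chem.MolFromSmiles(s) for s in smiles) if m is not None]

counts = {name: sum(m.HasSubstructMatch(p) for m in mols)
          for name, p in patterns.items()}
print(counts)  # here: {'ortho-Cl2': 0, 'meta-Cl2': 1, 'para-Cl2': 2}

If the para counts dwarf the others across a big collection, that's the bias in action.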

Comments (18) + TrackBacks (0) | Category: Chemical News | Drug Industry History

February 20, 2015

More Price Hikes on Obscure Medication

Posted by Derek

Get ready for some twists and turns here. I wrote back in September about the business model of Retrophin, a company whose plans (at least for the foreseeable future) were to find small-market drugs, buy them from their obscure producers, and then raise their prices into geosynchronous orbit. That particular story blew up in unforeseen ways, capped by the company's CEO, Martin Shkreli, making a bizarre appearance on a public forum (Reddit), which was of a piece with his often ill-advised activity on Twitter. Shkreli was ousted by Retrophin's board a few weeks later, amid accusations of stock-trading irregularities.

He went on to form Turing Pharmaceuticals, whose business plan, by contrast, was (at least at first) to find some small-market drugs, buy them from their obscure producers, and raise their prices into geosynchronous orbit. As reported here by Adam Feuerstein, his first target was praziquantel (Biltricide), the anthelmintic made by Bayer:

Shkreli is negotiating with the German drug giant Bayer to purchase marketing rights to Biltricide, a drug used to treat infections caused by worm-like parasites called liver flukes. A course of treatment with Biltricide typically involves taking six to nine pills in a single day and costs around $100.

If Shkreli acquires Biltricide from Bayer, he plans to raise the price of the drug to $100,000 for a single-day course of treatment, according to people briefed on Turing's business plans. No other changes or improvements to the drug will be made by Turing. The extra revenue generated by Biltricide is expected to earn Turing a fast profit for its investors and help defray the cost of developing other, experimental drugs, sources said.

But that sale seems to have fallen through. Bayer is not an obscure producer, as opposed to the former manufacturers of Thiola (the drug I wrote about last September), and when Shkreli's interest alerted them to the drug's potential for a price raise, they decided to just do that on their own. Effective earlier this month, they raised the wholesale price by 3.5x, apparently because insurance providers won't care much about such a low-volume drug. That (as Feuerstein pointed out on his own Twitter feed) is at least far less than what Turing planned. (The Thiola price hike was 20x).

But even that price increase by itself is still not the sort of thing that I (or anyone else) would like to see. As mentioned in my second Thiola post, pricing power is a weapon, for sure, but it's one that can be taken away if you keep using it indiscriminately. "Not many people will notice" is not much of a reason to unsheathe it, either.

Praziquantel itself is not in the same category as Thiola, as far as I can see, where there really does seem to have been just one supplier. Several foreign generic suppliers make the compound, and Merck Serono has a long-standing donation program in Africa. So this one is not putting on the screws as hard, not that that's an excuse.

Comments (30) + TrackBacks (0) | Category: Drug Prices | Infectious Diseases

Bonne Chance, Brandicourt

Email This Entry

Posted by Derek

Sanofi's CEO hunt has ended, and the roue de la fortune (wheel of fortune) has pointed to Bayer's Olivier Brandicourt. (I'm tempted to keep dropping French phrases every couple of lines, but I figure that no one will stand for it, so you can work them in mentally as appropriate). But before his two years at Bayer, Brandicourt spent many years at Pfizer, so that might be a more appropriate pedigree to cite.

That history has some interesting chapters, such as overseeing the launch of Exubera, the inhaled insulin product whose effect was a bit more like the launch of an N-1. Exubera was, at least to an outsider, an absolutely hair-frizzing example of wrongheaded groupthink, a completely avoidable debacle from an organization poisoned by breathing its own fumes. This will make the recent Sanofi connection with Mannkind's inhaled insulin product fun to watch. I think that Mannkind's work in this area is the sort of thing that Don Quixote might have done if you'd given him a few billion dollars to work with, and that many of their investors need to adjust the dosages of their non-insulin medications. But we'll see what Brandicourt makes of it all. He has a lot on his agenda at Sanofi, and people will be wondering if (1) he can do the job, (2) if anyone can do the job, and (3) whether Sanofi's board of directors will let anyone do the job.

Comments (11) + TrackBacks (0) | Category: Business and Markets | Diabetes and Obesity

Unclick Undone, Unsurprisingly

Email This Entry

Posted by Derek

The now-notorious "unclick" paper has been retracted. Last summer saw an editorial "Expression of Concern", and later it was reported by C&E News that a common author (Kelly Wiggins) of all three papers in this area had confessed to fabricating data.

In light of this, the retraction notice is interesting. It makes reference to the UT-Austin investigation, but notes that its results (other than a finding of misconduct) were not shared. The original corresponding author, C. W. Bielawski, concluded that the key results were not trustworthy, though. He and the other author of the Science paper agreed that it should be withdrawn: "After the conclusion of the investigation, authors Bielawski and Brantley volunteered to withdraw the paper; it has not been possible to contact author Wiggins". I would guess that we're probably not going to hear from her again. . .

Comments (13) + TrackBacks (0) | Category: The Dark Side | The Scientific Literature

February 19, 2015

Experiencing Phase III Failure, Twice

Email This Entry

Posted by Derek

I'm going to use a short phrase that should make everyone who's ever been involved with clinical research shiver a little bit: post-hoc subgroup analysis. This comes up again and again in drug research (and has been the subject of several posts here over the years), because when a drug comes up short in the clinic, the natural impulse is to see if there are some groups that just responded better than others. New hope! New trial!

But you're walking into a very dangerous landscape when you do this sort of thing. The first big factor is how the trial itself was set up: did you define these subgroups before you started, and thus (presumably) take care to see that they were all populated with enough patients to have a chance of being meaningful? And just how many subgroups are we talking about, here? As everyone should be aware, the more of these you look at, the greater the risk that a seemingly interesting effect is nothing more than chance. Given the uncertainties of most clinical readouts, if you can look at enough subgroups in a large enough data set, you can jack things up to the point where at least one of them is bound to look significant. Getting excited about this is not recommended.
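Here's a quick way to see the size of the problem - a toy simulation (my own made-up numbers, not any real trial design) of a drug with no effect whatsoever, sliced into twenty subgroups:

```python
# A minimal sketch of the post-hoc subgroup trap: the "drug" below does
# nothing at all, but slicing each null trial into enough subgroups
# almost always turns up a "significant" p-value somewhere. All the
# counts and arm sizes here are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_trials, n_subgroups, n_per_arm = 1000, 20, 50

trials_with_a_hit = 0
for _ in range(n_trials):
    hit = False
    for _ in range(n_subgroups):
        drug = rng.normal(0.0, 1.0, n_per_arm)     # no true effect
        placebo = rng.normal(0.0, 1.0, n_per_arm)  # same distribution
        if stats.ttest_ind(drug, placebo).pvalue < 0.05:
            hit = True
    trials_with_a_hit += hit

# Twenty independent looks at pure noise: roughly 1 - 0.95**20,
# or about 64%, of these null trials will show a "winner".
print(f"Null trials with >=1 'significant' subgroup: {trials_with_a_hit / n_trials:.0%}")
```

Run that, and about two-thirds of the completely null trials hand you a subgroup that looks worth "following up".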

I bring this up because we've had yet another example of a company pushing onwards in the clinic after a subgroup analysis and getting scorched. Takeda and Amgen took an anti-VEGF (among other things) compound, motesanib, into the clinic against non-small-cell lung cancer a few years ago. Phase II looked encouraging enough to go on, but the Phase III trial (adding it to existing chemotherapy) struck out pretty thoroughly.

Not all that surprising in small-molecule oncology, that sort of result. And adding hydra-headed tyrosine kinase inhibitors into the NSCLC mix was an idea that had failed before with other compounds. I think that it was at this point that Amgen exited the picture, but Takeda dug into the data and believed that Asian patients actually showed some response to the drug. That's not a crazy idea, of course, but that doesn't mean it's real, either.

They went on to do another Phase III, with patients recruited from various East Asian countries. And two days ago, they announced the results: the primary endpoint (progression-free survival) was missed completely. And so the saga of motesanib comes to an end - well, in non-small-cell lung cancer, anyway. I believe the compound is still being looked at in thyroid cancer, and Takeda is probably trying to think of some other uses even now.

But in general, when your big Phase III trial flops, you'd better be ready for it to flop again if you still want to press on. I'm trying to think of any examples where this resuscitation strategy has worked. Even one is enough to give a person (or a company) hope, given the amount of money at stake, but given the odds, just how much hope is appropriate? How much was appropriate here?

Comments (22) + TrackBacks (0) | Category: Cancer | Clinical Trials

February 18, 2015

Spectra of An Actual Transition State?

Email This Entry

Posted by Derek

I don't spend too much time on physical organic chemistry here on the blog, which in a way is a shame. The readership would dwindle, although probably not as much as when I talk about patent law and intellectual property. But physical organic is an area I've always enjoyed, intellectually, even though it was sometimes hard to infer that during my graduate school classes. (I doubt if I have the patience to be much good at it in a lab setting).

But there's a new paper out in Science from a team at Stanford's SLAC, home of some of the brightest and hardest X-ray beams that ever fried a target sample. (Here's the press release from Stanford). Working with the University of Stockholm, they claim to have actually detected X-ray spectral data (K-edge absorption) from the transition state in the catalytic oxidation of CO to carbon dioxide. This was done on the surface of a ruthenium catalyst, with extremely fast and precise heating from an optical laser to get things going.

For any non-chemistry types reading down this far, try imagining a chemical reaction as a journey from one valley to another, through a high mountain pass. "Elevation" in this landscape is how much energy the system has, and an irreversible reaction features things going, overall, into a lower valley/energy state. The absolute peak of the mountain transit, though, is the transition state for the reaction. It's a real thing, but it only lasts for one molecular vibration before it heads off down one slope or another. It's the highest-energy species in the whole path because it features all sorts of half-formed and half-broken bonds, the sort of state that molecules generally avoid ever getting themselves wrenched into. But since getting up to and over that particular hump is such a big part of any reaction, anything that stabilizes the TS will speed a reaction up, sometimes immensely - which is just the sort of thing we'd like to learn how to do on demand.
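To put some rough numbers on that last point (my own back-of-the-envelope figures, not anything from the paper): transition state theory has the rate constant going as exp(-dG_act/RT), so even modest TS stabilization pays off exponentially:

```python
# Back-of-the-envelope transition state arithmetic. Transition state
# theory gives k = (kB*T/h) * exp(-dG_act/(R*T)); the prefactor cancels
# when comparing two barriers, so the fold speedup from stabilizing the
# TS by some amount is just exp(stabilization/(R*T)).
import math

R = 1.987e-3  # gas constant, kcal/(mol*K)
T = 298.0     # room temperature, K

def fold_speedup(ts_stabilization_kcal):
    """Rate enhancement from lowering the barrier by the given amount."""
    return math.exp(ts_stabilization_kcal / (R * T))

# ~1.4 kcal/mol buys about a factor of ten at room temperature;
# good catalysts and enzymes manage far more than that.
for dd in (1.4, 2.8, 5.0):
    print(f"{dd:4.1f} kcal/mol of TS stabilization -> {fold_speedup(dd):>8,.0f}x faster")
```

Every 1.4 kcal/mol or so off the barrier is another order of magnitude in rate, which is why anything that grips the transition state can be so spectacularly effective.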

As that press release says, quite honestly, this was "long thought impossible", and my prediction is that there will be quite a few people who won't accept that it's been done. My x-ray fu is not strong enough, personally, to be able to offer an informed opinion. Even if this report is accurate, it's surely right on the edge of what's possible with some of the best equipment in the world, so you really have to know this area at a high level to critique it thoroughly. But what's reported is both plausible and interesting.

What they saw was that the oxygen molecules began to change first. Then the electron distribution began to change in the CO molecules, followed by a productive collision (some of the time) to form the transition state itself. And one of the interesting things about that was how many times it apparently collapsed back to the original molecules, rather than going on to product. This is going to be subtly different for every reaction, or so theory tells us, but if we are finally able to physically investigate such things we may find ourselves revising a few theories a bit. This particular reaction, taking place between two small molecules, has been modeled extensively at all levels of calculation, and the results seen here fit very well, so it's not like we're going to be packing big swaths of human knowledge into the dumpster. But anything we can learn about transition states (and how to make them selectively happier and unhappier) is the key to chemistry as a whole.

Here's a YouTube video from the Stanford team on what they've been up to. We'll see how the rest of the analytical and theoretical chemistry community reacts to this work.

Comments (13) + TrackBacks (0) | Category: Analytical Chemistry | Chemical News

New Alzheimer's Research in the UK

Email This Entry

Posted by Derek

Oxford University's new Drug Discovery Institute is making a big push in dementia and Alzheimer's. Update: link fixed, I hope. On one level, that's good news, because this is a tough area that needs all the well-placed shots that can be aimed at it. The folks Oxford will be staffing this institute with will surely not be dummies, and there's a lot of currently underused drug discovery and development talent in the UK that I hope they'll tap into.

The Alzheimer's Research UK charity is helping to launch three such institutes - one at Oxford, one at Cambridge, and one at University College London - so they're planning to bring plenty of academic firepower to bear as well (and it's not like people at all three institutions haven't already been working in the field, in various ways). All this makes me hope that this part is just some hyperbole for the press release:

The research team at Oxford will develop multiple projects to design and develop new therapies. It plans to deliver up to three new therapies for further clinical development and trials within the next five years.

'This has never been done before, and we believe that it will transform dementia research', said Professor Chas Bountra, the other project leader at Oxford University, 'We will work with the best academic and industrial scientists to identify potential new drug targets for dementia. We will then generate high quality starting points for making new medicines, but then uniquely, make them freely available to the world's biomedical community. By doing so we will catalyse new biology, new disease understanding and importantly accelerate those few molecules which are likely to slow down the progression of this dreadful disease. We are crowd sourcing the discovery of new medicines for Alzheimer's disease. This is unprecedented.'

Three good starting points in five years surely would be that. They'd better have about eight or nine really good ideas in hand right now if they want to have any chance at all. And honestly, I'm not sure if there are as many as eight or nine separate good ideas in Alzheimer's drug therapy at all. I certainly hope I'm wrong about that. But I also hope that this strong, well-intentioned effort doesn't start off by promising more than anyone can deliver.
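Here's the sort of attrition arithmetic I have in mind - the one-in-three success rate is my own (probably generous) assumption, not anything from the announcement:

```python
# How many good ideas do you need running in order to deliver three
# clinical-ready molecules? Assume (generously) that each project
# independently has about a 1-in-3 chance of getting that far.
from math import comb

def p_at_least(k, n, p):
    """P(at least k successes from n independent projects at rate p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

for n in (3, 6, 9, 12):
    print(f"{n:>2} projects -> P(>=3 deliverables) = {p_at_least(3, n, 1/3):.0%}")
```

Even with nine projects at those odds, you're only a bit better than even money to hit three - and one-in-three is a very cheerful number for Alzheimer's.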

Note: there's been a trend towards Alzheimer's optimism (or over-optimism) in the UK. I wonder if you just have to join in this chorus in order to participate at all.

Comments (43) + TrackBacks (0) | Category: Alzheimer's Disease | Drug Development

February 17, 2015

Layoffs at Boehringer Ingelheim

Email This Entry

Posted by Derek

According to FierceBiotech, Boehringer is cutting staff in Ridgefield, CT. The company says it is "in the process of rebalancing its workforce in the U.S. to put the company in the best position for future growth". You won't find many sentences that say "layoffs" more plainly than that one. I haven't heard from any of my chemistry contacts there, so I don't know how the R&D departments are faring in all this compared to sales, administration, and so on, but I'll post more details as they become known.

Comments (13) + TrackBacks (0) | Category: Business and Markets

Chemical Illiteracy, Again

Email This Entry

Posted by Derek

The BBC really should know better than this. Shown is a screen shot from a current science TV series, "Wonders of Life", taken from this review in The Guardian. Every chemist reading this will have noticed by now that this so-called "peptide structure" is laughably insane. It's poorly drawn, too - what's with all those changing bond lengths? - but the big problem is that it represents something that could never exist.

If a BBC show were to include a world map where Italy had been relabeled Argentina, they would probably be shamed throughout the British press. This structure is more stupid than that, though, since it's at least physically possible for the citizens of Italy to vote to change the name of their country. Oxygen, though, cannot decide to be a neutral trivalent species, and that's only one of several stupidities in that "structure". This seems to be yet another case of some graphic design person getting their hands on some chemical drawing software and making something that looks pretty.

Prof. David Smith at York has a good response to all this on YouTube. What always irritates me about these mistakes is that they're so avoidable, and accomplish so little. Would putting down the right structure really have made it less visually interesting? Is the gibberish shown really so much more compelling that someone had to come along and mess things up just for that purpose? This isn't the first example of these ridiculous errors I've highlighted around here, and I'm sure it won't be the last. I just wish that they were less frequent, and less prominent.

Comments (40) + TrackBacks (0) | Category: General Scientific News

February 16, 2015

Targeting a Transcription Factor

Email This Entry

Posted by Derek

Here's a paper in Science on inhibition of the transcription factor CBFbeta-SMMHC. That's a messed-up version of CBFbeta itself, which has been found to drive many forms of acute myeloid leukemia (AML) through its interactions with another transcription factor, RUNX1. So tying this one up with some sort of small-molecule therapy is an attractive idea, the only problem being that getting small molecules to work on transcription factor pathways has been very difficult indeed.

This group set up a FRET assay and screened the National Cancer Institute's diversity set of compounds to see what they could come up with, and 2-(2-pyridyl)-5-methoxybenzimidazole came out as a micromolar hit. (Without the methoxy group, it lost at least 10x in potency). Since the transcription factor itself is dimeric, they then tried stitching two of these together, and the resulting dimeric compounds all come in around 300-400 nM. There's a good deal of downstream biological data suggesting that these compounds are hitting this pathway.

So, what does a medicinal chemist think about these compounds? For starters, 2-pyridylbenzimidazole is going to be a very good metal chelator, and I would wonder if some of the effects might not be due to an off-target mechanism of that sort. The paper itself does not mention any possibility of metal chelation, as far as I can see. The 5-methoxybenzimidazole itself isn't objectionable per se (that's the left-hand side of omeprazole), but I have seen assay hits with that sort of chelating group in them, and in my own experience they've been difficult to prosecute. (I'd be glad to hear some other opinions).
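If you want to check your own hit lists for this kind of motif, a substructure search is quick to set up. A minimal RDKit sketch - the SMARTS is my own rough stab at the bidentate 2-(2-pyridyl)azole arrangement, not anything from the paper:

```python
# Flag screening hits that carry a 2-(2-pyridyl)benzimidazole-type
# motif, which can act as a bidentate metal chelator and confound
# assay results. SMARTS and example SMILES are illustrative only.
from rdkit import Chem

# pyridine nitrogen and azole nitrogen poised to grab the same metal
chelator = Chem.MolFromSmarts("c1ccnc(c1)-c1nc2ccccc2[nH]1")

hits = {
    "hit_A": "COc1ccc2[nH]c(-c3ccccn3)nc2c1",  # methoxy 2-(2-pyridyl)benzimidazole
    "hit_B": "c1ccc(-c2ccccn2)cc1",            # 2-phenylpyridine: no azole, passes
}

for name, smi in hits.items():
    mol = Chem.MolFromSmiles(smi)
    verdict = "possible chelator" if mol.HasSubstructMatch(chelator) else "looks clean"
    print(name, "->", verdict)
```

Anything that lights up would go straight into a counterscreen - spiking the assay with a chelator like EDTA (or with extra metal) is a cheap first check for that sort of artifact.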

Now, as far as the dimer idea goes: since the target itself is a dimer, it's hard to object to it. But for whatever reason, this strategy doesn't seem to have paid off much in drug discovery. Perhaps it's the added molecular weight (and the PK liabilities that come with it) that makes such compounds harder to advance. On top of this, medicinal chemists have a sort of instinctive distaste for completely symmetric molecules like these - and I do, too - but I've never been quite sure why. I think we feel that there are more reasons for something like this not to be real than for it to be acting as advertised, but that's not a very quantitative way of looking at things.

I would feel better about this latest paper if the chemical matter had been given more of a going-over than appears to have been done. I don't see any mention of selectivity assays or the like, and I really would like to run something like this across some sort of broad panel just to make sure that it doesn't set off too many other things. On the other hand, the authors do have a lot of cell data that points toward this particular transcription factor being the target, and I'm ready to applaud anyone who gets the small-molecule transcription factor thing to work. I'll look forward to seeing what comes of this, for sure.

Comments (21) + TrackBacks (0) | Category: Cancer | Drug Assays

AstraZeneca's Anti-Infective Woes

Email This Entry

Posted by Derek

According to Fierce Pharma, AstraZeneca is having trouble unloading its anti-infectives division. Part of the problem seems to be that they're trying to spin out the whole package at once:

The pharma giant also specifically sought out venture groups that had been investing in anti-infectives in an effort to find a buyer for the Waltham, MA-based group. "The assets had little (if any) differentiation and were late to the market and the economics didn't make sense," the executive added.

Another player in the field says he took an early look at AstraZeneca's proposition. "The challenge was that they wanted to spin out the whole pipeline plus a 20-person team," he said. "The pipeline had a couple of interesting assets but they wanted to get paid for the uninteresting assets as well."

The article says that people believe AZ is getting close to making some sort of decision, since the whole process has been going on for longer than they wanted. I wonder how many of that 20-person team still remain.


Comments (13) + TrackBacks (0) | Category: Business and Markets | Infectious Diseases

February 13, 2015

A Trick Question

Email This Entry

Posted by Derek

I got a kick out of this FierceBiotech story on Pfizer and Ian Read, based on this talk with him.

Pfizer probably won't be building up the innovative side of the business by scooping up a bunch of biotechs. "[I]t's difficult to pick winners," Read acknowledges. Plus, as much as Pfizer tries to be investor-friendly, "small biotechs don't rate Pfizer very highly."

"[M]aybe that's because we don't pay enough," Read added.

There's no indication of what expression he had on his face when he advanced that theory, unfortunately. But maybe it's because the biotechs themselves (and their investors) don't like seeing their companies filleted like fish and a marketable chunk turned into sushi, with everything else dumped into a grimy bucket and thrown over the side? Just a thought.

Comments (29) + TrackBacks (0) | Category: Business and Markets

Rapamycin And Aging: The Spotlight Shines

Email This Entry

Posted by Derek

Rapamycin gets the spotlight in Bloomberg Businessweek here. This is a look at what's been set in motion by the 2009 report that the compound notably extended the life of rodents in long-term feeding studies. It's a good article, and gets some interesting quotes from Mark Fishman of Novartis and many others.

One of the big questions is how rapamycin exerts its effects. It's certainly an inhibitor of the mTOR pathway (and it was actually used to discover and define it, since the TOR part stands for "target of rapamycin"). That's going to do a lot, including immune suppression, which is one of the reasons that people are a bit leery of using the drug in otherwise healthy people. However, this study, from late last year, suggested that the closely related everolimus actually improved immune function in elderly human patients, so the last word on this has definitely not been written.

There was a study in 2013 which suggested that the lifespan enhancement seen in rapamycin animal studies was largely (or completely) due to tumor suppression, rather than any general anti-aging effect, but (as this Bloomberg story shows), this is still an open topic. A group at the University of Washington is planning a study in aging dogs that might help answer the question.

What seems certain is that companies are taking on the idea of treating aging more openly. GSK got pretty badly burned with Sirtris and the follow-up from resveratrol, or so it most certainly appears from the outside, but Novartis is clearly interested, and you have AbbVie's recent deal with Google's Calico as well. The idea will not be so much to move right in and say "We're going to reverse aging", but to go after diseases associated with aging, whose mechanisms of treatment might be more general. This is partly just prudent practice, and partly regulatory caution, since the FDA has no way to deal with a proposal to treat people who, by current medical definitions, have no disease but are "merely" growing old. With any luck, that "merely" will come to seem odd.

I can't resist quoting James Blish here (and I couldn't last time, either). In his 1950s Cities in Flight books, one of the key technologies that made the plot run (along with a handy and vividly described faster-than-light drive) was the discovery of a suite of therapies that nearly prevented aging. Blish himself trained as a biologist, and worked for Pfizer in the 1950s for a while, although not as a scientist. That accounts for a scene early on when a returning space pilot is delivering exotic samples for testing to "Pfitzner", a large drug company in New York City:

The door closed, leaving Paige once more with nothing to look at but the motto written over the entrance in German black-letter:

Wider den Tod ist kein Kräutlein gewachsen!

Since he did not know the language, he had already translated this by the If-only-it-were-English system, which made it come out "The fatter toad is waxing on the kine's cole-slaw." This did not seem to fit what little he knew about the eating habits of either animal, and it was certainly no fit admonition for workers.

That motto, of course, turns out to be an old herbalist saying that "Against Death doth no simple grow", and the characters in the story are busy proving that to be incorrect. (That's also a good example of the peculiar things that Blish would drop into his science fiction stories, odd little asides done in omniscient-author voice that give his writing an unmistakeable tone.) We'll see how prescient he was about the natural products for aging, and if that works out, perhaps we'll have enough time to start in on the faster-than-light drive.

Comments (23) + TrackBacks (0) | Category: Aging and Lifespan

February 12, 2015

What Compound Will You Never Forget?

Email This Entry

Posted by Derek

While catching up on the literature today, I find that even now, thirty years later, I can't look at a paper that uses 1,6-anhydroglucose (levoglucosan to its friends) without a quick, simultaneous flicker of interest and shiver of dread. This is why.

So, fellow chemist, what's yours? What compound will you never forget, because it did something good for you or something bad to you, because it got you out of grad school, ruined six months of your life, was the most fun to recrystallize, or made you wish that you were standing out somewhere in a drive-through enclosure asking "Will that be all today?" instead? Nominees in the comments.

Comments (89) + TrackBacks (0) | Category: Chemical News | Life in the Drug Labs

A Fluorination Review

Email This Entry

Posted by Derek

Most medicinal chemists like fluorinated compounds, since fluorine tends to give compounds very different (and often more desirable) properties, and we're interested in new ways of preparing them. The last few years have seen a real upsurge in the synthetic methods available in this area, and particularly in reagents and techniques that can be applied to complex molecules. These "late-stage" fluorinations are particularly appealing - imagine, as a thought experiment, taking a library of natural products and running them through a protocol like this, to produce a completely new library with completely new properties.

Here's a new review of these reactions, from Tobias Ritter and coauthor Constanze Neumann of Harvard. It includes a handy cheat sheet of recent advances by reaction class, and looks forward optimistically to still more: asymmetric fluorination, new electrophilic reagents, greater functional group tolerance, and new ways of accomplishing direct C-H fluorination.

Comments (4) + TrackBacks (0) | Category: Chemical News