Lab Product News

New drug testing model could reduce research costs


Montreal, QC – Researchers at McGill University say they have found a statistical key to streamlining the earliest stages of drug research. In a study published in this month’s issue of Nature Biotechnology, Dr Robert Nadon unveils new statistical models that could greatly improve the efficiency of a phase of drug research that has largely relied on hunches.

In the first stage of drug testing, a process called high-throughput screening (HTS) is used as a form of triage to determine which promising compounds should advance to the next stage of testing as potential new medications. Plates holding dozens of compounds or more are tested for their potential to inhibit or stimulate specific chemical reactions.

Dr Nadon, a principal investigator at the McGill University and Genome Quebec Innovation Centre, and his McGill co-authors, Nathalie Malo, James A Hanley, Sonia Cerquozzi and Jerry Pelletier, identified modern statistical methods that provide information about variability that is otherwise unavailable until later phases of drug testing. While later drug trials generally test medications on as many subjects as possible, the early HTS process usually tests each compound only once.

According to Dr Nadon, testing more than once is key. Obtaining even one replicate for all compounds, however, may be prohibitively expensive. The paper, “Statistical practice in high-throughput screening data analysis,” describes how useful estimates of variability can be obtained by repeating only some of the plates. The method holds potential for minimizing costs while at the same time allowing more informed decisions about which compounds to pursue.
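The idea of estimating assay variability from a replicated subset, then applying that estimate to all compounds, can be illustrated with a short sketch. This is not the authors' published method; it is a simplified, hypothetical example in which a noise standard deviation is estimated from paired readings of a subset of compounds and then used to flag likely hits across the whole screen.

```python
# Illustrative sketch only (not the published method): estimate assay noise
# from a replicated subset of compounds, then score every compound.
import random
import statistics

random.seed(0)

def estimate_noise_sd(replicate_pairs):
    """Estimate the per-measurement noise SD from (first, second) readings
    of the same compounds. For two independent measurements,
    Var(first - second) = 2 * Var(noise)."""
    diffs = [a - b for a, b in replicate_pairs]
    return statistics.stdev(diffs) / (2 ** 0.5)

def flag_hits(readings, noise_sd, z_cutoff=3.0):
    """Flag compounds whose reading deviates from the plate median by more
    than z_cutoff noise standard deviations."""
    center = statistics.median(readings)
    return {i for i, r in enumerate(readings)
            if abs(r - center) / noise_sd > z_cutoff}

# Simulated screen: 200 compounds, three true inhibitors (strong negative
# signal), everything else inactive, with Gaussian measurement noise.
true_hits = {10, 50, 120}
signal = [-5.0 if i in true_hits else 0.0 for i in range(200)]
read1 = [s + random.gauss(0, 0.5) for s in signal]

# Replicate only the first 40 compounds to estimate the noise level.
read2 = [signal[i] + random.gauss(0, 0.5) for i in range(40)]
pairs = list(zip(read1[:40], read2))

noise_sd = estimate_noise_sd(pairs)     # should land near the true 0.5
hits = flag_hits(read1, noise_sd)       # scored against all 200 compounds
```

In this toy setup the replicated subset is enough to recover the noise level, so the three simulated inhibitors stand far outside the noise band and are flagged without replicating the remaining 160 compounds.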

Dr Nadon adds that the discovery complements business-model efficiencies already implemented by the pharmaceutical industry. “Industry observers have expressed concern about inefficiencies in the drug discovery process,” he says. “Although these are early days, statistical methods offer the potential to reduce costs by improving the odds that false hits will be abandoned early and true hits will be followed up.”