Reducing the PAINS in High Throughput Screening: Assay design as a tool for maximising efficiency


By Dr Philip S. Jones and Dr Stuart P. McElroy

While the quality of the compound collection is frequently described as a key determinant of the success of a high throughput screening campaign, in our opinion equally important is the design and execution of the primary assay and the subsequent confirmatory screens used to establish the authenticity of the hits that are discovered.

This article will describe, with examples, how understanding the likely mechanisms of false positives in advance of screening informs the design of the hit triage, thus increasing the likelihood of discovering optimisable chemical matter and avoiding a costly waste of resources. We describe the use of a bespoke ‘Robustness Set’ of nuisance compounds and how it can be used in conjunction with adjusting the conditions of the assay. We also present an example where hit identification was initially confounded by the presence of a common pharmaceutically-acceptable salt and describe how biophysical data were used to characterise the interactions and triage the hits.

The small-molecule drug discovery ecosystem has changed enormously over the past two decades (1). Currently only two of the top 10 best-selling medicines are small molecules, whereas 15 years ago all 10 were in this category (2). While, of course, other statistics are arguably more important, such as the number of patients treated, this change has led to challenges in funding small-molecule programmes in many discovery organisations including major pharma, for example: “There was a view that vaccines, antibodies and other biopharmaceuticals were more profitable than small-molecule drugs, in part because competitors were not as adept at bringing generic versions to market.” (3) There has been a shift to new modalities in discovery (4) and some notable successes have been scored in the clinic with significant promise of future advances. At the organisational level there have been very significant reductions in staffing in pharma companies, in particular in discovery R&D (5), which has mirrored the growth in open innovation models where the search for innovation relies on increasing interactions with academic groups and biotech (6). These structural changes in the drug discovery landscape have led to the rise of the ‘virtual biotech’, where a nucleus of individuals, typically a combination of disease specialists and ex-pharma project managers backed by private investment, access all of the requisite discovery and development infrastructure via outsourcing to CROs (7). However, these small or micro organisations, often originating from academia, inevitably have gaps in their capacities, capabilities and/or institutional knowledge base, and one area where this is evident is in high throughput screening (HTS).

Lead generation and high throughput screening

HTS is a key strategy for finding chemical matter as a starting point for small molecule drug discovery programmes. It comprises screening large libraries of compounds (typically hundreds of thousands) in one or more biological assays followed by a series of triaging activities aimed at prioritising one or more validated hit series. A validated hit series can be defined as a series of compounds which possess a progressive structure-activity relationship with strong evidence of target engagement, an indication that selectivity over close target orthologues is achievable and with physicochemical properties that are appropriate for preparation of a clinical candidate (8).

While HTS is an effective method, it requires significant infrastructure in the form of chemical libraries, compound storage and logistics (9) and, vitally, expertise in assay development and execution (10). In many cases this expertise has been accumulated over many years and resides within major pharma companies, although increasing investment in academia and major initiatives such as the European Lead Factory (11) have increased access to the required infrastructure and expertise for small companies and academic groups.

False positives

While the ‘quality’ of the compound collection is sometimes considered a synonym for success in a high throughput screening campaign, in our opinion the careful design and execution of the primary assay and the subsequent confirmation of hit authenticity is as important, if not more so. The concept of Pan-Assay Interference Compounds (PAINS) is now widely accepted, whereby some compounds, or trace contaminants from chemical synthesis, can act as inhibitors or activators of multiple targets via unproductive or non-specific mechanisms (12). The key qualifier in this concept is the term ‘Pan-Assay’, indicating that these compounds appear as hits time and time again, which any professional screening organisation will quickly recognise and can then annotate as frequent hitters or remove altogether from the library. Unfortunately, not all unproductive mechanisms of inhibition can or will be flagged as PAINS, as it is important to recognise that the nature of the assay/target interference mechanisms is context-dependent, eg some compounds will only act as aggregators under certain buffer conditions and over particular time periods. Indeed, some interference mechanisms can be incredibly subtle, elegant even, and appear exquisitely selective for the target of interest, so much so that even the most conscientious drug hunter can and will end up wasting time and resources unwittingly following these up. This is where careful assay development and the considered sequencing of testing helps minimise the frequency, and therefore the long-term cost, of these frustrating dead-end forays.

Two recent reports illustrate some of the issues and subtleties of false positives. Ciulli et al (13) reported on hits with 20μM potency in a biochemical screen and confirmed a molecular interaction with the target using orthogonal techniques (isothermal titration calorimetry and NMR), although the compound destabilised the protein in a thermal shift assay. Chemical analogues of the favoured compound revealed only ‘flat’ SAR, elevating concerns regarding the mode of activity. Finally, crystallography identified contaminating zinc in the samples (a by-product of compound synthesis) as leading to a metal-mediated oligomerisation of the protein, which manifested itself in the original assay as functional inhibition. Technically correct, but not a promising avenue for further work towards developing a drug-like inhibitor. The second example is a twist on the problems of aggregators. Seminal work by Shoichet et al (14) revealed that some compounds form aggregates in solution. The classical description of aggregators is as forming micellar-type structures that can effectively absorb the protein target in an assay, which is observed as inhibition of target function. Of course, the specific structures that these aggregates adopt will be dictated by the physicochemical properties of the compound, its concentration and the environment of the assay – one reason why detergents are commonly included in screening assays to disrupt aggregation. What is probably less well appreciated is that some aggregates can adopt interesting structures that allow them to act as specific inhibitors. Blevitt et al (15) describe the characterisation of an inhibitor of the interaction between TNF and the TNF receptor. Rather than a pharmacologically-useful mechanism of inhibition, crystallography identified an aggregate of five molecules mimicking the structure of the TNF subunit. This would replace one of the genuine subunits in the active trimeric target protein, leading to a conformational change and apparent inhibition. Both of these examples illustrate the requirement for a range of techniques to be available to the screening group, together with the expertise to perform them and interpret the results.

Assay optimisation and the use of a robustness set

An approach to minimising the impact of these problems is to ensure the screening assay is fit-for-purpose by developing knowledge of the target’s sensitivity to common mechanisms of interference. To help achieve this we have established a ‘robustness set’ of compounds comprising known ‘bad actors’ in high throughput screens, eg redox cycling, aggregating, chelating, coloured, fluorescent and reactive compounds. Testing these compounds in an assay highlights which classes may cause problems, and we typically redesign the assay to eliminate or reduce the apparent sensitivity to these mechanisms. An example comes from a potential metabolic oncology target, phosphofructokinase (PFK). Using a well-established method of monitoring ADP production, ADP Hunter by DiscoverX, we developed a basic assay using the same buffer conditions as those used by our academic collaborator. While all of the usual assay quality metrics (S/B, %CV, Z’ and reference compound potency) were exemplary, when the robustness set was screened, 90% of the compounds showed greater than 20% inhibition of PFK with no particular preference for their class of interference (Figure 1).

Figure 1 The % inhibition of phosphofructokinase activity in a biochemical assay
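The quality metrics mentioned above can be computed directly from control-well readings. The following is a minimal sketch, assuming a simple two-control plate layout; the function name and the readings are hypothetical, and real plates would use many more control wells:

```python
import numpy as np

def assay_quality_metrics(max_signal, min_signal):
    """Compute common plate-assay quality metrics from positive ('max')
    and negative ('min') control wells. Illustrative sketch only."""
    max_signal = np.asarray(max_signal, dtype=float)
    min_signal = np.asarray(min_signal, dtype=float)

    s_b = max_signal.mean() / min_signal.mean()                 # signal-to-background
    cv_max = 100 * max_signal.std(ddof=1) / max_signal.mean()   # %CV of max controls
    # Z' factor: 1 - 3*(sd_max + sd_min) / |mean_max - mean_min|
    z_prime = 1 - 3 * (max_signal.std(ddof=1) + min_signal.std(ddof=1)) / abs(
        max_signal.mean() - min_signal.mean())
    return {"S/B": s_b, "%CV": cv_max, "Z'": z_prime}

# Hypothetical control readings from one plate
metrics = assay_quality_metrics(
    max_signal=[980, 1010, 995, 1005], min_signal=[102, 98, 100, 95])
print(metrics)  # a Z' above 0.5 is conventionally taken as an excellent assay
```

Note that these metrics only describe the separation and noise of the controls; as the PFK example shows, an assay can score perfectly on all of them and still be highly vulnerable to interference.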

Our experience is that if a target is activated or inhibited by >25% of the robustness set then it is likely to suffer from particularly high hit rates when screened against a ‘normal’ compound library and probably has some form of environmental sensitivity. In this case, the basic assay buffer did not contain any reducing agent. As a rule, it is best practice to protect any cytosolic enzyme with a strong reducing environment, as structurally and functionally important cysteine residues can easily be oxidised when exposed to a variety of chemical compounds. Indeed, inclusion of 2mM DTT dramatically reduced the proportion of compounds causing greater than 20% inhibition to 9% (Figure 1), and almost all of these were classed as redox cycling compounds (RCCs). These compounds react with strong reducing agents, such as DTT or TCEP, to produce hydrogen peroxide, which in turn can oxidise sensitive residues such as cysteines, leading to target inhibition. A third screen of the set, this time including 5mM of the weaker reducing agent cysteine, showed an assay that had lost its sensitivity to RCCs, identifying only two compounds that produced slightly greater than 20% inhibition; there is somewhat more noise in the baseline with the weaker reducing environment and the apparent effect of these two falls within that noise (Figure 1).
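The triage logic described above can be sketched in a few lines: count the robustness compounds exceeding the inhibition cutoff, break them down by interference class, and flag the assay if the affected fraction exceeds the threshold. The cutoffs mirror those in the text (20% inhibition, 25% of the set); the function name and data are hypothetical:

```python
from collections import Counter

def triage_robustness(results, hit_cutoff=20.0, flag_fraction=0.25):
    """results: list of (interference_class, percent_inhibition) pairs.
    Returns whether the assay should be flagged as environmentally
    sensitive, plus a per-class tally of affected compounds."""
    hits = [(cls, inh) for cls, inh in results if inh > hit_cutoff]
    by_class = Counter(cls for cls, _ in hits)
    flagged = len(hits) / len(results) > flag_fraction
    return flagged, by_class

# Illustrative data: an assay with no reducing agent in the buffer
results = [("redox", 85), ("redox", 72), ("aggregator", 15),
           ("chelator", 40), ("fluorescent", 5), ("reactive", 30)]
flagged, by_class = triage_robustness(results)
print(flagged, dict(by_class))  # True {'redox': 2, 'chelator': 1, 'reactive': 1}
```

A class-level breakdown like this is what points towards a specific remedy, eg a dominant RCC signal suggesting a change of reducing agent rather than, say, adding detergent.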

An important caveat to the use of this robustness set is that it adds extra time to assay development, as any substantive changes to the buffer environment require reassessment of the enzyme kinetics and potential adjustments to substrate concentration or end-point read time if significant changes become apparent.

We continually look for ways to improve the performance of the robustness set, and we have reported on an example of triaging the output from a screen within the European Lead Factory which illustrates how we are doing this (16). A mitochondrial enzyme linked to the proliferation and tumourigenic capacity of some cancer types was screened against 450,000 compounds of the Joint European Compound Library. A primary hit rate of ~1% was of little concern but, somewhat unusually, almost all of the primary hits confirmed as inhibitors in two orthogonal biochemical assays, and many showed shallow Hill slopes across a very limited range of potencies. One of the restrictions of the European Lead Factory is the requirement to select no more than 50 compounds from each screen (17), a daunting task when there are ~4,000 confirmed hits to choose from and open questions about what the data were telling us. Usefully, we had validated a thermal shift assay in which an orthosteric reference inhibitor showed a clear, saturable and concentration-dependent stabilisation of the target protein (Figure 2).

Figure 2 Thermal shift assay data showing the concentration-dependent stabilisation of target protein

This assay provided the throughput needed for testing all ~4,000 compounds and, surprisingly, almost all of them appeared to cause a change in the profile of protein stability. Interestingly, only six of the hits caused a thermal melting profile reminiscent of the reference inhibitor, ie a clear rightward shift of the single melting peak.

The remainder all produced the appearance of a shoulder or second peak suggesting that very few of the hits were binding the target in a manner similar to the orthosteric reference ligand (Figure 3).

Figure 3 Representative thermal shift data showing the 'second peak' profile of most of the screening hits
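The distinction between a reference-like rightward shift and the shoulder/second-peak profile can be made by locating maxima in the first derivative of the melt curve: one shifted transition versus two. Below is a simplified sketch using simulated sigmoidal melt curves in place of real thermal shift data; the function name, parameters and curves are all illustrative, not the project's actual analysis:

```python
import numpy as np

def melt_peak_temps(temps, fluor, rel_height=0.3):
    """Locate melting transitions as local maxima of dF/dT that rise
    above a fraction of the largest derivative value. One peak suggests
    a single (possibly shifted) transition; two peaks suggest the
    'second peak' profile. Sketch only - real DSF analysis typically
    uses instrument software or full curve fitting."""
    d = np.gradient(np.asarray(fluor, float), np.asarray(temps, float))
    thresh = rel_height * d.max()
    peaks = [i for i in range(1, len(d) - 1)
             if d[i] > d[i - 1] and d[i] >= d[i + 1] and d[i] > thresh]
    return [temps[i] for i in peaks]

# Simulated curves: a single transition at 50C vs two transitions (45C, 55C)
temps = np.arange(30.0, 70.0, 0.5)
sigmoid = lambda tm: 1 / (1 + np.exp(-(temps - tm)))
single = sigmoid(50)
double = 0.5 * sigmoid(45) + 0.5 * sigmoid(55)
print(melt_peak_temps(temps, single))  # one peak, near 50
print(melt_peak_temps(temps, double))  # two peaks, near 45 and 55
```

Classifying the ~4,000 hits this way is what separated the six reference-like compounds from the rest.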

Taken together, the unique thermal stability fingerprint, the unusually shallow Hill slopes, the limited range of potencies despite considerable physicochemical and structural diversity, and the unprecedented confirmation rates across multiple orthogonal assay formats raised the suspicion that some common contaminant was affecting the hit samples. One line of investigation was to assess which common materials were likely to have been used in the preparation of the various hit compounds, which led to the suggestion that oxalic acid would be worth investigating. Indeed, when oxalic acid was tested in the thermal shift assay it produced exactly the same stabilisation profile as the hit compounds (Figure 4).

Figure 4 Thermal shift data showing the concentration-dependent appearance of a second peak

It also inhibited the target in the biochemical assays with a Hill slope of ~0.6. The target was a large multidomain enzyme, so a truncated construct lacking the reference compound binding domain was tested in the thermal shift assay. The orthosteric reference inhibitor and the six compounds that mimicked its thermal shift profile in the full-length enzyme had no effect against the truncated protein. By contrast, oxalic acid and the remainder of the screening hits produced the second peak with a concomitant reduction in the size of the main peak. Unfortunately, at this stage of the triage, the limited quantity of compound available (a consequence of the nature of the ELF) and the sheer number of hits prevented desalting and/or retesting from solid, which would have answered whether the activity we observed was driven solely by oxalic acid or by some other common salt in the screening samples. Indeed, some of the hits may genuinely interact with the target in a manner similar to oxalic acid, but our focus was naturally drawn to the six hits that acted in a similar manner to the reference compound. This experience also prompted an enhancement of the robustness set to include a library of common salts and likely contaminants from synthesis (metals, etc), which are now routinely screened during assay development, in orthogonal biophysical assays as well as primary biochemical assays.
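A shallow Hill slope such as the ~0.6 observed here can be estimated by linear regression of the logit of fractional inhibition against log concentration, since the Hill equation is linear in those coordinates. The sketch below uses simulated dose-response data; the function name, IC50 and concentrations are hypothetical:

```python
import numpy as np

def hill_slope(conc, inhibition_pct):
    """Estimate the Hill slope from a dose-response curve by fitting
    log10(fa/(1-fa)) vs log10(concentration), where fa is fractional
    inhibition. A slope well below 1 is a warning sign of a
    non-classical inhibition mechanism. Illustrative sketch only."""
    fa = np.asarray(inhibition_pct, float) / 100.0
    mask = (fa > 0.05) & (fa < 0.95)       # keep well-determined points only
    x = np.log10(np.asarray(conc, float)[mask])
    y = np.log10(fa[mask] / (1 - fa[mask]))
    slope, _ = np.polyfit(x, y, 1)          # gradient of the logit-log line
    return slope

# Simulated dose-response with a true Hill slope of 0.6 and IC50 of 10 uM
conc = np.array([0.1, 0.3, 1, 3, 10, 30, 100, 300])
inhib = 100 / (1 + (10 / conc) ** 0.6)
print(round(hill_slope(conc, inhib), 2))  # 0.6
```

Flagging compounds whose fitted slope deviates strongly from unity is a cheap, automatable filter to run across a confirmed-hit list before committing resource to follow-up.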

Identifying a validated hit series

Careful interpretation of the output from HTS is a key step towards identifying validated hit series, as there are many opportunities for assay interference effects to disguise the true mode of action of compounds. While there are a number of generalisations to assist this interpretation, in the form of ‘at risk’ structures, new confounding effects continue to be discovered and, in some cases, published (vide supra), demonstrating the subtlety of the interference mechanisms. We believe that being forewarned is being forearmed and that the chance of obtaining genuine validated hit series is greatly enhanced by learning what effects common classes of interference compounds have on an assay prior to running the full screen. Liabilities can be identified and assay conditions modified to reduce or exclude them, or appropriate deselection strategies can be put in place to interrogate hits for these liabilities. To this end we have found that the use of a ‘robustness set’ significantly assists our assay development work, and we look for continuous improvement of this set by adding new classes of interference identified in our ongoing screening campaigns. Time and effort spent at this stage, prior to HTS, pays dividends as it reduces the chances of costly follow-up of false positives and, potentially more serious, the loss of false negatives. While these are issues in the earliest stages of inventing a new medicine, the quality of this early work sets the trajectory for success of the whole project and increases the likelihood of new molecules reaching and ultimately bringing benefits to patients.

Acknowledgements

The research leading to some of the results described here was funded by a grant provided by the Innovative Medicines Initiative Joint Undertaking Grant Agreement 115489. Funding was also provided by the Scottish Universities Life Sciences Alliance. DDW

---

This article originally featured in the DDW Fall 2019 issue.

---

Dr Phil Jones is Chief Scientific Officer of BioAscent Discovery. He has more than 30 years’ medicinal chemistry and drug discovery experience from Roche, Organon, Schering-Plough, Merck and the University of Dundee including senior roles as Executive Director and Acting Site Head. He was a member of the Research Leadership Council at Schering-Plough. Numerous clinical candidates resulted from groups for which he was responsible. These span a broad range of target families and therapeutic areas.

Dr Stuart McElroy is Director of Biosciences at BioAscent Discovery. With more than 12 years working in drug discovery, Stuart has extensive experience of developing and trouble-shooting novel screening assays, designing screening cascades, compound screening, hit validation and supporting hit to lead and lead optimisation programmes. Throughout the five years of the European Lead Factory project, he held the position of Head of Biology at the European Screening Centre (ESC), leading a team of bioscientists in prosecuting and triaging the output of more than 90 high throughput screens across all major target classes and disease indications.

References

1 Simpson, PB, Reichman, M. Nature Reviews Drug Discovery, 2014, 13(1), 3-4.

2 Urquhart, L. Nature Reviews Drug Discovery, 2019, 18(4), 245.

3 Drewry, DH, Wells, CI, Zuercher, WJ, Willson, TM. SLAS Discovery, 2019, 1-10. DOI: 10.1177/2472555219838210.

4 Waldmann, H, Valeur, E, Guéret, SM, Adihou, H, Gopalakrishnan, R, Lemurell, M, Grossmann, TN, Plowright, AT. Angew. Chem. Int. Ed., 2017, 56(35), 10249-10323.

5 Scannell, J. Forbes, 2015, onforb.es/1ZdvQbc.

6 Editorial, Nature Biotechnology, 2014, 32(2), 109.

7 Association of British Pharmaceutical Industries report http://www.abpi.org.uk/media/1372/the-changing-uk-drug-discovery-landscape.pdf.

8 Jones, PS, McElroy, S, Morrison, A, Pannifer, A. Future Med. Chem., 2015, 7(14), 1847-1852.

9 Ellis, J. Biocompare, https://www.biocompare.com/Editorial-Articles/359206-New-Trends-in-Compound-Management/.

10 McElroy, SP, Jones, PS, Barrault, DV. Drug Discovery Today, 2017, 22(2), 199-203. DOI: 10.1016/j.drudis.2016.09.028.

11 Giordanetto, F, Jones, P, Nelson, A, Benningshof, J, Muller, G, Pannifer, A, van Boeckel, S, Tzalis, D. Comprehensive Medicinal Chemistry III, Chapter 1.18, 505-519; Eds. S. Chackalamannil, D. Rotella, S. Ward; Oxford: Elsevier; 2017.

12 Baell, J, Walters, MA. Nature, 2014, 513, 481-483.

13 Morreale, FE, Testa, A, Chaugule, VK, Bortoluzzi, A, Ciulli, A, Walden, H. J. Med. Chem., 2017, 60, 8183-8191.

14 McGovern, SL, Caselli, E, Grigorieff, N, Shoichet, BK. J. Med. Chem., 2002, 45, 1712-1722.

15 Blevitt, JM, Hack, MD, Herman, KL, Jackson, PF, Krawczuk, PJ, Lebsack, AD, Liu, AX, Mirzadegan, T, Nelen, MI, Patrick, AN, Steinbacher, S, Milla, ME, Lumb, KJ. J. Med. Chem., 2017, 60, 3511-3517.

16 Baillie, G, Morrison, A, McElroy, S. 6th Novalix Conferences – Biophysics in Drug Discovery, March 20-22, 2019.

17 Paillard, G, Cochrane, P, Jones, PS, van Hoorn, WP, Caracoti, A, van Vlijmen, H, Pannifer, AD. Drug Discovery Today, 2016, 21(1), 97-102.