High-throughput screening (HTS) is a method for scientific experimentation especially used in drug discovery and relevant to the fields of biology, materials science and chemistry. Using robotics, data processing/control software, liquid handling devices, and sensitive detectors, high-throughput screening allows a researcher to quickly conduct millions of chemical, genetic, or pharmacological tests. Through this process one can quickly recognize active compounds, antibodies, or genes that modulate a particular biomolecular pathway. The results of these experiments provide starting points for drug design and for understanding the interaction or role of a particular biochemical process.
The key labware or testing vessel of HTS is the microtiter plate, which is a small container, usually disposable and made of plastic, that features a grid of small, open divots called wells. In general, microplates for HTS have either 96, 192, 384, 1536, 3456 or 6144 wells. These are all multiples of 96, reflecting the original 96-well microplate, whose wells are arranged in an 8 × 12 grid at 9 mm spacing. Most of the wells contain test items, depending on the nature of the experiment. These could be different chemical compounds dissolved e.g. in an aqueous solution of dimethyl sulfoxide (DMSO). The wells could also contain cells or enzymes of some type. (The other wells may be empty or contain pure solvent or untreated samples, intended for use as experimental controls.)
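The row-letter/column-number convention used on such plates (wells "A01" through "H12" on a 96-well plate) can be sketched with two small helpers. These functions are illustrative, not from any plate-handling library:

```python
# Minimal sketch (hypothetical helpers) of the standard well naming on a
# microtiter plate: rows are letters, columns are 1-based numbers.
def well_to_indices(label: str) -> tuple[int, int]:
    """'B07' -> (1, 6): zero-based (row, column) indices."""
    return ord(label[0].upper()) - ord("A"), int(label[1:]) - 1

def indices_to_well(row: int, col: int) -> str:
    """(1, 6) -> 'B07' (two-digit column, as printed on most plates)."""
    return f"{chr(ord('A') + row)}{col + 1:02d}"
```

On the original 96-well layout this maps the last well, row 7 / column 11, to "H12".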
A screening facility typically holds a library of stock plates, whose contents are carefully catalogued, and each of which may have been created by the lab or obtained from a commercial source. These stock plates themselves are not directly used in experiments; instead, separate assay plates are created as needed. An assay plate is simply a copy of a stock plate, created by pipetting a small amount of liquid (often measured in nanoliters) from the wells of a stock plate to the corresponding wells of a completely empty plate.
To prepare for an assay, the researcher fills each well of the plate with some biological entity that they wish to conduct the experiment upon, such as a protein, cells, or an animal embryo. After some incubation time has passed to allow the biological matter to absorb, bind to, or otherwise react (or fail to react) with the compounds in the wells, measurements are taken across all the plate's wells, either manually or by a machine. Manual measurements are often necessary when the researcher is using microscopy to (for example) seek changes or defects in embryonic development caused by the wells' compounds, looking for effects that a computer could not easily determine by itself. Otherwise, a specialized automated analysis machine can run a number of experiments on the wells (such as shining polarized light on them and measuring reflectivity, which can be an indication of protein binding). In this case, the machine outputs the result of each experiment as a grid of numeric values, with each number mapping to the value obtained from a single well. A high-capacity analysis machine can measure dozens of plates in the space of a few minutes like this, generating thousands of experimental datapoints very quickly.
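The grid of numeric values that an automated reader emits maps naturally onto well labels, so downstream analysis can work with per-well values. A hedged sketch, assuming the reader output is simply one list of floats per plate row:

```python
# Illustrative conversion of a plate reader's numeric grid (one row of
# floats per plate row) into a {well_label: value} mapping for analysis.
# The input format here is an assumption for illustration.
def grid_to_wells(grid: list[list[float]]) -> dict[str, float]:
    wells = {}
    for r, row in enumerate(grid):
        for c, value in enumerate(row):
            wells[f"{chr(ord('A') + r)}{c + 1:02d}"] = value
    return wells

readout = [[0.12, 0.98], [0.11, 0.13]]  # toy 2x2 excerpt of a plate
values = grid_to_wells(readout)
# values["A02"] -> 0.98, the well with the strongest signal
```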
Depending on the results of this first assay, the researcher can perform follow up assays within the same screen by "cherrypicking" liquid from the source wells that gave interesting results (known as "hits") into new assay plates, and then re-running the experiment to collect further data on this narrowed set, confirming and refining observations.
Automation is an essential element in HTS's usefulness. Typically, an integrated robot system consisting of one or more robots transports assay-microplates from station to station for sample and reagent addition, mixing, incubation, and finally readout or detection. An HTS system can usually prepare, incubate, and analyze many plates simultaneously, further speeding the data-collection process. HTS robots that can test up to 100,000 compounds per day currently exist. Automatic colony pickers pick thousands of microbial colonies for high throughput genetic screening. The term uHTS or ultra-high-throughput screening refers (circa 2008) to screening in excess of 100,000 compounds per day.
With the ability to rapidly screen diverse compounds (such as small molecules or siRNAs) to identify active compounds, HTS has led to an explosion in the rate of data generated in recent years. Consequently, one of the most fundamental challenges in HTS experiments is to glean biochemical significance from mounds of data, which relies on the development and adoption of appropriate experimental designs and analytic methods for both quality control and hit selection. HTS research is one of the fields with a characteristic described by John Blume, Chief Science Officer for Applied Proteomics, Inc., as follows: soon, if a scientist does not understand some statistics or rudimentary data-handling technologies, he or she may not be considered a true molecular biologist and, thus, will simply become "a dinosaur."
High-quality HTS assays are critical in HTS experiments. The development of high-quality HTS assays requires the integration of both experimental and computational approaches for quality control (QC). Three important means of QC are (i) good plate design, (ii) the selection of effective positive and negative chemical/biological controls, and (iii) the development of effective QC metrics to measure the degree of differentiation so that assays with inferior data quality can be identified. A good plate design helps to identify systematic errors (especially those linked with well position) and determine what normalization should be used to remove/reduce the impact of systematic errors on both QC and hit selection.
Effective analytic QC methods serve as a gatekeeper for excellent quality assays. In a typical HTS experiment, a clear distinction between a positive control and a negative reference such as a negative control is an index for good quality. Many quality-assessment measures have been proposed to measure the degree of differentiation between a positive control and a negative reference. Signal-to-background ratio, signal-to-noise ratio, signal window, assay variability ratio, and Z-factor have been adopted to evaluate data quality. Strictly standardized mean difference (SSMD) has recently been proposed for assessing data quality in HTS assays.
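Two of the metrics named above, the Z-factor and SSMD, can be computed directly from the positive- and negative-control wells of a plate. A minimal sketch under their usual definitions (Z = 1 − 3(σp + σn)/|μp − μn|; SSMD = (μp − μn)/√(σp² + σn²)):

```python
import statistics

# Illustrative QC metrics computed from control wells; not taken from
# any screening software package.
def z_factor(pos: list[float], neg: list[float]) -> float:
    """Z-factor: values near 1 indicate a wide separation band."""
    mp, mn = statistics.mean(pos), statistics.mean(neg)
    sp, sn = statistics.stdev(pos), statistics.stdev(neg)
    return 1 - 3 * (sp + sn) / abs(mp - mn)

def ssmd(pos: list[float], neg: list[float]) -> float:
    """SSMD: mean difference scaled by the combined variability."""
    mp, mn = statistics.mean(pos), statistics.mean(neg)
    sp, sn = statistics.stdev(pos), statistics.stdev(neg)
    return (mp - mn) / (sp ** 2 + sn ** 2) ** 0.5
```

For example, positive controls [9, 10, 11] against negative controls [−1, 0, 1] give a Z-factor of 0.4.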
A compound with a desired size of effects in an HTS is called a hit. The process of selecting hits is called hit selection. The analytic methods for hit selection in screens without replicates (usually in primary screens) differ from those with replicates (usually in confirmatory screens). For example, the z-score method is suitable for screens without replicates whereas the t-statistic is suitable for screens with replicates. The calculation of SSMD for screens without replicates also differs from that for screens with replicates.
For hit selection in primary screens without replicates, the easily interpretable measures are average fold change, mean difference, percent inhibition, and percent activity. However, they do not capture data variability effectively. The z-score method and SSMD can capture data variability, based on the assumption that every compound has the same variability as a negative reference in the screens. However, outliers are common in HTS experiments, and methods such as the z-score are sensitive to outliers and can be problematic. As a consequence, robust methods such as the z*-score method, SSMD*, the B-score method, and quantile-based methods have been proposed and adopted for hit selection.
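The contrast between the z-score and its robust z*-score counterpart can be sketched as follows: the z-score normalizes against the mean and standard deviation of the negative reference, while the z*-score substitutes the median and the MAD (scaled by the usual 1.4826 consistency factor), which outliers barely move:

```python
import statistics

# Illustrative hit-selection scores for a screen without replicates,
# each normalized against a negative reference (see assumption above).
def z_scores(samples: list[float], neg: list[float]) -> list[float]:
    mu, sd = statistics.mean(neg), statistics.stdev(neg)
    return [(x - mu) / sd for x in samples]

def z_star_scores(samples: list[float], neg: list[float]) -> list[float]:
    """Outlier-robust variant: median and scaled MAD replace mean/stdev."""
    med = statistics.median(neg)
    mad = statistics.median(abs(x - med) for x in neg)
    return [(x - med) / (1.4826 * mad) for x in samples]
```

A compound well would then be flagged as a hit when its score exceeds a chosen cutoff (e.g. |z| ≥ 3).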
In a screen with replicates, we can directly estimate variability for each compound; as a consequence, we should use the SSMD or t-statistic, which do not rely on the strong assumption that the z-score and z*-score require. One issue with the use of the t-statistic and associated p-values is that they are affected by both sample size and effect size. They come from testing for no mean difference and thus are not designed to measure the size of compound effects. For hit selection, the major interest is the size of effect in a tested compound. SSMD directly assesses the size of effects, and has also been shown to be better than other commonly used effect-size measures. The population value of SSMD is comparable across experiments, so we can use the same cutoff for the population value of SSMD to measure the size of compound effects.
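With replicates, each compound's effect can be estimated from its own replicate values (here taken as differences from the negative reference). A minimal sketch under the usual definitions; note how the t-statistic scales with √n while SSMD does not, which is why p-values mix effect size with sample size as noted above:

```python
import statistics

# Illustrative per-compound statistics for a screen with replicates.
# `diffs` holds one compound's replicate differences from the negative
# reference (an assumed input layout, for illustration).
def ssmd_replicates(diffs: list[float]) -> float:
    """Estimated SSMD: mean difference over its standard deviation."""
    return statistics.mean(diffs) / statistics.stdev(diffs)

def t_statistic(diffs: list[float]) -> float:
    """One-sample t-statistic: grows with sqrt(n) at fixed effect size."""
    n = len(diffs)
    return statistics.mean(diffs) / (statistics.stdev(diffs) / n ** 0.5)
```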
Unique distributions of compounds across one or many plates can be employed either to increase the number of assays per plate or to reduce the variance of assay results, or both. The simplifying assumption made in this approach is that any N compounds in the same well will not typically interact with each other, or the assay target, in a manner that fundamentally changes the ability of the assay to detect true hits.
For example, imagine a plate wherein compound A is in wells 1-2-3, compound B is in wells 2-3-4, and compound C is in wells 3-4-5. In an assay of this plate against a given target, a hit in wells 2, 3, and 4 would indicate that compound B is the most likely agent, while also providing three measurements of compound B's efficacy against the specified target. Commercial applications of this approach involve combinations in which no two compounds ever share more than one well, to reduce the (second-order) possibility of interference between pairs of compounds being screened.
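The deconvolution step in this example can be sketched directly: each compound occupies a known set of wells, and the compound whose wells are all hits is the most likely active agent. The layout below is the toy example from the text, not a commercial pooling scheme:

```python
# Toy deconvolution of the overlapping-pool design described above.
layout = {"A": {1, 2, 3}, "B": {2, 3, 4}, "C": {3, 4, 5}}

def deconvolve(hit_wells: set[int],
               layout: dict[str, set[int]]) -> list[str]:
    """Return compounds all of whose wells registered as hits."""
    return [c for c, wells in layout.items() if wells <= hit_wells]

deconvolve({2, 3, 4}, layout)  # -> ['B']
```

With hits in wells 2, 3, and 4, only compound B has all of its wells among the hits, matching the reasoning in the text.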
Automation and low volume assay formats were leveraged by scientists at the NIH Chemical Genomics Center (NCGC) to develop quantitative HTS (qHTS), a paradigm to pharmacologically profile large chemical libraries through the generation of full concentration-response relationships for each compound. With accompanying curve fitting and cheminformatics software qHTS data yields half maximal effective concentration (EC50), maximal response, Hill coefficient (nH) for the entire library enabling the assessment of nascent structure activity relationships (SAR).
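The curve fitting behind qHTS can be illustrated with the Hill (log-logistic) model, from which EC50 and the Hill coefficient nH are read off. The coarse grid search below is a stand-in for real curve-fitting software, written with no external libraries; it is a sketch, not the NCGC's actual pipeline:

```python
# Illustrative Hill-model fit for one compound's concentration-response
# data. Parameters `top`/`bottom` default to a normalized 0-1 response
# (an assumption for this sketch).
def hill(conc: float, ec50: float, n_h: float,
         top: float = 1.0, bottom: float = 0.0) -> float:
    return bottom + (top - bottom) / (1 + (ec50 / conc) ** n_h)

def fit_ec50(concs: list[float], responses: list[float]):
    """Coarse least-squares grid search over (EC50, nH) candidates."""
    best = None
    for ec50 in [c * s for c in concs for s in (0.5, 1.0, 2.0)]:
        for n_h in (0.5, 1.0, 1.5, 2.0, 3.0):
            sse = sum((hill(c, ec50, n_h) - r) ** 2
                      for c, r in zip(concs, responses))
            if best is None or sse < best[0]:
                best = (sse, ec50, n_h)
    return best[1], best[2]  # (EC50 estimate, nH estimate)
```

Running the whole library through such a fit, one curve per compound, is what makes the per-compound EC50, maximal response, and nH values described above available for SAR analysis.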
In March 2010, research was published demonstrating an HTS process allowing 1,000 times faster screening (100 million reactions in 10 hours) at 1-millionth the cost (using 10−7 times the reagent volume) than conventional techniques using drop-based microfluidics. Drops of fluid separated by oil replace microplate wells and allow analysis and hit sorting while reagents are flowing through channels.
In 2010, researchers developed a silicon sheet of lenses that can be placed over microfluidic arrays to allow the fluorescence measurement of 64 different output channels simultaneously with a single camera. This process can analyze 200,000 drops per second.
In 2013, researchers disclosed an approach with small molecules derived from plants. In general, it is essential to provide high-quality proof-of-concept validations early in the drug discovery process. Here, technologies that enable the identification of potent, selective, and bioavailable chemical probes are of crucial interest, even if the resulting compounds require further optimization for development into a pharmaceutical product. The nuclear receptor RORα, a protein targeted for more than a decade in the search for potent and bioavailable agonists, was used as an example of a very challenging drug target. Hits are confirmed at the screening step itself, owing to the bell-shaped concentration-response curve. This method is very similar to quantitative HTS (screening and hit confirmation at the same time), except that it greatly decreases the number of data points and can easily screen more than 100,000 biologically relevant compounds.
Whereas traditional HTS drug discovery uses purified proteins or intact cells, more recent developments of the technology involve intact living organisms, such as the nematode Caenorhabditis elegans and the zebrafish (Danio rerio).
In 2016-2018, plate manufacturers began producing specialized chemistry for the mass production of ultra-low-adherent, cell-repellent surfaces, which facilitated the rapid development of HTS-amenable assays addressing cancer drug discovery in 3D tissues such as organoids and spheroids, a more physiologically relevant format.
HTS is a relatively recent innovation, made feasible largely through modern advances in robotics and high-speed computer technology. It still takes a highly specialized and expensive screening lab to run an HTS operation, so in many cases a small- to moderate-size research institution will use the services of an existing HTS facility rather than set up one for itself.
There is a trend in academia for universities to become their own drug discovery enterprises. Screening facilities, normally found only in industry, are now increasingly found at universities as well. UCLA, for example, features an open-access HTS laboratory, the Molecular Screening Shared Resources (MSSR, UCLA), which can routinely screen more than 100,000 compounds a day. The open-access policy ensures that researchers from all over the world can take advantage of this facility without lengthy intellectual property negotiations. With a compound library of over 200,000 small molecules, the MSSR has one of the largest compound decks of any university on the west coast. The MSSR also offers full functional genomics capabilities (genome-wide siRNA, shRNA, cDNA and CRISPR), which are complementary to small-molecule efforts: functional genomics leverages HTS capabilities to execute genome-wide screens that examine the function of each gene in the context of interest by either knocking each gene out or overexpressing it. Parallel access to a high-throughput small-molecule screen and a genome-wide screen enables researchers to perform target identification and validation for a given disease, or to determine the mode of action of a small molecule. The most accurate results are obtained with "arrayed" functional genomics libraries, i.e. libraries in which each well contains a single construct, such as a single siRNA or cDNA. Functional genomics is typically paired with high-content screening using, e.g., epifluorescent microscopy or laser scanning cytometry.
The University of Illinois also has a facility for HTS, as does the University of Minnesota. The Life Sciences Institute at the University of Michigan houses the HTS facility in the Center for Chemical Genomics. Columbia University has an HTS shared resource facility with ~300,000 diverse small molecules and ~10,000 known bioactive compounds available for biochemical, cell-based and NGS-based screening. The Rockefeller University has an open-access HTS Resource Center HTSRC (The Rockefeller University, HTSRC), which offers a library of over 380,000 compounds. Northwestern University's High Throughput Analysis Laboratory supports target identification, validation, assay development, and compound screening. The non-profit Sanford Burnham Prebys Medical Discovery Institute also has a long-standing HTS facility in the Conrad Prebys Center for Chemical Genomics, which was part of the MLPCN. The non-profit Scripps Research Molecular Screening Center (SRMSC) continues to serve academia across institutes in the post-MLPCN era. The SRMSC uHTS facility maintains one of the largest library collections in academia, presently at well over 665,000 small-molecule entities, and routinely screens the full collection or sub-libraries in support of multi-PI grant initiatives.
In the United States, the National Institutes of Health or NIH has created a nationwide consortium of small-molecule screening centers to produce innovative chemical tools for use in biological research. The Molecular Libraries Probe Production Centers Network, or MLPCN, performs HTS on assays provided by the research community, against a large library of small molecules maintained in a central molecule repository.