Detection theory

Detection theory or signal detection theory is a means to measure the ability to differentiate between information-bearing patterns and random patterns that distract from the information.

Psychology
Signal detection theory (SDT) is used when psychologists want to measure the way we make decisions under conditions of uncertainty, such as how we perceive distances in foggy conditions or during eyewitness identification. SDT assumes that the decision maker is not a passive receiver of information, but an active decision-maker who makes difficult perceptual judgments under conditions of uncertainty. In foggy circumstances, we are forced to decide how far away an object is based solely upon a visual stimulus that is impaired by the fog. Since the brain uses the brightness of an object, such as a traffic light, to discriminate its distance, and fog reduces the brightness of objects, we perceive the object to be much farther away than it actually is (see also decision theory). According to SDT, during eyewitness identifications, witnesses base their decision as to whether a suspect is the culprit on their perceived level of familiarity with the suspect.

To apply signal detection theory to a data set where stimuli were either present or absent, and the observer categorized each trial as having the stimulus present or absent, the trials are sorted into one of four categories:

• Hit: the stimulus was present and the observer responded "present";
• Miss: the stimulus was present and the observer responded "absent";
• False alarm: the stimulus was absent and the observer responded "present";
• Correct rejection: the stimulus was absent and the observer responded "absent".

Based on the proportions of these types of trials, numerical estimates of sensitivity can be obtained with statistics like the sensitivity index d' and A', and response bias can be estimated with statistics like c and β.

Signal detection theory can also be applied to memory experiments, where items are presented on a study list for later testing. A test list is created by combining these 'old' items with novel, 'new' items that did not appear on the study list. On each test trial the subject responds 'yes, this was on the study list' or 'no, this was not on the study list'. Items presented on the study list are called Targets, and new items are called Distractors.
Saying 'yes' to a Target constitutes a Hit, while saying 'yes' to a Distractor constitutes a False Alarm.
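Under the standard equal-variance Gaussian model, the sensitivity index d' and the bias statistic c can be computed from raw counts of the four trial types. The following sketch uses only the Python standard library; the trial counts in the example are illustrative, and no correction for extreme (0 or 1) rates is applied.

```python
# Hedged sketch: d' and criterion c from the four trial counts,
# assuming the equal-variance Gaussian signal detection model.
from statistics import NormalDist

def dprime_and_c(hits, misses, false_alarms, correct_rejections):
    """Return (d', c) computed from raw counts of the four trial types."""
    # Hit rate and false-alarm rate (no correction for 0/1 rates here).
    hr = hits / (hits + misses)
    far = false_alarms / (false_alarms + correct_rejections)
    z = NormalDist().inv_cdf           # inverse of the standard normal CDF
    d_prime = z(hr) - z(far)           # separation of signal and noise means
    c = -0.5 * (z(hr) + z(far))        # criterion relative to the neutral point
    return d_prime, c

# Illustrative data: 50 target trials, 50 distractor trials.
d, c = dprime_and_c(hits=40, misses=10, false_alarms=10, correct_rejections=40)
```

With a hit rate of 0.8 and a false-alarm rate of 0.2, d' is about 1.68 and c is 0, i.e. the observer is unbiased.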
Applications
Signal detection theory has wide application, both in humans and animals. Topics include memory, the stimulus characteristics of schedules of reinforcement, and more.

Sensitivity or discriminability

Conceptually, sensitivity refers to how hard or easy it is to detect that a target stimulus is present against background events. For example, in a recognition memory paradigm, having longer to study to-be-remembered words makes it easier to recognize previously seen or heard words. In contrast, having to remember 30 words rather than 5 makes the discrimination harder. One of the most commonly used statistics for computing sensitivity is the so-called sensitivity index d'. There are also non-parametric measures, such as the area under the ROC curve.
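One common non-parametric alternative to d' is A', which estimates the area under the ROC curve from a single (hit rate, false-alarm rate) point. The formula below is the standard Pollack–Norman estimate, included here as a sketch rather than as the article's own method:

```python
# Sketch of the non-parametric sensitivity measure A' (Pollack & Norman),
# an estimate of the area under the ROC curve from one (H, F) point.
def a_prime(hit_rate, fa_rate):
    h, f = hit_rate, fa_rate
    if h == f:
        return 0.5                                            # chance performance
    if h > f:
        return 0.5 + ((h - f) * (1 + h - f)) / (4 * h * (1 - f))
    # Below-chance performance: apply the formula to the reflected point.
    return 0.5 - ((f - h) * (1 + f - h)) / (4 * f * (1 - h))
```

For a hit rate of 0.8 and a false-alarm rate of 0.2, A' is 0.875; chance performance gives 0.5, and perfect discrimination gives 1.0.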
Mathematics
MAP testing

In the case of making a decision between two hypotheses, H1, absent, and H2, present, in the event of a particular observation, y, a classical approach is to choose H1 when p(H1|y) > p(H2|y) and H2 in the reverse case. In the event that the two a posteriori probabilities are equal, one might default to a single choice (either always choose H1 or always choose H2), or might randomly select either H1 or H2. The a priori probabilities of H1 and H2 can guide this choice, e.g. by always choosing the hypothesis with the higher a priori probability.

When taking this approach, usually what one knows are the conditional probabilities, p(y|H1) and p(y|H2), and the a priori probabilities p(H1) = \pi_1 and p(H2) = \pi_2. In this case,

p(H1|y) = \frac{p(y|H1) \cdot \pi_1}{p(y)}, \quad p(H2|y) = \frac{p(y|H2) \cdot \pi_2}{p(y)}

where p(y) is the total probability of event y, p(y|H1) \cdot \pi_1 + p(y|H2) \cdot \pi_2.

H2 is chosen in case

\frac{p(y|H2) \cdot \pi_2}{p(y|H1) \cdot \pi_1 + p(y|H2) \cdot \pi_2} \ge \frac{p(y|H1) \cdot \pi_1}{p(y|H1) \cdot \pi_1 + p(y|H2) \cdot \pi_2} \Rightarrow \frac{p(y|H2)}{p(y|H1)} \ge \frac{\pi_1}{\pi_2}

and H1 otherwise. Often, the ratio \frac{\pi_1}{\pi_2} is called \tau_{MAP} and \frac{p(y|H2)}{p(y|H1)} is called L(y), the likelihood ratio. Using this terminology, H2 is chosen in case L(y) \ge \tau_{MAP}. This is called MAP testing, where MAP stands for "maximum a posteriori". Taking this approach minimizes the expected number of errors one will make.

Bayes criterion

In some cases, it is far more important to respond appropriately to H1 than it is to respond appropriately to H2. For example, if an alarm goes off, indicating H1 (an incoming bomber is carrying a nuclear weapon), it is much more important to shoot down the bomber if H1 = TRUE than it is to avoid sending a fighter squadron to inspect a false alarm (i.e., H1 = FALSE, H2 = TRUE), assuming a large supply of fighter squadrons.
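The MAP rule L(y) ≥ τ_MAP can be sketched concretely for two Gaussian observation models. The means, standard deviation, and priors below are illustrative assumptions, not values from the article:

```python
# Minimal sketch of MAP testing for two hypotheses with Gaussian
# likelihoods: H1 = stimulus absent (mean 0), H2 = present (mean mu).
# mu, sigma, and the priors pi1, pi2 are illustrative assumptions.
from statistics import NormalDist

def map_decide(y, mu=1.0, sigma=1.0, pi1=0.5, pi2=0.5):
    """Choose 'H2' iff L(y) = p(y|H2)/p(y|H1) >= tau_MAP = pi1/pi2."""
    p_y_h1 = NormalDist(0.0, sigma).pdf(y)   # likelihood under H1
    p_y_h2 = NormalDist(mu, sigma).pdf(y)    # likelihood under H2
    likelihood_ratio = p_y_h2 / p_y_h1
    tau_map = pi1 / pi2
    return "H2" if likelihood_ratio >= tau_map else "H1"
```

With equal priors the decision boundary sits halfway between the two means (y = mu/2), so observations above 0.5 are classified as H2. This rule minimizes error count but ignores asymmetric costs, which is exactly the gap the Bayes criterion addresses.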
The Bayes criterion is an approach suitable for such cases. Here a utility is associated with each of four situations:

• U_{11}: One responds with behavior appropriate to H1 and H1 is true: fighters destroy bomber, incurring fuel, maintenance, and weapons costs, and taking the risk of some being shot down;
• U_{12}: One responds with behavior appropriate to H1 and H2 is true: fighters sent out, incurring fuel and maintenance costs; the bomber location remains unknown;
• U_{21}: One responds with behavior appropriate to H2 and H1 is true: city destroyed;
• U_{22}: One responds with behavior appropriate to H2 and H2 is true: fighters stay home; the bomber location remains unknown.

As is shown below, what is important are the differences, U_{11} - U_{21} and U_{22} - U_{12}. Similarly, there are four probabilities, P_{11}, P_{12}, etc., for each of the cases (which are dependent on one's decision strategy). The Bayes criterion approach is to maximize the expected utility:

E\{U\} = P_{11} \cdot U_{11} + P_{21} \cdot U_{21} + P_{12} \cdot U_{12} + P_{22} \cdot U_{22}
E\{U\} = P_{11} \cdot U_{11} + (1 - P_{11}) \cdot U_{21} + P_{12} \cdot U_{12} + (1 - P_{12}) \cdot U_{22}
E\{U\} = U_{21} + U_{22} + P_{11} \cdot (U_{11} - U_{21}) - P_{12} \cdot (U_{22} - U_{12})

Effectively, one may maximize the sum

U' = P_{11} \cdot (U_{11} - U_{21}) - P_{12} \cdot (U_{22} - U_{12})

and make the following substitutions:

P_{11} = \pi_1 \cdot \int_{R_1} p(y|H1)\, dy
P_{12} = \pi_2 \cdot \int_{R_1} p(y|H2)\, dy

where \pi_1 and \pi_2 are the a priori probabilities, P(H1) and P(H2), and R_1 is the region of observation events, y, that are responded to as though H1 is true.
\Rightarrow U' = \int_{R_1} \left\{ \pi_1 \cdot (U_{11} - U_{21}) \cdot p(y|H1) - \pi_2 \cdot (U_{22} - U_{12}) \cdot p(y|H2) \right\} \, dy

U' and thus E\{U\} are maximized by extending R_1 over the region where

\pi_1 \cdot (U_{11} - U_{21}) \cdot p(y|H1) - \pi_2 \cdot (U_{22} - U_{12}) \cdot p(y|H2) > 0

This is accomplished by deciding H2 in case

\pi_2 \cdot (U_{22} - U_{12}) \cdot p(y|H2) \ge \pi_1 \cdot (U_{11} - U_{21}) \cdot p(y|H1) \Rightarrow L(y) \equiv \frac{p(y|H2)}{p(y|H1)} \ge \frac{\pi_1 \cdot (U_{11} - U_{21})}{\pi_2 \cdot (U_{22} - U_{12})} \equiv \tau_B

and H1 otherwise, where L(y) is the so-defined likelihood ratio.
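The Bayes-criterion rule L(y) ≥ τ_B can be sketched in the same Gaussian setting as before. The utilities and distribution parameters below are illustrative assumptions chosen to echo the bomber example (a miss under H1 is catastrophic, a false alarm is cheap); only the differences U_{11} - U_{21} and U_{22} - U_{12} enter the threshold:

```python
# Sketch of the Bayes-criterion decision rule. H1 = bomber present
# (observations ~ N(mu1, sigma)), H2 = absent (~ N(mu2, sigma)).
# All utilities and parameters are illustrative assumptions.
from statistics import NormalDist

def bayes_decide(y, mu1=1.0, mu2=0.0, sigma=1.0, pi1=0.5, pi2=0.5,
                 u11=0.0, u21=-100.0, u22=0.0, u12=-1.0):
    """Choose 'H2' iff L(y) >= tau_B = pi1*(U11-U21) / (pi2*(U22-U12))."""
    tau_b = (pi1 * (u11 - u21)) / (pi2 * (u22 - u12))   # here: 100
    lr = NormalDist(mu2, sigma).pdf(y) / NormalDist(mu1, sigma).pdf(y)
    return "H2" if lr >= tau_b else "H1"
```

Because U_{21} (city destroyed) is so much worse than U_{12} (wasted sortie), τ_B is large: even y = 0, the mean of the H2 distribution, still triggers the H1 response (scramble the fighters), and only strongly H2-like observations are treated as a false alarm. Setting all four utility differences equal recovers the plain MAP rule.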