
Interaction (statistics)

In statistics, an interaction may arise when considering the relationship among three or more variables, and describes a situation in which the effect of one causal variable on an outcome depends on the state of a second causal variable. Although commonly thought of in terms of causal relationships, the concept of an interaction can also describe non-causal associations. Interactions are often considered in the context of regression analyses or factorial experiments.

Introduction
An interaction variable or interaction feature is a variable constructed from an original set of variables to try to represent either all of the interaction present or some part of it. In exploratory statistical analyses, it is common to use products of the original variables as the basis for testing whether interaction is present, with the possibility of substituting other, more realistic interaction variables at a later stage. When there are more than two explanatory variables, several interaction variables are constructed: pairwise products represent pairwise interactions, and higher-order products represent higher-order interactions.

Thus, for a response Y and two variables x1 and x2, an additive model would be:

Y = c + ax_1 + bx_2 + \text{error}

In contrast,

Y = c + ax_1 + bx_2 + d(x_1\times x_2) + \text{error}

is an example of a model with an interaction between variables x1 and x2 ("error" refers to the random variable whose value is the amount by which Y differs from the expected value of Y; see errors and residuals in statistics). Models are often presented without the interaction term d(x_1\times x_2), but this confounds the main effect and the interaction effect (i.e., without the interaction term, any main effect found may actually be due to an interaction). Moreover, the hierarchical principle states that if a model includes an interaction between variables, it must also include the main effects, regardless of their own statistical significance.
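As an illustration, a model of this form can be fit by ordinary least squares on a design matrix that includes the product term. The following is a minimal Python sketch (not from the article) using NumPy and simulated data with made-up coefficients:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500

# Simulated predictors and a response generated with a known interaction:
# Y = c + a*x1 + b*x2 + d*(x1*x2) + error, with c=2, a=1, b=-0.5, d=3
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y = 2.0 + 1.0 * x1 - 0.5 * x2 + 3.0 * (x1 * x2) + rng.normal(scale=0.1, size=n)

# Design matrix: constant, both main effects, and the product (interaction) term
X = np.column_stack([np.ones(n), x1, x2, x1 * x2])
c, a, b, d = np.linalg.lstsq(X, y, rcond=None)[0]
# The estimate of d recovers the interaction coefficient (about 3 here)
```

Note that the main-effect columns are retained alongside the product term, in keeping with the hierarchical principle.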
In modeling
In ANOVA

A simple setting in which interactions can arise is a two-factor experiment analyzed using analysis of variance (ANOVA). Suppose we have two binary factors A and B. For example, these factors might indicate whether either of two treatments was administered to a patient, with the treatments applied either singly or in combination. We can then consider the average treatment response (e.g. the symptom levels following treatment) for each patient, as a function of the treatment combination that was administered. The following table shows one possible situation:

           B = 0    B = 1
  A = 0      7        6
  A = 1      5        4

In this example, there is no interaction between the two treatments: their effects are additive. The reason for this is that the difference in mean response between those subjects receiving treatment A and those not receiving treatment A is −2 regardless of whether treatment B is administered (−2 = 4 − 6) or not (−2 = 5 − 7). Note that it automatically follows that the difference in mean response between those subjects receiving treatment B and those not receiving treatment B is likewise the same regardless of whether treatment A is administered (7 − 6 = 5 − 4).

In contrast, if the following average responses are observed, then there is an interaction between the treatments: their effects are not additive.

           B = 0    B = 1
  A = 0      1        4
  A = 1      7        6

Supposing that greater numbers correspond to a better response, in this situation treatment B is helpful on average if the subject is not also receiving treatment A, but is detrimental on average if given in combination with treatment A. Treatment A is helpful on average regardless of whether treatment B is also administered, but it is more helpful in both absolute and relative terms if given alone rather than in combination with treatment B. Similar observations are made for this particular example in the next section.

Qualitative and quantitative interactions

In many applications it is useful to distinguish between qualitative and quantitative interactions.
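The additivity check in the first example can be written out directly. A small sketch using the cell means quoted above (7, 6, 5, 4):

```python
# Mean responses from the additive example: means[(a, b)] is the average
# response when treatment A is at level a and treatment B is at level b.
means = {(0, 0): 7, (0, 1): 6, (1, 0): 5, (1, 1): 4}

effect_A_without_B = means[(1, 0)] - means[(0, 0)]  # 5 - 7 = -2
effect_A_with_B = means[(1, 1)] - means[(0, 1)]     # 4 - 6 = -2

# The interaction contrast (difference in differences) is zero exactly
# when the effects of the two treatments are additive.
interaction_contrast = effect_A_with_B - effect_A_without_B  # 0
```

By symmetry of the contrast, the same conclusion follows when the roles of A and B are exchanged, which is why additivity does not depend on the order in which the factors are considered.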
A quantitative interaction between A and B is a situation where the magnitude of the effect of B depends on the value of A, but the direction of the effect of B is constant for all A. A qualitative interaction between A and B refers to a situation where both the magnitude and the direction of each variable's effect can depend on the value of the other variable. The table of means on the left, below, shows a quantitative interaction: treatment A is beneficial both when B is given and when B is not given, but the benefit is greater when B is not given (i.e. when A is given alone). The table of means on the right shows a qualitative interaction: A is harmful when B is given, but beneficial when B is not given. Note that the same interpretation would hold if we consider the benefit of B based on whether A is given.

The distinction between qualitative and quantitative interactions depends on the order in which the variables are considered (in contrast, the property of additivity is invariant to the order of the variables). In the following table, if we focus on the effect of treatment A, there is a quantitative interaction: giving treatment A will improve the outcome on average regardless of whether treatment B is or is not already being given (although the benefit is greater if treatment A is given alone). However, if we focus on the effect of treatment B, there is a qualitative interaction: giving treatment B to a subject who is already receiving treatment A will (on average) make things worse, whereas giving treatment B to a subject who is not receiving treatment A will improve the outcome on average.

Unit treatment additivity

In its simplest form, the assumption of treatment unit additivity states that the observed response y_ij from experimental unit i when receiving treatment j can be written as the sum y_ij = y_i + t_j. In the social sciences, an interaction of this kind is often described as moderation: a moderator is a variable that affects the strength of the relationship between two other variables.
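The classification above can be sketched as a small Python function. This is an illustration, not from the article; it applies a simplified rule with respect to one focal treatment, and the second and third numeric examples below use hypothetical cell means chosen to be consistent with the qualitative descriptions in the text:

```python
def interaction_type(ctrl_b0, ctrl_b1, trt_b0, trt_b1):
    """Classify the interaction in a 2x2 table of mean responses with
    respect to a focal treatment. ctrl_* / trt_* are the mean responses
    without / with the focal treatment, at each level of the other factor."""
    effect_b0 = trt_b0 - ctrl_b0  # focal effect, other factor absent
    effect_b1 = trt_b1 - ctrl_b1  # focal effect, other factor present
    if effect_b0 == effect_b1:
        return "none (additive)"
    if (effect_b0 > 0) == (effect_b1 > 0):
        return "quantitative"  # same direction, different magnitude
    return "qualitative"       # direction reverses

# Additive example from the previous section (means 7, 6, 5, 4):
interaction_type(7, 6, 5, 4)  # "none (additive)"
```

With hypothetical means 1, 4, 7, 6 (A absent/present in rows, B absent/present in columns), focusing on A gives `interaction_type(1, 4, 7, 6)` = "quantitative", while reading the same table with B as the focal treatment, `interaction_type(1, 7, 4, 6)`, gives "qualitative", illustrating how the classification depends on the order in which the variables are considered.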
Designed experiments

Genichi Taguchi contended that interactions could be eliminated from a system by an appropriate choice of response variable and transformation. However, George Box and others have argued that this is not the case in general.

Model size

Given n predictors, the number of terms in a linear model that includes a constant, every predictor, and every possible interaction is

\tbinom{n}{0} + \tbinom{n}{1} + \tbinom{n}{2} + \cdots + \tbinom{n}{n} = 2^n.

Since this quantity grows exponentially, it readily becomes impractically large. One method to limit the size of the model is to limit the order of interactions. For example, if only two-way interactions are allowed, the number of terms becomes

\tbinom{n}{0} + \tbinom{n}{1} + \tbinom{n}{2} = 1 + \tfrac{1}{2}n + \tfrac{1}{2}n^2.

The table below shows the number of terms for each number of predictors and maximum order of interaction.

In regression

The most general approach to modeling interaction effects involves regression, starting from the elementary version given above:

Y = c + ax_1 + bx_2 + d(x_1\times x_2) + \text{error}

where the interaction term (x_1\times x_2) could be formed explicitly by multiplying two (or more) variables, or implicitly using factorial notation in modern statistical packages such as Stata. The components x1 and x2 may be measurements or {0,1} dummy variables in any combination. Interactions involving a dummy variable multiplied by a measurement variable are termed slope dummy variables, because they estimate and test the difference in slopes between groups 0 and 1.

When measurement variables are employed in interactions, it is often desirable to work with centered versions, where the variable's mean (or some other reasonably central value) is set as zero. Centering can make the main effects in interaction models more interpretable, as it reduces the multicollinearity between the interaction term and the main effects.
The coefficient a in the equation above, for example, represents the effect of x1 when x2 equals zero. Regression approaches to interaction modeling are very general because they can accommodate additional predictors and many alternative specifications or estimation strategies beyond ordinary least squares. Robust, quantile, and mixed-effects (multilevel) models are among the possibilities, as is generalized linear modeling encompassing a wide range of categorical, ordered, counted, or otherwise limited dependent variables. The graph depicts an education*politics interaction, from a probability-weighted logit regression analysis of survey data.
Interaction plots
Interaction plots, also called simple-slope plots, show possible interactions among variables.

Example: interaction of species and air temperature and their effect on body temperature

Consider a study of the body temperature of different species at different air temperatures, in degrees Fahrenheit. The data are shown in the table below. The interaction plot may use either the air temperature or the species as the x axis; the second factor is then represented by lines on the interaction plot. There is an interaction between the two factors (air temperature and species) in their effect on the response (body temperature), because the effect of the air temperature depends on the species. The interaction is indicated on the plot by the lines not being parallel.

Example: effect of stroke severity and treatment on recovery

As a second example, consider a clinical trial on the interaction between stroke severity and the efficacy of a drug on patient survival. The data are shown in the table below. In the interaction plot, the lines for the mild and moderate stroke groups are parallel, indicating that the drug has the same effect in both groups, so there is no interaction. The line for the severe stroke group is not parallel to the other lines, indicating that there is an interaction between stroke severity and drug effect on survival. The line for the severe stroke group is flat, indicating that, among these patients, there is no difference in survival between the drug and placebo treatments. In contrast, the lines for the mild and moderate stroke groups slope down to the right, indicating that, among these patients, the placebo group has lower survival than the drug-treated group.
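Parallelism of the lines in an interaction plot amounts to equal within-group treatment effects. A Python sketch of that check, using hypothetical survival proportions (the article's data table was not preserved, so these numbers are invented purely to reproduce the pattern described above):

```python
# Hypothetical mean survival by stroke severity and treatment; illustrative
# values only, chosen to match the description in the text.
survival = {
    "mild":     {"placebo": 0.70, "drug": 0.85},
    "moderate": {"placebo": 0.45, "drug": 0.60},
    "severe":   {"placebo": 0.20, "drug": 0.20},
}

# One line per severity group: lines are parallel exactly when the
# drug-minus-placebo effect is the same in every group.
effects = {g: round(v["drug"] - v["placebo"], 10) for g, v in survival.items()}

# mild and moderate share the same effect (parallel lines), while severe
# has a zero effect (a flat line), which breaks parallelism: interaction.
```

The `round` call guards against floating-point noise when comparing the group effects.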
Hypothesis tests for interactions
Analysis of variance and regression analysis are used to test for significant interactions.

Example: interaction of temperature and time in cookie baking

Is the yield of good cookies affected by the baking temperature and the time in the oven? The table shows data for 8 batches of cookies. The data show that the yield of good cookies is best when either (i) the temperature is high and the time in the oven is short, or (ii) the temperature is low and the time in the oven is long. If the cookies are left in the oven for a long time at a high temperature, there are burnt cookies and the yield is low. From the graph and the data, it is clear that the lines are not parallel, indicating that there is an interaction. This can be tested using analysis of variance (ANOVA). The first ANOVA model does not include the interaction term; that is, it ignores possible interaction. The second ANOVA model includes the interaction term; that is, it explicitly performs a hypothesis test for interaction.

ANOVA model 1: no interaction term; yield ~ temperature + time

In the ANOVA model that ignores interaction, neither temperature nor time has a significant effect on yield (p = 0.91), which is clearly the incorrect conclusion. The more appropriate ANOVA model should test for possible interaction.

ANOVA model 2: interaction term included; yield ~ temperature * time

The temperature:time interaction term is significant (p = 0.000180). Based on the interaction test and the interaction plot, it appears that the effect of time on yield depends on temperature and vice versa.
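The sum-of-squares decomposition behind these two models can be sketched directly. The yields below are hypothetical (the article's data table was not preserved); they are invented only to match the described pattern, with high yield for the (high, short) and (low, long) cells:

```python
import numpy as np

# Hypothetical yields of good cookies for 8 batches, 2 per (temperature, time)
# cell; illustrative values only, following the pattern described in the text.
cells = {
    ("low",  "short"): [20, 24],
    ("low",  "long"):  [81, 85],
    ("high", "short"): [80, 84],
    ("high", "long"):  [22, 26],
}
all_y = [y for ys in cells.values() for y in ys]
grand = np.mean(all_y)

def main_effect_ss(level_of):
    """Between-level sum of squares for one factor."""
    levels = {}
    for key, ys in cells.items():
        levels.setdefault(level_of(key), []).extend(ys)
    return sum(len(ys) * (np.mean(ys) - grand) ** 2 for ys in levels.values())

ss_temp = main_effect_ss(lambda k: k[0])  # temperature main effect
ss_time = main_effect_ss(lambda k: k[1])  # time main effect
ss_cells = sum(len(ys) * (np.mean(ys) - grand) ** 2 for ys in cells.values())
ss_interaction = ss_cells - ss_temp - ss_time
```

With a crossed pattern like this, nearly all of the between-cell variation is carried by the interaction term, which is why the additive model (model 1) finds no significant main effects while the interaction model (model 2) does.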
Examples
Real-world examples of interaction include:

• Interaction between adding sugar to coffee and stirring the coffee. Neither of the two individual variables has much effect on sweetness, but a combination of the two does.
• Interaction between adding carbon to steel and quenching. Neither of the two individually has much effect on strength, but a combination of the two has a dramatic effect.
• Interaction between smoking and inhaling asbestos fibres: both raise lung carcinoma risk, but exposure to asbestos multiplies the cancer risk in smokers and non-smokers. Here, the joint effect of inhaling asbestos and smoking is higher than the sum of both effects.
• Interaction between genetic risk factors for type 2 diabetes and diet (specifically, a "western" dietary pattern). The western dietary pattern was shown to increase diabetes risk for subjects with a high "genetic risk score", but not for other subjects.
• Interaction between education and political orientation, affecting general-public perceptions about climate change. For example, US surveys often find that acceptance of the reality of anthropogenic climate change rises with education among moderate or liberal survey respondents, but declines with education among the most conservative. Similar interactions have been observed to affect some non-climate science or environmental perceptions, and to operate with science literacy or other knowledge indicators in place of education.