Hotelling's T-square statistic
(noun)
A generalization of Student's $t$-statistic that allows for the testing of hypotheses on multiple (often correlated) measures within the same sample.
Examples of Hotelling's T-square statistic in the following topics:
Multivariate Testing
- A generalization of Student's $t$-statistic, called Hotelling's $T$-square statistic, allows for the testing of hypotheses on multiple (often correlated) measures within the same sample.
- Hotelling's $T^2$ statistic follows a $T^2$ distribution.
- Hotelling's $T$-squared distribution is important because it arises as the distribution of a set of statistics which are natural generalizations of the statistics underlying Student's $t$-distribution.
- The test statistic is defined as $T^2 = n\,(\bar{\mathbf{x}} - \boldsymbol{\mu}_0)^{\top} S^{-1} (\bar{\mathbf{x}} - \boldsymbol{\mu}_0)$, where $\bar{\mathbf{x}}$ is the sample mean vector, $S$ is the sample covariance matrix, and $\boldsymbol{\mu}_0$ is the hypothesized mean vector.
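The statistic can be computed directly from the sample. Below is a minimal sketch in pure Python for the bivariate case; the sample values and the hypothesized mean vector are invented for illustration:

```python
import statistics

# Hypothetical bivariate sample: two correlated measures per subject.
x = [2.1, 2.5, 1.9, 2.8, 2.3]
y = [8.0, 9.5, 7.9, 8.8, 9.2]
n = len(x)
mu0 = (2.0, 8.0)  # hypothesized mean vector (assumption for illustration)

xbar, ybar = statistics.mean(x), statistics.mean(y)
sxx = statistics.variance(x)  # sample variances (n - 1 denominator)
syy = statistics.variance(y)
sxy = sum((a - xbar) * (b - ybar) for a, b in zip(x, y)) / (n - 1)

# Invert the 2x2 sample covariance matrix S.
det = sxx * syy - sxy * sxy
inv = ((syy / det, -sxy / det),
       (-sxy / det, sxx / det))

# T^2 = n * (xbar - mu0)^T S^{-1} (xbar - mu0)
d = (xbar - mu0[0], ybar - mu0[1])
t2 = n * (d[0] * (inv[0][0] * d[0] + inv[0][1] * d[1])
          + d[1] * (inv[1][0] * d[0] + inv[1][1] * d[1]))
print(round(t2, 3))
```

In practice the value is referred to Hotelling's $T^2$ distribution, or equivalently rescaled to an $F$ distribution, to obtain a $p$-value.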
The Root-Mean-Square
- The root-mean-square, also known as the quadratic mean, is a statistical measure of the magnitude of a varying quantity, or set of numbers.
- Its name comes from its definition as the square root of the mean of the squares of the values.
- Computing the average of a set of numbers containing both negative and positive values wouldn't tell us much about its magnitude, because the negative numbers cancel out the positive numbers, which can result in an average of zero.
- Physical scientists often use the term "root-mean-square" as a synonym for standard deviation when referring to the square root of the mean squared deviation of a signal from a given baseline or fit.
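A quick numeric illustration of this cancellation, with values chosen for the example:

```python
import math

values = [-3, 3, -4, 4, -5, 5]  # positives and negatives cancel in the mean

mean = sum(values) / len(values)
# Root-mean-square: square root of the mean of the squares.
rms = math.sqrt(sum(v * v for v in values) / len(values))

print(mean)  # 0.0 -- the plain average hides the magnitudes
print(rms)   # about 4.08 -- the RMS reflects the typical magnitude
```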
Structure of the Chi-Squared Test
- The chi-square ($\chi^2$) test is a nonparametric statistical technique used to determine if a distribution of observed frequencies differs from the theoretical expected frequencies.
- Chi-square statistics use nominal (categorical) or ordinal level data.
- First, we calculate a chi-square test statistic.
- We may observe data that give us a high test-statistic just by chance, but the chi-square distribution shows us how likely it is.
- This is a property shared by the $t$-distribution.
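The test statistic itself is $\chi^2 = \sum (O - E)^2 / E$, summed over the categories. A minimal goodness-of-fit sketch with hypothetical die-roll counts (all numbers invented for illustration):

```python
# Hypothetical counts from 60 rolls of a die (assumption for illustration).
observed = [8, 12, 9, 11, 6, 14]
expected = [10] * 6  # a fair die: 60 rolls / 6 faces

# Chi-square statistic: sum of (observed - expected)^2 / expected.
chi2 = sum((o - e) ** 2 / e for o, e in zip(observed, expected))
df = len(observed) - 1  # categories minus one
print(chi2, df)
```

Comparing the statistic to the chi-square distribution with `df` degrees of freedom tells us how likely a value this large is under the null hypothesis.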
Root Mean Square Values
- The root mean square (abbreviated RMS or rms), also known as the quadratic mean, is a statistical measure of the magnitude of a varying quantity.
- The corresponding formula for a continuous function $f(t)$ defined over the interval $T_1 \le t \le T_2$ is: $f_{\mathrm{rms}} = \sqrt{\frac{1}{T_2 - T_1}\int_{T_1}^{T_2} [f(t)]^2\, dt}$
- In the expression $V = V_0 \sin(2\pi f t)$, $V$ is the voltage at time $t$, $V_0$ is the peak voltage, and $f$ is the frequency in hertz.
- Here, $I$ is the current at time $t$, and $I_0 = V_0/R$ is the peak current.
- Since $V_0$ is a constant, we can factor it out of the square root and use the trig identity $\sin^2\theta = \tfrac{1}{2}(1 - \cos 2\theta)$ to replace the squared sine function; averaged over a whole number of periods, this yields $V_{\mathrm{rms}} = V_0/\sqrt{2}$.
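The result $V_{\mathrm{rms}} = V_0/\sqrt{2}$ can be checked numerically; the peak voltage and frequency below are hypothetical:

```python
import math

V0 = 170.0  # hypothetical peak voltage
f = 60.0    # hypothetical frequency in hertz
T = 1.0 / f  # one period

# Numerically average V(t)^2 = (V0 * sin(2*pi*f*t))^2 over one period.
N = 100_000
mean_sq = sum((V0 * math.sin(2 * math.pi * f * (k / N) * T)) ** 2
              for k in range(N)) / N
v_rms = math.sqrt(mean_sq)

print(v_rms, V0 / math.sqrt(2))  # the two agree closely
```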
Quantitative or Qualitative Data?
- Paired and unpaired t-tests and z-tests are just some of the statistical tests that can be used to test quantitative data.
- A t-test is any statistical hypothesis test in which the test statistic follows a t distribution if the null hypothesis is supported.
- When the scaling term is unknown and is replaced by an estimate based on the data, the test statistic (under certain conditions) follows a $t$ distribution.
- One of the most common statistical tests for qualitative data is the chi-square test (both the goodness of fit test and test of independence).
- Plots of the t distribution for several different degrees of freedom.
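As a sketch of the idea above, where the unknown scaling term is replaced by a sample estimate, here is a one-sample $t$ statistic computed by hand in Python; the data and null-hypothesis mean are invented for illustration:

```python
import math
import statistics

# Hypothetical sample of a quantitative measure (assumption for illustration).
sample = [5.1, 4.8, 5.6, 5.0, 4.7, 5.3, 5.2, 4.9]
mu0 = 5.0  # null-hypothesis mean

n = len(sample)
xbar = statistics.mean(sample)
s = statistics.stdev(sample)  # unknown sigma replaced by the sample estimate

# Under H0 this statistic follows a t distribution with n - 1 df.
t = (xbar - mu0) / (s / math.sqrt(n))
df = n - 1
print(round(t, 3), df)
```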
Calculations for the t-Test: One Sample
- In each case, the formula for a test statistic that either exactly follows or closely approximates a $t$-distribution under the null hypothesis is given.
- Each of these statistics can be used to carry out either a one-tailed test or a two-tailed test.
- Once a $t$-value is determined, a $p$-value can be found using a table of values from Student's $t$-distribution.
- Let $\hat{\alpha}$ and $\hat{\beta}$ be least-squares estimators, and let $SE_\hat{\alpha}$ and $SE_\hat{\beta}$, respectively, be the standard errors of those least-squares estimators.
- Therefore, the sum of the squares of residuals, or $SSR$, is given by:
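As a minimal illustration of $SSR$, assuming a hypothetical fitted line $\hat{y} = \hat{\alpha} + \hat{\beta} x$ and invented data:

```python
# Hypothetical least-squares estimates (assumption for illustration).
alpha_hat, beta_hat = 1.0, 2.0
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.2, 2.9, 5.1, 6.8]

# Sum of squared residuals: SSR = sum over i of (y_i - y_hat_i)^2.
ssr = sum((y - (alpha_hat + beta_hat * x)) ** 2 for x, y in zip(xs, ys))
print(round(ssr, 4))
```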
Understanding regression output from software
- We will generally label the test statistic using a T, since it follows the t distribution.
- We previously used a t test statistic for hypothesis testing in the context of numerical data.
- Table 7.21 offers the degrees of freedom for the test statistic T: df = 25.
- We could have identified the t test statistic from the software output in Table 7.21, shown in the second row (unemp) and third column (t value).
- Table 7.23 shows statistical software output from fitting the least squares regression line shown in Figure 7.16.
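One detail worth knowing about such output: the reported t value column is simply the coefficient estimate divided by its standard error. A sketch with hypothetical numbers (not the values from the tables above):

```python
# Hypothetical row from regression output (assumption for illustration).
estimate = 1.32    # slope coefficient
std_error = 0.55   # its standard error

# The software's "t value" is estimate / std. error.
t_value = estimate / std_error
print(round(t_value, 2))
```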
Example: Test for Independence
- The chi-square test for independence is used to determine the relationship between two variables of a sample.
- To examine statistically whether boys got in trouble more often in school, we need to establish hypotheses for the question.
- The expected frequency for each cell is $E = \dfrac{\Sigma_r \, \Sigma_c}{\Sigma_t}$, where $\Sigma_r$ is the sum over that row, $\Sigma_c$ is the sum over that column, and $\Sigma_t$ is the sum over the entire table.
- With the values in the table, the chi-square statistic can be calculated as follows:
- In the chi-square test for independence, the degrees of freedom are found as follows:
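The expected-count formula and the degrees-of-freedom rule, $df = (\text{rows} - 1)(\text{columns} - 1)$, can be sketched with a hypothetical 2×2 table (the counts are invented, not the data from the example above):

```python
# Hypothetical 2x2 table: rows = boys/girls, columns = in trouble yes/no.
table = [[46, 71],
         [37, 83]]

row_sums = [sum(r) for r in table]
col_sums = [sum(c) for c in zip(*table)]
total = sum(row_sums)

# Expected count for each cell: (row sum * column sum) / table total.
chi2 = 0.0
for i, row in enumerate(table):
    for j, obs in enumerate(row):
        exp = row_sums[i] * col_sums[j] / total
        chi2 += (obs - exp) ** 2 / exp

df = (len(table) - 1) * (len(table[0]) - 1)
print(round(chi2, 3), df)
```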
Finding the least squares line
- A common exercise to become more familiar with foundations of least squares regression is to use basic summary statistics and point-slope form to produce the least squares line.
- The third column is a t test statistic for the null hypothesis that β1 = 0: T = −3.98.
- The last column is the p-value for the t test statistic for the null hypothesis β1 = 0 and a two-sided alternative hypothesis: 0.0002.
- Summary of least squares fit for the Elmhurst data.
- In the left panel, a straight line does not fit the data.
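The exercise described above, building the least squares line from summary statistics, can be sketched as follows with hypothetical data: the slope is $b_1 = r\,s_y/s_x$, and point-slope form through $(\bar{x}, \bar{y})$ gives the intercept.

```python
import statistics

# Hypothetical paired data (assumption for illustration).
xs = [1.0, 2.0, 3.0, 4.0, 5.0]
ys = [2.1, 3.9, 6.2, 8.0, 9.8]

xbar, ybar = statistics.mean(xs), statistics.mean(ys)
sx, sy = statistics.stdev(xs), statistics.stdev(ys)
# Sample correlation coefficient.
r = (sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
     / ((len(xs) - 1) * sx * sy))

# Slope from summary statistics; the line passes through (xbar, ybar).
b1 = r * sy / sx
b0 = ybar - b1 * xbar
print(round(b1, 3), round(b0, 3))
```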