# t Test

A typical goal in an econometric exercise is to assess whether or not a coefficient $\theta$ equals a specific value $\theta_0$. Often the specific value to be tested is $\theta_0 = 0$, but this is not essential. This is called hypothesis testing, a subject which will be explored in detail in Chapter 9. In this section and the following we give a short introduction specific to the normal regression model.

For simplicity write the coefficient to be tested as $\theta$. The null hypothesis is

$$
H_0 : \theta = \theta_0. \tag{5.16}
$$

This states that the hypothesis is that the true value of the coefficient equals the hypothesized value $\theta_0$.

The alternative hypothesis is the complement of H0, and is written as

$$
H_1 : \theta \neq \theta_0.
$$

This states that the true value of $\theta$ does not equal the hypothesized value.

We are interested in testing H0 against H1. The method is to design a statistic which is

informative about H1. If the observed value of the statistic is consistent with random variation

under the assumption that H0 is true, then we deduce that there is no evidence against H0 and

consequently do not reject H0. However, if the statistic takes a value which is unlikely to occur

under the assumption that H0 is true, then we deduce that there is evidence against H0, and

consequently we reject H0 in favor of H1. The main steps are to design a test statistic and to

characterize its sampling distribution.

The standard statistic to test H0 against H1 is the absolute value of the t-statistic

$$
|T| = \left| \frac{\widehat{\theta} - \theta_0}{s(\widehat{\theta})} \right|. \tag{5.17}
$$
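As a concrete illustration (not part of the text), the statistic in (5.17) can be computed directly for a least-squares regression. The data below are simulated and the tested coefficient (the slope, with $\theta_0 = 0$) is an assumption made for the example:

```python
import numpy as np

# Hypothetical data: n = 30 observations, k = 2 regressors (intercept + slope).
rng = np.random.default_rng(0)
n, k = 30, 2
x = rng.normal(size=n)
y = 1.0 + 0.5 * x + rng.normal(size=n)
X = np.column_stack([np.ones(n), x])

# OLS estimate and the bias-corrected error-variance estimate s^2.
beta_hat = np.linalg.solve(X.T @ X, X.T @ y)
resid = y - X @ beta_hat
s2 = resid @ resid / (n - k)

# Standard error of the slope: sqrt of the corresponding diagonal
# element of s^2 * (X'X)^{-1}.
XtX_inv = np.linalg.inv(X.T @ X)
se = np.sqrt(s2 * XtX_inv[1, 1])

# |T| from (5.17) for testing H0: theta = theta0 (here theta0 = 0).
theta0 = 0.0
T_abs = abs((beta_hat[1] - theta0) / se)
print(T_abs)
```

The same standard error $s(\widehat{\theta})$ is the one used to form confidence intervals for the slope.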

If H0 is true, then we expect $|T|$ to be small, but if H1 is true then we would expect $|T|$ to be large. Hence the standard rule is to reject H0 in favor of H1 for large values of the t-statistic $|T|$, and otherwise fail to reject H0. Thus the hypothesis test takes the form

Reject H0 if $|T| > c$.
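A minimal sketch of this decision rule in Python, using `scipy.stats.t` for the $t_{n-k}$ distribution; the degrees of freedom and significance level are illustrative assumptions, not values from the text:

```python
from scipy import stats

# Illustrative values: n - k = 28 degrees of freedom, 5% significance level.
alpha = 0.05
df = 28

# c is the 1 - alpha/2 quantile of the t_{n-k} distribution, chosen so that
# P(|T| > c | H0) = 2 * (1 - F(c)) = alpha.
c = stats.t.ppf(1 - alpha / 2, df)

# Check the false-rejection probability analytically.
false_reject = 2 * (1 - stats.t.cdf(c, df))
print(c, false_reject)  # false_reject equals alpha up to floating-point error

# The decision rule: reject H0 when |T| exceeds the critical value.
def t_test(T_abs, c):
    return "Reject H0" if T_abs > c else "Fail to reject H0"

print(t_test(2.5, c))  # prints "Reject H0" since c is about 2.05
```

Raising $c$ lowers the false-rejection probability, which is why the critical value and the significance level move together.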

The constant c which appears in the statement of the test is called the critical value. Its value is selected to control the probability of false rejections. When the null hypothesis is true, $|T|$ has an exact student t distribution (with $n-k$ degrees of freedom) in the normal regression model. Thus for a given value of c the probability of false rejection is

$$
\begin{aligned}
\Pr\left(\text{Reject } H_0 \mid H_0\right) &= \Pr\left(|T| > c \mid H_0\right) \\
&= \Pr\left(T > c \mid H_0\right) + \Pr\left(T < -c \mid H_0\right) \\
&= 1 - F(c) + F(-c) \\
&= 2\left(1 - F(c)\right)
\end{aligned}
$$

where $F(u)$ is the $t_{n-k}$ distribution function. This is the probability of false rejection, and is decreasing in the critical value c. We select the value c so that this probability equals a pre-selected value called the significance level, which is typically written as $\alpha$. It is conventional to set $\alpha = 0.05$, though this is not a hard rule. We then select c so that $F(c) = 1 - \alpha/2$, which means that c is the $1 - \alpha/2$ quantile (inverse CDF) of the $t_{n-k}$ distribution, the same as used for confidence intervals. With th