Problem
You plot weight <math>(y)</math> against height <math>(x)</math> for three of your friends and obtain the points <math>(x_1, y_1)</math>, <math>(x_2, y_2)</math>, <math>(x_3, y_3)</math>. If <math>x_1 < x_2 < x_3</math> and <math>x_3 - x_2 = x_2 - x_1</math>, which of the following is necessarily the slope of the line which best fits the data?

"Best fits" means that the sum of the squares of the vertical distances from the data points to the line is smaller than for any other line.
Solution 1
Let <math>z = mx + b</math> be a prediction line, and let <math>d = x_2 - x_1 = x_3 - x_2</math>. Then <math>x_1 = x_2 - d</math> and <math>x_3 = x_2 + d</math>. Let <math>z_1</math>, <math>z_2</math>, and <math>z_3</math> be the predicted values of <math>y_1</math>, <math>y_2</math>, and <math>y_3</math>, respectively. Then <math>z_1 = z_2 - md</math> and <math>z_3 = z_2 + md</math>.

Consider all lines that pass through <math>(x_2, z_2)</math>. Let <math>r_1 = z_1 - y_1</math>, <math>r_2 = z_2 - y_2</math>, and <math>r_3 = z_3 - y_3</math> be the three residuals. Then <math>r_1 + r_3 = z_1 + z_3 - y_1 - y_3 = 2z_2 - y_1 - y_3</math>. So <math>r_1 + r_3</math> is the same for all these lines, as is <math>r_2</math>. Let <math>s = r_1 + r_3</math>.

Since the best line passing through <math>(x_2, z_2)</math> must minimize <math>r_1^2 + r_2^2 + r_3^2</math>, it must minimize <math>r_1^2 + r_3^2 = s^2 - 2r_1(s - r_1)</math>, and therefore maximize <math>r_1(s - r_1) = -r_1^2 + sr_1</math>. Since this quadratic has a negative leading coefficient, it is maximized at <math>r_1 = -\frac{s}{2(-1)} = \frac{s}{2}</math>. In this case, <math>r_1 = r_3 = \frac{s}{2}</math>, so <math>z_1 = y_1 + \frac{s}{2}</math> and <math>z_3 = y_3 + \frac{s}{2}</math>. Also, <math>z_3 - z_1 = y_3 - y_1</math>, and the slope of the line is <math>\frac{z_3 - z_1}{x_3 - x_1} = \frac{y_3 - y_1}{x_3 - x_1}</math>.
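
The identity used in the first step can be checked directly, since <math>r_3 = s - r_1</math>: <math>r_1^2 + r_3^2 = (r_1 + r_3)^2 - 2r_1 r_3 = s^2 - 2r_1(s - r_1)</math>.
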
Therefore, the line of best fit must have slope <math>\boxed{(\textbf{A}) \frac{y_3 - y_1}{x_3 - x_1}}</math>, as any line with a different slope could be improved by replacing it with the line with the same predicted value of <math>y_2</math> and slope <math>\frac{y_3 - y_1}{x_3 - x_1}</math>.
-j314andrews
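
As a quick numerical sanity check, one can fit any three points whose <math>x</math>-values are equally spaced and compare the fitted slope with <math>\frac{y_3 - y_1}{x_3 - x_1}</math>. The sketch below assumes NumPy is available; the particular <math>y</math>-values are arbitrary, and np.polyfit with degree 1 minimizes the same sum of squared vertical distances described in the problem.

<pre>
import numpy as np

# Three points with equally spaced x-values: x3 - x2 = x2 - x1 = 2.
x = np.array([1.0, 3.0, 5.0])
y = np.array([4.0, 9.0, 7.0])  # arbitrary weights

# Degree-1 least-squares fit; returns (slope, intercept).
m, b = np.polyfit(x, y, 1)

print(m)                              # 0.75, up to floating-point error
print((y[2] - y[0]) / (x[2] - x[0]))  # (7.0 - 4.0) / (5.0 - 1.0) = 0.75
</pre>
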
Solution 2
Apply one of the standard formulae for the gradient of the line of best fit, e.g. <math>\frac{\frac{\sum {x_i y_i}}{n} - \bar{x} \bar{y}}{\frac{\sum {x_{i}^2}}{n} - \bar{x}^2}</math>, and substitute in the given condition <math>x_3 - x_2 = x_2 - x_1</math>. The answer is <math>\boxed{\text{A}}</math>.
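
One way to carry out this substitution explicitly: write <math>d = x_2 - x_1 = x_3 - x_2</math>, so that <math>x_1 = x_2 - d</math>, <math>x_3 = x_2 + d</math>, and <math>\bar{x} = x_2</math>. Then with <math>n = 3</math>, <math>\frac{\sum x_i y_i}{3} - \bar{x}\bar{y} = \frac{d(y_3 - y_1)}{3}</math> and <math>\frac{\sum x_i^2}{3} - \bar{x}^2 = \frac{2d^2}{3}</math>, so the gradient is <math>\frac{d(y_3 - y_1)/3}{2d^2/3} = \frac{y_3 - y_1}{2d} = \frac{y_3 - y_1}{x_3 - x_1}</math>.
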
Solution 3 (Linear Algebra)
Let <math>A</math> be a matrix and <math>x</math> and <math>y</math> be vectors. Recall that the least-squares error <math>||Ax - y||^2_2</math> is minimized when <math>x = (A^T A)^{-1} A^T y</math>.
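
Equivalently, this <math>x</math> solves the normal equations <math>A^T A x = A^T y</math>; the inverse exists here because the columns of <math>A</math> are linearly independent whenever the <math>x_i</math> are not all equal.
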
In this case, <math>A = \begin{bmatrix}x_1 & 1 \\ x_2 & 1 \\ x_3 & 1\end{bmatrix}</math>, <math>y = \begin{bmatrix}y_1\\y_2\\y_3\end{bmatrix}</math>, and <math>x = \begin{bmatrix}m \\ b\end{bmatrix}</math>, where <math>m</math> and <math>b</math> are the slope and <math>y</math>-intercept of the line of best fit, respectively. Let <math>d = x_3 - x_2 = x_2 - x_1</math>, so <math>x_3 = x_2 + d</math> and <math>x_1 = x_2 - d</math>. Plugging these into <math>x = (A^T A)^{-1} A^T y</math> yields <math>m = \frac{y_3 - y_1}{2d} = \boxed{(\textbf{A}) \frac{y_3 - y_1}{x_3 - x_1}}</math>.
-j314andrews
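
For reference, with <math>x_1 = x_2 - d</math> and <math>x_3 = x_2 + d</math> the intermediate products in that computation are <math>A^T A = \begin{bmatrix} 3x_2^2 + 2d^2 & 3x_2 \\ 3x_2 & 3 \end{bmatrix}</math> and <math>A^T y = \begin{bmatrix} x_2(y_1 + y_2 + y_3) + d(y_3 - y_1) \\ y_1 + y_2 + y_3 \end{bmatrix}</math>, with <math>\det(A^T A) = 3(3x_2^2 + 2d^2) - (3x_2)^2 = 6d^2</math>. The first entry of <math>(A^T A)^{-1} A^T y</math> is then <math>\frac{3\left[x_2(y_1 + y_2 + y_3) + d(y_3 - y_1)\right] - 3x_2(y_1 + y_2 + y_3)}{6d^2} = \frac{y_3 - y_1}{2d}</math>, matching the slope stated above.
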
See also
1988 AHSME (Problems • Answer Key • Resources)
Preceded by Problem 28 • Followed by Problem 30
1 • 2 • 3 • 4 • 5 • 6 • 7 • 8 • 9 • 10 • 11 • 12 • 13 • 14 • 15 • 16 • 17 • 18 • 19 • 20 • 21 • 22 • 23 • 24 • 25 • 26 • 27 • 28 • 29 • 30
All AHSME Problems and Solutions
These problems are copyrighted © by the Mathematical Association of America, as part of the American Mathematics Competitions.