1988 AHSME Problems/Problem 29

Latest revision as of 02:34, 10 July 2025

Problem

You plot weight $(y)$ against height $(x)$ for three of your friends and obtain the points $(x_{1},y_{1}), (x_{2},y_{2}), (x_{3},y_{3})$. If $x_{1} < x_{2} < x_{3}$ and $x_{3} - x_{2} = x_{2} - x_{1}$, which of the following is necessarily the slope of the line which best fits the data? "Best fits" means that the sum of the squares of the vertical distances from the data points to the line is smaller than for any other line.

$\textbf{(A)}\ \frac{y_{3}-y_{1}}{x_{3}-x_{1}}\qquad \textbf{(B)}\ \frac{(y_{2}-y_{1})-(y_{3}-y_{2})}{x_{3}-x_{1}}\qquad\\ \textbf{(C)}\ \frac{2y_{3}-y_{1}-y_{2}}{2x_{3}-x_{1}-x_{2}}\qquad \textbf{(D)}\ \frac{y_{2}-y_{1}}{x_{2}-x_{1}}+\frac{y_{3}-y_{2}}{x_{3}-x_{2}}\qquad\\ \textbf{(E)}\ \text{none of these}$

Solution 1

Let $y = mx + b$ be a prediction line, and let $d = x_3 - x_2 = x_2 - x_1$. Then $x_1 = x_2 - d$ and $x_3 = x_2 + d$. Let $z_1 = mx_1 + b$, $z_2 = mx_2 + b$, and $z_3 = mx_3 + b$ be the predicted values of $y_1$, $y_2$, and $y_3$, respectively. Then $z_1 = m(x_2 - d) + b = z_2 - md$ and $z_3 = m(x_2 + d) + b = z_2 + md$.

Consider all lines that pass through $(x_2, z_2)$. Let $r_1 = z_1 - y_1$, $r_2 = z_2 - y_2$, and $r_3 = z_3 - y_3$ be the three residuals. Then $r_1 + r_3 = z_1 + z_3 - y_1 - y_3 = z_2 + md + z_2 - md - y_1 - y_3 = 2z_2 - y_1 - y_3$. So $r_1 + r_3$ is the same for all these lines, as is $r_2$. Let $s = r_1 + r_3$.

Since the best line passing through $(x_2, z_2)$ must minimize $r_1^2 + r_2^2 + r_3^2$, it must minimize $r_1^2 + r_3^2 = s^2 - 2r_1(s - r_1)$, and therefore maximize $r_1(s - r_1) = -r_1^2 + sr_1$. Since this quadratic has a negative leading coefficient, it is maximized at $r_1 = -\frac{s}{2(-1)} = \frac{s}{2}$. In this case, $r_1 = r_3 = \frac{s}{2}$, so $z_1 = y_1 + \frac{s}{2}$ and $z_3 = y_3 + \frac{s}{2}$. Also, $z_3 - z_1 = y_3 - y_1$, and the slope of the line is $\frac{z_3 - z_1}{x_3 - x_1} = \frac{y_3 - y_1}{x_3 - x_1}$.

Therefore, the line of best fit must have slope $\boxed{(\textbf{A}) \frac{y_3 - y_1}{x_3 - x_1}}$, as any line with a different slope could be improved by replacing it with the line with the same predicted value of $y_2$ and slope $\frac{y_3 - y_1}{x_3 - x_1}$.

-j314andrews
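Solution 1's key claim can be checked numerically: for any fixed pivot height $z_2$, the sum of squared residuals over lines through $(x_2, z_2)$ is minimized at slope $\frac{y_3 - y_1}{x_3 - x_1}$. The sketch below (the data values are invented for illustration) scans a grid of slopes and compares the minimizer with that formula.

```python
# Sketch (not from the original solution): among all lines through a fixed
# point (x2, z2), find the slope minimizing the sum of squared vertical
# residuals, and compare it with (y3 - y1)/(x3 - x1).

def sse_through_point(m, x2, z2, pts):
    """Sum of squared vertical residuals for the line through (x2, z2) with slope m."""
    return sum((z2 + m * (x - x2) - y) ** 2 for x, y in pts)

# Equally spaced x-values (d = 1), arbitrary y-values, arbitrary pivot height z2.
pts = [(1.0, 2.0), (2.0, 7.0), (3.0, 3.0)]   # (x1, y1), (x2, y2), (x3, y3)
x2, z2 = 2.0, 5.0
(x1, y1), _, (x3, y3) = pts

# Grid search over slopes; the SSE is quadratic in m, so the grid minimum
# lands on the true minimizer whenever that value is on the grid.
best_m = min((m / 1000 for m in range(-5000, 5001)),
             key=lambda m: sse_through_point(m, x2, z2, pts))
claimed = (y3 - y1) / (x3 - x1)

print(best_m, claimed)  # both 0.5 for this data
```

Changing $z_2$ leaves the minimizing slope unchanged, which is exactly why the argument can treat the slope independently of the predicted value at $x_2$.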

Solution 2

Apply one of the standard formulae for the gradient of the line of best fit, e.g. $\frac{\frac{\sum {x_i y_i}}{n} - \bar{x} \bar{y}}{\frac{\sum {x_{i}^2}}{n} - \bar{x}^2}$, and substitute in the given condition $x_3 - x_2 = x_2 - x_1$. The answer is $\boxed{\text{A}}$.
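The formula above can be evaluated directly. The sketch below (with made-up sample data) implements $\frac{\frac{\sum x_i y_i}{n} - \bar{x}\bar{y}}{\frac{\sum x_i^2}{n} - \bar{x}^2}$ and confirms that, for equally spaced $x$-values, it reduces to $\frac{y_3 - y_1}{x_3 - x_1}$.

```python
# Sketch: the standard best-fit slope formula, applied to three points with
# equally spaced x-values.  Data values are arbitrary.

def best_fit_slope(xs, ys):
    """Slope of the least-squares line: (mean(xy) - mean(x)mean(y)) / (mean(x^2) - mean(x)^2)."""
    n = len(xs)
    xbar = sum(xs) / n
    ybar = sum(ys) / n
    num = sum(x * y for x, y in zip(xs, ys)) / n - xbar * ybar
    den = sum(x * x for x in xs) / n - xbar ** 2
    return num / den

xs = [4.0, 6.0, 8.0]     # x3 - x2 = x2 - x1 = 2
ys = [10.0, 3.0, 12.0]   # arbitrary weights

print(best_fit_slope(xs, ys), (ys[2] - ys[0]) / (xs[2] - xs[0]))  # both 0.5
```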

Solution 3 (Linear Algebra)

Let $A$ be a matrix with full column rank and $x$ and $y$ be vectors. Recall that the least-squares error $||Ax - y||^2_2$ is minimized when $x = (A^T A)^{-1} A^T y$.

In this case, $A = \begin{bmatrix}x_1 & 1 \\ x_2 & 1 \\ x_3 & 1\end{bmatrix}$, $y = \begin{bmatrix}y_1\\y_2\\y_3\end{bmatrix}$, and $x = \begin{bmatrix}m \\ b\end{bmatrix}$, where $m$ and $b$ are the slope and $y$-intercept of the line of best fit, respectively. Let $d = x_3 - x_2 = x_2 - x_1$, so $x_3 = x_2 + d$ and $x_1 = x_2 - d$. Plugging these into $x = (A^T A)^{-1} A^T y$ yields $m = \frac{y_3 - y_1}{2d} = \boxed{(\textbf{A}) \frac{y_3 - y_1}{x_3 - x_1}}$.

-j314andrews
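Solution 3's computation can be reproduced with NumPy's least-squares solver, which solves the same normal equations $x = (A^T A)^{-1} A^T y$ internally. The data values in the sketch below are invented for illustration.

```python
# Sketch: fit y = mx + b to three points with equally spaced x-values using
# NumPy least squares, and compare the slope with (y3 - y1)/(2d).
import numpy as np

x1, x2, x3 = 0.0, 2.0, 4.0            # equally spaced: d = 2
y = np.array([5.0, 1.0, 8.0])         # arbitrary weights

# Design matrix [x_i, 1], as in Solution 3.
A = np.array([[x1, 1.0], [x2, 1.0], [x3, 1.0]])

m, b = np.linalg.lstsq(A, y, rcond=None)[0]
d = x3 - x2

print(m, (y[2] - y[0]) / (2 * d))  # both 0.75: the slope is (y3 - y1)/(x3 - x1)
```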

See also

1988 AHSME (Problems / Answer Key / Resources)
Preceded by Problem 28
Followed by Problem 30
All AHSME Problems and Solutions


These problems are copyrighted © by the Mathematical Association of America, as part of the American Mathematics Competitions.