2018 Putnam B Problems/Problem 5

Latest revision as of 18:23, 18 August 2025

Problem

Let $f = (f_1, f_2)$ be a function from $\mathbb{R}^2$ to $\mathbb{R}^2$ with continuous partial derivatives $\tfrac{\partial f_i}{\partial x_j}$ that are positive everywhere. Suppose that \[\frac{\partial f_1}{\partial x_1} \frac{\partial f_2}{\partial x_2} - \frac{1}{4} \left(\frac{\partial f_1}{\partial x_2} + \frac{\partial f_2}{\partial x_1} \right)^2 > 0\]everywhere. Prove that $f$ is one-to-one.

Solution 1

Let \( f = (f_1, f_2) : \mathbb{R}^2 \rightarrow \mathbb{R}^2 \) be a function with continuous partial derivatives, and suppose all the partials \( \frac{\partial f_i}{\partial x_j} \) are positive. The given inequality is \[\frac{\partial f_1}{\partial x_1} \frac{\partial f_2}{\partial x_2} - \frac{1}{4} \left( \frac{\partial f_1}{\partial x_2} + \frac{\partial f_2}{\partial x_1} \right)^2 > 0\] everywhere on \( \mathbb{R}^2 \).

Let \( a = \frac{\partial f_1}{\partial x_1} \), \( b = \frac{\partial f_1}{\partial x_2} \), \( c = \frac{\partial f_2}{\partial x_1} \), \( d = \frac{\partial f_2}{\partial x_2} \). All of these are positive functions. The inequality becomes \[ad - \frac{1}{4} (b+c)^2 > 0.\] This implies \( ad > \frac{1}{4} (b+c)^2 \geq bc \), using the AM–GM inequality: \( (b+c)^2 \geq 4bc \). So \( ad > bc \), and the Jacobian determinant \( \det Df = ad - bc > 0 \) everywhere. This shows that \( Df \) is invertible at all points, and by the inverse function theorem, \( f \) is a local diffeomorphism everywhere.
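
The determinant step is easy to sanity-check numerically. The sketch below (an illustration, not part of the proof) samples random positive values for \( a, b, c, d \), keeps those satisfying the hypothesis, and confirms that the Jacobian determinant \( ad - bc \) is positive each time:

```python
import random

def hypothesis(a, b, c, d):
    # The pointwise inequality from the problem: ad - (1/4)(b + c)^2 > 0.
    return a * d - 0.25 * (b + c) ** 2 > 0

def jacobian_det(a, b, c, d):
    # det Df = ad - bc.
    return a * d - b * c

random.seed(0)
checked = 0
while checked < 10_000:
    a, b, c, d = (random.uniform(0.01, 10.0) for _ in range(4))
    if hypothesis(a, b, c, d):
        # AM-GM: (b + c)^2 >= 4bc, so ad > (1/4)(b + c)^2 >= bc.
        assert (b + c) ** 2 >= 4 * b * c
        assert jacobian_det(a, b, c, d) > 0
        checked += 1
print("verified", checked, "random samples")
```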

To show that \( f \) is globally one-to-one, apply Hadamard's global inverse function theorem. It requires that \( f \) be \( C^1 \), that its Jacobian be invertible everywhere (already established), and that \( f \) be proper, meaning the preimage of every compact set is compact.

Because all partial derivatives of \( f_1 \) and \( f_2 \) are positive, each component \( f_i \) is strictly increasing in each variable: fixing \( x_2 \) and increasing \( x_1 \) increases both \( f_1 \) and \( f_2 \), and likewise for \( x_2 \). So if \( |x_n| \rightarrow \infty \), then \( |f(x_n)| \rightarrow \infty \): the image of every unbounded set is unbounded, and hence the preimage of every bounded set is bounded. Since \( f \) is continuous, the preimage of a compact set is also closed, and therefore compact. This proves that \( f \) is proper.

Therefore, all conditions of Hadamard’s theorem are satisfied: \( f \) is \( C^1 \), has positive Jacobian determinant everywhere, and is proper.

So \( f \) is a global diffeomorphism, and in particular, one-to-one.

~Pinotation

Solution 2 (Easier)

We are given a function \( f = (f_1, f_2) : \mathbb{R}^2 \rightarrow \mathbb{R}^2 \) with continuous and positive partial derivatives, and the inequality \[\frac{\partial f_1}{\partial x_1} \frac{\partial f_2}{\partial x_2} - \frac{1}{4} \left( \frac{\partial f_1}{\partial x_2} + \frac{\partial f_2}{\partial x_1} \right)^2 > 0\] everywhere.

We want to show that \( f \) is one-to-one, and we’ll do it by showing that if \( f(x) = f(y) \), then \( x = y \). We'll use an integral-based approach.

Let \( x = (x_1, x_2) \), \( y = (y_1, y_2) \), and suppose \( f(x) = f(y) \). Define a path \( \gamma(t) = (1 - t)x + ty \), for \( t \in [0, 1] \), which is a straight line from \( x \) to \( y \).

By the fundamental theorem of calculus applied componentwise to \( t \mapsto f(\gamma(t)) \), the difference \( f(y) - f(x) \) is \[f(y) - f(x) = \int_0^1 \frac{d}{dt} f(\gamma(t)) \, dt.\]

By the chain rule: \[\frac{d}{dt} f(\gamma(t)) = Df(\gamma(t)) \cdot (y - x),\] so \[f(y) - f(x) = \int_0^1 Df(\gamma(t)) \cdot (y - x) \, dt.\]
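
This line-integral identity can be tested numerically. The sketch below uses a hypothetical example map (chosen purely for illustration so that all four partials are positive and \( ad - \frac{1}{4}(b+c)^2 \geq 4 - \frac{1}{4}(2.2)^2 > 0 \) everywhere) and checks the formula with a midpoint rule:

```python
import numpy as np

# Hypothetical example: f1 = 2x1 + x2 + 0.1 sin(x2), f2 = x1 + 2x2 + 0.1 sin(x1).
# All four partials are positive, and ad - (1/4)(b + c)^2 >= 4 - 1.21 > 0 everywhere.
def f(p):
    x1, x2 = p
    return np.array([2 * x1 + x2 + 0.1 * np.sin(x2),
                     x1 + 2 * x2 + 0.1 * np.sin(x1)])

def Df(p):
    x1, x2 = p
    return np.array([[2.0, 1.0 + 0.1 * np.cos(x2)],
                     [1.0 + 0.1 * np.cos(x1), 2.0]])

x = np.array([-1.3, 0.7])
y = np.array([2.1, -0.4])
v = y - x

# Midpoint-rule approximation of \int_0^1 Df(gamma(t)) v dt along the segment.
n = 2000
ts = (np.arange(n) + 0.5) / n
integral = sum(Df((1 - t) * x + t * y) @ v for t in ts) / n

# The quadrature reproduces f(y) - f(x), as the identity predicts.
assert np.allclose(integral, f(y) - f(x), atol=1e-6)
print("integral matches f(y) - f(x):", integral)
```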

Since we assumed \( f(y) = f(x) \), this vector-valued integral vanishes: \[\int_0^1 Df(\gamma(t)) \cdot (y - x) \, dt = 0.\]

Let \( v = y - x \). Then we have: \[\int_0^1 Df(\gamma(t)) \cdot v \, dt = 0.\]

Because \( v \) is constant, it can be pulled outside the integral, so the average of the Jacobian matrices along the path annihilates \( v \): \[\left( \int_0^1 Df(\gamma(t)) \, dt \right) \cdot v = 0.\]

Let’s define the matrix \( A = \int_0^1 Df(\gamma(t)) \, dt \). Since each entry of \( Df \) is continuous and positive everywhere, each entry of \( A \) is also positive. So \( A \) is a real \( 2 \times 2 \) matrix with positive entries.

Moreover, at each \( t \), the hypothesis says exactly that the symmetric part of the Jacobian, \[\frac{1}{2} \left( Df(\gamma(t)) + Df(\gamma(t))^T \right),\] has positive determinant, since that determinant equals \[\frac{\partial f_1}{\partial x_1} \frac{\partial f_2}{\partial x_2} - \frac{1}{4} \left( \frac{\partial f_1}{\partial x_2} + \frac{\partial f_2}{\partial x_1} \right)^2 > 0.\] Its trace \( \frac{\partial f_1}{\partial x_1} + \frac{\partial f_2}{\partial x_2} \) is also positive, so the symmetric part is positive definite at every \( t \).

Positive definiteness is preserved under averaging: the symmetric part of \( A \) is \[\frac{1}{2}(A + A^T) = \int_0^1 \frac{1}{2} \left( Df(\gamma(t)) + Df(\gamma(t))^T \right) dt,\] an integral of positive definite matrices. Since the antisymmetric part of \( A \) contributes nothing to the quadratic form, for every nonzero \( v \), \[v^T A v = v^T \left( \frac{1}{2}(A + A^T) \right) v = \int_0^1 v^T \, \frac{1}{2} \left( Df + Df^T \right) v \, dt > 0.\] In particular, \( A \) satisfies \( A_{11} A_{22} - \frac{1}{4} (A_{12} + A_{21})^2 > 0 \) as well.
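
As a numerical sanity check (a sketch, separate from the proof), one can sample random \( 2 \times 2 \) matrices with positive entries satisfying \( A_{11} A_{22} - \frac{1}{4}(A_{12} + A_{21})^2 > 0 \) and confirm that the symmetric part is positive definite, hence \( v^T A v > 0 \) for nonzero \( v \):

```python
import numpy as np

rng = np.random.default_rng(0)
checked = 0
while checked < 5000:
    a11, a12, a21, a22 = rng.uniform(0.01, 10.0, size=4)
    if a11 * a22 - 0.25 * (a12 + a21) ** 2 > 0:
        A = np.array([[a11, a12], [a21, a22]])
        S = 0.5 * (A + A.T)
        # det S = A11*A22 - ((A12 + A21)/2)^2 > 0 and trace S > 0,
        # so both eigenvalues of the symmetric part are positive.
        assert np.all(np.linalg.eigvalsh(S) > 0)
        # Hence v^T A v = v^T S v > 0 for any nonzero v.
        v = rng.standard_normal(2)
        assert v @ A @ v > 0
        checked += 1
print("verified", checked, "matrices")
```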

But earlier we assumed that \( f(y) = f(x) \), so \( Av = 0 \), which implies \[v^T A v = 0.\]

This contradicts the fact that \( v^T A v > 0 \) for all nonzero \( v \). Therefore, \( v = 0 \), so \( y = x \).

Hence, \( f \) is one-to-one.

~Pinotation

See Also

2018 Putnam B Entire Test

2018 Putnam B Problem 4

2018 Putnam B Problem 6

These problems are copyrighted © by the Mathematical Association of America, as part of the American Mathematics Competitions.