Welcome to my notes for MATH 2230. The definitions and theorems are taken directly from Hubbard & Hubbard, while the examples and explanations come from lecture.
1.3. Matrices as Linear Transformations
1.4. Geometry of the Real Numbers
1.7. Derivatives in Several Variables as Linear Transformations
1.8. Rules for Computing Derivatives
1.9. Criterion for Differentiability
2.2. Solving Equations with Row Reduction
2.3. Matrix Inverses and Elementary Matrices
2.4. Linear Independence and Span
2.5. Kernels, Images, and the Dimension Formula
2.7. Eigenvalues and Eigenvectors
2.10. The Inverse and Implicit Function Theorems

These two theorems are extremely important: they allow us to find solutions of nonlinear equations.
Definition 2.10.1 (Monotone function). A function is strictly monotone if its graph always goes up or always goes down: if $x < y$ always implies $f(x) < f(y)$, the function is monotone increasing; if $x < y$ always implies $f(x) > f(y)$, it is monotone decreasing. If a function $f$ is strictly monotone, then it has an inverse function $g$ defined on its image.
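A quick example (mine, not from the book or lecture): the function

$$
f(x) = x^3 + x, \qquad f'(x) = 3x^2 + 1 > 0 \text{ for all } x,
$$

is monotone increasing, so it has an inverse $g$, even though there is no convenient formula for $g$. This is exactly the situation the theorems in this section address: we can know that $f(x) = y$ has a unique solution without being able to write it down.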
Theorem 2.10.4 (Inverse function theorem: short version). If a mapping $f$ is continuously differentiable, and its derivative is invertible at some point $x_0$, then $f$ is locally invertible, with differentiable inverse, in some neighborhood of the point $f(x_0)$.
<aside> 💡 If $[Df(x_0)]$ is not invertible, then $f$ has no differentiable inverse in any neighborhood of $f(x_0)$. However, it may still have a non-differentiable inverse: for example, $f(x) = x^3$ has $f'(0) = 0$, yet it is invertible, with inverse $g(y) = \sqrt[3]{y}$, which fails to be differentiable at $0$.
</aside>
Theorem 2.10.7 (The inverse function theorem). Let $W \subset \R^m$ be an open neighborhood of $x_0$, and let $f: W \to \R^m$ be a continuously differentiable function. Set $y_0 = f(x_0)$. If the derivative $[Df(x_0)]$ is invertible, then $f$ is invertible in some neighborhood of $y_0$, and the inverse is differentiable.
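A standard illustration (added here, not part of the theorem statement): the polar-coordinate map

$$
f\begin{pmatrix} r \\ \theta \end{pmatrix} = \begin{pmatrix} r\cos\theta \\ r\sin\theta \end{pmatrix}, \qquad [Df(r,\theta)] = \begin{pmatrix} \cos\theta & -r\sin\theta \\ \sin\theta & r\cos\theta \end{pmatrix}, \qquad \det[Df(r,\theta)] = r.
$$

Wherever $r \neq 0$, the derivative is invertible, so $f$ has a differentiable local inverse near $f(r,\theta)$: Cartesian coordinates can be converted back to polar coordinates near that point. The inverse is only local, since $(r, \theta)$ and $(r, \theta + 2\pi)$ map to the same point, so $f$ is not globally invertible.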
To quantify this statement, we will specify the radius $R$ of a ball $V$ centered at $y_0$, in which the inverse function is defined.
The inverse function theorem says that for $y$ sufficiently close to $y_0$, the equation $f(x) = y$ has a unique solution close to $x_0$. The obvious idea is to try to solve $f(x) - y = 0$ by Newton's method, starting at $x_0$. On what neighborhood of $y_0$ does the Kantorovich condition $|f(x_0) - y|\,\big|[Df(x_0)]^{-1}\big|^2 M \le \frac{1}{2}$ hold?
<aside> 💡 $f(x) - y = 0$ is a nonlinear equation.
</aside>
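To see the idea numerically, here is a minimal sketch (my own, not from the text or lecture) of solving $f(x) = y$ by Newton's method starting at $x_0$, using the polar-coordinate map $f(r, \theta) = (r\cos\theta, r\sin\theta)$ as a test function. The name `newton_solve`, the tolerance, and the iteration cap are arbitrary choices, and the sketch does not check the Kantorovich condition; it just runs the iteration and reports the residual.

```python
import numpy as np

def newton_solve(f, Df, y, x0, tol=1e-12, max_iter=50):
    """Solve f(x) = y by Newton's method starting at x0."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        residual = f(x) - y
        if np.linalg.norm(residual) < tol:
            break
        # Newton step for x -> f(x) - y:  x <- x - [Df(x)]^{-1} (f(x) - y)
        x = x - np.linalg.solve(Df(x), residual)
    return x

# Test function: polar coordinates, f(r, theta) = (r cos(theta), r sin(theta)).
f = lambda x: np.array([x[0] * np.cos(x[1]), x[0] * np.sin(x[1])])
Df = lambda x: np.array([[np.cos(x[1]), -x[0] * np.sin(x[1])],
                         [np.sin(x[1]),  x[0] * np.cos(x[1])]])

x0 = np.array([1.0, 0.5])          # [Df(x0)] is invertible here (r = 1, det = 1)
y0 = f(x0)
y = y0 + np.array([0.01, -0.02])   # a target value close to y0

x = newton_solve(f, Df, y, x0)     # the local inverse, evaluated at y
print(x, f(x) - y)                 # the residual should be essentially zero
```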
First, let $L = [Df(x_0)]$. Find $R > 0$ satisfying the following conditions: