Diffstat (limited to 'tutorials/module_3/2_roots_optimization.md')
-rw-r--r--   tutorials/module_3/2_roots_optimization.md   17
1 file changed, 8 insertions(+), 9 deletions(-)
diff --git a/tutorials/module_3/2_roots_optimization.md b/tutorials/module_3/2_roots_optimization.md
index 780a284..5023636 100644
--- a/tutorials/module_3/2_roots_optimization.md
+++ b/tutorials/module_3/2_roots_optimization.md
@@ -1,4 +1,4 @@
-# Root Finding Methods
+ # Root Finding Methods
 By now you've learned the quadratic formula to find the roots of a second degree polynomial.
 $$
@@ -41,10 +41,9 @@ When
 # Bracketing Method
 ## Incremental Search
-Incremental search
-
-
+The incremental search method is the simplest of the bracketing methods. The basic idea is to start from an initial point and move forward in small increments along the function. At each step, the sign of the function value is checked. If a change of sign occurs between two consecutive points, then a root must lie between them (by the **Intermediate Value Theorem**). This gives us a bracket — an interval that contains the root — which can then be refined using more powerful methods such as bisection or false position.
+Although incremental search is not efficient for accurately finding roots, it is useful for _detecting the presence of roots_ and identifying approximate intervals where they exist. Once such an interval is located, other methods can be applied to converge quickly to the actual root. In this sense, incremental search often serves as a **pre-processing step** for bracketing or open methods, especially when the root’s location is not known in advance.
 ## Bisection
 This method start by sectioning the database into two halves. We can do this with a phone book by simply opening it in the middle of the book. In our data set, we have find the middle entry to be 'M - Maxwell'. Based on the premises that all names are sorted alphabetically we can conclude that we will find our target in the second half, essentially eliminating the first half. Doing this a second time gives us our new midpoint 'S – Schrödinger' and a third time get us to our target 'W - Watt'.
 In only 3 iterations we have reached out target goal. This is a huge improvement from the Incremental search (22 iterations) versus our new 3 iterations.
@@ -101,8 +100,8 @@ def my_bisection(f, x_l, x_u, tol):
         return my_bisection(f, x_l, x_r, tol)
 ```
-### Example
-
+### Problem 1
+Call the *my_bisection* function to solve the equation $f(x) = x^2 -2$
 ```python
 f = lambda x: x**2 - 2
@@ -116,7 +115,7 @@ print("f(r01) =", f(r01))
 ```
-### Assignment 2
+### Problem 2
 Write an algorithm that uses a combination of the bisection and incremental search to find multiple roots in the first 8 minutes of the data set.
 ```python
 import pandas as pd
@@ -136,8 +135,8 @@ plt.legend()
 plt.show()
 ```
-
 ## False Position
+
 # Open Methods
 So far we looked at methods that require us to bracket a root before finding it. However, let's think outside of the box and ask ourselves if we can find a root with only 1 guess. Can we improve the our guess by if we have some knowledge of the function itself? The answer - derivatives.
@@ -164,7 +163,7 @@ Since $x_0$ is our current guess and $x_0$ is our next guess, we can write these
 $$
 \boxed{x_{i+1} = x_i - \frac{f(x_i)}{f'(x_i)}} \tag{3.1}
 $$
-### Assignment 2
+### Problem 1
 From experimental data we extrapolated a function f. Write a python function called *newtonraphson* which as the following input parameters
 - `f` - function
 - `df` - first derivative of the function
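The incremental-search description added by this patch can be sketched in a few lines of Python. This is a minimal illustration of the idea (scan in fixed steps, record each sign change as a bracket); the function name and parameters are illustrative, not taken from the tutorial's own code:

```python
# Sketch of incremental search: scan [a, b] in steps of dx and report
# every subinterval where f changes sign. By the Intermediate Value
# Theorem, each such subinterval brackets at least one root.
# Names here are hypothetical, not from the patched tutorial.
def incremental_search(f, a, b, dx):
    brackets = []
    x_l = a
    while x_l + dx <= b:
        x_u = x_l + dx
        if f(x_l) * f(x_u) < 0:  # sign change -> root inside [x_l, x_u]
            brackets.append((x_l, x_u))
        x_l = x_u
    return brackets

# f(x) = x**2 - 2 has a root at sqrt(2); scanning [0, 2] locates one bracket,
# which bisection or false position could then refine.
print(incremental_search(lambda x: x**2 - 2, 0.0, 2.0, 0.1))
```

A small step size finds more roots but costs more function evaluations, which is why the patch frames this as a pre-processing step rather than a root finder in its own right.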

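The Newton–Raphson update boxed as equation (3.1) in the patched file can be sketched the same way. The exercise only specifies the `f` and `df` parameters of *newtonraphson*; the starting guess, tolerance, and iteration cap below are assumptions added for illustration:

```python
# Sketch of the Newton-Raphson iteration x_{i+1} = x_i - f(x_i)/f'(x_i),
# i.e. equation (3.1). x0, tol and max_iter are illustrative additions;
# the tutorial's exercise only names the parameters f and df.
def newtonraphson(f, df, x0, tol=1e-8, max_iter=50):
    x = x0
    for _ in range(max_iter):
        fx = f(x)
        if abs(fx) < tol:      # |f(x)| small enough: accept x as the root
            return x
        x = x - fx / df(x)     # the update from equation (3.1)
    return x

# Same test function as the bisection example: f(x) = x**2 - 2, f'(x) = 2x.
root = newtonraphson(lambda x: x**2 - 2, lambda x: 2 * x, x0=1.0)
print(root)  # converges to sqrt(2), about 1.41421356
```

Unlike the bracketing methods, this open method starts from a single guess and uses the derivative to choose the next one, which is exactly the trade-off the "Open Methods" paragraph in the diff describes: faster convergence, but no guarantee of staying near the root.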