path: root/tutorials/module_4/4.5 Statistical Analysis II.md
Diffstat (limited to 'tutorials/module_4/4.5 Statistical Analysis II.md')
-rw-r--r--  tutorials/module_4/4.5 Statistical Analysis II.md | 63
1 file changed, 59 insertions(+), 4 deletions(-)
diff --git a/tutorials/module_4/4.5 Statistical Analysis II.md b/tutorials/module_4/4.5 Statistical Analysis II.md
index df1b585..3df558b 100644
--- a/tutorials/module_4/4.5 Statistical Analysis II.md
+++ b/tutorials/module_4/4.5 Statistical Analysis II.md
@@ -1,17 +1,72 @@
# 4.5 Statistical Analysis II
-[Introduction text]
+As mentioned in the previous tutorial, data is what gives us the basis to create models. By now you've probably used Excel to create a line of best fit. In this tutorial, we will go deeper into how this works and how we can apply it to build our own models and make our own predictions.
## Least Square Regression and Line of Best Fit
+
+
### What is Linear Regression?
Linear regression is one of the most fundamental techniques in data analysis. It models the relationship between two (or more) variables by fitting a **straight line** that best describes the trend in the data.
-Linear regression helps identify proportional relationships, estimate calibration constants, or model linear system responses.
+### Linear
+To find a linear regression line we can apply the least squares method: choose the slope and intercept that minimise the sum of the squared vertical distances (residuals) between the data points and the line.
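As a minimal sketch (the data arrays below are illustrative, not from the tutorial), the least-squares slope and intercept can be computed directly from the data in NumPy:

```python
import numpy as np

# illustrative data (assumed): roughly y = 2x + 1 with a little noise
x = np.array([0, 1, 2, 3, 4, 5], dtype=float)
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1, 11.0])

# least-squares estimates minimise sum((y - (m*x + c))**2)
m = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
c = y.mean() - m * x.mean()
print(f'slope={m:.3f}, intercept={c:.3f}')
```

The same result can be obtained with `np.polyfit(x, y, 1)`; the explicit formulas make the minimisation visible.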
-## Least square fitting
+### Exponential and Power functions
+Exponential models `y = a*exp(b*x)` and power-law models `y = a*x**b` are not linear, but a logarithm trick makes them so: taking logarithms gives `ln(y) = ln(a) + b*x` for the exponential case and `ln(y) = ln(a) + b*ln(x)` for the power case. Both are straight lines in the transformed variables, so linear regression applies directly.
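A short sketch of the logarithm trick for the exponential case (the data here is synthetic, generated from known parameters purely for illustration): fitting a first-order polynomial to `(x, ln y)` recovers `b` as the slope and `ln a` as the intercept.

```python
import numpy as np

# synthetic exponential data (assumed): y = 3 * exp(0.5 * x)
x = np.array([0, 1, 2, 3, 4], dtype=float)
y = 3.0 * np.exp(0.5 * x)

# logarithm trick: ln(y) = ln(a) + b*x is linear in x,
# so a degree-1 polyfit on (x, ln y) gives slope b and intercept ln(a)
b, ln_a = np.polyfit(x, np.log(y), 1)
a = np.exp(ln_a)
print(f'a={a:.3f}, b={b:.3f}')  # recovers a=3, b=0.5
```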
+### Polynomial
+For non-linear functions such as polynomials, NumPy has a convenient feature: `np.polyfit` fits a polynomial of any chosen order by least squares, and `np.polyval` evaluates the result.
-## Extrapolation
+```python
+import numpy as np
+import matplotlib.pyplot as plt
+
+x_d = np.array([0, 1, 2, 3, 4, 5, 6, 7, 8])
+y_d = np.array([0, 0.8, 0.9, 0.1, -0.6, -0.8, -1, -0.9, -0.4])
+
+plt.figure(figsize = (12, 8))
+for i in range(1, 7):
+
+ # get the polynomial coefficients
+ y_est = np.polyfit(x_d, y_d, i)
+ plt.subplot(2,3,i)
+ plt.plot(x_d, y_d, 'o')
+ # evaluate the values for a polynomial
+ plt.plot(x_d, np.polyval(y_est, x_d))
+ plt.title(f'Polynomial order {i}')
+
+plt.tight_layout()
+plt.show()
+```
+
+### Using Scipy
+```python
+import numpy as np
+import matplotlib.pyplot as plt
+from scipy import optimize
+
+# generate example data from a known exponential (values assumed for illustration)
+x = np.linspace(0, 2, 20)
+y = 2.0 * np.exp(1.5 * x)
+
+# let's define the function form
+def func(x, a, b):
+    y = a*np.exp(b*x)
+    return y
+
+# curve_fit returns the best-fit parameters (and their covariance matrix)
+alpha, beta = optimize.curve_fit(func, xdata=x, ydata=y)[0]
+print(f'alpha={alpha}, beta={beta}')
+
+# Let's have a look of the data
+plt.figure(figsize = (10,8))
+plt.plot(x, y, 'b.')
+plt.plot(x, alpha*np.exp(beta*x), 'r')
+plt.xlabel('x')
+plt.ylabel('y')
+plt.show()
+```
+
+
+
+
+### How well did we do?
+
+Using the residuals between the model and the data, we can quantify how good a fit is. A common summary is the coefficient of determination, R², which compares the residual sum of squares to the total variance of the data; values close to 1 indicate a good fit.
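For example, R² can be computed from the residuals in a few lines (a minimal sketch; the data and model predictions below are illustrative, not from a real fit):

```python
import numpy as np

# illustrative observations and hypothetical model predictions (assumed)
y = np.array([1.1, 2.9, 5.2, 6.8, 9.1, 11.0])
y_pred = np.array([1.0, 3.0, 5.0, 7.0, 9.0, 11.0])

# R^2 = 1 - SS_res / SS_tot
ss_res = np.sum((y - y_pred) ** 2)          # residual sum of squares
ss_tot = np.sum((y - y.mean()) ** 2)        # total variance about the mean
r_squared = 1 - ss_res / ss_tot
print(f'R^2 = {r_squared:.4f}')
```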
+
+## Extrapolation
+Extrapolation means using a fitted model to predict values outside the range of the observed data. It is riskier than interpolation: the fitted basis functions describe the data well inside the measured range, but nothing guarantees the system follows the same trend beyond it.
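A small sketch of extrapolation with a fitted polynomial (the line data is synthetic and assumed): `np.polyval` will happily evaluate the fit at any point, inside or outside the data range, so it is up to us to judge whether the extrapolated value is trustworthy.

```python
import numpy as np

# fit a line to data on [0, 5] (synthetic, exactly y = 2x + 1)
x = np.array([0, 1, 2, 3, 4, 5], dtype=float)
y = 2.0 * x + 1.0
coeffs = np.polyfit(x, y, 1)

# evaluate the fitted polynomial well outside the data range
y_extrap = np.polyval(coeffs, 10.0)
print(y_extrap)
```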
## Moving average
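As a starting point for this section, a simple moving average smooths a signal by replacing each point with the mean of a sliding window. One common way to sketch this in NumPy is with `np.convolve` (the data and window size below are illustrative):

```python
import numpy as np

data = np.array([1, 2, 3, 4, 5, 6, 7], dtype=float)
window = 3

# 'valid' mode averages each full window of 3 consecutive points,
# so the output is shorter than the input by window - 1 samples
moving_avg = np.convolve(data, np.ones(window) / window, mode='valid')
print(moving_avg)  # [2. 3. 4. 5. 6.]
```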