# 4.5 Statistical Analysis II

As mentioned in the previous tutorial, data is what gives us the basis to create models. By now you've probably used Excel to create a line of best fit. In this tutorial, we will go deeper into how this works and how we can apply it to build our own models and make our own predictions.

## Least Squares Regression and the Line of Best Fit

### What is Linear Regression?

Linear regression is one of the most fundamental techniques in data analysis. It models the relationship between two (or more) variables by fitting a **straight line** that best describes the trend in the data.

### Linear

To find a linear regression line we can apply the method of least squares: choose the slope and intercept that minimise the sum of the squared vertical distances (residuals) between the observed data points and the line.

### Exponential and Power Functions

Some relationships are not linear but can be made linear with a **logarithm trick**. For an exponential model `y = a * exp(b*x)`, taking the natural log of both sides gives `ln(y) = ln(a) + b*x`, which is linear in `x`. For a power model `y = a * x**b`, taking logs gives `ln(y) = ln(a) + b*ln(x)`, which is linear in `ln(x)`. In both cases we can fit a straight line to the transformed data and then transform the coefficients back.

### Polynomial

For non-linear functions such as polynomials, NumPy has a nice feature: `np.polyfit` finds the least-squares polynomial coefficients, and `np.polyval` evaluates the fitted polynomial.

```python
import numpy as np
import matplotlib.pyplot as plt

x_d = np.array([0, 1, 2, 3, 4, 5, 6, 7, 8])
y_d = np.array([0, 0.8, 0.9, 0.1, -0.6, -0.8, -1, -0.9, -0.4])

plt.figure(figsize=(12, 8))
for i in range(1, 7):
    # get the polynomial coefficients for a degree-i least-squares fit
    y_est = np.polyfit(x_d, y_d, i)
    plt.subplot(2, 3, i)
    plt.plot(x_d, y_d, 'o')
    # evaluate the fitted polynomial at the data points
    plt.plot(x_d, np.polyval(y_est, x_d))
    plt.title(f'Polynomial order {i}')
plt.tight_layout()
plt.show()
```

### Using Scipy

SciPy's `optimize.curve_fit` can fit an arbitrary function form directly. Here we fit an exponential; the synthetic noisy data (`x`, `y`) at the top of the block is added for illustration so the example runs on its own.

```python
import numpy as np
import matplotlib.pyplot as plt
from scipy import optimize

# synthetic noisy exponential data (illustrative)
np.random.seed(0)
x = np.linspace(0, 2, 50)
y = 2.5 * np.exp(1.3 * x) + 0.5 * np.random.normal(size=x.size)

# let's define the function form
def func(x, a, b):
    y = a * np.exp(b * x)
    return y

alpha, beta = optimize.curve_fit(func, xdata=x, ydata=y)[0]
print(f'alpha={alpha}, beta={beta}')

# Let's have a look at the data and the fitted curve
plt.figure(figsize=(10, 8))
plt.plot(x, y, 'b.')
plt.plot(x, alpha * np.exp(beta * x), 'r')
plt.xlabel('x')
plt.ylabel('y')
plt.show()
```

### How well did we do?

Using the residuals (the differences between the observed values and the fitted values) we can judge the quality of a fit. A common summary statistic is the coefficient of determination, R², which measures the fraction of the variance in `y` explained by the model.

## Extrapolation

Extrapolation means using a fitted model to predict values outside the range of the observed data. This is much riskier than interpolation: a high-order polynomial that fits the data well can behave wildly beyond it, because the fitted basis functions are only constrained where we have data.

## Moving average
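A moving average smooths a noisy series by replacing each value with the mean of its neighbours inside a sliding window. Below is a minimal sketch using `np.convolve`; the window size and sample data are illustrative assumptions, not values from the tutorial.

```python
import numpy as np

def moving_average(y, window=3):
    """Simple (unweighted) moving average over a sliding window.

    Uses 'valid' mode, so the output has len(y) - window + 1 points
    (only positions where the window fully overlaps the data).
    """
    kernel = np.ones(window) / window
    return np.convolve(y, kernel, mode='valid')

# illustrative noisy data
y = np.array([1.0, 2.0, 1.5, 3.0, 2.5, 4.0, 3.5])
print(moving_average(y, window=3))
```

Larger windows give smoother output but track the underlying trend with more lag, so the window size is a trade-off between noise reduction and responsiveness.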