A linear **regression** line has the same form as the equation of a straight line, where the **slope** is m **and** the **intercept** is c:

y = mx + c

How do we calculate the **slope and intercept** of the **regression** line? The formulas for m (**slope**) and c (**intercept**) are:

m = (n(Σxy) − (Σx)(Σy)) / (n(Σx²) − (Σx)²)

c = (Σy − m(Σx)) / n

Coefficients are what a line-of-best-fit model produces. A line of best fit (aka **regression**) model usually consists of an **intercept** (where the line crosses the y-axis) and the gradients (or **slopes**) for one or more variables. When we perform a linear **regression in** R, it outputs the model and these coefficients.
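The summation formulas above can be sketched directly in a few lines of Python. This is a minimal illustration with invented sample data, not output from any particular statistics package:

```python
# Compute the slope m and intercept c of the least-squares regression line
# using the summation formulas: m = (nΣxy − ΣxΣy) / (nΣx² − (Σx)²),
# c = (Σy − mΣx) / n. The x and y values below are made up for illustration.
x = [1, 2, 3, 4, 5]
y = [2.1, 4.0, 6.2, 7.9, 10.1]
n = len(x)

sum_x = sum(x)
sum_y = sum(y)
sum_xy = sum(xi * yi for xi, yi in zip(x, y))
sum_x2 = sum(xi ** 2 for xi in x)

m = (n * sum_xy - sum_x * sum_y) / (n * sum_x2 - sum_x ** 2)
c = (sum_y - m * sum_x) / n

print(m, c)  # slope close to 2, intercept close to 0 for this data
```

The same coefficients would come back from R's `lm(y ~ x)` or any least-squares routine; writing the sums out by hand just makes the formula concrete.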

**Regression** with Transformations. Once we add the log transformation as a possibility - for either the x-variable, the y-variable, or both - we can describe many possible data trends. The only issue is that we need to make sure we know **how to interpret** the **slope** estimate in our model after the transformation.

The expected **intercept** of 0, however, is not significantly different from the calculated value of 3.60. Note that the larger standard deviation for the **intercept** makes it more difficult to show that there is a significant difference between the experimental and theoretical values.

Using the Results of a **Regression to** Make Predictions.
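As a hedged sketch of both points - interpreting the slope after a log transformation, and using the fitted model to make predictions - the example below fits a line to log(y) and back-transforms a prediction. The data are invented (roughly exponential growth), and this is plain least squares written out by hand rather than any specific library's API:

```python
import math

# Invented data where y grows roughly exponentially in x, so log(y) is
# roughly linear in x. We fit log(y) = m*x + c by least squares.
x = [1, 2, 3, 4, 5]
y = [2.7, 7.4, 20.1, 54.6, 148.4]

log_y = [math.log(v) for v in y]
n = len(x)
mean_x = sum(x) / n
mean_ly = sum(log_y) / n

m = sum((xi - mean_x) * (li - mean_ly) for xi, li in zip(x, log_y)) / \
    sum((xi - mean_x) ** 2 for xi in x)
c = mean_ly - m * mean_x

# Interpretation after the transformation: each one-unit increase in x
# multiplies the expected y by e**m (not "adds m to y").
multiplier = math.exp(m)

# Prediction: evaluate the line on the log scale, then back-transform.
pred_at_6 = math.exp(m * 6 + c)

print(multiplier, pred_at_6)
```

The key point is that after transforming y, the slope is interpreted multiplicatively on the original scale, and predictions must be back-transformed with the inverse function (here, `exp`).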

**And**, if the relationship is curved, we can still fit a **regression** model to the data. Pearson's correlation coefficients measure only linear relationships. Consequently, if your data contain a curvilinear relationship, the correlation coefficient will not detect it. R-squared is a primary measure of **how** well a **regression** model fits the data.
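The warning about curvilinear relationships can be made concrete with a small sketch. Using invented, perfectly symmetric data where y = x², Pearson's r (and hence the R-squared of a straight-line fit) comes out to zero even though y is completely determined by x:

```python
# Invented data: a perfect but symmetric curved relationship, y = x**2.
x = [-2, -1, 0, 1, 2]
y = [xi ** 2 for xi in x]

n = len(x)
mean_x = sum(x) / n
mean_y = sum(y) / n

# Pearson's r = cov(x, y) / (sd(x) * sd(y)); denominators cancel the 1/n.
cov = sum((xi - mean_x) * (yi - mean_y) for xi, yi in zip(x, y))
var_x = sum((xi - mean_x) ** 2 for xi in x)
var_y = sum((yi - mean_y) ** 2 for yi in y)
r = cov / (var_x * var_y) ** 0.5
r_squared = r ** 2  # R-squared of the best straight-line fit

print(r, r_squared)  # → 0.0 0.0
```

A near-zero r here does not mean "no relationship" - it means "no *linear* relationship," which is exactly why plotting the data matters before trusting correlation or R-squared alone.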