Multiple Linear Regression


1. Linear regression is useful when we want to predict the values of a variable from its relationship with other variables. There are two different types of linear regression models (simple linear re…
2. **StreetEasy** is New York City’s leading real estate marketpl…
3. As with most machine learning algorithms, we have to split our dataset into:
   - **Training set**: the data used to fit the model
   - **Test set**: the data partitioned away at the very start of the e…
4. Now that we have the training set and the test set, let’s use scikit-learn to build the linear regression model! The steps for multiple linear regression in scikit-learn are identical to the steps for …
5. You’ve performed Multiple Linear Regression, and you also have the predictions in `y_predict`. However, we don’t yet have insight into the data. In this exercise, you’ll create a 2D scatterplot to s…
6. Now that we have implemented Multiple Linear Regression, we will learn how to tune and evaluate the model. Before we do that, however, it’s essential to learn the equation behind it. **Equation 6…**
7. In our Manhattan model, we used 14 variables, so there are 14 coefficients: [ -302.73009383 1199.3859951 4.79976742 -24.28993151 24.19824177 -7.58272473 -140.90664773 48.85017415 191.4257…
8. When trying to evaluate the accuracy of our multiple linear regression model, one technique we can use is **Residual Analysis**. The difference between the actual value *y* and the predicted valu…
9. Now let’s rebuild the model using the new features and evaluate the new model to see if we improved! For Manhattan, the scores returned: Train score: 0.772546055982, Test score: 0.80503719…
10. Great work! Let’s review the concepts before you move on:
    - **Multiple Linear Regression** uses two or more variables to make predictions about another variable: y = b + m₁x₁ + m₂x₂ + …
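The train/test split and model-fitting steps outlined above can be sketched with scikit-learn. The data here is synthetic (a stand-in for the StreetEasy dataset, whose real features and target are not shown in full here):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression

# Synthetic stand-in for the apartment data: 200 rows, 3 features
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 3))
y = 1000 + X @ np.array([2000.0, 500.0, 30.0]) + rng.normal(0, 10, size=200)

# Partition the data: 80% training set, 20% test set
x_train, x_test, y_train, y_test = train_test_split(
    X, y, train_size=0.8, test_size=0.2, random_state=6
)

# Fit the multiple linear regression model, then predict on the test set
mlr = LinearRegression()
mlr.fit(x_train, y_train)
y_predict = mlr.predict(x_test)
```

The steps mirror simple linear regression exactly: the only difference is that `x_train` has multiple feature columns instead of one.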
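The equation y = b + m₁x₁ + m₂x₂ + … is exactly what `predict()` computes: the intercept b is stored in the fitted model’s `intercept_` attribute and the coefficients m₁, m₂, … in `coef_`. A minimal sketch with made-up data (not the 14-variable Manhattan model):

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Toy data with a known linear relationship: y = 3 + 1.5*x1 - 2*x2
rng = np.random.default_rng(1)
X = rng.normal(size=(50, 2))
y = 3.0 + X @ np.array([1.5, -2.0])

mlr = LinearRegression().fit(X, y)

# Reproduce predict() by hand from the equation: y = b + m1*x1 + m2*x2
manual = mlr.intercept_ + X @ mlr.coef_
assert np.allclose(manual, mlr.predict(X))
```

Inspecting `coef_` this way is how the lesson reads off the 14 coefficients of the Manhattan model.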
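Residual analysis and the train/test scores quoted in the lesson come down to a few lines: a residual is the actual value minus the predicted value, and `score()` returns R², the coefficient of determination. A sketch on synthetic data (the scores will not match the Manhattan numbers above):

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LinearRegression

# Synthetic data with a strong linear signal plus a little noise
rng = np.random.default_rng(2)
X = rng.normal(size=(300, 4))
y = X @ np.array([2.0, -1.0, 0.5, 3.0]) + rng.normal(0, 0.1, size=300)

x_train, x_test, y_train, y_test = train_test_split(X, y, random_state=6)
mlr = LinearRegression().fit(x_train, y_train)

# Residuals: actual minus predicted; for a good model they hover around zero
residuals = y_test - mlr.predict(x_test)

# R^2 on training and test data (the "Train score" / "Test score" in the lesson)
train_score = mlr.score(x_train, y_train)
test_score = mlr.score(x_test, y_test)

# To inspect residuals visually, one would plot e.g.:
#   plt.scatter(mlr.predict(x_test), residuals)
```

A test score close to the train score, with residuals scattered evenly around zero, suggests the model generalizes rather than overfits.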
