In the last video, you learned about the logistic regression model. Now, let's take a look at the decision boundary to get a better sense of how logistic regression is computing its predictions.

To recap, here's how the logistic regression model's outputs are computed in two steps. In the first step, you compute z as w dot x plus b. Then, you apply the sigmoid function g to this value z, and here again is the formula for the sigmoid function. Another way to write this is to say that f of x is equal to g, the sigmoid function, also called the logistic function, applied to w dot x plus b, where this is of course the value of z. If you take the definition of the sigmoid function and plug in the definition of z, then you find that f of x is equal to this formula over here, 1 over 1 plus e to the negative z, where z is w dot x plus b. You may remember we said in the previous video that we interpret this as the probability that y is equal to 1, given x, with parameters w and b, so this is going to be a number like 0.7 or 0.3.

Now, what if you want the learning algorithm to predict whether the value of y is going to be 0 or 1? One thing you might do is set a threshold above which you predict y is 1, or set y hat, the prediction, to be equal to 1, and below which you predict y hat is equal to 0. A common choice is a threshold of 0.5: if f of x is greater than or equal to 0.5, then predict y is 1, and we write that prediction as y hat equals 1; if f of x is less than 0.5, then predict y is 0, or in other words, y hat equals 0.

Now let's dive deeper into when the model would predict 1, in other words, when f of x is greater than or equal to 0.5. Recall that f of x is just g of z, so f is greater than or equal to 0.5 whenever g of z is greater than or equal to 0.5. But when is g of z greater than or equal to 0.5? Here's the sigmoid function over here, and g of z is greater than or equal to 0.5 whenever z is greater than or equal to 0, that is, whenever z is on the right half of this axis. Finally, when is z greater than or equal to 0? Since z is equal to w dot x plus b, z is greater than or equal to 0 whenever w dot x plus b is greater than or equal to 0. To recap, what you've seen here is that the model predicts 1 whenever w dot x plus b is greater than or equal to 0, and conversely, when w dot x plus b is less than 0, the algorithm predicts y is 0.

Given this, let's now visualize how the model makes predictions. I'm going to take an example of a classification problem where you have two features, x1 and x2, instead of just one feature. Here's a training set where the little red crosses denote the positive examples and the little blue circles denote negative examples, so the red crosses correspond to y equals 1 and the blue circles correspond to y equals 0. The logistic regression model will make predictions using the function f of x equals g of z, where z is now w1 x1 plus w2 x2 plus b, because we have two features, x1 and x2. Let's say, for this example, that the values of the parameters are w1 equals 1, w2 equals 1, and b equals negative 3. Now let's take a look at how logistic regression makes predictions. In particular, let's figure out when w dot x plus b is greater than or equal to 0, and when w dot x plus b is less than 0.
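As a quick illustration of this two-step computation and the 0.5 threshold, here is a minimal NumPy sketch using the example's parameter values, w1 equals 1, w2 equals 1, and b equals negative 3. The function and variable names are purely illustrative; this is not the optional lab's code.

```python
import numpy as np

def sigmoid(z):
    # g(z) = 1 / (1 + e^(-z))
    return 1 / (1 + np.exp(-z))

def predict(x, w, b, threshold=0.5):
    # Step 1: compute z = w . x + b
    z = np.dot(w, x) + b
    # Step 2: f(x) = g(z), interpreted as P(y = 1 | x; w, b)
    f_x = sigmoid(z)
    # Predict y_hat = 1 when f(x) >= 0.5, otherwise y_hat = 0
    return 1 if f_x >= threshold else 0

# Example parameter values from the lecture: w1 = 1, w2 = 1, b = -3
w = np.array([1.0, 1.0])
b = -3.0
print(predict(np.array([2.0, 2.0]), w, b))  # z = 1,    f(x) ~ 0.73 -> predicts 1
print(predict(np.array([0.5, 1.0]), w, b))  # z = -1.5, f(x) ~ 0.18 -> predicts 0
```

Note that thresholding f of x at 0.5 gives the same answer as simply checking whether z is greater than or equal to 0, which is exactly the point made above.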
To figure that out, there's a very interesting line to look at, which is the line where w dot x plus b is exactly equal to 0. It turns out that this line is also called the decision boundary, because that's the line where you're just about neutral as to whether y is 0 or y is 1. For the values of the parameters w1, w2, and b that we had written down above, z is just x1 plus x2 minus 3. So when is x1 plus x2 minus 3 equal to 0? That corresponds to the line x1 plus x2 equals 3, which is this line shown over here. This line turns out to be the decision boundary: if the features x are to the right of this line, logistic regression would predict 1, and to the left of this line, logistic regression would predict 0. In other words, what we have just visualized is the decision boundary for logistic regression when the parameters w1, w2, and b are 1, 1, and negative 3. Of course, if you had a different choice of the parameters, the decision boundary would be a different line.

Now, let's look at a more complex example where the decision boundary is no longer a straight line. As before, crosses denote the class y equals 1, and the little circles denote the class y equals 0. Earlier last week, you saw how to use polynomials in linear regression, and you can do the same in logistic regression. Let's set z to be w1 x1 squared plus w2 x2 squared plus b. With this choice, we're feeding polynomial features into logistic regression, so f of x, which equals g of z, is now g of this expression over here. Let's say that we end up choosing w1 and w2 to be 1, and b to be negative 1, so z is equal to 1 times x1 squared plus 1 times x2 squared minus 1. The decision boundary, as before, corresponds to where z is equal to 0, and this expression is equal to 0 when x1 squared plus x2 squared equals 1. If you plot the curve corresponding to x1 squared plus x2 squared equals 1 on the diagram on the left, it turns out to be this circle. When x1 squared plus x2 squared is greater than or equal to 1, that's the area outside the circle, and that's when you predict y to be 1. Conversely, when x1 squared plus x2 squared is less than 1, that's the area inside the circle, and that's when you predict y to be 0.

So, can we come up with even more complex decision boundaries than these? Yes, you can, by including even higher-order polynomial terms. Say z is w1 x1 plus w2 x2 plus w3 x1 squared plus w4 x1 x2 plus w5 x2 squared. Then it's possible to get even more complex decision boundaries. The model can define decision boundaries such as this example, an ellipse like this, or, with a different choice of the parameters, even more complex decision boundaries that might look something like that. This is an example of an even more complex decision boundary than the ones we've seen previously, and this implementation of logistic regression will predict y equals 1 inside this shape and y equals 0 outside this shape. So, with these polynomial features, you can get very complex decision boundaries; in other words, logistic regression can learn to fit pretty complex data. However, if you do not include any of these higher-order polynomials, so that the only features you use are x1, x2, x3, and so on, then the decision boundary for logistic regression will always be linear; it will always be a straight line.
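Here is a similar minimal sketch for the polynomial-feature example, again purely illustrative and not the lab's implementation, with w1 equals w2 equals 1 and b equals negative 1, so the decision boundary z equals 0 is the circle x1 squared plus x2 squared equals 1.

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def predict_with_circle_boundary(x1, x2, w1=1.0, w2=1.0, b=-1.0):
    # z = w1*x1^2 + w2*x2^2 + b; the boundary z = 0 is the circle x1^2 + x2^2 = 1
    z = w1 * x1**2 + w2 * x2**2 + b
    # f(x) = g(z) >= 0.5 is equivalent to z >= 0
    return 1 if sigmoid(z) >= 0.5 else 0

print(predict_with_circle_boundary(1.5, 0.0))  # x1^2 + x2^2 = 2.25 > 1 (outside) -> 1
print(predict_with_circle_boundary(0.3, 0.4))  # x1^2 + x2^2 = 0.25 < 1 (inside)  -> 0
```

Points outside the circle give z greater than or equal to 0 and are predicted as 1, while points inside give z less than 0 and are predicted as 0, matching the picture described above.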
In the upcoming optional lab, you also get to see the code implementation of the decision boundary. In the example in the lab, there will be two features, so you can see the decision boundary as a line. So, with this visualization, I hope that you now have a sense of the range of possible models you can get with logistic regression. Now that you've seen what f of x can potentially compute, let's take a look at how you can actually train a logistic regression model. We'll start by looking at the cost function for logistic regression, and after that, figure out how to apply gradient descent to it. Let's go on to the next video.