Multiple linear regression with gradient descent from scratch in Python

In this post I will show you how to compute the optimal coefficients of a multiple linear regression using the gradient descent algorithm. In previous posts I already introduced simple linear regression (with one independent variable) and the (stochastic) gradient descent algorithm. First, I will briefly introduce what multiple … Read more
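The core idea described in the teaser can be sketched in a few lines. This is a minimal illustration, not the post's own code; it assumes a mean-squared-error loss and a fixed learning rate `lr`, both of which are my choices:

```python
import numpy as np

def gradient_descent(X, y, lr=0.01, epochs=5000):
    """Fit a multiple linear regression by gradient descent on the MSE loss."""
    # Prepend a column of ones so the intercept is learned as an ordinary weight.
    X = np.c_[np.ones(len(X)), X]
    w = np.zeros(X.shape[1])
    n = len(y)
    for _ in range(epochs):
        # Gradient of mean((Xw - y)^2) with respect to w.
        grad = (2 / n) * X.T @ (X @ w - y)
        w -= lr * grad
    return w
```

With well-scaled features and a small enough learning rate, the weights converge toward the least-squares solution.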

Gradient descent and stochastic gradient descent

In this post I would like to introduce gradient descent and its applications. I hope every reader of this blog is familiar with derivatives and gradients of simple functions learned in school, such as f(x)=7x²+3x+9. Differentiating this function gives the derivative f'(x)=14x+3. However, if you have functions that depend on … Read more
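For the one-dimensional function from the teaser, gradient descent repeatedly steps against the derivative. A minimal sketch (the starting point and learning rate are my own choices for illustration):

```python
def f_prime(x):
    # Derivative of f(x) = 7x^2 + 3x + 9.
    return 14 * x + 3

x = 0.0    # arbitrary starting point
lr = 0.05  # learning rate (step size)
for _ in range(200):
    x -= lr * f_prime(x)
# x approaches the minimum at -3/14, where f'(x) = 0
```

Because f is convex, the iterates converge to the unique minimum at x = -3/14.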

Simple linear regression

In this post, I write about a simple method of fitting a simple linear regression using two formulas. “Simple linear regression” is a linear regression model with a single explanatory variable. A linear regression tries to fit the data with a straight line, hence the name. A linear model follows the equation y = m·x + b. Where: m: … Read more
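The two closed-form formulas referenced here are, in the standard least-squares derivation, the slope m = cov(x, y) / var(x) and the intercept b = ȳ − m·x̄. A sketch of that calculation (the function name is my own, not necessarily the post's):

```python
def simple_linear_regression(x, y):
    """Least-squares slope and intercept for a single explanatory variable."""
    n = len(x)
    x_mean = sum(x) / n
    y_mean = sum(y) / n
    # Slope: covariance of x and y divided by the variance of x.
    m = (sum((xi - x_mean) * (yi - y_mean) for xi, yi in zip(x, y))
         / sum((xi - x_mean) ** 2 for xi in x))
    # Intercept: the fitted line passes through the point of means.
    b = y_mean - m * x_mean
    return m, b
```

For example, the points (1, 3), (2, 5), (3, 7), (4, 9) lie exactly on y = 2x + 1, so the formulas recover m = 2 and b = 1.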

Newton’s method

In this post I will briefly explain what Newton’s method is and how it works. What is it: Newton’s method is an algorithm for finding the zeros (roots) of a function. Because you cannot always calculate an exact zero, we can use this algorithm to approximate the zeros of the function. How does it work: Start at … Read more
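The iteration behind Newton’s method is x_{n+1} = x_n − f(x_n)/f′(x_n). A minimal sketch (the tolerance, iteration cap, and the √2 example are my own illustrative choices):

```python
def newton(f, f_prime, x0, tol=1e-10, max_iter=50):
    """Approximate a root of f starting from x0 via Newton iterations."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / f_prime(x)
        x -= step
        if abs(step) < tol:  # stop once the update is negligible
            break
    return x

# Example: the positive root of f(x) = x^2 - 2 is sqrt(2).
root = newton(lambda x: x**2 - 2, lambda x: 2 * x, x0=1.0)
```

Near a simple root the method converges quadratically, roughly doubling the number of correct digits per iteration.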