# Demonstrate and Implement the non-parametric Locally Weighted Regression algorithm

Implement the non-parametric Locally Weighted Regression algorithm in order to fit data points. Select appropriate data set for your experiment and draw graphs.


by Goeduhub's Expert (3.1k points)

## Locally weighted Linear Regression

• Locally weighted regression builds directly on linear regression.
• What is linear regression, then?
• Linear regression is a supervised learning algorithm based on the equation of a line:

Y = mX + C

where m is the coefficient of X and C is a constant (the intercept).

• Linear regression predicts a dependent variable (Y) from an independent variable (X).
• In this model we try to find the best-fit line (the regression line), which, by the equation above, is always straight. When the data does not lie along a straight line, a single global line cannot fit it well; we then need a polynomial-type model, and that is where locally weighted regression comes in.
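As a quick illustration of the line equation above, here is a minimal sketch that fits Y = mX + C to synthetic data; the slope 2, intercept 1, and noise level are made-up values for the example:

```python
import numpy as np

# Hypothetical data roughly following Y = 2X + 1, with a little noise.
rng = np.random.default_rng(0)
X = np.linspace(0, 10, 50)
Y = 2 * X + 1 + rng.normal(scale=0.5, size=50)

# np.polyfit with degree 1 returns [m, C] for the best-fit line Y = mX + C.
m, C = np.polyfit(X, Y, 1)
print(m, C)  # m recovered close to 2, C close to 1
```

The fitted coefficients land near the true slope and intercept, which is exactly what "finding the best-fit line" means here.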

Let's understand it through the cost function (the least-squared error).

### Cost Function (Linear regression)

J(beta) = sum over i of ( y(i) - beta^T x(i) )^2

### Cost Function (Locally weighted regression)

J(beta) = sum over i of w(i) * ( y(i) - beta^T x(i) )^2

1. As we can see, the only difference between the two is the weight w(i).
2. Locally weighted regression therefore minimises a weighted least-squared error.
3. The weight itself is

w(i) = exp( -(x(i) - x)^2 / (2 * tau^2) )

where tau is the bandwidth parameter, x is the query point, and x(i) is the i-th training point. The interesting fact about this formula is that by changing the value of tau we can get a non-linear regression model that is as flexible as polynomial regression of any degree.
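To get a feel for the weight formula, here is a minimal sketch; the query point, training-point distances, and tau values below are purely illustrative:

```python
import numpy as np

# Gaussian weight w = exp(-(xi - x)^2 / (2 * tau^2)) for one training
# point xi and one query point x (illustrative values, not the article's data).
def weight(xi, x, tau):
    return np.exp(-(xi - x) ** 2 / (2 * tau ** 2))

x = 0.0                      # query point
for tau in (0.1, 0.5, 2.0):  # bandwidth parameter
    # Closer training points get weights near 1; a larger tau widens
    # the neighbourhood so that distant points still contribute.
    print(tau, weight(0.1, x, tau), weight(1.0, x, tau))
```

Notice how at tau = 0.1 the point at distance 1.0 gets essentially zero weight, while at tau = 2.0 it still contributes substantially: tau controls the size of the local neighbourhood.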

### Locally weighted regression

• It is a supervised learning algorithm and an extended form of linear regression.
• It is non-parametric: there is no separate training phase, only prediction (query) time, and every query uses the full training set.

## How to implement it

```python
import numpy as np
import matplotlib.pyplot as plt

X = np.linspace(-3, 3, 1000)
print(X)

# Add a little Gaussian noise to the inputs.
X += np.random.normal(scale=0.05, size=1000)

# A target with a non-linear relationship to X.
Y = np.log(np.abs((X ** 2) - 1) + 0.5)
print(Y)
```

Note: the signature is `numpy.linspace(start, stop, num=50, endpoint=True, retstep=False, dtype=None)`. Here linspace creates an array of 1000 evenly spaced values between -3 and 3.

Demo of linspace:

```python
np.linspace(2.0, 3.0, num=5)
```

Output:

```
array([2.  , 2.25, 2.5 , 2.75, 3.  ])
```

Note: Here Y is simply a function with a non-linear relationship to X.

```python
plt.scatter(X, Y, alpha=0.32)
```

Note: To see the relation between X and Y, we make a scatter plot.

```python
def local_regression(x0, X, Y, tau):
    # Add the bias (intercept) term to the query point and to the data.
    x0 = np.r_[1, x0]
    X = np.c_[np.ones(len(X)), X]

    # Weight each training point by its closeness to the query point x0.
    xw = X.T * radial_kernel(x0, X, tau)

    # Solve the weighted least-squares problem for beta.
    beta = np.linalg.pinv(xw @ X) @ xw @ Y

    # Prediction at the query point.
    return x0 @ beta
```

Note:

1. This function computes the final prediction h(x0) at the query point x0.
2. Of the two equivalent forms of beta(x0) in the formulas above, the code uses the closed-form (normal-equation) version.
3. np.r_ concatenates its arguments along the first axis: np.r_[1, x0] prepends the constant 1 (the bias term) to the query point. np.c_ concatenates along the second axis: np.c_[np.ones(len(X)), X] adds a column of ones to X.
4. The radial_kernel function, defined below, computes the weights w(x, x0).
5. X.T is the transpose of the matrix (array) X.
6. @ is matrix multiplication, and np.linalg.pinv computes the (pseudo-)inverse of a matrix.
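A quick sketch of the np.r_ / np.c_ helpers on tiny illustrative arrays:

```python
import numpy as np

# np.r_ concatenates row-wise: prepend a scalar to a 1-D array.
a = np.r_[1, np.array([2.0, 3.0])]                 # -> [1., 2., 3.]

# np.c_ concatenates column-wise: add a column of ones to a column of data.
b = np.c_[np.ones(3), np.array([4.0, 5.0, 6.0])]   # -> 3x2 matrix

print(a.shape)  # (3,)
print(b.shape)  # (3, 2)
```

This is exactly the pattern used above: np.r_ builds the augmented query point [1, x0], and np.c_ builds the design matrix with a bias column.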

```python
def radial_kernel(x0, X, tau):
    # Gaussian weights: training points close to the query point x0
    # get weights near 1, distant points get weights near 0.
    return np.exp(np.sum((X - x0) ** 2, axis=1) / (-2 * tau * tau))
```

Note: This is a simple function that calculates the local weight w(x, x0).

```python
def plot_lwr(tau):
    # Query points at which the model is evaluated.
    domain = np.linspace(-3, 3, num=300)
    prediction = [local_regression(x0, X, Y, tau) for x0 in domain]

    plt.scatter(X, Y, alpha=0.3)
    plt.plot(domain, prediction, color="red")
    return plt

plot_lwr(0.01)
```

Note:

1. Here each x0 in the domain is a query point at which we call local_regression.
2. We then plot the original data and the predictions (the model) together.
3. As the plot shows, the model fits the data closely; if you change the value of tau, the red line (the model) changes.
4. The shape of the fitted curve depends on tau: a smaller tau follows the data more tightly, a larger tau gives a smoother curve.

Output 
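To see the effect of tau numerically without plotting, here is a self-contained sketch that mirrors the functions and data above (the random seed and the tau values 0.05, 0.5, and 10.0 are illustrative choices, not from the original):

```python
import numpy as np

# Gaussian weights for each row of X relative to the query point x0.
def radial_kernel(x0, X, tau):
    return np.exp(np.sum((X - x0) ** 2, axis=1) / (-2 * tau * tau))

# Weighted least-squares fit around the query point x0, as above.
def local_regression(x0, X, Y, tau):
    x0 = np.r_[1, x0]
    Xb = np.c_[np.ones(len(X)), X]
    xw = Xb.T * radial_kernel(x0, Xb, tau)
    beta = np.linalg.pinv(xw @ Xb) @ xw @ Y
    return x0 @ beta

# The same kind of non-linear data as in the walkthrough.
rng = np.random.default_rng(0)
X = np.linspace(-3, 3, 1000) + rng.normal(scale=0.05, size=1000)
Y = np.log(np.abs(X ** 2 - 1) + 0.5)

# With a small tau the prediction follows the data near the query point;
# with a very large tau every point gets a similar weight and the model
# approaches an ordinary (global) linear fit.
for tau in (0.05, 0.5, 10.0):
    print(tau, local_regression(0.0, X, Y, tau))
```

At x = 0 the true value is log(1.5) ≈ 0.405; the small-tau prediction stays close to it, while the large-tau prediction drifts toward the global average, illustrating point 4 of the note above.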