# lowess


This is a Python implementation of the LOWESS algorithm for locally weighted linear regression, described in Cleveland (1979) and in Chapter 6 of Friedman, Hastie and Tibshirani (2008).

Friedman, Hastie and Tibshirani (2008). The Elements of Statistical Learning. Springer.

Cleveland (1979). Robust Locally Weighted Regression and Smoothing Scatterplots. Journal of the American Statistical Association, 74: 829-836.
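For orientation, the core of Cleveland's procedure at a single evaluation point is a weighted least-squares line fit, with weights given by a kernel (such as the tri-cube kernel also exposed here as `lo.tri_cube`) applied to distances scaled by a local bandwidth. The snippet below is a minimal, self-contained NumPy sketch of that idea for the 1D case without the robustness iterations; it is not the implementation in this package, and the names `lowess_point` and `frac` are chosen just for illustration.

```python
import numpy as np

def tri_cube(u):
    # Standard tri-cube weight: (1 - |u|^3)^3 for |u| < 1, zero outside.
    u = np.abs(u)
    return np.where(u < 1, (1 - u ** 3) ** 3, 0.0)

def lowess_point(x, f, x0, frac=0.3):
    """Locally weighted linear fit at a single point x0 (1D sketch).

    x, f : 1D arrays of observations.
    x0   : point at which to estimate f.
    frac : fraction of the data used in each local fit.
    """
    n = len(x)
    k = max(2, int(np.ceil(frac * n)))       # neighborhood size
    d = np.abs(x - x0)
    h = max(np.sort(d)[k - 1], 1e-12)        # bandwidth: k-th smallest distance
    w = tri_cube(d / h)                      # kernel weights on scaled distances
    # Weighted least squares for a local line f ~ b0 + b1 * (x - x0),
    # solved by rescaling rows with sqrt(w).
    X = np.column_stack([np.ones(n), x - x0])
    sw = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(sw[:, None] * X, sw * f, rcond=None)
    return beta[0]                           # intercept = fitted value at x0
```

Cleveland's full method additionally re-fits with robustness weights derived from the residuals; this package handles evaluation at many points and in higher dimensions, as the examples below show.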

## Examples

```python
>>> import lowess as lo
>>> import numpy as np
>>> import matplotlib.pyplot as plt

# For the 1D case:
>>> x = np.random.randn(100)
>>> f = np.cos(x) + 0.2 * np.random.randn(100)
>>> x0 = np.linspace(-1, 1, 10)
>>> f_hat = lo.lowess(x, f, x0)
>>> fig, ax = plt.subplots(1)
>>> ax.scatter(x, f)
>>> ax.plot(x0, f_hat, 'ro')
>>> plt.show()

# For the 2D case (and higher dimensions):
>>> x = np.random.randn(2, 100)
>>> f = -1 * np.sin(x[0]) + 0.5 * np.cos(x[1]) + 0.2 * np.random.randn(100)
>>> x0 = np.mgrid[-1:1:.1, -1:1:.1]
>>> x0 = np.vstack([x0[0].ravel(), x0[1].ravel()])
>>> f_hat = lo.lowess(x, f, x0, kernel=lo.tri_cube)
>>> from mpl_toolkits.mplot3d import Axes3D
>>> fig = plt.figure()
>>> ax = fig.add_subplot(111, projection='3d')
>>> ax.scatter(x[0], x[1], f)
>>> ax.scatter(x0[0], x0[1], f_hat, color='r')
>>> plt.show()
```
