
Linear Regression with sklearn: Theory and Code Implementation

Author: 月见樽
Published: 2018-04-27 (originally posted 2017-11-25)

Theory

Linear Regressor

Compared with a linear classifier, a linear regressor is more natural. In a regression task the label is a continuous variable (unlike classification, where the label is discrete), and a linear regressor computes its output directly by multiplying the weights with the input and summing, plus a bias: $$y = w^{T}x + b$$ where w is the weight vector, x is the input, and y is the output.
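As a quick illustration (not from the original post), the prediction formula can be written out directly with NumPy; the weights, bias, and input below are made-up numbers purely for demonstration.

Code (Python):
import numpy as np

# Hypothetical weights, bias, and one input sample with 3 features
w = np.array([0.5, -1.2, 2.0])
b = 0.3
x = np.array([1.0, 0.0, 2.5])

# y = w^T * x + b
y = np.dot(w, x) + b
print(y)  # 5.8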

Optimizing the Regressor

As with classifiers, regressors are optimized with gradient methods. For regression problems, the squared error is commonly used to measure the quality of the result (i.e., as the cost function): $$L(w,b) = \sum (y - y')^{2}$$ where y is the model output and y' is the target value.
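To make the optimization concrete, here is a minimal batch gradient descent sketch for this squared-error cost. The toy data, learning rate, and iteration count are assumptions for illustration only; they are not part of the original post.

Code (Python):
import numpy as np

# Toy data drawn from a known linear model plus a little noise
rng = np.random.RandomState(0)
X = rng.randn(100, 3)
true_w, true_b = np.array([1.0, -2.0, 0.5]), 0.7
y = X @ true_w + true_b + 0.01 * rng.randn(100)

w = np.zeros(3)
b = 0.0
eta = 0.1  # learning rate (assumed)

for _ in range(200):
    y_pred = X @ w + b
    err = y_pred - y
    # Gradient of the averaged squared-error cost with respect to w and b
    grad_w = 2 * X.T @ err / len(y)
    grad_b = 2 * err.mean()
    w -= eta * grad_w
    b -= eta * grad_b

print(w, b)  # should end up close to true_w and true_b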

Code Implementation

Loading the Dataset

Code (Python):
from sklearn.datasets import load_boston
# Load the Boston house-price dataset bundled with scikit-learn
boston = load_boston()
print(boston.DESCR)
Output:
Boston House Prices dataset
===========================

Notes
------
Data Set Characteristics:  

    :Number of Instances: 506 

    :Number of Attributes: 13 numeric/categorical predictive
    
    :Median Value (attribute 14) is usually the target

    :Attribute Information (in order):
        - CRIM     per capita crime rate by town
        - ZN       proportion of residential land zoned for lots over 25,000 sq.ft.
        - INDUS    proportion of non-retail business acres per town
        - CHAS     Charles River dummy variable (= 1 if tract bounds river; 0 otherwise)
        - NOX      nitric oxides concentration (parts per 10 million)
        - RM       average number of rooms per dwelling
        - AGE      proportion of owner-occupied units built prior to 1940
        - DIS      weighted distances to five Boston employment centres
        - RAD      index of accessibility to radial highways
        - TAX      full-value property-tax rate per $10,000
        - PTRATIO  pupil-teacher ratio by town
        - B        1000(Bk - 0.63)^2 where Bk is the proportion of blacks by town
        - LSTAT    % lower status of the population
        - MEDV     Median value of owner-occupied homes in $1000's

    :Missing Attribute Values: None

    :Creator: Harrison, D. and Rubinfeld, D.L.

This is a copy of UCI ML housing dataset.
http://archive.ics.uci.edu/ml/datasets/Housing


This dataset was taken from the StatLib library which is maintained at Carnegie Mellon University.

The Boston house-price data of Harrison, D. and Rubinfeld, D.L. 'Hedonic
prices and the demand for clean air', J. Environ. Economics & Management,
vol.5, 81-102, 1978.   Used in Belsley, Kuh & Welsch, 'Regression diagnostics
...', Wiley, 1980.   N.B. Various transformations are used in the table on
pages 244-261 of the latter.

The Boston house-price data has been used in many machine learning papers that address regression
problems.   
     
**References**

   - Belsley, Kuh & Welsch, 'Regression diagnostics: Identifying Influential Data and Sources of Collinearity', Wiley, 1980. 244-261.
   - Quinlan,R. (1993). Combining Instance-Based and Model-Based Learning. In Proceedings on the Tenth International Conference of Machine Learning, 236-243, University of Massachusetts, Amherst. Morgan Kaufmann.
   - many more! (see http://archive.ics.uci.edu/ml/datasets/Housing)    
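A side note that is not in the original post: load_boston was deprecated in scikit-learn 1.0 and removed in 1.2, so the import above fails on recent versions. If you are following along on a newer release, one possible workaround is to pull the same data from OpenML; treat the exact column order and target handling of that copy as an assumption to verify.

Code (Python):
from sklearn.datasets import fetch_openml

# Hypothetical replacement for load_boston on scikit-learn >= 1.2
boston = fetch_openml(name="boston", version=1, as_frame=False)
x_data, y_data = boston.data, boston.target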

Data Preprocessing

Data Splitting

Code (Python):
from sklearn.model_selection import train_test_split
x_data = boston.data  # feature matrix
x_train,x_test,y_train,y_test = train_test_split(x_data,boston.target,random_state=33,test_size=0.25)

Data Standardization

Code (Python):
from sklearn.preprocessing import StandardScaler

# Standardize features and targets separately (zero mean, unit variance)
ss_x = StandardScaler()
ss_y = StandardScaler()

x_train = ss_x.fit_transform(x_train)
x_test = ss_x.transform(x_test)

# StandardScaler expects 2-D input, so reshape the 1-D targets before scaling
y_train = ss_y.fit_transform(y_train.reshape(-1,1)).reshape(-1)
y_test = ss_y.transform(y_test.reshape(-1,1)).reshape(-1)

print(y_train.shape)
Output:
(379,) 

Model Training

Linear Regression Model

Code (Python):
from sklearn.linear_model import LinearRegression
# Ordinary least-squares linear regression
lr = LinearRegression()
lr.fit(x_train,y_train)
Output:
LinearRegression(copy_X=True, fit_intercept=True, n_jobs=1, normalize=False)

SGD Regression Model

Code (Python):
from sklearn.linear_model import SGDRegressor
# Linear regression fitted with stochastic gradient descent
sgd = SGDRegressor()
sgd.fit(x_train,y_train)
Output:
c:\users\qiank\appdata\local\programs\python\python35\lib\site-packages\sklearn\linear_model\stochastic_gradient.py:84: FutureWarning: max_iter and tol parameters have been added in <class 'sklearn.linear_model.stochastic_gradient.SGDRegressor'> in 0.19. If both are left unset, they default to max_iter=5 and tol=None. If tol is not None, max_iter defaults to max_iter=1000. From 0.21, default max_iter will be 1000, and default tol will be 1e-3.
  "and default tol will be 1e-3." % type(self), FutureWarning)

SGDRegressor(alpha=0.0001, average=False, epsilon=0.1, eta0=0.01,
       fit_intercept=True, l1_ratio=0.15, learning_rate='invscaling',
       loss='squared_loss', max_iter=5, n_iter=None, penalty='l2',
       power_t=0.25, random_state=None, shuffle=True, tol=None, verbose=0,
       warm_start=False)
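The FutureWarning above only concerns the changing defaults of max_iter and tol in later scikit-learn releases. A small sketch (not part of the original post) that sets them explicitly, which also keeps the behavior stable across versions:

Code (Python):
from sklearn.linear_model import SGDRegressor

# Fix the stopping criteria instead of relying on version-dependent defaults
sgd = SGDRegressor(max_iter=1000, tol=1e-3)
sgd.fit(x_train, y_train)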

Model Evaluation

Built-in Scorer (the score method)

Code (Python):
lr.score(x_test,y_test)
Output:
0.67634038309987021
Code (Python):
sgd.score(x_test,y_test)
Output:
0.65777103520375069

Mean Absolute Error

Code (Python):
from sklearn.metrics import mean_absolute_error
print("lr:",mean_absolute_error(ss_y.inverse_transform(y_test),ss_y.inverse_transform(lr.predict(x_test))))
print("sgd:",mean_absolute_error(ss_y.inverse_transform(y_test),ss_y.inverse_transform(sgd.predict(x_test))))
Output:
lr: 0.379976703913
sgd: 0.377629585475

Mean Squared Error

Code (Python):
from sklearn.metrics import mean_squared_error
print("lr:",mean_squared_error(ss_y.inverse_transform(y_test),ss_y.inverse_transform(lr.predict(x_test))))
print("sgd:",mean_squared_error(ss_y.inverse_transform(y_test),ss_y.inverse_transform(sgd.predict(x_test))))
Output:
lr: 0.29143408577
sgd: 0.30815455581

R-squared (1 - residual sum of squares / total variance)

Code (Python):
from sklearn.metrics import r2_score
print("lr:",r2_score(ss_y.inverse_transform(y_test),ss_y.inverse_transform(lr.predict(x_test))))
print("sgd:",r2_score(ss_y.inverse_transform(y_test),ss_y.inverse_transform(sgd.predict(x_test))))
Output:
lr: 0.6763403831
sgd: 0.657771035204
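As expected, these values match the score results above, because a regressor's score method also returns R². To illustrate the formula in the heading, here is a small sketch with toy numbers (assumed purely for demonstration, not from the original post):

Code (Python):
import numpy as np

# Toy targets and predictions
y_true = np.array([3.0, -0.5, 2.0, 7.0])
y_pred = np.array([2.5,  0.0, 2.0, 8.0])

ss_res = np.sum((y_true - y_pred) ** 2)          # residual sum of squared errors
ss_tot = np.sum((y_true - y_true.mean()) ** 2)   # total variance in the data
r2 = 1 - ss_res / ss_tot
print(r2)  # about 0.9486, the same value r2_score gives for these arrays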