Implementing linear regression with sklearn
There are four basic linear regression algorithms, listed below:
LinearRegression
Ridge (L2 regularization)
Lasso (L1 regularization)
ElasticNet (L1+L2 regularization)
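The four models above share the same fit/predict interface in sklearn; only the penalty differs. A minimal sketch on synthetic data (the coefficients, alpha values, and sample sizes here are illustrative assumptions, not values from the notebook):

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge, Lasso, ElasticNet

# Synthetic data: y is a linear function of 3 features plus small noise.
rng = np.random.RandomState(0)
X = rng.rand(100, 3)
y = X @ np.array([1.5, -2.0, 0.5]) + 0.1 * rng.randn(100)

models = {
    "LinearRegression": LinearRegression(),                   # no penalty
    "Ridge": Ridge(alpha=1.0),                                # L2 penalty
    "Lasso": Lasso(alpha=0.01),                               # L1 penalty (sparse coefs)
    "ElasticNet": ElasticNet(alpha=0.01, l1_ratio=0.5),       # mix of L1 and L2
}
for name, model in models.items():
    model.fit(X, y)
    print(name, model.coef_.round(2))
```

With weak regularization all four recover coefficients close to the true [1.5, -2.0, 0.5]; increasing `alpha` shrinks them toward zero (and, for Lasso, to exactly zero).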
This Kaggle notebook contains detailed code; credit to its author, juliencs!
Reference:
【机器学习】正则化的线性回归 —— 岭回归与Lasso回归 ("Regularized linear regression: ridge and Lasso", in Chinese)
More advanced algorithms can also be used for regression:
Decision Tree
Random Forest, an implementation of the bagging idea: aggregate the predictions of many decision trees
XGBoost, an implementation of the boosting idea
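A sketch of the three tree-based approaches on synthetic data. Note an assumption: sklearn's `GradientBoostingRegressor` stands in for XGBoost here to keep the example dependency-free; `xgboost.XGBRegressor` exposes the same fit/predict interface. All hyperparameters below are illustrative defaults, not tuned values.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

# Synthetic nonlinear data, where trees outperform a purely linear fit.
rng = np.random.RandomState(42)
X = rng.rand(500, 4)
y = np.sin(6 * X[:, 0]) + X[:, 1] ** 2 + 0.1 * rng.randn(500)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

models = {
    "DecisionTree": DecisionTreeRegressor(max_depth=5, random_state=42),
    # Bagging: average many decorrelated trees.
    "RandomForest": RandomForestRegressor(n_estimators=200, random_state=42),
    # Boosting: trees fit sequentially on the previous residuals
    # (stand-in for XGBoost).
    "GradientBoosting": GradientBoostingRegressor(n_estimators=200, random_state=42),
}
scores = {}
for name, model in models.items():
    model.fit(X_train, y_train)
    scores[name] = r2_score(y_test, model.predict(X_test))
    print(f"{name}: R^2 = {scores[name]:.3f}")
```

Typically the ensembles score higher than the single tree on held-out data, which is the point of bagging and boosting.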
More references:
https://www.cnblogs.com/zongfa/p/9324684.html ("Machine learning from scratch: Bagging and Boosting", in Chinese)
https://www.analyticsvidhya.com/blog/2016/02/complete-guide-parameter-tuning-gradient-boosting-gbm-python/
https://www.analyticsvidhya.com/blog/2016/03/complete-guide-parameter-tuning-xgboost-with-codes-python/
https://www.datacamp.com/community/tutorials/xgboost-in-python
https://machinelearningmastery.com/gentle-introduction-xgboost-applied-machine-learning/
https://www.kaggle.com/chocozzz/xgboost-tutorial
Please credit the source when reposting: http://www.cnblogs.com/mashuai-191/