Introduction to Machine Learning
Introduction
An ML model may be predictive, to make predictions about the future; descriptive, to gain knowledge from data; or both. Accordingly, one speaks of predictive and descriptive machine learning.
Examples of ML Applications
Learning Associations
Finding an association rule means learning a conditional probability P(Y|X), where Y is the product whose purchase we want to predict and X is a product the customer has already bought. To make a distinction among customers, we instead estimate P(Y|X,D), where D is a set of customer attributes.
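As a minimal sketch with hypothetical basket data, the confidence of a rule X → Y can be estimated as the fraction of transactions containing X that also contain Y:

```python
# Estimate P(Y | X) from transactions: among baskets containing X,
# what fraction also contain Y? (Toy data, illustrative only.)
def confidence(transactions, x, y):
    with_x = [t for t in transactions if x in t]
    if not with_x:
        return 0.0
    return sum(1 for t in with_x if y in t) / len(with_x)

baskets = [
    {"bread", "butter"},
    {"bread", "butter", "milk"},
    {"bread"},
    {"milk"},
]
print(confidence(baskets, "bread", "butter"))  # P(butter | bread) = 2/3
```

Libraries such as mlxtend compute this at scale together with support and lift, but the underlying estimate is just this counting.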
Classification and Regression
Both classification and regression are supervised learning problems: there is an input X and an output Y, and the task is to learn the mapping from input to output, y = g(x|θ), where g(·) is the model and θ are its parameters. Y is a number in regression and a class code (e.g., 0/1) in classification. g(·) is the discriminant function (the criterion function used directly to classify pattern samples), separating instances of different classes. In statistics, classification is called discriminant analysis.
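A minimal regression sketch, on made-up data: take g(x|θ) to be a line with θ = (w, b) and fit it by ordinary least squares.

```python
# Fit y = g(x | theta) = w*x + b to toy data by ordinary least squares.
def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # closed-form least-squares slope and intercept
    w = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    b = my - w * mx
    return w, b

xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]   # generated exactly from y = 2x + 1
w, b = fit_line(xs, ys)
print(w, b)  # -> 2.0 1.0
```

For classification the same g would instead be thresholded, e.g. predict class 1 when g(x|θ) exceeds some value, which is what makes it a discriminant.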
Unsupervised Learning
In unsupervised learning, the aim is to find regularities in the input. In statistics, this is also called density estimation. One method of density estimation is clustering, with applications such as customer segmentation, image compression, document clustering, and learning motifs (short sequences that occur frequently) by clustering DNA sequences.
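A minimal k-means sketch on one-dimensional toy data shows the clustering idea: alternately assign points to their nearest center and move each center to the mean of its points. (The deterministic initialization here, spreading centers across the data range, is a simplification; real implementations use smarter seeding.)

```python
# Toy k-means on 1-D points: assign each point to its nearest center,
# then move each center to the mean of its assigned points.
def kmeans_1d(points, k=2, iters=20):
    lo, hi = min(points), max(points)
    centers = [lo + (hi - lo) * j / (k - 1) for j in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda j: abs(p - centers[j]))
            clusters[i].append(p)
        # empty clusters keep their old center
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

data = [1.0, 1.2, 0.8, 9.0, 9.5, 10.0]
print(kmeans_1d(data, 2))  # -> [1.0, 9.5]
```

In customer segmentation each cluster center would be a "typical customer"; in image compression the centers form the codebook.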
Reinforcement Learning
Notes
Dedicated journals in ML include Machine Learning, Journal of Machine Learning Research, Neural Computation, Neural Networks, and IEEE Transactions on Neural Networks. Related journals include Annals of Statistics, Journal of the American Statistical Association, IEEE Transactions on Pattern Analysis and Machine Intelligence, Data Mining and Knowledge Discovery, IEEE Transactions on Knowledge and Data Engineering, and ACM SIGKDD Explorations Journal.
Dedicated conferences in ML include Neural Information Processing Systems, Uncertainty in Artificial Intelligence, International Conference on Machine Learning, European Conference on Machine Learning, Computational Learning Theory, and International Joint Conference on Artificial Intelligence.
Supervised Learning
Binary Classification
Suppose the input is two-dimensional, i.e., x = [x_1, x_2]^T, with label r = 1 if x is a positive example and r = 0 if x is a negative example. The training set contains N such examples, X = {x^t, r^t}_{t=1}^{N}, where t indexes the examples: each example t is a data point at (x_1^t, x_2^t) with its label r^t.
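A minimal sketch of such a training set, on made-up points, together with one simple hypothesis class for 2-D binary classification: the tightest axis-aligned rectangle containing all positive examples (the most specific hypothesis consistent with the data).

```python
# Toy training set X = {(x^t, r^t)}: 2-D points with binary labels.
# Hypothesis: the tightest axis-aligned rectangle around the positives.
def tightest_rectangle(training_set):
    pos = [x for x, r in training_set if r == 1]
    xs = [p[0] for p in pos]
    ys = [p[1] for p in pos]
    return (min(xs), max(xs)), (min(ys), max(ys))

def predict(rect, x):
    (x1lo, x1hi), (x2lo, x2hi) = rect
    return 1 if x1lo <= x[0] <= x1hi and x2lo <= x[1] <= x2hi else 0

X = [((1.0, 1.0), 1), ((2.0, 1.5), 1), ((1.5, 2.0), 1),
     ((4.0, 4.0), 0), ((0.2, 3.0), 0)]
h = tightest_rectangle(X)
print(h)                       # -> ((1.0, 2.0), (1.0, 2.0))
print(predict(h, (1.2, 1.2)))  # -> 1 (inside the rectangle)
```

Richer hypothesis classes (lines, curves) trade this simplicity for flexibility; the choice of class is part of the learning problem.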
posted on 2017-03-25 00:01 by chaseblack