
Introduction to Machine Learning


 


Introduction

An ML model may be predictive, making predictions about the future; descriptive, gaining knowledge from data; or both. Accordingly, we speak of predictive machine learning and descriptive machine learning.

Examples of ML Applications

Learning Associations

Finding an association rule amounts to learning a conditional probability P(Y|X), where X is the product the customer has already bought and Y is the product whose purchase we want to predict. When making a distinction among customers, we instead estimate P(Y|X,D), where D is the set of customer attributes.
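As a minimal sketch, the confidence of an association rule can be estimated directly from transaction data as the empirical P(Y=1 | X=1). The product names and toy baskets below are invented for illustration.

```python
# Estimate P(Y | X) for an association rule from a list of shopping baskets.
# Toy data: each basket is the set of products bought in one transaction.
baskets = [
    {"bread", "milk"},
    {"bread", "butter"},
    {"bread", "milk", "butter"},
    {"milk", "cereal"},
]

def confidence(x, y, baskets):
    """P(Y=1 | X=1): fraction of baskets containing x that also contain y."""
    with_x = [b for b in baskets if x in b]
    if not with_x:
        return 0.0
    return sum(1 for b in with_x if y in b) / len(with_x)

print(confidence("bread", "milk", baskets))  # 2 of 3 bread baskets also contain milk
```

Conditioning on customer attributes D would simply mean computing the same fraction over the subset of baskets belonging to customers with those attributes.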

Classification and Regression

Both classification and regression are supervised learning problems: there is an input X, an output Y, and the task is to learn the mapping from input to output, y = g(x|θ), where g(·) is the model and θ are its parameters. Y is a number in regression and a class code (e.g., 0/1) in classification. g(·) is the discriminant function (the criterion function used directly to classify pattern samples) separating the instances of different classes. In statistics, classification is called discriminant analysis.
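A small sketch of the mapping y = g(x|θ): with a linear g, evaluating it directly gives a regression output, and thresholding the same discriminant gives a class code. The weight values here are arbitrary illustrative numbers, not fitted parameters.

```python
import numpy as np

theta = np.array([2.0, -1.0])  # parameters θ of the model g (illustrative values)
bias = 0.5

def g(x):
    """The model y = g(x|θ): here a linear function of the input."""
    return float(theta @ x + bias)  # regression: output is a number

def classify(x):
    """Using g as a discriminant function: threshold it at 0."""
    return 1 if g(x) > 0 else 0     # classification: output is a class code

x = np.array([1.0, 0.5])
print(g(x))         # 2*1.0 - 1*0.5 + 0.5 = 2.0
print(classify(x))  # 1
```

The same g(·) thus serves both roles; only the interpretation of its output changes between regression and classification.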

Unsupervised Learning

In unsupervised learning, the aim is to find the regularities in the input. In statistics, this is also called density estimation. One method for density estimation is clustering, with applications such as customer segmentation, image compression, document clustering, and learning motifs (short sequences that occur frequently) by clustering sequences of DNA.
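Clustering can be sketched with a toy k-means on 1-D data: alternately assign each point to its nearest center and move each center to the mean of its assigned points. The data values and k below are made up for illustration.

```python
import numpy as np

def kmeans(data, k, iters=10, seed=0):
    """Toy 1-D k-means: returns the sorted cluster centers."""
    rng = np.random.default_rng(seed)
    centers = rng.choice(data, size=k, replace=False)  # pick k distinct starting points
    for _ in range(iters):
        # assignment step: each point goes to its nearest center
        labels = np.argmin(np.abs(data[:, None] - centers[None, :]), axis=1)
        # update step: each center moves to the mean of its assigned points
        for j in range(k):
            if np.any(labels == j):
                centers[j] = data[labels == j].mean()
    return np.sort(centers)

data = np.array([1.0, 1.2, 0.8, 5.0, 5.2, 4.9])
print(kmeans(data, k=2))  # two centers, one per well-separated group
```

For well-separated groups like these, the centers converge to the group means regardless of which distinct points are chosen as the initialization.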

Reinforcement Learning

Notes

Dedicated journals in ML include Machine Learning, Journal of Machine Learning Research, Neural Computation, Neural Networks, and IEEE Transactions on Neural Networks. Related statistics and data-mining journals include Annals of Statistics, Journal of the American Statistical Association, IEEE Transactions on Pattern Analysis and Machine Intelligence, Data Mining and Knowledge Discovery, IEEE Transactions on Knowledge and Data Engineering, and ACM SIGKDD Explorations Journal.


Dedicated conferences in ML include Neural Information Processing Systems, Uncertainty in Artificial Intelligence, International Conference on Machine Learning, European Conference on Machine Learning, Computational Learning Theory, and International Joint Conference on Artificial Intelligence.

Supervised Learning

Binary Classification

Suppose the input is 2-D, i.e., $x = [x_1, x_2]^T$, with label
$$r = \begin{cases} 1 & \text{if } x \text{ is a positive example} \\ 0 & \text{if } x \text{ is a negative example} \end{cases}$$
The training set contains $N$ such examples, $X = \{x^t, r^t\}_{t=1}^{N}$, where $t$ indexes the examples; each example $t$ is a data point at $(x_1^t, x_2^t)$ with its label $r^t$.
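The training set above can be sketched as two arrays, one holding the 2-D inputs and one holding the labels. The coordinates and labels below are invented example values.

```python
import numpy as np

# N = 4 training examples; row t is x^t = (x1^t, x2^t)
X = np.array([
    [1.0, 2.0],
    [2.0, 1.5],
    [0.2, 0.3],
    [0.5, 0.1],
])
r = np.array([1, 1, 0, 0])  # r^t = 1 for positive, 0 for negative examples

N = X.shape[0]
positives = X[r == 1]  # the data points labeled as positive examples
print(N, positives.shape)  # 4 (2, 2)
```

Indexing by the boolean mask `r == 1` recovers exactly the positive examples, mirroring how the label $r^t$ partitions the training set into the two classes.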

posted by chaseblack
