深度学习入门 (ISBN 9787115485588), Chapter 2
What is the perceptron algorithm invented by Frank Rosenblatt?
The perceptron algorithm is a supervised learning algorithm that classifies input data into one of two possible categories. It was invented by Frank Rosenblatt in 1957 and is one of the oldest and simplest artificial neural network models.
The perceptron works by taking a set of input values and computing a weighted sum of those inputs. This sum is then compared to a threshold value, and the perceptron outputs a 1 if the sum is greater than the threshold and a 0 otherwise. The weights used in the weighted sum are adjusted during training to minimize the error between the predicted output and the true output.
The training process of the perceptron algorithm involves iterating through the input data and adjusting the weights for each input until the algorithm converges to a set of weights that produces accurate predictions. During each iteration, the perceptron makes a prediction for each input and updates the weights based on whether the prediction is correct or not. The weights are updated using the following formula:
new_weight = old_weight + learning_rate * (true_output - predicted_output) * input
where learning_rate is a hyperparameter that controls the rate at which the weights are updated, true_output is the true label of the input, predicted_output is the output predicted by the perceptron, and input is the input data.
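This update rule can be turned into a short training loop. The following is a minimal NumPy sketch, assuming a step activation and labels in {0, 1}; the function name train_perceptron and its default hyperparameters are illustrative, not taken from the book.

import numpy as np

def train_perceptron(X, y, learning_rate=0.1, epochs=10):
    # X: array of shape (n_samples, n_features); y: true labels (0 or 1)
    weights = np.zeros(X.shape[1])
    bias = 0.0
    for _ in range(epochs):
        for x_i, y_i in zip(X, y):
            # step activation: predict 1 if the weighted sum plus bias is positive
            predicted = 1 if np.dot(weights, x_i) + bias > 0 else 0
            # update rule: new_weight = old_weight + learning_rate * (true - predicted) * input
            weights += learning_rate * (y_i - predicted) * x_i
            bias += learning_rate * (y_i - predicted)
    return weights, bias

On linearly separable data (for example, the AND truth table) the updates eventually stop changing the weights, which is the convergence the text above refers to.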
Explain each part of the formula y = w1x1 + w2x2 + b, which is used for the perceptron
The formula y = w1x1 + w2x2 + b represents the basic computation of a perceptron, where:
y is the output of the perceptron. It represents the predicted class label for the input data.
x1 and x2 are the input features for the perceptron. They represent the values of the features for a specific input data point.
w1 and w2 are the weights associated with the input features. They represent the importance of each input feature in making the prediction.
b is the bias term. It shifts the decision boundary: the perceptron activates only when the weighted sum exceeds -b, so the bias effectively sets the activation threshold.
The perceptron algorithm uses this formula to compute the weighted sum of the input features plus the bias. If w1x1 + w2x2 + b is greater than 0 (equivalently, if the weighted sum exceeds the threshold -b), the perceptron predicts the positive class; otherwise it predicts the negative class.
During the training process of the perceptron algorithm, the weights and bias term are adjusted to minimize the error between the predicted output and the true output. The weights are adjusted by adding a fraction of the difference between the true output and the predicted output multiplied by the input feature value. The bias term is adjusted by adding a fraction of the difference between the true output and the predicted output.
In summary, the perceptron algorithm uses the formula y = w1x1 + w2x2 + b to compute the weighted sum of the input features and make a prediction about the class label of the input data. The weights and bias term are adjusted during training to minimize the error between the predicted output and the true output.
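To make the formula concrete, here is a minimal sketch of this decision rule in Python. The weight and bias values are illustrative assumptions, chosen so the perceptron behaves like an AND gate; they are not values given in the text above.

def perceptron(x1, x2, w1, w2, b):
    # weighted sum of the inputs plus the bias, followed by a step activation
    y = w1 * x1 + w2 * x2 + b
    return 1 if y > 0 else 0

# example weights/bias (assumed) that make this an AND gate
for x1, x2 in [(0, 0), (0, 1), (1, 0), (1, 1)]:
    print(x1, x2, perceptron(x1, x2, w1=0.5, w2=0.5, b=-0.7))
# outputs 0, 0, 0, 1 for the four input pairs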
Is there a term for w1x1+w2x2?
Yes, the term "weighted sum" is often used to refer to the expression w1x1 + w2x2
. It represents the linear combination of the input features x1
and x2
with their corresponding weights w1
and w2
. The weighted sum is a fundamental building block in many machine learning algorithms, including the perceptron and linear regression.
What's the conceptual difference between A * B and A.dot(B)?
In NumPy, the * operator performs element-wise multiplication between two arrays, while the dot() method performs matrix multiplication between two arrays.
When two arrays A and B are multiplied using *, the corresponding elements of the two arrays are multiplied element-wise to produce a new array of the same shape. For example, if:
import numpy as np

A = np.array([[1, 2], [3, 4]])
B = np.array([[5, 6], [7, 8]])
Then A * B would result in:
array([[ 5, 12],
       [21, 32]])
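By contrast, A.dot(B) (equivalently np.dot(A, B) or A @ B) computes the matrix product of the same arrays:

A.dot(B)
# array([[19, 22],
#        [43, 50]])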