Week 4: Neural Networks - Representation
1.1 Intro
Application: reading handwriting (e.g. handwritten digits on checks).
Representation: This week, we are covering neural networks. A neural network is a model inspired by how the brain works. It is widely used today in many applications: when your phone interprets and understands your voice commands, it is likely that a neural network is helping to understand your speech; when you cash a check, the machines that automatically read the digits also use neural networks.
- XNOR (logical operator): gives 1 if x1 and x2 are both 0 or both 1; a small network that computes it is sketched below.
- x1 XNOR x2 = NOT (x1 XOR x2), where XOR is the exclusive OR.
- OR function: gives 1 if at least one of x1, x2 is 1.
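A minimal sketch of how a small two-layer network can compute XNOR (my own illustration, not code from the notes). A single sigmoid unit cannot represent XNOR, but combining an AND unit, a (NOT x1) AND (NOT x2) unit, and an OR unit can; the weights below are one possible choice that drives the sigmoid close to 0 or 1.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def unit(theta, x1, x2):
    # one sigmoid unit with a bias input x0 = 1
    return sigmoid(theta[0] + theta[1] * x1 + theta[2] * x2)

def xnor(x1, x2):
    a1 = unit([-30, 20, 20], x1, x2)    # close to x1 AND x2
    a2 = unit([10, -20, -20], x1, x2)   # close to (NOT x1) AND (NOT x2)
    return unit([-10, 20, 20], a1, a2)  # close to a1 OR a2

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, round(xnor(x1, x2)))  # prints 1 only when x1 == x2
```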
1.2 Non-linear hypotheses
1.3 Neurons and the brain
Neuro-rewiring experiments suggest a single learning algorithm: for example, the auditory cortex can learn to see when visual input is rerouted to it.
1.4 Model representation
- Dimension of Θ(j): if a network has s_j units in layer j and s_{j+1} units in layer j+1, then Θ(j) has dimension s_{j+1} x (s_j + 1).
- No. of features if all cubic terms of n raw inputs are included: O(n^3), which is the motivation for neural networks over polynomial logistic regression (a rough worked example follows below).
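As a rough worked example of why this blows up (the specific image size is my own illustrative assumption, not from the notes): for a 50 x 50 pixel grayscale image, n = 2500 raw features; the number of quadratic terms alone is already about n^2 / 2 ≈ 3 x 10^6, and including cubic terms gives on the order of n^3 ≈ 10^10, so fitting logistic regression with all these polynomial features is impractical.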
Let's examine how we will represent a hypothesis function using neural networks. At a very simple level, neurons are basically computational units that take inputs (dendrites) as electrical signals (called "spikes") that are channeled to outputs (axons). In our model, our dendrites are like the input features x_1, ..., x_n, and the output is the result of our hypothesis function. In this model our x_0 input node is sometimes called the "bias unit"; it is always equal to 1. In neural networks, we use the same logistic function as in classification, $\frac{1}{1 + e^{-\theta^T x}}$, yet we sometimes call it a sigmoid (logistic) activation function. In this situation, our "theta" parameters are sometimes called "weights". Visually, a simplistic representation is a layer of input nodes [x_0, x_1, x_2] feeding a single node that outputs h_θ(x); a forward-pass sketch follows below.
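To make the layer-by-layer representation and the Θ(j) dimension rule concrete, here is a minimal forward-propagation sketch (my own illustration with made-up random weights, not code from the course):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward_layer(Theta, a):
    """One layer of forward propagation.
    Theta has shape (s_next, s_prev + 1): one row per unit in the next layer,
    one column per unit in the previous layer plus the bias column."""
    a_with_bias = np.concatenate(([1.0], a))  # prepend the bias unit a0 = 1
    return sigmoid(Theta @ a_with_bias)

# a tiny 3-input -> 4-hidden -> 1-output network with made-up weights
rng = np.random.default_rng(0)
Theta1 = rng.standard_normal((4, 3 + 1))  # s2 x (s1 + 1) = 4 x 4
Theta2 = rng.standard_normal((1, 4 + 1))  # s3 x (s2 + 1) = 1 x 5

x = np.array([0.5, -1.2, 3.0])   # input features x1..x3 (x0 added inside)
a2 = forward_layer(Theta1, x)    # hidden layer activations
h = forward_layer(Theta2, a2)    # hypothesis h_theta(x)
print(h)
```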
1.5 Example
Quiz: Suppose x1 and x2 are binary valued (0 or 1). What Boolean function does the network shown in the quiz (figure not reproduced in these notes) approximately compute? (Hint: one possible way to answer this is to draw out a truth table, similar to what we did in the video.)
Answer: x1 OR x2
A simple example of applying neural networks is predicting x_1 AND x_2, the logical 'AND' operator, which is true only if both x_1 and x_2 are 1.
The graph of our function looks like [x_0, x_1, x_2] → [g(z^{(2)})] → h_Θ(x), with, for example, Θ^{(1)} = [-30 20 20]; a quick numeric check of this unit follows below.
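A quick numeric check of that AND unit (a minimal sketch, assuming the example weights [-30, 20, 20] above):

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

theta = np.array([-30.0, 20.0, 20.0])  # bias, weight for x1, weight for x2

for x1 in (0, 1):
    for x2 in (0, 1):
        h = sigmoid(theta @ np.array([1.0, x1, x2]))  # x0 = 1 is the bias unit
        print(f"x1={x1} x2={x2} -> h ≈ {h:.4f}")
# h is close to 1 only when x1 = x2 = 1, so the unit behaves like x1 AND x2
```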
1.6 Multiclass Classification
Suppose you have a multi-class classification problem with 10 classes. Your neural network has 3 layers, and the hidden layer (layer 2) has 5 units. Using the one-vs-all method described here, how many elements does Θ(2) have?
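Working this out with the dimension rule noted in 1.4 (my own worked answer, not stated in the notes): with one-vs-all over 10 classes the output layer (layer 3) has s_3 = 10 units, and the hidden layer has s_2 = 5 units, so Θ(2) has s_3 x (s_2 + 1) = 10 x (5 + 1) = 60 elements.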
1.7 Quiz