Article category - Advanced learning algorithm
Summary: Back propagation from sympy import * import numpy as np import re %matplotlib widget import matplotlib.pyplot as plt from matplotlib.widgets import Te
Read more
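The back-propagation post above works through derivatives on a small computation graph. As a minimal sketch of the idea (the graph `a = 2 + 3w`, `J = a**2` and its constants are illustrative, not taken from the post):

```python
# Tiny backprop sketch on the graph a = 2 + 3*w, J = a**2.
def forward(w):
    a = 2 + 3 * w      # linear node (constants chosen for illustration)
    J = a ** 2         # cost node
    return a, J

w = 3.0
a, J = forward(w)

# Backward pass: chain rule dJ/dw = (dJ/da) * (da/dw)
dJ_da = 2 * a          # local derivative of J = a**2
da_dw = 3              # local derivative of a = 2 + 3w
dJ_dw = dJ_da * da_dw

# Cross-check against a small nudge of w
eps = 0.001
approx = (forward(w + eps)[1] - J) / eps
print(J, dJ_dw, approx)
```

The numerical nudge approximates the same derivative that the chain rule computes exactly, which is the usual sanity check for a backprop implementation.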
Summary: Derivatives from sympy import symbols, diff J = (3)**2 J_epsilon = (3 + 0.001)**2 k = (J_epsilon - J)/0.001 # difference divided by epsilon print(f"J
Read more
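The derivatives post estimates a derivative with a difference quotient. The truncated snippet in the summary can be completed along these lines (sympy's `diff` would confirm symbolically that the derivative of $x^2$ is $2x$, so the value at $x=3$ should be close to 6):

```python
# Difference-quotient estimate of dJ/dx for J = x**2 at x = 3
J = (3) ** 2
J_epsilon = (3 + 0.001) ** 2
k = (J_epsilon - J) / 0.001   # difference divided by epsilon
print(f"J = {J}, J_epsilon = {J_epsilon}, dJ/dx ~= {k}")
```

Shrinking epsilon moves the estimate `k` closer to the exact value 6.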
Summary: adam algorithm convolution layer
Read more
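The post above covers the Adam algorithm. A single-parameter sketch of the standard Adam update (the learning rate, betas, and the toy objective `J(w) = w**2` are illustrative choices, not from the post):

```python
import numpy as np

def adam_step(w, grad, m, v, t, lr=0.001, beta1=0.9, beta2=0.999, eps=1e-8):
    """One Adam update for a scalar parameter w."""
    m = beta1 * m + (1 - beta1) * grad        # first-moment (mean) estimate
    v = beta2 * v + (1 - beta2) * grad ** 2   # second-moment estimate
    m_hat = m / (1 - beta1 ** t)              # bias correction
    v_hat = v / (1 - beta2 ** t)
    w = w - lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, m, v

# Minimize J(w) = w**2, whose gradient is 2w
w, m, v = 5.0, 0.0, 0.0
for t in range(1, 201):
    w, m, v = adam_step(w, 2 * w, m, v, t, lr=0.1)
print(w)
```

Because Adam rescales each step by the second-moment estimate, the step size stays near `lr` regardless of the raw gradient magnitude, which is why it tolerates a larger learning rate than plain gradient descent.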
Summary: softmax import numpy as np import matplotlib.pyplot as plt plt.style.use('./deeplearning.mplstyle') import tensorflow as tf from tensorflow.keras.mode
Read more
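The softmax post builds the function from scratch before using TensorFlow. A minimal NumPy version (the name `my_softmax` is the one the course labs use for this exercise; the input vector is illustrative):

```python
import numpy as np

def my_softmax(z):
    ez = np.exp(z)          # elementwise exponential
    return ez / ez.sum()    # normalize so the outputs sum to 1

a = my_softmax(np.array([1., 2., 3., 4.]))
print(a, a.sum())
```

The outputs form a probability distribution, and larger logits get larger probabilities.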
Summary: softmax softmax regression cost numerical roundoff errors model = Sequential([ Dense(units=25,activation='sigmoid'), Dense(units=15,activation='sigmoi
Read more
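This post discusses the numerical roundoff errors that motivate computing softmax and the cross-entropy loss together (in Keras, a linear output layer plus `from_logits=True`). The core numerical issue can be sketched in NumPy: exponentiating large logits directly overflows, while shifting by the maximum gives the same mathematical result safely (the logit values here are illustrative):

```python
import numpy as np

def softmax_naive(z):
    ez = np.exp(z)               # overflows for large z
    return ez / ez.sum()

def softmax_stable(z):
    ez = np.exp(z - np.max(z))   # shift by the max; result is unchanged
    return ez / ez.sum()

z = np.array([1000., 1001., 1002.])
# softmax_naive(z) would produce nan here due to overflow in exp
p = softmax_stable(z)
print(p)
```

Letting the loss function see the raw logits gives the library freedom to apply this kind of rearrangement internally.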
Summary: import tensorflow as tf from tensorflow.keras import Sequential from tensorflow.keras.layers import Dense model = Sequential([ Dense(units=25,activati
Read more
Summary: import numpy as np import tensorflow as tf from tensorflow.keras.models import Sequential from tensorflow.keras.layers import Dense import matplotlib.
Read more
Summary: talk about AI: AGI and ANI. A neuron is a simplified mathematical model of a biological neuron; we can train these neurons on different data.
Read more
Summary: def dense(a_in,W,b): units = W.shape[1] ###how many neurons a_out = np.zeros(units) for j in range(units): w = W[:,j] z =np.dot(w,a_in)+b[j] a_out[j]
Read more
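The summary above truncates the course's from-scratch `dense` layer. A complete, runnable version of that loop, with illustrative weights and a sigmoid activation (the course exercise this comes from uses sigmoid; the specific `W` and `b` values here are made up for the demo):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def dense(a_in, W, b):
    units = W.shape[1]             # how many neurons in this layer
    a_out = np.zeros(units)
    for j in range(units):
        w = W[:, j]                # weights of neuron j (one column of W)
        z = np.dot(w, a_in) + b[j]
        a_out[j] = sigmoid(z)
    return a_out

a_in = np.array([200.0, 17.0])
W = np.array([[0.01, -0.03, 0.05],
              [-0.02, 0.04, -0.06]])   # illustrative 2x3 weight matrix
b = np.array([-1.0, 1.0, 2.0])
out = dense(a_in, W, b)
print(out)
```

Each column of `W` holds one neuron's weights, so a 2-input, 3-neuron layer has a 2x3 weight matrix and a length-3 bias vector.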
Summary: Building a simple neural network import numpy as np import matplotlib.pyplot as plt plt.style.use('./deeplearning.mplstyle') import tensorflow as tf f
Read more
Summary: x = np.array([[200.0, 17.0]]) layer_1 = Dense(units=3,activation='sigmoid') #layer1,neurons:3,activation #activation function is sigmoid a1 = layer_1(
Read more
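The snippet above applies a 3-unit sigmoid `Dense` layer to one two-feature example. What that layer computes can be sketched in plain NumPy as a single vectorized matrix product (the weights and biases are illustrative; Keras would initialize its own):

```python
import numpy as np

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

x = np.array([[200.0, 17.0]])              # one example, two features
W1 = np.array([[0.01, -0.03, 0.05],
               [-0.02, 0.04, -0.06]])      # illustrative 2x3 weights
b1 = np.array([-1.0, 1.0, 2.0])
a1 = sigmoid(x @ W1 + b1)                  # layer output, shape (1, 3)
print(a1.shape)
```

The matrix form `x @ W1 + b1` produces the same numbers as looping over the three neurons one at a time, which is why frameworks implement layers this way.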
Summary: neural networks intuition **Origins:** Algorithms that try to mimic the brain. Used in the 1980's and early 1990's. Fell out of favor in the late 199
Read more