Notes and code from Data_310 Lectures
The Fashion MNIST example increases the number of layers in our neural network from 1 (in the previous example) to 3. The last two are Dense layers with activation arguments using the relu and softmax functions. What is the purpose of each of these functions? Also, why are there 10 neurons in the third and last layer of the neural network?
All of the answers below come from the Exercise 2 Notes
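To make the two activation functions concrete, here is a minimal NumPy sketch (not from the lecture code) of what relu and softmax each compute:

```python
import numpy as np

def relu(x):
    # relu zeroes out negative values and passes positive values through unchanged
    return np.maximum(0.0, x)

def softmax(x):
    # softmax rescales raw scores into probabilities that sum to 1
    e = np.exp(x - np.max(x))  # subtract the max for numerical stability
    return e / e.sum()

scores = np.array([-1.0, 0.0, 2.0])
print(relu(scores))        # [0. 0. 2.]
probs = softmax(scores)
print(np.argmax(probs))    # 2 -- the index with the highest probability
```

In the 10-neuron output layer, softmax turns the 10 raw scores into a probability for each of the 10 clothing classes.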
The following is the code used for this question
# apply the trained model to the test images; each row of the result
# holds 10 class probabilities (one per Fashion MNIST category)
classifications = model.predict(x_test)
print(classifications[0])

# argmax returns the index of the largest probability, i.e. the predicted class
import numpy as np
print(np.argmax(classifications[0]))

# plot the actual image to compare against the prediction
import matplotlib.pyplot as plt
plt.imshow(x_test[0])
plt.show()
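As a usage illustration, the argmax index maps directly to a Fashion MNIST class name. The probability row below is made up for the example (it is not real model output), but the class-name list is the standard Fashion MNIST label order:

```python
import numpy as np

# hypothetical output row from model.predict: 10 class probabilities
probs = np.array([0.01, 0.02, 0.01, 0.05, 0.01, 0.02, 0.01, 0.80, 0.02, 0.05])
predicted = np.argmax(probs)  # index of the highest probability

# Fashion MNIST class names, indexed 0-9
class_names = ["T-shirt/top", "Trouser", "Pullover", "Dress", "Coat",
               "Sandal", "Shirt", "Sneaker", "Bag", "Ankle boot"]
print(predicted, class_names[predicted])  # 7 Sneaker
```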