Question: This question asks you to use a Python notebook to answer. Using the provided notebook, answer the below.
You are building a neural network using the following layers:
- A dense layer with 512 units and an input shape of max_words
- A ReLU activation function
- A dropout rate of 0.5
- Another dense layer with num_classes units
- A final softmax activation function
The network is compiled with categorical cross-entropy as the loss function, the Adam optimizer, and accuracy as an evaluation metric.
Hint:
model.compile(loss='categorical_crossentropy',
              optimizer='adam',
              metrics=['accuracy'])
model.summary()
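A minimal model definition matching the layers listed above might look like the sketch below (this is an assumption about the intended construction, not code given in the question; `num_classes = 46` is assumed, since that is what `np.max(y_train) + 1` yields for the Reuters topic set):

```python
import tensorflow as tf

max_words = 1000
num_classes = 46  # assumption: the Reuters newswire topic count

# Sequential stack mirroring the layer list in the question.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(max_words,)),        # input shape of max_words
    tf.keras.layers.Dense(512),                # dense layer with 512 units
    tf.keras.layers.Activation('relu'),        # ReLU activation
    tf.keras.layers.Dropout(0.5),              # dropout rate of 0.5
    tf.keras.layers.Dense(num_classes),        # dense layer with num_classes units
    tf.keras.layers.Activation('softmax'),     # final softmax activation
])
model.compile(loss='categorical_crossentropy',
              optimizer='adam',
              metrics=['accuracy'])
model.summary()
```

`model.summary()` prints the per-layer parameter counts that the question asks you to total.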
What is the total number of parameters in this neural network?
Please enter your answer as a numeric value.

import numpy as np
import tensorflow as tf
from tensorflow.keras.datasets import reuters
from tensorflow.keras.preprocessing.text import Tokenizer

max_words = 1000
batch_size = 32
epochs = 5
print('Loading data...')
(x_train, y_train), (x_test, y_test) = reuters.load_data(
    num_words=max_words, test_split=0.2)
print(len(x_train), 'train sequences')
print(len(x_test), 'test sequences')
num_classes = np.max(y_train)+1
print(num_classes, 'classes')
print('Vectorizing sequence data...')
tokenizer = Tokenizer(num_words=max_words)
x_train = tokenizer.sequences_to_matrix(x_train, mode='binary')
x_test = tokenizer.sequences_to_matrix(x_test, mode='binary')
print('x_train shape:', x_train.shape)
print('x_test shape:', x_test.shape)
print('Convert class vector to class matrix '
'(for use with categorical_crossentropy)')
y_train = tf.keras.utils.to_categorical(y_train, num_classes)
y_test = tf.keras.utils.to_categorical(y_test, num_classes)
print('y_train shape:', y_train.shape)
print('y_test shape:', y_test.shape)
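The parameter total can also be derived by hand from the layer shapes alone. A short sketch of that arithmetic, assuming `num_classes = 46` (the value `np.max(y_train) + 1` produces for the Reuters topics):

```python
# Hand count of trainable parameters for the described network.
# Assumption: num_classes = 46 (what np.max(y_train) + 1 yields for Reuters).
max_words = 1000
num_classes = 46

# Dense(512) on a max_words-dim input: weights + biases
dense1_params = max_words * 512 + 512            # 512,512

# Dense(num_classes) on the 512-dim hidden output: weights + biases
dense2_params = 512 * num_classes + num_classes  # 23,598

# ReLU, Dropout, and softmax layers contribute no parameters.
total_params = dense1_params + dense2_params
print(total_params)  # 536110
```

Each dense layer contributes `inputs * units` weights plus `units` biases; the activation and dropout layers are parameter-free.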
