We created our first deep RNN. We also learned how to make predictions for the next N steps, not just the immediate next step. The model was trained for only a few epochs, yet it performed very well.
Keras makes it easy to build a deep RNN: simply stack `SimpleRNN` layers.
Note that it is necessary to explicitly tell each layer to output the entire sequence so that the next layer doesn't just receive the final output of the previous one.
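A minimal sketch of such a stack (the 20-unit layer sizes are illustrative): every layer except the last sets `return_sequences=True`, so the following layer gets an output for every time step rather than only the final one.

```python
import numpy as np
from tensorflow import keras

# Deep RNN: every recurrent layer except the last returns the full
# sequence so the next layer receives one input per time step.
model = keras.models.Sequential([
    keras.Input(shape=[None, 1]),  # any number of time steps, 1 feature
    keras.layers.SimpleRNN(20, return_sequences=True),
    keras.layers.SimpleRNN(20, return_sequences=True),
    keras.layers.SimpleRNN(1),     # final layer keeps only the last output
])

# A batch of 3 series, each 50 time steps long.
X = np.random.rand(3, 50, 1).astype(np.float32)
y_pred = model(X)  # one predicted value per series
```

If the middle layer omitted `return_sequences=True`, the third `SimpleRNN` would receive a 2D tensor and raise a shape error.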
We could use another `SimpleRNN` for the final layer, but it is often preferable to use a single `Dense` neuron.
One reason is that a single recurrent output neuron carries only one hidden state value, whereas a `Dense` neuron has a separate weight for each of its inputs.
Also, the recurrent neuron uses the tanh activation function, which limits its output to the range -1 to 1; a `Dense` neuron with no activation has no such restriction.
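Sketching that variant (again with illustrative 20-unit layers): the last recurrent layer now returns only its final output, and a `Dense(1)` layer maps those 20 hidden values to a single unbounded prediction.

```python
import numpy as np
from tensorflow import keras

model = keras.models.Sequential([
    keras.Input(shape=[None, 1]),
    keras.layers.SimpleRNN(20, return_sequences=True),
    keras.layers.SimpleRNN(20),   # return_sequences defaults to False
    keras.layers.Dense(1),        # linear output: not squashed to [-1, 1]
])

X = np.random.rand(3, 50, 1).astype(np.float32)
y_pred = model(X)
```

The `Dense` layer here has 21 parameters (20 weights plus a bias), one per hidden value from the preceding layer.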
So far, we have only predicted the value at the next time step, but we could just as easily predict the value 10 steps ahead by changing how we split the mock data into X and y datasets.
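A sketch of that split, using randomly generated stand-in data (the real notebook builds its series elsewhere): the inputs stay the same, and only the slice used for the target shifts forward by the forecast horizon.

```python
import numpy as np

def split_for_horizon(series, n_steps, horizon):
    """Take the first n_steps as X and the value `horizon` steps
    after the last input step as y."""
    # series has shape [batch, n_steps + horizon, 1]
    X = series[:, :n_steps]
    y = series[:, n_steps + horizon - 1]
    return X, y

series = np.random.rand(100, 60, 1)  # 50 input steps + horizon of 10
X, y = split_for_horizon(series, n_steps=50, horizon=10)
```

With `horizon=1` this reduces to the usual next-step target, `series[:, n_steps]`.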
But what if we wanted to predict all of the next 10 steps?
One way to do this is to use the one-step model trained above to predict the next step, append that prediction to the input, predict the step after that, and so on.
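The mechanics of that loop can be sketched as follows (an untrained stand-in model is used here just to show the shapes; in practice each step's prediction error compounds into the next):

```python
import numpy as np
from tensorflow import keras

# Stand-in one-step model (untrained, for illustration only).
model = keras.models.Sequential([
    keras.Input(shape=[None, 1]),
    keras.layers.SimpleRNN(20, return_sequences=True),
    keras.layers.SimpleRNN(20),
    keras.layers.Dense(1),
])

X = np.random.rand(1, 50, 1).astype(np.float32)
X_extended = X
for _ in range(10):
    y_next = model(X_extended).numpy()        # shape [1, 1]
    y_next = y_next[:, np.newaxis, :]         # shape [1, 1, 1]
    # Append the prediction as the newest time step.
    X_extended = np.concatenate([X_extended, y_next], axis=1)

forecast = X_extended[:, 50:, 0]  # the 10 predicted values
```

Each iteration feeds the model an input one step longer than the last, which works because the model accepts sequences of any length.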