1 Early stopping
At this point, the priority is to complete exercises implementing and using the neural network.
If you have time, the following exercises will give you practice with the techniques from today’s lecture.
In this exercise, we will implement the early stopping method to avoid overfitting.
1.1 Split the data
In this technique, you need to divide the available data into three subsets: a training, a validation and a test set. Write a function that randomly splits your data into these three subsets.
If you need a hint, go back to Tutorial 4.1 and take a look at mkTestTrainSets; what you need to do here is very similar.
1.2 Monitor the error on the validation set
Modify the function trainNetwork to monitor the error on the validation set during training. Training should stop when the validation error increases.
- First, try to stop training as soon as the validation error starts increasing.
- Now modify your trainNetwork a bit so that training stops when the validation error has increased for a specified number of consecutive iterations, called maxFail. When training stops, the network with the minimum validation error is returned. You can set maxFail = 6 as a default value and try out different values of maxFail afterwards (a sketch of such a training loop is given after this list).
- Test your networks on the test set and compare the results of the two experiments. Which one gives you better results? Can you explain why?