CSC 370

Transfer Learning

 

One of the most useful ideas in the development of deep learning for computer vision is the concept of transfer learning. This is the use of a network trained on one task to solve a different but related task. In many cases, the network can be retrained for the new task with much less work than it took for the original training, because most of the features learned at the lower layers of the network can be reused for the new task.
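In practice, transfer learning usually means keeping a pretrained network's early layers and replacing only its final classification layers before retraining. As a rough sketch (not the Matlab tutorial's exact code), assuming the pretrained network is AlexNet and the new task has ten classes, the layer surgery looks something like this:

net = alexnet;                          % pretrained network (needs the AlexNet support package)
layersTransfer = net.Layers(1:end-3);   % keep every layer except the last three

numClasses = 10;                        % number of categories in the new task
layers = [
    layersTransfer
    fullyConnectedLayer(numClasses, ...
        'WeightLearnRateFactor',20,'BiasLearnRateFactor',20)
    softmaxLayer
    classificationLayer];

% A small base learning rate keeps the reused layers nearly fixed, while the
% larger rate factors above let the new fully connected layer adapt quickly.
options = trainingOptions('sgdm','InitialLearnRate',1e-4, ...
    'MaxEpochs',6,'MiniBatchSize',64,'Verbose',false);
% netTransfer = trainNetwork(augTrain,layers,options);  % augTrain: a training datastore

The layer indices, learning-rate factors, and training options here are illustrative; the tutorial below walks through the actual values it uses.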

Matlab and Deep Learning

Matlab has added objects and routines for deep learning to its Neural Network Toolbox, which is licensed by Smith. They provide several good tutorials, including one on transfer learning. Try it out and see how it works. You can also view this webinar, which covers a similar example, or a longer webinar that covers several deep learning examples. (You will not need either of these for the exercises below, but they are provided as places to explore next if you are interested.)

When you have finished with the transfer learning demo, make it your own by attempting the same thing with a new data set. You can load the MNIST digit data set with the following code (taken from another exercise on deep learning in Matlab):

% Point an imageDatastore at the digit images that ship with the toolbox,
% using the folder names as the class labels.
digitDatasetPath = fullfile(matlabroot,'toolbox','nnet','nndemos',...
    'nndatasets','DigitDataset');
digitData = imageDatastore(digitDatasetPath,...
    'IncludeSubfolders',true,'LabelSource','foldernames');
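Before changing anything, it may help to confirm how the new images differ from what the network expects; a quick check (assuming the datastore loaded as above) is:

img = readimage(digitData,1);   % read the first image in the datastore
size(img)                       % 28x28 grayscale, versus AlexNet's 227x227x3 input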

You'll probably have to make a few changes because the images are a different size, but it shouldn't be too hard to get transfer learning working on this new set. Specifically, the demo has two calls that create an augmentedImageDatastore. Since the MNIST images are grayscale and AlexNet expects color images, you need to add the property specifier 'ColorPreprocessing','gray2rgb' to each creation call, as in the sketch below.
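For concreteness, here is a minimal sketch of what those two creation calls might look like for the digit data. The variable names and the 70/30 split are just for illustration; the key pieces are resizing to AlexNet's 227-by-227 input and the 'gray2rgb' conversion:

% Hypothetical split into training and test sets; match the demo's own split.
[trainDigits,testDigits] = splitEachLabel(digitData,0.7,'randomized');

inputSize = [227 227];          % spatial size of AlexNet's 227x227x3 input layer

% Resize the 28x28 grayscale digits and replicate them across three
% color channels so they match the images AlexNet was trained on.
augTrain = augmentedImageDatastore(inputSize,trainDigits, ...
    'ColorPreprocessing','gray2rgb');
augTest = augmentedImageDatastore(inputSize,testDigits, ...
    'ColorPreprocessing','gray2rgb');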