Monday 27 June 2016

A recipe for getting into deep learning (10 artificially quickened steps)

To answer the constantly recurring Quora question of "how do I get into deep learning?", I've decided to write down a possible workflow - a recipe from scratch, so to speak:

(0) Preparations: get psyched
Watch Ex Machina, read I, Robot, read Andrej's short story (for the A.I. researcher's perspective), Google the words "deep learning" and see how far it has proliferated into mainstream media (for the journalist's perspective). Breathe.

(1) Read over these notes: http://vision.stanford.edu/teaching/cs231n/ (not once, but twice, so you can catch all the additional tips and tricks sprinkled throughout)


(2) To complement (1), watch these lectures: https://www.youtube.com/playlist?list=PLkt2uSq6rBVctENoVBg1TpCC7OQi31AlC - too much content, too little time? Install this video speed controller: https://chrome.google.com/webstore/detail/video-speed-controller/nffaoalbilbmmfgbnbgppjihopabppdk?hl=en (you're welcome)


(3) Install Evernote. Start clipping any tip or hint, blog post, forum response, or useful link. Collect knowledge at a rapid pace.

(4) Go over every example and tutorial: http://caffe.berkeleyvision.org/ 
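
For a taste of what those tutorials build up to, here is a rough sketch of loading a pretrained net and running one forward pass through pycaffe, along the lines of the classification example. The file names and the test image are placeholders for whatever model you download, and it assumes the deploy net's output blob is named 'prob' as in the BVLC reference models:

import caffe

caffe.set_mode_cpu()  # or caffe.set_mode_gpu() if you built with CUDA

# Placeholder paths: point these at a downloaded model, e.g. the BVLC reference net
net = caffe.Net('deploy.prototxt',      # network definition
                'weights.caffemodel',   # pretrained parameters
                caffe.TEST)

# Preprocess an image into the (channels, height, width) layout Caffe expects
transformer = caffe.io.Transformer({'data': net.blobs['data'].data.shape})
transformer.set_transpose('data', (2, 0, 1))     # HWC -> CHW
transformer.set_raw_scale('data', 255)           # [0,1] float -> [0,255]
transformer.set_channel_swap('data', (2, 1, 0))  # RGB -> BGR

img = caffe.io.load_image('cat.jpg')             # any test image you like
net.blobs['data'].data[0, ...] = transformer.preprocess('data', img)

out = net.forward()
print('predicted class:', out['prob'][0].argmax())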

(5) Start with a simple network like AlexNet and write out the model structure, draw out the blobs and layers, calculate the number of parameters and computations, examine how the input and output dimensions change from layer to layer. Consider some other architectures and look at how different architectural choices would affect these calculations.
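
To make the bookkeeping concrete, here's a tiny sketch of the arithmetic for a single conv layer; the numbers in the example are the usual AlexNet conv1 setup (227x227x3 input, 96 filters of 11x11, stride 4, no padding):

def conv_layer_stats(in_size, in_channels, num_filters, kernel, stride, pad=0):
    """Output spatial size, parameter count and multiply-adds for one conv layer."""
    out_size = (in_size - kernel + 2 * pad) // stride + 1
    params = num_filters * (in_channels * kernel * kernel + 1)  # weights + biases
    mult_adds = out_size * out_size * num_filters * in_channels * kernel * kernel
    return out_size, params, mult_adds

# AlexNet conv1
out, params, macs = conv_layer_stats(227, 3, 96, kernel=11, stride=4)
print(out, params, macs)  # -> 55, 34944, 105415200 (55x55x96 output, ~35K params, ~105M mult-adds)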


(6) Set up an IPython notebook and start playing with simple existing nets: figure out how to parse and visualize the training/test errors and monitor model performance over iterations, figure out how to visualize the features and different net computations, run existing nets on some new images, plot some beautiful results (for instance, the first-layer filters - see the sketch below).
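
The first-layer filters of a trained convnet are an easy and satisfying thing to visualize. A rough sketch with numpy and matplotlib, assuming the same placeholder model files as above and a first conv layer named 'conv1' with 3-channel filters (as in AlexNet):

import numpy as np
import matplotlib.pyplot as plt
import caffe

net = caffe.Net('deploy.prototxt', 'weights.caffemodel', caffe.TEST)

# First-layer weights: shape (num_filters, channels, k, k), e.g. (96, 3, 11, 11)
w = net.params['conv1'][0].data

# Normalize to [0, 1] for display and tile the filters into a grid
w = (w - w.min()) / (w.max() - w.min())
cols = 12
rows = int(np.ceil(w.shape[0] / float(cols)))
fig, axes = plt.subplots(rows, cols, figsize=(cols, rows))
for i, ax in enumerate(axes.flat):
    ax.axis('off')
    if i < w.shape[0]:
        ax.imshow(w[i].transpose(1, 2, 0))  # CHW -> HWC for imshow
plt.show()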


(7) Fine-tune an existing net for a new dataset and task (bonus points for coming up with a fun new task and dataset).
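
In Caffe this is mostly a matter of editing the train/val prototxt (swap in a data layer for your dataset and rename/resize the final classifier layer to match your number of classes), then initializing the solver from the pretrained weights. A minimal sketch via pycaffe, with placeholder file names:

import caffe

caffe.set_mode_gpu()  # CPU works too, just slowly

# solver.prototxt points at your edited net definition for the new task
solver = caffe.SGDSolver('solver.prototxt')

# Initialize from the pretrained weights; layers whose names match keep their
# parameters, while renamed layers (e.g. the new final classifier) start from scratch
solver.net.copy_from('pretrained.caffemodel')

solver.solve()  # or step through manually: solver.step(100)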


(8) Hear from the giants and gain some additional high-level intuitions: http://www.ipam.ucla.edu/programs/summer-schools/graduate-summer-school-deep-learning-feature-learning/?tab=schedule


(9) Dig deeper and build stronger foundations from the bottom up: http://www.deeplearningbook.org/

(10) Re-watch as many deep learning talks from the last few years as possible (at 1.5-2.0x speed, of course). Open arXiv. Breathe.



