So I'm one lesson into part 2 of the Fastai course (2019) on account of Fastai 2 not being quite ready for release yet.

Not what I was expecting! I was expecting more advanced topics like object detection, style transfer, and so on; in previous years that's what part 2 was mainly about.

The 2019 version of part 2, however, starts with recreating the basics of Fastai, from scratch, in Python.

You start with only the most basic libraries, and you don't get to use any pre-existing DL functionality until you've written it yourself from scratch. For example, you're not allowed to use m1@m2 until you've implemented your own matmul. The first lesson starts with matmul as 3 nested loops and works its way up through element-wise maths, broadcasting, einsum etc. until you've got something (literally) about fifty thousand times faster than the nested loops; then you're allowed to use @.
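To make that progression concrete, here's a minimal sketch of the stages in PyTorch. The function names and shapes are my own, not the course's notebook code; each version removes one Python loop, which is where the speedup comes from.

```python
import torch

a = torch.randn(64, 32)
b = torch.randn(32, 16)

def matmul_loops(a, b):
    # Baseline: three nested Python loops, one multiply-add at a time.
    m, n = a.shape
    n2, p = b.shape
    assert n == n2
    c = torch.zeros(m, p)
    for i in range(m):
        for j in range(p):
            for k in range(n):
                c[i, j] += a[i, k] * b[k, j]
    return c

def matmul_elementwise(a, b):
    # Replace the inner loop with an element-wise product and a sum.
    m, p = a.shape[0], b.shape[1]
    c = torch.zeros(m, p)
    for i in range(m):
        for j in range(p):
            c[i, j] = (a[i, :] * b[:, j]).sum()
    return c

def matmul_broadcast(a, b):
    # Broadcasting removes another loop: each row of a, reshaped to a
    # column, multiplies all of b at once, then we sum over the shared dim.
    m, p = a.shape[0], b.shape[1]
    c = torch.zeros(m, p)
    for i in range(m):
        c[i] = (a[i].unsqueeze(-1) * b).sum(dim=0)
    return c

def matmul_einsum(a, b):
    # einsum expresses the whole contraction in a single call.
    return torch.einsum('ik,kj->ij', a, b)

# And finally the operator you weren't allowed to touch until now:
expected = a @ b
for f in (matmul_loops, matmul_elementwise, matmul_broadcast, matmul_einsum):
    assert torch.allclose(f(a, b), expected, atol=1e-5)
```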

It's very interesting. It's not strictly necessary to go this deep into how neural nets work, but some people just understand things better this way, and I'm one of them. The flip-side of 'you don't really need to know this' is that when new papers and techniques come along, understanding things at this level makes it much easier to experiment with new architecture ideas, new activation functions, optimisations etc.

So that's me for the next little while. Next Tuesday I need to pack my family into the car, drive back across Europe to the UK, and return to my day job. I'm not as close as I'd like to applying for DS jobs, but there's been a lot more to learn than I anticipated.

Note: If you’re in the London or Surrey area and want to talk to me about a job, please get in touch. joedockrill@gmail.com