First Kaggle Competition

26 Jul 2017 . category: Portfolio Building

Earlier this month, I listened to a podcast that introduced me to fast.ai [1]. The tongue-in-cheek objective of fast.ai is “Making Neural Nets Uncool Again” by making deep learning more accessible and less exclusively the domain of researchers and PhDs.

I like their teaching philosophy and their ardent claim that deep learning (and good education in general) needn’t be that complicated. So I’ve started taking their Practical Deep Learning for Coders, Part 1 course.

Rachel Thomas and Jeremy Howard are the team behind fast.ai; Jeremy previously served as President and Chief Scientist of Kaggle, a data science and machine learning competition site. So it shouldn’t be surprising that Kaggle plays a prominent role in the course.

Kaggle has been on the periphery of my awareness ever since I received yet another recommendation from my data-science-over-coffee mentor that fine Sunday morning in April. She strongly encouraged me to try my hand at past Kaggle competitions - past competitions because she felt they would be a useful platform for learning, but with less competitive stress. So I jotted that down as a critical to-do.

Fast-forward to taking this course, and my true introduction to Kaggle is now underway. For my very first homework assignment, I entered the State Farm Distracted Driver Detection competition from August 2016 [2]. The fact that I could do something like this in Week 1 (granted, with heavy assistance from the course material) is what makes this course so special. Providing these practical skills from the outset is core to the fast.ai philosophy, and I’m enjoying it so far.

As Jeremy has stressed in lecture, learning this way takes a leap of faith. I do not yet understand much of the underlying mechanics behind why my Kaggle score is so competitive (an “ok-ish” top 45% with minimal tweaking on my part beyond what was taught in class). But I trust that over the remaining six weeks of lectures, we will fully plumb the depths of what deep learning can be. And after seeing the view from the top, I’m excited to take that leap [3].

Footnotes

  1. Previous blog post about my intro to fast.ai. 

  2. I placed my Jupyter notebook for this assignment/competition on GitHub. 

  3. I’m also a little impatient to get to Week 5 NLP + Recurrent Neural Networks, which I feel will have the most immediate benefit to my day job in the Notes world. 

