Math, code, concepts: A third road to deep learning
Sigrid Keydana - March 14, 2019
Not everybody who wants to get into deep learning has a strong background in math or programming. This post elaborates on a concepts-driven, abstraction-based way to learn what it’s all about.
In the previous version of their awesome deep learning MOOC, I remember fast.ai’s Jeremy Howard saying something like this:
You are either a math person or a code person, and […] *
*If I don’t remember correctly: please just allow me to use this as the perfect intro to this post.
I may be wrong about the either, and this is not about either versus, say, both. What if, in reality, you're none of the above?
What if you come from a background that is close to neither math and statistics nor computer science: the humanities, say? You may not have that intuitive, fast, effortless-looking understanding of LaTeX formulae that comes with natural talent, years of training, or both; the same goes for computer code.
Understanding always has to start somewhere, so it will have to start with math or code (or both). Also, it's always iterative, and iterations will often alternate between math and code. But what are things you can do when, primarily, you'd say you are a concepts person?
When meaning doesn’t automatically emerge from formulae, it helps to look for materials (blog posts, articles, books) that stress the concepts those formulae are all about. By concepts, I mean abstractions: concise, verbal characterizations of what a formula signifies.
Let’s try to make conceptual a bit more concrete. At least three aspects come to mind: useful abstractions, chunking (composing symbols into meaningful blocks), and action (what does that entity actually do?).
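As a purely illustrative sketch (the choice of formula here, softmax, is mine, not an example taken from the post), the short Python snippet below shows how those three aspects can play out: the formula is chunked into two meaningful pieces, and its action is stated in plain words and then checked numerically.

```python
import numpy as np

# Softmax: sigma(z)_i = exp(z_i) / sum_j exp(z_j)
# Chunking the formula:
#   - exp(z_i): turn every score into a positive number
#   - the denominator: a normalizer, so the outputs sum to 1
# Action (the abstraction): map an arbitrary vector of scores
# to a probability distribution over the same entries.

def softmax(z):
    z = np.asarray(z, dtype=float)
    shifted = z - z.max()          # subtract the max for numerical stability
    exp_z = np.exp(shifted)        # chunk 1: exponentiate each score
    return exp_z / exp_z.sum()     # chunk 2: normalize so everything sums to 1

scores = [2.0, 1.0, 0.1]
print(softmax(scores))             # roughly [0.66, 0.24, 0.10]
print(softmax(scores).sum())       # 1.0 -- it really is a probability distribution
```

Read this way, the symbols stop being an opaque wall of notation: each chunk has a job, and the whole thing has a describable action.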
Read more at Posit AI Blog: Math, code, concepts: A third road to deep learning