Tutorial 2: Normalisation



  • Add a recap definition and a framework-level definition of normalisation early on in the Tutorial 2 notebook
  • "divisive nature" → reword this phrase in the normalisation definition; it is unclear as stated
  • The motivation for a dedicated normalisation computation is not clearly contrasted with the alternative. The alternative needs to be stated explicitly: "Can we not just assume that, through gradient-descent-style learning, the neurons would implicitly learn this, and therefore we don't need a separate microcircuit to implement it?"
  • The terminology of ReLU and threshold could be very confusing (it was for me), so I added an extra explanation framing it in terms of weights & biases and slopes & intercepts, so that students can more intuitively see what the figure represents and how it conveys the take-away message
  • Throughout, add reinforcing messages that back up the stated points about how each mechanism enables or facilitates generalisation
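Since the notes ask for an early definition of normalisation, a minimal sketch of the canonical divisive-normalisation equation (Carandini & Heeger style) may be a useful reference when drafting it. The function name and the default parameters (`sigma`, exponent `n`) are illustrative assumptions, not taken from the notebook:

```python
import numpy as np

def divisive_normalization(drive, sigma=1.0, n=2.0):
    """Canonical divisive normalisation.

    Each unit's driving input is raised to a power n and divided by
    the pooled activity of the whole population plus a
    semi-saturation constant sigma (also raised to n).
    """
    drive = np.asarray(drive, dtype=float)
    pooled = sigma ** n + np.sum(drive ** n)
    return drive ** n / pooled
```

Because the pool in the denominator always exceeds any single numerator, the normalised responses are bounded and relative, which is the property the "dedicated computation vs. implicit learning" contrast hinges on.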
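For the ReLU/threshold framing, a tiny sketch of a single ReLU unit written in slope-and-intercept terms; the helper name and the example values are hypothetical, meant only to show how "threshold" maps onto weight and bias:

```python
import numpy as np

def relu_unit(x, w, b):
    """A single ReLU unit: max(0, w*x + b).

    Read as a line, w is the slope and b the intercept.
    The "threshold" is the input where the line crosses zero,
    x = -b / w (for w != 0): below it the unit is silent,
    above it the output grows linearly with slope w.
    """
    return np.maximum(0.0, w * x + b)
```

With w = 2 and b = -1, for instance, the threshold sits at x = 0.5, which is the kind of weights-and-biases reading the added explanation aims for.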