W1D5: Microcircuits


Prerequisites: needs rewriting. Fix the sentence structure and add a more welcoming intro to the day.
  • "While the first two tutorials of this day don’t use specific frameworks or modeling techniques, discussing fundamental operations using the most popular python libraries for data processing, the last tutorial discovers the attention mechanism presented in Transformers. It might be beneficial to have an idea of this architecture type, which is already presented in  W1D1 ; thus, no further specific knowledge is assumed."
Change to -->
  • "Welcome to Week 1 - Day 5. Today we'll be learning all about microcircuits, covering a few cases where methods from NeuroAI can yield interesting findings. The first two tutorials cover sparsity and normalization, two topics that are hugely important in neuroscience and in machine learning (and therefore, especially, in their interaction). The third tutorial will examine the famous 'attention' mechanism in Transformers. We assume some familiarity with Transformer-based attention (it was covered briefly in Week 1 - Day 1) but not much beyond that. This tutorial should provide you with a solid understanding of the main concepts and a visual depiction of how the method works. Good luck!"
  • Need to dig into the original prerequisite text to figure out what point it was making (is it more Python-focused?). Revisit, check, and complete the missing <...> part of the above. Done! ✅