CoSMo 2018

CoSMo logo

This page contains course materials for the CoSMo 2018 summer school.



Introduction

Jul 30 - Aug 4
Lecturers: Gunnar Blohm, Paul Schrater, Konrad Kording

Day 1 - Overview of modeling in neuroscience

CoSMo 2018 organizational slides

Konrad's and Gunnar's model pitches
Paul's multiple learning pitch
Paul's optimal forgetting pitch
Paul's deep learning bottleneck pitch
Paul's minimum intervention principle pitch


Afternoon tutorial 1: plotting neural data

Here is the file: [Stevenson Data Set]. We will work through it as part of the tuning curve exercise.

The tutorial is available here
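As a rough illustration of the tuning curve exercise, here is a minimal sketch using synthetic spike counts (the actual Stevenson data set has its own format, which is not assumed here): bin trials by movement direction and plot the mean firing rate per bin.

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
n_trials = 2000
directions = rng.uniform(0, 2 * np.pi, n_trials)      # movement direction per trial (rad)
true_rate = 10 + 8 * np.cos(directions - np.pi / 3)    # assumed cosine-tuned firing rate (Hz)
spikes = rng.poisson(true_rate * 0.5)                  # spike counts in a 0.5 s window

# Estimate the tuning curve: average spike counts within direction bins
bins = np.linspace(0, 2 * np.pi, 13)
bin_idx = np.digitize(directions, bins) - 1
centers = 0.5 * (bins[:-1] + bins[1:])
rates = np.array([spikes[bin_idx == i].mean() / 0.5 for i in range(12)])

plt.plot(centers, rates, 'o-')
plt.xlabel('movement direction (rad)')
plt.ylabel('firing rate (Hz)')
plt.title('Estimated tuning curve')
plt.show()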

Afternoon tutorial 2: gain modulation for reference frame transformations

The goal of this tutorial is to understand how gain modulation can be used for reference frame transformations, and how gain modulation can emerge from training a simple artificial neural network to carry out such transformations.
There are two different approaches to solving this:

  • exact determination of read-out weights from eye-position gain-modulated neurons, as in this seminal paper. Here the solution can be found by computing the least-squares optimal set of weights mapping the gain-modulated neurons (population code) to the head-centered output neuron(s). For this to work, the population-code neurons need to belong to the exponential family of functions (a rough sketch of this readout appears after this list).
  • training a neural network to perform reference frame transformations using this code. Here you can plot each individual neuron's receptive field for different eye positions and analyze how the receptive field changes with eye position in each network layer.
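Here is a rough sketch of the first approach (least-squares readout from a gain-modulated population), with made-up tuning parameters rather than those from the paper or the tutorial code: each unit has a Gaussian retinal receptive field multiplied by a linear eye-position gain, and the head-centered position is read out by least squares.

import numpy as np

rng = np.random.default_rng(1)

# Population: each unit has a preferred retinal position and an eye-position gain slope
pref = np.linspace(-40, 40, 17)                      # preferred retinal positions (deg)
slope = np.linspace(-0.02, 0.02, 9)                  # gain slopes (per deg of eye position)
PREF, SLOPE = np.meshgrid(pref, slope)
PREF, SLOPE = PREF.ravel(), SLOPE.ravel()

def responses(retinal, eye):
    """Gaussian retinal tuning multiplied by a linear eye-position gain."""
    tuning = np.exp(-(retinal[:, None] - PREF[None, :]) ** 2 / (2 * 10.0 ** 2))
    gain = 1.0 + SLOPE[None, :] * eye[:, None]
    return tuning * gain

# Training samples: random retinal target positions and eye positions
retinal = rng.uniform(-30, 30, 500)
eye = rng.uniform(-20, 20, 500)
R = responses(retinal, eye)                          # (trials x neurons) activity
head = retinal + eye                                 # head-centered target position

# Least-squares readout weights mapping population activity to head-centered position
w, *_ = np.linalg.lstsq(R, head, rcond=None)
pred = R @ w
print('readout RMSE (deg):', np.sqrt(np.mean((pred - head) ** 2)))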

Day 2 - Bayesian approaches

Bayesian perception - an introduction: a tremendous book written by Wei Ji Ma, Konrad Kording, and Daniel Goldreich

Morning lectures & tutorial

Konrad's Bayesian decoding and multi-sensory integration tutorials are available here (in folder day 2)
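For the multi-sensory integration part, the core computation is the standard Gaussian cue-combination rule; a minimal sketch with made-up numbers (not taken from the tutorial files), where each cue is weighted by its reliability (1/variance):

import numpy as np

x_vis, sigma_vis = 10.0, 2.0      # visual estimate and its noise SD (assumed values)
x_aud, sigma_aud = 14.0, 4.0      # auditory estimate and its noise SD (assumed values)

# Optimal combination: weight each cue by its reliability
w_vis = (1 / sigma_vis**2) / (1 / sigma_vis**2 + 1 / sigma_aud**2)
x_combined = w_vis * x_vis + (1 - w_vis) * x_aud
sigma_combined = np.sqrt(1 / (1 / sigma_vis**2 + 1 / sigma_aud**2))

print(f'combined estimate: {x_combined:.2f}, combined SD: {sigma_combined:.2f}')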

Afternoon lectures & tutorials
Dropbox link to Paul's slides and tutorial (in sub-folder decision_tutorial)

How to model tutorial
Modeling 101 slides


Day 3 - Linear systems and Kalman filtering

Morning: Linear systems (saccades)
Linear systems theory lecture
Eye movement tutorial and a possible solution for both time and frequency domain modelling
van Opstal syllabus - linear systems theory: a great syllabus developed by John van Opstal for CoSMo on using linear systems to model gaze control, with theory, exercises, and answers to the exercises
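As a toy example of the time- and frequency-domain views (a sketch with an assumed first-order eye plant and time constant, not the tutorial's model), one can inspect the step response and the Bode magnitude of the same transfer function:

import numpy as np
from scipy import signal
import matplotlib.pyplot as plt

tau = 0.15                                            # assumed plant time constant (s)
plant = signal.TransferFunction([1.0], [tau, 1.0])    # H(s) = 1 / (tau*s + 1)

t, y = signal.step(plant, T=np.linspace(0, 1, 500))   # time domain: step response
w, mag, phase = signal.bode(plant)                    # frequency domain: Bode plot

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(9, 3))
ax1.plot(t, y); ax1.set_xlabel('time (s)'); ax1.set_ylabel('response')
ax2.semilogx(w, mag); ax2.set_xlabel('frequency (rad/s)'); ax2.set_ylabel('gain (dB)')
plt.tight_layout(); plt.show()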

Afternoon: Kalman filtering
Dropbox link to Paul's slides and tutorial (in sub-folder Kalman_lecture_tutorial)
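A minimal one-dimensional Kalman filter sketch (a generic textbook random-walk example, not Paul's tutorial code): predict with the state model, then blend the prediction and the observation using the Kalman gain.

import numpy as np

rng = np.random.default_rng(2)
n, q, r = 100, 0.01, 1.0                              # steps, process noise var, observation noise var
x_true = np.cumsum(rng.normal(0, np.sqrt(q), n))      # hidden random-walk state
y = x_true + rng.normal(0, np.sqrt(r), n)             # noisy observations

x_hat, P = 0.0, 1.0                                   # initial state estimate and its variance
estimates = []
for obs in y:
    # Predict: random-walk dynamics (state unchanged, uncertainty grows)
    P = P + q
    # Update: blend prediction and observation with the Kalman gain
    K = P / (P + r)
    x_hat = x_hat + K * (obs - x_hat)
    P = (1 - K) * P
    estimates.append(x_hat)

print('final estimate vs. true state:', estimates[-1], x_true[-1])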


Day 4 - Motor control

Morning: optimal control
Control slides
Matlab control tutorial files
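As one simple instance of the optimal-control ideas in the lecture (a sketch with made-up point-mass dynamics and costs, separate from the Matlab tutorial files), here is a discrete-time LQR controller driving a point mass to a target:

import numpy as np
from scipy.linalg import solve_discrete_are

dt = 0.01
A = np.array([[1.0, dt], [0.0, 1.0]])     # state: [position, velocity]
B = np.array([[0.0], [dt]])               # control: acceleration command
Q = np.diag([1.0, 0.1])                   # state cost (penalize position/velocity error)
R = np.array([[1e-4]])                    # control effort cost

P = solve_discrete_are(A, B, Q, R)        # steady-state Riccati solution
K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)   # optimal feedback gains

x = np.array([0.1, 0.0])                  # start away from the target (target = origin)
for _ in range(300):
    u = -K @ x                            # feedback control law u = -K x
    x = A @ x + B @ u
print('state after 300 steps:', x)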

Afternoon: Paper writing 101
Individual abstract writing document
Konrad's PLoS CB 10 simple rules paper on how to structure papers

Evening: causality in neuroscience
Konrad's discussion slides and tutorials on causality in neuroscience can be found here (in sub-folder causality)


Day 5 - Optimality

Can a neuroscientist understand a micro-chip?
Konrad's article discussing this is published here

BADS model fitting
Lecturer: Luigi Acerbi

Luigi's slides
Here is the link to Luigi's tutorial files on GitHub.
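A generic model-fitting sketch of the kind BADS is used for: define a negative log-likelihood for a simple observer model and hand it to an optimizer. scipy.optimize.minimize appears here only as a stand-in; in the tutorial the optimizer would be BADS itself, and the model, data, and parameter names below are made up for illustration.

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(3)
stimulus = rng.uniform(-10, 10, 200)
true_sigma, true_bias = 2.0, 0.5
responses = stimulus + true_bias + rng.normal(0, true_sigma, 200)   # synthetic data

def neg_log_likelihood(params):
    """Gaussian observer model: response = stimulus + bias + noise(sigma)."""
    bias, sigma = params
    if sigma <= 0:
        return np.inf
    resid = responses - (stimulus + bias)
    return 0.5 * np.sum(resid**2 / sigma**2 + np.log(2 * np.pi * sigma**2))

result = minimize(neg_log_likelihood, x0=[0.0, 1.0], method='Nelder-Mead')
print('fitted bias and sigma:', result.x)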




Day 6 - Machine learning

Konrad's tutorial on machine learning for neuroscience can be found here (in sub-folder ML)

Motor control

Aug 6 - 7
Lecturers: Alaa Ahmed, Frederic Crevecoeur, Reza Shadmehr

Alaa's PM tutorial