Identification of mouse whisker movements using deep networks
Given their poor vision, rodents such as mice and rats rely heavily on their whiskers to acquire sensory information about their environment. These animals use their whiskers in much the same way as humans use their fingers to extract texture information. The mouse whisker system has emerged as a powerful experimental model for understanding how animals actively control their sensory inputs. Moreover, this system allows us to probe the link between sensory inputs and motor control. Together with my collaborator, Prof. Gilad Silberberg at the Karolinska Institute, we are recording neuronal activity from different brain regions while mice are whisking.
In these experiments whisking is not constrained, because we are interested in understanding the neural correlates of natural whisking. These experiments require identifying whisker movements from the video recordings and correlating them with the neuronal activity. Identifying whisker movements from high-definition video is a very challenging task, especially when we aim to track multiple whiskers simultaneously, and it cannot be done manually. Recently, Mathis et al. (2018) showed that deep networks with a ResNet architecture can be trained to automatically track selected video features. Their algorithm, DeepLabCut, is very promising and seems highly suited to our requirements. In this project we will adapt the DeepLabCut algorithm to extract whisker movements from our HD video data.
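Once DeepLabCut has been trained to label points along a whisker, the tracked coordinates must still be converted into a movement signal that can be correlated with neuronal activity. The sketch below illustrates one simple post-processing step, assuming the network outputs per-frame (x, y) coordinates for a whisker base and tip; the function name and the toy trajectory are illustrative, not part of the DeepLabCut API.

```python
import numpy as np

def whisker_angles(base_xy, tip_xy):
    """Per-frame whisker angle (degrees) from tracked base and tip points.

    base_xy, tip_xy: arrays of shape (n_frames, 2) holding (x, y)
    coordinates, e.g. two labelled body parts exported by a pose tracker.
    """
    d = np.asarray(tip_xy, dtype=float) - np.asarray(base_xy, dtype=float)
    # angle of the base -> tip vector relative to the image x-axis
    return np.degrees(np.arctan2(d[:, 1], d[:, 0]))

# toy example: a whisker sweeping from 0 to 90 degrees over three frames
base = np.zeros((3, 2))
tip = np.array([[1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
angles = whisker_angles(base, tip)  # -> [0.0, 45.0, 90.0]
```

The resulting angle time series can then be aligned with the electrophysiological recordings on a frame-by-frame basis.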
DeepLabCut is computationally very demanding, particularly during the training phase, and GPUs are essential to complete training in a reasonable time frame. Given the experimental design, we need to train DeepLabCut for several different experimental conditions and animals. Therefore, we request access to GPU nodes on the Kebnekaise cluster.
Mathis A, Mamidanna P, Cury KM, Abe T, Murthy VN, Mathis MW & Bethge M (2018) DeepLabCut: markerless pose estimation of user-defined body parts with deep learning. Nature Neuroscience. doi:10.1038/s41593-018-0209-y