Research


We study the neural mechanisms of cognition using whole-brain, single-neuron recordings in behaving zebrafish.

Currently, we pursue four parallel research directions, three of which focus on specific types of behavior.

Social behaviors

Spatial Navigation in VR

Reward & Learning

Nanoparticles and Cognition

We are a systems neuroscience lab that combines whole-brain neural imaging with computational tools in behaving animal models to understand the neural mechanisms underlying cognition and behavior. We hypothesize that cognition arises from brain-wide information integration; we therefore work with zebrafish, which give us access to whole-brain neurodynamics at single-cell resolution via cutting-edge microscopy.

We use data-driven approaches and develop computational models that link the brain to cognition, predicting decisions and behaviors from neural activity. Our research combines whole-brain neural recordings of behaving zebrafish with quantitative methods from machine learning and dynamical systems, alongside an array of genetic and optogenetic tools.
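As a minimal illustration of this kind of analysis (a simplified sketch, not our actual pipeline), the snippet below reduces a simulated population-activity matrix to a few latent dimensions and fits a cross-validated decoder that predicts a binary behavioral choice from those latents. All names, sizes, and the random data are placeholders.

```python
# Hypothetical sketch: decode a binary behavioral choice from population activity.
# Shapes, variable names, and the simulated data are illustrative placeholders.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_trials, n_neurons = 200, 5000                      # trials x recorded neurons (placeholder sizes)
activity = rng.normal(size=(n_trials, n_neurons))    # e.g. trial-averaged dF/F per neuron
choice = rng.integers(0, 2, size=n_trials)           # 0 = one choice, 1 = the other

# Reduce the high-dimensional population activity to a few latent components.
latents = PCA(n_components=10).fit_transform(activity)

# Decode the behavioral choice from the latent activity, with cross-validation.
decoder = LogisticRegression(max_iter=1000)
accuracy = cross_val_score(decoder, latents, choice, cv=5).mean()
print(f"cross-validated decoding accuracy: {accuracy:.2f}")
```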

Within the brain, the cerebellum has recently been recognized as a key sensorimotor integrator that coordinates a variety of cognitive functions and behavioral outputs. Our 5-to-10-year goal is to interrogate the underlying anatomical and functional connectivity, build data-driven quantitative models that explain the interplay between cerebellar microcircuits, brain states, and behaviors, and verify causality via optogenetic perturbations.


Previous research

How does the brain decide whether to turn left or right? Here we combined light-field microscopy (LFM) to monitor whole-brain neuronal activity, an operant conditioning task, and linear/nonlinear dimensionality reduction to predict which direction the fish is going to turn, and when, at the single-trial level and up to 10 seconds in advance.
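For intuition, a time-resolved analysis along these lines (a hypothetical sketch under assumed data shapes, not the published code) trains a separate decoder at each time bin before movement onset and asks when single-trial predictions of turn direction rise above chance; the earliest reliable bin gives an estimate of how far in advance the upcoming turn can be read out.

```python
# Hypothetical time-resolved decoding sketch: at each time bin before movement
# onset, ask how well single-trial turn direction can be predicted.
# Data, bin counts, and names are illustrative placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

n_trials, n_components, n_bins = 150, 10, 40                  # e.g. 40 bins spanning -20 s to 0 s
latents = rng.normal(size=(n_trials, n_bins, n_components))   # low-dimensional neural activity
direction = rng.integers(0, 2, size=n_trials)                 # 0 = left turn, 1 = right turn

accuracy_per_bin = []
for t in range(n_bins):
    decoder = LogisticRegression(max_iter=1000)
    acc = cross_val_score(decoder, latents[:, t, :], direction, cv=5).mean()
    accuracy_per_bin.append(acc)

# The earliest bin whose accuracy clears a chance-level threshold estimates
# how far in advance the upcoming turn direction becomes decodable.
threshold = 0.6
above = [t for t, a in enumerate(accuracy_per_bin) if a >= threshold]
print("first above-threshold bin:", above[0] if above else "none")
```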

Videos from Lin et al., Cell (2020).