Neural Computation

The human brain is made up of approximately 100 billion nerve cells (called neurons) with a total of approximately 100 trillion connections (called synapses) between them. These massive numbers lead to rich and irreducible dynamics in brain networks. How do we study such complexity? Is there a general framework to describe fundamental neural processes such as inference, learning and memory? What principles underlie complex cognitive phenomena?

Needless to say, studying such a system requires multiple levels of analysis -- from molecular biology to systems neuroscience and cognitive psychology. I'm primarily interested in circuit-level analysis. Specifically, starting from cellular-level models, how is information encoded, decoded and bound to create sensory representations in networks of neurons?

Some areas I'm studying are:

  • The spatiotemporal dynamics of groups of neurons with different topologies. How does the ratio of excitatory and inhibitory connections affect collective dynamics? What are the principles behind the cooperative and competitive dynamics of winner-take-all, winnerless competition and other generic circuitry?
  • Phase-locked synchronization under a common oscillatory potential: What is its role in sensory binding?
  • Coding strategies in sensory pathways: Temporal coding, rate coding, population coding, phase coding.
  • Learning synaptic parameters through spike-timing-dependent plasticity (STDP) for classifying spiking patterns.
  • Probabilistic graphical methods to model inference and learning in sensory pathways: hidden Markov models, deep Boltzmann machines.
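
To make the plasticity item above concrete, here is a minimal sketch of the standard pair-based STDP rule: a synapse is potentiated when the presynaptic spike precedes the postsynaptic spike, and depressed otherwise, with exponentially decaying dependence on the spike-time difference. The parameter names and values (learning rates, time constant, weight bounds) are illustrative assumptions, not values from my models.

```python
import numpy as np

def stdp_update(w, t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP update for one pre/post spike pair.

    w       : current synaptic weight
    t_pre   : presynaptic spike time (ms)
    t_post  : postsynaptic spike time (ms)
    a_plus, a_minus, tau : illustrative learning rates and time constant
    """
    dt = t_post - t_pre
    if dt > 0:
        # pre fires before post -> potentiation, decaying with |dt|
        dw = a_plus * np.exp(-dt / tau)
    else:
        # post fires before pre -> depression, decaying with |dt|
        dw = -a_minus * np.exp(dt / tau)
    # keep the weight in a bounded range, here assumed to be [0, 1]
    return float(np.clip(w + dw, 0.0, 1.0))

# A causal pairing (pre at 10 ms, post at 15 ms) strengthens the synapse;
# the reversed ordering weakens it.
w_up = stdp_update(0.5, t_pre=10.0, t_post=15.0)
w_down = stdp_update(0.5, t_pre=15.0, t_post=10.0)
```

Repeatedly applying such updates over a spike train is what allows a network to learn to classify spiking patterns by their temporal structure.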