School of Biological and Behavioural Sciences

Visual and auditory integration for spatial representation

Project Overview

The ability to navigate the world and recall visited locations is a fundamental skill shared by animals and humans. The physical environment contains many different cues that are perceived by our sensory systems. In mammals, the hippocampus and its adjacent areas in the medial temporal lobe have long been implicated in spatial navigation and memory. Several types of spatial neurons have been discovered in this region, including place cells and grid cells. However, little is known about how these spatial firing patterns are formed from sensory inputs and how this spatial information is integrated into existing memories.

This project will study how the brain integrates multisensory information (e.g., visual and auditory cues) to give rise to a unified spatial representation, and how this representation is consolidated during sleep in healthy and Alzheimer’s Disease (AD) models. The project will combine the two-dimensional virtual reality (2D VR) technique with in vivo electrophysiological recording (including Neuropixels) and two-photon imaging. The 2D VR system provides mice with an immersive experience of navigating a virtual world and allows visual, auditory and self-motion cues to be manipulated independently in 2D space. Computational models will be used to explain the findings and to make predictions for future experiments.

The outcome of the project will provide biological inspiration for the development of artificial intelligence, and will also shed light on the neural mechanisms implicated in age-associated cognitive decline, such as Alzheimer’s Disease. The successful candidate will have the opportunity to fine-tune the project based on their individual interests and skills.

Research Environment

Dr Guifen Chen’s group focuses on how sensory inputs are integrated at the neural-network level to form spatial representations in the brain. Her long-term research interests lie in the network mechanisms of spatial cognition and episodic memory in healthy and diseased brains, including Alzheimer’s Disease and autism. Prof Andrea Benucci’s group studies the neural substrates of visual processing and vision-based decision making. The supervisory team is at the forefront of contemporary neuroscience techniques, from in vivo electrophysiological recording and in vivo imaging to computational modelling.

More details can be found in the links below:

Find out more about the School of Biological and Behavioural Sciences on our website.

Keywords: Spatial navigation; Virtual reality; Sensory integration; Spatial memory; Alzheimer’s Disease

Entry Requirements

We are looking for candidates who hold, or expect to receive, a first or upper-second class honours degree and a Master’s degree in an area relevant to the project, such as Neuroscience, Life Sciences, Medicine, Psychology, Physics, Maths or Computer Science. Programming skills (e.g., MATLAB), a good understanding of mathematics, and/or experience with rodent experiments are desirable.

You must meet the IELTS requirements for your course and upload evidence before the CSC application deadline, ideally by 1st March 2025. You are therefore strongly advised to sit an approved English Language test as soon as possible; note that your IELTS result must still be valid when you enrol on the programme.

Please find further details on our English Language requirements page.

How to Apply

Formal applications must be submitted through our online form by 29th January 2025 for consideration. Please identify yourself as a ‘CSC Scholar’ in the funding section of the application.

Applicants are required to submit the following documents:

  • Your CV
  • Personal Statement
  • Evidence of English Language proficiency (e.g., IELTS certificate)
  • Copies of academic transcripts and degree certificates
  • References

Find out more about our application process on our SBBS website.

Informal enquiries about the project can be sent to Dr Guifen Chen at guifen.chen@qmul.ac.uk. Admissions-related queries can be sent to sbbs-pgadmissions@qmul.ac.uk.

Shortlisted applicants will be invited for a formal interview by the supervisor. If your QMUL application is successful, you will be issued a QMUL Offer Letter, conditional on securing a CSC scholarship and on meeting any outstanding academic entry requirements.

Once applicants have obtained their QMUL Offer Letter, they should then apply to CSC for the scholarship with the support of the supervisor.

For further information, please go to the QMUL China Scholarship Council webpage.

Apply Online

References

  1. Yang, Cacucci, Burgess, Wills & Chen. Visual boundary cues suffice to anchor place and grid cells in virtual reality. Current Biology 34(10), 2256–2264.e3 (2024). https://www.cell.com/current-biology/fulltext/S0960-9822(24)00466-4
  2. Bolaños, F., Orlandi, J. G., Aoki, R., Jagadeesh, A. V., Gardner, J. L. & Benucci, A. Efficient coding of natural images in the mouse visual cortex. Nat Commun 15, 2466 (2024).
  3. Rowland, D. C., Roudi, Y., Moser, M.-B. & Moser, E. I. Ten Years of Grid Cells. Annu Rev Neurosci 39, 1–22 (2015).
  4. Chen, G., Lu, Y., King, J. A., Cacucci, F. & Burgess, N. Differential influences of environment and self-motion on place and grid cell firing. Nat Commun 10, 630 (2019).
  5. Chen, G., King, J. A., Lu, Y., Cacucci, F. & Burgess, N. Spatial cell firing during virtual navigation of open arenas by head-restrained mice. Elife 7, e34789 (2018).
  6. Aronov, D., Nevers, R. & Tank, D. W. Mapping of a non-spatial dimension by the hippocampal–entorhinal circuit. Nature 543, 719–722 (2017).
  7. Doeller, C. F., Barry, C. & Burgess, N. Evidence for grid cells in a human memory network. Nature 463, 657–661 (2010).