Our group is interested in the neural basis of visual and navigation behavior in insects. We address scientific questions such as whether, and how, insects find their way back to their home or to a food resource. Additionally, we develop technology such as tracking systems, virtual reality arenas and microscopes to extend the limits of what can be recorded from insects (and sometimes other animals) as they move. Typically, this technology development is aimed at allowing us to address a scientific question.
Figure: Displacement experiments reveal that flies return to the remembered, but now incorrect, reward location. This shows that flies can use path integration and rules out other potential cues such as pheromones. (From Titova et al. 2022.)
The ability of some insects to return to locations with food is remarkable, especially considering the small size of their brains. While better known for its importance as a genetic model organism than as a great navigator, the fly Drosophila melanogaster was recently discovered to return to a food source when walking in a featureless, dark arena, suggesting spatial memory abilities related to those of the famous insect navigators – the bees. Thus, we now have the opportunity to study spatial cognition, memory and goal-directed movement in a species which has been at the forefront of genetic research for a century. Evidence from evolutionarily diverse insects – from locusts to beetles, bees and flies – suggests that a key brain region, the central complex, whose architecture shares many features across these animals, is likely involved in coordinating such navigation. We are characterizing the involvement of genetically accessible central complex neurons in path integration in freely walking Drosophila. We additionally make use of virtual reality for freely walking flies to test the roles of visual memory and path integration in navigation. By obtaining detailed knowledge about the behavioral capabilities of food search and navigation in Drosophila – an extremely well-studied genetic model organism – we aim to make it possible to discuss these important behaviors in the context of the relevant neural circuits. Given recent molecular data showing that the developmental expression patterns of key transcription factors appear conserved between the patterning of the insect central complex and the mammalian basal ganglia, this work may even be relevant for understanding the neural control of navigation across bilaterian animals.
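To make the computational idea concrete, here is a minimal sketch of what path integration amounts to: continuously summing self-motion vectors so that a vector pointing back to the starting point is available at any moment. The step values and code are purely illustrative and are not taken from the work cited here.

```python
# Minimal illustration of path integration: summing displacement vectors
# (heading, distance) for each leg of an outbound path to maintain a "home vector".
# The numbers are invented for illustration only.
import numpy as np

steps = [(0.0, 5.0), (np.pi / 2, 3.0), (np.pi, 2.0)]  # (heading in rad, distance) per leg
home_vector = np.zeros(2)
for heading, distance in steps:
    home_vector += distance * np.array([np.cos(heading), np.sin(heading)])

# Walking along -home_vector would bring the animal straight back to its start.
return_bearing = np.arctan2(-home_vector[1], -home_vector[0])
print("outbound displacement:", home_vector, "| return bearing (rad):", return_bearing)
```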
Figure: Genetic advances, such as the Vienna Tiles GAL4 library, allow targeted expression of specific molecules in defined neurons. By combining these tools with virtual reality experiments, we reverse-engineer the mechanisms and purpose of the fly eye.
The Drosophila visual system is ideally suited for investigating how nervous systems orchestrate behavior. From powerful genetic tools that enable precise manipulations of individual cell types to decades of deep research across many labs, few sensory systems are better understood or more amenable to precise manipulation. Nevertheless, our knowledge of fly vision is far from complete, even though it could help us understand human vision or build better robots. One area of particular interest for the Straw Lab is how visual circuits give rise to natural behavior. While many laboratories record the activity of visual neurons in restrained animals, the results from these powerful but reductionist experiments do not readily extrapolate to the flight or walking behavior of freely moving animals. We have shown, for example, that head movements are an essential component of free flight in flies but are dispensable for good performance in a tethered flight assay (Stowers et al. 2017).
Figure: By modifying the visual feedback supplied to freely flying flies, we can make them follow arbitrary trajectories. This permits us to record extremely long trajectories in a confined experimental space, and quantify several aspects of sensory-motor performance.
Together with colleagues, Andrew Straw developed some of the first camera-based insect tracking systems capable of using three or more cameras to track the position of an animal in 3D with low latency (Straw et al. 2010). Performing live tracking enables several types of experiments which are not otherwise possible. First, perspective-correct virtual reality becomes possible, which we demonstrated in flies, fish and mice (Stowers et al. 2017). Second, realtime optogenetic stimulation based on computer-controlled behavioral feedback is enabled (Bath et al. 2014). Third, the burden of collecting and storing hours of behavioral data is reduced, because already processed tracking data is stored instead of raw video (Straw et al. 2022, Segre et al. 2016). This tracking technology remains in active development in the Straw lab as an open source system called Braid. Furthermore, it forms a central element of the Straw lab's research apparatus and a basis for collaboration with others (e.g. Dakin et al. 2018), and this or related software is also used by other labs.
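As a rough illustration of the core geometric step in multi-camera tracking, the sketch below triangulates a single 3D position from two or more calibrated camera views by linear least squares (the textbook direct linear transform). This is a generic method, not Braid's actual implementation, and the synthetic cameras and point are assumptions made only for this example.

```python
# Generic multi-view triangulation sketch (not Braid's code): estimate a 3D point
# from two or more calibrated cameras by linear least squares.
import numpy as np

def triangulate(projection_matrices, pixel_coords):
    """projection_matrices: list of 3x4 camera matrices; pixel_coords: list of (u, v)."""
    rows = []
    for P, (u, v) in zip(projection_matrices, pixel_coords):
        # Each view gives two linear constraints on the homogeneous point X:
        # u*(P[2]@X) = P[0]@X and v*(P[2]@X) = P[1]@X.
        rows.append(u * P[2] - P[0])
        rows.append(v * P[2] - P[1])
    A = np.asarray(rows)
    _, _, vt = np.linalg.svd(A)   # solution = right singular vector of the smallest singular value
    X = vt[-1]
    return X[:3] / X[3]           # de-homogenize

# Tiny self-check with two synthetic cameras (identity intrinsics, invented poses).
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.array([[0., 0., 1., -2.], [0., 1., 0., 0.], [-1., 0., 0., 4.]])
point = np.array([0.5, -0.2, 3.0, 1.0])          # known 3D point, homogeneous
pixels = [((P @ point)[0] / (P @ point)[2], (P @ point)[1] / (P @ point)[2]) for P in (P1, P2)]
print(triangulate([P1, P2], pixels))             # recovers approximately [0.5, -0.2, 3.0]
```

In a live system, a solve of this kind runs per frame and per target, after per-camera 2D detection and association of detections across cameras.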
Moving a camera to follow an animal overcomes barriers to high-quality imaging that are imposed by physics. A tradeoff between high spatial resolution and the size of the recording space fundamentally limits recordings with stationary cameras. Furthermore, motion blur caused by relative motion of the animal and the sensor can be a problem in photon-limited scenarios, and such scenarios are common. Therefore, since starting the lab, we have worked on combining computer vision with motor control to actively track animals with moving cameras.
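The toy simulation below illustrates the basic closed loop behind such active tracking: the target's pixel offset from the image center is converted into a proportional pan-velocity command for the camera motor. The gain, focal length, update rate and target motion are invented for illustration and do not describe the lab's actual hardware or software.

```python
# Illustrative sketch only: a proportional pan controller that keeps a drifting
# target centered on the sensor. All constants are assumptions for this toy example.
import numpy as np

GAIN = 10.0          # pan velocity (rad/s) commanded per radian of bearing error (tuning choice)
DT = 0.01            # control period in seconds (assumed 100 Hz loop)
FOCAL_PX = 1000.0    # assumed camera focal length in pixels

def pixel_error_to_angle(err_px):
    """Convert horizontal pixel error from the image center into a pan-angle error (radians)."""
    return np.arctan2(err_px, FOCAL_PX)

target_bearing = 0.0   # simulated animal bearing relative to the camera mount, radians
camera_pan = 0.0       # current camera pan angle, radians
for _ in range(500):
    target_bearing += 0.2 * DT                                # animal drifts at 0.2 rad/s
    err_px = FOCAL_PX * np.tan(target_bearing - camera_pan)   # where the animal appears on the sensor
    pan_velocity = GAIN * pixel_error_to_angle(err_px)        # proportional velocity command
    camera_pan += pan_velocity * DT                           # the motor integrates the command
print(f"steady-state lag: {np.degrees(target_bearing - camera_pan):.2f} degrees")
```

A purely proportional controller like this tracks a steadily moving target with a small constant lag; real systems typically add integral or feedforward terms to remove it.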