Did you see where it went?

We don’t feature many techniques papers on this blog, but here is a brilliant one which may open up new experimental opportunities for people. The computer vision technique allows insect positions to be extracted from cluttered backgrounds, so insects can be tracked without specially prepared observation areas or specialist cameras. It’s fantastic.

Risse, B., Mangan, M., Del Pero, L., & Webb, B. (2017). Visual Tracking of Small Animals in Cluttered Natural Environments Using a Freely Moving Camera. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (pp. 2840-2849).

Categories: Papers from 2017

How does path integration work?

Path integration has teased neurobiologists for a long while: the mathematically well-defined process has drawn in those who thought they could propose how natural circuits might implement the required calculations. Elegant models have been proposed, but there has always been a lack of information about the neural substrate of PI. This paper changes that. Stone et al. show that, to augment celestial compass information, bees have neurons that respond to the bee’s velocity as derived from optic flow. With this knowledge of how bees process direction and speed, Stone et al. were able to propose a PI network that incorporates known features of the insect brain.
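The core computation being implemented neurally here is easy to state mathematically: continuously accumulate velocity (compass heading plus optic-flow speed) into a vector pointing home. Below is a minimal illustrative sketch of that accumulation; it is not the anatomically constrained network of Stone et al., and the function name and input format are my own invention.

```python
import math

def home_vector(samples):
    """Accumulate (heading, speed) samples into a home vector.

    heading: radians, e.g. from a celestial compass
    speed:   distance travelled in that step, e.g. from optic flow
    Returns the (x, y) vector pointing from the current position
    back to the start of the journey.
    """
    x = y = 0.0
    for heading, speed in samples:
        # Resolve each step of travel into Cartesian components.
        x += speed * math.cos(heading)
        y += speed * math.sin(heading)
    # Home is the negation of the accumulated outbound vector.
    return -x, -y

# Outbound leg: 3 m east, then 4 m north -> home vector points 5 m south-west-ish.
hx, hy = home_vector([(0.0, 3.0), (math.pi / 2, 4.0)])
```

The interesting question the paper addresses is how a circuit of real neurons, with their known anatomy and response properties, can implement this running sum and then steer by it.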

Personally, I like the style of this paper: modelling is used to nicely bridge known and unknown physiological elements and to propose a function.

Stone, T., Webb, B., Adden, A., Weddig, N. B., Honkanen, A., Templin, R., Wcislo, W., Scimeca, L., Warrant, E., & Heinze, S. (2017). An anatomically constrained model for path integration in the bee brain. Current Biology.

Categories: Papers from 2017

Navigation in the brain

The last few years have been exciting times in insect navigation as we begin to unravel the neural underpinnings of navigation behaviours. Here Stanley Heinze reviews what we know of one particular brain region (the Central Complex) across a variety of insects. He nicely relates the requirements of different insect behaviours to differences in Central Complex organisation, thus highlighting how an evolutionary approach to neuroscience can be as valuable as the study of model systems.

Heinze, S. (2017). Unraveling the neural basis of insect navigation. Current Opinion in Insect Science.

Categories: Papers from 2017

Mushroom Body models for navigation

One of the most exciting possibilities over the next few years of insect navigation research will be the increasing availability of technology allowing reconstructions of natural environments. This is going to improve the understanding of experimental results and allow for more sophisticated and meaningful modelling. This paper from Müller et al. is an example of what we can expect. Following work from Barbara Webb’s group, they use a Mushroom Body implementation of the familiarity model of visual navigation (Baddeley et al., 2012) to investigate visual navigation of honeybees in a reconstruction of terrain used for many previous honeybee experiments. They find that the basic MB model implementation can drive visual navigation, although the specific environment type is important for the model’s success in these experiments.
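For readers unfamiliar with the familiarity model: the agent stores views along a training route and then, at each decision point, scans candidate headings and moves in the direction whose current view looks most familiar. The toy sketch below uses a nearest-neighbour distance as a stand-in for the learned familiarity measure; the actual papers use trained networks (an Infomax network in Baddeley et al., a Mushroom Body circuit in Müller et al.), and all names here are illustrative.

```python
import numpy as np

def familiarity(view, memory):
    """Familiarity of a view, scored as the negative distance to the
    closest view stored during route training.
    memory: array of shape (n_stored_views, view_length)."""
    distances = np.linalg.norm(memory - view, axis=1)
    return -distances.min()

def best_heading(scan_views, memory):
    """Choose the candidate heading whose view is most familiar.
    scan_views: dict mapping heading (degrees) -> view vector."""
    return max(scan_views, key=lambda h: familiarity(scan_views[h], memory))
```

The appeal of this scheme is that it needs no map and no explicit landmark identification: route guidance falls out of a single scalar familiarity signal, which is what makes a Mushroom Body (a known associative-memory centre) such a plausible substrate.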

Müller, J., Nawrot, M., Menzel, R., & Landgraf, T. (2017). A neural network model for familiarity and context learning during honeybee foraging flights. Biological Cybernetics, 1-14.

Categories: Papers from 2017

Quick hits

A couple of quick hits here: a nice experimental paper investigating polarisation cues and cue integration, followed by a really nice technical paper that shows how you can build your very own insect eye (sort of). Enjoy!

Abstract: Solitary foraging ants have a navigational toolkit, which includes the use of both terrestrial and celestial visual cues, allowing individuals to successfully pilot between food sources and their nest. One such celestial cue is the polarization pattern in the overhead sky. Here, we explore the use of polarized light during outbound and inbound journeys and with different home vectors in the nocturnal bull ant, Myrmecia midas. We tested foragers on both portions of the foraging trip by rotating the overhead polarization pattern by ±45°. Both outbound and inbound foragers responded to the polarized light change, but the extent to which they responded to the rotation varied. Outbound ants, both close to and further from the nest, compensated for the change in the overhead e-vector by about half of the manipulation, suggesting that outbound ants choose a compromise heading between the celestial and terrestrial compass cues. However, ants returning home compensated for the change in the e-vector by about half of the manipulation when the remaining home vector was short (1−2 m) and by more than half of the manipulation when the remaining vector was long (more than 4 m). We report these findings and discuss why the weighting of polarization cues changes in different contexts.

Freas, C. A., Narendra, A., Lemesle, C., & Cheng, K. (2017). Polarized light use in the nocturnal bull ant, Myrmecia midas. Royal Society Open Science, 4(8), 170598.

Abstract: Designing hardware for miniaturized robotics which mimics the capabilities of flying insects is of interest, because they share similar constraints (i.e. small size, low weight, and low energy consumption). Research in this area aims to enable robots with similarly efficient flight and cognitive abilities. Visual processing is important to flying insects’ impressive flight capabilities, but currently, embodiment of insect-like visual systems is limited by the hardware systems available. Suitable hardware is either prohibitively expensive, difficult to reproduce, cannot accurately simulate insect vision characteristics, and/or is too heavy for small robotic platforms. These limitations hamper the development of platforms for embodiment which in turn hampers the progress on understanding of how biological systems fundamentally work. To address this gap, this paper proposes an inexpensive, lightweight robotic system for modelling insect vision. The system is mounted and tested on a robotic platform for mobile applications, and then the camera and insect vision models are evaluated. We analyse the potential of the system for use in embodiment of higher-level visual processes (i.e. motion detection) and also for development of navigation based on vision for robotics in general. Optic flow from sample camera data is calculated and compared to a perfect, simulated bee world showing an excellent resemblance.

Sabo, C., Chisholm, R., Petterson, A., & Cope, A. (2017). A lightweight, inexpensive robotic system for insect vision. Arthropod Structure & Development.

Categories: Papers from 2017

View based homing in humans

Although view-based strategies for visually guided behaviour are closely associated with insects, the idea that 2D views might be part of the visual toolkit for humans is a long-standing one. For instance, in object recognition there has been a lengthy debate about whether objects are represented in the brain as sets of views or as genuine 3D representations. A similar dichotomy can be considered for spatial cognition. Gootjes-Dreesbach et al. address this question by having participants wearing head-mounted displays carry out a homing task in immersive virtual reality. They created a task in which view-based and 3D-reconstruction models make different error predictions. Although it is hard to make conclusive statements, they found that the pattern of errors was a better match to a view-based model than to a 3D-reconstruction model.

Gootjes-Dreesbach, L., Pickup, L. C., Fitzgibbon, A. W., & Glennerster, A. (2017). Comparison of view-based and reconstruction-based models of human navigational strategy. Journal of Vision, 17(9), 11.

Categories: Papers from 2017

How might the central complex deliver navigation?

There has been, quite rightly, lots of excitement about recent work showing how central complex structures in the fly brain are organised in ways that seem well suited to navigation. However, how navigation behaviour might emerge from such circuits remains unclear. Modelling offers an opportunity to provide an existence proof that the known properties and connections of a circuit can produce a particular behaviour. In this spirit, Fiore et al. build a CX model that can implement navigation in a simple maze task. In their words: “Based on known connectivity and function, we developed a computational model to test how the local connectome of the central complex can mediate sensorimotor integration to guide different forms of behavioral outputs.”

Fiore, V. G., Kottler, B., Gu, X., & Hirth, F. (2017). In silico Interrogation of Insect Central Complex Suggests Computational Roles for the Ellipsoid Body in Spatial Navigation. Frontiers in Behavioral Neuroscience, 11, 142.

Categories: Papers from 2017