Sensory adaptations for foraging

Abstract: “Individual differences in response thresholds to task-related stimuli may be one mechanism driving task allocation among social insect workers. These differences may arise at various stages in the nervous system. We investigate variability in the peripheral nervous system as a simple mechanism that can introduce inter-individual differences in sensory information. In this study we describe size-dependent variation of the compound eyes and the antennae in the ant Temnothorax rugatulus. Head width in T. rugatulus varies between 0.4 and 0.7 mm (2.6–3.8 mm body length). But despite this limited range of worker sizes we find sensory array variability. We find that the number of ommatidia and of some, but not all, antennal sensilla types vary with head width.

The antennal array of T. rugatulus displays the full complement of sensillum types observed in other species of ants, although at much lower quantities than other, larger, studied species. In addition, we describe what we believe to be a new type of sensillum in hymenoptera that occurs on the antennae and on all body segments. T. rugatulus has apposition compound eyes with 45–76 facets per eye, depending on head width, with average lens diameters of 16.5 μm, rhabdom diameters of 5.7 μm and inter-ommatidial angles of 16.8°. The optical system of T. rugatulus ommatidia is severely under focussed, but the absolute sensitivity of the eyes is unusually high.

We discuss the functional significance of these findings and the extent to which the variability of sensory arrays may correlate with task allocation.”

Fiorella Ramirez-Esquivel, Nicole E. Leitner, Jochen Zeil, Ajay Narendra, The sensory arrays of the ant, Temnothorax rugatulus, Arthropod Structure & Development. doi:10.1016/j.asd.2017.03.005
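
For a sense of what those optical numbers mean, here is a quick back-of-the-envelope using two textbook compound-eye quantities: the sampling limit of spatial resolution and Snyder’s eye parameter. The values come from the abstract above; the calculation is standard optics, not an analysis from the paper itself.

```python
import math

# Values quoted in the abstract for T. rugatulus
lens_diameter_um = 16.5      # average facet lens diameter D
interommatidial_deg = 16.8   # average inter-ommatidial angle (delta-phi)

# Sampling limit of spatial resolution: nu_s = 1 / (2 * delta-phi)
sampling_limit_cpd = 1.0 / (2.0 * interommatidial_deg)  # cycles per degree

# Snyder's eye parameter p = D * delta-phi (um * rad); values far above
# ~0.3 um*rad indicate an eye that trades resolution for light capture
eye_parameter = lens_diameter_um * math.radians(interommatidial_deg)

print(f"sampling limit: {sampling_limit_cpd:.3f} cycles/degree")  # ~0.030
print(f"eye parameter p: {eye_parameter:.1f} um*rad")             # ~4.8
```

The very large eye parameter is consistent with the abstract’s conclusion that these tiny eyes are built for sensitivity rather than resolution.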

Categories: Papers from 2017

What visual computations for robot place recognition?

For many years (over 20, I guess) Ralf Möller and collaborators have been working on visual navigation for robots. This work has incorporated bio-inspired navigation algorithms and computational models of ant-like visual systems for navigation using terrestrial visual cues. One notable thing across many of these papers is that the introduction sections are great resources for roboticists and biologists, acting as they do as mini-reviews of the major classes of robotic methods for specific problems. This paper is no exception; the question here is: how good are different visual methods at place recognition, a problem for animals and robots alike?

Horst and Möller (2017) Visual Place Recognition for Autonomous Mobile Robots. Robotics 2017, 6(2), 9; doi:10.3390/robotics6020009

Categories: Papers from 2017

Neurocomputational models of navigation

Some aspects of insect navigation, such as the use of vector memories, are well established, but we currently don’t have good models of how these behaviours might be implemented in the insect brain. Goldschmidt et al. address this with a model of Path Integration, which they extend to include the learning of vector memories.

Abstract: “The model consists of a path integration mechanism, reward-modulated global learning, random search, and action selection. The path integration mechanism integrates compass and odometric cues to compute a vectorial representation of the agent’s current location as neural activity patterns in circular arrays. A reward-modulated learning rule enables the acquisition of vector memories by associating the local food reward with the path integration state. A motor output is computed based on the combination of vector memories and random exploration.”

Goldschmidt, D., Manoonpong, P., & Dasgupta, S. (2017). A neurocomputational model of goal-directed navigation in insect-inspired artificial agents. Frontiers in Neurorobotics, 11, 20.
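
The quoted mechanism, compass and odometry accumulated as activity in circular arrays with a vector memory stored at reward, can be caricatured in a few lines. A minimal sketch under my own assumptions (array size, cosine direction tuning, population-vector readout), not the authors’ implementation:

```python
import numpy as np

N = 18  # number of preferred directions in the circular array (an assumption)
preferred = np.linspace(0.0, 2.0 * np.pi, N, endpoint=False)

def step(pi_state, heading_rad, speed):
    """Accumulate the distance moved into direction-tuned accumulators
    (a cosine-weighted projection onto each preferred direction)."""
    return pi_state + speed * np.cos(heading_rad - preferred)

def decode(pi_state):
    """Population-vector readout: direction and (scaled) length of the
    current location vector relative to the nest."""
    x = np.dot(pi_state, np.cos(preferred))
    y = np.dot(pi_state, np.sin(preferred))
    return np.arctan2(y, x), np.hypot(x, y)

# Outbound trip: 10 unit steps east, then 5 unit steps north
pi_state = np.zeros(N)
for _ in range(10):
    pi_state = step(pi_state, 0.0, 1.0)
for _ in range(5):
    pi_state = step(pi_state, np.pi / 2.0, 1.0)

food_dir, _ = decode(pi_state)   # ~0.46 rad, i.e. atan2(5, 10), as expected
vector_memory = pi_state.copy()  # 'reward-modulated' snapshot at the food site
```
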
Categories: Papers from 2017

Computation for navigation in the insect brain

In recent years the central complex of insects has emerged as a particularly interesting brain area for those of us interested in spatial behaviours. Exciting neuroscience from flies has offered a tantalising glimpse of a future where we can describe the circuits that underpin navigation. To bring about that future we need a creative combination of behavioural, modelling, and neuroscience efforts. In this spirit, Cope et al. take recent neuroscience findings and produce a computational model which captures the way that visual cues can be used to maintain a sense of relative orientation.

Abstract: “The insect central complex (CX) is an enigmatic structure whose computational function has evaded inquiry, but has been implicated in a wide range of behaviours. Recent experimental evidence from the fruit fly (Drosophila melanogaster) and the cockroach (Blaberus discoidalis) has demonstrated the existence of neural activity corresponding to the animal’s orientation within a virtual arena (a neural ‘compass’), and this provides an insight into one component of the CX structure. There are two key features of the compass activity: an offset between the angle represented by the compass and the true angular position of visual features in the arena, and the remapping of the 270° visual arena onto an entire circle of neurons in the compass. Here we present a computational model which can reproduce this experimental evidence in detail, and predicts the computational mechanisms that underlie the data. We predict that both the offset and remapping of the fly’s orientation onto the neural compass can be explained by plasticity in the synaptic weights between segments of the visual field and the neurons representing orientation. Furthermore, we predict that this learning is reliant on the existence of neural pathways that detect rotational motion across the whole visual field and uses this rotation signal to drive the rotation of activity in a neural ring attractor. Our model also reproduces the ‘transitioning’ between visual landmarks seen when rotationally symmetric landmarks are presented. This model can provide the basis for further investigation into the role of the central complex, which promises to be a key structure for understanding insect behaviour, as well as suggesting approaches towards creating fully autonomous robotic agents.”

Cope AJ, Sabo C, Vasilaki E, Barron AB, Marshall JAR (2017) A computational model of the integration of landmarks and motion in the insect central complex. PLoS ONE 12(2): e0172325. doi:10.1371/journal.pone.0172325
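
To make the model’s two predicted ingredients concrete, a whole-field rotation signal that moves a bump of activity around a ring, and Hebbian plasticity between visual-field segments and ring neurons, here is a minimal sketch. The ring size, bump shape, segment count, and learning rate are illustrative assumptions, not parameters from the paper:

```python
import numpy as np

N = 32              # ring (compass) neurons, an assumed number
idx = np.arange(N)

def bump(centre, width=2.0):
    """Gaussian bump of activity on the ring, centred on `centre`."""
    d = np.minimum((idx - centre) % N, (centre - idx) % N)  # circular distance
    return np.exp(-0.5 * (d / width) ** 2)

# Whole-field rotational motion, not landmark identity, moves the bump:
# the heading estimate is updated by integrating the rotation signal.
heading = bump(0)
for rotation_step in (1, 1, 1, -2):   # signed rotation per time step
    heading = np.roll(heading, rotation_step)

# Plasticity between visual-field segments and ring neurons: strengthen
# weights from the currently active segment onto the current bump, so an
# arbitrary offset between landmark position and compass angle is learnt.
n_segments = 12
weights = np.zeros((N, n_segments))
visual = np.zeros(n_segments)
visual[3] = 1.0                       # a landmark currently in segment 3
learning_rate = 0.1
weights += learning_rate * np.outer(heading, visual)  # Hebbian update
```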

Categories: Papers from 2017

Navigation in cockroaches

On this site we do have a bias towards social insects, but navigation is, of course, valuable for all insects. Some insects, like cockroaches, have a long history of neuroscientific investigation, and in recent times really exciting neuroscience has been revealing the cellular basis of orientation in insects. In this review, Varga et al. summarise what we know about insect navigational circuits and compare it to the well-studied cellular basis of navigation in rodents.

Abstract: “Cockroaches are scavengers that forage through dark, maze-like environments. Like other foraging animals, for instance rats, they must continually assess their situation to keep track of targets and negotiate barriers. While navigating a complex environment, all animals need to integrate sensory information in order to produce appropriate motor commands. The integrated sensory cues can be used to provide the animal with an environmental and contextual reference frame for the behavior. To successfully reach a goal location, navigational cues continuously derived from sensory inputs have to be utilized in the spatial guidance of motor commands. The sensory processes, contextual and spatial mechanisms, and motor outputs contributing to navigation have been heavily studied in rats. In contrast, many insect studies focused on the sensory and/or motor components of navigation, and our knowledge of the abstract representation of environmental context and spatial information in the insect brain is relatively limited. Recent reports from several laboratories have explored the role of the central complex (CX), a sensorimotor region of the insect brain, in navigational processes by recording the activity of CX neurons in freely-moving insects and in more constrained, experimenter-controlled situations. The results of these studies indicate that the CX participates in processing the temporal and spatial components of sensory cues, and utilizes these cues in creating an internal representation of orientation and context, while also directing motor control. Although these studies led to a better understanding of the CX’s role in insect navigation, there are still major voids in the literature regarding the underlying mechanisms and brain regions involved in spatial navigation. The main goal of this review is to place the above listed findings in the wider context of animal navigation by providing an overview of the neural mechanisms of navigation in rats and summarizing and comparing our current knowledge on the CX’s role in insect navigation to these processes. By doing so, we aimed to highlight some of the missing puzzle pieces in insect navigation and provide a different perspective for future directions.”

Varga, A. G., Kathman, N. D., Martin, J. P., Guo, P., & Ritzmann, R. E. (2017). Spatial Navigation and the Central Complex: Sensory Acquisition, Orientation, and Motor Control. Frontiers in Behavioral Neuroscience, 11.

Categories: Papers from 2017

How to interpret interactions between PI and vision

The classic components of the insect navigation toolkit are Path Integration (PI) and guidance by learnt visual cues. We know that the interaction between these cues is not a simple hierarchy: both can be active at the same time, and there are interesting interactions when ants are moved to a visually novel location while carrying a PI home vector. Ants will follow the direction of their PI home vector for a while before starting to search, and the distance for which they follow it is greater for species that live in environments with reduced visual information (like a desert, for instance). In this paper, Freas et al. investigate the navigational toolkit of a nocturnal ant. These ants are shown to use both PI and visual information, and in tests where they are displaced to a visually novel environment with a full PI vector, they follow it for only a short distance before searching. This demonstrates the value of visual information, even in a nocturnal species.

Freas, C. A., Narendra, A., & Cheng, K. (2017). Compass cues used by a nocturnal bull ant, Myrmecia midas. Journal of Experimental Biology, jeb-152967.
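
One common way to think about such results is as a weighted combination of the PI direction and the visually preferred direction, where the weight on PI decays faster when the scene is visually informative. The sketch below is purely a toy illustration of that idea (the exponential decay and its gain are my assumptions, not a model from the paper):

```python
import numpy as np

def combined_heading(pi_dir, visual_dir, distance_run, familiarity_gain):
    """Toy weighted average of PI and visual headings (radians).

    The weight on PI decays with the distance already run; a visually
    rich habitat (high familiarity_gain) makes PI give way sooner, a
    visually sparse one (low gain) lets PI dominate for longer.
    """
    w_pi = np.exp(-familiarity_gain * distance_run)
    x = w_pi * np.cos(pi_dir) + (1.0 - w_pi) * np.cos(visual_dir)
    y = w_pi * np.sin(pi_dir) + (1.0 - w_pi) * np.sin(visual_dir)
    return np.arctan2(y, x)

# Early in the run PI dominates; a few metres later vision has taken over
print(combined_heading(0.0, np.pi / 4, distance_run=0.5, familiarity_gain=1.0))
print(combined_heading(0.0, np.pi / 4, distance_run=5.0, familiarity_gain=1.0))
```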

Categories: Papers from 2017

Visual navigation in dim light

The ability of insects to use visual information for navigation, even at very low light levels, shows the fundamental robustness of visual cues for navigation. In a special issue of Phil Trans themed around vision in dim light, two papers show how insects use vision for navigation, even in the most testing circumstances.

Ajay Narendra and Fiorella Ramirez-Esquivel (2017) Subtle changes in the landmark panorama disrupt visual navigation in a nocturnal bull ant. Phil. Trans. R. Soc. B April 5, 2017 372 20160068; doi:10.1098/rstb.2016.0068

Abstract: “The ability of ants to navigate when the visual landmark information is altered has often been tested by creating large and artificial discrepancies in their visual environment. Here, we had an opportunity to slightly modify the natural visual environment around the nest of the nocturnal bull ant Myrmecia pyriformis. We achieved this by felling three dead trees, two located along the typical route followed by the foragers of that particular nest and one in a direction perpendicular to their foraging direction. An image difference analysis showed that the change in the overall panorama following the removal of these trees was relatively little. We filmed the behaviour of ants close to the nest and tracked their entire paths, both before and after the trees were removed. We found that immediately after the trees were removed, ants walked slower and were less directed. Their foraging success decreased and they looked around more, including turning back to look towards the nest. We document how their behaviour changed over subsequent nights and discuss how the ants may detect and respond to a modified visual environment in the evening twilight period.”
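
The “image difference analysis” mentioned here works by comparing panoramic images pixel by pixel, in the spirit of the rotational image difference function widely used in this field. A minimal sketch, assuming grayscale panoramas stored as 2-D arrays with azimuth along axis 1:

```python
import numpy as np

def rms_difference(img_a, img_b):
    """Root-mean-square pixel difference between two panoramic images."""
    diff = img_a.astype(float) - img_b.astype(float)
    return np.sqrt(np.mean(diff ** 2))

def rotational_idf(current, reference):
    """RMS difference at every horizontal rotation of `current`; the
    depth of the minimum indicates how strongly the scene specifies a
    viewing direction."""
    return [rms_difference(np.roll(current, shift, axis=1), reference)
            for shift in range(current.shape[1])]
```

Comparing panoramas recorded before and after the trees were felled with something like rms_difference is how one quantifies that the overall change was “relatively little”.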

James J. Foster, Basil el Jundi, Jochen Smolka, Lana Khaldy, Dan-Eric Nilsson, Marcus J. Byrne, and Marie Dacke (2017) Stellar performance: mechanisms underlying Milky Way orientation in dung beetles. Phil. Trans. R. Soc. B April 5, 2017 372 20160079; doi:10.1098/rstb.2016.0079

Abstract: “Nocturnal dung beetles (Scarabaeus satyrus) are currently the only animals that have been demonstrated to use the Milky Way for reliable orientation. In this study, we tested the capacity of S. satyrus to orient under a range of artificial celestial cues, and compared the properties of these cues with images of the Milky Way simulated for a beetle’s visual system. We find that the mechanism that permits accurate stellar orientation under the Milky Way is based on an intensity comparison between different regions of the Milky Way. We determined the beetles’ contrast sensitivity for this task in behavioural experiments in the laboratory, and found that the resulting threshold of 13% is sufficient to detect the contrast between the southern and northern arms of the Milky Way under natural conditions. This mechanism should be effective under extremely dim conditions and on nights when the Milky Way forms a near symmetrical band that crosses the zenith. These findings are discussed in the context of studies of stellar orientation in migratory birds and itinerant seals.”
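
The 13% figure is a contrast threshold; assuming it refers to something like Michelson contrast between the brighter southern and dimmer northern arm (my assumption for illustration), the comparison the beetles must perform is a one-liner:

```python
def michelson_contrast(i_bright, i_dim):
    """Michelson contrast between two intensity measurements."""
    return (i_bright - i_dim) / (i_bright + i_dim)

# Hypothetical intensities in arbitrary units, for illustration only
if michelson_contrast(1.30, 1.00) >= 0.13:   # ~0.130, right at threshold
    print("contrast at or above the reported 13% behavioural threshold")
```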

Categories: Papers from 2017