Quick hits

A couple of quick hits here: a nice experimental paper that investigates polarisation cues and cue integration, followed by a really nice technical paper that shows how you can build your very own insect eye (sort of). Enjoy!

Abstract: Solitary foraging ants have a navigational toolkit, which includes the use of both terrestrial and celestial visual cues, allowing individuals to successfully pilot between food sources and their nest. One such celestial cue is the polarization pattern in the overhead sky. Here, we explore the use of polarized light during outbound and inbound journeys and with different home vectors in the nocturnal bull ant, Myrmecia midas. We tested foragers on both portions of the foraging trip by rotating the overhead polarization pattern by ±45°. Both outbound and inbound foragers responded to the polarized light change, but the extent to which they responded to the rotation varied. Outbound ants, both close to and further from the nest, compensated for the change in the overhead e-vector by about half of the manipulation, suggesting that outbound ants choose a compromise heading between the celestial and terrestrial compass cues. However, ants returning home compensated for the change in the e-vector by about half of the manipulation when the remaining home vector was short (1–2 m) and by more than half of the manipulation when the remaining vector was long (more than 4 m). We report these findings and discuss why the weighting given to polarization cues changes in different contexts.

Freas, C. A., Narendra, A., Lemesle, C., & Cheng, K. (2017). Polarized light use in the nocturnal bull ant, Myrmecia midas. Royal Society Open Science, 4(8), 170598.
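
For readers who like to think about cue integration concretely, here is a minimal sketch (my own illustration, not the authors' analysis) of how a compromise heading can be computed as a weighted circular mean of a celestial (polarization) bearing and a terrestrial (panorama) bearing. The function name, weights and example angles are all hypothetical.

```python
import numpy as np

def compromise_heading(celestial_deg, terrestrial_deg, w_celestial):
    """Weighted circular mean of two compass bearings (degrees).

    w_celestial is the relative weight given to the polarization cue;
    the terrestrial cue gets (1 - w_celestial).
    """
    angles = np.radians([celestial_deg, terrestrial_deg])
    weights = np.array([w_celestial, 1.0 - w_celestial])
    # Sum weighted unit vectors, then take the angle of the resultant.
    x = np.sum(weights * np.cos(angles))
    y = np.sum(weights * np.sin(angles))
    return np.degrees(np.arctan2(y, x)) % 360.0

# Example: the overhead e-vector is rotated by +45 degrees while the
# panorama still indicates the original nest-ward bearing (0 degrees).
# Equal weighting predicts a heading shift of roughly half the rotation,
# as reported for outbound foragers.
print(compromise_heading(45.0, 0.0, 0.5))   # ~22.5
print(compromise_heading(45.0, 0.0, 0.8))   # ~36, i.e. heavier reliance
                                            # on the polarization cue
```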

Abstract: Designing hardware for miniaturized robotics which mimics the capabilities of flying insects is of interest, because they share similar constraints (i.e. small size, low weight, and low energy consumption). Research in this area aims to enable robots with similarly efficient flight and cognitive abilities. Visual processing is important to flying insects’ impressive flight capabilities, but currently, embodiment of insect-like visual systems is limited by the hardware systems available. Suitable hardware is either prohibitively expensive, difficult to reproduce, unable to accurately simulate insect vision characteristics, and/or too heavy for small robotic platforms. These limitations hamper the development of platforms for embodiment, which in turn hampers progress in understanding how biological systems fundamentally work. To address this gap, this paper proposes an inexpensive, lightweight robotic system for modelling insect vision. The system is mounted and tested on a robotic platform for mobile applications, and then the camera and insect vision models are evaluated. We analyse the potential of the system for use in embodiment of higher-level visual processes (i.e. motion detection) and also for development of navigation based on vision for robotics in general. Optic flow from sample camera data is calculated and compared to a perfect, simulated bee world, showing an excellent resemblance.

Sabo, C., Chisholm, R., Petterson, A., & Cope, A. (2017). A lightweight, inexpensive robotic system for insect vision. Arthropod Structure & Development.
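
Since the evaluation in this paper rests on computing optic flow from camera data, here is a minimal, self-contained sketch of a standard Lucas–Kanade-style least-squares estimate of a single translational flow vector between two grey-scale frames. This is a generic textbook method, not necessarily the pipeline the authors used; the array names are placeholders.

```python
import numpy as np

def patch_flow(frame1, frame2):
    """Least-squares (Lucas-Kanade style) estimate of one (vx, vy)
    displacement, in pixels, for a pair of grey-scale image patches."""
    I1 = frame1.astype(float)
    I2 = frame2.astype(float)
    # Spatial gradients of the first frame and the temporal difference.
    Iy, Ix = np.gradient(I1)
    It = I2 - I1
    # Solve [sum(IxIx) sum(IxIy); sum(IxIy) sum(IyIy)] v = -[sum(IxIt); sum(IyIt)]
    A = np.array([[np.sum(Ix * Ix), np.sum(Ix * Iy)],
                  [np.sum(Ix * Iy), np.sum(Iy * Iy)]])
    b = -np.array([np.sum(Ix * It), np.sum(Iy * It)])
    vx, vy = np.linalg.lstsq(A, b, rcond=None)[0]
    return vx, vy

# Example with a synthetic pattern shifted by one pixel to the right.
rng = np.random.default_rng(0)
img = rng.random((64, 64))
shifted = np.roll(img, 1, axis=1)
print(patch_flow(img, shifted))  # vx should be close to +1, vy close to 0
```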


View-based homing in humans

Although view-based strategies for visually guided behaviour are closely associated with insects, the idea that 2D views might be part of the visual toolkit for humans is a long-standing one. In object recognition, for instance, there has been much debate about whether objects are represented in the brain as sets of views or as genuine 3D representations. A similar dichotomy can be considered for spatial cognition. Gootjes-Dreesbach et al. address this question by having participants wearing head-mounted displays carry out a homing task in immersive virtual reality, designed so that view-based and 3D reconstruction models make different error predictions. Although it is hard to make conclusive statements, the pattern of errors was a better match to a view-based model than to a 3D reconstruction model.

Gootjes-Dreesbach, L., Pickup, L. C., Fitzgibbon, A. W., & Glennerster, A. (2017). Comparison of view-based and reconstruction-based models of human navigational strategy. Journal of Vision, 17(9), 11.
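
For anyone unfamiliar with what a "view-based" strategy means computationally, the sketch below implements the standard rotational image difference idea for panoramic views: rotate the current view, compare each rotation pixel-wise to a stored snapshot, and take the rotation that minimises the difference. It is a generic illustration of this class of model, not the specific model fitted in the paper; the image sizes are made up.

```python
import numpy as np

def best_heading(snapshot, current, degrees_per_column):
    """Return the rotation (deg) of the current panoramic view that best
    matches a stored snapshot, by minimising root-mean-square pixel
    difference over all column-wise rotations (a rotational image
    difference function, RIDF)."""
    n_cols = current.shape[1]
    diffs = []
    for shift in range(n_cols):
        rotated = np.roll(current, shift, axis=1)
        diffs.append(np.sqrt(np.mean((rotated - snapshot) ** 2)))
    best_shift = int(np.argmin(diffs))
    return best_shift * degrees_per_column, np.array(diffs)

# Example: a fake 10 x 72 panorama (5 degrees per column) seen again
# after the agent has turned by 8 columns; the RIDF minimum recovers it.
rng = np.random.default_rng(1)
snapshot = rng.random((10, 72))
current = np.roll(snapshot, -8, axis=1)   # the view after turning
heading, ridf = best_heading(snapshot, current, degrees_per_column=5)
print(heading)  # 40
```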


How might the central complex deliver navigation?

There has been, quite rightly, lots of excitement about recent work showing how central complex (CX) structures in the fly brain are organised in ways that seem well suited to navigation. However, how navigation behaviour might emerge from such circuits is unclear. Modelling offers an opportunity to provide an existence proof that the known properties and connections of a circuit can lead to a particular behaviour. In this spirit, Fiore et al. build a CX model that can implement navigation in a simple maze task. In their words: “Based on known connectivity and function, we developed a computational model to test how the local connectome of the central complex can mediate sensorimotor integration to guide different forms of behavioral outputs.”

Fiore, V. G., Kottler, B., Gu, X., & Hirth, F. (2017). In silico Interrogation of Insect Central Complex Suggests Computational Roles for the Ellipsoid Body in Spatial Navigation. Frontiers in Behavioral Neuroscience, 11, 142.
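
As a toy illustration of the kind of computation these circuits are thought to support, here is a minimal sketch of a ring of heading cells whose activity bump is pushed around by a turn signal. This is a generic heading-integration sketch, not a re-implementation of the Fiore et al. model; the cell count, bump shape and readout are all assumptions for illustration.

```python
import numpy as np

N = 8                                       # eight 'wedge'-like heading cells
preferred = np.arange(N) * 2 * np.pi / N    # preferred directions (rad)

def encode(heading):
    """Rectified-cosine activity bump centred on the current heading."""
    return np.clip(np.cos(preferred - heading), 0.0, None)

def decode(activity):
    """Population-vector readout of the bump's position."""
    return np.arctan2(np.sum(activity * np.sin(preferred)),
                      np.sum(activity * np.cos(preferred)))

# Integrate a stream of angular-velocity signals (rad per step), the way
# a heading bump is shifted as the animal turns.
heading = 0.0
bump = encode(heading)
for turn in [0.2] * 10 + [-0.5] * 4:
    heading = decode(bump) + turn           # shift the bump by the turn signal
    bump = encode(heading)

print(np.degrees(decode(bump)))  # 10*0.2 - 4*0.5 = 0 rad, so ~0 degrees
```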


Navigating in the dark

Over the last few years, Ajay Narendra has been involved in some excellent work looking at the behaviour and visual systems of different ant species, with different foraging specialisms. In this paper, the authors provide a review of the adaptations specifically for navigating in low light levels.

Abstract: “Visual navigation is a benchmark information processing task that can be used to identify the consequence of being active in dim-light environments. Visual navigational information that animals use during the day includes celestial cues such as the sun or the pattern of polarized skylight and terrestrial cues such as the entire panorama, canopy pattern, or significant salient features in the landscape. At night, some of these navigational cues are either unavailable or are significantly dimmer or less conspicuous than during the day. Even under these circumstances, animals navigate between locations of importance. Ants are a tractable system for studying navigation during day and night because the fine scale movement of individual animals can be recorded in high spatial and temporal detail. Ant species range from being strictly diurnal, crepuscular, and nocturnal. In addition, a number of species have the ability to change from a day- to a night-active lifestyle owing to environmental demands. Ants also offer an opportunity to identify the evolution of sensory structures for discrete temporal niches not only between species but also within a single species. Their unique caste system with an exclusive pedestrian mode of locomotion in workers and an exclusive life on the wing in males allows us to disentangle sensory adaptations that cater for different lifestyles. In this article, we review the visual navigational abilities of nocturnal ants and identify the optical and physiological adaptations they have evolved for being efficient visual navigators in dim-light.”

Narendra, A., Kamhi, J. F., & Ogawa, Y. (2017). Moving in Dim Light: Behavioral and Visual Adaptations in Nocturnal Ants. Integrative and Comparative Biology.


Why is visual navigation easy for ants?

A desert ant forager is a navigation machine, optimised to bring food back to the nest as quickly, efficiently and often as possible. Evolution has surely led to specialist hardware for navigation, but low-resolution visual systems and small brains don’t immediately suggest it. In this paper we look at different levels of organisation within the ant visual navigation system, from sensor to neural architecture to behaviour. We are looking for the organisational principles that show how ants are indeed ‘set up’ for visual navigation, as well as the efficiencies which explain why ants don’t need high-resolution vision or large brains. The article is a short primer, with a specific focus on the aspects of the system that are of interest to bio-mimetic robot engineers.

Graham, P., & Philippides, A. (2017). Vision for navigation: What can we learn from ants? Arthropod Structure & Development.


The homing of social wasps

As I have mentioned previously on here, we don’t get many wasp papers, so here is one. It shows that social wasps develop a strong visual familiarity with their surroundings; since those surroundings are visually dense, this suggests a rich visual memory.

Abstract: We captured foragers of the tropical social wasp Ropalidia marginata from their nests and displaced them at different distances and directions. Wasps displaced within their probable foraging grounds returned to their nests on the day of release although they oriented randomly upon release; however, wasps fed before release returned sooner, displaying nest-ward orientation. When displaced to places far from their nests, thus expected to be unfamiliar, only a third returned on the day of release showing nest-ward orientation; others oriented randomly and either returned on subsequent days or never. When contained within mosquito-net tents since eclosion and later released to places close to their nests (but unfamiliar), even fed wasps oriented randomly, and only older wasps returned, taking longer time. Thus, contrary to insects inhabiting less-featured landscapes, R. marginata foragers appear to have thorough familiarity with their foraging grounds that enables them to orient and home efficiently after passive displacement. Their initial orientation is, however, determined by an interaction of the information acquired from surrounding landscape and their physiological motivation. With age, they develop skills to home from unfamiliar places. Homing behaviour in insects appears to be influenced by evolutionarily conserved mechanisms and the landscape in which they have evolved.

Mandal, S., Brahma, A., & Gadagkar, R. (2017). Homing in a tropical social wasp: role of spatial familiarity, motivation and age. Journal of Comparative Physiology A, 1-13.


Fly PI

Path Integration (PI) in insects is usually talked about because of the amazing navigational feats of desert ants or because of the communication of PI information by waggle-dancing honeybees. The foragers of social insect species seem to be hogging the limelight, but PI is a much more widespread navigational strategy: classic behavioural experiments long ago demonstrated PI in invertebrates such as spiders and crabs. Now we can add fruit flies to the list. Using very neat behavioural experiments, Kim and Dickinson observe a foraging behaviour in flies, where individuals will return to a food location periodically. If the food is moved after the first visit, then flies return to the erstwhile food position, not the new one, suggesting that flies are not detecting the food directly in order to relocate it. The potential use of deposited pheromones was controlled for by genetically knocking out the oenocytes, which produce cuticular pheromones. This leaves two options: flies might be using a PI mechanism to periodically return to the food, or they might simply be undertaking a random walk that occasionally leads them back to the food location. By analysing the statistics of turns, Kim and Dickinson show it is likely to be the former, specifically because turn sizes depend on the distance from the food and the distance walked since the last turn.

The demonstration that flies perform some form of path integration will of course excite those who wish to use genetic tools to pick apart the neural machinery. Another exciting implication, and one which the authors stress, is that it reinforces the idea that path integration is part of a fundamental spatial toolkit that is likely to be widespread across animal taxa.

Kim, I. S., & Dickinson, M. H. (2017). Idiothetic path integration in the fruit fly Drosophila melanogaster. Current Biology. http://dx.doi.org/10.1016/j.cub.2017.06.026
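
For readers new to the idea, here is a minimal sketch of idiothetic path integration itself: accumulating a vector back to a remembered location purely from self-motion estimates of heading and distance, with no external landmarks. It is a generic illustration, not the authors' analysis of the fly trajectories; the step values are invented.

```python
import numpy as np

def integrate_path(steps):
    """Accumulate a position estimate from (heading_deg, distance) pairs,
    as a path integrator would, using only self-motion information."""
    position = np.zeros(2)
    for heading_deg, distance in steps:
        theta = np.radians(heading_deg)
        position += distance * np.array([np.cos(theta), np.sin(theta)])
    return position

# An excursion away from a remembered food location...
outbound = [(0, 3.0), (90, 2.0), (180, 1.0)]
position = integrate_path(outbound)
# ...and the vector the integrator supplies to return to it.
food_vector = -position
print(position)      # [2. 2.]
print(food_vector)   # [-2. -2.] : head back 2 units in x and 2 in y

# Distance and direction to the food, available at any moment from the
# accumulated vector alone.
print(np.hypot(*food_vector),
      np.degrees(np.arctan2(food_vector[1], food_vector[0])))
# ~2.83 units at -135 degrees
```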