How might the central complex deliver navigation?

There has been, quite rightly, a lot of excitement about recent work showing how central complex structures in the fly brain are organised in ways that seem well suited to navigation. However, how navigation behaviour might emerge from such circuits is unclear. Modelling offers an opportunity to provide an existence proof that the known properties and connections of a circuit can lead to a particular behaviour. In this spirit, Fiore et al. build a CX model that can implement navigation in a simple maze task. In their words: “Based on known connectivity and function, we developed a computational model to test how the local connectome of the central complex can mediate sensorimotor integration to guide different forms of behavioral outputs.”
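This is not the authors' model, but to give a flavour of the kind of computation the ellipsoid body is thought to support, here is a minimal Python sketch of a heading ‘bump’ over a ring of cells being rotated by an angular-velocity signal; the cell count, time step and function names are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def rotate_bump(activity, angular_velocity, dt=0.01):
    """Shift a bump of activity around a ring of cells in proportion to the
    angular-velocity input, a toy version of how the ellipsoid body is
    thought to track heading (illustrative only)."""
    n = len(activity)
    shift = angular_velocity * dt * n / (2 * np.pi)  # cells to rotate by
    idx = (np.arange(n) - shift) % n                 # fractional source index
    lo = np.floor(idx).astype(int)
    frac = idx - lo
    rotated = (1 - frac) * activity[lo] + frac * activity[(lo + 1) % n]
    return rotated / rotated.sum()                   # keep total activity fixed

# A bump over 16 'wedge' cells, rotated by one second of constant turning
bump = np.exp(-0.5 * ((np.arange(16) - 8) / 1.5) ** 2)
bump /= bump.sum()
for _ in range(100):
    bump = rotate_bump(bump, angular_velocity=1.0)   # 1 rad/s for 1 s
print(np.argmax(bump))  # the peak has moved about 2.5 cells (~1 radian) around the ring
```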

Fiore, V. G., Kottler, B., Gu, X., & Hirth, F. (2017). In silico Interrogation of Insect Central Complex Suggests Computational Roles for the Ellipsoid Body in Spatial Navigation. Frontiers in Behavioral Neuroscience, 11, 142.

Categories: Papers from 2017

Navigating in the dark

Over the last few years, Ajay Narendra has been involved in some excellent work looking at the behaviour and visual systems of different ant species, with different foraging specialisms. In this paper, the authors provide a review of the adaptations specifically for navigating in low light levels.

Abstract: “Visual navigation is a benchmark information processing task that can be used to identify the consequence of being active in dim-light environments. Visual navigational information that animals use during the day includes celestial cues such as the sun or the pattern of polarized skylight and terrestrial cues such as the entire panorama, canopy pattern, or significant salient features in the landscape. At night, some of these navigational cues are either unavailable or are significantly dimmer or less conspicuous than during the day. Even under these circumstances, animals navigate between locations of importance. Ants are a tractable system for studying navigation during day and night because the fine scale movement of individual animals can be recorded in high spatial and temporal detail. Ant species range from being strictly diurnal, crepuscular, and nocturnal. In addition, a number of species have the ability to change from a day- to a night-active lifestyle owing to environmental demands. Ants also offer an opportunity to identify the evolution of sensory structures for discrete temporal niches not only between species but also within a single species. Their unique caste system with an exclusive pedestrian mode of locomotion in workers and an exclusive life on the wing in males allows us to disentangle sensory adaptations that cater for different lifestyles. In this article, we review the visual navigational abilities of nocturnal ants and identify the optical and physiological adaptations they have evolved for being efficient visual navigators in dim-light.”

Narendra, A., Kamhi, J. F., & Ogawa, Y. (2017). Moving in Dim Light: Behavioral and Visual Adaptations in Nocturnal Ants. Integrative and Comparative Biology.

Categories: Papers from 2017

Why is visual navigation easy for ants?

A desert ant forager is a navigation machine, optimised to bring food back to the nest as quickly, efficiently and often as possible. Evolution has surely led to specialist hardware for navigation, yet ants’ low-resolution visual systems and small brains do not obviously look the part. In this paper we look at different levels of organisation within the ant visual navigation system, from sensor to neural architecture to behaviour. We are looking for the organisational principles that show how ants are indeed ‘set up’ for visual navigation, as well as the efficiencies which explain why ants don’t need high-resolution vision or large brains. The article is a short primer, with a specific focus on the aspects of the system that are most useful to bio-mimetic robot engineers.
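As a rough illustration of the kind of economical mechanism the primer discusses (this is not code from the paper; the image format, resolution and function names are assumptions), here is a minimal sketch of familiarity-based heading choice using low-resolution panoramic views:

```python
import numpy as np

def familiarity(view, stored_views):
    """Lowest mean squared pixel difference between the current low-resolution
    panoramic view and a set of remembered views; lower means more familiar."""
    view = view.astype(float)
    return min(np.mean((view - v.astype(float)) ** 2) for v in stored_views)

def most_familiar_heading(panorama, stored_views, n_headings=36):
    """Simulate rotating on the spot by column-shifting the panorama and
    return the heading (in degrees) whose view looks most familiar."""
    width = panorama.shape[1]
    scores = [familiarity(np.roll(panorama, int(i * width / n_headings), axis=1),
                          stored_views)
              for i in range(n_headings)]
    return 360.0 * int(np.argmin(scores)) / n_headings
```

The point of the sketch is that the whole strategy needs nothing more than coarse images and a simple pixel-wise comparison, which is part of why low-resolution eyes and small brains can suffice.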

Graham, P., & Philippides, A. (2017). Vision for navigation: What can we learn from ants? Arthropod Structure & Development.

Categories: Papers from 2017

The homing of social wasps

As I have mentioned here before, we don’t get many wasp papers, so here is a welcome exception. It shows that social wasps develop a strong visual familiarity with their surroundings; because those surroundings are visually dense, this suggests a rich visual memory.

Abstract: “We captured foragers of the tropical social wasp Ropalidia marginata from their nests and displaced them at different distances and directions. Wasps displaced within their probable foraging grounds returned to their nests on the day of release although they oriented randomly upon release; however, wasps fed before release returned sooner, displaying nest-ward orientation. When displaced to places far from their nests, thus expected to be unfamiliar, only a third returned on the day of release showing nest-ward orientation; others oriented randomly and either returned on subsequent days or never. When contained within mosquito-net tents since eclosion and later released to places close to their nests (but unfamiliar), even fed wasps oriented randomly, and only older wasps returned, taking longer time. Thus, contrary to insects inhabiting less-featured landscapes, R. marginata foragers appear to have thorough familiarity with their foraging grounds that enables them to orient and home efficiently after passive displacement. Their initial orientation is, however, determined by an interaction of the information acquired from surrounding landscape and their physiological motivation. With age, they develop skills to home from unfamiliar places. Homing behaviour in insects appears to be influenced by evolutionarily conserved mechanisms and the landscape in which they have evolved.”

Mandal, S., Brahma, A., & Gadagkar, R. (2017). Homing in a tropical social wasp: role of spatial familiarity, motivation and age. Journal of Comparative Physiology A, 1-13.

Categories: Papers from 2017

Fly PI

Path Integration (PI) in insects is usually discussed in the context of the amazing navigational feats of desert ants, or the communication of PI information by waggle-dancing honeybees. The foragers of the social insects seem to hog the limelight, but PI is a much more widespread navigational strategy: classic behavioural experiments long ago demonstrated PI in invertebrates such as spiders and crabs. Now we can add fruit flies to the list. Using very neat behavioural experiments, Kim and Dickinson observed a foraging behaviour in flies in which individuals return to a food location periodically. If the food is moved after the first visit, flies return to the erstwhile food position, not the new one, which suggests that they are not relocating the food by detecting it directly. The potential use of deposited pheromones was controlled for by genetically knocking out the oenocytes, which produce cuticular pheromones. This leaves two options: flies might be using a PI mechanism to return periodically to the food, or they might simply be undertaking a random walk that occasionally leads them back to the food location. By analysing the statistics of turns, Kim and Dickinson show that the former is likely, specifically because turn sizes depend on the distance from the food and the distance walked since the last turn.

The demonstration that flies perform some form of path integration will of course excite those who wish to use genetic tools to pick apart the neural machinery. Another exciting implication, one which the authors stress, is that it reinforces the idea that path integration is part of a fundamental spatial toolkit that is likely to be widespread across animal taxa.
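For readers less familiar with PI, here is a minimal sketch of the underlying bookkeeping (illustrative only, and not the authors' analysis): a home vector accumulated from self-motion estimates of heading and distance.

```python
import numpy as np

def path_integrate(steps):
    """Accumulate displacement from (heading, distance) self-motion estimates,
    with headings in radians, and return the distance and bearing of the
    vector pointing back to the start."""
    position = np.zeros(2)
    for heading, distance in steps:
        position += distance * np.array([np.cos(heading), np.sin(heading)])
    home = -position
    return np.linalg.norm(home), np.arctan2(home[1], home[0])

# Example: three outbound legs, then the computed homeward distance and bearing
print(path_integrate([(0.0, 10.0), (np.pi / 2, 5.0), (np.pi, 4.0)]))
```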
Kim & Dickinson (2017). Idiothetic Path Integration in the Fruit Fly Drosophila melanogaster. Current Biology. http://dx.doi.org/10.1016/j.cub.2017.06.026

Categories: Papers from 2017

Optic-flow: from insects to robots

From many behavioural and neurophysiological experiments, we know a lot about insects' use of optic flow for the control of flight and the avoidance of obstacles. Here is a review explaining how the control principles derived from that work can serve as inspiration for bio-robotics.

Abstract: “Flying insects are able to fly smartly in an unpredictable environment. It has been found that flying insects have smart neurons inside their tiny brains that are sensitive to visual motion also called optic flow. Consequently, flying insects rely mainly on visual motion during their flight maneuvers such as: takeoff or landing, terrain following, tunnel crossing, lateral and frontal obstacle avoidance, and adjusting flight speed in a cluttered environment. Optic flow can be defined as the vector field of the apparent motion of objects, surfaces, and edges in a visual scene generated by the relative motion between an observer (an eye or a camera) and the scene. Translational optic flow is particularly interesting for short-range navigation because it depends on the ratio between (i) the relative linear speed of the visual scene with respect to the observer and (ii) the distance of the observer from obstacles in the surrounding environment without any direct measurement of either speed or distance. In flying insects, roll stabilization reflex and yaw saccades attenuate any rotation at the eye level in roll and yaw respectively (i.e. to cancel any rotational optic flow) in order to ensure pure translational optic flow between two successive saccades. Our survey focuses on feedback-loops which use the translational optic flow that insects employ for collision-free navigation. Optic flow is likely, over the next decade to be one of the most important visual cues that can explain flying insects’ behaviors for short-range navigation maneuvers in complex tunnels. Conversely, the biorobotic approach can therefore help to develop innovative flight control systems for flying robots with the aim of mimicking flying insects’ abilities and better understanding their flight.”
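To make the review's ‘ratio of speed to distance’ point concrete, here is a toy sketch of regulating ventral translational optic flow to a setpoint; the gains, setpoint and simulation are assumptions for illustration, not a controller taken from the paper.

```python
def ventral_optic_flow(forward_speed, height):
    """Translational optic flow from the ground (rad/s) scales as speed / height."""
    return forward_speed / height

def regulate_height(height, measured_flow, setpoint=1.0, gain=0.5, dt=0.05):
    """One step of a toy optic-flow regulator: climb when the measured ventral
    flow exceeds the setpoint and descend when it falls below, using only the
    flow signal rather than separate speed or height measurements."""
    return height + gain * (measured_flow - setpoint) * dt

# A sudden doubling of ground speed: the regulator climbs until the ventral
# flow returns to its setpoint, so height ends up tracking speed.
height, speed = 1.0, 2.0
for _ in range(400):
    flow = ventral_optic_flow(speed, height)   # what the eye would report
    height = regulate_height(height, flow)     # adjust altitude
print(round(height, 2), round(ventral_optic_flow(speed, height), 2))
```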

Serres, J. R., & Ruffier, F. (2017). Optic flow-based collision-free strategies: From insects to robots. Arthropod Structure & Development.
Categories: Papers from 2017

A robot named Cataglyphis

Here is a fun article detailing a robot which successfully completed a NASA challenge. Although the robot is not strictly bio-inspired, the challenge was a retrieve-and-return task for GPS-denied robots, so the name is very apt. You can find the details here: onlinelibrary.wiley.com/doi/10.1002/rob.21737/full
Gu, Y., Ohi, N., Lassak, K., Strader, J., Kogan, L., Hypes, A., … & Watson, R. Cataglyphis: An autonomous sample return rover. Journal of Field Robotics.
Categories: Papers from 2017