
Archive for the ‘Papers from 2017’ Category

Ant-bots and compass sensors

A fundamental principle behind this blog is that the study of insect spatial behaviour is inherently interesting to roboticists, because insect sensors and behaviours are tuned for navigation and little else. Following conference season, we have a bumper crop of such biorobotic projects. Two papers from the Marseille team detail the development of a hexapod robot and the subsequent deployment of a compass sensor inspired by the specialised ommatidia in the dorsal rim area of insect eyes. Wolfgang Stürzl also presents a new compass sensor, in this case inspired by the ocelli of insects: three simple, dorsally facing ‘eyes’.

Dupeyroux, J., Diperi, J., Boyron, M., Viollet, S., & Serres, J. (2017). A novel insect-inspired optical compass sensor for a hexapod walking robot. In IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS 2017), Vancouver, Canada.

Dupeyroux, J., Passault, G., Ruffier, F., Viollet, S., & Serres, J. (2017). Hexabot: a small 3D-printed six-legged walking robot designed for desert ant-like navigation tasks. In IFAC World Congress 2017.

Stürzl, W. (2017). A Lightweight Single-Camera Polarization Compass With Covariance Estimation. In Proceedings of the IEEE International Conference on Computer Vision (ICCV) (pp. 5353-5361).
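Neither paper’s hardware is reproduced here, but for readers new to celestial polarization compasses, here is a minimal sketch of the underlying idea: photodiodes behind linear polarizers at different orientations sample the sky’s E-vector, and a simple combination of their readings recovers its orientation. The three-unit layout, degree of polarization and noise-free sky model are illustrative assumptions, not the sensors described above.

```python
# Minimal sketch: recovering the sky's E-vector orientation (a compass cue)
# from three idealised readings taken through polarizers at 0, 60 and 120 deg.
# Sensor angles, degree of polarization and the noise-free sky model are
# illustrative assumptions, not the hardware described in the papers above.
import numpy as np

SENSOR_ANGLES = np.deg2rad([0.0, 60.0, 120.0])  # polarizer orientations of the 3 units

def sky_readings(e_vector, intensity=1.0, dop=0.7):
    """Idealised intensity seen through a linear polarizer at each sensor angle."""
    return intensity * (1.0 + dop * np.cos(2.0 * (e_vector - SENSOR_ANGLES)))

def estimate_e_vector(s):
    """Recover the E-vector axis (mod 180 deg) from the three readings."""
    s0, s1, s2 = s
    x = (2.0 * s0 - s1 - s2) / 3.0      # proportional to cos(2*phi)
    y = (s1 - s2) / np.sqrt(3.0)        # proportional to sin(2*phi)
    return 0.5 * np.arctan2(y, x)       # axial estimate, ambiguous by 180 deg

if __name__ == "__main__":
    true_phi = np.deg2rad(37.0)
    est = estimate_e_vector(sky_readings(true_phi))
    print(np.rad2deg(true_phi), np.rad2deg(est) % 180.0)
```

In a real device the 180° ambiguity must be resolved with extra cues such as the sun’s position, and the readings are far noisier, which is why an estimate of the compass uncertainty (as in the Stürzl paper’s covariance estimation) matters.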

Categories: Papers from 2017

From the field to VR (and back again?)

The last month has seen the publication of a pair of papers describing open-source tools that solve two key problems faced by field experimentalists. First, at the recent ICCV 2017 workshop on animal tracking, Risse et al. presented Habitracks, open-source software for automatically tracking small animals in videos recorded in natural habitats. Fully automatic and highly accurate tracking was shown for a number of species, including ants, bees and dung beetles, in video recordings from field experiments. In the second work, the same team present Habitat3D, a software tool that automatically integrates multiple laser scans of natural environments into a single photorealistic mesh. This allows easy reconstruction of animal views using standard graphics packages, or the presentation of realistic VR worlds to insects in trackball experiments. Combined, these tools bring us a step closer to reconstructing the actual visual perspective of animals, allowing hypotheses to be validated in realistic environments.

Risse, B., Mangan, M., Del Pero, L., & Webb, B. (2017). Visual Tracking of Small Animals in Cluttered Natural Environments Using a Freely Moving Camera. In The IEEE International Conference on Computer Vision (ICCV) (pp. 2840-2849).

Risse, B., Mangan, M., Stürzl, W., & Webb, B. (2018). Software to convert terrestrial LiDAR scans of natural environments into photorealistic meshes. Environmental Modelling & Software, 99, 88-100.
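As a taste of what the reconstructed views make possible, here is a minimal sketch of one downstream analysis: recovering a heading by rotating a rendered panoramic image against a stored reference view (a rotational image difference function). The rendering step itself, from a Habitat3D-style mesh, is assumed to have been done elsewhere with a graphics package; the image sizes and the RMS metric are illustrative choices, not the papers’ pipeline.

```python
# Rotational image difference sketch: compare a panoramic view against a
# reference snapshot at every azimuthal shift and take the best match.
import numpy as np

def rotational_idf(reference, current):
    """RMS pixel difference between `reference` and `current` for every
    horizontal (azimuthal) shift of the panoramic image `current`."""
    width = current.shape[1]
    diffs = np.empty(width)
    for shift in range(width):
        rotated = np.roll(current, shift, axis=1)
        diffs[shift] = np.sqrt(np.mean((rotated - reference) ** 2))
    return diffs

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    ref = rng.random((30, 120))           # stand-in for a rendered reference view
    cur = np.roll(ref, -25, axis=1)       # same place, body rotated by 25 columns
    best = int(np.argmin(rotational_idf(ref, cur)))
    print("recovered rotation (columns):", best)  # expect 25
```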

Categories: Papers from 2017

Learning about sky compasses

The early days of an ant forager’s life present a succession of learning challenges. Ants must learn the ephemeris function for their part of the world and the current time of year, and they must also learn the visual surroundings of their nest. Learning walks with a specific structure are key to this. A key feature of learning walks in desert ant species from visually rich environments is that ants fixate the nest at specific points. Here, Grob et al. look at how these precise fixations might depend on celestial information. Interestingly, the precision of fixations is maintained even when polarisation and sun-position information is removed. However, natural polarisation information via the UV channel is necessary to trigger brain changes in these new foragers: learning walks with natural polarisation and UV lead to changes in both the central complex and the visual input region of the mushroom body. This gives some clues as to the neural underpinnings of early ‘career’ learning in new foragers.

Grob R, Fleischmann PN, Grübel K, Wehner R and Rössler W (2017) The Role of Celestial Compass Information in Cataglyphis Ants during Learning Walks and for Neuroplasticity in the Central Complex and Mushroom Bodies. Front. Behav. Neurosci. 11:226. doi: 10.3389/fnbeh.2017.00226
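To see why the ephemeris function has to be learned at all, here is a minimal sketch of how solar azimuth at a given time of day shifts with latitude and date. It uses standard low-precision solar-position approximations purely for illustration; the specific latitudes and date are arbitrary choices, not values from the paper.

```python
# Approximate solar azimuth (degrees clockwise from north) from date, local
# solar time and latitude, using textbook low-precision formulas.
import numpy as np

def solar_azimuth(day_of_year, solar_hour, latitude_deg):
    lat = np.deg2rad(latitude_deg)
    decl = np.deg2rad(-23.44) * np.cos(2.0 * np.pi * (day_of_year + 10) / 365.0)
    hour_angle = np.deg2rad(15.0 * (solar_hour - 12.0))
    sin_alt = np.sin(lat) * np.sin(decl) + np.cos(lat) * np.cos(decl) * np.cos(hour_angle)
    alt = np.arcsin(sin_alt)
    cos_az = (np.sin(decl) - np.sin(alt) * np.sin(lat)) / (np.cos(alt) * np.cos(lat))
    az = np.degrees(np.arccos(np.clip(cos_az, -1.0, 1.0)))
    return az if hour_angle < 0 else 360.0 - az  # morning east of north, afternoon west

if __name__ == "__main__":
    # Mid-morning on the summer solstice: the sun sits at clearly different
    # azimuths at a low desert latitude versus a boreal one (illustrative values).
    print(round(solar_azimuth(172, 9.0, 34.0), 1))
    print(round(solar_azimuth(172, 9.0, 61.0), 1))
```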

Categories: Papers from 2017

Neural control of flight

With each passing month, we learn more about the neural underpinnings of navigation and orientation behaviours. Here is another piece in the jigsaw:

Abstract: The impressive repertoire of honeybee visually guided behaviors, and their ability to learn has made them an important tool for elucidating the visual basis of behavior. Like other insects, bees perform optomotor course correction to optic flow, a response that is dependent on the spatial structure of the visual environment. However, bees can also distinguish the speed of image motion during forward flight and landing, as well as estimate flight distances (odometry), irrespective of the visual scene. The neural pathways underlying these abilities are unknown. Here we report on a cluster of descending neurons (DNIIIs) that are shown to have the directional tuning properties necessary for detecting image motion during forward flight and landing on vertical surfaces. They have stable firing rates during prolonged periods of stimulation and respond to a wide range of image speeds, making them suitable to detect image flow during flight behaviors. While their responses are not strictly speed tuned, the shape and amplitudes of their speed tuning functions are resistant to large changes in spatial frequency. These cells are prime candidates not only for the control of flight speed and landing, but also the basis of a neural ‘front end’ of the honeybee’s visual odometer.

Ibbotson, M. R., Hung, Y. S., Meffin, H., Boeddeker, N., & Srinivasan, M. V. (2017). Neural basis of forward flight control and landing in honeybees. Scientific Reports, 7, 1-15.
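For readers unfamiliar with the visual odometer mentioned at the end of the abstract, here is a minimal sketch of the idea: distance flown is read out by integrating image angular velocity over time, so the reading depends on how far away the surroundings are. The speeds, distances and simplified geometry are illustrative assumptions, not the paper’s model.

```python
# Visual odometer sketch: integrate lateral image angular velocity over a
# straight flight. For a surface at distance d, forward speed v produces an
# image angular velocity of roughly v / d (rad/s) at 90 deg to the flight path.
def visual_odometer(flight_speed, wall_distance, duration, dt=0.01):
    odometer = 0.0
    t = 0.0
    while t < duration:
        odometer += (flight_speed / wall_distance) * dt
        t += dt
    return odometer

if __name__ == "__main__":
    # Same flight, but halving the wall distance doubles the odometer reading,
    # echoing the classic over-estimation of distance flown in narrow tunnels.
    print(visual_odometer(flight_speed=1.0, wall_distance=0.2, duration=10.0))
    print(visual_odometer(flight_speed=1.0, wall_distance=0.1, duration=10.0))
```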

Categories: Papers from 2017

Do ants learn what’s unfamiliar?

Experiments with a variety of ant species have shown that ants displaced away from a familiar feeder will follow their path integration (PI) defined direction for only a portion of the home vector before beginning a systematic search. This behaviour is more pronounced in species living in cluttered environments. In this paper, Schwarz et al. look at the behaviour of naive and experienced ants in a similar situation. As expected, the experienced ants follow only a portion of their PI-defined route before searching. However, naive ants follow PI for significantly longer, demonstrating that running off only a portion of the PI-indicated home vector before searching is not a species-level innate bias but depends on visual experience. The authors suggest that experienced ants are better able to identify an unfamiliar location and therefore know to switch to searching more rapidly. An alternative is that individuals with visual experience may be trying to apply those memories, unsuccessfully, in the unfamiliar location. This faulty visual guidance would have little influence early in the path, when the home vector is long and PI is weighted strongly, but could lead to disorganised (search-like) paths later, when the remaining PI vector is weaker.

Schwarz, S., Wystrach, A., & Cheng, K. (2017). Ants’ navigation in an unfamiliar environment is influenced by their experience of a familiar route. Scientific Reports, 7(1), 14161.
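The weighting idea in the final sentences above can be made concrete with a minimal sketch: the heading is a vector sum of the PI home direction and a visual-memory direction, with the PI weight falling as the home vector is run off. In unfamiliar surroundings the visual direction is effectively noise, so paths stay straight early on and become search-like later. The weights and noise model are illustrative assumptions, not the authors’ model.

```python
# Weighted combination of PI and (unreliable) visual guidance.
import numpy as np

def combined_heading(pi_direction, remaining_vector_len, rng, visual_weight=1.0):
    """Return a heading (radians) from weighted PI and visual cues."""
    pi_weight = remaining_vector_len                 # strong PI early, weak late
    visual_direction = rng.uniform(-np.pi, np.pi)    # unfamiliar scene -> arbitrary
    x = pi_weight * np.cos(pi_direction) + visual_weight * np.cos(visual_direction)
    y = pi_weight * np.sin(pi_direction) + visual_weight * np.sin(visual_direction)
    return np.arctan2(y, x)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    home = 0.0  # direction indicated by PI
    for remaining in (10.0, 5.0, 1.0, 0.2):  # metres of home vector left to run
        headings = [combined_heading(home, remaining, rng) for _ in range(200)]
        print(remaining, round(float(np.std(headings)), 2))  # spread grows as PI weakens
```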

Categories: Papers from 2017

The view from above (or below)

Analysing the information available in views has become a staple of insect navigation research. Here, the authors take this into the third dimension, looking at the information available in views at different heights above the ground.

Abstract: Panoramic views of natural environments provide visually navigating animals with two kinds of information: they define locations because image differences increase smoothly with distance from a reference location and they provide compass information, because image differences increase smoothly with rotation away from a reference orientation. The range over which a given reference image can provide navigational guidance (its ‘catchment area’) has to date been quantified from the perspective of walking animals by determining how image differences develop across the ground plane of natural habitats. However, to understand the information available to flying animals there is a need to characterize the ‘catchment volumes’ within which panoramic snapshots can provide navigational guidance. We used recently developed camera-based methods for constructing 3D models of natural environments and rendered panoramic views at defined locations within these models with the aim of mapping navigational information in three dimensions. We find that in relatively open woodland habitats, catchment volumes are surprisingly large extending for metres depending on the sensitivity of the viewer to image differences. The size and the shape of catchment volumes depend on the distance of visual features in the environment. Catchment volumes are smaller for reference images close to the ground and become larger for reference images at some distance from the ground and in more open environments. Interestingly, catchment volumes become smaller when only above horizon views are used and also when views include a 1 km distant panorama. We discuss the current limitations of mapping navigational information in natural environments and the relevance of our findings for our understanding of visual navigation in animals and autonomous robots.

Murray T, Zeil J (2017) Quantifying navigational information: The catchment volumes of panoramic snapshots in outdoor scenes. PLoS ONE 12(10): e0187226.
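Here is a minimal sketch of how a catchment volume of the kind described in the abstract can be mapped: render a panoramic view at each point of a 3D grid, take the image difference to the view at a reference location, and keep the points where the difference stays below a tolerance. The `render_view` function is a placeholder for a renderer driven by a 3D habitat model; the grid, tolerance and toy renderer are illustrative, not the paper’s method.

```python
# Mapping a crude "catchment volume" from image differences on a 3D grid.
import numpy as np

def rms_difference(a, b):
    return float(np.sqrt(np.mean((a - b) ** 2)))

def catchment_volume(render_view, reference_pos, grid_points, tolerance):
    """Grid points whose views differ from the reference view by less than
    `tolerance` (a crude proxy for the navigable catchment volume)."""
    ref_view = render_view(reference_pos)
    return [p for p in grid_points
            if rms_difference(render_view(p), ref_view) < tolerance]

if __name__ == "__main__":
    # Toy stand-in renderer: image content drifts smoothly with position.
    def render_view(pos):
        x, y, z = pos
        base = np.linspace(0.0, 1.0, 360)
        return np.tile(np.sin(base * 20 + 0.3 * x + 0.3 * y + 0.6 * z), (10, 1))

    grid = [(x, y, z) for x in range(-5, 6) for y in range(-5, 6) for z in range(0, 4)]
    inside = catchment_volume(render_view, (0, 0, 1), grid, tolerance=0.25)
    print(len(inside), "of", len(grid), "grid points fall inside the catchment volume")
```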

Categories: Papers from 2017

Visually guided behaviour with small sets of neurons

A general problem in neuroscience is understanding how sensory systems organise information in the service of behaviour. Computational approaches can be useful for such studies, as they allow one to simulate the sensory experience of a behaving animal whilst considering how sensory information should be encoded. In flies, small subpopulations of identifiable neurons are known to be necessary for particular visual tasks, and the response properties of these populations have now been described in detail. Surprisingly, these populations are small, with only 14 or 28 neurons each, which suggests something of a sensory bottleneck. In this paper, Dewar et al. consider how the population code from these neurons relates to the information required to control specific behaviours, concluding that flies are unlikely to possess a general-purpose pattern-learning ability. However, implicit information about the shape and size of objects, which is necessary for many ecologically important visually guided behaviours, does pass through the sensory bottleneck. These findings show that nervous systems can be particularly economical when specific populations of cells are paired with specific visual behaviours.

Dewar ADM, Wystrach A, Philippides A, Graham P (2017) Neural coding in the visual system of Drosophila melanogaster: How do small neural populations support visually guided behaviours? PLoS Comput Biol 13(10): e1005735. https://doi.org/10.1371/journal.pcbi.1005735
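To make the idea of a sensory bottleneck concrete, here is a minimal sketch: a panoramic image is projected onto a small number of broad receptive fields, and only that low-dimensional code is kept. Fine pattern detail is lost, but coarse information about an object’s size and elevation survives. The Gaussian field shapes and the 14-cell count are illustrative stand-ins, not the measured receptive fields modelled in the paper.

```python
# Project a panoramic image onto 14 broad receptive fields and keep the code.
import numpy as np

H, W, N_CELLS = 40, 180, 14

def make_receptive_fields(n_cells=N_CELLS, height=H, width=W):
    """Broad Gaussian fields tiling the azimuth, in two elevation bands."""
    ys, xs = np.mgrid[0:height, 0:width]
    fields = []
    for i in range(n_cells):
        cx = (i % (n_cells // 2)) * width / (n_cells // 2) + width / n_cells
        cy = height * (0.3 if i < n_cells // 2 else 0.7)
        fields.append(np.exp(-(((xs - cx) / 25.0) ** 2 + ((ys - cy) / 12.0) ** 2)))
    return np.array(fields)

def encode(image, fields):
    """Population response: one number per cell (normalised dot product)."""
    return np.array([float((f * image).sum() / f.sum()) for f in fields])

if __name__ == "__main__":
    fields = make_receptive_fields()
    tall_bar = np.zeros((H, W)); tall_bar[5:35, 80:100] = 1.0     # large object
    short_bar = np.zeros((H, W)); short_bar[25:35, 80:100] = 1.0  # small object, same azimuth
    print(np.round(encode(tall_bar, fields), 3))
    print(np.round(encode(short_bar, fields), 3))  # size/elevation info survives the bottleneck
```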

Categories: Papers from 2017