Special Issue

Current Biology has devoted a portion of its current issue to a “Migration and Navigation” Special Issue, and it looks like there are some fascinating articles:

Animal migration research takes wing – Kenneth J. Lohmann

Moving in the third dimension – Cyrus Martin

Wandering stories – Florian Maderspacher

Homing pigeons – Dora Biro

Ecology of animal migration – Thomas Alerstam, Johan Bäckman

Marine migrations – Nathan Putman

Collective animal migration – Iain D. Couzin

Conservation of migratory species – Joshua J. Horns, Çağan H. Şekercioğlu

Insect learning flights and walks – Thomas S. Collett, Jochen Zeil

Neuroethology of spatial cognition – Paul A. Dudchenko, Douglas Wallace

The Dung Beetle Compass – Marie Dacke, Basil el Jundi

Neuroethology of bat navigation – Daria Genzel, Yosef Yovel, Michael M. Yartsev

Individual variation in human navigation – Nora S. Newcombe

Demystifying Monarch Butterfly Migration – Steven M. Reppert, Jacobus C. de Roode

The Neurobiology of Mammalian Navigation – Steven Poulter, Tom Hartley, Colin Lever

Principles of Insect Path Integration – Stanley Heinze, Ajay Narendra, Allen Cheung

The Neurocognitive Basis of Spatial Reorientation – Joshua B. Julian, Alexandra T. Keinath, Steven A. Marchette, Russell A. Epstein

Categories: Papers from 2018

Quick hits

Abstract “Animals use external and/or internal cues to navigate and can show flexibility in cue use if one type of cue is unavailable. We studied the homing ability of the harvestman Heteromitobates discolor (Arachnida, Opiliones) by moving egg-guarding females from their clutches. We tested the importance of vision, proprioception, and olfaction. We predicted that homing would be negatively affected in the absence of these cues, with success being measured by the return of females to their clutches. We restricted proprioception by not allowing females to walk, removed vision by painting the eyes, and removed the odours by removing the clutch and cleaning its surroundings. We found that vision is important for homing, and in the absence of visual cues, proprioception is important. Finally, we found increased homing when eggs were present, and that the time of the day also influenced homing. We highlight vision as a previously overlooked sensory modality in Opiliones.”

dos Santos Silva, N. F., Fowler-Finn, K., Mortara, S. R., & Willemart, R. H. (2018). A Neotropical armored harvestman (Arachnida, Opiliones) uses proprioception and vision for homing. Behaviour. doi:10.1163/1568539X-00003503

Abstract “Honeybees, Apis mellifera, perform re-orientation flights to learn about the new surroundings of the hive when their hive is transported to a new location. Since the pattern of re-orientation flights has not yet been studied, we asked whether this form of exploratory behavior differs from the well described exploratory orientation flights performed by young honeybees before they start foraging. We also investigated whether the exploratory components of re-orientation flights differ from foraging flights and if so how. We recorded re-orientation flights using harmonic radar technology and compared the patterns and flight parameters of these flights with the first exploratory orientation flights of young honeybees and foraging flights of experienced foragers. Just as exploratory orientation flights of young honeybees, re-orientation flights can be classified into short- and long-range flights, and most short-range re-orientation flights were performed under unfavorable weather conditions. This indicates that bees adapt the flight pattern of their re-orientation and orientation flights to changing weather conditions in a similar way. Unlike exploratory orientation flights, more than one sector of the landscape was explored during a long-range re-orientation flight, and significantly longer flight durations and flight distances were observed. Thus, re-orienting bees explored a larger terrain than bees performing their first exploratory orientation flight. By displacing some bees after their first re-orientation flight, we could demonstrate that a single re-orientation flight seems to be sufficient to learn the new location of the hive. The flight patterns of re-orientation flights differed clearly from those of foraging flights. Thus, re-orientation flights represent a special exploratory behavior that is triggered by a change in the location of the hive.”

Degen, J., Hovestadt, T., Storms, M., & Menzel, R. (2018). Exploratory behavior of re-orienting foragers differs from other flight patterns of honeybees. PLoS ONE, 13(8). doi:10.1371/journal.pone.0202171

Categories: Papers from 2018

Keep on straight

We are all fond of a hierarchy, and it is second nature to rank behaviours in terms of their sophistication, complexity or (worse) intelligence. So, in navigation research we might instinctively assume that “simple” behaviours, such as taxis or straight-line walking, don’t rely on complex neural circuits. But why? Through comparative neuroethology we can ask more formally how brain regions vary with their role in controlling different behaviours across species. To do this we have to approach the problem from both directions: detailed descriptions of behaviour as well as a detailed understanding of neural circuits. The latter is well served by model systems, in our case mainly the fruit fly, whereas the former is often best addressed via charismatic animals with pronounced behaviours, such as the dung beetle and its straight-line rolling. Detailed studies of the brain and behaviour of these beetles show that they share the same central-complex organisation as other insects (el Jundi et al.), suggesting that diverse orientation behaviours are underpinned by conserved neural circuits. Similarly, in the fruit fly, maintaining a straight-line path at a fixed angle to the sun (menotaxis) relies on neurons previously implicated in compass behaviours (Giraldo et al.).
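As an aside for readers who haven’t met the term, menotaxis is simply heading control with a constant angular offset to a compass cue such as the sun. Below is a minimal toy sketch of such a controller (my own illustration, not code from either paper); the proportional-correction rule and all parameter values are assumptions made for the example.

import math

def wrap(angle):
    """Wrap an angle into the range (-pi, pi]."""
    return math.atan2(math.sin(angle), math.cos(angle))

def menotaxis_path(sun_azimuth, offset, steps=200, gain=0.5, step_len=1.0):
    """Steer so that (heading - sun_azimuth) converges on the chosen offset."""
    x, y, heading = 0.0, 0.0, 0.0
    path = [(x, y)]
    for _ in range(steps):
        error = wrap((heading - sun_azimuth) - offset)  # deviation from the set-point
        heading = wrap(heading - gain * error)          # simple proportional correction
        x += step_len * math.cos(heading)
        y += step_len * math.sin(heading)
        path.append((x, y))
    return path

# Example: hold a 60 degree offset from a sun at 30 degrees azimuth;
# the agent settles onto a straight path heading roughly 90 degrees.
track = menotaxis_path(sun_azimuth=math.radians(30), offset=math.radians(60))
print(track[-1])

The point of the toy is only that keeping a constant offset to a global cue produces a straight path without any representation of a goal, which is what makes the underlying circuitry such an interesting target.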

Taken together, these papers give me hope that the continued study of different insect species will be a productive endeavour.

el Jundi, B., Warrant, E. J., Pfeiffer, K., & Dacke, M. (2018). Neuroarchitecture of the dung beetle central complex. Journal of Comparative Neurology. doi:10.1002/cne.24520

Giraldo, Y. M., Leitch, K. J., Ros, I. G., Warren, T. L., Weir, P. T., & Dickinson, M. H. (2018). Sun navigation requires compass neurons in Drosophila. Current Biology.

Categories: Papers from 2018

Check out these moves

In my opinion, the most interesting aspects of insect navigation are the learning walks and flights that individuals perform when memorizing the surroundings of an important location. In some insect species these manoeuvres can appear very regular and precisely controlled. This kind of control is easier for flying insects, which can control facing and moving directions independently, whereas most walking insects have to combine forward walking with rotational movements to achieve a desired looking pattern. Some ants have particular motor motifs that seem to indicate function, such as the ‘voltes’ and ‘pirouettes’ of some desert ants (Fleischmann et al.) and the scanning behaviours of others. In this new learning walk paper from Jayatilaka et al., the jack jumper ant shows an interesting pattern of looking. In the authors’ words: “ants move along arcs around the nest while performing oscillating scanning movements. In a regular temporal sequence, the ants’ gaze oscillates between the nest direction and the direction pointing away from the nest. Ants thus experience a sequence of views roughly across the nest and away from the nest from systematically spaced vantage points around the nest.” This pattern is intriguing as it might suggest that ants learn views both towards and away from the nest. Having positive and negative views might help nest search, or it could be that ants are simultaneously learning nest cues and views that can guide foraging routes.
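To make that choreography concrete, here is a toy reconstruction (my own illustration, not code or data from Jayatilaka et al.) that generates vantage points along an arc around a nest while the gaze alternates between the nest and anti-nest directions; the radius, arc span and oscillation count are arbitrary assumptions.

import math

def learning_walk_gaze(radius=10.0, arc_span=math.pi, n_points=40, n_oscillations=6):
    """Return (position, gaze direction in degrees) samples along an arc around a nest at (0, 0)."""
    samples = []
    for i in range(n_points):
        t = i / (n_points - 1)
        theta = t * arc_span                      # angular position on the arc around the nest
        x, y = radius * math.cos(theta), radius * math.sin(theta)
        toward_nest = math.atan2(-y, -x)          # bearing from the ant to the nest
        # Alternate between looking at the nest and looking directly away from it.
        phase = math.sin(2 * math.pi * n_oscillations * t)
        gaze = toward_nest if phase >= 0 else toward_nest + math.pi
        samples.append(((round(x, 2), round(y, 2)), round(math.degrees(gaze) % 360, 1)))
    return samples

# Print the first few vantage points and gaze directions along the arc.
for pos, gaze_deg in learning_walk_gaze()[:6]:
    print(pos, gaze_deg)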

Jayatilaka, P., Murray, T., Narendra, A., & Zeil, J. (2018). The choreography of learning walks in the Australian jack jumper ant Myrmecia croslandi. Journal of Experimental Biology, jeb-185306.

Categories: Papers from 2018

Moving through a flat world

Humans live in a 3D world and have 3D perceptual systems. Thus it is a small step to assume that our mental representations of the world are also 3D. But a lot of computational power is required to update internal representations of the (out of sight) world as one moves. Vuong et al. suggest that humans might instead use flattened representations of space as a more economical way of maintaining a world model. Why do I mention this paper here, on a blog about insect navigation? Well, 2D representations of the world have long been assumed to be part of the view-based navigation that is so well studied in insects, ever since Cartwright and Collett’s snapshot model. As the title of this paper suggests, different tasks will have different requirements, and for some tasks we will see overlap in the insect and human ways of doing things.
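For anyone unfamiliar with view-based homing, here is a bare-bones sketch of the idea. To keep it short I have used an “average landmark vector” style scheme rather than Cartwright and Collett’s actual snapshot algorithm: the stored “view” is compressed into a single compass-anchored vector, and the agent homes by following the difference between its current vector and the stored one. The landmark positions and parameters are made-up assumptions for illustration only.

import math

LANDMARKS = [(5.0, 0.0), (0.0, 6.0), (-4.0, -3.0)]  # hypothetical landmark positions

def average_landmark_vector(x, y):
    """Sum of unit vectors pointing from (x, y) toward each visible landmark."""
    vx = vy = 0.0
    for lx, ly in LANDMARKS:
        d = math.hypot(lx - x, ly - y)
        vx += (lx - x) / d
        vy += (ly - y) / d
    return vx, vy

def home(start, goal=(0.0, 0.0), steps=400, step_len=0.05):
    """Store the vector at the goal, then repeatedly step along the difference
    between the current vector and the stored one, which points roughly homeward."""
    hx, hy = average_landmark_vector(*goal)        # the memorized 'view'
    x, y = start
    for _ in range(steps):
        cx, cy = average_landmark_vector(x, y)
        dx, dy = cx - hx, cy - hy                  # approximate home vector
        norm = math.hypot(dx, dy)
        if norm < 1e-3:                            # views match: we are home
            break
        x += step_len * dx / norm
        y += step_len * dy / norm
    return round(x, 2), round(y, 2)

# Starting well away from the memorized view, the agent ends up near (0, 0).
print(home(start=(8.0, 8.0)))

Note that nothing here is three-dimensional: the whole memory is a flat, panoramic quantity, which is why flattened human representations of space feel so familiar to insect navigation researchers.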

Abstract “People are able to keep track of objects as they navigate through space, even when objects are out of sight. This requires some kind of representation of the scene and of the observer’s location. We tested the accuracy and reliability of observers’ estimates of the visual direction of previously-viewed targets. Participants viewed 4 objects from one location, with binocular vision and small head movements giving information about the 3D locations of the objects. Without any further sight of the targets, participants walked to another location and pointed towards them. All the conditions were tested in an immersive virtual environment and some were also carried out in a real scene. Participants made large, consistent pointing errors that are poorly explained by any single 3D representation. Instead, a flattened representation of space that is dependent on the structure of the environment at the time of pointing provides a good account of participants’ errors. This suggests that the mechanisms for updating visual direction of unseen targets are not based on a stable 3D model of the scene, even a distorted one.”

Vuong, J., Fitzgibbon, A., & Glennerster, A. (2018). Human pointing errors suggest a flattened, task-dependent representation of space. bioRxiv, 390088. doi:10.1101/390088

Categories: Papers from 2018

Spatial Representations in Cockroaches

Abstract: “When cockroaches are trained to a visual–olfactory cue pairing using the antennal projection response (APR), they can form different memories for the location of a visual cue. A series of experiments, each examining memory for the spatial location of a visual cue, were performed using restrained cockroaches. The first group of experiments involved training cockroaches to associate a visual cue (CS—green LED) with an odor cue (US) in the presence or absence of a second visual reference cue (white LED). These experiments revealed that cockroaches have at least two forms of spatial memory. First, it was found that during learning, the movements of the antennae in response to the odor influenced the cockroaches’ memory. If they use only one antenna, cockroaches form a memory that results in an APR being elicited to the CS irrespective of its location in space. When using both antennae, the cockroaches resulting memory leads to an APR to the CS that is spatially confined to within 15° of the trained position. This memory represents an egocentric spatial representation. Second, the cockroaches simultaneously formed a memory for the angular spatial relationships between two visual cues when trained in the presence of a second visual reference cue. This training provided the cockroaches an allocentric representation or visual snapshot of the environment. If both egocentric and the visual snapshot were available to the cockroach to localize the learned cue, the visual snapshot determined the behavioral response in this assay. Finally, the split-brain assay was used to characterize the cockroach’s ability to establish a memory for the angular relationship between two visual cues with half a brain. Split-brain cockroaches were trained to unilaterally associate a pair of visual cues (CS—green LED and reference—white LED) with an odor cue (US). Split-brain cockroaches learned the general arrangement of the visual cues (i.e., the green LED is right of the white LED), but not the precise angular relationship. These experiments provide new insight into spatial memory processes in the cockroach.”

Pomaville, M., & Lent, D. D. L. (2018). Multiple representations of space by the cockroach, Periplaneta americana. Frontiers in Psychology, 9, 1312. doi:10.3389/fpsyg.2018.01312

Categories: Papers from 2018

Seeing the bigger picture

In psychology, the idea of a gestalt is that the mind perceives a global whole owing to fundamental self-organizing tendencies. Thus we perceive a checkerboard, not a collection of white and black squares, and we perceive faces rather than sets of discrete eyes, noses and mouths. As with most perceptual processes, such “holistic” processing is unlikely to be restricted to human perceptual systems, and looking for it in insects might help us understand the minimal neural circuits required for such perceptual abilities. This is what Avargues-Weber et al. have done: they find behaviours in bees and wasps that can be described as evidence of holistic processing. However, results based on coarse levels of behaviour don’t mean that bees actually perceive the world in this way, and they don’t help us understand how brains interact with the world to allow this behaviour to emerge. Similar studies with a more detailed analysis of sensorimotor constraints could be very informative.

Abstract: “The expertise of humans for recognizing faces is largely based on holistic processing mechanism, a sophisticated cognitive process that develops with visual experience. The various visual features of a face are thus glued together and treated by the brain as a unique stimulus, facilitating robust recognition. Holistic processing is known to facilitate fine discrimination of highly similar visual stimuli, and involves specialized brain areas in humans and other primates. Although holistic processing is most typically employed with face stimuli, subjects can also learn to apply similar image analysis mechanisms when gaining expertise in discriminating novel visual objects, like becoming experts in recognizing birds or cars. Here, we ask if holistic processing with expertise might be a mechanism employed by the comparatively miniature brains of insects. We thus test whether honeybees (Apis mellifera) and/or wasps (Vespula vulgaris) can use holistic-like processing with experience to recognize images of human faces, or Navon-like parameterized-stimuli. These insect species are excellent visual learners and have previously shown ability to discriminate human face stimuli using configural type processing. Freely flying bees and wasps were consequently confronted with classical tests for holistic processing, the part-whole effect and the composite-face effect. Both species could learn similar faces from a standard face recognition test used for humans, and their performance in transfer tests was consistent with holistic processing as defined for studies on humans. Tests with parameterized stimuli also revealed a capacity of honeybees, but not wasps, to process complex visual information in a holistic way, suggesting that such sophisticated visual processing may be far more spread within the animal kingdom than previously thought, although may depend on ecological constraints. ”

Avargues-Weber, A., d’Amaro, D., Metzler, M., Finke, V., Baracchi, D., & Dyer, A. G. (2018). Does holistic processing require a large brain? Insights from honeybees and wasps in fine visual recognition tasks. Frontiers in Psychology, 9, 1313.

Categories: Papers from 2018