Visual prediction for snapshot guidance

There is a variety of computational models of snapshot guidance that suggest plausible algorithms insects might use for view-based homing. One class of algorithm involves ‘mentally’ warping views to simulate the consequences of movement, so that an agent can compute (in advance) the movement that will best reduce the difference between the current view and the goal view.

Such models perform well, but they involve computationally expensive processing. Here, Ralf Möller presents a new model that gains computational economy by outsourcing some of the computation to a scanning behaviour. The simulated ant uses a learning walk to store a series of goal views; then, during navigation, the agent performs a scanning behaviour in which it adopts a series of orientations. For each orientation, a prediction is made of how the current view would transform under forward translation, and the agent chooses the direction whose prediction best matches one of the stored snapshots. An interesting aspect of the paper – beyond the details of this fun new model – is that Ralf explicitly considers the neural cost of the computations involved, with respect to potential neural-network implementations.
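To make the scanning idea concrete, here is a minimal Python/NumPy sketch of the predict-then-match loop, not Möller's actual implementation. It assumes a 1-D panoramic brightness view, an equal-distance assumption for predicting the effect of forward translation (all features placed on a circle of fixed radius), and sum-of-squared-differences matching; all of these representational choices are illustrative, and the function names are invented for this sketch.

```python
import numpy as np

def warp_forward(view, step, radius=1.0):
    """Predict how a 1-D panoramic view (brightness sampled at evenly
    spaced azimuths, forward = azimuth 0) changes under a forward
    translation of `step`, assuming all features lie at distance
    `radius` (a toy equal-distance assumption)."""
    n = len(view)
    az = np.linspace(-np.pi, np.pi, n, endpoint=False)
    # Position of each feature after the agent moves `step` forward.
    x = radius * np.cos(az) - step
    y = radius * np.sin(az)
    new_az = np.arctan2(y, x)  # azimuth of each feature in the new view
    # Resample the warped feature positions back onto the azimuth grid.
    order = np.argsort(new_az)
    return np.interp(az, new_az[order], view[order], period=2 * np.pi)

def choose_direction(current_view, snapshots, n_headings=36, step=0.2):
    """Scan candidate headings: rotate the current view to each heading,
    predict the view after a forward step, and return the heading (rad)
    whose prediction best matches any stored snapshot (SSD)."""
    n = len(current_view)
    best_err, best_heading = np.inf, None
    for k in range(n_headings):
        rotated = np.roll(current_view, k * n // n_headings)
        predicted = warp_forward(rotated, step)
        err = min(np.sum((predicted - s) ** 2) for s in snapshots)
        if err < best_err:
            best_err, best_heading = err, 2 * np.pi * k / n_headings
    return best_heading
```

Note how the expensive image warping is performed only for a discrete set of orientations that the scanning behaviour supplies, rather than searching over all possible movements, which is the source of the model's computational economy.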

Ralf Möller (2012) A model of ant navigation based on visual prediction. Journal of Theoretical Biology 305: 118-130. doi:10.1016/j.jtbi.2012.04.022

Categories: Papers from 2012