Author(s)

Damon A. Clark, James E. Fitzgerald, Justin M. Ales, Daryl M. Gohl, Marion A. Silies, Anthony M. Norcia, Thomas R. Clandinin

ISSN

1097-6256

Publication year

2014

Periodical

Nat Neurosci

Issue

2

Volume

17

Pages

296-303

Author Address

Full version

Sighted animals extract motion information from visual scenes by processing spatiotemporal patterns of light falling on the retina. The dominant models for motion estimation exploit intensity correlations only between pairs of points in space and time. Moving natural scenes, however, contain more complex correlations. We found that fly and human visual systems encode the combined direction and contrast polarity of moving edges using triple correlations that enhance motion estimation in natural environments. Both species extracted triple correlations with neural substrates tuned for light or dark edges, and sensitivity to specific triple correlations was retained even as light and dark edge motion signals were combined. Thus, both species separately process light and dark image contrasts to capture motion signatures that can improve estimation accuracy. This convergence argues that statistical structures in natural scenes have greatly affected visual processing, driving a common computational strategy over 500 million years of evolution.
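To make the distinction between pairwise and triple correlations concrete, below is a minimal sketch, not the authors' actual estimators. It contrasts a Hassenstein-Reichardt-style 2-point correlator with one example 3-point (triple) correlation term; the function names, the specific spatiotemporal offsets, and the drifting-edge stimulus are all illustrative assumptions, not taken from the paper.

```python
import numpy as np

def pairwise_correlator(s, dx=1, dt=1):
    """2-point (Reichardt-style) motion estimate.

    s: 2D array, s[t, x] = contrast at time t, position x.
    Correlates a delayed signal at one point with the undelayed signal
    at a neighboring point, in both directions, and takes the difference.
    """
    delayed = s[:-dt, :]   # s(t - dt, x)
    current = s[dt:, :]    # s(t, x)
    rightward = delayed[:, :-dx] * current[:, dx:]
    leftward = delayed[:, dx:] * current[:, :-dx]
    return np.mean(rightward - leftward)

def triple_correlator(s, dx=1, dt=1):
    """One example 3-point (triple) spatiotemporal correlation:
    s(t-dt, x) * s(t-dt, x+dx) * s(t, x+dx).

    Because it is odd in contrast, its sign depends on the polarity
    (light vs. dark) of a moving edge as well as its direction.
    """
    a = s[:-dt, :-dx]  # s(t - dt, x)
    b = s[:-dt, dx:]   # s(t - dt, x + dx)
    c = s[dt:, dx:]    # s(t, x + dx)
    return np.mean(a * b * c)

# Illustrative stimulus: a dark edge drifting rightward across a 1D "retina".
T, X, speed = 200, 100, 1
s = np.zeros((T, X))
for t in range(T):
    s[t, : min(X, t * speed)] = -1.0  # dark region trailing the edge
s -= s.mean()                         # convert intensity to contrast

print("2-point estimate:", pairwise_correlator(s))
print("3-point estimate:", triple_correlator(s))
```

The 2-point output depends only on pairwise products, so it is blind to contrast polarity, whereas the 3-point term changes sign between light and dark edges, illustrating why triple correlations can carry the combined direction-and-polarity signature described in the abstract.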