Ever wondered how the world looks to your dog, your cat, or even the spiders that live in your house?
Every animal species sees colors, patterns, and brightness differently because of its unique eye adaptations, meaning there are countless modes of vision that humans have never experienced. That’s why scientists have developed free, open-source software that can run on photos taken with an average smartphone to simulate the perspectives of animals. The platform is described in a paper published on Tuesday in Methods in Ecology and Evolution.
Called the Quantitative Colour Pattern Analysis (QCPA) framework, the platform enables people to customize digital photos to match properties associated with animal visual systems.
“It’s an exciting time to be studying animal vision because we have these imaging technologies becoming incredibly accessible and cheap and we have computational power going through the roof,” said co-lead author Cedric van den Berg, a PhD student at the University of Queensland in Australia, in a call.
Four images of a nudibranch (sea slug). Top left: the image taken with a digital camera. Top right: the image as perceived by a triggerfish at 5 m depth and a 10 cm viewing distance. Bottom left: color contrast of edges as perceived by a triggerfish. Bottom right: a heatmap of the triggerfish’s perception of color saturation. Image: Cedric van den Berg et al.
“We have AI and deep learning starting to make their way into the field, and we have an ever-growing understanding of the neuronal systems that actually connect those eyes to behavior and evolutionary consequences,” he added.
For decades, scientists have developed techniques to simulate isolated features of animal vision using advanced spectral-imaging equipment. In 2015, van den Berg and his colleagues realized that digital photos, even those from an ordinary smartphone, could generate comparable models.
The team has been working on a software framework ever since, with the aim of unifying all the disparate methods and datasets pioneered by other researchers into one system.
In the above video, co-lead author Jolyon Troscianko, an ecologist at the University of Exeter, explains how QCPA can adjust images to reflect various light and color sensitivities, spatial acuities, photoreceptor abundances, and other traits that differ between animal species.
For instance, the image below displays a natural setting through human eyes, on the left, next to a model generated by QCPA that shows how a bee might perceive the same scene.
A scene from the perspective of a human (left) and a bee (right). Image: Jolyon Troscianko
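One of the traits mentioned above, spatial acuity, can be illustrated with a short sketch. This is not QCPA itself (the actual toolbox works on calibrated images and models receptor responses too); it is a minimal stand-in showing one common approximation: detail finer than a viewer's acuity, measured in cycles per degree, is removed with a Gaussian blur. The function names (`gaussian_kernel`, `acuity_limited`) and the sigma heuristic are illustrative assumptions, not part of the published framework.

```python
import numpy as np

def gaussian_kernel(sigma):
    """1-D Gaussian kernel, normalised to sum to 1."""
    radius = max(1, int(3 * sigma))
    x = np.arange(-radius, radius + 1)
    k = np.exp(-(x ** 2) / (2 * sigma ** 2))
    return k / k.sum()

def acuity_limited(image, pixels_per_degree, acuity_cpd):
    """Blur a greyscale image (2-D array) so spatial detail finer than
    the viewer's acuity is removed.  acuity_cpd is acuity in cycles per
    degree; the finest resolvable cycle spans
    pixels_per_degree / acuity_cpd pixels, and half of that is used as
    the Gaussian sigma -- a crude but common approximation."""
    sigma = pixels_per_degree / acuity_cpd / 2.0
    k = gaussian_kernel(sigma)
    # Separable convolution: blur rows, then columns.
    rows = np.apply_along_axis(np.convolve, 1, image, k, mode="same")
    return np.apply_along_axis(np.convolve, 0, rows, k, mode="same")

# A 1-pixel checkerboard is far finer than a low-acuity viewer can
# resolve, so the simulated view collapses toward uniform grey.
fine_pattern = (np.indices((64, 64)).sum(axis=0) % 2).astype(float)
animal_view = acuity_limited(fine_pattern, pixels_per_degree=8, acuity_cpd=2)
```

In practice, a honeybee's acuity is far lower than a human's, which is why fine texture visible in the left-hand image above dissolves in the bee's view on the right.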
“What a lot of this framework is about is spatial chromatic color pattern analysis, which is the ability to contextualize and pay tribute to the fact that in any visual system, the eyes ultimately feed into neural processing that doesn’t stop at the retina,” van den Berg explained.
At the moment, the software is primarily designed to help biologists and ecologists better understand how animals see each other and their environments. But the team hopes that people of all backgrounds will experiment with QCPA in new ways.
To try out the program, visit the team’s site, Empirical Imaging, and install the latest toolbox, available under the “Download” tab. The site includes a user guide and a forum to help beginners get oriented with the software.
“You don’t need to know how to code” to use the framework, van den Berg said. “Even if you don’t necessarily have a scientific background, the information is there and you can sit down and read into it and essentially get the hang of it and start using the software.”