What if cows could talk?

By using acoustic data and machine learning to decipher cows’ vocalizations, Virginia Tech researchers hope to shed new light on the animals’ health, welfare and environmental impact.

James Chen, an animal data sciences researcher and assistant professor in the School of Animal Sciences, is using a $650,000 grant from the U.S. Department of Agriculture’s National Institute of Food and Agriculture to develop an acoustic, data-driven tool to help enhance animal welfare and lower methane emissions in precision livestock farming.

“Vocalization is a major way cows express their emotions, and it is about time to listen to what they’re telling us,” Chen said.

Because sound data can be collected from cows individually and continuously, Chen said it’s better than video or other observation methods for monitoring cows’ emotions and health, including even subtle changes in breathing.

“The assessment of animal welfare has become a central discussion in society and is a controversial issue simply because the lack of objective tools leads to biased interpretations,” he said. “By matching audio data with biological and visual cues, we can be more objective in our approach to analyzing their behaviour.”

Chen and his co-investigator, Virginia Cooperative Extension dairy scientist and associate professor Gonzalo Ferreira, plan to collect audio data from cows, their calves and beef cattle in the pasture. They will then use machine learning to analyze and catalogue thousands of points of acoustic data and interpret cow vocalizations such as mooing, chewing and burping for signs of stress or illness.

“Let’s think about a baby crying inside a plane or in church,” Ferreira said. “As a father, I have an idea whether the baby is crying because it’s hungry or wants attention. Our research question then is: Can we use audio data to interpret animals’ needs?”

Chen and Ferreira are particularly interested in identifying vocal patterns for how cows communicate distress. By analyzing the frequency, amplitude and duration of cows’ moos and other vocalizations and correlating the sound data with saliva cortisol samples, they can classify whether cows are experiencing no stress, mild stress or severe stress and begin to decode their “language.”
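
The researchers have not published their code, but in broad strokes the classification step could look something like the sketch below, where the feature set (pitch, loudness and duration per clip), the placeholder file names and the random-forest model are illustrative assumptions rather than the team’s actual pipeline:

```python
# Illustrative sketch only: extract simple acoustic features from labelled
# vocalization clips and fit a classifier that maps them to stress categories.
# File names, labels, feature choices and the model are assumptions, not the
# Virginia Tech team's actual pipeline.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier

def acoustic_features(wav_path):
    """Return [mean pitch (Hz), mean loudness (RMS), duration (s)] for one clip."""
    y, sr = librosa.load(wav_path, sr=None)        # load audio at its native rate
    f0 = librosa.yin(y, fmin=50, fmax=500, sr=sr)  # fundamental-frequency track
    rms = librosa.feature.rms(y=y)                 # frame-level amplitude
    return [np.nanmean(f0), float(rms.mean()), librosa.get_duration(y=y, sr=sr)]

# Placeholder training data: one clip per row, with a stress label derived from
# a matched saliva cortisol sample (0 = no stress, 1 = mild, 2 = severe).
clips = ["moo_001.wav", "moo_002.wav", "moo_003.wav"]
cortisol_labels = [0, 1, 2]

X = np.array([acoustic_features(path) for path in clips])
model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X, np.array(cortisol_labels))

# Classify a new vocalization recorded in the pasture.
new_clip = np.array([acoustic_features("new_moo.wav")])
print("predicted stress level:", model.predict(new_clip)[0])
```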

As part of the project, Chen is building a computational pipeline that integrates acoustic data management, pre-trained machine-learning models and interactive visualization of animal sounds. The resulting data will be shared in an open-source, web-based application available to scientists, producers, and the public. Chen said his hope is that the information will help guide future protocols to improve animal welfare.

“Anyone can directly plug in and use our model to run their own experiment,” he said. “This allows people to transform cows’ vocalizations into interpretable information that humans can recognize.”
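
No such application has been released yet, so the following is a purely hypothetical sketch of what “plugging in” a shared, pre-trained model might look like for someone with their own recordings; the model file name, feature extraction and label scheme are all assumptions:

```python
# Purely hypothetical usage sketch: applying a shared, pre-trained stress
# classifier to a new recording. The model file, feature extraction and label
# names are assumptions; no such package has been published.
import numpy as np
import librosa
import joblib

LABELS = {0: "no stress", 1: "mild stress", 2: "severe stress"}  # assumed label scheme

def acoustic_features(wav_path):
    """Same illustrative features as above: mean pitch, mean loudness, duration."""
    y, sr = librosa.load(wav_path, sr=None)
    f0 = librosa.yin(y, fmin=50, fmax=500, sr=sr)
    rms = librosa.feature.rms(y=y)
    return [np.nanmean(f0), float(rms.mean()), librosa.get_duration(y=y, sr=sr)]

model = joblib.load("cow_stress_model.joblib")               # hypothetical published model
features = np.array([acoustic_features("my_recording.wav")])
print(LABELS[int(model.predict(features)[0])])               # e.g. "mild stress"
```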

Researchers plan to place small recording devices on the halters or collars of cows to capture their vocalizations for the study.

Because cows’ burps can release small amounts of methane, the researchers also will try to identify cows that burp less through audio data. By comparing the sound data to DNA samples from the cows, they hope to understand whether a genetic variant causes some cows to burp more than others.

They also plan to examine the effects of rumen modifiers, feed additives that inhibit methane production.

“Measuring methane emissions from cattle requires very expensive equipment, which would be prohibitive to farmers,” Ferreira said. “If burping sounds are indeed related to methane emissions, then we might have the potential for selecting low methane-emitting animals at the commercial farm level in an affordable manner.”

“Our eventual goal is to use this model on a larger scale,” Chen said. “We hope to build a public dataset that can help inform policy and regulations.”

Source: Farmtario.com
