Researchers in Japan have developed AI technology that can interpret chicken sounds with an 80% accuracy rate, potentially revolutionizing our understanding of and communication with these birds. Using AI, scientists can now discern six distinct emotional states in chickens based on their vocalizations: hunger, fear, anger, contentment, enthusiasm, and distress.
The research team, led by Professor Adrian David Cheok of the University of Tokyo, published a proof-of-concept study on Research Square; it has not yet undergone peer review but has been submitted to Nature Scientific Reports.
This breakthrough opens up possibilities for better understanding and improving animal welfare and sets the stage for further exploration in AI-driven interspecies communication. A team consisting of eight animal psychologists and veterinarians collaborated with Professor Cheok to provide insights into the emotional states of chickens. They analyzed audio recordings from a sample of 80 birds over 200 hours, and the AI was trained to recognize and label each sound with the corresponding emotional state.
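The paper's model and code are not described here, but the pipeline the article outlines, labeled audio clips used to train a classifier over six emotional states, can be sketched in a simplified form. The feature extraction and nearest-centroid classifier below are illustrative assumptions, not the team's actual method; all function names and parameters are hypothetical.

```python
import numpy as np

# The six emotional states reported in the study.
EMOTIONS = ["hunger", "fear", "anger", "contentment", "enthusiasm", "distress"]

def spectral_features(wave: np.ndarray) -> np.ndarray:
    """Crude feature vector: normalized energy in 8 coarse frequency bands.
    (A real system would use richer features, e.g. MFCCs.)"""
    spectrum = np.abs(np.fft.rfft(wave))
    spectrum /= spectrum.sum() + 1e-9  # normalize so loudness doesn't dominate
    bands = np.array_split(spectrum, 8)
    return np.array([band.sum() for band in bands])

def train_centroids(clips, labels):
    """Average the feature vectors of all clips sharing a label."""
    feats = np.array([spectral_features(c) for c in clips])
    labels = np.array(labels)
    return {lab: feats[labels == lab].mean(axis=0) for lab in set(labels)}

def classify(wave, centroids):
    """Assign the label whose centroid is nearest in feature space."""
    f = spectral_features(wave)
    return min(centroids, key=lambda lab: np.linalg.norm(f - centroids[lab]))
```

In this toy version, each labeled recording is reduced to a spectral fingerprint, and a new call is matched to the closest average fingerprint; the study's "deep emotional analysis" would replace this with a learned model trained on the 200 hours of annotated audio.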
Because chickens are highly social animals, the researchers intend to create a free app that allows farmers to communicate with their birds. This technology also has the potential to enhance veterinary medicine, improve conditions in poultry farming, and facilitate human-animal interactions.
The team’s goal is to expand this AI and machine learning technique to other animals, laying the foundation for enhanced understanding and communication in various animal-related industries. This development holds the promise of creating a better world for animals by enabling us to comprehend their feelings and needs.