Abstract
Recognition of emotions, or affective states, in farm animals is an underexplored research domain. Despite significant advances in animal welfare research, animal affective computing, the development and application of devices and platforms that can not only recognize but also interpret and process emotions, is at a nascent stage. By capitalizing on the immense potential of biometric sensors, artificial-intelligence-enabled big data methods offer substantial advances in animal welfare standards and meet the urgent need of caretakers to respond effectively to maintain the wellbeing of their animals. Farm animals, numbering over 70 billion worldwide, are increasingly managed in large-scale, intensive farms. With both public awareness and scientific evidence growing that farm animals experience suffering, as well as affective states such as fear, frustration and distress, there is an urgent need to develop efficient and accurate methods for monitoring their welfare. At present, there are no scientifically validated ‘benchmarks’ for quantifying transient emotional (affective) states in farm animals, and no established measures of good welfare, only indicators of poor welfare such as injury, pain and fear. Conventional approaches to monitoring livestock welfare are time-consuming, interrupt farming processes and involve subjective judgments. Biometric sensor data enabled by artificial intelligence are an emerging smart solution for unobtrusively monitoring livestock, but their potential for quantifying affective states, and the groundbreaking solutions their application could enable, are yet to be realized. This review presents innovative methods for collecting big data on farm animal emotions, which can be used to train artificial intelligence models to classify, quantify and predict affective states in individual pigs and cows. Extending this to the group level, social network analysis can be applied to model emotional dynamics and contagion among animals. Finally, ‘digital twins’ of animals, capable of simulating and predicting their affective states and behavior in real time, are a near-term possibility.