PhD Position F/M: AI-based automated detection and behavior analysis in piglets

Contract type: Fixed-term contract

Level of qualifications required: Graduate degree or equivalent

Function: PhD Position

Level of experience: Recently graduated

About the research centre or Inria department

The Inria Centre at Rennes University is one of Inria's eight centres and has more than thirty research teams. The centre is a major and recognized player in the field of digital sciences. It is at the heart of a rich R&D and innovation ecosystem: highly innovative SMEs, large industrial groups, competitiveness clusters, research and higher education players, laboratories of excellence, a technological research institute, etc.

Context

The proposed subject is part of the PEPR project WAIT4 on animal welfare, whose goal is to use Artificial Intelligence and new technologies to track behavioral and physiological indicators of welfare in farm animals facing the challenges of the agro-ecological transition. Animal welfare is a key factor to be optimized in agro-ecological livestock production. The subject focuses on Automated Behavior Analysis (ABA, also called computational ethology), i.e. the use of technology to detect and observe the behavior of animals in ways that require minimal human labor. It results from a collaboration between Inria, for the computer science aspects, and the INRAE Brittany-Normandy centre, a major player in the development of agricultural and agrifood systems in the Grand Ouest region, which will provide data and expertise in animal behavior analysis.

Assignment

Context

Behavior monitoring is crucial for assessing the welfare and emotional states of farm animals such as pigs and cattle. In particular, social behaviors, which are strongly influenced by husbandry practices, are considered a major determinant of animal welfare on farms. For example, in pigs, social mixing at weaning is often associated with increased levels of aggression, which can induce stress and lead to severe injuries. Conversely, social play occurs mainly when optimal housing and social conditions are met, and is thus considered a behavioral indicator of positive affective states in juvenile pigs.

In line with the development of precision livestock farming, modern farms are often equipped with video cameras that allow continuous monitoring of animal behavior. However, manual observation of animal behavior in video recordings, and of social behavior in particular, is time-consuming and labor-intensive, and thus difficult or impossible to apply on farms. Consequently, in recent years, several automatic computer-vision systems have been developed to detect a wide range of behaviors, including body postures, locomotion, and social behaviors, in farm animals such as pigs and cattle (see [10] for a review).

The primary objective of animal behavior analysis is to identify every instance of a specific behavior within a video and determine its precise spatial and temporal localization. Over the last few years, there has been a surge in deep learning approaches for behavioral analysis, including segmentation, identification, and pose estimation [2]. Multiple Object Tracking (MOT) techniques are also used, enabling individual animals to be followed and analyzed within a group. When applied to animal behavior analysis, however, these methods face several challenges: behaviors and movements within a group are often complex and dynamic, making it difficult to identify and track individual animals, and interactions and group dynamics further complicate the tracking process. This complexity can result in incomplete or inaccurate tracking data.

Despite the progress of these computer vision techniques, direct recognition of the behavior itself has been addressed more rarely. Classification of animal behavior has lagged behind that of human actions: animal behavior tends to be harder to infer because human actions are more visually distinctive, and most benchmarks in the human domain focus on classifying short clips rather than the long-running recordings typical of animal observation. Another limiting factor is the small number of publicly available animal observation datasets and the high cost of obtaining large labeled datasets.


Mission

The PhD objective is to measure social interactions between piglets before and after weaning, distinguishing fights from positive social contacts such as social play, in order to assess the piglets' emotional states from video recordings provided by INRAE. The proposed methods will draw on computer vision and machine learning. This requires solving a number of problems.

* Behavior characterization. In most current computational tools for behavioral quantification, identifying a given behavior relies largely on tracking a few “high-level” properties, such as the positions of body parts in space (poses) and their speed of motion. Data-rich videos and complex movements are thus reduced to skeletons and relative positions of body parts. While tracking these high-level properties is useful, relying on them to identify a given behavior has its limitations: a small set of such properties does not necessarily carry the information needed to capture the complexity of animal behavior, and is likely to discard much of it, making behavior identification less precise. The aim is therefore to study which properties can be extracted or learned to define and discriminate a behavior (a minimal sketch of such pose-derived features is given after this list).

* Modelling interaction. The behaviors studied in this thesis are not those of a single animal but result from interactions between several animals, which adds complexity. In addition, multiple animals may perform different behaviors at the same time. When modeling social dynamics, observations therefore necessarily involve several individuals. The method developed should automatically classify the behavior of individual piglets, not just the behavior of the group as a whole as in most existing work. The key problems are to identify individuals, capture the temporal structure of each individual's behavior, and model its interactions and interdependence with its pen mates.

* Data scarcity. The scarcity of labeled data poses a significant challenge: not only are training data scarce, but behavior events are short-lived and occur sporadically. Most likely, the approach will build on models developed in other contexts (involving humans or other animals) and propose effective methods for knowledge transfer and few-shot learning (see the transfer-learning sketch after this list).

* Multimodality. An original aspect of the proposed approach will be to take into account the audio information available in the video recordings. The aim is to study whether vocalizations can be associated with behaviors and whether acoustic characteristics can assist the visual analysis. The difficulty is that sound is recorded globally, making it hard to attribute to a particular animal or activity when several behaviors take place simultaneously. This also requires the development of a multi-modal architecture (see the fusion sketch after this list).
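
To make the first problem concrete, the following minimal sketch shows the kind of pose-derived “high-level” features (per-animal speed, inter-animal distances) that current tools rely on, and whose information loss the thesis will question. The array layout and frame rate are illustrative assumptions, not project specifications.

    import numpy as np

    # Illustrative only: assumes pose tracks of shape
    # (n_frames, n_animals, n_keypoints, 2), as produced by a
    # multi-animal pose estimator in the spirit of [2].
    def high_level_features(tracks, fps=25):
        centroids = tracks.mean(axis=2)          # (T, A, 2) body centers
        # per-animal speed in pixels per second
        speed = np.linalg.norm(np.diff(centroids, axis=0), axis=-1) * fps
        # pairwise inter-animal distances, a common proxy for social contact
        diff = centroids[:, :, None, :] - centroids[:, None, :, :]
        dist = np.linalg.norm(diff, axis=-1)     # (T, A, A)
        return speed, dist

Summaries like these discard most of the pixel information, which is precisely the loss of precision that the first objective proposes to study.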
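For the data-scarcity problem, one plausible (assumed, not prescribed) starting point is to transfer an action-recognition backbone pretrained on human video data and retrain only its classification head on the few labeled piglet clips; the three behavior classes below are an example, not the project's label set.

    import torch
    import torch.nn as nn
    from torchvision.models.video import r3d_18, R3D_18_Weights

    # Backbone pretrained on Kinetics-400 (human actions), transferred
    # to a small set of piglet behavior classes, e.g. play/fight/other.
    model = r3d_18(weights=R3D_18_Weights.KINETICS400_V1)
    for p in model.parameters():
        p.requires_grad = False                    # freeze pretrained features
    model.fc = nn.Linear(model.fc.in_features, 3)  # new trainable head

    clips = torch.randn(4, 3, 16, 112, 112)        # batch of 16-frame RGB clips
    logits = model(clips)                          # (4, 3) behavior scores

Freezing the backbone keeps the number of trainable parameters small, which matters when labeled events are rare and short-lived.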
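For the multimodal track, a minimal late-fusion sketch: video and pen-level audio are embedded separately and concatenated before classification. The embedding sizes and the log-mel spectrogram front end are assumptions chosen for illustration.

    import torch
    import torch.nn as nn

    class LateFusionClassifier(nn.Module):
        # Audio is recorded globally and cannot be attributed to one
        # animal, so its embedding is shared by all individuals in view.
        def __init__(self, video_dim=512, audio_dim=128, n_classes=3):
            super().__init__()
            self.audio_net = nn.Sequential(   # operates on log-mel spectrograms
                nn.Conv2d(1, 16, 3, stride=2), nn.ReLU(),
                nn.AdaptiveAvgPool2d(1), nn.Flatten(),
                nn.Linear(16, audio_dim),
            )
            self.head = nn.Linear(video_dim + audio_dim, n_classes)

        def forward(self, video_emb, spectrogram):
            audio_emb = self.audio_net(spectrogram)            # (B, audio_dim)
            return self.head(torch.cat([video_emb, audio_emb], dim=1))

    model = LateFusionClassifier()
    scores = model(torch.randn(2, 512), torch.randn(2, 1, 64, 128))  # (2, 3)

Late fusion is only one option; attributing sounds to individuals when several behaviors co-occur will likely require finer-grained audio-visual alignment.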

References

  1. Marks M., Jin Q., Sturman O., von Ziegler L., Kollmorgen S., von der Behrens W., Mante V., Bohacek J., Yanik M.F. Deep-learning-based identification, tracking, pose estimation and behaviour classification of interacting primates and mice in complex environments. Nature Machine Intelligence, 2022.
  2. Lauer J. et al. Multi-animal pose estimation and tracking with DeepLabCut. bioRxiv, 2021.
  3. Larsen M.L.V., Wang M., Willems S., Liu D., Norton T. Automatic detection of locomotor play in young pigs: A proof of concept. Biosystems Engineering, 229:154-166, 2023.
  4. Siegford J.M., Steibel J.P., Han J., Benjamin M., Brown-Brandl T., Dórea J.R.R., Morris D., Norton T., Psota E., Rosa G.J.M. The quest to develop automated systems for monitoring animal behavior. Applied Animal Behaviour Science, 265, 2023.
  5. Li J., Green-Miller A.R., Hu X., Lucic A., Mohan M.M., Dilger R.N., Condotta I.C., Aldridge B., Hart J.M., Ahuja N. Barriers to computer vision applications in pig production facilities. Computers and Electronics in Agriculture, 2022.
  6. Tu S. et al. Automated behavior recognition and tracking of group-housed pigs with an improved DeepSORT method. Agriculture, 12:1907, 2022.
  7. Gan H., Ou M., Huang E., Xu C., Li S., Li J., Liu K., Xue Y. Automated detection and analysis of social behaviors among preweaning piglets using key point-based spatial and temporal features. Computers and Electronics in Agriculture, 188:106357, 2021.
  8. Livestock monitoring with transformer. BMVC, 2021.
  9. Zhang K., Li D., Huang J., Chen Y. Automated video behavior recognition of pigs using two-stream convolutional networks. Sensors, 20, 2020.
  10. Chen C., Zhu W., Norton T. Behaviour recognition of pigs and cattle: Journey from computer vision to deep learning. Computers and Electronics in Agriculture, 187, 2021.
  11. Van der Zande L.E., Guzhva O., Rodenburg T.B. Individual detection and tracking of group housed pigs in their home pen using computer vision. Frontiers in Animal Science, 2, 2021.
  12. Ren K., Bernes G., Hetta M., Karlsson J. Tracking and analysing social interactions in dairy cattle with real-time locating system and machine learning. Journal of Systems Architecture, 116, 2021.


Main activities

  • Bibliographic study of existing methods for automated behavior analysis based on visual data
  • Analyze and process video recordings of piglets to extract relevant information for an interaction model
  • Develop an AI-based method for piglet behavior classification from raw video frames
  • Extend the approach to a multi-modal setting that integrates audio information
  • Write scientific publications and present the work to the relevant scientific communities


Skills

  • Master's degree in Computer Science, with proficiency in Python and its deep learning libraries
  • General background in computer vision and machine learning
  • Understanding of deep learning methodologies and techniques
  • Proficiency in data handling, particularly video processing
  • Good communication skills in oral and written English

Benefits package

  • Subsidized meals
  • Partial reimbursement of public transport costs
  • Possibility of teleworking (90 days per year) and flexible organization of working hours
  • Partial payment of insurance costs

Remuneration

Monthly gross salary: 2,100 euros for the first and second years and 2,190 euros for the third year.