Machine Learning Methods and Visual Observations to Categorize Behavior of Grazing Cattle Using Accelerometer Signals
Authors: Ira L. Parsons, Brandi B. Karisch, Amanda E. Stone, Stephen L. Webb, Durham A. Norman, Garrett M. Street
Accelerometers worn by animals produce distinct behavioral signatures, which can be classified accurately using machine learning methods such as random forest decision trees. The objective of this study was to determine how well accelerometer signals separate a parsimonious set of behaviors. We achieved this objective by (1) describing functional differences in accelerometer signals among discrete behaviors, (2) identifying the optimal window size for signal pre-processing, and (3) demonstrating the number of observations required to achieve the desired level of model accuracy. Crossbred steers (Bos taurus indicus; n = 10) were fitted with GPS collars containing a video camera and tri-axial accelerometers (read-rate = 40 Hz). Behaviors produced distinct accelerometer signatures; grazing was especially distinguishable because of the head-down posture. Increasing the smoothing window size to 10 s improved classification accuracy (p < 0.05), but reducing the number of observations below 50% decreased accuracy for all behaviors (p < 0.05). In-pasture visual observation increased accuracy and precision (by 0.05 and 0.08 percent, respectively) compared with observations from the animal-borne collar video.
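As a rough illustration of the kind of workflow the abstract describes (not the authors' published pipeline), the sketch below summarizes tri-axial accelerometer data over fixed windows and classifies each window with a random forest. The window_features helper, the specific summary statistics, the behavior codes, and the synthetic data are all assumptions made for illustration; only the 40 Hz read-rate and the 10 s window size come from the abstract.

```python
# Minimal sketch: windowed feature extraction + random forest classification
# of tri-axial accelerometer data. Illustrative only; names and features are
# assumptions, not the study's actual implementation.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

RATE_HZ = 40                     # accelerometer read-rate reported in the abstract
WINDOW_S = 10                    # 10 s smoothing window, as found beneficial in the study
SAMPLES_PER_WINDOW = RATE_HZ * WINDOW_S

def window_features(xyz, labels):
    """Summarize each non-overlapping window of tri-axial data (n x 3 array)
    with simple statistics and assign the window its majority behavior label."""
    feats, ys = [], []
    for start in range(0, len(xyz) - SAMPLES_PER_WINDOW + 1, SAMPLES_PER_WINDOW):
        w = xyz[start:start + SAMPLES_PER_WINDOW]
        y = labels[start:start + SAMPLES_PER_WINDOW]
        feats.append(np.concatenate([w.mean(axis=0), w.std(axis=0),
                                     w.min(axis=0), w.max(axis=0)]))
        ys.append(np.bincount(y).argmax())
    return np.array(feats), np.array(ys)

# Synthetic stand-in for labeled data (hypothetical codes: 0 = graze, 1 = lie, 2 = stand)
rng = np.random.default_rng(0)
n = RATE_HZ * 60 * 30                          # 30 min of 40 Hz samples
xyz = rng.normal(size=(n, 3))
labels = rng.integers(0, 3, size=n)

X, y = window_features(xyz, labels)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
clf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_train, y_train)
print(f"window-level accuracy: {accuracy_score(y_test, clf.predict(X_test)):.2f}")
```

With real collar data, the synthetic arrays would be replaced by time-aligned accelerometer samples and hand-scored behavior labels, and window size or feature choices could be varied to reproduce the comparisons reported above.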
Suggested Citation
Parsons, I.L., B.B. Karisch, A.E. Stone, S.L. Webb, D.A. Norman, and G.M. Street. 2024. Machine learning methods and visual observations to categorize behavior of grazing cattle using accelerometer signals. Sensors 24:3171.