
Validation of an infrared camera for measuring ocular temperatures of veal calves.

H. Goetz


06-24-2020

Abstract:

290
Validation of an infrared camera for measuring ocular temperatures of veal calves.
H. Goetz*1, D. Kelton1, J. Costa2, C. Winder1, D. Renaud1. 1Department of Population Medicine, University of Guelph, Guelph, ON, Canada, 2Department of Animal and Food Sciences, University of Kentucky, Lexington, KY.

Temperature measurement is a key part of the clinical exam process; however, the standard method of monitoring core body temperature with rectal thermometry is subject to error and can be laborious and disruptive to animal behavior. Infrared thermography (IRT) is a plausible alternative to rectal temperatures, providing a non-invasive method to assess calf health. The objective of this prospective cohort study was to validate IRT for measuring core body temperature. A total of 320 calves were enrolled upon arrival at a veal facility in southwestern Ontario, Canada. Calves were followed for 14 d between May and August 2019. Researchers visited the farm daily to measure ocular infrared (IR) temperature and rectal temperature (RT) and to evaluate navel, attitude, fecal, and respiratory scores. The IR camera was placed at a distance of 12 inches from the calf's eye to ensure consistent measurement. Treatment and mortality records were also collected throughout the 78 d the calves were at the facility. The mean difference between IR temperature and RT was 0.30°C ± 1.50°C. Youden's Index was used to determine the optimal cutpoint that would maximize the sensitivity and specificity of the IR camera for detecting a fever, defined as a RT of ≥39.5°C. The optimal cutpoint was 39.45°C, at which the sensitivity and specificity of detecting a fever using IRT were 60% (95% confidence interval (CI): 53, 67) and 71% (95% CI: 70, 73), respectively. The area under the receiver operating characteristic (ROC) curve was 0.66 (95% CI: 0.62, 0.69). A random number generator was used to select a day between 1 and 14, and a simple linear regression model was built to assess the ability of the IR camera to predict RT on d 10. The R² of this model was 0.0122, suggesting that IR temperature alone was poorly correlated with RT. Further analysis is being conducted to explore external variables that influence the accuracy of the IR camera and to evaluate the predictive ability of IRT in assessing calf health.
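To illustrate the type of analysis described above (selection of an optimal cutpoint via Youden's Index, ROC area under the curve, and a simple linear regression of RT on IR temperature), the following Python sketch uses simulated, hypothetical data. It is not the study's code; the simulated values, variable names, and libraries (NumPy, scikit-learn, SciPy) are assumptions made for illustration only.

```python
# Minimal sketch (hypothetical data, not the authors' analysis):
# choose an IRT fever cutpoint with Youden's Index, report sensitivity,
# specificity, and ROC AUC, and fit a simple linear regression of RT on IR.
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical paired measurements: rectal temperature (RT) and ocular IR
# temperature. Offset and noise roughly echo the reported mean difference
# and spread, but are illustrative only.
rt = rng.normal(39.2, 0.5, 500)            # rectal temperatures, °C
ir = rt - 0.30 + rng.normal(0, 1.5, 500)   # ocular IR temperatures, °C

fever = (rt >= 39.5).astype(int)           # fever defined as RT >= 39.5°C

# ROC analysis: sweep IR cutpoints and pick the one that maximizes
# Youden's Index (sensitivity + specificity - 1).
fpr, tpr, thresholds = roc_curve(fever, ir)
youden = tpr - fpr
best = np.argmax(youden)
print(f"Optimal IR cutpoint: {thresholds[best]:.2f} °C")
print(f"Sensitivity: {tpr[best]:.2f}, Specificity: {1 - fpr[best]:.2f}")
print(f"AUC: {roc_auc_score(fever, ir):.2f}")

# Simple linear regression of RT on IR temperature; R² indicates how well
# IR temperature alone predicts RT.
slope, intercept, r, p, se = stats.linregress(ir, rt)
print(f"R-squared: {r**2:.4f}")
```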

Keywords: male dairy calf, morbidity.