Robust classification of animal tracking data
Computers and Electronics in Agriculture
In precision livestock farming, spotting cows in need of extra attention due to health or welfare issues is essential, since the time a farmer can devote to each animal is decreasing as herd sizes grow and efficiency demands increase. Symptoms of changes in health and welfare state often affect the behavior of the individual animal, e.g., changes in the time spent on activities such as standing, lying, eating, or walking. Low-cost, infrastructure-less GPS positioning sensors attached to the animals' collars make it possible to monitor the movements of cows and recognize cow activities. By preprocessing the raw cow position data, we obtain high classification rates using standard machine learning techniques to recognize cow activities. Our objectives were to (i) determine to what degree it is possible to robustly recognize cow activities from GPS positioning data using low-cost GPS receivers; and (ii) determine which types of activities can be classified, and what robustness to expect within the different classes. To provide data for this study, low-cost GPS receivers were mounted on 14 dairy cows on pasture for a day while they were observed from a distance and their activities manually logged to serve as ground truth. For our dataset, we obtained an average classification success rate of 86.2% across the four activities: eating/seeking (90.0%), walking (100%), lying (76.5%), and standing (75.8%), by optimizing both the preprocessing of the raw GPS data and the succeeding feature extraction.
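The abstract outlines a pipeline of preprocessing raw GPS fixes, extracting features, and classifying with standard machine learning. The Python below is a minimal sketch of what such a pipeline could look like: it derives simple movement features (mean speed, speed variance, net displacement) over fixed time windows and feeds them to an off-the-shelf classifier. The window length, the feature set, and the random forest are illustrative assumptions, not the paper's actual preprocessing or feature extraction.

```python
# Hypothetical sketch of a GPS-based activity classification pipeline.
# The features, window size, and classifier are assumptions for
# illustration; they are not the method described in the abstract.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between GPS fixes (elementwise)."""
    r = 6371000.0
    p1, p2 = np.radians(lat1), np.radians(lat2)
    dl = np.radians(lon2 - lon1)
    a = np.sin((p2 - p1) / 2) ** 2 + np.cos(p1) * np.cos(p2) * np.sin(dl / 2) ** 2
    return 2 * r * np.arcsin(np.sqrt(a))

def window_features(lat, lon, t, win=60):
    """Movement features per non-overlapping window of `win` fixes:
    mean speed, speed variance, and net displacement over the window."""
    step = haversine_m(lat[:-1], lon[:-1], lat[1:], lon[1:])
    speed = step / np.diff(t)  # m/s between consecutive fixes
    feats = []
    for i in range(0, len(speed) - win, win):
        s = speed[i:i + win]
        net = haversine_m(lat[i], lon[i], lat[i + win], lon[i + win])
        feats.append([s.mean(), s.var(), net])
    return np.asarray(feats)

# X: feature matrix from window_features; y: one activity label per window
# (eating/seeking, walking, lying, standing) taken from the observation log.
# clf = RandomForestClassifier(n_estimators=200, random_state=0)
# print(cross_val_score(clf, X, y, cv=5).mean())
```

Windowed speed statistics separate fast activities (walking) from slow ones, while net displacement helps distinguish directed movement from milling in place; stationary classes such as lying versus standing are harder to tell apart from position alone, which is consistent with the lower rates the abstract reports for those two classes.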