Mid Sweden University

Martinez Rau, Luciano (ORCID iD: orcid.org/0000-0002-2336-5390)
Publications (8 of 8)
Martinez Rau, L., Chelotti, J. O., Ferrero, M., Galli, J. R., Utsumi, S. A., Planisich, A. M., . . . Giovanini, L. L. (2025). A noise-robust acoustic method for recognizing foraging activities of grazing cattle. Computers and Electronics in Agriculture, 229, Article ID 109692.
2025 (English). In: Computers and Electronics in Agriculture, ISSN 0168-1699, E-ISSN 1872-7107, Vol. 229, article id 109692. Article in journal (Refereed), Published.
Abstract [en]

Farmers must continuously improve their livestock production systems to remain competitive in the growing dairy market. Precision livestock farming technologies provide individualized monitoring of animals on commercial farms, optimizing livestock production. Continuous acoustic monitoring is a widely accepted sensing technique used to estimate the daily rumination and grazing time budget of free-ranging cattle. However, typical environmental and natural noises on pastures noticeably affect performance, limiting the practical application of current acoustic methods. In this study, we present the operating principle and generalization capability of an acoustic method called Noise-Robust Foraging Activity Recognizer (NRFAR). The proposed method determines foraging activity bouts by analyzing fixed-length segments of identified jaw movement events produced during grazing and rumination. The additive noise robustness of the NRFAR was evaluated for several signal-to-noise ratios using stationary Gaussian white noise and four different nonstationary natural noise sources. In noiseless conditions, NRFAR reached an average balanced accuracy of 86.4%, outperforming two previous acoustic methods by more than 7.5%. Furthermore, NRFAR performed better than previous acoustic methods in 77 of 80 evaluated noisy scenarios (53 cases with p<0.05). NRFAR has been shown to be effective in harsh free-ranging environments and could be used as a reliable solution to improve pasture management and monitor the health and welfare of dairy cows. The instrumentation and computational algorithms presented in this publication are protected by a pending patent application: AR P20220100910. Web demo available at: https://sinc.unl.edu.ar/web-demo/nrfar.
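
As a grounding aid (not the authors' code), the minimal sketch below shows one common way to construct noisy evaluation scenarios of the kind described above: a noise recording is scaled and added to a clean signal so that a chosen signal-to-noise ratio is met. The signals and the SNR grid are placeholders, not values from the study.

```python
# Minimal sketch: mix additive noise into a clean recording at a target SNR.
# The arrays and SNR values are placeholders; this is not the NRFAR code.
import numpy as np

def mix_at_snr(clean: np.ndarray, noise: np.ndarray, snr_db: float) -> np.ndarray:
    """Scale `noise` so the clean-to-noise power ratio equals `snr_db`, then add it."""
    noise = np.resize(noise, clean.shape)          # repeat/trim noise to match the clean signal
    p_clean = np.mean(clean ** 2)
    p_noise = np.mean(noise ** 2)
    gain = np.sqrt(p_clean / (p_noise * 10 ** (snr_db / 10)))
    return clean + gain * noise

rng = np.random.default_rng(0)
clean = rng.standard_normal(16_000)                # stand-in for a clean audio segment
white = rng.standard_normal(16_000)                # stationary Gaussian white noise
noisy_versions = {snr: mix_at_snr(clean, white, snr) for snr in (0, 5, 10, 15, 20)}
```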

Place, publisher, year, edition, pages
Elsevier, 2025
Keywords
Acoustic monitoring, Foraging behavior, Machine learning, Noise robustness, Pattern recognition, Precision livestock farming
National Category
Animal and Dairy Science
Identifiers
urn:nbn:se:miun:diva-53307 (URN), 10.1016/j.compag.2024.109692 (DOI), 001378129300001 (ISI), 2-s2.0-85210727466 (Scopus ID)
Available from: 2024-12-10 Created: 2024-12-10 Last updated: 2025-01-07
Muthumala, U., Zhang, Y., Martinez Rau, L. & Bader, S. (2024). Comparison of Tiny Machine Learning Techniques for Embedded Acoustic Emission Analysis. In: 2024 IEEE 10th World Forum on Internet of Things (WF-IoT). Paper presented at 10th IEEE World Forum on Internet of Things, WF-IoT 2024, Ottawa, Canada, 10 November - 13 November, 2024. IEEE conference proceedings
2024 (English). In: 2024 IEEE 10th World Forum on Internet of Things (WF-IoT), IEEE conference proceedings, 2024. Conference paper, Published paper (Refereed).
Abstract [en]

This paper compares machine learning approaches with different input data formats for the classification of acoustic emission (AE) signals. AE analysis is a promising monitoring technique in many structural health monitoring applications. Machine learning has been demonstrated as an effective data analysis method, classifying different AE signals according to the damage mechanism they represent. These classifications can be performed based on the entire AE waveform or on specific features extracted from it. However, it is currently unknown which of these approaches is preferred. With the goal of model deployment on resource-constrained embedded Internet of Things (IoT) systems, this work evaluates and compares both approaches in terms of classification accuracy, memory requirement, processing time, and energy consumption. To accomplish this, features are extracted and carefully selected, neural network models are designed and optimized for each input data scenario, and the models are deployed on a low-power IoT node. The comparative analysis reveals that all models achieve high classification accuracies of over 99%, but that embedded feature extraction is computationally expensive. Consequently, models utilizing the raw AE signal as input have the fastest processing speed and thus the lowest energy consumption, which comes at the cost of a larger memory requirement.
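
As an illustration of the two input formats compared above, the sketch below computes a handful of common AE descriptors from a single hit and keeps the raw waveform alongside them. The specific features and sampling rate are assumptions, not necessarily those selected in the paper.

```python
# Sketch of the two candidate model inputs: hand-crafted AE features vs. the raw waveform.
# Feature set and sampling rate are illustrative assumptions.
import numpy as np

def ae_features(hit: np.ndarray, fs: float) -> np.ndarray:
    """Common AE hit descriptors: peak amplitude, RMS, energy, crest factor, threshold counts."""
    peak = np.max(np.abs(hit))
    rms = np.sqrt(np.mean(hit ** 2))
    energy = np.sum(hit ** 2) / fs
    crest = peak / rms
    counts = np.sum((hit[:-1] < 0.1 * peak) & (hit[1:] >= 0.1 * peak))  # rising threshold crossings
    return np.array([peak, rms, energy, crest, counts])

fs = 1e6                                              # assumed 1 MHz AE sampling rate
hit = np.random.default_rng(1).standard_normal(1024)  # placeholder AE hit
x_feat = ae_features(hit, fs)                         # input for the feature-based models
x_raw = hit                                           # input for the raw-waveform models
```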

Place, publisher, year, edition, pages
IEEE conference proceedings, 2024
Keywords
TinyML, acoustic emission, machine learning, structural health monitoring, feature extraction
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
urn:nbn:se:miun:diva-51320 (URN), 10.1109/WF-IoT62078.2024.10811219 (DOI), 979-8-3503-7301-1 (ISBN)
Conference
10th IEEE World Forum on Internet of Things, WF-IoT 2024, Ottawa, Canada, 10 November - 13 November, 2024
Available from: 2025-02-11 Created: 2024-05-13 Last updated: 2025-02-11. Bibliographically approved
Zhang, Y., Martinez Rau, L., Oelmann, B. & Bader, S. (2024). Enabling Autonomous Structural Inspections with Tiny Machine Learning on UAVs. In: 2024 IEEE Sensors Applications Symposium, SAS 2024 - Proceedings. Paper presented at 2024 IEEE Sensors Applications Symposium, SAS 2024 - Proceedings. IEEE conference proceedings
2024 (English). In: 2024 IEEE Sensors Applications Symposium, SAS 2024 - Proceedings, IEEE conference proceedings, 2024. Conference paper, Published paper (Refereed).
Abstract [en]

Visual structural inspections in Structural Health Monitoring (SHM) are an important method to ensure the safety and long lifetime of infrastructures. Unmanned Aerial Vehicles (UAVs) with Deep Learning (DL) have gained popularity for automating these inspections. Yet, the vast majority of research focuses on algorithmic innovations, neglecting the availability of reliable generalized DL models as well as the effect that the model's energy consumption has on the UAV flight time. This paper highlights the performance of 14 popular CNN models with fewer than six million parameters for crack detection in concrete structures. Seven of these models were successfully deployed to a low-power, resource-constrained microcontroller using Tiny Machine Learning (TinyML). Among the deployed models, MobileNetV1-x0.25 achieves the highest test accuracy (75.83%) and F1-score (0.76), the second-lowest flash memory usage (273.5 kB), the second-lowest RAM usage (317.1 kB), the fourth-fastest single-trial inference time (15.8 ms), and the fourth-lowest number of multiply-accumulate operations (MACC) (42,126,514). Lastly, a hypothetical study of the DJI Mini 4 Pro UAV demonstrated that the TinyML model's energy consumption has a negligible impact on the UAV flight time (34 minutes vs. 33.98 minutes). Consequently, this feasibility study paves the way for future developments towards more efficient, autonomous unmanned structural health inspections.
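
The negligible flight-time impact follows from simple energy bookkeeping; the sketch below reproduces the arithmetic with assumed power figures. The propulsion and inference power values are placeholders chosen only to illustrate the calculation; only the 34-minute baseline is taken from the abstract.

```python
# Back-of-the-envelope flight-time estimate with an added TinyML load.
# All power figures are hypothetical; only the 34-minute baseline comes from the abstract.
baseline_flight_min = 34.0      # nominal UAV flight time without inference
p_flight_w = 50.0               # assumed average propulsion/system power (W)
p_inference_w = 0.03            # assumed average TinyML inference power (W)

usable_energy_wh = p_flight_w * baseline_flight_min / 60.0
flight_with_ml_min = usable_energy_wh / (p_flight_w + p_inference_w) * 60.0
print(f"{flight_with_ml_min:.2f} min")   # ~33.98 min under these assumptions
```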

Place, publisher, year, edition, pages
IEEE conference proceedings, 2024
Keywords
convolutional neural networks, damage classification, embedded systems, structural health monitoring, tiny machine learning, unmanned aerial vehicles
National Category
Computer Sciences
Identifiers
urn:nbn:se:miun:diva-52591 (URN), 10.1109/SAS60918.2024.10636583 (DOI), 001304520300085 (ISI), 2-s2.0-85203704393 (Scopus ID), 9798350369250 (ISBN)
Conference
2024 IEEE Sensors Applications Symposium, SAS 2024 - Proceedings
Available from: 2024-09-24 Created: 2024-09-24 Last updated: 2024-11-25. Bibliographically approved
Chelotti, J. O., Martinez Rau, L., Ferrero, M., Vignolo, L. D., Galli, J. R., Planisich, A. M., . . . Giovanini, L. L. (2024). Livestock feeding behaviour: A review on automated systems for ruminant monitoring. Biosystems Engineering, 246, 150-177
2024 (English). In: Biosystems Engineering, ISSN 1537-5110, E-ISSN 1537-5129, Vol. 246, p. 150-177. Article, review/survey (Refereed), Published.
Abstract [en]

Livestock feeding behaviour is an influential research area in animal husbandry and agriculture. In recent years, there has been a growing interest in automated systems for monitoring the behaviour of ruminants. Current automated monitoring systems mainly use motion, acoustic, pressure and image sensors to collect and analyse patterns related to ingestive behaviour, foraging activities and daily intake. The performance evaluation of existing methods is a complex task, and direct comparisons between studies are difficult. Several factors prevent a direct comparison, starting with the diversity of data and performance metrics used in the experiments. This review of the analysis of the feeding behaviour of ruminants emphasises the relationship between sensing methodologies, signal processing, and computational intelligence methods. It assesses the main sensing methodologies and the main techniques to analyse the signals associated with feeding behaviour, evaluating their use in different settings and situations. It also highlights the potential of the valuable information provided by automated monitoring systems to expand knowledge in the field, positively impacting production systems and research. The paper closes by discussing future engineering challenges and opportunities in livestock feeding behaviour monitoring.

Place, publisher, year, edition, pages
Elsevier BV, 2024
Keywords
Feeding behaviour, Machine learning, Precision livestock farming, Review, Sensor data
National Category
Animal and Dairy Science
Identifiers
urn:nbn:se:miun:diva-52103 (URN), 10.1016/j.biosystemseng.2024.08.003 (DOI), 001294335800001 (ISI), 2-s2.0-85200629280 (Scopus ID)
Available from: 2024-08-13 Created: 2024-08-13 Last updated: 2024-08-30
Martinez Rau, L., Chelotti, J. O., Giovanini, L. L., Adin, V., Oelmann, B. & Bader, S. (2024). On-Device Feeding Behavior Analysis of Grazing Cattle. IEEE Transactions on Instrumentation and Measurement, 73, Article ID 2512113.
2024 (English). In: IEEE Transactions on Instrumentation and Measurement, ISSN 0018-9456, E-ISSN 1557-9662, Vol. 73, article id 2512113. Article in journal (Refereed), Published.
Abstract [en]

Precision livestock farming (PLF) leverages cutting-edge technologies and data-driven solutions to enhance the efficiency of livestock production, its associated management, and animal welfare. Continuous monitoring of the masticatory sound of cattle allows the estimation of dry-matter intake, classification of jaw movements (JMs), and recognition of grazing and rumination bouts. Over the past two decades, algorithms for analyzing feeding sounds have seen improvements in performance and computational requirements. Nevertheless, in some cases, these algorithms have been implemented on resource-constrained electronic devices, limiting their functionality to one specific task: either classifying JMs or recognizing feeding activities (such as grazing and rumination). In this work, we present an acoustic monitoring system that comprehensively analyzes grazing cattle's feeding behavior at multiple scales. This embedded system classifies different types of JMs, identifies feeding activities, and provides predictor variables for estimating dry-matter intake. Results are transmitted remotely to a base station using long-range communication (LoRa). Two variants of the system have been deployed on a Raspberry Pi Pico board, based on a low-power ARM Cortex-M0+ microcontroller. Both firmware versions make use of direct memory access (DMA), sleep mode, and clock-gating techniques to minimize energy consumption. In laboratory experiments, the first deployment consumes 20.1 mW and achieves an F1-score of 87.3% for the classification of JMs and 87.0% for feeding activities. The second deployment consumes 19.1 mW and reaches an F1-score of 84.1% for JMs and 83.5% for feeding activities. The modular design of the proposed embedded monitoring system facilitates integration with energy-harvesting power sources for autonomous operation in field conditions.
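
To make the reported output concrete, below is a minimal sketch (not the deployed firmware) of how per-window activity labels could be aggregated into the daily time budget a node would report over LoRa; the window length and the label stream are placeholders.

```python
# Sketch: aggregate per-window activity labels into a daily time budget (minutes).
# Window length and the label stream are illustrative placeholders.
from collections import Counter

WINDOW_S = 60                                        # assumed analysis window length in seconds
labels = ["grazing"] * 45 + ["rumination"] * 30 + ["other"] * 25   # placeholder classifier output

budget_min = {activity: n * WINDOW_S / 60 for activity, n in Counter(labels).items()}
print(budget_min)   # e.g. {'grazing': 45.0, 'rumination': 30.0, 'other': 25.0}
```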

Place, publisher, year, edition, pages
Institute of Electrical and Electronics Engineers (IEEE), 2024
Keywords
Monitoring, Cows, Animals, Acoustics, Microphones, Agriculture, Classification algorithms, Edge computing, embedded machine learning, feeding behavior, microcontroller, on-device processing, precision livestock farming (PLF)
National Category
Electrical Engineering, Electronic Engineering, Information Engineering
Identifiers
urn:nbn:se:miun:diva-52041 (URN), 10.1109/TIM.2024.3376013 (DOI), 001193312100043 (ISI), 2-s2.0-85188001389 (Scopus ID)
Available from: 2024-08-07 Created: 2024-08-07 Last updated: 2024-08-07. Bibliographically approved
Martinez Rau, L., Zhang, Y., Oelmann, B. & Bader, S. (2024). TinyML Anomaly Detection for Industrial Machines with Periodic Duty Cycles. In: 2024 IEEE Sensors Applications Symposium, SAS 2024 - Proceedings. Paper presented at 2024 IEEE Sensors Applications Symposium, SAS 2024 - Proceedings. IEEE conference proceedings
2024 (English). In: 2024 IEEE Sensors Applications Symposium, SAS 2024 - Proceedings, IEEE conference proceedings, 2024. Conference paper, Published paper (Refereed).
Abstract [en]

Electro-mechanical systems operating in periodic cycles are pivotal in Industry 4.0, enabling automated processes that enhance efficiency and productivity. Early detection of failures and anomalies in the duty cycles of these machines is crucial to ensure uninterrupted operation and prevent costly downtimes. Although the wear and damage of machines have been extensively studied, a significant proportion of these problems can be traced back to operator errors, underlining the importance of continuously monitoring machine activity to ensure optimal performance. This work presents an automatic algorithm designed to identify improper duty cycles of industrial machines, exemplified on a mining conveyor belt. To enable the identification of duty cycles, the operational states of the machine are first categorized using machine learning (ML). The study compares six tiny ML techniques on two resource-constrained microcontrollers, reporting an F1-score of 87.6% for identifying normal and abnormal duty cycles and 96.8% for the internal states of the conveyor belt system. Deployed on both low-power microcontrollers, the algorithm processes input data in less than 106 μs, consuming less than 1.16 μJ. These findings promise to facilitate integration into more comprehensive preventive maintenance algorithms.
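
A minimal sketch of the two-stage idea described above, assuming hypothetical state names and a hypothetical expected cycle: a classifier labels the machine state per window, and the resulting state sequence is then checked against the expected periodic pattern to flag abnormal duty cycles.

```python
# Sketch: validate a sequence of classified machine states against an expected duty cycle.
# State names and the expected pattern are hypothetical, not taken from the paper.
import re

SYMBOLS = {"idle": "I", "loading": "L", "running": "R"}
EXPECTED_CYCLE = re.compile(r"^(ILR)+I?$")   # assumed valid cycle: idle -> loading -> running

def collapse(states):
    """Map states to symbols and collapse consecutive repeats, e.g. idle,idle,loading -> 'IL'."""
    out = []
    for s in states:
        sym = SYMBOLS[s]
        if not out or out[-1] != sym:
            out.append(sym)
    return "".join(out)

predicted = ["idle", "loading", "running", "idle", "loading", "running", "idle"]
print("normal" if EXPECTED_CYCLE.match(collapse(predicted)) else "abnormal")
```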

Place, publisher, year, edition, pages
IEEE conference proceedings, 2024
Keywords
anomaly detection, conveyor belt, industry 4.0, low-power microcontroller, machine learning, maintenance, tinyML
National Category
Production Engineering, Human Work Science and Ergonomics
Identifiers
urn:nbn:se:miun:diva-52585 (URN), 10.1109/SAS60918.2024.10636584 (DOI), 001304520300086 (ISI), 2-s2.0-85203721689 (Scopus ID), 9798350369250 (ISBN)
Conference
2024 IEEE Sensors Applications Symposium, SAS 2024 - Proceedings
Available from: 2024-09-24 Created: 2024-09-24 Last updated: 2024-11-25. Bibliographically approved
Martinez Rau, L. S., Chelotti, J. O., Ferrero, M., Utsumi, S. A., Planisich, A. M., Vignolo, L. D., . . . Galli, J. R. (2023). Daylong acoustic recordings of grazing and rumination activities in dairy cows. Scientific Data, 10(1), Article ID 782.
2023 (English). In: Scientific Data, E-ISSN 2052-4463, Vol. 10, no 1, article id 782. Article in journal (Refereed), Published.
Abstract [en]

Monitoring livestock feeding behavior may help assess animal welfare and nutritional status, and optimize pasture management. The need for continuous and sustained monitoring requires the use of automatic techniques based on the acquisition and analysis of sensor data. This work describes an open dataset of acoustic recordings of the foraging behavior of dairy cows. The dataset includes 708 h of daily records obtained using unobtrusive and non-invasive instrumentation mounted on five lactating multiparous Holstein cows continuously monitored for six non-consecutive days in pasture and barn. Labeled recordings precisely delimiting grazing and rumination bouts are provided for a total of 392 h and for over 6,200 ingestive and rumination jaw movements. Companion information on the audio recording quality and the expert-generated labels is also provided to facilitate data interpretation and analysis. This comprehensive dataset is a useful resource for studies aimed at exploring new tools and solutions for precision livestock farming.
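
As a hypothetical usage example (the dataset's actual label-file schema may differ), the sketch below sums labeled bout durations per activity to recover the kind of time budget the labels support; the column names and values are placeholders.

```python
# Hypothetical sketch: sum labeled grazing/rumination bout durations per activity.
# Column names and values are placeholders, not the dataset's documented schema.
import pandas as pd

bouts = pd.DataFrame({
    "start_s": [0, 1800, 5400],
    "end_s": [1800, 5400, 7200],
    "activity": ["grazing", "rumination", "grazing"],
})
bouts["duration_h"] = (bouts["end_s"] - bouts["start_s"]) / 3600
print(bouts.groupby("activity")["duration_h"].sum())
```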

Place, publisher, year, edition, pages
Springer Nature, 2023
National Category
Animal and Dairy Science
Identifiers
urn:nbn:se:miun:diva-49848 (URN), 10.1038/s41597-023-02673-3 (DOI), 001102051000006 (ISI), 37938260 (PubMedID), 2-s2.0-85176018004 (Scopus ID)
Available from: 2023-11-14 Created: 2023-11-14 Last updated: 2023-12-15. Bibliographically approved
Martinez Rau, L., Adin, V., Giovanini, L. L., Oelmann, B. & Bader, S. (2023). Real-Time Acoustic Monitoring of Foraging Behavior of Grazing Cattle Using Low-Power Embedded Devices. In: 2023 IEEE Sensors Applications Symposium (SAS). Paper presented at 2023 IEEE Sensors Applications Symposium, SAS 2023. IEEE conference proceedings
2023 (English). In: 2023 IEEE Sensors Applications Symposium (SAS), IEEE conference proceedings, 2023. Conference paper, Published paper (Refereed).
Abstract [en]

Precision livestock farming allows farmers to optimize herd management while significantly reducing labor needs. Individualized monitoring of cattle feeding behavior offers valuable data to assess animal performance and provides insights into animal welfare. Current acoustic foraging activity recognizers achieve high recognition rates when running on computers. However, their implementation on portable embedded systems (for use on farms) needs further investigation. This work presents two embedded deployments of a state-of-the-art foraging activity recognizer on a low-power ARM Cortex-M0+ microcontroller. The parameters of the algorithm were optimized to reduce power consumption. The embedded algorithm processes masticatory sounds in real time and uses machine-learning techniques to identify grazing, rumination and other activities. The two embedded deployments achieve balanced accuracies of 84% and 89%, with mean power consumptions of 1.8 mW and 12.7 mW, respectively. These results will allow the deployment to be integrated into a self-powered acoustic sensor with wireless communication to operate autonomously on cattle.
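
The reported mean power figures translate directly into a daily energy budget for a self-powered node; the sketch below does this arithmetic for the 1.8 mW deployment, with the harvester parameters stated as assumptions for illustration only.

```python
# Energy-budget sketch for the 1.8 mW deployment reported above.
# The photovoltaic panel area and harvest density are assumptions for illustration.
P_MEAN_W = 1.8e-3                          # reported mean power of the lower-power deployment
daily_energy_j = P_MEAN_W * 24 * 3600      # ~155.5 J per day

panel_area_cm2 = 10.0                      # assumed small collar-mounted PV panel
harvest_w_per_cm2 = 2e-3                   # assumed average outdoor harvest density (W/cm^2)
hours_needed = daily_energy_j / (panel_area_cm2 * harvest_w_per_cm2) / 3600
print(f"{daily_energy_j:.1f} J/day, ~{hours_needed:.1f} h of harvesting per day")
```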

Place, publisher, year, edition, pages
IEEE conference proceedings, 2023
Keywords
embedded system, foraging behavior, low-power microcontroller, precision livestock farming, real-time acoustic processing
National Category
Embedded Systems
Identifiers
urn:nbn:se:miun:diva-49642 (URN), 10.1109/SAS58821.2023.10254175 (DOI), 001086399500093 (ISI), 2-s2.0-85174016474 (Scopus ID), 9798350323078 (ISBN)
Conference
2023 IEEE Sensors Applications Symposium, SAS 2023
Available from: 2023-10-25 Created: 2023-10-25 Last updated: 2023-11-10. Bibliographically approved
Identifiers
ORCID iD: orcid.org/0000-0002-2336-5390
