In recent decades, many technologies have been introduced into the livestock farming sector. One of the earliest was radio frequency identification (RFID), which is used for the electronic identification (EID) of animals [1]. RFID systems are composed of two parts: a transponder, or tag (ear tag, rumen bolus, or injectable glass tag), and a transceiver (portable or fixed) [2]. The use of EID is mandatory in Europe for sheep and goats (Reg. CE n. 21/2004), while it is voluntary for cattle. RFID tags can use two communication protocols, half-duplex (HDX) and full-duplex (FDX). As described in ISO 11785:1996 [3], these two technologies differ in the modulation of the response signal, return frequencies, encoding and bit rate of transmission. Although the response telegram structure differs between HDX and FDX systems, the structure of the unique animal code (64 bits) is the same and is regulated by ISO 11784:1996 [4]. Even though this technology is well established for identification, tags can store only a small amount of data (128 bits in FDX and 112 bits in HDX), corresponding to the unique identification code of the animal and providing no additional information.

Meanwhile, a large variety of data is collected on modern farms from different sensors thanks to the spread of precision livestock farming (PLF) technologies. The most common definition of PLF is "individual animal management by continuous real-time monitoring of health, welfare, production/reproduction, and environmental impact" [5]. However, PLF is not the only term used to describe this approach to livestock farming; among the others, "smart livestock farming" and "smart animal agriculture" are the most commonly used [6]. All these terms refer to the use of process engineering principles or technologies to manage livestock production through smart sensors, monitoring animal growth, production, diseases, behavior and components of the macroenvironment [7]. There are two types of sensors: fixed sensors, which can monitor environmental conditions or multiple animals simultaneously, and wearable sensors carried by the animal, such as boluses, tags and collars [8].

PLF technologies have made a large amount of data available; however, farmers often consider their consultation and interpretation a difficult and time-consuming task [9]. Each sensor stores data in a different database accessible through PCs or mobile devices, such as smartphones and tablets, although it is difficult to consult these databases directly on farms or during farm management tasks [10]. Even when the data can be consulted through mobile devices, the process implies a stop in normal farm management activities because the smartphone or tablet occupies the operator's hands.
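For illustration only, the following Kotlin sketch decodes a 64-bit ISO 11784 code under the common integer convention in which the 38-bit national identification code occupies the least significant bits, followed by the 10-bit country code, with the animal application indicator in the most significant bit; the class and function names are ours and are not part of the standard.

```kotlin
// Minimal sketch of decoding the 64-bit ISO 11784 animal code.
// Assumes the common reader convention: bits 0-37 national code,
// bits 38-47 country code, bit 63 animal application indicator.
data class Iso11784Code(
    val countryCode: Int,  // 10 bits, ISO 3166 numeric country code
    val nationalId: Long,  // 38 bits, unique national identification number
    val isAnimal: Boolean  // application indicator (1 = animal application)
)

fun decode(raw: ULong): Iso11784Code = Iso11784Code(
    countryCode = ((raw shr 38) and 0x3FFuL).toInt(),
    nationalId = (raw and 0x3F_FFFF_FFFFuL).toLong(),
    isAnimal = (raw shr 63) == 1uL
)

fun main() {
    // Hypothetical tag: country code 380 (Italy), national ID 123456789
    val raw = (1uL shl 63) or (380uL shl 38) or 123_456_789uL
    println(decode(raw)) // Iso11784Code(countryCode=380, nationalId=123456789, isAnimal=true)
}
```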
In this scenario, smart glasses for augmented reality (ARSG), if connected to EID, could allow the consultation of specific animal databases directly on-field, leaving the operator hands-free [11]. ARSGs are wearable head-up displays, connected to or integrating a miniaturized computer, that add virtual information to the user's reality. There are many types of ARSG, which can be grouped according to price, weight, powering system (internal or external battery), visualization system (video, optical or retinal), operating system (Android-based, Windows, etc.), interaction methods and resistance to bumps, dust and water [12]. Augmented reality (AR) consists of the visualization of digital information superimposed on the real environment, providing additional information to users while helping them solve tasks [13]. At present, AR is not widespread in the agricultural domain; it is more commonly used in manufacturing [14–16], industrial sectors [17, 18], medicine [19, 20], psychology [21, 22] and education [23–25]. Nevertheless, several studies have already shown that AR can be a useful technology in agricultural contexts [26, 27].

Azuma (1997) [28] defined the three basic characteristics of an AR system and expanded the definition in 2001. In recent years, however, with the advancement of technology and the diversification of devices that can implement AR, a new definition became necessary. Rauschnabel et al. (2022) [29] redefined AR as a hybrid experience consisting of context-specific virtual content that is merged into a user's real-time perception of the physical environment through computing devices. AR can be further refined according to the level of integration of the digital elements in the real world, which defines an AR spectrum ranging from assisted reality (low integration) to mixed reality (high integration). This more comprehensive definition includes a wider variety of technologies in the AR spectrum, as in the case of assisted reality, which consists of the visualization of head-stable content not connected to real-world objects and is commonly implemented in mobile devices, such as smartphones, and in smart glasses (SG). At the other end of the spectrum is mixed reality (MR), first described by Milgram and Kishino in 1994 [30] as "a subclass of virtual reality (VR) related technologies that involve the merging of the real and virtual worlds". Unlike assisted reality systems, MR increases the user's spatial and visual interaction possibilities [31]. MagicBook is considered one of the first examples of an MR system [32]: the book can be read and held like a normal one, but through a dedicated MR display a set of digital information and 3D models aligned with the physical book is shown. The device also allows the user to become immersed in the virtual scene, exploring the entire reality-virtuality continuum with one system. Currently, one of the most advanced MR devices is the Microsoft HoloLens (Microsoft, USA), which can show digital information and virtual objects in the form of holograms. These objects are aligned with the real world thanks to real-time environment scanning and can be manipulated with bare hands. MR is preferable over simpler AR solutions mainly when the user needs to manipulate and physically interact with virtual objects [33].
The aim of this study was to design and develop a framework that allows the connection of animal electronic identification to different types of ARSG and to evaluate the performance of the developed system in the laboratory and on-field. Moreover, a comparison of assisted and mixed reality systems was carried out to assess the most suitable solution in livestock farm environments.
Developed system framework
SmartGlove hardware
SmartGlove (SMGL) is currently a technology readiness level 3 (TRL-3) prototype that reads the unique animal code from RFID tags and sends it to the ARSG, which display all the information related to that specific animal. It is composed of an Arduino control unit with integrated Bluetooth and an RFID reader board connected to a 125 kHz antenna. All the components are enclosed in a 3D-printed plastic case worn as a bracelet, with the antenna extending over the back of the hand, attached to a glove (Fig. 1).
The SMGL is connected via Bluetooth to custom software for SG, which displays the following animal information related to the tag's identification code: the animal ID code, group (A or B), name, date of the last parturition, age (in years), milk production of the last day (kg), milk production of the last week (kg), number of parturitions and presence of mastitis (P for positive or N for negative). The SMGL can be connected to different types of ARSG. The first ARSG adopted in this study were the Epson Moverio BT-300 (BT300), an Android-based assisted reality device with an optical, binocular, see-through display. The second were the Microsoft HoloLens 2 (HL), a Microsoft Windows-based MR device with a holographic display. A complete list of the characteristics of both SG can be found in Table 1.
Table 1
Technical characteristics of the HoloLens 2 (HL) and Moverio BT-300 (BT300).
Characteristic | HL | BT300 |
Manufacturer | Microsoft | Epson |
Processor | Qualcomm Snapdragon 850 (8 core 2.95 GHz) | Intel Atom (4 core 1.44 GHz) |
Operating system | Windows Holographic | Moverio OS (Android 5.1.1) |
Display | See-through holographic lenses | Binocular Si-OLED |
RAM | 4 GB | 2 GB |
Weight | 566 g | 69 g |
Battery duration | 3 h | 4 h |
Controller input | Hand/voice recognition | Joypad (touchpad) |
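As a concrete sketch of the record displayed by the SG software, the following Kotlin data class mirrors the fields listed above; the names and types are our assumptions for illustration, not the exact implementation.

```kotlin
// Illustrative only: one row of the animal database as displayed on the SG.
// Field names and types are assumptions based on the fields listed above.
data class AnimalRecord(
    val id: String,              // unique RFID identification code (primary key)
    val group: Char,             // group, 'A' or 'B'
    val name: String,
    val lastParturition: String, // date of the last parturition
    val ageYears: Int,           // age in years
    val milkLastDayKg: Double,   // milk production of the last day (kg)
    val milkLastWeekKg: Double,  // milk production of the last week (kg)
    val parturitions: Int,       // number of parturitions
    val mastitis: Boolean        // presence of mastitis: true = "P", false = "N"
)
```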
Software for Smart Glasses
This section discusses the technical implementation of the two SG interfaces for visualizing and browsing the data collected through the SMGL. The overall solution was implemented both as a Kotlin application for the BT300 headset and as a Universal Windows Platform (UWP) application for the HL. The two applications share the same architecture, so we discuss them as a single solution, highlighting the relevant platform-dependent differences where they arise.
The software solution consists of three modules (Fig. 2): the glove hardware manager, the database and the headset application. The SMGL is controlled through an Adafruit board connected to an RFID antenna, which reads rumen boluses or ear tags to recognize the animal and sends the identifier to the headset through a Bluetooth connection. When the interface receives a new identifier, it displays all the information about the animal stored in the database.
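The headset-side flow can be summarized by the following Kotlin sketch; the names are illustrative (the BT300 Kotlin and HL UWP implementations differ in their UI code), and rows are kept as generic string lists because the database columns are user-defined, as described below.

```kotlin
// Illustrative flow of the headset application: a new identifier arrives
// from the glove over Bluetooth, is looked up in the local database copy,
// and the result (or a descriptive message) is rendered on the display.
class HeadsetApp(private val database: Map<String, List<String>>) {

    fun onIdentifierReceived(rfidCode: String) {
        val row = database[rfidCode]
        if (row != null) {
            showTable(row) // tabular view of all the animal's data
        } else {
            showMessage("The RFID code $rfidCode is not included in the database")
        }
    }

    private fun showTable(row: List<String>) { /* platform-specific UI */ }
    private fun showMessage(text: String) { /* platform-specific UI */ }
}
```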
In the current prototype, the database structure is straightforward. It consists of a Google spreadsheet that farmers can modify to suit their needs: they can add the information they would like to store about the animals by adding columns to the spreadsheet. The only assumption is that the RFID identifier is the first column of the spreadsheet, serving as the primary key. A copy of the shared spreadsheet can be loaded into the interface application as Comma-Separated Values (CSV) for offline work, and the changes are synchronized when the network is available.
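A minimal sketch of loading the offline CSV copy, assuming only that the RFID identifier is in the first column as described above (parsing is simplified; quoted fields are not handled):

```kotlin
import java.io.File

// Load the offline CSV copy of the spreadsheet into a lookup table keyed by
// the RFID identifier in the first column, which serves as the primary key.
fun loadDatabase(csvFile: File): Map<String, List<String>> =
    csvFile.readLines()
        .drop(1)                            // skip the header row
        .map { it.split(",") }              // simplified: no quoted-field handling
        .filter { it.first().isNotBlank() } // skip blank lines
        .associateBy { it.first().trim() }  // first column = RFID identifier
```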
Farmers can access the database in the field using the headset interface, which displays it as an overlay on their view of the real world through the headset's screen. At startup, the application pairs with the glove by showing an interface that reports the Bluetooth scanning results; the connection is established through the "Connect" button. Once the device is selected, the application establishes a connection and exchanges the required information using the GATT protocol for Bluetooth Low Energy devices, which preserves the glove battery. After that, the glove can be used to scan the bolus or the animal's ear tag. If the RFID code matches an entry in the database, the application retrieves all the related data and presents it in a tabular view; otherwise, it shows a descriptive message (e.g., that the RFID code is not included in the database). Farmers can customize the visualization by hiding the columns of the table they are not interested in: by pressing the "Parameters" button, the application shows a list of all the columns in the spreadsheet, and farmers can select or deselect them.
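On the Android-based BT300, receiving the identifier over BLE can be sketched with the standard Android BluetoothGattCallback; the service and characteristic UUIDs below are placeholders, not the ones used by the prototype.

```kotlin
import android.bluetooth.BluetoothGatt
import android.bluetooth.BluetoothGattCallback
import android.bluetooth.BluetoothGattCharacteristic
import android.bluetooth.BluetoothProfile
import java.util.UUID

// Placeholder UUIDs: the prototype's actual GATT service and characteristic differ.
val GLOVE_SERVICE: UUID = UUID.fromString("0000ffe0-0000-1000-8000-00805f9b34fb")
val ID_CHARACTERISTIC: UUID = UUID.fromString("0000ffe1-0000-1000-8000-00805f9b34fb")

class GloveGattCallback(private val onIdentifier: (String) -> Unit) : BluetoothGattCallback() {

    override fun onConnectionStateChange(gatt: BluetoothGatt, status: Int, newState: Int) {
        if (newState == BluetoothProfile.STATE_CONNECTED) gatt.discoverServices()
    }

    override fun onServicesDiscovered(gatt: BluetoothGatt, status: Int) {
        // Subscribe to notifications so the glove transmits only when a tag is
        // read, preserving its battery (a full implementation must also write
        // the client characteristic configuration descriptor).
        gatt.getService(GLOVE_SERVICE)
            ?.getCharacteristic(ID_CHARACTERISTIC)
            ?.let { gatt.setCharacteristicNotification(it, true) }
    }

    override fun onCharacteristicChanged(gatt: BluetoothGatt, characteristic: BluetoothGattCharacteristic) {
        // The glove sends the tag's identification code as a UTF-8 string.
        onIdentifier(String(characteristic.value))
    }
}
```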
The application supports offline work by locally caching the changes to the database and synchronizing them back to the cloud when the farmer presses the "Update Database" button. The application notifies the farmer of database update issues or when there is no locally saved data.
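The offline-first update logic can be sketched as follows; the upload function is a placeholder for the actual cloud synchronization, and the messages mirror the notifications described above.

```kotlin
// Sketch of the "Update Database" behavior: pending local edits are cached
// and pushed to the cloud copy when the farmer presses the button.
class SyncManager(private val upload: (List<String>) -> Boolean) {

    private val pendingEdits = mutableListOf<String>() // locally cached rows

    fun recordLocalEdit(csvRow: String) {
        pendingEdits += csvRow
    }

    // Returns the notification shown to the farmer.
    fun updateDatabase(): String = when {
        pendingEdits.isEmpty() -> "No locally saved data to synchronize"
        upload(pendingEdits)   -> { pendingEdits.clear(); "Database updated" }
        else                   -> "Database update failed; changes kept locally"
    }
}
```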