Based on the proposed digital architecture, it was possible to prepare the environment for presenting each product's individual information, a Digital Twin (DT) of the physical products. A DT should ensure real-time data collection, data integration between different platforms and systems, and product data fidelity [48].
Real-time data collection is enabled by a well-established communication process based on IoT technology [11, 32]. A DT should support different synchronization options with the product's event points, especially when connectivity on the customer side is unstable. Moreover, it should be able to import background data, component information, and configuration information from the physical components [37–39].
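For illustration, a buffered synchronization routine on the product side could follow the minimal Python sketch below; the event schema and the cloud endpoint are assumptions for illustration, not part of the implementation described here.

```python
import json
import time
import urllib.request
from collections import deque

# Local buffer holding product event points while connectivity is unstable.
event_buffer = deque()

def record_event(serial_number, event_type, payload):
    """Store a product event locally with its timestamp (hypothetical schema)."""
    event_buffer.append({
        "serial_number": serial_number,
        "event_type": event_type,
        "timestamp": time.time(),
        "payload": payload,
    })

def synchronize(endpoint_url):
    """Flush buffered events to the cloud once the connection is available."""
    while event_buffer:
        data = json.dumps(event_buffer[0]).encode("utf-8")
        request = urllib.request.Request(
            endpoint_url, data=data,
            headers={"Content-Type": "application/json"})
        try:
            urllib.request.urlopen(request, timeout=5)
            event_buffer.popleft()   # confirmed by the server, drop from buffer
        except OSError:
            break                    # still offline, retry at the next event point
```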
Data integration is established by creating a hybrid platform that merges various data formats. The DT integration may access two distinct data environments. The first is the manufacturer's internal company data (e.g., PLM, ERP, and MES). In this environment, design and production data are stored in structured databases that can be accessed, when required, through traditional communication protocols. The second environment is the customers' real-time usage data, captured through IoT technologies. Reliable connectivity to the databases and the variety of data are essential for generating the big data to be processed [4]. Therefore, the DT may contribute to creating a multi-modal data acquisition process that uses data analytics to predict future outcomes of the studied object [40].
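A minimal sketch of how such a multi-modal record could be assembled, assuming illustrative field names for the PLM, ERP, and IoT data:

```python
def build_product_record(serial_number, plm_row, erp_row, usage_events):
    """Merge structured manufacturer data with unstructured customer usage
    data into one multi-modal record keyed by the product serial number.
    All field names are illustrative assumptions."""
    return {
        "serial_number": serial_number,
        "design": {"revision": plm_row["revision"], "bom": plm_row["bom"]},
        "production": {"batch": erp_row["batch_id"],
                       "produced_on": erp_row["production_date"]},
        "usage": list(usage_events),  # raw JSON documents from the IoT platform
    }
```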
Data fidelity is achieved by ensuring that all information processed by the DT is the same information used throughout its lifecycle. The DT may access different existing IT systems to create a unique data interface [23, 30, 49]. The DT key features of real-time connectivity, data integration, and data fidelity are considered in the Closed-loop Digital Twin conceptual model presented in Fig. 4.
The Closed-loop Digital Twin model may connect information from both the manufacturer environment and the customer environment, process this information, and provide cross-validated insights to lifecycle phases and stakeholders (Fig. 4).
The Closed-loop Digital Twin implementation involved systems integration and the development of data processing algorithms. Fig. 5 presents the Closed-loop Digital Twin deployment at the learning factory.
The PLM, ERP, and MES software were installed on the cloud server of the learning factory. A remote connection to the databases was established through a TCP/IP connection on port 1443. The ERP and MES systems were already connected through an XML integration developed by the systems' providers, so data from both systems was accessed through a TCP/IP connection to the MES.
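Such a connection could be opened as in the sketch below, assuming a SQL Server-compatible ODBC driver and placeholder server, database, table, and credential names, none of which are specified in the deployment above:

```python
import pyodbc  # ODBC bindings used to reach the structured databases

# Remote connection to the MES database over TCP/IP on port 1443
# (server name, database name, and credentials are illustrative placeholders).
mes_connection = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=learning-factory-cloud,1443;"
    "DATABASE=MES;UID=dt_reader;PWD=********"
)

cursor = mes_connection.cursor()
# ERP data is reached through the same path, since both systems are already
# integrated via the providers' XML interface.
cursor.execute(
    "SELECT batch_id, production_date FROM production_orders "
    "WHERE serial_number = ?",
    ("SN-0001",)
)
row = cursor.fetchone()
```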
On the customers' side, an application was developed to collect usage data and store it in a private partition of the users' cloud space, accessed through an API, respecting the information protection requirements. Users retain control of their information and can revoke API access at any time. Furthermore, while the manufacturer environment uses a structured database, the customer environment uses a non-structured database to improve connection times and reduce storage space.
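The read path from the customers' environment could resemble the following sketch; the endpoint path and the bearer-token scheme are assumptions, and revoking the token on the customer side removes the DT's access.

```python
import json
import urllib.request

def fetch_usage_data(api_base_url, serial_number, access_token):
    """Read a product's usage documents from the customer's private cloud
    partition (hypothetical endpoint and token scheme). A revoked token
    simply makes this call fail, cutting off the DT's access."""
    request = urllib.request.Request(
        f"{api_base_url}/products/{serial_number}/usage",
        headers={"Authorization": f"Bearer {access_token}"})
    with urllib.request.urlopen(request, timeout=10) as response:
        return json.loads(response.read())  # unstructured JSON documents
```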
The DT was built on a Python/SQL platform that accesses the systems' databases through ODBC and API connections. It was deployed on a virtual machine with 1 vCPU and 1.7 GB of RAM. The DT captures only the information relevant to the analysis being performed and then runs a statistical analysis to predict upcoming events for the manufacturer and the customer, such as predictive maintenance needs, demand for spare parts to be replaced, or potential product failures. The analysis results are stored in an application log and communicated to the relevant product stakeholders.
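As an illustration of this analysis and logging step, the sketch below fits a simple linear wear trend and records the prediction (it relies on statistics.linear_regression, available from Python 3.10; the wear model and data layout are assumptions, not the implemented algorithm):

```python
import logging
import statistics

logging.basicConfig(filename="dt_application.log", level=logging.INFO)

def predict_remaining_distance(wear_history):
    """Estimate the distance remaining before a component reaches its wear
    limit from hypothetical (distance_km, wear_fraction) samples."""
    distances = [d for d, _ in wear_history]
    wears = [w for _, w in wear_history]
    slope, intercept = statistics.linear_regression(distances, wears)
    remaining_km = (1.0 - intercept) / slope - distances[-1]
    logging.info("Predicted remaining distance: %.1f km", remaining_km)
    return remaining_km
```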
The application was developed following an object-oriented approach. Each system was treated as a separate object in the integration module, which was designed so that new modules can be added without changing the DT's main structure: every new function is added as an independent element linked only by the product serial number. Thus, the model consumes little memory and disk space and can be scaled without significant computational resources.
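The structure of the integration module can be pictured as in the following sketch, where the connector classes and the MES query are illustrative assumptions rather than the actual code:

```python
class SystemConnector:
    """Base class of the integration module: each enterprise system is a
    separate object exposing data keyed only by the product serial number."""

    def fetch(self, serial_number):
        raise NotImplementedError


class MESConnector(SystemConnector):
    def __init__(self, connection):
        self.connection = connection  # e.g., an open ODBC connection

    def fetch(self, serial_number):
        cursor = self.connection.cursor()
        cursor.execute(
            "SELECT batch_id FROM production_orders WHERE serial_number = ?",
            (serial_number,))
        return cursor.fetchone()


class DigitalTwin:
    """New systems or functions plug in as independent connectors
    without touching the main structure."""

    def __init__(self):
        self.connectors = {}

    def register(self, name, connector):
        self.connectors[name] = connector

    def collect(self, serial_number):
        return {name: connector.fetch(serial_number)
                for name, connector in self.connectors.items()}
```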
An algorithm to recommend wheel replacement was implemented as an example of the closed-loop use of information for decision making. The algorithm captures, from the customers' database, the product initiation date, the total traveled distance, and the average speed. This information is then analyzed against the wheels' actual maintenance parameters. The wheels' characteristics are obtained from the product structures in the PLM and ERP systems, and the exact manufacturing batch parameters are captured from the MES system. The algorithm combines these data and gives the customers and the manufacturer feedback about the need to order new wheels.
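A hedged sketch of such a recommendation function, combining the three data sources by serial number, is shown below; all field names, thresholds, and the wear correction are illustrative assumptions rather than the implemented maintenance parameters.

```python
from datetime import datetime

def recommend_wheel_replacement(usage, wheel_spec, batch_params, today=None):
    """Combine customer usage data, PLM/ERP wheel characteristics, and MES
    batch parameters to decide whether new wheels should be ordered.
    All field names and the simplified wear model are assumptions."""
    today = today or datetime.now()
    days_in_service = (today - usage["initiation_date"]).days

    # Nominal wear limit from the product structure, corrected by the
    # tolerance recorded for the specific manufacturing batch.
    allowed_km = wheel_spec["max_distance_km"] * batch_params["wear_factor"]

    # Assume that sustained speeds above the rated value shorten wheel life.
    speed_penalty = 1.0 + max(
        0.0, usage["average_speed_kmh"] - wheel_spec["rated_speed_kmh"]
    ) / 100.0
    effective_km = usage["total_distance_km"] * speed_penalty

    return {
        "serial_number": usage["serial_number"],
        "days_in_service": days_in_service,
        "effective_distance_km": round(effective_km, 1),
        "replace_wheels": effective_km >= 0.9 * allowed_km,  # feedback to both sides
    }
```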
In the Closed-loop Digital Twin, using the existing systems to store the data ensures data fidelity and data governance. The fast connection times (0.0075 s to the ERP, 0.0089 s to the MES, and 0.051 s to the PLM) allow real-time data access. The DT structure based on distinct data environments is necessary to prevent data redundancy. The integration between environments is possible because the proposed DT model is conceived for heterogeneous operation and preserves the original data structures. Hence, the application connects data from different stages of the product lifecycle, processes these data, and provides new information, closing the product lifecycle information loop.