Performance Modelling and Analysis of IoT Based Edge Computing Policies

In today’s era, the adoption of IoT-based edge devices is growing exponentially, which creates challenges of data acquisition, processing, and communication. In the edge computing paradigm, intelligence is shifted from the center to the edge by performing specific processing and prediction locally. Existing methods do not compute results in a timely manner, and their susceptibility to overfitting reduces the achievable security level. A strategy based on reducing communication between sensors and edge devices is the prime focus of this investigation. It uses a predictive model-based policy at edge devices to reconstruct context vectors that were not delivered. A new hybrid Averaged Exponential Smoothing policy is proposed, based on the current context vectors as well as a smoothing vector, to reduce reconstruction error and lower the percentage of communication. It is observed that if data is sent only when it changes by more than a marginal amount, communication overhead can be reduced while keeping reconstruction error low. This policy is suitable for IoT-based edge computing applications in the smart city, such as smart homes, healthcare, and intelligent traffic, to deliver the power of AI.


Introduction
IoT is at the heart of digital business, combining the operational and information technology necessary to create a business advantage. Gartner predicts that a high proportion of significant business procedures and protocols will include some IoT component [1]. Data analytics is fundamental to the success of IoT frameworks, as it is the primary pillar of the required decision-making process. Real-time data and data analytics play a crucial role in this revolution. IoT produces a large quantity of data that needs to be processed and analyzed before it can be used. Edge computing services sit close to the user or the source of the data, such as an IoT device. This allows IoT data to be collected and processed at the edge where the device is located, instead of sending the data back to a datacenter or cloud, and helps identify patterns that trigger actions faster, such as anomaly detection for predictive maintenance. The ability of IoT devices to exploit local compute power is becoming increasingly valuable as a way to analyze data rapidly in real time.
Different computing architectures are available at the application layer, mainly Cloud computing, Fog computing, and Edge computing [2][3][4]. A new architecture such as Edge computing addresses data management problems by leveraging analytics at different layers, from the sensor to the edge of the network. An enterprise contains a network of sensors and actuators that generates big data, and managing and analyzing such data poses massive challenges. In an edge computing model, sensors and actuators transmit or receive data via a nearby edge gateway, such as a switch, that checks or analyzes the data instead of forwarding it to the cloud. By moving intelligence from a central cloud to the edge of an enterprise, analytics can be performed closer to where events happen. Edge computing can help alleviate latency challenges and empower organizations to open new opportunities using a distributed computing architecture [5][6][7][8]. A smart IoT network incorporates policies that push data preparation and application data management to the edge of the network.
This new architecture performs advanced analytics at the point where the data is generated or is most actionable. With a smart deployment strategy, decision-makers get visual and easily understood insights from data in near real time. In edge computing, advanced analytics is deployed on the end device (sensing and actuating devices), on the gateway (edge devices), and in the cloud, giving secure and instant insight at the time and place it is most needed. Here, the two datasets DS1 and DS2 are modeled, and the security improvement yields a better analysis state. Case 1 determines the predicted value of sensor data, and Case 2 shows the context vector approach under different policy assessments. Security and large-scale data processing with overfitting are the major problems addressed in this work. This research contributes to increasing application availability in various fields of the IoT and cloud environment, and secure data processing improves performance and reliability for the user.
A variety of IoT-based edge architecture models can be found in the literature. Here, a three-layered edge computing model consisting of the core, edge, and sensor layers is used, as shown in Fig. 1. As we move from the sensor layer to the core layer, the latency increases, so it is better to perform time-sensitive analytics in the edge or sensor layer instead of pushing it to the core layer. Figure 1 shows the IoT system architecture of the edge computing-based model, in which three different layers interact: the sensor layer, the edge layer, and the core layer. The sensor layer is the input source that collects data from the activated sensors in the IoT sensor-actuator network (SAN). The edge layer operates on the IoT gateway, covering technologies such as Wi-Fi, wireless communication, Bluetooth, mobile networks, and the internet. The core layer comprises the IoT cloud infrastructure and has high latency; depending on the requests arriving from the sensor layer, the core layer is activated to process them. The system is designed with deep analysis based on user requirements. For example, when the core layer hosts a medical data handling application, the major concerns are security, reliability, and large-scale data processing, which may cause an overfitting problem.
• Sensor Computing - The sensor layer consists of IoT sensors, including motion, light, temperature, humidity, and many more. It comprises smart wireless sensors and actuators, which have limited computing capabilities. Sensors monitor the surrounding environment and asset state.
• Edge Computing - This layer includes sensor data aggregation systems using gateway devices. The edge layer consists of gateway devices that receive the data, aggregate it, and transfer it to the core computational layer. It acts as an intermediary that aggregates data from the lower layer and forwards it to the core layer for further processing.
• Core Computing - This layer is used to manage, store, analyze, and build a programming model of data. The core computational layer consists of high-end machines for storing, computing, and analyzing the sensor data to generate actionable information. This information is transmitted to the user via the IoT gateway.
Section 2 briefs the related work, architecture, edge computing models, and parameters for analysis. Section 3 focuses on the proposed policy. Section 4 presents the experimental study and result analysis, whereas Section 5 gives a summary and open research challenges in edge computing.

Background
In a traditional IoT system, the sensing node senses the environmental parameters and sends them to a centralized system for further processing and situation analysis. Analytics tasks are performed only on back-end devices, not on the Sensor or Actuation Device (SAD) or the network gateway Edge Device (ED) [45]. Nowadays, these devices come with increasing computing capacity and can perform some data processing at the SAD and/or ED of the network. This processing can reduce the energy required for transmitting data to the centralized system, lower the bandwidth requirement and latency, and prolong the lifetime of the sensor network, which encourages edge analytics. An edge computing architecture either pushes intelligence to the ED for analytics or to the SAD, where both collaboratively support edge analytics, as shown in Fig. 2. The terminology and respective notations are given in Table 1, and the dataset description is given in Table 2.
All SADs coordinate with an ED, which is responsible for keeping track of the data generated and the energy consumed in communication between SAD and ED, as well as in local computation on the ED. The cost of handling and analysis at the ED is considered nontrivial, and one should balance intra-edge-network communication against local computation. The prediction capability lets SAD_i decide whether or not to send a generated vector $x_t$ to ED_j. SAD_i relies on an error-threshold-based delivery decision rule with two cases:
• Case 1: If the difference between the model-based predicted value $\hat{x}_t$ and the sensed context vector $x_t$ is more than the error threshold, i.e., $e_t > \theta$, then SAD_i sends the original context vector $x_t$ to ED_j.
• Case 2: If $e_t \le \theta$, then SAD_i does not send $x_t$ to ED_j, and ED_j is responsible for reconstructing the context vector for further processing. This reconstruction of the context vector uses one of the edge computing models described in the next section.
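To make the decision rule concrete, the following is a minimal Python sketch (the paper's own implementation uses R), assuming the error $e_t$ is measured as the Euclidean distance between the sensed and predicted vectors; the function name and the choice of norm are illustrative, since the text does not fix them.

```python
import numpy as np

def should_send(x_t, x_hat_t, theta):
    """Threshold-based delivery decision at SAD_i.

    Case 1: if the prediction error e_t exceeds theta, the sensed
    context vector x_t is transmitted to ED_j.
    Case 2: otherwise nothing is sent, and ED_j reconstructs x_t
    from its own predictive model.
    """
    e_t = np.linalg.norm(np.asarray(x_t) - np.asarray(x_hat_t))  # prediction error e_t
    return e_t > theta

# Example: a sensed vector close to the prediction is suppressed.
print(should_send([24.1, 40.2], [24.0, 40.0], theta=0.5))  # False -> not sent
```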

Edge Computing Model
Many edge computing-based data models are available in the literature, which analyze time-series data coming from sensors while keeping both simplicity and complexity in mind. The various policies based on edge computing models for selective data forwarding [45] are as follows.
• Naive model - The reconstruction simply repeats the most recently received value:
$$\hat{y}_{T+h|T} = y_T,$$
where $\hat{y}_{T+h|T}$ denotes the forecast data after $h$ intervals.
• Simple Average model - In this model, the reconstruction of data depends on an average of sliding-window-based stream data:
$$\hat{y}_{T+1|T} = \frac{1}{T}\sum_{t=1}^{T} y_t,$$
where $y_1, \dots, y_T$ denotes the historical data.
• Simple Exponential Smoothing (SES) model - This is the most widely and successfully used forecasting model. The reconstruction of data depends on weighted averages of the previous $N$ observations in a window, with more weight given to recent data than to older data:
$$\hat{y}_{t+1} = \alpha y_t + (1 - \alpha)\,\hat{y}_t,$$
where $0 \le \alpha \le 1$ is a smoothing parameter. This scheme applies to data with no clear trend or seasonal pattern.
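As an illustration, these three baseline reconstruction policies can be sketched in Python as follows (the paper's evaluation is implemented in R); the window size and smoothing parameter values below are illustrative assumptions.

```python
import numpy as np

def naive_forecast(history):
    """Naive policy: the forecast repeats the most recent value."""
    return history[-1]

def simple_average_forecast(history, window=10):
    """Simple Average policy: mean over a sliding window of past values."""
    return np.mean(history[-window:], axis=0)

def ses_forecast(history, alpha=0.3):
    """Simple Exponential Smoothing: weighted average favouring recent data."""
    y_hat = history[0]
    for y in history[1:]:
        y_hat = alpha * y + (1 - alpha) * y_hat  # y_hat_{t+1} = a*y_t + (1-a)*y_hat_t
    return y_hat

history = np.array([21.0, 21.4, 21.3, 21.9, 22.1])
print(naive_forecast(history), simple_average_forecast(history, 3), ses_forecast(history))
```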

Proposed Edge Computing Policy
The proposed policy takes advantage of the Exponential Smoothing model and the Averaged model to arrive at the Averaged Exponential Smoothing (AES) policy given in Fig. 3, where the context vector is reconstructed at ED_j using a combination of the averaged and exponential smoothing models. There are two possibilities in the reconstruction of the context vector using the sliding window maintained at ED_j. In the first case, ED_j directly inserts the received context vector into the sliding window, removes the oldest one, and computes the value from the recently received vector and the previous vectors. In the second case, ED_j reconstructs the value as the average of the smoothed context vector and the most recently received context vector, and removes the oldest value.
The reconstruction can be written as
$$\hat{x}_t = \frac{1}{2}\left(s'_{t-1} + x_{t'}\right),$$
where $s'_{t-1}$ is the smoothing vector constructed at the edge device, which can differ from the smoothing vector at the sensor device, and $x_{t'}$ is the most recently received context vector. Figure 3 shows the AES policy, modelled over sliding-window-based data; the results are evaluated through the context vector model, which forecasts the average context value of the reconstructed vector. Since this policy combines two different models, it removes the bottlenecks of both, improves performance, and reduces reconstruction error significantly. The next section demonstrates an experimental study based on the edge computing models, parameters, and analysis of experimental results.
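A minimal sketch of the AES reconstruction at ED_j follows, written in Python for illustration (the paper's implementation is in R); the class name, window size, smoothing parameter, and the assumption that the first context vector is always delivered are illustrative choices.

```python
import numpy as np
from collections import deque

class AESReconstructor:
    """Illustrative AES reconstruction at the edge device ED_j."""

    def __init__(self, window_size=10, alpha=0.3):
        self.window = deque(maxlen=window_size)  # oldest entry drops automatically
        self.alpha = alpha
        self.s = None              # edge-side smoothing vector s'_{t-1}
        self.last_received = None  # most recently received context vector

    def update(self, x_t=None):
        if x_t is not None:
            # Case 1: vector delivered; insert it into the sliding window.
            self.last_received = np.asarray(x_t, dtype=float)
            x_rec = self.last_received
        else:
            # Case 2: vector withheld; reconstruct as the average of the
            # smoothing vector and the most recently received vector.
            x_rec = 0.5 * (self.s + self.last_received)
        # Update the edge-side smoothing vector: s'_t = a*x + (1-a)*s'_{t-1}.
        self.s = x_rec if self.s is None else self.alpha * x_rec + (1 - self.alpha) * self.s
        self.window.append(x_rec)
        return x_rec

ed = AESReconstructor()
ed.update([21.0, 40.0])   # delivered vector
print(ed.update(None))    # reconstructed when the SAD stays silent
```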

Experimental Results
The previous sections examined the different edge computing models from a theoretical point of view. This section evaluates their performance empirically on a benchmark dataset and an experimental dataset.

Parameters for Analysis
Various measures used for the evaluation of different models are listed below. The precision of each model is essential from the evaluation point of view.
• Communication efficiency - It refers to the efficiency with which communication resources are used to reduce delays, measured as the fraction of sensed values actually transmitted:
$$\mathrm{CE} = \frac{1}{nT}\sum_{i=1}^{n}\sum_{t=1}^{T} I_{i,t},$$
where $I_{i,t} = 1$ if SAD_i sends its sensed value to the ED at time $t$, and $I_{i,t} = 0$ otherwise.
• Root Mean Square Error (RMSE) - It is the square root of the mean squared error between the actual context vector and the reconstructed one, calculated as follows:
$$\mathrm{RMSE} = \sqrt{\frac{1}{T}\sum_{t=1}^{T}\left(x_t - \hat{x}_t\right)^2}.$$

• Mean Absolute Error (MAE) - It is the absolute error between the actual context vector and the reconstructed one, calculated as follows:
$$\mathrm{MAE} = \frac{1}{T}\sum_{t=1}^{T}\left|x_t - \hat{x}_t\right|.$$
• Mean Absolute Percentage Error (MAPE) - It is the absolute percentage error between the actual context vector and the reconstructed one, calculated as follows:
$$\mathrm{MAPE} = \frac{100}{T}\sum_{t=1}^{T}\left|\frac{x_t - \hat{x}_t}{x_t}\right|.$$
These parameters are used below to evaluate the existing and proposed policies.
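For reference, these measures can be sketched in Python as follows (illustrative renderings of the formulas above, not the paper's R evaluation code):

```python
import numpy as np

def communication_efficiency(indicators):
    """Mean of I[i, t], where I[i, t] = 1 if SAD_i sent its value at time t."""
    return np.mean(indicators)

def rmse(actual, reconstructed):
    """Root mean square error between actual and reconstructed vectors."""
    return np.sqrt(np.mean((np.asarray(actual) - np.asarray(reconstructed)) ** 2))

def mae(actual, reconstructed):
    """Mean absolute error between actual and reconstructed vectors."""
    return np.mean(np.abs(np.asarray(actual) - np.asarray(reconstructed)))

def mape(actual, reconstructed):
    """Mean absolute percentage error; assumes no zero actual values."""
    actual = np.asarray(actual, dtype=float)
    return 100.0 * np.mean(np.abs((actual - np.asarray(reconstructed)) / actual))
```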
The experiments use two real datasets to evaluate the proposed policy. The first dataset is available in the University of California Irvine (UCI) repository [46]. These data are collected every hour and comprise T = 9357 12-dimensional measurements, with n = 12 SADs and one ED. This dataset has missing values, which are replaced by the mean of the corresponding attribute. The second dataset was collected from an experimental lab setup gathering data from different sensors, namely temperature, humidity, and carbon monoxide (CO). These data are collected every 5 min and comprise T = 1441 3-dimensional measurements, with n = 3 SADs and one ED.
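An illustrative preprocessing step for the first dataset, assuming it has been exported as a CSV file; the file name is a placeholder, and replacing missing values by the attribute mean follows the description above.

```python
import pandas as pd

# Placeholder file name; the UCI dataset [46] must be downloaded separately.
df = pd.read_csv("uci_sensor_data.csv")
# Replace missing values by the mean of the corresponding attribute.
df = df.fillna(df.mean(numeric_only=True))
print(df.shape)  # expected shape: (9357, 12) -> T measurements x n SADs
```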
All three existing policies and the proposed policy were implemented using the R platform, and the results were analysed for prediction error, reconstruction error, and percentage of communication.

Conclusion
The usage of IoT devices has been growing exponentially with time. Traditional policies are not sufficient to address IoT device issues such as data acquisition, processing, communication, privacy, and security. Considering these limitations of centralized processing in the growing era of IoT devices, a distributed policy was needed. Therefore, in this paper, a distributed and lightweight AES model is proposed. The proposed model is designed to reduce communication overhead and, in turn, to minimize energy and processing requirements. A comparison with existing policies shows that the AES model performs better regarding the percentage of communication. Future work includes the investigation of intelligent techniques for minimizing prediction and reconstruction error by offloading computing to the next level. In future, the work may be extended with a medical application-specific model, which would exercise the value of the edge computing model for increasing reliability. Security improvements on real-life data may also be modelled for better outcomes.