Internet applications are growing denser owing to their heavy infrastructure and the clients they serve through hypervisor virtualization technologies. They deliver solutions to users' problems as a service model in which each computation paradigm can be consumed under a tenancy model. The concern, however, is how such security services are delivered to the end user in a utility manner; one solution was proposed in [7]. Yet as Internet-based applications advance, user expectations rise as well, and such applications cannot protect the confidentiality of users' data from the service providers themselves.

One further aspect of data security that must be assessed before embarking on a security scheme for data stored in Internet-based application systems is the level of need; that is, how secure do you want the data to be? The security level of any data object should be thought of as concentric layers of increasingly pervasive security impact, which we divide here into four phases to show the increasing granularity of this pervasiveness:
Phase 1: Transfer of the file using encrypted protocols and standards
Phase 2: Access control of the file itself, but without encryption of the content
Phase 3: Access control methods (including encryption of the content of a data object)
Phase 4: Access control methods (including encryption of the data and files), plus rights management options (for example, no copying content without authentication, no printing content without authorization, date restrictions, etc.)
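The four phases above form an ordered classification. A minimal sketch (hypothetical names, not part of the proposed system) shows how a policy check could model them so that an applied protection level can be compared against a required one:

```python
from enum import IntEnum

class SecurityPhase(IntEnum):
    """Concentric levels of data-object security, ordered by pervasiveness."""
    ENCRYPTED_TRANSFER = 1   # Phase 1: encrypted protocols during transfer
    ACCESS_CONTROL = 2       # Phase 2: access control, content not encrypted
    CONTENT_ENCRYPTION = 3   # Phase 3: access control plus content encryption
    RIGHTS_MANAGEMENT = 4    # Phase 4: encryption plus rights-management options

def meets_requirement(applied: SecurityPhase, required: SecurityPhase) -> bool:
    """A data object is adequately protected when the applied phase is at
    least as pervasive as the required phase."""
    return applied >= required
```

Because the phases are concentric, an integer ordering suffices: each higher phase subsumes the guarantees of the lower ones.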
A number of security areas are frequently overlooked by IT organizations, and these can significantly increase the risk of a protection breach spreading. The following conditions require unambiguous attention and correction [4]:
2.1 Problem in existing system
2.1.1 Insufficient User Authentication
The typical username/password validation techniques are grossly inadequate to protect against security breaches, particularly with the creative social engineering attacks seen recently. Strong authentication is a must, and different methods should be used depending on a number of contextual factors. Hardware tokens have been commonly used, but the recent breach of SecurID tokens, as well as the cost and complexity associated with hardware tokens, makes this an unappealing option.
2.1.2 Insufficient Regular User Access Validation
As a user undergoes role changes, promotions, etc., their access privileges often no longer match their current tasks. Although many businesses occasionally validate users' access rights, this should be done formally and routinely, and it should be automated so that it can be performed instantly and transparently. The absence of routine validation increases the risk of separation-of-duties violations and other irregularities [5].
2.1.3 Absence of Privileged User System Controls
A leading cause of security infringements and problems is that privileged users frequently have more access than their jobs require, even though such users do need extensive access rights to perform their duties [1].
2.1.4 Lack of Information Use Control
Controlling access to information is not sufficient protection on its own; the use of information must also be controlled to help ensure that it is not disclosed or stolen. These weaknesses have been a contributing cause of some of the most visible security breaches of the recent past.
2.1.5 Lack of Continuous User Monitoring
Apparently, in the WikiLeaks breach, nobody noticed that someone was copying thousands of sensitive documents from military systems over a short period of time. Many security breaches could be identified with a comprehensive approach to monitoring user activity for suspicious behavior. Consequently, the lack of effective, continuous monitoring of user activity is also one of the most important contributors to a high risk of security breach [13, 14].
2.2 Proposed Work
Developers will notice applications that require attribute signatures with specific qualities that no existing cryptosystem can supply. With an Attribute-Based Signature (ABS), the user can pass selected attributes from a list of attributes to generate a stronger signature. ABS uses anonymity to protect the privacy of the user's identity and their signature. Here, key revocation is linked to attribute generation, which helps ABS operate better. However, the signature verifier faces a difficult scenario because he is unaware of the user's choice of attributes, so the verification process is not straightforward. This study presents a novel attribute-based signature architecture that addresses these issues while also being simple to implement. For dealing with user-selected attributes, it offers an enhancement over key revocation using a designated authority. This designated authority acts as a conduit between the verifier and the user, facilitating attribute negotiation. To generate the signature, the user requests the intermediary's secret key share, along with a revocation check on its identity and the appropriate attributes. The RSA algorithm is used to conduct key exchanges.
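The negotiation flow described above can be sketched as follows. This is a toy stand-in, not the actual ABS cryptography: the revocation list, the authority's key share, and the hash-based "signature" are all hypothetical simplifications used only to show the order of operations (revocation check, attribute-bound key share, signing, verifier-side re-derivation):

```python
import hashlib

# Hypothetical revocation list held by the designated authority.
REVOKED = {"alice-old"}

def authority_key_share(user_id: str, attributes: frozenset) -> str:
    """The designated authority checks revocation, then issues a key share
    bound to the user's chosen attributes (toy stand-in for the real share)."""
    if user_id in REVOKED:
        raise PermissionError("user key revoked")
    payload = f"{user_id}|{sorted(attributes)}".encode()
    return hashlib.sha256(payload).hexdigest()

def sign(user_id: str, attributes: frozenset, message: bytes) -> str:
    """The user combines the authority's share with the message to sign."""
    share = authority_key_share(user_id, attributes)
    return hashlib.sha256(share.encode() + message).hexdigest()

def verify(user_id: str, attributes: frozenset, message: bytes, sig: str) -> bool:
    """The verifier re-runs the negotiation with the claimed attributes;
    a revoked user or a mismatched attribute set fails verification."""
    try:
        return sign(user_id, attributes, message) == sig
    except PermissionError:
        return False
```

Note how a signature made over one attribute set cannot be verified against a different set: the key share, and hence the signature, is bound to the negotiated attributes.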
2.2.1 Architecture
The procedure begins with the formation of a group of users interested in creating digital signature-based documents. Each user has a unique set of attributes connected with their characteristics and system usage patterns. These user attributes are extracted and stored in the attribute collection. Next, the user's data fragment, which includes all the user's information in attribute form, is used to verify the identity of documents through a signature. The digest is calculated by the hash algorithm using this predicate logic. The digest is then encrypted with the RSA cryptosystem using the signer's private key; along with the user's attributes, this private key verifies the user's identity. After encryption, an X.509 certificate is included to carry the user's verification information, which the authentication server checks as part of the identity check. All temporary data created by the system are signed and kept in the repository, from which the signature can be regenerated if the user produces another document with the same signature request; the repository also holds the signature-generation predicate logic. Finally, the document's signed message is transmitted over an open channel.
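The signing pipeline above (attributes and data → MD5 digest → digest encrypted with the signer's private RSA key) can be sketched end-to-end. The RSA parameters below are the classic textbook toy values, used purely for illustration; a real deployment would use a 2048-bit key pair backed by an X.509 certificate store:

```python
import hashlib

# Toy textbook RSA key pair (illustrative only; real keys are >= 2048 bits).
p, q = 61, 53
n = p * q        # 3233, the modulus
e = 17           # public exponent
d = 2753         # private exponent: d * e = 1 (mod (p-1)*(q-1))

def md5_digest(attributes: list, data: bytes) -> bytes:
    """Digest over the user's attributes and the document fragment."""
    h = hashlib.md5()
    for attr in sorted(attributes):
        h.update(attr.encode())
    h.update(data)
    return h.digest()

def sign_digest(digest: bytes) -> list:
    """'Encrypt' the digest with the signer's private key (textbook RSA,
    applied byte-by-byte only because the toy modulus is tiny)."""
    return [pow(b, d, n) for b in digest]
```

Applying the public exponent to each signature element recovers the original digest bytes, which is exactly what the verification stage below relies on.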
The second stage of the proposed architecture is the verification process. This segment begins with the recovery of the digest associated with each fragment of data. The received digest must first be decrypted using the signer's public key, which is retrieved from the public-key register. Once the digest has been decrypted, the hash correlated with the data is extracted, and the original data is given as input to the MD5 algorithm to recalculate the message digest. This recalculated message digest is compared with the received message digest, alongside authentication of the user's attributes. If the attributes match those connected with the user's identity, the signature and the sender's validity are proved. Finally, the message is made available to the recipient along with a certificate containing its confirmation.
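The verification path can be sketched with the same toy RSA parameters (illustrative values only; production code would fetch the signer's public key from the public-key register rather than hard-code it):

```python
import hashlib

# Toy RSA parameters shared with the signer (illustrative only).
n, e, d = 3233, 17, 2753

def recompute_digest(data: bytes) -> bytes:
    """Recalculate the MD5 message digest over the received data."""
    return hashlib.md5(data).digest()

def verify(data: bytes, signature: list) -> bool:
    """Decrypt the received digest with the signer's PUBLIC exponent,
    recompute the digest over the data, and compare the two."""
    decrypted = bytes(pow(s, e, n) for s in signature)
    return decrypted == recompute_digest(data)

def sign(data: bytes) -> list:
    """Signer side, included only so the round trip can be exercised."""
    return [pow(b, d, n) for b in recompute_digest(data)]
```

Any change to the data after signing makes the recomputed digest diverge from the decrypted one, so the comparison fails and the sender's validity is rejected.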
2.2.2 Components to Be Developed
1. Attribute Extractor
2. Signature Predicate Logic Generator
3. MD5 Hash Algorithm
4. RSA Cryptosystem
5. Certificate Manager
6. Hash Matching
7. Messaging System
2.3 Implementation
2.3.1 The first form of the developed tool requires authentication as a security primitive, as shown in Fig. 3. The user needs to log in to the system using their defined credentials; once the ID and password are verified, a welcome message is displayed.
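The credential check behind this login form can be sketched minimally. The store layout, user ID, and password below are hypothetical; the sketch only illustrates verifying an ID/password pair against a salted hash rather than a plaintext password:

```python
import hashlib
import hmac
import os

def make_record(password: str) -> tuple:
    """Create a salted PBKDF2 hash of the password for the credential store."""
    salt = os.urandom(16)
    return salt, hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

# Hypothetical credential store: user_id -> (salt, salted hash).
CREDENTIALS = {"user1": make_record("s3cret")}

def login(user_id: str, password: str) -> bool:
    """Verify the submitted ID and password against the stored salted hash."""
    record = CREDENTIALS.get(user_id)
    if record is None:
        return False
    salt, stored = record
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, stored)
```

The constant-time comparison (`hmac.compare_digest`) avoids leaking information through timing, which matters given the weak-authentication risks discussed in Section 2.1.1.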
2.3.2 This interface requires file selection for secure sharing between the different users of the system, as shown in Fig. 4. After file selection, the attribute-based signature framework generates a set of user attributes used to prove the user's identity on the basis of their behavior. These attributes are passed to the MD5 hash algorithm to generate the digest. The user attributes are also passed into the RSA mechanism to generate the keys. The system generates a key pair, of which one key serves as the public key.
2.3.3 Once the key is generated, the X.509 certificate is displayed. This certificate contains information related to the user, profile, file, and system. If the system and its selected attributes are correct, a success message is displayed for the file. After this process, the system is ready to share the file securely, as shown in Fig. 5. If the process and the generated file are correct, the user proceeds to send the file.
2.3.4 Fig. 6 contains the different quantitative values, such as hash generation time, key generation time, encryption time, file sending time, and overall process time.
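The quantitative values of this kind can be collected with simple wall-clock instrumentation. A minimal sketch (the stage names and placeholder callables are hypothetical, not the tool's actual code) is:

```python
import time

def timed_stages(stages: dict) -> dict:
    """Run each named stage callable, record its wall-clock duration,
    and add the overall process time across all stages."""
    timings = {}
    start = time.perf_counter()
    for name, stage in stages.items():
        t0 = time.perf_counter()
        stage()
        timings[name] = time.perf_counter() - t0
    timings["overall_process"] = time.perf_counter() - start
    return timings

# Placeholder stages standing in for the real pipeline steps.
metrics = timed_stages({
    "hash_generation": lambda: None,
    "key_generation": lambda: None,
    "encryption": lambda: None,
    "file_sending": lambda: None,
})
```

`time.perf_counter()` is used rather than `time.time()` because it is monotonic and has higher resolution, which matters when individual stages complete in microseconds.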
2.3.5 As shown in Fig. 7, after clicking on Send File for Authentication, the file is successfully sent.
2.3.6 Fig. 8 shows the receipt of files from different users. This works as the reverse process for verifying a securely transmitted file. The panel shows the list of files received from different users. The verifier selects any file from the list, applies the reverse process, and verifies the sender. By this mechanism, integrity, authentication, and confidentiality are assured.
2.3.7 As shown in Fig. 9, once the file is verified, the verifier forwards it to the designated receiver.
2.3.8 The receiver downloads the file, which is saved to local disk space and can be viewed by the user. The process is based on the administrator's assessment of activities. This interface, as shown in Fig. 10, also displays the result attributes capturing quantitative values during the different stages of data retrieval. Mainly, it records the attribute, digest, encryption, hash, certificate, and decryption times used by the system for accuracy, efficiency, and reliability analysis.