Abstract
Purpose
The purpose of this paper is to improve privacy in healthcare datasets that hold sensitive information. Preventing privacy disclosure and providing relevant information to legitimate users are widely regarded as conflicting goals. Moreover, the swift evolution of big data has brought considerable convenience to everyday life, with data propagation and information sharing being two of its main facets. Despite several research works on these aspects, as data grows incrementally, the likelihood of privacy leakage also expands substantially alongside the benefits of big data. Hence, safeguarding data privacy in such a complex environment has become a major challenge.
Design/methodology/approach
In this study, a method called deep restricted additive homomorphic ElGamal privacy preservation (DR-AHEPP) is proposed to preserve the privacy of data even when that data grows incrementally. An entropy-based differential privacy quasi-identification algorithm and the DR-AHEPP algorithm are designed, respectively, to obtain a privacy-preserved minimum falsified quasi-identifier set and computationally efficient privacy-preserved data.
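The abstract does not spell out the DR-AHEPP construction, but the name points to additively homomorphic ElGamal, i.e., exponential ElGamal, where the plaintext sits in the exponent so ciphertexts can be aggregated without decryption. The sketch below illustrates only that underlying primitive, not the authors' method; the demo-sized modulus and all function names are assumptions.

```python
import random

# Demo-sized parameters (hypothetical; real deployments use >=2048-bit groups).
P = 2**64 - 59          # a 64-bit prime modulus
G = 2                   # group element used as the base

def keygen():
    """Return (secret key x, public key g^x)."""
    x = random.randrange(2, P - 1)
    return x, pow(G, x, P)

def encrypt(pk, m):
    """Exponential ElGamal: Enc(m) = (g^r, g^m * pk^r).
    Because m is in the exponent, multiplying ciphertexts adds plaintexts."""
    r = random.randrange(2, P - 1)
    return (pow(G, r, P), (pow(G, m, P) * pow(pk, r, P)) % P)

def add(c1, c2):
    """Homomorphic addition via component-wise multiplication."""
    return ((c1[0] * c2[0]) % P, (c1[1] * c2[1]) % P)

def decrypt(sk, c, max_m=10_000):
    """Recover g^m = c2 * c1^{-sk}, then solve the small discrete log by search."""
    gm = (c[1] * pow(c[0], P - 1 - sk, P)) % P
    acc = 1
    for m in range(max_m + 1):
        if acc == gm:
            return m
        acc = (acc * G) % P
    raise ValueError("message outside search range")

sk, pk = keygen()
total = add(encrypt(pk, 120), encrypt(pk, 35))
assert decrypt(sk, total) == 155   # 120 + 35, computed under encryption
```

The trade-off of the exponential variant is that decryption requires solving a small discrete logarithm, so it suits aggregated counts and sums rather than arbitrary plaintexts.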
Findings
Analysis results using the Diabetes 130-US hospitals dataset illustrate that the proposed DR-AHEPP method preserves privacy on incremental data more effectively than existing methods. A comparative analysis against state-of-the-art works is conducted with the objective of minimizing information loss, false positive rate and execution time while achieving higher accuracy.
Originality/value
The paper demonstrates better performance on the Diabetes 130-US hospitals dataset, achieving high accuracy with low information loss and a low false positive rate. The results illustrate that the proposed method increases accuracy by 4% and reduces the false positive rate and information loss by 25% and 35%, respectively, compared with state-of-the-art works.
Aggelos Kiayias, Thomas Zacharias and Bingsheng Zhang
Abstract
Purpose
This paper aims to investigate the importance of auditing for election privacy via issues that appear in state-of-the-art implementations of e-voting systems that apply threshold public key encryption (TPKE) on the client side, such as Helios, and use a bulletin board (BB).
Design/methodology/approach
The argumentation builds upon a formal description of a typical TPKE-based e-voting system in which the election authority (EA) is the central node of a star network topology. The paper points out the weaknesses of this topology with respect to privacy and analyzes how these weaknesses affect the security of several instances of TPKE-based e-voting systems. Overall, it studies the importance of auditing from a privacy perspective.
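In a TPKE-based system of the kind described above, no single trustee holds the election decryption key: each trustee contributes a key share, and decryption of the tally requires combining partial decryptions. The sketch below shows a minimal n-out-of-n ElGamal variant to make that structure concrete; the paper's systems use full t-out-of-n TPKE with verifiable secret sharing, and all names and parameters here are illustrative assumptions.

```python
import random

# Demo-sized parameters (hypothetical; not secure for real elections).
P = 2**64 - 59
G = 2

def trustee_keygen():
    """Each trustee holds a share x_i and publishes h_i = g^{x_i}."""
    x = random.randrange(2, P - 1)
    return x, pow(G, x, P)

def combine_pks(public_shares):
    """Election public key h = prod h_i = g^{sum x_i}."""
    h = 1
    for hi in public_shares:
        h = (h * hi) % P
    return h

def encrypt(h, m):
    """Exponential ElGamal under the combined election key."""
    r = random.randrange(2, P - 1)
    return (pow(G, r, P), (pow(G, m, P) * pow(h, r, P)) % P)

def partial_decrypt(x_i, c):
    """Trustee i contributes d_i = c1^{x_i} without revealing x_i."""
    return pow(c[0], x_i, P)

def combine_decrypt(c, partials, max_m=1000):
    """g^m = c2 / prod d_i; recover small m by exhaustive search."""
    d = 1
    for di in partials:
        d = (d * di) % P
    gm = (c[1] * pow(d, P - 2, P)) % P
    acc = 1
    for m in range(max_m + 1):
        if acc == gm:
            return m
        acc = (acc * G) % P
    raise ValueError("message outside search range")

trustees = [trustee_keygen() for _ in range(3)]
h = combine_pks([pk for _, pk in trustees])
c = encrypt(h, 7)
partials = [partial_decrypt(x, c) for x, _ in trustees]
assert combine_decrypt(c, partials) == 7   # needs every trustee's contribution
```

The privacy argument hinges on voters actually encrypting under the genuine combined key h, which is exactly what the star topology around the EA puts at risk.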
Findings
The paper shows that without public key infrastructure (PKI) support or, more generally, authenticated BB "append" operations, TPKE-based e-voting systems are vulnerable to attacks in which a malicious EA acts as a man-in-the-middle between the election trustees and the voters and can thereby learn how the voters have voted. As a countermeasure against such attacks, this work suggests compulsory trustee auditing. Furthermore, it analyzes how a lack of cryptographic proof verification affects the level of privacy that can be provably guaranteed in a typical TPKE e-voting system.
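The man-in-the-middle attack described above can be made concrete with a short key-substitution sketch: if the election key reaches voters through an EA-controlled, unauthenticated channel, the EA can hand out its own key, open each ballot, and re-encrypt it under the genuine trustee key so the tally still verifies. This is an illustrative reconstruction under assumed names, not the paper's formal model.

```python
import random

# Demo-sized exponential-ElGamal parameters (hypothetical; demo only).
P = 2**64 - 59
G = 2

def keygen():
    x = random.randrange(2, P - 1)
    return x, pow(G, x, P)

def encrypt(pk, m):
    r = random.randrange(2, P - 1)
    return (pow(G, r, P), (pow(G, m, P) * pow(pk, r, P)) % P)

def decrypt(sk, c, max_m=100):
    gm = (c[1] * pow(c[0], P - 1 - sk, P)) % P
    acc = 1
    for m in range(max_m + 1):
        if acc == gm:
            return m
        acc = (acc * G) % P
    raise ValueError("message outside search range")

trustee_sk, trustee_pk = keygen()   # honest trustees' election key
ea_sk, ea_pk = keygen()             # malicious EA's own key pair

# The voter fetches the election key via the EA-controlled channel and,
# with no PKI or authenticated BB append, receives ea_pk unnoticed.
vote = 1
ballot = encrypt(ea_pk, vote)

# The EA opens the ballot (learning the vote), then re-encrypts it under
# the real key so the published election data remains consistent.
leaked = decrypt(ea_sk, ballot)
relayed = encrypt(trustee_pk, leaked)

assert leaked == vote                        # voter privacy breached by the EA
assert decrypt(trustee_sk, relayed) == vote  # tally unaffected, attack invisible
```

Compulsory trustee auditing, as the paper suggests, closes this gap: a trustee checking the BB would observe that the published key is not the one the trustees jointly generated.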
Originality/value
As opposed to the extensively studied importance of auditing for ensuring election integrity, the necessity of auditing for protecting privacy in an e-voting system has been mostly overlooked. This paper reveals design weaknesses present in notable TPKE-based e-voting systems that can lead to a total breach of voters' privacy and shows how auditing can be applied to provide strong provable privacy guarantees.