Search results
1 – 10 of over 156,000
Al Sentot Sudarwanto and Dona Budi Kharisma
Abstract
Purpose
The purpose of this paper is two-fold: to explore the legal issue of the importance of personal data protection in the digital economy sector, and to propose a legal framework for personal data protection as a consumer protection strategy and a means of accelerating the digital economy.
Design/methodology/approach
This study is legal research. The research approaches used were the comparative approach and the statute approach. The legal materials used are all regulations on personal data protection that apply in Indonesia, Hong Kong and Malaysia, collected through library research.
Findings
The value of Indonesia’s digital economy is the largest in the Southeast Asia region, but data breaches remain a major challenge. The Indonesian Consumers Foundation (Yayasan Lembaga Konsumen Indonesia) recorded 54 data breach cases in e-commerce, 27 in peer-to-peer lending and 5 in electronic money. Based on the results of a comparative study with Hong Kong and Malaysia, Indonesia does not yet have a specific Act that comprehensively regulates personal data protection, nor does it have a personal data protection commission. Criminal sanctions and civil claims related to data breaches have not yet been regulated.
Research limitations/implications
This study examines the data breach problem in the Indonesian digital economy sector; the legal construction of the proposed personal data protection regulations, however, is built on the results of a comparative study with Hong Kong and Malaysia.
Practical implications
The results of this study can be useful for constructing the ideal regulation regarding the protection of personal data in the digital economy sector.
Social implications
The recommendations in this study are expected to develop and strengthen the protection of personal data in the Indonesian digital economy sector. Besides preventing the misuse of personal data, the regulation aims to protect consumers and accelerate the growth of the digital economy.
Originality/value
Indonesia needs to enact a personal data protection act. The act should cover at least the following issues: personal data protection principles; types of personal data; management of personal data; mechanisms for personal data protection and security; a personal data protection commission; transfers of personal data; mechanisms for resolving personal data disputes; and criminal sanctions and civil claims.
Abstract
Purpose
The viability of online anonymity is questioned in today’s online environment, where many technologies enable tracking and identification of individuals. In light of the shortcomings of government, industry and consumers in protecting anonymity, a new perspective for ensuring anonymity is needed. Where current stakeholders have failed to protect anonymity, some proponents argue that economic models exist for the valuation of anonymity. By placing a monetary value on anonymity through Rawls’ concept of primary goods, it is possible to create a marketplace for anonymity, thereby allowing users full control over how their personal data are used. This paper aims to explore the creation of a data marketplace, offering users the possibility of engaging with companies and other entities to sell and auction personal data. Importantly, participation in a marketplace does not sacrifice one’s anonymity, as there are different levels of anonymity in online systems.
Design/methodology/approach
The paper uses a conceptual framework based on the abstractions of anonymity and data valuation.
Findings
The manuscript constructs a conceptual foundation for exploring the development and deployment of a personal data marketplace. It argues that, given features allowing individuals to control their personal data and a properly established monetary valuation of that data, individuals will undertake more proactive management of their personal data.
Originality/value
An overview of available services and products offering increased anonymity is provided, illustrating the beginnings of a market response to anonymity as a valuable good. By placing a monetary value on individuals’ anonymity, it is reasoned that individuals will more consciously protect their anonymity in ways where legislation and other practices (e.g. privacy policies, marketing opt-outs) have failed.
Dona Budi Kharisma and Alvalerie Diakanza
Abstract
Purpose
This paper aims to identify the reasons why leaks of patients’ personal data occur so often in the health sector. It also analyzes personal data protection regulations in the health sector from a comparative legal perspective across Indonesia, Singapore and the European Union (EU).
Design/methodology/approach
This study is legal research. The research approaches used are the statute approach and the conceptual approach. The focus of the research is Indonesia, with a comparative study of Singapore and the EU.
Findings
Leaks of patients’ personal data occur frequently in Indonesia. In 2021, the data of 230,000 COVID-19 patients were leaked and sold on the Rapid Forums dark web forum. A patient’s personal data is a human right that must be protected. Unlike Singapore and the EU, Indonesia does not yet have a law on the protection of personal data, a condition that allows leaks of patients’ personal data to occur frequently.
Research limitations/implications
This study analyzes the regulation and protection of patients’ personal data in Indonesia, Singapore and the EU to construct a regulatory design for the protection of patients’ personal data.
Practical implications
The results of this study are useful for constructing regulations governing the protection of patients’ personal data. Such regulation should protect patients’ personal data as a human right.
Social implications
An ideal regulatory design can prevent data breaches. The comparative studies show that cases of personal data leakage are rare in Singapore and the EU because both have a regulatory framework for the protection of patients’ personal data.
Originality/value
Legal strategies that can be taken to prevent and overcome patient data breaches include the establishment of an Act on Personal Data Protection, a Personal Data Protection Commission and rules for the management of patients’ personal data.
Anita Katulić, Tihomir Katulić and Ivana Hebrang Grgić
Abstract
Purpose
The purpose of this paper is to examine the relationship between the legal obligation of European libraries to ensure transparent personal data processing and respect for user privacy. The paper examines how libraries use privacy notices on their websites to communicate with patrons about the processing of personal data, and in what manner libraries have been guided by applicable transparency guidelines.
Design/methodology/approach
The method used is the analysis of privacy policies and other privacy documents found on the websites of national libraries. The analysis sample includes documents of 45 European national libraries, 28 out of those being national libraries of European Union (EU) Member States. The elements for this analysis are derived from the mandatory elements of the General Data Protection Regulation and the recommendations of the WP29/EDPB Transparency Guidelines.
Findings
The findings suggest that European national libraries largely adhere to EU data protection standards. In total, 60% of the libraries use a separate privacy page, and 53% of the EU Member State national library websites published all necessary data protection information in the manner recommended by the Guidelines, compared with 47% of non-Member State national libraries.
Originality/value
The research contributes to the understanding of the importance of the principle of transparency and its operationalization.
Sheshadri Chatterjee and Sreenivasulu N.S.
Abstract
Purpose
The purpose of this study is to investigate the impacts of regulations and governance of artificial intelligence (AI) on personal data sharing (PDS) in the context of sociolegal, technology and policy perspective.
Design/methodology/approach
With the help of theories and a literature review, hypotheses were formulated and a conceptual model developed, then statistically validated. The validated model was re-examined using the impact of regulation and governance of AI as a moderator. Validation was performed on survey data using PLS analysis.
Findings
The study found that there is a high level of positive impact of regulation and governance of AI on the online PDS by the users.
Research limitations/implications
This study provides a statistical model of the antecedents of PDS by online users, with the impact of AI regulation and governance as a moderator. The proposed model has an explanative power of 92%.
Practical implications
The study highlights the need for appropriate AI regulations so that users can share their personal data online without hesitation. Policymakers and the legal fraternity should work together to formulate a comprehensive AI regulation and governance framework.
Originality/value
To the best of the authors’ knowledge, there is no prior study on the impact of AI regulation and governance on PDS and how it affects the security, privacy and trust of online users.
Lei Huang, Jingyi Zhou, Jiecong Lin and Shengli Deng
Abstract
Purpose
In the era of big data, people enjoy the convenience brought by big data technology while facing the risk of personal information leakage, and so pay more attention to privacy protection. Furthermore, people’s views on personal information leakage and privacy protection vary, and these views play an important role in the legal process of personal information protection. Therefore, this paper aims to propose a semi-qualitative framework to reveal the subjective patterns surrounding information leakage and privacy protection, and to provide practical implications for interested parties.
Design/methodology/approach
The Q method is a semi-qualitative methodology designed to identify typologies of perspectives. To gain a comprehensive understanding of users’ viewpoints, this study incorporates the LDA & TextRank methods and other information extraction technologies to capture statements from large-scale literature, app reviews, typical cases and survey interviews, which serve as the source of the viewpoints.
Findings
By adopting the Q method, which studies subjective thought patterns to identify users’ potential views, the authors identified three categories of stakeholder subjectivity: macro-policy sensitive, trade-off oriented and personal information sensitive, each of which perceives different risks and affordances of information leakage and a different importance and urgency of privacy protection. All respondents’ subjectivities reflect awareness of the information leakage issue, that is, that interested parties such as social network sites are unable to protect their full personal information, while reflecting varied resistance and susceptibility to disclosing personal information for big data technology applications.
Originality/value
The findings of this study provide an overview of the subjective patterns on the information leakage issue. As the first to apply the Q method to views on personal information leakage and privacy protection, the research not only broadens the application field of the Q method but also enriches the research methods for personal information protection. In addition, the proposed LDA & TextRank method alleviates the limited sources of statements in the Q method.
Christine Prince, Nessrine Omrani and Francesco Schiavone
Abstract
Purpose
Research on online user privacy shows that empirical evidence on how privacy literacy relates to users' information privacy empowerment is missing. To fill this gap, this paper investigated the respective influence of two primary dimensions of online privacy literacy – namely declarative and procedural knowledge – on online users' information privacy empowerment.
Design/methodology/approach
An empirical analysis is conducted using a dataset collected in Europe. The survey was conducted in 2019 among 27,524 respondents representative of the European population.
Findings
The main results show that users’ procedural knowledge is positively linked to users’ privacy empowerment. The relationship between users’ declarative knowledge and users’ privacy empowerment is partially supported. While greater awareness of firms’ and organizations’ practices regarding data collection and further use was found to be significantly associated with increased privacy empowerment, results unexpectedly revealed that awareness of the GDPR and users’ privacy empowerment are negatively associated. The empirical findings also reveal that greater online privacy literacy is associated with heightened information privacy empowerment.
Originality/value
While a few studies have made systematic efforts to measure changes on websites since GDPR enforcement, it remains unclear how individuals perceive, understand and apply the GDPR rights/guarantees, and how likely these are to strengthen users’ information privacy control. This paper therefore contributes empirically to understanding how online users’ privacy literacy, shaped by both declarative and procedural knowledge, is likely to affect users’ information privacy empowerment. The study empirically investigates the effectiveness of the GDPR in raising users’ information privacy empowerment from a user-based perspective. Results stress the importance of greater transparency of the data tracking and processing decisions made by online businesses and services to strengthen users’ control over information privacy. The findings also emphasize the crucial need for more educational efforts to raise users’ awareness of the GDPR rights/guarantees related to data protection. They further show that users who adopt self-protective approaches to reinforce personal data privacy are more likely to perceive greater control over personal data. A broad implication for practitioners and e-businesses is the need to empower users with adequate privacy protection tools to ensure more confidential transactions.
Abstract
This is the second of three articles addressing the critical issue of abusive data collection and usage practices and their effect on personal privacy. The first article, which appeared in consecutive issue 17 of Library Hi Tech, discussed the individual under assault. This article discusses the evolution of data protection laws within countries and international organizations that have enacted such laws, and compares the scope, major provisions, and enforcement components of the laws.
Abstract
The study focusses on the legal issues surrounding artificial intelligence (AI), which are being investigated and debated in several European Union initiatives to manage and regulate Information and Communication Technologies. The goal is to discuss the benefits and drawbacks of adopting AI technology and the ramifications for the articulations of law and politics in democratic constitutional countries. The study thus aims to identify socio-legal concerns and possible solutions to protect individuals’ interests. The exploratory study is based on statutes, rules and committee reports, and draws on news pieces, reports issued by organisations and legal websites. The study revealed computer security vulnerabilities; unfairness, bias and discrimination; and legal personhood and intellectual property issues. Issues with privacy and data protection, liability for harm and lack of accountability are also discussed. The vulnerability framework is utilised in this chapter to strengthen comprehension of key areas of concern and to motivate risk and impact mitigation solutions to safeguard human welfare. Given the importance of AI’s effects on vulnerable individuals and groups and their legal rights, this chapter contributes to an essential discourse. The chapter advances the conversation while appreciating the legal work done in AI and the fact that this sector needs constant review and flexibility. As AI technology advances, new legal challenges, vulnerabilities and implications for data privacy will inevitably arise, necessitating increased monitoring and research.
Natasja Van Buggenhout, Wendy Van den Broeck, Ine Van Zeeland and Jo Pierson
Abstract
Purpose
Media users daily exchange personal data for “free” personalised media. Is this a fair trade, or user “exploitation”? Do personalisation benefits outweigh privacy risks?
Design/methodology/approach
This study surveyed experts in three consecutive online rounds (e-Delphi). The authors explored the value of personal data processing for media, and the relevance, benefits and risks of personalisation for users. They scrutinised the value exchange between media and users and determined whether media communicate transparently or use “dark patterns” to obtain more personal data.
Findings
Communication to users must be clear, correct and concise (to prevent user deception). Experts disagree on “payment” with personal data for “free” personalised media. This study discerned obstacles to, and solutions for, substantially balancing the interests of media and users (a fair value exchange). Personal data processing must be transparent and profitable to both media and users. Media can agree “sector-wide” on personalisation transparency. Fair, secure and transparent information disclosure to media is possible through shared responsibility and effort.
Originality/value
This study’s innovative contribution is threefold. First, it focuses on the opinions of professional stakeholders in the value network. Second, it offers recommendations for clearly communicating the value, benefits and risks of personalised media to users, allowing media to create codes of conduct that increase user trust. Third, it expands the literature explaining how media realise the value of personal data, deal with stakeholder interests and position themselves in the data processing debate. The research improves understanding of personal data value, processing benefits and potential risks in a regional context and within the European regulatory framework.