Search results

1 – 10 of 795
Article
Publication date: 15 June 2020

Tamir Tsegaye and Stephen Flowerday

Abstract

Purpose

An electronic health record (EHR) enables clinicians to access and share patient information electronically and has the ultimate goal of improving the delivery of healthcare. However, this can create security and privacy risks to patient information. This paper aims to present a model for securing the EHR based on role-based access control (RBAC), attribute-based access control (ABAC) and the Clark-Wilson model.

Design/methodology/approach

A systematic literature review was conducted, which resulted in the collection of secondary data that was used as the content analysis sample. Using the MAXQDA software program, the secondary data were analysed quantitatively using content analysis, resulting in 2,856 tags, which informed the discussion. An expert review was conducted to evaluate the proposed model using an evaluation framework.

Findings

The study found that a combination of RBAC, ABAC and the Clark-Wilson model may be used to secure the EHR. While RBAC is applicable to healthcare, as roles are linked to an organisation’s structure, its lack of dynamic authorisation is addressed by ABAC. Additionally, key concepts of the Clark-Wilson model such as well-formed transactions, authentication, separation of duties and auditing can be used to secure the EHR.
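
The combination the findings describe can be illustrated with a minimal sketch. All roles, attributes and rules below are hypothetical examples, not the model proposed in the paper: a request must pass a static role check (RBAC) and a dynamic attribute check (ABAC), and every transaction is logged, echoing the Clark-Wilson emphasis on well-formed, audited transactions and break-glass emergency access.

```python
# Minimal sketch of a combined RBAC + ABAC check with Clark-Wilson-style
# auditing. Roles, attributes and rules are illustrative only.
from datetime import datetime, timezone

ROLE_PERMISSIONS = {                 # RBAC: static role-to-permission mapping
    "physician": {"read_ehr", "write_ehr"},
    "nurse": {"read_ehr"},
}

def abac_ok(subject, context):
    """ABAC: dynamic conditions that pure RBAC cannot express."""
    if context.get("emergency"):     # break-glass access in a medical emergency
        return True
    return subject["ward"] == context["patient_ward"]

AUDIT_LOG = []                       # Clark-Wilson: every transaction is audited

def access_ehr(subject, action, context):
    allowed = (action in ROLE_PERMISSIONS.get(subject["role"], set())
               and abac_ok(subject, context))
    AUDIT_LOG.append((datetime.now(timezone.utc).isoformat(),
                      subject["id"], action, allowed))
    return allowed

nurse = {"id": "n1", "role": "nurse", "ward": "A"}
print(access_ehr(nurse, "read_ehr", {"patient_ward": "A"}))   # True: role + ward match
print(access_ehr(nurse, "write_ehr", {"patient_ward": "A"}))  # False: role lacks permission
print(access_ehr(nurse, "read_ehr", {"patient_ward": "B", "emergency": True}))  # True: break-glass
```

The emergency branch mirrors the paper's point about still providing EHR access in a medical emergency while keeping an audit trail.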

Originality/value

Although previous studies have been based on a combination of RBAC and ABAC, this study also uses key concepts of the Clark-Wilson model for securing the EHR. Countries implementing the EHR can use the model proposed by this study to help secure the EHR while also providing EHR access in a medical emergency.

Details

Information & Computer Security, vol. 28 no. 3
Type: Research Article
ISSN: 2056-4961

Article
Publication date: 8 October 2018

Simon N. Foley and Vivien Rooney

Abstract

Purpose

In this paper, the authors consider how qualitative research techniques that are used in applied psychology to understand a person’s feelings and needs provide a means to elicit their security needs.

Design/methodology/approach

Recognizing that the codes uncovered during a grounded theory analysis of semi-structured interview data can be interpreted as policy attributes, the paper develops a grounded theory-based methodology that can be extended to elicit attribute-based access control style policies. In this methodology, user-participants are interviewed and machine learning is used to build a Bayesian network-based policy from the subsequent (grounded theory) analysis of the interview data.
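
The pipeline the abstract outlines can be caricatured in a few lines. Everything below is a toy stand-in: the coded fragments, attribute names and probabilities are invented for illustration, and a hand-set conditional probability table substitutes for the Bayesian network the methodology would learn from the grounded-theory analysis.

```python
# Toy sketch of the elicitation pipeline: grounded-theory codes assigned to
# interview fragments become policy attributes, and a hand-set (purely
# illustrative) conditional probability table stands in for the learned
# Bayesian network deciding whether sharing a photograph is permitted.

# Step 1: codes assigned to interview fragments during analysis
coded_fragments = [
    ("I'd share holiday photos with close friends",
     {"audience": "friend", "content": "holiday"}),
    ("I would never share photos of my kids publicly",
     {"audience": "public", "content": "family"}),
]

# Step 2: P(permit | audience, content) -- values a learner would normally
# estimate from the coded interview data, hard-coded here for illustration
CPT = {
    ("friend", "holiday"): 0.9,
    ("public", "family"): 0.05,
}

def permit_probability(attrs):
    """Look up the permit probability; 0.5 marks an unseen attribute combination."""
    return CPT.get((attrs["audience"], attrs["content"]), 0.5)

for text, attrs in coded_fragments:
    print(f"{attrs} -> permit with probability {permit_probability(attrs)}")
```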

Findings

Using a running example – based on a social psychology research study centered around photograph sharing – the paper demonstrates that in principle, qualitative research techniques can be used in a systematic manner to elicit security policy requirements.

Originality/value

While in principle qualitative research techniques can be used to elicit user requirements, the originality of this paper is a systematic methodology and its mapping into what is actionable, that is, providing a means to generate a machine-interpretable security policy at the end of the elicitation process.

Details

Information & Computer Security, vol. 26 no. 4
Type: Research Article
ISSN: 2056-4961

Article
Publication date: 12 June 2007

Marijke Coetzee and J.H.P. Eloff

Abstract

Purpose

This paper seeks to investigate how the concept of a trust level is used in the access control policy of a web services provider in conjunction with the attributes of users.

Design/methodology/approach

A literature review is presented to provide background to the progressive role that trust plays in access control architectures. The web services access control architecture is defined.

Findings

The architecture of an access control service of a web service provider consists of three components, namely an authorisation interface, an authorisation manager, and a trust manager. Access control and trust policies are selectively published according to the trust levels of web services requestors. A prototype highlights the incorporation of a trust level in the access control policy as a viable solution to the problem of web services access control, where decisions of an autonomous nature need to be made, based on information and evidence.
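
The two ideas in this finding, selective publication of policies and trust-gated authorisation, can be sketched as follows. The trust scale, operations and attribute names are hypothetical illustrations, not the WSACT implementation:

```python
# Illustrative sketch (not the WSACT architecture itself): a trust level
# assigned to a web services requestor gates both which policy rules are
# published to it and what access its users ultimately receive.

TRUST_LEVELS = {"unknown": 0, "known": 1, "trusted": 2}   # hypothetical scale

POLICY = [
    # (operation, minimum requestor trust level, required user attribute)
    ("getQuote", 0, None),
    ("placeOrder", 1, "registered"),
    ("viewAccountHistory", 2, "account_holder"),
]

def published_policy(requestor_trust):
    """Selective publication: a requestor only sees rules it could satisfy."""
    level = TRUST_LEVELS[requestor_trust]
    return [rule for rule in POLICY if rule[1] <= level]

def authorise(requestor_trust, user_attrs, operation):
    """Grant access when a published rule matches the operation and user attributes."""
    for op, _min_trust, attr in published_policy(requestor_trust):
        if op == operation and (attr is None or attr in user_attrs):
            return True
    return False

print(published_policy("known"))                                       # two rules visible
print(authorise("trusted", {"account_holder"}, "viewAccountHistory"))  # True
print(authorise("unknown", {"account_holder"}, "viewAccountHistory"))  # False
```

Users of trusted requestors thus receive advanced access, while users arriving through minimally trusted requestors are limited to the public rules, matching the contrast drawn in the abstract.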

Research limitations/implications

The WSACT architecture addresses the selective publication of policies. The implementation of sophisticated policy‐processing points at each web service endpoint, to automatically negotiate policies, is an important element needed to complement the architecture.

Practical implications

The WSACT access control architecture illustrates how access control decisions can be made autonomously by including a trust level of web services requestors in an access control policy.

Originality/value

The WSACT architecture incorporates the trust levels of web services requestors and the attributes of users into one model. This allows web services providers to grant advanced access to the users of trusted web services requestors, in contrast with the limited access that is given to users who make requests through web services requestors with whom a minimal level of trust has been established.

Details

Internet Research, vol. 17 no. 3
Type: Research Article
ISSN: 1066-2243

Article
Publication date: 1 August 2006

D.W. Chadwick, A. Novikov and A. Otenko

Abstract

Purpose

The paper aims to describe the results of a recent GridShibPERMIS project whose purpose was to provide policy‐driven role‐based access control decision‐making to grid jobs, in which the user's attributes are provided by an external Shibboleth Identity Provider (IdP).

Design/methodology/approach

This was achieved by integrating the identity‐federation and attribute‐assignment functions of Shibboleth and the policy‐based enforcement functions of PERMIS with the Grid job management functions of Globus Toolkit v4.

Findings

Combining the three technologies proved to be relatively easy due to the Policy Information Point (PIP) and Policy Decision Point (PDP) Java interfaces recently introduced into Globus Toolkit v4.
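
The PIP/PDP pattern credited here with easing the integration can be sketched generically. The class and attribute names below are illustrative, not the Globus Toolkit v4 Java interfaces: a Policy Information Point resolves a subject to its attributes (Shibboleth's role) and a Policy Decision Point evaluates them against a policy (PERMIS's role).

```python
# Generic sketch of the PIP/PDP pattern; interfaces and attribute names are
# hypothetical, not the GT4 API, Shibboleth or PERMIS themselves.

class AttributePIP:
    """Policy Information Point: resolves a subject to its attributes,
    as a Shibboleth Identity Provider would."""
    def __init__(self, idp_attributes):
        self.idp_attributes = idp_attributes      # stand-in for the external IdP

    def collect(self, subject):
        return self.idp_attributes.get(subject, set())

class PolicyPDP:
    """Policy Decision Point: role-based rule evaluation,
    as a PERMIS policy engine would perform."""
    def __init__(self, policy):
        self.policy = policy                      # action -> roles allowed to do it

    def decide(self, attributes, action):
        return bool(self.policy.get(action, set()) & attributes)

pip = AttributePIP({"alice@uni.example": {"staff", "grid-user"}})
pdp = PolicyPDP({"submit-job": {"grid-user"}, "admin": {"staff-admin"}})

attrs = pip.collect("alice@uni.example")
print(pdp.decide(attrs, "submit-job"))   # True: the 'grid-user' attribute matches
print(pdp.decide(attrs, "admin"))        # False: no matching role
```

The separation is the point: because attribute collection and decision-making sit behind distinct interfaces, the identity federation and the policy engine can be swapped independently, which is what made combining the three technologies "relatively easy".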

Practical implications

However, a number of limitations in the current GridShib implementation were revealed, namely: the lack of support for pseudonymous access to grid resources; scalability problems, because only one issuer scope domain is supported and because name mappings have to be provided for each grid user; and the inability to collect a user's attributes from multiple IdPs for use in authorisation decision‐making.

Originality/value

This paper provides an overview of and describes the benefits of the three technologies (GT4, Shibboleth and PERMIS), shows how they may be combined to good effect via GT4's java interfaces, describes the limitations of the current GridShib implementation and suggests possible solutions and additional research that are needed in the future in order to address the current shortcomings.

Details

Campus-Wide Information Systems, vol. 23 no. 4
Type: Research Article
ISSN: 1065-0741

Article
Publication date: 2 November 2015

Nancy Ambritta P, Poonam N. Railkar and Parikshit N. Mahalle

Abstract

Purpose

This paper aims at providing a comparative analysis of the existing protocols that address the security issues in the Future Internet (FI) and at introducing a Collaborative Mutual Identity Establishment (CMIE) scheme which adopts elliptic curve cryptography (ECC) to address issues such as content integrity, mutual authentication, forward secrecy, auditability and resistance to attacks such as denial-of-service (DoS) and replay attacks.

Design/methodology/approach

This paper compares the existing protocols that address security issues in the FI and presents the CMIE scheme, which adopts ECC and a digital signature verification mechanism to address issues such as content integrity, mutual authentication, forward secrecy, auditability and resistance to attacks such as DoS and replay attacks. The proposed scheme enables the establishment of secure interactions between devices and entities of the FI. Further, the algorithm is evaluated with the Automated Validation of Internet Security Protocols and Applications (AVISPA) tool to verify that the security properties the CMIE scheme claims to address are effectively achieved.
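
The general shape of mutual authentication can be shown in a simplified, stdlib-only sketch. Note the substitution: the CMIE scheme uses ECC digital signatures, whereas here an HMAC over a shared key stands in for signature generation and verification, so this illustrates only the nonce-based challenge-response flow, not the paper's cryptography.

```python
# Simplified mutual-authentication sketch. An HMAC over a shared key stands
# in for the ECC signatures used by CMIE -- the challenge-response structure
# (fresh nonces on both sides, both verifications required) is the point.
import hashlib
import hmac
import secrets

class Entity:
    def __init__(self, name, shared_key):
        self.name = name
        self.key = shared_key

    def challenge(self):
        self.nonce = secrets.token_bytes(16)   # a fresh nonce resists replay attacks
        return self.nonce

    def respond(self, peer_nonce):
        # "Sign" the peer's nonce together with our identity
        return hmac.new(self.key, peer_nonce + self.name.encode(),
                        hashlib.sha256).digest()

    def verify(self, peer_name, response):
        expected = hmac.new(self.key, self.nonce + peer_name.encode(),
                            hashlib.sha256).digest()
        return hmac.compare_digest(expected, response)  # constant-time comparison

key = secrets.token_bytes(32)
device, server = Entity("device", key), Entity("server", key)

# Each side challenges the other; authentication is mutual only if BOTH verify.
ok_device = server.verify("device", device.respond(server.challenge()))
ok_server = device.verify("server", server.respond(device.challenge()))
print(ok_device and ok_server)   # True when both sides hold the key
```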

Findings

The algorithm is evaluated with the AVISPA tool, which confirms that the security properties the CMIE scheme claims to address are effectively achieved. The proposed scheme enables the establishment of secure interactions between devices and entities of the FI.

Research limitations/implications

Considering the Internet of Things (IoT) scenario, another important aspect, namely the device-to-location (D2L) aspect, has not been considered in this protocol. The major focus of the protocol is on the device-to-device (D2D) and device-to-server (D2S) scenarios. Also, the IoT basically works upon a confluence of hundreds of protocols that support the achievement of various factors in the IoT, for example Data Distribution Service, Message Queue Telemetry Transport, Extensible Messaging and Presence Protocol, Constrained Application Protocol (CoAP) and so on. Interoperability of the proposed CMIE algorithm with the existing protocols has to be considered to establish a complete model that fits the FI. Further, each request for mutual authentication requires querying the database and a computation at each participating entity's side for verification, which could take a considerable amount of time. However, for applications that require firm authentication for maintaining and ensuring secure interactions between entities prior to access control and the initiation of actual transfer of sensitive information, the negligible difference in computation time can be ignored for the greater benefit that comes with stronger security. Other factors such as quality of service (QoS) (i.e. flexibility of data delivery, resource usage and timing), key management and distribution also need to be considered. However, the user still has the responsibility to choose the protocol that suits one's application and serves the purpose.

Originality/value

The originality of the work lies in adopting ECC and a digital signature verification mechanism to develop a new scheme that ensures mutual authentication between participating entities in the FI based upon certain user information such as identities. ECC provides efficiency in terms of the key sizes generated and security against man-in-the-middle attacks. The proposed scheme provides secure interactions between devices/entities in the FI.

Details

International Journal of Pervasive Computing and Communications, vol. 11 no. 4
Type: Research Article
ISSN: 1742-7371

Article
Publication date: 26 July 2021

Swagatika Sahoo, Arnab Mukherjee and Raju Halder

Abstract

Purpose

The rapid technological growth, changes in consumer demands, products’ built-in obsolescence, the presence of more non-repairable parts, shorter lifespans, etc., lead to the generation of e-waste at an unprecedented rate. Although a number of research proposals and business products to manage e-waste exist in the literature, they fall short in many aspects, such as incomplete coverage of the product’s life cycle, access control, payment channels (in a few cases), incentive mechanisms, scalability issues and missing experimental validation. The purpose of this paper is to introduce a novel blockchain-based e-waste management system that aims to mitigate the above-mentioned downsides and limitations of the existing proposals.

Design/methodology/approach

This paper proposes a robust and reliable e-waste management system by leveraging the power of blockchain technology, which captures the complete life cycle of e-products commencing from their manufacturing as new products to their disposal as e-waste and their recycling back into raw materials.
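
The life-cycle idea can be sketched as a minimal hash-chained ledger. The stages and fields below are invented for illustration, and the paper's system runs as smart contracts on Ethereum, not as this toy in-memory chain; the sketch only shows why linking each stage record to the hash of the previous one makes tampering detectable.

```python
# Minimal hash-chained ledger sketch: each stage of a product's life cycle
# (manufactured -> sold -> collected as e-waste -> recycled) becomes an
# immutable record linked to its predecessor. Illustrative only -- the
# paper's implementation is Solidity smart contracts on Ethereum.
import hashlib
import json

def add_record(chain, record):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = {"record": record, "prev": prev_hash}
    payload["hash"] = hashlib.sha256(
        json.dumps(payload, sort_keys=True).encode()).hexdigest()
    chain.append(payload)

def verify(chain):
    """Recompute every hash and check each link to the previous record."""
    for i, block in enumerate(chain):
        body = {"record": block["record"], "prev": block["prev"]}
        if block["hash"] != hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest():
            return False
        if i and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

chain = []
for stage in ("manufactured", "sold", "collected_as_ewaste", "recycled"):
    add_record(chain, {"product": "phone-42", "stage": stage})

print(verify(chain))                      # True: untampered chain
chain[1]["record"]["stage"] = "resold"    # tampering breaks the hash link
print(verify(chain))                      # False
```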

Findings

While the use of blockchain technology increases accountability, transparency and trust in the system, the proposal overcomes various challenges and limitations of the existing systems by providing seamless interactions among various agencies.

Originality/value

This paper presents a prototype implementation of the system as a proof-of-concept using Solidity on the Ethereum platform and performs experimental evaluations to demonstrate its feasibility and effective performance in terms of execution gas cost and transaction throughput.

Details

International Journal of Web Information Systems, vol. 17 no. 5
Type: Research Article
ISSN: 1744-0084

Article
Publication date: 14 July 2020

Sérgio Guerreiro

Abstract

Purpose

The purposes of this paper are: (1) to identify what types of business process operation controllers are discussed in the literature and how they can be classified, in order to establish the available body of knowledge; and (2) to identify which concepts are relevant for business process operation control and how these concepts are related, in order to offer a reference model for assessing how well control layers are enforced in the dynamically stable nature of an enterprise's business process operation.

Design/methodology/approach

One cycle of the circular framework for literature review proposed by vom Brocke et al. (2009) is followed. It comprises five stages: (1) definition of review scope (Sections 1 and 2), (2) conceptualisation of topic (Section 3), (3) literature search (Section 4), (4) literature analysis and synthesis (Section 5) and (5) definition of a research agenda (Section 6, which also concludes the paper). Vom Brocke, J., Simons, A., Niehaves, B., Riemer, K., Plattfaut, R. and Cleven, A. (2009), “Reconstructing the giant: on the importance of rigour in documenting the literature search process”, ECIS 2009 Proceedings, Vol. 161.

Findings

Results indicate that (1) many studies exist in the literature, but no integrated knowledge has been proposed, hindering the advance of knowledge in this field, and (2) a knowledge gap exists between the implemented solutions and the conceptualisation needed to generalise them to other contexts. Also, the proposed ontology provides a reference model for assessing the maturity of business process operation control.

Research limitations/implications

First, the content of the paper needs to be deepened further to include the concepts of “business process management” and “business process mining”, and a semantic equivalence study between concepts could better integrate this conceptual framework and identify similarities. Second, the relationship between industries and dynamically stable business process operation concepts has not yet been fully investigated. Third, the atypical curve of interest that business process operational control has received in the literature is not fully understood.

Practical implications

Some example applications that could benefit from this ontology are: (1) security policies for fine-grained access control of business processes; (2) business processes enforced with decentralised policies, e.g. blockchain; (3) business process compliance and change; or (4) intelligent enterprise decision-making processes, e.g. using an AI-trained neural network to support the human decision of whether a control actuation is positive or negative, instead of relying only on human-based decisions.

Social implications

We understand business process operation as a dynamically stable system, where steady motion is achieved through the continuous imposition of actors' actions. Therefore, all work that contributes to the development of knowledge regarding actors' actions in their execution environment offers the ability to optimise and/or re-engineer business processes, delivering more social value or better social conditions.

Originality/value

To the best of our knowledge, this work is unique in that it integrates a set of concepts that is rarely, or never, combined. Table 3 corroborates this result.

Details

Business Process Management Journal, vol. 27 no. 1
Type: Research Article
ISSN: 1463-7154

Article
Publication date: 17 August 2021

Abir Al-Harrasi, Abdul Khalique Shaikh and Ali Al-Badi

Abstract

Purpose

One of the most important information security (IS) concerns nowadays is data theft or data leakage. To mitigate this type of risk, organisations use a solid infrastructure and deploy multiple layers of security protection technology and protocols such as firewalls, VPNs and IPsec. However, these technologies do not guarantee data protection, especially from insiders. Insider threat is a critical risk that can harm the organisation through data theft. The main purpose of this study was to investigate and identify the threats of data theft caused by insiders in organisations and to explore the efforts organisations make to control data leakage.

Design/methodology/approach

The study proposed a conceptual model to protect organisations’ data by preventing data theft by malicious insiders. The researchers conducted a comprehensive literature review to achieve the objectives of this study. The data collection is based on earlier studies conducted by several researchers between January 2011 and December 2020. All the selected literature comes from journal articles, conference articles and conference proceedings retrieved from various databases.

Findings

The study revealed three main findings: first, the main risks inherent in data theft are financial fraud, intellectual property theft and sabotage of IT infrastructure. Second, some organisations still do not consider data theft by insiders to be a severe risk that should be well controlled. Lastly, the main factors motivating insiders to perform data leakage activities are financial gain, lack of fairness and justice in the workplace, the psychology or characteristics of the insiders, new technologies, lack of education and awareness and lack of management tools for understanding insider threats.

Originality/value

The study provides a holistic view of data theft by insiders, focusing on the problem from an organisational point of view. Organisations can therefore take into consideration our recommendations to reduce the risks of data leakage by their employees.

Details

International Journal of Organizational Analysis, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1934-8835

Article
Publication date: 25 June 2021

Tran Khanh Dang and Thu Anh Duong

Abstract

Purpose

In the open data context, shared data can pass through many transformation processes and originate from many sources, which exposes the risk of non-authentic data. Moreover, each data set has different properties and is shared under various licenses, which means updated data can change its characteristics and related policies. This paper aims to introduce an effective and elastic solution to keep track of data changes and manage their characteristics within the open data platform. These changes have to be immutable to prevent unauthorised modification, and they can be used as certified provenance to improve the quality of the data.

Design/methodology/approach

This paper proposes a pragmatic solution that combines the Comprehensive Knowledge Archive Network (CKAN) – the most widely used open data platform – with the Hyperledger Fabric blockchain to ensure that all changes are immutable and transparent. Because it uses smart contracts plus a standard provenance data format, all processes run automatically and can be extended to integrate with other provenance systems, so the introduced solution is flexible enough to be used in different open data ecosystems and real-world application domains.

Findings

The research reviews related studies about provenance systems. This study finds that most of them focus on the commercial sector or apply to a specific domain and are not relevant for the open data sector. To show that the proposed solution is a logical and feasible direction, this paper implements an experimental sample to validate the result. The testing model runs successfully, with an elastic system architecture and promising overall performance.

Originality/value

Open data is the future of many businesses but still does not receive enough attention from the research community. The paper contributes a novel approach to protect the provenance of open data.

Details

International Journal of Web Information Systems, vol. 17 no. 5
Type: Research Article
ISSN: 1744-0084

Article
Publication date: 4 December 2017

Davy Preuveneers, Wouter Joosen and Elisabeth Ilie-Zudor

Abstract

Purpose

Industry 4.0 envisions a future of networked production where interconnected machines and business processes running in the cloud will communicate with one another to optimize production and enable more efficient and sustainable individualized/mass manufacturing. However, the openness and process transparency of networked production in hyperconnected manufacturing enterprises pose severe cyber-security threats and information security challenges that need to be dealt with. The paper aims to discuss these issues.

Design/methodology/approach

This paper presents a distributed trust model and middleware for collaborative and decentralized access control to guarantee data transparency, integrity, authenticity and authorization of dataflow-oriented Industry 4.0 processes.

Findings

The results of a performance study indicate that private blockchains are capable of securing IoT-enabled dataflow-oriented networked production processes across the trust boundaries of the Industry 4.0 manufacturing enterprise.

Originality/value

This paper contributes a decentralized identity and relationship management for users, sensors, actuators, gateways and cloud services to support processes that cross the trust boundaries of the manufacturing enterprise, while offering protection against malicious adversaries gaining unauthorized access to systems, services and information.

Details

Industrial Management & Data Systems, vol. 117 no. 10
Type: Research Article
ISSN: 0263-5577
