Search results
1 – 10 of over 23,000

Antonio Nesticò and Gabriella Maselli
Abstract
Purpose
The purpose of the paper is to characterize an evaluation protocol of the social discount rate (SDR). This is based on the social rate of time preference (SRTP) principles, according to which the investment selection process must tend to maximize the utility of the community.
Design/methodology/approach
The theoretical reference of the evaluation protocol is the Ramsey formula, which is widely used in countries with advanced economies to estimate the SRTP through the maximization of the Social Welfare Function (SWF).
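In its standard form (the notation here follows the common convention and may differ from the paper's), the Ramsey formula states:

```latex
\mathrm{SRTP} = \rho + \mu \, g
```

where $\rho$ is the pure rate of time preference, $\mu$ is the elasticity of the marginal utility of consumption, and $g$ is the growth rate of per-capita consumption.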
Findings
The protocol structure and the protocol applications to the Italian and US economies explain how the SDR value is influenced by the socio-economic structure of the single nation.
Research limitations/implications
The strong variability of SDR estimates, depending on the theoretical approach of reference and the operating path followed, can lead to decidedly divergent judgments on the acceptability of a public project; hence the important policy implications for the entire allocation process of public resources.
Practical implications
The applications highlight important operational problems that must be resolved regarding the choice of the time intervals of the evaluations, as well as the logical-operational tools to be used to estimate the parameters.
Social implications
They are relevant in relation to the effects of a more equitable allocation of the resources.
Originality/value
The protocol for the SDR estimation is based both on solid disciplinary principles and on objective data of non-complex availability and representative of the economic and socio-demographic context of the country in which the decision-making process is implemented.
Fatos Xhafa, Leonard Barolli, Raul Fernández, Thanasis Daradoumis and Santi Caballé
Abstract
Purpose
In any distributed application, the communication between the distributed processes/nodes of the distributed systems is essential for both reliability and efficiency matters. The purpose of this paper is to address this issue for distributed applications based on JXTA protocols aiming at extending and evaluating the protocols of the JXTA library for reliable P2P computing.
Design/methodology/approach
After a careful examination of the current version of the JXTA protocols, the need was observed to improve the original protocols, such as the pipe services, to ensure reliable communication between nodes of the grid platform, and the discovery and presence services, to increase the performance of applications. The approach is based on a mixed P2P network architecture of broker peers and client peers, which served as the basis for extending the JXTA protocols.
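At its core, the reliability extension described above amounts to acknowledgement-and-retransmission over unreliable pipes. A minimal sketch of that pattern follows; the pipe interface and class names here are illustrative assumptions, not the actual JXTA pipe API:

```python
import time

class FlakyPipe:
    """Toy stand-in for a message pipe that drops the first send.
    Illustrative only -- not the actual JXTA pipe service."""
    def __init__(self, drops=1):
        self.sends = 0
        self.drops = drops

    def send(self, message):
        self.sends += 1          # message goes out (maybe lost)

    def recv_ack(self, timeout):
        # ACK arrives only once the initial drops are exhausted
        return self.sends > self.drops

def send_reliable(pipe, message, max_retries=3, backoff=0.01):
    """Retransmit until acknowledged, with linear backoff.
    Returns the attempt number on which delivery succeeded."""
    for attempt in range(1, max_retries + 1):
        pipe.send(message)
        if pipe.recv_ack(timeout=backoff):
            return attempt
        time.sleep(backoff * attempt)   # wait longer each retry
    raise TimeoutError("no ACK after retries")
```

A reliable pipe wrapper of this shape lets the application layer treat delivery as guaranteed up to the retry budget, which is the property the extended protocols aim to provide.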
Findings
The original JXTA protocols are extended/re‐implemented to support the development of reliable P2P distributed applications.
Practical implications
The proposed approach has been validated in practice by deploying a P2P network using nodes of PlanetLab platform and testing each of the re‐implemented protocols using this real P2P network. The extended JXTA protocols can be used to develop reliable P2P distributed applications.
Originality/value
The paper is of value in showing how to improve both the efficiency and the reliability of JXTA protocols and services.
Shubangini Patil and Rekha Patil
Abstract
Purpose
Until now, a great deal of research has been done and applied to secure data transfer from one user to another, such as third-party auditing and several schemes for securing the data, such as key generation with the help of encryption algorithms like Rivest–Shamir–Adleman (RSA). Some of the related prior work is as follows. The remote damage control resuscitation (RDCR) scheme by Yan et al. (2017) is based on minimum bandwidth and enables a third party to perform public integrity verification. Although it supports repair management for corrupted data and tries to recover the original data, in practice it fails to do so, and it therefore incurs more computation and communication cost than the proposed system. In a paper by Chen et al. (2015), an idea for cloud storage data sharing was developed using broadcast encryption. This technique aims to accomplish both broadcast data and dynamic sharing, allowing users to join and leave a group without affecting the electronic press kit (EPK). The theoretical notion was sound and new, but the system's practicality and efficiency were not acceptable, and its security was also jeopardised because it proposed adding a member without altering any keys. Another work investigated an identity-based encryption strategy for data sharing, together with key management and metadata techniques to improve model security (Jiang and Guo, 2017). It supplies forward and reverse ciphertext security; however, it is more difficult to put into practice, and one of its limitations is that it can only be used for very large amounts of cloud storage. It extends support for dynamic data modification through batch auditing.
An important feature of the secure and efficient privacy-preserving provable data possession scheme for cloud storage was its support for every important feature, including data dynamics, privacy preservation, batch auditing and blocker verification, in an untrusted and outsourced storage model (Pathare and Chouragade, 2017). A homomorphic signature mechanism, based on a new id, was devised to avoid the use of public key certificates. This signature system was shown to be resistant to the id attack in the random oracle model and to forged-message attacks (Nayak and Tripathy, 2018; Lin et al., 2017). When storing data in a public cloud, one issue is that the data owner must give an enormous number of keys to the users in order for them to access the files. This is where the key-aggregate searchable encryption (KASE) scheme was first unveiled: while sharing a huge number of documents, the data owner simply has to supply a single key to the user, and the user only needs to provide a single trapdoor. Although the concept is innovative, the KASE technique does not apply to the increasingly common manufactured cloud. Cui et al. (2016) claim that as the amount of data grows, the distribution management system (DMS) will be unable to handle it. As a result, various provable data possession (PDP) schemes have been developed, yet practically all of them lack security. Hence a certificate-based PDP built on bilinear pairing was introduced; being both robust and efficient, it is mostly applicable in DMS. The main purpose of this research is to design and implement a secure cloud infrastructure for sharing group data. This research provides an efficient and secure protocol for multiple-user data in the cloud, allowing many users to easily share data.
Design/methodology/approach
The methodology and contribution of this paper are as follows. The major goal of this study is to design and implement a secure cloud infrastructure for sharing group data, providing an efficient and secure protocol for multiple-user data in the cloud that allows several users to share data without difficulty. The selection scheme design (SSD) comprises two algorithms: algorithm 1 is designed for a limited number of users, and algorithm 2 is redesigned for multiple users. Further, the authors design the SSD-security protocol, which comprises a three-phase model: Phase 1 generates the parameters and distributes the private keys, Phase 2 generates the general key for all available users, and Phase 3 is designed to prevent dishonest users from taking part in data sharing.
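The three-phase structure can be sketched as follows. This is an illustrative toy model only: the function names, the modular arithmetic and the parameters are assumptions for exposition, not the paper's actual pairing-based protocol.

```python
import secrets

# Toy public parameters: a prime modulus P and generator G.
# The real scheme uses bilinear pairings (PBC library); this sketch
# only mirrors the three-phase structure, and is NOT secure.
P, G = 2_147_483_647, 5

def phase1_setup(user_ids):
    """Phase 1: generate system parameters and a private key per user."""
    return {uid: secrets.randbelow(P - 2) + 1 for uid in user_ids}

def phase2_group_key(private_keys):
    """Phase 2: derive one general key from all users' contributions
    (here, the product of public shares g^x mod p -- illustrative only)."""
    key = 1
    for x in private_keys.values():
        key = (key * pow(G, x, P)) % P
    return key

def phase3_exclude(private_keys, dishonest):
    """Phase 3: evict dishonest users and re-derive the group key so an
    evicted party can no longer take part in data sharing."""
    honest = {u: x for u, x in private_keys.items() if u not in dishonest}
    return phase2_group_key(honest)
```

Re-deriving the key in Phase 3 means any ciphertext produced after eviction is unreadable with the old key, which is what keeps the dishonest user out of further sharing.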
Findings
Data sharing in cloud computing provides unlimited computational resources and storage to enterprises and individuals; at the same time, cloud computing raises several privacy and security concerns, such as fault tolerance, reliability, confidentiality and data integrity. Furthermore, the key consensus mechanism is a fundamental cryptographic primitive for secure communication; motivated by this, the authors developed the SSD mechanism, which embraces multiple users in the data-sharing model.
Originality/value
Files shared in the cloud should be encrypted for security purposes; these files are later decrypted for users to access. Furthermore, the key consensus process is a crucial cryptographic primitive for secure communication, and the authors devised the SSD mechanism, which incorporates numerous users in the data-sharing model. For the evaluation of the SSD method, the authors considered an ideal system environment: Java was used as the programming language and Eclipse as the integrated development environment for the proposed model evaluation. The hardware configuration comprises 4 GB RAM and an i7 processor, and the PBC library is used for the pairing operations (PBC Library, 2022). Furthermore, in the following section of this paper, the number of users is varied for comparison with the existing RDIC methodology (Li et al., 2020). For the purposes of the SSD-security protocol, a prime number is chosen as the number of users in this work.
John A. Bourke, Deborah L. Snell, K. Anne Sinnott and Bernadette Cassidy
Abstract
Purpose
Disabled people who are the end‐users (EU) of health services have a poor record of inclusion, yet a major stake in the quality of scientific research that informs the development of health knowledge and interventions. In traditional rehabilitation research it has been the researcher who sets the agenda, including determining the research question, study design and methods, and who controls dissemination of findings. This paper aims to describe the development of an EU research consultation committee and to describe the evaluation protocol used to assess the effectiveness of the committee.
Design/methodology/approach
The paper describes the context and development of an EU research consultation committee (the committee) to promote collaboration between researchers and lay‐EUs within a research organization in New Zealand. It also describes the qualitative evaluation protocol to be used to assess the effectiveness of the committee over the first 12 months of operation in order to refine its process and procedures.
Findings
The paper discusses the issues and challenges involved in achieving collaboration between researchers and EUs in the rehabilitation research space and describes this consultation model as a positive example of making inclusion a reality. Challenges include building research capacity within the EU community and development of real models of collaboration and partnership in rehabilitation research.
Originality/value
It is argued that the integrity and relevance of clinical research is enhanced by the involvement of EUs in all aspects of the research process.
Abstract
Many universities are currently investing significant sums of money into refurbishing existing learning spaces and/or building further infrastructure (including Next Generation Learning Spaces (NGLS)) to support learning and teaching in the face-to-face context. While this is usually welcomed by staff and students, there is often a concern that designs are not informed by input from appropriate stakeholders.
This chapter brings together information from a range of sources to provide practical ideas and advice on designing robust, whole-of-lifecycle evaluations for learning space projects. By incorporating pre- and post-occupancy stages, involving a wide array of stakeholders and looking beyond surveys and focus groups as evaluation techniques, universities can ensure that future designs take into consideration the experiences and context of staff and students at the institution as well as lessons learned from previous projects.
Cheryl L. Holt, Theresa A. Wynn, Ivey Lewis, Mark S. Litaker, Sanford Jeames, Francine Huckaby, Leonardo Stroud, Penny L. Southward, Virgil Simons, Crystal Lee, Louis Ross and Theodies Mitchell
Abstract
Purpose
Prostate and colorectal cancer (CRC) rates are disproportionately high among African‐American men. The purpose of this paper is to describe the development of an intervention in which barbers were trained to educate clients about early detection for prostate and CRC.
Design/methodology/approach
Working with an advisory panel of local barbers, cancer survivors and clients, educational materials are developed and pilot tested through use of focus groups and cognitive response interviews.
Findings
The advisory panel, focus groups and interviews provide key recommendations for core content, intervention structure and evaluation strategies. The men suggest a variety of things they want to know about prostate cancer; however, the perceived need for CRC information is much broader, suggesting a knowledge gap. The men prefer print materials that are brief, use graphics of real African‐American men and provide a telephone number they can call for additional information.
Research limitations/implications
Community involvement is key in developing a well‐accepted and culturally‐relevant intervention.
Originality/value
The paper usefully describes the process of developing and pilot testing educational materials for use in an intervention in which barbers would be trained as community health advisors, to educate their clients about CRC screening and informed decision making for prostate cancer screening.
Abstract
Objective – The first wave of experiences of exemptions policies suggested that poverty-based exemptions, using individual targeting, were not effective, for practical and political economic reasons. In response, many countries have changed their approach in recent years – while maintaining user fees as a necessary source of revenue for facilities, they have been switching to categorical targeting, offering exemptions based on high-priority services or population groups. This chapter aims to examine the impact and conditions for effectiveness of this recent health finance modality.
Methodology/approach – The chapter is based on a literature review and on data from two complex evaluations of national fee exemption policies for delivery care in West Africa (Ghana and Senegal). A conceptual framework for analysing the impact of exemption policies is developed and used. Although the analysis focuses on exemption for deliveries, the framework and findings are likely to be generalisable to other service- or population-based exemptions.
Findings – The chapter presents background information on the nature of delivery exemptions, the drivers for their use, their scale and common modalities in low-income countries. It then looks at evidence of their impact, on utilisation, quality of care and equity and investigates their cost-effectiveness. The final section presents lessons on implementation and implications for policy-makers, including the acceptability and sustainability of exemptions and how they compare to other possible mechanisms.
Implications for policy – The chapter concludes that funded service- or group-based exemptions offer a simple, potentially effective route to mitigating inequity and inefficiency in the health systems of low-income countries. However, there are a number of key constraints. One is the fungibility of resources at health facility level. The second is the difficulty of sustaining a separate funding stream over the medium to long term. The third is the arbitrary basis for selecting high-priority services for exemption. The chapter therefore concludes that this financing mode is unstable and is likely to be transitional.
Ester Lisnati Jayadi and Helena Forslund
Abstract
Purpose
This study aims to explore how to apply and integrate the performance management (PM) process in humanitarian supply chains (HSCs) among and between humanitarian organizations (HOs) and donors so as to improve cost-efficiency (CE) and lead-time effectiveness (LTE) in the stage of natural disaster preparedness.
Design/methodology/approach
This study adapts and operationalizes a framework for the PM process used in commercial supply chains to assess HSCs. A multiple-case study with two types of actors – six HOs and three donors – is used to describe the applications of the PM process and analyze the level of integration between the actors.
Findings
The activities in the PM process could sometimes be only vaguely described. Both actors emphasized improving CE, with less emphasis on LTE. Both actors have a low level of integration in each PM process activity, decreasing the CE and LTE. Therefore, guidelines for improving the level of PM process integration are provided.
Research limitations/implications
To the best of the authors’ knowledge, this study is one of the first to combine literature on HSCs and PM process integration, thereby contributing to both literature fields. The concrete contribution of this study is a framework for PM process application and integration among and between HOs and donors.
Practical implications
The PM process framework can be used to assess PM process application, as well as current and increased level of integration, to improve CE and LTE. The current applications can also inspire other HOs and donors.
Originality/value
Previous studies indicate the lack of frameworks in the PM domain of HSCs, especially in the stage of natural disaster preparedness.
Jiangmei Chen, Wende Zhang and Qishan Zhang
Abstract
Purpose
The purpose of the paper is to improve the rating prediction accuracy in recommender systems (RSs) by metric learning (ML) method. The similarity metric of user and item is calculated with gray relational analysis.
Design/methodology/approach
First, the potential features of users and items are captured by exploiting ML, such that the rating prediction can be performed. In metric space, the user and item positions can be learned by training their embedding vectors. Second, instead of the traditional distance measurements, the gray relational analysis is employed in the evaluation of the position similarity between user and item, because the latter can reduce the impact of data sparsity and further explore the rating data correlation. On the basis of the above improvements, a new rating prediction algorithm is proposed. Experiments are implemented to validate the effectiveness of the algorithm.
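The gray relational step can be made concrete with the standard single-pair formulation: per-dimension absolute differences between two embedding vectors, then the relational coefficient averaged over dimensions. This is a minimal sketch; the function name and the conventional distinguishing coefficient of 0.5 are assumptions, not necessarily the paper's exact setup:

```python
def gray_relational_grade(user_vec, item_vec, rho=0.5):
    """Gray relational grade between a user and an item embedding.

    xi(k) = (d_min + rho * d_max) / (d(k) + rho * d_max), where
    d(k) = |user_vec[k] - item_vec[k]|; the grade is the mean of
    the coefficients. rho is the distinguishing coefficient.
    """
    deltas = [abs(u - v) for u, v in zip(user_vec, item_vec)]
    d_min, d_max = min(deltas), max(deltas)
    if d_max == 0:                 # identical positions -> maximal relation
        return 1.0
    coeffs = [(d_min + rho * d_max) / (d + rho * d_max) for d in deltas]
    return sum(coeffs) / len(coeffs)
```

Because the coefficient is bounded in (0, 1] regardless of the raw distance scale, sparse or noisy dimensions distort the similarity less than a plain Euclidean distance would, which is the motivation the abstract gives for preferring it.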
Findings
The novel algorithm is evaluated by the extensive experiments on two real-world datasets. Experimental results demonstrate that the proposed model achieves remarkable performance on the rating prediction task.
Practical implications
The rating prediction algorithm is adopted to predict users' preferences and thereby provide personalized recommendations. The method can also be extended to the field of classification, where it offers further potential.
Originality/value
The algorithm can uncover finer-grained preferences through ML. Furthermore, similarity can be measured using gray relational analysis, which mitigates the limitation of data sparsity.