Search results
Abstract
This paper presents results from a study on Unauthorized Software Copying among a group of professional computer end users. The magnitude of the practice is measured. Attitudes of users towards the issue are identified.
Abstract
In order to succeed in an action under the Equal Pay Act 1970, should the woman and the man be employed by the same employer on like work at the same time or would the woman still be covered by the Act if she were employed on like work in succession to the man? This is the question which had to be solved in Macarthys Ltd v. Smith. Unfortunately it was not. Their Lordships interpreted the relevant section in different ways and since Article 119 of the Treaty of Rome was also subject to different interpretations, the case has been referred to the European Court of Justice.
Abstract
The new authorities created by this Act, probably the most important local government measure of the century, will be voted into existence during 1973 and commence functioning on 1st April 1974. Their responsibilities and the problems facing them are in many ways quite different and of greater complexity than those with which existing councils have had to cope. In its passage through the Lords, a number of amendments were made to the Act, but in the main, it is a scheme of reorganization originally produced after years of discussion and long sessions in the Commons. Local government reorganization in Scotland takes place one year later and for Northern Ireland, we must continue to wait and pray for a return of sanity.
Kenneth Ken Siong Lee and Umi Adzlin Silim
Abstract
Purpose
The purpose of this paper is to review the findings from an audit of the implementation of a consultation-liaison psychiatry (CLiP) database for all inpatients referred to a CLiP service at the largest hospital in Malaysia, with the aim of improving the quality of CLiP services.
Design/methodology/approach
All inpatient referrals to the CLiP team were recorded over a three-month period and compared to previous audit data from 2017. Four audit standards were assessed: the reporting of referrals, timeliness of response, indication of the reason for referral and presence of a management plan.
Findings
After interventions were conducted, compliance with reporting using the CLiP form rose to 70.1 per cent, compared with 28 per cent in the 2017 audit data. Analysis of the completed CLiP forms reveals that 89 per cent of referrals were seen within the same working day. All referrals included the reason for referral. The most common reason for referral was depressive disorders but, post-assessment, delirium was the most common diagnosis. In total, 87.8 per cent satisfied the audit criteria for a completed written care plan.
Originality/value
Specialised CLiP services are relatively new in Malaysia and this is the first paper to examine the quality of such services in the country. Interventions were effective in improving the compliance of reporting using the CLiP database. The findings suggest that the CLiP services are on par with international audit standards. Furthermore, data from this clinical audit can serve as a benchmark for the development of national operating policies in similar settings.
Khaled Medath Aldossari, Brian C. Lines, Jake B. Smithwick, Kristen C. Hurtado and Kenneth T. Sullivan
Abstract
Purpose
Although numerous studies have examined alternative project delivery methods (APDMs), most of these studies have focused on the relationship between these methods and improved project performance. Limited research identifies how to successfully add these methods within architectural, engineering and construction (AEC) organizations. The purpose of this paper is to identify organizational change management (OCM) practices that, when effectively executed, lead to increased success rates of adopting APDMs in owner AEC organizations.
Design/methodology/approach
Seven OCM practices were identified through a comprehensive literature review. Then, through a survey of 140 individuals at 98 AEC organizations, the relationships between OCM practices and organizational adoption of APDMs were established.
Findings
The findings indicate that the OCM practices with the strongest relationship to successful APDM adoption are a realistic timeframe, effective change agents, workload adjustments, senior-leadership commitment and sufficient change-related training.
Practical implications
Adopting APDMs can be extremely difficult and requires significant organizational change efforts to ensure the change is a success. Organizations that are implementing APDMs for the first time should consider applying the OCM practices that this study identifies as most related to successful APDM adoption.
Originality/value
This study contributes to the existing body of knowledge by identifying the OCM practices that are most significantly associated with successfully adopting APDMs.
Peter T. Coleman, Katharina G. Kugler, Kyong Mazzaro, Christianna Gozzi, Nora El Zokm and Kenneth Kressel
Abstract
Purpose
Research on conflict mediation presents a scattered, piecemeal understanding of what determines mediators' strategies and tactics and ultimately what constitutes successful mediation. This paper presents research on developing a unifying framework – the situated model of mediation – that identifies and integrates the most basic dimensions of mediation situations. These dimensions combine to determine differences in mediators' strategies that in turn influence mediation processes and outcomes.
Design/methodology/approach
The approach used by this paper was twofold. First, the existing empirical literature on factors that influence mediators' behaviors was reviewed. Based on the findings of this review, a survey study was conducted with experienced mediators to determine the most fundamental dimensions of mediation situations affecting mediators' behaviors and mediation processes and outcomes. The data were analyzed through exploratory factor analysis and regression analysis.
Findings
The results of the study show that four of the most fundamental dimensions of mediation situations include: low vs high intensity of the conflict, cooperative vs competitive relationship between the parties, tight vs flexible context and overt vs covert processes and issues. Each of these factors was found to independently predict differences in mediators’ behaviors and perceptions of processes and outcomes. These dimensions are then combined to constitute the basic dimensions of the situated model of mediation.
Originality/value
The situated model of mediation is both heuristic and generative, and it shows how a minimal number of factors are sufficient to capture the complexity of conflict mediation in a wide range of contexts.
Abstract
The Food Standards Committee's lengthy review of this group of foods, agreed by all public analysts and enforcement officers to be the most complicated and difficult of those subject to detailed legislative control, is at last complete, and the Committee's findings are set out in their Report. When in 1975 they were requested to investigate the workings of the legislation, the problems of control were already apparent and getting worse. The trilogy of Regulations of 1967 seemed comprehensive at the time, perhaps, as we ventured to suggest, a little too comprehensive for a rational system of control, for arguments on the meat contents of different products, descriptions and interpretation generally quickly appeared. The system, for all its detail, provided too many loopholes through which manufacturers drove the proverbial “carriage and pair”.

As meat products have increased in range and the price of meat, the “major ingredient”, has constantly risen, the number of samples taken for analysis has grown and now usually constitutes about one-quarter of the total for the year, with sausages, prepared meats (pies, pasties) and, most recently, minced meat predominating. Just as serial sampling and analysis of sausages before the 1967 Regulations were pleaded in courts to establish usage in the matter of meat content, so with minced meat the same methods are being used to establish a maximum fat content usage.

What concerns food law enforcement agencies is that, despite the years that the standards imposed by the 1967 Regulations have been in force, the number of infringements shows no sign of reduction. This should not really surprise us; there are even longer histories of failure to comply, e.g. in the use of preservatives, which have been controlled since 1925. What a number of public analysts have christened the “beefburger saga” took its rise post-1967 and shows every indication of continuing into the distant future.
Manufacturers appear to be trying numerous ploys to reduce the meat content below the Regulations' 80 per cent standard, mainly by giving their products new names. Each year, public analysts report an influx of new names and ingenious defences, e.g. “caterburgers” and similar concocted nomenclature, and the defence that because the name does not incorporate a meat, the product is outside the statutory standard.
Abstract
This chapter presents a multi-criteria portfolio model with the expected return as a performance measure and the expected worst-case return as a risk measure. The problems are formulated as a single-objective linear program, as a bi-objective linear program, and as a triple-objective mixed integer program. The problem objective is to allocate the wealth on different securities to optimize the portfolio return. The portfolio approach has allowed the two popular financial engineering percentile measures of risk, value-at-risk (VaR) and conditional value-at-risk (CVaR) to be applied. The decision-maker can assess the value of portfolio return, the risk level, and the number of assets, and can decide how to invest in a real-life situation comparing with ideal (optimal) portfolio solutions. The concave efficient frontiers illustrate the trade-off between the conditional value-at-risk and the expected return of the portfolio. Numerical examples based on historical daily input data from the Warsaw Stock Exchange are presented and selected computational results are provided. The computational experiments prove that both proposed linear and mixed integer programming approaches provide the decision-maker with a simple tool for evaluating the relationship between the expected and the worst-case portfolio return.
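The single-objective linear formulation described above can be illustrated in a few lines. The sketch below is a minimal Rockafellar–Uryasev-style CVaR linear program solved with `scipy.optimize.linprog`; the simulated return matrix, confidence level and target return are illustrative assumptions, not the chapter's Warsaw Stock Exchange data.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
T, n = 250, 4                              # scenarios (days) x assets
R = rng.normal(0.0005, 0.01, (T, n))       # simulated daily returns
mu = R.mean(axis=0)                        # expected return per asset
alpha = 0.95                               # CVaR confidence level
target = mu.mean()                         # required expected portfolio return

# Decision vector z = [x (n weights), eta (VaR level), u (T shortfalls)];
# minimize eta + 1/((1-alpha)T) * sum(u), the CVaR of portfolio losses.
c = np.concatenate([np.zeros(n), [1.0], np.full(T, 1.0 / ((1 - alpha) * T))])

# Shortfall constraints: u_t >= -r_t.x - eta  =>  -r_t.x - eta - u_t <= 0
A_ub = np.hstack([-R, -np.ones((T, 1)), -np.eye(T)])
b_ub = np.zeros(T)
# Expected-return constraint: mu.x >= target
A_ub = np.vstack([A_ub, np.concatenate([-mu, [0.0], np.zeros(T)])])
b_ub = np.append(b_ub, -target)

# Budget constraint: weights sum to one, no short selling
A_eq = np.concatenate([np.ones(n), [0.0], np.zeros(T)]).reshape(1, -1)
b_eq = np.array([1.0])
bounds = [(0, None)] * n + [(None, None)] + [(0, None)] * T

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
weights, cvar = res.x[:n], res.fun
print("weights:", np.round(weights, 3), "CVaR:", round(cvar, 5))
```

Sweeping the `target` parameter and re-solving traces out the concave CVaR/expected-return efficient frontier that the chapter discusses.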
Kritika Nagdev, Anupama Rajesh and Richa Misra
Abstract
Purpose
The purpose of this paper is to explore the mediating role of demonetisation in the usage of IT-enabled banking services (ITeBS). The study extends the theory of technology readiness (TR) (Parasuraman and Colby, 2015) by incorporating the behavioural intention and actual usage of ITeBS.
Design/methodology/approach
Based on the theory of TR and encompassing the impact of demonetisation, the study examines the functional relationship of TR, behavioural intention and actual usage. Structural equation modelling and mediation analysis are applied on a data set of 474 usable responses.
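The mediation logic tested above can be sketched with a simple product-of-coefficients check; the study itself used structural equation modelling, and the synthetic scores and path coefficients below are illustrative assumptions only.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 474                                      # matches the study's sample size
tr = rng.normal(0, 1, n)                     # technology readiness score
mediator = 0.6 * tr + rng.normal(0, 1, n)    # demonetisation-related variable
usage = 0.5 * mediator + rng.normal(0, 1, n) # usage driven only via mediator

def slope(x, y):
    """OLS slope of y on x (with intercept)."""
    X = np.column_stack([np.ones(len(x)), x])
    return np.linalg.lstsq(X, y, rcond=None)[0][1]

a = slope(tr, mediator)                      # path X -> M
# Path M -> Y controlling for X (full mediation => direct path near zero)
XM = np.column_stack([np.ones(n), tr, mediator])
b = np.linalg.lstsq(XM, usage, rcond=None)[0][2]
indirect = a * b                             # mediated effect of TR on usage
print("indirect effect:", round(indirect, 3))
```

In the full-mediation scenario simulated here, the indirect path `a * b` carries essentially the whole effect, mirroring the paper's finding that demonetisation fully mediates the TR–usage relationship.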
Findings
The study confirms that TR is a significant factor in customer’s intention to use ITeBS. The demonetisation variable fully mediates the relationship model, which implies a significant finding in the consumer acceptance literature.
Practical implications
The results of this study suggest three major implications. First, banks should focus on providing a simple and user-friendly ITeBS interface and uninterrupted access to it. Second, it is necessary to educate customers by giving them a trial of the service. Third, social media platforms may be utilised as an effective and efficient tool to resolve customer complaints.
Originality/value
This study is among the first attempts to investigate a government's digital push in the technology adoption literature. The results indicate a significant influence of demonetisation on the usage of ITeBS.
Michael J. McCord, Sean MacIntyre, Paul Bidanset, Daniel Lo and Peadar Davis
Abstract
Purpose
Air quality, noise and proximity to urban infrastructure can arguably have an important impact on the quality of life. Environmental quality (the price of good health) has become a central tenet for consumer choice in urban locales when deciding on a residential neighbourhood. Unlike the market for most tangible goods, the market for environmental quality does not yield an observable per unit price effect. As no explicit price exists for a unit of environmental quality, this paper aims to use the housing market to derive its implicit price and test whether these constituent elements of health and well-being are indeed capitalised into property prices and thus implicitly priced in the market place.
Design/methodology/approach
A considerable number of studies have used hedonic pricing models incorporating spatial effects to assess the impact of air quality, noise and proximity to noise pollutants on property market pricing. This study presents a spatial analysis of air quality and noise pollution and their association with house prices, using 2,501 sale transactions from 2013. To assess the impact of the pollutants, three different spatial modelling approaches are used, namely, ordinary least squares with spatial dummies, a geographically weighted regression (GWR) and a spatial lag model (SLM).
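The first of the three approaches, ordinary least squares with spatial dummies, can be sketched as a log-price hedonic regression. All variable names and the synthetic data below are illustrative assumptions, not the study's transaction data; the point is only how zone dummies and pollutant levels enter the design matrix.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
floor_area = rng.normal(110, 25, n)          # structural attribute (m^2)
no2 = rng.normal(22, 5, n)                   # air-quality proxy (ug/m^3)
noise = rng.normal(55, 8, n)                 # road-noise level (dB)
zone = rng.integers(0, 4, n)                 # four urban sub-areas

# Log price generated with negative pollutant effects for illustration
log_price = (11.0 + 0.004 * floor_area - 0.008 * no2 - 0.003 * noise
             + 0.05 * zone + rng.normal(0, 0.1, n))

# Design matrix: intercept, attributes, and zone dummies (zone 0 as base)
dummies = (zone[:, None] == np.arange(1, 4)).astype(float)
X = np.column_stack([np.ones(n), floor_area, no2, noise, dummies])
beta, *_ = np.linalg.lstsq(X, log_price, rcond=None)

# beta[2] is the implicit (semi-elasticity) price of a unit of NO2
print("implicit NO2 effect on log price:", round(beta[2], 4))
```

The fitted pollutant coefficients are the "implicit prices" the paper refers to; GWR and the spatial lag model then relax the assumption that these coefficients are constant across the urban area.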
Findings
The findings suggest that air quality pollutants have an adverse impact on house prices, which fluctuate across the urban area. The analysis suggests that the noise level does matter, although this varies significantly over the urban setting and varies by source.
Originality/value
Air quality and environmental noise pollution are important concerns for health and well-being. Noise impact seems to depend not only on the noise intensity to which dwellings are exposed but also on the nature of the noise source. This may suggest the presence of other externalities that arouse social aversion. This research presents an original study utilising advanced spatial modelling approaches. The research has value in further understanding the market impact of environmental factors and in providing findings to support local air zone management strategies, noise abatement and management strategies and is of value to the wider urban planning and public health disciplines.