Healthcare service quality: a methodology for servicescape re-design using Taguchi approach

Rejikumar G. (Department of Management, Amrita Vishwa Vidyapeetham, Kochi, India)
Aswathy Asokan Ajitha (Department of Management, Indian Institute of Technology Madras, Chennai, India)
Malavika S. Nair (Department of Industrial Engineering, College of Engineering Trivandrum, Thiruvananthapuram, India)
Raja Sreedharan V. (Department of Management, Amrita Vishwa Vidyapeetham, Kochi, India)

The TQM Journal

ISSN: 1754-2731

Publication date: 8 July 2019



Purpose

The purpose of this paper is to identify major healthcare service quality (HSQ) dimensions, their most preferred service levels and their effect on the HSQ perceptions of patients using a Taguchi experiment.


Design/methodology/approach

This study adopted a sequential incidence technique to identify factors relevant to HSQ and examined the relative importance of different factor levels in the service journey using a Taguchi experiment.


Findings

For HSQ, the optimum factor levels are an online appointment booking facility with provision to review and modify appointments; a separate reception for booked patients; provision to meet the doctor of choice; prior detailing of procedures; a doctor-on-call facility to the room of stay; electronic sharing of the discharge summary; and an online payment facility. The consultation phase, followed by the stay and then the procedures, has the maximum effect on the S/N and mean responses of patients. The appointment stage has the maximum effect on standard deviations.

Research limitations/implications

Theoretically, this study attempted to address the dearth of research on service settings using robust methodologies like Taguchi experiment, which is popular in the manufacturing sector. The study implies the need for patient-centric initiatives for better HSQ through periodic experiments that inform about the changing priorities of patients.

Practical implications

The trade-off between standardization and customization creates challenges in healthcare. Practically, a classification of processes based on standardization vs customization potential is useful to revamp processes for HSQ.


Originality/value

This study applied the Taguchi approach to gain insights for re-designing patient-centric healthcare servicescapes.



G., R., Asokan Ajitha, A., Nair, M.S. and Sreedharan V., R. (2019), "Healthcare service quality: a methodology for servicescape re-design using Taguchi approach", The TQM Journal, Vol. 31 No. 4, pp. 600-619.



Emerald Publishing Limited

Copyright © 2019, Emerald Publishing Limited


Effectual functioning of service organizations requires service quality that enriches customer satisfaction (Gill, 2009) and brings competitiveness to firms (Wali and Nwokah, 2018). Thus, service quality management is an integral part of strategy formulation. In healthcare, medical care is delivered through multiple processes by different healthcare professionals. The cost of poor service quality is high in the healthcare industry (Berwick et al., 2003) since it directly deals with human health and bears accountability for human survival. Defining healthcare service quality (HSQ) is difficult because multiple personnel, such as physicians, nurses, pharmacists, technicians, administrative staff and many others with distinctive quality perceptions, are involved in service processes. Hence, a common definition of quality in alignment with each stakeholder’s perspective is difficult to arrive at (Farr and Cressey, 2015). Today, healthcare firms adopt various innovative approaches, such as total quality management, Six Sigma and lean production, to define, measure, analyze, improve and control service quality (Black and Revere, 2006; Dahlgaard and Mi Dahlgaard-Park, 2006; Dahlgaard et al., 2011; Gonzalez, 2019). However, many features of the service industry have curtailed the application of quality management models for service quality improvement in healthcare (Chakrabarty and Chuan Tan, 2007). The complexity of defining HSQ and the absence of consensus on the dimensions used to measure it have considerably limited the application of these methodologies (Andersson et al., 2006; Joosten et al., 2009; Young and McClean, 2009).

In services, customers and service providers jointly take part in quality creation. In healthcare, the primary focus is on patients (Owusu-Frimpong et al., 2010), and their perspectives decide the “meaningful indicator of health services quality” (O’Connor et al., 2003). However, errors in their assessment of HSQ are possible because of their limited knowledge of medical procedures, diagnosis, treatment, etc. (Zabada et al., 1998). To improve the HSQ perceptions of patients, better awareness of processes, voluntary compliance and overall cooperation are essential. But, many times, the information supplied to patients is restricted due to the nature of the illness or to protect their confidence to undergo risky procedures. Such an information deficiency adversely affects the active involvement of patients in the servuction process and negatively influences quality perceptions. Besides, service personnel in healthcare fall into two types: clinical staff (doctors, nurses, etc.) and support staff (technicians, administration, etc.) who provide supplementary services (Chilgren, 2008). Several studies highlight the role of doctors and nurses in HSQ perceptions, as they are more closely associated with care, medical and counseling support (Hudelson et al., 2008; Shafei et al., 2015; Jorgensen, 2019).

Services have relatively little tangible evidence, open production in the presence of customers, a perishable nature and the possibility of variations due to the behavioral components of service production (Parasuraman et al., 1985). The definition of quality varies with the approach adopted: transcendent, product-based, user-based, manufacturing-based and value-based (Garvin, 1984). In the transcendent view, quality is an innate excellence recognized only through experience. In the product-based view, quantification and measurement are possible through objective assessments. In the user-based view, quality is the user’s feeling about the capability of products to satisfy their needs. In the manufacturing-based view, quality means conformance to specifications. In the value-based view, the overall benefits perceived relative to cost and effort decide quality. The most popular SERVQUAL model takes the user perspective, where service quality is the “difference between perception and expectations of customers about service received” (Gupta et al., 2005). In another view, service quality is conformance to specifications that meet or exceed expectations (Reeves and Bednar, 1994) about sub-dimensions such as reliability, assurance, tangibility, empathy and responsiveness (Kyoon Yoo and Ah Park, 2007). To overcome the complexity of operationalizing expectations, the SERVPERF scale (Cronin and Taylor, 1992) used only perceptions to measure service quality. Both objective and subjective components contribute to service quality: tangibles help in objective evaluations, whereas heterogeneous intangible dimensions evoke subjectivity in evaluations (Deb and Ahmed, 2019).

Traditionally, customer perceptions about selected quality attributes, measured on a scale and examined for reliability and validity, help in estimating the relative performance of attributes and overall service quality. However, a major issue in using ordered categorical scales is the overdependence on mean scores of attributes, overlooking the importance of variance in responses (Lee et al., 2008). Many researchers consider “variance” a better estimator of quality (Taguchi et al., 2005; Yang et al., 2011; Ho et al., 2014). When the means of attributes do not differ significantly, the one with the lesser variance better captures general quality perceptions. Therefore, it is ideal to consider both mean and variance in the assessment of quality. An ideal service process should respond favorably to customers’ expectations and should minimize variability in process performance due to the heterogeneity (noise) inherent in services (Raajpoot et al., 2008). The noise factors in a service process are beyond the control of the service provider, but by creating an optimum level for the quality dimensions (control factors) within the provider’s control, service quality increases. The selection of the ideal level that appeals to customers from the available alternatives is therefore critical in HSQ. A Taguchi experiment is a methodology that uses the ratio between the mean and variance (the signal-to-noise (S/N) ratio) to examine the relative importance of the factor levels created for better performance. The insights from a Taguchi experiment inform about the combination of different factors and their proper settings (D’Ambra et al., 2018) to get the best results that offer higher levels of HSQ in a healthcare context.

Against this backdrop, the objectives of this study are: first, to review the extant literature to identify the major dimensions used in measuring HSQ; second, to identify the important factors contributing to HSQ as perceived by patients; and third, to understand the most preferred service levels of the identified factors that are critical to HSQ by performing a Taguchi experiment that uses the S/N ratio to compare the effect of factors on HSQ.

Literature review

Services are intangible, heterogeneous, perishable and inseparable (Parasuraman et al., 1985), and quality perceptions about services emanate from multidimensional perspectives (Yarimoglu, 2014). In SERVQUAL, service quality is the gap between the expectations of customers and their assessment of the actual performance of the service by the service provider. The SERVQUAL instrument used 22 items related to tangibles, reliability, responsiveness, assurance and empathy for measuring service quality. Many researchers proposed modifications to the SERVQUAL model. Miranda et al. (2010) adjusted SERVQUAL to develop HEALTHQUAL using the dimensions: health staff (skills in communication, attention to patients’ problems, interest in solving patients’ problems, professionalism, understanding of patients’ problems, etc.); efficiency (level of bureaucracy, waiting times, speed of diagnosis, complaints resolution, time focused on each patient and adherence to time schedules); non-health staff (professionalism, kindness and politeness, attention to patients’ problems, interest in problem solving, etc.); and facilities (cleanliness, equipment, location, etc.). Another prominent technical-functional framework for HSQ is the 5Qs model (Zineldin, 2006), with the dimensions quality of the object, quality of infrastructure, quality of the process, quality of interaction and quality of atmosphere. SERVQUAL and the 5Qs model share some common features, but the 5Qs model is more inclusive and incorporates essential dimensions such as infrastructure, atmosphere and the interaction between patients and healthcare staff.

Service quality dimensions mostly capture the quality of the technical and functional aspects of the service delivery process (Grönroos, 1984). In healthcare, the “milieu, manner, and behavior of the healthcare professional in delivering care to and communicating with patients” (Zineldin, 2006) explain the functional component. These aspects, referred to as interaction quality, capture the quality of the explanations and instructions offered during treatment and the amount of time spent by physicians or nurses to understand the patient’s needs. Similarly, technical aspects are captured by the competence, skills, experience, know-how and technology (Choi et al., 2005; Zineldin, 2006) in service delivery. Physicians and other staff exhibit functional quality when explaining the medical process in a friendly and helpful manner. The functional aspects include empathy in caring, individualized attention to patients, assurance that inspires trust, responsiveness and willingness to help patients, and the reliability and dependability of healthcare services (Sofaer and Firminger, 2005). Donabedian (1988) proposed a three-dimensional framework containing structure, process and outcome for assessing and comparing HSQ. Structure enlists the resources and capacities of the service provider to provide HSQ; structural measures include hospital buildings, staff, equipment and facilities that form the major inputs within the control of the service provider. Process measures help in assessing the quality of transactions between patients and providers throughout service delivery. Finally, outcomes explain the changes in patients’ health condition following treatment. Donabedian (1988) broadened the structure-process-outcome framework into four aspects of care: accessibility, technical management, interpersonal relationships management and continuity. Table I provides a detailed list of dimensions used to measure HSQ.

Critical scrutiny of the significant dimensions used to assess HSQ offers a few valid observations. First, the major dimensions of HSQ are doctor quality, nursing quality, support staff quality, infrastructure (both physical and technical), process quality, communication, accessibility, affordability and amenities. Second, both tangible and intangible dimensions have a role in measuring HSQ. Third, assessing the relative importance of each dimension in comparison with the others is complex and may be subjective. Fourth, the perceived importance of dimensions can vary across the stages of the service encounter. Last, to optimize the performance of any dimension of HSQ, the use of patient-centric options in execution is best suited.

Research methodology

This research deployed two studies to meet the objectives put forth. The first study was to identify the important dimensions that affect patient experience in a hospital journey. The second study was to apply a Taguchi experiment to examine the relative importance of different levels of critical dimensions identified from Study 1 in developing HSQ. Figure 1 illustrates the methodology adopted for the study.

Study 1

A sequential incidence technique (SIT) helped to identify important dimensions that significantly contribute to patient experience in a service encounter (Stauss and Weinlich, 1997). SIT helps in mapping the incidences in each stage of the service process that evoked positive or negative feelings among patients. SIT involves process-oriented qualitative interviewing of respondents with the help of a service blueprint that illustrates relevant episodes in the patient journey. The seven major episodes considered were appointment, reception, consultation, diagnostic tests/procedures, admission/stay, billing/payment and discharge. Each of these episodes pertains to a different area of focus, referred to in the literature review as a determinant of HSQ.

We conducted the SIT in a super specialty hospital environment. Patients who had undergone all the above episodes in the healthcare service were the subjects for the study. A judgmental sample of 20 patients recalled and sequentially described the incidents they remembered about each episode in the service journey. Since the purpose of this phase was to identify areas with potential for improvement, recording the incidences pertaining to pain points was enough. Table II provides extracts of significant incidents in each service episode.

Analysis of the interview excerpts for each episode helped in identifying the process modifications needed for better HSQ. Subsequent discussions with experienced healthcare professionals helped to finalize the different options in these episodes for examining patient preferences for HSQ.

Study 2

In the second study, the robust design concept proposed by Taguchi was used to assess the relative importance of possible settings (levels) of the above episodes (controllable factors) for a better patient experience. Robust design signifies the creation of a service process that is less susceptible to variations in quality due to uncontrollable factors (noises) linked to customer and general service characteristics. The Taguchi method is a fractional factorial design in which a reduced number of experiments, called an orthogonal array (OA), estimates the importance of various factors and their levels (Singh et al., 2012). In addition, an analysis of variance (ANOVA) estimates the effect of each factor on the overall response. The variability expressed by the S/N ratio is decisive in choosing the optimal level of each factor.

Study 1 informed about the existence of many pain points in various episodes of the healthcare service process. Therefore, in consultation with healthcare service providers, a few options were formulated to offer a better experience to patients in each episode of the service process. These options were the different levels defined for the control factors in the Taguchi design. Table III provides the factors and their levels used to examine patient preferences in HSQ formation. The experiment used all seven factors, and each factor had three levels of preference to rank. Therefore, the degrees of freedom required were 15 (i.e. 7×(3−1)+1). Accordingly, the best OA was the L27 (3^7) fractional factorial design.
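A 27-run array of this type can be constructed from linear combinations over GF(3). The sketch below is illustrative only: the column (coefficient) assignment is an assumption, not the paper's actual L27 layout, but it shows the defining property of an OA, namely that every pair of columns is balanced.

```python
from itertools import product

# Coefficient vectors chosen so that no column is a scalar multiple of
# another over GF(3); this guarantees pairwise balance (orthogonality).
COEFFS = [(1, 0, 0), (0, 1, 0), (0, 0, 1),
          (1, 1, 0), (1, 2, 0), (1, 0, 1), (0, 1, 1)]

def l27_design():
    """Build a 27-run, 7-factor, 3-level orthogonal array."""
    runs = []
    for a, b, c in product(range(3), repeat=3):   # 27 base points in GF(3)^3
        runs.append([(ca * a + cb * b + cc * c) % 3 for ca, cb, cc in COEFFS])
    return runs

def is_balanced(design, i, j):
    """Check that every pair of levels of columns i and j occurs 27/9 = 3 times."""
    counts = {}
    for run in design:
        key = (run[i], run[j])
        counts[key] = counts.get(key, 0) + 1
    return len(counts) == 9 and all(v == 3 for v in counts.values())

design = l27_design()
print(len(design))   # 27 runs
```

Because every pair of factor-level combinations appears equally often, main effects can be estimated independently from far fewer runs than the full factorial.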

The computation of the response variables was based on the evaluators’ importance scores and their preference ranking of the levels of each factor. A panel of ten patients from the SIT participants served as evaluators. To ascertain their ability to evaluate the importance of each factor and its levels, a preliminary discussion about different ways to improve HSQ was held. The data contained the importance scores of the factors and the preference rank awarded to each level of each factor, collected directly from the evaluators using the template in Table III (which includes the responses given by the first evaluator). The evaluators distributed importance scores across the Taguchi factors such that the total summed to 100. They also ranked the different options (levels) available for each factor by preference, from Rank 1 for the most preferred to Rank 3 for the least preferred. The Taguchi OA provides a balanced consideration of all levels of all factors. The variation in the responses captures the noise factors beyond the control of service providers.

Data analysis

To evaluate the response score (RS) of each Taguchi run, we performed the following steps in sequence.

Step 1

Calculation of the weighted ranking of each level by multiplying the factor weight by the rank score of the level (the rank scoring scheme adopted was “Rank 1=3; Rank 2=2; Rank 3=1”) using the formula:

Weighted level rank (WL) = Factor Weight × Rank Score.

Step 2

Calculation of the RS for the run (produced from the Taguchi L27 design reported in Table IV) by adding the weighted level ranks of all factors associated with the run using the formula:

RS = Σ WLi,
where WLi is the weighted level rank of factor i in the run.

The response values calculated using the above procedure became the response variable capturing the overall perception of an evaluator about a run in the Taguchi design. For example, the first evaluator offered the responses shown in Table III. The first step involves calculating the evaluator’s weighted rank for each level. The weighted rank for Level 1 of Factor 1 is the score of Factor 1 multiplied by the rank score of Level 1 of Factor 1: F1L1 = 8 (8×1). Similarly, F1L2 = 16 (8×2) and F1L3 = 24 (8×3). Weighted ranks were likewise calculated for all levels from F2L1 to F7L3 (21 scores in total). Then, the first evaluator’s RS for Run 1 is F1L1 + F2L1 + F3L1 + F4L1 + F5L1 + F6L1 + F7L1 = 8+10+60+66+48+36+12 = 240. These calculations were extended to all 27 runs for the first evaluator to form the first response variable in the Taguchi design. Similarly, we computed the response variables pertaining to the other evaluators. The resultant 27×10 response matrix presented in Table IV formed the data for calculating the S/N ratio to determine the best levels of the control factors. This computation helped to prevent possible bias among evaluators due to repeated responses, since each evaluator responded only through their perceived weight of each factor and their relative preference among its levels. Also, the ANOVA of the mean responses to each factor, obtained as part of the design of experiments (DOE) in Minitab, emerged as significant with p-values below 0.05. Here, the null hypothesis states that the mean responses of the 10 evaluators to each factor are equal. Because the p-values for all factors were less than 0.05, the null hypothesis was rejected, and we conclude that the responses are statistically different and that bias has not affected the responses.
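The two-step computation above can be sketched in a few lines. Note an assumption: the paper reports only the weighted products (8, 10, 60, 66, 48, 36, 12), not the underlying weight split, so the factor weights and rank scores below are hypothetical values chosen to sum to 100 and reproduce the first evaluator's Run 1 total of 240.

```python
# Hypothetical factor weights (sum to 100) and rank scores consistent
# with the worked example; Rank 1 -> score 3, Rank 2 -> 2, Rank 3 -> 1.
weights = {"F1": 8, "F2": 10, "F3": 20, "F4": 22,
           "F5": 16, "F6": 12, "F7": 12}

# rank_score[factor][level]: the score the evaluator's ranking gives
# to each of the three levels (0-based level index).
rank_score = {"F1": [1, 2, 3], "F2": [1, 2, 3], "F3": [3, 2, 1],
              "F4": [3, 2, 1], "F5": [3, 2, 1], "F6": [3, 2, 1],
              "F7": [1, 2, 3]}

def response_score(run_levels):
    """RS for one run: sum of (factor weight x rank score of the level used)."""
    return sum(weights[f] * rank_score[f][lvl] for f, lvl in run_levels.items())

run1 = {f: 0 for f in weights}      # Run 1 uses Level 1 of every factor
print(response_score(run1))         # 8+10+60+66+48+36+12 = 240
```

Repeating this over all 27 runs and all ten evaluators yields the 27×10 response matrix of Table IV.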

We calculated the S/N ratio using the “larger the better” criterion, since higher values indicate stronger patient preference. The DOE section of Minitab version 18 analyzed the results of the Taguchi experimental design. Minitab calculates the S/N ratio of each run and of each control factor and chooses the best level for each factor; the S/N ratio identifies the control factor settings that minimize the variability caused by the noise factors. The data were entered and the appropriate formula for calculating the S/N ratio was chosen. Table V provides extracts of the relevant results.
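The standard larger-the-better S/N ratio is −10·log10((1/n)·Σ 1/yi²), computed over the n responses to a run. A minimal sketch, with a hypothetical row of ten evaluator response scores:

```python
import math

def sn_larger_is_better(responses):
    """Larger-the-better S/N ratio: -10 * log10(mean(1 / y^2))."""
    n = len(responses)
    return -10 * math.log10(sum(1.0 / y ** 2 for y in responses) / n)

# Hypothetical response scores of the 10 evaluators for one run:
row = [240, 238, 244, 236, 242, 239, 241, 237, 243, 240]
print(round(sn_larger_is_better(row), 3))
```

When all responses are equal to y, the formula reduces to 20·log10(y), which is a quick sanity check on an implementation.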

In a Taguchi experiment, “Delta” (δ) is a measure produced by Minitab 18 to represent the difference between the highest and lowest average response values for each factor. Factors are ranked by their δ values (Rank 1 for the highest δ, Rank 2 for the second highest, and so on) to indicate the relative effect of each factor on the RS. Table VI presents the response table for factors based on the S/N ratio, mean and standard deviation.
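The δ values and ranks can be derived from any design matrix and its per-run S/N ratios: average the S/N over the runs at each level of a factor, take the spread of those level means, and rank factors by that spread. A sketch under hypothetical inputs (the design and S/N values are stand-ins, not the paper's data):

```python
def delta_ranking(design, sn_ratios, n_factors, n_levels=3):
    """For each factor: Delta = max(level-mean S/N) - min(level-mean S/N);
    factors are then ranked by Delta (Rank 1 = largest Delta)."""
    deltas = []
    for f in range(n_factors):
        level_means = []
        for lvl in range(n_levels):
            vals = [sn for run, sn in zip(design, sn_ratios) if run[f] == lvl]
            level_means.append(sum(vals) / len(vals))
        deltas.append(max(level_means) - min(level_means))
    order = sorted(range(n_factors), key=lambda f: -deltas[f])
    ranks = [0] * n_factors
    for r, f in enumerate(order, start=1):
        ranks[f] = r
    return deltas, ranks

# Toy 2-factor full factorial: factor 0 shifts S/N by 2 per level,
# factor 1 by 0.5 per level, so factor 0 should get Rank 1.
toy_design = [(a, b) for a in range(3) for b in range(3)]
toy_sn = [2 * a + 0.5 * b for a, b in toy_design]
print(delta_ranking(toy_design, toy_sn, n_factors=2))
```

The same routine applied to the 27-run design and its S/N column reproduces the δ and rank columns of a response table such as Table VI.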

The results of the ANOVA confirmed the statistically significant effects of the factors on the RS (i.e. the combined weighted preferences), since the F-statistics had p<0.05. The ANOVA further confirmed that the maximum effect on the response is for the consultation stage, followed by the stay and then the procedures. Main effects plots help to examine factor effects graphically; a main effect exists when different levels of a factor affect the characteristic differently. Figure 2 provides the main effects plots of the S/N ratios, mean and standard deviation.

In the results, consultation had the most substantial effect on the S/N ratio, followed by the stay; the lowest effect was for the discharge process. The experiment produced S/N ratios for each run, and the 22nd run had the maximum S/N ratio (47.482), indicating the best combination of factor levels among the 27 runs considered. However, the optimal setting can lie outside the combinations considered in the 27 trials of the experiment. The response table for S/N ratios in the Minitab output (Table VI) informs about the optimum level of each factor.

Optimal setting and validation

The full factorial design for seven factors with three levels each has 3^7 (=2,187) possible combinations, whereas the Taguchi experiment uses a fractional design with 27 combinations. Hence, the optimal setting might not be one included in the L27 experiment. The response table of S/N ratios provides the optimal settings that minimize the variability in HSQ from noise factors. Based on relative δ, the order of factor effects on HSQ is “Consultation – Stay – Tests/Procedures – Billing/Payment – Reception – Appointment – Discharge.” The optimal setting for better HSQ perceptions is F1L3−F2L1−F3L1−F4L1−F5L1−F6L1−F7L2. Hence, to impart better HSQ, the servicescape should ensure an online booking facility with updates on booking status, a separate reception for booked patients, the doctor of choice, prior detailing of procedures to the patient and family, a doctor-on-call facility during the stay, an electronic discharge summary and an online payment option.

A validation experiment is the final stage of a Taguchi study, to validate the performance of the optimal servicescape setting identified. In this experiment, the optimal setting was not among the combinations included in the L27 runs. The predicted value of the S/N ratio at the optimum levels (η0) is η0 = ηm + (ηF1L3 − ηm) + (ηF2L1 − ηm) + (ηF3L1 − ηm) + (ηF4L1 − ηm) + (ηF5L1 − ηm) + (ηF6L1 − ηm) + (ηF7L2 − ηm) (Dubey and Yadava, 2007), where ηm is the overall mean of the S/N values and ηF1L3, ηF2L1, etc. are the S/N values at the optimal setting obtained from Table VI. Thus, the predicted S/N for the optimal setting was 48.58, higher than the maximum value of 47.482 obtained for Run 22 of the experiment. To further validate this observation, another Taguchi experiment conducted for the optimal setting, using the RS calculated as narrated in Step 2 above, produced an S/N ratio of 48.71, with an error of 0.268 percent, confirming the quality of the prediction.
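The additive prediction formula is simple to compute once the level-mean S/N values are tabulated: start from the overall mean and add each chosen level's deviation from it. The values below are hypothetical stand-ins for the Table VI entries, which the paper does not reproduce in full.

```python
def predicted_sn(overall_mean, optimal_level_means):
    """Additive model: eta0 = eta_m + sum over factors of (eta_level - eta_m)."""
    return overall_mean + sum(m - overall_mean for m in optimal_level_means)

eta_m = 47.0                                              # hypothetical overall mean S/N
optimal = [47.3, 47.4, 47.2, 47.25, 47.35, 47.15, 47.1]   # one level mean per factor
print(round(predicted_sn(eta_m, optimal), 3))             # -> 48.75
```

Because every deviation from the mean is non-negative at the optimal levels, the predicted η0 always meets or exceeds ηm, mirroring how the predicted 48.58 exceeded the best in-array run.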


This paper describes a methodology to determine the optimal settings of various servicescape components in healthcare for higher service quality perceptions by patients.

Based on relative δ, the consultation phase, followed by admission/stay, has the highest effect on HSQ. It is also evident that the consultation phase, followed by the stay and then the procedures, has the maximum effect on the S/N and mean responses of patients. The appointment stage, followed by consultation and then procedures, has the maximum effect on the standard deviation of responses. Therefore, the highest contributor to variability in HSQ among the factors considered is the appointment process. The appointment phase is the first significant encounter, where a patient gets the initial feel of HSQ in the service journey. This stage acts as a curtain riser and sets a benchmark for customer expectations about future encounters. Patients prefer an online appointment system for appointment booking. An appointment system in which the patient can easily book and modify the time of visit is flexible and therefore appealing to patients. Further, additional information provided in the admission portal about procedures during the visit, possible deviations from the time schedule, etc. improves patient participation in the service process.

The ideal level associated with reception in the joint design was reporting through a kiosk, but at the individual factor level, on comparing the S/N ratio and mean, the best level that emerged was a separate reception for booked patients. The effect of the reception stage on HSQ is relatively low (Rank 5), but this stage has a critical role in creating a favorable mindset about the service setting and imparting a feeling of patient-centeredness. The consultation stage is the most important stage in the service process. Many studies have empirically established the role of the doctor in imparting patient satisfaction (Williams et al., 1998; Weng et al., 2011; Boissy et al., 2016). Patients prefer to meet the doctor of their choice in the first round of interaction itself. The physician’s diagnostic skills, interaction and experience are vital in doctor selection (Bendapudi et al., 2006). Introducing multistage filtering through teleconsulting and preliminary investigations by junior doctors is likely to reduce HSQ. Tests and procedures are a relatively highly ranked (Rank 3) factor in patients’ HSQ perceptions. Patients prefer detailing of procedures in advance to remove all possible confusion about the nature of the tests conducted and the clinical procedures suggested. The stay in the hospital for undergoing treatment has the second highest effect on HSQ. Prior research has clearly established the role of room facilities, food, nursing care, etc. in service quality (Naidu, 2009; Padma et al., 2010; Mosadeghrad, 2013). Service quality in this phase draws on both tangible and intangible components. A patient feels more comfortable during a hospital stay if he/she perceives nearness to the doctor; hence, the confidence that the doctor can be summoned at any time without many formalities contributes positively to HSQ. In the discharge stage, the two levels related to insurance processing and electronic sharing of the discharge summary have an equally strong effect on HSQ. In the billing stage, ease of making payments significantly adds to service quality perceptions, and an online payment facility has an important role in simplifying the payment process. Figure 2 graphically illustrates the effect of the factors and their levels on the S/N ratios, means and standard deviations.

Managerial implications

This study has some important implications for health service providers. First, the approach adopted in this study informs healthcare management about the practical application of robust design in designing servicescapes that contribute to HSQ. In healthcare, service creation requires support from multiple personnel who have different quality perceptions. Gaps in these stakeholders’ perceptions about the criteria that decide HSQ can significantly undervalue the efforts initiated. In designing the “points of contact,” a choice among many options based on the perceptions of stakeholders is essential. Ideally, in a service setting, each touch point should contribute to overall service quality and should have the potential to set the expectation level for subsequent encounters. The effect of factor levels on the standard deviation of the response explains the sensitiveness of each stage in the process. The maximum effect on the standard deviation of the response was for the appointment stage, followed by consultation, and the minimum was for the stay. The stages having the maximum effect on standard deviations play a larger role in generating variability in HSQ perceptions; hence, introducing the best-preferred levels in these stages will improve HSQ.

The second major implication is that there is evidence of a growing preference for technology-enabled service delivery in healthcare. The preferences for online appointments, a kiosk for reporting, online sharing of test results, an online payment option, etc. clearly reflect the emerging mindset of patients. The findings corroborate the growing acceptance of consumer health information technologies (Jennett et al., 2003; Or and Karsh, 2009; Buntin et al., 2011) that facilitate healthcare by providing patients with support related to medical history, medication, disease-specific information and the electronic supply of all relevant medical records. Hence, the findings provide a platform for designing futuristic healthcare servicescapes.

Third, analysis of the top two stages having the maximum effect on S/N makes it evident that the physician has an important role in HSQ. In consultations, the preference is for the doctor of choice, and during the hospital stay, patients prefer to be in close contact with the doctor. Personalized attention from a doctor is essential to HSQ. Hence, healthcare firms should attempt to redraft doctors’ job descriptions and schedules to improve HSQ. Efforts to improve the doctor-patient ratio and to provide teleconsulting after the first visit can significantly improve patient satisfaction and revisit intentions.

Fourth, in the discharge stage, the levels had the least difference in their effects on the S/N ratio and mean values of the responses, suggesting that all levels are, in one way or another, preferred by patients. Insurance processing had a better effect in the combined design (Run 22 of the experiment). Healthcare firms can explore possible collaborations with insurance service providers for the speedy and timely settlement of insurance claims. Fifth, the findings imply that customized attention highly influences HSQ perceptions; the preferences for a separate reception, freedom in doctor selection, individual detailing of procedures, etc. underline this observation. The trade-off between standardization and customization (Greenfield et al., 2018) is a challenge for the healthcare service provider. Customization efforts seek to co-produce healthcare by designing service processes that suit the needs, beliefs and expectations of patients and their families, whereas standardization provides uniformity and stability in processes and procedures for a steady outcome. Patients experience customization when systems and service processes align with their choices and requirements. Service providers should classify processes based on standardization vs customization potential and try to revamp processes for a better customer focus.

Research contributions

Taguchi methods help optimize processes for better results. The best way to improve quality is to design production or service processes that minimize variation in quality attributes. In healthcare, the needs and wants of patients are paramount, and mapping their experience is complex; the success factors are linked to intangible aspects such as care, courtesy, confidence and patients' wellness perceptions. Periodic evaluations of HSQ are ideal for initiating corrective measures, but the complexity of survey design and errors in generalizing survey findings adversely affect the periodicity of such evaluations. The Taguchi experiment narrated in this study helps create a servicescape that is more robust and contributes positively to HSQ. The advantage of this methodology is its simplicity and its efficiency in clearly differentiating the relative importance of largely subjective beliefs about HSQ.
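The robustness criterion underlying this optimization is the larger-is-better signal-to-noise ratio of classical Taguchi analysis (Taguchi et al., 2005), which, for the n evaluator scores $y_1, \ldots, y_n$ of a single run, takes the standard form:

$$\mathrm{S/N} = -10 \log_{10}\!\left( \frac{1}{n} \sum_{i=1}^{n} \frac{1}{y_i^{2}} \right)$$

Maximizing this quantity simultaneously rewards a high mean response and low variability across evaluators, which is why it is suited to subjective quality perceptions.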

Limitations and future scope

This study attempted to address the dearth of research on service settings using robust methodologies like the Taguchi experiment, which is popular in the manufacturing sector. However, we noticed a few limitations in this attempt. First, conceptualizing a noise factor in services is more complicated than in manufacturing: healthcare caters to patients with multiple concerns, so the extent to which variation in their views can be controlled is a matter of concern, and the generalizability of the observations is therefore limited. Second, the sequential incidence technique (SIT) was based on selected stages of the healthcare service journey, and incidents not attached to those stages might have been overlooked. A future experiment focusing on the standardization potential of healthcare service operations would be ideal for segregating procedures having such potential and for diverting more attention to processes that require higher levels of personal touch.


Figure 1. Research methodology

Figure 2. Main effects plots of S/N ratios, mean and standard deviation

List of dimensions used in previous studies on HSQ

S. No. Year Authors Dimensions
1. 1975 Brook and Williams Technical, art-of-care provided
2. 1978 Ware Technical, interpersonal, environmental, administrative
3. 1987 Coddington and Moore Warmth/caring/concern, medical staff, technology-equipment, specialization/scope of service, outcome
4. 1998 Angelopoulou et al. Competence of physicians and nurses, cost, surroundings, food, administration
5. 2015 John Curing, caring, access, physical environment
6. 1992 Donabedian Technical, interpersonal, amenities
7. 1992 Nelson et al. Medical billing, nursing/daily care, admissions, discharge
8. 1993 Headley and Miller Dependability, empathy, reliability, responsiveness, tangibles, presentation
9. 1993 Vandamme and Leunis Tangibles, medical responsiveness, assurance, nursing staff, personal beliefs, values
10. 1996 Gabbott and Hogg Range of services, empathy, physical access, doctor specific, situational, responsiveness
11. 1995 Tomes and Chee Peng Ng Empathy, relationship of mutual respect, dignity, understanding of illness, religious needs, food, physical environment
12. 1996 JCAHOa scale Efficacy, appropriateness, efficiency, respect and caring, safety, continuity, effectiveness and outcome, timeliness, availability
13. 1996 Butler et al. Human performance, facilities quality
14. 1997 Zifko-Baliga and Krampf Professional expertise, patient belief, communication, image, performance, professional efficiency, perspicacity, individualized reliability, skills, physical cure, emotional cure, amenities, billing procedure
15. 1998 Camilleri and O’Callaghan Caring, hospital environment, professional and technical quality, patient amenities, service personalization
16. 1998 Gross and Nirel Professional-technical level, interpersonal, accessibility, availability
17. 1999 Martínez Fuentes Tangibles, service delivery, process of performance
18. 1999 Shemwell and Yavas Search attributes, credence attributes, experience attributes
19. 2001 Sower et al. Respect/caring, effectiveness/continuity, appropriateness, information, efficiency, effectiveness-meals, first impression, staff diversity
20. 2002 Baltussen et al. Health personnel practices and conduct, adequacy of resources and services, healthcare delivery, financial and physical accessibility
21. 2003 Jabnoun and Chaker Empathy, tangible, reliability, administrative responsiveness, supportive skills
22. 2004 Doran and Smith Outcome, tangible, empathy, assurance, reliability, responsiveness
23. 2004 Van Duong et al. Healthcare delivery, health facility, interpersonal aspects of care, access to services
24. 2004 Che Rose et al. Social support, patient education, technical, interpersonal, amenities/environment, access/waiting time, cost, outcomes, overall quality
25. 2005 Choi et al. Tangible, physicians concern, staff concern, convenience of care process
26. 2005 Sofaer and Firminger Patient-centered care, access, communication and information, courtesy and emotional support, technical quality, efficiency of care/organization, structure and facilities
27. 2005 Kara et al. Empathy, tangible, reliability, responsiveness, assurance, courtesy
28. 2000 Lee et al. Support from hospital, reliability and assurance, responsiveness, empathy
29. 2005 Mostafa Human performance quality, human reliability, facility quality
30. 2006 Zineldin Technical, infrastructural, interaction, atmosphere
31. 2006 Rao et al. Medicine availability, medical information, staff behavior, doctor behavior, clinic infrastructure
32. 2007 Teng et al. Need management, assurance, sanitation, customization, convenience and quiet, attention
33. 2008 Mejabi and Olujide Resources availability, quality of care, condition of clinic/ward, condition of facility, quality of food, attitude of doctors and nurses, attitude of non-medical staff, waiting time
34. 2008 Akter et al. Responsiveness, assurance, communication, discipline
35. 2008 Elleuch Process characteristics, physical appearance
36. 2008 Arasli et al. Empathy, relationships, priority to inpatient’s needs, professionalism of staff, food, physical environment
37. 2008 Duggirala et al. Infrastructure, personal quality, process of clinical care, administrative process, safety, experience of medical care, social responsibility
38. 2008 Hanson et al. Cleanliness, staff courteous and respectful, skills of health workers, explanation of treatment, availability of medicines prescribed, cost, privacy
39. 2008 Roshnee Ramsaran-Fowdar Tangibility/image, equitable treatment/reliability, responsiveness, assurance/empathy, medical competence, equipment and records, medical history
40. 2009 Prejmerean and Vasilache Competence of physicians, competences of nurses, empathy of the hospital personnel
41. 2009 Karassavidou et al. Human aspects, physical environment, infrastructure, access
42. 2009 Raposo et al. Staff, facilities quality, medical care, nursing care
43. 2010 Lee et al. Admissions and convenience, comfort and cleanliness, nursing care, physician care, bill
44. 2010 Aagja and Garg Admission, medical services, overall service, discharge, social responsibility
45. 2010 Narang Personnel practices and conduct, adequacy of resources and services, healthcare delivery, access to services
46. 2010 Chahal and Kumari Physical environment quality (comprising ambient condition, social factor and tangibles), interaction quality (comprising attitude and behavior, expertise and process quality) and outcome quality (comprising waiting time, patient satisfaction and loyalty
47. 2011 Kumar and Prabhakaran Accessibility, safety, tangibles, efficiency, interpersonal relations, technical competence, effectiveness, outcome
48. 2011 Upul Senarath and Gunawardena Interpersonal aspects, efficiency, competency, comfort, physical environment, cleanliness, personalized information, general instructions
49. 2014 Mosadeghrad Supportive visionary leadership, proper planning, education and training, availability of resources, effective management of resources, employees and processes and collaboration and cooperation
50. 2015 Kondasani and Panda Physical environment, reliability, customer friendly staff, communication, responsiveness, privacy and safety
51. 2016 Lupo Healthcare staff, responsiveness, relationships, support services, accessibility, tangibles
52. 2017 Lee Care services, tangible, efficiency, safety, empathy

Note: aJoint Commission for Accreditation of Healthcare Organizations

Source: Compiled for the purpose of study

Incidences mapped from sequential incidence technique

Episode Major incidence reported
Appointment No one picked the phone when trying to book an appointment
Appointment Even after booking appointment need to wait long
Appointment Appointment is for day not time
Appointment Telephonic booking is difficult
Appointment No option to book appointments online
Appointment Difficult to cancel appointment
Appointment No information about delays, even when appointment is there
Billing No detailed bills given
Billing Long queue for paying bills
Billing Online transfers not possible
Consultation Doctor was silent to questions
Consultation Doctor was avoiding
Consultation Doctor never looked and was busy with computer
Consultation Doctor could not explain the issue
Discharge Discharge summary was with many mistakes
Discharge Unnecessary medicines were given
Discharge Doctor didn’t explain about medicines
Discharge I was discharged before cure
Discharge They should better listen to me
Discharge They don’t accept mistakes
Discharge My discharge sheet was completely wrong, full of inaccurate information
Discharge I left the hospital not having a clue what was wrong with me
Procedure Have to wait long to get wheel chair support
Procedure Long waiting in lab for blood tests
Procedure Refused to mail lab test results directly to me
Procedure My previous records were missing
Procedure I was not informed about how to prepare before test
Procedure They don’t inform correctly over phone
Procedure No one tell their details when interacting
Procedure Nurses are rude
Procedure Why can’t they smile
Procedure They have procedures to help patients but no coordination
Procedure No information sharing. So, confusion
Procedure It took two weeks of phoning the hospital and being put through to different departments. But still no idea what is happening to me
Reception Came on time of appointment but doctor was not there
Reception Found too difficult to talk to reception staff
Reception Reception staff are busy
Reception Reception staff confuses
Reception No information about cancellation of appointment was given and reception staff were ignorant
Reception No staff to attend people with appointment
Reception Waiting area had no sufficient chairs or provision for snacks or tea, etc.
Reception Lot of time wastage
Reception There is no surety about anything. We were waiting and confused
Reception In the second visit also need to wait long
Reception Everywhere lot of queue
Stay TV’s are not working
Stay Rest rooms are isolated and not clean
Stay Beds are hard and not comfortable
Stay Canteen is miserable, too expensive and not hygienic
Stay Food served in room was not good
Stay No one came to attend after admitted
Stay Nurses are not available and they rarely come to rooms
Stay If clarifications asked, nurses will not reply
Stay No one is allowed to visit
Stay No update given to me about doctor visit to room
Stay No doctor came

Data collection template with factors and levels with responses of first evaluator

Taguchi factors Importance score Level 1 Rank Level 2 Rank Level 3 Rank
Appointment 8 Telephonic booking and confirmation by SMS 3 Telephonic booking and reminder by calling 2 Online booking facility with provision to view and update booking status 1
Reception 10 Separate reception for booked patients 3 Kiosk facility to report and get status 1 Online reporting and status update by SMS 2
Consultation 20 Provision to meet doctor of choice 1 Panel of junior doctors in first round followed by consultation with senior doctor 2 Tele-consultation in the first round 3
Diagnosis tests and procedures 22 Prior detailing of procedure to patient and family 1 Online status update to family after procedure 2 Results of diagnosis tests shared online 3
In-hospital admission and stay 16 Doctor on call facility to patient 1 Doctor visit once in 6 h 2 Doctor visit once in 24 h 3
Discharge 12 Discharge summary/reports/bills, etc. shared electronically 1 Discharge counseling by doctor 3 Auto Insurance processing 2
Billing/payment 12 Payment at cash counter 3 Online transfer 1 Provision to make payment by cash in the room 2
Total 100

Response matrix

Run RS_Evaluator 1 RS_Evaluator 2 RS_Evaluator 3 RS_Evaluator 4 RS_Evaluator 5 RS_Evaluator 6 RS_Evaluator 7 RS_Evaluator 8 RS_Evaluator 9 RS_Evaluator 10
 1 240 170 240 277 245 265 175 300 250 250
 2 224 250 230 288 230 265 195 220 244 250
 3 208 240 250 230 215 235 185 245 238 235
 4 218 160 265 219 160 200 200 250 193 170
 5 202 240 255 230 145 200 220 170 187 170
 6 186 230 275 172 130 170 210 195 181 155
 7 166 120 210 236 240 135 150 215 136 180
 8 150 200 200 247 225 135 170 135 130 180
 9 134 190 220 189 210 105 160 160 124 165
10 172 200 270 194 210 195 230 180 211 190
11 156 190 260 205 195 165 220 175 205 115
12 176 240 295 216 210 165 195 200 199 175
13 216 160 195 211 185 220 180 160 214 245
14 200 150 185 222 170 190 170 155 208 170
15 220 200 220 233 185 190 145 180 202 230
16 224 180 220 228 190 245 235 215 232 255
17 208 170 210 239 175 215 225 210 226 180
18 228 220 245 250 190 215 200 235 220 240
19 206 200 250 255 220 200 240 195 217 235
20 190 190 225 197 205 185 185 160 187 190
21 138 210 215 208 160 170 175 170 157 205
22 244 190 255 272 270 225 235 235 235 230
23 228 180 230 214 255 210 180 200 205 185
24 176 200 220 225 210 195 170 210 175 200
25 234 200 220 191 195 240 270 240 222 195
26 242 230 215 156 200 235 235 195 208 180
27 190 250 205 167 155 220 225 205 178 195

Note: RS, response score
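As a minimal sketch (assuming the ten evaluator scores of each run are the raw inputs; this is an illustration, not the authors' exact computation), the three run-level statistics reported in the following tables — the larger-is-better S/N ratio, the mean and the standard deviation — can be obtained from each row of the response matrix:

```python
import math


def sn_larger_is_better(responses):
    """Taguchi larger-is-better S/N ratio: -10 * log10(mean of 1/y^2)."""
    n = len(responses)
    return -10 * math.log10(sum(1 / y ** 2 for y in responses) / n)


def run_metrics(responses):
    """Return (S/N ratio, mean, sample standard deviation) for one run."""
    n = len(responses)
    mean = sum(responses) / n
    sd = math.sqrt(sum((y - mean) ** 2 for y in responses) / (n - 1))
    return sn_larger_is_better(responses), mean, sd


# Run 1 of the response matrix (scores of the ten evaluators)
run1 = [240, 170, 240, 277, 245, 265, 175, 300, 250, 250]
sn, mean, sd = run_metrics(run1)
```

A convenient sanity check: if every evaluator gives the same score of 100, the S/N ratio is exactly 40 dB, since −10 log₁₀(10⁻⁴) = 40.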

Taguchi runs with S/N ratios and means

Factor combinations Key metrics
Run Appointment Reception Consultation Tests/procedure Stay Discharge Billing/payment S/N Mean
 1 1 1 1 1 1 1 1 47.330 245.2
 2 1 1 1 1 2 2 2 47.479 240.5
 3 1 1 1 1 3 3 3 47.036 227.4
 4 1 2 2 2 1 1 1 46.118 210.5
 5 1 2 2 2 2 2 2 45.927 205.8
 6 1 2 2 2 3 3 3 45.234 192.7
 7 1 3 3 3 1 1 1 44.523 187.3
 8 1 3 3 3 2 2 2 44.530 182.6
 9 1 3 3 3 3 3 3 43.842 169.5
10 2 1 2 3 1 2 3 46.125 206.7
11 2 1 2 3 2 3 1 45.760 200.7
12 2 1 2 3 3 1 2 45.994 207
13 2 2 3 1 1 2 3 45.593 194.1
14 2 2 3 1 2 3 1 45.238 188.1
15 2 2 3 1 3 1 2 45.534 194.4
16 2 3 1 2 1 2 3 46.714 219.4
17 2 3 1 2 2 3 1 46.393 213.4
18 2 3 1 2 3 1 2 46.732 219.7
19 3 1 3 2 1 3 2 46.837 222.8
20 3 1 3 2 2 1 3 45.727 196.9
21 3 1 3 2 3 2 1 44.881 180.9
22 3 2 1 3 1 3 2 47.482 240.1
23 3 2 1 3 2 1 3 46.461 214.2
24 3 2 1 3 3 2 1 45.828 198.2
25 3 3 2 1 1 3 2 46.903 225
26 3 3 2 1 2 1 3 46.383 213.6
27 3 3 2 1 3 2 1 45.666 197.6

Notes: 1=level 1; 2=level 2; 3=level 3, etc. of corresponding factors

Factor response table for S/N ratio, mean, standard deviations

Level Appointment Reception Consultation Tests/procedure Stay Discharge Billing/payment
Response table for signal to noise ratios (larger is better)
1 45.78 46.35a 46.83a 46.35a 46.4a 46.09a 45.75
2 46.01 45.93 46.01 46.06 45.99 45.86 46.38a
3 46.24a 45.74 45.19 45.62 45.64 46.08 45.9
δ 0.46 0.61 1.64 0.74 0.76 0.23 0.63
Rank 6 5 1 3 2 7 4
Response table for means
1 206.8 214.2 224.2 214 216.8 209.9 202.4
2 204.8 204.2 206.6 206.9 206.2 202.9 215.3
3 209.9 203.1 190.7 200.7 198.6 208.9 203.8
δ 5.1 11.1 33.5 13.3 18.2 7 12.9
Rank 7 5 1 3 2 6 4
Response table for SD
1 36.92 28.95 24.86 27.22 30.16 32.16 32.37
2 26.73 27.84 31.57 27.49 29.23 27.64 28.85
3 24.42 31.27 31.64 33.36 28.67 28.27 26.84
δ 12.50 3.43 6.78 6.14 1.50 4.52 5.53
Rank 1 6 2 3 7 5 4

Note: aOptimal setting
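The response-table entries can be reproduced from the run results: for each factor, average the S/N ratios of all runs held at a given level; δ is then the range of those level averages, and factors are ranked by δ. A sketch for the appointment factor, using the S/N values and level assignments from the Taguchi runs table above:

```python
# S/N ratios of the 27 runs, in run order (from the Taguchi runs table)
sn = [47.330, 47.479, 47.036, 46.118, 45.927, 45.234, 44.523, 44.530, 43.842,
      46.125, 45.760, 45.994, 45.593, 45.238, 45.534, 46.714, 46.393, 46.732,
      46.837, 45.727, 44.881, 47.482, 46.461, 45.828, 46.903, 46.383, 45.666]

# Level of the appointment factor in each run (first column of the L27 design)
appointment = [1] * 9 + [2] * 9 + [3] * 9


def level_means(levels, responses):
    """Average response at each factor level (one row set of the response table)."""
    out = {}
    for lvl in sorted(set(levels)):
        vals = [r for l, r in zip(levels, responses) if l == lvl]
        out[lvl] = sum(vals) / len(vals)
    return out


means = level_means(appointment, sn)          # 45.78 / 46.01 / 46.24 as in the table
delta = max(means.values()) - min(means.values())  # 0.46
best = max(means, key=means.get)              # larger-is-better: level 3 is optimal
```

Applying the same routine to the remaining six design columns yields the full response table, including the δ values and ranks.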


Aagja, J.P. and Garg, R. (2010), “Measuring perceived service quality for public hospitals (PubHosQual) in the Indian context”, International Journal of Pharmaceutical and Healthcare Marketing, Vol. 4 No. 1, pp. 60-83.

Akter, M.S., Upal, M. and Hani, U. (2008), “Service quality perception and satisfaction: a study over sub-urban public hospitals in Bangladesh”, Journal of Services Research, Special Issue, pp. 125-146.

Andersson, R., Eriksson, H. and Torstensson, H. (2006), “Similarities and differences between TQM, Six Sigma and lean”, The TQM Magazine, Vol. 18 No. 3, pp. 282-296.

Angelopoulou, P., Kangis, P. and Babis, G. (1998), “Private and public medicine: a comparison of quality perceptions”, International Journal of Health Care Quality Assurance, Vol. 11 No. 1, pp. 14-20.

Arasli, H., Haktan Ekiz, E. and Turan Katircioglu, S. (2008), “Gearing service quality into public and private hospitals in small islands: empirical evidence from Cyprus”, International Journal of Health Care Quality Assurance, Vol. 21 No. 1, pp. 8-23.

Baltussen, R.M., Yé, Y., Haddad, S. and Sauerborn, R.S. (2002), “Perceived quality of care of primary health care services in Burkina Faso”, Health Policy and Planning, Vol. 17 No. 1, pp. 42-48.

Bendapudi, N.M., Berry, L.L., Frey, K.A., Parish, J.T. and Rayburn, W.L. (2006), “Patients’ perspectives on ideal physician behaviors”, Mayo Clinic Proceedings, Vol. 81 No. 3, pp. 338-344.

Berwick, D.M., James, B. and Coye, M.J. (2003), “Connections between quality measurement and improvement”, Medical Care, Vol. 41 No. 1, pp. 130-138.

Black, K. and Revere, L. (2006), “Six Sigma arises from the ashes of TQM with a twist”, International Journal of Health Care Quality Assurance, Vol. 19 No. 3, pp. 259-266.

Boissy, A., Windover, A.K., Bokar, D., Karafa, M., Neuendorf, K., Frankel, R.M., Merlino, J. and Rothberg, M.B. (2016), “Communication skills training for physicians improves patient satisfaction”, Journal of General Internal Medicine, Vol. 31 No. 7, pp. 755-761.

Brook, R.H. and Williams, K.N. (1975), “Quality of health care for the disadvantaged”, Journal of Community Health, Vol. 1 No. 2, pp. 132-156.

Buntin, M.B., Burke, M.F., Hoaglin, M.C. and Blumenthal, D. (2011), “The benefits of health information technology: a review of the recent literature shows predominantly positive results”, Health Affairs, Vol. 30 No. 3, pp. 464-471.

Butler, D., Oswald, S.L. and Turner, D.E. (1996), “The effects of demographics on determinants of perceived health-care service quality: the case of users and observers”, Journal of Management in Medicine, Vol. 10 No. 5, pp. 8-20.

Camilleri, D. and O’Callaghan, M. (1998), “Comparing public and private hospital care service quality”, International Journal of Health Care Quality Assurance, Vol. 11 No. 4, pp. 127-133.

Chahal, H. and Kumari, N. (2010), “Development of multidimensional scale for healthcare service quality (HCSQ) in Indian context”, Journal of Indian Business Research, Vol. 2 No. 4, pp. 230-255.

Chakrabarty, A. and Chuan Tan, K. (2007), “The current state of Six Sigma application in services”, Managing Service Quality: An International Journal, Vol. 17 No. 2, pp. 194-208.

Che Rose, R., Uli, J., Abdul, M. and Looi Ng, K. (2004), “Hospital service quality: a managerial challenge”, International Journal of Health Care Quality Assurance, Vol. 17 No. 3, pp. 146-159.

Chilgren, A.A. (2008), “Managers and the new definition of quality”, Journal of Healthcare Management, Vol. 53 No. 4, pp. 221-229.

Choi, K.S., Lee, H., Kim, C. and Lee, S. (2005), “The service quality dimensions and patient satisfaction relationships in South Korea: comparisons across gender, age and types of service”, Journal of Services Marketing, Vol. 19 No. 3, pp. 140-149.

Coddington, D.C. and Moore, K.D. (1987), “Quality of care as a business strategy”, Healthcare Forum, Vol. 30 No. 2, pp. 29-32.

Cronin, J.J. Jr and Taylor, S.A. (1992), “Measuring service quality: a reexamination and extension”, Journal of Marketing, Vol. 56 No. 3, pp. 55-68.

Dahlgaard, J.J. and Mi Dahlgaard-Park, S. (2006), “Lean production, Six Sigma quality, TQM and company culture”, The TQM Magazine, Vol. 18 No. 3, pp. 263-281.

Dahlgaard, J.J., Pettersen, J. and Dahlgaard-Park, S.M. (2011), “Quality and lean health care: a system for assessing and improving the health of healthcare organisations”, Total Quality Management & Business Excellence, Vol. 22 No. 6, pp. 673-689.

D’Ambra, A., Amenta, P. and Lucadamo, A. (2018), “Analyzing customer requirements to select a suitable service configuration both for users and for company provider”, Social Indicators Research, pp. 1-12.

Deb, S. and Ahmed, M.A. (2019), “Quality assessment of city bus service based on subjective and objective service quality dimensions: case study in Guwahati, India”, Benchmarking: An International Journal (in press).

Donabedian, A. (1988), “The quality of care: how can it be assessed?”, JAMA, Vol. 260 No. 12, pp. 1743-1748.

Donabedian, A. (1992), “The Lichfield Lecture. Quality assurance in health care: consumers’ role”, Quality in Health Care, Vol. 1 No. 4, pp. 247-251.

Doran, D. and Smith, P. (2004), “Measuring service quality provision within an eating disorders context”, International Journal of Health Care Quality Assurance, Vol. 17 No. 7, pp. 377-388.

Dubey, A.K. and Yadava, V. (2007), “Simultaneous optimization of multiple quality characteristics in laser beam cutting using Taguchi method”, International Journal of Precision Engineering and Manufacturing, Vol. 8 No. 4, pp. 10-15.

Duggirala, M., Rajendran, C. and Anantharaman, R.N. (2008), “Patient-perceived dimensions of total quality service in healthcare”, Benchmarking: An International Journal, Vol. 15 No. 5, pp. 560-583.

Elleuch, A. (2008), “Patient satisfaction in Japan”, International Journal of Health care Quality Assurance, Vol. 21 No. 7, pp. 692-705.

Farr, M. and Cressey, P. (2015), “Understanding staff perspectives of quality in practice in healthcare”, BMC Health Services Research, Vol. 15 No. 1, pp. 123-133.

Gabbott, M. and Hogg, G. (1996), “The glory of stories: using critical incidents to understand service evaluation in the primary healthcare context”, Journal of Marketing Management, Vol. 12 No. 6, pp. 493-503.

Garvin, D.A. (1984), “Product quality: an important strategic weapon”, Business Horizons, Vol. 27 No. 3, pp. 40-43.

Gill, J. (2009), “Quality follows quality: add quality to the business and quality will multiply the profits”, The TQM Journal, Vol. 21 No. 5, pp. 530-539.

Gonzalez, M.E. (2019), “Improving customer satisfaction of a healthcare facility: reading the customers’ needs”, Benchmarking: An International Journal (in press).

Greenfield, D., Eljiz, K. and Butler-Henderson, K. (2018), “It takes two to tango: customization and standardization as colluding logics in healthcare: comment on ‘(Re)making the Procrustean bed: standardization and customization as competing logics in healthcare’”, International Journal of Health Policy and Management, Vol. 7 No. 2, pp. 183-185.

Grönroos, C. (1984), “A service quality model and its marketing implications”, European Journal of Marketing, Vol. 18 No. 4, pp. 36-44.

Gross, R. and Nirel, N. (1998), “Quality of care and patient satisfaction in budget-holding clinics”, International Journal of Health Care Quality Assurance, Vol. 11 No. 3, pp. 77-89.

Gupta, A., McDaniel, J.C. and Kanthi Herath, S. (2005), “Quality management in service firms: sustaining structures of total quality service”, Managing Service Quality: An International Journal, Vol. 15 No. 4, pp. 389-402.

Hanson, K., Gilson, L., Goodman, C., Mills, A., Smith, R., Feachem, R., Feachem, N.S., Koehlmoos, T.P. and Kinlaw, H. (2008), “Is private health care the answer to the health problems of the world’s poor?”, PLoS Medicine, Vol. 5 No. 11, pp. 1528-1532.

Headley, D.E. and Miller, S.J. (1993), “Measuring service quality and its relationship to future consumer behavior”, Marketing Health Services, Vol. 13 No. 4, pp. 32-41.

Ho, L.H., Feng, S.Y. and Yen, T.M. (2014), “A new methodology for customer satisfaction analysis: Taguchi’s signal-to-noise ratio approach”, Journal of Service Science and Management, Vol. 7 No. 3, pp. 235-244.

Hudelson, P., Cleopas, A., Kolly, V., Chopard, P. and Perneger, T. (2008), “What is quality and how is it achieved? Practitioners’ views versus quality models”, BMJ Quality & Safety, Vol. 17 No. 1, pp. 31-36.

Jabnoun, N. and Chaker, M. (2003), “Comparing the quality of private and public hospitals”, Managing Service Quality: An International Journal, Vol. 13 No. 4, pp. 290-299.

Jennett, P.A., Hall, L.A., Hailey, D., Ohinmaa, A., Anderson, C., Thomas, R., Young, B., Lorenzetti, D. and Scott, R.E. (2003), “The socio-economic impact of telehealth: a systematic review”, Journal of Telemedicine and Telecare, Vol. 9 No. 6, pp. 311-320.

John, J. (2015), “Perceived quality in health care service consumption: what are the structural dimensions?”, in Hawes, J. (Ed.), Proceedings of the 1989 Academy of Marketing Science Annual Conference, Springer, Cham, pp. 518-521.

Joosten, T., Bongers, I. and Janssen, R. (2009), “Application of lean thinking to health care: issues and observations”, International Journal for Quality in Health Care, Vol. 21 No. 5, pp. 341-347.

Jorgensen, A.L. (2019), “Nurse influence in meeting compliance with the centers for medicare and medicaid services quality measure: early management bundle, severe sepsis/septic shock (SEP-1)”, Dimensions of Critical Care Nursing, Vol. 38 No. 2, pp. 70-82.

Kara, A., Lonial, S., Tarim, M. and Zaim, S. (2005), “A paradox of service quality in Turkey: the seemingly contradictory relative importance of tangible and intangible determinants of service quality”, European Business Review, Vol. 17 No. 1, pp. 5-20.

Karassavidou, E., Glaveli, N. and Papadopoulos, C.T. (2009), “Quality in NHS hospitals: no one knows better than patients”, Measuring Business Excellence, Vol. 13 No. 1, pp. 34-46.

Kondasani, R.K.R. and Panda, R.K. (2015), “Customer perceived service quality, satisfaction and loyalty in Indian private healthcare”, International Journal of Health Care Quality Assurance, Vol. 28 No. 5, pp. 452-467.

Kumar, N.S. and Prabhakaran, B. (2011), “Service quality in health care – perspectives of stakeholders”, International Journal of Business Economics and Management Research, Vol. 2 No. 3, pp. 23-34.

Kyoon Yoo, D. and Ah Park, J. (2007), “Perceived service quality: analyzing relationships among employees, customers, and financial performance”, International Journal of Quality & Reliability Management, Vol. 24 No. 9, pp. 908-926.

Lee, D. (2017), “HEALTHQUAL: a multi-item scale for assessing healthcare service quality”, Service Business, Vol. 11 No. 3, pp. 491-516.

Lee, H., Delene, L.M., Bunda, M.A. and Kim, C. (2000), “Methods of measuring health-care service quality”, Journal of Business Research, Vol. 48 No. 3, pp. 233-246.

Lee, W.I., Chen, C.W., Chen, T.H. and Chen, C.Y. (2010), “The relationship between consumer orientation, service value, medical care service quality and patient satisfaction: the case of a medical center in Southern Taiwan”, African Journal of Business Management, Vol. 4 No. 4, pp. 448-458.

Lee, Y.C., Yen, T.M. and Tsai, C.H. (2008), “Modify IPA for quality improvement: Taguchi’s signal-to-noise ratio approach”, The TQM Journal, Vol. 20 No. 5, pp. 488-501.

Lupo, T. (2016), “A fuzzy framework to evaluate service quality in the healthcare industry: an empirical case of public hospital service evaluation in Sicily”, Applied Soft Computing, Vol. 40, pp. 468-478.

Martínez Fuentes, C. (1999), “Measuring hospital service quality: a methodological study”, Managing Service Quality: An International Journal, Vol. 9 No. 4, pp. 230-240.

Mejabi, O.V. and Olujide, J.O. (2008), “Dimensions of hospital service quality in Nigeria”, European Journal of Social Sciences, Vol. 5 No. 4, pp. 141-159.

Miranda, F.J., Chamorro, A., Murillo, L.R. and Vega, J. (2010), “Assessing primary healthcare services quality in Spain: managers vs. patients perceptions”, The Service Industries Journal, Vol. 30 No. 13, pp. 2137-2149.

Mosadeghrad, A.M. (2013), “Healthcare service quality: towards a broad definition”, International Journal of Health Care Quality Assurance, Vol. 26 No. 3, pp. 203-219.

Mosadeghrad, A.M. (2014), “Factors influencing healthcare service quality”, International Journal of Health Policy and Management, Vol. 3 No. 2, pp. 77-89.

Mostafa, M.M. (2005), “An empirical study of patients’ expectations and satisfactions in Egyptian hospitals”, International Journal of Health Care Quality Assurance, Vol. 18 No. 7, pp. 516-532.

Naidu, A. (2009), “Factors affecting patient satisfaction and healthcare quality”, International Journal of Health Care Quality Assurance, Vol. 22 No. 4, pp. 366-381.

Narang, R. (2010), “Measuring perceived quality of health care services in India”, International Journal of Health Care Quality Assurance, Vol. 23 No. 2, pp. 171-186.

Nelson, E.C., Rust, R.T., Zahorik, A., Rose, R.L., Batalden, P. and Siemanski, B.A. (1992), “Do patient perceptions of quality relate to hospital financial performance?”, Journal of Health Care Marketing, Vol. 12 No. 4, pp. 6-13.

O’Connor, A.M., Stacey, D., Entwistle, V., Llewellyn-Thomas, H., Rovner, D., Holmes-Rovner, M. et al. (2003), “Decision aids for people facing health treatment or screening decisions”, Cochrane Database of Systematic Reviews, Vol. 2003 No. 1.

Or, C.K. and Karsh, B.T. (2009), “A systematic review of patient acceptance of consumer health information technology”, Journal of the American Medical Informatics Association, Vol. 16 No. 4, pp. 550-560.

Owusu-Frimpong, N., Nwankwo, S. and Dason, B. (2010), “Measuring service quality and patient satisfaction with access to public and private healthcare delivery”, International Journal of Public Sector Management, Vol. 23 No. 3, pp. 203-220.

Padma, P., Rajendran, C. and Sai Lokachari, P. (2010), “Service quality and its impact on customer satisfaction in Indian hospitals: perspectives of patients and their attendants”, Benchmarking: An International Journal, Vol. 17 No. 6, pp. 807-841.

Parasuraman, A., Zeithaml, V.A. and Berry, L.L. (1985), “A conceptual model of service quality and its implications for future research”, The Journal of Marketing, Vol. 49 No. 4, pp. 41-50.

Prejmerean, C. and Vasilache, S. (2009), “Study regarding customer perception of healthcare service quality in Romanian clinics, based on their profile”, Amfiteatru Economic Journal, Vol. 11 No. 26, pp. 298-304.

Raajpoot, N., Javed, R. and Koh, K. (2008), “Application of Taguchi design to retail service”, International Journal of Commerce and Management, Vol. 18 No. 2, pp. 184-199.

Rao, K.D., Peters, D.H. and Bandeen-Roche, K. (2006), “Towards patient-centered health services in India – a scale to measure patient perceptions of quality”, International Journal for Quality in Health Care, Vol. 18 No. 6, pp. 414-421.

Raposo, M.L., Alves, H.M. and Duarte, P.A. (2009), “Dimensions of service quality and satisfaction in healthcare: a patient’s satisfaction index”, Service Business, Vol. 3 No. 1, pp. 85-100.

Reeves, C.A. and Bednar, D.A. (1994), “Defining quality: alternatives and implications”, Academy of Management Review, Vol. 19 No. 3, pp. 419-445.

Roshnee Ramsaran-Fowdar, R. (2008), “The relative importance of service dimensions in a healthcare setting”, International Journal of Health Care Quality Assurance, Vol. 21 No. 1, pp. 104-124.

Senarath, U. and Gunawardena, N.S. (2011), “Development of an instrument to measure patient perception of the quality of nursing care and related hospital services at the national hospital of Sri Lanka”, Asian Nursing Research, Vol. 5 No. 2, pp. 71-80.

Shafei, I., Walburg, J.A. and Taher, A.F. (2015), “Healthcare service quality: what really matters to the female patient?”, International Journal of Pharmaceutical and Healthcare Marketing, Vol. 9 No. 4, pp. 369-391.

Shemwell, D.J. and Yavas, U. (1999), “Measuring service quality in hospitals: scale development and managerial applications”, Journal of Marketing Theory and Practice, Vol. 7 No. 3, pp. 65-75.

Singh, G., Pai, R.S. and Devi, V.K. (2012), “Response surface methodology and process optimization of sustained release pellets using Taguchi orthogonal array design and central composite design”, Journal of Advanced Pharmaceutical Technology & Research, Vol. 3 No. 1, pp. 30-40.

Sofaer, S. and Firminger, K. (2005), “Patient perceptions of the quality of health services”, Annual Review of Public Health, Vol. 26, pp. 513-559.

Sower, V., Duffy, J., Kilbourne, W., Kohers, G. and Jones, P. (2001), “The dimensions of service quality for hospitals: development and use of the KQCAH scale”, Health Care Management Review, Vol. 26 No. 2, pp. 47-59.

Stauss, B. and Weinlich, B. (1997), “Process-oriented measurement of service quality: applying the sequential incident technique”, European Journal of Marketing, Vol. 31 No. 1, pp. 33-55.

Taguchi, G., Chowdhury, S. and Wu, Y. (2005), Taguchi’s Quality Engineering Handbook, John Wiley & Sons, Hoboken, NJ.

Teng, C.I., Ing, C.K., Chang, H.Y. and Chung, K.P. (2007), “Development of service quality scale for surgical hospitalization”, Journal of the Formosan Medical Association, Vol. 106 No. 6, pp. 475-484.

Tomes, A.E. and Ng, S.C.P. (1995), “Service quality in hospital care: the development of an in-patient questionnaire”, International Journal of Health Care Quality Assurance, Vol. 8 No. 3, pp. 25-33.

Vandamme, R. and Leunis, J. (1993), “Development of a multiple-item scale for measuring hospital service quality”, International Journal of Service Industry Management, Vol. 4 No. 3, pp. 30-49.

Van Duong, D., Binns, C.W., Lee, A.H. and Hipgrave, D.B. (2004), “Measuring client-perceived quality of maternity services in rural Vietnam”, International Journal for Quality in Health Care, Vol. 16 No. 6, pp. 447-452.

Wali, A.F. and Nwokah, N.G. (2018), “Understanding customers’ expectations for delivering satisfactory and competitive services experience”, International Journal of Electronic Marketing and Retailing, Vol. 9 No. 3, pp. 254-268.

Ware, J.E. (1978), “The measurement of patient satisfaction”, Health and Medical Care Services Review, Vol. 1 No. 1, pp. 5-15.

Weng, H.C., Hung, C.M., Liu, Y.T., Cheng, Y.J., Yen, C.Y., Chang, C.C. and Huang, C.K. (2011), “Associations between emotional intelligence and doctor burnout, job satisfaction and patient satisfaction”, Medical Education, Vol. 45 No. 8, pp. 835-842.

Williams, S., Weinman, J. and Dale, J. (1998), “Doctor–patient communication and patient satisfaction”, Family Practice, Vol. 15 No. 5, pp. 480-492.

Yang, C.C., Jou, Y.T. and Cheng, L.Y. (2011), “Using integrated quality assessment for hotel service quality”, Quality & Quantity, Vol. 45 No. 2, pp. 349-364.

Yarimoglu, E.K. (2014), “A review on dimensions of service quality models”, Journal of Marketing Management, Vol. 2 No. 2, pp. 79-93.

Young, T. and McClean, S. (2009), “Some challenges facing lean thinking in healthcare”, International Journal for Quality in Health Care, Vol. 21 No. 5, pp. 309-310.

Zabada, C., Rivers, P.A. and Munchus, G. (1998), “Obstacles to the application of total quality management in health-care organizations”, Total Quality Management, Vol. 9 No. 1, pp. 57-66.

Zifko-Baliga, G.M. and Krampf, R.F. (1997), “Managing perceptions of hospital quality”, Marketing Health Services, Vol. 17 No. 3, pp. 28-31.

Zineldin, M. (2006), “The quality of health care and patient satisfaction: an exploratory investigation of the 5Qs model at some Egyptian and Jordanian medical clinics”, International Journal of Health Care Quality Assurance, Vol. 19 No. 1, pp. 60-92.

Corresponding author

Rejikumar G. can be contacted at: