Search results
1–10 of over 14,000
N. Ahmad, M.G.M. Khan and L.S. Rafi
Abstract
Purpose
The purpose of this paper is to investigate how to incorporate the exponentiated Weibull (EW) testing‐effort function (TEF) into inflection S‐shaped software reliability growth models (SRGMs) based on non‐homogeneous Poisson process (NHPP). The aim is also to present a more flexible SRGM with imperfect debugging.
Design/methodology/approach
This paper reviews the EW TEFs and discusses inflection S‐shaped SRGM with EW testing‐effort to get a better description of the software fault detection phenomenon. The SRGM parameters are estimated by weighted least square estimation (WLSE) and maximum‐likelihood estimation (MLE) methods. Furthermore, the proposed models are also discussed under imperfect debugging environment.
Findings
Experimental results from three actual data applications are analyzed and compared with the other existing models. The findings reveal that the proposed SRGM has better performance and prediction capability. Results also confirm that the EW TEF is suitable for incorporating into inflection S‐shaped NHPP growth models.
Research limitations/implications
This paper presents the WLSE results with equal weight. Future research may be carried out for unequal weights.
Practical implications
Software reliability modeling and estimation are a major concern in the software development process, particularly during the software testing phase, as unreliable software can cause a failure in the computer system that can be hazardous. The results obtained in this paper may help software engineers, scientists, and managers improve the software testing process.
Originality/value
The proposed SRGM has a flexible structure and may capture features of both exponential and S‐shaped NHPP growth models for failure phenomenon.
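As a rough illustration of the model class described in this abstract, the sketch below computes a cumulative exponentiated Weibull (EW) testing-effort curve and feeds it into an inflection S-shaped NHPP mean value function. The functional forms and parameter names (alpha, beta, gamma, theta, a, r, psi) are common in the SRGM literature but are assumptions here, not the paper's exact formulation.

```python
import math

def ew_testing_effort(t, alpha, beta, gamma, theta):
    """Cumulative EW testing effort consumed by time t.

    alpha: total effort eventually consumed;
    beta, gamma: Weibull scale/shape; theta: exponentiation parameter.
    """
    return alpha * (1.0 - math.exp(-beta * t ** gamma)) ** theta

def inflection_s_mean_value(t, a, r, psi, alpha, beta, gamma, theta):
    """Expected cumulative faults detected by time t for an inflection
    S-shaped NHPP model driven by EW testing effort.

    a: initial fault content; r: fault-detection rate per unit effort;
    psi: inflection parameter controlling the S-shape.
    """
    w = ew_testing_effort(t, alpha, beta, gamma, theta)
    return a * (1.0 - math.exp(-r * w)) / (1.0 + psi * math.exp(-r * w))
```

The mean value function starts at zero, increases monotonically with consumed effort, and saturates below the initial fault content a, which is the qualitative behavior the abstract attributes to the model.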
A.A. (Alex) Alblas and J.C. (Hans) Wortmann
Abstract
Purpose
Success in manufacturing industries often depends on the ability of a firm to apply product platforms. In speeding up product development, platforms often enable companies to benefit from scale effects by reusing existing components in the development of new products. In the delivery of complex products and systems (CoPS), however, platforms are frequently modified, since components have to be changed within their life cycle to meet additional customer-specific engineering demands and evolving innovations in technology. This article illustrates that intangible design elements can be used as platforms in firms that deliver CoPS. The paper aims to discuss these issues.
Design/methodology/approach
Through extensive fieldwork at a leading supplier of science-based lithography machinery, a modified platform concept was developed and tested that is labelled as the function-technology (FT) platform. The longitudinal data, collected on site, demonstrate positive effects of applying FT platforms.
Findings
The results show that an important explanation for the firm's success in delivering lithography machinery with attractive performance is its ability to deliver variants that are specific in terms of physical modules but common in terms of functions and technologies. Based on the results, it can be argued that establishing an FT platform allows the efficient creation of variants within a family of CoPS.
Originality/value
The findings add considerable insight to the existing literature on operations management by explaining how intangible design elements, instigated during development, can be reused in the delivery of CoPS. Moreover, it enables development managers to more easily structure and reuse complex development tasks.
Abstract
Purpose
Based on a dynamic capability (DC) view, the purpose of this paper is to explore whether market orientation (MO) (external) and learning orientation (LO) (internal) facilitate internationalizing small- and medium-sized enterprises’ (ISMEs) global dynamic capabilities (GDCs) – i.e., their global marketing and product-design capabilities – and promote firm performance.
Design/methodology/approach
Empirical data were collected from randomly selected Taiwanese ISMEs, yielding 206 valid responses. The informants' (CEOs, vice presidents, senior managers) knowledge of, and responsibility for, firm operations are explored.
Findings
A significant increase in global marketing and product-design capabilities is found to affect firm performance. MO and LO positively influence the development of GDCs, which in turn increase firm performance.
Research limitations/implications
The sample is reasonably diverse in terms of demographics including firm location, size, industry, and market type. Disaggregation results are generally robust regarding model parameters. However, future research should target different countries to assess result generalizability.
Practical implications
The findings reveal two practical implications for managers. First, successful GDCs help firms spread the costs of designing products or components across many contexts and to offer appealing products to consumers worldwide. Second, it is important that managers foster development of MOs and LOs.
Originality/value
The study contributes to the literature in two ways. First, by conceptualizing GDCs of ISMEs, DC literature is expanded based on a global context. Second, the complexity of extending DC literature into ISMEs may arise from the fact that ISMEs, as separate and living entities, devise their own organizational culture, which significantly affects their GDC development.
Nesar Ahmad, M.U. Bokhari, S.M.K. Quadri and M.G.M. Khan
Abstract
Purpose
The purpose of this research is to incorporate the exponentiated Weibull testing‐effort functions into software reliability modeling and to estimate the optimal software release time.
Design/methodology/approach
This paper suggests a software reliability growth model based on the non‐homogeneous Poisson process (NHPP) which incorporates the exponentiated Weibull (EW) testing‐efforts.
Findings
Experimental results on actual data from three software projects are compared with other existing models, revealing that the proposed software reliability growth model with EW testing-effort is a more general and effective SRGM.
Research limitations/implications
This paper presents an SRGM using a constant error detection rate per unit of testing-effort.
Practical implications
Software reliability growth model is one of the fundamental techniques to assess software reliability quantitatively. The results obtained in this paper will be useful during the software testing process.
Originality/value
The present scheme has a flexible structure and may cover many of the earlier results on software reliability growth modeling. In general, this paper also provides a framework in which many software reliability growth models can be described.
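To make the release-time idea in this abstract concrete, here is a hedged sketch of a cost-based optimal release policy: expected cost trades off in-test fault fixes, more expensive field fixes, and testing-effort expenditure, and the cost-minimizing release time is found by grid search. The cost structure, EW effort curve, and all parameter values are illustrative assumptions, not the paper's exact model.

```python
import math

def ew_effort(t, alpha, beta, gamma, theta):
    # Cumulative exponentiated Weibull testing effort consumed by time t.
    return alpha * (1.0 - math.exp(-beta * t ** gamma)) ** theta

def mean_faults(t, a, r, alpha, beta, gamma, theta):
    # NHPP expected faults detected by time t under EW testing effort.
    return a * (1.0 - math.exp(-r * ew_effort(t, alpha, beta, gamma, theta)))

def total_cost(T, a, r, c1, c2, c3, alpha, beta, gamma, theta):
    """Expected cost of releasing at time T.

    c1: cost of fixing a fault during testing;
    c2: cost of fixing a fault in the field (typically c2 > c1);
    c3: cost per unit of testing effort expended.
    """
    m = mean_faults(T, a, r, alpha, beta, gamma, theta)
    return c1 * m + c2 * (a - m) + c3 * ew_effort(T, alpha, beta, gamma, theta)

def optimal_release_time(a, r, c1, c2, c3, alpha, beta, gamma, theta,
                         horizon=100.0, steps=10000):
    # Grid search over [0, horizon] for the cost-minimizing release time.
    best_t, best_c = 0.0, float("inf")
    for i in range(steps + 1):
        t = horizon * i / steps
        c = total_cost(t, a, r, c1, c2, c3, alpha, beta, gamma, theta)
        if c < best_c:
            best_t, best_c = t, c
    return best_t, best_c
```

With field fixes far costlier than in-test fixes, the minimum sits at an interior release time: testing longer first saves field-failure cost, then effort expenditure dominates.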
Abstract
Purpose
The use of software is overpowering our modern society. Advances in technology drive user demand, which in turn increases the burden on software firms to develop high-quality and reliable software. To meet these demands, software firms need to upgrade existing versions. The upgrade process may introduce additional faults in successive versions of the software, and faults that remain undetected in the previous version are passed on to the new release. As this process is complicated and time-consuming, it is important for firms to allocate resources optimally during the testing phase of the software development life cycle (SDLC). The resource-allocation task becomes even more challenging when testing is carried out in a dynamic environment.
Design/methodology/approach
The model presented in this paper explains the methodology for estimating testing efforts in a dynamic environment, under the assumption that the debugging cost for each release follows a learning-curve phenomenon. We use an optimal control theoretic approach to find the optimal policies and a genetic algorithm to estimate the testing effort. Further, a numerical illustration is given to validate the applicability of the proposed model using a real-life software failure data set.
Findings
The paper yields several substantive insights for software managers. The study shows that the estimated testing efforts, as well as the faults detected for both releases, are close to the real data set.
Originality/value
We have proposed a dynamic resource allocation model for multirelease of software with the objective to minimize the total testing cost using the flexible software reliability growth model (SRGM).
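The genetic-algorithm step mentioned above can be sketched in miniature. The toy real-coded GA below minimizes an arbitrary scalar cost function, standing in for a testing-cost criterion; the operators (truncation selection, arithmetic crossover, Gaussian mutation) and all parameter choices are illustrative assumptions, not the authors' implementation.

```python
import random

def genetic_minimize(cost, lower, upper, pop_size=40, generations=60,
                     mutation=0.1, seed=1):
    """Evolve scalar candidates in [lower, upper] toward the minimum of
    `cost` with a tiny real-coded genetic algorithm."""
    rng = random.Random(seed)
    pop = [rng.uniform(lower, upper) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        elite = pop[: pop_size // 2]          # truncation selection (elitist)
        children = []
        while len(elite) + len(children) < pop_size:
            a, b = rng.sample(elite, 2)
            child = (a + b) / 2.0             # arithmetic crossover
            child += rng.gauss(0.0, mutation * (upper - lower))  # mutation
            children.append(min(max(child, lower), upper))       # clip to bounds
        pop = elite + children
    return min(pop, key=cost)
```

Because the elite half survives every generation, the best candidate found so far is never lost, which is the usual elitism argument for convergence on simple unimodal costs.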
N. Ahmad, M.G.M. Khan, S.M.K. Quadri and M. Kumar
Abstract
Purpose
The purpose of this research paper is to discuss a software reliability growth model (SRGM) based on the non‐homogeneous Poisson process which incorporates the Burr type X testing‐effort function (TEF), and to determine the optimal release‐time based on cost‐reliability criteria.
Design/methodology/approach
It is shown that the Burr type X TEF can be expressed as a software development/testing‐effort consumption curve. Weighted least squares estimation method is proposed to estimate the TEF parameters. The SRGM parameters are estimated by the maximum likelihood estimation method. The standard errors and confidence intervals of SRGM parameters are also obtained. Furthermore, the optimal release‐time determination based on cost‐reliability criteria has been discussed within the framework.
Findings
The performance of the proposed SRGM is demonstrated by using actual data sets from three software projects. Results are compared with other traditional SRGMs to show that the proposed model has a fairly better prediction capability and that the Burr type X TEF is suitable for incorporating into software reliability modelling. Results also reveal that the SRGM with Burr type X TEF can estimate the number of initial faults better than that of other traditional SRGMs.
Research limitations/implications
The paper presents the estimation method with equal weights. Future research may include extending the present study to unequal weights.
Practical implications
The new SRGM may be useful in detecting more faults that are difficult to find during regular testing, and in assisting software engineers to improve their software development process.
Originality/value
The incorporated TEF is flexible and can be used to describe the actual expenditure patterns more faithfully during software development.
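As a sketch of the two ingredients this abstract names, the code below defines a Burr type X cumulative testing-effort curve and the weighted least-squares criterion one would minimize to fit its parameters to observed (time, cumulative effort) data. The functional form W(t) = alpha * (1 - exp(-beta * t^2))^theta and the parameter names are assumptions based on the Burr type X distribution, not the paper's exact notation, and equal weights are used by default to match the equal-weight scheme mentioned in the limitations.

```python
import math

def burr_x_effort(t, alpha, beta, theta):
    """Cumulative Burr type X testing effort consumed by time t.
    alpha: total effort; beta: scale; theta: shape (exponentiation)."""
    return alpha * (1.0 - math.exp(-beta * t * t)) ** theta

def wlse_objective(params, data, weights=None):
    """Weighted least-squares criterion for fitting the TEF to observed
    (time, cumulative effort) pairs; equal weights by default."""
    alpha, beta, theta = params
    if weights is None:
        weights = [1.0] * len(data)
    return sum(w * (wk - burr_x_effort(tk, alpha, beta, theta)) ** 2
               for w, (tk, wk) in zip(weights, data))
```

In practice this objective would be handed to a numerical optimizer; here it simply makes explicit what "weighted least squares estimation of the TEF parameters" means.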
Robert J. Vokurka, Gail M. Zank and Carl M. Lund
Abstract
Supply chains can improve their performance by developing competitive priorities in a specified sequence: quality, reliability, flexibility, agility, and finally, cost efficiency. This paper extends Ferdows and De Meyer's (1990) sand cone model and Vokurka and Fliedner's (1998) sand cone model extension incorporating agility to supply chain management priorities. This work provides a framework for a cumulative and sustainable improvement process by which supply chains can build a strategic competitive advantage.
Sara Hajmohammad and Anton Shevchenko
Abstract
Purpose
Many modern firms strive to become sustainable. To this end, they are required to improve not only their own environmental and social performance but also the performance of their suppliers. Building on population ecology theory, we explore how buyers' exposure to supplier sustainability risk and their subsequent risk management strategies at the buyer–supplier dyad level can lead to adherence to sustainability by the supplier populations.
Design/methodology/approach
We rely on a bottom-up research design, in which the actions of buyers within buyer–supplier dyads lead to population-wide changes on the supplier side. Specifically, we use experimental data on managing sustainability risk to build an agent-based simulation model and assess the effect of evolutionary processes on the presence of sustainable/unsustainable business practices in the supplier population.
Findings
Our findings suggest that buyers' cumulative actions in managing sustainability risk do not necessarily result in effective population-wide improvements (i.e. at a high rate and to a high degree). For example, under high risk-impact conditions, the buyer population is usually able to decrease the population-level risk in the long run, but it needs both power and resources to achieve such improved outcomes quickly. Importantly, this positive change is, in most cases, due to the buyer population selecting out the suppliers with a high probability of misconduct (i.e. decreased supplier population density).
Originality/value
Drawing on organizational population ecology theory, we explore when, to what degree and how quickly buyers' cumulative efforts can lead to population-wide changes in the level of supplier sustainability risk, as well as in the composition and density of the supplier population. Methodologically, this paper is one of the first studies to combine experimental data with agent-based modeling to offer more valuable insights into supply networks.
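The population-ecology mechanism described here (buyers selecting high-risk suppliers out of the population, lowering both population-level risk and density) can be illustrated with a toy agent-based loop. Everything below, from the uniform risk distribution to the detection and drop rules, is an illustrative assumption, not the authors' simulation model.

```python
import random

def simulate_selection(n_suppliers=200, rounds=20, drop_threshold=0.6,
                       detection_prob=0.3, seed=42):
    """Toy sketch: buyers detect misconduct and select out suppliers whose
    misconduct probability exceeds a threshold.
    Returns (initial_mean_risk, final_mean_risk, final_density)."""
    rng = random.Random(seed)
    risks = [rng.random() for _ in range(n_suppliers)]  # per-supplier misconduct prob.
    initial = sum(risks) / len(risks)
    for _ in range(rounds):
        survivors = []
        for p in risks:
            # Misconduct occurs with probability p; buyers detect it with
            # probability detection_prob and drop detected high-risk suppliers.
            caught = rng.random() < p and rng.random() < detection_prob
            if caught and p > drop_threshold:
                continue  # supplier selected out of the population
            survivors.append(p)
        risks = survivors
    final = sum(risks) / len(risks) if risks else 0.0
    return initial, final, len(risks) / n_suppliers
```

Even in this stripped-down form, the mean risk falls mainly because risky suppliers exit rather than because any supplier improves, mirroring the density-reduction finding reported above.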
Ben Shaw‐Ching Liu, Nicholas C. Petruzzi and D. Sudharshan
Abstract
Purpose
The purpose of this paper is to apply customer lifetime value models to assess the overall value of the service encounter and to establish implications that such an assessment has for managing customer relationships under a fixed‐size salesforce.
Design/methodology/approach
Using a specific relationship between customer servicing activities and the buying rhythms of customers, an analytical model for assessing the overall value of a service encounter is developed.
Findings
A stochastic parameter characterizing the level of service quality is identified and used to compute the long-term value of a given customer; stochastic ordering properties are then used to determine the relative value of different customers.
Research limitations/implications
The implications discussed are analytical, intended to help service managers shape their thought processes in decision making. Future research can empirically test the proposed model.
Practical implications
The theorem specifies the optimal solutions for determining how much capacity should be committed to a given customer, and how to choose a customer in the first place. These are important and useful tools for managers making decisions in service marketing.
Originality/value
A general model of resource allocation is provided, under which seminal models such as CALLPLAN and DETAILER are special cases. This is particularly valuable as key account management has become more important in globally operated businesses.
Xiaoyan Qian, Hao Yin and Xiaotong Li
Abstract
Purpose
This paper aims to explore the influence of marketing investment on drug diffusion processes, to analyze the heterogeneity of the diffusion characteristics and to understand the drug diffusion patterns in the prescription and over-the-counter (OTC) markets.
Design/methodology/approach
The study introduces marketing investment into the Bass model. The authors use the Generalized Bass Model (GBM) to examine the influence of marketing efforts on drug diffusion in Chinese prescription and OTC markets.
Findings
The results of this study suggest that the imitation effect in the prescription drug market is greater than that in the OTC drug market; drug diffusion in the OTC market reaches saturation earlier in the diffusion process. Before reaching the critical state, the effect of marketing investment on drug diffusion in the OTC market is greater than that in the prescription market, and after the critical state, drug diffusion in the prescription market is more sensitive to marketing investment.
Originality/value
The study demonstrates the value of the GBM in empirical analyses of drug diffusion across two distinct markets and shows that the marketing regulation policies governments adopt have a powerful impact on the speed at which drugs become available in different markets. It enriches the extant product diffusion literature by highlighting the different diffusion patterns of the two segments of the pharmaceutical market.
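The Generalized Bass Model used in this study can be sketched in discrete time: the baseline Bass hazard (p + q*F) is scaled by a current-marketing-effort term x(t). The Euler-style update, the constant-effort examples, and the parameter values below are illustrative assumptions, not the paper's estimated model for the Chinese drug markets.

```python
def gbm_adoption(p, q, marketing, horizon, dt=1.0):
    """Discrete-time sketch of the Generalized Bass Model (GBM):
    dF = (p + q*F) * (1 - F) * x(t) * dt, where x(t) scales the baseline
    Bass hazard with current marketing effort.

    p: innovation coefficient; q: imitation coefficient;
    marketing: function t -> x(t).
    Returns the list of cumulative adoption fractions F(0..horizon).
    """
    F = 0.0
    path = [F]
    for step in range(horizon):
        t = step * dt
        F += (p + q * F) * (1.0 - F) * marketing(t) * dt
        F = min(F, 1.0)  # guard against overshoot from the discrete step
        path.append(F)
    return path
```

Comparing a baseline run (x(t) = 1, which recovers the ordinary Bass model) with a boosted run (x(t) > 1) shows the qualitative effect the abstract discusses: greater marketing effort pulls the whole diffusion curve forward.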