Search results

1 – 10 of over 14,000
Article
Publication date: 12 January 2010

N. Ahmad, M.G.M. Khan and L.S. Rafi

Abstract

Purpose

The purpose of this paper is to investigate how to incorporate the exponentiated Weibull (EW) testing‐effort function (TEF) into inflection S‐shaped software reliability growth models (SRGMs) based on non‐homogeneous Poisson process (NHPP). The aim is also to present a more flexible SRGM with imperfect debugging.

Design/methodology/approach

This paper reviews the EW TEFs and discusses inflection S‐shaped SRGM with EW testing‐effort to get a better description of the software fault detection phenomenon. The SRGM parameters are estimated by weighted least square estimation (WLSE) and maximum‐likelihood estimation (MLE) methods. Furthermore, the proposed models are also discussed under imperfect debugging environment.
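As a sketch of the model class this abstract describes (the parameterization below is assumed for illustration, not quoted from the paper), the EW testing-effort function and the inflection S-shaped NHPP mean value function it drives can be written as:

```latex
% Cumulative EW testing effort and instantaneous effort (assumed form)
W(t) = \alpha\left[1 - e^{-\beta t^{\gamma}}\right]^{\theta},
\qquad w(t) = \frac{\mathrm{d}W(t)}{\mathrm{d}t}
% Inflection S-shaped mean value function driven by the effort W(t);
% a = expected initial faults, b = fault detection rate,
% \psi = inflection factor (hypothetical symbol names)
m(t) = \frac{a\left(1 - e^{-b\,W(t)}\right)}{1 + \psi\, e^{-b\,W(t)}}
```

With \(\theta = 1\) the effort curve reduces to the ordinary Weibull TEF, which is what gives the EW family its extra flexibility.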

Findings

Experimental results from three actual data applications are analyzed and compared with the other existing models. The findings reveal that the proposed SRGM has better performance and prediction capability. Results also confirm that the EW TEF is suitable for incorporating into inflection S‐shaped NHPP growth models.

Research limitations/implications

This paper presents the WLSE results with equal weight. Future research may be carried out for unequal weights.

Practical implications

Software reliability modeling and estimation are a major concern in the software development process, particularly during the software testing phase, as unreliable software can cause a failure in the computer system that can be hazardous. The results obtained in this paper may facilitate the software engineers, scientists, and managers in improving the software testing process.

Originality/value

The proposed SRGM has a flexible structure and may capture features of both exponential and S‐shaped NHPP growth models for failure phenomenon.

Details

International Journal of Quality & Reliability Management, vol. 27 no. 1
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 1 April 2014

A.A. (Alex) Alblas and J.C. (Hans) Wortmann

Abstract

Purpose

Success in manufacturing industries often depends on a firm's ability to apply product platforms. In speeding up product development, platforms often enable companies to benefit from scale effects by reusing existing components in the development of new products. In the delivery of complex products and systems (CoPS), however, platforms are frequently modified, since components have to be changed within their life cycle to meet additional customer-specific engineering demands and evolving innovations in technology. This article illustrates that intangible design elements can be used as platforms in firms that deliver CoPS.

Design/methodology/approach

Through extensive fieldwork at a leading supplier of science-based lithography machinery, a modified platform concept was developed and tested that is labelled as the function-technology (FT) platform. The longitudinal data, collected on site, demonstrate positive effects of applying FT platforms.

Findings

The results show that an important explanation for the firm's success in delivering lithography machinery with attractive performance is its ability to deliver variants that are specific in terms of physical modules but common in terms of functions and technologies. Based on the results, it can be argued that establishing an FT platform will allow the efficient creation of variants within a family of CoPS.

Originality/value

The findings add considerable insight to the existing literature on operations management by explaining how intangible design elements, instigated during development, can be reused in the delivery of CoPS. Moreover, it enables development managers to more easily structure and reuse complex development tasks.

Details

International Journal of Operations & Production Management, vol. 34 no. 4
Type: Research Article
ISSN: 0144-3577

Article
Publication date: 3 July 2017

Yao-Ping Peng and Ku-Ho Lin

Abstract

Purpose

Based on a dynamic capability (DC) view, the purpose of this paper is to explore whether market orientation (MO) (external) and learning orientation (LO) (internal) facilitate internationalizing small- and medium-sized enterprises’ (ISMEs) global dynamic capabilities (GDCs) – i.e., their global marketing and product-design capabilities – and promote firm performance.

Design/methodology/approach

Empirical data were randomly collected from Taiwanese ISMEs, yielding 206 valid responses. Informants (CEOs, vice presidents, and senior managers) were chosen for their knowledge of, and responsibility for, firm activities.

Findings

Global marketing and product-design capabilities are found to significantly affect firm performance. Furthermore, MO and LO positively influence the development of these GDCs, which in turn increase firm performance.

Research limitations/implications

The sample is reasonably diverse in terms of demographics including firm location, size, industry, and market type. Disaggregation results are generally robust regarding model parameters. However, future research should target different countries to assess result generalizability.

Practical implications

The findings reveal two practical implications for managers. First, successful GDCs help firms spread the costs of designing products or components across many contexts and to offer appealing products to consumers worldwide. Second, it is important that managers foster development of MOs and LOs.

Originality/value

The study contributes to the literature in two ways. First, by conceptualizing GDCs of ISMEs, DC literature is expanded based on a global context. Second, the complexity of extending DC literature into ISMEs may arise from the fact that ISMEs, as separate and living entities, devise their own organizational culture, which significantly affects their GDC development.

Details

Baltic Journal of Management, vol. 12 no. 3
Type: Research Article
ISSN: 1746-5265

Book part
Publication date: 10 July 2014

Abstract

Purpose

To explain how cumulative efforts contribute to learning and literacy development.

Design/methodology/approach

A representation of how efforts lead to lasting growth is discussed through a variety of historical and current perspectives across content disciplines. This chapter depicts how positive experiences can promote further success, and shows that recognizing one's cumulative efforts, and their effects, is fundamental to educational attainment.

Findings

The value one places on tasks such as reading or writing is often aligned with the frequency with which those activities occur. Students view their time and effort as capital: these are their most valued possessions, and how they allocate them is a choice.

Practical implications

For students to become avid readers and writers, we must utilize a host of strategies to impress the notion that these activities are worth their attention, time, and investment.

Details

Theoretical Models of Learning and Literacy Development
Type: Book
ISBN: 978-1-78350-821-1

Article
Publication date: 25 January 2008

Nesar Ahmad, M.U. Bokhari, S.M.K. Quadri and M.G.M. Khan

Abstract

Purpose

The purpose of this research is to incorporate the exponentiated Weibull testing‐effort functions into software reliability modeling and to estimate the optimal software release time.

Design/methodology/approach

This paper suggests a software reliability growth model based on the non‐homogeneous Poisson process (NHPP) which incorporates the exponentiated Weibull (EW) testing‐efforts.
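The setup this abstract describes can be sketched numerically. Everything below is invented for illustration: the EW parameterization, the cost coefficients, the parameter values, and the crude grid search are a toy stand-in, not the paper's estimation or release-time procedure.

```python
import numpy as np

def ew_effort(t, alpha, beta, gamma, theta):
    """Cumulative exponentiated Weibull testing effort W(t) (assumed form)."""
    return alpha * (1.0 - np.exp(-beta * t ** gamma)) ** theta

def mean_failures(t, a, b, alpha, beta, gamma, theta):
    """NHPP mean value function m(t) = a * (1 - exp(-b * W(t)))."""
    return a * (1.0 - np.exp(-b * ew_effort(t, alpha, beta, gamma, theta)))

def release_time(c_test, c_field, c_effort, a, b, alpha, beta, gamma, theta,
                 horizon=100.0):
    """Grid search for a release time T minimizing a simple cost model:
    cost(T) = c_test*m(T) + c_field*(a - m(T)) + c_effort*W(T),
    i.e. testing cost + penalty for faults left to the field + effort spent."""
    ts = np.linspace(0.1, horizon, 2000)
    m = mean_failures(ts, a, b, alpha, beta, gamma, theta)
    w = ew_effort(ts, alpha, beta, gamma, theta)
    cost = c_test * m + c_field * (a - m) + c_effort * w
    return ts[np.argmin(cost)]
```

With a field-fault penalty much larger than the in-testing debugging cost, the minimizing T falls in the interior of the horizon: testing longer stops paying once the detection rate per unit of remaining effort drops below the marginal effort cost.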

Findings

Experimental results on actual data from three software projects are compared with other existing models, revealing that the proposed SRGM with EW testing-effort is more flexible and effective.

Research limitations/implications

This paper presents an SRGM using a constant error detection rate per unit testing-effort.

Practical implications

Software reliability growth model is one of the fundamental techniques to assess software reliability quantitatively. The results obtained in this paper will be useful during the software testing process.

Originality/value

The present scheme has a flexible structure and may cover many of the earlier results on software reliability growth modeling. In general, this paper also provides a framework in which many software reliability growth models can be described.

Details

International Journal of Quality & Reliability Management, vol. 25 no. 2
Type: Research Article
ISSN: 0265-671X

Book part
Publication date: 1 August 2004

Kevin D Carlson and Donald E Hatfield

Abstract

In this chapter we ask a simple question: how can we tell if strategic management research is making progress? While other limitations are noted, we argue that it is the absence of metrics for gauging research progress that is most limiting. We propose that research should focus on measures of effect size and that “precision” and “generalizability” in our predictions of important phenomena represent the core metrics that should be used to judge whether progress is occurring. We then discuss how to employ these metrics and examine why existing research practices are likely to hinder efforts to develop cumulative knowledge.

Details

Research Methodology in Strategy and Management
Type: Book
ISBN: 978-1-84950-235-1

Article
Publication date: 20 February 2020

Vijay Kumar and Ramita Sahni

Abstract

Purpose

The use of software is overpowering our modern society. Advances in technology drive an increase in user demand, which in turn increases the burden on software firms to develop high-quality, reliable software. To meet these demands, software firms need to upgrade existing versions. The upgrade process may introduce additional faults in successive versions of the software, and faults that remain undetected in the previous version are passed on to the new release. As this process is complicated and time-consuming, it is important for firms to allocate resources optimally during the testing phase of the software development life cycle (SDLC). The resource allocation task becomes even more challenging when testing is carried out in a dynamic environment.

Design/methodology/approach

The model presented in this paper explains the methodology for estimating testing efforts in a dynamic environment, under the assumption that the debugging cost for each release follows a learning-curve phenomenon. We use an optimal control theoretic approach to find the optimal policies and a genetic algorithm to estimate the testing effort. Further, a numerical illustration is given to validate the applicability of the proposed model using a real-life software failure data set.
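The genetic-algorithm idea can be sketched on a caricature of the problem. The two-release cost function, the learning-curve discount, and every number below are invented for illustration; the paper's model and algorithm are more elaborate.

```python
import numpy as np

rng = np.random.default_rng(0)

def total_cost(w, budget=100.0):
    """Toy two-release cost: effort w goes to release 1, the rest to
    release 2; undetected faults incur a penalty, with release-2 debugging
    discounted by a learning-curve factor (all numbers illustrative)."""
    w1, w2 = w, budget - w
    found1 = 50.0 * (1.0 - np.exp(-0.05 * w1))  # faults found in release 1
    found2 = 30.0 * (1.0 - np.exp(-0.05 * w2))  # faults found in release 2
    return 5.0 * (50.0 - found1) + 0.8 * 3.0 * (30.0 - found2)

def ga_minimize(f, lo, hi, pop=40, gens=80, mut=0.05):
    """Tiny real-coded GA: tournament selection, blend crossover,
    Gaussian mutation. A sketch, not the paper's algorithm."""
    x = rng.uniform(lo, hi, pop)
    for _ in range(gens):
        fit = f(x)
        i, j = rng.integers(0, pop, (2, pop))
        parents = np.where(fit[i] < fit[j], x[i], x[j])   # tournament
        mates = rng.permutation(parents)
        a = rng.uniform(0.0, 1.0, pop)
        x = a * parents + (1.0 - a) * mates               # blend crossover
        x = np.clip(x + rng.normal(0.0, mut * (hi - lo), pop), lo, hi)
    return x[np.argmin(f(x))]
```

Because release-2 debugging is cheaper under the learning curve, the GA settles on allocating more than half of the budget to release 1, where undetected faults are costlier.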

Findings

The paper yields several substantive insights for software managers. The study shows that the estimated testing efforts, as well as the faults detected, for both releases are close to the real data set.

Originality/value

We propose a dynamic resource allocation model for multiple releases of software, with the objective of minimizing the total testing cost using a flexible software reliability growth model (SRGM).

Details

International Journal of Quality & Reliability Management, vol. 37 no. 6/7
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 13 March 2009

N. Ahmad, M.G.M. Khan, S.M.K. Quadri and M. Kumar

Abstract

Purpose

The purpose of this research paper is to discuss a software reliability growth model (SRGM) based on the non‐homogeneous Poisson process which incorporates the Burr type X testing‐effort function (TEF), and to determine the optimal release‐time based on cost‐reliability criteria.

Design/methodology/approach

It is shown that the Burr type X TEF can be expressed as a software development/testing‐effort consumption curve. Weighted least squares estimation method is proposed to estimate the TEF parameters. The SRGM parameters are estimated by the maximum likelihood estimation method. The standard errors and confidence intervals of SRGM parameters are also obtained. Furthermore, the optimal release‐time determination based on cost‐reliability criteria has been discussed within the framework.
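As a rough illustration of fitting a TEF by least squares with equal weights: the Burr type X parameterization and all numbers below are assumed, and a coarse grid scan stands in for the paper's actual estimation procedure, which a real application would replace with a proper optimizer.

```python
import numpy as np

def burr_x_effort(t, alpha, beta, theta):
    """Cumulative Burr type X testing effort (assumed parameterization)."""
    return alpha * (1.0 - np.exp(-(beta * t) ** 2)) ** theta

def wlse_fit(t, w_obs, weights=None):
    """Weighted least-squares fit over a coarse parameter grid
    (equal weights by default). A toy stand-in for WLSE that only
    conveys the idea of minimizing a weighted residual sum of squares."""
    if weights is None:
        weights = np.ones_like(w_obs)   # equal weights
    best, best_err = None, np.inf
    for alpha in np.linspace(40.0, 80.0, 41):
        for beta in np.linspace(0.01, 0.2, 39):
            for theta in (0.5, 1.0, 1.5, 2.0):
                resid = w_obs - burr_x_effort(t, alpha, beta, theta)
                err = np.sum(weights * resid ** 2)
                if err < best_err:
                    best, best_err = (alpha, beta, theta), err
    return best
```

On synthetic effort data generated from known parameters, the grid fit recovers those parameters, which is the sanity check one would run before fitting real project data.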

Findings

The performance of the proposed SRGM is demonstrated using actual data sets from three software projects. Results are compared with other traditional SRGMs to show that the proposed model has better prediction capability and that the Burr type X TEF is suitable for incorporation into software reliability modelling. Results also reveal that the SRGM with Burr type X TEF estimates the number of initial faults better than other traditional SRGMs.

Research limitations/implications

The paper presents the estimation method with equal weight. Future research may include extending the present study to unequal weight.

Practical implications

The new SRGM may be useful in detecting more faults that are difficult to find during regular testing, and in assisting software engineers to improve their software development process.

Originality/value

The incorporated TEF is flexible and can be used to describe the actual expenditure patterns more faithfully during software development.

Details

Journal of Modelling in Management, vol. 4 no. 1
Type: Research Article
ISSN: 1746-5664

Article
Publication date: 1 January 2002

Robert J. Vokurka, Gail M. Zank and Carl M. Lund


Abstract

Supply chains can improve their performance by developing competitive priorities in a specified sequence: quality, reliability, flexibility, agility, and finally, cost efficiency. This paper extends Ferdows and De Meyer's (1990) sand cone model, and Vokurka and Fliedner's (1998) extension of it incorporating agility, to supply chain management priorities. This work provides a framework for a cumulative and sustainable improvement process by which supply chains can build a strategic competitive advantage.

Details

Competitiveness Review: An International Business Journal, vol. 12 no. 1
Type: Research Article
ISSN: 1059-5422

Article
Publication date: 20 May 2020

Sara Hajmohammad and Anton Shevchenko

Abstract

Purpose

Many modern firms strive to become sustainable. To this end, they are required to improve not only their own environmental and social performance but also the performance of their suppliers. Building on population ecology theory, we explore how buyers' exposure to supplier sustainability risk and their subsequent risk management strategies at the buyer–supplier dyad level can lead to adherence to sustainability by the supplier populations.

Design/methodology/approach

We rely on a bottom-up research design, in which the actions of buyers within buyer–supplier dyads lead to population-wide changes on the supplier side. Specifically, we use experimental data on managing sustainability risk to build an agent-based simulation model and assess the effect of evolutionary processes on the presence of sustainable/unsustainable business practices in the supplier population.
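The bottom-up selection dynamic can be caricatured in a few lines: suppliers carry fixed misconduct probabilities, detected misconduct removes a supplier, and both the population's mean risk and its density fall. Every distribution, rate, and size below is invented; the authors' simulation is grounded in experimental data and far richer.

```python
import numpy as np

def simulate(n=500, periods=30, detect=0.5, seed=42):
    """Toy selection-out dynamic: each period a supplier commits misconduct
    with its own fixed probability; if a buyer detects it (probability
    `detect`), the supplier leaves the population (all numbers illustrative)."""
    rng = np.random.default_rng(seed)
    risk = rng.uniform(0.0, 1.0, n)        # per-period misconduct probability
    for _ in range(periods):
        misconduct = rng.random(risk.size) < risk
        caught = misconduct & (rng.random(risk.size) < detect)
        risk = risk[~caught]               # selected out: density decreases
    return risk

survivors = simulate()
```

Even this toy version reproduces the qualitative pattern in the findings: population-level risk declines mainly because high-risk suppliers are selected out, so the improvement comes bundled with a drop in supplier population density.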

Findings

Our findings suggest that buyers' cumulative actions in managing sustainability risk do not necessarily result in effective population-wide improvements (i.e. at a high rate and to a high degree). For example, under high risk-impact conditions, the buyer population is usually able to decrease the population-level risk in the long run, but buyers would need both power and resources to achieve such improved outcomes quickly. Importantly, this positive change is in most cases due to the buyer population selecting out suppliers with a high probability of misconduct (i.e. decreased supplier population density).

Originality/value

Drawing on organizational population ecology theory, we explore when, to what degree, and how quickly buyers' cumulative efforts can lead to population-wide changes in the level of supplier sustainability risk, as well as in the composition and density of the supplier population. Methodologically, this paper is one of the first studies to combine experimental data with agent-based modeling to offer valuable insights on supply networks.

Details

International Journal of Operations & Production Management, vol. 40 no. 7/8
Type: Research Article
ISSN: 0144-3577
