Search results

1–10 of over 109,000
Article
Publication date: 23 April 2024

Marek Tiits, Erkki Karo and Tarmo Kalvet


Abstract

Purpose

Although the significance of technological progress in economic development is well established in theory and policy, it has remained challenging to agree upon shared priorities for strategies and policies. This paper aims to develop a model of how policymakers can formulate effective, easy-to-communicate strategies for science, technology and economic development.

Design/methodology/approach

By integrating insights from the economic complexity, competitiveness and foresight literature, a replicable research framework for analysing the opportunities and challenges of technological revolutions for small catching-up countries is developed. The authors highlight key lessons from piloting this framework to inform bioeconomy strategy and policies in Estonia towards 2030–2050.

Findings

The integration of economic complexity research with traditional foresight methods establishes a solid analytical basis for a data-driven analysis of the opportunities for industrial upgrading. The increase in the importance of regional alliances in the global economy calls for further advancement of the analytical toolbox. Integration of complexity, global value chains and export potential assessment approaches offers valuable direction for further research, as it enables discussion of the opportunities of moving towards more knowledge-intensive economic activities along with the opportunities for winning international market share.

Originality/value

The research merges insights from the economic complexity, competitiveness and foresight literature in a novel way and illustrates its applicability to priority-setting in a real-life setting.

Details

Competitiveness Review: An International Business Journal, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1059-5422


Article
Publication date: 1 January 2002

S. Sivadasan, J. Efstathiou, G. Frizelle, R. Shirazi and A. Calinescu


Abstract

In a dynamic environment such as the supply chain, even basic supplier‐customer systems with structurally simple information and material flow formations have a tendency to exhibit operational complexity. The operational complexity of supplier‐customer systems is primarily characterised by the uncertainty of the system. As the operational complexity of a system increases there is an associated increase in the amount of information required to monitor and manage that system. Based on this understanding, a novel information‐theoretic entropy‐based methodology for measuring and analysing the operational complexity of supplier‐customer systems has been developed. This paper makes contributions in the theoretical, conceptual and practical developments of the methodology. The methodology can quantitatively detect and prioritise operational complexity hotspots. At the interface, the framework can identify and quantify the transfer of operational complexity. Within the internal manufacturing system, the framework provides a comparative operational complexity measure across sub‐systems such as flows and products. This entropy‐based methodology provides a tool for identifying and measuring four classes of operational complexity transfer corresponding to the extent to which organisations generate, absorb, export and import operational complexity.
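The entropy-based idea in this abstract can be illustrated with a minimal sketch: treat the observed operational states of a supplier-customer interface as a probability distribution and compute its Shannon entropy in bits. The states and probabilities below are hypothetical, and the paper's full methodology (for instance, distinguishing planned from unplanned states) is not reproduced here.

```python
import math

def operational_complexity(state_probs):
    """Shannon entropy (in bits) of a system's observed operational states.

    Higher entropy means more information is needed to monitor and manage
    the system, which the abstract takes as a proxy for operational
    complexity.
    """
    return -sum(p * math.log2(p) for p in state_probs if p > 0)

# Hypothetical example: a delivery process observed in four states
# (on time, early, late, very late) with these relative frequencies.
probs = [0.70, 0.10, 0.15, 0.05]
print(round(operational_complexity(probs), 3))
```

A uniform distribution over states maximises the measure, matching the intuition that a system whose behaviour is hardest to predict requires the most monitoring information.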

Details

International Journal of Operations & Production Management, vol. 22 no. 1
Type: Research Article
ISSN: 0144-3577


Article
Publication date: 7 August 2018

Marlene Kuhn, Franziska Schaefer and Heiner Otten


Abstract

Purpose

The purpose of this paper is to integrate process complexity as an object of analysis within effective quality management (QM).

Design/methodology/approach

This paper systematically analyzes different conceptions of complexity theory and characterizes process complexity from a QM perspective, producing new insights into how to address process complexity for continuous improvement.

Findings

The authors identified and specified four complexity characteristics, which they integrated into a holistic process complexity model (PCM). They further developed the idea of internal and external process complexity and demonstrated that internal complexity needs to balance external complexity. Based on the PCM, internal process complexity can be analyzed and suitable management approaches selected, whereas conventional QM practices proved inefficient or even counterproductive when applied in the context of process complexity.

Research limitations/implications

This research is adapted to fit the needs of production processes. The PCM is designed from a QM perspective.

Practical implications

The developed model allows companies to specify and characterize process complexity in order to reflect on the appropriateness of their process management approaches. Furthermore, it gives an additional perspective on process analysis for tapping the full potential of process improvement programs.

Originality/value

This paper combines complexity theory with QM.

Details

The TQM Journal, vol. 30 no. 6
Type: Research Article
ISSN: 1754-2731


Article
Publication date: 9 January 2024

Dara Sruthilaya, Aneetha Vilventhan and P.R.C. Gopal


Abstract

Purpose

The purpose of this research is to develop a project complexity index (PCI) model using the best and worst method (BWM) to quantitatively analyze the impact of project complexities on the performance of metro rail projects.

Design/methodology/approach

This study employed a two-phase research methodology. The first phase identifies complexities through a literature review and expert discussions and categorizes different types of complexities in metro rail projects. In the second phase, BWM, a robust multi-criteria decision-making (MCDM) technique, was used to prioritize key complexities, and a PCI model was developed. Further, the developed PCI was validated through case studies, and sensitivity analysis was performed to check the accuracy and applicability of the developed PCI model.
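As a rough illustration of how a BWM-style PCI might be computed, the sketch below approximates criterion weights as inversely proportional to the best-to-others comparison scores and aggregates hypothetical complexity scores into an index. The exact BWM instead solves a small min-max optimisation over both comparison vectors, and all criterion names and numbers here are illustrative, not the paper's data.

```python
def approx_bwm_weights(best_to_others):
    """Crude weight approximation: each criterion's weight is taken
    inversely proportional to its best-to-others comparison score.
    (The exact BWM derives weights from a min-max optimisation.)"""
    inv = {c: 1.0 / a for c, a in best_to_others.items()}
    total = sum(inv.values())
    return {c: v / total for c, v in inv.items()}

def project_complexity_index(weights, scores):
    """PCI as a weighted sum of normalised (0-1) complexity scores."""
    return sum(weights[c] * scores[c] for c in weights)

# Hypothetical comparisons with 'location' as the best (most influential)
# criterion, mirroring the paper's ranking but with made-up numbers.
b2o = {"location": 1, "environmental": 2, "organizational": 3,
       "technological": 4, "contractual": 5}
w = approx_bwm_weights(b2o)
scores = {"location": 0.8, "environmental": 0.6, "organizational": 0.5,
          "technological": 0.4, "contractual": 0.3}
print(round(project_complexity_index(w, scores), 3))
```

The weights sum to one by construction, so the PCI stays on the same 0-1 scale as the input scores.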

Findings

The analysis revealed that location complexity exerted the most substantial influence on project performance, followed by environmental, organizational, technological and contractual complexities. Sensitivity analysis revealed the varying impacts of complexity indices on the overall project complexity.

Practical implications

The study's findings offer a novel approach for measuring project complexity's impact on metro rail projects. This allows stakeholders to make informed decisions, allocate resources efficiently and plan strategically.

Originality/value

Existing studies on identifying and quantifying project complexity have been limited to megaprojects other than metro rail projects; quantitative analysis of the impact of project complexity on metro rail projects has remained unaddressed. The developed PCI model and its validation contribute to the field by providing a definite method to measure and manage complexity in metro rail projects.

Details

Engineering, Construction and Architectural Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0969-9988


Article
Publication date: 2 February 2024

Quntao Wu, Qiushi Bo, Lan Luo, Chenxi Yang and Jianwang Wang


Abstract

Purpose

This study aims to obtain governance strategies for managing the complexity of megaprojects by analyzing the impact of individual factors and their configurations using the fuzzy-set qualitative comparative analysis (fsQCA) method and to provide references for project managers.

Design/methodology/approach

With the continuous development of the economy, society and the construction industry, megaprojects are growing in number and scale, and their complexity is becoming a serious concern. Based on the relevant literature, the factors affecting megaproject complexity are identified through case analysis, and the paths through which these factors influence complexity are constructed. The fsQCA method is then used to analyze these factors using 245 valid questionnaires collected from project engineers.
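The core fsQCA quantities this abstract relies on can be sketched simply: a configuration's membership is the fuzzy AND (per-case minimum) of its conditions, and its sufficiency for the outcome is measured by a consistency ratio. The condition names and membership values below are made up for illustration.

```python
def conjunction(*conditions):
    """Fuzzy AND across conditions: per-case minimum membership, giving
    the membership of a configuration such as 'A AND B'."""
    return [min(vals) for vals in zip(*conditions)]

def consistency(x, y):
    """Consistency of 'configuration X is sufficient for outcome Y':
    sum(min(x_i, y_i)) / sum(x_i) over all cases."""
    return sum(min(a, b) for a, b in zip(x, y)) / sum(x)

# Hypothetical fuzzy memberships for five projects in two complexity
# factors and in the outcome 'high complexity' (all values invented).
uncertainty = [0.9, 0.7, 0.2, 0.8, 0.4]
stakeholders = [0.8, 0.6, 0.9, 0.7, 0.3]
high_complexity = [0.70, 0.60, 0.30, 0.75, 0.35]

config = conjunction(uncertainty, stakeholders)
print(round(consistency(config, high_complexity), 3))
```

Configurations whose consistency exceeds a chosen threshold (often around 0.8) are the "paths" reported by such studies.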

Findings

The results support the correlation between the complexity factors of megaprojects, with six configurational paths leading to high complexity and seven configurational paths leading to low complexity.

Originality/value

The study breaks through the limitations of the traditional project complexity field by adopting a "configuration perspective" and concludes that megaproject complexity is the synergistic effect of multiple factors. It is important for enriching the theory of megaproject complexity and for providing complexity governance strategies to managers in megaproject decision-making.

Details

Engineering, Construction and Architectural Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0969-9988


Article
Publication date: 5 May 2020

Lan Luo, Limao Zhang and Qinghua He


Abstract

Purpose

The purpose of this study is to develop a novel hybrid approach that incorporates the structural equation model (SEM) and fuzzy cognitive map (FCM) to investigate the impacts of the variation in project complexity on project success (PS).

Design/methodology/approach

This study adopts SEM to identify and validate a correlation between project complexity variables and PS. Standardized causal coefficients estimated in SEM are used to construct an FCM model to illustrate the effect of complexity on PS with linkage direction and weights. Predictive and diagnostic analyses are performed to dynamically model the variation in project complexity on the evolution of PS.
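A minimal sketch of the FCM half of this hybrid approach: concept activations are repeatedly updated through a sigmoid of the weighted influences from other concepts until they settle. The three-concept map and its signed weights below are hypothetical stand-ins for the SEM-estimated coefficients, not values from the paper.

```python
import math

def fcm_step(state, weights, lam=1.0):
    """One fuzzy-cognitive-map update: each concept's next activation is
    a sigmoid of its current activation plus the weighted influences of
    the other concepts (weights[i][j] = causal weight of concept i on j)."""
    n = len(state)
    nxt = []
    for j in range(n):
        influence = state[j] + sum(state[i] * weights[i][j]
                                   for i in range(n) if i != j)
        nxt.append(1.0 / (1.0 + math.exp(-lam * influence)))
    return nxt

def fcm_run(state, weights, steps=100, tol=1e-6):
    """Iterate until the activations converge or the step budget runs out."""
    for _ in range(steps):
        nxt = fcm_step(state, weights)
        if max(abs(a - b) for a, b in zip(nxt, state)) < tol:
            return nxt
        state = nxt
    return state

# Hypothetical 3-concept map [information complexity, task complexity, PS]
# with illustrative signed weights: information complexity hurts PS,
# task complexity helps it, echoing the sign pattern in the findings.
W = [[0.0, 0.0, -0.4],
     [0.0, 0.0,  0.3],
     [0.0, 0.0,  0.0]]
final = fcm_run([0.8, 0.5, 0.5], W)
print([round(v, 3) for v in final])
```

Changing an input concept's initial activation and re-running the map is the kind of predictive "what-if" analysis the abstract describes.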

Findings

Results indicate that (1) the hybrid SEM–FCM approach is capable of modeling the dynamic interactions between project complexity and PS; (2) information, goal and environmental complexities are negatively correlated with PS, while technological, task and organizational complexities are positively correlated with PS; and (3) recommendations for complexity management in construction projects are put forward, guided by success monitoring.

Originality/value

This research contributes to (1) the state of knowledge by proposing a hybrid methodology that can model the dynamic interactions between project complexity and PS and (2) the state of practice by providing a new perspective of PS evaluation to enhance the probability of success in complex construction projects.

Details

Engineering, Construction and Architectural Management, vol. 27 no. 9
Type: Research Article
ISSN: 0969-9988


Article
Publication date: 17 May 2022

Qiucheng Liu


Abstract

Purpose

To analyze the text complexity of Chinese and foreign academic English writings, an artificial neural network (ANN) under deep learning (DL) is applied to the study of text complexity. First, the research status and existing problems of text complexity research are reviewed from a DL perspective. Second, the text complexity of Chinese and foreign academic English writings is analyzed with a Back Propagation Neural Network (BPNN), and a BPNN-based syntactic complexity evaluation system is established. Third, MATLAB 2013b is used for simulation analysis of the model; the proposed BPNN model is compared with other classical algorithms, and the weight of each index and the training effect of the model are further analyzed with statistical methods. Finally, the L2 Syntactic Complexity Analyzer (L2SCA) is used to calculate the syntactic complexity of the two corpora, and the Mann–Whitney U test is used to compare the syntactic complexity of Chinese English learners and native English speakers. The experimental results show that, compared with a shallow neural network, the deep neural network has more hidden layers, richer features and better feature-extraction performance. The BPNN performs well during training, with actual outputs very close to expected values, and the evaluation error on test samples is below 1.8%, indicating high accuracy. However, there are significant differences in grammatical complexity among students of different English writing proficiency; some measurement methods cannot effectively reflect the types and characteristics of written language, or may even relate negatively to writing quality. The measurement of syntactic complexity is also found to be more sensitive to writing-related language ability. The BPNN can therefore effectively analyze the text complexity of academic English writing, and the results provide a reference for improving the evaluation of text complexity in academic paper writing.
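The Mann–Whitney U statistic used in the corpus-comparison step can be computed directly from pair counts, as in this sketch; the per-text syntactic-complexity scores below are invented for illustration, not taken from the study's corpora.

```python
def mann_whitney_u(a, b):
    """U statistic for sample a against b: count the pairs (x, y) with
    x > y, counting ties as 0.5 (equivalent to the rank-sum formulation)."""
    return sum(1.0 if x > y else 0.5 if x == y else 0.0
               for x in a for y in b)

# Hypothetical per-text syntactic-complexity scores for two small corpora.
learners = [3.1, 2.8, 3.5, 2.9, 3.0]
natives = [3.6, 3.9, 3.4, 4.1, 3.7]
u1 = mann_whitney_u(learners, natives)
u2 = mann_whitney_u(natives, learners)
assert u1 + u2 == len(learners) * len(natives)  # U statistics always pair up
print(u1, u2)
```

In practice one would compare the smaller U against a critical value or use a normal approximation for the p-value; library routines such as SciPy's `mannwhitneyu` handle ties and significance for real analyses.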


Details

Library Hi Tech, vol. 41 no. 5
Type: Research Article
ISSN: 0737-8831


Article
Publication date: 4 January 2018

Varinder Singh and Pravin M. Singru


Abstract

Purpose

The purpose of this paper is to propose the use of graph-theoretic structural modeling for assessing the possible reduction in the complexity of workflow procedures in an organization due to lean initiatives. A tool to assess the impact of a lean initiative on system complexity at an early stage of decision making is proposed.

Design/methodology/approach

First, the permanent function-based graph-theoretic structural model is applied to understand the complex structure of the manufacturing system under consideration. The model systematically breaks the system into sub-graphs that identify all cycles of interaction among the organization's subsystems. The physical interpretation of existing quantitative methods linked to the graph-theoretic methodology, namely two types of coefficients of dissimilarity, is then used to derive new measures of organizational complexity. These new measures are deployed to study the impact of different lean initiatives on complexity reduction in a case industrial organization.
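The permanent function at the heart of this modeling approach is, informally, a determinant computed without sign changes; for the small subsystem-interaction matrices typical of such models it can be computed by direct expansion, as in this sketch (the matrix entries below are hypothetical).

```python
from itertools import permutations
from math import prod

def permanent(m):
    """Permanent of a square matrix by direct permutation expansion:
    like the determinant, but every term is added. Exponential in n,
    which is acceptable for small subsystem-interaction matrices."""
    n = len(m)
    return sum(prod(m[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

# Hypothetical 3-subsystem matrix: diagonal entries are subsystem
# "strengths", off-diagonal entries are interaction strengths (made up).
M = [[3, 1, 0],
     [1, 2, 1],
     [0, 1, 4]]
print(permanent(M))  # → 31
```

Because no terms cancel, every cycle of subsystem interaction contributes positively, which is why the permanent is used as a structural complexity index rather than the determinant.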

Findings

The usefulness and the application of new proposed measures of complexity have been demonstrated with the help of three cases of lean initiatives in an industrial organization. The new measures of complexity have been proposed as a credible tool for studying the lean initiatives and their implications.

Research limitations/implications

The paper may lead many researchers to use the proposed tool to model different cases of lean manufacturing and pave a new direction for future research in lean manufacturing.

Practical implications

The paper demonstrates the application of new tools through cases and the tool may be used by practitioners of lean philosophy or total quality management to model and investigate their decisions.

Originality/value

The proposed measures of complexity are an entirely new addition to the toolbox of graph-theoretic structural modeling and have the potential to be adopted by practical decision makers to steer their organizations through such decisions before costly interruptions to manufacturing systems are attempted on the ground.

Details

Journal of Manufacturing Technology Management, vol. 29 no. 2
Type: Research Article
ISSN: 1741-038X


Article
Publication date: 9 November 2018

Nasser Javid, Kaveh Khalili-Damghani, Ahmad Makui and Farshid Abdi


Abstract

Purpose

This paper aims to propose a multi-dimensional model of the key factors of flexibility and complexity through structural equation modeling (SEM). Dimensions of flexibility and complexity, comprising 16 main factors and 34 sub-factors, are investigated. The sampling of the research covers both academic and industrial experts.

Design/methodology/approach

A large-scale electronic questionnaire survey was conducted in universities and manufacturing companies throughout the USA, Europe and Asia; 1,036 of the 1,250 distributed questionnaires were returned. Partial least squares SEM (PLS-SEM) is used to test the hypotheses through confirmatory factor analysis.

Findings

The results reveal insightful information about the impacts of different dimensions of flexibility on each other and also the effect of the flexibility on the complexity. Finally, system of linear mathematical equations for flexibility-complexity trade-off is proposed. This can be applied to realize the trade-off among dimensions of flexibility and complexity.

Originality/value

Flexible manufacturing systems are formed to meet the needs of the customers. Such systems try to produce products in appropriate quality at the right time and at the specified quantity. These, in turn, require flexibility and will cause complexity. Although flexibility and complexity are both important, there is no comprehensive framework in which the multi-dimensional relationships of the manufacturing flexibility and complexity, as well as their dimensions, are demonstrated.

Article
Publication date: 1 March 2006

Andreas Größler, André Grübner and Peter M. Milling


Abstract

Purpose

Based on a conceptual framework of the linkages between strategic manufacturing goals and complexity, the purpose of this paper is to investigate adaptation processes in manufacturing firms to increasing external complexity.

Design/methodology/approach

Hypotheses are tested with statistical analyses (group comparisons and structural equation models) that are conducted with data from the third round of the International Manufacturing Strategy Survey.

Findings

The study shows that manufacturing firms face different degrees of complexity. Firms in a more complex environment tend to possess a more complex internal structure, as indicated by process configuration, than firms in a less complex environment. Also depending on the degree of complexity, different processes of adaptation to increases in external complexity are initiated by organisations.

Research limitations/implications

Research studies taking into account the dynamics of adaptation processes would be helpful in order to draw further conclusions, for instance, based on longitudinal analyses or simulation studies.

Practical implications

Depending on the level of complexity a firm has been confronted with in the past, different adaptation processes to further growing complexity can be initiated. Firms in high-complexity environments have to reconfigure their strategic goals; firms in low-complexity environments have to build up internal complexity to cope with demands from the outside.

Originality/value

The paper distinguishes between adaptation processes in low and high complexity environments and provides explanations for the differences.

Details

International Journal of Operations & Production Management, vol. 26 no. 3
Type: Research Article
ISSN: 0144-3577

