Search results

1 – 10 of over 7000
Article
Publication date: 29 May 2020

Jianyu Zhao, Anzhi Bai, Xi Xi, Yining Huang and Shanshan Wang

Abstract

Purpose

Malicious attacks severely damage knowledge networks because of the increasing interdependence among knowledge elements. Exposing the damage that malicious attacks inflict on knowledge networks therefore has important theoretical and practical significance. Despite the insights offered by a growing research stream, few studies discuss how the robustness of knowledge networks responds to different targeted attacks, and the authors lack sufficient knowledge of which forms of malicious attack cause the greatest damage as knowledge networks evolve through different stages. Given the irreversible consequences of malicious attacks on knowledge networks, this paper aims to examine the impacts of different malicious attacks on the robustness of knowledge networks.

Design/methodology/approach

After dividing malicious attacks into six forms, the authors incorporate two important aspects of the robustness of knowledge networks – structure and function – into a research framework, using the maximal connected subgraph and network efficiency, respectively, to measure structural and functional robustness. Furthermore, the authors conceptualize knowledge as a multi-dimensional structure to reflect the heterogeneous nature of knowledge elements, and design the fundamental rules of the simulation. NetLogo is used to simulate the features of knowledge networks and the changes in their robustness as they face different malicious attacks.
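
The two robustness measures named above can be sketched in plain Python (an illustrative reconstruction, not the authors' NetLogo model; the graph and function names are hypothetical):

```python
from collections import deque

def components(adj):
    """Connected components of an undirected graph {node: set(neighbours)}."""
    seen, comps = set(), []
    for start in adj:
        if start in seen:
            continue
        comp, queue = {start}, deque([start])
        seen.add(start)
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    comp.add(v)
                    queue.append(v)
        comps.append(comp)
    return comps

def efficiency(adj):
    """Global efficiency: mean of 1/d(u, v) over ordered pairs (0 if unreachable)."""
    nodes = list(adj)
    n = len(nodes)
    if n < 2:
        return 0.0
    total = 0.0
    for s in nodes:
        dist, queue = {s: 0}, deque([s])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        total += sum(1.0 / d for node, d in dist.items() if node != s)
    return total / (n * (n - 1))

def degree_attack(adj, k, recalculated=True):
    """Remove k nodes by degree: re-ranked after every removal (recalculated-degree
    attack) or ranked once on the initial degrees (degree-based attack)."""
    adj = {u: set(vs) for u, vs in adj.items()}
    targets = [] if recalculated else sorted(adj, key=lambda u: len(adj[u]), reverse=True)[:k]
    for i in range(k):
        t = max(adj, key=lambda u: len(adj[u])) if recalculated else targets[i]
        for v in adj.pop(t):
            adj[v].discard(t)
    return adj
```

Structural robustness is then the size of the largest component after an attack, and functional robustness the surviving efficiency; on larger networks the two ranking rules diverge, which is what drives the paper's comparison.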

Findings

First, knowledge networks gradually form more associatively integrated structures as they evolve. Second, various properties of knowledge elements play diverse roles in mitigating damage from malicious attacks. Recalculated-degree-based attacks cause greater damage than degree-based attacks, and the structure of knowledge networks shows stronger resilience than their function. Third, structural robustness is mainly affected by the potential combinatorial value of high-degree knowledge elements and the combinatorial potential of high-out-degree knowledge elements. Fourth, the number of high-in-degree knowledge elements with heterogeneous contents, and the inverted U-shaped effect contributed by high-out-degree knowledge elements, are the main influences on functional robustness.

Research limitations/implications

The authors use a frontier method to expose the damage that malicious attacks do to both structural and functional robustness at each evolutionary stage, and reveal the relationships and effects of the knowledge-based connections and knowledge combinatorial opportunities that help maintain them. Furthermore, the authors identify latent critical factors that may improve the structural and functional robustness of knowledge networks.

Originality/value

First, from a dynamic evolutionary perspective, the authors systematically examine structural and functional robustness to reveal the roles that the properties of knowledge elements and knowledge associations play in maintaining the robustness of knowledge networks. Second, the authors compare the damage done by six forms of malicious attack to identify the reasons for increased vulnerability. Third, the authors construct a stock, power and expertise knowledge structure to overcome the difficulty of conceptualizing knowledge. The results respond to calls from multiple studies and extend the literature in several research domains.

Details

Journal of Knowledge Management, vol. 24 no. 5
Type: Research Article
ISSN: 1367-3270

Article
Publication date: 12 July 2011

Aviral Shukla, Vishal Agarwal Lalit and Venkat Venkatasubramanian

Abstract

Purpose

Supply chain network design is an important strategic decision that firms make considering both the short- and long-term consequences for the network's performance. The typical design approach implicitly assumes that, once designed, the facilities and links will always operate as planned. In reality, however, facilities and the links connecting them fail from time to time due to poor weather, natural or man-made disasters, or other factors. This work aims to propose a design framework that addresses facility and link failures explicitly by accounting for their impact on a network's performance measures of efficiency and robustness.

Design/methodology/approach

The study incorporated a robustness metric for evaluating the resiliency of supply chains in the case of a network disruption. This robustness metric is based on expected losses incurred due to network failures. It defines efficiency and robustness in terms of operational cost and expected disruption cost (EDC), respectively. The EDC is defined in terms of loss of opportunity cost incurred due to not meeting demand on time after a disruption has occurred. The study used a scenario planning approach and formulated a mixed integer linear program model with the objective of maximizing both efficiency and robustness. It also evaluates the trade‐offs between efficiency and robustness.
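
As a minimal illustration of the trade-off (not the paper's mixed integer linear program; the designs, probabilities and costs below are invented), the EDC and a weighted efficiency/robustness objective can be computed as:

```python
def expected_disruption_cost(scenarios):
    """EDC: sum over disruption scenarios of P(scenario) times the opportunity
    cost of demand not met on time after that disruption."""
    return sum(p * loss for p, loss in scenarios)

def design_objective(operational_cost, scenarios, alpha=0.5):
    """Single weighted objective trading efficiency (operational cost) against
    robustness (EDC); alpha tunes the trade-off, lower is better."""
    return alpha * operational_cost + (1 - alpha) * expected_disruption_cost(scenarios)

# Two hypothetical network designs under the same two failure scenarios,
# each scenario given as (probability, opportunity cost of unmet demand):
lean = (100.0, [(0.05, 400.0), (0.02, 900.0)])      # cheap to run, fragile
redundant = (115.0, [(0.05, 40.0), (0.02, 90.0)])   # dearer to run, robust
```

With these numbers the redundant design scores better overall even though its operational cost is 15% higher, mirroring the finding that significant robustness can be built in without compromising much efficiency.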

Findings

The resulting supply chain is much more reliable in the long term: the study shows that a significant amount of robustness can be built into the system without significantly compromising efficiency.

Originality/value

This work demonstrates a methodology that incorporates such disaster scenarios into the design of a supply chain network, leading to a more reliable supply chain with higher profitability and lower disruption rates.

Details

International Journal of Physical Distribution & Logistics Management, vol. 41 no. 6
Type: Research Article
ISSN: 0960-0035

Open Access
Article
Publication date: 26 March 2024

Manuel Rossetti, Juliana Bright, Andrew Freeman, Anna Lee and Anthony Parrish

Abstract

Purpose

This paper is motivated by the need to assess the risk profiles associated with the substantial number of items within military supply chains. The scale of supply chain management processes makes the analysis complex and makes risk assessments based on manual (human analyst) methods difficult to perform. Analysts therefore require methods that can be automated and that can incorporate ongoing operational data on a regular basis.

Design/methodology/approach

The approach taken to identify supply chain risk within an operational setting is based on aspects of multiobjective decision analysis (MODA). The approach constructs risk and importance indices for supply chain elements based on operational data. These indices are commensurate in value, leading to interpretable measures for decision-making.
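
A minimal additive-value sketch of such commensurate indices (the attribute names, scores and weights are hypothetical; the authors' actual MODA models are not reproduced here):

```python
def minmax(values):
    """Normalise raw attribute scores onto a common [0, 1] value scale."""
    lo, hi = min(values), max(values)
    return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

def additive_index(attributes, weights):
    """MODA-style additive value model: weighted sum of normalised attributes.

    attributes: {name: [score per item]}; weights must sum to 1 so that the
    resulting per-item indices stay commensurate in [0, 1]."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    scaled = {a: minmax(vals) for a, vals in attributes.items()}
    n = len(next(iter(attributes.values())))
    return [sum(weights[a] * scaled[a][i] for a in attributes) for i in range(n)]

# Hypothetical item data: lead-time variability and single-source exposure
# feed a risk index; an importance index would be built the same way from
# attributes such as demand frequency and mission criticality.
risk = additive_index(
    {"lead_time_var": [3.0, 9.0, 5.0], "single_source": [0.0, 1.0, 1.0]},
    {"lead_time_var": 0.6, "single_source": 0.4},
)
```

Because both indices land on the same [0, 1] scale, items can be placed on a risk-versus-importance grid for prioritisation.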

Findings

Risk and importance indices were developed for the analysis of items within an example supply chain. Using the data on items, individual MODA models were formed and demonstrated using a prototype tool.

Originality/value

To better prepare risk mitigation strategies, analysts require the ability to identify potential sources of risk, especially in times of disruption such as natural disasters.

Details

Journal of Defense Analytics and Logistics, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 2399-6439

Article
Publication date: 25 July 2019

S. Khodaygan

Abstract

Purpose

The purpose of this paper is to present a novel Kriging meta-model-assisted method for the multi-objective optimal tolerance design of mechanical assemblies based on operating conditions under both systematic and random uncertainties.

Design/methodology/approach

In the proposed method, performance, quality loss and manufacturing cost are formulated as the main criteria in terms of systematic and random uncertainties. To investigate a mechanical assembly under operating conditions, its behavior is simulated using finite element analysis (FEA). The objective functions, in terms of uncertainties at the operating conditions, are modeled through Kriging-based metamodeling built on the results of the FEA simulations. The optimal tolerance allocation procedure is then formulated as a multi-objective optimization framework. To solve this optimization problem with multiple conflicting objectives, the multi-objective particle swarm optimization method is used, and Shannon entropy-based TOPSIS is then applied to select the best tolerances from the optimal Pareto solutions.
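
The final selection step can be illustrated with a small pure-Python sketch of Shannon-entropy weighting followed by TOPSIS, assuming all criteria are benefit-type (an illustrative reconstruction, not the author's code):

```python
import math

def entropy_topsis(matrix):
    """Rank alternatives (rows) on benefit criteria (columns): Shannon-entropy
    objective weights, then TOPSIS relative closeness to the ideal solution."""
    m, n = len(matrix), len(matrix[0])
    # Each criterion column is normalised into a probability distribution.
    col_sum = [sum(row[j] for row in matrix) for j in range(n)]
    p = [[row[j] / col_sum[j] for j in range(n)] for row in matrix]
    # Lower entropy = more discriminating criterion = higher weight.
    k = 1.0 / math.log(m)
    e = [-k * sum(p[i][j] * math.log(p[i][j]) for i in range(m) if p[i][j] > 0)
         for j in range(n)]
    w = [(1 - e[j]) / sum(1 - ej for ej in e) for j in range(n)]
    # Weighted, vector-normalised decision matrix.
    norm = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n)]
    v = [[w[j] * matrix[i][j] / norm[j] for j in range(n)] for i in range(m)]
    best = [max(v[i][j] for i in range(m)) for j in range(n)]
    worst = [min(v[i][j] for i in range(m)) for j in range(n)]

    def dist(row, ref):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(row, ref)))

    return [dist(v[i], worst) / (dist(v[i], best) + dist(v[i], worst))
            for i in range(m)]
```

Each Pareto solution gets a closeness score in [0, 1]; the tolerance set with the highest score is selected. Cost-type criteria would need their columns inverted first.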

Findings

The proposed method can be used for optimal tolerance design of mechanical assemblies under operating conditions while accounting for both random and systematic uncertainties. To obtain an accurate model of the design function at the operating conditions, Kriging meta-modeling is used. The efficiency of the proposed method is illustrated through a case study, and the method is verified by comparison with a conventional tolerance allocation method. The results show that the proposed method leads to a product with more robust performance and higher quality than the conventional approach.

Research limitations/implications

The proposed method is limited to components whose dimensional tolerances follow a normal distribution.

Practical implications

The proposed method is easy to automate for computer-aided tolerance design in industrial applications.

Originality/value

In conventional approaches, tolerances are allocated based on assembly conditions, disregarding the systematic and random uncertainties that arise under operating conditions. Because uncertainties can significantly affect a system's performance at operating conditions, tolerance allocation that ignores these effects may be inefficient. This paper fills this gap in the literature by considering both systematic and random uncertainties in the multi-objective optimal tolerance design of mechanical assemblies under operating conditions.

Book part
Publication date: 25 January 2023

Petra Sauer, Narasimha D. Rao and Shonali Pachauri

Abstract

In large parts of the world, income inequality has been rising in recent decades. Other regions have experienced declining trends in income inequality. This raises the question of which mechanisms underlie contrasting observed trends in income inequality around the globe. To address this research question in an empirical analysis at the aggregate level, we examine a global sample of 73 countries between 1981 and 2010, studying a broad set of drivers to investigate their interaction and influence on income inequality. Within this broad approach, we are interested in the heterogeneity of income inequality determinants across world regions and along the income distribution. Our findings indicate the existence of a small set of systematic drivers across the global sample of countries. Declining labour income shares and increasing imports from high-income countries significantly contribute to increasing income inequality, while taxation and imports from low-income countries exert countervailing effects. Our study reveals the region-specific impacts of technological change, financial globalisation, domestic financial deepening and public social spending. Most importantly, we do not find systematic evidence of education’s equalising effect across high- and low-income countries. Our results are largely robust to changing the underlying sources of income Ginis, but looking at different segments of income distribution reveals heterogeneous effects.

Details

Mobility and Inequality Trends
Type: Book
ISBN: 978-1-80382-901-2

Article
Publication date: 24 February 2012

Jing Hu, Xin Liu, Sijun Wang and Zhilin Yang

Abstract

Purpose

This study aims to examine the role of functional and symbolic image congruity in Chinese consumers' brand preferences in the auto market, and the role of brand familiarity in moderating the relationship between brand image congruity and consumers' preferences.

Design/methodology/approach

Market research specialists administered a one-on-one survey covering two popular auto brands in China to 1,440 consumers.

Findings

While confirming existing findings concerning functional image congruity, the results revealed that symbolic image congruity had a negative impact on Chinese consumers' brand preference when a brand's perceived symbolic image is higher than consumers' ideal expectations (i.e. upward incongruity), and that brand familiarity does not moderate the role of symbolic image congruity in Chinese consumers' brand preference.

Originality/value

The paper's findings could help managers to improve their brand management and enhance consumer satisfaction.

Details

Journal of Product & Brand Management, vol. 21 no. 1
Type: Research Article
ISSN: 1061-0421

Article
Publication date: 4 March 2019

Joao Jalles

Abstract

Purpose

The purpose of this paper is to assess the responses of different categories of government spending to changes in economic activity. In other words, the authors empirically revisit the validity of Wagner's law in a sample of 61 advanced and emerging market economies between 1995 and 2015.

Design/methodology/approach

The authors do so via panel data instrumental variables and time-series SUR approaches.

Findings

Evidence from panel data analyses shows that Wagner's law seems more prevalent in advanced economies and when countries are growing above potential. However, this result depends on the government spending category under scrutiny and the functional form used. Country-specific analysis revealed relatively more cases satisfying Wagner's proposition within the emerging markets sample. The authors also found evidence of counter-cyclicality in several spending items. All in all, Wagner's regularity seems more the exception than the norm.

Originality/value

While the literature on the size of the public sector relative to a country's level of economic development has received much attention, the authors make several novel contributions. Since some economists have criticized Wagner's law because of the ambiguity of measuring government expenditure (Musgrave, 1969), instead of looking at aggregate public expenditures the authors go much more granular into the different functions of government (to this end, they use the Classification of the Functions of Government nomenclature). The authors check the validity of the law via an instrumental variable approach in a panel setting; they then take into account the phase of the business cycle using a new filtering technique to compute potential GDP (the output gap); they cross-check the baseline results by considering alternative functional-form specifications of the law; and finally, they look at individual countries one at a time via SUR analysis.
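
The core test behind Wagner's law is an income elasticity of government spending above one. A minimal log-log OLS sketch with synthetic data (the paper itself uses instrumental-variable and SUR estimators, not this simple regression):

```python
import math

def ols_slope(x, y):
    """OLS slope of y on x (with intercept): cov(x, y) / var(x)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    return (sum((a - mx) * (b - my) for a, b in zip(x, y))
            / sum((a - mx) ** 2 for a in x))

# Synthetic series built so that ln(spending) = 0.5 + 1.2 * ln(GDP) exactly;
# the log-log slope is then the income elasticity of government spending.
gdp = [100.0, 150.0, 220.0, 300.0, 410.0]
spending = [math.exp(0.5 + 1.2 * math.log(g)) for g in gdp]
elasticity = ols_slope([math.log(g) for g in gdp],
                       [math.log(s) for s in spending])
# Wagner's law holds when the elasticity exceeds 1.
```

Running the same regression per spending category (as the COFOG breakdown allows) is what makes the category-level heterogeneity in the findings visible.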

Details

Journal of Economic Studies, vol. 46 no. 2
Type: Research Article
ISSN: 0144-3585

Article
Publication date: 28 March 2024

Elisa Gonzalez Santacruz, David Romero, Julieta Noguez and Thorsten Wuest

Abstract

Purpose

This research paper aims to analyze the scientific and grey literature on Quality 4.0 and zero-defect manufacturing (ZDM) frameworks to develop an integrated Quality 4.0 framework (IQ4.0F) for quality improvement (QI) based on Six Sigma and machine learning (ML) techniques towards ZDM. The IQ4.0F aims to contribute to the advancement of defect prediction approaches in diverse manufacturing processes. Furthermore, the work enables a comprehensive analysis of the process variables influencing product quality, with emphasis on the use of supervised and unsupervised ML techniques in the “Analyze” stage of Six Sigma’s DMAIC (Define, Measure, Analyze, Improve and Control) cycle.

Design/methodology/approach

The research methodology employed a systematic literature review (SLR) based on PRISMA guidelines to develop the integrated framework, followed by a real industrial case study set in the automotive industry to fulfill the objectives of verifying and validating the proposed IQ4.0F with primary data.

Findings

This research work demonstrates the value of a “stepwise framework” to facilitate a shift from conventional quality management systems (QMSs) to QMSs 4.0. It uses the IDEF0 modeling methodology and Six Sigma’s DMAIC cycle to structure the steps to be followed to adopt the Quality 4.0 paradigm for QI. It also proves the worth of integrating Six Sigma and ML techniques into the “Analyze” stage of the DMAIC cycle for improving defect prediction in manufacturing processes and supporting problem-solving activities for quality managers.

Originality/value

This research paper introduces a first-of-its-kind Quality 4.0 framework – the IQ4.0F. Each step of the IQ4.0F was verified and validated in an original industrial case study set in the automotive industry. It is the first Quality 4.0 framework, according to the SLR conducted, to utilize the principal component analysis technique as a substitute for “Screening Design” in the Design of Experiments phase and K-means clustering technique for multivariable analysis, identifying process parameters that significantly impact product quality. The proposed IQ4.0F not only empowers decision-makers with the knowledge to launch a Quality 4.0 initiative but also provides quality managers with a systematic problem-solving methodology for quality improvement.

Details

The TQM Journal, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 1754-2731

Book part
Publication date: 12 September 2003

Jeffrey L Furman

Abstract

The origin and nature of meaningful, persistent firm-specific differences is a central issue in the study of business strategy. I investigate in this paper the role of characteristics physically external to firms, but embodied in their local geographic areas, in driving differences in firms’ organizing strategies. Specifically, I examine the extent to which location-specific characteristics affect the organization of pharmaceutical firms’ research laboratories, bringing both qualitative and quantitative evidence to bear on this issue. Analyses of the histories of several late 19th century drug makers suggest that differences in local institutions, labor markets, and demand structures played important roles in affecting case firms’ strategic evolution. For example, while Mulford (Philadelphia PA) exploited the strength of nearby universities and the city’s public health system in organizing around leading-edge capabilities in bacteriology, Sterling (Wheeling WV) found that its local environment rewarded investments in marketing and distribution. Panel data analysis on a sample of firms from the late 20th century provides complementary evidence, demonstrating that the scientific orientation of modern drug discovery laboratories is positively and significantly correlated with measures of the strength of the local scientific and technical base. Together, these analyses suggest that location-specific characteristics may be important in driving firm heterogeneity and, ultimately, competitive advantage.

Details

Geography and Strategy
Type: Book
ISBN: 978-0-76231-034-0

Book part
Publication date: 16 December 2009

Hector O. Zapata and Krishna P. Paudel

Abstract

This paper surveys the recent literature on the application of semiparametric econometric advances to testing for the functional form of the environmental Kuznets curve (EKC). The EKC postulates an inverted U-shaped relationship between economic growth (typically measured by income) and pollution; that is, as economic growth expands, pollution increases up to a maximum and then starts declining after a threshold level of income. This hypothesized relationship is simple to visualize but has eluded many empirical investigations. A typical application of the EKC uses panel data models, which allow for heterogeneity, serial correlation, heteroskedasticity, data pooling, and smooth coefficients. This vast literature is reviewed in the context of semiparametric model specification tests. Additionally, recent developments in semiparametric econometrics, such as Bayesian methods, generalized time-varying coefficient models, and nonstationary panels, are discussed as fruitful areas of future research. The cited literature is fairly complete and should prove useful to applied researchers at large.
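
The parametric baseline that the surveyed semiparametric tests relax is a quadratic EKC, whose turning point falls at -b/(2c). A self-contained least-squares sketch with synthetic data (illustrative only; the reviewed studies use far richer panel estimators):

```python
def solve3(A, b):
    """Solve a 3x3 linear system A x = b by Cramer's rule."""
    def det(M):
        return (M[0][0] * (M[1][1] * M[2][2] - M[1][2] * M[2][1])
                - M[0][1] * (M[1][0] * M[2][2] - M[1][2] * M[2][0])
                + M[0][2] * (M[1][0] * M[2][1] - M[1][1] * M[2][0]))
    D = det(A)
    return [det([[b[i] if k == j else A[i][k] for k in range(3)]
                 for i in range(3)]) / D
            for j in range(3)]

def ekc_turning_point(income, pollution):
    """Least-squares fit of pollution = a + b*income + c*income^2 via the
    normal equations (X'X) beta = X'y; the inverted U peaks at -b / (2c)."""
    X = [[1.0, x, x * x] for x in income]
    XtX = [[sum(r[i] * r[j] for r in X) for j in range(3)] for i in range(3)]
    Xty = [sum(r[i] * y for r, y in zip(X, pollution)) for i in range(3)]
    a, b, c = solve3(XtX, Xty)
    return -b / (2.0 * c)

# Synthetic data following an exact inverted U peaking at income = 8.
income = [2.0, 4.0, 6.0, 8.0, 10.0, 12.0, 14.0]
pollution = [80.0 - (x - 8.0) ** 2 for x in income]
```

The semiparametric literature asks whether this quadratic shape, rather than being imposed, actually survives when the income-pollution relationship is estimated flexibly.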

Details

Nonparametric Econometric Methods
Type: Book
ISBN: 978-1-84950-624-3
