Search results
1 – 10 of over 80,000 results

D.P. Zielinski and V.R. Voller
Abstract
Purpose
The purpose of this paper is to develop an alternative numerical approach for describing fractional diffusion in Cartesian and non-Cartesian domains using a Monte Carlo random walk scheme. The resulting domain shifting scheme provides a numerical solution for multi-dimensional, steady-state, source-free diffusion problems with fluxes expressed in terms of Caputo fractional derivatives. This class of problems takes account of non-locality in transport, expressed through parameters representing both the extent and direction of the non-locality.
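For reference, the non-local flux ingredient can be written explicitly. In one dimension, the left-sided Caputo derivative of order $\alpha$ ($0 < \alpha < 1$) taken from a lower terminal $a$ is

$$
{}^{C}D_{a}^{\alpha} f(x) \;=\; \frac{1}{\Gamma(1-\alpha)} \int_{a}^{x} \frac{f'(t)}{(x-t)^{\alpha}}\, dt ,
$$

so the flux at $x$ depends on the gradient over the whole interval $(a, x)$: the order $\alpha$ sets the extent of the non-locality, and the choice of left- versus right-sided derivative sets its direction.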
Design/methodology/approach
The method described here follows a similar approach to random walk methods previously developed for normal (local) diffusion. The key differences from standard methods are: first, the random shifting of the domain about the point of interest with, second, shift steps selected from non‐symmetric, power‐law tailed, Lévy probability distribution functions.
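The paper's scheme itself is not reproduced here, but its core ingredient — Monte Carlo walkers taking steps drawn from a skewed, power-law-tailed α-stable (Lévy) distribution — can be sketched in one dimension. The function names, the absorbing domain [0, 1], and the boundary-value averaging are illustrative assumptions for this sketch, not the paper's domain-shifting formulation:

```python
import numpy as np

rng = np.random.default_rng(0)

def stable_step(alpha, beta, size, rng):
    # Chambers-Mallows-Stuck sampler for standard alpha-stable steps;
    # the skewness beta models the direction of the non-locality.
    # (Valid for alpha != 1.)
    U = rng.uniform(-np.pi / 2, np.pi / 2, size)
    W = rng.exponential(1.0, size)
    zeta = -beta * np.tan(np.pi * alpha / 2)
    xi = np.arctan(-zeta) / alpha
    num = np.sin(alpha * (U + xi))
    den = np.cos(U) ** (1 / alpha)
    tail = (np.cos(U - alpha * (U + xi)) / W) ** ((1 - alpha) / alpha)
    return (1 + zeta ** 2) ** (1 / (2 * alpha)) * num / den * tail

def mc_solve_at(x0, alpha, beta, bc_left, bc_right,
                h=0.02, n_walkers=5000, rng=rng):
    # Estimate a steady-state, source-free solution at x0 on [0, 1]
    # by averaging the boundary values at which Levy walkers absorb.
    x = np.full(n_walkers, x0)
    vals = np.zeros(n_walkers)
    done = np.zeros(n_walkers, dtype=bool)
    while not done.all():
        idx = ~done
        x[idx] += h * stable_step(alpha, beta, idx.sum(), rng)
        left, right = idx & (x <= 0.0), idx & (x >= 1.0)
        vals[left], vals[right] = bc_left, bc_right
        done |= left | right
    return vals.mean()
```

With alpha = 2 the steps reduce to Gaussian increments and the estimate recovers the classical local-diffusion solution (u(x) = x for u(0) = 0, u(1) = 1); alpha < 2 with beta ≠ 0 introduces the non-symmetric, heavy-tailed behavior the paper studies.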
Findings
The domain shifting scheme is verified by comparing predictive solutions to known one‐dimensional and two‐dimensional analytical solutions for fractional diffusion problems. The scheme is also applied to a problem of fractional diffusion in a non‐Cartesian annulus domain. In contrast to the axisymmetric, steady state solution for normal diffusion, a non‐axisymmetric solution results.
Originality/value
This is the first random walk scheme to utilize the concept of allowing the domain to undergo the random walk about a point of interest. Domain shifting scheme solutions of fractional diffusion in non‐Cartesian domains provide an invaluable tool to direct the development of more sophisticated grid based finite element inspired fractional diffusion schemes.
James M. Pruett and Andreas Schartner
Abstract
Describes the scheduling problem and JOB, then presents an extensive job shop scheduling session in which a variety of scheduling problems are encountered and overcome using JOB's interactive scheduling option. The example shows how work orders may be created and scheduled, and the schedules evaluated, all within the framework of the JOB system. By working with typical job shop scheduling opportunities in a realistic though simulated environment, users will better understand the problems job shop schedulers actually face and will be better able to solve them.
Andrew B. Martinez, Jennifer L. Castle and David F. Hendry
Abstract
We investigate whether smooth robust methods for forecasting can help mitigate pronounced and persistent failure across multiple forecast horizons. We demonstrate that naive predictors are interpretable as local estimators of the long-run relationship with the advantage of adapting quickly after a break, but at a cost of additional forecast error variance. Smoothing over naive estimates helps retain these advantages while reducing the costs, especially for longer forecast horizons. We derive the performance of these predictors after a location shift, and confirm the results using simulations. We apply smooth methods to forecasts of UK productivity and US 10-year Treasury yields and show that they can dramatically reduce persistent forecast failure exhibited by forecasts from macroeconomic models and professional forecasters.
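The trade-off described above can be illustrated with a toy simulation; the series, break point, and five-period window below are invented for this sketch, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Artificial series with a location shift: the mean jumps at t = 100.
n = 500
y = rng.normal(0.0, 1.0, n)
y[100:] += 5.0

def naive_forecast(y, t):
    # Random-walk ("no change") predictor: forecast the last observed
    # value. It adapts one period after a break, at a variance cost.
    return y[t - 1]

def smooth_forecast(y, t, window=5):
    # Averaging recent observations keeps the quick adaptation
    # (within `window` periods) while shrinking the error variance.
    return y[t - window:t].mean()

# Compare one-step squared errors once both predictors have adapted.
naive_mse = np.mean([(y[t] - naive_forecast(y, t)) ** 2
                     for t in range(110, n)])
smooth_mse = np.mean([(y[t] - smooth_forecast(y, t)) ** 2
                      for t in range(110, n)])
```

In this setup the naive predictor's one-step error variance is 2σ² while the five-period average reduces it to σ²(1 + 1/5), mirroring the point that smoothing over naive estimates cuts forecast-error variance once the shift has been absorbed.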
Yosuke Kunieda and Katsuyoshi Takashima
Abstract
Purpose
This study aims to clarify how companies should manage exploration and exploitation in the long term, and particularly whether companies should dynamically change their resource allocation related to exploration and exploitation activities.
Design/methodology/approach
To demonstrate the effect of shifts in focus between exploration and exploitation on financial performance and market evaluation, an empirical examination was conducted using secondary panel data for Japanese manufacturers from 2000 to 2014, which was analyzed by fixed-effect estimation with a control function approach considering the problem of endogeneity.
Findings
The empirical results suggest that companies should change their resource allocation related to exploration and exploitation in the long term. Long-term focus shifts between exploration and exploitation activities enhance not only future financial performance (return on assets and return on sales), but also future market evaluations (Tobin’s Q).
Research limitations/implications
This paper showed a pathway connecting technological knowledge searches to a company's future performance. In the existing research it remained unclear what kind of management company activities related to exploration and exploitation require; this study showed that companies can improve their profitability and market evaluations by changing their resource allocation for exploration and exploitation activities over time.
Originality/value
While most research on exploration and exploitation is from a static perspective, this study simultaneously incorporated focus balance and focus shifts into the empirical model and thereby examined exploration and exploitation from a dynamic perspective. Even when considering the effects of balancing exploration and exploitation, this study confirmed that organizational vacillation will improve financial performance and market evaluation.
Edward Morrison, John D. Barrett and Janyce B. Fadden
Abstract
Purpose
The purpose of this paper is to apply a reflective theory of development for entrepreneurial ecosystems in the Muscle Shoals region of northern Alabama. The theory provides guidance for practitioners and policymakers interested in developing entrepreneurial ecosystems.
Design/methodology/approach
The theory offers five propositions, which are illustrated and applied in the case study. The propositions include the need for civic leaders to recognize local talent; support networks for entrepreneurs; a quality, connected place; activities designed to increase interactivity for entrepreneurs within the ecosystem; five distinct phases producing replicable, scalable and sustainable projects; and universities providing platforms upon which the ecosystems can develop.
Findings
Application of the proposed theory is transforming the entrepreneurial ecosystem in the Muscle Shoals region. In just four years, the project has produced over 30 initiatives and events, sharply increased student participation in entrepreneurial ventures and raised over $1m.
Originality/value
The theory and its application developed from a collaboration between the Agile Strategy Lab at Purdue University and the Institute for Innovation and Economic Development at the University of North Alabama. This collaboration is replicable, scalable and sustainable, and is a model for university-led entrepreneurial ecosystem development and transformation.
Hadi Grailu, Mojtaba Lotfizad and Hadi Sadoghi‐Yazdi
Abstract
Purpose
The purpose of this paper is to propose a lossy/lossless binary textual image compression method based on an improved pattern matching (PM) technique.
Design/methodology/approach
In the Farsi/Arabic script, contrary to the printed Latin script, letters usually attach together and produce various patterns. Hence, some patterns are fully or partially subsets of some others. Two new ideas are proposed here. First, the number of library prototypes is reduced by detecting and then removing the fully or partially similar prototypes. Second, a new effective pattern encoding scheme is proposed for all types of patterns including text and graphics. The new encoding scheme has two operation modes of chain coding and soft PM, depending on the ratio of the pattern area to its chain code effective length. In order to encode the number sequences, the authors have modified the multi‐symbol QM‐coder. The proposed method has three levels for the lossy compression. Each level, in its turn, further increases the compression ratio. The first level includes applying some processing in the chain code domain such as omission of small patterns and holes, omission of inner holes of characters, and smoothing the boundaries of the patterns. The second level includes the selective pixel reversal technique, and the third level includes using the proposed method of prioritizing the residual patterns for encoding, with respect to their degree of compactness.
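The first idea — shrinking the prototype library by removing redundant patterns — can be sketched for the fully-similar case. The set-of-pixels representation and function name below are illustrative assumptions; the paper's partial-similarity detection is more involved:

```python
def prune_prototypes(protos):
    # protos: list of patterns, each a set of (row, col) foreground
    # pixels. Drop any prototype fully contained in another one,
    # keeping a single copy of exact duplicates.
    keep = []
    for i, p in enumerate(protos):
        redundant = any(
            p < q or (p == q and j < i)
            for j, q in enumerate(protos) if j != i
        )
        if not redundant:
            keep.append(p)
    return keep
```

A prototype that survives pruning can then stand in for all of its sub-patterns in the encoding stage, which is what reduces the library size.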
Findings
Experimental results show that the compression performance of the proposed method is considerably better than that of the best existing binary textual image compression methods, by as much as 1.6–3 times in the lossy case and 1.3–2.4 times in the lossless case at 300 dpi. The maximum compression ratios are achieved for Farsi and Arabic textual images.
Research limitations/implications
Only the binary printed typeset textual images are considered.
Practical implications
The proposed method has a high‐compression ratio for archiving and storage applications.
Originality/value
To the authors' best knowledge, the existing textual image compression methods or standards have not so far exploited the property of full or partial similarity of prototypes for increasing the compression ratio for any scripts. Also, the idea of combining the boundary description methods with the run‐length and arithmetic coding techniques has not so far been used.
Abstract
Purpose
The purpose of this paper is to present a new quantum‐inspired evolutionary hybrid intelligent (QIEHI) approach, in order to overcome the random walk dilemma for stock market prediction.
Design/methodology/approach
The proposed QIEHI method is inspired by the Takens' theorem and performs a quantum‐inspired evolutionary search for the minimum necessary dimension (time lags) embedded in the problem for determining the characteristic phase space that generates the financial time series phenomenon. The approach presented in this paper consists of a quantum‐inspired intelligent model composed of an artificial neural network (ANN) with a modified quantum‐inspired evolutionary algorithm (MQIEA), which is able to evolve the complete ANN architecture and parameters (pruning process), the ANN training algorithm (used to further improve the ANN parameters supplied by the MQIEA), and the most suitable time lags, to better describe the time series phenomenon.
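The Takens-inspired ingredient — representing the series by a selection of time lags — can be sketched as follows. The function and the fixed lag choice are illustrative; in the paper the MQIEA searches for the lags rather than fixing them:

```python
import numpy as np

def delay_embed(series, lags):
    # Build a design matrix X whose columns are the chosen time lags
    # and a target y holding the next value: the phase-space
    # reconstruction on which an ANN can then be trained.
    m = max(lags)
    X = np.column_stack([series[m - l:len(series) - l] for l in lags])
    y = series[m:]
    return X, y
```

For series = [0, 1, ..., 9] and lags (1, 3), the first row of X is (2, 0) with target 3: each training pair pairs the selected lagged values with the value to be predicted.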
Findings
This paper finds that, initially, the proposed QIEHI method chooses the better prediction model, then it performs a behavioral statistical test to adjust time phase distortions that appear in financial time series. Also, an experimental analysis is conducted with the proposed approach using six real-world stock market time series, and the obtained results are discussed and compared, according to a group of relevant performance metrics, to results found with multilayer perceptron networks and the previously introduced time-delay added evolutionary forecasting method.
Originality/value
The paper usefully demonstrates how the proposed QIEHI method chooses the best prediction model for the time series representation and performs a behavioral statistical test to adjust time phase distortions that frequently appear in financial time series.
Abstract
Discusses the problem definition and solving strategies contained in the TRIZ methodology. Constructed around the findings of over 1,500 person years of research, and the systematic extraction of knowledge from nearly 3 million of the world’s strongest patents, the Russian Theory of Inventive Problem Solving, TRIZ, has identified a number of design contradiction‐eliminating strategies, and distinct and predictable technology evolution patterns. These patterns and their use in the context of current and projected future manufacturing methods and systems are discussed.
Cristina Ponsiglione, Adelaide Ippolito, Simonetta Primario and Giuseppe Zollo
Abstract
Purpose
The purpose of this paper is to explore the configuration of factors affecting the accuracy of triage decision-making. The contribution of the work is twofold: first, it develops a protocol for applying a fuzzy-set qualitative comparative analysis (fsQCA) in the context of triage decision-making, and second, it studies, through two pilot cases, the interplay between individual and organizational factors in determining the emergence of errors in different decisional situations.
Design/methodology/approach
The methodology adopted in this paper is the qualitative comparative analysis (QCA). The fuzzy-set variant of QCA (fsQCA) is implemented. The data set has been collected during field research carried out in the Emergency Departments (EDs) of two Italian public hospitals.
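fsQCA works on calibrated set memberships rather than raw variables, and judges configurations by consistency and coverage. The two measures can be sketched on toy data; the conditions, membership scores, and outcome below are invented for illustration and are not the paper's data:

```python
import numpy as np

# Toy calibrated memberships for five triage cases.
workload   = np.array([0.9, 0.8, 0.2, 0.6, 0.1])  # "high workload"
experience = np.array([0.2, 0.4, 0.9, 0.3, 0.8])  # "experienced nurse"
error      = np.array([0.7, 0.7, 0.1, 0.6, 0.2])  # outcome: "triage error"

# A configuration combines conditions with fuzzy AND (minimum) and
# fuzzy NOT (complement): high workload AND NOT experienced.
config = np.minimum(workload, 1 - experience)

def consistency(x, y):
    # Degree to which the configuration is a fuzzy subset of the
    # outcome, i.e. how reliably it is accompanied by the outcome.
    return np.minimum(x, y).sum() / x.sum()

def coverage(x, y):
    # Share of the outcome that the configuration accounts for.
    return np.minimum(x, y).sum() / y.sum()
```

On this toy data the configuration's consistency is about 0.95, above the thresholds commonly used in fsQCA, so it would be retained as a candidate path to triage error; the interplay of individual and organizational conditions enters precisely through such combined set memberships.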
Findings
The results of this study show that the interplay between individual and contextual/organizational factors determines the emergence of errors in triage assessment. Furthermore, there are some regularities in the patterns discovered in each of the investigated organizational contexts. These findings suggest that we should avoid isolating individual factors from the context in which nurses make their decisions.
Originality/value
Previous research on triage has mainly explored the impact of homogeneous groups of factors on the accuracy of the triage process, without considering the complexity of the phenomenon under investigation. This study outlines the need to consider the nonlinear relationships among different factors in the study of triage decision-making. The definition and implementation of a protocol to apply fsQCA to the triage process in EDs further contributes to the originality of the research.