Search results
1 – 10 of 38
Leopold Bayerlein and Omar Al Farooque
Abstract
Purpose
The purpose of this paper is to evaluate changes in accounting policy choices and the harmonisation of accounting practices for two important financial reporting items within and between three IFRS-adopting countries. Furthermore, it aims to address methodological shortcomings in the prior harmonisation literature by introducing two newly developed significance assessment methodologies.
Design/methodology/approach
The influence of the mandatory IFRS adoption in Australia (AUS), Hong Kong (HK) and the UK on deferred taxation (DT) and goodwill (GW) accounting practices as well as the within and between country harmonisation of accounting practices is investigated through an event type study. These investigations are conducted using a McNemar test with Bowker extension as well as the Split C‐Index with a newly developed bootstrapping significance testing methodology.
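The McNemar/Bowker machinery the authors apply can be illustrated with a short sketch. The transition table below is purely illustrative (not the paper's data), and the closed-form p-value is hard-coded for 3 degrees of freedom to keep the example dependency-free.

```python
import math

def bowker_statistic(table):
    """Bowker's test of symmetry (McNemar's test extended to k x k tables).
    H0: off-diagonal counts are symmetric, i.e. accounting-policy changes
    in either direction are equally likely. Returns (chi2_stat, df)."""
    k = len(table)
    stat, df = 0.0, 0
    for i in range(k):
        for j in range(i + 1, k):
            n_ij, n_ji = table[i][j], table[j][i]
            if n_ij + n_ji > 0:
                stat += (n_ij - n_ji) ** 2 / (n_ij + n_ji)
                df += 1
    return stat, df

def chi2_sf_df3(x):
    """Survival function of the chi-square distribution with 3 degrees of
    freedom, in closed form (avoids an external stats dependency)."""
    return math.erfc(math.sqrt(x / 2.0)) + math.sqrt(2.0 * x / math.pi) * math.exp(-x / 2.0)

# Illustrative 3x3 pre- vs post-adoption policy-choice transition table
table = [[30, 10, 2],
         [4, 25, 6],
         [1, 3, 19]]
stat, df = bowker_statistic(table)
p_value = chi2_sf_df3(stat)  # valid here because df == 3
```

A small p-value would indicate that policy changes flowed asymmetrically in one direction, which is how convergence toward a particular treatment would show up in such a table.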
Findings
This study demonstrates that the mandatory IFRS adoption in the analysed countries is linked to a significant harmonisation of DT and GW accounting practices between AUS, HK and the UK. Furthermore, the increase of adequate accounting policy information in the financial reporting documents of UK firms over the period of this study is identified as an important harmonisation accelerator.
Originality/value
This study adds to the prior literature due to its focus on the mandatory IFRS adoption within the analysed countries. Furthermore, the introduction of two newly developed methodologies to evaluate the significance of accounting policy choice changes and harmonisation over time addresses an important methodological shortcoming in the prior literature.
Andreas Zendler, O. William McClung and Dieter Klaudt
Abstract
Purpose
The development of a K-12 computer science curriculum based on constructivist principles needs to be informed by knowledge of content and process concepts that are central to the discipline of computer science. The paper aims to discuss this issue.
Design/methodology/approach
Taking a cross-cultural approach and using an experimental design (a SPF-2•15×16 split-plot design), this study compares the combinations of content and process concepts identified as important in Germany with those considered relevant in the US context.
Findings
First, the combinations of content and process concepts identified in the German context can be generalized to the US context. Second, it is possible to identify combinations of content and process concepts in the US context that are also important in the German context. Third, content and process concepts identified in the two contexts can be integrated to generate a broader perspective that is valid for both contexts.
Practical implications
The results can be used to consolidate available curricular drafts for computer science as a school teaching subject of the type available in many countries. The present findings are of great relevance for research-based approaches to the pre- and in-service education of computer science teachers. The methodological approach taken is important for efforts to consolidate curricular models of computer science education, as initiated by the Bologna process in Europe and by the Association for Computing Machinery, the Association for Information Systems and the Institute of Electrical and Electronics Engineers Computer Society in the USA.
Originality/value
Results show that competence areas of central concepts identified in the two contexts can be integrated to generate a broader perspective that is valid for both contexts.
Marcelo Cajias and Anna Freudenreich
Abstract
Purpose
This is the first article to apply a machine learning approach to the analysis of time on market on real estate markets.
Design/methodology/approach
The random survival forest approach is introduced to the real estate market. The most important predictors of time on market are revealed and it is analyzed how the survival probability of residential rental apartments responds to these major characteristics.
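Random survival forests aggregate many tree-based estimates of the survival function; the underlying quantity can be illustrated with a plain Kaplan-Meier estimator of time on market. This sketch and its listing data are illustrative assumptions, not the paper's model or data.

```python
def kaplan_meier(durations, observed):
    """Kaplan-Meier estimate of the survival function S(t): the probability
    that a listing is still on the market after t days. `observed` is 1 if
    the apartment was let (event occurred), 0 if the listing was censored."""
    event_times = sorted({t for t, e in zip(durations, observed) if e})
    surv, curve = 1.0, []
    for t in event_times:
        at_risk = sum(1 for d in durations if d >= t)
        events = sum(1 for d, e in zip(durations, observed) if d == t and e)
        surv *= 1.0 - events / at_risk
        curve.append((t, surv))
    return curve

# Illustrative time-on-market data in days (0 in `observed` = still listed)
durations = [14, 30, 30, 45, 60, 90]
observed  = [1,  1,  0,  1,  1,  0]
curve = kaplan_meier(durations, observed)
```

The forest variant replaces this single pooled curve with covariate-dependent curves, which is what lets the authors trace how survival probability responds to price, area and distance to amenities.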
Findings
Results show that price, living area, construction year, year of listing and the distances to the nearest hairdresser, bakery and city center have the greatest impact on the marketing time of residential apartments. The time on market for an apartment in Munich is lowest at a price of €750 per month, an area of 60 m², a construction year of 1985 and a distance of 200–400 meters from the important amenities.
Practical implications
The findings might be interesting for private and institutional investors to derive real estate investment decisions and implications for portfolio management strategies and ultimately to minimize cash-flow failure.
Originality/value
Although machine learning algorithms have been applied frequently on the real estate market for the analysis of prices, its application for examining time on market is completely novel. This is the first paper to apply a machine learning approach to survival analysis on the real estate market.
Abstract
Purpose
The multi-robot task allocation (MRTA) problem is a challenging issue in the robotics area with plentiful practical applications. Expanding the number of tasks and robots increases the size of the state space significantly and influences the performance of the MRTA. As this process requires high computational time, this paper aims to describe a technique that minimizes the size of the explored state space, by partitioning the tasks into clusters. In this paper, the authors address the problem of MRTA by putting forward a new automatic clustering algorithm of the robots' tasks based on a dynamic-distributed double-guided particle swarm optimization, namely, ACD3GPSO.
Design/methodology/approach
This approach consists of two phases: phase I groups the tasks into clusters using the ACD3GPSO algorithm, and phase II allocates the robots to the clusters. Four factors are introduced in ACD3GPSO for better results. First, ACD3GPSO uses the k-means algorithm to improve the initial generation of particles. Second, computation is distributed using a multi-agent approach to reduce run time. Third, diversification is introduced by two local-optimum detectors, LODpBest and LODgBest. Fourth, the algorithm relies on the concept of templates and a guidance probability Pguid.
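ACD3GPSO itself is distributed and multi-phase; for reference, the canonical PSO update it builds on can be sketched as follows. The objective, coefficients and bounds here are illustrative assumptions, not the authors' settings — in phase I the objective would be a clustering cost rather than the toy function used below.

```python
import random

def pso(objective, dim, n_particles=20, iters=100,
        w=0.7, c1=1.5, c2=1.5, lo=-10.0, hi=10.0, seed=0):
    """Canonical particle swarm optimisation: each particle is pulled toward
    its personal best (pbest) and the swarm's global best (gbest)."""
    rng = random.Random(seed)
    pos = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g][:], pbest_val[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                vel[i][d] = (w * vel[i][d]
                             + c1 * rng.random() * (pbest[i][d] - pos[i][d])
                             + c2 * rng.random() * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Illustrative objective; a task-clustering cost would take its place
best, best_val = pso(lambda x: sum(v * v for v in x), dim=2)
```

The authors' LODpBest/LODgBest detectors and guidance probability modify exactly these pbest/gbest attraction terms to escape stagnation, which is where the diversification described above enters.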
Findings
Computational experiments were carried out to prove the effectiveness of this approach. It is compared against two state-of-the-art solutions of the MRTA and against two evolutionary methods under five different numerical simulations. The simulation results confirm that the proposed method is highly competitive in terms of the clustering time, clustering cost and MRTA time.
Practical implications
The proposed algorithm is quite useful for real-world applications, especially the scenarios involving a high number of robots and tasks.
Originality/value
In this methodology, owing to the ACD3GPSO algorithm, the run time of task allocation is reduced. Therefore, the proposed method can be considered a viable alternative in the field of MRTA with growing numbers of both robots and tasks. In PSO, stagnation and local-optima issues are avoided by adding diversity to the population, without losing its fast convergence.
Michael J. Brusco, Renu Singh, J. Dennis Cradit and Douglas Steinley
Abstract
Purpose
The purpose of this paper is twofold. First, the authors provide a survey of operations management (OM) research applications of traditional hierarchical and nonhierarchical clustering methods with respect to key decisions that are central to a valid analysis. Second, the authors offer recommendations for practice with respect to these decisions.
Design/methodology/approach
A coding study was conducted for 97 cluster analyses reported in six OM journals during the period spanning 1994-2015. Data were collected with respect to: variable selection, variable standardization, method, selection of the number of clusters, consistency/stability of the clustering solution, and profiling of the clusters based on exogenous variables. Recommended practices for validation of clustering solutions are provided within the context of this framework.
Findings
There is considerable variability across clustering applications with respect to the components of validation, as well as a mix of productive and undesirable practices. This underscores the importance of the schema the authors provide for conducting a cluster analysis.
Research limitations/implications
Certain aspects of the coding study required some degree of subjectivity with respect to interpretation or classification. However, in light of the sheer magnitude of the coding study (97 articles), the authors are confident that an accurate picture of empirical OM clustering applications has been presented.
Practical implications
The paper provides a critique and synthesis of the practice of cluster analysis in OM research. The coding study provides a thorough foundation for how the key decisions of a cluster analysis have been previously handled in the literature. Both researchers and practitioners are provided with guidelines for performing a valid cluster analysis.
Originality/value
To the best of the authors’ knowledge, no study of this type has been reported in the OM literature. The authors’ recommendations for cluster validation draw from recent studies in other disciplines that are apt to be unfamiliar to many OM researchers.
M. Taha Janan and A. El Marjani
Abstract
Purpose
This paper aims to develop an efficient numerical method for simulating multicomponent flows by solving the system of conservative equations closed by a general two parameters equation of state.
Design/methodology/approach
A finite difference method for solving the two‐dimensional Euler or Navier‐Stokes equations for multicomponent flows in a general curvilinear coordinate system is developed. The system of conservative equations (mass, momentum and energy) is closed with a general two-parameter equation of state, ρe = (p + γp∞)/(γ − 1), which, associated with a γ-formulation, allows easy computation of multicomponent flows. To enforce the stability of the numerical scheme, Roe's flux-difference splitting is adopted for the numerical treatment of the inviscid fluxes. The method is also adapted to treat unsteady flows by implementing an explicit Euler scheme.
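The two-parameter (stiffened-gas) closure ρe = (p + γp∞)/(γ − 1) is straightforward to evaluate in both directions; a minimal sketch follows, with generic illustrative parameter values (air-like γ = 1.4, p∞ = 0 reduces it to the ideal-gas closure).

```python
def internal_energy_density(p, gamma, p_inf):
    """Stiffened-gas equation of state: rho*e = (p + gamma*p_inf) / (gamma - 1).
    With p_inf = 0 this reduces to the ideal-gas closure."""
    return (p + gamma * p_inf) / (gamma - 1.0)

def pressure(rho_e, gamma, p_inf):
    """Inverse relation: recover pressure from the conserved internal energy,
    as needed when updating primitive variables from conservative ones."""
    return rho_e * (gamma - 1.0) - gamma * p_inf

# Ideal-gas limit: atmospheric pressure, gamma = 1.4, p_inf = 0
rho_e = internal_energy_density(101325.0, 1.4, 0.0)
```

Carrying γ (and p∞) per cell is what lets a single solver of this form handle several fluid components at once, which is the role of the γ-formulation mentioned above.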
Findings
The method was applied to compute various flow configurations, ranging from incompressible to compressible fluids, including both single-component and multicomponent cases. Computations show that the use of primitive variables instead of conservative ones, especially at low Mach numbers, improves the iteration process when the resolution is performed with a relaxation procedure such as the Gauss‐Seidel method. Simulations of compressible flows with a strong shock show the ability of the present method to capture shocks correctly, even with the use of primitive variables. To complete the numerical tests, flows involving two fluids with interactions between a shock and a discontinuity surface have been treated successfully. A case of cavitating flow has also been considered in this work.
Originality/value
The present method permits the simulation of a large variety of complex multicomponent flows with an efficient numerical scheme, taking advantage of Roe's flux-difference splitting in a curvilinear coordinate system.
Metin Vatansever, İbrahim Demir and Ali Hepşen
Abstract
Purpose
The main purpose of this study is to detect homogeneous housing market areas among 196 districts of 5 major cities of Turkey in terms of house sale price indices. The second purpose is to forecast these 196 house sale price indices.
Design/methodology/approach
In this paper, the authors use the monthly house sale price indices of 196 districts of 5 major cities of Turkey. The authors propose an autoregressive (AR) model-based fuzzy clustering approach to detect homogeneous housing market areas and to forecast house price indices.
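An AR model-based clustering compares series through their fitted autoregressive coefficients rather than raw index levels; a minimal AR(1) least-squares fit can be sketched as follows. The series below is synthetic, not the Turkish index data, and the single-lag fit is a simplification of the authors' approach.

```python
def fit_ar1(series):
    """Least-squares estimate of phi in x_t = phi * x_{t-1} + e_t.
    Fuzzy clustering would then group districts whose fitted
    coefficients (i.e. price dynamics) are similar."""
    num = sum(series[t] * series[t - 1] for t in range(1, len(series)))
    den = sum(series[t - 1] ** 2 for t in range(1, len(series)))
    return num / den

# Synthetic index with a true coefficient of 0.5 and no noise
x = [1.0]
for _ in range(11):
    x.append(0.5 * x[-1])
phi = fit_ar1(x)
```

Clustering on such coefficients is what makes districts with co-moving prices land in the same group even when their index levels differ.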
Findings
The AR model-based fuzzy clustering approach detects three homogeneous property market areas among the 196 districts of 5 major cities of Turkey in which house sale prices move together (i.e. share similar house sales dynamics). This approach also provides better forecasting results than standard AR models, with higher data efficiency and lower model validation and maintenance effort.
Research limitations/implications
In this study, the authors could not use any district-based socioeconomic and consumption behavioral indicators and any discrete geographical and property characteristics because of the data limitation.
Practical implications
The findings of this study would help property investors establish more effective property management strategies that take different geographical location conditions into account.
Social implications
From the government side, knowing the future rises, falls and turning points of property prices in different locations can allow the government to monitor property price changes and control speculative activities that cause dramatic changes in the market.
Originality/value
No previous research paper has focused on neighborhood-based clusters and forecasting house sale price indices in Turkey; this is the first academic study to do so.
George N Kenyon, R. Samual Sale, Kurt Hozak and Paul Chiou
Abstract
Purpose
The purpose of this paper is to develop a yield-based process capability index (PCI), Cpy, to overcome the shortcomings of existing PCIs that limit their use and lead to inaccurate measures of quality conformance under a variety of common conditions.
Design/methodology/approach
Cpy is developed conceptually to flexibly and accurately reflect conformance, and is then used to numerically measure the inaccuracies of Cpk.
Findings
Cpy overcomes many of the problems associated with existing PCIs, including Cpk. The degree of non-normality of the process distribution, the quality level (the sigma level), and whether the process is centered or shifted left or right all affect the direction and size of the process capability error produced by Cpk. The accuracy of Cpk can be greatly affected by process data that deviate even slightly from normality.
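The definition of Cpy is given in the paper itself; for contrast, the conventional Cpk it is benchmarked against can be sketched as follows, together with the normal-theory yield it implicitly assumes. The specification limits and process parameters below are illustrative.

```python
import math

def cpk(mean, sigma, lsl, usl):
    """Conventional capability index: distance from the mean to the nearer
    specification limit, in units of three standard deviations. Its
    accuracy hinges on the process distribution being normal."""
    return min(usl - mean, mean - lsl) / (3.0 * sigma)

def normal_yield(mean, sigma, lsl, usl):
    """In-spec fraction under the normality assumption, via the error
    function; a yield-based index works from this fraction directly."""
    phi = lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
    return phi((usl - mean) / sigma) - phi((lsl - mean) / sigma)

# Centered process with +/-3 sigma limits: Cpk = 1, yield about 99.73%
c = cpk(0.0, 1.0, -3.0, 3.0)
y = normal_yield(0.0, 1.0, -3.0, 3.0)
```

When the true distribution is non-normal, the yield implied by Cpk through this mapping diverges from the actual in-spec fraction, which is the error the paper quantifies.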
Practical implications
Cpy offers numerous advantages over existing PCIs. It accurately reflects process conformance regardless of the process distribution. It is applicable even if the process has multiple characteristics, and with both variable and attribute data. Its calculation is relatively simple, and the necessary data are likely already captured by most organizations.
Originality/value
The main contributions are the development of a new PCI, Cpy; a conceptual analysis of its advantages; and a numerical analysis of the improved accuracy of Cpy as compared to Cpk for shifted and non-shifted process means for normal, nearly normal, and highly non-normal distributions over a range of process variability levels.
Khee Giap Tan, Hui Yin Chuah and Nguyen Trieu Duong Luu
Abstract
Purpose
Malaysia and Singapore parted more than five decades ago. Much of the existing literature on the bilateral ties between the two economies focuses on the political economy perspective. This paper aims to provide insights on the economic development and prospects of Malaysia and Singapore at the national level. In addition, it makes a pioneering attempt at a comprehensive comparative analysis between Malaysia and Singapore at the city level.
Design/methodology/approach
This paper offers a case study of Malaysia and Singapore by assessing their national economic competitiveness, urban standards of living and quality of life. The paper leverages a series of indices, such as the competitiveness index for ASEAN-10, the cost of living, wages and purchasing power of ordinary residents, as well as the liveable cities index, to perform the analysis.
Findings
In terms of national competitiveness, the analysis shows that Singapore and Malaysia have led the ASEAN region from 2000 onwards, ranked first and second, respectively. Malaysia still lags Singapore in several aspects, such as attractiveness to foreign investors, standard of living, education and social stability, despite insignificant differences in the ranking. City-level analysis shows that the cost of living in Singapore is almost double that in Kuala Lumpur, although living in Singapore is more affordable owing to the higher wages earned by ordinary citizens.
Originality/value
This paper contributes to the literature in several ways. First, this paper assesses economic development in Singapore and Malaysia instead of focusing on cross-straits relations. Second, the study reflects the view that the improvement of standards of living and quality of life for ordinary residents is paramount to economic development. The competitiveness index and city-level benchmarks used in the paper reflect the standards of living and the quality-of-life dimensions. Third, the focus on city-level analysis in addition to conventional national-level analysis helps to provide policymakers with practical policy implications against the backdrop of rapid urbanisation.
Marianne Johnson and Warren J. Samuels
“Economics is a Serious Subject.” Edwin Cannan.