Search results

1 – 10 of 13
Article
Publication date: 2 November 2012

Khoutir Bouchbout, Jacky Akoka and Zaia Alimazighi

This paper aims to present a new approach for developing a framework based on a model driven architecture (MDA) for the modelling of technology‐independent collaborative processes.

Abstract

Purpose

This paper aims to present a new approach for developing a framework based on a model driven architecture (MDA) for the modelling of technology‐independent collaborative processes.

Design/methodology/approach

The paper suggests a new collaborative process modelling approach based on MDA and a metamodelling technique. The research method, grounded in the design science approach, starts by identifying the characteristics of collaborative processes that distinguish them from classical intraorganizational ones. A generic collaborative business process (CBP) modelling framework is then developed, based on the MDA approach and on a set of transformation rules defined across three layers: business, process, and technical. The core component of the framework is a generic CBP metamodel proposed at the PIM level of the MDA. Each collaboration participant's specific business processes (expressed as BPMN models) are generated from the generic CBP model, represented as a UML2 Profile activity diagram compliant with the CBP metamodel. Finally, as proof of concept, the architecture of an Eclipse‐based open development platform is developed, implementing an e‐Procurement collaborative process.

Findings

The proposed framework for CBP modelling and the generic CBP metamodel contribute towards a more efficient methodology and have consequences for BPM‐related collaboration, facilitating B2B process modelling and implementation. To demonstrate and evaluate the practical applicability of the framework, the architecture of an Eclipse‐based open development platform is developed, implementing a collaborative business application based on an e‐Procurement use case.

Research limitations/implications

Future research efforts should focus on improving the semi‐automatic transformation phase from public to private processes, which requires human intervention to add suitable interfaces at both sides of the B2B interaction. In addition, the problem of semantic heterogeneity among the partners' business process elements (business documents, activity/task names) should be tackled by developing an ontology‐based approach.

Practical implications

Business processes developers find a B2B technology‐independent solution for implementing and using interorganizational information systems.

Originality/value

The paper provides a framework that enables CBP modelling and integrates a generic CBP metamodel. To the best of the authors' knowledge, such a generic metamodel and its instantiation have not so far been developed.

Abstract

Purpose

Ubiquitous web applications (UWA) are a new type of web application accessed in various contexts, i.e. through different devices, by users with various interests, at any time from any place around the globe. For such full‐fledged, complex software systems, a methodologically sound engineering approach in terms of model‐driven engineering (MDE) is crucial. Several modeling approaches have already been proposed that capture the ubiquitous nature of web applications, each of them having different origins, pursuing different goals and providing a pantheon of concepts. This paper aims to give an in‐depth comparison of seven modeling approaches supporting the development of UWAs.

Design/methodology/approach

The comparison is conducted by applying a detailed set of evaluation criteria and by demonstrating each approach's applicability on the basis of an exemplary tourism web application. In particular, five commonly found ubiquitous scenarios are investigated, providing initial insight into the modeling concepts of each approach and facilitating their comparability.

Findings

The results gained indicate that many modeling approaches lack a proper MDE foundation in terms of meta‐models and tool support. The proposed modeling mechanisms for ubiquity are often limited, since they neither cover all relevant context factors in an explicit, self‐contained, and extensible way, nor allow for a wide spectrum of extensible adaptation operations. The provided modeling concepts frequently do not allow dealing with all different parts of a web application in terms of its content, hypertext, and presentation levels as well as their structural and behavioral features. Finally, current modeling approaches do not reflect the crosscutting nature of ubiquity but rather intermingle context and adaptation issues with the core parts of a web application, thus hampering maintainability and extensibility.

Originality/value

Different from other surveys in the area of modeling web applications, this paper specifically considers modeling concepts for their ubiquitous nature, together with a comprehensive investigation of available support for MDD, using a well‐defined and fine‐grained catalogue of more than 30 evaluation criteria.

Details

International Journal of Web Information Systems, vol. 4 no. 3
Type: Research Article
ISSN: 1744-0084

Keywords

Article
Publication date: 7 April 2023

Khaled Halteh and Milind Tiwari

The prevention of fraudulent activities, particularly within a financial context, is of paramount significance in all spheres, as it not only impacts the sustainability of…

Abstract

Purpose

The prevention of fraudulent activities, particularly within a financial context, is of paramount significance in all spheres, as it not only impacts the sustainability of corporate entities but also has the potential to have a broader economy-wide impact. This paper aims to focus on dual implications associated with financial distress, the first being associated with the temptation to launder funds due to financial distress, and the second being the potential for illicit activities, such as fraud, money laundering or terror financing, to give rise to financial distress.

Design/methodology/approach

The paper examines the literature on financial distress and uses theories of financial crime to establish a link between financial distress and financial crime.

Findings

In recent years, there has been a surge in corporate financial distress, particularly in the aftermath of concurrent crises such as the COVID-19 pandemic and the Russia–Ukraine war. Through a comprehensive examination of literature pertaining to financial distress and financial crime, this study identifies a proclivity towards fraudulent conduct arising from instances of financial distress. Moreover, the engagement in such illicit activities subsequently exacerbates the financial distress. An analysis of the relationship between financial crime and financial distress reveals the existence of a vicious cycle between the two.

Originality/value

The results of this study have the potential to advance understanding of the relationship between financial distress and financial crime, which has been previously underexplored.

Details

Journal of Money Laundering Control, vol. 26 no. 6
Type: Research Article
ISSN: 1368-5201

Keywords

Article
Publication date: 28 January 2014

Fernando Castagnolo and Gustavo Ferro

The purpose of this paper is to assess and compare the forecast ability of existing credit risk models, answering three questions: Can these methods adequately predict default…

Abstract

Purpose

The purpose of this paper is to assess and compare the forecast ability of existing credit risk models, answering three questions: Can these methods adequately predict default events? Are there dominant methods? Is it safer to rely on a mix of methodologies?

Design/methodology/approach

The authors examine four existing models: O-score, Z-score, Campbell, and Merton distance to default model (MDDM). The authors compare their ability to forecast defaults using three techniques: intra-cohort analysis, power curves and discrete hazard rate models.
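Of the four models compared, the Merton distance to default is the most formula-driven. A minimal sketch of its core computation follows — the inputs here are illustrative placeholders, not the paper's calibration:

```python
import math

def merton_distance_to_default(V, F, mu, sigma, T=1.0):
    """Distance to default under the Merton model.

    V: market value of firm assets, F: face value of debt (default barrier),
    mu: expected asset return, sigma: asset volatility, T: horizon in years.
    """
    return (math.log(V / F) + (mu - 0.5 * sigma ** 2) * T) / (sigma * math.sqrt(T))

def default_probability(dd):
    # P(default) = N(-DD); standard normal CDF via the error function
    return 0.5 * (1.0 - math.erf(dd / math.sqrt(2.0)))

# Illustrative firm: assets 20% above the debt barrier, 25% asset volatility
dd = merton_distance_to_default(V=120.0, F=100.0, mu=0.08, sigma=0.25, T=1.0)
pd = default_probability(dd)
```

The distance to default shrinks, and the implied default probability rises, as asset value approaches the debt barrier or asset volatility grows — which is why the paper finds it complements accounting-based scores rather than replacing them.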

Findings

The authors conclude that better predictions demand a mix of models containing accounting and market information. The authors found evidence that the O-score outperforms the other models. In the sample, the MDDM alone is not a sufficient default predictor, but discrete hazard rate models suggest that combining both types of information should enhance default prediction models.

Research limitations/implications

The analysed methods alone cannot adequately predict defaults. The authors found no dominant methods. Instead, it would be advisable to rely on a mix of methodologies, which use complementary information.

Practical implications

Better forecasts demand a mix of models containing both accounting and market information.

Originality/value

The findings suggest that more precise default prediction models can be built by combining information from different sources in reduced-form models and combining default prediction models that can analyze said information.

Details

The Journal of Risk Finance, vol. 15 no. 1
Type: Research Article
ISSN: 1526-5943

Keywords

Article
Publication date: 1 March 2005

Patti Cybinski and Carolyn Windsor

Conflicting results have emerged from several past studies as to whether bankruptcy prediction models are able to forecast corporate failure more accurately than auditors’…

Abstract

Conflicting results have emerged from several past studies as to whether bankruptcy prediction models are able to forecast corporate failure more accurately than auditors’ going‐concern opinions. Nevertheless, the last decade has seen improved modelling of the path‐to‐failure of financially distressed firms over earlier static models of bankruptcy. In the light of the current crisis facing the auditing profession, this study evaluates the efficacy of auditors’ going‐concern opinions in comparison to two bankruptcy prediction models. Bankrupt firms in the U.S. service and trade industry sectors were used to compare model predictions against the auditors’ going‐concern opinion for two years prior to firm failure. The two models are the well‐known Altman (1968) Multiple Discriminant Analysis (MDA) model that includes only financial ratio variables in its formulation and the newer, temporal logit model of Cybinski (2000, 2003) that includes explicit factors of the business cycles in addition to variables internal to the firm. The results show overall better bankruptcy classification rates for the temporal model than for the Altman model or audit opinion.
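The Altman (1968) model referenced above is a fixed linear combination of five financial ratios; a minimal sketch, using the coefficients and cut-off scores from the original 1968 study, is:

```python
def altman_z(wc_ta, re_ta, ebit_ta, mve_tl, sales_ta):
    """Altman (1968) Z-score for public manufacturing firms.

    Inputs are the five ratios of the original model: working capital /
    total assets, retained earnings / total assets, EBIT / total assets,
    market value of equity / total liabilities, sales / total assets.
    """
    return (1.2 * wc_ta + 1.4 * re_ta + 3.3 * ebit_ta
            + 0.6 * mve_tl + 1.0 * sales_ta)

def zone(z):
    # Cut-offs from the original study: Z < 1.81 distress, Z > 2.99 safe
    if z < 1.81:
        return "distress"
    if z > 2.99:
        return "safe"
    return "grey"

# Illustrative firm with middling ratios
z = altman_z(wc_ta=0.2, re_ta=0.2, ebit_ta=0.1, mve_tl=1.0, sales_ta=1.5)
```

Because the coefficients are static and purely financial-ratio based, the model cannot react to business-cycle conditions — the gap the temporal logit model of Cybinski (2000, 2003) is designed to fill.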

Details

Pacific Accounting Review, vol. 17 no. 1
Type: Research Article
ISSN: 0114-0582

Keywords

Article
Publication date: 1 April 2001

Patti Cybinski

This paper describes a number of models used in bankruptcy studies to date. They arise from two basic model designs used in studies of financial distress: cross-sectional studies…

Abstract

This paper describes a number of models used in bankruptcy studies to date. They arise from two basic model designs used in studies of financial distress: cross-sectional studies that compare healthy and distressed firms, and time-series formulations that study the path to failure of (usually) distressed firms only. These two designs inherently foster different research objectives. Recent instances of work from each of these research streams, broadly categorized by design, are described here, including new work by this author. It is argued that the studies that investigate the distress continuum with predominantly explanatory objectives are superior, on a number of criteria, to those that follow what is essentially a case-control structure and espouse prediction as their objective.

Article
Publication date: 12 April 2011

Marcellina Mvula Chijoriga

The purpose of this research is to investigate whether inclusion of risk assessment variables in the multiple discriminant analysis (MDA) model improved the bank's ability in…

Abstract

Purpose

The purpose of this research is to investigate whether the inclusion of risk assessment variables in the multiple discriminant analysis (MDA) model improved the bank's ability to classify customers correctly, predict firm performance and assess credit risk.

Design/methodology/approach

The paper reviews literature on the application of financial distress and credit scoring methods, and the use of risk assessment variables in classification models. The study used a sample of 56 performing and non‐performing assets (NPA) of a privatized commercial bank in Tanzania. Financial ratios were used as independent variables for building the MDA model, with five variations of the model. Different statistical tests for normality, equality of covariance, goodness of fit and multicollinearity were performed. Using the estimation and validation samples, test results showed that the MDA base model had a higher level of predictability, correctly classifying the performing and non‐performing assets at rates of 92.9 and 96.4 percent, respectively. Lagging the classification two years, the results showed that the model could predict correctly two years in advance. When MDA was used as a risk assessment model, it showed improved customer classification and credit risk assessment.
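The two-group discriminant classification described above can be sketched as a Fisher linear discriminant on two financial ratios — the toy samples below merely stand in for the performing/NPA data, which the paper does not publish:

```python
def mean(rows):
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def pooled_cov(a, b, ma, mb):
    # Pooled within-group 2x2 covariance of the two samples.
    s = [[0.0, 0.0], [0.0, 0.0]]
    for rows, m in ((a, ma), (b, mb)):
        for r in rows:
            d = [r[0] - m[0], r[1] - m[1]]
            for i in range(2):
                for j in range(2):
                    s[i][j] += d[i] * d[j]
    n = len(a) + len(b) - 2
    return [[s[i][j] / n for j in range(2)] for i in range(2)]

def discriminant(a, b):
    # Fisher weights w = S^-1 (mean_a - mean_b), 2x2 inverse in closed
    # form, plus the midpoint cut-off score between the group centroids.
    ma, mb = mean(a), mean(b)
    c = pooled_cov(a, b, ma, mb)
    det = c[0][0] * c[1][1] - c[0][1] * c[1][0]
    inv = [[c[1][1] / det, -c[0][1] / det],
           [-c[1][0] / det, c[0][0] / det]]
    d = [ma[0] - mb[0], ma[1] - mb[1]]
    w = [inv[0][0] * d[0] + inv[0][1] * d[1],
         inv[1][0] * d[0] + inv[1][1] * d[1]]
    cutoff = (sum(wi * mi for wi, mi in zip(w, ma))
              + sum(wi * mi for wi, mi in zip(w, mb))) / 2
    return w, cutoff

def classify(x, w, cutoff):
    # Score above the cut-off -> "performing", below -> "non-performing".
    score = sum(wi * xi for wi, xi in zip(w, x))
    return "performing" if score > cutoff else "non-performing"

# Hypothetical ratio pairs (e.g. liquidity, profitability), illustration only.
performing = [(0.9, 0.8), (0.8, 0.9), (1.0, 0.7)]
npa = [(0.2, 0.1), (0.1, 0.3), (0.3, 0.2)]
w, cutoff = discriminant(performing, npa)
```

A new customer's ratios are scored against the cut-off the same way, which is what makes the misclassification costs of the method objectively computable.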

Findings

The findings confirmed financial ratios as good classification and predictor variables of firm's performance. If the bank had used the MDA for classifying and evaluating its customers, the probability of failure could have been known two years before actual failure, and the misclassification costs could have been calculated objectively. In this way, the bank could have reduced its non‐performing loans and its credit risk exposure.

Research limitations/implications

The validation sample used in the study was smaller than the estimation sample. MDA works better as a credit scoring method in the banking environment two years before and after failure. The study was conducted during the 2009 financial crisis.

Practical implications

Use of MDA helps banks to determine objectively the misclassification costs and expected misclassification errors, as well as the provisions for bad debts. Banks could have reduced their non‐performing loans and credit risk exposure if they had used the MDA method in the loan‐evaluation and classification process. The study has shown that quantitative credit scoring models improve management decision making compared to subjective assessment methods. For improved credit and risk assessment, a combination of both qualitative and quantitative methods should be considered.

Originality/value

The findings have shown that, using MDA, commercial banks could have improved their decision making by correctly classifying the creditworthiness of customers, predicting firms' future performance and assessing their credit risk. They have also shown that, beyond financial variables, the inclusion of stability measures improves management decision making and the objective provisioning of bad debts. The recent financial crisis emphasizes the need to develop objective credit scoring methods and to institute a prudent risk assessment culture to limit the extent and potential of failure.

Details

International Journal of Emerging Markets, vol. 6 no. 2
Type: Research Article
ISSN: 1746-8809

Keywords

Article
Publication date: 29 November 2018

Ioannis Anagnostopoulos and Anas Rizeq

This study provides valuable insights to managers aiming to increase the effectiveness of their diversification and growth portfolios. The purpose of this paper is to examine the…

Abstract

Purpose

This study provides valuable insights to managers aiming to increase the effectiveness of their diversification and growth portfolios. The purpose of this paper is to examine the value of utilizing a neural networks (NNs) approach using mergers and acquisition (M&A) data confined in the US technology domain.

Design/methodology/approach

Using data from Bloomberg for the period 2000–2016, the results confirm that an NN approach captures more of the relationships between the financial variables in the model than a traditional regression model; the NN approach is then compared with a linear classifier, logistic regression. The empirical results show that NN is a promising method for evaluating M&A takeover targets in terms of predictive accuracy and adaptability.
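A minimal illustration of why a hidden layer lets an NN capture interactions between variables that a single logistic score cannot — the weights below are entirely hand-picked for an XOR-style interaction, not the paper's Bloomberg-trained network:

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def logistic(x, w, b):
    # Logistic regression: one linear score passed through a sigmoid.
    return sigmoid(sum(wi * xi for wi, xi in zip(w, x)) + b)

def mlp(x, W1, b1, w2, b2):
    # One hidden layer: each hidden unit is itself a logistic unit, so
    # the output can represent interactions between the inputs.
    h = [sigmoid(sum(wij * xi for wij, xi in zip(row, x)) + bj)
         for row, bj in zip(W1, b1)]
    return sigmoid(sum(wi * hi for wi, hi in zip(w2, h)) + b2)

# Hand-picked weights: hidden unit 1 acts as OR, unit 2 as NAND, and the
# output ANDs them, yielding an exclusive-or pattern that no single
# linear score followed by a sigmoid can reproduce.
W1 = [[20.0, 20.0], [-20.0, -20.0]]
b1 = [-10.0, 30.0]
w2 = [20.0, 20.0]
b2 = -30.0

xor_like = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}
```

In the M&A setting, such interactions might be pairs of financial variables that only jointly signal an attractive target — the kind of structure the paper credits for the NN's higher explanatory power over plain regression.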

Findings

The findings emphasize the value that alternative methodologies provide in high-technology industries for achieving screening and explorative performance objectives, given the technological complexity, market uncertainty and divergent skill sets required for breakthrough innovations in these sectors.

Research limitations/implications

NN methods do not provide as full an analysis of the significance of each independent variable in the model as traditional regression methods do. The generalization breadth of this study is limited to a specific sector (technology) in a specific country (USA) over a specific period (2000–2016).

Practical implications

Investors value firms before investing in them to identify their true stock price; yet technology firms pose a great valuation challenge to investors and analysts alike, as the information technology stock price bubbles of Silicon Valley and the recent stratospheric rise of financial technology companies have demonstrated.

Social implications

Numerous studies have shown that, more often than not, M&As destroy value rather than create it: more than 50 percent of all M&As lead to a decline in relative total shareholder return after one year. Hence, effective target identification must be built on the foundation of a credible strategy that identifies the most promising market segments for growth, assesses whether organic or acquisitive growth is the best way forward and defines the commercial and financial hurdles for potential deals.

Originality/value

Technology firm value depends directly on growth; consequently, most of the value originates from future customers or products rather than from current assets, which makes it challenging for investors to measure a firm's beta (risk), since the value of a technology is only known after its commercialization to the market. A differentiated methodological approach is the use of NNs, machine learning and data mining to predict bankruptcy or takeover targets.

Details

Managerial Finance, vol. 45 no. 10/11
Type: Research Article
ISSN: 0307-4358

Keywords

Article
Publication date: 13 July 2015

Razieh Dehghani and Raman Ramsin

This paper aims to provide a criteria-based evaluation framework for assessing knowledge management system (KMS) development methodologies.

Abstract

Purpose

This paper aims to provide a criteria-based evaluation framework for assessing knowledge management system (KMS) development methodologies.

Design/methodology/approach

The evaluation criteria have been elicited based on the features expected from a successful KMS. Furthermore, a number of prominent KMS development methodologies have been scrutinized based on the proposed evaluation framework.

Findings

It was demonstrated that the proposed evaluation framework is detailed and comprehensive enough to reveal the strengths and weaknesses of KMS development methodologies. It was also revealed that even though the evaluated methodologies possess certain strong features, they suffer from several shortcomings that need to be addressed.

Research limitations/implications

The evaluation framework has not been applied to all existing KMS development methodologies; however, the evaluation does cover the most comprehensive methodologies which exist in the research context.

Practical implications

The results of this research can be used for the following purposes: organizational goal-based selection of KMS development methodologies, evolution of existing KMS development methodologies and engineering of tailored-to-fit KMS development methodologies.

Originality/value

The proposed evaluation framework provides a comprehensive and detailed set of criteria for assessing general, area-specific and context-specific features of KMS development methodologies. KMS developers can select the methodology which best fits their requirements based on the evaluation results. Furthermore, method engineers can extend existing methodologies or engineer new ones so as to satisfy the specific requirements of the project at hand.

Details

Journal of Knowledge Management, vol. 19 no. 4
Type: Research Article
ISSN: 1367-3270

Keywords

Article
Publication date: 19 November 2018

I-Cheng Chen and I-Ching Hsu

In recent years, governments around the world have been actively promoting Open Government Data (OGD) to facilitate reusing open data and developing information applications…

Abstract

Purpose

In recent years, governments around the world have been actively promoting Open Government Data (OGD) to facilitate the reuse of open data and the development of information applications. Currently, there are more than 35,000 data sets available on the Taiwan OGD website. However, the existing Taiwan OGD website only provides keyword queries and lacks a friendly query interface. This study aims to address these issues by defining a DBpedia cloud computing framework (DCCF) for integrating DBpedia with Semantic Web technologies into a Spark cluster cloud computing environment.

Design/methodology/approach

The proposed DCCF is used to develop a Taiwan OGD recommendation platform (TOGDRP) that provides a friendly query interface to automatically filter out the relevant data sets and visualize relationships between these data sets.

Findings

To demonstrate the feasibility of TOGDRP, the experimental results illustrate the efficiency of different cloud computing models, including the Hadoop YARN cluster model, the Spark standalone cluster model and the Spark YARN cluster model.

Originality/value

The novel solution proposed in this study is a hybrid approach that integrates Semantic Web technologies into the Hadoop and Spark cloud computing environments to provide OGD data set recommendations.

Details

International Journal of Web Information Systems, vol. 15 no. 2
Type: Research Article
ISSN: 1744-0084

Keywords
