Search results
Freddy H. Marín-Sánchez, Julián A. Pareja-Vasseur and Diego Manzur
Abstract
Purpose
The purpose of this article is to propose a detailed methodology to estimate, model and incorporate the non-constant volatility onto a numerical tree scheme, to evaluate a real option, using a quadrinomial multiplicative recombination.
Design/methodology/approach
This article uses the multiplicative quadrinomial tree numerical method with non-constant volatility, based on stochastic differential equations of the GARCH-diffusion type to value real options when the volatility is stochastic.
Findings
Findings showed that when the volatility tends to zero, the traditional multiplicative binomial method emerges as a particular case of the proposed method, and that results are comparable between the two methodologies as well as with the exact solution offered by the Black–Scholes model.
Originality/value
The originality of this paper lies in modeling the implicit (conditional) market volatility and, on that basis, valuing a real option using a quadrinomial tree that incorporates the stochastic volatility of the underlying asset. The main contribution is the formal derivation of a risk-neutral valuation, together with the market risk premium associated with volatility; this condition is verified via numerical tests on simulated and real data, showing that the proposal is consistent with the Black–Scholes formula and the multiplicative binomial tree method.
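The limiting behaviour described in the Findings can be illustrated with a standard multiplicative (CRR) binomial tree checked against the Black–Scholes closed form. This is a minimal sketch of the constant-volatility special case only, not the authors' quadrinomial GARCH-diffusion scheme, and all parameter values are illustrative.

```python
import math

def crr_binomial_call(S0, K, r, sigma, T, n):
    """Price a European call on a multiplicative (CRR) recombining binomial tree."""
    dt = T / n
    u = math.exp(sigma * math.sqrt(dt))   # up factor
    d = 1.0 / u                            # down factor (multiplicative recombination)
    p = (math.exp(r * dt) - d) / (u - d)   # risk-neutral up probability
    disc = math.exp(-r * dt)
    # terminal payoffs
    values = [max(S0 * u**j * d**(n - j) - K, 0.0) for j in range(n + 1)]
    # backward induction through the tree
    for step in range(n, 0, -1):
        values = [disc * (p * values[j + 1] + (1 - p) * values[j])
                  for j in range(step)]
    return values[0]

def black_scholes_call(S0, K, r, sigma, T):
    """Closed-form Black–Scholes price for comparison."""
    d1 = (math.log(S0 / K) + (r + 0.5 * sigma**2) * T) / (sigma * math.sqrt(T))
    d2 = d1 - sigma * math.sqrt(T)
    N = lambda x: 0.5 * (1 + math.erf(x / math.sqrt(2)))
    return S0 * N(d1) - K * math.exp(-r * T) * N(d2)

tree = crr_binomial_call(100, 100, 0.05, 0.2, 1.0, 500)
exact = black_scholes_call(100, 100, 0.05, 0.2, 1.0)
print(abs(tree - exact))  # small gap: the tree converges to Black–Scholes
```

With 500 steps the tree price agrees with the closed form to within a few hundredths, which is the kind of consistency check the abstract reports for its richer scheme.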
Abstract
Purpose
The aggregate index and the per capita index have different meanings for some countries or regions. CO2 emissions per capita matter for China because of its huge population. Therefore, this study aims to deepen the understanding of the Kuznets curve from the perspective of CO2 emissions per capita. To this end, mathematical formulas are derived and verified.
Design/methodology/approach
First, this study verified the existing problems with the environmental Kuznets curve (EKC) through multiple regression. Second, this study developed a theoretical derivation with the Solow model and balanced growth and explained the underlying principles of the EKC’s shape. Finally, this study quantitatively analyzed the influencing factors.
Findings
CO2 emissions per capita are related to per capita GDP, nonfossil energy and total factor productivity (TFP). Empirical results support the EKC hypothesis. When the proportion of nonfossil energy and TFP increase by 1%, per capita CO2 emissions decrease by 0.041 t and 1.79 t, respectively. The growth rate of CO2 emissions per capita is determined by the difference between the growth rate of output per capita and the sum of the efficiency and structural growth rates. To achieve both the CO2 emission intensity target and the economic growth target, the growth rate of per capita CO2 emissions must fall within the range [−0.92%, 6.1%].
Originality/value
Inspired by the EKC and balanced growth, this study investigated the relationships between China’s environmental variables (empirical analysis) and developed a theoretical background (macro-theoretical derivation) through formula-based derivation, the results of which are universally valuable and provide policymakers with a newly integrated view of emission reduction and balanced development to address the challenges associated with climate change caused by energy.
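The EKC verification step is typically a regression of emissions per capita on income per capita and its square; an inverted U requires a negative quadratic coefficient, and the turning point sits at −b1/(2·b2). The sketch below is a minimal stdlib ordinary-least-squares fit on synthetic data, not the study's actual estimation or dataset.

```python
def solve3(A, b):
    """Solve a 3x3 linear system by Gauss-Jordan elimination with partial pivoting."""
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(3):
        piv = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(3):
            if r != col:
                f = M[r][col] / M[col][col]
                M[r] = [M[r][k] - f * M[col][k] for k in range(4)]
    return [M[i][3] / M[i][i] for i in range(3)]

def fit_ekc(gdp, emissions):
    """OLS fit of emissions = b0 + b1*gdp + b2*gdp^2 via the normal equations."""
    X = [[1.0, x, x * x] for x in gdp]
    XtX = [[sum(r[a] * r[c] for r in X) for c in range(3)] for a in range(3)]
    Xty = [sum(X[i][a] * emissions[i] for i in range(len(X))) for a in range(3)]
    return solve3(XtX, Xty)

# synthetic inverted-U data with a known turning point at gdp = 10
gdp = [float(x) for x in range(1, 21)]
emissions = [0.5 + 2.0 * x - 0.1 * x * x for x in gdp]
b0, b1, b2 = fit_ekc(gdp, emissions)
print(b2 < 0, -b1 / (2 * b2))  # negative quadratic term; turning point near 10
```

On noise-free data the fit recovers the generating coefficients, so the sign test on b2 and the turning-point formula behave exactly as the EKC hypothesis predicts.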
Francie Lange, Anna Peters, Dominik K. Kanbach and Sascha Kraus
Abstract
Purpose
This study aims to investigate different types of platform providers (PPs) to gain a deeper understanding of the characteristics and underlying logic of this group within collaborative consumption (CC). As CC occurs with three groups of actors (PP, peer service provider and customer) and is predominantly viewed from the customer perspective, this study offers insights from the under-researched PP perspective.
Design/methodology/approach
This research applies a multiple case study approach and descriptively and thematically analyzes 92 cases of CC PPs gathered from the Crunchbase database.
Findings
The authors derive four archetypes of CC PPs, namely, the hedonist, functionalist, environmentalist and connector, that differ in their offered values, dominating motives and activities across industries.
Research limitations/implications
The authors conceptualize CC by clearly describing the four archetypes and their characteristics. However, further research would benefit from including databases other than Crunchbase.
Practical implications
PPs need to understand their value offerings and customer preferences to develop convincing value propositions and offer engaging activities. PPs would benefit from a more active social media presence to build strong relations with customers and peer service providers to effectively communicate their values.
Originality/value
The paper is pioneering as it encompasses the perspective of CC PPs and operationalizes the concept of CC. The authors address the lack of research on CC by conducting an extensive case study.
Jingrui Ge, Kristoffer Vandrup Sigsgaard, Bjørn Sørskot Andersen, Niels Henrik Mortensen, Julie Krogh Agergaard and Kasper Barslund Hansen
Abstract
Purpose
This paper proposes a progressive, multi-level framework for diagnosing maintenance performance: rapid performance health checks of key performance indicators for different equipment groups, and end-to-end process diagnostics to further localize potential performance issues. A question-based performance evaluation approach is introduced to support the selection and derivation of case-specific indicators based on diagnostic aspects.
Design/methodology/approach
The case research method is used to develop the proposed framework. The generic parts of the framework are built on existing maintenance performance measurement theories through a literature review. In the case study, empirical maintenance data of 196 emergency shutdown valves (ESDVs) are collected over a two-year period to support the development and validation of the proposed approach.
Findings
To improve processes, companies need a separate performance measurement structure. This paper suggests a hierarchical model in four layers (objective, domain, aspect and performance measurement) to facilitate the selection and derivation of indicators, which could potentially reduce management complexity and help prioritize continuous performance improvement. Examples of new indicators are derived from a case study that includes 196 ESDVs at an offshore oil and gas production plant.
Originality/value
Methodological approaches to deriving various performance indicators have rarely been addressed in the maintenance field. The proposed diagnostic framework provides a structured way to identify and locate process performance issues by creating indicators that can bridge generic evaluation aspects and maintenance data. The framework is highly adaptive as data availability functions are used as inputs to generate indicators instead of passively filtering out non-applicable existing indicators.
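The four-layer selection logic (objective, domain, aspect, performance measurement) with data availability as an input can be sketched roughly as follows. All domain, aspect, field and indicator names here are hypothetical illustrations, not taken from the paper's case study.

```python
# Four-layer hierarchy: objective -> domains -> aspects -> candidate indicators.
# An indicator is generated only if all data fields it needs are available,
# instead of filtering out non-applicable indicators after the fact.
hierarchy = {
    "objective": "maintenance effectiveness",
    "domains": {
        "work execution": {
            "aspects": {
                "timeliness": [
                    # (indicator name, required data fields) -- hypothetical
                    ("mean time to repair", {"start_time", "end_time"}),
                    ("backlog age", {"created_time", "status"}),
                ],
            }
        }
    },
}

def derive_indicators(hierarchy, available_fields):
    """Walk the hierarchy and keep only indicators whose input data exists."""
    derived = []
    for domain, d in hierarchy["domains"].items():
        for aspect, indicators in d["aspects"].items():
            for name, needed in indicators:
                if needed <= available_fields:
                    derived.append((domain, aspect, name))
    return derived

# "backlog age" is skipped because no created_time field is available
print(derive_indicators(hierarchy, {"start_time", "end_time", "status"}))
```

Treating data availability as a generator input, as the abstract describes, means adding a new data field can immediately surface new indicators without editing a fixed indicator catalogue.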
Abstract
This paper considers locating congested fast charging stations (FCSs) and deploying chargers in a stochastic environment, whereas related studies have predominantly focused on problems in deterministic environments. Reducing the inconvenience caused by congestion at FCSs is an important challenge for FCS service providers, and this motivates the study of an FCS network design problem with a congestion restriction in a stochastic environment. We propose a maximal coverage problem subject to budget constraints and a congestion restriction in order to maximize demand coverage. With the derivation of the congestion restriction in the considered stochastic environment, the problem is formulated as an integer programming model. A real-life case study is conducted, and managerial implications are drawn from its results.
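The deterministic core of such a model, maximal demand coverage under a budget constraint, can be sketched with a brute-force search over site subsets. The congestion restriction and the stochastic environment from the paper are omitted here, and the sites, costs and demand weights are invented for illustration; a real instance would use an integer programming solver rather than enumeration.

```python
from itertools import combinations

sites = {  # site -> (opening cost, set of demand nodes covered); illustrative
    "A": (3, {1, 2, 3}),
    "B": (2, {3, 4}),
    "C": (4, {4, 5, 6}),
    "D": (1, {1, 6}),
}
demand = {1: 10, 2: 5, 3: 8, 4: 7, 5: 6, 6: 4}  # demand weight per node
budget = 5

def best_selection(sites, demand, budget):
    """Exhaustively pick the budget-feasible site subset maximizing covered demand."""
    best, best_cov = set(), -1
    names = list(sites)
    for r in range(len(names) + 1):
        for combo in combinations(names, r):
            if sum(sites[s][0] for s in combo) > budget:
                continue  # over budget, infeasible
            covered = set().union(*[sites[s][1] for s in combo])
            cov = sum(demand[n] for n in covered)
            if cov > best_cov:
                best_cov, best = cov, set(combo)
    return best, best_cov

print(best_selection(sites, demand, budget))  # sites A and B cover demand 30
```

Enumeration is exponential in the number of candidate sites, which is exactly why the paper formulates the problem as an integer program instead.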
Tao Xu, Hanning Shi, Yongjiang Shi and Jianxin You
Abstract
Purpose
The purpose of this paper is to explore the concept of data assets and how companies can assetize their data. Using the literature review methodology, the paper first summarizes the conceptual controversies over data assets in the existing literature. Subsequently, the paper defines the concept of data assets. Finally, keywords from the existing research literature are presented visually and a foundational framework for achieving data assetization is proposed.
Design/methodology/approach
This paper uses a systematic literature review approach to discuss the conceptual evolution and strategic imperatives of data assets. To establish a robust research methodology, this paper takes into account two main aspects. First, it conducts a comprehensive review of the existing literature on digital technology and data assets, which enables the derivation of an evolutionary path of data assets and the development of a clear and concise definition of the concept. Second, the paper uses CiteSpace, a widely used software tool for literature reviews, to examine the research framework of enterprise data assetization.
Findings
The paper offers pivotal insights into the realm of data assets. It highlights the changing perceptions of data assets with digital progression and addresses debates on data asset categorization, value attributes and ownership. The study introduces a definitive concept of data assets as electronically recorded data resources with real or potential value under legal parameters. Moreover, it delineates strategic imperatives for harnessing data assets, presenting a practical framework that charts the stages of “resource readiness, capacity building, and data application”, guiding businesses in optimizing their data throughout its lifecycle.
Originality/value
This paper comprehensively explores the issue of data assets, clarifying controversial concepts and categorizations and bridging gaps in the existing literature. The paper introduces a clear conceptualization of data assets, bridging the gap between academia and practice. In addition, the study proposes a strategic framework for data assetization. This study not only helps to promote a unified understanding among academics and professionals but also helps businesses to understand the process of data assetization.
Sergei O. Kuznetsov, Alexey Masyutin and Aleksandr Ageev
Abstract
Purpose
The purpose of this study is to show that closure-based classification and regression models provide both high accuracy and interpretability.
Design/methodology/approach
Pattern structures allow one to approach the knowledge extraction problem in the case of partially ordered descriptions. They provide a way to apply techniques based on closed descriptions to non-binary data. To ensure scalability of the approach, the authors introduce a lazy (query-based) classification algorithm.
Findings
The experiments support the hypothesis that closure-based classification and regression allow one to both achieve higher accuracy in scoring models as compared to results obtained with classical banking models and retain interpretability of model results, whereas black-box methods grant better accuracy for the cost of losing interpretability.
Originality/value
This is original research showing the advantage of closure-based classification and regression models in the banking sphere.
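On set-valued descriptions, where the meet of two patterns is their intersection, lazy closure-based classification can be sketched as follows: the query votes for a class when its intersection with a training example of that class is not subsumed by any example of the opposite class. This is a toy sketch of the general idea with invented data, not the authors' implementation; pattern structures generalize the same scheme to interval and other partially ordered descriptions.

```python
# Toy training data: each object is described by a set of binary attributes.
positives = [{"a", "b", "c"}, {"a", "b", "d"}]
negatives = [{"c", "d"}, {"b", "d", "e"}]

def lazy_classify(query, positives, negatives):
    """Lazy (query-based) classification via closed set intersections.

    A nonempty intersection of the query with a training example counts as a
    vote for that example's class, unless the intersection is contained in
    some example of the opposite class (i.e. the candidate hypothesis is
    falsified by a counterexample).
    """
    pos_votes = sum(
        1 for p in positives
        if (query & p) and not any((query & p) <= n for n in negatives))
    neg_votes = sum(
        1 for n in negatives
        if (query & n) and not any((query & n) <= p for p in positives))
    if pos_votes > neg_votes:
        return "+"
    if neg_votes > pos_votes:
        return "-"
    return "?"  # abstain on a tie, keeping the decision interpretable

print(lazy_classify({"a", "b", "e"}, positives, negatives))
```

The interpretability claim in the Findings corresponds to the fact that each vote is a concrete, inspectable attribute set shared with a training example, rather than an opaque score.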
Pedro Brinca, Nikolay Iskrev and Francesca Loria
Abstract
Since its introduction by Chari, Kehoe, and McGrattan (2007), Business Cycle Accounting (BCA) exercises have become widespread. Much attention has been devoted to the results of such exercises and to methodological departures from the baseline methodology. Little attention has been paid to identification issues within these classes of models. In this chapter, the authors investigate whether such issues are of concern in the original methodology and in an extension proposed by Šustek (2011) called Monetary Business Cycle Accounting. The authors resort to two types of identification tests in population. One concerns strict identification as theorized by Komunjer and Ng (2011), while the other deals with both strict and weak identification as in Iskrev (2010). Most importantly, the authors explore the extent to which weak identification problems affect the main economic takeaways and find that the identification deficiencies are not relevant for the standard BCA model. Finally, the authors compute some statistics of interest to practitioners of the BCA methodology.
Doris Entner, Thorsten Prante, Thomas Vosgien, Alexandru-Ciprian Zăvoianu, Susanne Saminger-Platz, Martin Schwarz and Klara Fink
Abstract
Purpose
The paper aims to raise awareness in industry of design automation tools, especially in early design phases. Along a case study, it demonstrates the seamless integration of a prototypically implemented optimization tool, which supports design-space exploration in the early design phase, with a product configurator in operational use, which supports the drafting and detailing of the solution predominantly in the later design phase.
Design/methodology/approach
Based on the comparison of modeled as-is and to-be processes of ascent assembly designs with and without design automation tools, an automation roadmap is developed. Using qualitative and quantitative assessments, the potentials and benefits, as well as acceptance and usage aspects, are evaluated.
Findings
Engineers tend to consider design automation for routine tasks. Yet, prototypical implementations support the communication and identification of potential in the early stages of the design process to explore solution spaces. In this context, choosing from and interactively working with automatically generated alternative solutions emerged as a particular focus. Translators, which enable automatic downstream propagation of changes and thus ensure consistency with respect to change management, were also evaluated to be of major value.
Research limitations/implications
A systematic validation of design automation in design practice is presented. For generalization, more case studies are needed. Further, the derivation of appropriate metrics needs to be investigated to normalize validation of design automation in future research.
Practical implications
Integration of design automation in early design phases has great potential for reducing costs in the market launch. Prototypical implementations are an important ingredient for potential evaluation of actual usage and acceptance before implementing a live system.
Originality/value
There is a lack of systematic validation of design automation tools supporting early design phases. In this context, this work contributes a systematically validated industrial case study. Technology transfer that supports early design phases is important because of its high leverage potential.