Search results
1 – 10 of 703
Abstract
For the numerical treatment of resin-containing systems and the forecasting of their properties, models of branching are needed. In this review, existing theoretical models of systems containing branched structures (polymers, aggregates, etc.) are analyzed and compared. The criteria for selecting the optimal theoretical model comprise the range of chemical and physical problems that can be solved, the simplicity of the solution, the agreement between theoretical forecasts and experimental results, and the computing time required. It is concluded that, by these criteria, the statistical polymer method is the optimal choice among the existing models.
Rafał Przekop, Anna Jackiewicz-Zagórska, Michał Woźniak and Leon Gradoń
Abstract
Purpose
The purpose of this work was to study the influence of particle and fiber material properties on deposition efficiency. Collecting aerosol particles at particular steps of their production, and purifying the air in the workplace and the atmospheric environment, requires an efficient method of separating particulate matter from the carrier gas. Many papers published in recent years consider the deposition of particles on fibrous collectors, and most of them assume that collisions between particles and collector surfaces are 100% effective.
Design/methodology/approach
For the purpose of this work, the lattice Boltzmann model was used to describe the fluid dynamics, whereas the motion of the solid particles was modeled by Brownian dynamics. The interactions between particles and surfaces were modeled using an energy-balanced oscillatory model.
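As a rough illustration of the particle-transport half of such a scheme (not the authors' code, and without the lattice Boltzmann flow solver), the sketch below advances one aerosol particle with an Euler-Maruyama step of the Langevin equation, combining Stokes drag toward the local gas velocity with a random Brownian force. The particle size, gas properties, uniform velocity field and time step are all placeholder assumptions.

```python
import numpy as np

# Minimal Brownian-dynamics step for one aerosol particle (illustration only).
# A real filtration simulation would take the local gas velocity from the
# lattice Boltzmann solver; here it is a constant placeholder, and the particle
# size, gas properties and time step are likewise assumed values.
kB = 1.380649e-23          # Boltzmann constant, J/K
T_gas = 293.0              # gas temperature, K
mu_gas = 1.8e-5            # air dynamic viscosity, Pa*s
r_p = 0.5e-6               # particle radius, m
rho_p = 1000.0             # particle density, kg/m^3
m_p = rho_p * (4.0 / 3.0) * np.pi * r_p**3
drag_coeff = 6.0 * np.pi * mu_gas * r_p    # Stokes drag coefficient
dt = 1.0e-7                                # time step, s

def step(x, v, u_fluid, rng):
    """One Euler-Maruyama step of the Langevin equation for position and velocity."""
    drag = -drag_coeff * (v - u_fluid) / m_p
    # Random Brownian force, standard deviation from fluctuation-dissipation
    noise = rng.normal(0.0, np.sqrt(2.0 * drag_coeff * kB * T_gas / dt), size=3) / m_p
    v_new = v + dt * (drag + noise)
    x_new = x + dt * v_new
    return x_new, v_new

rng = np.random.default_rng(0)
x, v = np.zeros(3), np.zeros(3)
u_fluid = np.array([0.1, 0.0, 0.0])   # placeholder uniform gas velocity, m/s
for _ in range(1000):
    x, v = step(x, v, u_fluid, rng)
print(x)
```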
Findings
The results show a significant impact of material properties on filter performance.
Practical implications
Obtained results may provide useful information for the proper design of a filtration process and the production of filters with long service life.
Originality/value
In addition, the results presented in this work show that some assumptions of the classical filtration theory lead to an overestimation of deposition efficiency.
Abstract
Purpose
This paper seeks to characterize the behavior of the naira/dollar foreign exchange rate series over the period 1999 through 2006 to determine if the process generating the series has long memory, which is a special case of fractional Brownian motion. The existence of long memory contradicts the notion of market efficiency.
Design/methodology/approach
The paper employs the modified rescaled range (R/S) test proposed by Lo to test the null hypothesis that daily and weekly NGN/USD exchange rates from 1999 through 2006 follow a short-memory process. The second test employed is the Geweke–Porter-Hudak (GPH) test, as refined by Hurvich et al.
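Since Lo's statistic is compact, a minimal sketch of how it might be computed is given below, assuming the standard definition with Bartlett-weighted autocovariances; the simulated return series, truncation lag q and the quoted 5% interval are illustrative, not the paper's data or settings.

```python
import numpy as np

def modified_rs_statistic(x, q):
    """Lo's modified rescaled-range statistic V_n(q), illustrative sketch.

    x: 1-D array of (log) returns; q: truncation lag for the Bartlett-weighted
    autocovariance correction. Under the short-memory null, values of V_n(q)
    roughly inside [0.809, 1.862] are consistent with the null at the 5% level.
    """
    x = np.asarray(x, dtype=float)
    n = x.size
    dev = x - x.mean()
    partial = np.cumsum(dev)
    r_range = partial.max() - partial.min()            # range of partial sums
    # Heteroskedasticity- and autocorrelation-consistent scale sigma_n(q)
    s2 = np.dot(dev, dev) / n
    for j in range(1, q + 1):
        w = 1.0 - j / (q + 1.0)                        # Bartlett weight
        s2 += 2.0 * w * np.dot(dev[j:], dev[:-j]) / n
    return r_range / (np.sqrt(s2) * np.sqrt(n))

# Toy usage on simulated i.i.d. returns, where no long memory is expected
rng = np.random.default_rng(1)
returns = rng.normal(0.0, 0.01, size=2000)
print(modified_rs_statistic(returns, q=10))
```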
Findings
The results show that long memory is present in daily and weekly foreign exchange level series of the Nigerian naira for the period sampled. This evidence implies that the Nigerian foreign exchange market may not be efficient. Thus, it is possible for investors to realize abnormal profit by taking an investment position based on predicted exchange rates. The results reported in this paper are also indicative of a deviation from long‐run PPP.
Originality/value
This paper is the first to empirically apply the modified R/S and GPH tests to explore the existence of long‐memory process in a country study of foreign exchange series using data from Nigeria.
Shaheen Borna and Dheeraj Sharma
Abstract
Purpose
The purpose of this paper is to investigate the recent global economic downturn. Particularly, the study explores the utilization of the concept of Brownian motion in financial risk management in organizations in the USA.
Design/methodology/approach
The three assumptions that underlie the concept of Brownian motion, namely independence, stationarity, and normal distribution, are examined.
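The abstract does not spell out how such an examination might look in practice; the sketch below, run on simulated data, shows one informal way to probe the three assumptions for a return series using only summary statistics (lag-1 autocorrelation for independence, a split-sample variance ratio for stationarity, and skewness and excess kurtosis for normality). All numbers are assumptions for illustration, not results from the paper.

```python
import numpy as np

def diagnose_returns(prices):
    """Quick, informal checks of the three Brownian-motion assumptions
    on a price series (illustrative sketch, not a formal test battery)."""
    r = np.diff(np.log(np.asarray(prices, dtype=float)))
    r0 = r - r.mean()
    # 1. Independence: lag-1 autocorrelation of returns (should be near 0)
    lag1 = np.dot(r0[1:], r0[:-1]) / np.dot(r0, r0)
    # 2. Stationarity: compare the variance of the first and second half
    half = r.size // 2
    var_ratio = r[:half].var() / r[half:].var()
    # 3. Normality: sample skewness and excess kurtosis (both 0 under normality)
    skew = np.mean(r0**3) / r.std()**3
    ex_kurt = np.mean(r0**4) / r.std()**4 - 3.0
    return {"lag1_autocorr": lag1, "half_sample_var_ratio": var_ratio,
            "skewness": skew, "excess_kurtosis": ex_kurt}

# Toy usage on a simulated geometric Brownian motion path, which should
# pass all three checks by construction.
rng = np.random.default_rng(2)
log_path = np.cumsum(rng.normal(0.0005, 0.01, size=2500))
print(diagnose_returns(np.exp(log_path)))
```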
Findings
It is concluded that the widely used risk management strategies predicated on Brownian motion fail to provide a rational understanding of financial turmoil. Consequently, prescriptive insights are offered to aid the industry in developing an apposite mechanism for risk management.
Research limitations/implications
This paper offers new and improved risk management strategies that need to be undertaken to augment our understanding and prediction of financial scenarios.
Practical implications
The paper is useful for managers in all financial organizations, which employ computer models using Brownian motions. Specifically, this study contends that static models are unsuitable and dynamic models are more useful for risk assessment.
Originality/value
The paper reveals the weaknesses of the key assumptions of the risk management models used in financial organizations, namely, normal distribution of stock market price fluctuations, statistical stationarity, and efficient market assumption. Valuable guidelines are provided for financial managers who either do not have the inclination or time to sift through the voluminous literature related to the risk management models and computer software designed on these models.
Thameem Hayath Basha, Sivaraj Ramachandran and Bongsoo Jang
Abstract
Purpose
The need for precise synthesis of customized designs has resulted in the development of advanced coating processes for modern nanomaterials. Achieving accuracy in these processes requires a deep understanding of thermophysical behavior, rheology and complex chemical reactions. The manufacturing flow processes for these coatings are intricate and involve heat and mass transfer phenomena. Magnetic nanoparticles are being used to create intelligent coatings that can be externally manipulated, making them highly desirable. In this study, a Keller box calculation is used to investigate the flow of a coating nanofluid containing a viscoelastic polymer over a circular cylinder.
Design/methodology/approach
The rheology of the coating polymer nanofluid is described using the viscoelastic model, while the effects of nanoscale are accounted for by using Buongiorno’s two-component model. The nonlinear PDEs are transformed into dimensionless PDEs via a nonsimilar transformation. The dimensionless PDEs are then solved using the Keller box method.
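For context, the core ingredient of Buongiorno's two-component model is a nanoparticle transport equation in which Brownian diffusion (coefficient D_B) and thermophoresis (coefficient D_T) carry the particles, together with the dimensionless Brownian-motion and thermophoresis parameters Nb and Nt that appear among the governing parameters below. A commonly quoted form is sketched here; the exact nondimensionalization used in the paper may differ.

\[
\frac{\partial \phi}{\partial t} + \mathbf{V}\cdot\nabla\phi
  = \nabla\cdot\!\left(D_B\,\nabla\phi + \frac{D_T}{T}\,\nabla T\right),
\qquad
Nb = \frac{\tau D_B(\phi_w-\phi_\infty)}{\nu},
\quad
Nt = \frac{\tau D_T(T_w-T_\infty)}{\nu\,T_\infty},
\quad
\tau = \frac{(\rho c)_p}{(\rho c)_f},
\]
where \(\phi\) is the nanoparticle volume fraction, \(T\) the temperature, \(\nu\) the kinematic viscosity of the base fluid, and the subscripts \(w\) and \(\infty\) denote wall and free-stream values.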
Findings
The transport phenomena are analyzed through a comprehensive parametric study that investigates the effects of various emerging parameters, including thermal radiation, Biot number, Eckert number, Brownian motion, magnetic field and thermophoresis. The results of the numerical analysis, such as the physical variables and flow field, are presented graphically. The momentum boundary layer thickness of the viscoelastic polymer nanofluid decreases as the fluid parameter increases. An increase in the mixed convection parameter leads to a rise in the Nusselt number. The enhancement of the Brinkman number and Biot number results in an increase in the total entropy generation of the viscoelastic polymer nanofluid.
Practical implications
Intelligent materials rely heavily on the critical characteristic of viscoelasticity, which displays both viscous and elastic effects. Viscoelastic models provide a comprehensive framework for capturing a range of polymeric characteristics, such as stress relaxation, retardation, stretching and molecular reorientation. Consequently, they are a valuable tool in smart coating technologies, as well as in various applications like supercapacitor electrodes, solar collector receivers and power generation. This study has practical applications in the field of coating engineering components that use smart magnetic nanofluids. The results of this research can be used to analyze the dimensions of velocity profiles, heat and mass transfer, which are important factors in coating engineering. The study is a valuable contribution to the literature because it takes into account Joule heating, nonlinear convection and viscous dissipation effects, which have a significant impact on the thermofluid transport characteristics of the coating.
Originality/value
The momentum boundary layer thickness of the viscoelastic polymer nanofluid decreases as the fluid parameter increases. An increase in the mixed convection parameter leads to a rise in the Nusselt number. The enhancement of the Brinkman number and Biot number results in an increase in the total entropy generation of the viscoelastic polymer nanofluid. Increasing the strength of the magnetic field promotes an increase in the density of the streamlines. An increase in the mixed convection parameter results in a decrease in the isotherms and isoconcentration.
Michel Baroni, Fabrice Barthélémy and Mahdi Mokrane
Abstract
Purpose
The aim of this paper is to use rent and price dynamics in the future cash flows in order to improve real estate portfolio valuation.
Design/methodology/approach
Monte Carlo simulation methods are employed to measure the return distribution of complex cash-generating assets such as real estate. Important simulation inputs, such as the physical real estate price volatility estimator, are provided by results on real estate indices for Paris derived in an article by Baroni et al.
Findings
Based on a residential real estate portfolio example, simulated cash flows: provide more robust valuations than traditional DCF valuations; permit the user to estimate the portfolio's price distribution for any time horizon; and permit easy value-at-risk (VaR) computations.
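A minimal sketch of this kind of simulation is given below: prices follow a geometric Brownian motion whose trend and volatility would, in the paper, come from the Paris index estimates of Baroni et al.; here the trend, volatility, rent, discount rate and initial value are arbitrary placeholders, and VaR is taken as the shortfall of the 5% quantile below the mean valuation (one common convention).

```python
import numpy as np

# Illustrative sketch only: in the paper the price and rent dynamics are
# estimated from Paris real estate indices (Baroni et al.); here trend,
# volatility, rent, discount rate and initial value are placeholder numbers.
rng = np.random.default_rng(3)
n_paths, horizon = 100_000, 10            # simulated paths, horizon in years
mu, sigma = 0.03, 0.08                    # assumed annual price trend and volatility
rent, disc = 45_000.0, 0.06               # assumed annual rent and discount rate
p0 = 1_000_000.0                          # assumed initial portfolio value

# Terminal resale value under a geometric Brownian motion for prices
z = rng.normal(size=n_paths)
terminal_price = p0 * np.exp((mu - 0.5 * sigma**2) * horizon
                             + sigma * np.sqrt(horizon) * z)

# Present value of each simulated cash-flow stream: deterministic rents here,
# plus the discounted stochastic resale value at the horizon
years = np.arange(1, horizon + 1)
pv_rents = np.sum(rent / (1.0 + disc) ** years)
pv = pv_rents + terminal_price / (1.0 + disc) ** horizon

# Distribution-based outputs: mean valuation and a 95% value-at-risk figure,
# taken here as the shortfall of the 5% quantile below the mean
mean_value = pv.mean()
var_95 = mean_value - np.quantile(pv, 0.05)
print(f"mean PV = {mean_value:,.0f}   95% VaR = {var_95:,.0f}")
```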
Originality/value
The terminal value estimation is a core issue in real estate valuation. To estimate it, the proposed method is not based on an anticipated growth rate of cash flows but on the estimation of the trend and the volatility of real estate prices.
Hato Schmeiser and Joël Wagner
Abstract
Purpose
The purpose of this paper is to analyze what transaction costs are acceptable for customers in different investments. In this study, two life insurance contracts, a mutual fund and a risk-free investment, as alternative investment forms are considered. The first two products under scrutiny are a life insurance investment with a point-to-point capital guarantee and a participating contract with an annual interest rate guarantee and participation in the insurer’s surplus. The policyholder assesses the various investment opportunities using different utility measures. For selected types of risk profiles, the utility position and the investor’s preference for the various investments are assessed. Based on this analysis, the authors study which cost levels can make all of the products equally rewarding for the investor.
Design/methodology/approach
The analysis builds on risk-neutral valuation, calibration using empirical data, utility and performance measurement, underlying dynamics modeled as a geometric Brownian motion, and numerical examples obtained via Monte Carlo simulation.
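A stripped-down sketch of the comparison logic, not the authors' calibration, is shown below: terminal fund values are simulated under a geometric Brownian motion net of a proportional cost rate, the point-to-point guarantee floors the payoff at the premium, and both products are ranked by CRRA expected utility. The drift, volatility, horizon, risk aversion and cost grid are assumptions, and the participating contract with surplus sharing is omitted for brevity.

```python
import numpy as np

# Illustrative sketch (not the authors' calibration): compare a mutual fund
# with a point-to-point capital guarantee product under a geometric Brownian
# motion, using CRRA expected utility; all parameters below are assumptions.
rng = np.random.default_rng(4)
n_paths, T = 200_000, 10          # simulated paths, horizon in years
mu, sigma = 0.06, 0.20            # assumed drift and volatility of the fund
premium = 100.0                   # single up-front premium
gamma = 3.0                       # CRRA risk-aversion coefficient

def crra_utility(w, gamma):
    return np.log(w) if gamma == 1.0 else w ** (1.0 - gamma) / (1.0 - gamma)

def expected_utility(cost_rate):
    """Expected utility of terminal wealth after a proportional annual cost."""
    z = rng.normal(size=n_paths)
    gross = premium * np.exp((mu - cost_rate - 0.5 * sigma**2) * T
                             + sigma * np.sqrt(T) * z)
    fund = gross                                  # pure mutual fund payoff
    guaranteed = np.maximum(gross, premium)       # point-to-point guarantee
    return crra_utility(fund, gamma).mean(), crra_utility(guaranteed, gamma).mean()

for c in (0.0, 0.005, 0.01, 0.02):
    u_fund, u_guar = expected_utility(c)
    print(f"cost {c:.3f}: E[u(fund)] = {u_fund:.4f}, E[u(guarantee)] = {u_guar:.4f}")
```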
Findings
In the first step, the financial performance of the various saving opportunities under different assumptions of the investor’s utility measurement is studied. In the second step, the authors calculate the level of transaction costs that are allowed in the various products to make all of the investment opportunities equally rewarding from the investor’s point of view. A comparison of these results with transaction costs that are common in the market shows that insurance companies must be careful with respect to the level of transaction costs that they pass on to their customers to provide attractive payoff distributions.
Originality/value
To the best of the authors’ knowledge, their research question – i.e. which transaction costs for life insurance products would be acceptable from the customer’s point of view – has not been studied in the above described context so far.
Abstract
Standard financial risk management practices proved unable to provide an adequate understanding and a timely warning of the financial crisis. In particular, the theoretical foundations of risk management and the statistical calibration of risk models are called into question. Policy makers and practitioners respond by looking for new analytical approaches and tools to identify and address new sources of financial risk. Financial markets satisfy reasonable criteria of being considered complex adaptive systems, characterized by complex financial instruments and complex interactions among market actors. Policy makers and practitioners need to take both a micro and macro view of financial risk, identify proper transparency requirements on complex instruments, develop dynamic models of information generation that best approximate observed financial outcomes, and identify and address the causes and consequences of systemic risk. Complexity analysis can make a useful contribution. However, the methodological suitability of complexity theory for financial systems and by extension for risk management is still debatable. Alternative models drawn from the natural sciences and evolutionary theory are proposed.
N. Bindu, C. Prem Sankar and K. Satheesh Kumar
Abstract
Purpose
This paper aims to introduce a systematic computing and analytical procedure that is applied to the co-author network to identify the temporal evolution and growth of research collaborations in the area of e-governance. The empirical analysis of the temporal co-author network can trace the emerging authors and knowledge bursts over time.
Design/methodology/approach
The study applied social network theory to trace the author collaboration patterns in the domain of e-governance. Analysis of the co-author network using micro and macro parameters was done to trace the temporal evolution of the author collaborations.
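As an illustration of how such a temporal co-author network might be assembled and summarized (toy data only, not the Scopus corpus analyzed in the paper), the sketch below builds cumulative co-authorship graphs up to successive snapshot years and reports node counts, edge counts, the maximum degree and the size of the giant component.

```python
import itertools
import networkx as nx

# Illustrative sketch with toy data, not the Scopus corpus used in the paper:
# each record is (publication year, list of author names).
papers = [
    (2001, ["A", "B"]),
    (2005, ["A", "C", "D"]),
    (2009, ["B", "C"]),
    (2014, ["C", "D", "E"]),
]

def coauthor_graph(records, up_to_year):
    """Co-author network built from all papers published up to a given year."""
    g = nx.Graph()
    for year, authors in records:
        if year > up_to_year:
            continue
        for a, b in itertools.combinations(authors, 2):
            # accumulate the number of joint papers as an edge weight
            weight = g.get_edge_data(a, b, default={"weight": 0})["weight"]
            g.add_edge(a, b, weight=weight + 1)
    return g

# Temporal snapshots: micro (degree) and macro (giant component) views
for snapshot in (2005, 2010, 2015):
    g = coauthor_graph(papers, snapshot)
    giant = max(nx.connected_components(g), key=len) if g.number_of_nodes() else set()
    degrees = dict(g.degree())
    print(snapshot, "nodes:", g.number_of_nodes(), "edges:", g.number_of_edges(),
          "giant component size:", len(giant),
          "max degree:", max(degrees.values(), default=0))
```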
Findings
E-governance is a multi-disciplinary research domain spanning the streams of management, politics, information technology and electronics. Hence, research collaborations play a significant role in its advancement. The knowledge sharing between individual authors, institutions and groups through research collaborations, resulting in extensive sharing of data, equipment and research methods, has boosted research activities and development in e-governance. In this paper, the authors systematically analyse the current scenario of research collaborations in the area of e-governance using a co-author network to estimate its impact on the advancement of the field. The authors also analysed the temporal evolution of the co-author networks, which shows remarkable growth of research collaborations in the domain of e-governance since 2000.
Research limitations/implications
The co-author network analysis is only a proxy measure for the analysis of research collaborations. The names of the authors and the university affiliations used in the article are as retrieved from the research repository of Scopus. The degree, citations and other parameters related to authors have scope only within the co-author network used in the analysis. The criteria used in the study are limited to the degree of research collaborations and the number of co-authored publications in the giant component of the co-author network.
Practical implications
Institutions, authors and governments can trace and select suitable topics and choose research groups of co-authors over the world for future research collaborations in e-governance. The knowledge about the emerging and most discussed topics gives an overview of the global research trends of e-governance.
Social implications
The study identified the evolution of creative collaborations in e-governance in the global perspective. The methodology introduced here is helpful to detect the proficient and productive author collaborations and the spectrum of related e-governance research topics associated with them. As the author collaborations can be mapped to the institutional and country-level collaborations, the information is helpful for researchers, institutions and governments to establish the best collaborations in e-governance research based on the author proficiency, collaboration patterns and research topics as per the requirements.
Originality/value
The paper introduces a novel research methodology using temporal analysis of co-author network to identify the evolution of research patterns and the associated research topics.
Jens H. E. Christensen and Glenn D. Rudebusch
Abstract
Recent U.S. Treasury yields have been constrained to some extent by the zero lower bound (ZLB) on nominal interest rates. Therefore, we compare the performance of a standard affine Gaussian dynamic term structure model (DTSM), which ignores the ZLB, to a shadow-rate DTSM, which respects the ZLB. Near the ZLB, we find notable declines in the forecast accuracy of the standard model, while the shadow-rate model forecasts well. However, 10-year yield term premiums are broadly similar across the two models. Finally, in applying the shadow-rate model, we find no gain from estimating a slightly positive lower bound on U.S. yields.
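The mechanics of the shadow-rate idea (Black, 1995) can be conveyed with a toy simulation, sketched below; it is not the authors' estimated term structure model. A Gaussian mean-reverting shadow rate is simulated, and the observed short rate is its value censored at the lower bound, so the construction respects the ZLB while the unconstrained Gaussian rate can go negative. All dynamics parameters are placeholder assumptions.

```python
import numpy as np

# Toy illustration of the shadow-rate idea (Black, 1995), not the authors'
# estimated dynamic term structure model: the shadow rate follows a Gaussian
# mean-reverting process and the observed short rate is its value censored
# at the lower bound. All parameters below are placeholder assumptions.
rng = np.random.default_rng(5)
n_months = 240
kappa, theta, sigma = 0.02, 0.04, 0.004   # assumed monthly mean-reversion dynamics
lower_bound = 0.0

shadow = np.empty(n_months)
shadow[0] = 0.05
for t in range(1, n_months):
    shadow[t] = shadow[t - 1] + kappa * (theta - shadow[t - 1]) + sigma * rng.normal()

observed = np.maximum(shadow, lower_bound)   # ZLB-respecting short rate

# A standard affine Gaussian model treats the unconstrained rate itself as the
# short rate, so it can imply negative yields when the shadow rate is negative.
print("months at the bound:", int((shadow < lower_bound).sum()))
print(f"min shadow rate: {shadow.min():.4f}   min observed rate: {observed.min():.4f}")
```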