Search results

1 – 10 of 398
Book part
Publication date: 24 April 2023

Kohtaro Hitomi, Keiji Nagai, Yoshihiko Nishiyama and Junfan Tao

Abstract

In this study, the authors investigate methods of sequential analysis to test prospectively for the existence of a unit root against stationary or explosive states in a p-th order autoregressive (AR) process monitored over time. Our sequential sampling schemes use stopping times based on the observed Fisher information of a local-to-unity parameter. In contrast to the Dickey–Fuller (DF) test statistic, the sequential test statistic has asymptotic normality. The authors derive the joint limit of the test statistic and the stopping time, which can be characterized using a 3/2-dimensional Bessel process driven by a time-changed Brownian motion. The authors obtain their limiting joint Laplace transform and density function under the null and local alternatives. In addition, simulations are conducted to show that the theoretical results are valid.
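As a schematic illustration of this kind of scheme (an AR(1) sketch under standard assumptions, not necessarily the authors' exact construction), a stopping time based on observed Fisher information and the corresponding normalized statistic can be written as

$$ \tau_c = \inf\Bigl\{ n \ge 1 : \sum_{t=1}^{n} y_{t-1}^{2} \ge c\,\hat\sigma^{2} \Bigr\}, \qquad Z_{\tau_c} = \frac{\sum_{t=1}^{\tau_c} y_{t-1}\,(y_t - y_{t-1})}{\hat\sigma \bigl(\sum_{t=1}^{\tau_c} y_{t-1}^{2}\bigr)^{1/2}}, $$

where $\sum_t y_{t-1}^{2}/\hat\sigma^{2}$ plays the role of the observed Fisher information for the autoregressive parameter; sampling until this information reaches a fixed level is what yields an asymptotically standard normal statistic under the unit-root null, in contrast to the non-standard Dickey–Fuller limit obtained with a fixed sample size.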

Book part
Publication date: 27 March 2006

Dawn G. Williams, Tawana L. Carr and Nicole S. Clifton

Abstract

While numerous approaches have been suggested for the improvement of elementary and secondary education in American urban public schools, one common component of such plans is the more effective utilization of computers, networking, and other educational technologies. When weighing these considerations, leaders must look at information regarding demographic analysis, assessments, demand factors, and access. The Digital Divide shines a light on the role computers play in widening social gaps throughout our society, particularly between White students and students of color. By providing equitable and meaningful access to technology, we can create a stronger assurance that all children step into the 21st century prepared. Home access to computer technology is a continuous area of inequality in American society. If society assumes that academic achievement is facilitated by access to computers both at school and in the home, the gap in access to computer technology is a cause for concern.

Details

Technology and Education: Issues in Administration, Policy, and Applications in K12 Schools
Type: Book
ISBN: 978-0-76231-280-1

Article
Publication date: 1 June 2005

Hélyette Geman and Marie‐Pascale Leonardi

Abstract

The goal of the paper is to analyse the various issues attached to the valuation of weather derivatives. We focus our study on temperature‐related contracts since they are the most widely traded at this point and try to address the following questions: (i) should the quantity underlying the swaps or options contracts be defined as the temperature, degree‐days or cumulative degree‐days? This discussion is conducted both in terms of the robustness of the statistical modelling of the state variable and the mathematical valuation of the option (European versus Asian). (ii) What pricing approaches can tackle the market incompleteness generated by a non‐tradable underlying when, furthermore, the market price of risk is hard to identify in other traded instruments and unlikely to be zero? We illustrate our study on a database of temperatures registered at Paris Le Bourget and compare the call and put prices obtained using the different methods most widely used in weather markets.
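For reference, the degree-day quantities discussed above are conventionally defined as follows (an 18°C base is typical for European temperature contracts; the exact contract specification studied in the paper may differ):

$$ T_d = \tfrac{1}{2}\bigl(T_d^{\max} + T_d^{\min}\bigr), \qquad \mathrm{HDD}_d = \max\bigl(18^{\circ}\mathrm{C} - T_d,\, 0\bigr), \qquad \mathrm{CDD}_d = \max\bigl(T_d - 18^{\circ}\mathrm{C},\, 0\bigr), $$

and a call on cumulative degree-days over a period $[d_1, d_2]$ pays $\alpha \max\bigl(\sum_{d=d_1}^{d_2} \mathrm{HDD}_d - K,\, 0\bigr)$ for tick size $\alpha$ and strike $K$; the dependence on a sum of daily values is what gives the contract its Asian-option character.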

Details

Managerial Finance, vol. 31 no. 6
Type: Research Article
ISSN: 0307-4358

Article
Publication date: 8 October 2018

Panayiotis Tzeremes

Abstract

Purpose

The purpose of this paper is to investigate the relationship between energy consumption and economic growth in the USA, both at the country level and at a sectoral level, by using monthly data from January 1991 to May 2016.

Design/methodology/approach

While assessing the relationship at a country level, the authors also examine five sectors by using quantile causality.
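A minimal sketch of the idea of checking causality at a given quantile is given below; it uses a plain quantile regression with hypothetical column names and a single lag, and is a simplified stand-in for the formal quantile-causality test applied in the paper.

```python
# Minimal sketch: does lagged energy consumption help predict growth at a given quantile?
# Simplified illustration only, not the paper's formal quantile-causality test.
import pandas as pd
import statsmodels.formula.api as smf

def quantile_causality_check(df: pd.DataFrame, q: float):
    """df must contain 'growth' and 'energy' columns (hypothetical names)."""
    data = pd.DataFrame({
        "growth": df["growth"],
        "growth_lag1": df["growth"].shift(1),
        "energy_lag1": df["energy"].shift(1),
    }).dropna()
    # Quantile regression of growth on its own lag and the lagged energy series
    model = smf.quantreg("growth ~ growth_lag1 + energy_lag1", data).fit(q=q)
    # A significant coefficient on energy_lag1 at quantile q suggests predictive
    # content ("causality") in that part of the conditional distribution.
    return model.params["energy_lag1"], model.pvalues["energy_lag1"]

# Example usage (monthly_data is a hypothetical DataFrame):
# coef_lo, p_lo = quantile_causality_check(monthly_data, q=0.1)  # lower tail
# coef_hi, p_hi = quantile_causality_check(monthly_data, q=0.9)  # upper tail
```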

Findings

The findings indicate the existence of causality at the sectoral level in the tails. More specifically, the industrial and electric sectors cause growth at the lower and higher levels. The residential, commercial and transportation sectors do not cause growth at any of the levels. Total consumption causes growth at the middle and low levels but not at the high level. Finally, the empirical evidence signifies an asymmetric relationship between the covariates.

Practical implications

The results imply that when consumption faces fluctuating conditions, it is likely to be affected by growth. In such a case, energy policies should be geared toward reducing or increasing energy intensity, improving energy efficiency, encouraging the use of alternative sources and investing in the development of technology.

Originality/value

The authors use, for the first time, quantile causality for the case of energy consumption and economic growth. The quantile test is useful for a thorough comprehension of the causal relationship in this area. Compared to OLS, which is used for the majority of causality tests, the quantile approach investigates causality at the sectoral level in the tails.

Details

Journal of Economic Studies, vol. 45 no. 5
Type: Research Article
ISSN: 0144-3585

Article
Publication date: 27 September 2019

Giuseppe Orlando, Rosa Maria Mininni and Michele Bufalo

Abstract

Purpose

The purpose of this study is to suggest a new framework that we call the CIR#, which allows forecasting interest rates from observed financial market data even when rates are negative. In doing so, the objective is to maintain the market volatility structure as well as the analytical tractability of the original CIR model.

Design/methodology/approach

The novelty of the proposed methodology consists in using the CIR model to forecast the evolution of interest rates by an appropriate partitioning of the data sample and calibration. The latter is performed by replacing the standard Brownian motion process in the random term of the model with normally distributed standardized residuals of the “optimal” autoregressive integrated moving average (ARIMA) model.
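A rough sketch of this calibration idea, under the assumptions that a single sub-group is treated at a time and that an Euler-type discretization of the CIR dynamics is acceptable, is shown below; parameter values, the ARIMA order and the helper name are illustrative, not the authors' implementation.

```python
# Illustrative sketch of the CIR# idea described above: a discretized CIR recursion
# whose Gaussian shocks are replaced by standardized residuals of an ARIMA model
# fitted to a (shifted-to-positive) sub-sample of observed rates.
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

def cir_sharp_path(rates, kappa, theta, sigma, dt, order=(1, 0, 1), shift=0.02):
    r = np.asarray(rates, dtype=float) + shift      # translate negative/near-zero rates
    res = ARIMA(r, order=order).fit()               # "optimal" order chosen per sub-group
    eps = res.resid / res.resid.std()               # standardized ARIMA residuals
    path = np.empty(len(r))
    path[0] = r[0]
    for t in range(1, len(r)):
        drift = kappa * (theta - path[t - 1]) * dt
        diffusion = sigma * np.sqrt(max(path[t - 1], 0.0) * dt) * eps[t]
        path[t] = path[t - 1] + drift + diffusion
    return path - shift                             # undo the translation
```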

Findings

The suggested model is quite powerful for the following reasons. First, the historical market data sample is partitioned into sub-groups to capture all the statistically significant changes of variance in the interest rates. An appropriate translation of market rates to positive values was included in the procedure to overcome the issue of negative/near-to-zero values. Second, this study has introduced a new way of calibrating the CIR model parameters to each sub-group of the partitioned historical data. The standard Brownian motion process in the random part of the model is replaced with normally distributed standardized residuals of the “optimal” ARIMA model suitably chosen for each sub-group. As a result, exact CIR fitted values for the observed market data are calculated and the computational cost of the numerical procedure is considerably reduced. Third, this work shows that the CIR model is efficient and able to follow very closely the structure of market interest rates (especially for short maturities that, notoriously, are very difficult to handle) and to predict future interest rates better than the original CIR model. As a measure of goodness of fit, this study obtained high values of the R² statistic and small values of the root mean square error for each sub-group and the entire data sample.

Research limitations/implications

A limitation is related to the specific dataset, as the study examines a period of about five years around the 2008 financial crisis using monthly data. Future research will show the predictive power of the model by extending the dataset in terms of frequency and size.

Practical implications

Improved ability to model/forecast interest rates.

Originality/value

The original value consists in turning the CIR from modeling instantaneous spot rates to forecasting any rate of the yield curve.

Details

Studies in Economics and Finance, vol. 37 no. 2
Type: Research Article
ISSN: 1086-7376

Article
Publication date: 1 July 2014

M. Yasin and Pervez Akhtar

Abstract

Purpose

The purpose of this paper is to design and analyze the performance of a live model of the Bessel beamformer for a thorough comprehension of beamforming in an adaptive environment, compared with a live model of least mean square (LMS) in terms of gain and mean square error (MSE). It presents the principal elements of a communication system. The performance of the designed live model is tested for its efficiency in terms of signal recovery and directive gain by minimizing MSE, using the “wavrecord” function to bring live audio data in WAV format into the MATLAB workspace. These adaptive techniques are illustrated by appropriate examples.

Design/methodology/approach

The proposed algorithm framework relies on MATLAB software, with the goal of obtaining high efficiency in terms of signal recovery and directive gain by minimizing MSE, using the “wavrecord” function to bring live audio data in WAV format. It is assumed that this audio signal is only the message or the baseband signal received by the computer. Here the authors consider the computer (laptop) as a base station containing the adaptive signal processing algorithm and the source (mobile phone) as the desired user, so the experimental setup is designed for an uplink application (user to base station) to differentiate between the desired signal, multipath and interfering signals, as well as to calculate their directions of arrival.
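Since the abstract does not spell out the Bessel-beamformer weight update itself, the sketch below shows only the standard complex LMS beamformer used as the comparison baseline; the array dimensions, step size and signal names are assumptions.

```python
# Baseline complex LMS adaptive beamformer (the comparison method in the paper).
# Array geometry, step size and signal names are illustrative only.
import numpy as np

def lms_beamformer(X, d, mu=0.01):
    """X: (num_snapshots, num_elements) complex array snapshots,
       d: (num_snapshots,) desired/reference signal."""
    n_snap, n_elem = X.shape
    w = np.zeros(n_elem, dtype=complex)
    mse = np.empty(n_snap)
    for k in range(n_snap):
        y = np.vdot(w, X[k])              # beamformer output w^H x(k)
        e = d[k] - y                      # error against the reference signal
        w = w + mu * np.conj(e) * X[k]    # complex LMS weight update
        mse[k] = abs(e) ** 2
    return w, mse
```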

Findings

The presented adaptive live model is reliable and robust, and leads to a substantial reduction in MSE and better signal recovery in comparison with the LMS technique. The paper contains experimental data. The obtained results are presented clearly, and the conclusions follow directly from the presented experimental data. The paper shows that the presented method leads to superior results in comparison with the popular LMS method and can be used as a better alternative in many practical applications.

Research limitations/implications

The adaptive processes described in the paper are still limited to simulation. Because a real system was not available for testing, the MATLAB platform was chosen for simulation. Researchers are therefore encouraged to test the proposed algorithms on a real system if possible.

Practical implications

The paper contains experimental data. The paper's impact on society is acceptable. These implications are consistent with the findings and the conclusions of the paper. However, there is a need to take this work to the next level by implementing the proposed algorithms in a real-time environment using FPGA technology.

Social implications

This research will improve the signal quality of wireless cellular systems by increasing capacity and will reduce the total cost of the system, so that the cost to subscribers is decreased.

Originality/value

The live model presented in this paper is shown to provide better results. It is original work and can provide a scientific contribution to the signal processing community.

Details

COMPEL: The International Journal for Computation and Mathematics in Electrical and Electronic Engineering, vol. 33 no. 4
Type: Research Article
ISSN: 0332-1649

Article
Publication date: 17 March 2016

Muhammad Yasin Khattak and Pervez Akhtar

Abstract

Purpose

The aim of this investigation is to make the Bessel beamformer fully automatic by combining it with Root MUSIC (MUltiple SIgnal Classification), based on array processing for beamforming and a Direction of Arrival (DOA) algorithm for source positioning. The concept is analyzed for the modified Bessel beamformer with Root MUSIC to form a non-blind array processing technique, which is then used to focus a beam towards a desired user and place nulls towards interferers for capacity improvement.

Design/methodology/approach

The modified Bessel beamformer with Root MUSIC is implemented in MATLAB software, with the goal of obtaining high efficiency in terms of signal recovery and directive gain by minimizing MSE and placing nulls towards interferers.
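A sketch of the Root MUSIC direction-of-arrival step for a uniform linear array is given below; the half-wavelength element spacing and the known number of sources are assumptions, and the beamforming stage itself is omitted.

```python
# Sketch of the Root-MUSIC DOA step for a uniform linear array (ULA),
# which supplies the steering directions used by the beamformer.
# Element spacing d = lambda/2 and the number of sources are assumptions.
import numpy as np

def root_music_doa(X, n_sources, d_over_lambda=0.5):
    """X: (num_elements, num_snapshots) complex snapshots. Returns DOAs in degrees."""
    n_elem = X.shape[0]
    R = X @ X.conj().T / X.shape[1]                 # sample covariance matrix
    eigval, eigvec = np.linalg.eigh(R)
    En = eigvec[:, : n_elem - n_sources]            # noise subspace (smallest eigenvalues)
    C = En @ En.conj().T
    # Sum the diagonals of C to build the Root-MUSIC polynomial coefficients
    coeffs = np.array([np.trace(C, offset=k) for k in range(n_elem - 1, -n_elem, -1)])
    roots = np.roots(coeffs)
    roots = roots[np.abs(roots) < 1.0]              # keep roots inside the unit circle
    roots = roots[np.argsort(np.abs(np.abs(roots) - 1.0))][:n_sources]  # closest to circle
    angles = np.angle(roots)
    return np.degrees(np.arcsin(angles / (2.0 * np.pi * d_over_lambda)))
```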

Findings

The findings of the analysis, obtained in an adaptive environment, are provided in the tables. The paper shows that the presented method leads to superior results and can be used as a better alternative in many practical applications to avoid operator involvement.

Research limitations/implications

The adaptive processes described in the paper are still limited to simulation due to the non-availability of a real system for testing.

Practical implications

The paper can be extended to the next level by implementing the proposed algorithms in a real-time environment using FPGA technology.

Originality/value

This is original work that can offer a scientific contribution to the signal processing community and may provide better results as an automatic beamformer.

Details

COMPEL: The International Journal for Computation and Mathematics in Electrical and Electronic Engineering, vol. 35 no. 3
Type: Research Article
ISSN: 0332-1649

Article
Publication date: 4 September 2017

Stephan Russenschuck

Abstract

Purpose

The purpose of this paper is to establish the mathematical foundations of magnetic measurement methods based on translating-coil and rotating-coil magnetometers for accelerator magnets and solenoids. These field transducers allow a longitudinal scanning of the field distribution, but require a sophisticated post-processing step to extract the coefficients of the Fourier–Bessel series (known as pseudo-multipoles or generalized gradients) as well as a novel design of the rotating coil magnetometers.

Design/methodology/approach

Calculating the transversal field harmonics as a function of the longitudinal position in the magnet, or measuring these harmonics with a very short, rotating induction-coil scanner, allows the extraction of the coefficients of a Fourier–Bessel series, which can then be used in the thin lens approximation of the end regions of accelerator magnets.
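Schematically (and only as background to the de-convolution described here, not the paper's full derivation), the source-free field in the bore can be expanded in cylindrical harmonics whose longitudinal Fourier components involve modified Bessel functions:

$$ \psi(r,\varphi,z) = \sum_{m} \int_{-\infty}^{\infty} \tilde{A}_m(k)\, I_m(kr)\, e^{im\varphi}\, e^{ikz}\, \mathrm{d}k , $$

so a harmonic measured at a reference radius $r_0$ has a longitudinal Fourier transform proportional to $\tilde{A}_m(k)\, I_m(k r_0)$, and recovering the on-axis pseudo-multipole amounts to dividing by the Bessel kernel in the $k$ domain, i.e. a de-convolution along $z$.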

Findings

The extraction of the leading term in the Fourier–Bessel series requires the solution of a differential equation by means of a Fourier transform. This yields a natural way to de-convolute the measured distribution of the multipole content. The author has shown that the measurement technique requires iso-parametric coils to avoid interception of the longitudinal field component. The compensation of the main signal cannot be done with the classical arrangement of search coils at different radii, because no easy scaling law exists. A new design of an iso-perimetric induction coil has been found.

Research limitations/implications

In the literature, it is stated that the pseudo-multipoles can be extracted from field computations or measurements. While this is true for computations, the author shows that the measurement of the field harmonics must be done with iso-parametric coils because otherwise the leading term in the Fourier–Bessel series cannot be extracted.

Practical implications

The author has now established the theory behind a number of field transducers, such as the moving fluxmeter, the rotational coil scanner and the solenoidal field transducer.

Originality/value

This paper brought together the known theory of the orthogonal expansion method with the methods and tools for magnetic field measurements to establish a field description in accelerator magnets.

Details

COMPEL: The International Journal for Computation and Mathematics in Electrical and Electronic Engineering, vol. 36 no. 5
Type: Research Article
ISSN: 0332-1649

Article
Publication date: 6 January 2012

Rafid Al‐Khoury

Abstract

Purpose

The purpose of this paper is to introduce a spectral model capable of simulating fully transient conductive‐convective heat transfer processes in an axially‐symmetric shallow geothermal system consisting of a borehole heat exchanger embedded in a soil mass.

Design/methodology/approach

The proposed model combines the exactness of the analytical methods with important extent of generality in describing the geometry and boundary conditions of the numerical methods. It calculates the temperature distribution in all involved borehole heat exchanger components and the surrounding soil mass using the discrete Fourier transform, for the time domain, and the Fourier‐Bessel series, for the spatial domain.
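As a small illustration of the spatial basis involved (not the model's full conductive‐convective formulation, which couples the pipes, grout and soil), the Fourier‐Bessel expansion of a radial profile can be computed numerically as follows; the boundary condition and the example profile are illustrative only.

```python
# Minimal numerical sketch of a Fourier-Bessel expansion of a radial profile f(r)
# on 0 <= r <= R with f(R) = 0, the kind of spatial basis the model uses.
import numpy as np
from scipy.special import j0, j1, jn_zeros
from scipy.integrate import quad

def fourier_bessel_coeffs(f, R, n_terms):
    """Return (alphas, c) such that f(r) ~ sum_k c[k] * J0(alphas[k] * r / R)."""
    alphas = jn_zeros(0, n_terms)                     # positive zeros of J0
    coeffs = []
    for a in alphas:
        num, _ = quad(lambda r: f(r) * j0(a * r / R) * r, 0.0, R)
        coeffs.append(2.0 * num / (R**2 * j1(a)**2))  # orthogonality weight r dr
    return alphas, np.array(coeffs)

def reconstruct(r, alphas, coeffs, R):
    return sum(c * j0(a * r / R) for a, c in zip(alphas, coeffs))

# Example: expand a parabolic temperature excess that vanishes at r = R = 1
# alphas, c = fourier_bessel_coeffs(lambda r: 1.0 - r**2, R=1.0, n_terms=20)
```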

Findings

The paper calculates the temperature distribution in all involved borehole heat exchanger components and the surrounding soil mass in a robust and computationally very efficient procedure. An analysis that might take a long time on a workstation using standard numerical procedures takes only 1 second on an Intel PC with the proposed model.

Practical implications

The model is capable of simulating fully transient heat transfer in a shallow geothermal system subjected to short- and long-term time-varying boundary conditions. The CPU time for calculating temperature distributions in all involved components (pipe-in, pipe-out, grout, and soil), using 2048 FFT samples for the time domain and 100 Fourier‐Bessel series samples for the spatial domain, was on the order of 1 second on an Intel PC. The accuracy and computational efficiency of the model make it, if elaborated, vital for engineering practice.

Originality/value

The proposed model is original and generic. The idea behind it is new and has not been utilized in this field of application. The model can be extended easily to include other types of borehole heat exchangers embedded in multi‐layer systems.

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. 22 no. 1
Type: Research Article
ISSN: 0961-5539

Article
Publication date: 5 May 2015

M. Yasin and Pervez Akhtar

Abstract

Purpose

The purpose of this paper is to analyze the convergence performance of the Bessel beamformer which, built on the design steps of the least mean square (LMS) algorithm, can be named the Bessel LMS (BLMS) algorithm. Its performance is compared in an adaptive environment with LMS in terms of two important performance parameters, namely convergence and mean square error. The proposed BLMS algorithm is implemented on a digital signal processor along with an antenna array to make it smart for wireless sensor networks.

Design/methodology/approach

The convergence analysis is theoretically developed and verified through MATLAB software.

Findings

The theoretical model is verified through simulation, and its results are shown in the provided table.

Originality/value

The theoretical model can be validated against the well-known result of Wiener filter theory through the principle of orthogonality.
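For context, the Wiener-filter relations being referred to are the standard textbook ones (not the paper's specific derivation): the orthogonality principle, the Wiener solution, and the usual step-size condition for mean-square convergence of LMS-type updates,

$$ \mathbb{E}\bigl[e(n)\,\mathbf{x}^{*}(n)\bigr] = \mathbf{0} \;\Longrightarrow\; \mathbf{w}_{\mathrm{opt}} = \mathbf{R}^{-1}\mathbf{p}, \qquad \mathbf{R} = \mathbb{E}\bigl[\mathbf{x}(n)\,\mathbf{x}^{H}(n)\bigr], \quad \mathbf{p} = \mathbb{E}\bigl[\mathbf{x}(n)\,d^{*}(n)\bigr], $$

with convergence of an LMS-type update typically requiring a step size $0 < \mu < 2/\lambda_{\max}(\mathbf{R})$.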

Details

COMPEL: The International Journal for Computation and Mathematics in Electrical and Electronic Engineering, vol. 34 no. 3
Type: Research Article
ISSN: 0332-1649
