Search results

1 – 10 of over 19,000
Article
Publication date: 10 June 2021

Oluyemi Theophilus Adeosun and Isaac Idris Gbadamosi

Abstract

Purpose

The purpose of this paper is to investigate the contribution of non-oil sectors to economic growth (GDP per capita) in selected African countries using panel data analysis.

Design/methodology/approach

The paper uses secondary data for the period 1991–2019 on macroeconomic variables, including agriculture, industry, exports, services and GDP per capita, obtained from the World Development Indicators (WDI). Panel unit root tests (Levin, Lin and Chu; Im, Pesaran and Shin), the Johansen co-integration test, the Granger causality test and an error correction model were applied to the data.
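
As a rough illustration of this kind of workflow (not the authors' actual code), the sketch below runs a per-country ADF unit root check and a pairwise Granger causality test with statsmodels; the panel-specific Levin, Lin and Chu and Im, Pesaran and Shin tests, the Johansen co-integration test and the error correction model are not reproduced, and the file and column names are hypothetical.

```python
# Illustrative sketch only: per-country unit root checks and a pairwise Granger
# causality test. The paper's panel tests (LLC, IPS), Johansen co-integration
# and error correction model are not reproduced here. File and column names
# ("wdi_panel.csv", "country", "gdp_pc", "agriculture") are hypothetical.
import pandas as pd
from statsmodels.tsa.stattools import adfuller, grangercausalitytests

panel = pd.read_csv("wdi_panel.csv")  # hypothetical WDI extract, 1991-2019

for country, grp in panel.groupby("country"):
    stat, pvalue, *_ = adfuller(grp["gdp_pc"].dropna())
    print(f"{country}: ADF p-value for GDP per capita = {pvalue:.3f}")

    # Does agriculture Granger-cause GDP per capita? Column order is [caused, causing].
    grangercausalitytests(grp[["gdp_pc", "agriculture"]].dropna(), maxlag=2)
```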

Findings

The study reveals no causality from agriculture to economic growth, which implies that most of the African countries in the sample have neglected agriculture as a source of economic growth. Industry had no significant effect on these countries’ economic growth, although the tests indicate causality running from industry to economic growth; no causality runs from economic growth to industry, meaning industry is not contributing to economic growth. The study also shows no causality from exports and services to economic growth, but causality does run from economic growth to exports and services.

Originality/value

The paper examines the contribution of the non-oil sectors to economic growth in selected African countries.

Details

World Journal of Science, Technology and Sustainable Development, vol. 18 no. 3
Type: Research Article
ISSN: 2042-5945

Content available
Book part
Publication date: 27 May 2024

Angelo Corelli

Abstract

Details

Understanding Financial Risk Management, Third Edition
Type: Book
ISBN: 978-1-83753-253-7

Article
Publication date: 1 August 2016

Sajad Rezaei, Muslim Amin, Minoo Moghaddam and Norshidah Mohamed

Abstract

Purpose

The purpose of this study is to examine the impact of service quality, perceived usefulness and users’ cognitive satisfaction on third-generation (3G) mobile phone users’ behavioural retention in using 3G telecommunications services.

Design/methodology/approach

A total of 243 valid questionnaires were collected from 3G users in the Klang Valley, Malaysia. The partial least squares (PLS) path modelling approach to structural equation modelling (SEM), i.e. PLS-SEM, was used to analyse the measurement and structural models.

Findings

Our empirical assessment supports the proposed research hypotheses and further suggests that service quality is a second-order reflective construct comprising navigation and visual design, management and customer service, and system reliability and connection quality.

Originality/value

Prior studies have examined the impact of service quality, perceived usefulness, overall users’ satisfaction and behavioural intention on an information system in general. This study is among the few studies that have attempted to gain insights into 3G users’ post-adoption experience with telecommunications services.

Details

Nankai Business Review International, vol. 7 no. 3
Type: Research Article
ISSN: 2040-8749

Article
Publication date: 4 November 2014

Zilu Shang, Chris Brooks and Rachel McCloy

Abstract

Purpose

Investors are now able to analyse more noise-free news to inform their trading decisions than ever before. Their expectation that more information means better performance is not supported by previous psychological experiments which argue that too much information actually impairs performance. The purpose of this paper is to examine whether the degree of information explicitness improves stock market performance.

Design/methodology/approach

An experiment is conducted in a computer laboratory using a trading simulation adapted from a real market shock. Participants’ performance efficiency and effectiveness are measured separately.

Findings

The results indicate that the explicitness of information neither improves nor impairs participants’ performance effectiveness from the perspectives of returns, share and cash positions, and trading volumes. However, participants’ performance efficiency is significantly affected by information explicitness.

Originality/value

The novel approach and findings of this research add to the knowledge of the impact of information explicitness on the quality of decision making in a financial market environment.

Details

Review of Behavioral Finance, vol. 6 no. 2
Type: Research Article
ISSN: 1940-5979

Article
Publication date: 1 February 2003

HELMUT MAUSSER

Abstract

Quantile‐based measures of risk, e.g., value at risk (VaR), are widely used in portfolio risk applications. Increasing attention is being directed toward managing risk, which involves identifying sources of risk and assessing the economic impact of potential trades. This article compares the performance of two quantile‐based VaR estimators commonly applied to assess the market risk of option portfolios and the credit risk of bond portfolios.
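
The abstract does not specify which two estimators are compared, so the sketch below is only a generic illustration of the idea: an empirical sample-quantile (historical simulation) VaR set against a parametric normal-approximation VaR on simulated losses.

```python
# Generic illustration (not Mausser's specific estimators): compare an empirical
# sample-quantile VaR estimate with a normal-approximation VaR on simulated losses.
import numpy as np

rng = np.random.default_rng(0)
losses = rng.standard_t(df=4, size=10_000)  # heavy-tailed P&L proxy
alpha = 0.99

var_empirical = np.quantile(losses, alpha)                 # sample quantile (historical simulation)
var_normal = losses.mean() + losses.std(ddof=1) * 2.3263   # z_{0.99} ~= 2.3263

print(f"99% VaR, empirical quantile:   {var_empirical:.3f}")
print(f"99% VaR, normal approximation: {var_normal:.3f}")
```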

Details

The Journal of Risk Finance, vol. 4 no. 3
Type: Research Article
ISSN: 1526-5943

Book part
Publication date: 13 December 2013

Kirstin Hubrich and Timo Teräsvirta

Abstract

This survey focuses on two families of nonlinear vector time series models: the family of vector threshold regression (VTR) models and that of vector smooth transition regression (VSTR) models. These two model classes contain incomplete models in the sense that strongly exogenous variables are allowed in the equations. The emphasis is on stationary models, but the considerations also include nonstationary VTR and VSTR models with cointegrated variables. Model specification, estimation and evaluation are considered, and the use of the models is illustrated by macroeconomic examples from the literature.
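
To make the threshold idea concrete, here is a minimal single-equation sketch (not the chapter's vector VTR/VSTR machinery): the threshold on a transition variable is estimated by grid search, fitting OLS in each regime and keeping the candidate with the smallest total sum of squared residuals. All data below are simulated.

```python
# Minimal sketch of single-equation threshold regression estimation by grid
# search: for each candidate threshold c on the transition variable s, fit OLS
# separately in the two regimes and keep the c with the smallest total SSR.
import numpy as np

rng = np.random.default_rng(1)
n = 400
s = rng.uniform(-2, 2, n)              # transition variable
x = rng.normal(size=n)
y = np.where(s <= 0.5, 1.0 + 0.5 * x, -1.0 + 2.0 * x) + 0.3 * rng.normal(size=n)

def regime_ssr(mask):
    """Sum of squared residuals from an OLS fit of y on (1, x) within one regime."""
    X = np.column_stack([np.ones(mask.sum()), x[mask]])
    beta, ssr, *_ = np.linalg.lstsq(X, y[mask], rcond=None)
    return ssr[0] if ssr.size else 0.0

candidates = np.quantile(s, np.linspace(0.15, 0.85, 50))  # trimmed threshold grid
best_c = min(candidates, key=lambda c: regime_ssr(s <= c) + regime_ssr(s > c))
print(f"estimated threshold: {best_c:.2f} (true value 0.5)")
```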

Details

VAR Models in Macroeconomics – New Developments and Applications: Essays in Honor of Christopher A. Sims
Type: Book
ISBN: 978-1-78190-752-8

Book part
Publication date: 16 December 2009

Zongwu Cai, Jingping Gu and Qi Li

Abstract

The literature on nonparametric econometrics has grown considerably over the past two decades. Given space limitations, it is impossible to survey all the important recent developments, so we limit our focus to the following areas. In Section 2, we review recent developments in nonparametric estimation and testing of regression functions with mixed discrete and continuous covariates. We discuss nonparametric estimation and testing of econometric models for nonstationary data in Section 3. Section 4 is devoted to surveying the literature on nonparametric instrumental variable (IV) models. We review nonparametric estimation of quantile regression models in Section 5. In Sections 2–5, we also point out some open research problems, which may help graduate students review the important research papers in this field and identify their own research interests, particularly dissertation topics for doctoral students. Finally, in Section 6 we highlight some important research areas that are not covered in this paper due to space limitations. We plan to write a separate survey paper discussing some of the omitted topics.
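
As one concrete example of the estimators this survey discusses, the sketch below implements a plain Nadaraya-Watson kernel regression with a Gaussian kernel and a single continuous regressor on simulated data; the mixed discrete/continuous, nonstationary, IV and quantile-regression extensions covered in the chapter are not attempted.

```python
# Minimal Nadaraya-Watson kernel regression with a Gaussian kernel and one
# continuous regressor, evaluated on a grid of points.
import numpy as np

def nw_estimate(x_grid, x, y, h):
    """Kernel-weighted local average of y at each point of x_grid (bandwidth h)."""
    u = (x_grid[:, None] - x[None, :]) / h
    w = np.exp(-0.5 * u**2)            # Gaussian kernel weights
    return (w @ y) / w.sum(axis=1)

rng = np.random.default_rng(2)
x = rng.uniform(0, 2 * np.pi, 300)
y = np.sin(x) + 0.3 * rng.normal(size=300)

grid = np.linspace(0, 2 * np.pi, 50)
m_hat = nw_estimate(grid, x, y, h=0.3)   # bandwidth 0.3 is ad hoc, not data-driven
print(np.round(m_hat[:5], 2))
```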

Details

Nonparametric Econometric Methods
Type: Book
ISBN: 978-1-84950-624-3

Article
Publication date: 1 March 1978

M.H. HEINE

Abstract

The dispersion or ‘scatter’ of documents over some set of values of a document attribute is usually described by means of a frequency distribution. When the attribute is qualitative an order distribution can be defined, as in the usual descriptions of Bradford's law. A more succinct description is offered by an order statistic, such as Singleton's index. A novel order statistic, the ‘adapted Gini index’, is introduced and related to the conventional form of Bradford's law. Some simple properties of it are described. An alternative index of dispersion, not an order statistic, based on the relative entropy of the frequency distribution is also defined. For sets of bibliographies such indices themselves have distributions, and it is suggested that, in particular, the distribution pertaining to an indexed data base provides an objective characterization of the data base in so far as indexing terms have been applied to the items in it. A variety of experimental data is reported. This includes the distribution of two indices for samples of bibliographies taken from British Technology Index and Index Medicus, and studies of the variation of the indices with time when the attribute is that of journal title. Whether a new area of knowledge becomes less or more dispersed in its journals as it progresses depends in part on which index is chosen to represent the dispersion, and on whether a series of cumulative or cross‐section bibliographies is chosen.
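
For readers who want a concrete sense of such dispersion summaries, the sketch below computes a standard Gini concentration coefficient and a relative-entropy measure for a hypothetical documents-per-journal frequency distribution; Heine's 'adapted Gini index' is specific to the paper and is not reproduced exactly.

```python
# Sketch of two dispersion summaries for a documents-per-journal frequency
# distribution: a standard Gini concentration coefficient and relative entropy
# (Shannon entropy divided by log of the number of titles). The counts are
# hypothetical, and this is not Heine's "adapted Gini index".
import numpy as np

counts = np.array([120, 40, 25, 10, 8, 5, 3, 2, 1, 1])  # documents per journal title

def gini(c):
    """Gini coefficient of a vector of non-negative counts."""
    c = np.sort(c)
    n = c.size
    return (2 * np.arange(1, n + 1) - n - 1) @ c / (n * c.sum())

p = counts / counts.sum()
rel_entropy = -(p @ np.log(p)) / np.log(counts.size)

print(f"Gini concentration: {gini(counts):.3f}")
print(f"relative entropy:   {rel_entropy:.3f}")
```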

Details

Journal of Documentation, vol. 34 no. 3
Type: Research Article
ISSN: 0022-0418

Article
Publication date: 1 February 2005

Surajit Pal

Abstract

Purpose

The problem is to devise a life-test acceptance procedure for an electrical item that has a Weibull failure distribution with an increasing hazard rate. The test-bed facility imposes constraints on the number of test samples and the testing time.

Design/methodology/approach

The life-test plan is obtained using censoring of experiments and the properties of order statistics. The author derives expressions for order statistics and their moments for some commonly used hazard-rate functions, for example constant, linearly increasing, exponentially increasing and power-function forms, and uses these in planning the life-test acceptance procedure.
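
As a rough numerical companion to the analytical expressions described above, the moments of the k-th order statistic of a Weibull lifetime (increasing hazard when the shape parameter exceeds one) can be approximated by Monte Carlo; the shape, scale and sample sizes below are assumptions, not values from the paper.

```python
# Rough Monte Carlo approximation of order-statistic moments for a Weibull
# lifetime with shape > 1 (increasing hazard). The paper derives these moments
# analytically; the shape/scale values and sample sizes here are assumptions.
import numpy as np

rng = np.random.default_rng(3)
shape, scale = 2.0, 1000.0   # assumed Weibull parameters (hours)
n, k = 10, 3                 # put n items on test, look at the k-th failure time
reps = 100_000

samples = scale * rng.weibull(shape, size=(reps, n))
kth = np.sort(samples, axis=1)[:, k - 1]     # k-th order statistic per replication

print(f"E[T_({k}:{n})]  ~= {kth.mean():.1f} hours")
print(f"Var[T_({k}:{n})] ~= {kth.var(ddof=1):.1f}")
```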

Findings

The results and findings are discussed in full. It is postulated that further research in this direction would yield results of considerable importance to reliability analysis and life-testing experiments.

Originality/value

The same methodology can be adopted to devise life-test acceptance procedures based on censored experiments.

Details

International Journal of Quality & Reliability Management, vol. 22 no. 2
Type: Research Article
ISSN: 0265-671X

Article
Publication date: 1 February 2002

MARTIN SKITMORE and H.P. LO

Abstract

Construction contract auctions are characterized by (1) a heavy emphasis on the lowest bid, as it usually determines the winner of the auction; (2) anticipated high outliers due to the presence of non-competitive bids; (3) very small samples; and (4) uncertainty about the appropriate underlying density function model for the bids. This paper describes a method for simultaneously identifying outliers and the density function by systematically identifying and removing candidate (high) outliers and examining the composite goodness-of-fit of the resulting reduced samples against censored normal and lognormal density functions. The special importance of the lowest bid in this context is exploited in the goodness-of-fit test through the probability of the lowest bid recorded for each auction, treated as the lowest order statistic. Six different identification strategies are tested empirically, both independently and in pooled form, on eight sets of auction data gathered from around the world. The results indicate that the most conservative identification strategy is a multiple of the auction standard deviation assuming a lognormal composite density. Surprisingly, the normal density alternative was the second most conservative solution. The method is also used to evaluate some methods used in practice and to identify potential improvements.
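
One ingredient of the method, evaluating the observed lowest bid as the lowest order statistic of a fitted density, can be sketched as follows: for n bids with fitted distribution function F, P(lowest bid <= b) = 1 - (1 - F(b))^n. The bid values below are hypothetical, and the paper's outlier-removal and censored-fitting steps are not reproduced.

```python
# Sketch of one ingredient of the method: the probability of the observed lowest
# bid treated as the lowest order statistic of a fitted lognormal. The bid values
# are hypothetical, and the paper's outlier-removal and censored-fit steps are
# not reproduced.
import numpy as np
from scipy import stats

bids = np.array([4.82, 5.10, 5.25, 5.31, 5.60, 6.95])  # hypothetical bids (millions)
log_bids = np.log(bids)
mu, sigma = log_bids.mean(), log_bids.std(ddof=1)       # simple lognormal fit

n = bids.size
F_low = stats.norm.cdf((np.log(bids.min()) - mu) / sigma)   # lognormal CDF at the lowest bid
p_lowest = 1.0 - (1.0 - F_low) ** n                         # P(min of n bids <= observed low)

print(f"P(lowest order statistic <= observed low bid) = {p_lowest:.3f}")
```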

Details

Engineering, Construction and Architectural Management, vol. 9 no. 2
Type: Research Article
ISSN: 0969-9988

1 – 10 of over 19,000