Search results

1 – 10 of 127
Open Access
Article
Publication date: 29 July 2020

Abdullah Alharbi, Wajdi Alhakami, Sami Bourouis, Fatma Najar and Nizar Bouguila


Abstract

We propose in this paper a novel, reliable detection method for recognizing forged inpainted images. Detecting potential forgeries and authenticating the content of digital images is extremely challenging and important for many applications. The proposed approach develops new probabilistic support vector machine (SVM) kernels from a flexible generative statistical model, the "bounded generalized Gaussian mixture model". The resulting learning framework properly combines the benefits of discriminative and generative models and incorporates prior knowledge about the nature of the data. It can effectively determine whether an image has been tampered with, distinguishing forged from authentic images. The obtained results confirm that the framework performs well on numerous inpainted images.
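The generative-kernel idea can be sketched roughly as follows. An ordinary one-dimensional Gaussian mixture stands in for the paper's bounded generalized Gaussian mixture, and the function names are hypothetical: a mixture is fitted by EM and its posterior responsibilities define a probability-product kernel usable in an SVM.

```python
import numpy as np

def fit_gmm_1d(x, k=2, iters=50, seed=0):
    """Fit a 1-D Gaussian mixture by EM (toy stand-in for the paper's
    bounded generalized Gaussian mixture model)."""
    rng = np.random.default_rng(seed)
    mu = rng.choice(x, k, replace=False)
    var = np.full(k, x.var() + 1e-6)
    w = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibilities p(component | x_i)
        d = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = w * d
        r /= r.sum(axis=1, keepdims=True)
        # M-step: reweighted means, variances, and mixing proportions
        n = r.sum(axis=0)
        w = n / len(x)
        mu = (r * x[:, None]).sum(axis=0) / n
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / n + 1e-6
    return w, mu, var

def posterior_kernel(xa, xb, params):
    """Probability-product kernel K(a, b) = sum_k p(k|a) p(k|b)."""
    w, mu, var = params
    def resp(x):
        d = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = w * d
        return r / r.sum(axis=1, keepdims=True)
    return resp(xa) @ resp(xb).T

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-2, 0.5, 100), rng.normal(2, 0.5, 100)])
params = fit_gmm_1d(x)
K = posterior_kernel(x[:5], x[:5], params)
```

An SVM trained with such a kernel inherits the mixture's view of the data; in the paper, bounded generalized Gaussian components would replace the Gaussian densities in both functions.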

Details

Applied Computing and Informatics, vol. 20 no. 1/2
Type: Research Article
ISSN: 2634-1964


Book part
Publication date: 30 November 2011

Massimo Guidolin


Abstract

I review the burgeoning literature on applications of Markov regime switching models in empirical finance. Particular attention is devoted to the ability of Markov switching models to fit the data, to filter unknown regimes and states from the data, to provide a powerful tool for testing hypotheses formulated in light of financial theories, and to deliver forecasting performance for both point and density predictions. The review covers papers from a multiplicity of sub-fields in financial economics, ranging from empirical analyses of stock returns to the term structure of default-free interest rates, the dynamics of exchange rates, and the joint process of stock and bond returns.
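The regime-filtering step mentioned above can be sketched for a two-state Gaussian switching model. The parameter values below are invented for illustration; in practice they would be estimated by maximum likelihood.

```python
import numpy as np

def hamilton_filter(y, P, mu, sigma):
    """Filter regime probabilities for a 2-state Gaussian Markov switching
    model: y_t ~ N(mu[s_t], sigma[s_t]^2), s_t a Markov chain with
    transition matrix P. Returns filtered P(s_t | y_1..y_t) per period."""
    evals, evecs = np.linalg.eig(P.T)           # start from the stationary dist.
    pi = np.real(evecs[:, np.argmax(np.real(evals))])
    pi = pi / pi.sum()
    filt = []
    pred = pi
    for yt in y:
        lik = np.exp(-0.5 * ((yt - mu) / sigma) ** 2) / (np.sqrt(2 * np.pi) * sigma)
        post = pred * lik                        # Bayes update within period t
        post /= post.sum()
        filt.append(post)
        pred = P.T @ post                        # one-step-ahead regime prediction
    return np.array(filt)

P = np.array([[0.95, 0.05], [0.10, 0.90]])      # persistent regimes
mu = np.array([0.05, -0.10])                    # "bull" / "bear" mean returns
sigma = np.array([0.10, 0.25])
rng = np.random.default_rng(0)
y = np.concatenate([rng.normal(0.05, 0.10, 50), rng.normal(-0.10, 0.25, 50)])
probs = hamilton_filter(y, P, mu, sigma)
```

The filtered probabilities should assign most mass to the calm regime in the first half of the sample and shift toward the volatile regime in the second half.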

Details

Missing Data Methods: Time-Series Methods and Applications
Type: Book
ISBN: 978-1-78052-526-6


Book part
Publication date: 21 November 2014

Purevdorj Tuvaandorj and Victoria Zinde-Walsh


Abstract

We consider conditional distribution and conditional density functionals in the space of generalized functions. The approach follows Phillips (1985, 1991, 1995), who employed generalized functions to overcome non-differentiability in order to develop expansions. We obtain the limit of the kernel estimators for weakly dependent data, even under non-differentiability of the distribution function; the limit Gaussian process is characterized as a stochastic random functional (random generalized function) on a suitable function space. An alternative, simple-to-compute estimator based on the empirical distribution function is proposed for the generalized random functional, and limit properties are established for test statistics based on this estimator. A Monte Carlo experiment demonstrates good finite-sample performance of the statistics for testing logit and probit specification in binary choice models.
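A simple-to-compute estimator of this kind can be illustrated with a smoothed empirical conditional distribution function in its plain Nadaraya-Watson form. The function name and bandwidth below are illustrative, and none of this reproduces the paper's generalized-function machinery.

```python
import numpy as np

def cond_cdf(x0, y, X, Y, h=0.1):
    """Kernel estimator of the conditional distribution function
    F(y | x0) = P(Y <= y | X = x0): a locally weighted average of the
    indicators 1{Y_i <= y}, with Gaussian kernel weights in X."""
    w = np.exp(-0.5 * ((X - x0) / h) ** 2)
    return float((w * (Y <= y)).sum() / w.sum())

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, 2000)
Y = X + rng.normal(0, 0.1, 2000)   # so Y | X = x is N(x, 0.1^2)
```

At x0 = 0 the conditional law is N(0, 0.01), so the estimator should return about 0.5 at y = 0 and be close to 0 or 1 a few standard deviations away.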

Details

Essays in Honor of Peter C. B. Phillips
Type: Book
ISBN: 978-1-78441-183-1


Article
Publication date: 16 May 2016

Ping Zhang, Peigen Jin, Guanglong Du and Xin Liu



Abstract

Purpose

The purpose of this paper is to provide a novel two-level protection methodology for ensuring the safety of a moving human who enters a robot's workspace, which is significant for addressing the problem of human safety in a human-robot coexistence environment.

Design/methodology/approach

In this system, anyone who enters the robot's workspace is detected using the Kinect, and their skeleton is tracked in real time by an interval Kalman filter. The first-level protection is based on prediction of the human's motion using a Gaussian mixture model (GMM) and Gaussian mixture regression (GMR). Even when the prediction of human motion is incorrect, the system can still safeguard the human by enlarging the initial bounding volume of the human into a second-level early-warning area. Finally, an artificial potential field with additional avoidance strategies is used to plan a path for the robot manipulator.

Findings

Experimental studies on the GOOGOL GRB3016 robot show that the robot manipulator can accomplish the predetermined tasks by circumventing the human, and that the human does not feel endangered.

Originality/value

This study presents a new framework for ensuring human safety in a human-robot coexistence environment, and can thus improve the reliability of human-robot cooperation.
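The GMM/GMR prediction step can be sketched in one dimension. The two-component parameters below are invented toy values, not the paper's learned skeleton model.

```python
import numpy as np

def gmr_predict(x, weights, means, covs):
    """Gaussian mixture regression: E[y | x] under a joint GMM over (x, y).
    means[k] = (mean_x, mean_y); covs[k] is the 2x2 joint covariance."""
    h, cond = [], []
    for w, m, c in zip(weights, means, covs):
        # weight of component k given the observed x
        px = w * np.exp(-0.5 * (x - m[0]) ** 2 / c[0, 0]) / np.sqrt(2 * np.pi * c[0, 0])
        h.append(px)
        # conditional mean of y given x within component k
        cond.append(m[1] + c[1, 0] / c[0, 0] * (x - m[0]))
    h = np.array(h)
    h = h / h.sum()
    return float(h @ np.array(cond))

# hypothetical two-segment 1-D motion model: current position -> predicted position
weights = [0.5, 0.5]
means = [np.array([0.0, 0.0]), np.array([1.0, 1.0])]
covs = [np.array([[0.05, 0.04], [0.04, 0.05]])] * 2
```

Near either component mean the prediction follows that component's regression line; in between, the components are blended by their posterior weights.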

Details

Industrial Robot: An International Journal, vol. 43 no. 3
Type: Research Article
ISSN: 0143-991X


Book part
Publication date: 19 November 2014

Garland Durham and John Geweke


Abstract

Massively parallel desktop computing capabilities now well within the reach of individual academics modify the environment for posterior simulation in fundamental and potentially quite advantageous ways. But to fully exploit these benefits, algorithms that conform to parallel computing environments are needed. This paper presents a sequential posterior simulator designed to operate efficiently in this context. The simulator makes fewer analytical and programming demands on investigators, and is faster, more reliable, and more complete than conventional posterior simulators. The paper extends existing sequential Monte Carlo methods and theory to provide a thorough and practical foundation for sequential posterior simulation that is well suited to massively parallel computing environments. It provides detailed recommendations on implementation, yielding an algorithm that requires only code for simulation from the prior and evaluation of prior and data densities, and works well in a variety of applications representative of serious empirical work in economics and finance. The algorithm facilitates Bayesian model comparison by producing marginal likelihood approximations of unprecedented accuracy as an incidental by-product, is robust to pathological posterior distributions, and provides estimates of numerical standard error and relative numerical efficiency intrinsically. The paper concludes with an application that illustrates the potential of these simulators for applied Bayesian inference.
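The flavor of such a simulator can be conveyed with a minimal sequential importance-resampling sketch for a toy model: a normal mean with a conjugate prior, so the exact posterior is available for comparison. This is far simpler than the paper's algorithm, and all names are illustrative.

```python
import numpy as np

def sequential_posterior(y, n_particles=5000, sigma=1.0, seed=0):
    """Sequential posterior simulation for theta in y ~ N(theta, sigma^2)
    with a N(0, 1) prior: reweight the particle cloud batch by batch,
    resample, and jitter. Each particle evolves independently, which is
    what makes this style of simulator embarrassingly parallel."""
    rng = np.random.default_rng(seed)
    theta = rng.normal(0.0, 1.0, n_particles)            # draws from the prior
    for batch in np.array_split(y, 4):                   # data arrive in batches
        logw = -0.5 * ((batch[:, None] - theta) ** 2 / sigma ** 2).sum(axis=0)
        w = np.exp(logw - logw.max())
        w /= w.sum()
        idx = rng.choice(n_particles, n_particles, p=w)  # multinomial resampling
        theta = theta[idx] + rng.normal(0.0, 0.05, n_particles)  # small move step
    return theta

rng = np.random.default_rng(1)
y = rng.normal(2.0, 1.0, 40)
draws = sequential_posterior(y)
exact_post_mean = y.sum() / (len(y) + 1)   # conjugate closed-form posterior mean
```

The jitter step stands in for the paper's more careful mutation phase; without some move step the particle set would degenerate after repeated resampling.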

Book part
Publication date: 30 November 2011

Massimo Guidolin


Abstract

I survey applications of Markov switching models to the asset pricing and portfolio choice literatures. In particular, I discuss the potential that Markov switching models have to fit financial time series and at the same time provide powerful tools to test hypotheses formulated in the light of financial theories, and to generate positive economic value, as measured by risk-adjusted performances, in dynamic asset allocation applications. The chapter also reviews the role of Markov switching dynamics in modern asset pricing models in which the no-arbitrage principle is used to characterize the properties of the fundamental pricing measure in the presence of regimes.

Details

Missing Data Methods: Time-Series Methods and Applications
Type: Book
ISBN: 978-1-78052-526-6


Article
Publication date: 17 April 2023

Ashlyn Maria Mathai and Mahesh Kumar


Abstract

Purpose

In this paper, the parameters of a mixture of exponential and Rayleigh distributions, combined in proportions α and 1 − α, are estimated from fuzzy data.

Design/methodology/approach

Maximum likelihood estimation (MLE) and the method of moments (MOM) are applied for estimation. Fuzzy data in the form of triangular fuzzy numbers and Gaussian fuzzy numbers, for different sample sizes, are considered to illustrate the resulting estimation and to compare the two methods. In addition, the obtained results are compared with existing results for crisp data in the literature.

Findings

Allowing for fuzziness in the data is very useful for obtaining reliable results in the presence of vagueness. Mean square errors (MSEs) of the resulting estimators are computed using both crisp and fuzzy data. On comparison, in terms of MSE, the maximum likelihood estimators are observed to perform better than the moment estimators.

Originality/value

Classical methods of obtaining estimators of unknown parameters fail to give realistic estimators when they assume the collected data to be crisp or exact, since such precise data are not always feasible or realistic in practice. Most data are incomplete and sometimes expressed in linguistic variables. Such data can be handled by generalizing the classical inference methods using fuzzy set theory.
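For the crisp-data case, the mixture likelihood can be maximized with a short EM routine. This sketch omits the fuzzy-data likelihood that is the paper's actual contribution, and the true parameter values in the simulated data are invented.

```python
import numpy as np

def em_exp_rayleigh(x, iters=200):
    """EM for the crisp-data exponential-Rayleigh mixture
    f(x) = a*lam*exp(-lam*x) + (1-a)*(x/s2)*exp(-x^2/(2*s2)),
    returning estimates of (a, lam, s2)."""
    a, lam, s2 = 0.5, 1.0 / x.mean(), x.var()
    for _ in range(iters):
        f1 = a * lam * np.exp(-lam * x)
        f2 = (1 - a) * (x / s2) * np.exp(-x ** 2 / (2 * s2))
        r = f1 / (f1 + f2)                       # responsibility of the exponential part
        a = r.mean()                             # mixing proportion
        lam = r.sum() / (r * x).sum()            # weighted exponential MLE
        s2 = ((1 - r) * x ** 2).sum() / (2 * (1 - r).sum())  # weighted Rayleigh MLE
    return a, lam, s2

rng = np.random.default_rng(0)
x = np.concatenate([rng.exponential(1 / 2.0, 600),     # exponential part, lam = 2
                    rng.rayleigh(2.0, 400)])           # Rayleigh part, s2 = 4
a, lam, s2 = em_exp_rayleigh(x)
```

With the components this well separated, the estimates should land near the true values a = 0.6, lam = 2 and s2 = 4.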

Details

International Journal of Quality & Reliability Management, vol. ahead-of-print no. ahead-of-print
Type: Research Article
ISSN: 0265-671X


Content available
Book part
Publication date: 15 April 2020

Abstract

Details

Essays in Honor of Cheng Hsiao
Type: Book
ISBN: 978-1-78973-958-9

Book part
Publication date: 15 January 2010

Isobel Claire Gormley and Thomas Brendan Murphy


Abstract

Ranked preference data arise when a set of judges rank, in order of their preference, a set of objects. Such data arise in preferential voting systems and market research surveys. Covariate data associated with the judges are also often recorded. Such covariate data should be used in conjunction with preference data when drawing inferences about judges.

To cluster a population of judges, the population is modeled as a collection of homogeneous groups. The Plackett-Luce model for ranked data is employed to model a judge's ranked preferences within a group. A mixture of Plackett-Luce models is employed to model the population of judges, where each component in the mixture represents a group of judges.

Mixture of experts models provide a framework in which covariates are included in mixture models. Covariates are included through the mixing proportions and the component density parameters. A mixture of experts model for ranked preference data is developed by combining a mixture of experts model and a mixture of Plackett-Luce models. Particular attention is given to the manner in which covariates enter the model. The mixing proportions and group specific parameters are potentially dependent on covariates. Model selection procedures are employed to choose optimal models.

Model parameters are estimated via the ‘EMM algorithm’, a hybrid of the expectation–maximization and the minorization–maximization algorithms. Examples are provided through a menu survey and through Irish election data. Results indicate mixture modeling using covariates is insightful when examining a population of judges who express preferences.
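A single-component Plackett-Luce fit, the building block of the mixture above, can be sketched with Hunter's MM updates. The rankings and function name here are invented toy inputs, and the clustering and covariate layers are omitted.

```python
import numpy as np

def plackett_luce_mm(rankings, n_items, iters=100):
    """Hunter's MM estimate of Plackett-Luce worth parameters from full
    rankings (each ranking lists items best to worst)."""
    w = np.ones(n_items)
    wins = np.zeros(n_items)
    for r in rankings:                  # item i "wins" each stage it tops;
        wins[list(r[:-1])] += 1         # the final, forced choice is excluded
    for _ in range(iters):
        denom = np.zeros(n_items)
        for r in rankings:
            rest = np.array(r)
            for j in range(len(rest) - 1):
                # every item still in the choice set shares this term
                denom[rest[j:]] += 1.0 / w[rest[j:]].sum()
        w = wins / denom
        w /= w.sum()                    # fix the arbitrary scale
    return w

# toy data: judges mostly prefer item 0, then 1; item 2 is always last
rankings = [(0, 1, 2)] * 8 + [(1, 0, 2)] * 2
w = plackett_luce_mm(rankings, 3)
```

Since item 0 beats item 1 in eight of ten rankings and item 2 never beats anyone, the fixed point is w = (0.8, 0.2, 0).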

Details

Choice Modelling: The State-of-the-art and The State-of-practice
Type: Book
ISBN: 978-1-84950-773-8

Article
Publication date: 7 November 2019

Chao Xu, Xianqiang Yang and Xiaofeng Liu


Abstract

Purpose

This paper aims to investigate a probabilistic mixture model for the nonrigid point set registration problem in the computer vision tasks. The equations to estimate the mixture model parameters and the constraint items are derived simultaneously in the proposed strategy.

Design/methodology/approach

The problem of point set registration is expressed via a Laplace mixture model (LMM) instead of a Gaussian mixture model. Three constraint items, namely distance, transformation and correspondence, are introduced to improve the accuracy. The expectation-maximization (EM) algorithm is used to optimize the objective function, and the transformation matrix and correspondence matrix are obtained concurrently.

Findings

Although many researchers have studied the nonrigid registration problem, most have not considered the LMM. In this paper, the nonrigid registration problem is considered under the LMM with the constraint items. Three experiments are performed to verify the effectiveness, robustness and validity of the approach.

Originality/value

A novel method that solves the nonrigid point set registration problem in the presence of the constraint items with the EM algorithm is put forward in this work.
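The role of the Laplace kernel in the E-step can be illustrated with a minimal correspondence computation. The names and the outlier term are illustrative; the transformation update and the constraint items are omitted.

```python
import numpy as np

def laplace_correspondence(src, dst, b=0.5, outlier=1e-3):
    """E-step of an LMM-based registration sketch: posterior probability
    that target point j corresponds to (transformed) source point i,
    using an isotropic Laplace kernel exp(-||d||_1 / b) in place of the
    usual Gaussian, plus a small uniform outlier term."""
    d = np.abs(dst[:, None, :] - src[None, :, :]).sum(axis=2)  # L1 distances
    p = np.exp(-d / b) + outlier     # heavy-tailed kernel tolerates outliers
    return p / p.sum(axis=1, keepdims=True)

src = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
dst = src + 0.05                     # a slightly shifted copy of the source set
P = laplace_correspondence(src, dst)
```

Each row of P is a probability distribution over source points; for this near-identical pair of sets, the diagonal entries dominate.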
