Search results

1 – 10 of over 27,000
Book part
Publication date: 21 December 2010

Tong Zeng and R. Carter Hill

Abstract

In this paper we use Monte Carlo sampling experiments to examine the properties of pretest estimators in the random parameters logit (RPL) model. The pretests are for the presence of random parameters. We study the Lagrange multiplier (LM), likelihood ratio (LR), and Wald tests, using conditional logit as the restricted model. The LM test is the fastest of the three test procedures to implement, since it uses only the restricted (conditional logit) estimates. However, the LM-based pretest estimator has poor risk properties: the ratio of its root mean squared error (RMSE) to the RMSE of the random parameters logit estimator diverges from one as the standard deviation of the parameter distribution increases. The LR and Wald tests exhibit the properties of consistent tests, with power approaching one as the specification error increases, so that the corresponding pretest estimator is consistent. We explore the power of these three tests for the random parameters by calculating the empirical percentile values, size, and rejection rates of the test statistics. We find that the power of the LR and Wald tests decreases as the mean of the coefficient distribution increases. The LM test has the weakest power for detecting the presence of random coefficients in the RPL model.
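
As a rough illustration of the LR-based pretest the abstract describes, the sketch below compares the fitted log-likelihoods of the restricted (conditional logit) and unrestricted (RPL) models and keeps the RPL estimates only on rejection. The log-likelihood values, significance level, and degrees of freedom are hypothetical placeholders, not results from the paper.

```python
from scipy.stats import chi2

def lr_pretest(loglik_clogit, loglik_rpl, n_random_params, alpha=0.05):
    """True if the LR test rejects the conditional logit restriction."""
    lr_stat = 2.0 * (loglik_rpl - loglik_clogit)
    # One df per tested standard deviation; a plain chi-square benchmark.
    return lr_stat > chi2.ppf(1.0 - alpha, df=n_random_params)

def pretest_estimator(est_clogit, est_rpl, reject):
    """Report RPL estimates on rejection, conditional logit estimates otherwise."""
    return est_rpl if reject else est_clogit

# Hypothetical fitted log-likelihoods from the two models.
reject = lr_pretest(loglik_clogit=-1412.7, loglik_rpl=-1398.2, n_random_params=3)
print("use RPL estimates:", reject)
```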

Details

Maximum Simulated Likelihood Methods and Applications
Type: Book
ISBN: 978-0-85724-150-4

Book part
Publication date: 3 June 2008

Nathaniel T. Wilcox

Abstract

Choice under risk has a large stochastic (unpredictable) component. This chapter examines five stochastic models for binary discrete choice under risk and how they combine with “structural” theories of choice under risk. Stochastic models are substantive theoretical hypotheses: they are frequently testable in and of themselves, and they also serve as identifying restrictions for hypothesis tests, estimation, and prediction. Econometric comparisons suggest that for the purpose of prediction (as opposed to explanation), the choice of stochastic model may be far more consequential than the choice of structure, such as expected utility or rank-dependent utility.
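
To make the distinction between structure and stochastic model concrete, here is a minimal sketch of one such combination: a CRRA expected-utility structure embedded in a logit (“strong utility”) stochastic model. The functional forms and parameter values are illustrative assumptions, not the chapter's specification.

```python
import numpy as np

def crra_eu(prizes, probs, r):
    """CRRA expected utility of a lottery; r is a hypothetical risk parameter."""
    return np.dot(probs, prizes ** (1.0 - r) / (1.0 - r))

def p_choose_a(lottery_a, lottery_b, r, noise):
    """P(choose A): the structural EU difference passed through a logistic
    link, i.e. a 'strong utility' (Fechner-logit) stochastic model."""
    diff = crra_eu(*lottery_a, r) - crra_eu(*lottery_b, r)
    return 1.0 / (1.0 + np.exp(-diff / noise))

# Two hypothetical binary lotteries: (prizes, probabilities).
a = (np.array([10.0, 0.0]), np.array([0.5, 0.5]))
b = (np.array([4.0, 3.0]), np.array([0.5, 0.5]))
print(p_choose_a(a, b, r=0.5, noise=0.2))
```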

Details

Risk Aversion in Experiments
Type: Book
ISBN: 978-1-84950-547-5

Article
Publication date: 5 October 2012

I. Doltsinis

Abstract

Purpose

The purpose of this paper is to present computational methods as applied to engineering systems and evolutionary processes with randomness in external actions and inherent parameters.

Design/methodology/approach

Two approaches are distinguished, both of which rely on solvers developed for the deterministic problem. Probabilistic analysis approximates the response by a Taylor series expansion about the mean input. Alternatively, stochastic simulation implies random sampling of the input and statistical evaluation of the output.
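
A minimal sketch of the two approaches on a toy scalar response, assuming a smooth response function: the Taylor route propagates the mean and variance through a first-order expansion about the mean input, while the simulation route samples the input and evaluates output statistics. The response function and parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def response(x):
    """Hypothetical smooth scalar response of a system to a random input x."""
    return np.exp(0.3 * x) + x**2

mu, sigma = 1.0, 0.2  # mean and standard deviation of the random input

# Probabilistic analysis: first-order Taylor expansion about the mean input,
# E[y] ~ g(mu), Var[y] ~ (g'(mu) * sigma)**2.
h = 1e-6
grad = (response(mu + h) - response(mu - h)) / (2 * h)  # central difference
taylor_mean, taylor_var = response(mu), (grad * sigma) ** 2

# Stochastic simulation: random sampling of the input, statistics of the output.
samples = response(rng.normal(mu, sigma, size=100_000))
print(taylor_mean, samples.mean())  # the two means should roughly agree
print(taylor_var, samples.var())    # likewise for the variances
```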

Findings

Beyond the characterization of the random response, methods of reliability assessment are discussed and concepts of design improvement are presented. Optimization for robustness diminishes the sensitivity of the system to fluctuating parameters.

Practical implications

Deterministic algorithms available for the primary problem are utilized for stochastic analysis by statistical Monte Carlo sampling. The computational effort for the repeated solution of the primary problem depends on the variability of the system and is usually high. Alternatively, the analytic Taylor series expansion requires extending the primary solver to compute derivatives of the response with respect to the random input. The method is restricted to the computation of output mean values and variances/covariances, with the effort determined by the number of random input variables. The results of the two methods are comparable within the domain of applicability.

Originality/value

The present account addresses the main issues related to the presence of randomness in engineering systems and processes. They comprise the analysis of stochastic systems, reliability, design improvement, optimization and robustness against randomness of the data. The analytical Taylor approach is contrasted to the statistical Monte Carlo sampling throughout. In both cases, algorithms known from the primary, deterministic problem are the starting point of stochastic treatment. The reader benefits from the comprehensive presentation of the matter in a concise manner.

Article
Publication date: 1 August 1997

B.M. Nicolaï and J. De Baerdemaeker

Abstract

Derives a first-order perturbation algorithm for the computation of mean values and (co)variances of the transient temperature field in conduction-heated materials with random field parameters. Considers both linear and non-linear heat conduction problems. The algorithm is advantageous in terms of computer time compared to the Monte Carlo method. The computer time can be reduced further by appropriate transformation of the random vectors resulting from the discretization of the random fields.
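
The following sketch illustrates the idea of first-order perturbation on a toy discretized linear conduction problem (not the authors' algorithm): the deterministic solver is reused once for the mean field and twice more for a sensitivity, from which output (co)variances follow.

```python
import numpy as np

# Toy discretized linear conduction problem K(k) T = q with random conductivity k.
n = 5
q = np.ones(n)  # hypothetical heat load vector

def stiffness(k):
    """Tridiagonal conduction matrix for conductivity k (toy discretization)."""
    return k * (2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1))

k_mean, k_var = 1.0, 0.01

T_mean = np.linalg.solve(stiffness(k_mean), q)  # mean temperature field

# Sensitivity dT/dk by a central finite difference of the primary solver.
h = 1e-6
S = (np.linalg.solve(stiffness(k_mean + h), q)
     - np.linalg.solve(stiffness(k_mean - h), q)) / (2 * h)

T_cov = k_var * np.outer(S, S)  # first-order (co)variance of the field
print(T_mean)
print(np.diag(T_cov))           # temperature variances at the nodes
```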

Details

International Journal of Numerical Methods for Heat & Fluid Flow, vol. 7 no. 5
Type: Research Article
ISSN: 0961-5539

Keywords

Book part
Publication date: 23 October 2023

Morten I. Lau, Hong Il Yoo and Hongming Zhao

Abstract

We evaluate the hypothesis of temporal stability in risk preferences using two recent data sets from longitudinal lab experiments. Both experiments included a combination of decision tasks that allows one to identify a full set of structural parameters characterizing risk preferences under Cumulative Prospect Theory (CPT), including loss aversion. We consider temporal stability in those structural parameters at both population and individual levels. The population-level stability pertains to whether the distribution of risk preferences across individuals in the subject population remains stable over time. The individual-level stability pertains to within-individual correlation in risk preferences over time. We embed the CPT structure in a random coefficient model that allows us to evaluate temporal stability at both levels in a coherent manner, without having to switch between different sets of models to draw inferences at a specific level.
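
For readers unfamiliar with the CPT structure being estimated, here is a minimal sketch of its core ingredients, a value function with loss aversion and a probability weighting function, using the Tversky-Kahneman functional forms with textbook parameter values as stand-ins; the chapter's random coefficient specification would place distributions over such parameters.

```python
import numpy as np

def value(x, alpha=0.88, lam=2.25):
    """CPT value function: power curvature with loss aversion lam."""
    x = np.asarray(x, dtype=float)
    v = np.abs(x) ** alpha
    return np.where(x >= 0, v, -lam * v)

def weight(p, gamma=0.61):
    """Tversky-Kahneman inverse-S probability weighting function."""
    return p**gamma / (p**gamma + (1.0 - p) ** gamma) ** (1.0 / gamma)

def cpt_binary(x_gain, x_loss, p_gain):
    """CPT evaluation of a simple mixed prospect (x_gain, p; x_loss, 1-p)."""
    return weight(p_gain) * value(x_gain) + weight(1.0 - p_gain) * value(x_loss)

print(cpt_binary(100.0, -50.0, 0.5))
```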

Details

Models of Risk Preferences: Descriptive and Normative Challenges
Type: Book
ISBN: 978-1-83797-269-2

Keywords

Details

Handbook of Transport Modelling
Type: Book
ISBN: 978-0-08-045376-7

Article
Publication date: 1 April 2002

Gianni Cicia, Teresa Del Giudice and Riccardo Scarpa

Abstract

With this study we investigate the preferences of an important category of consumers of organic products (regular consumers of organic food, or RCOFs), allowing for preference heterogeneity. A survey instrument was developed to elicit preferences for important qualitative and quantitative attributes of extra virgin olive oil. The survey was administered via questionnaire to a random sample of 198 RCOFs in organic food stores of Naples, Italy. The choice task was organised around a fractional factorial main-effects orthogonal design. Each respondent made eight choices to rank-order nine product profiles in terms of their individual preference. Product attributes included price, origin of production, type of certification and visual appearance. Interestingly, the set of observed responses appears to display significant preference heterogeneity for origin of production and price. Once heterogeneity and correlation among repeated choices by the same respondent are accounted for by means of random-parameter panel logit models, the fit increases dramatically with respect to the more restrictive fixed-parameter logit models. Results also suggest that price plays an important role as a quality proxy, while visual appearance is not significant in preference modelling and the type of certification programme has a fixed effect.
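
A minimal sketch of why a random-parameter panel logit captures correlation among repeated choices: each respondent's coefficient is drawn once and held fixed across that respondent's tasks, and the likelihood is simulated by averaging the product of choice probabilities over draws. All data, names, and values below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(1)

def sim_panel_loglik(X, choices, beta_mean, beta_sd, n_draws=200):
    """Simulated log-likelihood contribution of one respondent: the price
    coefficient is drawn once per respondent and held fixed across tasks,
    inducing correlation among that respondent's repeated choices.
    X: (T, J) price attribute of J alternatives in T choice tasks;
    choices: index of the chosen alternative in each task."""
    draws = rng.normal(beta_mean, beta_sd, size=n_draws)
    probs = np.ones(n_draws)
    for t, j in enumerate(choices):
        u = draws[:, None] * X[t]                        # utility per draw/alternative
        p = np.exp(u) / np.exp(u).sum(axis=1, keepdims=True)
        probs *= p[:, j]                                 # product over the panel
    return np.log(probs.mean())                          # average over draws, then log

X = rng.uniform(1.0, 5.0, size=(8, 3))   # 8 tasks, 3 profiles, price attribute
choices = rng.integers(0, 3, size=8)
print(sim_panel_loglik(X, choices, beta_mean=-1.0, beta_sd=0.5))
```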

Details

British Food Journal, vol. 104 no. 3/4/5
Type: Research Article
ISSN: 0007-070X

Keywords

Book part
Publication date: 1 March 2007

Getu Hailu, Scott R. Jeffrey and Ellen W. Goddard

Abstract

The agribusiness co-operative sector in Canada has been affected by ongoing changes in economic, political, and social policies. Increased competition from local investor-owned firms and multinational companies, deregulation and globalization of trade, and increased concentration of suppliers and purchasers have put tremendous competitive pressure on agribusiness marketing co-operatives. The enhanced level of competitive rivalry may force co-operatives into lowering costs and prices. Improvement in the cost or operating efficiency of agribusiness marketing co-operatives may be crucial as changes in regulation, technology, and other market developments bring into question the long-term viability of co-operative businesses. Therefore, information on the efficiency with which agribusiness co-operative firms operate would be useful.

Details

Cooperative Firms in Global Markets
Type: Book
ISBN: 978-0-7623-1389-1

Article
Publication date: 6 April 2010

Jin Cheng

Abstract

Purpose

The existing methods for determining cable forces in cable-stayed bridges during construction are based on the assumption of complete determinacy of structural parameters; this is usually referred to as deterministic analysis. In reality, however, there are uncertainties in the design variables, including geometric properties (cross-sectional properties and dimensions), material mechanical properties (modulus, strength, etc.), and load magnitude and distribution. Deterministic analysis therefore cannot provide complete information regarding cable forces in cable-stayed bridges during construction. The purpose of this paper is to determine cable forces in cable-stayed bridges during construction under parametric uncertainty.

Design/methodology/approach

An efficient and accurate algorithm is proposed to determine the cable forces in cable-stayed bridges during construction under parameter uncertainty. The proposed method is a hybrid, combining an improved Monte Carlo simulation method with a forward process analysis method.
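
As a rough illustration of the Monte Carlo side of such a hybrid scheme (not the paper's algorithm), the sketch below samples uncertain parameters, re-runs a hypothetical deterministic cable-force solver per sample, and summarizes the resulting force statistics.

```python
import numpy as np

rng = np.random.default_rng(2)

def cable_force(E, A, load):
    """Hypothetical deterministic solver: force in one stay from modulus E,
    cross-sectional area A and applied load (toy force-allocation formula)."""
    return load * (E * A) / (E * A + 2.0e8)

n = 10_000
E = rng.normal(2.0e11, 1.0e10, n)     # uncertain elastic modulus (Pa)
A = rng.normal(5.0e-3, 2.0e-4, n)     # uncertain cross-section (m^2)
load = rng.normal(1.0e6, 5.0e4, n)    # uncertain load (N)

forces = cable_force(E, A, load)
print(forces.mean(), forces.std())    # mean and scatter of the cable force
```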

Findings

The proposed algorithm can obtain more information about the cable forces at different construction stages than the commonly used deterministic method, and it provides an improved understanding of the cable forces in cable-stayed bridges built under parameter uncertainty.

Originality/value

The value of this research is twofold: it develops an efficient and accurate algorithm for determining the cable forces in cable-stayed bridges during construction under parameter uncertainty, and it provides an improved understanding of those cable forces.

Details

Engineering Computations, vol. 27 no. 3
Type: Research Article
ISSN: 0264-4401

Keywords

Article
Publication date: 18 March 2021

Jinsheng Wang, Muhannad Aldosary, Song Cen and Chenfeng Li

Abstract

Purpose

Normal transformation is often required in structural reliability analysis to convert non-normal random variables into independent standard normal variables. The existing normal transformation techniques, for example, the Rosenblatt transformation and the Nataf transformation, usually require the joint probability density function (PDF) and/or the marginal PDFs of the non-normal random variables. In practical problems, however, the joint PDF and marginal PDFs are often unknown due to a lack of data, while the statistical information is much easier to express in terms of statistical moments and correlation coefficients. This study aims to address this issue by presenting an alternative normal transformation method that does not require the PDFs of the input random variables.

Design/methodology/approach

The new approach, namely, the Hermite polynomial normal transformation, expresses the normal transformation function in terms of Hermite polynomials and it works with both uncorrelated and correlated random variables. Its application in structural reliability analysis using different methods is thoroughly investigated via a number of carefully designed comparison studies.

Findings

Comprehensive comparisons are conducted to examine the performance of the proposed Hermite polynomial normal transformation scheme. The results show that the presented approach has accuracy comparable to previous methods and can be obtained in closed form. Moreover, the new scheme only requires the first four statistical moments and/or the correlation coefficients between random variables, which greatly widens the applicability of normal transformations in practical problems.

Originality/value

This study interprets the classical polynomial normal transformation method in terms of Hermite polynomials, namely the Hermite polynomial normal transformation, to convert uncorrelated/correlated random variables into standard normal random variables. The new scheme only requires the first four statistical moments to operate, making it particularly suitable for problems that are constrained by limited data. Moreover, the extension to correlated cases is easily achieved through the Hermite polynomials. Compared to existing methods, the new scheme is cheap to compute and delivers comparable accuracy.
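
One well-known member of this family of transformations is Winterstein's third-order Hermite approximation, sketched below as an illustration: it maps standard normal samples to a variable matching given first four moments (the reliability-analysis direction is the inverse of this cubic map). The coefficient formulas are Winterstein's approximation, not necessarily the paper's exact closed form.

```python
import numpy as np

def hermite_transform(u, mean, std, skew, exkurt):
    """Map standard normal samples u to a non-normal variable with target
    mean, std, skewness and excess kurtosis, via a third-order Hermite
    polynomial with Winterstein's approximate coefficients."""
    h4 = (np.sqrt(1.0 + 1.5 * exkurt) - 1.0) / 18.0
    h3 = skew / (4.0 + 2.0 * np.sqrt(1.0 + 1.5 * exkurt))
    kappa = 1.0 / np.sqrt(1.0 + 2.0 * h3**2 + 6.0 * h4**2)  # variance normalization
    return mean + std * kappa * (u + h3 * (u**2 - 1.0) + h4 * (u**3 - 3.0 * u))

rng = np.random.default_rng(3)
u = rng.standard_normal(200_000)
x = hermite_transform(u, mean=10.0, std=2.0, skew=0.5, exkurt=0.8)
print(x.mean(), x.std())  # ~10 and ~2, up to sampling noise
```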

Details

Engineering Computations, vol. 38 no. 8
Type: Research Article
ISSN: 0264-4401

Keywords
