Search results

1 – 10 of over 2000
Book part
Publication date: 19 May 2009

Janice A. Black, Richard L. Oliver and Lori D. Paris

Abstract

The clear specification of leadership efforts spanning levels of analysis has lagged behind leadership research in general. Simulation modeling, such as agent-based modeling, provides research platforms for exploring these interesting issues. This chapter uses agent-based models, along with Dionne and Dionne's (2009) choices of leadership styles, to examine the impact of those styles on the generation of an emergent group resource, context-for-learning (CFL), instead of the specific task outcome (group decision making) described by Dionne and Dionne. Consistent effectiveness is found across leadership styles for workgroups with high and with slightly lower initial individual levels of CFL. A second agent-based model includes the ability of agents to forget previously learned skills and reveals a reduced effectiveness of all leadership styles. However, the effectiveness of the leadership styles differs between the two outcomes (the specific group task model and the emergent group resource model). Reasons for these differences are explored, and implications from the comparisons of the two models are delineated.
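
As a rough illustration of the kind of simulation described above, the sketch below is a minimal agent-based model in which agents carry an individual context-for-learning (CFL) level that grows under a leadership-style parameter and can decay through forgetting, with group CFL emerging as the average. It is not the authors' model; the style labels, gain values, and forgetting rate are illustrative assumptions.

```python
import random

# Minimal agent-based sketch (not the authors' model): each agent carries an individual
# context-for-learning (CFL) level that grows under a leadership style's influence and,
# optionally, decays through forgetting. Group CFL is the emergent average.
# Style labels, gains, and the forgetting rate are illustrative assumptions.
LEADERSHIP_GAIN = {"transactional": 0.02, "transformational": 0.05, "laissez_faire": 0.01}

class Agent:
    def __init__(self, initial_cfl):
        self.cfl = initial_cfl

    def step(self, style, forgetting_rate=0.0):
        gain = LEADERSHIP_GAIN[style] * random.uniform(0.5, 1.5)  # stochastic learning
        self.cfl = max(0.0, min(1.0, self.cfl * (1.0 - forgetting_rate) + gain))

def simulate(style, initial_levels, periods=50, forgetting_rate=0.0):
    agents = [Agent(level) for level in initial_levels]
    for _ in range(periods):
        for agent in agents:
            agent.step(style, forgetting_rate)
    return sum(a.cfl for a in agents) / len(agents)  # emergent group-level CFL

if __name__ == "__main__":
    high_start = [0.80, 0.75, 0.85, 0.80]
    for style in LEADERSHIP_GAIN:
        print(style, round(simulate(style, high_start, forgetting_rate=0.05), 3))
```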

Details

Multi-Level Issues in Organizational Behavior and Leadership
Type: Book
ISBN: 978-1-84855-503-7

Book part
Publication date: 1 August 2023

Tiffany Wright and Nancy Smith

Abstract

LGBT educators have historically experienced various challenges in their schools, while school leaders have needed to balance the rights and needs of LGBT educators with sometimes unwelcoming community norms. The three iterations of this study, spanning a decade, aimed to gain an understanding of the ongoing climate for LGBT educators so that administrators can apply best practices related to policy enactment, advocacy, and enforcement – in this chapter, relating specifically to creating an LGBT-inclusive climate in schools. Overall, the school climate for many LGBT educators continues to vary; in some respects, it has not changed dramatically from 2007 to 2017. Many participants across the three studies readily described both positive and negative consequences of being out. Additionally, LGBT educators working with younger students consistently feel most unsafe being out to students to any degree, and they are experiencing an intense dichotomy: more policy and administrative support alongside more vehement opposition to being out as teachers. While there are still places for principals and other administrators to demonstrate stronger support for LGBT educators, these results show that their level of support is moving in the right direction.

Book part
Publication date: 15 December 1998

P.C. Hughes and M.J. Maher

Abstract

The traffic assignment problem aims to predict driver route choice and is typically applied in the assessment of road schemes. The authors have previously published an SUE (Stochastic User Equilibrium) assignment algorithm, i.e. one which models variation in driver perception as well as cost variation due to congestion. The algorithm works by minimising a function given by Sheffi and Powell (1982); in this paper the three terms of that function are investigated separately, and the possibility of constructing more sophisticated versions of the SUE algorithm is explored.
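
For reference, the Sheffi and Powell (1982) objective mentioned above is commonly stated as the unconstrained minimisation below (a sketch of the standard textbook form; notation may differ from the paper). Here x_a is the flow and c_a the cost function on link a, q_rs the demand for O-D pair rs, and S_rs the expected perceived minimum (satisfaction) cost for that pair; its three terms are the ones the paper examines separately.

```latex
z(\mathbf{x}) \;=\; -\sum_{rs} q_{rs}\, S_{rs}\bigl(\mathbf{c}(\mathbf{x})\bigr)
\;+\; \sum_{a} x_a\, c_a(x_a)
\;-\; \sum_{a} \int_{0}^{x_a} c_a(\omega)\,\mathrm{d}\omega
```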

Details

Mathematics in Transport Planning and Control
Type: Book
ISBN: 978-0-08-043430-8

Book part
Publication date: 13 December 2013

Peter Arcidiacono, Patrick Bayer, Federico A. Bugni and Jonathan James

Abstract

Many dynamic problems in economics are characterized by large state spaces which make both computing and estimating the model infeasible. We introduce a method for approximating the value function of high-dimensional dynamic models based on sieves and establish results for (a) consistency, (b) rates of convergence, and (c) bounds on the error of approximation. We embed this method for approximating the solution to the dynamic problem within an estimation routine and prove that it provides consistent estimates of the model's parameters. We provide Monte Carlo evidence that our method can successfully be used to approximate models that would otherwise be infeasible to compute, suggesting that these techniques may substantially broaden the class of models that can be solved and estimated.
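
As a hedged sketch of the general idea (fitted value iteration with a polynomial sieve, not the authors' estimator), the snippet below approximates V(s) by a linear combination of basis terms and refits the coefficients by least squares after each Bellman update on a sample of states. The toy reward, transition, and state grid are assumptions.

```python
import numpy as np

# Sieve-style value function approximation sketch (not the authors' estimator):
# V(s) is approximated by a polynomial sieve, and the Bellman operator is iterated
# on sampled states with the sieve coefficients refit by least squares each round.
# The toy reward, transition, and state grid below are illustrative assumptions.

def basis(s, degree=4):
    """Polynomial sieve terms [1, s, s^2, ..., s^degree] evaluated at the states s."""
    return np.vstack([s ** k for k in range(degree + 1)]).T

def fitted_value_iteration(states, reward, transition, beta=0.95, degree=4, iters=200):
    phi = basis(states, degree)                  # n_states x (degree + 1) design matrix
    theta = np.zeros(degree + 1)                 # sieve coefficients
    for _ in range(iters):
        v_next = basis(transition(states), degree) @ theta
        target = reward(states) + beta * v_next  # Bellman update at the sample points
        theta, *_ = np.linalg.lstsq(phi, target, rcond=None)
    return theta

if __name__ == "__main__":
    grid = np.linspace(0.1, 2.0, 50)
    theta = fitted_value_iteration(grid, reward=np.log, transition=lambda s: 0.9 * s)
    print("approximate V(1.0):", (basis(np.array([1.0])) @ theta)[0])
```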

Book part
Publication date: 19 November 2014

Garland Durham and John Geweke

Abstract

Massively parallel desktop computing capabilities, now well within the reach of individual academics, modify the environment for posterior simulation in fundamental and potentially quite advantageous ways. But to fully exploit these benefits, algorithms that conform to parallel computing environments are needed. This paper presents a sequential posterior simulator designed to operate efficiently in this context. The simulator makes fewer analytical and programming demands on investigators, and is faster, more reliable, and more complete than conventional posterior simulators. The paper extends existing sequential Monte Carlo methods and theory to provide a thorough and practical foundation for sequential posterior simulation that is well suited to massively parallel computing environments. It provides detailed recommendations on implementation, yielding an algorithm that requires only code for simulation from the prior and evaluation of prior and data densities, and that works well in a variety of applications representative of serious empirical work in economics and finance. The algorithm facilitates Bayesian model comparison by producing marginal likelihood approximations of unprecedented accuracy as an incidental by-product, is robust to pathological posterior distributions, and provides estimates of numerical standard error and relative numerical efficiency intrinsically. The paper concludes with an application that illustrates the potential of these simulators for applied Bayesian inference.
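
To make the ingredients concrete, here is a hedged sketch of a generic data-tempered sequential Monte Carlo simulator in the family the paper builds on: particles are drawn from the prior, observations are introduced one at a time, and particles are reweighted, resampled, and refreshed with a random-walk Metropolis step, so only prior simulation and prior/data density evaluations are needed. It is a textbook-style illustration on a toy normal-mean model, not the authors' algorithm, and it runs serially rather than in parallel.

```python
import numpy as np

# Generic data-tempered SMC sketch (not the paper's algorithm): toy model y_t ~ N(mu, 1)
# with a N(0, 10^2) prior on mu. Only prior simulation and density evaluations are used.

def log_lik(mu, y):                                     # log density of observations
    return -0.5 * (y - mu) ** 2 - 0.5 * np.log(2.0 * np.pi)

def log_prior(mu):
    return -0.5 * (mu / 10.0) ** 2 - np.log(10.0) - 0.5 * np.log(2.0 * np.pi)

def sequential_posterior(y, n_particles=2000, seed=0):
    rng = np.random.default_rng(seed)
    mu = rng.normal(0.0, 10.0, n_particles)             # simulate from the prior
    logw = np.zeros(n_particles)
    for t, y_t in enumerate(y):
        logw += log_lik(mu, y_t)                        # reweight by the new observation
        w = np.exp(logw - logw.max())
        w /= w.sum()
        idx = rng.choice(n_particles, n_particles, p=w) # multinomial resampling
        mu, logw = mu[idx], np.zeros(n_particles)
        y_seen = y[: t + 1]
        def log_post(m):                                # posterior given the data so far
            return log_prior(m) + log_lik(m[:, None], y_seen).sum(axis=1)
        prop = mu + rng.normal(0.0, 0.5, n_particles)   # random-walk Metropolis refresh
        accept = np.log(rng.uniform(size=n_particles)) < log_post(prop) - log_post(mu)
        mu = np.where(accept, prop, mu)
    return mu

if __name__ == "__main__":
    data = np.random.default_rng(1).normal(2.0, 1.0, 30)
    print("posterior mean of mu:", sequential_posterior(data).mean())
```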

Book part
Publication date: 19 November 2014

Gail Blattenberger, Richard Fowles and Peter D. Loeb

Abstract

This paper examines variable selection among various factors related to motor vehicle fatality rates using a rich set of panel data. Four Bayesian methods are used: Extreme Bounds Analysis (EBA), Stochastic Search Variable Selection (SSVS), Bayesian Model Averaging (BMA), and Bayesian Additive Regression Trees (BART). The first three of these employ parameter estimation; the last, BART, involves no parameter estimation, yet it also has implications for variable selection. The variables examined in the models include traditional motor vehicle and socioeconomic factors along with important policy-related variables. Policy recommendations are suggested with respect to cell phone use, modernization of the fleet, alcohol use, and diminishing suicidal behavior.
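
As a hedged illustration of one of the listed approaches, the sketch below implements a BIC-approximated Bayesian Model Averaging over subsets of candidate predictors and reports posterior inclusion probabilities. It is not the paper's implementation (and says nothing about EBA, SSVS, or BART); the simulated data and placeholder variable names are assumptions.

```python
import numpy as np
from itertools import combinations

# BIC-based Bayesian Model Averaging sketch (not the paper's implementation): enumerate
# predictor subsets, score each OLS model by BIC as a rough log marginal-likelihood
# proxy, convert the scores to model weights, and sum them into posterior inclusion
# probabilities per variable. Data and variable names below are simulated placeholders.

def bic(y, X):
    n, k = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = np.sum((y - X @ beta) ** 2)
    return n * np.log(rss / n) + k * np.log(n)

def bma_inclusion_probs(y, X, names):
    n, p = X.shape
    intercept = np.ones((n, 1))
    subsets, scores = [], []
    for size in range(p + 1):
        for subset in combinations(range(p), size):
            design = np.hstack([intercept, X[:, list(subset)]]) if subset else intercept
            subsets.append(set(subset))
            scores.append(-0.5 * bic(y, design))       # approximate log marginal likelihood
    w = np.exp(np.array(scores) - max(scores))
    w /= w.sum()
    return {names[j]: float(sum(w[i] for i, s in enumerate(subsets) if j in s))
            for j in range(p)}

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 4))
    y = 1.5 * X[:, 0] - 2.0 * X[:, 2] + rng.normal(size=200)
    print(bma_inclusion_probs(y, X, ["cell_phone_law", "fleet_age", "alcohol", "income"]))
```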

Book part
Publication date: 10 June 2009

Craig Emby

Abstract

The evaluation of competing hypotheses is an essential aspect of the audit process. The method of evaluation and re-evaluation may have implications for both efficiency and effectiveness. This paper presents the results of a field experiment using a case study set in the context of a fraud investigation, in which practicing auditors were required to engage in multiple hypothesis probability estimation and revision regarding the perpetrator of the fraud. The experiment examined the effect of two different methods of facilitating multiple hypothesis probability estimation and revision consistent with the completeness and complementarity norms of probability theory, as they apply to the independence versus dependence of competing hypotheses, and with the prescriptions of Bayes' Theorem. The first method was to have participants use linear probability elicitation scales and receive prior tutoring in probability theory emphasizing the axioms of completeness and complementarity. The second method was to provide a graphical decision aid, without prior tutoring, to aid the participants in expressing their responses. A third condition, in which participants used linear probability elicitation scales but received no tutoring in probability theory, provided a benchmark against which to assess the effects of the two treatments.

Participants receiving prior tutoring in probability theory and using linear probability elicitation scales complied in their estimations and revisions with the probability axioms of completeness and complementarity. However, they engaged in frequent violations of the normative probability model and of Bayes' Theorem. They did not distribute changes in the probability of the target hypothesis to the nontarget hypotheses, and they engaged in “eliminations and resuscitations” whereby they eliminated a suspect by assigning a zero probability to that suspect at an intermediate iteration and resuscitated that suspect by reassigning him or her a positive probability at a later iteration. The participants using the graphical decision aids, by construction, did not violate the probability axioms of completeness and complementarity. However, with no imposed constraints, the patterns of their revisions were different. When they revised the probability of the target hypothesis, they revised the probabilities of the nontarget hypotheses. They did not engage in eliminations and resuscitations. These patterns are more consistent with the norms of probability theory and with Bayes' Theorem. Possible explanations of this phenomenon are proposed and discussed, including implications for audit practice and future research.
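
For concreteness, the short sketch below illustrates the normative benchmark referred to above: priors over mutually exclusive, exhaustive suspects sum to one, and a Bayes revision that lowers the target suspect's probability redistributes it across the nontarget suspects. All numbers are invented for illustration and do not come from the study.

```python
# Illustration of the normative benchmark (all numbers invented): a complete,
# complementary prior over mutually exclusive suspects is revised by Bayes' theorem,
# so probability removed from one suspect is redistributed to the others.
priors = {"suspect_A": 0.25, "suspect_B": 0.25, "suspect_C": 0.25, "suspect_D": 0.25}

# Assumed likelihood of a new piece of evidence under each hypothesis.
likelihood = {"suspect_A": 0.10, "suspect_B": 0.40, "suspect_C": 0.30, "suspect_D": 0.20}

joint = {s: priors[s] * likelihood[s] for s in priors}
posterior = {s: joint[s] / sum(joint.values()) for s in joint}

print(posterior)               # suspect_A falls from 0.25 to 0.10; the loss is redistributed
print(sum(posterior.values())) # completeness/complementarity: the posterior sums to one
```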

Details

Advances in Accounting Behavioral Research
Type: Book
ISBN: 978-1-84855-739-0

Book part
Publication date: 15 December 1998

D. Kupiszewska and D. Van Vliet

Abstract

This paper develops a new algorithmic approach to equilibrium road traffic assignment which, by directly estimating differences, can more accurately estimate the impact of (relatively) small traffic schemes or changes in the demand pattern. Comparing the outputs of two independent traffic assignments to “with” and “without” scheme networks very often masks the effect of the scheme due to the “noise” in the resulting solutions. By contrast, an incremental approach attempts to directly estimate the changes in link flows, and hence costs, resulting from (relatively) small perturbations to the network and/or trip matrix. The algorithms are based, firstly, on “route flows” as opposed to “link flows”; secondly, they use a variant of the standard Frank-Wolfe algorithm known as “Social Pressure”, which gives greater weight to those O-D path flows whose costs are well above the minimum than to those which are already at or near the minimum. Tests on a set of five “real” networks demonstrate that the Social Pressure Algorithm is marginally better than Frank-Wolfe for single assignments but is very much faster and more accurate in predicting the impact of small network changes.
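
For orientation, the sketch below shows the standard link-based Frank-Wolfe/MSA baseline that the Social Pressure variant departs from, on a toy two-route network with BPR-style costs and a fixed step size in place of the exact line search. The path-weighting of the Social Pressure Algorithm and the incremental (difference-estimating) logic described above are not reproduced; all network parameters are illustrative assumptions.

```python
import numpy as np

# Toy Frank-Wolfe-style assignment on a two-parallel-route network (the standard
# baseline contrasted in the abstract). Uses an MSA step size rather than the exact
# line search; the Social Pressure weighting is not reproduced. Parameters are assumed.

DEMAND = 1000.0                                    # trips for the single O-D pair
FREE_FLOW = np.array([10.0, 15.0])                 # free-flow travel times per route
CAPACITY = np.array([600.0, 800.0])

def cost(x):
    """BPR-style cost: t0 * (1 + 0.15 * (flow / capacity)^4)."""
    return FREE_FLOW * (1.0 + 0.15 * (x / CAPACITY) ** 4)

def all_or_nothing(costs):
    """Load all demand onto the currently cheapest route."""
    y = np.zeros_like(costs)
    y[np.argmin(costs)] = DEMAND
    return y

def assign(iterations=200):
    x = all_or_nothing(cost(np.zeros(2)))          # initial all-or-nothing loading
    for n in range(1, iterations + 1):
        y = all_or_nothing(cost(x))                # auxiliary solution toward cheapest route
        x = x + (2.0 / (n + 2.0)) * (y - x)        # convex combination (MSA step)
    return x

if __name__ == "__main__":
    flows = assign()
    print("route flows:", flows.round(1), "route costs:", cost(flows).round(2))
```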

Details

Mathematics in Transport Planning and Control
Type: Book
ISBN: 978-0-08-043430-8

Book part
Publication date: 19 October 2020

Julian TszKin Chan

Abstract

This chapter studies a snowball sampling method for social networks with endogenous peer selection. Snowball sampling is a sampling design which preserves the dependence structure of the network: it sequentially collects information on the vertices linked to the vertices collected in the previous iteration. The snowball samples suffer from a sample selection problem because of the endogenous peer selection. The author proposes a new estimation method that uses the relationship between samples in different iterations to correct for this selection, and applies it to snowball samples collected from Facebook to estimate the proportion of users who support the Umbrella Movement in Hong Kong.
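
The sampling design itself is simple to sketch: starting from seed vertices, each wave collects the neighbours of the vertices collected in the previous wave. The toy graph below is an illustrative assumption; the selection-correction estimator and the Facebook application are not reproduced.

```python
import random

# Snowball sampling sketch: wave k collects the neighbours of the vertices collected in
# wave k-1. The toy random graph is an illustrative assumption; the selection-correction
# estimator described in the abstract is not reproduced here.

def snowball_sample(graph, seeds, waves):
    collected, frontier = set(seeds), set(seeds)
    samples_by_wave = [set(seeds)]
    for _ in range(waves):
        next_wave = {v for u in frontier for v in graph[u]} - collected
        samples_by_wave.append(next_wave)
        collected |= next_wave
        frontier = next_wave
    return samples_by_wave

if __name__ == "__main__":
    rng = random.Random(0)
    nodes = list(range(200))
    graph = {u: set() for u in nodes}
    for u in nodes:                                # toy random network
        for v in rng.sample(nodes, 3):
            if v != u:
                graph[u].add(v)
                graph[v].add(u)
    waves = snowball_sample(graph, seeds=[0, 1], waves=3)
    print("vertices collected per wave:", [len(w) for w in waves])
```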
