Search results
1 – 10 of 644
L.M. Berry, B.A. Murtagh, G.B. McMahon, S.J. Sugden and L.D. Welling
Abstract
Reviews the value of network concepts as a means of portraying complex logistics and distribution systems. Reports on research which focuses on the broader issues of model formulation and solution techniques rather than specific applications. Addresses the issues of designing networks with a tree structure, and also more general ones in which loops are allowed and redundancy enforced. The decision variables involved relate to whether or not a link should exist between a specific pair of nodes, and then to what the level of traffic flow on that particular link should be. Describes the design problem in detail and possible models that could be used to represent it. Follows with a description of genetic algorithms to solve the synthesis problem of deciding the node‐link topology, and the use of linear and non‐linear programming to solve the problem of assigning traffic flow to a network with a given topology in a least‐cost manner. Concludes with a description of computational experience with solving such problems.
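The topology-synthesis step the abstract describes can be illustrated with a toy genetic algorithm over link-inclusion bitstrings. This is a minimal sketch under stated assumptions: the node count, link cost, connectivity penalty and GA settings below are invented for demonstration and are not the article's actual formulation.

```python
import random

# Toy genetic algorithm for node-link topology design on a 5-node network.
# A design is a bitstring over all candidate links; fitness is total link
# cost plus a penalty for disconnected designs (all figures assumed).

N_NODES = 5
LINKS = [(i, j) for i in range(N_NODES) for j in range(i + 1, N_NODES)]
LINK_COST = 10.0      # assumed cost per installed link
PENALTY = 1000.0      # assumed penalty for a disconnected design

def connected(bits):
    """Check connectivity of the design encoded by `bits` via depth-first search."""
    adj = {n: [] for n in range(N_NODES)}
    for bit, (i, j) in zip(bits, LINKS):
        if bit:
            adj[i].append(j)
            adj[j].append(i)
    seen, stack = {0}, [0]
    while stack:
        for m in adj[stack.pop()]:
            if m not in seen:
                seen.add(m)
                stack.append(m)
    return len(seen) == N_NODES

def cost(bits):
    # Total cost: installed links, plus a large penalty if any node is cut off.
    return LINK_COST * sum(bits) + (0 if connected(bits) else PENALTY)

def evolve(pop_size=30, generations=200, seed=0):
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in LINKS] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=cost)
        survivors = pop[: pop_size // 2]          # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, len(LINKS))    # one-point crossover
            child = a[:cut] + b[cut:]
            k = rng.randrange(len(child))         # point mutation
            child[k] ^= 1
            children.append(child)
        pop = survivors + children
    return min(pop, key=cost)

best = evolve()
```

In the article's framing, a GA of this kind decides only the node-link topology; the second stage (assigning traffic flows to the chosen topology at least cost) would then be handed to a linear or non-linear programming solver rather than handled by the GA itself.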
Abstract
Describes a procedure for modelling the costs of production and distribution between several production facilities with economies of scale and many customers who are widely dispersed. The problem takes the form of a large transportation problem on which is superimposed a cost minimization problem involving variable production quantities. These costs involve fixed costs for initiating production and variable costs with diminishing returns to scale. Models the problem as a non‐linear integer programming problem and then solves it using a recently developed non‐linear integer algorithm. Describes two applications in Australia and New Zealand and illustrates how comparison with a mixed‐integer linear programming formulation shows a significant improvement.
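The cost structure described here can be sketched in a few lines: a fixed charge for initiating production plus a variable cost with diminishing returns to scale. All parameter values below are illustrative assumptions, not figures from the applications reported.

```python
def production_cost(q, fixed=100.0, rate=5.0, exponent=0.8):
    """Illustrative plant cost: a fixed setup charge if any production occurs,
    plus a concave variable cost (exponent < 1 gives diminishing returns to
    scale). All parameter values are assumptions for demonstration."""
    if q <= 0:
        return 0.0
    return fixed + rate * q ** exponent

# Average unit cost falls as volume grows, so concentrating output at fewer
# plants can beat the answer a plain linear-cost transportation model gives —
# which is why the problem is modelled as a non-linear integer programme.
unit_cost_small = production_cost(10) / 10
unit_cost_large = production_cost(1000) / 1000
```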
Glenn W. Harrison and J. Todd Swarthout
Abstract
We take Cumulative Prospect Theory (CPT) seriously by rigorously estimating structural models using the full set of CPT parameters. Much of the literature only estimates a subset of CPT parameters, or more simply assumes CPT parameter values from prior studies. Our data are from laboratory experiments with undergraduate students and MBA students facing substantial real incentives and losses. We also estimate structural models from Expected Utility Theory (EUT), Dual Theory (DT), Rank-Dependent Utility (RDU), and Disappointment Aversion (DA) for comparison. Our major finding is that a majority of individuals in our sample locally asset integrate. That is, they see a loss frame for what it is, a frame, and behave as if they evaluate the net payment rather than the gross loss when one is presented to them. This finding is devastating to the direct application of CPT to these data for those subjects. Support for CPT is greater when losses are covered out of an earned endowment rather than house money, but RDU is still the best single characterization of individual and pooled choices. Defenders of the CPT model claim, correctly, that the CPT model exists “because the data says it should.” In other words, the CPT model was born of a wide range of stylized facts culled from parts of the cognitive psychology literature. If one is to take the CPT model seriously and rigorously, then it needs to do a much better job of explaining the data than we see here.
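For readers unfamiliar with the full CPT parameter set, the standard Tversky–Kahneman (1992) functional forms can be sketched as follows. The parameter values shown are the published 1992 median estimates, used purely for illustration; they are not the estimates obtained in this study, and the gain-domain weighting function is applied to both domains here for brevity.

```python
def value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Tversky-Kahneman value function: power utility over gains and losses
    relative to a reference point of zero, with loss aversion lam > 1."""
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

def weight(p, gamma=0.61):
    """Inverse-S probability weighting function (gain-domain form):
    overweights small probabilities, underweights large ones."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

def cpt(x, p):
    """CPT evaluation of a simple binary prospect (x with probability p, else 0)."""
    return weight(p) * value(x)

# Loss aversion: a 50-50 gamble over a gain and a loss of equal size
# carries negative CPT value, so it is rejected.
mixed = cpt(100, 0.5) + weight(0.5) * value(-100)
```

The full CPT parameter set the authors estimate (curvature for gains and losses, loss aversion, and separate weighting curvature for each domain) corresponds to alpha, beta, lam and gamma here, which is what distinguishes their exercise from studies that fix some of these at assumed values.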
Abstract
Locating hub facilities is important in different types of transportation and communication networks. The p‐Hub Median Problem (p‐HMP) addresses a class of hub location problems in which all hubs are interconnected and each non‐hub node is assigned to a single hub. The hubs are uncapacitated, and their number p is initially determined. Introduces an Artificial Intelligence (AI) heuristic called simulated annealing to solve the p‐HMP. The results are compared against another AI heuristic, namely Tabu Search, and against two other non‐AI heuristics. A real world data set of airline passenger flow in the USA, and randomly generated data sets are used for computational testing. The results confirm that AI heuristic approaches to the p‐HMP outperform non‐AI heuristic approaches on solution quality.
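The simulated annealing idea the abstract introduces can be shown on a toy hub-assignment problem: each node is assigned to one of two fixed hubs so as to minimise total node-to-hub distance. The coordinates, cooling schedule and parameters are assumptions for illustration, not the article's airline data or its actual p-HMP formulation (which also interconnects hubs and routes inter-node flows through them).

```python
import math, random

HUBS = [(0.0, 0.0), (10.0, 10.0)]
NODES = [(1.0, 2.0), (2.0, 1.0), (9.0, 8.0), (8.0, 9.0), (5.0, 5.0)]

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def total_cost(assign):
    # assign[i] is the index of the hub serving node i.
    return sum(dist(NODES[i], HUBS[h]) for i, h in enumerate(assign))

def anneal(t0=10.0, cooling=0.95, steps=2000, seed=1):
    rng = random.Random(seed)
    assign = [rng.randrange(len(HUBS)) for _ in NODES]
    cost, t = total_cost(assign), t0
    for _ in range(steps):
        i = rng.randrange(len(NODES))            # neighbour move: reassign one node
        trial = assign[:]
        trial[i] = 1 - trial[i]
        delta = total_cost(trial) - cost
        # Accept improvements always; accept worse moves with prob e^(-delta/T),
        # which lets the search escape local optima early on.
        if delta < 0 or rng.random() < math.exp(-delta / t):
            assign, cost = trial, cost + delta
        t *= cooling                              # geometric cooling schedule
    return assign, cost

best_assign, best_cost = anneal()
```

As the temperature falls the acceptance rule becomes greedy, which is the mechanism the comparison with Tabu Search and the non-AI heuristics is probing: how well the stochastic acceptance rule trades off solution quality against search effort.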
Abstract
This essay is a review of the recent literature on the methodology of economics, with a focus on three broad trends that have defined the core lines of research within the discipline during the last two decades. These trends are: (a) the philosophical analysis of economic modelling and economic explanation; (b) the epistemology of causal inference, evidence diversity and evidence-based policy; and (c) the investigation of the methodological underpinnings and public policy implications of behavioural economics. The final output is inevitably not exhaustive, yet it aims to offer a fair taste of some of the most representative questions in the field on which many philosophers, methodologists and social scientists have recently been placing a great deal of intellectual effort. The topics and references compiled in this review should serve, at a minimum, as sound introductions to some of the central research questions in the philosophy and methodology of economics.
Glenn W. Harrison and Don Ross
Abstract
Behavioral economics poses a challenge for the welfare evaluation of choices, particularly those that involve risk. It demands that we recognize that the descriptive account of behavior toward those choices might not be the one we were all taught, and still teach, and that subjective risk perceptions might not accord with expert assessments of probabilities. In addition to these challenges, we are faced with the need to jettison naive notions of revealed preferences, according to which every choice by a subject expresses her objective function, as behavioral evidence forces us to confront pervasive inconsistencies and noise in a typical individual’s choice data. A principled account of errant choice must be built into models used for identification and estimation. These challenges demand close attention to the methodological claims often used to justify policy interventions. They also require, we argue, closer attention by economists to relevant contributions from cognitive science. We propose that a quantitative application of the “intentional stance” of Dennett provides a coherent, attractive and general approach to behavioral welfare economics.
Abstract
Choice under risk has a large stochastic (unpredictable) component. This chapter examines five stochastic models for binary discrete choice under risk and how they combine with “structural” theories of choice under risk. Stochastic models are substantive theoretical hypotheses that are frequently testable in and of themselves, and also identifying restrictions for hypothesis tests, estimation and prediction. Econometric comparisons suggest that for the purpose of prediction (as opposed to explanation), choices of stochastic models may be far more consequential than choices of structures such as expected utility or rank-dependent utility.
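How a stochastic model combines with a structural theory can be sketched with one of the simplest pairings: a logit ("Luce") choice rule wrapped around an expected-utility structure. This is only an illustration of the general pattern the chapter examines, not one of its five specific models; the CRRA utility and all parameter values are assumptions.

```python
import math

def crra(x, r=0.5):
    """Constant relative risk aversion utility (valid for r != 1)."""
    return x ** (1 - r) / (1 - r)

def expected_utility(lottery, r=0.5):
    """`lottery` is a list of (probability, outcome) pairs."""
    return sum(p * crra(x, r) for p, x in lottery)

def logit_choice_prob(left, right, mu=1.0, r=0.5):
    """Probability of choosing `left` over `right` in a binary choice.
    The structural theory (EU) supplies the utilities; the stochastic model
    (logit) maps their difference into a choice probability. As mu grows,
    the rule collapses to deterministic EU maximisation."""
    d = mu * (expected_utility(left, r) - expected_utility(right, r))
    return 1.0 / (1.0 + math.exp(-d))

safe = [(1.0, 25.0)]
risky = [(0.5, 49.0), (0.5, 4.0)]
p_safe = logit_choice_prob(safe, risky)
```

The chapter's point can be read off the signature: swapping `expected_utility` for a rank-dependent functional changes the structure while keeping the stochastic model fixed, and the econometric comparisons suggest the latter choice can matter more for prediction.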
Bruno Lanz, Allan Provins, Ian J. Bateman, Riccardo Scarpa, Ken Willis and Ece Ozdemiroglu
Abstract
We investigate discrepancies between willingness to pay (WTP) and willingness to accept (WTA) in the context of a stated choice experiment. Using data on customer preferences for water services where respondents were able to both ‘sell’ and ‘buy’ the choice experiment attributes, we find evidence of non-linearity in the underlying utility function even though the range of attribute levels is relatively small. Our results reveal the presence of significant loss aversion in all the attributes, including price. We find the WTP–WTA schedule to be asymmetric around the current provision level and that the WTP–WTA ratio varies according to the particular provision change under consideration. Such reference point findings are of direct importance for practitioners and decision-makers using choice experiments for economic appraisal such as cost–benefit analysis, where failure to account for non-linearity in welfare estimates may significantly over- or under-state individuals’ preferences for gains and avoiding losses respectively.
Abstract
Recent work on the theory of teams and team reasoning in interactive game settings is due principally to the late Michael Bacharach (Bacharach, 2006), who offers a conception of the individual as a team member, and also to Martin Hollis (1998) and to Robert Sugden and Natalie Gold (Sugden, 2000; Gold & Sugden, 2007). It is motivated by the conflict between what ordinary experience suggests people often do and what rationality prescribes for them, such as in prisoner's dilemma games where individuals can choose to cooperate or defect. The source of the conflict, they suggest, is an ambiguity in the syntax of standard game theory, which is taken to pose the question individuals in games ask themselves as, “what should I do?,” but which might be taken to pose the question, particularly when individuals are working together with others, as, “what should we do?” When taken in the latter way, each individual chooses according to what best promotes the team's objective and then performs the role appropriate to a member of that team or group. Bacharach understood this change in focus in terms of the different possible cognitive frames that individuals use to think about the world, and developed a variable frame theory for rational play in games in which the frame adopted for a decision problem determines what counts as rational play (Janssen, 2001; Casajus, 2001).

In order to explain how someone acts, we have to take account of the representation or model of her situation that she is using as she thinks what to do. The model varies with the cognitive frame in which she does her thinking. Her frame stands to her thoughts as a set of axes does to a graph; it circumscribes the thoughts that are logically possible for her (not ever, but at that time). (Bacharach, 2006, p. 69)

Sugden understands this framing idea in terms of the theory of focal points, following Thomas Schelling's emphasis on the role of salience in coordination games (Schelling, 1960), and his theory similarly ties decision-making to the way the game is understood (Sugden, 1995). This all recalls what Tversky and Kahneman (1981, 1986) termed standard theory's description invariance assumption, whose abandonment makes it possible to bring a variety of insights from psychology to bear on rationality in economics.