Search results

1 – 10 of over 6000
Article
Publication date: 4 June 2020

Tiago Oliveira, Wilber Vélez and Artur Portela

Abstract

Purpose

This paper is concerned with new formulations of local meshfree and finite element numerical methods, for the solution of two-dimensional problems in linear elasticity.

Design/methodology/approach

In the local domain, assigned to each node of a discretization, the work theorem establishes an energy relationship between a statically admissible stress field and an independent kinematically admissible strain field. This relationship, derived as a weighted residual weak form, is expressed as an integral local form. Based on the independence of the stress and strain fields, this local form of the work theorem is kinematically formulated with a simple rigid-body displacement to be applied by local meshfree and finite element numerical methods. The main feature of this paper is the use of a linearly integrated local form that implements a quite simple algorithm with no further integration required.
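
The authors' local work-theorem formulation is not reproduced here, but the finite element machinery it builds on can be illustrated generically. The following is a minimal sketch of a standard 1D linear-elastic bar solver with linear elements; the material constants, geometry, and load are invented for illustration and the global assembly shown is not the paper's local boundary-integrated scheme:

```python
import numpy as np

# Minimal 1D linear-elastic bar: fixed at x=0, axial tip load P at x=L.
# Standard linear finite elements on a uniform mesh (illustrative only; the
# paper's method integrates a local weak form along local-domain boundaries).
E, A, L, P = 200e9, 1e-4, 1.0, 1e3   # hypothetical modulus, area, length, load
n_el = 10
n_nodes = n_el + 1
h = L / n_el

K = np.zeros((n_nodes, n_nodes))
k_e = (E * A / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])  # element stiffness
for e in range(n_el):
    K[e:e + 2, e:e + 2] += k_e            # assemble into global stiffness

F = np.zeros(n_nodes)
F[-1] = P                                  # tip load
u = np.zeros(n_nodes)
u[1:] = np.linalg.solve(K[1:, 1:], F[1:])  # impose u(0) = 0, solve reduced system

# Exact tip displacement for an end-loaded bar: u(L) = P*L/(E*A)
print(u[-1], P * L / (E * A))
```

Because the exact solution is linear in x, linear elements reproduce it to machine precision, which makes this a convenient sanity check for any elasticity solver.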

Findings

The reduced integration performed by this linearly integrated formulation plays a key role in the behavior of local numerical methods, since it implies a reduction of the nodal stiffness which, in turn, increases solution accuracy and, most importantly, introduces no instabilities, unlike unstabilized nodal integration methods. As a consequence of using such a convenient linearly integrated local form, the derived meshfree and finite element numerical methods become fast and accurate, a feature of paramount importance for the computational efficiency of numerical methods. Three benchmark problems were analyzed with these techniques in order to assess the accuracy and efficiency of the new integrated local formulations of meshfree and finite element numerical methods. The results obtained in this work are in perfect agreement with the available analytical solutions and, furthermore, outperform the computational efficiency of other methods. Thus, the accuracy and efficiency of the local numerical methods presented in this paper make this a very reliable and robust formulation.

Originality/value

Presentation of a new local meshfree numerical method. The method, linearly integrated along the boundary of the local domain, implements an algorithm with no further integration required. The method is absolutely reliable, with remarkably accurate results, and quite robust, with extremely fast computations.

Details

Multidiscipline Modeling in Materials and Structures, vol. 16 no. 5
Type: Research Article
ISSN: 1573-6105

Book part
Publication date: 12 October 2015

Mindy Capaldi

Abstract

Most college students are required to take at least one mathematics course. Many of these students view mathematics as a dry and tedious subject, where the main task is to “plug and chug” using formulas. In contrast, mathematicians see mathematics as a creative process in which real joy comes from grappling with difficult problems and (hopefully) solving them. In this way, mathematics is like a fun puzzle. The challenge is to get students to view mathematics the same way that their teachers do. Inquiry-based learning (IBL) can help solve this problem. The Academy of Inquiry-Based Learning describes IBL as a pedagogical method that encourages students to conjecture, discover, solve, explore, collaborate, and communicate (What is IBL? (n.d.). Retrieved from http://www.inquirybasedlearning.org/?page=What_is_IBL). With IBL, teachers do not lay out all of the formulas and theorems as previous knowledge. Nor do they provide perfect, easily worked through examples and proofs for every new topic. Instead, IBL courses demonstrate the creative process that is mathematics. IBL makes class more enjoyable for both teachers and students, and can bring students closer to the real experiences of mathematicians.

Details

Inquiry-Based Learning for Science, Technology, Engineering, and Math (Stem) Programs: A Conceptual and Practical Resource for Educators
Type: Book
ISBN: 978-1-78441-850-2

Article
Publication date: 23 March 2012

Boris Mitavskiy, Jonathan Rowe and Chris Cannings

Abstract

Purpose

The purpose of this paper is to establish a version of a theorem that originated in population genetics and was later adopted in evolutionary computation theory, leading to novel Monte‐Carlo sampling algorithms that provably increase AI potential.

Design/methodology/approach

In the current paper the authors set up a mathematical framework, then state and prove a version of a Geiringer‐like theorem that is well suited to the development of Monte‐Carlo sampling algorithms for decision making under randomness and incomplete information.

Findings

This work establishes an important theoretical link between classical population genetics, evolutionary computation theory and model free reinforcement learning methodology. Not only may the theory explain the success of the currently existing Monte‐Carlo tree sampling methodology, but it also leads to the development of novel Monte‐Carlo sampling techniques guided by rigorous mathematical foundation.

Practical implications

The theoretical foundations established in the current work provide guidance for the design of powerful Monte‐Carlo sampling algorithms in model free reinforcement learning, to tackle numerous problems in computational intelligence.
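
As a generic illustration of Monte-Carlo sampling applied to decision making under randomness (not the Geiringer-theorem-guided algorithms this paper develops), the sketch below estimates each action's expected reward by repeated sampling and picks the best; the reward model and its parameters are invented:

```python
import random

# Flat Monte-Carlo action selection: estimate each action's expected reward
# by repeated sampling, then choose the action with the highest estimate.
# The environment below is hypothetical, purely for illustration.
def noisy_reward(action, rng):
    true_means = {"a": 0.2, "b": 0.8, "c": 0.5}   # hidden true values
    return true_means[action] + rng.uniform(-0.1, 0.1)

def monte_carlo_choose(actions, n_samples, seed=0):
    rng = random.Random(seed)
    estimates = {}
    for a in actions:
        samples = [noisy_reward(a, rng) for _ in range(n_samples)]
        estimates[a] = sum(samples) / n_samples   # Monte-Carlo mean estimate
    return max(estimates, key=estimates.get), estimates

best, est = monte_carlo_choose(["a", "b", "c"], n_samples=500)
print(best)  # with enough samples the highest-mean action is selected
```

Monte-Carlo tree search refines this flat scheme by allocating samples adaptively down a game tree, which is the setting the paper's theory speaks to.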

Originality/value

Establishing a Geiringer‐like theorem with non‐homologous recombination was a long‐standing open problem in evolutionary computation theory. Apart from overcoming this challenge, in a mathematically elegant fashion and establishing a rather general and powerful version of the theorem, this work leads directly to the development of novel provably powerful algorithms for decision making in the environment involving randomness, hidden or incomplete information.

Article
Publication date: 15 February 2008

Yi Lin and Dillon Forrest

Abstract

Purpose

This paper aims to look at the economic concepts of consumption preferences and merit goods and the well constructed examples: The Lazy Rotten Kids, The Nightlight Controversial, and The Prodigal Son, in the light of a recent systemic model, named yoyo model.

Design/methodology/approach

With the systemic yoyo model and its methodology used as the road‐map, the traditional calculus‐based methods are employed.

Findings

From the angle of whole systemic evolution, an astonishing theorem is established, named the Theorem of Never‐Perfect Value Systems. It states that, no matter how a value system is introduced and reinforced, the system will never be perfect. Also, it is shown that, when a tender loving parent exists, his selfish child would take advantage of the parent by putting as little effort into his work as possible.

Originality/value

With recent development of systems research as the foundation, two brand new insights into household economics were discovered.

Details

Kybernetes, vol. 37 no. 1
Type: Research Article
ISSN: 0368-492X

Article
Publication date: 25 July 2008

Rainer Michaeli and Lothar Simon

Abstract

Purpose

This paper is intended to enable competitive intelligence practitioners to use an important method for everyday work when confronted with conditional uncertainties: Bayes' theorem.

Design/methodology/approach

The paper shows how Bayes' theorem applies to competitive intelligence problems. The main approach is to illustrate the concepts with a near‐real‐world example. The paper also provides background for further reading, especially on psychological problems connected with Bayes' theorem.

Findings

The main finding is that conditional uncertainties represent a common problem in competitive intelligence. They should be computed explicitly rather than estimated intuitively. Otherwise, serious misinterpretations and complete project failures might follow.
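
The kind of explicit computation the authors recommend can be sketched as follows; the hypothesis, evidence, and all probabilities here are invented for illustration, not taken from the paper's example:

```python
# Explicit Bayes'-theorem update, rather than intuitive estimation.
# H: "competitor will launch a rival product"; E: "competitor is hiring sales staff".
p_h = 0.10              # prior P(H)
p_e_given_h = 0.80      # P(E | H): likelihood of the signal if a launch is coming
p_e_given_not_h = 0.30  # P(E | not H): hiring happens for other reasons too

# Total probability of observing the evidence
p_e = p_e_given_h * p_h + p_e_given_not_h * (1 - p_h)

# Posterior P(H | E) = P(E | H) * P(H) / P(E)
p_h_given_e = p_e_given_h * p_h / p_e
print(round(p_h_given_e, 3))  # -> 0.229
```

Note how the posterior (about 23%) is far below the 80% likelihood an analyst might intuitively report: base-rate neglect of exactly this kind is the psychological pitfall the paper warns against.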

Research limitations/implications

The psychological literature documents that people sometimes fail to handle conditional uncertainties correctly, although conditional uncertainties about human properties seem to be handled well. This should be verified or falsified in the competitive intelligence context.

Practical implications

In general, the application of Bayes' theorem should be seen as one of the foundations of competitive intelligence education. Especially, when it is clear in which intelligence research situations conditional uncertainties can or cannot be handled intuitively, competitive intelligence education and practice should be adapted to these findings.

Originality/value

CI practitioners can underestimate the value of Bayes' theorem in practice as they are often unaware of the (psychological) problems around handling conditional uncertainties intuitively. The article demonstrates how to take a computational approach to conditional uncertainties in CI projects. Thus, it can be used as part of appropriate CI training material.

Details

European Journal of Marketing, vol. 42 no. 7/8
Type: Research Article
ISSN: 0309-0566

Article
Publication date: 1 April 2003

Anghel N. Rugina

Abstract

A long Introduction provides a composite methodological standard of 25 elements (concepts, theorems and basic relationships) which actually represent in analysis a system of general stable equilibrium in economics and other social sciences. In practice, the same composite standard refers to a possible regime of a free, just and stable economy and society. This double composite scientific objective standard was used to examine the content of the Memorial Lectures presented by nine Laureates who received the Nobel Prize in Economics from 1969 to 1974. Specifically, the purpose was to see how much these lectures have contributed to the clarification and the solution of the major problems of our time.

Details

International Journal of Social Economics, vol. 30 no. 4
Type: Research Article
ISSN: 0306-8293

Book part
Publication date: 1 January 2008

Arnold Zellner

Abstract

After briefly reviewing the past history of Bayesian econometrics and Alan Greenspan's (2004) recent description of his use of Bayesian methods in managing policy-making risk, some of the issues and needs that he mentions are discussed and linked to past and present Bayesian econometric research. Then a review of some recent Bayesian econometric research and needs is presented. Finally, some thoughts are presented that relate to the future of Bayesian econometrics.

Details

Bayesian Econometrics
Type: Book
ISBN: 978-1-84855-308-8

Book part
Publication date: 1 March 2021

Hugo Chu

Abstract

This chapter provides an alternative interpretation of the emergence of the “Ramsey-Cass-Koopmans” growth model, a framework which, alongside the overlapping generations model, is the dominant approach in today’s macroeconomics. By focusing on the role Paul Samuelson played through the works he developed in the turnpike literature, the author’s goal is to provide a more accurate history of growth theory from the 1940s to the 1960s, one which started before Solow (1956) but never had him as a central reference. Inspired by John von Neumann’s famous 1945 article, Samuelson wrote his first turnpike paper by trying to conjecture an alternative optimal growth path (Samuelson, 1949 [1966]). In the 1960s, after reformulating the intertemporal utility model presented in Ramsey (1928), Samuelson began to propound it as a representative agent model. Through Samuelson’s interactions with colleagues and PhD students at the Massachusetts Institute of Technology (MIT), and given his standing in the profession, he encouraged a broader use of that device in macroeconomics, particularly in growth theory. With the publication of Samuelson (1965), Tjalling Koopmans and Lionel McKenzie rewrote their own articles in order to account for the new approach. This work complements a recently written account on growth theory by Assaf and Duarte (2018).

Details

Research in the History of Economic Thought and Methodology: Including a Selection of Papers Presented at the 2019 ALAHPE Conference
Type: Book
ISBN: 978-1-80071-140-2

Book part
Publication date: 23 June 2016

Yanqin Fan and Emmanuel Guerre

Abstract

The asymptotic bias and variance of a general class of local polynomial estimators of M-regression functions are studied over the whole compact support of the multivariate covariate under a minimal assumption on the support. The support assumption ensures that the vicinity of the boundary of the support will be visited by the multivariate covariate. The results show that like in the univariate case, multivariate local polynomial estimators have good bias and variance properties near the boundary. For the local polynomial regression estimator, we establish its asymptotic normality near the boundary and the usual optimal uniform convergence rate over the whole support. For local polynomial quantile regression, we establish a uniform linearization result which allows us to obtain similar results to the local polynomial regression. We demonstrate both theoretically and numerically that with our uniform results, the common practice of trimming local polynomial regression or quantile estimators to avoid “the boundary effect” is not needed.
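
The univariate special case of the estimator studied here can be sketched in a few lines. The following fits a degree-1 local polynomial (local linear) estimator at a set of evaluation points, including boundary points; the kernel, bandwidth, and test function are arbitrary choices for illustration, not the chapter's setup:

```python
import numpy as np

# Local linear regression: at each evaluation point t, fit a weighted
# degree-1 polynomial and take its intercept as the estimate of m(t).
def local_linear(x, y, x0, h):
    fitted = []
    for t in x0:
        u = (x - t) / h
        w = np.where(np.abs(u) <= 1, 0.75 * (1 - u**2), 0.0)  # Epanechnikov kernel
        X = np.column_stack([np.ones_like(x), x - t])          # local design, degree 1
        W = np.diag(w)
        beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)       # weighted least squares
        fitted.append(beta[0])                                 # intercept = m-hat(t)
    return np.array(fitted)

rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 400)
y = np.sin(2 * np.pi * x) + 0.1 * rng.normal(size=400)
x0 = np.array([0.0, 0.25, 0.5, 1.0])   # includes both boundary points
m_hat = local_linear(x, y, x0, h=0.1)
print(m_hat)
```

Evaluating at x = 0 and x = 1 without trimming illustrates the chapter's point: the local linear fit adapts its one-sided weights automatically, so boundary bias stays of the same order as in the interior.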

Book part
Publication date: 11 December 2006

Steven G. Medema

Abstract

The first issue that requires examination is the question of how we got to this point to begin with. The answer to this question, of course, is a function of who “we” happens to be. The lawyers can blame Oliver Wendell Holmes (1897, p. 469), who made “the man of the future … the man of statistics and the master of economics.” The future, it would seem, is now. Legal Realist/Institutionalist lawyer-economists such as Walton Hamilton and Robert Lee Hale, who were economists on law school faculties before that tradition got started at Chicago, had something to do with this too, although neither they nor law-minded economists such as John R. Commons can be given credit or blame for the economic analysis of law – at least not directly. The birth of the economic analysis of law is very much a Chicago story – Coase, Becker, and Posner – although we must allow that Guido Calabresi also had more than a bit to do with these things.

Details

Cognition and Economics
Type: Book
ISBN: 978-1-84950-465-2
