Search results
André Greiner-Petter, Moritz Schubotz, Howard S. Cohl and Bela Gipp
Abstract
Purpose
Modern mathematicians and scientists of math-related disciplines often use Document Preparation Systems (DPS) to write and Computer Algebra Systems (CAS) to calculate mathematical expressions. Usually, they translate the expressions manually between DPS and CAS. This process is time-consuming and error-prone. The purpose of this paper is to automate this translation. This paper uses Maple and Mathematica as the CAS, and LaTeX as the DPS.
Design/methodology/approach
Bruce Miller at the National Institute of Standards and Technology (NIST) developed a collection of special LaTeX macros that create links from mathematical symbols to their definitions in the NIST Digital Library of Mathematical Functions (DLMF). The authors are using these macros to perform rule-based translations between the formulae in the DLMF and CAS. Moreover, the authors develop software to ease the creation of new rules and to discover inconsistencies.
Findings
The authors created 396 mappings and successfully translated 58.8 percent of the DLMF formulae (2,405 expressions) between the DLMF and Maple. For a significant share of the remainder, the special function definitions in Maple and the DLMF differ: an atomic symbol in one system maps to a composite expression in the other. The translator was also used successfully for automatic verification of mathematical online compendia and CAS; the evaluation techniques discovered two errors in the DLMF and one defect in Maple.
Originality/value
This paper introduces the first translation tool for special functions between LaTeX and CAS. The approach improves error-prone manual translations and can be used to verify mathematical online compendia and CAS.
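The rule-based translation the abstract describes can be sketched as a table of macro-to-function mappings plus a small rewriter. The two rules, the macro spellings and the `@{...}` argument syntax below are illustrative stand-ins for the DLMF semantic macros and the paper's 396 actual mappings:

```python
import re

# Illustrative mapping from DLMF-style LaTeX macros to Maple names;
# NOT the paper's actual rule set.
RULES = {
    r"\BesselJ": "BesselJ",   # \BesselJ{\nu}@{z} -> BesselJ(nu, z)
    r"\EulerGamma": "GAMMA",  # \EulerGamma@{z}   -> GAMMA(z)
}

def translate(latex: str) -> str:
    """Rewrite semantic LaTeX calls like \\BesselJ{\\nu}@{z}
    into Maple calls such as BesselJ(nu, z)."""
    pattern = re.compile(r"\\(\w+)((?:\{[^{}]*\})*)@\{([^{}]*)\}")

    def repl(m):
        macro, params, arg = m.groups()
        maple = RULES.get("\\" + macro)
        if maple is None:
            raise KeyError(f"no translation rule for \\{macro}")
        # Strip braces and leading backslashes from the parameters.
        ps = [p.strip("\\") for p in re.findall(r"\{([^{}]*)\}", params)]
        return f"{maple}({', '.join(ps + [arg])})"

    return pattern.sub(repl, latex)
```

A real translator must also handle branch cuts and differing function definitions, which is exactly where the paper reports most failures.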
This paper presents a set of Mathematica modules that organizes numerical integration rules considered useful for finite element work. Seven regions are considered: line segments…
Abstract
This paper presents a set of Mathematica modules that organizes numerical integration rules considered useful for finite element work. Seven regions are considered: line segments, triangles, quadrilaterals, tetrahedra, wedges, pyramids and hexahedra. Information can be returned in floating-point (numerical) form, or in exact symbolic form. The latter is useful for computer-algebra aided FEM work that carries along symbolic variables. A few quadrature rules were extracted from sources in the FEM and computational mathematics literature and placed in symbolic form, using Mathematica to generate its own code. A larger class of formulas, previously known only numerically, were obtained directly through symbolic computations. Some unpublished non-product rules for pyramid regions were found and included in the collection. For certain regions (quadrilaterals, wedges and hexahedra) only product rules were included, to economize programming. The collection embodies most FEM-useful formulas of low and moderate order for the seven regions noted above. Some gaps as regards region geometries, and the omission of some non-product rules, are noted in the conclusions. The collection may be used “as is” in support of symbolic FEM work, thus avoiding contamination with floating arithmetic that precludes simplification. It can also be used as a generator for low-level floating-point code modules in Fortran or C. Floating-point accuracy can be selected arbitrarily. No similar modular collection applicable to a range of FEM work, whether symbolic or numeric, has been published before.
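As a minimal illustration of the kind of rule such a collection organizes, here is the 2-point Gauss-Legendre rule for the line-segment region, in plain Python and numeric form only (the paper's modules would also return the exact symbolic abscissae ±1/√3):

```python
import math

# 2-point Gauss-Legendre rule on [-1, 1] as (abscissa, weight) pairs;
# exact for polynomials up to degree 3.
GAUSS2 = [(-1.0 / math.sqrt(3.0), 1.0), (1.0 / math.sqrt(3.0), 1.0)]

def integrate(f, rule=GAUSS2):
    """Approximate the integral of f over [-1, 1] with a quadrature
    rule given as (abscissa, weight) pairs."""
    return sum(w * f(x) for x, w in rule)
```

For example, the rule reproduces the integral of x^3 + x^2 + 1 over [-1, 1] (which is 2/3 + 2) to machine precision, since the integrand is a cubic.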
Abstract
Teaches by example the application of finite element templates in constructing high performance elements. The example discusses the improvement of the mass and geometric stiffness matrices of a Bernoulli‐Euler plane beam. This process interweaves classical techniques (Fourier analysis and weighted orthogonal polynomials) with newer tools (finite element templates and computer algebra systems). Templates are parameterized algebraic forms that uniquely characterize an element population by a “genetic signature” defined by the set of free parameters. Specific elements are obtained by assigning numeric values to the parameters. This freedom of choice can be used to design “custom” elements. For this example weighted orthogonal polynomials are used to construct templates for the beam material stiffness, geometric stiffness and mass matrices. Fourier analysis carried out through symbolic computation searches for template signatures of mass and geometric stiffness that deliver matrices with desirable properties when used in conjunction with the well‐known Hermitian beam material stiffness. For mass‐stiffness combinations, three objectives are noted: high accuracy for vibration analysis, wide separation of acoustic and optical branches, and low sensitivity to mesh distortion and boundary conditions. Only the first objective is examined in detail.
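A minimal sketch of the template idea, assuming the simplest possible one-parameter family: blending the classical consistent and lumped mass matrices of a 2-node Bernoulli-Euler beam element. The paper's actual templates are richer (built from weighted orthogonal polynomials, with several free parameters); this only illustrates how assigning a value to a signature parameter selects a specific element:

```python
from fractions import Fraction

def beam_mass_template(mu, rho=1, A=1, L=1):
    """One-parameter mass template M(mu) = (1-mu)*Mc + mu*Ml for a
    2-node Bernoulli-Euler beam element (DOFs: v1, th1, v2, th2).
    mu = 0 gives the classical consistent mass matrix, mu = 1 the
    translational lumped one; intermediate mu spans the family."""
    c = Fraction(rho * A * L, 420)
    # Classical consistent mass matrix (rho*A*L/420 scaling).
    Mc = [[156 * c, 22 * L * c, 54 * c, -13 * L * c],
          [22 * L * c, 4 * L * L * c, 13 * L * c, -3 * L * L * c],
          [54 * c, 13 * L * c, 156 * c, -22 * L * c],
          [-13 * L * c, -3 * L * L * c, -22 * L * c, 4 * L * L * c]]
    half = Fraction(rho * A * L, 2)
    # Lumped mass: half the element mass at each translational DOF.
    Ml = [[half, 0, 0, 0], [0, 0, 0, 0], [0, 0, half, 0], [0, 0, 0, 0]]
    mu = Fraction(mu)
    return [[(1 - mu) * Mc[i][j] + mu * Ml[i][j] for j in range(4)]
            for i in range(4)]
```

Every member of the family conserves the total translational mass rho*A*L (the rigid-translation quadratic form is mu-independent), which is the kind of invariant a template must preserve while the free parameter is tuned for accuracy.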
The purpose of this paper is to analyse algorithms for fluid‐structure interaction (FSI) from a purely algorithmic point of view.
Abstract
Purpose
The purpose of this paper is to analyse algorithms for fluid‐structure interaction (FSI) from a purely algorithmic point of view.
Design/methodology/approach
First of all, a 1D model problem is selected for which both the fluid and the structural behavior are represented through a minimum number of parameters. Different coupling algorithms and time integration schemes are then applied to the simplified model problem, and their properties are discussed depending on the values assumed by the parameters. Both exact and approximate time integration schemes are considered in the same framework so as to allow an assessment of the different sources of error.
Findings
The properties of staggered coupling schemes are confirmed. Insight into the convergence behavior of iterative coupling schemes is provided, and a technique to improve that convergence is discussed.
Research limitations/implications
All the results are proved for a given family of time integration schemes. The technique proposed can be applied to other families of time integration techniques, but some of the analytical results need to be reworked under this assumption.
Practical implications
The problems that are commonly encountered in FSI can be justified by simple arguments. It can also be shown that the limit at which trivial iterative schemes experience convergence difficulties is very close to that at which staggered schemes become unstable.
Originality/value
All the results shown are based on simple mathematics. The problems are presented so as to be independent of the particular method chosen to solve the fluid flow.
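The flavor of the analysis can be sketched on an even smaller surrogate than the paper's model problem: a scalar fixed-point coupling between a "structure" response and a "fluid" load, where under-relaxation is the classical technique for restoring convergence when the interaction is strong. The parameters a, b, g below are illustrative, not the paper's:

```python
def coupled_iteration(a, b, g, omega=1.0, tol=1e-12, max_iter=10000):
    """Iterative (staggered-sweep) coupling for a scalar FSI surrogate:
        structure: x = a * f        (displacement from fluid load)
        fluid:     f = b * x + g    (load from displacement)
    Exact coupled solution: x* = a*g / (1 - a*b).
    The plain iteration converges only for |a*b| < 1; omega < 1
    applies under-relaxation, which widens the convergence range."""
    x = 0.0
    for k in range(max_iter):
        x_new = a * (b * x + g)                 # one staggered sweep
        x = (1.0 - omega) * x + omega * x_new   # relaxation step
        if abs(x - a * (b * x + g)) < tol:      # coupling residual
            return x, k + 1
    raise RuntimeError("coupling iteration did not converge")
```

With a = 1, b = -3 the unrelaxed iteration diverges (|a*b| = 3), yet omega = 0.4 recovers the coupled solution x* = 1/4; this mirrors the paper's point that a simple relaxation technique rescues trivial iterative schemes near their stability limit.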
Mario Schenk, Annette Muetze, Klaus Krischan and Christian Magele
Abstract
Purpose
The purpose of this paper is to evaluate the worst-case behavior of a given electronic circuit by varying the values of its components in a meaningful way, in order not to exceed pre-defined current or voltage limits during transient operation.
Design/methodology/approach
An analytic formulation is used to identify the time-dependent solution for voltages or currents using proper state equations in closed form. Circuits with linear elements can be described by a system of differential equations, while circuits containing nonlinear elements are described by piecewise-linear models. A sequential quadratic program (SQP) is used to find the worst-case scenario.
Findings
It is found that the worst-case scenario can be obtained with as few solutions to the forward problem as possible by applying an SQP method.
Originality/value
The SQP method in combination with the analytic forward solver approach shows that the worst-case limit converges in a few steps even if the worst-case limit is not on the boundary of the parameters.
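As a hedged sketch of the workflow (closed-form transient solution plus an optimizer over component tolerances), the following uses the standard second-order step-response overshoot formula for a series RLC circuit, with a brute-force grid scan standing in for the paper's SQP method:

```python
import math

def overshoot(R, L, C):
    """Peak of the capacitor voltage (relative to the step height) for
    a series RLC circuit driven by a unit voltage step, from the
    closed-form underdamped solution:
        1 + exp(-pi*zeta / sqrt(1 - zeta^2)),  zeta = (R/2)*sqrt(C/L)."""
    zeta = (R / 2.0) * math.sqrt(C / L)
    if zeta >= 1.0:
        return 1.0  # critically damped or overdamped: no overshoot
    return 1.0 + math.exp(-math.pi * zeta / math.sqrt(1.0 - zeta * zeta))

def worst_case(bounds, steps=20):
    """Brute-force stand-in for an SQP search: scan the component
    tolerance ranges for the (R, L, C) that maximizes the overshoot.
    bounds = ((Rlo, Rhi), (Llo, Lhi), (Clo, Chi))."""
    (Rlo, Rhi), (Llo, Lhi), (Clo, Chi) = bounds
    best = (0.0, None)
    for i in range(steps + 1):
        R = Rlo + (Rhi - Rlo) * i / steps
        for j in range(steps + 1):
            L = Llo + (Lhi - Llo) * j / steps
            for k in range(steps + 1):
                C = Clo + (Chi - Clo) * k / steps
                v = overshoot(R, L, C)
                if v > best[0]:
                    best = (v, (R, L, C))
    return best
```

For this monotone objective the worst case sits on the tolerance boundary (minimum R and C, maximum L); an SQP method finds it in a handful of forward solves where the grid needs thousands, and, as the abstract notes, also handles interior worst cases that a boundary-only heuristic would miss.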
Margaret I. Kanipes, Guoqing Tang, Faye E. Spencer-Maor, Zakiya S. Wilson-Kennedy and Goldie S. Byrd
Abstract
This chapter highlights the creation of a STEM Center of Excellence for Active Learning (SCEAL) at North Carolina Agricultural and Technical State University. The overarching goal of the STEM Center is to transform pedagogy and institutional teaching and learning in order to significantly increase the production of high-achieving students who will pursue careers and increase diversity in the STEM workforce. The STEM Center's efforts toward these goals included supporting active-learning classroom and course redesigns, providing professional development workshops, and creating an Innovation Ventures Fund that offers opportunities to garner funding for student success projects. Outcomes from the Center have led to several publications and external grant awards that will continue the implementation, assessment, and refinement of active learning innovations and interventions for STEM student success for years to come.
Luca Andrea Ludovico and Giuseppina Rita Mangione
Abstract
Purpose
The purpose of this work is to analyze the concept of self-regulated learning and apply it to a web-based interface for music teaching.
Design/methodology/approach
This work starts from a systematic review of music education and self-regulation during learning processes. The paper then identifies the meta-cognitive strategies that music students should adopt during their instrumental practice. The goal is to apply these concepts to rethink the structure of a didactic e-book for instrumental music education. Thanks to the adoption of the Institute of Electrical and Electronics Engineers (IEEE) 1599 standard, the paper outlines a model of an active e-book able to improve learners' performance through proper cognitive and multi-modal scaffolds. The last section proposes design principles for an implementation.
Findings
This work applies theoretical research on self-regulated learning to the design and implementation of a working prototype.
Research limitations/implications
A limitation is the lack of experimental data required to test the efficacy and effectiveness of the proposed e-book model and its impact on self-regulated music abilities. A validation strategy – e.g. based on scenarios – will be proposed in future work, with the support of music learning centres and focus groups composed of young Italian students.
Originality/value
This work is an invited extension of the paper the authors presented at the EL2014 International Conference held in Lisbon, where it was awarded best paper of the conference. In this extension, the authors provide further details about the proposed framework, highlighting in particular the implementation of scaffolds in the interface.
Noel Scott, Rodolfo Baggio and Chris Cooper
Abstract
This chapter discusses the emerging network science approach to the study of complex adaptive systems and applies tools derived from statistical physics to the analysis of tourism destinations. The authors provide a brief history of network science and the characteristics of a network as well as different models such as small world and scale free networks, and dynamic properties such as resilience and information diffusion. The Italian resort island of Elba is used as a case study allowing comparison of the communication network of tourist organizations and the virtual network formed by the websites of these organizations. The study compares the parameters of these networks to networks from the literature and to randomly created networks. The analyses include computer simulations to assess the dynamic properties of these networks. The results indicate that the Elba tourism network has a low degree of collaboration between members. These findings provide a quantitative measure of network performance. In general, the application of network science to the study of social systems offers opportunities for better management of tourism destinations and complex social systems.
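Two of the standard network-science measures this chapter relies on, characteristic path length and clustering coefficient, can be computed with plain breadth-first search; the graph below is a toy adjacency structure, not the Elba data:

```python
from collections import deque

def avg_path_length(adj):
    """Mean shortest-path length over all connected ordered pairs,
    computed by breadth-first search from every node.
    adj maps each node to a set of neighbors (undirected graph)."""
    total, pairs = 0, 0
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(d for n, d in dist.items() if n != s)
        pairs += len(dist) - 1
    return total / pairs

def clustering(adj):
    """Mean local clustering coefficient (Watts-Strogatz definition):
    for each node, the fraction of neighbor pairs that are linked."""
    coeffs = []
    for u in adj:
        nbrs = list(adj[u])
        k = len(nbrs)
        if k < 2:
            coeffs.append(0.0)
            continue
        links = sum(1 for i in range(k) for j in range(i + 1, k)
                    if nbrs[j] in adj[nbrs[i]])
        coeffs.append(2.0 * links / (k * (k - 1)))
    return sum(coeffs) / len(coeffs)
```

Comparing these values against the same metrics for an equally sized random graph is exactly the kind of small-world diagnostic the chapter applies to the Elba tourism network.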
Kristina Voigt and Rainer Brüggemann
Abstract
The large number and variety of online databases in the field of environmental sciences and chemistry underlines the need for a comparative evaluation approach. In this paper 12 evaluation criteria are presented, divided into:
• general criteria: SI (size of the data-source), CO (cost of one hour online searching), UP (update of online database), and AV (availability on other media);
• chemical-relevant criteria: NU (number of chemicals), ID (identification parameter for chemicals), CT (testset chemicals), CD (development of chemicals);
• environmental-relevant criteria: IP (information parameters for chemical substances), PD (parameter development);
• criteria describing environmental chemicals: US (use of chemical substances), QU (quality of database).
A six-level scoring system is applied to these criteria. Furthermore, a comparative evaluation approach, the so-called Hasse-diagram-technique, is presented for 19 bibliographic online databases using the criteria above. In this approach maximals (‘good’ databases) and minimals (‘bad’ databases) can be identified, for example. Using the Hasse-diagram-technique, changes in the content of the 19 databases from 1995 to 1998 can be visualised.
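The core of the Hasse-diagram-technique is the componentwise partial order on the criteria score vectors, from which maximals and minimals fall out directly. The database names and scores below are hypothetical, not the paper's 19 databases:

```python
def dominates(a, b):
    """a dominates b in the product (Hasse) order: a is at least as
    good on every criterion and strictly better on at least one."""
    return all(x >= y for x, y in zip(a, b)) and a != b

def maximals(scores):
    """Data sources dominated by no other source ('good' databases)."""
    return [k for k in scores
            if not any(dominates(scores[o], scores[k]) for o in scores)]

def minimals(scores):
    """Data sources that dominate no other source ('bad' databases)."""
    return [k for k in scores
            if not any(dominates(scores[k], scores[o]) for o in scores)]
```

For score vectors such as A = (3, 3, 3), B = (1, 1, 1), C = (2, 1, 3), D = (1, 3, 2), only A is maximal and only B is minimal; C and D are incomparable, which is precisely the information the Hasse diagram draws as separate branches.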